jsonata-js/jsonata

($.some_non_existent_key != 'anything') evaluates to false

iiian opened this issue · 3 comments

input:

{}

expression:

$.some_non_existent_key != 'anything'

evaluates to

false

:(

I can fix it with

($.some_non_existent_key ? $.some_non_existent_key : null) != 'anything'

Is there a reason why it should be this way from a language design perspective? My initial intuition is that there might be some kind of ambiguity, but if that's true I'd love to understand what the ambiguity is.

Thanks

The underlying concept is that undefined behaves like SQL's null, and thus:

  • undefined = 'something' => false
  • undefined != 'something' => false
  • undefined = undefined => false
  • undefined != undefined => false
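The rules above can be sketched as a pair of comparison helpers. This is a hypothetical model in plain JavaScript for illustration, not the actual jsonata-js implementation:

```javascript
// Illustrative model of JSONata's equality semantics around undefined
// (not the jsonata-js source): if either operand is undefined, both
// `=` and `!=` yield false.
function jsonataEq(a, b) {
  if (a === undefined || b === undefined) return false;
  return a === b;
}

function jsonataNeq(a, b) {
  if (a === undefined || b === undefined) return false;
  return a !== b;
}

console.log(jsonataEq(undefined, 'something'));  // false
console.log(jsonataNeq(undefined, 'something')); // false
console.log(jsonataNeq('a', 'b'));               // true
```

Under this model, `$.some_non_existent_key` evaluates to undefined on an empty input, so both `= 'anything'` and `!= 'anything'` come back false.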

This is probably a concept that should be documented at https://docs.jsonata.org/comparison-operators

Hey @jhorbulyk, thanks for the background, I appreciate it.

undefined != 'something' => false

If you don't mind my challenging this a bit (respectfully, of course), what is the use case for this? I don't know a whole lot about SQL's three-valued logic, but the idea that (undefined = 'something') and (undefined != 'something') both evaluate to false seems like a logical inconsistency even for a 3VL system, and ultimately something an end user has to step around. Would it not be more SQL-esque to have undefined = 'something' evaluate to undefined?
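The SQL-esque alternative being suggested could be sketched like this, with undefined standing in for SQL's NULL/unknown (a sketch of three-valued logic, not a proposal for jsonata-js internals):

```javascript
// Sketch of SQL-style three-valued logic: comparing against unknown
// yields unknown (modelled here as undefined), and negation of
// unknown stays unknown.
function sqlEq(a, b) {
  if (a === undefined || b === undefined) return undefined; // unknown
  return a === b;
}

function sqlNot(x) {
  return x === undefined ? undefined : !x; // NOT unknown = unknown
}

console.log(sqlEq(undefined, 'something'));         // undefined (unknown)
console.log(sqlNot(sqlEq(undefined, 'something'))); // undefined (unknown)
console.log(sqlEq('a', 'a'));                       // true
```

Under this model, `undefined = 'something'` and `undefined != 'something'` are both unknown rather than one of them being false, which avoids the apparent inconsistency at the cost of letting unknown propagate through expressions.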