Ignoring the fact that a lot of languages and database systems don't support generics (but do already support null), you've just introduced a more complex kind of null value; you're simply slapping some lipstick on it. 😊
Type-safe lipstick :)
In a discussion about whether null should exist at all, and what might be better, "Optional values aren't available in languages whose type systems haven't moved on since the 1960s" isn't a strong argument, in my view.
The key point is that if your type system reliably knows whether something has a value, then the compiler can prevent every runtime null exception: it tracks the possibly-absent value for you, and refuses to compile code that never handles the absent case.
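For what it's worth, here's a minimal sketch of what that looks like in Rust (find_nickname is a made-up example function): the signature says the value may be absent, and the compiler won't let you use it as a plain String until the None case has been handled.

```rust
// A value that may be absent is Option<String>, not String.
fn find_nickname(user_id: u32) -> Option<String> {
    if user_id == 42 {
        Some(String::from("neo"))
    } else {
        None
    }
}

fn main() {
    let nickname = find_nickname(7);

    // The compiler will not let us use `nickname` as a String directly;
    // we are forced to handle the None case before the value "escapes".
    match nickname {
        Some(name) => println!("nickname: {}", name),
        None => println!("no nickname set"),
    }
}
```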
The problem with null is that it is pervasive: any value can be null. You can check for it and handle it, but other parts of your code can't tell whether a given value can be null or not. Tracking potential nulls lives in the programmer's memory instead of being deduced by the compiler, and checking for null everywhere is tedious and slow, so no one does it. Hence null bugs are everywhere.
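To make the contrast concrete, a small sketch (hypothetical lookup functions, purely for illustration): with null, the two signatures would look identical and say nothing about absence; with Option, absence is visible in the type, so every caller can tell and is forced to deal with it.

```rust
// With null, both of these would have the same type, and only docs
// (or a crash in production) tell callers which one can be absent:
//
//     String displayName(int id);   // never null? sometimes null? who knows.
//     String middleName(int id);    // same signature, different behaviour.
//
// With absence encoded in the type, the signature itself tells every caller:
fn display_name(id: u32) -> String {
    format!("user-{}", id) // always present
}

fn middle_name(_id: u32) -> Option<String> {
    None // may legitimately be absent
}

fn main() {
    // display_name's result can be used directly:
    println!("{}", display_name(1));

    // middle_name's result can't be used as a String; the caller must decide
    // what "absent" means here before the compiler will accept the code:
    println!("{}", middle_name(1).unwrap_or_else(|| String::from("(none)")));
}
```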
Tony Hoare, an otherwise brilliant computer scientist, called it his "billion-dollar mistake" in a 2009 talk.