Took me 2 hours to find out why the final output of a neural network was a bunch of NaNs. This is always very annoying, but I can’t really complain, it makes sense. Just sucks.
I hope it was garlic NaN at least.
I guess you can always just add an
assert not data.isna().any()
in strategic locations
That could be a nice way. Sadly it was in a C++ code base (using TensorFlow), so no such nice things (it would be slow, too). I skill-issued myself by thinking a struct would be 0-initialized, but
MyStruct input;
would not be, while
MyStruct input {};
will (that was the fix). Long story.
I too have forgotten to memset my structs in C++ TensorFlow after prototyping in Python.
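For the record, here’s a minimal C++ sketch of the default- vs. value-initialization difference described above (MyStruct below is just a hypothetical stand-in, not the real struct from that codebase):

#include <cstdio>

struct MyStruct {
    float bias;   // no default member initializers, so...
    float scale;
};

int main() {
    MyStruct a;    // default-initialized: members are indeterminate (reading them is UB)
    MyStruct b{};  // value-initialized: members are zeroed

    std::printf("b = {%f, %f}\n", b.bias, b.scale);  // guaranteed {0.000000, 0.000000}
    (void)a;  // 'a' is only here to show the dangerous spelling
}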
If you use the GNU libc, the
feenableexcept
function, which you can use to enable certain floating-point exceptions, could be useful to catch unexpected/unwanted NaNs.
Oof. C++ really is a harsh mistress.
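Following up on the feenableexcept suggestion, a minimal sketch (assuming glibc on Linux; feenableexcept is a GNU extension, not standard C++, and may need _GNU_SOURCE defined before the includes on some setups):

#ifndef _GNU_SOURCE
#define _GNU_SOURCE
#endif
#include <fenv.h>
#include <cstdio>

int main() {
    // Turn "invalid operation" (the source of most NaNs) and "divide by zero"
    // into SIGFPE, so the program stops at the instruction that first produced
    // the NaN instead of silently propagating it.
    feenableexcept(FE_INVALID | FE_DIVBYZERO);

    volatile double zero = 0.0;
    double oops = zero / zero;   // FE_INVALID -> SIGFPE raised right here
    std::printf("%f\n", oops);   // never reached
}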
Oof. This makes me appreciate the abstractions in Go. It’s a small thing but initializing structs with zero values by default is nice.
If (var.nan){var = 0} my beloved.
It also depends on the context
this is just like in regular math too. not being a number is just so fun that nobody wants to go back to being a number once they get a taste of it
Fucking over-dramatic divisions by 0, sigh.
Also applies to nulls in SQL queries.
It’s not fun tracing where nulls are coming from when dealing with a 1,500-line data-warehouse pipeline query that aggregates 20 different tables.
Thanks. This is great
The funniest thing about NaNs is that they’re actually coded, so you can see what caused one if you look at the binary. The only problem is that, due to the nature of NaNs, that code is almost always going to resolve to “tried to perform arithmetic on a NaN”.
There are also other coded special values which are defined and sometimes useful, such as +/-INF, MAX, MIN (epsilon), and Imaginary
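To make the “look at the binary” point concrete, here’s a minimal C++20 sketch that dumps the bit pattern of a NaN, including its payload (assuming std::bit_cast is available):

#include <bit>
#include <cmath>
#include <cstdint>
#include <cstdio>

int main() {
    volatile double zero = 0.0;  // volatile keeps the compiler from folding this away
    double nan = zero / zero;    // 0/0 -> quiet NaN

    // Layout of a double: sign | 11-bit exponent (all ones for NaN) | 52-bit payload.
    std::uint64_t bits = std::bit_cast<std::uint64_t>(nan);
    std::printf("isnan=%d bits=0x%016llx payload=0x%013llx\n",
                std::isnan(nan),
                static_cast<unsigned long long>(bits),
                static_cast<unsigned long long>(bits & ((1ULL << 52) - 1)));
}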
“Bounds checking, motherf–ker! Do you speak it?”
Consider IEEE754 arithmetic as monadic, simple!
NaN is such a fun floating point virus. Some really wonky gameplay after we hit NaN in a few spots.
Nanananana! Batman!
While coding my own engine in C++ with OpenGL, I forgot to do something. Maybe I forgot to assign a pointer, or forgot to pass a variable. In the end I had copied a NaN value into the vertices of my Model, which was supposed to be a wrapper for the data I wanted to read and visualize.
Printing the entire Model to the terminal, I was confused why everything was suddenly NaN when it had started out nicely.
eli5?
NaN stands for Not a Number. To simplify very briefly (and not entirely accurately): when the standard for representing fractional values with binary digits in computers was defined, bit patterns (natural numbers) in a range of values were systematically assigned to fractional numbers. For reasons not worth going into, some of the possible bit patterns were left unused, so they were designated as NaNs, and the value of the NaN itself is supposed to tell you what went wrong in your calculations to produce it. Obviously, if you use a NaN in an arithmetic operation, the result is also Not a Number, and that’s what the meme is referring to.
I think the real explanation is simpler and more understandable.
NaN is what you get when you do something illegal, like dividing zero by zero. There is no answer, but the operation has to result in something, so it gives you NaN, because the result is literally not a number.
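A tiny C++ sketch of that behaviour, just as an illustration:

#include <cmath>
#include <cstdio>
#include <limits>

int main() {
    double x = std::numeric_limits<double>::quiet_NaN();  // "not a number"
    double y = x + 1.0;                                   // NaN poisons every operation it touches
    std::printf("y is NaN? %d\n", std::isnan(y));         // prints 1
    std::printf("x == x ? %d\n", x == x);                 // prints 0: NaN is not even equal to itself
}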
This gave me some real Agent Smith vibes