• ReakDuck@lemmy.ml
    7 months ago

    While coding my own engine in C++ with OpenGL, I forgot to do something. Maybe I forgot to assign a pointer, or forgot to pass a variable. In the end I had copied a NaN value into the vertices of my Model, which was meant to be a wrapper for data I wanted to read and visualize.

    Printing the entire Model to the terminal left me confused about why everything was suddenly NaN when it had started out fine.
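
    A minimal sketch of that failure mode (the names here are hypothetical, not from the actual engine): once a single NaN gets into the data, every arithmetic operation that touches it also produces NaN, so the whole vertex buffer ends up poisoned.

    ```cpp
    #include <cmath>
    #include <cstdio>
    #include <vector>

    int main() {
        // Stand-in for the forgotten assignment: one value is never
        // given a real number and ends up NaN.
        float missing = std::nanf("");

        std::vector<float> vertices = {1.0f, 2.0f, 3.0f};

        // NaN propagates: any arithmetic involving it yields NaN,
        // so one bad input contaminates every vertex downstream.
        for (float& v : vertices) {
            v = v * 0.5f + missing;
        }

        for (float v : vertices) {
            std::printf("%f\n", v);  // every vertex now prints as nan
        }
        return 0;
    }
    ```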

  • Kajika@lemmy.mlOP
    7 months ago

    Took me 2 hours to find out why the final output of a neural network was a bunch of NaN. This is always very annoying, but I can’t really complain, it makes sense. Just sucks.
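
    One way to cut that debugging time down (a sketch, with a hypothetical `check_finite` helper, not anything from the actual code base): assert on intermediate buffers so the failure points at the layer that first produced a NaN, instead of only seeing it in the final output.

    ```cpp
    #include <cassert>
    #include <cmath>
    #include <vector>

    // Hypothetical helper: fail fast as soon as a buffer contains NaN,
    // so the stack trace names the stage that produced it.
    void check_finite(const std::vector<float>& buf) {
        for (float v : buf) {
            assert(!std::isnan(v) && "NaN detected in intermediate buffer");
        }
    }

    int main() {
        std::vector<float> layer_out = {0.1f, -0.3f, 0.7f};
        check_finite(layer_out);  // passes: all values are real numbers
        return 0;
    }
    ```

    In a debug build this aborts at the first bad layer; in release builds the asserts compile away, so there is no steady-state cost.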

      • Kajika@lemmy.mlOP
        7 months ago

        That could be a nice way. Sadly it was in a C++ code base (using TensorFlow), so no such nice things (it would be slow, too). I skill-issued myself by assuming a struct would be zero-initialized, but `MyStruct input;` is not, while `MyStruct input{};` is (that was the fix). Long story.
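
        The distinction, in a self-contained sketch (`MyStruct` here is a placeholder, not the real struct): for a local variable of a type with no default member initializers, `MyStruct input;` is default-initialization and leaves the members indeterminate, while `MyStruct input{};` is value-initialization and zeroes them.

        ```cpp
        #include <cstdio>

        struct MyStruct {
            float x;  // no default member initializer
            float y;
        };

        int main() {
            // MyStruct input;   // default-init: x and y hold garbage;
            //                   // reading them here is undefined behavior.

            MyStruct input{};    // value-init: members are zero-initialized

            std::printf("%f %f\n", input.x, input.y);  // prints "0.000000 0.000000"
            return 0;
        }
        ```

        With uninitialized floats, the garbage bits can happen to form a NaN, which then propagates silently through the rest of the computation.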

  • affiliate@lemmy.world
    7 months ago

    this is just like in regular math too. not being a number is just so fun that nobody wants to go back to being a number once they get a taste of it

  • Omega_Haxors@lemmy.ml
    7 months ago

    The funniest thing about NaNs is that they’re actually coded, so you can see what caused one if you look at the binary. The only problem is that, due to the way NaNs propagate, that code is almost always going to resolve to “tried to perform arithmetic on a NaN”.

    There are also other special encodings in the float format which are defined and sometimes useful, such as +/-INF, MAX, MIN (epsilon), and imaginary results.
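
    A sketch of how to look at those bits (the exact payload a given operation writes is implementation-defined, so treat the payload value itself as an assumption): in a 32-bit IEEE 754 float, a NaN has all exponent bits set (`0x7F800000`) and a nonzero mantissa; the mantissa bits are the payload.

    ```cpp
    #include <cmath>
    #include <cstdint>
    #include <cstdio>
    #include <cstring>

    int main() {
        float nan_value = std::nanf("");  // a quiet NaN

        // memcpy is the portable way to inspect the raw bit pattern.
        std::uint32_t bits;
        std::memcpy(&bits, &nan_value, sizeof bits);

        std::printf("bits     = 0x%08X\n", bits);
        std::printf("exponent all ones: %s\n",
                    (bits & 0x7F800000u) == 0x7F800000u ? "yes" : "no");
        std::printf("payload  = 0x%06X\n", bits & 0x007FFFFFu);  // nonzero for NaN
        return 0;
    }
    ```

    A payload of zero with the exponent all ones is not a NaN at all; that pattern is +INF (or -INF with the sign bit set), which is why INF and NaN are neighbors in the encoding.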