NoMad
4y

[NN]
Day 3: the accuracy has gone to shit and stays that way, despite me cleaning that damn data up.

Urghhhhhhhh

*bangs head against the wall, repeatedly*

Comments
  • 2
    Like, what the fuck do you want from me, you stupid-ass network? The pattern is there, I'm giving you proper layers of CNN followed by FC, and yet you fail to see CLEAR FUCKING PATTERNS except for the one or two patterns you really like. 😑😑😑

    Go fuck yourself.
  • 2
    I'm so frustrated!!!
    *high pitch screaming follows*
  • 1
Maybe your gradient descent is stuck in a local minimum?

    Just a random comment from a guy without a university diploma who knows nothing about deep learning.
  • 0
@vane Using Adam, changed the learning rate, tried NAdam, AdaGrad, AdaDelta, and SGD; none of them really improves things. Adam does the best so far.

    (AdaGrad and AdaDelta don't really apply here, but I tried them anyway.)
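For reference, the practical difference between plain SGD and Adam is just the update rule; here is a minimal numpy sketch of both on a toy ill-conditioned quadratic (a made-up problem, nothing from the actual network) to show why Adam often makes faster progress when some directions have much steeper gradients than others:

```python
import numpy as np

H = np.array([1.0, 100.0])  # curvatures: one flat direction, one steep one

def loss(w):
    # Ill-conditioned quadratic: 0.5 * sum(H_i * w_i^2), minimum at w = 0
    return 0.5 * np.sum(H * w**2)

def grad(w):
    return H * w

def run_sgd(w0, lr=1e-3, steps=200):
    # lr must stay small for the steep direction, which slows the flat one
    w = w0.copy()
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def run_adam(w0, lr=0.1, steps=200, b1=0.9, b2=0.999, eps=1e-8):
    # Adam rescales each coordinate by a running estimate of gradient magnitude
    w = w0.copy()
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g**2
        m_hat = m / (1 - b1**t)
        v_hat = v / (1 - b2**t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w0 = np.array([1.0, 1.0])
sgd_loss = loss(run_sgd(w0))
adam_loss = loss(run_adam(w0))
```

Both runs reduce the loss, but SGD's single learning rate is held hostage by the steepest direction, which is one common reason Adam "does the best" on messy problems.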
  • 0
    @vane I even tried different activations on different layers... Didn't do too well.
  • 1
    @NoMad So maybe you're overfitting or underfitting?

    🦆
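The standard way to tell those two apart is to compare training and validation error. A small numpy sketch on synthetic data, with polynomial fits standing in for the network (purely illustrative, not the poster's model):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(30)
x_tr, y_tr = x[::2], y[::2]    # every other point for training
x_va, y_va = x[1::2], y[1::2]  # the rest held out for validation

def errors(degree):
    """Fit a polynomial of the given degree; return (train MSE, val MSE)."""
    coef = np.polyfit(x_tr, y_tr, degree)
    tr = np.mean((np.polyval(coef, x_tr) - y_tr) ** 2)
    va = np.mean((np.polyval(coef, x_va) - y_va) ** 2)
    return tr, va

tr_lo, va_lo = errors(1)    # underfit: both errors high, small gap
tr_hi, va_hi = errors(10)   # overfit: train error tiny, val error larger
```

Underfitting shows up as high error on both sets; overfitting shows up as a big gap between a tiny training error and a much larger validation error. The same diagnostic applies directly to a CNN's train/val accuracy curves.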
  • 1
    I probably know nothing about machine learning, but if you cut one variable from the dataset, or shrink the dataset, and the results stay the same, that can tell you something 😅
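That's basically an ablation test, and the idea is cheap to sanity-check on synthetic data. A numpy sketch with a linear model (the features and weights here are made up, not the poster's dataset):

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))  # 3 features; feature 2 is pure noise
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.1 * rng.standard_normal(200)

def fit_mse(cols):
    """Least-squares fit using only the given feature columns; return train MSE."""
    A = X[:, cols]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return np.mean((A @ coef - y) ** 2)

full_mse = fit_mse([0, 1, 2])
drop_noise_mse = fit_mse([0, 1])   # dropping the useless feature barely moves the error
drop_signal_mse = fit_mse([1, 2])  # dropping a real feature hurts a lot
```

If cutting a variable leaves the error unchanged, either that variable carried no signal or the model never learned to use it; if the error jumps, the variable mattered. Same logic as the comment, just made concrete.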
  • 1
    Have you tried threatening?
    That usually works for me.

    Just make sure to follow through, or it won’t take you seriously.
  • 0
    Read in some article (here: https://technologyreview.com/2019/...) that for some small network configurations, the random weights assigned at the start can leave them with shit accuracy for some reason, even after a lot of training.
    Maybe wipe it and try again?
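Whatever that specific article says, the general point — that the random starting weights can decide where training ends up — is real, and it shows up even on a 1-D non-convex toy loss, where gradient descent from different starting points lands in different minima (illustrative only, nothing to do with the real network):

```python
def f(x):
    # Toy non-convex "loss" with two minima of different depth
    return x**4 - 3 * x**2 + x

def df(x):
    # Derivative of f
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.02, steps=1000):
    x = x0
    for _ in range(steps):
        x -= lr * df(x)
    return x

# Two different "initializations" end up in two different minima
x_a = gradient_descent(1.0)    # settles in the shallow minimum near x ≈ 1.13
x_b = gradient_descent(-1.0)   # settles in the deeper minimum near x ≈ -1.30
```

So "wipe it and try again" amounts to re-rolling the initialization (a different random seed) and hoping to land in a better basin — a legitimate, if blunt, thing to try on a small network.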