
If the same data is being fit over and over again to a neural net, shouldn't it start spitting out the result it was trained on very quickly, if over several frames it's training for 100 epochs each?
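
A minimal sketch of the scenario in the question, assuming a Keras-style setup (the layer sizes and the random data below are made up for illustration, not the actual project): repeatedly calling fit() on one fixed batch does drive the loss down, and predict() converges toward the training targets.

```python
# Hypothetical example: refit the SAME batch for 100 epochs per "frame"
# and watch the loss shrink and the outputs approach the targets.
import numpy as np
import tensorflow as tf

x = np.random.rand(8, 4).astype("float32")   # one fixed batch of inputs
y = np.random.rand(8, 2).astype("float32")   # one fixed batch of targets

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(2),
])
model.compile(optimizer="adam", loss="mse")

for frame in range(5):                        # "several frames"
    hist = model.fit(x, y, epochs=100, verbose=0)
    print(f"frame {frame}: loss {hist.history['loss'][-1]:.6f}")

# After a few frames the predictions should sit close to y. If they do not
# move at all, the model is likely being recreated every frame, or the
# weights are never carried over between fits.
print(np.abs(model.predict(x, verbose=0) - y).max())
```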

Comments
  • 0
    @dumbdev I didn't do it intentionally; it was getting paused due to a logic error on literally the same data points. But the point is, you'd think it would start spitting out the expected result.
  • 1
    @AvatarOfKaine no, it won't. Inference takes constant time in classic ML systems. It does not matter how well the model is fit; the time to get to the last layer is constant. It just produces different outputs.
  • 0
    @iiii interesting you say the word classic. What's a more modern approach?
  • 0
    @AvatarOfKaine I don't know how super machines like the one that plays Go work. It has a totally different architecture.

    If there's some way to make shortcuts in analysis, then the time might be shortened, but classic models do not support that.

    Genetic models do, but those are harder to use and teach, as I understand.
  • 0
    @dumbdev A POX ON YOU AND YOUR PAY TO PLAY ARTICLE SIR ! LMAO
  • 0
    anyway, wanna know what made me ask this?
  • 0
    @dumbdev the model generates a result that tells the character it's controlling to perform opposite actions, which nullify each other, so it just gets stuck and sits there, continually fitting the same set of output data to the specified inputs.

    so one would think... it would budge a little (see the sketch after these comments).
  • 0
    @dumbdev here. tell me if everything looks set up right.

    I"m thinking something I'm doing isn't right. I got part of the concept from snakeai