3

So we are looking for an ML summer intern.
The amount of fake ML genius is astonishing.

These kids would list projects like "increased state-of-the-art vision transformer model for automated guided missile technology by 500%, leading to the destruction of ISIS in Iraq and Syria," and then be unable to give you the derivative of ReLU(x) = max(x, 0), which is 1 or 0, or to explain what batch normalization is.
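
For the record, both answers fit in a few lines. A minimal sketch in plain NumPy (the names and the eps constant are mine, nothing framework-specific):

    import numpy as np

    def relu(x):
        # ReLU(x) = max(x, 0), applied elementwise
        return np.maximum(x, 0.0)

    def relu_grad(x):
        # Derivative of ReLU: 1 where x > 0, 0 where x < 0.
        # At exactly x = 0 it is undefined; libraries just pick a value.
        return (x > 0).astype(x.dtype)

    def batch_norm(x, gamma, beta, eps=1e-5):
        # Batch normalization: normalize each feature over the batch axis
        # to zero mean / unit variance, then rescale by gamma and shift by beta.
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + eps)
        return gamma * x_hat + beta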

Comments
  • 5
    I bet they asked ChatGPT to come up with that title, and I bet they can’t find Syria on the map.
  • 1
    The derivative is also undefined at the second-order discontinuity (x = 0), so you can't answer the question either.
  • 0
    @atheist backprop if you want
  • 1
    @valentinfngr the derivative of max(x, 0) is undefined at x = 0, as there is a second-order discontinuity there. Your answer didn't address that, so it was incomplete. I know what backpropagation is. I also know university-level calculus.
  • 2
    I am here for the sarcasm and pedantry.
  • 0
    @atheist For gradient-based optimization, all you need is for the function to be differentiable almost everywhere, which is the case for ReLU (quick check at the bottom of the thread).
  • 0
    @valentinfngr yes, I know. But if you're gonna bitch that someone can't do something, make damn sure you know exactly what you're talking about; otherwise you're just complaining that someone is slightly more incompetent than you.
  • 1
    🍿
  • 1
    @retoor

    See, finally somebody gets the inner beauty of Marxism-Leninism.
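
Re the x = 0 pedantry above: in practice frameworks just pick a value from the subgradient. A quick check, assuming PyTorch is available (if I recall its convention correctly, the gradient at exactly 0 comes out as 0):

    import torch

    x = torch.tensor([-1.0, 0.0, 2.0], requires_grad=True)
    torch.relu(x).sum().backward()
    print(x.grad)  # expected: tensor([0., 0., 1.]) -- the x = 0 entry gets 0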