
This is PyTorch. Is this the proper way?

self.linear_relu_stack = nn.Sequential(
    nn.Linear(9, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 100),
    nn.ReLU(),
    nn.Linear(100, 5),
    nn.Sigmoid()
)

The sigmoid part in particular?
It returns garbage numbers on the first run, of course.
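For context, here is a minimal complete module the snippet above could sit in. The class name TurretNet and the 9-in / 5-out shapes are my assumptions for illustration, not something stated in the post:

import torch
from torch import nn

class TurretNet(nn.Module):  # hypothetical name
    def __init__(self):
        super().__init__()
        self.linear_relu_stack = nn.Sequential(
            nn.Linear(9, 512),
            nn.ReLU(),
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, 100),
            nn.ReLU(),
            nn.Linear(100, 5),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.linear_relu_stack(x)

model = TurretNet()
x = torch.randn(1, 9)   # one sample with 9 input features
print(model(x))         # untrained weights, so the 5 outputs are arbitrary values in (0, 1)

The final Sigmoid just squashes each of the 5 outputs into (0, 1); before training, those values are effectively random.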

Comments
  • 0
Seems correct.
The input passes through the layers and the sigmoid is applied at the end, right?
  • 0
@killames yeah, it does.
Just wondering about all the ReLU stuff from the examples.
  • 0
    @AvatarOfKaine

ReLU is an activation function. You apply an activation function after every linear layer because it defines the nonlinearity each node applies to its output. In torch it operates on Tensors.

So what it technically does is
Sigmoid(Linear(ReLU(Linear(ReLU(Linear(...))))))

It's correct (see the sketch after these comments for how the Sequential unrolls).
  • 0
@Hazarth do you think it's a valid application to train a turret to move in response to a target? I know you can just calculate the position it should move to, but I want to simulate a human hand moving a control stick, telling the turret which way to track manually, and then just stopping when it's far enough.
  • 0
@Hazarth that way I can just feed it data I generate.
  • 0
    @Hazarth hey. solve this for theta.
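For illustration, here is a sketch of the unrolled form the comment above describes. The layer variable names (lin1 .. lin4) are made up; the shapes just mirror the ones in the post:

import torch
from torch import nn
import torch.nn.functional as F

lin1 = nn.Linear(9, 512)
lin2 = nn.Linear(512, 512)
lin3 = nn.Linear(512, 100)
lin4 = nn.Linear(100, 5)

x = torch.randn(1, 9)

# nn.Sequential applies each module in order, which is the same
# as writing the nested calls out by hand:
y_nested = torch.sigmoid(lin4(F.relu(lin3(F.relu(lin2(F.relu(lin1(x))))))))

stack = nn.Sequential(lin1, nn.ReLU(), lin2, nn.ReLU(), lin3, nn.ReLU(), lin4, nn.Sigmoid())
y_stack = stack(x)

print(torch.allclose(y_nested, y_stack))  # True: both compute the same function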