Comments
@killames Yeah, it does.
Just wondering about all the ReLU stuff from the examples.
Hazarth @AvatarOfKaine
ReLU is an activation function. You apply an activation after each Linear layer so the network is non-linear; it's what each layer's nodes pass their outputs through. In torch it operates on Tensors.
So what the stack technically computes is
Sigmoid(Linear(ReLU(Linear(ReLU(Linear(...))))))
It's correct.
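For reference, a minimal sketch of what ReLU does to a Tensor (assuming the standard torch imports): it just zeroes out the negative entries, element-wise.

import torch
import torch.nn as nn

relu = nn.ReLU()
x = torch.tensor([-1.0, 0.5, -0.2, 3.0])
print(relu(x))  # tensor([0.0000, 0.5000, 0.0000, 3.0000])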
@Hazarth Do you think it's a valid application to train a turret to move in response to a target? I know you can just calculate the position it should move to, but I want to simulate a human hand moving a control stick, telling the turret which way to track manually, and then just stopping when it's far enough.
This is torch. Is this the proper way?
self.linear_relu_stack = nn.Sequential(
    nn.Linear(9, 512),
    nn.ReLU(),
    nn.Linear(512, 512),
    nn.ReLU(),
    nn.Linear(512, 100),
    nn.ReLU(),
    nn.Linear(100, 5),
    nn.Sigmoid()
)
The Sigmoid part?
It returns garbage numbers on the first run, of course.
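As a standalone sketch of what a first run through that stack looks like (the class and variable names here are placeholders, not from the post): with randomly initialized weights, the Sigmoid just squashes whatever the last Linear layer produces into (0, 1), so the outputs stay arbitrary until the network is trained against a loss.

import torch
import torch.nn as nn

class TurretNet(nn.Module):  # placeholder name
    def __init__(self):
        super().__init__()
        self.linear_relu_stack = nn.Sequential(
            nn.Linear(9, 512),
            nn.ReLU(),
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, 100),
            nn.ReLU(),
            nn.Linear(100, 5),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return self.linear_relu_stack(x)

model = TurretNet()
state = torch.rand(1, 9)   # batch of one 9-value observation (dummy data)
print(model(state))        # 5 values in (0, 1), meaningless until trained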