jassole
Ya cunts, how do you debug a vertex shader in webgl?
rant
Comments
-
jassole
@mansur85 A vertex shader is something you shove up your ass. You just need to pull the dangling shit stuck in the pubic hair to debug it.
-
The thing is that you need to supply vertex data in some way. There are a few ways to do that, but you could potentially build a mini test case with the help of
https://web.cs.upb.de/cgvb/...
Be aware that this is more of a learning program, but it should work in your case.
If you mean testing automatically, that's another story and I wouldn't know how you would test that at all.
-
@jassole Welcome to undefined behavior hell then. You're probably doing something that isn't quite standards-compliant, and it happens to work on some GPUs (probably the NVIDIA ones) but not on others.
-
jassole
@Oktokolo Annoying that even though programmable shaders are a two-decade-old concept, there is still no standardized way to debug them, not even a printf. I don't care how slow printf would make a shader, I just need to inspect a bunch of values on a specific GPU after X number of steps.
I need to check a couple of floating-point values, so it looks like I have to use pixel shaders and color output.
ANNNNOOYING!!!!!!
-
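A minimal sketch of that color-output trick, assuming a plain WebGL1 / GLSL ES 1.00 pipeline; the attribute, uniform and varying names here are made up for illustration:

// Vertex shader: copy the value you want to "print" into a varying.
attribute vec3 aPosition;
uniform mat4 uMvp;
varying float vDebug;

void main() {
  vec4 clipPos = uMvp * vec4(aPosition, 1.0);
  vDebug = clipPos.z / clipPos.w;      // e.g. inspect the depth you computed
  gl_Position = clipPos;
}

// Fragment shader: turn that value into a grey level you can eyeball.
// Note the value gets interpolated between vertices on its way here.
precision mediump float;
varying float vDebug;

void main() {
  float v = clamp(vDebug * 0.5 + 0.5, 0.0, 1.0);   // map an assumed -1..1 range into 0..1
  gl_FragColor = vec4(v, v, v, 1.0);
}

Eyeballing the grey level is often enough; for actual numbers, gl.readPixels on the pixel you care about gives the value back quantized to 1/255 steps, and packing the float across all four RGBA channels buys more precision if you need it.
-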
@jassole Well, to be fair, what are ya gonna do, print a message for every vertex/fragment?
Not sure how useful that is for shaders that are meant to visualize stuff.
However, not all hope is lost: if you use an NVIDIA card, you might be able to use their debugging tool (I think it's called NVIDIA Nsight or something).
-
jassole
@thebiochemic Yeah, I don't mind it being super slow.
But aren't modern GPUs generic, with shaders broken down into generic compute kernels? And no, not an NVIDIA card...
-
@jassole I think in that case you're talking about GPGPU, and that stuff goes in the direction of CUDA and OpenCL and shit like that. I can imagine the debugging capabilities there are potentially better.
Since I've never done anything like that, I can't tell how it works at all, though.
At the end of the day you'll end up needing to do it with the output of the shaders. But at least in the fragment shader you can split your view into four quadrants and show a different debug view in each one, as sketched below.
-
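A rough sketch of that quadrant idea as a GLSL ES 1.00 fragment shader; uViewport and the varyings are placeholder names for whatever your vertex shader actually outputs:

precision mediump float;
uniform vec2 uViewport;        // canvas size in pixels, assumed to be set from JS
varying vec3 vNormal;          // assumed outputs of the vertex shader under test
varying float vDepth;
varying vec2 vUv;

void main() {
  vec2 q = gl_FragCoord.xy / uViewport;                        // 0..1 across the screen
  if (q.x < 0.5 && q.y < 0.5) {
    gl_FragColor = vec4(normalize(vNormal) * 0.5 + 0.5, 1.0);  // bottom-left: normals
  } else if (q.y < 0.5) {
    gl_FragColor = vec4(vec3(vDepth), 1.0);                    // bottom-right: depth
  } else if (q.x < 0.5) {
    gl_FragColor = vec4(vUv, 0.0, 1.0);                        // top-left: UVs
  } else {
    gl_FragColor = vec4(vec3(fract(vDepth * 16.0)), 1.0);      // top-right: depth bands, handy for spotting precision issues
  }
}
-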
@jassole Vertex shaders are run for every single vertex; you would get thousands of printfs for a typical scene. You're asking for insanity.
Yes, turning the output of the vertex shader into a color in the fragment shader like thebiochemic suggests (just in a part of the render, so you can see normal and debug output right next to each other) is the way to go.
-
Debugging shaders can be hard. I don't know much about WebGL, but a bit about OpenGL.
Run the vertex shader over the complete screen, i.e. remove all other faces and keep only one or two (two triangles forming a full-screen rectangle). Then you can play around by setting the pixel value based on whatever you want to debug and see which pixel produces which debug output. If needed, set each colour channel to either 1 or 0, as in the sketch below.
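A sketch of that full-screen trick with the 1-or-0 channels, again assuming GLSL ES 1.00 and a made-up varying carrying the value under test; drawn over two triangles covering the whole screen, every pixel answers a few yes/no questions at once:

precision mediump float;
varying float vDebug;   // value forwarded from the vertex shader you are debugging

void main() {
  // Each channel is forced to 1.0 or 0.0 and answers one question about the value.
  float isNegative = vDebug < 0.0 ? 1.0 : 0.0;            // red: went negative
  float isHuge     = abs(vDebug) > 1000.0 ? 1.0 : 0.0;    // green: blew up
  float isNan      = (vDebug != vDebug) ? 1.0 : 0.0;      // blue: NaN (not guaranteed to work on every GPU)
  gl_FragColor = vec4(isNegative, isHuge, isNan, 1.0);
}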