Search - "shader"
-
Other people playing games: "Nice graphics!"
Me playing games: "Wonder how they did that shader..." -
Game devs: Let's make characters as photorealistic as possible.
Some Asian girls: Let's try to look as unrealistic as possible. -
I had a project where I completely surprised the client and the company right from the beginning. It was silky smooth, working cross-platform on iOS, Android and standalone. I wrote the most complex shader I ever made. Everything was great and I even got a bonus for the project.
Then one day, videos started to stutter. Not playing at all, completely black in some cases and on some devices.
I started to think about reasons. I tried every solution I came up with. No success.
Updated all the codecs and middleware, still nothing. Tried to solve the problem for a week. It was a total disaster and I even thought I don't deserve to be a developer.
We re-encoded the videos a few times. Decided to export the original video again and, boom, it worked. There's no particular reason why it worked. But it worked. I guess I am a good developer. Not the best but, eh. -
Sometimes I just don't know what to say anymore
I'm working on my engine and I really wanna push high triangle counts. I'm doing a pretty cool technique called visibility rendering and it's great because it kind of balances out some known causes of bad performance on GPUs (namely that pixels are always rasterized in quads, which is especially bad for small triangles)
So then I come across this post https://tellusim.com/compute-raster... which shows some fantastic results and just for the fun of it I implement it. Like not optimized or anything just a quick and dirty toy demo to see what sort of performance I can get
... I just don't know what to say. Using actual hardware accelerated rasterization, which GPUs are literally designed to be good at, I render about 37 million triangles in 3.6 ms. Eh, fine but not great. Then I implement this guy's unoptimized(!) software rasterizer and I render the same scene in 0.5 ms?!
IT'S LITERALLY A COMPUTE SHADER. I rasterize the triangles manually IN SOFTWARE and write them out with 64-bit atomic image stores. HOW IS THIS FASTER THAN ACTUAL HARDWARE!???
AND BY LIKE AN ORDER OF MAGNITUDE AT THAT???
Like I even tried doing some optimizations like backface cone culling on the meshlets, but doing that makes it slower. HOW. I'm rendering 37 million triangles without ANY fancy tricks. No hi-z depth culling which a GPU would normally do. No backface culling which a GPU would normally do. Not even damn clipping of triangles. I render ALL of them ALL the time. At 0.5 ms
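For context, the 64-bit atomic trick usually packs the depth into the high 32 bits and the triangle ID into the low 32 bits, so a single atomicMax does the depth test and the visibility write in one go. A minimal sketch in Vulkan-flavoured GLSL (shown with a buffer rather than an image, assuming the GL_EXT_shader_atomic_int64 extension and reversed-Z; all names are made up):

```glsl
#version 450
#extension GL_EXT_shader_explicit_arithmetic_types_int64 : require
#extension GL_EXT_shader_atomic_int64 : require

layout(local_size_x = 64) in;

// one 64-bit word per pixel: depth in the high 32 bits, triangle ID in the low 32
layout(std430, binding = 0) buffer VisBuffer { uint64_t visibility[]; };

layout(push_constant) uniform PC { uint width; uint height; } pc;

void writePixel(uvec2 pixel, float depth, uint triangleID) {
    if (pixel.x >= pc.width || pixel.y >= pc.height) return;
    // reversed-Z: larger depth bits mean closer, so atomicMax doubles as the depth test
    uint64_t packedValue = (uint64_t(floatBitsToUint(depth)) << 32) | uint64_t(triangleID);
    atomicMax(visibility[pixel.y * pc.width + pixel.x], packedValue);
}

void main() {
    // toy usage: one pixel per thread; the real shader rasterizes meshlet triangles here
    uvec2 p = uvec2(gl_GlobalInvocationID.x % pc.width, gl_GlobalInvocationID.x / pc.width);
    writePixel(p, 0.5, gl_GlobalInvocationID.x);
}
```
-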
It is 4:55.
I should sleep.
I've just created a GLSL shader on my phone which renders the Mandelbrot set in 4 bright flashing colors while zooming in.
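For reference, a minimal sketch of what a shader like that can look like (GLSL ES 3.00; uTime and uResolution are assumed uniforms fed by the host app):

```glsl
#version 300 es
precision highp float;

uniform float uTime;
uniform vec2  uResolution;
out vec4 fragColor;

void main() {
    // zoom in over time towards a point near the set's boundary
    float zoom = exp(-0.4 * uTime);
    vec2 c = (gl_FragCoord.xy / uResolution - 0.5) * 3.0 * zoom + vec2(-0.745, 0.186);

    vec2 z = vec2(0.0);
    int i = 0;
    for (; i < 128; ++i) {
        z = vec2(z.x * z.x - z.y * z.y, 2.0 * z.x * z.y) + c;   // z = z^2 + c
        if (dot(z, z) > 4.0) break;                             // escaped
    }

    // four bright colours, cycled over time for the "flashing" part
    vec3 palette[4] = vec3[4](vec3(1, 0, 0), vec3(0, 1, 0), vec3(0, 0, 1), vec3(1, 1, 0));
    vec3 col = (i == 128) ? vec3(0.0) : palette[(i + int(uTime * 8.0)) % 4];
    fragColor = vec4(col, 1.0);
}
```
-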
New idea: Fuck raytracing for global illumination because you just need too many rays for it to converge
What if we do surfels (to keep the number of probes down and relevant to our scene) and we update the 4x4-ish sized hemisphere irradiance maps not by tracing a single ray per frame per surfel. I have a fast as shit compute shader rasterizer... What if I just raster each surfel each frame? Should be around the same number of pixels as the primary visibility so totally feasible....
Each frame just jitter the projection a bit and voila. Should have extremely high quality diffuse global illumination at well below 1 ms. Holy shit this might just work
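A sketch of just the accumulation half of that idea: each frame the surfel's tiny hemisphere map gets one jittered raster pass and the result is blended into a running average. The data layout and the rasterize call are placeholders, not the author's actual engine code:

```glsl
#version 450
layout(local_size_x = 16) in;   // one thread per texel of one surfel's 4x4 map

struct Surfel { vec3 position; vec3 normal; };

layout(std430, binding = 0) readonly buffer Surfels { Surfel surfels[]; };
layout(std430, binding = 1) buffer Irradiance { vec4 irradiance[]; };   // 16 texels per surfel

layout(push_constant) uniform PC { uint frameIndex; } pc;

// placeholder for the jittered compute-raster pass over this texel's solid angle
vec3 rasterizeSurfelTexel(Surfel s, uint texel, uint frame) {
    return s.normal * 0.5 + 0.5;   // fake radiance so the sketch is self-contained
}

void main() {
    uint surfelID = gl_WorkGroupID.x;
    uint texel    = gl_LocalInvocationID.x;
    uint slot     = surfelID * 16u + texel;

    vec3 radiance = rasterizeSurfelTexel(surfels[surfelID], texel, pc.frameIndex);

    // exponential moving average over the jittered frames; clamp so it keeps responding
    float alpha = 1.0 / float(min(pc.frameIndex + 1u, 64u));
    irradiance[slot].rgb = mix(irradiance[slot].rgb, radiance, alpha);
}
```
-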
Writing an efficient, modern renderer is truly an exercise of patience. You have a good idea? Hah, fuck you, GPUs don't support that. Okay but what if I try to use this advanced feature? Eh, probably not going to support exactly what you would like to do. Okay fuck it I'm gonna use the most obscure features possible. Congratulations, it doesn't work even on the niche hardware that supports that extension
If I sound jaded, ya better believe I f*cking am! I cannot wait for more graphics cards to support features like mesh shaders so we can finally compute shader all the things and do things the way we want to god dammit -
Apparently "in" and "out" are the new "varying" and "attribute" in OpenGL's shader language...
Even buying a new graphics card can f*ck up your whole day! 😥
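For anyone hitting the same thing: the qualifiers changed in GLSL 1.30, and the same vertex shader reads like this before and after:

```glsl
// GLSL 1.20 era:
//   attribute vec3 a_position;   // per-vertex input
//   varying   vec2 v_uv;         // vertex -> fragment interpolant
//
// GLSL 1.30+ / 3.30 core, same vertex shader:
#version 330 core
in  vec3 a_position;   // "attribute" became "in"
in  vec2 a_uv;
out vec2 v_uv;         // "varying" became "out" here (and "in" on the fragment side)

void main() {
    v_uv = a_uv;
    gl_Position = vec4(a_position, 1.0);
}
```
-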
Just tried out Minecraft's shader mod SEUS and wondering what the fuck am I doing with my life being a web dev and not working on graphics.
If you have an nvidia gpu, please give it a try.
This is an example with PBR textures, it's mind blowing https://youtu.be/RbM5w9CBDIw
INB4 comment like "peasant web dev wants to do graphics lmao" -
So I'm struggling to finish this library which among other things is supposed to write flowing text. And this one's taking foreeeever and I'm hating it so much already.
I just keep daydreaming of starting a "simple" platformer. And then I go, "hm the parallax must be nice, it needs to have as many layers as possible, oh and look at this video, here they're even zooming and each layer rescales differently, good effect, I need to add that too. Also a plain platformer is just boring, it needs to have adventure elements, and even RPG too, yeah why not. Hm, it needs to have some motion blur, but oh I need this 1/48 shutter speed to make it look cinematic. Okay how do I go about adding this blur effect? What? Libgdx doesn't provide one out of the box? I need to use opengl shaders? A shader, eh... I'm not even sure what that is. Okay, let's see how to do it. Wow that's a total mess and resource hungry, and how will I calculate it all as to make it match the 1/48 thing?"
You know... Simple. And in the end, I'll abandon the library and won't get anywhere with the platformer (as usual).
Tsk tsk tsk
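Since the motion blur question comes up in the rant above: a naive directional blur over the scene texture is usually enough to fake the 1/48-shutter look. A sketch in libGDX-style GLSL ES, where u_velocity is an assumed uniform and u_texture/v_texCoords follow libGDX's default batch shader naming:

```glsl
#ifdef GL_ES
precision mediump float;
#endif

// u_velocity: screen-space camera motion this frame, scaled by the shutter fraction
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform vec2 u_velocity;

const int SAMPLES = 8;

void main() {
    vec4 sum = vec4(0.0);
    // average a few taps spread along the motion vector
    for (int i = 0; i < SAMPLES; ++i) {
        float t = float(i) / float(SAMPLES - 1) - 0.5;   // -0.5 .. +0.5
        sum += texture2D(u_texture, v_texCoords + u_velocity * t);
    }
    gl_FragColor = sum / float(SAMPLES);
}
```
-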
Every time you use OpenGL in a brand new project you have to go through the ceremonial blindfolded obstacle course that is getting the first damn triangle to show up. Is the shader code right? Did I forget to check an error on this buffer upload? Is my texture incomplete? Am I bad at matrix math? (Spoiler alert: usually yes) Did I not GL enable something? Is my context setup wrong? Did Nvidia release drivers that grep for my window title and refuse to display any geometry in it?
Oh. Needed to glViewport. OK. -
Putting every file, even SHADER CACHES (that huge cyan flower here (this illustration was made with "gource"), yes, every one of those tiny little dots is a file) or even complete libraries into their git repository.
-
PSA: The smaller the compute shader workgroups, the more efficient they are, down to the wave size (32 on nvidia). Not exactly sure why, but it looks like if you don't need group shared memory you should always have your workgroups be wave-sized
Just this alone gave me a 30%+ performance increase. And combined with a few other changes it got me from 50 µs to 10 µs, yay!
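For reference, the whole change amounts to the layout line; a minimal sketch with a placeholder body and an assumed buffer layout:

```glsl
#version 450
// 32 = one NVIDIA wave/warp; no shared memory needed, so nothing is lost by going small
layout(local_size_x = 32, local_size_y = 1, local_size_z = 1) in;

layout(std430, binding = 0) readonly buffer Src { float srcData[]; };
layout(std430, binding = 1) writeonly buffer Dst { float dstData[]; };

void main() {
    uint i = gl_GlobalInvocationID.x;
    if (i >= uint(srcData.length())) return;   // guard the tail of the dispatch
    dstData[i] = srcData[i] * 2.0;             // placeholder body; the PSA is about the layout line
}
```
-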
Name one thing more fun than atomically writing values into a GPU buffer and having them mysteriously vanish into the aether immediately after the compute shader invocation
I can literally see them in the buffer using RenderDoc and then as soon as I go to the next command the buffer is completely filled with zeros again as if the values never existed
?? like how ??
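No way to tell the actual cause from the rant, but a classic way to lose compute-shader writes is a missing memory barrier or coherency between the dispatch and whoever reads the buffer next. A hedged sketch of the shader-side half (buffer names made up); the API side still needs its own barrier after the dispatch:

```glsl
#version 450
layout(local_size_x = 64) in;

// 'coherent' plus explicit barriers on the shader side; the API side still needs its
// own barrier after the dispatch (glMemoryBarrier(GL_SHADER_STORAGE_BARRIER_BIT) in GL,
// a pipeline barrier in Vulkan) or later reads may legally see stale zeroes.
layout(std430, binding = 0) coherent buffer OutBuf {
    uint writeCount;
    uint results[];
};

void main() {
    uint slot = atomicAdd(writeCount, 1u);      // reserve a slot
    results[slot] = gl_GlobalInvocationID.x;    // the write that "vanishes" without barriers
    memoryBarrierBuffer();                      // flush this invocation's SSBO writes
}
```
-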
My last post was a year ago. What brought me back here is the ability of AI to agree and apologize to anything and everything, while producing the worst hopeful code.
4 days I wasted trying to make an Android audio visualizer, but AI... sigh.
It gave me the wrong structure of FFT bytes emitted. I corrected it.
It gave me the wrong logarithm calc, I corrected it.
It gave me the wrong sampling rate, I corrected it.
It gave me the wrong texture order, I corrected it.
It gave me the wrong glsl sample2d, I corrected it.
It gave me the wrong textureID generation, I corrected it.
It gave me a render which was about 10 fps. I found out that instead of using the native onDraw, I had a fcking delta time in my shader. I almost corrected it, then I gave up.
Let's go to code generators with Annotations.
Like always, starts very positive, until I start to correct it.
It gave me the wrong file locations, I corrected it.
It gave me the wrong order of find, copy, modify and write to .build, I didn't correct it.
It gave me regexes to find annotations. I'm like, so what's the use of an "ANNOTATION PROCESSOR"?
It apologized and used a fucking regex in the processor... I didn't correct it. In the end, I was left with a separate module, targeting iOS, Android and JVM, with an annotation processor implemented in jvmMain, which tries to modify commonMain src by finding annotations with regexes, which won't run on app build or app sync project, but only on a java -jre command pointing to that fucking .java class in that module, which takes at least 2 mins to run, and finally generates 0 files.
I needed to rant. I understand LLMs are just models of words built and stolen from the most intelligent and dumbest people out there. But I'm an idiot for getting my hopes high. I can't build anything new and unheard of. I used to do that. I once made a textView + image print util for a Bluetooth printer just to say FU to libraries and heavy SDKs, like literally rasterizing shit into Bluetooth packets. I needed to let off some steam. I haven't been here in a year so I don't know what reactions I can get from this rant. I bet someone will just say "yeah we're tired of 'Fuck AI' rants", but shit, it hurts. When I gave up on that visualizer, I downloaded an app, I think it's called projectM, like in reference to MilkDrop... like the Winamp MilkDrop. I opened it, played something on Spotify, and let my eyes go blind
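Loosely related to the visualizer half of the rant, and not the author's code: once the FFT magnitudes are in a texture, drawing bars in a fragment shader is short. u_fft is an assumed N x 1 texture of already log-scaled magnitudes uploaded per frame:

```glsl
#version 300 es
precision mediump float;

uniform sampler2D u_fft;          // N x 1 texture, red channel = magnitude in [0,1]
uniform vec2 u_resolution;
out vec4 fragColor;

void main() {
    vec2 uv = gl_FragCoord.xy / u_resolution;
    float magnitude = texture(u_fft, vec2(uv.x, 0.5)).r;

    // simple bar plot: a pixel is lit if it sits below the bar height for its column
    float bar = step(uv.y, magnitude);
    fragColor = vec4(vec3(0.1, 0.9, 0.4) * bar, 1.0);
}
```
-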
I've been working on a shader for the past few days. Lots of doing math on paper and switching to code to implement it. Yesterday, after 3 or 4 hours of trying to figure out why nothing was rendering, I realized that I wrote all my * for multiplication as x. Visual Studio never let me know it's a syntax error, and my fried brain saw no issue. Needless to say my shader is still bugged to hell, but at least my multiplication works.
-
Google: "shader particle trail effect"
Click the YouTube link
4 hours later... I've seen all the "gold digger prank" videos.
Disaster! -
This is one of the coolest shader tutorials I have seen.
https://youtube.com/watch/...
It simply walks you through enhancing a weapon, start to finish. I found it can be applied to 2D games as well. What I don't like is that it is not set up to be generic. I will have to figure out how to make it a weapon effect you can apply. I think having weapons provide a mask for where the shaders should be applied would make that possible. Then the generic effects can be applied to the weapons or removed. No need to have unique weapons of every type and for every effect.
This is the kind of tutorial that really gets me going. When thinking of 2D I had not really thought about using shaders like this.
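A sketch of the mask idea from the rant above: keep the effect shader generic and let each weapon supply a mask texture saying where the effect applies. All names are made up:

```glsl
#version 330 core
in  vec2 v_uv;
out vec4 fragColor;

uniform sampler2D u_weaponSprite;   // the weapon's base texture
uniform sampler2D u_effectMask;     // assumed per-weapon mask: white = apply effect
uniform float u_time;

void main() {
    vec4  base = texture(u_weaponSprite, v_uv);
    float mask = texture(u_effectMask, v_uv).r;

    // one generic pulsing "enchant" glow; swap this part per effect, reuse the mask
    vec3 glow = vec3(1.0, 0.55, 0.1) * (0.5 + 0.5 * sin(u_time * 4.0));

    fragColor = vec4(base.rgb + glow * mask, base.a);
}
```
-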
So I got my compute shader rasterizer working pretty well now which is great. I now also have a fallback to hardware rasterization for triangles which are a bit sussy (mostly just too large) and getting that implemented without tanking performance (gazillion threads hitting the same atomic variable at the same time) involved some tricky workgroup/subgroup hackery but I'm happy with it
Only problem... I have like 90%+ SM occupancy (which is great) but I also have 90%+ SM occupancy which means the nvidia drivers think I'm mining cryptocurrency and start bottlenecking my compute performance at random. It slowly goes up to 3x, then it slowly goes down again, then it slowly goes up again... argh
Thanks, miners 😐
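A sketch of what that workgroup/subgroup hackery can look like: one atomicAdd per subgroup reserves a whole range of slots instead of every thread hammering the same counter. Buffer and function names are assumptions, not the author's code:

```glsl
#version 450
#extension GL_KHR_shader_subgroup_basic : require
#extension GL_KHR_shader_subgroup_ballot : require
#extension GL_KHR_shader_subgroup_arithmetic : require

layout(local_size_x = 64) in;

// assumed queue for triangles kicked over to the hardware-raster fallback path
layout(std430, binding = 0) buffer Fallback {
    uint fallbackCount;
    uint fallbackTris[];
};

void emitFallbackTriangle(uint triangleID, bool tooBig) {
    uint want = tooBig ? 1u : 0u;
    uint subgroupTotal = subgroupAdd(want);            // slots needed by this whole subgroup
    uint base = 0u;
    if (subgroupElect())                               // only one lane touches the global atomic
        base = atomicAdd(fallbackCount, subgroupTotal);
    base = subgroupBroadcastFirst(base);               // share the reserved range
    uint offset = base + subgroupExclusiveAdd(want);   // each lane's slot inside the range
    if (tooBig)
        fallbackTris[offset] = triangleID;
}

void main() {
    // toy usage: pretend every 7th triangle is too large for the software path
    uint tri = gl_GlobalInvocationID.x;
    emitFallbackTriangle(tri, (tri % 7u) == 0u);
}
```
-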
So, in OpenGL 4.x there is no circle primitive, and the only ways to draw an almost perfect circle are the following:
Draw a triangle fan and fk up your memory for a circle
Draw a rectangle and use the fragment shader and distance equation to discard the bit that is not used
But you will need to add an if statement and potentially increase the frame time (from what I have heard)
And it will be more complicated than just using a triangle fan
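The second option from that list as a fragment shader, assuming the quad's UVs in [0,1] are passed in from the vertex shader:

```glsl
#version 330 core
in  vec2 v_uv;          // quad UVs in [0,1]
out vec4 fragColor;
uniform vec4 u_color;

void main() {
    // keep only fragments inside the circle inscribed in the quad
    if (distance(v_uv, vec2(0.5)) > 0.5)
        discard;                       // the branch the rant worries about
    fragColor = u_color;
}
```
-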
I swear both gl_PrimitiveID and the noiseX functions don't fucking work. No reason at all, they just don't want to work. Attached is a screenshot with a """""random""""" shade of blue per vertex (with flat interpolation) based on screen pos
-
Why, just why is this being recommended to me like, on every fucking video.
Yes I do a bit of GLSL but that's fucking it.
https://youtu.be/OWdAT-D_klg -
There's nothing quite like an app force closing for no apparent reason, and no error log info.
Just spent an hour figuring out that one of the devices I test on doesn't like linking GLSL shader programs if they contain a loop, even though every other device I've tested on is totally fine with it. 😑
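No way to know the real cause from here, but GLSL ES 1.00 compilers are only required to handle loops with compile-time constant bounds (spec Appendix A), which tends to bite on some devices and not others. A sketch of the usual workaround, assuming that is the culprit:

```glsl
#ifdef GL_ES
precision mediump float;
#endif

// hoist the limit into a constant and break early instead of looping on a uniform
uniform int uCount;          // runtime iteration count, assumed <= MAX_STEPS
uniform vec3 uColors[8];

const int MAX_STEPS = 8;     // compile-time constant bound keeps picky compilers happy

void main() {
    vec3 sum = vec3(0.0);
    for (int i = 0; i < MAX_STEPS; ++i) {
        if (i >= uCount) break;        // the runtime limit lives inside the loop instead
        sum += uColors[i];
    }
    gl_FragColor = vec4(sum / max(float(uCount), 1.0), 1.0);
}
```
-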
Is it a good idea to switch from learning OpenGL to learning Vulkan now?
I have been learning OpenGL for the past few months and now that Vulkan is out I am thinking about learning that instead. I've heard that it's harder to learn though, so roughly how long do you think it would take to learn it as an OpenGL novice?
In OpenGL I have used instanced rendering with different textures and specular maps in the shader, all in perspective 3D of course. -
New to devRant, and I love so many of the stories here.
Currently working on a shader project which I hate and am very bad at.
Anyone know a good place to learn different types of shaders? I have some basic knowledge of how GLSL works, but not enough to make cool stuff.
Also trying to add a texture. Not working yet tho ;( -
So, some friends of mine are going to work on a horror game in Unity 2019.
Does anybody have software recommendations for audio editing, shader development or such?
Any advice greatly appreciated. -
Why is there no language that can run on a GPU, not counting GLSL and the like, since they're shader languages and are only used for that?
-
I was looking through some shader videos and wanted to see how hard it is to write one, and after some research I am now confused.
Are shaders just basically fancy filters, like black and white filters and those shitty Snapchat colour filters, or is there a particular trait that differentiates shaders from filters that I am missing?
I always thought that shaders were all about ray tracing/marching and obtaining the effects that way. -
Not finding what I want via Google so I'll ask here: what's the deal with OpenGL ES Android shaders freezing my phone's screen?
Is it normal, unavoidable behaviour for a shader with an infinite loop to fuck up the visual output irreversibly (until a phone restart)? -
Sophomore year is starting soon so I'm looking for new project(s) to complete in parallel with my studies.
Some are more design-y and some more backend-y but I recently started getting better at designing so :)
1) Learn some fragment shader stuff. I've always been messing around with graphics and have a game on steam, so I think that's a good idea to be paired with signal processing.
2) Reactive web services. Preferably with spring-boot or vert.x but
3) I would also like to dive into golang (and make some reactive thing with it)
4) WebAssembly seems nice... But I got some concerns
5) exercise making wireframes -> CSS (with some js)
6) I've never really done any real backend work with nodejs, except serving and AOT compiling js, or doing gulp tasks
7) Implementing a whole project, or a fraction of it as serverless on aws
* I'm definitely going to use a couple very simple services to make a docker swarm with load balancing, etc, just because I know how everything works but got no practical knowledge
8) Design an esports jersey for the university department I'm in (shouldn't take long)
So what do you guys think? Recommendations are welcome :)
P.S. last year in review:
> A webapp running on a raspberry pi powering a reflex testing game on gpio (java/spring-boot, codename: buttonmasher)
> small Elasticsearch cluster to monitor some random university servers through Kibana dashboards
> laser tracking on wall of *any* colour and variable light conditions via a webcam (opencv) , controlling the mouse pointer, whether you run it against a projector or any wall
> jstrain.herokuapp.com => a small JavaScript powered tool with a DSL to help you train more efficiently without a coach
> Various random Photoshop stuff