
> GLSL is fine.

How would you use shared/local memory in GLSL? If you wanted to implement Kahan summation, would that even be possible? And how's the out-of-core and multi-GPU support in GLSL?
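For context, a sketch of what Kahan (compensated) summation could look like in a GLSL compute shader. The buffer names and bindings here are made up for illustration; the crux of the question is whether the compensation term survives compilation, since shader compilers are free to reassociate floating-point math. The `precise` qualifier (GLSL 4.0+) asks the compiler not to:

```glsl
#version 430
layout(local_size_x = 1) in;

// Hypothetical buffers for illustration only.
layout(std430, binding = 0) buffer Data   { float values[]; };
layout(std430, binding = 1) buffer Result { float total; };

void main() {
    // `precise` forbids reassociation/fusing that would cancel
    // out the compensation term c.
    precise float sum = 0.0;
    precise float c   = 0.0;  // running compensation for lost low-order bits
    int n = values.length();
    for (int i = 0; i < n; ++i) {
        precise float y = values[i] - c;
        precise float t = sum + y;
        c   = (t - sum) - y;  // recover the low-order bits lost in t
        sum = t;
    }
    total = sum;
}
```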

> People don't understand

Careful pointing that finger, four fingers might point back... Shadertoy isn't some obscure thing no one has heard of; some of us have been in the demoscene for over 20 years :)



I don't know x3

> some of us have been in the demoscene for over 20 years :)

The demoscene is different, though. What I'm imagining Shadertoy could be hasn't really been implemented. GLSL shaders are almost entirely unknown outside of dev circles, and that's a bummer.


> How would you use shared/local memory in GLSL?

In compute shaders, the `shared` storage qualifier declares memory shared across a workgroup, the GLSL equivalent of CUDA's shared / OpenCL's local memory.
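A minimal sketch of the usual pattern, a tree reduction staged through `shared` memory (buffer names and the workgroup size of 256 are assumptions for illustration):

```glsl
#version 430
layout(local_size_x = 256) in;

// Hypothetical buffers for illustration only.
layout(std430, binding = 0) buffer Data { float values[]; };
layout(std430, binding = 1) buffer Out  { float partialSums[]; };

shared float tile[256];  // workgroup-local (shared) memory

void main() {
    uint lid = gl_LocalInvocationID.x;
    tile[lid] = values[gl_GlobalInvocationID.x];
    barrier();  // wait until the whole workgroup has written its element

    // Tree reduction within the tile: halve the active range each step.
    for (uint stride = 128u; stride > 0u; stride >>= 1) {
        if (lid < stride) tile[lid] += tile[lid + stride];
        barrier();
    }
    if (lid == 0u) partialSums[gl_WorkGroupID.x] = tile[0];
}
```

Each workgroup emits one partial sum; a second dispatch (or a readback) combines them.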



