erichlof/THREE.js-PathTracing-Renderer

More abstractions

tom-adsfund opened this issue · 9 comments

Amazing work on this very valuable project! I've been trying it with a Tesla V100, and the results are extremely good. I can imagine powerful new user interfaces being possible with this. I'll probably be posting more issues with ideas, but this will be my first.

Now that you have multiple examples, I think it would be very useful to others to make some abstractions that would allow putting projects together in a quick but powerful way. The existing examples have many parts (js, shaders, etc.) mixed together, so it's unclear where there is possible duplication, variation, etc.

Ideally there would be some basic setup that could then have features added and options chosen. Migrating existing examples to this new format would highlight where the similarities and differences lie, and make the abstractions clearer. This in turn would allow more people to work on separate components and accelerate the project's development.

Hello @tom-adsfund
Thank you! I agree with you that there needs to be another level of abstraction. Mainly I have been exploring areas of path tracing that interest me, which is why the examples are so varied. Whenever I make a discovery, or make progress in a certain area, I immediately want to share it publicly (and the source code), so that others can also benefit from what I have learned or discovered. So what ends up happening is that I quickly start a new project in my IDE that showcases the new discovery or new feature that I have implemented. Hence all the seemingly random demos!

Having said that, I actually have made some progress on the code-repetition-avoidance front by following mrdoob's advice and creating a library of path tracing shader pound-includes. Also, following that example, I made an InitCommon.js file that all the demos use. This file handles most of the boilerplate and three.js initializing.
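For anyone reading along, the pattern looks roughly like this - a minimal sketch, with an invented chunk name standing in for the repo's real ones:

```javascript
import * as THREE from 'three';

// A minimal sketch of the pound-include pattern (this chunk name is invented
// for illustration; the repo's actual chunk names differ): a reusable GLSL
// snippet gets registered once on THREE.ShaderChunk...
THREE.ShaderChunk['pathtracing_example_rand'] = /* glsl */`
	float rand() { return 0.5; } // shared helper, written once, used by many demos
`;

// ...and three.js expands "#include <name>" against THREE.ShaderChunk when it
// compiles the material, so each demo's shader pulls in only what it needs:
const fragmentShader = /* glsl */`
	#include <pathtracing_example_rand>
	void main() { gl_FragColor = vec4(vec3(rand()), 1.0); }
`;

const material = new THREE.ShaderMaterial({ fragmentShader });
```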

I've tried to make it as easy as possible to understand how each file fits into the bigger picture of the overall project/demos. Basically each demo has its own html shell (which loads in any dependencies), then its own js init file (that is specific to that particular scene and that can't really be abstracted much further), and its own shader, which makes use of the pound-includes mentioned earlier. Again, these shaders are specific to the particular scene, and use only what is necessary to run correctly and give no GLSL errors for that demo. There is not much dead code at all in the whole repo's demos. My goal has always been for every single line of code to be necessary to run the demo: if a line were removed from anywhere (html, js, glsl), the demo would not work correctly.

The extra layer of abstraction that we speak of could possibly take the form of a js init file that takes in user settings or parameters for the scene (as you mentioned), then magically assembles the shader includes and creates a shader that is tailor-made for that particular scene or effect. I guess the ultimate experience would be that the end user/developer never has to even look at the raw shader code and its includes - kind of like when we build basic three.js apps that use traditional rasterization and WebGL.
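As a very rough sketch of that idea (all of the option and chunk names below are invented placeholders, not anything that exists in the repo today):

```javascript
// Hypothetical shader-assembly function: high-level scene options in,
// tailor-made GLSL out. The option and chunk names are placeholders.
function assembleShader(options) {
	const chunks = ['#include <pathtracing_core>']; // always needed (hypothetical chunk)
	if (options.environmentLight) chunks.push('#include <pathtracing_sky>');
	if (options.useBVH) chunks.push('#include <pathtracing_bvh>');
	return chunks.join('\n') + '\n' + options.sceneGLSL;
}

// The end user/developer would only ever see the options object, never the raw GLSL:
const fragmentShader = assembleShader({
	environmentLight: true,
	useBVH: false,
	sceneGLSL: /* glsl */`void main() { /* scene-specific code */ }`,
});
```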

I would be open to someone coming along and building that shader-assembly layer, but at the moment I am still exploring various cutting-edge areas of real time ray tracing and path tracing that haven't been done before in the browser. WebGL2 has opened the door to a lot of possibilities. Even before we all move to WebGPU (which seems to be the next big trend in browser-based real time graphics), I think there are still unexplored areas that could benefit a lot of users, especially now that all major browsers fully support WebGL2. That's a huge segment of the tech-consuming population, which is exciting indeed!

Thanks for the comments and suggestions. Feel free to send me more! 😀
-Erich

Even for the development of your own ideas, having more abstractions would help, because you could manage more demos and demo variations. I'm going to post another issue soon about an experiment I was doing, and about how this would greatly help with that.

The automatic generation of the shader is exactly the type of thing I was thinking of. From a library user's perspective, understanding the shader language should be left until absolutely necessary. Again, with the experiment I was working on, I had to learn the hard way that the shader language doesn't have automatic type coercion from int to float!
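For anyone who hits the same wall, the gotcha looks like this (a tiny GLSL fragment, as it would appear inside a demo's shader string):

```javascript
// GLSL ES (what WebGL uses) has no implicit int-to-float conversion, unlike
// JavaScript - so mixing the two types is a hard compile error:
const snippet = /* glsl */`
	int sampleCount = 4;
	// float weight = 1 / sampleCount;       // ERROR: cannot assign int to float
	float weight = 1.0 / float(sampleCount); // explicit conversion is required
`;
```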

I think Three.js has done an excellent job of making simple things easy while giving great power. Something I'd like to do, and which should be trivial with the abstractions, is to put an image behind "a piece of glass" that itself has natural parameters, like curvature. Obviously there are limits to this, but simple things could be composed much more easily, especially in important experiments around performance. Another example is preferences: I've spent minutes suffering while trying to control the camera movement with the method you've set as the default, and if choosing a different method were a trivial option, I could switch to something easier for me! :)
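To make that concrete, here's the purely hypothetical kind of setup call I have in mind (none of these names exist in the project; the stub just shows the shape of the API):

```javascript
// Hypothetical declarative API - createPathTracedScene does not exist in the
// repo; this stub only illustrates the shape of the proposed abstraction.
function createPathTracedScene(options) {
	console.log('would assemble the shader and init three.js from:', options);
}

createPathTracedScene({
	camera: { controls: 'orbit' }, // trivially swap out the default control scheme
	objects: [
		{ type: 'imagePlane', src: 'photo.jpg' },
		{ type: 'glassPane', curvature: 0.2, ior: 1.5 },
	],
});
```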

I think there are really radical benefits to having abstractions that more people can use, and they will outweigh any inconvenience or delay to your personal experiments. I hope my next issue will highlight a case of this, but with hundreds or even thousands more people using the project day-to-day, you'll get an abundance of payoffs from people working on things in parallel and together in groups.

The potential for this kind of rendering is so great, it's hard to overstate. But it definitely needs the right abstractions to allow that potential to be realized.

I'm currently trying to port the renderer to three.js' Nodes system (see #61), and I personally think that with it, it would be much easier to work with abstractions (for example, to call one function in JS which sets up the slow-but-perfect RNG in GLSL/WGSL, or another which sets up the fast-but-jagged one, without the necessity of manually changing that in the compiled GLSL/WGSL shader).
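In plain-string form the idea looks something like this (a sketch with invented names - not the actual Nodes API):

```javascript
// Two interchangeable GLSL bodies for the same function; the bodies here are
// trivial stand-ins for the real slow-but-perfect and fast-but-jagged RNGs.
const rngImplementations = {
	perfect: /* glsl */`float rng() { return 0.5; /* high-quality rng body */ }`,
	fast:    /* glsl */`float rng() { return 0.5; /* cheap hash rng body */ }`,
};

// The choice happens once, in JS - nobody edits the compiled shader by hand.
function buildShader({ rng = 'perfect' } = {}) {
	return rngImplementations[rng] + /* glsl */`
		void main() { gl_FragColor = vec4(vec3(rng()), 1.0); }
	`;
}
```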

Also, one of the abstractions that I personally would love to see: the ability both to generate the shader on the fly (as the scene changes) for almost-static scenes, and to compile the shader once and then just send scene data to it as it changes (in textures for GLSL or buffers for WGSL) for dynamic scenes. I'll try to implement this in the port :-)
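For the dynamic-scene half, the sketch I have in mind is roughly this (the per-object layout below is invented for illustration):

```javascript
import * as THREE from 'three';

// Pack per-object scene data into a float texture that the already-compiled
// shader samples each frame. 8 floats per object = 2 RGBA texels per row.
const objectCount = 64;
const floatsPerObject = 8; // e.g. position.xyz, radius, albedo.rgb, material id
const data = new Float32Array(objectCount * floatsPerObject);

const sceneDataTexture = new THREE.DataTexture(
	data,
	floatsPerObject / 4, // width: RGBA texels per object
	objectCount,         // height: one row per object
	THREE.RGBAFormat,
	THREE.FloatType
);

// Each frame: mutate the array, flag the texture dirty, and the shader sees
// fresh values - no shader recompilation needed.
function updateScene(objects) {
	objects.forEach((obj, i) => {
		data.set([obj.x, obj.y, obj.z, obj.radius, obj.r, obj.g, obj.b, obj.materialId],
			i * floatsPerObject);
	});
	sceneDataTexture.needsUpdate = true;
}
```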

@LeviPesin It would be really cool to see you port it to Three.js Node Materials and WebGPU, which will be the next major standard. This would basically get us to an almost perfect place, but I can imagine there might be some really powerful things from this project (like more advanced materials and surfaces) that will have to be woven into the main Three.js project, which you would have to work with them on... but I have no idea about that.

Moving the Three.js renderer to a ray/path system will be a profound and fundamental change for expressiveness and understanding in day-to-day 3D - so deeply important as we work on apps around biology, for example. Keep up the great work!!

@LeviPesin

Great to hear that you're working on this! I'm glad that you are taking the plunge into WebGPU - I'm just starting to feel like I'm getting the hang of WebGL2, and then along comes WebGPU, ha! I will definitely look at your port to learn from it.

Even though I don't know WebGPU, I am very comfortable with WebGL2. If you ever have any questions about my current project and how it all fits together, please don't hesitate to email me. Or, if you like, open a new question/issue thread here; that way our discussion will be visible to all, including people who come here at a later point in time and would be able to benefit from it.

Unless it's super math-heavy (like iq's prng or his Torus quartic solver), I can account for every single line of code in this entire repo and I understand why it's there, and how it fits into the project as a whole. So if you are unsure about any part of it while trying to port to WebGPU, again, don't hesitate to ask. I will do my best to help as much as I can. :-)

@erichlof Thank you for your kind words (and, of course, for the amazing renderer)!!!

@LeviPesin how long do you estimate it will take to port the renderer?

About 2-3 months, I think...

Great!