Some image GI question
RenderTool opened this issue · 10 comments
Hi! These tests and their results are outlined below.
The problem happens in the latest version! (I have all of your historical versions; the image above shows me using your old version, three.js R110.)
The gamma of the image seems to be wrong. These are the parameter settings used in my testing:
GT630M 2GB, bounces=10, HDRI_Exposure=1.8
In addition, I don't think hard-coding the sun from the HDRI image is a good approach. Maybe you could convert the HDRI into a CDF (cumulative distribution function): the brighter a region of the image, the larger the interval it occupies in the CDF. Then, when sampling, you still draw a uniform random number in [0, 1] to pick the corresponding point in the corresponding region, and importance sampling is achieved.
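To sketch the CDF idea in JavaScript (illustrative names only, not code from this repo): accumulate per-texel luminance into a running sum, normalize to [0, 1], then invert with a binary search. Brighter texels occupy larger intervals, so they are picked proportionally more often.

```javascript
// Build a normalized CDF from a flat array of per-texel HDRI luminances.
// (Hypothetical helper names; assumes `lum` is width*height luminance values.)
function buildCdf(lum) {
  const cdf = new Float32Array(lum.length);
  let sum = 0;
  for (let i = 0; i < lum.length; i++) {
    sum += lum[i];
    cdf[i] = sum; // running (prefix) sum
  }
  for (let i = 0; i < cdf.length; i++) cdf[i] /= sum; // normalize to [0, 1]
  return cdf;
}

// Map a uniform random u in [0, 1) to a texel index via binary search.
// Brighter texels own wider slices of [0, 1], so they get sampled more.
function sampleCdf(cdf, u) {
  let lo = 0, hi = cdf.length - 1;
  while (lo < hi) {
    const mid = (lo + hi) >> 1;
    if (cdf[mid] < u) lo = mid + 1;
    else hi = mid;
  }
  return lo;
}
```

In a shader you would do the same thing with the CDF stored in a texture (often split into a marginal CDF over rows and a conditional CDF per row), but the inversion logic is identical.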
Hello @q750831855
Yes multiple importance sampling is tricky and subtle. As of now, the direct sampling of the sun direction is the best way I know how to make the image converge the fastest, especially with HDRI images. The path tracer you referenced at the end of your post is really cool! - I will try to learn from his PBR sampling strategies (like sampling GGX specular, something I have yet to implement!)
Having said that, he has not totally solved this HDRI problem either. Notice the white noise fireflies from the HDRI image not being sampled optimally:
It gets more tricky if you have multiple light sources in the HDRI like you posted about: the 3 big white circles. Yes you would need a CDF - I sort of know how to do it (I have been reading up on the subject), but I have not yet implemented it. If I come to learn the subject better, I will post a future demo of an HDRI with arbitrarily placed light sources, instead of an easy single sun light source.
Now as to why your 2 images of the same scene but different versions at the beginning of your post appear different: the earlier version (first image) I believe was tonemapping twice, which is incorrect. I was applying the tonemap to the HDRI when it was sampled, then applying the tonemap again in the final pass shader that outputs the image to the canvas. The second image, although darker, just needs a higher exposure on the HDRI. Also, don't forget to send more sample rays from the room walls, floor, and ceiling towards the 'Quad' that represents the square area of the bright HDRI image. This should brighten up the room and make everything converge almost instantly, after a couple of seconds!
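The double-tonemap bug can be shown with a minimal JavaScript sketch (hypothetical helper names, using simple Reinhard tonemapping as a stand-in for whatever operator the shader uses): radiance must stay linear through the whole path trace, with exposure and the tonemap applied exactly once, in the final output pass.

```javascript
// Simple Reinhard tone map: compresses linear HDR radiance into [0, 1).
function reinhardToneMap(c) { // c = [r, g, b] in linear space
  return c.map(v => v / (1 + v));
}

// WRONG (the old behavior): tonemapping the HDRI at sample time AND again
// in the final pass compresses the image twice, washing out the gamma:
//   const sample = reinhardToneMap(hdriTexel);   // now non-linear!
//   ...path trace in that non-linear space...
//   const pixel  = reinhardToneMap(sample);      // compressed a 2nd time

// RIGHT: keep everything linear, scale by exposure, tonemap once at the end.
function finalPass(linearColor, exposure) {
  const exposed = linearColor.map(v => v * exposure);
  return reinhardToneMap(exposed);
}
```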
If you need help, you can refer to how I sample the Quad light in this demo:
And here's the 'quad' sampling function you can use to instantly brighten up and converge your room scene:
https://github.com/erichlof/THREE.js-PathTracing-Renderer/blob/gh-pages/js/pathTracingCommon.js#L1511-L1532
In your demo scene, however, the quad is oriented in the X-Y plane, whereas my function is meant for an X-Z plane quad, typical of a room ceiling light. So make sure you change the following lines:

```glsl
randPointOnLight.y = light.v0.y;
randPointOnLight.z = mix(light.v0.z, light.v3.z, clamp(rand(seed), 0.1, 0.9));
```

to:

```glsl
randPointOnLight.y = mix(light.v0.y, light.v3.y, clamp(rand(seed), 0.1, 0.9));
randPointOnLight.z = light.v0.z;
```
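For clarity, here is the same idea as a JavaScript sketch (the real code is GLSL; `light.v0`/`light.v3` follow the snippet above, everything else is illustrative): for a quad lying in the X-Y plane, X and Y are each randomized between opposite corners v0 and v3, while Z stays fixed at the quad's plane.

```javascript
// mix and clamp mirror the GLSL built-ins of the same names.
const mix = (a, b, t) => a + (b - a) * t;
const clamp = (v, lo, hi) => Math.min(Math.max(v, lo), hi);

// Pick a random sample point on a quad light oriented in the X-Y plane.
// `rand` is any function returning a uniform random number in [0, 1).
function randPointOnXYQuad(light, rand) {
  const t1 = clamp(rand(), 0.1, 0.9); // clamped to stay slightly inside the edges
  const t2 = clamp(rand(), 0.1, 0.9);
  return {
    x: mix(light.v0.x, light.v3.x, t1),
    y: mix(light.v0.y, light.v3.y, t2),
    z: light.v0.z, // the quad lies in a plane of constant Z
  };
}
```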
Hope this helps!
Hi! @erichlof Thank you for your explanation, but I'm afraid I'm still in a fog over what happened. I retested the scene on my GTX 1080 Ti. I set the HDRI brightness to 10 or even higher, and the shadow in the window was still not rendered correctly.
Maybe you were visually misled by your black box.
@q750831855
Hello, thank you for the additional images. Yes, something is clearly not adding up; I am puzzled also. At first glance, it looks like the path tracer in the first image is not doing the GI bounce/gather correctly (or at all!) - it is only collecting the direct light through the window, which is why the middle box that faces the camera and faces away from the window light is totally black - it's not getting any GI bounces. As a quick fix, can you try increasing the count in the bounces loop? So if it is bounces < 5, try bounces < 6, or even bounces < 7. What troubles me is that the refractive glass sphere is totally black - it's not getting enough ray bounces through the sphere to be able to report what's behind it.
Is there a way you can post the room scene model (.glb or .gltf) along with the project's matching .js file (that loads everything in and places the scene geometry) to your own repo here on GitHub temporarily? That way I can download the same exact scene files that you're using. Maybe if I can reproduce the visual discrepancy on my machine, I can experiment with some different remedies to try and understand the problem more deeply and hopefully fix whatever is wrong.
Let me know what the best way is to share the necessary scene files with me, if that is ok with you.
Thank you!
-Erich
@erichlof https://github.com/q750831855/THREE.js-PathTracing-Renderer_TEST.git
Due to network constraints, I re-uploaded a test copy.
@q750831855
Thanks so much for hosting a copy of your project! I have downloaded and run the test example above and I am happy to report that I have found the problem! It is late here in Texas though, and I have to get up early in the morning. I shall return soon with the revised pathtracing shader as well as a sample rendering that closely matches your second picture above!
@q750831855
Success! I was able to locate the source of the problem as well as fix it. Here are some various renders testing out the robustness of the new pathtracing shader to be used in your HDRI projects:
Smooth, correct GI everywhere now! This is the default diffuse paint material
now testing Brushed/glossy metal surface
Frosted pink glass, looking good! Notice the nice sun
ClearCoat plastic (in case you want a LEGO house, lol)
Just to make sure the solution is completely robust, I exited the house and went outside, then underneath the entire house looking up to the sky:
Notice the sun reflection and you can see the 3 spheres inside the house!
Similar view, but this time with slightly frosted glass, notice the correct blurred sun.
In the next post I will explain what was happening before and how I fixed the problem. Also I will give you the new shader link to replace your old one. I just changed 1 file. :)
So what was happening was that the diffuse rays were casting sun sampling rays along the SUN_DIRECTION vector. Now, the SUN_DIRECTION for this HDRI is correct (you can double-check by uncommenting the red metal cylinder intersection routine - it should point right at the center of the sun in the image), and if you had been outside the house model, everything would have been fine. However, once you step inside the house, all is lost. The diffuse rays on the walls and floor of the house interior are still blindly casting sun sample rays, but those rays hit the ceiling of the house and therefore report black.
I designed my HDRI pathtracing demo to be used with models that you would view from the outside (like the Stanford Dragon or Bunny), but once you get into architectural rendering, that's a whole other ballgame. Rays cannot be simply sent out towards the sun anymore. You could devise a complicated solution where the rays would go towards the windows, but when you change the model, all is lost again.
I found a solution which is robust - you can load any model, and either view it from the outside as I originally intended, or step inside the model if it is a large model with holes or windows for light, and view its interior as well. I simplified the logic of the path tracer a little, so that all diffuse rays bounce more randomly (which gives better chance that they will hit a window and escape to find the light for GI gathering) and no longer do they blindly go towards the SUN_DIRECTION, because they would be wasted if they tried.
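The "bounce more randomly" part of this fix is standard cosine-weighted hemisphere sampling for diffuse surfaces. Here is a JavaScript sketch of the idea (the repo's shader has its own GLSL version; all names here are illustrative): sample a disk, project up to the hemisphere (Malley's method), and orient the result around the surface normal so bounce directions are distributed proportionally to cos(theta) - letting interior rays naturally find windows and escape, instead of being aimed at the sun.

```javascript
// Return a random cosine-weighted direction in the hemisphere around the
// unit normal (nx, ny, nz). u1, u2 are uniform randoms in [0, 1).
function randomCosWeightedDirection(nx, ny, nz, u1, u2) {
  // Branchless orthonormal basis around the normal (Duff et al. style).
  const sign = nz >= 0 ? 1 : -1;
  const a = -1 / (sign + nz);
  const b = nx * ny * a;
  const t = [1 + sign * nx * nx * a, sign * b, -sign * nx]; // tangent
  const s = [b, sign + ny * ny * a, -ny];                   // bitangent
  // Malley's method: uniform point on the unit disk, projected up.
  const r = Math.sqrt(u1), phi = 2 * Math.PI * u2;
  const x = r * Math.cos(phi), y = r * Math.sin(phi);
  const z = Math.sqrt(Math.max(0, 1 - u1)); // = cos(theta)
  // Transform disk sample from tangent space into world space.
  return [
    x * t[0] + y * s[0] + z * nx,
    x * t[1] + y * s[1] + z * ny,
    x * t[2] + y * s[2] + z * nz,
  ];
}
```

Because the probability density already matches the diffuse BRDF's cosine falloff, the cos(theta) term cancels out of the estimator, which keeps the math simple and the noise low.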
Lastly, the reason this worked in your older version is that it predates my direct sun-ray targeting, so it just naturally worked (although the HDRI gamma was incorrect in my old code - too bright). In the more recent versions, I was using models that would only be viewed from the outside, so I started sampling the sun directly, and although that made the outside of the object converge much faster, it prevented correctly viewing the interior of a large architectural model, like the one you tried to render. I didn't even realize the implications of my new method!
So now we have a choice between 2 shaders - the current one is great for viewing models from the outside, as I assume most users would do. So I will leave that in place for my demo with the small Stanford Dragon on the coffee table and the garden HDRI. But if you do any kind of interior or architectural rendering, you must use the other shader. Here's a link:
Architectural HDRI shader
Just delete your current HDRI_Environment_Fragment.glsl file and replace it with this new one. Everything should just work right away.
In the future I might create a demo on this repo showing the 2 different situations and a separate demo for each, showing which solution you should choose for that particular situation. It is difficult to cover all cases (I leave that to the teams at Arnold, 3DStudioMax, Maya, Blender, etc.). I didn't want to clutter my demo pages here any more than they already are. But maybe in the future.. ;)
Anyway, I am glad I found out what was going on - thank you again for bringing the issue to my attention as well as hosting a temporary test repo so I could download and try out all of the same files that you were using.
Thanks and enjoy!
-Erich
Hi! @erichlof
Thank you again for helping me solve my problem. Rendering beautiful interior building projects used to be part of my work. Old habits die hard. When I saw what you shared, my first reaction was to build a house. :) I hope I can contribute some code improvements next time instead of just asking for them.
@q750831855
No problem! Glad to help. Actually I end up learning something new myself when trying to solve these issues for other coders, so all is good!
Yes, if you want to contribute features or improve existing code, feel free! The areas where I could use the most help are: parsing skeletal animations (bone matrices) of three.js rigged animated characters so I could possibly path trace an animated mesh in real time; rendering outdoor foliage like thousands of trees, blades of grass, etc.; and acceleration structures for all of the above, with an emphasis on running on low-end hardware, even mobile. Another demo that needs work is the planet demo: I can't figure out how to get multiple resolutions of Perlin noise that smoothly transition from orbit level above the earth (continental scale) all the way down to a single rock on the ground. These are the areas I am either currently investigating or haven't started yet because I'm not sure how to jump in and tackle the problem.
This ray/path tracing project has been a 5-year long adventure so far, and it keeps going! There's so much to do yet and so much to learn still. It's exciting but daunting at the same time!
Till we meet again!
Take care,
-Erich