Should lightmaps be affected by physicallyCorrectLights? (they are currently) #21912
Thank you for bringing this up. You have explained the issue correctly -- as evidenced by the comments you referenced. If you don't mind, a simple live example would be very helpful for experimenting. I'll explain the factor-of-PI issue when I get a chance.
Here is a quick example. The two Blender references are saved out with "save as render", which has Blender apply its tone mapping. The first uses Blender's default "Filmic" profile, and the second uses "Standard", which I think is just linear. It's not a perfect match regardless, but as you can see when
This change is acceptable. It is probably my fault.
Just to make sure we are on the same page: the fix would be to always multiply the lightMapIrradiance by PI, regardless of whether physicallyCorrectLights is enabled, correct? We should also think about how to clarify the comment on that line, here and in other places. Happy to submit the PR.
The real issue is that light maps can store either irradiance or radiance -- that is, incoming light or outgoing light. Incoming light is nice because it can be low resolution and you then multiply it with the existing surface textures. Outgoing light is simpler to capture. I believe incoming light should have the PI divisor, but outgoing should not. Maybe we add this as a configuration option to the system? That would help understanding and also simplify things.
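A configuration like that could be sketched as follows. This is a hypothetical illustration in plain JavaScript (one color channel); `lightMapContribution` and the `mode` flag are invented names, not an existing three.js API:

```javascript
// Hypothetical sketch: treat the lightmap as either irradiance (incoming
// light, which must pass through the Lambertian BRDF's 1/PI) or radiance
// (outgoing light, added as-is). Not an existing three.js option.
function lightMapContribution(albedo, lightMapValue, mode) {
  if (mode === 'irradiance') {
    // incoming light is reflected through the Lambertian BRDF
    return (albedo / Math.PI) * lightMapValue;
  }
  // 'radiance': the map already stores outgoing light, so no BRDF is applied
  return lightMapValue;
}

console.log(lightMapContribution(0.8, 1.0, 'irradiance')); // ≈ 0.25
console.log(lightMapContribution(0.8, 1.0, 'radiance'));   // 1
```

An explicit mode would at least make the author's intent visible instead of hiding it inside an intensity fudge factor.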
@netpro2k Thank you for the excellent example. :-) So far, I am not seeing anything incorrect in the three.js shader -- at least as it pertains to this issue. But what I do see is that your renderings are about as bright as the lightmaps themselves. That is a red flag for me. That should be impossible, given that diffusely-reflected radiance is the incident irradiance scaled by albedo / PI. Furthermore, your objects are reflecting only 23% of incident light due to their color settings, so that should make the rendered output even darker. Something is not making sense...
Are the Blender renderings using the lightmaps as the only light sources? Is only incident light baked into the lightmaps?
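The brightness argument can be checked numerically. A minimal sketch in plain JavaScript (one channel; the sample values are assumed for illustration):

```javascript
// For a Lambertian surface, diffusely reflected radiance is
// (albedo / PI) * irradiance, so a surface reflecting only 23% of
// incident light should render well darker than its lightmap.
function reflectedRadiance(albedo, irradiance) {
  return (albedo / Math.PI) * irradiance;
}

const irradiance = 1.0; // normalized lightmap sample
const albedo = 0.23;    // the 23% reflectance mentioned above
console.log(reflectedRadiance(albedo, irradiance)); // ≈ 0.073
```

A rendered brightness near the lightmap's own brightness would therefore imply a factor-of-PI (or larger) discrepancy somewhere in the pipeline.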
So, can Blender render the scene with the generated lightmaps as the only light source -- and produce the same output?
OK, have a go at a shader that will use your lightmap. If the lightmap is the only light in the scene, then:

```glsl
// output_radiance = Lambertian_BRDF * incident_irradiance
materialOutput = ( materialColor / PI ) * lightMapColor * lightMapIntensity
```
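The same formula transcribed into plain JavaScript for a single channel (an illustration, not the actual shader source), showing how an intensity of PI cancels the Lambertian 1/PI:

```javascript
// output_radiance = Lambertian_BRDF * incident_irradiance
function lightmapOnlyOutput(materialColor, lightMapColor, lightMapIntensity) {
  return (materialColor / Math.PI) * lightMapColor * lightMapIntensity;
}

// With lightMapIntensity = 1 the result is 1/PI of the naive product;
// with lightMapIntensity = PI the PI factors cancel out.
console.log(lightmapOnlyOutput(0.8, 1.0, 1.0));     // ≈ 0.25
console.log(lightmapOnlyOutput(0.8, 1.0, Math.PI)); // ≈ 0.8
```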
Didn't bother with the intensity, but this graph produces a result that is, not surprisingly, closer to what the three.js render looks like with physicallyCorrectLights on. I guess this is a good way to rule out differences introduced in the rest of the color management workflow in Blender, which is helpful. It's still unclear how to produce a lightmap that will end up in three.js with a result similar to the raytraced result in Blender.

I think @bhouston was also correct that there is a disconnect on whether we are considering the lightmap data as incoming or outgoing light... I am a bit unclear what baking the "diffuse" here in Blender would be. If excluding "color" like I am, do these end up being the same thing?

Side note: I am not super familiar with the Blender shader graph, so I think my multiply example in the previous post was wrong. I think the "Factor" should have been 1, not 0.5. With just the multiply and a factor of 1, that produces:
Not as far as I am aware. A lightmap models incident light -- irradiance. The outgoing, directional radiance is computed.
What I mean is that it is not explicitly stated in the three.js MeshStandardMaterial docs, so people may be making different assumptions when talking about "lightmaps". But more directly for this case, I mean it is unclear which is being encoded into the lightmap from Blender when baking "diffuse" without "Color". If we end up with a radiance map instead of an irradiance map, that would explain why things look different.
It may also be that, convention-wise, people may be saving radiance that is pre-scaled by 1/PI. I could imagine all sorts of scale factors being used in lightmaps, because they are often stored in LDR image formats but used for what is essentially HDR data. Scale factors could thus be useful here. Maybe we just need a lightMapFactor or something?
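One way to picture the suggested lightMapFactor (the name and behavior here are hypothetical, not an existing three.js feature): HDR irradiance is divided by a scale before being stored in an LDR texture, and multiplied back out when sampled. A larger factor avoids clipping at the cost of precision:

```javascript
// Hypothetical lightMapFactor sketch: HDR data is stored in an LDR texture
// divided by a scale factor, then multiplied back out at sample time.
function encodeLdrLightmap(hdrValue, lightMapFactor) {
  // clamp to [0, 1], as an 8-bit texture channel would
  return Math.min(1, Math.max(0, hdrValue / lightMapFactor));
}

function decodeLdrLightmap(ldrValue, lightMapFactor) {
  return ldrValue * lightMapFactor;
}

// An HDR value of 2.5 survives a round trip with factor 4, but with
// factor 2 it clips to the maximum representable value:
console.log(decodeLdrLightmap(encodeLdrLightmap(2.5, 4), 4)); // 2.5
console.log(decodeLdrLightmap(encodeLdrLightmap(2.5, 2), 2)); // 2
```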
Emissive maps encode outgoing radiance. Lightmaps encode incident irradiance.
We already have
We do already have
In three.js, lightmaps are additive to the total incoming, indirect light. I agree it would be a good idea to determine what the Blender devs think they are.
Yeah, to be clear, Blender does not use the term "lightmap" anywhere. It has a generic "baking" utility to bake out different things to textures. I am using the "Diffuse" bake type to generate this map (with the "Color" influence unchecked). https://docs.blender.org/manual/en/latest/render/cycles/baking.html#settings The docs say this is the "diffuse pass of a material", so I guess this ends up being the outgoing radiance? I wonder if it's possible to get an irradiance map out of Blender somehow...
Did some digging through the Blender source, and with some printf debugging... It's hard to fully understand what's going on, but from what I can tell, what we essentially end up with in the lightmap, the way we are exporting it, is the path-traced direct+indirect color of the pixel divided by the diffuse color of the pixel (since the full path trace will naturally include the diffuse color of the surface you are baking). https://github.com/blender/blender/blob/master/intern/cycles/kernel/kernel_bake.h#L208-L221 I don't understand the path tracing well enough to know why this is a division and not a subtraction, or how close this ends up being to the "incident irradiance" data we actually want, but this is what I know so far.

For our own use case, I think we will just set our lightMapIntensity to PI for now, as this gets us results that seem "good enough", but I suspect it is still not quite correct, so I would like to keep digging into this. If this does turn out to be "correct", it seems like actually encoding that data into LDR lightmaps would result in far more clipping at the high end... I feel like I am still missing something.

I know this is now straying from being a three.js issue somewhat, but I do think a reliable path for getting this data out of Blender in a format three.js can understand is probably a good idea, as I am not aware of many other engine-agnostic tools that do this, especially not ones that are free and open source. Also, now that I have done the work of getting Blender to compile locally, I can hopefully start poking at https://devtalk.blender.org/t/access-baked-lightprobe-irradiance-volume-data-from-python/6199/7 at some point soon, which is tangentially related to this.
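If that reading of the bake output is right, the lightMapIntensity = PI workaround can be sketched per channel like this (a simplification under the assumptions above, not actual Blender or three.js code):

```javascript
// Assumed relationship: Blender's "Diffuse" bake without "Color" is the
// path-traced result divided by the surface's diffuse color. The three.js
// lightmap term (diffuseColor / PI) * baked * lightMapIntensity then
// recovers the path-traced value when lightMapIntensity = PI.
function blenderDiffuseBake(pathTraced, diffuseColor) {
  return pathTraced / diffuseColor;
}

function threeLightMapTerm(baked, diffuseColor, lightMapIntensity) {
  return (diffuseColor / Math.PI) * baked * lightMapIntensity;
}

const baked = blenderDiffuseBake(0.5, 0.23);
console.log(threeLightMapTerm(baked, 0.23, Math.PI)); // ≈ 0.5
```

Note that the baked value here (≈2.17) is already above 1.0, which illustrates the LDR clipping concern mentioned above.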
Picking this thread back up... I think there is still some confusion around whether what we are getting out of Blender is in line with what three.js is expecting, but from what I can tell (just re-reading the above thread), multiplying the maps we get out of Blender by PI gets us roughly the correct results (excluding the issues with metals being discussed in #22692). This is specific to the workflow of exporting out of Blender, so the fix for us is likely just to set our lightMapIntensity to PI, or to explicitly ignore physicallyCorrectLights on lightmaps in our fork, though I do still think it's worth investing in Blender as the happy path for developing assets for three.js.

This leaves the issue that MeshBasicMaterial does not apply physicallyCorrectLights to its lightmaps. It is obviously slightly odd to be talking about lighting on an unlit material, but it has lightmaps, so I think they should be used in the same way as in MeshStandardMaterial. That is, a fully rough, fully non-metallic white MeshStandardMaterial with a lightmap should look roughly equivalent to a white MeshBasicMaterial with the same lightmap (minus some differences due to baseline specularity), regardless of whether physicallyCorrectLights is enabled. I can submit a PR for this if that makes sense.
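The equivalence being proposed can be sketched numerically (plain JavaScript, one channel; a simplification of the shader behavior, not the shader itself):

```javascript
// Standard material: the lightmap passes through the Lambertian BRDF (1/PI);
// when physicallyCorrectLights is off, three.js multiplies the lightmap by
// PI first, which cancels it out.
function standardLightMapTerm(albedo, lightMap, physicallyCorrectLights) {
  const irradiance = physicallyCorrectLights ? lightMap : lightMap * Math.PI;
  return (albedo / Math.PI) * irradiance;
}

// Basic material: the lightmap simply modulates the diffuse color.
function basicLightMapTerm(albedo, lightMap) {
  return albedo * lightMap;
}

// With physicallyCorrectLights disabled the two materials agree; enabled,
// the standard material comes out a factor of 1/PI darker -- the
// inconsistency described above.
console.log(standardLightMapTerm(1.0, 0.6, false)); // ≈ 0.6
console.log(standardLightMapTerm(1.0, 0.6, true));  // ≈ 0.19
console.log(basicLightMapTerm(1.0, 0.6));           // 0.6
```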
I think PRs tend to make these kinds of conversations easier 👍
I opened a PR to have MeshBasicMaterial respect This still means we will need to set our
Sorry for butting in here, but as I've seen this mentioned a couple of times all over the web: this does not refer to the actual shader math when adding a lightmap texture's effect to the render, does it? From my experimenting, a lightmap in three.js seems to be multiplied with the final result (i.e. darkening the underlying albedo), not added to it, yes?
Well, this is unfortunate... For For Granted, this is confusing, but it was a way to shoehorn light map support into
Now that #26392 has been merged, the "legacy lighting mode" is deprecated, meaning only one lighting mode -- the one that enables physically correct lights -- will remain. That also means there will be no artist-friendly light-intensity scaling factor anymore. I also think we should not remove light maps from
In a scene with no lights and no environment map, I would expect an object with a lightmap to look quite close to the raytraced result in Blender when using a lightmap generated from that scene in Blender (assuming you select a similar tone mapping and output encoding in both, and use an HDR lightmap).

With `physicallyCorrectLights` disabled, the result is quite close (left: Three / Mozilla Hubs; right: Blender). With it enabled, the object is a good deal darker than expected.

Digging into the code, this difference makes sense: when `physicallyCorrectLights` is disabled, the lightmap value is multiplied by PI. https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderChunk/lights_fragment_maps.glsl.js#L11

I admittedly don't fully understand all the math going on here, but from what I understand, the multiplication by PI when `physicallyCorrectLights` is disabled is there to cancel out the multiplication by 1/PI in `BRDF_Diffuse_Lambert`. But does it actually make sense for 1/PI to be applied to lightmap data? Based on what I am seeing, I am starting to think this is already "factored in" when raytracing the lightmap...

I also noticed that with a `MeshBasicMaterial` the result is the same "correct" brightness whether or not `physicallyCorrectLights` is enabled. I know it is an "unlit" material, but I would expect purely pre-computed indirect diffuse light to behave the same in either case with no other lighting in the scene.

There are a whole lot of factors at play in computing the final output color, both in three.js and in Blender, so it's possible there is something fundamentally incorrect about my assumption that things should even be able to look the same. But the fact that things look almost entirely correct when simply multiplying the lightmap values by PI (to cancel out the 1/PI) makes me think otherwise.

From reading through a lot of the discussions on the different iterations of the physically based lighting (and the adding, removing, and re-adding of the 1/PI term in the BRDFs), I kept running into @WestLangley and @bhouston, so maybe one of you two may be able to shed some light (pun intended) on the expected behavior here? I would also particularly like to understand what @WestLangley meant by

```glsl
// factor of PI should not be present; included here to prevent breakage
```

which was added as a comment many years ago, subsequently propagated to a few other places, and now ended up in the `#ifndef PHYSICALLY_CORRECT_LIGHTS` block I referenced above.

It's a bit tricky to provide a working example of this exact model, since it uses a glTF with some custom extensions. I can put together a simplified example if it would be helpful, but I don't think the exact model or lightmap matters very much; it's really a question of what precisely is the expected data in lightmaps, and whether that data should be affected by the `physicallyCorrectLights` setting.