
Should lightmaps be affected by physicallyCorrectLights? (they are currently) #21912

Closed

netpro2k opened this issue May 28, 2021 · 25 comments

@netpro2k

In a scene with no lights and no environment map, I would expect an object with a lightmap to look quite close to the raytraced result in Blender when using a lightmap generated from that scene in Blender (assuming you select similar tone mapping and output encoding in both, and use an HDR lightmap).

With physicallyCorrectLights disabled, the result is quite close:
[image: left Three (Mozilla Hubs) | right Blender]

with it enabled the object is a good deal darker than expected:
[image]

Digging into the code, this difference makes sense: when physicallyCorrectLights is disabled, the lightmap value is multiplied by PI. https://github.com/mrdoob/three.js/blob/dev/src/renderers/shaders/ShaderChunk/lights_fragment_maps.glsl.js#L11

I admittedly don't fully understand all the math going on here, but from what I understand, the multiplication by PI when physicallyCorrectLights is disabled is there to cancel out the multiplication by 1/PI in BRDF_Diffuse_Lambert. Does it actually make sense for 1/PI to be applied to lightmap data, though? Based on what I am seeing, I am starting to think this is already "factored in" when raytracing the lightmap... I also noticed that with a MeshBasicMaterial the result is the same "correct" brightness whether or not physicallyCorrectLights is enabled. I know it is an "unlit" material, but I would expect purely pre-computed indirect diffuse light to behave the same in either case with no other lighting in the scene.
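For reference, here is a minimal sketch of the setup being compared (not the actual Hubs scene; the file name and geometry are placeholders): a lightmapped MeshStandardMaterial in a scene with no lights, where toggling physicallyCorrectLights is the only change.

```js
import * as THREE from 'three';

// No lights and no environment map -- the lightmap is the only light contribution.
const renderer = new THREE.WebGLRenderer();
renderer.physicallyCorrectLights = true; // toggle to false to compare brightness
renderer.outputEncoding = THREE.sRGBEncoding;

const scene = new THREE.Scene();

const geometry = new THREE.PlaneGeometry( 1, 1 );
// Lightmaps sample the second UV set, so reuse the first set for this test geometry.
geometry.setAttribute( 'uv2', geometry.attributes.uv );

const lightMap = new THREE.TextureLoader().load( 'baked_lightmap.png' ); // placeholder; the real map is HDR
const material = new THREE.MeshStandardMaterial( {
	color: 0xffffff,
	lightMap: lightMap,
	lightMapIntensity: 1.0
} );

scene.add( new THREE.Mesh( geometry, material ) );
```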

There are a whole lot of factors at play in computing the final output color both in three and in Blender, so it's possible there is something fundamentally incorrect about my assumption that things should even be able to look the same. But the fact that things look almost entirely correct when simply multiplying the lightmap values by PI (to cancel out the 1/PI) makes me think otherwise.

From reading through a lot of the discussions on the different iterations of the physically based lighting (and the adding, removing, and adding again of the 1/PI term in the BRDFs), I kept running into @WestLangley and @bhouston, so maybe one of you two can shed some light (pun intended) on the expected behavior here? I would also particularly like to understand what @WestLangley meant by // factor of PI should not be present; included here to prevent breakage, which was added as a comment many years ago, subsequently propagated to a few other places, and eventually ended up in the #ifndef PHYSICALLY_CORRECT_LIGHTS branch I referenced above.

It's a bit tricky to provide a working example of this exact model since it's using a glTF with some custom extensions. I can put together a simplified example if it would be helpful, but I don't think the exact model or lightmap matters very much; it's really a question of what precisely the expected data in lightmaps is, and whether that data should be affected by the physicallyCorrectLights setting.

@WestLangley
Collaborator

Thank you for bringing this up. You have explained the issue correctly -- as is evidenced by the comments you referenced.

If you don't mind, a simple live example will be very helpful for experimenting.

I'll explain the factor-of-PI issue when I get a chance.

@netpro2k
Author

Here is a quick example. The two Blender references are saved out with "save as render", which has Blender apply its tone mapping. The first is using Blender's default "Filmic" profile, and the second is using "Standard", which I think is just linear. It's not a perfect match regardless, but as you can see, when physicallyCorrectLights is on, the result is far too dark. It's much closer with that option off.

https://netpro2k.github.io/threejs-lightmap-test/

@bhouston
Contributor

bhouston commented May 30, 2021 via email

@netpro2k
Author

netpro2k commented May 30, 2021

Just to make sure we are on the same page: the fix would be to always multiply the lightMapIrradiance by PI, regardless of whether physicallyCorrectLights is enabled, correct? We should also think about how to clarify the comment on that line here and in other places.

Happy to submit the PR.

@bhouston
Contributor

bhouston commented May 30, 2021 via email

@WestLangley
Collaborator

WestLangley commented May 30, 2021

@netpro2k Thank you for the excellent example. :-)

So far, I am not seeing anything incorrect in the three.js shader -- at least as it pertains to this issue.

But what I do see is that your renderings are about as bright as the lightmaps themselves. That is a red flag for me.

That should be impossible, given that diffusely-reflected radiance is 1/pi as bright as incident irradiance.

Furthermore, your objects are reflecting only 23% of incident light due to their color settings. So that should make the rendered output even darker.
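Spelling that out (a restatement of the Lambertian relation, using the roughly 23% reflectance above):

outgoingRadiance = ( albedo / PI ) * incidentIrradiance ≈ ( 0.23 / PI ) * incidentIrradiance ≈ 0.073 * incidentIrradiance

so a correct render of these surfaces, lit only by the baked irradiance, should be well under a quarter as bright as the lightmap itself.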

So, something is not making sense...

The two Blender references are saved out with "save as render"

Are the blender renderings using the lightmaps as the only light sources?

Is only incident light baked into the lightmaps?

@netpro2k
Author

The Blender renders are not using lightmaps, just lights. The same scene was used to generate the lightmaps by baking "Diffuse" with only direct and indirect contribution. "Color" is unchecked, as this would end up double-applying the material's surface color if I am understanding things correctly.
[image]

@WestLangley
Collaborator

So, can blender render the scene with the generated lightmaps as the only light source -- and produce the same output?

@netpro2k
Author

netpro2k commented May 30, 2021

I am not aware of a way to have Blender use a texture as a "lightmap" per se, but it is possible to build a shader graph that uses the texture data however we like.

For example as a multiply with the diffuse color:
[image]

Produces this image (saved with "save as render" with Filmic transform):
[image: render-multiply]

@WestLangley
Collaborator

OK, have a go at a shader that will use your lightmap.

If the lightmap is the only light in the scene, then

// output_radiance = Lambertian_BRDF * incident_irradiance

materialOutput = ( materialColor / PI ) * lightMapColor * lightMapIntensity

@netpro2k
Author

Didn't bother with the intensity, but this graph:
[image]

produces:
[image: render-pi]

Which is, not surprisingly, closer to what the three.js render looks like with physicallyCorrectLights on. I guess this is a good way to rule out differences introduced in the rest of the color management workflow in Blender, which is helpful. It's still unclear how to produce a lightmap that will end up in three with a result similar to the raytraced result in Blender.

I think @bhouston was also correct in that there is a disconnect on whether we are considering the lightmap data as incoming or outgoing light... I am a bit unclear on what baking the "diffuse" pass here in Blender actually produces... If excluding "color" like I am, do these end up being the same thing?

Side note: I am not super familiar with the Blender shader graph, so I think my multiply example in the previous post was wrong. I think the "Factor" should have been 1, not 0.5. With just the multiply and a factor of 1, it produces:
[image: render-multiply]

@WestLangley
Collaborator

I think @bhouston was also correct in that there is a disconnect on whether we are considering the lightmap data as incoming or outgoing light

Not as far as I am aware. A lightmap models incident light, irradiance. The outgoing, directional radiance is computed.

@netpro2k
Author

What I mean is it is not explicitly stated in the threejs MeshStandardMaterial docs, so people may be making different assumptions when talking about "lightmaps".

But more directly for this case, I mean it is unclear which is being encoded into the lightmap from Blender when baking "diffuse" without "Color". If we end up with a radiance map instead of an irradiance map, that would explain why things look different.

@bhouston
Contributor

bhouston commented May 31, 2021 via email

@WestLangley
Collaborator

Emissive maps encode outgoing radiance. Lightmaps encode incident irradiance.

Maybe we just need lightMapFactor or something?

We already have lightMapIntensity.

@netpro2k
Author

We do already have lightMapIntensity, which I think could be used for simple scaling. I agree we need to specify in the docs what the data represents. But also, for the Blender workflow (which I think makes sense to flesh out as a sort of "endorsed" path), we need to figure out how to get maps out with the data we want.

@WestLangley
Collaborator

In three.js, lightmaps are additive to total incoming, indirect light.

#6263 (comment)

#6263 (comment)

I agree it would be a good idea to determine what the blender devs think they are.

@netpro2k
Author

netpro2k commented May 31, 2021

I agree it would be a good idea to determine what the blender devs think they are.

Yeah, to be clear Blender does not use the term "lightmap" anywhere. It has a generic "baking" utility to bake out different things to textures. I am using the "Diffuse" "bake type" to generate this map (with "color" "influence" unchecked). https://docs.blender.org/manual/en/latest/render/cycles/baking.html#settings

The docs say this is the "diffuse pass of a material", so I guess this ends up being the outgoing radiance? I wonder if it's possible to get an irradiance map out of Blender somehow...

@netpro2k
Author

netpro2k commented Jun 4, 2021

Did some digging through the Blender source, and some printf debugging... It's hard to fully understand what's going on, but from what I can tell, what we essentially end up with in the lightmap, the way we are exporting it, is the path-traced direct+indirect color of the pixel divided by the diffuse color of the pixel (since the full path trace will naturally include the surface diffuse of the surface you are baking). https://github.com/blender/blender/blob/master/intern/cycles/kernel/kernel_bake.h#L208-L221

I don't understand how the path tracing works well enough to know why this is a division and not a subtraction, or how close this ends up being to the "incident irradiance" data we actually want, but this is what I know so far.
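One way to read that division (an inference from the linked kernel code, not something stated in the Blender docs): if the baked value is the path-traced outgoing radiance with the diffuse albedo divided back out, then for a Lambertian surface

bakedValue = outgoingRadiance / albedo = ( ( albedo / PI ) * incidentIrradiance ) / albedo = incidentIrradiance / PI

i.e. the incident irradiance scaled down by PI, which would line up with the observation that multiplying the map by PI recovers the expected brightness.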

For our own use case I think we will just set our lightMapIntensity to PI for now, as I think this gets us results that seem "good enough", but I suspect this is still not quite correct, so I would like to keep digging into this. If this does turn out to be "correct", it seems like actually encoding that data into LDR lightmaps would result in far more clipping at the high end... I feel like I am still missing something.
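For reference, a sketch of that workaround (the file name and traversal here are illustrative, not our actual code):

```js
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';

// Scale Blender-baked lightmaps by PI so they act like the irradiance three.js
// expects when physicallyCorrectLights is enabled.
new GLTFLoader().load( 'scene_with_lightmaps.glb', ( gltf ) => {

	gltf.scene.traverse( ( obj ) => {

		// Multi-material meshes are not handled in this sketch.
		if ( obj.isMesh && obj.material && obj.material.lightMap ) {

			obj.material.lightMapIntensity = Math.PI;

		}

	} );

	scene.add( gltf.scene ); // assumes an existing `scene`

} );
```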

I know this is now straying from being a Three issue somewhat, but I do think a reliable path for getting this data out of Blender in a format Three can understand is probably a good idea, as I am not aware of many other engine-agnostic tools to do this, especially not ones that are free and open source. Also, now that I have done the work of getting Blender to compile locally, I can hopefully start poking at https://devtalk.blender.org/t/access-baked-lightprobe-irradiance-volume-data-from-python/6199/7 at some point soon, which is tangentially related to this.

@netpro2k
Author

Picking this thread back up... I think there is still some confusion around whether what we are getting out of Blender is in line with what three.js is expecting, but from what I was able to tell (just re-reading the above thread), multiplying the maps we are getting out of Blender by PI gets us roughly the correct results (excluding the issues with metals being discussed in #22692). This is specific to the workflow exporting out of Blender, so it's likely the fix is for us to just set our lightMapIntensity to PI, or to explicitly ignore physicallyCorrectLights on lightmaps in our fork, though I do still think it's worth investing in Blender as the happy path for developing assets for three.js.

This leaves the issue that MeshBasicMaterial does not apply physicallyCorrectLights to its lightmaps. It is obviously slightly odd to be talking about lighting on an unlit material, but it has lightmaps, so I think they should be used in the same way as in MeshStandardMaterial.

That is, a fully rough, fully non-metallic white MeshStandardMaterial with a lightmap should look roughly equivalent to a white MeshBasicMaterial with the same lightmap (minus some differences due to baseline specularity), regardless of whether physicallyCorrectLights is enabled. I can submit a PR for this if that makes sense.
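A sketch of the equivalence being proposed (the lightMap texture is assumed to be loaded already):

```js
// Per the proposal, these two should render at roughly the same brightness for
// the same lightmap, regardless of the physicallyCorrectLights setting.
const standardMat = new THREE.MeshStandardMaterial( {
	color: 0xffffff,
	roughness: 1.0,
	metalness: 0.0,
	lightMap: lightMap,
	lightMapIntensity: 1.0
} );

const basicMat = new THREE.MeshBasicMaterial( {
	color: 0xffffff,
	lightMap: lightMap,
	lightMapIntensity: 1.0
} );
```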

@mrdoob
Owner

mrdoob commented Dec 22, 2021

I can submit a PR for this if that makes sense.

I think PRs tend to make these kinds of conversations easier 👍

@netpro2k
Author

I opened a PR to have MeshBasicMaterial respect physicallyCorrectLights here #23197

This still means we will need to set our lightMapIntensity to PI to use the "lightmaps" we are getting out of Blender, but they will now at least apply consistently to both MeshStandardMaterial and MeshBasicMaterial.

@cptSwing

cptSwing commented Feb 6, 2022

In three.js, lightmaps are additive to total incoming, indirect light.

Sorry for butting in here, but as I've seen this mentioned a couple of times all over the web: this does not refer to the actual shader math when adding a lightmap texture's effect to the render, does it? From my experimenting, a lightmap in three.js seems to be multiplied with the final result (i.e. darkening the underlying albedo), not added to it, yes?

@WestLangley
Collaborator

@cptSwing

Well, this is unfortunate...

For Physical, Standard, Phong, and Lambert -- materials that "respond to light" -- light maps in three.js only brighten the scene.

For MeshBasicMaterial, which does not respond to light, support for light maps was added long ago and retained for backwards-compatibility. In that case, if the product of the light map value and the light map intensity is less than 1, it will render darker than the equivalent scene having no light map at all.

Granted, this is confusing, but it was a way to shoehorn light map support into MeshBasicMaterial.
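A small illustration of that difference (hypothetical values; the lightMap texture is assumed):

```js
// With MeshBasicMaterial the light map scales the base color, so a light map
// value * intensity below 1.0 darkens the surface relative to having no
// light map at all.
const basic = new THREE.MeshBasicMaterial( {
	color: 0xffffff,
	lightMap: lightMap, // assume a mid-grey baked texture
	lightMapIntensity: 0.5
} );

// Standard / Physical / Phong / Lambert instead add the light map contribution
// to the indirect lighting, so a light map can only brighten the result.
```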

@Mugen87
Collaborator

Mugen87 commented Jul 12, 2023

After #26392 has been merged, the "legacy lighting mode" is now deprecated, meaning only one lighting mode, which enables physically correct lights, will be in place. That also means there will be no artist-friendly light intensity scaling factor anymore.

I also think we should not remove light maps from MeshBasicMaterial even if the implementation is not physically plausible. This is still a useful feature and I have frequently seen projects using it in the past.
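For anyone migrating, the rough equivalent in code (a sketch; the flag name follows #26392, and defaults may differ by release):

```js
const renderer = new THREE.WebGLRenderer();
// The old physicallyCorrectLights = true behavior is the remaining mode; the
// deprecated legacy scaling is only reachable through the transitional flag:
renderer.useLegacyLights = false; // physically correct lighting
```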

Mugen87 closed this as completed Jul 12, 2023