Add support for using BC5 and EAC_RG compressed textures as normal maps. #31695
base: dev
Conversation
When the `projectZ` attribute is `true`, this computes the Z component of the normal by projecting the XY components over the hemisphere.
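The hemisphere projection described above can be sketched in plain JavaScript (the function name is illustrative, not part of this PR; the actual implementation is generated shader code):

```js
// Reconstruct the Z component of a unit-length normal from its X and Y
// components, assuming the normal points into the +Z hemisphere:
// z = sqrt( 1 - x*x - y*y ), clamped to guard against rounding error.
function reconstructNormalZ( x, y ) {

	const z = Math.sqrt( Math.max( 0, 1 - x * x - y * y ) );
	return [ x, y, z ];

}
```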
src/nodes/accessors/MaterialNode.js
Outdated
```js
node = normalMap( this.getTexture( 'normal' ), this.getCache( 'normalScale', 'vec2' ) );
node.normalMapType = material.normalMapType;

if ( material.normalMap.projectZ ) {
```
We should not rely on monkey-patched code here.
How about you introduce the property in the CompressedTexture class? Or is it also relevant for Texture?
Alternatively, move the property into the userData field of normalMap for now.
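A minimal sketch of the `userData` suggestion (the property name and the plain-object stand-in for a texture are illustrative):

```js
// Storing the flag in userData keeps it out of the Texture class itself
// and, as noted later in this thread, it is preserved when the texture
// is cloned. Plain object used as a stand-in for a THREE.Texture here.
const normalMap = { userData: {} };
normalMap.userData.projectZ = true;

// Node code would then read it back defensively, defaulting to false:
const projectZ = normalMap.userData.projectZ === true;
```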
When thinking about this: why not check for specific RG texture formats here and only then set the node property to true? That way no additional texture property is needed, and loaders would not have to be updated.
This is most useful in CompressedTexture, but in my particular application I'm using this with ExternalTexture and it can also be useful with uncompressed textures, to reduce their size by storing only the RG components.
I agree this should be done in some other way. A potential problem with this PR is that the attribute won't be cloned with the texture and that may cause issues. Adding it to the userData field as you propose would solve that problem.
I'm not sure enabling the Z reconstruction is the right thing to do for all RG texture formats. RG textures can be used for other purposes in addition to normal maps (heat distortion maps, 2d vector fields, chrominance values, etc). It may not cause any harm to just compute the Z component this way, but this won't extend to ASTC normal maps if we ever want to support that.
I went with the userData approach. Let me know if that looks good to you.
> I'm not sure enabling the Z reconstruction is the right thing to do for all RG texture formats.
My idea was to only do this for specific map types. So for a normal map, a format like bc5-rg-snorm (or SIGNED_RED_GREEN_RGTC2_Format) must hold packed RG data, otherwise it would be unusable for a normal map. This is something that could be implemented in the node classes so loaders don't have to worry about this detail.
But it's true a separate property allows more flexibility than deriving from texture formats.
It would be interesting to hear the opinion of others about this PR. Would you be okay if we label this PR as a "Draft PR" for now? I have the feeling we need to discuss the API a bit more in the team/community before making a decision.
The explicit unpack mode allows supporting ASTC LA normal maps or DXT5nm and in these cases it's not possible to infer the packing mode from the format alone.
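For context on why the format alone can't determine the unpack mode: DXT5nm, for example, stores the normal's X in the alpha channel and Y in the green channel of an ordinary RGBA format. A sketch of that unpack path (function name illustrative, texel channels assumed in [0, 1]):

```js
// DXT5nm-style unpack: X is read from alpha, Y from green, and Z is
// reconstructed over the hemisphere. The RGBA format itself carries no
// hint that it is packed this way, hence the need for an explicit mode.
function unpackNormalGA( texel ) {

	const x = 2 * texel.a - 1;
	const y = 2 * texel.g - 1;
	const z = Math.sqrt( Math.max( 0, 1 - x * x - y * y ) );
	return [ x, y, z ];

}
```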
@donmccurdy I'll ping you directly since you have done so much work in context of texture compression.
Similar to `Texture.colorSpace`, what do you think about introducing a new property `Texture.packing` that describes the packing strategy of a texture? The default would be `NoPacking` (as a counterpart to `NoColorSpace`).
Packing strategies usually apply to certain texture types, so most of the new constants for `Texture.packing` would probably be quite specific (like `RGNormalPacking` or `GANormalPacking`). But that is a similar situation to `Texture.colorSpace`, where certain constants only apply to color textures.
So...what do you think about this approach? Do you have maybe alternatives in mind?
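To make the proposal concrete, here is a sketch of what it could look like (all names are hypothetical, mirroring the `colorSpace` constants; this is not an actual three.js API):

```js
// Hypothetical packing constants, analogous to NoColorSpace and friends.
const NoPacking = '';
const RGNormalPacking = 'rg-normal';
const GANormalPacking = 'ga-normal';

// A texture would default to NoPacking, the way colorSpace defaults to
// NoColorSpace, and loaders or users could opt in per texture.
class PackedTexture {

	constructor() {

		this.packing = NoPacking;

	}

}

const normalMap = new PackedTexture();
normalMap.packing = RGNormalPacking;
```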
I wouldn't be opposed to adding a `Texture#packing` property, I think @bhouston had proposed something similar once? A few other thoughts:

- If we see `texture.format = THREE.RGFormat` (or similar) assigned to a normal map slot, possibly that is enough in this case?
- In the newer TSL/NodeMaterial APIs with WebGPURenderer, is a property on THREE.Texture sure to be accessible? Or could the Texture instance have been wrapped in other nodes, obscuring the underlying Texture? If so, we could handle this by a new node to handle the unpacking, or a material property similar to `material.normalMapType`, instead.
OK, I've been looking at the possibility of enabling the RGNormalPacking mode automatically based on the texture format. It could be something along these lines:
```js
if ( material.normalMap.format == THREE.RGFormat || material.normalMap.format == RED_GREEN_RGTC2_Format ) {

	node.unpackNormal = NormalRGPacking;

}
```
Let me know if it's worth adding this. The default behavior could still be overridden with the user data attribute: `normalMap.userData.unpackNormal`.
Note, this only handles RG textures and BC5. I think there's no identifier for EAC_RG11 in three.js. If support for it is added in the future, this could be handled here as well.
We could also add support for signed RG textures and eliminate the 2*x-1 unpacking in the normal map node, but barely anybody uses the signed formats, so I think it's not worth it.
I'm ambivalent about adding a Texture#packing property. I can experiment with that if you think that's the way to go.
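For reference, the `2*x-1` unpacking mentioned above remaps an unsigned [0, 1] texel value to the signed [-1, 1] range used for normal components; snorm formats sample in [-1, 1] directly, which is why they could skip this step:

```js
// Unsigned normal maps store components in [0, 1]; shaders remap them
// to [-1, 1] with 2*x - 1. Signed (snorm) formats need no remap.
function unpackUnsignedComponent( x ) {

	return 2 * x - 1;

}
```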
I suspect checking .format would make sense for WebGLRenderer but not for TSL in WebGPURenderer. Since the focus of this PR is WebGPURenderer, we can skip that for now, but I wouldn't be opposed to adding this support in WebGLRenderer later.
This should work with either renderer. The main caveats are that it requires the format attribute to be set (which may not be the case if it's an external texture), and it does not support EAC_RG, since that format is not natively supported in three.js.
Happy to include it for now if you think it makes sense to pave the way in that direction.
Change type from bool to string and use packing mode constants. Print errors when trying to use incompatible packed normals and normal modes.
I've put together a demo that uses the changes I'm proposing here. You can check it out here: https://ludicon.com/sparkjs/gltf-demo/

Does anybody have any more thoughts about this? I'd like to release the source code of the spark.js gltf-demo once the changes in this PR are included in three.js.
src/nodes/utils/Packing.js
Outdated
```js
/**
 * @param {Node<vec2>} xy - The X,Y coordinates of the normal.
 * @return {Node<vec3>} The resulting normal.
 */
export const reconstructZ = ( xy ) => vec3( xy, sqrt( saturate( float( 1.0 ).sub( dot( xy, xy ) ) ) ) );
```
I'm not happy with the name yet since it is quite unspecific. I would prefer to integrate the normal term somehow in the name. I also like the pack/unpack notation.
How about using unpackNormal()? The idea is to have a function that converts a packed, two channel normal into a three channel (unpacked) normal.
This creates a conflict with the unpackNormal node attribute. If you want to stick with this name, I suggest renaming the attribute to unpackNormalMode.
@castano Sorry for the delay but do you think you can share one normal map that uses BC5 or EAC_RG compression? I think we could add a new mesh using such a texture in webgpu_materials so we have at least one test case for this code path.



Normal maps are often compressed using the BC5 and EAC_RG formats. This is currently not supported by three.js as the shaders expect the sampled normal map values to contain RGB components representing the packed XYZ coordinate.
To address this issue I propose we add a `projectZ` attribute to the `NormalMapNode` object. When this attribute is `true`, the generated shader code computes the Z component of the normal by projecting the XY components over the hemisphere.
Texture loaders are expected to set the `projectZ` attribute of the texture, which in turn activates the code generation in the corresponding `NormalMapNode`. At the moment there are no texture loaders doing this, but this is something that the existing KTX and KTX2 loaders could benefit from. Additionally, I'd like to use this feature in texture loaders that use the spark.js encoders.
This contribution is funded by Ludicon