Revamp unit export code #12
base: master
Conversation
ryanjsims commented on Dec 4, 2024
- Corrected several incorrect assumptions about the unit format with regard to component separation, vertex access, uv maps, and bone remapping
- Added fallback player bone names for use when bones file not present for exported unit
- Fixed several GLTF warnings/errors
- Added several mesh layout formats and fleshed out the MeshHeader struct
- Loaded more data from the meshes, including all UV maps and all layout vertices (group vertices can overlap and may not include all vertices referenced by the group indices, which is weird)
Okay, this is mostly ready to go. There are still a few things I want to address here, like the fact that when a model doesn't actually have the LOD list, the fallback is to just export everything. Also, I want to make the Blender export more seamless by running the python import script from filediver, so that we can export directly to .blend rather than making it a two-step process.

Performance exporting models has taken a noticeable hit due to all of these changes. It's not unbearable, but it might be something to improve. I expect a lot of it comes from texture conversion, since filediver is now packing any texture it recognizes into the GLB file.
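As a rough illustration of the "run the import script from filediver" idea, here is a minimal Go sketch that shells out to a virtualenv Python, in the same spirit as the existing ffmpeg invocation. The interpreter path, output flag, and file paths are placeholders rather than filediver's actual CLI; only the importer script name comes from later in this thread.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
)

// convertGLBToBlend runs the Blender import script with a Python interpreter
// that has bpy available, turning an exported .glb into a .blend file.
// The "-o" flag is a hypothetical example, not the script's real interface.
func convertGLBToBlend(pythonPath, scriptPath, glbPath, blendPath string) error {
	cmd := exec.Command(pythonPath, scriptPath, glbPath, "-o", blendPath)
	cmd.Stdout = os.Stdout
	cmd.Stderr = os.Stderr
	if err := cmd.Run(); err != nil {
		return fmt.Errorf("blender import script failed: %w", err)
	}
	return nil
}

func main() {
	// Placeholder paths for illustration only.
	err := convertGLBToBlend(".venv/bin/python",
		"scripts/hd2_accurate_blender_importer.py",
		"out/model.glb", "out/model.blend")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```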
Here's the result of the following commands:
The only work done in Blender was switching to shaded view and turning the camera.
Hi, sorry for the late response! Really amazing work! I can see that you've made some great progress with reverse-engineering unit and material files. It's also absolutely incredible to see how far the 3D model exporting from this project has come due to your contributions! I just tried out your code at 6682e54 and it also gives me the textured helldiver model :)

I'm still a bit skeptical because of all the python files though, since I'd ideally like for this app to "just work", which is also why I wrote it in pure Go. To make the python script work, I had to use exactly version 3.11 with uv (the setup_environment script gave me an error). Obviously, the python scripts make things tremendously easier and add some really, really useful functionality, so I think we should keep them, but I don't like the premise of having users spend hours figuring out how to make their python environment work (this especially applies if they aren't experienced in software development).
BTW, I can see that you added some custom hashes to hashes/hashes.txt. The file is actually autogenerated (which I probably didn't make clear enough, so apologies for that). The generator is hashes/generate/main.go.
I'll have to check, but I think I fixed that with the most recent commit (probably misremembering what exactly I fixed)
I agree that it would be better to have the functionality in Go. I'm not aware of any libraries to load or author blend files in Go, though, so I'd need to find one or make something specific.
https://github.com/mewspring/blend

This one looks interesting, but it was last updated several years ago.
Although porting the functionality to Go should be possible, I don't think it's worth the insane effort, especially since the Python scripts already exist. I was thinking perhaps we could have a version of filediver that includes Python and BPy in the binary or archive, with the option to use a local version of those and save on download size. I think a 500MB or so download is still less of a headache than having users do all the setup required to run the Python scripts.
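As a sketch of the "bundled, with the option to use a local version" idea, here is one way the interpreter lookup could work in Go. The `python/` directory shipped next to the filediver executable is an assumption for illustration, not a layout decided anywhere in this thread.

```go
package main

import (
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"runtime"
)

// findPython prefers an interpreter bundled next to the filediver binary and
// falls back to whatever python3 is already on PATH.
func findPython() (string, error) {
	if exe, err := os.Executable(); err == nil {
		name := "python3"
		if runtime.GOOS == "windows" {
			name = "python.exe"
		}
		// Hypothetical bundled layout: <filediver dir>/python/python3
		bundled := filepath.Join(filepath.Dir(exe), "python", name)
		if _, statErr := os.Stat(bundled); statErr == nil {
			return bundled, nil
		}
	}
	return exec.LookPath("python3")
}

func main() {
	path, err := findPython()
	if err != nil {
		fmt.Fprintln(os.Stderr, "no Python interpreter found:", err)
		os.Exit(1)
	}
	fmt.Println("using", path)
}
```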
I checked out the latest commit. It still says […]

Seems like this is only for reading. In any case, while it would be really nice, from my understanding this wouldn't be worth the effort at the moment. It was already quite some effort to write the DDS and Wwise decoders, and it seems to me that Blend is an even more complex format.

PS: Although, if we do implement the Blend conversion in Go, we would only need a subset of the format, so maybe it's not that unrealistic (but for now, let's just try to make the version using Python work as well as possible).
OK, so I managed to use pyinstaller to create an executable from the "HD2 Accurate Blender Importer". The program with all resources is ~910MB (which can be compressed to ~470MB using ZIP). This way, although it makes the distribution huge, it should at least run reliably. Here's a script to package the Python script (requires uv by astral.sh):

```sh
#rm -rf ./scripts_dist
uv venv --python 3.11
uv pip install pyinstaller bpy "numpy<2"
uv run pyinstaller -D --distpath ./scripts_dist ./scripts/hd2_accurate_blender_importer.py --add-data ./scripts/resources:resources --collect-all bpy --collect-submodules logging
```

The output executable will be located in ./scripts_dist.
Yeah, off the top of my head since I'm not at my computer for the holidays, the functionality we would actually need for blend files would be:
Not impossibly long, but a decent chunk of work to save ~500 MB in the bundle from bpy and python. We would still need the actual template blend file and the included script. Maybe this should just be added as a "potential todo" issue that I could look at in the future.

A comment from the mewspring lib I found interesting is that the blend format is a "self documenting" format, so it may not be too terrible to implement, if that's still true.
As for the missing file, that's definitely from some functionality I've added around materials that I forgot to include in the commits. I'll get that added when I'm back. It'll allow the export of materials themselves onto basic planes for visualization via the bundled material. There does seem to be some uncaptured nuance regarding materials, since it doesn't appear that exporting all material files actually exports all varieties of materials.
Your description of what is required does sound somewhat doable, especially with how the blend format seems to work (though we'd have to write some testing code to actually be able to reach a conclusion about the feasibility). Another benefit would be a decent speedup, since we could write all the data immediately without going through GLB (and of course python/bpy). But for now, I agree with putting this down as a potential ToDo. No worries about the missing file, just add it when you're back and have time.
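As a starting point for that kind of testing code, a minimal Go probe of the classic .blend header (the 12-byte "BLENDER" magic, pointer-size marker, endianness marker, and version) might look like the sketch below. It only reads the header of an uncompressed .blend file and says nothing yet about the DNA1/SDNA machinery a real converter would need.

```go
package main

import (
	"fmt"
	"io"
	"os"
)

type blendHeader struct {
	PointerSize int    // 4 or 8 bytes, from '_' or '-'
	BigEndian   bool   // 'V' = big-endian, 'v' = little-endian
	Version     string // e.g. "402" for Blender 4.2
}

// readBlendHeader parses the fixed 12-byte header of an uncompressed .blend file.
func readBlendHeader(path string) (*blendHeader, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	buf := make([]byte, 12)
	if _, err := io.ReadFull(f, buf); err != nil {
		return nil, err
	}
	if string(buf[:7]) != "BLENDER" {
		return nil, fmt.Errorf("not an uncompressed .blend file")
	}
	h := &blendHeader{Version: string(buf[9:12])}
	switch buf[7] {
	case '_':
		h.PointerSize = 4
	case '-':
		h.PointerSize = 8
	default:
		return nil, fmt.Errorf("unknown pointer size marker %q", buf[7])
	}
	h.BigEndian = buf[8] == 'V'
	return h, nil
}

func main() {
	h, err := readBlendHeader(os.Args[1])
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("%+v\n", *h)
}
```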
Force-pushed from a936c97 to 1a2b540
…even if a bones file is present, use material with the same index as the mesh, and change how components are added
…n saving png/jpg textures to output
…lb cannot natively represent them. Later these glbs can be postprocessed to load the textures into blender with a script
…ccurate shaders. Blender only has a python API, so the plan is to call a virtualenv python from filediver to create the blend file, similar to how ffmpeg is used. DDS and EXR python scripts could be implemented in filediver directly, so that models are exported with EXR files included. However, the DDS files are just tiny LUTs, so conversion isn't the worst thing in the world. Some other programs may prefer to have them in the original DDS format anyway, and there isn't a ratified extension for EXR files in the GLTF standard
…rom names to texture indices
… components more informatively
…in the highest detail entry
…s detail - it doesn't appear to be correct for the illuminate models?
…include only a single skeleton
…le is present in scripts_dist
… exporters. Also clean up when the vertex coordinate conversion is applied, so that we don't rotate components unnecessarily. Finally, prep single_glb handling code to allow for any number of formats to use combined files
Force-pushed from dc18478 to f2c1b1a
Additionally, rename colors -> normals and update format to R10G10B10A2
Filtering out duplicates has an adverse effect when extracting by triad: if the triad isn't the first place a file is defined, that file will not be exported with the rest of the triad. This is especially problematic when exporting entire armor sets, which can often share geometry. Now, exporting a triad should include every file present in the triad, even if it is duplicated elsewhere.

Still need to figure out how material variants are applied, since many armors in game use materials that are different from the ones defined in their unit file
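A hypothetical sketch of that per-triad rule, with made-up types standing in for filediver's real data model: duplicates are only skipped within the triad being exported, never because the same file already appeared in another triad.

```go
package main

import "fmt"

// FileID and Triad are illustrative stand-ins, not filediver's actual types.
type FileID struct {
	NameHash, TypeHash uint64
}

type Triad struct {
	ID    uint64
	Files []FileID
}

// filesToExport deduplicates only within the triad itself, so a file shared
// with other triads is still exported alongside this one.
func filesToExport(t Triad) []FileID {
	seen := make(map[FileID]bool, len(t.Files))
	out := make([]FileID, 0, len(t.Files))
	for _, f := range t.Files {
		if seen[f] {
			continue // duplicate within the same triad
		}
		seen[f] = true
		out = append(out, f)
	}
	return out
}

func main() {
	t := Triad{ID: 1, Files: []FileID{{1, 2}, {3, 2}, {1, 2}}}
	fmt.Println(len(filesToExport(t))) // 2: only the in-triad duplicate is dropped
}
```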
4th weight appears to always be zero, and the data stored in the two A channel bits might be used elsewhere?
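For reference, a minimal Go sketch of unpacking an R10G10B10A2 attribute like the renamed normals above, assuming a little-endian uint32 with the first channel in the lowest 10 bits (the exact channel order in the unit format isn't confirmed here); the two alpha bits are simply dropped.

```go
package main

import "fmt"

// unpackR10G10B10A2 expands the three 10-bit channels of a packed value to
// floats in [0, 1]; remap with v*2-1 if a signed [-1, 1] normal is needed.
// The two high alpha bits are ignored, since they may hold unrelated data.
func unpackR10G10B10A2(v uint32) (x, y, z float32) {
	const max10 = 1023.0
	x = float32(v&0x3FF) / max10
	y = float32((v>>10)&0x3FF) / max10
	z = float32((v>>20)&0x3FF) / max10
	return
}

func main() {
	fmt.Println(unpackR10G10B10A2(0x3FFFFFFF)) // all three channels at 1.0
}
```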
Force-pushed from 193cd00 to 70a9359
… it's not required when exporting the model