
Conversation

wffurr (Contributor) commented Oct 1, 2025

This required minor changes to LLVMContext construction and PGOOptions.

"repo": "https://github.com/espressif/llvm-project.git",
"repo_ssh": "[email protected]:espressif/llvm-project.git",
"branch": "xtensa_release_18.1.2",
"branch": "xtensa_release_21.1.2",
Collaborator (review comment):

no such branch available

wffurr (Contributor, Author) replied:

Thanks - I should have checked that. Looks like the new API for LLVMContext construction will have to be wrapped in an #ifdef LLVM version check, since xtensa's latest release only supports up to LLVM 19.

@wffurr force-pushed the update-llvm branch 2 times, most recently from c11c325 to 0a7e444 on October 2, 2025 at 18:03
Wraps the LLVMContext construction in an LLVM 21 version check; LLVM 21
makes a breaking change to LLVMContext construction, but WAMR still needs
to support older LLVM versions, e.g. for xtensa/esp32 support, which is
only available up to LLVM 19.
wffurr (Contributor, Author) commented Oct 3, 2025

Is this good to merge? The test failures look like timeouts; one error is "fatal error: error writing to /tmp/cckAF38s.s: No space left on device" which doesn't seem relevant. Thanks!

yamt (Collaborator) commented Oct 6, 2025

> Is this good to merge? The test failures look like timeouts; one error is "fatal error: error writing to /tmp/cckAF38s.s: No space left on device" which doesn't seem relevant. Thanks!

isn't it possible the llvm 21 build actually somehow requires more space and caused ENOSPC on the ci?
i guess someone needs to investigate a bit further. (sorry, i have no time right now)

The LLVM 21 update uses more disk space and makes the standard runner fail
with "No space left on device". Using the [free disk space
action](https://github.com/marketplace/actions/free-disk-space-ubuntu)
to delete the unused Android, Haskell, and .NET runtimes frees up space
on the runner.
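The linked marketplace action can be dropped into a workflow job as a step like the one below. This is a sketch based on the action's documented inputs; the exact step placement and the `@main` ref are assumptions, and pinning to a tagged release would be the more cautious choice.

```yaml
# Runs early in the job so the later LLVM build has room.
- name: Free disk space on the Ubuntu runner
  uses: jlumbroso/free-disk-space@main
  with:
    # Delete the preinstalled runtimes the PR identifies as unused.
    android: true
    dotnet: true
    haskell: true
    # Leave the tool cache alone; actions/setup-* steps may rely on it.
    tool-cache: false
```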
wffurr (Contributor, Author) commented Oct 6, 2025

> > Is this good to merge? The test failures look like timeouts; one error is "fatal error: error writing to /tmp/cckAF38s.s: No space left on device" which doesn't seem relevant. Thanks!
>
> isn't it possible llvm 21 build actually somehow requires more space and caused ENOSPC on the ci? i guess someone needs to investigate a bit further. (sorry i have no time right now)

I found a script to free up disk space on the GitHub Ubuntu runner. Is that OK to use? It seems to fix the out-of-space error. If you'd prefer I didn't use the action from the marketplace, I can hoist that bit out into a script in the WAMR repo and use that instead.

lum1n0us (Collaborator) commented:

@yamt @TianlongLiang @loganek What are your thoughts on upgrading LLVM to version 21?

yamt (Collaborator) commented Oct 10, 2025

> @yamt @TianlongLiang @loganek What are your thoughts on upgrading LLVM to version 21?

i don't have strong opinions either way.
it's a bit sad to use different versions for different targets, but i guess it doesn't matter much.
otoh, i don't see any strong reason to update to 21 either.

TianlongLiang (Collaborator) commented:

For ARC, I don't think we should upgrade LLVM to 21.x. For the default target, we do need to update the static PGO-related data structures, as in this PR. Personally, I think we need to update the static PGO support first before actually upgrading LLVM.

wffurr (Contributor, Author) commented Oct 14, 2025

We're using WAMR with LLVM head and have patches for LLVM's changes in 21 and 22 that break WAMR. This is probably not the way the software is intended to be used. I was hoping to update to something closer to head.

I could send a patch with just the #if LLVM_VERSION_MAJOR >= 21 change in aot_llvm.c instead of updating the LLVM targets in the build script.

4 participants