Conversation

@Pfannkuchensack
Contributor

Summary

Add support for loading Flux LoRA models in the xlabs format.

The xlabs format uses a different key structure than other Flux LoRA formats:

  • double_blocks.X.processor.{qkv|proj}_lora{1|2}.{down|up}.weight

Where:

  • lora1 → image attention stream (img_attn)
  • lora2 → text attention stream (txt_attn)
  • qkv → query/key/value projection
  • proj → output projection
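For reference, detecting this key structure can be sketched as below. This is a hypothetical illustration based only on the key pattern described above; the real helper in `flux_xlabs_lora_conversion_utils.py` may be named and structured differently:

```python
import re

# Assumed key pattern for the xlabs Flux LoRA format:
# double_blocks.X.processor.{qkv|proj}_lora{1|2}.{down|up}.weight
_XLABS_KEY_RE = re.compile(
    r"^double_blocks\.(\d+)\.processor\.(qkv|proj)_lora(1|2)\.(down|up)\.weight$"
)


def is_state_dict_likely_in_flux_xlabs_format(state_dict: dict) -> bool:
    """Heuristic: every key matches the xlabs pattern and the dict is non-empty."""
    return len(state_dict) > 0 and all(_XLABS_KEY_RE.match(k) for k in state_dict)
```

A heuristic like this slots into the existing detection cascade in `formats.py`: if no key falls outside the pattern, the state dict is treated as xlabs-format.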

Changes

  • Add FluxLoRAFormat.XLabs enum value to taxonomy
  • Add flux_xlabs_lora_conversion_utils.py with format detection and conversion
  • Update formats.py to include xlabs in the detection cascade
  • Update lora.py loader to handle xlabs format
  • Update model probe in configs/lora.py to accept recognized Flux LoRA formats (fixes installation of xlabs LoRAs)
  • Add unit tests for xlabs format detection and conversion
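The lora1/lora2 mapping described in the summary can be sketched as a key rewrite. The target key layout below is an assumption for illustration only; the exact internal key format emitted by the conversion utility may differ:

```python
import re

# lora1 -> image attention stream, lora2 -> text attention stream
# (per the xlabs format summary above).
_STREAM = {"1": "img_attn", "2": "txt_attn"}


def convert_xlabs_key(key: str) -> str:
    """Rewrite one xlabs-format key into a hypothetical internal key layout."""
    m = re.match(
        r"^double_blocks\.(\d+)\.processor\.(qkv|proj)_lora(1|2)\.(down|up)\.weight$",
        key,
    )
    if m is None:
        raise ValueError(f"Unrecognized xlabs LoRA key: {key}")
    block, proj, stream, direction = m.groups()
    # e.g. double_blocks.0.processor.qkv_lora1.down.weight
    #   -> double_blocks.0.img_attn.qkv.lora_down.weight
    return f"double_blocks.{block}.{_STREAM[stream]}.{proj}.lora_{direction}.weight"
```

The full conversion would apply this rewrite over the whole state dict before handing it to the generic Flux LoRA loader.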

Related Issues / Discussions

Adds support for xlabs-format Flux LoRAs, which were previously rejected at install time with the error "model does not match LyCORIS LoRA heuristics".

Example LoRA using this format: Flux Realism LoRA

QA Instructions

  1. Download an xlabs-format Flux LoRA (e.g., flux-RealismLora.safetensors from XLabs-AI)
  2. Install the LoRA via Model Manager → Import Models
  3. Verify it's recognized as a FLUX LoRA
  4. Use the LoRA in a Flux generation
  5. Verify the LoRA effect is applied to the output

Merge Plan

Standard merge, no special considerations.

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • ❗Changes to a redux slice have a corresponding migration
  • Documentation added / updated (if applicable)
  • Updated What's New copy (if doing a release after this PR)

@github-actions bot added labels: `python` (PRs that change python files), `backend` (PRs that change backend files), `python-tests` (PRs that change python tests). Dec 19, 2025