
Fix Flux multiple Lora loading bug #10388

Open
wants to merge 1 commit into main
Conversation

maxs-kan

What does this PR do?

The current approach of checking for a key with a base_layer suffix can break when multiple LoRA models are loaded. If the first loaded LoRA does not have weights for layer n but the second one does, loading the second model raises an error because the transformer's state dict does not yet contain the key n.base_layer.weight. So I explicitly check for the presence of a key with the base_layer suffix instead.
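To make the failure mode concrete, here is a minimal, self-contained sketch. The function and key names are illustrative assumptions and do not mirror the actual diffusers internals; the point is only that the presence of a `<module>.base_layer.weight` key should be checked per layer rather than inferred from whether some LoRA has already been loaded.

```python
# Illustrative sketch of the key lookup (not the actual diffusers code).
# After a LoRA adapter wraps a layer with PEFT, the original weight lives at
# "<module>.base_layer.weight"; layers the adapter skipped keep "<module>.weight".

def resolve_weight_key(state_dict: dict, module_name: str) -> str:
    """Pick the right key by explicitly checking for the base_layer variant."""
    wrapped_key = f"{module_name}.base_layer.weight"
    # Explicit membership check: safe even if an earlier adapter skipped this layer.
    return wrapped_key if wrapped_key in state_dict else f"{module_name}.weight"


# Toy state dict after loading a first adapter that only covered block 0.
state_dict = {
    "transformer_blocks.0.attn.to_q.base_layer.weight": "wrapped by adapter 1",
    "transformer_blocks.1.attn.to_q.weight": "never touched by adapter 1",
}

# A second adapter covering both blocks can now resolve its keys without a KeyError:
for name in ("transformer_blocks.0.attn.to_q", "transformer_blocks.1.attn.to_q"):
    print(name, "->", resolve_weight_key(state_dict, name))
```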

@yiyixuxu

a-r-r-o-w requested a review from hlky on December 26, 2024 at 12:25
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

hlky (Collaborator) commented Dec 26, 2024

Hi @maxs-kan, thanks for your contribution. Can you share some example LoRA checkpoints that lead to the bug?

maxs-kan (Author) commented Dec 26, 2024

Sure, try these in the same order:
pipe.load_lora_weights(hf_hub_download("TTPlanet/Migration_Lora_flux", "Migration_Lora_cloth.safetensors"), adapter_name="cloth")
pipe.load_lora_weights("alimama-creative/FLUX.1-Turbo-Alpha", adapter_name="turbo")
