
Fix AdaLayerNormZeroSingle.forward return type annotation#13543

Open
Ricardo-M-L wants to merge 1 commit into huggingface:main from Ricardo-M-L:fix-adalayernormzerosingle-return-type

Conversation

@Ricardo-M-L
Contributor

What does this PR do?

AdaLayerNormZeroSingle.forward() has a return type annotation declaring a 5-element tuple:

def forward(self, x, emb=None) -> tuple[torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor, torch.Tensor]:
    ...
    return x, gate_msa  # actually returns 2 elements

The 5-element annotation was copied from AdaLayerNormZero.forward(), which does return 5 elements (x, gate_msa, shift_mlp, scale_mlp, gate_mlp); the "Single" variant only returns 2 (x, gate_msa).

All callers already unpack 2 elements (e.g., norm_hidden_states, gate = self.norm(hidden_states, emb=temb) in FluxSingleTransformerBlock), so this is purely a type annotation fix with no behavioral change.
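A minimal sketch of the mismatch and the fix (hypothetical stand-in functions, not the actual diffusers modules; floats replace tensors so the example runs without torch):

```python
def forward_before(x: float, emb: float) -> tuple[float, float, float, float, float]:
    # Annotation copied from AdaLayerNormZero declares 5 elements...
    gate_msa = emb
    return x, gate_msa  # ...but only 2 are actually returned (the bug)


def forward_after(x: float, emb: float) -> tuple[float, float]:
    # Corrected: the annotation now matches the 2 returned values
    gate_msa = emb
    return x, gate_msa


# Callers unpack 2 elements, consistent with the corrected annotation:
norm_hidden_states, gate = forward_after(1.0, 2.0)
```

Since the annotation is metadata only, runtime behavior is unchanged; the fix matters for type checkers (mypy, pyright) and IDE hints, which would otherwise report a 5-element tuple to callers.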

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline?

Who can review?

@yiyixuxu @sayakpaul

The return type was annotated as a 5-element tuple (copied from
AdaLayerNormZero), but the method actually returns 2 elements
(x, gate_msa). All callers correctly unpack 2 elements.
@github-actions bot added the models and size/S (PR with diff < 50 LOC) labels on Apr 22, 2026
