Support FP8 in batch split config #3109

Open
BirdsOfAFthr wants to merge 1 commit into main from amandaliang

Conversation

@BirdsOfAFthr
Collaborator

Description

This update enables FP8 quantization support for DeepSeek batch split configurations.

When quantization is active, the following changes apply:

  • Kernel Quantization: gmm kernels now accept FP8 recipes (configured via the MaxText command line) in both the forward and backward passes.

  • Weight All-Gathers: gmm weight all-gathers are now quantized along the non-expert, non-tensor, and non-tensor_transpose axes (see the sketch after the note below).

Note: Quantization for dispatch all-gathers is currently out of scope and will be implemented in a follow-up PR.
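
For illustration only, here is a minimal JAX sketch of the quantized weight all-gather idea described above; it is not the MaxText implementation. Each shard is cast to float8_e4m3fn with a per-shard scale, the FP8 payload and the scales are all-gathered, and the full weight is dequantized back to bfloat16 before use. The mesh axis name 'fsdp', the helper names, and the per-shard scaling recipe are assumptions made for the sketch.

```python
# Hypothetical sketch of an FP8-quantized weight all-gather; not the MaxText code.
import jax
import jax.numpy as jnp
import numpy as np
from jax.sharding import Mesh, PartitionSpec as P
from jax.experimental.shard_map import shard_map

FP8_MAX = 448.0  # largest finite value representable in float8_e4m3fn

def fp8_quantize(w):
  # Per-shard scalar scale so the largest magnitude maps onto the FP8 range.
  scale = jnp.max(jnp.abs(w)) / FP8_MAX
  return (w / scale).astype(jnp.float8_e4m3fn), scale

def quantized_weight_all_gather(w_shard):
  # Gather the FP8 payload (cheap on the wire) plus the tiny per-shard scales,
  # then dequantize locally back to bf16 for the subsequent gmm computation.
  q, scale = fp8_quantize(w_shard)
  q_full = jax.lax.all_gather(q, 'fsdp')      # (n_dev, rows, cols), FP8 on the wire
  s_full = jax.lax.all_gather(scale, 'fsdp')  # (n_dev,) scales
  w_full = q_full.astype(jnp.bfloat16) * s_full[:, None, None]
  return w_full.reshape(-1, w_shard.shape[-1])

mesh = Mesh(np.array(jax.devices()[:1]), ('fsdp',))  # 1-device mesh just to run the sketch
gather = shard_map(quantized_weight_all_gather, mesh=mesh,
                   in_specs=P('fsdp', None), out_specs=P(None, None))
print(gather(jnp.ones((4, 8), jnp.bfloat16)).shape)  # (4, 8)
```

Gathering FP8 values instead of bf16 roughly halves the bytes moved by the collective, while the per-shard scales add negligible traffic.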

Tests

  • Verification: Validated via end-to-end (e2e) performance and convergence benchmarks.

  • Coverage: Unit tests will be added in a subsequent update.

Checklist

Before submitting this PR, please make sure (put X in square brackets):

  • I have performed a self-review of my code. For an optional AI review, add the gemini-review label.
  • I have added necessary comments to my code, particularly in hard-to-understand areas.
  • I have run end-to-end tests and provided workload links above if applicable.
  • I have made or will make corresponding changes to the doc if needed, including adding new documentation pages to the relevant Table of Contents (toctree directive) as explained in our documentation.

@codecov

codecov bot commented Feb 7, 2026

Codecov Report

❌ Patch coverage is 25.00000% with 54 lines in your changes missing coverage. Please review.

Files with missing lines                    Patch %   Lines
src/MaxText/layers/deepseek_batchsplit.py   20.96%    49 Missing ⚠️
src/MaxText/layers/moe.py                   57.14%    2 Missing and 1 partial ⚠️
src/maxtext/kernels/megablox/ops.py         33.33%    0 Missing and 2 partials ⚠️
