NuojCheng approved these changes on Feb 5, 2026, leaving a comment:

> Can we also have a script to generate new logical_shardings.json?
Description

Expand `sharding_dump.py` to output logical axes, and update the unit test `sharding_compare_test.py`.

Currently, two JSON files (`logical_shardings.json` and `named_shardings.json`) are generated per model/device_type/slice_number combination. To avoid negatively impacting GitHub Actions CI run times, we only check a limited set of golden files (deepseek2-16b/gpt-oss-20b/qwen-0.6b with tpu7x-16/v5p-16/v6e-16) into the MaxText repository.
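For orientation, here is a minimal sketch of how the per-combination golden files could be laid out and located. The `GOLDEN_ROOT` path and the `golden_paths` helper are illustrative assumptions, not the repository's actual layout:

```python
import os

# Hypothetical location of the checked-in golden files; the real paths may differ.
GOLDEN_ROOT = "tests/golden/sharding"


def golden_paths(model_name: str, topology: str, num_slice: int) -> tuple[str, str]:
  """Return the paths of the two JSON files produced per model/topology/slice count."""
  base = os.path.join(GOLDEN_ROOT, model_name, f"{topology}_{num_slice}slice")
  return (
      os.path.join(base, "logical_shardings.json"),
      os.path.join(base, "named_shardings.json"),
  )

# e.g. golden_paths("qwen-0.6b", "v6e-16", 1)
```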
Get baseline sharding info:

There are two primary ways to use the script `run_sharding_dump.py`:

1. Run the script without any command-line arguments to iterate through all test cases defined in `tests.utils.sharding_dump.TEST_CASES`. It will skip any combination for which the output files already exist. (A sketch of this flow follows the list.)

   Command:

   ```
   python3 -m tests.utils.run_sharding_dump
   ```

2. Provide `model_name`, `topology`, and `num_slice` as command-line arguments to generate sharding information for a single configuration. You must provide all three arguments.

   Command:

   ```
   python3 -m tests.utils.run_sharding_dump --model_name <model> --topology <topology> --num_slice <slices>
   ```

   Example:

   ```
   python3 -m tests.utils.run_sharding_dump --model_name gemma-7b --topology v5p-256 --num_slice 1
   ```
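A minimal sketch of the control flow behind the two modes above. Only `tests.utils.sharding_dump.TEST_CASES` is named in this PR; the argument handling, `golden_paths` (sketched earlier), and `dump_shardings` are illustrative assumptions:

```python
import argparse
import os

# TEST_CASES holds the predefined (model_name, topology, num_slice) combinations.
from tests.utils.sharding_dump import TEST_CASES


def main():
  parser = argparse.ArgumentParser()
  parser.add_argument("--model_name")
  parser.add_argument("--topology")
  parser.add_argument("--num_slice", type=int)
  args = parser.parse_args()

  if args.model_name or args.topology or args.num_slice:
    # Single-configuration mode: all three arguments must be provided together.
    if not (args.model_name and args.topology and args.num_slice):
      parser.error("--model_name, --topology, and --num_slice are all required")
    cases = [(args.model_name, args.topology, args.num_slice)]
  else:
    # No arguments: iterate every predefined combination.
    cases = TEST_CASES

  for model_name, topology, num_slice in cases:
    logical_path, named_path = golden_paths(model_name, topology, num_slice)  # hypothetical helper
    if os.path.exists(logical_path) and os.path.exists(named_path):
      continue  # skip combinations whose output files already exist
    dump_shardings(model_name, topology, num_slice)  # hypothetical dump entry point


if __name__ == "__main__":
  main()
```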
Compare sharding info:

```
python3 -m pytest tests/unit/sharding_compare_test.py -s -v -k "llama3.1-70b" 2>&1 | tee test_output.log
```
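Conceptually, the comparison test loads the two golden JSON files and checks them against freshly derived shardings. A sketch under the same assumptions as above (`golden_paths` and `compute_shardings` are hypothetical names, not the test's actual API):

```python
import json


def assert_matches_golden(model_name: str, topology: str, num_slice: int):
  """Compare freshly computed shardings against the checked-in golden JSON."""
  logical_path, named_path = golden_paths(model_name, topology, num_slice)  # hypothetical helper
  with open(logical_path) as f:
    expected_logical = json.load(f)
  with open(named_path) as f:
    expected_named = json.load(f)

  # compute_shardings is a stand-in for however the test re-derives shardings.
  actual_logical, actual_named = compute_shardings(model_name, topology, num_slice)

  # The two assertions correspond to the two failure modes linked under Tests:
  # a logical-axis mismatch vs. a named (physical weight) sharding mismatch.
  assert actual_logical == expected_logical, "logical sharding mismatch"
  assert actual_named == expected_named, "named (physical weight) sharding mismatch"
```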
Example

Content in `logical_shardings.json`

Content in `named_shardings.json`
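The actual file contents are collapsed in the PR view. Purely for illustration, entries in the two files might take shapes like these; the parameter path and axis names below are invented, not taken from the PR:

```python
# Invented entries showing only the general shape of each file; real parameter
# paths and axis names come from the checked-in golden files, not this sketch.
logical_shardings_example = {
    # parameter path -> logical axis name per tensor dimension
    "params/decoder/layers/mlp/wi/kernel": ["embed", "mlp"],
}

named_shardings_example = {
    # parameter path -> mesh axes each dimension is sharded over (None = replicated)
    "params/decoder/layers/mlp/wi/kernel": [["fsdp"], ["tensor"]],
}
```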
Tests

- UT for sharding dump comparison, failed (physical weight): https://paste.googleplex.com/6032726044049408
- UT for sharding dump comparison, failed (logical): https://paste.googleplex.com/5855428334452736
- UT for sharding dump comparison, succeeded: https://paste.googleplex.com/6737857618247680
Checklist

Before submitting this PR, please make sure (put X in square brackets):

- [ ] `gemini-review` label.