Bugfix - Fix DeepSpeed BF16 config validation error #796
Pull request overview
Fixes DeepSpeed BF16 initialization failures in the Megatron-GPT benchmark by generating precision-specific DeepSpeed config sections, avoiding BF16 rejection of FP16-only loss-scaling fields under newer DeepSpeed/Pydantic validation.
Changes:
- Generate FP16 DeepSpeed config with loss-scaling parameters (unchanged behavior).
- Generate BF16 DeepSpeed config with only `enabled: True` to satisfy strict BF16 schema validation.
- Omit the precision section entirely for FP32 runs.
@@ -307,15 +307,23 @@ def __prepare_deespeed_config(self, precision_megatron):
    """Prepare deepspeed configs."""
    self._config_json_path = os.path.join(self._args.data_home, 'ds_config_gpt.json')
    # Load deepspeed config template json file
The comment says "Load deepspeed config template json file", but this function is constructing the template dict inline (no JSON template is loaded). Updating/removing this comment would avoid confusion about where the config comes from.
Suggested change:
- # Load deepspeed config template json file
+ # Build deepspeed config template in memory
Description
The megatron-gpt:deepspeed benchmark fails with return code 3 (INVALID_BENCHMARK_RESULT) during the BF16 training round. The benchmark runs two precision rounds (FP16 then BF16), and while FP16 succeeds, BF16 crashes at DeepSpeed initialization with:
pydantic_core._pydantic_core.ValidationError: 5 validation errors for DeepSpeedBF16Config
loss_scale - Extra inputs are not permitted
loss_scale_window - Extra inputs are not permitted
min_loss_scale - Extra inputs are not permitted
initial_scale_power - Extra inputs are not permitted
hysteresis - Extra inputs are not permitted
Root Cause
__prepare_deespeed_config() in megatron_gpt3.py uses the same precision_template for both FP16 and BF16 configs. This template includes loss-scaling parameters (loss_scale, loss_scale_window, min_loss_scale, initial_scale_power, hysteresis) that are valid for FP16 but rejected by DeepSpeedBF16Config, which uses pydantic strict validation and does not accept extra fields.
BF16 does not need loss scaling because it has sufficient dynamic range to avoid the underflow/overflow issues that FP16 faces.
This was always technically incorrect, but only became a hard failure when DeepSpeed migrated from Pydantic v1 to Pydantic v2 (around DeepSpeed v0.15–v0.16). In Pydantic v1, the extra = "forbid" setting was less strictly enforced, so the extra fields were silently ignored. Pydantic v2 strictly rejects all unknown fields with a ValidationError.
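The stricter behavior can be reproduced with a minimal Pydantic v2 model. This is an illustrative sketch, not DeepSpeed's actual `DeepSpeedBF16Config` class; the field names mirror the error above:

```python
from pydantic import BaseModel, ConfigDict, ValidationError


class BF16Config(BaseModel):
    """Toy stand-in for a strict BF16 config schema."""

    # Pydantic v2 enforces extra='forbid' strictly: any unknown
    # field raises a ValidationError instead of being ignored.
    model_config = ConfigDict(extra='forbid')
    enabled: bool = False


# Valid: only the declared field is passed.
ok = BF16Config(enabled=True)

# Invalid: an FP16-only loss-scaling field is rejected.
try:
    BF16Config(enabled=True, loss_scale=0)
except ValidationError as err:
    # Each unknown field contributes one 'extra_forbidden' error,
    # matching the five errors seen in the benchmark traceback.
    print(err.error_count())
```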
Fix
Generate precision-specific DeepSpeed configs: an FP16 section that keeps the loss-scaling parameters, a BF16 section containing only `enabled: True`, and no precision section at all for FP32.
This fix is backward compatible — passing only {'enabled': True} for BF16 is valid in all DeepSpeed versions, since the loss-scaling fields were never used by BF16.
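The fix can be sketched as a small helper that returns the precision section to merge into the DeepSpeed config. The function name and shape are illustrative; the actual change lives in `__prepare_deespeed_config()` in `megatron_gpt3.py`:

```python
def build_precision_section(precision):
    """Return the precision-specific section of a DeepSpeed config dict.

    Illustrative sketch of the fix; names are hypothetical.
    """
    if precision == 'fp16':
        # FP16 needs dynamic loss scaling to cope with its narrow
        # dynamic range; these fields are valid only under 'fp16'.
        return {
            'fp16': {
                'enabled': True,
                'loss_scale': 0,
                'loss_scale_window': 1000,
                'min_loss_scale': 1,
                'initial_scale_power': 16,
                'hysteresis': 2,
            }
        }
    if precision == 'bf16':
        # BF16 has FP32-like dynamic range, so DeepSpeedBF16Config
        # accepts only 'enabled'; extra fields fail strict validation.
        return {'bf16': {'enabled': True}}
    # FP32: omit the precision section entirely.
    return {}
```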