
Commit e52bdab

Removed debug dump of universal checkpoints

1 parent 09a35f5

1 file changed (+0, −3)

megatron/training.py

Lines changed: 0 additions & 3 deletions

@@ -460,9 +460,6 @@ def setup_model_and_optimizer(model_provider_func):

    else:
        args.iteration = 0
-
-   from .utils import dump_weights
-   dump_weights(f'{args.universal_checkpoint=}', args.iteration, model, optimizer)

    # tp_rank = mpu.get_tensor_model_parallel_rank()
    # pp_rank = mpu.get_pipeline_model_parallel_rank()
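The removed call labeled its weight dump with `f'{args.universal_checkpoint=}'`, which uses Python 3.8+ self-documenting f-strings: the trailing `=` renders both the expression text and its value. A minimal sketch of that labeling behavior, using a hypothetical stand-in for Megatron's parsed `args` namespace:

```python
from types import SimpleNamespace

# Hypothetical stand-in for the parsed argument namespace in training.py.
args = SimpleNamespace(universal_checkpoint=True)

# The '=' specifier in an f-string emits "expression=repr(value)",
# so the dump tag carried the flag name along with its setting.
tag = f'{args.universal_checkpoint=}'
print(tag)  # → args.universal_checkpoint=True
```

This is why the dump files were self-describing: each one recorded whether it came from a universal-checkpoint load without any extra bookkeeping.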

0 commit comments