
The RM training does not save value_head.bin, which is needed by the rm_engine function. #19

Open
PanYicheng opened this issue Feb 18, 2025 · 1 comment

PanYicheng commented Feb 18, 2025

After training the RM, the value head weights are stored inside model.safetensors, but the rm_engine function below reads a separate file named value_head.bin:

```python
import os

import torch
from transformers import AutoConfig, AutoTokenizer
from vllm import LLM
# ValueHead is the value-head module used in this repo, e.g.
# from trl.models.modeling_value_head import ValueHead

def rm_engine(config):
    if config.need_value_func:
        prm_model = LLM(
            model=config.reward_model_dir,
            task="reward",
            tensor_parallel_size=1,
            trust_remote_code=True,
            max_model_len=config.max_model_len,
            enforce_eager=True,
            swap_space=0,
            # for qwen 7b, the RM needs ~15 GB of memory
            gpu_memory_utilization=0.98 - config.llm_gpu_memory_utilization,
        )

        # Load the value-head weights from a separate file and strip the
        # "v_head." prefix so the keys match ValueHead's state dict.
        v_head_state = torch.load(
            os.path.join(config.reward_model_dir, "value_head.bin"),
            weights_only=True,
        )
        v_state = {name.replace("v_head.", ""): param for name, param in v_head_state.items()}

        model_config = AutoConfig.from_pretrained(
            config.reward_model_dir, trust_remote_code=True, use_cache=False
        )
        v_head = ValueHead(model_config)
        v_head.load_state_dict(v_state)
        v_head.eval()

        tokenizer = AutoTokenizer.from_pretrained(
            config.reward_model_dir,
            trust_remote_code=True,
            use_cache=False,
            split_special_tokens=False,
        )
        return prm_model, v_head, tokenizer
    else:
        return None, None, None
```
@Gxy-2001 (Collaborator) commented

Thank you for your interest. I have uploaded save_rm.py, which is the script for converting the RM checkpoint files. Let us know if you have any further questions!

#22
