
optimum can't use custom pipelines #2170

Closed
xiaoyao9184 opened this issue Jan 26, 2025 · 1 comment
xiaoyao9184 commented Jan 26, 2025

The config loaded from a Hugging Face Hub model is not passed through to `transformers_pipeline`, which makes `custom_pipelines` unusable. Passing a config in from the outside instead fails with `optimum.pipelines.pipelines_base.load_ort_pipeline() got multiple values for keyword argument 'config'`. Furthermore, since optimum's `pipeline()` does not accept a `trust_remote_code` parameter, there is no correct way to use `custom_pipelines` at all.

The config is loaded inside optimum's `pipeline()`:

    config = kwargs.get("config", None)
    if config is None and isinstance(model, str):
        config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **kwargs)
        hub_kwargs["_commit_hash"] = config._commit_hash

A config also cannot be passed in through the external `kwargs`, because it would then be forwarded twice:

    model, model_id, tokenizer, feature_extractor = MAPPING_LOADING_FUNC[accelerator](
        model,
        targeted_task,
        load_tokenizer,
        tokenizer,
        feature_extractor,
        load_feature_extractor,
        SUPPORTED_TASKS=supported_tasks,
        config=config,
        hub_kwargs=hub_kwargs,
        token=token,
        *model_kwargs,
        **kwargs,
    )
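The duplicate-keyword failure can be reproduced without optimum at all. Below is a minimal sketch, assuming a stand-in function with the same calling convention as `optimum.pipelines.pipelines_base.load_ort_pipeline` (the function body and argument values are hypothetical): because the caller reads `config` with `kwargs.get()` rather than `kwargs.pop()`, the same key arrives both explicitly and inside `**kwargs`.

```python
# Hypothetical stand-in for load_ort_pipeline, reduced to its calling
# convention so the duplicate-keyword failure can be demonstrated.
def load_ort_pipeline(model, targeted_task, load_tokenizer, tokenizer,
                      feature_extractor, load_feature_extractor,
                      SUPPORTED_TASKS=None, config=None, hub_kwargs=None,
                      token=None, **kwargs):
    return model, config

# config is read with kwargs.get(), not kwargs.pop(), so it is still
# present in **kwargs when the call is made:
kwargs = {"config": "user-supplied config object"}
try:
    load_ort_pipeline("model-id", "text-classification", True, None, None, False,
                      config=kwargs.get("config"), **kwargs)
except TypeError as exc:
    print(exc)  # ... got multiple values for keyword argument 'config'
```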

The config is then not passed on to `transformers_pipeline`:

    return transformers_pipeline(
        task,
        model=model,
        tokenizer=tokenizer,
        feature_extractor=feature_extractor,
        use_fast=use_fast,
        **kwargs,
    )

Since the model is already loaded (it is no longer a string), `transformers_pipeline` will not load the config itself:
https://github.com/huggingface/transformers/blob/2e752ead46a8845e8a160d2043c1336447895690/src/transformers/pipelines/__init__.py#L825-L852

    if isinstance(config, str):
        # ...
    elif config is None and isinstance(model, str):
        # ...
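A minimal sketch of that branching (an assumption modeled on the excerpt above, not the actual transformers source): a config is only resolved from the Hub when `config` or `model` is a string, so an already-instantiated model object leaves `config` as `None`.

```python
# Sketch of the branching above: config is only resolved when config
# or model is a string identifier.
def resolve_config(config, model):
    if isinstance(config, str):
        return ("loaded-from", config)   # placeholder for AutoConfig.from_pretrained(config)
    elif config is None and isinstance(model, str):
        return ("loaded-from", model)    # placeholder for AutoConfig.from_pretrained(model)
    return config                        # model already instantiated: config kept as given

class AlreadyLoadedModel:
    """Stands in for the ORTModel instance that optimum passes down."""

print(resolve_config(None, "bert-base-uncased"))   # ('loaded-from', 'bert-base-uncased')
print(resolve_config(None, AlreadyLoadedModel()))  # None: the config is never loaded
```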

As a result, custom_pipelines cannot be used.
https://github.com/huggingface/transformers/blob/2e752ead46a8845e8a160d2043c1336447895690/src/transformers/pipelines/__init__.py#L855-L856

    custom_tasks = {}
    if config is not None and len(getattr(config, "custom_pipelines", {})) > 0:
        custom_tasks = config.custom_pipelines
        # ...

    # Retrieve the task
    if task in custom_tasks:
        normalized_task = task
        targeted_task, task_options = clean_custom_task(custom_tasks[task])
        if pipeline_class is None:
            if not trust_remote_code:
                raise ValueError(
                    "Loading this pipeline requires you to execute the code in the pipeline file in that"
                    " repo on your local machine. Make sure you have read the code there to avoid malicious use, then"
                    " set the option `trust_remote_code=True` to remove this error."
                )
            class_ref = targeted_task["impl"]
            pipeline_class = get_class_from_dynamic_module(
                class_ref,
                model,
                code_revision=code_revision,
                **hub_kwargs,
            )
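The consequence can be sketched in isolation (an assumption mirroring the excerpt above, with a hypothetical task name and class): with `config=None`, the `custom_tasks` dict stays empty, so a custom pipeline task can never match, and the `trust_remote_code` branch is never even reached.

```python
# Sketch of the custom-task lookup above: config=None means custom_tasks
# stays empty and the task lookup fails.
def find_custom_task(task, config):
    custom_tasks = {}
    if config is not None and len(getattr(config, "custom_pipelines", {})) > 0:
        custom_tasks = config.custom_pipelines
    if task in custom_tasks:
        return custom_tasks[task]
    raise KeyError(f"Unknown task {task}")

class ConfigWithCustomPipeline:
    # Hypothetical custom_pipelines entry, as a model repo would declare it
    custom_pipelines = {"my-task": {"impl": "my_pipeline.MyPipeline"}}

print(find_custom_task("my-task", ConfigWithCustomPipeline()))  # found via the config
try:
    find_custom_task("my-task", None)  # optimum's path: config was never forwarded
except KeyError as exc:
    print(exc)
```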

This issue has been marked as stale because it has been open for 30 days with no activity. This thread will be automatically closed in 5 days if no further activity occurs.

github-actions bot added the Stale label on Feb 26, 2025
github-actions bot closed this as not planned on Mar 3, 2025