
move model for export preparation to be able reuse in other exporters #1698

Merged · 6 commits · Feb 27, 2024

Conversation

@eaidova (Contributor) commented Feb 16, 2024:

What does this PR do?

This PR moves the model preparation step for export outside the ONNX exporter so that it can be reused by other exporters (e.g. OpenVINO, Neuron) that may have their own export configs. The changes are backward compatible for the ONNX export and should not affect other components, but they make the model-preparation logic (partitioning, building export configs) reusable in other optimum exporters.
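The pattern the PR describes can be sketched roughly as follows. All names here are illustrative, not the actual optimum API: the point is that the backend-agnostic preparation step is factored out so each exporter only implements serialization.

```python
# Hypothetical sketch of exporter-agnostic model preparation
# (names are illustrative, not the real optimum internals).
from dataclasses import dataclass


@dataclass
class ExportConfig:
    task: str
    exporter: str  # e.g. "onnx", "openvino", "neuron"


def prepare_model_for_export(model_name: str, task: str, exporter: str):
    """Shared step: resolve the task, build the export config, and split
    the model into submodels (e.g. encoder/decoder partitioning)."""
    config = ExportConfig(task=task, exporter=exporter)
    submodels = {"model": model_name}  # real partitioning would happen here
    return submodels, config


def export_onnx(model_name: str, task: str) -> str:
    # Each backend reuses the shared preparation and only adds serialization.
    submodels, config = prepare_model_for_export(model_name, task, "onnx")
    return f"exported {list(submodels)} for {config.task} via {config.exporter}"
```

With this split, an OpenVINO or Neuron exporter would call the same `prepare_model_for_export` with its own export configs instead of duplicating the logic.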

@eaidova force-pushed the ea/move_model_preparation branch from ab08841 to b13da36 on February 19, 2024
@echarlaix (Collaborator) left a comment:

LGTM!

@fxmarty (Contributor) left a comment:

It looks OK to me, but keep in mind that the optimum main repo CI only tests the ONNX export.

Comment on lines +1124 to +1133
normalized_task = task.replace("-with-past", "")
if normalized_task not in cls.get_all_tasks():

Is this change related to the refactor?

@eaidova (Author) replied:

It is not related to the refactoring; it fixes a bug that prevented registering tasks with `-with-past` in this case (I found it while testing on the optimum-intel side, registering configs for OpenVINO).
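The fix in the quoted snippet can be illustrated with a minimal sketch. The `ALL_TASKS` set and `register_task` function below are stand-ins, not the actual `TasksManager` code: the point is that the raw task name is normalized before the membership check, so `-with-past` variants are no longer rejected.

```python
# Minimal sketch of the bug fix (illustrative names, not real optimum code).
# A task like "text-generation-with-past" is a variant of a known base task;
# validating the raw name against the task registry would wrongly reject it.

ALL_TASKS = {"text-generation", "text2text-generation"}


def register_task(task: str) -> str:
    # Fix: strip the "-with-past" suffix before checking the registry,
    # instead of checking `task in ALL_TASKS` directly.
    normalized_task = task.replace("-with-past", "")
    if normalized_task not in ALL_TASKS:
        raise KeyError(f"Unknown task: {task}")
    return task
```

With the normalization in place, `register_task("text-generation-with-past")` succeeds, while a genuinely unknown task still raises.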

@fxmarty (Contributor) commented Feb 26, 2024:

@eaidova Could you push an empty commit (or alternatively merge main), so that we run with #1707 & run the CI?
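Pushing an empty commit is a common way to re-trigger CI without changing any files. A quick sketch of the workflow (demonstrated in a throwaway repo here; on a real PR branch you would commit and push on the branch itself):

```shell
# Sketch: re-trigger CI by pushing a commit with no file changes.
set -e
repo=$(mktemp -d)            # throwaway repo for demonstration
cd "$repo"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "Trigger CI"
git log -1 --pretty=%s       # the empty commit exists
# On the actual PR branch, follow with: git push origin <branch-name>
```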

@eaidova force-pushed the ea/move_model_preparation branch from 0aab2a3 to d20325c on February 26, 2024
@eaidova (Author) commented Feb 26, 2024:

> @eaidova Could you push an empty commit (or alternatively merge main), so that we run with #1707 & run the CI?

@fxmarty done

@fxmarty (Contributor) commented Feb 26, 2024:

@eaidova Thank you, can you install ruff==0.1.5 and run make style?

@eaidova (Author) commented Feb 26, 2024:

> @eaidova Thank you, can you install ruff==0.1.5 and run make style?

@fxmarty, check_code_quality reports issues in optimum/optimum/examples/onnxruntime/training/image-classification/run_image_classification.py, a file I did not modify. Should I fix it?

@fxmarty (Contributor) commented Feb 26, 2024:

Yes please, sorry if it is unrelated.

@fxmarty (Contributor) commented Feb 26, 2024:

Can you fix

exporters/onnx/test_onnx_export.py:30: in <module>
    from optimum.exporters.onnx import (
E   ImportError: cannot import name 'get_decoder_models_for_export' from 'optimum.exporters.onnx' (/home/runner/work/optimum/optimum/optimum/exporters/onnx/__init__.py)

Maybe it's good to keep backward compatibility in exporters/onnx/utils.py (just importing from exporters/utils.py and redefining the functions), with warnings that the functions will be removed in a future version. Although they are not explicitly exposed, some people may use these utils.

@eaidova (Author) commented Feb 27, 2024:

> Can you fix
>
>     exporters/onnx/test_onnx_export.py:30: in <module>
>         from optimum.exporters.onnx import (
>     E   ImportError: cannot import name 'get_decoder_models_for_export' from 'optimum.exporters.onnx' (/home/runner/work/optimum/optimum/optimum/exporters/onnx/__init__.py)
>
> Maybe it's good to keep backward compatibility in exporters/onnx/utils.py (just importing from exporters/utils.py & redefining the functions), with warnings that the functions will be removed in a future version. Although not explicitly exposed, maybe some people use these utils.

@fxmarty added

@eaidova force-pushed the ea/move_model_preparation branch from dc64a62 to 132d86e on February 27, 2024
@fxmarty (Contributor) commented Feb 27, 2024:

Thank you, LGTM; the failing tests are unrelated!

@fxmarty fxmarty merged commit 80e89f1 into huggingface:main Feb 27, 2024
51 of 61 checks passed
young-developer pushed a commit to young-developer/optimum that referenced this pull request May 10, 2024
…huggingface#1698)

* move model for export preparation to be able reuse in other exporters

* rename to utils.py

* Fix register decorator for task with past

* exporter_type -> exporter

* fix style

* add backward compatibility and warnings