     when running `transformers-cli login` (stored in `~/.huggingface`).
     model_kwargs (`Optional[Dict[str, Any]]`, defaults to `None`):
         Experimental usage: keyword arguments to pass to the model during
-        the export. This argument should be used along the `custom_onnx_configs` argument
+        the export. This argument should be used along the `custom_export_configs` argument
         in case, for example, the model inputs/outputs are changed (for example, if
         `model_kwargs={"output_attentions": True}` is passed).
-    custom_onnx_configs (`Optional[Dict[str, OnnxConfig]]`, defaults to `None`):
-        Experimental usage: override the default ONNX config used for the given model. This argument may be useful for advanced users that desire finer-grained control over the export. An example is available [here](https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model).
+    custom_export_configs (`Optional[Dict[str, OnnxConfig]]`, defaults to `None`):
+        Experimental usage: override the default export config used for the given model. This argument may be useful for advanced users that desire finer-grained control over the export. An example is available [here](https://huggingface.co/docs/optimum/main/en/exporters/onnx/usage_guides/export_a_model).
     fn_get_submodels (`Optional[Callable]`, defaults to `None`):
         Experimental usage: override the default submodels that are used at the export. This is
         especially useful when exporting a custom architecture that needs to split the ONNX model (e.g. encoder-decoder). If unspecified with custom models, Optimum will try to use the default submodels for the given task, with no guarantee of success.
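The `custom_export_configs` parameter described above overrides the default per-submodel export configs on a key-by-key basis. The following is a minimal, self-contained sketch of that merge pattern; the default values, submodel names, and the `resolve_export_configs` helper are illustrative stand-ins, not the real optimum-intel API or classes:

```python
# Hypothetical defaults keyed by submodel name (stand-ins for OnnxConfig objects).
DEFAULT_EXPORT_CONFIGS = {
    "encoder_model": {"opset": 13, "use_past": False},
    "decoder_model": {"opset": 13, "use_past": True},
}


def resolve_export_configs(custom_export_configs=None):
    """Merge user-supplied overrides over the default per-submodel configs.

    Keys present in custom_export_configs replace the corresponding default
    entry; submodels the user does not mention keep their defaults.
    """
    configs = dict(DEFAULT_EXPORT_CONFIGS)
    if custom_export_configs:
        configs.update(custom_export_configs)
    return configs


# A user overriding only the decoder's config:
resolved = resolve_export_configs({"decoder_model": {"opset": 14, "use_past": True}})
print(resolved["decoder_model"]["opset"])  # -> 14 (overridden)
print(resolved["encoder_model"]["opset"])  # -> 13 (default kept)
```

This mirrors the docstring's intent: users who need finer-grained control supply configs only for the submodels they want to change.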
@@ -134,7 +134,7 @@ def main_export(
    ```python
>>> from optimum.exporters.openvino import main_export
f"Asked to export a {model_type} model for the task {task}{autodetected_message}, but the Optimum OpenVINO exporter only supports the tasks {', '.join(model_tasks.keys())} for {model_type}. Please use a supported task. Please open an issue at https://github.com/huggingface/optimum/issues if you would like the task {task} to be supported in the ONNX export for {model_type}."