Issues: huggingface/optimum
Pinned:
- #555: Community contribution - optimum.exporters.onnx support fo... (by michaelbenayoun; closed Jan 28, 2025; 43 comments)
- #488: Community contribution - BetterTransformer integration for ... (opened Nov 18, 2022 by younesbelkada; open; 25 comments)
- #568: [Quick poll] Give your opinion on the future of the Hugging F... (opened Dec 9, 2022 by LysandreJik; open)
Issues list
- #346: Quantisation of BigBirdForTokenClassification suffers significant performance drop [bug] (opened Aug 10, 2022 by lewisbails; 2 of 4 tasks)
- #171: Consistent use of "sequence-classification" vs. "text-classification", "audio-classification" (opened May 10, 2022 by fxmarty)
- #186: from_pretrained does not seem to honour TRANSFORMERS_CACHE env var [feature-request, inference] (opened May 13, 2022 by axsaucedo)
- #257: Optimum Inference next steps [inference] (opened Jul 6, 2022 by philschmid; 5 of 8 tasks)
- #258: Enable caching support for from_pretrained method in ORTModelForXXX (opened Jul 6, 2022 by philschmid)
- #260: Add support for ORTValue and IO Bindings to avoid unnecessary copies between devices (opened Jul 6, 2022 by philschmid)
- #274: Running into Issues Optimizing CausalLM Model for GPU [bug] (opened Jul 9, 2022 by sam-h-bean; 2 of 4 tasks)
- #300: Quantizer Not Respecting use_external_data_format [bug, onnxruntime, quantization] (opened Jul 17, 2022 by sam-h-bean; 3 of 4 tasks)
- #320: Feature request: allow user to provide tokenizer when loading transformer model (opened Jul 27, 2022 by jessecambon)
- #331: [optimum-onnxruntime] The number of calibration samples must be divisible by (num_calibration_shards * calibration_batch_size) [bug, quantization] (opened Aug 2, 2022 by fxmarty; 2 of 4 tasks)
- #347: ValueError on attention mask size check for ORTModelForSeq2SeqLM with blenderbot-400M-distill [bug] (opened Aug 10, 2022 by guillermo-gabrielli-fer; 3 of 4 tasks)
- #351: Add all available ONNX models to ORTConfigManager [good first issue] (opened Aug 16, 2022 by chainyo; 30 of 57 tasks)
- #355: Add support for joint quantization and graph optimization of the models (opened Aug 19, 2022 by jegork)
- #396: Optimize ONNX model based on encoder-decoder [graph-optimization, onnxruntime] (opened Sep 23, 2022 by jesusSant)
- #397: Incomplete symbolic shape inference [bug, graph-optimization, onnxruntime] (opened Sep 26, 2022 by Matthieu-Tinycoaching; 2 of 4 tasks)
- #406: Not being able to load LongT5 checkpoint with ORTModelForSeq2SeqLM [bug, onnxruntime] (opened Sep 29, 2022 by caffeinetoomuch; 2 of 4 tasks)
- #442: Add support for ORTModelForObjectDetection [onnx, onnxruntime] (opened Oct 31, 2022 by shivalikasingh95)
- #448: ORTOptimizer with bart-large-cnn throwing "Incomplete shape inference" [bug, onnxruntime] (opened Nov 4, 2022 by ghost; 2 of 4 tasks)
- #449: Atol validation error with optimum.exporters.onnx but none with transformers.onnx [bug] (opened Nov 4, 2022 by fxmarty; 2 of 4 tasks)
- #2218: Caching xenova repo onnx files - filename parameter not working [bug] (opened Mar 18, 2025 by gidzr; 2 of 4 tasks)