Releases · eagomez2/moduleprofiler
moduleprofiler v0.0.4
- Update documentation.
- Add ops estimation for `torch.nn.BatchNorm1d` and `torch.nn.BatchNorm2d`.
- Add `exclude_from_ops` option to `ModuleProfiler` (a short sketch follows this list).
- Fix `NaN` bit sizes and data types for modules without trainable or non-trainable parameters.
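A minimal sketch of how the new `exclude_from_ops` option might be used. Only `ModuleProfiler` and the option name come from these notes; passing the option as a constructor argument listing module types to skip, and the `count_ops` method name, are assumptions for illustration, so check the package documentation for the exact API.

```python
# Sketch under an assumed API: skip activation modules during ops estimation.
import torch
from moduleprofiler import ModuleProfiler

model = torch.nn.Sequential(
    torch.nn.Linear(16, 8),
    torch.nn.ReLU(),
    torch.nn.Linear(8, 4),
)

# Assumption: `exclude_from_ops` lists module types whose ops are not counted.
profiler = ModuleProfiler(exclude_from_ops=[torch.nn.ReLU])

# Assumption: an ops-estimation call that takes the module and a sample input.
ops = profiler.count_ops(model, torch.randn(1, 16))
print(ops)
```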
moduleprofiler v0.0.3
- Added `torch.nn.LayerNorm` reference.
- Updated `torch.nn.LSTM` documentation.
- Fixed `torch.nn.LayerNorm` estimation.
moduleprofiler v0.0.2
- Updated documentation.
- Fixed `ConvTranspose1d` and `ConvTranspose2d` overestimation of additions per filter.
moduleprofiler v0.0.1
First release of `moduleprofiler`, a free open-source package to profile `torch.nn.Module` modules and obtain useful information to design a model that fits your needs and constraints at development time.
With `moduleprofiler` you can:
- Calculate the number of parameters of your model.
- Trace the input and output sizes of each component of your model.
- Estimate the number of operations your model performs in a forward pass.
- Calculate per-module and total inference time.
All results can be obtained in one of the following formats (a minimal usage sketch follows the list):
- `dict` (default output format)
- `pandas.DataFrame` (to perform further calculations or filtering in your code)
- `html` (to export as a webpage)
- `LaTeX` (to include in your publications)
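A minimal usage sketch assembled from the feature list above. Only `ModuleProfiler` is named in these notes; the `count_params` and `count_ops` method names are assumptions for illustration, so refer to the package documentation for the exact calls and for the `pandas.DataFrame`, `html`, and LaTeX output variants.

```python
# Sketch under assumed method names: profile a small model at development time.
import torch
from moduleprofiler import ModuleProfiler

model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
)
x = torch.randn(1, 128)

profiler = ModuleProfiler()

# Number of parameters per module (a dict is the default output format).
params = profiler.count_params(model)

# Estimated number of operations for a forward pass with the given input.
ops = profiler.count_ops(model, x)

print(params)
print(ops)
```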