Commit 2330cbd

Merge pull request #30 from bhavnicksm/main

Update README; Add ToC + Update supported

bhavnicksm authored Mar 8, 2023
2 parents 979b418 + 7d54bc8

Showing 3 changed files with 65 additions and 26 deletions.

72 changes: 53 additions & 19 deletions README.md

# Nadir


![PyPI - Downloads](https://img.shields.io/pypi/dm/nadir)
![GitHub commit activity](https://img.shields.io/github/commit-activity/m/Dawn-Of-Eve/nadir)
![GitHub Repo stars](https://img.shields.io/github/stars/Dawn-Of-Eve/nadir?style=social)

PyTorch is a popular machine learning framework that provides a flexible and efficient way of building and training deep neural networks. This library, Nadir, is built on top of PyTorch to provide high-performing general-purpose optimisation algorithms.

:warning: ***Nadir is currently in development beta; every update may introduce breaking changes. User discretion and caution advised!*** :warning:

# Table of Contents

- [Nadir](#nadir)
- [Table of Contents](#table-of-contents)
- [Installation](#installation)
- [Simple Usage](#simple-usage)
- [Supported Optimisers](#supported-optimisers)
- [Acknowledgements](#acknowledgements)
- [Citation](#citation)


# Installation

Nadir is on the PyPI package index! :partying_face:

You can install it from PyPI:

```bash
$ pip install nadir
```
or install it from source:

```bash
$ pip install git+https://github.com/Dawn-Of-Eve/nadir.git
```
**Note:** Installing from source may give you a broken, in-development package; installing from PyPI is recommended.
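
Once installed, you can verify that the package imports and check its version (`__version__` is set in `src/nadir/__init__.py`, shown later in this diff):

```python
import nadir

print(nadir.__version__)  # e.g. "0.0.3" as of this commit
```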

# Simple Usage

```python
import nadir as nd

# some model of your choosing
model = ...

# set up the optimiser with a config
learning_rate = 1e-3  # example value
config = nd.SGDConfig(lr=learning_rate)
optimizer = nd.SGD(model.parameters(), config)

# Call the optimizer step
optimizer.step()
```
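
For context, here is a minimal sketch of one full training step, assuming Nadir's optimisers follow the standard `torch.optim.Optimizer` interface (`zero_grad`/`step`); the model, loss function, and batch below are illustrative placeholders, not part of Nadir's API:

```python
import torch
import torch.nn as nn
import nadir as nd

# placeholder model and dummy batch, for illustration only
model = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
loss_fn = nn.CrossEntropyLoss()
inputs = torch.randn(32, 784)
targets = torch.randint(0, 10, (32,))

config = nd.SGDConfig(lr=1e-3)
optimizer = nd.SGD(model.parameters(), config)

optimizer.zero_grad()                    # clear stale gradients
loss = loss_fn(model(inputs), targets)   # forward pass
loss.backward()                          # backpropagate
optimizer.step()                         # apply the SGD update
```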

# Supported Optimisers

| Optimiser | Paper |
|:---------: |:-----: |
| **SGD** | https://paperswithcode.com/method/sgd |
| **Momentum** | https://paperswithcode.com/method/sgd-with-momentum |
| **Adagrad** | https://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf |
| **RMSProp** | https://paperswithcode.com/method/rmsprop |
| **Adam** | https://arxiv.org/abs/1412.6980v9 |
| **Adamax** | https://arxiv.org/abs/1412.6980v9 |
| **AdamW** | https://arxiv.org/abs/1711.05101v3 |
| **Adadelta** | https://arxiv.org/abs/1212.5701v1 |
| **AMSGrad** | https://arxiv.org/abs/1904.09237v1 |
| **RAdam** | https://arxiv.org/abs/1908.03265v4 |
| **Lion** | https://arxiv.org/abs/2302.06675 |
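
Each optimiser above pairs with a config class, following the same pattern as the `SGD` example; here is a sketch for Adam, assuming only the shared `lr` field (other config fields may differ per optimiser):

```python
import nadir as nd

# swap optimisers by swapping the config/optimiser pair;
# `model` is any torch.nn.Module you have already built
config = nd.AdamConfig(lr=3e-4)
optimizer = nd.Adam(model.parameters(), config)
```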

# Acknowledgements

We would like to thank all the amazing contributors of this project who put so much effort into making this repository awesome! :heart:


# Citation

You can use the _Cite this repository_ button provided by GitHub, or use the following BibTeX:

```bibtex
@software{MinhasNadir,
  title   = {{Nadir: A Library for Bleeding-Edge Optimizers in PyTorch}},
  author  = {Minhas, Bhavnick and Kalathukunnel, Apsal},
  year    = 2023,
  month   = 3,
  version = {0.0.2}
}
```
11 changes: 9 additions & 2 deletions src/nadir/__init__.py

from .adadelta import Adadelta, AdadeltaConfig
from .adagrad import Adagrad, AdagradConfig
from .adam import Adam, AdamConfig
from .adamax import Adamax, AdamaxConfig
from .base import BaseOptimizer, BaseConfig
from .lion import Lion, LionConfig
from .momentum import Momentum, MomentumConfig
from .rmsprop import RMSProp, RMSPropConfig
from .radam import Radam, RadamConfig
from .sgd import SGD, SGDConfig


__version__ = "0.0.2"
__version__ = "0.0.3"

__all__ = ('Adadelta',
           'AdadeltaConfig',
           'Adagrad',
           'AdagradConfig',
           'Adam',
           'AdamConfig',
           'Adamax',
           'AdamaxConfig',
           'BaseOptimizer',
           'BaseConfig',
           'Lion',
           'LionConfig',
           'Momentum',
           'MomentumConfig',
           'RMSProp',
           'RMSPropConfig',
           'Radam',
           'RadamConfig',
           'SGD',
           'SGDConfig')
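
With this release the new optimisers are exposed at the package level; a quick sanity-check sketch (the no-config call mirrors the usage in tests/mnist.py below):

```python
import nadir as nd

print(nd.__version__)                     # "0.0.3" in this release
optimizer = nd.Lion(model.parameters())   # `model` is any torch.nn.Module; uses Lion's default config
```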
8 changes: 3 additions & 5 deletions tests/mnist.py

# writing the logging args as a namespace obj
largs = argparse.Namespace()
largs.run_name: str = 'Nadir-Lion'
largs.run_seed: str = args.seed


    def forward(self, x):
        ...
        return output

def train(args, model, device, train_loader, optimizer, epoch):
    model.train()
    for (data, target) in (pbar := tqdm(train_loader)):
        data, target = data.to(device), target.to(device)
        ...

def mnist_tester(model, optimizer=None, args=None):
    ...
    run.name = f'{largs.run_name}'
    run.config.update(args)
    run.config.update(largs)

    # Initialising the optimiser
    model = MNISTestNet().to(args.device)
    optimizer = nd.Lion(model.parameters())
    # config = AutoConfig(args.params..)
    # optimizer = args.optimizer(config)
