Merge pull request #62 from supernnova/adacs_doc_revision
Adacs doc revision
anaismoller authored Oct 28, 2024
2 parents 822b3f2 + 7843343 commit a8e3ab8
Showing 7 changed files with 32 additions and 158 deletions.
137 changes: 26 additions & 111 deletions README.md
@@ -7,135 +7,50 @@
[![Build Status](https://travis-ci.org/supernnova/SuperNNova.svg?branch=master)](https://travis-ci.org/supernnova/SuperNNova)


### What is SuperNNova (SNN)

SuperNNova is an open-source photometric time-series classification framework.

The framework includes different RNN architectures (LSTM, GRU, Bayesian RNNs) and can be trained with simulations in `.csv` and `SNANA FITS` format. SNN is part of the [PIPPIN](https://github.com/dessn/Pippin) end-to-end cosmology pipeline.

You can train your own model for time-series classification (binary or multi-class) using photometry and additional features.


Please include the full citation if you use this material in your research: [A Möller and T de Boissière,
MNRAS, Volume 491, Issue 3, January 2020, Pages 4277–4293.](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173)

### Read the documentation
For the main branch:
[https://supernnova.readthedocs.io](https://supernnova.readthedocs.io/en/latest/)

The paper branch differs slightly from the main branch. Take a look at "changelog_paper_to_new_branch" or [build the docs for this branch](#docs).

### Installation
Clone this repository (preferred)
```bash
git clone https://github.com/supernnova/supernnova.git
```
or install the pip module (check versioning)
```bash
pip install supernnova
```
and configure the environment using this [documentation](https://supernnova.readthedocs.io/en/latest/installation/python.html)
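
Either way, a quick sanity check that the package is importable (a minimal sketch, assuming the install above succeeded):

```bash
# minimal check that the installed package can be imported
python -c "import supernnova; print(supernnova.__file__)"
```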

### Read the paper

Links to the publication: [MNRAS](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173), [arXiv](https://arxiv.org/abs/1901.06384). All results quoted in these publications were produced using the "paper" branch, which is frozen for reproducibility.
### Read the papers

Please include the full citation if you use this material in your research: [A Möller and T de Boissière,
MNRAS, Volume 491, Issue 3, January 2020, Pages 4277–4293.](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173)

To reproduce [Möller & de Boissière, 2019 MNRAS](https://academic.oup.com/mnras/article-abstract/491/3/4277/5651173), switch to the `paper` branch and build the documentation.
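
A minimal sketch of switching to that branch (assuming the repository was cloned as above):

```bash
# check out the frozen paper branch, then follow "Build the docs" below
git checkout paper
```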

## Table of contents
1. [Repository overview](#overview)
2. [Getting Started](#start)
   1. [Use Poetry in new releases]
   2. [With Conda](#conda)
   3. [With Docker](#docker)
3. [Usage](#usage)
4. [Reproduce paper](#paper)
5. [Pipeline Description](#pipeline)
6. [Running tests](#tests)
7. [Build the docs](#docs)

## Repository overview <a name="overview"></a>

├── supernnova --> main module
│   ├── data --> scripts to create the processed database
│   ├── visualization --> data plotting scripts
│   ├── training --> training scripts
│   ├── validation --> validation scripts
│   └── utils --> utilities used throughout the module
├── tests --> unit tests to check data processing
└── sandbox --> WIP scripts

## Getting started <a name="start"></a>

### With Conda <a name="conda"></a>

cd env

# Create conda environment
conda create --name <env> --file <conda_file_of_your_choice>

# Activate conda environment
source activate <env>
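
For example (the environment name and file name below are illustrative assumptions; pick one of the files shipped in `env/`):

```bash
# illustrative only: "snn" and the file name are assumptions, not shipped defaults
conda create --name snn --file conda_env_cpu.txt
source activate snn
```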

### With Docker <a name="docker"></a>

cd env

# Build docker images
make cpu # cpu image
make gpu # gpu image (requires NVIDIA Drivers + nvidia-docker)

# Launch docker container
python launch_docker.py (--use_gpu to run GPU based container)

To reproduce the Dark Energy Survey analyses, use commit `fcf8584b64974ef7a238eac718e01be4ed637a1d`:
- [Möller et al. 2022 MNRAS](https://ui.adsabs.harvard.edu/abs/2022MNRAS.514.5159M/abstract)
- [Möller et al. 2024 MNRAS](https://ui.adsabs.harvard.edu/abs/2024MNRAS.533.2073M/abstract)
- [Vincenzi et al. 2023 MNRAS](https://ui.adsabs.harvard.edu/abs/2023MNRAS.518.1106V/abstract)
- [DES Collaboration 2024 ApJ](https://ui.adsabs.harvard.edu/abs/2024ApJ...973L..14D/abstract)

For more detailed instructions, check the full [setup instructions](https://supernnova.readthedocs.io/en/latest/installation/python.html)
To reproduce Fink analyses up to 2024, use commit `fcf8584b64974ef7a238eac718e01be4ed637a1d` and check [Fink's GitHub](https://github.com/astrolabsoftware/fink-science).
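
A sketch of pinning a fresh clone to the commit referenced above:

```bash
# pin the repository to the commit used for the DES and Fink analyses above
git clone https://github.com/supernnova/supernnova.git
cd supernnova
git checkout fcf8584b64974ef7a238eac718e01be4ed637a1d
```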


## Usage <a name="usage"></a>

When cloning this repository:

# Create data
python run.py --data --dump_dir tests/dump --raw_dir tests/raw --fits_dir tests/fits

# Train a baseline RNN
python run.py --train_rnn --dump_dir tests/dump

# Train a variational dropout RNN
python run.py --train_rnn --model variational --dump_dir tests/dump

# Train a Bayes By Backprop RNN
python run.py --train_rnn --model bayesian --dump_dir tests/dump

# Train a RandomForest
python run.py --train_rf --dump_dir tests/dump
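
Validation on the test set (see the pipeline description below) uses the same entry point; the flag name here is assumed from the SuperNNova documentation:

```bash
# flag assumed from the SuperNNova docs: get test-set predictions for the trained baseline RNN
python run.py --validate_rnn --dump_dir tests/dump
```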

When using pip, a full example is available at [https://supernnova.readthedocs.io](https://supernnova.readthedocs.io/en/latest/):

# Python
import supernnova.conf as conf
from supernnova.data import make_dataset

# get config args
args = conf.get_args()

# create database
args.data = True # conf: making new dataset
args.dump_dir = "tests/dump" # conf: where the dataset will be saved
args.raw_dir = "tests/raw" # conf: where raw photometry files are saved
args.fits_dir = "tests/fits" # conf: where salt2fits are saved
settings = conf.get_settings(args) # conf: set settings
make_dataset.make_dataset(settings) # make dataset

## Reproduce paper results <a name="paper"></a>
Please switch to the ``paper`` branch:

python run_paper.py

## General pipeline description <a name="pipeline"></a>

- Parse raw data in FITS format
- Create processed database in HDF5 format
- Train Recurrent Neural Networks (RNN) or Random Forests (RF) to classify photometric lightcurves
- Validate on test set
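
Assuming the test fixtures from the Usage section above, these steps map onto the CLI as in this minimal sketch:

```bash
# end-to-end sketch with the test fixtures from the Usage section
python run.py --data --dump_dir tests/dump --raw_dir tests/raw --fits_dir tests/fits  # parse raw data, build the HDF5 database
python run.py --train_rnn --dump_dir tests/dump                                       # train a baseline RNN classifier
python run.py --train_rf --dump_dir tests/dump                                        # or train a Random Forest
```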


## Running tests with py.test <a name="tests"></a>

PYTHONPATH=$PWD:$PYTHONPATH pytest -W ignore --cov supernnova tests


## Build docs <a name="docs"></a>
### Build docs <a name="docs"></a>

cd docs && make clean && make html && cd ..
firefox docs/_build/html/index.html


### ADACS
This package has been updated to a recent PyTorch version through the [ADACS Merit Allocation Program](https://adacs.org.au/merit-allocation-program) (2023-2024).
2 changes: 1 addition & 1 deletion docs/configuration/index.rst
@@ -61,7 +61,7 @@ Visualization Parameters
Argument Type Help
=============================== ============ ==========================================================
--explore_lightcurves bool Plot a random selection of lightcurves
--plot_lcs bool Plot a random selection of lightcurves predictions
--plot_prediction_distribution bool Plot lcs and the histogram of probability for each class
=============================== ============ ==========================================================

4 changes: 2 additions & 2 deletions docs/data/index.rst
@@ -152,13 +152,13 @@ e.g. ``--redshift_label REDSHIFT_FINAL``.

8. Use a different sntype label
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The default sntype label is ``SNTYPE``. If you want to use your own label, you'll need to specify it.
The default sntype label is ``SNTYPE``. If you want to use your own label, you'll need to specify it and provide it in the `HEAD` file.

.. code-block:: bash
snn make_data --dump_dir <path/to/save/database/> --raw_dir <path/to/raw/data/> --sntype_var <your/label>
e.g. ``--redshift_label SIM_SNTYPE``.
e.g. ``--sntype_var MYTYPE``.

9. Mask photometry
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -9,7 +9,7 @@ Welcome to SuperNNova's documentation!
:caption: Getting started

installation/system.rst
installation/python.rst
installation/env.rst
installation/five_minute_guide.rst
installation/five_minute_guide_module.rst
installation/onthefly.rst
2 changes: 1 addition & 1 deletion docs/installation/python.rst → docs/installation/env.rst
@@ -45,7 +45,7 @@ if you create environment from "conda_gpu_env.yml".
which poetry # should print <conda env>/bin/poetry
poetry install
For developers, please refer to :ref:`dev-python-env`.
For developers (including testing local documentation), please refer to :ref:`dev-python-env`.


.. _DockerConfigurations:
2 changes: 1 addition & 1 deletion docs/visualization/index.rst
@@ -74,7 +74,7 @@ Below is a sample plot:

.. image:: preds_variational.png


Beware: only MC Dropout (variational) and Bayes by Backprop (bayesian) models have this feature.

Predictions from multiple models
-------------------------------------------
41 changes: 0 additions & 41 deletions run_yml.py

This file was deleted.
