
Commit

Merge pull request #76 from legend-exp/refactor
small docs fixes
sofia-calgaro authored Feb 14, 2025
2 parents f4b53c0 + f844b5a commit 98b5343
Showing 7 changed files with 20 additions and 13 deletions.
8 changes: 2 additions & 6 deletions docs/src/api.md
Original file line number Diff line number Diff line change
@@ -1,14 +1,10 @@
# API

## Functions and macros

```@index
Pages = ["internal_api.md"]
Order = [:macro, :function]
Pages = ["api.md"]
Order = [:function]
```

## Documentation

### analysis.jl
```@docs
ZeroNuFit.Analysis.retrieve_real_fit_results
2 changes: 2 additions & 0 deletions docs/src/config.md
@@ -1,3 +1,5 @@
# Configuration file

Table of contents:

```@contents
1 change: 1 addition & 0 deletions docs/src/index.md
@@ -20,6 +20,7 @@ Pages = [
"inputs.md",
"toys.md",
"tutorial.md",
"api.md",
]
Depth = 1
```
2 changes: 2 additions & 0 deletions docs/src/inputs.md
@@ -1,3 +1,5 @@
# Partitions and events

The fit takes as input two JSON files (allowing a full customization of the fit), whose paths have to be specified in the `config.json` file.
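For illustration only, the pointing entries in `config.json` could look like the fragment below — the key names and paths here are hypothetical placeholders, not the documented schema (see the "Configuration file" section for the real keys):

```json
{
  "partitions": "inputs/partitions.json",
  "events": "inputs/events.json"
}
```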

Table of contents:
16 changes: 9 additions & 7 deletions docs/src/likelihood.md
@@ -1,3 +1,5 @@
# Likelihood implementation

Table of contents:

```@contents
@@ -10,12 +12,12 @@ The implemented unbinned Likelihood function reads as:

```math
\begin{aligned}
\mathcal{L}(\Gamma) = \prod_k \bigg[ \textrm{Pois}(s_k+b_k) \bigg[ \prod_{i_k=1}^{N_k} \frac{1}{s_k + b_k} \left( b_k\cdot p_{\rm b}(E) + s_{\rm k}\cdot p_{\rm s}(E) \right) \bigg] \bigg]
\mathcal{L}(\Gamma,\, \boldsymbol{BI},\,\boldsymbol{\theta}|D) = \prod_k \bigg[ \textrm{Pois}(s_k+b_k) \bigg[ \prod_{i_k=1}^{N_k} \frac{1}{s_k + b_k} \left( b_k\cdot p_{\rm b}(E) + s_{\rm k}\cdot p_{\rm s}(E) \right) \bigg] \bigg]
\end{aligned}
```

where $\Gamma$ is the signal rate, $\boldsymbol{BI}$ are the background indices, $\boldsymbol{\theta}$ are the nuisance parameters, and $D$ are the observed data.
Here, the first product runs over the number of partitions _k_ ($N_{\rm p}$ partitions in total) and the second over the events _i_ in a given partition ($N_{\rm k}$ events in total).
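As a numerical sketch (in Python, for illustration — not the actual ZeroNuFit implementation), the contribution of one partition to the log-likelihood can be written compactly by noting that the $(s_k+b_k)^{N_k}$ factor of the Poisson term cancels against the $1/(s_k+b_k)$ normalization of the event product:

```python
import math

def partition_loglik(s_k, b_k, energies, p_s, p_b):
    """Sketch of one partition's log-likelihood term; names are
    illustrative, not the ZeroNuFit API. The (s_k + b_k)^N_k factors
    of the Poisson and mixture terms cancel analytically."""
    ll = -(s_k + b_k) - math.lgamma(len(energies) + 1)  # -log(N_k!)
    for E in energies:
        ll += math.log(b_k * p_b(E) + s_k * p_s(E))
    return ll

# Illustrative pdfs: flat background over an assumed 240 keV analysis
# window and a Gaussian signal at Q_bb = 2039.06 keV, sigma = 1.1 keV.
p_b = lambda E: 1 / 240.0
p_s = lambda E: math.exp(-0.5 * ((E - 2039.06) / 1.1) ** 2) / (1.1 * math.sqrt(2 * math.pi))
```

For an empty partition this reduces to $-(s_k+b_k)$, consistent with the no-events simplification discussed next.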

In case no events are found in a given partition _k_, the above Likelihood expression simplifies into

```math
@@ -46,7 +48,7 @@ Taking $x=Q_{\beta\beta} - \Delta_{\rm k}$, the signal energy distribution for e
\end{aligned}
```

Alternatively, the signal energy distribution can also be shaped as a Gaussian with a tail at low energies (e.g. for MAJORANA DEMONSTRATOR),
Alternatively, the signal energy distribution can also be shaped as a Gaussian with a tail at low energies (e.g. for MAJORANA DEMONSTRATOR data),

```math
\begin{aligned}
@@ -68,7 +70,7 @@ and

```math
\begin{aligned}
s_{\rm k} = \frac{\text{ln}\,2\,\mathcal{N}_{\rm A}}{m_{\rm 76}} \cdot (\varepsilon_{\rm k} + \alpha \cdot \sigma_{\varepsilon_{\rm k}}) \cdot \mathcal{E}_{\rm k} \cdot \Gamma
s_{\rm k} = \frac{\text{ln}\,2\cdot \mathcal{N}_{\rm A}}{m_{\rm 76}} \cdot (\varepsilon_{\rm k} + \alpha \cdot \sigma_{\varepsilon_{\rm k}}) \cdot \mathcal{E}_{\rm k} \cdot \Gamma
\end{aligned}
```
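Plugging numbers into the $s_{\rm k}$ formula above (a sketch with illustrative values; the unit conventions — exposure in kg·yr, molar mass of $^{76}$Ge in g/mol, enrichment ignored — are assumptions, not taken from the framework):

```python
import math

N_A = 6.02214076e23  # Avogadro constant [1/mol]
M_76 = 75.921        # molar mass of Ge-76 [g/mol] (illustrative; enrichment fraction ignored)

def expected_signal_counts(eff, sigma_eff, alpha, exposure_kg_yr, gamma_per_yr):
    """s_k = ln2 * N_A / m_76 * (eff + alpha * sigma_eff) * exposure * Gamma.
    Exposure is converted from kg*yr to g*yr to match the g/mol molar mass."""
    return (math.log(2) * N_A / M_76) * (eff + alpha * sigma_eff) * (exposure_kg_yr * 1e3) * gamma_per_yr

# e.g. 100 kg*yr of exposure at Gamma = 1e-26 / yr with a 60% efficiency
s = expected_signal_counts(0.6, 0.01, 0.0, 100.0, 1e-26)
```

With these illustrative inputs one expects of order a few signal counts, which shows why experiments at this scale are sensitive to half-lives around $10^{26}$ yr.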

@@ -86,10 +88,10 @@ We defined our "log Likelihood" ($LL$) as:
\end{aligned}
```

The sum over all partitions $k$ was separated in a sum over partitions containing an event $i$ ($j$) and in a sum over partitions with no events ($l$).
The sum over all partitions $k$ was separated in a sum over partitions containing an event $i$ with energy $E_{\rm i}$ (sum with index $j$) and in a sum over partitions with no events (sum with index $l$).
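Written out under this split — a hedged restatement derived from the likelihood above, dropping the constant $-\ln N_{\rm k}!$ terms — the log-likelihood takes the form:

```math
\begin{aligned}
LL = \sum_{j} \bigg[ -(s_j + b_j) + \sum_{i_j=1}^{N_j} \ln\left( b_j\cdot p_{\rm b}(E_i) + s_j\cdot p_{\rm s}(E_i) \right) \bigg] - \sum_{l} \left( s_l + b_l \right)
\end{aligned}
```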


## Free parameter priors
## Prior terms

Different free parameters can be identified within the framework:
- signal, $\Gamma$
@@ -140,7 +142,7 @@ The above products, then, can be expressed again as
\end{aligned}
```

### Marginalization and posterior distributions
### Posterior distributions and marginalization

The combined posterior probability density function is calculated according to Bayes’ theorem as:

2 changes: 2 additions & 0 deletions docs/src/toys.md
@@ -1,3 +1,5 @@
# Generating toys

Table of contents:

```@contents
2 changes: 2 additions & 0 deletions docs/src/tutorial.md
@@ -1,3 +1,5 @@
# Tutorial

This tutorial shows how to build proper JSON config files in order to run a neutrinoless double-beta decay analysis over published GERDA and MAJORANA DEMONSTRATOR (MJD) data.
Additional information on the meaning of the input parameters can be found in the "Configuration file" section, and on the input files in the "Partitions and events" section.

