Releases: ziatdinovmax/NeuroBayes

v0.0.12

25 Feb 02:00

This release introduces partially Bayesian Transformers and neuron-level control over model stochasticity.

Key Additions:

Partially Bayesian Transformers: Transformer neural networks are at the heart of modern AI systems and are increasingly used in physical sciences. However, robust uncertainty quantification with Transformers remains challenging. While replacing all weights with probabilistic distributions and using advanced sampling techniques works for smaller networks, this approach is computationally prohibitive for Transformers. Our new partially Bayesian Transformer implementation allows you to selectively make specific modules (embedding, attention, etc.) probabilistic while keeping others deterministic, significantly reducing computational costs while still delivering reliable uncertainty quantification.
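
As a rough sketch of how this looks in practice (the transformer class name, its constructor arguments, and the module names below are assumptions for illustration; only the PartialBNN wrapper and fit call mirror the documented API):

import neurobayes as nb

# Sketch only: 'FlaxTransformer' and its arguments are assumed for
# illustration -- check the NeuroBayes docs/examples for exact identifiers
architecture = nb.FlaxTransformer(vocab_size=1000, d_model=64,
                                  num_heads=4, num_layers=2, target_dim=1)

# Make only the embedding and output head probabilistic; the attention
# blocks stay deterministic (module names here are assumptions)
model = nb.PartialBNN(architecture,
                      probabilistic_layer_names=['Embedding', 'FinalDense'])
model.fit(X_train, y_train, num_warmup=1000, num_samples=1000)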

Fine-grained Stochasticity Control: Even when only some layers are probabilistic, training deep learning models can be resource-intensive. You can now specify exactly which weights within a given layer should be stochastic, providing finer control over the trade-off between computational cost and uncertainty quantification.
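
A minimal sketch of what this could look like, assuming a keyword such as probabilistic_neurons (this name is a guess at the interface, not the confirmed API):

import neurobayes as nb

architecture = nb.FlaxMLP(hidden_dims=[64, 32, 16, 8], target_dim=1)

# Hypothetical keyword: restrict stochasticity to selected units of Dense0;
# 'probabilistic_neurons' is an assumed name -- consult the docs/examples
model = nb.PartialBNN(
    architecture,
    probabilistic_layer_names=['Dense0', 'Dense4'],
    probabilistic_neurons={'Dense0': [0, 1, 2, 3]},
)
model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)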

What's Changed

Full Changelog: 0.0.10...0.0.12

v0.0.10

06 Jan 19:59

Key updates:

  • Classification Support for Full and Partial BNNs: While the initial focus was on regression with (P)BNNs, since most tasks in the physical sciences deal with (quasi-)continuous variables, it was brought to my attention that some research domains can benefit from classification capabilities. This update therefore introduces classification support. To help you get started, I've provided two toy-data examples, which can easily be generalized to real-world problems (see also the sketch after this list).
  • Expanded SWA Options in JAX/Flax: This update enhances the Stochastic Weight Averaging options, providing more robust priors for both Full and Partial BNNs.
  • Automatic Restart for HMC/NUTS: Now, HMC/NUTS for (P)BNNs can automatically restart in case of bad initializations, which helps during the autonomous exploration of parameter spaces in experiments and simulations.
  • Additional Metrics for Active Learning and UQ: New metrics have been added to enhance the active learning and uncertainty quantification evaluation processes.
  • Minor bug fixes, improved documentation, and more examples!
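
As a minimal sketch of the new classification path (the toy data here are made up, and routing classification through PartialBNN with one output per class is an assumption based on the regression API; the bundled examples show the exact entry point):

import numpy as np
import neurobayes as nb

# Toy two-class problem: label depends on the sign of x1 + x2
X = np.random.randn(200, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)

# Assumed usage: an MLP with one output per class, trained as in regression;
# the exact classification interface may differ -- see the bundled examples
architecture = nb.FlaxMLP(hidden_dims=[32, 16], target_dim=2)
model = nb.PartialBNN(architecture, probabilistic_layer_names=['Dense2'])
model.fit(X, y, num_warmup=500, num_samples=500)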

Looking ahead, the next major step will be expanding Partial BNNs beyond the current MLP and ConvNet architectures to include RNNs, GNNs, and Transformers.

v0.0.9

27 Oct 19:59

What's Changed

Add an option to specify which layers in the provided architecture will be treated as probabilistic. For example,

import neurobayes as nb

# Initialize the NN architecture
architecture = nb.FlaxMLP(hidden_dims=[64, 32, 16, 8], target_dim=1)

# Make the first and output layers probabilistic and the rest deterministic
probabilistic_layer_names = ['Dense0', 'Dense4']

# Initialize and train a PBNN model
model = nb.PartialBNN(architecture, probabilistic_layer_names=probabilistic_layer_names)
model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)
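
The trained model can then be queried for predictions with uncertainty estimates; the exact return signature may vary between versions:

# Posterior predictive mean and variance on new inputs
# (return signature may vary between versions)
posterior_mean, posterior_var = model.predict(X_new)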

Full Changelog: 0.0.7...0.0.9

v0.0.7

25 Oct 17:35

What's Changed

Full Changelog: 0.0.5...0.0.7

v0.0.5

09 Oct 04:20

What's Changed

Full Changelog: 0.0.2...0.0.5

v0.0.2

05 Sep 16:55
Merge pull request #1 from ziatdinovmax/hsk

Add heteroskedastic and uncertain input BNN
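
A minimal sketch of the heteroskedastic variant, assuming a class named HeteroskedasticBNN (the name and constructor arguments are inferred from the release description, not confirmed):

import neurobayes as nb

# Assumed class name and arguments -- inferred from the release description;
# a heteroskedastic BNN learns an input-dependent noise level alongside the mean
model = nb.HeteroskedasticBNN(target_dim=1)
model.fit(X_measured, y_measured, num_warmup=1000, num_samples=1000)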