Introduction
Microstructure.jl is a Julia toolbox (in development) that aims at fast and probabilistic microstructure imaging. It features flexible biophysical modelling with MRI data. For estimating microstructure parameters from these models, it includes generic estimators such as Markov Chain Monte Carlo (MCMC) sampling and Monte Carlo dropout with neural networks.
Goal
Use Flux.jl and Microstructure.jl to implement different types of neural network estimators. The current neural network estimator in Microstructure.jl uses a multi-layer perceptron trained in a supervised fashion on samples generated from the forward models in Microstructure.jl, i.e. MRI measurements as inputs and microstructure parameters as outputs. As an example of another type of method, we could implement a self-supervised approach that uses the forward models in Microstructure.jl as a decoder; a rough sketch of both ideas is given below.
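To make the set-up concrete, here is a minimal Flux.jl sketch of the supervised case. The layer sizes, the random placeholder data, and (in the self-supervised variant) the `forward_model` function are illustrative assumptions only, not part of Microstructure.jl's actual API; in practice the training samples would be simulated with the toolbox's forward models.

```julia
using Flux

# Hypothetical problem sizes: n_meas MRI measurements per voxel, n_params tissue parameters.
n_meas, n_params = 64, 5

# A small multi-layer perceptron mapping measurements to parameters.
# Keeping the Dropout layers active at inference time is what enables Monte Carlo dropout.
mlp = Chain(
    Dense(n_meas => 128, relu),
    Dropout(0.1),
    Dense(128 => 128, relu),
    Dropout(0.1),
    Dense(128 => n_params),
)

# Placeholder training data; in practice, draw parameters from priors and synthesize
# the corresponding signals with a forward model from Microstructure.jl.
X = rand(Float32, n_meas, 10_000)    # measurements (features × samples)
Y = rand(Float32, n_params, 10_000)  # ground-truth parameters

loss(model, x, y) = Flux.mse(model(x), y)

opt_state = Flux.setup(Adam(1f-3), mlp)
data = Flux.DataLoader((X, Y); batchsize=128, shuffle=true)

for epoch in 1:10
    for (x, y) in data
        grads = Flux.gradient(m -> loss(m, x, y), mlp)
        Flux.update!(opt_state, mlp, grads[1])
    end
end

# Self-supervised variant (sketch only): pass the predicted parameters back through a
# differentiable forward model acting as the decoder, and compare the reconstructed
# signals against the measured ones. `forward_model` is a hypothetical stand-in here.
ss_loss(model, x) = Flux.mse(forward_model(model(x)), x)
```

The main design difference between the two variants is the training target: the supervised loss needs ground-truth parameters from simulation, whereas the self-supervised loss only needs the measured signals themselves, with the forward model supplying the physics.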
Resources
Tutorials/demos on how to use Microstructure.jl will be available soon on the documentation website
For neural network examples using Flux, there are various models you can reference in the Flux model zoo
Julia is a programming language designed for high performance. If you are interested in Julia or have experience in related areas using other languages, join me in hacking towards the goal!