analysis_dynamic

Functional dynamic connectivity analysis

Table of Contents

  1. Hidden Markov Model
  2. Sliding Window

Hidden Markov Model

  • Original paper:

    Vidaurre et al., "Brain network dynamics are hierarchically organized in time", PNAS, 2017. (https://www.pnas.org/content/114/48/12827)
    Full information at: https://github.com/OHBA-analysis/HMM-MAR/wiki

  • Useful papers:

    (1) Vidaurre et al., "Discovering dynamic brain networks from big data in rest and task", NeuroImage, 2018. (https://www.sciencedirect.com/science/article/pii/S1053811917305487)

  • Usage
    • Setup
    • Train HMM
      • [hmm,Gamma,Xi,vpath,GammaInit,residuals,fehist] = hmmmar(data,T,options);
      • Inputs

        data: Time series in cell format (each cell contains a time points x ROIs matrix)
        T: Length of each time series in cell format (each cell contains the # of time points)
        options: Hyperparameters of the HMM
        % --- My options ---
        options = struct();
        options.K = NumStates; % Maximum # of HMM states; can be chosen via the elbow method or other approaches
        options.Fs = 1/TR; % Sampling frequency, typically 1/repetition time
        options.order = 0; % Maximum order of the MAR model (0: HMM with Gaussian observations)
        options.covtype = 'full'; % Type of covariance matrix
        options.zeromean = 0; % If 1, the mean of the time series will not be used to drive the states
        options.standardise = 0; % Standardize to mean 0, SD 1 (0 if the data are already standardized, 1 if not)
        options.cyc = 500; % Maximum # of variational inference cycles
        options.tol = 1e-7; % Minimum relative decrement of free energy for the algorithm to stop
        options.inittype = 'hmmmar'; % 'hmmmar' for HMM-MAR initialization or 'random' for random initialization
        options.initrep = 3; % # of repetitions of the initialization algorithm
        options.initcyc = 10; % Maximum # of optimization cycles in the initialization algorithm, per repetition
        options.verbose = 1; % Whether to show algorithm progress
        if use_stochastic % flag to enable stochastic inference (useful for large datasets)
            options.BIGNbatch = round(NumSubj/20); % # of subjects per batch (NumSubj: total # of subjects)
            options.BIGcyc = 500; % Maximum # of variational inference cycles
            options.BIGtol = options.tol; % Minimum relative decrement of free energy for convergence to be considered
            options.BIGundertol_tostop = 5; % # of consecutive steps under the tolerance required to stop
            options.forgetrate = 0.7; % Forget rate of the stochastic learning rate schedule
            options.BIGdelay = 1.0; % Delay of the stochastic learning rate schedule
            options.BIGbase_weights = 0.9; % Base weights to promote a democratic sampling of the subjects
            options.BIGverbose = 1; % Whether to show progress in the stochastic inference
        end
        options.pca = 0.9; % If specified, the HMM works on a PCA-reduced space (here retaining 90% of the variance)

      • Outputs

        hmm: A structure with the estimated HMM-MAR model
        Gamma: Matrix containing the state time courses (# of time points x # of states)
        Xi: Joint probability of past and future states conditioned on the data (# of time points x # of states x # of states)
        vpath: Vector with the Viterbi path (# of time points x 1)
        GammaInit: The state time courses used after initialization
        residuals: If the model is trained on the residuals, the value of such residuals
        fehist: History of the free energy, with one element per iteration of the variational inference

    • Decode HMM
      • [test_Gamma,test_Xi] = hmmdecode(data,T,hmm,0);
      • [test_vpath] = hmmdecode(data,T,hmm,1);
      • Inputs

        data: Time series for decoding (time points x ROIs)
        T: Length of the time series (# of time points)
        hmm: Trained hidden Markov model

      • Outputs

        test_Gamma: Matrix containing the state time courses (# of time points x # of states)
        test_Xi: Matrix joint probability of past and future states conditioned on data (# of time points x # of states x # of states)
        test_vpath: Vector with the Viterbi path (# of time points x 1)

      • test_hmm = subject_hmm(data,T,hmm,test_Gamma,test_Xi);
      • Inputs

        data, T, hmm: same as above; test_Gamma, test_Xi: outputs of hmmdecode above

      • Outputs

        test_hmm: A structure with the estimated HMM-MAR model

    • Useful tools
      • TP = getTransProbs(test_hmm);
      • Inputs

        test_hmm: A structure with the estimated HMM-MAR model

      • Outputs

        TP: Transition probability matrix between states

      • FO = getFractionalOccupancy(test_hmm,T,options);
      • Inputs

        test_hmm: A structure with the estimated HMM-MAR model
        T: Length of the time series of the decoded data (# of time points)
        options: The options defined above

      • Outputs

        FO: Fractional occupancy (how much time the HMM spends on each state)

  • You can find full details at: https://github.com/OHBA-analysis/HMM-MAR/wiki (a minimal end-to-end sketch of these calls is given below).
    If you have difficulty: Say "HELLO! GIVE ME YOUR CODE!" to Bo-yong!
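
  • Minimal end-to-end sketch (not from the toolbox documentation): the snippet below simply chains the calls listed above, assuming the HMM-MAR toolbox is on the MATLAB path; ts_all, test_data, and test_T are placeholder variables for the training time series and a held-out subject's data.

        % Placeholder sketch chaining the calls documented above; adjust paths/variables to your data
        addpath(genpath('~/HMM-MAR'));            % local copy of the HMM-MAR toolbox (example path)

        % ts_all: placeholder cell array with one (time points x ROIs) matrix per subject
        NumSubj   = numel(ts_all);
        NumStates = 6;                            % example K; choose via the elbow method or similar
        TR        = 2;                            % repetition time in seconds (example)

        % Cell-format inputs expected by hmmmar
        data = ts_all;
        T    = cell(NumSubj,1);
        for s = 1:NumSubj
            T{s} = size(data{s},1);               % # of time points for subject s
        end

        % Options as listed above (Gaussian observation model, full covariance)
        options = struct('K',NumStates, 'Fs',1/TR, 'order',0, 'covtype','full', ...
                         'zeromean',0, 'standardise',0, 'cyc',500, 'tol',1e-7, ...
                         'inittype','hmmmar', 'initrep',3, 'initcyc',10, 'verbose',1);

        % Train on the group data
        [hmm,Gamma,Xi,vpath] = hmmmar(data,T,options);

        % Decode a held-out subject (test_data, test_T: placeholders) and derive summary measures
        [test_Gamma,test_Xi] = hmmdecode(test_data,test_T,hmm,0);   % state time courses
        test_vpath           = hmmdecode(test_data,test_T,hmm,1);   % Viterbi path
        test_hmm             = subject_hmm(test_data,test_T,hmm,test_Gamma,test_Xi);
        TP = getTransProbs(test_hmm);                               % transition probabilities
        FO = getFractionalOccupancy(test_hmm,test_T,options);       % fractional occupancy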

Sliding Window

  • Original paper:

    Allen et al., "Tracking whole-brain connectivity dynamics in the resting state", Cerebral Cortex, 2012. (https://academic.oup.com/cercor/article/24/3/663/394348)

  • Useful papers:

    (1) Damaraju et al., "Dynamic functional connectivity analysis reveals transient states of dysconnectivity in schizophrenia", NeuroImage: Clinical, 2014. (https://www.sciencedirect.com/science/article/pii/S2213158214000953)
    (2) Rashid et al., "Dynamic connectivity states estimated from resting fMRI Identify differences among Schizophrenia, bipolar disorder, and healthy control subjects", Frontiers in Human Neuroscience, 2014. (https://www.frontiersin.org/articles/10.3389/fnhum.2014.00897/full)
    (3) Park et al., "Dynamic functional connectivity analysis reveals improved association between brain networks and eating behaviors compared to static analysis", Behavioural Brain Research, 2018. (https://www.sciencedirect.com/science/article/pii/S0166432817314249?via%3Dihub)
    (4) Park et al., "Structural and Functional Brain Connectivity Changes Between People With Abdominal and Non-abdominal Obesity and Their Association With Behaviors of Eating Disorders", Frontiers in Neuroscience, 2018. (https://www.frontiersin.org/articles/10.3389/fnins.2018.00741/full)
    (5) Lee et al., "Dynamic functional connectivity of the migraine brain: a resting-state functional magnetic resonance imaging study", PAIN, 2019. (https://journals.lww.com/pain/fulltext/2019/12000/Dynamic_functional_connectivity_of_the_migraine.11.aspx)

  • Usage

    • Setup
      • Download the FuNP toolbox (https://gitlab.com/by9433/funp)
      • Unzip SurfaceAtlas and FuNP
      • Add the downloaded folders (SurfaceAtlas & FuNP) to the MATLAB path: addpath(genpath('~/FuNP'))
    • Run in MATLAB: Construct dynamic connectivity matrices (an illustrative sketch of the windowing step appears at the end of this Usage block)
      • [OrigMat, RegMat, best_lambda] = MyDynConn(ts, Nroi, wsize, sigma, num_repetitions);
      • Inputs

        ts: Time series (time points x ROIs)
        Nroi: Number of ROIs
        wsize: Size of the sliding window (in TRs)
        (theoretically: 1/slowest frequency [e.g., HPF cutoff = 0.01 Hz -> 100 s -> if TR = 2 s, then wsize = 50])
        (practically: a window of 44-66 s)
        sigma: SD (in TRs) of the Gaussian kernel used to taper the window (usually set to 3)
        num_repetitions: Number of repetitions for the L1 regularization (usually set to 10)

      • Outputs

        OrigMat: Un-regularized dynamic connectivity matrix
        RegMat: Regularized dynamic connectivity matrix (USE THIS!!)
        best_lambda: Selected regularization factor (lambda) for the L1 regularization
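
    • Illustrative sketch (not the FuNP implementation): MyDynConn above is the toolbox routine to use; the snippet below only illustrates the tapered sliding-window idea behind it (a rectangular window convolved with a Gaussian taper, as in Allen et al.) and corresponds roughly to the un-regularized OrigMat. The function name sliding_window_fc and all variables are placeholders; the L1-regularized RegMat additionally requires a graphical-lasso step that is not shown.

        % Placeholder sketch of a tapered sliding-window correlation (cf. OrigMat)
        function dynFC = sliding_window_fc(ts, wsize, sigma)
            % ts:    time series (time points x ROIs)
            % wsize: window size in TRs (e.g., HPF cutoff 0.01 Hz -> 100 s -> 50 TRs at TR = 2 s)
            % sigma: SD (in TRs) of the Gaussian used to taper the rectangular window
            [Ntime, Nroi] = size(ts);
            Nwin = Ntime - wsize + 1;                  % # of windows (step = 1 TR)

            % Taper: rectangular window convolved with a Gaussian kernel
            gauss = exp(-((-3*sigma:3*sigma).^2) / (2*sigma^2));
            taper = conv(ones(wsize,1), gauss(:), 'same');
            taper = taper / sum(taper);

            dynFC = zeros(Nroi, Nroi, Nwin);
            for w = 1:Nwin
                seg = ts(w:w+wsize-1, :);              % windowed segment
                mu  = sum(seg .* taper, 1);            % weighted mean per ROI
                X   = (seg - mu) .* sqrt(taper);       % weighted, demeaned data
                C   = X' * X;                          % weighted covariance (up to scale)
                d   = sqrt(diag(C));
                dynFC(:,:,w) = C ./ (d * d');          % weighted correlation
            end
        end

      Example call for TR = 2 s (wsize = 50 TRs, roughly 100 s) with the usual sigma = 3: dynFC = sliding_window_fc(ts, 50, 3);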

  • Steps of group analysis

    (1) Construct functional dynamic connectivity matrices for all subjects
    (2) Perform subject-wise clustering to find the optimal number of clusters
    (3) Perform group-wise clustering with the optimal number of clusters -> We can obtain group-wise brain state matrices
    (4) Match the subject- and group-wise brain state matrices

    [Code request]
    Visit: https://gitlab.com/by9433/dynamicconnectivity
    If you have difficulty: Say "HELLO! GIVE ME YOUR CODE!" to Bo-yong! (A simplified clustering sketch of steps 2-3 follows below.)
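
  • Simplified clustering sketch (not the code at the link above): steps (2)-(3) are typically done with k-means on the vectorized window-wise connectivity matrices, following Allen et al. The snippet below assumes RegMat_all is a placeholder cell array with one (Nroi x Nroi x Nwin) RegMat per subject and uses MATLAB's built-in kmeans with an L1 (cityblock) distance.

        % Placeholder sketch of group brain-state estimation by k-means (steps 2-3)
        NumSubj = numel(RegMat_all);                 % RegMat_all: one RegMat per subject
        Nroi    = size(RegMat_all{1},1);
        iu      = find(triu(true(Nroi),1));          % upper-triangle (edge) indices

        % Stack the vectorized windows of all subjects: (total # of windows) x (# of edges)
        allWin = [];
        subjID = [];
        for s = 1:NumSubj
            M    = RegMat_all{s};
            Nwin = size(M,3);
            V    = zeros(Nwin, numel(iu));
            for w = 1:Nwin
                tmp    = M(:,:,w);
                V(w,:) = tmp(iu)';
            end
            allWin = [allWin; V];                    %#ok<AGROW>
            subjID = [subjID; repmat(s,Nwin,1)];     %#ok<AGROW>  kept for subject-wise analyses
        end

        % (2) Choose the number of clusters k, e.g. from the elbow of the within-cluster
        %     sum of distances over a range of candidate k values
        % (3) Group-wise clustering with the chosen k -> group-wise brain states
        k = 5;                                       % example value
        [idx, C] = kmeans(allWin, k, 'Distance','cityblock', 'Replicates',10);
        % idx: state assignment of each window; C: cluster centroids (k x # of edges)

        % Reshape each centroid back into an Nroi x Nroi brain-state matrix
        states = zeros(Nroi, Nroi, k);
        for c = 1:k
            S             = zeros(Nroi);
            S(iu)         = C(c,:).';
            states(:,:,c) = S + S';                  % symmetrize (diagonal left at 0)
        end
        % (4) Subject-wise states can then be matched to these group states,
        %     e.g. by maximal spatial correlation of the state matrices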
