Find states only at GFP peaks #163
Hey Matt, I think your approach is correct. You could try introducing unlabeled labels […]
It becomes way more complicated if you want to fit the […] Another point to take into account is that this approach works well because […]
Thank you for the feedback on GFP computation; we will investigate this next week and let you know if we plan any change that could resolve some of your MEG issues. With this approach, you can easily choose between 'std' or 'rms' for GFP computation.
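The two GFP definitions mentioned above can be sketched with plain NumPy. This is only an illustration on a synthetic array (shape `(n_channels, n_samples)`), not the pycrostates implementation; note that for average-referenced data the 'std' and 'rms' definitions coincide:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_normal((64, 1000))  # synthetic (n_channels, n_samples) array

# GFP as the spatial standard deviation at each time point ('std')
gfp_std = np.std(data, axis=0)
# GFP as the spatial root-mean-square at each time point ('rms')
gfp_rms = np.sqrt(np.mean(data ** 2, axis=0))

# After average-referencing, the channel mean is zero at every time
# point, so the two definitions give identical values
data_avg_ref = data - data.mean(axis=0, keepdims=True)
assert np.allclose(np.std(data_avg_ref, axis=0),
                   np.sqrt(np.mean(data_avg_ref ** 2, axis=0)))
```

For data that is not average-referenced (e.g. MEG, or EEG with another reference), the two curves differ, which is why the choice matters here.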
Hi @vferat, thanks for that code, it works great. I was previously fitting only with the GFP because it ran a lot quicker, but that is not necessary as long as I can extract the peaks this way. Also, for reference, I ran the HCP MEG data with both STD and RMS and compared them to the regular seg.plot(); here are the outputs for one epoch, but the pattern is consistent across epochs:
If some […] However, if you are sure that there is time continuity (I can only think of […]
Thank you @vferat, good point! For the HCP data I cannot be sure of time continuity, since I opted to use their minimally preprocessed data and do not know what they threw out. However, I do have a raw EEG dataset of good quality, so I do not have to throw out any epochs, and I defined the epochs with a fixed length and zero overlap. Thank you again for the code and the insight, it was extremely helpful.
Hi @vferat, sorry to open this up again, but I wanted to run this on the individual-level analysis and the group-level analysis too. I realized this code only works for individuals, not groups, since the group shares the same ModK structure. I am guessing that the best way to modify this pipeline would be to fit the group-level ModK, then segment everyone individually and use that individual segmentation inside the loop, changing the line with ModK.labels_[epoch_peaks].astype(int) to something like seg.labels[e][epoch_peaks].astype(int). The only way to get this running, though, was to also remove the line epoch_peaks += e * n_sample_per_epoch. This seems to work and mostly passes the checks, but sometimes the -1 borders double up, so that there are two in a row. I am not completely sure why this happens, but I am guessing it has to do with the rejected edges from segmentation.
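If the doubled -1 borders turn out to be harmless duplicates, they could be collapsed with a small post-processing step. A minimal sketch on a hypothetical labels array (not a real segmentation; -1 is assumed to be the border/unlabeled marker as in the thread above):

```python
import numpy as np

# Hypothetical label sequence containing a doubled -1 border marker
labels = np.array([0, 0, -1, -1, 1, 1, -1, 2])

# Keep only the first of any run of consecutive -1 markers
keep = np.ones(labels.size, dtype=bool)
keep[1:] = ~((labels[1:] == -1) & (labels[:-1] == -1))
deduped = labels[keep]
print(deduped)  # [ 0  0 -1  1  1 -1  2]
```

This only removes back-to-back -1 entries; it does not touch ordinary state labels, so transition counts between real states are unaffected.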
Hi, this is somewhat related to #160 because I am working with MEG data, but I am trying to compute the transition matrices using only the GFP peaks, in an effort to exclude the likely quick, artifactual transitions in the GFP troughs. Before I reinvent the wheel, I wanted to check whether there is a simple way to do this besides saving the seg time indices of interest as a different variable and then rewriting the transition matrix code from scratch to work on the new variable.
So far I have identified the peak indices in seg in each epoch using this:
```python
import numpy as np
import scipy.signal

# GFP per epoch: spatial standard deviation across channels
std = np.std(epoched._data, axis=1)
peaks = []
for e in std:
    peaks.append(scipy.signal.find_peaks(e)[0])
```
This essentially gives me the index in seg.labels where ModK.labels_ matches up when ModK is fit with the GFP. I now realize that ModK.labels_ has the info I need to feed into the transition matrix function.
As a side note related to #160, this is obviously done with STD and not RMS. I tried RMS, but the timepoints end up not matching, because ModK.labels_ and peaks have a different number of peaks when peaks is computed with np.sqrt(np.mean(epoched._data**2, axis=1)) instead of np.std(epoched._data, axis=1). When I flatten and plot both over seg.plot(), both match up pretty well, though STD matches better.
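For completeness, the per-epoch peak indices above can be turned into indices into a labels array concatenated across epochs (the offset step mentioned later in the thread). A self-contained sketch on synthetic data; the shape (n_epochs, n_channels, n_samples) and the data itself are hypothetical:

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
# Hypothetical epoched data: (n_epochs, n_channels, n_samples)
data = rng.standard_normal((3, 32, 200))

gfp = np.std(data, axis=1)  # GFP per epoch, shape (n_epochs, n_samples)
n_samples = data.shape[2]

flat_peaks = []
for e, epoch_gfp in enumerate(gfp):
    epoch_peaks = find_peaks(epoch_gfp)[0]
    # Offset by the epoch index so the peaks index into the
    # labels array flattened across epochs
    flat_peaks.append(epoch_peaks + e * n_samples)
flat_peaks = np.concatenate(flat_peaks)
```

This assumes all epochs have the same length and none were dropped between fitting and segmentation; otherwise the offsets would need to track the true per-epoch sample counts.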
Best,
Matt