matlab - pull_chunk for eeg - timestamps not equidistant #63
Comments
Hi Marvin,
I think the full answer depends on what EEG device you're using and how
much you trust the hardware. Generally 'research grade' EEG amps are
trustworthy and sample properly (correct frequency and no dropped
frames/data) so you can simply linearly interpolate the timestamps
(automatically done in load_xdf.m in 'HandleJitterRemoval'). The
discrepancies in your data seem to happen at equal intervals... maybe this
happens when LSL synchronizes the clock drift between the two systems (if
you are using two systems)? If you're interested, you can find the stream's
synchronization field and check whether the timestamps match up at the points
where the discrepancies occur.
Best,
Clement Lee
Applications Programmer
Swartz Center for Computational Neuroscience
Institute for Neural Computation, UC San Diego
858-822-7535
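Clement's "linearly interpolate the timestamps" advice can be sketched as a least-squares line fit, which is the core of what load_xdf.m's HandleJitterRemoval step does (the real code additionally splits the recording at clock resets and treats segments separately; `stamps` below is assumed to be a row vector of raw timestamps):

```matlab
% Simplified sketch of offline dejittering via a linear fit:
% real samples are assumed equidistant at the source, so the best
% timestamps lie on a straight line through the jittered ones.
n = numel(stamps);
p = polyfit(1:n, stamps, 1);           % least-squares line (slope ~ 1/srate)
stamps_dejittered = polyval(p, 1:n);   % perfectly equidistant timestamps
plot(diff(stamps_dejittered));         % flat line at the slope value
```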
Hi @mrvnmtz, can you please let us know which device you're using and how it streams data via LSL (vendor-provided app, community app, custom code, etc.)? In order of decreasing importance:
Adding to those comments: to me these stamps actually don't look all that bad, as in, there's a good chance that dejittering will pretty much fix that up for you.
I sometimes use the following rough back-of-the-envelope calculation: it looks like the hardware is natively providing about 15 chunks per second (the spikes), and if you use online dejittering, that will be smoothed out over a minute-long sliding window (effectively a bit more, since it's exponentially weighted, but we're going to conservatively assume a minute). You can get a rough idea of the residual timing error after dejittering by taking the magnitude of those spikes (17 ms) and dividing it by the number of spikes being averaged. That's the effective number of time measurements; it ignores the flat portions, which are not actual timing measurements but are filled in based on 1/srate. So that makes ca. 15×60 = 900 measurements over a minute, and 17 ms / 900 comes out at roughly 0.019 ms (i.e., far less than one sample at 2048 Hz). It will be a bit worse than that in the first minute after turning on the stream, since there's less data.
Now, in case you don't actually need those stamps in real time but just want to record them to disk (e.g., to an XDF file), that dejittering would instead be done on the whole recording at import time, in which case the timing error due to jitter will be even smaller (minuscule).
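Christian's estimate, written out with his numbers (the chunk rate, spike magnitude, and window length are all read off the plot or assumed in his comment):

```matlab
% Back-of-the-envelope residual jitter after online dejittering
spike_mag  = 0.017;                    % spike magnitude: 17 ms
chunks_ps  = 15;                       % chunks per second from the hardware
window_s   = 60;                       % conservative smoothing window: 1 min
n_meas     = chunks_ps * window_s;     % ~900 genuine timing measurements
residual   = spike_mag / n_meas;       % ~1.9e-5 s, i.e. ~0.019 ms
sample_dur = 1 / 2048;                 % one sample period: ~0.488 ms
fprintf('residual: %.3f ms, sample: %.3f ms\n', ...
        residual * 1e3, sample_dur * 1e3);
```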
The fact that you have irregular timestamps does not mean that you have missing data samples. So it is fine to interpolate/dejitter the timestamps as if they were perfectly regular.
The issue arises when you have an event stream at the same time and need to make sure the latencies on the event stream can be aligned with the data stream. This is more problematic; ideally you would have the events both in the EEG (as an extra channel) and in the LSL event stream, so you can compare the two latencies and run some optimization. In my opinion, make sure the computer that records the EEG is the same as the one generating the events (to minimize network delays). Also, work on the code segment that writes events to the event stream and sends a TTL pulse to the EEG amplifier, to ensure there is no buffer delay.
Cheers,
Arno
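Arno's dual-recording check could look roughly like this offline (hypothetical variable names: it assumes you detected event onsets `ttl_times` from the EEG trigger channel and have the matching LSL marker timestamps `marker_times`, both in seconds on the same clock after import):

```matlab
% Compare per-event latency between the hardware TTL path and the LSL path
lags = marker_times - ttl_times;              % one lag per event
fprintf('mean lag: %.2f ms, sd: %.2f ms\n', ...
        mean(lags) * 1e3, std(lags) * 1e3);
% a constant mean lag can simply be subtracted out; a large sd means the
% event pipeline (network, buffering) adds jitter that needs attention
```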
Thanks to all for trying to help me. The device is a BioSemi device; it should be this one: https://www.biosemi.com/ad-box_activetwo.htm. I link the device with the software that I think came with it to open the stream. So far I would have said that it looks quite trustworthy.
@cll008 I normally link my BioSemi device to a different PC than the one I run Matlab on, but that's just because the lab is arranged like that. I also tried running Matlab on the same PC the EEG is connected to, and it seems to make no difference. Where do I find the stream's synchronization field?
@cboulay So far I haven't done any postprocessing. I sometimes used inlet.time_correction(). I also tried to apply inlet.set_postprocessing(), but then it says "Unrecognized function or variable 'lsl_set_postprocessing'".
@chkothe Unfortunately I do need the data online, and I even have to classify it online. How do I dejitter when processing the data online without saving it to an XDF file?
@arnodelorme Eventually I will have a second stream that will be sent by Matlab, but so far I got stuck on this issue.
While writing these answers I figured out that when I concatenate all my chunks one after another into one matrix, my time series looks good, in the sense that I do not have any missing data. Thanks again for all your answers, but right now it looks like this issue won't be a problem for me any longer.
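For online dejittering specifically, newer liblsl-Matlab builds expose `set_postprocessing` on the inlet; a sketch, assuming such a build (the flag values follow liblsl's `processing_options_t` bitmask):

```matlab
% Enable LSL's built-in online postprocessing on an existing inlet
proc_clocksync = 1;   % remap timestamps onto the local clock
proc_dejitter  = 2;   % smooth out timestamp jitter on the fly
inlet.set_postprocessing(bitor(proc_clocksync, proc_dejitter));
[chunk, stamps] = inlet.pull_chunk();   % stamps are now dejittered
```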
That’s something we need to fix. My Matlab license just expired. Any volunteers?
Well, I just bought a Matlab license, so I guess this goes on my to-do pile. I did implement this some time ago (labstreaminglayer/liblsl-Matlab@2ad59f1), but it might not be working on every conceivable combination of Matlab/OS/etc. @mrvnmtz, can you please tell me which version of Matlab and liblsl you are using? It could be that you just need to build liblsl-Matlab on your system.
@dmedine Sorry for my late response, but if you still want to take a look at it: I'm using Matlab R2020a, and I actually just updated liblsl when investigating this problem.
Hello, I have the same issue. When I stream my EEG data (CXG) at 500 Hz, the difference between two consecutive timestamps is not close to 0.002 s (as you can see in the figure below). However, the number of received samples is close to what the sampling rate predicts: for instance, for a 5-second recording, I get around 2479 samples (expected: 2500).
@mrsaeedpour, it looks like the Cognionics LSL integration is calling […]
The uneven sampling intervals are dejittered automatically when loading the file via an XDF importer, and they can be fixed online automatically by LSL's built-in dejittering by setting the postprocessing flags.
However, do you really need the timestamps to be dejittered? LSL's dejittering simply assumes that devices have consistent inter-sample intervals at the source, even if their timestamps in the stream don't have consistent intervals. If you simply process the data without paying attention to the timestamps, then you are effectively making the same assumption. If you need to align these jittered data with other streams and your precision requirement is below 10 ms, then yes, you'll need to dejitter. But this is a very unusual requirement for online analysis: most online analyses will simply want to run things as fast as possible, so you'll align the most recent EEG with the most recent data from the other stream.
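The "align the most recent" approach from the last paragraph might be sketched like this (hypothetical inlet and function names; `process` stands in for whatever online analysis you run):

```matlab
% Pair the freshest data from two streams without any dejittering
[eeg_chunk, eeg_ts]       = eeg_inlet.pull_chunk();
[marker_chunk, marker_ts] = marker_inlet.pull_chunk();
if ~isempty(eeg_ts) && ~isempty(marker_ts)
    % newest EEG sample together with the newest marker sample;
    % adequate whenever sub-10-ms cross-stream alignment isn't needed
    process(eeg_chunk(:, end), marker_chunk(:, end));
end
```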
Hey everybody,
I'm working with EEG data that I want to process while reading it. Therefore I use the pull_chunk function to read the data chunk by chunk. My problem is that the difference between the timestamps is not constant. To break my problem down for this issue, I used the example code "ReceiveDataInChunks.m" provided together with liblsl-Matlab:

```matlab
% instantiate the library
disp('Loading the library...');
lib = lsl_loadlib();

% resolve a stream...
disp('Resolving an EEG stream...');
result = {};
while isempty(result)
    result = lsl_resolve_byprop(lib, 'type', 'EEG');
end

% create a new inlet
disp('Opening an inlet...');
inlet = lsl_inlet(result{1});

time = [];
disp('Now receiving chunked data...');
while true
    % get a chunk from the inlet and append its timestamps
    [chunk, stamps] = inlet.pull_chunk();
    time = [time stamps]; %#ok<AGROW>
    pause(0.05);
end
```

I let it run for some time and then I plot:

```matlab
plot(diff(time))
```

Normally that should be a straight line at y = 0.0004883 (~1/2048 Hz). I stream the EEG data via a BioSemi device, if this is of interest.
Does someone know why I get these discontinuities and how I can solve them?
Thanks a lot in advance,
Marvin