[Mne_analysis] Too short epochs

Eric Larson larson.eric.d at gmail.com
Mon Jun 8 09:47:03 EDT 2020

> Yes, I downsampled the raw data using the resample method. I did this
> because I am using pyprep for the initial stages of preprocessing and
> have got data from an experiment that runs for 40 minutes. So the memory
> requirements become huge. Is there a way to downsample the data and avoid
> this issue? How does the events array need to be adjusted?

Instead of resampling, I would first try to work around memory issues by
loading the raw data using memmapping, e.g. with preload='./tempfile' in
read_raw_fif <https://mne.tools/stable/generated/mne.io.read_raw_fif.html>
(or whatever reading function you're using). It will load all data into a
temporary array/file on disk rather than in memory. Modifications you make
to the data (filtering, etc.) will be made on disk on this temporary copy
instead of in memory. I'm not sure the extent to which pyprep avoids making
other copies of the data, though, so this might not fix the problem.

If you absolutely have to downsample, you can either do `find_events` on
your resampled raw data (but some events might be dropped) or do something
like this to adjust the events array from the original data:

events[:, 0] = np.round(events[:, 0] / ratio).astype(int)

where `ratio` is your downsampling ratio (the original sampling frequency
divided by the new one). After this, if you have very closely spaced events
you could end up with duplicates in `events`, which you'll have to combine
suitably, depending on your analysis.
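Putting the rounding and a simple de-duplication policy together, a sketch might look like this. The sampling frequencies and events array are hypothetical, and keeping only the first event at each colliding sample is just one possible policy:

```python
import numpy as np

# Assumed rates: original 1000 Hz, resampled to 250 Hz.
old_sfreq, new_sfreq = 1000.0, 250.0
ratio = old_sfreq / new_sfreq  # downsampling ratio = 4.0

# Hypothetical events array: columns are (sample, previous value, event id).
events = np.array([[1000, 0, 1],
                   [1001, 0, 2],   # only 1 sample after the first event
                   [5000, 0, 1]])

# Map sample indices into the resampled time base.
events[:, 0] = np.round(events[:, 0] / ratio).astype(int)

# Closely spaced events can now collide on the same sample; as one
# simple policy, keep only the first event at each sample.
_, keep = np.unique(events[:, 0], return_index=True)
events = events[np.sort(keep)]
```

Here the first two events land on the same resampled sample and are collapsed to one; how you combine such collisions should depend on your analysis.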


On Fri, Jun 5, 2020 at 10:01 AM Eric Larson <larson.eric.d at gmail.com> wrote:

> I’ve been preprocessing my EEG data using standard preprocessing steps
>> such as highpass filtering (1Hz), line noise removal, downsampling,
>> removing noisy channels, and ICA and downsampling using MNE
> Downsampling the raw data, or when constructing epochs with `decim`, or
> after creating epochs with `epochs.decimate`? Generally downsampling /
> resampling raw is discouraged...
>> When I then epoch my data it drops the majority of the epochs (40 out of
>> 70 with a sampling frequency of 250 and 60 out of 70 with a sampling
>> frequency of 100). The drop_log indicates that all of the epochs were
>> dropped because they were too short.
> This can happen if you resample raw and then don't adjust your `events`
> array to compensate, perhaps this is what happened?
> Eric