[Mne_analysis] mne.io.Raw filter function. Type error: ...must have a dtype of np.float64,
Ilias Koen
ilias.koen at dukodestudio.com
Mon Apr 2 14:33:04 EDT 2018
Thank you all for the clarification - that solution worked.
On 4/2/18 2:17 PM, Luke Bloy wrote:
> Alex is correct that the type error comes from repeatedly calling
> apply_hilbert on the same data. However, you are also filtering the
> same data repeatedly in a loop, which is most likely not what you want.
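>
> To see the mechanism, a minimal sketch (assuming data_EEG is a preloaded
> Raw; the band below is arbitrary and the exact error text may vary by
> MNE version):
>
> print(data_EEG.get_data().dtype)                 # float64
> data_EEG.apply_hilbert(n_jobs=1, envelope=False)
> print(data_EEG.get_data().dtype)                 # complex128, changed in place
> data_EEG.filter(8., 12., fir_design='firwin')    # now fails: "... must have a dtype of np.float64"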
>
> The following code snippet makes a copy of data_EEG at the beginning of
> each loop iteration, then filters it and calls apply_hilbert. It runs
> without error, although it throws warnings because your filters are too
> long for your data (a way to check this is sketched after the snippet).
>
> #################################################################
> # set epoching parameters
> event_id = dict(note=1)
> tmin, tmax = -0.1, 0
> baseline = None
>
> # get the header to extract events
> # data_EEG = mne.io.read_raw_fif(raw_fname, preload=False)
> # events = mne.find_events(data_EEG, stim_channel='STI 014')
> events = mne.find_events(data_EEG, stim_channel='STI1')
> print(events)
>
> frequency_map = list()
>
> for band, fmin, fmax in iter_freqs:
>     this_data_eeg = data_EEG.copy()
>     print("Band " + band + " fmin: " + str(fmin) + " fmax: " + str(fmax))
>     # (re)load the data to save memory
>     # raw = mne.io.read_raw_fif(raw_fname, preload=True)
>     this_data_eeg.pick_types(eeg=True, eog=False, stim=True)
>
>     # bandpass filter and compute Hilbert
>     this_data_eeg.filter(fmin, fmax, n_jobs=1,  # use more jobs to speed up
>                          l_trans_bandwidth=1.0,  # make sure filter params are the same
>                          h_trans_bandwidth=1.0,  # in each band and skip "auto" option
>                          fir_design='firwin')
>
>     this_data_eeg.apply_hilbert(n_jobs=1, envelope=False)
>
>     epochs = mne.Epochs(this_data_eeg, events, event_id, tmin, tmax,
>                         baseline=baseline, reject=None, preload=True)
>     # remove evoked response and get analytic signal (envelope)
>     epochs.subtract_evoked()  # for this we need to construct new epochs
>     epochs = mne.EpochsArray(data=np.abs(epochs.get_data()),
>                              info=epochs.info, tmin=epochs.tmin)
>     # now average and move on
>     frequency_map.append(((band, fmin, fmax), epochs.average()))
> #################################################################
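>
> About the filter-length warnings: with 1 Hz transition bands the FIR
> filter can end up longer than the recording itself. A rough check,
> assuming data_EEG is preloaded (mne.filter.create_filter builds the
> same kind of FIR coefficients the filter call above relies on):
>
> h = mne.filter.create_filter(data_EEG.get_data(), data_EEG.info['sfreq'],
>                              l_freq=fmin, h_freq=fmax,
>                              l_trans_bandwidth=1.0, h_trans_bandwidth=1.0,
>                              fir_design='firwin')
> print(len(h), data_EEG.n_times)  # warnings are expected when len(h) > n_times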
>
> On Mon, Apr 2, 2018 at 1:40 PM Eric Larson <larson.eric.d at gmail.com> wrote:
>
>> you need to do
>>
>> raw_hilbert = data_EEG.copy().apply_hilbert(n_jobs=1, envelope=False)
>
>
> I think this is a typo and Alex meant `envelope=True`, which will
> return the amplitude at each time point.
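>
> To make the distinction concrete, a small sketch (assuming the same
> preloaded data_EEG and numpy imported as np):
>
> raw_env = data_EEG.copy()
> raw_env.apply_hilbert(n_jobs=1, envelope=True)        # data become the amplitude envelope (real-valued)
>
> raw_analytic = data_EEG.copy()
> raw_analytic.apply_hilbert(n_jobs=1, envelope=False)  # data become the complex analytic signal
> envelope = np.abs(raw_analytic.get_data())            # same amplitudes, recovered with np.abs()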
>
> Eric
>