[Mne_analysis] Size limit of data?

Arnaud Ferre arnaud.ferre.pro at gmail.com
Mon May 12 06:06:51 EDT 2014

Hi everyone,

I am doing some time-frequency analysis, notably computing PSDs, using the
functions in mne.time_frequency.tfr.
I gather that trials are typically only a few seconds long, and I suspect
there is a practical limit on data size.
Maybe I have exceeded it?

My trials last 70 sec and contain 700 000 values each.
When I test only 3 trials (epochs) on 2 channels (so only 6 data segments)
with the function induced_power(), my computer struggles (and with a full
run, it dies)!
I don't use a Raw object (because my ECG data are not compatible), but I
don't think that matters for now.
I use Fs = 1000 and frequencies = np.arange(1, 130, 1). My computer has 4
cores at 3.20 GHz and 8 GB of RAM, 64-bit.
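For what it's worth, here is my rough back-of-the-envelope estimate of the
memory such a transform might need, assuming (I have not checked the code)
that it keeps complex128 intermediates of shape
n_epochs x n_channels x n_freqs x n_times:

```python
# Rough memory estimate for the time-frequency transform.
# The shapes below are just the numbers from my setup; the complex128
# assumption for wavelet-convolution intermediates is a guess on my part.

n_epochs = 3
n_channels = 2
n_freqs = len(range(1, 130, 1))   # frequencies = np.arange(1, 130, 1) -> 129 bins
n_times = 700_000                 # values per trial
bytes_per_value = 16              # complex128

total_bytes = n_epochs * n_channels * n_freqs * n_times * bytes_per_value
total_gb = total_bytes / 1024**3
print(f"~{total_gb:.1f} GB")      # -> ~8.1 GB
```

If that estimate is in the right ballpark, even my tiny 3-epoch test would
already exceed my 8 GB of RAM, which would explain the struggling.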

Is this expected given the size of my data and my computer's power?
If yes, how can I improve things (a more powerful machine, cutting my
trials into shorter epochs, forcing memory to be freed...)?
If no, any idea what my problem might be?

Thanks in advance,
Best,
Arnaud
