[Mne_analysis] Size limit of data?

Arnaud Ferre arnaud.ferre.pro at gmail.com
Mon May 12 09:23:20 EDT 2014

Hi Martin,

Thanks for this very clear and detailed answer. It should solve my problem.

I'm reassured that the issue comes from the size of my data and my computer's capacity.
Your two proposals are great, so I'll make these changes, roughly along the lines of the sketch below.
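This is only an untested sketch of what I have in mind, assuming mne.filter.resample for the downsampling and the induced_power arguments as I remember them; `all_trials` is just a placeholder for my own list of (n_channels, n_times) arrays, and the exact argument names remain to be checked:

    import numpy as np
    from mne.filter import resample            # assumed helper for the downsampling
    from mne.time_frequency import induced_power

    Fs, new_Fs = 1000., 500.                   # proposal 1: go from 1000 Hz to 500 Hz
    frequencies = np.arange(1, 130, 1)

    for i, trial in enumerate(all_trials):     # all_trials: placeholder, (n_channels, n_times) arrays
        trial = resample(trial, up=1., down=Fs / new_Fs)  # downsample before the TFR
        data = trial[np.newaxis]               # shape (1, n_channels, n_times): a single epoch
        # proposal 2: one trial at a time, results written to disk instead of kept in RAM
        power, phase_lock = induced_power(data, Fs=new_Fs, frequencies=frequencies)
        np.save('power_trial_%02d.npy' % i, power)
        del power, phase_lock                  # drop the big arrays before the next trial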

I also thought of another possibility: in Python, if I remember correctly,
the local variables of a function are only freed when the function returns.
However, it is possible to force memory to be freed earlier, during the
execution of the function, with the "del" statement. That remains to be seen.
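For example, a minimal (untested) sketch of what I mean, where compute_tfr and the intermediate array are only placeholders:

    import gc
    import numpy as np

    def compute_tfr(trial):
        big_intermediate = np.tile(trial, (129, 1, 1))  # placeholder for a large temporary array
        result = big_intermediate.mean(axis=0)          # placeholder for the real computation
        del big_intermediate                            # drop the reference as soon as it is no longer needed
        gc.collect()                                    # ask the garbage collector to clean up right away
        return result

As far as I understand, del only removes the name, and the memory is actually released once no other reference to the array remains, so gc.collect() should not even be necessary in most cases.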

Thanks,
Arnaud


2014-05-12 12:28 GMT+02:00 Martin Luessi <mluessi at nmr.mgh.harvard.edu>:

> Hi Arnaud,
>
> I think this is expected due to the limited memory of your computer.
> After applying the TFR, your data needs about 6.5 GB of memory (3 * 2 * 70 *
> 1000 * 129 * 128 bits) and before the TFR half of that (3.2 GB), so you are
> exceeding the 8 GB available on your computer. Some things that you could
> do to save memory:
>
> - Downsample the data to e.g. 500 Hz. This would cut the memory needed by
> half.
>
> - Process one trial at a time, save the data to a file and load it later
> for analysis (that way you won't have the 3.2 GB of original data in memory).
>
> HTH,
>
> Martin
>
> On 05/12/14 06:06, Arnaud Ferre wrote:
> > Hi everyone,
> >
> > I am doing some time-frequency analysis, notably computing the PSD, using
> > functions from mne.time_frequency.tfr.
> > I think that your trials usually only last a few seconds, but I suspect
> > there is a practical limit on the data size.
> > Maybe I have exceeded it?
> >
> > My trials last 70 s and contain 700,000 values each.
> > When I test only 3 trials (epochs) on 2 channels (so only 6 data
> > segments) with the function induced_power(), my computer struggles (and
> > with a full test, it dies)!
> > I don't use a Raw object (because my ECG data are not compatible), but I
> > don't think that is a problem for the time being.
> > I use Fs = 1000 and frequencies = np.arange(1, 130, 1). My computer has
> > 4 cores at 3.20 GHz and 8 GB of RAM (64-bit).
> >
> > Is this normal given the size of my data and my computer's capacity?
> > If yes, how can I improve things (change computer, cut my trials into
> > shorter segments, force memory to be freed...)?
> > If not, do you have an idea of what my problem might be?
> >
> > Thanks in advance,
> > Best,
> > Arnaud
> >
> >
> >
>
>
> --
> Martin Luessi, Ph.D.
>
> Research Fellow
>
> Department of Radiology
> Athinoula A. Martinos Center for Biomedical Imaging
> Massachusetts General Hospital
> Harvard Medical School
> 149 13th Street
> Charlestown, MA 02129
>
> Fax: +1 617 726-7422