[Mne_analysis] Preloading raw array too big

Peter Goodin pgoodin at swin.edu.au
Fri May 2 17:33:23 EDT 2014

Hi mne list, 

I'm new to MNE and Python and am having some problems preloading my Neuromag data (a 2 GB file).

When using mne.fiff.Raw(filename, preload=True) I get the numpy error that the array is too big. When I try to memory map the data using mne.fiff.Raw(filename, preload='str') I get this (truncated) traceback:
    221             fid.seek(bytes - 1, 0)
    222             fid.write(asbytes('\0'))
    223             fid.flush()

Is there a workaround? I have two files from each participant, so I would like to concatenate them at some point during processing for cleaning.
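For reference, the disk-backed approach I was trying to get at can be sketched with plain numpy (this is, as far as I understand, what MNE does internally when preload is given a filename rather than True; the sizes and path below are made up for illustration):

```python
import os
import tempfile

import numpy as np

# Instead of allocating the full array in RAM, back it with a file on
# disk via numpy.memmap. Reads and writes then page through the OS
# rather than requiring one giant in-memory allocation.
n_channels, n_samples = 306, 10000  # illustrative sizes, not real data
path = os.path.join(tempfile.mkdtemp(), 'raw_data.dat')

data = np.memmap(path, dtype='float64', mode='w+',
                 shape=(n_channels, n_samples))
data[:] = 0.0  # writes go to the disk-backed buffer
data.flush()   # make sure everything is on disk

print(data.shape)  # (306, 10000)
```

The key point is that preload should be a path string naming a file to memmap into, not the literal text 'str' from the docstring signature.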

Thanks for any suggestions. 

Peter

__________________________
Peter Goodin,
BSc (Hons), Ph.D Candidate.

Brain and Psychological Sciences Research Centre (BPsych)
Swinburne University,
Hawthorn, Vic, 3122
http://www.swinburne.edu.au/swinburneresearchers/index.php?fuseaction=profile&pid=4149

Monash Alfred Psychiatry Research Centre (MAPrc)
Level 4, 607 St Kilda Road,
Melbourne 3004


