[Mne_analysis] continuation (from Fieldtrip list) of writing fifs

Dykstra, Andrew Andrew.Dykstra at med.uni-heidelberg.de
Wed Jan 30 09:38:14 EST 2013

Hi Alexandre, Peter,

While it's not nearly as problematic as going from 2 to 18 Gb, I also 
experience file size increases (usually exact doubling minus the last 
small buffer) when rewriting fif files, not only in MATLAB but also with 
the Python tools.  In my case, I've always wondered whether it's just a 
difference in the default writing precisions between the modern MNE 
package and our system (Neuromag 122), in which case I'm content with 
storing the data in the higher precision.  In any case, would it be 
possible to include a precision argument in the Python writing tool?
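Andy's "exact doubling" is consistent with single-precision (4-byte) buffers being rewritten as double precision (8 bytes). A minimal sketch of the arithmetic, assuming illustrative channel and sample counts and that the raw data buffers dominate the file size:

```python
import numpy as np

# Illustrative recording: 122 channels (Neuromag 122), 1 hour at 1 kHz.
n_channels = 122
n_samples = 60 * 60 * 1000

# Bytes needed to store the buffers at each precision.
single_bytes = n_channels * n_samples * np.dtype(np.float32).itemsize
double_bytes = n_channels * n_samples * np.dtype(np.float64).itemsize

# Rewriting float32 data as float64 doubles the on-disk size,
# matching the "exact doubling minus the last small buffer".
print(double_bytes / single_bytes)  # → 2.0
```

For what it's worth, later versions of MNE-Python's `Raw.save` expose a `fmt` argument (`'short'`, `'int'`, `'single'`, `'double'`), which is essentially the precision hook being requested here.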

Cheers,
Andy

-- 
Andrew R. Dykstra, PhD
Auditory Cognition Lab
Neurologie und Poliklinik
Universitätsklinikum Heidelberg
Im Neuenheimer Feld 400
69120 Heidelberg

"How small the cosmos.  How paltry and puny compared to human consciousness . . . to a single individual recollection." - Vladimir Nabokov



Hi Peter,

> The size of the input file is ~2gb but when
> writing, the output is approx ~18gb. I suspect I'm writing a WHOLE bunch of
> redundant data, but am unable to open the new file, even on a machine
> running 16gb RAM.

What is likely happening is that the MATLAB code writes the data as double
precision, while the original Neuromag data were stored in 16-bit format.
Note also that a single fif file cannot exceed 2 GB due to internal pointer
arithmetic; beyond that it will be corrupted.
What you can do is hack the fif writing to make sure the data are written
back in 16-bit form.
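The hack Alex describes amounts to scaling the double-precision buffers into 16-bit integers with a per-channel calibration factor before writing, which is how fif stores compact integer data in the first place. A rough sketch (the function name and scaling scheme are illustrative, not the actual MNE writer code):

```python
import numpy as np

def to_int16_with_cal(data):
    """Scale float64 data into int16 range; return (int16 buffer, cal).

    The fif format keeps a per-channel calibration factor, so the
    original values are recovered as buffer * cal.  This is a sketch
    of the idea only, not the real fiff writing routine.
    """
    max_abs = np.abs(data).max(axis=1, keepdims=True)
    max_abs[max_abs == 0] = 1.0          # avoid division by zero
    cal = max_abs / 32767.0              # one calibration per channel
    buf = np.round(data / cal).astype(np.int16)
    return buf, cal

rng = np.random.default_rng(0)
data = rng.standard_normal((4, 1000)) * 1e-12   # fake MEG-scale data
buf, cal = to_int16_with_cal(data)
recon = buf * cal                                # lossy round-trip

# int16 takes 2 bytes per sample vs 8 for float64: a 4x size reduction.
print(buf.nbytes / data.nbytes)  # → 0.25
```

The round-trip is lossy, but the quantization error per sample is at most half a calibration step, which is far below sensor noise for typical MEG data.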

Best,
Alex



