[Mne_analysis] regarding morphing in matlab

sheraz at nmr.mgh.harvard.edu sheraz at nmr.mgh.harvard.edu
Mon Mar 14 14:33:08 EDT 2011
Hi Pavan,

A small suggestion: if you are on Linux, you can also call the MNE
command-line tools from MATLAB, with something like this.

command = ['mne_make_movie --stcin <stcINfile> --subject <subj> ' ...
           '--morph fsaverage --smooth 5 --stc <stcOUTfile>'];

[st, wt] = unix(command);

if st ~= 0
    disp(wt);   % show the tool's output before stopping
    error('Error generating morphed stc file');
end

Within a MATLAB pipeline you can mix MATLAB code and the MNE
command-line tools in this way.
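For a longer pipeline it can help to wrap such calls in a small helper
function. A minimal sketch (the function name and error handling are just
illustrative; save it as run_mne_command.m):

function output = run_mne_command(cmd)
% Run an MNE command-line tool via the shell; stop with its output on failure.
[status, output] = unix(cmd);
if status ~= 0
    disp(output);
    error('MNE command failed: %s', cmd);
end

Then each step of the pipeline is just run_mne_command(command) with the
command string built as above.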

Sheraz Khan, Ph.D.

Research Fellow,
Department of Neurology /
Martinos Center for Biomedical Imaging
Massachusetts General Hospital /
Harvard Medical School
149 13th Street, CNY-10.023
Boston, MA  02129
fax 617-948-5966





> Hi Hari (and others),
>
> Thanks very much for the suggestions and apologies for the lengthy reply!
>
> I checked, and most rows are indeed entirely zero (about 2300/2562). I
> also looked at the vertex numbering, and it seems to make sense. Also, I
> visualized the result as an stc file, and it looks as sparse as the data
> matrix suggests.
>
> It seems to me that smoothing (or, as Matti suggests calling it in
> Section 8.3 of the manual v2.6.1, smudging/blurring) is the root of the
> difference between the mne_make_movie result and my MATLAB implementation.
>
> For example, I checked that the mne_make_movie result with only a single
> smoothing iteration (i.e. --smooth 1) is nearly as sparse. Specifically,
> in the resulting stc file 2372/2562 used vertices in stc_l.data are zero.
>
> Here are some snapshots from a single time frame of a single-subject stc
> file, along with the same file morphed to a common brain with 3 different
> smoothing factors (1, 2 and 5). Matti recommends a smoothing factor
> between 4 and 7.
>
> https://neuro.hut.fi/~pavan/temp/smudging_subject01-lh-lat.png
> https://neuro.hut.fi/~pavan/temp/smudging01_common-lh-lat.png
> https://neuro.hut.fi/~pavan/temp/smudging02_common-lh-lat.png
> https://neuro.hut.fi/~pavan/temp/smudging05_common-lh-lat.png
>
> As you might guess, the sparseness of the respective data matrices
> decreases with increased smoothing.
>
> I would appreciate your (or anyone else's) comments on whether the
> smoothing assessment above is right and if so, what would your
> recommendations be regarding the choice of the smoothing factor. Please
> note that I am subjecting my morphed data to a substantial analysis
> pipeline and it is not a last stage visualization of group data as is
> typically the case.
>
> Also, any comments regarding implementation of the smoothing/blurring
> operation in MATLAB are welcome. Dealing with a large "vertex x vertex"
> distance matrix, and choosing the neighborhood parameter 'N_j' as defined
> in Section 8.3 of the v2.6.1 manual, are not straightforward to me.
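>
> As a sketch of that smoothing/smudging step in MATLAB (illustrative only;
> the neighborhood handling may differ in detail from what mne_make_movie
> actually does; here 'tris' is the 1-based triangle list of the surface
> and 'data' is an nVerts x nTimes matrix with zeros at unassigned
> vertices):
>
> nVerts = size(data, 1);
> edges  = [tris(:,[1 2]); tris(:,[2 3]); tris(:,[1 3])];
> A = sparse(edges(:,1), edges(:,2), 1, nVerts, nVerts);
> A = double((A + A') > 0) + speye(nVerts);   % neighbours plus self
> for iter = 1:nSmooth
>     nz = double(any(data, 2));      % vertices already carrying a value
>     counts = A * nz;                % N_j: number of non-zero neighbours
>     counts(counts == 0) = 1;        % isolated vertices stay at zero
>     data = bsxfun(@rdivide, A * data, counts);
> end
>
> Each iteration replaces every value by the average of its non-zero
> neighbours, so zeros get filled in progressively from nearby vertices;
> no explicit distance matrix is needed, only the sparse adjacency.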
>
> Thanks again,
> Pavan
>
>> Hi Pavan,
>>    I don't have much experience with morphing data in MATLAB. Given
>> that,
>> the procedure looks ok to me. Did you write any of your morphed data to
>> an stc file to see what it looks like?
>>
>> There are a few possibilities that occur to me:
>> (1) Are you sure 'most rows' of leftmapRelevant and rightmapRelevant are
>> in fact zero? They ought to be sparse, with probably 2 or 3 non-zero
>> elements in each row of 5000-odd elements (if that's your source space
>> size).
>>
>> (2) Depending on how you made the sample stc file that is in
>> 'common_brain' space, it might have lost track of the vertex numbering.
>> Could you look at stc_l.vertices and stc_r.vertices to see if they make
>> sense? If they are simply [0:2561]' or [0:10241]' for instance, the
>> procedure of course will give you mostly zeros.
>>
>> Hope it helps.
>>
>> Regards,
>> Hari
>>
>> On Mon, March 14, 2011 7:00 am, Pavan Ramkumar wrote:
>>> Dear MNE users,
>>>
>>> Just a follow up mail to my previous request. I would like to transform
>>> my
>>> data from one decimated source space to another from within matlab.
>>> Below
>>> are the steps I have adopted to do so. If anybody has previous
>>> experience
>>> in doing similar things, kindly share your thoughts.
>>>
>>> 1/ I precomputed the morph maps from each surface to the common brain
>>> with
>>> mne_make_morph_maps.
>>>
>>> 2/ I read in the morph maps using mne_read_morph_map as follows:
>>> [leftmap,rightmap] = mne_read_morph_map('my_brain', 'common_brain');
>>>
>>> 3/ I read in an inverse operator for the input surface 'my_brain' into
>>> the
>>> structure 'inv_op'
>>>
>>> 4/ I read in a sample stc file for the target surface i.e.
>>> 'common_brain'
>>> to get the vertex info. The stc files were read into 'stc_l' and
>>> 'stc_r'
>>>
>>> 5/ Next, I used the following lines of code to transform my data:
>>> leftmapRelevant = leftmap(stc_l.vertices,inv_op.src(1).vertno);
>>> rightmapRelevant = rightmap(stc_r.vertices,inv_op.src(2).vertno);
>>> morphed_dataL = leftmapRelevant*dataL;
>>> morphed_dataR = rightmapRelevant*dataR;
>>>
>>> Note that stc_l.vertices contains all used vertices for the left
>>> hemisphere of the target surface (i.e. 'common_brain') and
>>> inv_op.src(1).vertno contains all used vertices for the left hemisphere
>>> of the source surface (i.e. 'my_brain'). The same holds for
>>> stc_r.vertices and inv_op.src(2).vertno.
>>>
>>> Intuition suggests that these steps would suffice, but I ran some
>>> checks and found that most rows of leftmapRelevant and rightmapRelevant
>>> are entirely zero.
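>>>
>>> A quick way to quantify that (just a diagnostic sketch):
>>>
>>> emptyL = sum(~any(leftmapRelevant, 2));
>>> fprintf('%d of %d rows of leftmapRelevant are empty\n', ...
>>>     emptyL, size(leftmapRelevant, 1));
>>>
>>> and similarly for rightmapRelevant; for a correct restriction nearly
>>> every row should have at least one non-zero entry.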
>>>
>>> I am using a morphgrade of 4 i.e. target surface consists of 2562 used
>>> vertices per hemisphere. By comparison, the source surfaces are
>>> computed
>>> with 5mm spacing and therefore consist of about 5000 used vertices per
>>> hemisphere.
>>>
>>> What am I missing?
>>>
>>> Thank you for your time.
>>> Best regards,
>>> Pavan
>>>
>>>> Dear MNE users,
>>>>
>>>> I would like to morph my data from a single subject into a common
>>>> brain.
>>>> However, instead of writing the result out as an stc file (which takes
>>>> up
>>>> too much space) I would like to do the morphing from within matlab,
>>>> and
>>>> then subsequently do some postprocessing on the morphed data before I
>>>> write out stc files.
>>>>
>>>> Does anybody know where to find the vertex numbers corresponding to
>>>> the
>>>> --morphgrade parameter used in mne_make_movie? Specifically, for
>>>> morphgrade = 5, where do I find the indices of the 10242 vertices that
>>>> are
>>>> selectively stored into the stc file by mne_make_movie? As far as I
>>>> looked, the manual does not explicitly give this information.
>>>>
>>>> Please advise.
>>>>
>>>> Thanks in advance,
>>>> Pavan
>>>>
>>>
>>> _______________________________________________
>>> Mne_analysis mailing list
>>> Mne_analysis at nmr.mgh.harvard.edu
>>> https://mail.nmr.mgh.harvard.edu/mailman/listinfo/mne_analysis
>>>
>>>
>>>
>>
>>
>> --
>> Hari Bharadwaj
>>
>>
>>
>
