[Mne_analysis] a junior question on mne_setup_source_space

peng prion.w at gmail.com
Thu May 1 11:05:29 EDT 2014

Thanks for the reply, dgw. Do you mean the segmentation is made on
fsaverage and then projected to the subject by
====
mne_setup_source_space --subject subj1 --ico 2 --morph fsaverage
====
I tried it this way before and it always failed with an error message
saying that vertex XXX was used multiple times, no matter which resolution
I used (ico 2, 3, 4 or 5). Since the resolution of fsaverage is not very
high (2 mm), I also tried icbm152 and colin27 (both 1 mm resolution), but
neither worked. Do you have any idea about possible reasons?
I just used ico 2 here as an example. Of course I would also like to know
what minimum resolution you would consider appropriate for a reasonable
mapping.
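Once the command does run through, I suppose the resulting source space
can be checked in matlab roughly like this (just an untested sketch,
assuming the MNE matlab toolbox is on the path; the output file name is a
guess based on the usual naming pattern):
====
src = mne_read_source_spaces('subj1-fsaverage-ico-2-src.fif');
[src(1).nuse src(2).nuse]   % number of sources per hemisphere
====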

btw: I hope it is OK to address you as dgw; that is the name I saw as the
email sender. Please correct me if you prefer something else.

best
Peng


On Thu, May 1, 2014 at 4:12 PM, dgw <dgwakeman at gmail.com> wrote:

> Hi Peng,
>
> I think the basic point here is that you have the logic reversed. The idea
> behind the morph option is that you would use --subject subj1 and
> --morph fsaverage. You would then do this for all of your subjects.
> Although with such a tiny number of sources (--ico 2), I wouldn't
> expect to get very good results.
>
> HTH,
> D
>
> On Tue, Apr 29, 2014 at 12:44 PM, peng <prion.w at gmail.com> wrote:
> > Hi dgw,
> >
> >    First, thank you for your help in the previous discussions; thanks
> > also to alex, martin, junpeng and many others for their help.
> >    I have played with the commands for a while and understand more now.
> > However, I am still not sure how this "morph" works. For example, in the
> > following command,
> >    mne_setup_source_space --subject fsaverage --ico 2 --morph subj1
> > --overwrite
> >    I previously thought that this command would take some locations in
> > the template fsaverage and then project them onto subj1 based on their
> > anatomical similarity (e.g. the shape of the white matter). In other
> > words, I expected that if I repeated this command for each subject, all
> > subjects would share the same, or at least very close, locations (at the
> > cortex level) for the same vertex id. This is interesting for me because
> > it would make later group analysis much more convenient, e.g. location
> > #100 of subject 1 and location #100 of subject 2 would mean the same
> > thing at the group level. However, when I later applied the freesurfer
> > labels, I found that the same id (e.g. #100) may belong to different
> > brain areas (as indicated by the labels). I am a little confused now. Is
> > this difference caused by small variations, or was I wrong from the
> > start, i.e. does this morph not guarantee a one-to-one mapping between
> > the template and each subject?
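> >
> >    A rough way to check this might be to compare the morphed source
> > spaces of two subjects in matlab (just an untested sketch; the file
> > names are guesses based on the usual naming pattern, and it assumes
> > the MNE matlab toolbox is on the path):
> >
> >    src1 = mne_read_source_spaces('subj1-fsaverage-ico-2-src.fif');
> >    src2 = mne_read_source_spaces('subj2-fsaverage-ico-2-src.fif');
> >    % vertex ids of the first few left-hemisphere sources in each
> >    % subject, to see whether the ids themselves match across subjects
> >    src1(1).vertno(1:5)
> >    src2(1).vertno(1:5)
> >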
> >    Thanks a lot for the help!
> >
> > best
> > Peng
> >
> >
> >
> > On Fri, Mar 28, 2014 at 3:08 PM, dgw <dgwakeman at gmail.com> wrote:
> >>
> >> That certainly is the basic workflow in terms of commands, but I don't
> >> think you want to use the "tmp" subject. You also don't morph the data
> >> until later in the process.
> >>
> >> Can I recommend that you start off by doing the example in the
> >> "Cookbook" chapter? It does not take very long, and it will help you
> >> be more familiar with how the process works in one individual. Then
> >> you can try to take what you learn from that and apply it directly to
> >> your data.
> >>
> >> HTH,
> >> D
> >>
> >> On Fri, Mar 28, 2014 at 9:52 AM, peng <prion.w at gmail.com> wrote:
> >> > Dear Alex, Martin and other MNE users,
> >> >
> >> > I have my MEG raw data in CTF .ds format and raw MRI images in dicom
> >> > format. I wish to use MNE to compute the leadfield (based on a
> >> > surface source space).
> >> > With your help, I would like to summarise a possible pipeline as
> >> > follows:
> >> > 1. recon-all --subject test1
> >> >    # run in freesurfer to generate the brain surfaces and other
> >> >    # needed files
> >> > 2. mne_watershed_bem --subject test1 --overwrite --atlas
> >> >    # generate the surfaces for the BEM model
> >> > 3. mne_ctf2fiff --ds test1.ds --fif test1.fif --infoonly
> >> >    # generate the MEG information (sensor locations etc.)
> >> > 4. # use MNE_analyze with test1.fif and inflated.surf to generate
> >> >    # test1-trans.fif, which contains the transformation matrix;
> >> >    # or use a previously calculated transformation matrix and
> >> >    # convert it to fif format
> >> > 5. mne_setup_source_space --subject tmp --ico 5 --morph test1
> >> >    # tmp is the name of a template, could be fsaverage or icbm152...
> >> >    # This uses locations on the surface of the template cortex
> >> >    # (white matter?) as the source space.
> >> >    # These locations are then morphed to subject <test1> to generate
> >> >    # the leadfield for this subject later.
> >> > 6. mne_do_forward_solution --subject test1 \
> >> >    --src test1-fsaverage-ico-5-src.fif \
> >> >    --meas test1.fif \
> >> >    --trans test1-trans.fif \
> >> >    --megonly --overwrite \
> >> >    --fwd test1-oct-5-fwd.fif
> >> > # I can then find my leadfield (an M x N x 3 matrix, M = number of
> >> > # locations, N = number of sensors) by importing test1-oct-5-fwd.fif
> >> > # into matlab.
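> >> >
> >> > For reading it into matlab, something along these lines might work
> >> > (an untested sketch; mne_read_forward_solution is from the MNE matlab
> >> > toolbox, and the reshape assumes the usual layout of three
> >> > consecutive columns per source):
> >> >
> >> > fwd = mne_read_forward_solution('test1-oct-5-fwd.fif');
> >> > G = fwd.sol.data;  % n_channels x (3 * n_sources) gain matrix
> >> > % rearrange into M x N x 3 (M = locations, N = sensors) if preferred
> >> > L = permute(reshape(G', 3, fwd.nsource, fwd.nchan), [2 3 1]);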
> >> >
> >> > Did I miss something?
> >> > Thanks a lot for the help.
> >> >
> >> >
> >> >
> >> > On Fri, Mar 28, 2014 at 2:14 PM, peng <prion.w at gmail.com> wrote:
> >> >>
> >> >> Thank you Martin for the help.
> >> >> 1. I re-ran the command "mne_setup_forward_model" and it seemed to
> >> >> work this time (maybe I had not set the environment correctly
> >> >> before). Sorry for the confusion.
> >> >> 2. I tried the MNE_analyze GUI and it was not easy for me, but I
> >> >> saved the result for one subject successfully and it could be loaded
> >> >> by MNE_analyze. However, I wanted to read it in matlab to check the
> >> >> actual contents (which I assume is a structure with the
> >> >> transformation matrix). I failed with "x = fiff_read_mri(fname, 0)";
> >> >> it complained "Could not find MRI data"... Could you please let me
> >> >> know the function names to read and write *-src.fif files in matlab?
> >> >>
> >> >>
> >> >>
> >> >> On Wed, Mar 26, 2014 at 9:28 PM, Martin Luessi
> >> >> <mluessi at nmr.mgh.harvard.edu> wrote:
> >> >>>
> >> >>> On 03/26/14 16:00, peng wrote:
> >> >>> > Thank you for the answers.
> >> >>> > 1. Surprisingly, I did not find any files generated or saved
> >> >>> > after running mne_setup_forward_model.
> >> >>>
> >> >>> I assume there was an error; can you post the program output?
> >> >>>
> >> >>> > 2. I don't have a -trans.fif file. I tried to do it in the GUI of
> >> >>> > MNE_analyze, but it was quite complicated. If I have a
> >> >>> > co-registered mri file (generated by the CTF software, with .mri
> >> >>> > extension), can it be converted to fif format? If not, I can read
> >> >>> > the 4x4 head2mri matrix via matlab; can this information be
> >> >>> > written to fif format with certain tools in MNE?
> >> >>>
> >> >>> There is also a coreg GUI in MNE-Python; have a look here:
> >> >>>
> >> >>> http://www.slideshare.net/mne-python/mnepython-coregistration
> >> >>>
> >> >>> You could write a .fif file with the transform in Matlab, but it
> >> >>> seems to me that this wouldn't be easier than using the
> >> >>> coregistration tools in mne_analyze or MNE-Python.
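> >> >>>
> >> >>> If you do want to write it yourself, something roughly like this
> >> >>> might work (an untested sketch; it assumes the fiff writing helpers
> >> >>> from the MNE matlab toolbox, so please double-check the function
> >> >>> names, and head2mri is the 4x4 matrix you already read from the CTF
> >> >>> data):
> >> >>>
> >> >>> FIFF = fiff_define_constants;
> >> >>> t.from  = FIFF.FIFFV_COORD_HEAD;
> >> >>> t.to    = FIFF.FIFFV_COORD_MRI;
> >> >>> t.trans = head2mri;             % 4x4 head -> MRI transform
> >> >>> fid = fiff_start_file('test1-trans.fif');
> >> >>> fiff_write_coord_trans(fid, t);
> >> >>> fiff_end_file(fid);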
> >> >>>
> >> >>> HTH,
> >> >>>
> >> >>> Martin
> >> >>>
> >> >>>
> >> >>> > Thanks again!
> >> >>> >
> >> >>> >
> >> >>> > On Tue, Mar 25, 2014 at 11:39 PM, Alexandre Gramfort
> >> >>> > <alexandre.gramfort at telecom-paristech.fr> wrote:
> >> >>> >
> >> >>> >     hi,
> >> >>> >
> >> >>> >     > Some additional questions.
> >> >>> >     > 1. After I ran mne_setup_forward_model, I got no feedback
> >> >>> >     > from the command line; is that normal?
> >> >>> >
> >> >>> >     it should print that it saved a file to disk
> >> >>> >
> >> >>> >     > 2. I am using CTF data, so I converted the .ds folder from
> >> >>> >     > CTF to fif format with the mne_ctf2fiff command, using the
> >> >>> >     > option --infoonly. Here I only want to get the leadfield, so
> >> >>> >     > I suppose the data themselves are not necessary. However,
> >> >>> >     > when I try mne_do_forward_solution, it asks for an mri
> >> >>> >     > description file, which I don't have. I only have a series
> >> >>> >     > of dicom files, or the .mri file generated from them by the
> >> >>> >     > ctf software. Can I still move on?
> >> >>> >
> >> >>> >     you need to do the coregistration and get a -trans.fif file.
> >> >>> >
> >> >>> >     A