[Mne_analysis] Number of used triangles and vertices in source space

VERRIER Clement clement.VERRIER at univ-amu.fr
Thu Apr 9 04:33:06 EDT 2020

Hi Alexandre,

Thank you! I have attached a piece of code that does (I hope!) what I explained before, with some comments.
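
In case the attachment does not come through, here is a minimal sketch of the idea described in my original message below (not necessarily identical to exp_triangles.py, and using the 'vertno' key rather than 'patch_inds' as the set of kept vertices):

import os.path as op
import numpy as np
import mne
from mne.datasets import sample

# For illustration, use the same forward solution as the tutorial (sample
# dataset); any surface-based forward should behave the same way.
data_path = sample.data_path()
fwd_fname = op.join(data_path, 'MEG', 'sample',
                    'sample_audvis-meg-eeg-oct-6-fwd.fif')
fwd = mne.read_forward_solution(fwd_fname)
# Fixed orientation, so each used vertex maps to one column of the gain matrix.
fwd = mne.convert_forward_solution(fwd, surf_ori=True, force_fixed=True)

src = fwd['src']
print(fwd['sol']['data'].shape[1], sum(s['nuse'] for s in src))  # should match

for hemi, s in zip(('lh', 'rh'), src):
    used = s['vertno']     # vertices actually kept in the forward model
    tris = s['use_tris']   # triangulation of the original source space
    # keep only the triangles whose three corners are all kept vertices
    keep = np.all(np.isin(tris, used), axis=1)
    # renumber the surviving triangles into 0..nuse-1
    new_tris = np.searchsorted(used, tris[keep])
    print(hemi, s['nuse'], np.unique(new_tris).size)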

Thank you again,

Clément



> On 9 Apr 2020, at 09:17, Alexandre Gramfort <alexandre.gramfort at inria.fr> wrote:
>
> hi Clément,
>
> what is very possible is that the omission of certain dipoles from the
> src at the forward stage, due to the min_dist parameter being > 0, is not
> reflected in use_tris, which is left untouched.
>
> it would be easier for me to think about the issue if you share a code
> snippet I can play with.
>
> Alex
>
> On Wed, Apr 8, 2020 at 11:40 PM VERRIER Clement
> <clement.VERRIER at univ-amu.fr> wrote:
>>
>> Hello everyone,
>>
>> I have a question about the SourceSpaces object. More precisely: I am following this tutorial (with MNE 0.20.0, https://mne.tools/stable/auto_examples/inverse/plot_custom_inverse_solver.html), with the loose and depth parameters set to 0.0 and 1.0 respectively (so I am working with fixed orientation) in the _prepare_gain function. The latter returns a new forward object that contains a SourceSpaces, named src.
>>
>> One of my goals is to deal only with the vertices (and their associated triangles) that are involved in the gain matrix in forward['sol']['data'] (whose shape is 305 x 7498), so I assumed these could easily be accessed using the 'use_tris' key. Unfortunately, this does not seem to be the case: proceeding that way gives 8196 vertices (4098 per hemisphere) instead of the 7498 expected. Could someone explain to me why?
>>
>> I was told that, for good reasons, some of these vertices can be removed by certain MNE-Python functions in order to make things work correctly. For example, if we consider the left hemisphere (i.e. src[0]), the 'nuse' key shows that only 3732 vertices are used. I looked through all the keys of src[0] and found that 'patch_inds' seems to answer my question (at least it has a size of 3732), provided you first renumber the vertices of src[0]['use_tris'] (using, for example, the numpy.searchsorted function). My approach to getting the triangles associated with these vertices is to check whether each triangle has all of its vertices in the 'patch_inds' array: if it does, we keep the triangle; if not, we delete it. This procedure gives a new triangles array, but when I try something like np.unique(new_triangles).size, it returns 3729 (expected 3732) for the left hemisphere and 3760 (expected 3766) for the right hemisphere. Maybe I am completely wrong about all of this, but after many days of research, I still do not know how to solve my problem.
>>
>> I hope I have been clear enough in my explanations above.
>>
>> Thanks in advance,
>>
>> Clément
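
P.S. Regarding the min_dist point: a quick check of which vertices were dropped could look like the sketch below (assuming src_orig is the source space that was originally used to compute the forward, e.g. read back with mne.read_source_spaces, and fwd is the forward solution):

import numpy as np

# Assumption: src_orig is the original source space used to build the forward
# and fwd is the forward solution computed from it with min_dist > 0.
for s_orig, s_fwd in zip(src_orig, fwd['src']):
    dropped = np.setdiff1d(s_orig['vertno'], s_fwd['vertno'])
    print('vertices dropped at the forward stage:', dropped.size)
    # use_tris is left untouched, so it still refers to the full original set
    print('use_tris unchanged:', np.array_equal(s_orig['use_tris'],
                                                s_fwd['use_tris']))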

-------------- next part --------------
A non-text attachment was scrubbed...
Name: exp_triangles.py
Type: text/x-python-script
Size: 4162 bytes
Desc: exp_triangles.py
Url : http://mail.nmr.mgh.harvard.edu/pipermail/mne_analysis/attachments/20200409/a23e51df/attachment-0001.bin 

