[Mne_analysis] issue with brainsuite and bem ico_downsample
Eric Larson
larson.eric.d at gmail.com
Sat Apr 6 18:23:50 EDT 2019
That divide by zero is probably the problem and might be fixable. Can you
share the surfaces on Dropbox or some other file hosting service?
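
In the meantime, here is a quick sanity check you could run on each surface
(a minimal sketch; the file names are placeholders and I'm assuming the
converted surfaces load with mne.read_surface). If I read bem.py right,
n_memb counts how many triangles a vertex belongs to, so an isolated vertex
would give exactly that divide by zero and push NaN/inf into the matrix that
later gets inverted:

import numpy as np
import mne

# placeholder file names: substitute your converted BrainSuite surfaces
for fname in ('inner_skull.surf', 'outer_skull.surf', 'outer_skin.surf'):
    rr, tris = mne.read_surface(fname)
    # count the triangles each vertex belongs to; any zero here would
    # reproduce the divide-by-zero warning seen in your log
    n_memb = np.bincount(tris.ravel(), minlength=len(rr))
    print(fname, len(rr), 'vertices,', len(tris), 'triangles,',
          int((n_memb == 0).sum()), 'isolated vertices')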
Eric
On Sat, Apr 6, 2019, 14:01 Yvonne Yau <yvonne.yau at mail.mcgill.ca> wrote:
>
> I opened an MNE issue for the downsampling problem (
> https://github.com/mne-tools/mne-python/issues/6127)
>
>
>
> When I set ico=None, I run into a new issue with mne.make_bem_solution
> after mne.make_bem_model. I don't get this issue with the MNE sample
> dataset, only with my own acquired T1s. Any ideas? Thanks for your help, guys.
>
> bem = mne.make_bem_solution(surfaces, verbose='INFO')
>
> Approximation method : Linear collocation
> Three-layer model surfaces loaded.
> Computing the linear collocation solution...
>     Matrix coefficients...
>         head (5002) -> head (5002) ...
> /dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/mne/bem.py:151: RuntimeWarning: divide by zero encountered in double_scalars
>   miss /= (4.0 * n_memb)
>         head (5002) -> outer_skull (5002) ...
>         head (5002) -> inner_skull (5002) ...
>         outer_skull (5002) -> head (5002) ...
>         outer_skull (5002) -> outer_skull (5002) ...
>         outer_skull (5002) -> inner_skull (5002) ...
>         inner_skull (5002) -> head (5002) ...
>         inner_skull (5002) -> outer_skull (5002) ...
>         inner_skull (5002) -> inner_skull (5002) ...
>     Inverting the coefficient matrix...
>
> Traceback (most recent call last):
>   File "<ipython-input-18-24943e2f3a65>", line 1, in <module>
>     bem = mne.make_bem_solution(surfaces, verbose='INFO')
>   File "</dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/mne/externals/decorator.py:decorator-gen-39>", line 2, in make_bem_solution
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/mne/utils/_logging.py", line 88, in wrapper
>     return function(*args, **kwargs)
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/mne/bem.py", line 326, in make_bem_solution
>     _fwd_bem_linear_collocation_solution(bem)
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/mne/bem.py", line 267, in _fwd_bem_linear_collocation_solution
>     m['solution'] = _fwd_bem_multi_solution(coeff, m['gamma'], nps)
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/mne/bem.py", line 225, in _fwd_bem_multi_solution
>     return linalg.inv(solids, overwrite_a=True)
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/scipy/linalg/basic.py", line 945, in inv
>     a1 = _asarray_validated(a, check_finite=check_finite)
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/scipy/_lib/_util.py", line 239, in _asarray_validated
>     a = toarray(a)
>   File "/dagher/dagher6/yyau/toolbox/anaconda3/envs/mne/lib/python3.6/site-packages/numpy/lib/function_base.py", line 498, in asarray_chkfinite
>     "array must not contain infs or NaNs")
> ValueError: array must not contain infs or NaNs
>
>
> ------------------------------
> From: mne_analysis-bounces at nmr.mgh.harvard.edu <mne_analysis-bounces at nmr.mgh.harvard.edu> on behalf of Eric Larson <larson.eric.d at gmail.com>
> Sent: April 5, 2019 9:54 AM
> To: Discussion and support forum for the users of MNE Software
> Subject: Re: [Mne_analysis] issue with brainsuite and bem ico_downsample
>
>
>
> The error is in the downsampling step (we should probably improve the
> message). Set `ico=None` and it won't try to downsample your surface:
>
> https://martinos.org/mne/stable/generated/mne.make_bem_model.html
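>
> Roughly (a minimal sketch; the subject name, subjects_dir, and
> conductivities are placeholders, not anything specific to your data):
>
> import mne
>
> model = mne.make_bem_model(subject='subject01', ico=None,
>                            conductivity=(0.3, 0.006, 0.3),
>                            subjects_dir='/path/to/subjects_dir')
> bem = mne.make_bem_solution(model)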
>
> It would be good indeed to see how this works with sample data and see if
> it affects source localization -- feel free to open an MNE issue about that.
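>
> Something like this would do it (a sketch, assuming the sample dataset is
> installed locally; note that with ico=None the BEM uses the full-resolution
> surfaces, so it will be slower and more memory-hungry):
>
> import os.path as op
> import numpy as np
> import mne
>
> data_path = mne.datasets.sample.data_path()
> subjects_dir = op.join(data_path, 'subjects')
> raw_fname = op.join(data_path, 'MEG', 'sample', 'sample_audvis_raw.fif')
> trans = op.join(data_path, 'MEG', 'sample', 'sample_audvis_raw-trans.fif')
>
> src = mne.setup_source_space('sample', spacing='oct5',
>                              subjects_dir=subjects_dir, add_dist=False)
>
> gains = dict()
> for ico in (4, None):  # downsampled vs. full-resolution BEM surfaces
>     model = mne.make_bem_model('sample', ico=ico, subjects_dir=subjects_dir)
>     bem = mne.make_bem_solution(model)
>     fwd = mne.make_forward_solution(raw_fname, trans=trans, src=src,
>                                     bem=bem, meg=False, eeg=True)
>     gains[ico] = fwd['sol']['data']
>
> # crude summary of how much skipping the downsampling changes the gain matrix
> print(np.max(np.abs(gains[4] - gains[None])) / np.max(np.abs(gains[4])))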
>
> Eric
>
>
> On Fri, Apr 5, 2019 at 4:32 AM Alexandre Gramfort <
> alexandre.gramfort at inria.fr> wrote:
>
>
> hi Yvonne,
>
I've never tried this route to BEM model generation. Can you run this
procedure on the MNE sample dataset and share the generated .surf files
with us so we can replicate and hopefully fix the problem?

Alex
>
> On Fri, Apr 5, 2019 at 1:16 AM Yvonne Yau <yvonne.yau at mail.mcgill.ca>
> wrote:
>
>
> Hi guys,
I'm having some issues with the BEM meshes for my EEG data. I've tried
two approaches:
(1) Using the FreeSurfer watershed algorithm with different preflood
levels, but it does a poor job: the outer_skull.surf ends up misplaced
(see attached image).
(2) I've had better success with BrainSuite, following the guidelines:
converting the surfaces from .dfs to .surf with mne_convert_surface, then
reducing the number of triangles to 10,000 with mne_reduce_surface. When
I try to run mne.make_bem_model, I get the error:

  File "/usr/anaconda3/envs/mne/lib/python3.6/site-packages/mne/bem.py", line 344, in _ico_downsample
    raise RuntimeError(bad_msg)

RuntimeError: A surface with 10000 triangles cannot be isomorphic with a
subdivided icosahedron.
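
(If I'm reading the check in mne/bem.py right, the downsampling expects a
surface with 20 * 4**n triangles, i.e. one produced by subdividing an
icosahedron, and 10,000 is never of that form:

# triangle counts the ico downsampling would accept (subdivision levels 0-6)
print([20 * 4 ** n for n in range(7)])
# [20, 80, 320, 1280, 5120, 20480, 81920]

so perhaps I should reduce to one of those counts instead of 10,000?)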
>
> Any ideas?
>
> Best, Yvonne
> _______________________________________________
> Mne_analysis mailing list
> Mne_analysis at nmr.mgh.harvard.edu
> https://mail.nmr.mgh.harvard.edu/mailman/listinfo/mne_analysis
>