[Mne_analysis] Fwd: SNR estimate for real MEG data

Eric Larson larson.eric.d at gmail.com
Fri Oct 19 10:10:27 EDT 2018

The documentation rendering had a bug that has been fixed. Hopefully the
math can help make things a bit clearer:

http://mne-tools.github.io/dev/generated/mne.minimum_norm.estimate_snr.html#mne.minimum_norm.estimate_snr
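For intuition, the core quantity is just the whitened signal power per effective channel at each time point. A minimal NumPy sketch with random placeholder data (an identity whitener is implied; this is not real MEG data and not the full MNE implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
n_ch, n_times = 60, 100

# stand-in for whitened, projected evoked data (random, not real MEG)
data_white = rng.standard_normal((n_ch, n_times))

# power SNR per time point: sum of squares across channels,
# divided by the effective number of channels (assume no zero
# eigenvalues in the noise covariance here, so n_zero = 0)
n_zero = 0
signal = np.sum(data_white ** 2, axis=0)
snr_power = signal / (n_ch - n_zero)

# amplitude SNR, as in the first output of estimate_snr
snr = np.sqrt(snr_power)
print(snr.shape)  # one value per time point
```

With pure noise as above, the whitened power per channel averages to about 1, which is the sense in which this is an SNR.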

As to your specific questions, I don't have any answers offhand.

Eric


On Tue, Oct 16, 2018 at 3:51 PM amit <amit.neurosignal at gmail.com> wrote:

>
> Hi everyone,
> I am re-sending this mail; apologies for the duplicate.
> I am trying to estimate the SNR of evoked-field MEG data. Looking at
> the MNE-Python code, the function for estimating SNR
> (mne.minimum_norm.inverse.estimate_snr()) is:
>
> def estimate_snr(evoked, inv, verbose=None):
>     """Estimate SNR.
>
>     .. versionadded:: 0.9.0
>     """  # noqa: E501
>     from scipy.stats import chi2
>     _check_reference(evoked)
>     _check_ch_names(inv, evoked.info)
>     inv = prepare_inverse_operator(inv, evoked.nave, 1. / 9., 'MNE')
>     sel = _pick_channels_inverse_operator(evoked.ch_names, inv)
>     logger.info('Picked %d channels from the data' % len(sel))
>     data_white = np.dot(inv['whitener'],
>                         np.dot(inv['proj'], evoked.data[sel]))
>     data_white_ef = np.dot(inv['eigen_fields']['data'], data_white)
>     n_ch, n_times = data_white.shape
>
>     # Adapted from mne_analyze/regularization.c, compute_regularization
>     n_zero = (inv['noise_cov']['eig'] <= 0).sum()
>     logger.info('Effective nchan = %d - %d = %d'
>                 % (n_ch, n_zero, n_ch - n_zero))
>     signal = np.sum(data_white ** 2, axis=0)  # sum of squares across channels
>     noise = n_ch - n_zero
>     snr = signal / noise
>
>     # Adapted from noise_regularization
>     lambda2_est = np.empty(n_times)
>     lambda2_est.fill(10.)
>     remaining = np.ones(n_times, bool)
>
>     # deal with low SNRs
>     bad = (snr <= 1)
>     lambda2_est[bad] = 100.
>     remaining[bad] = False
>
>     # parameters
>     lambda_mult = 0.9
>     sing2 = (inv['sing'] * inv['sing'])[:, np.newaxis]
>     val = chi2.isf(1e-3, n_ch - 1)
>     for n_iter in range(1000):
>         # get_mne_weights (ew=error_weights)
>         # (split newaxis creation here for old numpy)
>         f = sing2 / (sing2 + lambda2_est[np.newaxis][:, remaining])
>         f[inv['sing'] == 0] = 0
>         ew = data_white_ef[:, remaining] * (1.0 - f)
>         # check condition
>         err = np.sum(ew * ew, axis=0)
>         remaining[np.where(remaining)[0][err < val]] = False
>         if not remaining.any():
>             break
>         lambda2_est[remaining] *= lambda_mult
>     else:
>         warn('SNR estimation did not converge')
>     snr_est = 1.0 / np.sqrt(lambda2_est)
>     snr = np.sqrt(snr)
>     return snr, snr_est
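To make the second output concrete: the loop above shrinks lambda2 per time point until the regularized whitened error drops below a chi-square threshold, and snr_est is simply the amplitude SNR implied by the final lambda2. A small stand-alone sketch with toy lambda2 values (not taken from real data):

```python
import numpy as np
from scipy.stats import chi2

n_ch = 60  # toy channel count

# convergence threshold on the whitened error (p = 1e-3),
# as computed once before the loop in estimate_snr
val = chi2.isf(1e-3, n_ch - 1)

# toy regularization values; in the real function these are
# multiplied by lambda_mult each iteration, per time point
lambda2_est = np.array([100.0, 10.0, 1.0, 1.0 / 9.0])

# the second output is just the amplitude SNR implied by lambda2
snr_est = 1.0 / np.sqrt(lambda2_est)
```

Note that lambda2 = 1/9 maps to snr_est = 3, which is the conventional MNE default relationship lambda2 = 1 / SNR**2.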
>
> In the script, while preparing the inverse operator for actually
> computing the inverse (inv = prepare_inverse_operator(inv,
> evoked.nave, 1. / 9., 'MNE')), an SNR of 3 is assumed (lambda2 =
> 1 / SNR**2 = 1/9). So my questions are:
>
> 1. Isn't it a source of bias to assume an SNR value before estimating it?
> 2. The 'snr' output is a time series, i.e., an SNR value at each time
> point of the data. Which value should be taken for the whole evoked
> segment: the mean, the max, or the SNR at the evoked peak?
> 3. Can you suggest a reference for this SNR estimation method?
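For what it's worth, the three candidates in question 2 are mechanically straightforward to compute; which one is the right summary is exactly the open question. A sketch with toy arrays (the waveform and SNR series are invented placeholders, not real data):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 0.5, 100)                  # toy time axis (s)
evoked_amp = np.exp(-((t - 0.1) / 0.02) ** 2)   # toy evoked waveform, peak near 100 ms
snr = rng.uniform(0.5, 4.0, t.size)             # toy per-time-point SNR series

peak_idx = int(np.argmax(np.abs(evoked_amp)))   # sample of the evoked peak

snr_mean = float(snr.mean())   # candidate 1: mean over the segment
snr_max = float(snr.max())     # candidate 2: maximum over the segment
snr_peak = float(snr[peak_idx])  # candidate 3: SNR at the evoked peak
```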
>
> Thank you very much.
> _______________________________________________
> Mne_analysis mailing list
> Mne_analysis at nmr.mgh.harvard.edu
> https://mail.nmr.mgh.harvard.edu/mailman/listinfo/mne_analysis

