[Mne_analysis] TFCE parameters

Natalie Klein neklein at andrew.cmu.edu
Mon Jul 18 16:20:16 EDT 2016

Thanks for the discussion; I am using TFCE myself, and it is helpful to see
what others think about it. It seems safest to me to always start at 0 and
increase the step size to ease the computational burden, since changing the
start value seems much more likely to have drastic effects on the overall
result than changing the step size.
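
For concreteness, in MNE-Python the start/step pair is passed as a dict-valued
threshold to the cluster permutation functions (that is how TFCE is requested,
if I remember the API correctly). A minimal sketch; the random data, array
shape, and keyword values here are made up purely for illustration:

import numpy as np
from mne.stats import permutation_cluster_1samp_test

# Illustrative data only: 20 "subjects" x 50 "time points" of a paired
# difference (random numbers so the snippet is self-contained).
X = np.random.RandomState(0).randn(20, 50)

# Passing a dict instead of a single cluster-forming threshold requests TFCE:
# start at 0 and pick a step small enough that shrinking it further no longer
# changes the resulting p-values.
tfce = dict(start=0, step=0.2)

t_obs, clusters, cluster_pv, H0 = permutation_cluster_1samp_test(
    X, threshold=tfce, n_permutations=1000, tail=1, seed=0)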

Also, sorry to be a bit pedantic, but while I completely understand wanting
to play with the method to see how the parameters might affect the results,
please keep in mind that going back and changing the start/step values to
get different (or possibly 'more significant') p-values is problematic. To
me, one of the main benefits of using TFCE is to avoid having to choose a
single threshold ahead of time, as trying several thresholds and picking
the 'best one' results in a new multiple comparisons issue that is
typically not accounted for.




On Mon, Jul 18, 2016 at 3:49 PM, Cushing, Cody <CCUSHING1 at mgh.harvard.edu>
wrote:

> Ah, alright, beautiful. Thanks for the explanation, Eric. I imagine that
> using a start of 2 on a whole-brain analysis would not make much of a
> difference, which is presumably where that number came from.
>
> Cheers,
> Cody
> ------------------------------
> From: mne_analysis-bounces at nmr.mgh.harvard.edu
> [mne_analysis-bounces at nmr.mgh.harvard.edu] on behalf of Eric Larson
> [larson.eric.d at gmail.com]
> Sent: Monday, July 18, 2016 3:38 PM
> To: Discussion and support forum for the users of MNE Software
> Subject: Re: [Mne_analysis] TFCE parameters
>
> Is there any motivation behind start>0 other than trying to reduce
>> computation time?
>>
>
> Not as far as I know.
>
>
>> Why are these the two parameters we have control over for TFCE? What's to
>> be gained/lost?
>>
>
> Think of it as a way to approximate an integral where each function value
> to put into the summation takes a long time to compute (the clustering
> step). Ideally we would start at zero and go in infinitesimal steps to the
> largest statistic value, but numerically that's infeasible and practically
> it would take forever. So the idea is to compute as few threshold values as
> possible (highest start and biggest step) without affecting the result. If
> using a
> smaller start and/or step affects the output, then you should use the
> smaller start and/or step because it should provide a better approximation
> to the integral. Without looking back, I would assume a start of 2 was
> suggested because it usually doesn't affect the result (at least in the
> suggester's experience).
>
> Eric
>
>
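
To make the discretization Eric describes concrete, here is a toy NumPy/SciPy
sketch of the TFCE sum on a 1-D statistic map. The exponents E=0.5 and H=2 and
the simple connected-component labeling are standard illustrative choices, not
MNE's actual implementation:

import numpy as np
from scipy.ndimage import label

def tfce_1d(stat, start=0.0, step=0.2, E=0.5, H=2.0):
    # Discrete approximation of TFCE(p) = integral over h of e(h)**E * h**H dh,
    # where e(h) is the size of the suprathreshold cluster containing point p
    # at height h. Each threshold costs one clustering pass, which is the
    # expensive part on real data.
    scores = np.zeros_like(stat, dtype=float)
    for h in np.arange(start + step, stat.max() + step, step):
        labels, n_clusters = label(stat >= h)  # clusters above threshold h
        for c in range(1, n_clusters + 1):
            mask = labels == c
            scores[mask] += mask.sum() ** E * h ** H * step
    return scores

stat = np.abs(np.random.RandomState(0).randn(100))  # fake 1-D statistic map
print(tfce_1d(stat, start=0.0, step=0.2)[:5])

Halving the step doubles the number of clustering passes while, ideally, barely
changing the scores, which is exactly the trade-off discussed above.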