[Mne_analysis] GSoC Idea, Improving decode module

JR KING jeanremi.king at gmail.com
Thu Mar 10 17:27:43 EST 2016

Hi Asish,

As Denis said, the decoding module is one possible target. Just FYI, there
are other possibilities too: e.g. across-subject stats and viz aren't
really well developed or documented.

Currently the decoding classes have been developed separately, by different
authors and with different architectures. IMO, one great goal would thus be
to:

*1. (hard)* Homogenize the existing functions so that they all become
strictly compatible with sklearn (i.e., based on BaseEstimator, using fit,
transform, predict and score methods).

*2. (medium-hard)* Develop transformer objects that would ultimately
allow users to pipe multiple processing steps; e.g. we typically aim at
being able to write:

make_pipeline(TimeFreq(), InverseTransform(), DataVectorizer(),
              LogisticRegression())

or

make_pipeline(Filter(10, 30), Covariances(method='shrunk'),
              Xdawn(n_components=4), TangentSpace(), SVM(kernel='linear'))

where all the steps could typically be initialized with inst.info and
would take an X and a y to be fitted/predicted/scored (a sketch follows
point 3 below).

*3. (easy)* Set up systematic I/O to store the estimators, the predictions
and the scores (the cross-validation sketch further below touches on this).
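
To make points 1 and 2 a bit more concrete, here is a rough, self-contained
sketch (ChannelVectorizer and its parameters are hypothetical, not existing
MNE classes, and random arrays stand in for real epochs):

import numpy as np
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

class ChannelVectorizer(BaseEstimator, TransformerMixin):
    """Hypothetical transformer: flatten (n_epochs, n_channels, n_times)
    arrays into the 2D shape that plain sklearn estimators expect."""

    def __init__(self, info=None):
        # info would typically be inst.info from Raw/Epochs; it is kept
        # optional here so that the toy example stays self-contained.
        self.info = info

    def fit(self, X, y=None):
        # Nothing to learn here; just remember the per-epoch feature shape.
        self.features_shape_ = X.shape[1:]
        return self

    def transform(self, X):
        return X.reshape(len(X), -1)

# Toy data standing in for epochs: 40 epochs, 32 channels, 50 time points.
rng = np.random.RandomState(42)
X = rng.randn(40, 32, 50)
y = rng.randint(0, 2, 40)

clf = make_pipeline(ChannelVectorizer(), LogisticRegression())
clf.fit(X, y)
print(clf.predict(X[:5]))

Because the transformer respects the BaseEstimator/TransformerMixin
contract (parameters stored as-is in __init__, fitted attributes ending in
an underscore), it can be dropped into make_pipeline next to any sklearn
estimator.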

As a concrete example, to optimize memory and CPU, the GAT currently stores
the predictions (y_pred_) in the object, and the scoring is performed
outside the cross-validation. This storing and scoring does not follow the
sklearn API; consequently, one cannot use cross_val_score(GAT). Refactoring
this kind of feature typically requires some deep thinking because, unlike
sklearn, several of the decoding classes are applied in a "mass
multivariate" way: i.e. many multivariate models are fitted on
independent, partially overlapping, or even identical data. Optimizing
memory and CPU is thus probably the main challenge here.
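
To show the behaviour we are after (again only a sketch, with toy 2D data
and plain sklearn/joblib pieces rather than the current GAT API): once an
estimator respects the sklearn contract, cross_val_score handles the CV
loop for us, and the fitted estimator and scores can be stored on disk,
which is roughly the I/O of point 3.

import numpy as np
from joblib import dump, load
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.RandomState(0)
X = rng.randn(60, 32 * 50)  # epochs already vectorized to 2D
y = rng.randint(0, 2, 60)

clf = make_pipeline(StandardScaler(), LogisticRegression())

# The CV loop and the scoring are handled by sklearn, not by the estimator.
scores = cross_val_score(clf, X, y, cv=5)

# Systematic I/O: persist the fitted estimator together with its scores.
clf.fit(X, y)
dump({'estimator': clf, 'scores': scores}, 'decoding_result.joblib')
result = load('decoding_result.joblib')
print(result['scores'].mean())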

I would consequently start by tackling the easy/medium problems first (e.g.
I/O in all decoding classes, visualizing the fitted weights/patterns for
each decoding method), and then see how we can develop some transformers,
such as EpochVectorizer, that would be common across the decoding classes
for formatting the data.
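
For the weights/patterns visualization, a minimal sketch of the kind of
thing I have in mind (assuming a linear classifier fitted on vectorized
epochs; plain matplotlib is used here rather than MNE's topographic plots,
and the data are random placeholders):

import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression

n_epochs, n_channels, n_times = 60, 32, 50
rng = np.random.RandomState(1)
X = rng.randn(n_epochs, n_channels, n_times)
y = rng.randint(0, 2, n_epochs)

clf = LogisticRegression()
clf.fit(X.reshape(n_epochs, -1), y)

# Reshape the flat coefficient vector back to channels x times so the
# fitted weights can be inspected per channel and per time point.
weights = clf.coef_.reshape(n_channels, n_times)

plt.imshow(weights, aspect='auto', origin='lower')
plt.xlabel('Time sample')
plt.ylabel('Channel')
plt.colorbar(label='Decoder weight')
plt.show()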

Hope this helps,


JR


>> In summary, this project will involve a series of usability improvements
>> for the decoding module and extend its functionality.
>
> I feel the above statement is quite vague for writing a detailed plan in
> the proposal. Or perhaps the "improvements" can only be known while the
> objectives (listed above) are being fulfilled?
> Lastly, going a little off topic, could you now please elaborate on how
> to set up the cleaner framework of the decoding module that you mentioned
> in the last message?
>
> Thank you
> Asish Panda
>
> On Fri, Feb 26, 2016 at 9:50 PM, Asish Panda <asishrocks95 at gmail.com>
> wrote:
>
>> Hello Jean
>>
>> Thank you very much for your response and the issues. I will get my hand
>> dirty right away! :)
>>
>> Thank you
>> Asish Panda
>>
>> On Fri, Feb 26, 2016 at 8:37 PM, JR KING <jeanremi.king at gmail.com> wrote:
>>
>>> Hi Asish,
>>>
>>> Thanks for your interest!
>>>
>>> You can start with one of these easy PRs:
>>> https://github.com/mne-tools/mne-python/issues/2874
>>> https://github.com/mne-tools/mne-python/issues/2176
>>> https://github.com/mne-tools/mne-python/issues/2189 (probably needs a
>>> bit of discussion)
>>>
>>> Once you're there, I can suggest some more fun things that you could
>>> do to set up a cleaner framework for the decoding module.
>>>
>>> All the best,
>>>
>>> Jean-Rémi
>>>
>>> On 26 February 2016 at 09:57, Asish Panda <asishrocks95 at gmail.com>
>>> wrote:
>>>
>>>> Hello everyone,
>>>>
>>>> I am looking forward to participating in GSoC and I am interested in the
>>>> idea of improving the decoding module
>>>> <https://github.com/mne-tools/mne-python/wiki/GSOC-Ideas#3-improve-the-decoding-module>.
>>>> I have installed and set up the development environment and have been
>>>> trying to get familiar with various modules. However, being quite new to
>>>> MEG/EEG, I'm looking for some pointers on where to start, as well as the
>>>> prerequisites for working on the decoding module.
>>>> Lastly I apologize if I have been rude in any manner.
>>>>
>>>> Thank you
>>>> Asish Panda
>>>>