[Homer-users] Data export / GLM analysis?

katarina begus katarina.begus at gmail.com
Mon Aug 4 09:30:48 EDT 2014

Dear Homer2 users,

I was wondering if you could help us with a few questions about using data
processed in Homer2.

So far we have been running GLM analyses on our data using a combination of
custom Matlab scripts and the SPM-NIRS toolbox, but we would now like to
start using Homer2. Are there any scripts available that run a GLM on data
preprocessed in Homer2?
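
For context, the core of what our current scripts do on the concentration
data is roughly the following (the single-gamma HRF and all variable names
are just illustrative, and we are not assuming this is how a Homer2 GLM
would be set up):

% Load a processed .nirs file (a Matlab .mat file under the hood)
nirs  = load('subject01.nirs', '-mat');        % placeholder filename
dc    = nirs.procResult.dc;                    % time x [HbO HHb total] x channel
fs    = 1 / mean(diff(nirs.t));                % sampling rate from the time vector
nT    = size(dc, 1);
nCh   = size(dc, 3);
nCond = size(nirs.s, 2);                       % one column of s per condition (we think)

% Simple single-gamma HRF peaking around 5 s (illustrative, not SPM's canonical HRF)
tH  = (0:1/fs:30)';
hrf = (tH.^5) .* exp(-tH);
hrf = hrf / max(hrf);

% Design matrix: onset impulses per condition convolved with the HRF
% (a 24 s boxcar instead of impulses would give the block-design variant)
X = zeros(nT, nCond);
for c = 1:nCond
    u = double(nirs.s(:, c) == 1);             % 1 at stimulus onset samples
    r = conv(u, hrf);
    X(:, c) = r(1:nT);
end
X = [X ones(nT, 1)];                           % constant term

% Ordinary least squares fit of each channel's HbO time course
beta = zeros(size(X, 2), nCh);
for ch = 1:nCh
    y = squeeze(dc(:, 1, ch));                 % assuming HbO is the first chromophore
    beta(:, ch) = X \ y;
end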

If not, we were thinking of extracting the data after processing and
converting it back to the format our current scripts expect. Looking into
that, however, we got stuck on how the processed data is organised. We have
4 conditions and 40 channels, and each trial lasts 24 s, which at our 10 Hz
sampling rate should be 240 samples. The numbers we find in the procResult
structures don't seem to match that (see the sketch after the two points
below):

- In the .nirs files, procResult.dc has dimensions (3259 x 3 x 40). I
assume 3 = HbO / HHb / total Hb and 40 = channels. But what would 3259 be,
and how are the different conditions coded in this file?

- In groupResults.mat, group.subjs(1,1).procResult.dcAvg has dimensions
(508 x 3 x 40 x 4). Again, I assume 3 = HbO / HHb / total Hb, 40 =
channels, and 4 = conditions. But what is 508? In the processing stream we
specified block averaging from -5 to 30 s around the event, which at 10 Hz
would make 350 samples, not 508.
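
For reference, this is roughly how we have been inspecting the files to
work out the dimensions (the field names are as we understand the Homer2
.nirs format, and the filenames are placeholders):

% Load a processed .nirs file (it is a Matlab .mat file under the hood)
nirs = load('subject01.nirs', '-mat');

fs = 1 / mean(diff(nirs.t));           % sampling rate from the time vector
size(nirs.procResult.dc)               % 3259 x 3 x 40: all samples in the run?
size(nirs.s)                           % stim matrix, one column per condition,
                                       % 1 at onset samples; is this where the
                                       % conditions are coded?

% Group results
load('groupResults.mat');              % loads the 'group' variable
dcAvg = group.subjs(1,1).procResult.dcAvg;
size(dcAvg)                            % 508 x 3 x 40 x 4

% If tHRF is saved alongside dcAvg, its range and length should explain the 508
tHRF = group.subjs(1,1).procResult.tHRF;
[tHRF(1) tHRF(end) numel(tHRF)]        % block-average window and sample count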

We would really appreciate it if anyone could help!

Many thanks,
Katarina

-- 

Katarina Begus
PhD Student, Research Assistant
Centre for Brain and Cognitive Development
Birkbeck, University of London
Office Tel: +44(0) 20 7079 0766
Website: http://www.cbcd.bbk.ac.uk/people/students/katarina_begus