Beauchamp:CorticalSurfaceHCP

Beauchamp Lab

Introduction

Glasser et al. (Nature, 2016) used the HCP dataset (including resting-state functional connectivity, T1 and T2 images) to parcellate the cerebral cortex into 180 areas per hemisphere, creating an atlas known as the HCP MMP (multimodal parcellation) 1.0 atlas. This page describes one technique for using this atlas in AFNI/SUMA. As an overview: a cortical surface model is created for each subject and registered to the FreeSurfer template (fsaverage). Benson et al. (https://www.biorxiv.org/content/early/2018/04/25/308247) created an HCP-aligned fsaverage surface. Because all fsaverage-aligned surfaces are in nodewise correspondence, the Benson et al. label file can be applied to any individual's surface. In the example below, this is illustrated with the Colin N27 brain, a single-subject dataset that is a common reference. While all alignment and labeling is done on the surface, it is often desirable to apply the resulting labels to EPI data such as task-based or resting-state fMRI. This can be done either by mapping the EPI data to the surface, or by mapping the labels from the surface back into the volume; steps for the latter method are provided below. Coalson et al. (https://www.biorxiv.org/content/early/2018/04/23/255620) extensively document the inaccuracies inherent in volume-based intersubject alignment and its inferiority to the surface-based methods described on this page.

Processing Steps

Create standardized cortical surface models (std.141) for your subject(s). See Cortical Surface models overview for details. Create or obtain a copy of the HCP Atlas in standard space. Here are instructions from Kate Mills:

 https://figshare.com/articles/HCP-MMP1_0_projected_on_fsaverage/3498446

Copy the HCP atlas (converted into the std.141 FreeSurfer template brain space) into the subject directory. There are two files, one per hemisphere. For instance, to see the labels on the N27 standard brain:

 cd /Volumes/data/BCM/N27/suma_MNI_N27
 cp /Volumes/data/scripts/std.141.?h.HCP.annot.niml.dset .

Start SUMA and load the annotation files. Make sure you load the std.141 surfaces, or the labels will be incorrect.

 suma -spec std.141.MNI_N27_both.spec &
 DriveSuma -com surf_cont -load_dset std.141.lh.HCP.annot.niml.dset -surf_label lh.smoothwm.gii  -view_surf_cont y -switch_cmap ROI_i256

Different values/colors on the surface correspond to each atlas label.

You will probably want to do the same for each of your own subjects. Once surface reconstruction has completed and you have run @SUMA_Make_Spec* so that each subject has its own std.141 mesh, you can follow the steps above, substituting your subject's directory and name for MNI_N27.
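To apply the same loading steps across many subjects, the per-subject commands can be generated with a small loop. Here is a hypothetical batch sketch in bash (the subject IDs and atlas path are placeholders, not part of the protocol above): it only prints the commands for review; pipe its output to a shell, or remove the echo layer, to actually run them.

```shell
#!/bin/bash
# Dry run: print the atlas-loading commands for each subject.
# ATLAS_DIR and the subject IDs are hypothetical placeholders.
ATLAS_DIR=/Volumes/data/scripts

print_atlas_cmds () {
  local sid
  for sid in "$@"; do
    echo "cd \$SUBJECTS_DIR/$sid/SUMA"
    echo "cp $ATLAS_DIR/std.141.lh.HCP.annot.niml.dset ."
    echo "cp $ATLAS_DIR/std.141.rh.HCP.annot.niml.dset ."
    echo "suma -spec std.141.${sid}_both.spec &"
    echo "DriveSuma -com surf_cont -load_dset std.141.lh.HCP.annot.niml.dset -surf_label lh.smoothwm.gii -view_surf_cont y -switch_cmap ROI_i256"
  done
}

print_atlas_cmds sub01 sub02   # hypothetical subject IDs
```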

Volume Processing Steps

It can also be useful to have the atlas labels in the volume, since this is the native space of the MRI data. This can be done with the AFNI program 3dSurf2Vol or the newer @surf_to_vol_spackle (you may need to update your AFNI installation). Here is a sample command line:

 foreach hemi (lh rh)
   3dSurf2Vol -spec std.141.MNI_N27_${hemi}.spec -surf_A smoothwm -surf_B pial -grid_parent T1.nii \
     -sdata std.141.${hemi}.HCP.annot.niml.dset -map_func mode -f_steps 10 \
     -prefix HCP.volume_${hemi}.nii -sv T1.nii
 end

Alternatively, use @surf_to_vol_spackle with ?h.ribbon.nii, which is located in the SUMA folder:

 foreach hemi (lh rh)
   @surf_to_vol_spackle -spec std.141.MNI_N27_${hemi}.spec -surfA smoothwm -surfB pial \
     -maskset ${hemi}.ribbon.nii -surfset std.141.${hemi}.HCP.annot.niml.dset \
     -mode -prefix HCP.volume_${hemi}.nii
 end

Now combine the two hemispheres with 3dcalc's max() expression (so nothing is summed along the midline), then use 3drefit to attach the label table and default to the ROI colormap.

 3dcalc -datum byte -prefix HCP.volume_both.nii -a HCP.volume_lh.nii -b HCP.volume_rh.nii -expr "max(a,b)"
 3drefit -cmap INT_CMAP HCP.volume_both.nii 
 3drefit -labeltable /Volumes/data/BCM/HCP_Atlas/HCP.niml.lt HCP.volume_both.nii 

The volume dataset has the HCP label at each location.

Automated Viewing

To make viewing easier, DriveSuma can be used to automate loading of the files.

set h = /Volumes/data/scripts
set p = `pwd` 
afni -R -niml &
suma -spec std.141.MNI_N27_both.spec -sv T1.nii &
DriveSuma -com surf_cont -load_dset std.141.lh.sulc.niml.dset -surf_label lh.smoothwm.gii -view_surf_cont y -load_cmap ${h}/nice.1D.cmap -Dim 0.6
DriveSuma -com surf_cont -load_dset std.141.rh.sulc.niml.dset -surf_label rh.smoothwm.gii -view_surf_cont y -switch_cmap nice.1D -Dim 0.6
DriveSuma -com viewer_cont -key b  -1_only n
DriveSuma -com viewer_cont -key F3
DriveSuma -com viewer_cont -load_view ${h}/nice.niml.vvs -com surf_cont -switch_surf lh.inf_200.gii
DriveSuma -com viewer_cont -key t
DriveSuma -com surf_cont -1_only n
plugout_drive -com 'SWITCH_OVERLAY HCP.volume_both.nii' -com 'SEE_OVERLAY +' -quit

Right-click on the AFNI color bar and choose the "ROI_i256" color bar for optimal viewing.

Resampling to functional space

For a different example, suppose you want to plot some functional data from one of these regions specifically. We'll assume you've run FreeSurfer recon-all, @SUMA_Make_Spec_FS, and your afni_proc.py script (or whatever else you used to generate functional data).

As above, you need to sample the atlas to your own subject.

 cp /Volumes/data/scripts/std.141.?h.HCP.annot.niml.dset .
 

Now, you either need to sample your functional data to the surface, or sample the atlas to your functional volume. In the first example, we'll move the functional data to the std.141 mesh on the surface (because that's how the atlas is defined). This example converts all volumes of the stats file. All of the following needs to be done for both hemispheres.

 # Sample functional data to surface
 3dVol2Surf \
     -spec         $SUBJECTS_DIR/$sid/SUMA/std.141.${sid}_${hemi}.spec \
     -surf_A       smoothwm \
     -sv           $SUBJECTS_DIR/$sid/SUMA/${sid}_SurfVol.nii \
     -grid_parent  stats.${sid}+orig \
     -map_func     mask \
     -out_niml     stats.${sid}.${hemi}.niml.dset

Now your stats output has been resampled to the std.141 mesh. Since the atlas is already in this space, you just need to define the mask from the atlas with something like:

 3dcalc -a std.141.${hemi}.HCP.annot.niml.dset \
    -expr 'amongst(a,104,124,175,174,24,173,28,25)' \
    -prefix STG_mask.${hemi}.niml.dset
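The ROI indices passed to amongst() come from the atlas's label table; 3dcalc's amongst(a,v1,...,vn) evaluates to 1 wherever the label a equals any of the listed values, and 0 otherwise. A tiny bash re-implementation (for illustration only, not part of the pipeline) makes the per-voxel logic explicit:

```shell
# Illustration of 3dcalc's amongst(): print 1 if the first argument
# matches any of the remaining values, else 0 -- the per-voxel mask rule.
amongst () {
  local a=$1 v
  shift
  for v in "$@"; do
    if [ "$a" -eq "$v" ]; then echo 1; return; fi
  done
  echo 0
}

amongst 124 104 124 175 174   # prints 1: label 124 is in the ROI list
amongst 99 104 124 175 174    # prints 0: label 99 is not
```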

And take the average of your stats file within the mask.

 3dmaskave -q -mask STG_mask.${hemi}.niml.dset stats.${sid}.${hemi}.niml.dset > stats.${hemi}.out.txt

Your output file now contains the average within the mask for each sub-brick of your statistics, in the order the sub-bricks appear in stats.${sid}+orig. To find that ordering, run

 3dinfo -verb stats.${sid}+orig
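Since 3dmaskave writes one average per line in sub-brick order, you can pull out a single statistic by line number. A small sketch (the file contents and sub-brick index here are made up for illustration):

```shell
# Toy stand-in for a 3dmaskave output file: one value per sub-brick.
printf '0.11\n0.22\n0.33\n0.44\n2.50\n' > stats.lh.out.txt

# Suppose 3dinfo showed the coefficient of interest at sub-brick #4
# (0-based); that is line 5 of the file.
idx=4
awk -v n=$((idx + 1)) 'NR == n' stats.lh.out.txt   # prints 2.50
```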


As an alternative to resampling your statistics to the surface, you can sample the atlas from the surface into the volume. Most of this procedure was described above, but for clarity, it would look something like:

 cp /Volumes/data/scripts/std.141.?h.HCP.annot.niml.dset .
 @surf_to_vol_spackle -spec $SUBJECTS_DIR/$sid/SUMA/std.141.${sid}_${hemi}.spec \
    -surfA smoothwm -surfB pial -maskset $SUBJECTS_DIR/$sid/SUMA/${hemi}.ribbon.nii \
    -surfset std.141.${hemi}.HCP.annot.niml.dset -mode -prefix HCP.volume_${hemi}.nii
 3dresample -master stats.${sid}+orig -inset HCP.volume_${hemi}.nii \
    -prefix HCP.volume.res.${hemi}.nii
 3dcalc -a HCP.volume.res.${hemi}.nii -expr 'amongst(a,104,124,175,174,24,173,28,25)' \
    -prefix STG_mask.${hemi}.nii
 3dmaskave -q -mask STG_mask.${hemi}.nii stats.${sid}+orig > stats.${hemi}.out.txt
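As noted earlier, these steps must be run for both hemispheres. A dry-run wrapper in bash can generate the full command list for review before anything is executed; this is a hypothetical sketch (sub01 is a placeholder subject ID, and long argument lists are abbreviated with "..."):

```shell
#!/bin/bash
# Dry run: print the atlas-to-volume commands for both hemispheres.
# Echoes only; review the output, then run the real commands as above.
volume_pipeline () {
  local sid=$1 hemi
  for hemi in lh rh; do
    echo "@surf_to_vol_spackle ... -prefix HCP.volume_${hemi}.nii"
    echo "3dresample -master stats.${sid}+orig -inset HCP.volume_${hemi}.nii -prefix HCP.volume.res.${hemi}.nii"
    echo "3dcalc -a HCP.volume.res.${hemi}.nii -expr '...' -prefix STG_mask.${hemi}.nii"
    echo "3dmaskave -q -mask STG_mask.${hemi}.nii stats.${sid}+orig > stats.${hemi}.out.txt"
  done
}

volume_pipeline sub01   # hypothetical subject ID
```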