Friday, May 24, 2013

Processing a diffusion phantom with FSL

This is a how-to style post intended for FSL users. It may also be of interest to a broader diffusion imaging audience.

In diffusion imaging, a carefully constructed diffusion phantom may be used as a ground-truth reference to evaluate competing diffusion models and tractography algorithms. The phantom constructed for the 2009 MICCAI Fiber Cup challenge [1] is one such reference; it is a coronal-slice model of fibers that cross, touch and split in different configurations and with different curvatures. The phantom dataset is now publicly available.
[Figure: Fiber pathways in the diffusion phantom. Image credit: Fiber Cup paper [1]]

  I recently processed the raw phantom dataset to evaluate some algorithms. These are my notes on how to preprocess the data with FSL, up to and including bedpostx. bedpostx is an FSL program that estimates the distribution of diffusion parameters at each voxel; it generates all the files necessary to run the probabilistic tractography tool probtrackx.
  • The raw data for the phantom can be obtained from the Fiber Cup download page. There are 6 datasets (2 image resolutions, each acquired at 3 different b-values). The data is stored in 4D NIfTI format and has already been preprocessed to correct for eddy currents and movement, so one can run bedpostx directly.
  • In order to run bedpostx, you need the bvals and bvecs gradient files, as well as the input data file and the nodif_brain_mask.
    • data.nii is the 4D phantom image file.
    • bvals is an ASCII text file of b-values corresponding to the gradient directions; the b-values are listed as a single row.
    • bvecs are the gradient vectors; the file format has 3 rows which correspond to the x-, y- and z-values of the gradient directions.
    • nodif_brain_mask.nii.gz is a 3D binary mask file; the file is generated using the FSL bet (or bet2) command.
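Before running bedpostx it is worth sanity-checking the two gradient text files: bvals should be a single row of N values, and bvecs three rows of N values each, where N is the number of volumes in data.nii (130 for this phantom). A minimal sketch in Python; the function name is mine:

```python
# Sanity check for FSL gradient files: bvals must be a single row of
# N b-values, and bvecs three rows (x, y, z) of N values each, where
# N is the number of volumes in the 4D data file (130 here).

def check_gradient_files(bvals_text, bvecs_text, n_volumes):
    """Return True if the files match the layout FSL expects."""
    bvals = bvals_text.split()
    bvec_rows = [row.split() for row in bvecs_text.strip().splitlines()]
    return (len(bvals) == n_volumes
            and len(bvec_rows) == 3
            and all(len(row) == n_volumes for row in bvec_rows))

# Toy example with 3 volumes (one b0 baseline plus two gradients):
bvals = "0\t1500\t1500"
bvecs = "0 1 0\n0 0 1\n0 0 0"
print(check_gradient_files(bvals, bvecs, 3))  # True
```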

Sample commands for the dwi-b1500.nii dataset (3x3x3 mm resolution, b=1500):
  1. Download the 3x3x3 dwi-b1500.nii file from the Fiber Cup download page and rename it data.nii.
  2. Download the diffusion_directions_corrected.txt file. This is a file of gradient directions with 130 entries, one x y z triplet per line. The first 65 lines (baseline + 64 directions) are repeated (65 x 2 = 130) since two passes were acquired, most probably to average over noise.
    This file, which I have renamed bvecs.tmp, needs to be reformatted into the bvecs file so that the x-, y- and z-columns become rows (see step 5 below for sample code).
  3. To generate the bvals file, note that there should be 130 values, since the number of entries must match the number of gradient volumes. The first value is 0 (for the b0 baseline); the next 64 values are 1500 (for b=1500); the final 65 values repeat the first 65. This is the unix command-line perl code I used to create this file:
    perl -e 'print "0\t", "1500\t" x 64, "0\t", "1500\t" x 64' > bvals
  4. The nodif_brain_mask file is generated using the following BET (brain extraction tool) command:
    bet2 data.nii nodif_brain_mask.nii -m -n -f 0.15
    The -m flag generates the binary mask, -n suppresses the segmented brain image output, and -f sets the fractional intensity threshold (smaller values give larger mask estimates).
  5. Run dtifit, an FSL command that fits a diffusion tensor model at each voxel.
    dtifit -k data.nii -m nodif_brain_mask.nii.gz -o out -r bvecs -b bvals
    This extra step, suggested by my colleague Julio Duarte, can be used to check the directions in the bvecs file: view the output file out_V1.nii.gz in FSLView and verify that the primary fiber direction at each voxel aligns with the phantom's fiber pathways.




    [Figure: dtifit out_V1 vectors overlaid on the phantom] Here we see that the horizontal x-direction is aligned with the phantom, but the vertical y-direction needs to be replaced by -y.
    To correct this I used the following command-line perl code to regenerate the bvecs file (the third command writes the y-row with its sign flipped):
    perl -ane 'print "$F[0] "' bvecs.tmp > bvecs
    perl -e 'print "\n"' >> bvecs
    perl -ane 'print $F[1]*-1," " ' bvecs.tmp >> bvecs
    perl -e 'print "\n"' >> bvecs
    perl -ane 'print "$F[2] "' bvecs.tmp >> bvecs
    perl -e 'print "\n"' >> bvecs

    If we re-run dtifit, we will see that the primary vectors align with the phantom's fiber pathways.

  6. Run bedpostx using the directory where the data, bvals, bvecs and nodif_brain_mask files are located:
    bedpostx ./ 
  7. As an optional final check, compare the bedpostx vector output (dyads1.nii.gz) to the dtifit output (out_V1.nii.gz).


[Figure] The bedpostx dyads1 output (blue arrows) is overlaid on the dtifit out_V1 vectors (red lines); we can see that they align with each other.

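The gradient-file wrangling in steps 2, 3 and 5 above (writing the row of b-values, transposing the direction table into three rows, and flipping the sign of y) can also be done with one short script instead of the perl one-liners. A minimal Python sketch; the function names are mine, and it assumes the direction file holds one whitespace-separated x y z triplet per line:

```python
# Build FSL-style bvals and bvecs text from the phantom's gradient table.
# bvals: one row of b-values; two passes of (1 baseline + 64 directions).
# bvecs: three rows (x, y, z), with the y-row negated to match the
# phantom's orientation (the sign flip found in step 5).

def make_bvals(n_dirs=64, b=1500, n_passes=2):
    one_pass = [0] + [b] * n_dirs          # baseline followed by the shell
    return "\t".join(str(v) for v in one_pass * n_passes)

def make_bvecs(direction_lines, flip_y=True):
    # Transpose rows of "x y z" into three rows: all x, all y, all z.
    x, y, z = zip(*(line.split() for line in direction_lines))
    if flip_y:
        y = tuple(str(-float(v)) for v in y)  # note: 0 is written as -0.0
    return "\n".join(" ".join(row) for row in (x, y, z))

# Toy usage with two directions; with the real data you would read
# diffusion_directions_corrected.txt and write the results to bvals/bvecs.
print(make_bvals(n_dirs=2, b=1500, n_passes=2))
print(make_bvecs(["1 0 0", "0 1 0"]))
```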
Reference
[1] Fillard P, Descoteaux M, Goh A, Gouttard S, Jeurissen B, Malcolm J, Ramirez-Manzanares A, Reisert M, Sakaie K, Tensaouti F, Yo T, Mangin JF, Poupon C. Quantitative evaluation of 10 tractography algorithms on a realistic diffusion MR phantom. NeuroImage 2011; 56(1):220-234.

Some resources
Takuya Hayashi has a program to swap the bvecs values. (Note: I have not verified that it works.)

2 comments:

Thijs said...

Hi Meena,

Concerning the changing of sign to correct the gradient directions: based on the 2D view you gave of the uncorrected phantom, you may just as well conclude that y is OK and that x needs to be replaced by -x. In short, this is because the diffusion measurements are symmetric in the angular domain; i.e., if the data were only 2D, then the orientations (x,y) = (-x,-y), and thus also (x,-y) = (-x,y).
However, don't forget that the Fiber Cup phantom, even though it tries to present a 2D situation, is actually still a 3D object that has been scanned. So, in order to obtain a truly correct gradient table, you should also consider whether the z direction is correct in relation to x and y.
Finally, you may also be interested to know that the Fiber Cup phantom has started a "second life" in the context of the Tractometer approach (an attempt at validating tractography pipelines in the context of connectomics): http://scil.dinf.usherbrooke.ca/tractometer .

Kind regards,
Thijs

Meena said...

Thijs, thanks. The Tractometer evaluation at SCIL is interesting. I have also (somewhat painstakingly) generated some ground truth data from the diffusion phantom but for an entirely different purpose. I plan to make it public at some point. So that's a "third life" if you will.