Results 1–10 of 72
Voxel-based morphometry—The methods
NeuroImage, 2000
Abstract

Cited by 273 (4 self)
At its simplest, voxel-based morphometry (VBM) involves a voxel-wise comparison of the local concentration of gray matter between two groups of subjects. The procedure is relatively straightforward and involves spatially normalizing high-resolution images from all the subjects in the study into the same stereotactic space. This is followed by segmenting the gray matter from the spatially normalized images and smoothing the gray-matter segments. Voxel-wise parametric statistical tests comparing the smoothed gray-matter images from the two groups are then performed. Corrections for multiple comparisons are made using the theory of Gaussian random fields. This paper describes the steps involved in VBM, with particular emphasis on segmenting gray matter from MR images with non-uniformity artifact. We provide evaluations of the assumptions that underpin the method, including the accuracy of the segmentation and the assumptions made about the statistical distribution of the data. © 2000 Academic Press
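The normalize-segment-smooth-test pipeline in this abstract can be sketched on simulated data. A minimal illustration, with made-up image sizes and effect sizes, and with a plain Bonferroni correction standing in for the Gaussian random field correction the paper actually uses:

```python
import numpy as np
from scipy import ndimage, stats

rng = np.random.default_rng(0)

# Hypothetical gray-matter concentration maps: 10 subjects per group on a
# small 8x8x8 voxel grid (real VBM images are far larger).
group_a = rng.normal(0.5, 0.1, size=(10, 8, 8, 8))
group_b = rng.normal(0.5, 0.1, size=(10, 8, 8, 8))
group_b[:, 3:6, 3:6, 3:6] += 0.3  # simulated focal group difference

# Smooth each subject's gray-matter segment (FWHM converted to sigma).
fwhm_voxels = 2.0
sigma = fwhm_voxels / np.sqrt(8 * np.log(2))
group_a = np.stack([ndimage.gaussian_filter(s, sigma) for s in group_a])
group_b = np.stack([ndimage.gaussian_filter(s, sigma) for s in group_b])

# Voxel-wise two-sample t-test between the smoothed groups.
t, p = stats.ttest_ind(group_a, group_b, axis=0)

# Simple Bonferroni correction over all voxels; the paper instead uses
# Gaussian random field theory, which is less conservative for smooth images.
significant = p < 0.05 / p.size
print("significant voxels:", int(significant.sum()))
```

The simulated difference at the grid center survives even this conservative correction because smoothing pools signal over neighboring voxels.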
A voxel-based morphometric study of ageing in 465 normal adult human brains.
NeuroImage, 2001
Abstract

Cited by 267 (2 self)
Voxel-based morphometry (VBM) is a whole-brain, unbiased technique for characterizing regional cerebral volume and tissue concentration differences in structural magnetic resonance images. We describe an optimized method of VBM to examine the effects of age on grey and white matter and CSF in 465 normal adults. Global grey matter volume decreased linearly with age, with a significantly steeper decline in males. Local areas of accelerated loss were observed bilaterally in the insula, superior parietal gyri, central sulci, and cingulate sulci. Areas exhibiting little or no age effect (relative preservation) were noted in the amygdala, hippocampi, and entorhinal cortex. Global white matter did not decline with age, but local areas of relative accelerated loss and preservation were seen. There was no interaction of age with sex for regionally specific effects. These results corroborate previous reports and indicate that VBM is a useful technique for studying structural brain correlates of ageing through life in humans. © 2001 Academic Press

Key Words: ageing; normal; MRI; voxel-based morphometry.

INTRODUCTION
There is compelling evidence from post mortem and in vivo studies that the brain shrinks with age, but accurate quantification of the specific patterns of age-related atrophy has proved elusive. It is unclear whether there are predictable common patterns of ageing or whether individual human brains respond to the ageing process idiosyncratically. Post-mortem analysis of mammalian brains suggests that there may be a gradient of ageing from the association areas to the primary sensory regions, with the former showing the most prominent correlations between age and atrophy
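The "steeper decline in males" finding corresponds to a linear model with an age-by-sex interaction term. A sketch on simulated global grey-matter volumes; the sample size matches the study (465), but the slopes and noise level are invented, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated cohort: 465 adults with hypothetical ages and sexes.
n = 465
age = rng.uniform(18, 80, n)
male = rng.integers(0, 2, n).astype(float)
# Volume (litres) declines linearly with age, more steeply in males.
volume = 0.9 - 0.002 * age - 0.001 * age * male + rng.normal(0, 0.05, n)

# Design matrix: intercept, age, sex, age-by-sex interaction.
X = np.column_stack([np.ones(n), age, male, age * male])
beta, *_ = np.linalg.lstsq(X, volume, rcond=None)
print("age slope (females):", beta[1], "extra male decline:", beta[3])
```

A negative interaction coefficient (beta[3]) is what "significantly steeper decline in males" means in this design.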
Controlling the family-wise error rate in functional neuroimaging: a comparative review
Statistical Methods in Medical Research, 2003
Abstract

Cited by 173 (7 self)
Functional neuroimaging data embodies a massive multiple testing problem, where 100,000 correlated test statistics must be assessed. The family-wise error rate, the chance of any false positives, is the standard measure of Type I errors in multiple testing. In this paper we review and evaluate three approaches to thresholding images of test statistics: Bonferroni, random field, and the permutation test. Owing to recent developments, improved Bonferroni procedures, such as Hochberg's methods, are now applicable to dependent data. Continuous random field methods use the smoothness of the image to adapt to the severity of the multiple testing problem. Also, increased computing power has made both permutation and bootstrap methods applicable to functional neuroimaging. We evaluate these approaches on t images using simulations and a collection of real datasets. We find that Bonferroni-related tests offer little improvement over Bonferroni, while the permutation method offers substantial improvement over the random field method for low smoothness and low degrees of freedom. We also show the limitations of trying to find an equivalent number of independent tests for an image of correlated test statistics.
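Why permutation beats Bonferroni on correlated statistics can be shown with a max-statistic permutation test on simulated smooth null data. A toy sketch with invented group sizes and a 1D "image"; real analyses permute actual 3D t images:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Null data: two groups of 12 subjects (hypothetical sizes), 1000 test
# statistics made spatially correlated with a moving-average filter.
n_sub, n_vox = 24, 1000
labels = np.array([0] * 12 + [1] * 12)
data = rng.normal(size=(n_sub, n_vox))
data = np.apply_along_axis(
    lambda v: np.convolve(v, np.ones(15) / 15, "same"), 1, data)

def t_image(y, g):
    """Voxel-wise two-sample t statistics (Welch form)."""
    a, b = y[g == 0], y[g == 1]
    se = np.sqrt(a.var(0, ddof=1) / len(a) + b.var(0, ddof=1) / len(b))
    return (a.mean(0) - b.mean(0)) / se

# Permutation (max-statistic) threshold controlling the family-wise error
# rate: the 95th percentile of max |t| over relabelled group assignments.
max_t = [np.abs(t_image(data, rng.permutation(labels))).max()
         for _ in range(500)]
perm_thresh = np.quantile(max_t, 0.95)

# Bonferroni ignores the correlation between tests, so its threshold is
# higher (more conservative) for smooth images.
bonf_thresh = stats.t.ppf(1 - 0.025 / n_vox, df=n_sub - 2)
print(f"permutation {perm_thresh:.2f} vs Bonferroni {bonf_thresh:.2f}")
```

The permutation threshold adapts to the smoothness of the data automatically, which is exactly the regime (low smoothness, low degrees of freedom) where the review finds it outperforms random field methods.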
Deformation-Based Surface Morphometry Applied to Gray Matter Deformation
2003
Abstract

Cited by 80 (32 self)
We present a unified statistical approach to deformation-based morphometry applied to the cortical surface. The cerebral cortex has the topology of a highly convoluted 2D sheet. As the brain develops over time, the cortical surface area, thickness, curvature, and total gray matter volume change. It is highly likely that such age-related surface changes are not uniform. By measuring how such surface metrics change over time, the regions of most rapid structural change can be localized. We avoid surface flattening, which distorts the inherent geometry of the cortex, in our analysis; it is used only for visualization. To increase the signal-to-noise ratio, diffusion smoothing, which generalizes Gaussian kernel smoothing to an arbitrary curved cortical surface, has been developed and applied to the surface data. Afterwards, statistical inference on the cortical surface is performed via random field theory. As an illustration, we demonstrate how this new surface-based morphometry can be applied to localize cortical regions of gray matter growth and loss in brain images collected longitudinally from a group of children and adolescents.
The quantitative evaluation of functional neuroimaging experiments: The NPAIRS data analysis framework. NeuroImage 11: S592
2000
Abstract

Cited by 68 (17 self)
We introduce a data-analysis framework and performance metrics for evaluating and optimizing the interaction between activation tasks, experimental designs, and the methodological choices and tools for data acquisition, preprocessing, data analysis, and extraction of statistical parametric maps (SPMs). Our NPAIRS (nonparametric prediction, activation, influence, and reproducibility resampling) framework provides an alternative to simulations and ROC curves by using real PET and fMRI data sets to examine the relationship between prediction accuracy and the signal-to-noise ratios (SNRs) associated with reproducible SPMs. Using cross-validation resampling we plot training–test set predictions of the experimental design variables (e.g., brain-state labels) versus reproducibility
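The reproducibility half of the framework can be illustrated with a split-half resampling sketch: compute an activation map in each half of the subjects and correlate the two maps. This is a much-simplified toy on simulated data, not the published NPAIRS implementation, and all sizes and effect magnitudes are invented:

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated subject-level maps: 20 subjects, 500 voxels, a hypothetical
# activated region in the first 50 voxels, heavy per-subject noise.
n_sub, n_vox = 20, 500
true_map = np.zeros(n_vox)
true_map[:50] = 3.0
data = true_map + rng.normal(0, 2.0, size=(n_sub, n_vox))

# One split-half resample: independent mean maps from each half, then the
# voxel-wise correlation between them as a reproducibility SNR metric.
order = rng.permutation(n_sub)
half1, half2 = data[order[:10]], data[order[10:]]
map1, map2 = half1.mean(0), half2.mean(0)
reproducibility = np.corrcoef(map1, map2)[0, 1]
print(f"split-half reproducibility r = {reproducibility:.2f}")
```

NPAIRS repeats such splits many times and pairs this reproducibility axis with a prediction-accuracy axis (how well one half predicts the other half's brain-state labels), which this sketch omits.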
Mapping cortical change in Alzheimer’s disease, brain development, and schizophrenia
2004
A Framework For Computational Anatomy
2002
Abstract

Cited by 48 (16 self)
The rapid collection of brain images from healthy and diseased subjects has stimulated the development of powerful mathematical algorithms to compare, pool, and average brain data across whole populations. Brain structure is so complex and variable that new approaches in computer vision, partial differential equations, and statistical field theory are being formulated to detect and visualize disease-specific patterns. We present some novel mathematical strategies for computational anatomy, focusing on the creation of population-based brain atlases. These atlases describe how the brain varies with age, gender, genetics, and over time. We review applications in Alzheimer's disease, schizophrenia, and brain development, outlining some current challenges in the field.
Mapping anatomical correlations across cerebral cortex (MACACC) using cortical thickness from MRI. NeuroImage 31: 993–1003
2006
Smoothing and cluster thresholding for cortical surface-based group analysis of fMRI data.
NeuroImage, 2006
Abstract

Cited by 30 (0 self)
Cortical surface-based analysis of fMRI data has proven to be a useful method with several advantages over three-dimensional volumetric analyses. Many of the statistical methods used in 3D analyses can be adapted for use with surface-based analyses. Operating within the framework of the FreeSurfer software package, we have implemented a surface-based version of the cluster size exclusion method used for multiple comparisons correction. Furthermore, we have developed a new method for generating regions of interest on the cortical surface using a sliding threshold of cluster exclusion followed by cluster growth. Cluster size limits for multiple probability thresholds were estimated using random field theory (RFT) and validated with Monte Carlo simulation. A prerequisite of RFT or cluster size simulation is an estimate of the smoothness of the data. In order to estimate the intrinsic smoothness of group analysis statistics, independent of true activations, we conducted a group analysis of simulated noise data sets. Because smoothing on a cortical surface mesh is typically implemented using an iterative method, rather than by directly applying a Gaussian blurring kernel, it is also necessary to determine the width of the equivalent Gaussian blurring kernel as a function of smoothing steps. Iterative smoothing has previously been modeled as continuous heat diffusion, providing a theoretical basis for predicting the equivalent kernel width, but the predictions of the model were not empirically tested. We generated an empirical heat diffusion kernel width function by performing surface-based smoothing simulations and found a large disparity between the expected and actual kernel widths.
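The heat-diffusion model of iterative smoothing rests on the fact that kernel variances add under repeated convolution, so the equivalent Gaussian width grows with the square root of the number of iterations. A 1D sketch of that prediction on a regular grid (the paper's empirical test is on an irregular cortical surface mesh, where it found the prediction breaks down):

```python
import numpy as np

# One smoothing step: convolve with a small averaging kernel whose
# discrete variance is 0.25*1 + 0.5*0 + 0.25*1 = 0.5 grid units^2.
kernel = np.array([0.25, 0.5, 0.25])
n_steps = 100

# Smooth a unit impulse repeatedly; the result is the effective kernel.
signal = np.zeros(401)
signal[200] = 1.0
for _ in range(n_steps):
    signal = np.convolve(signal, kernel, mode="same")

# Empirical standard deviation of the effective smoothing kernel.
x = np.arange(signal.size)
mean = (x * signal).sum()
sigma_emp = np.sqrt(((x - mean) ** 2 * signal).sum())

# Heat-diffusion prediction: variances add, sigma^2 = 0.5 * n_steps.
sigma_theory = np.sqrt(0.5 * n_steps)
fwhm = np.sqrt(8 * np.log(2)) * sigma_emp
print(f"sigma empirical {sigma_emp:.3f}, predicted {sigma_theory:.3f}, "
      f"equivalent FWHM {fwhm:.2f} grid units")
```

On this regular grid the match is exact; the disparity the paper reports comes from the irregular geometry and iterative weighting schemes of real surface meshes.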
Mapping human brain function with MEG and EEG: methods and validation
NeuroImage, 2004
Abstract

Cited by 27 (1 self)
We survey the field of magnetoencephalography (MEG) and electroencephalography (EEG) source estimation. These modalities offer the potential for functional brain mapping with temporal resolution in the millisecond range. However, the limited number of spatial measurements and the ill-posedness of the inverse problem present significant limits to our ability to produce accurate spatial maps from these data without imposing major restrictions on the form of the inverse solution. Here we describe approaches to solving the forward problem of computing the mapping from putative inverse solutions into the data space. We then describe the inverse problem in terms of low-dimensional solutions, based on the equivalent current dipole (ECD), and high-dimensional solutions, in which images of neural activation are constrained to the cerebral cortex. We also address the issue of objective assessment of the relative performance of inverse procedures by the free-response receiver operating characteristic (FROC) curve. We conclude with a discussion of methods for assessing statistical significance of experimental results through use of the bootstrap for determining confidence regions in dipole-fitting methods, and random field (RF) and permutation methods for detecting significant activation in cortically constrained imaging studies.
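One standard high-dimensional (imaging) inverse of the kind this review covers is the regularized minimum-norm estimate. A toy sketch in which a random matrix stands in for a real lead field, and the sensor count, source count, and regularization value are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical linear forward model y = L x + noise: 64 sensors,
# 100 cortically constrained sources (a real lead field comes from
# solving the forward problem on a head model).
n_sensors, n_sources = 64, 100
L = rng.normal(size=(n_sensors, n_sources))
x_true = np.zeros(n_sources)
x_true[10] = 1.0  # a single active source
y = L @ x_true + rng.normal(0, 0.01, n_sensors)

# Minimum-norm inverse: x_hat = L^T (L L^T + lambda I)^{-1} y.
# lambda trades off data fit against solution energy (ill-posedness).
lam = 0.1
x_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_sensors), y)
print("estimated peak at source index", int(np.abs(x_hat).argmax()))
```

Because the problem is underdetermined (more sources than sensors), the estimate smears energy across sources rather than recovering the sparse truth; the low-dimensional ECD approach described in the review makes the opposite trade-off by fixing the number of dipoles in advance.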