Seed-based d mapping
Seed-based d mapping (formerly signed differential mapping), or SDM, is a statistical technique created by Joaquim Radua for meta-analyzing studies on differences in brain activity or structure that used neuroimaging techniques such as fMRI, VBM, DTI or PET. It may also refer to a specific piece of software created by the SDM Project to carry out such meta-analyses.


The seed-based d mapping approach


Overview of the method

SDM adopted and combined various positive features from previous methods, such as ALE or MKDA, and introduced a series of improvements and novel features. One of the new features, introduced to avoid positive and negative findings in the same voxel as seen in previous methods, was the representation of both positive and negative differences in the same map, thus obtaining a signed differential map (SDM). Another relevant feature, introduced in version 2.11, was the use of effect sizes (leading to effect-size SDM or 'ES-SDM'), which allows the combination of reported peak coordinates with statistical parametric maps, thus allowing more exhaustive and accurate meta-analyses.

The method has three steps. First, coordinates of cluster peaks (e.g. the voxels where the differences between patients and healthy controls were highest), and statistical maps if available, are selected according to SDM inclusion criteria. Second, coordinates are used to recreate statistical maps, and effect-size maps and their variances are derived from t-statistics (or, equivalently, from p-values or z-scores). Finally, individual study maps are meta-analyzed using different tests to complement the main outcome with sensitivity and heterogeneity analyses.
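To make the second step concrete, the following sketch shows a standard meta-analytic conversion of a reported peak t-statistic into an effect size (Hedges' g) and its variance for a two-group comparison. These are generic textbook formulas, not necessarily the exact computation implemented in the SDM software; the function name and example numbers are illustrative.

```python
import math

def t_to_effect_size(t, n1, n2):
    """Convert a two-sample t-statistic into Hedges' g and its variance.

    n1, n2: sample sizes of the two groups (e.g. patients and controls).
    Generic meta-analytic conversion; SDM's internals may differ.
    """
    df = n1 + n2 - 2
    d = t * math.sqrt(1 / n1 + 1 / n2)   # Cohen's d from the t-statistic
    j = 1 - 3 / (4 * df - 1)             # small-sample (Hedges) correction
    g = j * d
    # Approximate sampling variance of g
    var_g = (n1 + n2) / (n1 * n2) + g ** 2 / (2 * (n1 + n2))
    return g, var_g

# Example: a peak with t = 4.2 reported for 25 patients vs 30 controls
g, var_g = t_to_effect_size(4.2, 25, 30)
print(f"g = {g:.3f}, variance = {var_g:.4f}")
```

An analogous conversion applies when studies report p-values or z-scores, since these can be transformed to t-statistics first.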


Inclusion criteria

It is not uncommon in neuroimaging studies that some regions (e.g. a priori regions of interest) are more liberally thresholded than the rest of the brain. However, a meta-analysis of studies with such intra-study regional differences in thresholds would be biased towards these regions, as they are more likely to be reported just because the authors applied more liberal thresholds to them. To overcome this issue, SDM introduced a criterion in the selection of the coordinates: while different studies may employ different thresholds, the same threshold must have been used throughout the whole brain within each included study.


Pre-processing of studies

After conversion of statistical parametric maps and peak coordinates to Talairach space, an SDM map is created for each study within a specific gray- or white-matter template. Pre-processing of statistical parametric maps is straightforward, while pre-processing of reported peak coordinates requires recreating the clusters of difference by means of an un-normalized Gaussian kernel, so that voxels closer to the peak coordinate have higher values. A rather large full width at half maximum (FWHM) of 20 mm is used to account for different sources of spatial error, e.g. coregistration mismatch between studies, the size of the cluster, or the location of the peak within the cluster. Within a study, values at voxels covered by several close Gaussian kernels are combined by square-distance-weighted averaging rather than simply summed. A toy recreation of this step is sketched below.
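The sketch below recreates a study map from reported peaks with an un-normalized Gaussian kernel of 20 mm FWHM on a toy voxel grid. The grid size, voxel size, and the exact weighting scheme are illustrative assumptions, not the SDM implementation.

```python
import numpy as np

FWHM_MM = 20.0
SIGMA_MM = FWHM_MM / (2 * np.sqrt(2 * np.log(2)))  # FWHM -> standard deviation

def recreate_peak_map(shape, voxel_mm, peaks):
    """Recreate a study map from reported peaks.

    shape: grid dimensions in voxels (toy stand-in for a brain template).
    peaks: list of (voxel_coordinate, effect_size) pairs.
    Each peak spreads its value with an *un-normalized* Gaussian, so the
    peak voxel keeps the full effect size and nearby voxels get less.
    """
    coords = np.indices(shape).reshape(len(shape), -1).T * voxel_mm  # in mm
    num = np.zeros(coords.shape[0])
    den = np.zeros(coords.shape[0])
    for center, effect in peaks:
        d2 = ((coords - np.asarray(center) * voxel_mm) ** 2).sum(axis=1)
        kernel = effect * np.exp(-d2 / (2 * SIGMA_MM ** 2))
        # Square-distance weighting; the small offset avoids division
        # by zero at the peak voxel itself.
        w = 1.0 / (d2 + voxel_mm ** 2)
        num += w * kernel
        den += w
    return (num / den).reshape(shape)

# Two nearby peaks whose kernels overlap and are averaged, not summed
study_map = recreate_peak_map((40, 48, 40), 2.0,
                              [((20, 24, 20), 0.8), ((24, 24, 20), 0.5)])
```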


Statistical comparisons

SDM provides several different statistical analyses in order to complement the main outcome with sensitivity and heterogeneity analyses.

* The main statistical analysis is the mean analysis, which consists in calculating the mean of the voxel values in the different studies. This mean is weighted by the inverse of the variance and accounts for inter-study heterogeneity (QH maps). A sketch of this analysis appears after this list.
* Subgroup analyses are mean analyses applied to groups of studies to allow the study of heterogeneity.
* Linear model analyses (e.g. meta-regression) are a generalization of the mean analysis to allow comparisons between groups and the study of possible confounds (Radua et al., 2010). A low variability of the regressor is critical in meta-regressions, so they are recommended to be understood as exploratory and to be more conservatively thresholded.
* Jack-knife analysis consists in repeating a test as many times as studies have been included, discarding one different study each time, i.e. removing one study and repeating the analysis, then putting that study back, removing another study, and repeating the analysis, and so on (also sketched after this list). The idea is that if a significant brain region remains significant in all or most of the combinations of studies, the finding is highly replicable.

The statistical significance of the analyses is checked by standard randomization tests. It is recommended to use uncorrected p-values of 0.005, as this significance has been found in this method to be approximately equivalent to a corrected p-value of 0.05. A false discovery rate (FDR) of 0.05 has been found in this method to be too conservative. Values at a Talairach label or coordinate can also be extracted for further processing or graphical presentation.
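As a sketch of the mean and jack-knife analyses, the following code computes a voxel-wise inverse-variance-weighted random-effects mean (with a DerSimonian-Laird estimate of between-study variance) and repeats it leaving one study out each time. This is a generic meta-analytic model standing in for SDM's own weighting; array shapes and function names are assumptions.

```python
import numpy as np

def weighted_mean_map(effects, variances):
    """Voxel-wise random-effects mean of per-study effect-size maps.

    effects, variances: arrays of shape (n_studies, n_voxels).
    Returns the weighted mean, its variance, and the heterogeneity
    statistic Q at every voxel.
    """
    w = 1.0 / variances                              # fixed-effect weights
    mean_fe = (w * effects).sum(0) / w.sum(0)
    q = (w * (effects - mean_fe) ** 2).sum(0)        # heterogeneity Q
    df = effects.shape[0] - 1
    c = w.sum(0) - (w ** 2).sum(0) / w.sum(0)
    tau2 = np.maximum(0.0, (q - df) / c)             # between-study variance
    w_re = 1.0 / (variances + tau2)                  # random-effects weights
    mean_re = (w_re * effects).sum(0) / w_re.sum(0)
    return mean_re, 1.0 / w_re.sum(0), q

def jackknife_maps(effects, variances):
    """Repeat the mean analysis once per study, leaving that study out."""
    n = effects.shape[0]
    idx = np.arange(n)
    return [weighted_mean_map(effects[idx != i], variances[idx != i])[0]
            for i in range(n)]
```

A region whose mean remains significant in all or most of the leave-one-out maps would, in the sense described above, be considered highly replicable.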


SDM software

SDM is software written by the SDM Project to aid the meta-analysis of voxel-based neuroimaging data. It is distributed as freeware, including a graphical interface and a menu/command-line console. It can also be integrated as an SPM extension.


References

Radua J, van den Heuvel OA, Surguladze S, Mataix-Cols D (5 July 2010). "Meta-analytical comparison of voxel-based morphometry studies in obsessive-compulsive disorder vs other anxiety disorders". Archives of General Psychiatry. 67(7): 701–711. doi:10.1001/archgenpsychiatry.2010.70. PMID 20603451.

External links


SDM software and documentation from the SDM Project.