SMCV
In statistics, the standardized mean of a contrast variable (SMCV or SMC) is a parameter assessing effect size. The SMCV is defined as the mean divided by the standard deviation of a contrast variable. The SMCV was first proposed for one-way ANOVA cases and was then extended to multi-factor ANOVA cases. Background Consistent interpretation of the strength of a group comparison, as represented by a contrast, is important. When only two groups are involved in a comparison, the SMCV is the same as the strictly standardized mean difference (SSMD). SSMD belongs to a popular type of effect-size measure called "standardized mean differences", which includes Cohen's d and Glass's \delta. In ANOVA, a similar parameter for measuring the strength of a group comparison is the standardized effect size (SES). One issue with SES is that its values are incomparable for contrasts with different coefficients; SMCV does not have this issue. Concept Suppose the random values in t group ...
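As a minimal sketch (not from the article itself), the population SMCV of a contrast over independent groups can be computed from the group means and standard deviations; the function name is hypothetical.

```python
import math

def smcv(coeffs, means, sds):
    """Population SMCV of the contrast variable V = sum(a_i * G_i):
    E[V] divided by sd(V), assuming the groups are independent."""
    num = sum(a * m for a, m in zip(coeffs, means))
    den = math.sqrt(sum(a * a * s * s for a, s in zip(coeffs, sds)))
    return num / den

# Contrast comparing group 1 with group 2 among three groups:
lam = smcv([1, -1, 0], means=[5.0, 3.0, 4.0], sds=[1.0, 1.0, 2.0])
```

With only two groups involved, this reduces to the SSMD, as the snippet notes.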


Dual-flashlight Plot
In statistics, a dual-flashlight plot is a type of scatter plot in which the standardized mean of a contrast variable (SMCV) is plotted against the mean of a contrast variable representing a comparison of interest. The most commonly used dual-flashlight plot is for the difference between two groups in high-throughput experiments, such as microarrays and high-throughput screening studies, in which the SSMD is plotted versus the average log fold-change on the ''y''- and ''x''-axes, respectively, for all genes or compounds (such as siRNAs or small molecules) investigated in an experiment. As a whole, the points in a dual-flashlight plot look like the beams of a flashlight with two heads, hence the name. With the dual-flashlight plot, we can see how the genes or compounds are distributed into each effect-size category, as shown in the figure, and at the same time see the average fold-change for each gene or compound. The dual-flashlight plot is similar to the volcano p ...
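A hedged sketch of the two coordinates plotted for each gene or compound, assuming per-group log-scale measurements are available and the two groups are independent; the function name is hypothetical.

```python
import statistics

def dual_flashlight_point(treat_logs, control_logs):
    """(x, y) coordinate for one gene/compound:
    x = average log fold-change (difference of mean log values),
    y = SSMD estimate (mean difference over the standard deviation
    of the difference, assuming independent groups)."""
    d = statistics.mean(treat_logs) - statistics.mean(control_logs)
    s = (statistics.variance(treat_logs) + statistics.variance(control_logs)) ** 0.5
    return d, d / s
```

Computing one such point per gene and scattering them yields the two-headed "flashlight" shape described above.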



C+-probability
In statistics, a c+-probability is the probability that a contrast variable takes a positive value. In terms of a replication probability, the c+-probability is defined as follows: if we take a random draw from each group (or factor level) and calculate the sampled value of the contrast variable based on those draws, then the c+-probability is the chance that the sampled value of the contrast variable is greater than 0 when the random drawing process is repeated infinitely many times. The c+-probability is a probabilistic index accounting for the distributions of the compared groups (or factor levels). The c+-probability and the SMCV are two characteristics of a contrast variable, and there is a link between them. Together, the SMCV and c+-probability provide a consistent interpretation of the strength of comparisons in contrast analysis. When only two groups are involved in a comparison, the c+-probability becomes the d+-probability, which is the probability that the difference of values from ...
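For the two-group case with independent normal groups, the d+-probability has a closed form: it is the standard normal CDF evaluated at the SSMD. A minimal sketch under that normality assumption (function names hypothetical):

```python
import math

def phi(x):
    """Standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def d_plus_probability(mu1, sigma1, mu2, sigma2):
    """P(X1 - X2 > 0) for independent normals X1, X2:
    the difference is normal, so this is Phi of the SSMD."""
    ssmd = (mu1 - mu2) / math.sqrt(sigma1 ** 2 + sigma2 ** 2)
    return phi(ssmd)

p = d_plus_probability(3.0, 1.0, 1.0, 1.0)  # SSMD = sqrt(2)
```

For identically distributed groups the probability is 1/2, as expected by symmetry.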


Strictly Standardized Mean Difference
In statistics, the strictly standardized mean difference (SSMD) is a measure of effect size. It is the mean divided by the standard deviation of the difference between two random values, each from one of two groups. It was initially proposed for quality control and hit selection in high-throughput screening (HTS) and has become a statistical parameter measuring effect size for the comparison of any two groups with random values. Background In high-throughput screening (HTS), quality control (QC) is critical. An important QC characteristic in an HTS assay is how much the positive controls, test compounds, and negative controls differ from one another. This QC characteristic can be evaluated by comparing two well types in HTS assays. The signal-to-noise ratio (S/N), signal-to-background ratio (S/B), and Z-factor have been adopted to evaluate the quality of HTS assays through the comparison of two investigated types of wells. However, the S/B does not take into account any ...
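A minimal sketch of the population SSMD from the definition above (mean over standard deviation of the difference of two random values); the optional correlation term is an assumption for the general case where the two groups are not independent, and the function name is hypothetical.

```python
import math

def ssmd(mu1, sigma1, mu2, sigma2, rho=0.0):
    """Population SSMD: (mu1 - mu2) divided by the standard deviation
    of the difference D = X1 - X2, where Var(D) = sigma1^2 + sigma2^2
    - 2*rho*sigma1*sigma2 (rho = 0 for independent groups)."""
    var_diff = sigma1 ** 2 + sigma2 ** 2 - 2.0 * rho * sigma1 * sigma2
    return (mu1 - mu2) / math.sqrt(var_diff)
```

In a QC setting, the two groups would be, for example, positive- and negative-control wells.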



Statistics
Statistics (from German: ''Statistik'', "description of a state, a country") is the discipline that concerns the collection, organization, analysis, interpretation, and presentation of data. In applying statistics to a scientific, industrial, or social problem, it is conventional to begin with a statistical population or a statistical model to be studied. Populations can be diverse groups of people or objects such as "all people living in a country" or "every atom composing a crystal". Statistics deals with every aspect of data, including the planning of data collection in terms of the design of surveys and experiments. (Dodge, Y. (2006) ''The Oxford Dictionary of Statistical Terms'', Oxford University Press.) When census data cannot be collected, statisticians collect data by developing specific experiment designs and survey samples. Representative sampling assures that inferences and conclusions can reasonably extend from the sample to the population as a whole. An ex ...



Effect Size
In statistics, an effect size is a value measuring the strength of the relationship between two variables in a population, or a sample-based estimate of that quantity. It can refer to the value of a statistic calculated from a sample of data, the value of a parameter for a hypothetical population, or to the equation that operationalizes how statistics or parameters lead to the effect size value. Examples of effect sizes include the correlation between two variables, the regression coefficient in a regression, the mean difference, or the risk of a particular event (such as a heart attack) happening. Effect sizes complement statistical hypothesis testing, and play an important role in power analyses, sample size planning, and in meta-analyses. The cluster of data-analysis methods concerning effect sizes is referred to as estimation statistics. Effect size is an essential component when evaluating the strength of a statistical claim, and it is the first item (magnitude) in the MA ...
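One of the standardized mean differences mentioned across these entries, Cohen's d, can be estimated from two samples as the mean difference over the pooled standard deviation. A minimal sketch (function name hypothetical):

```python
import statistics

def cohens_d(sample1, sample2):
    """Cohen's d: difference of sample means divided by the pooled
    standard deviation (n-1 weighting within each group)."""
    n1, n2 = len(sample1), len(sample2)
    s1 = statistics.variance(sample1)
    s2 = statistics.variance(sample2)
    pooled_var = ((n1 - 1) * s1 + (n2 - 1) * s2) / (n1 + n2 - 2)
    return (statistics.mean(sample1) - statistics.mean(sample2)) / pooled_var ** 0.5
```

Unlike a p-value, this quantity conveys the magnitude of the difference, which is why effect sizes complement hypothesis tests in power analysis and meta-analysis.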



Mean
There are several kinds of mean in mathematics, especially in statistics. Each mean serves to summarize a given group of data, often to better understand the overall value (magnitude and sign) of a given data set. For a data set, the ''arithmetic mean'', also known as the "arithmetic average", is a measure of central tendency of a finite set of numbers: specifically, the sum of the values divided by the number of values. The arithmetic mean of a set of numbers ''x''1, ''x''2, ..., ''x''n is typically denoted using an overhead bar, \bar{x}. If the data set were based on a series of observations obtained by sampling from a statistical population, the arithmetic mean is called the ''sample mean'' (\bar{x}) to distinguish it from the mean, or expected value, of the underlying distribution, the ''population mean'' (denoted \mu or \mu_x). (Underhill, L.G.; Bradfield, D. (1998) ''Introstat'', Juta and Company Ltd., p. 181.) Outside probability and statistics, a wide range of other notions of m ...
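The arithmetic-mean formula above, sum of the values divided by the number of values, as a one-line sketch:

```python
def arithmetic_mean(xs):
    """Sum of the values divided by the number of values."""
    return sum(xs) / len(xs)
```

Applied to a sample drawn from a population, this is the sample mean \bar{x}; applied to the whole population it gives the population mean \mu.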



Standard Deviation
In statistics, the standard deviation is a measure of the amount of variation or dispersion of a set of values. A low standard deviation indicates that the values tend to be close to the mean (also called the expected value) of the set, while a high standard deviation indicates that the values are spread out over a wider range. Standard deviation may be abbreviated SD, and is most commonly represented in mathematical texts and equations by the lower case Greek letter σ (sigma), for the population standard deviation, or the Latin letter '' s'', for the sample standard deviation. The standard deviation of a random variable, sample, statistical population, data set, or probability distribution is the square root of its variance. It is algebraically simpler, though in practice less robust, than the average absolute deviation. A useful property of the standard deviation is that, unlike the variance, it is expressed in the same unit as the data. The standard deviation o ...
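A minimal sketch of the definition above, the square root of the variance, covering both the population (σ, denominator n) and sample (s, denominator n − 1) forms; the function name is hypothetical.

```python
import math

def std_dev(xs, sample=True):
    """Square root of the variance: sample=True uses the n-1
    denominator (sample standard deviation s), otherwise the
    population denominator n (sigma)."""
    m = sum(xs) / len(xs)
    ss = sum((x - m) ** 2 for x in xs)
    return math.sqrt(ss / (len(xs) - 1 if sample else len(xs)))
```

Note the result is in the same unit as the data, unlike the variance.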


Contrast (statistics)
In statistics, particularly in analysis of variance and linear regression, a contrast is a linear combination of variables (parameters or statistics) whose coefficients add up to zero, allowing comparison of different treatments. Definitions Let \theta_1,\ldots,\theta_t be a set of variables, either parameters or statistics, and a_1,\ldots,a_t be known constants. The quantity \sum_{i=1}^t a_i \theta_i is a linear combination. It is called a contrast if \sum_{i=1}^t a_i = 0 (Casella 2008, p. 11). Furthermore, two contrasts, \sum_{i=1}^t a_i \theta_i and \sum_{i=1}^t b_i \theta_i, are orthogonal if \sum_{i=1}^t a_i b_i = 0. Examples Let us imagine that we are comparing four means, \mu_1,\mu_2,\mu_3,\mu_4, with three possible contrasts given by the coefficient sets (1, −1, 0, 0), (0, 0, 1, −1), and (1/2, 1/2, −1/2, −1/2). The first contrast allows comparison of the first mean with the second, the second contrast allows comparison of the third mean with the fourth, and the third contrast allows comparison of the average of the first two means with the average of the last two. In a balanced one-way analy ...
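The two defining conditions, coefficients summing to zero and orthogonality of two contrasts, can be checked mechanically; a small sketch using the three contrasts on four means described in this entry (function names hypothetical):

```python
def is_contrast(coeffs):
    """A contrast's coefficients must add up to zero."""
    return abs(sum(coeffs)) < 1e-12

def are_orthogonal(a, b):
    """Two contrasts are orthogonal if sum(a_i * b_i) = 0."""
    return abs(sum(x * y for x, y in zip(a, b))) < 1e-12

c1 = [1.0, -1.0, 0.0, 0.0]    # first mean vs second
c2 = [0.0, 0.0, 1.0, -1.0]    # third mean vs fourth
c3 = [0.5, 0.5, -0.5, -0.5]   # average of first two vs last two
```

All three are contrasts, and each pair is orthogonal.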



ANOVA
Analysis of variance (ANOVA) is a collection of statistical models and their associated estimation procedures (such as the "variation" among and between groups) used to analyze the differences among means. ANOVA was developed by the statistician Ronald Fisher. It is based on the law of total variance, whereby the observed variance in a particular variable is partitioned into components attributable to different sources of variation. In its simplest form, ANOVA provides a statistical test of whether two or more population means are equal, and therefore generalizes the ''t''-test beyond two means; in other words, ANOVA is used to test for differences among two or more means. History While the analysis of variance reached fruition in the 20th century, antecedents extend centuries into the past, according to Stigler. These include hypothesis testing, the partitioning of sums of squares, experimental techniques, and the additive model. Laplace was performing hypothesis testing ...
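The variance partition described above can be sketched for the one-way case: the total sum of squares splits into a between-group and a within-group part, and their mean squares form the F statistic. A minimal sketch, not from the article (function name hypothetical):

```python
def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: between-group mean square
    divided by within-group mean square."""
    all_vals = [x for g in groups for x in g]
    grand = sum(all_vals) / len(all_vals)
    # Partition of variation into between- and within-group sums of squares:
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - sum(g) / len(g)) ** 2 for g in groups for x in g)
    df_between = len(groups) - 1
    df_within = len(all_vals) - len(groups)
    return (ss_between / df_between) / (ss_within / df_within)
```

A large F suggests the group means are not all equal, generalizing the two-sample ''t''-test.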






Non-central T-distribution
The noncentral ''t''-distribution generalizes Student's ''t''-distribution using a noncentrality parameter. Whereas the central probability distribution describes how a test statistic ''t'' is distributed when the difference tested is null, the noncentral distribution describes how ''t'' is distributed when the null is false. This leads to its use in statistics, especially in calculating statistical power. The noncentral ''t''-distribution is also known as the singly noncentral ''t''-distribution, and in addition to its primary use in statistical inference, it is also used in robust modeling of data. Definitions If ''Z'' is a standard normal random variable, and ''V'' is a chi-squared distributed random variable with ν degrees of freedom that is independent of ''Z'', then :T=\frac{Z+\mu}{\sqrt{V/\nu}} is a noncentral ''t''-distributed random variable with ν degrees of freedom and noncentrality parameter μ ≠ 0. Note that the noncentrality parameter may be negative. Cumulative distribution fun ...
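The defining construction T = (Z + μ)/√(V/ν) can be sketched directly by drawing Z as a standard normal and V as a sum of ν squared standard normals; this is a simulation sketch, not a density computation, and the function name is hypothetical.

```python
import math
import random

def noncentral_t_sample(nu, mu, rng=random):
    """One draw of T = (Z + mu) / sqrt(V / nu), where Z is standard
    normal and V is chi-squared with nu degrees of freedom
    (constructed as a sum of nu squared standard normals)."""
    z = rng.gauss(0.0, 1.0)
    v = sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(nu))
    return (z + mu) / math.sqrt(v / nu)
```

With μ = 0 this reduces to the central Student's ''t''; a nonzero μ shifts the distribution, which is what makes it useful for power calculations.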


Minimum-variance Unbiased Estimator
In statistics, a minimum-variance unbiased estimator (MVUE) or uniformly minimum-variance unbiased estimator (UMVUE) is an unbiased estimator that has lower variance than any other unbiased estimator for all possible values of the parameter. For practical statistics problems, it is important to determine the MVUE if one exists, since less-than-optimal procedures would naturally be avoided, other things being equal. This has led to substantial development of statistical theory related to the problem of optimal estimation. While combining the constraint of unbiasedness with the desirability of least variance leads to good results in most practical settings (making the MVUE a natural starting point for a broad range of analyses), a more targeted specification may perform better for a given problem; thus, the MVUE is not always the best stopping point. Definition Consider estimation of g(\theta) based on data X_1, X_2, \ldots, X_n i.i.d. from some member of a family of densities p_\theta ...