Definitions
Standard normal distribution
The simplest case of a normal distribution is known as the ''standard normal distribution'' or ''unit normal distribution''. This is a special case when μ = 0 and σ = 1, and it is described by the probability density function φ(z) = e^(−z²/2) / √(2π).

General normal distribution
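The density formulas in this section can be evaluated directly; a minimal sketch (the function name is illustrative, not from the article):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of N(mu, sigma^2) at x: exp(-z^2/2) / (sigma * sqrt(2*pi))."""
    z = (x - mu) / sigma
    return math.exp(-0.5 * z * z) / (sigma * math.sqrt(2.0 * math.pi))
```

With the defaults this is the standard normal density φ; at x = μ it reduces to 1/(σ√(2π)).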
Every normal distribution is a version of the standard normal distribution, whose domain has been stretched by a factor σ (the standard deviation) and then translated by μ (the mean value): f(x | μ, σ²) = (1/σ) φ((x − μ)/σ). The probability density must be scaled by 1/σ so that the integral is still 1. If Z is a standard normal deviate, then X = σZ + μ has a normal distribution with expected value μ and standard deviation σ.

Notation
The probability density of the standard Gaussian distribution (standard normal distribution, with zero mean and unit variance) is often denoted with the Greek letter φ (phi). The alternative form of the Greek letter phi, ϕ, is also used quite often. The normal distribution is often referred to as N(μ, σ²) or 𝒩(μ, σ²). Thus when a random variable X is normally distributed with mean μ and standard deviation σ, one may write X ~ 𝒩(μ, σ²).

Alternative parameterizations
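One such alternative parameterization uses the precision τ = 1/σ² in place of the variance; a minimal sketch of the resulting density √(τ/(2π)) e^(−τ(x − μ)²/2) (function name illustrative):

```python
import math

def normal_pdf_precision(x, mu=0.0, tau=1.0):
    """Density of a normal distribution parameterized by precision tau = 1/sigma^2."""
    return math.sqrt(tau / (2.0 * math.pi)) * math.exp(-0.5 * tau * (x - mu) ** 2)
```

Setting τ = 1/σ² reproduces the usual σ-parameterized density exactly.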
Some authors advocate using the precision τ as the parameter defining the width of the distribution, instead of the standard deviation σ or the variance σ². The precision is normally defined as the reciprocal of the variance, τ = 1/σ². The formula for the distribution then becomes f(x) = √(τ/(2π)) e^(−τ(x − μ)²/2). This choice is claimed to have advantages in numerical computations when σ² is very close to zero, and simplifies formulas in some contexts, such as in the Bayesian analysis of normally distributed data.

Cumulative distribution functions
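The standard normal CDF, usually written Φ, can be evaluated with the error function via the identity Φ(x) = (1 + erf(x/√2))/2; a minimal sketch:

```python
import math

def std_normal_cdf(x):
    """Phi(x), the standard normal CDF, via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```

By symmetry of the density, Φ(−x) = 1 − Φ(x), and Φ(0) = 1/2.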
The cumulative distribution function (CDF) of the standard normal distribution, usually denoted with the capital Greek letter Φ, is the integral Φ(x) = (1/√(2π)) ∫₋∞ˣ e^(−t²/2) dt.

Fourier transform and characteristic function
The Fourier transform of a normal density with mean μ and variance σ² is again a Gaussian function; equivalently, the characteristic function of a normal random variable X is E[e^(itX)] = e^(iμt − σ²t²/2).

Moment and cumulant generating functions
The moment generating function of a real random variable X is the expected value of e^(tX); for X ~ 𝒩(μ, σ²) it equals M(t) = e^(μt + σ²t²/2), so the cumulant generating function is the quadratic ln M(t) = μt + σ²t²/2.

Stein operator and class
Within Stein's method, the Stein operator of a random variable X ~ 𝒩(μ, σ²) is 𝒜f(x) = σ²f′(x) − (x − μ)f(x).

Zero-variance limit

In the limit when σ tends to zero, the probability density tends to zero at any x ≠ μ, but grows without limit at x = μ; the distribution degenerates to a Dirac delta function translated by the mean μ.

Maximum entropy
Of all probability distributions over the reals with a specified mean μ and variance σ², the normal distribution 𝒩(μ, σ²) is the one with maximum entropy.

Other properties
Related distributions

Central limit theorem

Operations and functions of normal variables

Operations on a single normal variable
If X is distributed normally with mean μ and variance σ², then aX + b, for any real numbers a and b, is also normally distributed, with mean aμ + b and variance a²σ².

Operations on two independent normal variables
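The sum rule for independent normal variables can be checked by simulation; a sketch using Python's standard library (the sample size, seed, and parameters are arbitrary choices, not from the article):

```python
import random
import statistics

random.seed(0)
# X1 ~ N(1, 2^2) and X2 ~ N(-3, 1^2); their sum should be ~ N(-2, 5).
s = [random.gauss(1, 2) + random.gauss(-3, 1) for _ in range(100_000)]

sample_mean = statistics.mean(s)      # close to -2 = 1 + (-3)
sample_var = statistics.variance(s)   # close to 5 = 2^2 + 1^2
```

The means add and, because the variables are independent, so do the variances.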
* If X₁ and X₂ are two independent normal random variables, with means μ₁, μ₂ and variances σ₁², σ₂², then their sum X₁ + X₂ is also normally distributed, with mean μ₁ + μ₂ and variance σ₁² + σ₂².

Operations on two independent standard normal variables
If Z₁ and Z₂ are two independent standard normal random variables, then their sum and difference are normal with mean zero and variance two, their ratio Z₁/Z₂ follows the standard Cauchy distribution, and Z₁² + Z₂² follows the chi-squared distribution with two degrees of freedom.

Operations on multiple independent normal variables
* Any linear combination of independent normal deviates is a normal deviate.
* If X₁, X₂, ..., Xₙ are independent standard normal random variables, then the sum of their squares has the chi-squared distribution with n degrees of freedom.

Operations on multiple correlated normal variables
* A quadratic form of a normal vector, i.e. a quadratic function of multiple independent or correlated normal variables, is a generalized chi-square variable.

Operations on the density function
The split normal distribution is most directly defined in terms of joining scaled sections of the density functions of different normal distributions and rescaling the density to integrate to one. The truncated normal distribution results from rescaling a section of a single density function.

Infinite divisibility and Cramér's theorem
For any positive integer n, any normal distribution with mean μ and variance σ² is the distribution of the sum of n independent normal deviates, each with mean μ/n and variance σ²/n. This property is called infinite divisibility.

Bernstein's theorem
Bernstein's theorem states that if X and Y are independent and X + Y and X − Y are also independent, then both X and Y must necessarily have normal distributions.

Extensions
The notion of normal distribution, being one of the most important distributions in probability theory, has been extended far beyond the standard framework of the univariate (that is, one-dimensional) case. All these extensions are also called ''normal'' or ''Gaussian'' laws, so a certain ambiguity in names exists.
* The multivariate normal distribution describes the Gaussian law in the k-dimensional Euclidean space.

Statistical inference
Estimation of parameters
It is often the case that we do not know the parameters of the normal distribution, but instead want to estimate them. That is, having a sample (x₁, ..., xₙ) from a normal 𝒩(μ, σ²) population, we would like to learn the approximate values of the parameters μ and σ².

Sample mean
The estimator μ̂ = x̄ = (1/n) Σᵢ xᵢ is called the sample mean, since it is the arithmetic mean of all observations; it is an unbiased and consistent estimator of μ.

Sample variance
The estimator σ̂² = (1/n) Σᵢ (xᵢ − x̄)² is the maximum-likelihood estimator of σ², but it is biased; the unbiased sample variance s² replaces the factor 1/n with 1/(n − 1).

Confidence intervals
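For the simplest case, known σ, a 95% confidence interval for μ is x̄ ± z·σ/√n, where z ≈ 1.96 is the standard normal 0.975 quantile; a minimal sketch (function name illustrative):

```python
import math

def mean_ci_known_sigma(xbar, sigma, n, z=1.959964):
    """95% confidence interval for the mean when sigma is known."""
    half = z * sigma / math.sqrt(n)
    return (xbar - half, xbar + half)
```

The interval width shrinks like 1/√n as the sample grows.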
By Cochran's theorem, for normal distributions the sample mean x̄ and sample variance s² are independent, which can be used to construct confidence intervals for μ and σ².

Normality tests
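As one concrete example of a moment-based test (not necessarily among those the article surveys), the Jarque–Bera statistic combines sample skewness S and excess kurtosis K into JB = n/6 · (S² + K²/4), which is near zero for normal data; a pure-Python sketch:

```python
def jarque_bera(xs):
    """Jarque-Bera normality statistic; small values are consistent with normality."""
    n = len(xs)
    m = sum(xs) / n
    m2 = sum((x - m) ** 2 for x in xs) / n  # central moments
    m3 = sum((x - m) ** 3 for x in xs) / n
    m4 = sum((x - m) ** 4 for x in xs) / n
    s = m3 / m2 ** 1.5        # sample skewness
    k = m4 / m2 ** 2 - 3.0    # excess kurtosis (0 for a normal)
    return n / 6.0 * (s * s + k * k / 4.0)
```

A uniform sample, whose excess kurtosis is −1.2, scores far higher than a Gaussian one.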
Normality tests assess the likelihood that the given data set comes from a normal distribution. Typically the null hypothesis H₀ is that the observations are distributed normally with unspecified mean μ and variance σ², versus the alternative Hₐ that the distribution is arbitrary. Many tests (over 40) have been devised for this problem. Diagnostic plots are more intuitively appealing but subjective at the same time, as they rely on informal human judgement to accept or reject the null hypothesis.

Bayesian analysis of the normal distribution
Bayesian analysis of normally distributed data is complicated by the many different possibilities that may be considered:
* Either the mean, or the variance, or neither, may be considered a fixed quantity.
* When the variance is unknown, analysis may be done directly in terms of the variance, or in terms of the precision, the reciprocal of the variance.

Sum of two quadratics
Scalar form
The following auxiliary formula is useful for simplifying the posterior update equations, which otherwise become fairly tedious: a(x − y)² + b(x − z)² = (a + b)(x − (ay + bz)/(a + b))² + (ab/(a + b))(y − z)².

Vector form
A similar formula can be written for the sum of two vector quadratics: if x, y, z are vectors of length k, and A and B are symmetric, invertible matrices of size k × k, then (y − x)ᵀA(y − x) + (x − z)ᵀB(x − z) = (x − c)ᵀ(A + B)(x − c) + (y − z)ᵀ(A⁻¹ + B⁻¹)⁻¹(y − z), where c = (A + B)⁻¹(Ay + Bz).

Sum of differences from the mean
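The decomposition Σᵢ (xᵢ − μ)² = Σᵢ (xᵢ − x̄)² + n(x̄ − μ)², which underlies several of the update equations below, can be verified numerically; a sketch (function name illustrative):

```python
def sum_of_squares_decomposition(xs, mu):
    """Return both sides of sum (x-mu)^2 == sum (x-xbar)^2 + n*(xbar-mu)^2."""
    n = len(xs)
    xbar = sum(xs) / n
    lhs = sum((x - mu) ** 2 for x in xs)
    rhs = sum((x - xbar) ** 2 for x in xs) + n * (xbar - mu) ** 2
    return lhs, rhs
```

The identity holds for any real μ, since the cross term 2(x̄ − μ)Σ(xᵢ − x̄) vanishes.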
Another useful formula is as follows: Σᵢ (xᵢ − μ)² = Σᵢ (xᵢ − x̄)² + n(x̄ − μ)².

With known variance
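In the known-variance case the conjugate normal prior updates in closed form: the posterior precision is the sum of the prior precision 1/σ₀² and the data precision n/σ², and the posterior mean is the precision-weighted average of the prior mean and the sample mean. A sketch (function name illustrative):

```python
def posterior_mean_known_var(xs, sigma2, mu0, sigma0sq):
    """Conjugate update for the mean of a normal with known data variance sigma2,
    under a N(mu0, sigma0sq) prior. Returns (posterior mean, posterior variance)."""
    n = len(xs)
    xbar = sum(xs) / n
    post_prec = 1.0 / sigma0sq + n / sigma2
    post_mean = (mu0 / sigma0sq + n * xbar / sigma2) / post_prec
    return post_mean, 1.0 / post_prec
```

As the prior variance grows (a nearly flat prior), the posterior mean approaches the sample mean.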
For a set of i.i.d. normally distributed data points X of size n where each individual point x follows x ~ 𝒩(μ, σ²) with known variance σ², the conjugate prior distribution of the mean is also normally distributed.

With known mean
For a set of i.i.d. normally distributed data points X of size n where each individual point x follows x ~ 𝒩(μ, σ²) with known mean μ, the conjugate prior of the variance has an inverse gamma distribution (equivalently, a scaled inverse chi-squared distribution).

With unknown mean and unknown variance
For a set of i.i.d. normally distributed data points X of size n where each individual point x follows x ~ 𝒩(μ, σ²) with unknown mean and unknown variance, the conjugate prior of the mean and variance is the normal-inverse-gamma distribution.

Occurrence and applications
The occurrence of normal distribution in practical problems can be loosely classified into four categories:
# Exactly normal distributions;
# Approximately normal laws, for example when such approximation is justified by the central limit theorem;
# Distributions modeled as normal, the normal distribution being the distribution with maximum entropy for a given mean and variance;
# Regression problems, where the normal distribution is found after systematic effects have been modeled sufficiently well.

Exact normality
Approximate normality
''Approximately'' normal distributions occur in many situations, as explained by the central limit theorem.

Assumed normality
Methodological problems and peer review
John Ioannidis argues that using normally distributed standard deviations as standards for validating research findings leaves falsifiable predictions about phenomena that are not normally distributed untested. These include, for example, phenomena that only appear when all necessary conditions are present and one cannot substitute for another in an addition-like way, and phenomena that are not randomly distributed. Ioannidis argues that standard-deviation-centered validation gives a false appearance of validity to hypotheses and theories where some but not all falsifiable predictions are normally distributed, since the portion of falsifiable predictions that there is evidence against may lie, and in some cases does lie, in the non-normally distributed parts of the range of falsifiable predictions. It also baselessly dismisses hypotheses for which none of the falsifiable predictions are normally distributed, as if they were unfalsifiable, when in fact they do make falsifiable predictions. Ioannidis argues that many cases of mutually exclusive theories being accepted as "validated" by research journals are caused by the journals' failure to take in empirical falsifications of non-normally distributed predictions, not because the mutually exclusive theories are true, which they cannot be, although two mutually exclusive theories can both be wrong and a third one correct.

Computational methods
Generating values from normal distribution
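One classical way to generate normal deviates (not necessarily among the methods the article covers) is the Box–Muller transform, which maps two independent uniform deviates to a standard normal one; a sketch using Python's standard library:

```python
import math
import random

def box_muller(mu=0.0, sigma=1.0, rng=random):
    """One N(mu, sigma^2) deviate from two uniforms via the Box-Muller transform."""
    u1 = 1.0 - rng.random()  # in (0, 1], so log(u1) is finite
    u2 = rng.random()
    z = math.sqrt(-2.0 * math.log(u1)) * math.cos(2.0 * math.pi * u2)
    return mu + sigma * z
```

The companion value √(−2 ln u₁) sin(2πu₂) is an independent second deviate, so the transform wastes nothing.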
Numerical approximations for the normal CDF and normal quantile function
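Where a closed-form quantile is unavailable, Φ can be inverted numerically; a simple bisection sketch built on the erf-based CDF (slow but robust, for illustration only):

```python
import math

def std_normal_quantile(p, tol=1e-10):
    """Invert the standard normal CDF by bisection on [-10, 10]."""
    if not 0.0 < p < 1.0:
        raise ValueError("p must be in (0, 1)")
    lo, hi = -10.0, 10.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if 0.5 * (1.0 + math.erf(mid / math.sqrt(2.0))) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Production code would use a rational approximation instead; bisection merely shows that the CDF's monotonicity makes inversion straightforward.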
The standard normal CDF is widely used in scientific and statistical computing. The values Φ(x) may be approximated very accurately by a variety of methods, such as numerical integration, Taylor series, asymptotic series and continued fractions. Different approximations are used depending on the desired level of accuracy.
* Zelen & Severo (1964) give an approximation for Φ(x) for x > 0 with absolute error |ε(x)| < 7.5·10⁻⁸.

History
Development
Some authors attribute the credit for the discovery of the normal distribution to de Moivre, who in 1738 published in the second edition of his ''The Doctrine of Chances'' the study of the coefficients in the binomial expansion of (a + b)ⁿ. De Moivre proved that the middle term in this expansion has the approximate magnitude of 2ⁿ√(2/(πn)).

Naming
Today, the concept is usually known in English as the normal distribution or Gaussian distribution. Other, less common names include Gauss distribution, Laplace–Gauss distribution, the law of error, the law of facility of errors, Laplace's second law, and Gaussian law. Gauss himself apparently coined the term with reference to the "normal equations" involved in its applications, with normal having its technical meaning of orthogonal rather than "usual". However, by the end of the 19th century some authors had started using the name ''normal distribution'', where the word "normal" was used as an adjective, the term now being seen as a reflection of the fact that this distribution was seen as typical, common, and thus "normal". Peirce (one of those authors) once defined "normal" thus: "...the 'normal' is not the average (or any other kind of mean) of what actually occurs, but of what ''would'', in the long run, occur under certain circumstances." Around the turn of the 20th century Pearson popularized the term ''normal'' as a designation for this distribution. Also, it was Pearson who first wrote the distribution in terms of the standard deviation ''σ'' as in modern notation. Soon after this, in 1915, Fisher added the location parameter to the formula for the normal distribution, expressing it in the way it is written nowadays.

See also
* Bates distribution – similar to the Irwin–Hall distribution, but rescaled back into the 0 to 1 range
* Behrens–Fisher problem – the long-standing problem of testing whether two normal samples with different variances have the same means
* Bhattacharyya distance – method used to separate mixtures of normal distributions
* Erdős–Kac theorem – on the occurrence of the normal distribution in number theory
* Full width at half maximum
* Gaussian blur – convolution, which uses the normal distribution as a kernel
* Modified half-normal distribution

Notes
References

Citations

Sources
External links