Metaphysical Subjectivism
Subjectivism is the doctrine that "our own mental activity is the only unquestionable fact of our experience", rather than anything shared or communal, and that there is no external or objective truth. The success of this position is historically attributed to Descartes and his methodic doubt, although he used it as an epistemological tool to prove the opposite (an objective world of facts independent of one's own knowledge); hence his title "Father of Modern Philosophy", inasmuch as his views underlie a scientific worldview. Subjectivism accords primacy to subjective experience as the foundation of all measure and law. In extreme forms such as solipsism, it may hold that the nature and existence of every object depends solely on someone's subjective awareness of it. One may consider the qualified empiricism of George Berkeley in this context, given his reliance on God as the prime mover of human perception. ...



Panpsychism
In the philosophy of mind, panpsychism is the view that the mind or a mindlike aspect is a fundamental and ubiquitous feature of reality. It is also described as a theory that "the mind is a fundamental feature of the world which exists throughout the universe." It is one of the oldest philosophical theories, and has been ascribed to philosophers including Thales, Plato, Spinoza, Leibniz, William James, Alfred North Whitehead, Bertrand Russell, and Galen Strawson. In the 19th century, panpsychism was the default philosophy of mind in Western thought, but it saw a decline in the mid-20th century with the rise of logical positivism. Recent interest in the hard problem of consciousness and developments in the fields of neuroscience, psychology, and quantum physics have revived interest in panpsychism in the 21st century. Overview Etymology The term ''panpsychism'' comes from the Greek ''pan'' (πᾶν: "all, everything, whole") and ''psyche'' (ψυχή: "soul, mind"). ...



Thought-experiment
A thought experiment is a hypothetical situation in which a hypothesis, theory, or principle is laid out for the purpose of thinking through its consequences. History The ancient Greek ''deiknymi'' (δείκνυμι), or thought experiment, "was the most ancient pattern of mathematical proof", and existed before Euclidean mathematics, where the emphasis was on the conceptual, rather than on the experimental, part of a thought experiment. Johann Witt-Hansen established that Hans Christian Ørsted was the first to use the German term ''Gedankenexperiment'' (lit. thought experiment) circa 1812. Ørsted was also the first to use the equivalent term ''Gedankenversuch'' in 1820. By 1883 Ernst Mach used the term ''Gedankenexperiment'' in a different way, to denote exclusively the imaginary conduct of a real experiment that would subsequently be performed as a real physical experiment by his students. Physical and mental experimentation could then be contrasted: Mach asked his students to provide him with explanations whenever the results from their subsequent, real, physical experiment differed ...


Coherence (Philosophical Gambling Strategy)
In a thought experiment proposed by the Italian probabilist Bruno de Finetti in order to justify Bayesian probability, an array of wagers is coherent precisely if it does not expose the wagerer to certain loss regardless of the outcomes of events on which they are wagering, even if their opponent makes the most judicious choices. Operational subjective probabilities as wagering odds One must set the price of a promise to pay $1 if John Smith wins tomorrow's election, and $0 otherwise. One knows that one's opponent will be able to choose either to buy such a promise from one at the price one has set, or require one to buy such a promise from them, still at the same price. In other words: Player A sets the odds, but Player B decides which side of the bet to take. The price one sets is the "operational subjective probability" that one assigns to the proposition on which one is betting. If one decides that John Smith is 12.5% likely to win—an arbitrary valuation—one might then ...
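The sure-loss argument can be made concrete with a short sketch. The following Python snippet (the function name and example prices are illustrative, not from de Finetti) checks a posted book of prices for $1 promises on a mutually exclusive, exhaustive set of outcomes: the book is coherent exactly when the prices sum to 1, and otherwise the opponent can choose sides so that the price-setter loses the difference no matter which outcome occurs.

    def sure_loss(prices):
        """Guaranteed loss for the price-setter, given posted prices (in
        dollars) for $1 promises on each outcome of an exhaustive partition.
        Returns 0.0 exactly when the book is coherent (prices sum to 1)."""
        total = sum(prices.values())
        # If the prices sum to more than 1, the opponent sells the price-setter
        # every promise: they pay `total` and collect exactly $1 whichever
        # outcome occurs. If the prices sum to less than 1, the opponent buys
        # every promise instead: the price-setter collects `total` but must
        # pay out exactly $1. Either way the shortfall is |total - 1|.
        return abs(total - 1.0)

    # Incoherent: 60% on "Smith wins" plus 55% on "Smith loses" sums to 1.15,
    # so a Dutch book extracts about $0.15 per dollar of stake.
    print(sure_loss({"Smith wins": 0.60, "Smith loses": 0.55}))  # ~0.15
    print(sure_loss({"Smith wins": 0.60, "Smith loses": 0.40}))  # 0.0, coherent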


Bruno de Finetti
Bruno de Finetti (13 June 1906 – 20 July 1985) was an Italian probabilist, statistician, and actuary, noted for the "operational subjective" conception of probability. The classic exposition of his distinctive theory is the 1937 paper "La prévision: ses lois logiques, ses sources subjectives," which discussed probability founded on the coherence of betting odds and the consequences of exchangeability. Life De Finetti was born in Innsbruck, Austria, and studied mathematics at Politecnico di Milano. He graduated in 1927, writing his thesis under the supervision of Giulio Vivanti. After graduation, he worked as an actuary and a statistician at ''Istituto Nazionale di Statistica'' (National Institute of Statistics) in Rome and, from 1931, at the Trieste insurance company Assicurazioni Generali. In 1936 he won a competition for the Chair of Financial Mathematics and Statistics, but was not nominated due to a fascist law barring access to unmarried candidates; he was appointed as ordinary profess ...




Edwin Thompson Jaynes
Edwin Thompson Jaynes (July 5, 1922 – April 30, 1998) was the Wayman Crow Distinguished Professor of Physics at Washington University in St. Louis. He wrote extensively on statistical mechanics and on foundations of probability and statistical inference, initiating in 1957 the maximum entropy interpretation of thermodynamics as a particular application of more general Bayesian/information-theoretic techniques (although he argued this was already implicit in the works of Josiah Willard Gibbs). Jaynes strongly promoted the interpretation of probability theory as an extension of logic. In 1963, together with Fred Cummings, he modeled the evolution of a two-level atom in an electromagnetic field in a fully quantized way. This model is known as the Jaynes–Cummings model. A particular focus of his work was the construction of logical principles for assigning prior probability distributions; see the principle of maximum entropy, the principle of maximum caliber, the ...
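As a rough illustration of the maximum-entropy principle Jaynes advocated, the following Python sketch works through his well-known Brandeis dice exercise: among all distributions over the six faces of a die with a prescribed mean, the one of maximum entropy has the exponential form p_i proportional to exp(lam * i), and the multiplier lam can be found by bisection because the constrained mean increases monotonically in lam. The function name and bracketing interval are assumptions made for this example.

    import math

    def maxent_die(target_mean, faces=range(1, 7), tol=1e-12):
        """Maximum-entropy distribution over die faces with a fixed mean:
        p_i is proportional to exp(lam * i), with lam found by bisection."""
        def mean(lam):
            w = [math.exp(lam * i) for i in faces]
            return sum(i * wi for i, wi in zip(faces, w)) / sum(w)
        lo, hi = -10.0, 10.0  # bracket for the Lagrange multiplier
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if mean(mid) < target_mean:
                lo = mid
            else:
                hi = mid
        lam = (lo + hi) / 2
        w = [math.exp(lam * i) for i in faces]
        z = sum(w)
        return [wi / z for wi in w]

    # A die reported to average 4.5 rather than the fair 3.5: the least
    # informative assignment consistent with that constraint tilts toward
    # the high faces.
    print([round(p, 4) for p in maxent_die(4.5)])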


Bayesian Statistics
Bayesian statistics is a theory in the field of statistics based on the Bayesian interpretation of probability where probability expresses a ''degree of belief'' in an event. The degree of belief may be based on prior knowledge about the event, such as the results of previous experiments, or on personal beliefs about the event. This differs from a number of other interpretations of probability, such as the frequentist interpretation that views probability as the limit of the relative frequency of an event after many trials. Bayesian statistical methods use Bayes' theorem to compute and update probabilities after obtaining new data. Bayes' theorem describes the conditional probability of an event based on data as well as prior information or beliefs about the event or conditions related to the event. For example, in Bayesian inference, Bayes' theorem can be used to estimate the parameters of a probability distribution or statistical model. Since Bayesian statistics treats proba ...
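To make the update step concrete, here is a minimal Python sketch of Bayes' theorem in its simplest conjugate form (the function name and counts are illustrative): estimating a coin's unknown bias with a Beta prior, where updating on observed heads and tails reduces to adding the counts to the prior's parameters.

    def update_beta(alpha, beta, heads, tails):
        """Posterior for a coin's bias under a Beta(alpha, beta) prior:
        Bayes' theorem reduces to Beta(alpha + heads, beta + tails)."""
        return alpha + heads, beta + tails

    alpha, beta = 1.0, 1.0  # uniform prior: no prior knowledge of the bias
    alpha, beta = update_beta(alpha, beta, heads=7, tails=3)
    print(alpha / (alpha + beta))  # posterior mean: 8/12, about 0.667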



Christopher Bishop
Christopher Michael Bishop (born 7 April 1959) is the Laboratory Director at Microsoft Research Cambridge, Honorary Professor of Computer Science at the University of Edinburgh and a Fellow of Darwin College, Cambridge. Bishop is a member of the UK AI Council. He was also recently appointed to the Prime Minister's Council for Science and Technology. Education Bishop obtained a Bachelor of Arts degree in physics from St Catherine's College, Oxford, and a PhD in Theoretical Physics from the University of Edinburgh, with a thesis on quantum field theory supervised by David Wallace and Peter Higgs. Research and career Bishop investigates machine learning, in which computers are made to learn from data and experience. Written works Bishop is the author of two highly cited and widely adopted machine learning textbooks: ''Neural Networks for Pattern Recognition'' (1995) and ''Pattern Recognition and Machine Learning'' (2006). Awards and honours Bishop was awarded the Tam Dalyell prize i ...


Richard Threlkeld Cox
Richard Threlkeld Cox (August 5, 1898 – May 2, 1991) was a professor of physics at Johns Hopkins University, known for Cox's theorem relating to the foundations of probability. Biography He was born in Portland, Oregon, the son of attorney Lewis Cox and Elinor Cox. After Lewis Cox died, Elinor Cox married John Latané, who became a professor at Johns Hopkins University in 1913. In 1915 Richard enrolled at Johns Hopkins University to study physics, but his studies were cut short when he was drafted for World War I. He stayed in the US after being drafted and returned to Johns Hopkins University after the war, completing his BA in 1920. He earned his PhD in 1924; his dissertation was ''A Study of Pfund's Pressure Gauge''. He taught at New York University (NYU) from 1924 to 1943, before returning to Johns Hopkins to teach. He studied probability theory, the scattering of electrons, and the discharges of electric eels. Richard Cox's most important work was Cox's theorem. His wife, ...



Probability
Probability is the branch of mathematics concerning numerical descriptions of how likely an event is to occur, or how likely it is that a proposition is true. The probability of an event is a number between 0 and 1, where, roughly speaking, 0 indicates impossibility of the event and 1 indicates certainty.Alan Stuart and Keith Ord, ''Kendall's Advanced Theory of Statistics, Volume 1: Distribution Theory'', 6th Ed. (2009).William Feller, ''An Introduction to Probability Theory and Its Applications'', Vol. 1, 3rd Ed. (1968), Wiley. The higher the probability of an event, the more likely it is that the event will occur. A simple example is the tossing of a fair (unbiased) coin. Since the coin is fair, the two outcomes ("heads" and "tails") are both equally probable; the probability of "heads" equals the probability of "tails"; and since no other outcomes are possible, the probability of either "heads" or "tails" is 1/2 (which could also be written as 0.5 or 50%). These con ...
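The fair-coin example can be checked numerically. A small Python simulation (toss counts chosen arbitrarily) shows the relative frequency of heads settling near the probability 1/2 as the number of tosses grows.

    import random

    def heads_frequency(n_tosses, seed=0):
        """Relative frequency of heads in n_tosses simulated fair-coin tosses."""
        rng = random.Random(seed)
        heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
        return heads / n_tosses

    for n in (10, 1_000, 100_000):
        print(n, heads_frequency(n))  # drifts toward 0.5 as n grows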


Bayesian Probability
Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown. In the Bayesian view, a probability is assigned to a hypothesis, whereas under frequentist inference, a hypothesis is typically tested without being assigned a probability. Bayesian probability belongs to the category of evidential probabilities; to evaluate the probability of a hypothesis, the Bayesian probabilist specifies a prior probability. This, in turn, is then updated to a posterior probability in the light of new, relevant data (evidence). The Bayesian interpretation provides a standard set of procedures and form ...
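The prior-to-posterior step described here is a one-line application of Bayes' theorem, sketched below in Python with hypothetical numbers: a hypothesis starts at prior probability 0.5, the observed evidence is three times as likely if the hypothesis is true as if it is false, and the posterior rises to 0.75.

    def posterior_prob(prior, lik_if_true, lik_if_false):
        """P(H|E) = P(E|H)P(H) / [P(E|H)P(H) + P(E|not H)P(not H)]."""
        numerator = lik_if_true * prior
        return numerator / (numerator + lik_if_false * (1 - prior))

    # Evidence with likelihood ratio 0.9 / 0.3 = 3 in favour of the hypothesis.
    print(posterior_prob(prior=0.5, lik_if_true=0.9, lik_if_false=0.3))  # ~0.75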




Non-cognitivism
Non-cognitivism is the meta-ethical view that ethical sentences do not express propositions (i.e., statements) and thus cannot be true or false (they are not truth-apt). A noncognitivist denies the cognitivist claim that "moral judgments are capable of being objectively true, because they describe some feature of the world". If moral statements cannot be true, and if one cannot know something that is not true, noncognitivism implies that moral knowledge is impossible. Non-cognitivism entails that non-cognitive attitudes underlie moral discourse and that this discourse therefore consists of non-declarative speech acts, while accepting that its surface features may consistently and efficiently work as if moral discourse were cognitive. The point of interpreting moral claims as non-declarative speech acts is to explain what moral claims mean if they are neither true nor false (as philosophies such as logical positivism entail). Utterances like "Boo to killing!" and "Don't kill" ...