
Mark Burgess (computer scientist)
Mark Burgess (born 19 February 1966) is an independent researcher and writer, formerly a professor at Oslo University College in Norway and the creator of the CFEngine software and company. He is known for his work in computer science in the field of policy-based configuration management.

Early life and education

Burgess was born in Maghull in the United Kingdom to English parents. From the age of 5 to 18 he grew up in Bloxham, a small village in Oxfordshire, attending Bloxham Primary School, Warriner Secondary School and Banbury Upper School. He studied astrophysics at the (then) School of Physics at the University of Newcastle upon Tyne, where he later switched to pure physics and then theoretical physics for his bachelor's degree. He stayed on in Newcastle to obtain a Doctor of Philosophy in Theoretical Physics (Quantum Field Theory), in the field of ''Spontaneous Symmetry Breaking in Non-Abelian Gauge Theories'', for which he received the Keith Runcorn Prize. ...


Oslo University College
Oslo University College (Norwegian: ''Høgskolen i Oslo''; HiO) was the largest state university college in Norway from 1994 to 2011, with more than 18,000 students and approximately 1,800 employees.

Facts about OUC

Oslo University College merged with Akershus University College to form Oslo and Akershus University College in 2011, and this institution became Oslo Metropolitan University in 2018. OUC was established on 1 August 1994, when the Norwegian college system was restructured and 18 smaller colleges in the Oslo area merged. From the 2000s most of the school was located in the city centre of Oslo along Pilestredet street. The main campus was the former Frydenlund Brewery near Bislett stadium. OUC offered the broadest portfolio of professional studies available in Norway. The language of instruction was Norwegian.


Faculties




Robin Milner
Arthur John Robin Gorell Milner (13 January 1934 – 20 March 2010) was a British computer scientist and a Turing Award winner. (Obituary: "Professor Robin Milner: computer scientist", 31 March 2010.)


Life, education and career

Milner was born in Yealmpton, near Plymouth, ...


1966 Births
Events

January
* January 1 – In a coup, Colonel Jean-Bédel Bokassa takes over as military ruler of the Central African Republic, ousting President David Dacko.
* January 3 – 1966 Upper Voltan coup d'état: President Maurice Yaméogo is deposed by a military coup in the Republic of Upper Volta (modern-day Burkina Faso).
* January 10
** Pakistani–Indian peace negotiations end successfully with the signing of the Tashkent Declaration, a day before the sudden death of Indian prime minister Lal Bahadur Shastri.
** The House of Representatives of the US state of Georgia refuses to allow African-American representative Julian Bond to take his seat, because of his anti-war stance.
* January 15 – 1966 Nigerian coup d'état: A bloody military coup is staged in Nigeria, deposing the civilian government and resulting in the death of Prime Minister Abubakar Tafawa Balewa.
* January 17
** The Nigerian coup is overturned by another faction of the ...



Principal Component Analysis
Principal component analysis (PCA) is a linear dimensionality reduction technique with applications in exploratory data analysis, visualization and data preprocessing. The data is linearly transformed onto a new coordinate system such that the directions (principal components) capturing the largest variation in the data can be easily identified. The principal components of a collection of points in a real coordinate space are a sequence of p unit vectors, where the i-th vector is the direction of a line that best fits the data while being orthogonal to the first i-1 vectors. Here, a best-fitting line is defined as one that minimizes the average squared perpendicular distance from the points to the line. These directions (i.e., principal components) constitute an orthonormal basis in which different individual dimensions of the data are linearly uncorrelated. Many studies use the first two principal components in order to plot the data in two dimensions and to visually identi ...
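The description above amounts to a concrete algorithm: centre the data, find the orthonormal directions of greatest variance, and project onto the first few. A minimal sketch follows, assuming only NumPy; the array names and the choice of the singular value decomposition as the solver are illustrative, not prescribed by the text above.

```python
# A minimal PCA sketch via the singular value decomposition.
import numpy as np

def pca(X, k):
    """Project the rows of X onto the first k principal components."""
    X_centered = X - X.mean(axis=0)           # PCA requires mean-centred data
    # Rows of Vt are the orthonormal principal directions (unit vectors),
    # ordered by decreasing singular value, i.e. by captured variance.
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)
    components = Vt[:k]                       # the first k principal components
    scores = X_centered @ components.T        # coordinates in the new basis
    explained_variance = S[:k] ** 2 / (len(X) - 1)
    return scores, components, explained_variance

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # toy data: 200 points in 5 dimensions
scores, components, var = pca(X, k=2)         # first two PCs, e.g. for a 2-D plot
```

Projecting onto `k=2` components is exactly the two-dimensional plotting use mentioned in the excerpt.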



PageRank
PageRank (PR) is an algorithm used by Google Search to rank web pages in its search engine results. It is named after both the term "web page" and co-founder Larry Page. PageRank is a way of measuring the importance of website pages. Currently, PageRank is not the only algorithm used by Google to order search results, but it is the first algorithm that was used by the company, and it is the best known. As of September 24, 2019, all patents associated with PageRank have expired.

Description

PageRank is a link analysis algorithm that assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references. The numerical weight that it assigns to any given element ''E'' is referred to as the ''PageRank of E'' and denoted by PR(E). A PageRank resu ...
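The weighting scheme is commonly computed by power iteration on a link matrix. The sketch below is a minimal illustration of that standard method, assuming only NumPy; the damping factor d = 0.85 and the toy link structure are conventional illustrative choices, not details from the excerpt above.

```python
# A minimal power-iteration sketch of PageRank.
import numpy as np

def pagerank(adjacency, d=0.85, tol=1e-9, max_iter=200):
    """adjacency[i][j] = 1 if page i links to page j."""
    A = np.asarray(adjacency, dtype=float)
    n = A.shape[0]
    # Dangling pages (no outlinks) are treated as linking to every page.
    A[A.sum(axis=1) == 0] = 1.0
    M = A / A.sum(axis=1, keepdims=True)      # row-stochastic transition matrix
    pr = np.full(n, 1.0 / n)                  # start from a uniform distribution
    for _ in range(max_iter):
        new_pr = (1 - d) / n + d * (M.T @ pr) # random jump plus followed links
        if np.abs(new_pr - pr).sum() < tol:   # stop once the weights settle
            break
        pr = new_pr
    return pr

# Tiny example: page 0 links to 1 and 2, page 1 links to 2, page 2 links to 0.
print(pagerank([[0, 1, 1], [0, 0, 1], [1, 0, 0]]))
```

The returned vector is the PR(E) weighting for each document: pages that receive links from highly weighted pages end up highly weighted themselves.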



Semantics
Semantics is the study of linguistic meaning. It examines what meaning is, how words get their meaning, and how the meaning of a complex expression depends on its parts. Part of this process involves the distinction between sense and reference. Sense is given by the ideas and concepts associated with an expression, while reference is the object to which an expression points. Semantics contrasts with syntax, which studies the rules that dictate how to create grammatically correct sentences, and pragmatics, which investigates how people use language in communication. Lexical semantics is the branch of semantics that studies word meaning. It examines whether words have one or several meanings and in what lexical relations they stand to one another. Phrasal semantics studies the meaning of sentences by exploring the phenomenon of compositionality, or how new meanings can be created by arranging words. Formal semantics relies o ...



Spacetime
In physics, spacetime, also called the space-time continuum, is a mathematical model that fuses the three dimensions of space and the one dimension of time into a single four-dimensional continuum. Spacetime diagrams are useful in visualizing and understanding relativistic effects, such as how different observers perceive ''where'' and ''when'' events occur. Until the turn of the 20th century, the assumption had been that the three-dimensional geometry of the universe (its description in terms of locations, shapes, distances, and directions) was distinct from time (the measurement of when events occur within the universe). However, space and time took on new meanings with the Lorentz transformation and special theory of relativity. In 1908, Hermann Minkowski presented a geometric interpretation of special relativity that fused time and the three spatial dimensions into a single four-dimensional continuum now known as Minkowski space. This interpretation proved vital t ...
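What "fusing" space and time means can be made concrete with one standard formula (not drawn from the excerpt above): the spacetime interval of Minkowski space, on whose value all inertial observers agree even though they disagree about time and space separations individually.

```latex
% The Minkowski interval between two events, a standard illustration.
% Symbols: c = speed of light; \Delta t, \Delta x, \Delta y, \Delta z =
% the time and space separations measured by some inertial observer.
\[
  \Delta s^{2} = -c^{2}\,\Delta t^{2} + \Delta x^{2} + \Delta y^{2} + \Delta z^{2}
\]
% Observers related by a Lorentz transformation measure different
% \Delta t and \Delta x, but the combination \Delta s^{2} is invariant,
% which is why time and space form a single four-dimensional continuum.
```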


Agent-based Model
An agent-based model (ABM) is a computational model for simulating the actions and interactions of autonomous agents (individual or collective entities such as organizations or groups) in order to understand the behavior of a system and what governs its outcomes. It combines elements of game theory, complex systems, emergence, computational sociology, multi-agent systems, and evolutionary programming. Monte Carlo methods are used to understand the stochasticity of these models. Particularly within ecology, ABMs are also called individual-based models (IBMs). A review of recent literature on individual-based models, agent-based models, and multiagent systems shows that ABMs are used in many scientific domains including biology, ecology and social science. Agent-based modeling is related to, but distinct from, the concept of multi-agent systems or multi-agent simulation in that the goal of ABM is to search for explanatory insight into the collective behavior of agents ...
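A minimal sketch of the idea, invented for illustration rather than taken from any published model: autonomous agents interact pairwise under a simple local rule, a Monte Carlo loop drives the stochasticity, and an unequal wealth distribution emerges that no single agent's rule encodes.

```python
# A toy agent-based model: agents repeatedly exchange one unit of wealth.
import random

class Agent:
    def __init__(self, wealth=1):
        self.wealth = wealth

    def interact(self, other):
        # Local rule: give one unit to the other agent if we can afford it.
        if self.wealth > 0:
            self.wealth -= 1
            other.wealth += 1

def run_model(n_agents=100, n_steps=10_000, seed=0):
    random.seed(seed)
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(n_steps):                  # Monte Carlo loop over random pairings
        a, b = random.sample(agents, 2)
        a.interact(b)
    return sorted(agent.wealth for agent in agents)

wealth = run_model()
print("poorest:", wealth[0], "richest:", wealth[-1])  # emergent inequality
```

The explanatory payoff is exactly the ABM one: the system-level outcome (a skewed distribution) is observed, not programmed in.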




Jan Bergstra
Johannes Aldert "Jan" Bergstra (born 1951) is a Dutch computer scientist. His work has focused on logic and the theoretical foundations of software engineering, especially on formal methods for system design. He is best known as an expert on algebraic methods for the specification of data and computational processes in general. Biography Jan Bergstra was born in 1951 in Rotterdam, the son of Tjeerd Bergstra and Johanna Bisschop.Jan A. Bergstra (2009)Curriculum Vitae Jan Aldert Bergstra at ''uva.nl''. October 20, 2009. Accessed August 30, 2013 He was educated at the Montessori Lyceum Rotterdam (gymnasium beta) and then studied mathematics at Utrecht University, starting in 1969. After an MSc he wrote a PhD thesis, defended in 1976, on recursion theory in higher types, under the supervision of Dirk van Dalen. Bergstra held posts at the Institute of Applied Mathematics and Computer Science of the University of Leiden (1976–82), and the Centrum Wiskunde & Informatica (CWI) in Am ...



Promise Theory
Promise theory is a method of analysis suitable for studying any system of interacting components. In the context of information science, promise theory offers a methodology for organising and understanding systems by modelling voluntary cooperation between individual actors or agents, which make public their ''intentions'' to one another in the form of promises. Promise theory is grounded in graph theory and set theory. The goal of promise theory is to reveal the behavior of a whole by taking the viewpoint of the parts rather than the whole. In other words, it is a bottom-up, constructionist view of the world. Promise theory is not a technology or design methodology. It does not advocate any position or design principle, except as a method of analysis. Promise theory is being used in a variety of disciplines ranging from networking (SDN) and computer systems management to organizations and finance.

History

An early form of promise theory was proposed by physicist and compu ...
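Because the theory is grounded in graph theory, its core structure can be sketched as a labelled directed graph: agents are nodes and each promise is an edge from promiser to promisee carrying a declared intention. The toy class below is invented purely for illustration; it is not any promise-theory library, and the promise bodies are made up.

```python
# A toy sketch of promise theory's structure: agents as nodes, promises as
# labelled, directed edges from the promising agent to the receiving agent.
from collections import defaultdict

class PromiseGraph:
    def __init__(self):
        self.promises = defaultdict(list)     # agent -> list of (receiver, body)

    def promise(self, giver, receiver, body):
        """Record that `giver` voluntarily publishes an intention to `receiver`."""
        self.promises[giver].append((receiver, body))

    def promises_between(self, giver, receiver):
        """List the intentions `giver` has declared to `receiver` (bottom-up view)."""
        return [body for to, body in self.promises[giver] if to == receiver]

g = PromiseGraph()
g.promise("server", "client", "respond to queries within 100 ms")
g.promise("client", "server", "use the documented query format")
print(g.promises_between("server", "client"))
```

Note how the model is bottom-up: each agent can only promise its own behavior, and system-wide cooperation is read off from the collection of individual promises.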



Claude Shannon
Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was an American mathematician, electrical engineer, computer scientist, cryptographer and inventor known as the "father of information theory" and the man who laid the foundations of the Information Age. Shannon was the first to describe the use of Boolean algebra, essential to all digital electronic circuits, and helped found artificial intelligence (AI). Roboticist Rodney Brooks declared Shannon the 20th century engineer who contributed the most to 21st century technologies, and mathematician Solomon W. Golomb described his intellectual achievement as "one of the greatest of the twentieth century". At the University of Michigan, Shannon earned dual degrees, graduating with a Bachelor of Science in electrical engineering and another in mathematics, both in 1936. As a 21-year-old master's degree student in electrical engineering at MIT, he wrote a thesis, "A Symbolic Analysis of Relay and Switching Circuits", demonstrating that electric ...


Exponential Smoothing
Exponential smoothing or exponential moving average (EMA) is a rule-of-thumb technique for smoothing time series data using the exponential window function. Whereas in the simple moving average the past observations are weighted equally, exponential functions are used to assign exponentially decreasing weights over time. It is an easily learned and easily applied procedure for making some determination based on prior assumptions by the user, such as seasonality. Exponential smoothing is often used for the analysis of time-series data. Exponential smoothing is one of many window functions commonly applied to smooth data in signal processing, acting as a low-pass filter to remove high-frequency noise. This method is preceded by Poisson's use of recursive exponential window functions in convolutions from the 19th century, as well as Kolmogorov and Zurbenko's use of recursive moving averages from their studies of turbulence in the 1940s. The raw data sequence is often represented by {x_t} ...
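The simplest form of the technique is a one-line recursion: s_0 = x_0 and s_t = αx_t + (1 − α)s_{t−1}, where the smoothing factor α in (0, 1) is chosen by the user. The sketch below is a minimal illustration of that recursion, not any particular library's API; the data and the value α = 0.3 are made up.

```python
# A minimal sketch of simple exponential smoothing:
#   s_0 = x_0,   s_t = alpha * x_t + (1 - alpha) * s_{t-1}
def exponential_smoothing(xs, alpha=0.3):
    if not xs:
        return []
    smoothed = [xs[0]]                        # initialise with the first observation
    for x in xs[1:]:
        # Blend a little of the new observation with mostly the old estimate;
        # unrolling the recursion gives exponentially decaying weights on the past.
        smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
    return smoothed

noisy = [10, 12, 9, 11, 30, 10, 11, 12, 10, 9]  # 30 is a high-frequency spike
print(exponential_smoothing(noisy))              # the spike is damped: a low-pass filter
```

Smaller α gives heavier smoothing (slower response to new data); α close to 1 tracks the raw series almost exactly.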