The San Diego Supercomputer Center (SDSC) is an organized research unit of the University of California, San Diego (UCSD). SDSC is located at the east end of the UCSD campus's Eleanor Roosevelt College, immediately north of the Hopkins Parking Structure.
The current SDSC director is Michael L. Norman, a UCSD physics professor, who succeeded noted grid computing pioneer Francine Berman. He was named director in September 2010, having been interim director for more than a year.
SDSC was founded in 1985 and describes its mission as "developing and using technology to advance science." SDSC was one of the five original NSF Supercomputing Centers, and the National Science Foundation (NSF) continues to be a primary funder of the center. Its research pursuits include high-performance computing, grid computing, computational biology, geoinformatics, computational physics, computational chemistry, data management, scientific visualization, and computer networking. SDSC's contributions to the computational biosciences and its computational approaches to the earth sciences and genomics are internationally recognized.
Divisions and projects
SDSC's roles include creating and maintaining the Protein Data Bank (PDB); among its especially well-known projects are the George E. Brown, Jr. Network for Earthquake Engineering Simulation Cyberinfrastructure Center (NEESit), cyberinfrastructure for the geosciences (GEON), and the Tree of Life Project (TOL).
SDSC is one of the four original TeraGrid project sites, along with the National Center for Supercomputing Applications (NCSA), Argonne National Laboratory, and the Center for Advanced Computing Research (CACR).
SDSC is a pioneer in data management software development, having developed the Rocks cluster computing environment and the Storage Resource Broker (SRB).
SDSC is home to the Performance Modeling and Characterization (PMaC) laboratory, whose mission is to bring scientific rigor to the prediction and understanding of factors affecting the performance of current and projected high-performance computing (HPC) platforms. PMaC is funded by the Department of Energy (SciDAC PERC research grant), the Department of Defense (Navy DSRC PET program), DARPA, and the National Science Foundation. Allan E. Snavely founded the PMaC laboratory in 2001.
In 2009, a combined team from SDSC and Lawrence Berkeley National Laboratory, led by Allan Snavely, won the prestigious Data Challenge competition at SC09 in Portland, Oregon, the annual premier conference in high-performance computing, networking, storage, and analysis. The team was recognized for its design of a new kind of supercomputer, nicknamed "Dash", that makes extensive use of flash memory. Dash is a prototype for a much larger system, nicknamed "Gordon", that the team will deploy at SDSC in 2011 with more than 256 TB of flash memory.
SDSC is also home to the Center for Applied Internet Data Analysis (CAIDA) and the Computational and Applied Statistics Laboratory (CASL). CAIDA is a collaboration of government, research, and commercial entities working together to improve the Internet. It features an academic network test infrastructure called the ''Archipelago Measurement Infrastructure'' (Ark), similar to networks such as PlanetLab and RIPE Atlas.
See also
* National Digital Library Program (NDLP)
* National Digital Information Infrastructure and Preservation Program (NDIIPP)
External links
* SDSC
* PMaC
* Allan E. Snavely
* NEESit
* Chronopolis
{{authority control}}
University of California, San Diego
Supercomputer sites
E-Science
Cyberinfrastructure
La Jolla, San Diego
1985 establishments in California