NetCDF
NetCDF (Network Common Data Form) is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. The project homepage is hosted by the Unidata program at the University Corporation for Atmospheric Research (UCAR), which is also the chief source of netCDF software, standards development, and updates. The format is an open standard. NetCDF Classic and 64-bit Offset Format are an international standard of the Open Geospatial Consortium. The project started in 1988 and is still actively supported by UCAR. The original netCDF binary format (released in 1990, now known as "netCDF classic format") is still widely used across the world and continues to be fully supported in all netCDF releases. Version 4.0 (released in 2008) allowed the use of the HDF5 data file format. Version 4.1 (2010) added support for C and Fortran client access to specified subsets of remote data via OPeNDAP. Version 4.3.0 (2012) added a CMake build system for Windows builds. Version 4.7.0 (2019) added support for reading Amazon S3 objects. Further releases are planned to improve performance, add features, and fix bugs.


History

The format was originally based on the conceptual model of the Common Data Format developed by NASA, but has since diverged and is not compatible with it.


Format description

The netCDF libraries support multiple binary formats for netCDF files:

* The classic format was used in the first netCDF release, and is still the default format for file creation.
* The 64-bit offset format was introduced in version 3.6.0, and supports larger variable and file sizes.
* The netCDF-4/HDF5 format was introduced in version 4.0; it is the HDF5 data format, with some restrictions.
* The HDF4 SD format is supported for read-only access.
* The CDF5 format is supported, in coordination with the parallel-netcdf project.

All formats are "self-describing". This means that there is a header which describes the layout of the rest of the file, in particular the data arrays, as well as arbitrary file metadata in the form of name/value attributes. The format is platform independent, with issues such as endianness being addressed in the software libraries. The data are stored in a fashion that allows efficient subsetting. Starting with version 4.0, the netCDF API allows the use of the HDF5 data format. NetCDF users can create HDF5 files with benefits not available with the netCDF format, such as much larger files and multiple unlimited dimensions. Full backward compatibility in accessing old netCDF files and using previous versions of the C and Fortran APIs is supported.
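Because each on-disk variant begins with a distinctive signature, the variant of a netCDF file can be identified from its first few bytes: classic-model files start with the ASCII bytes `CDF` followed by a version byte (1 for classic, 2 for 64-bit offset, 5 for CDF5), while netCDF-4 files carry the standard HDF5 signature. A minimal sketch in Python using only the standard library (the helper name `detect_netcdf_format` is illustrative, not part of any netCDF API):

```python
import struct

# Signature shared by all HDF5 files, including netCDF-4 files.
HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"

def detect_netcdf_format(header: bytes) -> str:
    """Guess the netCDF variant from the first 8 bytes of a file."""
    if header.startswith(HDF5_MAGIC):
        return "netCDF-4/HDF5"
    if header[:3] == b"CDF":
        version = header[3]
        return {1: "classic",
                2: "64-bit offset",
                5: "CDF5"}.get(version, "unknown CDF version")
    return "not a netCDF file"

# A classic-format file begins with 'CDF', version byte 1,
# then a 32-bit big-endian record count (0 here).
classic_header = b"CDF\x01" + struct.pack(">i", 0)

print(detect_netcdf_format(classic_header))   # classic
print(detect_netcdf_format(HDF5_MAGIC))       # netCDF-4/HDF5
```

This is how generic tools distinguish the variants without opening the file through a netCDF library.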


Software


Access libraries

The software libraries supplied by UCAR provide read-write access to netCDF files, encoding and decoding the necessary arrays and metadata. The core library is written in C, and provides an API for C, C++ and two APIs for Fortran applications, one for Fortran 77, and one for Fortran 90. An independent implementation, also developed and maintained by Unidata, is written in 100% Java; it extends the core data model and adds additional functionality. Interfaces to netCDF based on the C library are also available in other languages including R (''ncdf'', ''ncvar'' and ''RNetCDF'' packages), Perl, Python, Ruby, Haskell, Mathematica, MATLAB, IDL, Julia and Octave. The specification of the API calls is very similar across the different languages, apart from inevitable differences of syntax. The API calls for version 2 were rather different from those in version 3, but are also supported by versions 3 and 4 for backward compatibility. Application programmers using supported languages need not normally be concerned with the file structure itself, even though the formats are openly documented.
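The call pattern shared by the language bindings — define dimensions, define variables over those dimensions, attach attributes, then write data — can be illustrated with a small in-memory model. This is a conceptual sketch in plain Python, not the actual netCDF API; the class and method names loosely echo the C API's define-mode workflow (nc_def_dim, nc_def_var, nc_put_att) but are invented for illustration:

```python
class SketchDataset:
    """Toy stand-in for a netCDF dataset: dimensions, variables, attributes."""

    def __init__(self):
        self.dimensions = {}  # name -> length
        self.variables = {}   # name -> {"dims": (...), "attributes": {}}

    def def_dim(self, name, length):
        """Define a named dimension (cf. nc_def_dim)."""
        self.dimensions[name] = length

    def def_var(self, name, dims):
        """Define a variable over previously defined dimensions (cf. nc_def_var)."""
        for d in dims:
            assert d in self.dimensions, f"undefined dimension: {d}"
        self.variables[name] = {"dims": tuple(dims), "attributes": {}}

    def put_att(self, var, key, value):
        """Attach a name/value attribute to a variable (cf. nc_put_att)."""
        self.variables[var]["attributes"][key] = value

ds = SketchDataset()
ds.def_dim("time", 0)        # 0 ~ the unlimited dimension of the classic model
ds.def_dim("lat", 180)
ds.def_var("temperature", ("time", "lat"))
ds.put_att("temperature", "units", "kelvin")
print(ds.variables["temperature"]["attributes"])   # {'units': 'kelvin'}
```

The same define-then-write sequence appears, with language-appropriate syntax, in the C, Fortran, Java, and scripting-language bindings.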


Applications

A wide range of application software has been written which makes use of netCDF files. These range from command line utilities to graphical visualization packages. A number are listed below, and a longer list is on the UCAR website.

* A commonly used set of Unix command line utilities for netCDF files is the NetCDF Operators (NCO) suite, which provides a range of commands for manipulation and analysis of netCDF files, including basic record concatenation, array slicing, and averaging.
* ncBrowse is a generic netCDF file viewer that includes Java graphics, animations and 3D visualizations for a wide range of netCDF file conventions.
* ncview is a visual browser for netCDF format files. This program is a simple, fast, GUI-based tool for visualising fields in a netCDF file. One can browse through the various dimensions of a data array, taking a look at the raw data values. It is also possible to change color maps, invert the data, etc.
* Panoply is a netCDF file viewer developed at the NASA Goddard Institute for Space Studies which focuses on presentation of geo-gridded data. It is written in Java and thus platform independent. Although its feature set overlaps with ncBrowse and ncview, Panoply is distinguished by offering a wide variety of map projections and the ability to work with different scale color tables.
* The NCAR Command Language (NCL) is used to analyze and visualize data in netCDF files (among other formats).
* The Python programming language can access netCDF files with the PyNIO module (which also facilitates access to a variety of other data formats). netCDF files can also be read with the Python module netCDF4-python, and into a pandas-like DataFrame with the xarray module.
* Ferret is an interactive computer visualization and analysis environment designed to meet the needs of oceanographers and meteorologists analyzing large and complex gridded data sets. Ferret offers a Mathematica-like approach to analysis; new variables may be defined interactively as mathematical expressions involving data set variables. Calculations may be applied over arbitrarily shaped regions. Fully documented graphics are produced with a single command.
* The Grid Analysis and Display System (GrADS) is an interactive desktop tool that is used for easy access, manipulation, and visualization of earth science data. GrADS has been implemented worldwide on a variety of commonly used operating systems and is freely distributed over the Internet.
* nCDF_Browser is a visual nCDF browser, written in the IDL programming language. Variables, attributes, and dimensions can be immediately downloaded to the IDL command line for further processing. All the Coyote Library files necessary to run nCDF_Browser are available in the zip file.
* ArcGIS versions after 9.2 support netCDF files that follow the Climate and Forecast (CF) Metadata Conventions and contain rectilinear grids with equally-spaced coordinates. The Multidimensional Tools toolbox can be used to create raster layers, feature layers, and table views from netCDF data in ArcMap, or convert feature, raster, and table data to netCDF.
* OriginPro version 2021b supports the netCDF CF Convention. Averaging can be performed during import to allow handling of large datasets in a GUI software.
* The Geospatial Data Abstraction Library (GDAL) provides support for read and write access to netCDF data.
* netCDF Explorer is a multi-platform graphical browser for netCDF files. netCDF Explorer can browse files locally or remotely, by means of OPeNDAP.
* R supports netCDF through packages such as ''ncdf4'' (including HDF5 support) or ''RNetCDF'' (no HDF5 support).
* HDFql enables users to manage netCDF-4/HDF5 files through a high-level language (similar to SQL) in C, C++, Java, Python, C#, Fortran and R.
* ECMWF's Metview workstation and batch system can handle netCDF together with GRIB and BUFR.
* OpenChrom ships a converter under the terms of the Eclipse Public License.


Common uses

It is commonly used in climatology, meteorology and oceanography applications (e.g., weather forecasting, climate change) and GIS applications. It is an input/output format for many GIS applications, and for general scientific data exchange. To quote from the Unidata site:

: "NetCDF (network Common Data Form) is a set of interfaces for array-oriented data access and a freely-distributed collection of data access libraries for C, Fortran, C++, Java, and other languages. The netCDF libraries support a machine-independent format for representing scientific data. Together, the interfaces, libraries, and format support the creation, access, and sharing of scientific data."


Conventions

The Climate and Forecast (CF) conventions are metadata conventions for earth science data, intended to promote the processing and sharing of files created with the NetCDF Application Programming Interface (API). The conventions define metadata that are included in the same file as the data (thus making the file "self-describing"), that provide a definitive description of what the data in each variable represents, and of the spatial and temporal properties of the data (including information about grids, such as grid cell bounds and cell averaging methods). This enables users of data from different sources to decide which data are comparable, and allows building applications with powerful extraction, regridding, and display capabilities.
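Concretely, CF metadata attach standardized attributes such as `units` and `standard_name` to each variable, and it is these attributes that let software decide whether two variables are comparable. A minimal sketch (`units` and `standard_name` are real CF attribute names; the variable values and the `comparable` helper are invented for illustration):

```python
# CF-style variable metadata: real CF attribute names, toy values.
var_a = {"standard_name": "air_temperature", "units": "K"}
var_b = {"standard_name": "air_temperature", "units": "K"}
var_c = {"standard_name": "sea_water_salinity", "units": "1e-3"}

def comparable(x, y):
    """In this sketch, two variables are directly comparable when they
    describe the same physical quantity in the same units."""
    return (x["standard_name"] == y["standard_name"]
            and x["units"] == y["units"])

print(comparable(var_a, var_b))  # True
print(comparable(var_a, var_c))  # False
```

Real CF-aware tools go further (e.g., converting between compatible units), but the principle is the same: comparability is decided from the self-describing metadata, not from variable names.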


Parallel-NetCDF

An extension of netCDF for parallel computing called Parallel-NetCDF (or PnetCDF) has been developed by Argonne National Laboratory and Northwestern University. This is built upon MPI-IO, the I/O extension to MPI communications. Using the high-level netCDF data structures, the Parallel-NetCDF libraries can make use of optimizations to efficiently distribute file reads and writes among multiple processors. The Parallel-NetCDF package can read/write only the classic and 64-bit offset formats; it cannot read or write the HDF5-based format available with netCDF-4.0. The Parallel-NetCDF package uses different, but similar, APIs in Fortran and C. Parallel I/O in the Unidata netCDF library has been supported since release 4.0, for HDF5 data files. Since version 4.1.1 the Unidata netCDF C library supports parallel I/O to classic and 64-bit offset files using the Parallel-NetCDF library, but with the netCDF API.
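The usual pattern for parallel access is that each MPI rank reads or writes a distinct hyperslab of a shared variable, described by per-rank start and count vectors (PnetCDF's `ncmpi_put_vara`-style calls take arguments of this shape). A stdlib-only sketch of such a block decomposition along one dimension (the function name is invented for illustration):

```python
def block_partition(length: int, nprocs: int, rank: int) -> tuple[int, int]:
    """Return (start, count) for this rank's slice of a dimension of size
    `length`, spreading any remainder over the lowest-numbered ranks."""
    base, extra = divmod(length, nprocs)
    count = base + (1 if rank < extra else 0)
    start = rank * base + min(rank, extra)
    return start, count

# 10 records split across 4 ranks: contiguous, non-overlapping slices.
print([block_partition(10, 4, r) for r in range(4)])
# [(0, 3), (3, 3), (6, 2), (8, 2)]
```

Each rank then passes its own (start, count) pair to the collective read/write call, so the library can coordinate one large, well-aligned I/O operation instead of many small ones.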


Interoperability of C/Fortran/C++ libraries with other formats

The netCDF C library, and the libraries based on it (Fortran 77 and Fortran 90, C++, and all third-party libraries) can, starting with version 4.1.1, read some data in other data formats. Data in the HDF5 format can be read, with some restrictions. Data in the HDF4 format can be read by the netCDF C library if created using the HDF4 Scientific Data (SD) API.


NetCDF-Java common data model

The NetCDF-Java library currently reads the following file formats and remote access protocols:

* BUFR Format Documentation (ongoing development)
* CINRAD level II (Chinese radar format)
* DMSP (Defense Meteorological Satellite Program)
* DORADE radar file format
* GINI (GOES Ingest and NOAA PORT Interface) image format
* GEMPAK gridded data
* GRIB version 1 and version 2 (ongoing work on tables)
* GTOPO 30-sec elevation dataset (USGS)
* Hierarchical Data Format (HDF4, HDF-EOS2, HDF5, HDF-EOS5)
* NetCDF (classic and large format)
* NetCDF-4 (built on HDF5)
* NEXRAD Radar level 2 and level 3

There are a number of other formats in development. Since each of these is accessed transparently through the NetCDF API, the NetCDF-Java library is said to implement a common data model for scientific datasets. The Java common data model has three layers, which build on top of each other to add successively richer semantics:

1. The ''data access'' layer, also known as the syntactic layer, handles data reading.
2. The ''coordinate system'' layer identifies the coordinates of the data arrays. Coordinates are a completely general concept for scientific data; specialized georeferencing coordinate systems, important to the Earth science community, are specially annotated.
3. The ''scientific data type'' layer identifies specific types of data, such as grids, images, and point data, and adds specialized methods for each kind of data.

The data model of the data access layer is a generalization of the NetCDF-3 data model, and substantially the same as the NetCDF-4 data model. The coordinate system layer implements and extends the concepts in the Climate and Forecast (CF) Metadata Conventions. The scientific data type layer allows data to be manipulated in coordinate space, analogous to the Open Geospatial Consortium specifications. The identification of coordinate systems and data typing is ongoing, but users can plug in their own classes at runtime for specialized processing.
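The three layers can be pictured as successive wrappers, each adding semantics on top of the one below. A conceptual sketch in Python (the class names are invented for illustration and are not the NetCDF-Java classes):

```python
class DataAccessLayer:
    """Syntactic layer: reads raw arrays by variable name."""
    def __init__(self, arrays):
        self.arrays = arrays           # name -> list of values
    def read(self, name):
        return self.arrays[name]

class CoordinateSystemLayer:
    """Adds coordinates: maps array indices to coordinate values."""
    def __init__(self, access, coords):
        self.access = access
        self.coords = coords           # variable name -> coordinate variable
    def value_at(self, name, i):
        coord_var = self.coords[name]
        return self.access.read(coord_var)[i], self.access.read(name)[i]

class GridScientificDataType:
    """Scientific data type layer: grid-specific convenience methods."""
    def __init__(self, cs):
        self.cs = cs
    def time_series(self, name, n):
        return [self.cs.value_at(name, i) for i in range(n)]

# A 1-D "grid": temperature indexed by a time coordinate.
access = DataAccessLayer({"time": [0, 6, 12], "temp": [280.0, 283.5, 285.1]})
cs = CoordinateSystemLayer(access, {"temp": "time"})
print(cs.value_at("temp", 1))   # (6, 283.5)
```

The point of the layering is that format-specific code lives entirely in the bottom layer, while the upper layers work the same way regardless of which of the supported formats supplied the arrays.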


See also

* Common Data Format (CDF)
* CGNS (CFD General Notation System)
* EAS3 (Ein-Ausgabe-System)
* FITS (Flexible Image Transport System)
* GRIB (GRIdded Binary)
* Hierarchical Data Format (HDF)
* OPeNDAP client-server protocols
* Tecplot binary files
* XDMF (eXtensible Data Model and Format)
* XMDF (eXtensible Model Data Format)




External links

* Official website
* NetCDF User's Guide — describes the file format
* "An Introduction to Distributed Visualization" (ftp://ftp.uni-duisburg.de/FlightGear/Devel/An_Introduction_to_Distributed_Visualization.pdf) — section 4.2 contains a comparison of CDF, HDF, and netCDF
* Animating NetCDF Data in ArcMap

