
NetCDF



General Aspects

Purpose of these BAWiki Pages

These BAWiki pages describe all NetCDF conventions required to store BAW-specific data in NetCDF data files (see network common data form), i.e. all local conventions that go beyond the internationally agreed-upon CF metadata conventions are listed.

In cases where the internationally agreed-upon CF conventions are insufficient, essentially the Unstructured Grid Metadata Conventions for Scientific Datasets (UGRID Conventions) published on GitHub are used.
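
As a minimal sketch of how such convention identifiers can be recorded, the following Python example (using the netCDF4 package) writes them to the global Conventions attribute. The file name, title and the exact version strings are illustrative assumptions, not a BAW prescription.

  from netCDF4 import Dataset

  # Illustrative file name; convention identifiers are examples only.
  ds = Dataset("example_ugrid.nc", "w", format="NETCDF4")
  # Announce which metadata conventions the file claims to follow.
  ds.Conventions = "CF-1.6 UGRID-1.0"
  ds.title = "Example file combining CF and UGRID metadata"
  ds.close()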

Further activities related to OPeNDAP for extracting a selection of data defined on unstructured grids can be found, e.g., at OPULS.

Further widely used templates are the NODC NetCDF Templates. The NODC data center has recently been merged with other data centers and is now part of the National Centers for Environmental Information (NCEI).

The BAW instance of a NetCDF file, developed since 2010, is a file of type CF-NETCDF.NC.

Since version NetCDF-4.0, HDF5 (Hierarchical Data Format, see HDF5 Group) has been used as the underlying file format. Due to the use of HDF concepts, online compression of data stored in NetCDF files is supported, as well as chunking of variables to balance read performance for different data access patterns, e.g. time-series versus synoptic access.
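
The sketch below, using the Python netCDF4 package, illustrates both features for a hypothetical two-dimensional variable; dimension names, sizes and parameter values are assumptions chosen for demonstration only.

  from netCDF4 import Dataset

  ds = Dataset("demo_netcdf4.nc", "w", format="NETCDF4")  # NetCDF-4, i.e. HDF5-based
  ds.createDimension("time", None)      # unlimited record dimension
  ds.createDimension("node", 5000)      # hypothetical number of grid nodes

  # Online (deflate) compression plus explicit chunking: one chunk per time step.
  var = ds.createVariable(
      "waterlevel", "f4", ("time", "node"),
      zlib=True, complevel=4, chunksizes=(1, 5000))
  var.units = "m"
  ds.close()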

Important NetCDF Utilities

Important (helpful) NetCDF Utilities are:

  • NCDUMP: create a (selective) text representation of the contents of a NetCDF file (a rough Python counterpart is sketched after this list);
  • NCCOPY: (selectively) copy an existing NetCDF file to another file, change the level of compression, or change the internal file structure (File Chunking); and
  • NCGEN: create a NetCDF file from a CDL text file; optionally, C or FORTRAN code can also be generated automatically.
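
The utilities themselves are command-line programs. As a rough Python counterpart to the header output of NCDUMP, the netCDF4 package can print a CDL-like summary of an existing file; the file name below is a placeholder.

  from netCDF4 import Dataset

  # Print a CDL-like summary of dimensions, variables and attributes,
  # roughly comparable to the header output of NCDUMP.
  with Dataset("existing_file.nc", "r") as ds:
      print(ds)                                   # global structure and attributes
      for name, var in ds.variables.items():
          print(name, var.dimensions, var.dtype)  # per-variable overview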

File Chunking

The chunk size of variables stored in a CF NetCDF file may have a significant influence on read performance when data have to be read along different dimensions, e.g. spatial versus time-series access. Chunk sizes can be tuned individually using the NetCDF API. As a simple alternative, which is already helpful in many situations, the NCCOPY program can also be used. Further information about chunking can be found in the Unidata NetCDF documentation.
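
As a minimal illustration of tuning chunk sizes via the API, the following Python netCDF4 sketch stores the same hypothetical quantity with two different chunk layouts, one favouring synoptic (per time step) access and one favouring time-series (per node) access; variable names and dimension sizes are assumptions.

  from netCDF4 import Dataset

  ntime, nnode = 1000, 5000            # hypothetical dataset dimensions

  ds = Dataset("chunking_demo.nc", "w", format="NETCDF4")
  ds.createDimension("time", ntime)
  ds.createDimension("node", nnode)

  # Layout favouring synoptic access: one chunk holds a complete time step.
  ds.createVariable("waterlevel_synoptic", "f4", ("time", "node"),
                    chunksizes=(1, nnode))

  # Layout favouring time-series access: one chunk holds the full series of one node.
  ds.createVariable("waterlevel_timeseries", "f4", ("time", "node"),
                    chunksizes=(ntime, 1))
  ds.close()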

NetCDF vs. GRIB

Besides NetCDF, GRIB is also widely used. Concerning problems of interoperability between NetCDF and GRIB, a workshop was held at ECMWF in September 2014. Further information can be found on the website of the workshop on Closing the GRIB/NetCDF gap.

Literature

Signell, R. P. and Snowden, D. P. (2014): Advances in a Distributed Approach for Ocean Model Data Interoperability. J. Mar. Sci. Eng. 2014, 2, 194-208. Also describes the benefits of using the UGRID CF metadata standard to store data in NetCDF files.

How to acknowledge Unidata

"Software and technologies developed and distributed by the Unidata Program Center are (with very few exceptions) Free and Open Source, and you can use them in your own work with no restrictions. In order to continue developing software and providing services to the Unidata community, it is important that the Unidata Program Center be able to demonstrate the value of the technologies we develop and services we provide to our sponsors — most notably the National Science Foundation. Including an acknowledgement in your publication or web site helps us do this."

"It helps even more if we are aware of what you're doing. If you're using Unidata technologies and citing them in a paper, poster, thesis, or other venue, we'd be grateful if you would let us know about it by sending a short message to support@unidata.ucar.edu. Thanks!"

Informal

Citation

  • Unidata, (year): Package name version number [software]. Boulder, CO: UCAR/Unidata Program Center. Available from URL-to-software-page.

Where is NetCDF used?


For an overview please visit Where is NetCDF used?.

Terminology

Global Attributes

Grids

Time Coordinate

Vertical Coordinate

Horizontal Coordinate Reference System

Reduction of Dataset Size

Traditionally, up to the availability of NetCDF-4 (HDF), only a few options existed to reduce data set sizes. Now, with the availability of NetCDF-4 (HDF), it is recommended to use online compression instead. Online compression can be activated on a per-variable basis via the NetCDF API. For existing NetCDF files, NCCOPY also allows you to compress (online) the file after it has been created.
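
A minimal Python netCDF4 sketch of per-variable online compression follows; the variable name, data type and deflate level are illustrative assumptions only.

  from netCDF4 import Dataset

  ds = Dataset("compressed_demo.nc", "w", format="NETCDF4")
  ds.createDimension("time", None)
  ds.createDimension("node", 5000)

  # Activate online (deflate) compression for this variable only.
  var = ds.createVariable("temperature", "f4", ("time", "node"),
                          zlib=True, complevel=4, shuffle=True)
  var.units = "degC"
  ds.close()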

Data

Synoptic Data

Time Series Data

Analysis Data


back to Standard-Software-Applications (Add-ons)

