
GHCNData


Utility functionality to help you get hold of daily data from the Global Historical Climatology Network archive.

If you use this data, you should acknowledge it appropriately. Instructions for doing so can be found at the top of NOAA's readme.

Why Bother?

While the GHCN data is fairly straightforward, it's not as simple as downloading a single file and opening it as a DataFrame. There are a few different kinds of file to be aware of, each with a well-documented but non-standard format. As such, it makes sense to implement functionality for loading these files into a format more amenable to standard workflows.

Usage

Data Loading

This package provides helper functions to download and load the data offered by NOAA. There are four core functions that you should be aware of:

load_station_metadata
load_inventories
load_data_file
load_countries_metadata

Each of these functions downloads the corresponding data using DataDeps.jl if it's not already available locally, and parses it into a DataFrame.
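
For instance, a minimal session might look something like the sketch below. The station ID and the comments about each table's contents are purely illustrative, not a complete description of the columns:

```julia
using GHCNData
using DataFrames

# Each call downloads the relevant file via DataDeps.jl on first use,
# then parses it into a DataFrame.
stations = load_station_metadata()     # one row per station (ID, location, elevation, name, ...)
inventories = load_inventories()       # which elements each station records, and over which years
countries = load_countries_metadata()  # country codes used as station-ID prefixes

# Daily records for a single station, identified by its GHCN ID.
# "USW00094728" is just an example ID -- use any ID found in the tables above.
df = load_data_file("USW00094728")
```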

NOAA's documentation is the best place to look to understand these files, but the docstrings in this package provide a brief overview.

Typical Workflows

Commonly, you'll want to load all of the data associated with a particular collection of stations in a particular region of the world. There are basically two steps to do this:

  1. Use load_inventories() to find out which stations exist at which latitudes / longitudes, and their corresponding IDs.
  2. Use load_data_file(station_id) to load the data for each station you've found in your region of interest.

For an example of this kind of thing, see the code for select_data in dataset_loading.jl.
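
As a rough sketch of those two steps (this is not the exact implementation of select_data, and the inventory column names :ID, :LATITUDE, and :LONGITUDE are assumptions based on NOAA's file format -- check `names(load_inventories())` if they differ):

```julia
using GHCNData
using DataFrames

# 1. Find the IDs of stations inside a region of interest.
inventories = load_inventories()
in_region = filter(
    row -> 50.0 <= row.LATITUDE <= 60.0 && -11.0 <= row.LONGITUDE <= 2.0,
    inventories,
)
station_ids = unique(in_region.ID)

# 2. Load the daily data for each of those stations.
station_data = Dict(id => load_data_file(id) for id in station_ids)
```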

You might also be interested in, for example, the properties of the station in question (e.g. its elevation). For that data, use load_station_metadata().
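
For example (again assuming NOAA-style :ID and :ELEVATION column names, and an illustrative station ID):

```julia
metadata = load_station_metadata()

# Look up the metadata row -- including elevation -- for a given station ID.
station_info = filter(row -> row.ID == "USW00094728", metadata)
```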

Helper Functions

This package presently provides two pieces of functionality for processing the data once it has been loaded.

select_data pretty much implements the workflow discussed above.

convert_to_time_series "stacks" the output of load_data_file, converting from a format in which 1 row == 1 month (different days' data live in different columns in the raw data) to one in which 1 row == 1 day.

Both functions are quite opinionated, so while they're hopefully helpful examples of things that you might want to do with the GHCN data, you'll probably need to tweak them a bit for your use-case.
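
If the package's interface doesn't fit your needs, the same kind of stacking can be done directly with DataFrames.jl. The toy example below (with made-up column names and values) only illustrates the reshape that convert_to_time_series performs:

```julia
using DataFrames

# Raw GHCN daily files have one row per (station, element, month), with one
# value column per day of the month.
wide = DataFrame(
    ID = ["USW00094728", "USW00094728"],
    YEAR = [2020, 2020],
    MONTH = [1, 2],
    VALUE1 = [10, 12],  # day-1 values
    VALUE2 = [11, 13],  # day-2 values
)

# Stack the per-day columns so that each row corresponds to a single day.
long = stack(wide, [:VALUE1, :VALUE2]; variable_name = :DAY, value_name = :VALUE)
```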

Missing Functionality and Contributing

If you build on this functionality, please consider contributing back so that we can make all of our lives easier! Similarly, please open an issue (or, even better, a PR) if you feel that something that would be useful is missing.

Development has been driven on an as-needed basis, so while this package will grab most (all?) of the daily data for you, it is a little sparse on utility functionality. In particular, please note that convert_to_time_series and select_data may make assumptions about the data that are not appropriate for your use case. If in doubt, I would recommend using the functionality in dataset_loading.jl, as it just provides helpers for extracting the data.

Moreover, the package doesn't currently implement anything to grab or process the monthly data, but extending the existing functionality to do so should be straightforward.

Bug Reporting

If you either find a bug or think something looks suspicious, please open an issue / PR. When deciding whether to do so, note that it's generally better to open an issue erroneously (no harm is done if it turns out there wasn't a problem after all) than to let a problem slip by (data-related bugs cause papers to be retracted and generally hold back progress). If in doubt, open an issue.

Why are there so few tests?

Three of the four core functions listed above are lightly tested -- load_data_file has yet to be tested because, as presently implemented, the CI runner would need to download the entire collection of daily data for each run, which seems impractical. If you have any suggestions for how to alleviate this, please open an issue / PR!

Related Work

Scott Hosking provides similar functionality in a Python package.
