
AWAP GRIDS


TODO list

commit the 'get AWAP data' test after Liz

links to get pgutils

  • http://download.osgeo.org/postgis/windows/pg92/postgis-pg92-binaries-2.0.2w64.zip (17-Jan-2013, 21 MB)
  • http://www.enterprisedb.com/products-services-training/pgbindownload

check solar radiation solarave_2001050820010508.grid

  • http://www.bom.gov.au/web03/ncc/www/awap/solar/solarave/daily/grid/0.05/history/nat/1993080619930806.grid.Z
  • http://www.bom.gov.au/web03/ncc/www/awap/solar/solarave/daily/grid/0.05/history/nat/2001050820010508.grid.Z

make the main AWAP links obvious

Introduction

NCEPH holds Australian Bureau of Meteorology data for all stations from 1990 to 2010 \cite{NationalClimateCentreoftheBureauofMeteorology:2005}. The aim of this project is to download the Australian Water Availability Project (AWAP) gridded datasets \cite{Jones2009}. In particular, we want the vapour pressure data from 2010 so that we do not have to buy it again, and we want to compare the gridded data with the station data to check how closely they agree.

The Codes

plan

init

project

create schema

config

main

main

deprecated main

scoping

deprecated scoping

load

load if day not available

load mirrored GRIDS

run multiple sessions

check database size

deprecated code

deprecated load

#+BEGIN_SRC R
# p <- getPassword()  # deprecated password prompt
#+END_SRC

test upload to postgres

raster2pgsql

http://postgis.refractions.net/docs/using_raster.xml.html#RT_Raster_Loader

For example:

#+BEGIN_SRC sh
raster2pgsql -s 4236 -I -C -M -F -t 100x100 *.tif public.demelevation > elev.sql
psql -d gisdb -f elev.sql
#+END_SRC
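Here -s sets the declared SRID, -I builds a spatial (GiST) index, -C applies the standard raster constraints, -M runs VACUUM ANALYZE after loading, -F adds a column recording each source file name, and -t 100x100 splits the raster into 100x100-pixel tiles (one per table row); the generated SQL is then loaded with psql.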

SQL extraction

test geotiff

save storage space as geotiff

test readGDAL

test uncompress

unresponsive psql on some raster2pgsql.sql files

move from rawdata (or 5-year chunks) to one-year directories

clean

clean-check-against-stations

do

zones

Tests

DRO AWAP grids

Introduction and Methods

This is a work in progress. It is the stub of an article I want to put together showing how to use several online data repositories together, as a showcase of the [Scientific Workflow and Integration Software for Health (SWISH) Climate Impact Assessments](https://github.com/swish-climate-impact-assessment) project.

Authors

Background

  • Markus Nolf offers this use case of the [SWISH EWEDB](http://swish-climate-impact-assessment.github.io/)
  • Markus is pulling together his Daintree Rainforest Observatory (DRO) data into a manuscript for publication, and was looking for climate data from 2012 as well as long-term.
  • More specifically, annual precipitation and mean annual temperature, both for 2012 and as a 30-year mean.
  • The Australian Bureau of Meteorology has a nice rainfall dataset available at http://www.bom.gov.au/climate/data/ (“Cape Trib Store” weather station), but it seems like the temperature records are patchy.
  • So it is advised to use the data the DRO collects itself.
  • You need to apply through the [ASN SuperSite data portal](http://www.tern-supersites.net.au/knb/) for access to the daily data for the DRO.
  • Note that use of the DRO met data will need to be properly cited, as it is harder to keep an AWS station running in the tropics for years than it is to collect most other data. The citation information is provided when you make a request to access the data.

  • The long-term mean used by most DRO researchers is from the BOM station, as we only have a short record from the DRO station itself. The offset is around 1000 mm.
  • However, what we want is mean annual temperature, but the BOM website seems to focus more on mean minimum and maximum temperatures.

Material and Methods

Baseline Climate Data 2012, Far North Queensland Rainforest Supersite, Cape Tribulation Node

Extract mean annual temperatures at the BOM website

  • SWISH uses BoM data a fair bit and aims to streamline access to BoM data for extreme weather event analysis, which requires long-term average climatology to provide the baseline against which extremes are measured.
  • With respect to temperature, most daily averages from BoM are calculated as the average of maximum_temperature_in_24_hours_after_9am_local_time_in_degrees and minimum_temperature_in_24_hours_before_9am_local_time_in_degrees (only a couple of hundred AWSs provide the hourly data needed for a proper mean of 24 observations).
  • The Bureau of Meteorology has generated a range of gridded meteorological datasets for Australia as a contribution to the Australian Water Availability Project (AWAP). These include daily maximum and minimum temperature, which you could use to generate daily averages and then calculate your long-term averages from those; see the sketch after this list.
  • http://www.bom.gov.au/jsp/awap/
  • Documentation is at http://www.bom.gov.au/amm/docs/2009/jones.pdf
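A minimal sketch of that daily-average calculation, assuming the raster package and two already-downloaded, uncompressed AWAP grids (the file names below are hypothetical examples of the AWAP naming pattern; this is not the project's production code):

#+BEGIN_SRC R
# Compute a BoM-style daily mean temperature grid as (tmax + tmin) / 2.
# Assumes the two AWAP grids (Arc/Info ASCII format once uncompressed)
# are in the working directory; file names are hypothetical.
library(raster)

tmax <- raster("maxave_2012010120120101.grid")  # daily maximum temperature
tmin <- raster("minave_2012010120120101.grid")  # daily minimum temperature

tmean <- (tmax + tmin) / 2

# value at the DRO coordinates (lon = 145.4494, lat = -16.1041)
extract(tmean, cbind(145.4494, -16.1041))
#+END_SRC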

A workflow to download and process the public BoM weather grids.

  • This workflow uses the open source R software with some of our custom-written packages:

R-depends

R-code-for-extraction
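The full dependency and extraction chunks live in the source org file. As a hedged sketch of the download step only, here is one way the grid retrieval might look, assuming the AWAP URL pattern shown in the TODO list above (the function name get_awap_grid is mine, not part of the SWISH packages):

#+BEGIN_SRC R
# Download one national daily AWAP grid; the URL pattern is taken from the
# links earlier in this document. The .Z file must be uncompressed (e.g.
# with gunzip/uncompress) before it can be read as a grid.
get_awap_grid <- function(var = "solar/solarave",
                          date = as.Date("2001-05-08"),
                          dest = ".") {
  d <- format(date, "%Y%m%d")
  url <- sprintf(
    "http://www.bom.gov.au/web03/ncc/www/awap/%s/daily/grid/0.05/history/nat/%s%s.grid.Z",
    var, d, d)
  destfile <- file.path(dest, basename(url))
  download.file(url, destfile, mode = "wb")
  destfile
}
#+END_SRC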

Results

Results

  • Markus reports:
  • “The R-script worked great once I had set a working directory that did not include spaces. (It may have been a different problem that got solved by changing the wd, but the important thing is it’s running now.)”
  • Markus downloaded 70+ GB of gridded weather data from the BoM website to his local computer.
  • Also note there is another set of gridded data available from the BOM, which contains pre-computed long-term mean temperatures, [ready to be extracted with the script](http://reg.bom.gov.au/jsp/ncc/climate_averages/temperature/index.jsp?maptype=6&period=#maps)
  • “Using this file, I only needed to get the 2012 temp grids for a comparison of 2012 vs. 30-year data. I’m going to run the extraction of 1961-1990 data, just to be sure.”
  • “When we finished the analysis of the long-term temperature from daily means, we found:
  • While the official, pre-computed long-term mean (i.e. 30-year grid file, analysed with your script) was 22.29 °C for the DRO coordinates (145.4494, -16.1041), the new value from daily means (i.e. daily minave and maxave averaged) is 24.91 °C.
  • We’re not sure what causes this discrepancy, but thought we’d note that there is one.
  • For the manuscript, we ended up using the means obtained via BOM’s method* to compare 1961-1990 values to 2012, both computed with the above script.
  • (* average of daily min/max temperature for each year, then averaged across the entire 30 year period)

Dataset discrepancy

  • Following up on the interesting difference between the two BoM datasets.
  • One possible cause is whether the 30-year mean is calculated as the average of the annual averages, i.e. sum(annavs)/30, or as the average of all the daily averages, i.e. sum(dailyavs)/(30 * 365 or 366); the variance differs between these two methods.
  • It looks like the 30-year dataset uses the former:
  • “Average annual temperatures (maximum, minimum or mean) are calculated by adding daily temperature values each year, dividing by the number of days in that year to get an average for that particular year. The average values for each year in a specified period (1961 to 1990) are added together and the final value is calculated by dividing by the number of years in the period (30 years in this case).”

[metadata](http://reg.bom.gov.au/jsp/ncc/climate_averages/temperature/index.jsp?maptype=6&period=#maps)

  • Markus followed the BOM calculation method and compared it with two other approaches (see the toy illustration below):
  • average of all 21914 values
  • average of yearly sum(min and max values per year)/(ndays*2)
  • average of yearly sum(daily average)/ndays
  • where ndays = number of days per year.
  • Differences between these methods show up only in the 6th decimal place, nowhere near the 2.62 °C discrepancy.

R-code-for-comparison
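The full comparison chunk lives in the source org file. Below is a toy illustration only, using simulated data rather than Markus's actual code, of the three averaging methods listed above:

#+BEGIN_SRC R
# Simulated daily min/max temperatures over the 30-year period 1961-1990.
set.seed(1)
dates <- seq(as.Date("1961-01-01"), as.Date("1990-12-31"), by = "day")
year  <- as.integer(format(dates, "%Y"))
tmin  <- rnorm(length(dates), mean = 20, sd = 2)
tmax  <- rnorm(length(dates), mean = 28, sd = 2)

ndays <- tapply(dates, year, length)  # days per year (365 or 366)

# 1. average of all values (min and max pooled)
m1 <- mean(c(tmin, tmax))
# 2. average of yearly sum(min and max values per year) / (ndays * 2)
m2 <- mean(tapply(tmin + tmax, year, sum) / (ndays * 2))
# 3. average of yearly sum(daily average) / ndays (BoM's stated method)
m3 <- mean(tapply((tmin + tmax) / 2, year, mean))

c(m1, m2, m3)  # agree to several decimal places, as reported above
#+END_SRC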

Discussion

  • Principal findings: very convenient automated extraction of location-based time-series data for the precise period that is requested.
  • Weaknesses (whole method, not your script): very long download time for daily grids (~11,000 grids is a huge dataset; it took several days in my case). Yearly grids would be beneficial (and I believe most others are also looking mainly for data on a yearly or larger scale).

Conclusion

  • Take-home message: this seems like a perfect case of “double-check the data using one and the same method”.

daily

Seasonal Rainfall

2014-02-28-long-term-climatology-contextual-data-for-ecological-research

prod

monthly

Conclusions

versions