The TERN Australian SuperSite Network (SuperSites) is a national network of multidisciplinary ecosystem observatories that improve our understanding of ecosystem response to environmental change. TERN SuperSites strives to provide high quality ecosystem monitoring data to scientists, natural resource managers and the public. The level of Quality Assurance and Quality Control (QA/QC) measures implemented depends on available funding and resources.
TERN SuperSites are managed by different institutions and stakeholders who provide significant co-contributions and in-kind support. This arrangement creates challenges in ensuring consistency in monitoring, equipment maintenance and QA/QC measures due to varying approaches and resource availability.
While comprehensive network-wide QA/QC measures have not been fully funded to date, any QA/QC procedures used in the collection of SuperSites field data are outlined in the associated metadata entry. To ensure transparency, SuperSites data users will have access to the raw data, the QA/QC procedures employed prior to data release, and any protocols used.
TERN Australian SuperSite Network QA/QC measures include:
- Provision of detailed protocols.
- Where protocols vary due to SuperSite conditions (for example, Leaf Area Index), these variations will be documented.
- Where possible, centralised training in monitoring procedures will be implemented, or specialised staff will travel between SuperSites to ensure consistency in methodology.
- Principal Investigators (PIs) from each SuperSite will ensure data is checked for data entry errors. This check will be acknowledged to the Data Officer and recorded prior to release (this step will be supplanted by a new strategy that allows earlier data publication, as described below).
As data accumulates over a number of years, SuperSite datasets will periodically be assessed to obtain estimates of bias and error and to inform improvement strategies where required.
Aspirational QA/QC procedures that will be implemented as funding allows include:
- Field based computers that constrain data entry and track records associated with specimens.
- Development of algorithms built into initial data processing to flag records outside of reasonable values due to data entry errors.
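The range-checking algorithms described above could take many forms. The sketch below is a minimal illustration only; the field names and plausible-value bounds are assumptions for demonstration, not SuperSites specifications.

```python
# Illustrative sketch: flagging records whose values fall outside plausible
# ranges, suggesting a likely data entry error. The field names and bounds
# below are hypothetical examples, not SuperSites-defined limits.

PLAUSIBLE_RANGES = {
    "dbh_cm": (0.1, 500.0),       # stem diameter at breast height, cm
    "height_m": (0.1, 120.0),     # tree height, m
    "air_temp_c": (-25.0, 55.0),  # air temperature, degrees C
}

def flag_out_of_range(record):
    """Return the list of fields whose values fall outside plausible bounds."""
    flags = []
    for field, (lo, hi) in PLAUSIBLE_RANGES.items():
        value = record.get(field)
        if value is not None and not (lo <= value <= hi):
            flags.append(field)
    return flags

# A 1200 m tree height is almost certainly a misplaced decimal point.
record = {"dbh_cm": 35.2, "height_m": 1200.0, "air_temp_c": 21.5}
print(flag_out_of_range(record))  # → ['height_m']
```

In practice such checks would run during initial data processing, so flagged records can be returned to the submitter for correction rather than silently discarded.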
Quality Control of Data Input to the TERN SuperSites Database
Field data is submitted by SuperSite Principal Investigators or their representatives to the SuperSites Data Librarian (Data deposition through Morpho software has been disabled). Where possible, data is submitted in standardised spreadsheet formats. Standard data sheet formats are in development for recurrent monitoring.
The Data Librarian prepares the dataset and metadata for publication through the TERN SuperSite Database. This process may require the assistance of the SuperSite PI or representatives. The final metadata and data files are made available to the SuperSite PI for checking before publication. The SuperSite PI will acknowledge the correctness of the data and metadata entries by the prescribed mechanism. Acknowledgment of data ownership and final QA/QC checks are currently sent to the Data Librarian by email. An automated web based process is in development and will allow the SuperSite PI to access, check and give the final permission to publish via the click of a button.
To allow earlier public access to data, datasets will be published prior to final checking by the relevant PI. These datasets will display an icon indicating that the data has not yet been ratified by the PI; the icon will be replaced once QA/QC checking is complete.
QA/QC of Acoustic Data
The microphones used on the acoustic recorders are prone to degrade over time in harsh environments and require periodic replacement. To avoid collecting degraded data, the beginning and end of each recording will be assessed for quality by the PI or representative before transfer to the database.
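One simple way to screen the beginning and end of a recording is to check each segment for near-silence or heavy clipping, both symptoms of a failing microphone. The sketch below is an assumed heuristic for illustration, not the SuperSites assessment procedure; the thresholds are hypothetical.

```python
import math

# Illustrative heuristic: a recording segment is suspect if it is nearly
# silent (very low RMS) or heavily clipped. Samples are floats in [-1, 1];
# all thresholds below are assumptions, not SuperSites-defined values.

def segment_rms(samples):
    """Root-mean-square amplitude of a list of samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def segment_suspect(samples, silence_rms=1e-3, clip_level=0.99, clip_frac=0.05):
    """Flag a segment that is near-silent or has too many clipped samples."""
    rms = segment_rms(samples)
    clipped = sum(1 for s in samples if abs(s) >= clip_level) / len(samples)
    return rms < silence_rms or clipped > clip_frac

def recording_suspect(samples, check_len):
    """Assess only the beginning and end of the recording, as described above."""
    return segment_suspect(samples[:check_len]) or segment_suspect(samples[-check_len:])
```

A segment that fails either test would prompt a manual listen before the recording is transferred to the database.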
QA/QC of Soil Chemistry
Soil chemistry analysis will be carried out in NATA accredited laboratories.
QA/QC of Eddy Covariance Data
Each SuperSite maintains its own flux tower and equipment using local technical expertise with coordination from OzFlux Central Node. Technical expertise is shared where possible to help ensure consistency across the network in setup, maintenance and calibration of towers. Data processing workshops have been held in conjunction with OzFlux conferences to ensure consistency across the network.
Flux tower data from SuperSites is delivered to the TERN OzFlux data portal. Data handling procedures are developed, tested and revised on a continuing basis. A new version of the OzFluxQC scripts was released in late 2013. The results of the OzFluxQC system will be compared with EddyPro output at four OzFlux sites, using software developed by Ray Leuning. These comparisons will serve as a final check on the integrity of the OzFluxQC system and will form a “golden” data set against which future releases of OzFluxQC can be checked.
Gap Filling of Eddy Covariance Data
Two approaches to gap filling and partitioning of OzFlux data will be available. Jason Beringer (Monash University) has developed a stand-alone system for ingesting data from a number of sources (Bureau AWS, AWAP and MODIS) to fill gaps in meteorological drivers, coupled to a neural network for gap filling fluxes and partitioning net ecosystem exchange (NEE) into gross primary production (GPP) and ecosystem respiration. The OzFlux Central Node has integrated several flux gap filling and partitioning techniques, including a neural network, into the existing OzFlux QC system. A small working group has been established to explore several techniques for estimating ecosystem respiration for Australian ecosystems.
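The systems described above draw on ancillary data sources and neural networks; the toy sketch below shows only the simplest conceivable fallback, linear interpolation across short gaps in a driver series, to illustrate the idea. The `max_gap` threshold is an assumption for demonstration, not an OzFlux parameter.

```python
# Toy illustration of one basic gap-filling step: linearly interpolating
# short runs of missing values (None) in a meteorological driver series.
# Long gaps and gaps at the series edges are left unfilled, since simple
# interpolation is not credible there. All parameters are assumptions.

def fill_gaps_linear(series, max_gap=3):
    filled = list(series)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            start = i
            while i < len(filled) and filled[i] is None:
                i += 1
            gap_len = i - start
            # Only fill interior gaps short enough to interpolate credibly.
            if start > 0 and i < len(filled) and gap_len <= max_gap:
                left, right = filled[start - 1], filled[i]
                for k in range(gap_len):
                    filled[start + k] = left + (right - left) * (k + 1) / (gap_len + 1)
        else:
            i += 1
    return filled

print(fill_gaps_linear([10.0, None, None, 16.0]))  # → [10.0, 12.0, 14.0, 16.0]
```

Operational systems instead fill long gaps from independent data (nearby weather stations, gridded products, satellite observations) and use trained models for the fluxes themselves.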
Collaborations between OzFlux and NEON Inc. (USA) will allow comparisons of data handling methods. OzFlux aims to contribute data to the FLUXNET global data synthesis once data processing, access and licensing issues are resolved. Details of the OzFlux QC data processing system for eddy covariance flux tower data can be found on the OzFlux website.
Last updated: October 2016