CF checker

There are a few options when it comes to checking whether your files are CF compliant. You can use online checkers such as these:

https://compliance.ioos.us/index.html

http://cfconventions.org/compliance-checker.html

You upload your netCDF file on the web and you get back a report.

This is fine if you have one small file, but if you want to check a lot of files you have to use a checker on your own computer. We installed the ioos checker in our conda environments. The ioos checker is Python based; as well as checking the CF conventions it also checks the ACDD conventions. NCI uses this checker, and in fact they require both conventions to be applied to the data they publish.
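
Because the ioos checker is an ordinary Python package, you can also call it from your own scripts rather than from the command line. Below is a minimal sketch based on the compliance_checker.runner interface (check the package documentation for the exact arguments); the file name and report name are just placeholders.

from compliance_checker.runner import ComplianceChecker, CheckSuite

# Load all the checker plugins that are installed (cf, acdd, ...)
check_suite = CheckSuite()
check_suite.load_all_available_checkers()

# Run the CF test on one file and write a text report.
# 'test_file.nc' and 'cf_report.txt' are placeholder names.
return_value, errors = ComplianceChecker.run_checker(
    'test_file.nc',                  # dataset to check (file path or url)
    ['cf'],                          # tests to run, e.g. ['cf', 'acdd']
    1,                               # verbosity
    'normal',                        # criteria: lenient, normal or strict
    output_filename='cf_report.txt',
    output_format='text')

The same call can run the ACDD test as well, by listing both 'cf' and 'acdd' as tests.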

Using the command-line tool is very simple, just load the module

       module use /g/data3/hh5/public/modules

       module load conda/analysis3

You can now call the checker on a netCDF file:


$ cchecker.py test_file.nc 
Running Compliance Checker on the datasets from: ['test_file.nc']
--------------------------------------------------------------------------------
                         IOOS Compliance Checker Report                         
                                    acdd:1.3                                    
http://wiki.esipfed.org/index.php?title=Category:Attribute_Conventions_Dataset_Discovery
--------------------------------------------------------------------------------
                               Corrective Actions                               
test_file.nc has 4 potential issues


                               Highly Recommended                               
--------------------------------------------------------------------------------
Global Attributes
* Conventions does not contain 'ACDD-1.3'
* summary not present
.........

As you can see from the example, without passing any options the tool checks for ACDD 1.3 (the latest version) compliance. The report is printed to the screen.

   cchecker.py --help

will list all the available options; the main ones are:

-t/--test  to choose the test;

      e.g. -t=cf will test the file against the latest available version of the CF conventions.

-c/--criteria  sets the level at which to run the test;

      possible options are < lenient, normal, strict >, defaults to <normal>

-f/--format  the output format;

      possible options are < text, html, json, json_new >

-o/--output  optional file(s) to redirect the output to.

You can also run the checker on several files at a time. For example:

    cchecker.py -t=cf -c strict -o cf_test.txt test_data/test*.nc 

In this case I tested all the files matching <test_data/test*.nc> against the CF conventions, applied the standard at a <strict> level, and the report is written to the file cf_test.txt.

When I test multiple files the tool reports on each file separately; if you have a lot of files you will end up with a long and repetitive report.

We created a simple Python script you can run to summarise the report, so you get each error or warning reported only once.

You can access the script here:

 

This script creates a summary of the CF/ACDD tests run by the ioos checker on multiple files.
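
The script itself is not reproduced here, but a minimal sketch of how such a summary can be built is shown below. This is an illustration rather than the actual parse_checker.py: it assumes the json_new report is a dictionary keyed by input file name, and simply collects every message found under a 'msgs' key so that each distinct message is printed once, together with the number of files that triggered it.

import json
import sys
from collections import defaultdict

def collect_msgs(node, messages):
    """Recursively gather every string found under a 'msgs' key."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key == 'msgs' and isinstance(value, list):
                messages.update(m for m in value if isinstance(m, str))
            else:
                collect_msgs(value, messages)
    elif isinstance(node, list):
        for item in node:
            collect_msgs(item, messages)

def main(report_file):
    with open(report_file) as f:
        report = json.load(f)
    # message -> number of files that reported it
    counts = defaultdict(int)
    for fname, result in report.items():
        messages = set()
        collect_msgs(result, messages)
        for msg in messages:
            counts[msg] += 1
    print(len(report), 'files were checked')
    # print each distinct message once, most common first
    for msg, n in sorted(counts.items(), key=lambda kv: -kv[1]):
        print(n, 'files failed:')
        print(msg)

if __name__ == '__main__':
    main(sys.argv[1])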

First run the checker generating a json output, for example:

    cchecker.py -t=cf -c strict -f json_new -o cf_test.json test_data/test*.nc  

Then pass the json file as input to the script:

   python parse_checker.py cf_test.json


Results for cf checks
3 files were checked

High priority results

3 files failed:
§2.2 Data Types: The variable time failed because the datatype is int64

3 files failed:
§3.3 Standard Name: Attribute long_name or/and standard_name is highly recommended for variable time

...

If you want to save the summary to a file, just redirect the output:

   python parse_checker.py cf_test.json > checks_summary.txt