Duplicate Data Filter

Objective

  • to identify the official value for each time where duplicates exist
  • to use a single value for time and space interpolations
  • all algorithms are complicated by duplicates
  • to avoid further lengthy bug searches

Reasons for duplicate data from a station for a given paramid and time (illustrated in the sketch after this list):

  • observations delivered with more than one typeid
  • multiple sensors
  • multiple levels (NB this is strictly not a duplicate, but is a case to handle)
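
For illustration, the sketch below shows the dimensions involved. The type names are hypothetical: a Qc2 algorithm typically asks for the value of a paramid at a station and time, while the stored rows are further distinguished by typeid, sensor and level, so several rows can answer the same question.

// Hypothetical key types, for illustration only.
#include <string>

struct LogicalObsKey {      // what an algorithm usually asks for
    int stationid;
    std::string obstime;    // observation time
    int paramid;
};

struct StoredRowKey {       // how stored rows are actually distinguished
    LogicalObsKey obs;
    int typeid_;            // delivery route ("typeid" is a C++ keyword)
    int sensor;             // parallel sensors
    int level;              // levels: strictly not duplicates, but handled here
};

// Rows sharing the same LogicalObsKey but differing in typeid_, sensor
// or level are what the duplicate filter has to resolve.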

Current workarounds

Qc2 algorithms select data on both paramid and typeid to avoid picking up duplicate data, for example:

result = dbGate.select(Qc2Data,kvQueries::selectMissingData(params.missing,pid,tid,ProcessTime));

… but this reduces the amount of data available to the Qc2 algorithm and hence its statistics.

A routine is required that parses all the data and sets the official value wherever duplicates exist.
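
A minimal sketch of such a routine is given below, assuming the official value is chosen from a configured typeid priority list. That rule and all names (DataRow, markOfficial, the official marker) are illustrative assumptions, not an agreed specification; rows that differ only in sensor or level should be grouped separately before this step.

// Sketch only: mark one row per duplicate group as official.
#include <algorithm>
#include <cstddef>
#include <limits>
#include <vector>

struct DataRow {
    int    stationid;
    int    paramid;
    int    typeid_;            // delivery route; main source of duplicates
    int    sensor;
    int    level;
    double value;
    bool   official = false;   // hypothetical marker for the chosen value
};

// 'rows' must all belong to one duplicate group, i.e. share
// stationid, obstime and paramid (and sensor/level where relevant).
void markOfficial(std::vector<DataRow>& rows,
                  const std::vector<int>& typeidPriority)
{
    if (rows.empty())
        return;

    // Rank of a typeid in the priority list; unknown typeids rank last.
    auto rank = [&](int t) -> std::size_t {
        auto it = std::find(typeidPriority.begin(), typeidPriority.end(), t);
        if (it == typeidPriority.end())
            return std::numeric_limits<std::size_t>::max();
        return static_cast<std::size_t>(it - typeidPriority.begin());
    };

    // Choose the row whose typeid ranks best; ties keep the first row.
    auto best = std::min_element(rows.begin(), rows.end(),
        [&](const DataRow& a, const DataRow& b) {
            return rank(a.typeid_) < rank(b.typeid_);
        });

    for (auto& r : rows)
        r.official = false;
    best->official = true;
}

A real routine would also have to record the choice persistently, which is what the universal flag in the To Do list below would provide.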

To Do

  • specification of the rules that identify the official value
  • specification of the required configuration information (one possible shape is sketched below)
  • definition of a universal flag that identifies the official value in all circumstances
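
As a starting point for the configuration item above, the sketch below shows one possible shape for the information the filter would need. The per-paramid typeid priority list and all field names are assumptions.

// Sketch only: possible configuration for the duplicate filter.
#include <map>
#include <vector>

struct DuplicateFilterConfig {
    // For each paramid, typeids in decreasing order of preference;
    // the first typeid present in the data supplies the official value.
    std::map<int, std::vector<int>> typeidPriority;

    // Fallback when a paramid has no explicit entry.
    bool preferLowestTypeid = true;
};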