During a recent national Fisheries Statistics Working Group meeting, data managers from all Australian states highlighted the likely high prevalence of inaccurate or fraudulent data supplied by fishers, as well as errors introduced during data entry. Current data quality control measures vary widely between jurisdictions, are poorly documented, and often rely on manual checks by clerks or analysts that are labour-intensive, costly and not routinely executed. Because many of these checks occur during manual entry of paper-based records, they are likely to become obsolete as reliance on electronic reporting increases and data are entered directly by fishers through online portals or mobile applications.
There is a need to develop automated data cleansing and diagnostic procedures that can be applied retrospectively to large fisheries databases to detect and flag errors and outliers, and to provide subsets of reliable catch and effort data for stock assessments and other analyses. This project will contribute to addressing these issues by developing automated processes to routinely assess newly entered fisheries catch and effort data for errors, retrospectively quantify error rates in existing data, and assess their likely influence on the outputs of stock assessment analyses. The outcomes will help improve the quality and accuracy of catch and effort data used in routine stock assessments, and in turn support more sustainable management of wild capture fisheries resources.
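To illustrate the kind of automated check envisaged, the following is a minimal sketch of a rule that flags implausible catch-per-unit-effort (CPUE) records using a robust outlier test. The column names, example values and threshold are hypothetical assumptions for illustration only, not the project's actual procedure or any jurisdiction's logbook schema.

```python
import pandas as pd

# Hypothetical logbook extract; real schemas and units differ by jurisdiction.
records = pd.DataFrame({
    "trip_id":      [1, 2, 3, 4, 5, 6],
    "effort_hours": [8, 6, 7, 9, 5, 0.5],
    "catch_kg":     [120, 95, 110, 4000, 80, 300],
})

def flag_cpue_outliers(df, k=3.5):
    """Flag records whose catch-per-unit-effort deviates strongly from the
    median, using a modified z-score based on the median absolute deviation."""
    cpue = df["catch_kg"] / df["effort_hours"]
    median = cpue.median()
    mad = (cpue - median).abs().median()
    # 0.6745 scales the MAD so the score is roughly comparable to a z-score.
    score = 0.6745 * (cpue - median) / mad
    out = df.copy()
    out["cpue"] = cpue
    out["flag_outlier"] = score.abs() > k
    return out

# Records 4 and 6 are flagged for analyst review rather than silently discarded.
print(flag_cpue_outliers(records))
```

In practice such rules would be one component of a larger suite of species-, gear- and region-specific validation checks applied both at the point of electronic entry and retrospectively across the database.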