In the past few weeks, as the Valorum team has had fewer "day-job" hours to dedicate to maintaining scrapers, we've seen a number of data integrity issues pop up (thanks to the CAN team for opening the issues and helping us!).

Most of these issues seem like they could be caught automatically, instead of manually by inspecting or using the data. I'd like to start a discussion about how we could automate scheduled checks/validation of the data, and I'm opening this issue as a place to host that discussion.

My eventual plan is to turn the items identified in this issue into a DAG that runs on a schedule as part of our Airflow setup. What I'm asking for is ideas/support for the different steps that could be part of that DAG.
cc @ghop02 and @BrettBoval, who are working on data integrity issues on our end.
Basically, the "check that data we had been collecting in the past is still being collected" step would be super valuable to us. We often don't notice these gaps until the data is 7 days old and our pipeline drops it, and by then it's a bit of a hassle to figure out where the data was coming from and log the appropriate bugs with details.
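A staleness check like the one described above could be sketched roughly as follows. This is a minimal, hypothetical example (the function name, threshold, and data shape are all assumptions, not anything from the actual pipeline): given the most recent observation date for each (location, variable) series, it flags any series that has stopped updating, so a dropped scraper is noticed before the 7-day cutoff discards the data.

```python
from datetime import date, timedelta

# Alert well before the downstream pipeline's 7-day cutoff drops the data.
# (Threshold is an assumption for illustration.)
STALE_AFTER = timedelta(days=3)

def find_stale_series(last_seen, today):
    """Return the (location, variable) keys whose latest observation
    is older than STALE_AFTER relative to `today`.

    last_seen: dict mapping (location, variable) -> date of newest value
    """
    return sorted(k for k, d in last_seen.items() if today - d > STALE_AFTER)

# Example usage with made-up data:
last_seen = {
    ("AR", "tests_total"): date(2020, 9, 12),  # stopped updating
    ("CA", "cases_total"): date(2020, 9, 18),  # still fresh
}
stale = find_stale_series(last_seen, today=date(2020, 9, 19))
```

In an Airflow setup this could run as one scheduled task that emits an alert (or opens an issue) for each stale series it finds.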
Some ideas are:

- Check that `*_total` columns don't have decreases in their value, as they are meant to be cumulative (ref #125: Arkansas counties have 0 `*_tests_total` numbers since 2020-09-12)

cc @cc7768 @tlyon3 @mikelehen (please ping anyone else I missed!)
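The cumulative-column check above could be sketched like this. Again a hypothetical illustration, not the actual implementation: scan each series in date order and report any day whose value drops below the previous one, which would catch a reset-to-zero like the Arkansas case in #125.

```python
# Cumulative *_total columns should never decrease. Walk consecutive
# (date, value) pairs and collect every day where the value drops.
def find_decreases(series):
    """series: list of (date_str, value) pairs sorted by date.
    Returns a list of (date, previous_value, current_value) violations."""
    problems = []
    for (_, v_prev), (d_cur, v_cur) in zip(series, series[1:]):
        if v_cur < v_prev:
            problems.append((d_cur, v_prev, v_cur))
    return problems

# Example usage with made-up data resembling the Arkansas report:
ar_tests_total = [
    ("2020-09-10", 5400),
    ("2020-09-11", 5500),
    ("2020-09-12", 0),   # suspicious drop to zero
    ("2020-09-13", 0),
]
issues = find_decreases(ar_tests_total)
```

Each check like this could become one task in the validation DAG, failing (or alerting) when its `problems` list is non-empty.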