Data Quality check with Python Dask

I am currently trying to write code to check the data quality of a 7 GB data file, but my searches haven't turned up anything useful. Initially, the purpose of the code is to count how many values are nulls/NaNs; later on I want to join it with another data file and compare the quality of the two. We expect the second file to be more reliable, but I would like to eventually automate the whole process. Would anyone be willing to share their data quality Python code using Dask? Thank you.

Answer

I would suggest the following approach:

  • try to define how you would check quality on a small dataset and implement it in Pandas
  • try to generalize the process so that, if each “part of the file” (i.e. each partition) is of good quality, then the whole dataset can be considered of good quality
  • use Dask’s map_partitions to parallelize this processing over your dataset’s partitions (see the sketch below)
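A minimal sketch of that approach, assuming the data is a CSV that dd.read_csv can split into partitions; the file name, blocksize, and the exact checks inside quality_report are placeholders you would adapt to your own data:

```python
import dask.dataframe as dd
import pandas as pd

def quality_report(pdf: pd.DataFrame) -> pd.DataFrame:
    """Per-partition quality check written against plain Pandas:
    count rows and null/NaN values per column."""
    return pd.DataFrame({
        "rows": len(pdf),          # scalar broadcast to every column
        "nulls": pdf.isna().sum(), # null/NaN count per column
    })

# Read the 7 GB file in manageable partitions (path and blocksize are placeholders).
ddf = dd.read_csv("big_file.csv", blocksize="256MB")

# Run the Pandas-level check on every partition in parallel,
# then aggregate the per-partition counts per column in Pandas.
per_partition = ddf.map_partitions(quality_report).compute()
report = per_partition.groupby(level=0).sum()
report["null_fraction"] = report["nulls"] / report["rows"]
print(report)
```

For the null/NaN count alone, `ddf.isna().sum().compute()` would already suffice; the map_partitions pattern is worth setting up anyway because you can later extend quality_report with any check you can express in Pandas (duplicates, out-of-range values, etc.) and reuse the same report when you compare the two files.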