I was on LinkedIn recently and, looking at my profile, saw a post I had written about six years ago on Data Cleansing. One thing that struck me was that the topics I brought up then are just as relevant now, and with Big Data mainstream, many companies are wondering how to manage all this data in an ever-changing landscape. So I thought I would share it again.
The Business case
(A)Test that data meets industry requirements
In some industries, it is a legal requirement that all your data displays the correct format and description, and any pieces of information that do not belong should be removed based on business rules. Today companies operate across multiple platforms (electronic, print, video, etc.), so a process needs to be in place to make sure the data stays in sync!
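As a minimal sketch of such a business rule, the check below validates each record against an assumed format (the field names, the "AB-1234" code pattern, and the mandatory description are all hypothetical, stand-ins for whatever your industry mandates):

```python
import re

# Hypothetical business rule: codes must look like "AB-1234"
# and the description field must not be missing.
CODE_PATTERN = re.compile(r"^[A-Z]{2}-\d{4}$")

def conforms(record: dict) -> bool:
    """Return True if a record meets the assumed format rules."""
    return (bool(CODE_PATTERN.match(record.get("code", "")))
            and bool(record.get("description")))

records = [
    {"code": "AB-1234", "description": "Widget"},
    {"code": "bad code", "description": "Gadget"},  # fails the format rule
    {"code": "CD-5678", "description": ""},         # missing description
]

# Keep only the records that pass; the rest are removed per the rules.
clean = [r for r in records if conforms(r)]
```

The same pattern, one predicate per business rule, makes it easy to report *why* a record was rejected rather than silently dropping it.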
(B)Check for unwanted words
Branding and reputation are critical, and businesses large and small need a mechanism to understand what is being written online in connection with their profile. Data Cleansing can be the first port of call for catching unwanted words that would damage the brand.
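One simple way to implement that first check is a blocklist scan. This is a sketch, not a full reputation-monitoring solution, and the blocklist words are invented for illustration:

```python
# Hypothetical blocklist of words the brand team wants flagged.
BLOCKLIST = {"scam", "fake"}

def flag_unwanted(text: str) -> set:
    """Return any blocklisted words found in the text (case-insensitive)."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    return words & BLOCKLIST

flag_unwanted("Is this brand a SCAM?")  # finds {"scam"}
```

In practice you would feed this the scraped text from each platform and route any non-empty result to a human reviewer.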
The Technical case
(A)Remove unwanted characters such as !"£$%^&*@';:#~?>< etc.
When presented with a set of data from another source, it may arrive in a raw format, and if you are looking to group the words or numbers in your dataset, stray characters can lead to incorrect grouping.
(B)Group names or numbers correctly
Sometimes you will need to group specific names or numbers to see how often they appear. This becomes problematic if an initial review of the data, as in point A above, was not carried out. Say you want to count occurrences of Facebook in your dataset: if the six occurrences are split between four of "Facebook!" and two of "Facebook", your grouping will be split in two, giving you the wrong analysis.
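The Facebook example above can be sketched in a few lines: counting the raw values gives two groups, while stripping the punctuation first gives the correct single group (the sample data mirrors the 4-and-2 split described in the text):

```python
from collections import Counter
import string

raw = ["Facebook!", "Facebook!", "Facebook!", "Facebook!",
       "Facebook", "Facebook"]

# Counting the raw values wrongly splits one group into two.
Counter(raw)  # Counter({'Facebook!': 4, 'Facebook': 2})

def cleanse(value: str) -> str:
    """Strip leading/trailing punctuation such as ! " $ % ^ & * from a value."""
    return value.strip(string.punctuation + "£")

# After cleansing, all six occurrences fall into one group.
Counter(cleanse(v) for v in raw)  # Counter({'Facebook': 6})
```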
(C)Get data ready for processing
The main reason for cleansing data is to have it ready to be processed. More often than not, cleansing is an initial step before further processing starts. The main processing will have built-in controls to make sure the data is in the correct format; if it is not, it will fail. This step is crucial in an automated process.
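A built-in control of that kind might look like the sketch below: a validation step that fails fast on badly formatted rows before the main processing runs (the `amount` field and the doubling step are assumptions, stand-ins for a real pipeline):

```python
# Sketch of a built-in control: validate the format before the main
# processing step, failing fast on bad rows (field names are assumptions).
def validate(row: dict) -> None:
    value = str(row.get("amount", ""))
    if not value.replace(".", "", 1).isdigit():
        raise ValueError(f"amount not numeric: {row!r}")

def process(rows):
    for row in rows:
        validate(row)                   # automated pipelines rely on this check
        yield float(row["amount"]) * 2  # stand-in for the real processing

list(process([{"amount": "10.5"}, {"amount": "3"}]))  # [21.0, 6.0]
```

Failing at the validation step, rather than midway through processing, is what makes unattended, scheduled runs safe.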
(D)Check for Null Values
Sometimes a system is set up to process data or receive data from a third-party vendor. Depending on the business need, it might be imperative that specific fields are not empty, or conversely that they are empty. An initial analysis to identify those values through the data cleansing process should help to mitigate any problems before the data gets loaded into systems that have strict controls on them.
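A minimal sketch of that null-value check is below; which fields are required and which must stay empty are assumptions standing in for your actual business rules:

```python
# Hypothetical rules: which fields must be filled, which must stay empty.
REQUIRED = {"customer_id", "email"}   # must not be empty
FORBIDDEN = {"internal_notes"}        # must be empty before loading

def null_check(record: dict) -> list:
    """Return a list of rule violations for one record."""
    problems = []
    for field in REQUIRED:
        if not record.get(field):
            problems.append(f"{field} is empty")
    for field in FORBIDDEN:
        if record.get(field):
            problems.append(f"{field} should be empty")
    return problems

null_check({"customer_id": "C1", "email": "", "internal_notes": "x"})
```

Running this over the whole file before the load gives you a violation report up front, instead of a rejection from the downstream system's strict controls.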