Maintaining high-quality data in a CRM database is essential, especially for optimising marketing campaigns and complying with privacy policies. Our client, an importer of cars and car replacement parts, struggled with exactly this problem, and their CRM database suffered noticeably. Sales representatives often had trouble finding particular clients in the database, which led to numerous duplicate records for the same client, an issue that only compounded over the years.
Improving CRM Data Quality
To solve this operational challenge, we developed a software robot in Python.
It runs as an unattended robot managed in BRAINHINT’s Robot Farm, a platform that orchestrates and schedules such robots.
Each time the robot is launched, it accesses a file at a specific network location over SFTP. The data is generated periodically from the database and exported as a CSV file.
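The "check for new data" step above can be sketched as a simple timestamp comparison. This is a minimal illustration using a local file as a stand-in for the CSV dropped at the SFTP network location; the path and the mtime-based change detection are assumptions for demonstration, not the client's actual mechanism, and the real robot would fetch the file over SFTP first.

```python
import os
import tempfile

def new_export_available(path, last_mtime):
    """Return the export's mtime if the file changed since the last run, else None.

    `path` is a local stand-in for the CSV at the SFTP network location;
    the real robot would download it over SFTP before this check.
    """
    try:
        mtime = os.path.getmtime(path)
    except OSError:
        return None  # no export present yet
    return mtime if mtime > last_mtime else None

# Demo with a local stand-in file.
with tempfile.NamedTemporaryFile(suffix=".csv", delete=False) as f:
    f.write(b"client_id,name\n")
    export_path = f.name

seen = new_export_available(export_path, last_mtime=0.0)
print(seen is not None)  # the file counts as new on the first run
print(new_export_available(export_path, last_mtime=seen) is None)  # nothing new afterwards
os.unlink(export_path)
```

Tracking the last processed timestamp (or filename) between runs is what lets an unattended robot launch on a schedule and exit quickly when there is nothing to do.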
When new data is available, the robot inspects it for integrity and then processes it into the CRM system used by the sales team. It logs in as a regular user, locates the relevant records according to the data in the CSV file, and merges related information such as duplicate client records.
It repeats this for every record in the CSV file, and its work is done when no data is left.
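The inspect-and-merge loop described above can be sketched in plain Python. The column names (`client_id`, `name`, `email`) and the fill-in-the-blanks merge rule are illustrative assumptions, not the client's actual schema or merge policy; the real robot applies the merge through the CRM's own interface rather than in memory.

```python
import csv
import io

REQUIRED = {"client_id", "name", "email"}  # hypothetical column names

def validate(reader):
    """Basic integrity check: every required column must be present."""
    missing = REQUIRED - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"CSV export is missing columns: {sorted(missing)}")

def merge_records(rows):
    """Collapse duplicate client rows keyed on client_id.

    Assumed merge rule: later duplicates fill in fields the earlier
    record left blank, so nothing already recorded is overwritten.
    """
    merged = {}
    for row in rows:
        base = merged.setdefault(row["client_id"], dict(row))
        for field, value in row.items():
            if not base.get(field) and value:
                base[field] = value
    return merged

# Two duplicate rows for the same client, each holding part of the data.
export = io.StringIO(
    "client_id,name,email\n"
    "42,Acme Motors,\n"
    "42,,sales@acme.example\n"
)
reader = csv.DictReader(export)
validate(reader)
clients = merge_records(reader)
print(clients["42"])  # one consolidated record per client
```

The loop naturally terminates when the reader is exhausted, which mirrors how the robot finishes once no records remain in the export.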
This multistep process would be burdensome to perform manually, but it is perfectly suited to our robot.
The robot can work 24/7 and processes up to 360 records a day, all while maintaining high-quality CRM data at any given time.