
The company I worked for had to do deletion runs of customer data (files and database records) every year, mainly for legal reasons. Two months before the next run they found out that the next year would bring several times the usual number of objects, because a decade earlier they had introduced a new solution whose data would become eligible for deletion for the first time.

The existing process was not able to cope with that many objects and froze to death gobbling up every bit of RAM on the testing system. So my task was to rewrite the existing code, optimize the API calls, and somehow I ended up multithreading the whole process. It worked and is most probably still in production today. 💨
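
For illustration only, here is a minimal Python sketch of the general shape such a fix could take: stream object IDs in bounded batches instead of loading everything into memory, and delete each batch with a thread pool. `fetch_expired_ids` and `delete_object` are hypothetical placeholders, not the actual code or APIs from that system.

```python
# Sketch of a streaming, multithreaded deletion run.
# fetch_expired_ids() and delete_object() are assumed helpers.
from concurrent.futures import ThreadPoolExecutor
from itertools import islice

BATCH_SIZE = 1_000   # never hold all IDs in memory at once
WORKERS = 8          # delete objects in parallel instead of one by one

def batches(iterable, size):
    """Yield fixed-size chunks from a (possibly huge) iterator."""
    it = iter(iterable)
    while chunk := list(islice(it, size)):
        yield chunk

def run_deletion(fetch_expired_ids, delete_object):
    with ThreadPoolExecutor(max_workers=WORKERS) as pool:
        for chunk in batches(fetch_expired_ids(), BATCH_SIZE):
            # each batch is deleted concurrently, then dropped,
            # so RAM usage stays flat regardless of object count
            list(pool.map(delete_object, chunk))
```

The point of the batching is that memory use no longer scales with the total number of objects, which is what killed the original process when the volume multiplied.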
