
I dunno about coolest, but I did sort of cement my reputation as the "database guy" in my first job because of this.

My first job was with a group maintaining a series of websites. Because of the nature of the websites, every morning we had to pull the records from one database on one network, sneakernet the data over to a database on another network, and import it via a custom data import function.

However, the live site would crash after 100 or so records were imported. The DBA at the live site had to write a custom data-partitioning script just to get through his daily duties, and it definitely hurt his productivity.

Turns out, the custom mass import function had recycled the standard import function, which was only ever used to import one record at a time, and it never closed its database connections because it had never needed to. So every record in a mass import leaked a connection until the server hit its connection cap and the site fell over. A one-line fix to production code was delivered six months later (because that was our release cycle). In the meantime I came up with the temporary workaround, which was basically removing the connection limit. It would still crash with the workaround, but only with multiple days' worth of data, so basically only on Mondays. I also developed the test set for the import (15k+ records).
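
For anyone curious, the pattern looked roughly like this. This is just a sketch from memory, not the actual code; I'm using sqlite3 as a stand-in for the real database, and the function and table names are made up:

```python
import sqlite3

DB_PATH = "site.db"  # stand-in for the production database

def setup():
    # One-time setup so the sketch is runnable end to end.
    conn = sqlite3.connect(DB_PATH)
    conn.execute("CREATE TABLE IF NOT EXISTS records (payload TEXT)")
    conn.commit()
    conn.close()

def import_record(record):
    # The original single-record import: opens a connection and never
    # closes it. Called once per web request, the leak was invisible.
    conn = sqlite3.connect(DB_PATH)
    conn.execute("INSERT INTO records (payload) VALUES (?)", (record,))
    conn.commit()
    # conn.close()  # <-- the eventual one-line fix

def mass_import(records):
    # The mass import recycled the single-record function, so it leaked
    # one connection per record until the server's connection cap was hit.
    for record in records:
        import_record(record)

if __name__ == "__main__":
    setup()
    mass_import([f"record {i}" for i in range(200)])
```

With a real database server, the loop dies as soon as the leaked connections exceed the server's max-connections setting, which is why bumping that limit only pushed the crash out to multi-day imports instead of fixing it.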
