We’re in the process of re-organizing our office back-ups and archives.
For a while now, we’ve had a 1TB D-Link mini-server, with dual parallel drives, serving as a backup device. It’s been getting full, so we decided to review and revamp. Turns out a big chunk of that drive (700GB, at the moment) is devoted to one client: an archive of site data and reports that goes back nearly 15 years. In the early days the data sets were relatively small (by today’s standards), 10 or 20MB at most. But today, we regularly see data sets that exceed 1GB.
So, new plan: we picked up a relatively low-cost 4TB backup drive (USB connection) and are moving all of the customer data over there. There’s no real requirement for this data to be backed up permanently; it’s more of a “nice to hang on to” archive. That way, we can free up the 1TB drive (still nicely serviceable and redundant as an automated backup device) for everything else.
What this requires, however, is patience. Copying 700GB of data from the network drive to the USB drive is taking some time (days, really); it’s slowing down my main workstation a bit, but not enough for me to set up something else to handle the chore.
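For a long-running copy job like this, it helps to be able to resume after an interruption and to verify that everything arrived intact. Here's a rough sketch of how that could be scripted; the function names and structure are my own invention, not any particular tool we used:

```python
import hashlib
import shutil
from pathlib import Path


def _sha256(path: Path) -> str:
    """Hash a file in 1MB chunks so large files don't exhaust memory."""
    h = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()


def copy_and_verify(src: Path, dst: Path) -> list[Path]:
    """Copy every file under src to the same relative path under dst,
    then compare checksums. Returns any files that failed verification.
    Files already present with the right size are skipped, so the job
    can be restarted without recopying days of work."""
    mismatches = []
    for f in src.rglob("*"):
        if not f.is_file():
            continue
        target = dst / f.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        if not (target.exists() and target.stat().st_size == f.stat().st_size):
            shutil.copy2(f, target)  # copy2 preserves timestamps
        if _sha256(f) != _sha256(target):
            mismatches.append(f)
    return mismatches
```

On a Windows workstation, `robocopy` with its restart options (or `rsync` on the Unix side) does essentially the same job without custom scripting; the sketch just shows the idea.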
Once the data gets pushed to the new drive, I’ll clean up the old backup drive, and also clear out some space on my main workstation and spend some time defragging the disks.