We all have important files, whether it's just a folder full of junk or priceless family photos. Many people make the mistake of relying solely on the internal drive of the device they use most. If that drive becomes inaccessible for any reason, your files are, in essence, trapped. For convenience's sake, we'll go with the trusty old laptop/desktop computer scenario in this post, but even if your file cabinet is in the cloud, that doesn't mean your files are safe!
Creating a shared-folder backup system that's fast and reliable is hard with Windows. The "xcopy" command copies and replaces every file on the target - which isn't necessary unless the files are new or have been modified since the last backup, right? Using an external program like Veritas can be a lengthy process as well. The company we initially created this system for was backing up nearly 1TB (1000GB) of data daily. We sought a way around all of this using a computer built from old spare parts, an external drive made from spare parts**, and a free operating system - Ubuntu (we prefer the Xfce release, Xubuntu, but you could do the same with any of them).
After we got Xubuntu installed on the tower, we installed Samba file sharing as root (sudo apt-get install samba). By this time we had added the computer name and a test user to Active Directory and given it administrative privileges. After restarting the Linux machine, log in as the network user by clicking "Other" and typing the full user name - for example, "NETWORKNAME\test-user" - along with the password that was set for that user in Active Directory. Now that a local account has been created on the computer, log off and log back in as root to give the new user root privileges. Once that has been done, we are ready to map the network drives.
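For reference, the setup commands from this step might look like the following sketch. The user name is the example from above, and granting root privileges by adding the user to the sudo group is our assumption about what the post means - adjust for your own domain and user:

```shell
# install Samba (run as root / with sudo)
sudo apt-get update
sudo apt-get install samba

# after logging in once as the network user so a local account exists,
# grant that user sudo privileges (group-based approach; user name is an example)
sudo usermod -aG sudo 'NETWORKNAME\test-user'
```

The exact quoting of the domain-qualified user name can vary by system, so check the account name with `id` if the usermod call complains.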
We then created folders in the /media directory to serve as mount points for the network folders. For each folder we wanted to map and back up, we added its network path to the "fstab" file (sudo gedit /etc/fstab). An example line from that file is:
"//192.168.0.121/E$ /media/edrive cifs username=user,password=pass 0 0"
Once you get all of your entries into fstab, save and close the file, then run "sudo mount -a". If you receive any errors, you have likely mistyped something; if not, you should now be able to browse those folders at the mount points you chose.
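Putting it together, an /etc/fstab fragment for two shares might look like the sketch below. The server address, share names, and credentials are placeholders carried over from the example above (the second share, F$, is hypothetical):

```
# CIFS network shares to back up (address, shares, and credentials are examples)
//192.168.0.121/E$  /media/edrive  cifs  username=user,password=pass  0  0
//192.168.0.121/F$  /media/fdrive  cifs  username=user,password=pass  0  0
```

Remember to create the mount-point folders first (e.g. sudo mkdir -p /media/edrive /media/fdrive), or mount -a will complain that they don't exist.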
Next, we wrote a quick bash script to run the "rsync" command for those directories, syncing them to our external backup drive (for example: "rsync -aq /media/edrive /media/backup/edrive"). We then used the "chmod" command to make the script executable without the sudo prefix or a password prompt, so it could easily be made into a scheduled task that runs automatically. For that, we used "crontab" to set the task's execution frequency and schedule (crontab -e).
With that, we're done. In summary, we've built a faster, more efficient backup system by copying only the files that are new or have changed since the last time the script was run. Just keep the computer on and let it sit and do all the work for you while you enjoy your coffee and teach someone how to toggle the "O.N. / O.F.F. Modulator".
*Please note: the examples above will need to be altered slightly to fit your setup, and the first time the script runs it will perform a "full backup" of the specified locations (unless the files already exist at the destination).
**The external drive was the only cost. We purchased a "shell enclosure" and used a spare SATA drive from another computer whose motherboard had died. Before running the backup, do make sure the drive you use has enough space to hold it and is in relatively good condition. You don't want your backup to die too, do you??