The loss of critical data can prove devastating. Still, millions of professionals neglect to back up their data. While individual reasons vary, one of the most common explanations is that performing routine backups can be a real chore. Because machines excel at mundane and repetitive tasks, the key to reducing the drudgery and overcoming the natural human tendency to procrastinate is to automate the backup process.
Well, for those who know how to write their own shell scripts and use cron ;-). Seriously though, any true computer user knows that one needs to back up data, either to some server, to another hard drive in the same computer, or to a disk of some sort. Otherwise you're liable to lose a term paper, say.
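For the cron half of that, one line in your crontab (edit it with crontab -e) is enough; the script path below is just an example:

# m h dom mon dow command
0 2 * * * /usr/local/bin/backup.sh

That runs the script every night at 2am.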
I don’t know if it works this way on Linux (I’m sure it can be done), but on Windows I simply keep all my data on one single partition (use TweakUI to put Desktop, My Documents, etc. on it) and back that partition up routinely. I don’t really care if I lose my programs and settings, because those are pretty easy to restore. The only thing I can’t get back after a hard drive failure is the data.
Backing up to a hard drive in the same computer only protects you from drive failure. If you have something like a power supply failure that fries both drives (yes, I’ve had it happen), you don’t have a backup.
I back up my important machines daily to a second drive in the machine, with a mirror of those backups on another machine.
I also dump to tape once a week, and store those tapes in another location.
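For the curious, a rough sketch of that setup in shell; the hostnames, paths, and tape device are assumptions for a typical setup:

# daily: mirror /home onto the second drive in the machine
rsync -a --delete /home/ /mnt/backup/home/
# then mirror those backups to another machine, over ssh
rsync -a --delete -e ssh /mnt/backup/ otherbox:/srv/mirror/
# weekly: dump the backups to the first SCSI tape drive
tar -cvf /dev/st0 /mnt/backup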
http://www.rsnapshot.org/
The article wasn’t very good …
I should add that I use an external USB drive, so only a catastrophic failure (fire, earthquake, lightning strike) could kill all my data.
I just have a PHP script run via a cron job which tars up the directories I specify and copies them over to another computer via an NFS share. Unless you only have one hard drive and aren’t networked to anything else, there’s not much reason not to take half an hour, crank out a good backup script, and set it up to run as a cron job.
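A plain shell version of the same idea might look like this; the directory list, mount point, and filename scheme are made up for illustration:

#!/bin/sh
# directories worth keeping - adjust to taste
DIRS="/home /etc /var/www"
# the NFS share from the other computer, already mounted here
DEST=/mnt/backuphost
STAMP=$(date +%Y%m%d)
tar -czf "$DEST/backup-$STAMP.tar.gz" $DIRS

Drop that into a cron job and it runs itself.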
Indeed, it would work something like that…
The best way, though, is to make a separate partition, either on the main hard disk or on a secondary one, format it, and mount it as /home.
Then all the user files can be backed up from there.
Doing this means you can format, reinstall, or change distros and all your user files will still be there (unless you do something completely daft and format /home, like I did once).
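If you’re wondering what that looks like in practice, it’s one line in /etc/fstab; the device name (hda3) is just an example:

# device    mountpoint  fstype  options   dump pass
/dev/hda3   /home       ext3    defaults  0    2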
#1: Always put your /home directory on either a separate hard drive or its own disk partition.
#2: Keep a Linux distro that runs off the CD-ROM, e.g. Knoppix, Slackware Live, etc.
If the OS on the hard drive fails and won’t boot, then boot with your favorite live CD distro and see if you can repair your installed OS, and/or burn the files you want to keep to CD-R/CD-RW.
Also, as the article says, make backups often (automate if necessary), because if the hard drive itself fails (not just the OS), then even a live CD distro won’t be able to save your data…
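From a live CD, the rescue part usually amounts to mounting the installed root partition read-only and copying things off. A sketch, where /dev/hda1 is a guess (check fdisk -l first):

mkdir /mnt/rescue
mount -o ro /dev/hda1 /mnt/rescue
ls /mnt/rescue/home   # make sure the data is actually there before burning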
I thought the article was decent, and it went into some ssh details that are often left out of other backup articles, which is cool because it assumes you are working in a distributed environment. I don’t think it was intended to be the definitive backup reference, but it does offer a few simple, useful suggestions.
Anyway, long story short: if you can find a better article, post a link to it! Let’s make this thread useful.
You can indeed use a Net connection to backup your data remotely (yes, it should be encrypted if it’s at all sensitive, which is why this article goes into great detail about setting up ssh), but it’s not really sensible to skip having an “onsite” (local) copy somewhere too (either on another machine not in the same room or on removable media stored in another room). What if your remote Net connection stops working in the middle of a restore?
For large amounts of data that needs to be backed up on a daily basis, you need a proper client/server backup system (where the LAN is your “remote network”) and some sort of removable media/autochanger. I find that if you want to use a free solution, tar and homebrew scripts don’t cut it – go for Amanda at http://www.amanda.org/ which works quite nicely once you’ve finished tearing your hair out configuring it!
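For the simple end of remote backups, rsync over ssh covers a lot of ground before you need something like Amanda. A minimal sketch, where the host and paths are placeholders:

# incremental, compressed copy of /home to a remote machine, over ssh
rsync -avz -e ssh /home/ user@backuphost:/backups/home/

Only changed files get transferred on subsequent runs, which keeps daily runs quick.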
They get a lot easier if you start doing them. Half the trouble is getting people to start.
DVD-RWs are cheap nowadays.
If you only back up your important stuff (/home, and possibly /var/www), you are pretty safe. Installing the system again doesn’t take ages. [Well, some distributions compile every package if you want; in those cases I’d advise taking a backup of the system, which won’t exceed 4.7 GB in most installations. Do you really use all those applications, or have you just installed them to be cool?]
So a shell script in cron which tars everything up (and, if space gets tight, gzips or bzip2s it) and then puts it on a DVD-RW really helps.
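Roughly like this, assuming dvd+rw-tools is installed and the burner shows up as /dev/dvd (both assumptions):

#!/bin/sh
# tar and bzip2 the important bits, then master them
# straight onto the DVD-RW in one session
tar -cjf /tmp/backup.tar.bz2 /home /var/www
growisofs -Z /dev/dvd -R -J /tmp/backup.tar.bz2
rm /tmp/backup.tar.bz2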
A DVD-RW burner costs around €80, and a set of five discs (why five? well, they didn’t sell singles) costs another €15.
Now you have a pretty safe backup solution for €95 or less.
(I used a CD-RW previously, though it sometimes got too small; it was still manageable.) Then I bought UT2k4, to support a company that puts a native Linux installer on their disc, but oh crap, I got the DVD version, so I needed a DVD drive. For the extra €20 I had a burner that burns all formats, and since then I’ve been using DVD-RW.
BTW, if you feel you need to back up your 10 GB pr0n/MP3/wallpaper collection, try to keep it somewhere outside your backup path.
kindest regards,
Moritz “neofeed” Angermann
I do regular routine backups to tape. I like tape drives, and I keep images of my developer machines around, so if anything happens we are covered. While Linux and Windows Server 2003 are rock-solid systems, it’s always necessary not to procrastinate, because while they may be safe, secure, and reliable, they sometimes do crash.
Oh, I forgot to add: I use cron and WinCron for automation (experimenting with MSH on the Windows side), and I use tar for the backups. Tar may be an old way of doing it, but it succeeds more consistently than zip compression; I have had many cases where the zip file became corrupted.
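If corruption is the worry, it’s cheap to test an archive right after writing it; the filename is just an example:

tar -cf backup.tar /home
# listing the contents forces tar to read every block,
# so a damaged archive will error out here
tar -tf backup.tar > /dev/null && echo "archive OK"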
[In reply to the Anonymous post above (IP: —.dyn.gotadsl.co.uk, posted 2004-07-09 19:14:41)]
Probably one of the best common-sense remarks I have read in the recent *and* not-so-recent past!
I plead guilty, and I should know better. I have backup scripts and CD burning set up for all of my clients, but I still seldom seem to get around to backing up my own systems more than once a month (or even every three months).
Well, I guess I’d better check those scripts – thanks for the nudge.
As long as you’re talking about Linux backups, you should all (if you’re not already) be aware of Mondo Rescue http://www.mondorescue.com/ which is ideal for exactly what’s being discussed here: _disaster recovery_.
They have a public discussion forum on Gmane, as well as a support channel on irc.freenode.net.
For info on supported filesystems (LVM, RAID, ext2, ext3, JFS, XFS, ReiserFS, and VFAT, for example; additional support is possible, just post to the mailing list) and distros, check the website.
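If memory serves, a basic backup-to-ISO run looks something like the line below, but treat the flags as unverified and check them against the current docs on the site:

# back up the whole system to ISO images under /var/backup,
# excluding /tmp (flags from memory - verify before relying on this)
mondoarchive -Oi -d /var/backup -E /tmp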
linux is the answer …
want to back up?
tar -Mcvf /dev/fd0 pr0n/ (now you just need a million floppies)
want to back up a drive to another?
dd if=/dev/hda of=/dev/hdb
want to encrypt? dd if=/dev/random of=/dev/hda
(don’t try this at home!)
feds knocking at the door in a hacking bust?
dd if=/dev/zero of=/dev/hda
want to back up porn?
mkisofs -o pr0n.iso /pr0n
cdrecord dev=0,0,0 speed=0 pr0n.iso
Do all this in the time it takes to open Roxio.
What kind of crazy advice was that?! Do you think writing zeros to the disk is a good idea when the feds are knocking at the door? LOL, then you know “zero” about how data is written to disks.
For backups, you can check out arkeia and xarkeia, which is free (as in beer) for SOHO.
At least in FC2:
1. Place all your critical data in one spot;
2. Insert CD-R into drive;
3. Copy files to CD-R; and
4. Have enough discipline to do this regularly (steps 1–3 can be scripted, as sketched below).
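Something like this, with made-up paths and a typical FC2-era burner address:

#!/bin/sh
# step 1: gather the critical data into one spot
rsync -a "$HOME/docs/" /data/critical/
# steps 2-3: master an ISO and burn it to the CD-R
mkisofs -R -J -o /tmp/critical.iso /data/critical
cdrecord dev=0,0,0 /tmp/critical.iso

Step 4 is still on you.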
I had a fellow worker, a program manager, lose two years of critical job-related data on her “other platform” machine for being lazy.
This is not about /home/troy — this is about my 300 GB drive used in the office for storing all sorts of things… CD-R – haha.. 😉