The Quest for the Best Image Backup
At work, we use an imaging software named Altiris. It's all fine and dandy but, dang, it's complicated. It seems to work really well for our Windows guys, but it doesn't handle my Linux filesystems very well. It appears to be file-based backup software, and it doesn't understand ext3. So, to fix that, I'm starting on a quest to find the best backup solution for the Linux machines I'll be managing.
- Altiris - Particularly Symantec Ghost. The regular Ghost program does a file-by-file image of the disk, but I found that with the -ial switch you can do a sector-by-sector image instead. I'll have to try this out.
- G4L - Their site says: "G4L is a hard disk and partition imaging and cloning tool. The created images are optionally compressed and transferred to an FTP server or cloned locally. CIFS(Windows), SSHFS and NFS support included, and udpcast and fsarchiver options." This software runs from a bootable ISO on CD or USB. It looks like it might be useful; it should just make a sector-by-sector copy of the entire drive.
- Partimage - I've heard great things about this program. It can easily be found on the popular SystemRescueCd. I'm really interested in how it works, but it may not be the best solution since it only images partitions, not full disks.
- DD - This is a simple command built into almost every Linux operating system. Super easy, simple, and widely supported, judging by a post at ubuntuforums.org.
- CloneZilla - This appears to be a robust program that almost combines all the others. It's based on dd, partimage, partclone, and some others I haven't heard of before. I think I'll like this one the most.
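The raw-dd approach from the list above can be sketched like this. To keep it safe to try, ordinary files stand in for real disks; on actual hardware, `source.img` and `clone.img` would be device nodes like `/dev/sda` and `/dev/sdb`:

```shell
# Create a small "disk" to stand in for /dev/sdX (8 MiB of random data).
dd if=/dev/urandom of=source.img bs=1M count=8 2>/dev/null

# Sector-by-sector copy: conv=noerror,sync keeps going past read errors,
# padding failed blocks with zeros so the image stays aligned.
dd if=source.img of=clone.img bs=64K conv=noerror,sync 2>/dev/null

# Verify the clone matches the original.
cmp source.img clone.img && echo "identical"
```

The `bs=64K` is just a reasonable block size for throughput; dd defaults to 512 bytes, which is painfully slow on whole disks.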
I began with Symantec Ghost's sector-by-sector copy, and it appeared to work great. After letting it run for an hour, I went to look at the destination drive and found it had only copied 1 GB. I looked at the estimated time and it was in the range of 51 hours. That's no good. I couldn't even let this one finish before canceling it. Strike out.
After filling the empty disk space with zeros (which I did for the DD command), I'd like to try this image again.
Edit: After testing this again, it worked all right. It created the image in about 5 hours, and it came out at almost 40 GB. That's pretty large, but it's their system doing it, so there's that.
I was unable to test G4L yet.
Partimage worked great. It was considerably slower imaging to a network (CIFS) drive, but it worked all right. I'd like to do some additional testing with this one.
DD has worked really well so far. I followed the instructions to create the image from the guide in this post. I ran into an issue with the image being really huge (250 GB for a 320 GB drive), even though I was piping it through gzip to compress it. After some research, I found that if I filled the empty space on the drive with zeros first, the image came out a lot smaller (1.8 GB for the same drive). Filling the free space with zeros lets gzip compress it much more easily than the random leftover data that normally sits there. This seems to work really nicely.
CloneZilla has a really intuitive interface and seems to be built for almost any drive. When I ran it, it let me mount a Samba share to save my images to or load them from. Awesome. As it ran, it got upset about some unreadable sectors and failed. I then ran it with the -rescue flag and it skipped past those sectors. It created the image, but it failed halfway through because 98 GB was not enough space on the destination drive.
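A pre-flight space check would have caught that failure before hours of imaging. Here's a minimal sketch; on a real run the needed size would come from the source disk (e.g. `blockdev --getsize64 /dev/sdX`), and the worst case is an image as large as the disk itself, since random data doesn't compress:

```shell
# check_space NEED_BYTES DEST_DIR: succeeds if DEST_DIR has at least
# NEED_BYTES of free space.
check_space() {
    need=$1
    dest=$2
    # Free bytes on the filesystem holding $dest (GNU df).
    free=$(df --output=avail -B1 "$dest" | tail -n 1 | tr -d ' ')
    [ "$free" -ge "$need" ]
}

# Example: is there room for a 98 GB image in the current directory?
if check_space $((98 * 1024 * 1024 * 1024)) .; then
    echo "enough space"
else
    echo "not enough space"
fi
```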
I like DD the best, though I found that dcfldd worked a little nicer: quicker and more user friendly. DD is easy to customize to exactly what I want because I'm the one running the commands. It's a lot like CloneZilla but with more power. For an advanced user like me, DD is my first pick.
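Part of dcfldd's appeal is that it shows progress and can hash the data as it copies, something like `dcfldd if=/dev/sda of=disk.img hash=md5 hashlog=disk.md5` (device name hypothetical). The same hash-while-imaging idea can be sketched with plain coreutils, on a stand-in file:

```shell
# Stand-in "disk" of 4 MiB of random data.
dd if=/dev/urandom of=src.img bs=1M count=4 2>/dev/null

# Image the "disk" and record an MD5 of the data in one pass:
# tee writes the image while md5sum hashes the same stream.
dd if=src.img bs=1M 2>/dev/null | tee img.out | md5sum | awk '{print $1}' > img.md5

# The recorded hash should match a fresh hash of the finished image.
fresh=$(md5sum img.out | awk '{print $1}')
[ "$fresh" = "$(cat img.md5)" ] && echo "hash matches"
```

Hashing during the copy means a restore can later be verified against `img.md5` without re-reading the source drive.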
Support This Site
If my blog was helpful to you, then please consider donating to the Electronic Frontier Foundation as they do some really good stuff.