nV News Forums


nV News Forums (http://www.nvnews.net/vbulletin/index.php)
-   General Linux (http://www.nvnews.net/vbulletin/forumdisplay.php?f=27)
-   -   PING (http://www.nvnews.net/vbulletin/showthread.php?t=99840)

grey_1 10-06-07 07:42 PM


Has anyone used this utility? I'll hopefully be upgrading hdds in a month or so and really, really do not want to chance losing my current install/settings/data.

With this I can just make an ISO, burn it to DVD, and restore to the new hdd.

TIA for any feedback or other suggested tools. :)

evilghost 10-06-07 08:16 PM

I just tar-gzip my home directory and/or filesystem so I can pick and choose what I need. I've never had a need to back up a whole partition since tar-gzip works fine. I've had to restore from my backups multiple times on the server due to catastrophic drive failure. No issues with laptop use either, or with downgrading the wife's laptop from Feisty to Dapper.

grey_1 10-06-07 08:45 PM

Thanks eg, so I would do just a clean install, extract the tarball and replace files/directories as needed?

evilghost 10-06-07 10:29 PM

Pretty much, except I do a clean install, boot to a live CD, 'rm -rf' any of the non-system directories (aka, leave /proc and /dev but nuke /var, /bin, /usr, etc) then do the tarball extraction.
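That create-and-restore round trip can be sketched as follows. This is only an illustration exercised in throwaway temp directories, not the poster's actual layout or a real root filesystem; the paths and file contents are made up for the example:

```shell
set -e

# Stand-in for the system to be backed up.
work=$(mktemp -d)
mkdir -p "$work/etc" "$work/home/user"
echo "hostname=mybox"  > "$work/etc/hostname"
echo "important data"  > "$work/home/user/notes.txt"

# Create the backup tarball (z = gzip, p = preserve permissions).
tar zcpf "$work/backup.tgz" -C "$work" etc home

# Simulate the restore onto a fresh filesystem root.
restore=$(mktemp -d)
tar zxpf "$work/backup.tgz" -C "$restore"

# Verify the restored tree matches the original.
diff -r "$work/etc"  "$restore/etc"
diff -r "$work/home" "$restore/home"
echo "restore OK"
```

In the real procedure the extraction target would be the freshly installed root (mounted from the live CD) rather than a temp directory, which is why the non-system directories get removed first.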

I keep about 7 days of rolling backups on an external USB drive, and about once a week I sync the backup drive to another USB drive that is stored offsite. This strategy works for me; it's been tried and tested and has been 100% successful with no data loss.

I like the idea of the tarball because if I make changes to a website and the changes are horrific or undesirable, I can easily extract that file and/or directory from the backup tarball and undo my changes.
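Pulling a single path back out of a tar archive works like this; the web-root path and file contents here are invented for the example:

```shell
set -e

# Stand-in for a web root with one file in it.
work=$(mktemp -d)
cd "$work"
mkdir -p www
echo "good version" > www/index.html

# Take the backup, then make an undesirable change.
tar zcf backup.tgz www
echo "bad edit" > www/index.html

# Restore just that one member from the archive;
# tar overwrites the existing file by default.
tar zxf backup.tgz www/index.html
cat www/index.html   # prints: good version
```

`tar ztf backup.tgz` lists the archive's contents if you need to find the exact member path first.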

grey_1 10-07-07 03:21 AM

Sounds like a plan. Thanks evilghost.

evilghost 10-07-07 07:46 AM

Here's the Perl code I threw together to keep X days of backups, and here's the backup script I use on the server. I cron'd it in /etc/cron.daily.



#!/usr/bin/perl
use File::stat;

$popauthspool = "/backup/archives"; # Directory containing the files
$watcherlog = "/backup/backup_cleanup.log";
$days = 7;                          # Keep this many days of backups

open(LOG, ">>$watcherlog") || die("Can't open $watcherlog");

@files = `ls $popauthspool`;
chomp(@files);
foreach $name (@files) {
        $file = $popauthspool . "/" . $name;
        $s = stat($file) || die "Can't stat($file) $!\n";

        # 1 day = 86400 seconds
        $modtime = $s->mtime + (86400 * $days);
        $now = time;
        if ($modtime < $now) {
                push(@expired, $name);
        }
}

foreach $expired (@expired) {
        $date = localtime();
        print LOG "$date - Removing $expired\n";
        unlink($popauthspool . "/" . $expired);
}
close(LOG);

Backup Code


#!/bin/sh
echo "`date` Starting Backup" > /backup/backup.log

nice --adjustment=20 tar zcvpf /backup/archives/fullbackup-`/bin/date +%m-%d-%Y`.tgz --exclude=/FileServer --exclude=/backup/archives --exclude=/media --exclude=/home --exclude=/www_chroot --exclude=/sys --exclude=/rsync /
echo "`date` Full System Done" >> /backup/backup.log

nice --adjustment=20 tar zcvpf /backup/archives/home-`/bin/date +%m-%d-%Y`.tgz /home
echo "`date` Home Done" >> /backup/backup.log

nice --adjustment=20 tar zcvpf /backup/archives/www-`/bin/date +%m-%d-%Y`.tgz /www_chroot
echo "`date` WWW Done" >> /backup/backup.log

nice --adjustment=20 tar zcvpf /backup/archives/luser-`/bin/date +%m-%d-%Y`.tgz /home/luser
echo "`date` luser Done" >> /backup/backup.log

nice --adjustment=20 tar zcvpf /backup/archives/FileServer-`/bin/date +%m-%d-%Y`.tgz --exclude=/FileServer/Music /FileServer
echo "`date` FileServer Done" >> /backup/backup.log

nice --adjustment=20 tar zcvpf /backup/archives/rsync-`/bin/date +%m-%d-%Y`.tgz /rsync
echo "`date` rsync Done" >> /backup/backup.log

ssh root@ -p 57600 -C 'dd if=/dev/mtdblock/1' > /backup/archives/openwrt_jffs2+squashfs-`/bin/date +%m-%d-%Y`.trx
ssh root@ -p 57600 -C 'dd if=/dev/mtdblock/3' > /backup/archives/openwrt_nvram-`/bin/date +%m-%d-%Y`.bin
/bin/gzip /backup/archives/*.bin
/bin/gzip /backup/archives/*.trx
echo "`date` OpenWrt Done" >> /backup/backup.log

echo "`date` Backup Complete" >> /backup/backup.log

#Email report
echo "Subject: Backup Report" > /backup/email.log
df -h >> /backup/email.log
cat /backup/backup.log >> /backup/email.log
ls -lah /backup/archives >> /backup/email.log
cat /backup/email.log | sendmail myemail@mydomain.net

grey_1 10-07-07 08:14 AM

Nice! Pun intended. :p

Wouldn't that tend to fill hdd space rather quickly if I'm changing files, downloading, etc.? Or... wait, I could use cron.weekly or monthly, right? I've never used cron before, so I have much to catch up on.

Lol, having a good old time trying to make SpamAssassin do its job in Sylpheed for me, but that shouldn't take too long to figure out.

Absolutely loving this setup. :)

Edit: I don't leave my rig running 24/7....Do you think anacron would suit me better?
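(For reference: anacron runs any job whose period has elapsed, catching up shortly after boot, so it doesn't require the machine to be on at a fixed time the way cron does. A minimal /etc/anacrontab entry for a weekly backup might look like the following; the job identifier and script path are made up for the example.)

```
# period(days)  delay(min)  job-identifier  command
7               15          weekly-backup   /usr/local/sbin/backup.sh
```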

evilghost 10-07-07 08:17 AM

I back up to an external drive, 30GB, and can keep about 7 days' worth. I only have about 5.6GB of used data. You can always --exclude directories you don't want to back up.

grey_1 10-07-07 08:38 AM

Thanks Bro.


Powered by vBulletin® Version 3.7.1
Copyright ©2000 - 2014, Jelsoft Enterprises Ltd.
Copyright 1998 - 2014, nV News.