Looking for some automated backup-ing

JPigford

I'm awesome...seriously..
My iTunes database has now gotten corrupted/damaged twice. I'd like to set up some sort of cron job to automatically back up my iTunes database file. I've got a bit of programming/UNIX knowledge, but not really any experience applying it to my Mac. So any help with this would rock.
 
If you know how to write a cron script, you should be just fine, since there's very little deviation for Mac-specific implementations (as in, not until you get fairly complicated with it).

You know which file to back up, right? And what do you mean by "corrupted/damaged?"
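If it helps, here's a minimal sketch of the kind of script you'd point cron at. The file names are assumptions; the library file under ~/Music/iTunes varies between iTunes versions, so check yours first:

```shell
#!/bin/sh
# Sketch: copy the iTunes library files into a dated backup folder.
# backup_itunes SOURCE_DIR BACKUP_DIR
backup_itunes() {
    _src=$1
    _dest=$2/$(date +%Y-%m-%d)
    # create today's backup folder, then grab anything named *Library*
    mkdir -p "$_dest" && cp "$_src"/*Library* "$_dest"/
}

# e.g. backup_itunes "$HOME/Music/iTunes" "$HOME/Backups/iTunes"
```

Then a crontab entry like `0 3 * * * /Users/you/bin/itunes_backup.sh` (path is just a placeholder) runs it every night at 3 AM.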
 
Arden said:
You know which file to back up, right? And what do you mean by "corrupted/damaged?"
Yes, and I wish I knew what "corrupted/damaged" meant too. Twice iTunes has loaded and then crashed saying the iTunes Playlist Database was corrupted and/or damaged. Both times all my playlists were erased. So I just wanted to keep that from happening again.
 
Okay, then go ahead and write your script like you would do for any Unix box and it should work correctly, unless you have some strange, esoteric configuration.

You can also export your library, which might work... say, export it every time you make major changes to your lineup or playlists.
 
Just chiming in for the cron script, as I do the same thing to back up my remote MySQL databases and local files to my backup server. All very elegant and quite easy to script.

If your sh isn't very good, you could write an AppleScript to do all your backups and compile it as an application, then just get cron to '/usr/bin/open backup_applescript'. This requires that you have a user logged in at the console, though, I believe.
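For reference, the crontab entry for that approach might look like this (the app name and path are just placeholders):

```
# Run the compiled backup AppleScript app nightly at 2 AM.
# Requires a user to be logged in at the console.
0 2 * * * /usr/bin/open /Users/you/Scripts/backup_applescript.app
```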
 
Code:
#!/usr/bin/perl

use strict;
use File::Find;
use File::Copy;
use Date::Format;
use File::Path;     # mkpath() gives us a portable `mkdir -p`
use File::Basename; # dirname()

my $backup_dir = '/backup/'.time2str('%Y/%m/%d', time());
my $logfile = '/backup/logfile';
my $inc = 1; # in days

my %dirs = (
            '/shared' => '',
            '/home' => [qw(/home/lackluster/Download /home/lackluster/Corrupt /home/lackluster/Incomplete)],
            '/etc' => '',
            '/var' => '',
            '/usr' => '',
            '/opt' => '',
            '/boot' => '');

open (LOG, ">>$logfile") or die("could not open $logfile\n$!\n");
local_log("\n", time2str('%d/%m/%Y', time()), "\n------------------------------\n");

foreach my $prefix (keys %dirs) {
    find(sub {
        my $file = $File::Find::name;
        return if (-d $file);

        return if ((-M $file) > $inc); # changed more than $inc days ago? skip it.

        # skip anything under an excluded subdirectory
        if (ref($dirs{$prefix}) eq 'ARRAY') {
            foreach (@{$dirs{$prefix}}) {
                return if ($file =~ m/^$_/);
            }
        }

        my $sep = ((($backup_dir =~ m|/$|) || ($file =~ m|^/|)) ? '' : '/');
        my $new = $backup_dir.$sep.$file;
        mkpath(dirname($new)); # create the target directory tree
        copy ($file, $new) or local_log("$file could not be copied to $new because $!\n");
        `bzip2 "$new"`; # bzip each file to save room (quoted for paths with spaces)

    }, $prefix);
}

# backup mysql databases
`mysqldump --user=backup --all-databases >${backup_dir}/mysql.sql; cd $backup_dir; bzip2 mysql.sql >>${logfile} 2>&1`;

sub local_log { print LOG @_; }

The script above is what I whipped up for my setup at home. It does incremental backups in the format /backup/YYYY/MM/DD/foo/bar/file.bz2. On my system, /backup is mounted as another IDE device, so it works out okay (better if it were NFS or the like, but whatever). It requires a bit of tweaking if you want to use it (like taking out the MySQL backups).

Good luck.
 