twyg
Back to Mac Baby!
Ok, here's the situation.
I have 10 Macs on a LAN; 7 of them are running Mac OS X 10.1.
One machine acts as the "server": AppleTalk is turned on and everyone drops their files there. The current setup requires me to go from Mac to Mac once a week with a FireWire CD-R drive and go through whatever each user wants backed up. To expedite the process I usually just do a Sherlock find, over VNC, for files modified in the past week. I also have Remote Login turned on for the machines running OS X (for troubleshooting from my cubicle on stupid stuff like "I can't delete my trash because it says it's in use!?").
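(From the man pages, I gather the command-line equivalent of that Sherlock search, run over Remote Login, would be something like the following; the address and account name are made up:)

  ssh admin@192.168.1.21 'find /Users -type f -mtime -7'
  # -mtime -7 = modified within the last 7 days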
OK, so here's what I want: on Friday night, my machine should "run" from computer to computer on the network, pull everything modified in the past week over to the "server" (skipping files larger than, say, 50 MB), tar and gzip each machine's newly modified files into its own folder, and then burn all the folders to a CD I've already prepped for the purpose. *whew*
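Here's roughly what I'm picturing, completely untested, with made-up addresses, a made-up "backup" account, and made-up paths, just to show the shape of it. It also assumes ssh can log in without a password (e.g. an RSA key is set up) and that the tar on 10.1 is GNU tar, so it understands -T - for reading a file list from stdin:

  #!/bin/sh
  # Weekly pull; runs on the "server" Mac.
  WEEKDIR="/Backups/`date +%Y-%m-%d`"
  mkdir -p "$WEEKDIR"
  for HOST in 192.168.1.21 192.168.1.22 192.168.1.23; do
    # On each remote Mac: list files touched in the last 7 days that are
    # under ~50 MB (-size -102400 counts 512-byte blocks), then tar+gzip
    # that list and stream the archive back over ssh into its own file here.
    ssh backup@$HOST 'find /Users -type f -mtime -7 -size -102400 | tar czf - -T -' > "$WEEKDIR/$HOST.tar.gz"
  done

That sketch doesn't cover the CD burn; I don't know of a way to script that part on 10.1, so it may have to stay a manual step for now.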
Normally I'd buy Timbuktu or a similar program, but I work for a non-profit, which means a $1 or $2 a month spending allowance for the tech department.
I know cron jobs have a lot to do with this type of task, and I'll happily take a number of partial answers and string them together myself. In other words, if you know how to do the remote login and the copying with cron, but not how to find a file modified in the past week that's under 50 MB, no sweat. I'd just like to get something in place sooner rather than later.
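For the scheduling piece, I gather a crontab entry on the "server" along these lines would fire something like the sketch above every Friday night at 10:30 PM (the script path is made up):

  # min hour mday month wday(5=Friday)  command
  30 22 * * 5 /Users/admin/bin/weekly_backup.sh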
Thanks, and whoever solves this one for me, I owe you a brewski (or a soda, should the circumstances require it).
(just realized this is in the wrong forum, sorry jdog!)