UNIX: Prove your skills?

didde

Ok, I have a thought here..

I constantly download large folders containing a lot of files from work, usually around 1-2 GB worth of files.

The problem is that this usually takes 3-4 hours, and during that time it's quite a pain to have to keep an FTP program like CaptainFTP open, which ties up a lot of unnecessary GUI resources.

Wouldn't it be possible to use the Terminal's FTP and download in the background? I am no expert on UNIX, so I don't dare to put together a .sh for it, but maybe some brave soul out there has the guts?

It would really be of great use to me, and no doubt plenty of people out there would love to be able to download "silently" in a Terminal session that you can then send to the background with "&".

Someone up to the challenge?


:p

//Dd.
 
No big deal, really. =)


You could explore the background-downloading capabilities of the 'ncftp' client included in Mac OS X. Check out the man pages or the online help.
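A rough sketch of what that looks like (host, user, and paths below are placeholders, not real servers): ncftp ships with an 'ncftpget' helper whose -b flag spools the transfer to a background job, and the interactive client has 'bgget'/'bgstart' commands for the same thing.

```
# One-shot background download with ncftpget (placeholders throughout):
ncftpget -b -u myuser ftp.example.com /local/download/dir '/remote/dir/*.zip'

# Or from inside an interactive ncftp session:
#   ncftp> bgget bigfile.dmg
#   ncftp> bgstart        # kick off the spooled jobs and detach
```

Either way you get your prompt back immediately and the transfer grinds on without a GUI.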


dani++
 
didde - a few more details ...

curl is probably your best bet. In the Terminal, change to the directory where you want the file downloaded:

cd /location/of/download/dir

then simply issue the curl command with the URL of your download, like so:

curl -O http://file/to/download

It covers FTP, HTTP, etc. - not limited to HTTP.
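And to get the "silent" background behaviour didde asked about, you can wrap that same curl command with nohup and "&" - a minimal sketch, assuming a made-up URL and directory:

```shell
cd /location/of/download/dir

# nohup keeps curl running after you close the Terminal window,
# '&' gives you your prompt back immediately,
# and all progress/errors go to a log file you can peek at later.
nohup curl -O 'ftp://ftp.example.com/pub/bigfile.dmg' > download.log 2>&1 &

# Check on it whenever you like:
tail download.log
```

The -O flag saves the file under its remote name in the current directory, which is why you cd first.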

Other options you might investigate...

sftp - secure (encrypted) FTP
rsync - for syncing against your dirs/files at work
 
one nice benefit of sftp over others is that if you have a slowish connection (like, not-in-the-same-building slow), you can use the -C flag to turn on compression for the connection (rsync's equivalent flag is -z). It won't do you much good if the files you're downloading are already compressed, of course, but if it's uncompressed data, it might speed things up a lot...
 