This may or may not be the same problem I was having. I am using ADSL, though it would be the same with any connection. This would be a problem if you have turned on your firewall, say using BrickHouse.
The problem is that in active mode, ftp opens a port for the data transfer on the client machine, and the server connects back to it. If your firewall is blocking that port, then of course it won't work. You have to use "passive mode" ftp instead, where the data port is opened on the server, and the client connects out to it.
You can do this with command-line ftp by issuing "passive" as the first command after connecting to a server.
If you use wget, you can invoke it as "wget --passive-ftp ftp://ser.ver/path/to/file"
Almost all ftp servers are capable of doing passive mode ftp.
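If you're scripting the transfer rather than typing "passive" by hand, the same toggle is available in Python's standard ftplib. A minimal sketch, reusing the placeholder host and path from the wget example above (not a real server):

```python
from ftplib import FTP

def fetch_passive(host, path, out="file"):
    # Control connection on port 21.
    ftp = FTP(host)
    # Anonymous login.
    ftp.login()
    # PASV: the *server* opens the data port and the client
    # connects out to it, so the client firewall stays happy.
    ftp.set_pasv(True)
    with open(out, "wb") as f:
        ftp.retrbinary("RETR " + path, f.write)
    ftp.quit()

# Usage (placeholders from the wget example):
# fetch_passive("ser.ver", "/path/to/file")
```

Note that recent versions of ftplib default to passive mode anyway; the explicit set_pasv(True) just makes the intent clear.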
No, this is not a passive/firewall issue. The transfer starts, but dies after at most 120k. So far as I can tell this affects all protocols/services. Uploading small amounts of data works, such as GET requests for web pages, but any large block (at a guess, 20k+) fails. I smell a rat in PPP Connect....