I'm not sure what you're getting at here. Are you specifically trying to do a recursive download, or are you trying to download a list of files? Or maybe it's somewhere in between? While I haven't experimented much with either recursive downloads or downloading a list of files, wget seems to support both in a logical fashion (this is based on the help file in "file:///Library/Documentation/Commands/wget/"). Since I don't quite understand the question, I hope this helps. It details the "-I" option, which might be what you are looking for:
`-I list'
`--include-directories=list'
`include_directories = list'
`-I' option accepts a comma-separated list of directories included in the retrieval. Any other directories will simply be ignored. The directories are absolute paths. So, if you wish to download from `http://host/people/bozo/' following only links to bozo's colleagues in the `/people' directory and the bogus scripts in `/cgi-bin', you can specify:

wget -I /people,/cgi-bin http://host/people/bozo/
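As far as I can tell, `-I' only has an effect during a recursive retrieval, so in practice you would combine it with `-r'. A rough sketch, using the same hypothetical host and directories as the manual's example:

wget -r -I /people,/cgi-bin http://host/people/bozo/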
Another source of help with wget is typing "wget --help".
Finally, you can have wget read a list of URLs from a file and download them all with the command "wget -i file". This might be the best option, since the file just needs one URL per line and need not be comma-delimited.
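For example (a rough sketch; the file name and URLs here are made up), if a plain-text file "urls.txt" contains one URL per line:

http://host/people/bozo/report.pdf
http://host/people/bozo/photos.zip

then this fetches them all:

wget -i urls.txt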
-Bruce Adcock