I have thousands (no joke) of text files I need concatenated into a single file. They are scattered across hundreds of little sub-folders, some nested 2-3 levels deep. The folder names and file names are all alphabetically sorted.
Is there any analogue of rm -r {directory_name} for cat, i.e. a recursive cat {directory_name} > output_file_name?
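The closest thing I've been able to piece together is chaining find, sort, and xargs. This is untested and assumes GNU tools (sort -z in particular), and output_file_name would have to live outside the tree so it doesn't get concatenated into itself:

    # walk the tree, sort paths alphabetically, cat in that order
    find {directory_name} -type f -print0 | sort -z | xargs -0 cat > output_file_name

Would that preserve the alphabetical order I need, or does find make no ordering guarantees without the sort?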
Alternatively, is there a Perl or shell script I could easily write (assuming I know zero Perl, BTW) that would recursively pull all the files out of the sub-folders into one folder?
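Something along these lines is what I have in mind, though I haven't tried it, and identically named files in different sub-folders would clobber each other:

    # flatten the tree into a single directory (collisions overwrite!)
    mkdir flat
    find {directory_name} -type f -exec mv {} flat/ \;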
Alternatively, I was looking for a tar or gunzip flag that would prevent the archive's folder structure from being preserved on extraction. In theory I could compress everything and then decompress it so the paths aren't preserved and all the files spill back into one folder. I can't find such a flag, though, and I don't know if it exists.
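The nearest things I've spotted in the GNU tar man page are --strip-components, which only removes a fixed number of leading path components (no good when the nesting depth varies), and --transform, which rewrites member names with a sed-style expression. Maybe something like this would flatten everything on extraction? Completely untested, and I have no idea how it treats the directory entries themselves:

    tar -cf everything.tar {directory_name}
    mkdir flat
    # strip everything up to the last slash from each member name
    tar -xf everything.tar --transform='s|.*/||' -C flat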
Anyway, any ideas?