# CAT files in subfolders to a parent recursively?



## Gwailo (Sep 20, 2002)

I have thousands (no joke) of text files I need concatenated into a single file. They are all scattered across hundreds of little sub-folders, some nested 2-3 levels deep. The folder names and file names are all alphabetically sorted.

Is there any analog to _rm -r {directory_name}_ for _cat {directory_name} > output_file_name_?

Alternatively, is there a Perl or shell script I could easily write (assuming I know zero Perl, BTW) that would recursively put all the folders' files into one folder?

Alternatively, I was looking for tar or gunzip flags that would prevent the archive's folder structure from being preserved on decompression. Theoretically I could compress everything and then decompress it with paths discarded, so all the files spill back into one folder. I can't find the flag though; I don't know if it exists.
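If flattening into a single folder is the goal, here's a minimal sketch using find instead of tar (assumes GNU coreutils for `mv -t`; `src` and `flat` are placeholder directory names, and it assumes no two subfolders contain files with the same name, or later files will clobber earlier ones):

```shell
# Move every regular file under src/ into a single flat/ directory.
# WARNING: duplicate file names across subfolders will overwrite each other.
mkdir -p flat
find src -type f -exec mv -t flat {} +
```

After this, a plain `cat flat/* > output_file` concatenates them in alphabetical order, since the shell sorts the glob.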

Anyway, any ideas?


----------



## scruffy (Sep 20, 2002)

One way of doing it would be (fingers crossed that bbscript doesn't eat this):

```
find . -type f -exec cat {} >> ../a_big_file ";"
```

This will find all regular files in the current directory and all subdirectories, and concatenate them into a file called a_big_file, which will be in the parent directory.

You can put your big file anywhere you want, but don't put it in or below the current directory: otherwise "find" will find the output file itself and concatenate it onto itself again and again and again, which you likely don't want...


----------



## Gwailo (Sep 20, 2002)

That was perfect! I tried it and it seems to have worked (I'm not even in the same city as my computer today, so I'm using only the terminal to view my MASSIVE text file), but it looks good!

Thanks scruffy!


----------

