I have a list of 20 file names, like ['file1.txt', 'file2.txt', ...]. I want to write a Python script to concatenate these files into a new file. I could open each file with f = open(...), read it line by line with f.readline(), and write each line into the new file. That doesn't seem very "elegant" to me, especially the part where I have to read/write line by line.
Is there a more "elegant" way to do this in Python?
Answer
This should do it.
For large files:
filenames = ['file1.txt', 'file2.txt', ...]
with open('path/to/output/file', 'w') as outfile:
    for fname in filenames:
        with open(fname) as infile:
            for line in infile:
                outfile.write(line)
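If you'd rather not loop over lines at all, the standard library's shutil.copyfileobj streams each file in fixed-size chunks, which also works for binary files. A minimal sketch, assuming hypothetical input files and an output path named combined.txt:

```python
import shutil

filenames = ['file1.txt', 'file2.txt']  # hypothetical sample inputs
# create sample files so the sketch runs end to end
for name, text in zip(filenames, ['hello\n', 'world\n']):
    with open(name, 'w') as f:
        f.write(text)

with open('combined.txt', 'wb') as outfile:
    for fname in filenames:
        with open(fname, 'rb') as infile:
            # copies in 64 KiB chunks by default; no per-line overhead
            shutil.copyfileobj(infile, outfile)
```

Opening both sides in binary mode avoids any newline translation, so the output is a byte-for-byte concatenation of the inputs.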
For small files:
filenames = ['file1.txt', 'file2.txt', ...]
with open('path/to/output/file', 'w') as outfile:
    for fname in filenames:
        with open(fname) as infile:
            outfile.write(infile.read())
… and another interesting one that I thought of:
import itertools

filenames = ['file1.txt', 'file2.txt', ...]
with open('path/to/output/file', 'w') as outfile:
    for line in itertools.chain.from_iterable(map(open, filenames)):
        outfile.write(line)
Sadly, this last method leaves a few file descriptors open, which the GC should take care of eventually. I just thought it was interesting.
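If the dangling descriptors bother you, the standard library's fileinput module iterates over the lines of several files in sequence and closes each file as soon as it is exhausted. A sketch, assuming hypothetical input files and an output path named merged.txt:

```python
import fileinput

filenames = ['file1.txt', 'file2.txt']  # hypothetical sample inputs
# create sample files so the sketch runs end to end
for name, text in zip(filenames, ['alpha\n', 'beta\n']):
    with open(name, 'w') as f:
        f.write(text)

# fileinput.input() yields lines from each file in turn and
# closes every file once it has been fully read
with open('merged.txt', 'w') as outfile, fileinput.input(filenames) as infiles:
    for line in infiles:
        outfile.write(line)
```

This keeps the flat "one loop over all lines" shape of the itertools version without leaking file handles.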