Before I re-invent this particular wheel, has anybody got a nice routine for calculating the size of a directory using Python? It would be very nice if the routine would format the size nicely in MB/GB etc.
Answer
This walks all sub-directories, summing file sizes:
import os

def get_size(start_path='.'):
    total_size = 0
    for dirpath, dirnames, filenames in os.walk(start_path):
        for f in filenames:
            fp = os.path.join(dirpath, f)
            # skip if it is symbolic link
            if not os.path.islink(fp):
                total_size += os.path.getsize(fp)
    return total_size

print(get_size(), 'bytes')
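The question also asks for nice MB/GB formatting, which the answer itself does not cover. Here is a minimal sketch of such a helper (the name sizeof_fmt and the binary 1024 base are my own choices, not part of the answer above):

def sizeof_fmt(num, suffix='B'):
    # keep dividing by 1024 until the value fits the current unit
    for unit in ('', 'Ki', 'Mi', 'Gi', 'Ti', 'Pi', 'Ei', 'Zi'):
        if abs(num) < 1024.0:
            return f'{num:3.1f} {unit}{suffix}'
        num /= 1024.0
    return f'{num:.1f} Yi{suffix}'

print(sizeof_fmt(get_size()))  # e.g. '4.2 GiB'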
And a one-liner for fun using os.listdir (does not include sub-directories):
import os

sum(os.path.getsize(f) for f in os.listdir('.') if os.path.isfile(f))
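Note that this one-liner only works for the current working directory, because os.listdir returns bare file names. For another directory you would have to join the paths yourself, roughly like this (directory here is a hypothetical example path, not from the original answer):

import os

directory = '/tmp'  # hypothetical example path
sum(
    os.path.getsize(os.path.join(directory, f))
    for f in os.listdir(directory)
    if os.path.isfile(os.path.join(directory, f))
)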
Reference:
- os.path.getsize – Gives the size in bytes
- os.walk
- os.path.islink
Updated: Using os.path.getsize is clearer than the os.stat().st_size method.
Thanks to ghostdog74 for pointing this out!
os.stat – st_size gives the size in bytes. It can also be used to get file size and other file-related information.
import os

nbytes = sum(d.stat().st_size for d in os.scandir('.') if d.is_file())
Update 2018
If you use Python 3.4 or earlier, you may consider using the more efficient walk method provided by the third-party scandir package. In Python 3.5 and later, this package has been incorporated into the standard library, and os.walk has received the corresponding increase in performance.
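For reference, a recursive walk written directly against os.scandir might look like the sketch below (my own illustration, assuming Python 3.5+; it is not from the original answer). os.scandir yields directory entries with cached stat information, which is where the speedup over repeated os.path.getsize calls comes from:

import os

def get_size_scandir(path='.'):
    # sum file sizes recursively, reusing scandir's cached stat results
    total = 0
    for entry in os.scandir(path):
        if entry.is_file(follow_symlinks=False):
            total += entry.stat(follow_symlinks=False).st_size
        elif entry.is_dir(follow_symlinks=False):
            total += get_size_scandir(entry.path)
    return total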
Update 2019
Recently I’ve been using pathlib more and more; here’s a pathlib solution:
from pathlib import Path

root_directory = Path('.')
sum(f.stat().st_size for f in root_directory.glob('**/*') if f.is_file())
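One usage note, as a hedged sketch of my own rather than part of the answer: glob('**/*') is equivalent to the shorter rglob('*'), and Path.is_file() follows symlinks, so a variant that skips links (mirroring the symlink check in the os.walk version above) could look like this:

from pathlib import Path

root_directory = Path('.')
nbytes = sum(
    f.stat().st_size
    for f in root_directory.rglob('*')
    if f.is_file() and not f.is_symlink()
)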