Python: iterate over a folder of CSV files and convert them to JSON

I am an amateur at Python, but I have the task of converting a folder of CSV files to JSON files. I have this script working with a single, specified CSV file, but I have no idea how to make the script iterate through a folder of CSVs and convert all of them to JSON. The original script:

import csv
import json

file = '/users/krzysztofpaszta/CSVtoGD/build-a-bridge.csv'
json_file = '/users/krzysztofpaszta/CSVtoGD/build-a-bridge.json'

# Read the CSV file
def read_CSV(file, json_file):
    csv_rows = []
    with open(file) as csvfile:
        reader = csv.DictReader(csvfile)
        for row in reader:
            csv_rows.append(dict(row))
        convert_write_json(csv_rows, json_file)

# Convert CSV to JSON
def convert_write_json(data, json_file):
    with open(json_file, "w") as f:
        f.write(json.dumps(data, sort_keys=False, indent=4, separators=(',', ': ')))


read_CSV(file, json_file)

Can someone give me a hint?


Answer

You can use os functions, particularly os.listdir() to iterate over files in the directory, and safely generate new names with os.path.splitext():

import os

DIRECTORY = "/path/to/dir"

for f in os.listdir(os.fsencode(DIRECTORY)):
    fn = os.fsdecode(f)
    pre, ext = os.path.splitext(fn)
    if ext == ".csv":
        # os.listdir() yields bare file names, so join them with the directory
        read_CSV(os.path.join(DIRECTORY, fn), os.path.join(DIRECTORY, pre + '.json'))
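A quick illustration of why os.path.splitext() is the safe way to build the new name (the file name below is made up for the example):

```python
import os

# splitext() splits only at the final dot, so a dotted name such as
# "report.v2.csv" keeps its full stem instead of being cut at the first dot.
pre, ext = os.path.splitext("report.v2.csv")
print(pre, ext)  # report.v2 .csv

# os.listdir() yields bare file names; build a full path before opening:
path = os.path.join("/path/to/dir", "report.v2.csv")
print(path)
```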

A similar approach with pathlib would be:

from pathlib import Path

DIRECTORY = "/path/to/dir"

files = Path(DIRECTORY).glob('*.csv') # to process files only in this dir

files = Path(DIRECTORY).rglob('*.csv') # to process files in sub-directories recursively

for f in files:
    read_CSV(f, str(f.with_suffix('.json'))) # use .with_suffix() for safe name generation
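To see the whole pipeline end to end, here is a self-contained sketch using pathlib and the standard csv/json modules; the `csv_to_json` helper and the sample data are illustrative, not part of the original script:

```python
import csv
import json
import tempfile
from pathlib import Path

def csv_to_json(csv_path: Path) -> Path:
    """Convert one CSV file to a JSON file with the same stem, next to it."""
    with csv_path.open(newline="") as f:
        rows = list(csv.DictReader(f))
    json_path = csv_path.with_suffix(".json")
    with json_path.open("w") as f:
        json.dump(rows, f, indent=4)
    return json_path

# Demo in a throwaway directory so the sketch is runnable anywhere
with tempfile.TemporaryDirectory() as tmp:
    src = Path(tmp) / "build-a-bridge.csv"
    src.write_text("name,length\nGolden Gate,2737\n")
    for p in Path(tmp).glob("*.csv"):
        out = csv_to_json(p)
        print(out.name)  # build-a-bridge.json
        print(json.loads(out.read_text()))
```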
User contributions licensed under: CC BY-SA