I am trying to set up logging so that I can log to both stdout and a file. I have accomplished this using the following code:
logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)-15s %(levelname)-8s %(message)s',
    datefmt='%a, %d %b %Y %H:%M:%S',
    handlers=[logging.FileHandler(path), logging.StreamHandler()])
The output of this is something like:
2018-05-02 18:43:33,295 DEBUG Starting new HTTPS connection (1): google.com
2018-05-02 18:43:33,385 DEBUG https://google.com:443 "GET / HTTP/1.1" 301 220
2018-05-02 18:43:33,389 DEBUG Starting new HTTPS connection (1): www.google.com
2018-05-02 18:43:33,490 DEBUG https://www.google.com:443 "GET / HTTP/1.1" 200 None
What I am trying to accomplish is logging this output to a file, not as it prints to stdout, but as a dictionary or JSON object, similar to this (while keeping stdout as it is at the moment):
[{'time': '2018-05-02 18:43:33,295', 'level': 'DEBUG', 'message': 'Starting new HTTPS connection (1): google.com'}, {...}, {...}]
Is this doable? I understand that I can post-process this log file after my process is finished, but I am looking for a more elegant solution because certain things I am logging are quite big objects themselves.
Answer
So based on @abarnert's suggestion, I found this Link, which provided a good path to making this concept work for the most part. The code as it stands is:
logger = logging.getLogger()
logger.setLevel(logging.DEBUG)
file_handler = logging.FileHandler('foo.log')
stream_handler = logging.StreamHandler()
stream_formatter = logging.Formatter(
    '%(asctime)-15s %(levelname)-8s %(message)s')
file_formatter = logging.Formatter(
    "{'time':'%(asctime)s', 'name': '%(name)s', 'level': '%(levelname)s', 'message': '%(message)s'}"
)
file_handler.setFormatter(file_formatter)
stream_handler.setFormatter(stream_formatter)
logger.addHandler(file_handler)
logger.addHandler(stream_handler)
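To make the round trip concrete, here is a self-contained sketch of the same idea (it repeats the file-handler setup above with a sample message of my own, and reads the record back with `ast.literal_eval` since the format string produces a Python dict literal rather than JSON):

```python
import ast
import logging

logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

# Same dict-literal format string as in the setup above
file_handler = logging.FileHandler('foo.log')
file_handler.setFormatter(logging.Formatter(
    "{'time':'%(asctime)s', 'name': '%(name)s', 'level': '%(levelname)s', 'message': '%(message)s'}"
))
logger.addHandler(file_handler)

logger.debug('Starting new HTTPS connection (1): google.com')

# Each line of the file is now a dict literal that literal_eval can parse
with open('foo.log') as f:
    record = ast.literal_eval(f.readline())
```

Note this breaks down as soon as a logged message itself contains a quote character, which is part of why it only mostly meets the requirement.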
Although it does not fully meet the requirement, it doesn't require any pre-processing, and allows me to create two log handlers.
Afterwards, I can use something like:

import ast

with open('foo.log') as f:
    logs = f.read().splitlines()
for l in logs:
    for key, value in ast.literal_eval(l).items():  # .items() is needed to unpack pairs; literal_eval is safer than eval
        # do something
        ...

to pull dict objects instead of fighting with improperly formatted JSON, which accomplishes what I had set out to accomplish.
I am still hoping for a more elegant solution.
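For what it's worth, one way to get real JSON without any eval step would be a custom `Formatter` subclass that delegates to `json.dumps` — a minimal sketch (the `JsonFormatter` name and `foo.json.log` filename are my own, not from the stdlib):

```python
import json
import logging

class JsonFormatter(logging.Formatter):
    """Render each log record as one JSON object per line."""
    def format(self, record):
        return json.dumps({
            'time': self.formatTime(record),
            'name': record.name,
            'level': record.levelname,
            'message': record.getMessage(),
        })

logger = logging.getLogger('json_demo')
logger.setLevel(logging.DEBUG)
handler = logging.FileHandler('foo.json.log')
handler.setFormatter(JsonFormatter())
logger.addHandler(handler)

logger.debug('Starting new HTTPS connection (1): google.com')

# Every line in the file now parses cleanly with json.loads
with open('foo.json.log') as f:
    record = json.loads(f.readline())
```

Because `json.dumps` handles quoting and escaping, messages containing quotes or large embedded objects no longer break the parser the way the dict-literal format string can.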