I am in an Ubuntu 22.04 Docker container with Python 3.10. I use these apt packages: I use the following Python packages: The sessionmaker parameter autoflush is set to True. I want to add a deletion followed by an insertion to a sqlalchemy.orm session, so that I commit only if both commands succeed. The aim of
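A minimal sketch of the delete-then-insert pattern being described, assuming a hypothetical mapped class User and a placeholder engine URL (the real models and packages are omitted from the question): both operations go through one session and commit() runs only after both have flushed cleanly, otherwise everything is rolled back.

```python
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

engine = create_engine("sqlite:///:memory:")          # placeholder URL
Session = sessionmaker(bind=engine, autoflush=True)   # autoflush=True, as in the question

def replace_user(old_id, new_user):
    session = Session()
    try:
        old = session.get(User, old_id)    # User is a hypothetical mapped class
        if old is not None:
            session.delete(old)
        session.add(new_user)
        session.commit()                   # commit only if both the delete and the insert worked
    except Exception:
        session.rollback()                 # undo both operations on any failure
        raise
    finally:
        session.close()
```

On SQLAlchemy 1.4+ the same effect can be had with `with Session.begin() as session:`, which commits on success and rolls back on exception automatically.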
Tag: database-connection
How to use DatabaseHook objects with PythonOperator in Airflow without running out of connections?
I’m trying to store my database credentials using Airflow Connections and use them with PythonOperators. I noticed that if I pass the credentials to the PythonOperator, then every variable gets logged, including the database password. So I moved to passing the connection object itself to the PythonOperator, as in the example below. But the issue I have now is that Airflow
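A sketch of the usual way around both problems, assuming a Postgres connection stored in Airflow under the hypothetical conn id "my_db" and the postgres provider installed (adapt the hook class and the `schedule` argument to your Airflow version): build the hook inside the python_callable, so no credentials appear in the operator's arguments or logs, and the connection only lives for the duration of the task.

```python
import pendulum

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook

def run_query(**_):
    # The hook is created inside the task, so nothing sensitive is passed
    # as op_kwargs and nothing sensitive is logged by the operator.
    hook = PostgresHook(postgres_conn_id="my_db")   # "my_db" is a placeholder conn id
    return hook.get_records("SELECT 1")             # opens and closes a connection per call

with DAG(
    dag_id="hook_inside_callable",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,
    catchup=False,
) as dag:
    PythonOperator(task_id="query", python_callable=run_query)
```

Because the connection is opened lazily inside the task and released when the task ends, many parallel DAG runs no longer hold idle connections open between tasks.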
Can I connect to MySQL using the psycopg2 lib?
I have a project in Python that connects to a PostgreSQL database. That project has a module that connects to another database. The connection string is in a configuration file. I configured it to use a MySQL database like this: On running the module I always get the following error: line 179, in connect connection_factory=connection_factory, async=async) psycopg2.OperationalError: FATAL: password
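Short answer: no. psycopg2 only speaks the PostgreSQL wire protocol, so pointing it at a MySQL server fails at authentication, which is what the error above shows. A sketch of one way to handle both targets, assuming hypothetical config keys in a db.ini file; for MySQL a MySQL driver such as PyMySQL (or mysqlclient) is needed:

```python
import configparser

import psycopg2   # PostgreSQL driver
import pymysql    # MySQL driver (pip install pymysql)

cfg = configparser.ConfigParser()
cfg.read("db.ini")
db = cfg["database"]           # section and key names below are placeholders

if db.get("dialect") == "mysql":
    conn = pymysql.connect(
        host=db["host"], user=db["user"],
        password=db["password"], database=db["name"],
    )
else:  # default to PostgreSQL
    conn = psycopg2.connect(
        host=db["host"], user=db["user"],
        password=db["password"], dbname=db["name"],
    )
```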
Multiple execute() calls on a single connection.cursor in Django: is it safe?
I am opening a cursor with connection.cursor, executing a bunch of deletes, then closing the cursor. It works, but I am not sure if it has any side effects. Would appreciate any feedback. Answer Since you are not executing these SQL statements in a transaction, you may encounter confusing states (for example, data was deleted from table_a, but not from
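A sketch of the pattern the answer is pointing at, with table and column names taken as placeholders from the truncated text: wrap the deletes in transaction.atomic() so they either all commit together or all roll back, while still issuing several execute() calls on one cursor.

```python
from django.db import connection, transaction

def purge(owner_id):
    with transaction.atomic():               # roll everything back on any error
        with connection.cursor() as cursor:  # one cursor, several execute() calls
            cursor.execute("DELETE FROM table_a WHERE owner_id = %s", [owner_id])
            cursor.execute("DELETE FROM table_b WHERE owner_id = %s", [owner_id])
        # cursor is closed here; the commit happens when the atomic block exits
```

Reusing one cursor for several statements is fine; the risk the answer describes comes from running them outside a transaction, where a failure partway through leaves some tables modified and others not.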