I have a Postgres database running on a DigitalOcean server. The database is protected by a firewall and an SSL root certificate. I added the outbound addresses provided by the Azure Function App to the database firewall, and I am passing the certificate through the connection string. But when I upload my function to the cloud, the connection sends a …
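For reference, a minimal psycopg2 connection to a managed DigitalOcean Postgres instance with certificate verification might look like the sketch below; the host, credentials, and certificate path are placeholders, not values from the question:

    import psycopg2

    # All connection values below are hypothetical placeholders.
    conn = psycopg2.connect(
        host="db-postgresql-nyc1-12345.b.db.ondigitalocean.com",
        port=25060,
        dbname="defaultdb",
        user="doadmin",
        password="...",
        sslmode="verify-full",                      # require SSL and verify the server cert
        sslrootcert="/path/to/ca-certificate.crt",  # CA certificate downloaded from DigitalOcean
    )

If this works locally but not from the deployed function, the usual suspects are the firewall allow-list and the certificate file not being shipped with the deployment package.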
Tag: postgresql
Parse a CSV file, loop and insert rows into a PostgreSQL database
I use the Python psycopg2 module to copy the content of a CSV file (a list of users) into a PostgreSQL database. So I begin by parsing the CSV with the Python pandas module. Then, with a for loop, I try to insert my data in my SQL queries. I have two problems: a) When I execute the role …
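As a baseline, the loop-and-insert approach usually looks like the sketch below; the file name, column names, and DSN are assumptions:

    import pandas as pd
    import psycopg2

    df = pd.read_csv("users.csv")  # hypothetical CSV with name/email columns

    conn = psycopg2.connect("dbname=mydb user=postgres")  # placeholder DSN
    with conn, conn.cursor() as cur:
        # One parameterized INSERT executed once per row.
        cur.executemany(
            "INSERT INTO users (name, email) VALUES (%s, %s)",
            df[["name", "email"]].itertuples(index=False, name=None),
        )

For large files, cursor.copy_expert() with COPY ... FROM STDIN is typically much faster than row-by-row INSERTs.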
Track to database, artifacts to specific destination
I am running the MLflow UI and a PostgreSQL DB in Docker Compose. The MLflow UI container runs like this: mlflow ui --backend-store-uri "postgresql+psycopg2://postgres:passw0rd@database:5432/postgres" --host 0.0.0.0. Then I run my models locally from Jupyter. Everything works fine: experiments get logged into PostgreSQL, and the MLflow UI can read them from PostgreSQL. One thing that bothers me is that artifacts are stored …
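To route artifacts somewhere specific instead of the default local ./mlruns directory, one option is to set the artifact location when the experiment is created (another is starting the server with --default-artifact-root). A sketch, assuming a hypothetical S3 bucket and a tracking server on localhost:5000:

    import mlflow

    mlflow.set_tracking_uri("http://localhost:5000")  # the MLflow UI container

    # Hypothetical bucket; any artifact store MLflow supports (s3://, gs://, file://, ...) works.
    experiment_id = mlflow.create_experiment(
        "my-experiment",
        artifact_location="s3://my-mlflow-artifacts/my-experiment",
    )

    with mlflow.start_run(experiment_id=experiment_id):
        mlflow.log_param("alpha", 0.5)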
django.db.utils.DataError: value too long for type character varying(30). I am getting this error while migrating on Heroku PostgreSQL
These are the errors I am getting while migrating on Heroku PostgreSQL. Note: it works fine on the local server; there is no column with max_length 30, and I even tried migrating after deleting all data and still get the same error. This is the model of the project (models.py): … Answer: I just deleted the junk files inside migrations/__pycache__.
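If stale compiled migrations are indeed the culprit, a small cleanup sketch like this removes every migrations/__pycache__ directory so Django recompiles the current .py migration files; run it from the project root before migrating again:

    from pathlib import Path
    import shutil

    # Delete compiled migration caches; Django will regenerate them.
    for cache_dir in Path(".").glob("**/migrations/__pycache__"):
        shutil.rmtree(cache_dir)
        print(f"removed {cache_dir}")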
Inserting data into a PostgreSQL database with high performance
Assume I have a Python program, and I have an Offer object: Offer(title='title1', category='cat1', regions=['reg1']). I want to add this Offer into the PostgreSQL DB with a minimal number of queries (for performance). Inserts of new regions and categories are rare (the number of regions and categories is limited, and they are unique, but the number of offers is unlimited). Basically, Regions and Categories can be …
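One common shape for this, sketched below under assumed table names (categories(id, name), regions(id, name), offers, offer_regions): upsert the rare lookup rows with INSERT ... ON CONFLICT and capture their ids via RETURNING, so each lookup costs one round trip instead of a SELECT-then-INSERT pair. The no-op DO UPDATE is deliberate, because ON CONFLICT DO NOTHING returns no row for values that already exist.

    import psycopg2

    def insert_offer(conn, title, category, regions):
        with conn, conn.cursor() as cur:
            # Upsert the category and fetch its id in one round trip.
            cur.execute(
                "INSERT INTO categories (name) VALUES (%s)"
                " ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name"
                " RETURNING id",
                (category,),
            )
            category_id = cur.fetchone()[0]

            cur.execute(
                "INSERT INTO offers (title, category_id) VALUES (%s, %s) RETURNING id",
                (title, category_id),
            )
            offer_id = cur.fetchone()[0]

            for region in regions:
                cur.execute(
                    "INSERT INTO regions (name) VALUES (%s)"
                    " ON CONFLICT (name) DO UPDATE SET name = EXCLUDED.name"
                    " RETURNING id",
                    (region,),
                )
                cur.execute(
                    "INSERT INTO offer_regions (offer_id, region_id) VALUES (%s, %s)",
                    (offer_id, cur.fetchone()[0]),
                )

    conn = psycopg2.connect("dbname=mydb user=postgres")  # placeholder DSN
    insert_offer(conn, "title1", "cat1", ["reg1"])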
How can I cast a DateField() + TimeField() to local time in a Django QuerySet?
My model has these fields: date = models.DateField(), start_time = models.TimeField(), end_time = models.TimeField(). I would like to annotate the queryset with start_datetime and end_datetime, like so: … However, the output field in the query results in a naive datetime. I would like the dates to be localized within the query. I tried wrapping the above expressions in django.db.models.functions.Cast() with output_field=DateTimeField(), …
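One way to localize inside the query (a sketch, not a built-in Django API) is a small custom Func that emits Postgres' AT TIME ZONE on top of the date-plus-time combination; the Event model name and the zone are assumptions:

    from django.db.models import DateTimeField, ExpressionWrapper, F, Func, Value

    class AtTimeZone(Func):
        # Renders "expr AT TIME ZONE zone", a Postgres-specific construct.
        function = None
        template = "%(expressions)s"
        arg_joiner = " AT TIME ZONE "
        output_field = DateTimeField()

    qs = Event.objects.annotate(  # Event is a hypothetical model
        start_datetime=AtTimeZone(
            ExpressionWrapper(F("date") + F("start_time"), output_field=DateTimeField()),
            Value("Europe/Paris"),  # placeholder zone
        )
    )

In Postgres, date + time yields a timestamp, and AT TIME ZONE then attaches the desired zone, so the annotation comes back aware rather than naive.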
How to pass the PostgreSQL query result into a variable in Airflow? (PostgresOperator or PostgresHook)
I'm planning to use PostgreSQL as my task meta-info provider, so I want to run a few queries, get some data, and pass it like a filled variable to another task. The problem is that when I use PostgresHook I get the data, but it's inside a Python method that I can't access; in fact, I see the below line …
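The usual route is to return the value from a task so Airflow pushes it to XCom for downstream tasks; a sketch, assuming a recent Airflow 2.x with the Postgres provider installed, a postgres_default connection, and a placeholder query:

    import pendulum
    from airflow.decorators import dag, task
    from airflow.providers.postgres.hooks.postgres import PostgresHook

    @dag(schedule=None, start_date=pendulum.datetime(2023, 1, 1), catchup=False)
    def pass_query_result():
        @task
        def fetch_count():
            hook = PostgresHook(postgres_conn_id="postgres_default")
            row = hook.get_first("SELECT count(*) FROM my_table")  # first row as a tuple
            return row[0]  # return values are pushed to XCom automatically

        @task
        def use_count(count):
            print(f"row count: {count}")

        use_count(fetch_count())

    pass_query_result()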
Calculated boolean column showing if a row is the newest
I have a table with a structure that more or less (I've simplified it for the question) looks like the following:

    id (P.K.) | creation_ts | some_field
    --------- | ----------- | ----------
    1         | 2021-08-19  | foo
    2         | 2021-08-18  | foo
    3         | 2021-08-17  | foo
    4         | NULL        | bar
    5         | 2021-01-01  | bar
    6         | 2021-01-02  | bar

And I'm trying to build a query to show a calculated column that, for each row with the …
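A sketch of one way to compute such a flag with a window function; the table name is a placeholder, and COALESCE turns the NULL comparison (row 4) into false:

    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=postgres")  # placeholder DSN
    with conn, conn.cursor() as cur:
        cur.execute(
            """
            SELECT id,
                   creation_ts,
                   some_field,
                   -- true only for the newest creation_ts within each some_field group
                   COALESCE(
                       creation_ts = MAX(creation_ts) OVER (PARTITION BY some_field),
                       false
                   ) AS is_newest
            FROM my_table
            ORDER BY id
            """
        )
        for row in cur.fetchall():
            print(row)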
How to update in SQLAlchemy using objects?
I have a student table in Postgres with the following columns: Studentid | Student_name | Student_division. I am using SQLAlchemy and have modeled it as … How do I update? This doesn't work. Answer: You need to commit the changes; remove the session.add(x) and just keep session.commit().
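Spelled out, the object-based update flow looks something like this sketch; the Student model here is a guess at the mapping, not the asker's actual code:

    from sqlalchemy import Column, Integer, String, create_engine
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Student(Base):
        __tablename__ = "student"
        studentid = Column(Integer, primary_key=True)
        student_name = Column(String)
        student_division = Column(String)

    engine = create_engine("postgresql+psycopg2://user:pass@localhost/school")  # placeholder DSN

    with Session(engine) as session:
        # The loaded object is already tracked by the session, so no session.add()
        # is needed; mutate the attribute and commit.
        student = session.query(Student).filter_by(studentid=1).one()
        student.student_name = "New Name"
        session.commit()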
Inserting into a Postgres table with a foreign key using Python
I am trying to insert data from a JSON file into my Postgres table with Python (psycopg2). This JSON is huge, and now I am questioning what the proper way is to insert all of this data. What makes it difficult for me is that the table has a reference to another table. I want to refer to the id of …
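A sketch of the usual pattern for that dependency: insert the parent row first, capture its generated id with RETURNING, then use it as the foreign key for the child rows. The table names and JSON shape below are hypothetical:

    import json
    import psycopg2

    conn = psycopg2.connect("dbname=mydb user=postgres")  # placeholder DSN

    with open("data.json") as f:
        payload = json.load(f)

    with conn, conn.cursor() as cur:
        for author in payload["authors"]:  # hypothetical JSON shape
            cur.execute(
                "INSERT INTO authors (name) VALUES (%s) RETURNING id",
                (author["name"],),
            )
            author_id = cur.fetchone()[0]

            # Child rows reference the freshly inserted parent id.
            for book in author["books"]:
                cur.execute(
                    "INSERT INTO books (title, author_id) VALUES (%s, %s)",
                    (book["title"], author_id),
                )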