I’m trying to connect to a Docker container’s locally hosted address. I’m using Django to serve a website inside the container, and I want to reach it from my local machine. How can I access this site from my local machine? I inspected the container and found that its local IP address is 172.28.0.4. Even after specifying
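A sketch of what usually resolves this: the container’s bridge IP (172.28.0.4 here) is generally not reachable from the host on macOS or Windows, so the port has to be published instead. The service name, port, and command below are placeholders, not taken from the question:

```yaml
# docker-compose.yml sketch (hypothetical service name "web")
services:
  web:
    build: .
    # Django must bind to 0.0.0.0; bound to 127.0.0.1, the mapped
    # port accepts connections but nothing answers.
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"   # publish container port 8000 on the host
```

The site is then reachable at http://localhost:8000 rather than at the container’s internal address.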
Python, MQTT: can publish but can’t receive messages on Docker
I’m running two Docker containers: one for the MQTT server using Mosquitto, the other using Flask-MQTT. The Flask container receives the CONNACK and sends the SUBSCRIBE to the broker, but never gets any SUBACK; however, it manages to publish “hello world” to /home/mytopic (confirmed with MQTT Explorer). This is quite strange, because it works without any problem outside of a
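A likely direction, assuming both containers are started from a single compose file (the question doesn’t show one): if the broker and client sit on different compose networks, or Mosquitto 2.x is running with its default localhost-only listener, subscriptions can misbehave while a publish through another route still works. A minimal sketch, all names illustrative:

```yaml
# docker-compose.yml sketch: one shared default network, so the
# Flask container reaches the broker by its service name.
services:
  mosquitto:
    image: eclipse-mosquitto:2
    # Mosquitto 2.x refuses remote clients unless a listener is
    # configured; mosquitto.conf should contain at least:
    #   listener 1883
    #   allow_anonymous true
    volumes:
      - ./mosquitto.conf:/mosquitto/config/mosquitto.conf
    ports:
      - "1883:1883"
  flask:
    build: .
    environment:
      MQTT_BROKER_URL: mosquitto   # service name, not localhost
    depends_on:
      - mosquitto
```

With Flask-MQTT it also matters to issue the subscribe from the on-connect handler, so it runs after the CONNACK rather than racing it.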
docker-compose: reload the server each time I make changes to the code
I am dockerizing my Django app and I have the following docker-compose.yml file: App logs: I want my container (or the app) to reload whenever I make changes to my models, functions, etc. For now, when I add some functionality to my app or edit some HTML views, no changes appear and the app logs don’t say
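A sketch of what typically fixes this: the Dockerfile’s COPY bakes the code in at build time, so edits on the host never reach the running container unless the source directory is bind-mounted over it. The paths and service name are assumptions:

```yaml
# docker-compose.yml sketch: mount the project over the image’s
# copy of the code so runserver’s autoreloader sees every edit.
services:
  web:
    build: .
    command: python manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app        # assumes WORKDIR /app in the Dockerfile
    ports:
      - "8000:8000"
```

Without the volumes entry, seeing a change requires a docker-compose up --build each time.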
How to set up psycopg2 in a Docker container running on a droplet?
I’m trying to wrap a scraping project in a Docker container to run it on a droplet. The spider scrapes a website and then writes the data to a Postgres database. The Postgres database is already running and managed by DigitalOcean. When I run the command locally to test, everything is fine: I can watch the spider writing to the
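A hedged sketch of the usual stumbling block here: DigitalOcean’s managed Postgres listens on port 25060 and requires TLS, which local defaults quietly paper over. The credentials and hostname below are placeholders:

```yaml
# docker-compose.yml sketch for the scraper service.
services:
  scraper:
    build: .
    environment:
      # Managed DO clusters need the non-default port and
      # sslmode=require on the connection string.
      DATABASE_URL: "postgresql://user:pass@example.db.ondigitalocean.com:25060/defaultdb?sslmode=require"
```

It’s also worth checking that the droplet’s IP is listed in the database cluster’s trusted sources.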
Suddenly can’t connect to my Docker Postgres from Python
I’ve been trying to configure my M1 to work with an older Ruby on Rails API, and I think in the process I’ve broken my ability to connect any of my Python APIs to their database images running locally in Docker. When I run: Instead of the lovely psql blinking cursor allowing me to run any SQL statement I’d like
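One guess worth testing, since the Rails setup likely installed Postgres via Homebrew: a native server may now own host port 5432, so connections reach it instead of the container. A sketch that sidesteps the clash by publishing on a different host port (5433 and the credentials are arbitrary):

```yaml
services:
  db:
    image: postgres:14
    environment:
      POSTGRES_PASSWORD: example   # placeholder
    ports:
      - "5433:5432"   # then: psql -h localhost -p 5433 -U postgres
```

Running brew services list shows whether a local postgresql service is competing for the default port.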
Python Mongo docker-compose Topology Error
I’m trying a really simple example of Mongo with Python and got an error. Dockerfile: run_test.sh: docker-compose.yaml: db_test.py: db.py: I run docker-compose up and get this output: So it looks like it can’t connect to the database. I didn’t change the ports or anything, and another example works like a charm with those settings, so I don’t really know what I’m missing.
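The usual shape of this failure, sketched under the assumption that the script dials localhost: inside the app container, localhost is the container itself, so pymongo’s server selection times out with a topology error. The service has to be addressed by name:

```yaml
# docker-compose.yaml sketch; MONGO_URL is a hypothetical variable
# the script would read instead of a hard-coded localhost.
services:
  mongo:
    image: mongo:6
  app:
    build: .
    environment:
      MONGO_URL: "mongodb://mongo:27017/test"
    depends_on:
      - mongo   # starts mongo first, though the app may still need a retry
```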
Docker-Compose Output File To Local Host
I have the docker-compose.yaml file below that sets up a database and runs a Python script. To run it, I perform the following: Here is the Dockerfile: Now, this works fine: I can see that the data has been properly populated in the generated MySQL database. The Python file at the end of the script should dump a CSV
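A sketch of the standard fix, assuming the script writes its CSV to a path like /app/output (not stated in the question): files created inside a container disappear with it unless that directory is bind-mounted to the host:

```yaml
# docker-compose.yaml sketch for the script’s service.
services:
  loader:
    build: .
    volumes:
      - ./output:/app/output   # a CSV written here lands in ./output on the host
```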
Is there a good way to cache apt and pip packages on my home network without changes to my clients?
I am testing and building a lot of containers over a somewhat slow internet connection. Every time I test a rebuild, it has to re-download all the apt and pip packages. I have seen a couple of solutions for caching locally with Docker, but I need the build process to work for other developers and also during deploy. I have also
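One approach that keeps clients untouched by default: run apt-cacher-ng on a LAN host and let builds opt in through a proxy build-arg, which other developers and the deploy pipeline can simply omit. The image name below is a community image and the tag is an assumption:

```yaml
# docker-compose.yml sketch for the cache host.
services:
  apt-cacher:
    image: sameersbn/apt-cacher-ng   # community image; pin a tag in practice
    ports:
      - "3142:3142"
    volumes:
      - apt-cache:/var/cache/apt-cacher-ng
volumes:
  apt-cache:
```

Builds then opt in with docker build --build-arg http_proxy=http://cache-host:3142 . on the slow connection; pip needs its own mirror (devpi is the common choice), since its HTTPS traffic won’t cache through a plain proxy.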
docker compose .env variables not set
I have the following docker-compose.yml: In my .env file I have the following: This is to start up a Flask API, but what I get when I run the container with docker compose --env-file .env up --build or docker-compose --env-file .env up --build is this: The API insists on starting on 127.0.0.1:5000, suggesting that the environment variables are not set
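A sketch of the distinction that usually explains this: --env-file feeds ${VAR} interpolation inside the compose file itself, but the variables only exist inside the container if the service forwards them. FLASK_RUN_HOST is a real Flask setting; the service name and port are assumed:

```yaml
# docker-compose.yml sketch for the api service.
services:
  api:
    build: .
    env_file: .env                 # copies every key in .env into the container
    environment:
      FLASK_RUN_HOST: 0.0.0.0      # flask run binds here instead of 127.0.0.1
    ports:
      - "5000:5000"
```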
Docker Compose missing Python package
To preface, I’m fairly new to Docker, Airflow, and Stack Overflow. I’ve got an instance of Airflow running in Docker on an Ubuntu (20.04.3) VM. I’m trying to get openpyxl installed at build time in order to use it as the engine for pd.read_excel. Here’s the Dockerfile with the install command: The requirements.txt file looks like this: And the docker-compose.yaml file looks
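For the official Airflow compose file there is a documented shortcut worth comparing against: the _PIP_ADDITIONAL_REQUIREMENTS variable installs extra packages at container start, with no image rebuild (the image tag below is illustrative, and the Airflow docs recommend baking a custom image for production):

```yaml
# docker-compose.yaml sketch, per the Airflow quick-start compose file.
services:
  airflow-webserver:
    image: apache/airflow:2.2.3
    environment:
      _PIP_ADDITIONAL_REQUIREMENTS: "openpyxl"
```

If the Dockerfile route is preferred instead, the pip install should run as the airflow user (USER airflow before the RUN line), or the package can land outside the interpreter Airflow actually uses.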