I’m trying to run a Docker container created from this Dockerfile. Building the container works, but when I execute the image I get the following message: Because of that I cannot press enter and continue, and the execution halts. Does anyone know how I should write my Dockerfile so that it directly executes streamlit run?
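A minimal sketch of a Dockerfile that launches Streamlit directly as the container’s main process; the app.py file name, the requirements.txt step, and port 8501 are assumptions, not details from the question:

```
# Minimal sketch, assuming the app lives in app.py and should listen on 8501
FROM python:3.9-slim

WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .

EXPOSE 8501

# Run streamlit as the container's main process, in headless mode so it
# does not sit waiting for interactive input
CMD ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0", "--server.headless=true"]
```

Running with --server.headless=true should keep Streamlit from pausing for keyboard input when no terminal is attached.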
Error in anyjson setup command: use_2to3 is invalid
This is a common error, and the most common solution is to downgrade setuptools to below version 58. That did not work for me. I tried installing python3-anyjson, but that didn’t work either. I’m at a complete loss; any advice or help is much appreciated. If it matters: this application is legacy spaghetti and I am trying to polish
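Since the plain setuptools downgrade reportedly did not help, one thing worth checking is pip’s build isolation, which can pull a newer setuptools into the build environment regardless of what is installed. A hedged sketch of both steps:

```
# use_2to3 was removed in setuptools 58, so pin setuptools below 58 in the
# environment that actually builds anyjson
pip install "setuptools<58"

# pip's build isolation can ignore that pin by creating a fresh build
# environment; disabling it forces the already-installed setuptools to be used
pip install --no-build-isolation anyjson
```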
Trying to supply PGPASS to Docker Image
New to Docker here. I’m trying to create a basic Dockerfile where I run a Python script that runs some queries in Postgres through psycopg2. I currently have a pgpass file set up in my environment variables so that I can run these tools without supplying a password in the code. I’m trying to replicate this in Docker. I have Windows
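A sketch of two common ways to get the password into the container without hard-coding it; the image name and paths are assumptions:

```
# Pass the password as an environment variable at run time; psycopg2 goes
# through libpq, which reads PGPASSWORD automatically
docker run -e PGPASSWORD=mysecret my-python-image

# Or mount an existing pgpass file into the container and point libpq at it
# (libpq ignores the file unless its permissions are 0600)
docker run -v /host/path/pgpass.conf:/root/.pgpass:ro -e PGPASSFILE=/root/.pgpass my-python-image
```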
How to test Docker container on another machine?
I wrote a simple script as follows: I dockerized it, put it in a container, and ran it from there. Everything is working fine. I have another machine running Linux. How can I test this container there? Do I need to simply copy-paste it, or what? Answer You need to install Docker on the other machine, and
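A sketch of the save/load route, assuming the image is called my-script-image (pushing the image to a registry and pulling it on the other machine works just as well):

```
# On the machine where the image already exists: export it to a tarball
docker save my-script-image -o my-script-image.tar

# Copy the tarball over (scp, USB drive, etc.), then on the Linux machine:
docker load -i my-script-image.tar
docker run --rm my-script-image
```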
Is there a good way to cache apt and pip packages on my home network without changes to my clients?
I am testing and building a lot of containers over a somewhat slow internet connection. Every time I test a rebuild it has to re-download all the apt and pip packages. I have seen a couple of solutions for caching locally with Docker, but I need the build process to also work for other developers and during deploy. I have also
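One common approach is an apt-cacher-ng instance on the LAN that each Dockerfile consumes as an optional build argument, so builds still work unchanged on machines that cannot reach it. A sketch, assuming a Debian/Ubuntu base image and a cache host at 192.168.1.10:

```
# Run the cache once on a LAN host:
#   docker run -d --name apt-cacher -p 3142:3142 sameersbn/apt-cacher-ng

# Dockerfile fragment: the proxy is an optional build argument, so builds on
# machines without the cache behave exactly as before
ARG APT_PROXY=""
RUN if [ -n "$APT_PROXY" ]; then \
      printf 'Acquire::http::Proxy "%s";\n' "$APT_PROXY" > /etc/apt/apt.conf.d/01proxy; \
    fi
RUN apt-get update && apt-get install -y --no-install-recommends build-essential

# Build command on a machine that can reach the cache:
#   docker build --build-arg APT_PROXY=http://192.168.1.10:3142 -t myimage .
```

pip can be redirected to a local caching index (for example devpi) in the same optional way, via a build argument that sets PIP_INDEX_URL for the RUN steps that call pip.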
Using bind mount in Docker (Ubuntu 20.04)
I am really new to Linux and Docker, and I want to avoid rebuilding the Docker image after every change to my code. I read that you can use a bind mount, but I can’t understand its syntax and usage. My Python files and Dockerfile are located at /etc/python-docker Answer You can mount your Python files inside the container with the -v
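A sketch of the -v syntax, using the /etc/python-docker path from the question and an assumed /app target and image name:

```
# Bind-mount the source directory from the host into the container, so code
# changes on the host are visible inside it without rebuilding the image
docker run -v /etc/python-docker:/app my-python-image

# Equivalent long form
docker run --mount type=bind,source=/etc/python-docker,target=/app my-python-image
```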
docker compose .env variables not set
I have the following docker-compose.yml: In my .env file I have the following: This is to start up a Flask API, but what I get when I run the container with docker compose --env-file .env up --build or docker-compose --env-file .env up --build is this: The API insists on starting on 127.0.0.1:5000, suggesting that the environment variables are not set
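A sketch of how the service might look, assuming the container starts the app with flask run; note that --env-file controls variable substitution inside docker-compose.yml itself, while env_file:/environment: is what actually puts variables into the container:

```
services:
  api:
    build: .
    env_file:
      - .env
    environment:
      FLASK_RUN_HOST: 0.0.0.0   # without this, flask run binds to 127.0.0.1
    ports:
      - "5000:5000"
```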
Updating API Prometheus metrics on an HTTP server error
I work on a Docker project with several containers and I want to use the Python Prometheus library to monitor some metrics in the containers, expose each container’s metrics on a local port inside the Docker network, and collect them in another container called Prometheus_exporter. For this purpose, I’ve defined several Prometheus metrics on my FastAPI app and I want to expose them on an
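A sketch of exposing prometheus_client metrics from a FastAPI container by mounting the library’s ASGI app under /metrics; the metric and route names here are made up for illustration:

```
# Sketch: serve prometheus_client metrics on the same port as the API so
# another container on the docker network can scrape http://<service>:<port>/metrics
from fastapi import FastAPI
from prometheus_client import Counter, make_asgi_app

app = FastAPI()
REQUESTS = Counter("myapp_requests_total", "Total requests handled")

@app.get("/ping")
def ping():
    REQUESTS.inc()
    return {"status": "ok"}

# Mount the Prometheus ASGI app to expose the metrics endpoint
app.mount("/metrics", make_asgi_app())
```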
Cannot run docker container. Error response from daemon pull
To use TensorFlow Serving, I had to use Docker. I downloaded the TensorFlow image using After that, I had to start TF Serving and map my directories. As a result I have an error: Answer The volume path contained spaces; putting quotes ("") around the path could solve the error. In this case, I changed the name of the directory.
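A sketch of the quoted form, with assumed paths and model name rather than the ones from the question:

```
# Quote the bind-mount argument so the space in the host path is not treated
# as an argument separator
docker run -p 8501:8501 \
  -v "C:/Users/me/my models/resnet:/models/resnet" \
  -e MODEL_NAME=resnet \
  tensorflow/serving
```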
Error using docker compose in AWS Code Pipeline
I’m deploying my dockerized Django app using AWS CodePipeline but I’m facing some Docker errors. error: docker-compose-deploy.yml buildspec.yml Answer Docker Hub limits the number of Docker image downloads (“pulls”) based on the account type of the user pulling the image. Pull rate limits are based on individual IP address. For anonymous users, the rate limit is set to 100
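A sketch of a buildspec.yml pre_build step that logs in to Docker Hub so pulls count against an account instead of the shared CodeBuild IP; the DOCKERHUB_USERNAME/DOCKERHUB_PASSWORD variables are assumptions and would typically come from CodeBuild environment variables or Secrets Manager:

```
version: 0.2
phases:
  pre_build:
    commands:
      # Authenticate before any image is pulled, to avoid the anonymous rate limit
      - echo "$DOCKERHUB_PASSWORD" | docker login --username "$DOCKERHUB_USERNAME" --password-stdin
  build:
    commands:
      - docker-compose -f docker-compose-deploy.yml build
```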