
Is there a good way to cache apt and pip packages on my home network without changes to my clients?

I am testing and building a lot of containers over a somewhat slow internet connection. Every time I test a rebuild it has to re-download all the apt and pip packages.

I have seen a couple of solutions for caching locally with Docker, but I need the build process to work for other developers and also during deployment.

I have also found a number of solutions for caching apt and pip with a local server, but all of them require changes to the clients, which also won't work for me.

Is there a way I can set up a router-level DNS redirect to a local package-caching server, or a similar solution?

Also, in the hope of avoiding the XY problem: is there a better solution I am missing?

Thanks!


Answer

I ended up adding two BuildKit cache mounts to the Dockerfile. For apt:

RUN --mount=type=cache,target=/var/cache/apt,sharing=locked \
    --mount=type=cache,target=/var/lib/apt,sharing=locked \
    apt update && apt install -y python3.8 ...
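
One caveat, assuming a Debian/Ubuntu base image (which the python3.8 package suggests): those images ship an apt hook, docker-clean, that deletes downloaded .deb files after every install, so the cache mount stays empty unless you disable the hook first. A minimal sketch, following the pattern in the Docker documentation:

# Debian/Ubuntu images delete downloaded .deb files after each install;
# drop that hook so the apt cache mount actually fills up across builds.
RUN rm -f /etc/apt/apt.conf.d/docker-clean && \
    echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' \
    > /etc/apt/apt.conf.d/keep-cache

The sharing=locked option makes concurrent builds that use the same cache wait for each other, which matters here because apt's lists and archives are not safe for parallel writers.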

And for pip:

RUN --mount=type=cache,target=/root/.cache/pip pip3 install -r requirements.txt

That made the process much less painful. And since everything lives in the Dockerfile itself, other developers and the deploy pipeline need no changes; on a host where the cache is cold, packages are simply downloaded as before.
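
Note that --mount on RUN is a BuildKit feature. BuildKit is the default builder in recent Docker releases; on older setups you can put

# syntax=docker/dockerfile:1

as the first line of the Dockerfile and enable BuildKit per build (myapp is just a placeholder tag):

DOCKER_BUILDKIT=1 docker build -t myapp .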
