I know that Python's import mechanism doesn't allow relative imports without a known parent package, but I want to know what the reason is.
If relative imports worked without a known parent package, it would make developers' lives much easier (I think; correct me if I am wrong).
Example:
From a Python script I have to import a global definitions file. Right now I have to do it like this:
import sys

DEFS_PATH = '../../../'
sys.path.append(DEFS_PATH)

import DEFINITIONS as defs
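Note that a relative entry like '../../../' on sys.path is resolved against the current working directory, not the script's location. Here is a minimal sketch of the same workaround anchored to the script itself (the three-levels-up layout and the DEFINITIONS module name are just the ones from the example above):

import sys
from pathlib import Path

# Resolve the directory three levels above this script's own directory,
# independent of where the script is launched from.
DEFS_PATH = Path(__file__).resolve().parents[3]
sys.path.append(str(DEFS_PATH))

import DEFINITIONS as defs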
If I could import this file just like this, without having to specify the -m flag when executing the script or create an __init__.py file that collects all the packages, it would make everything much easier:

from .... import DEFINITIONS as defs
Of course, doing this raises the famous import error:

ImportError: attempted relative import with no known parent package

Obviously this is a toy example, but imagine having to repeat this in hundreds of Python scripts…
Is there any workaround for doing relative imports without a known parent package that doesn't involve the hacky, ugly way (sys.path.append(...)) or running the script with python -m myscript?
Answer
I solved this problem, but in a different way. I have a folder with lots of global functions that I use in different packages. I could turn this folder into a Python package, but then I would have to rebuild it each time I changed something.
The solution that worked for me was to add a user-packages.pth file to the site-packages directory of my current environment (it could also be added to the global site-packages folder). Inside this user-packages.pth file I added the absolute path to the directory where all my global utils live. Now, from any Python script, I can simply do:
from utils import data_processing as dp
from utils.database import database_connection as dc
Now I don't need to add sys.path.append("path/to/myutils/") to each file.
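To make this concrete, here is roughly what the setup looks like (the C:\projects\shared path and the file names below are hypothetical placeholders, not my actual layout). The important detail is that the path written into the .pth file is the directory that contains the utils folder, so that from utils import ... can resolve:

# Hypothetical layout:
#
#   C:\projects\shared\            <- this absolute path goes into user-packages.pth
#       utils\
#           __init__.py            <- optional on Python 3 (implicit namespace packages)
#           data_processing.py
#           database\
#               __init__.py
#               database_connection.py
#
# user-packages.pth (placed in site-packages) then contains the single line:
#
#   C:\projects\shared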
Note:
The .pth file can have any file name (customName.pth), and the paths inside it should be separated by newlines ("\n"). Also, the paths should be absolute.
For example:
C:\path\to\utils1
C:\path\to\other\utils2
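If you are not sure where the site-packages directory of your current environment is (that is, where to put the .pth file), you can ask Python directly; a quick check, assuming Python 3:

import site
import sysconfig

# site-packages of the interpreter / virtual environment currently running
print(sysconfig.get_paths()["purelib"])

# the per-user site-packages directory, which is also scanned for .pth files
print(site.getusersitepackages())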