scrapy not running ModuleNotFoundError: No module named ‘scraper.settings’

I am getting the error below when running my Scrapy project. I have tried everything suggested on Stack Overflow, but nothing has solved the problem so far.

Feel free to ask for more information. Looking forward to any help.

(venv) [kalpesh@localhost scraper]$ scrapy crawl mrdeepfakes -a output=db
Traceback (most recent call last):
  File "/home/kalpesh/venv/bin/scrapy", line 8, in <module>
    sys.exit(execute())
  File "/home/kalpesh/venv/lib/python3.6/site-packages/scrapy/cmdline.py", line 113, in execute
    settings = get_project_settings()
  File "/home/kalpesh/venv/lib/python3.6/site-packages/scrapy/utils/project.py", line 69, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "/home/kalpesh/venv/lib/python3.6/site-packages/scrapy/settings/__init__.py", line 287, in setmodule
    module = import_module(module)
  File "/home/kalpesh/venv/lib64/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scraper.settings'


Answer

Make sure the names in your scrapy.cfg file are consistent with your Scrapy project package: the default settings module and the project name in scrapy.cfg must match the name of the folder that contains settings.py and the spiders folder. In my case I changed the project name in the Python files and then updated scrapy.cfg to match, and it worked (a sketch of a consistent layout follows below).
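
For reference, here is a minimal sketch of a layout where the names line up, assuming the project package is called scraper (the package and spider file names are illustrative, substitute your own):

scraper/                  <- run scrapy from here (or below); contains scrapy.cfg
    scrapy.cfg
    scraper/              <- project package; scrapy.cfg must point at this name
        __init__.py
        items.py
        pipelines.py
        settings.py
        spiders/
            __init__.py
            mrdeepfakes.py    <- spider file; name is illustrative

And the matching scrapy.cfg:

[settings]
default = scraper.settings

[deploy]
project = scraper

If the inner package is named anything other than scraper, the line default = scraper.settings cannot be imported and fails with exactly the ModuleNotFoundError shown above.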

Screenshot of scrapy.cfg
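
Since the screenshot may not be visible here, you can also verify the module path directly from the directory that contains scrapy.cfg, again assuming the package is named scraper:

(venv) [kalpesh@localhost scraper]$ python -c "import scraper.settings; print('ok')"
ok

If this import fails, the name configured in scrapy.cfg does not match the package on disk (or the package is missing an __init__.py), which is what produces the traceback above.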

User contributions licensed under: CC BY-SA