7
votes

I have a project with a django and scrapy folder in the same workspace:

my_project/
    django_project/
        django_project/
            settings.py
        app1/
        app2/
        manage.py
        ...
    scrapy_project/
        scrapy_project/
            settings.py
        scrapy.cfg
        ...

I've already connected Scrapy with my Django app1 model, so every time I run my spider it stores the collected data in my PostgreSQL db. This is how my Scrapy project accesses the Django model:

#in my_project/scrapy_project/scrapy_project/settings.py
import sys
import os
import django

# make the Django project importable; the path is built relative to this settings file
sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', '..', 'django_project'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'django_project.settings'
django.setup()
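
For reference, the piece that actually saves the data is a regular Scrapy item pipeline using the Django ORM; a minimal sketch (the Article model and its fields are placeholders, not my real app1 code) looks like this:

# in my_project/scrapy_project/scrapy_project/pipelines.py
# Sketch only: "Article" and its fields stand in for the real app1 model.
from app1.models import Article

class PostgresSavePipeline:
    def process_item(self, item, spider):
        # Each scraped item is persisted through the Django ORM into PostgreSQL.
        Article.objects.create(
            title=item['title'],
            url=item['url'],
        )
        return item

# Enabled via ITEM_PIPELINES in the Scrapy settings, e.g.
# ITEM_PIPELINES = {'scrapy_project.pipelines.PostgresSavePipeline': 300}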

Everything works great when I call the spider from the command line, but when I try to call the spider as a script from a Django view or a Celery task, for example:

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings
process = CrawlerProcess(get_project_settings())
process.crawl('spider_name')
process.start()

I get an error:

KeyError: 'Spider not found: spider_name'

I think I'm supposed to tell Django where Scrapy is located (as I've done in the Scrapy settings), but I don't know how. To be honest, I'm not even sure that the folder structure I've designed for this project is the right choice.

1
I guess your scrapy_project is not in the PATH. - Andrey Shipilov
Is it enough that scrapy_project is on the PATH? I think that scrapy_project must be the current working directory (but I only briefly read the code). - aufziehvogel
I fixed the issue by putting a symlink to scrapy.cfg in django_project/ directory. - Andriy Tykhonov

1 Answer

0
votes

Follow the example from the Scrapy docs on running spiders from a script:

from my_project.scrapy_project.spiders import MySpider
...
process.crawl(MySpider)
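
A slightly fuller sketch of that approach, wired into the layout from the question (the absolute path and the spider module/class names are placeholders): make the Scrapy package importable, point SCRAPY_SETTINGS_MODULE at the Scrapy settings so get_project_settings() can find them without a scrapy.cfg, then pass the spider class instead of its string name:

import os
import sys

from scrapy.crawler import CrawlerProcess
from scrapy.utils.project import get_project_settings

# Make the outer scrapy_project directory importable (placeholder path).
sys.path.append('/path/to/my_project/scrapy_project')
# Tell Scrapy which settings module to load, since there is no scrapy.cfg
# next to the Django process.
os.environ['SCRAPY_SETTINGS_MODULE'] = 'scrapy_project.settings'

# Loading the settings also runs the django.setup() call inside them.
process = CrawlerProcess(get_project_settings())

# Import the spider class after the settings are loaded
# (module and class names are placeholders).
from scrapy_project.spiders.my_spider import MySpider

process.crawl(MySpider)  # pass the class, not the string name
process.start()

Note that CrawlerProcess starts a Twisted reactor, which can only be started once per process, so from a long-running Django or Celery worker it is often safer to run the crawl in a subprocess.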