How to import django models in scrapy pipelines.py file

Submitted by 天涯浪子 on 2020-01-22 16:17:26

Question


I'm trying to import the models of one of my Django apps into my pipelines.py so I can save scraped data through the Django ORM. I created a Scrapy project, scrapy_project, inside the first Django app involved, "app1" (is that a good choice, by the way?). I added these lines to my Scrapy settings file:

import imp
import os

def setup_django_env(path):
    from django.core.management import setup_environ

    f, filename, desc = imp.find_module('settings', [path])
    project = imp.load_module('settings', f, filename, desc)

    setup_environ(project)

current_dir = os.path.abspath(os.path.dirname(os.path.dirname(__file__)))
setup_django_env(os.path.join(current_dir, '../../d_project1'))

When I try to import the models of my Django application app1, I get this error message:

Traceback (most recent call last):
  File "/usr/local/bin/scrapy", line 4, in <module>
    execute()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 122, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 76, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 129, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/commands/crawl.py", line 43, in run
    spider = self.crawler.spiders.create(spname, **opts.spargs)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/command.py", line 33, in crawler
    self._crawler.configure()
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 41, in configure
    self.engine = ExecutionEngine(self, self._spider_closed)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/engine.py", line 63, in __init__
    self.scraper = Scraper(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/core/scraper.py", line 66, in __init__
    self.itemproc = itemproc_cls.from_crawler(crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 50, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/middleware.py", line 29, in from_settings
    mwcls = load_object(clspath)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 39, in load_object
    raise ImportError, "Error loading object '%s': %s" % (path, e)
ImportError: Error loading object 'scrapy_project.pipelines.storage.storage': No module named dydict.models

Why can't Scrapy access the Django application's models (given that app1 is in INSTALLED_APPS)?


Answer 1:


Try:

from ..models import MyModel

OR

from ...models import MyModel

Each leading dot climbs one package level up from the module doing the import.
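A self-contained demonstration of what the dots resolve to, using a throwaway package built in a temp directory (the names pkg, sub, and models are made up for illustration):

```python
# Builds a disposable package on disk to show what each leading dot in a
# relative import resolves to. Names (pkg, sub, models) are illustrative.
import importlib
import os
import sys
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, 'pkg', 'sub'))
for init in (('pkg', '__init__.py'), ('pkg', 'sub', '__init__.py')):
    open(os.path.join(root, *init), 'w').close()

with open(os.path.join(root, 'pkg', 'models.py'), 'w') as f:
    f.write('MyModel = "model"\n')

# One dot = current package (pkg.sub); two dots = its parent (pkg).
with open(os.path.join(root, 'pkg', 'sub', 'consumer.py'), 'w') as f:
    f.write('from ..models import MyModel\n')

sys.path.insert(0, root)
consumer = importlib.import_module('pkg.sub.consumer')
print(consumer.MyModel)  # -> model
```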




Answer 2:


In the pipelines you don't import Django models directly; you use Scrapy items bound to Django models. You have to add the Django settings to the Scrapy settings up front, not afterwards.

To use Django models in a Scrapy project you have to use DjangoItem, https://github.com/scrapy-plugins/scrapy-djangoitem (and make the Django project importable on your PYTHONPATH).

My recommended file structure is:

Projects
 |-DjangoScrapy
     |-DjangoProject
     |     |-Djangoproject
     |     |-DjangoAPP
     |-ScrapyProject
            |-ScrapyProject
                 |-Spiders

Then in your Scrapy project's settings you have to add the full path to the Django project to the PYTHONPATH:

# Setting up django's project full path.
import sys
sys.path.insert(0, '/home/PycharmProject/scrap/DjangoProject')

# Setting up django's settings module name.
import os
os.environ['DJANGO_SETTINGS_MODULE'] = 'DjangoProject.settings'
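One caveat: on Django 1.7 and later you also have to call django.setup() after setting DJANGO_SETTINGS_MODULE and before importing any models. A minimal sketch, reusing the example path above (adjust it to your own project):

```python
import os
import sys

sys.path.insert(0, '/home/PycharmProject/scrap/DjangoProject')
os.environ['DJANGO_SETTINGS_MODULE'] = 'DjangoProject.settings'

import django
django.setup()  # loads the app registry so model imports work
```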

Then in your items.py you can bind your Django models to Scrapy items:

from DjangoProject.models import Person, Job
from scrapy_djangoitem import DjangoItem

class PersonItem(DjangoItem):
    django_model = Person

class JobItem(DjangoItem):
    django_model = Job

Then you can use the .save() method in the pipelines after an item is yielded:

spider.py

from scrapy.spider import BaseSpider
from mybot.items import PersonItem

class ExampleSpider(BaseSpider):
    name = "example"
    allowed_domains = ["dmoz.org"]
    start_urls = ['http://www.dmoz.org/World/Espa%C3%B1ol/Artes/Artesan%C3%ADa/']

    def parse(self, response):
        # do stuff
        return PersonItem(name='zartch')

pipelines.py

from myapp.models import Person

class MybotPipeline(object):
    def process_item(self, item, spider):
        # get_or_create returns a (object, created) tuple
        obj, created = Person.objects.get_or_create(name=item['name'])
        return item
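Note that get_or_create returns a (object, created) tuple rather than the object alone. A dict-backed stand-in (no Django required; names are illustrative) showing that contract:

```python
# Simulates Django's QuerySet.get_or_create contract: it returns a
# (instance, created) tuple, never the bare instance.
_store = {}

def get_or_create(name):
    if name in _store:
        return _store[name], False
    _store[name] = {'name': name}
    return _store[name], True

obj, created = get_or_create('zartch')
print(created)    # first call creates the record -> True
obj2, created2 = get_or_create('zartch')
print(created2)   # second call fetches the existing one -> False
```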

I have a repository with minimal working code (you just have to set the path of your Django project in the Scrapy settings): https://github.com/Zartch/Scrapy-Django-Minimal

In https://github.com/Zartch/Scrapy-Django-Minimal/blob/master/mybot/mybot/settings.py you have to change my Django project path to your own DjangoProject path:

sys.path.insert(0, '/home/zartch/PycharmProjects/Scrapy-Django-Minimal/myweb')


Source: https://stackoverflow.com/questions/15321584/how-to-import-django-models-in-scrapy-pipelines-py-file
