Custom signal not being handled by Scrapy internal API

Submitted by ぐ巨炮叔叔 on 2019-12-11 09:44:41

Question


I am trying to handle a custom signal, 'signalizers.item_extracted', in a Scrapy extension 'MyExtension', which is successfully enabled when Scrapy starts. Here is my code:

signalizers.py

from scrapy.exceptions import NotConfigured

# custom signals
item_extracted = object()
item_transformed = object()


class MyExtension(object):

    def __init__(self):
        pass

    @classmethod
    def from_crawler(cls, crawler):
        # first check if the extension should be enabled and raise
        # NotConfigured otherwise
        if not crawler.settings.getbool('MYEXTENSION_ENABLED'):
            raise NotConfigured

        # instantiate the extension object
        ext = cls()

        # connect the extension object to signals
        crawler.signals.connect(ext.item_extracted, signal=item_extracted)

        # return the extension object
        return ext

    def item_extracted(self, item, spider):
        # Do some stuff
        pass
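
For the extension to be loaded at all, it must also be registered in the project's settings. A minimal sketch (the dotted path `myproject.signalizers.MyExtension` is assumed and depends on your project layout):

```python
# settings.py (sketch)
MYEXTENSION_ENABLED = True

EXTENSIONS = {
    'myproject.signalizers.MyExtension': 500,
}
```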

Then I try to send the 'signalizers.item_extracted' signal, but I think it is not handled, or at least I can neither see the actual output nor debug it:

In the spider:

from scrapy.signalmanager import SignalManager
from pydispatch import dispatcher

SignalManager(dispatcher.Any).send_catch_log(
    signal=signalizers.item_extracted,
    item=item,
    spider=spider)

Am I doing something wrong here?


Answer 1:


After reading some of Scrapy's source code, I figured out the problem: I was creating a new SignalManager instance instead of using the crawler's own one:

spider.crawler.signals.send_catch_log(signal=signalizers.item_extracted, item=item, spider=spider)

Now it is properly handled by the extension.
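
The underlying reason is that handlers are registered against a particular sender: `crawler.signals.connect(...)` binds the receiver to that crawler, so a freshly constructed manager with a different sender never reaches it. A toy, pure-Python sketch of that sender-keyed dispatch (this is an analogy, not Scrapy's actual implementation):

```python
from collections import defaultdict


class ToyDispatcher:
    """Toy signal manager: receivers are keyed by (signal, sender)."""

    _registry = defaultdict(list)  # shared registry across instances

    def __init__(self, sender):
        self.sender = sender

    def connect(self, receiver, signal):
        # register the receiver under this manager's sender
        self._registry[(id(signal), id(self.sender))].append(receiver)

    def send(self, signal, **kwargs):
        # only receivers registered under the SAME sender are called
        return [r(**kwargs)
                for r in self._registry[(id(signal), id(self.sender))]]


item_extracted = object()

crawler = object()  # stands in for the real crawler
crawler_signals = ToyDispatcher(sender=crawler)
crawler_signals.connect(lambda item: "handled %s" % item,
                        signal=item_extracted)

# Sending through the crawler's manager reaches the handler...
print(crawler_signals.send(item_extracted, item="foo"))  # ['handled foo']

# ...but a brand-new manager with a different sender finds no receivers.
other_signals = ToyDispatcher(sender=object())
print(other_signals.send(item_extracted, item="foo"))    # []
```

This is why `spider.crawler.signals` works where `SignalManager(dispatcher.Any)` silently did nothing: the fix routes the send through the same manager (and sender) that the extension connected to.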



Source: https://stackoverflow.com/questions/30870872/custom-signal-not-being-handled-by-scrapy-internal-api
