鱼C论坛

Views: 2758 | Replies: 6

Help: Scrapy "Spider not found: tutorial"

Posted 2017-11-12 15:56:37 · Bounty: 5 fish coins
I've tried a lot of fixes found online and none of them work... I'm at a loss.

Running scrapy crawl books -o books.csv fails with the error below.

Source code (cleaned up; in the original, the next-page request sat inside the for loop and used the relative href directly, which would re-request the next page once per book and crash on the last page, so it has been moved out, joined against the page URL, and guarded):

```python
# -*- coding: utf-8 -*-
import scrapy


class BooksSpider(scrapy.Spider):
    # unique name that identifies the spider to `scrapy crawl`
    name = "books"

    # starting point of the crawl
    start_urls = ['http://books.toscrape.com/']

    def parse(self, response):
        # each book's details live in <article class="product_pod">
        for book in response.css('article.product_pod'):
            name = book.xpath('./h3/a/@title').extract_first()
            price = book.css('p.price_color::text').extract_first()
            yield {
                'name': name,
                'price': price,
            }

        # follow the "next" link once per page, outside the loop;
        # the href is relative, so join it with the current page URL,
        # and stop when the last page has no next link
        next_url = response.css('ul.pager li.next a::attr(href)').extract_first()
        if next_url:
            yield scrapy.Request(response.urljoin(next_url), callback=self.parse)
```



Error output:

```
2017-11-12 13:58:04 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: tutorial)
2017-11-12 13:58:04 [scrapy.utils.log] INFO: Overridden settings: {'BOT_NAME': 'tutorial', 'NEWSPIDER_MODULE': 'tutorial.spiders', 'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['tutorial.spiders']}
Traceback (most recent call last):
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/spiderloader.py", line 69, in load
    return self._spiders[spider_name]
KeyError: 'tutorial'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/onec/anaconda3/bin/scrapy", line 11, in <module>
    sys.exit(execute())
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 109, in execute
    settings = get_project_settings()
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/utils/project.py", line 68, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/settings/__init__.py", line 292, in setmodule
    module = import_module(module)
  File "/Users/onec/anaconda3/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 941, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/Users/onec/tutorial/tutorial/__init__.py", line 2, in <module>
    cmdline.execute("scrapy crawl tutorial".split())
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 149, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 89, in _run_print_help
    func(*a, **kw)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/cmdline.py", line 156, in _run_command
    cmd.run(args, opts)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 167, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 195, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/crawler.py", line 199, in _create_crawler
    spidercls = self.spider_loader.load(spidercls)
  File "/Users/onec/anaconda3/lib/python3.6/site-packages/scrapy/spiderloader.py", line 71, in load
    raise KeyError("Spider not found: {}".format(spider_name))
KeyError: 'Spider not found: tutorial'
one-2:tutorial onec$
```
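The decisive frames are near the end of the traceback: /Users/onec/tutorial/tutorial/__init__.py, line 2, contains cmdline.execute("scrapy crawl tutorial".split()). That means merely importing the project's settings package re-runs a crawl command, and that command asks for a spider named tutorial, while the spider defined above registers itself as books. Below is a simplified sketch of the name lookup that SpiderLoader performs (a hypothetical minimal re-implementation to show the shape of the failure, not Scrapy's actual code):

```python
class SpiderLoader:
    """Minimal sketch: Scrapy keys its spider registry by each spider
    class's `name` attribute, and `scrapy crawl <name>` looks up that key."""

    def __init__(self, spider_classes):
        # build the registry: spider name -> spider class
        self._spiders = {cls.name: cls for cls in spider_classes}

    def load(self, spider_name):
        try:
            return self._spiders[spider_name]
        except KeyError:
            # re-raise with the message seen in the real traceback
            raise KeyError("Spider not found: {}".format(spider_name))


class BooksSpider:
    name = "books"  # the key that `scrapy crawl books` resolves


loader = SpiderLoader([BooksSpider])
assert loader.load("books") is BooksSpider

try:
    loader.load("tutorial")  # the bot/project name is not a spider name
except KeyError as e:
    print(e)  # 'Spider not found: tutorial'
```

Two fixes follow from this: keep tutorial/__init__.py empty (the cmdline.execute line does not belong there), and invoke the spider by its name attribute, i.e. scrapy crawl books.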


Reply, posted 2017-11-12 16:28:25

You should take a look at your project's directory structure; the problem may be in the structure itself.
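For reference, the layout that scrapy startproject tutorial generates looks roughly like this (the spider filename is whatever the poster chose; books_spider.py here is only an assumed name):

```
tutorial/                  # project root — run scrapy commands from here
├── scrapy.cfg             # project configuration file
└── tutorial/              # the project's Python package
    ├── __init__.py        # should be empty (no cmdline.execute here)
    ├── items.py
    ├── middlewares.py
    ├── pipelines.py
    ├── settings.py        # BOT_NAME = 'tutorial' lives here
    └── spiders/
        ├── __init__.py
        └── books_spider.py   # defines BooksSpider with name = "books"
```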

Reply by thread starter, posted 2017-11-12 19:03:05

Quoting ooxx7788 (2017-11-12 16:28): "You should take a look at your project's directory structure; the problem may be in the structure itself."

[attachment: 1.png]

Reply, posted 2017-11-12 20:02:01

Your error output doesn't seem to match your code.

```
2017-11-12 13:58:04 [scrapy.utils.log] INFO: Scrapy 1.4.0 started (bot: tutorial)
```

The bot in this line of yours is tutorial.
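The bot: tutorial in that log line comes from BOT_NAME in the project's settings.py, which startproject fills in with the project name; it is unrelated to the spider's name attribute that scrapy crawl matches against. The "Overridden settings" line in the log implies a settings.py roughly like the following (a sketch reconstructed from the log output, not the poster's actual file):

```python
# settings.py — values matching the "Overridden settings" line in the log
BOT_NAME = 'tutorial'                  # project identifier shown in log output
SPIDER_MODULES = ['tutorial.spiders']  # where Scrapy searches for Spider subclasses
NEWSPIDER_MODULE = 'tutorial.spiders'
ROBOTSTXT_OBEY = True                  # makes Scrapy fetch robots.txt before crawling
```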

Reply by thread starter, posted 2017-11-12 20:43:20

Quoting ooxx7788 (2017-11-12 20:02): "Your error output doesn't seem to match your code. The bot in this line of yours is tutorial."

My head is spinning... I've been fighting with this ever since installing pip.

Reply, posted 2017-11-12 20:50:39

Quoting onedesigners (2017-11-12 20:43): "My head is spinning... I've been fighting with this ever since installing pip."

Go through the demo again from the start; Scrapy isn't that easy to pick up.
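If starting over, the standard sequence is the one below (assuming Scrapy 1.4 is installed; genspider only scaffolds a spider file that then needs editing):

```shell
scrapy startproject tutorial                 # creates scrapy.cfg and the tutorial/ package
cd tutorial
scrapy genspider books books.toscrape.com    # scaffolds tutorial/spiders/books.py
scrapy crawl books -o books.csv              # run by spider name, from the project root
```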

Reply by thread starter, posted 2017-11-13 13:41:10

Quoting ooxx7788 (2017-11-12 20:50): "Go through the demo again from the start; Scrapy isn't that easy to pick up."

OK, will do.
