Can I run Scrapy with Python 3 on Windows?

mrfwxfqh, posted on 2022-11-09 in Windows

It looks like Scrapy 1.1.0rc3 does not run with Python 3 on Windows.
When I run the scrapy crawl dmoz command from the Scrapy tutorial, I get the following exception:

D:\Copy From 2\Python Project\ZhihuPlan\tutorial\tutorial>scrapy crawl dmoz
2016-04-26 14:40:36 [scrapy] INFO: Scrapy 1.1.0rc3 started (bot: tutorial)
2016-04-26 14:40:36 [scrapy] INFO: Overridden settings: {'ROBOTSTXT_OBEY': True, 'SPIDER_MODULES': ['tutorial.spiders'], 'BOT_NAME': 'tutorial', 'NEWSPIDER_MODULE': 'tutorial.spiders'}
2016-04-26 14:40:36 [scrapy] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats', 'scrapy.extensions.logstats.LogStats']
Unhandled error in Deferred:
2016-04-26 14:40:36 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "D:\Anaconda\Lib\site-packages\scrapy\commands\crawl.py", line 57, in run
    self.crawler_process.crawl(spname,**opts.spargs)
  File "D:\Anaconda\Lib\site-packages\scrapy\crawler.py", line 163, in crawl
    return self._crawl(crawler, *args,**kwargs)
  File "D:\Anaconda\Lib\site-packages\scrapy\crawler.py", line 167, in _crawl
    d = crawler.crawl(*args,**kwargs)
  File "D:\Anaconda\Lib\site-packages\twisted\internet\defer.py", line 1274, in unwindGenerator
    return _inlineCallbacks(None, gen, Deferred())
--- <exception caught here> ---
  File "D:\Anaconda\Lib\site-packages\twisted\internet\defer.py", line 1128, in _inlineCallbacks
    result = g.send(result)
  File "D:\Anaconda\Lib\site-packages\scrapy\crawler.py", line 72, in crawl
    self.engine = self._create_engine()
  File "D:\Anaconda\Lib\site-packages\scrapy\crawler.py", line 97, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "D:\Anaconda\Lib\site-packages\scrapy\core\engine.py", line 68, in __init__
    self.downloader = downloader_cls(crawler)
  File "D:\Anaconda\Lib\site-packages\scrapy\core\downloader\__init__.py", line 88, in __init__
    self.middleware = DownloaderMiddlewareManager.from_crawler(crawler)
  File "D:\Anaconda\Lib\site-packages\scrapy\middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "D:\Anaconda\Lib\site-packages\scrapy\middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "D:\Anaconda\Lib\site-packages\scrapy\utils\misc.py", line 44, in load_object
    mod = import_module(module)
  File "d:\anaconda\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 986, in _gcd_import
  File "<frozen importlib._bootstrap>", line 969, in _find_and_load
  File "<frozen importlib._bootstrap>", line 958, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 673, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 662, in exec_module
  File "<frozen importlib._bootstrap>", line 222, in _call_with_frames_removed
  File "D:\Anaconda\Lib\site-packages\scrapy\downloadermiddlewares\retry.py", line 23, in <module>
    from scrapy.xlib.tx import ResponseFailed
  File "D:\Anaconda\Lib\site-packages\scrapy\xlib\tx\__init__.py", line 3, in <module>
    from twisted.web import client
  File "D:\Anaconda\Lib\site-packages\twisted\web\client.py", line 41, in <module>
    from twisted.internet.endpoints import TCP4ClientEndpoint, SSL4ClientEndpoint
  File "D:\Anaconda\Lib\site-packages\twisted\internet\endpoints.py", line 34, in <module>
    from twisted.internet.stdio import StandardIO, PipeAddress
  File "D:\Anaconda\Lib\site-packages\twisted\internet\stdio.py", line 30, in <module>
    from twisted.internet import _win32stdio
builtins.ImportError: cannot import name '_win32stdio'
2016-04-26 14:40:36 [twisted] CRITICAL:

Is there any way to fix this error? Can I run Scrapy with Python 3 on Windows?

fsi0uk1n #1

Unfortunately, you cannot run Scrapy with Python 3 on Windows.
Scrapy does not currently support Python 3 on Windows. See the release notes here (scroll down to the limitations section): https://blog.scrapinghub.com/2016/02/04/python-3-support-with-scrapy-1-1rc1/ We are working on fixing this.
In the meantime, you can try using a different Python version.
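Since the question already uses Anaconda, one way to follow this suggestion is to create a separate conda environment with an older Python. This is only a sketch, not part of the original answer: the environment name scrapy-py2 is a placeholder, and it assumes a Python 2.7 build of Scrapy is available in your conda channels.

conda create -n scrapy-py2 python=2.7 scrapy
conda activate scrapy-py2

Running scrapy crawl dmoz from inside that environment would then use the Python 2.7 interpreter instead of Python 3.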

9wbgstp7 #2

Scrapy now works on Windows 10 with Python 3. I just tried it from the conda prompt.
First, open conda and activate the environment that has scrapy installed:

conda activate <your environment>

Then run the spider with

scrapy runspider <path.to.file.py>

...and Scrapy should run.
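For reference, here is a minimal standalone spider file that scrapy runspider can execute directly. This is only a sketch, not from the original answer: the file name quotes_spider.py, the spider name and the URL quotes.toscrape.com are placeholders you would replace with your own.

# quotes_spider.py -- minimal self-contained spider, no Scrapy project required
import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["http://quotes.toscrape.com/"]

    def parse(self, response):
        # Yield one dict per quote block on the page
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }

Running scrapy runspider quotes_spider.py from the activated environment should print the yielded items in the crawl log.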

blpfk2vs #3

You can use scrapy runspider file_name.py -O file_name.csv to save the output in .csv format.
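For example, with the hypothetical quotes_spider.py sketched above, the dicts it yields would be written out as CSV rows:

scrapy runspider quotes_spider.py -O quotes.csv

Note that the capital -O flag overwrites the output file on each run (and needs a reasonably recent Scrapy release), while the lowercase -o flag appends to an existing file.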

h7wcgrx3 #4

Scrapy works with Python 3 on Windows.
After installing Anaconda or Miniconda, install Scrapy with:

conda install -c conda-forge scrapy

Then run your spider with:

scrapy crawl spider_name

The installation procedure is described here: https://doc.scrapy.org/en/1.4/intro/install.html
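As a quick sanity check (my own suggestion, not part of the original answer), you can confirm which Python and Twisted versions the installed Scrapy is using with:

scrapy version -v

which prints the versions of Scrapy, Twisted, Python and related libraries in the active environment.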
