Error installing Scrapy package in Python

I’m trying to install Scrapy through Pip, but I’ve been getting errors like:

running build_ext

building 'lxml.etree' extension

error: Microsoft Visual C++ 10.0 is required (Unable to find vcvarsall.bat).

I already have Visual C++ 10 installed. I am using Python 3.4.4 on Windows 7. Can anyone help me?

3 answers

Many Python packages are a pain to install with pip on Windows because they need a compiler and other build tools.

I would try installing a precompiled build instead:

  1. Find the link to the package you want here: http://www.lfd.uci.edu/~gohlke/pythonlibs/#scrapy. Normally you would need to choose the right version (Python 3 vs. Python 2, 32-bit vs. 64-bit), but in this case it seems there is only one build.
  2. Download the wheel (.whl) file from that link.
  3. Install it with pip: pip install c:\path\to\Scrapy-1.2.1-py2.py3-none-any.whl

This is how I install NumPy and Matplotlib, among others, on Windows.
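For example (a rough sketch only; the path and file names below are placeholders, and the exact wheel names depend on which builds you download), the install from a Command Prompt would look like:

    rem Install the downloaded Scrapy wheel; adjust the path to wherever you saved it
    pip install C:\Downloads\Scrapy-1.2.1-py2.py3-none-any.whl

    rem If pip still tries to compile lxml from source, download the lxml wheel from
    rem the same site and install it first (the file name varies by version and Python build)
    pip install C:\Downloads\lxml-<version>-cp34-cp34m-win32.whl

Once lxml comes from a wheel, pip no longer needs to compile the lxml.etree extension, which is the step producing your vcvarsall.bat error.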

See this link:

However, if you are targeting Python 3.3 and 3.4 (and do not have access to Visual Studio 2010), building is slightly more complicated. You will need to open a Visual Studio Command Prompt (selecting the x64 version if using 64-bit Python) and run set DISTUTILS_USE_SDK=1 before calling pip install.

It looks like you need to open a Visual Studio command prompt and run the pip install from inside it.
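A minimal sketch of that sequence, assuming Scrapy itself is the package you are installing (swap in whatever package is failing to build):

    rem From a Visual Studio Command Prompt (use the x64 flavour for 64-bit Python)
    set DISTUTILS_USE_SDK=1

    rem With the compiler environment set, pip can build the C extensions
    pip install scrapy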
