One would think that after so many years, things would simply work on Linux, especially on mainstream platforms. Mint 17 is based on the latest Ubuntu LTS release (Trusty Tahr).
For the uninitiated, Linux can seem difficult with all the terminal commands. If anything, install guides make things look a lot easier than they really are. Case in point: the Scrapy site simply lists the command:
pip install scrapy
Very easy, right? Well, assuming you already managed to install pip in the first place.
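On Mint 17 (which draws from the Ubuntu 14.04 repositories) the simplest route is probably the distribution packages. The package names below are my assumption for that release; python-dev and build-essential are thrown in because the C extensions later in this post need them anyway:
sudo apt-get install python-pip python-dev build-essential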
In any case, finding the exact error is not easy. One needs some intuition about where to look for it in the huge log output, and then a Google search for the right error message to find the relevant discussion in the hive mind (typically Stack Exchange).
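One trick that helps, sketched here with a log file name of my own choosing: capture the whole output to a file and grep it for error lines instead of scrolling.
pip install scrapy 2>&1 | tee scrapy-install.log
grep -in "error" scrapy-install.log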
libxml2 and libxslt
Buried in the 1000-line log file was an error concerning these:
ERROR: /bin/sh: 1: xslt-config: not found
** make sure the development packages of libxml2 and libxslt are installed **
This seems to be solvable by installing their development packages:
sudo apt-get install libxml2-dev
sudo apt-get install libxslt1-dev
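To check that the second package really provides the tool the build was complaining about, ask it for its version; getting a version number back instead of "command not found" means the error above should be gone:
xslt-config --version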
libffi
Package libffi was not found in the pkg-config search path.
Perhaps you should add the directory containing `libffi.pc'
to the PKG_CONFIG_PATH environment variable
No package 'libffi' found
This was repeated many times. The command to fix it:
sudo apt-get install libffi-dev
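Since the complaint comes from pkg-config itself, the quickest sanity check is to ask pkg-config directly; after installing the dev package this should print a version rather than the error above:
pkg-config --modversion libffi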
/usr/bin/ld: cannot find -lz
This error happened when trying to install lxml with pip. The fix:
sudo apt-get install -y zlib1g-dev
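For context, -lz is the linker's shorthand for libz (zlib), and the libz.so symlink it looks for is shipped by zlib1g-dev. One way to confirm the package provides it (the exact path will depend on the architecture):
dpkg -L zlib1g-dev | grep libz.so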
OpenSSL error
One error was:
cryptography/hazmat/bindings/__pycache__/_Cryptography_cffi_36a40ff0x2bad1bae.c:194:25: fatal error: openssl/aes.h: No such file or directory
This one is discussed here. Solution:
sudo apt-get install libssl-dev
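A simple check that the missing header is now in place (the path assumes the standard Debian/Ubuntu header layout):
ls /usr/include/openssl/aes.h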
And the rest
The above may not be sufficient. I ran lots of other commands which I have now forgotten. Perhaps some of them made a difference too. Perhaps not.
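For what it is worth, the dependency hunt above adds up to the single command below. It is a best-guess summary assembled from the errors in this post, not a verified-complete recipe, so expect to repeat the log-grepping exercise if something else is missing on your system:
sudo apt-get install python-pip python-dev build-essential libxml2-dev libxslt1-dev libffi-dev zlib1g-dev libssl-dev
sudo pip install scrapy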