I've recently been teaching myself the scraping framework Scrapy. After a look at the official docs I couldn't wait to install it, and the result was a mess of errors. I found an article online about it (original link: http://www.tuicool.com/articles/URNVV3E).
First, the environment:
1. CentOS 6.5 x64, installed in a VMware virtual machine; since it will need to be migrated later, I went with VMware 10.0 hardware compatibility.
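Before changing anything it can help to confirm what the stock system provides (just a sanity check; the expected Python version is my assumption for a fresh CentOS 6.5):
cat /etc/redhat-release   # confirm the CentOS release
python -V                 # stock interpreter, should report Python 2.6.6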
Update the system:
# yum -y update
2. Python 2.7.9 (the system ships with 2.6.6).
cd ~/Download
Download the source from the official site:
wget --no-check-certificate https://www.python.org/ftp/python/2.7.9/Python-2.7.9.tar.xz
tar xvf Python-2.7.9.tar.xz
cd Python-2.7.9
./configure --with-ensurepip=install # leave everything else at the defaults
make
sudo make install # use make altinstall instead if Python 2.7.9 should be installed as a secondary version rather than replace the default python
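If ./configure or make complains about missing headers, or the finished build ends up without SSL or pip support, the usual CentOS 6 development packages are the likely fix; the list below is a suggestion, adjust it to whatever your build actually complains about:
sudo yum install -y gcc make openssl-devel zlib-devel bzip2-devel sqlite-devel readline-devel libffi-devel  # build prerequisites for compiling Python with pip and SSL support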
[Mikky@localhost Python-2.7.9]$ python
Python 2.7.9 (default, Feb 1 2016, 21:30:54)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-16)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>>
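Since the build was configured with --with-ensurepip=install, pip should have landed next to the new interpreter under /usr/local/bin. A quick check (paths are what I expect from a default prefix; verify on your own machine):
/usr/local/bin/pip --version   # should report a pip built against python 2.7
which python                   # confirm which interpreter the shell now picks up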
I also updated libffi-devel, since it has tripped me up too many times before; this step is optional:
sudo yum -y update libffi-devel
With that, the environment is ready; next comes the part where things go wrong.
The approach that fails:
sudo /usr/local/bin/pip install scrapy
There are plenty of errors, and the link at the top of this post did not resolve them either. I won't paste the pitfalls here; step into them at your leisure.
The main cause of all these pitfalls is the dependency libraries pip pulls in, in particular cryptography.
By default pip installs the latest version rather than the most stable one, so the cryptography it pulled in was the newest release (1.2.2; I installed in early February 2016). Therefore:
sudo /usr/local/bin/pip install cryptography==0.9
........ ignore the warnings
Successfully installed cryptography idna pyasn1 six enum34 ipaddress cffi pycparser
Cleaning up...
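To confirm the pin actually took and pip did not quietly pull in a newer release, something like this should do (the grep is just a convenient filter):
/usr/local/bin/pip freeze | grep -i cryptography   # expect cryptography==0.9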
That's most of the battle won. Now for the main event:
sudo /usr/local/bin/pip install scrapy
.........
Successfully installed scrapy pyOpenSSL queuelib service-identity lxml w3lib cssselect Twisted pyasn1-modules characteristic zope.interface
Cleaning up...
[Mikky@localhost ~]$ scrapy version
/usr/local/lib/python2.7/site-packages/cffi/model.py:526: UserWarning: 'point_conversion_form_t' has no values explicitly defined; next version will refuse to guess which integer type it is meant to be (unsigned/signed, int/long)
% self._get_c_name())
Scrapy 1.0.4
Keep ignoring the warnings; they have already driven me half mad.
$ scrapy startproject tutorial
/usr/local/lib/python2.7/site-packages/cffi/model.py:526: UserWarning: 'point_conversion_form_t' has no values explicitly defined; next version will refuse to guess which integer type it is meant to be (unsigned/signed, int/long)
% self._get_c_name())
New Scrapy project 'tutorial' created in:
/home/Mikky/scrapy/tutorial
You can start your first spider with:
cd tutorial
scrapy genspider example example.com
[Mikky@localhost scrapy]$ ll
total 4
drwxrwxr-x. 3 Mikky Mikky 4096 Feb 3 22:27 tutorial
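Following the hint Scrapy printed above, a quick smoke test of the new project could look like this (the spider name and domain are simply the ones Scrapy suggests, and the crawl needs network access):
cd tutorial
scrapy genspider example example.com
scrapy crawl example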
Installation complete. I'll keep digging into the cause of that warning.
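In the meantime, if the cffi UserWarning gets too noisy, one way to quiet it for the current shell is Python's warning filter via an environment variable; this only hides the message, it does not fix whatever cffi is complaining about:
export PYTHONWARNINGS="ignore::UserWarning"   # suppress UserWarning output for this shell session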