Database download: http://zh.wikipedia.org/wiki/Wikipedia:%E6%95%B0%E6%8D%AE%E5%BA%93%E4%B8%8B%E8%BD%BD
MediaWiki data import methods
- Use MediaWiki's special page: your-site-domain/Special:Import.
- Use MediaWiki's bundled PHP script importDump.php:
  - Log in to the server over SSH (for example with a common SSH client such as PuTTY).
  - Change into the maintenance directory.
  - Upload your XML file to the maintenance directory.
  - Run the command: php importDump.php filename.xml.
  - Run the command: php rebuildrecentchanges.php, then check the special page Special:RecentChanges to see the newly imported articles. (For more PHP maintenance scripts, see MediaWiki Maintenance.)
- Use mwdumper (a standalone Java tool, not a bundled PHP script).
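The importDump.php steps above can be sketched as a shell session. The host name, installation path, and file name below are placeholders, not values from this manual:

```shell
# 1. Log in to the server over SSH (PuTTY on Windows, or the
#    ssh command on Linux/macOS). user@example.org is a placeholder.
ssh user@example.org

# 2. On the server, change into MediaWiki's maintenance directory
#    (the path depends on your installation).
cd /var/www/wiki/maintenance

# 3. Import the XML file you uploaded to this directory.
php importDump.php filename.xml

# 4. Rebuild the recent-changes table so that Special:RecentChanges
#    lists the newly imported pages.
php rebuildrecentchanges.php
```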
Manual:Importing XML dumps
This page describes methods to import XML dumps.
The Special:Export page of any MediaWiki site, including any Wikimedia site and Wikipedia, creates an XML file (content dump). See meta:Data dumps and Manual:DumpBackup.php. XML files are explained in more detail at meta:Help:Export.
There are several methods for importing these XML dumps:
How to import?
Using Special:Import
Special:Import can be used by wiki users with import permission (by default, users in the sysop group) to import a small number of pages (about 100 should be safe). Trying to import large dumps this way may result in timeouts or connection failures. See meta:Help:Import for a detailed description.
See Manual:XML Import file manipulation in CSharp for a C# code sample that manipulates an XML import file.
Changing permissions
To allow all registered editors to import (not recommended), the lines added to LocalSettings.php would be:
- $wgGroupPermissions['user']['import'] = true;
- $wgGroupPermissions['user']['importupload'] = true;
Possible Problems
To use transwiki import, PHP's safe_mode must be off and open_basedir must be empty; otherwise the import fails.
Using importDump.php, if you have shell access
- Recommended method for general use, but slow for very big data sets. For very large amounts of data, such as a dump of a big Wikipedia, use mwdumper, and import the links tables as separate SQL dumps.
importDump.php is a command-line script located in the maintenance folder of your MediaWiki installation. If you have shell access, you can call importDump.php like this (add paths as necessary):
php importDump.php --conf LocalSettings.php dumpfile.xml.gz wikidb
or this:
php importDump.php < dumpfile.xml
where dumpfile.xml is the name of the XML dump file. If the file is compressed and has a .gz or .bz2 file extension, it is decompressed automatically.
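The automatic decompression is keyed off the file extension. The same dispatch can be approximated in the shell if you ever want to pipe a decompressed stream into importDump.php yourself; this is a sketch, and the function name is ours, not MediaWiki's:

```shell
# Pick a decompression command based on the dump's file extension,
# mirroring importDump.php's extension-based handling (sketch).
decompress_cmd() {
  case "$1" in
    *.gz)  echo "gzip -dc" ;;
    *.bz2) echo "bzip2 -dc" ;;
    *)     echo "cat" ;;      # plain XML: pass through unchanged
  esac
}

# Example usage (commented out; requires a real dump and wiki):
#   $(decompress_cmd dumpfile.xml.gz) dumpfile.xml.gz | php importDump.php
```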
Afterwards, use importImages.php to import the images:
php importImages.php ../path_to/images
Note: If you are using a WAMP installation, you may run into problems during the import because of InnoDB settings (by default this engine is disabled in my.ini); to avoid trouble, use the MyISAM engine.
Note: For MediaWiki versions older than 1.16, to run importDump.php (or any other tool from the maintenance directory), you need to set up your AdminSettings.php file.
Note: Running importDump.php can take quite a long time. For a large Wikipedia dump with millions of pages, it may take days, even on a fast server. Note also that the information in meta:Help:Import about merging histories, etc. applies here as well.
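Because an import can run for days, it is worth protecting it from a dropped SSH session. One common approach (a sketch; the dump and log file names are placeholders) is to run the script under nohup in the background and capture its output in a log file:

```shell
# Run the import in the background, immune to hangups, with all
# output captured in a log file so the SSH session can be closed.
nohup php importDump.php --conf LocalSettings.php dumpfile.xml.gz wikidb \
  > import.log 2>&1 &

# Follow the log to watch progress.
tail -f import.log
```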
After running importDump.php, you may want to run rebuildrecentchanges.php in order to update the content of your Special:RecentChanges page.