Backing up a wiki
It is important to make regular backups of the data in your wiki. This page provides an overview of the backup process for a typical MediaWiki wiki; you will probably want to devise your own backup scripts or schedule to suit the size of your wiki and your individual needs.
Overview
MediaWiki stores important data in two places:
- Database
- Pages and their contents, users and their preferences, metadata, search index, etc.
- File system
- Software configuration files, custom skins, extensions, images (including deleted images), etc.
Consider making the wiki read-only before creating the backup - see Manual:$wgReadOnly. This helps ensure that all parts of your backup are consistent (although some of your installed extensions may write data nonetheless).
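For example, read-only mode can be enabled with a single line in LocalSettings.php (a sketch; the message text is an arbitrary example):
$wgReadOnly = 'Backup in progress; the wiki is temporarily read-only.';   # shown to users who try to edit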
File transfer
You will have to choose a method for transferring files from the server:
- Non-private data you can simply [https://github.com/WikiTeam/wikiteam/wiki/Tutorial#Publishing_the_dump publish on archive.org] and/or place in a dumps/ directory of your webserver.
- SCP (or WinSCP), SFTP/FTP or any other transfer protocol you choose.
- The hosting company might provide a file manager interface via a web browser; check with your provider.
SQLite Database
Most of the critical data in the wiki is stored in the database.
If your wiki is currently offline, its database can be backed up by simply copying the database file. Otherwise, you should use a maintenance script:
php maintenance/sqlite.php --backup-to <backup file name>
This will make sure that the operation is atomic and that there are no inconsistencies. If your database is not really huge and the server is not under heavy load, users editing the wiki will notice nothing but a short lag. Users who are just reading will not notice anything at all.
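For example (a sketch; the backup directory and dated file name are assumptions), run from the wiki's root directory:
php maintenance/sqlite.php --backup-to /backups/wiki-$(date +%F).sqlite   # date +%F yields YYYY-MM-DD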
sqlite.php
sqlite.php is a maintenance script for tasks specific to the SQLite backend.
Currently, these options are supported:
- --vacuum
- Executes the VACUUM command, which compacts the database and improves its performance.
Example:
$ php sqlite.php --vacuum
VACUUM: Database size was 46995456 bytes, now 37796864 (19.6% reduction).
- --integrity
- Performs an integrity check of the database. If no errors are detected, a single "ok" will be displayed; otherwise the script will show up to 100 errors.
Example:
$ php sqlite.php --integrity
Performing database integrity checks: ok
- --backup-to <file name>
- Backs up the database to the given file.
- --check-syntax <one or more file names>
- Checks SQL files for compatibility with SQLite syntax. This option is intended for developer use.
All these options can be used at the same time.
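For example, to compact the database and then verify its integrity in a single run (output will vary):
$ php sqlite.php --vacuum --integrity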
File system
MediaWiki stores other components of the wiki in the file system where this is more appropriate than insertion into the database, for example, site configuration files (LocalSettings.php, and AdminSettings.php, finally removed in 1.23), image files (including deleted images, thumbnails and rendered math and SVG images, if applicable), skin customisations, extension files, etc.
The best method to back these up is to place them into an archive file, such as a .tar file, which can then be compressed if desired. On Windows, applications such as WinZip or 7-Zip can be used if preferred.
For Linux variants, assuming the wiki is stored in /srv/www/htdocs/wiki:
tar zcvhf wikidata.tgz /srv/www/htdocs/wiki
It should be possible to back up the entire "wiki" folder in "htdocs" if using XAMPP.
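To restore from such an archive, extract it and copy the files back into place; a sketch, where the staging directory is an arbitrary example:
mkdir -p /tmp/wiki-restore          # staging area for inspection before copying back
tar zxvf wikidata.tgz -C /tmp/wiki-restore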
Backup the content of the wiki (XML dump)
It is also a good idea to create an XML dump in addition to the database dump. XML dumps contain the content of the wiki (wiki pages with all their revisions), without the site-related data (they do not contain user accounts, image metadata, logs, etc).[1]
XML dumps are less likely to cause problems with character encoding, provide a quick way to transfer large amounts of content, and can easily be used by third-party tools, which makes them a good fallback should your main database dump become unusable.
To create an XML dump, use the command-line tool dumpBackup.php, located in the maintenance directory of your MediaWiki installation. See Manual:dumpBackup.php for more details.
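For example (a sketch, run from the wiki's root directory; the output file names are arbitrary):
php maintenance/dumpBackup.php --full > dump.xml                # all pages with full revision history
php maintenance/dumpBackup.php --current > dump-current.xml     # current revisions only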
You can also create an XML dump for a specific set of pages online, using Special:Export, although attempting to dump large quantities of pages through this interface will usually time out.
To import an XML dump into a wiki, use the command-line tool importDump.php. For a small set of pages, you can also use the Special:Import page via your browser (by default, this is restricted to the sysop group).
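For example (a sketch; the dump file name is an assumption):
php maintenance/importDump.php dump.xml
php maintenance/rebuildrecentchanges.php    # refresh the recent changes table after the import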
As an alternative to dumpBackup.php and importDump.php, you can use MWDumper, which is faster, but requires a Java runtime environment.
See Manual:Importing XML dumps for more information.
Without shell access to the server
If you have no shell access, then use the WikiTeam Python script dumpgenerator.py from a DOS, Unix or Linux command line. Requires Python v2 (v3 doesn't yet work).
To get an XML dump with edit histories, plus a dump of all images and their descriptions (this does not include extensions or the LocalSettings.php configuration):
python dumpgenerator.py --api=http://www.sdiy.info/w/api.php --xml --images
Full instructions are at the WikiTeam [https://github.com/WikiTeam/wikiteam/wiki/Tutorial#I_have_no_shell_access_to_server tutorial].
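If a long download is interrupted, it can usually be resumed with the script's --resume option; a sketch, where the --path value (the dump directory created by the first run) is an example name:
python dumpgenerator.py --api=http://www.sdiy.info/w/api.php --xml --images --resume --path=wikidump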
See also Meta:Data dumps.
Scripts
- Unofficial backup script by User:Duesentrieb.
- Unofficial backup script by Flominator; creates a backup of all files and the database, with optional backup rotation.
- User:Darizotas/MediaWiki Backup Script for Windows - a script for backing up a Windows MediaWiki install. Note: has no restore feature.
- Unofficial web-based backup script, mw_tools, by Wanglong (allwiki.com); you can use it to back up your database, or use the backup files to recover the database; the operation is very easy.
- [https://github.com/WikiTeam/wikiteam WikiTeam tools] - if you do not have server access (e.g. your wiki is on a free wiki farm), you can generate an XML dump and an image dump using WikiTeam tools (see [https://github.com/WikiTeam/wikiteam/wiki/Available-Backups some saved wikis]).
- Another [https://github.com/samwilson/MediaWiki_Backup backup script] that dumps the database, files (just pictures by default, with an option to include all files in the installation) and XML; puts the site into read-only mode; timestamps backups; and reads the charset from LocalSettings.php. The script does not need to be modified for each site to be backed up. It does not (yet) rotate old backups. A companion script restores a backup.
- Another unofficial MediaWiki backup script for Windows by Lanthanis that: exports the pages of specified namespaces as an XML file; dumps specified database tables; and adds further specified folders and files to a ZIP backup file. Can be used with the Windows task scheduler.
- [https://github.com/nischayn22/mw_backup mw_backup] - a script to make periodic backups; it will make daily, weekly and monthly backups of your database and images directory when run as a daily cron job (see the crontab sketch below).
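Whichever script you use, it can be scheduled with cron on Linux; a sketch of a crontab entry, assuming a hypothetical wrapper script at /usr/local/bin/wiki-backup.sh:
# run the (hypothetical) backup wrapper daily at 03:00, logging output
0 3 * * * /usr/local/bin/wiki-backup.sh >> /var/log/wiki-backup.log 2>&1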
See also
- Help:Export is a quick and easy way to save all pages on your wiki.
References
- ↑ XML dumps are independent of the database structure, and can be imported into future (and even past) versions of MediaWiki.