mediawiki is a Python wrapper and parser for the MediaWiki API. The goal is to allow users to quickly and efficiently pull data from the MediaWiki site of their choice without having to deal with the API directly. It is not tied to any particular MediaWiki site: it defaults to Wikipedia, but any other MediaWiki site can be used.
MediaWiki wraps the MediaWiki API so you can focus on leveraging your favorite MediaWiki site's data, not on retrieving it. Please check out the code on GitHub!
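For example, to point the wrapper at a different MediaWiki installation, the API endpoint can be supplied when constructing the object. A minimal sketch, assuming the constructor accepts a url keyword argument and using an illustrative endpoint:
>>> from mediawiki import MediaWiki
>>> asoiaf = MediaWiki(url='https://awoiaf.westeros.org/api.php')
>>> asoiaf.search('Stark')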
Note: this library was designed for ease of use and simplicity. If you plan on doing serious scraping, automated requests, or editing, please look into Pywikipediabot, which has a larger API, advanced rate limiting, and other features that help keep the load on the MediaWiki infrastructure reasonable.
Pip Installation:
$ pip install pymediawiki
To install from source, clone the repository on GitHub, then run the following from the project folder:
$ python setup.py install
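A full install from source might look like the following; the repository URL is assumed here, so substitute the location of your clone or fork:
$ git clone https://github.com/barrust/mediawiki.git
$ cd mediawiki
$ python setup.py install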
mediawiki supports Python 2.7 and Python 3.4 - 3.7.
Documentation of the latest release is hosted on readthedocs.io
To build the documentation yourself, run:
$ pip install sphinx
$ cd docs/
$ make html
To run the automated tests, run the following command from the project folder:
$ python setup.py test
Import mediawiki and run a standard search against Wikipedia:
>>> from mediawiki import MediaWiki
>>> wikipedia = MediaWiki()
>>> wikipedia.search('washington')
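search() also takes optional parameters to tune the output; the keyword names below (results and suggestion) are assumptions to verify against the documentation:
>>> wikipedia.search('washington', results=5)         # limit the number of results
>>> wikipedia.search('washingtin', suggestion=True)   # also return a suggested query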
Run more advanced searches:
>>> wikipedia.opensearch('washington')
>>> wikipedia.allpages('a')
>>> wikipedia.geosearch(title='washington, d.c.')
>>> wikipedia.geosearch(latitude='0.0', longitude='0.0')
>>> wikipedia.prefixsearch('arm')
>>> wikipedia.random(pages=10)
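Results can also be pulled from a different language edition; a minimal sketch, assuming the constructor accepts a lang keyword as in similar MediaWiki wrappers:
>>> fr_wikipedia = MediaWiki(lang='fr')
>>> fr_wikipedia.search('paris')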
Pull a MediaWiki page and some of the page properties:
>>> p = wikipedia.page('Chess')
>>> p.title
>>> p.summary
>>> p.categories
>>> p.images
>>> p.links
>>> p.langlinks
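Pages expose additional properties as well; the names below (content, references, sections, and section()) are assumptions drawn from the typical page interface and are worth checking in the documentation:
>>> p.content            # full plain-text content of the page
>>> p.references         # external links referenced by the page
>>> p.sections           # list of section titles
>>> p.section('Rules')   # text of a single named section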
See the Documentation for more examples!
Please see the changelog for a list of all changes.
MIT licensed. See the LICENSE file for full details.