WIKI data [CNRS, U. Surrey]


Dump of daily snapshots of the Special:Statistics page for more than 11,500 of the largest known MediaWiki-based wikis, collected between August 2007 and April 2008 by polling a publicly available Web service. Each entry in the dump contains: unique ID, URL, title, number of content pages, total number of pages, number of edits, number of admins, number of users, number of images, stub ratio, and timestamp. A separate table (editable) records, for each entry identified by its ID in the main database, the outcome of an anonymous edit trial (1: allowed; 0: forbidden).
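The two-table layout described above can be sketched as follows. This is a hypothetical example, not the dump's actual file format: the field order follows the description, but the file names, delimiter, and sample values are assumptions for illustration only.

```python
import csv
from io import StringIO

# Field order paraphrased from the dataset description; the actual
# dump layout may differ.
MAIN_FIELDS = ["id", "url", "title", "content_pages", "total_pages",
               "edits", "admins", "users", "images", "stub_ratio",
               "timestamp"]

# Hypothetical sample rows standing in for the real dump files.
main_csv = StringIO(
    "1,http://example.org/wiki,Example Wiki,120,340,5600,4,230,87,0.12,2007-08-15\n"
)
# Separate "editable" table: ID plus anonymous-edit trial outcome
# (1: allowed; 0: forbidden).
editable_csv = StringIO("1,1\n")

main = {row[0]: dict(zip(MAIN_FIELDS, row)) for row in csv.reader(main_csv)}
editable = {row[0]: row[1] == "1" for row in csv.reader(editable_csv)}

# Join the edit-trial outcome onto each snapshot entry by its ID.
for wiki_id, entry in main.items():
    entry["anonymous_edit_allowed"] = editable.get(wiki_id)

print(main["1"]["title"], main["1"]["anonymous_edit_allowed"])
# → Example Wiki True
```

The ID is the join key between the main statistics table and the edit-trial table, so a simple dictionary lookup suffices to combine them.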

