From Archiveteam
Latest revision as of 16:19, 16 January 2017

EditThis

EditThis logo
A screen shot of the EditThis.info home page taken on 27 May 2012

URL: http://editthis.info
Status: Online!
Archiving status: Partially saved in 2014
Archiving type: Unknown
IRC channel: #wikiteam (on hackint)

EditThis is a wikifarm. According to our estimates, there are over 1,300 wikis. There is a 2014 backup for most of them.

This farm is quite hard to archive, because of:

  • old software (MediaWiki 1.15) with several quirks at both the application and webserver level (directory structure, URL rewrites, l10n in the MediaWiki namespace);
  • slow servers (even after they fixed their robots.txt);
  • very strict captcha and throttling (with unhelpful status codes);
  • a number of wikis taken over by spam since 2012 or earlier.

The owner clearly has not had time to manage it properly for several years now.

The best results when completing a download with launcher.py and dumpgenerator.py have been achieved with the following settings:

  • add --exnamespaces=8,9 (skip the MediaWiki and MediaWiki talk namespaces);
  • use the API with --delay=60, and wait 120 s between each wiki.
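The settings above can be sketched as a small wrapper around dumpgenerator.py. This is a sketch, not WikiTeam's launcher.py itself: the helper names and the exact flag spellings are assumptions based on common dumpgenerator.py usage.

```python
import subprocess
import time

def dump_command(api_url, delay=60, exnamespaces="8,9"):
    # Build a hypothetical dumpgenerator.py call with the recommended
    # settings: 60 s between requests, skipping namespaces 8 and 9.
    return [
        "python", "dumpgenerator.py",
        "--api=" + api_url,
        "--xml",
        "--delay=%d" % delay,
        "--exnamespaces=" + exnamespaces,
    ]

def dump_all(wikis):
    # Download each wiki in turn, then wait 120 s before starting the next,
    # to stay under the farm-wide throttle.
    for wiki in wikis:
        subprocess.call(dump_command(wiki.rstrip("/") + "/api.php"))
        time.sleep(120)
```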

If you lower the delay too much, or forget to sleep between certain kinds of requests, you can easily enter an endless loop of 503 errors and never get out of it (every request, failed or not, counts toward the throttle).
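Because failed requests also count toward the throttle, any retry logic has to back off aggressively rather than retry quickly. A minimal sketch, where the `fetch` callable is a placeholder for whatever HTTP layer you actually use:

```python
import time

def fetch_with_backoff(fetch, url, base_delay=60, max_tries=5):
    # Retry after 503s with exponentially growing sleeps; retrying
    # quickly only digs the hole deeper, since every request counts
    # toward the throttle.
    for attempt in range(max_tries):
        status, body = fetch(url)
        if status != 503:
            return body
        time.sleep(base_delay * 2 ** attempt)  # 60, 120, 240, ... seconds
    raise RuntimeError("still throttled after %d attempts" % max_tries)
```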

See also

External links
