User:Emijrp
<center><div style="background-color: lightblue;width: 85%;">Around the year 2150, '''petabytes''' of currently generated content, if preserved, will enter the public domain<br/>and will be freely used in projects like Wikipedia.</div></center>
[[File:Archive-all-the-things-thumb.jpg|right]]
{{TOCright}}
I am the same guy as on [[Wikipedia]] ([https://en.wikipedia.org/wiki/User:Emijrp User:Emijrp]). You can contact me (emijrp{{@}}gmail.com) if you have questions related to [[wikis]] or libre content.
An English-centric archival effort is a biased one. Surf this: http://www.dmoz.org/World/
 
:'''''[https://archive.org/details/wiki-archiveteamorg Download a backup of Archive Team Wiki from here].'''''
 
== Projects ==
<!-- [[File:Oh shit archive team is here.png|right|]] -->
[[File:Fukkensaved.jpg|right|400px]]
* [[Jamendo]] holds about 2 TB of music. I downloaded the albums in OGG (a free format, and better than MP3) from ID 0 to 49306, and Jason Scott added [https://archive.org/details/jamendo-albums 59,000 albums] to Internet Archive using [https://github.com/emijrp/jamendo-downloader my download script]. That was in 2012; a new scan would be nice.
* [[LibreTeam]]
* [[ProHosting]]
* [[WikiTeam]], the Archive Team subcommittee for wikis
** [[Nupedia]] (save the ~20 published articles here)
** [[GNUPedia]] (Only 3 articles were sent to the mailing list. Archived links in the article.)
** [[Wikipedia]] (see [https://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive User:Emijrp/Wikipedia Archive]):
** [[Wikia]] wikis: http://wiki-stats.wikia.com/
** [[Enciclopedia Libre Universal en Español]] [http://enciclopedia.us.es]. Saved all images (2GB) 2010-08-14
** [[Wikanda]]: http://www.wikanda.es Done!
** [[Citizendium]] [http://en.citizendium.org/wiki/CZ:Downloads]
* [[OpenStreetMap]] [http://wiki.openstreetmap.org/wiki/Database_dump]
* [[OmegaWiki]] [http://www.omegawiki.org/Development]. Saved! 2010-10-31
* [[GeoNames]] [http://download.geonames.org/export/]
 
== How to upload stuff to Internet Archive ==
 
To upload stuff to [[Internet Archive]] you can use the [https://archive.org/create web interface], but in this tutorial we will learn to use the GNU/Linux console and the [https://pypi.python.org/pypi/internetarchive internetarchive] Python module. You need a Linux distro, Python, pip and an Internet Archive account.
 
Configuration:
 
* sudo apt-get install python
* sudo easy_install pip
* sudo pip install internetarchive
* ia configure
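
A quick way to check that the module is installed and working is to fetch the metadata of an existing item from Python. A minimal sketch; the item identifier is just an example:

<pre>
# Sanity check for the internetarchive module: fetch a public item's metadata.
# "jamendo-albums" is only an example identifier; any existing item works.
from internetarchive import get_item

item = get_item('jamendo-albums')
print(item.exists)                     # True if the item exists on archive.org
print(item.metadata.get('title'))      # item metadata is exposed as a plain dict
</pre>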
 
Funny stuff. In this example we will back up the [[GeoNames]] project:
 
* wget -r -np -l 1 -A zip,txt http://download.geonames.org/export/dump/
* ia upload GeoNames-20151022 download.geonames.org/export/dump/*.{zip,txt}
 
Result:
 
* uploading AD.zip: [################################] 1/1 - 00:00:00
* uploading AE.zip: [################################] 1/1 - 00:00:00
* uploading AF.zip: [################################] 5/5 - 00:00:00
* etc...
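
The same upload can be scripted with the internetarchive Python module instead of the ia command. A minimal sketch, assuming the files were already fetched with the wget command above; the identifier and metadata values are only examples:

<pre>
# Upload the downloaded GeoNames dumps to an Internet Archive item,
# setting metadata at upload time (same idea as --metadata="title:foo" on the CLI).
import glob
from internetarchive import upload

files = sorted(glob.glob('download.geonames.org/export/dump/*.zip') +
               glob.glob('download.geonames.org/export/dump/*.txt'))

metadata = {
    'title': 'GeoNames dump (2015-10-22)',   # example values only
    'mediatype': 'data',
}

# upload() returns one requests.Response per file; status 200 means the file was accepted
responses = upload('GeoNames-20151022', files=files, metadata=metadata)
print(all(r.status_code == 200 for r in responses))
</pre>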


See [https://archive.org/details/GeoNames-20151022 GeoNames-20151022]. When the upload finishes, you can improve the metadata using the web interface, or add the metadata during the upload process with --metadata="title:foo" --metadata="blah:arg".

== [[Wikis]] (see ''[[WikiTeam]]'') ==
I like free knowledge. I'm going to get a copy.
* [[Wikipedia]] predecessors
** [[Nupedia]]
*** Tried to save the ~20 published articles at en.wikisource.org. They said the Nupedia license is not compatible with CC-BY-SA
*** Need to find a new home for them
** [[GNUPedia]]
*** Only 3 articles were sent to the mailing list
* Wiki[mp]edia (see http://en.wikipedia.org/wiki/User:Emijrp/Wikipedia_Archive):
** Only the pages-meta-history.xml.7z files (the most important ones; they contain the text and metadata for every revision): http://download.wikimedia.org/backup-index.html
*** Done! 2010-08 (the most recent English Wikipedia dump is from 2010-01-30)
** <s>Static HTML: http://static.wikipedia.org/</s> very out of date, and all the relevant info is in the 7z files above
* [[Wikia]] wikis: http://wiki-stats.wikia.com/
** Downloading...
* More wikis:
** [[Enciclopedia Libre Universal en Español]]: http://enciclopedia.us.es
*** Saved all images (2GB) 2010-08-14
** [[Wikanda]]: http://www.wikanda.es
*** Huelvapedia images, done!
*** Cádizpedia images, done!
** Wikiextremadura: http://www.wikiextremadura.org offline :(
** LeonWiki: offline
* [[Citizendium]]: http://en.citizendium.org/wiki/CZ:Downloads
** Saved! It was easy, only 100 MB (they do not publish the entire history, only the last revision)
** Downloaded images as of 2010-11-12
* [[OpenStreetMap]]: http://wiki.openstreetmap.org/wiki/Database_dump
* [[OmegaWiki]]: http://www.omegawiki.org/Development
** Saved! 2010-10-31
* [[Archive Team]] wiki (LOL): paste the content from [[/Archiveteam]] into [[Special:Export]]. It generates an XML file (a scripted version is sketched after this list).
* [[GeoNames]] http://download.geonames.org/export/
** Done 2010-11-16
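
The Special:Export step can also be scripted. A rough sketch using the requests library; the wiki URL, page titles and parameter names below are assumptions based on the stock MediaWiki export form:

<pre>
# Fetch an XML export of a few pages from a MediaWiki wiki via Special:Export.
# The URL, page titles and parameters are assumptions; adjust them for the wiki you are exporting.
import requests

pages = ['Main Page', 'Deathwatch']       # example page titles
response = requests.get(
    'http://archiveteam.org/index.php',
    params={
        'title': 'Special:Export',
        'pages': '\n'.join(pages),        # newline-separated list of titles
        'curonly': '1',                   # latest revision only
    },
    timeout=60,
)
with open('archiveteam-export.xml', 'wb') as f:
    f.write(response.content)
</pre>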


=== Domas visits logs ===
By Lars:
* From December 2007 to September 2009: http://www.archive.org/details/wikipedia_visitor_stats_200712
* October 2009: http://www.archive.org/details/wikipedia_visitor_stats_200910
Uploading:
* November 2009

== Maintenance ==


Dealing with spam in this wiki:
* [[MediaWiki:Spam-blacklist]] and [[MediaWiki:Spam-whitelist]]
* [[MediaWiki:Titleblacklist]] and [[MediaWiki:Titlewhitelist]] (examples: http://en.wikipedia.org/wiki/MediaWiki:Titleblacklist)


Other tasks:
* [[ProHosting]]
* [[Template:Infobox project]]
* [[Template:Navigation box]]
* [[Special:Newpages]]
* [[Special:Random/User]]


== Tip ==
[[File:Backupyourdata.gif|center]]
== External links ==
* http://wikiindex.org, a site with a ton of wikis
