Ispygames

From Archiveteam
Revision as of 22:50, 23 March 2013

The News

IGN hit with layoffs, 1UP, UGO and GameSpy shutting down
1UP, UGO and GameSpy to be shut down

The Problems

  • Once you start digging into these sites, you find a mess of inconsistent URL schemes, with content scattered everywhere.
  • Some files are hosted on MediaFire.
  • Testing shows that the larger and older a site is, the more a wget crawl misses because of its URL scheme.
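To see why a plain single-host crawl loses content, here is a small sketch (the page content and filename are made up for illustration): pages embed media from separate gamespy hosts, and those absolute links are skipped unless wget is told to span hosts.

```shell
# Made-up example page: media lives on a different host than the site itself.
cat > sample-page.html <<'EOF'
<img src="http://pcmedia.gamespy.com/images/logo.jpg">
<a href="/news/2013/article.html">local link</a>
EOF

# List absolute links pointing off the main host -- these are what
# --span-hosts/--domains must cover, or the crawl silently drops them.
grep -o 'http://[a-z]*media\.gamespy\.com[^"]*' sample-page.html
```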

What we know

  • We already have a list of almost all the domains involved.
  • A cleaned list, with duplicates and bad domains removed, is being processed and will be posted here when complete.
  • Most of the sites are not that big, but a few are huge.
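Cleaning the domain list mostly amounts to dropping duplicates and non-gamespy hosts. A minimal sketch (the file names and sample entries are assumptions, not the real list):

```shell
# Hypothetical raw list: one domain per line, with duplicates and junk.
printf '%s\n' \
  'planetquake.gamespy.com' \
  'planetdoom.gamespy.com' \
  'planetquake.gamespy.com' \
  'example.badhost.com' > domains-raw.txt

# Drop exact duplicates, then keep only gamespy.com hosts.
sort -u domains-raw.txt | grep '\.gamespy\.com$' > domains-clean.txt
```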

The plan

  • Save the sites and related content.
  • Back up the Twitter feeds for any associated accounts. All My Tweets just takes a username and returns the maximum number of tweets possible.


wget test command

USER_AGENT="Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)"
SAVE_HOST="planetdoom.gamespy.com"

wget -e robots=off --mirror --page-requisites --span-hosts \
--domains=$SAVE_HOST,pcmedia.gamespy.com,pnmedia.gamespy.com,pspmedia.gamespy.com,oystatic.ignimgs.com \
--wait 2 --waitretry 5 --timeout 60 --tries 10 -U "$USER_AGENT" \
--warc-file="$SAVE_HOST" --warc-cdx --warc-header "operator: Archive Team" \
"$SAVE_HOST"
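Once a clean domain list exists, the per-site command above can be driven in a loop. This is a dry-run sketch, and the list file name is an assumption: it only prints the command for each host rather than fetching anything.

```shell
# "sites.txt" is a hypothetical list file, one hostname per line.
printf '%s\n' 'planetdoom.gamespy.com' 'planetquake.gamespy.com' > sites.txt

USER_AGENT="Mozilla/5.0 (Windows; U; MSIE 9.0; Windows NT 9.0; en-US)"

# Dry run: print one wget invocation per host instead of executing it.
while read -r SAVE_HOST; do
  echo wget -e robots=off --mirror --page-requisites --span-hosts \
    --domains="$SAVE_HOST",pcmedia.gamespy.com,pnmedia.gamespy.com,pspmedia.gamespy.com,oystatic.ignimgs.com \
    --warc-file="$SAVE_HOST" -U "$USER_AGENT" "$SAVE_HOST"
done < sites.txt > commands.txt

# Replace "echo wget" with "wget" to actually run the grabs.
```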

IGN domains


Gamespy Domains