Revision as of 16:26, 17 January 2017
Photobucket
[Image: The Photobucket home page as seen on 2011-01-12]
URL: http://photobucket.com
Status: Online!
Archiving status: Not saved yet
Archiving type: Unknown
IRC channel: #archiveteam-bs (on hackint)
Photobucket is an image hosting service. They also host Twitter images (http://www.washingtonpost.com/business/technology/twitter-partners-with-photobucket-on-photos-and-firefox-on-search/2011/06/01/AGVDSVGH_story.html).
Vital signs
Seems stable.
Archiving Photobucket
PB_Shovel can be used to download all images from an album and all subalbums (recursively).
This version of PB_Shovel has also been modified with a `--links-only` option that crawls a list of every URL in the album and its subalbums instead of downloading them. This makes it possible to grab all the files yourself with wget, or to archive them as WARC with grab-site.
Usage: obtain all the URLs of a Photobucket album, including subalbums (via the -r parameter), and write them to links-<datetime>.txt. This URL file can then be passed to wget or grab-site for downloading.
python pb_shovel.py 'http://s160.photobucket.com/user/Spinningfox/library/Internet Fads' -r --links-only
wget -i links-2016-03-20_02-03-01.txt
grab-site -i links-2016-03-20_02-03-01.txt
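As a rough sketch of the link-file handoff, the snippet below reproduces the `links-<datetime>.txt` output that `--links-only` hands to wget or grab-site. The filename pattern is inferred from the usage example above, and the sample URLs are purely hypothetical; this is not PB_Shovel's actual implementation.

```python
from datetime import datetime

def write_links_file(urls, now=None):
    """Write crawled image URLs to a links-<datetime>.txt file,
    matching the naming pattern seen in the usage example above."""
    now = now or datetime.now()
    filename = now.strftime("links-%Y-%m-%d_%H-%M-%S.txt")
    with open(filename, "w") as f:
        for url in urls:
            f.write(url + "\n")
    return filename

# Hypothetical album image URLs, for illustration only.
urls = [
    "http://i160.photobucket.com/albums/example/photo1.jpg",
    "http://i160.photobucket.com/albums/example/photo2.gif",
]
print(write_links_file(urls))
```

The resulting file is one URL per line, which is exactly the format `wget -i` expects.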