Revision as of 02:20, 3 May 2022
Photobucket
The Photobucket home page as seen on 2011-01-12
URL: http://photobucket.com
Status: Online!
Archiving status: Not saved yet
Archiving type: Unknown
IRC channel: #archiveteam-bs (on hackint)
Photobucket is an image hosting service. It also hosts Twitter images.
Vital signs
Photobucket had a major outage around the turn of the year from 2019 to 2020: https://thedeadpixelssociety.com/power-outage-impacts-photobucket-over-key-holiday-period/
Archiving Photobucket
PB_Shovel can be used to download all images from an album and all subalbums (recursively).
This version of PB_Shovel has also been modified with a `--links-only` option that crawls a list of every URL in the album and its subalbums instead of downloading them. This makes it possible to grab all the files yourself with wget, or to archive them to WARC with grab-site.
Usage: obtain all the URLs of a Photobucket album, including subalbums (using the -r parameter), and write them to links-<datetime>.txt. This URL file can then be given to wget or grab-site to download.
python pb_shovel.py 'http://s160.photobucket.com/user/Spinningfox/library/Internet Fads' -r --links-only
wget -i links-2016-03-20_02-03-01.txt
grab-site -i links-2016-03-20_02-03-01.txt
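The `--links-only` workflow above can be sketched in a few lines of Python. This is a minimal illustration of the idea (extract direct image URLs from album HTML, then write them one per line to a `links-<datetime>.txt` file that wget or grab-site can consume), not pb_shovel's actual implementation: the HTML structure assumed here is hypothetical, and the real tool handles pagination, subalbums (`-r`), and Photobucket's actual page layout.

```python
from datetime import datetime
from html.parser import HTMLParser


class ImageLinkParser(HTMLParser):
    """Collect the src attribute of every <img> tag in a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.links.append(src)


def extract_links(album_html: str) -> list:
    """Return all image URLs found in the given album HTML."""
    parser = ImageLinkParser()
    parser.feed(album_html)
    return parser.links


def write_links_file(links: list) -> str:
    """Write one URL per line, named like pb_shovel's links-<datetime>.txt."""
    name = "links-" + datetime.now().strftime("%Y-%m-%d_%H-%M-%S") + ".txt"
    with open(name, "w") as f:
        f.write("\n".join(links) + "\n")
    return name


if __name__ == "__main__":
    # Hypothetical album markup for illustration only.
    sample = '<div><img src="http://i160.photobucket.com/albums/t1/a.jpg"></div>'
    print(extract_links(sample))
```

The resulting file is plain text with one URL per line, which is exactly the input format `wget -i` and `grab-site -i` expect.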
External links
- http://photobucket.com
- https://www.thephotoforum.com/threads/how-to-get-around-photobuckets-watermarks-on-your-images.438126/ - how to modify URLs in order to archive images without watermarks.