Flogão

Flogão
[[File:Flogao Screenshot 10-06-2019|280px|Your photolog, your way, is coming to an end! After 15 years online, the time has come to say goodbye... it has been a long journey. Flogão will officially go offline ...]]
URL https://www.flogao.com.br/
Status Endangered
Archiving status In progress... (Discovery)
Archiving type Unknown
IRC channel #archiveteam-bs (on hackint)

Flogão is an outdated Fotolog-esque social media website, broadly similar to Instagram. It was created in 2004, has been largely inactive since at least 2013, and will be shut down on June 24th, 2019. The overwhelming majority of activity found there so far dates from 2005 to 2008. The nostalgic memories of thousands of users, along with historical material and other ephemera, will be permanently lost on June 24th if nothing is done about it.

Archival

A recursive crawl is currently underway. Users appear in the format https://www.flogao.com.br/<username>/. Under each user there are several elements (see the sketch below):

favorites - other users' photos that this user has marked as a "favorite".
profile - the user's profile, typically with some kind of outlink.
blogs - blog pages; blog entries take the format <username>/blog/[0-9]+.
photos - photos uploaded to the site; these have a unique ID and appear in the format https://www.flogao.com.br/<username>/[0-9]+.
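
As a minimal sketch of this layout, the following Python builds the candidate page set for one user. The endpoint names come from the list above; the exact /favorites and /profile paths, the function name, and the example username are illustrative assumptions, not confirmed site details.

 BASE = "https://www.flogao.com.br"

 def user_urls(username):
     # Known top-level pages for a Flogao user, per the layout above.
     # The /favorites and /profile paths are assumed from the description.
     return [
         BASE + "/" + username + "/",           # the user's main page
         BASE + "/" + username + "/favorites",  # photos marked "favorite"
         BASE + "/" + username + "/profile",    # profile, often with an outlink
         BASE + "/" + username + "/blog",       # blog index; entries are blog/[0-9]+
     ]

 for url in user_urls("exampleuser"):  # "exampleuser" is a placeholder
     print(url)

Individual photo pages (the [0-9]+ IDs) would then be discovered by crawling these pages rather than enumerated up front.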

So far, a scrape that has been running for approximately 6 hours has found around 25,000 accounts, with over 150,000 URIs still queued in the attempt to find more. It is not currently known how many accounts exist on the site, but the total is estimated at over 1,000,000.

Once the scrape has been completed, individual user accounts can be added as warrior items to be crawled recursively with wget. This should allow a complete capture.
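
As a rough sketch of what such a per-user grab could look like: the following Python shells out to wget with WARC output. The flags shown are common Archive Team-style choices, not the actual warrior item's configuration, which is not specified on this page.

 import subprocess

 def grab_user(username):
     # Recursively fetch one user's pages into a WARC. These flags are
     # assumed; the real warrior item may differ.
     url = "https://www.flogao.com.br/" + username + "/"
     subprocess.run(
         [
             "wget",
             "--recursive",              # follow links under the user's pages
             "--no-parent",              # do not climb above the user's directory
             "--page-requisites",        # also fetch images, CSS, etc.
             "--warc-file=" + username,  # write output to <username>.warc.gz
             url,
         ],
         check=True,  # raise if wget exits non-zero
     )

 grab_user("exampleuser")  # placeholder username

Keeping one WARC per user keeps items small and independently retryable, which is how warrior projects typically shard work.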

Other archives or attempts


References


Links