00:14:34Island quits [Read error: Connection reset by peer]
00:29:41tapos quits [Client Quit]
00:53:45Island joins
01:05:37ScenarioPlanet quits [Remote host closed the connection]
01:05:37TheTechRobo quits [Remote host closed the connection]
01:05:37Pedrosso quits [Remote host closed the connection]
01:06:02Pedrosso joins
01:06:06ScenarioPlanet (ScenarioPlanet) joins
01:06:21TheTechRobo (TheTechRobo) joins
01:53:52tzt_ is now known as tzt
02:04:31Wohlstand quits [Client Quit]
02:22:18Wohlstand (Wohlstand) joins
02:45:32shgaqnyrjp quits [Remote host closed the connection]
02:46:34shgaqnyrjp (shgaqnyrjp) joins
02:46:43lennier2_ quits [Ping timeout: 255 seconds]
02:47:12lennier2_ joins
02:49:51tapos joins
03:04:37Wohlstand quits [Client Quit]
03:07:06systwi (systwi) joins
04:12:54JaffaCakes118 quits [Remote host closed the connection]
04:13:17JaffaCakes118 (JaffaCakes118) joins
04:16:25kiryu joins
04:16:25kiryu quits [Changing host]
04:16:25kiryu (kiryu) joins
04:51:51kiryu quits [Read error: Connection reset by peer]
04:52:54kiryu joins
04:52:54kiryu quits [Changing host]
04:52:54kiryu (kiryu) joins
05:06:13xarph quits [Ping timeout: 255 seconds]
05:27:59bilboed quits [Quit: The Lounge - https://thelounge.chat]
05:28:20bilboed joins
05:33:28xarph joins
05:45:01<c3manu>since i got no answer in #archivebot: question for someone better versed in all that funky crypto currency business: is it worth archiving pages for individual "blocks", like this for example? https://localmonero.co/blocks/id/block/3145788
17:05:04<c3manu>this information should all be available on the blockchain anyways, no?
05:45:38<c3manu>localmonero.co job is fetching loads of urls like https://localmonero.co/blocks/tx/42b20c5496c188e0c03f8f8937fc6b9a8318b2fa576dc9ee19f5350d9f1a199a and i don't know whether it makes sense to grab them or not
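Context on c3manu's question: the raw block data is indeed on the chain itself; any synced monerod node will serve it over JSON-RPC, so only the explorer's own presentation of that data is at risk. A minimal sketch, with a hypothetical node URL:

    import json
    import urllib.request

    # Any synced monerod node exposes this JSON-RPC; the URL here is a placeholder.
    NODE = "http://node.example.org:18081/json_rpc"

    payload = json.dumps({
        "jsonrpc": "2.0",
        "id": "0",
        "method": "get_block",
        "params": {"height": 3145788},  # the block from the localmonero.co link above
    }).encode()

    req = urllib.request.Request(NODE, data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)["result"]

    # "block_header" carries the hash/timestamp; "json" is the full block, incl. tx hashes.
    print(result["block_header"]["hash"])
    print(json.loads(result["json"])["tx_hashes"][:5])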
05:46:52trenton13333 joins
06:10:50etnguyen03 quits [Remote host closed the connection]
06:17:16lumidify quits [Quit: leaving]
06:33:29sepro9 (sepro) joins
06:35:19sepro quits [Ping timeout: 255 seconds]
06:35:19sepro9 is now known as sepro
06:50:59Greedy joins
06:52:25sepro quits [Ping timeout: 255 seconds]
06:52:37Greedy quits [Client Quit]
06:59:37benjinsm quits [Ping timeout: 255 seconds]
06:59:55sepro (sepro) joins
07:01:56<pabs>this person died https://farley.io/ https://www.facebook.com/erin.a.farley/posts/pfbid02vjt7TCCmE5cSTrpiPVBqgHMokRCirSvaPZDEzUNrJd5sT2k6vd2gEkGY554DTxSTl
07:02:13<pabs>can someone AB / #gitgud / etc?
07:02:31<pabs>(they were from the mercurial/hg community)
07:04:55<pabs>thalia: AT can save things that are online right now, I think that's about it. archive.org does archiving of physical things such as printed catalogs.
07:05:01sepro quits [Ping timeout: 255 seconds]
07:06:11Unholy2361924645 (Unholy2361) joins
07:12:14sepro (sepro) joins
07:15:22pabs quits [Ping timeout: 255 seconds]
07:16:01pabs (pabs) joins
07:17:12<c3manu>pabs: what's their full name? with how shitty google has become i can't get anything better than "Tyler Polley's mercurial UConn career", but i don't think that's him ^^"
07:17:30<pabs>Sean Farley
07:17:52<c3manu>so tyler was wrong, too. thanks :)
07:18:43<c3manu>this one? https://farley.io/
07:19:19<c3manu>ah, the last blog category he wrote in is Mercurial, so i guess that's him
07:20:10<pabs>yea, I linked that above :)
07:21:01<c3manu>ohh..i'm super blind then, nvm m)
07:32:21<pabs>wow, they have a lot of subdomains c3manu
07:32:58<c3manu>got to have infrastructure set up for yourself ;)
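A quick way to enumerate subdomains like the ones pabs noticed is certificate-transparency search; crt.sh exposes its results as JSON (%25 is a URL-encoded % wildcard). A sketch:

    import json
    import urllib.request

    # crt.sh serves certificate-transparency search results as JSON;
    # "%." (URL-encoded "%25.") matches any subdomain of farley.io.
    url = "https://crt.sh/?q=%25.farley.io&output=json"

    with urllib.request.urlopen(url) as resp:
        entries = json.load(resp)

    names = set()
    for entry in entries:
        # name_value can hold several newline-separated names per certificate
        for name in entry["name_value"].splitlines():
            if not name.startswith("*."):
                names.add(name.lower())

    print("\n".join(sorted(names)))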
07:46:46benjinsm joins
08:45:59Island quits [Read error: Connection reset by peer]
08:57:19trenton13333 quits [Client Quit]
09:00:05Bleo18260072271 quits [Client Quit]
09:01:27Bleo18260072271 joins
09:01:49f_ (funderscore) joins
09:08:03<c3manu>any idea what might be good to grab regarding the georgian government? https://www.economist.com/europe/2024/05/09/protests-against-a-russian-style-law-threaten-georgias-government
09:15:22<c3manu>you know..in case it gets euromaidan’d or sth
09:57:27lumidify (lumidify) joins
10:23:28pedantic-darwin quits [Quit: The Lounge - https://thelounge.chat]
10:28:25pedantic-darwin joins
11:14:41etnguyen03 (etnguyen03) joins
12:53:04<AK>Well I can't even access https://parliament.ge/ from home or from the hetzner helsinki boxes, so dunno how we'd do that one
12:53:15Naruyoko quits [Remote host closed the connection]
12:53:36Naruyoko joins
12:58:28Notrealname1234 (Notrealname1234) joins
13:01:35<hexa->> Sorry, you have been blocked
13:01:46<hexa->from my home isp … sus
13:01:56<Notrealname1234>Block
13:02:10<Notrealname1234>You did sus things
13:05:19<@OrIdow6>Sounds like we need a Georgian (or maybe Russian?) IP
13:11:52Notrealname1234 quits [Client Quit]
13:12:43Notrealname1234 (Notrealname1234) joins
13:21:23<eightthree>re [IA•Wcite•.today•MemWeb] links in the wiki or elsewhere, should there be an indication when the link doesn't actually have the page?
13:22:15<eightthree>why are some sites in the wiki listed under more than one category, such as 4chan and mastodon being under lost sites and partially saved, simultaneously?
13:25:56Notrealname1234 quits [Client Quit]
13:27:21<thuban>eightthree: not really possible without manual intervention; mediawiki can generate links to those archives but not actually check what's on them (but altering the url template to better handle archive links has been previously discussed)
13:29:01<thuban>eightthree: usually because different subsets of content have different statuses (eg, one mastodon instance is lost but another is saved); all of the status links add their containing pages to the corresponding category
13:29:13<thuban>s/status links/status templates/
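The checking thuban says MediaWiki templates can't do could be done by an external script: the Wayback Machine's availability endpoint reports whether any snapshot exists for a URL (though not whether that capture is complete). A rough sketch of such a checker:

    import json
    import urllib.parse
    import urllib.request

    def wayback_has_page(url: str) -> bool:
        """True if the Wayback Machine reports at least one snapshot of `url`."""
        api = ("https://archive.org/wayback/available?url="
               + urllib.parse.quote(url, safe=""))
        with urllib.request.urlopen(api) as resp:
            data = json.load(resp)
        # "archived_snapshots" comes back empty when there is no capture
        return bool(data.get("archived_snapshots"))

    print(wayback_has_page("https://example.com/"))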
14:16:49Notrealname1234 (Notrealname1234) joins
14:18:28Notrealname1234 quits [Client Quit]
14:26:57Naruyoko5 joins
14:27:02Naruyoko5 quits [Read error: Connection reset by peer]
14:27:31Naruyoko5 joins
14:27:39Naruyoko5 quits [Read error: Connection reset by peer]
14:28:05Naruyoko5 joins
14:28:15Naruyoko5 quits [Read error: Connection reset by peer]
14:28:44Naruyoko5 joins
14:28:50Naruyoko5 quits [Read error: Connection reset by peer]
14:29:19Naruyoko5 joins
14:29:35Naruyoko5 quits [Read error: Connection reset by peer]
14:29:38lizardexile_ joins
14:30:04Naruyoko quits [Ping timeout: 255 seconds]
14:30:08Naruyoko5 joins
14:31:52lizardexile quits [Ping timeout: 255 seconds]
14:48:13<eightthree>"Google plans to close down more than 20 million Google Business Profile websites" I saw a mention of this in deathwatch, but shouldn't this have warranted a page on wiki, major saving effort and mention on the homepage etc?
14:51:45Naruyoko5 quits [Read error: Connection reset by peer]
15:05:07Notrealname1234 (Notrealname1234) joins
15:07:18<eightthree>+(edit: planned for around march)
15:14:50Notrealname1234 quits [Client Quit]
15:15:03Naruyoko joins
15:17:46nicolas17 quits [Ping timeout: 255 seconds]
15:18:55nicolas17 joins
15:30:42Naruyoko quits [Read error: Connection reset by peer]
15:31:07Naruyoko joins
15:47:10Naruyoko quits [Read error: Connection reset by peer]
15:47:40Naruyoko joins
15:54:45Naruyoko quits [Read error: Connection reset by peer]
15:55:11Naruyoko joins
16:00:51kiryu quits [Remote host closed the connection]
16:02:45Naruyoko quits [Read error: Connection reset by peer]
16:03:16Naruyoko joins
16:03:36Naruyoko quits [Read error: Connection reset by peer]
16:04:04Naruyoko joins
16:04:12Naruyoko quits [Read error: Connection reset by peer]
16:04:43Naruyoko joins
16:05:47<@OrIdow6>On that topic did anything ever happen with the idea to replace DW with a BTS?
16:17:43Naruyoko quits [Read error: Connection reset by peer]
16:18:15Naruyoko joins
16:18:23Naruyoko quits [Read error: Connection reset by peer]
16:18:43Naruyoko joins
16:21:15Naruyoko quits [Read error: Connection reset by peer]
16:21:37Naruyoko joins
16:36:45Naruyoko quits [Read error: Connection reset by peer]
16:37:15Naruyoko joins
16:45:15Naruyoko quits [Read error: Connection reset by peer]
16:45:37Naruyoko joins
16:55:45Naruyoko quits [Read error: Connection reset by peer]
16:56:11Naruyoko joins
17:04:15Naruyoko quits [Read error: Connection reset by peer]
17:04:19<hexa->OrIdow6: doesn't work from russia either
17:05:02Naruyoko joins
17:05:10<hexa->confirmed by a friend from his internet connection at home in moscow
17:11:43linuxgemini quits [Quit: Ping timeout (120 seconds)]
17:11:54linuxgemini (linuxgemini) joins
17:19:32Naruyoko quits [Read error: Connection reset by peer]
17:22:29shgaqnyrjp quits [Remote host closed the connection]
17:22:36shgaqnyrjp_ (shgaqnyrjp) joins
18:02:02tapos quits [Client Quit]
18:09:56<c3manu>currently running through the gov.ge subdomains
18:17:06Notrealname1234 (Notrealname1234) joins
18:20:36Notrealname1234 quits [Client Quit]
18:31:09Notrealname1234 (Notrealname1234) joins
18:31:52lun4 (lun4) joins
18:32:04AlsoHP_Archivist quits [Client Quit]
18:32:24HP_Archivist (HP_Archivist) joins
18:33:42<joepie91|m>AK: can confirm that parliament.ge loads for someone I know in georgia, so it is indeed a geoblock
18:38:39<c3manu>good to know, thanks! :)
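The vantage-point test hexa- and joepie91|m ran by hand is easy to script: fetch the page directly and again through a proxy inside the target country, then compare status codes. A sketch with a placeholder SOCKS proxy (e.g. an ssh -D 1080 tunnel to a host in Georgia; needs requests[socks] installed):

    import requests  # third-party; pip install requests[socks]

    URL = "https://parliament.ge/"
    # Placeholder proxy inside the target country; not a real endpoint.
    PROXIES = {"https": "socks5h://127.0.0.1:1080"}

    direct = requests.get(URL, timeout=30)
    proxied = requests.get(URL, proxies=PROXIES, timeout=30)

    # A geoblock typically shows up as a 403/block page directly but 200 via the proxy.
    print("direct: ", direct.status_code)
    print("proxied:", proxied.status_code)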
18:42:51Notrealname1234 quits [Client Quit]
19:16:25linuxgemini0 (linuxgemini) joins
19:17:26linuxgemini quits [Ping timeout: 265 seconds]
19:18:26linuxgemini0 quits [Client Quit]
19:20:51linuxgemini (linuxgemini) joins
19:21:41f_ quits [Client Quit]
20:12:38<tech234a>Broadcom's migration of VMWare content to a new platform seems to have broken a number of links (including downloads) on VMWare's website. Fortunately I was able to find the file I wanted from their software update server, but I'm not sure how strong the archive coverage is for that server. There seem to be a number of root directories on that server, could someone put them into AB? What I could find (probably incomplete) was:
20:12:39<tech234a>https://softwareupdate.vmware.com/cds https://softwareupdate.vmware.com/horizon-clients https://softwareupdate.vmware.com/viewcrt-windows https://softwareupdate.vmware.com/cds-ws-fus-2013-preview https://softwareupdate.vmware.com/cds-ws-fus-2012-preview https://softwareupdate.vmware.com/cds-ws-fus-2011-beta https://softwareupdate.vmware.com/cds-ws71-fus31-beta
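To gauge how much sits under those roots before queueing them, one could walk the server's HTML directory indexes, assuming standard auto-index pages. A sketch:

    import html.parser
    import urllib.parse
    import urllib.request

    class LinkParser(html.parser.HTMLParser):
        """Collect href targets from an HTML directory index."""
        def __init__(self):
            super().__init__()
            self.links = []
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += [v for k, v in attrs if k == "href" and v]

    def walk(url, depth=2):
        # Print every entry under `url`, recursing into subdirectories.
        with urllib.request.urlopen(url) as resp:
            parser = LinkParser()
            parser.feed(resp.read().decode(errors="replace"))
        for href in parser.links:
            child = urllib.parse.urljoin(url, href)
            if not child.startswith(url) or child == url:
                continue  # skip parent-directory and external links
            print(child)
            if child.endswith("/") and depth > 0:
                walk(child, depth - 1)

    walk("https://softwareupdate.vmware.com/cds/")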
20:13:37Island joins
20:21:56trenton13333 joins
20:37:53<nicolas17>Broadcom's migration of VMWare content to a new platform seems to have broken a lot of things
20:38:38<nicolas17>I heard of someone who had an active support ticket with VMware, and when it was migrated to Broadcom's platform it lost the last few days of replies
20:39:58<nicolas17>support articles were moved without keeping their "last modified" date so you have grossly outdated information that says it's from march 2024
20:40:05<nicolas17>because that's when it was moved
21:08:52<eightthree>thuban: Maybe a separate category called "some hosts lost"? Is the wiki "move fast and break things", i.e. make changes first, or should I ask first here?
21:09:05<eightthree>* without asking,
21:16:48<fireonlive>interesting the <title>s for those are '/var/www/public/stage/session-103/cds' etc
21:16:56<fireonlive>last one is '/www/stage/session-97/cds-ws-fus-2012-preview'
21:17:49<fireonlive>eightthree: would partially saved not cover that?
21:18:03<fireonlive>it also implies partially lost
21:18:40<fireonlive>glass half full/glass half empty
21:23:29DogsRNice joins
21:23:55<eightthree>Just confirming before (or while) I nag my matrix bridge mod: adding #archiveteam and #archiveteam-ot should work, right? I joined a bunch of AT rooms yesterday just fine, but these 2 today are failing to add.
21:29:48<thuban>eightthree: if by that you mean altering the status templates so as not to put a page in multiple status categories, then no, because mediawiki templates can't actually do that (they don't know what other templates are used on the containing page/what categories the containing page is in)
21:29:56<thuban>if you just mean creating a new category, calling it "some hosts lost", and adding all pages in multiple status categories to it, then still no, because use of multiple categories is often not about "hosts" (it may instead be about entirely different sites on the same topic, or different parts of the same site, or the same site at different times)
21:44:53<eightthree>fireonlive: I guess, so maybe just remove it from the lost category (and leave it in partially saved)?
21:49:01<@JAA>fireonlive: It's about pages like [[Mastodon]] where we list multiple projects and some were saved, some partially saved, some lost.
21:49:13<fireonlive>ahh
21:49:24<fireonlive>hm
21:49:25<@JAA>I don't think there's a better approach than what we do now. These pages act as containers for multiple different sites/parts/whatever.
21:49:40<fireonlive>yeah that would make sense
21:49:44<@JAA>We'd have to create individual pages for each to split it up, and that's not reasonable for such small things, I think.
21:54:05<eightthree>Just thinking out loud here: I'm noticing a lot of "manual" adding to AB, such as in the YT and github rooms etc. Is there enough bandwidth to build a tool that pulls e.g. all bankruptcies in a jurisdiction as mentioned in official govt or mandatory news "ads", and thus programmatically finds all the links (social accounts etc.) and adds these to AB?
21:54:09<eightthree>I'm noticing a lot of AI projects called agents, such as BabyAGI or Autogpt, and possibly newer, better ones. I'll report back more if I play with tools that seem reliable enough.
21:54:11<eightthree>Yada yada, context: I was trying to search for "closing down" "closing up" "shutting down" "filed for receivership" "until .. to backup/save" etc. with google search and google news, but that quickly became too slow. The hardest part might be filtering for large enough companies and/or those with "particularly added value" web-accessible data.
21:55:45<@JAA>Automating the searching has been discussed before, and we actually wanted to try that at one point but then it got stalled.
21:56:19<@JAA>The filtering would probably have to remain manual.
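One concrete shape such automation could take, sketched here as an assumption rather than a plan: Google News exposes an RSS search endpoint, so a cron job could poll shutdown-related queries and dump candidate stories for the manual filtering step. The queries below are illustrative:

    import urllib.parse
    import urllib.request
    import xml.etree.ElementTree as ET

    QUERIES = ['"shutting down"', '"filed for bankruptcy"', '"will close"']

    for query in QUERIES:
        url = ("https://news.google.com/rss/search?q="
               + urllib.parse.quote(query) + "&hl=en-US&gl=US&ceid=US:en")
        with urllib.request.urlopen(url) as resp:
            tree = ET.parse(resp)
        # Each <item> is a candidate story for a human to triage.
        for item in tree.iter("item"):
            print(item.findtext("title"), "\n ", item.findtext("link"))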
21:57:21<eightthree>https://wiki.archiveteam.org/index.php/Who_We_Are - I'm also surprised coders/scripters, automation or AI experts aren't mentioned as needed. Maybe AI is still questionably reliable, but the others shouldn't be, right?
21:58:06<eightthree>solved: I was able to join #AT and AT-ot
21:59:02<@JAA>That page has hardly been edited for over a decade.
22:02:56<joepie91|m>wanted: wiki maintainers
22:02:57<joepie91|m>:p
22:03:47<@JAA>:-)
22:06:49abirkill quits [Ping timeout: 255 seconds]
22:08:58<fireonlive>i need to edit it some more
22:11:17<h2ibot>FireonLive edited Current Projects (-68, move around current projects): https://wiki.archiveteam.org/?diff=52235&oldid=52006
22:13:17<h2ibot>FireonLive edited DeviantArt (-5, mark groups as saved): https://wiki.archiveteam.org/?diff=52236&oldid=51927
22:13:19<fireonlive>dunno if 'partially' is better there
22:13:36<fireonlive>it was declared very successful
22:14:01le0n_ quits [Ping timeout: 255 seconds]
22:26:37abirkill (abirkill) joins
22:28:58jacksonchen666 (jacksonchen666) joins
22:31:22<h2ibot>FireonLive edited Who We Are (-571, remove some outdated info; this page could use…): https://wiki.archiveteam.org/?diff=52237&oldid=45931
22:34:32Notrealname1234 (Notrealname1234) joins
22:38:17fangfufu quits [Quit: ZNC 1.8.2+deb3.1 - https://znc.in]
22:41:39fangfufu joins
22:43:01Notrealname1234 quits [Client Quit]
22:51:01Notrealname1234 (Notrealname1234) joins
22:53:15<Notrealname1234>What's the MEGA.nz API URL, so i can archive some stuff on WBM?
23:04:22Notrealname1234 quits [Client Quit]
23:09:18<nicolas17>the what
23:22:39BlueMaxima joins
23:41:40<@JAA>They think that the download from MEGA is a simple API request. It's far more involved IIRC.
23:45:38le0n (le0n) joins
23:47:36<nicolas17>yes you need multiple API requests to get the *encrypted* file, and then decrypt it yourself locally
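Roughly, the flow nicolas17 describes for a public link (https://mega.nz/file/<handle>#<key>) is: one API call resolves the handle to a temporary download URL, a plain HTTP fetch returns the encrypted bytes, and decryption happens locally with AES-128-CTR using a key folded out of the URL fragment. A compressed sketch from memory of the protocol; handle and key are placeholders, and real clients also chunk the transfer and verify the MAC:

    import base64
    import json
    import urllib.request
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    def b64d(s):  # MEGA uses unpadded urlsafe base64
        return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

    handle, key_b64 = "xxxxxxxx", "yyyy"  # placeholders from the public link

    # API call: resolve the node handle to a temporary download URL.
    req = urllib.request.Request(
        "https://g.api.mega.co.nz/cs",
        data=json.dumps([{"a": "g", "g": 1, "p": handle}]).encode(),
        headers={"Content-Type": "application/json"})
    info = json.load(urllib.request.urlopen(req))[0]

    # Plain fetch: the bytes on the wire are AES-128-CTR ciphertext.
    encrypted = urllib.request.urlopen(info["g"]).read()

    # The 256-bit key in the URL folds into a 128-bit AES key plus the CTR IV.
    k = b64d(key_b64)
    aes_key = bytes(a ^ b for a, b in zip(k[:16], k[16:]))
    counter0 = k[16:24] + b"\x00" * 8  # 8-byte IV + 8-byte counter starting at 0
    plaintext = Cipher(algorithms.AES(aes_key),
                       modes.CTR(counter0)).decryptor().update(encrypted)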