00:04:00jacobk quits [Ping timeout: 265 seconds]
00:07:10jacobk joins
00:20:55jacobk quits [Ping timeout: 265 seconds]
00:42:41jacobk joins
01:02:38dm4v_ joins
01:03:56dm4v quits [Ping timeout: 265 seconds]
01:03:56dm4v_ is now known as dm4v
01:03:57dm4v quits [Changing host]
01:03:57dm4v (dm4v) joins
01:21:46Sluggs quits [Ping timeout: 240 seconds]
01:22:28Sluggs joins
01:39:27thetechrobo_ joins
01:43:05TheTechRobo quits [Ping timeout: 265 seconds]
01:47:26Suika_ joins
01:51:01Suika quits [Quit: Server is ded]
02:02:22BlueMaxima quits [Read error: Connection reset by peer]
03:17:51ell (ell) joins
03:18:54ell quits [Changing host]
03:18:54ell (ell) joins
03:41:41Stiletto quits [Remote host closed the connection]
03:48:40Stiletto joins
03:54:56Lord_Nightmare quits [Client Quit]
03:55:31qwertyasdfuiopghjkl quits [Ping timeout: 265 seconds]
03:59:25Lord_Nightmare (Lord_Nightmare) joins
04:02:34qwertyasdfuiopghjkl joins
04:03:36eroc1990 quits [Client Quit]
04:04:01eroc1990 (eroc1990) joins
04:05:01qwertyasdfuiopghjkl quits [Client Quit]
04:05:11qwertyasdfuiopghjkl joins
04:27:25eroc1990 quits [Ping timeout: 265 seconds]
04:39:05eroc1990 (eroc1990) joins
05:26:44eroc1990 quits [Ping timeout: 265 seconds]
05:49:54eroc1990 (eroc1990) joins
05:52:52qwertyasdfuiopghjkl quits [Remote host closed the connection]
05:56:13eroc1990 quits [Ping timeout: 265 seconds]
05:57:42qwertyasdfuiopghjkl joins
05:59:07pabs quits [Quit: Don't rest until all the world is paved in moss and greenery.]
06:00:59pabs (pabs) joins
06:02:08eroc1990 (eroc1990) joins
06:13:38sonick quits [Client Quit]
06:27:58rsn quits [Remote host closed the connection]
06:53:23eroc1990 quits [Ping timeout: 265 seconds]
06:55:27<@OrIdow6>Will work on Buzzly tomorrow
07:04:34michaelblob_ quits [Read error: Connection reset by peer]
07:04:57michaelblob_ (michaelblob) joins
07:06:10qwertyasdfuiopghjkl quits [Remote host closed the connection]
07:23:25qwertyasdfuiopghjkl joins
07:28:50<@rewby>Welp, that's a "good morning" indeed https://s3.services.ams.aperture-laboratories.science/rewby/public/710554ec-770c-406f-86a6-4360f73a1e9a/1648711633.8789976.png
07:30:51niku quits [Remote host closed the connection]
07:45:06<datechnoman>rewby please sir, can I have some more? Haha
07:45:18<datechnoman>Targets are on fire on all projects
07:45:21<@rewby>datechnoman: I'm working on it
07:45:27<@rewby>My automation can only go so fast
07:45:54<@rewby>datechnoman: Y'all eating as fast as I can add... https://s3.services.ams.aperture-laboratories.science/rewby/public/cc4661cd-1a5f-44ab-9bd5-b8f8914d2a87/1648712741.5976489.png
07:50:46<@rewby>Congrats, you're on 18gbps now
07:57:59<datechnoman>Haha sorry. Servers are hungry nom nom rewby
07:58:32<datechnoman>That's across all projects?
08:00:39<@rewby>Yes
08:00:50<@rewby>But most of the server capacity is shared across multiple projects atm
08:11:36<datechnoman>Yeah, that's fine. Was just curious. It will be much quieter when coub finishes up
08:17:34<datechnoman>Servers are slowly emptying out now. Still got a bit of back pressure >:) heheh
08:19:04Megame (Megame) joins
08:32:20Doranwen quits [Ping timeout: 265 seconds]
08:33:36Doranwen (Doranwen) joins
08:48:00<@rewby>I feel sorry for the IA, they're currently eating 22gbps from me
09:00:28<@rewby>Currently bouncing between 20gbps in and 20gbps out
09:05:26eroc1990 (eroc1990) joins
09:11:58eroc1990 quits [Ping timeout: 265 seconds]
09:15:28<datechnoman>They'll be alright. They should be able to handle it for a few hours :P
09:19:13Church quits [Ping timeout: 265 seconds]
09:41:48sonick (sonick) joins
09:56:11<drexler>Alright I have a list of models and git repos and stuff I've made and can send to you.
09:56:16<drexler>How would you like me to do that?
09:57:29<drexler>It's just a .txt, have a favorite pastebin site?
10:30:53driib08 quits [Client Quit]
10:31:16driib08 (driib) joins
10:31:21driib08 quits [Client Quit]
10:31:47driib08 (driib) joins
10:32:53<thuban>drexler: transfer.archivete.am
10:34:14<drexler>https://transfer.archivete.am/%28/MHOqv/ai_art_model_archive_urls.txt%29.zip
10:35:10<drexler>If you glance through the list you will notice some things:
10:35:22<drexler>- Many *many* models are hosted on someone(s) personal Google Drive
10:35:32<drexler>- A lot of these URLs are weird/obscure
10:35:40<drexler>That is, the hosting status of a lot of these files is very precarious
10:36:41<drexler>I could go hunt down more, I might even, but as a first pass if you got everything in that list you would have already preserved a significant fraction of the important models people are going to want to be able to look at later.
10:37:52<thuban>thanks!
10:42:02<thuban>unfortunately, archivebot won't be able to handle this list by itself--in particular, google drive links are problematic.
10:46:35<thuban>what i would do is separate it into three lists: (1) binaries served directly over http (which are good to go through archivebot), (2) github repos (which can go to our github project, where they'll be archived completely--including github-specific data like issues), and (3) hosting services which have to be specially handled in some way (like google drive and iirc mega.nz)
10:48:19<thuban>if you want to do this, go ahead; if not, say so and i will
10:53:40<thuban>(i'm not sure whether we have a good way of handling google drive. :( the download process for large files includes both a nonce signature i suspect is nondeterministic and a post which definitely won't play back in the wbm--so even if we captured it in a warc it wouldn't really be discoverable from the original url.)
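The three-way split thuban describes could be sketched roughly like this (the host sets below are illustrative assumptions, not an official ArchiveTeam classification):

```python
import urllib.parse

# Hosts needing special handling (illustrative, not an official list).
SPECIAL_HOSTS = {"drive.google.com", "docs.google.com", "mega.nz"}
GITHUB_HOSTS = {"github.com", "www.github.com"}

def triage(urls):
    """Split a URL list into the three buckets described above:
    (1) direct HTTP downloads (good for ArchiveBot),
    (2) GitHub repos (for the GitHub project),
    (3) hosts that need special handling."""
    direct, github, special = [], [], []
    for url in urls:
        host = (urllib.parse.urlsplit(url).hostname or "").lower()
        if host in GITHUB_HOSTS:
            github.append(url)
        elif host in SPECIAL_HOSTS:
            special.append(url)
        else:
            direct.append(url)
    return direct, github, special
```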
10:57:38driib083 (driib) joins
11:01:20driib08 quits [Ping timeout: 265 seconds]
11:01:20driib083 is now known as driib08
11:17:12eroc1990 (eroc1990) joins
11:23:26eroc1990 quits [Ping timeout: 265 seconds]
11:46:36eroc1990 (eroc1990) joins
11:53:53eroc1990 quits [Ping timeout: 265 seconds]
12:09:34<masterX244>I wonder if someone could set up a helper page under archiveteam which spits out the linked url that the post redirects to, so if for example a gdrive link gets hit you can replace drive.google.com with gdrive.archiveteam.org in the address bar and get the right link (needs to be archived too, so it works inside wayback)
12:12:24eroc1990 (eroc1990) joins
12:14:58<thuban>slightly more discoverable than just uploading stuff as ia items, perhaps
12:18:08<masterX244>luckily archiveteam warcs are available. Got to peek into one to see if i can create a "phonebook" based on them for the old gdrive project
12:34:53duce1337 quits [Quit: Gone.]
12:36:25Zerote__ quits [Ping timeout: 265 seconds]
12:40:09Zerote joins
12:58:24Church (Church) joins
13:16:32eroc1990 quits [Ping timeout: 265 seconds]
13:21:29AK quits [Client Quit]
13:23:25AK (AK) joins
13:33:00eroc1990 (eroc1990) joins
13:36:21Arcorann quits [Ping timeout: 265 seconds]
13:38:25eroc1990 quits [Ping timeout: 265 seconds]
13:39:47eroc1990 (eroc1990) joins
13:45:03eroc1990 quits [Ping timeout: 265 seconds]
13:52:01eroc1990 (eroc1990) joins
13:54:19eroc1990 quits [Client Quit]
14:08:16eroc1990 (eroc1990) joins
14:23:45duce1337 joins
15:15:42Deewiant quits [Remote host closed the connection]
15:26:36<@JAA>drexler: Huh, curious how you uploaded that file to transfer to end up with that messy URL. Sounds like that might be a bug in the server.
15:29:07Deewiant (Deewiant) joins
15:31:00balrog quits [Quit: Bye]
15:39:42balrog (balrog) joins
15:50:06LeGoupil joins
16:15:56rn joins
16:28:18yo joins
16:28:31yo leaves
16:30:19spirit joins
17:01:37rn quits [Remote host closed the connection]
17:06:46rsn joins
17:19:47tzt quits [Ping timeout: 265 seconds]
17:23:39qwertyasdfuiopghjkl quits [Ping timeout: 265 seconds]
17:33:25Minkafighter quits [Client Quit]
17:40:10Minkafighter joins
17:44:02lunik1 quits [Quit: Ping timeout (120 seconds)]
17:44:07lunik1 joins
17:44:24<@OrIdow6>See the Google Drive wiki page
17:44:28<@OrIdow6>I have been working on a downloader
17:45:16<@OrIdow6>For playback
18:05:57<IDK>is there a way to archive tiktok accounts? tikup completely broke for me
18:07:42<@arkiver>is tikup that thing that creates an IA item for a tiktok video
18:12:01<IDK>arkiver: yes
18:12:06<IDK>however, the download won't work
18:12:14<IDK>no update since august
18:16:30<IDK>ERROR: Unable to download webpage: The read operation timed out (caused by timeout('The read operation timed out'))
18:40:14<thuban>IDK: https://github.com/yt-dlp/yt-dlp/issues/2396
18:40:43<thuban>if you're prepared to patch tikup, s/youtube_dl/yt_dlp/ will probably be drop-in
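The `s/youtube_dl/yt_dlp/` rename thuban suggests amounts to a plain text substitution over tikup's sources, since yt-dlp keeps youtube-dl's module-level API. A minimal sketch on a stand-in file (the file name and contents here are made up for illustration, not tikup's real code):

```python
import pathlib

# Stand-in source file; tikup's real source tree would be rewritten
# the same way, file by file.
src = pathlib.Path("tikup_demo.py")
src.write_text("import youtube_dl\nydl = youtube_dl.YoutubeDL(opts)\n")

# The drop-in rename: yt_dlp exposes the same YoutubeDL entry point,
# so substituting the module name is usually all the patch needs.
src.write_text(src.read_text().replace("youtube_dl", "yt_dlp"))
```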
19:00:01seednode4942 (seednode) joins
19:00:40seednode494 quits [Ping timeout: 265 seconds]
19:00:40seednode4942 is now known as seednode494
19:53:02duce1337 quits [Changing host]
19:53:02duce1337 (duce1337) joins
19:53:04<drexler>JAA, I uploaded it with their web upload form
19:53:09<drexler>The file doesn't have a weird name
19:59:58spirit quits [Client Quit]
20:11:00qw3rty_ joins
20:11:26binzyboi quits [Remote host closed the connection]
20:14:37qw3rty quits [Ping timeout: 265 seconds]
20:16:04binzyboi joins
20:38:52tzt (tzt) joins
20:56:01lunik1 quits [Client Quit]
20:56:05lunik1 joins
20:56:17hackbug quits [Remote host closed the connection]
21:00:57<h2ibot>JAABot edited CurrentWarriorProject (+4): https://wiki.archiveteam.org/?diff=48447&oldid=48433
21:03:31hackbug (hackbug) joins
21:07:47jacobk quits [Ping timeout: 265 seconds]
21:17:22LeGoupil quits [Client Quit]
22:00:26jacobk joins
22:16:29<@JAA>drexler: Oh, I see what's happening. The file is actually https://transfer.archivete.am/MHOqv/ai_art_model_archive_urls.txt , there's just a link to a ZIP download which uses the format /(file).zip, so it becomes /(/MHOqv/ai_art_model_archive_urls.txt).zip plus some percent encoding.
22:16:45<drexler>Yeah
22:17:00<@JAA>I had no idea this was a thing. :-)
22:17:18<@JAA>Or rather, that it's done with such an ugly URL.
22:17:29<drexler>I didn't see the normal download option or I'd have sent that instead
22:17:45<drexler>Anyway if the google drive ones are going to be such a pain
22:18:08<drexler>I could just like, upload them as an IA archive item or something I guess, though a lot of them are unrelated things, so it should probably be multiple items
22:18:24<@JAA>It should definitely return a plain file link, but I think the web interface is a bit unintuitive. I do all my uploads through curl.
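The mangled link can be reproduced from JAA's description: wrap the file's path in parentheses, append `.zip`, and percent-encode. This is a reconstruction of the observed behaviour, not transfer.archivete.am's actual server code:

```python
import urllib.parse

# The real file path from the upload above.
path = "/MHOqv/ai_art_model_archive_urls.txt"

# The ZIP-download link wraps the whole path in "(...)" and appends
# ".zip"; urllib.parse.quote then encodes the parentheses as %28/%29
# (slashes and dots are left alone by default).
zip_url = "https://transfer.archivete.am" + urllib.parse.quote(
    "/({}).zip".format(path)
)
```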
22:45:04lennier1 quits [Client Quit]
22:45:49lennier1 (lennier1) joins
22:53:32<@arkiver>JAA: want me to put that list of URLs in #// ?
22:55:24<@JAA>arkiver: Probably needs further attention, see earlier discussion. There's GDrive stuff in it etc.
22:56:59<@arkiver>alright
23:10:40Arcorann (Arcorann) joins
23:47:16Megame quits [Client Quit]
23:47:56Megame (Megame) joins
23:50:50@Fusl quits [Excess Flood]
23:51:11Fusl (Fusl) joins
23:51:11@ChanServ sets mode: +o Fusl
23:52:14hackbug quits [Client Quit]
23:57:54hackbug (hackbug) joins