00:03:22midou quits [Ping timeout: 255 seconds]
00:04:14katia is now known as catia
00:05:50midou joins
00:10:07midou quits [Ping timeout: 255 seconds]
00:45:52Wohlstand (Wohlstand) joins
00:46:50midou joins
01:28:09<mgrandi>@arkiver i have 1.8tb of portal 2 maps (934,000 files) in #outofsteam, should i throw those on IA in 100,000-file chunks or something?
01:35:38<pokechu22>Pedrosso: ^
01:48:10Ruthalas59 quits [Client Quit]
02:00:18<mgrandi>yeah i already told him :)
02:00:21<mgrandi>them*
02:14:10<nicolas17>hot damn
02:14:14Ruthalas59 (Ruthalas) joins
02:27:26Wohlstand quits [Remote host closed the connection]
02:31:49mrbrown8 joins
02:32:32<thuban>JAA: those are broader than what we use in the ctas on the wiki (idk how much this matters)
02:32:57<@JAA>thuban: Yes. The bot sorts out the rest.
02:34:45<thuban>yes
02:34:56<thuban>but dammit, how "intentionally broad" should they be
02:35:06<@JAA>:-P
02:39:19<h2ibot>Switchnode edited Pastebin (+44, add filter regex): https://wiki.archiveteam.org/?diff=51996&oldid=51483
02:40:36<mgrandi>also the 1.8TB is compressed since half life 2 map files (.bsp) compress fairly well
02:42:31Dango360_ joins
02:46:33Dango360 quits [Ping timeout: 272 seconds]
02:54:27<nicolas17>is that compressing them "individually"?
03:15:58mrbrown8 quits [Client Quit]
03:33:47kiryu quits [Remote host closed the connection]
03:42:13xcx joins
03:42:30xcx quits [Client Quit]
04:16:47kiryu (kiryu) joins
04:54:43icedice quits [Client Quit]
05:29:48Island quits [Read error: Connection reset by peer]
05:48:47<mgrandi>Yeah, I downloaded them one at a time, got the preview images and ran 7z on the two files
05:49:26<mgrandi>I got better compression that way than using zstd even after training
05:56:19<fireonlive>i won't hear this anti-zstd propaganda
06:20:52<Vokun>7z++
06:20:53<eggdrop>[karma] '7z' now has 1 karma!
06:22:46<@JAA>Is there no commonality between the .bsp files? That's where the custom dict would shine.
06:26:00BlueMaxima quits [Read error: Connection reset by peer]
06:39:48<mgrandi>there probably is in the header and other custom stuff, but i tried various tests in the past and i don't know if i'm training them wrong (was doing it on image files), but 7z at level 9 seemed to beat zstd every time
06:40:31<pokechu22>Are you doing a solid archive with 7z?
06:40:56<mgrandi>i just did "7z a -mx 9" as the commandline
06:41:03<pokechu22>if so that's basically equivalent to taring all of the files together and then compressing them
06:43:16<pokechu22>"-ms=on" enables solid archives; not sure if they're on by default with -mx 9. Might be worth trying both ways. If I recall correctly, archive.org doesn't like listing the files in solid archives, though you can still download individual files if you know the correct filename
06:45:22<mgrandi>yeah, right now it's just individual files, i can recompress them to test, and will have to do something similar to upload to IA
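A minimal sketch of the comparison being discussed, assuming the 7z (p7zip) and zstd command-line tools and a hypothetical ./maps/ directory of .bsp files; none of the paths or sizes come from the chat.

    #!/usr/bin/env bash
    # Compare a solid 7z archive, per-file 7z, and per-file zstd with a trained dictionary.
    set -euo pipefail

    # 1) One solid 7z archive: -ms=on compresses everything as a single stream,
    #    roughly equivalent to tar-then-compress, so structure shared across .bsp files helps.
    7z a -mx=9 -ms=on maps-solid.7z ./maps/*.bsp

    # 2) Per-file 7z at maximum level (the original approach).
    for f in ./maps/*.bsp; do
        7z a -mx=9 "${f}.7z" "$f"
    done

    # 3) zstd with a trained dictionary: train on the maps, then compress each file with it.
    zstd --train ./maps/*.bsp -o bsp.dict
    zstd -19 -D bsp.dict ./maps/*.bsp

    # Total size of each approach.
    du -ch maps-solid.7z | tail -1
    du -ch ./maps/*.bsp.7z | tail -1
    du -ch ./maps/*.bsp.zst | tail -1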
06:52:26<thuban>what are the present obstacles to getting #googlecrash running again?
06:54:48<thuban>we discussed it briefly when google announced its inactive account purges, but afaik didn't get anywhere, and it seems more important in light of the recent crackdown rumors
07:05:02Unholy23613166180851599738 quits [Remote host closed the connection]
07:06:10Unholy23613166180851599738 (Unholy2361) joins
07:08:07nulldata quits [Ping timeout: 272 seconds]
07:08:49nulldata (nulldata) joins
07:39:47<@arkiver>mgrandi: can you pack them into chunks of a few hundred GBs and upload those perhaps?
07:40:33<mgrandi>yeah, i was asking advice on how to best upload it, each file is like 5gb so i can chunk them appropriately
07:41:13<mgrandi>5 MiB -> 100MiB per map usually, sorry
07:55:11<pabs>https://www.davidrevoy.com/article1020/the-end-of-peppercarrot-and-my-next-project
08:03:50<@arkiver>mgrandi: if you can do a few GB per item, that would be good too
08:04:10<@arkiver>but yes, best to pack them up, IA does not deal well with tons of tiny files unfortunately
08:04:52<@arkiver>well - the files just sitting there is not really a problem; the problem is more that if anything needs to be done with them, it becomes slow
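A rough sketch of the packing arkiver suggests, assuming the "ia" CLI from the internetarchive Python package is installed and configured; the chunk size, paths, and item identifiers are illustrative only, not from the chat.

    #!/usr/bin/env bash
    # Pack already-compressed map archives into ~10 GB store-only 7z containers
    # and upload each container as its own IA item.
    set -euo pipefail

    chunk=0
    batch=()
    bytes=0
    limit=$((10 * 1024 ** 3))   # ~10 GB per item (illustrative)

    flush() {
        if [[ ${#batch[@]} -eq 0 ]]; then return 0; fi
        local name
        name="portal2-maps-chunk$(printf '%04d' "$chunk")"   # hypothetical identifier scheme
        7z a -mx=0 "${name}.7z" "${batch[@]}"                # store only, contents are already compressed
        ia upload "$name" "${name}.7z" --metadata="mediatype:data"
        chunk=$((chunk + 1)); batch=(); bytes=0
    }

    for f in ./maps/*.7z; do
        size=$(stat -c%s "$f")                               # GNU stat assumed
        if (( bytes + size > limit )); then flush; fi
        batch+=("$f"); bytes=$((bytes + size))
    done
    flush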
08:12:24<steering>mgrandi: wow, that's an impressive amount of maps. here i thought my 400 CSS maps were good :)
08:13:59<mgrandi>@steering: honestly yours are more valuable, as I got these from my script to automatically download stuff from steam workshop, but games that predate the workshop have their maps strewn all over
08:14:23<steering>this is true
08:14:58<steering>i wonder how well archived gamebanana is
08:19:03<mgrandi>Not very well, those are on my list of things to do eventually
08:19:34<mgrandi>So maybe throw your custom css maps on IA someday when you have a moment :)
08:20:35<steering>I'm thinking about ripping all of gamebanana's CSS maps, too
08:28:34<mgrandi>That's a tricky site but doing preliminary research would be a great start!
08:41:13<steering>the APIs look pretty straightforward, hopefully they don't ban me
08:41:20<steering>:P
08:50:42<fireonlive>that's what a VPN is for :3
08:50:46<fireonlive>(afterwards)
09:00:02Bleo182600 quits [Client Quit]
09:01:18Bleo182600 joins
09:29:22<steering>https://transfer.archivete.am/inline/LlqVV/yank-maps.sh there we are I think, other than the weird naming, their api is pretty helpful
09:29:27<steering>(and yes, bash was a terrible choice)
09:31:31<fireonlive>JAA-approved language
09:31:53<steering>503762 503762-$2011$_st.json 503762-1164709-2011__st.zip curl: (22) The requested URL returned error: 504
09:31:56<steering>MD5 mismatch 503762-1164709-2011__st.zip!
09:31:57<steering>and i already got to test it xD
09:32:19Guest88 quits [Ping timeout: 265 seconds]
09:32:53<fireonlive>:3
09:33:51istuk10 joins
09:33:55istuk10 quits [Client Quit]
09:36:04<fireonlive>steering++
09:36:04<eggdrop>[karma] 'steering' now has 1 karma!
09:40:12michaelblob quits [Read error: Connection reset by peer]
09:40:44steering warm fuzzies
09:43:13michaelblob (michaelblob) joins
09:47:57<fireonlive>:d
10:45:45<steering>https://transfer.archivete.am/JY5sq/yank-gamebanana-maps.sh here we go, seems good although it will require running multiple times to get everything (change $page_num to whatever page you've completed)
10:45:45<eggdrop>inline (for browser viewing): https://transfer.archivete.am/inline/JY5sq/yank-gamebanana-maps.sh
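A hedged outline of what a pagination loop like yank-gamebanana-maps.sh might look like; the GameBanana endpoint, query parameters, and JSON field names below are assumptions for illustration and are not taken from steering's actual script.

    #!/usr/bin/env bash
    # Walk the listing API one page at a time, then fetch each submission's files
    # and verify the MD5 the site reports. Requires curl, jq, md5sum.
    set -uo pipefail

    game_id=2    # hypothetical game id
    page_num=1   # bump this to resume after a completed page

    while :; do
        # -g stops curl from globbing the brackets in the filter parameter
        listing=$(curl -gfsS "https://gamebanana.com/apiv11/Mod/Index?_nPage=${page_num}&_nPerpage=50&_aFilters[Generic_Game]=${game_id}") || break
        count=$(jq '._aRecords | length' <<<"$listing")
        if (( count == 0 )); then break; fi                  # no more pages

        jq -r '._aRecords[] | ._idRow' <<<"$listing" | while read -r id; do
            meta=$(curl -fsS "https://gamebanana.com/apiv11/Mod/${id}?_csvProperties=_aFiles") || continue
            jq -r '._aFiles[] | "\(._sDownloadUrl) \(._sFile) \(._sMd5Checksum)"' <<<"$meta" |
            while read -r url name md5; do
                curl -fsSL -o "${id}-${name}" "$url" || { echo "download failed: ${name}"; continue; }
                echo "${md5}  ${id}-${name}" | md5sum -c - || echo "MD5 mismatch ${id}-${name}!"
            done
        done

        page_num=$((page_num + 1))
    done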
10:46:22pedantic-darwin quits [Client Quit]
10:46:37pedantic-darwin joins
11:42:46<that_lurker>Is there a channel for listing sites that are doing something for april fools that should be archived?
11:50:06qwertyasdfuiopghjkl quits [Client Quit]
11:56:48qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
12:24:20JaffaCakes118 quits [Remote host closed the connection]
12:28:27JaffaCakes118 (JaffaCakes118) joins
12:28:53<nstrom|m>https://aprilfoolsdayontheweb.com/ looks to have a list of sites doing things. dunno about a channel though
12:33:34jacksonchen666 quits [Remote host closed the connection]
12:41:37Arcorann quits [Ping timeout: 255 seconds]
12:48:24jacksonchen666 (jacksonchen666) joins
12:57:46jacksonchen666 quits [Client Quit]
13:27:31za4k joins
13:28:08za4k quits [Client Quit]
13:50:36Guest88 joins
14:30:56JaffaCakes118 quits [Remote host closed the connection]
14:41:06Unholy23613166180851599738 is now known as Unholy2361
14:41:19Unholy2361 quits [Client Quit]
14:41:35JaffaCakes118 (JaffaCakes118) joins
14:41:37Unholy2361 (Unholy2361) joins
15:09:18Dango360_ quits [Client Quit]
15:09:38Dango360 (Dango360) joins
15:22:11rohvani joins
15:47:31Wohlstand (Wohlstand) joins
15:50:37driib quits [Ping timeout: 255 seconds]
16:22:11icedice (icedice) joins
16:26:03Guest88 quits [Client Quit]
16:52:16wickerz quits [Ping timeout: 255 seconds]
16:53:36wickerz joins
17:14:11JaffaCakes118 quits [Remote host closed the connection]
17:14:31JaffaCakes118 (JaffaCakes118) joins
17:33:15wickedplayer494 quits [Remote host closed the connection]
17:40:19wickedplayer494 joins
17:54:02Island joins
17:56:13Island quits [Read error: Connection reset by peer]
17:59:28Island joins
18:05:32JaffaCakes118 quits [Client Quit]
18:05:47JaffaCakes118 (JaffaCakes118) joins
18:23:46Molotep joins
18:24:15Molotep quits [Client Quit]
18:50:51<icedice>How is it going with the scanlation group sites, by the way?
20:12:00JaffaCakes118 quits [Remote host closed the connection]
20:12:21JaffaCakes118 (JaffaCakes118) joins
20:14:05BornOn420 quits [Ping timeout: 272 seconds]
20:14:39BornOn420 (BornOn420) joins
20:17:28systwi quits [Ping timeout: 255 seconds]
20:30:56systwi (systwi) joins
20:39:49Dango360_ joins
20:43:51Dango360 quits [Ping timeout: 272 seconds]
20:47:22<@JAA>20:17:56 <+firebot> @textfiles: Archive Team, would you kindly archive the online presence of Ed Piskor. <https://twitter.com/textfiles/status/1774893623177306297>
20:49:55<Barto>I threw two websites related to ed piskor. twitter handle seems deleted
20:51:11<@JAA>Ah, perfect :-)
20:51:15<@JAA>Barto++
20:51:16<eggdrop>[karma] 'Barto' now has 3 karma!
20:59:06_Dango360 joins
21:02:55Dango360_ quits [Ping timeout: 255 seconds]
21:04:42<mgrandi>@steering: wow they have an API? Life of luxury
21:09:48BlueMaxima joins
21:39:24_Dango360 quits [Client Quit]
21:39:46Dango360 (Dango360) joins
22:07:16midou quits [Ping timeout: 255 seconds]
22:11:19nick73 joins
22:11:54nick73 quits [Client Quit]
22:15:50BlueMaxima_ joins
22:15:51mr_sarge quits [Read error: Connection reset by peer]
22:15:57benjins2_ quits [Read error: Connection reset by peer]
22:16:00benjinsm quits [Read error: Connection reset by peer]
22:16:02Naruyoko5 joins
22:16:32AlsoHP_Archivist joins
22:16:33mr_sarge (sarge) joins
22:16:41benjinsm joins
22:16:46wickerz quits [Client Quit]
22:16:53nic8 quits [Quit: Ping timeout (120 seconds)]
22:16:58wickerz joins
22:17:04Perk3 joins
22:17:12nic8 (nic) joins
22:17:17Ryz29 (Ryz) joins
22:17:18rohvani2 joins
22:17:20CraftByte0 (DragonSec|CraftByte) joins
22:17:28benjins2_ joins
22:17:33andrew2 (andrew) joins
22:17:34flashfire429 joins
22:17:41nulldata9 (nulldata) joins
22:17:44abirkill- (abirkill) joins
22:17:51G4te_Keep3r34924 quits [Quit: Ping timeout (120 seconds)]
22:17:55wyatt8740 quits [Remote host closed the connection]
22:17:56Justin[home] joins
22:18:01Frogging101 quits [Remote host closed the connection]
22:18:02steering7253 (steering) joins
22:18:06G4te_Keep3r34924 joins
22:18:12wyatt8740 joins
22:18:13Muad-Dib quits [Ping timeout: 272 seconds]
22:18:16andrew quits [Read error: Connection reset by peer]
22:18:16andrew2 is now known as andrew
22:18:18steering quits [Read error: Connection reset by peer]
22:18:20Ryz9 (Ryz) joins
22:18:21JohnnyJ2 joins
22:18:23Frogging101 joins
22:18:26steering7253 is now known as steering
22:18:30rohvani quits [Client Quit]
22:18:30rohvani2 is now known as rohvani
22:18:33xarph_ joins
22:18:51Ryz quits [Read error: Connection reset by peer]
22:18:51flashfire42 quits [Read error: Connection reset by peer]
22:18:51DopefishJustin quits [Ping timeout: 272 seconds]
22:18:51Ryz9 is now known as Ryz
22:18:51flashfire429 is now known as flashfire42
22:18:57abirkill quits [Read error: Connection reset by peer]
22:18:57Ryz2 quits [Read error: Connection reset by peer]
22:18:57abirkill- is now known as abirkill
22:18:57Ryz29 is now known as Ryz2
22:19:06Muad-Dib joins
22:19:29BlueMaxima quits [Ping timeout: 272 seconds]
22:19:29nulldata quits [Ping timeout: 272 seconds]
22:19:29Naruyoko quits [Ping timeout: 272 seconds]
22:19:29Perk quits [Ping timeout: 272 seconds]
22:19:29HP_Archivist quits [Ping timeout: 272 seconds]
22:19:29Perk3 is now known as Perk
22:19:30nulldata9 is now known as nulldata
22:19:46CraftByte quits [Read error: Connection reset by peer]
22:19:46CraftByte0 is now known as CraftByte
22:20:07JohnnyJ quits [Ping timeout: 272 seconds]
22:20:07xarph quits [Ping timeout: 272 seconds]
22:20:07JohnnyJ2 is now known as JohnnyJ
22:32:51lunik1 quits [Quit: :x]
22:34:54lunik1 joins
23:25:47<pabs>from #debian-gnome (OFTC): <jbicha> smcv: Debian doesn't have any objections to GNOME switching to gitlab tags for tarball releases and killing download.gnome.org, right?
23:25:53pabs asking for more details
23:26:29<nicolas17>wouldn't it cause problems if the tarball is non-reproducible?
23:27:07<nicolas17>although github already made a change to how they compress tarballs, and it broke SO much crap wrongly expecting the autogenerated .tar.gz to have a consistent SHA1, that they had to revert it
23:27:19<nicolas17>so I guess that's API now
23:27:36<fireonlive>really
23:27:39<fireonlive>o_o huh
23:28:50jr joins
23:29:34<nicolas17>yeah, think package managers or build systems assuming https://github.com/lzfse/lzfse/archive/refs/tags/lzfse-1.0.tar.gz SHA1 is 40f156053e34e8725f052d2d1590b6abd318f899
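As an illustration of the pinning being described, a minimal sketch that downloads the autogenerated tag tarball and fails on a checksum mismatch; the URL and SHA1 are the ones quoted above, and the check only holds as long as GitHub keeps generating that archive bit-for-bit identically.

    #!/usr/bin/env bash
    # Verify a GitHub-autogenerated tag tarball against a pinned SHA1.
    set -euo pipefail

    url="https://github.com/lzfse/lzfse/archive/refs/tags/lzfse-1.0.tar.gz"
    expected_sha1="40f156053e34e8725f052d2d1590b6abd318f899"

    curl -fsSL -o lzfse-1.0.tar.gz "$url"
    actual_sha1=$(sha1sum lzfse-1.0.tar.gz | awk '{print $1}')

    if [[ "$actual_sha1" != "$expected_sha1" ]]; then
        echo "checksum mismatch: got ${actual_sha1}, expected ${expected_sha1}" >&2
        exit 1
    fi
    echo "tarball matches pinned SHA1"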
23:30:19jr quits [Client Quit]
23:30:37Kinille quits []
23:33:59Kinille (Kinille) joins
23:37:04<fireonlive>nicolas17: ah yeah, i have some stuff pinned like that
23:37:32<fireonlive>was it consistent after the change?
23:38:53<nicolas17>fireonlive: I think the change was reverted
23:39:41<fireonlive>ah
23:39:45<fireonlive>i was picturing like
23:40:04<fireonlive>hashes were different for each download after they made their change
23:40:13<fireonlive>but probably it just invalidated all existing ones
23:41:06<nicolas17>yeah probably they were generated in a consistent but different-than-before way
23:42:04<fireonlive>ah yee
23:52:21Kinille quits [Client Quit]
23:53:19Kinille (Kinille) joins