00:22:00BlueMaxima joins
00:23:07BlueMaxima quits [Read error: Connection reset by peer]
00:23:17BlueMaxima joins
00:26:38icedice quits [Client Quit]
00:41:49sonick (sonick) joins
00:45:55Mateon2 joins
00:47:35Mateon1 quits [Ping timeout: 252 seconds]
00:47:35Mateon2 is now known as Mateon1
00:55:54AmAnd0A quits [Read error: Connection reset by peer]
00:56:11AmAnd0A joins
01:07:54benjins quits [Remote host closed the connection]
01:08:05benjins joins
01:08:44fireonlive quits [Quit: Connection gently closed by peer]
01:09:16Naruyoko quits [Read error: Connection reset by peer]
01:09:50fireonlive (fireonlive) joins
01:22:51Naruyoko joins
01:36:06benjins2_ joins
01:38:13benjins2 quits [Ping timeout: 258 seconds]
01:51:15benjins quits [Remote host closed the connection]
01:51:30benjins joins
02:01:34<fireonlive>https://old.reddit.com/r/linux/comments/14g45gy/suse_has_uploaded_new_parody_songs
02:03:02justmolamola joins
02:05:49fishingforsoup quits [Ping timeout: 258 seconds]
02:05:59<nulldata>(https://teddit.net/r/linux/comments/14g45gy/suse_has_uploaded_new_parody_songs) FTFY ;)
02:12:27<imer>damn, all these are so good
02:12:41<nulldata>Can't wait for RedHat's "Where Open Source Goes (essentially) Closed"
02:48:16pabs is reminded of https://drewdevault.com/2021/01/20/FOSS-is-to-surrender-your-monopoly.html
02:57:58<fireonlive>nulldata: :P
03:17:21<Doranwen>Has anyone here ever tried setting up a symlink for flatpak so all the individual flatpaks go on a different partition? My / one doesn't have space for any of them, but I have spare space on my /home partition…
03:17:50<Doranwen>I saw several comments on this issue saying they'd done that, and I was trying to gauge exactly how to do the same thing, lol: https://github.com/flatpak/flatpak/issues/1224
03:18:03qwertyasdfuiopghjkl quits [Remote host closed the connection]
03:31:25qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
03:37:12Doranwen thinks she's set up the symlink fine and it seems to be going there, just hopes she hasn't missed data going anywhere else
03:40:58gfhh1 quits [Remote host closed the connection]
03:41:18gfhh1 joins
03:46:54Jake quits [Quit: Ping timeout (120 seconds)]
03:47:09Jake (Jake) joins
03:49:01Shjosan quits [Quit: Am sleepy (-, – )…zzzZZZ]
03:49:38Shjosan (Shjosan) joins
04:19:21BearFortress quits [Read error: Connection reset by peer]
04:19:32BearFortress joins
04:24:32HP_Archivist quits [Read error: Connection reset by peer]
04:24:59HP_Archivist (HP_Archivist) joins
04:37:02PredatorIWD joins
04:50:34Hans5958 quits [Quit: Reconnecting]
04:50:43Hans5958 (Hans5958) joins
04:55:02HP_Archivist quits [Read error: Connection reset by peer]
04:55:29HP_Archivist (HP_Archivist) joins
05:29:19sonick quits [Client Quit]
05:30:32HP_Archivist quits [Read error: Connection reset by peer]
05:30:59HP_Archivist (HP_Archivist) joins
05:38:18<PredatorIWD>Best way to compress a single 400MB .txt file that can then be searched through without decompressing everything inside, using JavaScript or similar?
05:39:19<PredatorIWD>I've compiled a list of all the archived ZippyShare links and want to make it easily searchable but without anyone needing to download or uncompress a 400MB txt file first
05:52:19<datechnoman>When we upload compressed text files with .zst you can just drop the .zst off the file name end and view all the urls in the browser. Not too sure if that is a proper solution but it works
05:55:40justmolamola quits [Remote host closed the connection]
06:05:26BigBrain quits [Ping timeout: 245 seconds]
06:22:50BigBrain (bigbrain) joins
06:54:24BlueMaxima quits [Client Quit]
07:02:24Earendil7 quits [Quit: Leaving]
07:27:44Minkafighter quits [Quit: The Lounge - https://thelounge.chat]
07:28:53Minkafighter joins
07:53:17Earendil7 (Earendil7) joins
08:54:32yasomi quits [Ping timeout: 265 seconds]
08:55:06yasomi (yasomi) joins
09:00:12yasomi quits [Ping timeout: 258 seconds]
09:04:03yasomi (yasomi) joins
09:23:32Chris5010 quits [Ping timeout: 265 seconds]
09:28:48Minkafighter quits [Client Quit]
09:29:07Minkafighter joins
09:29:20Church quits [Ping timeout: 265 seconds]
09:29:50Church (Church) joins
09:48:49Craigle quits [Quit: The Lounge - https://thelounge.chat]
09:49:21Craigle (Craigle) joins
09:55:46AmAnd0A quits [Remote host closed the connection]
09:55:59AmAnd0A joins
10:10:37driib quits [Quit: The Lounge - https://thelounge.chat]
10:11:12driib (driib) joins
10:30:30pie_ quits []
10:30:38pie_ joins
10:34:07<thuban>PredatorIWD: if you're willing to make people download _something_ as long as it's not 400M, you can compress it and tell people to use one of the compressed-file-greppers (zgrep for gzip, zstdgrep or ripgrep for zstd)
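The compressed-grepper approach thuban describes can be sketched like this (filenames and URLs are made-up examples; the same pattern works with zstd/zstdgrep for .zst files):

```shell
# Toy stand-in for the 400 MB link list
printf '%s\n' \
    'https://www.zippyshare.com/v/abc123/file.html' \
    'https://www.zippyshare.com/v/def456/file.html' > urls.txt

# Compress once; -k keeps the original file around
gzip -kf urls.txt

# Readers can grep the compressed file directly, no manual decompression
zgrep 'def456' urls.txt.gz
```

With zstd the equivalent would be `zstd urls.txt` followed by `zstdgrep 'def456' urls.txt.zst`.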
10:37:37<masterX244>i wonder if split server-side storage could work: compress each chunk by itself, plus an index that tells which chunk starts where (logic inside javascript to load the right chunks, and then the server can be a static site like gh-pages)
10:40:13<razul>Split the processing and the storage some more you mean?
10:44:28<PredatorIWD>I basically wanted a simple static github page where data would be decompressed on the fly in the browser while being searched, without the entire 400MB being loaded, but it seems like the easiest way would be to just compress the .txt file and upload it for people to easily download and uncompress themselves if needed.
10:45:37<razul>Stream decoding like LZ compression can do?
10:50:45<razul>Or as was said before, an index that can be searched and points to the right chunk seems like the most KISS solution
10:57:03<PredatorIWD>Eh, for now I think I'll just keep it as a simple file for people to download, maybe at some point I'll play around with chunk decoding and streaming, or if some website like github adds free limited dynamic page functionalities...
10:58:05<PredatorIWD>The file I am talking about btw are compiled archive team ZippyShare archived links, I've made an item at: https://archive.org/details/ZippyShare-Compiled-Archived-Links-from-ArchiveTeam.org
10:58:51<PredatorIWD>I've converted the .txt to a large PDF so archive.org can index it and people can search it through the "full text" search functionality; given its size it's still deriving, meaning I can't upload the compressed version of the .txt yet, but the full file is already up on there.
10:59:01<masterX244>chunks meant splitting the 400MB file into for example 4MB sections labeled 0 to 100, and then a helper file that tells at which part each chunk starts
10:59:41<masterX244>aka JS reads the helper first, compares after which line the searched one is, and then grabs that file and decompresses it
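masterX244's chunk-plus-index scheme could be built roughly like this (a toy-sized sketch; filenames, chunk size, and URLs are all invented, and gzip stands in for zstd so browsers can inflate chunks natively):

```shell
# Toy URL list standing in for the 400 MB file
seq 1 1000 | sed 's#^#https://www.zippyshare.com/v/#' > urls.txt

# Split on line boundaries into ~8 KB pieces: chunk-00, chunk-01, ...
# (4 MB pieces in the real case)
split -C 8k -d urls.txt chunk-

# Helper index: one line per chunk recording its first URL
rm -f index.txt
for f in chunk-??; do
    printf '%s %s\n' "$f" "$(head -n 1 "$f")" >> index.txt
    gzip -f "$f"              # each chunk compressed independently
done

cat index.txt
# Browser-side JS would fetch index.txt, pick the chunk whose range
# covers the search key, then fetch and inflate only that chunk-NN.gz
# (e.g. with the DecompressionStream API) - all servable from gh-pages.
```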
12:06:37Minkafighter quits [Client Quit]
12:08:18Minkafighter joins
12:13:55Iki quits [Read error: Connection reset by peer]
12:40:01that_lurker quits [Quit: Clowning around is not the same as fooling around...I am a clown, not a fool]
12:40:11that_lurker (that_lurker) joins
12:46:29AmAnd0A quits [Read error: Connection reset by peer]
12:47:07AmAnd0A joins
12:51:44AmAnd0A quits [Ping timeout: 258 seconds]
12:52:24AmAnd0A joins
13:02:29jasonswohl quits [Ping timeout: 265 seconds]
13:08:46jasonswohl joins
13:10:56emberquill quits [Quit: The Lounge - https://thelounge.chat]
13:11:16emberquill (emberquill) joins
13:39:11HackMii quits [Ping timeout: 245 seconds]
13:41:40HackMii (hacktheplanet) joins
13:54:02cdub joins
14:12:47Arcorann quits [Ping timeout: 252 seconds]
16:24:59SF quits [Remote host closed the connection]
16:26:47SF joins
16:42:01LeGoupil joins
17:03:50yasomi quits [Ping timeout: 252 seconds]
17:05:12<@JAA>thuban: zstdgrep for gzip*, it can handle it and is much faster than zgrep.
17:06:17<@JAA>(Unless you're running a weird self-compiled zstd without zlib support.)
17:07:34<nicolas17>I think zstd -b (benchmark) doesn't support zlib and other algorithms
17:07:40<nicolas17>that would be handy...
17:08:16<@JAA>It would indeed.
17:08:55<thuban>i stand corrected!
17:09:08<thuban>(i think ripgrep also supports a variety of algorithms.)
17:10:13<masterX244>ripgrep breaks once you need --long=31 on a zstd, no way to pass those flags
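For the long-window case masterX244 hits, piping a flag-aware decompression through a plain grep seems to be the workaround, since the grep wrappers can't forward `--long` to zstd (sample filename and contents are made up; requires a zstd build with long-distance-matching support):

```shell
# Create a sample and compress it with a large window (--long=31);
# archives like this are what trip up zstdgrep/ripgrep
printf 'alpha\nbeta\ngamma\n' > sample.txt
zstd -q -f --long=31 sample.txt -o sample.zst

# Workaround: stream-decompress with the matching flag, grep the pipe
zstd -dc --long=31 sample.zst | grep 'beta'
```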
17:37:42yasomi (yasomi) joins
18:17:01cdub quits [Remote host closed the connection]
18:17:41andrew quits [Client Quit]
18:48:20LeGoupil quits [Ping timeout: 252 seconds]
18:50:36andrew (andrew) joins
19:04:25andrew quits [Client Quit]
19:17:24andrew (andrew) joins
19:21:30sarayalth quits [Client Quit]
19:30:59andrew quits [Client Quit]
19:31:59LeGoupil joins
19:39:17LeGoupil quits [Client Quit]
19:45:38andrew (andrew) joins
19:46:57AmAnd0A quits [Read error: Connection reset by peer]
19:47:18AmAnd0A joins
19:51:04andrew quits [Client Quit]
19:59:13andrew (andrew) joins
20:20:46<kiska>fireonlive So... yeah influxdb js client is a bit of a pain :D
20:22:37<albertlarsan68>The most JS I have ever done (IIRC) was scroll the page on space/enter.
20:23:19<fireonlive>xD sounds like it
20:23:42<fireonlive>it's been on my list but it's a long list :|
20:23:56<fireonlive>what kinda issues are you seeing?
20:24:25<albertlarsan68>Anyway, I got to go to bed. See you :)
20:26:09<fireonlive>see ya :)
20:26:26<kiska>off to chatgpt :D
20:27:15<albertlarsan68>If The Lounge could implement that, it would be cool
20:27:46<kiska>fireonlive Anyway I have this data https://server8.kiska.pw/uploads/55b3a59470feb929/image.png
20:28:45<kiska>Well more accurately I have: https://paste.kiska.pw/TailorPneumatophore as an example
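For getting data like kiska's into InfluxDB without fighting the JS client, one option is writing line protocol straight to the 2.x HTTP API; a sketch (measurement, tag, and field names are invented, and the curl call is left commented since it needs a live server and token):

```shell
# Build one line-protocol point: measurement,tags fields timestamp
ts=$(date +%s)
point="tracker_stats,project=zippyshare items_done=12345i ${ts}"
echo "$point"

# Hypothetical write against InfluxDB 2.x (ORG/BUCKET/TOKEN are placeholders):
# curl -XPOST "http://localhost:8086/api/v2/write?org=ORG&bucket=BUCKET&precision=s" \
#      -H "Authorization: Token $TOKEN" --data-binary "$point"
```

Grafana can then query the bucket directly, sidestepping the client library entirely.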
20:43:34<masterX244>PredatorIWD: multi-day derives = when you manage to step onto some edge cases
20:52:22<fireonlive>ahh you're trying to do a fancy dashboard with uploader stats and such
20:52:32<fireonlive>wish i could help more :D
21:04:59<kiska>Yeah :(
21:09:21Matthww1 quits [Quit: The Lounge - https://thelounge.chat]
21:11:44<fireonlive>sadly it's not something i've touched much of in my travels yet :(
21:12:36<kiska>So I got the data into influx, make me a dashboard?
21:13:16<kiska>Here be dash https://grafana3.kiska.pw/d/c6f943a8-e03b-49b2-ac47-ff2e92e371b9/archiveteam-tracker-stats?orgId=1
21:16:21<kiska>I've added 3 visualisations
21:16:38<kiska>https://server8.kiska.pw/uploads/5f8022cdc7849be5/image.png Progress! :D
21:18:53<kiska>Everyone has editor, but you can't save it. If you make more panels, just grab the dashboard json and upload to paste.kiska.pw where I can import it
21:19:05<kiska>Well interesting enough panels
21:21:11Matthww1 joins
21:22:02PredatorIWD quits [Client Quit]
21:22:17PredatorIWD joins
21:23:52<PredatorIWD>masterX244: I mean, I guess 160k pdf pages being derived is indeed an edge case lol
21:29:48<masterX244>yeah, unintentional benchmark... luckily you didn't manage to brick an item due to a bug. managed to lose one when --checksum failed repeatedly and a few tens of thousands of archive tasks were queued up
21:42:32savethestuffyo quits [Quit: WeeChat 3.8]
21:45:11icedice (icedice) joins
21:51:15<@JAA>lol, my Firefox got killed after it ran for quite a long time. After the restart minutes later: 'It looks like you haven't started Firefox in a while.'
21:56:26<imer>JAA: re: ia-upload skipping existing files, any idea if -c works with --keep-directories? since it doesnt seem to, uploaded a bunch of dupes from the looks of it :/
21:57:10<imer>(should I delete the upload and redo it, or does IA still keep that stored even if I do?)
21:57:40<imer>made sure there was no derive running as well
21:58:17<imer>also ia upload is too crashy, really should have a flag for "just retry please, don't crash"
22:01:06cdub joins
22:02:35<masterX244>imer: sometimes a CSV upload and then zapping the already-done part is easier than fiddling out redoability stuff
22:03:33<imer>I was trying to be lazy and avoid any of that tbh (assuming things actually work this would be fine)
22:05:29<@JAA>imer: Looks like you get now why I mostly no longer use the CLI for uploading.
22:05:34<fireonlive>i think? deleting an item just makes it dark right?
22:05:49<imer>JAA: what's the alternative?
22:05:53<fireonlive>but deleting the files inside it actually deletes the files... maybe lol
22:06:00<fireonlive>i was kinda scratching my head looking at the logs
22:06:06<@JAA>I wrote my own script.
22:06:08<imer>can i use other s3 clients?
22:06:25<fireonlive>ooh can it recurse into subdirectories too lol
22:06:25<imer>would such a script be available online
22:06:30<@JAA>Maybe? No idea how well that would work.
22:06:40<@JAA>Yes, ia-upload-stream in my little-things repo.
22:06:48<imer>cheers, will check that
22:07:01<fireonlive>https://gitea.arpa.li/JustAnotherArchivist/little-things/src/branch/master/ia-upload-stream
22:07:04<fireonlive>cc imer
22:07:17<@JAA>The name comes from, well, streaming data directly to IA using multi-part uploads.
22:07:20<fireonlive>save you the typing :p
22:07:30Dallas (Dallas) joins
22:07:36<imer>thanks lol, was already on the page about to scroll down
22:07:44<imer>to look for the file*
22:07:52<fireonlive>:)
22:07:55<@JAA>Although it uses normal uploads these days when a file doesn't exceed the configured buffer size.
22:08:33<@JAA>All of this is too on-topic for this channel though. :-P
22:09:31<fireonlive>:3 true
22:14:51cdub quits [Ping timeout: 258 seconds]
22:58:02Hackerpcs quits [Quit: Hackerpcs]
23:00:04Hackerpcs (Hackerpcs) joins
23:14:56<nicolas17>fireonlive: apparently yesterday there were 30 fics about the oceangate submarine on ao3 already
23:21:17Doranwen facepalms at that
23:22:02<Doranwen>A bit in poor taste, imho, but people will write fic for *anything*.
23:22:06<fireonlive>lmfao wow
23:23:01<Doranwen>The 11 foot 8 bridge spawned some great fics one Yuletide - but afaik that one never killed anyone, just hit people in their wallets paying for all the repairs to stuff they'd damaged.
23:23:20<nicolas17>Doranwen: I have seen some *very* dark humor about this, which is probably in poorer taste than writing a fic :P
23:23:32<Doranwen>Yeah, I'll bet.
23:23:34<@JAA>I'm slightly surprised there doesn't appear to be anything on PornHub yet.
23:24:19<nicolas17>https://cdn.discordapp.com/attachments/228784648367505410/1121172859436413200/355431350_6227346300683322_9077213609648739438_n.png
23:25:22<@JAA>'One sinking sub is called The Titan, what do you call a fleet of sinking subs?' https://old.reddit.com/r/Jokes/comments/14fyxfi/one_sinking_sub_is_called_the_titan_what_do_you/
23:25:40<fireonlive>honestly reading how they died and how it happened like pretty quickly (5 days ago now?) i’d have loved to have been on that sub
23:26:07<fireonlive>implosion, basically instant death? sounds amazing
23:26:18<fireonlive>plus you got to like, see some underwater things i guess too
23:26:30<fireonlive>jaa: haha
23:27:13<@JAA>Yeah, definitely beats floating on the surface of the ocean, slowly suffocating and being unable to open the sub from the inside.
23:27:40<nicolas17>https://cdn.discordapp.com/attachments/991416672634286210/1121601410245607484/FB_IMG_1687480990966.jpg
23:27:40<fireonlive>for sure
23:27:59<fireonlive>turns out the oxygen countdown was unneeded but did provide some good meme material
23:28:15<@JAA>nicolas17: Hehe, yeah, there have been various comments along that line. 'Now the next company gets to offer a two-in-one tour!' etc.
23:28:21<myself>at those pressures you stop being biology and briefly become physics, all in all not a bad way to go
23:28:31<nicolas17>https://twitter.com/TheCumpound/status/1671878568773660672
23:29:32<fireonlive>there was that one guy who always took vacations to places after they had terrorist attacks
23:29:43<fireonlive>because everything was dirt cheap after they tried to recover tourism etc
23:31:47<nicolas17>JAA: https://twitter.com/Phineas/status/1566425438376828928 note the tweet date
23:32:35<fireonlive>https://www.bbc.com/news/business-65981876
23:32:53<fireonlive>Elon Musk and Mark Zuckerberg agree to hold cage fight
23:33:36<fireonlive>please let this happen
23:34:08<@JAA>nicolas17: Nice
23:42:51<fireonlive>https://i.imgur.com/q2Uyyib.jpg
23:43:58BlueMaxima joins