00:00:08 missaustraliana joins
00:33:54 missaustraliana quits [Client Quit]
00:40:53 icedice quits [Client Quit]
00:45:47 wickedplayer494 quits [Read error: Connection reset by peer]
00:49:13 wickedplayer494 joins
01:16:15 missaustraliana joins
01:18:52 Mateon2 joins
01:19:20 Mateon1 quits [Ping timeout: 240 seconds]
01:19:20 Mateon2 is now known as Mateon1
01:27:20 wickedplayer494 quits [Ping timeout: 240 seconds]
01:28:02 wickedplayer494 joins
01:41:45 Irenes quits [Ping timeout: 272 seconds]
01:47:50 kitonthe2et quits [Ping timeout: 240 seconds]
01:55:38 Irenes (ireneista) joins
02:08:10 Earendil7_ (Earendil7) joins
02:08:21 Earendil7 quits [Ping timeout: 272 seconds]
02:15:46 kitonthe2et joins
02:20:23 kitonthe2et quits [Ping timeout: 272 seconds]
02:29:13 missaustraliana quits [Client Quit]
03:01:17 kitonthenet joins
03:45:28 missaustraliana joins
03:47:39 BlueMaxima joins
03:59:56 Shjosan quits [Quit: Am sleepy (-, – )…zzzZZZ]
04:00:33 Shjosan (Shjosan) joins
04:03:20 Dango360_ joins
04:04:20 Dango360 quits [Ping timeout: 240 seconds]
04:30:51 Ruthalas59 quits [Ping timeout: 272 seconds]
04:39:20 kitonthenet quits [Ping timeout: 240 seconds]
04:42:08 missaustraliana quits [Client Quit]
04:47:29 Ruthalas59 (Ruthalas) joins
05:49:00 DogsRNice quits [Read error: Connection reset by peer]
06:08:39 missaustraliana joins
06:14:47 Island quits [Read error: Connection reset by peer]
06:18:14 kitonthenet joins
06:22:50 kitonthenet quits [Ping timeout: 240 seconds]
06:23:20 benjins2_ quits [Ping timeout: 240 seconds]
06:24:01 benjinsm joins
06:24:13 benjins quits [Ping timeout: 272 seconds]
06:24:20 benjins2 joins
06:52:43 ehmry quits [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
06:52:57 ehmry joins
07:04:22 missaustraliana quits [Client Quit]
07:33:52 benjins joins
07:34:19 benjins2_ joins
07:35:09 benjins2 quits [Ping timeout: 272 seconds]
07:35:09 benjinsm quits [Ping timeout: 272 seconds]
07:55:47 imer quits [Killed (NickServ (GHOST command used by imer4))]
07:55:54 imer (imer) joins
08:03:39 jacksonchen666 (jacksonchen666) joins
08:10:00 jacksonchen666 quits [Client Quit]
08:11:44 BlueMaxima quits [Read error: Connection reset by peer]
08:13:09 le0n quits [Ping timeout: 272 seconds]
08:28:10 missaustraliana joins
08:31:47 le0n (le0n) joins
09:08:21 missaustraliana quits [Client Quit]
09:14:33 missaustraliana joins
09:19:44 <missaustraliana> im chucking studio 10 back down the tube as there were a few issues. but this time im going to upload it as an item
09:21:23 <missaustraliana> 890 videos i have got locally. praying network doesnt die and/or i run out of storage
09:21:59 <flashfire42> missaustraliana usually I dont recommend tubeup but that may be your best choice for uploading those items
09:22:17 <missaustraliana> oh?
09:22:42 <missaustraliana> im using that py script on the wiki
09:22:59 <flashfire42> If you are gonna insist on uploading as items despite the fact they are going through down the tube you may as well use tubeup to make sure they go through as proper items.
09:23:49 <missaustraliana> oh bahaha im not throwing it down the tube i was just saying. im using tubeup and the py script
09:23:58 <missaustraliana> sorry for not clarifying
09:25:40 <missaustraliana> flashfire42 i passed it the channel, will it make separate items for each video or will it be one big item?
09:26:21 <flashfire42> I thought you meant #down-the-tube rather than using the turn of phrase. #down-the-tube will mean it is accessible through wayback
09:26:58 <missaustraliana> yeah nah i dont want that
09:27:16 <missaustraliana> i want it chucked into my profile
09:28:55 <missaustraliana> but will it? i want it to have a separate item for each video
09:30:06 missaustraliana quits [Client Quit]
09:30:23 missaustraliana joins
09:43:20 bogsen quits [Ping timeout: 240 seconds]
09:44:52 missaustraliana quits [Client Quit]
10:00:05 Bleo1826 quits [Client Quit]
10:01:20 Bleo1826 joins
10:12:36 bogsen (bogsen) joins
10:30:29 missaustraliana joins
11:05:17 kitonthe1et joins
11:15:33 kitonthe1et quits [Ping timeout: 272 seconds]
11:29:05 missaustraliana quits [Client Quit]
11:57:46 bogsen quits [Client Quit]
12:00:48 bogsen (bogsen) joins
12:55:00 ScenarioPlanet (ScenarioPlanet) joins
12:55:50 Arcorann quits [Ping timeout: 240 seconds]
12:58:20 BearFortress quits [Ping timeout: 240 seconds]
13:26:47 kitonthe1et joins
13:31:43 kitonthe1et quits [Ping timeout: 272 seconds]
13:45:01 Dango360_ quits [Ping timeout: 272 seconds]
13:46:30 Dango360 (Dango360) joins
15:05:27 bogsen quits [Ping timeout: 272 seconds]
15:37:30 bogsen (bogsen) joins
15:38:19 kitonthenet joins
15:43:27 kitonthenet quits [Ping timeout: 272 seconds]
15:52:31 rktk (rktk) joins
16:05:15 ScenarioPlanet quits [Client Quit]
16:17:39 bogsen quits [Ping timeout: 272 seconds]
16:29:08 Island joins
16:52:59 Dango360_ joins
16:53:07 Dango360 quits [Ping timeout: 272 seconds]
17:11:46 DogsRNice joins
17:26:22 icedice (icedice) joins
17:27:19 Gereon9 quits [Ping timeout: 272 seconds]
17:29:31 bogsen (bogsen) joins
17:31:36 icedice quits [Client Quit]
17:35:30 LeGoupil joins
17:54:09 kitonthe2et joins
17:56:30 LeGoupil quits [Client Quit]
18:05:50 Gereon9 (Gereon) joins
18:11:02 <that_lurker> https://news.ycombinator.com/item?id=38568076
18:11:46 <that_lurker> Arch Linux completed the bug tracker migration to GitLab. The old one will be closed.
18:29:24 <@JAA> Looks like they already borked the bug list, so only the pages for individual bugs are remaining. Perhaps that static conversion already happened.
18:37:53 lennier2 joins
18:40:20 lennier2_ quits [Ping timeout: 240 seconds]
18:47:50 aninternettroll quits [Ping timeout: 240 seconds]
18:52:49 aninternettroll (aninternettroll) joins
19:12:01 bjorn3 joins
19:15:14 bjorn3 quits [Client Quit]
19:15:53 bjorn3 joins
19:18:05 <bjorn3> i've been looking for software to archive several sites for personal use. i found a bunch of options, but am not sure what would be the best.
19:18:56 <@JAA> bjorn3: grab-site is what I'd recommend. If the sites are heavily relying on scripting, it gets much harder.
19:19:23 <bjorn3> does it support resuming an archive operation after i restart my system?
19:19:47 <@JAA> It does not.
19:20:25 <@JAA> Long-standing issue that's not easy to fix: https://github.com/ArchiveTeam/grab-site/issues/58
19:21:20 <bjorn3> i've only got a laptop which i shut down every day (and loses internet access whenever i travel to uni) and i don't know if i will be able to archive one of the sites i want in a single day.
19:23:36 <bjorn3> i don't need js support at all by the way. they are old school web 1.0 sites that i want to archive. some barely use css even.
19:26:30 mr_sarge quits [Read error: Connection reset by peer]
19:27:10 <@JAA> Right. I can't recommend anything for that setup I'm afraid. It might be possible with grab-site if you can use sleep or hibernation rather than shutting down (by first setting the delay between requests to a large number and then putting it into sleep while it's waiting, ideally), but I'm not sure. All my archival stuff runs on hardware that's always on.
19:27:38 <@JAA> Alternatively, we could archive them through ArchiveBot and you could download a copy of the WARCs from IA to your local storage later.
19:30:14 <bjorn3> i guess i will give grab-site a go.
19:31:05 <bjorn3> does grab-site actually need python 3.8 as the readme seems to indicate or does 3.11 also work?
19:31:19 <bjorn3> debian bookworm ships with 3.11
19:39:16 <bjorn3> i think it really does need 3.8. got an exception thrown.
19:40:17 <@JAA> Yeah, it does require 3.7 or 3.8.
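[editor's note] grab-site's narrow version requirement (3.7 or 3.8, per the exchange above) can be checked up front rather than waiting for an exception at runtime. A minimal sketch, assuming only the version bounds mentioned in this conversation; the function name is invented for illustration:

```python
import sys

def grab_site_supported(version_info=None):
    """Return True if the interpreter version is in grab-site's 3.7-3.8 range."""
    major, minor = (version_info or sys.version_info)[:2]
    return (major, minor) in {(3, 7), (3, 8)}

# Debian bookworm's Python 3.11 is rejected; a pyenv-built 3.8 passes.
print(grab_site_supported((3, 11, 2)))  # False
print(grab_site_supported((3, 8, 18)))  # True
```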
19:40:55 <@JAA> I made a container image a while ago, if that's of any use to you. Though it comes without the web UI.
19:42:35 <thuban> JAA: idk whether you saw this when i mentioned it earlier, but grab-site's pip install is currently broken due to fb-re2
19:42:51 <bjorn3> i should be able to figure it out. just tried to avoid installing pyenv if possible.
19:43:05 <@JAA> thuban: I saw, and my container build has further issues on top of that. :-|
19:43:12 <thuban> nice
19:43:27 <thuban> well, google-re2 should work as a drop-in
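[editor's note] google-re2 can stand in for fb-re2 because both wrap the same RE2 engine behind an `re`-style Python API. A small sketch of that drop-in property, falling back to the stdlib `re` module so it runs even where neither wrapper is installed:

```python
try:
    import re2 as re  # google-re2's Python wrapper, largely re-compatible
except ImportError:
    import re  # stdlib fallback so the sketch still runs

# The same compile/search calls work regardless of which module was imported.
pattern = re.compile(r"libabsl(\d+)")
match = pattern.search("i have libabsl20220623 as package")
print(match.group(1))  # 20220623
```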
19:43:29 <@JAA> bjorn3: pyenv is your friend. :-)
19:44:14 <bjorn3> rustup is my friend already :)
19:45:00 <bjorn3> (if i understand correctly what pyenv is, rustup is the same except for rust)
19:45:22 @JAA is unfamiliar with the Rust ecosystem, so no idea.
19:45:28 <@JAA> It's equivalent to rbenv in Ruby. :-P
19:46:06 <@JAA> Yeah, rustup sounds very similar.
19:47:21 <bjorn3> is pyenv compiling python from scratch?
19:47:34 <bjorn3> rustup downloads precompiled versions.
19:48:41 <@JAA> Yes, it compiles them.
19:49:48 <bjorn3> that would be a nightmare for rustc. especially if you aren't using the precompiled LLVM builds.
19:50:26 <bjorn3> didn't have any issues with fb-re2 when installing.
19:51:31 <thuban> older version of abseil, i suppose?
19:52:14 <bjorn3> don't have any debian package with abseil in the name installed.
19:53:07 <thuban> might be named libabsl or sth
19:53:35 <bjorn3> i have libabsl20220623 as package
19:54:52 <thuban> yeah, older. fb-re2 is behind as of 20220706 https://github.com/abseil/abseil-cpp/commit/97ab3dcfd6490434202e4ab00b2eaba9449e42a1
19:56:09 <bjorn3> got grab-site working
19:56:57 Megame (Megame) joins
19:57:08 icedice (icedice) joins
20:02:17 <@JAA> bjorn3: If there's something that's not just of personal interest/importance, please feel free to let us know anyway for running through AB. That means it becomes available in the Wayback Machine and is thus useful to others, too.
20:33:56 BlueMaxima joins
20:37:18 <bjorn3> sure, will let you all know if i find anything that could be relevant for others in my bookmarks.
20:39:38 VerifiedJ quits [Remote host closed the connection]
20:40:10 VerifiedJ (VerifiedJ) joins
20:53:30 <bjorn3> how do i add a regex to the set of ignored urls? --igsets only accepts one of the builtin ignore sets afaict from the readme.
20:53:46 c3manu (c3manu) joins
20:54:39 <pokechu22> For ArchiveBot you can't add a single regex before the job starts; you have to add it afterwards. Not sure if grab-site is the same way or not
20:56:25 <bjorn3> i see
21:00:59 <thuban> bjorn3: to add ignores before starting the job, write them to a file and use --import-ignores
21:01:26 <bjorn3> does that overwrite --igsets? or append to it?
21:01:40 <thuban> append
21:01:41 <@JAA> And after the job is started, you can edit the 'ignores' file with one pattern per line.
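[editor's note] As a sketch of how those patterns behave: grab-site ignore patterns are regexes applied to each discovered URL, and a URL matching any of them is skipped. The file contents and URLs below are hypothetical examples, assuming stdlib-`re` search semantics:

```python
import re

# Hypothetical lines from an 'ignores' file, one regex per line.
ignore_lines = [
    r"\?replytocom=",  # e.g. WordPress comment-reply links
    r"/tag/",          # e.g. tag listing pages
]
patterns = [re.compile(line) for line in ignore_lines]

def is_ignored(url):
    """True if any ignore pattern matches anywhere in the URL."""
    return any(p.search(url) for p in patterns)

print(is_ignored("https://example.com/post/1?replytocom=5"))  # True
print(is_ignored("https://example.com/post/1"))               # False
```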
21:02:50 <bjorn3> thanks!
21:20:23 bjorn3 quits [Ping timeout: 272 seconds]
21:21:31 bjorn3 (bjorn3) joins
21:38:19 bjorn3 quits [Remote host closed the connection]
21:42:09 T31M quits [Quit: ZNC - https://znc.in]
21:42:31 T31M joins
21:50:45 c3manu quits [Remote host closed the connection]
22:47:54 dumbgoy joins
22:48:14 dumbgoy quits [Client Quit]
22:58:51 <fireonlive> https://twitter.com/w3cdevs/status/1732669269526151418 < oh hey they also moved but didn't lock their account lol
22:58:51 <eggdrop> nitter: https://nitter.net/w3cdevs/status/1732669269526151418
23:29:31 <pabs> Google announces April 2024 shutdown date for Google Podcasts: https://arstechnica.com/gadgets/2023/12/google-podcasts-dies-april-2024-youtube-music-migration-tool-goes-live/ https://news.ycombinator.com/item?id=38573378
23:41:07 systwi_ quits [Quit: systwi_]
23:41:08 nothere quits [Quit: Leaving]
23:56:19 nothere joins