00:29:12GradientCat (GradientCat) joins
00:36:43IDK quits [Quit: Connection closed for inactivity]
00:42:45sec^nd quits [Remote host closed the connection]
00:43:10sec^nd (second) joins
00:44:32fangfufu quits [Client Quit]
00:44:44fangfufu joins
00:49:33etnguyen03 (etnguyen03) joins
01:01:49qwertyasdfuiopghjkl2 quits [Ping timeout: 260 seconds]
01:03:01fangfufu quits [Client Quit]
01:06:42fangfufu joins
01:14:22qwertyasdfuiopghjkl2 joins
01:14:49qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:15:18qwertyasdfuiopghjkl2 joins
01:15:43qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:16:42qwertyasdfuiopghjkl2 joins
01:17:07qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:17:46qwertyasdfuiopghjkl2 joins
01:18:12qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:18:56qwertyasdfuiopghjkl2 joins
01:19:21qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:19:53qwertyasdfuiopghjkl2 joins
01:20:19qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:20:58qwertyasdfuiopghjkl2 joins
01:21:23qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:21:46qwertyasdfuiopghjkl2 joins
01:22:12qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:23:07qwertyasdfuiopghjkl2 joins
01:23:32qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:24:05qwertyasdfuiopghjkl2 joins
01:24:30qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:25:35qwertyasdfuiopghjkl2 joins
01:26:01qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:27:01qwertyasdfuiopghjkl2 joins
01:27:26qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:27:44qwertyasdfuiopghjkl2 joins
01:28:10qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:29:13qwertyasdfuiopghjkl2 joins
01:29:38qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:30:07qwertyasdfuiopghjkl2 joins
01:30:33qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:31:07qwertyasdfuiopghjkl2 joins
01:31:33qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:32:26qwertyasdfuiopghjkl2 joins
01:32:51qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:33:25qwertyasdfuiopghjkl2 joins
01:33:51qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:34:26qwertyasdfuiopghjkl2 joins
01:34:52qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:35:35qwertyasdfuiopghjkl2 joins
01:36:01qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:36:45qwertyasdfuiopghjkl2 joins
01:44:56Wohlstand quits [Quit: Wohlstand]
01:51:06qwertyasdfuiopghjkl21 joins
01:51:31qwertyasdfuiopghjkl21 quits [Max SendQ exceeded]
01:51:59qwertyasdfuiopghjkl21 joins
01:52:25qwertyasdfuiopghjkl21 quits [Max SendQ exceeded]
01:53:09qwertyasdfuiopghjkl2 quits [Ping timeout: 260 seconds]
01:54:26qwertyasdfuiopghjkl2 joins
01:54:52qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:55:35qwertyasdfuiopghjkl2 joins
01:56:00qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:56:16qwertyasdfuiopghjkl2 joins
01:56:42qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:56:59qwertyasdfuiopghjkl2 joins
01:57:24qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:57:45qwertyasdfuiopghjkl2 joins
01:58:11qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:58:53qwertyasdfuiopghjkl2 joins
01:59:19qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
01:59:39qwertyasdfuiopghjkl2 joins
02:00:05qwertyasdfuiopghjkl2 quits [Max SendQ exceeded]
02:00:37qwertyasdfuiopghjkl2 joins
02:12:31etnguyen03 quits [Client Quit]
02:12:52etnguyen03 (etnguyen03) joins
02:13:02<h2ibot>PaulWise edited ArchiveBot/Ignore (-163, merge Flickr ignores into one /cc pokechu22): https://wiki.archiveteam.org/?diff=56643&oldid=56617
02:13:58<pokechu22>pabs: I generally do those ignores from memory, but if you're looking it up each time that makes sense
02:14:55<pabs>yeah I find copy-paste easier. a single line also means I can add a /flickr command to my IRC client :)
02:19:26dabs quits [Read error: Connection reset by peer]
02:27:32<@JAA>Re [[Patreon]] edit, I don't think we should have anything like a recommendation to use webrecorder tools for archival on the wiki either.
02:34:31<hexagonwin_>Is webrecorder bad? I find it pretty useful for saving some pages that need auth, for small personal archiving.
02:34:37etnguyen03 quits [Remote host closed the connection]
02:35:13<@JAA>None of their tooling seems to produce valid WARCs, and they have repeatedly shown not to care about complying with the standard.
02:35:29<hexagonwin_>I see. That's terrible :/
02:35:39<@JAA>https://wiki.archiveteam.org/index.php/The_WARC_Ecosystem has some links to specific problems.
02:37:15<@OrIdow6>JAA: I wouldn't call that edit an endorsement so much as someone noting information they found out
02:37:22<hexagonwin_>Is there some alternative for doing what webrecorder does (saving the current browser page)? That page seems to only recommend wget-at and grab-site, which don't really work for that
02:37:39<@OrIdow6>We have tons of pages that link to some Python tool that doesn't produce WARCs at all but just dumps to JSON or whatever
02:38:08<hexagonwin_>I was thinking webrecorder's still better than something like SingleFile that discards everything but the current DOM
02:38:08<@JAA>OrIdow6: Those tools don't pretend to produce 'high-fidelity web archives' though...
02:39:02<@JAA>hexagonwin_: Well, it's impossible with a browser extension. You need something with a MITM proxy like brozzler + warcprox.
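A minimal sketch of that MITM-proxy setup, assuming warcprox's documented defaults (it generates its own CA certificate on first run; verify the flag names against warcprox --help for your version):

    pip install warcprox
    # listen on port 8000, write WARCs to ./warcs
    warcprox -p 8000 -d ./warcs
    # in another shell: send any HTTP client through the proxy;
    # -k accepts warcprox's self-signed MITM certificate
    curl -k -x http://localhost:8000 https://example.com/

Everything the client fetches through the proxy gets written out as WARC records, browser or not.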
02:39:04pabs tests to see if archive.today works
02:39:36<hexagonwin_>Also I've recently found Firefox's devtools have a "Save All As HAR" feature; maybe that can have enough detail, like a WARC?
02:39:44<@JAA>Nope
02:39:50<hexagonwin_>I see :/
02:40:00<@JAA>I'm not aware of any browser API that exposes the details needed for WARC.
02:40:50<pabs>oh, archive.today works with Patreon, nice
02:41:01<@JAA>Specifically, WARC needs the exact bytes sent by the server, not a parsed representation of the headers that discards whitespace or a decoded body.
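For illustration, a response record inside a WARC file embeds the verbatim HTTP response, status line and header bytes included (simplified excerpt; the values here are placeholders):

    WARC/1.1
    WARC-Type: response
    WARC-Target-URI: https://example.com/
    WARC-Date: 2025-07-31T02:41:00Z
    Content-Type: application/http;msgtype=response
    Content-Length: 123

    HTTP/1.1 200 OK
    Content-Type: text/html
    <raw body bytes exactly as received>

A HAR file instead stores headers as parsed JSON name/value pairs and the body re-encoded as text or base64, so the original wire bytes cannot be reconstructed from it.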
02:41:12<@OrIdow6>JAA: How about, link WebRecorder in the prose, and have "WebRecorder" be a redirect to the row in the WARC Ecosystem table where we "don't recommend" it?
02:41:25<@OrIdow6>I think it's still useful info to have because it shows the scope of the Cloudflare block
02:42:25<steering>if you just want to archive patreon content... just use gallery-dl, archiving the webpages is overkill... but if you want to archive patreon, the webpages... a proper WARC is more relevant
02:43:21<steering>"dump to JSON" is the former, not really the same as archiving the pages, IMHO anyway
02:44:05<pabs>gallery-dl probably can't save posts like https://www.patreon.com/posts/libreqos-v1-5-1-106946615 ?
02:44:05<h2ibot>PaulWise edited Patreon (+56, update wording, add mnbot and archive.is): https://wiki.archiveteam.org/?diff=56644&oldid=56642
02:44:27<pabs>JAA: ^ hopefully better
02:46:45hexagonwin_ is now known as hexagonwin
02:47:28<steering>pabs: mmh, seems not, although I'm not sure why, considering that it does save text of posts-with-images along with the rest of the metadata
02:48:04<steering>my point stands though, it's not about gallery-dl but rather about the difference between archiving webpage content and archiving webpages :P
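For reference, a basic gallery-dl invocation against a creator page looks like this (a sketch; paid posts generally require exporting your logged-in session cookies first):

    # download media plus per-post JSON metadata for one creator
    gallery-dl --cookies cookies.txt --write-metadata https://www.patreon.com/dtaht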
02:48:06<@JAA>pabs: Yeah, that's better. :-)
02:48:28<pabs>ack
02:48:33<@JAA>pabs++
02:48:33<eggdrop>[karma] 'pabs' now has 110 karma!
02:48:58pabs tries to figure out browser-based mass archive.today saves...
02:49:12<hexagonwin>btw, just got curious. if WARC saves the exact bytes from the server, for HTTPS sites can someone validate the WARC hasn't been tampered with (MITMed or just edited later)?
02:49:48<@JAA>No
02:50:06<h2ibot>OrIdow6 edited CuriousCat (+67, /* 2024 grab */ Clarify slightly *why* you need…): https://wiki.archiveteam.org/?diff=56645&oldid=55915
02:50:22<steering>I don't think that would even be possible thanks to DH.
02:50:22<@JAA>For a start, it stores the HTTP bytes, not TLS. But even if it did, that wouldn't allow for any verification.
02:51:05<@OrIdow6>We need a FAQ on this
02:51:05<@JAA>TLS works by establishing a symmetric encryption key between client and server. There's no signature by the server or similar.
02:53:31<nicolas17>hexagonwin: TLS packets are authenticated, but unfortunately with the way TLS works, both client and server (and your MITM tool) have the key needed to forge the auth hash
02:53:57<@JAA>Yeah, this kind of thing simply wasn't a part of the design goals for TLS.
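A rough illustration of the problem, using a plain HMAC as a stand-in for TLS record authentication (a simplification; this is not the actual TLS record protocol):

    # client, server, and any MITM proxy holding the session secret derive the same MAC key,
    # so a valid tag only proves that *someone with the key* produced these bytes
    echo -n 'HTTP/1.1 200 OK' | openssl dgst -sha256 -hmac "$SESSION_KEY"

Either endpoint can recompute a valid tag over altered data, so archived TLS records cannot prove to a third party what the server actually sent.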
02:55:23<hexagonwin>I see.. I have almost zero knowledge on this subject lol. Then for example, there's no way to verify the WARCs from various archiveteam warriors weren't tampered with by a bad actor?
02:55:25<@JAA>There was that proposal by Google (I think?) for storing website content elsewhere with a signature. Not sure anything ever came of that though.
02:56:07@JAA can't remember the name.
02:57:35<nicolas17>damn I should make a pcap-to-warc tool some day >.>
02:58:11<@JAA>Yeah, not the first time that idea has come up. Needs the TLS (pre-)master secrets, too, though.
03:02:14ineffyble quits [Read error: Connection reset by peer]
03:02:31<steering>JAA: pretty sure that was AMP-related.
03:02:49<steering>https://web.dev/articles/signed-exchanges
03:03:09<nicolas17>yeah, they wanted to serve other people's webpages from their CDN such that the browser still associated it with the original domain
03:04:07<steering>supported by... chromium browsers only :P https://caniuse.com/?search=sxg
03:04:25<@JAA>Ah yeah, signed exchanges
03:04:42<hexagonwin>i guess it doesn't really mean much when no web server implements/uses it
03:05:11<steering>IDK if it even needs web server support. I think it's "just a file".
03:05:30<nicolas17>yeah but needs websites to provide those files, does any?
03:05:34<steering>But yeah ~only Google has ever used it I think
03:05:37<hexagonwin>i mean like no website does it
03:05:53<@JAA>The website would need a key pair/certificate to do the signatures etc.
03:06:23<nicolas17>I once saw a document for how clients A and B can cooperate so client B makes a request on behalf of A such that A can be sure it really came from the server
03:06:28<steering>IDK if Google ever actually started using it, I assume they did, I don't remember what they ended up doing with AMP. That was a whole thing.
03:06:48<nicolas17>but it needed A and B to have real-time interaction during the process so it doesn't really help us for DPoS
03:06:49<@JAA>Looks like Buttflare has some support for it, too: https://developers.cloudflare.com/speed/optimization/other/signed-exchanges/
03:07:57<steering>I assume SXGs aren't allowed to be long-lived anyway so probably not of use for archiving
03:08:49<@JAA>> A SXG may be valid for at most 7 days.
03:09:02<steering>The web.dev link says Google caches them and serves them in search results, so presumably *some* people are generating them.
03:10:18<@JAA>But if the certificates get archived, you could still verify the signature after expiry. Assuming the website's private key for the certificate didn't get leaked, that should still serve as verification, I guess.
03:10:18<steering>> Note: Production use of SXGs requires a certificate that supports the CanSignHttpExchanges extension. Per spec, certificates with this extension must have a validity period no longer than 90 days and require that the requesting domain have a DNS CAA record configured.
03:10:25<steering>Thanks for making sure I'm safe, daddy Google. uwu.
03:10:53<@JAA>And those certs should be in CT, I suppose.
03:11:16<steering>Well, yes, you could do it manually certainly. Mostly I mean the browser would start rejecting them
03:11:33<@JAA>Oh, sure, this would be custom tooling anyway.
03:11:35<steering>you can't just "serve the SXG forever" or something, which would be nice
03:12:02<steering>(but also problematic if f.e. XSS in the SXG or something? dunno)
03:12:29<@JAA>grep CanSignHttpExchanges crt.sh
03:12:30<@JAA>:-P
03:14:00<pabs>this script seems to work for doing archive.is via browser, no captchas yet https://transfer.archivete.am/kiU37/archive.today.sh
03:14:01<eggdrop>inline (for browser viewing): https://transfer.archivete.am/inline/kiU37/archive.today.sh
03:15:50<pabs>(saving all of the posts on https://www.patreon.com/dtaht since he died)
03:19:16<pabs>with background tab loading https://transfer.archivete.am/142CFt/archive.today.sh
03:19:17<eggdrop>inline (for browser viewing): https://transfer.archivete.am/inline/142CFt/archive.today.sh
03:25:11<h2ibot>PaulWise edited Archive.today (+205, add url list archiving script): https://wiki.archiveteam.org/?diff=56646&oldid=56014
03:26:11<h2ibot>PaulWise edited Archive.today (-2, s/browser/firefox): https://wiki.archiveteam.org/?diff=56647&oldid=56646
03:26:40<cruller><nicolas17> "damn I should make a pcap-to-..." <- If I save the appropriate pcap file now, will I be able to convert it to a warc file in the future? (I'm not familiar with networks, so I apologise if this is a bad question.)
03:27:52DogsRNice quits [Read error: Connection reset by peer]
03:28:03<nicolas17>cruller: you would need the TLS session keys too; Firefox can save them if you set the SSLKEYLOGFILE environment variable pointing to a filename, I don't know about Chrome
03:28:18<pabs>sounds like it was at some point possible to verify a TLS session https://tlsnotary.org/ https://news.ycombinator.com/item?id=29090604
03:28:43<nicolas17>pabs: I think that's the one I remembered
03:29:18<nicolas17>multiparty computation between the client and the verifier when generating the session keys
03:31:20<nicolas17>so client and verifier need to cooperate in real time during the request, in a DPoS scenario that would be impractical
03:37:23<cruller>nicolas17: Thanks for your answer. Chrome and curl also seem to support SSLKEYLOGFILE. I'm not sure about wget.
03:37:51<nicolas17>I think Chrome and Firefox have their own TLS stack
03:38:05<nicolas17>but anything using OpenSSL (like curl and wget) should just work
03:39:36<nicolas17>Wireshark supports reading the keylog file and decrypting TLS packets
03:42:17<nicolas17>note that, as mentioned earlier, it's unfortunately not possible to verify whether the data is pristine from the server or was tampered with
03:42:29<steering>JAA: I dunno psql well enough to do that grep qq
03:42:51<steering>AFAIK `select id from certificate, lateral x509_extensions(certificate) as e(e) where '1.3.6.1.4.1.11129.2.1.22' = e limit 1;` but timeout
03:43:16<nicolas17>so getting it into wayback machine has the same trust issues as any other .warc
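A sketch of that keylog workflow (SSLKEYLOGFILE is honored by Firefox, Chrome, and curl; the tls.keylog_file preference is standard Wireshark/tshark; capture filters and privileges vary by system):

    # 1. capture traffic while recording the TLS session secrets
    export SSLKEYLOGFILE=/tmp/tlskeys.log
    tcpdump -w /tmp/session.pcap port 443 &
    curl https://example.com/ > /dev/null
    # 2. later, decrypt and inspect the capture with the saved secrets
    tshark -r /tmp/session.pcap -o tls.keylog_file:/tmp/tlskeys.log -Y http

A hypothetical pcap-to-warc converter would do step 2 programmatically and re-serialize the decrypted HTTP messages as WARC records.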
03:45:26cuphead2527480 quits [Quit: Connection closed for inactivity]
03:51:58nicolas17 quits [Quit: Konversation terminated!]
03:52:35nicolas17 joins
03:55:08<cruller>Regarding the integrity of WARC, is the only option at present to focus on "who created it and how"?
03:57:29<cruller>Also, is it possible to generate a warc file from an http(s) client that does certificate pinning? Sometimes I want to do that.
04:10:29GradientCat quits [Quit: Connection closed for inactivity]
04:12:11nicolas17 quits [Client Quit]
04:12:12Guest58 joins
04:12:27nicolas17 joins
04:15:27ThetaDev quits [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
04:15:54ThetaDev joins
04:16:41<cruller>I know some tricks for Android APK. https://docs.caido.io/tutorials/modifying_apk https://github.com/ReVanced/revanced-patches/commit/94ed738515aa6e1a1d346b85b54805e68e36f94c
04:20:21nine quits [Quit: See ya!]
04:20:34nine joins
04:20:34nine quits [Changing host]
04:20:34nine (nine) joins
04:28:12LddPotato quits [Read error: Connection reset by peer]
04:28:41<steering>cruller: yes, and no.
04:28:54LddPotato (LddPotato) joins
04:39:14<nicolas17>!remindme 10h archive Safari 18.6
04:39:16<eggdrop>[remind] ok, i'll remind you at 2025-07-31T14:39:14Z
04:39:25<cruller>I thought so. Things are not going well...
04:47:34midou quits [Ping timeout: 260 seconds]
04:57:29midou joins
05:55:33<h2ibot>TriangleDemon edited Deathwatch (+64, /* 2025 */): https://wiki.archiveteam.org/?diff=56648&oldid=56636
05:58:34<h2ibot>TriangleDemon edited Flipnote Hatena (-58): https://wiki.archiveteam.org/?diff=56649&oldid=56551
06:10:35<h2ibot>TriangleDemon edited Hatena (+491, /* Former */): https://wiki.archiveteam.org/?diff=56650&oldid=56572
06:11:36<h2ibot>TriangleDemon edited Hatena (+2, /* Former */): https://wiki.archiveteam.org/?diff=56651&oldid=56650
06:23:49HP_Archivist quits [Ping timeout: 260 seconds]
06:25:40Island quits [Read error: Connection reset by peer]
06:36:24awauwa (awauwa) joins
06:54:26<hexagonwin>arkiver: i've finished running my first batch of random search keywords, found 488562 blogs in total https://transfer.archivete.am/inline/a3VjJ/tistory_blogs.txt (please ping when you're available)
07:22:15ducky (ducky) joins
07:25:02ducky quits [Remote host closed the connection]
07:43:18ducky (ducky) joins
07:44:43HP_Archivist (HP_Archivist) joins
07:45:19ducky quits [Remote host closed the connection]
07:53:42ducky (ducky) joins
08:02:17ducky quits [Remote host closed the connection]
08:04:07ducky (ducky) joins
08:12:53ducky quits [Remote host closed the connection]
08:14:45ducky (ducky) joins
08:24:51Webuser616768 joins
08:25:57<Webuser616768>Can I ask, is it possible to reverse-proxy the warrior as a path rather than a subdomain? For example, looking to do homelab.dev/warrior, but it fails to load css/js or the info API call, etc. Tried many ways and can't get it to work. The closest I have is handle_path with caddy, which loads everything but the info API call correctly
08:38:59Dada joins
08:39:44<Webuser616768>handle_path /warrior/* {
08:39:44<Webuser616768>    reverse_proxy archiveteam-warrior:8001
08:39:44<Webuser616768>}
08:39:44<Webuser616768>@warrior header Referer https://homelab/warrior/
08:39:44<Webuser616768>handle @warrior {
08:39:44<Webuser616768>    reverse_proxy archiveteam-warrior:8001
08:39:44<Webuser616768>}
08:39:50<Webuser616768>This worked :) -- bit large but it works
08:40:51Wohlstand (Wohlstand) joins
08:41:30<Webuser616768>hmm, but the xhr_streaming API call occasionally fails, which freezes the web UI until a manual refresh
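One thing worth trying for that, assuming the warrior UI's xhr_streaming endpoint is a long-lived streamed response (SockJS-style): disable Caddy's response buffering on the proxied routes. flush_interval is a real reverse_proxy subdirective, though whether it fixes this particular freeze is untested:

    handle_path /warrior/* {
        reverse_proxy archiveteam-warrior:8001 {
            flush_interval -1
        }
    }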
08:46:45Webuser616768 quits [Client Quit]
08:48:29HP_Archivist quits [Ping timeout: 260 seconds]
08:56:39anonymoususer852 quits [Ping timeout: 260 seconds]
08:56:58fangfufu quits [Quit: ZNC 1.8.2+deb3.1+deb12u1 - https://znc.in]
08:57:09fangfufu joins
08:58:29anonymoususer852 (anonymoususer852) joins
09:00:29ducky quits [Remote host closed the connection]
09:07:31Exorcism|irc quits [Quit: Ping timeout (120 seconds)]
09:07:40DigitalDragons quits [Quit: Ping timeout (120 seconds)]
09:08:20Exorcism|irc (exorcism) joins
09:08:21DigitalDragons (DigitalDragons) joins
09:10:42ducky (ducky) joins
09:13:23Exorcism|irc quits [Client Quit]
09:13:47DigitalDragons quits [Client Quit]
09:15:21<that_lurker>Would have been a lot easier to just make a subdomain for it. Though if they used the local address, they would need a DNS server
09:15:59Exorcism|irc (exorcism) joins
09:16:26DigitalDragons (DigitalDragons) joins
09:18:03BornOn420 quits [Remote host closed the connection]
09:18:38BornOn420 (BornOn420) joins
09:18:46pseudorizer quits [Quit: ZNC 1.10.1 - https://znc.in]
09:19:33ducky quits [Remote host closed the connection]
09:20:22pseudorizer (pseudorizer) joins
09:21:14ducky (ducky) joins
09:23:01ducky quits [Remote host closed the connection]
09:32:08ducky (ducky) joins
09:45:04HP_Archivist (HP_Archivist) joins
09:54:11yano quits [Quit: WeeChat, https://weechat.org/]
09:54:24ducky quits [Remote host closed the connection]
09:54:33ducky (ducky) joins
09:55:54ducky quits [Remote host closed the connection]
09:56:50yano (yano) joins
09:58:56BornOn420 quits [Remote host closed the connection]
09:58:56sec^nd quits [Read error: Connection reset by peer]
09:59:18sec^nd (second) joins
09:59:53BornOn420 (BornOn420) joins
10:10:44tertu2 quits [Ping timeout: 260 seconds]
10:16:01ducky (ducky) joins
10:16:46tertu (tertu) joins
10:25:00monoxane (monoxane) joins
10:48:14HP_Archivist quits [Ping timeout: 240 seconds]
11:00:01Bleo182600722719623455222 quits [Quit: The Lounge - https://thelounge.chat]
11:02:44Bleo182600722719623455222 joins
11:39:17ducky quits [Remote host closed the connection]
11:39:22ducky (ducky) joins
11:44:54HP_Archivist (HP_Archivist) joins
11:49:19pseudorizer quits [Ping timeout: 260 seconds]
12:08:23beastbg8_ joins
12:11:29beastbg8 quits [Ping timeout: 260 seconds]
12:17:19benjins3 quits [Ping timeout: 260 seconds]
12:31:43kansei- (kansei) joins
12:32:14kansei quits [Ping timeout: 240 seconds]
12:45:13<BlankEclair>https://help.dropbox.com/installs/dropbox-passwords-discontinuation
12:45:28<BlankEclair>not quite sure how useful this is here considering the nature of the service, but i figure that i should drop it here anyway
12:48:34HP_Archivist quits [Ping timeout: 240 seconds]
13:05:32ducky quits [Remote host closed the connection]
13:05:41ducky (ducky) joins
13:05:49<justauser|m>What's the status of Wikibot jobs on Tuxfamily?
13:06:07<justauser|m>Pad says some were submitted; what exactly?
13:09:33<pabs>the ones with a job id next to them were submitted
13:09:50ducky quits [Remote host closed the connection]
13:10:00ducky (ducky) joins
13:13:49egallager quits [Quit: This computer has gone to sleep]
13:15:59<justauser|m>Not that I can see any...
13:16:23<justauser|m>OK, I can see one.
13:28:10ducky quits [Remote host closed the connection]
13:33:15pseudorizer (pseudorizer) joins
13:44:54HP_Archivist (HP_Archivist) joins
13:47:04ducky (ducky) joins
13:49:29T31M quits [Quit: ZNC - https://znc.in]
13:50:20T31M joins
14:08:47threedeeitguy69 quits [Quit: The Lounge - https://thelounge.chat]
14:09:18threedeeitguy69 (threedeeitguy) joins
14:39:16<eggdrop>[remind] nicolas17: archive Safari 18.6
14:48:34HP_Archivist quits [Ping timeout: 240 seconds]
15:41:22benjins3 joins
15:42:08Webuser146914 joins
15:44:54HP_Archivist (HP_Archivist) joins
15:46:13Webuser146914 quits [Client Quit]
15:57:43Guest58 quits [Quit: My Mac has gone to sleep. ZZZzzz…]
16:10:15egallager joins
16:36:54Wohlstand quits [Ping timeout: 260 seconds]
16:52:53egallager quits [Client Quit]
17:28:26dave quits [Quit: WeeChat 4.4.3]
17:33:02grill (grill) joins
17:36:37<h2ibot>OrIdow6 edited The WARC Ecosystem (+30, /* Tools */ Add anchor to ArchiveWeb.page): https://wiki.archiveteam.org/?diff=56652&oldid=55507
17:38:37<h2ibot>OrIdow6 created Webrecorder (+48, Redir to [[The WARC…): https://wiki.archiveteam.org/?title=Webrecorder
17:42:34egallager joins
17:53:39<h2ibot>Cooljeanius edited Deathwatch (+73, /* 2025 */ copyedit): https://wiki.archiveteam.org/?diff=56654&oldid=56648
17:54:40<h2ibot>Cooljeanius edited Flipnote Hatena (+16, copyedit): https://wiki.archiveteam.org/?diff=56655&oldid=56649
17:55:26grill quits [Read error: Connection reset by peer]
17:56:49grill (grill) joins
17:58:40<h2ibot>Cooljeanius edited Hatena (+64, /* Former */ copyedit): https://wiki.archiveteam.org/?diff=56656&oldid=56651
18:18:04awauwa quits [Quit: awauwa]
18:19:51Guest joins
18:49:01Wohlstand (Wohlstand) joins
18:51:14HP_Archivist quits [Ping timeout: 240 seconds]
18:52:54midou quits [Ping timeout: 240 seconds]
19:01:50<h2ibot>Manu edited Discourse (+55, /* Adding chrultrabook forum*/): https://wiki.archiveteam.org/?diff=56657&oldid=56219
19:02:25midou joins
19:08:28emily (pseudorizer) joins
19:08:34pseudorizer quits [Ping timeout: 260 seconds]
19:15:21<Guest>a game website (roblox) released a blog post in 2024 about removing their group walls feature (essentially a stream of messages where users can communicate).
19:15:21<Guest>https://devforum.roblox.com/t/rdc24-what-we-announced/3148833#p-11334181-groups-becoming-communities-21
19:15:21<Guest>i had originally made my own archiver in python to scrape this data, but within the past few months it appears that roblox has rate limited my ip and tightened down security a lot on the api endpoints that are used to access the data. writing it was hell and i don't plan on doing it again. i also don't believe i have the expertise to do so based on looking at the source code of some of archiveteam's past projects. i wanted to know if this is on the table to archive and if anyone is willing to do the work for it :)
19:17:13<Yakov>yes please someone do something about this
19:17:25<Yakov>we should definitely be archiving group walls
19:20:37<@JAA>#robloxd I guess?
19:21:02<Yakov>> it appears that roblox has rate limited my ip and tightened down security a lot on the api endpoints that are used to access the data.
19:21:08<Yakov>is this fit for a warrior-related project?
19:45:24<@JAA>Depends on the details, but in any case, #robloxd is our channel for all things Roblox.
20:01:34grill quits [Ping timeout: 240 seconds]
20:34:39Wake1 joins
20:34:54Yakov quits [Ping timeout: 240 seconds]
20:37:54Wake quits [Ping timeout: 240 seconds]
20:37:54Wake1 is now known as Wake
20:47:29Dada quits [Remote host closed the connection]
20:51:48Island joins
21:07:31dabs joins
21:13:49kansei- quits [Quit: ZNC 1.10.1 - https://znc.in]
21:15:54Sluggs quits [Ping timeout: 240 seconds]
21:19:43kansei (kansei) joins
21:19:49midou quits [Ping timeout: 260 seconds]
21:19:56midou joins
21:23:12<nstrom|m>just realized we're running projects twitch, glitch and itch
21:34:19egallager quits [Quit: This computer has gone to sleep]
21:56:02GradientCat (GradientCat) joins
21:59:07lennier2_ joins
22:02:24lennier2 quits [Ping timeout: 260 seconds]
22:06:56nicolas17 quits [Quit: Konversation terminated!]
22:08:02nicolas17 joins
22:08:25dabs quits [Read error: Connection reset by peer]
22:12:05<katia>\w*itch
22:17:00atphoenix__ (atphoenix) joins
22:19:19atphoenix_ quits [Ping timeout: 260 seconds]
22:22:20Exorcism|irc quits [Quit: Ping timeout (120 seconds)]
22:22:39DigitalDragons quits [Quit: Ping timeout (120 seconds)]
22:22:43Exorcism|irc (exorcism) joins
22:22:56DigitalDragons (DigitalDragons) joins
22:24:22SootBector quits [Ping timeout: 264 seconds]
22:25:46SootBector (SootBector) joins
22:51:54nicolas17_ joins
22:55:29nicolas17 quits [Ping timeout: 260 seconds]
23:03:27nicolas17_ is now known as nicolas17
23:16:08dabs joins
23:26:14APOLLO03 quits [Ping timeout: 240 seconds]
23:35:15Webuser065658 joins
23:35:29<Webuser065658>Hi
23:36:10Webuser065658 quits [Client Quit]
23:38:36<h2ibot>TriangleDemon edited YouTube (-2, Videos manually submitted from SPN has been on…): https://wiki.archiveteam.org/?diff=56658&oldid=55208