00:00:10<h2ibot>JAABot edited List of websites excluded from the Wayback Machine (+0): https://wiki.archiveteam.org/?diff=53721&oldid=53720
00:04:48<Ryz>JAA, I need help tackling this - https://web.archive.org/cdx/search?url=people.umass.edu*&fl=original&collapse=urlkey - stripping the URLs down to e.g. https://people.umass.edu/~yehudith (though I'm unsure whether to keep the '~', since https://people.umass.edu/yehudith is also valid)
00:05:05<Ryz>I don't wanna run all the userpages myself, holy shit
00:05:21<@JAA>Yes, we need to compile a list and probably feed it into queueh2ibot.
00:05:33<@JAA>But also, let's move this to #webroasting I think.
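A minimal sketch of the stripping Ryz describes, fed from the cdx/search output above; the username pattern and the tilde handling (treating /~user and /user as the same page) are assumptions, not confirmed behavior of the site:

```python
# Sketch: collapse Wayback CDX "original" URLs down to one entry per
# people.umass.edu user directory. Assumes one URL per input line, as
# in the cdx/search output above; the tilde normalization is a guess.
import re
import sys

USER_RE = re.compile(
    r"^https?://people\.umass\.edu/~?([A-Za-z0-9._-]+)", re.IGNORECASE
)

def main() -> None:
    users = set()
    for line in sys.stdin:
        m = USER_RE.match(line.strip())
        if m:
            users.add(m.group(1).lower())
    for user in sorted(users):
        # Emit the tilde form; swap to the bare /user form if that
        # turns out to be canonical.
        print(f"https://people.umass.edu/~{user}")

if __name__ == "__main__":
    main()
```

Piping the CDX output straight through would look like `curl '...cdx/search?...' | python3 strip_users.py > userpages.txt`.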
00:10:51sralracer quits [Client Quit]
00:13:41beastbg8_ joins
00:17:20beastbg8 quits [Ping timeout: 260 seconds]
01:19:00linuxgemini2 (linuxgemini) joins
01:19:38linuxgemini quits [Ping timeout: 240 seconds]
01:19:38linuxgemini2 is now known as linuxgemini
01:59:58Aoede_ (Aoede) joins
02:01:38<h2ibot>PaulWise edited Codearchiver (+80, link the Software Heritage page too.): https://wiki.archiveteam.org/?diff=53722&oldid=49595
02:01:58Aoede quits [Ping timeout: 240 seconds]
02:12:19ducky (ducky) joins
02:35:35Commander001 quits [Ping timeout: 260 seconds]
02:36:13Commander001 joins
02:38:07<thuban>arkiver, OrIdow6^2, JAA: fwiw, i've also wished we had some kind of issue tracker--partly to replace or supplement deathwatch (we've occasionally completely lost sites that were _on_ deathwatch, because nobody noticed and the unstructured data there makes automated tooling impractical), and partly because there are quite a few general infra issues (eg with the tracker) that
02:38:09<thuban>should be documented but don't belong on any single current repo
02:39:07<@arkiver>thuban: what is your opinion on a simple repo with issues?
02:45:41<thuban>i agree with JAA's skepticism of github at this point, but i think a self-hosted repo (or pure issue tracker) would be fine
02:49:32<thuban>(re semantic wiki, i don't think it would give us as much benefit in terms of domain-specific tooling for issue tracking, so if we really want wiki integration we'd get similar roi from repo+wikibots. am willing to be enlightened on technical details, though)
02:56:12<monoxane>or I just engage some hyperfocus and make a shitty single purpose web app for it /s
02:56:26<@OrIdow6^2>thuban: I'm for a self-hosted repo, my main issue with that is just the fact that it means one of us needs to be willing to host it
02:56:57<@OrIdow6^2>Hopefully whatever it is we can do regular exports/backups etc
02:57:31<@JAA>Well, we already have that Gitea instance, and it's not going anywhere. It does need some love though.
02:58:50<@JAA>From an infrastructure point of view, that is the easier path than SMW since we need something for source code anyway (be it Gitea or Forgejo or something else entirely in the future).
02:58:51<@OrIdow6^2>What I like about the idea of Semantic Mediawiki (looking it up I see someone's also written a competitor, "Cargo") is that it's already in the wiki; a connection to another service requires someone to set it up (I have implicitly volunteered to do this, fwiw) and requires people to wrap their heads around it to use it
02:59:02<@OrIdow6^2>And it not to break etc
03:00:28<@OrIdow6^2>I don't see much of an advantage to an issue tracker over SMW besides its better handling of deadlines; but if we wanted to use that, we'd have to have everyone make an account on GH and join the AT org / set up email on the issue tracker and have everyone sign up there
03:00:28<@OrIdow6^2>Deadlines, and assignees being a special thing rather than a string field
03:00:54<@JAA>Gitea has had deadline fields for a long time.
03:01:27<thuban>deadlines, assignees, dependency trees, tags, automation hooks...
03:01:34<@JAA>Or 'due date' as it calls them.
03:01:46<TheTechRobo>What about the AT gitea needs love?
03:06:16<@arkiver>JAA: with that support, i think we can just make a repo on gitea
03:06:45SootBector quits [Ping timeout: 240 seconds]
03:06:56<@arkiver>TheTechRobo: sometimes... we can't for love :(
03:08:05<@arkiver>but let's still document things on the wiki as well. i think the git repo will be less easy to discover for new users for example
03:08:13<@arkiver>it adds a little bit more complexity to the whole thing again
03:08:48<@arkiver>so it would be nice if we can still mirror the single lines to the Deathwatch page as well
03:08:58SootBector (SootBector) joins
03:11:51<nicolas17>IME, if syncing two lists is too complicated, finding discrepancies between the two lists and letting a human sync them is the second best thing
03:12:51<nicolas17>eg. "this is in Issues but isn't in the wiki" is easier than auto-editing the wiki, but still useful
03:13:42<@arkiver>maybe yeah
03:15:23<@OrIdow6^2>arkiver: Yes, there'd be a bot that'd keep DW, {{infobox project}}, and the issue tracker in sync
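A rough sketch of nicolas17's idea above (report discrepancies, let a human sync) rather than full two-way sync; the Gitea repo path, the Deathwatch page name, and the "first external-link label per bullet" heuristic are placeholders, not a deployed setup:

```python
# Sketch: don't auto-edit anything, just report entries that exist on
# one side but not the other. Repo path and title matching heuristic
# are assumptions.
import re
import requests

WIKI_API = "https://wiki.archiveteam.org/api.php"
GITEA_API = "https://gitea.arpa.li/api/v1"  # assumed instance URL
REPO = "archiveteam/deathwatch"             # hypothetical repo

def wiki_entries() -> set[str]:
    r = requests.get(WIKI_API, params={
        "action": "parse", "page": "Deathwatch",
        "prop": "wikitext", "format": "json",
    }, timeout=30)
    text = r.json()["parse"]["wikitext"]["*"]
    # Treat the first external-link label on each bullet as the entry name.
    return {m.group(1).strip().lower()
            for m in re.finditer(r"^\*+.*?\[\S+ ([^\]]+)\]", text, re.M)}

def gitea_entries() -> set[str]:
    titles, page = set(), 1
    while True:
        r = requests.get(f"{GITEA_API}/repos/{REPO}/issues", params={
            "state": "all", "limit": 50, "page": page}, timeout=30)
        batch = r.json()
        if not batch:
            return titles
        titles |= {issue["title"].strip().lower() for issue in batch}
        page += 1

if __name__ == "__main__":
    wiki, tracker = wiki_entries(), gitea_entries()
    for name in sorted(tracker - wiki):
        print(f"in tracker but not on Deathwatch: {name}")
    for name in sorted(wiki - tracker):
        print(f"on Deathwatch but not in tracker: {name}")
```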
03:16:23<TheTechRobo>The problem I think is all the existing entries that are just freeform text. Would be hard to parse.
03:16:39<TheTechRobo>(For things like sorting by date)
03:17:34<thuban>{{datetime}} is pretty useful for that, actually
03:17:36<@OrIdow6^2>TheTechRobo: Not sure what you mean
03:18:02<@OrIdow6^2>Scrolling thru Deathwatch there are fairly few that don't have a date, either {{datetime}} or a human-readable one
03:18:24<TheTechRobo>Right, {{datetime}} exists
03:18:29<thuban>(not used on the older entries, of course, but most of those are already dead and not as important)
03:18:54<@OrIdow6^2>I'm thinking we'd import the older entries anyway, to make it easier to sync
03:22:45<@arkiver>OrIdow6^2: i don't know. could just start listing things "from now on"
03:23:33<thuban>could also include the 'dying' section but not 'dead'
03:24:22<thuban>regardless, i volunteer to help with any manual data cleanup that may be necessary
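For the cleanup-and-sorting question above, a small sketch of pulling sortable dates out of dumped Deathwatch wikitext, assuming entries use {{datetime|YYYY-MM-DD...}} as thuban notes; anything without the template lands in the manual-cleanup pile:

```python
# Sketch: extract sortable dates from Deathwatch bullets, assuming
# {{datetime|YYYY-MM-DD}} (or a longer timestamp starting that way);
# lines without the template fall to the "needs cleanup" pile.
import re
from datetime import date

DATETIME_RE = re.compile(r"\{\{\s*datetime\s*\|\s*(\d{4})-(\d{2})-(\d{2})")

def entry_date(line: str) -> date | None:
    m = DATETIME_RE.search(line)
    return date(*map(int, m.groups())) if m else None

dated, undated = [], []
for line in open("deathwatch.txt", encoding="utf-8"):  # dumped wikitext
    if not line.startswith("*"):
        continue
    d = entry_date(line)
    (dated if d else undated).append((d, line.strip()))

for d, line in sorted(dated):
    print(d, line[:80])
print(f"{len(undated)} entries need manual cleanup")
```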
03:24:28<@arkiver>that is nice
03:26:27<@arkiver>i'm contacting didthis.app
03:26:42<DigitalDragons>Personally, I would avoid mirroring the gitea tracker to the wiki because I feel like that's another thing to break
03:27:10<@arkiver>DigitalDragons: right. i have a strong preference for keeping it on the wiki as well
03:28:08<DigitalDragons>ah okay
03:28:54<DigitalDragons>My thinking was to split Dead as a Doornail off into its own page, migrate what's currently in Deathwatch, and then make the deathwatch page point people to the new tracker with maybe some notes about tricks for viewing/making issues
03:29:13<@arkiver>the wiki has a really low barrier to entry; interacting with the repo on a different site (gitea.arpa.li) requires a new level of "need to be aware of", and it raises the bar for some to participate
03:29:48<DigitalDragons>that is fair enough
03:31:38<thuban>i'm a little bit skeptical given that the wiki requires an account anyway (and the instructions for creating one maybe aren't even accurate at the moment? there's one for the general infra tracker...)
03:34:02<thuban>could ask people who want to report stuff via the wiki (but not issue tracker or irc) to use the talk page / other special section
03:34:59<@arkiver>contacted didthis.app
03:41:51qwertyasdfuiopghjkl quits [Quit: Client closed]
03:41:53Commander001 quits [Read error: Connection reset by peer]
03:41:54<@OrIdow6^2>Maybe it's too ambitious but I was thinking a bot could keep them synchronized both ways
03:42:05Commander001 joins
03:42:06<@OrIdow6^2>Edit DW on the wiki, it edits the gitea
03:42:13<@OrIdow6^2>And so forth
03:42:39<@arkiver>i think that is not impossible
03:43:01<@arkiver>we need to check some things... more on the repo soon
03:43:19<@arkiver>!remindme 3d deathwatch repo
03:43:20<eggdrop>[remind] ok, i'll remind you at 2024-11-10T03:43:19Z
03:43:27<@arkiver>!remind JAA 3d deathwatch repo
03:43:27<eggdrop>[remind] ok, i'll remind JAA at 2024-11-10T03:43:27Z
03:47:10<nicolas17>all my backlog of samsung-opensource uploaded to IA \o/
03:47:16<@arkiver>wooh :)
03:47:35<TheTechRobo>is uploading currently stable?
03:47:41<@arkiver>somewhat
03:47:42<nicolas17>no :P
03:47:51<nicolas17>speeds are good
03:47:57<@arkiver>go ahead TheTechRobo
03:48:13<@arkiver>TheTechRobo: what do you have waiting to go in? :)
03:48:33<nicolas17>across the ~100 files I uploaded, I got 4 or 5 "We encountered an internal error. Please try again" errors, so be prepared to retry
03:48:41<TheTechRobo>arkiver: More worried about intermittent failures. `ia` retrying functionality isn't always great.
03:49:19<nicolas17>does "ia upload --retries=" have a default value?
03:49:21<TheTechRobo>Well I believe we have 40 todo in #burnthetwitch and I'd like to avoid letting it stew for too long. :-)
03:49:40<TheTechRobo>nicolas17: --retries only applies to 503 SlowDown errors AFAIK. Connection refusals, TLS errors, etc don't count.
03:49:47<nicolas17>mine did not retry but I'm not passing that option
03:49:51<nicolas17>ah yeah that explains it
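A blunt outer retry loop around `ia upload` covers the failure modes --retries doesn't (connection refusals, TLS errors, transient 500s); the attempt count and exponential backoff here are arbitrary choices, not anything the ia tool prescribes:

```python
# Sketch: retry `ia upload` on any nonzero exit code, on top of the
# built-in --retries (which only handles 503 SlowDown responses).
import subprocess
import sys
import time

def upload_with_retries(identifier: str, path: str, attempts: int = 5) -> bool:
    for i in range(attempts):
        result = subprocess.run(
            ["ia", "upload", identifier, path, "--retries", "10"]
        )
        if result.returncode == 0:
            return True
        wait = 2 ** i * 30  # arbitrary exponential backoff
        print(f"attempt {i + 1} failed (rc={result.returncode}), "
              f"sleeping {wait}s", file=sys.stderr)
        time.sleep(wait)
    return False

if __name__ == "__main__":
    if not upload_with_retries(sys.argv[1], sys.argv[2]):
        sys.exit("giving up; data NOT safe to delete")
```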
03:51:47qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
03:52:22<@arkiver>TheTechRobo: definitely don't remove the data before it's actually uploaded
03:52:38<@arkiver>i would only the data with the flag from the ia tool
03:52:54<nicolas17>would only what? :D
03:53:02<@arkiver>delete*
03:53:04<@arkiver>:P
03:53:08<TheTechRobo>arkiver: It doesn't, but the item will fail and tons of items failing is a bit of a pain.
03:53:08nicolas17 accidentally the whole data
03:53:18<TheTechRobo>Although wait, is the data not successfully uploaded when ia exits?
03:53:30<@arkiver>nicolas17: in how many ways did you just violate the whole data?
03:53:54<@arkiver>TheTechRobo: i only rely on the delete option from the ia tool
03:54:12<@arkiver>i would not decide manually "alright it finished, must be successful, let's delete"
03:54:15<nicolas17>TheTechRobo: in theory the data is successfully uploaded when ia exists *with a zero exit code*
03:54:20<nicolas17>exits*
03:54:24TheTechRobo always removes data after ia returns with 0 exit code
03:54:24<nicolas17>but I wouldn't trust that 100% either
03:54:31<TheTechRobo>I use the -v flag if that counts for anything
03:55:20<@JAA>I don't trust either method. I upload, wait for task processing, verify hashes, then delete.
03:55:40<@arkiver>JAA has the most secure way
03:55:45<nicolas17>https://xkcd.com/378/
03:55:58<@arkiver>however, i never had problem with letting the data be deleted by the tool itself
03:56:22<@JAA>The only thing it's saved me from is errors originating from the space between my chair and my keyboard, but still. :-)
03:56:25<nicolas17>*real* archivists keep the file for a month, then re-download it from both of IA's data servers to ensure it's the same before deleting it
03:56:31<thuban>have uploads from archivebot and/or at projects to ia resumed?
03:56:32<nicolas17>/s
03:56:49<TheTechRobo>> wait for task processing
03:56:49<TheTechRobo>Uploading is currently serialized in the code because of limited storage on the server, so that would slow things down substantially. :/
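JAA's verify-before-delete step could look roughly like this with the internetarchive Python library, comparing local MD5s against the item's file metadata; waiting for the catalog tasks to drain is left out of the sketch:

```python
# Sketch of "verify hashes, then delete": compare local MD5s to what
# the item's metadata reports. Only remove local copies on exit 0.
import hashlib
import sys
from pathlib import Path

from internetarchive import get_item  # pip install internetarchive

def md5sum(path: Path) -> str:
    h = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(identifier: str, local_dir: str) -> bool:
    remote = {f["name"]: f.get("md5") for f in get_item(identifier).files}
    ok = True
    for path in Path(local_dir).iterdir():
        if not path.is_file():
            continue
        if remote.get(path.name) != md5sum(path):
            print(f"MISMATCH or missing: {path.name}", file=sys.stderr)
            ok = False
    return ok

if __name__ == "__main__":
    sys.exit(0 if verify(sys.argv[1], sys.argv[2]) else 1)
```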
03:57:17<nicolas17>thuban: total_tasks_queued didn't skyrocket so I assume not yet lol
03:58:41<thuban>kthx (been holding off on some purely proactive ab jobs)
04:01:42etnguyen03 quits [Remote host closed the connection]
04:11:57hackbug quits [Remote host closed the connection]
04:16:39hackbug (hackbug) joins
04:42:22DogsRNice joins
04:44:36DogsRNice quits [Client Quit]
05:05:46Guest54 quits [Quit: My MacBook has gone to sleep. ZZZzzz…]
05:06:15Guest54 joins
05:06:46bladem quits [Quit: Leaving]
05:57:35SootBector quits [Remote host closed the connection]
05:57:54SootBector (SootBector) joins
06:03:44ducky quits [Read error: Connection reset by peer]
06:04:30<pokechu22>What's happening with data from archivebot? And is there a chance that it would also make sense to send data to someone in Europe given that it includes lots of US elections stuff that would be good to mirror in multiple places (e.g. https://www.bnf.fr/en/working-together-archive-web)?
06:05:06ducky (ducky) joins
06:14:43Island_ joins
06:16:18Island quits [Ping timeout: 240 seconds]
06:20:50<pabs>arkiver: re tracker of to-archive items, longer term I'd avoid generic issue/wiki things and go with something custom, perhaps integrated into AB itself, so it could accept submissions from randos with no gitea/wiki login or IRC knowledge, could do DPoS-based subdomain/URL enumeration, auto-detection of appropriate ignoresets etc.
06:20:50BlueMaxima quits [Read error: Connection reset by peer]
06:21:12<pabs>that requires work though :)
06:22:25<@arkiver>on the DPoS side i have some plans that i need to do
06:22:28<@arkiver>but the AB side, no idea
06:24:43<pabs>it could also auto-create to-archive items based on #// stuff like finding "foo died" in a news feed would create an item for "foo" and then grab wikidata info about domains/etc to archive
06:25:47<@arkiver>well that is far away in the future
06:25:50<pabs>like I see a lot of people/companies dying in #hackernews-firehose, on WikiNews etc, but don't have time to do all the manual stuff needed to save everything
06:25:53<pabs>yeah
06:26:59<@JAA>I tend to think that these things are better done modularly.
06:28:45<@JAA>A system for tracking stuff; a bot that scrapes HN/social media/search engines/whatever and creates items in that system; a bot that watches for items and sees if it can find something on Wikidata, etc.
06:29:54<@JAA>The first two of these are something I've thought about for a while and also discussed here (probably with OrIdow6) before.
06:30:06FartWithFury (FartWithFury) joins
06:30:29<pabs>sounds good
06:30:38<@JAA>The tracking system could well be a generic issue tracker like Gitea, possibly with heavy use of labels for triaging etc.
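As a sketch of that modular split: a bot that polls the HN Algolia search API for shutdown-ish keywords and files issues into a Gitea repo for later triage. The instance URL, repo, token, and keyword list are all placeholders, not a real deployed setup:

```python
# Sketch: one module of the pipeline described above. Polls HN search,
# files an issue per new hit; dedup here is in-memory only and would
# need persisting in a real bot.
import requests

HN_SEARCH = "https://hn.algolia.com/api/v1/search_by_date"
GITEA = "https://gitea.arpa.li/api/v1"      # assumed instance
REPO = "archiveteam/deathwatch"             # hypothetical repo
TOKEN = "..."                               # Gitea API token
KEYWORDS = ["shutting down", "sunsetting", "end of life"]

def hn_hits(query: str) -> list[dict]:
    r = requests.get(HN_SEARCH, params={"query": query, "tags": "story"},
                     timeout=30)
    return r.json()["hits"]

def file_issue(title: str, body: str) -> None:
    requests.post(f"{GITEA}/repos/{REPO}/issues",
                  headers={"Authorization": f"token {TOKEN}"},
                  json={"title": title, "body": body},
                  timeout=30).raise_for_status()

if __name__ == "__main__":
    seen = set()
    for kw in KEYWORDS:
        for hit in hn_hits(kw):
            if hit["objectID"] in seen:
                continue
            seen.add(hit["objectID"])
            file_issue(
                f"[hn] {hit.get('title', '(untitled)')}",
                f"{hit.get('url', '')}\n"
                f"https://news.ycombinator.com/item?id={hit['objectID']}",
            )
```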
06:54:51FartWithFury quits [Read error: Connection reset by peer]
07:09:10Unholy2361924645377131 quits [Ping timeout: 260 seconds]
07:33:12<@OrIdow6^2>JAA: Yep
07:33:26<@OrIdow6^2>And yeah those would best be separate systems even if they're on the same wishlist
07:33:45<@OrIdow6^2>I do wish we'd done that for Reddit before it closed down, but alas
07:35:50<@OrIdow6^2>Do think Gitea might be overkill for something like that, esp if you're doing keyword matching with a lot of false positives
07:47:52<@JAA>I mean, kind of, yeah. But running a separate thing specifically for that would be even more overkill. Or developing one from scratch.
08:01:01Wohlstand (Wohlstand) joins
08:04:16that_lurker looks at his todo list and finds the item "Process hn firehose feed and detect words like death and send to different channel"
08:07:00<@JAA>:-D
08:13:29Snivy7 (Snivy) joins
08:14:58Snivy quits [Ping timeout: 240 seconds]
08:14:58Snivy7 is now known as Snivy
08:25:05<pabs>#styx was the proposed channel name when I was discussing that with fireonlive btw
08:25:11<pabs>fireonlive++
08:25:12<eggdrop>[karma] 'fireonlive' now has 721 karma!
08:40:41<@OrIdow6^2>I wonder if Bluesky could let us extract everything from there
08:41:18<@OrIdow6^2>Don't know much about the protocol / how much the distributed nature is actually realized
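For what it's worth, the AT Protocol relay does expose repo enumeration and bulk export, which is roughly what "extract everything" would hang off; a sketch against the public relay endpoints, where the host and pagination details are assumptions as of writing:

```python
# Sketch: the relay exposes com.atproto.sync.listRepos (paginated DIDs)
# and com.atproto.sync.getRepo (a whole repo as a CAR archive).
import requests

RELAY = "https://bsky.network/xrpc"  # assumed public relay host

def list_repos():
    cursor = None
    while True:
        params = {"limit": 1000}
        if cursor:
            params["cursor"] = cursor
        page = requests.get(f"{RELAY}/com.atproto.sync.listRepos",
                            params=params, timeout=30).json()
        yield from (repo["did"] for repo in page.get("repos", []))
        cursor = page.get("cursor")
        if not cursor:
            return

def fetch_repo(did: str) -> bytes:
    # Whole repo (posts, likes, follows, ...) as a CAR file.
    r = requests.get(f"{RELAY}/com.atproto.sync.getRepo",
                     params={"did": did}, timeout=120)
    r.raise_for_status()
    return r.content

if __name__ == "__main__":
    for i, did in enumerate(list_repos()):
        print(did)
        if i >= 9:  # demo: just the first ten DIDs
            break
```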
08:57:37hazem joins
08:57:56hazem quits [Client Quit]
09:01:29<katia>fireonlive++
09:01:29<eggdrop>[karma] 'fireonlive' now has 722 karma!
09:02:42<qwertyasdfuiopghjkl>fireonlive++
09:02:42<eggdrop>[karma] 'fireonlive' now has 723 karma!
09:11:58Wohlstand quits [Ping timeout: 240 seconds]
09:44:18benjins3_ quits [Read error: Connection reset by peer]
09:44:27atphoenix quits [Read error: Connection reset by peer]
09:44:35BearFortress quits [Read error: Connection reset by peer]
09:44:35benjins3_ joins
09:45:02atphoenix (atphoenix) joins
09:45:06BearFortress joins
09:53:36gragh (gragh) joins
09:55:49loug8318142 joins
10:09:35Wohlstand (Wohlstand) joins
10:53:38Wohlstand quits [Ping timeout: 240 seconds]
11:08:59ducky quits [Remote host closed the connection]
11:09:05ducky joins
11:26:52sralracer joins
11:33:58Sluggs quits [Ping timeout: 240 seconds]
11:41:50xkey quits [Quit: WeeChat 4.2.2]
11:42:39Sluggs joins
12:00:01Bleo182600722719623 quits [Quit: The Lounge - https://thelounge.chat]
12:02:41Bleo182600722719623 joins
12:03:41Sluggs quits [Excess Flood]
12:07:34Sluggs joins
12:12:52xkey (xkey) joins
12:28:09xDEADBEEF joins
12:28:58th3z0l4 quits [Ping timeout: 240 seconds]
12:39:39Hilst (gragh) joins
12:41:34Hilst quits [Client Quit]
12:43:25gragh quits [Ping timeout: 260 seconds]
12:47:11SkilledAlpaca4189 quits [Quit: SkilledAlpaca4189]
12:48:31gragh (gragh) joins
12:48:40midou quits [Ping timeout: 260 seconds]
12:49:22SkilledAlpaca4189 joins
12:56:48eightthree quits [Remote host closed the connection]
14:00:25Commander001 quits [Ping timeout: 260 seconds]
14:00:35Commander001 joins
14:04:58midou joins
14:17:12tek_dmn quits [Read error: Connection reset by peer]
14:19:22tek_dmn (tek_dmn) joins
14:28:21SootBector quits [Ping timeout: 240 seconds]
14:30:32SootBector (SootBector) joins
15:10:58s-crypt quits [Ping timeout: 240 seconds]
15:11:35Ryz2 quits [Ping timeout: 260 seconds]
15:11:35kiska quits [Ping timeout: 260 seconds]
15:11:35Flashfire42 quits [Ping timeout: 260 seconds]
15:13:01Mist8kenGAS_ joins
15:15:40Mist8kenGAS__ quits [Ping timeout: 260 seconds]
15:29:17Ryz2 (Ryz) joins
15:29:30Flashfire42 joins
15:30:25Commander001 quits [Read error: Connection reset by peer]
15:30:38Commander001 joins
15:33:46SootBector quits [Remote host closed the connection]
15:34:07SootBector (SootBector) joins
15:34:08s-crypt (s-crypt) joins
15:34:35kiska (kiska) joins
15:36:05gragh quits [Ping timeout: 260 seconds]
15:55:37Wohlstand (Wohlstand) joins
16:09:27fangfufu quits [Quit: ZNC 1.8.2+deb3.1+deb12u1 - https://znc.in]
16:14:28fangfufu joins
16:48:08MrMcNuggets (MrMcNuggets) joins
16:58:58Flashfire42 quits [Ping timeout: 240 seconds]
16:58:58Ryz2 quits [Ping timeout: 240 seconds]
16:59:30kiska quits [Ping timeout: 260 seconds]
16:59:30s-crypt quits [Ping timeout: 260 seconds]
17:20:33Mist8kenGAS__ joins
17:23:25Mist8kenGAS_ quits [Ping timeout: 260 seconds]
17:26:27Wohlstand quits [Client Quit]
17:32:50Ryz2 (Ryz) joins
17:33:22Flashfire42 joins
17:33:36s-crypt (s-crypt) joins
17:33:44kiska (kiska) joins
17:43:45MrMcNuggets quits [Client Quit]
18:02:28qwertyasdfuiopghjkl quits [Ping timeout: 255 seconds]
18:20:31nulldata quits [Quit: Ping timeout (120 seconds)]
18:21:26nulldata (nulldata) joins
18:51:51qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
19:04:49ducky_ joins
19:05:04ducky quits [Read error: Connection reset by peer]
19:05:10ducky_ is now known as ducky
19:13:41lennier2 joins
19:16:18lennier2_ quits [Ping timeout: 240 seconds]
19:29:07Craigle quits [Quit: The Lounge - https://thelounge.chat]
19:30:02Craigle (Craigle) joins
19:43:26lennier2_ joins
19:46:20lennier2 quits [Ping timeout: 260 seconds]
19:51:25lennier2 joins
19:52:38lennier2_ quits [Ping timeout: 240 seconds]
19:55:51magmaus3 quits [Read error: Connection reset by peer]
19:56:03magmaus3 (magmaus3) joins
20:01:02lennier2_ joins
20:02:05lennier2 quits [Ping timeout: 260 seconds]
20:08:15magmaus3 quits [Read error: Connection reset by peer]
20:08:30magmaus3 (magmaus3) joins
20:11:04gragh (gragh) joins
20:20:51Hilst (gragh) joins
20:22:30Helena (gragh) joins
20:24:15gragh quits [Ping timeout: 260 seconds]
20:26:35Hilst quits [Ping timeout: 260 seconds]
20:26:58Helena quits [Ping timeout: 240 seconds]
20:28:26magmaus3 quits [Read error: Connection reset by peer]
20:28:38magmaus3 (magmaus3) joins
20:37:24magmaus3 quits [Remote host closed the connection]
20:38:51AlsoHP_Archivist quits [Read error: Connection reset by peer]
20:38:52qwertyasdfuiopghjkl20 joins
20:39:04AlsoHP_Archivist joins
20:39:40magmaus3 (magmaus3) joins
20:42:13qwertyasdfuiopghjkl quits [Ping timeout: 255 seconds]
20:46:48magmaus3 quits [Remote host closed the connection]
20:49:07magmaus3 (magmaus3) joins
20:51:10magmaus3 quits [Remote host closed the connection]
20:59:42pixel leaves [Error from remote client]
21:06:51Island_ quits [Read error: Connection reset by peer]
21:09:03Island joins
21:14:39yijun69 joins
21:15:03yijun69 quits [Client Quit]
21:16:22useretail joins
21:24:28magmaus3 (magmaus3) joins
21:26:21gragh (gragh) joins
21:31:41qwertyasdfuiopghjkl28 joins
21:34:00magmaus3 quits [Remote host closed the connection]
21:35:17qwertyasdfuiopghjkl59 joins
21:35:19qwertyasdfuiopghjkl20 quits [Ping timeout: 255 seconds]
21:38:55qwertyasdfuiopghjkl28 quits [Ping timeout: 255 seconds]
21:55:14eightthree joins
21:56:07etnguyen03 (etnguyen03) joins
21:56:35eightthree quits [Remote host closed the connection]
21:58:15eightthree joins
22:01:37eightthree quits [Remote host closed the connection]
22:03:06eightthree joins
22:06:30eightthree quits [Remote host closed the connection]
22:08:01eightthree joins
22:12:10eightthree quits [Remote host closed the connection]
22:14:00eightthree joins
22:17:16eightthree quits [Remote host closed the connection]
22:19:09eightthree joins
22:22:09eightthree quits [Remote host closed the connection]
22:22:44magmaus3 (magmaus3) joins
22:23:34eightthree joins
22:27:07eightthree quits [Remote host closed the connection]
22:29:02eightthree joins
22:30:46tavantius joins
22:31:18eightthree quits [Remote host closed the connection]
22:32:01gragh quits [Remote host closed the connection]
22:32:18gragh (gragh) joins
22:32:49eightthree joins
22:34:08<tavantius>hi!
22:37:03eightthree quits [Remote host closed the connection]
22:38:35eightthree joins
22:41:01eightthree quits [Remote host closed the connection]
22:42:19tavantius quits [Client Quit]
22:42:34Snivy quits [Quit: Ping timeout (120 seconds)]
22:42:42eightthree joins
22:42:48Snivy (Snivy) joins
22:42:49Hilst (gragh) joins
22:45:25gragh quits [Ping timeout: 260 seconds]
22:46:18eightthree quits [Remote host closed the connection]
22:48:28eightthree joins
22:51:05eightthree quits [Remote host closed the connection]
22:53:01eightthree joins
22:55:43eightthree quits [Remote host closed the connection]
22:57:11eightthree joins
23:00:39eightthree quits [Remote host closed the connection]
23:02:12eightthree joins
23:06:08eightthree quits [Remote host closed the connection]
23:07:59eightthree joins
23:11:03eightthree quits [Remote host closed the connection]
23:12:37Hilst quits [Client Quit]
23:13:07eightthree joins
23:15:47eightthree quits [Remote host closed the connection]
23:17:26eightthree joins
23:21:06eightthree quits [Remote host closed the connection]
23:21:18Kinille quits []
23:22:36qwertyasdfuiopghjkl59 quits [Client Quit]
23:22:53qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
23:23:08eightthree joins
23:26:07eightthree quits [Remote host closed the connection]
23:27:42beastbg8_ quits [Quit: Leaving]
23:27:58beastbg8 (beastbg8) joins
23:28:02eightthree joins
23:29:21etnguyen03 quits [Client Quit]
23:30:46eightthree quits [Remote host closed the connection]
23:32:00eightthree joins
23:36:03eightthree quits [Remote host closed the connection]
23:36:38<nicolas17>!reminders
23:53:55Elizabeth quits [Ping timeout: 255 seconds]
23:56:11loug8318142 quits [Quit: The Lounge - https://thelounge.chat]