00:02:07AmAnd0A quits [Ping timeout: 258 seconds]
00:04:45AmAnd0A joins
00:09:58icedice (icedice) joins
00:16:30<pabs>https://hackaday.com/2023/06/09/they-used-to-be-a-big-shot-now-eagle-is-no-more/
00:16:53<pabs>^ Autodesk bought EAGLE
00:17:52<nicolas17>in 2017
00:18:05imer3 (imer) joins
00:21:56imer quits [Ping timeout: 252 seconds]
00:21:56imer3 is now known as imer
00:30:08<pabs>ah, ooops
00:37:42<h2ibot>Switchnode edited Deathwatch (+116, /* 2023 */ update harvard blogs): https://wiki.archiveteam.org/?diff=50102&oldid=50087
00:49:06Mateon1 quits [Quit: Mateon1]
00:50:12Mateon1 joins
01:00:23killsushi_ quits [Ping timeout: 258 seconds]
01:07:13@Sanqui quits [Remote host closed the connection]
01:12:57lunik173 quits [Quit: Ping timeout (120 seconds)]
01:14:50lunik173 joins
01:15:50Sanqui joins
01:15:52Sanqui quits [Changing host]
01:15:52Sanqui (Sanqui) joins
01:15:52@ChanServ sets mode: +o Sanqui
01:38:18FavoritoHJS joins
01:43:38<FavoritoHJS>update on twitter: musk has set a rate limit. all is lost.
01:44:01<fireonlive>yeeep
01:44:17<fireonlive>we've been kvetching about it all day :(
01:44:33<flashfire42>Is the CDN rate limited too?
01:44:42<flashfire42>We might be able to say fuck you and grab the images
01:44:45<FavoritoHJS>honest question, why did y'all not start yesterday?
01:44:47<FavoritoHJS>wait...
01:44:59<FavoritoHJS>the api, does it have rate limits?
01:45:00<@JAA>flashfire42: See 20:25
01:45:01<nicolas17>flashfire42: where do we get image links?
01:45:31<FavoritoHJS>yes, most users are locked out but if only one was unfortunate enough to get """"verified""""....
01:45:46<flashfire42>Bing. reddit. google, baidu, yandex, URLteam dumps
01:46:06<nicolas17>FavoritoHJS: then that user could abuse their auth token and private API to get... 6000 tweets in a day?
01:46:10<flashfire42>But yeah brute forcing is infeasible
01:46:21<FavoritoHJS>oh, it has rate limiting...
01:46:28<@JAA>No, it's a whopping 10k now!
01:46:36<@JAA>(Out of ... 500 million tweets per day?)
01:46:45<FavoritoHJS>if even bots have read limits then all is indeed lost
01:46:53<kiska>Wasn't it like 3k per second of tweeting?
01:47:06<FavoritoHJS>...account spam?
01:47:11<nicolas17>kiska: when did you see that? everything changed in the last 24 hours
01:47:11<@JAA>500 million would be 5.8k/s.
01:47:20<nicolas17>oh you mean new tweets
01:47:24<@JAA>But who knows what it is now.
01:47:34<FavoritoHJS>600 per user, that would be a million users
01:47:35<FavoritoHJS>so no
01:47:43<fireonlive>archive all of twitter in one day?
01:47:44<nicolas17>FavoritoHJS: bots are asked to pay $40k/mo to do anything remotely heavy-duty
01:48:06<fireonlive>literally impossible haha
01:48:11<nicolas17>using the bot API has *worse* limits than abusing a logged in account
01:48:12<FavoritoHJS>and what are the limits on said "remotely heavy-duty"?
01:48:32<kiska>Not very high
01:48:56<FavoritoHJS>welp, better hope for a db leak as i fear we've lost our boat
01:49:47<fireonlive>that would be a massive-ass torrent
01:49:48<nicolas17>FavoritoHJS: the "basic" level costs $100/month and lets you read up to 10000 tweets per MONTH
01:49:48<fireonlive>xD
01:50:17<nicolas17>https://developer.twitter.com/en/portal/petition/essential/basic-info
01:50:27<fireonlive>'ok everyone please seed my 4933PB torrent'
01:51:14<@JAA>1. Introduce ridiculous API pricing. 2. Everyone starts scraping. 3. Introduce ridiculous view limits. 4. ??? 5. PROFIT, I guess?
01:51:26<FavoritoHJS>yea that would be a problem... maybe not that large once finished by removing the endless spam that twitter never removed, but actually getting to that point?
01:52:31<fireonlive>this leak better include all the hot gay twitter accounts tbh otherwise whats the point
01:52:35<fireonlive>:3
01:53:23<FavoritoHJS>(not that large = probably 4PB or 6 imgurs at least)
01:53:30<lennier1>If a bunch of AI companies were actually scraping Twitter, hopefully one of them publishes the data.
01:53:31<kiska>Hah!
01:53:46<kiska>Thinking that twitter is only 4PB big hahah
01:53:54<FavoritoHJS>text only, maybe?
01:54:01<FavoritoHJS>when compressed to hell and back
01:54:01<kiska>Google+ was some 1.5 PB and we didn't archive it fully
01:54:10<lennier1>I guess we could have a bot to let people submit image URLs from their private scrapes.
01:54:32<fireonlive>reddit is about 3PB atm
01:54:38<fireonlive>sorry 3PiB :D
01:59:14<FavoritoHJS>about scraping the ai scrapers... could work, but i'm fairly certain they didn't bother to preserve such minor details as poster or post time
01:59:56<FavoritoHJS>how much is already archived either as wayback machine pages or manual backups?
02:00:05<kiska>Tons
02:00:41<FavoritoHJS>is there a way to take out your data from twitter?
02:02:30<FavoritoHJS>also someone please move twitter to Alarm in `Alive... OR ARE THEY`
02:02:58<lennier1>Assuming it still works: https://help.twitter.com/en/managing-your-account/how-to-download-your-twitter-archive
02:03:33<FavoritoHJS>what does the "and more." at the end contain
02:04:08<FavoritoHJS>sure hope it would be pointers to replies since that could mean known spam accounts can be skipped over
02:04:11<@JAA>The text content of all tweets alone may well be a PB. And that's before all the metadata around it.
02:04:30<@JAA>Maybe not quite, but closer than you might think at first.
02:05:02<@JAA>500 million tweets a day for 17 years at 100 bytes per tweet would be 310 TB.
02:05:38<@JAA>Now add all the usernames, timestamps, links, and even before touching any images or videos, you're easily in the petabytes.
02:05:51<@JAA>With media, well...
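[Editorial note: JAA's figures above can be reproduced with quick back-of-the-envelope arithmetic. Both inputs are assumptions from the discussion, not measured values: 500 million tweets/day and 100 bytes of text per tweet.]

```python
# Fermi estimate of Twitter's text-only size, using the rough
# figures from the discussion above (both are assumptions).
TWEETS_PER_DAY = 500_000_000   # oft-cited average, not a measured value
BYTES_PER_TWEET = 100          # plain text only, no metadata
YEARS = 17                     # Twitter launched in 2006

tweets_per_second = TWEETS_PER_DAY / 86_400
total_bytes = TWEETS_PER_DAY * 365 * YEARS * BYTES_PER_TWEET

print(f"{tweets_per_second:,.0f} tweets/s")         # ~5,787, i.e. JAA's 5.8k/s
print(f"{total_bytes / 1e12:,.0f} TB of raw text")  # 310 TB, as stated above
```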
02:07:08<kiska>100 bytes per tweet? I think you're being optimistic there
02:07:27<FavoritoHJS>i mean, for most of twitter's existence the limit was 140 characters
02:07:37<@JAA>No idea about the actual average, but yeah, that.
02:07:46<FavoritoHJS>in latin-based alphabets a character takes a single byte
02:07:59<@JAA>Fermi estimate, mkay? :-)
02:08:08<kiska>Ok :D
02:08:13<FavoritoHJS>so 100 bytes seems reasonable for lightly-compressed tweets
02:09:46<fireonlive>are we excluding UTF-8 here
02:10:22<@JAA>💩
02:11:26<FavoritoHJS>yes, but since ascii characters (which include english and are a significant component of latin-script-based languages) only take a single byte, that's probably fine?
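[Editorial note: the per-character byte cost being debated is easy to check. ASCII stays at 1 byte in UTF-8, but other scripts and emoji cost more:]

```python
# UTF-8 byte cost per character varies by script:
for ch in ["a",    # ASCII / basic Latin: 1 byte
           "é",    # Latin-1 supplement: 2 bytes
           "я",    # Cyrillic: 2 bytes
           "日",   # CJK: 3 bytes
           "💩"]:  # emoji (outside the BMP): 4 bytes
    print(ch, len(ch.encode("utf-8")))
```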
02:12:05<FavoritoHJS>the big problem i think would be to reduce redundancy between tweets while keeping the archive browseable
02:12:49<FavoritoHJS>you could store all the info about each tweet, but that would require storing the username and timestamp and id of each tweet...
02:13:10<FavoritoHJS>but most users would probably tweet multiple times, and probably in bursts
02:15:22<fireonlive>the eggplants must live on!
02:30:27<nicolas17>/!\
02:30:58<nicolas17>I think something got deleted from the tb2b.eu FTP
02:31:22<nicolas17>wait no, I got a temporary connection error listing one particular subdirectory... retrying
02:38:54AmAnd0A quits [Ping timeout: 258 seconds]
02:39:53AmAnd0A joins
02:40:33yts98 leaves
02:40:36yts98 joins
02:44:33Pichu0102 joins
02:49:50Arcorann (Arcorann) joins
02:56:30FavoritoHJS quits [Remote host closed the connection]
03:06:49Hajdar (Hajdar) joins
03:07:35nicolas17 quits [Client Quit]
03:09:41tzt quits [Ping timeout: 252 seconds]
03:11:14tzt (tzt) joins
03:12:08killsushi_ joins
03:23:34AmAnd0A quits [Read error: Connection reset by peer]
03:23:51AmAnd0A joins
03:25:51superkuh joins
03:48:05qwertyasdfuiopghjkl quits [Remote host closed the connection]
03:54:10yts98 leaves
03:54:12yts98 joins
03:56:54lennier1 quits [Read error: Connection reset by peer]
03:57:12lennier1 (lennier1) joins
04:18:50qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
04:38:32Pichu0102 quits [Remote host closed the connection]
04:45:35Megame (Megame) joins
04:51:02dumbgoy quits [Ping timeout: 265 seconds]
04:53:32rohvani quits [Quit: The Lounge - https://thelounge.chat]
04:58:57justmolamola joins
05:05:17rohvani joins
05:25:08AmAnd0A quits [Read error: Connection reset by peer]
05:25:43AmAnd0A joins
05:26:08justmolamola quits [Remote host closed the connection]
05:28:35icedice2 (icedice) joins
05:32:08icedice quits [Ping timeout: 252 seconds]
05:37:38Megame quits [Ping timeout: 252 seconds]
05:42:54icedice2 quits [Client Quit]
05:43:12icedice (icedice) joins
05:55:13rohvani quits [Client Quit]
05:55:43hitgrr8 joins
05:55:47Arcorann quits [Ping timeout: 252 seconds]
05:57:34rohvani joins
06:01:33Naruyoko quits [Remote host closed the connection]
06:01:55Naruyoko joins
06:15:08le0n quits [Ping timeout: 265 seconds]
06:17:51le0n (le0n) joins
06:22:18icedice quits [Client Quit]
06:44:29cbrts joins
07:26:15AmAnd0A quits [Remote host closed the connection]
07:26:29AmAnd0A joins
07:28:12whatsinaname joins
07:28:40whatsinaname quits [Remote host closed the connection]
07:35:28Island_ quits [Read error: Connection reset by peer]
07:44:52<cbrts>Are there any things that I would probably be unhappy with if I used the `--convert-links` option with wget?
07:44:59<cbrts>Seems a little weird to rewrite the links in the saved HTML
08:05:59gfhh joins
08:09:03<Terbium>now i really regret not starting to archive Twitter years ago :/
08:31:11Arcorann (Arcorann) joins
08:40:40dunger quits [Quit: https://convos.chat]
08:41:32threedeeitguy quits [Client Quit]
08:44:16someone1 joins
08:45:18threedeeitguy (threedeeitguy) joins
08:49:48fluke joins
09:12:16<h2ibot>Bzc6p edited ЯRUS (+14, не все знают кириллицу): https://wiki.archiveteam.org/?diff=50103&oldid=50095
09:13:53asdfsdfsdf quits [Remote host closed the connection]
09:29:43inventatorul1 joins
09:36:00IDK (IDK) joins
09:40:58<ram|m>Gfycat's ending service September 1st
10:23:35Bearddough joins
10:33:31Bearddough quits [Remote host closed the connection]
10:46:05<razul>See #deadcat
10:59:29greenlight joins
11:04:51greenlight quits [Remote host closed the connection]
11:06:17W7RFa6AbNFz_ quits [Remote host closed the connection]
11:09:23W7RFa6AbNFz_ joins
11:10:29W7RFa6AbNFz_ quits [Remote host closed the connection]
11:11:39W7RFa6AbNFz_ joins
12:07:28<imer>how does twitter not do ipv6, smh these companies are all useless
12:10:47qwertyasdfuiopghjkl quits [Remote host closed the connection]
12:47:17AK quits [Quit: AK]
12:56:13AK (AK) joins
13:13:48Jonboy345 quits [Read error: Connection reset by peer]
13:26:14someone1 quits [Ping timeout: 252 seconds]
13:26:41someone1 joins
13:27:44GravelCZ joins
13:29:33<thuban>someone might want to add gfycat to the channel topic in #archiveteam
13:52:38Arcorann quits [Ping timeout: 252 seconds]
14:02:14nimaje quits [Quit: WeeChat 3.8]
14:04:12nimaje joins
14:12:10<h2ibot>Arkiver edited Deathwatch (+554, Add mudrunnermods.com): https://wiki.archiveteam.org/?diff=50104&oldid=50102
14:21:03Megame (Megame) joins
14:29:55<fuzzy8021>has anyone mounted a separate volume to store data on for the docker images of the project?
14:31:02<fuzzy8021>i am using "--mount type=bind,source=/data2/docker/project,target=/grab/data" and still seeing the main drive increase in size along with the secondary mounted drive
14:31:26<imer>there was some talk about using tmpfs for the grabbers: (quoting J_AA) "With the project images and Docker: `--mount type=tmpfs,tmpfs-size=2G,destination=/grab/data`"
14:31:48<fuzzy8021>this is for youtube so dont have enough ram to hold the videos
14:31:55<imer>ah
14:32:20<fuzzy8021>my / has 100gb ssd and i have a second rust drive setup for /data2
14:32:52<fuzzy8021>when using that --mount, i am still seeing / go full even though i am seeing data written to /data2
14:34:51<imer>yeah, not sure. can only see usage in /grab/data here, although im not mounting that to anywhere
14:35:00<imer>are files appearing in the mounted dir?
14:35:14<fuzzy8021>yes
14:49:49Chris5010 quits [Quit: ]
14:59:33Chris5010 (Chris5010) joins
15:06:19<h2ibot>Yts98 edited Current Projects (+448, Propose Banciyuan, Skyblog, Xuite): https://wiki.archiveteam.org/?diff=50105&oldid=50072
15:12:57<@arkiver>yts98: we're on it btw with banciyuan
15:15:31<yts98>arkiver: do you mean we're working on banciyuan with stwp (that's what i know), or we're cooperating with banciyuan official?
15:16:20<@arkiver>"we" is archiveteam
15:16:42<@arkiver>now that Misty|m is finished completely with their part, we'll kick off the project at AT
15:18:27<yts98>sorry for bad grammar comprehension. my "propose" also includes upcoming projects.
15:19:36<@arkiver>no worries
15:29:05Megame quits [Client Quit]
15:37:09dumbgoy joins
15:39:40dumbgoy_ joins
15:43:20IDK quits [Client Quit]
15:44:01dumbgoy quits [Ping timeout: 265 seconds]
15:50:39<manu|m>hey, could someone please archive http://www.dlammiehanson.com/ for me?
15:50:39<manu|m>there’s a wix site behind it (http://www.dlammiehanson.wix.com/) but the links from the navigation (targeting the wix site) end up on the correct domain. not sure how this will mess with AB
15:54:10AmAnd0A quits [Ping timeout: 265 seconds]
15:54:14AmAnd0A joins
15:59:01Icyelut quits [Quit: bye]
16:00:10<Barto>!a http://www.dlammiehanson.com/ --igset blogs,badvideos -e 'for manu|m'
16:00:14<Barto>wrong place lol
16:00:48<manu|m>it got lost in #archivebot
16:02:34Icyelut (Icyelut) joins
16:25:22beario joins
16:51:41AmAnd0A quits [Read error: Connection reset by peer]
16:51:58AmAnd0A joins
17:09:46<h2ibot>Wickedplayer494 edited Current Projects (+154, Add Gfycat to upcoming): https://wiki.archiveteam.org/?diff=50106&oldid=50105
17:17:38dumbgoy_ quits [Read error: Connection reset by peer]
17:19:04dumbgoy joins
17:23:52<nyuuzyou>Hi, the Japanese video hosting for ASMR zowa.app will send all its content on September 29, 2023 to /dev/null/ (https://note.com/zowa/n/ndf5d4f158589), looking at the ID of the last video there are currently no more than 23589. New content uploads will stop on July 30, looks like a good candidate for archiving
17:26:02<pokechu22>Probably we want to wait until new content stops before archiving, but I'll add it to deathwatch
17:27:50<h2ibot>Pokechu22 edited Deathwatch (+163, /* 2023 */ ZOWA): https://wiki.archiveteam.org/?diff=50107&oldid=50104
17:29:50<manu|m>if the id's are consecutive numbers we could just go along as new content might be added still
17:31:51<@arkiver>thanks nyuuzyou :)
17:36:30<albertlarsan68>Just came across the fact that Speedtest.net results are public and sequential. It includes (for the recent ones at least) ping time, up and down speed, and time.
17:39:15<albertlarsan68>Maybe something worth archiving to get historical data
17:39:26<albertlarsan68>Goes at least back to 2010
17:41:56<albertlarsan68>I've been able to go down to 2007, https://www.speedtest.net/result/110000000
17:41:56<albertlarsan68>However, nonexistent IDs seem to trigger a 500
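[Editorial note: a minimal sketch of what enumerating those sequential result IDs would look like. The URL pattern comes from the link above; the ID range is purely illustrative, and the actual fetching/politeness logic is omitted.]

```python
# Sketch: build speedtest.net result URLs from sequential IDs.
def result_url(result_id: int) -> str:
    return f"https://www.speedtest.net/result/{result_id}"

# Nonexistent IDs reportedly return HTTP 500 rather than 404, so a
# scraper would need to treat 500 as "gap in the sequence" rather
# than as a transient server error worth retrying forever.
for rid in range(110_000_000, 110_000_003):
    print(result_url(rid))
```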
17:48:06Naruyoko quits [Client Quit]
17:54:13guest joins
18:00:11<guest>is this where twitter archiving is being discussed?
18:01:41<Barto>if you have findings about archiving twitter, talk in private to arkiver. Otherwise if it's public knowledge already, go on you're at the right place :)
18:07:50<guest>what sort of thing would need to be said in private?
18:08:29someone1 quits [Ping timeout: 258 seconds]
18:08:33qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
18:09:29<Barto>i'm quoting the man: @arkiver | Got interesting information on Twitter? PM me! Do not dump the information in a publicly logged channel.
18:11:54someone1 joins
18:14:00Island joins
18:15:19<fireonlive>guest: bypases, methods, etc
18:15:29<albertlarsan68>Would the team be OK to archive the dataset of the Mersenne primes (https://mersenne.org)?
18:15:29<guest>I was going to ask about ways to save tweets. I was also going to post some tools that kinda work; im not the one who found it but i dont know how public knowledge it is already.
18:15:55<fireonlive>best to PM arkiver in these trying times
18:15:59<guest>im guessing it should be said in private to prevent people from overusing it?
18:16:16<albertlarsan68>guest: See the wiki for what is already public knowledge
18:16:42<albertlarsan68>guest: https://wiki.archiveteam.org/index.php/Twitter
18:17:17<guest>ok my stuff isnt mentioned there
18:18:57<fireonlive>ah ok, pls pm the arkiver in that case
18:19:07<fireonlive>it’s not a usual thing but twitter is very… unusual
18:21:00Ivan226 joins
18:21:04<Ivan226>so
18:21:05<Ivan226>uh
18:21:07<Ivan226>Twitter
18:21:32<Ivan226>is there a dedicated channel for it?
18:22:23<fireonlive>no, just anything about bypasses etc pm to arkiver
18:23:14<fireonlive>if elon makes a tweet saying you need to fart in a jar and mail it to him for 100 more tweet views then here is fine
18:25:18<Ivan226>noted
18:25:35<fireonlive>:)
18:26:25<Ivan226>good thing I saved the profiles of the people I followed *before* musk entered and musk'ed around
18:27:20<guest>i really shouldve started saving stuff when musk bought twitter
18:27:35<guest>i knew stuff would get messed up but i didnt start till a day ago.
18:27:39<guest>right before the rate limit hit
18:40:33<Ivan226>https://www.youtube.com/watch?v=OuFv1A7fEOE
18:40:40<Ivan226>people already speedrunning ratelimit LMAO
18:48:00albertlarsan68 quits [Quit: The Lounge - https://thelounge.chat]
18:48:09albertlarsan68 (AlbertLarsan68) joins
18:49:18driib (driib) joins
18:52:18<fireonlive>lmao
18:52:32<Barto>guest: :-) We did start saving shit before shitstorm happened. You can't rely on a single entity. If you ask my personal opinion this is the prime example where twitter is digging their own grave
18:54:19<fireonlive>gotta love socialbot/Aramaki :)
18:54:26<Barto>this is a blvd for the fediverse
18:54:36Naruyoko joins
18:54:52TheTechRobo quits [Ping timeout: 258 seconds]
18:59:54TheTechRobo (TheTechRobo) joins
19:09:16<@JAA>fuzzy8021: Docker's log files would still end up in /var/lib/docker, i.e. on your /.
19:13:58<Barto>--log-opt max-size=10m ;)
19:20:20<fireonlive>i really really need to remember to do that
19:20:42<Barto>(took me longer than i admit to finally do it as well)
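[Editorial note: tying JAA's and Barto's points together — Docker's default `json-file` log driver writes under /var/lib/docker, i.e. on /, and grows unbounded unless capped per container. A sketch of the full invocation; the image name and bind paths are placeholders, not a real project:]

```shell
# Cap container logs so they can't fill / (json-file is the default driver).
# Image name and bind paths below are hypothetical placeholders.
docker run -d \
  --log-driver json-file \
  --log-opt max-size=10m \
  --log-opt max-file=3 \
  --mount type=bind,source=/data2/docker/project,target=/grab/data \
  example/warrior-project
```

The same caps can be set globally for all containers via `log-opts` in /etc/docker/daemon.json.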
19:36:30AmAnd0A quits [Ping timeout: 265 seconds]
19:39:56AmAnd0A joins
19:40:06killsushi_ quits [Ping timeout: 258 seconds]
19:45:00atphoenix__ quits [Remote host closed the connection]
19:45:35AmAnd0A quits [Read error: Connection reset by peer]
19:45:43atphoenix__ (atphoenix) joins
19:45:59AmAnd0A joins
19:46:08albertlarsan68 quits [Changing host]
19:46:08albertlarsan68 (AlbertLarsan68) joins
19:47:41<lennier1>One thing that occurred to me is that there are some big text-only archives like https://archive.org/details/twitterstream so we could complete those by grabbing media links.
19:48:18<lennier1>And people with their own archives should probably be encouraged to upload them to IA.
19:52:21spirit quits [Quit: Leaving]
19:53:18<h2ibot>DigitalDragon edited Twitter (+249, add ratelimit information): https://wiki.archiveteam.org/?diff=50108&oldid=50101
19:53:19<h2ibot>AlbertLarsan68 edited Skyblog (-2, Switch Not Yet to Upcoming): https://wiki.archiveteam.org/?diff=50109&oldid=50003
19:58:23GravelCZ quits [Remote host closed the connection]
20:03:31<tzt>also this https://archive.org/details/archiveteam_twitter
20:07:38<lennier1>And anything run through socialbot typically skipped videos.
20:12:28<guest>There are some sites that specifically existed to archive tweets, but i dont remember what they are.
20:12:52<guest>and i imagine they might have trouble saving anything new right now.
20:18:53<fireonlive>archive.today is fucked for twitter atm
20:23:41pokechu22 quits [Ping timeout: 252 seconds]
20:25:47pokechu22 (pokechu22) joins
20:31:42greg joins
20:39:31tiger_millionaire quits [Ping timeout: 258 seconds]
20:42:26<fireonlive>https://pbs.twimg.com/media/F0CJ2SEWIAAAO-X?format=jpg&name=orig
20:42:29<fireonlive>nooooo way lol
20:45:30someone1 quits [Client Quit]
20:49:36<Terbium>i hope that's not real
20:50:07<@arkiver>who is gg551015
20:53:49<fireonlive>it's (supposedly) from blind (https://en.wikipedia.org/wiki/Blind_(app)) where you verify with your work email so the twitter logo there verifies they at least (once) worked there
20:54:08<fireonlive>supposedly there's reverifications but not sure how often
20:54:39<@arkiver>any confirmation this is true and not a fabricated image?
20:54:41that_lurker quits [Quit: Clowning around is not the same as fooling around...I am a clown, not a fool]
20:54:49that_lurker (that_lurker) joins
20:55:31<fireonlive>tried a bit to find it but no luck so far
20:55:56<fireonlive>(here's an example post where it shows who works where, I guess mobile is different? https://www.teamblind.com/post/Twitter-bug-causes-self-DDOS-tied-to-Elon-Musks-emergency-blocks-and-rate-limits-Its-amateur-hour”-csF0MkWZ )
20:56:48<fuzzy8021>seems like a lot for logs but i will try limiting them
21:01:00<fireonlive>you can choose whatever you wish i suppose fuzzy8021
21:01:12<fireonlive>i woke up to like a 20GB one one day lol
21:11:52nickmas joins
21:13:19Dango360 quits [Read error: Connection reset by peer]
21:16:23nickmas quits [Remote host closed the connection]
21:19:47Dango360 (Dango360) joins
21:19:57Dango360 quits [Remote host closed the connection]
21:28:13Dango360 (Dango360) joins
21:33:55nicolas17 joins
21:39:11<guest>I found out that adding a forward slash "/" at the end of a url technically counts as a different url on some archiving sites like archive.is (but apparently not with archive.org), so keep that in mind when searching for twitter archives or saving them.
21:39:44<razul>I think it depends on the configuration of the webserver, how it deals with trailing slashes.
21:48:47hitgrr8 quits [Client Quit]
21:54:30<Barto>yep, technically GET /something and GET /something/ is not the same HTTP request
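[Editorial note: as Barto says, the two forms are distinct request targets. A quick check of the path component shows why an archive keyed on exact URLs, like archive.is, stores them separately:]

```python
from urllib.parse import urlsplit

a = urlsplit("https://example.com/something")
b = urlsplit("https://example.com/something/")

# Same host, but different paths, hence different HTTP request targets.
print(a.path)            # /something
print(b.path)            # /something/
print(a.path == b.path)  # False
```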
22:00:04Hajdar quits [Remote host closed the connection]
22:00:19Hajdar (Hajdar) joins
22:47:25guest quits [Ping timeout: 265 seconds]
22:49:28AmAnd0A quits [Ping timeout: 258 seconds]
22:49:36AmAnd0A joins
22:51:05greg quits [Ping timeout: 252 seconds]
22:51:46qwertyasdfuiopghjkl quits [Ping timeout: 265 seconds]
22:51:52greg joins
22:56:00AmAnd0A quits [Read error: Connection reset by peer]
22:56:17AmAnd0A joins
22:59:26guest joins
23:04:20guest quits [Ping timeout: 265 seconds]
23:04:29qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins
23:06:16greg quits [Ping timeout: 265 seconds]
23:12:51guest joins
23:20:49BlueMaxima joins
23:40:02AmAnd0A quits [Ping timeout: 252 seconds]
23:40:05AmAnd0A joins
23:51:16imer5 (imer) joins
23:54:53imer quits [Ping timeout: 252 seconds]
23:54:53imer5 is now known as imer