00:10:06krvme quits [Read error: Connection reset by peer]
00:11:32sidpatchy joins
00:14:32Island quits [Ping timeout: 252 seconds]
00:21:30<nicolas17>ok I whipped up a script to do that now
00:21:40<nicolas17>I'm definitely getting a speedup but not *that* good
00:22:05<nicolas17>went through 24GiB of the tar file in 6 minutes
00:22:38<nicolas17>which would be 71MiB/s if I was downloading the full thing, which seems impossible to get from archive.org :p
00:22:51<nicolas17>but... it could be better
00:35:17<nicolas17>there we go, I almost halved the number of requests needed, 100MiB/s equivalent now :D
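The technique nicolas17 describes — indexing a remote tar by reading only the 512-byte member headers and seeking past the data — can be sketched as follows. This is a minimal illustration, not the actual script (which is not shown in the log); it handles plain ustar entries only, and in the real setup `f` would be a file-like object backed by HTTP Range requests against archive.org.

```python
import io
import tarfile

BLOCK = 512  # tar header/record size


def parse_header(block: bytes):
    """Parse one 512-byte ustar header; return (name, size), or None at end-of-archive."""
    if len(block) < BLOCK or block == b"\0" * BLOCK:
        return None
    name = block[0:100].rstrip(b"\0").decode("utf-8", "replace")
    # size field: octal, NUL/space padded, offset 124, length 12
    size = int(block[124:136].rstrip(b"\0 ") or b"0", 8)
    return name, size


def index_tar(f):
    """Walk a seekable tar, reading only headers and skipping member data.

    Backed by ranged HTTP reads, this touches a tiny fraction of the
    archive instead of downloading the whole thing.
    """
    index = []
    offset = 0
    while True:
        f.seek(offset)
        hdr = parse_header(f.read(BLOCK))
        if hdr is None:
            break
        name, size = hdr
        index.append((name, offset + BLOCK, size))  # (name, data offset, size)
        # skip the data, rounded up to the next 512-byte boundary
        offset += BLOCK + ((size + BLOCK - 1) // BLOCK) * BLOCK
    return index
```

The speedup comes from the skip: for a tar full of large videos, almost every byte between headers is never requested.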
00:43:31<nicolas17>"exception: connection aborted" nooo, time to add retries
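Adding retries for a transient "connection aborted" usually means wrapping the request in an exponential-backoff loop. A minimal sketch (the function name and parameters are illustrative, not from nicolas17's script):

```python
import time


def with_retries(fn, attempts=5, base_delay=1.0, retry_on=(ConnectionError, OSError)):
    """Call fn(), retrying transient failures with exponential backoff.

    Raises the last exception if every attempt fails.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))
```

Usage would be e.g. `with_retries(lambda: fetch_range(start, length))`, so a single aborted connection no longer kills a multi-hour indexing run.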
00:47:32etnguyen03 quits [Ping timeout: 252 seconds]
00:49:25etnguyen03 (etnguyen03) joins
00:52:04Jake quits [Quit: Ping timeout (120 seconds)]
00:52:46BlueMaxima quits [Read error: Connection reset by peer]
00:53:21BlueMaxima joins
00:55:14AmAnd0A quits [Read error: Connection reset by peer]
00:55:20Jake (Jake) joins
00:57:09AmAnd0A joins
01:01:38Jake quits [Client Quit]
01:11:21Jake (Jake) joins
01:30:39<nicolas17>I should have done this script earlier lol
01:34:48apache2_ joins
01:34:55apache2 quits [Read error: Connection reset by peer]
02:03:03TheTechRobo quits [Excess Flood]
02:03:34TheTechRobo (TheTechRobo) joins
02:07:21<Rootliam>And now my program downloads 256kb at once and caches it, is this officially a race now or :P
02:08:04<nicolas17>I'm benchmarking :o
02:17:59<nicolas17>I tried readahead of 128KB, 256KB, and 512KB, the speed difference was completely lost in the noise
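A readahead cache of the kind being benchmarked here might look like the sketch below: each cache miss fetches a full chunk (e.g. 256 KiB), so many small header-sized reads collapse into one ranged request. This is an assumed design, not the actual code; `fetch(start, length)` stands in for whatever does the HTTP Range request.

```python
class ReadaheadReader:
    """File-like sequential reads over a ranged fetcher, rounding each
    cache miss up to `chunk` bytes so small reads share one request."""

    def __init__(self, fetch, size, chunk=256 * 1024):
        self.fetch = fetch      # fetch(start, length) -> bytes
        self.size = size        # total size of the remote file
        self.chunk = chunk
        self.pos = 0
        self.buf_start = 0
        self.buf = b""

    def seek(self, pos):
        self.pos = pos

    def read(self, n):
        out = bytearray()
        while n > 0 and self.pos < self.size:
            lo = self.buf_start
            hi = self.buf_start + len(self.buf)
            if not (lo <= self.pos < hi):
                # cache miss: fetch a full readahead chunk starting here
                length = min(self.chunk, self.size - self.pos)
                self.buf = self.fetch(self.pos, length)
                self.buf_start = self.pos
                lo, hi = self.pos, self.pos + len(self.buf)
            take = min(n, hi - self.pos)
            off = self.pos - lo
            out += self.buf[off:off + take]
            self.pos += take
            n -= take
        return bytes(out)
```

With a layout like this, the chunk size mainly trades request count against wasted bytes, which is consistent with 128/256/512 KiB all disappearing into the noise once per-request latency dominates.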
02:28:18<@JAA>> Cult of the Lamb dev says it will delete the game on January 1
02:28:28<@JAA>... due to the Unity changes
02:28:31<fireonlive>:\
02:28:33<@JAA>https://www.pcgamesn.com/cult-of-the-lamb/deleted
02:30:24<@JAA>I was going to suggest 'discord' if we wanted to create a channel, but...
02:31:10fireonlive blinks
02:31:57<fireonlive>hm they posted this after? https://twitter.com/cultofthelamb/status/1702091821273461176
02:31:58<eggdrop>nitter: https://nitter.net/cultofthelamb/status/1702091821273461176
02:33:09<fireonlive>the article's source was also two posts from their shitpost-social-media-account so ¯\_(ツ)_/¯
02:34:31<@JAA>Ah indeed :-)
02:41:07<nicolas17>IA download speeds are way too variable to test this properly
02:41:30<fireonlive>we need to get nicolas17 a 10Gig interconnect to IA
02:41:31<nicolas17>suddenly dropped to 800KiB/s *despite* skipping chunks
02:43:16<project10>I think optane9 has the better part of 10G to IA, it's on a network with peering to IA at the SFMIX. Maybe run it there? :D
02:56:01dumbgoy quits [Ping timeout: 265 seconds]
03:16:12DogsRNice quits [Read error: Connection reset by peer]
03:48:25<nicolas17>this is all over the place...
03:48:46<nicolas17>64KB: 32s 35s 43s
03:48:53<nicolas17>256KB: 11s 17s 18s
03:49:02<nicolas17>1024KB: 14s 25s 50s
03:59:02benjins quits [Read error: Connection reset by peer]
04:10:08superkuh joins
04:17:22<h2ibot>PaulWise created MoinMoin (+2087, create MoinMoin project page): https://wiki.archiveteam.org/?title=MoinMoin
04:31:59magmaus3 (magmaus3) joins
04:38:26<h2ibot>PaulWise edited MoinMoin (+5058, add more moinmoin wikis from google/bing): https://wiki.archiveteam.org/?diff=50769&oldid=50768
04:38:59etnguyen03 quits [Client Quit]
04:41:26<h2ibot>PaulWise edited MoinMoin (+145, another strategem): https://wiki.archiveteam.org/?diff=50770&oldid=50769
04:44:07benjins joins
04:51:28<h2ibot>PaulWise edited MoinMoin (+3477, more, sorted): https://wiki.archiveteam.org/?diff=50771&oldid=50770
04:56:41mtmustski quits [Ping timeout: 252 seconds]
05:11:08<nicolas17>43.1GiB tar file indexed in 3m22s :D
05:13:31<Rootliam>how the hell
05:14:06<nicolas17>I had others, especially those with few videos and mostly html pages, taking longer than just downloading the entire tar
05:14:26mtmustski joins
05:15:01<nicolas17>so it depends on the tar content *and* on the speed of the particular IA server I hit
05:15:13<nicolas17>especially latency more than throughput...
05:15:33<h2ibot>PaulWise edited MoinMoin (+5010, more, sorted): https://wiki.archiveteam.org/?diff=50772&oldid=50771
05:15:56<nicolas17>just finished a big one, 255GiB in 49m30s
05:16:19<nicolas17>I have another of a similar size with an ETA of 4 hours -.-
05:33:10Exorcism (exorcism) joins
05:34:03<pabs>anyone got any scripts/something to automate (browser-based?) searching using Bing?
05:37:56railen63 quits [Ping timeout: 265 seconds]
05:38:31<fireonlive>flashfire42?
05:38:38<fireonlive>or do you manually rawdog that
05:40:01<nicolas17>phrasing
05:40:55<fireonlive>:3
05:57:23<nicolas17>indexing 8 tar files at the same time, to do them at this speed while downloading the whole .tar I would need to download from IA at a total speed of 433 MB/s >:3
05:59:01TastyWiener95 quits [Client Quit]
06:08:02Arachnophine9 (Arachnophine) joins
06:09:50Jake quits [Ping timeout: 265 seconds]
06:10:15Arachnophine quits [Ping timeout: 258 seconds]
06:10:15Arachnophine9 is now known as Arachnophine
06:11:13nyany_ (nyany) joins
06:11:13HotSwap` joins
06:11:24nyany quits [Ping timeout: 258 seconds]
06:11:24betamax quits [Ping timeout: 258 seconds]
06:11:24HotSwap quits [Ping timeout: 258 seconds]
06:12:40Jake (Jake) joins
06:12:56betamax (betamax) joins
06:20:32<pabs>https://livingcomputers.org/Closure.aspx
06:22:01<nicolas17>pabs: it seems they closed in 2020, but it sucks that the announcement doesn't have a date
06:22:48<pabs>website got an AB in 2020, no subdomains though, inc the wiki
06:27:43nicolas17 quits [Ping timeout: 265 seconds]
06:38:32<pabs>started some jobs
06:43:45decky_e joins
06:48:16taaffeite joins
07:08:06<Rootliam>literally how
07:08:47<Rootliam>my program is taking 20 seconds just to get 5 MB into the file with 256 or 512 KB downloaded at a time, and it's written in C ._.
07:09:14Unholy236131661808515 quits [Ping timeout: 252 seconds]
07:09:33<Rootliam>and thats also with downloading html files turned off
07:12:12Arcorann (Arcorann) joins
07:13:33IDK (IDK) joins
07:33:16<Doomaholic>taaffeite: What kind of errors were you getting?
07:33:51<Doomaholic>Is it running at all or is it just a problem with the page you're trying to download
07:41:06<taaffeite>I'm receiving several warnings and errors: EBADENGINE is an unsupported engine, npm ERR! path /usr/local/lib/node_modules/mwoffliner/node_modules/sharp command failed, Installation error: Expected Node.js version >=14.15.0 but found 12.22.9.
07:41:34<taaffeite>So an outdated Node.js version?
07:43:35<Doomaholic>Yeah that's likely the issue
07:43:47<Doomaholic>How did you install it?
07:44:19<Doomaholic>Usually the repo in your distribution is outdated
07:46:19<taaffeite>I followed the instructions on the GitHub page. Using the latest version of Linux Mint. 'npm i -g mwoffliner'
07:46:29<Doomaholic>I mean how did you install Node?
07:50:10<taaffeite>Perhaps I didn't actually. I just installed the redis-server.
07:50:25<Doomaholic>I see
07:50:45<taaffeite>I downloaded the Node.js binary from their site, but couldn't install that.
07:51:47<Doomaholic>Well you should try to install Node then
07:51:49<Doomaholic>sudo apt install nodejs
07:52:05<Doomaholic>Then run nodejs -v and see what version it gave you
07:53:08<taaffeite>'nodejs is already the newest version (12.22.9~dfsg-1ubuntu3)'
07:53:43<taaffeite>The apt package is out of date?
07:53:59<Doomaholic>Oh okay, yes
07:54:49<Doomaholic>I haven't tried this myself but apparently there is a Node package that will update it for you
07:55:14<Doomaholic>npm install -g n
07:55:31<Doomaholic>Then run:
07:55:33<Doomaholic>n stable
07:55:37<Doomaholic>And it should update
07:56:26<Doomaholic>If that doesn't work you'll have to reinstall with a newer version
07:57:13<taaffeite>Okay that worked. It's now v18.17.1. I'll try running mwoffliner again.
07:57:54<Doomaholic>Nice
07:58:07<Doomaholic>Give it a try
08:01:22<taaffeite>A wall of deprecation and unmaintained warnings, then 'added 1191 packages in 2m', '117 packages are looking for funding'.
08:01:39<Doomaholic>That's normal for npm :P
08:02:20<Doomaholic>If you just see that when installing packages then it's likely fine
08:02:40Rootliam leaves
08:02:43<taaffeite>I'm getting a help page from mwoffliner so I think we're good.
08:02:57<Doomaholic>Sweet
08:03:45<taaffeite>I was told by someone working on the ZIM project that this script is undergoing maintenance and might not work until it's been repaired sometime in the next several months. But I'll give it a go. Thanks for the help.
08:04:41<Doomaholic>You're welcome :)
08:05:47Miori quits [Client Quit]
08:10:17nulldata quits [Ping timeout: 252 seconds]
08:13:30nulldata (nulldata) joins
08:16:26Miori joins
08:24:25Doomaholic quits [Client Quit]
08:27:25Doomaholic (Doomaholic) joins
08:36:46razul quits [Ping timeout: 265 seconds]
08:41:43BlueMaxima quits [Read error: Connection reset by peer]
09:04:01Doomaholic quits [Changing host]
09:04:01Doomaholic (Doomaholic) joins
09:04:44katocala quits [Ping timeout: 252 seconds]
09:05:04katocala joins
09:09:09<AK>Hmm anyone thought about archiving parts of the Unity forums? This is 128 pages of comments on the new pricing changes that should probably be saved (in case they delete it like they did their GitHub) https://forum.unity.com/threads/unity-plan-pricing-and-packaging-updates.1482750/page-128
09:32:21ThreeHM_ quits [Ping timeout: 265 seconds]
09:39:54ThreeHM_ (ThreeHeadedMonkey) joins
09:54:14ThreeHM_ quits [Ping timeout: 252 seconds]
09:57:21icedice (icedice) joins
10:00:34ThreeHM_ (ThreeHeadedMonkey) joins
10:26:38<h2ibot>PaulWise edited Mailman2 (+1125, add new lists, move not done lists to the right…): https://wiki.archiveteam.org/?diff=50773&oldid=50767
10:48:38Peroniko joins
11:10:06benjinsm joins
11:12:20benjins quits [Ping timeout: 252 seconds]
11:13:20ThreeHM_ is now known as ThreeHM
11:45:40Peroniko quits [Client Quit]
12:13:00Peroniko (Peroniko) joins
12:22:11HP_Archivist quits [Ping timeout: 252 seconds]
12:34:02<pabs>https://investors.unity.com/news/news-details/2022/Unity-Announces-Merger-Agreement-with-ironSource/default.aspx
12:57:05etnguyen03 (etnguyen03) joins
13:11:23sec^nd quits [Remote host closed the connection]
13:11:46sec^nd (second) joins
13:26:51<@JAA>pabs: little-things/bing-scrape, though I haven't used it in some time.
13:29:54HP_Archivist (HP_Archivist) joins
13:39:19imer quits [Killed (NickServ (GHOST command used by imer8))]
13:39:26imer (imer) joins
13:52:52taaffeite quits [Ping timeout: 265 seconds]
13:58:21BigBrain_ quits [Ping timeout: 245 seconds]
14:00:31BigBrain_ (bigbrain) joins
14:32:32Peroniko quits [Read error: Connection reset by peer]
14:33:14Peroniko joins
14:35:51Terbium quits [Quit: http://quassel-irc.org - Chat comfortably. Anywhere.]
14:35:53monoxane quits [Read error: Connection reset by peer]
14:36:18monoxane (monoxane) joins
14:36:29Terbium joins
14:43:58Terbium quits [Client Quit]
14:46:53Terbium joins
14:50:11Island joins
14:56:40HP_Archivist quits [Ping timeout: 265 seconds]
14:57:09dan- quits [Read error: Connection reset by peer]
14:57:10cptcobalt quits [Read error: Connection reset by peer]
14:57:10qxtal quits [Read error: Connection reset by peer]
14:57:10seadog007 quits [Write error: Connection reset by peer]
14:57:11Ctrl-S quits [Read error: Connection reset by peer]
14:57:12monohedron quits [Read error: Connection reset by peer]
14:57:17JSharp quits [Read error: Connection reset by peer]
14:57:20@HCross quits [Read error: Connection reset by peer]
14:57:20ghuntley quits [Read error: Connection reset by peer]
14:57:20jonty quits [Read error: Connection reset by peer]
14:57:20ShadowJonathan quits [Read error: Connection reset by peer]
14:57:20justcool393 quits [Read error: Connection reset by peer]
14:57:20IDK quits [Read error: Connection reset by peer]
14:57:20tech234a quits [Read error: Connection reset by peer]
14:57:20mgrandi quits [Read error: Connection reset by peer]
14:57:20thejsa quits [Read error: Connection reset by peer]
14:57:20loopy quits [Read error: Connection reset by peer]
14:57:20todb quits [Read error: Connection reset by peer]
14:57:21VonGuard quits [Read error: Connection reset by peer]
14:57:21russss quits [Read error: Connection reset by peer]
14:57:21murmur quits [Read error: Connection reset by peer]
14:57:21@rewby|backup quits [Read error: Connection reset by peer]
14:57:23devsnek quits [Read error: Connection reset by peer]
14:57:30zifnab06 quits [Read error: Connection reset by peer]
14:57:34@hook54321 quits [Read error: Connection reset by peer]
14:58:36cptcobalt joins
14:58:37monohedron (monohedron) joins
14:58:39qxtal (qxtal) joins
14:58:42JSharp (JSharp) joins
14:58:43ShadowJonathan (ShadowJonathan) joins
14:58:43ghuntley (ghuntley) joins
14:58:44murmur joins
14:58:45seadog007 (seadog007) joins
14:58:47todb joins
14:58:48thejsa joins
14:58:48justcool393 (justcool393) joins
14:58:58zifnab06 joins
14:59:05VonGuard joins
14:59:06loopy joins
14:59:08russss (russss) joins
14:59:10jonty (jonty) joins
14:59:10Ctrl-S joins
14:59:12devsnek (devsnek) joins
14:59:15ghuntley quits [Max SendQ exceeded]
14:59:20ghuntley (ghuntley) joins
14:59:24HCross (HCross) joins
14:59:25@ChanServ sets mode: +o HCross
14:59:26mgrandi (mgrandi) joins
14:59:37dan- joins
14:59:51tech234a (tech234a) joins
14:59:51rewby|backup (rewby) joins
14:59:51@ChanServ sets mode: +o rewby|backup
14:59:52IDK (IDK) joins
14:59:52ghuntley quits [Max SendQ exceeded]
14:59:58ghuntley (ghuntley) joins
15:00:08hook54321 (hook54321) joins
15:00:08@ChanServ sets mode: +o hook54321
15:00:29ghuntley quits [Max SendQ exceeded]
15:00:34ghuntley (ghuntley) joins
15:01:06ghuntley quits [Max SendQ exceeded]
15:01:11ghuntley (ghuntley) joins
15:01:43ghuntley quits [Max SendQ exceeded]
15:01:48ghuntley (ghuntley) joins
15:02:20ghuntley quits [Max SendQ exceeded]
15:02:25ghuntley (ghuntley) joins
15:02:57ghuntley quits [Max SendQ exceeded]
15:03:02ghuntley (ghuntley) joins
15:03:34ghuntley quits [Max SendQ exceeded]
15:03:39ghuntley (ghuntley) joins
15:04:11ghuntley quits [Max SendQ exceeded]
15:04:16ghuntley (ghuntley) joins
15:04:48ghuntley quits [Max SendQ exceeded]
15:04:53ghuntley (ghuntley) joins
15:05:25ghuntley quits [Max SendQ exceeded]
15:05:30ghuntley (ghuntley) joins
15:06:02ghuntley quits [Max SendQ exceeded]
15:25:10Peroniko quits [Client Quit]
15:26:04Peroniko joins
15:29:32Arcorann quits [Ping timeout: 265 seconds]
15:29:44<Ryz>joepie91|m and others regarding interest in Unity, need to voraciously start finding Unity stuff :C
15:36:17HP_Archivist (HP_Archivist) joins
15:41:35Webuser244 joins
15:47:02Webuser244 leaves
15:48:26etnguyen03 quits [Ping timeout: 252 seconds]
15:55:57dumbgoy joins
17:03:19<icedice>You guys are going to start archiving Unity games?
17:03:53VoynichCR (VoynichCR) joins
17:05:15<@JAA>Exorcism: WordPress works well with simple recursive crawling like AB. Are you proposing a large-scale project?
17:07:30<Exorcism>JAA: let's say yzqzss launched its own project https://github.com/saveweb/wordpress-rss-archiver then I don't know if you really want to, that's why I'm asking 👀
17:09:14<@JAA>The README sounds like this is a 'run this to continuously submit new posts to SPN' thing...?
17:09:34<pokechu22>Isn't there also some kind of wordpress push notification system that's tied into IA?
17:10:05<pokechu22>related to https://developer.wordpress.com/docs/firehose/ (though I think it includes stuff not hosted by wordpress.com)?
17:10:25<@JAA>The README claims you have to pay for that.
17:10:58<pokechu22>Right, but I think IA does?
17:11:23<pokechu22>https://archive.org/details/NO404-WP?tab=about
17:12:03<pokechu22>(not to be confused with https://archive.org/details/NO404-WKP?tab=about)
17:13:14railen63 joins
17:15:06<@JAA>Ah nice, I've long wanted to look into leveraging Jetpack for that.
17:15:06<Exorcism><JAA> "The README sounds like this is a..." <- yep, that's it :p
17:17:37<icedice>You might want to add https://mangadex.com/ to the list if you're going to be archiving WordPress sites
17:18:16<icedice>It's a scanlation group site hosting service run by MangaDex and it uses WordPress
17:20:08<icedice>(mangadex.org is the domain used by MangaDex's manga reader)
17:26:48<Exorcism>👌🏻
17:28:16xarph joins
17:28:33<@JAA>A continuous thing for select blogs would fit into #//. Duplicating the IA's project, i.e. doing that for all blogs with Jetpack, probably makes little sense, assuming they're achieving decent coverage there.
17:29:14<@JAA>And as mentioned, one-off archival works very well with AB.
17:35:58<pokechu22>Oh, also, arkiver - what data would you need for a DPoS project for orange? I can build a list of pages that are known to exist (e.g. website front pages, possibly deeper ones too) based on the AB jobs, but I'm not sure what else is needed
17:40:36etnguyen03 (etnguyen03) joins
17:45:57railen63 quits [Remote host closed the connection]
17:46:25railen63 joins
18:06:49Rob joins
18:07:05Rob quits [Remote host closed the connection]
18:14:44etnguyen03 quits [Ping timeout: 252 seconds]
18:20:08Unholy236131661808515 (Unholy2361) joins
18:31:45treora quits [Ping timeout: 265 seconds]
18:31:49VoynichCR quits [Remote host closed the connection]
18:33:07PredatorIWD__ joins
18:33:35treora joins
18:36:11PredatorIWD_ quits [Ping timeout: 252 seconds]
18:38:23treora quits [Ping timeout: 252 seconds]
18:38:31treora joins
18:40:00DogsRNice joins
18:46:01PredatorIWD__ quits [Client Quit]
18:54:53Unholy236131661808515 quits [Ping timeout: 252 seconds]
18:55:45<@arkiver>pokechu22: we need all the links you know about
18:56:14<@arkiver>Exorcism: what is this about?
18:56:48<pokechu22>I've got 2GB of assorted links (some dead but existed in the past via CDX data, some alive, some already saved via AB); I can try to organize that into something actually usable
18:57:38<pokechu22>one other thing is that there are several kinds of links that will need to be remapped into other links because sites link to older domains that no longer work, but I imagine that's pretty easy to do with a script
18:58:18<@arkiver>pokechu22: can you gz or zst the list up and post it?
18:58:22<@arkiver>on transfer.archivete.am
18:58:31<@arkiver>let's do a channel for orange!
18:58:45<@arkiver>any ideas for an orange channel? :)
18:59:09<pokechu22>#webroasting already exists, not sure if we need a dedicated one
18:59:19<@arkiver>ah
18:59:22<@arkiver>alright we'll use that
19:00:07<@arkiver>Exorcism: for wordpress, could you just use #archivebot , and for regularly getting a set of wordpress RSS feeds we could (as JAA suggests) just use #// indeed
19:00:41<@arkiver>I'm not in favor of Archive Team using SPN on a large scale, SPN is not made for that
19:01:15<@arkiver>honestly behind the scenes, SPN is quite busy and regularly has too much to do, so queuing complete wordpress blogs through is maybe not the best way
19:01:27<@arkiver>(plus indeed IA already does something with wordpress)
19:04:09toss (toss) joins
19:08:06<fireonlive>archivebot best bot :)
19:08:38<@arkiver>:)
19:14:19<h2ibot>GroupNebula563 uploaded File:Outdated-warrior-error.png: https://wiki.archiveteam.org/?title=File%3AOutdated-warrior-error.png
19:15:19<h2ibot>Arkiver edited CNET Forums (-33, Reverted edits by…): https://wiki.archiveteam.org/?diff=50776&oldid=50445
19:15:30<Exorcism>👍🏻
19:16:25<@arkiver>i reverted a sneaky spammy edit ^
19:16:43<fireonlive>huh, odd
19:16:51<fireonlive>should block that user I suppose
19:17:06<@arkiver>yeah i marked them as spammer
19:17:14<fireonlive>ah :)
19:17:18<@arkiver>they tried to get in another edit (it was in the mod queue)
19:17:23<fireonlive>ahh
19:17:30<fireonlive>gotta love spammers...
19:17:38<@arkiver>it was in the CNET announcement, which went like "blabla... Thanks, CNET team"
19:17:51<@arkiver>and they added "BLABLA... Thanks [spam link], CNET team"
19:17:54<@arkiver>sneaky
19:17:55<@arkiver>:P
19:17:59<fireonlive>indeed :3
19:18:02<@arkiver>they were caught though
19:19:15<@arkiver>Exorcism: or do you have different thoughts about that?
19:23:37<pokechu22>plcp - you should probably join #webroasting
19:26:01<pokechu22>(logs at https://irclogs.archivete.am/webroasting/2023-09-14 for recent stuff)
19:26:46nicolas17 joins
19:29:02<Exorcism><arkiver> "Exorcism: or do you have..." <- not really, I just prefer to use wordpress archiver, that's it haha
19:29:30taaffeite joins
19:30:41<@arkiver>right
19:43:30taaffeite quits [Client Quit]
19:57:05Unholy236131661808515 (Unholy2361) joins
20:23:29systwi__ is now known as systwi
20:49:59BearFortress quits [Ping timeout: 265 seconds]
21:09:19etnguyen03 (etnguyen03) joins
21:43:09etnguyen03 quits [Ping timeout: 265 seconds]
21:46:23toss quits [Client Quit]
21:54:11Unholy236131661808515 quits [Ping timeout: 252 seconds]
22:14:52Unholy236131661808515 (Unholy2361) joins
22:30:02<fireonlive>https://torrentfreak.com/ace-takes-aim-at-zoro-to-successor-aniwatch-to-230912/
22:30:33<fireonlive>"ACE Takes Aim at Zoro.to Successor Aniwatch.to" "Below is a list of all domains targeted by MPA/ACE in a recent DMCA subpoena wave"
23:14:36sec^nd quits [Ping timeout: 245 seconds]
23:15:35@Sanqui quits [Ping timeout: 252 seconds]
23:19:30<Peroniko>Moved from #archiveteam-ot: Any idea how feasible it is to archive rateyourmusic.com, considering that they seem to block Wayback Machine IPs, probably because of the amount of traffic? They are a great place for music discovery, and their forum is around 20 years old. Most of the pages are unarchived, and many of those that are just display a block notice because of unusual activity (ex. https://web.archive.org/web/20230909224447/https://rateyourmusic.com/~Fooftilly). Their image CDN isn't blocked though.
23:20:00sec^nd (second) joins
23:20:32nic (nic) joins
23:30:00Mateon1 quits [Quit: Mateon1]
23:30:29Mateon1 joins
23:31:04Mateon1 quits [Client Quit]
23:32:16Mateon1 joins
23:32:32<flashfire42>Well with 2 new trackers coming up I may switch to AT Choice when I head to work today
23:34:53Sanqui joins
23:34:55Sanqui quits [Changing host]
23:34:55Sanqui (Sanqui) joins
23:34:55@ChanServ sets mode: +o Sanqui
23:35:14<Peroniko>Which new ones are coming up?
23:37:28etnguyen03 (etnguyen03) joins
23:41:04<imer>Peroniko: #zowch and one under #webroasting for orange
23:50:30benjins joins
23:54:05benjinsm quits [Ping timeout: 252 seconds]