00:14:54<nicolas17>AB would split it into multiple warcs right?
00:17:01<pokechu22>Yeah
00:18:50<Flashfire42>Do all projects use a bloom filter or is it just #// ?
00:19:17<nicolas17>either way, this 72TB figure puts my 624GiB of Apple simulator runtimes into a good perspective :P
00:20:01<@JAA>Flashfire42: Virtually all projects do.
00:20:50<Flashfire42>I'm just wondering whether queueing stuff to youtube or telegram means there's a chance things get skipped by the bloom filter.
00:22:39<@JAA>Yes
00:23:30<Flashfire42>is there any way to further minimise that?
00:24:30<@JAA>Not really unless you find a pile of money somewhere. :-)
00:25:49<nicolas17>I assume the bloom filter can be made larger to have fewer false positives, but you can never bring it to 0
00:25:50<pokechu22>It's only a risk of the URL you queue colliding with another URL in the same project, so I don't think that's much of a problem, I guess unless the URL itself is invalid for the project?
00:39:42<@JAA>nicolas17: Correct. A pile of money would allow for more memory = larger filter = lower false positive rate. A much bigger pile of money would be needed to avoid false positives entirely (by not using a bloom filter at all).
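The memory/false-positive trade-off described above is the standard bloom filter math: for n items in m bits with the optimal number of hash functions k, the false-positive rate is roughly (1 - e^(-kn/m))^k. A minimal sketch; the item count and filter sizes below are illustrative, not the tracker's actual figures:

    import math

    def bloom_fpr(n_items: int, m_bits: int) -> float:
        """False-positive rate of a bloom filter using the optimal hash count."""
        k = max(1, round((m_bits / n_items) * math.log(2)))  # optimal number of hashes
        return (1.0 - math.exp(-k * n_items / m_bits)) ** k

    # e.g. 1 billion items: more memory = lower rate, but it never reaches 0
    for gib in (4, 32):
        m = gib * 8 * 2**30
        print(f"{gib} GiB filter: ~{bloom_fpr(10**9, m):.2e} false-positive rate")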
00:41:32<nicolas17>I assume inactive projects have their bloom filter offloaded from memory already?
00:42:43<Vokun>It's physically possible to dedup with no false positives on something that big?
00:43:00<Vokun>I guess somewhere someone's doing it, but it sounds impossible
00:43:25<@JAA>pokechu22: Every item being added has a tiny chance of colliding with some previously seen item in the same project. URL validity doesn't really play a role in this, and at least for things queued with qubert, there's a sniff test anyway.
00:43:49<Flashfire42>on something in the thousands or millions maybe but not likely billions
00:43:51<@JAA>nicolas17: Yes, and there are also other things to minimise memory usage.
00:44:10<nicolas17>Vokun: it's definitely possible, but it may not be fast enough
00:44:12<@JAA>Vokun: I'm sure AWS has a product to sell you for this exact thing. ;-)
00:44:39<@JAA>You can shard it efficiently via hashes.
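A minimal sketch of hash-based sharding, assuming a hypothetical shard count (nothing here reflects the tracker's real layout): because a hash routes each URL to the same shard every time, exact dedup only needs per-shard state, and shards can live on separate machines.

    import hashlib

    NUM_SHARDS = 64  # hypothetical; sized so each shard's index fits one machine's RAM

    def shard_for(url: str) -> int:
        """Deterministically route a URL to a shard by hashing it."""
        digest = hashlib.sha1(url.encode()).digest()
        return int.from_bytes(digest[:4], "big") % NUM_SHARDS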
00:44:40<Vokun>7 TB of ram
00:45:01<@JAA>Yeah, it's not going to be cheap.
00:45:09<nicolas17>like, you can have a SQL database with a unique index
00:45:50<nicolas17>but if you can't fit large parts of the database in memory, how long will it take to insert a million item IDs when there's already a billion?
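A minimal sketch of the unique-index approach, using SQLite purely for illustration: the answers are exact, but every insert has to probe the on-disk index, which is what gets slow once the table outgrows RAM.

    import sqlite3

    conn = sqlite3.connect("seen.db")  # hypothetical on-disk store
    conn.execute("CREATE TABLE IF NOT EXISTS seen (item TEXT PRIMARY KEY)")

    def add_if_new(item: str) -> bool:
        """True if the item was never seen before; exact, but disk-bound at scale."""
        cur = conn.execute("INSERT OR IGNORE INTO seen VALUES (?)", (item,))
        conn.commit()
        return cur.rowcount == 1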
00:46:46<@JAA>Yeah, performance is the other big knob.
00:46:48etnguyen03 (etnguyen03) joins
00:48:44<Vokun>I got a new SSD, and I'm thinking of just deduping my url stash on disk instead of in memory. I've deduped chunks of it, but never the chunks against each other, but I think my new SSD is probably fast enough for that
00:50:04<Vokun>I think it'd take me longer to figure out how to do it the right way than to just do this
00:52:00<nicolas17>sort -u
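sort -u is an external merge sort, so it never needs the whole list in memory. If each chunk was itself deduped with sort -u (and is therefore already sorted), merging the presorted chunks is enough; a minimal Python equivalent of that final merge step, with hypothetical file names:

    import heapq

    def merge_dedup(sorted_chunk_paths, out_path):
        """Merge already-sorted chunk files, dropping duplicates, in constant memory."""
        files = [open(p) for p in sorted_chunk_paths]
        try:
            with open(out_path, "w") as out:
                prev = None
                for line in heapq.merge(*files):
                    if line != prev:
                        out.write(line)
                        prev = line
        finally:
            for f in files:
                f.close()

    merge_dedup(["chunk1.txt", "chunk2.txt"], "deduped.txt")  # hypothetical file names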
01:00:54wickedplayer494 quits [Ping timeout: 260 seconds]
01:01:31wickedplayer494 joins
01:02:28lennier2 joins
01:05:34lennier2_ quits [Ping timeout: 260 seconds]
01:12:25PredatorIWD253 joins
01:14:54PredatorIWD25 quits [Ping timeout: 260 seconds]
01:14:54PredatorIWD253 is now known as PredatorIWD25
01:18:01<h2ibot>PaulWise edited Anubis (+49, LWN article): https://wiki.archiveteam.org/?diff=56412&oldid=56099
01:29:08<h2ibot>PaulWise edited Anubis/uncategorized (-69, use <code> to avoid having to add *): https://wiki.archiveteam.org/?diff=56413&oldid=56276
01:40:09<h2ibot>PaulWise edited Anubis/uncategorized (+329, more domains/urls): https://wiki.archiveteam.org/?diff=56414&oldid=56413
02:12:11dabs quits [Read error: Connection reset by peer]
02:16:15<h2ibot>PaulWise edited Anubis (+72, adjust for new metarefresh challenge): https://wiki.archiveteam.org/?diff=56415&oldid=56412
02:31:42etnguyen03 quits [Client Quit]
02:39:11etnguyen03 (etnguyen03) joins
02:51:57etnguyen03 quits [Remote host closed the connection]
02:57:37lennier2_ joins
03:00:47lennier2 quits [Ping timeout: 276 seconds]
03:31:59magmaus3 quits [Ping timeout: 276 seconds]
03:32:09magmaus3 (magmaus3) joins
03:36:29<Dango360>pabs: i'm guessing it was intended that Anubis/uncategorized is now super condensed? https://transfer.archivete.am/13ORzn/Screenshot%202025-07-11%20at%2004.35.54.png
03:36:29<eggdrop>inline (for browser viewing): https://transfer.archivete.am/inline/13ORzn/Screenshot%202025-07-11%20at%2004.35.54.png
03:37:01<pabs>oh :/
03:37:59<pabs>hmm, it isn't condensed here
03:38:46<Dango360>i'm on chrome 138
03:41:21<pokechu22>pabs: did you want <pre> instead of <code> ?
03:43:07<pabs>kinda, but the links aren't HTMLified :/
03:43:30<h2ibot>PaulWise edited Anubis/uncategorized (-2, use <pre>): https://wiki.archiveteam.org/?diff=56416&oldid=56414
03:43:59<pabs>just wanted \n to be turned into <br/> automatically. apparently that requires the <poem> extension, which isn't enabled...
03:44:25<BlankEclair>enable scribunto, and i can make lua mess real quick :thumbs_up:
03:47:14Guest58 joins
03:47:17<BlankEclair>wait, y'all have string functions enabled in parserfunctions
03:47:45<BlankEclair>> Error: String exceeds 1,000 character limit.
03:47:47<BlankEclair>oh
03:52:58<BlankEclair>there we go
03:53:09<BlankEclair>now everyone can be unhappy
03:53:32<h2ibot>BlankEclair edited Anubis/uncategorized (-7, Make links work (Parsoid users hate this trick!)): https://wiki.archiveteam.org/?diff=56417&oldid=56416
03:55:38<BlankEclair>oh wait, this hack also works w/ parsoid: https://meta.miraheze.org/wiki/user:blankEclair/Quips?action=parsermigration-edit
03:55:43magmaus3 quits [Read error: Connection reset by peer]
03:55:52magmaus3 (magmaus3) joins
04:01:00Radzig quits [Quit: ZNC 1.10.1 - https://znc.in]
04:01:38Radzig joins
04:11:05Snivy quits [Quit: The Lounge - https://thelounge.chat]
04:15:53Snivy (Snivy) joins
04:21:35Guest58 quits [Client Quit]
04:22:09Guest58 joins
04:25:33ThetaDev quits [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
04:26:02ThetaDev joins
04:26:49Guest58 quits [Ping timeout: 260 seconds]
04:36:38Shjosan quits [Quit: Am sleepy (-, – )…zzzZZZ]
04:37:04Point joins
04:38:39Shjosan (Shjosan) joins
04:42:33Guest58 joins
04:42:38<Point>wanting to run archive team via docker on my home server; reading the guide, it said to ask here for the link to the docker image for a specific project. no real preference on project, ideally whatever is needed most/underrepresented
04:44:36<BlankEclair>Point: you can also run the warrior as a docker image
04:44:50benjins3_ quits [Read error: Connection reset by peer]
04:45:53<Point>yes, i'm wanting to run it as a docker image, though the guide on the wiki page i saw for that said to ask here for the image link
04:48:43<Point>ah, i had found a list of the docker images, ill pick one from there, thank you
04:52:53<pabs>BlankEclair++
04:52:53<eggdrop>[karma] 'BlankEclair' now has 14 karma!
04:53:14skyrocket quits [Ping timeout: 276 seconds]
04:57:28skyrocket joins
04:59:39<Point>oh i misread what was said by BlankEclair, or rather missed that the Warrior decides the project it is working on automatically. thank you for the suggestion, ill go with that
04:59:50<BlankEclair>mhm ^_^
05:00:02<BlankEclair>it automatically handles doing the project-specific grab stuff
05:00:11<BlankEclair>and provides a web ui
05:00:32<BlankEclair>(you can also pick a specific project within the web ui, but obv you don't mind going with AT choice)
05:00:53Snivy quits [Client Quit]
05:02:20<Point>well thank you so much, i have that deployed now <3
05:02:23<Point>BlankEclair++
05:02:23<eggdrop>[karma] 'BlankEclair' now has 15 karma!
05:02:28<BlankEclair>:3
05:03:50<steering>BlankEclair: LOL @ remove the closing tag
05:03:55<BlankEclair>IKR
05:04:01<BlankEclair>i don't know why it works
05:04:04<BlankEclair>but it does. flawlessly.
05:04:44TheEnbyperor quits [Ping timeout: 260 seconds]
05:06:10<steering>mediawiki--
05:06:10<eggdrop>[karma] 'mediawiki' now has -1 karma!
05:06:11<steering>BlankEclair++
05:06:13<eggdrop>[karma] 'BlankEclair' now has 16 karma!
05:06:14TheEnbyperor_ quits [Ping timeout: 276 seconds]
05:06:19<BlankEclair>y'know what
05:06:21<BlankEclair>CirrusSearch--
05:06:22<eggdrop>[karma] 'CirrusSearch' now has -1 karma!
05:09:19Snivy (Snivy) joins
05:10:35nicolas17 quits [Quit: Konversation terminated!]
05:20:21TheEnbyperor (TheEnbyperor) joins
05:20:27TheEnbyperor_ joins
05:24:09nicolas17 joins
05:28:04pabs quits [Ping timeout: 260 seconds]
05:30:28Guest58 quits [Client Quit]
05:41:26nicolas17_ joins
05:41:28nicolas17 quits [Read error: Connection reset by peer]
05:42:02DartRetaliator joins
05:47:41Guest58 joins
06:00:51pabs (pabs) joins
06:02:55nicolas17 joins
06:02:55nicolas17_ quits [Read error: Connection reset by peer]
06:11:38Point quits [Client Quit]
06:16:18nicolas17_ joins
06:17:04nicolas17 quits [Read error: Connection reset by peer]
06:30:15nicolas17 joins
06:34:34nicolas17_ quits [Ping timeout: 260 seconds]
06:36:19skyrocket quits [Ping timeout: 260 seconds]
06:47:19skyrocket joins
06:50:09yourfate1 quits [Quit: WeeChat 4.5.1]
06:50:22PredatorIWD25 quits [Read error: Connection reset by peer]
06:55:34PredatorIWD25 joins
06:58:17Guest58 quits [Client Quit]
08:44:37<@arkiver>nulldata: i'm definitely in for that!!
08:45:37<@arkiver>the 72 TB would be nice to have a separate project
08:47:18Dada joins
08:49:28Island quits [Read error: Connection reset by peer]
08:55:13lemuria_ is now known as lemuria
09:00:53agtsmith quits [Ping timeout: 276 seconds]
09:04:24simon816 quits [Quit: ZNC 1.9.1 - https://znc.in]
09:09:28nicolas17_ joins
09:10:51simon816 (simon816) joins
09:12:39nicolas17 quits [Ping timeout: 260 seconds]
09:29:54DartRetaliator_ joins
09:33:39DartRetaliator quits [Ping timeout: 260 seconds]
09:45:44BennyOtt_ joins
09:45:44BennyOtt quits [Ping timeout: 276 seconds]
09:46:50BennyOtt_ is now known as BennyOtt
09:46:57awauwa (awauwa) joins
09:51:10Guest58 joins
10:01:57Lunarian1 (LunarianBunny1147) joins
10:05:44LunarianBunny1147 quits [Ping timeout: 260 seconds]
10:11:34nicolas17_ quits [Ping timeout: 260 seconds]
10:11:44unlobito quits [Ping timeout: 276 seconds]
10:14:21nicolas17_ joins
10:19:31unlobito (unlobito) joins
10:31:06pixel (pixel) joins
10:33:38agtsmith joins
10:47:09<@arkiver>hexagonwin: yes, upload it to IA - others can download it then, but it's unlikely to go into the Wayback Machine
10:47:32<@arkiver>tzt: please add it to deathwatch! "ROCKET3 .NET"
10:49:30Wohlstand (Wohlstand) joins
10:54:52cuphead2527480 (Cuphead2527480) joins
11:00:45benjins3 joins
11:35:14<@arkiver>pabs: it may be nice to do a tripod project, yes
11:35:52<@arkiver>the wiki page at https://wiki.archiveteam.org/index.php/Tripod mentions 17500 sites that return a 200, do we have a list of those?
11:37:09<@arkiver>pabs: on tuxfamily, is there some page with more information on it? i see no mention of 'tux' on deathwatch for example
11:37:59<@arkiver>nicolas17: in my opinion, 624 GiB for the Apple Xcode data is totally worth it...
11:38:06<@arkiver>have we archived it already?
11:41:11<justauser|m>There was a job for encode.su in 2024 but it was aborted rather quickly. https://archive.fart.website/archivebot/viewer/job/202404221145045frly No mention of the reason in #archiveteam-bs logs; can anybody with the password to #archivebot check there? Unless it was something transient, restarting is probably a bad idea.
11:41:25<justauser|m>The main page about tuxfamily is https://pad.notkiska.pw/p/archivebot-tuxfamily
11:48:19PotatoProton01 joins
11:48:52<@arkiver>nulldata: do you have an idea how complete your list of drivers is? it's pretty nice stuff
11:49:20<@arkiver>you said you only looked for drivers - but what if you look for "everything", what would that give us?
12:04:47Onyx joins
12:09:31pixel leaves
12:20:28PotatoProton01 quits [Client Quit]
12:24:01pixel (pixel) joins
13:00:28unlobito quits [Remote host closed the connection]
13:01:24unlobito (unlobito) joins
13:03:00pixel leaves
13:03:05pixel (pixel) joins
13:04:35cuphead2527480 quits [Client Quit]
13:23:46sec^nd quits [Remote host closed the connection]
13:24:00sec^nd (second) joins
13:26:47egallager quits [Quit: This computer has gone to sleep]
13:31:04DartRetaliator_ quits [Ping timeout: 260 seconds]
13:33:48pixel leaves
13:33:53pixel (pixel) joins
13:45:39BennyOtt quits [Ping timeout: 260 seconds]
13:46:43BennyOtt (BennyOtt) joins
13:49:35grill (grill) joins
14:01:59grill quits [Ping timeout: 260 seconds]
14:03:39pixel leaves
14:03:43grill (grill) joins
14:09:17egallager joins
14:31:42PredatorIWD25 quits [Read error: Connection reset by peer]
14:34:39PredatorIWD25 joins
15:08:08BennyOtt quits [Ping timeout: 276 seconds]
15:08:11BennyOtt_ joins
15:09:10BennyOtt_ is now known as BennyOtt
15:12:41grill quits [Ping timeout: 276 seconds]
15:12:42BennyOtt quits [Client Quit]
15:13:43BennyOtt (BennyOtt) joins
15:29:47pixel (pixel) joins
15:33:44egallager quits [Read error: Connection reset by peer]
15:34:01egallager joins
15:37:02egallager quits [Read error: Connection reset by peer]
15:37:20egallager joins
15:53:03dabs joins
15:54:40nicolas17_ is now known as nicolas17
16:00:21BornOn420 quits [Remote host closed the connection]
16:00:54BornOn420 (BornOn420) joins
16:04:12cuphead2527480 (Cuphead2527480) joins
16:16:43<nulldata>I think it's all of them, but Windows Server Update Services (WSUS) is notoriously buggy and the database isn't super well documented.
16:16:43<nulldata>If I synced all products, in theory I'd have a database of pretty much every AV definition update, patches, service packs, and drivers for most Microsoft software products released since 2000 that you might use in a commercial setting. That's as long as WSUS doesn't croak and I don't run out of space to house the DB. Drivers for each version of
16:16:43<nulldata>Windows are in their own "products", since admins usually stay away from syncing drivers because it tends to make WSUS very unstable. Syncing the drivers took over a day and uses around 108GB for its MSSQL database.
16:17:43<nulldata>To note, I've never actually tried to sync really old products like say Office XP 2002. It's in the list, but I dunno if any data is still there for it or if the files are still live. I've got a snapshot before I synced drivers - I could roll back and give it a shot...
16:17:58<nulldata>arkiver ^
16:18:32<@arkiver>nulldata: it would be pretty nice to have a complete list of all of their binaries, next to drivers
16:18:41<@arkiver>though i see that may require significant resources
16:18:48<@arkiver>nulldata: ^
16:18:54<@arkiver>i'll set up a project for the 72 TB
16:19:04Onyx quits [Remote host closed the connection]
16:19:15<@arkiver>will start in the weekend and we'll get them as fast as possible
16:22:12<@arkiver>nulldata: am i correct in thinking that "the other stuff" next to the drivers might run to many 100s of TBs? (or likely 1+ PB)
16:22:27<@arkiver>will name the current project "Windows Update Drivers"
16:22:37<@arkiver>a more general one later could be "Windows Update"
16:22:43<@arkiver>if we decide to do that
16:22:58grill (grill) joins
16:24:27nicolas17_ joins
16:28:59nicolas17 quits [Ping timeout: 260 seconds]
16:29:26jinn6 quits [Quit: WeeChat 4.6.3]
16:31:20<@arkiver>nulldata: i see several lines like "2AD00540-6449-4DDF-A603-72935120AC6E,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL,NULL" is that correct?
16:31:27jinn6 joins
16:31:38<nicolas17_>arkiver: can I put the 624GiB URL list into archivebot or should I split it up?
16:31:46nicolas17_ is now known as nicolas17
16:32:35<@arkiver>nicolas17: where is your list?
16:32:46<@arkiver>i'm not an expert in ArchiveBot, so it's probably a question for JAA
16:33:41<nicolas17>nowhere but lemme upload :P
16:34:04<nicolas17>https://transfer.archivete.am/inline/x9qyW/updates.cdn-apple.com-xcode-simulators.txt
16:34:50<@arkiver>nulldata: distribution of years https://transfer.archivete.am/M7Wv1/WUDrivers_years.txt
16:34:51<eggdrop>inline (for browser viewing): https://transfer.archivete.am/inline/M7Wv1/WUDrivers_years.txt
16:35:02<nicolas17>I checked the cdx API, none of them are in WBM, there's only a few failed captures that returned an html error
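The kind of check described here can be done against the Wayback Machine's CDX API; a minimal sketch for a single URL (no rate limiting or retries, which a run over a large URL list would want):

    import requests

    def in_wayback(url: str) -> bool:
        """True if the CDX API reports at least one HTTP 200 capture of the URL."""
        resp = requests.get(
            "https://web.archive.org/cdx/search/cdx",
            params={"url": url, "filter": "statuscode:200", "limit": "1"},
            timeout=30,
        )
        return bool(resp.text.strip())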
16:35:08<@arkiver>looks like either the number of drivers really took off with Windows 10, or many earlier ones were already deleted
16:35:20<@arkiver>nicolas17: how large are the files?
16:35:49<@arkiver>pretty big i guess judging by number of URLs
16:35:52<nicolas17>10,737,418,240 bytes is the largest single file
16:36:05<@arkiver>JAA: pabs: can we use AB for this? ^
16:36:18<@arkiver>624 GB total, up to 10 GB each URL
16:36:33<nicolas17>I know AB works with 15GiB files, I just wasn't sure about the total size, though I guess it splits into multiple warcs
16:36:40<@arkiver>it does yes
16:36:58<@arkiver>i think we can put it in, it may just need a certain pipeline which i don't know much about
16:37:39<nulldata>Hmm there might be something wrong with my query - the files table only has 78373 rows - which is significantly less than what my query has. I'll look into it later
16:38:27<@arkiver>really good these are being collected. they're very important and often overlooked
16:38:33<@arkiver>i'll go get some sleep
16:38:55<nicolas17>your sleep schedule is more mysterious to me than JAA's
16:39:33<@arkiver>yeah it's not exactly stable
16:39:43<@arkiver>good day to you :P
16:39:52<nicolas17>gn :P
16:44:18<pokechu22>nicolas17: 624 GB total is probably fine
16:44:59grill quits [Ping timeout: 276 seconds]
16:45:04<pokechu22>the main danger is that archivebot pauses jobs on a pipeline when there's less than 5GB free space, but if files are larger than that then the pausing won't stop it from running out of disk space (it only prevents starting a new download)
16:46:15grill (grill) joins
16:48:47<pokechu22>based on http://archivebot.com/pipelines -p poke or -p dag is probably best
16:49:55<pokechu22>oh, with regards to total size, AB will start creating a new WARC when the existing one is over 5 GiB, though it does not split WARCs mid-file, so individual warcs may end up being bigger. But if the largest file is 10 GB, then that's 15GB max for a WARC (if it's at like 4.9 GB and then the 10GB file is downloaded) which should be fine
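A toy restatement of that rollover arithmetic: the 5 GiB threshold is only checked between files, never mid-file, so the worst case is the threshold plus the largest single download.

    ROLLOVER = 5 * 2**30  # AB starts a new WARC once the current one passes 5 GiB

    def worst_case_warc_bytes(largest_file_bytes: int) -> int:
        """A WARC sitting just under the threshold can still absorb one full-size file."""
        return ROLLOVER + largest_file_bytes

    # largest simulator file is ~10 GB -> ~16 GB worst case, close to the rough 15 GB estimate above
    print(worst_case_warc_bytes(10_737_418_240) / 10**9)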
17:01:53Wohlstand quits [Quit: Wohlstand]
17:02:08Wohlstand (Wohlstand) joins
17:04:42<pokechu22>nulldata: does your list include 32-bit drivers (and I guess arm drivers and any other platforms like that)?
17:33:09grill quits [Ping timeout: 260 seconds]
17:34:56grill (grill) joins
17:52:35grill quits [Ping timeout: 276 seconds]
17:54:08grill (grill) joins
17:57:13<justauser|m>Quoting from the "Valhalla" page:
17:57:15<justauser|m>> until the Internet Archive (or another entity) grows its coffers/storage enough that 80-100tb is "no big deal"
17:57:38<justauser|m>Apparently this project is now officially obsolete.
17:57:42<nicolas17>what is Valhalla?
17:57:59<justauser|m>https://wiki.archiveteam.org/index.php/Valhalla
18:01:41grill quits [Ping timeout: 276 seconds]
18:08:02<that_lurker>No I would say it's still a big deal, though not as big as it was before
18:10:06<h2ibot>HadeanEon edited Deaths in 2007 (+509, BOT - Updating page: {{saved}} (5),…): https://wiki.archiveteam.org/?diff=56418&oldid=55439
18:10:07<h2ibot>HadeanEon edited Deaths in 2007/list (+34, BOT - Updating list): https://wiki.archiveteam.org/?diff=56419&oldid=55330
18:40:50awauwa quits [Quit: awauwa]
19:11:15<h2ibot>HadeanEon edited Deaths in 2011 (+474, BOT - Updating page: {{saved}} (204),…): https://wiki.archiveteam.org/?diff=56420&oldid=55740
19:11:16<h2ibot>HadeanEon edited Deaths in 2011/list (+27, BOT - Updating list): https://wiki.archiveteam.org/?diff=56421&oldid=55741
19:14:07FiTheArchiver joins
19:14:23FiTheArchiver quits [Remote host closed the connection]
19:24:35cuphead2527480 quits [Quit: Connection closed for inactivity]
19:30:23<h2ibot>Pokechu22 created Sympa (+36, Redirected page to [[Mailing Lists#Software]]): https://wiki.archiveteam.org/?title=Sympa
19:30:24<h2ibot>Pokechu22 edited Mailing Lists (+32, /* Software */ https://lists.compasspoint.org/…): https://wiki.archiveteam.org/?diff=56423&oldid=56363
20:02:41Dada quits [Remote host closed the connection]
20:07:31Dada joins
20:18:14TastyWiener95 quits [Ping timeout: 260 seconds]
20:21:24TastyWiener95 (TastyWiener95) joins
20:50:35<h2ibot>HadeanEon edited Deaths in 2015 (+374, BOT - Updating page: {{saved}} (316),…): https://wiki.archiveteam.org/?diff=56424&oldid=56190
20:50:36<h2ibot>HadeanEon edited Deaths in 2015/list (+36, BOT - Updating list): https://wiki.archiveteam.org/?diff=56425&oldid=56191
21:18:05Teabag (Teabag) joins
21:22:27Webuser440238 joins
21:24:10<Webuser440238>hello, can someone please send the site with WARCs that are about to be sent to the web archive? It was here but I lost it
21:24:41<pokechu22>I assume you're thinking of https://archive.fart.website/archivebot/viewer/ ?
21:25:14<pokechu22>note that WARCs are still listed there even after they end up on web.archive.org (it's more of an index of https://archive.org/details/archivebot)
21:29:51<Webuser440238>yep, thank you. And thanks for the information too
21:30:19Island joins
21:31:44Island quits [Client Quit]
21:32:54Webuser440238 quits [Client Quit]
21:37:54etnguyen03 (etnguyen03) joins
21:52:39<Teabag>Hi, does AT save marketing sites? A certain reward scheme is moving their offers from website to app only "soon". Would this be worth archiving?
21:55:45<h2ibot>HadeanEon edited Deaths in 2017 (+426, BOT - Updating page: {{saved}} (373),…): https://wiki.archiveteam.org/?diff=56426&oldid=56192
21:55:46<h2ibot>HadeanEon edited Deaths in 2017/list (+31, BOT - Updating list): https://wiki.archiveteam.org/?diff=56427&oldid=55960
21:57:41<pokechu22>Teabag: might be worth saving, yeah
22:02:25<Teabag>Cool, the site is https://priority.o2.co.uk/ (rather limited without JS and links to the main company site, hope those aren't problems)
22:02:39<Teabag>Thank you! :)
22:09:31<pokechu22>Hmm, archivebot doesn't run JS, but it looks like that page uses <a href=...> for all of the links; it's just the images that are loaded via JS. So archivebot won't discover the images, but it will at least find info about all of the deals
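A toy illustration of why the static links survive a no-JS crawl while the images don't: a plain HTML parse surfaces <a href=...> targets, but anything a script injects never appears in the fetched markup. (This is not ArchiveBot's actual extractor.)

    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collect href targets from static <a> tags in fetched HTML."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    collector = LinkCollector()
    collector.feed('<a href="/offers/1">Deal</a><script>loadImages()</script>')
    print(collector.links)  # ['/offers/1'] - the script-loaded images are invisible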
22:09:46pixel leaves [Error from remote client]
22:25:15Dada quits [Remote host closed the connection]
22:26:27APOLLO03 quits [Quit: Leaving]
22:44:26nexusxe9 joins
22:44:47nexusxe2 joins
22:47:38nexusxe9 quits [Client Quit]
22:47:38nexusxe2 quits [Client Quit]
22:59:12Webuser021220 joins
22:59:17Webuser021220 quits [Client Quit]
23:19:02Wohlstand quits [Quit: Wohlstand]
23:35:39etnguyen03 quits [Client Quit]
23:48:44nexusxe quits [Quit: Leaving]