03:17:54pabs quits [Ping timeout: 260 seconds]
04:31:05davispuh quits [Ping timeout: 276 seconds]
05:41:05pabs (pabs) joins
05:51:54@hook54321 quits [Ping timeout: 615 seconds]
05:51:54mrfooooo quits [Read error: Connection reset by peer]
05:52:20mrfooooo joins
05:54:47hook54321 (hook54321) joins
05:54:47@ChanServ sets mode: +o hook54321
05:57:57qxtal quits [Read error: Connection reset by peer]
05:58:09qxtal (qxtal) joins
10:34:41MrMcNuggets (MrMcNuggets) joins
11:58:49tzt quits [Ping timeout: 260 seconds]
12:02:16<@arkiver>DigitalDragon: DigitalDragons: came up with a solution for preventing re-archiving of images on second rounds: we'll queue the image back with a hash; if that hash matches the next time we archive, we fail the item and try again next round
12:02:18<@arkiver>the web pages will simply have a date attached to them.
12:02:29<@arkiver>also do i use DigitalDragon or DigitalDragons ?
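A minimal sketch of the hash-check idea arkiver describes above, assuming a Python worker. The `fetch`, `archive`, and `requeue` callables and the item layout are hypothetical stand-ins for the real pipeline and tracker, not their actual API.

```python
import hashlib
from typing import Callable

def process_image(item: dict, fetch: Callable[[str], bytes],
                  archive: Callable[[str, bytes], None],
                  requeue: Callable[[dict], None]) -> None:
    """Archive an image only if its content changed since the last round.

    `item` holds the URL plus the hash recorded when the image was last
    archived (absent on the first round).
    """
    body = fetch(item["url"])                    # download this round's bytes
    digest = hashlib.sha256(body).hexdigest()    # content hash for comparison

    if item.get("last_hash") == digest:
        # Unchanged since the previous round: fail the item so the
        # tracker retries it next round instead of re-archiving it.
        raise RuntimeError(f"unchanged content for {item['url']}, retry next round")

    archive(item["url"], body)                   # write the new copy
    # Queue the item back with the fresh hash for next round's comparison.
    requeue({"url": item["url"], "last_hash": digest})
```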
12:20:56tzt (tzt) joins
12:53:17<pabs>https://pzwiki.net/wiki/ gives 403s to wikibot
15:04:03MrMcNuggets quits [Quit: WeeChat 4.3.2]
15:08:28nulldata-alt quits [Quit: Ping timeout (120 seconds)]
15:09:49nulldata-alt (nulldata) joins
15:57:49davispuh joins
16:08:24davispuh quits [Client Quit]
16:08:40davispuh joins
16:15:35<DigitalDragons>arkiver: I get pings for both so either works
16:22:27<DigitalDragons>those solutions sound good to me, though for the pages the revision ID might be easier to extract
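A hedged sketch of the revision-ID alternative DigitalDragons mentions, assuming the pages come from MediaWiki sites (as the wikibot context suggests): the standard MediaWiki query API exposes the latest revision ID, so the dedup key for a page can be its `revid` instead of a date. The endpoint URL below is illustrative.

```python
import requests

def latest_revision_id(api_url: str, title: str) -> int:
    """Return the current revision ID of a MediaWiki page.

    `api_url` is the wiki's api.php endpoint, e.g.
    "https://example.org/w/api.php" (illustrative).
    """
    resp = requests.get(api_url, params={
        "action": "query",
        "prop": "revisions",
        "rvprop": "ids",          # only revision IDs are needed
        "titles": title,
        "format": "json",
        "formatversion": "2",     # flat, list-based response layout
    }, timeout=30)
    resp.raise_for_status()
    pages = resp.json()["query"]["pages"]
    return pages[0]["revisions"][0]["revid"]
```

If the fetched `revid` matches the one queued with the item, the page is unchanged and the item can be failed and retried next round, mirroring the image hash check sketched above.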
20:50:13ThreeHM quits [Quit: WeeChat 4.6.3]
20:54:05ThreeHM (ThreeHeadedMonkey) joins
23:37:42nepeat (nepeat) joins
23:45:04Pedrosso quits [Quit: Leaving]
23:45:04ScenarioPlanet quits [Quit: meow meowy meow]
23:45:04TheTechRobo quits [Quit: Leave message goes here]
23:51:13Pedrosso joins
23:51:25ScenarioPlanet (ScenarioPlanet) joins
23:51:59nepeat quits [Ping timeout: 276 seconds]
23:52:19nepeat (nepeat) joins
23:53:50TheTechRobo (TheTechRobo) joins