00:09:38<pokechu22>hmm, ok, xmlrevisions is just too slow. It was able to process NS 4000 fine, but NS 1 (talk) failed - https://ru.openlist.wiki/api.php?continue=&format=json&list=allrevisions&arvprop=ids%7Ctimestamp%7Cuser%7Cuserid%7Csize%7Csha1%7Ccontentmodel%7Ccomment%7Ccontent&meta=userinfo&arvlimit=50&action=query&arvnamespace=1&uiprop=blockinfo%7Chasmsg times out
00:12:03<pokechu22>hmm, but changing arvnamespace to 0, which is where the ~3 million pages come from, seems to load fairly quickly. 4000 (the first namespace) is slow but does load. That's weird.
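A minimal sketch of reproducing that per-namespace timing from a script; the endpoint and query parameters are copied from the URLs above, while the probe helper itself is just an illustration:

    import time
    import requests

    API = "https://ru.openlist.wiki/api.php"

    def probe(namespace, timeout=60):
        # Same allrevisions query as in the log, varying only arvnamespace.
        params = {
            "action": "query",
            "format": "json",
            "list": "allrevisions",
            "arvnamespace": namespace,
            "arvlimit": 50,
            "arvprop": "ids|timestamp|user|userid|size|sha1|contentmodel|comment|content",
        }
        start = time.monotonic()
        try:
            requests.get(API, params=params, timeout=timeout).raise_for_status()
        except requests.exceptions.Timeout:
            return None
        return time.monotonic() - start

    for ns in (0, 1, 6, 4000):
        t = probe(ns)
        print(ns, "timed out" if t is None else f"{t:.1f}s")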
00:16:05<pokechu22>oh, wait, the API returns the most recently changed pages first, and I guess NS 1 is rarely used, but NS 6 (file) and NS 4000 (presumably Project?) are updated often enough (I see stuff from 2023) that they load quickly. That means that as I get further back into the history, NS 0 would probably slow down too, and wikiteam tools have to restart the namespace from scratch if exporting
00:16:07<pokechu22>fails... yikes
00:18:26<pokechu22>which would presumably mean I'd need to use Special:Export instead, and thus perform ~3 million export calls, rather than dividing that number by 50 (assuming most pages are only edited once, which is probably the case)
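For reference, that per-page fallback would look roughly like this; the pages/history/action=submit field names are the standard Special:Export form fields, not something confirmed against this particular wiki:

    import requests

    INDEX = "https://ru.openlist.wiki/index.php"

    def export_full_history(title):
        # One POST per page; returns <mediawiki> XML with every revision.
        # Field names follow the usual Special:Export form -- verify them
        # against this wiki's MediaWiki version before relying on this.
        resp = requests.post(
            INDEX,
            params={"title": "Special:Export"},
            data={"pages": title, "history": "1", "action": "submit"},
            timeout=120,
        )
        resp.raise_for_status()
        return resp.text

    xml = export_full_history("Кляйн_Эрна_Филипповна_(1925)")

Multiplied by ~3 million titles, which is exactly the problem.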
00:22:50<pokechu22>hmm, no, making up an arvcontinue for an old revision using https://ru.openlist.wiki/index.php?title=Кляйн_Эрна_Филипповна_(1925)&action=history&limit=1 ->
00:22:52<pokechu22>https://ru.openlist.wiki/api.php?continue=&arvcontinue=20100627142516|2052656&format=json&list=allrevisions&arvprop=ids|timestamp|user|userid|size|sha1|contentmodel|comment|content&meta=userinfo&arvlimit=50&action=query&arvnamespace=0&uiprop=blockinfo|hasmsg still loads fast
01:50:42<pokechu22>OK, dumping everything but NS0 and NS6 without --xmlrevisions and then doing NS0 and NS6 with --xmlrevisions seems to be working fine
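In dumpgenerator terms that split is roughly the following; the --exnamespaces/--namespaces flag spellings are from memory of wikiteam's dumpgenerator.py and worth double-checking against your checkout:

    import subprocess

    API = "https://ru.openlist.wiki/api.php"

    # Phase 1: everything except NS 0 and NS 6, via Special:Export.
    # Phase 2: NS 0 and NS 6 via the API, where --xmlrevisions holds up.
    for extra in (["--exnamespaces=0,6"],
                  ["--xmlrevisions", "--namespaces=0,6"]):
        subprocess.run(
            ["python", "dumpgenerator.py", f"--api={API}", "--xml", *extra],
            check=True,
        )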
02:25:31systwi (systwi) joins
02:49:39tzt quits [Ping timeout: 265 seconds]
04:01:38tzt (tzt) joins
05:13:59Matthww1 quits [Quit: Ping timeout (120 seconds)]
05:14:56Matthww1 joins
06:16:39<pokechu22>I'm about 10% of the way through it at this point, and the XML file is already 7GB - at this rate it'll end up being 70GB, which could be a problem since I only have ~90GB free. I did enable NTFS compression for it (which brings it down to ~3.5GB instead) but still, it's going to be big... and that's not counting the ~34GB of images the site has too
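Spelling out that estimate: 7GB at ~10% done extrapolates to ~70GB of raw XML, or ~35GB on disk at the observed ~2:1 NTFS compression ratio; add the ~34GB of images and the total comes to roughly 69GB against ~90GB free. It fits, but without much margin.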
07:05:28hitgrr8 joins
08:10:55eroc1990 quits [Client Quit]
08:18:22eroc1990 (eroc1990) joins
08:18:38hitgrr8 quits [Client Quit]
09:24:57hitgrr8 joins
10:23:36hitgrr8 quits [Client Quit]
10:30:26hitgrr8 joins
10:37:22hitgrr8 quits [Client Quit]
12:23:08HackMii quits [Remote host closed the connection]
12:23:47HackMii (hacktheplanet) joins
12:35:47HackMii quits [Ping timeout: 276 seconds]
12:36:18HackMii (hacktheplanet) joins
13:12:47HackMii quits [Remote host closed the connection]
13:13:06HackMii (hacktheplanet) joins
19:12:57hitgrr8 joins
19:56:29HackMii quits [Ping timeout: 276 seconds]
19:57:16HackMii (hacktheplanet) joins
22:24:09hitgrr8 quits [Client Quit]