01:43:28Rno joins
01:50:53<Rno>For Fandom wikis that have recently closed, I read that articles will still be available to download for up to 60 days. Can I still use the dumpgenerator script and if so, how?
01:52:37<pokechu22>I suspect that means "for download if you're an admin on the wiki" - if it's no longer publicly accessible it probably can't be saved :/
01:57:00<Rno>Aw crap. There are 2 links to wikia_xml_dumps from AWS. I used the script beforehand to download every article revision but saw that my drive is too small for the images.
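[A sketch of the invocation being discussed: the classic wikiteam dumpgenerator takes an api.php URL plus --xml (full revision history) and --images flags. The flag names below follow the classic wikiteam README; newer forks such as wikiteam3 use a different entry point, so check your copy's --help before running.]

```python
# Builds the argv for a classic wikiteam dumpgenerator run.
# The script path "dumpgenerator.py" and flag names are assumptions
# based on the classic wikiteam tool; verify against your fork.

def dumpgenerator_cmd(api_url, xml=True, images=False, resume=False):
    """Return the command line for a dumpgenerator run."""
    cmd = ["python", "dumpgenerator.py", f"--api={api_url}"]
    if xml:
        cmd.append("--xml")      # full page history as MediaWiki export XML
    if images:
        cmd.append("--images")   # also fetch every file in the File: namespace
    if resume:
        cmd.append("--resume")   # continue an interrupted dump in the same folder
    return cmd

# Example (hypothetical wiki):
# subprocess.run(dumpgenerator_cmd("https://example.fandom.com/api.php", images=True))
```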
02:10:02<Rno>"The database dump also does not contain Discussions/message walls/comments" I guess the script grabbed those at least.
02:43:08<pokechu22>If it's still up, I can run it in #wikibot
02:49:09Rno quits [Client Quit]
02:51:07Rno joins
02:51:11<Rno>It closed a few hours afterwards :(
02:51:24<pokechu22>ah :/
03:04:22<Rno>if I were to recover the images by reconstructing the urls using file names in the database dump, would it be possible to merge them back into the .xml? https://community.fandom.com/f/p/4400000000003671509
03:08:44<pokechu22>The dumpgenerator doesn't put them in the XML but instead in a separate folder IIRC. The XML should still contain information about the file namespace, just not files themselves (and I'm not sure if the info in the file namespace is enough to reconstruct image URLs)
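[The URL reconstruction Rno proposes is plausible in principle: MediaWiki stores uploads under images/<x>/<xy>/, where x and xy are the first one and two hex digits of the MD5 of the normalized filename. A minimal sketch, assuming Fandom still serves files from static.wikia.nocookie.net (verify against a known-good image URL first):]

```python
import hashlib
from urllib.parse import quote

def fandom_image_url(wiki, title):
    """Reconstruct a Fandom image URL from a File: page title.

    MediaWiki normalizes the name (spaces -> underscores, first letter
    uppercased) and hashes it with MD5 to pick the two directory levels.
    The static.wikia.nocookie.net host is an assumption about Fandom's
    current image CDN, not something guaranteed by the dump itself.
    """
    name = title.removeprefix("File:").replace(" ", "_")
    name = name[:1].upper() + name[1:]
    digest = hashlib.md5(name.encode("utf-8")).hexdigest()
    return (f"https://static.wikia.nocookie.net/{wiki}/images/"
            f"{digest[0]}/{digest[:2]}/{quote(name)}")
```

Merging the files back into the XML isn't how dumpgenerator stores them anyway; keeping them as a sidecar images/ folder next to the history XML matches the tool's own layout.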
03:11:44<Rno>Weird. I grabbed a few hundred images before CTRL+C'ing. The metadata for them are listed in the .xml and titles.txt but I don't see any additional folder.
[sic: the metadata is listed in the .xml and titles.txt, but no images/ folder was created before the interrupt]
03:12:44<pokechu22>Hmm; it's been a while since I've done a dump directly (I mainly use #wikibot to do it); there, the images get uploaded as a separate compressed file
03:19:52<Rno>yeah just config.txt the history.xml and titles.txt :(
03:24:18Rno quits [Client Quit]
03:48:06<pabs>this dokuwiki blocks dokuWikiDumper: https://www.fsl56.org/debut
06:00:51magmaus3 quits [Remote host closed the connection]
06:02:22magmaus3 (magmaus3) joins
06:10:13ArchivalEfforts quits [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
06:10:23ArchivalEfforts joins
06:20:28pabs quits [Ping timeout: 260 seconds]
06:22:13pabs (pabs) joins
09:15:29nulldata quits [Quit: So long and thanks for all the fish!]
09:16:43nulldata (nulldata) joins
09:55:05pabs wonders if any Wiki.js archiver exists...
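[No dedicated Wiki.js dumper seems to exist, but Wiki.js exposes a GraphQL API at /graphql that a scraper could build on. A sketch of constructing such a request; the pages.list fields and the Bearer-token auth are assumptions based on Wiki.js's API and should be verified against the target instance's schema:]

```python
import json
import urllib.request

# Query listing all pages; field names (id, path, title) follow
# Wiki.js's GraphQL schema but should be confirmed per instance.
LIST_QUERY = "{ pages { list(orderBy: TITLE) { id path title } } }"

def graphql_request(base_url, query, token=None):
    """Build an urllib Request for a Wiki.js GraphQL query (not yet sent)."""
    headers = {"Content-Type": "application/json"}
    if token:  # most instances require an API token for the pages query
        headers["Authorization"] = f"Bearer {token}"
    data = json.dumps({"query": query}).encode("utf-8")
    return urllib.request.Request(base_url + "/graphql", data=data, headers=headers)

# Example (hypothetical instance):
# req = graphql_request("https://wiki.example.org", LIST_QUERY, token="...")
# with urllib.request.urlopen(req) as resp:
#     pages = json.load(resp)["data"]["pages"]["list"]
```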
13:14:55nulldata quits [Read error: Connection reset by peer]
13:17:49nulldata (nulldata) joins
21:49:10andrew quits [Quit: ]
21:53:38andrew (andrew) joins