00:06:12driib quits [Client Quit]
00:12:59<nicolas17>why so many tasks on a single item?
00:14:44<@JAA>Uploading many mostly small files does that.
00:15:06<nicolas17>oh
00:15:33<nicolas17>there's a flag to prevent derivation so you can trigger it on the last file only, but now that I remember, I think the CLI tool doesn't use it :/
00:16:23<@JAA>I am not using the ia CLI, and I am disabling derives.
00:16:26<@JAA>This is archive.php tasks.
00:16:58<nicolas17>rip
00:17:50<nicolas17>think how much worse it would be with derives then! :D
00:18:07<@JAA>Nah, only the last upload would normally queue a derive.
00:19:19<nicolas17>well I've seen derives start before the next file finishes uploading
00:20:06<@JAA>Can happen, and I've reported one or two bugs about that in the CLI before.
00:20:25PredatorIWD quits [Read error: Connection reset by peer]
00:22:41<@JAA>With a simple normal single upload of all files at once, it shouldn't happen.
00:23:27PredatorIWD joins
00:27:33driib (driib) joins
00:30:39<@JAA>Also, even with bugs in the CLI, it shouldn't happen frequently. The derive tasks aren't queued directly, but the archive.php task gets set a flag to queue a derive task on completion. As long as there are more archive.php tasks pending, that doesn't happen.
00:30:59<@JAA>So you'd still end up with at most one derive task.
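The behavior described above (each upload carries a flag, and only the final upload should actually queue a derive) matches the documented IAS3 `x-archive-queue-derive` header. A minimal sketch of how a client might set that header per file; `build_headers` is a hypothetical helper, not part of the `ia` CLI or any real client:

```python
# Illustrative sketch: when uploading a batch of files to one item, ask the
# server to queue a derive only after the last file. The IAS3 header
# x-archive-queue-derive is real and documented; build_headers() is a
# hypothetical helper written for this example.

def build_headers(files):
    """Yield (filename, headers) pairs with derive disabled for all but the last file."""
    for i, name in enumerate(files):
        last = (i == len(files) - 1)
        yield name, {"x-archive-queue-derive": "1" if last else "0"}

pairs = list(build_headers(["scan001.zip", "scan002.zip", "scan003.zip"]))
# Only the final upload requests a derive; the earlier ones suppress it.
```

This mirrors why a single multi-file upload normally produces at most one derive task: the flag rides on the last archive.php task rather than on each file.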
00:37:50<nicolas17>okay! I think I'm done sorting these magazines
00:38:04<nicolas17>I need to upload a .zip with sortable image files, right?
00:38:17<nicolas17>would one zip per issue make sense?
00:41:23Arcorann (Arcorann) joins
00:42:38<pokechu22>https://help.archive.org/help/how-to-upload-scanned-images-to-make-a-book/ might help
00:45:35<nicolas17>"Name the .zip/.tar file correctly (e.g. identifier_images.zip)" is that the item identifier, or unrelated?
00:46:50<nicolas17>hm looks like last time I uploaded two zips to the same item, with filenames not matching the item name
00:47:04<nicolas17>but *also* not having the _images suffix, so I had to manually set the format to "Generic Raw Book Zip"
00:58:30<pokechu22>I'm pretty sure the name can be anything (and you can have multiple books in an item) but I haven't actually uploaded any books myself
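Per the help page linked above, the digitization pipeline keys off the `_images` suffix to recognize a Generic Raw Book Zip; without it the format must be set by hand, as nicolas17 found. A trivial sketch of the naming rule, assuming (as pokechu22 suggests) that the prefix need not match the item identifier:

```python
# Sketch of the naming convention from the help page: a raw book zip is
# named "<prefix>_images.zip". The prefix is conventionally the item
# identifier, but an item holding several books can use distinct prefixes;
# the "_images" suffix is what marks the file as a Generic Raw Book Zip.

def raw_book_zip_name(prefix: str) -> str:
    return f"{prefix}_images.zip"

raw_book_zip_name("fg-electronica-modular")  # "fg-electronica-modular_images.zip"
```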
01:35:49<nicolas17>oh this is cursed
01:36:15<nicolas17>I googled the ISBN
01:36:24<nicolas17>and I found there was an edition in Portuguese of this same thing
01:37:08<nicolas17>it has the same ISBN and same UPC barcode despite being in a different language ?!
02:25:55<nicolas17>fuuuuuck
02:26:02<nicolas17>I have 600dpi PNGs
02:26:21<nicolas17>I downscaled them by 50% and the resulting JPEGs *still say they are 600dpi* which is wrong :(
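The complaint above is about stale density metadata: downscaling an image by 50% linearly means the same physical page is covered by half as many pixels per inch, so a 600 dpi scan should be retagged 300 dpi. The arithmetic:

```python
# Effective print density after resizing: fewer pixels over the same
# physical page means proportionally lower DPI. A 600 dpi scan reduced to
# 50% linear size should be tagged 300 dpi, not 600.

def scaled_dpi(original_dpi: float, linear_scale: float) -> float:
    return original_dpi * linear_scale

scaled_dpi(600, 0.5)  # 300.0
```

Tools can rewrite the tag after the fact, e.g. Pillow's `img.save(path, dpi=(300, 300))` or exiftool's resolution tags, though whether the converter should have done this automatically is the real gripe here.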
03:14:52<nicolas17>ok, now either I screwed up or the web UI screwed up and it's in the wrong collection D:
03:26:41<nicolas17>JAA arkiver: https://archive.org/details/fg-electronica-modular ended up in Community Data, I meant to use Community Texts (though if you find a more suitable collection feel free to use that instead)
03:26:56<nicolas17>also, is the wrong collection the reason why it's not showing the book viewer? or is there something wrong with my zip?
03:34:32@JAA can't move items.
04:16:58<nicolas17>ok seems the actual derive task appeared when I stopped looking
04:17:21<nicolas17>might get a flip book when the OCR etc finishes
04:25:42<nicolas17>nope, derive/OCR happened but still no flip book
04:26:26<fireonlive>looks like it's running book_op now
04:26:37<fireonlive>i guess that's chiefly the virus scan tho
04:27:34<nicolas17>it's running book_op again because my impatient ass uploaded zip #3 without noticing it was already deriving 1 and 2
04:28:18<@JAA>Derives get aborted when there are new archive.php tasks (i.e. uploads).
04:29:32<nicolas17>this one *seemed* to run to completion, maybe there's only specific points where it checks "should we abort"
04:30:06fireonlive wonders if it's php 8.3.6
04:30:39<nicolas17>wish there was documentation on what's required to trigger the book UI
04:34:38<@JAA>Yeah, pretty sure it only checks at certain times.
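Checking for cancellation "only at certain times" is a common pattern for long-running jobs: the task polls an abort flag at checkpoints between stages, so an interrupt that arrives mid-stage only takes effect at the next checkpoint. An illustrative sketch of the pattern, not IA's actual derive code:

```python
import threading

# Illustrative pattern (not IA's actual implementation): a long-running job
# that polls a cancellation flag only between stages. An interrupt raised
# mid-stage is not noticed until the next checkpoint, which matches derives
# that appear to run on after a new upload arrives.

def run_stages(stages, cancel: threading.Event):
    completed = []
    for name, work in stages:
        if cancel.is_set():          # checkpoint: the only place we look
            return completed, "interrupted"
        work()
        completed.append(name)
    return completed, "finished"

cancel = threading.Event()
stages = [
    ("unzip", lambda: None),
    ("ocr", cancel.set),             # cancellation arrives mid-run here
    ("pdf", lambda: None),
]
done, status = run_stages(stages, cancel)
# cancel was set during "ocr" but is only detected before "pdf" starts:
# done == ["unzip", "ocr"], status == "interrupted"
```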
04:36:23<nicolas17>also, TIL the "notes" metadata field is shown nicely formatted in the details page
04:45:07<nicolas17>my bad, it did abort the previous derive, "TASK FINISHED WITH SUCCESS" misled me but there was an earlier "Interrupting derive due to newly pending catalog task"
05:29:59<nicolas17>theory: the book reader appears when _scandata.xml is derived
05:46:57Arcorann quits [Read error: Connection reset by peer]
05:52:52Arcorann (Arcorann) joins
05:55:10Arcorann quits [Remote host closed the connection]
05:59:56<@JAA>`-analyzeduration 900000000000` seems like a perfectly reasonable value for an ffmpeg command. lol
06:01:20<fireonlive>me setting a systemd unit timeout
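For scale: ffmpeg's `-analyzeduration` is specified in microseconds, so the value JAA quotes is an enormous probing budget. The conversion:

```python
# ffmpeg's -analyzeduration option is given in microseconds (the default
# is 5000000, i.e. 5 seconds), so the quoted value is a very large budget:
value_us = 900_000_000_000
seconds = value_us / 1_000_000   # 900000 seconds
hours = seconds / 3600           # 250 hours of stream-probing allowance
```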
06:01:28Arcorann (Arcorann) joins
06:03:08Arcorann quits [Remote host closed the connection]
06:09:22Arcorann (Arcorann) joins
06:35:21nulldata quits [Client Quit]
06:36:24nulldata (nulldata) joins
06:55:46nulldata quits [Client Quit]
06:56:50nulldata (nulldata) joins
07:01:00datechnoman quits [Client Quit]
07:37:25datechnoman (datechnoman) joins
08:37:33datechnoman quits [Client Quit]
08:38:28datechnoman (datechnoman) joins
09:39:03driib quits [Client Quit]
09:46:41driib (driib) joins
09:48:11f_ (funderscore) joins
10:43:40BearFortress quits [Quit: https://quassel-irc.org - Chat comfortably. Anywhere.]
11:44:08BearFortress joins
13:13:15<nicolas17>"FATAL ERROR: No parseable reply from Library of Congress Z server to query on isbn 8489617880"
13:15:11<nicolas17>hmmm that might have been a temporary error, I'll rederive
13:17:01imer quits [Client Quit]
13:23:47imer (imer) joins
13:29:54imer quits [Client Quit]
13:48:27imer (imer) joins
13:50:28datechnoman quits [Client Quit]
13:51:44datechnoman (datechnoman) joins
13:53:12imer quits [Client Quit]
13:57:40datechnoman quits [Ping timeout: 255 seconds]
13:59:28Arcorann quits [Ping timeout: 255 seconds]
14:02:08datechnoman (datechnoman) joins
14:06:37imer (imer) joins
14:11:37imer quits [Ping timeout: 255 seconds]
14:28:10imer (imer) joins
14:32:20imer quits [Client Quit]
17:44:17imer (imer) joins
17:55:00f_ quits [Ping timeout: 255 seconds]
18:26:28<nicolas17>[NOTE: FILE IS VERY BIG -- 491.4KB. SHOWING FIRST AND LAST 500 LINES...]
18:26:56<nicolas17>and nowhere near done lol
18:27:07<nicolas17>it would be so much faster if it created a separate task for each zip file
19:44:10nicolas17 quits [Ping timeout: 255 seconds]
20:28:33qwertyasdfuiopghjkl quits [Client Quit]
20:47:48nicolas17 joins
22:00:09Vokun quits [*.net *.split]
22:00:10hlgs|m quits [*.net *.split]
22:00:10Nulo|m quits [*.net *.split]
22:00:10yzqzss quits [*.net *.split]
22:00:10s-crypt|m|m quits [*.net *.split]
22:00:10qyxojzh|m quits [*.net *.split]
22:00:10Thibaultmol quits [*.net *.split]
22:00:10schwarzkatz|m quits [*.net *.split]
22:00:10thermospheric quits [*.net *.split]
22:00:10audrooku|m quits [*.net *.split]
22:00:10theblazehen|m quits [*.net *.split]
22:00:10tomodachi94 quits [*.net *.split]
22:00:10x9fff00 quits [*.net *.split]
22:00:10igneousx quits [*.net *.split]
22:00:10mattwright324|m quits [*.net *.split]
22:00:10DigitalDragon quits [*.net *.split]
22:00:10britmob|m quits [*.net *.split]
22:00:10Sanqui|m quits [*.net *.split]
22:01:48mattwright324|m joins
22:02:21igneousx (igneousx) joins
22:02:21theblazehen|m joins
22:02:21tomodachi94 (tomodachi94) joins
22:02:21audrooku|m joins
22:02:22Sanqui|m (Sanqui) joins
22:02:22x9fff00 (x9fff00) joins
22:02:22DigitalDragon (DigitalDragon) joins
22:02:22Thibaultmol joins
22:02:22britmob|m joins
22:02:22schwarzkatz|m joins
22:02:22Vokun (Vokun) joins
22:02:26thermospheric (Thermospheric) joins
22:02:26hlgs|m joins
22:02:26qyxojzh|m joins
22:02:26s-crypt|m|m joins
22:02:26Nulo|m joins
22:02:26yzqzss (yzqzss) joins
23:22:26pabs quits [Remote host closed the connection]
23:24:44pabs (pabs) joins