00:16:39 | | driib quits [Quit: The Lounge - https://thelounge.chat] |
00:34:14 | | driib (driib) joins |
01:21:57 | <nulldata> | Law360 and LexisNexis ended their Pro Say podcast on December 21st of 2023. https://podcasts.apple.com/us/podcast/law360s-pro-say-news-analysis-on-law-and-the-legal-industry/id1240435608 https://twitter.com/AlexxLawson/status/1737989879051468954 |
01:21:58 | <eggdrop> | nitter: https://nitter.net/AlexxLawson/status/1737989879051468954 |
01:32:31 | <nulldata> | PonyChan supposedly dies today - might be a good time to grab anything posted after the last grab in December? |
01:33:19 | | qwertyasdfuiopghjkl quits [Remote host closed the connection] |
01:34:13 | <flashfire42> | https://youtu.be/gWL90wryyOw |
01:34:24 | <flashfire42> | I am joking btw but too good a chance not to post this in response |
01:41:56 | <nicolas17> | JAA: https://opensource.samsung.com/uploadSearch?searchValue=- is this archived or being worked on or in anyone's radar? |
01:43:50 | | nic9070 quits [Ping timeout: 240 seconds] |
01:44:31 | | nic9070 (nic) joins |
01:48:12 | <@JAA> | nicolas17: Looks like it's been mentioned a couple years ago, but that's about it. |
01:49:52 | <fireonlive> | looks like - doesn't return everything
01:51:30 | <fireonlive> | using the release centre menu and the headings should do it though? |
01:52:11 | <fireonlive> | https://opensource.samsung.com/uploadList?menuItem=tv_n_video https://opensource.samsung.com/uploadList?menuItem=mobile https://opensource.samsung.com/uploadList?menuItem=home_appliances https://opensource.samsung.com/uploadList?menuItem=pc https://opensource.samsung.com/uploadList?menuItem=more |
01:52:12 | <fireonlive> | ..maybe |
01:52:38 | <fireonlive> | i love how every button is javascript |
01:52:55 | <fireonlive> | <a href="javascript:showSrcDownPop('3905');"> < just how god intended |
01:53:27 | <@JAA> | . might be a better search term. |
01:54:55 | <fireonlive> | seems to include more from different categories, but it's missing stuff from the menuItem combination
01:57:28 | | fireonlive gives samsung the 'what the hell have you built' sticker |
01:59:59 | | qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins |
02:00:02 | <nicolas17> | yes |
02:00:15 | <nicolas17> | I tried all IDs instead and found more than with any search term |
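
A minimal sketch (Python) of the brute-force ID enumeration nicolas17 describes. The downSrcMPop endpoint appears later in this log; the ID range and the "page mentions an attachment" check are assumptions about the site's behaviour.

    # Walk uploadId values directly instead of trusting the search.
    import requests

    for upload_id in range(1, 20000):  # upper bound is a guess
        r = requests.get(
            "https://opensource.samsung.com/downSrcMPop",
            params={"uploadId": upload_id},
            timeout=30,
        )
        # Assumption: pages for unused IDs come back without attachment rows.
        if r.ok and "attach" in r.text.lower():
            print("upload", upload_id, "exists")
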
02:13:20 | | nic9070 quits [Ping timeout: 240 seconds] |
02:13:47 | | nic9070 (nic) joins |
02:17:45 | <pabs> | nulldata: I put https://www.invisionapp.com/ in AB the other day but didn't do the subdomains cus there are a ton of them, most with no content |
02:17:55 | <pabs> | didn't have time to sort thru them really |
02:30:19 | <nicolas17> | downloading everything from opensource.samsung.com doesn't seem difficult
02:30:31 | <nicolas17> | the problem is it's all POST so it won't be suitable for WBM |
02:33:20 | <fireonlive> | :| why is it POSTs
02:33:40 | <nicolas17> | it's a POST with a one-time (or expiring) "token" |
02:34:07 | <fireonlive> | ahh :c |
02:50:31 | <Pedrosso> | What can be done about such requests then? |
02:51:16 | <TheTechRobo> | Pedrosso: They can be saved, but the Wayback Machine can't play them back |
02:51:40 | <TheTechRobo> | I don't think there's much that can be done other than adding special handling to the WBM
02:51:55 | | qw3rty quits [Ping timeout: 272 seconds] |
02:52:14 | <Pedrosso> | I suppose then what's important is that they can be saved |
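
As noted, the POST exchanges can still be captured to WARC even if the Wayback Machine won't replay them. A minimal sketch using warcio's capture_http; the endpoint and form-field names here are hypothetical stand-ins (only the uploadId format comes from later in this log).

    # Record a POST request/response pair into a WARC file.
    from warcio.capture_http import capture_http
    import requests  # warcio needs requests imported after capture_http

    with capture_http("samsung-post.warc.gz"):
        requests.post(
            "https://opensource.samsung.com/downSrcCode",  # hypothetical endpoint
            data={"uploadId": "11931", "token": "one-time-token"},  # hypothetical fields
        )
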
02:56:32 | | qw3rty joins |
02:59:19 | | qw3rty_ joins |
03:01:25 | | qw3rty quits [Ping timeout: 272 seconds] |
03:19:36 | | Hackerpcs quits [Quit: Hackerpcs] |
03:21:32 | | Hackerpcs (Hackerpcs) joins |
03:35:33 | <nicolas17> | Pedrosso: I might download everything and upload it into a plain old item in archive.org |
03:37:24 | <Pedrosso> | not as easy to use as WBM but effective |
03:39:25 | | pabs quits [Ping timeout: 272 seconds] |
03:39:51 | <project10> | JAA: re zstdwarccat, am I grokking this correctly: it can be used with just a .warc.zst (single command line arg), and it will extract the dict from the zst, write it to a tempfile, and call zstd -D with that extracted dict? |
03:40:09 | | pabs (pabs) joins |
04:12:12 | | parfait quits [Client Quit] |
04:26:03 | <nicolas17> | JAA: https://cdn.discordapp.com/attachments/286612533757083648/1193771879652147231/image.png?ex=65adedb3&is=659b78b3&hm=f1b44aecc421370a1605e42bc7a45b12ea70f142e08c3df8d6dc9a8845aa6006& |
04:26:42 | <nicolas17> | they have different file ID, but same filename, same file size, and assuming I didn't accidentally start downloading the same one twice, same content in what I downloaded so far |
04:35:43 | <fireonlive> | samsung pls |
04:36:02 | <nicolas17> | and they download at 200KiB/s |
04:47:09 | <Ryz> | Whaaaa? What the hell? APKPure suddenly...shut...down...? https://www.claytoncountyregister.com/news2/what-happened-to-apkpure/ |
04:47:19 | <Ryz> | http://apkpure.com/ |
04:47:52 | <Ryz> | ...I use that and APKCombo for browsing/skimming through new mobile games S:
04:48:10 | <Ryz> | Would like more archivey of .APK stuff :C |
04:48:21 | <nicolas17> | that article looks like pure speculation, information-free, and potentially written by an AI lol |
04:48:48 | <nicolas17> | it does not even remotely "shed light on the situation" |
04:49:06 | <Ryz> | Yeah, that unfortunately looks like an AI article, bleh |
04:49:25 | <Ryz> | ...But that clues me in that APKPure is down in the first place |
04:49:32 | <Ryz> | Again, I use both of the websites
04:49:53 | <Ryz> | ...Might need another website as a backup~ |
04:51:47 | <Ryz> | https://old.reddit.com/r/ApksApps/comments/18zoqrs/did_apkpure_just_get_taken_down/ and https://old.reddit.com/r/androidapps/comments/18z2l5o/apkpurecom_down_today/ |
04:52:30 | <Ryz> | They might have been acquired? Unsure if true: https://old.reddit.com/r/ApksApps/comments/18zoqrs/did_apkpure_just_get_taken_down/kglezti/ |
05:01:55 | | DJ joins |
05:14:02 | | DJ quits [Remote host closed the connection] |
05:14:02 | | qwertyasdfuiopghjkl quits [Remote host closed the connection] |
05:14:49 | <nulldata> | Ryz - looks like https://apkpure.net/ is up |
05:16:25 | <Ryz> | Huh, unsure if a mirror or not, since the APKPure one is the .com one - https://web.archive.org/web/20240000000000*/http://apkpure.com |
05:17:34 | | qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins |
05:18:45 | <nulldata> | Not sure - they are both registered with Mark Monitor and behind Cloudflare. com has a private registration and net lists ELECYBER INTERNATIONAL PTE. LTD. |
05:20:29 | <nulldata> | Almost a year apart on the registration |
05:20:39 | <nicolas17> | lol it's going to take 2 hours to download the open source notices from samsung |
05:20:45 | <nicolas17> | I guess it will take days to download the actual zips then |
05:22:39 | <nulldata> | The YouTube channel that was linked on the .com one mentions .net in the bio links, yet the YouTube links to a different Facebook account than what was listed on .com
05:23:36 | <nulldata> | https://lounge.nulldata.foo/uploads/139d181123007003/image.png |
05:23:41 | <nulldata> | https://www.facebook.com/APKPureOfficial/ |
05:24:05 | <nulldata> | https://www.youtube.com/@apkpure6707/videos |
05:26:31 | <nicolas17> | nulldata: is that screenshot from just now? |
05:26:40 | <nulldata> | Yeah |
05:26:56 | <nicolas17> | okay, so I know what "4h ago" is relative to :P |
05:37:11 | <Ryz> | Hmm, yeah, so it looks like there may be a .com and .net version that's legit, at least from checking https://t.me/s/apkpurechannel |
05:55:21 | <nicolas17> | trying to download from the samsung site to my VPS, and the speed is *worse*; I wonder if this is intentional malicious compliance
05:55:58 | <nicolas17> | "we provide source code for the open source components we use, complying with the license, they are 800MB zip files downloadable at 100KB/s" |
06:05:43 | | nulldata quits [Ping timeout: 272 seconds] |
06:15:12 | | aninternettroll_ (aninternettroll) joins |
06:15:20 | | aninternettroll quits [Ping timeout: 240 seconds] |
06:15:20 | | aninternettroll_ is now known as aninternettroll |
06:18:45 | | nulldata (nulldata) joins |
06:23:40 | | Island quits [Read error: Connection reset by peer] |
06:55:13 | | DogsRNice quits [Read error: Connection reset by peer] |
07:06:34 | <nicolas17> | I assume uploading 5406 files adding up to 800GiB into a single IA item is not ideal |
07:11:11 | <nicolas17> | and I just found part of that data is 20GB of licenses and copyright notices which compress to <0.3% so I would feel really bad uploading it uncompressed >.> |
07:21:10 | <nicolas17> | so far I downloaded 15GB and found 537MB of duplicated files |
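
For reference, one way to do that duplicate accounting: hash every downloaded file and total the redundant bytes (the directory name is an assumption).

    # Group files by SHA-256 and report how many bytes are duplicated.
    import hashlib
    import os
    from collections import defaultdict

    by_hash = defaultdict(list)
    for root, _, files in os.walk("samsung-downloads"):
        for name in files:
            path = os.path.join(root, name)
            h = hashlib.sha256()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            by_hash[h.hexdigest()].append(path)

    wasted = sum(
        os.path.getsize(paths[0]) * (len(paths) - 1)
        for paths in by_hash.values()
        if len(paths) > 1
    )
    print(f"{wasted / 1e6:.0f} MB duplicated")
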
08:00:21 | | nfriedly quits [Remote host closed the connection] |
08:08:28 | | BlueMaxima quits [Read error: Connection reset by peer] |
08:14:23 | | lennier2 joins |
08:16:50 | | lennier2_ quits [Ping timeout: 240 seconds] |
08:54:27 | | ymgve_ joins |
08:57:20 | | ymgve quits [Ping timeout: 240 seconds] |
08:59:50 | | ymgve joins |
09:01:47 | | ymgve_ quits [Ping timeout: 272 seconds] |
09:03:20 | | monoxane quits [Ping timeout: 240 seconds] |
09:06:04 | | monoxane (monoxane) joins |
09:12:02 | | ymgve_ joins |
09:13:50 | | ymgve quits [Ping timeout: 240 seconds] |
09:34:06 | | D joins |
09:34:33 | <D> | hi |
09:34:44 | <D> | im trying to view images under a blur |
09:34:50 | <D> | to archive them |
09:34:56 | <D> | i was told to use inspect element |
09:35:01 | <D> | but not sure how |
09:38:31 | | DJ joins |
09:46:12 | <c3manu> | D: if you're not sure how because you've never done that before, the web should be full of step-by-step tutorials. usually you can right-click the blurred image and should see an "inspect element" menu entry. if you click that, the window will be split and a hierarchy of the html elements will be shown to you. you can hover over them in the text so they will be highlighted on the page. |
09:46:43 | <c3manu> | if you have an idea which element it could be, there should be some way to "disable" it in the developer tools so it will not be rendered on the page
09:46:45 | <D> | what element do i remove to remove the blur |
09:47:30 | <c3manu> | sometimes the name can give you hints, but it's mostly trial and error. this is hard to explain via text to someone who has never done that before ^^
09:50:15 | <c3manu> | maybe you can find a youtube video that explains it? it should be easier to understand when you can see what someone is talking about as they explain it
09:51:29 | | DJ leaves |
10:00:03 | | Bleo18260 quits [Client Quit] |
10:01:30 | | Bleo18260 joins |
10:16:56 | | D quits [Remote host closed the connection] |
10:41:51 | | Chris5010 quits [Ping timeout: 272 seconds] |
10:59:09 | | MollyZen joins |
11:03:52 | | nulldata4 (nulldata) joins |
11:06:33 | | nulldata quits [Ping timeout: 272 seconds] |
11:06:33 | | nulldata4 is now known as nulldata |
11:11:05 | | ThreeHM_ is now known as ThreeHM |
11:52:35 | | nfriedly joins |
12:39:07 | | katia_ is now known as katia |
12:59:55 | | razul quits [Ping timeout: 272 seconds] |
13:07:23 | | razul joins |
13:41:58 | <@JAA> | project10: Correct |
13:42:56 | | jacksonchen666 quits [Quit: WeeChat 4.1.2] |
13:48:05 | <@JAA> | nicolas17: Why am I not surprised? |
14:11:04 | | MollyZen quits [Client Quit] |
14:11:04 | | qwertyasdfuiopghjkl quits [Client Quit] |
14:42:20 | | Ruthalas59 (Ruthalas) joins |
14:42:25 | | Ruthalas597 quits [Read error: Connection reset by peer] |
15:07:08 | <project10> | JAA: neat, I wasn't aware that the .warc.zst had the custom dictionary embedded, thought I needed to acquire the associated zstdict through other means |
15:12:11 | | CraftByte quits [Quit: Hasta la vista] |
15:12:39 | | CraftByte (DragonSec|CraftByte) joins |
15:12:44 | <@JAA> | project10: Yeah, it's in a skippable frame which isn't understood by regular tooling. |
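
A rough sketch of what zstdwarccat automates, following JAA's description: the dictionary sits in a leading zstd skippable frame (magic 0x184D2A50-0x184D2A5F, then a 4-byte little-endian length), which conforming decoders skip over; whether the embedded dictionary is itself zstd-compressed is an implementation detail, so this sketch handles both cases.

    # Extract the embedded dictionary from a .warc.zst and decompress with it.
    import io
    import struct
    import sys

    import zstandard as zstd

    def embedded_dict(path):
        with open(path, "rb") as f:
            magic, length = struct.unpack("<II", f.read(8))
            if not 0x184D2A50 <= magic <= 0x184D2A5F:
                raise ValueError("no leading skippable frame")
            data = f.read(length)
        if data[:4] == b"\x28\xb5\x2f\xfd":  # dict itself stored zstd-compressed
            data = zstd.ZstdDecompressor().stream_reader(io.BytesIO(data)).read()
        return data

    def zstdwarccat(path):
        dctx = zstd.ZstdDecompressor(
            dict_data=zstd.ZstdCompressionDict(embedded_dict(path))
        )
        with open(path, "rb") as f:  # the decoder skips the skippable frame itself
            reader = dctx.stream_reader(f, read_across_frames=True)
            while chunk := reader.read(1 << 16):
                sys.stdout.buffer.write(chunk)
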
15:15:09 | | CraftByte quits [Client Quit] |
15:15:27 | | CraftByte (DragonSec|CraftByte) joins |
15:16:02 | <@arkiver> | first implemented in Wget-AT, now somewhat officially proposed in the WARC standard as well :) |
15:16:06 | <@arkiver> | (well, for many months now)
15:24:32 | | CraftByte quits [Client Quit] |
15:24:49 | | CraftByte (DragonSec|CraftByte) joins |
15:35:04 | <that_lurker> | Ubuntu Looking at Discontinuing Its Source ISOs: https://www.phoronix.com/news/Ubuntu-Discontinue-Source-ISOs https://news.ycombinator.com/item?id=38876167 |
15:35:51 | | D joins |
15:40:18 | <@arkiver> | oh? |
15:41:14 | | @arkiver never understood the push for snap |
15:41:51 | <@arkiver> | if there is an archive of src ubuntu ISOs, let's get them |
15:41:55 | <@arkiver> | perhaps in #archivebot
15:42:42 | <@JAA> | Not sure there's anything unique in those ISOs that isn't also in the package repos, but if it isn't huge, might not hurt either. |
15:43:37 | <@JAA> | It's about these, for example: https://cdimage.ubuntu.com/releases/23.10/release/source/ |
15:45:04 | <@JAA> | (I think, at least.) |
15:46:51 | <D> | how do i unblur images |
15:47:00 | <D> | on sites like patreon and substar |
15:49:04 | <c3manu> | D: oh, that's simple. i didn't know you were talking about those. you just pay the creators :) |
15:53:41 | <@arkiver> | yeah! :) |
15:54:54 | <@JAA> | Note that there's duplication in it. For example, https://cdimage.ubuntu.com/releases/mantic/release/source/ and https://cdimage.ubuntu.com/releases/23.10.1/release/source/ should have the same files. |
15:56:45 | <D> | what if the creators are dead/seem dead |
15:56:57 | <D> | and u want to archive their work |
16:02:33 | <c3manu> | the creators don't handle the billing themselves. if they still have creator accounts on there, you should still be able to subscribe and download their stuff
16:03:34 | | D quits [Remote host closed the connection] |
16:03:38 | <c3manu> | i know gallery-dl supports at least patreon, so you can just pass your browser's cookies to it and it will download the creator's pictures (and maybe videos?) for you. |
16:03:44 | <c3manu> | oh well |
16:35:15 | | AlsoHP_Archivist quits [Ping timeout: 272 seconds] |
16:42:48 | | qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins |
16:51:57 | | HP_Archivist (HP_Archivist) joins |
17:10:33 | | RealPerson leaves |
17:10:58 | | RealPerson joins |
17:29:14 | | qwertyasdfuiopghjkl quits [Remote host closed the connection] |
17:29:15 | | HP_Archivist quits [Read error: Connection reset by peer] |
17:29:31 | | HP_Archivist (HP_Archivist) joins |
17:35:49 | | qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins |
17:37:12 | | qwertyasdfuiopghjkl quits [Excess Flood] |
17:39:06 | | Doranwen quits [Remote host closed the connection] |
17:39:52 | | Doranwen (Doranwen) joins |
17:47:20 | | qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins |
18:01:26 | | nexusxe (nexusxe) joins |
18:04:20 | | Church (Church) joins |
18:11:31 | | nic9070 quits [Ping timeout: 272 seconds] |
18:11:35 | | nic90709 (nic) joins |
18:20:20 | | AlsoHP_Archivist joins |
18:21:39 | | HP_Archivist quits [Ping timeout: 272 seconds] |
18:54:52 | <nicolas17> | eh linux distro source isos are of questionable usefulness, who's gonna burn the source code onto physical media?
19:16:09 | <fireonlive> | i do wonder who did & why |
19:16:20 | <fireonlive> | i guess offline apt source commands?
19:16:38 | <that_lurker> | you can throw it at your enemies
19:16:51 | <fireonlive> | ninja star! |
19:16:55 | <nicolas17> | JAA: what do you think about this samsung open source stuff? should I create an item with 800GB of zips? I'm not sure how to split it up :/ |
19:18:02 | <@JAA> | I guess the only potential usefulness would be 'collection of all source code behind a major Linux distribution at a specific point in time', but yeah, no disagreement. |
19:18:46 | <fireonlive> | i think sometimes $manufacturers have accidentally shipped something they weren't supposed to, discovered many years later
19:18:49 | <fireonlive> | internal stuff or whatever |
19:19:29 | <@JAA> | nicolas17: 5.4k files in a single item might be a bit unwieldy, but if there's no reasonable way of grouping it, maybe it's the least bad approach, yeah. |
19:19:31 | <nicolas17> | I think the ubuntu source iso topic and the samsung open source topic are getting crossed |
19:19:36 | <@JAA> | Yeah |
19:20:03 | <nicolas17> | well, I could also do 2500 items, each with one .zip for one device model |
19:20:07 | <nicolas17> | but I don't know if there's a reasonable middle ground between "all in one item" and "2500 items", such as grouping by device type or something |
19:20:29 | <pokechu22> | What about grouping by year? |
19:20:31 | <@JAA> | Almost all files are for mobile devices, I think? |
19:20:56 | <@JAA> | What kind of file-level metadata is there? |
19:21:21 | <nicolas17> | I think there's more metadata in the search results page than in the files themselves, but let me check |
19:22:45 | <nicolas17> | each row in the search results is an "upload" |
19:24:16 | <nicolas17> | each "upload" has one or more "attachments" |
19:24:26 | <nicolas17> | ( https://opensource.samsung.com/downSrcMPop?uploadId=11931 example with multiple files) |
19:25:31 | <nicolas17> | they can be "source" or they can be "announcement" (usually massive html or text files with licenses and copyright notices) |
19:25:33 | <nicolas17> | like this https://opensource.samsung.com/downAnnMPop?uploadId=11931 |
19:25:45 | <nicolas17> | the vast majority have 1 source zip and 1 announcement html though |
19:26:43 | <nicolas17> | the model number and device type only show up in the search results, as far as I can tell, not in the zip contents nor in the DownSrcMPop page |
19:27:33 | <nicolas17> | for the "announcement" html files, there's so much redundancy that "tar | zstd -19" (in progress) gets a 0.8% compression ratio |
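
Spelled out, that pipeline amounts to something like the following in Python (the directory and output names are assumptions):

    # Equivalent of "tar | zstd -19" over the announcement files.
    import tarfile
    import zstandard as zstd

    cctx = zstd.ZstdCompressor(level=19, threads=-1)  # -1: all cores, like -T0
    with open("announcements.tar.zst", "wb") as out, \
            cctx.stream_writer(out) as writer, \
            tarfile.open(fileobj=writer, mode="w|") as tar:
        tar.add("announcements")
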
19:49:42 | | qwertyasdfuiopghjkl quits [Client Quit] |
20:21:34 | | ThreeHM_ (ThreeHeadedMonkey) joins |
20:23:45 | | Naruyoko quits [Read error: Connection reset by peer] |
20:24:31 | | ThreeHM quits [Ping timeout: 272 seconds] |
20:32:54 | | BlueMaxima joins |
20:43:52 | | DogsRNice joins |
21:16:19 | | Naruyoko joins |
21:18:50 | | BearFortress quits [Ping timeout: 240 seconds] |
21:30:04 | | ThreeHM_ is now known as ThreeHM |
21:33:28 | | JAA_ (JAA) joins |
21:33:28 | | @ChanServ sets mode: +o JAA_ |
21:33:47 | | @JAA quits [Remote host closed the connection] |
21:39:20 | <nicolas17> | JAA_ grew a tail |
21:46:36 | | hexa- quits [Quit: WeeChat 4.1.1] |
21:47:42 | | hexa- (hexa-) joins |
21:57:29 | | @JAA_ is now known as @JAA |
22:07:42 | <fireonlive> | owo |
22:10:50 | | nic90709 quits [Ping timeout: 240 seconds] |
22:11:27 | | nic9070 (nic) joins |
22:14:36 | | Island joins |
22:19:33 | | qwertyasdfuiopghjkl (qwertyasdfuiopghjkl) joins |
22:55:14 | | Naruyoko quits [Client Quit] |
23:11:41 | | lennier2_ joins |
23:14:20 | | lennier2 quits [Ping timeout: 240 seconds] |
23:27:51 | <project10> | JAA: re: transfer (from #frogger): curl --upload-file ./100MB.bin https://transfer.archivete.am 0.34s user 0.24s system 1% cpu 46.623 total |
23:30:32 | <project10> | I wonder if this is at least partly the cause of h2ibot's multi-hour processing, as it seems to upload the "cleaned" list to transfer w/o compression (so a multi-GB upload at <20Mbps)
23:31:42 | <fireonlive> | also of note, h2ibot's backend doesn't seem to zst-compress the files first, which could help for some lists
23:31:55 | <fireonlive> | by backend i mean whatever arkiver is running :3 |
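
A sketch of that compress-before-upload idea, assuming transfer.archivete.am accepts plain HTTP PUTs the way the curl --upload-file invocation above implies (transfer.sh-style, returning the download URL in the response body):

    # Compress a list with zstd, then PUT it to transfer.
    import requests
    import zstandard as zstd

    def upload_compressed(path):
        cctx = zstd.ZstdCompressor(level=10)
        with open(path, "rb") as src, open(path + ".zst", "wb") as dst:
            cctx.copy_stream(src, dst)  # stream to avoid holding multi-GB lists in RAM
        with open(path + ".zst", "rb") as f:
            r = requests.put("https://transfer.archivete.am/" + path + ".zst", data=f)
        r.raise_for_status()
        return r.text.strip()

    print(upload_compressed("cleaned-list.txt"))
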