00:01:26hexa_ (hexa-) joins
00:03:01magmaus3 quits [Ping timeout: 260 seconds]
00:09:49Doranwen (Doranwen) joins
00:50:44magmaus3 (magmaus3) joins
01:45:28<nulldata>https://beta.weather.gov/
01:46:15<cancername>ah damn.
02:16:06pixel leaves [Error from remote client]
02:41:41<steering>hmm, don't know if I had seen it
02:45:59<steering>https://www.weather.gov/images/ilx/Top_News/newfcst-sample.jpg (https://transfer.archivete.am/OAUBN/newfcst-sample.jpg) i guess
02:46:00<eggdrop>inline (for browser viewing): https://transfer.archivete.am/inline/OAUBN/newfcst-sample.jpg
03:49:49BearFortress quits []
04:05:06DogsRNice quits [Read error: Connection reset by peer]
04:08:44BornOn420 quits [Remote host closed the connection]
04:17:33nicolas17 joins
04:26:10BornOn420 (BornOn420) joins
04:34:24BearFortress joins
04:57:38<nicolas17>Flashfire42: apparently I should be thankful I only got a 24-hour power outage
04:57:44<nicolas17>when other people got this https://resizer.glanacion.com/resizer/v2/acceso-a-zarate-ruta-6-cortada-por-2QYCOF4255EAJPSXR4XMAAFPTU.jpeg?auth=213bc33c8bdbb524ea8b27fa60d42e15457870db440dd38e19f263a3e1d2b88a&width=420&height=280&quality=70&smart=true
04:59:25<Flashfire42>Oh holy crap yeah you got lucky there
05:00:53<nicolas17>some places got 2x the amount of rain of the average month of May... in a single day
05:27:14<nicolas17>POWER WENT OUT AGAIN rip my youtube downloads as soon as my UPS dies
05:36:41nicolas17 quits [Ping timeout: 260 seconds]
05:44:29<nulldata>F
05:54:36HP_Archivist quits [Quit: Leaving]
06:06:46HP_Archivist (HP_Archivist) joins
06:35:46HackMii quits [Remote host closed the connection]
06:45:12HackMii (hacktheplanet) joins
07:22:15<BlankEclair>F
07:56:24pixel (pixel) joins
08:38:33Dada joins
09:26:31Riku_V quits [Ping timeout: 260 seconds]
09:30:32Riku_V (riku) joins
09:39:43f_|DSR quits [Remote host closed the connection]
09:40:20f_|DSR (funderscore) joins
11:00:03Bleo182600722719623455 quits [Quit: The Lounge - https://thelounge.chat]
11:02:49Bleo182600722719623455 joins
11:17:52sec^nd quits [Remote host closed the connection]
11:18:19sec^nd (second) joins
11:35:26balrog quits [Ping timeout: 260 seconds]
12:28:19vitzli (vitzli) joins
12:41:15fangfufu_ quits [Quit: ZNC 1.8.2+deb3.1+deb12u1 - https://znc.in]
12:51:32fangfufu joins
13:00:10vitzli quits [Client Quit]
14:43:53etnguyen03 (etnguyen03) joins
14:44:12ThreeHM quits [Quit: WeeChat 4.4.3]
14:48:22ThreeHM (ThreeHeadedMonkey) joins
14:48:22ThreeHM quits [K-Lined]
15:07:13etnguyen03 quits [Client Quit]
15:15:13NotGLaDOS quits [Quit: No Ping reply in 180 seconds.]
15:16:25NotGLaDOS joins
16:19:26grill (grill) joins
16:20:25balrog (balrog) joins
16:30:42sec^nd quits [Remote host closed the connection]
16:31:07sec^nd (second) joins
18:02:08etnguyen03 (etnguyen03) joins
18:11:53etnguyen03 quits [Client Quit]
18:47:02etnguyen03 (etnguyen03) joins
19:06:59etnguyen03 quits [Client Quit]
19:08:41grill quits [Ping timeout: 260 seconds]
19:24:04HackMii quits [Remote host closed the connection]
19:24:05BornOn420 quits [Remote host closed the connection]
19:24:25HackMii (hacktheplanet) joins
19:24:54BornOn420 (BornOn420) joins
19:33:24<steering>don't worry, climate change isn't real
19:33:45<steering>just look at how un-severe the weather is
19:41:34nine quits [Quit: See ya!]
19:41:46nine joins
19:41:46nine quits [Changing host]
19:41:46nine (nine) joins
19:53:09ichdasich quits [Quit: reboot]
19:54:57yano quits [Quit: WeeChat, https://weechat.org/]
19:56:52ichdasich joins
19:58:50yano (yano) joins
21:25:44nine quits [Client Quit]
21:25:57nine joins
21:25:57nine quits [Changing host]
21:25:57nine (nine) joins
21:42:47nine quits [Client Quit]
21:42:59nine joins
21:42:59nine quits [Changing host]
21:42:59nine (nine) joins
22:24:10<Doranwen>How do I remove all text in a URL up to the *final* / ? Like I have a list full of links like `https://www.fanfiction.net/s/10004023/1/Home` with varying numbers where the "10004023" is (and not all the same *length* of digits either), and I'd like to remove everything up to and including the final slash so the only thing left is the word or words *after* that slash, in this case "Home"
22:24:29<Doranwen>I found this which seems similar but I am not able to understand how to use any of it to solve my problem: https://unix.stackexchange.com/questions/423550/delete-everything-before-last-slash-in-bash-variable
22:25:08<@JAA>`sed 's,^.*/,,'`
22:25:33<@JAA>(Reads from stdin, writes to stdout)
22:25:57<@JAA>Doing it with Bash will be very slow.
22:50:36<Doranwen>Thank you! The time it takes is not that important to me, as long as I don't have to do it manually, lol.
22:52:24<Doranwen>But I like the speed of that - instantly for all 1800+ rows. XD
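A minimal sketch of JAA's sed approach, assuming the links sit one per line in a hypothetical file named links.txt:

```sh
# Strip everything up to and including the final slash on each line.
# Commas are used as the s,,, delimiter so the / needs no escaping.
sed 's,^.*/,,' links.txt > titles.txt

# e.g. https://www.fanfiction.net/s/10004023/1/Home  ->  Home
```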
22:59:03Dada quits [Remote host closed the connection]
23:02:25etnguyen03 (etnguyen03) joins
23:02:39<@JAA>Oh, that few, yeah, then even Bash wouldn't be noticeable. But sed is still much faster.
23:14:26DogsRNice joins
23:18:24<steering>for x in $(cat urls); do echo ${x##*/}; done :P
23:19:43<@JAA>(INTERNAL SCREAMING INTENSIFIES)
23:20:00<steering>ok fine, `while read -r x; do echo "${x##*/}"; done <urls` if you're worried about special chars
23:20:04<steering>:P
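A sketch of that pure-Bash variant under the same assumption (a links.txt file), relying on the `${x##*/}` expansion:

```sh
# Read each line verbatim (-r keeps backslashes literal) and print the part
# after the final slash; ${x##*/} strips the longest prefix matching */.
while read -r x; do
    echo "${x##*/}"
done < links.txt
```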
23:20:41<@JAA>Better, but may still be broken depending on locale and IFS.
23:20:53<steering>hmm how so?
23:21:18<@JAA>There are funny bugs around reading incomplete UTF-8 sequences in a UTF-8 locale.
23:21:55<steering>hmm
23:22:02steering should put broken utf8 into some urls for funsies
23:22:45<steering>I can see that breaking a lot of stuff
23:22:53<@JAA>Try this: `read -r x < <(printf '\x01\xC2\x01'); declare -p x`
23:23:17<@JAA>I discovered this bug a couple weeks ago, still need to write Chet an email about it so it can get fixed before 5.3 comes out.
23:25:34<@JAA>And `read` exits non-zero when it fails to decode UTF-8 data, so this produces no output: `while IFS= read -d '' -r; do declare -p REPLY; done < <(printf '\xC2\0')`
23:25:51<@JAA>That's already fixed in devel.
23:26:14<steering>i assume there was meant to be an extra REPLY in that
23:26:23<steering>:P
23:26:35<@JAA>I mean, you can explicitly specify it if you want, sure.
23:26:43<steering>ah no, that's the default, TIL
23:27:20<steering>... but tbh it's a pretty useless default, it's easier to type x twice than to type REPLY once :P
23:27:56<@JAA>So the only portable safe way of `read`ing arbitrary bytes is to set `IFS= LC_ALL=C`, yay.
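A sketch of that last point, combining the pieces from the discussion above (file name is illustrative): `IFS=` turns off field splitting and trimming, and `LC_ALL=C` makes `read` treat the input as plain bytes rather than attempting UTF-8 decoding.

```sh
# Byte-oriented line read, per the point above.
while IFS= LC_ALL=C read -r line; do
    printf '%s\n' "${line##*/}"
done < links.txt
```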
23:29:08<steering>I would just say that the list of URLs should already be encoded such that it won't have any characters that would even break `for in` :P
23:29:47<steering>(I mean, barring IFS=/ or something)
23:30:04<@JAA>* and ? are legal in URLs, so you could get globbing.
23:30:24<@JAA>And, with that first suggestion, even globbing twice. :-D
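A quick illustration of the globbing hazard with the unquoted forms above (file and variable names are made up):

```sh
# With an unquoted expansion, ? and * undergo pathname expansion, so the
# printed value can depend on what happens to be in the current directory.
touch 'indexAhtml'            # a stray file the ? glob can match
tail='index?html'             # e.g. what ${x##*/} might produce
echo $tail                    # unquoted: prints indexAhtml if that file exists
echo "$tail"                  # quoted:   always prints index?html
```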
23:30:34<steering>yes, but the realistic odds of getting a match are quite low
23:31:03<@JAA>Sure. I like my code reasonably robust.
23:31:18<steering>I mean, yeah
23:32:09<steering>it's good to be able to reach for a range of options, depending on how much robustness you need, IMO
23:32:21<steering>if I really needed it robust I just wouldn't be writing it in bash :D
23:33:56<steering>I usually turn to perl though instead of sed just because sed regex is my (INTERNAL SCREAMING INTENSIFIES
23:34:37<@JAA>Ew, Perl
23:34:49<@JAA>sed is just BRE or ERE.
23:34:57<@JAA>But yeah, sometimes Perl regex is easier.
23:34:59<steering>yes, exactly my problem :P
23:35:11<@JAA>I mean, ERE is basically Perl regex without the fancy stuff.
23:35:18<steering>yeah ERE would probably be fine
23:35:37<@JAA>Fun fact: BRE has back references but ERE does not.
23:35:39<steering>BRE is a fun game of trying to remember which metacharacters get backslashes and which don't
23:36:23<steering>and then you end up with a big \(\|\)
23:36:54<steering>throw in a \+ or even \{1,2\}
23:36:54<@JAA>Yeah, I usually switch to -E as soon as I need any of (|{[.
23:37:48<@JAA>\+ \? are implementation-defined in BRE.
23:38:05<@JAA>\|, too
23:38:07ThreeHM (ThreeHeadedMonkey) joins
23:38:35<@JAA>They may either behave like the non-backslashed operators in ERE or match the literal character.
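A small side-by-side of the points above, using grep for BRE (the default) and ERE (-E); the `\+`/`\?`/`\|` forms are deliberately avoided since, as noted, they are implementation-defined in BRE:

```sh
echo 'abab' | grep    '\(ab\)\{2\}'   # BRE: group and interval need backslashes
echo 'abab' | grep -E '(ab){2}'       # ERE: same match, no backslashes
echo 'abab' | grep    '\(ab\)\1'      # BRE back reference; ERE has none
echo 'cat'  | grep -E 'cat|dog'       # ERE alternation; a plain | in BRE is literal
```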
23:38:54<steering>even better lol
23:39:07<@JAA>ERE has far fewer of these pitfalls.
23:39:25<steering>of course perl adds more than just that
23:39:36<steering>I'll use lookarounds or named captures in it occasionally
23:39:37<@JAA>But the lack of backrefs is ... interesting.
23:39:45<@JAA>Yeah, right.
23:40:21<steering>but I do mostly limit it to "one-liners"; if I need to make a whole-ass program I'll usually just use Python or something
23:40:26<@JAA>Luckily, you can turn Perl into sed with `perl -pe`. :-)
23:40:30<steering>just a regex and maybe some ifs or variables
23:40:31<steering>yup
23:40:40<@JAA>Then you get the advantages of Perl without the atrocious syntax.
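For the earlier URL question, that would look like the following (same hypothetical links.txt as above):

```sh
# perl -p reads and prints every line like sed; -e supplies the program inline.
perl -pe 's,^.*/,,' links.txt > titles.txt
```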
23:42:52<steering>perl -ne 'if (/^USE `a`;$/) { $db=1; } elsif (/^USE / && $db) { $db=0; } elsif (/^INSERT INTO `b`/ && $db) { print; }' 1.sql >2.sql
23:43:02<steering>could I have done that in sed? probably. could I have done it in the like 30s it took me? definitely not :P
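The same one-liner spread out with comments, for readability (database `a`, table `b`, and the file names are from the example above):

```sh
perl -ne '
    if    (/^USE `a`;$/)              { $db = 1 }  # entered the dump section for database a
    elsif (/^USE / && $db)            { $db = 0 }  # any other USE line ends it
    elsif (/^INSERT INTO `b`/ && $db) { print }    # emit only the INSERTs for table b
' 1.sql > 2.sql
```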
23:43:27<@JAA>That's already too much Perl for me.
23:43:33<@JAA>I'd probably use AWK for that.
23:43:34<steering>lol
23:43:45<@JAA>Which, bonus, always uses ERE. :-)
23:44:30<steering>sed -n '/^USE `a`;/,/^USE /p' | grep, or something, give or take an off-by-one there
23:44:35<steering>idk
23:44:47<steering>might be easier without -n and with d
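Filling in that sketch, with the same caveats: the range starts at the `USE `a`;` line and runs to the next `USE` line, and the grep keeps only the relevant INSERTs, so the boundary lines drop out on their own.

```sh
# Roughly equivalent to the Perl one-liner: print the section for database a,
# then keep only the INSERTs for table b.
sed -n '/^USE `a`;$/,/^USE /p' 1.sql | grep '^INSERT INTO `b`' > 2.sql
```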
23:44:48<@JAA>Yeah, I can't be bothered to remember how sed apart from s works.
23:45:19<steering>i've never used awk beyond "select a column" so *shrug*
23:46:01etnguyen03 quits [Client Quit]
23:46:03<@JAA>Looks very similar to your Perl code:
23:46:06<@JAA>`awk '/^USE `a`;$/ { db=1; } /^USE / && db { db=0; } /^INSERT INTO `b`/ && db { print; }'`
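One hedged aside: unlike the Perl elsif chain, awk checks every rule against each line, so as written the USE line for `a` also satisfies the second rule and clears db again immediately. A `next` in the first rule keeps the two equivalent:

```sh
awk '/^USE `a`;$/             { db = 1; next }  # enter the section for database a
     /^USE / && db            { db = 0 }        # any other USE line leaves it
     /^INSERT INTO `b`/ && db { print }         # keep only the INSERTs for table b
' 1.sql > 2.sql
```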
23:46:26<steering>mmh, no big surprise there
23:57:06HP_Archivist quits [Quit: Leaving]