Huh, I found it to be so much easier to set up than nginx that I wrote the devs a little thank you message
Yeah, the Nazis weren’t really subtle. If you instead maintain a civil front inward for public support, you can wreak havoc more effectively.
That’s why fascism is a different kind of danger. It wouldn’t leech off of other places for centuries, it would explosively and directly attack internal and external enemies.
Neither of these things can be risked.
Yes, that’s why I wrote the part after “I see what you’re getting at”
Fascism is a pretty specific ideology. If you want to learn more, Umberto Eco made a list.
I see what you’re getting at: the role of past and ongoing colonialism is still being downplayed. But you’re wrong. There are very good reasons why we should fear fascism in particular.
So you’re agreeing. “one does not simply stop, because one needs to be really sure that they want to stop for some reason or another”. The desire to stop doesn’t come from nothing, yet it’s the vital ingredient for stopping successfully. Unless you have it, stopping is really hard.
The contents of your message aren’t a “no”, they’re a “yes, and”
I mean, hippos are one of the most dangerous big animals
That was my immediate reaction here: one of the reasons the xz backdoor was possible is that nobody is going to question the idea of shipping a tarball to spare users from having to touch Autotools.
Of course I wouldn’t think of manually hacking together Makefiles since I come from languages that have either the One True Build Tool or a standard for packaging and defining build backends.
I think the author’s aversion to build tools trying (and apparently failing) to make everyone’s life easier is more a statement about how much C/C++ have suffered from not having a standard for packages.
I don’t think those are better or worse. My point isn’t about some ancient, far too limiting standard, but about how easy it is to wreck everything by not knowing some obscure syntactic rule. My issue is with implicit conversion between strings and arrays, with silently swallowed errors, and so on. And the only shell languages I know of that aren’t idiotic are Nushell and PowerShell.
That KDE theme that nuked some user’s home directory? Used a bash script. That time the bumblebee graphics card switching utility deleted /var? Bash script. Any time some build system broke because of a space in a path: bash/ZSH/… script.
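To make the “space in a path” failure mode concrete, here’s a minimal sketch (my own illustration, not taken from any of the incidents above) of the word-splitting pitfall: an unquoted expansion silently splits a path containing a space into two arguments.

```shell
#!/bin/sh
# Hypothetical path with a space in it.
path="my project/file.txt"

set -- $path      # unquoted: word splitting applies
echo "unquoted: $# arguments"   # -> 2

set -- "$path"    # quoted: the path stays intact
echo "quoted: $# arguments"     # -> 1
```

Any script that forgets the quotes in even one place, say `rm -rf $path`, is suddenly operating on `my` and `project/file.txt` instead of the intended file.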
Why would anyone make an init system based on shell scripts these days?
POSIX shells are horrible unmaintainable bug factories.
shellcheck is not enough to make them safe programming languages. They are acceptable only in an interactive context.
Having anything encourage people to write POSIXy shell scripts is a design flaw.
Services are bash scripts?
Oh no. That’s horrifying. I’ll never go back to the bad old days when my system constantly had dozens of untestable, buggy bash scripts running.
I currently have zero bash scripts running on my system until I open steam, and there’s no world where I’d go back.
I have a Pixel 6 and notice some lag in scrolling. Could it be that you don’t use srcsets but instead serve huge screenshots regardless of the device’s screen size?
Yeah, and when you read a paper that contains math, you won’t see a declaration about which country’s notation is used for things that aren’t defined. So it’s entirely possible that you can’t immediately tell how some piece of notation is supposed to be interpreted.
Of course, when there’s ambiguity like that, ideally only one interpretation is intended and it’s easy to figure out which one, but that’s not guaranteed.
No, you can’t prove that some notation is correct and an alternative one isn’t. It’s all just convention.
Maths is pure logic. Notation is communication, which isn’t necessarily super logical. Don’t mix the two up.
Look, this is not the only case where semantics and syntax don’t always map, in the same way e.g.: https://math.stackexchange.com/a/586690
I’m sure it’s possible that all your textbooks agree, but if you read, say, a paper written by someone who isn’t from North America (or wherever you’re from), it’s possible they use different semantics for a notation that to you seems to have a clear meaning.
That’s not a controversial take. You need to accept that human communication isn’t as perfectly unambiguous as mathematics (writing math down using notation is a way of communicating)
Notation isn’t semantics. Mathematical proofs are working with the semantics. Nobody doubts that those are unambiguous. But notation can be ambiguous. In this case it is: weak juxtaposition vs strong juxtaposition. Read the damn article.
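For concreteness (the numbers here are my own illustration, not taken from the article), the same expression evaluates differently under the two conventions:

```latex
6 \div 2(1+2) =
\begin{cases}
(6 \div 2) \cdot (1+2) = 9 & \text{weak juxtaposition} \\
6 \div \bigl(2 \cdot (1+2)\bigr) = 1 & \text{strong juxtaposition}
\end{cases}
```

The underlying semantics are unambiguous in both cases; it’s the notation that leaves the grouping underspecified.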
Just read the article. You can’t prove something with incomplete evidence. And the article has evidence that both conventions are in use.
Let’s do a little plausibility analysis, shall we? First, we have humans, you know, famously unable to agree on a universal standard for anything. Then we have me, who has written a PhD thesis for which he has read quite a few papers on math and computational biology. Then we have an article that talks about the topic at hand, but that you, for some unscientific and completely ridiculous reason, refuse to read.
Let me just tell you one last time: you’re wrong, you should know that it’s possible that you’re wrong, and not reading a thing because it could convince you is peak ignorance.
I’m done here, have a good one, and try not to ruin your students too hard.
Mathematical notation, however, can be ambiguous, because it’s just convention unless it’s defined on the same page.
Ooo damn, that sounds exactly like what I’d like to try.
On the other hand I feel like I’m too old for this shit. My system works fine, I understand everything, and things rarely break and never in an unrecoverable way.