

Some middle-aged guy on the Internet. Seen a lot of it, occasionally regurgitating it, trying to be amusing and informative.
Lurked Digg until v4. Commented on Reddit (same username) until it went full Musk.
Was on kbin.social (dying/dead) and kbin.run (mysteriously vanished). Now here on fedia.io.
Really hoping he hasn’t brought the jinx with him.
Other Adjectives: Neurodivergent; Nerd; Broken; British; Ally; Leftish
Ehh. Old 8-bit machines had no trouble with the veritable Gordian knots written by kids in their bedrooms back in the day, so any chip’s gonna be fine.
That’s not to say this chip wouldn’t run it better…
Then you need concrete. And lots of it.
Paired with the recent change that Oscar judges are no longer allowed to skip parts of the media they’re reviewing (because apparently that was a thing), the flood of AI slop movies is going to be absolutely gruelling for them to wade through.
One possible outcome is that this means AI kills the Oscars… but it’s more likely to get that watch-all rule rolled back.
And either way, it would probably mean that we’ll never see another 2001: A Space Odyssey, because a bunch of that movie looks like AI slop.
… I just realised this means that the models behind AI-generated movies could well end up being trained - accidentally or on purpose - to maximise Oscar wins by exploiting underlying psychology that exists only in the sort of people who serve as Oscar judges, but which somehow manages to mostly exclude everyone else.
That said, many people disagree with the Oscar nominations and awards anyway, so whether that makes any real difference is probably moot.
Henrietta Lacks hasn’t managed it yet. Look her up. It’s at least as bad as this if not more so.
“Yet” being the operative word here. There’s a disease in dogs that started in some very similar circumstances (although happening in nature rather than from a science accident). One slip-up from an immunocompromised tech with just the right genetic make-up and it begins.
“So… what is your name?”
“Sigh. Richard. Richard Kiel.”
“Isn’t that the name of the actor…”
“Yes. Yes it is. My parents were hilarious.”
“…”
“…”
“Wait. Were hilarious?”
“I ate them.”
“Ah.”
There was a post on a similar subject not too long ago, and it seems the landscape shifted about 10 years ago. A whole load of alt-lifestyle folks got into Linux and sysadmin, and maybe a few previously straight-laced sysadmins came out as alt too.
Vive les différences
True. I think of it more as a semantic shift. In the old days, processes would actually quit and some other process would resurrect them as necessary, but then someone had the idea of having some processes catch the HUP and do all that themselves without actually bothering any other processes.
And the implementation might actually involve an `exec` of the process’s own executable, meaning that the old code does terminate; it’s just replaced in place under the same PID (or, with a fork first, it leaves a child in its place).
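A minimal sketch of that trick, assuming a long-running Perl script; the handler and the loop here are illustrative, not from any particular daemon:

```perl
use strict;
use warnings;

# On SIGHUP, replace this process image with a fresh copy of ourselves.
# exec() keeps the same PID: the old code is gone, but no child is forked.
$SIG{HUP} = sub {
    exec($^X, $0, @ARGV) or die "re-exec failed: $!";
};

while (1) {
    sleep 60;    # stand-in for the daemon's actual work
}
```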
So I reread it and it says “P follows Q”, which I (mis)read/(mis?)interpreted as “P follows from Q”.
I don’t remember if “follows” was ever used for forward implication in this way when I actually did a logic course, but it was a few decades ago now. Maybe it was.
There’s also the fact that the usual joke in this category is that, in basic logic, false implies true, which seems to be the punchline of the joke in the comic, just with the arrow backwards.
In order of decreasing politeness: 1, 2, 15, 9 = HUP, INT, TERM, KILL = “Please stop”, “Quit it”, “I’m warning you” and “BANG”
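A rough sketch of that escalation in Perl, assuming the PID on the command line is one you’re allowed to signal, and with an arbitrary two-second grace period:

```perl
use strict;
use warnings;

my $pid = shift @ARGV or die "usage: $0 PID\n";

for my $sig (qw(HUP INT TERM)) {    # decreasing politeness
    kill $sig, $pid;
    sleep 2;                        # give it a chance to comply
    exit 0 unless kill 0, $pid;     # signal 0 only checks the process exists
}
kill 'KILL', $pid;                  # BANG: cannot be caught or ignored
```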
Isn’t that implication arrow backwards?
“P follows from Q” is P ⇐ Q
Maybe that’s the joke, though.
EDIT: The text says “P follows Q”, which my brain apparently corrected to “P follows from Q”. These are not the same, and I’d argue that “P follows Q” is problematic as a phrase as a result. Grumble grumble.
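For reference, a sketch of the definitions being argued over, as I remember them:

```latex
% Material implication, and the "follows from" reading:
\[
  P \Rightarrow Q \;\equiv\; \lnot P \lor Q
  \qquad\text{so a false $P$ makes $P \Rightarrow Q$ vacuously true.}
\]
\[
  \text{``$P$ follows from $Q$''} \;\equiv\; Q \Rightarrow P \;\equiv\; P \Leftarrow Q
\]
```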
If you approach people and tell them you’re normal, they won’t believe you. Something similar applies to saying you’re human on the Internet.
Hello, yes, I am a human.
Suspicious, isn’t it?
The cynic in me wants to know: Once purchased, will it, and any media it might contain at any time, be under the sole control of the purchaser?
If not, it’s definitely not worth buying.
To steal an idea I had on something similar once, perhaps on an entirely different site, there’s also the question of where the wings come from in the first place, literally and figuratively.
The equivalent structure to a bird’s wing in humans is the arm and hand. Does this new wing take the place of the forelimb as it does in birds or does the wisher necessarily become a six-limbed creature?
From the naive perspective it’s looping infinitely and it ought to be infinitely old because there’s no “first loop”. Depending on the laws of physics, proton decay could make the pizza slice literally impossible.
Given that it clearly exists and has no rot let alone deep-time decay, I posit that it spontaneously appears/renews in panel three, away from the boundary break, as some kind of near-infinitely improbable entropy break.
He thinks he’s discovered panel time travel, but it’s far weirder than he thinks.
For a certain set of inputs, yes. Good luck guessing what comprises that set, even if 1) there’s documentation and 2) you read it.
Worse, for all we know, `double` actually adds a thing to itself, which might accidentally or deliberately act on strings. Dividing by two has no such magic.
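We never get to see this `double`, so here are two hypothetical Perl implementations of it, showing how the same name could either mangle a string or happily double it, while halving only mangles:

```perl
use strict;

# Two plausible ways to "double" something; only the second means
# anything for a string.
sub double_add    { my ($x) = @_; return $x + $x }    # "abc" numifies to 0
sub double_concat { my ($x) = @_; return $x . $x }    # "abc" -> "abcabc"

print double_add(21),      "\n";   # 42
print double_concat("ha"), "\n";   # haha

# Halving has no such string magic: "abc" numifies to 0, so this prints 0.
print "abc" / 2, "\n";
```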
Perl was originally designed to carry on regardless, and that remains its blessing and curse, a bit like JavaScript, which came later.
Unlike JavaScript, if you really want it to throw a warning or even bail out completely when compiling such constructs (at least some of the time, like this one), it’s pretty easy to turn that on rather than resort to an entirely different language.
Put `use warnings;` at the top of a program and it will punt a warning to STDERR as it carries merrily along.
Make that `use warnings FATAL => "syntax";` and things that are technically valid but semantically weird like this will throw the error early and also prevent the program from running in the first place.
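For example (the array here is just illustrative; `length` on an array is exactly the sort of semantically weird construct meant above):

```perl
use strict;
use warnings FATAL => "syntax";   # promote syntax-class warnings to fatal errors

my @items = (1, 2, 3);

# "length() used on @array" is a compile-time warning in the syntax
# category, so under FATAL => "syntax" the program refuses to run at all.
my $n = length @items;
print "$n\n";
```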
Well, you see, Perl’s `length` is only for strings, and if you want the length of an array, you use `@arrayname` itself in scalar context.
Now, `length` happens to provide scalar context to its right-hand side, so `@arrayname` already returns the required length. Unfortunately, at that point it hasn’t been processed by `length` yet, and `length` requires a string. And so the length of the array is coerced to a string, and then the length of that string is returned.
A case of “don’t order fries if your meal already comes with them or you’ll end up with too many fries”.
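Concretely, a tiny hypothetical demonstration; the numbers are chosen so the two answers can’t be confused:

```perl
use strict;

my @xs = (1) x 1234;       # an array of 1234 elements

my $count = scalar @xs;    # 1234: the array in scalar context
my $fries = length @xs;    # 4: 1234 stringified to "1234", then measured

print "$count $fries\n";   # prints "1234 4"
```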
Reminds me of Tommy Cooper’s joke: “The Invisible Man’s at the door.” “Tell him I can’t see him.”
Pretty sure that became a doctor joke in many school yard repeats, if not also Christmas crackers and other places whimsical jokes tend to turn up.
As a Perl fossil I recognise this syntax as equivalent to `if(not @myarray)`, which does the same thing. And here I was thinking Guido had deliberately aimed to avoid Perlisms in Python.
That said, the Perlism in question is the right* way to do it in Perl. The `length` operator does not do the expected thing on an array variable. (You get the length of the stringified length of the array. And a warning, if those are enabled.)
* You can start a fight with modern Perl hackers over whether `unless(@myarray)` is better or just plain wrong, even if it works and is equivalent.
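All three spellings below test the same thing, an empty array in boolean context; pick whichever won’t start a fight on your team:

```perl
use strict;
use warnings;

my @myarray;    # empty

print "empty\n" if not @myarray;         # the Perlism in question
unless (@myarray) { print "empty\n" }    # equivalent; fight-starter
print "empty\n" if !@myarray;            # same test, different spelling

# And the trap: length @myarray would be 1 here, because
# scalar @myarray is 0, and the string "0" has length 1.
```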
In addition to corsicanguppy’s comment, some — often important — programs actually expect the system to be secured in a particular way and will refuse to function if things don’t look right.
Now, you’d be right to expect that closing down permissions too tightly could break a system, but people have actually broken their systems by setting permissions too openly on the wrong things as well.
That said, for general, everyday use, those commands don’t need to be used much, and there might even be a way to do what they do from your chosen GUI. Even so, it’s nice to know they’re there and what they do for those rare occasions when they might be needed.
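OpenSSH is the classic example of the first point: `ssh` flatly refuses to use a private key that other users can read. A sketch of tightening that up from Perl, assuming the usual key locations (the shell equivalent is running `chmod` directly):

```perl
use strict;
use warnings;

# ssh rejects group- or world-readable private keys, so lock the
# directory and the keys down to the owner only.
my $ssh_dir = "$ENV{HOME}/.ssh";

chmod 0700, $ssh_dir
    or warn "chmod $ssh_dir: $!";

for my $key (glob "$ssh_dir/id_*") {
    next if $key =~ /\.pub$/;    # public halves can stay readable
    chmod 0600, $key or warn "chmod $key: $!";
}
```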