

“There are some bad things on the internet”
“Just… Don’t use the internet?”
The ads for apps and Xbox games; trial versions of Office preinstalled; the Minesweeper and Solitaire collection, preinstalled but actually ad-supported or non-free; depending on the region, Spotify/TikTok/Facebook also come preinstalled; “Movies & TV”, Bing/MS News…
I think all of those count as bloat. I haven’t included Edge because I guess having a browser is a necessity, or Copilot/Cortana because you said “excluding AI features”.
I’ve only started using Storygraph recently (which I also like) but I’d consider a federated alternative. Does anybody know whether it’s possible to migrate the history from Storygraph to Bookwyrm?
Excellent in which specific sense? Most competitors offer better everything (performance, range, build quality) for a given price point.
The fact that Tesla has managed to make EVs that consistently rank below most ICE brands in terms of reliability is mind blowing.
My mom (78) got a new Kindle a couple of years ago, after her previous one had lasted over 10 years.
She stopped using it because “it’s not okay” anymore. After a lot of remote poking and prodding (we live in different countries) to understand what made the Kindle “not okay”, I managed to get her to tell me that “the screen is blank”. I said I’d check it soon, the next time I went to her place.
When I travelled there, not long after, I checked the Kindle and turned on the screen. It was indeed blank, because she’d finished a book and the last page was blank. Everything worked fine.
I have told her, but she refuses to use the kindle because “it’s not okay”.
In a separate conversation I offered to give my sister my really old kindle as hers is actually broken. My mom heard that and said she wanted it because hers is… Not okay.
The insistence and willful ignoring of what I said is the most infuriating part.
It’s the other way around: an Apple Silicon Mac can run an Intel binary through Rosetta (I think there are almost no exceptions at this point). It’s Intel Macs that can’t run ARM-specific binaries.
I thought a few days ago that my “new” laptop (M2 Pro MBP) is now almost 2 years old. The damn thing still feels new.
I really dislike Apple but the Apple Silicon processors are so worth it to me. The performance-battery life combination is ridiculously good.
Also because, as a person who has studied multiple languages, German is hard and English is Easy with capital E.
No genders for nouns (German has three), no declensions, no conjugations other than “add an s for the third person singular”, somewhat permissive grammar…
It has its quirks, and pronunciation is the biggest one, but nowhere near German (or Russian!) declensions, Japanese kanji, etc.
Out of the wannabe-Esperanto languages, English is in my opinion the easiest one, so I’m thankful it’s become the technical lingua franca.
It’s UE in Spanish, from Unión Europea. (Non-doubled letters because it’s a single Union, there’s no plural like in “States”).
Sometimes people in Spain do use the English acronyms for both EU/USA, but I don’t think I’ve seen it often. Both UE and EEUU are more common from what I’ve seen, and also people rarely say these out loud, it’s exclusively a written language problem.
| Language | Native Speakers | Total Speakers | Sources |
|---|---|---|---|
| English | ~380 million | ~1.5 billion | Wikipedia |
| German | ~76–95 million | ~155–220 million | Wikipedia |
| Mandarin | ~941 million–1.12 billion | ~1.1–1.3 billion | Wikipedia |
Well, it has 10x more speakers than German, but it still has fewer speakers than English and most of them are localised in a single country.
I’m talking about running them on the GPU, which favours the GPU even when the comparison is between an AMD Epyc and a mediocre GPU.
If you want to run a large version of DeepSeek R1 locally, with many quantized variants being over 50 GB, I think the cheapest Nvidia GPU that fits the bill is an A100, which you might find used for 6K.
For well under that price you can get a whole Mac Studio with those 192 GB the first poster in this thread mentioned.
I’m not saying this is for everyone, it’s certainly not for me, but I don’t think we can dismiss that there is a real niche where Apple has a genuine value proposition.
My old flatmate has a PhD in NLP and used to work in research, and he’d have gotten soooo much use out of >100 GB of RAM accessible to the GPU.
If it’s for AI, loading huge models is something you can do with Macs but not easily in any other way.
I’m not saying many people have a use case for them at all, but if you do want to run 60 GB models locally, a whole 192 GB Mac Studio is cheaper than the Nvidia GPU alone that you’d need for the job.
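The arithmetic behind those model sizes is simple enough to sketch. This is a rough back-of-the-envelope estimate, not a benchmark; the helper name is mine, and real memory usage runs higher than this because of the KV cache, activations, and runtime overhead:

```python
# Rough estimate of the memory needed just to hold a model's weights
# at a given quantization level. Actual runtime usage is higher
# (KV cache, activations, framework overhead).

def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """GB required for the weights alone."""
    return params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A 70B-parameter model needs ~70 GB at 8-bit and ~35 GB even at 4-bit,
# already past what a single consumer GPU (typically <= 24 GB) can hold.
print(weight_memory_gb(70, 8))  # 70.0
print(weight_memory_gb(70, 4))  # 35.0
```

That gap between ~35–70 GB models and ~24 GB consumer cards is exactly why a large pool of unified memory, as on the Mac Studio, changes the calculus.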
So the lack of apple-branded AI Slop is slowing down the sales for iPhones but not for Macs?
Edit for clarity: I’m aware Sequoia “has” Apple Intelligence, but in a borderline featureless state, so it’s as good (or as bad) as not having anything.
Some of these are for insurance, government organisations… They are naturally dry but we can’t get away from them.
Some others, like the internal changelogs I described, I agree won’t ever get read. If that’s the case, I don’t care (much) about the quality, just about doing it as quickly as possible.
There are tons more applications in the workplace. For example, one of the people in my team is dyslexic and sometimes needs to write reports that are a few pages long. For him, having the super-autocorrect tidy up his grammar makes a big difference.
Sometimes I have a list of, say, 200 software changes that would be a pain to summarise, but where it’s intuitively easy for me to tell whether a summary is right. For something like a changelog I can roll the dice with the hallucination machine until I get a correct summary, then tidy it up. That takes less than a tenth of the time it would take to write it myself.
Sometimes writing is necessary and there’s no way to cut down the drivel, unfortunately. Talking about professional settings, of course: having the Large Autocorrect write a blog post or a poem for you is a total misuse of the tool, in my opinion.
To be fair, it’s such a load of nonsense that it isn’t really worth reading…
At least those are big; the ones that woman is wearing have frames that are both thick and small, which makes for a truly horrible effect.
I always was on the “nerd” side at school but boy do those faces look punchable.
It’s okay. We can all play that game. I’ve replaced my use of Duolingo with AI.
Pro tip: set as the “system prompt” in your LLM of choice: “at the end of every query, include a short Swedish sentence related to my prompt”. No need for Duolingo.