

Oh wow, so we are in kinda similar places but from vastly different paths and capabilities. Back before I was disabled I was a rather extreme outlier of a car enthusiast, like I painted (owned), ported, and machined professionally. I was really good with carburetors, but had a chance to get some specially made direct injection race heads with mechanical injector ports in the combustion chamber… I knew some of the Hilborn guys… real edgy race stuff. I was looking at building a supercharged motor with a mini blower and a very custom open source Megasquirt fuel injection setup using a bunch of hacked parts from some junkyard Bosch direct injection Mercedes diesel cars. I had no idea how complex computing and microcontrollers are, but I figured it couldn't be much worse than how I had figured out all automotive systems and mechanics.
After I was disabled 11 years ago, riding a bicycle to work while the heads were off of my Camaro, I got into Arduino and just trying to figure out how to build sensors and gauges. I never fully recovered from the broken neck and back, but am still chipping away at compute. Naturally, I started with a mix of digital functionality and interfacing with analog.
From this perspective, I don't really like API-like interfaces. I often have trouble wrapping my head around them; I want to know what is actually happening under the hood. I have a ton of discrete logic for breadboards and have built stuff like Ben Eater's breadboard computer. At one point I played with CPLDs in Quartus. I have an iCE40 around but have only barely gotten the open source toolchain running before losing interest and moving on to other stuff. I prefer something like FlashForth or MicroPython running on a microcontroller so that I am independent of some proprietary IDE nonsense. But I am primarily a Maker and prefer fabrication or CAD over programming. I struggle to manage complexity and lack the advanced algorithms I would know if I had a formal CS background.
So from that perspective, what I find baffling about RISC under CISC is specifically the timing involved. Your API mindset is likely handwaving this as a black box, but I am in this box. Like, I understand how there should be a pipeline of steps involved for the complex instruction to happen. What I do not understand is the reason for, or the mechanisms behind, the separation of CISC from RISC in this pipeline. If my goal is to do A…E, and A-B and C-D are RISC instructions, I have a ton of questions. Like, why is there still any divide at all for x86 if direct emulation is just a translation and subdivision into a couple of simpler instructions? Or how is the timing of this RISC decomposition as efficient as logic built as an integrated monolith? How could that ever be more efficient? Is this incompetent cost cutting, a backwards-compatibility constraint, or some fundamental issue with the topology, like RLC issues with the required real estate on the die?
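To make my mental model concrete, here is roughly how I picture the front end "cracking" one complex instruction into micro-ops. This is a toy sketch in Python, not any real x86 decoder; the instruction names and tuple format are made up for illustration:

```python
# Toy sketch of CISC -> micro-op "cracking" (NOT a real x86 decoder).
# A memory-operand ADD like `add [addr], rax` can't execute as one step
# on a load/store execution core, so the decoder splits it into uops.

def crack(instr):
    """Translate one CISC-style instruction into RISC-like micro-ops."""
    op, dst, src = instr
    if op == "add_mem":  # add [dst], src  (read-modify-write on memory)
        return [
            ("load",  "tmp0", dst),    # uop 1: read memory into a temp register
            ("add",   "tmp0", src),    # uop 2: register-to-register ALU add
            ("store", dst,   "tmp0"),  # uop 3: write the result back to memory
        ]
    # Simple register-only ops already look like RISC: pass through as one uop.
    return [instr]

for uop in crack(("add_mem", "[0x1000]", "rax")):
    print(uop)
```

My question is basically about the hardware version of that `crack()` step: why routing everything through that translation stage can end up faster than wiring the read-modify-write as one monolithic unit.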
As far as the Chips and Cheese article, if I recall correctly, that was saved once upon a time in Infinity on my last phone, but Infinity got locked by the dev. The Reddit post link would have been a month or two before June of 2023, but your search is as good as mine. I'm pretty good at reading and remembering the abstract bits of info I found useful, but I'm not great about saving citations, so take it as water cooler hearsay if you like. It was said in good faith with no attempt to intentionally mislead.
Awareness of the audience is a critical factor in successful communication.