There hasn't been true CISC for decades. It's really just RISC under the hood.
Sure, but the point is that even if the post is right, it's a pedantic argument over semantics (pedantics of semantics :). It's like those people who bitch at you when you call a tomato a vegetable:
REEEEEEEEEEEEEEE NO IT'S A FRUIT!!!
It's an irrelevant difference.
Wasn't every CISC "really" RISC on the inside? They all translated instructions into simpler operations over multiple cycles.
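Purely as a toy sketch of that idea (the instruction name, operand strings, and micro-op names are all made up here, this is not how any real front end is written), you can picture a memory-operand add being decoded into a few simpler load/add/store operations:

    # Toy CISC-to-micro-op decoder. Everything here is invented for
    # illustration; real decoders are vastly more involved.
    def decode(insn):
        op, dst, src = insn  # e.g. ("add_mem", "[rax]", "rbx")
        if op == "add_mem":
            # One "complex" instruction becomes three simple operations.
            return [
                ("load",  "tmp", dst),         # tmp <- memory[dst]
                ("add",   "tmp", "tmp", src),  # tmp <- tmp + src
                ("store", dst,   "tmp"),       # memory[dst] <- tmp
            ]
        # Simple register-to-register ops map one-to-one.
        return [insn]

    print(decode(("add_mem", "[rax]", "rbx")))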
Of course processor designers like complexity, so even with RISC they add pipelines, branch prediction, parallel execution, dependency tracking, multi-level caching, etc.
Not always, but the problem with CISC was that the complexity increased the chance of hardware bugs, which couldn't be fixed once they were set in silicon. RISC was created to reduce that risk by making the instruction set as simple to implement in hardware as possible. The trade-off was greater memory use (back when memory was at a premium) and code that was harder to program. Modern CPUs are hybrids, so if there's a dodgy CISC instruction or a security flaw, it can be patched in microcode.
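Hand-wavy sketch of that last point only (the table contents, opcode name, and op names are invented): microcode is basically a lookup from an architectural instruction to the internal operation sequence, and a patch just swaps out an entry at boot instead of touching silicon.

    # Toy model of a microcode store: opcode -> sequence of internal ops.
    microcode = {
        "dodgy_insn": ["op_a", "op_b_buggy", "op_c"],
    }

    def apply_microcode_update(patch):
        # A vendor-shipped update overrides entries in the on-chip table
        # at boot, without changing the hardware itself.
        microcode.update(patch)

    apply_microcode_update({"dodgy_insn": ["op_a", "op_b_fixed", "op_c"]})
    print(microcode["dodgy_insn"])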
I thought that the point of RISC was to reduce the variable complexity of instructions, thus making them more amenable to pipelining, and keeping the common simple operations fast. Most RISC architectures have fixed-length instructions, for example, which simplifies instruction fetching and decoding so the front end can run ahead without having to interpret each instruction first.
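Rough sketch of why fixed-length encoding lets the fetcher run ahead (the 4-byte width and the lengths list are assumptions for the example): with fixed-length instructions the address of the Nth instruction is just arithmetic, while variable-length encoding forces you to walk and partly decode every earlier instruction first.

    # Fixed-length ISA: instruction boundaries are known without decoding.
    def nth_insn_addr_fixed(base, n, width=4):
        return base + n * width

    # Variable-length ISA: each instruction's length must be decoded to
    # find the next one. `lengths` stands in for a real length decoder.
    def nth_insn_addr_variable(base, n, lengths):
        addr = base
        for length in lengths[:n]:
            addr += length
        return addr

    print(nth_insn_addr_fixed(0x1000, 10))                     # constant time
    print(nth_insn_addr_variable(0x1000, 3, [1, 3, 6, 2, 5]))  # serial walk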