RCA wrote: So MoD, you really think that more efficient CPUs (multi-core, HyperTransport, etc.) were a result of only software requirements and not a necessity based on the limits of silicon?
Yes. Absolutely. The reason I say this is that consumer tech trickles down from the professional market. Multi-core CPUs are an example of that. But they're also the logical progression of another bit of tech that never made it to the mainstream consumer market because of cost reasons: multi-CPU (not multi-core) setups. Servers with multiple Xeons were common long before multi-core CPUs entered the consumer market. They were designed for the server environment where clock speed was much less important than moving huge amounts of data quickly and efficiently. It was a software necessity, not a hardware workaround.
Fast-forward a few years, and the multi-CPU motherboard logically progresses to the multi-core CPU. With improved tech (like QPI and HyperTransport [I avoid abbreviating HyperTransport as "HT" since it could be mistaken for hyperthreading]), a single die could be made to manage data as effectively as two. So servers began adopting dual-core Xeons and Opterons. The goal was still the same: move lots of data quickly and efficiently. A software need.
Of course, it's always more cost-efficient to produce multiple products from a single design, so shortly after multi-core chips hit the professional market, AMD and Intel began offering consumer-market chips based on the same architecture. The Core/Core 2 and Athlon 64 X2 were here. But they were really descended from professional tech. They were multi-core because the Xeon and Opteron were multi-core. The consumer market at that time really couldn't make proper use of multi-core CPUs. Software wasn't ready for it (Windows certainly wasn't optimized for it). There were a lot of articles about the negligible gains of multi-core CPUs over faster single-core CPUs, especially in the gaming world, where games were not thread-optimized.
Fast forward again, about 4 years, and we've got today. Multi-core-shy Windows XP has been succeeded twice (once by another multi-core reject and then mercifully by Win7 and its superior process management). A large portion of the software available in the consumer marketplace is now written to take advantage of multiple threads or multiple cores. Many games have options to customize the software to utilize a specified number of threads. QPI and HyperTransport are critical as they allow the CPU to keep up with the data rates of 2, 3, 4, 6, 8, or 12 threads. The FSB was a weak point even in the days of single core CPUs (which was when AMD actually began using HyperTransport as a solution to that weak point).
In a well-threaded program like 7-Zip or a good video encoder, a slower multi-core processor will easily outperform a faster single-core one. It moves more data more quickly: each individual task may run slower, but more of them are happening at once.
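The idea can be sketched roughly like this (a minimal, hypothetical Python example; the names `crunch` and `threaded_total` are mine, not from 7-Zip or any real encoder). One caveat: in CPython, threads share one interpreter lock, so a truly CPU-bound job would use processes instead, but the work-splitting structure is the same either way.

```python
# Sketch: a well-threaded job is carved into chunks so every core
# can work on its own slice at once (hypothetical stand-in workload).
from concurrent.futures import ThreadPoolExecutor

def crunch(chunk):
    # Stand-in for a CPU-heavy task, e.g. compressing one block of data.
    return sum(x * x for x in chunk)

def threaded_total(data, workers=4):
    # Split the data into one roughly equal chunk per worker.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # Each chunk runs concurrently; results are combined at the end.
        return sum(pool.map(crunch, chunks))
```

Each worker handles its slice independently, which is exactly why a chip with more (even slower) cores finishes the whole job sooner.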
It's worth noting that hyperthreading originated from server tech as well. The Xeon built on the P4's NetBurst architecture was the first hyperthreaded chip. Then you'll note that hyperthreading disappeared from consumer chips because it wasn't much help, for the same reasons multiple cores weren't at first. Now that software is "ready" for it, hyperthreading is back to complement multiple cores.
So on the professional end, multi-core was a solution to a software demand. But on the consumer end, we had to wait for software to catch up with the hardware advances. Now we're caught up, and threading is a very powerful tool.
One of the coolest benefits of multi-threading is that you can run multiple demanding programs at once. I can play Crysis and watch a movie, because Crysis doesn't need 8 threads, so the DVD player can do its thing without getting in the way (memory and 3D acceleration are a different issue, but as Crysis is GPU-limited by my poor little GTX260, this makes a good example of the CPU-side benefits).