Coding in the Fast Lane: The Multi-Threaded Multi-Core World of AMD64
I wrote five contributions for an ebook from AMD Developer Central — and forgot entirely about it! The book, called “Surviving and Thriving in a Multi-Core World: Taking Advantage of Threads and Cores on AMD64,” popped up in this morning’s Google Alerts report. I have no idea why!
Here are the pieces that I wrote for the book, published in 2006. Darn, they still read well! Other contributors include my friends Anderson Bailey, Alexa Weber Morales and Larry O’Brien.
- Driving in the Fast Lane: Multi-Core Computing for Programmers, Part 1 (page 5)
- Driving in the Fast Lane: Multi-Core Computing for Programmers, Part 2 (page 8)
- Coarse-Grained Vs. Fine-Grained Threading for Native Applications, Part 1 (page 37)
- Coarse-Grained Vs. Fine-Grained Threading for Native Applications, Part 2 (page 40)
- Device Driver & BIOS Development for AMD Systems (page 87)
I am still obsessed with questionable automotive analogies. The first article begins with:
The main road near my house, called Skyline Drive, drives me nuts. For several miles, it’s a quasi-limited-access highway. But for some inexplicable reason, it keeps alternating between one and two lanes in each direction. In the two-lane part, traffic moves along swiftly, even during rush hour. In the one-lane part, the traffic merges back together, and everything slows to a crawl. When the next two-lane part appears, things speed up again.
Two lanes are better than one — and not just because they can accommodate twice as many cars. What makes the two-lane section better is that people can overtake. In the one-lane portion (which has a double-yellow line, so there’s no passing), traffic is limited to the slowest truck’s speed, or to little-old-man-peering-over-the-steering-wheel-of-his-Dodge-Dart speed. Wake me when we get there. But in the two-lane section, the traffic can sort itself out. Trucks move to the right, cars pass on the left. Police and other priority traffic weave in and out, using both lanes depending on which has more capacity at any particular moment. Delivery services with a convoy of trucks will exploit both lanes to improve throughput. The entire system becomes more efficient, and the net flow of cars through those two-lane sections is considerably higher.
Okay, you’ve figured out that this is all about dual-core and multi-core computing, where cars are analogous to application threads, and the lanes are analogous to processor cores.
I’ll have to admit that my analogy is somewhat simplistic, and purists will say that it’s flawed, because under preemptive multitasking an operating system has more flexibility to schedule tasks on a single core than a single lane of traffic suggests. But that flexibility comes at a cost: context switching. Yes, if I were really modeling a microprocessor using Skyline Drive, cars would be able to pass each other in the single-lane section, but only if the car in front were to pull over and stop.
Okay, enough about cars. Let’s talk about dual-core and multi-core systems, why businesses are interested in buying them, and what implications all that should have for software developers like us.
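If you want to see the lane analogy in code, here’s a quick sketch of my own (not from the book, which predates C++11; std::thread is a modern stand-in for whatever native threading API we wrote about then, and the names drive, car1, and car2 are just props for the metaphor). Two CPU-bound threads on a dual-core machine each get their own lane; confined to a single core, the scheduler has to keep preempting one so the other can run, and the whole trip takes roughly twice as long.

```cpp
// Toy two-lane run (my sketch, not code from the 2006 articles): each "car"
// is a CPU-bound std::thread. With two cores available, the OS can put one
// car in each lane and they finish in roughly the time of one; on a single
// core, the scheduler preempts them in turn -- the "pull over and stop" of
// the analogy -- and the elapsed time roughly doubles.
#include <chrono>
#include <cstdio>
#include <thread>

// Busy-loop workload; 'volatile' keeps the compiler from optimizing it away.
static void drive(const char* name) {
    volatile unsigned long odometer = 0;
    for (unsigned long i = 0; i < 200000000UL; ++i) {
        odometer = odometer + i;
    }
    std::printf("%s finished\n", name);
}

int main() {
    const auto start = std::chrono::steady_clock::now();

    std::thread car1(drive, "car1");  // lane one
    std::thread car2(drive, "car2");  // lane two, if a second core exists
    car1.join();
    car2.join();

    const auto elapsed = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - start);
    std::printf("elapsed: %lld ms on %u hardware thread(s)\n",
                static_cast<long long>(elapsed.count()),
                std::thread::hardware_concurrency());
    return 0;
}
```

Run it on a multi-core box, then pin it to a single core (on Linux, taskset -c 0 ./a.out) and watch the elapsed time roughly double: same cars, one lane.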
Download and enjoy the book – it’s not gated, and it’s entirely free.