Intel's Sandy Bridge graphics tech: How good is it?
Sandy Bridge is the culmination of a major Intel design effort to achieve a respectable level of graphics performance and make it a standard feature in all Intel mainstream processors going forward. This week at the Intel Developer Forum in San Francisco, Intel engineers were fairly candid in explaining what Sandy Bridge can and can't do.
First, some background. A number of technical sessions at IDF were devoted to discussing Sandy Bridge's graphics technology and the design teams that came together to take this critical feature out of the chipset--a separate companion chip--and put it, for the first time, in the main processor, or CPU.
Intel integrated graphics silicon started appearing in many mainstream laptops about six years ago and has since shipped in the lion's share of PCs sold worldwide. While this has made Intel the leading graphics chip supplier, it has also made it the perennial target of criticism from gaming devotees, who claim--rightfully so in many cases--that Intel graphics fall woefully short in handling a number of mainstream games. In turn, this has led to Intel rebuttals and corresponding primers on Intel integrated graphics.
And Nvidia, a leading graphics chip supplier, has always offered its two cents on Intel's graphics technology. "Today's visual computing applications--like photo and video editing, playing games, and browsing the Web--use a GPU for the best experience," Nvidia said in a statement just prior to IDF. Standalone graphics processing units from Nvidia and Advanced Micro Devices almost invariably offer better performance, particularly on games, but can add cost and, in the case of laptops, can increase power consumption.
At IDF, Intel engineers described the markets they can, and cannot, address with Sandy Bridge's graphics. Sandy Bridge technology will be part of Intel Core i series mobile processors to be introduced into laptops early next year, with the first Sandy Bridge laptop announcements expected at the Consumer Electronics Show in January.
"We're not trying to target the most high-end discrete (standalone) card. We don't have the bandwidth, we don't have the power budget. We're trying to do the best experience for the mobile platform," said Opher Kahn, senior principal engineer on the Sandy Bridge design team.
But can Intel's Sandy Bridge graphics now handle games? "Historically, Intel has focused their integrated graphics on media capabilities and neglected gaming. With Sandy Bridge, Intel is finally making gaming performance a priority. While the number of shaders (processing units) in Sandy Bridge only increased by 50 percent, they are much more efficient and run at higher frequency so the performance gain is three times or more--putting popular games into reach for many more customers," said David Kanter, an editor and analyst at Real World Technologies, which covers chip technology in depth.
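A quick back-of-the-envelope check of Kanter's figures: if the shader count grew by 50 percent but overall performance tripled, each shader must deliver roughly twice the throughput of its predecessor, through the efficiency and clock-speed gains he cites. The sketch below illustrates that arithmetic; the baseline shader count is a hypothetical chosen to match the 50 percent increase, not an official Intel spec.

```python
# Back-of-the-envelope check of the claim above. Figures are taken
# from the quote ("50 percent" more shaders, "three times" the
# performance); the baseline of 8 shaders is a hypothetical that
# yields 12 after a 50 percent increase.
old_shaders = 8
new_shaders = old_shaders * 1.5   # "increased by 50 percent" -> 12
overall_gain = 3.0                # "three times or more"

# Implied per-shader improvement from efficiency plus higher clocks:
# total gain divided by the gain attributable to the shader count.
per_shader_gain = overall_gain / (new_shaders / old_shaders)
print(per_shader_gain)            # ~2x throughput per shader
```

In other words, only a third of the claimed speedup comes from adding shaders; the rest would have to come from making each one faster.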
And Intel addressed this question directly at IDF. "For the games that are designed for the highest-end, extreme edition graphics, the answer is probably no. Unless you run it on a smaller (resolution) screen. For most of the mainstream games, I would expect that the answer is yes," said Thomas A. Piazza, Intel fellow and director for graphics architecture at the Intel Architecture Group, responding to a question at an IDF technical session.
I also had a chance to chat with Piazza briefly on Tuesday about Sandy Bridge.
Q: What are the main differences between the current Core i series of graphics and Sandy Bridge?
Piazza: On the media side, we moved a lot to fixed function to get a significant performance boost in the same thermal envelope. And on the 3D side, the same thing, the same theme. Actually, the best way to put it is: put a fixed function everywhere that there is no reason to have a soft function. And that's what gave us the throughput at the power budget. (Note: Fixed function processing is done on the graphics chip, not in "software" on the CPU, increasing performance.)
Q: What is your design constraint versus standalone graphics from Nvidia and AMD? Is it transistors?
Piazza: I don't think it's a transistor count thing at all. We're trying to hit mobile-socketed devices. We're not going to build a 300-watt graphics device on an integrated CPU. There is no cooling solution for that. We're talking about the 17-watt socket, the 35-watt socket, the 45-watt socket, and maybe the 55-watt socket. So the cutoff is in the power numbers, it's not a transistor thing. (Note: 17-watt and 35-watt socket chips typically go into ultrathin and mainstream laptops, respectively. Higher wattage sockets are for bigger system designs.)
Q: Nvidia and AMD boast dozens, if not hundreds, of "processing cores." What is the core count on Sandy Bridge currently?
Piazza: Twelve (what are also referred to as "shaders"). But I don't want to say that our twelve is equivalent to their (Nvidia, AMD) twelve. It's the way we happen to lump things together. If I go out a year, I may have twice as much throughput on one shader.
Reviews of commercial Sandy Bridge-based laptops will be the final arbiters of performance, and introductions of those systems should peak in the first quarter of 2011.