I suspected that the Acer S5 shown at CES was going to be an Ivy Bridge Ultrabook, as that fits with the Q2 timescale, a previous leak and the hidden CPU information on the demo unit we saw at CES. It looks like another tech site spotted the possibility that the S5 is based on Ivy Bridge too and took the chance to benchmark it. CPU figures are slightly better than on Sandy Bridge, but the GPU figures show a marked improvement.
While Golem.de aren’t saying which device they tested, the CPU information and the background colour in the images and video are a give-away – the lighting at the Acer S5 launch event is something we all remember from our S5 photo attempts.
Ivy Bridge brings two important features to the Ultrabook platform. Firstly, it’s built on a 22nm production process, which reduces power usage (and heat) on the silicon and allows a little more clock within the same TDP limit. Secondly, the graphics on Ivy Bridge are reworked: the HD4000 graphics unit with DX11 support should provide significant improvements. Two Ivy Bridge part numbers are already in the wild.
Back to the Golem.de tests. They had a chance to run Cinebench, which returns two performance figures. The first is a CPU-based result: Golem.de saw a score of 2.38, which is higher than our highest test result of 2.11.
On the GPU test, the system returned 12.17 FPS; our highest result so far is 8.36 FPS. Ivy Bridge clearly has some GPU improvements to look forward to.
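For a rough sense of scale, here’s a quick back-of-the-envelope comparison of those figures (a minimal sketch; the scores are the ones quoted above, and the labels are our own shorthand, not official designations):

```python
# Rough relative improvement from the Cinebench figures quoted above.
# "sandy"/"ivy" are our own shorthand labels, not official part names.
sandy = {"cpu_score": 2.11, "gpu_fps": 8.36}   # best Sandy Bridge Ultrabook results we've seen
ivy = {"cpu_score": 2.38, "gpu_fps": 12.17}    # Golem.de's Ivy Bridge results

for key in sandy:
    gain = (ivy[key] - sandy[key]) / sandy[key] * 100
    print(f"{key}: +{gain:.0f}%")

# Prints roughly +13% on the CPU score and +46% on the GPU test,
# a slight CPU gain and a marked GPU gain, as noted above.
```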
The test results are obviously from non-optimised hardware and software, so come the end of Q2 we should see even better results.
Note: Don’t expect significantly better battery life on Ivy Bridge Ultrabooks. It’s likely manufacturers will still face the same thermal design restrictions, and some may choose to reduce the size of expensive and heavy batteries.
You can see the 3D test being performed in the video below. The article (German) is here.
Why does Intel feel the need to up the GPU power? Sandy Bridge is already more than powerful enough to run HD video with ease. The only thing above that would be gaming, but who really expects to play high-end games on an Ultrabook?
Sure, there are a few pro-level editing applications that will take advantage of the extra GPU power. But that’s a pretty niche audience & those people generally use workstations.
It’s only a matter of time before we start getting dedicated gfx cards in Ultrabooks as well, further negating Intel’s efforts. I would much rather Intel focus on lowering power consumption & let the experts handle GPU duties.
Well, I differ. I hope discrete GPUs die off and usher in an era of powerful, efficient iGPUs.
Using an integrated GPU instead of a discrete GPU IS a way to decrease power. With these extremely thin systems, discrete GPUs make no sense at all.
Plus, in the Ivy Bridge generation it should perform like a low-end modern card, rather than like Sandy Bridge, which performed like a low-end card from a previous generation.
I’m with you. But then I have no need for high-end 3D graphics.
Actually, discrete GPUs can be energy efficient. Nvidia Optimus, for example, can completely turn off the discrete GPU when it isn’t needed. They are just generally designed to be much more powerful than embedded GPUs and thus use more power.
However, if you provided the same level of performance with an embedded GPU, then it too would use more power.
It’s one of the reasons why AMD Fusion chips still use more power than Intel Atoms: the more powerful embedded GPU!
The main advantage of an embedded GPU is thus a reduction in manufacturing cost, with power savings coming at the cost of reduced performance.
Being thin & light doesn’t change the fact that people will continue to want to do as much as they can with these systems, and thus there is demand for discrete graphics.
Actually, no. Discrete cards come on a separate PCB and need dedicated memory. Those two factors cause them to consume more power. Even Optimus isn’t perfect.
Intel will continue to pursue higher-end iGPUs to compete with AMD and satisfy the people that want the extra performance. For those that don’t, they will have cheaper variants, just like they do now.
Do you think embedded GPUs don’t ever use dedicated memory? Because many do now; it’s just a question of the limited amount they can fit and the limited performance of the memory type they can use.
And do you think using shared memory doesn’t have its own performance and efficiency downsides? Because it does!
Besides, discrete GPUs can still be integrated into a system, as they often are on most laptops. Don’t confuse what’s normal for desktops with what’s normal in laptops.
Unless the system is designed to be upgradeable, which many laptops are not, the discrete GPU often becomes another integrated component of the motherboard and is soldered in just like the CPU, etc. – especially in these ultra-thin & light designs, because connectors take up too much space and add to the cost of the system.
Optimus actually works pretty well; most of its issues were just driver-related. The graphics card itself still needs to be designed to work efficiently, and the industry is still shifting its focus from raw performance to efficiency.
However, improvements have been made and continue to be made, so don’t underestimate what is possible with discrete graphics.
Another reason not to discount discrete graphics is that they are one of the few things we can actually choose in system configurations. Though laptops generally don’t provide much choice, people still prefer options, and even limited choice is better than no choice at all.
IB GPU performance isn’t going to be much better than the crappy SB GPU performance. Intel played a video of a game instead of actually gaming in their demo of IB graphics; I don’t think that bodes well.
If you really want to game on an Ultrabook, wait until there are Thunderbolt-based eGPU solutions, or for the rumored 17-watt Trinity APU.
The 17W Trinity is said to be equal to 35W Llano solutions, and the 35W Llano solutions are 1.5-2x faster than Intel’s graphics. Ivy Bridge should get the GPU pretty close to that.
And BTW, they invited reviewers to see that a DX11 game runs, and not too shabbily either: http://www.anandtech.com/show/5359/intel-confirms-working-dx11-on-ivy-bridge
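Putting that claim next to the measured figures (purely back-of-envelope; treating Sandy Bridge graphics as the 1.0 baseline is our assumption, and the Llano range is just the claim quoted above):

```python
# Back-of-envelope check, not a benchmark. The Llano range is the
# claim quoted above; the Ivy Bridge ratio comes from the Cinebench
# FPS figures earlier in the article (Sandy Bridge = 1.0 baseline).
llano_range = (1.5, 2.0)      # claimed: 1.5-2x Intel's (Sandy Bridge) graphics
ivy_ratio = 12.17 / 8.36      # ~1.46x measured vs the best Sandy Bridge result

print(f"Ivy Bridge vs Sandy Bridge: {ivy_ratio:.2f}x")
print(f"35W Llano (claimed): {llano_range[0]:.1f}x to {llano_range[1]:.1f}x")
# ~1.46x sits just below the bottom of the claimed Llano range,
# i.e. "pretty close", as the comment above suggests.
```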
+1