Massive apologies for the wall of text & mini-quotes, but DrG started it
Doctor Glyndwr wrote:
I disagree with many of Mr Dom's claims.
For one, Moore's Law states that the transistor count doubles, not the CPU grunt. But that's often misquoted so I'll let him off
It also states that the doubling is once every 18 months, so talking about "one Moore every year" makes my brain hurt.
More importantly, though: the evidence doesn't bear it out. Going from the iPad 1 to the 2, the CPU slightly more than doubled but the GPU got nine times faster (borne out by benchmarks, not just Apple PR). The same bump happened between the iPhone 4 and 4S. A similar increase was observed between the 3GS and 4 (not quite as much GPU as 9x, but still far more than double).
The basic Moore's Law states that transistors-on-a-chip double every two years; this was further refined by Intel into an 18-month doubling in performance, as the two are related. In fact most aspects of computing expand at the same rate, and I believe a large part of this is because hardware devs use Moore's Law as a target in their planning (from talking to AMD & Intel chip dev relations guys a few years ago, and nothing I have seen since makes me think they have changed). Their planning is done 5 years or so in advance of chips hitting the streets, so using a baseline & multiplier is how they set the targets for future development. In effect, they make Moore's Law a self-fulfilling prophecy for most aspects of electronic kit.
Ok, it's not the strict Moore's Law as stated by Moore, but it is very similar and much more widely applicable in its current form. It is Moore's Law++
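To make the "planning target" bit concrete, here's a back-of-envelope sketch of the baseline & multiplier approach - the baseline and doubling periods are purely illustrative, not anything off a real AMD/Intel roadmap:

```python
# Illustrative sketch of "Moore's Law as a planning target":
# pick a baseline, pick a doubling period, project 5 years forward.

def projected_performance(baseline, years, doubling_period_years):
    """Performance after `years`, assuming one doubling per `doubling_period_years`."""
    return baseline * 2 ** (years / doubling_period_years)

baseline = 1.0  # today's chip = 1 unit of grunt
for period in (1.5, 2.0):  # 18-month vs 2-year doubling
    target = projected_performance(baseline, years=5, doubling_period_years=period)
    print(f"5-year target at a {period}-year doubling: {target:.1f}x today")
# ~10.1x with 18-month doublings, ~5.7x with 2-year doublings --
# the sort of multiplier that goes into a 5-years-out chip roadmap.
```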
Doctor Glyndwr wrote:
Mr Dom wrote:
They need a good doubling of battery performance across the board (smaller chip fab sizes, more efficient cores & OS) so that even after they double the CPU grunt, they have enough left over to push to 4x the graphics.
What? This would only be true if the CPU and GPU consumed 100% of the power budget. Clearly, in fact, the screen, wireless interfaces, RAM, etc etc draw proportionally far more juice. So even if Apple exactly doubles the power consumption of the chipset, it doesn't need to double battery capacity to compensate -- far from it.
I didn't say battery capacity - I said battery performance, i.e. do the same job for fewer watts. This applies to the things you mention, like wifi/3G/GPS: these need to draw less to leave more watts for the CPU & GPU to use. Batteries do not follow Moore's Law++, sadly, and it gets increasingly tough to improve current lithium cells. They can squeeze a bit more juice into them, but they will not double the power available unless they make the battery much bigger. With shrinking electronics there is room for a small size increase, but after all this you are talking a 10-20% boost in capacity at best.
So with 120 'units' of power in the new beast, the 'rest' of the kit may drop from using, say, 50 units to 45 units. That leaves the CPU & GPU with 75 units instead of 50 - enough headroom to double the processing power.
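Same toy arithmetic in code form - all the 'unit' figures are illustrative, nothing here is a real measurement:

```python
# Power-budget arithmetic from the paragraph above, as a toy calculation.

old_total = 100   # old device: 100 units of power available
old_rest  = 50    # screen, radios, RAM, etc.
old_chip  = old_total - old_rest   # 50 units left for CPU + GPU

new_total = 120   # ~20% better battery performance
new_rest  = 45    # the 'rest' gets a bit more efficient
new_chip  = new_total - new_rest   # 75 units for CPU + GPU

print(f"Chipset budget: {old_chip} -> {new_chip} units "
      f"({new_chip / old_chip:.1f}x the power to play with)")
# 50 -> 75 units, i.e. 1.5x the juice -- the rest of the doubling has to
# come from perf-per-watt gains (smaller fab process, better cores).
```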
However - a new screen throws a massive spanner in the works. It will be drawing more power, and the real kicker is the bus speeds that you now have to support from the GPU to the screen controller and then to the screen.
To maintain the screen at that rez at 60Hz, you need ~800MB/s minimum just for the pixel data (up from ~200MB/s for the iPad 1 & 2). Just one screen of pixel data would be 12MB - that is a huge amount.
Quadrupling your bus speed is a hefty increase in power consumption.
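Quick sanity check on those figures, assuming 32-bit pixels and a 60Hz refresh (assumptions on my part, not confirmed details of the actual display pipeline):

```python
# Back-of-envelope check on the display bandwidth figures above.

def frame_bytes(width, height, bytes_per_pixel=4):
    """Raw size of one full frame of pixel data."""
    return width * height * bytes_per_pixel

ipad12 = frame_bytes(1024, 768)    # iPad 1 & 2 panel
retina = frame_bytes(2048, 1536)   # rumoured 4x-pixel 'retina' panel

MB = 1024 * 1024
print(f"iPad 1/2 frame: {ipad12 / MB:.1f} MB -> {ipad12 * 60 / MB:.0f} MB/s at 60Hz")
print(f"Retina frame:   {retina / MB:.1f} MB -> {retina * 60 / MB:.0f} MB/s at 60Hz")
# ~3 MB / ~180 MB/s vs ~12 MB / ~720 MB/s -- roughly the ~200MB/s and
# ~800MB/s minimums quoted above, just for shipping pixels to the screen.
```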
Doctor Glyndwr wrote:
Mr Dom wrote:
I know apple like their incremental money grab upgrade path, so they tend to go to a 1xMoore jump (2x performance if you are lucky). Compare this with consoles who look at a ~3.2xMoore jump (10x performance) between generations (roughly 6 years - go & count em if you don't believe me. It's like bloody clockwork)
A doubling in performance every year gives a 64x performance improvement over six years -- that's
twenty times more than the increase in a console's power, according to your figure. That's an "incremental money grab" option? I'd say it's an "astoundingly fast upgrade path" option myself.
Neither the original nor the modified Moore's Law says it doubles every year; the revised figure is around 18 months to 2 years. Over 6 years that gives somewhere between 8x and 16x, which with a bit of drift (some bits will do better, some worse) is close enough to the 10x that is the goal for a 'generation' in consoles.
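For the doubters, the generation arithmetic under the different doubling periods:

```python
# How many doublings fit in a ~6-year console generation, depending on
# which doubling period you believe.

def gain(years, doubling_period_years):
    return 2 ** (years / doubling_period_years)

for period in (1.0, 1.5, 2.0):
    print(f"{period}-year doubling over 6 years: {gain(6, period):.0f}x")
# 1.0 year  -> 64x (DrG's figure, but nobody claims a yearly doubling)
# 1.5 years -> 16x
# 2.0 years -> 8x  -- so a 6-year generation lands in the 8-16x range,
#                     i.e. roughly the ~10x order-of-magnitude jump.
```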
That's the non-incremental money grab option - wait for a seismic (order of magnitude) upgrade for the same money before you change. Apple go for tiddly upgrades between generations, which gives them more money as
hipsters with more money than sense users are encouraged to upgrade every 2-3 generations to stay 'with it'
Doctor Glyndwr wrote:
CraigGrannell wrote:
There's going to need to be some seriously amazing engineering to get that kind of display working while also retaining battery life
Why? Why does more pixels use up dramatically more battery life? Backlight intensity, wireless interfaces, etc are all unchanged. The GPU grunt might go up a bit, but it went up 9x between the iPad 1 and 2 without harming overall battery life. It might need an extra 64 MB or so for additional framebuffer space, but again, the iPad 2 went to 512 MB from the iPad 1's 256 MB and the world didn't end. I don't follow this logic.
9x? Yeah right...
Even Apple only claimed 7x in the end, and most benchmarks are strangely biased...
It went from a PVR-SGX535 (250 Mpx/s) to a PVR SGX543MP2 (500 Mpx/s) - double the fill rate at best.
The GFLOP rate went from 1.2 GFLOPS -> 3.6 GFLOPS, which at best you can read as 3x the actual computational power.
The 'benchmark' proofs are mostly evidence that the first iPad was fairly unbalanced and had serious issues with bus performance that crippled the graphics; they ironed out some of those issues to make the iPad 2 work more like it should have done in the first place.
But all the benchmarks I have seen concentrate on triangle throughput & vertex shader performance (GFLOPS), not fill rate, which has always been PoorVR's problem due to the way they render stuff.
i.e. you can have more detailed opaque characters but all your particle effects stay exactly the same.
Fixing cock-ups gives you a Moore-beating headline, but only cos the kit was sub-standard to start with.
Also, they jumped from 1 core -> 2 cores. This does give a nice boost - nearly double. But the jump from 2 -> 4 cores does not give quite as much of a boost, sadly - more chance of idling a core from thread starvation.
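One way to put rough numbers on the 2 -> 4 core thing is an Amdahl's-law style estimate - the 80% parallel fraction below is purely illustrative, not a measured figure for any real iPad workload:

```python
# Rough illustration of why 2 -> 4 cores gives less than another 2x.

def speedup(cores, parallel_fraction):
    """Amdahl's law: the serial part doesn't scale, the parallel part splits across cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

p = 0.8  # assume 80% of the work can be spread across cores
print(f"1 -> 2 cores: {speedup(2, p) / speedup(1, p):.2f}x")
print(f"2 -> 4 cores: {speedup(4, p) / speedup(2, p):.2f}x")
# ~1.67x going to 2 cores, but only ~1.50x more going from 2 to 4 --
# the extra cores spend more of their time starved of runnable threads.
```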
Doctor Glyndwr wrote:
Quote:
and enabling devs to shift graphics round at speed.
Only games devs. There's easily enough GPU grunt in an iPad 2 to push "Retina display" levels around in non-graphically-intensive apps; it's only the game devs that would suffer... if they supported retina graphics in their games at all, of course (it is optional after all). Now, Craig: do you really think Apple are going to not upgrade the display because it might inconvenience a subset of its developers? I contend the answer is a big fat "hell, no".
4x rez = 4x fill rate. All the triangle pushing in the world won't get you away from that. PoorVR chips don't do fill rate well. Always been their Achilles heel.
4x rez = 4x the data through the bus.
4x rez = 4x memory & processing by the screen controller (big increase in power drain)
This isn't games we're talking about. Games would use up the vertex processing, but tbh rendering text uses up a fair whack of that anyway. The polygons-per-square-inch on small text is comparable to rendering a low-poly model, and text has to be rendered as a transparency. A web page with a picture as the background and text on top counts as a fairly high-poly scene with 2x overdraw.
Asking a PoorVR card to do this at 4x the resolution while using almost the same power as last years chip is asking for trouble.
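Rough fill-rate arithmetic, using the 500 Mpx/s figure quoted earlier and the 2x overdraw / 60Hz assumptions from the paragraph above (estimates, not measurements):

```python
# Fill-rate back-of-envelope for a 'retina' web page.

sgx543mp2_fill = 500e6   # Mpx/s figure quoted earlier for the iPad 2 GPU

def pixels_per_second(width, height, overdraw, fps=60):
    """Pixels the GPU has to shade per second for a full-screen redraw."""
    return width * height * overdraw * fps

current = pixels_per_second(1024, 768, overdraw=2)
retina  = pixels_per_second(2048, 1536, overdraw=2)

print(f"Current rez: {current / 1e6:.0f} Mpx/s "
      f"({current / sgx543mp2_fill:.1f}x the quoted fill rate)")
print(f"4x rez:      {retina / 1e6:.0f} Mpx/s "
      f"({retina / sgx543mp2_fill:.1f}x the quoted fill rate)")
# ~94 Mpx/s vs ~377 Mpx/s -- at 4x rez a flat web page with 2x overdraw
# already eats most of the quoted 500 Mpx/s budget, before you ask the
# chip to do anything fancier.
```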
Bottom line - I'm not saying they can't do it, but there are some very serious issues with power consumption & the sheer amount of data you need to chuck around that would penalise other aspects if they did it now. Given their propensity for releasing every couple of years, you are more likely to get a 2x increase now. They may still claim it as a retina display tho, as it might technically count if you hold your arms stretched out or something.