On Thursday, the same day that AMD unveiled its 4800 series of cards early (because some e-tailers were already selling them), NVIDIA decided it just had to unveil its own new card, along with some other good news. NVIDIA's new card is the GeForce 9800 GTX+, an overclocked 9800 GTX that is really the 55nm refresh of the G92. NVIDIA increased the default core and shader clocks from 675MHz to 738MHz and from 1688MHz to 1836MHz, respectively. Memory speed stays at 1100MHz, the same as the existing 9800 GTX. The MSRP will be $229, and the card will hit store shelves in mid-July.
At the same time, the 9800 GTX's price will drop to $199, matching the MSRP of AMD's 4850.
CPU vs GPU PhysX In UT3
The really good news: PhysX for the GeForce 9800 GTX, 9800 GTX+, and GTX 260/280 cards will be enabled by the ForceWare 177.39 driver, timed to launch with the GTX+. NVIDIA says it will gradually roll out PhysX support to the full GeForce 8/9 line as well. As the above graph shows, GPU-accelerated PhysX could offer a significant performance increase in games that support the technology. The UT3 PhysX map pack, for example, which was used to gather the data above, shows a major boost with GPU-accelerated PhysX enabled.
Good news that us 8800 users will get PhysX, yay! But I'm probably stepping up to a 280.
I'm just wondering what's going to be the difference in the physics processing across the different lines of cards. Obviously the G92s are going to be constrained, but what impact is that going to have on framerates, and would you be better off running in software (that is, if you can choose to run in software)? What about CPU load? I feel a need for a special G92-with-PhysX vs. 260/280-with-PhysX review in the near future.
Well, at least my 8800 GT might show me a little more than what it's been showing lately. After dropping my 8800 in, I was fairly disappointed, as I figured I might have to upgrade to a quad core to get a decent balance of speed and image quality in UT3. Thanks for that data.
"Hey everyone! Look! We here at (fill in company name here) made a graph of what our product(s) may be capable of in the future to make us look good against our competition!"
In fact, wasn't this the type of marketing Ageia did for this technology two years ago?
Ageia may have a third big problem if it turns out that this first-gen PhysX PPU is simply a dog of a performer. We don't have enough evidence to draw that conclusion yet, but nothing I've seen so far convinces me this chip offers the sort of major leap in physics performance that Ageia claims for it.
Ageia's PhysX physics processing unit
Smithereens gone wild
by Scott Wasson — 12:00 AM on June 22, 2006
Well, I'll be damned. Looks like nVidia purchased their marketing team as well.
I tend to agree with you here, for sure. We're just getting the word out on the news, however, since it is obviously newsworthy that NVIDIA has begun to integrate its acquired technology into products. You're right, though: they need to deliver on the claims next.
Again, it does seriously lack games, though. I'm hoping that NVIDIA's acquisition of Ageia will spur game development while not detracting from GPU performance.
I would love to see PhysX support for my 8800 GTS. As was stated above, there aren't many games that require PhysX, but enabling PhysX on the GPU must bring some performance improvements.