News
  • 50.25% (Neutral)
  • Member Topic Starter
2010-03-26T19:12:17Z

NVIDIA GeForce GTX 480: GF100 Has Landed

For better or worse, the launch of NVIDIA's next-generation GPU architecture, codenamed Fermi, a.k.a. GF100, is one of the most highly anticipated in our industry, ever. Information about the GPU has been trickling out for many months now, some of it good and some bad. Regardless of what you have chosen to believe or ignore up to this point, one irrefutable fact remains: NVIDIA is extremely late to the DirectX 11 party. There are no ifs, ands, or buts about it. Rival AMD has used the last few months to release a myriad of DX11-class cards ranging in price from under $100 to almost $700, fleshing out a top-to-bottom line-up that caters to virtually every market segment. Today NVIDIA is announcing two high-end cards, neither of which will be available for a couple more weeks. So while this announcement is an important move for the company, NVIDIA would have liked to have made it sooner. C'est la vie.

NVIDIA may be late with their DX11-class cards, but launching strong products that compete favorably at their respective price points may erase some lingering concerns about the company and restore faith in prospective consumers. To that end, we can finally show you what NVIDIA has in store for the hardcore gamers out there. Today, NVIDIA is officially unveiling the GeForce GTX 480 and GeForce GTX 470. We have two of the flagship GeForce GTX 480 cards in house, and have tested them alongside NVIDIA's previous-gen products and AMD's Radeon HD 5800 / 5900 series, both in single and dual-card configurations. There's a lot to cover, so grab a snack, hydrate, and strap yourself in while we take NVIDIA's latest flagship for a spin around the HotHardware lab...

NVIDIA GeForce GTX 480: GF100 Has Landed

acarzt
  • 100% (Exalted)
  • Advanced Member
2010-03-26T19:13:30Z

And here I was just about to complain that it's Friday and no news on the GTX 480 :-P

Joel H
2010-03-26T19:23:15Z

Shouldn't "High Power Consumption" be labeled under "Hot"? ;)

 

*ducks*

AKwyn
  • 76.5% (Friendly)
  • Advanced Member
2010-03-26T19:23:30Z

Meh, I can't believe this is it after all the hype. It's only marginally faster than an ATI card, and it should have been $50 less for that kind of performance. I'm thinking this is part of a marketing scheme by NVIDIA to get people to buy two cards and run them in SLI so they can get world-class performance and NVIDIA more money; then again, I might be crazy.

I'm also surprised by the fact that it makes more heat and consumes more power. I wonder what they were thinking when they made the card; at least they could have done an Intel thing and made it more efficient so that it'd leapfrog over ATI just like the Core 2 Duo did with AMD's processors back in 2006.

And a card with all 512 cores unlocked... I guess it's going to be 10-20 fps faster than the ATI cards released right now, but damn are they going to need a cooler that can cool this card effectively.

I'm going to wait for the GeForce GTX 470 review before I make up my mind, but I think NVIDIA has a GeForce FX on its hands.

recoveringknowitall
2010-03-26T19:30:20Z

I'm pretty surprised by the results... IMHO the GTX 480 is meh and I'm gonna buy ATI this time.

Inspector
2010-03-26T19:33:32Z

acarzt, it's now time to complain about why this isn't as super as the hype made it out to be??? 🙂 lol. It's nice, but we were expecting better of them 😞...

animatortom
2010-03-26T19:46:39Z

Yeah!........1600-core processors!!

Oh, wait... what? It's Nvidia :(

Only 512 cores. Also, a big card with a minuscule fan?

As you can tell, this Fan is still one of ATI's :)

Like I have said before, it is going to take a while for the software to catch up to these multi-core GPUs. The big thing is the DX11 support; that is the main reason for upgrading a card!

acarzt
  • 100% (Exalted)
  • Advanced Member
2010-03-26T19:57:32Z

I think it is lacking in some areas due to drivers. It's a brand new piece of hardware.

Comparing it to the previous gen of Nvidia cards, it's hugely impressive. A single-chip GTX 480 is faster than a dual-chip GTX 295. On top of that, in some benchmarks it can nearly hang with the 5970. When paired in SLI it really shined.

While in single-card mode in the HAWX benchmark it was 20% faster than the 5870, in dual-card mode it was 40-60% faster than a pair of 5870s. On top of that, a pair of 5970s is only 20% faster while having twice as many chips to work with.

More evidence of potential for driver optimization becomes apparent here as well, since scaling from single card to dual picked up 90% at one resolution, but only 70% at the other. With some optimization Nvidia could squeeze out a few more frames.
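The scaling arithmetic here is easy to sanity-check. A minimal sketch, using hypothetical FPS figures (only the percentages mirror the ones quoted above):

```python
# Back-of-the-envelope SLI scaling check. The FPS numbers below are
# hypothetical; only the 90% / 70% scaling figures echo the post above.

def scaling(single_fps, dual_fps):
    """Extra performance gained by adding a second card, as a fraction."""
    return dual_fps / single_fps - 1

# Hypothetical: 100 fps on one card, 190 fps on two -> 90% scaling.
assert abs(scaling(100, 190) - 0.90) < 1e-9

# At another resolution, 100 -> 170 fps is only 70% scaling, which
# points at driver headroom rather than a hardware ceiling.
assert abs(scaling(100, 170) - 0.70) < 1e-9
```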

When a dual chip Nvidia card comes out.... it'll be a beast.

Also, pay close attention to the Unigine Heaven benches, as they can be a bit deceiving. While it looks like the 480 is only slightly faster than the 5870s, it is in actuality nearly twice as fast. On top of that, it was faster than the 5970.

I think for this card DX11 is where it's at. This thing will probably handily hand the ATI cards their a$$ in DX11 games w/ Tessellation. And as we all know, DX11 is the future. So I don't mind if it's not the best at DX10 as long as it's great at playing the games of the future :-)

AKwyn
  • 76.5% (Friendly)
  • Advanced Member
2010-03-26T19:58:03Z

animatortom wrote:

Yeah!........1600-core processors!!

Oh, wait... what? It's Nvidia :(

Only 512 cores. Also, a big card with a minuscule fan?

As you can tell, this Fan is still one of ATI's :)

Like I have said before, it is going to take a while for the software to catch up to these multi-core GPUs. The big thing is the DX11 support; that is the main reason for upgrading a card!

It's 480 cores, animatortom. The 512 cores aren't fully enabled yet. And the software is catching up at a faster pace than you predicted. The only problem is that it's only marginally faster than the ATI cards. The only time it really outperforms the ATI Radeon HD 5870 is in SLI mode, and the price does not justify the marginal performance increase.

animatortom
2010-03-26T20:10:41Z

Either one, Nvidia or ATI: you can paint a Yugo Ferrari red, but it still doesn't make it a Ferrari :)

Much like workstation cards, where the one with the best OpenGL support is the latest one needed for the current software, the same goes for DX11. With each iteration, about the only difference is the doubling of the memory.

My theory is, when it comes to GPUs, give it time and in another few months they will be rolling out something that is twice as powerful at the same price. Then the die-hard fanboys on either side will try them out, and if you look at the complaints you will get a better idea of the cards' shortcomings :)

Marius Malek
2010-03-26T21:23:41Z

Quote:

To show off the capabilities of GF100, NVIDIA has created a number of interesting demos. As many of you know, properly rendering and animating realistic hair is a difficult task.

Tell me about it. My OCD has been cringing at this for years now. I see a character's head move, and it's like their hair is a Lego piece that has been capped on top of their head. It doesn't move, and it irritates the crap out of me. The pictures that Nvidia released from the demo look incredibly realistic, which of course is thanks to the proper lighting.

Quote:

Another demo NVIDIA created to illustrate tessellation with the GF100 is aptly dubbed the Water Demo. As you can see in the screenshots above, the water demo takes a scene with relatively basic geometry, and through increased tessellation and displacement mapping the detail in the rocks and water is dramatically increased. 

Water has been the hardest thing to master for so long now, at least for consumer-priced products like video games.

Quote:

The demo does not use realistic fluid dynamics, but the effect was nonetheless still very good.

That's a bummer. I can't tell you how many times games have made water look like soggy gelatin. Still impressive though. 

Quote:

In addition to offering much more compute performance and geometry processing than previous generations, the GF100 also features new anti-aliasing modes.

This is probably why the card spikes at 105C. I was looking at the Age of Conan references you attached to the article. It's quite a difference; even though, that close up, the grass is still pixelated, I'm sure that us gamers aren't going to critique the grass on a microscopic scale.

I'm finally glad to see that Nvidia is pushing their "The Way It's Meant to Be Played" slogan. I haven't experienced a renaissance in PhysX optimization in really any game that I've been able to play, but their incorporation of APEX was really impressive. I expect to see a lot more capes flapping in the wind, much like in Arkham Asylum. I want real-time hair movement too. I'm a picky gamer, haha. Seriously though, the more realistic a game is for me, the more engrossed I get in it.

Quote:

"Computational GPU" is short-hand for "a whole lot of number crunching".

I take it this means that the card will kick some ass. 

So I was looking at the charts. The GTX 480 barely passes the 5970 in Extreme Tessellation, in both single- and multi-GPU setups. However, it fails miserably at 3DMark Vantage.

But, thinking more on the subject, we have to remember that the 5970 is made up of TWO 1GB GPUs.

What's interesting, though, is the Quake Wars test. The fact that CrossFire 5970s only have about a 25% performance increase over SLI GTX 480s is pretty substantial. That's two GPUs vs. four, and the four only have a slight margin.

One thing I keep seeing, though, is the comparison between the GTX 480 and the GTX 280 and 295 models. There is an increase in performance, but nothing groundbreaking in my opinion. It might have something to do with Crysis being optimized for DX10.

However, I noticed that the GTX 480 shined brightly through the Far Cry 2 test. At Ultra Quality and 4x AA at 1920x1200 it only falls behind the 5970 by 10 FPS, and only 6 in the multi-GPU setup.

Regarding L4D2, I just LOVE how every card just pushes through and dominates haha. 

Now, about that power consumption. HOLY CRAP. It's amazing how much juice this card needs in order to throw its weight around. 

Quote:

 At 438 watts under load, the GeForce GTX 480 consumed almost 40 more watts than the dual-GPU powered Radeon HD 5970, despite offering lower performance. With regard to power efficiency, it is obvious, the GF100 GPU is significantly less efficient than the Radeon HD 5870.

Yeah, seriously. Nvidia needs to rethink their strategy; people can afford their cards... but they will most likely foreclose on their homes because of the power bill.

Quote:

Having spent some quality time with the GeForce GTX 480, we can't help but expect the card, as we have shown it to you here today, will not be NVIDIA flagship for an extended period of time. The true potential of the Fermi architecture hasn't been fully realized just yet.

You see, this is what depresses me. People are going to go out and buy this card, and then Nvidia will release something that's actually groundbreaking. I think they did this just so they could get some of their market share back.

Right now the GTX 480 is Nvidia's flagship card, but like you said in your conclusion, it won't be for long. I don't think this is what Nvidia wanted either, and I hope they can roll out something eye-boggling and mind-numbing very soon.

 

My conclusion? This card isn't worth the trouble despite the beautiful things it can do. Save your money, wait one month, and I can promise you that ATI will drop the price tag on their cards by a few bucks, and you can end up getting more bang for your buck. That is, at least until Nvidia finally gets their stuff together. But don't get me wrong, this card is still a monster, and it should be respected as such.

 

bob_on_the_cob
2010-03-26T23:09:42Z

All I have to say is Meh. Pretty underwhelming.

rapid1
2010-03-26T23:29:36Z

I agree, bob, it seems pretty underwhelming to me too. When I saw the email notification of the post I was excited too. Oh well, it's pretty close to what I expected, especially seeing as ATI will probably release another card in 3-4 months. Then on top of that there's the pricing, as I can almost guarantee ATI is going to drop prices a bit more, probably across the board. Give it a month; if we see any major market action, they will. I don't know though, this may be hardcore fanboys only too, in which case there won't be much market movement.

I guess it's just wait and see. I will say one thing for sure, though: if Nvidia does not see some decent turnover on this GPU, I think it's going to hurt. They have been working on it for quite some time, with ATI taking most of the pie on the GPU end money-wise. So they have been putting money into R&D for what, over a year now, with no return; not to mention ATI has also been gaining market share, which is more dangerous than straight money in many cases, I think.

Der Meister
2010-03-26T23:42:29Z

It's a nice-looking card and performs well, though it was not what I was expecting. Nonetheless, I would still trade my two 275s for one of them.

Joel H
2010-03-27T01:56:30Z

Acarzt,

I'll hit the rest of this in the morning, but I'm going to respond to one thing you wrote now. It's physically impossible for NVIDIA to build a dual-GPU GTX 480 on a 40nm process. Board power for the single-GPU flavor of the card is 250W. The maximum amount of power you can feed a PCI-Express card is 300W. NVIDIA isn't going to break the PCI-Express specification (they'd get no support from motherboard vendors, all of whom would have to specifically vet their motherboards against a non-standard component).
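The power-budget argument can be restated numerically. Per the PCI-SIG specs, the slot itself supplies up to 75W, a 6-pin auxiliary connector 75W, and an 8-pin connector 150W; the sketch below just works through Joel's numbers:

```python
# PCI Express power limits per the PCI-SIG spec: 75 W from the slot,
# 75 W per 6-pin connector, 150 W per 8-pin connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def board_power_ceiling(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector layout."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# The GTX 480's 6-pin + 8-pin layout tops out at the 300 W spec limit.
assert board_power_ceiling(six_pins=1, eight_pins=1) == 300

# With a 250 W board power, a naive dual-GPU GTX 480 would need ~500 W,
# far beyond the 300 W ceiling -- hence Joel's point that NVIDIA would
# have to shave 50-60 W per GPU before a dualie is even possible.
assert 2 * 250 > board_power_ceiling(six_pins=1, eight_pins=1)
```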

It's possible that these first parts are terrible examples of 40nm, so yes, NVIDIA could re-spin the silicon. Realistically, that won't happen for at least six months; NVIDIA has already respun it (and TSMC's yields aren't exactly fabulous even now). Farther out, Fermi probably will go through a die shrink at some point in 2011 once 28nm tech is ramping, but that's a year away.

Sure, NV could opt to build a svelte mid-range dual-GPU, but that wouldn't be a dual GTX 480. It's very likely that we'll see NVIDIA roll a full 512-core Fermi and I think it's a good bet that they'll build a Fermi 2.0 on an improved 40nm die, but you can only rearchitect so much. If we look at the GTX 285 / GTX 295 as indicative, NV would need to pull the power consumption of a GTX 480 down by 50-60W to have enough room in the PCI-E spec to build a dualie. The only time you see leaps that massive on the same process, maybe, is if you compare A0 hardware to the mature, high-yield silicon you're building 1-2 years later.