
View Full Version : The INQ take on the GTX 280/260



heatlesssun
05-24-08, 02:25 PM
For the AMD fans out there, you might be encouraged by this:
http://www.theinquirer.net/gb/inquirer/news/2008/05/24/gtx260-280-revealed

Good old Charlie is what you might call a biased fellow when it comes to nVidia and Microsoft, so I guess we'll see. Looks like he's saying that the GTX 280/260 is going to cost more and be slower than the RV770 parts.

I was planning on picking up two 280's but if this is true I may wait for the 4870X2.

So what do you think?

Maverickman
05-24-08, 02:36 PM
It's way too early to tell which cards are going to be faster. This guy sounds like an ATI fanboy. When we see the benchmarks this summer, then we can make a decision. One thing he said is very correct: it's going to be an interesting summer!

evox
05-24-08, 03:23 PM
Charlie Demerjian. Biggest douchebag in INQ staff, and that's saying something considering it's INQ we're talking about.

Die.

Runningman
05-24-08, 03:25 PM
I want to believe in ATI, but their track record isn't all that great.
I don't know how that INQ guy can say either way who is going to win this round. At this point no one knows the performance of any unreleased card yet...

Dazz
05-24-08, 03:28 PM
If you guys have kept up, the RV770 is the single-GPU part while the R700 is a dual-GPU card, so it's like a 9800GX2 vs. a 9800GTX. Yes, you can expect the R700 to come out on top, since it's two GPUs on one card vs. a single one.

Lfctony
05-24-08, 04:00 PM
This guy is such an idiot... Anyway, we'll have the 4870 and the 4870 X2. The X2 (two GPUs on one card) will be competing with the GTX 280 I suppose (2.5 9800GTXs vs 1.5 9800GX2s***), but I guess the GTX 260 will be a lot faster than the 4870. Of course, the price of the 4870 could let it compete, since it's expected to beat the 9800GTX by 25%.


***
4870 = 1.25 × 9800GTX
4870 X2 = 2.50 × 9800GTX
9800GX2 = 1.70 × 9800GTX
GTX280 = 1.5 × 9800GX2 = 2.55 × 9800GTX

That's what I've been reading...
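[Editor's note: taking the rumored multipliers above at face value — these are forum speculation, not measured benchmarks — the standings can be sanity-checked with a few lines of Python. All figures are normalized to a single 9800 GTX.]

```python
# Speculative relative-performance figures from the thread,
# normalized to one 9800 GTX = 1.0. None of these are measured
# numbers -- they are rumor-mill estimates only.
baseline = 1.0                 # 9800 GTX
hd4870 = 1.25 * baseline       # rumored ~25% faster than a 9800 GTX
hd4870_x2 = 2 * hd4870         # dual GPU, assuming perfect scaling
gf9800_gx2 = 1.7 * baseline    # dual-GPU card, imperfect scaling
gtx280 = 1.5 * gf9800_gx2      # rumored 1.5x the 9800 GX2

print(f"4870    : {hd4870:.2f}x")     # 1.25x
print(f"4870 X2 : {hd4870_x2:.2f}x")  # 2.50x
print(f"9800 GX2: {gf9800_gx2:.2f}x") # 1.70x
print(f"GTX 280 : {gtx280:.2f}x")     # 1.5 * 1.7 = 2.55x
```

On these numbers the 4870 X2 and GTX 280 land within rounding error of each other, which is why the thread can't agree on a winner.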

Ninja Prime
05-24-08, 05:54 PM
He does sound like a fanboy, but he seems to have solid info on the clock rates and such. The end result seems to be slightly lower clock rates with almost twice as many stream processors. I'd imagine it will end up somewhere around 75% faster than an 8800 Ultra in terms of shaders.

However, the ROPs should make it steadily outpace an 8800 ultra more and more as you go up in resolutions and AA, I'd imagine in something like 2048x1536 with 4x AA it would be well over 2x as fast.
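[Editor's note: the "~75% faster in shaders" guess follows from simple scaling if you assume throughput is proportional to stream processors × shader clock. The 240-SP count is the rumored GTX 280 figure; the GTX 280 clock below is purely an illustrative guess, not a leaked spec.]

```python
# Rough shader-throughput scaling: throughput ~ SP count x shader clock.
# 8800 Ultra figures are the known shipping specs; the GTX 280 SP count
# is the rumored figure and its shader clock is a guessed placeholder.
ultra_sps, ultra_clock = 128, 1512      # 8800 Ultra: 128 SPs @ 1512 MHz
gtx280_sps, gtx280_clock = 240, 1400    # rumored SPs, guessed clock (MHz)

ratio = (gtx280_sps * gtx280_clock) / (ultra_sps * ultra_clock)
print(f"Estimated shader throughput: {ratio:.2f}x the 8800 Ultra")
# 240*1400 / (128*1512) ≈ 1.74x -- in the ballpark of "~75% faster"
```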

bacon12
05-24-08, 06:27 PM
Waiting for benchies from real game performance.

shabby
05-24-08, 06:55 PM
What a nut hugger, bashing the product before it's even out. I wonder if he trashed the G80 too before it came out.

heatlesssun
05-24-08, 06:57 PM
He does sound like a fanboy, but he seems to have solid info on the clock rates and such. The end result seems to be slightly lower clock rates with almost twice as many stream processors. I'd imagine it will end up somewhere around 75% faster than an 8800 Ultra in terms of shaders.

However, the ROPs should make it steadily outpace an 8800 ultra more and more as you go up in resolutions and AA, I'd imagine in something like 2048x1536 with 4x AA it would be well over 2x as fast.

Yeah, he might have the numbers, but I don't know how he's translating them into performance. I do hope we start seeing some leaked benches soon. What really doesn't make sense is how the GTX 280 is supposed to cost $600+ and be a lot slower than a cheaper 4870 X2; that simply won't work.

mtl
05-24-08, 07:38 PM
The Inq did say the G80 wouldn't be as fast as the 2900XT.
What idiots!

NoWayDude
05-24-08, 07:40 PM
Is this the same guy that said the G80 was a DX9-only card, wasn't a unified architecture, and so on and so forth?

And that R600 was going to walk all over G80, etc,etc?

Charlie has a personal axe to grind against nVidia, so expect more news blurbs like this.
At this moment, he knows exactly as much as we do. Jack ****.

weeds
05-24-08, 07:44 PM
Charlie is such a tool.

BenchmarkReviews article on editors day has a few blurbs about GT2XX

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=178&Itemid=1&limit=1&limitstart=1
Jason Paul, the GeForce Product Manager and NVIDIA veteran, came right out and dropped the new product bomb: the GeForce GTX 200 graphics platform. Perhaps it was the off-interest discussion of CUDA which lowered the attention span, but Jason's brief discussion exposed that the new GPU plays Crysis "damn fast". There wasn't any time wasted, and Jason quickly brought up Tony Tamasi to introduce the GTX 200 compute architecture. Unfortunately, the non-disclosure agreement Benchmark Reviews honors with NVIDIA prevents me from disclosing the details for this new GPU.

So you might be wondering what Jason's holding in the image above, right? It's large, almost the size of an original Pentium processor, except that this particular item has 240x the number of cores inside of it. I would love to tell you what it is, but suffice it to say there's a good reason why Mr. Paul has a smile on his face. It put one on my face, too. Benchmark Reviews will reveal more at 6AM PST on June 17th, 2008.

http://benchmarkreviews.com/index.php?option=com_content&task=view&id=178&Itemid=1&limit=1&limitstart=3
Just wait until June 17th when the GTX 200 series of GPU's launch, and you'll start asking yourself when you last witnessed such a dramatic technology improvement. If you thought the GeForce 8 series blew the 7-series out of the water, this is going to leave you in shock. That's not my own marketing spin... Benchmark Reviews is presently testing the new GeForce video card.

shabby
05-24-08, 08:05 PM
Just wait until June 17th when the GTX 200 series of GPU's launch, and you'll start asking yourself when you last witnessed such a dramatic technology improvement. If you thought the GeForce 8 series blew the 7-series out of the water, this is going to leave you in shock. That's not my own marketing spin... Benchmark Reviews is presently testing the new GeForce video card.

:drooling:

walterman
05-24-08, 08:14 PM
I think that if this new card can't run my fav game at 1920x1200 SSAA 2x at 60fps min frame rate, i'm going to be seriously disappointed & prolly skipping it.

Lfctony
05-25-08, 12:01 AM
:drooling:

Well, the last time was when the 6800/X800 cards launched: a 100%+ performance improvement over the previous 9800XT. Damn, those were some fast cards. If we get a 100% improvement over the 8800U, which is the fastest single card now, I'll be thrilled. :)

mojoman0
05-25-08, 12:05 AM
How could they possibly win when nVidia's had a year and a friggin' half to come out with new tech!?

heatlesssun
05-25-08, 12:14 AM
I emailed Charlie at the INQ who wrote this:http://www.theinquirer.net/gb/inquirer/news/2008/05/24/gtx260-280-revealed

He responded back and seems to have seen or heard about numbers on both sides and to quote, "Two 770s on a card will kill the 280, and NV can't do 2 x 280s for power reasons, and would be hard pressed to do 2 x 260s for the same reason."

So at this point, who's king is very much in question. But it's starting to sound like both AMD and nVidia are going to have hardware that crushes anything and everything. In real-world performance terms it may not matter that much.

Oh how I am looking forward to mid June!:D

Ninja Prime
05-25-08, 12:27 AM
I emailed Charlie at the INQ who wrote this:http://www.theinquirer.net/gb/inquirer/news/2008/05/24/gtx260-280-revealed

He responded back and seems to have seen or heard about numbers on both sides and to quote, "Two 770s on a card will kill the 280, and NV can't do 2 x 280s for power reasons, and would be hard pressed to do 2 x 260s for the same reason."

So at this point, who's king is very much in question. But it's starting to sound like both AMD and nVidia are going to have hardware that crushes anything and everything. In real-world performance terms it may not matter that much.

Oh how I am looking forward to mid June!:D

That's exactly what I speculated. I think ATI's smaller, faster, more power-efficient model might start to pay off this generation. Nvidia's gone too far on the 65nm process and is stretching its limits. ATI is on 55nm now and has better power management and better transistor efficiency. Nvidia will have the more powerful single core, but price/performance looks like it's going to ATI this round.

hirantha
05-25-08, 12:30 AM
I think people with Intel boards will benefit from the price and performance the new ATI cards have to offer, especially the ability to CrossFire.

Runningman
05-25-08, 12:32 AM
That's exactly what I speculated. I think ATI's smaller, faster, more power-efficient model might start to pay off this generation. Nvidia's gone too far on the 65nm process and is stretching its limits. ATI is on 55nm now and has better power management and better transistor efficiency. Nvidia will have the more powerful single core, but price/performance looks like it's going to ATI this round.
Now that you mention it, why wouldn't NV jump to the 55nm node? I'm starting to wonder if this is all a ploy with regard to the leaked specs...

heatlesssun
05-25-08, 12:41 AM
I've got a Foxconn P35 CrossFire board and a Q6600 I need to put back together. I may end up going CrossFire on that setup and SLI 280s with my sig rig, just to keep things honest. If all this is true, I don't see the GTX 280 costing more than $500. $600 for a slower, hotter part is going to be a problem for nVidia, assuming the 4870X2 is $500 and all that.

It's been almost 4 years since ATI/AMD has had a clear lead over nVidia. Maybe this is it, but something's not adding up with all of this. I just don't see how nVidia can have the generational leap the BenchmarkReviews guys are describing and then AMD one-upping that with a CrossFire card, unless AMD has done something dramatic with CrossFire.

Ninja Prime
05-25-08, 12:53 AM
now that you mention it, why wouldnt NV jump to the 55nm node. I'm starting to wonder if this is all a ploy with regards to the leaked specs....

Nah, boards are already manufactured and ready to go once they get enough made. They are 65nm. I'm sure this was done as a cost saving measure.

DSC
05-25-08, 12:55 AM
G80 was too big and too power hungry, many said that at launch. But look at the payoff for those that bought it on day 1: they got at least a year-plus of useful life out of it, UNHEARD of in this very competitive sector. Just ignore Charlie; he's the biggest ahole writing anti-Nvidia articles, and he can't even get his facts or math right on the GTX 2x0.

Both companies are launching within days of each other, review the facts and not pointless drivel from biased idiots like Charlie.

Ayepecks
05-25-08, 01:02 AM
No one answered this in the last thread, but these cards are the cards that used to be called the GT200 series, correct? At least, that'd make sense...