Nvidia may be in trouble according to this article



shadow001
02-19-10, 02:46 PM
Here are some of the weird pictures:


http://www.semiaccurate.com/static/uploads/2009/10_october/Fermi_end_plate_cropped.jpg


You see the wood screws, a lower ventilation opening that just doesn't make sense at all, and only a single DVI port, not the usual two.

http://www.semiaccurate.com/static/uploads/2009/10_october/Fermi_back_cropped.jpg


Yet the PCB shown clearly has the solder points for two DVI ports, not just one.


http://www.semiaccurate.com/static/uploads/2009/10_october/Fermi_power_cropped.jpg


And the famous one in question... You see the two PCIe power connectors, one coming out of the top of the card and the other out of the very end, and they don't even line up with the solder points used to make the electrical connection itself.

The 6-pin power connector is also missing four of its solder points, which suggests the PCB was simply cut down, and even the certification stickers along the lower section are cut in half.

XMAN52373
02-19-10, 02:58 PM
A prototype is still a working sample no matter what, even if it isn't operating at the final specifications the retail cards would ship with... That card wasn't working in any shape or form and didn't even have a Fermi GPU in it to begin with, and Nvidia's CEO never said it was a mockup of what Fermi would look like... He simply stated that this was Fermi right here, which is a big difference.

Actually, yes, he did say it was a mockup the very next day, when he was asked about it directly during day 2 of the Nvidia event. If I'm not mistaken, it was either Theo of BSN or Rys of B3D who asked directly about the board he held.

shadow001
02-19-10, 07:33 PM
Actually, yes, he did say it was a mockup the very next day, when he was asked about it directly during day 2 of the Nvidia event. If I'm not mistaken, it was either Theo of BSN or Rys of B3D who asked directly about the board he held.


Only after those pictures were taken and the conclusion was reached that it could only be a mockup, though... On the day the card was presented, he didn't mention it was a mockup at all; he presented it as the actual Fermi card in question, so his confirmation leans more towards damage control than anything else.

Johnny C
02-19-10, 08:38 PM
A prototype is still a working sample no matter what, even if it isn't operating at the final specifications the retail cards would ship with... That card wasn't working in any shape or form and didn't even have a Fermi GPU in it to begin with, and Nvidia's CEO never said it was a mockup of what Fermi would look like... He simply stated that this was Fermi right here, which is a big difference.

What would be the point of saying "Here's a mockup of what Fermi might look like....."

Might as well have said...."here's the pretty picture I drew of me holding a Fermi"

I think Hector Ruiz should go to jail.

But it doesn't change the fact that Jen....made himself look like a fool....and he's now perceived as less than honest.

Next year I'm sure someone else will do something stupid and Jen's silliness will be forgotten.... but any company that has the gall to publish cartoons about Intel should have a better code of ethics than to claim a poorly constructed mockup.... is the real deal.

Pot and Kettle...

Stone and glass houses...

shadow001
02-19-10, 09:20 PM
What would be the point of saying "Here's a mockup of what Fermi might look like....."

Might as well have said...."here's the pretty picture I drew of me holding a Fermi"

I think Hector Ruiz should go to jail.

But it doesn't change the fact that Jen....made himself look like a fool....and he's now perceived as less than honest.

Next year I'm sure someone else will do something stupid and Jen's silliness will be forgotten.... but any company that has the gall to publish cartoons about Intel should have a better code of ethics than to claim a poorly constructed mockup.... is the real deal.

Pot and Kettle...

Stone and glass houses...



He could have said it anyhow, since implying it was a working sample only to delay it 6+ months causes even more problems in the long run. And it's not the first time he's pulled stunts like these, or made comments about GPUs over the past several years that were less than accurate or even grossly exaggerated. That's not just my opinion either, but that of actual developers who've said the same. On the other hand, it's his job to make his company look as good as possible all the time, regardless of whether what he says is bull****.


Here's another example: Nvidia had the first DX10 GPUs out well before ATI had theirs, and only the higher-end versions too, and even before Vista was officially available or any DX10 games were out, he made a huge deal out of being first on the market with DX10 products regardless... Many times, in fact.


Now the shoe is on the other foot: it's ATI that has the first DX11 GPUs out for the entire market, from $60 budget cards all the way up to $600+; both Windows Vista and Windows 7 have DX11 support; there actually are a few games that use at least some DX11 features; and ATI has already shipped well over 2 million GPUs to their customers while Nvidia hasn't shipped a single one. Yet he had the gall to say that ATI's lead is insignificant overall... It's pure damage control, and a very stupid thing to say when you think about it.

john19055
02-19-10, 11:28 PM
Even if he is only half right, then it doesn't look good for the GTX 480 Fermi. I hope to God he's wrong and Nvidia has everything worked out; it sounds like a killer card. But if the quantities are worse than ATI's were when they first came out, then it doesn't look good for Nvidia. If they only have 10,000 cards at launch time, it will be months and months before most people can get one, if they can get one at all. We will find out more Monday, and I hope it's not the doom and gloom I just read. I know they are having problems or the card would not be so late. It looks like the price is going to be around or over $800 for a GTX 480. If that is the case then I will just go with two HD 5850s.

I hope he's wrong because I like Nvidia, but I have nothing against ATI/AMD. I am building my mother a budget AMD system; it's been a while since I used an AMD chip, but the prices are just too good and I want to see what a cheap AMD system can do. I went with an AMD Athlon II X2 240 Regor 2.8GHz for $57 and an open-box ASRock A790GMH/128 for $53.50 (I already have a great heatsink fan), plus an open-box SAPPHIRE HD 4850 with 512MB of GDDR3 on a 256-bit interface for $65. I already had four 1GB sticks of Corsair Dominator 1066 DDR2, a SATA Seagate 250GB hard drive, and Samsung and Sony DVD burners. I have a Creative Audigy, but I am going to use the onboard sound, since she just has two cheap speakers. I am curious to see what it will do; I will overclock the heck out of it. Most seem to overclock to 3.5GHz with the cheap stock cooler at default voltage; I will be trying for 3.8GHz. Not a bad system for under $200.

To get back on topic, I sure hope what he is saying is wrong and they have a 512-shader Fermi working at 750/5000 and a million ready to ship on launch day at a $600 price tag. It sure would be nice, but either way I have waited long enough, and if two HD 5850s are the better deal then I will go with ATI this time around, and hopefully next time, in about a year, Nvidia will have a worthy part out.

Muppet
02-20-10, 01:18 AM
Well it's only a couple of days away from the announcement. Not long to wait. :)

Rollo
02-20-10, 08:35 AM
I don't believe anyone reads Charlie Demerjian at all. Charlie isn't a smart person, and he's not a journalist. He's a rumor-monger who only posts rumors he thinks will damage NVIDIA.

He does this because NVIDIA removed his press privileges due to his misuse of his press privileges. This angered Charlie, so since then he's been acting like a teenager who was left home on prom night because their beloved found someone prettier/more handsome to escort.

Is Fermi "late"? I guess so, but I'd note that we're talking about inventions here, and you can't really schedule when they're done to coincidentally occur the same day your competitor launches a product. These things are in development for years.

Has being "late" damaged NVIDIA, like all the ATi fans said it would? Depends how you look at it. On one hand, NVIDIA actually gained desktop marketshare on ATi last quarter, and made more money than ATi the last two quarters. (so I guess the "uber triumph" of the 5XXX series didn't help them much) OTOH, it's pretty obvious NVIDIA would have been more successful if the Fermi based products were in the market those two quarters.

Should people wait another month to buy a Fermi? Depends: if you need a card right now because your old one died, probably not. If you just feel like upgrading, it depends on whether you want to be limited in running AA in UE3 games, go without PhysX effects, or go without a true 3D option (not to mention lower DX11 performance).

An ATi 5XXX purchase will arguably be the "best you can buy" till next month; then they're going to look pretty dated. This is the trade ATi made releasing an evolutionary GPU: they got half a year with little competition at the high end, and now their products are going to look dated.

ShiningArcanine
02-20-10, 11:25 AM
Remember that Charlie said several months ago that for the November date to be possible at all, they'd have to get all the features, clocks, and yields for Fermi working the way they wanted from the very first revision of the chip, which is something that pretty much never happens on the first revision of anything, never mind one with 3 billion transistors on a brand-new 40nm fabrication process.


Then there was the little fact that when he made that statement in late September, as some of the GPGPU-related features of Fermi were officially revealed one week after ATI unveiled the HD5870 cards, they didn't even have the first revision of Fermi back from TSMC yet. So how can one make availability predictions on something they didn't even have working silicon of as of late September?


Remember the hastily made card that Nvidia's CEO was proudly holding and showing to the crowd, stating that it was Fermi-based, when it clearly wasn't, given the sawed-off PCB and the wood screws holding the heatsink in place... Nvidia's CEO lied about that, plain and simple... He's basically a top-flight bull****ter of the worst degree to be a willing part of a stunt like that.

I refuse to believe that Cypress is a superior product to Fermi. Fermi is a true general-purpose GPU with C++ support and an IEEE 754-compliant floating-point implementation, which Cypress is not.

I think this situation parallels Betamax versus VHS, where Fermi is Betamax and Cypress is VHS, which is sad. I really wanted to get my hands on a Fermi processor to program it. If this is true and I am lucky, Nvidia will have Fermi II out before I graduate next spring.
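
To give a sense of what that C++ support means in practice, here is a minimal sketch of the kind of templated kernel CUDA's compiler accepts (purely illustrative; the kernel and names are my own, not from any vendor material, and it assumes an nvcc toolchain and a CUDA-capable card):

// axpy.cu -- build with: nvcc axpy.cu -o axpy
// Plain C++ templates compiled for the device: the "C++ support"
// referred to above. On Fermi-class parts, the multiply-add below
// runs on hardware advertised as IEEE 754-compliant.
template <typename T>
__global__ void axpy(int n, T a, const T* x, T* y)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = a * x[i] + y[i];
}

int main()
{
    const int n = 1 << 20;
    float *x = 0, *y = 0;
    cudaMalloc(&x, n * sizeof(float));   // device allocations
    cudaMalloc(&y, n * sizeof(float));
    // (a real program would copy input data in with cudaMemcpy first)
    axpy<float><<<(n + 255) / 256, 256>>>(n, 2.0f, x, y);
    cudaDeviceSynchronize();             // wait for the kernel to finish
    cudaFree(x);
    cudaFree(y);
    return 0;
}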

Xion X2
02-20-10, 12:42 PM
Should people wait another month to buy a Fermi? Depends: if you need a card right now because your old one died, probably not. If you just feel like upgrading, it depends on whether you want to be limited in running AA in UE3 games, go without PhysX effects, or go without a true 3D option (not to mention lower DX11 performance).

An ATi 5XXX purchase will arguably be the "best you can buy" till next month; then they're going to look pretty dated. This is the trade ATi made releasing an evolutionary GPU: they got half a year with little competition at the high end, and now their products are going to look dated.

I think you exaggerate how "dated" they're going to look. Fermi will probably outpace the 5870 in performance, but likely not the 5970. With ATI still having the fastest card on the market, this lessens how "dated" they'll look in the eyes of consumers.

And 3D is no longer just available with Nvidia. ATI has opened up their drivers to allow this with their hardware in Catalyst 10.3. The difference is that it's open rather than proprietary like Nvidia's solution.

3D Stereoscopic support

While ATI won't be offering their own 3D support like NVIDIA does with its 3D Vision, they will be updating their D3D driver to enable 3rd party middleware vendor support such as iZ3D. They'll be able to offer support for 120 Hz screens and output 60 Hz per eye.


http://www.tweaktown.com/articles/3140/future_ati_catalyst_drivers_why_you_should_be_excited/index4.html

There are advantages and disadvantages to this tech being proprietary. One advantage would be that Nvidia's 3D is in-driver, just as Eyefinity is in-driver for ATI, and doesn't require 3rd-party support (like Matrox-to-go does with Nvidia). I'm kind of on the fence about it. I guess I'd prefer that ATI have some sort of in-driver 3D option, but at least it's possible to do it, which undoubtedly mitigates the clear advantage that Nvidia once had in this area.

Johnny C
02-20-10, 01:05 PM
I think you exaggerate how "dated" they're going to look. Fermi will probably outpace the 5870 in performance, but likely not the 5970. With ATI still having the fastest card on the market, this lessens how "dated" they'll look in the eyes of consumers.

And 3D is no longer just available with Nvidia. ATI has opened up their drivers to allow this with their hardware in Catalyst 10.3. The difference is that it's open rather than proprietary like Nvidia's solution.



http://www.tweaktown.com/articles/3140/future_ati_catalyst_drivers_why_you_should_be_excited/index4.html

There are advantages and disadvantages to this tech being proprietary. One advantage would be that Nvidia's 3D is in-driver, just as Eyefinity is in-driver for ATI, and doesn't require 3rd-party support (like Matrox-to-go does with Nvidia). I'm kind of on the fence about it. I guess I'd prefer that ATI have some sort of in-driver 3D option, but at least it's possible to do it, which undoubtedly mitigates the clear advantage that Nvidia once had in this area.

Wow,

Good news then....thanks for posting this up...

Rollo
02-20-10, 01:08 PM
I think you exaggerate how "dated" they're going to look. Fermi will probably outpace the 5870 in performance, but likely not the 5970. With ATI still having the fastest card on the market, this lessens how "dated" they'll look in the eyes of consumers.

And 3D is no longer just available with Nvidia. ATI has opened up their drivers to allow this with their hardware in Catalyst 10.3. The difference is that it's open rather than proprietary like Nvidia's solution.

"Dated" is more than just speed. When the 5870 launched, the GTX295 was still the fastest single card on the market. I didn't notice you saying "NVIDIA is still lookin' good with the fastest card on the market!" Xion. (or did I miss that?) ;) "Dated" also has to do with being an outdated arch that will be far outpaced per GPU in DX11, GPGPU, physics, and mutli panel 3d support.

Can you link us to some reviews of ATi 3D on 120 Hz panels with shutter glasses? Or is it still just the same ol' same ol' profile-driven iZ3D dual-plane monitors? Now, by "opening up" their drivers, are they saying you can use ANY shutter glasses?

Last, the "fastest card" is a temporary situation. Rumor has it dual Fermi launches in May. My money is on the rumor. ;)

Before summer begins, we'll be back where we left off last summer: NVIDIA leading single and multi GPU performance and features, ATi competing on price.



There are advantages and disadvantages to this tech being propietary. One advantage would be that Nvidia's 3D is in-driver just as Eyefinity is in-driver for ATI and doesn't require 3rd party support (like Matrox-to-go does with Nvidia.) I'm kind of on the fence about it. I guess I'd prefer that ATI have some sort of in-driver 3D option, but at least it's possible to do it which undoubtedly mitigates the clear advantage that Nvidia once had in this area.

3rd-party support can be sketchy, and are there any shutter-glass makers currently gearing up to use the "open" support? I'd think ATi would have told us if there were.

Personally I think this arrangement serves the market particularly well:

People who want the features, or "best of best" performance, will have NVIDIA cards to buy for more money.

People who still want very high performance, coupled with a few fewer features, can save money with ATi cards.

Everybody wins if the market plays out as I expect.

Xion X2
02-20-10, 01:36 PM
"Dated" is more than just speed.

Your average consumer prioritizes a benchmark graph. Your average consumer looks to see which card is out in front. Nvidia and ATI both know this which is a key reason they keep releasing dual-GPU cards each generation.

The enthusiast crowd, such as you or I, knows there are underlying features that have an impact on how dated a technology is. That's not what I'm arguing. When you said that ATI's 5xxx line is "going to look dated," that's a generalized statement that applies to the masses. It doesn't. Average consumers don't care as much about how it's done as long as it's done. They're going to flip to Anandtech or FiringSquad, see the 5970 out in front, and assume that it's the best card on the market--features be damned.

When the 5870 launched, the GTX295 was still the fastest single card on the market. I didn't notice you saying "NVIDIA is still lookin' good with the fastest card on the market!" Xion. (or did I miss that?) ;)

I didn't say it, so that means that I had some sort of issue with the 295 or Nvidia? That's some twisted logic that you're using.

In fact, I've repeatedly said that I like SLI and that it's been mostly transparent for me whenever I've used it:

SLI is transparent for me.. works nearly as hassle-free as a single GPU and scales well. It's necessary, for me, with games like Crysis and Clear Sky that need more GPU power to run well.

http://www.nvnews.net/vbulletin/showpost.php?p=2045157&postcount=7

I wish that you would stop trying to turn every graphics discussion into an Nvidia vs. ATI thing like one of us has to play for a certain team. The only thing I am interested in are the facts. I could care less about supporting one company or the other as should be evident from my hardware purchases over the past few years.

"Dated" also has to do with being an outdated arch that will be far outpaced per GPU in DX11, GPGPU, physics, and mutli panel 3d support.

You're talking a year or two down the road before we start to see many DX11 games on the shelf. Average consumer doesn't care about that. Average consumer cares about what performance he sees at the present time.

Can you link us to some reviews of ATi 3D on 120 Hz panels with shutter glasses? Or is it still just the same ol' same ol' profile-driven iZ3D dual-plane monitors? Now, by "opening up" their drivers, are they saying you can use ANY shutter glasses?

I'm not going to debate the specifics of 3D because, honestly, I don't know that much about it and don't currently use it. I'll let whoever here is interested do the research if they'd like to know what advantages/disadvantages there are in Nvidia's solution as compared to ATI's.

Last, the "fastest card" is a temporary situation. Rumor has it dual Fermi launches in May. My money is on the rumor. ;)

I'm not going to base my purchasing decisions on "rumors," and I'd advise others not to, either. A single Fermi chip is already pushing 280W, so given that the PCIe spec allows 300W or under for a graphics card, it's going to be challenging for them to release a dual-GPU product that will outpace the 5970, as they have a power ceiling to work with.
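
For reference, that ceiling comes from the PCI Express power delivery limits, so the arithmetic (my numbers from the spec, combined with the ~280W figure quoted above) works out roughly as:

75 W (x16 slot) + 75 W (6-pin plug) + 150 W (8-pin plug) = 300 W maximum board power

Two chips at ~280 W each would want ~560 W, so even ignoring memory and VRM overhead, a dual-Fermi board would have to hold each GPU to under 150 W, roughly half the quoted single-chip draw, through lower clocks, lower voltage, or disabled units.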

Will they? Possibly. But regardless of whether they do or not, you said it would be "next month" when the 5xxx series would look "dated," and if Nvidia doesn't release a dual-GPU Fermi until May, then that statement becomes less and less viable.

Before summer begins, we'll be back where we left off last summer: NVIDIA leading single and multi GPU performance and features, ATi competing on price.

Possibly. Or possibly not. It's just speculation at this point to say that Nvidia will be leading in multi-GPU when there are no specs or even an official product announcement for a dual-GPU card yet. And given that it'd need to slide in at 300W or below, when their single-GPU version is already in the vicinity of 250-275W, it becomes even more difficult for them to take back that performance crown.

3rd party support can be sketchy

Yep, it's just a good thing that Nvidia has that 3rd party support (Matrox-to-go) for multi-monitor support to match ATI's in-driver Eyefinity.

Rollo
02-20-10, 04:45 PM
I have both of the ONLY 3D solutions out for both platforms (unlike the couple of people who said AMD has another way when they do not know shiat about what they are talking about). Both systems work WELL. They're both totally different, but both work.

Nvidia has 3-monitor support in their new drivers without 3rd-party hardware needed.

AMD does NOT have open 3D like you think. Period. They might plan to, but you cannot buy hardware to do it now unless you buy iZ3D.

You like the iZ3D DB? I'd read some less-than-flattering reviews of it, but also saw some people return it over at Rage3D, so to date I've been negative on it. If you say it works well, I'll have to reconsider; I hadn't seen a person as high on it as you apparently are.

Rollo
02-20-10, 05:04 PM
XionX2-
I agree we don't need to have an "ATi vs NVIDIA" battle. First and foremost, there's not a lot to discuss till everyone can buy a Fermi and the benches, final clocks, etc. are known.

Second, I don't think we really have anything to argue about on the features front- NVIDIA just wins that hands down.

They have 3D Vision that can be used with a variety of LCD monitors, projectors, and televisions.

They can do "Eye-finity", they can do it in 3d as well.

They can run C++ natively and are at the forefront of GPGPU, with many handy apps already on the market.

They're the only game in town for hardware accelerated physics effects.

They're the only solution that offers total flexibility for multi GPU, with open profiles. Games launch with profiles rather than users waiting for a patch.

They have the only automatic switching graphics for laptops.

UE3 engine games get AA.

Compared to that, all ATi has is:

They offer six-panel as well as three-panel Eyefinity.

They offer some home theatre capabilities, for those who have costly video cards but lack a Blu-ray player.

They have support for iZ3D monitors.

And that's it for the differences. I agree Joe Public may not use half the stuff, but wouldn't you want it available even if you didn't?

K007
02-20-10, 05:25 PM
lol

pkirby11
02-20-10, 05:40 PM
For some of you it doesn't matter, but to me it comes down to one thing: price/performance. Whichever company has that gets my money. We are in a recession and times are tough for everyone; if you can get a card that performs within range of a card that costs $100+ more, which one is your average user going to buy? Probably the cheaper card. I don't care about "features"; physics has been hyped for years and I've yet to see a game that's made me want to run out and buy it. Second, all these proprietary "features" NVIDIA has actually make me want them less. I hate feeling like I have to buy something more expensive to get it, so I will either a) not buy the game if it's required, or b) buy the game but not worry if I don't have pretty floating papers when I beat up bad guys.

Personally, at this point I don't care if Fermi blows me and cooks me dinner; if the cheapest solution I can get is $100+ more than ATI, I won't be buying it. I don't play games like I once did, and I also don't spend money like I once did. Because of this I ultimately want the best price/performance I can get. NVIDIA doesn't seem to get this; they think that because they have the numbers they can charge massive amounts for their cards. Maybe I'll be wrong, but right now it doesn't matter; I'm waiting till the end of the year to buy anything. By then maybe NVIDIA will have a card on par with ATI's price/performance. But at the end of the day, money is my biggest decision factor. ATI's cards don't suck, they offer great price/performance, and that's all that matters to me.

Good luck to both companies; competition is a great thing. But it doesn't matter how much you defend or trash either company: NVIDIA will lose the minute they price their cards too high. So who cares whether the yields are low or it has C++ programming? I just play games with my graphics card.

Rollo
02-20-10, 05:47 PM
For some of you it doesn't matter, but to me it comes down to one thing: price/performance. Whichever company has that gets my money. We are in a recession and times are tough for everyone; if you can get a card that performs within range of a card that costs $100+ more, which one is your average user going to buy? Probably the cheaper card. I don't care about "features"; physics has been hyped for years and I've yet to see a game that's made me want to run out and buy it. Second, all these proprietary "features" NVIDIA has actually make me want them less. I hate feeling like I have to buy something more expensive to get it, so I will either a) not buy the game if it's required, or b) buy the game but not worry if I don't have pretty floating papers when I beat up bad guys.

Personally, at this point I don't care if Fermi blows me and cooks me dinner; if the cheapest solution I can get is $100+ more than ATI, I won't be buying it. I don't play games like I once did, and I also don't spend money like I once did. Because of this I ultimately want the best price/performance I can get. NVIDIA doesn't seem to get this; they think that because they have the numbers they can charge massive amounts for their cards. Maybe I'll be wrong, but right now it doesn't matter; I'm waiting till the end of the year to buy anything. By then maybe NVIDIA will have a card on par with ATI's price/performance. But at the end of the day, money is my biggest decision factor. ATI's cards don't suck, they offer great price/performance, and that's all that matters to me.

Good luck to both companies; competition is a great thing. But it doesn't matter how much you defend or trash either company: NVIDIA will lose the minute they price their cards too high. So who cares whether the yields are low or it has C++ programming? I just play games with my graphics card.

Times aren't "tough for everyone". I only know one person who's been impacted significantly by the recession, and he chose to switch career paths during it.

Times are pretty much the same or better for everyone else I know.

shadow001
02-20-10, 07:05 PM
I don't believe anyone reads Charlie Demerjian at all. Charlie isn't a smart person, and he's not a journalist. He's a rumor-monger who only posts rumors he thinks will damage NVIDIA.

He does this because NVIDIA removed his press privileges due to his misuse of his press privileges. This angered Charlie, so since then he's been acting like a teenager who was left home on prom night because their beloved found someone prettier/more handsome to escort.

I don't doubt that he's got something against Nvidia, but on the other hand, the release schedule he's laid out over the last 6+ months regarding Fermi has been accurate so far. So whatever his sources are, they've been giving him the correct information, not the BS we've been hearing from NV directly, such as that it might be out in late November 2009, stated by none other than Nvidia's CEO himself.



Is Fermi "late"? I guess so, but I'd note that we're talking about inventions here, and you can't really schedule when they're done to coincidentally occur the same day your competitor launches a product. These things are in development for years.

I'm not even asking for the same day either, and things can and do screw up on occasion. Having said that, being 6+ months late is another matter altogether, and long enough that only the most hardcore Nvidia fan would wait for it, or would still buy it even if it's not the fastest card on the market when it eventually does get released... I want the fastest card for my money, thanks.



Has being "late" damaged NVIDIA, like all the ATi fans said it would? Depends how you look at it. On one hand, NVIDIA actually gained desktop marketshare on ATi last quarter, and made more money than ATi the last two quarters. (so I guess the "uber triumph" of the 5XXX series didn't help them much) OTOH, it's pretty obvious NVIDIA would have been more successful if the Fermi based products were in the market those two quarters.

Like you said, it depends on how you look at it... What I do know is that every HD5*** card ATI has managed to sell actually sells at a premium, with a nice profit margin on each one, unlike Nvidia, which is likely being forced right now to seriously lower the prices of all its current cards, reducing its profit margins considerably, even if the end user gets great deals out of it.



Should people wait another month to buy a Fermi? Depends: if you need a card right now because your old one died, probably not. If you just feel like upgrading, it depends on whether you want to be limited in running AA in UE3 games, go without PhysX effects, or go without a true 3D option (not to mention lower DX11 performance).


In case you haven't realised yet, people have been waiting for 5 months now, not just one like you state. As for GPU-based PhysX, it has only been used in a relatively small number of games, the effects themselves aren't exactly mind-blowing, and it really requires an SLI setup to maintain good FPS (this is stated by users with Nvidia cards, BTW). And Fermi's DX11 performance remains to be seen across a large variety of games, not specific ones at cherry-picked settings, if you know what I mean... ;)


An ATi 5XXX purchase will arguably be the "best you can buy" till next month; then they're going to look pretty dated. This is the trade ATi made releasing an evolutionary GPU: they got half a year with little competition at the high end, and now their products are going to look dated.


Strong statement indeed, given that they've gotten a six-month lead on Nvidia and will likely have refresh parts out by the time Fermi is available in decent enough quantities to actually find one on store shelves. And that's ignoring that a brand-new line of video cards (HD6*** series???) might actually be released before the end of this year, because ATI's engineers haven't been sitting on their asses while Nvidia gets its **** together with Fermi here... It would be very naive to think so.

shadow001
02-20-10, 07:09 PM
I refuse to believe that Cypress is a superior product to Fermi. Fermi is a true general-purpose GPU with C++ support and an IEEE 754-compliant floating-point implementation, which Cypress is not.

I think this situation parallels Betamax versus VHS, where Fermi is Betamax and Cypress is VHS, which is sad. I really wanted to get my hands on a Fermi processor to program it. If this is true and I am lucky, Nvidia will have Fermi II out before I graduate next spring.


If you're a developer or somebody who needs CUDA support to run their applications, then you really have no choice but to wait for Fermi, and I'll happily concede that the GPGPU infrastructure is a lot better developed for Nvidia products than it is for ATI's, no question about that.


But most people are just regular users running more common applications, for which CUDA means absolutely squat in real terms, and there's no point in waiting for Fermi in those conditions.

shadow001
02-20-10, 07:15 PM
I have both of the ONLY 3D solutions out for both platforms (unlike the couple of people who said AMD has another way when they do not know shiat about what they are talking about). Both systems work WELL. They're both totally different, but both work.

Nvidia has 3-monitor support in their new drivers without 3rd-party hardware needed.

AMD does NOT have open 3D like you think. Period. They might plan to, but you cannot buy hardware to do it now unless you buy iZ3D.


You forgot a little issue there, I think: triple monitors can be driven by a single card with ATI, whether it's 2D or 3D, while with Nvidia you need an SLI setup, as each card only has 2 outputs... ;)


And users can pick the display size they want and the games they play most often, and choose a model of card suited to that too; they don't have to automatically buy the fastest HD5*** card to get decent performance.

Muppet
02-20-10, 07:28 PM
Anyone who would want to run three monitors in 3D with one GPU is delusional anyway. Heck, I would want CrossFire X or SLI for more than two monitors even without 3D Surround. Keep in mind, I am the guy who tried to get four 8800 GTXs running when I did the Skulltrail Waterbox project for Intel.

And to be honest I really think that 3 screens would totally ruin the immersion of 3D with the 2 bezels showing. I would much rather have one large 40" - 50" screen.

shadow001
02-20-10, 07:40 PM
XionX2-
I agree we don't need to have an "ATi vs NVIDIA" battle. First and foremost, there's not a lot to discuss till everyone can buy a Fermi and the benches, final clocks, etc. are known.


I can fully agree with that... It's about time Nvidia stopped talking the talk and started walking the walk, if you know what I mean... Saying it'll be the best thing ever while the weeks and months just pass by with nothing to go on except slides and technical analysis articles only goes so far.


Second, I don't think we really have anything to argue about on the features front- NVIDIA just wins that hands down.

They have 3D Vision that can be used with a variety of LCD monitors, projectors, and televisions.


I'll give them that one for the time being.


They can do "Eye-finity", they can do it in 3d as well.

Like I mentioned in my previous post, it requires an SLI setup, while ATI does it with one card.


They can run C++ natively and are at the forefront of GPGPU, with many handy apps already on the market.

Good if you're a developer or run applications written in CUDA, but that's not the situation for most users and you know it.


They're the only game in town for hardware accelerated physics effects.

That also requires an SLI setup to pull off with good performance anyhow; on a single card, unless it's a GTX295, it gets a little choppy, especially if you like to play with high-quality graphics settings, and we all do, of course.


They're the only solution that offers total flexibility for multi GPU, with open profiles. Games launch with profiles rather than users waiting for a patch.


Actually, the latest Catalyst 10.2 driver set, released just yesterday BTW, can now update CrossFire profiles separately from the driver package itself. As soon as a new game hits store shelves, users will be able to download an updated profile package directly from ATI's driver support site, without having to wait a month for the entire driver package as was the case up to now.


They have the only automatic switching graphics for laptops.

That, I've got to admit, is pretty cool overall.



UE3 engine games get AA.

There's been quite a discussion on that one, since the latest game (Batman: Arkham Asylum) actually does a hardware detection to see if the system has an Nvidia card before allowing it, but ATI GPUs are able to do it just fine, so it's more a marketing thing to let a feature work with one GPU and not the other.


Users can force AA in UE3 games using the control panel with ATI hardware.


Compared to that, all ATi has is:

They offer six-panel as well as three-panel Eyefinity.

They offer some home theatre capabilities, for those who have costly video cards but lack a Blu-ray player.

They have support for iZ3D monitors.

And that's it for the differences. I agree Joe Public may not use half the stuff, but wouldn't you want it available even if you didn't?


If I don't use CUDA, why would I care whether it's supported or not?... I mean, like it or not, open standards are what make the world go round, and the same goes for PhysX or Nvidia's 3D display... Proprietary standards never fly in the long run, since Nvidia isn't the only company out there, and GPGPU, physics, and 3D technology will only take off when there's actually a unified standard that every GPU maker has to abide by. Users know this, and so do the developers that create games, and that's the bottom line.


Given the above, with ATI releasing the HD5*** line way earlier than Nvidia will release Fermi, and with these being outstanding cards for both performance and features, why anyone would wait that long for Fermi is just mind-boggling.


I own a pair of HD 5970s in quad CrossFire, which have a simply unbelievable amount of graphics horsepower, are of course DX11-certified, and make every game out there their personal bitch even at insane settings. And I think you mentioned that a dual-GPU Fermi card that can actually beat these cards may come in May of this year... That's 6+ months after I bought them, so they'd better be faster... ;)

Muppet
02-20-10, 07:41 PM
Same here. Hope we'll see some nice 3D Vision compatible LCD TVs this year.

This is going to be one of the most expensive years to date for hardware. New motherboard, CPUs, Fermi, large-screen 3D TVs and SSDs. :scarey:

shadow001
02-20-10, 07:43 PM
The GPU processing advantage has benefits for many more of us than you might expect: if you do video editing, 3D drawing and rendering, watch videos on YouTube... etc. ;) Nvidia's goal is for you to be able to scrimp on the CPU part of your budget and spend those dollars on the GPU instead. AMD clearly has different ideas, since they are in both markets. ;)

Again, I personally hope AMD does well as I love competition. It benefits us all in the end (wow, nice buttsechs reference I made).


I doubt watching YouTube videos is that stressful, to be honest, but that's just me. And as far as scrimping on CPUs goes, we've already got more than enough CPU power even at the low end; only professional applications actually make use of 4 cores... Most games, even the latest ones, hardly use more than 2, and I'm talking about CPUs costing $200 here, so not exactly high-end stuff.