NVIDIA GF100 Previews

Muppet
02-08-10, 11:32 PM
Well, there's always that I guess, but I'm not in the market for a new system. I made my choice when the HD5970s were released (and it kicks ass, of course), and the news on Fermi was still bad at the time. So let's hope this thing really gets released in March and that availability is actually decent, not just slightly better than a paper launch where a lot of users won't be able to find one even after waiting for it this long.

Your choice was a good one. It is an exceptional performer. Hell, I'd have a couple myself if AMD had a 3D solution. If Fermi performs in between the 5870 and 5970, I'll be happy :). Hopefully closer to the 5970, though.

Maverick123w
02-11-10, 03:48 PM
Let's get back to TV vs. monitor. You can just put the monitor right in your face, right? But let me understand something: if you look at 1080p on something over 32 inches, don't you see pixels?

I sit less than 3 feet from my 32" and don't see any pixels.

XMAN52373
02-11-10, 04:20 PM
Your choice was a good one. It is an exceptional performer. Hell, I'd have a couple myself if AMD had a 3D solution. If Fermi performs in between the 5870 and 5970, I'll be happy :). Hopefully closer to the 5970, though.

The 448SP part will be faster than the 5870 by about 10% on average in normal gaming situations, and closer to the 5970 in others. The 512SP part, which sadly will be mostly MIA till mid-summer if things go the way Nvidia hopes, will be close to 5970 performance.

Ninja Prime
02-11-10, 05:00 PM
I'll believe it when I see some numbers in the hands of reviewers running real benchmarks.

Viral
02-11-10, 05:15 PM
It would still suck if the 448SP part is only 10% faster than a 5870; that would make it pretty easy for ATI to release a 5890 that takes back the single-GPU performance crown.

Xion X2
02-11-10, 08:27 PM
The 448SP part will be faster than the 5870 by about 10% on average in normal gaming situations, and closer to the 5970 in others. The 512SP part, which sadly will be mostly MIA till mid-summer if things go the way Nvidia hopes, will be close to 5970 performance.

That doesn't make much sense. A 5970 will be at least 50-60% faster than a 5870 most of the time, and around 75-80% in cases where Crossfire scales optimally. How does a less than 20% increase in shader count (448 > 512) make up that difference, going from beating the 5870 by only 10% to nearly matching the 5970?

JasonPC
02-11-10, 09:34 PM
If it really is only 10% faster than the 5870, then Nvidia had better undercut ATI's prices. If not, there will be little incentive for me to buy one. In that case I may just buy the ATI card to help support them. I mean, come on. Massive delays, higher power usage, and a much bigger die just to squeak out 10%... ridiculous.

gulizard
02-11-10, 10:19 PM
If it really is only 10% faster than the 5870, then Nvidia had better undercut ATI's prices. If not, there will be little incentive for me to buy one. In that case I may just buy the ATI card to help support them. I mean, come on. Massive delays, higher power usage, and a much bigger die just to squeak out 10%... ridiculous.

Hasn't been confirmed; it's just rumors thus far. No one can say whether the final version will use more power or even run hotter. It's all speculation... and even if it does all of the above right now in its current phase, that may not be true of the final product that the public gets. Them holding back all the information is nerve-wracking. It's one of two things: the card is really crappy, or it's good enough that it's going to blow ATI out of the water. Either way, I highly doubt I'll be purchasing a crappy ATI card. I dislike them and their drivers.

Furthermore, this is an Nvidia fan site, and I've seen way too many fanboys outside of this thread who have just blown up about how Nvidia sucks, with fewer than 100 posts, some of them freshly registered users. Some of the guys who have been here for a while have ATI cards, and that is fine, but it just shows that loyalty to a company is lacking badly. It's whoever does better, I suppose. Nvidia has yet to sell me a card that has not been able to play my games and run my programs smoothly. I don't play musical cards. Some people have the money to do that, and so do I, but I can think of better things to spend my money on than upgrading for a 10% increase. And honestly, if it does 10% better in any game, then Nvidia has still won for now as far as performance goes.


I can remember someone taking identical shots on an Nvidia card vs. an ATI card. The Nvidia card was producing a fog that wasn't showing up on ATI cards; in fact, in a very popular game called AVP2, the ATI drivers would remove the fog from the predator vision and from the maps, making the entire game cheap for users with crappy ATI cards. I remember people showing AA on ATI cards side by side with Nvidia cards, and the Nvidia card having a better-quality, smoother picture. I'd be willing to sacrifice a few FPS for a better-looking game and better picture quality with Fermi.

People on here talk about how they can run this game at 70 FPS. Who cares? Anything past 70 FPS without vsync gives awful tearing, and let's not forget that 30 FPS is playable for many, and most of the current cards are running 45-50 FPS. Raw performance means little to me, and that is why Nvidia is going to continue to have my business. You guys who are running 100 FPS and want to do 200 FPS are silly. I just have to laugh... and then at these insane resolutions? Honestly, I tried to run my monitor at those insane resolutions, and anything past 1680 by 1050 was very bland and crappy looking.

As far as Eye Affinity goes, that is overkill. Those lines in the middle of the screen would bug the hell outta me. Considering all that is, is a bunch of monitors put together, why not just buy a freaking 50 inch LCD HDTV and hook it up? I guess it's for that insane resolution, but still...

So, at the risk of being flamed again, that's all I've got to say.

JasonPC
02-11-10, 10:33 PM
I've never had issues with ATI cards or drivers. Considering all the problems I've had with Nvidia's drivers, I'm willing to risk having issues with ATI drivers since I don't have a lot to lose. I know it's rumored, but when XMAN52373 says 10%, I truly am concerned, since he is typically very pro-Nvidia :D I'm going to wait and see who has the best price/performance, and whoever comes out on top will likely get my buy. I think ATI may be in the best position to deliver that. A 10% difference in performance is no biggie to me; either card will improve my performance by quite a lot. But the price difference between the two is definitely going to be the deciding factor.

Wolken007
02-12-10, 08:27 AM
Hasn't been confirmed; it's just rumors thus far. No one can say whether the final version will use more power or even run hotter. It's all speculation... and even if it does all of the above right now in its current phase, that may not be true of the final product that the public gets. Them holding back all the information is nerve-wracking. It's one of two things: the card is really crappy, or it's good enough that it's going to blow ATI out of the water. Either way, I highly doubt I'll be purchasing a crappy ATI card. I dislike them and their drivers.

Furthermore, this is an Nvidia fan site, and I've seen way too many fanboys outside of this thread who have just blown up about how Nvidia sucks, with fewer than 100 posts, some of them freshly registered users. Some of the guys who have been here for a while have ATI cards, and that is fine, but it just shows that loyalty to a company is lacking badly. It's whoever does better, I suppose. Nvidia has yet to sell me a card that has not been able to play my games and run my programs smoothly. I don't play musical cards. Some people have the money to do that, and so do I, but I can think of better things to spend my money on than upgrading for a 10% increase. And honestly, if it does 10% better in any game, then Nvidia has still won for now as far as performance goes...


So, at the risk of being flamed again, that's all I've got to say.

gulizard, people respond negatively to your posts because you don't seem to have a middle-of-the-road approach. It is perfectly fine if you prefer nvidia products over ATI. However, you fail to recognize that both companies have had issues over the years with different products. Plenty of users have both positive and negative experiences with both. I know the name of the site is nvnews and has focused traditionally on nvidia products, but that doesn't mean that everyone here has to be a die-hard nvidia fan. I have used both products over the years and will continue to do so. I welcome the companies competing for the performance crown.

You make a lot of claims about ATI products based on a seemingly limited experience from quite some time ago. For you to make claims like "ATI has crappy drivers" and "ATI doesn't do AA as well as nvidia" and "Eye Affinity (which by the way, is actually Eyefinity) is overkill and worthless" without having used ATI's current generation of products is ridiculous. If you are loyal to nvidia and want to buy their product no matter what, that is fine -- just don't make claims about products that you have no experience with.

XMAN52373
02-12-10, 11:12 AM
That doesn't make much sense. A 5970 will be at least 50-60% faster than a 5870 most of the time, and around 75-80% in cases where Crossfire scales optimally. How does a less than 20% increase in shader count (448 > 512) make up that difference, going from beating the 5870 by only 10% to nearly matching the 5970?

The 448SP part and the 512SP part are going to have different clocks. The 448SP should come in, unless they have changed things, at 450-475MHz for the core and around 1250-1300MHz for the domain (shader) clock, while having just 40 ROPs. The 512SP part has 48 ROPs and should come in at 600-625MHz core and 1450-1500MHz domain clock.
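To put rough numbers on how that answers Xion X2's scaling question: shader throughput scales roughly with SP count times shader (domain) clock. A minimal back-of-the-envelope sketch, assuming the rumored clocks above (none of these are confirmed specs):

```python
# Rough shader-throughput comparison of the rumored GF100 parts.
# All clocks are the speculated figures from this thread, not
# confirmed specifications.

def shader_throughput(sp_count, shader_clock_mhz):
    # Relative throughput ~ SP count x shader (domain) clock.
    return sp_count * shader_clock_mhz

part_448 = shader_throughput(448, 1300)  # rumored 448SP part, ~1300MHz domain
part_512 = shader_throughput(512, 1500)  # rumored 512SP part, ~1500MHz domain

print(f"512SP vs 448SP: {part_512 / part_448:.2f}x")
# -> ~1.32x: the ~15% clock gap stacks on the ~14% SP-count gap,
#    so the 512SP part has roughly a third more shader throughput.
```

On those numbers, the gap between the two Fermi parts is closer to 30% than to the bare 14% difference in SP count, which is how the 512SP part could land much nearer the 5970.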

gulizard
02-12-10, 02:24 PM
gulizard, people respond negatively to your posts because you don't seem to have a middle-of-the-road approach. It is perfectly fine if you prefer nvidia products over ATI. However, you fail to recognize that both companies have had issues over the years with different products. Plenty of users have both positive and negative experiences with both. I know the name of the site is nvnews and has focused traditionally on nvidia products, but that doesn't mean that everyone here has to be a die-hard nvidia fan. I have used both products over the years and will continue to do so. I welcome the companies competing for the performance crown.

You make a lot of claims about ATI products based on a seemingly limited experience from quite some time ago. For you to make claims like "ATI has crappy drivers" and "ATI doesn't do AA as well as nvidia" and "Eye Affinity (which by the way, is actually Eyefinity) is overkill and worthless" without having used ATI's current generation of products is ridiculous. If you are loyal to nvidia and want to buy their product no matter what, that is fine -- just don't make claims about products that you have no experience with.

I have a 4800-series X2 card behind me in a box. I used it for two weeks and went back to Nvidia.

As for the way I approach the situation: the guys with 100 or fewer posts who are on here calling people names and trying to make it look as if Nvidia has fallen are the reason I get angry. It makes me not even want to come to this site. I like technology too. I just never liked ATI, and I haven't registered over at Rage3D and started posting about how crappy ATI is, though I have half a mind to, considering that a lot of these new users have come here with no indication of ever owning an Nvidia card. When a company falls behind or doesn't keep up, there's always one guy who had a bad experience with Nvidia, maybe got a DOA card, and blames Nvidia; he's the guy registering here, bashing Nvidia, and trying to make his beloved ATI card look so good.

People are putting Nvidia down an awful lot, considering that no one has any proof of what Nvidia is going to do. It's all speculation. The fact that it's taken this long tells me, personally, that they are probably finalizing the card. People are saying it will use more power than ATI? How can you prove this? Even if you found a site that shows power consumption, that is not the final product; it isn't released. And even when it's released, you shouldn't be here with an ATI card telling people how horrible Nvidia is.

Look at my posts up until this mess started. Not once have I ever posted a single negative thing about ATI or anyone who owns an ATI card. It's because of these ass hats on here now that I've gotten so hostile, because I come into this thread to be informed about Nvidia graphics cards and how they may compare, not to hear how the company is junk, that people hate Nvidia so much, and all that nonsense. So when that stops, maybe I will be a little nicer in my posts.

Again, no matter what a fan of ATI tells me, I am loyal. I do not play musical chairs with anything; I don't swap between brands. I went to Intel after AMD bought ATI. People who register here because they are fans of Nvidia and of technology shouldn't be so darn hostile, and you know those same people who have been here for a while and are dragging Nvidia through the mud are the same ones who were probably doing it to ATI when Nvidia was on top.

Nvidia has more games that it supports, a much larger gaming platform than ATI has right now. I am choosing to continue to buy Nvidia products. I couldn't care less, as long as the card I've got can run my game at the highest settings and my desired resolution at 45-60 FPS. If it can do 100, that's cool too, but I am not going to turn my back on the good quality Nvidia cards I have been buying for years over a few lost FPS, if that's even the case.

Wolken007
02-12-10, 04:47 PM
Nvidia has more games that it supports, a much larger gaming platform than ATI has right now.

gulizard, I don't think anyone has any concerns with your loyalty towards nvidia -- it is statements like this, and the others I pointed out earlier, that are irritating. I don't mean to stir up anything and am not trying to be hostile in any way -- you just need to realize that these are the types of statements that are going to cause problems because they simply aren't true.

JasonPC
02-12-10, 06:29 PM
Seriously, what games are out there that aren't supported by ATI?

gulizard
02-12-10, 06:38 PM
gulizard, I don't think anyone has any concerns with your loyalty towards nvidia -- it is statements like this, and the others I pointed out earlier, that are irritating. I don't mean to stir up anything and am not trying to be hostile in any way -- you just need to realize that these are the types of statements that are going to cause problems because they simply aren't true.

Explain how they aren't true? Nvidia's name is on a lot more games out there. Some really great games have PhysX support, which is Nvidia-only. That is what I meant by the statement, and it's a good reason to stay with Nvidia as of right now. ATI hasn't been able to keep up long enough to really get its name on any big titles that I know of. AVP is the only one I believe ATI has had its name on recently.

JasonPC
02-12-10, 06:46 PM
There is a difference between marketing and actually supporting a game, though.

Aphot
02-12-10, 08:34 PM
I'm selling my GTX 285 tomorrow, and hopefully Nvidia will give me something to buy soon. If not, I'm going to have to buy an ATI.

Viral
02-12-10, 09:27 PM
The 448SP part and the 512SP part are going to have different clocks. The 448SP should come in, unless they have changed things, at 450-475MHz for the core and around 1250-1300MHz for the domain (shader) clock, while having just 40 ROPs. The 512SP part has 48 ROPs and should come in at 600-625MHz core and 1450-1500MHz domain clock.

I can't see them clocking their high-end part that low, even if they do plan to launch a higher-end part later on.

XMAN52373
02-13-10, 12:30 AM
I can't see them clocking their high-end part that low, even if they do plan to launch a higher-end part later on.

The GTX470 isn't their high end; it's their second-tier high end. And why can't you believe it? If clocks are but one of the two reasons the 512SP part isn't available, why is it so hard to believe in a lower-clocked second-tier part? ATI is doing the same with the 5850 vs. the 5870, and the 5830 will have even lower clocks than the 5850.

Xion X2
02-13-10, 12:53 AM
As far as Eye Affinity goes, that is overkill. Those lines in the middle of the screen would bug the hell outta me. Considering all that is, is a bunch of monitors put together, why not just buy a freaking 50 inch LCD HDTV and hook it up? I guess it's for that insane resolution, but still...


Doesn't sound like you understand how Eyefinity works. Hooking up a 50" LCD would not give you the same experience.

What Eyefinity does is show you parts of the game that you don't get to see on a single display. So in your 50" LCD example, there are parts of the game in your left and right fields of view that you're not seeing on screen but would be with Eyefinity.

Eyefinity doesn't simply stretch the same experience across three monitors. It opens up more of the game for you to see at any point in time.
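To put a rough number on that extra field of view: with Hor+ scaling (the common behavior in modern games), the horizontal FOV grows with the aspect ratio while the vertical FOV stays fixed. A minimal sketch, assuming a typical 60-degree vertical FOV and three 1080p screens (both are illustrative assumptions, not Eyefinity specifics):

```python
import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    # Hor+ scaling: horizontal FOV derived from a fixed vertical FOV.
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

single = horizontal_fov(60, 1920 / 1080)      # one 1080p monitor (16:9)
triple = horizontal_fov(60, 3 * 1920 / 1080)  # three side by side (48:9)

print(f"single: {single:.0f} deg, triple: {triple:.0f} deg")
# -> single: ~91 deg, triple: ~144 deg of the game world visible
```

So a triple-monitor setup shows you roughly half again as much of the world side to side, which a single 16:9 screen, however large, won't show you.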

Also, Nvidia or ATI slapping their name on the back of a game box or in a game intro doesn't mean a goddam thing. There have been plenty of games that played perfectly on the competition's hardware, like Half-Life 2, for example. ATI marketed it, but I never had problems playing it on an Nvidia card. Did you? What about Call of Juarez? That's another that was marketed by ATI yet played fine on Nvidia. That's nothing but a cop-out to stick with a certain vendor. In reality, it doesn't mean ##&@.

gulizard
02-13-10, 12:55 AM
Doesn't sound like you understand how Eyefinity works. Hooking up a 50" LCD would not give you the same experience.

What Eyefinity does is show you parts of the game that you don't get to see on a single display. So in your 50" LCD example, there are parts of the game in your left and right fields of view that you're not seeing on screen but would be with Eyefinity.

Eyefinity doesn't simply stretch the same experience across three monitors. It opens up more of the game for you to see at any point in time.

Yes, but it cannot do this as one monitor; that is why I wouldn't like it. I know it's a bigger resolution; a 50" can't do more than 1280 by 1024, I don't believe. Still, I don't see the point in spanning across monitors. Nvidia has something similar, I believe, and I don't like it either. I was just pointing out that that'd be a silly reason to go with ATI alone.

Xion X2
02-13-10, 01:08 AM
Yes, but it cannot do this as one monitor; that is why I wouldn't like it. I know it's a bigger resolution; a 50" can't do more than 1280 by 1024, I don't believe. Still, I don't see the point in spanning across monitors. Nvidia has something similar, I believe, and I don't like it either. I was just pointing out that that'd be a silly reason to go with ATI alone.

Well, the point is increased immersion. Don't you think it's more immersive to be able to turn your head from side to side and see an entire world around you instead of just a flat screen in your straight-on view?

The concern about the monitor bezels is understandable but overstated. Most that try it out claim they can't even see them after a while. You're not playing a game staring at the sides of your monitor; you're playing it looking at the screen itself. When you turn side to side, you look at the screen, not the bezels.

gulizard
02-13-10, 08:37 AM
Well, the point is increased immersion. Don't you think it's more immersive to be able to turn your head from side to side and see an entire world around you instead of just a flat screen in your straight-on view?

The concern about the monitor bezels is understandable but overstated. Most that try it out claim they can't even see them after a while. You're not playing a game staring at the sides of your monitor; you're playing it looking at the screen itself. When you turn side to side, you look at the screen, not the bezels.

Couldn't this be a concern in multiplayer, though? If someone has this expensive option (and yes, I understand it's pretty darn expensive), can you see more of the screen in MP, giving you an advantage over most people running single monitors at a lower res? Never mind, I guess people running at a higher res will always be able to see more anyway.

I couldn't get used to those lines; even watching a YouTube video bothered me quite a bit. It's a good idea, but if I was going to bust out that kind of money, I'd rather wait till they just make one big monitor that can handle that native resolution by default, which I am sure we will see in a few years.

Xion X2
02-13-10, 02:19 PM
Couldn't this be a concern in multiplayer, though? If someone has this expensive option (and yes, I understand it's pretty darn expensive), can you see more of the screen in MP, giving you an advantage over most people running single monitors at a lower res? Never mind, I guess people running at a higher res will always be able to see more anyway.


Yeah, it'll give you an advantage in MP, because you'll have a wider viewing range than most other people.

And Eyefinity's not all that expensive. LCDs are a dime a dozen these days. I got all three of my 23" 1080p screens for less than $520.

Snake101st
02-13-10, 02:52 PM
Can't wait for Fermi to come out. I'm hopefully going to step up to GTX 480s from the two GTX 260s I'm getting via EVGA RMA since one of my 8800 GTXs kicked the bucket (if they'll let me). Should be sweet.

It's highly doubtful that my PSU will power both, though (PC Power Silencer 750). :/