Old 07-24-03, 12:14 AM   #49
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by Ruined
So I assume you'd agree that every review pitting the R9800PRO against the FX5900Ultra should include a disclaimer that ATI's scores are done only at FP24, while the FX5900Ultra can do FP32, which is more work, and therefore ATI may have an inflated score?
Let's not forget that the spec DX9 calls for is FP24, that it NEVER calls for FP32, and that nVidia has no one but themselves to blame for that design failur..."feature".
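For anyone who wants to put numbers on the precision gap being argued about, here's a minimal sketch in Python. The mantissa widths are the commonly cited ones (10 bits for FP16, 16 for ATI's FP24, 23 for FP32); treat the exact layouts as an assumption rather than gospel:

[code]
# Rough comparison of pixel shader precision formats.
# Mantissa widths are the commonly cited values; exact layouts are an assumption.
formats = {
    "FP16 (s10e5)": 10,   # NV3x partial precision
    "FP24 (s16e7)": 16,   # R3x0 precision, the DX9 full-precision minimum
    "FP32 (s23e8)": 23,   # NV3x full precision
}

for name, mantissa_bits in formats.items():
    epsilon = 2.0 ** -mantissa_bits   # smallest relative step near 1.0
    print(f"{name}: ~{mantissa_bits + 1} significant bits, relative error ~{epsilon:.1e}")
[/code]

Which is roughly why FP24 was judged "enough" for DX9 while FP32 costs more per pixel: it is about 64x finer-grained than FP16 but still 128x coarser than FP32.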
__________________
[SIZE=1][I]"It was very important to us that NVIDIA did not know exactly where to aim. As a result they seem to have over-engineered in some aspects creating a power-hungry monster which is going to be very expensive for them to manufacture. We have a beautifully balanced piece of hardware that beats them on pure performance, cost, scalability, future mobile relevance, etc. That's all because they didn't know what to aim at."
-R.Huddy[/I] [/SIZE]
digitalwanderer is offline   Reply With Quote
Old 07-24-03, 12:16 AM   #50
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default

Quote:
Originally posted by StealthHawk
As for your continued asinine comments, there have been screenshots posted that weren't zoomed in and showed the issue perfectly. It is exactly this point that begs the question of WTF Brent was thinking. He didn't post any shots that showed the issue clearly. Why not?

He didn't compare the gfFX UT2003 filtering hack with ATI's full trilinear. Again, why not? Is he just incompetent? Or did he want to make the issue look like as much of a non-issue as possible?
Bent Justice is fitting right in at [T]....

(Ok, ok...I'm done now. )
digitalwanderer is offline   Reply With Quote
Old 07-24-03, 12:25 AM   #51
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by digitalwanderer
Let's not forget that the spec DX9 calls for is FP24, that it NEVER calls for FP32, and that nVidia has no one but themselves to blame for that design failur..."feature".
aapo addressed that claim nicely here.
__________________
We're all in it together.

Intel Core 2 Quad Q6700 2.66GHz CPU | Intel G965WH mobo | 8GB (4x2GB) DDR2-667mhz CAS5 RAM (1066MHz FSB) | BFG GeForce 285 GTX OC 1GB | Dell E228WFP 22" DVI-HDCP LCD Monitor | 1TB Western Digital RE3 SATA2 Main Drive | 500GBx2 Western Digital RE3 SATA2 Scratch Drives in RAID0 | Western Digital RE3 1TB SATA2 Media Drive | External 2TB Western Digital MyBook Backup Drive | Adaptec eSATA 3.0gbps PCI-E interface | Sandisk External 12-in-1 Flash Card Reader | LG GGC-H20L HD DVD/BD reader, DVD writer | LG GGW-H20L HD DVD/BD reader, DVD/BD writer | Microsoft E4000 Ergonomic Keyboard | Logitech Trackman Wheel | Antec P182 ATX Case | Thermaltake ToughPower XT 850w modular PSU | KRK RP-8 Rokit Studio Monitors | Windows Vista Ultimate x64
Ruined is offline   Reply With Quote
Old 07-24-03, 02:13 AM   #52
extreme_dB
Registered User
 
Join Date: Jun 2003
Posts: 337
Default Re: Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by Ruined
aapo addressed that claim nicely here.
How so? First, the DX9 minimum for calculating 32-bit data types is FP24. The NV30-NV34 calculate everything at FP16 or less, because FP32 hasn't even been allowed in the drivers up to now. Only the NV35 currently supports FP32, and it's still slow overall compared to the R350 because of the register usage problem. It remains to be seen how much Nvidia can optimize for HL2, but the developers themselves have said ATI's cards are much faster. It depends on the degree to which HL2 uses pixel shading, and the precision required. The NV35 will take a huge performance hit much sooner if there are lots of full-precision shaders, so Nvidia will likely have to force FP16 for most or all of the shaders to try to stay under the penalty threshold.

If proper FSAA relies on the pixel shaders as well then the situation is even worse.

I think HL2 might be able to run as fast on NV35 as R350, but only after Nvidia rewrites shaders for lower precision, which may or may not result in a noticeable loss in IQ.
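A rough sketch of the register pressure point, since it comes up a lot: an FP32 temp register is twice the size of an FP16 one, so forcing partial precision roughly doubles how many live temporaries fit in the same hardware budget. The budget below is purely hypothetical; only the 2:1 ratio is the point.

[code]
# Hypothetical illustration of why forcing FP16 relieves register pressure.
# The register-file budget is made up; it is NOT a real NV3x spec.
REGISTER_FILE_BITS = 8 * 4 * 32   # assumed per-pixel temp budget

def temps_that_fit(bits_per_component):
    bits_per_temp = 4 * bits_per_component   # an xyzw/RGBA vector register
    return REGISTER_FILE_BITS // bits_per_temp

print("FP32 temps that fit:", temps_that_fit(32))   # 8
print("FP16 temps that fit:", temps_that_fit(16))   # 16
[/code]

The claim in this thread is that NV3x throughput drops once a shader needs more live full-precision temps than a budget like that allows, which is why rewriting HL2's shaders for FP16 would help the NV35 specifically.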
extreme_dB is offline   Reply With Quote
Old 07-24-03, 03:34 AM   #53
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Ruined
StealthHawk: No personal attacks please.
What personal attacks? That I called statements such as this
Quote:
IQ differences that can only be detected by zooming in on pictures with paint programs or by any other method than looking at the game with your naked eye in action are 100% useless in terms of comparing in-game IQ differences between the cards.
asinine? That was not a personal attack, and I stand by my comment. There are screenshots that have been posted (not by [H], conveniently) that clearly show the mipmap transitions.

Quote:
Again, the guys at HardOCP pitted an FX5900 with 44.03 drivers against 9800PRO with latest drivers and could see zero difference in-game.
Keyword: what he could see. He says he can see a difference, but that when you're "playing the game no one will see the differences." In other words, he writes off the changes and says that because he can't see them, nobody else can either. That's a dangerous presumption.

Quote:
Regardless of the filtering Nvidia is using with the FX5900 and 44.03 drivers, it looks no different than the filtering that is being used on the 9800PRO when you are actually playing the game according to a pro site that was specifically looking for such differences in gameplay.
And again, he was not comparing NVIDIA's filtering to ATI's full trilinear filtering. He proved nothing, except that he thinks ATI's tri/bi AF looks better than NVIDIA's tri/bi AF.

At the end of the day you just don't get it. IQ is changed, however slightly. This affects direct benchmark comparisons in UT2003, and this is also deceit from NVIDIA. They don't perform full trilinear in Quality mode in UT2003, and in UT2003 only. Everything else is unaffected. They did not announce what they are doing; in fact, they claim performance increases in UT2003 with 44.03 (hmm, where did those come from?). Informed users have come to know that Quality mode = trilinear. Furthermore, NVIDIA's own documents seem to want reviewers to benchmark UT2003 on ATI cards while running full trilinear! They have quietly slipped in this fake trilinear as a replacement for full trilinear without telling anyone, and without giving users the option of using real trilinear. That is the crux of the issue.
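For readers who haven't followed the Beyond3D analysis, here is a hedged sketch of what "fake trilinear" means mechanically. Full trilinear blends the two nearest mip levels across the whole LOD range; the reduced mode does plain bilinear over most of each level and only blends in a narrow band around the transition, which saves texture bandwidth but can leave visible mip boundaries. The 0.3 band width below is an assumption for illustration, not a measured driver value:

[code]
# Blend weight given to the farther of the two mip levels at a given LOD.
# Full trilinear: the weight is simply the fractional LOD everywhere.
# Reduced ("brilinear"-style): pure bilinear over most of the level, with a
# short blend ramp only around where the mip transition would show.
def trilinear_weight(lod):
    return lod - int(lod)

def reduced_weight(lod, band=0.3):        # band width is an assumption
    frac = lod - int(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if frac < lo:
        return 0.0                        # bilinear from the nearer mip
    if frac > hi:
        return 1.0                        # bilinear from the farther mip
    return (frac - lo) / band             # narrow blend near the boundary

for lod in (2.1, 2.4, 2.5, 2.6, 2.9):
    print(lod, round(trilinear_weight(lod), 2), round(reduced_weight(lod), 2))
[/code]

Fewer texels get the two-level blend, so it benchmarks faster than real trilinear while still not showing the hard mip lines that pure bilinear would.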
  Reply With Quote
Old 07-24-03, 06:40 AM   #54
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default

Quote:
Originally posted by StealthHawk
What personal attacks? That I called statements such as this asinine? That was not a personal attack, and I stand by my comment. There are screenshots that have been posted (not by [H], conveniently) that clearly show the mipmap transitions.
I said:
IQ differences that can only be detected by zooming in on pictures with paint programs or by any other method than looking at the game with your naked eye in action are 100% useless in terms of comparing in-game IQ differences between the cards.

Note I said "only" - read the statement again. I'm saying if you can't detect the differences in game with your naked eye, then any IQ differences that can be found with paint programs, etc., are useless, because in the end you still cannot detect them with your naked eye. What is not valid about that?

Quote:

Keyword what he could see. He says he can see a difference, but that when you're "playing the game no one will see the differences." In other words, he writes off the changes and says that because he can't see them then nobody else can either. That's a dangerous presumption.
What he did say was that Nvidia looked slightly superior with AF off and ATI looked slightly superior with AF on, but the differences were so minor that you wouldn't be able to see them when actually playing, presumably unless you stood there staring into the distance looking for whatever it was you were looking for. Not really a dangerous statement, just one that puts into perspective how minor the filtering differences are.


Quote:
And again, he was not comparing NVIDIA's filtering to ATI's full trilinear filtering. He proved nothing, except that he thinks ATI's tri/bi AF looks better than NVIDIA's tri/bi AF.

At the end of the day you just don't get it. IQ is changed, however slightly. This affects direct benchmark comparisons in UT2003, and this is also deceit from NVIDIA. They don't perform full trilinear in Quality mode in UT2003, and in UT2003 only. Everything else is unaffected. They did not announce what they are doing; in fact, they claim performance increases in UT2003 with 44.03 (hmm, where did those come from?). Informed users have come to know that Quality mode = trilinear. Furthermore, NVIDIA's own documents seem to want reviewers to benchmark UT2003 on ATI cards while running full trilinear! They have quietly slipped in this fake trilinear as a replacement for full trilinear without telling anyone, and without giving users the option of using real trilinear. That is the crux of the issue.
And again, my counterargument is that if this is in fact an optimization, and not a bug, I would prefer it because it offers faster speed with IQ differences that, according to a major site, are undetectable when actually playing. Maybe arguing for an option to revert to the standard method just for kicks would be in order, but if you owned an FX card would you honestly use it, if the reviewers themselves looking for the differences didn't notice any during play? I think the point they were making over at HOCP is that the IQ difference is being blown out of proportion, and although it can be seen when studying screencaps, it's so minor that when actually playing it's not noticeable. You may disagree, but those are his unbiased findings (again, HOCP has been very hard on NV cards). You may feel differently about comparisons if you have done a side-by-side comparison of both, but when one of the pickiest sites out there says it's no big deal, you have to wonder if it's worth worrying about.
Ruined is offline   Reply With Quote
Old 07-24-03, 06:51 AM   #55
Hanners
Elite Bastard
 
 
Join Date: Jan 2003
Posts: 984
Default

Quote:
Originally posted by Ruined
I said:
IQ differences that can only be detected by zooming in on pictures with paint programs or by any other method than looking at the game with your naked eye in action are 100% useless in terms of comparing in-game IQ differences between the cards.

Note I said "only" - read the statement again. I'm saying if you can't detect the differences in game with your naked eye, then any IQ differences that can be found with paint programs, etc., are useless, because in the end you still cannot detect them with your naked eye. What is not valid about that?
The problem is, the different mip levels are obvious to the naked eye in some circumstances. I don't have any ATi screenshots for comparison, but take a look at the shots taken on a 5900 by Dave Baumann at Beyond3D. Bear in mind this is with the drivers set to use trilinear filtering:

[Beyond3D 5900 screenshots not preserved]
__________________
Owner / Editor-in-Chief - Elite Bastards
Hanners is offline   Reply With Quote
Old 07-24-03, 07:17 AM   #56
StealthHawk
Guest
 
Posts: n/a
Default

Quote:
Originally posted by Ruined
I said:
IQ differences that can only be detected by zooming in on pictures with paint programs or by any other method than looking at the game with your naked eye in action are 100% useless in terms of comparing in-game IQ differences between the cards.

Note I said "only" - read the statement again. I'm saying if you can't detect the differences in game with your naked eye, then any IQ differences that can be found with paint programs, etc., are useless, because in the end you still cannot detect them with your naked eye. What is not valid about that?
I assumed you said that because you thought it pertained to the UT2003 issue; am I mistaken about this? You have been asserting all along that the issue is NOT visible, right? I am saying that the UT2003 issue IS visible, without the measures listed above. If I misconstrued your intent then I apologize. Otherwise I stand by my assessment.

Quote:
What he did say was that Nvidia looked slightly superior with AF off and ATI looked slightly superior with AF on, but the differences were so minor that you wouldn't be able to see them when actually playing, presumably unless you stood there staring into the distance looking for whatever it was you were looking for. Not really a dangerous statement, just one that puts into perspective how minor the filtering differences are.
No, it is dangerous. You're still not seeing the gravity of what he said. Paraphrased, he succinctly said this: "I cannot see the issue while playing. That means nobody else can either." Since when did Brent become the law? Since when should a journalist make blanket statements that just because they can't see something, nobody else can? I agree, it probably is not something that is always visible. But to say that it is never visible is ridiculous. Some people have made statements that FSAA and AF are not noticeable when actually playing a game, which I find ludicrous. Where does it end? There's nothing wrong with an individual saying that they cannot notice something while playing the game. An injustice is being committed when they try to say that their observations are true for everyone else.

Quote:
And again, my counterargument is that if this is in fact an optimization, and not a bug, I would prefer it because it offers faster speed with IQ differences that, according to a major site, are undetectable when actually playing. Maybe arguing for an option to revert to the standard method just for kicks would be in order, but if you owned an FX card would you honestly use it, if the reviewers themselves looking for the differences didn't notice any during play? I think the point they were making over at HOCP is that the IQ difference is being blown out of proportion, and although it can be seen when studying screencaps, it's so minor that when actually playing it's not noticeable. You may disagree, but those are his unbiased findings (again, HOCP has been very hard on NV cards). You may feel differently about comparisons if you have done a side-by-side comparison of both, but when one of the pickiest sites out there says it's no big deal, you have to wonder if it's worth worrying about.
I do not feel that [H] is unbiased. Their whole attitude towards NVIDIA's NV3x lineup has been disappointing. Think what you like, but Kyle has shown time and time again that he is in NVIDIA's pocket, and has never said anything about NVIDIA's methods, nor whether or not he thinks they are bad. I think that Brent and Pelly's past reviews have been fabulous. But I think he dropped the ball on this, especially since he did not compare NVIDIA's faux trilinear to ATI's real trilinear, so how are people supposed to gauge whether there is a difference or not, and how much of a difference there is?! Furthermore, the issue is clearly not a bug because the filtering changes only occur when "ut2003.exe" is detected. I agree that it is an optimization. However, because it only exists in UT2003, which happens to be a popular benchmark, it is clearly an "optimization."
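To make the application detection point concrete, this is the kind of mechanism being alleged (the profile table and names below are hypothetical, not NVIDIA's actual code; they only illustrate the behaviour being described): the driver looks at the executable name and silently swaps the filtering mode for that one title, regardless of what the control panel's Quality setting says.

[code]
# Hypothetical sketch of executable-based app detection in a display driver.
# Nothing here is NVIDIA's real code or API; it illustrates the alleged behaviour.
FILTERING_PROFILES = {
    "ut2003.exe": "reduced_trilinear",    # the alleged UT2003-only override
}

def select_filtering(exe_name, control_panel_setting="full_trilinear"):
    # A per-game override silently wins over the user-visible Quality setting.
    return FILTERING_PROFILES.get(exe_name.lower(), control_panel_setting)

print(select_filtering("UT2003.exe"))    # reduced_trilinear
print(select_filtering("quake3.exe"))    # full_trilinear
[/code]

Which is also why renaming the executable was reported at the time to restore real trilinear: the lookup misses and the normal path runs.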

Please agree or disagree with the following statements:

Giving the user an IQ/performance tradeoff is a good thing.

There is something deceitful about changing filtering in one game and not telling anyone about it, especially when they have been led to believe that the Quality setting in the drivers would perform full trilinear filtering.

It is not a good thing that users who want full trilinear filtering are unable to get it.

NVIDIA should provide an option in the drivers that does real trilinear filtering if the application asks for it.
  Reply With Quote

Old 07-24-03, 05:13 PM   #57
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default

Quote:
Originally posted by StealthHawk
I assumed you said that because you thought it pertained to the UT2003 issue; am I mistaken about this? You have been asserting all along that the issue is NOT visible, right? I am saying that the UT2003 issue IS visible, without the measures listed above. If I misconstrued your intent then I apologize. Otherwise I stand by my assessment.
I don't have an R9800PRO to compare side by side with an FX5900 on different monitors, so I can't personally prove that one card looks inferior to the other. I am going by HOCP's controlled A/B test, which clearly you disagree with, and was just making the point that if an optimization is minor enough to be noticeable only when studying screenshots as opposed to playing, then I wouldn't consider it significant. In other words, if you disagreed with the rationale behind the statement I posted, you'd be saying that IQ issues you cannot see during gameplay somehow harm the game's image quality while you are playing.


Re: Brent saying he doesn't see a difference while playing. That statement is the same as a reviewer who says they think ATI's 4x AA looks better than Nvidia's 4x AA. In both cases, they are subjective, simply because all IQ evaluations of in-game action are subjective. I don't find it damaging, I find it the norm in IQ evaluations. You have every right to disagree with Brent, but that was his judgement when he had the opportunity to evaluate the cards side by side in a controlled testing environment. It doesn't mean he is automatically wrong, nor does it mean you are automatically wrong if you disagree. However, you probably should at least take his opinion into account - that even a pro reviewer looking for a difference could not detect it during actual gameplay. Brent never said 'HardOCP is right, every other site is wrong,' he just evaluated the situation at ATI's request, and posted his findings. It is your choice to agree or disagree with them.


Quote:

Please agree or disagree with the following statements:

Giving the user an IQ/performance tradeoff is a good thing.
Agree - I don't have the cash to buy a $400 video card every 6 months, so when games get too intense I'd like the option to trade off IQ for performance.

Quote:

There is something deceitful about changing filtering in one game and not telling anyone about it, especially when they have been led to believe that the Quality setting in the drivers would perform full trilinear filtering.

It is not a good thing that users who want full trilinear filtering are unable to get it.
I am going to lump these together. I can't make a judgement based on the information available to me. I'm not sure why Nvidia changed the filtering method of UT2k3 from the standard, but I really don't think it's simply for benchmarking reasons - full trilinear shouldn't be such a major performance hit that it makes your card look horrible against the competition, unless there is a driver bug or incompatibility with that specific game.

ATI is basically taking the stance that the consumer should be able to customize everything. Nvidia is taking the stance that they will attempt to provide the consumer with the best IQ/performance for each game, which gives the consumer fewer options, but also less chance of ending up with decidedly worse IQ/performance. For instance, with ATI cards you can force AA on in Splinter Cell, but with Nvidia cards you cannot. ATI decided the consumer should be able to force the game into AA mode, while Nvidia used application detection to prevent AA from being enabled in that game (by request of Ubisoft). The difference? ATI users are the only ones who can use AA, but they are also the only ones who could be exposed to the massive IQ glitches seen in that game with AA enabled.

So in essence, ATI gives the user more control, but Nvidia gives the user what they believe is the best possible experience for that game. The former may lead to happier tweakers, the latter may lead to happier everyday consumers/casual gamers. Two very different approaches.

Quote:
NVIDIA should provide an option in the drivers that does real trilinear filtering if the application asks for it.
Depends on the situation.

If there is an underlying bug/issue with the drivers or hardware that is problematic with a particular application, which may be the case with UT and the 44.03 drivers, I don't think giving that option would be productive. In addition, by having Nvidia automatically control the IQ/advanced gfx settings with the drivers, there will likely be fewer glitches, performance issues, or need to switch settings around between games in order to get them to run their best on the hardware. The Nvidia method allows you to install the drivers, set them to highest quality, and, if they are the latest, play each game with the best settings that are not problematic or incompatible with the hardware/drivers - no need to shuffle settings between games to get the best quality without IQ glitches or performance problems. Makes life easier on the gamer, though tweakers may be upset by the lack of control.

Last edited by Ruined; 07-24-03 at 05:27 PM.
Ruined is offline   Reply With Quote
Old 07-24-03, 05:59 PM   #58
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Default Re: Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by Ruined
aapo addressed that claim nicely here.
No, this issue is danced around and FUDed up by aapo there.

Why are you defending the wrong side of the argument so hard?
digitalwanderer is offline   Reply With Quote
Old 07-24-03, 06:15 PM   #59
Ruined
Registered User
 
 
Join Date: Jul 2003
Posts: 2,447
Default Re: Re: Re: Re: Re: Re: Re: Re: Re: Re: Unreal 'optimization' a non-issue?

Quote:
Originally posted by digitalwanderer
Why are you defending the wrong side of the argument so hard?
There is no 'wrong side' to this particular argument, just different rationales, neither of which is 'wrong.' Mine happens to differ from yours, but that does not make it invalid.
Ruined is offline   Reply With Quote
Old 07-24-03, 06:18 PM   #60
digitalwanderer
 
 
Join Date: Jul 2002
Location: Highland, IN USA
Posts: 4,944
Unhappy No, there can be only one.

Quote:
Originally posted by Ruined
There is no 'wrong side' to this particular argument, just different rationales, neither of which is 'wrong.' Mine happens to differ from yours, but that does not make it invalid.
Normally it wouldn't automatically make it invalid; but when you're talking wrong/right, black/white, truth/BS then one side is the correct one and the other side is in error.

You are in error; I don't get what you're trying to prove anymore other than "nVidia and [T] aren't as bad as all you guys say" when we all know damned well they are.
digitalwanderer is offline   Reply With Quote