Old 06-25-03, 06:59 AM   #25
Nutty
Sittin in the Sun
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

I can do some GF4 shots when I get home.
Old 06-25-03, 09:13 AM   #26
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430

Well, Epic has their take on this:

http://udn.epicgames.com/pub/Content/TextureComparison/
Old 06-25-03, 09:30 AM   #27
gstanford
Registered User
 
Join Date: Oct 2002
Posts: 799

I would not put too much stock in whatever Epic/Tim Sweeney has to say, frankly. The Unreal engine may be visually impressive, but it is incredibly clunky and inefficient compared with just about any other engine out there at the moment, and it always has been. Tim is the guy who is claiming that unless your hardware fully accelerates FP32 his next title won't even run properly on your machine.

They haven't bothered to check DXT1 quality on anything but nVidia cards in the linked article. You can clearly see from Nutty's screenshots that ATi hardware murders DXT1 quality just as badly.

In fact, it isn't really a hardware issue - it's a limitation of the DXT1 format, which simply isn't suitable for high-quality textures.
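
For what it's worth, here is roughly where that limitation comes from: every 4x4 block of texels gets squeezed into 64 bits. A sketch of the layout (the struct and function names are mine, not from any SDK header):

Code:
/* DXT1/S3TC colour block, illustrative only: two RGB565 endpoint
   colours plus a 2-bit index per texel choosing one of 4 colours
   interpolated between them. 64 bits per 16 texels = 4 bits/texel. */
#include <stdint.h>

typedef struct {
    uint16_t color0;   /* RGB565 endpoint 0 */
    uint16_t color1;   /* RGB565 endpoint 1 */
    uint32_t indices;  /* 16 texels x 2 bits each */
} DXT1Block;

/* Fetch the 2-bit selector for texel (x, y) inside the block. */
static unsigned dxt1_selector(const DXT1Block *b, int x, int y)
{
    return (b->indices >> (2 * (4 * y + x))) & 0x3;
}

So any 4x4 block can only contain 4 colours, all sitting on a straight line between two 16-bit endpoints - that's where the banding and blockiness come from, whatever the hardware.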
Old 06-25-03, 12:36 PM   #28
jbirney
Registered User
 
Join Date: Jul 2002
Posts: 1,430

Quote:
Originally posted by gstanford
I would not put too much stock in whatever Epic/Tim Sweeney has to say, frankly. The Unreal engine may be visually impressive, but it is incredibly clunky and inefficient compared with just about any other engine out there at the moment, and it always has been. Tim is the guy who is claiming that unless your hardware fully accelerates FP32 his next title won't even run properly on your machine.
While I can't comment on what Tim says about the future, I can state that his engines are very good in their own right. Sure, the Unreal engine was a bit clunky in the past, but that's because it started as a software renderer, never really designed for 3D cards. The first Unreal engine was written almost 5 years ago. The engine that powers UT2k3 shows you just how good it is. I mean, look at all of the AAA titles planned for this engine. You even have games like Tribes and MOH:AA that have switched to the Unreal engine. Plus the power of UScript helps to make his engine much more useful. Not trying to compare the two, but JC has pushed his engine one way and has done a great job at it. Tim has pushed his in another direction and has also done a good job.


Quote:
They haven't bothered to check DXT1 quality on anything but nVidia cards in the linked article. You can clearly see from Nutty's screenshots that ATi hardware murders DXT1 quality just as badly.
That's not what I see. Yes, it may look bad in Nutty's test, but in my mod, based on the UT2k3 engine, it looks much better on an ATI card when I've had to use DXT1 compression (I have both a GF4 and an ATI, so I can compare). You might want to read the article.
Old 06-25-03, 12:57 PM   #29
Nutty
Sittin in the Sun
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

They look slightly different, but both are still nowhere near as good as no compression.

I think the whole issue with the Q3 S3TC fix is that it just forces apps that ask for DXT1 to use DXT3 or 5. Like gstanford says, it's not a bug at all. Not now, anyway; there might've been one in the past. It just happens that DXT1 looks crap when working correctly.
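
In OpenGL terms the 'fix' boils down to swapping the requested internal format - something like this sketch (the helper function is made up; the real tweak happens inside the driver):

Code:
/* Illustrative only: when an app asks for DXT1, hand back DXT5.
   The enum values are the standard GL_EXT_texture_compression_s3tc
   ones; choose_internal_format() is a hypothetical helper. */
#include <GL/gl.h>

#ifndef GL_COMPRESSED_RGBA_S3TC_DXT1_EXT
#define GL_COMPRESSED_RGBA_S3TC_DXT1_EXT 0x83F1
#endif
#ifndef GL_COMPRESSED_RGBA_S3TC_DXT5_EXT
#define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3
#endif

GLenum choose_internal_format(GLenum requested, int force_dxt5)
{
    if (force_dxt5 && requested == GL_COMPRESSED_RGBA_S3TC_DXT1_EXT)
        return GL_COMPRESSED_RGBA_S3TC_DXT5_EXT; /* 8 bpp vs 4 bpp */
    return requested;
}

You pay double the texture memory for it, of course - DXT5 blocks are 128 bits against DXT1's 64.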

If people want, I can still do some GF4 shots too, but frankly, from running the program here, they look identical to the GF3 shots. I overlaid the program on top of the GF3 shots and repeatedly ALT-TAB'd, and they look pixel-identical.
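
If anyone wants something more rigorous than eyeballing ALT-TAB flips, a dumb byte-compare over two raw screenshot dumps does the job (a sketch - the file names are made up, and it assumes both dumps are the same size and format):

Code:
/* Compare two raw framebuffer dumps byte for byte. */
#include <stdio.h>

int main(void)
{
    FILE *a = fopen("gf3_dxt1.raw", "rb");
    FILE *b = fopen("gf4_dxt1.raw", "rb");
    long diffs = 0, total = 0;
    int ca, cb;

    if (!a || !b) { perror("fopen"); return 1; }
    while ((ca = fgetc(a)) != EOF && (cb = fgetc(b)) != EOF) {
        if (ca != cb) diffs++;
        total++;
    }
    printf("%ld differing bytes out of %ld\n", diffs, total);
    fclose(a); fclose(b);
    return diffs != 0;
}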

That's on a GF4 Ti4600, with the new 44.65 drivers.
Old 06-25-03, 04:00 PM   #30
andypski
Registered User
 
Join Date: Jun 2003
Posts: 34
DXT texture compression

Had to register to reply in this thread - I hate to see these popular misconceptions about DXT texture compression continue.

Since I have worked for many years with the man who invented S3TC/DXTC texture compression, and continue to work with him on texture compression issues on a regular basis, I consider myself reasonably informed about this.

To cut a long story short -

For colour-only images there should be no difference in quality between DXT1 and any of the other DXT formats. The specification for the colour compression is identical across all these compressed formats.

Nutty's screenshots appear to be using a texture map that has an alpha channel as well, and the final image is being presented as the result of additively blending that channel onto the image. This is why DXT1 looks worse in this case - it only has 1 bit to represent the alpha channel, so alpha is either 1 or 0 (hence the sharp boundary around the white blob). I can guarantee that on ATI hardware there is no difference in colour compression quality between DXT1 and DXT2/3/4/5.

This idea that the other DXT formats should be in some way superior for colour images is a complete fallacy. There was never any intention in the specification that DXT1 should have a lower quality of colour representation than any of the other DXT formats.
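
To make that concrete: the colour half of a DXT2/3/4/5 block is bit-for-bit the same structure as an entire DXT1 block - the extra 64 bits buy you alpha and nothing else. A rough sketch (struct names are mine, purely illustrative):

Code:
#include <stdint.h>

typedef struct {
    uint16_t color0, color1; /* two RGB565 endpoint colours      */
    uint32_t indices;        /* 16 texels x 2-bit selector       */
} ColorBlock;                /* exactly a DXT1 block: 64 bits    */

typedef struct {
    uint64_t   alpha;        /* DXT2/3: explicit 4 bits/texel;
                                DXT4/5: endpoints + 3-bit codes  */
    ColorBlock color;        /* same colour encoding as DXT1     */
} DXTnBlock;                 /* DXT2-5: 128 bits per 4x4 block   */

/* DXT1's single alpha mode: when color0 <= color1 the block has
   3 colours plus one fully transparent value, so per-texel alpha
   is strictly 0 or 1 - the hard edge around the blob.           */
static int dxt1_has_punchthrough_alpha(const ColorBlock *b)
{
    return b->color0 <= b->color1;
}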

- Andy.
Old 06-25-03, 09:33 PM   #31
StealthHawk
Guest
 
Posts: n/a

OK, can someone make a test program that doesn't use an alpha channel then? *points at Nutty*
Old 06-26-03, 03:30 AM   #32
Nutty
Sittin in the Sun
 
Join Date: Jul 2002
Location: United Kingdom
Posts: 1,835

Sure, the alpha channel was deliberately added to show what happens to alpha information.

If you press 'A' it toggles the alpha channel already.

Last edited by Nutty; 06-26-03 at 03:34 AM.

Old 06-26-03, 04:04 AM   #33
StealthHawk
Guest
 
Posts: n/a

Quote:
Originally posted by Nutty
Sure, the alpha channel was deliberately added to show what happens to alpha information.

If you press 'A' it toggles the alpha channel already.
In that case, I will go take screenshots with the alpha channel turned off.
Old 06-26-03, 04:27 AM   #34
Miksu
Registered User
 
Join Date: Apr 2003
Posts: 76

Quote:
Originally posted by gstanford
Tim is the guy who is claiming that unless your hardware fully accelerates FP32 his next title won't even run properly on your machine.
Remember that you're talking about an engine that will probably be released in 2006-2007. You don't know what will happen before then (except that Intel is the leader, market-share/performance-wise...)
Old 06-26-03, 04:35 AM   #35
gstanford
Registered User
 
Join Date: Oct 2002
Posts: 799

Quote:
Originally posted by Miksu
Remember that you're talking about an engine that will probably be released in 2006-2007. You don't know what will happen before then (except that Intel is the leader, market-share/performance-wise...)
Go to Beyond3D and actually read what Tim had to say. His timeframe was inside of 18 months, which puts us well before 2006 and within the lifespan of products such as R3xx and nV3x.

Just to clarify: when I say inside the lifespan of current products, I mean for the general consumer, not the hardcore gaming nut who changes top end video cards more often than his underwear.
Old 06-26-03, 06:03 AM   #36
Kruno
TypeDef's assistant
 
Join Date: Jul 2002
Location: Australia
Posts: 1,641

Quote:
Originally posted by gstanford
Go to Beyond3D and actually read what Tim had to say. His timeframe was inside of 18 months, which puts us well before 2006 and within the lifespan of products such as R3xx and nV3x.

Just to clarify: when I say inside the lifespan of current products, I mean for the general consumer, not the hardcore gaming nut who changes top end video cards more often than his underwear.
Assuming you wear underwear. I know I don't.
__________________
"Never before has any voice dared to utter the words of that tongue in Imladris, Mr. Anderson" - Elrond LOTR