06-03-03, 12:42 AM   #193
Nv40

Many interesting views on different topics... here are mine.

1) PC games are not losing any ground; quite the opposite. Where do you think all of ATI's and Nvidia's money comes from?

The only thing that can hurt PC games is warez, but CD keys are actually working really well. Heck, I bought Raven Shield and many other games; when they kick ass, they deserve my money. Can't wait for Doom 3 and HL2. The other thing that could hurt PC games is MS with the Xbox, if they try to take more control of the PC industry.

2) Standards don't benefit the industry when they benefit only a few instead of the majority. It was a bad move by MS to force Nvidia to use its full FP32 precision in games (it was never intended for today's games, but for the professional market). WTF were they thinking? They are responsible for this whole FP24 vs. FP32 mess, since they raised the spec many months after Nvidia's hardware was finished. So as you can see, sometimes standards do more damage to the industry than good (part of the 3DMark/Nvidia mess has to do with MS's precision requirements).

Taking into account that Microsoft is trying to monopolize the industry, I would hate to see OpenGL go away; it is the only thing keeping MS from gaining full control of PC gaming and forcing game developers to code only for their gaming console. (cough) Halo.

Standards are not always a good thing. In the console market standards don't exist; notice the many high-quality games and how successful that industry is. The positive side of proprietary hardware is how much innovation there is (PS2 vs. Xbox vs. GameCube vs. Dreamcast), while on the PC all video cards do almost the same thing. Something makes me believe that without standards we would see much more innovation in video cards, with new technology. The negative side is more work for developers. A good approach could be something in the middle between standard and proprietary: more innovation without too much extra work for developers.

3) No matter whether Nvidia or ATI supports 3DMark now, no benchmark should ask big companies for hundreds of thousands of dollars, because nobody hands over such a huge sum without expecting something in return. The way Dell defends 3DMark makes me believe this even more; I don't see Gateway or Compaq in the beta program. Something bigger is going on. With other synthetic benchmarks, at least, nobody has to pay anything; people look at the scores but never buy a card based on them. But 3DMark is used by Dell and ATI, and probably in the future by Nvidia, to sell video cards. I'm against that, because gamers buy video cards to play games, and 3DMark may mislead them about the performance they will see in their favorite games. That's why you so often see in forums: "Why is my card so slow in this game?" Unless 3DMark uses real game engines, there is no way it can be taken as representative of future games.

In games it's a different story. Generally speaking, it's harder to pull off the same thing there as in 3DMark, because game developers need to sell their games: they know that if ATI hardware doesn't run their game well, they will lose money, and that if Nvidia hardware doesn't run their game well, they will lose money. A synthetic benchmark doesn't need to run well; it only needs to give you a score.

And ATI and Nvidia know that to sell their video cards, they need to run well in all games, not just one. So it's impossible to buy off every game developer and every game to perform well only on your cards; it's far cheaper to spend that money designing powerful hardware.

Last edited by Nv40; 06-03-03 at 01:00 AM.
06-03-03, 12:43 AM   #194
bkswaney

Quote:
Originally posted by Rowen
Let's not blame nVidia... after all, they're only trying to prove what they always said about 3DMark2003... it sucks and can be manipulated...

Now seriously... as long as a driver shows the performance gains in real games without any kind of problem, why would I care about 3DMark??? It's useless... it's only a competition between people trying to get the ultimate score...

Yep... and we all know 3DMark can be beaten. There are ways to cheat it without being Nvidia.

As long as the drivers are stable and keep making my games faster and better looking, I couldn't care less really.

#1 - stability (no bugs)
#2 - IQ
#3 - speed
06-03-03, 12:47 AM   #195
StealthHawk

Quote:
Originally posted by Zenikase
The reason why more developers are moving to consoles is that they have rules set in stone as to what will work and what won't. It's a single set of hardware they have to program for, and there's no need to worry about compatibility because it will never change. This leads to much faster and cheaper development, allowing them to focus on more important things, like the game itself.
I would say that more developers move to consoles because console games are more profitable, as they sell more copies than PC games. Look at the top 10 PC games sold in any given year; I don't think every game in the top 10 sells a million copies (certainly not the top 20). The top 20 console games each year easily sell a million copies each.
06-03-03, 12:48 AM   #196
StealthHawk

Quote:
Originally posted by scott123
He, he: I hate to say I told ya so.

Kinda looks like Futuremark sees the reality, and realizes that they need Nvidia on board to survive.

As I said, it doesn't matter who's right and who's wrong. When the #1 graphics card maker says your product sucks, and your product just happens to be a benchmark program, you have a serious problem on your hands.

I think Futuremark sees the reality, and they are in survival mode.

Scott
Hmm, Nvidia is the one that left Futuremark. Futuremark didn't kick them out.
06-03-03, 12:56 AM   #197
Zenikase

Quote:
Originally posted by Nv40
Many interesting views on different topics... here are mine.

2) Standards don't benefit the industry when they benefit only a few instead of the majority. It was a bad move by MS to force Nvidia to use its full FP32 precision in games (it was never intended for today's games, but for the professional market). WTF were they thinking? They are responsible for this whole FP24 vs. FP32 mess, since they raised the spec many months after Nvidia's hardware was finished. So as you can see, sometimes standards do more damage to the industry than good (part of the 3DMark/Nvidia mess has to do with MS's precision requirements).

Taking into account that Microsoft is trying to monopolize the industry, I would hate to see OpenGL go away; it is the only thing keeping MS from gaining full control of PC gaming and forcing game developers to code only for their gaming console. (cough) Halo.
Microsoft can't force nVidia to do anything; all they do is lay down the guidelines for 3D accelerators to follow. The spec sets only minimums for how precisely fragment shader operations must be carried out, and the manufacturers have to live with their design decisions, favorable or not.

Here's an excerpt from an NV30 vs. R300 comparison on B3D:

Quote:
According to DX9 Beta 2.1, the internal precision required by PS2.0 is as follows:
  • Implementations vary precision automatically based on precision of inputs to a given op for optimal performance.
  • The minimum level of internal precision for temporary registers is s10e5 (FP16).
  • The minimum internal precision level for constants is s10e5 (FP16).
  • The minimum internal precision level for input texture coordinates is s16e7 (FP24).
  • Diffuse and specular registers are only required to support [0-1] range, and high-precision is not required.

So, we can see R300 is a true DX9 card in spite of 24bit internal float precision in the pixel shader pipeline. Of course, NV30, which supports true IEEE-32(s23e8) FP precision, is also a true DX9 card.

Note that only parts of the R300 pipeline are at 24bit precision, with the chip being a mixture of both 32bit and 24bit floating point precision. The core pixel shader operations are carried out at FP24 precision; however, the texture address operations (and the entire vertex shader pipeline) are IEEE-32 (s23e8) FP precision. The output of shaders can be converted to lower precision, such as 32bit or 64bit per pixel, or converted up to 128bit per pixel.

It was confirmed that R300 and NV30 both support FP16 per-component textures and FP32 per-component textures. In addition, R300 supports 16bit fixed point textures, while NV30 supports 12bit fixed point textures. NV30 and R300 also support 64bit and 128bit "float" frame buffers.

R300's pixel pipeline also supports 1d/2d/3d/cubemap floating point textures, where NV30 is limited to texture_rectangle. Floating point textures on R300 are limited to nearest filtering. R300 also supports multiple 128bit and 64bit texture formats, including a c4_16 format where each component is a 16bit fixed point value and full filtering, 3D textures, and projected textures are supported.
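
By the way, for anyone puzzled by the sNeM notation in that excerpt: it means 1 sign bit, N mantissa (significand) bits, and E exponent bits, so s10e5 is FP16, s16e7 is FP24, and s23e8 is plain IEEE FP32. Here is a rough sketch of what each minimum buys you in practice (my own illustration, not from the B3D article, and it assumes an IEEE-style layout for FP24, which is not actually standardized):

Code:
# Rough characteristics of the shader float formats discussed above.
# "sMeE" = 1 sign bit, M mantissa (significand) bits, E exponent bits.
# NOTE: an IEEE-style layout is assumed for FP24, which has no official standard.

FORMATS = {
    "s10e5 (FP16)": (10, 5),
    "s16e7 (FP24)": (16, 7),
    "s23e8 (FP32)": (23, 8),
}

for name, (mant, exp) in FORMATS.items():
    eps = 2.0 ** -mant                       # relative spacing of adjacent values
    emax = 2 ** (exp - 1) - 1                # largest unbiased exponent
    vmax = (2 - 2.0 ** -mant) * 2.0 ** emax  # largest finite value
    print(f"{name}: {1 + mant + exp} bits, epsilon ~ {eps:.1e}, max ~ {vmax:.2e}")

The exponent width mostly sets the range; the mantissa width sets the relative precision, and that is what the whole FP16 vs. FP24 vs. FP32 argument is really about: roughly 3, 5, and 7 decimal digits respectively.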

Last edited by Zenikase; 06-03-03 at 02:59 PM.
06-03-03, 12:58 AM   #198
Steppy

Quote:
Originally posted by Nv40
Many interesting views on different topics... here are mine.

2) Standards don't benefit the industry when they benefit only a few instead of the majority. It was a bad move by MS to force Nvidia to use its full FP32 precision in games (it was never intended for today's games, but for the professional market). WTF were they thinking? They are responsible for this whole FP24 vs. FP32 mess, since they raised the spec many months after Nvidia's hardware was finished. So as you can see, sometimes standards do more damage to the industry than good (part of the 3DMark/Nvidia mess has to do with MS's precision requirements).
Umm, it was Nvidia's own pimping of their "32-bit" that led MS to decree ATI's "lower" 24-bit the minimum standard. It turned out that Nvidia's 32-bit is/was mostly unusable, and the cards needed to drop to 16-bit or less to perform well. Had Nvidia not been so concerned with one-upping ATI on the spec sheet, and just said 16-bit was their preferred method, 16-bit most likely would have been the minimum for DX9.
06-03-03, 01:00 AM   #199
Quinn1981

Seems like nVidia knew from the get-go that their new FX stuff sucked at vertex and pixel shaders compared to the ATI stuff and couldn't admit it. Or at least that's what it looks like to me. I don't see why it matters, when the next group of cards is what gamers are going to go crazy for and we might start to see some real use of DX9. They have made a big deal out of nothing, really. They could be working on great drivers and making sure their next product is going to be its best instead of this crap. Oh well. Guess there will be more ATI users as we go along. nVidia needs to shape up. It only took 3Dfx a couple of blunders to end up going away, and nVidia has racked up a good few already with the FX stuff. I still think the 5200s are nice for slow systems or workstations though.
06-03-03, 01:00 AM   #200
Zenikase

Quote:
Originally posted by StealthHawk
I would say that more developers move to consoles because console games are more profitable, as they sell more copies than PC games. Look at the top 10 PC games sold in any given year; I don't think every game in the top 10 sells a million copies (certainly not the top 20). The top 20 console games each year easily sell a million copies each.
You also have to consider the fact that there are more console gamers than PC gamers, since most people don't like going through the trouble of dealing with bugs/crashes, incompatibility, poor performance, etc. Others may not have a system powerful enough to enjoy the game (although this has become less and less common with the trickling down of DX8.1-class video cards into the value market). With a console you just pop in the disc and you're set. Add to this the fact that console games are much harder to pirate, due to proprietary media and the complex hardware modding required to play a pirated game, which is not only risky but also voids your warranty.

Last edited by Zenikase; 06-03-03 at 01:10 AM.

06-03-03, 01:29 AM   #201
Nv40

Quote:
Originally posted by Zenikase

Microsoft can't force nVidia to do anything; all they do is lay down the guidelines for 3D accelerators to follow. The spec sets only minimums for how precisely fragment shader operations must be carried out, and the manufacturers have to live with their design decisions, favorable or not.

Oops...

You don't know how much power M$ has. Intel and AMD will never say or do anything contrary to what MS says. MS has too much power and control in the computer industry. Their OSes are the best looking and the best selling, but also the ones with the most security flaws. Why, you may ask? Because their first concern is business: MS is more a business company than a technology company (this is what the author of my Windows 2000 Server book says). Ask AOL how they won $700 million from M$.

What you call "laying down the guidelines" are RULES, nothing more and nothing less. If you don't agree with their rules, you don't get their approval for DirectX 9 certification, and since there is no other choice of OS, you cannot sell that hardware. That sounds like a good thing, since compatibility is what matters, but it kills any possibility of innovation. If absolutely everything in the PC industry needs to be approved and controlled by MS, then wait for the day when they start selling computers (Xbox super stations) and you cannot get their OS unless you buy their hardware.

Last edited by Nv40; 06-03-03 at 01:44 AM.
06-03-03, 01:43 AM   #202
Zenikase

So anything designed for Windows is automatically backed by Microsoft and thus inherently evil? Your logic seems a bit flawed, my friend.

Besides, the DirectX specification isn't decided upon solely by Microsoft itself. Major players from all parts of the field gather to put their ideas into the next DX standard. They're not going to go ahead and change the specification without the approval of, say, Carmack and Sweeney.

Does OpenGL offer raytracing and realtime radiosity? If so, I wasn't aware. Maybe we really should ditch DirectX. I mean, it's practically useless and obviously just an evil marketing tool from Microsoft.

The bottom line is that DX is a relevant API and will continue to be so for a long time.

EDIT: Also, DirectX is not a type of certification that video cards need to get. It is simply a standard, a set of rules and guidelines designed for compatibility across the board. Neither nVidia's nor ATi's top products follow the PS 2.0/3.0 specification word for word, but they are still considered DX9-level cards.

Last edited by Zenikase; 06-04-03 at 12:38 AM.
06-03-03, 01:49 AM   #203
SlyBoots

Quote:
Originally posted by Zenikase
Microsoft can't force nVidia to do anything; all they do is lay down the guidelines for 3D accelerators to follow. The spec sets only minimums for how precisely fragment shader operations must be carried out, and the manufacturers have to live with their design decisions, favorable or not.

Here's an excerpt from an NV30 vs. R300 comparison on B3D:

And here's an excerpt from the updated MSDXDEL:

"- For ps_2_0 compliance, the minimum level of internal precision for
temporary registers (r#) is s16e7** (this was incorrectly s10e5 in spec)
- The minimum internal precision level for constants (c#) is s10e5.
- The minimum internal precision level for input texture coordinates
(t#) is s16e7.
- Diffuse and specular (v#) are only required to support [0-1] range,
and high-precision is not required"

That little typo makes quite a difference... well, to Nvidia anyway.
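
To put a number on it: a 10-bit mantissa (s10e5) resolves relative steps of about one part in a thousand, while a 16-bit mantissa (s16e7) resolves about one part in 65,000. A toy sketch of how rounding drift piles up in a long shader when temporaries are held at each precision (my own illustration, not from the spec or from either vendor's hardware):

Code:
# Toy model: quantize every intermediate result to a given mantissa width,
# mimicking a low-precision temporary register in a long pixel shader.
import math

def quantize(x, mant_bits):
    """Round x to mant_bits of significand (IEEE-style, normal numbers only)."""
    if x == 0.0:
        return 0.0
    e = math.floor(math.log2(abs(x)))   # current binade of x
    scale = 2.0 ** (mant_bits - e)
    return round(x * scale) / scale     # keep mant_bits bits of the significand

# Accumulate a small per-pass increment 1000 times; the exact answer is 1.0.
for bits, label in [(10, "s10e5 temporaries"), (16, "s16e7 temporaries")]:
    acc = 0.0
    for _ in range(1000):
        acc = quantize(acc + 0.001, bits)
    print(f"{label}: sum = {acc:.6f} (exact: 1.000000)")

The 10-bit run comes out a couple of percent low, while the 16-bit run is off by only a few parts in ten thousand. That is the kind of difference the corrected wording is about: an FP16 register path meets the old s10e5 minimum but not the new s16e7 one.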
06-03-03, 01:51 AM   #204
ntxawg

Didn't Nvidia originally say it was a driver bug, but now they say it was an optimization? Sheesh, looks like it's time to demand a refund.