View Full Version : Broken VP in 6800GT/Ultra could be fake!


Salamandar
12-21-04, 06:30 PM
To be honest with you, I don't think the VP in the 6800GT/Ultra is by any means broken. IMHO, Nvidia made all this up for two reasons:

1- They want something that won't cost them much to hit the graphics and video market with.

2- To force some 6800GT/Ultra users to pay a little more for their cards.

Call me a conspiracy theory freak, but the timing of this decoders/drivers thing, and the whole eight-month wait to come up with it... I don't know, it just does not feel right.

P.S. I'm not trolling.

And a (xmassign2) for everyone.

rewt
12-21-04, 06:36 PM
I don't get it. It's been over 8 months and the VP still doesn't work like it was supposed to.

Salamandar
12-21-04, 06:41 PM
I don't get it. It's been over 8 months and the VP still doesn't work like it was supposed to.

That's exactly what I'm talking about: a very complicated chip gets broken in a part that is supposed to be an important technology. I might believe PS 3.0 not working, but the VP?
I can't swallow that, sorry.

rewt
12-21-04, 06:46 PM
nVidia themselves have already stated that the 6800U/GT does not do WMV9 HD hardware accelerated decoding. That right there is enough to convince me.

http://www.nvidia.com/page/purevideo_support.html

rewt
12-21-04, 06:58 PM
However, it is entirely possible that the VP wasn't actually broken, but that nVidia just misled everyone about what it was capable of doing in order to sell more cards. It wouldn't surprise me one bit.

Similar to when nVidia said the NV35 could do "up to 8 pixels per clock cycle", leading you to believe it was an 8x1 chip, when in fact it is actually a slower 4x2 chip.

Salamandar
12-21-04, 07:43 PM
However, it is entirely possible that the VP wasn't actually broken, but that nVidia just misled everyone about what it was capable of doing in order to sell more cards. It wouldn't surprise me one bit.

Similar to when nVidia said the NV35 could do "up to 8 pixels per clock cycle", leading you to believe it was an 8x1 chip, when in fact it is actually a slower 4x2 chip.

YUP......... :screwy:

adpr 02
12-22-04, 02:12 AM
Do you realise how stupid that would be? Think about it. If a GEEKAZOID found a way to make it work and released it to the public, then EVERYBODY would be PISSED off at NVIDIA, and it would lose TONS of money (people wouldn't trust Nvidia anymore). They are too big and rich to just make up stories that could make them lose everything.

Elderblaze
12-22-04, 03:27 AM
MPEG-2 hardware decode is pretty useless anyway; I don't see what the big deal is. It takes about 15-30% of my CPU to decode DVDs in full software mode. The quality is quite good, and I can get 5.1 audio. With that stupid nVidia decoder I'm forced to use Windows Media Player, which means I get to enjoy downmixed audio on my Klipsch ProMedia 5.1s... which, by the way, are the only reason I watch movies on my PC; my PC's audio equipment is far better than my TV speakers.

Regards,
Mike

Subtestube
12-22-04, 04:59 AM
I hate to point it out, but they've antagonised a key market segment by it not working as promised. No, if it worked in hardware, you can be damn sure it'd be working by now.

ChrisRay
12-22-04, 05:29 AM
The bad PR wouldn't be worth it for Nvidia to do this.

MUYA
12-22-04, 05:31 AM
Key market segment? Hardly. People buy graphics cards like the Ultras, GTs and NUs for gaming... video decoding was an extra feature that is nice to have if you use your PC for that. However, AFAIK only WMV decode is not working.

Anyway... talking about market segments? HTPC enthusiasts with loud Ultras/GTs in their HTPC systems? I don't think so. WMV hardware decode acceleration is the only function that is borked on NV40s and NV45s. All other NV4x parts have working WMV decode acceleration! An HTPC is hardly likely to have an Ultra etc. Now a 6200 or 6600 with a quiet solution, maybe.


But NV40s and NV45s do one thing great, and that's gaming... now NV45s with SLI, they rawk!

ChrisRay
12-22-04, 05:36 AM
Maybe by key he meant specific? Though I do have a hard time believing true hardcore video enthusiasts would buy a product like the 6800 for video decoding. There are much better solutions that such a market would want. However, the idea of only one card doing all the work is understandable.

TBH I have found these video decoding acceleration features to be a hassle and a lot of trouble. I am still toying with this software trying to make it work. ((But I haven't given up yet))

MUYA
12-22-04, 05:38 AM
Big market, that.

Knee-jerk reactions and rattle throwing.

ChrisRay
12-22-04, 05:41 AM
Heh, no doubt. It is a small market. I remember once upon a time I really cared about DVD hardware acceleration, particularly for DVD content. Half the time I just prefer to use software, since it can pretty much do anything hardware decoders can these days.

And I'm still fighting with this DVD decoder software. Argh (xmastree)

Salamandar
12-22-04, 08:09 AM
Do you realise how stupid that would be? Think about it. If a GEEKAZOID found a way to make it work and released it to the public, then EVERYBODY would be PISSED off at NVIDIA, and it would lose TONS of money (people wouldn't trust Nvidia anymore). They are too big and rich to just make up stories that could make them lose everything.

I doubt a geek would find a solution; remember, he doesn't have the blueprints of the GPU, nor does anyone know exactly how this VP inside the Nvidia chips works.

zoomy942
12-22-04, 08:35 AM
When I got my 6800, the potential for onboard video decoding wasn't even a factor for me. I don't watch movies or anything on my PC. But for the people that do, with the uber-powerful machines we all have anyway, I figure... "oh well, no decoding", 'cause our machines can handle it anyway.

Huigie
12-22-04, 10:57 AM
then EVERYBODY would be PISSED off at NVIDIA

Guess who's producing my next video card...
(mikec)

Mojoe
12-22-04, 11:34 AM
Guess who's producing my next video card...
(mikec)
Matrox? (snow=D)

MustangSVT
12-22-04, 12:17 PM
I just have a quick question. How many of you here can truly say that you watch HDTV videos all day, that your CPU is too slow to render 1080p videos without dropping frames, that you mainly bought your 6800 video card because it has "video acceleration" and not because it's better/you like nVIDIA more, and that you were planning on watching 1080p movies while playing a game? How many of you can actually say that?

Zetto
12-22-04, 12:23 PM
Matrox? (snow=D)

Parhelia! (snowlol)

Anyway, PureVideo is a poor excuse for the promises nVidia made. I do have an Ultra and I intend to make it quiet by watercooling it. Others can go the NV5 way (as bad as the latter can be in terms of build quality, it offers a quiet solution). Besides, there are quite a few people out there using their computers for media encoding, and that was one of the things nVidia promised but quickly reneged on after a short while. (snowman)

Anyway, (xmassign2) and (cheers), and let nVidia burn in hell where it belongs. Unfortunately, ATI is no better; no big company can be fair and square with its customers. That's capitalist corporate culture for ya ;)

Salamandar
12-22-04, 01:25 PM
For me the VP was never a reason to choose a 3D card; gaming performance and IQ are my number one reasons for a new card, but the IQ when playing a DVD also gets a high priority after gaming.
Once more, I'm only stating an opinion that could be right or wrong and could be the center of some discussion with all of you guys.

And by the way, I switched to ATI when all the fuss was going on concerning Half-Life 2 and the GeForce FX, and to be honest with all of you... it was a mistake, as most of the games I play run much better on Nvidia cards, and I have to live through all the ups and downs of ATI drivers, especially in OGL.



(mikec)

particleman
12-22-04, 01:51 PM
They also said it would be able to encode video and that it was programmable; I guess we can stop hoping for that if decoding doesn't even work.

I am one of the people who aren't too disappointed by this, since I didn't buy my 6800 GT for video. But I can understand how some people would be if this was a critical factor in their buying decision.

Subtestube
12-22-04, 02:45 PM
When I said a "key market segment", yes, I essentially did mean the rattle-throwing enthusiast market. I'm not talking about media guys; as ChrisRay has correctly noted, they wouldn't be buying an NV40 specifically for that anyway. For a start, any feature that's unsupported at time of purchase isn't one you should ever bank on getting adequate support for.

nVIDIA know as well as anyone else that although the enthusiast market doesn't decide the success of a card, it certainly has a significant effect on it. The issues surrounding the VP have annoyed a lot of people. Just browse tech message boards and you'll see duplicates of the 40-page monster that was the whining thread here. I'm certain that nV are aware of this - pretty much without question. Ironically, I think their original focus for PureVideo was actually more on the video processing than on hardware WMV/MPEG-4 decoding. Again, I'm not suggesting that a lot of people actually (when they're really honest about it) bought the cards specifically for the WMV decoding. If people are really fair about this, they'll realise that, as enthusiasts, by the time HD WMV becomes really mainstream they'll have bought new graphics cards (and probably everything else ;) ) anyway. All I'm saying is that the whole situation has made a significant (if not directly profit-significant) part of one of their market demographics very annoyed. This will, in all likelihood, work against them in the future. That's all I was trying to say.

MUYA
12-22-04, 07:00 PM
Hardly. When the next GPU launches... all will be forgiven if it roxors your boxors.

Let the talk about the non-working GPU WMV decoding acceleration cease. We have had enough of the ranting, etc.