965 (I don't want to use it, but) FTW


XDanger
04-04-07, 07:47 PM
http://www.tomshardware.com/2007/04/04/does_chipset_to_gpu_matching_matter/

Tom's can go suck my .... but anything that validates my hardware choice is always fine by me.
Edit: by the way, I didn't go there on purpose.

Xion X2
04-04-07, 08:53 PM
Tom's Hardware is an absolute effin joke. It is the cesspool of the internet.

buffbiff21
04-04-07, 09:30 PM
What's wrong with their CPU and GPU charts??

LORD-eX-Bu
04-04-07, 09:51 PM
Intel has the best chipsets FTW!

Blacklash
04-05-07, 01:23 AM
If you feel you must get something now, the P965 is a very good choice. If you want to wait a while, grab the upcoming Intel Bearlake X38.

http://www.nordichardware.com/news,5947.html

LORD-eX-Bu
04-05-07, 01:51 AM
So... the X38 should be the 975's replacement and the G35 should be the 965's replacement, right?

Blacklash
04-05-07, 03:45 AM
So... the X38 should be the 975's replacement and the G35 should be the 965's replacement, right?

I believe it's the X38 that replaces the 975X and the P35 that replaces the P965. Beerlake FTW! Erm... um, Bearlake.

Xion X2
04-05-07, 10:29 AM
What's wrong with their CPU and GPU charts??

If you spent 5 minutes on their forums, you would see what's wrong with them. They have immature kids writing their articles who wouldn't know a good CPU from their #*!

They also believe that taking single FPS samples from games, instead of an overall average, is more effective for gauging performance. Before I left there the last time, one of their writers specifically said he thought this was a "good idea."
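
Just to spell out the two kinds of numbers being argued about, here's a rough sketch of my own (nothing from their articles, and the frame times are made-up sample data): an overall average over the whole run versus a single sample like the worst frame.

    // Rough illustration, my own -- frame times below are made-up examples.
    // "Average fps" is total frames over total time; "minimum fps" is the
    // single worst frame converted to an instantaneous rate.
    #include <cstdio>

    int main()
    {
        const double frame_ms[] = { 14.0, 15.2, 13.8, 41.0, 16.5, 14.9, 15.1 };
        const int n = sizeof(frame_ms) / sizeof(frame_ms[0]);

        double total_ms = 0.0, worst_ms = 0.0;
        for (int i = 0; i < n; ++i) {
            total_ms += frame_ms[i];
            if (frame_ms[i] > worst_ms) worst_ms = frame_ms[i];
        }

        double avg_fps = 1000.0 * n / total_ms;  // overall average over the run
        double min_fps = 1000.0 / worst_ms;      // the single worst sample

        std::printf("average: %.1f fps, minimum: %.1f fps\n", avg_fps, min_fps);
        return 0;
    }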

Just check out a few of these gems from a recent topic I participated in over there for a good laugh. This was all in regard to a debate where most of them refused to acknowledge that you would be bottlenecking an 8800GTX with a weaker processor like an Athlon X2 3800+ @ 2.0GHz compared to a Core 2 Duo @ 3.0GHz:

average FPS mean very little in most cases and there meaning is non existant in this topic. a would bet that any difference in minimum FPS will be little or non existant.

average fps means nothing to how a game play.

If your point was that there would be a bottleneck, but not a perceptible one, then why didn't you say that earlier?

This guy above came back at me with that after I posted this benchmark showing a 26fps difference in performance on an 8800GTX between an X2 3800+ and an E6700 at stock speed. This below, a 34% difference in performance, is what he calls "not perceptible":

http://img54.imageshack.us/img54/3728/14067ch7.png

Seems like if you're going to be playing at high resolutions with eye candy, the processor isn't that much of a bottleneck at all...

That guy Cleeve is a writer for Tom's, and he refused to acknowledge the 34% performance increase from going to an E6700 from an X2 3800+ on a single GTX in the benchmarks I provided him at 1600 resolution. He also disputed that filtering and AA don't cause the G80s to take much of a performance hit at 1600 resolution, despite what you'll see from these graphs:

http://www.beyond3d.com/images/reviews/g80-perf/ep1_00.png
http://www.beyond3d.com/images/reviews/g80-perf/ep1_416.png


Instead, he chose to believe that if you enabled those features in the benchmark I provided, showing the E6700 with a 34% performance lead over the X2 3800+, it would even them out to a very meager difference, which is baloney.

As for the rest of them, they bow down to their crony moderators/writers as if they were gods who knew everything about computers, when the reality is that they don't know #*!~. Case in point:

HooHoo Yeah Baby!
Wow, Cleeve, you melt me...
death has arrived for those who stood against us

http://forumz.tomshardware.com/8800-GTS-640mb-ftopict230573.html

The place is an effing joke. I still remember that one benchmark I read that had the 1900GT on equal footing with a 7950GX2. Classic.

Sorry to get things off-topic; I was responding to buff's question about Tom's Hardware.

Blacklash
04-05-07, 11:37 PM
It would be nice if nVidia and Intel would work something out, but I don't see it happening. I suppose an alternative would be to pick up the 8950 when it's released.

We know SLI is not working well under Vista. I can't remember if the GX2-style cards are completely broken in Vista; they may only operate as a single GPU. If they work as two, then I'd be set.

Right now I don't feel the need for the push of two GPUs. If I do upgrade my monitor and change my mind I suppose I could go with the 8950. Time will tell.

Xion X2
04-06-07, 09:44 AM
Last I'd heard, the GX2s were functioning on a single GPU in Vista, but I *think* the last driver released fixed 7-series SLI in Vista, didn't it?

Don't know for certain because I have little reason to keep up w/ the 7-series anymore.

XDanger
04-07-07, 10:33 PM
The difference between 54fps and 76fps IS imperceptible if your monitor only does 60Hz. :p
(which it seems is what most people who upload to Steam run theirs at; there are some weird numbers in there)

Xion X2
04-07-07, 11:20 PM
^ Only if you run vsync, which most people don't.

I do run vsync, however, and I can tell the difference when my frames are below 60. Games aren't as smooth.
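
To put some rough numbers on the "below 60 isn't as smooth" thing: on a 60Hz panel with vsync and plain double buffering, a frame that misses the ~16.7ms window has to wait for the next refresh, so rendering at 54fps can get snapped down toward 30fps. Here's a little sketch of that simplified model (my own numbers and assumptions, not a measurement; triple buffering softens this):

    // Back-of-the-envelope sketch (a simplified model of my own, not measured):
    // with vsync and plain double buffering, each frame is shown at the first
    // vblank after it finishes rendering, so render times just over 16.7 ms
    // fall to the next divisor of 60.
    #include <cmath>
    #include <cstdio>

    int main()
    {
        const double refresh_ms = 1000.0 / 60.0;               // 60 Hz panel
        const double render_fps[] = { 76.0, 60.0, 54.0, 45.0 };

        for (int i = 0; i < 4; ++i) {
            double render_ms  = 1000.0 / render_fps[i];
            // Wait for the next refresh boundary after the frame is done.
            double display_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;
            std::printf("renders at %.0f fps -> displays at %.1f fps\n",
                        render_fps[i], 1000.0 / display_ms);
        }
        return 0;
    }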

Amuro
04-07-07, 11:48 PM
Will the new Intel FTW chipsets support SLI FTW?!!

Blacklash
04-08-07, 01:38 AM
If you want to SLI 7800s on an Intel 975 chipset, the driver below will do it. Seems kinda pointless, though, with the current crop of cards available:

http://www.techpowerup.com/downloads/304/NVIDIA_Forceware_81.85_Intel_SLI_Edition.html

Too bad no one has hacked a recent driver.

XDanger
04-16-07, 03:31 PM
^ Only if you run vsync, which most people don't.

I do run vsync, however, and I can tell the difference when my frames are below 60. Games aren't as smooth.

I may deserve a slap for this, but do you set your prerender limit to 3 and enable triple buffering?

Xion X2
04-16-07, 04:10 PM
I think that triple buffering in DirectX is enabled by default when you vsync on the 8800 series. I've never had to use a third-party utility to enable it. There's an option in the control panel, but I'm pretty sure that it specifies OpenGL, not DirectX.

I never mess with the prerender limit. The slight lag that you get when vsyncing doesn't really bother me; I'd much rather endure that than screen tearing.
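
For what it's worth, here's roughly what that looks like on the DirectX 9 side: it's the application that asks for the extra back buffer and the vsync'd present interval, which is why the driver panel's triple buffering switch only covers OpenGL. This is a bare-bones sketch of my own (it assumes the DX9 SDK headers; GetDesktopWindow() is just a stand-in HWND, a real game would use its own window):

    // Bare-bones sketch (mine, not from any game): a D3D9 app opting into
    // vsync and triple buffering itself via the present parameters.
    #include <windows.h>
    #include <d3d9.h>
    #pragma comment(lib, "d3d9.lib")

    int main()
    {
        IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
        if (!d3d) return 1;

        D3DPRESENT_PARAMETERS pp = {0};
        pp.Windowed             = TRUE;
        pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat     = D3DFMT_UNKNOWN;           // match the desktop format
        pp.BackBufferCount      = 2;                        // 2 back buffers = triple buffering
        pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;  // vsync: one vblank per Present()
        pp.hDeviceWindow        = GetDesktopWindow();       // stand-in HWND for this sketch

        IDirect3DDevice9* dev = 0;
        HRESULT hr = d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                       pp.hDeviceWindow,
                                       D3DCREATE_HARDWARE_VERTEXPROCESSING,
                                       &pp, &dev);

        if (dev) dev->Release();
        d3d->Release();
        return SUCCEEDED(hr) ? 0 : 1;
    }

The prerender limit is a driver-side setting (how many frames the CPU is allowed to queue ahead), so there's nothing for it in the snippet.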

Dazz
04-16-07, 05:14 PM
60fps is more than enough, but vsync is a must; screen tearing does my bonce right in.