Originally Posted by Fathertime36
Is 3D06 biased towards Nvidia? There are a lot of 256MB GTXs posting the same if not better scores than what you're getting with that X1900XT, and my lowly GT isn't too far below either... typical dated ATI drivers, maybe? I just ran my first bench using 3D06, and it seems like it's also biased towards dual cores... time for a dual-core Opteron xxx... I really think I've gotten my money's worth out of the 939/3000+, $135 for a single CPU. Question directed at you, OWA: I notice you have a 4000+ clocked @ 2.6GHz, which seems to be the highest norm that single-core non-FX 939s get pushed to by a majority of owners. How high have you been able to push it without having to worry about stability, like a 24/7 OC? The only difference across the board is the FSB and some additional multipliers.
Yeah, it seems like 3DMark06 runs better on Nvidia cards, and it definitely takes advantage of dual-core CPUs. When I still had the X1800XT, I had both it and the 4000+ overclocked about as far as they could go and still couldn't break 4k (I think 3979 was my highest). A lot of GTXs were breaking 4k easily. The XTX improved my score by about 1k (from 3600'ish to 4600'ish at stock settings).
Also, someone informed me that the drivers that came on the XTX CD are betas (modified 5.13s) and that reviewers had newer drivers, so I'm hoping the XTX results will improve once the official drivers are released.
My 4000+ was actually at 2.58GHz, and it's stable there. That doesn't seem like much of a difference, but it actually takes me a while to get it stable at 2.6.
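For anyone following along with the overclock numbers: on socket 939 the core clock is just the HTT reference clock times the CPU multiplier. A rough sketch of that arithmetic (the 12x multiplier is an assumption for a 4000+ at its 2.4GHz stock clock; the HTT values are only illustrative):

```python
def cpu_clock_mhz(htt_mhz: float, multiplier: int) -> float:
    """CPU core clock in MHz = HTT reference clock x CPU multiplier."""
    return htt_mhz * multiplier

# Stock 4000+ (assumed 12x multiplier): 200 MHz x 12 = 2400 MHz
print(cpu_clock_mhz(200, 12))  # 2400.0

# A ~2.58 GHz overclock via the reference clock: 215 MHz x 12
print(cpu_clock_mhz(215, 12))  # 2580.0
```

Since the multiplier on non-FX chips is locked upward, the reference clock is the knob that moves, which is why memory and HyperTransport dividers end up mattering for stability.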