Old 07-11-08, 02:43 PM   #85
cvearl
Radeon 9700 Pro
 
Join Date: Nov 2002
Posts: 475
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
You can agree to disagree with me all you like; I'm disagreeing with you on the grounds of testing and evidence. A 768 MB card can complete a test, despite typically being slower, while a 512 MB card cannot. Yet when you lower the resolution or AA to a manageable framebuffer pool, the 512 MB card not only completes the tests but performs competitively with, or better than, the 768 MB card. Yes, they are different pieces of hardware, but the problem is blindingly obvious.

And...

If you're buying one card and sticking with 4x MSAA at 1680x1050, you are unlikely to encounter many problems with most software, at least not within the given one-year time frame (let's assume one card purchase is a one-year investment). With regards to the 8800GTX: yes, it's a good card. But the 9800GTX+ is easily faster than it in settings that don't hamper its framebuffer. Once that framebuffer is exceeded, the 8800GTX pulls far ahead of the 9800GTX, despite the 9800GTX's superior shading capability and bandwidth-saving color compression.
Chris,

I know you have forgotten more about video cards than I will ever know. I am asking the following with complete respect. Hope you can explain.

I totally get what you are saying. But when it comes to real-world situations, where 1GB is supposed to be a good insurance policy for minimizing the chance of bad 512MB performance, I have some questions.

In the following review you have a 768MB 8800GTX up against a 512MB 9800GTX. This is only one example of such reviews, but as you look at all of the heaviest tests, the 512MB card is doing better than the 768MB card. I know there are clock differences and other factors, but in the one case where my brain THOUGHT the 768MB card would win, it didn't.

COD4, 2560x1600, 4xAA, obviously maxed out: http://www.anandtech.com/video/showdoc.aspx?i=3340&p=3

The 512MB 9800GTX gets 40.7 fps
The 768MB 8800GTX gets 38.7 fps

Then I thought: frig that! 768MB has GOTTA beat 512MB somewhere. I know: Oblivion!

2560x1600, 4xAA, 16xAF, maxed: http://www.anandtech.com/video/showdoc.aspx?i=3340&p=5

512MB 9800GTX gets 24
768MB 8800GTX gets 20

/sigh

In all other tests, regardless of settings and resolution, the 9800GTX appears to match or beat it.

Can you help me understand why? And at what point will the extra 256MB change this result?

And I am just looking for an honest answer here. Will a person notice more microstutter in that COD4 example even though the framerate is about the same?

In Crysis it does win, at 2048x1536 4xAA 16xAF: http://www.techpowerup.com/reviews/L...800_GTX/6.html

8800GTX got 18.5
9800GTX got 13.3

Again, in the rest of the test suite (http://www.techpowerup.com/reviews/L...00_GTX/15.html):

Both cards deliver almost exactly the same performance. I would have thought that with the extra 256MB of RAM, the wider memory bus, and 8 additional ROPs, the 768MB 8800GTX would have owned the 512MB 9800GTX or 4850 across the board once you were at 1920x1200 or higher with AA and AF applied.
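For what it's worth, here is a quick back-of-the-envelope bandwidth sketch (Python, assuming the commonly quoted launch memory clocks: 900 MHz GDDR3 on the 384-bit 8800GTX versus 1100 MHz on the 256-bit 9800GTX). The wider bus buys less than it looks like on paper:

Code:
# Rough memory bandwidth from bus width and effective (DDR) data rate.
# The clocks are the commonly quoted launch specs; treat them as assumptions.
def bandwidth_gb_s(bus_bits, mem_clock_mhz, ddr_mult=2):
    return bus_bits / 8 * mem_clock_mhz * ddr_mult / 1000  # GB/s

print("8800GTX:", bandwidth_gb_s(384, 900), "GB/s")   # ~86.4
print("9800GTX:", bandwidth_gb_s(256, 1100), "GB/s")  # ~70.4

So the 8800GTX has roughly a 23% bandwidth edge, not the 50% the bus width alone would suggest.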

And not only that: the 512MB 4850 is winning across the board in all but Crysis versus the 768MB 8800GTX?

Can you explain why this is? Because on paper, the 768MB beast should be tearing the 9800GTX and 4850s a new one at 1920x1200 4xAA and beyond. But it's not. WTF is the deal!

Thanks Chris.

PS - I know where 1GB WILL make a difference, as you already stated: in multi-GPU situations where the GPUs are able to run unbound and free like the wind. Then they will need all the memory you can feed them. But only in extreme cases?

9800GTX SLI versus 8800GTX SLI would be a very interesting battle.
Old 07-11-08, 02:49 PM   #86
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

I apologize for the double post, but I wanted to show something. This thread made me put the 9800GTX+ SLI cards back in. Forgive the JPEG compression; I didn't think it mattered much in this case.


Unreal Tournament 3

16xAA/16xAF @ 1920x1080 ((4xAA memory footprint))

[screenshot]

16xQAA/16xAF ((8xAA memory footprint))

[screenshot]
I know they are not 100% exact. But you try moving around at 2 FPS; you want to hurt someone by the time you reach the map point. This is a perfect illustration of what commonly happens in NWN2, UT3, and FEAR at 16xQ/8xMS on the 9800GTX, 9800GX2, and 8800GT cards. I'll be back later to post a single GTX 280 or a single GTX 260 at the same settings.

Chris


Quote:
Originally Posted by cvearl View Post
I know you have forgotten more about video cards than I will ever know. [...] Can you help me understand why? And at what point will the extra 256MB change this result? [...] I am off in search of situations where the 8800GTX beats the 9800GTX...

Thanks Chris.
Cvearl, just guy to guy: I do appreciate your respect, and I don't want you to think otherwise. But let's just be guys on the forum here and forget the other nonsense. There is one thing that should be noted, and I'm fully aware of it: I'm extremely biased toward multi-GPU. I admit it right off. So when I look at anything, I take a multi-GPU perspective on it. It may not be immediately obvious to those here who don't know me as well.

I am very glad you asked that question. The Oblivion results don't surprise me; considering my experience with the title, it's not as memory-bound as people think. However, if you turned on 8x multisampling, things would look very different. Probably not so at 1080p, where I can use 16xQ fine on the 9800GTX and the GTX 260/280 cards, but certainly if you used the texture mod pack, which uses 4096x4096 texture maps. I haven't done much work with COD4, so I am a little uncertain there. Look above at my UT3 comparison of the 9800GTX running 16xQ, which uses 8x MS storage for color data. The 9800GTX should and will always be faster when it doesn't run out of memory.
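To put rough numbers on those ((memory footprint)) labels, here is a minimal sketch, assuming RGBA8 color, D24S8 depth, and a double-buffered swap chain; the stored color/Z sample counts are the ones the labels above imply, and real driver allocations (tiling, padding, coverage planes) will differ:

Code:
# Rough framebuffer footprint for the CSAA modes discussed in this thread.
BYTES_COLOR = 4  # RGBA8 (assumed)
BYTES_DEPTH = 4  # D24S8 (assumed)

# Color/Z samples actually stored per pixel: 16x keeps 4 and 16xQ keeps 8,
# which is why their footprints match 4xAA and 8xAA respectively.
STORED_SAMPLES = {"4x": 4, "8xQ": 8, "16x": 4, "16xQ": 8}

def footprint_mb(width, height, mode, swap_buffers=2):
    per_sample = width * height * (BYTES_COLOR + BYTES_DEPTH)
    aa_surface = per_sample * STORED_SAMPLES[mode]
    resolve = width * height * BYTES_COLOR * swap_buffers
    return (aa_surface + resolve) / 2**20

for mode in ("16x", "16xQ"):
    print(f"{mode:>4} @ 1920x1080: ~{footprint_mb(1920, 1080, mode):.0f} MB")

At 1920x1080 that works out to roughly 79 MB for 16x versus 142 MB for 16xQ before a single texture is loaded, which is exactly the kind of jump that pushes a 512 MB card over the edge.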
__________________
|CPU: Intel I7 Lynnfield @ 3.0 Ghz|Mobo:Asus P7P55 WS Supercomputer |Memory:8 Gigs DDR3 1333|Video:Geforce GTX 295 Quad SLI|Monitor:Samsung Syncmaster 1680x1080 3D Vision / Olevia 27 Inch Widescreen HDTV 1920x1080

|CPU: AMD Phenom 9600 Black Edition @ 2.5 Ghz|Mobo:Asus M3n HT Deluxe Nforce 780A|Memory: 4 gigs DDR2 800| Video: Geforce GTX 280x2 SLI

Nzone
SLI Forum Administrator

NVIDIA User Group Members receive free software and/or hardware from NVIDIA from time to time to facilitate the evaluation of NVIDIA products. However, the opinions expressed are solely those of the members
Old 07-11-08, 03:46 PM   #87
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Unreal Tournament 3 @ 1920x1080

9800GTX SLI 16xAA/16xAF ((4xAA memory footprint))

[screenshot]

9800GTX SLI 16xQAA/16xAF ((8xAA memory footprint))

[screenshot]

GTX 280 single, 16xAA/16xAF ((4xAA memory footprint))

[screenshot]

GTX 280 single, 16xQAA/16xAF ((8xAA memory footprint))

[screenshot]
Now, the GTX 260/280/8800GTX at this resolution won't offer identical performance, obviously. But they will not show the kind of drop-off you see from the 9800GTX. I have seen the 8800GTX pull about 18 FPS at 16xQ settings here; the GTX 260 pulls about 28 FPS. The 9800GX2, 9800GTX, and 8800GT all do exactly what the 9800GTX 16xQ screenshot shows. Once you enable SLI on the GTX 260/280/8800GTX configurations, these framerates rise dramatically, while the 9800GTX, 9800GX2, and 8800GT cards suffer and are unable to scale with SLI. So yes, my bias in this case does come from an SLI perspective. When you can't see any gains with SLI, on what would otherwise be playable SLI settings, due to having inadequate memory, I think people should know about it.

Now, if someone comes to me at SLIZone and asks me why this is happening on their 8800GTS SLI setup, which screams at other settings, all I can tell them is, "Sorry mate. You ran out of memory. Tough break." When shopping now there are simply better alternatives, which is why I think so highly of the GTX 260: it will not have problems like this in its immediate future. And I expect to see a lot of this from the 512 MB DX10 SLI configurations.

The summary of my thoughts: a 512 MB Nvidia card can do in a pinch. But if you really don't want to compromise, spend a little more on the 260. You won't regret it.

Chris
Old 07-11-08, 04:13 PM   #88
Toss3
.<<o>>.
 
Toss3's Avatar
 
Join Date: Oct 2004
Location: Finland
Posts: 4,763
Default Re: The right price for the GTX200 family...

Now we just need someone to do some testing at 8xAA on their 4870 to end this discussion once and for all. The 4870x2 is still going to drive the GTX 280 into the ground.
__________________
: :Asus Rampage II Gene : : Core i7 920 4011Mhz : : 6Gb 1600Mhz A-Data DDR3 : : Club3D Theatron Agrippa : : Intel 80GB SSD : : 2xSamsung F1 750Gb : : Sapphire 5850 @ 850/1225Mhz : :
: :Benq FP241W : : Optoma HD80 Projector + 92" Screen : : Genelec 8020B speakers : : Sony MDR-XB700 Headphones : : Razer Lycosa : : Razer Lachesis : :
Old 07-11-08, 06:04 PM   #89
Xion X2
Registered User
 
Xion X2's Avatar
 
Join Date: May 2006
Location: U.S.
Posts: 6,701
Default Re: The right price for the GTX200 family...

Edit: nm.
__________________

i7-2700k @ 5.0 GHz
Nvidia GeForce 570 2.5GB Tri-SLI
Asus P67 WS Revolution (Tri-SLI)
OCZ Vertex SSD x 4 (Raid 5)
G.Skill 8GB DDR3 @ 1600MHz
PC Power & Cooling 950W PSU
Old 07-11-08, 06:11 PM   #90
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

If you're getting 115 FPS in a game, wouldn't you like to turn on additional AA settings? 8x MS, MS + SS hybrids? When you're well over the 100 FPS mark, you start turning on additional features such as SLI/Crossfire AA modes. The question is: do you have the memory to handle these modes? I'm not reading too far into it; I'm sure "115%" scaling is more of an anomaly than anything else. It occasionally happens with SLI. ((See my Crysis results with GTX 260 SLI.))

In my examples, the 9800GTX was pulling 70 FPS on average in that screenshot with SLI. But unfortunately, when trying to increase the quality, you couldn't, because it hit the memory barrier. So yes, I do feel it's silly. If I average over 100 FPS in any game, the first thing I do is turn on a fullscreen supersampling filter ((usually 1x2 SS)) for the -1.0 LoD adjustment it brings to the entire scene.

Like I said, the same thing happens with the 9800GX2 Quad SLI setup. It can run that scene at something like 111 FPS, but when I turn on 16xQ it drops to 3 FPS due to running out of memory. Very disappointing. With SLI and multi-GPU, the point has never been to get over 100 FPS in a title, but to increase the quality as far as you can, i.e. bring unplayable settings to playable. On an LCD screen, achieving 115 FPS is pointless. Yes, it's an interesting demonstration of speed, but I'm sure anyone who saw that kind of performance would want to turn up the settings even further, because they have nothing to lose; it's well over any LCD's refresh rate for visible FPS. With a 9800GX2 Quad SLI setup, those 16xQ settings are impossible to attempt, despite it having the raw fillrate to do it.
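As a rule of thumb (a sketch assuming the usual textbook relation; the bias the hybrid modes actually apply may differ), the LoD shift that matches a supersample grid is -0.5 per doubling of the sample count:

Code:
import math

# Texture LoD bias that roughly matches an ss_x by ss_y supersample grid.
def matching_lod_bias(ss_x, ss_y):
    return -0.5 * math.log2(ss_x * ss_y)

print(matching_lod_bias(1, 2))  # 1x2 SS -> -0.5
print(matching_lod_bias(2, 2))  # 2x2 SS -> -1.0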


Chris
Old 07-11-08, 06:15 PM   #91
Medion
 
Medion's Avatar
 
Join Date: Dec 2005
Location: Barksdale AFB, La
Posts: 1,238
Default Re: The right price for the GTX200 family...

Like I said, Xion, Chris doesn't comprehend common sense. If a 9800GTX in SLI is slower than a GTX280 in SLI, it MUST be because of memory, and not other architectural limitations. I mean, ramping up the AA doesn't tax fill-rate AT ALL. I used to think that Chris was pretty smart, if not brilliant, but his defensive posturing, his ignoring of facts, and his stubbornness in this thread lead me to another conclusion.

Once the 1GB 4870s come out, someone with the funds and half a clue is going to bench the 4870 1GB vs. the 4870 512MB in single and CF configs and investigate whether the memory limit really affects us today. I eagerly await those previews.
Old 07-11-08, 06:19 PM   #92
Xion X2
Registered User
 
Xion X2's Avatar
 
Join Date: May 2006
Location: U.S.
Posts: 6,701
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
If you're getting 115 FPS in a game, wouldn't you like to turn on additional AA settings? 8x MS, MS + SS hybrids?
Absolutely. But there are two factors to consider here. One: you're still gaining a lot by going from 55 fps to over 100 fps. It means that your minimums are much higher. Anyone who's gone dual-GPU knows how smooth a game can look when framerates are consistently above 60.

Two: not all of us feel the need to 16xAA everything, especially at 2560x. I honestly think it's some sort of crack-like addiction for some just to say they can do it, when the reality is that it's virtually impossible to tell the difference between 4x, 8x, and 16x AA at that high a resolution. I know that when I had my G80 SLI setup it was very difficult to tell going from 8x to 16xAA, and that was at 1680x1050. I can't imagine that an image as sharp as one at 2560x would need that much AA.

Unfortunately, I don't have the ability to test that myself, but I will be the owner of a 1080p soon enough.

Old 07-11-08, 06:20 PM   #93
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

I was comparing 9800GTX+ SLI to a single GTX 280. Use your head and actually read for once, Medion.


Quote:
Quote:
Originally Posted by Xion X2 View Post
Absolutely. But there are two factors to consider here. One: you're still gaining a lot by going from 55 fps to over 100 fps. It means that your minimums are much higher. Anyone who's gone dual-GPU knows how smooth a game can look when framerates are consistently above 60.

Two: not all of us feel the need to 16xAA everything, especially at 2560x. I honestly think it's some sort of crack-like addiction for some just to say they can do it, when the reality is that it's virtually impossible to tell the difference between 4x, 8x, and 16x AA at that high a resolution. I know that when I had my G80 SLI setup it was very difficult to tell going from 8x to 16xAA, and that was at 1680x1050. I can't imagine that an image as sharp as one at 2560x would need that much AA.

Unfortunately, I don't have the ability to test that myself, but I will be the owner of a 1080p soon enough.
That depends entirely on where the min-FPS bottlenecks are coming from. If your min FPS is 40 and it's the result of a CPU bottleneck, it's far less distracting to drop to it from, say, 60 FPS than from 120 FPS. In these cases, ideally we all want vsync enabled with triple buffering off for a consistent inter-frame delay.

Inter-frame delay is extremely important in this case. Say you're getting 100 FPS on a multi-GPU setup: your frames are alternated with AFR, and AFR will attempt to render those frames at maximum performance. Evenly paced, the frame timestamps would look like this:

Frame 1: 15 ms
Frame 2: 30 ms
Frame 3: 45 ms
Frame 4: 60 ms

Now, even assuming you're running at 120 FPS, multi-GPU never distributes frame delay that evenly. Hence 60 FPS on a single GPU is not equal to 60 FPS on a multi-GPU setup.

But in the example above, a frame spike would look something like this with Quad SLI (timestamps in ms):

12-30-43-51-100-112-128-145-160

For a smooth and consistent gaming experience, your inter-frame delay must be as consistent as possible between minimums and highs. Therefore enabling vsync or raising the quality is absolutely ideal. This is the common multi-GPU stutter people sometimes describe. In ideal rendering, you run a consistent 60 or a consistent 30 FPS all the time, without the spikes.
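To make the point concrete, here is a minimal sketch that turns those timestamps into per-frame delays; the numbers are the ones from the examples above:

Code:
# The same average FPS can hide very different inter-frame delays.
even = [15, 30, 45, 60]                            # evenly paced example
spiky = [12, 30, 43, 51, 100, 112, 128, 145, 160]  # Quad SLI spike example

def frame_deltas(timestamps_ms):
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

for name, ts in (("even", even), ("spiky", spiky)):
    d = frame_deltas(ts)
    print(f"{name}: deltas {d} ms, avg {sum(d) / len(d):.1f} ms, "
          f"worst swing {max(d) - min(d)} ms")

The even series never deviates from 15 ms; the spiky one averages a similar per-frame delay but swings by over 40 ms between frames, which is the stutter you feel even though the FPS counter looks fine.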

Chris
Old 07-11-08, 06:23 PM   #94
Medion
 
Medion's Avatar
 
Join Date: Dec 2005
Location: Barksdale AFB, La
Posts: 1,238
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by ChrisRay View Post
I was comparing 9800GTX+ SLI to a single GTX 280. Use your head and actually read for once, Medion.
Why would I extend that courtesy to a pompous ass such as yourself, when you can't lend me the same courtesy? Oh, and 9800GTX+ SLI to a GTX 280. Another of your brilliant apples-to-apples comparisons...

I have no doubt that memory is a limiting factor in ultra-high-res + AA combinations on NV cards. We're just not seeing these same limits on ATI cards. Again, the only way to truly test memory limits is to remove the other variables. But then, I've explained that numerous times already, and you're just too dense to grasp such a simple concept.
Old 07-11-08, 06:29 PM   #95
ChrisRay
Registered User
 
ChrisRay's Avatar
 
Join Date: Mar 2003
Location: Tulsa
Posts: 5,101
Default Re: The right price for the GTX200 family...

Quote:
Originally Posted by Medion View Post
Why would I extend that courtesy to a pompous ass such as yourself, when you can't lend me the same courtesy? Oh, and 9800GTX+ SLI to a GTX 280. Another of your brilliant apples-to-apples comparisons...

I have no doubt that memory is a limiting factor in ultra-high-res + AA combinations on NV cards. We're just not seeing these same limits on ATI cards. Again, the only way to truly test memory limits is to remove the other variables. But then, I've explained that numerous times already, and you're just too dense to grasp such a simple concept.
And again, I was never talking about ATI cards in the first place. Thank you; drive through. I have no reason to lend courtesy to you when you've done nothing but attack me in this thread. Besides that, I have read what you said, but it makes no sense; a lot of the things you have posted are pure nonsense. I did not say it was apples to apples. I said it was an absolute failure to deliver on the part of a 512 MB card. Anyone gaming on a multi-GPU setup who saw that would be utterly disappointed. When you go multi-GPU, you strive for a consistent, high-quality gaming experience with less compromise between performance and quality. Being forced to run 16x CSAA without any hope of ever going higher due to a memory barrier is a compromise, one that sucks to see on a multi-GPU setup, which for all intents and purposes exists to drive high-quality rendering. You yourself admit you have no interest in high-res, high-AA gaming. Why are you even in this thread? Do you enjoy attacking me so much? That so far seems to be your only vested interest in this thread.

Performance comparisons are good for determining the absolute performance of specific parts, but they say nothing of gameplay. With SLI, I expect the highest-quality and most consistent experience possible; anyone who pays that much money for something should demand it. The 9800GX2 may compare to the GTX 280 in raw FPS in a lot of cases, but once you start playing games and raising settings to improve the gameplay experience, the 9800GX2 falls to its knees while the GTX 280 is able to consistently produce the framerates as settings rise. In this case, even a GTX 260 SLI setup provides a more consistent experience than 9800GX2 Quad SLI, due to the memory, despite the 9800GX2 Quad SLI setup providing more raw FPS at its ideal settings ((4x AA, and lower resolution/smaller textures)). Which is why it all comes back down to my original point: the GTX 260 is a very good deal and brings a lot to the table compared to most Nvidia solutions from a price/performance perspective, especially due to its memory.
Old 07-11-08, 07:46 PM   #96
Medion
 
Medion's Avatar
 
Join Date: Dec 2005
Location: Barksdale AFB, La
Posts: 1,238
Default Re: The right price for the GTX200 family...

Quote:
I have no reason to lend courtesy to you when you've done nothing but attack me in this thread.
Slow down there, sweetcheeks. I merely stated in my first post that a more apples-to-apples comparison (1GB vs. 512MB configs of the same card) would be a more accurate showing of the memory limitations. From post one, you got defensive and attacked me. You also attacked Xion constantly, even though he hasn't said anything back to you. Are you making this up as you go? I think you've clearly got some antisocial personality problem going on there.

Quote:
Besides that, I have read what you said, but it makes no sense; a lot of the things you have posted are pure nonsense.
Right. Eliminating variables and comparing the same architecture with different memory configs is pure nonsense, but your testing different architectures and claiming the gaming experience is different based solely on memory... isn't nonsense? Dude, you're off your rocker. You've lost all credibility with me.

Quote:
You yourself admit you have no interest in high-res, high-AA gaming. Why are you even in this thread? Do you enjoy attacking me so much? That so far seems to be your only vested interest in this thread.
Originally, I was stating my desire to see the same comparison you did, only with the same architecture. Since then, you've been attacking me, and I've merely responded. You pulled the same crap on others at Rage3D. You were right to take a vacation, because had you kept going, some people would have pushed for an outright ban. You may be tech-savvy, but you're a rude, pompous ass who considers himself better than other gamers, for reasons I cannot fathom.