AMD Radeon R7/R9 vs. NVIDIA GTX 7xx

  • Thread starter Luminis
  • 49 comments
  • 4,980 views
Yes, I have some of that bitter taste in my mouth as well... x2. Ouch. After my initial purchases, I try not to look at prices until I'm ready to upgrade again. It will ALWAYS leave a bitter taste in my mouth if I do. :)
 
Even the 770 over here dropped by about £80, but I have to think I've had fun with mine since June and made good use of it. Not sure which upgrade to go for in around a year's time; probably a 1440p monitor and a second 770.
 
I did opt for a 1440p monitor as an upgrade, the Dell U2711 actually. It has been a great monitor. What I have enjoyed more, and might suggest, is a 120 Hz/144 Hz monitor, even at 1920x1080. I found gaming to be much smoother and more pleasurable than on even my 1440p monitor. With TXAA and other AA technologies, you can get a pretty smooth, jaggy-free image. Just a thought; if I had to do it again, I would probably have skipped the 1440p and gone straight to the higher-refresh-rate monitors. I do find the extra resolution an advantage for other things, though, and use my 1440p screen for a lot of non-gaming applications.
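
To put rough numbers on the smoothness point, the frame-to-frame interval is just the reciprocal of the refresh rate; a quick back-of-the-envelope calc:

```python
# Frame interval shrinks as the refresh rate rises: interval_ms = 1000 / refresh_hz.
for hz in (60, 120, 144):
    print(f"{hz:>3} Hz -> {1000 / hz:.1f} ms between refreshes")

# Output:
#  60 Hz -> 16.7 ms between refreshes
# 120 Hz ->  8.3 ms between refreshes
# 144 Hz ->  6.9 ms between refreshes
```

Halving the interval means the display can show a new frame twice as often, which is a big part of that extra smoothness.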
 
It has crossed my mind; a few friends have 120 Hz monitors and say they're really good. Of course, with Nvidia G-Sync on the horizon, it's possible that a 60 Hz display could gain a lot of smoothness with it, that is if G-Sync becomes available at 1440p. We'll just have to see what happens in the next year or so with monitor tech and whether any 120/144 Hz 1440p monitors materialise. They're all IPS at the moment, which is a bit laggy for gaming.
 
True. Not to digress too much from the original topic, but due to bandwidth limitations, you will not see 1440p 120/144 Hz monitors. Like you said, though, we'll have to see what happens with monitor technology, but I have seen nothing consumer-based that would hint at an interface that can support such bandwidth loads. It could be that 4K at 120 Hz will lend itself to a natural progression towards this bandwidth requirement. We will see. :)
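
For what it's worth, here's a rough sketch of the arithmetic behind that bandwidth claim, assuming 24-bit color and a flat ~5% blanking allowance rather than exact timings (the link rates are the commonly quoted effective figures, so treat everything here as ballpark):

```python
# Approximate uncompressed video bandwidth: width x height x refresh x bits per pixel,
# plus a rough allowance for blanking intervals. All figures are ballpark estimates.

DUAL_LINK_DVI_GBPS = 7.92     # 2 x 165 MHz pixel clock x 24 bpp
DISPLAYPORT_1_2_GBPS = 17.28  # 4 lanes x 5.4 Gbit/s, after 8b/10b encoding

def required_gbps(width, height, refresh_hz, bpp=24, blanking=1.05):
    return width * height * refresh_hz * bpp * blanking / 1e9

for hz in (60, 120, 144):
    print(f"2560x1440 @ {hz} Hz needs ~{required_gbps(2560, 1440, hz):.1f} Gbit/s")

# 2560x1440 @  60 Hz needs ~5.6 Gbit/s  -> fits within dual-link DVI (7.92)
# 2560x1440 @ 120 Hz needs ~11.1 Gbit/s -> too much for dual-link DVI
# 2560x1440 @ 144 Hz needs ~13.4 Gbit/s -> DisplayPort-class bandwidth territory
```

So dual-link DVI is the real bottleneck; DisplayPort 1.2 has the headroom on paper, but whether panels and scalers actually show up to use it is another matter.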
 
Seems as if the 780 Ti might be the faster card overall, but the 290X will probably keep the better performance-to-cost ratio of the two. Hell, the ordinary 290 might take that crown, as it'll likely be cheaper than the ordinary 780's new price and can probably be overclocked to 290X speeds.

FYI I have a Dell U2713HM and love it.
 
Wow, the GTX 780 Ti will have 2880 CUDA cores. So the Titan Ultra pretty much is the GTX 780 Ti...
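
For a rough sense of what that means, peak single-precision throughput is usually estimated as shader cores x 2 FLOPs per clock x clock speed; here's a quick sketch using the commonly reported boost clocks (treat those figures as assumptions):

```python
# Peak FP32 throughput estimate: shader cores x 2 FLOPs per cycle x clock speed.
# Boost clocks below are the commonly reported figures, not verified specs.
def peak_tflops(cores, boost_ghz):
    return cores * 2 * boost_ghz / 1000.0

for name, cores, ghz in [("GTX Titan", 2688, 0.876), ("GTX 780 Ti", 2880, 0.928)]:
    print(f"{name}: ~{peak_tflops(cores, ghz):.2f} TFLOPS FP32")

# GTX Titan:  ~4.71 TFLOPS
# GTX 780 Ti: ~5.35 TFLOPS
```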

Damn, AMD's party surely didn't last very long! :lol:
 
Well, after reading some reviews of the 290X, I am quite impressed. I didn't expect AMD to deliver that much bang for the buck. However, I'm even more excited to see what a third-party 290X could do with a better cooling solution. It's got to be one hell of a card, I'd assume. Nvidia dropping their prices as a result is just as nice, though. Looking forward to the 290X and the GTX 800 series duking it out 👍 Can't make paragraphs on my phone, somehow, sorry about that...
 
The 290X won't hold a candle to the nVidia 800 series, I can assure you of that.

Let's be honest here: AMD's top-of-the-range card is barely beating a Titan, a card that is nearly a year old, and on top of that, the 290X beats it at the cost of a lot of heat and a lot of power.

I'd say a 290X is the card to have at 4K resolutions, but for most 1080p and 1440p gamers a GTX 780 is still the way to go, especially with the price cuts. Hell, I found an MSI GTX 780 for 450 euros (!!!), which is all the power you'd ever need for both 1080p and 1440p gaming. That's ridiculously cheap for such a powerful card.
 
The 290X might not be in the same league (obviously) as the soon-to-be-released 800 series, but the "Pirate Islands" cards will be 👍.

Speaking of Pirate Islands: I can't wait to see what future endeavors AMD has in store for its GCN architecture. Hawaii has shown what subtle tweaks to the architecture can do for performance. Polish it some more, stick it on a 20nm node, and we have a beast.
 
The 780 Ti looks really promising, but the bang for the buck is yet to be seen... I still think the 290X will hold that crown, following AMD's usual pattern in the GPU sector in recent years.
 
So I've been making the rounds reading reviews of the cheaper R9 290. It looks like a monster value at $399… if you're deaf. I'm amazed that AMD thinks it's OK to ship a card that sounds like that. Correction: I'm amazed AMD thinks it's OK to ship such a lousy reference cooler. (Nvidia's at fault too; their pretty reference cooler is outperformed by their partners' coolers.) I know that open coolers and blowers that exhaust outside the case each have their pluses and minuses, but I think Tom's has some pretty compelling videos demonstrating the very audible differences:

http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-18.html
http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html

I see no reason to buy an R9 290 or 290X with the reference design. Third-party cards will be THE way for these cards to truly shine. Luckily they're priced aggressively, so any markup for upgraded cooling should be insignificant.
 
While the complaints definitely have merit, I will never understand them. Until you've used an X1900 or even a 7800 GTX, you do not know what a loud reference cooler is. It's really just that simple. :lol:

Does that excuse things as they are now? Absolutely not, but as I've told someone once before: Having the "best" reference cooler is tantamount to being the coolest kid in a room permanently set to 110 degrees - it doesn't even matter.
 
I remember dark times with the X1900. *shudder*

The way I look at it, if you're going to sell products with a terrible reference cooler on it, then you're setting a trap for your customers to buy the "wrong" product.
 
That shrill whine caused by the plastic shroud... I still have one, it's still awful. :lol:

Perhaps. It's not difficult to pick up a third-party cooler though, or even a waterblock if your setup permits it.
 
As do I. And yes, dark times indeed. It was my experience with the X1900 XT that moved me from ATI to nVidia. I haven't looked back. :D
 
I can't even fault you for it. I kind of want to run downstairs and ramp it up to 100% for old times' sake.
 