Canadian TV, Computing and Home Theatre Forums

1 - 11 of 11 Posts

Member #1 · 47,683 Posts · Discussion Starter #1
AMD today introduced its new flagship series of graphics processors, the Radeon HD 6900 series, designed for hardcore PC gamers looking for top-notch gaming performance.

The company says the new AMD Radeon HD 6970 and Radeon HD 6950 graphics cards are available now, starting at $299 USD.

Features of the new DirectX 11-capable cards include a more efficient graphics processing design, improved architecture and more memory, which AMD says translate into better image quality and faster cards. A dedicated video playback accelerator also helps users get the most out of HD video, online video and Blu-ray 3D.

AMD claims the new HD 6970 graphics cards offer up to 2.5 times the gaming performance of similarly priced high-end graphics solutions from two years ago, such as the Nvidia GTX 285.

Other benefits of the new cards include AMD Eyefinity support and AMD's PowerTune technology, which automatically adjusts the card's power use by dynamically controlling clock speed, giving users the ability to reduce power consumption when the full power of the graphics card is not needed.
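
For anyone curious what that kind of power capping looks like in principle, here is a minimal sketch, assuming a simple feedback loop: sample the board's power draw, step the core clock down when the draw exceeds the TDP budget, and step it back up when there is headroom. The clock and wattage numbers and the sensor model below are illustrative assumptions, not AMD's actual PowerTune implementation.

```python
# Hypothetical sketch of a PowerTune-style throttling loop (illustrative only).
import random

TDP_BUDGET_W = 250        # assumed board power limit
CLOCK_MAX_MHZ = 880       # assumed stock core clock
CLOCK_MIN_MHZ = 500
CLOCK_STEP_MHZ = 10

def read_power_draw(clock_mhz):
    """Stand-in for a power sensor: assume draw scales with clock and workload."""
    load = random.uniform(0.7, 1.1)               # varying workload intensity
    return (clock_mhz / CLOCK_MAX_MHZ) * 260 * load

def throttle_step(clock_mhz):
    """One control-loop iteration: nudge the clock toward the power budget."""
    draw = read_power_draw(clock_mhz)
    if draw > TDP_BUDGET_W:
        # Over budget: back the core clock off one step.
        clock_mhz = max(CLOCK_MIN_MHZ, clock_mhz - CLOCK_STEP_MHZ)
    elif draw < TDP_BUDGET_W * 0.95:
        # Comfortable headroom: ramp back up toward the stock clock.
        clock_mhz = min(CLOCK_MAX_MHZ, clock_mhz + CLOCK_STEP_MHZ)
    return clock_mhz, draw

clock = CLOCK_MAX_MHZ
for tick in range(10):
    clock, draw = throttle_step(clock)
    print(f"tick {tick}: draw {draw:5.1f} W -> core clock {clock} MHz")
```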
 

Member #1 · 47,683 Posts · Discussion Starter #2
Shopped around. In Canada it's about $300 for the 6950 and $400 for the 6970.
 

Registered · 7,360 Posts
And how many DX11 games are around? Last time I checked, DX10 was a flop (so much for my 8800GT, heh), with only about 50 titles vs. over a thousand that were DX9 or lower. I realize DX11 is supposed to be more helpful to programmers. I'm a big AMD and ATI fan, but Ise gotta get ma moneys worth :)
 

Registered · 4,190 Posts
There are quite a few games that support DX11 now, and there are probably more to come. Now that ATI & nVidia both have cards that support it (and a lot of gamers are now running an OS that can take advantage of it), developers have a reason to code for it.

That said, it is still nowhere near as widespread as DX9. I can't remember how long it took after DX9 came out for it to become mainstream.
 

Registered · 280 Posts
DX10 was a flop because it required Vista. The negative buzz about Vista ensured that most people stayed with XP. The adoption rate for Windows 7 is very encouraging for DX11.
 

Registered · 7,131 Posts
It doesn't matter if the card is DX11 as long as it is backward compatible. If it runs DX10 and DX9 software, what's the problem? The advantage is that it will also run any new DX11 title that is available. Personally, I don't like paying over $150 and I avoid power-hungry cards (power-savings mode noted), so I will be passing on these models.
 

Registered · 2,818 Posts
The 6850 is the best for performance per watt, and will meet many people's needs.
Though if it doesn't have enough oomph for you, your only alternatives are the Nvidia cards, which use far more power. Not much you can do there :p
 

Member #1 · 47,683 Posts · Discussion Starter #9
From what I understand, these cards are actually more efficient than their predecessors.

These boards are hardly for everyday computing, though.
 

Registered · 1,111 Posts
I think the most interesting thing these cards do is monitor the TDP and adjust the core clock DOWN when the power draw exceeds the spec.

This is a major game changer, as it will change the speed of the card depending on how much it's being stressed.

It's a very interesting direction, but in the end it will also help them reject warranty claims lol
 

Registered · 4,190 Posts
That particular feature isn't likely to do much in the real world (i.e., outside of benchmarks). Sure, you could lower the max TDP down to 200 W, but if you are doing that, why not save your money and grab a lower-end card that operates within that thermal envelope? Bear in mind that the cards still significantly lower their clock speed in 2D mode.

As it stands now, the main thing that would get the 6970 to exceed the default TDP (250 W) is the FurMark "torture test." Sure, it might help those who overclock too much, but those who do typically wouldn't want the card to throttle down.

If we start seeing games or GPU-accelerated apps pushing the cards to excessive power utilization, then I could see this having a benefit. For now, though, I just don't see it.
 