HD3870X2

Today AMD is announcing the latest video card in its HD3000 series: the HD3870X2. This board is special as it comes with two HD3870 GPUs onboard.

INTRODUCTION

For a long time ATI and NVIDIA kept pace with each other. At any given time both had cards that competed at every possible level, including the absolute high-end. Then something changed. Around the time AMD bought ATI, NVIDIA gained the upper hand, and ever since AMD has simply lacked a card that can compete with the best from NVIDIA.

Today AMD is announcing (and releasing) a card that, for the first time in a couple of years, has a chance to compete with the best NVIDIA has to offer, at least until NVIDIA releases its new cards in a month or so.

The HD3870X2 is not your average video card. Instead of creating a single GPU with the power to compete with the 8800 Ultra and beyond, AMD has chosen to go the multi-chip route. The HD3870X2, just as the name implies, is simply a card with two HD3870 GPUs. It is not the first video card in history with two GPUs. The ATI Rage Fury MAXX is one example, and we all remember NVIDIA’s first attempt, the GeForce 7950 GX2. Recently AMD tested a similar design with the HD2600 Gemini cards, which were a very rare sight in the wild. NVIDIA is also working on a multi-GPU card that should be out pretty soon.

THE FEATURES AND SPECIFICATIONS

The feature set of the HD3870X2 is the same as the other HD3xxx GPUs.

  • Built on a 55 nm process, which lowers power consumption and heat output
  • DirectX 10.1 (Shader Model 4.1) support
  • UVD (Unified Video Decoder)
  • PCI Express 2.0
  • CrossfireX support
  • Native HDMI and DisplayPort support

There are no real surprises here. It’s cool that the board supports CrossfireX, as it means you can theoretically run four HD3870 GPUs in your system.

Dual-GPU

The main feature of this card is the dual RV670 (HD3870) GPUs placed on the board. These GPUs are linked with a PCIe 1.1 high-speed interconnect bridge. The bidirectional x16 lanes between the two GPUs help give the card Crossfire-type performance. We will test that claim ourselves shortly.
 

CLOSER LOOK

The HD3870X2 is a huge card. As expected it is a dual-slot card but it is also very long. It is longer than any other AMD/ATI card I’ve tested and as long as the GeForce 8800GTX. This is a card that needs a large case.

The card comes with a single fan that has to do the job of venting the hot air out of the case. There are no vents on the fan housing where warm air can escape into the case. While running my benchmarks I had no problem touching the side of the fan housing. Considering how warm the air coming out of the back is, AMD has clearly done a good job with the cooling system. Just as important, the card is actually not that loud, especially compared with the HD2900XT. And all this on a card with two GPUs that need cooling. Impressive!

Just as with the HD2900XT, this card needs power from both a regular 6-pin PCI-Express power connector and an 8-pin PCI-Express power connector. While most new PSUs include these connectors, there are still plenty that don’t, so that is something you need to be aware of if you are interested in getting this card.
 

PERFORMANCE

You could see this card as Crossfire-on-a-card, since essentially that is what it is. Unfortunately, Crossfire performance, especially under Vista, has not always been that impressive. For this article I put two HD2900XT cards in my rig and was pretty disappointed with all the problems I ran into. Some benchmarks did give a performance increase, although not a big one, while others, especially World in Conflict, completely choked. I must admit I was very curious about how AMD had solved these issues on the HD3870X2. In the press material they promise a lot: much better performance than a regular HD3870 and, most importantly, much better performance than an 8800 Ultra.

All the cards were tested in the following rig:

  • Intel Core 2 Quad Q6600 @ 2.4 GHz
  • 2 GB DDR3
  • Gigabyte P35T-DQ6 motherboard
  • 2×320 GB Seagate SATA drives
  • External USB Samsung CD-ROM
  • Windows Vista Home Premium with all the latest patches
  • The AMD cards were tested with Catalyst 8.1 except for the HD3870X2 which used a beta driver.
  • The NVIDIA cards used the Forceware 169.25 drivers.

3DMark05 and 3DMark06

Both benchmarks were run at their default setting.

There is not a huge difference in the 3DMark05 scores, as it is an older benchmark. In 3DMark06 we see a bigger difference between the cards, with the HD3870X2 almost beating the two HD2900XTs in Crossfire.

The total score however is not as interesting as the individual scores.

While there is no huge difference in performance in the first two game tests, the third game test is different. This is how Futuremark describes it:

This test gives an example of a large scale outdoor scene. The scene is fairly complex with large areas of water reflecting the high canyon walls. The water actually is one of the key points of interest in this scene. The water not only does realistic looking reflections and refractions, it has a depth fog, making the sea monster swimming under the airship actually look deep down in the water. The air in this scene also uses a volumetric fog, making distant cliffs of the canyon really look far away.

The HD3870X2 is neck-and-neck with the two HD2900XTs in Crossfire and easily beats the 8800GTS.

While all the newer cards perform pretty similarly in the SM2.0 tests, it is obvious that the HD3870X2 benefits from its second GPU in the HDR/SM3.0 test.

PERFORMANCE – CONT.

Unreal Tournament 3

Our first real game tested in this article is Unreal Tournament 3. To test the game we used the UT3 Benchmark tool, which lets us control the included benchmark without having to open the game every time we want to change a setting. Each benchmark was run three times for 120 seconds, and the average framerate was then calculated. Texture detail and world detail were set to 5, the maximum setting.
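To make the methodology concrete, here is a minimal sketch of that averaging step in Python. The only details taken from the article are the 120-second run length and the averaging over three runs; the frame counts below are hypothetical, not measured results.

```python
# Sketch of the UT3 averaging step: each 120-second run yields an average
# framerate, and the final number is the mean of the three runs.
# Frame counts below are hypothetical, not measured results.
RUN_LENGTH_S = 120

def average_fps(frames_per_run):
    """Mean framerate over several fixed-length runs, given total frames rendered per run."""
    per_run_fps = [frames / RUN_LENGTH_S for frames in frames_per_run]
    return sum(per_run_fps) / len(per_run_fps)

print(round(average_fps([7440, 7320, 7560]), 1))  # three 120 s runs -> 62.0 fps
```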

First I selected a bot-match. While the 8800GTS dominated at the lower resolutions, the HD3870X2 managed to sprint ahead at 1920×1200.

Next we look at a fly-by benchmark. This has no bots in it, minimizing the impact of the CPU. While the GeForce 8800GTS still beats the HD3870X2 at 1280×1024, it has to capitulate at the higher resolutions. The extra GPU on the HD3870X2 does wonders at the higher resolutions.

World in Conflict

The next game we test is the brilliant RTS game World in Conflict. The game was set to the “Very High” setting (we’re testing enthusiast cards, so they should handle it). This setting also uses DX10.

I should not be surprised. When I wrote an article about the performance of different video cards in this game (https://bjorn3d.com/read.php?cID=1162) I noticed that AMD had issues when the game was using DX10. I was hoping that the new patch plus new drivers would improve matters, but unfortunately there are still problems. While the HD2900XT now actually performs well, the rest of the AMD cards fail miserably in this game. Adding a second HD2900XT in Crossfire creates a slideshow, and while the HD3870X2 performs a lot better than the HD2900XTs in Crossfire, it is still severely lacking.

Power consumption

Power consumption is becoming more important, as high power usage has all sorts of negative effects. Not only is it expensive, it also generates a lot more heat that needs to be removed. As the HD3870X2 is built on a 55 nm process, its power consumption should not be through the roof like it was with the HD2900XT.
I measured the power consumption of the whole system at the wall. First I measured it after the system had sat at the Windows desktop for 30 minutes to get an “idle” reading. Then I ran 3DMark06 at 1920×1200 with 8xAA and 16xAF to get the power consumption when the GPU(s) were stressed.
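Since the meter only sees whole-system draw at the wall, the figure worth watching is the increase under load over the idle baseline. Below is a small illustrative sketch of that calculation; the wattages are hypothetical, not the measured results, and the delta still includes PSU losses, so it overstates what the card alone draws.

```python
# Wall-meter readings are whole-system draw, so the useful number is the
# extra draw under 3DMark06 load compared to the idle baseline.
# The values here are hypothetical, not the measured results.
def load_delta(idle_watts, load_watts):
    """Extra wall draw under load; includes PSU losses, so it overstates the card's own draw."""
    return load_watts - idle_watts

idle_w, load_w = 180, 320            # hypothetical wall readings in watts
print(load_delta(idle_w, load_w))    # -> 140 W extra at the wall under load
```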

As I said above, this is the total power consumption of the system. While the HD3870X2 comes out on top when running 3DMark06, that is still not too bad considering that you have two GPUs running. At idle it even has a lower consumption than the HD2900XT. I could not include the results from two HD2900XTs in Crossfire, since I had some issues measuring them with 3DMark06 running, but the idle power consumption for two HD2900XTs in Crossfire was 230 W, so you can see that it would be much higher than the HD3870X2 when stressed.
 

CONCLUSION

The HD3870X2 is definitely an interesting product. You basically get all the benefits of Crossfire without having to buy two cards and a motherboard that supports them. The performance seems to be pretty good, with a few exceptions. It is obvious, though, that it is at the highest resolutions (1920×1200 and higher) that you will see the most benefit. This is also where AMD has tested the card when comparing it to the 8800 Ultra.

Right now the card looks to be priced slightly above the GeForce 8800GTS 512 MB and below the GeForce 8800GTX. While I could not compare it to an 8800GTX, as my Sparkle 8800GTX broke a few days ago, we could see that the HD3870X2 matched, and in some instances beat, the 8800GTS 512 MB. This suggests it should match the 8800GTX as well, especially at higher resolutions. Unfortunately I only had limited time with this card, but as soon as I get a retail HD3870X2 we will run it through a broader set of benchmarks.

I would not say that this card will blow you away with its performance unless you are planning on gaming at very high resolutions. If you are looking for a card to complement your 24-30” monitor, though, the HD3870X2 looks like a good match. In addition to the performance you of course also get great video quality (AMD claims an HD-HQV score of 100, i.e. perfect) as well as a card that will offload your CPU when watching HD DVD or Blu-ray movies.

The HD3870X2 should be available any day now and all the major board vendors will have boards for sale.
