
Albatron GeForce FX5900XTV

The recently-released GeForce FX 5900XT has become NVIDIA’s dominant player in the mainstream market. Today, we’ll take a look at Albatron’s excellent implementation of the 5900XT and find out if it really is the next GeForce 4 Ti4200.

Introduction


Two years ago, the NVIDIA GeForce 4 Ti4200 was the mainstream card of choice, and it was easy to see why. It was inexpensive compared to its big brother, the Ti4600, but it used the same NV25 chipset and memory bus. In fact, the only differences between the two were the clock speeds and the price. Since the arrival of chipsets that support DirectX 9, though, no mainstream card has achieved the success of the Ti4200. Mainstream cards have been saddled with smaller memory buses or fewer pipelines than their enthusiast-level counterparts. However, in recent months, add-in board manufacturers have introduced a new line of cards, labeled the GeForce FX 5900XT, that has the same 256-bit memory bus and the same 4×2 pipeline configuration as the GeForce FX 5950 Ultra. Today, we’ll be examining Albatron’s 5900XT offering, the FX5900XTV, and we’ll see if it is the heir to the Ti4200’s mainstream throne.

Specifications, Card Layout, and Bundle


At its core, the Albatron FX5900XTV is powered by NVIDIA’s former flagship chipset, the NV35. Introduced in May 2003, the NV35 fixed some of the most glaring problems with the NV30 by adding a 256-bit memory bus, improving pixel shader performance, and eliminating the need for an extreme cooling solution. The NV35 was replaced as NVIDIA’s high-end chipset by the NV38, introduced in October; however, the only difference between the NV38 and the NV35 is clock speed. This means that, for all intents and purposes, the FX5900XTV is powered by a lower-clocked version of NVIDIA’s highest-end chipset. The FX5900XTV is clocked at 390 MHz for the core and 700 MHz for its 128 MB of memory. Although these speeds seem low in comparison to NVIDIA’s 5700 Ultra, remember that the FX5900XTV also doubles both the pipelines and the memory bus, which more than compensates for the lower clock speeds.
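To put some rough numbers on that claim, here’s a quick back-of-the-envelope sketch. The FX5900XTV figures come straight from the specifications above; the 5700 Ultra figures (475 MHz core, a 4×1 pipeline configuration, and 906 MHz effective memory on a 128-bit bus) are the commonly quoted reference specs rather than something measured for this review, so treat them as assumptions.

```python
# Back-of-the-envelope theoretical throughput from published specs.
# The 5700 Ultra numbers are commonly quoted reference figures (assumed),
# not measurements from this review.

def texel_fillrate_mtexels(core_mhz, pipelines, tmus_per_pipe):
    """Theoretical texel fillrate in Mtexels/s."""
    return core_mhz * pipelines * tmus_per_pipe

def memory_bandwidth_gbs(effective_mem_mhz, bus_width_bits):
    """Theoretical memory bandwidth in GB/s."""
    return effective_mem_mhz * 1e6 * (bus_width_bits / 8) / 1e9

cards = {
    "FX 5900 XT":    dict(core=390, pipes=4, tmus=2, mem=700, bus=256),
    "FX 5700 Ultra": dict(core=475, pipes=4, tmus=1, mem=906, bus=128),  # assumed specs
}

for name, c in cards.items():
    print(f"{name:13s}  "
          f"{texel_fillrate_mtexels(c['core'], c['pipes'], c['tmus']):.0f} Mtexels/s, "
          f"{memory_bandwidth_gbs(c['mem'], c['bus']):.1f} GB/s")
```

By that math, the 5900XT works out to roughly 3120 Mtexels/s and 22.4 GB/s against roughly 1900 Mtexels/s and 14.5 GB/s for the 5700 Ultra, which is why the lower clocks are not much of a handicap.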

On the software side, the Albatron GeForce FX5900XTV includes the full versions of Duke Nukem: Manhattan Project, CyberLink PowerDVD XP 4.0, and CyberLink PowerDirector 2.5 Pro ME, as well as demo versions of Max Payne, Rally Trophy, BeamBreaker, Zax, and Age of Wonders 2. The card also includes an S-Video cable, a composite cable, a DVI to VGA adapter, and the VIVO unit, which provides S-Video and composite inputs and outputs. While the bundle is not the most exciting in the world, it is rather standard for a mainstream card.

Looking at the card, you’ll notice that it forgoes the reference 5900 cooler in favor of a custom solution. The heatsink and fan are quite small and do not require a two-slot mounting. However, like any modern video card with a heatsink and fan, the cooling system does prevent the use of the adjacent PCI slot. Still, the reduced size and weight of the cooler mean that the 5900XT should have no problems fitting in a SFF (small form factor) case. The cooler is excellent; during my time with the card, I encountered no heat-related stability problems and was unable to hear the fan over the noise of the rest of my computer.

Benchmarking


I tested the Albatron FX5900XTV on the following computer:

  • AMD Athlon XP 2500+ at 2.2 GHz (11×200)
  • Corsair XMS PC3200LL at 2-3-2-6
  • Abit NF7-S 2.0
  • Antec TrueBlue 480W
  • 120GB Western Digital 7200RPM Special Edition (8MB Cache)
  • Windows XP SP1a, freshly formatted
  • NVIDIA Forceware 56.64, with all antialiasing and anisotropic filtering options set via the control panel

To evaluate the Albatron FX5900XTV’s performance, I tested the following games:

  • Unreal Tournament 2003, version 2225, screenshots taken at 1280×960 with 2x antialiasing and 4x anisotropic filtering
  • Call of Duty, version 1.3, screenshots taken at 1280×1024 with 2x antialiasing and 8x anisotropic filtering
  • Savage, version 2.00c, screenshots taken at 1280×960 with 2x antialiasing and 8x anisotropic filtering
  • Halo, version 1.4, screenshots taken at 1280×960 with no antialiasing or anisotropic filtering
  • X2: The Threat, version 1.3, screenshots taken at 1280×960 with 2x antialiasing and 8x anisotropic filtering

Before I launch into the performance numbers, I’d like to comment on Forceware 56.64. It is the first public “Forceware 55” series release from NVIDIA, and it adds one very significant feature: application profiles. Instead of the traditional global settings for antialiasing, anisotropic filtering, and related features, Forceware 55 lets you configure settings for specific games that are applied automatically whenever that game is run. To be honest, this is a truly wonderful feature. It sounds minor to many people, I’m sure, but not having to turn off 8x antialiasing and 8x anisotropic filtering after finishing a session of the original Deus Ex is a godsend. Creating an application profile is relatively simple, too, although it isn’t documented at all. It’s a three-step process.

First, from the “Performance and Quality Settings” of the Forceware control panel, select “Global Driver Settings.” Click “Add” to begin the process of creating a new profile.

Second, click “Browse” to add an application to the list of choices for profiles. Once it has been added, deselect “Global Driver Settings” and select the application name. Click “OK.”

Finally, select whatever options you want for that particular application, and click “OK.” That’s it. Am I silly for thinking that this is the best thing since antialiasing? Probably, considering it’s been around in third-party applications for years, but I’m overjoyed to see it finally included in an official driver release. Now, on with the benchmarking.

UT2003

At this point, UT2003 doesn’t even really need an introduction. It’s the spiritual successor to Quake 3 (and it has itself just been succeeded by Unreal Tournament 2004, but I digress). To test UT2003, I attempted to find the highest image quality settings at which the minimum framerate did not drop below 30 FPS in a custom demo. After increasing the maximum recorded framerate to 300 FPS, I recorded a demo of a deathmatch on DM-Compressed with three bots. Compared to some maps in the recently-released UT2004, DM-Compressed is not extremely stressful, but it still provides a very useful measure of the FX5900XTV’s performance. I then played the demo back outside the UT2003 benchmarking feature while FRAPS captured the framerate to a file every second. I repeated the playback until I found what, to me, seemed to be the highest image quality settings that still offered acceptable performance. After evaluating several combinations of resolution, antialiasing, and anisotropic filtering, I determined 1280×960 with 2x antialiasing and 4x anisotropic filtering to be the most consistently playable on the FX5900XTV.
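If you want to crunch your own FRAPS logs the same way, the sketch below is roughly what that analysis looks like. It assumes a plain text log with one framerate sample per line, which is what you get when FRAPS writes the framerate to a file every second; the filename is just an example.

```python
# Minimal sketch: summarize a per-second FPS log (one number per line),
# the kind of file FRAPS produces when logging the framerate every second.
# The filename below is only an example.

def summarize_fps_log(path, floor=30.0):
    samples = []
    with open(path) as f:
        for line in f:
            token = line.strip()
            if not token:
                continue
            try:
                samples.append(float(token))
            except ValueError:
                continue  # skip any header or stray text
    if not samples:
        print("no samples found")
        return
    below = sum(1 for fps in samples if fps < floor)
    print(f"samples: {len(samples)}")
    print(f"average: {sum(samples) / len(samples):.1f} FPS")
    print(f"minimum: {min(samples):.1f} FPS")
    print(f"seconds below {floor:.0f} FPS: {below}")

summarize_fps_log("ut2003_1280x960_2xaa_4xaf.txt")
```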

Anisotropic filtering is far more taxing in UT2003 than antialiasing, due to the game being fillrate-limited. I felt that 4x anisotropic filtering was an acceptable tradeoff between quality and speed, as the differences between 4x and 8x filtering are usually minor compared to the quality differences between, for example, 2x antialiasing and no antialiasing. Let’s see how the FX5900XTV fares in the benchmark.

As you can see, the average framerate is significantly higher than the minimum framerate of 35 FPS. However, in a competitive multiplayer game like UT2003, the minimum framerate is at least as important as the average, if not more so. In fact, during the five-and-a-half-minute demo, the framerate drops below 40 FPS only twice. These settings definitely seem to be the sweet spot for the FX5900XTV; they should keep you from losing a multiplayer match to the huge framerate drop that always seems to happen just as you’re about to get that last kill.

Call of Duty

For many gamers, Call of Duty is currently the World War 2 first-person shooter of choice. It is based on a heavily modified Quake 3 engine, and I used the comma-separated values it outputs during a timedemo to determine its optimal settings. Before I launch into the benchmarking, though, I did notice some oddities with the NVIDIA drivers in Call of Duty; specifically, 4x antialiasing seemed to cause ridiculous performance problems. A few times, I noticed that running a benchmark with 4x antialiasing would eventually cause performance to drop to levels better measured in frames per minute. It certainly seems to be a bug, but I haven’t been able to determine whether the problem lies with the game or the NVIDIA drivers (although I could replicate it with both 53.03 and 56.64). I could also replicate the problem at specific points in the single-player game; the POW Camp mission was the worst culprit.

With that said, I used the “timedemo1” demo, recorded by Call of Duty player Booda, to test performance because I am a terrible Call of Duty player (and Booda most certainly is not). I once again looked for the 30 FPS baseline, but I found the best settings to be very different from UT2003.
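To give a rough idea of how I sift through those timedemo logs, here’s a small sketch of the selection step. It assumes you have already parsed each run’s CSV output into a list of framerate samples; the setting names and the load_samples helper are purely illustrative.

```python
# Minimal sketch: pick the most demanding setting whose framerate never
# drops below the 30 FPS baseline. Parsing the game's CSV timedemo output
# into per-sample lists is left out; load_samples() below is a hypothetical
# helper, and the setting names are only examples.

def best_playable(runs, floor=30.0):
    """runs: list of (name, samples) ordered from least to most demanding."""
    best = None
    for name, samples in runs:
        if samples and min(samples) >= floor:
            best = name  # most demanding run so far that still passes
    return best

# Example usage (illustrative only):
# runs = [("1024x768 2xAA/4xAF",  load_samples("cod_run1.csv")),
#         ("1280x1024 2xAA/8xAF", load_samples("cod_run2.csv")),
#         ("1600x1200 2xAA/8xAF", load_samples("cod_run3.csv"))]
# print(best_playable(runs))
```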

1280×1024 with 2x antialiasing and 8x anisotropic filtering was the best-looking combination I could find that remained playable. These settings are significantly higher than those for UT2003, which strikes me as odd since UT2003 is the older game. However, Call of Duty seems to waver between memory bandwidth and fillrate bottlenecks, and 1280×1024 with 2xAA/8xAF seems to be the setting that minimizes both. Let’s take a look at performance.

As in UT2003, the framerate drops into the mid-30s, but here it does so only during a single stretch of the demo. Since Call of Duty is not as fillrate-intensive as UT2003, a higher level of anisotropic filtering can be used while maintaining adequate performance. Judging from these two games, 1280×960 or 1280×1024 with 2x antialiasing and at least 4x anisotropic filtering is the ideal combination of settings for many current-generation games.

Savage

Released last year to very little fanfare, Savage is a hybrid first-person shooter and real-time strategy multiplayer game in the vein of the original Starsiege: Tribes. Savage is also notable because it is one of the very few games that use OpenGL without being built on an id engine. Despite its lack of popularity, Savage happens to be one of the best multiplayer games I’ve played in a long time. Savage makes heavy use of DirectX 7-level features, and even though it doesn’t use pixel shaders or the latest graphics technology, it manages to render expansive outdoor environments extremely well.

Because all Savage demos are played back at 20 FPS, I could not do the same kind of analysis with Savage that I did with Call of Duty and UT2003. Instead, I had to rely on Savage’s built-in benchmarking tools, which report only the average framerate.

Savage: average framerate (FPS)

1024×768
  No AA/No AF   37.0
  2xAA/8xAF     28.9
  4xAA/8xAF     27.3

1280×960
  No AA/No AF   34.9
  2xAA/8xAF     23.6
  4xAA/8xAF     21.8

1600×1200
  No AA/No AF   29.9
  2xAA/8xAF     18.1
  4xAA/8xAF     16.4

In Savage, antialiasing does incur a performance hit, but that hit is nowhere near the penalty from anisotropic filtering. Savage is completely bottlenecked by fillrate, which is barely taxed by antialiasing but used heavily by anisotropic filtering. This penalty is due largely to the angle-independent nature of NVIDIA’s anisotropic filtering: the NV35 applies the selected anisotropic filtering level to all angles in a scene, whereas ATI’s implementation varies its level of anisotropic filtering depending on the angle of a surface. Is this a huge issue? No, but it does illustrate one of the many ways in which ATI and NVIDIA cards differ. Something else to note regarding Savage is that it reads the card’s Z-buffer back into system memory to render the sun. This incurs a major performance penalty (something in the range of 35%) and should be fixed in an upcoming patch. Keep that in mind if you’re an avid Savage player.

Halo

Gearbox’s port of Bungie’s sci-fi shooter for the Xbox has been widely criticized for its absurdly unpredictable performance, and these criticisms are not without merit. As a predictor of single-player performance, Halo’s built-in timedemo, which I will be using, has absolutely no basis in reality. Performance in the single-player game is always very erratic, with the same scene running at three entirely different framerates on three different runs, but it can be summarized nicely by saying that it is terrible. So, you wonder, why am I including it? While the benchmark is useless for predicting Halo performance, it is useful for comparing video cards in one of the first games to use DirectX 9-level features.

Halo: average framerate (FPS)

No AA/No AF
  1024×768    44.43
  1280×960    33.82
  1600×1200   23.12

Not much to see here. At 1024×768, my totally unscientific tests (playing the game and using FRAPS, since you cannot record a demo in Halo) showed that framerates in the multiplayer game hovered around 35-40 FPS but dropped closer to 20 FPS in the single-player game.

X2: The Threat

Released late last year, X2: The Threat is the sequel to a space sim that became something of a cult classic a few years ago. X2 makes heavy use of DirectX 8-level features, such as DOT3 bumpmapping, but it does not seem to use a considerable number of pixel shaders. Even without these advanced effects, though, the game is visually stunning.

Before I continue, take a look at the first X2 screenshot. You’ll notice that the ship’s engines are shining through the hull. This happens very often in X2, but I believe that it is a problem with the game. Anyway, to test X2’s performance, I used the included “rolling demo.” Basically, it is a collection of scenes that are representative of what you can expect to see in the game itself. You’ll notice that I ran no benchmarks without antialiasing. X2 actually includes basic antialiasing controls in the application, and given the number of edges visible on the screen, it’s as important a detail setting as bumpmapping or shadows. Let’s take a look at performance.

X2 rolling demo: average framerate (FPS)

1024×768
  2xAA/0xAF   58.2
  2xAA/8xAF   53.8
  4xAA/8xAF   45.2

1280×960
  2xAA/0xAF   48.2
  2xAA/8xAF   44.3
  4xAA/8xAF   34.4

1600×1200
  2xAA/0xAF   37.6
  2xAA/8xAF   34.1
  4xAA/8xAF   24.8

The X2 rolling demo has one major problem, though, and it is the same one as the Halo benchmark: as a predictor of X2’s in-game performance, it is meaningless. However, unlike Halo, there is a good reason for this. X2, with its constant simulation of the game universe’s economy, is completely CPU-limited and, on my processor, runs at around 30 FPS regardless of what settings I use (with the exception of 4x antialiasing and 8x anisotropic filtering at 1600×1200). With that said, 2x antialiasing and 8x anisotropic filtering is a viable option even at 1600×1200. For those of you with monitors that support a refresh rate above 60 Hz at that resolution, it is definitely the way to play X2. I was honestly surprised that performance didn’t completely tank at 1600×1200, but that is probably thanks to the 256-bit memory bus.


Overclocking


To overclock the FX5900XTV, I used the tried-and-true method of enabling Coolbits in the Forceware drivers through the registry. For those of you who aren’t familiar with Coolbits, it’s a registry tweak that adds a “Clock Frequencies” page to the driver control panel. As of 56.64, there is an “Auto overclocking” option in addition to the “Manual overclocking” setting. At this point, I’m guessing that the auto overclocking option is essentially NVIDIA’s version of ATI’s Overdrive, but I can’t be certain without further testing. Regardless, I used the manual overclocking option to find this particular FX5900XTV’s sweet spot. Automatically detecting the maximum overclock yielded very disappointing results: 399 MHz on the core and 709 MHz on the RAM, both just 9 MHz over stock. I knew (or at least hoped) that the card could do better than that, so I began to play with the values myself. When I was finished, the card reached 450 MHz on the core and 725 MHz on the RAM without artifacts or stability problems.
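For reference, here’s a minimal sketch of flipping the Coolbits value from a script instead of editing the registry by hand. The key path and value shown are the ones commonly cited for Forceware-era drivers; they are assumptions that may not match every driver version, so back up your registry and run it from an administrator account.

```python
# Minimal sketch (Windows only): set the Coolbits registry value so the
# Forceware control panel exposes the "Clock Frequencies" page.
# The key path and value below are commonly cited for Forceware-era drivers
# but are assumptions; verify them against your driver version first.
import winreg

KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"  # assumed location

def enable_coolbits(value=3):
    # Writing under HKEY_LOCAL_MACHINE requires administrator rights.
    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
    try:
        winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, value)
    finally:
        winreg.CloseKey(key)

if __name__ == "__main__":
    enable_coolbits()  # reopen the driver control panel afterwards
```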

Overclocking Performance – Halo

Halo, average framerate (FPS): stock (390/700) vs. overclocked (450/725)

1024×768
  390/700   44.4
  450/725   49.2

1280×960
  390/700   33.8
  450/725   38.1

1600×1200
  390/700   23.1
  450/725   26.2
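To put those numbers in perspective, the quick sketch below works out the percentage clock increases and the corresponding Halo gains from the table above.

```python
# Percentage gains from the stock 390/700 clocks to the 450/725 overclock,
# using the Halo averages from the table above.
stock_core, stock_mem = 390, 700
oc_core, oc_mem = 450, 725

halo = {"1024x768": (44.4, 49.2), "1280x960": (33.8, 38.1), "1600x1200": (23.1, 26.2)}

print(f"core: +{(oc_core / stock_core - 1) * 100:.1f}%, "
      f"memory: +{(oc_mem / stock_mem - 1) * 100:.1f}%")
for res, (stock_fps, oc_fps) in halo.items():
    print(f"{res}: {stock_fps} -> {oc_fps} FPS "
          f"(+{(oc_fps / stock_fps - 1) * 100:.1f}%)")
```

In other words, a roughly 15% core overclock and a 4% memory overclock translate into an 11-13% framerate gain in Halo.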

While I was happy with the core’s overclocking ability, I was somewhat disappointed with the RAM. However, since this is a mainstream card and does not use the same memory chips as the 5900 or 5900 Ultra, I can’t say that I was really surprised by the results. Of course, these results are not necessarily indicative of how well all FX5900XTVs overclock, since I have only tried a single card; still, they do suggest that the custom cooler is very effective at cooling the core.

Conclusion


Before I finish, there are a few issues relating to the FX5900XTV that I need to touch on. To be honest, these aren’t issues with the Albatron card in particular but rather with the NV35 chipset. First of all, the NV35 has terrible antialiasing quality compared to ATI’s; the differences, especially with 4x antialiasing, are astounding. Comparing ATI and NVIDIA antialiasing quality is beyond the scope of this review, but 3DCenter has written two excellent articles on antialiasing technology and on antialiasing quality and performance that demonstrate the quality differences and explain the reasons behind them. This is definitely ATI’s greatest advantage over NVIDIA in the mainstream market; the 9600XT is capable of the same antialiasing quality as the higher-end R300 and R350 cards, while the NV35 offers the same antialiasing found in the GeForce 4. I can’t really quantify how much of an impact the poorer antialiasing quality should have on a purchasing decision. For a flight simulator fan, antialiasing quality is paramount, while a shooter fanatic would probably care much more about raw speed than image quality. It’s personal preference.

Second, I have serious problems with NVIDIA’s pattern of sacrificing image quality for performance in newer drivers. For most people, the 3DMark03 fiasco comes to mind, but I am referring primarily to the crippled “brilinear” filtering that has replaced traditional trilinear texture filtering in all applications, regardless of what the application or the user requests. Once again, the exact nature of brilinear filtering is beyond the scope of this review, but, once again, 3DCenter has performed an investigation into texture filtering with the Detonator 50 drivers. Keep in mind that, as of the Forceware 55 series, brilinear filtering is also present in OpenGL applications. I’m not opposed to the idea of brilinear filtering; I think that, for many users, it would be an excellent choice, allowing them to sacrifice some texture quality for a significant speed increase. However, I am opposed on principle to forcing brilinear filtering and ignoring a user’s choices.

Finally, the NV35’s speed in applications that use PS2.0 shaders concerns me. NVIDIA has been far behind ATI in PS2.0 performance this generation, although I’m not convinced that the NV35’s PS2.0 speed will be a problem relative to the 9600XT. Given the tiny number of games on the market right now that make extensive use of PS2.0 shaders, I’m not willing to call it a huge problem, but it is something to consider. It will certainly matter more in the next six months, but in those games the FX5900XTV and 9600XT could perform about the same. If a game also stresses a card’s memory subsystem, the FX5900XTV would probably perform at least as well as the 9600XT, but it would still be held back by its PS2.0 performance. Of course, all of this is speculation, but it could prove to be a problem. Since it isn’t a problem yet and may never be one, I can’t really fault the FX5900XTV for its PS2.0 speed. Still, the NV35 might be closer to a DX8.1 chipset than a DX9.0 chipset simply because its floating-point shader performance is so weak.

Pros

  • Very fast for a mainstream card
  • Excellent custom cooler
  • Inexpensive
  • VIVO support
  • Application profiles in the drivers

Cons

  • PS2.0 speed could be problematic
  • Forced “brilinear” filtering
  • Poor antialiasing quality
  • Bundle could be better

As for issues with Albatron’s implementation of the NV35, nothing jumps out at me as a problem. Sure, I’d love for them to bundle Far Cry or a more impressive game, but I can’t think of a single inexpensive card I’ve ever seen that’s really had an excellent bundle. If the card were half an inch shorter, that’d be great, but that’s an NVIDIA design issue, not Albatron’s. The cooler is unintrusive in every way, the card is fast, and it’s a great value. The drivers even have application profiles. Can you really ask for more? If you’re currently in the market for a mainstream card, you’d be silly not to consider the Albatron FX5900XTV. Even with next-generation cards imminent, the trickle-down effect will still take weeks or months to bring the current high-end cards down to the FX5900XTV’s price point.

I’m awarding the Albatron FX5900XTV a 9 out of 10 and the Bjorn3D Seal of Approval.
