The GeForce FX goes mobile

With its new mobile GPUs, NVIDIA extends the FX family to notebooks and brings DX9 to the mobile market.

Introduction


A few years ago a new chip for laptops/notebooks would have been met with a huge 'yaaaawn' from gamers. Things have changed though, and now both ATI and NVIDIA can't wait to release faster and more feature-rich chips each year. Today it is time for NVIDIA's FX family to be extended to the mobile market.


The Chips


The two chips that NVIDIA is introducing today are the GeForce FX Go5600 and the GeForce FX Go5200. I'm glad that they have moved to a more unified naming convention with the FX chips, since the MX-versus-Ti confusion hurt NVIDIA, in my opinion.

Just like their desktop counterparts, the Go5600 and Go5200 are DirectX 9 GPUs. This means NVIDIA has not only managed to bring DX9 parts to all segments of the desktop market but also now onto almost every segment of the mobile market.

Just as ATI did, NVIDIA has kept the chips pin-compatible, which means that if an OEM wants to put an older GeForce4 Go, an FX Go5200 or an FX Go5600 in a notebook, they only need one socket and one set of drivers.

Let's take a look at the specs and see what the mobile versions have in common with their desktop counterparts and what the differences are.



As you can see, mobile users get the full FX treatment, including vertex shader 2.0+ and pixel shader 2.0+ support.

Let's Watch a Movie


With more and more notebooks/laptops having both a DVD drive and TV-out, using them for watching movies has become more important. The new FX Go5x00 chips feature VPE 2.0 (Video Processing Engine – another 'nice' acronym):

  • New motion adaptive deinterlacing
    – Improves the playback on LCD screens
  • Enhanced filtering & Gamma-correction
    – Now you can have one gamma setting for movies and another for regular windows
  • MPEG Video Deblocking
    – Sophisticated circuitry built into the GeForce FX Go mobile GPUs provides enhanced deblocking, scaling, filtering, and sharpening.
  • Integrated TV encoder
  • Full MPEG2 decode hardware implementation
    – VPE 2.0 offers an MPEG-2 decode algorithm that is more efficient than those found in previous-generation GPUs. The result is the industry's lowest CPU usage for DVD decode: less than 10 percent.
  • MPEG2 encode hardware assist
  • Advanced Digital Vibrance Control
  • Enhanced nView support for LCD, TV, DVI, CRT

Here’s how the new FX mobile chips’ features compare to the old GeForce4 Go chips:

It's All About the Power


It doesn't really matter how great a chip is if it draws so much power that it is useless in a notebook/laptop. NVIDIA of course knows this and has worked hard to make the new FX Go5x00 chips even more power-efficient than before. Enter NVIDIA's PowerMizer 3.0 (where do they get all those names?).

So what has NVIDIA done to reduce the battery drain? The GeForce FX Go GPUs have the new CineFX engine, which offloads geometry (transform and lighting) and rendering calculations from the CPU to the GPU. Since the CPU draws a lot of power, this helps reduce the drain on the battery.

Next we have an enhanced MPEG-2 decode engine. We all know how horrible it is to watch a movie while waiting for, or sitting on, an airplane and have the battery run empty before the end. To make sure you get the most out of your battery, NVIDIA has implemented a full MPEG-2 decode stack in VPE 2.0. By offloading the decoding from the CPU, the GPU allows the CPU to stay in a low-power idle state most of the time. The GPU itself is efficient enough to do this work at a low frequency and voltage, resulting in very low overall system power consumption. Combined, this means longer battery life.

The next feature that has been enhanced in the GeForce FX Go GPUs is the monitoring and control logic. On-chip performance monitors constantly oversee the activity levels of the numerous hardware modules in the GPU, so units are gated and scaled back only when they are not needed. The granularity of this monitoring scheme is dramatically enhanced in the current GeForce FX Go GPUs, and it is paired with an equally fine-grained built-in control capability: tightly integrated features allow dynamic control of clock scaling, clock gating and supply-voltage scaling.

So, full battery savings or full performance? How will the system know what the user wants? NVIDIA of course provides a user interface so the user can decide.

Basically the user can select between three levels: Maximum Power Savings, Balanced and Maximum Performance. Depending on the level you select, the PowerMizer software reduces the maximum operating frequency of the GPU to cut power consumption. Using a patent-pending technique, PowerMizer also reduces the workload on both the GPU and the CPU, which maximizes battery life.
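To make the idea concrete, here is a purely hypothetical sketch of what such a mode-to-clock-cap policy could look like inside a driver. The mode names are NVIDIA's; the clock figures are made-up placeholders, not published specifications:

```python
# Hypothetical sketch of a PowerMizer-style policy table.
# Mode names come from NVIDIA's UI; the clock caps are invented placeholders.
from enum import Enum

class PowerMizerMode(Enum):
    MAXIMUM_POWER_SAVINGS = "Maximum Power Savings"
    BALANCED = "Balanced"
    MAXIMUM_PERFORMANCE = "Maximum Performance"

# Placeholder maximum GPU clocks per mode, in MHz (not real specifications).
MAX_GPU_CLOCK_MHZ = {
    PowerMizerMode.MAXIMUM_POWER_SAVINGS: 100,
    PowerMizerMode.BALANCED: 200,
    PowerMizerMode.MAXIMUM_PERFORMANCE: 350,
}

def gpu_clock_cap(mode: PowerMizerMode) -> int:
    """Return the clock ceiling the driver would enforce for a given mode."""
    return MAX_GPU_CLOCK_MHZ[mode]

print(gpu_clock_cap(PowerMizerMode.BALANCED))  # 200 MHz (placeholder value)
```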

Maximum Performance

When Maximum Performance is selected, no regard is paid to the level of CPU usage. Maximum frame rates are always achieved, even if they are too fast for the user to keep up with, or too fast for the screen to display.

NVIDIA ran a test scene of 1000 frames. During the short test execution the CPU utilization was 100 percent, dropping to almost zero for the remainder of the 1:40 (1 min 40 sec) profiled. In Maximum Performance mode the test ran at 92 frames per second. While the test was running, the current drawn by the system was logged with a digital amp meter at 0.1-second intervals. The power was supplied from a 20 V source, so power is calculated by multiplying the 1.5 A average by 20 V, which equals 30 W.

Balanced

The Balanced setting trades frame rate for power savings. It pays particular attention to load balancing between the CPU and the GPU, ensuring that the load is placed on the GPU so the CPU can idle in a lower power state. The same scene as mentioned above ran at 55 FPS, and both CPU utilization and power consumption decreased. The average current drawn was less than 1.2 A, which translates to 24 W from the 20 V supply. The PowerMizer Balanced mode thus offers a balance between high performance and significant power savings (20 percent in this example).

Maximum Power Savings

Setting PowerMizer at Maximum Power Savings provides the longest battery life. Clocks and voltage are held at their minimum settings to ensure the lowest power consumption. Don’t expect breakneck framerates, but for those who accept usable framerates and full power savings, this is the level to choose.

This time the scene ran at 26 FPS, and the average CPU utilization landed at around 10%. The average current of 1.1 A translates to 22 W, a 25 percent savings in power. In this mode, the graphics subsystem offloads the CPU to achieve the power savings.
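As a quick sanity check, the power figures quoted for the three modes follow directly from the logged currents and the 20 V supply. The short Python snippet below is just my own back-of-the-envelope arithmetic, not NVIDIA tooling:

```python
# Back-of-the-envelope power figures for the three PowerMizer modes,
# using the 20 V supply and the average currents logged in NVIDIA's test.
SUPPLY_VOLTAGE = 20.0  # volts

average_amps = {
    "Maximum Performance":   1.5,
    "Balanced":              1.2,
    "Maximum Power Savings": 1.1,
}

for mode, amps in average_amps.items():
    watts = amps * SUPPLY_VOLTAGE  # P = I * V
    print(f"{mode:22}: {watts:.0f} W")

# Prints 30 W, 24 W and 22 W -- the figures quoted above.
```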

Earlier I talked about clock scaling, clock gating, and supply-voltage scaling. What is that? It all comes down to the familiar dynamic-power equation, which relates power (P) to capacitance (C), voltage (V) and frequency (f):

P = C × V² × f

Reduce any variable on the right side of the equation, and the power consumption decreases.

Clock Scaling

The GPU is designed to run at frequencies as low as 16 MHz during the Win-Idle state. As expected, this helps lower the power consumption. The frequency is raised when performance is needed, and then drops back down when not needed.

Clock Gating

Clock gating is equivalent to reducing the frequency to zero. GeForce FX Go mobile GPUs use clock gating extensively to ensure that the unnecessary portions of the GPU use zero power.

Voltage Scaling

As you can see from the equation, power scales with the square of the voltage, so even a small reduction in voltage gives a large reduction in consumed power. The Go mobile GPUs have been designed to run at lower nominal voltages, thus saving power. In the PR paper this is attributed to "a power-saving 0.13 micron low-voltage process." Since the FX Go5200 'only' uses a 0.15 micron process, I'm not sure whether it differs in this respect from the Go5600, which is built on 0.13 micron.
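To see why these three techniques pay off, here is a small illustrative sketch (my own, not NVIDIA code) that plugs a few example scale factors into the P = C × V² × f relation from above. The specific scale factors are arbitrary example values:

```python
# Illustrative only: relative dynamic power under the P = C * V^2 * f model.
# The capacitance term cancels out when comparing against a baseline,
# and the scale factors below are arbitrary example values.

def relative_power(voltage_scale: float, frequency_scale: float) -> float:
    """Dynamic power relative to baseline, given scale factors for V and f."""
    return (voltage_scale ** 2) * frequency_scale

print(relative_power(1.0, 0.5))  # clock scaling: half the clock   -> 0.50x power
print(relative_power(1.0, 0.0))  # clock gating: frequency of zero -> 0.00x power
print(relative_power(0.8, 1.0))  # voltage scaling: 80% voltage    -> 0.64x power
print(relative_power(0.8, 0.5))  # both combined                   -> 0.32x power
```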

Conclusion


With the GeForce FX Go5200 and FX Go5600, NVIDIA has brought DX9 to the mobile market. Considering that ATI had the lead in the desktop DX9 market, NVIDIA will probably do everything it can to get its mobile DX9 chips out first.

The new FX Go chips seem to have everything that a user interested in games and/or video would want in a mobile solution. Since we don't have any notebooks/laptops with the chips to test, we can't verify all of NVIDIA's claims (we're working on borrowing a notebook though). But if they hold up, mobile gamers have something to look forward to from NVIDIA in the near future.
