YOUR CORRESPONDENT HAD
an exciting weekend, taking a plane from Zagreb to *somewhere*, enjoying some haute cuisine and picking up one interesting board, to be known to the market as the Radeon HD 2600XT. This board is based on the 65nm RV630 chip and is accompanied by 256MB of ultra-fast GDDR4 memory from Samsung.
RV630 is really a small chip, almost 50% smaller than RV560 (X1650XT)
Since this is the very first 65nm graphics chip in the world, we decided to show you the chip first. As far as we know, this chip packs a serious number of transistors, a little less than 400 million. All of this is packed into a 13x11mm die, or roughly 143-145mm²... manufactured in TSMC's fab in Taiwan, of course.
...in contrast, even G84-400 is larger.
As we all know, Nvidia managed to pack only 32 scalar shaders into its own chip, which measures 169mm². Thanks to the 65nm process, AMD managed to get RV630 into a smaller die than G84, yet it offers a vastly higher number of scalar shaders - 120 compared to 32. Nothing to be sneezed at. Given that OEMs are telling us the whole board consumes around 40-45W, you can imagine that Graphzilla of Satan Clara fame is worried. Worried a lot...
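For the curious, the die-size figures above work out like this. A quick sketch - the 390 million transistor count is our own rounding of "a little less than 400 million", so treat the density figure as approximate:

```python
# Back-of-the-envelope check on the die figures quoted above.
# The 390M transistor count is an assumption ("a little less than 400 million").
rv630_area = 13 * 11            # 143 mm^2 (13x11mm die)
g84_area = 169                  # mm^2, per the text
transistors = 390e6             # assumed RV630 transistor count

density = transistors / rv630_area                  # transistors per mm^2
size_advantage = (g84_area - rv630_area) / g84_area

print(f"RV630 die area: {rv630_area} mm^2")
print(f"Density: {density / 1e6:.2f}M transistors/mm^2")
print(f"RV630 is {size_advantage:.0%} smaller than G84")
```

Roughly 2.7 million transistors per square millimetre, and a die some 15% smaller than G84 - which is the whole point of the 65nm shrink.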
This is not the final production chip, however. The reason Henri Richard was wrong when he promised a simultaneous hard launch of the complete Radeon HD 2000 series (can you say "10 products" again?) in front of the whole analyst corps is that both RV610 and RV630 need another respin. The boards will be ready just after Computex, with availability to follow at the very end of June - or should we say July?
The RV630XT Board - GDDR-4 monster
The board itself is fairly light, but surprisingly long
We saw pictures of empty PCBs quite some time ago, and in general the board looks like an X1800XL without the 256-bit memory controller. The cooler is a long one, just like the PCB itself... but there is no sign of any additional power connector - no 4-pin Molex, no 6-pin PEG (PCI Express Graphics) - which we grew accustomed to. After seeing the X1650XT sporting a 6-pin PEG connector, this successor to the series does without any, and even leaves some room for overclocking.
The backside of the board shows room for yet another 256MB of GDDR-4 memory
As we wrote above, the RV630XT eats up 40-45W, which leaves around 30W of the PCIe slot's 75W budget to play with. We're certain that factory-overclocked boards will feature an additional power connector, and then, with a two-slot cooler or even a water-cooling block, it is not hard to imagine what a 100W headroom would offer. Our sources tell us that these chips are happy campers when it comes to overclocking; the only real issue is getting them to run stable. As it stands, chips are either perfectly stable or not stable at all. It is purely a question of luck, and mine was good enough that I was quite happy not to be in Las Vegas...
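For reference, the arithmetic behind that headroom claim - the 75W slot and 6-pin PEG limits come from the PCI Express spec, the board draw from the OEM figures above:

```python
# Rough power-budget sketch for the RV630XT, using the figures above.
SLOT_POWER_W = 75           # max a PCIe x16 slot delivers by spec
PEG_6PIN_W = 75             # extra power from one 6-pin PEG connector

board_draw_w = 45           # upper end of the quoted 40-45W range

slot_headroom = SLOT_POWER_W - board_draw_w              # slot alone
peg_headroom = SLOT_POWER_W + PEG_6PIN_W - board_draw_w  # with a 6-pin PEG

print(f"Slot-only headroom: {slot_headroom}W")
print(f"With a 6-pin PEG:   {peg_headroom}W")
```

That works out to roughly 30W from the slot alone, and north of 100W once a 6-pin connector is soldered on - which is where the factory-overclocked boards come in.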
Compared to the RV630XT, the GeForce 8600GTS board looks positively dwarfed - yet the smaller card wants more juice to keep it running.
The coolers also show the difference between the two cards. The 8600GTS's looks rather inadequate next to the RV630XT's.
The board we had was clocked at 799.84MHz for the GPU and 1100MHz DDR for the memory, i.e. 2.2GHz effective... since the GPU communicates with the memory over a 128-bit memory interface, you end up with 35.2GB/s of theoretically available bandwidth. Yes, we know that RV670 will bring a 256-bit memory interface to the mainstream segment, but this is a story about RV630.
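A minimal sketch of that bandwidth calculation - effective data rate times bus width in bytes:

```python
# Theoretical memory bandwidth of the RV630XT board we had.
mem_clock_mhz = 1100        # base memory clock
ddr_factor = 2              # DDR moves data twice per clock -> 2.2GHz effective
bus_width_bits = 128

effective_mhz = mem_clock_mhz * ddr_factor
bandwidth_gb_s = effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Effective rate: {effective_mhz / 1000:.1f}GHz")
print(f"Theoretical bandwidth: {bandwidth_gb_s:.1f}GB/s")
```

Double the 128-bit bus to 256 bits, as RV670 will, and the same clocks would deliver twice the figure.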
The systems we planned to test on are our trusty AMD- and Intel-based INQtest rigs, consisting of the following components:
AMD Athlon 64 X2 6000+
ASUS M2R32-MVP (ATi 580X)
2x1GB Corsair Dominator PC2-9136C5D
Seagate Barracuda ES 250GB
beQuiet! 850W PSU
Intel Core 2 Extreme QX6800
2x1GB GeIL PC2-8500/9600 MultiSpec Kit
Seagate Barracuda ES 250GB
Enermax Galaxy 850W PSU
Our standard setup from now on includes a dual-boot between 32-bit Windows XP and 64-bit Vista, where possible. Since we had several different versions of drivers, we installed 32-bit Vista as well - a triple-OS combo. Of course, each operating system was installed on a separate partition, with a complete set of drivers and patches. .NET Framework 2.0, the DirectX April 2007 Update, Visual C++ Express and Microsoft Office 2007 were also installed.
Problems, what problems?
The board in action... its BSODMark score is really high, and that is one score every hardware vendor should avoid
Sadly, it seems the software side did not get along with the drivers at hand. We tried several different hardware configurations, on both Windows XP and Vista - but to no avail. The drivers were not stable at all, and BSODs were as frequent as rain in Blighty. Even though we had several different driver builds - 8.31, 8.34, 8.37 and so on - we just could not get this thing to complete a single benchmark, especially favourites like the synthetic 3DMark05 and 06. We did manage to get a score in Stalker, but it was so bad (8600GT level) that we decided to postpone the benchmark scores until we get a decent driver to test this card with...
This is where Graphzilla can start celebrating. While AMD has far better hardware on paper, and packs more power under the hood, the products are just not ready. The high end may be ready to fly off the shelves in a way Nvidia could never deliver (we are getting reports of the channel having more HD 2900XTs than nV had in its two previous launches combined), but mainstream and low-end are where the real money lies... and right now, nV is enjoying the ride of its life.
What we can tell from this prototype card is that it crashes in every test we tried, regardless of hardware combination. If it worked, it would be a nice and not-so-loud card - but it does heat up significantly, leading us to believe that GDDR-4 memory still has some problems. Joe Macri's GDDR-3 memory keeps proving itself the best memory standard in the history of 3D and beyond; DAAMIT is lucky to have the guy on its payroll. Meanwhile, GDDR-4 has higher latencies and heat dissipation, making it a less-than-ideal choice for the current batch of graphics cards.
AMD still has a lot of challenges ahead. For a company with such an interesting slogan, the Smarter Choice would probably be to stop cutting R&D all the time - and to stop losing talent from its Satan Clara and Markham offices. That only fuels the constant build-up of engineers over at Chipzilla and Graphzilla, who are happy to get their claws on any engineer available.
Meanwhile, the wait for AMD's own mainstream and low-end DX10 cards continues. At least now we know what AMD will come up with. Just make it stable - pretty please?