ATI R580+ supports GDDR 4

The Inquirer found out one key thing about the new R580+ chip: "We gave you most of the details (Click Here) already, but we can now confirm that ATI's new chip can and will support GDDR4.

All cards after the R520 could support GDDR4, but the next-generation GDDR4 memory will obviously work better with the new chips. The Radeon X1000 generation's memory controller already has the support, but the R580+ will have even better, more polished support."

Samsung and Hynix have been sampling GDDR4 since late 2005, so by the time the R580+ is ready in late summer there are unlikely to be memory shortages, as Samsung is already in mass production. GDDR4 is based on a couple of new techniques, which Samsung calls Data Bus Inversion and Multi-Preamble.
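Data Bus Inversion, for the curious, cuts power and noise on the bus: if a byte would drive more than half of its eight data lines into the current-hungry state, the chip transmits the inverted byte and asserts a ninth DBI flag line instead. A rough Python sketch of the idea (illustrative only, not Samsung's exact encoding):

```python
def dbi_encode(byte):
    """DC Data Bus Inversion, sketched for one 8-bit lane.

    If more than half the 8 data lines would carry a 0 (the
    power-hungry state on a terminated bus), transmit the inverted
    byte with the DBI flag set; otherwise send it unchanged.
    """
    zeros = 8 - bin(byte).count("1")
    if zeros > 4:
        return (~byte) & 0xFF, 1   # inverted data, DBI flag asserted
    return byte, 0                 # data unchanged, flag clear

def dbi_decode(byte, flag):
    """Receiver side: undo the inversion when the flag is set."""
    return (~byte) & 0xFF if flag else byte
```

The payoff is that no transmitted byte ever has more than four lines in the expensive state, which trims worst-case current draw and simultaneous-switching noise at high clocks.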

Hynix had a sample too, claiming the world's first 512Mb GDDR4 memory chip back in October '05.

Hynix's part runs at 1.45GHz and can shuffle up to 11.6GB of data every second. GDDR4 can process almost double the data that GDDR3 can in the same amount of time. GDDR4 is essentially a tweak of the previous point-to-point specification rather than a radical revision, and it can work at higher frequencies.
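Those Hynix numbers are internally consistent: GDDR moves data on both clock edges, so a 1.45GHz interface clock means 2.9Gbps per pin, and a 32-bit-wide chip then shifts 2.9 × 32 / 8 = 11.6GB/s. A quick sketch of the arithmetic (the x32 chip width is the usual graphics-DRAM assumption, not something the article states):

```python
def chip_bandwidth_gbs(clock_ghz, pins=32, transfers_per_clock=2):
    """Peak bandwidth of one GDDR chip in GB/s.

    Data moves on both clock edges (double data rate), so the
    per-pin data rate is twice the interface clock; a typical
    graphics DRAM chip exposes a 32-bit (x32) interface.
    """
    per_pin_gbit = clock_ghz * transfers_per_clock  # Gbps per pin
    return per_pin_gbit * pins / 8                  # bits -> bytes

# Hynix's 1.45GHz part: 2.9Gbps per pin, x32 interface -> 11.6GB/s
```

A full card multiplies this by the number of chips on its memory bus, which is why a 256-bit board gets eight times the figure of a single chip.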

Samsung's part runs at 1.25GHz; the company had 256Mb versions back in October '05 and claimed it could push the speed to 1.4GHz by the end of that month.

The 256Mb parts can transfer data at up to 2.5Gbps per pin, up from the 1.6Gbps achieved by graphics memory currently on the market, but Samsung said it was already gearing up to offer samples capable of 2.8Gbps data throughput in late October '05.
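Samsung's figures line up the same way: double-pumping a 1.25GHz clock gives exactly the quoted 2.5Gbps per pin, and the promised 1.4GHz clock would give the 2.8Gbps samples mentioned above. A quick check:

```python
# Per-pin data rate = interface clock x 2 (data moves on both clock edges)
for clock_ghz in (1.25, 1.4):
    print(f"{clock_ghz}GHz clock -> {clock_ghz * 2}Gbps per pin")
```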

We don’t know when Nvidia plans to embrace this memory architecture but it could be with its G80 technology.

As you can see, there is a big difference between what each memory company claims about GDDR4 performance; we will just have to wait and see how the chips actually perform.
 
Estimates put the R580+ cementing ATI's lead by a good bit. Then the R600 and G80 come around when DX10 and Vista hit.

If anything, Nvidia's taking a shot to the balls.
 
I think nVidia is just waiting for the right moment to release something so vicious it's unfathomable :)
 