Another day, another rumour springs forth from the NVIDIA rumour mill. This week, the latest one involves an alleged GeForce RTX 4070 Ti, its supposed AD104 GPU, and how its performance may be on par with the RTX 3090 Ti.
The rumour comes, yet again, from the prominent and fairly reliable hardware leakster, Kopite7kimi (@kopite7kimi), who has been behind many of the NVIDIA RTX 40 series leaks over the last several months. In their latest tweet, the leakster claims that the RTX 4070 Ti will be fitted with an AD104 GPU and, more specifically, the PG141-SKU331 board model.
As I have mentioned before, there is an AD104 SKU with a 400W limit.
PG141-SKU331
a full-fat AD104 with 7680FP32
21Gbps 12G GDDR6X
It can easily match RTX 3090 Ti. — kopite7kimi (@kopite7kimi) August 1, 2022
The board designation is relatively important here, because it indicates that an RTX 4070 Ti based on this specific SKU could draw as much as 400W on its own, excluding the power required by the rest of the system. For context, that is roughly 100W higher than the 290W TDP of the current RTX 3070 Ti.
Other specifications Kopite lists for the alleged RTX 4070 Ti include 12GB of GDDR6X memory running at 21Gbps and 7680 CUDA cores, with the memory channelled through a 192-bit bus. On paper, that works out to 504GB/s of memory bandwidth, around 40% more than its rumoured non-Ti counterpart.
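That bandwidth figure is easy to verify: peak memory bandwidth is simply the bus width multiplied by the per-pin data rate, divided by eight to convert bits into bytes. Here is a minimal sketch of the arithmetic in Python, using the rumoured figures; the 384-bit, 24Gbps combination for the AD102 card mentioned below is our assumption, chosen because it lines up with the ~1.1TB/s claim, not a confirmed spec:

```python
def memory_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak bandwidth in GB/s: (bus width in bits x per-pin rate in Gbps) / 8 bits per byte."""
    return bus_width_bits * data_rate_gbps / 8

# Rumoured RTX 4070 Ti: 192-bit bus, 21Gbps GDDR6X
print(memory_bandwidth_gb_s(192, 21))  # 504.0 GB/s

# Assumed AD102 flagship: 384-bit bus, 24Gbps GDDR6X (hypothetical figures)
print(memory_bandwidth_gb_s(384, 24))  # 1152.0 GB/s, in the ballpark of the rumoured 1.1TB/s
```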
Unfortunately, that is all Kopite has to share about the RTX 4070 Ti, with nothing new on the other GPUs in the Lovelace family. While the power draw does seem pretty high for an alleged mid-range GPU, it is still nowhere near earlier rumours surrounding its more powerful brethren, one of which is expected to be fitted with an AD102 GPU and an alleged maximum TDP of 800W. In exchange for that power draw, that card could reportedly ship with a maximum memory bandwidth of 1.1TB/s.
Again, everything mentioned here is merely speculation and rumour. As such, we encourage you to apply the usual amount of skepticism, at least until NVIDIA makes it official.
(Source: Videocardz)