
There’s been a lot of buzz about the RTX 5090 and other new NVIDIA models from graphics card manufacturers in the last 24 hours. We’re talking both leaks and some official announcements.
The website VideoCardz spotted a list of five next-gen NVIDIA video cards on Zotac’s support site. The company quickly removed the info, but “screenshots don’t burn.” The site revealed the following models:
- GeForce RTX 5090
- GeForce RTX 5090D (China)
- GeForce RTX 5080
- GeForce RTX 5070 Ti
- GeForce RTX 5070
Screenshots from Zotac / VideoCardz
Besides the list itself, new filters have popped up in the graphics card category: GDDR7 and 32 GB, neither of which was available during the RTX 40xx era. Obviously, the beefiest video card, the RTX 5090, will pack the most memory. The list also indirectly rules out near-term availability of the RTX 5060 and RTX 5080D.
Acer confirmed the GeForce RTX 5090 (32 GB) and RTX 5080 (16 GB GDDR7) through its Predator Orion 7000 gaming PCs. Besides NVIDIA’s new releases, these computers will also feature Intel Core Ultra 200 processors. According to the published specs, the RTX 5080 will have only 16 GB of VRAM, half as much as the flagship.

A press release from Inno3D dedicated to CES 2025 mentions some new features that Blackwell video cards will get. These include Advanced DLSS Technology (possibly DLSS 4), Enhanced Ray Tracing, and the mysterious Neural Rendering Capabilities. The company also confirmed its card series: the liquid-cooled iChill Frostbite, iChill X3, X3, Twin X2, and the new White and Small Form Factor lines.

The latest rumors say the NVIDIA RTX 5080 will use a GB203-400 GPU with 10752 CUDA cores and 16 GB of GDDR7 memory on a 256-bit bus, with a 400 W power limit. The RTX 5090 will feature a GB202-300 GPU with 21760 CUDA cores, 32 GB of GDDR7 memory, and power consumption under 600 W. Benchlife reports the RTX 5080 might be the only model in the new lineup with 30 Gbps memory; the rest will run at 28 Gbps.