Nvidia is no stranger to the field of AI; if anything, it has become a household name in it. The graphics company already showcased ACE For Games last year, integrating generative AI tech into video games, so it was probably only a matter of time before it announced an AI chatbot of its own. That chatbot is here, and Nvidia calls it Chat with RTX.
One element that sets this AI chatbot apart is that it runs locally on your machine rather than through the cloud, as most currently do. This means you get responses to your queries faster. And if you're worried about the privacy risks of cloud-based AI chatbots, that is less of an issue with the locally run Chat with RTX.
On the flip side, you'll need a relatively beefy system to run it. The ZIP file you can download from its web page is slightly over 35GB in size. And to run Chat with RTX, you'll need a GeForce RTX 30- or 40-series card with at least 8GB of VRAM, as well as 16GB of system RAM.
It's probably worth noting at this point that Nvidia labels Chat with RTX as a demo. Whatever that means for its future, for now it likely means you should expect some jank when using it. If you're still interested, you can download the demo chatbot via the link below.
(Source: Nvidia)