April 16, 2024

Nvidia Brings Generative AI Chatbot To Windows PC 

Nvidia, the company best known for its graphics chips, is now venturing into generative AI with a new chatbot for Windows PCs.

The announcement came on Tuesday, February 13, shortly before the company released an early version of the chatbot, called “Chat With RTX”.

As of now, this demo is free to use and supports .pdf, .txt, .xml, and .doc file formats.

The biggest benefit of this chatbot is that it runs locally. While other AI chatbots, such as those from OpenAI, are hosted on the provider’s platform, Chat with RTX is hosted on your own device, so you can use it to help your team be more productive without relying on a third-party provider.

It’s also quite simple to use. You can create your own chatbot and customize it with your own data (such as product manuals, company policies, and FAQs) as its source material. In short, you get to shape what the bot knows and how it behaves.

Another benefit of using local data is that you don’t have to share those details with anyone, and you won’t even need an internet connection. Your files remain yours alone, secure on your computer.

“Since Chat with RTX runs locally on Windows RTX PCs and workstations, the provided results are fast — and the user’s data stays on the device.”

- Jesse Clayton, Nvidia product manager

Another major difference between other AI chatbots and Nvidia’s is their purpose: Chat with RTX is geared toward personal use, since you feed it only your own data.

Hence, it’ll double as your virtual assistant. You can ask it “What was my total expense last month?” and, if you have a spreadsheet on your device, it’ll pull the numbers from there and let you know.
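To make that concrete, here is a minimal sketch of the kind of local lookup such a question boils down to. This is purely illustrative: Chat with RTX answers via a language model over your indexed files, not code like this, and the row layout and column names here are hypothetical.

```python
from datetime import date

def total_expense(rows, year, month):
    """Sum the 'amount' column for entries that fall in the given month.

    Rows stand in for spreadsheet data, e.g. loaded via csv.DictReader.
    """
    total = 0.0
    for row in rows:
        d = date.fromisoformat(row["date"])
        if d.year == year and d.month == month:
            total += float(row["amount"])
    return total

rows = [
    {"date": "2024-01-05", "amount": "120.00"},
    {"date": "2024-01-20", "amount": "35.50"},
    {"date": "2024-02-02", "amount": "80.00"},
]
print(total_expense(rows, 2024, 1))  # 155.5
```

The point is that the data never leaves the machine; the chatbot simply saves you from writing the lookup yourself.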

If you want to use it for research purposes, you can also feed it links to YouTube videos and playlists. This way, you can ask it to pull a video’s transcription or answer specific questions about the topic.

Anyone with an Nvidia GeForce RTX 30 Series GPU or higher and at least 8 GB of video RAM can use the new chatbot. You’ll also need a PC running Windows 10 or 11 with the latest Nvidia drivers. If your system meets these requirements, you can download the demo app from Nvidia’s official website.

There are two LLMs (large language models) to choose from: Mistral or Llama 2. If you are going to be dealing with a lot of long texts, we recommend Mistral.

Speaking of the technology, Chat with RTX runs on GeForce-powered Windows PCs using retrieval-augmented generation (RAG), Nvidia RTX acceleration, and Nvidia TensorRT-LLM software.

RAG makes AI models more precise and accurate by retrieving information from external sources before generating an answer, which in turn makes their responses more dependable.
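The core idea can be sketched in a few lines. This is a deliberately simplified illustration, not Nvidia’s implementation: real RAG systems, Chat with RTX included, rank documents with vector embeddings rather than the naive word-overlap scoring used here, and the documents and prompt format are made up for the example.

```python
def retrieve(query, documents, k=1):
    """Rank local documents by word overlap with the query; return the top k."""
    q_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(query, documents):
    """Prepend the retrieved context so the model answers from local data."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "The warranty covers repairs for two years from purchase.",
    "Remote work requires manager approval.",
]
print(build_prompt("How long does the warranty cover repairs?", docs))
```

Whatever the retrieval method, the pattern is the same: fetch the most relevant local passages first, then let the model generate its answer grounded in that context instead of relying on its training data alone.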

Previously, Sam Altman was in the news for his multi-billion-dollar chip project, through which he plans to compete with Nvidia in the chip market. But it looks like Nvidia is upping the game by entering the AI industry, where Altman’s OpenAI holds a strong position.

