While Intel suggests that application developers will soon infuse all software with AI, and that PCs must be ready for it, the workload that currently matters most is inferencing to power large language models. Inferencing is the process that transforms a submitted prompt into its response. Doing so requires a machine that can grind through multiple gigabytes of data – and even more matrix multiplications.
While such a compute-intensive task sounds like it could well stretch the capacities of a desktop PC (ChatGPT and its ilk run in colossal datacenters), I've discovered that a lot of the hardware we already have can do an adequate job of inferencing. Anyone can run a 'good enough' chatbot on a PC – provided the PC has enough RAM and a mid-range GPU. That means a GPU with at least 8GB of VRAM (and not one from Intel – at least at the moment, for lack of driver support). That's not too pricey – which is good, because it's table stakes. For RAM, you'd be happier with 32GB than 16GB, while 8GB simply won't cut it. And that's about it. That's all you're going to need.
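The 8GB VRAM floor follows from simple arithmetic: a model's weights must fit in GPU memory, and their size is roughly parameter count times bits per weight. As a minimal sketch – the 7B parameter count and quantization levels below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope VRAM estimate for loading an LLM's weights.
# Ignores KV cache and runtime overhead, which add to the total.

def weight_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate gigabytes needed just for the model weights."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal GB

# A hypothetical 7B-parameter model at common precisions:
for bits in (16, 8, 4):
    print(f"7B model @ {bits}-bit: ~{weight_gb(7, bits):.1f} GB")
```

At 16-bit precision a 7B model needs roughly 14GB and won't fit, but quantized to 4 bits it shrinks to about 3.5GB – which is why a mid-range 8GB card can handle 'good enough' chatbot duty.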
Source: TheRegister