The high-end tier, a resource intended primarily for training models, is pitched at 40 exaflops of compute, drawn in part from 10,000 GPUs, plus 200PB of storage. The vision document calls for the mid-range tier, or inferencing arm, to deliver 20.8 exaflops spread across four geographical centers in India, plus 400PB of storage. Each of the four centers needs 1,000 GPUs for inferencing tasks and 750 for AI training.
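As a quick sanity check on the mid-tier figures above, the per-center GPU counts stated in the vision document can be tallied in a couple of lines (a back-of-the-envelope sketch, not an official calculation from the document):

```python
# Mid-range (inferencing) tier as described in the India AI vision document:
# four geographical centers, each with 1,000 inferencing GPUs and 750 training GPUs.
centers = 4
inference_gpus_per_center = 1_000
training_gpus_per_center = 750

total_gpus = centers * (inference_gpus_per_center + training_gpus_per_center)
print(total_gpus)  # 7,000 GPUs across the mid-range tier
```

That puts the mid-range tier at 7,000 GPUs in total, alongside the 10,000 GPUs cited for the high-end training resource.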
India AI did note that China has the largest share of the world's supercomputers – 162 as of November 2022, comprising 32 percent of the machines whose details are recorded on the Top 500 list. India, meanwhile, counts only three, but the nation's National Supercomputing Mission aims to increase that number to 24.
The datasets have potential for data-driven governance, as well as for start-ups and research, said India AI.