Artificial intelligence applications like ChatGPT and DALL-E are catching on with the masses, and the Department of Defense is taking note. To get ahead of the potential uses and risks of such tools, on August 10, the DoD announced a new study of generative AI.
While Pentagon research into AI certainly carries implications for weapons, the heart of the matter is really about using it to process, understand, and draw predictions from the military's collections of data. Sometimes this data is flashy, like video footage recorded by drones of suspected insurgent meetings or of hostile troop movements.
The study of generative AI will be conducted by the newly organized Task Force Lima, which will be led by the Chief Digital and Artificial Intelligence Office. CDAO was itself created out of an amalgamation of several other Pentagon offices into one designed to help the military better use data and AI.
One such malicious possibility of generative AI is using it for misinformation. While some image-generation models leave somewhat obvious tells in modified photos, like people with an unusual number of fingers or teeth, many images are passable and even convincing at first glance. In March, an AI-generated image of Pope Francis in a white puffer coat proved compelling to many people, even as its AI origin became known and reproducible.