Schumer Wants AI to Be Explainable. It’s Harder Than It Sounds

  • 📰 TIME


Chuck Schumer's proposal reveals how difficult it could be for policymakers to regulate a technology even experts struggle to fully understand.

For example, researchers at OpenAI used GPT-4, their most powerful model, to label all of the neurons in a much smaller model, GPT-2. The researchers were able to find multiple neurons that seemed to correspond to recognizable concepts, such as a neuron that seemed to activate for 'doing things right'.
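The core of that labeling approach can be sketched as a scoring loop: given a neuron's recorded activations over text snippets and a candidate natural-language explanation, predict what the explanation implies for each snippet and measure how well the predictions correlate with the real activations. Everything below is an illustrative toy — the snippets, activations, and keyword-based "simulator" are made up; the actual research used GPT-4 itself as the explainer and simulator.

```python
# Toy sketch of neuron-explanation scoring. All data here is invented
# for illustration; a real pipeline would query a language model
# instead of matching keywords.

from statistics import fmean

# Hypothetical recorded activations of one neuron over text snippets.
snippets = [
    "she did the right thing",
    "the stock price fell",
    "he acted with integrity",
    "rain is forecast tomorrow",
]
real_activations = [0.9, 0.1, 0.8, 0.0]


def simulate(explanation: str, snippet: str) -> float:
    """Stand-in simulator: predict activation implied by the explanation.

    A real system would ask a language model to do this; keyword
    overlap is enough to demonstrate the scoring step."""
    keywords = {"right", "integrity", "honest"}
    return 1.0 if any(word in snippet for word in keywords) else 0.0


def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation between two equal-length sequences."""
    mx, my = fmean(xs), fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)


explanation = "activates for doing things right"
simulated = [simulate(explanation, s) for s in snippets]
score = pearson(real_activations, simulated)
print(f"explanation score: {score:.2f}")  # close to 1.0 = good explanation
```

A high correlation suggests the explanation captures what the neuron responds to; doing this for every neuron in a model is what makes the approach expensive, and why it was applied to a small model like GPT-2 rather than to the frontier systems legislation would actually target.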

In practice, explainability would need to be implemented differently depending on which AI system is being used and what it's being used for, and it would be left to AI developers to work out how to do this in each case. The difficulty arises from developers' inability to provide any reliable explanation for decisions made by more complex systems, such as LLMs. If legislation were passed requiring all AI systems to be explainable regardless of their architecture, it could spell disaster for companies developing these more complex systems.

In the second scenario, regulators, aware that explainability is not currently feasible for more complex AI systems, could create carve-outs for lower-risk tasks or only partially enforce the legislation that Congress enacts.

 
