
ChatGPT-maker to let thousands of hackers test the limits of Artificial Intelligence

OpenAI, Google, Microsoft, and other chatbot makers may soon allow thousands of hackers to test the limits of artificial intelligence (AI), according to recent reports. The companies are reportedly coordinating with the Biden administration to hold a mass hacking event at this summer’s DEF CON hacker convention in Las Vegas. The goal is to have people from a wide range of backgrounds probe the companies’ large language models and surface flaws that need fixing.

Large language models such as ChatGPT, Microsoft’s Bing chatbot, and Google’s Bard have been known to fabricate information and to exhibit cultural biases picked up from their training data. Government officials took notice of the issue in March at the South by Southwest festival in Austin, Texas, where Sven Cattell, founder of DEF CON’s AI Village, and Austin Carson, president of the responsible-AI nonprofit SeedAI, led a workshop inviting community college students to hack an AI model.

This year’s event will be much larger and will focus on large language models. Companies that have agreed to provide their models for testing include OpenAI, Google, Nvidia, Anthropic, Hugging Face, and Stability AI, though some details of the event are still being negotiated. Scale AI CEO Alexandr Wang stressed the importance of ensuring the safety of foundation models as they become more widespread.

The mass hack is expected to draw several thousand participants, and its coordinators hope it will mark the start of a deeper commitment from AI developers to measure and evaluate the safety of the systems they build. Jack Clark, co-founder of Anthropic, said AI systems need third-party assessment both before and after deployment, and that the mass hack is one way to provide it. Such large-scale testing has not really been done before, he added, so the industry needs practice to figure out how to do it.

 
