Cryptopolitan 2024-12-03 22:15:58

ChatGPT finally says ‘David Mayer’ but fails on 5 names amid AI lawsuits

Artificial intelligence (AI) tools like OpenAI’s ChatGPT can perform many functions, but users occasionally discover glitches. Recently, reports surfaced that ChatGPT stopped functioning when asked about certain names, such as David Mayer. The community now questions how the model handles sensitive or legally complex data, and AI tools may face further privacy scrutiny amid several lawsuits.

ChatGPT doesn’t reveal who David Mayer is

OpenAI’s ChatGPT reportedly stopped working when asked about specific names last weekend. TechCrunch cited users reporting a no-go list of names, including “David Mayer,” that made the chatbot freeze or crash without responding. When Cryptopolitan checked on Tuesday how the chatbot, based on the Generative Pre-trained Transformer (GPT) language model, responded, the results were similar but not quite the same.

Here’s how ChatGPT responded to a query about David Mayer

It turns out that David Mayer is no longer a forbidden name. According to ChatGPT-4o, Mayer is quite a common name, although it couldn’t specify the person. When asked about the earlier glitch, ChatGPT brushed it off, saying, “I see what you’re referring to! There may have been instances where glitches or errors occurred with formatting, spelling, or content generation, but nothing specifically tied to consistently misspelling the name ‘David Mayer.’” It also recommended reporting “odd and inconsistent” responses.

Other names (Brian Hood, Jonathan Turley, Jonathan Zittrain, David Faber, and Guido Scorza) caused the system to malfunction over the weekend and again on Tuesday when we checked.

OpenAI might be handling sensitive data differently, report says

The report notes that these names belong to public or semi-public figures, such as journalists, lawyers, or people who may have been involved in privacy or legal disputes with OpenAI. For instance, Brian Hood was reportedly misrepresented by ChatGPT in the past, leading to legal discussions with OpenAI.
Lawsuits have piled up against AI companies

TechCrunch speculates that OpenAI may be treating the listed names as sensitive or legally protected information, possibly to comply with privacy laws or legal agreements. However, a code malfunction might cause the chatbot to fail whenever information is sought on those names.

Over the years, several cases have been brought against AI companies for either generating incorrect information or breaching data privacy frameworks. In a 2020 case, Janecyk v. International Business Machines, photographer Tim Janecyk claimed that IBM improperly used photographers’ images for research purposes without consent. More recently, Google’s Gemini AI faced criticism for its image-generation capabilities, which led to its temporary suspension. In the PM v. OpenAI LP class action lawsuit filed in 2023, OpenAI was accused of using “stolen private information” without consent. In 2024, Indian news agency ANI reportedly filed a lawsuit against OpenAI for using the media company’s copyrighted material to train its large language model (LLM).

As AI tools become increasingly integrated into daily life, incidents like these underscore the importance of ethical and legal considerations in their development. Whether safeguarding privacy, ensuring accurate information, or avoiding malfunctions tied to sensitive data, companies like OpenAI face the task of building trust while refining their technology. These challenges remind us that even the most advanced AI systems require ongoing vigilance to address technical, ethical, and legal complexities.
