Companies
17/02/2023

OpenAI, Backed By Microsoft, Will Let Users Modify ChatGPT




OpenAI, the company behind ChatGPT, announced on Thursday that it is developing an upgraded version of its popular chatbot that users will be able to customize, as it works to address concerns about bias in artificial intelligence.
 
The San Francisco-based startup said it has worked to mitigate political and other biases but also wants to accommodate more diverse views. Microsoft Corp. has funded the startup and is using its technology to power its latest products.
 
"This will mean allowing system outputs that other people (ourselves included) may strongly disagree with," it said in a blog post, offering customization as a way forward. Still, there will "always be some bounds on system behavior."
 
The technology behind ChatGPT, known as generative AI, has attracted enormous attention since the chatbot's release last November for producing answers that are strikingly convincing imitations of human writing.
 
The startup's announcement came the same week that some media outlets reported that results from Microsoft's new OpenAI-powered Bing search engine could be harmful and that the technology may not be ready for widespread use.
 
How to set boundaries for the emerging technology remains one of the main areas of focus for companies in the generative AI field. Microsoft said on Wednesday that user feedback was helping it improve Bing ahead of a wider rollout; for example, it learned that its AI chatbot can be "provoked" into giving responses it did not intend.
 
In the blog post, OpenAI said ChatGPT is first trained on large text datasets readily available online. In a subsequent step, humans review a smaller dataset and are given guidelines on how the model should respond in various circumstances.
 
For instance, if a user requests adult content, violent content, or hate speech, the human reviewers should instruct ChatGPT to respond with something like "I can't answer that."
 
In an excerpt from its reviewer guidelines for the software, the company advised that when a user asks about a contentious subject, reviewers should have ChatGPT respond by offering to describe the points of view held by various individuals and movements.
 
(Source: www.cnbc.com)

Christopher J. Mitchell