Meta is to debut a gigantic language model for AI research, in the hope of fighting toxicity and bias in these systems.
The Open Pretrained Transformer (OPT-175B) has 175 billion parameters, on par with commercial models like GPT-3. In the past, developers have used these types of systems to build fun...