Databricks announces 'Dolly' - an open-source ChatGPT rival

It is aimed at democratising large language models

3 min read

Big-data analytics firm Databricks has released an open-source language model called Dolly, which it claims can replicate ChatGPT's abilities without expensive hardware or large training datasets.

The firm, founded by the creators of Apache Spark, says the release of Dolly is aimed at democratising large language models (LLMs). It should help smaller firms build their own generative AI mode...

