'Open source must be part of AI future', says Lords committee

Open source is 'a key part of what UK businesses need to compete and thrive' says Baroness Stowell

The government must not take sides between open and closed source tech, says Baroness Stowell, chair of the Lords Communications and Digital Select Committee, which is responsible for scrutinising the government's legislation around technology and AI.

Last week, the Committee released a long-awaited report on AI and LLMs. Speaking at OpenUK's State of Open Conference on Wednesday, the Conservative peer emphasised the positive opportunities that the "epoch-defining" advent of generative AI can offer the world, and the UK more specifically.

"We the Committee are fundamentally optimistic about this new technology and large language models," she said. "We know it could bring huge economic rewards and drive groundbreaking scientific insights, but the benefits, I'm afraid, are often getting lost amidst all of the talk of doom and gloom."

Stowell insisted that the UK is well-placed to be a significant driver of responsible AI if it grasps the opportunities. "The UK can be a huge force for good and lead global conversations about responsible tech policy. But people will only listen to us if we have a thriving commercial sector that shows we can practice what we preach."


The Committee chair called for a focus on supporting commercial opportunities, academic research and spin-offs. Among other things, this means avoiding a preference for closed or open source (which the report refers to as "open access" owing to quibbles around definitions).

"Open source or open access must be part of the future, and we must have open markets," Stowell insisted. "Open source technology is a key part of what UK businesses need to compete and thrive in this fast-growing market."

She continued: "Helping UK SMEs experiment with open source technology must be a part of this, whether that's through better access to world-leading computing facilities, or scaling up accelerator programmes for startups, or providing guidance and opportunities to innovate with AI responsibly."

She added that regulation must be agile to avoid inadvertently burdening smaller businesses.

Avoiding regulatory capture

Another danger area for SMEs is regulatory capture by large companies. Stowell pointed to the quasi-monopolies of search engines and cloud platforms, and insisted that the government is cognisant of the risk of similar consolidations happening with AI.

"We have raised concerns about the real and growing risk of regulatory capture as well as the prospect of large tech firms moving on to entrench their existing advantages and stifling competitors from entering this enormous growing market," she said.

Safety second

The Committee report calls for a refocusing away from fears about existential risk and towards opportunity. Stowell said that "catastrophic risks do not appear likely" over the next three years.

But while boosting innovation is taking precedence over security for now, the report nevertheless seeks additional safeguards, including mandatory safety tests for some types of "highly capable systems". The government should also scale up existing security protections and develop formal accredited standards and auditing practices, she said.

"We don't need more mounds of red tape, but we do need clarity about what 'good' looks like to help ensure we steer this technology in the right direction from the outset. More support for regulators from the government is vital as well."

Protecting IP

With lawsuits mounting against AI companies accused of feeding their models with copyrighted materials, Stowell said the government should stand firm on protecting the rights of creators.

"Plenty of tech firms have been playing fast and loose with copyright protection, and we're clear that that's not fair. Innovation doesn't have to come at the expense of things like copyright and paying creators for their works. Everyone else has had to play by the rules before now. We see no reason why that should suddenly change."

As a proactive approach, Stowell suggested the government should invest in "large, high-quality datasets for LLM training" to incentivise all players to use licensed content rather than scraping it from the web. But she hinted that copyright is one area where the government is currently falling short: "There's still work to be done," she said.

She concluded her address with an appeal for open source technologists to get involved in helping to steer policy around AI.

"We want this technology to be something that empowers people and allows them to be much more able to take control of themselves, and to play a part and have much more rewarding roles in their own workplaces or in their own communities. That is something that I would urge you as a community to promote as a very powerful positive message for the value of what it is that you are doing."