Several ISPs and mobile operators have signed up to a voluntary code of practice to provide comparable information about how they manage their web traffic.
The move is an attempt to fend off an enforced regulatory framework by communications watchdog Ofcom.
One of the ISPs to sign up to the agreement was Virgin Media, which said it supports the move and that it is a step in the right direction.
"We fully support, and have been actively engaged in, the development of the Broadband Stakeholder Group's Code of Practice and are pleased that other ISPs and operators will now provide consistent standards of openness on traffic management," said a Virgin Media spokesperson.
This is the first time information will be provided in a common format to explain what traffic management techniques are being used.
The move will be welcomed by advocates of net neutrality, the principle that all internet traffic should be treated equally. Their chief concern is that current traffic management methods will lead to a two-tier structure, where ISPs strike deals with the likes of Google and the BBC to prioritise their traffic and guarantee a certain quality of service to their customers.
But is all this fuss surrounding net neutrality justified? Matthew Howett, analyst at Ovum, suggests that intervention from Ofcom is unnecessary and that industry players are well on the way to coming to an agreement among themselves.
"If we were to have prescribed rules from Ofcom regarding net neutrality and traffic management at this stage in the UK, it would be premature," said Howett.
"There is a good degree of competition among ISPs in this country and this should prevent the more serious forms of discrimination that have been hyped up, such as the outright blocking of web sites," he added.
"Let the industry work things through with this code, and if we start to see more serious forms of abuse, then Ofcom has the power to set out a regulatory framework.
"ISPs being transparent should be sufficient to stop the outright blocking of web sites."
Howett insists that the negative hype surrounding a two-tiered internet has spread from the US because of the lack of competition across the Atlantic. He suggests that lobbying has caused "unnecessary confusion and worry".
"We have not seen the abuse of our system yet, so why would you fix what is not broken?" he asked.
However, that is not the opinion of digital campaign body the Open Rights Group, which suggests that more needs to be done if the government wants to restrict anti-competitive behaviour.
"Transparency alone will not protect customers or innovators. We need to see a clear commitment to dealing with anti-competitive behaviour," said Jim Killock, executive director of the Open Rights Group.
"Transparency is a weak mechanism, and the UK economy will suffer," he added.
The main concern surrounding traffic management is that ISPs will strike lucrative deals with content providers.
However, Howett suggests this may not be a bad thing: if the government wants ISPs to roll out state-of-the-art broadband networks, it should support such deals as an innovative way for ISPs to raise extra funding.
"Let's see how these models play out," said Howett.
"If the government is trying to incentivise ISPs to roll out expensive networks, they need to get a return on investment, and playing around with charging models is a way of doing that," he added.
"It is foolish to suggest that the internet is still on an open basis or that it should stay that way."