Researchers should be required to fully disclose data and evidence during the peer-review process for UK research, according to a report from the Science and Technology Committee.
This would better protect the integrity of UK research, the report said.
Peer review is the process by which experts review new research and establish whether or not it is suitable for release to a wider readership.
"Although it is not the role of peer review to police research integrity or to identify fraud or misconduct, it does, on occasion, identify suspicious cases," said Andrew Miller MP, chair of the Committee.
"While there is guidance in place for journal editors when ethical misconduct is suspected, we found the general oversight of research integrity in the UK to be unsatisfactory and complacent."
The Science and Technology Committee's investigation into the peer-review process followed allegations that scientists at the Climatic Research Unit (CRU) at the University of East Anglia had deliberately manipulated climate data to support their global warming claims.
In addition, several media reports claimed that CRU attempted to abuse the process of peer review to prevent the publication of research papers with conflicting opinions about climate change.
Investigations cleared the scientists involved of any fraud or misconduct. However, the report suggests that data should be fully disclosed throughout the peer review process in a bid to support decisions to publish research.
The peer-review report states the following: "The best way to ensure that test results are verified would be for scientists to register their detailed experimental protocols before starting their research and disclose full results and data when the research is done.
"Currently, results are often selectively reported, emphasising the most exciting among them, and outsiders frequently do not have access to the information they might need to replicate studies. Journals and funding agencies should strongly encourage full public availability of all data and analytical methods for each published paper."
However, despite this recommendation, the report also says that making data available and open isn't always economically feasible and could prove to be a challenge for research bodies.
Sir Mark Walport of the Wellcome Trust gave evidence for the report, arguing that there are "major costs" involved in making data available, and warned that the "costs of storing the data may in the future exceed the costs of generating it".
Dr Philip Campbell, from publishing group Nature, also contributed to the investigation and provided an example of the potential costs involved in making data, software and codes available:
"I was talking to a researcher the other day and he had been asked to make his code accessible. He had to go to the Department of Energy for a grant to make it so," he said.
"He was asking for $300,000, which was the cost of making that code completely accessible and usable by others. In that particular case the grant was not given. It is a big challenge in computer software and we need to do better than we are doing".
Dr Malcolm Read, executive secretary of JISC, which promotes digital technology in UK education and research, explained why research bodies may find it difficult to make code available.
"If you are talking about stuff running on so-called super-computers, you have to know quite a lot about the machine and the environment it is running on," said Read.
"It is very difficult to run some of those top-end computer applications, even if, of course, they are prepared to make their code available".