The value of data to the organisations that create it has come into renewed focus recently with the hype around big data. Companies are admonished to mine their information resources for new insights and business ideas, but are insights gleaned from a billion bad records actually valuable?
The value of these insights clearly depends on the quality of the data they are based on. Yet in a recent Computing survey of readers at large organisations, around one in four confessed to doing little or no active data quality control.
Poor-quality data can have many repercussions, from eroded productivity and slow business decision-making through to more serious consequences such as failed projects, reputational damage and failure to comply with legally enforced regulations.
But many organisations don't know where to start the process of improving data quality. With luck, this web seminar will give you a few pointers.