Data quality – Overview

MIT has a Total Data Quality Management program, led by Professor Richard Wang, which produces a large number of publications and hosts a significant international conference in this field (International Conference on Information Quality, ICIQ). This program grew out of the work done by Hansen on the "Zero Defect Data" framework (Hansen, 1991).

Marketing operations – Data

An entire category of software companies facilitates data quality management. Gartner lists DataFlux, IBM, Trillium Software, Informatica, and SAP Business Objects as leading data quality tool providers. Other related technology categories include ‘Data Integration Tools’ and ‘Customer Data Integration’. There is also a set of companies that sell data or provide services to clean and enhance data. These include Dun & Bradstreet, Experian, Equifax, MarketWatch, and InfoUSA.

Integration Competency Center – Overview

* Promote enterprise integration as a formal discipline. For example, data integration will include data warehousing, data migration, data quality management, data integration for service-oriented architecture deployments, and data synchronization. Similarly, system integration will include common messaging services, business service virtualization, etc.

Hyperion Solutions – Timeline

* 2006 – Hyperion acquires UpStream (Financial Data Quality Management)

Hyperion Solutions – Products

* Hyperion Financial Data Quality Management (also referred to as FDM)

Stuart Madnick – Career

His current research interests include connectivity among disparate distributed information systems, database technology, software project management, and the strategic use of information technology. He is presently co-Director of the PROductivity From Information Technology (PROFIT) Initiative and co-heads the Total Data Quality Management (TDQM) research program.

Health information management – Methods to ensure Data Quality

The accuracy of data depends on the design of the manual or computer information system for collecting, recording, storing, processing, accessing, and displaying data, as well as on the ability and follow-through of the people involved in each phase of these activities. Everyone involved with documenting or using health information is responsible for its quality. According to AHIMA's Data Quality Management Model, there are four key processes for data: application, collection, warehousing, and analysis.
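As a rough, hedged illustration only (not part of AHIMA's model, and with field names, rules, and the example record all assumed for the example), a record-level check of the kind that supports the collection and analysis processes might look like this in Python:

# Minimal sketch: completeness and validity checks on a hypothetical
# patient record. All field names and rules are illustrative assumptions.
from datetime import date

REQUIRED_FIELDS = ["patient_id", "birth_date", "diagnosis_code"]

def check_record(record: dict) -> list[str]:
    """Return a list of data quality problems found in one record."""
    problems = []
    # Completeness: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            problems.append("missing value for " + field)
    # Validity: a birth date in the future cannot be accurate.
    birth = record.get("birth_date")
    if isinstance(birth, date) and birth > date.today():
        problems.append("birth_date lies in the future")
    return problems

print(check_record({"patient_id": "P001", "birth_date": date(1980, 5, 1)}))
# -> ['missing value for diagnosis_code']

In practice such rules would be agreed with the people who document and use the data, since, as noted above, everyone involved is responsible for its quality.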

Oracle Hyperion – Products

* Hyperion Financial Data Quality Management (also referred to as FDM EE)

Data Management – Corporate Data Quality Management

Corporate Data Quality Management (CDQM) is, according to the European Foundation for Quality Management (EFQM) and the Competence Center Corporate Data Quality (CC CDQ, University of St. Gallen), the whole set of activities, both reactive and preventive, intended to improve corporate data quality. The main premise of CDQM is the business relevance of high-quality corporate data. CDQM comprises the following activity areas (EFQM; IWI-HSG: EFQM Framework for Corporate Data Quality Management. Brussels: EFQM Press, 2011. https://benchmarking.iwi.unisg.ch/Framework_for_CDQM.pdf):

* ‘Applications for Corporate Data Quality’: Software applications support the activities of Corporate Data Quality Management. Their use must be planned, monitored, managed and continuously improved.
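As a hedged sketch of what such monitoring can look like in practice (the metric, field names, and the 95% target below are illustrative assumptions, not part of the EFQM/CC CDQ framework), a CDQM application might periodically compute a completeness metric over a customer table and flag it when it falls below an agreed target:

# Minimal sketch: a completeness metric that a data quality application
# might track over time. Fields and threshold are illustrative assumptions.
def completeness(records: list[dict], field: str) -> float:
    """Share of records in which `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

customers = [
    {"id": 1, "duns_number": "123456789"},
    {"id": 2, "duns_number": ""},
    {"id": 3},
]

score = completeness(customers, "duns_number")
if score < 0.95:  # assumed target agreed with the data owners
    print("duns_number completeness below target: {:.0%}".format(score))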

Data Management – Topics in Data Management

# Data Quality Management
