time. The result is the need for de-duplication: determining which records refer to the same entity so that duplicates can be eliminated. In fact, this category (matching and de-duplication) is closely related to identification and searching, and subsequent Giga research sometimes refers to it as matching and searching.

“Data profiling/metadata/analytics” is the description of the structure, data elements and content of data stores, including the validity and semantics of the content in question. This includes information engineering and reengineering. Profiling defined codes, statuses and permissible business values against the actual content of data stores is especially useful if the results are captured in a central metadata repository for leveraged reuse. Data quality is relative to allowable values, that is, to data standards. The idea of a data dictionary is not new. What is new is the possibility of capturing the data in an automated way to a local metadata repository as the data is mapped from the transactional system of record to the decision-support data store.

“Standardization/scrubbing/correction/parsing” is the modification and enhancement of the quality and content of data against a specified set of rules, canons or standards that indicates the proper structure and semantic content of the data. At least one data quality vendor, SSA, makes a strong case that standardization is unrelated to data quality. Even so, SSA acknowledges that standardization is necessary in order to qualify for discounts on direct mail through the national postal service.

“Data augmentation” is the correlating of demographic information with basic customer or product data. The large credit reporting firms and data aggregators specialize in this area. These are not software products, and the firms’ infrastructures may also be involved in the delivery of the content. As discussed in previous research, they include Acxiom, Equifax, Experian, CACI, Claritas, Harte-Hanks, Polk and TransUnion (see Planning Assumption, Market Overview: Data Quality, Lou Agosta).

When data quality services, such as those specified above, are offered as part of a centralized service to a variety of clients from a single source, a service bureau model is being invoked. An application service provider (ASP) is an example of a modern approach to a service bureau. Data quality vendors that are trying to generate revenues using the service bureau or ASP model include Firstlogic (eDataQualityService ASP), Harte-Hanks (in a variety of service forms) and Group 1 Software (dataQuality.net).

To make the category definitions above concrete, minimal illustrative sketches of matching, profiling and standardization follow the research list below.

For additional research regarding this topic, see the following:

• Planning Assumption, Data Quality Methodologies and Technologies, Lou Agosta
• Planning Assumption, Vendor ScoreCard: Data Quality, Part 1, Lou Agosta
• Planning Assumption, Vendor ScoreCard: Data Quality, Part 2, Lou Agosta
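First, matching and de-duplication. The sketch below is a minimal illustration only, assuming a hypothetical two-field customer record and an arbitrary similarity threshold; commercial matching engines apply far more sophisticated phonetic and probabilistic techniques than this simple string comparison.

```python
# Minimal matching/de-duplication sketch (hypothetical record layout).
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude string similarity on normalized (lowercased, trimmed) values."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_duplicates(records, threshold=0.75):
    """Flag record pairs whose combined name+address similarity
    meets an (arbitrary) threshold."""
    matches = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = similarity(records[i]["name"] + records[i]["address"],
                               records[j]["name"] + records[j]["address"])
            if score >= threshold:
                matches.append((i, j, round(score, 2)))
    return matches

customers = [
    {"name": "Robert Smith",  "address": "12 Main St."},
    {"name": "Bob Smith",     "address": "12 Main Street"},
    {"name": "Roberta Smyth", "address": "98 Oak Ave."},
]
print(find_duplicates(customers))  # [(0, 1, 0.78)]: records 0 and 1 match
```

Note that the pairwise comparison is quadratic in the number of records; production matching tools reduce the candidate set first (for example, by blocking or windowing) before comparing records.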
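Second, data profiling against permissible values. The `status` column and its permissible codes below are invented for illustration; the point is the comparison of actual content with a defined data standard, with the violations as candidates for capture in the metadata repository.

```python
# Minimal data-profiling sketch: actual column content vs. permissible
# codes (the column name and code set are hypothetical).
from collections import Counter

PERMISSIBLE = {"status": {"A", "I", "P"}}  # active, inactive, pending

def profile_column(rows, column):
    """Return the frequency distribution of a column and any values
    that violate the permissible code set."""
    counts = Counter(row[column] for row in rows)
    allowed = PERMISSIBLE.get(column, set())
    violations = {value: n for value, n in counts.items() if value not in allowed}
    return counts, violations

rows = [{"status": "A"}, {"status": "A"}, {"status": "X"}, {"status": "I"}]
counts, violations = profile_column(rows, "status")
print(counts)      # Counter({'A': 2, 'X': 1, 'I': 1})
print(violations)  # {'X': 1} -- 'X' is not a permissible status code
```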
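Third, standardization and parsing. The rule table and address layout below are toy assumptions; real postal standardization (for example, software certified under the USPS CASS program) relies on large reference tables and much richer parsing logic.

```python
# Minimal standardization/parsing sketch: split a free-form address
# line into components and standardize the street suffix against a
# small (hypothetical) rule table.
import re

SUFFIX_STANDARDS = {"street": "ST", "st": "ST", "avenue": "AVE", "ave": "AVE"}

def standardize_address(line: str):
    """Parse '<number> <name> <suffix>' and standardize the suffix."""
    match = re.match(r"(\d+)\s+(.+?)\s+(\w+)\.?$", line.strip())
    if not match:
        return None  # unparseable lines would go to an exception queue
    number, name, suffix = match.groups()
    return {
        "number": number,
        "name": name.upper(),
        "suffix": SUFFIX_STANDARDS.get(suffix.lower(), suffix.upper()),
    }

print(standardize_address("12 main Street"))
# {'number': '12', 'name': 'MAIN', 'suffix': 'ST'}
```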