Our DATA QUALITY ANALYSIS | ASSESSMENT | MONITORING solutions leverage five platforms: the Data Quality Assessment Manager (DQAM) | Clean Cloud | the Data Quality Rules Manager (DQRM) | DQAM Sensitive Data Discovery (DQAM SDD) | Data Certification (DC). Please take a moment to review the information on this page, which describes each of these software solutions:
THE DATA QUALITY ASSESSMENT MANAGER - (DQAM)
- The Data Quality Assessment Manager (DQAM) in the Cloud guides a user through a comprehensive value-level assessment of a given data asset, enabling an organization to scale its data quality activities appropriately. The DQAM also equips the data quality analyst to competently apply advanced methods in the field.
- Moreover, by supporting collaborative analysis and the sharing of findings, both internally and with the broader organizational data community, the DQAM promotes a team dynamic among the organization's data quality analysts.
- Through these two capabilities, the DQAM accomplishes something that the largest and most expensive tools on the market have yet to achieve: a normalized, empirical method for identifying data quality anomalies that makes the process visible and collaborative at team and enterprise scale.
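- The value-level approach can be pictured with a short sketch. The Python fragment below is purely illustrative; the attribute names, rules, and test logic are assumptions, not DQAM internals. It shows what a normalized, empirical value-level check looks like: every value in an attribute is counted as missing, valid, or invalid against an explicit rule, so two analysts running the same assessment arrive at the same numbers.

```python
import re

# Illustrative only: a normalized value-level check, not DQAM's actual engine.
# Each rule maps an attribute to a validity test; results are empirical counts.
RULES = {
    "zip_code": lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v)),
    "email":    lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v)),
}

def assess(rows):
    """Return per-attribute counts of missing, valid, and invalid values."""
    stats = {col: {"missing": 0, "valid": 0, "invalid": 0} for col in RULES}
    for row in rows:
        for col, test in RULES.items():
            value = row.get(col)
            if value in (None, ""):
                stats[col]["missing"] += 1
            elif test(str(value)):
                stats[col]["valid"] += 1
            else:
                stats[col]["invalid"] += 1
    return stats

print(assess([{"zip_code": "30309", "email": "ann@example.com"},
              {"zip_code": "303", "email": ""}]))
```

(Contact Us for more information).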
-
CLEAN CLOUD
- Clean Cloud is a revolutionary data quality analysis automation solution. The software solution is driven by a powerful data classification engine that utilizes machine learning algorithms to automatically identify and resolve data quality issues.
- This eliminates the tedious, error-prone manual analysis required by traditional data quality products and increases the accuracy and consistency of the data quality results. The software can identify and resolve all value-level data quality problems comprehensively with the push of a button, and it provides applications to analyze and clean entire databases or AWS S3 buckets just as easily. The software never changes the source data; instead, the user selects a target to store the clean data. The target may be the same as the source or one of the other supported targets.
- For example, if the source is a relational database and the target is an AWS S3 bucket, all the tables in the database are analyzed and the cleansed data is stored in the bucket as CSV files. The reverse is supported as well: CSV files in an AWS S3 bucket are analyzed and the cleansed data is stored in targeted database tables.
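- As a rough picture of that flow (the connection string, bucket name, and clean_frame() cleansing step below are hypothetical stand-ins, not Clean Cloud's API), a database-to-S3 run might look like this:

```python
import boto3
import pandas as pd
from sqlalchemy import create_engine, inspect

# Illustrative sketch of the database-to-S3 flow described above; the
# clean_frame() step is a placeholder for the real cleansing engine.
engine = create_engine("postgresql://user:pass@host/sales")  # hypothetical source
s3 = boto3.client("s3")

def clean_frame(df: pd.DataFrame) -> pd.DataFrame:
    # Placeholder cleansing: trim whitespace and drop exact duplicates.
    return df.apply(
        lambda c: c.str.strip() if c.dtype == "object" else c
    ).drop_duplicates()

for table in inspect(engine).get_table_names():
    df = pd.read_sql_table(table, engine)   # the source data is never modified
    clean_frame(df).to_csv(f"/tmp/{table}.csv", index=False)
    s3.upload_file(f"/tmp/{table}.csv", "clean-bucket", f"{table}.csv")  # hypothetical target
```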
- This makes Clean Cloud ideal for integrating clean data into cloud assets or for cleaning data lakes and other data assets already in the cloud. Clean Cloud provides analytical tools to review and validate the results of the data quality applications. Rather than performing the data quality analysis manually, the analyst validates that the results produced by the classification engine are correct. Erroneous results are fed back to the machine learning algorithms, which automatically adjust the data constraints for the data classes. This automation removes the tedious process of configuring or coding data rules manually.
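- That feedback loop can be sketched in a few lines. Everything here is an assumption made for illustration; the constraint model and function names are not Clean Cloud's API. It simply shows how a flagged misclassification can automatically widen a data class's constraints:

```python
# Illustrative feedback loop: constraints on a data class are relaxed to
# admit values the analyst marked as wrongly rejected. Not Clean Cloud's API.
class DataClass:
    def __init__(self, name: str, min_len: int, max_len: int):
        self.name, self.min_len, self.max_len = name, min_len, max_len

    def accepts(self, value: str) -> bool:
        return self.min_len <= len(value) <= self.max_len

def apply_feedback(dc: DataClass, false_rejects: list[str]) -> None:
    """Widen length constraints so previously rejected valid values pass."""
    for v in false_rejects:
        dc.min_len = min(dc.min_len, len(v))
        dc.max_len = max(dc.max_len, len(v))

phone = DataClass("phone", min_len=10, max_len=10)
apply_feedback(phone, ["1-404-555-0100"])   # analyst flags a false reject
print(phone.accepts("1-404-555-0100"))      # True after the adjustment
```

(Contact Us for more information).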
-
DATA QUALITY RULES MANAGER - (DQRM)
- The Data Quality Rules Manager (DQRM) is used to create remediation and transformation rules. The software depends on the rule specifications developed in the DQAM and has the same look and feel as the DQAM. Its purpose is to let the analyst automatically generate rules from the specifications identified during the assessment process in the DQAM.
- The workflow needed to create the rules is defined using chevrons, very much like the DQAM. Provided the rule specifications were created properly in the DQAM, the remediation and transformation rules are automatically generated as SQL code. Following the workflow, the analyst can test and validate the SQL code through the interface to ensure it works properly.
- After successful testing, scripts are generated for use in ETL or handed to a database administrator for implementation. The rules manager ensures that the data is remediated in the source system or transformed properly for use in target systems.
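- As a rough illustration of specification-driven rule generation (the specification format and the generated statement are assumptions, not actual DQRM output), a simple remediation rule might be produced like this:

```python
# Illustrative only: turning a simple rule specification into remediation SQL.
# The spec format and generated statement are assumptions, not DQRM output.
spec = {
    "table": "customer",
    "column": "state",
    "invalid_value": "GA.",
    "replacement": "GA",
}

def generate_remediation_sql(s: dict) -> str:
    """Render one specification as an UPDATE statement for review and testing."""
    return (
        f"UPDATE {s['table']} "
        f"SET {s['column']} = '{s['replacement']}' "
        f"WHERE {s['column']} = '{s['invalid_value']}';"
    )

print(generate_remediation_sql(spec))
# UPDATE customer SET state = 'GA' WHERE state = 'GA.';
```

(Contact Us for more information).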
-
DQAM SENSITIVE DATA DISCOVERY - (DQAM SDD)
- The DQAM SDD is an advanced scanning technology built on the same platform as the DQAM product, leveraging similar engines. The SDD allows the analyst to define business data elements with the following characteristics or metadata for an attribute: Names | Abbreviations | Words | Data types | Lengths | Patterns | Forms | Absolute patterns. In addition, the analyst can define what are called value patterns. Value patterns are unique to the DQAM technology and allow the analyst to create logic based on combinations of patterns and values.
- This lets the user define rules for items such as social security numbers, credit card numbers, phone numbers, or any other type of specific sensitive data. The analyst may also define values specific to a business data element. Once the business data elements are defined, the analyst specifies the data targets to be scanned and which business data elements to search for in the data. The analyst has full control over how much of the metadata to use during the scans.
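- A minimal sketch of this kind of scan, assuming simple regular-expression patterns for the business data elements (the patterns below reflect common public formats, not DQAM SDD's definitions):

```python
import re

# Illustrative business data elements for sensitive-data scanning. The
# patterns are common public formats, not DQAM SDD's actual definitions.
ELEMENTS = {
    "ssn":         re.compile(r"\d{3}-\d{2}-\d{4}"),
    "credit_card": re.compile(r"(?:\d{4}[- ]?){3}\d{4}"),
    "phone":       re.compile(r"\(?\d{3}\)?[- ]?\d{3}-\d{4}"),
}

def scan(record: dict) -> list[tuple[str, str]]:
    """Return (attribute, element) pairs for values matching an element."""
    hits = []
    for attr, value in record.items():
        for name, pattern in ELEMENTS.items():
            if pattern.fullmatch(str(value)):
                hits.append((attr, name))
    return hits

print(scan({"id": 7, "tax_id": "123-45-6789", "contact": "(404) 555-0100"}))
# [('tax_id', 'ssn'), ('contact', 'phone')]
```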
- This software is ideal for GDPR compliance and for discovering PII, PCI, and PHI data, and it supports many other use cases as well. (Contact Us for more information).
-
DATA CERTIFICATION - (DC)
- The Data Certification software solution delivers certified data that is “fit for use | fit for purpose”. This is achieved by having the data community review and certify the results of the Clean Cloud software solution. The Data Certification solution provides project management capabilities that allow data assets to be selected for certification. Specific attributes are then scheduled to be certified by the data community.
- The schedule identifies when the certification work is to be started and finished by the data community. Scheduling supports assigning specific community members to certify attributes, or driving the certification through the roles assigned to data community members. In either case, these capabilities provide complete transparency into the certification process and support managing the entire certification project in real time.
- The schedule guides data community members through the certification process. Members are presented with the data scheduled for certification, then review and certify it with the push of a button. The data community has complete authority over the entire certification process.
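- As a loose illustration of the data behind such a schedule (the field names, dates, and workflow below are assumptions, not the Data Certification product's schema), each attribute up for certification might be tracked like this:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative data model for a certification schedule; field names and
# workflow are assumptions, not the Data Certification product's schema.
@dataclass
class CertificationTask:
    asset: str
    attribute: str
    assignee: str            # a named member, or a role such as "steward"
    start: date
    finish: date
    certified: bool = False

schedule = [
    CertificationTask("sales.customer", "email", "data_steward",
                      date(2024, 6, 1), date(2024, 6, 15)),
]

def certify(task: CertificationTask) -> None:
    """The 'push of a button': the member marks the attribute certified."""
    task.certified = True

certify(schedule[0])
print(schedule[0])
```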
-
Our Data Certification software solution delivers data that is certified as “fit for use | fit for purpose” by the data community and provides the documentation | collateral to prove it. (Contact Us for more information).