We are looking for a Data Quality Analyst! Reach out if you are interested and feel free to refer friends/colleagues!
Type of Employment: Contract
Title: Data Quality Analyst
Term: 12 months contract with extension
Location: GTA – Remote
Job ID number: C1256
Brief Description of Duties:
Enterprise Data Operations and Technology builds and supports enterprise data and analytics tools, processes, and quality data assets, a critical foundation for the client's objective of transforming into a digital organization.
The Senior Data Quality Analyst service provider is a high-performing role that supports the quality management of the client's critical data across the data lifecycle. This individual will work within a network of business and technical data stewards, product owners, data engineers, and other data management professionals to ensure the integrity of the data, its fitness for use, and compliance with the Information Management policy.
- Provide data quality advisory, and design and deploy new data quality capabilities for digital transformation priority initiatives using Agile delivery methods.
- Profile source data to identify major data quality anomalies.
- Develop automated data quality processes that use analytics/ML to identify data anomalies and remediate DQ issues.
- Engage business and technical stewards and stakeholders to identify data quality requirements and rules, and design and implement data quality analysis and exception reporting across multiple source systems, such as data marts and legacy systems, as well as within the Microsoft Azure enterprise data platform.
- Develop data quality dashboards to report on data quality metrics and trends that can be presented to a broad stakeholder audience for actioning.
- Using existing data assets and metadata, identify data requirements and sources, support and investigate root causes of data integrity issues, and recommend remediation approaches and controls to business and technical stewards.
- Support the implementation of a centralized metadata management solution for cataloging enterprise data assets and for discovering and harvesting business and technical metadata.
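To give candidates a concrete sense of the rule-based profiling and exception reporting described above, here is a minimal sketch in plain Python. The field names, rules, and thresholds are hypothetical; in practice this work would run at scale in Spark/Databricks rather than on in-memory records.

```python
# Minimal sketch of rule-based data quality profiling.
# Assumes records arrive as dicts; fields and rules are hypothetical.

def profile_null_rates(records, fields):
    """Return the fraction of missing (None or empty) values per field."""
    total = len(records)
    return {
        f: sum(1 for r in records if r.get(f) in (None, "")) / total
        for f in fields
    }

def find_exceptions(records, rules):
    """Return records that violate any rule (field -> predicate)."""
    return [
        r for r in records
        if any(not check(r.get(f)) for f, check in rules.items())
    ]

records = [
    {"id": 1, "amount": 120.0, "region": "GTA"},
    {"id": 2, "amount": -5.0,  "region": ""},
    {"id": 3, "amount": None,  "region": "GTA"},
]

rules = {
    "amount": lambda v: v is not None and v >= 0,  # no negatives or nulls
    "region": lambda v: bool(v),                   # must be populated
}

print(profile_null_rates(records, ["amount", "region"]))
print(find_exceptions(records, rules))
```

In a production setting the same rule predicates would typically be expressed as Spark column expressions or SQL checks, with the exception records landed in a reporting table for dashboards.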
Top Skills Required:
- Expertise designing and implementing automated data quality solutions using Spark, Python, and Scala.
- Experience designing and developing automated solutions to detect data anomalies in a Databricks environment.
- Expertise with data and analytics tools such as data profiling and quality tools, data asset catalogs, ETL tools, Power BI, Denodo, Databricks, or the Azure Data Platform.
- Strong understanding of data flow, lineage, data transformation, and business rules as they relate to root cause analysis and remediation of data quality issues.
- Expertise writing SQL queries to support data extraction, analysis, and exception reporting.
- Experience with database design principles, data lake or other cloud platforms, ETL, Data Warehouse, and Power BI.
Other Skills Required:
- Undergraduate Degree in Management Information Systems, Finance, Business Administration, Computer Science, Computer Engineering, or other relevant disciplines.
- Minimum 5 years of related experience in areas such as data quality management, data analysis, ETL procedures, data integration or data migration, data warehouse, metadata discovery and analysis, or data governance.
- Experience in analytics, software engineering, data architecture, or a similar field.
- Experience with Databricks, Information Analyzer, Informatica, Collibra, or other similar tools.
- Bilingual in both official languages (English and French).
- Agile training.
Job Category: Data Quality Analyst