We are looking for a Cloud Analytic Engineer! Reach out if you are interested, and feel free to refer friends and colleagues!
Type of Employment: Contract
Title: Cloud Analytic Engineer
Term: 6 months – 37.5 hours/week
Location: Mississauga – Hybrid, Wednesday and Thursday in the office
Job ID number: C1030
Brief description of duties:
As a member of the Enterprise Data Platform team, reporting to the EDP AI/ML Senior Manager, the Cloud Analytic Engineer and DevOps will play a leading role in the development of new products, capabilities, and standardized practices using cloud and data technologies. Working closely with our business partners, this person will be part of a team that advocates the use of advanced data technologies to solve business problems, and will act as a thought partner in the data space.
Data Analysis and visualization:
- Ability to own and lead your projects through all phases including identifying business requirements, technical design & implementation, and final delivery and refinement with business teams
- Elicit, analyze, and interpret business and data requirements to develop complete analytic solutions, including business process diagrams, data mapping, data models (entity relationship diagrams, dimensional data models), ETL and business rules, data life cycle management, governance, lineage, reporting, and dashboarding
- Facilitate data discovery workshops, downstream impact analyses and proactively manage stakeholder expectations
- Demonstrated analytical thought leadership
- Advanced expertise in SQL (BigQuery, Trino, Impala, or similar)
- Comfortable with data visualization and data strategies; experience with BI analytics tools such as Looker, MicroStrategy, Tableau, Kibana, or similar.
- Communicate analysis results with effective storytelling.
- Comfortable working in complex and constantly changing environments, with multidisciplinary teams.
- Able to effectively challenge the status quo and set standards and direction for solutions
- Well organized, able to multi-task and manage priorities.
Data Engineering:
- Proven experience building and deploying data analytic workflows in a big data environment (Hadoop, Google Cloud Platform, or similar)
- Design and develop ETL workflows based on business requirements, using multiple sources of data in various formats within the Hadoop platform or Google Cloud Platform
- Clean, manipulate and analyze large, complex data sets, spanning a wide variety of sources.
- Develop scalable and robust analytic solutions that can support the growing volumes of our data environments.
- Build data models, implement business rules, and engineer responsive and scalable data analytics pipelines.
- Design and implement component execution orchestration in Cloud Composer/ Cloud Data Fusion / Oozie / Airflow
- Promote code to different environments using GitLab CI/CD
- Produce well-documented, quality code
Requirements:
- Master's degree in Data Science and Analytics, Mathematics, Statistics, Computer Science, or a related field
- 5+ years of experience in an analytic engineering role, working in different data management disciplines including data integration, modelling, optimization, and quality
- 5+ years of experience coding in Python, SQL, and Scala
- 2+ years of experience with CI/CD deployment and code management platforms (GitHub), and building and coding applications using Hadoop components – HDFS, Hive, Impala, Sqoop, Kafka, HBase, etc.
- 2+ years of experience working in big data analytical environments/technologies (Hadoop, Hive, Spark), with a deep understanding of data mining and analytical techniques
- 1+ years of experience with cloud platforms such as Google Cloud Platform, AWS, Azure, or Databricks
- 1+ year of experience working with Google Cloud services, including Dataflow, Cloud Composer, Airflow, Cloud Run, and Pub/Sub
- 5+ years of experience building reports and visualizations in Tableau, MicroStrategy, Looker, Kibana, or equivalent
- 5+ years of experience with traditional data warehousing and ETL tools
- Comfortable with version control tools such as Git
Nice to have:
- Deep understanding of techniques used in creating and serving schemas at the time of data consumption
- Experience using AI, ML, and other big-data techniques to provide a competitive edge to the business
- Experience with advanced cloud data technologies in the ML space (Vertex AI, BigQuery ML, etc.)
- Experience in AWS/Azure data platforms
- Past experience using Maven, Git, Jenkins, Se, Ansible, or other CI tools is a plus
- Experience in Exadata and other RDBMS is a plus.
- Knowledge of predictive analytics techniques (e.g., predictive modeling, statistical programming, machine learning, data mining, data visualization)
- Familiarity with different development methodologies (e.g., waterfall, agile, XP, scrum)
- Strong interpersonal and communication skills, including written and verbal communication and technical illustrations
- Experience working with multiple clients and projects at a time
Job Category: Cloud Engineer

| Required Experience | Years |
| --- | --- |
| Analytic engineering role, working in different data management | 5+ |
| Python, SQL, Scala | 5+ |
| CI/CD deployment, code management platform GitHub, building and coding applications using Hadoop components | 2+ |
| Google Cloud services, including Dataflow, Cloud Composer, Airflow, Cloud Run, Pub/Sub | 1+ |
| Reports and visualizations | 5+ |
| Data warehousing and ETL tools | 5+ |