We are looking for a Senior Data Engineer! Reach out if you are interested and feel free to refer friends/colleagues!
Type of Employment: Contract
Title: Senior Data Engineer
Term: 12 months – 37.5 hours/week
Location: Hybrid – 1 to 2 times on site during the duration of the contract
Job ID number: C1130
Brief description of duties:
The Senior Data Engineer Service Provider will be responsible for leading the delivery of data on the Data and Analytics platform. These services are at the forefront of building out a data engineering practice using cloud-native technologies. The Senior Data Engineer Service Provider must have experience in leading, designing, implementing, and collaborating with stakeholders to achieve the best results for our clients.
- Proven experience designing, building, and implementing batch and real-time data pipelines, driven by automated, repeatable delivery of data that aligns with enterprise data governance standards.
- Experience developing and proposing data models that conform to requirements. Responsible for ensuring the proposed design optimally addresses access and query patterns and data consumption, and adheres to internal architecture standards.
- Experience collaborating with various stakeholders across the business, data science, and IT, working closely to build relationships and refine data requirements to meet various data and analytics initiatives and data consumption requirements.
Specific Project Requirements:
- Increase the overall speed at which data is onboarded to the Data and Analytics platform.
- Build robust data pipelines to enable larger data consumption on the Data and Analytics platform.
- Increase the overall quality of data pipeline development through DevSecOps.
- Programming experience in Spark using Scala and Python
- Experience working with modern data architectures such as Azure Data Lake Storage, Azure Synapse (formerly SQL Data Warehouse), Databricks, Delta Lake, Delta Sharing, Unity Catalog, and DLT pipelines
- Experience leading a Data Engineering practice within an organization/team.
Nice to haves:
- Knowledge of and expertise in database modeling techniques: Data Vault, Star, Snowflake, 3NF, etc.
- Experience building REST API integrations using Redis Cache, Denodo
- Experience working with streaming data architecture and real-time technologies: Spark Streaming, Event Hubs, Kafka, Flink, Storm
- Experience working with relational and non-relational database technologies: SQL Server, Oracle, Cassandra, MongoDB, CosmosDB, HBase
- Experience working with source code and configuration management environments such as Azure DevOps, Git, Maven, Nexus
- Experience within the Azure environment
- Strong Scala, Python, and Spark experience
- Experience modernizing data platforms.
- 2 to 3 projects developing Data Vault