
Sr. Data Engineer - 54784

Toronto, ON
  • Number of positions available: 1

  • To be discussed
  • Contract job

  • Starting date: 1 position to fill as soon as possible

Business Unit: Protect, Fraud Product

Duration: Oct 2025

Extension: possible, but cannot be guaranteed


Interview Process: 1 technical interview, in person (1 hour).

Work Location: 160 Front Street West

Hybrid: 2 days in office; Monday is the anchor day, the second day is flexible (usually Wednesday)


CANDIDATE PROFILE DETAILS:

Degree/Certifications Required: no

Years of experience: 8 to 10 years

Reason for request/why opened: new project

% Interaction with Stakeholders: no

Project Scope: rolling out a new technology within TD - a graph database (TigerGraph).

Team Size: 8 to 10

Personality Style/Team Culture: friendly, collaborative, results-oriented


Selling Points of Position: opportunity to learn more about graph databases and the current technologies being used; the project is important to the Bank (high visibility).

Best vs. Average Candidate: someone who has worked in a similar role, with data engineering experience handling large volumes of data.


SUMMARY OF DAY TO DAY RESPONSIBILITIES:

Reporting to the Manager, the Data Engineer (DE) supports Fraud data migration and Business Intelligence projects by designing and building data management and analytics solutions, ensuring that high-quality, trusted data is available and accessible to the Fraud Performance Management team to make informed business decisions on our registrants. This includes developing reporting and analytics systems and self-serve reports.

The DE is responsible for working with data and product owners and various stakeholders from the business, IT, and external teams to ensure successful delivery of data projects.

Scope and Complexity

The DE is responsible for developing and maintaining data pipelines on the Azure data platform and for designing and managing data repositories, lakes, and data warehouses. The DE supports the data governance and data quality strategies at the operational level, maintaining the various artifacts on the data assets, including the data dictionary, lineage, ownership, and business rules.

The DE works closely with the other data engineers and with data analysts (DA), business analysts (BA), product owners, IT, and business SMEs. Specifically, for the data migration project, the DE designs the source-to-target mapping, extracts the data from the current systems, and transforms and loads the data into the new cloud-based solution. The DE also works closely with the Application and QA teams to validate the data, ensuring quality and alignment with the data architecture.
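For illustration only (not part of the posting): a minimal PySpark sketch of what one source-to-target mapping step in such a migration might look like, assuming a Databricks/Delta Lake target. The storage paths, table names, and columns below are hypothetical.

```python
# Hypothetical source-to-target mapping step for a fraud data migration.
# All paths, table names, and columns are illustrative, not from the posting.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("fraud-data-migration").getOrCreate()

# Extract: read the legacy system's export from ADLS.
source_df = spark.read.parquet(
    "abfss://legacy@examplestorage.dfs.core.windows.net/fraud/cases/"
)

# Transform: apply the documented source-to-target mapping
# (renames, type casts, derived columns).
target_df = (
    source_df
    .withColumnRenamed("cust_id", "customer_id")
    .withColumn("case_opened_ts", F.to_timestamp("case_opened", "yyyy-MM-dd HH:mm:ss"))
    .withColumn("amount_cad", F.col("amount").cast("decimal(18,2)"))
    .select("customer_id", "case_opened_ts", "amount_cad")
)

# Load: write to the new cloud-based target as a Delta table.
target_df.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplestorage.dfs.core.windows.net/fraud/cases/"
)
```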

Duties/Accountabilities

• Work together with other data engineers, data analysts, business analysts, business SMEs, records analysts, and privacy analysts to understand data needs and create effective, secure data workflows.

• Design, build, and maintain secure and compliant data processing pipelines using various Microsoft Azure data services and frameworks, including but not limited to Azure Databricks, Data Factory, ADLS Storage, and PySpark.

• Build databases, data marts, or data warehouses, and perform data migration work.

• Build reporting and analytical tools that utilize the data pipeline and provide actionable insight into key business performance metrics.

• Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Microsoft Azure Cloud.

• Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and/or Azure Blob Storage.

• Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations.

• Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data (see the illustrative sketch after this list).

• Improve the scalability, efficiency, and cost-effectiveness of data pipelines.

• Monitor and resolve data pipeline problems to ensure consistency and availability of the data.

• Identify, design, and implement internal process improvements including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes.

• Adapt and learn new technologies per business requirements.

• Ensure compliance with data governance, privacy and security policies.

• Foster and maintain an organizational culture that promotes equity, diversity and inclusion, mutual respect, teamwork, and service excellence.
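As a hedged illustration of the validation-and-cleansing duty above (again, not part of the posting), a small PySpark helper like the one below could run basic data quality checks before data is loaded. The check names, columns, and fail-fast policy are assumptions for the sketch.

```python
# Illustrative data quality checks in PySpark; column names and rules are
# hypothetical examples, not requirements from this posting.
from pyspark.sql import DataFrame, functions as F

def run_quality_checks(df: DataFrame) -> DataFrame:
    total = df.count()
    checks = {
        "null_customer_id": df.filter(F.col("customer_id").isNull()).count(),
        "negative_amount": df.filter(F.col("amount_cad") < 0).count(),
        "duplicate_keys": total - df.dropDuplicates(
            ["customer_id", "case_opened_ts"]
        ).count(),
    }
    failed = {name: count for name, count in checks.items() if count > 0}
    if failed:
        # Fail fast so questionable data never reaches the target tables.
        raise ValueError(f"Data quality checks failed: {failed}")
    return df
```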


Must haves:

• Experience building data pipelines, architectures, and datasets using Microsoft Azure technologies, including Spark.

• Experience with data migration projects within a Microsoft and Azure environment.

• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.

• Strong analytical skills related to working with unstructured datasets.

• A successful history of manipulating, processing, and extracting value from large, disconnected datasets.

• Ability to plan, prioritize and manage workload within a time sensitive environment.


Nice to have:

Experience with a similar project involving a graph database

Banking experience

