Data Engineer

Remote / Toronto, Ontario, Canada - Permanent


Job Description

We are looking for a Data Engineer to join the Data Warehouse team. Are you passionate about solving data integration challenges and building reliable, low-maintenance data pipelines with modern technologies? Then this role might be for you. Our data team is responsible for maintaining and automating our data platform (Oracle and PostgreSQL), designing and validating data models for business service provisioning, implementing SQL queries, and developing data delivery services, integration solutions, and reports with Python.

We are looking for a specialist who is confident in both data platform management and data services development. You will play an important role in keeping our database platform secure and stable while cooperating closely with other RSI teams. Your main focus will be maintaining and configuring our current database platform and helping to improve it further. You will also be involved in developing reports, integration solutions, and data delivery services.

There is plenty of work and no shortage of challenges as we keep growing and entering new markets.

Our current DW data stack: AWS, Snowflake, Airflow, DBT, Looker
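
For a flavor of the day-to-day work, here is a minimal sketch of the kind of Airflow + DBT pipeline this stack implies. The DAG id, schedule, and paths below are illustrative assumptions, not our actual pipelines:

# Minimal illustrative sketch, not a real pipeline: a daily Airflow DAG that
# stages data and then rebuilds DBT models. Assumes Airflow 2.x; the DAG id,
# schedule, and file paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dw_refresh",            # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Stage raw source data into the warehouse (placeholder command).
    extract = BashOperator(
        task_id="extract_to_warehouse",
        bash_command="python /opt/pipelines/extract.py",  # hypothetical script
    )

    # Rebuild DBT models on top of the staged data.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",  # hypothetical path
    )

    extract >> transform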


What you’ll do:

Maintain and configure our on-premises and cloud databases.
Own the Data Model and ensure clean data(base) design.
Own the Data Layer; help the development teams design and implement data(base)-related solutions based on business requirements.
Cooperate closely with development teams and Operations.
Implement and maintain data security, auditing, backup, and recovery policies.
Perform data(base) tuning and performance monitoring.
Implement queries used as a source for business analysis or for publishing in Looker.
Develop integration services to import data from different sources or deliver data to other systems.
Automate report generation and delivery.


Must Have Skills:

3+ years of experience as a data engineer using Python.
3+ years of experience with Airflow.
Familiarity with distributed computing on a cloud platform such as AWS or GCP (Docker or Kubernetes).
Experience with big data technologies and/or in-memory or column-oriented databases (Hadoop/Hive/Spark, Vertica, Teradata, SAP HANA, Snowflake, BigQuery, Redshift, etc.).
Knows what it takes to produce high-quality, performant, maintainable, and scalable data pipelines.
Broad skills in ELT tools and data integration technologies (Airflow, DBT, Python, S3, REST APIs).
A degree in Computer Science or equivalent higher education.
A positive, solution-oriented mindset and willingness to try and learn new technologies.
Excellent knowledge of the English language.


Details:

Starting: ASAP






