- A related university degree (e.g., Computer Science, Business), supplemented by at least four (4) years of related experience. Equivalencies may be considered.
- Relevant industry experience working with data tools and data platforms.
- Experience building and optimizing data extraction, transformation, and loading (ETL) pipelines.
Assets:
- Familiarity with ETL tools such as SSIS, Azure Data Factory, Fivetran, dbt, etc.
- Familiarity with data tools and platforms such as PostgreSQL, MySQL, Snowflake, Hadoop, Databricks, Azure Synapse, etc.
- Familiarity with DevOps practices and CI/CD pipelines.
- Familiarity with programming languages such as Python and R.
- Proficiency in SQL and experience in data modeling.
- Understanding of container technologies such as Docker and container orchestration with Kubernetes.
- Experience with version control systems such as Git to manage codebases and collaborate with team members.
- Experience working with business intelligence tools such as Power BI, Tableau, SAS, etc.