- Degree in Computer Science or equivalent professional experience
- 10 years of experience in data modeling, database design, and data management
- 10 years of experience in data engineering, with a focus on data architecture, ETL processes, and big data technologies
- 10 years of hands-on experience designing and deploying enterprise data warehouse models
- 5+ years of experience and proficiency in programming languages such as Python, Java, and SQL
- Hands-on experience working with data warehousing technologies (BigQuery, Redshift, Snowflake)
- Strong understanding of database management systems (SQL and NoSQL).
- Expertise in big data technologies like Hadoop, Spark, DBT, Airflow, Apache Beam, Kafka, etc.
- Experience building data engineering solutions on cloud platforms (AWS, Azure, or GCP)
- Ability to provide architectural guidance and big data engineering expertise for use cases requiring federated queries, data ingestion, and distributed computing
- Excellent problem-solving skills and attention to detail.
- Hands-on experience writing and optimizing SQL-based code (a minimal example of this kind of work follows this list)
- Experience with database performance optimization and tuning.
- Familiarity with Continuous Integration/Continuous Deployment (CI/CD) Pipelines (e.g., Jenkins, GitLab CI/CD, AWS CodePipeline, GCP Cloud Build)
- A strength in pragmatically designing, building, and deploying scalable, highly available systems
- An ability to think abstractly and a comfort with ambiguous or undefined problems
- Excellent communication skills: you understand user needs and can translate them into actionable pieces of work
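As a minimal, purely illustrative sketch of the SQL optimization skill referenced above (the `donations` table and its columns are hypothetical, not from the posting), one common rewrite replaces a correlated subquery with a window function so the warehouse scans the table only once:

```sql
-- Hypothetical schema: donations(donor_id, amount, created_at).
-- Correlated subquery: re-evaluates the inner MAX() for every outer row.
SELECT d.donor_id, d.amount
FROM donations d
WHERE d.created_at = (
  SELECT MAX(d2.created_at)
  FROM donations d2
  WHERE d2.donor_id = d.donor_id
);

-- Window-function rewrite: a single pass over the table, which columnar
-- warehouses such as BigQuery, Redshift, and Snowflake plan efficiently.
SELECT donor_id, amount
FROM (
  SELECT donor_id,
         amount,
         ROW_NUMBER() OVER (PARTITION BY donor_id ORDER BY created_at DESC) AS rn
  FROM donations
) AS ranked
WHERE rn = 1;
```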
Base City:
Toronto - On-site
Salary:
No salary listed, therefore no star!
Rating:
Self-taught:
Position Type:
Full-time
Position Keywords:
- Airflow
- AWS
- AWS Data Engineer certification
- Azure Data Engineer certification
- Bachelor - Computer Science
- Beam
- Benevity
- Big Data
- BigQuery
- Business Requirements
- CI/CD
- CodePipeline
- Data Architecture
- Data Dictionary
- Data Governance
- Data Ingestion
- Data Integration
- Data Warehouse
- Database Optimization
- Databricks
- DBT
- Distributed Computing
- ETL
- Federated Queries
- GCP
- GCP Professional Data Engineering Certification
- GitLab
- Hadoop
- Java
- Jenkins
- Kafka
- Logical Data Model
- Metadata
- Microsoft Azure
- NoSQL
- Physical Data Model
- Python
- RDBMS
- Redshift
- Snowflake
- Spark
- SQL
Required:
Bachelor - Computer Science
Experience:
10 Years Data Engineering
Other Experience:
About the Job:
The world’s coolest companies (and their employees) use Benevity’s technology to take social action on the issues they care about. Through giving, volunteering, grantmaking, employee resource groups and micro-actions, we help most of the Fortune 100 brands build better cultures and use their power for good. We’re also one of the first B Corporations in Canada, meaning we’re as committed to purpose as we are to profits. We have people working all over the world, including Canada, Spain, Switzerland, the United Kingdom, the United States and more!
Benevity’s software architecture has evolved to include a diverse technology stack. The front-end application, built mainly with VueJS, is designed for both desktop and mobile web rendering. Our back-end systems (some Java Spring Boot, some PHP) manage data processing, interface with external providers, and ensure robust security. We run and operate our systems in the AWS cloud, leveraging cloud-native technology where possible. We emphasize clean, maintainable code and use Git for version control and collaboration. Additionally, our platform integrates with various external services for functionalities like email communication, content storage, and server-to-server interactions.
Our culture is driven by our core value of “we-are-we,” and as a Senior Data Modeler you will work in an outcome-driven environment where collaboration with your product, design, and engineering counterparts is paramount.
If you’re eager to make a difference and thrive in a collaborative setting, we invite you to join our team!
What they want you to do:
The Senior Data Modeler will be responsible for the design, implementation, and maintenance of complex data models that support our business operations and analytics. This role requires a deep understanding of data modeling techniques, database design, and data management practices. The ideal candidate will be a strategic thinker with extensive experience in translating business requirements into robust data models.
- Design and develop logical and physical data models to meet the needs of various business applications
- Ensure data models are aligned with business requirements and best practices
- Create and maintain data dictionaries and metadata repositories
- Design and optimize database structures to support high performance and scalability
- Collaborate with database administrators to ensure optimal performance of data models
- Implement indexing, partitioning, and other database optimization techniques (a rough sketch follows this list)
- Work closely with data architects, data engineers, and business analysts to integrate data from various sources
- Define data integration standards and practices. Ensure data consistency, quality, and integrity across different systems
- Collaborate with cross-functional teams to understand business requirements and translate them into effective data models
- Document data models, data flows, and business rules
- Ensure compliance with data governance policies and industry regulations
- Conduct regular audits and reviews of data models and databases to ensure compliance and optimal performance
- Identify opportunities for process improvements and implement solutions to enhance data modeling practices
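The posting stops at the bullet list, but as a rough sketch of how the modeling and partitioning responsibilities above might translate into a physical table, here is a hedged example in BigQuery DDL; the dataset, table, and column names are assumptions for illustration, not taken from the posting:

```sql
-- Hypothetical physical model for a donation fact table (BigQuery DDL).
CREATE TABLE analytics.fct_donation (
  donation_id   STRING    NOT NULL,
  donor_id      STRING    NOT NULL,
  program_id    STRING,
  amount        NUMERIC,
  currency_code STRING,
  donated_at    TIMESTAMP NOT NULL
)
PARTITION BY DATE(donated_at)      -- limits scans to the dates a query actually touches
CLUSTER BY donor_id, program_id;   -- orders storage by the most common filter/join keys
```

In BigQuery, clustering fills the role that secondary indexes play in a traditional RDBMS; an equivalent Snowflake or Redshift model would use clustering keys or sort/distribution keys instead.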