Senior Developer (Level 3) – Data, Integrations & Data Platforms
Groupe Nordik
Groupe Nordik is a fast-growing company whose goal is to rise to the top of the wellness, health, and tourism industries. We have proven ourselves as an industry trailblazer through our commitment to our mission. Our newly created corporate office supports our three spas in Quebec, Ontario, and Manitoba, continuing our expansion and helping us become an international leader in Nordic spas.
To fulfil our primary mission of transforming people’s lives, one visit at a time, Groupe Nordik is leading multiple projects. We gain momentum by identifying business opportunities, getting involved, and bringing promising, large-scale projects to life. We rely on our teams’ creativity and engagement to help us succeed and diversify our projects.
We are currently looking for a Senior Developer: a backend-focused expert who will lead Nordik's evolution toward a robust, scalable, cloud-based data ecosystem. The position elevates our data maturity through strong engineering, automation, and architectural leadership, meeting our Level 3 senior expectations while specializing in data, integrations, and data platforms.
Join us!
Are you passionate about data, integrations, and development, and eager to contribute to exceptional projects?
Join the Groupe Nordik team!
Work with like-minded professionals who use their creativity and talents at one of the most dynamic, fast-growing companies in the region.
What you will do
As a Senior Developer (Level III) specializing in data engineering, integrations, and cloud data platforms, you will architect, design, and deliver robust backend systems, data pipelines, and database-driven applications that power Groupe Nordik’s operational and analytical ecosystems.
Unlike a full-stack profile, this role is backend-centric, with deep expertise in SQL/NoSQL databases, ETL/ELT workflows, data modeling, data warehousing, and integration patterns. You will serve as a technical lead, mentor developers, and drive best practices for data quality, scalability, and reliability.
Your work will directly influence Nordik’s data strategy and ensure our systems, analytics, and integrations operate smoothly across multiple environments.
Key Accountabilities & Core Responsibilities
1. Backend Architecture & Data Platform Engineering
- Architect and implement high-performance backend systems focused on databases, ETL processes, and distributed data flows.
- Design and maintain data models, schemas, and storage strategies for PostgreSQL, MSSQL, Snowflake/Synapse, MongoDB, and DynamoDB.
- Lead efforts to improve database performance, reliability, security, and scalability.
- Build APIs and backend services that support data-heavy applications and integrations.
2. ETL/ELT Development & Data Workflow Automation
- Design and implement ETL/ELT pipelines using tools such as dbt, SSIS, AWS Glue, or Airflow (or similar orchestration tools).
- Automate ingestion, transformation, and storage of structured and unstructured data.
- Ensure pipelines are observable, fully logged, error-tolerant, and auditable.
- Optimize data workflows for efficiency and reliability.
3. Data Warehouse, BI & Reporting Enablement
- Design and maintain schemas and business layers in Snowflake or Azure Synapse.
- Work with analytics and BI teams to ensure data availability, quality, and performance.
- Build optimized SQL datasets and views for tools such as Power BI, SSRS, or AWS QuickSight.
- Implement incremental loads, SCDs, partitions, clustering, and performance tuning.
4. Systems Integration & API/Data Exchange Engineering
- Architect and maintain integrations between internal systems, SaaS platforms, and cloud services.
- Implement secure, scalable patterns for data syncing, CDC, event-driven flows, and batch processing.
- Lead the design of APIs and microservices enabling data interoperability across the enterprise.
5. Leadership, Mentorship & Standards
- Act as technical lead on data-heavy initiatives and backend architecture decisions.
- Mentor intermediate and junior developers on backend design, SQL optimization, and integration design.
- Establish standards for SQL quality, data modeling, pipeline reliability, and documentation.
- Promote a culture of testing, code quality, and continuous improvement across the dev team.
6. Security, Performance & Reliability
- Implement secure coding and data handling practices (PII, access controls, encryption).
- Continuously evaluate and tune the performance of databases and pipelines.
- Identify, diagnose, and resolve complex backend data issues and bottlenecks.
- Ensure adherence to organizational performance, scalability, and reliability standards.
7. Cross-functional Collaboration
- Work closely with QA, DevOps, BI, Product, and business stakeholders to deliver solutions aligned with organizational needs.
- Provide expertise during planning and architectural sessions.
- Translate business data requirements into technical solutions that support analytics, operations, and reporting.
Scope of Influence
- Owns technical decisions for data platforms and integrations.
- Shapes database architecture and long-term data strategy.
- Resolves escalated or mission-critical backend issues.
- Represents backend/data engineering concerns in cross-functional planning.
Decision-Making & Autonomy
- Independently defines architecture for backend data systems.
- Leads resolution of complex data or integration issues.
- Anticipates technical risks across data pipelines and operational flows.
Typical Deliverables
- Data architecture diagrams, ERDs, integration specifications.
- Production-ready ETL/ELT pipelines and backend services.
- Performance-optimized SQL models, schemas, and stored procedures.
- Documentation, data dictionaries, pipeline monitoring dashboards.
- Mentorship artifacts (reviews, standards, guidelines).
Your profile stands out because of your
Core backend and data engineering expertise, including:
- Master-level SQL:
- PostgreSQL, MSSQL, Snowflake/Synapse
- Partitions, query tuning, execution plan analysis, indexing strategies
- Experience with NoSQL databases:
- MongoDB, AWS DynamoDB
- ETL/ELT development (SSIS, dbt, Glue, custom pipelines).
- Data warehousing concepts (star schemas, slowly changing dimensions, fact tables).
- CI/CD for data workflows (GitHub Actions, AWS Pipelines, etc.).
- API design and integration patterns (REST, events, queues).
- Cloud experience (AWS preferred: Lambda, RDS, DynamoDB, Glue, S3).
Key competencies, including:
- Strong analytical and troubleshooting skills.
- Excellent communication and documentation abilities.
- Ability to mentor and lead.
- Ownership mindset and autonomy.
- Proactive in identifying improvements and architectural refinements.
To be successful in this position, you will need:
- Bachelor’s degree in Computer Science, Software Engineering, Information Technology, or similar.
- 5–10+ years of hands-on backend development experience.
- Demonstrated expertise in:
- Database engineering (PostgreSQL, MSSQL).
- Data warehousing (Snowflake/Synapse)
- ETL/ELT pipelines
- NoSQL (MongoDB, DynamoDB)
- Data integrations & API engineering
- Experience acting as technical lead on data-centric or integration-heavy projects.
- Experience with cloud platforms (AWS strongly preferred).
Salary & Benefits
- Competitive salary
- Comprehensive benefits package (including telemedicine)
- Hybrid work model with flexible hours and strong focus on work-life balance
- Investment in your skills (company paid training, conferences, and more)
- Free access to the spa for you and a plus one
- Discounts on food, massage therapy, esthetic services, and more
- Free access to an exercise room at the workplace
Work schedule and status
Monday to Friday, 40 hours/week (Full-time).
To apply
Send us your resume and cover letter by clicking “Apply” below.
Job type
- Full-time
Location
- Gatineau, Quebec