
Sr./Data Engineer (Snowflake + dbt)

Job ID: JB-398C2

Logic Pursuits

Hyderabad, India | Full-time | Technology | Mid Level | 1,300,000–2,200,000 INR (yearly)
12 applications
4 positions available
Posted 3 weeks, 1 day ago
Deadline: Jan 01, 3000
Recruiter Commission: 0%
Skills
Snowflake, dbt, Python (basic), SAP DS, Data Modelling, SQL, dbt Cloud, Azure Data Lake, RBAC, Airflow, Control-M, Git, CI/CD, Azure, AWS S3, GCS, GDPR
Description & Requirements

Mandatory Skills

Snowflake
dbt
Python (basic)
SAP DS
Data Modelling
SQL
dbt Cloud
Azure Data Lake
RBAC
Additional Skills

Job Description

Data Engineer: 3–5 years of experience (2 openings)
Sr. Data Engineer: 6–10 years of experience (2 openings)

About Us

Logic Pursuits provides companies with innovative technology solutions for everyday business problems. Our passion is to help clients become intelligent, information-driven organizations in which fact-based decision-making is embedded into daily operations, leading to better processes and outcomes. Our team combines strategic consulting services with growth-enabling technologies to evaluate risk, manage data, and leverage AI and automated processes more effectively. With deep Big Four consulting experience in business transformation and process efficiency, Logic Pursuits is a game-changer in any operations strategy.

Job Description

We are looking for an experienced, results-driven Data Engineer to join our growing Data Engineering team. The ideal candidate will be proficient in building scalable, high-performance data transformation pipelines using Snowflake and dbt, and able to work effectively in a consulting setup. In this role, you will be instrumental in ingesting, transforming, and delivering high-quality data to enable data-driven decision-making across the client’s organization.

Key Responsibilities

Design and build robust ELT pipelines using dbt on Snowflake, including ingestion from relational databases, APIs, cloud storage, and flat files.
Reverse-engineer and optimize SAP Data Services (SAP DS) jobs to support scalable migration to cloud-based data platforms.
Implement layered data architectures (e.g., staging, intermediate, mart layers) to enable reliable and reusable data assets.
Enhance dbt/Snowflake workflows through performance optimization techniques such as clustering, partitioning, query profiling, and efficient SQL design.
Use orchestration tools like Airflow, dbt Cloud, and Control-M to schedule, monitor, and manage data workflows.
Apply modular SQL practices, testing, documentation, and Git-based CI/CD workflows for version-controlled, maintainable code.
Collaborate with data analysts, scientists, and architects to gather requirements, document solutions, and deliver validated datasets.
Contribute to internal knowledge sharing through reusable dbt components and participate in Agile ceremonies to support consulting delivery.
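
The layered architecture mentioned above (staging, intermediate, mart) can be sketched in plain Python as a rough illustration of the idea; in practice these would be dbt SQL models, and all table and column names below are invented for this example.

```python
# Minimal sketch of staging -> mart layering, mirroring the dbt layering
# convention described above. All table/column names are hypothetical.

def stg_orders(raw_rows):
    """Staging layer: rename and clean raw source columns."""
    return [
        {"order_id": r["ID"], "amount": float(r["AMT"]), "status": r["STATUS"].lower()}
        for r in raw_rows
    ]

def mart_revenue_by_status(stg_rows):
    """Mart layer: aggregate staged rows into a reporting dataset."""
    totals = {}
    for r in stg_rows:
        totals[r["status"]] = totals.get(r["status"], 0.0) + r["amount"]
    return totals

raw = [
    {"ID": 1, "AMT": "10.5", "STATUS": "PAID"},
    {"ID": 2, "AMT": "4.0", "STATUS": "PAID"},
    {"ID": 3, "AMT": "7.25", "STATUS": "OPEN"},
]
print(mart_revenue_by_status(stg_orders(raw)))  # {'paid': 14.5, 'open': 7.25}
```

Each layer only reads from the layer below it, which is what makes the assets reusable: several marts can share one staged source.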

Required Qualifications

Data Engineering Skills

3–5 years of experience in data engineering, with hands-on experience in Snowflake and basic to intermediate proficiency in dbt.
Capable of building and maintaining ELT pipelines using dbt and Snowflake with guidance on architecture and best practices.
Understanding of ELT principles and foundational knowledge of data modeling techniques (preferably Kimball/Dimensional).
Intermediate experience with SAP Data Services (SAP DS), including extracting, transforming, and integrating data from legacy systems.
Proficient in SQL for data transformation and basic performance tuning in Snowflake (e.g., clustering, partitioning, materializations).
Familiar with workflow orchestration tools like dbt Cloud, Airflow, or Control-M.
Experience using Git for version control and exposure to CI/CD workflows in team environments.
Exposure to cloud storage solutions such as Azure Data Lake, AWS S3, or GCS for ingestion and external staging in Snowflake.
Working knowledge of Python for basic automation and data manipulation tasks.
Understanding of Snowflake's role-based access control (RBAC), data security features, and general data privacy practices like GDPR.
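
Snowflake's RBAC model mentioned above grants privileges to roles rather than to individual users. As a rough illustration, a small helper could generate the GRANT statements for a read-only reporting role; the role, database, and schema names here are hypothetical.

```python
# Hypothetical helper that builds Snowflake RBAC GRANT statements for a
# read-only role. Role, database, and schema names are invented examples.

def readonly_grants(role, database, schema):
    fq_schema = f"{database}.{schema}"
    return [
        f"GRANT USAGE ON DATABASE {database} TO ROLE {role};",
        f"GRANT USAGE ON SCHEMA {fq_schema} TO ROLE {role};",
        f"GRANT SELECT ON ALL TABLES IN SCHEMA {fq_schema} TO ROLE {role};",
        f"GRANT SELECT ON FUTURE TABLES IN SCHEMA {fq_schema} TO ROLE {role};",
    ]

for stmt in readonly_grants("REPORTING_RO", "ANALYTICS", "MART"):
    print(stmt)
```

The `FUTURE TABLES` grant keeps the role current as new tables are created in the schema, which avoids re-running grants after every dbt deployment.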

Data Quality & Documentation

Familiar with dbt testing and documentation practices (e.g., dbt tests, dbt docs).
Awareness of standard data validation and monitoring techniques for reliable pipeline development.
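
dbt's built-in schema tests (e.g. `not_null`, `unique`) boil down to simple row-level checks. As a rough sketch of what those tests assert, with an invented column name (dbt itself runs these as SQL against the warehouse):

```python
# Rough Python equivalents of dbt's not_null and unique schema tests.
# The column name is hypothetical; a test passes when it returns no rows.

def not_null(rows, column):
    """Rows that fail a dbt-style not_null test on `column`."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Values that fail a dbt-style unique test on `column`."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

rows = [{"order_id": 1}, {"order_id": 2}, {"order_id": 2}, {"order_id": None}]
print(not_null(rows, "order_id"))  # [{'order_id': None}]
print(unique(rows, "order_id"))    # [2]
```

In both cases an empty result means the test passes, which is exactly the convention dbt uses for its generic tests.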

Soft Skills & Collaboration

Strong problem-solving skills and ability to debug SQL and transformation logic effectively.
Able to document work clearly and communicate technical solutions to a cross-functional team.
Experience working in Agile settings, participating in sprints, and handling shifting priorities.
Comfortable collaborating with analysts, data scientists, and architects across onshore/offshore teams.
High attention to detail, proactive attitude, and adaptability in dynamic project environments.

Nice to Have

Experience working in client-facing or consulting roles.
Exposure to AI/ML data pipelines or tools like feature stores and MLflow.
Familiarity with enterprise-grade data quality tools.

Education:

Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
Certifications such as Snowflake SnowPro or dbt Certified Developer are a plus.
Additional Information

Why Join Us?

• Opportunity to work on diverse and challenging projects in a consulting environment.

• Collaborative work culture that values innovation and curiosity.

• Access to cutting-edge technologies and a focus on professional development.

• Competitive compensation and benefits package.

• Be part of a dynamic team delivering impactful data solutions.

Required Qualification
Bachelor of Engineering - Bachelor of Technology (B.E./B.Tech.)

Job Insights: Important Tips to source better

Please look for early joiners (max. 15 days' notice). Refrain from sharing candidates with a 60-day official notice period.
This is a 5-day work-from-office role (no hybrid/remote options available).

For Data Engineer (3–5 years), the maximum budget is 15 LPA.
For Sr. Data Engineer (6–10 years), the maximum budget is 22 LPA.

Mandatory skill sets:

Snowflake – Intermediate skillset
SAP DS – Intermediate skillset
dbt – Basic skillset
Questionnaire

Question 1: Current location?

Desired answer: Anywhere in India

Question 2: Experience in orchestration tools like Airflow, dbt Cloud, and Control-M?

Desired answer: Yes

Question 3: Basic Python programming?

Desired answer: Yes

Question 4: Experience in RBAC?

Desired answer: Yes

Question 5: Experience in data modelling with Kimball/Dimensional?

Desired answer: Yes

Question 6: Experience in SAP DS? (in years)

Desired answer: At least 2 years

Question 7: Experience in Snowflake and dbt? (in years)

Desired answer: Please mention separately