
Integration Engineer

Bengaluru | Job No. atci-5508908-s2013823 | Full-time

Job Description

Project Role : Integration Engineer
Project Role Description : Provide consultative business and system integration services to help clients implement effective solutions. Understand and translate customer needs into business and technology solutions. Drive discussions and consult on transformation, the customer journey, and functional/application designs, and ensure technology and business solutions represent business requirements.
Must have skills : Data Engineering
Good to have skills : NA
A minimum of 5 years of experience is required
Educational Qualification : 15 years full time education

Role Summary
As a Data Engineer, you will bring deep technical knowledge across the core components of a modern, modular, cloud-native data platform. You will be responsible for extracting data from source systems (SAP ERP, SAP IBP, Veeva, MES, etc.) into S3 buckets to build the bronze layer, and from S3 into Redshift to build the silver and golden data products ready for consumption by the business.
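The bronze-to-silver step described above can be sketched in miniature. This is an illustrative local sketch only: in the actual platform this work would run on AWS Glue or EMR against S3 and Redshift, and the function name `to_silver` and the record fields (`order_id`, `quantity`) are hypothetical, not taken from the posting.

```python
# Hypothetical sketch of a bronze -> silver refinement step: raw extracts
# are parsed, deduplicated on a business key, type-normalized, and stamped
# with a load timestamp. Plain Python stands in for Glue/Spark here so the
# shape of the transform is visible.
import json
from datetime import datetime, timezone

def to_silver(bronze_records):
    """Clean raw 'bronze' extracts into a deduplicated 'silver' record set."""
    seen = set()
    silver = []
    for raw in bronze_records:
        # Bronze rows may arrive as JSON strings or already-parsed dicts.
        rec = json.loads(raw) if isinstance(raw, str) else dict(raw)
        key = rec.get("order_id")          # hypothetical business key
        if key is None or key in seen:     # drop malformed rows and duplicates
            continue
        seen.add(key)
        rec["quantity"] = int(rec.get("quantity", 0))  # normalize types
        rec["loaded_at"] = datetime.now(timezone.utc).isoformat()
        silver.append(rec)
    return silver
```

For example, feeding in a valid row, a duplicate of it, and a row missing the key yields a single cleaned record.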
Key Responsibilities
Implement robust, scalable data services in AWS using Glue, Redshift, Iceberg, Lambda, EMR, Step Functions, Apache Airflow, etc.
Develop infrastructure-as-code modules and support continuous delivery pipelines.
Collaborate on architectural proposals and ensure alignment with broader platform strategy.
Work together with the Product Manager to gather data requirements, understand business priorities, and translate them into technical specifications.
Partner with data scientists and domain engineers to enable governed self-service Data Management Capabilities.
Perform code reviews and contribute to team knowledge-sharing and documentation.
Minimum Qualifications
5–8 years of experience in software or data engineering, preferably in cloud environments.
Proficiency in Python, SQL, and tools like dbt or Apache Spark.
Experience with AWS data stack (e.g., Glue Catalog, IAM, S3).
Solid understanding of CI/CD, DevOps, and IaC tools like Pulumi or Terraform.
Experience developing and managing ETL/ELT workflows to ingest data from multiple sources (structured, semi-structured, and unstructured).
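As one illustration of the multi-source ingestion this qualification refers to, the sketch below normalizes a structured (CSV) and a semi-structured (JSON lines) source into one record list. It is a minimal local sketch with hypothetical field names; in production this kind of step would typically be built with Glue, Spark, or dbt rather than the standard library.

```python
# Minimal sketch of a multi-format ingestion step: one structured source
# (CSV) and one semi-structured source (JSON lines) are flattened into a
# single list of tagged records. Field names are hypothetical.
import csv
import io
import json

def ingest(csv_text, json_lines):
    """Combine a CSV extract and a JSON-lines extract into tagged records."""
    records = []
    for row in csv.DictReader(io.StringIO(csv_text)):   # structured source
        records.append({"source": "csv", **row})
    for line in json_lines.splitlines():                # semi-structured source
        if line.strip():
            records.append({"source": "json", **json.loads(line)})
    return records
```

Each record keeps a `source` tag so downstream layers can trace lineage back to the originating system.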

Preferred Qualifications
Exposure to data lakehouse and data mesh architectures.
Familiarity with data privacy and access control implementations (OAuth, RBAC).
Experience with AI/ML integration or supporting MLOps workflows.
AWS certifications such as AWS Certified Data Analytics – Specialty or AWS Certified Solutions Architect.
Knowledge of supply chain, manufacturing, and quality processes.

Job Requirements

15 years full time education
