
Cloud Platform Engineer

Location: Pune | Job No. atci-5272442-s2001061 | Full-time

Job Description

Project Role : Cloud Platform Engineer
Project Role Description : Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Deploys infrastructure and platform environments and creates a proof of architecture to test viability, security, and performance.
Must have skills : Microsoft Fabric
Good to have skills : NA
Minimum 3 year(s) of experience is required
Educational Qualification : 15 years of full-time education

We are seeking a skilled Microsoft Fabric Data Engineer to design, build, optimize, and maintain modern data solutions using Microsoft Fabric. The ideal candidate will have strong experience with data engineering, analytics workloads, cloud-based data platforms, and end-to-end data pipeline development.

Minimum 4 years of experience as a Microsoft Fabric Data Engineer

Key Responsibilities
1. Data Architecture & Modeling
Design and implement scalable data architectures using Microsoft Fabric components such as Lakehouse, Data Warehouse, OneLake, and KQL Databases.
Create and optimize star schemas, data marts, semantic models, and medallion architectures.
Manage and enforce data governance, security, and access control within Fabric workspaces.
2. ETL/ELT Pipeline Development
Develop, orchestrate, and maintain data ingestion and transformation pipelines using Data Factory, Fabric Pipelines, and Dataflows Gen2.
Build automated workflows for batch, streaming, or event-driven ingestion.
Optimize pipeline performance and ensure reliability, scalability, and fault-tolerance.
3. Data Integration & Processing
Work with structured and unstructured data from various enterprise systems, APIs, and external sources.
Utilize Apache Spark within Fabric Notebooks for large-scale data processing.
Implement Delta Lake best practices (Z-ordering, OPTIMIZE, VACUUM, etc.).
4. Analytics & Reporting Enablement
Partner with BI analysts to create and optimize Power BI semantic models and Direct Lake mode datasets.
Publish high-quality, certified data assets for business consumption.
Ensure data quality, accuracy, and consistency across analytic layers.
5. Monitoring, Optimization & Operations
Monitor Fabric workloads, storage utilization, capacity models, and performance.
Implement logging, alerting, and automated testing for pipelines.
Perform cost optimization for compute workloads and OneLake storage.
6. Collaboration & Stakeholder Engagement
Work closely with data analysts, data scientists, and business stakeholders to understand data needs.
Translate business requirements into scalable data solutions.
Document workflows, architectures, and best practices.
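Purely as an illustration of the star-schema modeling mentioned under responsibility 1, here is a minimal sketch in plain Python: a fact table referencing a dimension table by key, aggregated along a dimension attribute. All table and column names are hypothetical, not taken from this posting.

```python
# Minimal star-schema illustration: a fact table keyed to a dimension
# table, aggregated by a dimension attribute to produce an
# analysis-ready summary. All names here are hypothetical.

dim_product = {  # dimension table: product_key -> descriptive attributes
    1: {"name": "Widget", "category": "Hardware"},
    2: {"name": "Gadget", "category": "Electronics"},
}

fact_sales = [  # fact table: one row per sale, referencing the dimension
    {"product_key": 1, "quantity": 3, "amount": 30.0},
    {"product_key": 2, "quantity": 1, "amount": 99.0},
    {"product_key": 1, "quantity": 2, "amount": 20.0},
]

def sales_by_category(facts, dim):
    """Join fact rows to the dimension and total sales per category."""
    totals = {}
    for row in facts:
        category = dim[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category(fact_sales, dim_product))
# {'Hardware': 50.0, 'Electronics': 99.0}
```

In a real Fabric Lakehouse the same shape would be expressed as Delta tables queried with Spark or SQL; the point here is only the fact/dimension split that a star schema formalizes.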

Required Skills & Qualifications
Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field.
Hands-on experience with Microsoft Fabric (Lakehouse, Data Factory, Pipelines, OneLake, Notebooks, Power BI).
Strong proficiency with SQL, Python, Spark, and Delta Lake.
Experience with Azure services (Azure Data Lake, Azure Synapse, Azure Data Factory, AAD).
Solid understanding of ETL/ELT methodologies, data modeling, and data warehousing concepts.
Knowledge of version control (Git) and CI/CD workflows.
Excellent analytical, problem-solving, and communication skills.

Preferred Qualifications
Fabric Analyst or Fabric Engineer Certification.
Experience with MLOps or DataOps practices.
Familiarity with DevOps tools (Azure DevOps, GitHub Actions).
Experience with streaming technologies (Event Hubs, Kafka, Fabric Real-Time Analytics).

Job Requirements

15 years of full-time education
