




**Requirements and Responsibilities**

**Duties:**

* Experience with ETL/ELT processes using Microsoft Fabric Data Factory (or Azure Data Factory)
* Strong knowledge of data ingestion pipelines (batch and real-time) from diverse sources (databases, APIs, flat files, etc.)
* Proficiency in SQL for querying, data manipulation, and performance tuning
* Experience with Python or Scala for data wrangling, transformations, and automation (Fabric Notebooks)
* Familiarity with DAX and the M language for advanced Power BI modeling and calculations
* Strong understanding of Data Lake, Data Warehouse, and Lakehouse models
* Power BI expertise: data modeling, relationships, measures, and report building
* Experience with Direct Lake mode, Import mode, and composite models in Power BI connected to Fabric
* Ability to create real-time dashboards using Real-Time Analytics features in Fabric
* Skill in query performance tuning (partitioning, indexing, data pruning strategies)

**Technical Requirements:**

* 7–10 years of experience in Data Engineering or a closely related role
* Proven experience with data warehousing, data modeling, and ETL processes
* Strong hands-on expertise in Microsoft Power BI (including DAX)
* Solid experience with Microsoft Fabric components (Lakehouse, Pipelines, etc.)
* Knowledge of ETL frameworks

**Non-Technical Requirements:**

* Strong communication skills: ability to work closely with business users, analysts, and cross-functional teams
* Experience collaborating with stakeholders to understand business needs and translate them into data solutions
* Problem-solving mindset with the ability to simplify complex data challenges


