Lead Data Engineer - Azure

  • Hyderabad
  • Yash Technologies

We are looking to hire “Lead Azure Data Engineers” who thrive on challenges and desire to make a real difference in the business world. With an environment of extraordinary innovation and unprecedented growth, this is an exciting opportunity for a self-starter who enjoys working in a fast-paced, quality-oriented team environment.

What you should have?

  • Azure Data Factory, Azure DevOps CI/CD, Azure Data Lake, Azure Synapse, Azure Databricks, PySpark, GitHub.
  • 6+ years of experience required
  • Good analytical skills with excellent knowledge of SQL.
  • Well versed in Azure services
  • Must have experience and knowledge of ADF, ADLS, and Blob Storage
  • Must have experience in building data pipelines
  • Hands-on development experience with PySpark and Databricks
  • Experience using software version control tools (Git); work in Agile methodologies, and may be required to perform QA for work done by other team members in the sprint
  • Work with the team and assist the Product Owner and technology lead in identifying and estimating data platform engineering work
  • Knowledge of, and ability to set up, DevOps and test frameworks
  • Familiarity with API integration processes
  • Exposure to Power BI, streaming data, and other Azure services
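As a flavour of the analytical SQL skills the role calls for, the sketch below runs a typical BI-style window query (a running total per region). It uses Python's built-in sqlite3 purely as a stand-in for Azure Synapse or SQL data warehouse, and the table and column names are invented for illustration.

```python
import sqlite3

# Hypothetical sales table; the equivalent T-SQL on Synapse would look very similar.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue REAL);
    INSERT INTO sales VALUES
        ('East', '2024-01', 100), ('East', '2024-02', 150),
        ('West', '2024-01', 200), ('West', '2024-02', 180);
""")

# Running total per region -- a common window-function pattern in BI reporting.
rows = conn.execute("""
    SELECT region, month,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()
# rows -> [('East', '2024-01', 100.0), ('East', '2024-02', 250.0),
#          ('West', '2024-01', 200.0), ('West', '2024-02', 380.0)]
```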

What you will do?

  • Develop data pipelines using Azure Data Factory to load data into Azure Data Lake and the SQL data warehouse
  • Perform data model design and ETL/ELT development optimized for efficient storage, access, and computation to serve various Business Intelligence use cases
  • Contribute, fully or partially, to API integration, end-to-end DevOps automation, test automation, data visualisation (Power BI), and Business Intelligence reporting solutions
  • Apply knowledge of programming languages and frameworks such as Spark and Python
  • Create technical design documentation that includes current and future functionality, affected database objects, specifications, and flows/diagrams detailing the proposed database and/or data integration implementation