Cloud Architect

  • Hyderabad
  • Tech Mahindra
Title: Reltio GCP Architect
Location: Hyderabad / Bangalore / Pune
Experience: 10+ years
Notice Period: Immediate to 15 days

Company Profile: Tech Mahindra represents the connected world, offering innovative and customer-centric information technology experiences. We #Rise together to create sustainable businesses that can bring about lasting change in our communities – to create an equal world, to be future ready, and to create value. We are 152,000+ professionals across 90 countries, helping 1,297 global customers, including Fortune 500 companies.

Must have: Python, ML concepts and frameworks, FastAPI, GraphQL, AWS, MLflow, Airflow, ML pipeline creation, drift monitoring (see the sketches after the skills list below).

Responsibilities:
  • Translate business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses.
  • Design and implement effective database solutions and models to store and retrieve enterprise data.
  • Examine and identify database structural requirements by evaluating client operations, applications, and programming.
  • Assess database and data-flow implementation procedures to ensure they comply with internal and external regulations and security requirements.
  • Define the data architecture framework, standards, and principles, including data modeling, metadata, security, reference data (such as product codes and client categories), and master data (such as clients, vendors, materials, and employees).
  • Define the reference architecture: a pattern others can follow to create and improve data systems.
  • Define and design data flows, i.e., which parts of the organization generate data, which require data to function, how data flows are managed, and how data changes in transit.
  • Collaborate and coordinate with multiple departments, stakeholders, partners, and external vendors.

Skills:
  • 10+ years of designing, building, and operationalizing large-scale enterprise data solutions and applications on-prem and on GCP.
  • Experience designing data engineering solutions on GCP using Cloud Data Fusion, Dataflow, Dataproc, Airflow, Cloud Storage buckets, and BigQuery.
  • 10+ years of experience performing detailed assessments of current-state data platforms and creating an appropriate transition path to GCP.
  • Data migration experience from legacy systems, including Hadoop, Exadata, Oracle, and Teradata, to BigQuery.
  • Experience with data lake and data warehouse ETL build and design, along with designing solutions using data modelling concepts such as dimensional modelling, Data Vault 2.0, and Data Mesh.
  • Strong experience writing stored procedures and UDFs in Google BigQuery, Oracle, and SQL (a UDF deployment sketch follows this list).
  • 5+ years of designing and building production data pipelines from data ingestion to consumption within a hybrid big data architecture, using cloud-native GCP services, Java, Python, Scala, SQL, etc. (an ingestion pipeline sketch follows this list).
  • 5+ years of architecting and implementing next-generation data and analytics platforms on GCP.
  • Experience managing data end to end, with operational responsibilities such as alerting mechanisms, monitoring, infrastructure upgrades, and deployment utilities using CI/CD.
  • An active Google Cloud Professional Data Engineer or Professional Cloud Architect certification.
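To make the Python/FastAPI must-have concrete, here is a minimal model-serving sketch. The endpoint path, input fields, and scoring formula are hypothetical placeholders, not details from this posting.

```python
# Minimal FastAPI scoring-service sketch. The input schema and the
# scoring formula below are hypothetical placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-service")

class Features(BaseModel):
    # Hypothetical feature set; a real service would mirror the model's inputs.
    tenure_months: float
    monthly_spend: float

@app.post("/predict")
def predict(features: Features) -> dict:
    # Stand-in for a real model call (e.g. a model loaded via MLflow).
    score = 0.01 * features.tenure_months + 0.002 * features.monthly_spend
    return {"score": min(score, 1.0)}
```

Assuming the file is saved as main.py, the service runs with `uvicorn main:app`.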
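The "ML pipeline creation" and "drift monitoring" items pair naturally in an orchestrated workflow. Below is a minimal Airflow 2.x sketch assuming a daily train-then-check cadence; the DAG id, the placeholder drift statistic, and the 0.2 threshold are illustrative assumptions.

```python
# Minimal Airflow 2.x DAG sketch: train a model, then check for feature drift.
# The placeholder drift statistic and 0.2 threshold are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def train_model():
    # Placeholder: fit a model and log it (e.g. with MLflow's tracking API).
    pass

def check_drift():
    # A real check might compare training vs. live feature distributions
    # using a statistic such as PSI or a KS test.
    psi = 0.1  # placeholder value
    if psi > 0.2:
        raise ValueError("Feature drift above threshold; investigate/retrain")

with DAG(
    dag_id="ml_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    train = PythonOperator(task_id="train_model", python_callable=train_model)
    drift = PythonOperator(task_id="check_drift", python_callable=check_drift)
    train >> drift  # the drift check runs after each training run
```

Failing the drift task makes the condition visible in Airflow's alerting and monitoring, which ties into the operational responsibilities listed above.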
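For the ingestion-to-consumption pipeline requirement, one common GCP-native pattern is an Apache Beam job run on Dataflow that reads landed files from Cloud Storage and writes to BigQuery. The bucket path, table name, and two-column schema below are hypothetical.

```python
# Apache Beam (Python SDK) sketch of a GCS-to-BigQuery ingestion pipeline,
# runnable on Dataflow. Paths, table, and schema are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_line(line: str) -> dict:
    # Assumes a two-column CSV layout: id,amount
    record_id, amount = line.split(",")
    return {"id": record_id, "amount": float(amount)}

with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/landing/*.csv")
        | "Parse" >> beam.Map(parse_line)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.transactions",
            schema="id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same pipeline runs locally on the DirectRunner for testing, or on Dataflow by passing --runner=DataflowRunner in the pipeline options.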
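And for the BigQuery stored procedure/UDF item, a SQL UDF can be kept in version control and deployed from Python with the google-cloud-bigquery client. The project, dataset, and function names below are hypothetical placeholders.

```python
# Deploys a simple BigQuery SQL UDF from Python. Project, dataset, and
# function names are hypothetical placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumes ambient credentials

ddl = """
CREATE OR REPLACE FUNCTION `my-project.ref_data.normalize_code`(code STRING)
RETURNS STRING
AS (UPPER(TRIM(code)));
"""

client.query(ddl).result()  # .result() blocks until the DDL statement completes
print("UDF deployed: ref_data.normalize_code")
```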