Data Modeler [T500-**]

  • Hyderabad
  • Inspire
Position Overview:

As a Data Modeler, you will create and maintain conceptual, logical, and physical data models for efficient, high-performance storage and retrieval of data from cloud-based, enterprise-grade big data environments that enable Data Science, Data Analytics, and Reporting functions on multi-sourced, multi-branded data. You will work with business users and engineering teams composed of Data Engineers, Data Scientists, Data Architects, Data Analysts, and information technologists to understand data requirements and model critical data that supports reporting and analytics. You will produce designs that follow Inspire data modeling conventions and adhere to the naming standards set forth by the Lead Data Modeler.

Responsibilities:

  • Collaborate with business stakeholders and IT Data Analysts to understand source data and determine the optimal design to meet the organization's data management goals.
  • Work with other data modeling team members to identify and resolve potential model conflicts between business domains supported by other modelers.
  • Create detailed logical models that identify all entities, attributes, and their relationships.
  • Convert logical models to physical models that follow Inspire data modeling guidelines and naming standards.
  • Communicate design decisions to business and IT stakeholders and ensure that designs meet requirements and goals.
  • Generate table and base view DDL scripts from the physical data model and deploy changes to Development and QA testing environments.
  • Create presentation layer views for end users, which may involve consolidating data from multiple disparate data sources into a single homogeneous output presentation.
  • Prepare table and view DDL scripts for inclusion in Git repositories for the User Acceptance Testing and Production environments and initiate their deployment via Azure DevOps pipelines.
  • Monitor and manage table and view object versioning across each development lifecycle environment (Dev/QA/UAT/Production).
  • Work with the Production Support team to analyze and resolve errors and design defects.

Education and Experience Qualifications:

  • 4-year degree in Information Systems, Computer Science, or a related field.
  • At least 8 years of total IT experience.
  • 8+ years of experience writing high-performing, complex SQL queries on large datasets.
  • 5-6 years of data modeling experience using a major data modeling tool, preferably Erwin Data Modeler.
  • 3-5 years of experience with cloud-based big data warehouses, preferably Snowflake and Azure Databricks.
  • Experience in ETL development is a plus.

Required Knowledge, Skills, or Abilities:

  • Solid experience with Agile development methodology and with managing multiple design deliverables at different development stages (Dev/QA/UAT/Production) within a single Agile sprint.
  • Working familiarity with data integration, metadata management, and data warehousing concepts.
  • Experience building normalized relational models and denormalized OLAP dimensional models with a data modeling tool, preferably Erwin Data Modeler.
  • Experience reverse engineering models from existing databases.
  • Ability to write complex SQL queries against large databases comprising billions of rows.
  • Familiarity with cloud data integration, data modeling, and data warehousing concepts.
  • Effective communication and collaboration skills to work with cross-functional teams and convey technical concepts to non-technical stakeholders.