Data Engineer

  • Bengaluru
  • Cloud Raptor
At Cloud Raptor, we specialise in providing scalable, efficient solutions to businesses across many industries. With a focus on cloud technologies and deep technical expertise, we help organisations navigate the ever-changing landscape of the digital era. Our dedicated Centres of Excellence across the globe allow us to deliver comprehensive senior resourcing solutions, both onshore and offshore, so our clients achieve maximum ROI while staying on budget. Our focus is on delivering outstanding end-to-end customer experiences. From cost-engineering budgets to meeting time-sensitive deadlines, we strive to exceed expectations and add exceptional value to our clients’ businesses.

Job Description:

  • 5+ years’ working experience as a data or software engineer in a fast-paced, growing company.
  • Excellent SQL and Python knowledge; strong hands-on data modelling and data warehousing skills. Experience with transformations orchestrated through technologies such as dbt or Cloud Dataflow would be a plus.
  • Strong experience applying software engineering best practices to analytics (e.g. version control, testing, and CI/CD).
  • Power user and expert in building scalable data warehouses and pipelines using cloud platforms such as Snowflake, AWS, and Google Cloud, and cloud ETL tools such as Databricks (Spark/Azure).
  • Solid experience with ETL/ELT processes, scheduling tools (e.g. Talend, Airflow), and API management tools.
  • Experience with customer, marketing, and/or web data (e.g. Salesforce, Google Analytics, AdWords, YouTube).
  • Proficiency with data visualisation tools and packages (e.g. Looker, Tableau, matplotlib).
  • Strong attention to detail to identify and address data quality issues.
  • A self-starter: motivated, responsible, innovative, and technology-driven, performing well both independently and as a team member.
  • A proactive problem solver with good communication and project management skills, able to relay findings and solutions to both technical and non-technical audiences.
  • Bachelor’s or master’s degree in Engineering, Math, Statistics, Finance, Economics, or a related field.
  • Passion for gaming or the gaming industry would be a great plus and a conversation starter.

Key Responsibilities:

  • Work closely with data users to understand business requirements.
  • Be actively involved in designing high-performance, reusable, and scalable data models for our data warehouse, ensuring that end users get consistent and reliable answers when running their own analyses.
  • Write complex yet optimised data transformations in SQL/Python using dbt or similar technology.
  • Continuously discover, transform, test, deploy, and document data sources.
  • Apply advanced aggregations and data-wrangling techniques, such as imputation, for predictive analytics.
  • Work with a wide range of technologies at various levels, such as Snowflake, dbt, Airflow, Fivetran, AWS, Spark, and the GCP environment: BigQuery, Cloud Composer, Cloud Dataflow, etc.