GCP + Big Data

  • Bengaluru
  • LTIMindtree

Primary Skills: Hands-on experience with GCP, Python/Java, Spark, SQL, and Big Data is mandatory.

Experience: 5 to 12 years

Location: Gurgaon & Bengaluru


• Must-Have Qualifications:

o Master's degree in computer applications (or equivalent) OR Bachelor's degree in engineering or computer science (or equivalent).

o Deep understanding of Hadoop and Spark architecture and their working principles.

o Deep understanding of data warehousing concepts.

o Ability to design and develop optimized data pipelines for batch and real-time data processing.

o 5+ years of software development experience.

o 5+ years of experience with Python or Java; hands-on experience writing and understanding complex SQL (Hive/PySpark DataFrames) and optimizing joins when processing huge volumes of data.

o 3+ years of hands-on experience with MapReduce, Hive, and Spark (Core, SQL, and PySpark).

o Hands-on experience with Google Cloud Platform (BigQuery, Dataproc, Cloud Composer).

o 3+ years of experience in UNIX shell scripting.

o Experience in the analysis, design, development, testing, and implementation of system applications.

o Ability to effectively communicate with internal and external business partners.

• Additional Good-to-Have Requirements:

o Understanding of distributed ecosystems.

o Experience designing and building solutions using Kafka streams or queues.

o Experience with NoSQL databases, e.g., HBase, Cassandra, Couchbase, or MongoDB.

o Experience with data visualization tools such as Tableau, Sisense, or Looker.

o Ability to learn and apply new programming concepts.

o Knowledge of the financial reporting ecosystem is a plus.

o Experience leading engineering and scrum teams.