Lead the development of a module or defined scope of work from requirements to go-live
Well versed in Big Data technologies such as Hive, Pig, Sqoop, and Spark from a development perspective
Experience in at least one programming language (Java, Scala, or Python), with strong experience in designing Big Data platforms
Optional: experience with real-time data analytics platforms, including technologies such as Kafka, Spark Streaming, or Storm
Ability to create a detailed solution approach and design in conjunction with the Architect
Conduct detailed component design sessions with the team to help them understand and implement the components in line with the design.
Capable of providing an overall code framework that developers can industrialize
Review developers' code to ensure adherence to the defined best practices and patterns
Work with Business & Data Analysts during the requirements analysis and design phase.
Review the unit test plans created by developers.
Review the outcomes of unit testing.
Coordinate between Offshore and Onsite teams.
Experience with Data Vault, Mainframe offloading, and Teradata offloading is nice to have.
Experience working on Cloudera.