Requirements:
- At least a Bachelor's degree in Computer Science, Computer Engineering, or a technology-related field, or equivalent work experience
- Experience with Data Warehouse projects in a product- or service-based organization
- Expertise in Data Engineering, including implementing multiple end-to-end DW projects in a Big Data environment
- Experience building and deploying production-level data-driven applications and data processing workflows/pipelines
- Experience with application development frameworks (Java/Scala, Spring)
- Experience with data processing and storage frameworks like Hadoop, Spark, Kafka
- Experience implementing REST services with support for JSON, XML and other formats
- Experience with performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts
- Experience working in Agile teams
- Strong analytical skills for writing and performance-tuning complex SQL queries, debugging production issues, identifying root causes, and implementing mitigation plans
- Strong verbal and written communication skills, along with strong relationship-building, collaboration, and organizational skills
- High-energy, detail-oriented, and proactive; able to work independently under pressure with a high degree of initiative and self-motivation to drive results
- Ability to quickly learn and implement new technologies, and to perform proofs of concept (POCs) to explore the best solution for a given problem statement
- Flexibility to work as a member of diverse, geographically distributed, matrix-based project teams
- Experience developing integrated cloud applications on platforms such as Azure or GCP
Key Skills:
- 6 Scrum teams, each of size ~8
- Expert in Angular and Core Java
- REST API & Spring Boot
- SQL & NoSQL databases
- Expert in coding, testing, and design