Big Data Processing-Kafka @ Gurgaon

7 - 12 Years
Gurgaon

Job Description

We are pleased to invite you to apply for "Big Data Processing-Kafka".
Please find the JD below:
Experience: 7-16 years

Job Description:
Big Data Processing-Kafka

*Apache Kafka or Confluent Kafka experience

*3+ years of experience with event-driven architecture

*Should have complete knowledge of Kafka architecture, e.g. properties of a Kafka record, use and benefits of partitions, use of consumer groups, what ISR is, what offsets are used for, and the role of Zookeeper

*Should have knowledge of fault tolerance, scalability, etc.

*Should have knowledge of cluster configuration

*Should have been involved in 2 Kafka project implementations

*Good to have knowledge of Kafka MirrorMaker and the scope of its implementation

*Should have used AVRO schemas for Kafka implementation (a brief illustrative sketch follows this list)

*Good to have knowledge of the AVRO schema registry

*Should have knowledge of programming languages/frameworks to implement Kafka solutions

*Good to have knowledge of other messaging systems such as RabbitMQ and ActiveMQ
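For illustration, here is a minimal sketch of the kind of Kafka solution work described above: a Java producer publishing an Avro-encoded record through the Confluent schema registry serializer. The broker address, registry URL, topic name, and OrderEvent schema are assumptions made for this example only.

```java
import java.util.Properties;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

public class AvroProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Assumed endpoints for the example; a real project would supply its own.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        // Confluent's Avro serializer registers/looks up the schema in the schema registry.
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        props.put("schema.registry.url", "http://localhost:8081");

        // Hypothetical Avro schema for an event payload.
        Schema schema = new Schema.Parser().parse(
                "{\"type\":\"record\",\"name\":\"OrderEvent\",\"fields\":["
                + "{\"name\":\"orderId\",\"type\":\"string\"},"
                + "{\"name\":\"amount\",\"type\":\"double\"}]}");

        GenericRecord event = new GenericData.Record(schema);
        event.put("orderId", "ORD-1001");
        event.put("amount", 499.0);

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            // The record key drives partition assignment: records sharing a key
            // land on the same partition, preserving per-key ordering.
            producer.send(new ProducerRecord<>("orders", "ORD-1001", event));
        }
    }
}
```

Keying records this way is what makes partitions, consumer groups, and offsets useful in an event-driven design: each consumer group tracks its own offsets per partition, and per-key ordering is preserved within a partition.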



Notice period: Immediate/45 days only

Interested candidates, please share your updated CV at satish@bharatjobs.com with the details below:

Total exp:
Relevant exp:
Notice Period:
Payroll company:
Current CTC:
Expected CTC:
Current location:

Thanks & Regards
Satish
Athena Consultancy Services (Bharatjobs), New Delhi
8744001282

Salary: Not Disclosed by Recruiter

Industry: IT-Software / Software Services

Functional Area: IT Software - Application Programming, Maintenance

Role Category: Programming & Design

Role: Software Developer

Desired Candidate Profile

Please refer to the Job description above

Company Profile

Athena Consultancy Services

ACS

Contact Company: Athena Consultancy Services