Job Location | Hyderabad |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | IT - Software |
Functional Area | IT Operations / EDP / MIS, Web / Mobile Technologies |
Employment Type | Full-time |
Senior Tech Lead
Location: Hyderabad, Andhra Pradesh, IN
Job Category: Technology

Job Skills Requirement:
- At least 3 years of hands-on development experience and a deep understanding of the Kafka architecture and its internals, along with the interplay of architectural components: brokers, ZooKeeper, producers/consumers, Kafka Connect, Kafka Streams
- Experience with Kafka Streams / KSQL architecture and the associated clustering model
- Experience developing KSQL queries and best practices for choosing KSQL vs. Kafka Streams
- Strong knowledge of the Kafka Connect framework, with experience using several connector types: HTTP REST proxy, JMS, File, SFTP, JDBC, Splunk, Salesforce
- Hands-on experience using the Kafka API to build producer and consumer applications, along with expertise in implementing KStreams components; has developed KStreams pipelines and deployed KStreams clusters (a minimal pipeline sketch follows this list)
- Strong understanding of relational and NoSQL databases (MongoDB), SQL, and database/schema design
- Knowledge of connectors available from Confluent and the community
- Hands-on experience designing, writing, and operationalizing new Kafka connectors using the framework
- Familiarity with the Schema Registry
- Best practices for optimizing the Kafka ecosystem based on use case and workload, e.g. how to effectively use topics, partitions, and consumer groups to provide optimal routing and support UDFs and UDAFs
- Solid programming proficiency in Java/Scala/Node.js/Python and development best practices
- Experience monitoring Kafka infrastructure along with related components (connectors, KStreams, and other producer/consumer apps)
- Familiarity with Confluent Control Center

Preferred Skills:
- Strong fundamentals in Kafka administration, configuration, and troubleshooting
- Knowledge of Kafka clustering and its fault-tolerance model supporting HA and DR
- Practical experience scaling Kafka, KStreams, and connector infrastructures, with the motivation to build efficient platforms
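For context on the kind of hands-on KStreams work the requirements describe, here is a minimal Kafka Streams pipeline sketch in Java. It is illustrative only and not part of the posting; the application id, broker address, and topic names ("orders-in", "orders-out") are placeholder assumptions.

import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class OrderFilterPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "order-filter-app");   // placeholder application id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // placeholder broker address
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        // Read from an input topic, drop empty records, and write the rest to an output topic.
        KStream<String, String> orders = builder.stream("orders-in");
        orders.filter((key, value) -> value != null && !value.isEmpty())
              .to("orders-out");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        // Close the topology cleanly on shutdown.
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}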