Job Location | Hyderabad |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | Education / Training |
Functional Area | General / Other Software |
Employment Type | Full-time |
Job Duties:
- Define the technology architecture to meet use-case requirements (fill any gaps against what already exists)
- Develop standards and best practices, and optimise the existing AWS cloud architecture
- Drive development of use cases once the data architecture and technology architecture are defined
- Deploy and implement the Kafka architecture in multiple environments; ensure optimum performance, high availability and stability of solutions
- Resolve any infrastructure bottlenecks in the data ingestion, transformation and processing layers
- Execute Scrum

Skill Set:
Must Have:
- Hands-on experience in application development and management of NoSQL distributed databases (Neo4j), including database administration
- At least 5 years of experience in the design and implementation of Big Data solutions

Secondary Skills:
- Experience in ETL (Extraction, Transformation, Loading) work for data conversions (ETL processes in AWS Glue)
- Hands-on experience with a data capture platform (Attunity Replicate)
- Hands-on experience standing up and administering a Kafka platform: Kafka brokers, ZooKeeper, Kafka Connect, Schema Registry, monitoring, etc.
- AWS stack of technologies, with a focus on Glue, S3, Lambda and Lex
- Experience building real-time data pipelines/streaming

Desirable Skills:
- Hadoop/Cloud (AWS) developer certification
- Experience deploying applications in a cloud environment; ability to architect, design, deploy and manage cloud-based Hadoop clusters

Qualification: 4-year college degree or US equivalent
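The ETL (Extraction, Transformation, Loading) work mentioned above can be illustrated with a minimal sketch in plain Python. All names and the sample records here are hypothetical; a production pipeline would run these stages in a managed service such as AWS Glue rather than in-process like this.

```python
import json
from io import StringIO

def extract(source):
    """Extract: parse newline-delimited JSON records from a file-like source."""
    return [json.loads(line) for line in source if line.strip()]

def transform(records):
    """Transform: drop malformed rows and normalise field names/types."""
    out = []
    for rec in records:
        if "id" not in rec:
            continue  # skip rows missing the required key
        out.append({
            "id": int(rec["id"]),
            "name": rec.get("name", "").strip().title(),
        })
    return out

def load(records, sink):
    """Load: write the cleaned records out as a single JSON document."""
    json.dump(records, sink)

# Hypothetical sample data standing in for a raw source file.
raw = StringIO(
    '{"id": "1", "name": " alice "}\n'
    '{"name": "no id"}\n'
    '{"id": "2", "name": "bob"}\n'
)
cleaned = transform(extract(raw))
out = StringIO()
load(cleaned, out)
```

The same three-stage shape carries over to Glue jobs: extract maps to a Glue crawler/source, transform to the job script, and load to the target data store.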
Keyskills :
databases, SQL Server, SQL, database administration, RMAN, use case, data architecture, high availability, distributed databases, technology architecture, optimization strategies, ETL, AWS, cloud, Scrum, NoSQL, Hadoop, design, real-time data, gap