Job Location | Noida
Education | Not Mentioned
Salary | Not Disclosed
Industry | Management Consulting / Strategy
Functional Area | General / Other Software
Employment Type | Full-time
Role:
The individual will lead the implementation of a real-time data pipeline that pairs Kafka events with databases and microservices. This position requires an effective communicator who can coordinate across clients, project teams, and platforms. Responsibilities include:
- Leading the development of data pipelines using a combination of Java, Kafka, CDC, and related technologies.
- Assisting in the architecture and design of scalable, highly available data pipelines.
- Ensuring extensibility, availability, and multi-tenancy in all aspects of the data pipeline solution.
- Driving continuous improvement of our software development process, products, and code.

Technical Skills:
- Good experience in Java; scripting experience, expertise in at least one reporting tool (Tableau/Spotfire), and good SQL knowledge. Knowledge of Sisense is preferred.
- Experience with data pipelines, ingestion, and data processing (batch/streaming).
- Ability to create and maintain optimal data pipeline architecture.
- Ability to assemble large, complex data sets that meet functional and non-functional business requirements.
- Advanced working experience with SQL (relational) and NoSQL databases.
- Experience integrating data from multiple data sources.
- Experience with messaging systems such as Kafka is a must.
- Experience implementing ETL processes and constructing data warehouses at scale.
- Experience with NoSQL databases such as Cassandra or MongoDB is highly desirable.
- Ability to keep data separated and secure across national boundaries through multiple data centers.
- Experience developing microservices, DevOps, test automation, and CI/CD.
- Experience building APIs on top of existing data models.
- Experience designing microservices on AWS, Heroku, or equivalent PaaS solutions.
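The core pattern the role describes — applying a stream of Kafka/CDC change events to a database — can be sketched in Java. This is a minimal illustration only: `ChangeEvent`, `apply`, and the in-memory map standing in for a database table are hypothetical names, and a real pipeline would use the Kafka consumer API with a CDC source such as Debezium rather than an in-memory queue.

```java
import java.util.ArrayDeque;
import java.util.HashMap;
import java.util.Map;
import java.util.Queue;

// Sketch of a CDC-style sink: each event either upserts or deletes a row.
// The HashMap stands in for a database table; the Queue stands in for a
// Kafka topic. Both are illustrative simplifications.
public class PipelineSketch {
    record ChangeEvent(String key, String value, boolean delete) {}

    private final Map<String, String> table = new HashMap<>();

    // Apply one change event: upsert on insert/update, remove on delete.
    public void apply(ChangeEvent e) {
        if (e.delete()) {
            table.remove(e.key());
        } else {
            table.put(e.key(), e.value());
        }
    }

    public Map<String, String> snapshot() {
        return Map.copyOf(table);
    }

    public static void main(String[] args) {
        Queue<ChangeEvent> topic = new ArrayDeque<>();
        topic.add(new ChangeEvent("user:1", "alice", false));
        topic.add(new ChangeEvent("user:2", "bob", false));
        topic.add(new ChangeEvent("user:1", null, true)); // delete user:1

        PipelineSketch sink = new PipelineSketch();
        while (!topic.isEmpty()) {
            sink.apply(topic.poll());
        }
        System.out.println(sink.snapshot()); // only user:2 remains
    }
}
```

Because each event is applied in order, replaying the same event stream always reproduces the same table state — the property that makes this pairing of events and databases reliable at scale.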
- Experience with Agile development and excellent time management skills.

Personal Attributes:
- Strong personal ownership and the ability to set and meet high professional standards.
- Flexible and responsive work style.
- Extensive communication at both business and technical levels.
- Strong analytical and problem-solving skills.
- Team management skills, with the ability to constructively engage different personality types. Must have managed a QA team of at least 3-5 members.
- Versatility, flexibility, and willingness to work within changing priorities.
Keyskills:
real-time data | data science | project teams | time management | team management skills | reporting tools | data processing | agile development