Data Engineer

Experience: 1 to 4 Years
Job Location: Delhi
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT - Software
Functional Area: IT Operations / EDP / MIS
Employment Type: Full-time
Posted: 12 Oct 2021

Job Description

The Red Hat Marketing team is looking for a marketing-operations-focused Data Engineer to join us. In this role, you will work with your scrum team, across teams, and across departments to build, test, deploy, and maintain scalable container-based solutions. These solutions must be built for high availability and must deliver accurate data for marketing campaign planning and delivery, lead generation, and segmentation. You will be expected to stay current with industry and technology trends, write detailed technical documentation and business solution guides, participate in scrum ceremonies, troubleshoot issues, and continuously improve our marketing infrastructure. As a Data Engineer, you will work in a fast-paced environment and manage multiple projects at once that involve stakeholders across a global company. Successful applicants must reside in a country where Red Hat is registered to do business.

Primary job responsibilities

  • Identify and adopt best practices for data integrity, test design, analysis, validation, coding, and documentation
  • Develop Kafka streaming (Confluent API) microservices based on Kubernetes or OpenShift containerization (see the sketch after this list)
  • Create extract, transform, load (ETL) processes that increase business value for marketing and sales teams
  • Use vendor APIs (Eloqua, Salesforce, Adobe Analytics)
  • Provide Tier 2 investigations by working with the data anomaly team and using critical thinking to analyze and provide answers to platform questions
  • Write documentation including playbooks, solution guides, and blog posts
  • Develop automated unit tests, end-to-end tests, and integration tests to assist in quality assurance procedures
  • Implement system health monitoring, reporting, and meaningful and actionable alerts
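
The Kafka streaming and containerization responsibilities above can be illustrated with a short Python sketch using the confluent-kafka client. The broker address, topic names, consumer group, and enrichment rule below are assumptions for illustration only, not details taken from this posting.

    # Minimal sketch: consume raw lead events, enrich them, and republish.
    # All names (broker, topics, group id, enrichment rule) are hypothetical.
    import json

    from confluent_kafka import Consumer, Producer

    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # assumed broker address
        "group.id": "lead-enrichment-service",   # hypothetical consumer group
        "auto.offset.reset": "earliest",
    })
    producer = Producer({"bootstrap.servers": "localhost:9092"})
    consumer.subscribe(["raw-leads"])            # hypothetical input topic

    def enrich(lead):
        """Illustrative transform: tag each lead with a campaign segment."""
        lead["segment"] = "enterprise" if lead.get("employees", 0) > 1000 else "smb"
        return lead

    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            lead = json.loads(msg.value())
            producer.produce("enriched-leads", json.dumps(enrich(lead)).encode())
            producer.poll(0)                     # serve delivery callbacks
    finally:
        consumer.close()
        producer.flush()

Deploying such a service as a container on OpenShift or Kubernetes would typically add a Dockerfile, liveness/readiness probes, and externalized configuration.
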
Required skills
  • Bachelor's degree in computer science, engineering, information systems, or a related field, or a bachelor's degree with demonstrable coding experience
  • 3+ years of relevant work experience with software development
  • 1+ year(s) of Python programming experience
  • Record of using agile development processes (e.g., Scrum, Kanban) to develop software
  • Strong problem-solving skills, the ability to learn on your own, and comfort working with ambiguous or disparate pieces of information
  • Ability to manage multiple projects at the same time in a fast-paced team environment, across time zones and cultures, while remaining a collaborative team member
  • Desire to collaborate with others, even outside your comfort zone
  • Ability to thrive in a fast-paced environment with minimal direction, extreme pressure, and tight deadlines
  • Experience with back-end scripting languages (e.g., Python)
  • Experience writing ETL pipelines as code (see the sketch after this list)
  • Experience with event-streaming platforms (e.g., Apache Kafka and Confluent)
  • Experience with version control tools (e.g., Git)
  • Experience with Platform-as-a-Service (PaaS) technology (e.g., AWS, Google Cloud Platform, Red Hat OpenShift, etc.)
  • Experience with SOA or Microservices architecture
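
For the "ETL pipelines as code" and automated-testing requirements, a compact Python sketch might look like the following. The function names, the CSV source, and the SQLite target are assumptions chosen to keep the example self-contained; a production pipeline would typically target a warehouse and run under an orchestrator.

    # Minimal sketch of an ETL pipeline as code, with one pytest-style unit test.
    import csv
    import sqlite3
    from typing import Iterable, List, Tuple

    def extract(path: str) -> Iterable[dict]:
        """Read raw campaign rows from a CSV file (hypothetical source)."""
        with open(path, newline="") as fh:
            yield from csv.DictReader(fh)

    def transform(rows: Iterable[dict]) -> List[Tuple[str, str]]:
        """Normalize email addresses and drop rows without one."""
        return [
            (row["email"].strip().lower(), row["campaign"])
            for row in rows
            if row.get("email")
        ]

    def load(records: List[Tuple[str, str]], conn: sqlite3.Connection) -> None:
        """Write cleaned records into a target table (hypothetical schema)."""
        conn.execute("CREATE TABLE IF NOT EXISTS leads (email TEXT, campaign TEXT)")
        conn.executemany("INSERT INTO leads VALUES (?, ?)", records)
        conn.commit()

    def test_transform_normalizes_and_filters():
        rows = [{"email": " User@Example.COM ", "campaign": "q4"},
                {"email": "", "campaign": "q4"}]
        assert transform(rows) == [("user@example.com", "q4")]
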
The following are considered a plus:
  • Experience building and debugging containers (e.g., Docker)
  • Experience deploying containers in a PaaS (e.g., OpenShift, KubeNow)
  • Experience with Java
  • Experience writing complex SQL queries (e.g., MariaDB, Postgres, Redshift)
  • Experience with NoSQL databases (e.g., MongoDB, DynamoDB, Cassandra, Redis)
  • Experience with API development (a minimal sketch follows this list)
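
As a small illustration of the API-development and health-monitoring points, the sketch below exposes a health-check endpoint with Flask. The route name and the check it performs are assumptions, not requirements from the posting.

    # Minimal sketch of a health-check endpoint for a data service.
    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/healthz")
    def healthz():
        """Report liveness; a real check might also ping Kafka or the database."""
        return jsonify(status="ok"), 200

    if __name__ == "__main__":
        app.run(port=8080)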

Keyskills:
open source, Google Cloud Platform, Informatica, data warehousing, version control tools, test design, Python, open source software, Java, hybrid cloud, Tier 2, SQL, SQL queries, Apache Kafka
