Job Location | Bangalore, Chennai, Noida, Hyderabad, Pune |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | IT - Software |
Functional Area | DBA / Datawarehousing |
Employment Type | Full-time |
Data Monitoring
Job Role: Full-Time
Experience: 7 to 10 years
Job Location: Hyderabad, Pune, Noida, Chennai and Bengaluru

Requirements:
- Experience integrating different data sources with a Data Lake
- Experience in design, hands-on development, and deployment using Hadoop, Spark, Scala, Hive, Kafka, SQL, and Oozie
- Experience in optimal extraction, transformation, and loading of data from a wide variety of data sources
- Experience working with the Big Data ecosystem, including tools such as Hadoop, Spark, Scala, Hive, Kafka, SQL, and Oozie
- Develop and maintain scalable data pipelines and build out new data source integrations to support continuing increases in data volume and complexity
- Experience building and optimizing Big Data pipelines and data sets
- Extensive experience with SQL
- Experience solving streaming use cases using Spark and Kafka

Mandatory Skills: Hadoop, Spark, Scala, Hive, Kafka, SQL, Oozie
Nice to have: Python, Airflow
Keyskills: transformation, spark, kafka, hive, nice, monitoring, lake, hadoop, integration, scala, big data, sql, use cases, python, pipelines, deployment, airflow, oozie, design, data monitoring