Job Location | Chennai |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | IT - Software |
Functional Area | General / Other Software, Web / Mobile Technologies |
Employment Type | Full-time |
Strong Python development experience with data integration experience; should have worked on handling huge volumes of data. Strong communication skills, experience in Agile methodologies, ETL/ELT skills, data movement skills, and data processing skills.

Role Description:
- Strong expertise in building data services on Google Cloud Platform.
- Able to build end-to-end applications using the right choice of Big Data GCP components such as GCS, DataFlow, Dataproc, BigQuery, BigTable, and Pub/Sub, and open-source components such as MongoDB, Cassandra, Kafka, etc.
- Implement and support efficient, reliable data pipelines to move data from a wide variety of data sources to data marts / a data lake.
- Working experience in one of the RDBMS data stores such as Oracle, MySQL, or PostgreSQL, and one of the NoSQL data stores such as HBase, MongoDB, or Cassandra.
- Implement data aggregation, cleansing, and transformation layers.
- Ability to build data ingestion frameworks taking into account access patterns, scalability, response time, and availability.
- Experience in Big Data integration and stream-processing technologies using Apache Kafka, Kafka Connect (Confluent), Apache NiFi, Flume, Sqoop, Spark, and Hive.
- Experience writing Pub/Sub APIs and developing Kafka Streams, Kafka Connect, and KSQL.
- Developing new processors within Apache NiFi, and establishing new data flows / troubleshooting existing data flows to the various hardware instances associated with the different data platforms.
- Experience with serialization formats such as JSON and/or BSON.
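To make the "data aggregation, cleansing and transformation layers" requirement concrete, here is a minimal Python sketch of such layers. All function names and the sample records are hypothetical; a real pipeline for this role would read from and write to stores like GCS, BigQuery, or Kafka rather than in-memory lists.

```python
# Hypothetical sketch of cleansing + aggregation layers in a data pipeline.
from collections import defaultdict

def cleanse(records):
    """Cleansing layer: drop records missing required fields, normalize types."""
    for r in records:
        if r.get("user_id") is None or r.get("amount") is None:
            continue  # discard incomplete rows
        yield {"user_id": str(r["user_id"]), "amount": float(r["amount"])}

def aggregate(records):
    """Aggregation layer: sum amounts per user."""
    totals = defaultdict(float)
    for r in records:
        totals[r["user_id"]] += r["amount"]
    return dict(totals)

# Sample input (hypothetical); in practice this would stream from a source system.
raw = [
    {"user_id": 1, "amount": "10.5"},
    {"user_id": 1, "amount": 4.5},
    {"user_id": None, "amount": 3.0},  # dropped by the cleansing layer
    {"user_id": 2, "amount": "2.0"},
]
print(aggregate(cleanse(raw)))  # {'1': 15.0, '2': 2.0}
```

Chaining generators this way keeps each layer independently testable and lets large inputs stream through without being fully materialized, which matters for the "huge data" requirement above.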
Keyskills:
SQL, Java, data warehousing, Python, strong communication skills, big data, data marts, open source, Apache Kafka, data services, data processing, data integration, data aggregation, NiFi, Informatica, Google Cloud Platform, stream processing