Job Location | Chennai |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | Recruitment Services |
Functional Area | General / Other Software |
Employment Type | Full-time |
We are looking for a Senior NiFi Data Engineer (early joiners only) who is able to design and build solutions for one of our Fortune 500 client programs, which aims to meet data standardization and curation needs on a Hadoop cluster using Apache NiFi as the primary tool for integration and engineering. This high-visibility, fast-paced key initiative will integrate data across internal and external sources, provide analytical insights, and integrate with the customer's critical systems.

Key Responsibilities |
Design, build and unit test applications on Apache NiFi, on either the Hortonworks Data Flow (HDF) or the Cloudera Flow Management (CFM) platform |
Build NiFi data pipelines for both batch and streaming data requirements, which will require in-depth knowledge of Hive, Kafka, HDFS, Sqoop and NoSQL databases as well |
Parse hierarchical XML and JSON documents using NiFi processors and flatten them as per data modelling requirements |
Build complex transformations and expressions on FlowFile attributes and content within NiFi processors |
Design and build RESTful service pipelines on NiFi with appropriate error handling; also build NiFi pipelines that perform GET, POST, PUT and DELETE method calls against those service pipelines |
Build NiFi controller services to interact with other databases (RDBMS and NoSQL) and to retrieve credentials and Kerberos tickets |
Publish NiFi flows into NiFi Registry and promote them to higher environments |
Build NiFi orchestration flows for coordination and interaction among the various NiFi data flows |
Execute CLI commands for both the NiFi and NiFi Registry applications to perform flow executions and promotions, respectively |
Make extensive use of variables and context variables that fall through hierarchically from parent to child NiFi flows |
Optimize performance of the built NiFi pipelines in Hadoop by tuning the configurations of the different processors |
Optimize performance for data access requirements by choosing the appropriate native Hadoop file format (Avro, Parquet, ORC etc.) and compression codec when writing into HDFS |
Analyze and process large amounts of structured and unstructured data, including integrating data from multiple heterogeneous sources |
Create and maintain integration and regression testing pipelines on Jenkins, integrated with Bitbucket and/or Git repositories |
Participate in the agile development process, and document and communicate issues and bugs relative to data standards in scrum meetings |
Work collaboratively in an onsite and offshore team model |
Develop and review technical documentation for delivered artifacts |
Ability to solve complex data-driven scenarios and triage defects and production issues |
Ability to learn, unlearn and relearn concepts with an open and analytical mindset |
Participate in code releases and production deployments |
Challenge, inspire and mentor team members to achieve business results in a fast-paced and quickly changing environment |

Requirements |
Bachelor's degree with a minimum of 5 years of IT experience, or, in lieu of a degree, a High School Diploma/GED |
Minimum 3 years of hands-on experience in NiFi, HDFS, Kafka, Hive, Sqoop and shell scripting |
Good knowledge of data transformation, service handling, error handling and the processors available in NiFi |
Able to transform heterogeneous data formats (JSON, XML, Avro, Parquet, ORC) |
Good knowledge of batch and streaming data processing through NiFi |
End-to-end, thorough understanding of the FDLC |
Implement best practices for NiFi flow development |
Good knowledge of developing custom NiFi processors |
Knowledge of improving flow standards with exception and error handling and zero data loss |
Excellent communication skills |
Solution-oriented and problem-solving skills |
Strong teamwork and interpersonal skills |

Experience | 5+ Years |
Location | Pune, Hyderabad, Chennai, OR Remote (India) |
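Within NiFi, flattening hierarchical JSON is typically handled by processors such as FlattenJson or JoltTransformJSON. As a rough, hypothetical sketch of the underlying idea only (the function name, separator, and sample record below are our own, not NiFi's), flattening a nested document into dot-separated keys looks like:

```python
import json

def flatten(obj, parent_key="", sep="."):
    """Recursively flatten nested dicts/lists into a single-level
    dict with dot-separated keys (list elements are indexed numerically)."""
    items = {}
    if isinstance(obj, dict):
        for k, v in obj.items():
            key = f"{parent_key}{sep}{k}" if parent_key else k
            items.update(flatten(v, key, sep))
    elif isinstance(obj, list):
        for i, v in enumerate(obj):
            key = f"{parent_key}{sep}{i}" if parent_key else str(i)
            items.update(flatten(v, key, sep))
    else:
        # Leaf value: record it under the accumulated key path
        items[parent_key] = obj
    return items

# Hypothetical hierarchical record, flattened for a tabular data model
record = json.loads('{"customer": {"id": 42, "orders": [{"sku": "A1"}, {"sku": "B2"}]}}')
flat = flatten(record)
# flat == {"customer.id": 42, "customer.orders.0.sku": "A1", "customer.orders.1.sku": "B2"}
```

A NiFi flow would apply the equivalent transformation per FlowFile, then route the flattened content onward for modelling or storage.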
Keyskills | problem solving, java, data flow, data standards, unstructured data, fortune 500, agile development, python, data modeling, data processing, data warehousing, informatica, sql, regression testing, data transformation |