Job Location | Chennai, Pune |
Education | Not Mentioned |
Salary | Undefined |
Industry | IT, Computers - Software |
Functional Area | IT |
Employment Type | Full-time |
- Total of 8-12 years of experience in BI/DW, with at least 4-6 years of experience in Big Data implementations.
- Understand business requirements and convert them into solution designs.
- Architecture, design and development of a Big Data data lake platform.
- Understand the functional and non-functional requirements in the solution and mentor the team with technological expertise and decisions.
- Produce a detailed functional design document to match customer requirements.
- Responsible for preparing, reviewing and owning technical design documentation.
- Conduct code reviews and prepare documents for Big Data applications according to system standards.
- Conduct peer reviews to ensure consistency, completeness and accuracy of the delivery.
- Detect, analyse and remediate performance problems.
- Evaluate and recommend software and hardware solutions to meet user needs.
- Responsible for project support, mentoring and training for transition to the support team.
- Share best practices and be consultative to clients throughout the duration of the project.
- Hands-on experience working with Hadoop distribution platforms such as Hortonworks, Cloudera, MapR and others.
- Take end-to-end responsibility for the Hadoop life cycle in the organization.
- Be the bridge between data scientists, engineers and the organization's needs.
- Do in-depth requirement analysis and exclusively choose the work platform.
- Full knowledge of Hadoop architecture and HDFS is a must.
- Working knowledge of MapReduce, HBase, Pig, MongoDB, Cassandra, Impala, Oozie, Mahout, Flume, Zookeeper, Sqoop and Hive.
- In addition to the above technologies, understanding of major programming/scripting languages like Java, Linux shell, PHP, Ruby, Python and/or R.
- He or she should have experience in designing solutions for multiple large data warehouses, with a good understanding of cluster and parallel architecture as well as high-scale or distributed RDBMS, and/or knowledge of NoSQL platforms.
- Must have a minimum of 3 years of hands-on experience in one of the Big Data technologies (i.e. Apache Hadoop, HDP, Cloudera, MapR): MapReduce, HDFS, Hive, HBase, Impala, Pig, Tez, Oozie, Sqoop.
- Hands-on experience in designing and developing BI applications.
- Excellent knowledge of relational and NoSQL/document databases, data lakes and cloud storage.
- Expertise in various connectors and pipelines for batch and real-time data collection/delivery.
- Experience in integrating with on-premises and public/private cloud platforms.
- Good knowledge of handling and implementing secure data collection/processing/delivery.
- Desirable: knowledge of Hadoop ecosystem components like Kafka, Spark, Solr and Atlas.
Keyskills :
Hadoop
www.pioneerfinance.co.in