Job Location | Bangalore, Chennai, Hyderabad, Mumbai City |
Education | Not Mentioned |
Salary | Rs 50,000 - 3.0 Lakh/Yr |
Industry | Banking / Financial Services |
Functional Area | General / Other Software |
Employment Type | Full-time |
- Strong expertise in Extraction, Transformation and Loading (ETL) using Informatica Big Data Management 10.2.x, including pushdown modes with the Spark, Blaze and Hive execution engines.
- Strong expertise in dynamic mapping use cases, development and deployment using Informatica Big Data Management 10.2.x.
- Experience transforming and loading complex data source types, such as unstructured and NoSQL data sources.
- Strong expertise in the Hive database, including Hive DDL, partitioning and Hive Query Language.
- Good understanding of the Hadoop ecosystem (HDFS, Spark, Hive).
- Strong expertise in SQL/PLSQL.
- Good knowledge of working with Oracle/Sybase/SQL databases.
- Good knowledge of data lake and dimensional data modeling implementation.
- Able to understand requirements and write Functional Specification Documents, Design Documents and Mapping Specifications.
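To illustrate the extract-transform-load pattern and the partition-key derivation that the requirements above describe, here is a minimal, hedged sketch in Python. The table, column names and sample records are invented for illustration, and the stdlib sqlite3 module stands in for the Oracle/Sybase/Hive targets named in the posting; in Hive, `part_month` would be declared with `PARTITIONED BY` rather than as an ordinary column.

```python
import sqlite3

def extract():
    # Extract: raw records, e.g. pulled from an unstructured/NoSQL source.
    # These rows are hypothetical sample data.
    return [
        {"txn_id": 1, "amount": "1500.50", "txn_date": "2023-01-15"},
        {"txn_id": 2, "amount": "99.99", "txn_date": "2023-02-03"},
    ]

def transform(rows):
    # Transform: cast types and derive a partition key (year-month),
    # mirroring Hive-style partitioning by a derived column.
    return [
        (r["txn_id"], float(r["amount"]), r["txn_date"], r["txn_date"][:7])
        for r in rows
    ]

def load(rows, conn):
    # Load: DDL plus bulk insert into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS txns ("
        "txn_id INTEGER PRIMARY KEY, amount REAL, "
        "txn_date TEXT, part_month TEXT)"
    )
    conn.executemany("INSERT INTO txns VALUES (?, ?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM txns").fetchone()[0]
```

A real Informatica BDM mapping would express the transform step declaratively and push execution down to Spark, Blaze or Hive; the three-function split here only sketches the same pipeline shape.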
Keyskills :
big data, use case, data modeling, data management, unstructured data, commercial models, sql, eco, ddl, hive, lake, spark, blaze, hadoop, design, database, mechanism, partition, management, deployment