Job Location: Kolkata
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT - Software
Functional Area: General / Other Software
Employment Type: Full-time
Key Skills:
- 12+ years of experience with hands-on work on ETL tools, including a minimum of 6 years with Ab Initio toolsets
- Deep understanding of Co>Operating System concepts and the GDE (Graphical Development Environment)
- Able to design end-to-end solutions for complex case studies in Ab Initio
- Enterprise Meta Environment (EME) version control, dependency analysis, m commands, air commands, sandboxes, PDLs, and metaprogramming
- Ability to lead and architect a customized solution to reverse engineer an Ab Initio workflow into a data flow diagram
- Use of all of the above components to create and maintain an enterprise data solution and rules execution
- Hands-on experience with SQL, shell scripting (bash, Korn shell), SQL performance tuning, relational model analysis, and data migration
- In-depth understanding of the Ab Initio GDE; able to troubleshoot Ab Initio problems and errors; has implemented real-time and batch solutions processing large volumes of data
- Working knowledge of SQL databases such as Teradata and Oracle, and exposure to big data tools such as Hadoop, Hive, and Spark SQL
- Exposure to Data Quality (especially data profiling) and MDM (especially matching algorithms)
- Experience executing data migration projects and knowledge of data reconciliation
- Good experience understanding business requirements and translating them into schema requirements; good understanding of data profiling, metadata, ETL, and reporting
- Ability to solve complex data problems in the areas of MDM, Data Quality, and integration/migration
- Prepare PoVs on new tools or new versions of existing tools, and write whitepapers on upcoming trends
- Analyze enterprise ETL architecture in a data warehouse context, perform gap analysis, and provide technical evaluation, architecture, and design of scalable, large-scale ETL solutions in a multi-platform environment

Responsibilities:
- Guide and manage development teams on optimal design and solution architecture
- Analyze the existing enterprise architecture and perform architecture design reviews
- Assess the need for PoCs based on the strategic roadmap laid out by the enterprise architects
- Review design artifacts and code created by developers and senior developers to ensure they meet architectural requirements, quality standards, and delivery timelines
- Take end-to-end responsibility for designing and architecting technology solutions
- Develop architecture and coding standards, review processes, and project-specific templates
- Participate in discussions for client proposals
- Own offshore delivery, project planning, stakeholder management, and status reporting to senior leadership teams
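The shell-scripting requirement (bash/Korn) in this role typically covers tasks such as validating feed files before an ETL load. A minimal illustrative sketch, assuming a hypothetical feed file (the function name and file are examples, not from the posting):

```shell
#!/usr/bin/env bash
# Hypothetical pre-load check for an ETL feed file:
# verify the file exists and is non-empty, then report its record count.
check_feed() {
    local feed_file="$1"
    if [ ! -s "$feed_file" ]; then
        echo "ERROR: $feed_file missing or empty" >&2
        return 1
    fi
    local record_count
    record_count=$(wc -l < "$feed_file")
    echo "OK: $feed_file contains $record_count records"
}
```

In practice, a check like this would run as a pre-condition step in the batch schedule, failing the job early rather than loading an empty or missing feed.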
Keyskills: site, tender, plumbing, drawing, customer relations, source system analysis, big data, Ab Initio, data flow, ETL tools, data quality, gap analysis, case studies, data migration