hireejobs

Senior Hadoop Developer

4 to 7 Years   |   Bangalore   |   30 Oct, 2019

Job Location: Bangalore
Education: Not Mentioned
Salary: Not Disclosed
Industry: Banking / Financial Services
Functional Area: General / Other Software
Employment Type: Full-time

Job Description

About J.P. Morgan: J.P. Morgan is a leader in financial services, working in collaboration across the globe to deliver the best solutions and advice to meet our clients' needs, anywhere in the world. We operate in 150 countries and hold leadership positions across our businesses. We have an exceptional team of employees who work hard to do the right thing for our clients and the firm, every day. This is why we are one of the most respected financial institutions in the world and why we can offer you an outstanding career.

Domain: JPMC has mobilized a global Liquidity Risk Infrastructure (LRI) program, a firm-wide, mission-critical effort to enhance our liquidity risk management monitoring and reporting capabilities, including a redesign of the related supporting technology and infrastructure. The LRI program will implement a world-class liquidity risk reporting system complying with recently mandated Federal and Basel regulations. It is a multi-year infrastructure initiative supporting the Firm's enhanced liquidity risk management processes, addressing both external and internal requirements, and includes strategic data sourcing from transaction processing systems, data enrichment, and analytical, monitoring, and reporting capabilities.

The target platform must process 40-60 million transactions and positions daily, calculate the risk presented by both the current actual state of the market and model-based what-if states, build a multidimensional picture of the corporate risk profile, and provide the ability to analyze it in real time. The target LRI platform will utilize modern in-memory and non-relational data management principles and state-of-the-art methods of user interaction, and must scale both horizontally and vertically.

Key Responsibilities include:

  • Component software design and development.
  • Ensuring excellent practices are used in delivering Big Data management and integration solutions.
  • Participating in agile development projects.
  • Acting as a role model for best practices, ensuring consistency across the entire team.
  • Developing solutions for the new Hadoop Big Data platform.
  • Being a hands-on developer passionate about building high-quality applications.
  • Displaying an efficient work style with attention to detail, organization, and a strong sense of urgency.
  • Designing software and producing scalable and resilient technical designs.
  • Creating automated unit tests using flexible/open-source frameworks with a Test-Driven Development / Behavior-Driven Development approach.
  • Digesting and understanding business requirements and designing new modules/functionality that meet those needs.
  • Utilizing agile methodologies and adhering to coding standards, procedures, and techniques while contributing to the technical code documentation.
  • Driving and contributing to the continuous improvement of the team.
  • Participating in design reviews and providing input to design recommendations.
  • Researching and evaluating solutions and making recommendations to solve business and technology problems.
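The Test-Driven Development responsibility above can be illustrated with a minimal red-green sketch in plain Java. The `MaturityBucketer` class and its bucket boundaries are hypothetical examples, not part of the actual role, and no test framework is assumed:

```java
// Hypothetical TDD example: the checks in main() would be written
// first (failing), then bucket() implemented to make them pass.
public class MaturityBucketer {

    // Classify a cashflow by days to maturity, as a liquidity
    // report might; the bucket boundaries here are illustrative only.
    static String bucket(int daysToMaturity) {
        if (daysToMaturity <= 30) return "0-30d";
        if (daysToMaturity <= 90) return "31-90d";
        return ">90d";
    }

    public static void main(String[] args) {
        // Plain-assert stand-ins for JUnit/ScalaTest test cases.
        check(bucket(7).equals("0-30d"));
        check(bucket(45).equals("31-90d"));
        check(bucket(365).equals(">90d"));
        System.out.println("all checks passed");
    }

    static void check(boolean ok) {
        if (!ok) throw new AssertionError("test failed");
    }
}
```

In a real codebase the checks would live in a test framework rather than `main`, but the cycle is the same: write the failing assertion, then the smallest implementation that passes it.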
Essentials:
  • The candidate should have 14+ years of overall IT industry experience, with a minimum of 4 years of hands-on experience on the Hadoop platform.
  • Good communication and influencing skills are required for the VP role.
  • Solid experience in Core Java / Scala programming, with good knowledge of design patterns.
  • Building projects using Maven, Gradle, or sbt.
  • Good hands-on experience in Hadoop and at least 4 of the technologies below, with an enthusiasm to learn the others:
    • Spark programming (Core, SQL & Streaming)
    • Good understanding of Parquet and other file formats
    • Impala & Hive
    • Kafka real-time messaging
    • HBase modeling and development
    • Sqoop data ingestion
  • Exposure to / competence with the Agile development approach.
  • Solid experience using source code control software (e.g. Git, Subversion).
  • Test-Driven Development / Behavior-Driven Development using appropriate frameworks.
    • A test-infected attitude (a strong desire to perform thorough and exhaustive unit, integration, and system testing).
    • Being a self-starter, able to reach out to various groups to drive requirements to completion.
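Several of the skills above (Spark, Hive, MapReduce) center on keyed aggregation over large record sets. As a rough sketch of that map/reduce idea in plain Java streams — with hypothetical field names and no Hadoop or Spark dependency assumed:

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class PositionAggregator {

    // A toy position record; the field names are illustrative only.
    record Position(String currency, double notional) {}

    // Group positions by currency and sum notionals -- conceptually
    // the same shuffle-and-reduce step Spark or MapReduce performs
    // across a cluster on tens of millions of records.
    static Map<String, Double> totalByCurrency(List<Position> positions) {
        return positions.stream().collect(
            Collectors.groupingBy(
                Position::currency,
                Collectors.summingDouble(Position::notional)));
    }

    public static void main(String[] args) {
        var totals = totalByCurrency(List.of(
            new Position("USD", 100.0),
            new Position("USD", 250.0),
            new Position("EUR", 80.0)));
        System.out.println(totals); // e.g. {EUR=80.0, USD=350.0}
    }
}
```

At the scale the posting describes, the same grouping would be expressed as a Spark `groupBy`/`agg` over a partitioned dataset rather than an in-memory stream; the sketch only shows the aggregation shape.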
Desired Skills:
  • Experience with cloud-native application development, cloud deployment, and cloud application refactoring.
  • Tableau or other reporting technologies.
  • Working experience with big data technologies such as MapReduce.
  • Strong working knowledge of Oracle RDBMS.
  • Experience with Linux shell scripts is nice to have.
  • Experience with data management is an added advantage.

Keyskills :
SQL, software design, user interaction, data enrichment, corporate risk, risk management, data management, coding standards, test-driven development


© 2019 Hireejobs All Rights Reserved