Senior Analyst Programmer - Data Engineer

4 to 6 Years | Gurugram | 03 Nov, 2021
Job Location: Gurugram
Education: Not Mentioned
Salary: Not Disclosed
Industry: Banking / Financial Services
Functional Area: General / Other Software, DBA / Datawarehousing
Employment Type: Full-time

Job Description

About the opportunity

Purpose of the Role

As a Cloud Data Engineer, you will play a key developer and/or tester role on a global programme, working with senior business leaders, product owners and technology teams within Fidelity International to deliver a new technology platform supporting Fidelity International's ISS strategy. Working as a practitioner/engineer in the data landscape, you will use your experience of Cloud application and/or infrastructure engineering to assist with the engineering aspects, design, definition, exploration and delivery of an end-to-end solution that serves a scaling global Investment Management business. Your work outcomes should meet all business and operational readiness requirements to the highest standards of quality and responsiveness, and all solution implementations should be highly compliant with leading-edge DevOps practices in the industry.

Key Responsibilities

Developer with hands-on experience building end-to-end applications, including designing and developing data extraction/loading, processing, integration, quality layers and UI.

  • Write code for delivering functional stories, test cases, infra automation scripts, security scripts, monitoring tools and other related use cases.
  • Deep expertise in some of Java/Python, React/Angular/Node.js and Oracle/SQL Server, with strong analytical skills for working with unstructured datasets.
  • Knowledge & practical experience building applications using Amazon Web Services (AWS), or other public Cloud platforms such as GCP/Azure.
  • Deep knowledge of AWS and its various services, primarily EC2, VPC, IAM, serverless offerings, RDS, Route 53 and CloudFront.
  • Deep knowledge of UNIX system architecture.
  • Strong grasp of core networking concepts.
  • Deep knowledge and understanding of serverless architecture and the AWS offerings in this area.
  • Strong, comfortable command of Terraform and/or CloudFormation core concepts, with hands-on authoring experience.
  • Hands-on experience with Unix scripting and Python.
  • Expertise with CI/CD pipelines and a few of the DevOps tools such as Jenkins/Ansible; understanding of containers (Docker/Kubernetes) and Cloud build/deploy.
  • Build & optimize data pipelines, architecture & data sets supporting data transformation, data structures, metadata, dependency and workload management (a minimal sketch follows this list).
  • Working knowledge of APIs, caching and messaging.
  • Experience in software delivery using agile methodologies, with TDD & pair-programming best practices to ensure quality-certified deliverables.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and continuously identify opportunities for improvement.
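To make the data-pipeline point above concrete, here is a minimal extract-transform-load sketch in Python. It is illustrative only, not part of the role description: the bucket, object key, column and table names are all hypothetical, and SQLite stands in for the target store.

```python
# Minimal ETL sketch: pull a CSV from S3, clean it, load it into a table.
# All names (bucket, key, columns, table) are hypothetical placeholders.
import sqlite3

import boto3
import pandas as pd

BUCKET = "example-market-data"              # hypothetical bucket
KEY = "prices/2021-11-03/eod_prices.csv"    # hypothetical object key


def extract(bucket: str, key: str) -> pd.DataFrame:
    """Read a CSV object from S3 into a DataFrame."""
    s3 = boto3.client("s3")
    obj = s3.get_object(Bucket=bucket, Key=key)
    return pd.read_csv(obj["Body"])


def transform(df: pd.DataFrame) -> pd.DataFrame:
    """Normalise column names and drop rows with no closing price."""
    df = df.rename(columns=str.lower)
    return df.dropna(subset=["close_price"])


def load(df: pd.DataFrame, conn: sqlite3.Connection) -> None:
    """Append the cleaned rows to the target table."""
    df.to_sql("eod_prices", conn, if_exists="append", index=False)


if __name__ == "__main__":
    frame = transform(extract(BUCKET, KEY))
    load(frame, sqlite3.connect("warehouse.db"))
```

In practice each stage would be parameterised and orchestrated (for example via a scheduler or CI/CD pipeline) rather than run as a single script.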
Experience delivering on data-related Non-Functional Requirements, such as:
  • Hands-on experience dealing with large volumes of historical data across markets/geographies.
  • Manipulating, processing and extracting value from large, disconnected datasets.
  • Building water-tight data quality gates on investment management data (see the sketch after this list).
  • Generic handling of standard business scenarios such as missing data, holidays, out-of-tolerance errors, etc.
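As an illustration of the kind of data quality gate referred to above, the following hedged Python sketch checks completeness and an out-of-tolerance bound on a batch of holdings data. The thresholds and column names (market_value, daily_return) are hypothetical assumptions, not requirements from this posting.

```python
# Illustrative data quality gate: reject a batch when completeness or
# tolerance checks fail. Thresholds and column names are hypothetical.
import pandas as pd


def quality_gate(df: pd.DataFrame, max_null_ratio: float = 0.01,
                 max_abs_return: float = 0.25) -> list[str]:
    """Return a list of failure messages; an empty list means the batch passes."""
    failures = []

    # Completeness: no more than 1% missing market values.
    null_ratio = df["market_value"].isna().mean()
    if null_ratio > max_null_ratio:
        failures.append(f"market_value null ratio {null_ratio:.2%} exceeds threshold")

    # Tolerance: flag implausible day-on-day returns.
    breaches = int((df["daily_return"].abs() > max_abs_return).sum())
    if breaches:
        failures.append(f"{breaches} rows breach the {max_abs_return:.0%} return tolerance")

    return failures


if __name__ == "__main__":
    batch = pd.DataFrame({"market_value": [100.0, None, 250.0],
                          "daily_return": [0.01, -0.02, 0.40]})
    for message in quality_gate(batch):
        print("GATE FAILED:", message)
```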
Good to have knowledge / past experience of:
  • Message queuing, stream processing, and highly scalable data stores on Cloud.
  • eXtreme Programming, pairing, mobbing & other collaborative development practices.
  • Snowflake: writing SQL queries against Snowflake, developing scripts (Unix, Python, etc.) to Extract, Load and Transform, and Snowpipe for bulk distribution (see the sketch after this list).
  • Big data stack, either on Cloud or on-prem; data analytics & data science / machine learning / quantitative implementation.
  • Functional understanding of Capital Markets & investment data.
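For the Snowflake point above, here is a hedged sketch of an Extract-Load-Transform script using the snowflake-connector-python package. The account credentials, warehouse, stage and table names are placeholders rather than anything specified in this posting.

```python
# Hedged ELT sketch against Snowflake; all connection parameters and
# object names are placeholders.
import os

import snowflake.connector


def run_elt() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],   # placeholder credentials
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",                  # hypothetical warehouse
        database="MARKET_DATA",                    # hypothetical database
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Load: copy staged files into the raw table (stage name is illustrative).
        cur.execute(
            "COPY INTO raw_prices FROM @prices_stage "
            "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
        )
        # Transform: materialise a cleaned table of the latest prices.
        cur.execute(
            "CREATE OR REPLACE TABLE curated.daily_prices AS "
            "SELECT symbol, price_date, close_price "
            "FROM raw_prices WHERE close_price IS NOT NULL"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    run_elt()
```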
Behavioural
  • Learnability - ability to collectively push the boundary & pioneer the adoption and industrialisation of emerging data technologies in the organisation; passion for growing your skills and tackling challenging problems.
  • Self-motivated to rapidly pick up new skills and work directly with senior techies and fellow technology teams in a cordial environment.
  • Fungibility - ability to flex into different roles as per project demand & willingness to move to new roles.
  • Ready to give and receive feedback in a candid way
Responsibilities

This position requires a strong self-starter with a solid technical engineering background and influencing skills; someone who loves using technology & engineering to solve business problems and can assist colleagues with design, best practices, troubleshooting and other technical challenges related to the implementation of a critical business / customer-facing proposition.
  • Create and maintain optimal data pipeline design & code. Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Working with product owners & business stakeholders, identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL & AWS technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Ensure delivery in a timely, efficient and cost-effective manner without compromising quality.
  • Work with support teams to assist with data-related technical issues and support their data infrastructure needs.
Experience and Qualifications Required
  • 4 - 6 years
  • A graduate or Postgraduate in Computer Science, Mathematics, Statistics, Finance, Economics or any Science background from a reputed University.
About you

About Fidelity International

Fidelity International offers investment solutions and services and retirement expertise to more than 2.5 million customers globally. As a privately held, purpose-driven company with a 50-year heritage, we think generationally and invest for the long term. Operating in more than 25 countries and with $739.9 billion in total assets, our clients range from central banks, sovereign wealth funds, large corporates, financial institutions, insurers and wealth managers, to private individuals.

Our Workplace & Personal Financial Health business provides individuals, advisers and employers with access to world-class investment choices, third-party solutions, administration services and pension guidance. Together with our Investment Solutions & Services business, we invest $567 billion on behalf of our clients. By combining our asset management expertise with our solutions for workplace and personal investing, we work together to build better financial futures.

Our clients come from all walks of life and so do we. We are proud of our inclusive culture and encourage applications from the widest mix of talent, whatever your age, gender, ethnicity, sexual orientation, gender identity, social background and more.

As a flexible employer, we trust our people to perform their role in the way that works best for them, our clients and our business. We are a disability-friendly company and would welcome a conversation with you if you feel you might benefit from any reasonable adjustments to perform to the best of your ability during the recruitment process and beyond.

Data as at 31 March 2021. Read more at https://www.fidelityinternational.com/

Applying to this Job Role: Please note you are only required to upload your CV/Resume to the application screen.

Keyskills :
root cause, mvc, microsoft office, unix scripting, amazon web services, data analytics, pipeline design, data structures, root cause analysis, web services, sql queries, data quality, test cases, global investment management

