Senior Data Engineer

7 to 10 Years   Hyderabad   01 Apr, 2021
Job Location: Hyderabad
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT - Hardware / Networking
Functional Area: General / Other Software
Employment Type: Full-time

Job Description

Please Note:
1. If you are a first-time user, please create your candidate login account before you apply for a job. (Click Sign In > Create Account)
2. If you already have a Candidate Account, please Sign In before you apply.

As a SaaS Database Engineer, you will be responsible for making architectural decisions for databases, designing database solutions, developing robust database HA models, bringing database expertise to the resolution of complex database incidents, optimizing performance, maintaining defined SLAs, and proactively proposing strategic solutions to recurring problems.

About the role

  • Perform core database administration functions: provisioning, configuration optimization, database security, data protection, data archival, networking, architecture, storage, and capacity planning.
  • Design database solutions and propose the optimal database technology based on application/business requirements and workload characteristics.
  • Design database high-availability architectures (Active/Passive, Active/Active, sharding, distributed databases) to maintain a 99.99% availability SLA.
  • Tune database performance: identify bottlenecks and tune queries to minimize response times and meet performance SLAs per business requirements, in close collaboration with Product Architects.
  • Troubleshoot and diagnose complex database issues, perform root cause analysis, propose fixes, identify resolutions, and determine failure trends and mitigation strategies.
  • Automate processes and routine DB maintenance tasks with advanced scripting using Ansible, shell scripts, Python, etc.
  • Harden and secure databases.
  • Evaluate newer database technologies, upgrades, hotfixes, and patches.
  • Plan database capacity: data lifecycle management, growth estimation, and scaling infrastructure (horizontally/vertically) to meet business demands.
  • Design custom monitoring solutions using Prometheus, Grafana, and other monitoring tools.
  • Work with one or more cloud technologies, including Google Cloud (preferred), Azure, and AWS, and manage databases in both private and public cloud environments.
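As a quick worked example of what the 99.99% availability SLA above implies in practice (the helper function and figures below are an illustrative sketch, not part of the posting):

```python
# Downtime budget implied by an availability SLA.
# A 99.99% ("four nines") SLA allows only about 52.6 minutes of
# downtime per year, which is why HA designs such as Active/Active
# replication and automated failover matter.

def downtime_minutes(availability: float, period_days: float) -> float:
    """Minutes of allowed downtime for a given availability over a period."""
    return (1 - availability) * period_days * 24 * 60

yearly = downtime_minutes(0.9999, 365.25)   # ~52.6 minutes per year
monthly = downtime_minutes(0.9999, 30)      # ~4.3 minutes per 30-day month
print(round(yearly, 1), round(monthly, 2))
```

The monthly figure makes the constraint concrete: a single unplanned failover that takes five minutes would already exhaust a month's budget.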
How You'll Stand Out:
  • Minimum 7-10 years of solid experience in various database technologies
  • Proficient in MongoDB, with a wide range of knowledge including:
    • Deployment and operational expertise with single- and multi-datacenter HA solutions: MongoDB replica sets and sharding
    • Maintaining very large clusters with multiple shards (100 nodes); knowledge of shard keys and of hashed and ranged sharding
    • Knowledge of the MongoDB Query Language, storage engines (esp. WiredTiger), CRUD operations, and aggregations
    • Experience diagnosing and resolving a variety of issues and upgrading to later versions
    • Proposing best practices for schema design and backup & recovery strategies
    • Tuning MongoDB to achieve high performance, read consistency, and write durability
    • Experience assessing the performance of locking in transactions, tuning memory and cache, connection handling, and WiredTiger configuration and tuning
    • Experience using database profilers and sampling slow operations
    • Knowledge of creating chunks in a sharded cluster (if required) and splitting/merging chunks
    • Expertise in resolving replication gaps, fixing issues with replica sets, and improving replication performance
    • MongoDB Enterprise tools: Ops Manager, etc.
  • Proficient in PostgreSQL, with deep knowledge of and experience in:
    • Providing PostgreSQL database architecture solutions
    • Designing PostgreSQL HA models: streaming replication, logical replication, Patroni, Slony, etcd, BDR, pglogical (experience with tools like Patroni is a huge plus)
    • Supporting PostgreSQL databases in a high-volume, customer-facing environment
    • Deep understanding of SQL query execution, execution plan analysis and optimization, and index tuning strategies
    • Advanced practical knowledge of database monitoring, diagnosis, and tuning for high performance
    • Implementing standard methodologies for DR and backups
    • Upgrading/patching, monitoring, and troubleshooting
    • Measuring and optimizing system performance: capacity planning and forecast management
    • Experience with Oracle-to-PostgreSQL migration is an added advantage
  • Experience with other database technologies is a huge plus: MySQL, Cassandra, Elasticsearch, Oracle
  • A passion for automation to reduce repeatable, mundane day-to-day tasks and improve consistency of results
  • Expertise in performance tuning and SQL tuning
    • Performance tuning and SQL tuning for production MongoDB and PostgreSQL databases
    • PSR (performance and stress) testing and benchmarking using tools like Swingbench, sysbench, etc.
  • Data Lifecycle Management, Data Archival, Data Redaction
  • Experience with any combination of monitoring tools: PMM (Percona), OEM, Atlas, MongoDB Cloud Manager, Prometheus, Grafana, Alertmanager
  • Experience with migrations between database platforms (Oracle to PostgreSQL, MS SQL to PostgreSQL, Aurora to MySQL, etc.) is preferred
  • Experience migrating large data sets and production workloads from AWS/Azure to Google Cloud and from private cloud to public cloud
  • Perform migrations between database platforms (e.g., Oracle to PostgreSQL) and from colo to cloud with zero to minimal downtime
  • Experience with Kubernetes, OpenShift, Google Kubernetes Engine is an added advantage
  • Experience with running large databases on containers within Kubernetes/GKE cluster
  • DevOps experience with technologies such as Docker, containers, Jenkins, CI/CD pipelines, and GitOps
  • Participate in new technology/feature evaluation, design, and development of highly scalable distributed databases
  • Ability to run performance benchmarks between two versions of the same database technology or between two different database technologies
  • Aptitude to independently learn new technologies
  • Experience with documentation of standard procedures, architecture, design and deployments
  • Ability to thrive in a fast-paced, tight deadline delivery environment
  • Strong communication skills and ability to work effectively across multiple business and technical teams
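The hashed vs. ranged sharding distinction called out in the MongoDB requirements above can be illustrated with a dependency-free toy simulation (the 4-chunk setup and md5 hash are illustrative assumptions, not MongoDB internals):

```python
import hashlib
from collections import Counter

# Toy illustration of hashed vs. ranged sharding for a monotonically
# increasing shard key (e.g., an auto-incrementing _id). With ranged
# sharding, every new key lands in the last chunk, creating a hot shard;
# hashing the key spreads inserts roughly evenly across chunks.

NUM_CHUNKS = 4
keys = range(10_000)  # monotonically increasing insert keys

def ranged_chunk(key: int) -> int:
    # Fixed range boundaries chosen before the workload grew past 1000.
    boundaries = [250, 500, 750]  # chunks: <250, <500, <750, >=750
    for i, b in enumerate(boundaries):
        if key < b:
            return i
    return len(boundaries)

def hashed_chunk(key: int) -> int:
    digest = hashlib.md5(str(key).encode()).hexdigest()
    return int(digest, 16) % NUM_CHUNKS

ranged = Counter(ranged_chunk(k) for k in keys)
hashed = Counter(hashed_chunk(k) for k in keys)
print("ranged:", dict(ranged))  # one chunk absorbs nearly all inserts
print("hashed:", dict(hashed))  # inserts spread roughly evenly
```

This is the balance-vs-locality trade-off a shard key design has to weigh: hashed keys even out write load, while ranged keys keep adjacent values together for range queries.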
Broadcom is proud to be an equal opportunity employer. We will consider qualified applicants without regard to race, color, creed, religion, sex, sexual orientation, gender identity, national origin, citizenship, disability status, medical condition, pregnancy, protected veteran status, or any other characteristic protected by federal, state, or local law. We will also consider qualified applicants with arrest and conviction records consistent with local law. If you are located outside the USA, please be sure to fill out a home address, as this will be used for future correspondence.

Keyskills:
root cause analysis, root cause, apache, ms sql, environmental impact assessment, sql tuning, content management, high availability architecture, private cloud, sql, query tuning, email, analytics


© 2019 Hireejobs All Rights Reserved