hireejobs

GCP Data Engineer

Fresher | Navi Mumbai, All India | 03 Apr, 2026
Job Location: Navi Mumbai, All India
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT Services & Consulting
Functional Area: Not Mentioned
Employment Type: Full-time

Job Description

Role Overview:
As a GCP Data Engineer at our company, you will be responsible for designing, building, and maintaining scalable data pipelines using Python, SQL, BigQuery, and other big data technologies. You will collaborate with data scientists and ML engineers to deploy services encapsulating DS/ML models at scale and implement data algorithm pipelines for efficient processing. Additionally, you will architect and manage robust data models and infrastructure on Google Cloud Platform (GCP), ensuring best practices for data quality, reliability, and performance for petabyte-scale datasets.

Key Responsibilities:
- Design, build, and maintain scalable, event-driven, and batch data pipelines using Python, SQL, BigQuery, and big data technologies (Spark, Dataflow, etc.).
- Partner with data scientists and ML engineers to build, deploy, and evolve services encapsulating DS/ML models at scale.
- Design and implement data algorithm pipelines to efficiently and rapidly process millions of products and requests.
- Architect, build, and manage robust data models and infrastructure on Google Cloud Platform (GCP), with a focus on BigQuery, Dataflow, Pub/Sub, and GKE.
- Build and manage the CI/CD, orchestration (Kubernetes, Airflow), and infrastructure for ML systems.
- Work cross-functionally with product management, data scientists, and analysts to understand problems and design solutions.
- Champion and implement best practices for data quality, reliability, and performance for petabyte-scale datasets.
- Implement and advocate for engineering best practices, helping level up other engineers on the team.

Qualifications Required:
- 7 to 15 years of hands-on experience in data engineering or backend software engineering.
- Expertise in Python for data processing (e.g., Pandas, Spark) and/or backend development.
- Strong, advanced SQL skills and extensive experience with cloud data warehouses like BigQuery.
- Proven, deep hands-on experience designing and operating scalable data solutions on Google Cloud Platform (GCP), with expertise in services like BigQuery, Dataflow, Pub/Sub, GCS, Dataproc, and Vertex AI.
- Experience with big data technologies (Spark, Hadoop, MapReduce) and messaging systems (Kafka, Pub/Sub).
- Production experience with containerized development (Docker) and orchestration (Kubernetes).
- Experience building or maintaining the infrastructure for ML systems and integrating models into production services.
- Ability to understand and make tradeoffs between different technologies and patterns.
- Excellent communication skills and ability to work effectively with engineers, product managers, and business stakeholders.
- Experience leveraging modern IDEs and AI-assisted development tools (e.g., Cursor, GitHub Copilot) to accelerate development cycles.

(Note: Additional details of the company were not present in the provided job description.)
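For candidates gauging the kind of batch data-pipeline work described above, here is a minimal sketch of one transform-and-validate stage using Pandas (named in the qualifications). The DataFrame columns, values, and validation rule are hypothetical illustrations, not part of the employer's actual pipeline; in production, equivalent logic would typically run over BigQuery-sourced data in Dataflow or Spark.

```python
import pandas as pd

def clean_product_batch(df: pd.DataFrame) -> pd.DataFrame:
    """One hypothetical stage of a batch pipeline: filter and normalise.

    Drops rows with missing prices, normalises product names, and
    enforces a simple data-quality rule before the batch moves on.
    """
    out = df.dropna(subset=["price"]).copy()
    out["name"] = out["name"].str.strip().str.lower()
    # Data-quality guard: fail the batch early on bad prices.
    if (out["price"] <= 0).any():
        raise ValueError("non-positive price found in batch")
    return out

# Toy input batch standing in for rows read from BigQuery.
batch = pd.DataFrame(
    {"name": ["  Widget ", "Gadget", "Gizmo"],
     "price": [9.99, None, 4.50]}
)
cleaned = clean_product_batch(batch)
print(len(cleaned))            # the row with a missing price is dropped
print(list(cleaned["name"]))   # names are trimmed and lower-cased
```

The early `ValueError` reflects the posting's emphasis on data quality: rejecting a bad batch at the stage boundary is usually cheaper than propagating bad rows downstream.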

Keyskills :
Python, SQL, Spark, Kubernetes, Airflow, Docker, Hadoop, MapReduce, Kafka, BigQuery, Composer, Cloud Run, Dataflow, Pub/Sub, GKE


© 2019 Hireejobs All Rights Reserved