hireejobs

Principal Data Engineer

Fresher   Vijayawada, All India   07 Apr, 2026
Job Location: Vijayawada, All India
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT Services & Consulting
Functional Area: Not Mentioned
Employment Type: Full-time

Job Description

    You will play a critical role in designing and building core components of the data platform. This is a hands-on leadership role in which you will drive architecture decisions, mentor engineers, and own complex data problems end-to-end. You will collaborate closely with product and platform teams to define scalable, reliable, and high-performance data systems.

    Responsibilities:
    - Design and build scalable, reusable data engineering components and frameworks using Python, Airflow, and PySpark.
    - Lead the architecture of end-to-end data pipelines, from ingestion and transformation to orchestration and monitoring.
    - Build and maintain FastAPI-based services to expose metadata, control-plane, and developer APIs.
    - Drive best practices in software engineering and data architecture (code quality, testing, CI/CD, performance).
    - Mentor and guide a team of data and backend engineers.
    - Collaborate with product managers, designers, and other engineers to deliver features on time and at high quality.
    - Evaluate and introduce tools, frameworks, and technologies that improve the developer experience and platform scalability.

    Qualifications Required:
    - 10 years of experience in software or data engineering, including experience building large-scale data platforms or products.
    - Expert-level Python skills with a strong software engineering background.
    - Deep expertise in Apache Airflow (or similar orchestration tools) and PySpark (or distributed data processing frameworks).
    - Strong understanding of API design and development using FastAPI or similar frameworks.
    - Experience with data modeling, schema design, and working with large-scale data (TB/PB scale).
    - Hands-on experience with cloud-native data platforms (AWS, GCP, or Azure).
    - Familiarity with containerization (Docker), CI/CD pipelines, and infrastructure-as-code is a plus.

    You will have the opportunity to lead and shape a cutting-edge data platform from the ground up in a collaborative, product-minded engineering culture. Competitive compensation, equity, and benefits are offered, along with a flexible remote/onsite working model.
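As a rough illustration of the "reusable data engineering components" theme in this role, the sketch below shows a minimal composable pipeline pattern in plain Python. All names (`Step`, `Pipeline`, the example steps) are hypothetical; a real platform of this kind would typically express the same ingest/transform/load composition as Airflow tasks and PySpark jobs rather than plain functions.

```python
# Minimal sketch of a reusable pipeline-component pattern.
# Hypothetical names; a production platform would use Airflow
# operators and PySpark transforms instead of plain callables.
from dataclasses import dataclass
from typing import Any, Callable, List


@dataclass
class Step:
    """A named pipeline stage: takes a payload, returns a new payload."""
    name: str
    run: Callable[[Any], Any]


class Pipeline:
    """Compose ingest -> transform -> load steps and run them in order."""

    def __init__(self, steps: List[Step]) -> None:
        self.steps = list(steps)

    def execute(self, payload: Any) -> Any:
        for step in self.steps:
            payload = step.run(payload)
        return payload


# Example: ingest raw rows, drop invalid ones, then aggregate a total.
ingest = Step("ingest", lambda _: [{"amount": 10}, {"amount": -5}, {"amount": 7}])
clean = Step("clean", lambda rows: [r for r in rows if r["amount"] > 0])
total = Step("total", lambda rows: sum(r["amount"] for r in rows))

pipeline = Pipeline([ingest, clean, total])
result = pipeline.execute(None)
print(result)  # → 17
```

The point of the pattern is that each stage is a small, independently testable unit, which is the same property Airflow DAGs and reusable PySpark components aim for at scale.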

Keyskills:
Python, Data modeling, AWS, GCP, Azure, Docker, Apache Airflow, PySpark, FastAPI, API design, Schema design, CI/CD pipelines


© 2019 Hireejobs All Rights Reserved