hireejobs

PySpark Reltio

Fresher | Chennai, All India | 19 Jan, 2026
Job Location: Chennai, All India
Education: Not Mentioned
Salary: Not Disclosed
Industry: IT Services & Consulting
Functional Area: Not Mentioned
Employment Type: Full-time

Job Description

You are a skilled PySpark Developer with hands-on experience in Reltio MDM, joining the data engineering team to design and implement scalable data processing solutions using PySpark and integrating with Reltio's cloud-native MDM platform.

**Key Responsibilities:**

- Develop and maintain data pipelines using PySpark in distributed computing environments (e.g., AWS EMR, Databricks).
- Integrate and synchronize data between enterprise systems and the Reltio MDM platform.
- Design and implement data transformation, cleansing, and enrichment processes.
- Collaborate with data architects, business analysts, and Reltio solution architects to ensure high-quality data modeling.
- Work on API-based integration between Reltio and upstream/downstream applications.
- Optimize PySpark jobs for performance and cost efficiency.
- Ensure data quality, integrity, and governance throughout the pipeline.
- Troubleshoot and resolve data and performance issues in existing workflows.

**Required Skills & Qualifications:**

- 5 to 7 years of experience in PySpark development and distributed data processing.
- Strong understanding of Apache Spark DataFrames and Spark SQL.
- Experience with Reltio MDM, including entity modeling, survivorship rules, and match/merge configuration.
- Proficiency in working with REST APIs and JSON data formats.
- Experience with cloud platforms such as AWS and its data services (e.g., S3, Lambda, Step Functions).
- Good knowledge of data warehousing concepts, ETL workflows, and data modeling.
- Familiarity with CI/CD practices and version control tools such as Git.
- Strong problem-solving and communication skills.

*Note: Additional company details in the original posting have been omitted as they were not specifically related to the role.*

Keyskills:
AWS, Data modeling, Git, PySpark, Reltio MDM, AWS EMR, Databricks, Apache Spark DataFrames, Spark SQL, REST APIs, JSON data formats, Lambda, ETL workflows


© 2019 Hireejobs All Rights Reserved