
DATA ENGINEER

2.00 to 6.00 Years | Pune | 14 Mar, 2022

Job Location: Pune
Education: Not Mentioned
Salary: Not Disclosed
Industry: Manufacturing
Functional Area: IT Operations / EDP / MIS
Employment Type: Full-time

Job Description

    Supports, develops and maintains a data and analytics platform: effectively and efficiently processes, stores and makes data available to analysts and other consumers. Works with the Business and IT teams to understand requirements and to leverage technologies that enable agile data delivery at scale.

    Responsibilities:
    - Implements and automates deployment of a distributed system for ingesting and transforming data from various types of sources (relational, event-based, unstructured).
    - Implements methods to continuously monitor and troubleshoot data quality and data integrity issues.
    - Implements data governance processes and methods for managing metadata, and data access and retention, for internal and external users.
    - Develops reliable, efficient, scalable, quality data pipelines with monitoring and alert mechanisms, combining a variety of sources using ETL/ELT tools or scripting languages.
    - Develops physical data models and implements data storage architectures per design guidelines.
    - Analyzes complex data elements and systems, data flow, dependencies and relationships in order to contribute to conceptual, physical and logical data models.
    - Participates in testing and troubleshooting of data pipelines.
    - Develops and operates large-scale data storage and processing solutions using distributed and cloud-based platforms (e.g. data lakes, Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
    - Uses agile development practices, such as DevOps, Scrum, Kanban and the continuous improvement cycle, for data-driven applications.

    Skills:
    - Data Extraction: performs extract-transform-load (ETL) activities from a variety of sources and transforms the data for consumption by downstream applications and users, using appropriate tools and technologies.
    - Solution Documentation: documents information and solutions based on knowledge gained during product development; communicates to stakeholders to improve productivity and to transfer knowledge effectively to those not part of the initial learning.
    - Quality Assurance Metrics: applies the science of measurement to assess whether a solution meets its intended outcomes, using the IT Operating Model (ITOM), including SDLC standards, tools, metrics and key performance indicators, to deliver a quality product.
    - Solution Validation Testing: validates a configuration item change or solution using the function's defined best practices, including Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure it works as designed and meets customer requirements.
    - System Requirements Engineering: uses appropriate methods and tools to translate stakeholder needs into verifiable requirements; establishes acceptance criteria through analysis, allocation and negotiation; tracks the status of requirements throughout the system lifecycle; assesses the impact of requirement changes on project scope, schedule and resources; creates and maintains information linkages to related artifacts.
    - Problem Solving: solves problems using a systematic analysis process, leveraging industry-standard methodologies to create problem traceability and protect the customer; determines the assignable cause; implements robust, data-based solutions; identifies systemic root causes and recommends actions to prevent recurrence.
    - Data Quality: identifies, understands and corrects flaws in data to support effective information governance across operational business processes and decision making.
    - Programming: creates, writes and tests computer code, test scripts and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
    - Customer Focus: builds strong customer relationships and delivers customer-centric solutions.
    - Decision Quality: makes good and timely decisions that keep the organization moving forward.
    - Collaborates: builds partnerships and works collaboratively with others to meet shared objectives.
    - Communicates Effectively: develops and delivers multi-mode communications that convey a clear understanding of the unique needs of different audiences.

    Education, Licenses, Certifications:
    College, university or equivalent degree preferred, or equivalent work experience in a relevant technical discipline. This position may require licensing for compliance with export controls or sanctions regulations.

    Experience:
    Relevant experience preferred, such as temporary student employment, internships, co-ops or other extracurricular team activities. Knowledge of the latest data-engineering technologies is highly preferred, including:
    - Exposure to open-source Big Data tools: Spark, Scala/Java, MapReduce, Hive, HBase and Kafka, or equivalent college coursework
    - SQL query language
    - Clustered, cloud-based compute implementation experience
    - Familiarity developing applications requiring large file movement in a cloud-based environment
    - Exposure to Agile software development
    - Exposure to building analytical solutions
    - Exposure to IoT technology

    Must Have:
    1. Create ETL pipelines using Azure Databricks, Spark, Scala and Python, with very strong SQL and data-modelling skills.
    2. Azure Databricks and Azure Data Lake administration.
    3. Knowledge of BI tools such as Power BI and Tableau.
    4. Experience in data ingestion from multiple sources and file formats: HBase, JSON, NoSQL databases.
    5. Continuous-improvement mindset for optimizing ETL solutions and reducing cost and time.
    Good to Have:
    1. Experience in Data Analytics and Data Science projects.
    2. Data ingestion using Attunity.
    3. Knowledge of SSIS, SSAS and Informatica tools.
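Must-have item 1 describes ETL pipelines built on Azure Databricks and Spark; the posting gives no code, so the following is only a miniature sketch of the extract-transform-load pattern using the Python standard library. The `sqlite3` in-memory database stands in for a data lake table, the inline JSON string for a real source feed, and all table, field and function names are hypothetical, not from the posting.

```python
import json
import sqlite3

# Hypothetical raw feed: in the role described above this would arrive from
# HBase/JSON/NoSQL sources via Spark; here it is an in-memory JSON string.
RAW = """[
  {"order_id": 1, "region": "south", "amount": "120.50"},
  {"order_id": 2, "region": "SOUTH", "amount": "80.00"},
  {"order_id": 3, "region": "north", "amount": "45.25"}
]"""

def extract(raw: str) -> list[dict]:
    """Extract: parse source records (stand-in for a Spark read)."""
    return json.loads(raw)

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: normalize casing and types before loading."""
    return [(r["order_id"], r["region"].lower(), float(r["amount"]))
            for r in rows]

def load(rows: list[tuple]) -> sqlite3.Connection:
    """Load: write to a SQL store (stand-in for a lake/warehouse table)."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE orders (order_id INT, region TEXT, amount REAL)")
    con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return con

con = load(transform(extract(RAW)))
# Downstream SQL consumption, e.g. totals per normalized region.
totals = sorted(con.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region"))
print(totals)  # [('north', 45.25), ('south', 200.5)]
```

In a Databricks job the three steps would typically be Spark reads, DataFrame transformations and Delta table writes, but the extract/transform/load separation is the same.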

Keyskills:
sql, java, data warehousing, informatica, python, key performance indicators, big data, power bi, data flow, life cycle, data models, data science, data quality, test scripts
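Several of the keyskills above (python, data quality, test scripts) come together in the "continuously monitor and troubleshoot data quality" duty. The posting does not show how such checks are implemented, so this is only an illustrative stdlib sketch; the check names and record fields are hypothetical.

```python
# Minimal batch data-quality checks; rule and field names are hypothetical.
def check_quality(rows: list[dict], key: str, required: list[str]) -> dict:
    """Return counts of common data-quality flaws in a batch of records."""
    seen, duplicates, missing = set(), 0, 0
    for row in rows:
        if row.get(key) in seen:
            duplicates += 1       # repeated primary key
        seen.add(row.get(key))
        if any(row.get(f) in (None, "") for f in required):
            missing += 1          # null/blank required field
    return {"rows": len(rows), "duplicates": duplicates, "missing": missing}

batch = [
    {"id": 1, "amount": 10.0},
    {"id": 1, "amount": 12.5},   # duplicate id
    {"id": 2, "amount": None},   # missing amount
]
report = check_quality(batch, key="id", required=["amount"])
print(report)  # {'rows': 3, 'duplicates': 1, 'missing': 1}
```

In a real pipeline a report like this would feed the monitoring and alert mechanisms the description calls for, rather than a `print`.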


© 2019 Hireejobs All Rights Reserved