Job Location | Bangalore
Education | Not Mentioned
Salary | Not Disclosed
Industry | IT - Software
Functional Area | General / Other Software
Employment Type | Full-time
The Data Engineer is responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams.

Responsibilities include:
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Google Cloud Platform big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs.
- Keep our data separated and secure across national boundaries through multiple data centers and cloud vendor regions.
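The extraction-transformation-loading work described above can be sketched in miniature. The snippet below is a hedged, self-contained illustration only: it uses Python's built-in sqlite3 as a stand-in for the posting's Cloud SQL / BigQuery sources, and the table names (`raw_orders`, `customer_totals`) and columns are invented for the example.

```python
# Minimal ETL sketch; sqlite3 stands in for the cloud data sources
# named in the posting, and all table/column names are illustrative.
import sqlite3

def run_pipeline() -> list[tuple]:
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    # Extract: load raw source rows (in practice, read from a source system).
    cur.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    cur.executemany(
        "INSERT INTO raw_orders VALUES (?, ?)",
        [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)],
    )
    # Transform + Load: aggregate raw rows into an analytics-friendly table.
    cur.execute(
        """
        CREATE TABLE customer_totals AS
        SELECT customer, SUM(amount) AS total
        FROM raw_orders
        GROUP BY customer
        """
    )
    rows = cur.execute(
        "SELECT customer, total FROM customer_totals ORDER BY customer"
    ).fetchall()
    conn.close()
    return rows

print(run_pipeline())  # [('alice', 150.0), ('bob', 75.5)]
```

In a production pipeline the same extract / transform / load steps would be orchestrated by a workflow tool and run against managed warehouses rather than an in-memory database.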
- Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
- Work with data and analytics experts to strive for greater functionality in our data systems.

Preferred Qualifications:
- Knowledge of metadata and Master Data Management.
- Familiarity with Google Cloud Platform services such as Cloud SQL, BigQuery, Dataflow, Dataprep, and App Engine.
- Knowledge of stream-processing systems, e.g. Storm, Kafka, etc.

Education (include minimum education and any licensing/certifications):

Required Qualifications (these are the minimum requirements to qualify):
- Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.

Experience:
- 5 to 9 years of experience in a Data Engineer role.
- Advanced working SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of databases.
- Proficiency building and optimizing big data pipelines, architectures, and data sets.
- Background performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Experience building processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable big data stores.
- Experience supporting and working with cross-functional teams in a dynamic environment.

Knowledge and Skills:
- Strong analytic skills related to working with structured and unstructured datasets.
- Project management and organizational skills.
- Experience with relational SQL and NoSQL databases.
- Knowledge of data pipeline and workflow management tools, e.g. KNIME, Dataflow, Dataprep, Airflow, etc.
- Familiarity with object-oriented / object function scripting languages, e.g. Python, Java, C, Scala, etc.
- Familiarity with big data tools, e.g. Hadoop, BigQuery, Kafka, etc.

Bonus points for ETL and Python experience; the employer is willing to train someone who is eager to learn these.
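The stream-processing knowledge asked for above (Storm, Kafka, message queuing) centers on ideas like windowed aggregation over an unbounded event stream. As a hedged illustration only, the plain-Python sketch below shows a tumbling-window count; the event tuples and the `tumbling_window_counts` helper are invented for the example and are not part of any named framework's API.

```python
# Tumbling-window aggregation sketch in plain Python; a stand-in for
# the Kafka/Storm-style stream processing the posting names.
from collections import defaultdict

def tumbling_window_counts(events, window_size):
    """Group (timestamp, key) events into fixed windows and count per key."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        windows[ts // window_size][key] += 1  # integer division picks the window
    return {w: dict(counts) for w, counts in windows.items()}

events = [(1, "click"), (2, "view"), (4, "click"), (11, "click")]
print(tumbling_window_counts(events, 10))
# {0: {'click': 2, 'view': 1}, 1: {'click': 1}}
```

Real stream processors add what this sketch omits: late-arriving data, checkpointing, and partitioned parallel consumption.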
Keyskills :
SQL, Java, Data Warehousing, Python, Root Cause Analysis, Master Data Management, Big Data, Dataflow, Data Management, Data Structures, Informatica, Google Cloud Platform, Computer Science