Job Location | Noida |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | IT - Software |
Functional Area | DBA / Datawarehousing |
Employment Type | Full-time |
Job Description :
Must have experience as a big data architect, with hands-on experience architecting and implementing petabyte-scale Hadoop-based enterprise data platforms. This role requires prior experience implementing efficient data management and data governance, data integration, batch and streaming data processing, working with multiple architects and multi-cultural teams, defining a long-term strategic architectural vision, and building an efficient data organization.

Roles and Responsibilities :
The primary responsibility of this position is to design and implement Big Data analytic solutions on a Hadoop-based platform: building end-to-end data pipelines, data exchange, analytics, advanced search, data management processes and standards, and data life-cycle management.
Architect solutions for cloud and on-premises deployments of big data platforms, integrating data from various sources and positioning it for analytics, reporting, visualization, and quick access by other applications/COTS.
Provide technical leadership and mentoring, working with a team of senior/junior Big Data developers on design, development, deployment, and systems integration activities.
The Big Data Architect is expected to do hands-on coding and to develop/implement design patterns and best practices.
Required experience provisioning, setting up, and fine-tuning Hadoop clusters on the Cloudera Hadoop distribution.
Extensive experience with Big Data technologies to build highly reliable and scalable data solutions using the Hadoop ecosystem: NiFi, Sqoop, Oozie, Spark, Hive, Pig, Impala, Kafka Streaming, Flink, Storm, Flume, Knox, Ranger, Ambari, Atlas, ZooKeeper, Schema Registry, and Cloudera Data Platform (Cloudera Data Hub, Cloudera Data Flow).
Hands-on work experience with all facets of enterprise data architecture: data ingestion, transformation, wrangling, data quality, data lineage, data catalogue, schema registry, information security, classification, encryption, and secure data exchange.
Design the data model for the different layers of the enterprise data platform using well-established modelling techniques and frameworks such as DV2.
Must have prior experience building a Hadoop data lake/hub or lakehouse from scratch; a minimum of 3 projects as data architect.
Experience with performance tuning and resource management to ensure adherence to NFRs.
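The data-quality and ingestion-validation work mentioned above can be illustrated with a minimal, hypothetical sketch in plain Python. The schema, field names, and rules here are invented for illustration; a real platform would enforce this with Spark jobs, a schema registry, or similar tooling rather than hand-rolled code.

```python
# Hypothetical record-level data-quality gate, as might sit at the
# ingestion stage of a pipeline: validate each record against an
# expected schema and route failures to a quarantine stream.

# Invented schema for illustration: field name -> required Python type
SCHEMA = {"id": int, "event": str, "amount": float}

def validate(record):
    """Return a list of violations for one record (empty list = clean)."""
    issues = []
    for field, ftype in SCHEMA.items():
        if field not in record:
            issues.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            issues.append(f"bad type for {field}: {type(record[field]).__name__}")
    return issues

def split_clean_dirty(records):
    """Route records into clean and quarantined lists."""
    clean, dirty = [], []
    for rec in records:
        (clean if not validate(rec) else dirty).append(rec)
    return clean, dirty

batch = [
    {"id": 1, "event": "click", "amount": 9.99},
    {"id": "2", "event": "click", "amount": 1.0},  # id has the wrong type
    {"id": 3, "amount": 5.0},                      # "event" field missing
]
clean, dirty = split_clean_dirty(batch)
```

In a production pipeline the quarantined records would typically be written to an error topic or table for lineage and reprocessing, rather than dropped.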
Keyskills :
data integration | big data | hive | project administration | data exchange | spark | data analytics | cassandra | etl | impala | performance tuning | streaming | resource management | sqoop | oozie | kafka | information security | data quality | hadoop | hbase | technical leadership | design patterns