Job Location | Bangalore
Education | Not Mentioned
Salary | Not Disclosed
Industry | Recruitment Services
Functional Area | DBA / Data Warehousing
Employment Type | Full-time
Gladwin Analytics: Executive Search | Big Data | Advanced Analytics

Big Data Architect (13 - 18 years)

Job Description
- Design ETL hubs and ETL architecture for data warehouse / BI implementations.
- Ensure systems meet business requirements and industry practices.
- Build high-performance algorithms, prototypes, predictive models and proofs of concept.
- Develop data set processes for data modeling, mining and production.
- Collaborate with data architects, modelers and IT team members on project goals.
- Select and integrate any Big Data tools and frameworks required to provide requested capabilities.
- Implement ETL processes using SQL programming, database design/development and ETL tools.
- Monitor performance and advise on any necessary infrastructure changes.
- Define data retention policies.
- Work with the Hive, Sqoop, Impala and Kudu components of the Hadoop ecosystem.
- Write complex scripts using Python, Linux shell scripting or Perl.

Ideal Candidate & Qualifications
- Ability to work with huge volumes of data to derive business intelligence.
- Analyze data, uncover information, derive insights and propose data-driven strategies.
- Knowledge of database concepts, principles, structures and best practices.
- Hands-on experience with Hadoop distribution platforms such as Hortonworks, Cloudera and MapR.
- Full knowledge of Hadoop architecture and HDFS is a must.
- Good knowledge of data warehousing concepts, business intelligence, data management and data architecture.
- Comprehensive understanding of the Hadoop/MapReduce ecosystem and architecture.
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming.
- Good knowledge of Big Data querying tools such as Pig, Hive and Impala.
- Experience with Spark and with NoSQL databases such as HBase, Cassandra and MongoDB.
- Knowledge of various ETL techniques and frameworks, such as Flume.
- Experience with messaging systems such as Kafka or RabbitMQ.
- Experience with Big Data ML toolkits such as Mahout, Spark MLlib or H2O.
- Knowledge of Java and web development.
- An analytical bent of mind and the ability to learn, unlearn and relearn.

© 2018 Gladwin Analytics. All rights reserved.
Keyskills: Cassandra, Hadoop, HBase, data modeling, big data, data architects, data warehousing, commercial models, data management, advanced analytics, business intelligence, business requirements, data retention, executive search