Job Location | Delhi |
Education | Not Mentioned |
Salary | Rs 5 - 12 Lakh/Yr |
Industry | IT - Hardware / Networking |
Functional Area | DBA / Datawarehousing |
Employment Type | Full-time |
Dear Sir/Madam,

We are looking for a Data Engineer with expertise in Snowflake. Your profile seems relevant to our requirement. If you are looking for a job change, kindly share your updated resume along with the details below:
Current CTC:
Notice Period:

<<<NOTE: NO CHARGES, 100% FREE JOB>>>

Job Title - Snowflake Data Engineer
Location - Delhi
Salary - Up to 12 LPA
No. of openings - 1
Experience - 4+ yrs

Responsibilities:
The Snowflake Data Engineer will be responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake Data Warehouse. Solid experience in architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake Cloud Data Warehouse is a must, along with professional knowledge of AWS Redshift.
- Developing ETL pipelines in and out of data warehouses using a combination of Python and Snowflake's SnowSQL
- Writing SQL queries against Snowflake
- Developing scripts (Unix, Python, etc.) to extract, load, and transform data
- Working knowledge of AWS S3
- Providing production support for data warehouse issues such as data load problems and transformation/translation problems
- Translating BI and reporting requirements into database design and reporting design
- Understanding data transformation and translation requirements and which tools to leverage to get the job done
- Understanding data pipelines and modern ways of automating them using cloud-based tools
- Testing and clearly documenting implementations so that others can easily understand the requirements, implementation, and test conditions

Basic Qualifications:
- Minimum 1 year of designing and implementing a fully operational, production-grade, large-scale data solution on the Snowflake Data Warehouse
- 3 years of hands-on experience building productionized data ingestion and processing pipelines using Java, Spark, Scala, and Python
- 2 years of hands-on experience designing and implementing production-grade data warehousing solutions on large-scale data technologies such as Teradata, Oracle, or DB2
- Expertise in and excellent understanding of Snowflake internals and the integration of Snowflake with other data processing and reporting technologies
- Excellent presentation and communication skills, both written and verbal
- Ability to problem-solve and architect in an environment with unclear requirements
- Experience working with AWS
- Bachelor's degree in Computer Science, Engineering, or Technical Science, or 3 years of technical architecture and build experience with large-scale solutions
- Minimum 1 year of experience architecting large-scale data solutions, performing architectural assessments, crafting architectural options and analysis, and finalizing the preferred solution alternative while working with IT and business stakeholders
- Experience building data ingestion pipelines using Talend and Informatica

Please share your resume.
Contact no. - 7291910546
Contact person - Mrs Komal Verma
Email id - hr01emp@gmail.com
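For candidates unfamiliar with the day-to-day work, the "Python plus Snowflake's SnowSQL" ETL responsibility above might look roughly like the following minimal sketch. The table name, stage name, and field names are hypothetical; the transform and CSV-serialization steps run standalone, while the actual Snowflake load (which requires the snowflake-connector-python package and real credentials) is shown only in comments.

```python
# Minimal sketch of one ETL step: transform raw source records in Python,
# serialize them to CSV, then (commented out) load them into Snowflake.
# Table, stage, and field names here are illustrative assumptions.
import csv
import io


def transform(rows):
    """Normalize raw source rows before loading to the warehouse:
    cast ids to int, lowercase/strip emails, round amounts to 2 places."""
    out = []
    for row in rows:
        out.append({
            "id": int(row["id"]),
            "email": row["email"].strip().lower(),
            "amount": round(float(row["amount"]), 2),
        })
    return out


def to_csv(records):
    """Serialize transformed records to CSV text, the format typically
    uploaded to a Snowflake stage before a COPY INTO."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "email", "amount"])
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()


# Load step (hypothetical names; needs snowflake-connector-python and
# real credentials, so it is left as comments):
# import snowflake.connector
# conn = snowflake.connector.connect(user=..., password=..., account=...)
# conn.cursor().execute("PUT file://out.csv @my_stage")
# conn.cursor().execute("COPY INTO my_table FROM @my_stage")

if __name__ == "__main__":
    raw = [{"id": "1", "email": " A@X.COM ", "amount": "10.456"}]
    print(to_csv(transform(raw)))
```

In a real pipeline the transform step would usually be driven by a scheduler (cron, Airflow, etc.), with the PUT/COPY INTO statements executed through the Snowflake connector as sketched in the comments.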
Keyskills :
informatica, music making, python, technical architecture, source system analysis, production support, data solutions, database design, data warehousing, sql, data intelligence, sql queries, data transformation, computer science, communication skills, etl, data processing