Job Location | Bangalore |
Education | Not Mentioned |
Salary | Not Disclosed |
Industry | Banking / Financial Services |
Functional Area | Application Programming / Maintenance, DBA / Data Warehousing |
Employment Type | Full-time |
Business Background

The position is for a role in the TEDRA department. TEDRA (Trade Enrichment, Data Reporting & Allocations) is part of the Institutional Securities Technology (IST) Division. It is responsible for maintaining, distributing, and reporting on trading, revenue, risk, and reference data (client, product, and pricing). As the authoritative source of key data sets, we are at the forefront of database technology and are heavily involved in data engineering, data science, data visualization, and machine learning efforts across the Firm.

Position Introduction

This is a data engineer role on the team responsible for developing the firm's Trade Capture data stores, which hold transactional big data for real-time and archive processing and feed it into the archives and the data lake.

The global team consists of highly technical members who are adaptable to both hands-on development and project management. We deliver multiple projects for multiple business areas in parallel. The business owners and subject matter experts are globally distributed, making proactive communication important. You will be expected to work closely with our operations partners on project requirements.

Development follows an agile methodology based on Scrum (time boxing, daily scrum meetings, retrospectives, etc.) and XP (continuous integration, refactoring, unit testing, etc.) best practices. Candidates must therefore be able to work collaboratively, demonstrate strong ownership, and work well in teams.

Primary responsibilities include:
1. Translate business requirements into queries against a set of relational tables and produce reporting based on those requirements.
2. Design and build a reporting layer from different data sources and act as a single point of contact (SPOC) for user queries.
3. Database and ETL development, including stored procedures, queries, and performance tuning, using SQL and ETL tools such as Informatica.
4. Write efficient, clean automation scripts (using Python etc.) as part of the ETL process.

The current global team members are all highly skilled in domain modeling, database design, big data, Java, and messaging, so this is an excellent opportunity to play a key role in the growing Shanghai team.

Technical Skills Requirements
* Strong relational database skills, especially with DB2/Sybase and/or Greenplum.
* Knowledge of Hadoop/Spark is desirable.
* Experience delivering metrics/reporting in an enterprise environment (e.g. demonstrated experience with BI tools such as Business Objects or Tableau; report design and delivery).
* Strong understanding of ETL processes and experience with tools such as Talend. Real-time message processing experience is a big plus.
* Ability to create high-quality, optimized stored procedures and queries.
* Experience with Power Designer or a similar modeling tool.
* Strong scripting skills in languages such as Python and Unix/K-shell; Java is a big plus.
* Strong knowledge of relational database performance and tuning, such as proper use of indices, database statistics/reorgs, and de-normalization concepts.
* Experience with data mining is a big plus.
* Familiarity with the lifecycle of a trade and the flows of data in an investment banking operation.
* Experience with an Agile development process.
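To illustrate the kind of automation scripting described in responsibility 4 (a Python script as one step of an ETL process), here is a minimal sketch. It uses the standard-library sqlite3 module as a stand-in for the actual databases, and all table, column, and function names (run_etl, trades, trade_report) are hypothetical examples, not taken from the posting:

```python
import sqlite3

def run_etl(src_conn, dst_conn):
    """Minimal ETL step: extract trade rows, transform, load into a
    reporting table. All table/column names are illustrative only."""
    # Extract: pull raw trade rows from the source store
    rows = src_conn.execute(
        "SELECT trade_id, notional, currency FROM trades"
    ).fetchall()

    # Transform: normalize notional to a single reporting currency
    # (hard-coded stub rates; a real job would look these up)
    fx = {"USD": 1.0, "EUR": 1.1}
    transformed = [
        (trade_id, notional * fx.get(ccy, 1.0))
        for trade_id, notional, ccy in rows
    ]

    # Load: upsert into the reporting table in one batch
    dst_conn.executemany(
        "INSERT OR REPLACE INTO trade_report (trade_id, notional_usd) "
        "VALUES (?, ?)",
        transformed,
    )
    dst_conn.commit()
    return len(transformed)
```

In practice such a script would sit alongside the Informatica/Talend pipeline, handling steps that are easier to express in code than in the ETL tool, and would use batched writes (as with executemany above) for performance.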
Keyskills :
subject matter experts, big data, ETL tools, data mining, data science, unit testing, report design, reference data