Job Location | Hyderabad
Education | Not Mentioned
Salary | Not Disclosed
Industry | Medical / Healthcare
Functional Area | General / Other Software
Employment Type | Full-time
Position Description

Engagement & Project Overview
CDSM has several commitments to the UHC clinical business: identifying gaps in care, acquiring clinical data from providers, and quality-checking that data. The CDSM Enterprise Integration Service Layer and Data Platform 2 project is built on the Optum Big Data Platform and Java/J2EE to give the business agility in clinical data processing. Data is acquired from external providers through web services, FHIR, and ECG, and is processed on the Big Data Platform in the CDSM tenant. The team collects data from the DXP tool and from CDSM downstream systems such as SDR and SDM, and analyzes it using Spark and Hadoop code. Candidates should be experts in Java/J2EE, Spark, Unix scripting, Hive, and HBase.

Technologies currently in use: Hadoop, Hive, HBase, Java, J2EE, SQL, Unix.

Operates within established methodologies, procedures, and guidelines. Must be able to work independently as well as guide team members in a fast-paced, agile environment.

Primary Responsibilities
- Design and develop/code software components in Big Data, using tools such as Hive, Sqoop, HBase, Spark, and Scala.
- Communicate effectively and manage stakeholders.
- Apply expert-level Java and RDBMS skills.
- Write solid unit tests and integration tests.
- Work independently as well as guide team members in a fast-paced, agile environment.

Must-Have Skills
- Minimum of 5 years' experience designing and developing/coding software components in Big Data, using tools such as MapReduce, Hive, Sqoop, HBase, Spark, Scala, and Kafka.
- Expert in Java and RDBMS.
- Excellent communication and stakeholder management.
- Excellent knowledge of Agile and Software Development Life Cycle (SDLC) processes.

Nice-to-Have Skills
- Experience in Oozie and Kubernetes is a big plus.
- Healthcare domain knowledge is good to have.
- Should be flexible with working hours, as the role requires working closely with US counterparts.
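The role above includes quality-checking clinical data acquired from providers before analysis. As a minimal, illustrative sketch of what a record-level quality check might look like (field names such as `member_id` and `service_date` and the rules themselves are hypothetical; on the actual platform this kind of logic would run as Spark/Hive jobs over the CDSM tenant's data):

```python
from datetime import datetime

# Hypothetical required fields for a clinical record; the real schema
# is not specified in the posting.
REQUIRED_FIELDS = ("member_id", "provider_id", "service_date")

def check_record(record):
    """Return a list of quality issues found in one clinical record."""
    issues = []
    for field in REQUIRED_FIELDS:
        if not record.get(field):
            issues.append(f"missing {field}")
    # Validate the date format only when the field is present.
    service_date = record.get("service_date")
    if service_date:
        try:
            datetime.strptime(service_date, "%Y-%m-%d")
        except ValueError:
            issues.append("bad service_date format")
    return issues

records = [
    {"member_id": "M1", "provider_id": "P9", "service_date": "2023-04-01"},
    {"member_id": "M2", "provider_id": "", "service_date": "04/01/2023"},
]
clean = [r for r in records if not check_record(r)]
print(len(clean))  # prints 1: the second record fails both checks
```

In a Spark version, the same per-record function could be applied with a UDF or a `filter` over a DataFrame read from a Hive table.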
Keyskills :
AutoCAD | Project Management | Drawing | Civil Consulting | Software Development Life Cycle | Web Services | Quality Check