Job Archives
• Develop a big data platform in Hadoop using Spark, Oozie, and related tools;
• Implement and support big data tools and frameworks such as HDFS and Hive, and streaming technologies such as Kafka and Spark;
• Set standards for warehouse and schema design;
• Write Hive queries to process and manage data in external tables;
• Automate loading of data into HDFS by developing workflow jobs in Oozie;
• Develop and build Maven scripts, integrate them with Jenkins, and automate the compilation, deployment, and testing of web applications through XL Release; feed machine learning models in the big data environment;
• Develop and optimize Hadoop/Spark jobs, and automate them using shell scripts and an event engine tool;
• Develop Scala-, Python-, and Java-based REST APIs for batch processing and real-time processing of streaming requests in the big data platform;
• Perform unit testing, debugging, and verification of Splunk logs, and deploy production-ready code and various microservices for US and international market projects;
• Migrate big data projects and configuration-driven applications from the old platform to the new architectural platform using Java, Hadoop distributions, MapReduce, Hive, shell scripting, SQL, and REST services;
• Develop and maintain big data pipelines (batch and real-time) end to end.
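The Hive and Oozie duties above can be illustrated with a minimal Python sketch. It builds the HiveQL statement a scheduled (e.g. Oozie-driven) job could run to register one day's HDFS directory as a partition of an external Hive table; the table name, `dt` partition column, and directory layout are all hypothetical.

```python
from datetime import date

def add_partition_stmt(table: str, day: date, hdfs_root: str) -> str:
    """Build the HiveQL a scheduled job could run to register one day's
    HDFS directory as a partition of an external table.
    Table name, partition column 'dt', and layout are illustrative."""
    part = day.isoformat()               # e.g. '2022-05-20'
    location = f"{hdfs_root}/dt={part}"  # directory already loaded into HDFS
    return (
        f"ALTER TABLE {table} ADD IF NOT EXISTS "
        f"PARTITION (dt='{part}') LOCATION '{location}'"
    )

print(add_partition_stmt("events", date(2022, 5, 20), "/data/events"))
```

A daily workflow would load the day's files into HDFS first, then execute the generated statement so the new data becomes queryable without rewriting the table.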
Required: Master's in CS/Information Systems/Information Technology/related plus 6 months' experience.
Job Features
Job Category | Full Time |
Posted On | 05/20/2022 |
Reposted On | 06/10/2022 |
Status | Open |
Spearhead enterprise integration of the marketing platform using MuleSoft & Salesforce. Lead the enterprise-wide transformation of the CRM platform and create the migration strategy.
• Research and provide the best custom cloud solutions for the current infrastructure. Implement cloud integration for ERP, CRM, e-commerce, or mobile platforms (Salesforce, NetSuite, Oracle, SAP, Workday, Ariba, etc.)
• Plan strategically and review the existing architecture, planning enhancements and changes to improve the organization's operational efficiency. Analyze and assess risks, and document risk mitigations with the risk management team.
• Determine research objectives and work with the executive team to improve customer acquisition/retention by applying doctorate-level CRM research.
• Use predictive analytics, deep learning, and machine learning techniques to identify and prevent cyber threats and to improve product-line sales and CRM customer marketing strategies.
• Orchestrate information technology strategies and implement strategic technological solutions drawn from academic experience, ensuring the best possible designs (performance, scalability) for MuleSoft applications and solution compliance with MuleSoft best practices and coding standards, in accordance with corporate standards.
• Research and improve the current SDLC to reduce go-to-market time and deploy periodic releases of quality products. Implement end-to-end continuous delivery and continuous integration, and design the architecture, development, testing, and deployment of CRM, ERP, and HCM platforms.
• Track emerging technologies, and evaluate and improve the architecture to meet business and operational goals.
• Oversee scrum ceremonies following the Agile Manifesto for continuous delivery, and suggest improvements to SAFe scrum practices.
• Analyze web services interoperability, and critique and formulate solutions in multi-vendor and architecture committee meetings. Communicate and collaborate with stakeholders, and document and implement best practices.
• Identify project goals and research methods, choose the right data collection techniques, and identify the target audience to increase the ROI of product deliveries.
• Implement statistical and data mining techniques, e.g. hypothesis testing, machine learning, and retrieval processes, on large amounts of data to identify trends, patterns, and other information.
• Perform data modelling using advanced statistical analysis and unstructured data processing, develop predictive models, and support/mentor product team members.
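The hypothesis-testing duty above can be sketched with one of its basic building blocks: Welch's t statistic for two independent samples with possibly unequal variances, written here in stdlib-only Python. The sample data is invented for illustration.

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples -- a basic
    hypothesis-testing building block (larger |t| = stronger evidence
    that the group means differ)."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

# Hypothetical example: conversion scores for two CRM campaign variants.
t = welch_t([2, 3, 4], [1, 2, 3])
```

In practice the statistic would be compared against a t distribution (with Welch-adjusted degrees of freedom) to get a p-value; a statistics library would normally handle that step.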
Educational requirements: Master's in IT, preferably undergoing a PhD or doctorate program.
Job Features
Job Category | Full Time |
Posted On | 4/30/2020 |
Status | Open |
- Develop applications using Core Java, J2EE, JSP, Servlets, Struts, Hibernate, JDBC, Spring, Spring Boot, IBM WSAD;
- Analyze API requirements to map data elements and create the payload for the API;
- Develop REST APIs using Java and Node.js, and deploy them on Apigee and AWS cloud;
- Design and develop responsive web pages using HTML, CSS, JS, and frameworks like Bootstrap;
- Design and develop reusable UI components using frameworks like Angular and ReactJS;
- Implement Spring Security to provide enterprise-level security for access to application data;
- Maintain source code, committing and updating code changes in the version control system;
- Work on development, pre-production, and production environments on internal and AWS cloud to develop and deploy applications;
- Coordinate with the DevOps team on the CI/CD pipeline and releases;
- Work in an Agile environment, applying the latest Scrum concepts and associated processes.
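The payload-mapping duty above amounts to translating internal data-element names into the keys an API expects. A minimal Python sketch (the record fields and mapping are hypothetical, not from any posting):

```python
def to_api_payload(record: dict, field_map: dict) -> dict:
    """Map internal data-element names to the keys an API payload
    expects, dropping any elements the API does not define.
    field_map: {internal_name: api_key}."""
    return {api_key: record[src]
            for src, api_key in field_map.items()
            if src in record}

# Hypothetical internal record and mapping:
customer = {"cust_id": 42, "full_name": "Ada", "internal_flag": True}
payload = to_api_payload(customer, {"cust_id": "customerId",
                                    "full_name": "name"})
```

Note that `internal_flag` never reaches the payload: only fields explicitly listed in the map are exposed, which keeps internal-only data out of the API contract.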
Required: Bachelor's in CS/Info Tech/related plus 5 years' experience. Will also accept a Master's degree in Computer Science, Information Technology/Electronics Engineering/related in lieu of a Bachelor's + 5 years' experience.
Job Features
Job Category | Full Time |
Posted On | 02/18/2020 |
Reposted On | 03/19/2020 |
Status | Open |
• Java SDK 1.8;
• Java Spring Framework;
• Spring Boot (embedded Tomcat);
• REST & SOAP APIs;
• Experience working with Agile methodology (experience with JIRA);
• Experience with Git repositories;
• Ability to write unit test cases and integration test cases;
• Automated integration/regression test cases (experience with Cucumber, Groovy, or other related tools);
• Hands-on experience implementing design patterns (MVC, Factory, Abstract Factory);
• Experience with microservices;
• Service-oriented architecture (SOA) concepts;
• Knowledge of workflow orchestration tools;
• Some DevOps experience/exposure (Chef, Bamboo);
• Oracle stored procedures in PL/SQL (a relatively important skill).
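Of the design patterns named above, Factory is the easiest to sketch: a single creation function hides which concrete class the caller gets. The posting's stack is Java, so this stdlib-only Python version (with invented exporter classes) is illustrative only.

```python
import json

class CsvExporter:
    def export(self, rows):
        # Each row becomes one comma-separated line.
        return "\n".join(",".join(map(str, r)) for r in rows)

class JsonExporter:
    def export(self, rows):
        return json.dumps(rows)

def exporter_factory(fmt: str):
    """Factory: choose a concrete exporter from a format name,
    so callers never name the concrete classes."""
    exporters = {"csv": CsvExporter, "json": JsonExporter}
    try:
        return exporters[fmt]()
    except KeyError:
        raise ValueError(f"unknown format: {fmt}")

text = exporter_factory("csv").export([[1, 2], [3, 4]])
```

Adding a new format means registering one more class in the factory; no calling code changes, which is the point of the pattern.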
Master's degree in Computer Science, Information Systems/Engineering/related in lieu of a Bachelor's. Candidates should have 7 years' experience and may have to take up assignments anywhere within the US.
Job Features
Posted On | 02/21/2020 |
Status | Open |
IT development and support activities for Big Data, Hadoop, and DevOps applications. Oversee the overall planning, direction, coordination, execution, control, and completion of assigned projects, reporting to senior management. Help define the project's scope, goals, deliverables, and integration, development, and delivery, with skill sets including but not limited to big data (MapR, Spark, Scala), REST APIs, HBase, and an event engine (scheduler). Develop detailed work plans, including work breakdown structures, project milestones, risk assessment/management plans, staffing needs, and project timelines. Identify, schedule, and assign project activities, tasks, and milestones. Coordinate all related facility work with the Facilities department and ensure work is performed on schedule. Environment: MS Excel, Word, Visio, (MapR, Spark, Scala) REST API, Jenkins, Hive, HBase, Python, event engine (job scheduler), Lifetime Invest.
Master's degree in Computer Science, Information Systems/Engineering/related in lieu of a Bachelor's. May have to take up assignments anywhere within the US.
Job Features
Posted On | 02/20/2020 |
Status | Open |