Overview: Big Data Engineer
EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and Philippines), Latin America, Australia and South Africa.
EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients’ decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries.
Please visit www.exlservice.com for more information about EXL Analytics.
Role Overview
The Data Engineer will be part of the core big data technology and design team. This person will be entrusted to develop solutions and design ideas that enable the software to meet its acceptance and success criteria, and will work with architects and BAs to build data components in the Big Data environment.
Responsibilities:
As a key member of the technical team alongside Engineers, Data Scientists and Data Users, you will be expected to define and contribute at a high level to many aspects of our collaborative Agile development process:
- Software design, development, automated testing of new and existing components in an Agile, DevOps and dynamic environment
- Promoting development standards, code reviews, mentoring, knowledge sharing
- Product and feature design, scrum story writing
- Data Engineering and Management
- Product support & troubleshooting
- Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring
- Liaise with BAs to ensure that requirements are correctly interpreted and implemented, and with Testers to ensure that they understand how requirements have been implemented – so that they can be tested effectively.
- Participation in regular planning and status meetings. Input to the development process – through the involvement in Sprint reviews and retrospectives. Input into system architecture and design.
- Peer code reviews.
- 3rd line support.
What We Look For
- Master's/BA/BS degree in mathematics, engineering, computer science or related areas is preferred
- 8+ years professional software development experience and at least 4 years within Big data environments
- 4+ years of programming experience in Java, Scala, and Spark
- Proficient in SQL and relational database design.
- Agile and DevOps experience – at least 2 years
- Experienced in Java or Scala and/or Python, Unix/Linux environment on-premises and in the cloud
- Experienced in construction of robust batch and real-time data processing solutions on Hadoop
- Java development and design using Java 1.7/1.8. Advanced understanding of core features of Java and when to use them
- Experience with most of the following technologies (Apache Hadoop, Scala, Apache Spark, Spark streaming, YARN, Kafka, Hive, HBase, Presto, Python, ETL frameworks, MapReduce, SQL, RESTful services).
- Sound working knowledge of the Unix/Linux platform
- Hands-on experience building data pipelines using Hadoop components Sqoop, Hive, Pig, Spark, Spark SQL.
- Must have experience developing HiveQL and UDFs for analysing semi-structured/structured datasets
- Experience with time-series/analytics databases such as Elasticsearch, or NoSQL databases
- Experience with industry standard version control tools (Git, GitHub), automated deployment tools (Ansible & Jenkins) and requirements management in JIRA
- Exposure to Agile project methodology, as well as other methodologies such as Kanban
- Understanding of data modelling techniques using relational and non-relational techniques
- Coordination between global teams
- Experience debugging code issues and communicating the highlighted differences to the development team/architects
- Nice to have: ELK experience; knowledge of cloud computing technology such as Google Cloud Platform (GCP)
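As a flavour of the day-to-day work the qualifications above describe – developing UDFs for analysing semi-structured datasets – the sketch below shows, in plain Python with purely illustrative field names, the kind of record flattening a Hive UDF or Spark job typically performs before semi-structured JSON lands in a relational table. This is an illustrative example only, not part of the role description:

```python
import json

def flatten(record, prefix=""):
    """Recursively flatten a nested dict into dotted column names –
    the reshaping a Hive UDF or Spark transformation often applies
    to semi-structured JSON before writing it to a tabular store."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

# A semi-structured event record (field names are hypothetical)
raw = '{"user": {"id": 42, "region": "EU"}, "event": "click"}'
row = flatten(json.loads(raw))
# row now maps dotted column names to scalar values
```

In production this logic would run inside a Hive UDF or a Spark DataFrame transformation rather than standalone Python.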
What We Offer
- EXL Analytics offers an exciting, fast-paced and innovative environment that brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of the businesses that our clients engage in.
- You will also learn effective teamwork and time-management skills - key aspects for personal and professional growth
- Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques.
- We provide guidance/coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor.
- The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond.