Business Unit Description
In today’s fast-evolving technology world, one aspect remains constant: reliance on data to drive the next wave of innovation. Strategic Analytics is Samsung’s Center of Excellence for driving the adoption of data-driven decision making and product development across the company. The team’s core focus is developing best-in-class solutions that give Samsung’s marketing and service organizations a 360-degree view of Samsung’s customers.
Strategic Analytics is powering a paradigm shift at Samsung and across the global industry. We are looking for highly technical team members who are passionate about data, have the rigor needed to solve billion-dollar problems, and possess an innate entrepreneurial spirit to explore the uncharted. Strategic Analytics combines the engineering backbone of a best-in-class Big Data Platform with the analytic expertise of advanced mining and predictive modeling.
If you want to work among the very best talent in the industry, on the most innovative products in the world, Samsung is the place to be.
Role and Responsibilities
Position Summary
Big Data Engineers serve as the backbone of the Strategic Analytics organization, ensuring both the reliability and the applicability of the team’s data products across Samsung. They have extensive experience with ETL design, coding, and testing patterns, as well as with engineering software platforms and large-scale data infrastructures. Big Data Engineers can architect highly scalable end-to-end pipelines using a variety of open-source tools, including building and operationalizing high-performance algorithms.
Big Data Engineers understand how to apply technologies to solve big data problems, with expert knowledge of programming languages and tools such as Java, Python, PHP, Hive, Impala, and Spark, as well as Linux. Extensive experience with both 1) big data platforms and 2) real-time/streaming delivery of data is essential.
Big Data Engineers implement complex big data projects with a focus on collecting, parsing, managing, analyzing, and visualizing large data sets to turn information into actionable deliverables across customer-facing platforms. They have a strong aptitude for selecting the appropriate hardware and software designs and can guide the development of those designs through both proofs of concept and complete implementations.
Additional qualifications include:
- Tuning Hadoop solutions to improve performance and the end-user experience
- Proficiency in designing efficient and robust data workflows
- Documenting requirements and resolving conflicts or ambiguities
- Experience working in teams and collaborating with others to clarify requirements
- Strong coordination and project management skills for handling complex projects
- Excellent oral and written communication skills
Skills and Qualifications
Responsibilities include:
- Translating complex functional and technical requirements into detailed designs
- Designing for both current needs and future growth
- Hadoop technical development and implementation
- Loading data from disparate data sets by leveraging big data technologies such as Kafka
- Pre-processing data using Hive, Impala, Spark, and Pig
- Designing and implementing data models
- Maintaining security and data privacy in an environment secured with Kerberos and LDAP
- High-speed querying using in-memory technologies such as Spark
- Following and contributing to engineering best practices for source control, release management, deployment, etc.
- Production support, including job scheduling/monitoring, ETL data quality, and data-freshness reporting
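As a small illustration only (not part of the role description), the ETL data-quality and data-freshness checks mentioned above can be sketched in plain Python. All names here (`validate_batch`, the record fields) are hypothetical, chosen just to show the shape of such a check:

```python
from datetime import datetime, timedelta, timezone

def validate_batch(records, max_age_hours=24):
    """Basic ETL data-quality and freshness check for a batch of records.

    Returns (clean_records, report), where the report counts rejected
    rows and flags the batch as stale if its newest record is too old.
    """
    clean, rejected = [], 0
    newest = None
    for rec in records:
        # A simple quality rule: reject rows missing required fields.
        if rec.get("id") is None or rec.get("event_time") is None:
            rejected += 1
            continue
        ts = datetime.fromisoformat(rec["event_time"])
        newest = ts if newest is None or ts > newest else newest
        clean.append(rec)
    stale = (
        newest is None
        or datetime.now(timezone.utc) - newest > timedelta(hours=max_age_hours)
    )
    report = {"accepted": len(clean), "rejected": rejected, "stale": stale}
    return clean, report
```

In production such a check would typically run inside a scheduled job (e.g. under Oozie) and feed a freshness report rather than return a dict.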
Skills Required:
- 5-8 years of experience writing complex SQL queries.
- 3+ years of Python development experience
- 3+ years of demonstrated technical proficiency with Hadoop and big data projects
- 3+ years of demonstrated experience and success in data modeling.
- Ability to automate and implement ETLs to reduce manual work.
- Fluency in writing shell scripts (Bash, Korn)
- Writing high-performance, reliable and maintainable code.
- Familiarity with and implementation knowledge of loading data using Sqoop.
- Knowledge of Oozie and the ability to implement workflows and schedules within it
- Experience working with AWS components (EC2, S3, SNS, SQS)
- Analytical and problem-solving skills applied to the big data domain
- Proven understanding of and hands-on experience with Hadoop, Hive, Pig, Impala, and Spark
- Strong grasp of multi-threading and concurrency concepts
- B.S. or M.S. in Computer Science or Engineering.
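As a toy illustration (not part of the posting) of the kind of analytical SQL the "complex SQL queries" requirement refers to, here is a windowed running total using Python's built-in sqlite3 module; the table and data are made up:

```python
import sqlite3

# In-memory database with a tiny, hypothetical events table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, ts TEXT, amount REAL);
    INSERT INTO events VALUES
        (1, '2024-01-01', 10.0),
        (1, '2024-01-02', 20.0),
        (2, '2024-01-01', 5.0);
""")

# A window function computes a per-user running total --
# the sort of analytical query the role involves daily.
rows = conn.execute("""
    SELECT user_id, ts, amount,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM events
    ORDER BY user_id, ts
""").fetchall()

for row in rows:
    print(row)  # (user_id, ts, amount, running_total)
```

The same pattern scales up directly in Hive, Impala, or Spark SQL, which support the same `OVER (PARTITION BY ... ORDER BY ...)` window syntax.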
Samsung Electronics America, Inc. and its subsidiaries are committed to employing a diverse workforce, and provide Equal Employment Opportunity for all individuals regardless of race, color, religion, gender, age, national origin, marital status, sexual orientation, gender identity, status as a protected veteran, genetic information, status as a qualified individual with a disability, or any other characteristic protected by law.
COVID-19 Vaccine Mandate
In order to comply with the federal vaccine mandate, Samsung Electronics America requires all employees to be fully vaccinated against COVID-19, unless a medical or religious exemption, or an exemption required under state/local law, is approved. Offers of employment are contingent upon proof that a candidate is fully vaccinated or qualifies for an exemption. More details on how to apply for an exemption are provided after the application process is complete.
Reasonable Accommodations for Qualified Individuals with Disabilities During the Application Process
Samsung Electronics America is committed to providing reasonable accommodations for qualified individuals with disabilities in our job application process. If you have a disability and require a reasonable accommodation in order to participate in the application process, please contact our Reasonable Accommodation Team (855-557-3247) or [email protected] for assistance. This number is for accommodation requests only and is not intended for general employment inquiries.