CURO Financial Technologies is one of the largest, fastest-growing providers of short-term loans and financial services in the United States and Canada. Our licensed, direct-lending products and heightened focus on customer service are at the core of what we offer.
We have an upbeat, friendly and fast-paced environment. Our employees are excited to be a part of the CURO family, as evidenced by our low turnover rates and energized company culture. We've consistently grown well ahead of other short-term loan lenders and are primed for continued growth and enduring success.
Come and work for a Fintech company that has distinguished itself from competitors with quality product offerings, genuine customer service, robust operating systems, a state-of-the-art call center, and a track record of new product innovation!
US Remote
We are looking for a Data-Ops Engineer with a passion for data and for automation across the data life cycle to join our Data Engineering team. As a member of this team, you will play a key role in bringing Data-Ops methodology to our data engineering and analytics initiatives.
What you'll be working on:
You'll be working with internal stakeholders, data and analytics teams, external partners, and other technologists across the business. This position requires a passionate data technologist – if you're someone with expertise in cloud, dev/data ops, and operational processes and can maintain the fine balance of business acumen and deep technical knowledge, you've come to the right place. Here you can create the extraordinary!
What you will be doing:
- Help establish Data-Ops platforms, processes, and frameworks to improve data quality and cycle times
- Design, develop, deploy, and support new and existing ETL processes, employing industry standards and best practices to enhance the loading of data between source and target systems
- Design, build, and maintain highly efficient and reliable data pipelines to move data across several platforms, including applications, databases, data warehouses, and BI tools
- Implement continuous integration and delivery of data pipelines
- Design and implement automated test frameworks for testing data pipelines
- Design and implement data quality check frameworks for data quality assurance
- Design and implement alerting & monitoring systems for overall data application stack
- Provide operational support to satisfy internal reporting/data requests
- Contribute to all phases of the development life cycle; collaborate with system architects on application infrastructure
- Develop reliable and scalable systems used for monitoring/alerting and access management of production systems
- Identify and address design, development, and delivery performance bottlenecks in pre-production/development environments, continually improving applications
- Write documentation for both internal and external consumers, covering design artifacts, code, and fixes
- Administer AWS services such as IAM roles and policies, and AD integration
- Safeguard confidential company data through appropriate IT security measures
What you should have:
- Experience working with cloud MPP analytics platforms (Snowflake, AWS Redshift, Azure Data Warehouse, and similar)
- Proven track record of architecting large-scale data pipelines and CI/CD tooling, with experience in technologies like Airflow, dbt, Docker, Terraform, and similar
- 3+ years of experience in data ingestion automation at a large organization
- Professional working proficiency in SQL
- Experience leveraging Python for data pipeline orchestration and file parsing; JSON manipulation using dictionaries and experience working with APIs are great to have
- Experience working in cloud environments – GCP, AWS, Azure
- Excellent understanding of data concepts, data architecture, data manipulation/engineering and data engineering design
- Git or Cloud Source Repository experience.
- Good communication skills: communicate ideas clearly and effectively to other members of the analytics team and to the client at multiple levels (both technical and business)
- Experience implementing Data-Ops tools and software engineering best practices, e.g. Gitflow
- Experience building data pipeline monitoring and alerting tools that provide real-time visibility into scalability and performance metrics
- Hands-on experience building relational and/or dimensional conceptual/logical data models, transforming conceptual/logical data models into physical models and data artifacts, and developing data architectures using appropriate applications and tools
CURO Financial Technologies Corp Supports Equal Employment Opportunity. CURO Financial Technologies Corp (dba Speedy Cash, Rapid Cash, Cash Money, LendDirect, Avío Credit, Opt+ and Revolve Finance) is committed to a policy of providing equal employment opportunity to all qualified employees and applicants. This commitment is reflected in all aspects of our daily operations. We do not discriminate on the basis of race, color, sex, religion, national origin, marital status, age, disability, veteran status, or genetic information in any personnel practice, including recruitment, hiring, training, compensation, promotion, and discipline. Additionally, we do not discriminate based on any other characteristic protected by applicable state or local law where a particular employee works. In addition, it is the policy of CURO Financial Technologies Corp to provide reasonable accommodation to qualified employees who have protected disabilities to the extent required by federal law and any state law where a particular employee works.
This employer participates in E-Verify.
NOTICE: Please upload your resume in .pdf or .doc format.