Senior Data Engineer

We are a forward-thinking, growth-oriented healthcare services and technology company that provides state-of-the-art pharmacy solutions. Since 2015, we have helped millions of consumers save on their prescription drug costs, and we believe we have only scratched the surface.

We occupy a unique position in the market because we are vertically integrated. We have a PBM platform (RxAgile) that provides enterprise solutions to B2B players in the healthcare space, a direct-to-consumer product (SingleCare) with a mission to make prescription medication more affordable, and an analytics platform (RxIQ) that provides actionable insights in real time.

Primary Duties and Responsibilities:

● Build and maintain one or more data lakes to support scalable ingestion, manipulation, and reporting of data
● Manipulate data to produce and maintain new data elements using repeatable, automated processes
● Demonstrate knowledge of industry trends and of our infrastructure, technologies, tools, and systems
● Measure and communicate the value of data platforms and tools
● Display a sense of ownership over assigned work, requiring minimal direction and driving it to completion in a sometimes ambiguous and uncharted environment
● Build, operate, and maintain highly scalable and reliable data pipelines (see the sketch after this list)
● Build data warehouse solutions that provide end-to-end management and traceability of patient data and that enable and optimize internal processes and product features.
● Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and business processes that depend on it.
● Develop tools that support the use of ML and other analytical models to improve understanding of patient behavior, provider prescribing, the patient experience on treatment, treatment patterns, and more.
● Collaborate with internal stakeholders to develop business domain concepts and data modeling approaches for the analytics problems faced by the organization.
● Maintain and optimize existing data platform services and capabilities, identifying potential enhancements and performance and design improvements.
● Writes & maintain unit/integration tests, systems documentation.

Desired Skills and Experience:

● Master's or bachelor's degree in Information Systems, MIS, Statistics, or a related field, or equivalent work experience, required.
● 5+ years of overall experience in building and sustaining data engineering and big data solutions, preferably in the healthcare industry
● Extremely strong skills, with 3+ years of experience, in at least one programming or scripting language (Python, Go, Java)
● Has built and deployed large-scale batch and real-time data pipelines into production using Airflow
● Deep experience with AWS/Google Cloud big data platforms and services (Snowflake, BigQuery, S3, Google Cloud Storage buckets, Parquet/Avro/ORC, ES)
● Our current stack includes Python, Snowflake, BigQuery, AWS S3, Google Cloud Storage, Airflow, ES, and Redis (a minimal sketch using parts of this stack follows this list)
● Overlapping experience with this stack is preferred and a plus
● 3+ years of experience in building enterprise data solutions using industry-standard guiding principles and practices
● 2+ years of working knowledge of relational and non-relational databases
● 1+ years of experience in a data engineering model that follows DevOps principles and standards for CI/CD processes
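As a hypothetical illustration of working across the stack above, the sketch below loads a Parquet extract from S3 into a Snowflake staging table with Python. The bucket, table, credential, warehouse, and schema names are all assumptions made for this example; reading s3:// paths with pandas additionally assumes s3fs is installed, and the target table is assumed to already exist.

```python
# Hypothetical example only: all names and credentials are placeholders.
import os

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Read a columnar extract directly from S3 (placeholder path).
df = pd.read_parquet("s3://example-bucket/claims/curated/2024-01-01.parquet")

# Connect to Snowflake using environment variables rather than hard-coded secrets.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",  # placeholder warehouse
    database="ANALYTICS",      # placeholder database
    schema="STAGING",          # placeholder schema
)

# Bulk-load the DataFrame into an existing staging table.
success, _, nrows, _ = write_pandas(conn, df, "CLAIMS_STAGING")
print(f"loaded {nrows} rows, success={success}")
conn.close()
```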

Organizational Skills Required

● Ability to multitask, prioritize assignments, and work well under deadlines in a changing environment with cross-functional agile teams.
● Strong communication skills for working with stakeholders from various backgrounds

Location: This is a remote-first role; however, where the client mandates working from the office, the candidate will need to relocate.

Please Submit Your Resume

● Name*
● Email*
● Phone Number*
● Resume* (only PDF, DOCX, or text files are accepted)