Senior ETL Pipeline Engineer Job at Caris Life Sciences, Phoenix, AZ

  • Caris Life Sciences
  • Phoenix, AZ

Job Description

**At Caris, we understand that cancer is an ugly word, a word no one wants to hear, but one that connects us all. That's why we're not just transforming cancer care; we're changing lives.** We introduced precision medicine to the world and built an industry around the idea that every patient deserves answers as unique as their DNA. Backed by cutting-edge molecular science and AI, we ask ourselves every day: _"What would I do if this patient were my mom?"_ That question drives everything we do. But our mission doesn't stop with cancer. We're pushing the frontiers of medicine and leading a revolution in healthcare, driven by innovation, compassion, and purpose.

**Join us in our mission to improve the human condition across multiple diseases. If you're passionate about meaningful work and want to be part of something bigger than yourself, Caris is where your impact begins.**

**Position Summary**

Caris Life Sciences is seeking a **Senior ETL Pipeline Engineer** to design and maintain scalable, production-grade data pipelines that support our molecular data platforms. You will report to the Senior Manager of Data Engineering, who has a strong background in hands-on data architecture and software engineering. This high-impact role requires deep expertise in object-oriented Python, pandas, and modern data engineering practices, along with strong attention to detail. You will collaborate with a focused, skilled team to enhance infrastructure, automate workflows, and ensure data reliability and reproducibility. The pipelines you build will directly enable scientific research and clinical insights, contributing to advancements in precision medicine. We value thoughtful engineering, clean design, and a proactive approach to continuous improvement.

**Job Responsibilities**

+ Design, implement, maintain, and improve modular, production-grade ETL pipelines for molecular and clinical data.
+ Build and extend infrastructure to support new data types, scalable processing, and robust pipeline lifecycle management.
+ Develop and maintain data ingestion frameworks that ensure fault tolerance, reproducibility, and data integrity.
+ Improve automated testing environments, data validation, and monitoring for accuracy, consistency, and uptime.
+ Write clean, maintainable Python code using pandas and object-oriented principles within a shared modular codebase.
+ Operate and improve workflows in AWS, including integration with S3, Step Functions, Glue, and Athena.
+ Use SQL to interact with upstream relational data and optimize queries for performance.
+ Troubleshoot and improve data pipeline logging and reliability.
+ Collaborate with data scientists, bioinformaticians, and software engineers to understand data requirements and deliver robust solutions.
+ Follow and advocate for best practices in code quality, version control, testing, and documentation.
+ Contribute to internal technical discussions and help drive continuous improvements across systems.
+ Mentor junior engineers and contribute to team knowledge sharing.

**Required Qualifications**

+ Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent professional experience.
+ 6+ years in data engineering or backend software development, with end-to-end ownership of production ETL systems.
+ Advanced proficiency in Python for data processing, including object-oriented design and pandas.
+ Solid SQL skills and experience working with structured relational data and query optimization.
+ Experience with workflow orchestration tools such as Metaflow, Airflow, or Prefect.
+ Experience deploying and operating pipelines in cloud environments, especially AWS.
+ Strong grasp of software engineering fundamentals: testing, version control, code reviews, CI/CD, and documentation.
+ Familiarity with containerization and deployment tools (e.g., Docker, GitHub Actions, Jenkins).
+ Ability to work independently, prioritize effectively, and deliver high-quality results with minimal oversight.
+ Excellent problem-solving skills and attention to detail.

**Preferred Qualifications**

+ Experience working with very large datasets (terabytes or more) and distributed processing.
+ Familiarity with healthcare, life sciences, or biomedical data, including regulatory and privacy considerations (e.g., HIPAA).
+ Exposure to tools such as AWS Glue, Athena, Redshift, or other cloud-native data services.
+ Understanding of data lineage, reproducibility, and scientific computing best practices.
+ Experience with infrastructure-as-code tools (e.g., Terraform, AWS CDK) and cloud resource provisioning.
+ Knowledge of data governance, cataloging, and metadata management tools (e.g., Amundsen, DataHub).
+ Experience with data versioning and reproducibility tools (e.g., DVC, LakeFS).
+ Familiarity with RESTful APIs and microservices architecture.
+ Exposure to machine learning pipelines or integration with ML platforms is a plus.

**Physical Demands**

+ Will work at a computer most of the time.

**Training**

+ All job-specific, safety, and compliance training is assigned based on the job functions associated with this role.

**Conditions of Employment**

Individuals must successfully complete the pre-employment process, which includes a criminal background check, drug screening, credit check (applicable for certain positions), and reference verification.

This job description reflects management's assignment of essential functions. Nothing in this job description restricts management's right to assign or reassign duties and responsibilities to this job at any time.

Caris Life Sciences is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, gender, gender identity, sexual orientation, age, status as a protected veteran, or status as a qualified individual with disability, among other things.

Caris Life Sciences is a leading innovator in molecular science and artificial intelligence focused on fulfilling the promise of precision medicine through quality and innovation. Caris is committed to quality and excellence at our state-of-the-art laboratories. Learn more about our tissue lab and the advanced technologies that are helping improve the lives of cancer patients.
