Data Engineer

August 23

🏡 Remote – New York

Apply Now

Interwell Health

Reimagining kidney care to help patients live their best lives.

501 - 1000 employees

💰 Venture Round on 2023-03

Description

• Interwell Health’s Data Engineer is responsible for corporate claims data acquisition, file loading, and structure development.
• Uses current data warehouse technologies to design, test, plan, implement, and maintain EDW databases.
• Designs and develops database systems, processes, and documentation based on the requirements of the company.
• Collaborates effectively with development groups to deliver projects to the client’s satisfaction in a timely fashion.
• Maintains current knowledge of the latest data architecture technologies for use as appropriate within the department.
• Identifies enhancement options and process improvements.
• Assists with schema design, code review, and SQL query tuning.
• Writes and deploys SQL code.
• Guides and mentors junior teammates on development processes.
• Becomes skilled with the architecture and technology supporting the data warehouse.
• Ensures knowledge of the processes being designed is shared across the team.
• Ensures sufficient documentation is produced for easy handover of projects.
• Participates in the formulation and design of methodologies.
• Trains in the development tools available within the data warehouse and lakehouse infrastructure.
• Monitors and troubleshoots data pipelines.

Requirements

• 3-5 years of related data engineering experience with a bachelor’s degree, or a master’s degree with 3 years of experience.
• Ability to architect both simple and complex solutions, including integration and design work with other applications.
• Excellent communication skills, both verbal and written.
• Strong presentation skills; able to present formally to users and customers during requirements gathering, workshops, and feedback sessions.
• Experience working under tight deadlines while maintaining high product relevance and quality.
• Strong understanding of data lake and lakehouse architecture principles, data modeling techniques, and data warehousing concepts.
• 3-5 years of experience using Azure Data Factory pipelines and Databricks.
• 5 years of experience working with Azure databases.
• Extensive experience with the MS SQL stack (SSIS and ADF).
• Extensive experience with Azure cloud migration.
• Python experience.

Benefits

• We care deeply about the people we serve.
• We are better when we work together.
• Humility is a source of our strength.
• We bring joy to our work.
• We deliver on our promises.
