February 16
🏡 Remote – New York
• Create and maintain ELT/ETL processes for existing and new systems
• Collaborate with development and business teams to understand requirements and define source system data flows
• Develop and maintain ETL/ELT specifications for data integration development
• Define and deliver consistent data modeling and data architecture standards, methodologies, guidelines, and techniques
• Document, implement, and maintain the data pipeline architecture and related business processes
• Serve as a source of knowledge on industry practices and processes
• Participate in the development of enterprise standards and guidelines for data model quality and accuracy
• Audit project-level data model deliverables to ensure that practices and standards are met
• Analyze information and data requirements and understand the effects of data inconsistencies
• Identify inefficiencies in current architecture and processes, and communicate solutions in a manner that earns support from the teams involved
• Perform cost and sizing estimates for projects
• Collaborate with the project coordinator and the rest of the agile team to identify epics and stories and estimate effort
• Create and maintain data dictionary documents and table and data lineage models, and produce artifacts to support project development and communicate project information to customers
• Bachelor's degree in Computer Science or an equivalent engineering degree
• Mastery of SQL and Python
• Experience using Airflow
• Healthcare data experience
• Knowledge of clinical systems (e.g., Cerner, Epic, Meditech) and standard acute/ambulatory workflows
• AWS Cloud Practitioner certification or above (preferred)
• We are committed to building a diverse team of Datavanters
• Equal Employment Opportunity employer
• Competitive and fair compensation philosophy