August 16
🏢 In-office - Manhattan
• Creating and maintaining optimal data pipeline architecture
• Designing and implementing data pipelines required in the data warehouse and data lake, in batch or real time, using data transformation technologies
• Identifying, designing, and implementing internal process improvements: automating manual processes, optimizing data delivery, and re-designing infrastructure for greater scalability
• Designing and deploying data models and views over large datasets that meet functional and non-functional business requirements
• Delivering data integration solutions to downstream marketing and campaign software
• Delivering quality production-ready code in an agile environment
• Delivering test plans, monitoring, debugging, and technical documents as part of the development cycle
• Creating data tools for analytics, working with stakeholders across all departments to assist with data-related technical issues, and supporting their data infrastructure needs
• Advanced experience writing Python scripts
• Advanced working SQL knowledge and experience with relational databases
• Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management
• Proficiency in understanding complex ETL processes
• Demonstrated ability to optimize processes (RAM vs. I/O trade-offs)
• Knowledge of data integrity and relational rules
• Understanding of AWS and Google Cloud
• Ability to quickly learn new technologies (critical)
• Proficiency with agile or lean development practices
• An exciting and fun environment committed to driving real growth
• Opportunities to build really cool products that fans love
• Mentorship and professional development resources to help you refine your game
• Be well, save well, and live well - with FanDuel Total Rewards, your benefits are one highlight reel after another