Senior Data Scientist - GCP Specialization

Posted 5 days ago

🏡 Remote – New York

Nerdery

Let's build powerful digital products together.

Digital Business Transformation • Strategy • Innovation • Experience Design • Software Engineering

201 – 500 employees

Description

• Execute and lead data analysis and data modeling tasks, leveraging the Google Cloud Platform (GCP).
• Work closely with stakeholders to understand data needs and develop advanced models.
• Develop, test, validate, and deploy advanced machine learning models.
• Lead cross-functional teams to integrate data-driven solutions into existing systems.
• Conduct data exploration and preprocessing, ensuring data quality and integrity.
• Implement and optimize data pipelines and ETL processes.
• Continuously monitor model performance and make improvements as necessary.
• Provide strategic insights and recommendations based on data analysis.
• Document and communicate findings and best practices to stakeholders.
• Stay updated with advancements in data science, machine learning, generative AI, and GCP technologies.
• Interact with stakeholders including executives, product managers, data engineers, and designers.

Requirements

• 8+ years of experience in data science and machine learning, with a focus on GCP services such as BigQuery, Dataflow, Dataproc, and Cloud Storage.
• Experience with AI and ML tools on GCP, such as AI Platform, Vertex AI, and BigQuery ML.
• Experience with generative AI models, including model development, fine-tuning, and deployment.
• Proficiency in programming languages such as Python, and experience with data science libraries and frameworks such as scikit-learn, PyTorch, Keras, and TensorFlow.
• Strong understanding of statistical methods, machine learning algorithms, and data modeling techniques.
• Experience with Looker and techniques for presenting insights effectively is required (experience with other data visualization tools such as Power BI and Tableau is a plus).
• Familiarity with data pipeline orchestration tools and practices, such as Pub/Sub and Cloud Functions.
• Solid understanding of data warehousing principles and ETL processes.
• Ability to work with stakeholders to define data requirements and translate them into actionable models.
• Strong communication and teamwork abilities.
• Experience with version control systems such as Git.
• Proven ability to write functional and maintainable code based on technical requirements.
