Realize the full value of the cloud.
IT as a Service • Multi-Cloud • Managed Hosting • Managed AWS/Azure/Google Cloud Platform/OpenStack/Alibaba • Managed Private Cloud for VMware/Microsoft/OpenStack
August 9
🏡 Remote – New York
• Develop scalable, robust code for large-scale batch processing systems using Hadoop, Oozie, Pig, Hive, MapReduce, Spark (Java), Python, and HBase
• Develop, manage, and maintain batch pipelines supporting machine learning workloads
• Leverage GCP for scalable big data processing and storage solutions
• Implement automation/DevOps best practices for CI/CD, IaC, etc.
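For context, the MapReduce pattern behind the batch systems named above can be illustrated with a minimal, dependency-free Python sketch (a word count, the canonical example; all names here are illustrative, not from the posting, and real frameworks like Hadoop or Spark distribute these phases across a cluster):

```python
from collections import Counter
from functools import reduce

def map_phase(line):
    # Map: emit a (word -> 1) count for each word in one input line
    return Counter(line.lower().split())

def reduce_phase(acc, partial):
    # Reduce: merge partial counts into the accumulated result
    acc.update(partial)
    return acc

def word_count(lines):
    # Single-process analogue of a MapReduce word-count job:
    # map over records, then reduce the intermediate results
    return reduce(reduce_phase, map(map_phase, lines), Counter())

counts = word_count(["big data batch", "batch processing at scale"])
print(counts["batch"])  # 2
```

The same map-then-merge structure underlies the Hadoop and Spark jobs this role maintains; only the execution engine and scale differ.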
• Proficiency in the Hadoop ecosystem, including MapReduce, Oozie, Hive, Pig, HBase, and Storm
• Strong programming skills in Java, Python, and Spark
• Knowledge of public cloud services, particularly GCP
• Experience applying infrastructure and DevOps principles in daily work, using tools for continuous integration and continuous deployment (CI/CD) and Infrastructure as Code (IaC), such as Terraform, to automate and improve development and release processes
• Ability to tackle complex challenges and devise effective solutions, using critical thinking to approach problems from various angles and propose innovative solutions
• Effectiveness in a remote setting, with strong written and verbal communication skills; collaborates with team members and stakeholders to ensure a clear understanding of technical requirements and project goals
• Proven experience engineering batch processing systems at scale
• Hands-on experience with public cloud platforms, particularly GCP; additional experience with other cloud technologies is advantageous
Apply Now