Senior Software Engineer, Google Cloud Dataproc

Overview
Google Cloud's software engineers develop the next-generation technologies that change how billions of users connect, explore, and interact with information and one another. We're looking for engineers who bring fresh ideas from all areas, including information retrieval, distributed computing, large-scale system design, networking and data storage, security, artificial intelligence, natural language processing, UI design, and mobile; the list goes on and is growing every day. As a software engineer, you will work on a specific project critical to Google Cloud's needs, with opportunities to switch teams and projects as you and our fast-paced business grow and evolve. You will anticipate our customers' needs and be empowered to act like an owner, take action, and innovate. We need our engineers to be versatile, display leadership qualities, and be enthusiastic about taking on new problems across the full stack as we continue to push technology forward.
Cloud Dataproc enables open source data analytics users (Apache Spark, Hadoop, Trino, Flink, etc.) to lift and modernize their workloads into the cloud. Dedicated to meeting customers where they are, Dataproc enables users to quickly provision and manage big data clusters and workloads.
Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.
Specific Skills
- 5 years of programming experience with Java.
- Experience developing with Spark, Hive, or similar analytical engines.
- 3 years of experience testing, maintaining, or launching software products, and 1 year of experience with software design and architecture.
- Experience designing, analyzing, and troubleshooting large-scale distributed systems.
Responsible For
- Write and test product or system development code.
- Participate in, or lead design reviews with peers and stakeholders to decide amongst available technologies.
- Review code developed by other developers and provide feedback to ensure best practices (e.g., style guidelines, checking code in, accuracy, testability, and efficiency).
- Enhance Apache Spark for performance, reliability, security, and monitoring, and similarly enhance lakehouse technologies such as Iceberg, Hudi, or Delta Lake for performance, security, and monitoring.
- Contribute to and adapt existing documentation or educational content based on product and program updates, as well as user feedback.
- Extend open-source technologies such as Apache Spark, Hive, and Trino to improve their debuggability, observability, and supportability.
Additional Requirements
- Master's degree or PhD in Computer Science or related technical field.
- 1 year of experience in a technical leadership role.
- Experience developing with Iceberg, Hudi, or Delta Lake.
- Experience with Database or Data Warehouse internals.
How to Apply
Interested candidates can send their resumes to career@your-domain.com mentioning "Job Title" in the subject line.
Apply Online