We deliver open source to the world faster, more securely, and more cost-effectively than any other company. If you're interested in a career at Canonical, we are a remote-first company, so please apply to any suitable role: skills are valued more than location, though some roles have a preferred geographic region.
The data platform team is responsible for the automation of data platform operations, with the mission of managing and integrating Big Data platforms at scale. This includes ensuring fault-tolerant replication, TLS, installation, backups, and much more; the team also provides domain-specific expertise on the underlying data systems to other teams within Canonical. This role focuses on building and automating the infrastructure features of data platforms, not on analysing or processing the data held in them.
Collaborate proactively with a distributed team
Write high-quality, idiomatic Python code to create new features
Debug issues and interact with upstream communities publicly
Work with helpful and talented engineers, including experts in many fields
Discuss ideas and collaborate on finding good solutions
Work from home with global travel for 2 to 4 weeks per year for internal and external events
What we are looking for in you
Proven hands-on experience in software development using Python
Proven hands-on experience with distributed systems such as Kafka and Spark
A Bachelor's degree or equivalent in Computer Science or another STEM field
Willingness to travel up to 4 times a year for internal events