Location: Work from home
Job Level: Level 4
Job Family: Technology
Closing Date: Tuesday 3 November 2020
We’re building a data analytics platform to help the NHS manage one of the richest healthcare datasets in the world. Currently housing nearly 50TB of data and able to query 1.5 billion rows per second, our platform is designed to unlock the potential of healthcare data to drive efficiency and deliver better care.
If you have a passion for doing good, we’d like you to join us to help us create new, innovative, accurate and secure solutions across this data to benefit UK healthcare.
Your new role
As our DevOps Engineer, you’ll focus on automating the rollout of applications, analytics and ML/AI workloads using CI/CD technologies like GitHub, Jenkins and AWS CodeBuild/CodePipeline on AWS. This is a cross-functional role where you’ll take ownership of existing parts of the stack or develop new projects to add to it. The technologies and solutions you own will change over time.
You’ll be responsible for containerising new workloads (applications, analytics, data pipelines) using EKS/ECS (Docker), as well as defining data-centric solutions with CloudFormation/CDK/Terraform.
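To give a flavour of the infrastructure-as-code side of the role, here’s a minimal, purely illustrative CloudFormation sketch of the kind of resource definition you’d own day to day. The resource name and settings are hypothetical examples, not part of our actual stack:

```yaml
# Illustrative only: a minimal CloudFormation template -- an encrypted,
# private S3 bucket for CI/CD pipeline artefacts.
# "ArtefactBucket" is a hypothetical name, not part of our stack.
AWSTemplateFormatVersion: "2010-09-09"
Description: Example artefact bucket for a CI/CD pipeline
Resources:
  ArtefactBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketEncryption:
        ServerSideEncryptionConfiguration:
          - ServerSideEncryptionByDefault:
              SSEAlgorithm: aws:kms
      PublicAccessBlockConfiguration:
        BlockPublicAcls: true
        BlockPublicPolicy: true
        IgnorePublicAcls: true
        RestrictPublicBuckets: true
```

In practice the same resource could equally be expressed in CDK or Terraform; choosing and standardising on one of those tools is part of the job.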
This role is vital to help us build and promote a DevOps culture within the team and the wider organisation, so you’ll need to be comfortable acting as an SME where required.
The team you’ll join
Our team is curious, driven, intelligent, pragmatic, collaborative and open-minded, but most importantly, we get things done. We’ve got Developers, Solution Architects, Technical Architects and Data Scientists, and we’re geographically spread across the UK and India.
The team mostly works from home (even in non-lockdown mode!), so this option is open to you too.
What you’ll bring
These are the skills and experience you’ll need to set yourself up for success in this role; however, we’d still love to talk to you even if you don’t tick all of them:
- You’ll have a good working knowledge of infrastructure and automation on AWS, and you’ll be able to apply your experience to deliver modern data architectures.
- You’ll have experience with containerisation using Kubernetes, EKS/ECS (Fargate and EC2) to deploy and secure production workloads
- You’ll be able to use your expertise in defining CI/CD Pipelines to build, validate, verify, deploy and promote artefacts across environments
- Any experience with Spark or other big data technologies would be a real benefit
- You’ll be a team player, a collaborator and comfortable with remote working
What we’ll give you
You’ll get a competitive salary and a benefits package which includes things like a pension, life assurance and gym discounts!
We also take our corporate social responsibility pretty seriously and believe in giving back to our community, so you’ll get the chance to be involved in lots of fundraising events throughout the year. Our latest one was a challenge to complete a walk of up to 100km in 24 hours across the Peak District!
So if this snapshot of the role looks interesting, click here to apply. If you don’t apply now, the job may vanish, but don’t panic; sign up for our job alerts and we’ll keep you updated.