How do you fancy modernising the archaic process of buying a house...?
No more nipping to the Post Office to send off printed bank statements – and no more chasing solicitors every day, looking for an update!
Well, you’d be part of a team on the outskirts of Oxford, who are doing just that.
They’ve spent the last few years building their data infrastructure from scratch, leaning on a modern tech stack (no legacy to worry about) – with everything hosted on Azure.
But as they’re continuing to grow, they want to take it to the next level – in terms of reliability and stability – with a longer-term view of establishing a larger data mesh.
This is where you’d come in.
Your initial focus will be data processing in PySpark (on Databricks), managing ingestion pipelines, and DataOps/DevOps.
Their Data team is still small (currently one other Engineer) – so you’ll have a real chance to put your own stamp on things, bringing new ideas to the table and shaping their Data roadmap moving forward.
So, what are they looking for?
- Well, you’ll need to show you’re a strong coder (Python/PySpark) who always has an eye on automating processes.
- Somebody who can showcase their experience with Kubernetes, IaC and Terraform.
- It’d also be handy if you’re familiar with Databricks and have come from an Azure background.
Salary-wise, they’ll pay anywhere from £70,000 to £82,000, depending on experience.
You’d get a pretty decent package too, covering:
- A generous holiday allowance
- Private medical insurance
- Flexible hours to fit around your work/life balance
- Volunteering days to spend with a charity/cause you’re passionate about
- Share incentive schemes, plus plenty of other perks and discounts
You’ll get to work from home for the majority of the time too, heading into Oxford roughly once a month. It’s likely less often than that, but I’m telling you the worst case.
If the role looks of interest, get in touch with Jack Leeming @ Optima Dev for a chat.
You need to be UK-based. They can't offer sponsorship.