Senior QA Automation Engineer

Brady
Edinburgh
2 years ago
Applications closed


Senior QA Automation Engineer

Edinburgh (hybrid)

We have a truly exciting opportunity for a Senior QA Automation Engineer to be part of an innovative software engineering team developing Brady's cloud-native trading solution for the power and energy markets. With the energy revolution under way and decarbonisation driving reliance on intermittent renewable energy sources, companies must have the right tools to thrive in this green transition. Building upon Brady's unrivalled heritage in developing software for the European physical power trading markets, Brady is responding to this market shift by being first to market in creating a truly holistic short-term power trading solution called PowerDesk.

With some flagship customers already on board, we have exciting plans for PowerDesk, including algorithmic trading capabilities with machine learning to be developed later this year. The Senior QA Automation Engineer will be responsible for performing quality assurance and test activities. This will involve developing and executing manual and automated tests across our product suite, which includes desktop and web applications. You will be an effective communicator who is comfortable discussing issues and ideas within the business and on site with clients. You are an enthusiastic tester who is driven by continuous improvement and focused on helping the team deliver quality products.

The types of tech skills we're looking for:

- Languages: JavaScript, TypeScript, C#
- Frameworks: Cypress, Playwright
- Testing tools: K6, Gatling
- Scripting: PowerShell, Bash, Python, Azure CLI
- Monitoring: Azure Monitor (App Insights)
- Reporting: PowerBI, Jupyter Notebooks, or similar
- Databases: NoSQL
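To give a flavour of the scripting and reporting side of the role, here is a minimal sketch in JavaScript of a helper that aggregates automated test results into a pass/fail summary of the kind that might feed a PowerBI or notebook report. The result shape and suite names are hypothetical examples, not an actual Brady schema:

```javascript
// Minimal sketch: summarise automated test results for reporting.
// The result objects and suite names below are hypothetical examples.
function summariseResults(results) {
  const summary = { total: 0, passed: 0, failed: 0, bySuite: {} };
  for (const r of results) {
    summary.total += 1;
    const bucket =
      summary.bySuite[r.suite] ??
      (summary.bySuite[r.suite] = { passed: 0, failed: 0 });
    if (r.status === "passed") {
      summary.passed += 1;
      bucket.passed += 1;
    } else {
      summary.failed += 1;
      bucket.failed += 1;
    }
  }
  summary.passRate = summary.total ? summary.passed / summary.total : 0;
  return summary;
}

const example = [
  { suite: "web", name: "login works", status: "passed" },
  { suite: "web", name: "order ticket opens", status: "failed" },
  { suite: "api", name: "GET /trades returns 200", status: "passed" },
];
console.log(summariseResults(example));
```

In practice a script like this would read a test runner's JSON output (e.g. from Cypress or Playwright) rather than a hard-coded array.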

Along with the technical skills, you'll likely have experience with the following:

- Testing event-driven architectures (data streaming)
- Data generation for load and performance testing
- Testing Azure cloud-native systems that use Azure PaaS offerings (Cosmos DB, Service Bus, API Management, Azure Monitor, Storage Accounts)
- WebSockets and REST APIs
- Security testing: data segregation, roles and permissions
- Setting up a testing pipeline from scratch (CI/CD, regression, load, performance, security testing)
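Data generation for load and performance testing usually means producing large volumes of realistic but synthetic payloads, reproducibly. Below is a minimal JavaScript sketch using a seeded pseudo-random generator so that every run produces the same data; the trade fields (market, side, volume, price) are hypothetical illustrations, not an actual trading schema:

```javascript
// Deterministic synthetic-data generator for load/performance testing.
// A simple linear congruential generator (LCG) keeps runs reproducible;
// the trade fields are hypothetical, not a real trading schema.
function makeRng(seed) {
  let state = seed >>> 0;
  return () => {
    state = (state * 1664525 + 1013904223) >>> 0;
    return state / 2 ** 32; // uniform in [0, 1)
  };
}

function generateTrades(count, seed = 42) {
  const rng = makeRng(seed);
  const markets = ["DE-intraday", "NL-intraday", "UK-intraday"];
  const trades = [];
  for (let i = 0; i < count; i++) {
    trades.push({
      id: `trade-${i}`,
      market: markets[Math.floor(rng() * markets.length)],
      side: rng() < 0.5 ? "buy" : "sell",
      volumeMw: Math.round(rng() * 50 * 10) / 10, // 0-50 MW, one decimal
      price: Math.round((20 + rng() * 80) * 100) / 100, // 20-100, two decimals
    });
  }
  return trades;
}

console.log(generateTrades(3));
```

Keeping the generator seeded is the important design choice: a failed load test can then be re-run against byte-identical input, which makes performance regressions much easier to reproduce.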

Some key responsibilities:

- Liaise with internal teams (Product Management, Analysts, etc.) to understand requirements and develop testable acceptance criteria
- Liaise with clients, as required, to understand and develop testable acceptance criteria
- Provide test estimates to support bid pricing, project costing and task planning
- Develop automation frameworks to deliver efficient and effective testing, ensuring that solutions are practical, conform to good engineering practices (e.g. SOLID) and are readily adoptable, supportable and extendable by others in the team
- Design, develop and execute automated tests using approved tools and frameworks
- Derive and design test cases following approved development testing standards and guidelines
- Design, develop and execute functional and non-functional tests (automated and manual as required)
- Peer review QA and Test team work
- Prioritise workload to meet agreed commitments
- Review SDLC processes and recommend improvements
- Ensure approved development procedures are followed across the SDLC
- Capture, record and document bugs, allowing development teams to readily reproduce issues
- Provide timely feedback to line management as required
- Mentor less experienced staff in all aspects of testing (automated and manual)
- Collaborate successfully across cross-functional teams to improve processes and product quality
- Establish baselines of tests required for regression testing
- Identify test coverage across systems and carry out remedial work as required to fix gaps
- Identify areas of any system that would benefit from automated test investment

What Brady offers:

- Great compensation + 8% pension + 5% bonus + private health insurance and more!
- 23 days' holiday + bank holidays, increasing by one day per year of service up to 28 days + bank holidays
- Half-day off on Christmas Eve & New Year's Eve
- Pluralsight licenses for engineering team members
- Flexible working hours
- An opportunity to build a modern technology platform for the power and energy trading markets
- A positive, values-driven culture

