Job Description
I’m supporting a Google Cloud Premier Partner consultancy in AI that empowers world-class businesses with cutting-edge data solutions in the cloud. They have won Google Cloud Partner of the Year on several occasions. By blending their expertise in machine learning, data engineering, and analytics, they help clients push the boundaries of technology. Using GCP as the foundation, they deliver future-proofed solutions that enhance consumer insights, improve competitive advantage, and optimise operational efficiency.
The Role
As a Principal Cloud Architect, you will be pivotal in shaping the architectural direction of key projects. You’ll collaborate with clients to provide pre- and post-sales architectural guidance and thought leadership for machine learning, analytics, and data migration projects. Additionally, you will help define the architectural framework for next-generation machine learning products.
This is a unique opportunity for an experienced cloud professional to work as part of a team of AI and data experts. You will act as an SME in cloud and solution architecture, working closely with both business and technical stakeholders. The role requires a strong background in Computer Science and experience designing and implementing complex cloud platforms, particularly on Google Cloud or AWS.
To excel in this role, you will need:
- A BSc or MSc in Computer Science or a related field.
- Proven experience designing and implementing large-scale cloud architectures.
- Strong analytical and technical capabilities, with an innovative mindset.
- Excellent communication skills, with the ability to present concepts clearly to a variety of audiences.
- Experience in end-to-end production-grade cloud technologies, including areas such as data, security, and networking.
- Proficiency in programming languages such as Python, Java, and SQL, with the ability to write scalable, high-performing code.
- Experience in technical pre-sales, including crafting and presenting innovative proposals to clients.
Desirable Skills
- Google Cloud or AWS Solutions Architect certification.
- Familiarity with ETL tools, Hadoop-based technologies (e.g., Spark), and data pipelines (e.g., Beam, Flink).
- Experience designing data lake and data warehouse solutions (e.g., BigQuery, Azure Synapse, Redshift).
- Hands-on experience with visualisation tools (e.g., Looker, Tableau, PowerBI).
- Understanding of Agile methodologies such as Scrum.
- Knowledge of data science concepts like machine learning, data mining, and visualisation.
- Contributions to open-source projects.