Amach is an industry-leading, technology-driven company with headquarters in Dublin and remote teams across the UK and Europe. Our blended teams of local and nearshore talent are optimised to deliver high-quality, collaborative solutions. Established in 2013, we specialise in cloud migration and development and in digital transformation, including agile software development, DevOps, automation, data and machine learning…

We are looking for a highly experienced Data Architect to design and implement cutting-edge cloud data solutions for our customer. The ideal candidate will have strong expertise in AWS Data Tools, SQLMesh, Terraform, Snowflake and Tableau, alongside a proven ability to design scalable data infrastructures that support advanced analytics and reporting. You will provide technical leadership, guide the engineering team and collaborate with stakeholders to ensure data strategies align with long-term business objectives. Strong skills in data security, Agile methodologies and translating complex technical concepts into business language are essential for success in this role.

Please note: the successful candidate is expected to work from our customer's office in Warrington from time to time.

Required skills:
- Experience in designing and implementing leading-edge on-premises and cloud data solutions
- Experience working closely with the client and the delivery team to develop strategies and roadmaps that deliver client needs and requirements
- Designing architecture solutions that are in line with long-term business objectives
- Experience around data security
- Excellent knowledge of AWS Data Tools, SQLMesh, Terraform, Snowflake and Tableau
- Designing a data infrastructure that supports complex data analytics, reporting and visualisation services
- Providing technical leadership and direction to the engineering teams
- Building effective relationships with senior technical staff so that there is a common understanding of goals and challenges
- Meeting with clients or executive team members to engage in architectural and requirement analysis discussions
- Creating documentation and diagrams that show key data entities, and creating an inventory of the data needed to implement solutions
- Helping to maintain the integrity and security of data assets
- Relevant third-level qualification with a strong technical focus
- Excellent knowledge and proven experience of working with IT software development lifecycle methodologies, with a particular focus on Agile as the de facto methodology
- Experience working with Agile teams
- Experience working with third-party suppliers in the delivery of business or IT change initiatives, including experience of working with remote and co-located teams and vendors
- Experience leading projects based on legacy technologies in an organisation
- A strong understanding of best practices, tools and techniques for delivery management, with the ability to continuously improve these processes in an agile delivery organisation
- Ability to translate technical language into business terms, and sometimes vice versa

Key responsibilities & duties include:
- Assembling large, complex sets of data that meet non-functional and functional business requirements
- Identifying, designing and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimising data delivery and automating manual processes
- Translating business requirements into technical specifications, including data streams, integrations, transformations, databases and data warehouses
- Defining the data architecture framework, standards and principles, including modelling, metadata, security, reference data and master data
- Defining reference architecture: a pattern others can follow to create and improve data systems
- Defining data flows, i.e. which parts of the organisation generate data, which require data to function, how data flows are managed and how data changes in transition
- Collaborating and coordinating with multiple departments, stakeholders, partners and external vendors

Desirable Skills:
- Experience with large enterprise data warehouses
- Ability to build and optimise data sets, 'big data' pipelines and architectures
- Ability to perform root cause analysis on external and internal processes and data to identify opportunities for improvement and answer questions
- Excellent analytical skills for working with unstructured datasets
- Ability to build processes that support data transformation, workload management, data structures, dependencies and metadata
- Knowledge of ODBC and Java
- Experience with data warehousing, cubes and emerging EPP/MPP data designs
- Experience with Snowflake and AWS data systems preferred
- AWS Cloud Practitioner, Big Data Specialist, Tableau Professional or other similar certifications desired
- Ability to act as an influencer, helping the existing team grow into modern modelling and reporting methodologies
- Data security