Full Time, London

About us:
BeZero Carbon is a carbon ratings agency. We equip world-leading organisations with the knowledge, tools and confidence to make better climate decisions. Our aim is to scale investment in environmental markets that deliver a sustainable future. Our offices are in London, New York and Singapore. With a 170+ strong team made up of climate scientists, geospatial experts, data scientists, financial analysts and policy specialists, and global partnerships with local experts and world-leading research institutions, our ratings and risk tools can help you make risk-informed decisions on carbon projects of any type, at any stage, anywhere in the world. www.bezerocarbon.com

Job Description:
BeZero is looking for a mid-level geospatial data engineer to join our existing geospatial engineering & machine learning team. This team is part of our broader data organisation and is responsible for building geospatial data and machine learning products for our client-facing platform and internal teams. The team works closely with colleagues in our ratings, research, product, and technology teams.

You’ll be responsible for building geospatial data products and algorithms that directly affect the way our ratings and research teams analyse carbon offset projects. You’ll therefore work closely with scientists in our Geospatial and Earth Observation team and ratings scientists in our Ratings team. We process large-scale satellite imagery data sets (think of any of the public NASA and ESA missions) of different types (optical imagery, SAR and LiDAR) for most of our use cases, but also leverage raster and vector data from partnerships we have with data vendors.

Tech Stack
We have a bias towards shipping products, staying close to our internal and external customers, and taking end-to-end ownership of our infrastructure and deployments. This is a team that follows software engineering best practices closely. Our data stack includes the following technologies:
- AWS serves as our cloud infrastructure provider and Prefect as our workflow orchestration engine.
- Snowflake acts as our central data warehouse for tabular data. AWS S3 is used for our raster data, and we use PostGIS for storing and querying geospatial vector data.
- We use many technologies from the Python geospatial data stack: packages like gdal, rasterio, xarray and geopandas, and tools like STAC and zarr.
- AWS SageMaker acts as a platform for data science and research teams to develop data pipelines and machine learning models. We use Weights & Biases for model experimentation and versioning.
- GitHub Actions for CI/CD.

As a geospatial data engineer, you will be deeply embedded in our product and GEO teams. You’ll be responsible for building productionised data pipelines that handle (large) satellite-derived and other geospatial data sets, define best practices with our GEO scientists for geospatial data analysis, and play a key role in building the underlying infrastructure for our computer vision ML models.
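To give a flavour of the kind of pipeline work described above, here is a minimal, illustrative sketch (not BeZero code) of a Prefect flow that reads a raster band from S3 with rasterio and summarises it. It assumes Prefect 2.x, configured AWS credentials, and a Cloud-Optimised GeoTIFF; the bucket and file names are hypothetical placeholders.

```python
# Illustrative sketch only: a tiny Prefect flow around a rasterio read.
import numpy as np
import rasterio
from prefect import flow, task


@task
def read_band(uri: str, band: int = 1) -> np.ndarray:
    # rasterio can open s3:// URIs directly when AWS credentials are configured
    with rasterio.open(uri) as src:
        return src.read(band, masked=True)


@task
def summarise(arr: np.ndarray) -> dict:
    # Simple statistics; the mask excludes nodata pixels
    return {"mean": float(arr.mean()), "std": float(arr.std())}


@flow
def raster_summary_flow(uri: str) -> dict:
    band = read_band(uri)
    return summarise(band)


if __name__ == "__main__":
    # Hypothetical asset path; replace with a real COG on S3
    print(raster_summary_flow("s3://example-bucket/scene.tif"))
```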
Responsibilities:
You will be an individual contributor on our data team, focused on scoping and building geospatial data products to be deployed on our carbon markets platform. You will be charged with finding ways to monitor and maintain data flows, enable self-service analysis for business users, and create and improve scalable data pipelines. You will build automated data pipelines that collect and manipulate large data sets (such as optical satellite imagery, SAR and LiDAR, climate data and others).

You will be our ideal candidate if:
- You care deeply about the climate and carbon markets and are excited by solutions for decarbonising our economy.
- You have a strong level of expertise in building and shipping geospatial data pipelines and products that deliver value to users.
- You have experience with handling raster and vector data formats and geospatial SQL and Python packages.
- You have experience with using workflow orchestration tools, cloud platforms, and the Python scientific computing stack.
- You can write clean, maintainable, scalable, and robust code in Python and SQL, and are familiar with collaborative coding best practices.

Bonus points (but we’d still like to hear from you if you don’t have experience in any of these):
- You have practical knowledge of software engineering concepts and best practices, including DevOps, DataOps, and MLOps.
- You are interested in learning how to build and deploy deep learning models for computer vision.

Please know that even if you don’t have experience in all the areas above but think you could do a great job and are excited about building a great company culture, bringing transparency to the voluntary carbon market, and being part of a fast-growing team, we would love to hear from you!

What we’ll offer:
- Competitive salary and equity in a rapidly growing VC-backed start-up through share options.
- The ability to learn and develop alongside a range of sector specialists from the worlds of science, economics, business, finance and more.
- 25 days leave (with additional time off between Christmas and New Year, and for your birthday).
- A benefits package covering private medical insurance, dental, critical illness cover, income protection, life assurance, a medical cash plan and a cycle-to-work scheme (or a comparable package if you’re based overseas).
- Health and wellness cash allowance.
- Enhanced parental leave.
- Regular social events.
- Hybrid working, with at least 1 day a week at our East London office space (Liverpool Street).
- Nomad working over the summer, allowing you to work from another country.

Our interview process:
- Introduction call with the Head of Geospatial Engineering or the VP of Data (30 mins).
- 2x technical interviews with members of the data team (60-90 mins).
- Reference checks + offer.

We value diversity at BeZero Carbon. We need a team that brings different perspectives and backgrounds together to build the tools needed to make the voluntary carbon market transparent. We are therefore committed to not discriminating based on race, religion, colour, national origin, sex, sexual orientation, gender identity, marital status, veteran status, age, or disability.