OGNet: Towards a Global Oil and Gas Infrastructure Database using Deep Learning on Remotely Sensed Imagery (Papers Track) Spotlight

Hao Sheng (Stanford University); Jeremy A Irvin (Stanford University); Sasankh Munukutla (Stanford University); Shawn Zhang (Stanford University); Christopher Cross (Stanford University); Zutao Yang (Stanford University); Kyle Story (Descartes Labs); Rose Rustowicz (Descartes Labs); Cooper Elsworth (Descartes Labs); Mark Omara (Environmental Defense Fund); Ritesh Gautam (Environmental Defense Fund); Rob Jackson (Stanford University); Andrew Ng (Stanford University)

Paper PDF | Slides PDF | Recorded Talk | Cite

Topics: Power & Energy | Computer Vision & Remote Sensing

Abstract

At least a quarter of the warming that the Earth is experiencing today is due to anthropogenic methane emissions. There are multiple satellites in orbit and planned for launch in the next few years which can detect and quantify these emissions; however, to attribute methane emissions to their sources on the ground, a comprehensive database of the locations and characteristics of emission sources worldwide is essential. In this work, we develop deep learning algorithms that leverage freely available high-resolution aerial imagery to automatically detect oil and gas infrastructure, one of the largest contributors to global methane emissions. We use the best algorithm, which we call OGNet, together with expert review to identify the locations of oil refineries and petroleum terminals in the U.S. We show that OGNet detects many facilities which are not present in four standard public datasets of oil and gas infrastructure. All detected facilities are associated with characteristics critical to quantifying and attributing methane emissions, including the types of infrastructure and number of storage tanks. The data curated and produced in this study is freely available at https://link/provided/in/camera/ready/version.
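The detection approach described in the abstract, applying a deep learning classifier to high-resolution aerial imagery to flag oil and gas facilities, can be sketched as a tiling-and-scoring loop. The tile size, stride, threshold, and the stub classifier below are illustrative assumptions for exposition, not the paper's actual OGNet configuration or model.

```python
from typing import Callable, List, Tuple

def generate_tiles(width: int, height: int, tile: int, stride: int) -> List[Tuple[int, int]]:
    """Return top-left (x, y) coordinates of fixed-size tiles covering a scene."""
    coords = []
    for y in range(0, height - tile + 1, stride):
        for x in range(0, width - tile + 1, stride):
            coords.append((x, y))
    return coords

def detect_facilities(
    scene_size: Tuple[int, int],
    classifier: Callable[[int, int], float],
    tile: int = 512,       # assumed tile size in pixels
    stride: int = 512,     # assumed non-overlapping stride
    threshold: float = 0.5,
) -> List[Tuple[Tuple[int, int], float]]:
    """Score each tile with a classifier and keep tiles above the threshold.

    In the paper's pipeline, candidate detections like these would then go
    to expert review before being added to the facility database.
    """
    width, height = scene_size
    detections = []
    for x, y in generate_tiles(width, height, tile, stride):
        score = classifier(x, y)  # stand-in for a CNN returning P(facility)
        if score >= threshold:
            detections.append(((x, y), score))
    return detections

# Toy stand-in for a trained model: pretend a refinery sits in one tile.
def toy_classifier(x: int, y: int) -> float:
    return 1.0 if (x, y) == (1024, 512) else 0.1

hits = detect_facilities((2048, 1024), toy_classifier)
# One tile exceeds the threshold: [((1024, 512), 1.0)]
```

The stride here makes tiles non-overlapping for simplicity; an overlapping stride would trade compute for fewer facilities split across tile boundaries.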
