MetaEarth: A Generative Foundation Model for Global-Scale Remote Sensing Image Generation, by Zhiping Yu and 4 other authors
Abstract: The recent advancement of generative foundation models has ushered in a new era of image generation in the realm of natural images, revolutionizing art design, entertainment, environment simulation, and beyond. Despite producing high-quality samples, existing methods are constrained to generating images of scenes at a limited scale. In this paper, we present MetaEarth, a generative foundation model that breaks this barrier by scaling image generation to a global level, exploring the creation of worldwide, multi-resolution, unbounded, and virtually limitless remote sensing images. In MetaEarth, we propose a resolution-guided self-cascading generative framework, which enables generating images of any region across a wide range of geographical resolutions. To achieve unbounded and arbitrary-sized image generation, we design a novel noise sampling strategy for denoising diffusion models by analyzing the generation conditions and the initial noise. To train MetaEarth, we construct a large dataset comprising multi-resolution optical remote sensing images with geographical information. Experiments demonstrate the powerful capabilities of our method in generating global-scale images. Additionally, MetaEarth serves as a data engine that can provide high-quality and rich training data for downstream tasks. Our model opens up new possibilities for constructing generative world models by simulating Earth visuals from an innovative overhead perspective.
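The abstract names a noise sampling strategy for unbounded, arbitrary-sized diffusion generation but does not detail it. One common way to keep tiled diffusion outputs seam-consistent is to share initial noise across the overlap regions of adjacent tiles. The NumPy sketch below illustrates that idea only; the function name and parameters are ours, not the paper's:

```python
import numpy as np

def tiled_initial_noise(num_tiles, tile, overlap, seed=0):
    """Sample initial diffusion noise for a row of horizontally adjacent
    tiles so that overlapping regions share identical noise values.

    Illustrative stand-in for a shared-noise tiling scheme; MetaEarth's
    actual noise sampling strategy is not specified in the abstract.
    """
    rng = np.random.default_rng(seed)
    stride = tile - overlap
    width = stride * num_tiles + overlap  # total canvas width in pixels
    # Draw one noise canvas covering the whole (unbounded) extent.
    canvas = rng.standard_normal((tile, width))
    # Each tile is a slice of the shared canvas, so neighbouring tiles
    # agree on the overlap columns by construction.
    return [canvas[:, i * stride : i * stride + tile]
            for i in range(num_tiles)]

tiles = tiled_initial_noise(num_tiles=3, tile=64, overlap=16)
# The right edge of tile 0 matches the left edge of tile 1.
assert np.array_equal(tiles[0][:, -16:], tiles[1][:, :16])
```

In a full pipeline, each tile's noise would be denoised by the diffusion model under its own (e.g. lower-resolution) condition; sharing the initial noise in overlaps is one ingredient that helps adjacent generations blend without visible seams.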
Submission history
From: Zhengxia Zou
[v1] Wed, 22 May 2024 12:07:47 UTC (18,587 KB)
[v2] Tue, 28 May 2024 15:27:23 UTC (18,409 KB)
[v3] Tue, 15 Oct 2024 07:42:36 UTC (28,538 KB)