Remote Sensing Visual Generative Models
A collection of diffusers implementations (24 items).
We have not fully validated the checkpoint conversion. If you encounter pipeline loading failures or unexpected outputs, please contact me at bili_sakura@zju.edu.cn.
A ControlNet model conditioned on OpenStreetMap (OSM) renderings to generate the corresponding satellite images.
Trained on the Central Belt region.
This repo is self-contained: it includes the full Stable Diffusion pipeline components as well as the ControlNet weights (under `controlnet/`).
The model was trained on the WorldImagery Clarity dataset. The code for constructing the dataset is available at https://github.com/miquel-espinosa/map-sat.
```bash
# From the repo root
python inference_demo.py
```
Or load programmatically:
```python
import torch
from diffusers import StableDiffusionControlNetPipeline, ControlNetModel, UniPCMultistepScheduler
from diffusers.utils import load_image

repo = "/path/to/controlearth"  # or "." when run from the repo root

# Load the ControlNet weights from the repo's controlnet/ subfolder
controlnet = ControlNetModel.from_pretrained(f"{repo}/controlnet", torch_dtype=torch.float16)
pipe = StableDiffusionControlNetPipeline.from_pretrained(
    repo, controlnet=controlnet, torch_dtype=torch.float16,
    safety_checker=None, requires_safety_checker=False,
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
pipe.enable_model_cpu_offload()

# The conditioning image is an OSM map rendering (a PIL image)
control_image = load_image("osm_tile.png")  # replace with your own map rendering
image = pipe(
    "convert this openstreetmap into its satellite view",
    image=control_image,
    num_inference_steps=50,
).images[0]
```
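The pipeline expects the OSM conditioning image at the generation resolution (512×512 for Stable Diffusion 1.x base models). A minimal helper to load and resize an arbitrary map tile could look like this; the function name and file path are illustrative, not part of this repo:

```python
from PIL import Image

def prepare_control_image(path: str, size: int = 512) -> Image.Image:
    """Load an OSM rendering and resize it to the generation resolution."""
    img = Image.open(path).convert("RGB")           # ensure a 3-channel image
    return img.resize((size, size), Image.BICUBIC)  # SD 1.x default is 512x512
```

The returned image can then be passed to the pipeline via the `image=` argument.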