
Pillar 4 — Contrastive Raster Embeddings as Active-Learning Covariates
Hugo Rodrigues
Source: vignettes/pilar4-simclr-embeddings.Rmd
Abstract
Self-supervised learning (SSL) has emerged as the dominant
representation-learning paradigm in computer vision (Chen et al. 2020; He et al. 2020; Oord, Li, and Vinyals 2018) and is
increasingly applied to remote sensing (Jean et al. 2019; Reichstein et al. 2019).
Pillar 4 of edaphos ships a minimal
SimCLR-style pipeline (Chen et al. 2020) that pre-trains
a small raster encoder on unlabelled covariate patches; the resulting
per-pixel embeddings are then used as additional covariates inside the
Pillar 5 active-learning loop on br_cerrado. We report a
head-to-head comparison of the identical AL policy with and without the
learned features.
1. Contrastive representation learning
Given a mini-batch of $N$ raster patches, SimCLR draws two independent stochastic augmentations of each patch, encodes them with a shared CNN backbone $f(\cdot)$ and a projection head $g(\cdot)$, yielding projections $z_i = g(f(x_i))$, and minimises the normalised temperature-scaled cross-entropy (NT-Xent) loss (Chen et al. 2020):

$$\ell_{i,j} = -\log \frac{\exp\!\left(\operatorname{sim}(z_i, z_j)/\tau\right)}{\sum_{k=1}^{2N} \mathbf{1}_{[k \neq i]} \exp\!\left(\operatorname{sim}(z_i, z_k)/\tau\right)},$$

where $(i, j)$ index the two augmented views of the same patch, $\operatorname{sim}(u, v) = u^\top v / (\lVert u \rVert \lVert v \rVert)$ is cosine similarity and $\tau$ is the temperature. After pre-training, the projection head is discarded and the backbone feature vector $h_i = f(x_i)$ is used as the reusable representation.
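To make the loss concrete, it can be sketched in a few lines of base R. This is an illustrative helper only — the function name, signature, and default temperature are assumptions, not part of the edaphos API:

```r
# NT-Xent sketch in base R. z1, z2: N x d matrices holding the projections
# g(f(x)) of the two augmented views; row i of z1 is paired with row i of z2.
nt_xent <- function(z1, z2, temperature = 0.5) {
  n <- nrow(z1)
  z <- rbind(z1, z2)
  z <- z / sqrt(rowSums(z^2))              # L2-normalise rows
  sim <- (z %*% t(z)) / temperature        # cosine similarities scaled by 1/tau
  diag(sim) <- -Inf                        # exclude k = i from the denominator
  pos <- c((n + 1):(2 * n), 1:n)           # positive for row i: its other view
  logprob <- sim - log(rowSums(exp(sim)))  # row-wise log-softmax
  -mean(logprob[cbind(seq_len(2 * n), pos)])
}

set.seed(1)
z      <- matrix(rnorm(4 * 8), 4, 8)
z_rand <- matrix(rnorm(4 * 8), 4, 8)
nt_xent(z, z, 0.1)       # perfectly aligned views: loss near zero
nt_xent(z, z_rand, 0.1)  # unrelated views: much larger loss
```

Identical views make each positive pair dominate its softmax row, driving the loss towards zero, while unrelated views leave the positive indistinguishable from the negatives.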
2. Data preparation
Each of the 2025 pixels of br_cerrado becomes the centre of a square
patch with five normalised channels (elevation, slope, TWI, mean annual
precipitation, NDVI). Reflection padding at the borders preserves the
original grid size.
library(edaphos)

.torch_ok <- requireNamespace("torch", quietly = TRUE) &&
  isTRUE(tryCatch(torch::torch_is_installed(),
                  error = function(e) FALSE))
if (!.torch_ok) {
  knitr::knit_exit(
    "torch runtime (libtorch) not available — skipping vignette."
  )
}
data(br_cerrado, package = "edaphos")
covs_base <- c("elev", "slope", "twi", "map_mm", "ndvi")
H <- length(unique(br_cerrado$y))
W <- length(unique(br_cerrado$x))
stopifnot(H * W == nrow(br_cerrado))
c(H = H, W = W)
#> H W
#> 45 45
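The reflection-padded patch extraction described above can be sketched in base R. This is an illustrative helper under assumed names (function name, argument order, and the default patch size are not taken from edaphos, whose internals may differ):

```r
# Build a p x p reflection-padded patch around every pixel of an
# H x W x C covariate array (channels last). Returns an (H*W) x p x p x C array.
make_patches <- function(arr, p = 9) {
  stopifnot(p %% 2 == 1)               # patch must have a well-defined centre
  H <- dim(arr)[1]; W <- dim(arr)[2]; C <- dim(arr)[3]
  r <- (p - 1) %/% 2
  # mirror out-of-range indices back into 1..n (reflection padding)
  reflect <- function(i, n) {
    i <- ifelse(i < 1, 2 - i, i)
    ifelse(i > n, 2 * n - i, i)
  }
  out <- array(NA_real_, c(H * W, p, p, C))
  k <- 0
  for (x in seq_len(W)) {
    for (y in seq_len(H)) {
      k <- k + 1
      ys <- reflect((y - r):(y + r), H)
      xs <- reflect((x - r):(x + r), W)
      out[k, , , ] <- arr[ys, xs, , drop = FALSE]
    }
  }
  out
}

# A small worked example: 5 x 5 grid, 2 channels, 3 x 3 patches.
arr <- array(seq_len(5 * 5 * 2), c(5, 5, 2))
patches <- make_patches(arr, p = 3)
dim(patches)  # one patch per pixel, patch dims p x p x C
```

Mirroring each out-of-range index back into the grid gives every border pixel a full patch without introducing artificial constant values, which is why the original grid size is preserved.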