
Load a pretrained encoder into an edaphos_foundation_moco wrapper
Source: R/foundation_weights.R
foundation_weights_load.Rd
Restores a pretrained MoCo v2 encoder from a .pt state-dict file
(or by name from the published registry) into an in-memory
edaphos_foundation_moco object that is shape-compatible with
the output of foundation_moco_pretrain_tiles(). The restored
object is ready to feed into foundation_moco_embed(),
foundation_fit_classifier() or foundation_fit_regressor()
without any additional plumbing.
Usage
foundation_weights_load(
source,
n_channels = NULL,
feature_dim = NULL,
proj_dim = NULL,
cache_dir = NULL,
overwrite = FALSE,
verbose = FALSE
)
Arguments
- source
Either a character name registered in
foundation_weights_list(), in which case the encoder is downloaded
(and cached) via foundation_weights_download(), or a path to a local
.pt file. For the local-path case, pass the matching metadata as
n_channels / feature_dim / etc.
- n_channels, feature_dim, proj_dim
Optional integers; when
source is a local path these must describe the shape the .pt file
was saved from. When source is a registry name they are pulled from
the registry.
- cache_dir, overwrite, verbose
Forwarded to
foundation_weights_download() when source is a registry name.
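For the local-path case, a minimal sketch of the call (the checkpoint path and the shape values below are hypothetical; they must match whatever the .pt file was actually saved from):

```r
# Hypothetical local checkpoint; shape arguments must describe the saved model.
moco <- foundation_weights_load(
  "checkpoints/cerrado_moco_epoch200.pt",
  n_channels  = 10,   # e.g. the number of input bands used at pretraining time
  feature_dim = 512,  # encoder output width
  proj_dim    = 128   # MoCo projection-head width
)
```

When source is a registry name instead, all three shape arguments can be left NULL and are filled in from the registry metadata.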
Value
An edaphos_foundation_moco object with a populated
$encoder_q (the query encoder; the momentum encoder is not
restored because it is only needed during training).
Examples
if (FALSE) { # \dontrun{
moco <- foundation_weights_load("edaphos-cerrado-moco-v1")
emb <- foundation_moco_embed(moco, patches)
} # }
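As the description notes, the restored object also feeds the downstream fitters directly. A hedged continuation of the example (patches and labels are placeholder objects, and the exact argument list of foundation_fit_classifier() is assumed here, not taken from its reference page):

```r
if (FALSE) { # \dontrun{
moco <- foundation_weights_load("edaphos-cerrado-moco-v1")
emb  <- foundation_moco_embed(moco, patches)
fit  <- foundation_fit_classifier(moco, patches, labels)
} # }
```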