Destroy the data store.
Destroy the data store written by the pipeline.
destroy: Character of length 1, what to destroy. Choices:
"all": destroy the entire data store (default: _targets/) including cloud data.
"cloud": just try to delete cloud data, e.g. target data from targets with tar_target(..., repository = "aws").
"local": all the local files in the data store but nothing on the cloud.
"meta": just delete the metadata file at meta/meta in the data store, which invalidates all the targets but keeps the data.
"process": just delete the process data file at meta/process in the data store, which resets the metadata of the main process.
"progress": just delete the progress data file at meta/progress in the data store, which resets the progress tracking info.
"objects": delete all the target return values in objects/ in the data store but keep progress and metadata. Dynamic files are not deleted this way.
"scratch": temporary files saved during tar_make() that should automatically get deleted except if R crashed.
"workspaces": compressed lightweight files in workspaces/ in the data store with the saved workspaces of targets. See tar_workspace() for details.
"user": custom user-supplied files in the user/ folder in the data store.
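For example, each of the selective choices maps to a single call. A minimal sketch using only the documented argument values:

  # Invalidate every target but keep the stored values in objects/.
  tar_destroy(destroy = "meta")

  # Delete only the data the pipeline uploaded to the cloud.
  tar_destroy(destroy = "cloud")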
ask: Logical of length 1, whether to pause with a menu prompt before deleting files. To disable this menu, set the TAR_ASK environment variable to "false". usethis::edit_r_environ() can help set environment variables.
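For example, to skip the confirmation menu in a non-interactive script (a minimal sketch using the documented ask argument):

  # Delete without pausing for the menu prompt, for this call only.
  tar_destroy(ask = FALSE)

  # Or disable the menu for all calls by adding this line to .Renviron
  # (usethis::edit_r_environ() opens that file):
  # TAR_ASK=false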
store: Character of length 1, path to the targets data store. Defaults to tar_config_get("store"), which in turn defaults to _targets/. When you set this argument, the value of tar_config_get("store") is temporarily changed for the current function call. See tar_config_set() for details about how to set the data store path persistently for a project.
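For example, if a project keeps its data store somewhere other than _targets/ (the path below is hypothetical):

  # Destroy the data store at a custom location.
  tar_destroy(store = "custom/path/_targets")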
The data store is a folder created by tar_make() and related functions.
The details of the data store are explained in the targets user manual (https://books.ropensci.org/targets/).
The data store folder contains the output data
and metadata of the targets in the pipeline. Usually,
the data store is a folder called _targets/ (see
tar_config_set() to customize), and it may
link to data on the cloud if you used AWS or GCP
buckets. By default,
tar_destroy() deletes the entire
_targets/ folder (or wherever the data store is located),
including custom user-supplied files in the user/ folder,
as well as any cloud data that the pipeline uploaded.
Use the destroy argument to customize this behavior
and only delete part of the data store, and see functions like
tar_prune() to remove
information pertaining to some but not all targets in the pipeline.
After calling tar_destroy() with default arguments,
the entire data store is gone, which means all the output data from
previous runs of the pipeline is gone (except for
input/output files tracked with
tar_target(..., format = "file")).
The next run of the pipeline will start from scratch,
and it will not skip any targets.
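A minimal end-to-end sketch of this behavior, using a toy one-target pipeline:

  library(targets)
  # Write a toy _targets.R with a single target.
  tar_script(list(tar_target(x, 1 + 1)))
  tar_make()               # run the pipeline and create the data store
  tar_destroy(ask = FALSE) # remove the entire data store
  tar_make()               # starts from scratch; no targets are skipped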