Destroy all or part of the data store.

Destroy all or part of the data store written by
tar_make() and similar functions.
destroy: Character of length 1, what to destroy. Choices:
"all": destroy the entire data store (default:
_targets/) including cloud data.
"cloud": just try to delete cloud data, e.g. target data from targets with
tar_target(..., repository = "aws").
"local": all the local files in the data store but nothing on the cloud.
"meta": just delete the metadata file at
meta/meta in the data store, which invalidates all the targets but keeps the data.
"process": just delete the process data file at
meta/process in the data store, which resets the metadata of the main process.
"progress": just delete the progress data file at
meta/progress in the data store, which resets the progress tracking info.
"objects": delete all the target return values in
objects/ in the data store but keep progress and metadata. Dynamic files are not deleted this way.
"scratch": temporary files saved during
tar_make() that should be deleted automatically unless R crashed.
"workspaces": compressed files in
workspaces/ in the data store with the saved workspaces of targets. See
tar_workspace() for details.
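As a rough sketch (assuming a pipeline has already been run with tar_make() in the working directory), the destroy argument selects how much of the data store to remove:

```r
library(targets)
# Invalidate every target but keep their stored values in objects/:
tar_destroy(destroy = "meta")
# Delete stored target values but keep metadata and progress info:
tar_destroy(destroy = "objects")
```

Each call prompts for confirmation first unless the menu is disabled via the ask argument described below.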
ask: Logical of length 1, whether to pause with a menu prompt before deleting files. To disable this menu, set the
TAR_ASK environment variable to "false".
usethis::edit_r_environ() can help set environment variables.
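A sketch of the two ways to skip the confirmation menu, per call or session-wide:

```r
library(targets)
# Skip the menu for a single call:
tar_destroy(ask = FALSE)
# Or disable it for the whole session via the environment variable:
Sys.setenv(TAR_ASK = "false")
tar_destroy()
```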
store: Character of length 1, path to the
targets data store. Defaults to
tar_config_get("store"), which in turn defaults to
_targets/. When you set this argument, the value of
tar_config_get("store") is temporarily changed for the current function call. See
tar_config_set() for details about how to set the data store path persistently for a project.
tar_destroy() is a hard reset. Use it if you
intend to start the pipeline from scratch without
any trace of a previous run in the data store.
Global objects and dynamic files outside the
data store are unaffected.
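A minimal sketch of the hard-reset workflow described above, assuming a pipeline script is already in place:

```r
library(targets)
# Wipe every trace of previous runs (disable the prompt for scripting),
# then rebuild the whole pipeline from scratch:
tar_destroy(ask = FALSE)
tar_make()
```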