Destroy the data store written by the pipeline.
Usage
tar_destroy(
  destroy = c("all", "cloud", "local", "meta", "process", "progress",
    "objects", "scratch", "workspaces", "user"),
  ask = NULL,
  store = targets::tar_config_get("store")
)
Arguments
- destroy
Character of length 1, what to destroy. Choices (example calls follow this argument list):
"all": destroy the entire data store (default: _targets/), including cloud data.
"cloud": just try to delete cloud data, e.g. target data from targets with tar_target(..., repository = "aws").
"local": all the local files in the data store but nothing on the cloud.
"meta": just delete the metadata file at meta/meta in the data store, which invalidates all the targets but keeps the data.
"process": just delete the process data file at meta/process in the data store, which resets the metadata of the main process.
"progress": just delete the progress data file at meta/progress in the data store, which resets the progress tracking info.
"objects": delete all the target return values in objects/ in the data store but keep progress and metadata. Dynamic files are not deleted this way.
"scratch": temporary files saved during tar_make() that should automatically get deleted except if R crashed.
"workspaces": compressed lightweight files in workspaces/ in the data store with the saved workspaces of targets. See tar_workspace() for details.
"user": custom user-supplied files in the user/ folder in the data store.
- ask
Logical of length 1, whether to pause with a menu prompt before deleting files. To disable this menu, set the TAR_ASK environment variable to "false". usethis::edit_r_environ() can help set environment variables.
- store
Character of length 1, path to the targets data store. Defaults to tar_config_get("store"), which in turn defaults to _targets/. When you set this argument, the value of tar_config_get("store") is temporarily changed for the current function call. See tar_config_get() and tar_config_set() for details about how to set the data store path persistently for a project.
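Below is a minimal sketch of how these arguments combine. It assumes the targets package is loaded and an existing data store is present; the alternate store path "_targets_old" is hypothetical.

# Disable the confirmation menu for the whole session
# (assumption: a non-interactive or scripted workflow).
Sys.setenv(TAR_ASK = "false")

tar_destroy()                    # Default: remove the entire data store, cloud data included.
tar_destroy(destroy = "local")   # Remove local files only; leave cloud data alone.
tar_destroy(destroy = "meta")    # Invalidate all targets but keep their stored values.
tar_destroy(destroy = "objects") # Drop target return values; keep metadata and progress.

# Point at a non-default data store for this one call.
# "_targets_old" is a hypothetical path.
tar_destroy(store = "_targets_old", ask = FALSE)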
Details
The data store is a folder created by tar_make() (or tar_make_future() or tar_make_clustermq()). The details of the data store are explained at https://books.ropensci.org/targets/data.html#local-data-store.
The data store folder contains the output data and metadata of the targets in the pipeline. Usually, the data store is a folder called _targets/ (see tar_config_set() to customize), and it may link to data on the cloud if you used AWS or GCP buckets. By default, tar_destroy() deletes the entire _targets/ folder (or wherever the data store is located), including custom user-supplied files in _targets/user/, as well as any cloud data that the pipeline uploaded. See the destroy argument to customize this behavior and only delete part of the data store, and see functions like tar_invalidate(), tar_delete(), and tar_prune() to remove information pertaining to some but not all targets in the pipeline.
After calling tar_destroy() with default arguments, the entire data store is gone, which means all the output data from previous runs of the pipeline is gone (except for input/output files tracked with tar_target(..., format = "file")). The next run of the pipeline will start from scratch, and it will not skip any targets.
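As a point of comparison, here is a minimal sketch of the finer-grained cleaning functions mentioned above. It assumes a pipeline with a target named y; the name is hypothetical.

tar_delete(y)     # Delete the stored value of target y only; its metadata remains.
tar_invalidate(y) # Remove y's metadata so the next tar_make() reruns it, but keep the data.
tar_prune()       # Remove data and metadata of targets no longer in the pipeline.
tar_destroy()     # By contrast, remove the entire data store.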
See also
Other clean: tar_delete(), tar_invalidate(), tar_prune()
Examples
if (identical(Sys.getenv("TAR_EXAMPLES"), "true")) {
tar_dir({ # tar_dir() runs code from a temporary directory.
tar_script(list(tar_target(x, 1 + 1)), ask = FALSE)
tar_make() # Creates the _targets/ data store.
tar_destroy()
print(file.exists("_targets")) # Should be FALSE.
})
}