Destroy the data store written by the pipeline.
Usage
tar_destroy(
  destroy = c("all", "cloud", "local", "meta", "process", "progress", "objects",
    "scratch", "workspaces", "user"),
  ask = NULL,
  script = targets::tar_config_get("script"),
  store = targets::tar_config_get("store")
)
Arguments
- destroy
Character of length 1, what to destroy (see the sketch after this argument list). Choices:
"all": entire data store (default: _targets/) including cloud data, as well as download/upload scratch files.
"cloud": cloud data, including metadata as well as target object data from targets with tar_target(..., repository = "aws"). Also deletes temporary staging files in file.path(tempdir(), "targets") that may have been accidentally left over from incomplete uploads or downloads.
"local": all the local files in the data store but nothing on the cloud.
"meta": metadata file at meta/meta in the data store, which invalidates all the targets but keeps the data.
"process": data file at meta/process in the data store, which resets the metadata of the main process.
"progress": progress data file at meta/progress in the data store, which resets the progress tracking info.
"objects": all the target return values in objects/ in the data store, but keeps progress and metadata. Dynamic files are not deleted this way.
"scratch": temporary files saved during tar_make() that should be deleted automatically unless R crashed.
"workspaces": compressed lightweight files in workspaces/ in the data store with the saved workspaces of targets. See tar_workspace() for details.
"user": custom user-supplied files in the user/ folder in the data store.
- ask
Logical of length 1, whether to pause with a menu prompt before deleting files. To disable this menu, set the TAR_ASK environment variable to "false". usethis::edit_r_environ() can help set environment variables.
- script
Character of length 1, path to the target script file. Defaults to tar_config_get("script"), which in turn defaults to _targets.R. If the script does not exist, then cloud metadata will not be deleted.
- store
Character of length 1, path to the targets data store. Defaults to tar_config_get("store"), which in turn defaults to _targets/. When you set this argument, the value of tar_config_get("store") is temporarily changed for the current function call. See tar_config_get() and tar_config_set() for details about how to set the data store path persistently for a project.
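For illustration, the following sketch combines these arguments; the custom store path, the environment-variable call, and the choice to skip the confirmation menu are assumptions for the example, not defaults of tar_destroy():

library(targets)
Sys.setenv(TAR_ASK = "false")        # disable the interactive confirmation menu (assumed preference)
tar_destroy(destroy = "objects")     # drop stored target values but keep metadata and progress
tar_destroy(destroy = "workspaces")  # drop saved debugging workspaces only
tar_destroy(store = "custom_store")  # destroy a data store at a hypothetical non-default path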
Details
The data store is a folder created by tar_make() (or tar_make_future() or tar_make_clustermq()). The details of the data store are explained at https://books.ropensci.org/targets/data.html#local-data-store. The data store folder contains the output data and metadata of the targets in the pipeline. Usually, the data store is a folder called _targets/ (see tar_config_set() to customize), and it may link to data on the cloud if you used AWS or GCP buckets. By default, tar_destroy() deletes the entire _targets/ folder (or wherever the data store is located), including custom user-supplied files in _targets/user/, as well as any cloud data that the pipeline uploaded. See the destroy argument to customize this behavior and only delete part of the data store, and see functions like tar_invalidate(), tar_delete(), and tar_prune() to remove information pertaining to some but not all targets in the pipeline.
After calling tar_destroy() with default arguments, the entire data store is gone, which means all the output data from previous runs of the pipeline is gone (except for input/output files tracked with tar_target(..., format = "file")). The next run of the pipeline will start from scratch, and it will not skip any targets.
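As a hedged sketch of how the default compares to a partial cleanup (this assumes a pipeline with a target named x and, optionally, cloud-backed storage):

library(targets)
tar_destroy()                   # default: remove the whole data store, local and cloud
tar_destroy(destroy = "local")  # remove local files only and leave cloud data in place
tar_delete(x)                   # by contrast, delete the stored value of just one target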
Storage access
Several functions like tar_make(), tar_read(), tar_load(), tar_meta(), and tar_progress() read or modify the local data store of the pipeline. The local data store is in flux while a pipeline is running, and depending on how distributed computing or cloud computing is set up, not all targets can even reach it. So please do not call these functions from inside a target as part of a running pipeline. The only exception is literate programming target factories in the tarchetypes package such as tar_render() and tar_quarto().
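As an illustration of this caution, the pattern to avoid looks roughly like the hypothetical target below (a sketch only; bad_idea is an invented name):

# In _targets.R -- avoid this pattern:
list(
  tar_target(bad_idea, targets::tar_meta()) # a target that reads its own pipeline's data store mid-run
)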
See also
Other clean: tar_delete(), tar_invalidate(), tar_prune_list(), tar_prune()
Examples
if (identical(Sys.getenv("TAR_EXAMPLES"), "true")) { # for CRAN
  tar_dir({ # tar_dir() runs code from a temp dir for CRAN.
    tar_script(list(tar_target(x, 1 + 1)), ask = FALSE)
    tar_make() # Creates the _targets/ data store.
    tar_destroy()
    print(file.exists("_targets")) # Should be FALSE.
  })
}