Set up targets for an existing project.

Usage

use_targets(
  script = targets::tar_config_get("script"),
  scheduler = targets::use_targets_scheduler(),
  open = interactive(),
  overwrite = FALSE
)

Arguments

script

Character of length 1, where to write the target script file. Defaults to tar_config_get("script"), which in turn defaults to _targets.R.

scheduler

Character of length 1, type of scheduler for parallel computing. See https://books.ropensci.org/targets/hpc.html for details. The default is automatically detected from your system (but PBS and Torque cannot be distinguished from SGE, and SGE is the default among the three). Possible values:

  • "multicore": local forked processes on Linux-like systems (equivalent to "multiprocess" for tar_make_future() options).

  • "multiprocess": local platform-independent and multi-process.

  • "slurm": SLURM clusters.

  • "sge": Sun Grid Engine clusters.

  • "lsf": LSF clusters.

  • "pbs": PBS clusters. (batchtools template file not available.)

  • "torque": Torque clusters.
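
For example, if auto-detection picks the wrong scheduler, you can set one of the values above explicitly. A minimal sketch (writes files into the current project):

```r
# Generate _targets.R plus SLURM-ready template files for this project.
# scheduler = "slurm" overrides the auto-detected default.
targets::use_targets(
  scheduler = "slurm",
  open = FALSE # do not open _targets.R in the IDE
)
```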

open

Logical, whether to open the file for editing in the RStudio IDE.

overwrite

Logical of length 1, whether to overwrite the targets file and supporting files if they already exist.

Value

NULL (invisibly).

Details

To set up a project-oriented, function-oriented workflow for targets, use_targets() writes:

  1. A target script _targets.R tailored to your system.

  2. Template files "clustermq.tmpl" and "future.tmpl" to configure tar_make_clustermq() and tar_make_future() to a resource manager if detected on your system.

  3. Script run.R to conveniently execute the pipeline. Call Rscript run.R or R CMD BATCH run.R in the shell to run the pipeline.

  4. Script run.sh to conveniently call run.R in a persistent background process. Enter ./run.sh in the shell to run it.

  5. If you have a high-performance computing scheduler like Sun Grid Engine (SGE) (or select one using the scheduler argument of use_targets()), then script job.sh is created. job.sh conveniently executes run.R as a job on a cluster. For example, to run the pipeline as a job on an SGE cluster, enter qsub job.sh in the terminal.

After you call use_targets(), there is still configuration left to do:

  1. Open _targets.R and edit by hand. Follow the comments to write any options, packages, and target definitions that your pipeline requires.

  2. Edit run.R and choose which pipeline function to execute (tar_make(), tar_make_clustermq(), or tar_make_future()).

  3. If applicable, edit clustermq.tmpl and/or future.tmpl to configure settings for your resource manager.

  4. If applicable, configure job.sh for your resource manager.
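
As a sketch of step 2, the edited run.R might reduce to a single call to the pipeline function you chose; the worker count below is purely illustrative:

```r
# run.R: execute the pipeline with persistent clustermq workers.
# workers = 2 is a hypothetical value; choose one suited to your cluster.
targets::tar_make_clustermq(workers = 2)
```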

After you have finished configuring your project, follow the steps at https://books.ropensci.org/targets/walkthrough.html#inspect-the-pipeline:

  1. Run tar_glimpse() and tar_manifest() to check that the targets in the pipeline are defined correctly.

  2. Run the pipeline. You may wish to call a tar_make*() function directly, or you may run run.R or run.sh.

  3. Inspect the target output using tar_read() and/or tar_load().

  4. Develop the pipeline as needed by manually editing _targets.R and the scripts in R/ and repeating steps (1) through (3).
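
In an interactive R session, the inspection and execution steps above might look like the following sketch (target names depend on your _targets.R; your_target is a hypothetical name):

```r
library(targets)
tar_glimpse()                  # visualize the dependency graph of targets
tar_manifest(fields = command) # tabular summary of target commands
tar_make()                     # run the pipeline locally
tar_read(your_target)          # read a target's stored return value
```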

Examples

if (identical(Sys.getenv("TAR_INTERACTIVE_EXAMPLES"), "true")) {
  tar_dir({ # tar_dir() runs code from a temporary directory.
    use_targets(open = FALSE)
  })
}