Logging in R
Tips and tricks using the {logger} package

Introduction
What is logging?
In computer science, logging is the practice of keeping track of events occurring while the code runs.
These outputs (called logs) are most often written to a text file (logfile), which has the .log extension by convention.
Why log
Logging allows you to keep track of events in a permanent way that won't be erased (like console output) or forgotten (like your memory).
I find logging particularly useful to keep track of:
- Errors/warnings. Sometimes, code can run in an unexpected manner, and logs are useful to determine why and how this happened.
- Metadata about your environment/setup. For example, I use it to record how long a bit of code ran, or which arguments I used.
When to log
Logging is primarily useful for scripts that you don't monitor while they run, like long-running scripts or a series of short but numerous scripts.
Logging in base R
To write a logfile, the simplest option is just to use base R functions:
# Create a log file and a file connection
logfile <- tempfile(fileext = ".log")
logfile_con <- file(logfile, open = "a")

# Write things to the logfile
writeLines("Start analyses", con = logfile_con)
res <- 40 + 2
writeLines(paste("res is", res), con = logfile_con)
writeLines("End analyses", con = logfile_con)
close(logfile_con)
Let’s inspect the logfile:
# Show contents of the logfile
cat(readLines(logfile), sep = "\n")
Start analyses
res is 42
End analyses
This approach is sometimes enough, but there are packages out there that make things more flexible and easier on the user end.
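To see what such packages automate, here is a minimal base-R sketch that prefixes each line with a level and a timestamp (the log_line() helper is hypothetical, for illustration only):

```r
# Hypothetical helper: prefix each log line with a level and a timestamp
logfile <- tempfile(fileext = ".log")
log_line <- function(level, msg) {
  line <- sprintf("%s [%s] %s", level,
                  format(Sys.time(), "%Y-%m-%d %H:%M:%S"), msg)
  cat(line, "\n", file = logfile, append = TRUE, sep = "")
}

log_line("INFO", "Start analyses")
log_line("INFO", "res is 42")

# Show contents of the logfile
cat(readLines(logfile), sep = "\n")
```

This already covers the prefixing part, but dedicated packages also handle log levels, thresholds, and multiple output streams for us.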
The {logger} package
One of them is {logger}. It is very complete and flexible, but at the same time I find it simple to use. This package allows you to:
- define a log level for your messages (e.g. INFO, WARN, DEBUG)
- prefix log messages with a predefined sequence (by default, log level and date)
- log to different streams (file(s), console, and many others discussed below)
- log errors, warnings and messages easily (see below)
Basic logger
The first step is to initialize the logger using the log_appender() function. Here, we set the appender argument with the appender_file() function. The {logger} package also comes with many other appenders that log to the console, a Slack channel, a Telegram group chat, and more.
We also define a logging level with log_threshold(DEBUG). Here, all messages at or above the DEBUG level will be logged (see the list and order of levels here).
library(logger)

# Create logger
logfile <- tempfile(fileext = ".log")
log_appender(appender = appender_file(file = logfile))
# Set threshold level
log_threshold(DEBUG)
Then, we can log messages with the logging functions. Here, we use the log_debug() and log_info() functions to write to our logfile. Note that by default, {logger} uses the {glue} syntax to concatenate text and expressions (exemplified in "res is {res}", where {res} is replaced with the variable's value in the log).
# Write things to the logger
log_debug("Start script")
res <- 40 + 2
log_info("res is {res}")
log_debug("End analyses")
Here is our logfile:
DEBUG [2026-03-20 15:24:14] Start script
INFO [2026-03-20 15:24:14] res is 42
DEBUG [2026-03-20 15:24:14] End analyses
That’s it for a basic logger!
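As mentioned above, the default prefix contains the log level and date. If you want a different prefix, {logger} lets you customize it with log_layout() and layout_glue_generator(). A minimal sketch (the format string here is my own example, not a package default):

```r
library(logger)

logfile <- tempfile(fileext = ".log")
log_appender(appender = appender_file(file = logfile))

# Customize the prefix: level and time separated by dashes
log_layout(layout_glue_generator(format = "{level} {time} -- {msg}"))

log_info("Custom layout")

# Show contents of the logfile
cat(readLines(logfile), sep = "\n")
```

The format string uses the same {glue} syntax as the log messages themselves.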
Setting logger level
Now imagine we want a detailed logger in the testing phase, but when launching our final script we want to print only important stuff. This can be achieved by changing the log level.
# Create a log file and set the appender
logfile <- tempfile(fileext = ".log")
log_appender(appender = appender_file(file = logfile))
# Change log level to INFO
log_threshold(INFO)

# Write things to the logger
log_debug("Start script")
res <- 40 + 2
log_info("res is {res}")
log_debug("End analyses")
In the code above, with the log level set to INFO, all DEBUG messages are omitted.
INFO [2026-03-20 15:24:15] res is 42
Logging warnings, errors and messages
Another useful thing to log is the errors, warnings, and messages occurring during computations. Consider the code below:
# Create a log file and set the appender
logfile <- tempfile(fileext = ".log")
log_appender(appender = appender_file(file = logfile))
log_threshold(DEBUG)

# Write things to the logger
log_debug("Start script")
res <- "forty-two"
log_info("res is {res}")
res <- as.numeric(res) # This produces a warning
Warning: NAs introduced by coercion
log_debug("End analyses")
Now, surely the logfile should show the warning?
DEBUG [2026-03-20 15:24:15] Start script
INFO [2026-03-20 15:24:15] res is forty-two
DEBUG [2026-03-20 15:24:15] End analyses
… except it doesn’t. The logger records only what we tell it to, so we need to explicitly ask it to record warnings.
To record warnings, we need to use log_warnings():
# Set the appender (assuming "logfile" is a valid path)
log_appender(appender = appender_file(file = logfile))
log_threshold(DEBUG)

# Record warnings in logger
log_warnings()

# Write things to the logger
log_debug("Start script")
res <- "forty-two"
log_info("res is {res}")
res <- as.numeric(res)
log_debug("End analyses")
And now, our logger records the warnings.
DEBUG [2026-03-20 15:09:53] Start script
INFO [2026-03-20 15:09:53] res is forty-two
WARN [2026-03-20 15:09:54] NAs introduced by coercion
DEBUG [2026-03-20 15:09:54] End analyses
The same is true for errors and messages, which can be recorded with log_errors() and log_messages().
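For instance, here is a minimal sketch for messages: after calling log_messages(), conditions emitted by message() are forwarded to the logger (the message still reaches the console as usual).

```r
library(logger)

logfile <- tempfile(fileext = ".log")
log_appender(appender = appender_file(file = logfile))
log_threshold(DEBUG)

# Forward message() calls to the logger
log_messages()

message("Reading input data")  # also ends up in the logfile

# Show contents of the logfile
cat(readLines(logfile), sep = "\n")
```

Note that log_messages(), log_warnings() and log_errors() register global handlers, so they must be called at the top level of your script, not inside a tryCatch() or withCallingHandlers() block.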
Logging with parallel computing
Another thing I find interesting with {logger} is that it's handy for keeping track of what's happening in different parallel processes (see this great blogpost for more explanations of parallel computing in R). Consider the parallel code below: starting from a list of species sightings, it computes the total number of sightings per species:
# Parallel computing libraries
library(parallel)
library(foreach)
library(doParallel)

# Parallel computing setup
n_cores <- 4
cluster <- makeCluster(spec = n_cores)
registerDoParallel(cluster)

# Generate dummy dataset of species sightings
species_counts <- lapply(1:4,
                         function(i) rbinom(10, size = 1, prob = 0.5))
names(species_counts) <- c("Aeshna cyanea", "Anax imperator",
                           "Calopteryx virgo", "Crocothemis erythraea")

# Parallel loop
res <- foreach(i = 1:4) %dopar% {
  # Get species name
  sp <- names(species_counts)[i]
  # Get total count
  res <- sum(species_counts[[i]])
  # Return values
  return(paste(sp, res))
}
stopCluster(cluster)

# Print the total count per species
res
[[1]]
[1] "Aeshna cyanea 8"
[[2]]
[1] "Anax imperator 5"
[[3]]
[1] "Calopteryx virgo 6"
[[4]]
[1] "Crocothemis erythraea 6"
We can log parallel events by defining a logger in each parallel process:
# Parallel computing setup
cluster <- makeCluster(spec = n_cores)
clusterEvalQ(cl = cluster,
             expr = {library(logger)})
clusterExport(cl = cluster,
              varlist = c("log_appender", "appender_file", "log_info"))
registerDoParallel(cluster)

# Get temporary directory for logfiles
# This ensures all logs are written to the same temp folder
tmpdir <- tempdir()

# Parallel loop
res <- foreach(i = 1:4) %dopar% {
  # Get species name
  sp <- names(species_counts)[i]

  # Create species logger
  logfile <- file.path(tmpdir, paste0(sp, ".log"))
  log_appender(appender = appender_file(file = logfile))

  # Get and log total count
  log_info("Logger for species {sp}")
  res <- sum(species_counts[[i]])
  log_info("Species count: {res}")

  # Return values
  return(paste(sp, res))
}
stopCluster(cluster)
Let’s see what’s in the logfiles:
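The listing below was produced with a loop along these lines (assuming species_counts and tmpdir from the chunks above):

```r
# Assumes `species_counts` and `tmpdir` from the parallel chunk above
for (sp in names(species_counts)) {
  print(paste0("File ", sp, ".log -----"))
  cat(readLines(file.path(tmpdir, paste0(sp, ".log"))), sep = "\n")
}
```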
[1] "File Aeshna cyanea.log -----"
INFO [2026-03-20 15:24:16] Logger for species Aeshna cyanea
INFO [2026-03-20 15:24:16] Species count: 8
[1] "File Anax imperator.log -----"
INFO [2026-03-20 15:24:16] Logger for species Anax imperator
INFO [2026-03-20 15:24:16] Species count: 5
[1] "File Calopteryx virgo.log -----"
INFO [2026-03-20 15:24:16] Logger for species Calopteryx virgo
INFO [2026-03-20 15:24:16] Species count: 6
[1] "File Crocothemis erythraea.log -----"
INFO [2026-03-20 15:24:16] Logger for species Crocothemis erythraea
INFO [2026-03-20 15:24:16] Species count: 6
Amazing! Our outputs got copied to the files!
Conclusion
Logging is a great way to improve the reproducibility of analyses, and the {logger} package can really make logging easy. This post showcased some applications of the package, including logging warnings and using it with parallel computing, but many other uses are possible: check out the resources below to learn more!
Resources
- {logger} package documentation
- {logger} presentation at RStudio::conf 2020