A simple R-Shiny dashboard for real time streaming data

You can construct data dashboards for real-time streaming data using R-Shiny that allow dynamic visualization for a variety of data sources and signals. For example, there are many commodity sensors that can be run with microcontrollers like the Arduino family of processors: sensors for ECG, PPG, temperature, volatile and particulate pollution levels, infrared cameras, and many more. Many streaming data feeds are also available on the Internet.

One of the easiest ways to create a streaming API is simply to periodically write blocks of streaming data to a series of output files for processing by a reactive R-Shiny dashboard. Some of our projects involve sensors collecting data, and the easiest way to deal with the incoming stream is to write the buffers to a file, either appending new lines as we go, or writing a series of time-tagged files with one data block in each.
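As a sketch of that write-side pattern (the file names and the `buffer` data frame here are placeholders, not part of the dice example below), a buffer can either be appended to one growing file or written to its own time-tagged file:

```r
# Sketch: two ways to dump an incoming data buffer to disk for a
# reactive dashboard to pick up. The paths and the 'buffer' data
# frame are placeholder examples.
buffer <- data.frame(value = runif(10))

# Option 1: append new lines to one growing file
write.table(buffer, file = "stream.csv",
            append = TRUE, row.names = FALSE, col.names = FALSE)

# Option 2: write each block to its own time-tagged file
stamp <- format(Sys.time(), "%Y%m%d_%H%M%OS3")
write.table(buffer, file = paste0("block_", stamp, ".csv"),
            row.names = FALSE, col.names = FALSE)
```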

The reactiveFileReader() function in Shiny will check a given file as often as you tell it to (in increments of milliseconds) and update your application data whenever there's a change to it. I can see situations in which I'd want to poll the file more often than once a millisecond, but for the purposes of display in a Shiny application, I haven't found this a serious limitation: the frame rate of the monitor displaying the data dashboard limits the effective refresh rate anyway.
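Under the hood, the reader is just polling: on each tick it checks the file's last-modified time and re-reads only when that has changed. A minimal non-Shiny sketch of that idea (the file name here is a placeholder created for the demonstration):

```r
# Sketch of the polling idea behind reactiveFileReader: re-read a
# file only when its modification time changes. "poll_demo.csv" is
# a placeholder file created here for the demonstration.
path <- "poll_demo.csv"
write.table(data.frame(v = 1:5), path,
            row.names = FALSE, col.names = FALSE)

lastSeen <- 0
pollOnce <- function() {
  mtime <- as.numeric(file.mtime(path))
  if (mtime > lastSeen) {
    lastSeen <<- mtime          # remember when we last read the file
    read.table(path, header = FALSE)
  } else {
    NULL                        # no change since the last poll
  }
}

first  <- pollOnce()  # file is new to us, so this returns its contents
second <- pollOnce()  # unchanged since last poll, so this returns NULL
```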

Note that this dashboard is intended to be run locally; it is not a web-facing app. It should work fine in RStudio or anything else that runs Shiny.


Example code, with comments:

# Place everything from here to the break in a file, then run it from the command line using
# "Rscript yourfilename.R" wherever you saved it.


# Making the raw dice rolls


makeRunifInts <- function(n, lowerBound, upperBound) {
  # draw n uniform random numbers on [0, 1)
  randoms <- runif(n = n, min = 0, max = 1)
  lowerBound <- as.integer(lowerBound)
  upperBound <- as.integer(upperBound)
  if (upperBound <= lowerBound) {
    stop('upperBound must be greater than lowerBound')
  }
  # scale and shift onto [lowerBound, upperBound], then truncate to integers
  rangeDiff <- upperBound - lowerBound
  randoms <- randoms * (rangeDiff + 1)
  randoms <- randoms + lowerBound
  randoms <- floor(randoms)
  randoms
}



# Rolling K Dice N times

kDiceNRolls <- function(k, n) {
  # n rolls of k six-sided dice: n * k integer draws from 1 to 6
  rolls <- makeRunifInts(n = n * k,
                         lowerBound = 1,
                         upperBound = 6)
  # one row per roll, one column per die
  rollMat <- matrix(data = rolls,
                    nrow = n,
                    ncol = k)
  # sum across each roll of k dice, sorted for convenience
  diceSums <- sort(rowSums(rollMat))
  diceSums
}



updateDiceRolls <- function(mOutputFiles, nDice, kRolls) {
  if (mOutputFiles <= 0) {
    stop('invalid input: mOutputFiles <= 0')
  }
  # loop forever, cycling through the m output files and
  # overwriting each with a fresh batch of dice sums
  while (TRUE) {
    for (i in 1:mOutputFiles) {
      diceRolls <- kDiceNRolls(k = kRolls, n = nDice)
      write.table(x = diceRolls,
                  file = paste("~/Desktop/testData", i, ".csv", sep = ''),
                  row.names = FALSE,
                  col.names = FALSE)
      Sys.sleep(0.1)  # pace the writes; the dashboard polls every 100 ms anyway
    }
  }
}

updateDiceRolls(mOutputFiles = 1, nDice = 10, kRolls = 10)



# A reminder: Shiny apps have two parts, the UI and the server. If you start with a
# UI and server template every time, it's hard to go too wrong.

# Load the packages the app needs
library(shiny)
library(ggplot2)

# Assigning ui as a fluid page, so Shiny has license to move the elements of the window around.
ui <- fluidPage(
  # we're outputting a density plot of the dice-roll sums to see what their distribution looks like
  plotOutput("fileContents")
)

server <- function(input,
                   output,
                   session) {
  # we assign csvContents with reactiveFileReader, which checks the file
  # once per 100 ms and re-reads it whenever it has changed
  csvContents <- reactiveFileReader(intervalMillis = 100,
                                    filePath = "~/Desktop/testData1.csv",
                                    readFunc = read.table,
                                    session = NULL,
                                    header = FALSE) # arguments after the first four are passed on to readFunc
  # This step assigns a rendered plot object to the fileContents entry of the
  # output list from server. This is what the ui is displaying for the user.
  output$fileContents <- renderPlot({
    ggplot(data = csvContents(),
           aes(x = V1)) +
      geom_density()
  })
}




# Finally, we begin the Shiny Application

shinyApp(ui = ui, server = server)


The output will look something like a continuously updating density plot of the dice sums, with each new set of dice rolls showing as a frame.
