I have a bunch of sales opportunities in various Excel files, broken down by region, type, etc. Each file is a single column that simply lists the dollar amount of each opportunity. In R I have run a simulation to determine the likelihood of each opportunity closing with a sale or not, and repeated the simulation 100,000 times. I know that I can't pass the full results table back to Tableau, because it has 100,000 rows (one total for each simulation), while the data I'm pulling into Tableau would just have the dollar value of each opportunity, so its length would only be the number of opportunities of that type.

What I have in R is basically the first block of code below, repeated a number of times with varying inputs and probabilities; the totals vectors are then ultimately combined into a quarter total vector.

APN <- ncol(APACPipelineNew)

# One row per simulation run, one column per opportunity; each
# opportunity closes (1) with probability 0.033. Note rbinom needs
# 100000 * APN draws, or matrix() recycles the same APN values.
APNSales <- matrix(rbinom(100000 * APN, 1, 0.033), 100000, APN)

# Scale each column by that opportunity's dollar amount
APNSales <- sweep(APNSales, 2, APACPipelineNew, '*')

# Total simulated sales for each of the 100,000 runs
APNTotals <- rowSums(APNSales)
...
Q1APACN <- APNTotals + ABNTotals + AFNTotals
...
Q1Total <- Q1APACT + Q1EMEAT + Q1NAMT

What I'd like to do is set this up as a dashboard in Tableau that updates automatically each week, but I'm not sure how to pass the simulation results back into Tableau given the difference in length between the two data sets.


1 Answer

Some suggestions: for R, you can use the Windows Task Scheduler to run a job at any given interval (or use the taskscheduleR package). After you save the R data, you can manually refresh your dashboard if it is on a desktop version (I do not know whether you can schedule an extract refresh from Tableau Desktop). However, if your dashboard lives on a Tableau Server, you can schedule an extract refresh every week. Obviously, I would schedule the R update to run before the Tableau extract refresh.
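A minimal sketch of the scheduling piece with taskscheduleR, assuming the simulation lives in a standalone script that ends by saving its output somewhere Tableau can reach (the task name and script path here are placeholders):

library(taskscheduleR)

# Run the simulation script every Monday at 06:00.
# Task name and path are hypothetical; point rscript at
# wherever the actual simulation script is saved.
taskscheduler_create(
  taskname  = "weekly_pipeline_sim",
  rscript   = "C:/scripts/pipeline_simulation.R",
  schedule  = "WEEKLY",
  days      = "MON",
  starttime = "06:00"
)

With the dashboard on Tableau Server, you would then set the weekly extract refresh for a few hours after that start time, so the fresh data is always in place first.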

If you only wanted the data to update when the number of rows differs from the previous weekly run, you can build that logic into R, as sketched below. That said, saving the R data and refreshing the extract with the same data and row count should not cause any problems.
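One way to sketch that check, under the assumption that the weekly script writes a CSV for Tableau (all names and paths here are placeholders):

# Placeholder: new_data stands in for whatever weekly table is
# saved for Tableau; here, just the opportunity dollar amounts.
new_data <- data.frame(amount = unlist(APACPipelineNew))
out_file <- "C:/data/pipeline_for_tableau.csv"   # placeholder path

# Overwrite only when no previous file exists or the row count
# differs from last week's run.
if (!file.exists(out_file) ||
    nrow(read.csv(out_file)) != nrow(new_data)) {
  write.csv(new_data, out_file, row.names = FALSE)
}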