Restaurant Performance Sunk by Selfies


An interesting story appeared over the weekend about a popular NYC restaurant that realized that, although the number of customers it serves daily is about the same today as it was ten years ago, overall service has slowed significantly. Naturally, this situation has led to poor online reviews, so the restaurant hired a firm to investigate the problem. The analysis of surveillance tapes led to a surprising conclusion. The unexpected culprit behind the slowdown was neither the kitchen staff nor the waiters, but customers taking photos and otherwise playing around with their smartphones.

Using the data supplied in the story, I wanted to see how the restaurant's performance would look when expressed as a PDQ (Pretty Damn Quick) queueing model. First, I created a summary data frame in R, based on the observed times:

> df
           obs.2004 obs.2014
wifi.data         0        5
menu.data         8        8
menu.pix          0       13
order.data        6        6
eat.mins         46       43
eat.pix           0       20
paymt.data        5        5
paymt.pix         0       15
total.mins       65      115
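
For reference, one way to assemble the summary frame shown above directly from the observed times is sketched here in plain base R (the PDQ-R code further below builds its own frame from the same vectors):

# Observed times (minutes) per stage, taken from the story
stages   <- c("wifi.data","menu.data","menu.pix","order.data",
              "eat.mins","eat.pix","paymt.data","paymt.pix","total.mins")
obs.2004 <- c(0, 8, 0, 6, 46, 0, 5, 0, 65)
obs.2014 <- c(5, 8, 13, 6, 43, 20, 5, 15, 115)
df <- data.frame(obs.2004, obs.2014, row.names = stages)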

The 2004 situation can be represented schematically by the queueing network shown in Figure 1.

Referring to Figure 1:
  1. Customers get seated, browse the menu items, and make their selections. This is represented as a fixed (average) delay of 8 minutes in PDQ because it does not involve the waiter.
  2. Ordering, which does involve the waiter, is represented as a queueing facility where the waiter is the service facility.
  3. Eating is a fixed delay with an average time of about 45 minutes. The exact number was determined by subtracting all the other observed times from the reported total time (see the worked subtraction after this list).
  4. Payment involves the waiter, so this stage is also represented as a queue.
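
For example, subtracting the other observed stage times from the reported totals gives $65 - (8 + 6 + 5) = 46$ minutes for 2004 and $115 - (5 + 8 + 13 + 6 + 20 + 5 + 15) = 43$ minutes for 2014, which are the eat.mins values in the data frame above.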

Contrast this with the slightly more complicated queueing schematic in Figure 2, which represents the 2014 situation.

Referring to Figure 2:
  1. Customers engage the waiter to ask or complain about wireless (“wi-fi”) connectivity. This is a queueing stage.
  2. Customers then browse the menu but take longer due to using their smartphones.
  3. Ordering is a queueing stage.
  4. Eating is a fixed delay, but takes longer than in Figure 1 because of the photos being taken.
  5. Payment is a queueing stage.
  6. Smartphone activity also occurs during the payment stage but, since it doesn't involve the waiter, it is treated as a fixed delay.

In order to construct a PDQ model, we need the corresponding service times, but these values are not explicitly provided. Rather, I estimated them and then compared the resulting PDQ residence times for each queueing stage against the times reported in the above data frame.

Moreover, there is more than one waiter serving all 45 customers in the restaurant, but the number of waiters is not mentioned. For the PDQ model, I assumed 6 waiters, so that each waiter serves 7.5 customers, on average.

All of these details are captured in the following PDQ-R code:

library(pdq)

ncust       <- 45
year        <- c(2004,2014)
waiters     <- c(6,6)     # number of waiters assumed

wifi.data   <- c(0,5)
wifi.serv   <- c(0,4)     # est. service time
menu.data   <- c(8,8) 
menu.pix    <- c(0,13)
order.serv  <- c(4,4)     # est. service time
order.data  <- c(6,6)
eat.pix     <- c(0,20)
paymt.serv  <- c(3.5,3.5) # est. service time
paymt.data  <- c(5,5)
paymt.pix   <- c(0,15)
total.mins  <- c(65,115)

sub.mins <- wifi.data + menu.data + menu.pix + order.data + eat.pix + paymt.data + paymt.pix
eat.mins <- total.mins - sub.mins
df <- data.frame(year,wifi.data,menu.data,menu.pix,order.data,eat.mins,eat.pix,paymt.data,paymt.pix,total.mins)

for(i in 1:length(year)) {
 model.name <- paste("Restaurant Model of",as.character(year[i]))
 Init(model.name)
 # Stages that do not involve the waiter (menu browsing, eating, smartphone
 # activity) are lumped into the think time of the closed circuit
 eat.delay <- menu.data[i] + eat.mins[i]
 pix.delay <- menu.pix[i] + eat.pix[i] + paymt.pix[i]
 # Closed workload: ncust/waiters customers per waiter
 CreateClosed("WaiterReq",TERM, ncust / waiters[i], eat.delay + pix.delay)

 # Stages that do involve the waiter are single-server FCFS queues
 CreateNode("WifiHelp", CEN, FCFS)
 CreateNode("Ordering", CEN, FCFS)
 CreateNode("Payment", CEN, FCFS)

 # Estimated service times at each waiter-mediated stage
 SetDemand("WifiHelp", "WaiterReq", wifi.serv[i])
 SetDemand("Ordering", "WaiterReq", order.serv[i])
 SetDemand("Payment", "WaiterReq", paymt.serv[i])

 SetTUnit("Mins")
 SetWUnit("Cust")

 Solve(EXACT)
 Report()
}

Appending the relevant PDQ outputs to the original data frame shows reasonably good calibration against the observed residence times for each queueing stage in Figures 1 and 2. The 2014 queueing times are slightly lower because the additional fixed delays reduce the system throughput and therefore the number of customers waiting at each stage.

> df
           obs.2004 obs.2014 pdq.2004 pdq.2014
wifi.data         0        5   0.0000   5.2421
menu.data         8        8   8.0000   8.0000
menu.pix          0       13   0.0000  13.0000
order.data        6        6   6.6241   5.2421
eat.mins         46       43  46.0000  43.0000
eat.pix           0       20   0.0000  20.0000
paymt.data        5        5   5.3950   4.4226
paymt.pix         0       15   0.0000  15.0000
total.mins       65      115  66.0191 113.9069
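
The pdq.2004 and pdq.2014 columns are not produced by Report() itself. One way to collect them is sketched below; it assumes the pdq accessor GetResidenceTime(device, workload, class) and belongs after Solve(EXACT) inside the loop above. The fixed-delay rows are simply copied from the observed data, and the total is the sum of all the stage times.

 # Collect the per-stage times for year[i]
 # (assumes pdq's GetResidenceTime() accessor; place after Solve(EXACT) in the loop)
 pdq.times <- c(
   wifi.data  = GetResidenceTime("WifiHelp", "WaiterReq", TERM),
   menu.data  = menu.data[i],
   menu.pix   = menu.pix[i],
   order.data = GetResidenceTime("Ordering", "WaiterReq", TERM),
   eat.mins   = eat.mins[i],
   eat.pix    = eat.pix[i],
   paymt.data = GetResidenceTime("Payment", "WaiterReq", TERM),
   paymt.pix  = paymt.pix[i]
 )
 pdq.times["total.mins"] <- sum(pdq.times)
 print(round(pdq.times, 4))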

One thing the PDQ model can do that is not easy to replicate in the actual restaurant is to produce the throughput and response time profiles for $N = 1, 2, 3, \ldots$ customers per waiter. See Figure 3. The projected throughput (a concave function) and the total occupancy time in the restaurant (a convex function), due to the combined queueing response time and delays, are calculated for both 2004 and 2014.

The x-axis is the number of customers per waiter assumed in the PDQ model, so that 45 total customers in the restaurant corresponds to 7.5 customers per waiter, indicated by the vertical dashed line. If the customer load were to increase above its current level, the throughput would eventually saturate at 0.2500 customers per minute, i.e., the reciprocal of the largest service demand in the model (the 4-minute ordering time). Similarly, the occupancy times exhibit the classic "hockey stick" profile and converge to linear growth (the handle of the hockey stick) because the fixed delays become swamped by the growing queues.

The 2014 throughput profile (red) lies below the 2004 curve (blue) because of the longer total fixed delays in Figure 2, which have the effect of reducing the arrival rate into the queueing nodes. Similarly, the corresponding 2014 occupancy time (red) is higher than the 2004 curve (blue) because of those same longer fixed delays.
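
A sketch of the kind of load sweep behind Figure 3 is shown below, using the 2014 parameters and reusing the vectors defined in the PDQ-R code above. It assumes the pdq accessors GetThruput() and GetResponse(); the total occupancy time is the queueing response time plus the fixed delays (the think time of the closed circuit).

# Load sweep over N customers per waiter for the 2014 model
# (assumes pdq's GetThruput() and GetResponse() accessors)
Z14 <- menu.data[2] + eat.mins[2] + menu.pix[2] + eat.pix[2] + paymt.pix[2]  # total fixed delay (mins)
N   <- 1:20                          # customers per waiter
X   <- R <- numeric(length(N))
for (k in seq_along(N)) {
  Init(sprintf("Restaurant 2014, N = %d", N[k]))
  CreateClosed("WaiterReq", TERM, N[k], Z14)
  CreateNode("WifiHelp", CEN, FCFS)
  CreateNode("Ordering", CEN, FCFS)
  CreateNode("Payment",  CEN, FCFS)
  SetDemand("WifiHelp", "WaiterReq", wifi.serv[2])
  SetDemand("Ordering", "WaiterReq", order.serv[2])
  SetDemand("Payment",  "WaiterReq", paymt.serv[2])
  Solve(EXACT)
  X[k] <- GetThruput(TERM, "WaiterReq")         # throughput (customers/min per waiter)
  R[k] <- GetResponse(TERM, "WaiterReq") + Z14  # total occupancy time (mins)
}
plot(N, X, type = "b", xlab = "Customers per waiter", ylab = "Throughput (Cust/Min)")
abline(v = 7.5, lty = 2)  # current load: 45 customers / 6 waiters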
