If you're trying to predict when an event will occur (for example, a consumer buying a product) or to infer why events occur (what factors led to a component failing?), time-to-event models are a useful framework. These models are closely related to survival analysis in the life sciences, except that the outcome of interest isn't "time to death" but time to some other event (e.g. in marketing, "time to purchase"). Also, in today's applications the data sizes are much larger (often Hadoop scale), as all kinds of demographic, operational, and sensor data are brought to bear to improve the predictions.
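To make the survival-analysis framing concrete, the standard nonparametric estimate of a survival curve is the Kaplan-Meier estimator, which handles censored observations (subjects for whom the event hasn't occurred yet). Below is a minimal, self-contained Python sketch; the function name and the example data are illustrative, not taken from the webinar:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve (illustrative sketch).

    times:  durations until the event or until censoring
    events: 1 if the event was observed, 0 if the subject was censored
    Returns a list of (time, survival probability) pairs at event times.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        # All records tied at time t (sorted data keeps them contiguous)
        deaths = sum(1 for tt, e in data if tt == t and e == 1)
        removed = sum(1 for tt, e in data if tt == t)
        if deaths > 0:
            # Multiply in the conditional probability of surviving past t
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

# Four subjects: events at t=2, 3, 5; one subject censored at t=3
print(kaplan_meier([2, 3, 3, 5], [1, 1, 0, 1]))
```

Censoring is the key feature: the subject censored at t=3 still counts in the risk set at t=2 and t=3, so the curve isn't biased by simply dropping incomplete observations. In practice one would use a library implementation (e.g. `survfit` in R's survival package) rather than hand-rolled code.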
In a webinar earlier this month, DataSong's John Wallace and Tess Nesbitt gave an overview of time-to-event models, with examples from marketing attribution and retail, and described their on-demand implementation of these models using Revolution R Enterprise and Hadoop. You can watch the recorded webinar below:
You can also download the slides from the webinar at the link below.
Revolution Analytics Webinars: Using Time to Event Models for Prediction and Inference, presented by Revolution Analytics and DataSong