This is an example piece of code for the Overfitting competition at kaggle.com. This method has an AUC score of ~0.91, which is currently good enough for about 38th place on the leaderboard. If you read the competition forums closely, you will find code that is good enough to tie for 25th place, as well as hints on how to break into the top 10.
However, I like this script because it does two tricky things well, without overfitting:
1. It selects features, despite the curse of dimensionality (250 observations, 200 features)
2. It fits a linear model using the elastic net (a combined L1/L2 penalty).
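To make the two steps concrete, here is a minimal pure-Python sketch of elastic-net coordinate descent on synthetic data. It is not the competition script itself, just an illustration of the technique: the L1 part of the penalty drives weak features to exactly zero (the feature selection), while the L2 part stabilizes the fit when features are correlated. All names, the penalty weights `lam`/`alpha`, and the synthetic data are assumptions for the demo.

```python
import random

def soft_threshold(rho, lam):
    # Soft-thresholding operator: the proximal map of the L1 penalty.
    if rho < -lam:
        return rho + lam
    if rho > lam:
        return rho - lam
    return 0.0

def elastic_net(X, y, lam=0.1, alpha=0.9, n_iter=200):
    """Coordinate-descent elastic net (illustrative sketch).

    Minimizes (1/2n)*sum((y - Xb)^2)
              + lam * (alpha*||b||_1 + (1-alpha)/2*||b||_2^2).
    """
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # rho_j: average correlation of feature j with the
            # partial residual (prediction with feature j removed).
            rho = 0.0
            for i in range(n):
                pred = sum(X[i][k] * b[k] for k in range(p) if k != j)
                rho += X[i][j] * (y[i] - pred)
            rho /= n
            z = sum(X[i][j] ** 2 for i in range(n)) / n
            # Soft-threshold handles the L1 term; the L2 term just
            # inflates the denominator (extra shrinkage).
            b[j] = soft_threshold(rho, lam * alpha) / (z + lam * (1 - alpha))
    return b

# Toy data: 6 features, only the first 2 carry signal.
random.seed(0)
n, p = 80, 6
X = [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)]
true_b = [2.0, -1.5, 0.0, 0.0, 0.0, 0.0]
y = [sum(X[i][j] * true_b[j] for j in range(p)) + random.gauss(0, 0.1)
     for i in range(n)]

coef = elastic_net(X, y)
# The informative coefficients survive (slightly shrunk toward zero);
# the noise coefficients are driven to (or very near) zero.
```

The key design point is that a plain least-squares fit with 200 features and 250 observations would interpolate the noise; the combined penalty is what makes a single model do both selection and estimation without a separate feature-selection pass.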