Articles by Matt Bogard

Divide by 4 Rule for Marginal Effects

May 25, 2016 | Matt Bogard

Previously I wrote about the practical differences between marginal effects and odds ratios with regard to logistic regression. Recently, I ran across a tweet from Michael Grogan linking to one of his posts using logistic regression to model dividend probabilities. This really got me interested: "Moreover, to obtain a measure ... [Read more...]
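The divide-by-4 rule can be checked in a few lines of R. The simulated data and coefficients below are my own illustration, not from the post: in a logistic regression the marginal effect of x on P(y = 1) is beta * p * (1 - p), which peaks at p = 0.5, so beta / 4 bounds the marginal effect.

```r
set.seed(42)
n <- 500
x <- rnorm(n)                      # simulated predictor (illustrative only)
p <- plogis(-0.2 + 0.8 * x)        # true model: logit(p) = -0.2 + 0.8x
y <- rbinom(n, 1, p)

fit  <- glm(y ~ x, family = binomial)
beta <- coef(fit)["x"]

beta / 4                           # divide-by-4 upper bound on the marginal effect

# Average marginal effect for comparison: mean of beta * p_i * (1 - p_i)
phat <- fitted(fit)
mean(beta * phat * (1 - phat))
```

The average marginal effect always sits at or below beta / 4, which is why the rule works as a quick mental bound.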

A Toy Instrumental Variable Application

June 19, 2013 | Matt Bogard

I have previously discussed instrumental variables (here and here) from a somewhat technical standpoint, but now I'd like to present a very basic example with a toy data set that demonstrates how IV estimation works in practice. The data set below is fabricated for demonstration purposes. The idea is to ... [Read more...]
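A toy IV setup along the same lines can be sketched with fabricated data (this is my own fabricated example, not the post's data set): x is endogenous because it shares the error term u with y, and z is a valid instrument because it moves x but not u.

```r
set.seed(123)
n <- 1000
z <- rnorm(n)                       # instrument
u <- rnorm(n)                       # structural error
x <- 0.5 * z + 0.5 * u + rnorm(n)   # endogenous regressor: correlated with u
y <- 1 + 2 * x + u                  # true effect of x on y is 2

coef(lm(y ~ x))["x"]                # OLS: biased upward by cor(x, u)

# Two-stage least squares by hand:
xhat <- fitted(lm(x ~ z))           # stage 1: project x onto the instrument
coef(lm(y ~ xhat))["xhat"]          # stage 2: recovers a value near the true 2
```

The manual two-stage fit gives the 2SLS point estimate; in practice one would use a canned routine (e.g. ivreg) to get correct standard errors as well.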

Data science = failure of imagination

January 8, 2013 | Matt Bogard

I think I like this distinction between Bayesian and Frequentist statistics: "we are nearly always ultimately curious about the Bayesian probability of the hypothesis ... [Read more...]

How John Deere uses R

November 10, 2012 | Matt Bogard

HT: Revolution Analytics. A very good discussion about real applied econometrics and analytics, including the use of ARIMA models, decision trees, and genetic algorithms. He also has a very smart approach in his attitude toward big data and data s... [Read more...]

Using SNA in Predictive Modeling

April 10, 2012 | Matt Bogard

In a previous post, I described the basics of social network analysis. I plan to extend that example here with an application in predictive analytics. Let's suppose we have the following network (visualized in R). Suppose we have used the igraph package ...
[Read more...]
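The post works with igraph; the core idea can be sketched without it (the toy adjacency matrix below is my own, not the post's network): derive a network feature such as degree centrality from an adjacency matrix, then attach it to the training data as a predictor.

```r
# Symmetric adjacency matrix for a toy 5-person network (illustrative only)
A <- matrix(c(0, 1, 1, 0, 0,
              1, 0, 1, 1, 0,
              1, 1, 0, 0, 0,
              0, 1, 0, 0, 1,
              0, 0, 0, 1, 0),
            nrow = 5, byrow = TRUE,
            dimnames = list(paste0("p", 1:5), paste0("p", 1:5)))

degree <- rowSums(A)    # degree centrality: number of direct ties per person
degree

# In predictive modeling these network features join the modeling data set:
train <- data.frame(id = rownames(A), degree = degree)
train
```

With igraph the same feature comes from graph_from_adjacency_matrix() followed by degree(), but the base-R version makes the mechanics explicit.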

Regression via Gradient Descent in R

November 27, 2011 | Matt Bogard

In a previous post I derived the least squares estimators using basic calculus, algebra, and arithmetic, and also showed how the same results can be achieved using the canned functions in SAS and R or via the matrix programming capabilities offered by ...
[Read more...]
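A minimal version of the idea in the post is to fit y = b0 + b1*x by stepping down the gradient of the mean squared error and then checking the answer against R's canned lm() function (the simulated data below are my own illustration):

```r
set.seed(1)
x <- rnorm(100)
y <- 3 + 2 * x + rnorm(100, sd = 0.5)

b     <- c(0, 0)    # start at (b0, b1) = (0, 0)
alpha <- 0.1        # learning rate
for (i in 1:1000) {
  e    <- y - (b[1] + b[2] * x)               # residuals at current parameters
  grad <- c(-2 * mean(e), -2 * mean(e * x))   # gradient of the mean squared error
  b    <- b - alpha * grad                    # step downhill
}

b                   # gradient-descent estimates
coef(lm(y ~ x))     # closed-form least squares for comparison
</```r>
```

Both approaches land on the same coefficients; gradient descent just trades the closed-form solution for an iterative one that scales to models without one.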

Basic Econometrics in R and SAS

November 27, 2011 | Matt Bogard

Regression basics: y = b0 + b1*X, the regression line we want to fit. The method of least squares minimizes the squared distance between the line y and the individual data observations yi. That is, minimize: ∑ ei² = ∑ (yi − b0 − b1 Xi...
[Read more...]
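That minimization has the familiar closed-form solution b1 = ∑(xi − x̄)(yi − ȳ) / ∑(xi − x̄)² and b0 = ȳ − b1 x̄, which can be computed by hand in R and checked against lm() (the toy data below are my own illustration):

```r
x <- c(1, 2, 3, 4, 5)
y <- c(2.1, 3.9, 6.2, 7.8, 10.1)   # toy data, roughly y = 2x

b1 <- sum((x - mean(x)) * (y - mean(y))) / sum((x - mean(x))^2)
b0 <- mean(y) - b1 * mean(x)
c(b0, b1)                          # hand-computed least squares estimates

coef(lm(y ~ x))                    # R's canned function gives the same answer
```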

Gradient Descent in R

November 27, 2011 | Matt Bogard

In a previous post I discussed the concept of gradient descent. Given some recent work in the online machine learning course offered at Stanford, I'm going to extend that discussion with an actual example using R code (the actual code...
[Read more...]
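In one dimension the concept reduces to stepping downhill along the derivative until the updates stall. A minimal sketch (my own example function, not the post's code) minimizes f(x) = (x − 3)², whose minimum is at x = 3:

```r
f_prime <- function(x) 2 * (x - 3)   # derivative of f(x) = (x - 3)^2

x     <- 0       # starting point
alpha <- 0.1     # step size
for (i in 1:200) {
  x <- x - alpha * f_prime(x)        # update rule: x <- x - alpha * f'(x)
}
x                # converges to 3
```

Each update here is x ← 0.8x + 0.6, a contraction toward 3, which is why the iteration converges geometrically.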

Elements of Bayesian Econometrics

September 16, 2011 | Matt Bogard

posterior = (likelihood × prior) / integrated likelihood. The combination of a prior distribution and a likelihood function is utilized to produce a posterior distribution. Incorporating information from both the prior distribution and the likelihood function leads to a reduction in variance and an improved estimator. As n → ∞ the likelihood centers over the ...
[Read more...]
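Bayes' rule in the excerpt can be made concrete with a conjugate beta-binomial example (my own illustration, not the post's): with a Beta(2, 2) prior and 7 successes in 10 trials, the posterior is Beta(2 + 7, 2 + 3), and the variance reduction the excerpt mentions can be computed directly.

```r
a0 <- 2; b0 <- 2     # Beta(a0, b0) prior
k  <- 7; n  <- 10    # observed data: k successes in n trials

a1 <- a0 + k         # conjugate update: posterior is Beta(a1, b1)
b1 <- b0 + n - k

# Posterior mean sits between the prior mean (0.5) and the sample proportion (0.7)
a1 / (a1 + b1)

# Posterior variance is smaller than the prior variance, as the excerpt notes
prior_var <- a0 * b0 / ((a0 + b0)^2 * (a0 + b0 + 1))
post_var  <- a1 * b1 / ((a1 + b1)^2 * (a1 + b1 + 1))
c(prior_var, post_var)
```

As n grows, the data terms dominate the prior counts, which is the sense in which the likelihood centers the posterior.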

QTL Analysis in R

August 13, 2011 | Matt Bogard

See also: Part 1: QTL Analysis and Quantitative Genetics; Part 2: Statistical Methods for QTL Analysis. The 'qtl' package in R allows you to implement QTL analysis using the methods I've previously discussed. The code below is adapted from Broman... [Read more...]

R Program Documentation Template

August 13, 2011 | Matt Bogard

# ------------------------------------------------------------------
# |PROGRAM NAME:
# |DATE:
# |CREATED BY: MATT BOGARD
# |PROJECT FILE:
# |----------------------------------------------------------------
# | PURPOSE: ...
[Read more...]
