# 1500 search results for "regression"

## Bayesian linear regression analysis without tears (R)

November 17, 2013
By

Bayesian methods are sure to get some publicity after Valen Johnson’s PNAS paper regarding the use of Bayesian approaches to recalibrate p-value cutoffs from 0.05 to 0.005. Though the paper itself is bound to get some heat (see the discussion in Andrew Gelman’s blog and Matt Briggs’s fun-to-read deconstruction), the controversy might stimulate people to explore...

## Linear Regression with R : step by step implementation part-2

November 16, 2013
By

Welcome to the second part! In the previous part, we covered linear regression, the cost function, and gradient descent. In this part we will implement the whole process in R, step by step, using an example data set. I will use the data set provided in the machine learning class assignment. We will implement linear regression with one variable...
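The class data set mentioned in the excerpt is not reproduced here, but the whole procedure can be sketched in base R on simulated data (the true coefficients 5 and 2, the learning rate, and the iteration count are all illustrative choices, not the post's):

```r
set.seed(42)
x <- runif(100, 0, 10)
y <- 5 + 2 * x + rnorm(100)

# squared-error cost, the quantity gradient descent drives down
cost <- function(theta, x, y) mean((theta[1] + theta[2] * x - y)^2) / 2

alpha <- 0.02          # learning rate
theta <- c(0, 0)       # start at the origin
for (i in 1:5000) {
  err   <- theta[1] + theta[2] * x - y
  theta <- theta - alpha * c(mean(err), mean(err * x))  # gradient step
}
theta                   # close to coef(lm(y ~ x))
```

Comparing `theta` against `coef(lm(y ~ x))` is a convenient sanity check that the hand-rolled descent has converged.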

## Linear Regression with R : step by step implementation part-1

November 16, 2013
By

Welcome to the first part of my series of blog posts. In this post, I will discuss how to implement linear regression step by step in R by understanding the concept of regression. I will try to explain the concept of linear regression concisely and to convert the mathematical formulas into code (hope you...

## Nonlinear Gmm with R – Example with a logistic regression

November 7, 2013
By

In this post, I will explain how you can use the R gmm package to estimate a non-linear model, and more specifically a logit model. For my research, I have to estimate Euler equations using the Generalized Method of Moments. I contacted Pierre Chaussé, the creator of the gmm library for help, since I was having...
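The post relies on the gmm package, which is not shown in the excerpt. For a just-identified logit model, though, the idea can be sketched in base R alone: the GMM estimator with moment conditions E[x·(y − Λ(x′β))] = 0 solves the same equations as maximum likelihood, so minimising the GMM objective with `optim()` should land on the `glm()` fit (the data below are simulated; the true coefficients −0.5 and 1.2 are illustrative):

```r
set.seed(7)
n <- 500
x <- rnorm(n)
y <- rbinom(n, 1, plogis(-0.5 + 1.2 * x))
X <- cbind(1, x)

# GMM objective: sample moment conditions with an identity weighting matrix
gmm_obj <- function(beta) {
  r <- y - plogis(drop(X %*% beta))  # residual of the logit "link"
  g <- colMeans(X * r)               # sample analogue of E[x * r] = 0
  sum(g^2)
}
fit <- optim(c(0, 0), gmm_obj)
rbind(gmm = fit$par, mle = unname(coef(glm(y ~ x, family = binomial))))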

## Poisson regression fitted by glm(), maximum likelihood, and MCMC

October 29, 2013
By

The goal of this post is to demonstrate how a simple statistical model (Poisson log-linear regression) can be fitted using three different approaches. I want to demonstrate that both frequentists and Bayesians use the same models, and that it is the fitting procedure and the inference that differ. This is …
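Two of the three approaches can be sketched in base R on simulated data (the MCMC fit needs extra machinery not shown here; the true coefficients 0.5 and 1.5 are illustrative, not the post's):

```r
set.seed(3)
n <- 200
x <- runif(n)
y <- rpois(n, exp(0.5 + 1.5 * x))

# negative Poisson log-likelihood (dropping the log(y!) constant)
negloglik <- function(beta) {
  eta <- beta[1] + beta[2] * x   # log-linear predictor
  -sum(y * eta - exp(eta))
}
ml_fit  <- optim(c(0, 0), negloglik)$par
glm_fit <- unname(coef(glm(y ~ x, family = poisson)))
rbind(optim = ml_fit, glm = glm_fit)   # essentially identical
```

That the hand-rolled `optim()` fit and `glm()` agree is exactly the post's point: the model is the same, only the fitting machinery differs.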

## Some heuristics about local regression and kernel smoothing

October 8, 2013
By

In a standard linear model, we assume that $\mathbb{E}(Y\vert X=x)=\beta_0+\beta_1 x$. Alternatives can be considered when the linear assumption is too strong. Polynomial regression: a natural extension might be to assume some polynomial function. Again, in the standard linear model approach (with a conditional normal distribution, using the GLM terminology), parameters can be obtained using least squares, where a regression of...
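One of the local alternatives the post discusses, the Nadaraya-Watson kernel smoother, fits in a few lines of base R. This is a minimal sketch on simulated data (Gaussian kernel, fixed bandwidth h = 0.3; all choices here are illustrative, not the post's):

```r
set.seed(5)
x <- runif(200, 0, 2 * pi)
y <- sin(x) + rnorm(200, sd = 0.3)

# Nadaraya-Watson estimate of E(Y | X = x0)
nw <- function(x0, x, y, h) {
  w <- dnorm((x - x0) / h)   # Gaussian kernel weights around x0
  sum(w * y) / sum(w)        # locally weighted average of the responses
}
grid <- seq(0, 2 * pi, length.out = 50)
fit  <- sapply(grid, nw, x = x, y = y, h = 0.3)
```

The bandwidth h plays the role that the polynomial degree plays in polynomial regression: it controls how local the fit is, trading bias against variance.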

## Regression on variables, or on categories?

September 30, 2013
By

I admit it, the title sounds weird. The problem I want to address this evening is related to the use of the stepwise procedure on a regression model, and to discuss the use of categorical variables (and possible misinterpretations). Consider the following dataset > db = read.table("http://freakonometrics.free.fr/db2.txt",header=TRUE,sep=";") First, let us change the reference in our categorical variable (just to...
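The data set lives at freakonometrics.free.fr and is not reproduced here, but the "change the reference" step the excerpt cuts off is just `relevel()`. A minimal sketch on simulated data (the variable names Y and X3 mimic the excerpt; the levels and probabilities are made up):

```r
set.seed(9)
db <- data.frame(Y  = rbinom(200, 1, 0.4),
                 X3 = factor(sample(LETTERS[1:4], 200, replace = TRUE)))
db$X3 <- relevel(db$X3, ref = "C")   # make "C" the reference category
reg <- glm(Y ~ X3, data = db, family = binomial)
names(coef(reg))   # intercept plus dummies for A, B and D; "C" is absorbed
```

Changing the reference level does not change the fitted model, only which category the remaining coefficients are contrasted against, which is where the misinterpretations the post warns about tend to creep in.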

## Logistic regression and categorical covariates

September 26, 2013
By

A short post to get back – for my non-life insurance course – on the interpretation of the output of a regression when there is a categorical covariate. Consider the following dataset > db = read.table("http://freakonometrics.free.fr/db.txt",header=TRUE,sep=";") > tail(db) Y X1 X2 X3 995 1 4.801836 20.82947 A 996 1 9.867854 24.39920 C 997 1 5.390730 21.25119 D 998 1...
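The interpretation the post builds up to can be illustrated on simulated data (the names X3 and Y mirror the excerpt; the four levels and their probabilities are invented): with a single categorical covariate, the fitted probability for each level of a logistic regression is simply the raw proportion of Y = 1 within that level.

```r
set.seed(11)
X3 <- factor(sample(c("A", "B", "C", "D"), 300, replace = TRUE))
p  <- c(A = 0.2, B = 0.5, C = 0.7, D = 0.4)
Y  <- rbinom(300, 1, p[as.character(X3)])

reg <- glm(Y ~ X3, family = binomial)
# fitted probability per level vs. the raw proportion of Y = 1 in that level
fitted_by_level <- plogis(predict(reg, newdata = data.frame(X3 = levels(X3))))
empirical       <- tapply(Y, X3, mean)
round(cbind(fitted = fitted_by_level, raw = as.vector(empirical)), 3)
```

This equality holds because a logit with one factor covariate is saturated: one parameter per category, so the maximum-likelihood fit reproduces the group frequencies exactly.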

## Linear regression from a contingency table

September 7, 2013
By

This morning, Benoit sent me an email about an exercise he found in an econometrics textbook, about linear regression. Consider the following dataset. Here, the variable X denotes income, and Y the expenses. The goal was to fit a linear regression (actually, the email mentioned that we should try to fit a heteroscedastic model, but let...
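The textbook data set is not reproduced in the excerpt, but the underlying idea can be sketched on simulated data: when the regressor is constant within each cell of the table, a least-squares fit on the group means, weighted by the group counts, reproduces the fit on the raw observations (the five income classes and their counts below are invented):

```r
set.seed(13)
x <- rep(1:5, times = c(10, 20, 30, 25, 15))   # e.g. income classes
y <- 1 + 0.8 * x + rnorm(length(x))

# group means and counts, as they would appear in a contingency-style table
tab <- data.frame(x    = sort(unique(x)),
                  ybar = as.vector(tapply(y, x, mean)),
                  n    = as.vector(table(x)))

full    <- coef(lm(y ~ x))
grouped <- coef(lm(ybar ~ x, data = tab, weights = n))
rbind(full, grouped)   # identical, up to numerical noise
```

The equivalence follows from the normal equations: summing over observations and summing over groups (weighted by counts) give the same sums when x is constant within each group, which is what makes heteroscedastic weighting the natural next step the email hints at.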