1 vote

I want to do a Poisson regression by hand and define a function that can be used to estimate an arbitrary number of coefficients. I have two questions:

First: How can I get a matrix of betas so that I don't have to write every beta explicitly? I want to write lambda like this: lambda = exp(t(x) %*% beta). I thought I could use a for loop to create a beta for every column of x and collect them in a matrix, but I don't know how to code it.

Second: Since I don't know how to write the function for an arbitrary number of betas, I tried writing it for estimating 6 betas. I get a result with the warpbreaks dataset, but the coefficients are not the same as in glm. Why? I also don't know which values I have to pass to par, and I don't understand why optim doesn't work if I don't pass x and y to the function.

Hope you can help!

daten <- warpbreaks

LogLike <- function(y,x, par) {
  beta <- par
  # the deterministic part of the model:
  lambda <- exp(beta %*% t(x))
  # and here comes the negative log-likelihood of the whole dataset, given the model:
  LL <- -sum(dpois(y, lambda, log = TRUE))
  return(LL)
}


PoisMod<-function(formula, data){

  # formula
  form <- formula(formula)

  # model frame
  model <- model.frame(formula, data = data)

  # design matrix
  x <- model.matrix(formula, data = data)

  # response variable
  y <- model.response(model)

  par <- rep(0,ncol(x))

  call <- match.call()

  koef <- optim(par = par, fn = LogLike, x = x, y = y)$par

  result <- list("coefficients" = koef, "call" = call)
  class(result) <- "PoisMod"
  return(result)
}


print.PoisMod <- function(x, ...) {

  # print the call
  cat("Call:", "\n")
  print(x$call)
  cat("\n")

  # print the coefficients as a one-row table
  cat("Coefficients:", "\n")
  Koef <- t(x$coefficients)
  rownames(Koef) <- ""
  print(round(Koef, 3))
}
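
For reference, a minimal sketch of how I intend these pieces to be used together (with the daten <- warpbreaks line from above):

fit <- PoisMod(breaks ~ wool + tension, data = daten)
print(fit)   # should dispatch to print.PoisMod via the "PoisMod" class

# the coefficients I am comparing against:
coef(glm(breaks ~ wool + tension, family = "poisson", data = daten))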
but it doesn't work .. can you be more specific? Do you get an error? - MrSmithGoesToWashington
Sorry, my fault. I edited my question with better code and two explicit questions. - Dima Ku

1 Answer

2 votes

Here is a working example based on your code, but without the square of the explanatory variables:

LogLike <- function(y,x, par) {
  beta0 <- par[1]
  beta1 <- par[2]
  beta2 <- par[3]
  beta3 <- par[4]
  # the deterministic part of the model:
  lambda <- exp(beta0*x[,1] + beta1*x[,2] + beta2*x[,3] + beta3*x[,4])
  # and here comes the negative log-likelihood of the whole dataset, given the model:
  LL <- -sum(dpois(y, lambda, log = TRUE))
  return(LL)
}
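
As a side note on why exactly four betas are needed here: model.matrix builds one column per term of this formula on warpbreaks (intercept, woolB, tensionM, tensionH), and each column gets its own coefficient. You can check this with:

# one beta per design-matrix column
head(model.matrix(breaks ~ wool + tension, data = warpbreaks))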


PoisMod<-function(formula, data){

  # define the regression formula
  form <- formula(formula)

  # build the model frame
  model <- model.frame(formula, data = data)

  # build the design matrix
  x <- model.matrix(formula, data = data)

  # extract the response variable
  y <- model.response(model)

  par <- c(0,0,0,0)
  erg <- list(optim(par=par,fn=LogLike,x=x,y=y)$par)
  return(erg)
}

PoisMod(breaks~wool+tension, as.data.frame(daten))

And you can compare with glm:

glm(breaks~wool+tension, family = "poisson", data = as.data.frame(daten))
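
About the coefficients not being exactly the same as in glm: optim uses the Nelder-Mead simplex method by default, which can stop a little before the maximum-likelihood optimum that glm reaches via iteratively reweighted least squares. A rough sketch (assuming the LogLike above and daten <- warpbreaks) that usually gets much closer by switching to a gradient-based method:

# rebuild the design matrix and response outside PoisMod,
# then optimise with BFGS instead of the default Nelder-Mead
x <- model.matrix(breaks ~ wool + tension, data = daten)
y <- model.response(model.frame(breaks ~ wool + tension, data = daten))

optim(par = rep(0, ncol(x)), fn = LogLike, x = x, y = y,
      method = "BFGS", control = list(maxit = 500))$par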

Edit: for any number of explanatory variables

LogLike <- function(y,x, par) {
  beta <- par
  # the deterministic part of the model:
  lambda <- exp(beta %*% t(x))
  # and here comes the negative log-likelihood of the whole dataset, given the model:
  LL <- -sum(dpois(y, lambda, log = TRUE))
  return(LL)
}
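
Side note: because x is the n-by-p design matrix and par has length p, beta %*% t(x) and the perhaps more familiar x %*% beta give the same linear predictor (one as a 1 x n row, the other as an n x 1 column), so the deterministic part could equally be written with a small helper like this (the name lin_pred is just for this sketch):

# illustrative helper, equivalent to exp(beta %*% t(x)) up to orientation;
# as.vector() keeps the result a plain numeric vector
lin_pred <- function(x, beta) exp(as.vector(x %*% beta))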


PoisMod<-function(formula, data){

  # define the regression formula
  form <- formula(formula)

  # build the model frame
  model <- model.frame(formula, data = data)

  # build the design matrix
  x <- model.matrix(formula, data = data)

  # extract the response variable
  y <- model.response(model)

  par <- rep(0,ncol(x))
  erg <- list(optim(par=par,fn=LogLike,x=x,y=y)$par)
  return(erg)
}

PoisMod(breaks~wool+tension, as.data.frame(daten))
glm(breaks~wool+tension, family = "poisson", data = as.data.frame(daten))
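
To put the two sets of estimates side by side (PoisMod returns a one-element list, so [[1]] extracts the coefficient vector; both follow the model.matrix column order, so the names from glm can be reused):

fit_optim <- PoisMod(breaks ~ wool + tension, as.data.frame(daten))[[1]]
fit_glm   <- coef(glm(breaks ~ wool + tension, family = "poisson", data = daten))

names(fit_optim) <- names(fit_glm)
round(rbind(optim = fit_optim, glm = fit_glm), 4)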