Error in R optim function output

I am simulating values from the Birnbaum-Saunders distribution with parameters alpha and beta, and then trying to estimate those values using the optim function. Although the optimization converges, I get the warnings below. How do I fix this?

n <- 1000
z <- rnorm(n, 0, 1)
alpha <- 2
beta <- 2
t <- beta * ((alpha * z / 2) + sqrt((alpha * z / 2)^2 + 1))^2
remove(alpha, beta)  # clear the true values before estimating

Loglik <- function(par) {
  alpha <- par[1]
  beta <- par[2]
  ll <- sum(log(t + beta)) - n * log(2 * alpha) - (n / 2) * log(2 * pi * beta) -
    (3 / 2) * sum(log(t)) - (1 / (2 * alpha^2)) * sum((t / beta) + (beta / t) - 2)
  return(-ll)
}
alpha_0 <- 1
beta_0 <- 5
start <- c(alpha_0, beta_0)
optim(start, fn = Loglik, method = "BFGS", hessian = TRUE)$par

Output:

[1] 2.018785 2.133996
There were 25 warnings (use warnings() to see them)

The warnings are:

> warnings()
Warning messages:
1: In log(t + beta) : NaNs produced
2: In log(2 * pi * beta) : NaNs produced
3: In log(t + beta) : NaNs produced

I have already checked my likelihood function, which is correct, so I can't find the error in the code.

2 answers

These aren't errors per se; they're warnings that something happened in a way it probably shouldn't have, but that the code still ran to completion.

In your case, they appear because the function tried to take the logarithm of a negative number. Fortunately, optim handles this gracefully and ignores the problem, so much so that the result it returns looks correct.

Still, there are a few things you can do to avoid the warnings:

  1. You can use suppressWarnings() to hide the warnings. They will still be raised, just not printed:

suppressWarnings(optim(start, fn = Loglik, method = "BFGS", hessian = TRUE)$par)

  2. You can prevent negative values from being used when computing ll by adding the following line before the calculation:

if (any(c(t, beta, alpha) < 0)) return(NA)
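Putting it together, the guard sits right at the top of Loglik. A self-contained sketch using the question's own setup (the guard line is the only addition):

```r
# The question's Loglik with the guard added: bail out with NA before
# log() ever sees a negative argument.
set.seed(1)
n <- 1000
z <- rnorm(n, 0, 1)
t <- 2 * ((2 * z / 2) + sqrt((2 * z / 2)^2 + 1))^2  # true alpha = beta = 2

Loglik <- function(par) {
  alpha <- par[1]
  beta <- par[2]
  if (any(c(t, beta, alpha) < 0)) return(NA)  # the guard
  ll <- sum(log(t + beta)) - n * log(2 * alpha) - (n / 2) * log(2 * pi * beta) -
    (3 / 2) * sum(log(t)) - (1 / (2 * alpha^2)) * sum((t / beta) + (beta / t) - 2)
  -ll
}
```

Now a call such as Loglik(c(-1, 2)) returns NA instead of producing the NaN warnings.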

Note that these "solutions" are just cosmetic fixes to silence the warnings. A more elegant (and probably more correct) approach would be to analyze why the algorithm generates negative alpha and beta, decide whether that is actually a problem (it seems not) so you can make an informed choice, or else fix the problem at its source (in the optimization method or in the function being optimized).
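For instance, one way to fix it at the source (my own sketch, not part of the original answer) is to switch to method "L-BFGS-B", which supports box constraints, and give both parameters a small positive lower bound:

```r
# Sketch: constrain alpha and beta to stay positive by using L-BFGS-B,
# which accepts lower/upper bounds, so log() never sees a negative argument.
set.seed(1)
n <- 1000
z <- rnorm(n, 0, 1)
t <- 2 * ((2 * z / 2) + sqrt((2 * z / 2)^2 + 1))^2  # true alpha = beta = 2

Loglik <- function(par) {
  alpha <- par[1]
  beta <- par[2]
  ll <- sum(log(t + beta)) - n * log(2 * alpha) - (n / 2) * log(2 * pi * beta) -
    (3 / 2) * sum(log(t)) - (1 / (2 * alpha^2)) * sum((t / beta) + (beta / t) - 2)
  -ll
}

fit <- optim(c(1, 5), fn = Loglik, method = "L-BFGS-B",
             lower = c(1e-6, 1e-6), hessian = TRUE)
fit$par  # estimates close to the true values (2, 2), with no warnings
```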

  • @fsblajinha Good, the mathematical solution seems the best way out. Post it as an answer, even though the question is yours!


I would like to highlight another approach that I learned today, an interesting tip that may also help other people.

Inside the log-likelihood function, replace the parameters alpha and beta with exp(lalpha) and exp(lbeta), and then set lalpha_0 = log(alpha_0) and lbeta_0 = log(beta_0) as the starting values. The estimates are then exp(par[1]) and exp(par[2]).
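In code, the tip looks like this (a sketch reusing the question's setup; lalpha and lbeta are the log-scale parameters described above):

```r
# Reparameterize: optimize over lalpha = log(alpha) and lbeta = log(beta),
# so exp() guarantees positive alpha and beta at every optimizer step.
set.seed(1)
n <- 1000
z <- rnorm(n, 0, 1)
t <- 2 * ((2 * z / 2) + sqrt((2 * z / 2)^2 + 1))^2  # true alpha = beta = 2

Loglik <- function(par) {
  alpha <- exp(par[1])  # par[1] = lalpha
  beta <- exp(par[2])   # par[2] = lbeta
  ll <- sum(log(t + beta)) - n * log(2 * alpha) - (n / 2) * log(2 * pi * beta) -
    (3 / 2) * sum(log(t)) - (1 / (2 * alpha^2)) * sum((t / beta) + (beta / t) - 2)
  -ll
}

lalpha_0 <- log(1)
lbeta_0 <- log(5)
fit <- optim(c(lalpha_0, lbeta_0), fn = Loglik, method = "BFGS", hessian = TRUE)
exp(fit$par)  # back-transform: estimates close to (2, 2)
```

Since exp() maps all of the real line to positive values, BFGS can wander freely without ever handing log() a negative argument.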
