R: Error in is.nloptr(ret) : objective in x0 returns NA

粉色の甜心 2021-01-22 17:28

I am trying to use the nloptr package to find the optimal x value that maximizes the non-linear function F = b0 + b1*x + b2*x^2 + b3*x^3.

I am using the following code with apply.

1 Answer
  • 2021-01-22 18:02

    You should NOT be passing rows of "Regression" via apply if you are also intending to access items inside the function. There is also going to be a problem when apply coerces Regression to a single type: it will become character rather than numeric, because of the first column. Instead, it should be:
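    To see that coercion concretely, here is a minimal sketch with a made-up data frame standing in for "Regression" (first column character, the rest numeric coefficients):

```r
df <- data.frame(model = c("m1", "m2"),
                 b0 = c(1, 2), b1 = c(3, 4), b2 = c(5, 6), b3 = c(7, 8))

# apply() first coerces the data frame to a matrix; one character column
# drags every value along with it:
str(apply(df, 1, identity))      # character matrix
str(apply(df[-1], 1, identity))  # numeric matrix once column 1 is dropped
```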

    library(nloptr)
    F <- function(x, b0, b1, b2, b3) { return(b0 + b1*x + b2*x^2 + b3*x^3) }
    Optimal <- apply(Regression[-1],    # removes first column
                     1,                 # by rows
                     function(i) {      # i gets one row of coefficients
                       nloptr(x0 = c(0),
                              eval_f = F,
                              eval_g_ineq = NULL,
                              eval_g_eq = NULL,
                              eval_grad_f = NULL,
                              eval_jac_g_ineq = NULL,
                              eval_jac_g_eq = NULL,
                              lb = c(-Inf),
                              ub = c(Inf),
                              opts = list("algorithm" = "NLOPT_LD_AUGLAG",
                                          "xtol_rel" = 1.0e-7,
                                          "maxeval" = 1000),
                              b0 = i[1], b1 = i[2], b2 = i[3], b3 = i[4])
                     })
    

    Tested with your "Regression"-object. (I have concerns about whether there will be a minimum or a maximum when attempting to work with a cubic polynomial.) Unfortunately you have chosen parameters that are inconsistent:

    Error in is.nloptr(ret) : 
      A gradient for the objective function is needed by algorithm NLOPT_LD_AUGLAG 
    but was not supplied.
    

    It should be possible to calculate a gradient of a polynomial without too much difficulty, though.

    After constructing a gradient function I now get:

    grad_fun <- function(x, b0, b1, b2, b3) { b1 + 2*b2*x + 3*b3*x^2 }  # F'(x)
    > F <- function(x, b0,b1,b2,b3){return(b0+b1*x+b2*x^2+b3*x^3)}
    > Optimal <- apply(Regression[-1],
    +                  1, function(i) {
    +                    nloptr(x0 = c(0),
    +                           eval_f = F,
    +                           eval_g_ineq = NULL,
    +                           eval_g_eq = NULL,
    +                           eval_grad_f = grad_fun,
    +                           eval_jac_g_ineq = NULL,
    +                           eval_jac_g_eq = NULL,
    +                           lb = c(-Inf),
    +                           ub = c(Inf),
    +                           opts = list("algorithm" = "NLOPT_LD_AUGLAG",
    +                                       "xtol_rel" = 1.0e-7,
    +                                       "maxeval" = 1000),
    +                           b0 = i[1], b1 = i[2], b2 = i[3], b3 = i[4])
    +                  })
    Error in is.nloptr(ret) : 
      The algorithm NLOPT_LD_AUGLAG needs a local optimizer; specify an algorithm and termination condition in local_opts
    
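    For what it is worth, AUGLAG delegates the actual minimization to an inner optimizer, which has to be named (with its own termination condition) in a local_opts entry inside opts. A sketch of what that would look like (untested against your data; NLOPT_LD_LBFGS is just one gradient-based choice):

```r
# AUGLAG needs a local optimizer: name it, with a termination
# condition, under "local_opts" inside the opts list.
opts <- list("algorithm"  = "NLOPT_LD_AUGLAG",
             "xtol_rel"   = 1.0e-7,
             "maxeval"    = 1000,
             "local_opts" = list("algorithm" = "NLOPT_LD_LBFGS",
                                 "xtol_rel"  = 1.0e-7))
# ...then pass opts = opts in the nloptr() call as before.
```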

    It seems to me I've gotten you past several hurdles. This is not yet really an answer, but it seemed useful and was far too long for a comment.

    Edit: Further experiments with changing the algorithm to "algorithm" = "NLOPT_LD_LBFGS" get the code to run without error, but as far as I can see all 4 runs returned a list with $message : chr "NLOPT_FAILURE: Generic failure code.". My guess is that optimizing a cubic polynomial will generally fail without constraints, and I see none in your problem specification: a cubic is unbounded above and below, so an unconstrained maximum does not exist, only (at best) a local one.
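    Since the objective is a cubic, any interior local maximum can also be found analytically from the quadratic F'(x) = b1 + 2*b2*x + 3*b3*x^2, with no optimizer at all. A sketch (the helper name is mine):

```r
# Local maximum of b0 + b1*x + b2*x^2 + b3*x^3, if one exists:
# solve F'(x) = b1 + 2*b2*x + 3*b3*x^2 = 0, keep the root with F''(x) < 0.
cubic_local_max <- function(b0, b1, b2, b3) {
  if (b3 == 0) return(NA_real_)            # not actually cubic
  disc <- (2 * b2)^2 - 4 * (3 * b3) * b1   # discriminant of F'
  if (disc <= 0) return(NA_real_)          # no two distinct critical points
  roots <- (-2 * b2 + c(-1, 1) * sqrt(disc)) / (2 * 3 * b3)
  roots[2 * b2 + 6 * b3 * roots < 0]       # F''(x) < 0 => local maximum
}
cubic_local_max(0, 1, 0, -1)  # F = x - x^3: local max at x = 1/sqrt(3)
```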
