non-linear-regression

Is deep learning bad at fitting simple non-linear functions outside the training scope (extrapolating)?

不羁岁月 submitted on 2019-12-17 12:37:35
Question: I am trying to create a simple deep-learning model to predict y = x**2, but it looks like the network is not able to learn the general function outside the scope of its training set. Intuitively, I can see that a neural network might not be able to fit y = x**2, as there is no multiplication between the inputs. Please note I am not asking how to create a model that fits x**2; I have already achieved that. I want to know the answers to the following questions: Is my analysis correct? If the…
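The extrapolation failure is easy to reproduce. Below is a minimal sketch (my own example using scikit-learn's MLPRegressor, not the asker's model): a ReLU network learns y = x**2 well inside its training interval, but since a ReLU network is piecewise linear, it can only continue linearly outside that interval.

    # Train an MLP on y = x**2 for x in [-5, 5], then query points outside that range.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    x_train = rng.uniform(-5, 5, size=(2000, 1))
    y_train = x_train.ravel() ** 2

    model = MLPRegressor(hidden_layer_sizes=(64, 64), activation="relu",
                         max_iter=2000, random_state=0)
    model.fit(x_train, y_train)

    x_test = np.array([[2.0], [4.0], [10.0], [20.0]])   # last two lie outside [-5, 5]
    print(np.c_[x_test.ravel() ** 2, model.predict(x_test)])
    # First column: true x**2; second column: prediction. Inside the training
    # range the two roughly agree; outside it the error grows quickly.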

Error in using optim to maximise the likelihood in R

。_饼干妹妹 submitted on 2019-12-13 21:14:54
Question: So, I have these functions:

    funk1 <- function(a, x, l, r) { x^2 * exp(-(l * (1 - exp(-r * a)) / r)) }
    funk2 <- function(x, l, r) {
      sapply(x, function(s) {
        integrate(funk1, lower = 0, upper = s, x = s, l = l, r = r)$value
      })
    }

which are used to explain the data y in

    z <- data.frame(ts = 1:100, y = funk2(1:100, l = 1, r = 1) + rpois(100, 1:100))

I wish to use optim to maximise the likelihood, so I defined a likelihood function:

    LL_funk <- function(l, r) {
      n = nrow(z)
      R = sum((funk2(ts, l, r) - y)^2)
      logl = -((n/2)*log(R…
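For orientation, here is a sketch of the same fitting pattern in Python (an assumed analogue, not the asker's code): the model value is a numerical integral, the log-likelihood is driven by the residual sum of squares, and the objective takes a single parameter vector, which is also what R's optim passes to its fn argument.

    import numpy as np
    from scipy.integrate import quad
    from scipy.optimize import minimize

    def funk1(a, x, l, r):
        return x**2 * np.exp(-(l * (1 - np.exp(-r * a)) / r))

    def funk2(xs, l, r):
        # integrate funk1 over a from 0 to s, with x fixed at s
        return np.array([quad(funk1, 0, s, args=(s, l, r))[0] for s in xs])

    ts = np.arange(1, 101)
    rng = np.random.default_rng(1)
    y = funk2(ts, 1.0, 1.0) + rng.poisson(ts)

    def neg_log_lik(theta):            # one vector argument, as optim also expects
        l, r = theta
        n = len(y)
        R = np.sum((funk2(ts, l, r) - y) ** 2)
        return (n / 2) * np.log(R)     # Gaussian negative log-likelihood up to constants

    fit = minimize(neg_log_lik, x0=[0.5, 0.5], method="L-BFGS-B",
                   bounds=[(1e-6, 10), (1e-6, 10)])
    print(fit.x)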

Why does IPOPT evaluate objective function despite breaching constraints?

こ雲淡風輕ζ submitted on 2019-12-12 16:09:44
Question: I'm using IPOPT within Julia. My objective function will throw an error for certain parameter values (specifically, though I assume this doesn't matter, it involves a Cholesky decomposition of a covariance matrix and so requires that the covariance matrix be positive definite). As such, I non-linearly constrain the parameters so that they cannot produce an error. Despite this constraint, IPOPT still insists on evaluating the objective function at parameters which cause my objective function…
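In general, interior-point methods only guarantee that nonlinear constraints hold at the solution, so trial points during the search can still violate them. A common workaround (a generic Python sketch of the pattern, not IPOPT- or Julia-specific; build_cov and loss are hypothetical placeholders) is to make the objective total by catching the failure and returning a large finite penalty:

    import numpy as np

    def safe_objective(theta, build_cov, loss):
        # build_cov and loss are hypothetical, model-specific helpers.
        cov = build_cov(theta)
        try:
            chol = np.linalg.cholesky(cov)   # fails if cov is not positive definite
        except np.linalg.LinAlgError:
            return 1e10                      # large finite penalty instead of an error
        return loss(theta, chol)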

Nonlinear Regression Toolbox in MATLAB (nlinfit) [duplicate]

…衆ロ難τιáo~ submitted on 2019-12-12 03:54:23
Question: This question already exists: Nonliner Regression Toolbox in Matlab (nlnfit). Closed 2 years ago. Does anyone know which algorithm and objective function the nonlinear regression toolbox in MATLAB uses? I looked at the MATLAB website, but it does not provide this information.

Answer 1: The Algorithms and References section at the end of the doc for nlinfit lists the algorithms and gives references for them. The objective function is (obviously?) different depending on the problem being solved.

Source: https:/
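For context, a standard statement of the objective behind such nonlinear regression routines (my own summary, not quoted from the MATLAB documentation): given a model $f(x,\beta)$, the fit minimises the (optionally weighted) sum of squared residuals, $\hat\beta = \arg\min_\beta \sum_{i=1}^{n} w_i\,\big(y_i - f(x_i,\beta)\big)^2$, typically via an iterative scheme such as Levenberg-Marquardt.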

NLS Regression in ggplot2, Plotting y = Ax^b Trendline Error

冷暖自知 submitted on 2019-12-11 16:18:47
Question: I'm attempting to fit a basic power trendline to a set of 3 data points, as you could do in Excel, to mimic the function y = Ax^b. I have a very simple data set loaded into LCurve.data as follows:

    MDPT = {4, 10.9, 51.6}
    AUC = {287069.4, 272986.0, 172426.1}

    fm0 <- nls(log(LCurve.data$AUC) ~ log(a) + b * log(LCurve.data$MDPT),
               data = LCurve.data, start = list(a = 1, b = 1))
    ggplot(LCurve.data, aes(x = MDPT, y = AUC)) + geom_line() + geom_point() +
      stat_smooth(method = 'nls', formula = y ~ a * x ^ b…
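The same power-law fit can be illustrated outside ggplot2. A minimal Python sketch using the three points from the post and scipy.optimize.curve_fit (my own example, not the asker's code):

    # Fit y = A * x**b to the three points from the post and plot the curve.
    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.optimize import curve_fit

    mdpt = np.array([4.0, 10.9, 51.6])
    auc = np.array([287069.4, 272986.0, 172426.1])

    def power_law(x, a, b):
        return a * x**b

    params, _ = curve_fit(power_law, mdpt, auc, p0=(3e5, -0.1))
    a_hat, b_hat = params

    xs = np.linspace(mdpt.min(), mdpt.max(), 200)
    plt.scatter(mdpt, auc)
    plt.plot(xs, power_law(xs, a_hat, b_hat))
    plt.xlabel("MDPT"); plt.ylabel("AUC")
    plt.show()

On the ggplot2 side, the usual fix is to pass the nls starting values to stat_smooth through its method.args argument, with se = FALSE since predict.nls does not supply standard errors for the confidence band.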

Negative Binomial Regression: coefficient interpretation

半城伤御伤魂 submitted on 2019-12-11 15:52:23
Question: How should the coefficients (intercept, categorical variable, continuous variable) in a negative binomial regression model be interpreted? What is the base formula behind the regression (as for Poisson regression it is $\ln(\mu)=\beta_0+\beta_1 x_1 + \dots$)? Below is an example output from the specific model I want to interpret, where seizure.rate is a count variable and treatment is categorical (placebo vs. non-placebo).

    Call: glm.nb(formula = seizure.rate2 ~ treatment2, data =…
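For reference (standard negative binomial regression with a log link, my own summary rather than an answer from the thread): the mean model is the same as in Poisson regression, $\ln(\mu_i) = \beta_0 + \beta_1 x_{1i} + \dots$, while the variance is inflated by a dispersion parameter $\theta$, with $\mathrm{Var}(Y_i) = \mu_i + \mu_i^2/\theta$ in the parameterisation used by glm.nb. Each coefficient is therefore an additive effect on the log of the expected count, and $\exp(\beta_j)$ is a rate ratio; for the binary treatment variable here, $\exp(\beta_1)$ is the expected seizure count under treatment divided by that under placebo, holding any other covariates fixed.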

NLS Function - Number of Iterations Exceeds max

浪子不回头ぞ submitted on 2019-12-11 07:57:49
Question: I have a dataset that looks like this:

    dput(testing1)
    structure(list(x = c(0, 426.263081392053, 852.526162784105, 1278.78924417616,
    1705.05232556821, 2131.31540696026, 2557.57848835232, 2983.84156974437,
    3410.10465113642, 3836.36773252847, 4262.63081392053, 4688.89389531258,
    5115.15697670463, 5541.42005809668, 5967.68313948874, 6393.94622088079,
    6820.20930227284, 7246.4723836649, 7672.73546505695, 8098.998546449,
    8525.26162784105, 8951.52470923311, 9377.78779062516, 9804.05087201721, 10230…

TensorFlow: Nonlinear regression

南楼画角 submitted on 2019-12-10 13:45:56
Question: I have these features and labels, which are not linear enough to be satisfied with a linear solution. I trained an SVR(kernel='rbf') model from sklearn, but now it's time to do it with TensorFlow, and it's hard to say what one should write to achieve the same or a better effect. Do you see that lazy orange line down there? It doesn't fill you with determination. The code itself:

    import pandas as pd
    import numpy as np
    import tensorflow as tf
    import tqdm
    import matplotlib.pyplot as plt
    from omnicomm_data.test_data…
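A common TensorFlow counterpart to sklearn's SVR(kernel='rbf') is a small fully connected network with nonlinear activations trained on a mean-squared-error loss. A minimal Keras sketch follows (my own example with stand-in data; the asker's omnicomm_data is not shown in the excerpt):

    import numpy as np
    import tensorflow as tf

    # Stand-in data: a noisy nonlinear relationship.
    rng = np.random.default_rng(0)
    x = rng.uniform(-3, 3, size=(1000, 1)).astype("float32")
    y = (np.sin(2 * x) + 0.1 * rng.normal(size=(1000, 1))).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1),          # linear output unit for regression
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(x, y, epochs=50, batch_size=32, verbose=0)
    print(model.predict(x[:5], verbose=0))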

Non-linear regression in Seaborn Python

限于喜欢 submitted on 2019-12-10 10:49:14
Question: I have the following dataframe that I wish to perform some regression on. I am using Seaborn but can't quite seem to find a non-linear function that fits. Below are my code and its output, and below that is the dataframe I am using, df. Note I have truncated the axis in this plot. I would like to fit either a Poisson- or Gaussian-style distribution function.

    import pandas
    import seaborn
    graph = seaborn.lmplot('$R$', 'Equilibrium Value', data = df,
                           fit_reg=True, order=2, ci=None)
    graph.set…
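seaborn's lmplot essentially offers polynomial (order=...), logistic, robust, or lowess fits, so a Poisson- or Gaussian-shaped curve is normally fitted separately and drawn over the scatter. A minimal sketch with stand-in data in place of the asker's df:

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.optimize import curve_fit

    # Stand-in data with a roughly bell-shaped trend.
    x = np.linspace(0, 10, 60)
    rng = np.random.default_rng(2)
    y = 5.0 * np.exp(-(x - 4.0) ** 2 / (2 * 1.5 ** 2)) + 0.2 * rng.normal(size=x.size)

    def gaussian(x, a, mu, sigma):
        return a * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

    params, _ = curve_fit(gaussian, x, y, p0=(1.0, 5.0, 1.0))

    plt.scatter(x, y, s=15)
    xs = np.linspace(x.min(), x.max(), 300)
    plt.plot(xs, gaussian(xs, *params))
    plt.xlabel("$R$"); plt.ylabel("Equilibrium Value")
    plt.show()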

Failing to fit with non-linear fitting methods (nlsLM, nlxb and wrapnls)

Deadly submitted on 2019-12-09 23:45:41
Question: I have an nls fitting task that I wanted to do with R. My first attempt to do this is here, and as @Roland pointed out, "The point is that complex models are difficult to fit. The more so, the less the data supports the model, until it becomes impossible. You might be able to fit this if you had extremely good starting values." I can agree with @Roland, but if Excel can do this fitting, why can't R? Basically, this fitting can be done with Excel's GRG Nonlinear solver, but the process is very…
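One reason spreadsheet solvers sometimes succeed where a single nls call fails is that a bounded, more global search effectively supplies the good starting values @Roland mentions. A generic two-stage Python sketch of that strategy (model() below is a hypothetical placeholder, since the actual model is not shown in the excerpt): search the bounded parameter box globally, then refine locally.

    import numpy as np
    from scipy.optimize import differential_evolution, least_squares

    def model(x, p):                       # placeholder for the real model
        a, b, c = p
        return a * np.exp(-b * x) + c

    x = np.linspace(0, 10, 50)
    rng = np.random.default_rng(3)
    y = model(x, (2.0, 0.7, 0.5)) + 0.05 * rng.normal(size=x.size)

    bounds = [(0, 10), (0, 5), (-2, 2)]
    def sse(p):
        return np.sum((model(x, p) - y) ** 2)

    start = differential_evolution(sse, bounds, seed=0).x        # global search
    refined = least_squares(lambda p: model(x, p) - y, start,    # local refinement
                            bounds=([0, 0, -2], [10, 5, 2]))
    print(refined.x)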