regression

Tensorflow keras timeseries prediction with X and y having different shapes

半世苍凉 submitted on 2021-02-10 15:43:05
Question: I am trying to do time series prediction with TensorFlow and Keras, where X and y have different shapes: X.shape = (5000, 12) and y.shape = (5000, 3, 12). When I do the following

    n_input = 7
    generator = TimeseriesGenerator(X, y, length=n_input, batch_size=1)
    for i in range(5):
        x_, y_ = generator[i]
        print(x_.shape)
        print(y_.shape)

I get the desired output

    (1, 7, 12)
    (1, 3, 12)
    (1, 7, 12)
    (1, 3, 12)
    ...

This is because my data is meteorological: I have 5000 days, and for training in the array X I
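The excerpt is cut off above, but for reference, here is a minimal sketch (not the asker's code; the LSTM width and layer choices are assumptions) of a Keras model whose input matches the generator's (7, 12) windows and whose output matches the (3, 12) targets:

    import numpy as np
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense, Reshape
    from tensorflow.keras.preprocessing.sequence import TimeseriesGenerator

    # Placeholder arrays with the shapes from the question.
    X = np.random.rand(5000, 12)
    y = np.random.rand(5000, 3, 12)

    n_input = 7
    generator = TimeseriesGenerator(X, y, length=n_input, batch_size=1)

    model = Sequential([
        LSTM(64, input_shape=(n_input, 12)),  # consume 7 days x 12 features
        Dense(3 * 12),                        # predict 3 days x 12 features
        Reshape((3, 12)),                     # match the (3, 12) target shape
    ])
    model.compile(optimizer="adam", loss="mse")
    model.fit(generator, epochs=1)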

'PolynomialFeatures' object has no attribute 'predict'

跟風遠走 submitted on 2021-02-10 11:55:51
Question: I want to apply k-fold cross-validation to the following regression models: Linear Regression, Polynomial Regression, Support Vector Regression, Decision Tree Regression, and Random Forest Regression. I am able to apply k-fold cross-validation to all of them except polynomial regression, which gives me the error 'PolynomialFeatures' object has no attribute 'predict'. How can I work around this issue? Also, am I doing the job correctly? My main motive is to see which model performs better, so is there
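The excerpt ends mid-sentence, but the error itself usually comes from passing the PolynomialFeatures transformer, which has no predict method, directly to the cross-validation routine. A minimal sketch of the usual workaround, assuming scikit-learn and placeholder data, is to wrap the transformer and a LinearRegression estimator in a Pipeline and cross-validate the pipeline as one model:

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import cross_val_score

    # Placeholder data for illustration only.
    X = np.random.rand(100, 3)
    y = np.random.rand(100)

    # The pipeline exposes fit/predict, so cross_val_score can use it directly.
    poly_model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
    scores = cross_val_score(poly_model, X, y, cv=10, scoring="r2")
    print(scores.mean())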

Manually bootstrapping linear regression in R

早过忘川 submitted on 2021-02-10 11:53:05
Question: Hi guys, I am asking for help as I am stuck with bootstrapping... The task is: use the nonparametric bootstrap to compute the bootstrap standard error of the CAPM beta estimate, based on 1000 bootstrap replications and a bootstrap sample size equal to the size of the original sample. If I understand it correctly, I am supposed to run my regression model 1000 times to obtain different estimates of the beta and its standard error. However, I am not able to put my thoughts into actual R code.
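The question asks for R code, which the excerpt does not include; purely as a language-neutral illustration of the procedure described above (resample rows with replacement, refit the regression, take the standard deviation of the 1000 beta estimates), here is a sketch in Python with made-up return series:

    import numpy as np

    rng = np.random.default_rng(0)
    n = 250
    market = rng.normal(0.0, 0.01, n)                  # placeholder market excess returns
    asset = 1.2 * market + rng.normal(0.0, 0.01, n)    # placeholder asset returns, true beta 1.2

    def capm_beta(x, y):
        # Slope of the OLS fit y ~ x (np.polyfit returns the highest degree first).
        return np.polyfit(x, y, 1)[0]

    n_boot = 1000
    betas = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)               # bootstrap sample of the original size
        betas[b] = capm_beta(market[idx], asset[idx])

    print("bootstrap SE of beta:", betas.std(ddof=1))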

Can't Calculate pixel-wise regression in R on raster stack with fun

三世轮回 submitted on 2021-02-10 11:51:00
Question: I am working with rasters and I have a RasterStack with 7 layers. I would like to calculate a pixel-wise regression using the formula below. I was trying to do it with raster::calc, but my function failed with the message: 'Error in lm.fit(x, y, offset = offset, singular.ok = singular.ok, ...) : 0 (non-NA) cases.' But all the rasters are fine and contain numbers (not only NAs); I can plot them, and I can calculate a general linear regression with the formula cr.sig=lm(raster::as.array(MK_trend.EVI.sig_Only) ~
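The formula the question refers to is not shown in the excerpt. Purely to illustrate what a pixel-wise regression over a 7-layer stack means (and why a '0 (non-NA) cases' error often points at cells where every layer is NA), here is a small Python/numpy sketch, not the raster::calc solution the question asks for:

    import numpy as np

    # Placeholder stand-in for a 7-layer RasterStack: layers x rows x cols.
    stack = np.random.rand(7, 100, 100)
    t = np.arange(stack.shape[0])        # layer index used as the predictor

    slopes = np.full(stack.shape[1:], np.nan)
    for i in range(stack.shape[1]):
        for j in range(stack.shape[2]):
            y = stack[:, i, j]
            ok = ~np.isnan(y)
            if ok.sum() >= 2:            # lm-style fits need at least 2 non-NA cases
                slopes[i, j] = np.polyfit(t[ok], y[ok], 1)[0]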

Polynomial Regression values generated too far from the coordinates

主宰稳场 submitted on 2021-02-10 06:38:29
Question: As per the code below for computing polynomial regression coefficients, when I calculate the regression value at any x point, the value obtained is far away from the corresponding y coordinate (especially for the coordinates below). Can anyone explain why the difference is so high, whether it can be minimized, or whether there is a flaw in my understanding? The current requirement is a difference of no more than 150 at any point.

    import numpy as np
    x=[0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100]
    y=[0,885
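The y list is cut off above, so the values below are placeholders. As a minimal sketch of fitting and evaluating a polynomial with numpy (the degree and data are assumptions, not the asker's), note that np.polyval expects the coefficients with the highest degree first, exactly as np.polyfit returns them; evaluating them in the wrong order, or fitting too low a degree, is a common cause of values far from the original y coordinates:

    import numpy as np

    x = np.array([0, 5, 10, 15, 20, 25, 30, 35, 40, 45, 50,
                  55, 60, 65, 70, 75, 80, 85, 90, 95, 100])
    y = x**2 + 100 * np.random.rand(x.size)   # placeholder data, not the question's

    coeffs = np.polyfit(x, y, deg=3)          # degree 3 is an illustrative choice
    fitted = np.polyval(coeffs, x)            # evaluate the fit at the original x points
    print(np.max(np.abs(fitted - y)))         # largest pointwise difference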
