wls
Syntax
wls(Y, X, W, [intercept=true], [mode=0])
Arguments
Y is the dependent variable. It must be a vector.
X is the independent variable(s).
X can be a vector, matrix, table, or tuple. When X is a matrix: if the number of rows equals the length of Y, each column of X is a factor; if the number of rows does not equal the length of Y but the number of columns does, each row of X is a factor.
W is a vector of weights, where each element must be non-negative.
intercept (optional) is a Boolean value indicating whether to include an intercept in the regression. The default value is true. When it is true, the system automatically appends a column of 1s to X to estimate the intercept.
mode (optional) is an integer that can be 0, 1, or 2:
- 0: returns a vector of the coefficient estimates
- 1: returns a table with the coefficient estimates, standard errors, t-statistics, and p-values
- 2: returns a dictionary with all statistics
ANOVA (one-way analysis of variance)
| Source of Variance | DF (degrees of freedom) | SS (sum of squares) | MS (mean square) | F (F-score) | Significance |
|---|---|---|---|---|---|
| Regression | p | sum of squares regression, SSR | regression mean square, MSR=SSR/p | MSR/MSE | p-value |
| Residual | n-p-1 | sum of squares error, SSE | mean square error, MSE=SSE/(n-p-1) | | |
| Total | n-1 | sum of squares total, SST | | | |
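For reference, with n observations and p regressors (excluding the intercept), the quantities in this table satisfy the standard identities below; these are the textbook relations, and they agree with the mode=2 example output further down:

$$
SST = SSR + SSE, \qquad MSR = \frac{SSR}{p}, \qquad MSE = \frac{SSE}{n-p-1}, \qquad F = \frac{MSR}{MSE}.
$$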
RegressionStat (Regression statistics)
| Item | Description | 
|---|---|
| R2 | R-squared | 
| AdjustedR2 | R-squared adjusted for the degrees of freedom, accounting for the number of terms in the model relative to the sample size. | 
| StdError | The residual standard error (the degrees-of-freedom-corrected standard deviation of the residuals). | 
| Observations | The sample size. | 
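Under the usual definitions (stated here only as a reference; this is the standard textbook formulation), these statistics are:

$$
R^2 = 1 - \frac{SSE}{SST}, \qquad AdjustedR2 = 1 - \frac{(1 - R^2)(n-1)}{n-p-1}, \qquad StdError = \sqrt{MSE}.
$$

These relations reproduce the values in the mode=2 example output below (e.g., AdjustedR2 = 1 - (1 - 0.957998) * 6 / 4 = 0.936997).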
Coefficient
| Item | Description | 
|---|---|
| factor | Names of the independent variables (and the intercept, if included) | 
| beta | Estimated regression coefficients | 
| StdError | Standard errors of the regression coefficients | 
| tstat | t-statistics, indicating the significance of the regression coefficients | 
| pvalue | p-values of the t-statistics | 
Residual: the difference between each observed value and the corresponding fitted value.
Details
Return the result of a weighted least squares (WLS) regression of Y on X, using W as the observation weights.
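Weighted least squares minimizes the weighted sum of squared residuals. As a reference only (the textbook formulation, not necessarily the internal implementation), writing W as a diagonal matrix of the weights, the estimator and the residuals reported under mode=2 are:

$$
\hat{\beta} = (X^\top W X)^{-1} X^\top W y, \qquad \hat{y} = X\hat{\beta}, \qquad r = y - \hat{y}.
$$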
Examples
x1=1 3 5 7 11 16 23
x2=2 8 11 34 56 54 100
y=0.1 4.2 5.6 8.8 22.1 35.6 77.2;
w=rand(10,7)  // weights are random, so the numeric outputs below vary across runs
wls(y, x1, w)
// output
[-17.6177  4.0016]
wls(y, (x1,x2), w);
// output
[-17.4168  3.0481 0.2214]
wls(y, (x1,x2), w, 1, 1);
| factor | beta | stdError | tstat | pvalue | 
|---|---|---|---|---|
| Intercept | -17.4168 | 4.8271 | -3.6081 | 0.0226 | 
| x1 | 3.0481 | 1.6232 | 1.8779 | 0.1336 | 
| x2 | 0.2214 | 0.3699 | 0.5986 | 0.5817 | 
wls(y, (x1,x2), w, 1, 2);
// output
Coefficient->
factor    beta      stdError tstat     pvalue
--------- --------- -------- --------- --------
intercept -10.11392 4.866583 -2.078239 0.106234
x1        3.938138  2.061191 1.910613  0.128655
x2        -0.088542 0.446667 -0.198227 0.852534
Residual->[6.452866,3.207839,-3.002812,-5.642629,-6.147264,-12.515038,5.590914]
RegressionStat->
item         statistics
------------ ----------
R2           0.957998
AdjustedR2   0.936997
StdError     17.172833
Observations 7
ANOVA->
Breakdown  DF SS           MS           F         Significance
---------- -- ------------ ------------ --------- ------------
Regression 2  26905.306594 13452.653297 45.616718 0.001764
Residual   4  1179.624835  294.906209
Total      6  28084.931429
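Because mode=2 returns a dictionary, individual components can be retrieved by key. A minimal sketch, assuming the keys match the names shown in the output above:

result = wls(y, (x1,x2), w, 1, 2)
result["Coefficient"]      // coefficient table: factor, beta, stdError, tstat, pvalue
result["RegressionStat"]   // R2, AdjustedR2, StdError, Observations
result["ANOVA"]            // the ANOVA table
result["Residual"]         // vector of residuals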
x=matrix(1 4 8 2 3, 1 4 2 3 8, 1 5 1 1 5);
w=rand(8,5)
wls(1..5, x, w, 0, 1);
| factor | beta | stdError | tstat | pvalue | 
|---|---|---|---|---|
| beta0 | 0.0026 | 1.4356 | 0.0018 | 0.9988 | 
| beta1 | -1 | 1.2105 | -0.8261 | 0.5605 | 
| beta2 | 0.4511 | 0.5949 | 0.7582 | 0.587 | 
| beta3 | 1.687 | 1.7389 | 0.9701 | 0.5097 | 
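X can also be passed as an in-memory table. A minimal sketch, assuming the table form behaves like the tuple form in the examples above (the column names would then appear as the factor names):

t = table(x1 as x1, x2 as x2)
wls(y, t, w)   // expected to be equivalent to wls(y, (x1,x2), w)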
