Constraining Constants with Regression

Guest

It seems Excel presents only two options for working with the regression
constant:

- Excel sets the constant

- Constant = zero


Does anyone know of a way to constrain the constant and get Excel to
calculate a slope?

For example, what would the slope of this array be, given the constant = 1000?

 
Guest

Hi

Calculating the regression coefficients with a set constant term is
equivalent to calculating them with that constant subtracted from both sides
of the equation. Hence, you subtract the constant from the y values and then
force the intercept = 0.
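A small numeric sketch of that approach (the data and the constant 1000 are made up for illustration; in Excel this corresponds to something like =LINEST(y_range - 1000, x_range, FALSE)):

```python
# Sketch: to constrain the intercept to c, subtract c from every y value
# and fit a least-squares line through the origin.
# The x/y series below are hypothetical.
x = [1, 2, 3, 4, 5]
y = [1010, 1022, 1029, 1041, 1052]
c = 1000  # the constant we want to fix the intercept at

# Slope of a regression forced through the origin: sum(x*(y-c)) / sum(x*x)
slope = sum(xi * (yi - c) for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)
print(slope)  # about 10.27 for this made-up series
```

The fitted line is then y = c + slope * x, with c fixed at 1000.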
 
Guest

Wigi,

Thanks for your help, although I'm not sure this is the correct approach. I
regressed the same time series with the constant set to zero, kept the
coefficients the same, and added the constant I wanted afterward (if I
understand you correctly), but the result does not fit the curve. Setting the
constant equal to zero makes the slope (coefficient) too big, and after
adding the intercept, my regression line lies almost entirely above my data
series instead of fitting nicely on top of it.
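That symptom is what you get if the constant is not subtracted from y before forcing the intercept to zero. A sketch of the difference, using a hypothetical series with a true intercept near 1000:

```python
# Made-up series with intercept ~1000, to contrast the two procedures.
x = [1, 2, 3, 4, 5]
y = [1010, 1022, 1029, 1041, 1052]
c = 1000
sxx = sum(xi * xi for xi in x)

# Subtract c first, then fit through the origin (the suggested approach):
slope_adjusted = sum(xi * (yi - c) for xi, yi in zip(x, y)) / sxx

# Force intercept = 0 on the raw y values, then add c afterward:
slope_raw = sum(xi * yi for xi, yi in zip(x, y)) / sxx

print(slope_adjusted)  # about 10.27 -- tracks the data once c is added back
print(slope_raw)       # 283.0 -- far too steep, line sits above the data
```

With the raw-y slope, the line c + slope_raw * x overshoots every point, which matches the behavior described above.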
 
