Posted on Sunday, 25th March 2012

Please finish chapter 12 and post a comment.

Posted in Class | Comments (15)

  1. Eric VanEpps Says:

    When using forward selection to determine which variables to include in a multiple regression model, how should possible interaction terms be handled? The algorithm described seems to deal only with adding individual variables and examining the R^2 and t-statistics, but at what stage should a researcher also add interaction terms if they seem appropriate? (See the sketch after the comments.)

  2. Scott Kennedy Says:

    In linear regression, what do the p-values and R^2 tell me about the data? More specifically, what does it mean to have a significant p-value for a variable but a fairly poor R^2? Can I have a good R^2 and a high p-value? (See the simulation after the comments.)

  3. Sharlene Flesher Says:

    Can you explain more about how the interaction between variables is used?

  4. Rich Truncellito Says:

    In model selection, when looking for the best predictor variables, how should one account for the possibly significant interaction terms? For example, in the forward selection process outlined, should each new regression include all possible and relevant interactions?

  5. Shubham Debnath Says:

    Are there ways to quantify interaction effects other than defining an x1*x2 variable? Using ANOVA? (See the sketch after the comments.)

  6. Ben Dichter Says:

    When looking at interaction effects, you might end up with a very large number of variables. How do you know which combinations of variables might be important? How do you deal with over-fitting?

  7. Jay Scott Says:

    Regarding model selection, will ‘forward selection’ and ‘backward elimination’ converge on the same model?

  8. Matt Panico Says:

    Does the forward selection algorithm also apply to interaction effects? Should we consider interaction effects with every variable we add to the model, or only where we think it would make sense?

  9. David Zhou Says:

    Can you go over the source localization problem for MEG that you go into in Example 12.9? I’m not clear on your use of penalized least squares in minimum norm estimate. It sounds like an exciting statistical technique, but I’d love to hear more about it.

  10. Yijuan Du Says:

    When we choose the explanatory variables, do we usually try to make them uncorrelated, or does it not matter whether they are correlated? Does their correlation only influence the interpretation?

  11. Rob Rasmussen Says:

    Is it possible to use a form of multiple linear regression for non-cosine directional tuning, or would it be limited to using kernel regression methods or some other technique?

  12. Rex Tien Says:

    It would be very helpful to have a picture of the example in 12.5.5.

    Is there any way of selectively shrinking certain coefficients? (See the generalized-ridge sketch after the comments.)

  13. Matt Bauman Says:

    I’m slightly confused by the remarks on shrinkage. Why does a high coefficient suggest an irrelevant variable? How does adding the penalty term fix problems with non-invertible matrices?

  14. Thomas Kraynak Says:

    If you want to simplify your model, what are some methods to determine whether to use forward selection, backward elimination, and/or stepwise regression?

  15. Kelly Says:

    I’m still unclear on how cross-validation works. Can you clarify? (See the sketch after the comments.)
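
A few sketches prompted by the questions above. They are all in Python with numpy; every dataset, parameter, and variable name in them is made up for illustration, so read them as sketches of the general ideas rather than as the book's own methods.

For comments 1, 4, and 8 (interaction terms in forward selection): there is no single rule, but one common convention is to put the interactions in the candidate pool and let an interaction become eligible only once both of its main effects are already in the model. A minimal sketch, using improvement in R^2 as a crude selection criterion:

    import numpy as np
    from itertools import combinations

    rng = np.random.default_rng(0)
    n = 200
    x1, x2, x3 = rng.normal(size=(3, n))
    y = 1.0 + 2.0 * x1 + 1.5 * x2 - 1.0 * x1 * x2 + rng.normal(size=n)

    # Candidate pool: main effects plus all pairwise interactions.
    candidates = {"x1": x1, "x2": x2, "x3": x3}
    for a, b in combinations(list(candidates), 2):
        candidates[a + "*" + b] = candidates[a] * candidates[b]

    def r_squared(cols):
        X = np.column_stack([np.ones(n)] + [candidates[c] for c in cols])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    selected, best = [], 0.0
    while True:
        # Hierarchy convention: an interaction is eligible only after
        # both of its main effects have entered the model.
        pool = [c for c in candidates if c not in selected and
                ("*" not in c or all(m in selected for m in c.split("*")))]
        if not pool:
            break
        scores = {c: r_squared(selected + [c]) for c in pool}
        c_best = max(scores, key=scores.get)
        if scores[c_best] - best < 0.01:   # crude stopping rule
            break
        selected.append(c_best)
        best = scores[c_best]

    print("selected:", selected, " R^2 =", round(best, 3))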
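
For comment 2 (p-values versus R^2): the two answer different questions. A coefficient's p-value asks whether that slope could plausibly be zero; R^2 asks how much of the variance in y the fitted model explains. With a weak but real effect and a large sample, the p-value can be tiny while R^2 stays poor, as this simulation shows. The converse (good R^2 with a high p-value) essentially requires a very small sample, since for a single predictor t = r * sqrt(n - 2) / sqrt(1 - r^2) ties the two quantities together.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n = 5000
    x = rng.normal(size=n)
    y = 0.1 * x + rng.normal(size=n)   # true slope 0.1, lots of noise

    fit = stats.linregress(x, y)
    print(f"p-value = {fit.pvalue:.2e}, R^2 = {fit.rvalue ** 2:.3f}")
    # Typically prints a p-value near zero with R^2 around 0.01.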
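
For comments 3 and 5 (quantifying interactions): adding the product x1*x2 as its own column is the standard device in regression; its coefficient measures how the slope on x1 changes per unit of x2. ANOVA with crossed factors tests the same kind of effect when the predictors are categorical rather than continuous. A short illustration:

    import numpy as np

    rng = np.random.default_rng(2)
    n = 500
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = 1 + 2 * x1 + 0.5 * x2 + 1.5 * x1 * x2 + rng.normal(size=n)

    X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
    b0, b1, b2, b3 = np.linalg.lstsq(X, y, rcond=None)[0]

    # With the interaction in the model, the effect of x1 is no longer
    # a single number: d E[y] / d x1 = b1 + b3 * x2.
    for x2_val in (-1.0, 0.0, 1.0):
        print(f"slope on x1 when x2 = {x2_val:+.0f}: {b1 + b3 * x2_val:.2f}")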
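
For comments 12 and 13 (shrinkage): when columns of X are nearly collinear, ordinary least squares can put huge coefficients of opposite sign on variables that barely matter, which is why a wildly large coefficient is a warning sign rather than evidence of importance. The ridge penalty replaces (X'X)^-1 X'y with (X'X + lambda*I)^-1 X'y; adding lambda*I raises every eigenvalue of X'X by lambda, so the inverse exists even when X'X is singular. Replacing lambda*I with a diagonal matrix of unequal penalties shrinks some coefficients more than others, which is one way to shrink selectively (a generalized ridge):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 100
    x1 = rng.normal(size=n)
    x2 = 2 * x1                        # perfectly collinear with x1
    X = np.column_stack([np.ones(n), x1, x2])
    y = 1 + 3 * x1 + rng.normal(size=n)

    XtX = X.T @ X
    print("rank of X'X:", np.linalg.matrix_rank(XtX))   # 2 of 3: singular

    lam = 1.0
    beta_ridge = np.linalg.solve(XtX + lam * np.eye(3), X.T @ y)
    print("ridge estimate:", np.round(beta_ridge, 2))

    # Selective shrinkage: leave the intercept unpenalized and penalize
    # x2 much harder than x1, pushing the shared signal onto x1.
    D = np.diag([0.0, 1.0, 100.0])
    beta_gen = np.linalg.solve(XtX + D, X.T @ y)
    print("generalized-ridge estimate:", np.round(beta_gen, 2))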
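
For comment 15 (cross-validation): split the data into k folds, hold out each fold in turn, fit the model on the remaining folds, and score its predictions on the held-out fold; the average held-out error estimates how the model would do on new data, which is the quantity model selection actually cares about. A bare-bones k-fold version, with a straight-line fit standing in for the model:

    import numpy as np

    rng = np.random.default_rng(4)
    n, k = 120, 5
    x = rng.normal(size=n)
    y = 2 * x + rng.normal(size=n)

    idx = rng.permutation(n)               # shuffle before splitting
    folds = np.array_split(idx, k)

    errors = []
    for held_out in folds:
        train = np.setdiff1d(idx, held_out)
        X_train = np.column_stack([np.ones(train.size), x[train]])
        beta = np.linalg.lstsq(X_train, y[train], rcond=None)[0]
        X_test = np.column_stack([np.ones(held_out.size), x[held_out]])
        errors.append(np.mean((y[held_out] - X_test @ beta) ** 2))

    print("CV estimate of prediction MSE:", round(float(np.mean(errors)), 3))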
