Posted on Sunday, 25th March 2012
Please finish chapter 12 and post a comment.
Posted in Class | Comments (15)
March 26th, 2012 at 8:20 am
When using forward selection to determine which variables to include in a multiple regression model, how should possible interaction terms be handled? The algorithm described seems to only deal with adding additional individual variables and examining the R^2 and t-statistics, but at what stage should a researcher also add interaction terms if they seem appropriate?
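As an illustration of the question, here is a minimal sketch (not from the chapter; the data and names are made up) of one common convention: treat each candidate interaction as just another column, constructed as the product of the two variables, and let forward selection compete it against the individual variables on R² gain.

```python
import numpy as np

def r_squared(X, y):
    """R^2 of an OLS fit with intercept; X is (n, k)."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

def forward_select(cols, y, names, min_gain=0.01):
    """Greedily add the column giving the largest R^2 gain,
    stopping when no candidate improves R^2 by at least min_gain."""
    chosen, best = [], 0.0
    remaining = list(range(len(cols)))
    while remaining:
        gains = [(r_squared(np.column_stack([cols[j] for j in chosen + [i]]), y), i)
                 for i in remaining]
        r2, i = max(gains)
        if r2 - best < min_gain:
            break
        best = r2
        chosen.append(i)
        remaining.remove(i)
    return [names[i] for i in chosen], best

rng = np.random.default_rng(0)
n = 200
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 1.5 * x1 * x2 + rng.normal(size=n)  # true interaction

# The product x1*x2 enters the candidate pool like any other variable.
cols = [x1, x2, x1 * x2]
names = ["x1", "x2", "x1*x2"]
selected, r2 = forward_select(cols, y, names)
print(selected)
```

With data simulated this way, both x1 and the x1*x2 product are picked up, while x2 on its own adds little. Whether to offer all pairwise products as candidates or only those that make scientific sense is exactly the judgment call the question raises.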
March 26th, 2012 at 8:42 pm
In linear regression, what do the p-values and R^2 tell me about the data? More specifically, what does it mean to have a significant p-value for a variable but a fairly poor R^2? Can I have a good R^2 and a high p-value?
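A simulation can make the first case concrete: with a large sample, a weak but real effect gives a large t-statistic (hence a tiny p-value) while explaining almost none of the variance. This is an illustrative sketch, not an example from the chapter; the numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000
x = rng.normal(size=n)
y = 0.1 * x + rng.normal(size=n)   # weak but real effect, lots of noise

# Simple OLS by hand: slope, its standard error, t-statistic, and R^2.
b = np.cov(x, y, bias=True)[0, 1] / x.var()
a = y.mean() - b * x.mean()
resid = y - (a + b * x)
s2 = resid @ resid / (n - 2)
se_b = np.sqrt(s2 / ((x - x.mean()) @ (x - x.mean())))
t = b / se_b
r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
print(f"t = {t:.1f}, R^2 = {r2:.3f}")
```

Here t is around 10 (overwhelmingly significant) while R^2 is about 0.01: the slope is estimated precisely, but x explains almost nothing about y.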
March 26th, 2012 at 9:53 pm
Can you explain more about how the interaction between variables is used?
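One way to read an interaction term, as a small made-up illustration (not the chapter's notation): with y = b0 + b1*x1 + b2*x2 + b3*(x1*x2), the slope in x1 is no longer constant but depends on x2.

```python
# If y = b0 + b1*x1 + b2*x2 + b3*(x1*x2), then the effect of a unit
# change in x1 depends on the level of x2: dy/dx1 = b1 + b3*x2.
b0, b1, b2, b3 = 1.0, 2.0, 0.5, 1.5   # made-up coefficients

def yhat(x1, x2):
    return b0 + b1 * x1 + b2 * x2 + b3 * x1 * x2

# Slope in x1 at two different levels of x2:
slope_at_0 = yhat(1, 0) - yhat(0, 0)   # b1           = 2.0
slope_at_2 = yhat(1, 2) - yhat(0, 2)   # b1 + 2 * b3  = 5.0
print(slope_at_0, slope_at_2)
```

When b3 = 0 the two slopes coincide and the model is purely additive; a nonzero b3 is what "interaction" means here.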
March 26th, 2012 at 10:25 pm
In model selection, when looking for the best predictor variables, how should one account for the possibly significant interaction terms? For example, in the forward selection process outlined, should each new regression include all possible and relevant interactions?
March 26th, 2012 at 11:04 pm
Are there other ways to quantify interaction effects besides defining an x1*x2 variable? Using ANOVA, for example?
March 26th, 2012 at 11:29 pm
When looking at interaction effects, you might end up with a very large number of variables. How do you know which combinations of variables might be important? How do you deal with over-fitting?
March 27th, 2012 at 12:35 am
Regarding model selection, will ‘forward selection’ and ‘backward elimination’ converge on the same model?
March 27th, 2012 at 1:30 am
Does the forward selection algorithm also apply to interaction effects? Should we consider interaction effects with every variable we add to the model, or only where we think it would make sense?
March 27th, 2012 at 2:40 am
Can you go over the source localization problem for MEG that you discuss in Example 12.9? I’m not clear on your use of penalized least squares in the minimum norm estimate. It sounds like an exciting statistical technique, but I’d love to hear more about it.
March 27th, 2012 at 7:33 am
When we choose the explanatory variables, do we usually try to make them uncorrelated, or does it not matter whether they are correlated? Does their correlation only influence interpretation?
March 27th, 2012 at 8:05 am
Is it possible to use a form of multiple linear regression for non-cosine directional tuning, or would it be limited to using kernel regression methods or some other technique?
March 27th, 2012 at 8:15 am
It would be very helpful to have a picture of the example in 12.5.5.
Is there any way of selectively shrinking certain coefficients?
March 27th, 2012 at 8:27 am
I’m slightly confused by the remarks on shrinkage. Why does a high coefficient suggest an irrelevant variable? How does adding the penalty term fix problems with non-invertible matrices?
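On the second part of the question, a tiny made-up example (not from the chapter) shows the mechanics: with perfectly collinear columns, X^T X is singular, so the OLS normal equations have no unique solution; adding the ridge penalty term λI shifts every eigenvalue up by λ, making the matrix invertible.

```python
import numpy as np

# Two perfectly collinear columns make X^T X singular; adding a ridge
# penalty lam * I restores invertibility (all eigenvalues >= lam).
x = np.arange(5, dtype=float)
X = np.column_stack([x, 2 * x])          # second column = 2 * first
XtX = X.T @ X

rank_ols = np.linalg.matrix_rank(XtX)    # 1: singular, OLS not unique

lam = 0.1
ridge_matrix = XtX + lam * np.eye(2)
rank_ridge = np.linalg.matrix_rank(ridge_matrix)   # 2: invertible

# The ridge estimate now exists and is unique:
beta = np.linalg.solve(ridge_matrix, X.T @ (3 * x))
print(rank_ols, rank_ridge, beta)
```

On the first part: it is not that a high coefficient by itself flags an irrelevant variable, but that with noisy or collinear predictors, least squares can produce wildly large coefficients that mostly reflect noise, and shrinking them toward zero tends to reduce prediction error.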
March 27th, 2012 at 8:30 am
If you want to simplify your model, what are some methods to determine whether to use forward selection, backward elimination, and/or stepwise regression?
March 27th, 2012 at 8:30 am
I’m still unclear on how cross-validation works. Can you clarify?
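The basic recipe can be sketched in a few lines; this is an illustrative k-fold version on made-up data, not the chapter's example: split the data into k folds, and for each fold, fit on the other k−1 folds and score predictions on the held-out fold, so every point is predicted by a model that never saw it.

```python
import random

def fit_line(xs, ys):
    """Least-squares slope and intercept for simple linear regression."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def k_fold_mse(xs, ys, k=5, seed=0):
    """Average held-out squared error over k folds:
    fit on k-1 folds, score on the remaining fold."""
    idx = list(range(len(xs)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    total, count = 0.0, 0
    for fold in folds:
        held = set(fold)
        train = [i for i in idx if i not in held]
        a, b = fit_line([xs[i] for i in train], [ys[i] for i in train])
        for i in fold:
            total += (ys[i] - (a + b * xs[i])) ** 2
            count += 1
    return total / count

rng = random.Random(42)
xs = [rng.gauss(0, 1) for _ in range(100)]
ys = [2 * x + rng.gauss(0, 0.5) for x in xs]
mse = k_fold_mse(xs, ys)
print(mse)  # should be close to the noise variance, 0.25
```

The cross-validated MSE estimates out-of-sample error, which is why it can be compared across competing models (e.g. with and without an interaction term) without rewarding overfitting.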