Posted on Friday, 17th February 2012

Please read Ch13a.pdf and Sections 12.1-12.4 of Chapter12-FEB17.pdf and post a comment.

Posted in Class | Comments (13)

  1. Scott Kennedy Says:

    As highlighted by the plots in figure 12.5, visual analysis of the data can be key in understanding regression fits. With so much recent neuro data having high dimensionality, what are ways to do a visual analysis for high dimensional (more than 3) data?

  2. David Zhou Says:

    In what types of experiments might it be important to formulate the additional null hypothesis in two-way ANOVA, that all the increments added to the overall mean are zero?

    I’m unclear about what residual analysis does. How does it represent unstructured noise? What are standardized residuals?
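    On the last question: a standardized residual is a raw residual rescaled to have roughly unit variance, so values beyond about ±2 flag points the fitted model describes poorly. A minimal sketch of the simple version (the fully studentized form also divides each residual by sqrt(1 - h_ii) using that point's leverage h_ii; the book's exact definition may differ):

```python
import math

def standardized_residuals(y, yhat, n_params):
    """Residuals scaled by the residual standard error s = sqrt(RSS / (n - p)).

    (The 'internally studentized' version additionally divides each
    residual by sqrt(1 - h_ii), where h_ii is that point's leverage.)
    """
    resid = [yi - fi for yi, fi in zip(y, yhat)]
    n = len(y)
    s = math.sqrt(sum(r * r for r in resid) / (n - n_params))
    return [r / s for r in resid]

# Hypothetical fitted values from a two-parameter (line) fit:
print(standardized_residuals([1.0, 2.0, 3.0, 4.0], [1.1, 1.9, 3.2, 3.8], 2))
```

    If the model is adequate, a plot of these against the fitted values should look like unstructured noise; visible curvature or fanning-out is the structure residual analysis is meant to reveal.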

  3. Eric VanEpps Says:

    I don’t know if I’ve ever seen a confidence interval for a correlation coefficient before, and I must admit that I got lost in trying to understand the derivation of the confidence interval for rho in 12.4.3. Can you explain how this is derived, as well as when such a confidence interval would be important in practice?
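    For reference, a sketch of the standard construction (assuming the usual Fisher z-transform approach; the book's derivation may differ in its details): z = arctanh(r) is approximately normal with standard error 1/sqrt(n - 3), so the interval is built on the z scale and mapped back with tanh. This matters in practice because r itself has a skewed, bounded sampling distribution, so a naive symmetric interval around r can spill outside [-1, 1].

```python
import math

def fisher_ci(r, n, z_crit=1.96):
    """Approximate 95% CI for rho via Fisher's z-transform:
    arctanh(r) ~ Normal(arctanh(rho), 1/(n - 3))."""
    z = math.atanh(r)
    se = 1.0 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# e.g. r = 0.5 observed from n = 50 pairs (made-up numbers):
lo, hi = fisher_ci(0.5, 50)
print(round(lo, 3), round(hi, 3))
```

    Note the back-transformed interval is not symmetric about r, reflecting the skew of r's sampling distribution.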

  4. Shubham Debnath Says:

    Trying to use ANOVA now…this chapter is a good resource and will be useful.

  5. Rob Rasmussen Says:

    I am confused by the purpose of ANOVA. When there are more than two groups, it seems to throw information away. Why not just use multiple t-tests? It seems like it would be a rare case where there are so many groups that doing t-tests would be too cumbersome.
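    One standard answer is multiple comparisons, not convenience: with k groups there are k(k-1)/2 pairwise t-tests, and the chance of at least one false positive grows quickly. A back-of-the-envelope sketch (treating the tests as independent, which they are not exactly):

```python
def approx_fwer(k, alpha=0.05):
    """Approximate family-wise error rate of running all pairwise
    t-tests among k groups, treating the tests as independent."""
    m = k * (k - 1) // 2  # number of pairwise comparisons
    return 1 - (1 - alpha) ** m

for k in (3, 5, 8):
    print(k, round(approx_fwer(k), 2))
```

    Even with 5 groups the chance of some spurious "significant" difference is roughly 40%; ANOVA's single F-test keeps one overall error rate for the null that all group means are equal.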

  6. Rex Tien Says:

    The R^2 measure can tell us if two things are linearly related. However, a low R^2 measure could mean the points are totally scattered or that they follow a highly non-linear relationship. Is there a measure of how “continuous” or “tight” the points are? (i.e. something that would be high when the points are arranged in a sine wave, and low when the points are scattered) Or do we have to just do it by eye?
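    To illustrate the point with made-up data: a noiseless sine wave is a perfect functional relationship, yet its sample R^2 against x over several periods is nearly zero, so R^2 alone cannot distinguish "scattered" from "tightly but nonlinearly related".

```python
import math

def r_squared(xs, ys):
    """Squared Pearson correlation between xs and ys."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy * sxy / (sxx * syy)

# x and y are perfectly (but nonlinearly) related, yet R^2 is tiny:
xs = [i * 20 * math.pi / 10000 for i in range(10000)]  # ~10 periods
ys = [math.sin(x) for x in xs]
print(round(r_squared(xs, ys), 3))
```

    Rank-based measures (e.g. Spearman's) catch monotone nonlinearity but would also miss a sine wave, so "tightness" of a general curve usually needs something beyond a single correlation-type summary.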

  7. Sharlene Flesher Says:

    Can you reiterate how the degrees of freedom for the error are determined?

  8. Rex Tien Says:

    I guess my question is, is there a measure that might tell us when to claim the two variables are unrelated, or when to seek out a highly non-linear relationship?

  9. Ben Dichter Says:

    When doing linear regression over many variables, isn’t there an increasing risk of overfitting? How do you know if you are overfitting and how do you choose parameters accordingly?
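    Yes. A common diagnostic is to compare error on held-out data: training error always falls as parameters are added, but held-out error eventually rises. A sketch with simulated data (using numpy, and polynomial degree as a stand-in for "number of variables"):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0, 1, 30)
y = 2 * x + rng.normal(0, 0.3, size=x.size)  # true relationship is linear

train = np.arange(x.size) % 2 == 0  # interleaved train/test split
test = ~train

def train_test_mse(deg):
    """Fit a degree-`deg` polynomial on the training half; return
    (training MSE, test MSE)."""
    coef = np.polyfit(x[train], y[train], deg)
    mse = lambda mask: float(np.mean((np.polyval(coef, x[mask]) - y[mask]) ** 2))
    return mse(train), mse(test)

for deg in (1, 5, 12):
    print(deg, train_test_mse(deg))
```

    The high-degree fit drives training error toward zero by chasing the noise, while its held-out error typically blows up; cross-validation or penalized criteria like AIC/BIC formalize this trade-off when choosing how many variables to keep.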

  10. mpanico Says:

    In the finger-tapping example used for the two-way ANOVA, would it have been reasonable to subtract out the placebo rate as a “baseline” from each individual and compare the on-drug changes to a no-change case?

  11. Yijuan Du Says:

    In 12.4 Correlation and Regression
    Is there a preference of using correlation or regression under some conditions?

  12. Thomas Kraynak Says:

    Can you go over why the one-way ANOVA splits up the variability into group and individual variability?
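    The algebraic heart of it is the sum-of-squares identity: total variability about the grand mean splits exactly into a between-group piece (how far group means sit from the grand mean) and a within-group piece (individual scatter about each group's own mean). A quick numerical check with made-up groups:

```python
def anova_ss(groups):
    """One-way ANOVA sums of squares; verifies SS_total = SS_between + SS_within.
    Degrees of freedom: k - 1 between groups, N - k within (error)."""
    all_vals = [v for g in groups for v in g]
    grand = sum(all_vals) / len(all_vals)
    ss_total = sum((v - grand) ** 2 for v in all_vals)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum((v - sum(g) / len(g)) ** 2 for g in groups for v in g)
    return ss_total, ss_between, ss_within

# Three hypothetical groups:
print(anova_ss([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0], [7.0, 8.0, 10.0]]))
```

    The F statistic then compares the two pieces, each scaled by its degrees of freedom, to ask whether the between-group spread is larger than individual scatter alone would produce.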

  13. Rich Truncellito Says:

    Does subtracting the mean of the x variable also work to reduce the correlation of x with transformations of x other than x^2?

    Also, could you explain what AIC and BIC are?
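    Briefly, AIC and BIC are penalized fit measures for comparing models: AIC = 2k - 2 log L and BIC = k log(n) - 2 log L, where k is the number of fitted parameters, n the sample size, and L the maximized likelihood; lower is better, and BIC penalizes extra parameters more heavily for all but very small n. For Gaussian regression, -2 log L reduces to n log(RSS/n) plus a constant shared by every model fit to the same data, so a minimal sketch is:

```python
import math

def aic_bic(rss, n, k):
    """AIC and BIC for a Gaussian regression model, up to an additive
    constant common to all models on the same data:
    -2 log L = n * log(rss / n) + const."""
    neg2ll = n * math.log(rss / n)
    return neg2ll + 2 * k, neg2ll + k * math.log(n)

# Hypothetical comparison: same fit (RSS = 10, n = 100) with 3 vs 5 parameters.
print(aic_bic(10.0, 100, 3))
print(aic_bic(10.0, 100, 5))
```

    With equal RSS, the larger model is penalized by 2 per extra parameter under AIC and by log(n) per extra parameter under BIC, which is why BIC tends to pick smaller models.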
