I could be wrong but let's assume that the reviewer is talking
about the standard error of a regression coefficient. The
mistake the reviewer makes is in implying that the standard
error of a coefficient directly affects the value of the squared
multiple correlation (i.e., R^2) for the regression equation.
Large standard errors for the coefficients are a problem
when they are caused by a high level of collinearity among
the predictors. Quoting from the Wikipedia entry on
multicollinearity:
One of the features of multicollinearity is that the standard
errors of the affected coefficients tend to be large. In that
case, the test of the hypothesis that the coefficient is equal
to zero may lead to a failure to reject a false null hypothesis
of no effect of the explanator, a type II error.

A principal danger of such data redundancy is that of
overfitting in regression analysis models. The best regression
models are those in which the predictor variables each
correlate highly with the dependent (outcome) variable
but correlate at most only minimally with each other. Such
a model is often called "low noise" and will be statistically
robust (that is, it will predict reliably across numerous samples
of variable sets drawn from the same statistical population).

So long as the underlying specification is correct, multicollinearity
does not actually bias results; it just produces large standard
errors in the related independent variables. More importantly,
 the usual use of regression is to take coefficients from the model
 and then apply them to other data. If the pattern of multicollinearity
in the new data differs from that in the data that was fitted, such
extrapolation may introduce large errors in the predictions.[6]
http://en.wikipedia.org/wiki/Multicollinearity
In the general multiple regression case, the standard error of
a coefficient [SE(bi)] is given by the following equation:
SE(bi) = sqrt[MSresidual / {sum of squares for Xi * (1 - Ri^2)}]
where Ri^2 is the squared multiple correlation of the predictor Xi
with the other Xs in the equation (page 126 in Edwards, 1984,
An Introduction to Linear Regression and Correlation).
As Ri^2 approaches 1.00, the denominator gets smaller
and the standard error gets larger. This can lead to odd
results, such as a significant R^2 for the overall regression
even though none of the individual coefficients is significant.
See the following article by Cramer for related problems:
Cramer, E. M. (1972). Significance tests and tests of models
in multiple regression. The American Statistician, 26(4), 26-30.
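The equivalence between the textbook formula above and the usual covariance-matrix standard errors, and the way collinearity inflates SE(bi), can be checked numerically. The following is my own sketch in Python/NumPy (not part of the original posts); the simulated data and coefficient values are arbitrary, chosen only to vary the correlation between two predictors.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200
x1 = rng.normal(size=n)
noise = rng.normal(size=n)
eps = rng.normal(size=n)

results = []
for rho in (0.0, 0.90, 0.99):
    # Build x2 so that corr(x1, x2) is approximately rho
    x2 = rho * x1 + np.sqrt(1 - rho**2) * noise
    y = 1.0 + 0.5 * x1 + 0.5 * x2 + eps

    X = np.column_stack([np.ones(n), x1, x2])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    ms_res = resid @ resid / (n - 3)          # MSresidual, df = n - 3

    # SE(b1) via the textbook formula: sqrt(MSres / (SSx1 * (1 - R1^2)));
    # with one other predictor, R1^2 is just the squared simple correlation
    r1sq = np.corrcoef(x1, x2)[0, 1] ** 2
    ss_x1 = np.sum((x1 - x1.mean()) ** 2)
    se_formula = np.sqrt(ms_res / (ss_x1 * (1 - r1sq)))

    # SE(b1) via the covariance matrix MSres * (X'X)^-1
    se_matrix = np.sqrt(ms_res * np.linalg.inv(X.T @ X)[1, 1])

    results.append((rho, se_formula, se_matrix))
    print(f"rho={rho:.2f}  SE(b1) formula={se_formula:.4f}  matrix={se_matrix:.4f}")
```

The two computations agree to floating-point precision, and SE(b1) grows as the inter-predictor correlation approaches 1, exactly as the formula predicts.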
Maybe the reviewer was confused because large standard
errors are bad, but not in terms of R^2. Reviewers sometimes
feel the need to say something even when they are confused
about what they are saying. ;)
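The Wikipedia passage's warning about extrapolation under a changed collinearity pattern can also be simulated. In this sketch (mine, not from the original posts), a regression is fit on data where x1 and x2 are nearly collinear: the well-identified sum b1 + b2 is recovered accurately, but b1 and b2 individually are unstable, so predictions on new data where the predictors are independent can go badly wrong.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Training data: x1 and x2 nearly perfectly collinear
x1 = rng.normal(size=n)
x2 = x1 + 0.001 * rng.normal(size=n)
y = x1 + x2 + 0.1 * rng.normal(size=n)      # true model: y = x1 + x2 + noise

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
train_rmse = np.sqrt(np.mean((y - X @ b) ** 2))

# New data in which x1 and x2 are independent: a different collinearity pattern
x1n, x2n = rng.normal(size=n), rng.normal(size=n)
yn = x1n + x2n + 0.1 * rng.normal(size=n)
Xn = np.column_stack([np.ones(n), x1n, x2n])
test_rmse = np.sqrt(np.mean((yn - Xn @ b) ** 2))

# The sum b1 + b2 is pinned down well; the individual coefficients need not be
print(f"b1 = {b[1]:.3f}, b2 = {b[2]:.3f}, b1 + b2 = {b[1] + b[2]:.3f}")
print(f"train RMSE = {train_rmse:.3f}, new-data RMSE = {test_rmse:.3f}")
```

In-sample fit stays close to the noise level even though the individual coefficients can wander far from their true values of 1, which is exactly the unbiased-but-unstable behavior the quoted passage describes.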
Mike Palij
New York University
[hidden email]
----- Original Message -----
From: "Bruce Weaver" <
[hidden email]>
To: <
[hidden email]>
Sent: Thursday, March 12, 2015 2:05 PM
Subject: Re: Standard error of predictor and R-squared
> It's not entirely clear to me which SE the reviewer is talking about.
> But we
> could just plug in the root mean square error (RMSE) to illustrate,
> because
> the SEs of the coefficients are related to it. Whether Rsq changes or not
> depends on what is driving the change in RMSE: redistribution of the total
> SS, a change in the sample size, or some combination of the two. Here
> is a
> simple example.
>
> DATA LIST list / Example(F1) SSreg dfreg SSres dfres (4F5.0).
> BEGIN DATA
> 1 200 1 300 98
> 2 400 1 100 98
> 3 200 1 300 998
> END DATA.
>
> COMPUTE RMSE = SQRT(SSres/dfres).
> COMPUTE Rsq = SSreg / SUM(SSreg, SSres).
> FORMATS RMSE Rsq (F8.4).
> LIST.
>
> OUTPUT:
> Example SSreg dfreg SSres dfres RMSE Rsq
>
> 1 200 1 300 98 1.7496 .4000
> 2 400 1 100 98 1.0102 .8000
> 3 200 1 300 998 .5483 .4000
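For readers without SPSS, the arithmetic in the listing above can be reproduced in a few lines of Python; this is a minimal equivalent of the COMPUTE statements, not part of the original post.

```python
import math

# (SSreg, dfreg, SSres, dfres) for the three examples in the post
examples = [(200, 1, 300, 98),
            (400, 1, 100, 98),
            (200, 1, 300, 998)]

rows = []
for i, (ss_reg, df_reg, ss_res, df_res) in enumerate(examples, start=1):
    rmse = math.sqrt(ss_res / df_res)        # root mean square error
    rsq = ss_reg / (ss_reg + ss_res)         # Rsq = SSreg / SStotal
    rows.append((i, rmse, rsq))
    print(f"Example {i}: RMSE = {rmse:.4f}, Rsq = {rsq:.4f}")
```

The printed values match the SPSS LIST output above.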
>
> In all 3 examples, SS_Total = 500. Examples 2 and 3 both have a lower
> RMSE
> than example 1. In example 2, RMSE is lower because SS_residual
> dropped
> from 300 to 100, and SS_regression increased from 200 to 400. Because
> Rsq =
> SS_reg / SS_res, Rsq increased.
>
> In example 3, on the other hand, all of the SS values remained the
> same, but
> N was increased from 100 to 1000. Therefore, RMSE is a lot
> lower, but
> Rsq is unchanged (versus example 1).
>
> The reviewer's comment indicates that they are thinking of the example
> 1 vs
> example 2 situation as more 'typical'.
>
> HTH.
>
>
> Nina Lasek wrote
>> Dear all,
>>
>> while I fear that my question is not specifically related to SPSS, I
>> hope
>> you could still help me with the following problem:
>> In a comment on a recent analysis based on OLS regression, a
>> reviewer
>> mentions that "a smaller standard error typically results in a
>> higher
>> amount of explained variance (R_squared). Is that correct? Isn't it
>> the
>> regression weight itself (the slope coefficient) which is used for
>> calculating R-squared, rather than its standard error???
>>
>> Thanks for your comments!
>> Nina
> 
=====================
To manage your subscription to SPSSX-L, send a message to
[hidden email] (not to SPSSX-L), with no body text except the
command. To leave the list, send the command
SIGNOFF SPSSX-L
For a list of commands to manage subscriptions, send the command
INFO REFCARD