In Chrome and Firefox, typeof foo evaluates to 'undefined'.
But typeof (function() { return foo; })() throws an error:
ReferenceError: foo is not defined
This destroys the notion I have of substitutability of expressions! Until now, I knew of no conditions under which foo and (function() { return foo; })() are not the same.
Is this standard behavior? If so, it would be helpful to quote the relevant part of the ECMAScript standard.
EDIT:
Another example:
typeof (foo)
typeof (foo + 0)
I would have expected both (foo) and (foo + 0) to throw an error. But the first one has no error; the second one does.
Basically, the typeof operator checks whether a variable¹ is unresolvable and returns "undefined". That is, typeof returns a defined value for undeclared variables¹ before ever reaching the GetValue algorithm, which throws for undeclared variables¹.
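For instance, here is a minimal sketch of the difference, using a hypothetical undeclared name notDeclared in a plain (non-strict) script:
// typeof checks for an unresolvable reference first, so it returns
// the string "undefined" instead of throwing.
console.log(typeof notDeclared); // "undefined"
// Reading the identifier directly forces its value to be resolved,
// which throws for undeclared names.
try {
  console.log(notDeclared);
} catch (e) {
  console.log(e.name); // "ReferenceError"
}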
Quoting ECMAScript 5.1 § 11.4.3 The typeof Operator (emphasis added):
11.4.3 The typeof Operator
The production UnaryExpression : typeof UnaryExpression is evaluated as follows:
1. Let val be the result of evaluating UnaryExpression.
2. If Type(val) is Reference, then
   a. If IsUnresolvableReference(val) is true, return "undefined".
   b. Let val be GetValue(val).
3. Return a String determined by Type(val) according to Table 20.
On the other hand, the return statement -- like most operators and statements which read the value of identifier(s) -- will always call GetValue, which throws on unresolvable identifiers (undeclared variables). Quoting ECMAScript 5.1 § 8.7.1 GetValue (V) (emphasis added):
8.7.1 GetValue (V)
- If Type(V) is not Reference, return V.
- Let base be the result of calling GetBase(V).
- If IsUnresolvableReference(V), throw a ReferenceError exception.
Now, analyzing the code:
typeof (function() { return foo; })()
This code will instantiate a function object, execute it, and only then will typeof operate on the function's return value (the function call takes precedence over the typeof operator).
Hence, the code throws while evaluating the IIFE's return statement, before the typeof operation can be evaluated.
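A quick sketch of where the throw actually happens, assuming foo is undeclared:
try {
  typeof (function() { return foo; })();
} catch (e) {
  // The ReferenceError is raised while the IIFE evaluates `return foo;`,
  // i.e. before typeof ever receives an operand.
  console.log(e instanceof ReferenceError); // true
}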
A similar but simpler example:
typeof (foo+1)
The addition is evaluated before typeof. This will throw an error when the Addition Operator calls GetValue on foo, before typeof comes into play.
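The ordering can be observed directly (again assuming foo is undeclared):
console.log(typeof foo); // "undefined" – typeof sees the unresolvable reference itself
try {
  typeof (foo + 1); // the Addition Operator calls GetValue on foo first
} catch (e) {
  console.log(e.name); // "ReferenceError"
}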
Now:
typeof (foo)
This does not throw an error because the grouping operator (parentheses) does not "evaluate" anything per se; it just forces precedence. More specifically, the grouping operator does not call GetValue. In the example above it returns an (unresolvable) Reference.
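Because the grouping operator only forwards the (still unresolved) Reference, extra parentheses change nothing; a quick check, assuming foo is undeclared:
console.log(typeof (foo));   // "undefined"
console.log(typeof ((foo))); // "undefined" – still just a parenthesised Reference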
The annotated ES5.1 spec even adds a note about this:
NOTE This algorithm does not apply GetValue to the result of evaluating Expression. The principal motivation for this is so that operators such as delete and typeof may be applied to parenthesised expressions.
N.B. I wrote this answer with the focus on providing a simple and understandable explanation, keeping the technical jargon to a minimum while still being sufficiently clear and providing the requested ECMAScript standard references. I hope it is a helpful resource for developers who struggle to understand the typeof operator.
¹ The term "variable" is used for ease of understanding. A more correct term would be identifier, which can be introduced into a Lexical Environment not only through variable declarations, but also through function declarations, formal parameters, calling a function (arguments), with/catch blocks, assigning a property to the global object, let and const declarations (ES6), and possibly a few other ways.
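As an illustration of this footnote, the same resolution rules apply to identifiers however they were introduced; a small sketch with made-up names:
function demo(param) {             // param is a formal parameter
  console.log(typeof param);       // "function" (see the call below)
  console.log(typeof arguments);   // "object" – the arguments object
  console.log(typeof demo);        // "function" – a function declaration
  try {
    null.x;                        // force an exception
  } catch (err) {                  // err is introduced by the catch clause
    console.log(typeof err);       // "object"
  }
}
demo(function() {});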
Is this standard behavior?
Yes. typeof doesn't throw an error because it just returns a value as specified. However, as other answers have said, the code fails when evaluating the operand.
If so, it would be helpful to quote the relevant part of the ECMAScript standard.
When evaluating the function expression, an attempt to resolve the value of foo (so that it can be returned) will call the internal GetValue method with argument foo. However, since foo hasn't been declared or otherwise created, a reference error is thrown.
Edit
In the case of:
typeof (foo)
"(" and ")" are punctuators denoting a grouping, such as a (possibly empty) parameter list when calling a function, like foo(a, b), or an expression to be evaluated, e.g. if (x < 0), and so on.
In the case of typeof (foo), they simply denote evaluating foo before applying the typeof operator. So foo, being a valid identifier, is passed to typeof, per the link above, which attempts to resolve it, can't, determines it's an unresolvable reference, and returns the string "undefined".
In the case of:
typeof (foo + 0)
the brackets cause the expression foo + 0 to be evaluated first. When getting the value of foo, a reference error is thrown, so typeof doesn't get to operate. Note that without the brackets:
typeof foo + 0 // undefined0
because of operator precedence: typeof foo returns the string "undefined", and since one of the operands of + is a string, the + operator performs concatenation (the string version of addition, not the mathematical version), so 0 is converted to the string "0" and concatenated to "undefined", resulting in the string "undefined0".
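The precedence difference can be made explicit with grouping (assuming foo is undeclared):
console.log(typeof foo + 0);   // "undefined0" – parsed as (typeof foo) + 0
console.log((typeof foo) + 0); // "undefined0" – the same thing, written explicitly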
So any time the evaluation of an expression containing an unresolvable reference is attempted (e.g. an undeclared variable), a reference error will be thrown. For example,
typeof !foo
throws a reference error too, because in order to work out what to pass to typeof, the expression must be evaluated. To apply the ! operator, the value of foo must be obtained, and in attempting that, a reference error is thrown.
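The ! case can be checked in the same way; a minimal sketch, assuming foo is undeclared:
try {
  typeof !foo; // ! needs the value of foo, so the lookup throws before typeof runs
} catch (e) {
  console.log(e.name); // "ReferenceError"
}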
The error "ReferenceError: foo is not defined" is not being thrown by typeof; it's being thrown by the function itself. If you had used:
typeof (function() { return 2; })()
it would have returned "number" as expected, but in this example JavaScript never even gets to the point where typeof is run on anything. You are receiving the same error as if you had run:
function test() {
  return foo;
}
test();
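For contrast, when the function body does not touch an undeclared name, typeof receives an ordinary value; a quick check:
console.log(typeof (function() { return 2; })()); // "number"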
Digging through the spec, I think this all comes down to when the operator in question attempts to run GetValue() on its operand.
typeof attempts to determine its operand's Type first. If that type is a Reference and IsUnresolvableReference() is true, then it bails out and returns "undefined". In essence, it does not fully evaluate the operand; if it did, anything that was undeclared would throw an exception, so instead it short-circuits and returns a nice, useful string.
In the examples, self-executing functions and the addition operator call GetValue without first checking IsUnresolvableReference() the way typeof does: they call GetValue and throw an exception if the reference is unresolvable (foo is not defined in our case). (I think! This is my best guess from reading through the spec.)
This is standard behavior. The typeof operator, in effect, takes a reference to the next variable you pass to it.
So let's try typeof foo.
The JavaScript interpreter looks at typeof and finds the type of foo.
Now we try typeof (function() { return foo })().
The JavaScript interpreter looks at typeof. Since the expression afterwards isn't a variable, it evaluates the expression. (function() { return foo })() throws a ReferenceError because foo is not defined. If it were possible to pass the reference of a variable, i.e. something like (function() { return *foo })(), then this wouldn't happen.
Note: According to this, one may think that typeof (foo) would throw an error, since (foo) isn't a variable and must be evaluated, but that is incorrect; typeof (foo) will also return "undefined" if foo isn't defined.
Essentially, the interpreter evaluates the next variable, but not a full expression, in a "safe" context so that typeof doesn't throw an error.
It is a bit confusing.
Source: https://stackoverflow.com/questions/24150713/why-does-typeof-only-sometimes-throw-referenceerror