Why can't you access scoped variables using eval under a with statement?
For example:
(function (obj) {
  with (obj) {
    console.log(a); // prints out obj.a
    eval("console.log(a)"); // ReferenceError: a is not defined
  }
})({ a: "hello" });
EDIT: As the knowledgeable CMS pointed out, this appears to be a browser bug (it occurs in browsers that use the WebKit console).
If anyone was wondering what abomination I was trying to come up with that would require both the "evil" eval and with -- I was trying to see whether I could get a function (used as a callback) executed in a context other than the one it was defined in. And no, I probably (cough) won't use this anywhere... more curious than anything.
(function (context, fn) {
  with (context)
    eval("(" + fn + ")()");
})({ a: "hello there" }, function () { console.log(a); });
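For what it's worth, the same "run a callback against another context" effect can be sketched without a direct eval call. The helper below (`runInContext` is a hypothetical name, not anything from the question) serializes the callback with its implicit toString and recompiles it via new Function, which always compiles in the global scope, so no local variables of the helper can leak into the evaluated code. This only works for callbacks that don't rely on closed-over variables:

```javascript
// Hypothetical helper: run `fn` so that its free variables resolve
// against `context` first. `new Function` compiles the body in the
// global scope, avoiding the direct-eval scoping issue entirely.
function runInContext(context, fn) {
  var wrapped = new Function(
    "context",
    "with (context) { return (" + fn + ")(); }"
  );
  return wrapped(context);
}

runInContext({ a: "hello there" }, function () { return a; });
// returns "hello there"
```

Note that the recompiled copy of fn loses its original closure: any variable it referenced from its defining scope is gone, which is exactly why this trick is a curiosity rather than a technique.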
Answer
This is a bug reproducible only from WebKit's console: it has problems binding the caller's context when eval is invoked from a FunctionExpression.
When a direct call to eval is made, the evaluated code should, as you expect, share both the variable environment:
(function (arg) {
  return eval('arg');
})('foo');
// should return 'foo', throws a ReferenceError from the WebKit console
And also the lexical environment:
(function () {
  eval('var localVar = "test"');
})();
typeof localVar; // should be 'undefined', returns 'string' on the Console
In the above function, localVar should be declared in the lexical environment of the caller, not in the global context.
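This caller-scope sharing is specific to *direct* calls of eval. Per the spec, an indirect call (anything other than a plain `eval(...)` expression, e.g. calling it through an alias) always evaluates in the global scope, so the contrast can be demonstrated like this:

```javascript
// Aliasing eval makes every call through the alias "indirect",
// which forces evaluation in the global scope.
var indirectEval = eval;

function probe() {
  var local = "inner";
  return [
    eval("typeof local"),        // direct: sees the caller's scope
    indirectEval("typeof local") // indirect: global scope only
  ];
}

probe(); // ["string", "undefined"]
```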
For FunctionDeclarations the behavior is completely normal; if we try:
function test1(arg) {
  return eval('arg');
}
test1('foo'); // properly returns 'foo' on the WebKit console
And
function test2() {
  eval('var localVarTest = "test"');
}
test2();
typeof localVarTest; // correctly returns 'undefined'
I have been able to reproduce the problem on the following browsers running on Windows Vista SP2:
- Chrome 5.0.375.125
- Chrome 6.0.472.25 dev
- Safari 5.0.1
- WebKit Nightly Build r64893