There is more than one Stack Overflow question about how to find the min or max of an array of values in JavaScript. This is not that question.
I want to know why passing something other than `Math` as the first argument to `Math.min.apply()` still works.
This (so to speak) is a tricky part of JavaScript. When you have a function, there's a magical variable called `this`. Depending on how you call a function, its context is set differently.
```javascript
function myFunction() {
  return this;
}

var myObject = {
  fn: function() {
    return this;
  }
};

myFunction();                         // the global object. `window` in browsers, `global` in Node.js
myFunction.apply(Math);               // Math
myFunction.apply({ hello: "world" }); // { hello: "world" }
myObject.fn();                        // myObject
myObject.fn.apply(myObject);          // myObject
myObject.fn.apply(Math);              // Math
```
The reason your examples above work is that `min` doesn't use `this` at all, so you can set it to whatever you want.
When using `.apply()`, the first argument controls what the `this` pointer value will be set to when the function executes. In many cases, the `this` value is critically important. In a few cases, the `this` value is simply not used inside the function implementation (often referred to as a static function that does not operate on instance data). When `this` is not used, it doesn't matter what `this` is set to, and therefore it doesn't matter what the first argument to `.apply()` is set to.
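To illustrate the two cases, here is a minimal sketch (the function names `getName` and `add` are made up for this example, not from the original):

```javascript
// A method that reads `this` — here, the first argument to .apply() matters.
function getName() {
  return this.name;
}

// A "static" function that never touches `this` — the first argument is irrelevant.
function add(a, b) {
  return a + b;
}

console.log(getName.apply({ name: "Alice" })); // "Alice"
console.log(getName.apply({ name: "Bob" }));   // "Bob"

console.log(add.apply(null, [1, 2]));   // 3
console.log(add.apply("junk", [1, 2])); // 3 — `this` is never read
```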
So, in your specific case, `Math.min()` and `Math.max()` (and probably all the `Math.xxx()` functions) do not use the `this` pointer at all, as they are all basically static functions that don't operate on instance data. So it doesn't matter what `this` is set to, and thus you can pass anything you want as the first argument to `Math.min.apply()` and it won't change the result of the function call.
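A quick sketch of that point, using an arbitrary sample array:

```javascript
var arr = [5, 2, 8];

// All of these return 2 — Math.min never reads `this`:
console.log(Math.min.apply(Math, arr));  // 2
console.log(Math.min.apply(null, arr));  // 2
console.log(Math.min.apply("foo", arr)); // 2
```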
I would argue that one should still pass the correct value there, which would be `Math`, since that's what `this` will be when you do a normal:

```javascript
Math.min(x, y);
```
So, to present the exact same situation to `Math.min()` as the above code when using `.apply()`, you would do it like this:

```javascript
var arr = [x, y];
Math.min.apply(Math, arr);
```
IMO, this promotes proper habits and proper thinking about what that first argument is supposed to be because it will matter in other circumstances.
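A sketch of a circumstance where it does matter — a hypothetical method (`counter.addAll`, invented for this example) that reads instance data through `this`:

```javascript
var counter = {
  count: 3,
  addAll: function () {
    // Reads `this.count`, so the context passed to .apply() matters here.
    var sum = this.count;
    for (var i = 0; i < arguments.length; i++) {
      sum += arguments[i];
    }
    return sum;
  }
};

var args = [1, 2];
console.log(counter.addAll.apply(counter, args));        // 6
console.log(counter.addAll.apply({ count: 100 }, args)); // 103
// counter.addAll.apply(null, args) would give NaN in non-strict mode:
// `this` falls back to the global object, which has no `count` property.
```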
FYI, a similar issue comes up regularly with `$.when.apply($, arr)` in jQuery, which also doesn't use the `this` argument in its implementation, so one can call it as `$.when.apply(null, arr)` or even `$.when.apply("foo", arr)`. Personally, I prefer to pass the "technically correct" argument, which is the first one.
What `apply` does with its first argument is simply set the context (`this`) for the function it is called on.
So let's say you have an object like this:
```javascript
var obj = {
  a: 1,
  b: function () {
    return this.a;
  },
  c: function () {
    return 3;
  }
};
```
If you call `obj.b()`, you will get `1` as the returned value, since `this` in this scenario is defined by calling `b` with `obj` in front of it (`this === obj`). To change that, you can use `apply`:

```javascript
obj.b.apply({a: 2});
```
This will return `2`, because you are now explicitly setting `this` to `{a: 2}`.
BUT: if you have a function that doesn't make use of `this` internally, then it doesn't matter what value `this` has. So you can call:

```javascript
obj.c();
```
or one of these:
```javascript
obj.c.apply(null);
obj.c.apply(undefined);
obj.c.apply("xyz");
```
But you will always get the same result: `3`. It is like setting a variable that you never use. No matter what value it is, it doesn't affect the outcome.
So apparently `Math.min` doesn't use `this` and therefore doesn't break when passing an "invalid" context.
TL;DR - `Math.min()` and `Math.max()` don't actually use the `this` argument passed to them; it's just a hack to get them to accept an array of arguments, since the native API only accepts individual arguments. `.apply()` can accept arrays of arguments, which is why it is used.
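A sketch of the difference, with an arbitrary sample array:

```javascript
var nums = [4, 1, 9];

// .apply() spreads the array into individual arguments:
console.log(Math.max.apply(Math, nums)); // 9 — same as Math.max(4, 1, 9)

// Passing the array directly does not work — it gets coerced to a
// single (non-numeric) value:
console.log(Math.max(nums)); // NaN
```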
My guess is `Math` is passed in as the `this` argument simply for cleanliness' sake.
Thanks to @Jonathan Lonowski and @Niet the Dark Absol.