There is more than one Stack Overflow question about how to find the min or max of an array of values in JavaScript. This is not that question.
I want to know why passing `null` (or some other "invalid" value) as the first argument to `apply` works at all.
What `apply` does with its first argument is simply set the context (`this`) for the function it is called on.
So let's say you have an object like this:

```javascript
var obj = {
  a: 1,
  b: function () {
    return this.a;
  },
  c: function () {
    return 3;
  }
};
```
If you call `obj.b()`, you will get `1` as the return value, since `this` in this scenario is defined by calling `b` with `obj` in front of it (`this === obj`). To change that, you can use `apply`:
```javascript
obj.b.apply({ a: 2 });
```
This will return `2`, because you are now explicitly setting `this` to `{ a: 2 }`.
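The same thing works if you pull the function off the object first; `apply` supplies the context that the dot notation would otherwise provide. A minimal sketch (assuming non-strict mode, and reusing the `obj` from above):

```javascript
var detached = obj.b;

// Called bare in non-strict mode, `this` is the global object,
// which has no `a` property:
console.log(detached());               // undefined

// apply supplies the context explicitly:
console.log(detached.apply({ a: 2 })); // 2
console.log(detached.apply(obj));      // 1
```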
BUT: if you have a function that doesn't make use of `this` internally, then it doesn't matter what value `this` has. So you can call:
```javascript
obj.c();
```
or one of these:
```javascript
obj.c.apply(null);
obj.c.apply(undefined);
obj.c.apply("xyz");
```
But you will always get the same result: `3`. It is like setting a variable that you never use: no matter what value it has, it doesn't affect the outcome.
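Worth noting: in non-strict mode the engine coerces the context anyway. `null` and `undefined` are replaced by the global object, and a primitive like `"xyz"` is boxed into an object. A quick sketch:

```javascript
function whatIsThis() {
  return typeof this;
}

// Non-strict mode coerces the context before the function runs:
console.log(whatIsThis.apply(null));  // "object" (the global object)
console.log(whatIsThis.apply("xyz")); // "object" (a boxed String)
```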
So apparently `Math.min` doesn't use `this` and therefore doesn't break when you pass an "invalid" context.
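To tie this back to the original question, here is the usual idiom (a sketch; any array of numbers will do):

```javascript
var values = [5, 1, 3];

// apply spreads the array into Math.min's argument list;
// the context argument is irrelevant because Math.min ignores `this`:
console.log(Math.min.apply(null, values));  // 1
console.log(Math.min.apply("xyz", values)); // 1, same result
```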