Whenever a non-integer pixel value is used for the border of an element, the browser simply truncates the value to an integer. Why is this the case?
I'm aware th
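For what it's worth, this is a minimal way to reproduce what I am describing (a sketch only; the element and values are arbitrary, and the exact rounding depends on the browser):

```typescript
// Minimal repro, e.g. pasted into a browser console.
const el = document.createElement("div");
el.style.border = "2.7px solid black";
document.body.appendChild(el);

// In a browser that truncates fractional border widths this logs "2px",
// not "2.7px".
console.log(getComputedStyle(el).borderTopWidth);
```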
The simple explanation is that the browser uses integers for border widths internally (or at least exposes them publicly as such).
An example of this is the source code of Chrome (Chromium), which in the file ComputedStyle.h defines all border widths as integers (line 508).
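I won't reproduce the Chromium source here, but the consequence of storing widths as integers can be sketched like this (a conceptual illustration only, not Blink's actual code; the class and member names are made up):

```typescript
// Conceptual sketch -- not Chromium code. It only illustrates what
// "storing border widths as integers" implies for fractional CSS values.
class ComputedBorder {
  // The engine keeps the used width as a whole number of pixels.
  private widthPx = 0;

  setWidth(cssPixels: number): void {
    // A fractional specified value is truncated when it is stored.
    this.widthPx = Math.floor(cssPixels);
  }

  getWidth(): number {
    return this.widthPx;
  }
}

const border = new ComputedBorder();
border.setWidth(2.7);
console.log(border.getWidth()); // 2
```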
There is little we can do about that. As to why: the W3C specification for CSS Backgrounds and Borders says very little about border widths. It only defines the value as line-width, with no unit, type, or rule for how to treat it, other than that the computed value is an absolute, non-negative length:
Value: <line-width>
[...]
Computed value: absolute length; ‘0’ if the border style is ‘none’ or ‘hidden’
And:
The lengths corresponding to ‘thin’, ‘medium’ and ‘thick’ are not specified, but the values are constant throughout a document and thin ≤ medium ≤ thick. A UA could, e.g., make the thickness depend on the ‘medium’ font size: one choice might be 1, 3 & 5px when the ‘medium’ font size is 17px or less. Negative values are not allowed.
The same information is found in the box model document with no new details.
As all values eventually end up as pixel values (our screens are pixel devices), a width that arrives through em, vw, %, etc. appears to be reduced to an integer when it is used as a border width, with no sub-pixel rendering.
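A quick way to observe this with relative units (again only a sketch; the exact figures assume a 16px font and a browser that truncates border widths):

```typescript
// 0.3em at a 16px font size is 4.8px. A truncating browser reports the
// border as a whole number of pixels, while padding keeps the fraction.
const box = document.createElement("div");
box.style.fontSize = "16px";
box.style.border = "0.3em solid black"; // 0.3em * 16px = 4.8px
box.style.paddingTop = "0.3em";         // same length, different property
document.body.appendChild(box);

const cs = getComputedStyle(box);
console.log(cs.borderTopWidth); // e.g. "4px" where border widths are truncated
console.log(cs.paddingTop);     // e.g. "4.8px" -- the sub-pixel value survives
```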
Not even transforms (scale) seem to affect this in browsers that use integers for border widths.
In the end, it seems to be up to the browser vendor how to treat these values (the reasons could be aesthetic, performance-related, or something else entirely; we can only guess).