This is a short post that’s meant to clarify the HTML5 Number Input type restrictions. I recently got snagged on the “step” attribute.
HTML5 added some new input restrictions, including “min”, “max”, “step”, and more.
By default, step is set to 1 and only integers are accepted, so the most basic usage below accepts any integer, positive or negative.
<input id="size" type="number"/>
To only accept numbers that are a multiple of five, set the step to five.
<input id="size" step="5" type="number"/>
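Under the hood, a number fails step validation when the value, measured from the step base (0 by default), is not an exact multiple of the step. A minimal sketch of that check in JavaScript (my own function name and scaling, not the browser's exact algorithm):

```javascript
// Sketch: a value is valid when (value - stepBase) is an exact
// multiple of step. stepBase defaults to 0.
// Scaling to integers avoids floating-point remainder errors;
// this assumes no more than six decimal places.
function stepMismatch(value, step, stepBase = 0) {
  const scale = 1e6;
  const diff = Math.round((value - stepBase) * scale);
  const stepScaled = Math.round(step * scale);
  return diff % stepScaled !== 0;
}

console.log(stepMismatch(15, 5)); // false: 15 is a multiple of 5
console.log(stepMismatch(12, 5)); // true: 12 is not
```

In the browser itself you would read this result from the input's `validity.stepMismatch` property rather than computing it yourself.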
Where I got hung up was on decimal places. I wanted to allow two decimal places, so I set my step to .1, thinking that would put the input into a decimal mode. Instead, it only allows values with one decimal place: I could enter 1.2 but not 1.25.
<input id="size" step=".1" type="number"/>
So to get two decimal places, the step needs to be .01. Numbers with one decimal place or whole numbers are still valid, but anything with more than two decimal places is invalid.
<input id="size" step=".01" type="number"/>
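The same divisibility rule explains this behavior: 1.2 is a multiple of .1, 1.25 is not, but 1.25 is a multiple of .01. A quick sketch (the helper name and scale factor are mine; scaling to integers sidesteps floating-point error):

```javascript
// Is value an exact multiple of step? Scale to integers first,
// since e.g. 1.25 % 0.01 is not exactly 0 in floating point.
const isMultiple = (value, step) =>
  Math.round(value * 1000) % Math.round(step * 1000) === 0;

console.log(isMultiple(1.2, 0.1));   // true  → valid with step=".1"
console.log(isMultiple(1.25, 0.1));  // false → invalid with step=".1"
console.log(isMultiple(1.25, 0.01)); // true  → valid with step=".01"
```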
This made me wonder what happens if you need to accept any number of decimal places. For that, set the step attribute to the value “any”. This allows any number of decimal places, though the little increase and decrease arrows revert to stepping by whole numbers in Firefox and Chrome; Microsoft Edge appears not to show those arrows at all.
<input id="size" step="any" type="number"/>
Combining a value with a step can produce interesting results, because step validation counts from a base: the value attribute, when no min is set. So with a step of .1 and a value of 1.01, the valid numbers are 1.01, 1.11, 1.21, and so on. With the code below, 1.21 is accepted but 1.22 is not; with a step of .1 alone, neither would be.
<input id="size" step=".1" value="1.01" type="number"/>
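Measuring from the base, 1.21 − 1.01 = 0.20, an exact multiple of .1, while 1.22 − 1.01 = 0.21 is not. A sketch of that check (helper name and scale are my own, assuming the step base comes from the value attribute):

```javascript
// Valid when (value - stepBase) is an exact multiple of step.
// The step base here comes from the value attribute (1.01).
// Scaling by 100 handles two decimal places exactly.
const isValidStep = (value, step, stepBase) =>
  Math.round((value - stepBase) * 100) % Math.round(step * 100) === 0;

console.log(isValidStep(1.21, 0.1, 1.01)); // true:  0.20 is a multiple of 0.1
console.log(isValidStep(1.22, 0.1, 1.01)); // false: 0.21 is not
```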