Is there a way to step between 0 and 1 by 0.1?
I thought I could do it like the following, but it failed:
for i in range(0, 1, 0.1):
    print i
The range() built-in function expects integer arguments and returns a sequence of integers, I'm afraid, so you can't use it to take a decimal step.
I'd say just use a while loop:
i = 0.0
while i <= 1.0:
    print i
    i += 0.1
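One thing to watch out for: adding 0.1 over and over accumulates floating-point error, so a loop like that can stop one step early or run one step long. Here's a sketch that avoids the drift by stepping over integers and scaling down (frange is just a helper name I made up for this example):

def frange(stop, step):
    # Work out how many whole steps fit, then multiply instead of
    # accumulating, so the rounding error doesn't build up.
    steps = int(round(stop / step))
    for n in range(steps + 1):
        yield n * step

for i in frange(1.0, 0.1):
    print i

That yields 0.0, 0.1, ... up to 1.0, though each value is still an ordinary binary float, i.e. an approximation of the decimal it prints as.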
If you're curious, Python is truncating your 0.1 step down to 0, which is why the error tells you the step argument can't be zero.
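You can check that truncation yourself at the interpreter (assuming a Python 2 range() that coerces float arguments through int()):

>>> int(0.1)        # the step gets truncated to an integer...
0
>>> range(0, 1, 0)  # ...and a zero step is what range() then refuses

That zero step is what produces the error message you saw.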