Why is my code not working and what does this error mean?
import random
initial_val = str(10)
attr_c1_stre = ("Character 1's Strength: ", str(random.randint(1,
initial_val is a string:

initial_val = str(10)
You are trying to add it to a floating-point value (the result of the division):
random.randint(1,12)/random.randint(1,4) + initial_val
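A minimal sketch reproducing the problem (the expression is taken from the question; the captured message text is what CPython raises for float + str):

```python
import random

initial_val = str(10)  # a str, not an int

try:
    random.randint(1, 12) / random.randint(1, 4) + initial_val
    result = "no error"
except TypeError as err:
    # Python refuses to add a float and a str
    result = str(err)

print(result)
```

Running this prints a message along the lines of "unsupported operand type(s) for +: 'float' and 'str'", which is the error you are seeing.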
initial_val should not be a string; leave it as an integer instead:
initial_val = 10
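Putting it together, a corrected sketch of the snippet (the variable names follow the question; the display-time str() conversion is an assumption about what you intended):

```python
import random

initial_val = 10  # keep it as an int so the arithmetic works

# float + int is fine; it was float + str that raised TypeError
strength = random.randint(1, 12) / random.randint(1, 4) + initial_val

# convert to a string only when building the display text
attr_c1_stre = "Character 1's Strength: " + str(strength)
print(attr_c1_stre)
```

In general, do arithmetic on numbers and convert to str only at the point where you print or concatenate for output.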