Question
I need to call one Python script from another script, and I'm trying to do it with the help of the execfile function. I need to pass a dictionary as an argument to the called script. Is there any way to do that?
import subprocess
from subprocess import Popen

# ------- read the test data from the xls sheet -------
lst = []
ret_lst = T_read("LDW_App05")  # helper that reads the input data from Excel
for each in ret_lst:
    lst.append(each.replace(' ', '-'))
    lst.append(' ')

result = Popen(['python', 'LDW_App05.py'] + lst,
               stdin=subprocess.PIPE, stdout=subprocess.PIPE).communicate()
print result
Here, in the above code, I'm reading the input data from the Excel sheet into a list, and I need to pass that list as an argument to the LDW_App05.py file.
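For reference on the execfile part of the question: execfile itself doesn't take script arguments, but in Python 2.7 it does accept a globals dictionary, so one way to hand a dict to the executed script is to inject it as a global. A minimal sketch, assuming the called script simply reads a params global (the names here are illustrative, not from the question):

my_dict = {"App": "LDW_App05", "mode": "test"}  # the dict we want to pass
# everything in this namespace is visible as a global inside the executed
# script, i.e. LDW_App05.py can refer to `params` directly:
execfile("LDW_App05.py", {"params": my_dict})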
Answer 1:
Instead of passing complex data as CL arguments, I propose piping your data via STDIN/STDOUT - then you don't need to worry about escaping special, shell-significant chars or about exceeding the maximum command line length.
Typically, as a CL-argument-based script you might have something like app.py:
import sys

if __name__ == "__main__":  # ensure the script is run directly
    if len(sys.argv) > 1:  # if at least one CL argument was provided
        print("ARG_DATA: {}".format(sys.argv[1]))  # print it out...
    else:
        print("usage: python {} ARG_DATA".format(__file__))
It clearly expects an argument to be passed, and it will print it out if passed from another script, say caller.py:
import subprocess
out = subprocess.check_output(["python", "app.py", "foo bar"]) # pass foo bar to the app
print(out.rstrip()) # print out the response
# ARG_DATA: foo bar
But what if you want to pass something more complex, let's say a dict? Since a dict is a hierarchical structure, we'll need a way to represent it on a single line. There are a lot of formats that would fit the bill, but let's stick to basic JSON, so you might have your caller.py set to something like this:
import json
import subprocess

data = {  # our complex data
    "user": {
        "first_name": "foo",
        "last_name": "bar",
    }
}

serialized = json.dumps(data)  # serialize it to JSON
out = subprocess.check_output(["python", "app.py", serialized])  # pass the serialized data
print(out.rstrip())  # print out the response
# ARG_DATA: {"user": {"first_name": "foo", "last_name": "bar"}}
Now if you modify your app.py to recognize the fact that it's receiving JSON as an argument, you can deserialize it back to a Python dict to access its structure:
import json
import sys

if __name__ == "__main__":  # ensure the script is run directly
    if len(sys.argv) > 1:
        data = json.loads(sys.argv[1])  # parse the JSON from the first argument
        print("First name: {}".format(data["user"]["first_name"]))
        print("Last name: {}".format(data["user"]["last_name"]))
    else:
        print("usage: python {} JSON".format(__file__))
Then if you run your caller.py again you'll get:

First name: foo
Last name: bar
But this is very tedious, and JSON is not very friendly to the CL (behind the scenes Python does a ton of escaping to make it work), not to mention there is a limit (OS- and shell-dependent) on how big a JSON string passed this way can be. It's much better to use the STDIN/STDOUT buffer to pass your complex data between processes. To do so, you'll have to modify your app.py to wait for input on its STDIN, and your caller.py to send the serialized data to it. So, app.py can be as simple as:
import json

if __name__ == "__main__":  # ensure the script is run directly
    try:
        arg = raw_input()  # get input from STDIN (Python 2.x)
    except NameError:
        arg = input()  # get input from STDIN (Python 3.x)
    data = json.loads(arg)  # parse the JSON read from STDIN
    print("First name: {}".format(data["user"]["first_name"]))  # print to STDOUT
    print("Last name: {}".format(data["user"]["last_name"]))  # print to STDOUT
and caller.py:
import json
import subprocess

data = {  # our complex data
    "user": {
        "first_name": "foo",
        "last_name": "bar",
    }
}

# start the process and pipe its STDIN and STDOUT to this process' handles:
proc = subprocess.Popen(["python", "app.py"],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
serialized = json.dumps(data)  # serialize the data to JSON
# send the serialized data to proc's STDIN (on Python 3.x, pass bytes:
# proc.communicate(serialized.encode())):
out, err = proc.communicate(serialized)
print(out.rstrip())  # print what was returned on STDOUT
and if you invoke caller.py you again get:

First name: foo
Last name: bar
But this time there is no limit on the size of the data you're passing over to your app.py, and you don't have to worry about a certain format getting mangled by shell escaping etc. You can also keep the 'channel' open and have both processes communicate with each other in a bi-directional fashion - check this answer for an example, or see the sketch below.
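A rough sketch of that bi-directional pattern (not from the original answer; child_app.py is a hypothetical worker script, and the code is written for Python 2 to match the rest of the answer - on Python 3.x you'd encode/decode the pipe data or pass universal_newlines=True):

# child_app.py (hypothetical): reads one JSON request per line, answers in kind
#     import json, sys
#     for line in iter(sys.stdin.readline, ''):  # read line by line until EOF
#         req = json.loads(line)
#         print(json.dumps({"greeting": "hello " + req["name"]}))
#         sys.stdout.flush()  # push the reply out immediately

import json
import subprocess

proc = subprocess.Popen(["python", "child_app.py"],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
for name in ("foo", "bar"):
    proc.stdin.write(json.dumps({"name": name}) + "\n")  # one request per line
    proc.stdin.flush()  # make sure the line reaches the child now
    reply = json.loads(proc.stdout.readline())  # block for the one-line reply
    print(reply["greeting"])
proc.stdin.close()  # send EOF so the child's read loop terminates
proc.wait()  # reap the child process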
Source: https://stackoverflow.com/questions/45185425/passing-arguments-to-execfile-in-python-2-7