Passing data between separately running Python scripts

眼角桃花 2020-12-01 11:06

If I have a python script running (with full Tkinter GUI and everything) and I want to pass the live data it is gathering (stored internally in arrays and such) to another python script, what would be the best way of doing that?

3 Answers
  • 2020-12-01 11:26

    You could use the pickle module to pass data between two Python programs.

    import pickle

    def storeData():
        # initialize the data to be stored in the "db" dictionary
        employee1 = {'key': 'Engineer', 'name': 'Harrison',
                     'age': 21, 'pay': 40000}
        employee2 = {'key': 'LeadDeveloper', 'name': 'Jack',
                     'age': 50, 'pay': 50000}

        # database
        db = {}
        db['employee1'] = employee1
        db['employee2'] = employee2

        # it's important to open the file in binary mode
        dbfile = open('examplePickle', 'ab')

        # source, destination
        pickle.dump(db, dbfile)
        dbfile.close()

    def loadData():
        # reading also has to be done in binary mode
        dbfile = open('examplePickle', 'rb')
        db = pickle.load(dbfile)
        for key in db:
            print(key, '=>', db[key])
        dbfile.close()
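
    For the original use case (two separately running scripts), a minimal sketch of how the two sides could share a pickle file; the file name live_data.pkl and the example data are made up for illustration:

    # writer side, e.g. inside the GUI script: dump the current arrays to disk
    import pickle

    live_data = {'readings': [1, 2, 3], 'labels': ['a', 'b', 'c']}
    with open('live_data.pkl', 'wb') as f:    # 'wb' overwrites the previous snapshot
        pickle.dump(live_data, f)

    # reader side, in the second script: load whatever was written last
    import pickle

    with open('live_data.pkl', 'rb') as f:
        live_data = pickle.load(f)
    print(live_data)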
    
  • 2020-12-01 11:47

    You can use the multiprocessing module to create a Pipe between the two scripts. You can then start one of them as a Process and use the Pipe to communicate with it. The best part about using pipes is that you can also pass Python objects like dicts and lists through them.

    For example, mp2.py:

    from multiprocessing import Process, Pipe
    from mp1 import f

    if __name__ == '__main__':
        parent_conn, child_conn = Pipe()
        p = Process(target=f, args=(child_conn,))
        p.start()
        print(parent_conn.recv())   # prints "Hello"
        p.join()
    

    mp1.py:

    def f(child_conn):
        # runs in the child process: send a message, then close this end of the pipe
        msg = "Hello"
        child_conn.send(msg)
        child_conn.close()
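
    Since a Pipe pickles whatever you send, the message can just as well be a dict or list of live readings instead of a string. A single-file sketch (the function name sender and the example dict are made up for illustration):

    from multiprocessing import Process, Pipe

    def sender(conn):
        # a whole dict goes through the pipe in one send() call
        conn.send({'samples': [1.2, 3.4, 5.6], 'count': 3})
        conn.close()

    if __name__ == '__main__':
        parent_conn, child_conn = Pipe()
        p = Process(target=sender, args=(child_conn,))
        p.start()
        print(parent_conn.recv())   # prints the dict sent by the child
        p.join()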
    
  • 2020-12-01 11:47

    If you want to read and modify shared data between two scripts that run separately, a good solution is to take advantage of the Python multiprocessing module and use a Pipe() or a Queue() (see the differences here). This way you can synchronize the scripts and avoid problems with concurrency and global variables (such as what happens if both scripts want to modify a variable at the same time).

    As Akshay Apte said in his answer, the best part about using pipes/queues is that you can pass Python objects through them.

    Also, there are methods to avoid waiting for data if none has been passed yet (queue.empty() and pipeConn.poll()).

    See an example using Queue() below:

        # main.py
        from multiprocessing import Process, Queue
        from stage1 import Stage1
        from stage2 import Stage2
    
    
        s1= Stage1()
        s2= Stage2()
    
        # S1 to S2 communication
        queueS1 = Queue()  # s1.stage1() writes to queueS1
    
        # S2 to S1 communication
        queueS2 = Queue()  # s2.stage2() writes to queueS2
    
        # start stage2 in another process (named s2_process to avoid reusing the name s2)
        s2_process = Process(target=s2.stage2, args=(queueS1, queueS2))
        s2_process.daemon = True
        s2_process.start()     # launch the stage2 process

        s1.stage1(queueS1, queueS2) # start sending stuff from s1 to s2
        s2_process.join() # wait till the stage2 process finishes
    
        # stage1.py
        import time
        import random
    
        class Stage1:
    
          def stage1(self, queueS1, queueS2):
            print("stage1")
            lala = []
            lis = [1, 2, 3, 4, 5]
            for i in range(len(lis)):
              # to avoid unnecessary waiting
              if not queueS2.empty():
                msg = queueS2.get()    # get msg from s2
                print("! ! ! stage1 RECEIVED from s2:", msg)
                lala = [6, 7, 8] # now that a msg was received, further msgs will be different
              time.sleep(1) # work
              random.shuffle(lis)
              queueS1.put(lis + lala)             
            queueS1.put('s1 is DONE')
    
        # stage2.py
        import time
    
        class Stage2:
    
          def stage2(self, queueS1, queueS2):
            print("stage2")
            while True:
                msg = queueS1.get()    # wait till there is a msg from s1
                print("- - - stage2 RECEIVED from s1:", msg)
                if msg == 's1 is DONE':   # must match the string put by stage1 exactly
                    break # ends loop
                time.sleep(1) # work
                queueS2.put("update lists")             
    

    EDIT: I just found that you can use queue.get(False) to avoid blocking when receiving data. This way there's no need to check first whether the queue is empty. This is not possible if you use pipes.
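
    A small sketch of that non-blocking pattern (note that queue.get(False), like get_nowait(), raises the queue.Empty exception when nothing is available, so it is usually wrapped in try/except; the standalone queue q here is made up for illustration):

        # non-blocking receive with multiprocessing.Queue
        import queue                      # provides the Empty exception
        from multiprocessing import Queue

        q = Queue()

        try:
            msg = q.get(False)            # same as q.get_nowait()
            print("received:", msg)
        except queue.Empty:
            print("nothing to read yet")  # carry on instead of blocking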
