Exchanging data between Python processes through a database

Submitted by 北城余情 on 2019-12-08 11:03:21

Question


I have a Raspberry Pi with an MCC118 DAQ HAT. The goal is threefold:

  1. Acquire and record data to a database
  2. Stream data over a private LAN to a browser as close to real time as possible (limited by network GET/POST speed)
  3. Analyze this data and actuate devices (RPi.GPIO) based on the analysis

My program is both CPU- and I/O-bound: it must wait on data input from the DAQ HAT (I/O-bound), and, once data is read, it must be recorded and analyzed (CPU-bound). For the CPU-bound portion, I am using multiprocessing to build a database. The browser display will then be updated (via Flask) by querying the database. In an effort to do this efficiently, I developed the following example code as a foundation for the recording and display aspects of my project:

#!/usr/bin/env python
#  -*- coding: utf-8 -*-

import sys
import random
import psycopg2 as sql
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT
from multiprocessing import Process

def fill_table():
    # Writer process: continuously insert new random samples into ran_num
    conn_fill = sql.connect('dbname=mp_example user=pi')
    cf = conn_fill.cursor()
    while True:
        cf.execute('INSERT INTO ran_num (ran) VALUES (%s);', (random.random(),))
        conn_fill.commit()

def read_table():
    # Reader process: repeatedly fetch the newest row and print it in place
    conn_read = sql.connect('dbname=mp_example user=pi')
    cr = conn_read.cursor()
    while True:
        cr.execute('SELECT * FROM ran_num WHERE id=(SELECT MAX(id) FROM ran_num);')
        val = cr.fetchone()
        if val:
            print('\r' + str(val[0]) + ' - ' + str(round(float(val[1]), 3)), end='')
        sys.stdout.flush()
    print('\n')

def main():
    # (Re)create the ran_num table before the worker processes start
    conn_main = sql.connect('dbname=mp_example user=pi')
    cm = conn_main.cursor()
    cm.execute('DROP TABLE IF EXISTS ran_num;')
    cm.execute('CREATE TABLE ran_num (id SERIAL PRIMARY KEY, ran VARCHAR);')
    conn_main.commit()
    cm.close()
    conn_main.close()
    print('Starting program, press CTRL + C to quit... \n')

if __name__ == '__main__':
    main()
    try:
        p1 = Process(target=fill_table)
        p1.start()
        p2 = Process(target=read_table)
        p2.start()
    except KeyboardInterrupt:
        p1.join()
        p2.join()
        print('Processes have closed without zombies...\n')

I have only just started using Postgres (I was previously using sqlite3) in an effort to get concurrent writing and querying of the data in what I think is a "data-safe" way. However, my questions are as follows:

  1. Is there a more memory-efficient and/or data-safe way to do this (e.g. pipes)?
  2. Is there a better way to exit when the KeyboardInterrupt exception is thrown? (My exception print statement never prints, which suggests the child processes may be left behind as zombies; see the shutdown sketch after this list.)
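
For context, the kind of exit handling I have in mind (an untested sketch; it reuses main, fill_table, and read_table from above and replaces only the __main__ block) looks roughly like this:

# (fill_table, read_table and main are the same functions defined above)
from multiprocessing import Process

if __name__ == '__main__':
    main()
    p1 = Process(target=fill_table)
    p2 = Process(target=read_table)
    p1.start()
    p2.start()
    try:
        # Block here so the parent is still inside the try when CTRL + C arrives;
        # in my code above the try block is left as soon as start() returns.
        p1.join()
        p2.join()
    except KeyboardInterrupt:
        # Stop the children and reap them so no zombies are left behind
        p1.terminate()
        p2.terminate()
        p1.join()
        p2.join()
        print('Processes have closed without zombies...\n')

I am not sure whether that is the idiomatic approach, which is part of what I am asking.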

NOTE: I was initially reading a single data value from the DAQ HAT every time a GET request came in over the LAN; however, this meant the sample rate was dependent on network speed. For data integrity and consistency, I want to use multiprocessing so that data acquisition runs at a constant sample rate, with the data being read from the database in response to network requests.
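
For completeness, this is roughly how I plan to expose the latest sample to the browser (a minimal, untested Flask sketch; the '/latest' route name and the port are placeholders, and it queries the same ran_num table created above):

#!/usr/bin/env python
#  -*- coding: utf-8 -*-

import psycopg2 as sql
from flask import Flask, jsonify

app = Flask(__name__)

@app.route('/latest')
def latest():
    # Fetch only the newest row; the acquisition process keeps inserting
    # at its own constant sample rate regardless of incoming requests.
    conn = sql.connect('dbname=mp_example user=pi')
    cur = conn.cursor()
    cur.execute('SELECT id, ran FROM ran_num ORDER BY id DESC LIMIT 1;')
    row = cur.fetchone()
    cur.close()
    conn.close()
    if row is None:
        return jsonify(id=None, ran=None)
    return jsonify(id=row[0], ran=float(row[1]))

if __name__ == '__main__':
    # Bind to 0.0.0.0 so other machines on the private LAN can reach the endpoint
    app.run(host='0.0.0.0', port=5000)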

Source: https://stackoverflow.com/questions/54836651/exchanging-data-between-python-processes-through-a-database
