Question
I'm running a Flask server that connects to an iOS client with Flask-SocketIO. The server has to process some complicated data, and since that takes a while, I do it in a background job using Redis Queue (RQ).
Communication normally works fine, but when the job finishes I need to emit to the client and write to the database, and I'm trying to do that from the job function. (If there were a way to let the app know when a job is finished, the app could handle all communication in one place.)
To do this, I create a new SocketIO instance inside the job and connect it to the Redis queue, but I think I'm doing it the wrong way.
It doesn't crash, but the client isn't receiving anything.
Here is my code:
tasks.py
import os
from flask_socketio import SocketIO

# This is the job
def engine(path, id):
    result = process(path)
    print(result)
    socket = SocketIO(message_queue=os.environ.get('REDIS_URL'))
    socket.emit('info', result)
events.py
def launch_task(name, description, *args, **kwargs):
    rq_job = current_app.task_queue.enqueue('app.tasks.' + name,
                                            *args, **kwargs)
    return rq_job.get_id()

@socketio.on('File')
def got_file(file):
    print("GOT FILE")
    print(file[0])
    name = file[0] + ".csv"
    path = queue_dir + name
    data = file[1]
    csv = open(path, "w")
    csv.write(data)
    csv.close()
    print(path)
    launch_task("engine", "test", path, request.sid)
__init__.py
socketio = SocketIO()

def create_app(debug=False, config_class=Config):
    app = Flask(__name__)
    app.debug = debug
    app.config.from_object(config_class)
    app.redis = Redis.from_url(app.config['REDIS_URL'])
    app.task_queue = rq.Queue('alg-tasks', connection=app.redis)
    from .main import main as main_blueprint
    app.register_blueprint(main_blueprint)
    socketio.init_app(app)
    return app
events.py handles all communication and launches the worker.
I think my arguments are wrong when instantiating SocketIO, but I'm not sure; there's still a lot I don't understand about SocketIO and background jobs.
Thanks in advance!
Answer 1:
In the app, you have to initialize your SocketIO object with both the app and the message queue:
socketio.init_app(app, message_queue=os.environ.get('REDIS_URL'))
On your RQ worker you are doing it right: only the message queue is used:
socket = SocketIO(message_queue=os.environ.get('REDIS_URL'))
But creating a new SocketIO instance each time you emit is a waste of resources. You should create a single global instance that can be reused across the tasks handled by the worker.
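A minimal sketch of that reuse pattern, as it might look in tasks.py. The SocketIO class below is a stand-in for flask_socketio.SocketIO so the snippet runs without the real dependency (in the worker you would import the real class instead); the room=id targeting is an assumption based on the request.sid the question already passes to the job.

```python
import os

# Stand-in for flask_socketio.SocketIO: the real class, when given only
# message_queue, publishes emits to Redis for the server to relay.
class SocketIO:
    def __init__(self, message_queue=None):
        self.message_queue = message_queue
        self.sent = []  # records emits so the pattern is observable here

    def emit(self, event, data, room=None):
        self.sent.append((event, data, room))

_socketio = None

def get_socketio():
    """Create the write-only SocketIO object once; reuse it for every task."""
    global _socketio
    if _socketio is None:
        _socketio = SocketIO(message_queue=os.environ.get('REDIS_URL'))
    return _socketio

def engine(path, id):
    result = "processed:" + path  # placeholder for the real process(path)
    # room=id routes the event back to the one client whose sid was enqueued
    get_socketio().emit('info', result, room=id)
```

Because the module-level object survives between jobs, every task handled by the same worker process shares one connection to the message queue instead of opening a new one per emit.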
来源:https://stackoverflow.com/questions/51635019/flask-socketio-not-emitting-from-external-rq-process