What is the best way to organize two threads of execution inside Flask?
Greetings.
I'm thinking through the structure of a simple monitoring tool with a graphical interface. It should fetch JSON from a dedicated URL, check a few fields and, if certain conditions match, display a Big Scary Red Rectangle so that support gets scared and panics.
I want to build the GUI on Flask. But I still need to solve the problem of regularly polling the aforementioned API endpoint. To keep deployment (and teardown) simple, I'd rather not use cron, so I'm wondering how to implement all of this in a single script.
The first option that came to mind is to spawn another thread inside the Flask process, have it sleep for 5-10 minutes, then refresh the data. It seems like a good idea, but I don't yet understand the framework's structure well enough to figure out where to attach this thread.
The second option is to use SIGALRM. The plus is a simple implementation, but I need advice on how to do it cleanly. I'm thinking of implementing both behaviors (the GUI and the data acquisition) in one script and choosing between them via a command-line argument, so the GUI would spawn a "worker".
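A hedged sketch of the SIGALRM idea, assuming a Unix host; `poll()` here is only a placeholder for the real JSON fetch-and-check:

```python
# Option 2 sketch: SIGALRM-driven polling (Unix only).
# poll() is a placeholder; the real body would fetch and check the JSON.
import signal

POLL_INTERVAL = 300  # seconds between polls

state = {"polls": 0}

def poll():
    state["polls"] += 1  # placeholder for the actual API check

def on_alarm(signum, frame):
    poll()
    signal.alarm(POLL_INTERVAL)  # re-arm the timer for the next cycle

signal.signal(signal.SIGALRM, on_alarm)
signal.alarm(POLL_INTERVAL)  # schedule the first poll
```

One caveat worth knowing: SIGALRM interrupts whatever the main thread is doing, which interacts poorly with a WSGI server that is busy handling requests.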
So the questions are: is there a third [, fourth, fifth ...] option, how bad are the two proposed ones, and can either still be turned into something decent?
UPD. And yes, if someone can explain step by step how to implement the first option, I will be immensely grateful.
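For the first option, a minimal sketch of a daemon thread started next to the Flask app; the URL, the JSON fields, and the alert condition below are all assumptions:

```python
# Option 1 sketch: a daemon polling thread inside the Flask process.
# API_URL and the fields checked in check_alert() are hypothetical.
import json
import threading
import time
import urllib.request

from flask import Flask, render_template_string

API_URL = "http://example.com/status.json"  # hypothetical endpoint
POLL_INTERVAL = 300  # 5 minutes

app = Flask(__name__)
state = {"alert": False}  # whole-value swaps on a dict are safe under the GIL

def check_alert(data):
    """True when the JSON matches the scary conditions (fields are made up)."""
    return data.get("status") != "ok" or data.get("errors", 0) > 0

def poller():
    while True:
        try:
            with urllib.request.urlopen(API_URL, timeout=10) as resp:
                state["alert"] = check_alert(json.load(resp))
        except Exception:
            state["alert"] = True  # an unreachable API is scary too
        time.sleep(POLL_INTERVAL)

@app.route("/")
def index():
    color = "red" if state["alert"] else "green"
    return render_template_string(
        '<div style="background:{{ c }};height:300px"></div>', c=color)

if __name__ == "__main__":
    # daemon=True: the thread dies together with the main process
    threading.Thread(target=poller, daemon=True).start()
    app.run()
```

Note that under a multi-process WSGI server (e.g. gunicorn with several workers) each worker would run its own poller, which is one reason to prefer an external scheduler or cache.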
I think the most correct and most instructive approach is to use an asynchronous job queue, such as Celery or RQ (both on GitHub).
First, you don't have to reinvent the wheel. Second, you get acquainted with how the established tools in this area work.
Here is an article from a Flask guru on using Celery.
So you can't be bothered to write a two-line script for cron and add a single crontab line to run it, but writing and babysitting a second thread is somehow less effort?
If your JSON API is not terribly slow, you can set up some simple storage (memcached, or even the FileSystemCache built right into Werkzeug), cache the API response there, and when the cache expires, simply pull the API during the next request and refresh the data in storage.
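The expire-then-refetch idea can be sketched with a minimal in-process TTL cache (stdlib only; in the answer's setup you would swap in memcached or Werkzeug's FileSystemCache, and `fetch` would hit the real API):

```python
import time

class TTLCache:
    """Minimal single-value cache illustrating expire-on-read."""

    def __init__(self, ttl):
        self.ttl = ttl
        self._value = None
        self._stamp = float("-inf")  # so the first get() reports a miss

    def get(self):
        if time.monotonic() - self._stamp > self.ttl:
            return None  # expired: the caller should pull the API again
        return self._value

    def set(self, value):
        self._value = value
        self._stamp = time.monotonic()

cache = TTLCache(ttl=300)  # 5-minute lifetime, matching the polling period

def get_status(fetch):
    """Return cached data; call fetch() (the API request) only on a miss."""
    data = cache.get()
    if data is None:
        data = fetch()
        cache.set(data)
    return data
```

Each web request calls `get_status`, so the API gets hit at most once per TTL window, with no extra thread or process.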
And it would be even simpler to do a banal auto-refresh of the page, with no artificially created problems around deployment and threads. If no functionality beyond displaying the Big Scary Red Rectangle is expected, this kind of thing is done entirely in bare HTML + JS, without any backends, threads, queue servers, asynchrony and other trendy features that are completely unnecessary in "simple" quick-and-dirty monitoring. And if you need proper monitoring, install Zabbix :)