# Buildsheet autogenerated by ravenadm tool -- Do not edit.
SDESC[v11]=		Library for processing background jobs (3.11)
SDESC[v12]=		Library for processing background jobs (3.12)
HOMEPAGE=		https://python-rq.org/
CONTACT=		Python_Automaton[python@ironwolf.systems]

SITES[main]=		PYPIWHL/84/62/9a6b14d4a724607f740cc4b3d132c52af24e753fe2a958d0b8c9d6d24e0a
DISTFILE[1]=		rq-1.16.0-py3-none-any.whl:main
OPTIONS_AVAILABLE=	PY311 PY312
OPTIONS_STANDARD=	none
VOPTS[v11]=		PY311=ON PY312=OFF
VOPTS[v12]=		PY311=OFF PY312=ON

DISTNAME=		rq-1.16.0.dist-info
[PY311].RUN_DEPENDS_ON=			python-click:single:v11
					python-redis:single:v11
[PY311].USES_ON=			python:v11,wheel

[PY312].RUN_DEPENDS_ON=			python-click:single:v12
					python-redis:single:v12
[PY312].USES_ON=			python:v12,wheel
[FILE:2206:descriptions/desc.single]
RQ (_Redis Queue_) is a simple Python library for queueing jobs and
processing them in the background with workers. It is backed by Redis and
it is designed to have a low barrier to entry. It should be integrated in
your web stack easily.

RQ requires Redis >= 3.0.0.
[![Code style: black]](https://github.com/psf/black)

Full documentation can be found [here][d].

If you find RQ useful, please consider supporting this project via
[Tidelift].
First, run a Redis server, of course:

    $ redis-server
To put jobs on queues, you don't have to do anything special, just define
your typically lengthy or blocking function:
    import requests

    def count_words_at_url(url):
        """Just an example function that's called async."""
        resp = requests.get(url)
        return len(resp.text.split())
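Since the function body is plain Python, its word-counting step can be sanity-checked locally without Redis or any network access; `sample_text` below is a made-up stand-in for a fetched page body:

```python
def count_words(text):
    # Mirrors count_words_at_url's counting step: whitespace-split and count.
    return len(text.split())

sample_text = "RQ is a simple Python library for queueing jobs"
print(count_words(sample_text))  # → 9
```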
You do use the excellent [requests][r] package, don't you?

Then, create an RQ queue:
    from redis import Redis
    from rq import Queue

    queue = Queue(connection=Redis())
And enqueue the function call:

    from my_module import count_words_at_url

    job = queue.enqueue(count_words_at_url, 'http://nvie.com')
Scheduling jobs is also similarly easy:
    from datetime import datetime, timedelta

    # Schedule job to run at 9:15, October 10th
    job = queue.enqueue_at(datetime(2019, 10, 10, 9, 15), say_hello)

    # Schedule job to run in 10 seconds
    job = queue.enqueue_in(timedelta(seconds=10), say_hello)
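The scheduling calls above take plain standard-library `datetime` values, so the arguments can be inspected on their own, with no queue or Redis connection needed:

```python
from datetime import datetime, timedelta

# The absolute timestamp passed to enqueue_at ...
run_at = datetime(2019, 10, 10, 9, 15)
# ... and the relative delay passed to enqueue_in.
delay = timedelta(seconds=10)

print(run_at.isoformat())     # 2019-10-10T09:15:00
print(delay.total_seconds())  # 10.0
```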
Retrying failed jobs is also supported:
    from rq import Retry

    # Retry up to 3 times, failed job will be requeued immediately
    queue.enqueue(say_hello, retry=Retry(max=3))

    # Retry up to 3 times, with configurable intervals between retries
    queue.enqueue(say_hello, retry=Retry(max=3, interval=[10, 30, 60]))
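As a rough sketch of how an interval list like `[10, 30, 60]` reads: the nth retry waits `interval[n-1]` seconds, with the last value reused if retries outnumber entries. The helper below is illustrative only, not RQ's own code:

```python
def wait_before_retry(attempt, intervals):
    # attempt is 1-based; clamp to the last interval when attempts
    # outnumber the configured entries.
    return intervals[min(attempt - 1, len(intervals) - 1)]

intervals = [10, 30, 60]
print([wait_before_retry(n, intervals) for n in (1, 2, 3, 4)])  # [10, 30, 60, 60]
```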
For a more complete example, refer to the [docs][d]. But this is the
essence.
To start executing enqueued function calls in the background, start a
worker from your project's directory:
    $ rq worker --with-scheduler
    *** Listening for work on default
    Got count_words_at_url('http://nvie.com') from default
    *** Listening for work on default
d71859da6f3af597bd79675b240e138587878ce9b693d1bcf681443e2a8193ce        89628 rq-1.16.0-py3-none-any.whl