Python
Max Payne, 2019-05-11 14:19:08

Why is there such a difference in asynchronous request processing?

I have an endpoint built on aiohttp with uvloop, and I'm load-testing it with siege. There are two test cases: in one, asyncio.create_task(func()) is used, which schedules an asynchronous task that is never awaited anywhere, since I don't need its result and it just has to run to completion; in the other, await func() is used.
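For reference, a minimal sketch of what the two variants might look like (the body of func(), the route names, and the port are assumptions for illustration, not the actual code from the question):

import asyncio
import uvloop
from aiohttp import web

async def func():
    # Hypothetical background work; the real func() from the question is not shown.
    await asyncio.sleep(0.1)

async def handler_create_task(request):
    # Variant 1: fire-and-forget. The task is scheduled on the event loop and the
    # response is returned without waiting for it to finish. (Note: with no
    # reference kept, such a task may in principle be garbage-collected mid-run.)
    asyncio.create_task(func())
    return web.Response(text='ok')

async def handler_await(request):
    # Variant 2: the response is sent only after func() has completed.
    await func()
    return web.Response(text='ok')

if __name__ == '__main__':
    uvloop.install()
    app = web.Application()
    app.add_routes([
        web.get('/task', handler_create_task),
        web.get('/await', handler_await),
    ])
    web.run_app(app, port=8080)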
Test results:

Transactions:		        4549 hits
Availability:		      100.00 %
Elapsed time:		       29.72 secs
Data transferred:	        0.01 MB
Response time:		        1.02 secs
Transaction rate:	      153.06 trans/sec
Throughput:		        0.00 MB/sec
Concurrency:		      156.78
Successful transactions:        4549
Failed transactions:	           0
Longest transaction:	        3.23
Shortest transaction:	        0.11

Transactions:		        4638 hits
Availability:		       99.42 %
Elapsed time:		       29.34 secs
Data transferred:	        0.11 MB
Response time:		        0.81 secs
Transaction rate:	      158.08 trans/sec
Throughput:		        0.00 MB/sec
Concurrency:		      128.02
Successful transactions:        4638
Failed transactions:	          27
Longest transaction:	        7.42
Shortest transaction:	        0.14

As you can see, the slowest requests in the second case take longer (the longest request more than twice as long), yet the concurrency and the average response time are lower (the response time by about 20%, which is quite significant), while both the total number of transactions processed during the test and the average number of transactions per second are higher.
Why is this so? Should I use asyncio.create_task everywhere I don't need the result of the task?


1 answer
lega, 2019-05-11
@YardalGedal

1) The difference between these two results is in "Data transferred:" — the second variant transferred 10 times more data, which means more work and therefore fewer rps, i.e. the tests are not equivalent.
2) If you do not need to wait for the result, then don't use await; that way the request can complete faster.
In general, create_task will not give you a big performance gain, and an asynchronous call cannot take 3 or 7 seconds; more likely your code is blocking or slow somewhere, and neither asyncio nor aiohttp has anything to do with it (see the sketch below).
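As a hedged illustration of that point (slow_io() here is a made-up stand-in for a synchronous call such as a blocking DB query or requests.get()): a blocking call inside an async handler stalls the whole event loop, while offloading it to an executor keeps the loop responsive.

import asyncio
import time

def slow_io():
    # Stand-in for blocking work (synchronous DB query, requests.get(), ...).
    time.sleep(3)

async def bad_handler():
    # Blocks the event loop for 3 seconds: no other request is served in the
    # meantime, which is how multi-second latencies show up under load.
    slow_io()

async def good_handler():
    # The blocking call runs in the default thread pool; the loop stays free.
    loop = asyncio.get_running_loop()
    await loop.run_in_executor(None, slow_io)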
P.S.: asyncio + uvloop can deliver up to 100k rps on an average laptop in a single thread, so the framework is not the reason for your 150 rps.
--- asyncio + uvloop + httptools test on an average laptop
Running wrk:
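(The original command is not preserved here; judging by the parameters in the output below, it was presumably something like the following.)

wrk -t2 -c200 -d10s http://localhost:8888/  # assumed from the output below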
Result:

Running 10s test @ http://localhost:8888/
  2 threads and 200 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.59ms  404.50us   6.74ms   58.61%
    Req/Sec    62.18k     7.51k   74.02k    68.00%
  1237743 requests in 10.06s, 47.22MB read
Requests/sec: 123044.52
Transfer/sec:      4.69MB

import asyncio
import uvloop
from httptools import HttpRequestParser

class RpcProtocol(asyncio.Protocol):
    def connection_made(self, transport):
        # A new client connected: remember the transport and create an HTTP
        # parser that will invoke our on_* callbacks.
        self.transport = transport
        self.parser = HttpRequestParser(self)

    def connection_lost(self, exc):
        self.transport = None

    def data_received(self, data):
        # Feed raw bytes into httptools; it calls on_message_complete()
        # once a full request has been parsed.
        self.parser.feed_data(data)

    def on_message_complete(self):
        # Routing could be added here via an on_url() callback that stores
        # the request path, e.g.: if self.url == b'/hello': ...
        response = b'HTTP/1.1 200 OK\r\nContent-Length: 5\r\n\r\nHello'
        self.transport.write(response)

        if not self.parser.should_keep_alive():
            self.transport.close()
            self.transport = None

if __name__ == '__main__':
    loop = uvloop.new_event_loop()
    asyncio.set_event_loop(loop)
    # Serve RpcProtocol on localhost:8888 and run until interrupted.
    server = loop.run_until_complete(
        loop.create_server(RpcProtocol, '127.0.0.1', 8888))
    loop.run_forever()
    loop.close()
