I have a simple web server written in Python using aiohttp. My goal is a server that can receive many file uploads simultaneously and stream them all to disk. This code works, but I'm not sure whether it handles simultaneous connections efficiently:
import pathlib

import click
from aiohttp import web

from src import settings
from src import logger

log = None


async def upload_handler(request):
    """
    POST handler that accepts uploads to any path.
    The files are accepted and saved under the root path
    of the server.
    """
    # You cannot rely on Content-Length if transfer is chunked.
    size = 0
    local_path = settings.WWW_ROOT + request.path
    path_without_file = local_path.rsplit('/', 1)[0]
    pathlib.Path(path_without_file).mkdir(parents=True, exist_ok=True)
    with open(local_path, 'wb') as f:
        while True:
            chunk, is_end_of_http_chunk = await request.content.readchunk()
            if not chunk:
                break
            size += len(chunk)
            f.write(chunk)
    return web.Response(text='%d bytes uploaded' % size, status=201)


@click.command()
@click.option('--port', type=int, default=8000)
@click.option('--log-level', type=click.Choice(['debug', 'info', 'warning', 'error', 'critical']), default='info')
def cli(port, log_level):
    global log
    log = logger.get_default_logger(name='', log_level=log_level)
    app = web.Application(client_max_size=2e9, debug=settings.DEBUG)
    app.router.add_post('/{tail:.*}', upload_handler)
    web.run_app(app, port=port)


if __name__ == '__main__':
    cli()
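For reference, the endpoint can be exercised with aiohttp's own client, which streams an open file object from disk rather than buffering it in memory. This is a minimal sketch, assuming a server running on localhost:8000 and a placeholder local file upload.bin:

import asyncio

import aiohttp


async def upload():
    # Passing an open file object as `data` makes the client stream the
    # request body from disk instead of reading it all into memory first.
    async with aiohttp.ClientSession() as session:
        with open('upload.bin', 'rb') as f:
            async with session.post('http://localhost:8000/uploads/upload.bin',
                                    data=f) as resp:
                print(resp.status, await resp.text())


asyncio.get_event_loop().run_until_complete(upload())

The server should answer with 201 and the byte count, and the file should land under WWW_ROOT/uploads/.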
The aiohttp stuff feels a little magical to me. Is this running on parallel threads? I'm seeing high CPU usage at times.
Are there performance issues here? I see performance problems on the server, but I'm not sure whether they occur here or elsewhere in the stack.
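One way to localize that: asyncio has a built-in debug mode that logs a warning whenever a single callback or coroutine step blocks the event loop for too long. A minimal sketch, assuming it is enabled inside cli() before web.run_app() (set_debug and slow_callback_duration are standard asyncio event-loop attributes):

import asyncio

# With debug mode on, the loop logs every callback or coroutine step that
# blocks it for longer than slow_callback_duration (0.1 s by default).
loop = asyncio.get_event_loop()
loop.set_debug(True)
loop.slow_callback_duration = 0.05  # tighten the threshold to 50 ms

If the synchronous f.write() calls in the handler ever show up in those warnings, the blocking happens here; otherwise the problem is likely elsewhere in the stack.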
aiohttp uses threads for DNS lookups (resolver.py) if aiodns isn't available. Try running with aiodns installed.
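A quick way to verify which resolver aiohttp picks by default (a sketch relying on an internal detail: aiohttp.resolver exposes DefaultResolver, which is the aiodns-backed AsyncResolver when aiodns is importable, and the thread-based ThreadedResolver otherwise):

from aiohttp.resolver import AsyncResolver, DefaultResolver

# True means aiodns was found and DNS lookups run on the event loop;
# False means they fall back to ThreadedResolver's thread pool.
print(DefaultResolver is AsyncResolver)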