
All Questions

1 vote · 1 answer · 29 views

aiomultiprocess: Worker creating two processes instead of one

Starting from one of the two examples in the User Guide ( https://aiomultiprocess.omnilib.dev/en/latest/guide.html ), I began my testing with my own variation: import asyncio import psutil import ...
LudgerH
2 votes · 2 answers · 162 views

Efficient parsing and processing of millions of json objects in Python

I have some working code whose run time I need to improve dramatically, and I am pretty lost. Essentially, I will get zip folders containing tens of thousands of JSON files, each containing ...
user2731076
2 votes · 1 answer · 918 views

Locking a resource in FastAPI using a multiprocessing worker

I would like to build a FastAPI service with one /get endpoint that returns an ML-model inference result. It is pretty easy to implement, but the catch is that I periodically need to update the ...
mehekek
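A minimal sketch of one way to guard shared model state against periodic updates, using an asyncio.Lock. All names here (ModelServer, predict, update_model) and the integer "model" are hypothetical stand-ins, not taken from the question:

```python
import asyncio


class ModelServer:
    """Guards a shared model so inference and periodic reloads don't interleave."""

    def __init__(self) -> None:
        self._lock = asyncio.Lock()
        self._model_version = 0  # hypothetical stand-in for loaded model weights

    async def predict(self, x: int) -> tuple[int, int]:
        # Inference waits while a model update holds the lock
        async with self._lock:
            return x * 2, self._model_version

    async def update_model(self) -> None:
        # New predictions are blocked while the model is swapped out
        async with self._lock:
            await asyncio.sleep(0)  # stand-in for loading new weights
            self._model_version += 1


async def demo() -> tuple[int, int]:
    server = ModelServer()
    await server.update_model()
    return await server.predict(21)


if __name__ == "__main__":
    print(asyncio.run(demo()))  # prints (42, 1)
```

In a real FastAPI app the lock would live on the application state and the update would run in a background task; this only illustrates the locking discipline.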
0 votes · 1 answer · 146 views

Why is threading much faster than asyncio when processing multiple files in my case?

I'm experimenting with asyncio and threading to figure out which mechanism I should choose for processing a large number of files. The experiment is simple: I just want to read the files and add ...
Ames ISU
0 votes · 0 answers · 103 views

Python AsyncIO, multithreading and multiprocessing performance

I'm trying to speed up a piece of code that writes thousands of small files. My first idea was to use asyncio, as writing a file is blocking I/O. I also thought it could be interesting to ...
julpw
0 votes · 1 answer · 2k views

Speeding up a query with SQLAlchemy and asyncio in Python

I'm new to Python and I am trying to do a task that requires querying a MySQL table of 50 million records, joining with another table to match the data, and finally grouping to determine the count of ...
DucNguyen
0 votes · 0 answers · 530 views

How to process queue in asyncio using ProcessPoolExecutor and run_in_executor?

I'm trying to implement a pattern where I'm: using an async library to get data and add it to a queue; using ProcessPoolExecutor to process this data, as the work is CPU-bound. Below is a minimal ...
Bruce Johnson
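A minimal sketch of this queue-plus-executor pattern; cpu_bound and the producer's values are hypothetical stand-ins for the real async library and CPU-bound work:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor


def cpu_bound(n: int) -> int:
    # Hypothetical CPU-heavy work; must be a module-level function
    # so it can be pickled and sent to a worker process
    return sum(i * i for i in range(n))


async def producer(queue: asyncio.Queue) -> None:
    # Stand-in for an async library feeding data into the queue
    for n in (10, 100, 1000):
        await queue.put(n)
    await queue.put(None)  # sentinel: no more work


async def consumer(queue: asyncio.Queue, pool: ProcessPoolExecutor) -> list[int]:
    loop = asyncio.get_running_loop()
    results = []
    while (item := await queue.get()) is not None:
        # Off-load the CPU-bound step to a separate process
        results.append(await loop.run_in_executor(pool, cpu_bound, item))
    return results


async def main() -> list[int]:
    queue: asyncio.Queue = asyncio.Queue()
    with ProcessPoolExecutor() as pool:
        _, results = await asyncio.gather(producer(queue), consumer(queue, pool))
    return results


if __name__ == "__main__":
    print(asyncio.run(main()))
```

This consumer awaits each result in turn; to keep several processes busy at once, submit multiple items and gather the futures instead.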
1 vote · 0 answers · 243 views

Monitor file for read availability using asyncio on Windows

Using asyncio's add_reader() method on Unix systems, I can easily monitor file descriptors, such as pipes, for read availability and invoke a callback once the file is available for reading. For ...
estes
0 votes · 0 answers · 47 views

Coordinating Asynchronous Processes: Managing Data Flow from two functions to control another dependent function

I would just like to ask what approach I need to use to complete my project, which is based on Python. DESCRIPTION: I am developing a mobile robot, and I am using rplidar and uwb for ...
4 votes · 0 answers · 324 views

Python parallelism with concurrent.futures.ProcessPoolExecutor (or multiprocessing.Pool) and AsyncIO

Currently, I'm working on a project that listens to every API request, performs checks on it, and sends it back. Sadly, the API managed to overload my single-threaded asyncio and multi-threading attempts, ...
Red
1 vote · 2 answers · 200 views

How to fetch data asynchronously from a multiprocessing spawn process in Python?

I have a FastAPI app with a single endpoint /generate. This endpoint takes in requests and puts them into a global input_queue. A background spawn process called worker then gets the data from the ...
Uchiha Madara
3 votes · 2 answers · 748 views

How to execute two different functions concurrently in Python?

I have two different functions (funcA and funcB) that I want to execute concurrently to cut down on overall execution time. funcA is an API call that takes somewhere between 5 and 7 seconds. funcB ...
eNeM
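One common approach to running two blocking functions concurrently is asyncio.gather over asyncio.to_thread (Python 3.9+). The funcA and funcB bodies below are hypothetical stand-ins with short sleeps in place of the real 5-7 second calls:

```python
import asyncio
import time


def funcA() -> str:
    # Hypothetical stand-in for a blocking API call
    time.sleep(0.2)
    return "A done"


def funcB() -> str:
    # Hypothetical stand-in for the second blocking function
    time.sleep(0.1)
    return "B done"


async def main() -> tuple[str, str]:
    # Run both blocking functions concurrently on worker threads;
    # total elapsed time is roughly max(funcA, funcB), not their sum
    a, b = await asyncio.gather(
        asyncio.to_thread(funcA),
        asyncio.to_thread(funcB),
    )
    return a, b


if __name__ == "__main__":
    print(asyncio.run(main()))
```

Threads work here because both functions spend their time blocked on I/O; CPU-bound functions would instead call for ProcessPoolExecutor.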
0 votes · 1 answer · 144 views

Create and call a function that "asynchronously" updates a file in a loop until a second function, started in parallel, is done

I'm new to multiprocessing / threading and asyncio in Python and I'd like to parallelise two function calls so that the first function updates a healthcheck text file in an endless loop with 5 min. ...
MaxU - stand with Ukraine
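One way to structure this, sketched with made-up names and sub-second intervals standing in for the 5-minute loop: a multiprocessing.Event lets the heartbeat loop stop as soon as the main task signals completion.

```python
import multiprocessing as mp
import os
import tempfile
import time


def heartbeat(path, done, interval=0.05):
    # Touch the healthcheck file until the worker signals completion
    while not done.is_set():
        with open(path, "w") as f:
            f.write(str(time.time()))
        done.wait(interval)  # sleep, but wake early if done is set


def work(done):
    # Hypothetical stand-in for the long-running second function
    time.sleep(0.2)
    done.set()


if __name__ == "__main__":
    done = mp.Event()
    path = os.path.join(tempfile.gettempdir(), "healthcheck.txt")
    hb = mp.Process(target=heartbeat, args=(path, done))
    hb.start()
    work(done)   # run the real task in the main process
    hb.join()    # heartbeat exits promptly once done is set
```

done.wait(interval) doubles as the sleep between file updates, so the heartbeat process never overshoots the end of the main task by a full interval.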
0 votes · 1 answer · 78 views

Multiprocessing backend blocking asyncio frontend

In a general GUI-based script (framework) I have a Tkinter-based GUI, which is run asynchronously (this works). When I press the 'Start' button, processing starts, and as it is CPU-heavy, this is done using ...
Gyula Sámuel Karli
3 votes · 1 answer · 1k views

How to make each process in multiprocessing.Pool handle work asynchronously

I have a function that takes a long time to run and has sizable IO-bound and CPU-bound components. I need to call this function many times in a for loop, more times than the number of cores on my ...
YudoSmootho
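A rough sketch of the idea behind the title: each multiprocessing.Pool worker runs its own asyncio event loop, so the IO-bound parts of a batch overlap while batches run in parallel across processes. The batch values and io_part body are hypothetical:

```python
import asyncio
from multiprocessing import Pool


async def io_part(x: int) -> int:
    # Hypothetical IO-bound portion of the long-running function
    await asyncio.sleep(0.01)
    return x


def worker(batch: list) -> list:
    # Each pool process runs a private event loop to overlap the
    # IO-bound parts of all items in its batch
    async def run():
        return await asyncio.gather(*(io_part(x) for x in batch))
    return asyncio.run(run())


if __name__ == "__main__":
    batches = [[1, 2], [3, 4], [5, 6]]
    with Pool(processes=2) as pool:
        # CPU parallelism across processes, async concurrency within each
        results = pool.map(worker, batches)
    print(results)  # prints [[1, 2], [3, 4], [5, 6]]
```

Chunking the work into more batches than processes keeps all cores busy while each worker interleaves its own IO waits.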
