All Questions
Tagged with python-asyncio python-requests
148 questions
0 votes · 2 answers · 83 views
How to iterate an API loop concurrently with asyncio in Python
My code sends two API GET requests and compares the results. I'm iterating over a dictionary in a for loop to determine where to point each GET request. Sending the GET, waiting for the reply, then performing math, ...
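A common shape for this is one coroutine per endpoint, all scheduled together with asyncio.gather. A minimal sketch using aiohttp, where the endpoints dictionary is a hypothetical stand-in for the asker's dict:

import asyncio
import aiohttp

# Hypothetical mapping; substitute the dictionary you iterate over.
endpoints = {"a": "https://example.com/api/a", "b": "https://example.com/api/b"}

async def fetch(session, name, url):
    # Send the GET and parse the JSON body.
    async with session.get(url) as response:
        return name, await response.json()

async def main():
    async with aiohttp.ClientSession() as session:
        # Schedule every GET at once rather than awaiting one by one.
        results = dict(await asyncio.gather(
            *(fetch(session, name, url) for name, url in endpoints.items())
        ))
        # Compare the results / do the math here.
        print(results)

asyncio.run(main())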
0 votes · 2 answers · 152 views
Timeout Issue with aiohttp but not with Requests
I'm trying to fetch a webpage in Python using two different methods: requests and aiohttp. The requests method works fine, but the aiohttp method results in a timeout. Here's the code:
import asyncio
...
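Two things worth ruling out in cases like this are aiohttp's session timeout (300 seconds total by default) and missing headers, since some servers stall connections that lack a browser-like User-Agent. A minimal sketch with both set explicitly, using a placeholder URL:

import asyncio
import aiohttp

async def main():
    # Explicit timeout instead of the 300-second default.
    timeout = aiohttp.ClientTimeout(total=30)
    # Some servers hang on clients without a User-Agent header.
    headers = {"User-Agent": "Mozilla/5.0"}
    async with aiohttp.ClientSession(timeout=timeout, headers=headers) as session:
        async with session.get("https://example.com") as response:  # placeholder
            print(response.status)

asyncio.run(main())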
4 votes · 4 answers · 599 views
How to perform single synchronous and multiple asynchronous requests in Python?
I'm working on a Python script where I need to make an initial request to obtain an ID. Once I have the ID, I need to make several additional requests to get data related to that ID. I understand that ...
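One way to structure this is to await the ID request alone, then fan the dependent requests out with asyncio.gather. A sketch using aiohttp, with hypothetical endpoint URLs:

import asyncio
import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        # Step 1: a single awaited request to obtain the ID (hypothetical endpoint).
        async with session.get("https://example.com/api/id") as response:
            the_id = (await response.json())["id"]

        async def fetch(url):
            async with session.get(url) as response:
                return await response.json()

        # Step 2: the dependent requests run concurrently.
        urls = [f"https://example.com/api/data/{the_id}?page={n}" for n in range(5)]
        results = await asyncio.gather(*(fetch(u) for u in urls))
        print(len(results))

asyncio.run(main())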
0 votes · 0 answers · 105 views
Python requests GET raises an "Invalid param" error when using data from a websocket result
import threading
import websockets

token_info_list = []
list_lock = threading.Lock()

async def subscribe():
    global token_info_list
    async with websockets.connect(pump_new_pair_url) as ws:
        # Subscribing to token creation ...
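An "Invalid param" from the follow-up GET often means the websocket frame was used as a raw string rather than decoded first. A sketch of decoding each message before touching its fields; the "mint" field name is hypothetical and should match the actual payload:

import json

async def handle(ws):
    async for message in ws:
        # Frames arrive as JSON strings; decode before using fields.
        data = json.loads(message)
        mint = data.get("mint")  # hypothetical field name
        if mint is not None:
            with list_lock:
                token_info_list.append(mint)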
0 votes · 0 answers · 37 views
Asyncio only processes 15 requests at a time
I want to send 100 requests at the same time with the code below. The test endpoint is also mine; it simply sleeps for 5 seconds and returns 200
import asyncio
import aiohttp
import datetime
async def ...
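One knob to check is the connector: aiohttp caps concurrent connections at the connector level (100 total by default), and a stricter limit there, or on the server side, will make requests go out in visible batches. A sketch that sets the limits explicitly, with a placeholder endpoint:

import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as response:
        return response.status

async def main():
    # Explicit connection caps; the defaults are limit=100, limit_per_host=0.
    connector = aiohttp.TCPConnector(limit=100, limit_per_host=0)
    async with aiohttp.ClientSession(connector=connector) as session:
        tasks = [fetch(session, "https://example.com/test") for _ in range(100)]  # placeholder
        print(await asyncio.gather(*tasks))

asyncio.run(main())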
1 vote · 2 answers · 737 views
Sending many thousands of requests with asyncio
I am trying to create a script that can send a lot of requests to a website, in the range of 100,000 to 1,000,000. I am trying to use asyncio and aiohttp to do so, but somehow I can't figure out how to do it....
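At that volume the usual pattern is to bound concurrency with a semaphore, so only a fixed number of sockets are in flight at once. A sketch with a placeholder URL:

import asyncio
import aiohttp

CONCURRENCY = 500  # tune to what the target and your machine tolerate

async def fetch(session, semaphore, url):
    # The semaphore caps how many requests are in flight at a time.
    async with semaphore:
        async with session.get(url) as response:
            return response.status

async def main():
    semaphore = asyncio.Semaphore(CONCURRENCY)
    async with aiohttp.ClientSession() as session:
        tasks = [fetch(session, semaphore, "https://example.com") for _ in range(100_000)]
        statuses = await asyncio.gather(*tasks)
        print(statuses.count(200))

asyncio.run(main())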
0 votes · 1 answer · 324 views
asyncio raises errors while bulk-requesting URLs in a loop
I have a script that takes x URLs from a file at a time (for example: file lines 0-5, then lines 50-100, etc.), then bulk-requests those x URLs using asyncio. But after one iteration it raises errors.
...
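A frequent cause of "works once, then errors" is tearing down the event loop or ClientSession between batches. A sketch that drives every batch from a single asyncio.run call and one session; the chunks helper and filename are hypothetical:

import asyncio
import aiohttp

def chunks(items, size):
    # Hypothetical helper: yield the URLs in slices of `size`.
    for i in range(0, len(items), size):
        yield items[i:i + size]

async def fetch(session, url):
    async with session.get(url) as response:
        return response.status

async def main():
    with open("urls.txt") as f:  # placeholder filename
        urls = [line.strip() for line in f if line.strip()]
    # One session for all batches; re-creating loops or sessions per
    # batch is a common source of "event loop is closed" errors.
    async with aiohttp.ClientSession() as session:
        for batch in chunks(urls, 50):
            print(await asyncio.gather(*(fetch(session, u) for u in batch)))

asyncio.run(main())  # a single run drives every iteration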
1 vote · 1 answer · 285 views
AWS Lambda: Async imports and data query to reduce latency
Let's say I have this Lambda function, which needs to do three things: 1/ import some "heavy modules" (e.g., Pandas, NumPy), 2/ request some nontrivial volume of data, and 3/ perform some ...
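One way to overlap (1) with (2) is to push the imports into a worker thread while the event loop awaits the data. A sketch using asyncio.to_thread (Python 3.9+), with a placeholder data URL:

import asyncio
import importlib
import aiohttp

def load_heavy_modules():
    # Runs in a thread so the imports don't block the event loop.
    return {m: importlib.import_module(m) for m in ("pandas", "numpy")}

async def fetch_data(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.read()

async def handler_body():
    # The import work and the network wait proceed concurrently.
    modules, payload = await asyncio.gather(
        asyncio.to_thread(load_heavy_modules),
        fetch_data("https://example.com/data"),  # placeholder URL
    )
    pd = modules["pandas"]
    # ... step 3: process `payload` with pd here ...
    return len(payload)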
0 votes · 1 answer · 242 views
Adding a timeout to an asyncio aiohttp request
I am currently using the following code to query an API for data
import asyncio

async def get_data(session, ticker):
    while True:
        try:
            async with session.get(url=f'api.exec.com'...
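aiohttp accepts a timeout per request as well as per session. A sketch of the per-request form, with a placeholder URL; on expiry it raises asyncio.TimeoutError, which the except clause can catch:

import asyncio
import aiohttp

async def get_data(session, ticker):
    try:
        # Per-request timeout; overrides the session default.
        timeout = aiohttp.ClientTimeout(total=10)
        async with session.get(f"https://example.com/{ticker}",  # placeholder
                               timeout=timeout) as response:
            return await response.json()
    except asyncio.TimeoutError:
        return None  # retry or give up here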
0 votes · 0 answers · 398 views
How to create a progress bar with aiometer's run_all method?
I've been creating a pipeline from an API using httpx, asyncio and aiometer to limit the number of coroutines running at once.
I know that it's possible to create an async progress bar for asyncio's ...
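Since aiometer.run_all doesn't expose progress hooks as far as I know, one workaround is to wrap each task so it advances a tqdm bar as it completes. A sketch with functools.partial and a hypothetical fetch coroutine:

import asyncio
import functools
import aiometer
from tqdm import tqdm

async def fetch(url):
    # Hypothetical task body; replace with the real httpx call.
    await asyncio.sleep(0.1)
    return url

async def tracked(pbar, url):
    result = await fetch(url)
    pbar.update(1)  # advance the bar as each task finishes
    return result

async def main():
    urls = [f"https://example.com/{i}" for i in range(50)]  # placeholders
    with tqdm(total=len(urls)) as pbar:
        results = await aiometer.run_all(
            [functools.partial(tracked, pbar, url) for url in urls],
            max_at_once=10,
        )
    print(len(results))

asyncio.run(main())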
0 votes · 0 answers · 293 views
Python3 - asyncio versus requests - performance
I have recently started looking at asyncio in Python 3 and wanted to test it out. I am not able to understand why the asyncio code runs slower than the threading-based code.
It is quite likely that I ...
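A likely culprit in comparisons like this is calling requests (a blocking library) from inside coroutines, which serializes the whole event loop. A sketch of the slow pattern next to a thread-backed fix, with a placeholder URL:

import asyncio
import requests

URL = "https://example.com"  # placeholder

async def blocking_fetch():
    # requests.get blocks the event loop, so gather runs these
    # one after another: the slow pattern.
    return requests.get(URL).status_code

async def threaded_fetch():
    # Pushing the blocking call into a thread restores concurrency.
    return (await asyncio.to_thread(requests.get, URL)).status_code

async def main():
    await asyncio.gather(*(blocking_fetch() for _ in range(5)))   # sequential
    await asyncio.gather(*(threaded_fetch() for _ in range(5)))   # concurrent

asyncio.run(main())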
0 votes · 2 answers · 196 views
Converting an aiohttp script to asyncio + requests (aiohttp not working on Ubuntu while asyncio + requests works)
I am using the following script to do queries on a website.
It works on macOS; however, it does not work on Ubuntu. I have tried requests and it works; I also tried a simple asyncio + requests with ...
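The usual way to pair asyncio with requests is to run the blocking calls in threads, which keeps requests' behavior while regaining concurrency. A sketch using asyncio.to_thread with placeholder URLs:

import asyncio
import requests

async def fetch(url):
    # Run the blocking requests call in the default thread pool.
    return await asyncio.to_thread(requests.get, url, timeout=10)

async def main():
    urls = [f"https://example.com/?q={i}" for i in range(10)]  # placeholders
    responses = await asyncio.gather(*(fetch(u) for u in urls))
    print([r.status_code for r in responses])

asyncio.run(main())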
0 votes · 2 answers · 613 views
How to add a try-until-success loop inside aiohttp requests
I have the following code,
import aiohttp
import asyncio

async def get_data(session, x):
    try:
        async with session.get(url=f'https://api.abc.com/{x}') as response:
            data = await ...
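A bounded variant of try-until-success wraps the request in a retry loop with backoff. A sketch built around the excerpt's placeholder URL:

import asyncio
import aiohttp

async def get_data(session, x, retries=5):
    for attempt in range(retries):
        try:
            async with session.get(f"https://api.abc.com/{x}") as response:
                response.raise_for_status()
                return await response.json()
        except (aiohttp.ClientError, asyncio.TimeoutError):
            # Back off a little longer after each failure.
            await asyncio.sleep(2 ** attempt)
    raise RuntimeError(f"giving up on {x} after {retries} attempts")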
0 votes · 1 answer · 329 views
Unable to get multiple request responses with aiohttp
I am trying to pull multiple request responses using aiohttp, but I am getting an attribute error.
My code looks like this; I had to obscure it because I am using a private API.
import aiohttp
import ...
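Attribute errors here often come from gathering the bare session.get(...) coroutines and then treating the results as finished responses. A sketch that reads each body inside its own coroutine, with a placeholder endpoint:

import asyncio
import aiohttp

async def fetch(session, url):
    # Read the body while the response is open; returning the raw
    # response and reading it later is a common pitfall.
    async with session.get(url) as response:
        return await response.json()

async def main():
    urls = [f"https://example.com/items/{i}" for i in range(10)]  # placeholders
    async with aiohttp.ClientSession() as session:
        results = await asyncio.gather(*(fetch(session, u) for u in urls))
    print(len(results))

asyncio.run(main())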
0 votes · 0 answers · 22 views
Fetching data from a DB URL that has multiple folders and files in it, then saving it to a zipped file in Python
Fetching a huge amount of data (around 60 GB) using the requests module, from a URL that points to a DB with multiple folders and files in it, then saving it to a zipped file in Python.
Right now I am testing with ...
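For a payload that size, requests can stream the body so the 60 GB never sits in memory. A sketch that writes chunks straight to disk, with a placeholder URL and filename:

import requests

url = "https://example.com/db-export"  # placeholder
with requests.get(url, stream=True, timeout=60) as response:
    response.raise_for_status()
    with open("export.zip", "wb") as f:
        # Stream in 1 MiB chunks instead of loading it all into RAM.
        for chunk in response.iter_content(chunk_size=1024 * 1024):
            f.write(chunk)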