All Questions
15 questions
1 vote · 0 answers · 930 views
How to fix "ConnectionResetError" and "aiohttp.client_exceptions.ClientConnectorError" while web scraping using asyncio and aiohttp?
I am learning web scraping with asyncio and aiohttp. The scraper seems to work a few times, then shows errors for several tries, and then works again. The same URL opens normally in a browser ...
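These intermittent ConnectionResetError / ClientConnectorError failures are typically handled by retrying the request a few times with a short pause. A minimal sketch of that idea, with a placeholder URL and arbitrary retry/delay values:

import asyncio
import aiohttp

async def fetch_with_retry(session, url, retries=3, delay=2):
    # Retry on the connection-level errors mentioned in the question.
    for attempt in range(retries):
        try:
            async with session.get(url) as response:
                return await response.text()
        except (aiohttp.ClientConnectorError, ConnectionResetError):
            if attempt == retries - 1:
                raise
            await asyncio.sleep(delay)

async def main():
    async with aiohttp.ClientSession() as session:
        html = await fetch_with_retry(session, "https://example.com")
        print(len(html))

asyncio.run(main())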
0 votes · 2 answers · 5k views
How to use asyncio, aiohttp web scraper with fastapi?
I am learning web scraping using asyncio and aiohttp with BeautifulSoup. I want to create a RESTful API that takes user input, scrapes the data, and then returns the response in JSON format. This is how my ...
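A common pattern for this kind of question is an async FastAPI endpoint that runs the aiohttp request inside it and returns a dict, which FastAPI serializes to JSON. A minimal sketch, assuming a /scrape endpoint that takes the target URL as a query parameter:

import aiohttp
from bs4 import BeautifulSoup
from fastapi import FastAPI

app = FastAPI()

@app.get("/scrape")
async def scrape(url: str):
    # Fetch the page asynchronously and return parsed data as JSON.
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            html = await response.text()
    soup = BeautifulSoup(html, "html.parser")
    return {"title": soup.title.string if soup.title else None}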
6 votes · 0 answers · 1k views
How to use only one session for all the lifecycle of my application with Aiohttp?
I'm making a Python module for interacting with an API. I'd like it to be fast, so I chose to use asyncio and aiohttp. I'm quite new to async programming and I'm not quite sure how to reuse the same ...
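One way this is usually approached is to wrap the session in a client object that creates it lazily (inside the running event loop) and closes it on shutdown. A rough sketch; the class and method names are purely illustrative:

import aiohttp

class APIClient:
    """Holds a single ClientSession for the lifetime of the client."""

    def __init__(self, base_url):
        self.base_url = base_url
        self._session = None  # created lazily, inside the running event loop

    async def _get_session(self):
        if self._session is None or self._session.closed:
            self._session = aiohttp.ClientSession()
        return self._session

    async def get(self, path):
        session = await self._get_session()
        async with session.get(self.base_url + path) as response:
            return await response.json()

    async def close(self):
        if self._session is not None:
            await self._session.close()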
1 vote · 0 answers · 162 views
Python Requests made in parallel but receives responses sequentially
This is the first time I am posting a question. Sorry if the way I present my question is not clear. Basically, I spent the whole day trying to figure out why I am not able to receive responses ...
1 vote · 0 answers · 2k views
Run aiohttp requests in batches
So I've done some research on this and I see there is a fair amount of documentation, but I am quite new to asynchronous programming and I am not grasping exactly what needs to be done.
I need to ...
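One possible answer here is to split the URL list into fixed-size chunks and gather each chunk in turn (a semaphore is the other common option). A sketch of the chunking approach, with placeholder URLs and batch size:

import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.text()

async def fetch_in_batches(urls, batch_size=10):
    results = []
    async with aiohttp.ClientSession() as session:
        # Process the URL list in fixed-size batches, one batch at a time.
        for i in range(0, len(urls), batch_size):
            batch = urls[i:i + batch_size]
            results += await asyncio.gather(*(fetch(session, u) for u in batch))
    return results

urls = ["https://example.com"] * 25  # placeholder URLs
pages = asyncio.run(fetch_in_batches(urls))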
0 votes · 0 answers · 876 views
Cycling through IP addresses in Asynchronous Webscraping
I am using relatively cookie-cutter code to asynchronously request the HTML from a few hundred URLs that I have scraped with another piece of code. The code works perfectly.
Unfortunately, this is ...
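For cycling through IP addresses, one approach is to pass a different proxy to each request via aiohttp's per-request proxy= argument. A sketch of that idea, with placeholder proxy addresses:

import asyncio
import itertools
import aiohttp

# Placeholder proxy addresses; aiohttp accepts HTTP proxy URLs via proxy=.
PROXIES = itertools.cycle([
    "http://proxy1.example.com:8080",
    "http://proxy2.example.com:8080",
])

async def fetch(session, url):
    async with session.get(url, proxy=next(PROXIES)) as response:
        return await response.text()

async def main(urls):
    async with aiohttp.ClientSession() as session:
        return await asyncio.gather(*(fetch(session, u) for u in urls))

asyncio.run(main(["https://example.com"] * 5))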
2 votes · 1 answer · 903 views
Can't use HTTPS proxies while reusing the same session within a script built upon asyncio
I'm trying to use an HTTPS proxy within async requests, making use of the asyncio library. When it comes to using an HTTP proxy, there is clear instruction here, but I get stuck in the case of an HTTPS proxy. ...
2 votes · 1 answer · 909 views
How to use asyncio and aiohttp instead of a for loop?
My code works this way, but its speed is very slow because of the for loops. Can you help me make it work with aiohttp and asyncio?
def field_info(field_link):
    response = requests.get(...
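Since the excerpt is cut off, here is only a rough sketch of how such a blocking loop is typically converted: the requests.get() call becomes an aiohttp coroutine and all links are fetched concurrently with asyncio.gather. The field_info/field_link names follow the excerpt; the URLs are placeholders:

import asyncio
import aiohttp

async def field_info(session, field_link):
    # Async counterpart of the blocking requests.get() call in the excerpt.
    async with session.get(field_link) as response:
        return await response.text()

async def main(field_links):
    async with aiohttp.ClientSession() as session:
        # All links are fetched concurrently instead of one per loop iteration.
        return await asyncio.gather(*(field_info(session, link) for link in field_links))

results = asyncio.run(main(["https://example.com/field/1", "https://example.com/field/2"]))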
1 vote · 1 answer · 3k views
My script encounters an error when it is supposed to run asynchronously
I've written a script in Python using asyncio in association with the aiohttp library to parse the names out of pop-up boxes initiated upon clicking on contact info buttons of different agency ...
1 vote · 1 answer · 354 views
Script performs very slowly even when it runs asynchronously
I've written a script in asyncio in association with the aiohttp library to parse the content of a website asynchronously. I've tried to apply the logic within the following script the way it is usually ...
12 votes · 1 answer · 20k views
Fetching multiple urls with aiohttp in python
In a previous question, a user suggested the following approach for fetching multiple URLs (API calls) with aiohttp:
import asyncio
import aiohttp
url_list = ['https://api.pushshift.io/reddit/search/...
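The approach referred to here is generally a small fetch coroutine combined with asyncio.gather over the URL list. A sketch along those lines, with the query string left as a placeholder since the original list is truncated:

import asyncio
import aiohttp

async def fetch(session, url):
    async with session.get(url) as response:
        return await response.json()

async def fetch_all(urls):
    async with aiohttp.ClientSession() as session:
        # Schedule one fetch per URL and run them concurrently.
        return await asyncio.gather(*(fetch(session, url) for url in urls))

urls = ["https://api.pushshift.io/reddit/search/comment/?q=python"]  # placeholder query
data = asyncio.run(fetch_all(urls))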
2 votes · 0 answers · 3k views
Load zip file from url with asyncio and aiohttp
How can I load a ZIP file with a GET request?
I use asyncio and aiohttp in my Python app.
This is my code:
async def fetch_page(session, url):
    with aiohttp.Timeout(10):
        async with session....
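For binary content such as a ZIP archive, the usual answer is response.read() to get the raw bytes (the aiohttp.Timeout context manager in the excerpt is from an older API; current versions take a timeout argument instead). A sketch, with a placeholder URL:

import asyncio
import io
import zipfile
import aiohttp

async def fetch_zip(session, url):
    # read() returns the raw bytes, which can then be opened as a ZIP archive.
    async with session.get(url, timeout=aiohttp.ClientTimeout(total=10)) as response:
        data = await response.read()
    return zipfile.ZipFile(io.BytesIO(data))

async def main():
    async with aiohttp.ClientSession() as session:
        archive = await fetch_zip(session, "https://example.com/archive.zip")
        print(archive.namelist())

asyncio.run(main())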
26 votes · 2 answers · 12k views
asyncio web scraping 101: fetching multiple urls with aiohttp
In an earlier question, one of the authors of aiohttp kindly suggested a way to fetch multiple URLs with aiohttp using the new async with syntax from Python 3.5:
import aiohttp
import asyncio
async def fetch(...
23 votes · 1 answer · 17k views
Fetching multiple urls with aiohttp in Python 3.5
Since Python 3.5 introduced async with, the syntax recommended in the aiohttp docs has changed. Now, to get a single URL they suggest:
import aiohttp
import asyncio
async def fetch(session, url):
...
5 votes · 1 answer · 2k views
What is the equivalent method of request.iter_content() in aiohttp?
I am writing a small web scraper which gets a large number of images from a specific site. However, the I/O speed was slow, so I googled and found asyncio and aiohttp to deal with the I/O-bound operations ...
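The closest aiohttp counterpart to requests' Response.iter_content() is response.content.iter_chunked(), which yields the body in chunks and suits streaming large images to disk. A sketch, with placeholder URL, file name, and chunk size:

import asyncio
import aiohttp

async def download_image(session, url, path, chunk_size=1024):
    # response.content.iter_chunked() is aiohttp's analogue of
    # requests' Response.iter_content(): it yields the body in chunks.
    async with session.get(url) as response:
        with open(path, "wb") as f:
            async for chunk in response.content.iter_chunked(chunk_size):
                f.write(chunk)

async def main():
    async with aiohttp.ClientSession() as session:
        await download_image(session, "https://example.com/image.jpg", "image.jpg")

asyncio.run(main())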