
I'm using a Bash script to retrieve the Spotify album thumbnail of whatever I'm listening to at the moment and show it as an image in Hyprlock.

For this particular case, I'm using the command curl to retrieve the album cover image and store it in a separate directory.

...
if [ ! -f "stored_file.jpg" ]; then
    curl "$url" -so stored_file.jpg
    echo "stored_file.jpg"
...

The thing is, whenever this condition is met, curl downloads the image but causes a lag spike that affects all of the other widgets I implemented, which is not ideal.

I wanted to know if there is a way to optimize curl, or to use a similar command to download the image from the URL without the performance hit. So far I've managed to limit calls to curl as much as possible so the lag isn't constant, but it still stalls everything else far too often.

  • IDK what can cause a lag when downloading a small image file but you can try limiting curl's bandwidth with --limit-rate
    – Fravadona
    Commented Dec 13, 2024 at 10:18
  • I assume it's not a bandwidth problem; the script just executes curl and lags everything else for ~2 seconds. This is more problematic when the condition is met constantly, but I've managed to patch some things here and there to prevent this. Commented Dec 13, 2024 at 10:27
  • Save the list of cover image urls and don't download them until after the audio files are downloaded. Then they won't delay the audio files.
    – Sotto Voce
    Commented Dec 13, 2024 at 11:14

1 Answer


The solution here is probably not speeding up curl; this sounds like something you want to do in the background, not in a blocking manner.

Anyway, curl is not the speed-limiting factor here.

You call it for each image separately, which means it has to re-establish an HTTPS connection each time. And what actually limits the speed of that connection setup and of the download itself is probably the server, which throttles requests to deter scrapers.
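
Purely to illustrate that connection-setup cost, and not as a recommendation for the lock-screen script itself: if several URLs on the same host are handed to a single curl invocation, curl reuses the connection instead of doing a fresh TLS handshake per image. The URLs and output names below are made up.

# Two downloads, one curl invocation, one HTTPS connection to the same host.
# URLs and output names are placeholders.
curl -s \
  -o cover1.jpg "https://i.scdn.co/image/aaaaaaaa" \
  -o cover2.jpg "https://i.scdn.co/image/bbbbbbbb"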

So there's not really much you can do on the speeding-up side if you stick with the idea of doing this in a shell script, rather than having a daemon run in the background that keeps an HTTPS connection alive and handles a longer series of requests. Note that even that doesn't solve throttling by the server.

So, really, think about your system architecture. Do you need the image to have finished downloading before the next steps can run? If not, do the download in the background and have some kind of signalling for when it's done, so that you run as many tasks in parallel as possible and only wait when a result is needed and isn't ready yet. Or, in the case of album covers, you could simply carry on with everything else: if the album cover comes in late, you can still update the display once it's there.
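
As a minimal sketch of that, assuming $url is already set as in the question and that the Hyprlock image widget simply uses whatever path the script prints (the cache path below is a placeholder):

cover="$HOME/.cache/hyprlock/cover.jpg"   # hypothetical cache location
mkdir -p "${cover%/*}"                    # make sure the cache directory exists

if [ ! -f "$cover" ]; then
    # Fetch in the background so the script returns immediately and the
    # other widgets keep updating. Download to a temp file and rename it,
    # so the final path only ever holds a complete image.
    ( curl -sfo "$cover.part" "$url" && mv "$cover.part" "$cover" ) &
fi

# Print the path either way; Hyprlock picks the image up once the file exists.
echo "$cover"

The subshell plus & detaches the download from the rest of the script, and the mv is the simplest possible form of signalling: the file only exists once the image is complete.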

However, shells don't give you convenient IPC, condition variables, semaphores, thread-safe containers, or any of the other primitives that would make this kind of coordination easy. My suspicion is that writing this in anything but bash will make your life a lot easier.
