Slow Browsing Speeds When Visiting the Official Website

Today at around 12:30 EDT, I wanted to look at some documentation on UPnP but was slightly frustrated by how slowly the site loaded. Every link I clicked took around 10-15 seconds to load fully, images included. With the exception of the link to All Firmware Images for OpenWrt 24.10.0, all the pages were slow. As of writing this post, I am no longer having these issues, but this isn't my first such experience this year or the previous one.

Does anyone else experience slow loading on the OpenWrt website? Are there any reasons why these slowdowns occur?

'Welcome' to wannabe AI companies terrorizing the web to let their braindead LLMs 'learn' the whole internet, killing servers in the process.

Disclaimer: I don't have any deeper information, but exactly this is happening all the time: bot(net) traffic hammering the servers. The wiki, gitweb and forum tend to be affected most, since they serve dynamically created webpages with many recursive links, which drives up server load. If you're running infrastructure on a budget, there is very little you can do - even aggressive blocking (if you happen to follow it in real time) only goes so far.

I saw an answer like that somewhere on this forum or Reddit. (Can't find the link)

What do you mean by this? I'm just a regular user who enjoys the customization of OpenWrt, not a sysadmin at all.

There are only two ways to defeat bots hammering your server:

  • aggressive blocking, which as mentioned only goes so far (and requires someone to be awake around the clock to refine the blocks in real time)
  • throwing additional hardware (servers) at it

The former is limited by human resources, the latter by money (rent for additional servers), neither of which is unlimited for open-source projects. You have to make do with what you have at your disposal.
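Just to illustrate what "aggressive blocking" can look like at the firewall level, here is a minimal sketch using nftables. The address range is a documentation example and this is not the project's actual setup - the real block lists change constantly, which is exactly the human-resource problem:

    # hypothetical nftables block list for abusive crawler networks
    nft add table inet filter
    nft add set inet filter badbots '{ type ipv4_addr; flags interval; }'
    # add whichever networks are currently hammering the server (example range)
    nft add element inet filter badbots '{ 192.0.2.0/24 }'
    nft add chain inet filter input '{ type filter hook input priority 0; policy accept; }'
    nft add rule inet filter input ip saddr @badbots drop

The catch is the one described above: someone has to keep that set up to date by hand, around the clock, as the bots rotate through new address ranges.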


Thank you for the clarification.

I heard about Anubis the other day. It helps defend against bot activity. Maybe it could be helpful: https://github.com/TecharoHQ/anubis

Crawl-delay: 1
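That is, in robots.txt. Crawl-delay is a non-standard but widely recognized directive; a minimal sketch of where it would go (only well-behaved crawlers honour any of this):

    # hypothetical robots.txt entry
    User-agent: *
    Crawl-delay: 1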

Yes, we feel your pain. FWIW, here are my feelings on the subject.

I think Anubis is unnecessary at this time - it would be an additional configuration profile to manage. We currently handle a lot of the crawlers, scrapers, bots, AI/LLM indexers and a number of very targeted DDoS attacks from thousands of IP addresses using the native capabilities of nginx. Additionally, Anubis puts a large requirement on the client, which is something we don't need to add to the list of things to troubleshoot.
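To give an idea of what "native capabilities of nginx" can mean in practice, here is a minimal sketch of per-IP request throttling plus user-agent blocking. The rates, zone name and user-agent pattern are made up for illustration; this is not the project's real configuration:

    # hypothetical nginx throttling sketch, not the actual config
    events {}

    http {
        # one shared zone tracking request rate per client IP
        limit_req_zone $binary_remote_addr zone=perip:10m rate=2r/s;

        server {
            listen 80;

            location / {
                # allow short bursts, reject the excess with 503
                limit_req zone=perip burst=10 nodelay;
            }

            # turn away an obviously abusive user agent outright
            if ($http_user_agent ~* "ExampleBadBot") {
                return 403;
            }
        }
    }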

Setting Crawl-delay: does no good, as most offenders don't bother with robots.txt.

No matter what tools are deployed, it is still a game of "whack-a-mole".


Are there patterns to the IPs? You could try reporting bad bots to whoever owns the IPs. It is probably a lost cause, but it might be worth a try. If nothing else, you could set up banning and throttling for repeat offenders.
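On the banning/throttling idea: one common way to automate it is fail2ban watching the web server log and banning repeat offenders, e.g. clients that keep tripping nginx's limit_req. A rough sketch of such a jail - the path and thresholds are guesses, not the project's setup:

    # hypothetical /etc/fail2ban/jail.d/nginx-req-limit.conf
    [nginx-limit-req]
    enabled  = true
    filter   = nginx-limit-req
    logpath  = /var/log/nginx/error.log
    findtime = 600
    maxretry = 10
    bantime  = 86400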

I would recommend against using Anubis. It is not terribly well designed, and even though it is a start, there is a lot it could do better. Interestingly enough, there are a few other ideas out there for solving this problem:

If you ever find a solution, please post about it. Right now the options seem to be Fastly, Cloudflare and other proprietary solutions, which aren't ideal.