Caching in LEDE

Is there a good and stable solution for caching frequently used web resources?

With the increasing prevalence of SSL/HTTPS, caching has become rather ineffective, but there are multiple proxies packaged for OpenWrt.

It's even less effective because of the increased share of dynamic content on the modern web.


In addition to the things said by the others (security, dynamic content), you probably won't gain much benefit from a cache unless you have a fairly large amount of RAM or other high-speed storage in your router/cache server.


Thanks for the info guys, all that has been mentioned makes a lot of sense. My primary reason for a cache is my internet speed. On a good day, it's about 1.2 Mbps down and about 580 Kbps up. So you can see my reasoning for looking into a good cache. Again thanks for the input, I really appreciate it.

Ouch, that's a slow connection. But regardless, your web browsers will often cache what they can anyway, so a cache server on the router probably won't help much. A cache server can be really useful when you have a large number of independent devices trying to access the same general resources rapidly. I'm guessing this is not your situation -- just lean on the local cache for your browsers.


Well, kinda. It's an on-and-off thing. I do have a few things in my lab that need high-speed internet: just some Windows servers that I don't want to worry too much about when it comes to updates. Apart from that, though, you are 100% correct.

You could set up an update server to ease the load of multiple downloads (or just do it once manually onto a thumb drive or other network-accessible storage), but a router cache solution wouldn't make any sense unless you have many gigs of storage, such as an x86-based OpenWrt installation.


That was a solution I was thinking of. Thanks a lot.

The best cache, which also runs on OpenWrt, is squid. I have it running on several installations, on standard MIPS hardware. You might use an SD card for the cache, although an SSD is usually faster.
Your router should have at least 128 MB RAM, as squid also uses RAM for control info about cached objects.
Because of your slow web connection, you might find it useful to tune the squid cache, e.g. by violating refresh times in cases where serving some stale data is not too harmful. squid also caches dynamic content, so it can have performance advantages over the browser cache, depending on the usage case. However, these advanced configs of squid require quite some learning.
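As a rough illustration of that kind of tuning, a squid.conf fragment might look like the following sketch; the mount path, cache sizes, and file extensions are assumptions to adapt to your own hardware:

```
# Memory for hot and in-transit objects; keep this well below total router RAM
cache_mem 32 MB

# On-disk cache on the SD card: 2048 MB, default 16/256 directory layout
cache_dir ufs /mnt/sda1/squid 2048 16 256

# Serve possibly-stale static assets rather than re-fetching on a slow link:
# keep matches fresh for at least 1 day (1440 min) and up to 1 week (10080 min),
# even past the server's Expires header, and ignore client reloads
refresh_pattern -i \.(jpg|jpeg|png|gif|css|js)$ 1440 80% 10080 override-expire ignore-reload
```

Note that override-expire and ignore-reload deliberately violate HTTP caching semantics, which is the trade-off mentioned above: more hits on a slow link in exchange for occasionally stale content.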

You can also run squid on a machine that is connected (wired) to your router and have it do the caching for your subnet, assuming you have such a machine and it's always on. I run a DNS server and a media server that way, and used to host squid that way too, but the caveats the other posters mentioned made squid almost useless in my case: many cache misses compared to hits.

Squid caching is mostly useful for software updates, where the traffic is plain HTTP and you may need to download the same updated software packages for several machines on your LAN.
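For that update-caching case, a minimal squid.conf sketch could look like this; the size limits and package extensions are assumptions, not a tested recipe:

```
# Allow large update packages to be stored at all
maximum_object_size 512 MB

# Fetch whole objects even when clients send range requests
# (update clients often download in ranges; without this, little gets cached)
range_offset_limit -1

# Never abort a partially fetched object, so it can finish and be cached
quick_abort_min -1 KB

# Keep downloaded packages cached for a long time (minutes: 1 week min, 30 days max)
refresh_pattern -i \.(deb|rpm|cab|msu|exe|zip)$ 10080 100% 43200
```

The range_offset_limit and quick_abort_min settings are the usual pair for making range-based downloaders cacheable, at the cost of squid downloading the whole file even if the client only wanted part of it.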

Otherwise, https means few things can be cached. However, I have found squid very useful for policy implementation and QoS control. You can do things like DSCP-tag streaming media but limit it to some fixed total bandwidth. I also use it to identify YouTube media headed to my kids' devices and count that against a daily quota in the firewall, and similarly limit sites based on time of day or day of week.
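The bandwidth-limiting side of that can be sketched with squid delay pools roughly as follows; the ACL name, the domain list, and the ~500 KB/s cap are assumptions for illustration:

```
# Classify streaming traffic by destination domain
acl streaming dstdomain .youtube.com .googlevideo.com

# One class-1 delay pool: a single aggregate bucket for all matching traffic
delay_pools 1
delay_class 1 1

# Aggregate cap: restore rate / bucket size, both in bytes per second
delay_parameters 1 512000/512000

# Only streaming traffic goes through the pool
delay_access 1 allow streaming
delay_access 1 deny all
```

A class-1 pool throttles all matching connections together, which matches the "fixed total bandwidth" idea above; per-client limits would need a class-2 or class-3 pool instead.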

This topic was automatically closed 10 days after the last reply. New replies are no longer allowed.