From this post:
So if that blog is right, a DTIM setting of 3 is better for battery life. But see this wherein the two concepts of DTIM interval and DTIM period are discussed. If I understand that correctly, the period is mathematically
dtim_period / beacon_int (in seconds)
dtim_period of 3 and a
beacon_int of 100 have a period of 3/0.1 or 30 times per second. That is 3x more than a
dtim_period of 1 and a
beacon_int of 100. Seems wasteful to transmit at that frequency if the entire point of raising the values is to increase the time between them. What am I missing?
Would that not be equal to one for all values of dtim_period (maybe excluding zero)?
Yea, sorry typo, corrected.
Further... consider equivalent fractions:
dtim_period = 3 and
beacon_int = 3000 vs
dtim_period = 1 and
beacon_int = 1000
3 / 3000 ms = 1/sec
1 / 1000 ms = 1/sec
Each equates to once per second. What's the functional difference there?
I noticed DTIM and beacon interval for the first time a couple of days ago in another thread. Though I've been using OpenWRT for over 4 years now, I never paid attention to these 2 values. I also can't remember seeing them in threads before, but maybe I just didn't pay attention.
Did some research. Your understanding of DTIM-related battery saving on connected devices matches mine. Still, I changed the values recently for other reasons.
I recently ran into an issue with an airplay2 receiver, where lowering the AP beacon interval from the default 2 to 1 fixes an annoying problem: the receiver loses its wifi connection every couple of hours, fixable only by restarting the access point.
Interestingly, the same airplay2 receiver and the same dir-1960 AP had worked flawlessly before with the default interval and DTIM. The problem started when I added additional interfaces/zones and wifis to the AP, without touching the default wifi the airplay2 receiver connects to at all. Also, none of the other mobile devices showed any issues.
It could be that some wifi-driver timing issues / race conditions interfere with other OS components on the AP somehow, and lowering the beacon interval came to the rescue by coincidence. I currently have no clue why that works or how the two are connected; I can only confirm daily that it still helps for me.
So theory is one thing; practical, unexpected, but somewhat helpful side effects are another. Currently that fix is more helpful to me than battery savings.
As far as I understand it, the two are different:
first one: a beacon gets sent every 3000 (milliseconds?) and every third one (aka every 9000 ms) carries the DTIM.
second one: a beacon gets sent every 1000 (milliseconds?) and every single one (aka every 1000 ms) carries the DTIM.
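That relationship can be sketched in a couple of lines (a minimal check of the reading above, not anything from hostapd; the helper name is mine, and beacon_int is treated as plain milliseconds here, as in the two examples):

```python
# The DTIM element rides on every dtim_period-th beacon, so the time
# between DTIMs is simply dtim_period * beacon interval.
def dtim_interval_ms(dtim_period: int, beacon_int_ms: float) -> float:
    """Milliseconds between successive DTIM beacons."""
    return dtim_period * beacon_int_ms

print(dtim_interval_ms(3, 3000))  # first case: DTIM every 9000 ms
print(dtim_interval_ms(1, 1000))  # second case: DTIM every 1000 ms
```

So the two settings are not equivalent: the first waits 9x longer between DTIMs than the second.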
Yea, I think that quora link confused me, since we define the period directly in
/etc/config/wireless whereas they were calculating it. Then I read the wiki's definition of these:
dtim_period: there will be one DTIM per this many beacon frames.
beacon_int: this is the time interval between beacon frames, measured in units of 1.024 ms.
So for the hostapd defaults: 2 and 100:
- The beacon interval is 100 * 1.024 ms = 102.4 ms
- Or 1/0.1024 = 9.77 beacons / second
- There will be one DTIM per two beacons so 9.77 / 2 = 4.88 DTIMs/sec
If you use a dtim_period of 3, the beacons / sec stays constant at 9.77, but now it's 9.77 / 3 ≈ 3.26 DTIMs/sec
So they are less frequent. Makes sense. Do I have it right?
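The arithmetic above can be double-checked with a short sketch (the function name is mine, not from hostapd; it only assumes the wiki's 1.024 ms unit for beacon_int):

```python
TU_MS = 1.024  # beacon_int is measured in units of 1.024 ms (per the wiki)

def beacon_and_dtim_rates(beacon_int: int, dtim_period: int):
    """Return (beacons per second, DTIMs per second) for given settings."""
    beacons_per_sec = 1000.0 / (beacon_int * TU_MS)
    return beacons_per_sec, beacons_per_sec / dtim_period

b, d2 = beacon_and_dtim_rates(100, 2)  # hostapd defaults
print(f"{b:.2f} beacons/s, {d2:.2f} DTIMs/s")  # 9.77 beacons/s, 4.88 DTIMs/s
_, d3 = beacon_and_dtim_rates(100, 3)
print(f"{d3:.2f} DTIMs/s with dtim_period=3")  # 3.26 DTIMs/s
```

Raising dtim_period leaves the beacon rate alone and only thins out how many of those beacons carry a DTIM.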
My reference, which might be unconfirmed/wrong:
If you have a beacon interval of 100 ms and a DTIM value of 1, the DTIM is transmitted every 100 ms.
If you have a beacon interval of 1000 ms and a DTIM value of 2, the DTIM is transmitted once every two seconds.