Hi there, for those who might be interested: you can download OpenWrt from archive.spacemit.com and have NPU-accelerated programs running within OpenWrt.
Immich AI acceleration works on it for sorting and categorizing your media, it also works in Frigate for object recognition, plus Ollama large language models, but you are limited to <2B models.
The Ollama docker-compose.yaml is found on the archive.spacemit.com site. The Frigate and other docker-compose files are found at bianbu.spacemit.com/en/ in the NAS section. You have your choice of Nextcloud, kodbox, Immich and Frigate. They all work fine within OpenWrt, and all accelerated.
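For reference, here is a minimal sketch of the general shape of such a compose file, using Ollama as the example. The image name, port and volume path below are my assumptions (the generic upstream defaults), not the contents of the SpaceMit file, which ships its own NPU-enabled image and device mappings. The part worth noticing is the volumes line, which you want pointed at an OpenWrt path up front.

```yaml
# Hypothetical sketch only -- the real file on archive.spacemit.com will differ
services:
  ollama:
    image: ollama/ollama        # placeholder; use the image from the SpaceMit archive
    restart: unless-stopped
    ports:
      - "11434:11434"           # Ollama's default API port
    volumes:
      # map model storage to a path that exists on your OpenWrt box,
      # so you don't have to edit the container config afterwards
      - /mnt/data/ollama:/root/.ollama
```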
If you are having trouble composing them in OpenWrt, run bianbu-desktop and compose them in there, then just move the Docker folder (/var/lib/docker) to your OpenWrt Docker folder and run them from there. For Frigate and Immich you also have to copy the timezone file from Bianbu to OpenWrt, otherwise you get a startup error about localtime. To find the location of the timezone file on the Bianbu OS: /etc/localtime is a symlink, so look at its properties to see which zoneinfo file it is linked to, then copy that file to OpenWrt and rename it to localtime in your /etc dir. When composing them, try editing the docker-compose.yaml volumes to match how you want them in OpenWrt; otherwise you have to edit each container's config.json afterwards to point to where the volumes are located on OpenWrt.
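The localtime step can be sketched as below. To keep it safe to run anywhere, the demo rebuilds the same symlink layout under a throwaway ./demo directory instead of touching the real /etc; the openwrt host name in the comment is just an example.

```shell
# Build a throwaway copy of the Bianbu layout: /etc/localtime is a symlink
# into /usr/share/zoneinfo (here a fake zone file stands in for the real one).
mkdir -p demo/usr/share/zoneinfo/America demo/openwrt/etc demo/etc
printf 'TZif' > demo/usr/share/zoneinfo/America/New_York
ln -sf "$PWD/demo/usr/share/zoneinfo/America/New_York" demo/etc/localtime

# Step 1 (on Bianbu): readlink -f resolves the symlink and prints which
# zoneinfo file /etc/localtime actually points to.
TZFILE=$(readlink -f demo/etc/localtime)
echo "localtime points to: $TZFILE"

# Step 2 (to OpenWrt): copy that resolved file over as a plain file named
# localtime (on real hardware: scp "$TZFILE" root@openwrt:/etc/localtime).
cp "$TZFILE" demo/openwrt/etc/localtime
```

The same "stop Docker first, then copy wholesale" idea applies to moving /var/lib/docker itself.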
Docker containers
With it running, idle load is low, but when detecting people in Frigate it will go up. Immich, after new media are uploaded, will run a bit higher until the media is AI-sorted. Ollama will use a lot when in use, 4 CPUs at 100%, but drops back to zero after it is done.
Frigate object detection works fine as well.
My OpenWrt AI project has just taken a few more steps forward: from a very low-power in-house AI system to now having AI-accelerated cloud computing and AI detection, all on a 5-watt device.