Rust-lang (rustc/cargo) for OpenWrt - testing needed

I've been able to start back on rust-lang, and have been pushing updates.

I think I've finalized aarch64, arm, armv7, mips, mipsel, mips64, and x86_64. The arm/armv7 should now correctly determine if the target needs hard-float or not.

powerpc has been dropped from the package for now, as it seems to have LLVM issues. For anyone who wants to test rust-lang for powerpc, I've left provisions in the Makefile to pass the correct defines (--D__ppc__) to LLVM for powerpc targets if it's called directly (make package/feeds/packages/rust/host/{clean,compile}), but it isn't enabled in the rust-lang package DEPENDS.

This is simply at the stage where the rust-lang toolchain compiles; I've not tested that it successfully cross-compiles, or that the cross-compiled packages actually work correctly, for any ARCH other than mips64 and mipsel. I have no reason to suspect it won't work, but early on I ran into issues with mips64 where it compiled both the toolchain and suricata6, yet the binary would immediately SIGILL because of the static vs dynamic linking (musl issue). I had no indication of an issue until it just died on the device.

If anyone would like to test and see if rust-lang builds for your target, I'd appreciate the feedback. If anyone has a target ARCH not listed, let me know (I don't play outside of the very confined world of the Octeon MIPS64 branch often) as I'm not up on the other ARCHs.

  21M dl/rust-1.56.1-aarch64-unknown-linux-musl-install.tar.xz
  20M dl/rust-1.56.1-arm-unknown-linux-musleabihf-install.tar.xz
  20M dl/rust-1.56.1-armv7-unknown-linux-musleabihf-install.tar.xz
  22M dl/rust-1.56.1-mips64-unknown-linux-muslabi64-install.tar.xz
  20M dl/rust-1.56.1-mipsel-unknown-linux-musl-install.tar.xz
 364M dl/rust-1.56.1-x86_64-unknown-linux-gnu-install.tar.xz
  22M dl/rust-1.56.1-x86_64-unknown-linux-musl-install.tar.xz
 109M dl/rust-1.56.1.tar.xz

I'm also verifying arm-unknown-linux-musleabi and expect it should compile correctly.

Appreciate any help or feedback!


Hi @Grommish thanks a ton for your work on this! I'm working on building Rust code for a couple of devices running OpenWRT, and what you've done here is hugely valuable.

That said, when running what I believe is your latest, I came across a small issue. I think I've maybe addressed it, but I'm still waiting for the build to complete, so I figured I'd post here in the meantime.

What I did was to apply your patch to the version of openwrt/packages (@ this fairly recent commit) that's used by the Teltonika SDK, which is relevant for the routers I'm building for. Although this is probably not 100% the envisaged approach, it feels like it ought to work. And it mostly does. I can request to build a sample Rust program, which triggers building of the toolchain, and that mostly builds, but then I get to an error such as

Building stage1 std artifacts (x86_64-unknown-linux-gnu -> mips-openwrt-linux-musl)
error: failed to run `rustc` to learn about target-specific information

Caused by:
  process didn't exit successfully: `/home/ubuntu/rutos-ath79-rut9-gpl/build_dir/hostpkg/rust-1.57.0/build/bootstrap/debug/rustc - --crate-name ___ --print=file-names -Zsymbol-mangling-version=legacy -Zmacro-backtrace '-Clink-args=-Wl,-rpath,$ORIGIN/../lib' -Ctarget-feature=-crt-static -Zsave-analysis -Cprefer-dynamic -L native=/home/ubuntu/rutos-ath79-rut9-gpl/staging_dir/toolchain-mips_24kc_gcc-8.4.0_musl/lib -Cembed-bitcode=yes '-Zcrate-attr=doc(html_root_url="")' --target mips-openwrt-linux-musl --crate-type bin --crate-type rlib --crate-type dylib --crate-type cdylib --crate-type staticlib --crate-type proc-macro --print=sysroot --print=cfg` (exit status: 1)
  --- stderr
  error: Error loading target specification: Could not find specification for target "mips-openwrt-linux-musl". Run `rustc --print target-list` for a list of built-in targets

Now, mips-openwrt-linux-musl is indeed not a valid target that's recognized by rustc. That should be mips-unknown-linux-musl, right? I've not really been able to track down where that discrepancy arises, but it feels to me like $RUSTC_TARGET_ARCH gets incorrectly set to the "raw" target triple, rather than the Rust-specific one. From what I can tell the solution is to switch the commenting towards the end of, so that

RUSTC_TARGET_ARCH:=$(ARCH)-unknown-linux-$(patsubst "%",%,$(RUST_TARGET_SUFFIX))
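Since the `patsubst` noise obscures it a bit, here's what that Make line amounts to, illustrated in Rust with an invented helper (this function isn't in any Makefile; it just models the substitution):

```rust
// Model of the RUSTC_TARGET_ARCH assignment: take OpenWrt's ARCH and the
// musl suffix, and build the "-unknown-" vendor triple that rustc actually
// recognizes, instead of the "-openwrt-" one it rejects.
fn rustc_triple(arch: &str, rust_target_suffix: &str) -> String {
    format!("{}-unknown-linux-{}", arch, rust_target_suffix)
}

fn main() {
    // "mips-openwrt-linux-musl" is not in `rustc --print target-list`,
    // but the remapped triple is:
    println!("{}", rustc_triple("mips", "musl"));        // mips-unknown-linux-musl
    println!("{}", rustc_triple("armv7", "musleabihf")); // armv7-unknown-linux-musleabihf
}
```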

I'm now trying to build with that. Initially got a weird ld error about mismatch in "VFP register arguments" (potentially related to the hard vs soft float issue you mentioned), but have now cleaned the build tree and am building from scratch just to be sure it wasn't a conflict with a previous build.

Finally, do you still have a method for using rustup to install the Rust toolchain? I used that to build binaries for the architectures that I'm interested in (armv7 and mips) outside of the OpenWRT tree, and it worked fine - so I'm wondering if there's a way to integrate this into the OpenWRT build process (for supported architectures), to avoid building rustc from source.



Couple of things:

  1. rustup won't work because rustup is statically linked, and OpenWrt MUSL requires dynamically linked libraries. I wish we could use rustup..

  2. OpenWrt uses xxxx-openwrt-linux-musl for the tuples. I'm working on the best way to integrate this into Openwrt and Rust-lang. I have access to a MIPS64 device, so that's what I'm currently testing. Previously I had MIPS/MIPS64/Aarch64/x86_64 working (ARM, ARMv5/v6/v7 is proving a pain).

I've gone from cross-linking tuples from xxxx-openwrt-linux-musl to the built-in rust-lang tuples (xxxx-unknown-linux-muslxxx) to just upstream'ing the tuples as Tier 3 targets (

I've been trying to figure out the best way to integrate long-term into the build system and potentially be able to introduce rustc and cargo natively on-device for the higher-end devices that might need it (although that will depend on LLVM adding support for the target Arch)

Compiling rust-lang takes about 4 hours on my device from a clean slate, so it just takes time to build, test, fail, repeat :smiley:

ARM Targets are going to be tricky because OpenWrt uses multiple ways to define ARM functionality (like VFP/NEON) across the trees, so that'll have to come later.

grommish@DESKTOP-AW:~/openwrt/staging_dir/host/bin$ file rustc
rustc: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, interpreter /lib64/, BuildID[sha1]=76008b88647eb342385535fc311bfaf0d769c225, for GNU/Linux 3.2.0, not stripped

This is my cross-compiled rustc for the build system (used to cross).

I'll probably push an update here soon, as I've been trying to make the Makefiles more readable and better organized.

I can push a MIPS enabled commit, if you'd like to test

I should add: I'm not a programmer and I don't know rust. So, if you actually know rust, it would be very helpful if you'd be willing to help test that what actually builds, builds correctly.

Thanks for the quick response! A few things to unpack here; let's maybe start with this:

To be clear, the second part of this isn't an objective of mine. I agree it would be neat to do that, but I'm currently simply trying to cross-compile Rust applications to run on constrained OpenWRT devices. And indeed to do that as part of the OpenWRT build and firmware packaging process. [The cross-compilation of individual binaries outside of the OpenWRT build process is quite doable, by e.g. following - but that's beside the point here.]

I wasn't familiar with this, but after a bit of reading (e.g. here), I think I'm kind of getting the picture. Is the actual issue that the rustc compiler that's been installed via rustup will always produce statically-linked binaries on musl? (I assume it's not about the rustup/rustc executables themselves being statically linked)

While I agree it's not ideal to have statically linked binaries, it does seem that such binaries would nonetheless run fine on the target (at least they do in my tests). In that regard I'm not sure what you mean when you say that OpenWRT "requires dynamically linked libraries". Can you elaborate, or would you be able to point me to any materials that I can use to educate myself on this? (am a bit of an OpenWRT novice).

The upstreaming seems like a worthwhile effort, but potentially also a lot of work to implement and maintain across all relevant tuples? I don't have a clear sense of what's best here. From a practical perspective, given that no openwrt tuples are currently recognized by rustc upstream, having some mechanism to match to a valid RUSTC_TARGET_ARCH seems like the only viable near-term option.

On that note, I've made some updates to your to correctly parse mips (not mips64) and at least some arm architectures. For example on one device I have


which the original did not recognize as having hard-float capabilities. I changed some instances of $(filter ... to $(findstring ... so that the neon and vfp keywords would be picked up even if not surrounded by whitespace. But yes, maybe trying to do this reliably for all tuples would end up being quite brittle - to your point, especially for all the ARM variants.
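Modeled in Rust for illustration (these helpers are invented, not part of the Makefile): Make's `$(filter ...)` only matches whole space-separated words, while `$(findstring ...)` does a plain substring search, which is why only the latter spots `vfp` inside a fused token like `neon-vfpv4`:

```rust
// Behaves like $(filter needle,words): needle must equal a whole word.
fn word_match(haystack: &str, needle: &str) -> bool {
    haystack.split_whitespace().any(|w| w == needle)
}

// Behaves like $(findstring needle,text): any substring occurrence counts.
fn substring_match(haystack: &str, needle: &str) -> bool {
    haystack.contains(needle)
}

fn main() {
    let cpu_type = "cortex-a7+neon-vfpv4";
    assert!(!word_match(cpu_type, "vfp"));     // whole-word match misses it
    assert!(substring_match(cpu_type, "vfp")); // substring match finds it
    println!("hard-float detected: {}", substring_match(cpu_type, "vfp"));
}
```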

I think I've got it sorted (with the above), but may come back to you on that - thanks for the offer!

Yeah, similar here unfortunately - hence the allure of rustup :slight_smile: .

As well as being an OpenWRT noob, I'm also a bit of a Rust noob, so probably not the best person to give a definitive stamp of approval :). But I guess in both our cases we're running applications with defined functionality, so hopefully it should be fairly easy to tell if they do what's expected or not. (I would also be a bit surprised if everything compiles successfully, but then - silently - does something different to what it's supposed to during execution)


I'm a hard-agree with this, BTW. The ability to create the packages is the goal, and one that works fine. My issue has been getting the cross-compiled package (in this case, suricata6) to work to validate rust-lang. It's the second part I've been having issues with.

Upstreaming the tuples isn't any harder than maintaining the local patches/ to add the OpenWrt specific tuples. The patch file I created for the rust PR could be dropped into feeds/packages/lang/rust/patches (and, in fact, was before I cloned rust-lang to submit the PR)

As you can see, at one point I worked to cross-map tuples between OpenWrt's and the GNU-standard tuples that rust-lang (and LLVM) use. Trying to sort out the ABI was reasonable on everything except ARMv7, and you saw the spaghetti waterfall of checks. Ideally, just using the tuple OpenWrt wants to use internally keeps everything the same across the entire build environment. The only downside is the PR to rust-lang vs an update PR to openwrt-packages. But it isn't like that would be an issue outside of rust-lang version updates, which would require an update anyway.

There were a few issues.. You can read them on the rustup PR (

My method has always been make it work and then figure out how to make it work while doing it right. I honestly couldn't tell you if a -mhard-float target would work with rustup because it was a non-starter once I learned it.

When I asked about the dynamic vs static linking, below is what I was told and have gone with.

I'm always open to suggestions and comments!


Fair enough. I was thinking more about doing simple "matching" of tuples that are already supported by rust-lang, but I take your point that that can be a mess also. I guess it largely comes down to whether it's easier to get PRs merged into rust-lang or openwrt/packages :).

Thanks - this is really helpful context. To be honest I'm not sure just how different the rustup behavior is from the way that other languages are built. I do feel like there is often an element of binary downloads (e.g. of dependencies), though maybe it's the caching element that rustup doesn't do well. My gut feeling is that the ultimate solution would be to have the rustup behavior improved in a way that brings it in line with what's acceptable and desirable within the OpenWRT build process, but maybe that's a pipe dream.

Anyway, happy to stick with building from source for now. On that note, I've been playing around with trying to reduce the build time to something less ridiculous. E.g. by removing builds of unneeded tooling (rls, rustfmt, docs, linters, etc), to only leave rustc and cargo. This still doesn't help with (by far) the biggest resource-hog, which is building LLVM from source. In principle the best way to address this would be to set the config option to true or if-available, and just get a pre-built LLVM binary. This is available for all Tier 1 and Tier 2 target tuples (list), which is a nice start though obviously not exhaustive.

All that said - I cannot get the LLVM binary download to work currently, for what seems like a silly reason. Attempting to use that option triggers some git commands in (here) to obtain the currently checked out commit hash, which is used to figure out the correct LLVM build to download. But since in the OpenWRT build the source tree comes from an archive, and not a checked-out git repo, the git commands fail. Short of patching, I'm not sure how to address this.

Anyway, I do feel like I'm close to getting the compiler to build properly. Once that's done I'll look into the dynamic vs static linking a bit more. Among the many discussions on the topic, it's not always clear whether people are talking about (a) building a dynamically linked rustc compiler on a musl host, or (b) having rustc build dynamically linked binaries for a musl target. Obviously it's the latter that's most relevant here.

Hm I'm hitting an error that I'm not sure what to make of. This is during

Building stage1 compiler artifacts (x86_64-unknown-linux-gnu -> armv7-unknown-linux-musleabihf)

The error message itself is incredibly long, but boils down to

   Compiling rustc_driver v0.0.0 (/home/ubuntu/rutos-mdm9x07-trb1-gpl/build_dir/hostpkg/rust-1.58.0/compiler/rustc_driver)
error: linking with `arm-openwrt-linux-muslgnueabi-gcc` failed: exit status: 1
  = note: "arm-openwrt-linux-muslgnueabi-gcc" [...very long list of options, including:] "-Wl,-Bdynamic" "-lLLVM-13-rust-1.58.0-stable" "-Wl,-rpath,$ORIGIN/../lib"
  = note: /home/ubuntu/rutos-mdm9x07-trb1-gpl/staging_dir/toolchain-arm_cortex-a7+neon-vfpv4_gcc-8.4.0_musl_eabi/lib/gcc/arm-openwrt-linux-muslgnueabi/8.4.0/../../../../arm-openwrt-linux-muslgnueabi/bin/ld: cannot find -lLLVM-13-rust-1.58.0-stable
          collect2: error: ld returned 1 exit status

At this stage I have several relevant-looking LLVM libraries present at the following paths (under the OpenWRT root):


I can't quite tell if I just don't have the correct environment set (seems more likely), or there's a missing LLVM artifact. @Grommish any ideas/pointers? Not sure if you faced anything like this before.

There's some discussion here: which indicates that it's an environment variable issue. To be honest - and this is maybe the bigger issue - I have very little idea how to troubleshoot this, given how far-removed this step is from the original environment in which the build is run. I don't for example know what the value of $ORIGIN is - it doesn't seem to appear anywhere else in the build log. I also wouldn't even know where to start in terms of trying to set $LIBRARY_PATH (suggestion from the linked issue). Finally, it doesn't seem that I can just attempt to re-run the command in my shell (to try and narrow down the root cause), since it relies on a linker script in a temporary directory that gets cleared after the error.

Any pointers on how I might be able to isolate and troubleshoot this?

[Finally, @Grommish, just wanted to point out that I've slightly tweaked the original options to (a) build version 1.58.0 rather than 1.57.0 and (b) use from the stable rather than the nightly channel. This was actually in an attempt to get some more alignment between the versions, since I noticed that trying to build 1.57.0 with your original script was attempting to build against LLVM-13-rust-1.59.0-nightly - and crashing with the error above - and my initial assumption was that maybe the error was due to a version mismatch]

I've updated the PR. It's better organized and uses internal build functions that I was unaware of during the first (of many) iterations.

I attempted to pare back the rust-lang toolchain to a minimum, but I don't know enough about it to do it effectively. I can tell you that working in the Fakeroot adds.. issues.. for things like make install and how it installs itself to the fakeroot and then registers itself. To get around this, I build the dist and then extract them directly to $(STAGING_DIR_HOST). This makes them available to the build system.

I've also added a mips-openwrt-linux-musl patch to the PR so you can test build, although I've not tested it as of this writing because it takes so long.. Should be fine though. I'll edit this with an update once I run it.

Once you've got the toolchain built, anything you need to build with it will need the following:


CONFIG_HOST_SUFFIX:=$(shell cut -d"-" -f4 <<<"$(GNU_HOST_NAME)")

        --target=$(REAL_GNU_TARGET_NAME) \
        --host=$(REAL_GNU_TARGET_NAME)

The CONFIGURE_ARGS flags will change depending on what your package's configure will handle. Although I need to verify this, I believe the build system defaults to --target=$(REAL_GNU_TARGET_NAME)
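In case it helps to see what the `cut -d"-" -f4` in the CONFIG_HOST_SUFFIX line above is doing: GNU_HOST_NAME is a dash-separated tuple, and the suffix is just its fourth field. A Rust model of the same extraction (the function name is invented):

```rust
// Equivalent of `cut -d"-" -f4` on a GNU host tuple: split on dashes and
// take the fourth field, e.g. "gnu" from "x86_64-pc-linux-gnu".
fn host_suffix(gnu_host_name: &str) -> Option<&str> {
    gnu_host_name.split('-').nth(3)
}

fn main() {
    println!("{:?}", host_suffix("x86_64-pc-linux-gnu")); // Some("gnu")
    println!("{:?}", host_suffix("arm-linux-gnueabi"));   // None: only 3 fields
}
```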

I can say I'll need to do more cleanup on the whole thing since I'm no longer cross-directing tuples back and forth, but it works for now

Thanks for sharing @Grommish - I did finally work out the issue, and I can see it's also fixed in your latest version. Namely, in the previous version, in the build configuration both target and host were set to $RUSTC_TARGET_ARCH, which meant that Rust would build the LLVM library for ARM (or MIPS, etc), and would then try to link against that when building rustc (potentially also for that platform?...not sure), which would fail. Of course the host should be set to $RUSTC_HOST_ARCH. When I changed that it all worked out. FWIW the version I ended up with is here.

I think it's pretty close to your latest - though I'm still going with the OpenWRT<=>Rust/LLVM tuple matching in the makefiles, rather than patching the Rust codebase. I'm still personally averse to the idea of maintaining patches.

I was curious about one aspect of your latest PR: although you define $HOST_CONFIGURE_ARGS here, that variable is not used anywhere. In the previous version there was a ./configure <configure args> stage which would pick up the arguments and generate the config.toml file used by Is this no longer necessary?

Anyway, my current state is that rustc and cargo are installed and work more or less fine. I'm encountering an issue with one of the dependencies for the specific application I'd like to build, but that seems secondary.

HOST_CONFIGURE_ARGS and CONFIGURE_ARGS are automatically applied in Host/Configure when it runs ./configure. Check out TOPDIR/include/ and TOPDIR/include/ to see what is defined by default for those, as well as the Prepare, Configure, Compile, and Install defaults for Build/ and Host/

In the latest PR, anything HOST_ or Host/ relates to the Host-Toolchain install. The CONFIGURE_ARGS are used for Build/ runs.


If you are building a Host Package, you would also add a


to set CARGO_HOME. My latest revision was still trying out different locations, so I defined it in the Makefile, but that'll just change.

The advantage of not cross-tying the tuples is that I don't have to import/copy the "sorting logic" into each project that uses rust-lang, and it remains LLVM-compliant.

I haven't done testing on 1.58, but will once I finish testing 1.57.0. If I switch revisions, I have to start over :smiley:

You may have problems with stable versus nightly, as nightly is where the experimental tuples are (like the ones we are building). Keep that in mind if you run into issues building or using the resulting toolchain. I know I set it to nightly for a reason, but it was so early on I couldn't tell you exactly why.

If you extract and install the dist archive into $(STAGING_DIR_HOST), there isn't a need to try and set -lLIBPATH or -Iinclude_stuff

However, if you WANT to play with it, you can use the HOST_ and TARGET_ flags as below:


Which will pass on those flags to the build system.


TARGET_LDFLAGS += -latomic

Thanks @Grommish for the advice here! I did get it working ultimately, though it was definitely not straightforward. In particular, for someone who's not very familiar with it, the amount of "magic" (through various inter-dependencies, semi-hidden steps, and environment-setting) that the OpenWrt build process does is pretty overwhelming. So I found the number of potential levers to play with when trying to troubleshoot anything (e.g. a linker error) pretty mind-boggling.

Anyway, it seems to work ok now. My version of the Rust host build is here: It's a slight tweak on yours, in particular to download the pre-built LLVM binary from CI, rather than spend ~3 hours building it locally (and yes, I know that's cheating by OpenWrt standards, but :man_shrugging: eh).

I also struggled making much sense of the Suricata Makefile since I really just need a simple cargo build, but ultimately managed to figure out what that needs to look like. For anyone interested, below is my minimal Makefile for a Rust Hello World. I also have some more complex sample code that works with an MQTT broker through the paho-mqtt library, and that's based on a nearly identical Makefile - example here.

Sample Makefile for Hello World:

include $(TOPDIR)/




include $(INCLUDE_DIR)/



define Build/Compile
        cd $(PKG_BUILD_DIR) && \
          $(CONFIGURE_VARS) cargo build --release --target=$(REAL_GNU_TARGET_NAME)
endef

define Package/rust_helloworld
        TITLE:=Rust Hello World
endef

define Package/rust_helloworld/description
  Hello World in Rust for OpenWrt
endef

define Package/rust_helloworld/install
        $(INSTALL_DIR) $(1)/usr/bin
        $(INSTALL_BIN) $(PKG_BUILD_DIR)/target/$(REAL_GNU_TARGET_NAME)/release/helloworld $(1)/usr/bin
endef

$(eval $(call BuildPackage,rust_helloworld))
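For completeness, the only source the package above compiles is the stock output of `cargo new helloworld`:

```rust
// src/main.rs as generated by `cargo new helloworld`
fn main() {
    println!("Hello, world!");
}
```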

That is pretty much the experience I had dealing with the build system. I'd figure out how to do what I needed through brute force, then figure out where/how what I needed was already in the system, just not documented (or spread out).

If you don't mind, can I use your helloworld package for testing? I don't know Rust, but if you do, and would, could you make the Rust Hello World also test correct handling of floats (which seems to be the biggest issue I'm facing with targets)? Having that be cross-compiled and run on a target device as a verification of the toolchain would be fantastic.

Yeah I certainly struggled with the documentation. E.g. it would be great to have a reference for the standard variables that are available for use in Makefiles, like $(TARGET_CC_NOCACHE) or $(TOOLCHAIN_DIR). Thankfully the community support is first-class :slight_smile: .

For sure! The repo it's based on is someone else's by the way - I just figured there was a git repo out there where someone has simply run cargo new helloworld and committed (and indeed there were plenty).

Do you know what a good test for this would be (conceptually)? As part of my sample MQTT pub/sub code, I convert a UTF-8 encoded string to a float (e.g. "1.1" -> 1.1), then multiply that float by 2.0 and convert it back to a string (here). So there's obviously a float operation in there, but I'm not sure if it really pushes the right buttons for testing. I'd certainly be happy to come up with a one-liner - e.g.

println!("Double a float = {:?}", 1.1 * 2.0);

if you have any idea as to the exact requirements. By the way I can confirm that this runs great on a MIPS unit (no hf), after cross-compilation via the toolchain here.
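For what it's worth, a slightly fuller sketch of such a test (the function name is made up; it just exercises the parse, multiply, and format path from the MQTT example, plus a couple of other floating-point operations that a broken soft/hard-float setup might trip over):

```rust
// Minimal float sanity check for a cross-compiled target: parse a UTF-8
// string to f64, do some arithmetic, and format it back. On a toolchain
// with mismatched float ABI this is the kind of path that SIGILLs or
// produces garbage rather than failing cleanly.
fn double_from_str(s: &str) -> Option<String> {
    let x: f64 = s.parse().ok()?;
    Some(format!("{:.1}", x * 2.0))
}

fn main() {
    // The "1.1" -> 1.1 -> 2.2 round-trip from the MQTT example:
    assert_eq!(double_from_str("1.1"), Some(String::from("2.2")));
    // A few more operations that hit different FP code paths:
    assert!((2.0_f64.sqrt() * 2.0_f64.sqrt() - 2.0).abs() < 1e-12);
    assert_eq!(f32::from_bits(0x3f80_0000), 1.0_f32);
    println!("float test passed");
}
```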

Finally, can also confirm that the resulting binaries are dynamically linked:

root@openwrt-host:~# ldd /usr/bin/helloworld
	/lib/ (0x77da8000) => /lib/ (0x77d0e000) => /lib/ (0x77da8000)

That said, the ones that I can build with the current version of the standard Rust toolchain (downloaded with rustup) are also dynamically linked, so this isn't necessarily a significant achievement.

This is why we can't cheat and install rustup on the build-host to bypass the build system :laughing: You run into issues no one ever expected a builder to hit, and assumptions get made by the dev-team on the far end (I ran into a similar Suricata5 issue with cross-compiling assumptions on auto-confs)

Also, it looks like the first *-openwrt-linux-musl* tuple was accepted by rust-lang as a Tier 3 target:

  Compiling helloworld v0.1.0 (/home/sda3/openwrt/build_dir/target-x86_64_musl/rust_helloworld-0.1.0)
error: linking with `x86_64-openwrt-linux-musl-gcc` failed: exit status: 1

Compile error with make file

Hi @qiuzi ..

I've not tested anything x86_64 as a target, and I'm not even sure if the Makefile will support that platform at the moment.

I am making changes to the entire package and how it's laid out. I'm working to validate that the toolchain works and integrates properly before I start adding other arches.

That being said, I'm more than happy to enable x86_64 if it isn't already when I push the next update to the rust-lang PR.

Are you the one that emailed me? If so, I did receive it, but I've not had a chance to respond yet.

I'm doing some more testing before I send up the latest batch of changes, including developing a Makefile package template to use with Rust packages.

DEPENDS:=@BROKEN @(aarch64||arm||mips||mips64||mipsel||x86_64)

So I did have x86_64 included in the package; however, there is NO toolchain support for anything other than mips64 and potentially mips at the moment. I will need to update the PR for an x86_64 target.

I have created a template ( that can be used for creating packages that include rust-lang.

I've added a ripgrep package for those using rust-lang.

Thanks to @neg2led, I have a test suite for testing rust-lang cross-compile output for specific outlier cases.

If you want to help test, pick your ARCH below, download and install the ipk file, and let me know if it works. It'll show the output below, including the fractal pattern, or it will SIGILL and exit.

root@OpenWrt:/tmp# /bin/float_test
running on linux mips64 with 0x0 terminal, aspect ratio 0.5

                                              .    @@
                                  .     :.  * #::~:@:@=* ..:  .
                                  .@@    .@@@@@@@@@@@@@@@@@@@@     .
                                  @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@@.
                             .+. ~@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@.
                .            @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
         .@@ .@@@@=@       *@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
          @@@@@@@@@@@@@    .@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
       .@@@@@@@@@@@@@@@@@  @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
       .@@@@@@@@@@@@@@@@@  @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
          @@@@@@@@@@@@@    .@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
         .@@ .@@@@=@       *@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
                .            @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@
                             .+. ~@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@.
                                  @@@@@@@@@@@@@@@@@@@@@@@@@@@@@@@ @@@.
                                  .@@    .@@@@@@@@@@@@@@@@@@@@     .
                                  .     :.  * #::~:@:@=* ..:  .
                                              .    @@


I will be adding more arches as I generate and begin testing builds for them.

Anyone who can help test, it would be greatly appreciated. Once I can verify that the toolchain actually works, I can upstream the tuples. I don't have hardware to test on though, other than mips64.

Edit: A point was raised in the testing, if you test and report back, please let me know what branch you tested on.

mips_24kc on 19.07 was a successful test - Archer C7v5

ath79/generic master OK
ramips/mt7621 master OK
