Looking forward to 2023

(Note: This draft was being written when that Monday Night Football incident happened, so it was shelved for a bit.)

As this is my last day of holiday break, I thought I’d reflect a bit on what makes me the most excited for the coming year. Obviously, none of us know what the future holds, but these are some of my hopes for 2023:

Social stuff

It looks like Twitter might survive after all, but the fragmentation and the millions of people moving to the Fediverse intrigue me. I am very curious to see where the Fediverse goes now that it has so much more interest. I am hoping to see people like journalists and meteorologists start using it in earnest; they were some of my favourite follows on Twitter. It would also be great to see the platform grow to encompass new interests, since the majority of people there currently lean towards tech.

While their privacy policies and business practices still disturb me, this year will likely be the year I rekindle my Facebook account. There are still family members and friends of mine who use it, and some pretty nifty retrocomputing groups are on it as well. If you can’t beat ‘em, join ‘em, I guess. Any content I post on Facebook will be mirrored to better platforms, so those of you who want to continue to stay away won’t be missing anything special. I just don’t want to lose out on the connections I could have because of my aversion to late-stage advertisement capitalism.

Apple ecosystem

My iPad Pro is going to see more usage this year now that Stage Manager is finally available, bringing multiple app/window support. The lack of that capability is something I’ve personally felt has kept the iPad from living up to its full potential, and something I remember seeing done in the jailbreak scene for years, so I’m quite happy to see Apple finally put it in official system software.

While I know rumours abound and there is no reason to think it would be released this year, I’m eternally looking forward to a wearable – like, say, an Apple Watch – that can also function as a glucometer. As someone with type 1 diabetes, it’d be a real boon to have even a rough estimate of my blood glucose level without having to wear a separate sensor.

It would be very cool, though unlikely, to see a MacBook Pro with a Dynamic Island like the iPhone 14 Pro.

Retrocomputing

I’ve received a lot of goodies, hardware and software, over autumn and winter. I can’t wait to put them to good use in the Retro Lab. I’m hoping to write a number of new articles in my Retro Lab series.

There are a number of software development projects I’d like to tinker with in the retrocomputing sphere. I’m keeping details vague for now, as I don’t want to make any promises, but my focus, as always, will be on making classic Macs and Windows NT useful in the modern era.

Linux and libre software

I’ve been following the SPDX project’s continual drive to make automated tooling around discovering and managing licenses of software packages. It would be very cool to integrate some of these tools into package managers like APK.

The Qt project is still not in my good graces after their decision to make LTS releases commercial-only. That feeling only grew stronger when it was announced that qmlsc, the QML compiler that turns QML apps into high-performance, non-interpreted C++ code, is likewise only available to commercial customers of Qt. Maybe the KDE team will support a libre Qt 6 LTS branch in the same way they support 5.15?

Speaking of LTS branches of things with a major version of 6, the Linux kernel’s 2023 LTS edition should be pretty exciting. Linux 6.1 and 6.2 bring a lot more support for AArch64 boards, including the Apple M1 and the Qualcomm 8cx Gen 3. When the Linux 6 LTS drops, it will be very exciting to dual-boot mainline Linux on my MacBook Pro M1.

I am personally hoping to have some time to devote to “traditionally opposite” endian projects. Specifically, I want to see if I can bootstrap an aarch64_be environment on my Pine A64, and similarly bootstrap a ppc64el environment. There are probably going to be a lot of false assumptions in code regarding aarch64_be.

Adélie continues to improve regularly, and hopefully this will finally be the year of the release of Adélie Linux 1.0. Yes, I am taking on a somewhat more active role again, and no, I do not want to comment 😉

Lastly, it will be exciting to see where the GCC Rust front end goes. Hopefully it will lead to significant improvements in Rust’s bootstrap story, which would make the language more useful and approachable for people who cannot use, or do not want to trust, the Mozilla-provided binaries.

Personal

I want to take photography seriously again. Photography can tell a story, document history, and transport others to a new perspective. I really enjoy taking these kinds of photos and hope to have some great snapshots to share throughout the year.

In addition to the retrocomputing projects, there are a few other non-retro-related software development and library improvement projects that I hope to spend some time on this year. Some of them are Wayland on Power, Zig on big-endian Power, and adding better compression support to APK Tools.

In conclusion

That is an overview of what I hope to devote my time to in 2023. What do you think? Are there cool developments that I should be looking at that I missed? Are you excited about some of these too? Feel free to discuss in the comments!

What’s the deal with Cisco devices in `file` output, anyway?

If you work on PowerPC systems of some kind – or maybe you work on car MCUs that use the NEC V800 CPU – you may have run across some strange output when you run the file command on any binary:

/usr/bin/file: ELF 32-bit MSB pie executable, PowerPC or cisco 4500, version 1 (SYSV), dynamically linked, interpreter /lib/ld-musl-powerpc.so.1, stripped

Of course it’s a PowerPC binary, but why the mention of “cisco 4500” (or Cisco 7500s for 64-bit PowerPC binaries, or Cisco 12000s for NEC V800s)? The reason behind this is a fascinating insight into the world of proprietary computing architectures and the somewhat inventive way Cisco tried to lock down some of their older systems.

A brief primer on ELF

ELF, which stands for Extensible Linking Format or Executable and Linkable Format and is not a Will Ferrell character, is a file format for executable files and shared libraries (among others).

In layman’s terms, ELF specifies things like what processor the executable runs on, the ABI that it uses, the endianness and word size (32-bit or 64-bit, for example) that it uses, and so on.

One of the fields in an ELF file is the e_machine field, which specifies the type of machine the file is designed to run on. 0x02 is SPARC, 0x03 is the Intel x86, 0x14 is 32-bit PowerPC, 0x15 is 64-bit PowerPC, and so on.

This is the identifier that allows your OS to tell you “Exec format error” (or similar) when you run an executable for a CPU other than the one you are currently using. As a side note, it is also this field that allows qemu-user binfmt to work, if you are curious.
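
To make that a little more concrete, here is a minimal sketch in C of pulling e_machine out of a file’s ELF header. It is illustrative only – it assumes the usual <elf.h> from glibc or musl, and it skips the byte-swapping a real tool would perform for binaries of the opposite endianness.

#include <elf.h>
#include <stdio.h>
#include <string.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <elf-file>\n", argv[0]);
        return 1;
    }

    FILE *fp = fopen(argv[1], "rb");
    if (fp == NULL) {
        perror("fopen");
        return 1;
    }

    /* e_machine sits at the same offset in 32-bit and 64-bit ELF headers,
     * so reading the 32-bit header is enough for this field. */
    Elf32_Ehdr hdr;
    if (fread(&hdr, sizeof hdr, 1, fp) != 1) {
        perror("fread");
        fclose(fp);
        return 1;
    }
    fclose(fp);

    if (memcmp(hdr.e_ident, ELFMAG, SELFMAG) != 0) {
        fprintf(stderr, "%s is not an ELF file\n", argv[1]);
        return 1;
    }

    /* Caveat: e_machine is stored in the file's own byte order (see
     * e_ident[EI_DATA]); an MSB binary read on a little-endian host
     * needs a byte swap before this value is meaningful. */
    printf("e_machine = %u (0x%x)\n", (unsigned)hdr.e_machine, (unsigned)hdr.e_machine);
    return 0;
}

Run against the PowerPC binary from the file output above (and accounting for endianness), it would report e_machine = 20 – the 0x14 value for 32-bit PowerPC mentioned a moment ago, known as EM_PPC in <elf.h>.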

Cisco’s use of e_machine

The boot loader for Cisco IOS devices, also known as ROMMON, will refuse to load firmware meant for a different router model than the one it is running on. For example, on a Cisco 2911, you may see:

loadprog: error - Invalid image for platform
e_machine = 30, cpu_type = 194

ROMMON uses e_machine as a sort of “model number”. The Cisco 4500 uses cpu_type 20 (0x14), which happens to also be the ELF e_machine value for 32-bit PowerPC.

The “magic” library that the file command uses to determine the machine type of ELF binaries only knows about a few Cisco models. I haven’t been able to determine the criteria for inclusion, or why some models are present and others aren’t.
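
As an aside, you can retrieve the same description that file prints programmatically through libmagic, the library that backs the file command and ships its magic database. A rough sketch, assuming the libmagic development headers are installed and the program is linked with -lmagic:

#include <magic.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <file>\n", argv[0]);
        return 1;
    }

    /* MAGIC_NONE yields the same human-readable description that
     * file(1) prints by default. */
    magic_t cookie = magic_open(MAGIC_NONE);
    if (cookie == NULL) {
        fprintf(stderr, "could not initialise libmagic\n");
        return 1;
    }

    /* Passing NULL loads the default magic database, which is where
     * the "PowerPC or cisco 4500" strings actually live. */
    if (magic_load(cookie, NULL) != 0) {
        fprintf(stderr, "magic_load: %s\n", magic_error(cookie));
        magic_close(cookie);
        return 1;
    }

    const char *desc = magic_file(cookie, argv[1]);
    printf("%s: %s\n", argv[1], desc != NULL ? desc : magic_error(cookie));

    magic_close(cookie);
    return 0;
}

Since it consults the same magic database, it should print the same “PowerPC or cisco 4500” line for the binary above.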

References

The ROMMON error was gleaned from an OpenWrt forum post; I don’t have hardware to show this error myself.

More information about how older Cisco devices use ELF can be found on the LinuxMIPS wiki.

This question was originally asked by some curious people on the #talos-workstation IRC channel on Libera.Chat. I knew the basics of Cisco’s ELF-scapades, but they were the ones who inspired me to make this write-up and learn a bit more.

Compiling XIBs with CMake without Xcode

I’ve been enjoying using the JetBrains IDE CLion to do some refactoring and improvements to the Auctions code base. However, when I tried to build the Mac app bundle with it, the app failed to launch:

2022-07-30 19:54:15.117 Auctions[80371:16543044] Unable to load nib file: Auctions, exiting

The XIB files were definitely part of the CMake project. I later learned that CMake does not automatically add XIB compilation steps to a project; it relies on the Xcode generator to do that, so other generators (like the one CLion uses) leave the XIBs uncompiled.

I found a long-archived documentation page from CMake on the Kitware GitLab that described a method to build NIB files from XIBs, and have modified it to make it simpler for Auctions.

You can see the change in the commit diff, but I’ll include the snippet here for posterity.

First, you define a list containing the XIB file names without the .xib suffix. For instance, I’ve used set(COCOA_UI_XIBS AXAccountsWindow AXSignInWindow Auctions) for the three XIB files presently in the codebase.

Then we have the loop to build them:

find_program(IBTOOL ibtool REQUIRED)
foreach(XIBFILE ${COCOA_UI_XIBS})
    add_custom_command(TARGET Auctions POST_BUILD
        COMMAND ${IBTOOL} --compile ${CMAKE_CURRENT_BINARY_DIR}/Auctions.app/Contents/Resources/${XIBFILE}.nib ${CMAKE_CURRENT_SOURCE_DIR}/${XIBFILE}.xib
        COMMENT "Compiling NIB file ${XIBFILE}.nib")
endforeach()

Now it starts correctly and works properly when built from within CLion. This was surprisingly difficult to debug and fix, so I hope this post can help others avoid the hours of dead ends that I endured.

Until next time, Happy Hacking!

Expanding the Retro Lab, and Putting It to Work

Over the past month, I have been blessed with being in the right place at the right time to acquire a significant number of really cool computers (and other technology) for the Retro Lab.

Between the collection I already had and these new “hauls”, I now have a lot of computers. I was, ahem, encouraged to stop using the closets in my flat to store them and finally obtained a storage locker for the computers I’m not using. It’s close to home, so I can swap between what I want to work on virtually at will.

Now I am thinking about ways to track all of the machines I have. One idea I’ve had is to use FileMaker Pro for the Power Macintosh to track the Macs, and FoxPro to track the PCs. One of my best friends, Horst, suggested I could even use ODBC to potentially connect the two.

This led me to all sorts of ideas regarding ways to safely and securely run some server services on older systems and software. One of my acquisitions was a Tyan 440LX-based server board with dual Pentium II processors. I’m thinking this would be a fun computer to use for NT. I have a legitimate boxed copy of BackOffice Server 2.5 that would be perfect for it, even!

Connecting this system to the Internet, though, would present a challenge if I want to maintain any modicum of security – so I’ve thought it through. This is my plan for an eventual “Retro Cloud”.

As a cybersecurity professional, my first thought was to completely isolate it on the network. I can set up a VLAN on my primary router and connect that VLAN to a dedicated secondary router. That secondary router would be totally isolated from my present network, so the “Retro Cloud” would have its own subnet and no way to touch any other system. This makes it safer to give it an outbound connection: I’ll be able to explore Gopherspace, download updates via FTP, and all that good stuff.

Next, I’m thinking that it would make a lot of sense to have updated, secure software to proxy inbound connections. Apache and Postfix can hand sanitised requests to IIS and Exchange without exposing their old, potentially vulnerable protocol handlers directly to the Internet.

And finally, as long as everything on the NT system is public knowledge anyway – don’t (re)use any important passwords on it, don’t have private data stored on it – the risk is minimal even if an attacker were able to gain access despite these protections.

I’m still in the planning stages with this project, so I would love to hear further comments. Has anyone else set up a retro server build and had success securing it? Are there other cool projects that I may not have even thought of yet? Share your comments with me below!