Designing a Sheet in Interface Builder

Quick tip time! This is an anecdote from a libre project I’m working on, specifically Auctions.

One of the things I’m doing right now is implementing the Cocoa/Mac UI, including a sheet-based flow for signing in to accounts. I had a lot of trouble making the sheet accept input: it simply wouldn’t let any of its fields become the first responder.

Poking around DuckDuckGo, I found a Stack Overflow question that seemed pretty interesting; the answer was to override NSPanel’s canBecomeKeyWindow method to return YES. I did some searching in Apple’s Developer Documentation to see how the system determines when a window can become a key window, and I found this nugget:

A window that uses NSWindowStyleMaskBorderless can’t become key or main, unless the value of canBecomeKeyWindow or canBecomeMainWindow is YES. Note that you can set a window’s or panel’s style mask to NSWindowStyleMaskBorderless in Interface Builder by deselecting Title Bar in the Appearance section of the Attributes inspector.

Apple Developer Documentation

I had turned off Title Bar in Interface Builder, reasoning that it wasn’t needed since the window would be shown as a sheet. I re-enabled Title Bar, and voilà! The sheet worked perfectly, and it still had no title bar when displayed as a sheet.
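
For reference, here is a minimal sketch of the programmatic workaround from that Stack Overflow answer, in Swift (the class name is my own placeholder). In my case the Interface Builder fix was the better one, but this is the route to take if you genuinely need a borderless panel:

    import AppKit

    // A borderless NSPanel refuses key-window status by default, which is
    // why none of the sheet's fields could become first responder.
    // Overriding canBecomeKey (canBecomeKeyWindow in Objective-C) opts back in.
    class SignInPanel: NSPanel {
        override var canBecomeKey: Bool {
            return true
        }
    }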

Really leaving the Linux desktop behind

I’m excited to start a new chapter of my life tomorrow. I will be starting a new job at an excellent company, with great benefits and a comfortable wage.

It also has nothing to do with Linux distributions.

I have asked, and been granted, clearance to work on open source software during my off time. And I do plan on writing libre software. However, I really no longer believe in the dream of the Linux desktop that I set out to create in 2015. And I feel it might be beneficial for everyone if I describe why.

1. Stability.

My goal for the Linux desktop started with stability. Adélie is still dedicated to shipping only LTS releases, and I still feel that is useful. However, that has become more difficult now that Qt has removed LTS releases from the open source community, plainly admitting that they want us to be their beta testers and that only paying commercial users deserve stability. This is obviously antithetical to a stable libre desktop environment.

Mozilla keeps squeezing its release cycles closer together in a desperate attempt to compete with evil G (more on this in the next section). This means the yearly ESR releases, which Adélie depends on for some modicum of stability, are unfortunately being left behind by whiz-bang web developers who don’t understand that not everyone wants to run Fx Nightly.

Of these points, stability is probably the easiest to argue could still be fixed; you might be able to sway me on it. Some upstreams are finally dedicating themselves to better release engineering. And I’ve been happy to find that even most power users don’t care about running the bleeding edge as long as their computer works correctly.

My overall hope for the future: more libre devs understand the value of stable cycles and release engineering.

My fear for the future: everything is running off Git main forever.

2. Portability.

It has been harder and harder for me to convince upstreams to support PowerPC, ARM, and other architectures, even as Microsoft and Apple introduce flagship laptop models based on ARM, and Raptor continues to sell out of its Talos and Blackbird PPC systems.

A significant portion of portability issues come from Google code. The Go runtime does not support many non-x86 architectures, and the ones it does support, it supports poorly. PPC support in Golang is 64-bit only and requires a POWER8, which is equivalent to an x86 program requiring Skylake or newer. You could probably get away with that in an end-user application, but no one would, or should, accept it in a systems programming language.

Additionally, the Chromium codebase is not amenable to porting to other architectures. Even when the Talos user community offered a PowerPC port, they rejected it outright. On top of that, Chromium’s close ties to glibc mean that musl support requires thick patch sets running thousands and thousands of lines. They won’t accept patches to Skia or WebP for big-endian support. In general, they do not believe portability is desirable.

This would be all well and good, since GCC Go works, and we do have Firefox, Otter (which can still use Qt WebKit), and Epiphany for browsers. However, important software like KMail increasingly depends on Qt WebEngine, which embeds Chromium. This means KDE’s email client will not run on anything other than x86_64 and ARMv8, even though the mail client itself is portable.

This also has ramifications for user security and privacy. The Chromium engine regularly has large, high-risk security holes, which means that even if you do have a downstream patch set to run on musl or PowerPC, you must keep forward-porting it as they release. And their release pace is insane: they rewrite large portions of the engine with distressing regularity. This makes Chromium unsuitable for tracking in a desktop that requires stability and security in addition to portability.

And with more and more Qt and KDE apps (mistakenly, in my opinion) depending on WebEngine, more and more of them become unsuitable for tracking as well.

My overall hope for the future: more libre devs care about accepting patches for running on non-x86 architectures. The US breaks up Google and kills Chromium for violating antitrust and RICO laws.

My fear for the future: everything is Chrome in the future.

3. The graphics stack.

I’ve made no secret of my personal opinion that it would still, even today, be easier to fix X11 than to make Wayland generally acceptable for widespread use. But let’s put that aside for now. Let’s also put aside the fact that they don’t want to make it work on NVIDIA GPUs, which represent half of the GPU market.

At the behest of one of my friends, who shall remain nameless, I spent part of my December break trying to bring up Wayland on my PowerBook G4. This computer runs KDE Plasma 5.18 (the current LTS release) under X11 with no issues or frameskip. It has a Radeon 9600XT with hardware OpenGL 2.1 support.

It took days to bring anything up on it, because wlroots was being excessively difficult about handling the r300 for some reason. Once that was solved, it turned out the colours were being drawn wrong. Days of hacking at it revealed that there are likely bugs in Mesa causing this, and that this is likely why Qt Quick requires the Software backend on big-endian machines.

When I asked the Wayland community for a few pointers at what to look at, since Mesa is far outside of my typical purview of code (graphics code is still intimidating to me, even at 30), I was met with nothing but scorn and criticism.

In addition, I was still unable to find a Wayland compositor that supports framebuffers and/or a software mode, which would have removed the need to fix Mesa for now. Framebuffer support would also allow Wayland to run on computers that run LXQt fine, like my Pentium III and iBook G3, both of which have Rage 128 cards without hardware GL2. This request was also met with scorn and criticism.

Why should I bother improving the Wayland ecosystem to support the hardware I care about if they actively work against me, then shift the blame to the fact that cards like the S3 Trio64 and Rage 128 don’t have DRM2 drivers?

My overall hope for the future: either Wayland compositors supporting more varied kinds of hardware, or X11 being improved and obviating the need for Wayland.

My fear for the future: you need an RX 480 to use a GUI on Linux.

4. Usability.

This is more of an objective point than a subjective one: the usability of desktop Linux seems to be eternally stuck just below that of other environments. elementary OS is closest to fixing this, but from my point of view there is still much to be desired before they’re ready for prime time.

In conclusion.

I still plan to run Linux – likely Adélie – on all servers I use. (My fallback would be Gentoo, even after all these years and disagreements, if you were wondering.)

However, I have been slowly migrating my daily personal life from my Adélie laptop to a Mac running Catalina. And, sad as it is to say, I’ve found myself happier and with more time to do what I want to do.

It is my genuine hope that in a few years, if the Linux ecosystem seems to be learning any of these lessons, I can come back to it and contribute in earnest once again. Until then, it’s system/kernel level work and hacking POSIX conformance into musl for me. The Linux desktop has simply diverged too far from what I need.

The Retro Lab: Introduction

Welcome to my latest series of articles, The Retro Lab, where I will be detailing my excursions into the art and hobby of retrocomputing.

This article will serve as a general overview of what I hope to accomplish, plus a bit about my background and why this will be fun for me 🙂 If you just want to see the meat of this, skip to the section “And now, in the present”.

My history with computers

When I was very young, my family had a 386 running DOS and Windows 3.1. The only thing I cared about, of course, was the games. I don’t remember a lot from this era, because I was so young.

When we were with my grandpa, I loved to play on his XT clone, a Leading Edge Model D with a 20 MB disk running DOS. I inherited this computer when he passed, and it is still in my closet; I treasure it. Some of the games it had were a text adventure called “CIA” and Wheel of Fortune. The real magic for me, though, was in the BASIC interpreter. It was amazing to type in some commands and see this big, loud, complex machine do what I told it! That is what hooked me on programming, and it is why I chose the field I did.

The next computer we had was a Canon StarWriter Pro 5000. I’m not sure what hardware it has – it’s also in my closet, so a careful teardown may be a future article! – but I know it ran GEOS. I loved to write little comedy skits and song lyrics as a child, and it was cool to have all of them on a single 3.5″ floppy disk instead of taking up all the paper in the house.

The real life-changing moment, however, came on February 22, 1997: the day we brought home The Pentium. What a beast of a computer: 133 MHz, 24 MB RAM, and an 8-speed CD drive! It was a Compaq Presario 4712, and it came with Encarta 97, Compton’s Interactive Encyclopaedia, The Yukon Trail, Magic Carpet, and PGA Tour ’96, but most of all: 15 free hours of America Online.

AOL was amazing to a second grader. They had Nickelodeon online! You could download little sound clips from Nick and Nick at Nite shows. They had games like Slingo (still one of my all-time favourite takes on slot machines, 20+ years later). And they had a graphical portal to Gopher/WAIS. The local school district had uploaded text files full of fun activities for us children on their Gopher server.

That computer was also where I first used Telnet to reach a computer running Solaris. A few friends and I used talk on it to have our own small chat rooms. My aunt ran an IRC channel, and we talked on mIRC. We ended up with a webcam and talked with family using NetMeeting. Yahoo Messenger, ICQ, Infoseek… so many things.

Programming and desires

Enough with the ‘net reminiscing, at least for now 🙂

Something else important to mention is that my grandpa was also an Important Person at a facility, and one of his duties was computer purchasing. He had catalogues from Compaq, IBM, and various other vendors in his house for that reason. I loved flipping through them and looking at all the cool stuff.

Something I always wanted back then was my own server. It seemed so cool. I was especially attracted to the ProLiants and AlphaServers in the Compaq catalogue. Windows NT and Tru64 looked so cool when I looked them up online.

The other thing that was very attractive back then was the Power Macintosh line. My Mum was a digital artist at the time. The Quadra was a nice system, but it didn’t compare to what I saw the Power Macs could do!

For my birthday in 1998, I received Visual Basic as a gift, and it really cemented my desire to be a programmer. This was such an exciting time: the VB6 package included a one-year subscription to the MSDN Library, and from that library I learned about all the different servers one could run, all the different types of NT, the different programming languages of Visual Studio…

And now, in the present

In the past few years, as I am able and as opportunities arise, I have amassed quite a collection of hardware that I want to set up and enjoy:

A Compaq Armada e500 laptop. This is a Pentium III from the year 2000 and is likely the newest system I want in my Retro Lab. It runs Adélie right now; I’ll probably remove the hard drive and install another to run period-accurate software, likely Windows NT 4 or 2000.

A beige Power Macintosh G3 with the Bordeaux personality card. I will be inserting a 10 GB disk and installing Mac OS 8.6 on it. This will run all the classic Mac software that I have collected over the years. It will be one of the main focuses of the Lab.

A few AlphaServers. Most are earmarked for Adélie so I can’t really use them in the Lab, but there is a single DS10L that was set aside for my personal use. I’ll likely install NT 4 on this one, but I need to investigate further on the hardware.

A Sun Ultra 60, Netra T1, and Ultra 10. These are all from ’98-’99 and will make great Solaris systems, to relive the glory days and experiment more with what was my first real Unix. I would like to run CDE again and possibly do some Java tinkering with these. It would be very fun to run a Retro Lab network off of the AlphaServer and Netra.

A Power Macintosh 7100/80AV. More fun Mac stuff awaits on this computer, though I’m not sure exactly what I will do with it yet.

A Compaq LTE 5150. This is actually my original laptop from high school, ca. 2004. I’d like to restore it to its former glory and use it for Windows 3.1 and early 95 software. It can also run OS/2. The screen probably won’t do well for most games, but I do have the docking bay to connect it to an external monitor…

An SGI Indy. This will need an SCSI2SD adaptor to reach its full potential since the hard disk died many years ago. I would love to dual boot IRIX and a BSD.

A Dell System 316LT. I have a few older DOS games that would run much better on a CPU of this speed. It needs some love; I seem to recall it had an issue booting the last time I had it out. I could also try to run GEOS.

A Compaq Presario 4850. This is, to my knowledge, the oldest original computer I have that still fully functions. We purchased it on my Mum’s birthday in 1998 for her graphic design software. This, along with the beige G3, will likely be the centrepiece of my Lab. I plan on running either Windows 95 or 98 on it, plus various other OSes of the era: BeOS, OpenStep, maybe early Linux. I know from prior hacking that the Rage Pro works in high resolution under Windows 3.1 and OS/2. It’s also the first computer I used to tinker with XFree86 modelines. It has a factory-original Hitachi DVD drive.

Don’t forget the accessories!

Oh yes, I have some great period hardware for the tinkering as well:

HP ScanJet 5s SCSI scanner. Drivers for Windows, Macintosh, and IRIX, at least. I believe there is an attachment to scan photo negatives as well, but I can’t remember now.

Aiptek webcam. Yes, the original one from the NetMeetings of old that I talked about in my history section. Should be very easy to bring up under Windows. I am curious about Macintosh support.

HP CD Writer Plus 7200e. This is a parallel port, dual-speed CD writer and rewriter. One of the cool features I found on this back in the day is that if you send multimedia commands and have speakers connected to the external headphone jack, you can power off the computer and still listen to the CD until it finishes! I found this out one day when Win95 crashed while I was listening to Garbage’s Version 2.0.

My MSDN Universal archive. In 2002 I found an MSDN Universal subscription at a flea market for 15 USD. I activated it and have all the CDs, and also sent a special request for them to send me the Archive CDs which included BackOffice 4.5 and a few other goodies.

Unfortunately my back and neck are not up to carrying a CRT. I have a flat panel from 2006, a 17″ ViewSonic, that seems to be very close in specification to what we could have had in 1998 for way too much money 😉 Hey, with everything else being so accurate, a little cheating on the monitor isn’t so bad!

In conclusion

This was a lot longer than I had originally anticipated, but it also covers a lot of ground. Over the coming weeks, I hope to bring up a few of these computers and document the processes. Until then, happy hacking!

Reckless Software Development Must End

On the 6th of November, 2019, I made a comment on Twitter:

Okay, so today’s news isn’t as dramatic as Uber killing a homeless woman by not programming in the fact that pedestrians might not use crosswalks, but it is based in the same mode of thought.

Today’s news is that the US state of Iowa has had issues with its election processes (processes a bit too complex for me to summarise in this blog). The problem boils down to a reckless abandonment of software engineering principles.

As reported in the New York Times and The Verge, in addition to many other outlets, there were a number of failings in the development and deployment of this software package that would have been trivial to prevent.

My personal belief is that the following issues significantly contributed to the failure we have seen.

No test plan

There was no well-defined test plan.

The test plan should have covered testing of the back-end (server) portion of the software, including synthetic load testing. My test plan would have included a swarm of all 1600+ precincts reporting all possible data at the same time, using a pool of a few inexpensive systems running multi-connection clients.
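
As an illustration only, here is a rough sketch of what that synthetic swarm could look like as a small Swift command-line tool. The endpoint URL, payload fields, and exact precinct count are hypothetical stand-ins; the real reporting API was never made public:

    import Foundation

    // Hypothetical load test: simulate every precinct reporting complete
    // results at the same moment. The URL and payload shape are invented
    // for illustration; swap in the real reporting API.
    let precinctCount = 1600
    let endpoint = URL(string: "https://reporting.example.invalid/api/results")!
    let session = URLSession(configuration: .ephemeral)
    let group = DispatchGroup()

    for precinct in 1...precinctCount {
        group.enter()
        var request = URLRequest(url: endpoint)
        request.httpMethod = "POST"
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")
        // Populate every possible field so the full input path is exercised.
        request.httpBody = try? JSONSerialization.data(withJSONObject: [
            "precinct": precinct,
            "firstAlignment": ["candidateA": 52, "candidateB": 41],
            "finalAlignment": ["candidateA": 55, "candidateB": 45]
        ] as [String: Any])
        session.dataTask(with: request) { _, response, error in
            if let error = error {
                print("precinct \(precinct) failed: \(error.localizedDescription)")
            } else if let http = response as? HTTPURLResponse, http.statusCode != 200 {
                print("precinct \(precinct): HTTP \(http.statusCode)")
            }
            group.leave()
        }.resume()
    }

    group.wait()
    print("all \(precinctCount) simulated precincts have reported")

Run a copy of this from each of a few inexpensive machines and you have the multi-connection swarm described above.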

The test plan should have also included testing of the deployment of the front-end (user facing) portion of the software. They should have asked at least a few of the precinct staffers to attempt to complete installation of the software.

Ideally, a member of the development team would be present for this, to note where users hesitate or hit issues. However, we are far from an ideal world; my test plan would have included a simple Skype or FaceTime session with the poll workers if face-to-face communication were prohibitive.

These sessions with real-world users can be used to further refine the installation process, and can inform what should be written in documentation to simplify and streamline the experience for the general user population. Then, users should be allowed to input mock test data into the software. This will allow the development team to see any issues with the input routines, and function as an additional real-world test for the back-end portion.

By “installation”, I mean the setup required after the software is installed: for instance, logging in with the unique PIN that reportedly controlled authentication. I am not including the installation of the app itself onto the device, which should not have been an issue at all, and which is covered in the following section.

Lack of release engineering

Software must be released to be used.

It appears that the developers of this software either did not have it finished before the Iowa caucus began (requiring them to onboard every user as a beta tester), or never intended to have a proper ‘release’ of the software at all (meaning every user was intended to be a beta tester). I could write a full article on the sad state of software release engineering, but I digress.

The software was distributed to users via a testing system, used for providing pre-release or “beta” versions to testers. This is an essential system to use when you have a test plan like what I described above. This is, however, a bad idea to use for releasing software for production.

On Apple’s platform, distributing final releases via TestFlight or TestFairy can result in your organisation being permanently banned from accessing any Apple developer material, not counting the legal (contract law) issues surrounding such a release. On Android, this kind of distribution requires your users to enable what is called “side-loading”: installing software from untrusted third-party sources.

All of the Iowa caucus precinct workers using the Android OS now have mobile devices configured in a severely vulnerable way, and side-loading has been normalised for them as something that could be legitimate. The importance of this cannot be overstated. This is a large security risk, and I am already wondering in the back of my mind how it will affect these same workers if they are involved with the general election in November. The company responsible for telling them to configure their devices this way may be, and in my opinion should be, liable for any data loss or exploitation that happens to these people.

My release plan document would have involved clearly defined milestones, with allowances for what features would be okay to postpone for later releases. This could include post-Iowa caucus releases, if necessary — the Nevada Democratic Party intended to use this software for their 22nd February caucus. Release planning should include both planned dates and required dates. For example:

  • Alpha release for internal testing. Plan: 6 December. Must: 13 December.
  • Beta release, sent for wider external testing. Plan: 3 January. Must: 10 January.
  • Final release, sent to Apple and Google app stores. Plan: 13 January. Must: 20 January.
  • Iowa Caucus: 3 February (hard).

Such a release plan would have given the respective app stores at least two weeks to approve the app for distribution.

Alternatively, if the goal was to avoid deployment to the general app stores of the mobile platforms, they could have used “business-internal” deployment solutions: Apple offers Apple Business Manager, and Google offers Managed Google Play. Both services are included with their respective developer subscriptions, so there is no additional cost to the development organisation.

Lack of security processes

Authentication control is important in all software, but especially so in election software. This team demonstrated to me a lack of understanding of proper security processes by providing the PIN on the same sheet of paper that would be used on the night of the election for vote tallying.

I would have had the PIN sent to the precinct workers via email, or on a separate sheet they could keep in their wallet. Ideally, initial login and authentication would have taken place on the device before release, with the credentials stored in the secure portion of device storage (the Secure Enclave on iPhone, TrustZone on Android). Even if that were not possible, it was still possible to provide the PIN to users in a more secure manner.
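
To make the secure-storage point concrete, here is a hedged sketch of what storing the PIN could look like on iOS, using the Keychain (which is backed by the Secure Enclave on modern hardware). The service name and helper function are placeholders of mine, not anything from the actual app:

    import Foundation
    import Security

    // Hypothetical helper: store a precinct's PIN in the iOS Keychain so it
    // never has to travel on the same sheet of paper as the tally forms.
    func storePrecinctPIN(_ pin: String, precinctID: String) -> Bool {
        let base: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrService as String: "org.example.caucus-app", // placeholder
            kSecAttrAccount as String: precinctID
        ]
        // Remove any stale entry, then add the new one. The accessibility
        // class keeps the PIN on this device only, readable only while the
        // device is unlocked.
        SecItemDelete(base as CFDictionary)
        var attributes = base
        attributes[kSecValueData as String] = Data(pin.utf8)
        attributes[kSecAttrAccessible as String] =
            kSecAttrAccessibleWhenUnlockedThisDeviceOnly
        return SecItemAdd(attributes as CFDictionary, nil) == errSecSuccess
    }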

Apparent lack of clearly defined specification

I have a sneaking suspicion that this combination of failings mirrors that of the many other development organisations that refuse to apply the discipline of engineering to their software projects. They are encouraged by bad stewards of engineering to “Move Fast and Break Things”. They are told by snake-oil peddlers of “process improvement” that formal specification and testing are unnecessary burdens. This must change.

I’m not alone in this call. Even the venture capital section of Harvard Business Review admits that this development culture is irresponsible and outdated. Software developers and project managers must be willing to #Disrupt the current industry norm: Move Moderately and Fix Things.