A few days ago, I was talking to an old friend of mine about that weird project I've been working on for the last few months.

Usually, people tend to just be bewildered by it.1 He wasn't. His response cut to something that I've been struggling with more or less since I got started, but which nobody else had actually raised (to me) so far:

I like the idea of it

Don't take this the wrong way though there's no fucking way I'm using a custom mobile device keyboard that someone recommends to me lol

There's so so many ways that could be abused haha

Like I trust you and I see it's actually by you but I'ma be real

If I wanted to get access to someone's everything this is exactly how I'd do it haha

If I had a non-critical device without my accounts on it, I'd probably use it

But like, even the thought of you potentially being able to see what I type is terrifying haha

I'm aware of those issues. Keyboards see a lot of sensitive shit! I'm just some weird person on the internet, why should anyone trust me?!

I try

I try to do what I can. I want to, at the very least, limit the damage that I can do:

  • All the source code is public for anyone to read (or modify).
  • It doesn't request any permissions. It literally has no internet access.
  • Even local tracking is optional and disabled by default.
  • All builds are reproduced by a third party, to ensure that they actually match the source code I publish.
  • It only uses a few hand-picked libraries from a few well-known sources, in order to limit the supply chain risk.2
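
To make that last point concrete: the whole dependency surface fits in a handful of lines. The sketch below is illustrative only, written in Gradle's Kotlin DSL; the artifact coordinates and the fork's module name are placeholders for this post, not the project's actual build file (footnote 2 lists the real sources).

    // build.gradle.kts (illustrative sketch; artifact names and versions are
    // placeholders, not the real dependency list)
    dependencies {
        // Jetpack libraries, published by Google
        implementation("androidx.core:core-ktx:1.13.1")
        implementation("androidx.appcompat:appcompat:1.7.0")

        // The Kotlin standard library, published by JetBrains
        implementation("org.jetbrains.kotlin:kotlin-stdlib:2.0.0")

        // The Material Components fork, built from source as a local module
        // rather than pulled in as a prebuilt binary
        implementation(project(":material-components-fork"))
    }

A short, boring list like that is the point: the fewer places code can come from, the fewer places an attack can come from.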

It's not enough

And yet. None of that really means anything to someone who isn't already super invested in not just the platform, but how that platform works. And even then, well, are you going to read the source code for every app you use? For every new version of it?3

And while I'd like to say that permission restrictions should keep you safe... Android just doesn't consider network access sensitive anymore. It was specifically hidden from the permission screens about a decade ago. I could publish a new release tomorrow that requests the permission, and you would likely be none the wiser.
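
For context, INTERNET is a "normal"-protection-level permission: it's granted silently at install time, with no prompt you could refuse. The only way you'd notice a new release asking for it is to go check yourself. A minimal Kotlin sketch of what that check looks like (the function name, and the idea of anyone actually running it, are mine, not something the app ships):

    import android.Manifest
    import android.content.Context
    import android.content.pm.PackageManager

    // Ask the system whether an installed app holds the INTERNET permission.
    // Because INTERNET is a "normal" permission, it is granted automatically
    // at install time; there is no runtime prompt for the user to refuse.
    fun holdsInternetPermission(context: Context, packageName: String): Boolean =
        context.packageManager.checkPermission(
            Manifest.permission.INTERNET,
            packageName
        ) == PackageManager.PERMISSION_GRANTED

Nobody runs a check like that after every update of every app they have installed, which is rather the point.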

And even if you don't have the permission... you can still sneak out information as long as another app, which does, lets you borrow its keys for a moment. Security by compartmentalization alone can't work on a platform where the default is that any app can collaborate with another app as long as they both agree.
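
"Borrowing another app's keys" can be as mundane as ordinary inter-app messaging: the permission-less app hands the data over, and a cooperating app with network access does the actual uploading. Below is a deliberately simplified Kotlin sketch of that pattern; the package name, action string, and receiver are all invented for the example.

    import android.content.BroadcastReceiver
    import android.content.Context
    import android.content.Intent

    // In the app WITHOUT the INTERNET permission: hand captured text to a
    // cooperating app via an explicit broadcast. Sending an Intent requires
    // no permission at all.
    fun leakViaAccomplice(context: Context, capturedText: String) {
        val intent = Intent("com.example.accomplice.UPLOAD").apply {
            setPackage("com.example.accomplice") // explicit target: the accomplice
            putExtra("payload", capturedText)
        }
        context.sendBroadcast(intent)
    }

    // In the accomplice app, which DOES hold android.permission.INTERNET:
    // receive the payload and ship it off over the network.
    class UploadReceiver : BroadcastReceiver() {
        override fun onReceive(context: Context, intent: Intent) {
            val payload = intent.getStringExtra("payload") ?: return
            // ...POST the payload somewhere; this app has network access...
        }
    }

The same handoff works over bound services, content providers, shared files, or the clipboard; a broadcast is just the shortest version to write down.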

At the end of the day, the rest of that list is similarly circular. "You should trust my privacy policy because I pinky promise not to break it." "You should trust Izzy's rebuilds because I say you should trust them." And so on.

If you start from a position of "Who the fuck is she?" (as you should!) then everything else falls apart.

What about other vendors?

Subjectively, commercial vendors don't even tend to try to compete on the "trust axes" I'm talking about here.

Almost universally, you have no access to the source code. The internals are considered a trade secret, not something you owe to the people who rely on you.

They tend to include flashy features like GIF search that require internet access to function. They have features like auto-correct or predictive typing that are advertised as learning from your typing behaviour! Hell, they often bundle proprietary third-party libraries whose internals even they don't fully know.

In this world, any attempt at "technical mitigation" is completely meaningless. Any technical limitations we can think of end up having perfectly reasonable-sounding excuses for why they must be bypassed.

The attitude here is almost the opposite: "You should trust us because you trust us!" Any assurances beyond that become irrelevant. So do any breaches of that trust.

In theory, it kind of makes sense. Google and Microsoft have reputations to uphold, right? Surely, if anything bad were to come out, the backlash would kill their companies! Right?

In practice... backlashes tend to be easy to ignore if the alternative is profitable enough. Well, to a point, anyway.

And yet

Mobile phones are also kind of unique in even trying to provide some sort of sandboxing regime. On a desktop computer, generally, any application you run, any game you download can just access (almost) whatever it wants to.4

And we don't really think about it. It's just the default that everything gets. I don't think it's bad that the access exists, but it probably shouldn't be the default that we give everything.

It's almost as if trying to address the problem triggers some sort of backlash. As if acknowledging the problem draws attention to it, and makes it look like it applies especially to you. Recursively.

I don't write this as an attempt at rules lawyering. As in, "Clearly, I tick the boxes, and so I'm entitled to that level of trust!"

I think we should all share the attitude my friend expressed. Towards all software. Or really, anywhere that involves a similar kind of power imbalance.

I just wish that it was practical.

I just wish that I knew a path towards making it practical.

1. "How do you even learn that?" "You're spending all this time just to save $12/year?"

2. Currently: Android itself and its build tools, their Jetpack libraries, the Kotlin standard library, and a custom fork of the Material Components for Android. We can quibble all day about how trustworthy Google themselves are or aren't, but at the end of the day they already own your root.

3. And even then, well, did you really understand it?

4. Some even turn it into a game mechanic. (Spoilers for Inscryption.)