The leaky abstraction of distrust
Aslan Askarov
2024-10-04
In August 2024, the Telegram app was in the news. Amongst all the discussions, one particular narrative stood unchallenged; it is worth revisiting now that the dust of the news cycle has settled. It went along these lines:
- I don’t trust Telegram for privacy, but I use it as a convenient news aggregator.
- I don’t trust Telegram for privacy, but I use it as a broadcasting platform to reach out to subscribers, and only disseminate opinions that I consider public.
What’s remarkable is that such statements originated not just from regular folk but from journalists and even dissidents who, while distrustful of TG, still normalized some usage of it because “they didn’t feed it anything of value”. I could not help but notice a serious misconception here.
Put simply, from the foundational security perspective, distrust needs to be enforced.
The basic fact is that, in the context of modern app ecosystems, installing an app from a third party is akin to bringing in a classical Trojan horse. Once installed, the app gains access to a plethora of metadata available on one’s phone. This metadata includes not only explicit information, such as one’s contacts, but also real-time data: online/offline status, IP addresses (which correlate with location), geo-coordinates, and changes in the device’s battery level (which leak the user’s proximity to a power socket). Even something as public as the current time (!) leaks information about the time zone.
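To see how little it takes, here is a minimal Python sketch of the last point: using only the standard library, and without requesting any permission at all, an ordinary-looking app can read the device’s UTC offset straight off the local clock.

```python
# A "harmless" read of the local clock is enough to place the device
# in a band of longitudes: the UTC offset comes with the timestamp.
from datetime import datetime

local_now = datetime.now().astimezone()  # local time, with tz info attached
utc_offset_hours = local_now.utcoffset().total_seconds() / 3600

# Real-world offsets range from UTC-12:00 to UTC+14:00.
print(f"Device UTC offset: {utc_offset_hours:+.2f} h")
```

Similar one-liners exist on every platform for battery status, network interfaces, and locale settings. None of them look like “data of value” in isolation; combined and sampled over time, they profile the user.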
All of that information is vacuumed out of the device as part of the app’s regular functionality. The importance of enforcing distrust is a well-recognized issue in computer security research, and the state of the art is full of nuances and technical qualifiers. Even coarse-grained system isolation using virtual machines may not be enough without careful consideration of all the data sources and sinks.
For a layperson: if you have serious doubts about the provenance of an app, do not have it anywhere near you – not on your phone, not on your other phone, not on your friends’ phones. The OPSEC of confining such apps requires the meticulousness of a forensic security laboratory.
Once your data crosses the boundary of the device, the discussion of what happens to it also crosses the boundary of disciplines: from science and engineering to legislation and regulation. If the receiving party has no intent of following a regulation, there is no way to undo the leak.
P.S. As a word of extra caution: if you are considering evacuating your online presence someplace new, be mindful of where you go. Migrating apps amidst the chaos of warnings (including this one!) may lead to poor choices.