
On public blockchains, privacy is not an intrinsic attribute but an unstable outcome shaped by incentives, usage patterns, and systemic observability.
Privacy, in the vocabulary of cryptocurrencies, has never been an absolute value. It has always been an emergent property: dependent on user behavior, protocol design choices, and — most importantly — on how transactions are observed. The idea that a public blockchain could guarantee anonymity was only plausible in an early phase, when data volumes were limited and analytical tools still primitive. That phase is now clearly over.
The research presented in the thesis by Haaroon M. Yousaf, defended at University College London, starts precisely from this point. It does not ask whether cryptocurrencies are private in theory, but to what extent they remain private in practice, once embedded in a real ecosystem of exchanges, incentives, and repetitive habits. The shift is methodological before being technical: from abstract guarantees to empirical measurement.
Bitcoin has never been anonymous. It is pseudonymous. Addresses do not carry names, but the links between addresses are often sufficient to reconstruct functional identities. Clustering heuristics — most notably the multi-input heuristic — have shown for more than a decade that ledger transparency is not an architectural footnote, but an attack surface. Privacy is not eroded despite the blockchain, but through it.
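To make the mechanics concrete, here is a minimal sketch of the multi-input (common-spend) heuristic: every address that co-signs the inputs of a transaction is merged into a single cluster via union-find. The transaction layout (a list of input-address sets) is an illustrative assumption, not the output of any real parser.

```python
from collections import defaultdict

class UnionFind:
    """Disjoint-set structure: addresses spent together end up in one cluster."""
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

def cluster_by_common_spend(transactions):
    """Multi-input heuristic: assume all input addresses of a transaction
    are controlled by the same entity, and merge them transitively."""
    uf = UnionFind()
    for tx in transactions:
        inputs = tx["inputs"]
        for addr in inputs:
            uf.find(addr)                  # register even lone inputs
        for addr in inputs[1:]:
            uf.union(inputs[0], addr)
    clusters = defaultdict(set)
    for addr in uf.parent:
        clusters[uf.find(addr)].add(addr)
    return list(clusters.values())

# Toy ledger: sharing address "B" across two transactions merges A, B, C.
txs = [{"inputs": ["A", "B"]}, {"inputs": ["B", "C"]}, {"inputs": ["D"]}]
print(cluster_by_common_spend(txs))       # two clusters: {A, B, C} and {D}
```

The transitivity is the point: one shared input anywhere in the history is enough to merge two otherwise unrelated address groups, which is why clusters grow so quickly on a transparent ledger.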
This is where the race toward so-called privacy coins begins. Zcash, Dash, and Monero aim to sever the link between inputs and outputs. Zcash, in particular, introduces a shielded pool based on zero-knowledge proofs, designed to hide sender, recipient, and amount. The cryptographic promise is strong. Actual usage, far less so.
Empirical analysis shows that anonymity sets are often too small. Repetitive patterns — deposits, withdrawals, timing, amounts — drastically reduce theoretical protection. Here privacy does not fail because of a cryptographic flaw, but because of a combination of economic incentives and predictable operational practices. This distinction matters: cryptography can work perfectly and still produce fragile privacy.
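A sketch of how this plays out, assuming a simplified record layout: a shielded-pool withdrawal is linkable when only one earlier deposit matches it in amount within a plausible time window. The threshold values below are placeholders, not figures from the thesis.

```python
def candidate_links(deposits, withdrawals, max_delay=3600, tolerance=0.0001):
    """For each withdrawal, collect deposits of a matching amount that
    occurred shortly before it. A candidate set of size 1 means the
    effective anonymity set has collapsed to a single deposit."""
    links = []
    for w in withdrawals:
        candidates = [
            d for d in deposits
            if 0 < w["time"] - d["time"] <= max_delay
            and abs(w["amount"] - d["amount"]) <= tolerance
        ]
        links.append((w, candidates))
    return links

# Toy data: a distinctive round amount withdrawn soon after deposit
# is trivially linkable, whatever the cryptography in between.
deposits    = [{"time": 100, "amount": 250.0}, {"time": 120, "amount": 1.3271}]
withdrawals = [{"time": 900, "amount": 250.0}]
for w, cands in candidate_links(deposits, withdrawals):
    print(w["amount"], "-> anonymity set size:", len(cands))   # 250.0 -> 1
```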
The problem becomes structural once transactions no longer remain confined to a single chain. Cross-chain exchange services such as ShapeShift introduce a new correlation surface. Even without KYC, moving from one asset to another generates temporal and quantitative signatures that allow activity to be linked across different ledgers. Privacy, in this sense, is not a local property. It is systemic.
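The same logic extends across ledgers. A hedged sketch, assuming access to transaction lists from both chains and a quoted exchange rate: pair a send on chain A with receives on chain B that match the rate within a fee margin and a short delay. Rate, window, and slack are illustrative assumptions, not parameters taken from the analysis of ShapeShift.

```python
def link_cross_chain(sends_a, receives_b, rate, window=1800, slack=0.02):
    """Pair a send on chain A with receives on chain B whose value matches
    the quoted exchange rate (within `slack`) inside a time `window`.
    No KYC data is needed: timing and amounts alone do the linking."""
    pairs = []
    for s in sends_a:
        expected = s["amount"] * rate
        for r in receives_b:
            in_window = 0 <= r["time"] - s["time"] <= window
            close_amt = abs(r["amount"] - expected) / expected <= slack
            if in_window and close_amt:
                pairs.append((s, r))
    return pairs

# Toy example: 2.0 units sent on chain A, ~60 received minutes later
# on chain B at a quoted rate of 30.0 (minus fees).
sends    = [{"time": 1000, "amount": 2.0}]
receives = [{"time": 1400, "amount": 59.4}, {"time": 5000, "amount": 60.0}]
print(link_cross_chain(sends, receives, rate=30.0))   # one matching pair
```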
A functional analogy helps clarify the point: improving privacy on a single blockchain is like installing reinforced doors in a building whose side entrances remain unguarded. The issue is not the strength of the door, but the overall access map. If the exit point is observable, internal opacity matters little.
This capacity for observation is not neutral. On one side, it enables legitimate investigations: tracking stolen funds, analyzing ransomware flows, reconstructing illicit activity. On the other, it challenges a widespread industry narrative according to which “decentralized” implies “untraceable.” The thesis shows that, in practice, the opposite is often true: the more open the system, the more measurable it becomes.
The case study on Forsage makes this tension explicit. Here privacy is not the central issue. Intentional opacity is. Smart contracts presented as immutable and trustless are used to mask a classic pyramid structure. On-chain analysis reveals that the vast majority of participants lose money, while a small minority captures value. The blockchain, far from protecting users, becomes the public ledger of a regressive redistribution.
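The accounting behind such a finding is simple in principle. A sketch, assuming a flat list of value transfers to and from the contract (the record format and figures are hypothetical): net each address's inflows against its outflows and count who ends up negative.

```python
from collections import defaultdict

def net_positions(transfers, contract):
    """Net profit or loss per address from value transfers to and from a
    pyramid-style contract. `transfers` layout is an assumed schema:
    dicts with 'src', 'dst', 'value'."""
    net = defaultdict(float)
    for t in transfers:
        if t["dst"] == contract:
            net[t["src"]] -= t["value"]   # money paid into the scheme
        elif t["src"] == contract:
            net[t["dst"]] += t["value"]   # money paid out of the scheme
    return net

# Toy flows: two late entrants fund one early participant's payout.
transfers = [
    {"src": "alice",    "dst": "contract", "value": 1.0},
    {"src": "bob",      "dst": "contract", "value": 1.0},
    {"src": "contract", "dst": "early",    "value": 1.8},
]
net = net_positions(transfers, "contract")
losers = sum(1 for v in net.values() if v < 0)
print(dict(net), "| losing addresses:", losers, "of", len(net))
```

Scaled up to the full transfer history, the same arithmetic is what exposes the regressive shape of the scheme: the transparency that fails to protect individual privacy is exactly what makes the redistribution measurable.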
What connects these cases is not criminality, but informational asymmetry. Those who understand how analytical heuristics work, how accounting models behave, and how transaction patterns emerge hold a structural advantage. Those who rely on the abstract promise of the technology do not. In this sense, privacy in cryptocurrencies is not a feature to be switched on, but a competence to be maintained.
This leaves an open question. If even systems explicitly designed to hide information fail once used at scale, what space remains for sustainable privacy on public ledgers? Perhaps it lies not in removing transparency, but in understanding its systemic effects. It is not a definitive answer. But it is the point from which the discussion must restart.

