Chinese authoritarianism and Western transparency

In a recent EconTalk episode, Russ Roberts and Amy Webb discussed how Artificial Intelligence (AI) may be the tool authoritarian regimes (and China in particular) have always needed to fully implement the sort of widespread social control they have long sought.

I found some of Webb’s arguments unpersuasive. For example, she considers it an _advantage_ that big companies in China operate on the government’s marching orders, since this avoids duplication of effort. Russ replies that, well, competition, with its trials and errors, hasn’t been that bad at fostering innovation in the past. Webb tells Russ she agrees, but it’s not quite clear to me that she does. Like many futurists, she appears rather oblivious to the fact that what we need is not a system that provides us with the “right” vision of the future, but one with proper incentives to correct our mistakes in due course.

That said, the sort of Orwellian world Webb sketches is certainly worrying. But if she blames this nightmare scenario partly on the West and the US, it is because Western countries and the US in particular do not “have norms and standards… we don’t have any kind of agreed-upon ideas for who and what to optimize for, under what circumstances. Or even what data sets to use… And here in the United States, a lot of these companies have obfuscated when and how they are using our data”.

Whatever the merits of this discussion, one important piece is missing: namely, how the regulations we already have, as opposed to the possible regulations we lack, are helping this Orwellian nightmare come into being. This is a point that economist and banker Antonio Foglia addresses in his recent column, “Western Transparency Is Fueling Chinese Repression”.

Foglia shares Webb’s concern about contemporary developments in China. He sees the centralised database that the Chinese Communist Party is developing to “underpin a ‘social credit system’ that will monitor the behaviour of all Chinese” as no joke, because the government’s “use of artificial intelligence and social media gives it a level of control over its subjects that no dictatorship has ever achieved”. The most immediate target of such initiatives, however, is, Foglia maintains, the “$1 trillion in undeclared assets abroad” owned by Chinese residents. These undeclared assets were amassed because “the Chinese know well that whatever they might gain through their industriousness is essentially only a temporary, revocable loan from the CPC”.

The theme of Foglia’s article is how rules providing for the automatic exchange of personal and corporate financial information are actually empowering whatever technology China is employing to conduct a witch hunt aimed at the rich and their resources. These weapons were “armed by the Paris-based OECD” (though the US led the way with FATCA), which did not understand that “tax avoidance was rarely the main motivation for individuals hiding money abroad. Most wanted to diversify their risks by hiding some of their wealth in safer jurisdictions. Privacy is crucial for those who live under capricious regimes or are potentially exposed to kidnapping or extortion”.

I think Foglia’s message is of the utmost importance: before discussing the rules we would need to preserve privacy, we should understand how privacy is harmed by the rules we already have. Of course, this is a difficult debate: most of the people who speak about these subjects tend to be tremendously worried about people voluntarily sharing pictures of their dogs on Facebook, or being monitored by Amazon so that they can order toilet paper one roll before they run out. But they seem to have little concern about governments knowing everything about our expenditures and bank accounts.