Archive for July 24th, 2018

China’s Social Credit System: Using Data Science to Rebuild Trust in Chinese Society by Jason Hunsberger

From 1966 to 1976, Mao Zedong and the Chinese Communist Party (CCP) waged an ideological war on their own citizens. Seeking to purge the country of “bourgeois” and “insufficiently revolutionary” elements, Mao closed the schools and set an army of high school and university students loose on the populace. The ensuing cultural strife turned neighbor against neighbor and young against old, and destroyed families. Hundreds of thousands died.

Forty years later, China is still trying to recover from the social damage. Facing widespread government and corporate corruption and a pervasive lack of respect for the rule of law, the national party is seeking to transform Chinese society to be more “sincere,” “moral,” and “trustworthy.” The means by which the CCP seeks to do this is a nationwide social credit system. Formally launched in 2014 after two decades of research and development, the system’s stated goal is to:

“allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step.”
NOTE: “Under heaven” is a rough translation of one of the historical names for China.


Image 1: Goals of China’s social credit system as found in the Chinese news media (Source: MERICS)

By 2020, China hopes to have the system fully deployed across the entire country.

How will this nationwide social credit system help the nation rebuild trust in its public and private institutions and its people? By using data science to analyze all aspects of public life and distill them into a social credit score that represents a citizen’s or a company’s contribution to society. If your actions are a detriment to society, your social credit score goes down. If your actions are beneficial to society, your score goes up. Depending on your score, you are either restricted from aspects of society or granted access to certain benefits.
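To make that mechanic concrete, here is a toy sketch of such a scoring loop. Every action name, point value, and threshold below is invented purely for illustration; the real system’s rules are not public.

```python
# Hypothetical illustration of a social-credit-style score: it rises
# with "beneficial" actions, falls with "detrimental" ones, and gates
# access to services at arbitrary thresholds. All values are invented.

ACTION_POINTS = {
    "paid_bill_on_time": +5,
    "volunteered": +10,
    "missed_debt_payment": -20,
    "traffic_violation": -10,
}

def update_score(score, actions):
    """Apply a sequence of recorded actions to a citizen's score."""
    for action in actions:
        score += ACTION_POINTS.get(action, 0)
    return score

def allowed_services(score):
    """Map a score to (hypothetical) privileges or restrictions."""
    services = []
    if score >= 100:
        services.append("deposit-free apartment rental")
    if score >= 50:
        services.append("high-speed rail tickets")
    if score < 50:
        services.append("blacklist: flight purchases blocked")
    return services

score = update_score(100, ["paid_bill_on_time",
                           "missed_debt_payment",
                           "traffic_violation"])
print(score, allowed_services(score))  # → 75 ['high-speed rail tickets']
```

The unsettling part is not the arithmetic, which is trivial, but that a single opaque number like this one decides access to housing, travel, and credit.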

The social credit system is deeply integrated into local government databases and large technology companies. It collects data from automatic facial recognition cameras deployed at city intersections, video game systems, search engines, financial systems, social media, political and religious groups you engage in, and more. With all this data, the social credit system creates a universal social credit score which can be integrated into all aspects of Chinese life.

This is no mere pipe dream: there are currently 36 pilot programs deployed in some of China’s largest cities.


Image 2: A map of the social credit pilot cities in China (Source: MERICS)

Here are some examples of the social credit system at work, collected from these pilot programs:

  • you wake up in the morning to find your face, name, and home address on a billboard in your local neighborhood announcing that you have a negative impact on society.
  • the ringtone on your phone is automatically changed so that all callers know that you have not paid a bill.
  • your girlfriend/boyfriend dumps you because your social credit score is integrated into your dating app.


Image 3: A cartoon on Credit China’s website on the impact of the social credit system on one’s dating life (Source: MERICS)

  • your company is restricted from public procurement systems and access to government land
  • you fly to a destination for business, but are unable to take your return flight because your social credit score decreased
  • you work for a business that gets punished for unscrupulous behavior and you are unable to move to another company because you are held responsible for the company’s behavior.
  • your company is blocked from having access to various government subsidies
  • you are able to rent apartments with no deposit, or rent bikes for 1.5 hours for free, while others have to pay extra.
  • you try to get away for that long awaited vacation only to find that you cannot fly or take the high-speed train to any of your desired destinations.
  • your company may not be allowed to participate on social media platforms
  • your academic performance, including whether you cheated on an exam or plagiarized a paper, can affect your social credit score

If that is not enough, anyone who runs afoul of the credit system will have their personal information added to a blacklist which is published in a searchable public database and on major news websites. This public shaming is considered a feature rather than a flaw of the system.


Image 4: A public billboard in Rongcheng showing citizens being shamed as a result of their social credit scores (SOURCE: Foreign Policy)

Obviously, the entire social credit system raises many issues. China is making a big bet that the citizenry will view this data collection, analysis, and scoring positively. To help, the Chinese government and national media have been actively promoting “big data-driven technological monitoring as providing objective, irrefutable measures of reality.” This approach ignores the many well-documented issues in information systems regarding the bias of categories and of the algorithms used to analyze the data these systems contain. It also fails to address the problem of erroneous data falsely rendering damaging reputational judgments against people.

But putting aside whether or not these systems can reliably measure what they seek to measure: does creating a vast technological data collection system, one deeply integrated into all aspects of people’s lives, used to calculate a single “trustworthiness” score, and publicized when that score falls too low, sound like an action intended to build trust among people? On its face, it does not. It sounds more like a system built to control people. And systems built to control people, at their heart, do not trust the people they are trying to control. For if the people could be trusted to make decisions that were good for society, why would such a system be needed in the first place? So, fundamentally, can the CCP build trust with its citizens by taking an action that loudly tells them they are not trusted? Will a system built on distrust foster trust among the populace? Or will it signal to the entire populace that their fellow citizens, neighbors, friends, and family members might not be trustworthy? Instead of promoting trust within society, it is very possible that China’s social credit system will actually further erode it.

References

Ohlberg, Mareike, Shazeda Ahmed, and Bertram Lang. “Central Planning, Local Experiments: The Complex Implementation of China’s Social Credit System.” Mercator Institute for China Studies, December 12, 2017: https://www.merics.org/sites/default/files/2017-12/171212_China_Monitor_43_Social_Credit_System_Implementation.pdf

Mistreanu, Simina. “Life Inside China’s Social Credit Laboratory.” Foreign Policy, April 3, 2018: https://foreignpolicy.com/2018/04/03/life-inside-chinas-social-credit-laboratory/

Greenfield, Adam. “China’s Dystopian Tech Could Be Contagious.” The Atlantic, February 14, 2018: https://www.theatlantic.com/technology/archive/2018/02/chinas-dangerous-dream-of-urban-control/553097/

Bad Blood: Trusting Numbers with a Grain of Salt by Amy Lai

Digital health may be well on its way toward becoming the next “it” trend in technology. Over the past few years, the presence of consumer health technology companies has boomed. In 2010, digital health companies received roughly $1 billion in total investment funding, a less than hefty amount compared to other sectors (1). Fast-forward just six years, however, and that investment had jumped by more than 700%. That’s right. In 2016, digital health companies received nearly $8.1 billion in investment funding (1), with significant investments in wearable and biosensing technology (2)—a move that perhaps echoes the increasing promise of digital healthcare.


Health investment categories

Indeed, the time seems ripe for a long-overdue revolution of traditional healthcare. With an ever-growing pool of data about our lifestyles captured through our smartphones, social media accounts, and even online shopping preferences, coupled with rapid advances in computing power and recommendation systems, it seems like technology is at the cusp of transforming how we think, perceive, and quantify our health. And we’re just starting to see its effects…and consequences.

Fitness trackers such as Fitbit and health-tracking apps like Apple HealthKit quantify an impressive range of our physical health. From our weight to the number of steps we take, the flights of stairs we climb, the calories we burn, and the duration and quality of our sleep, it appears there are increasingly more tools to track nearly every aspect of our lives (3). Anyone else sleep for 7 hours, 18 minutes last night? As you curiously scroll through the colorful line graphs and bar charts that show your activity levels, have you ever wondered whether you can fully trust these metrics? How accurate are the numbers?

If a fitness app recorded that you burned 100 calories when you actually burned 90, how upset would you be? Probably not too upset, because mistakes happen. However, if you learned that a medical device determined that you had diabetes when you really didn’t, how distraught would you be now? Most likely more than a little distraught. Notice the difference? Depending on context, it appears that consumers have different expectations of health-related product efficacy and tend to place greater trust in certain types of products, such as medical devices. Although somewhat anticlimactic, results from medical devices should warrant some skepticism, as these devices can (and do) go wrong…and in some cases, very wrong.

Founded in 2003, Theranos was touted as a revolutionary breakthrough in the blood-testing market. The company reinvented how blood-testing worked by introducing a “proprietary technology” that purportedly could detect numerous medical conditions, from high cholesterol to cancer, using a finger pinprick that needed only 1/100 to 1/1,000 of the amount of blood required by standard blood-testing procedures (4). Theranos seemed unstoppable. Valued at $10 billion, the company raised more than $700 million in venture capital and partnered with national pharmacy chains including Walgreens and Safeway to open testing clinics for consumers and patients (4). However, the company quickly unraveled as its product turned out to be nothing more than a facade. After probing by the US Food and Drug Administration, Securities and Exchange Commission, and Centers for Medicare and Medicaid Services, the “proprietary technology” was found to be underdeveloped and inaccurate, reporting blood-test results with marked error (4). Some consumers worried about supposed new tumors while others celebrated allegedly improved cardiac health by stopping their medications (5). Theranos fooled us, and we (just might have) helped them do it.


Theranos

Theranos teaches us a subtle yet important lesson about privacy as contextual integrity. Because consumers don’t often question the efficacy of health-related products, it behooves corporate executives to scientifically and ethically validate their products. Such integrity should play a key role in organizational culture and be embedded at all management levels to keep business leaders in check and minimize consumer harm. Doing so helps prevent violations of consumer expectations and gives consumers a reason to keep placing their trust in products. However, health-related products are not perfect or infallible. Because products inevitably have some margin of error, it also behooves consumers to understand that product metrics may not represent the whole truth and nothing but the truth. Those numbers aren’t likely to be wholly correct. It’s essential that we adopt a more realistic set of expectations about health-related products, as well as a healthier level of skepticism, the next time we’re told we burned only 10 calories or that only a few droplets of blood are needed to detect cancer.

These shifts in the mindset and expectations of businesses and consumers may be needed to help keep both sides accountable to each other.

References:
1. https://www.forbes.com/sites/forbestechcouncil/2017/05/05/why-digital-health-startups-have-yet-to-reach-unicorn-status/#3b5f23188cdb
2. https://rockhealth.com/reports/q1-2017-business-as-usual-for-digital-health/
3. https://www.nytimes.com/2017/12/26/technology/big-tech-health-care.html
4. https://www.vanityfair.com/news/2016/09/elizabeth-holmes-theranos-exclusive
5. https://www.wsj.com/articles/the-patients-hurt-by-theranos-1476973026

Social Credit: a Chinese experiment by Yang Yang Qian

Imagine applying for a loan, but first the bank must check your Facebook profile for a credit report. As odd as it feels for consumers in the United States, for consumers in China, this is already part of an experiment with social credit.

The Chinese government has announced plans to implement a Social Credit System by 2020: a big data approach to regulating the behavior of individuals, companies, and other institutions such as NGOs. Essentially, under the Social Credit System, a company or individual would be given a set of ratings summarizing how well-behaved they are in various categories, such as their propensity for major credit offenses. The platform is intended to aggregate a huge amount of data about companies and individuals, including general information, compliance records, and even real-time data where possible. Eventually, the system will span both government data sources and commercial ones. If the platform can be implemented successfully, it should strengthen the Chinese government’s ability to enforce regulations and policies. For now, the system is not yet in place. Instead, the government has licensed private companies and some municipal governments to build their own social credit systems as pilot programs. One of the higher-profile projects is Alibaba’s Sesame Credit.

As individual consumers in the United States, many of us are used to having personal credit scores. The Social Credit System, however, looks to be much more comprehensive. One key difference is that its scope is intended to cover all “market participants”: both individuals and companies are subject to it. For instance, some of the more ambitious objectives aim to track polluting companies through their real-time emissions records. Moreover, the stated focus of the system is to promote best practices in the marketplace. Proponents argue that such a system will help China overcome a multitude of societal ills: food safety scandals, public corruption, and tax evasion.

But on the other side of the coin, there are fears that such a system could be used as a mass disciplinary machine targeted at the citizenry. A good rating might allow users to borrow favorably on credit or find a good deal through Alibaba’s hotel partners. A bad rating might bar them from traveling; for instance, nine million low-score users were reportedly barred from buying domestic plane tickets. With these risks of material harm in mind, some have voiced fears that certain activities might be promoted or punished, a sort of subtle social coercion. Part of the problem is that Alibaba isn’t too clear about which specific actions will be punished. On the one hand, it has released some high-level descriptions of the categories it scores: credit history, online behavior, ability to fulfill contracts, personal profile completeness, and interpersonal relationships. On the other hand, the BBC reported that Sesame Credit makes no secret that it will punish specific online behaviors:

“Someone who plays video games for 10 hours a day, for example, would be considered an idle person, and someone who frequently buys diapers would be considered as probably a parent, who on balance is more likely to have a sense of responsibility,” Li Yingyun, Sesame’s technology director told Caixin, a Chinese magazine, in February.

Perhaps Sesame Credit used this merely as an evocative example, or perhaps it was meant in all earnestness. In any case, the fact that a large private conglomerate, with encouragement from a government, is essentially piloting an opaque algorithm to enforce good behavior did not sit well with some human rights watch groups. And rather alarmingly, some of the current scoring systems supposedly also adjust an individual’s score based on the behavior of their social circle. This might encourage the use of social pressure to turn discontents into compliant citizens. Are we looking at the prototype for a future government social credit system that will leverage social pressure for mass citizen surveillance? Some sort of Scarlet Letter meets Orwellian dystopia?

Wait. There is probably too much alarmist speculation about the Social Credit System in Western media right now. As usual, there is a lot of nuance and context surrounding this experiment. After all, the large central system envisioned by Beijing is not yet implemented. The social credit platforms that do exist are separate pilots run either by local municipal governments or by private companies like Alibaba and Tencent. We should also keep in mind that the current Sesame Credit system, along with its peculiarities, is designed to reward loyal Alipay users rather than to measure some abstract “citizen trustworthiness.” In Chinese media, citizens generally seem to see the need for a social credit system. Additionally, there is an active media discussion within China about specific concerns, such as the risk of privacy invasions by the companies that host the data, or opinions on what kinds of data should be used to calculate the scores. It remains to be seen whether the central government system will adopt any of the features of these pilot programs, and how much leeway it will allow those companies to continue this experiment.
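The category-based scoring described above can be sketched as a simple weighted sum. The five category names come from Sesame Credit’s public descriptions, and the 350–950 output range mirrors press reporting, but the actual formula is proprietary; the weights below are entirely invented for illustration.

```python
# Hypothetical sketch of a Sesame-Credit-style weighted category score.
# Category names follow Alibaba's published high-level descriptions;
# the weights and the mapping to a 350-950 range are assumptions.

WEIGHTS = {
    "credit_history": 0.35,
    "ability_to_fulfill_contracts": 0.25,
    "personal_profile_completeness": 0.15,
    "online_behavior": 0.15,
    "interpersonal_relationships": 0.10,
}

def sesame_style_score(subscores):
    """Combine per-category subscores (each 0.0-1.0) into a 350-950 score."""
    weighted = sum(WEIGHTS[cat] * subscores[cat] for cat in WEIGHTS)
    return round(350 + weighted * (950 - 350))

perfect = {cat: 1.0 for cat in WEIGHTS}
print(sesame_style_score(perfect))  # → 950
```

Even this toy version makes the opacity concern tangible: a user who sees only the final number cannot tell whether it dropped because of a missed payment or because of what a friend did.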