A Privacy Policy To Which Nobody Agreed

By Andrew Morris | February 24, 2020

On Monday, February 10th, Attorney General Barr announced an indictment of four members of China’s military for hacking into Equifax.

Equifax operates in a unique space: like Facebook, it holds troves of data about a significant number of people, including specific data about financial balances, transactions, payment histories, and creditworthiness. The data may not be as socially personal as Facebook’s, but it is every bit as sensitive, if not more so. Unlike Facebook, however, nobody agreed to house their data there.

Equifax doesn’t have a privacy policy as much as a marketing page about privacy[1] and a “privacy statement”[2].

In this statement, Equifax has taken care to ensure compliance with laws and best practices around data management and correction, right up until the point where sensitive data becomes involved.

It is worth noting that the California Consumer Privacy Act (CCPA) permits California residents to manage and delete their data, and a dedicated page[3] details those rights. However, when I attempted to actually exercise these rights (2/21/2020 at 7:39pm PST), the dedicated site was unresponsive to requests.

Given the scope of recent breaches (147 million US residents), it stands to reason that regulators and government agencies would address consumer rights in the United States. The FTC made a statement about the Equifax data breach recently and accompanied it with some additional information.[4] On this page, there is a telling ‘question and answer’ that the FTC provides:

Q: I don’t want Equifax to have my data. What can I do?

A: “Equifax is one of three national credit bureaus. These companies collect information about your credit history, such as how many credit cards you have, how much money you owe, and how you pay your bills. Each company creates a credit report about you, and then sells this report to businesses who are deciding whether to give you credit. You cannot opt out of this data collection. However, you can review your credit report for free and freeze your credit.”

In other words, the financial credit system is so essential to commercial operations that the FTC has decided this data collection is effectively mandatory for most of America.

This organizational system, where private organizations are responsible for infrastructure and data management for the financial system, is not unique to the United States. Some examples highlight the key differences:

  • Germany relies on a company called Schufa Holding AG. [5] However, Schufa provides consumers the right to erase, rectify, and restrict processing of personal data under the GDPR. [6]
  • Austria relies on another company called Kreditschutzverband von 1870 (KSV1870 for short; literally, Credit Protection Association of 1870), which operates a blacklist-style credit list. This type of system would seem ill-suited to granting opt-out rights, and yet KSV1870 does allow the Austrian Data Protection Authority to intervene. [7]
  • The UK uses a variety of companies. One of them is TransUnion, which maintains a specific page on the right to delete data [8]; exercising that right requires some discussion and acknowledgment of the potential consequences, but there is a process to address it.

These exceptions seem to be limited to Europe: anywhere the General Data Protection Regulation (GDPR [9]) applies, data subjects have rights. To summarize the legislation, these rights include simple terms and conditions explaining consent, timely notification of data breaches, the right to access your data, the right to be forgotten, data portability, and privacy by design. The regulation also requires appropriate technical and organizational measures to ensure a level of security commensurate with the risk.

In other words, many of the protections built into the GDPR would address both the rights of data subjects and potentially help some of the operational elements that permitted the Equifax data breach. Consumers and data subjects in the United States would benefit from either an expansion of the CCPA or GDPR to cover all residents.

Can You Trust Facebook with Your Love Life?

By Ann Yoo Abbott | February 21, 2020

If you have ever heard about the Facebook data privacy scandal or emotion experiment, you probably don’t trust Facebook with your personal data anymore. Facebook had let companies such as Spotify and Netflix read users’ private messages, and Facebook was sued for letting the political consulting firm Cambridge Analytica access data from some 87 million users in 2018. It doesn’t end there. For one week in 2012, Facebook altered the algorithms it uses to determine which status updates appeared in the News Feed of 689,003 randomly selected users (about 1 of every 2,500 Facebook users). The results of this study were published in the Proceedings of the National Academy of Sciences (PNAS).

Recently, Facebook launched its new dating service in the US. They have been advertising that privacy is important when it comes to dating, so they consulted with experts in privacy and consumer protection and embedded privacy protections into the core of Facebook Dating. They say they are committed to protecting people’s privacy within Facebook Dating so that they can create a place where people feel comfortable looking for a date and starting meaningful relationships. Let’s see if this is really the case.

Facebook Dating Privacy & Data Use Policy

Facebook Dating’s privacy policy is an extension of the main Facebook Data Policy, and includes a warning that Facebook users may learn you’re using Facebook Dating via mutual friends.

Here are some of the highlights from Facebook’s Data Policy as far as what information is collected when you create a profile and use Facebook Dating:

Content: Facebook collects the “content, communications and other information you provide” while using it, which includes what you’re saying to your Facebook Dating matches. “Our systems automatically process content and communications you and others provide to analyze context and what’s in them.”

Connections: Facebook Dating will also analyze what Facebook groups you join, who you match and interact with, and how – and analyze how those people interact with you. “…such as when others share or comment on a photo of you, send a message to you, or upload, sync, or import your contact information.”

Your Phone: Facebook Dating collects a lot of information from your phone, including the OS, hardware & software versions, battery level, signal strength, location, and nearby Wi-Fi access points, app and file names and types, what plugins you’re using and data from the cookies stored on your device. “…to better personalize the content (including ads) or features you see when you use our Products on another device…”

Your Location: To get suggested matches on Facebook Dating, you need to allow Facebook to access your location. Facebook collects and analyzes all sorts of things about where you take your phone, including where you live and where you go frequently. Even Bluetooth signals and information about nearby Wi-Fi access points, beacons, and cell towers are part of what they collect. Facebook also analyzes what location info you choose to share with Facebook friends, and what they share with you.

This list does not include everything that Facebook takes from us. Even after all this information, there is still more listed in their Data Policy. You’ll want to read it for everything Facebook discloses about what it collects and how it’s used. Do you think Facebook collecting so much of our information is justified? Don’t you think it’s excessive?

Why the Biggest Technology Companies are Getting into Finance

By Shubham Gupta | February 16, 2020

With the rise in popularity of “fintech” companies such as Paypal, Robinhood, Square, Plaid, Affirm, and Stripe, the general public is becoming more and more used to conducting monetary transactions through their phones rather than with their wallets. And of course, the biggest tech companies are trying to claim a piece of the pie as well. Apple and Amazon both offer a credit card, Google is planning to offer checking accounts to users of Google Pay, and Facebook is going so far as to create a new currency called Libra.

Of course, by expanding into these new financial products, big tech companies are growing their revenue streams and aiming to immerse users ever deeper in their ecosystems. However, one of the biggest yet most hidden advantages of offering these products is that it opens the door for big tech companies to collect financial data. For example, processing a payment through a service such as Apple Pay or Google Pay allows the company to keep a record of what the consumer bought, when they bought it, where they bought it, and how much money they spent. This information could be used to better understand consumer spending behavior and offer more targeted advertising to consumers.

Another danger of these services is security. Tech companies having access to your sensitive financial information could lead to that information being exposed in the event of a data breach or hack. In the past, both Google and Facebook have suffered numerous data breaches that exposed millions of consumers’ social media and email accounts. Additionally, because big tech companies are not held to the same level of regulation as banks and other financial institutions, they have more latitude in how they handle your finances than banks do. These worries are reflected in recent surveys asking consumers whether they trust big tech corporations with their finances. As seen in the chart below, consumer trust in tech companies handling their finances hovers around 50–60%. Facebook, probably due to its history of privacy violations, ranks the lowest with around a 35% approval rating.

That’s not to say that big tech’s contributions to the finance space are without merit. Apple’s latest credit card, along with being a sleek piece of titanium, offers users helpful charts and visualizations to keep track of their spending and eliminates hidden fees. Facebook’s Libra, despite having had many partners back out of the deal, is envisioned to use blockchain to process payments, making them cheaper, faster, and more secure. Amazon also offers solid cash back on any purchase made on Amazon.com, at Whole Foods, and at various restaurants, gas stations, and drug stores.

With tech giants offering such a wide variety of products and services, finance is only the latest frontier for big tech companies. With the surge in popularity of these products, especially among the younger generation, more and more companies are taking notice and trying to innovate in this space as well. However, as companies continue to introduce new financial products and solutions, government regulation is quick to follow.


Air Tags: Finding your Keys or your Sensitive Information?

By Chelsea Shu | February 7, 2020

With people becoming more reliant on features such as Apple’s Find My iPhone to help them find their phone or Mac laptop, many wish there were also a way to find other commonly misplaced items such as their wallet and keys. Apple may have a solution.

Apple is supposedly creating a new product called Air Tags that will track items through Bluetooth technology. The Air Tags will be small, white tags that attach to important items with an adhesive.

They have tracking chips in them that will connect them to an iPhone app called “Find My.” This will enable users to locate and track all of their lost items through an app on their iPhone. The Air Tags will also have a feature allowing a person to press a button to emit a sound from the Air Tag, allowing the user to locate their item easily.

Privacy Concerns

While this product may offer consumers an opportunity to seamlessly keep track of their items, it may be too good to be true. Apple already collects a multitude of personal data: what music we listen to, what news we read, and what apps we use most often. Introducing more items such as wallets and keys into Apple’s tracking system means Apple has increased surveillance on a person’s daily activities and their locations throughout the day. Increased surveillance means more access to a person’s personal life and possibly, sensitive information.

While Apple claims that it does not sell location information to advertisers, its privacy policy states that its Location Services function “allows third-party apps and websites to gather and use information based on current location.” This raises a concern because it enables third parties to collect this data for their own purposes and it is unclear what they will do with this data. Given this data, these third party companies can now track when Air Tag users arrive and leave places as well as monitor their tendencies.

Furthermore, there are large consequences if this data lands in the hands of a malicious person. Creating a product like Air Tags opens up the possibility for data to be shown or accessed by someone it was unintended for. This can lead to unwanted information being exposed or used against the person whose data is being tracked.

Apple also claims that it scrambles and encrypts the data it collects. But in reality, effective anonymization of data is quite difficult. While Apple claims that the location data it collects “does not personally identify you,” combining the magnitude of data that Apple has on each person could make it possible to fit the puzzle pieces together and identify a person, violating that person’s privacy and exposing their sensitive information.
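The difficulty is easy to demonstrate. As a toy sketch (the records, names, and attributes below are invented for illustration, not drawn from any real dataset), joining a stripped-of-names log against public records on a few quasi-identifiers can recover identities:

```python
# Toy example: re-identification by joining on quasi-identifiers.
# All data below is fabricated for illustration.

anonymized_log = [
    {"zip": "94110", "birth_year": 1990, "sex": "F", "visited": "clinic"},
    {"zip": "94708", "birth_year": 1985, "sex": "M", "visited": "casino"},
]

public_records = [
    {"name": "Alice", "zip": "94110", "birth_year": 1990, "sex": "F"},
    {"name": "Bob", "zip": "94708", "birth_year": 1985, "sex": "M"},
]

def reidentify(log, records):
    """Link each 'anonymous' row to a named record when the
    combination of quasi-identifiers is unique in the records."""
    keys = ("zip", "birth_year", "sex")
    matches = []
    for row in log:
        candidates = [r for r in records
                      if all(r[k] == row[k] for k in keys)]
        if len(candidates) == 1:  # unique combination -> identity recovered
            matches.append((candidates[0]["name"], row["visited"]))
    return matches

print(reidentify(anonymized_log, public_records))
```

No name ever appears in the log, yet both visits are re-attributed. This is the same mechanism behind well-known re-identification results showing that ZIP code, birth date, and sex alone uniquely identify a large fraction of the population.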

In summary, the addition of Apple Air Tags might seem like a convenient, useful idea, but what people do not realize is that opting in to such a product opens up the doors for increased data surveillance.


Amazon Go Stores

By Rudy Venguswamy | February 8, 2020

The tracking of retail shopping has always been a huge part of American consumer culture. It almost seems a given that after Black Friday, news stations will be broadcasting numbers about how this Black Friday compared in revenue to previous years. These numbers, though, are only a glimpse mainstream consumers get into the obsessive relationship retailers have with tracking customers. The amount of data retailers now push to have about consumers has grown exponentially thanks to smart technology such as computer vision, internet analytics, and customized machine learning models built to sell more to consumers.

The holy grail for retailers, with this in mind, has always been tracking not just sales but the entire sales cycle. The increase in online sales has of course made part of this job easier, but interestingly, the biggest juggernaut of online sales, Amazon, has in the past two years opened a physical store: one that in many ways harkens back to traditional shopping, but in many ways is a step closer to the coveted grail of retail, tracking everything about a consumer.

In its new Amazon Go stores, cameras decorate every corner, and machine learning plays the field, tracking each shopper’s identity (though Amazon insists without facial recognition) and their movements in the store, and linking this to their online presence, creating perhaps the greatest concentration of insight into a consumer walking through a store that the world has ever seen.

This newfound power, transcending the physical and online shopping experience, is without doubt a marvelous engineering feat, built on hundreds of millions of dollars of R&D and sophisticated matching algorithms that detect reluctance in consumers and encourage them with coupons offline and online.

This power, however, will also shift the paradigm for privacy in the real world. Most consumers expect their activities online and their interactions in the real world to stay, for the most part, separate. This new way of doing commerce means that this physical–online wall has all but evaporated, abstracted away into ML models.

Under an ethical framework of subject interaction with experimentation via machine learning, I think Amazon Go stores are minefields for unethical manipulation of consumers. Though Amazon has made off-the-cuff promises about what AI technology is “currently” allowed to operate in the store (such as no face detection), these assurances should not be reassuring: in truth, they are subject to change contingent solely on Amazon’s bottom line and engineering prowess. Simply by walking into the store, consumers are forced into a game played by algorithms whose purpose is to maximize sales. It’s already dubious when this happens online. It should be exponentially more concerning when this manipulation enters our physical world too.

In conclusion, Amazon Go stores, which track nearly every aspect of the consumer, degrade the inherent privacy wall between real life and online interaction. This is problematic because the subjects of this incursion, the consumers, are unwitting: customers don’t meaningfully consent when they walk into a store. Placing limits on artificial intelligence and its manipulation of our physical interactions with stores is critical to protecting consumers from an otherwise predatory retail practice.

Is privacy a luxury good?

By Keenan Szulik | February 7, 2020

As one of the largest companies in the world—now valued at over $1 trillion—Apple Inc. has masterfully crafted its products and its brand into an international powerhouse. In fact, in 2002, almost 20 years ago, Wired Magazine declared “Apple: It’s All About the Brand”, detailing how Apple’s brand and marketing had driven the continued rise of the company.

Industry experts used words like “humanistic”, “innovation”, and “imagination” to describe Apple’s brand in 2002. But now, in 2020, there’s a new word that comes to mind: luxury.

In many ways, Apple has been slowly moving itself into the luxury goods market for years: first with the Apple Watch in 2015, its first foray into wearable devices. Then, into jewelry, with the introduction of AirPods in 2016. More recently, with the iPhone X and its hefty price tag of $999. And to really cement itself as a luxury brand, Apple released its most expensive computer yet in late 2019. The price? Over $50,000.

Source: https://qz.com/1765449/the-apple-mac-pro-can-cost-over-50000/

Apple’s new Mac Pro, priced at over $50,000 when fully loaded with all possible customizations.

This (luxury) branding is important, especially as Apple continues its competitive war against Google’s Android. Android devices, unlike Apple’s iOS devices, are incredibly price accessible. They do not cost $999, like a new iPhone X.

Android’s price accessibility has enabled it to become the global smartphone market share leader, contrary to what some American readers may think. In fact, according to estimates from IDC (https://www.idc.com/promo/smartphone-market-share/os), Android phones represent nearly 87% of smartphones globally, compared to Apple’s 13%.

Apple’s iOS dominates North America, but lags across the rest of the globe, per DeviceAtlas.

How has Apple responded in the face of competition and proliferation from Android? Privacy.

For over a decade, Google—the maker of the Android operating system—has been mired in privacy concerns. Most recently, a questionable handling of personally identifiable information in advertising and an Amnesty International report detailing “total contempt for Android users’ privacy” entrenched the stereotype that Google does not respect the privacy of its users.

Apple pounced on this opportunity, launching a series of ads titled “Privacy on iPhone” with the slogan “Privacy. That’s iPhone.” Two such ads, from 2019, now have over 25 million views each on YouTube.

This is where it gets interesting: by leveraging privacy as a competitive advantage, Apple associates privacy with its luxury brand. Apple customers deserve and receive privacy; the rest, not so much. This assertion is a subtle one, but it’s absolutely critical: Apple is effectively telling consumers that privacy is a luxury good.

There are two resultant questions from this:

1. Is privacy a luxury good, or a fundamental human right? (Let’s ignore, for the time being, defining “privacy”.)

2. Technically, how would Apple achieve this privacy in ways that its competitors would not?

The ethics of the human right to privacy is a fascinating debate for the 21st century. Smartphones are now nearly ubiquitous, and the meaning of privacy has changed dramatically over the last decade (as it almost certainly changed dramatically over every prior decade, thanks to technological innovation). But it’s worth noting: there are many technical tradeoffs when engineering with privacy as a goal.

Apple, for one, has taken many steps to engineer privacy into its products by creating “on-device intelligence” systems (which it has also effectively marketed). This means that rather than taking data and sending it back to Apple’s servers to be processed, the data can be processed on your phone, which you own and control. Google has also taken steps to achieve this on-device intelligence, but has communicated its benefits less effectively to consumers.
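One concrete technique in this family, which Apple has publicly described using for some on-device data collection, is local differential privacy: each device randomizes its own report before anything leaves the phone. Here is a minimal sketch of the simplest such mechanism, randomized response (the parameter values and simulated rates are illustrative assumptions, not Apple’s actual system):

```python
import random

def randomized_response(truth: bool, p: float = 0.75) -> bool:
    """Report the true answer with probability p; otherwise flip a coin.
    Any single report is plausibly deniable, so no individual answer
    can be trusted -- only the aggregate carries signal."""
    if random.random() < p:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Invert the noise: E[reported rate] = p * true_rate + (1 - p) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p) * 0.5) / p

# Simulate 100,000 devices, 30% of which have the sensitive attribute.
random.seed(0)
reports = [randomized_response(random.random() < 0.30)
           for _ in range(100_000)]
print(estimate_true_rate(reports))  # estimate lands near the true 30% rate
```

The server learns an accurate population-level rate while each phone only ever sends a noisy bit: the kind of tradeoff that on-device privacy engineering buys at the cost of extra computation and some accuracy.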

Building these on-device intelligence systems, however, is expensive. Privacy, in turn, is expensive. And Apple uses this, in part, to justify the high price tag on its iPhones (further asserting privacy as a luxury good).

All of this is to say that we’re in a trying time. As brands such as Apple and Google introduce privacy as a point of competition, we as consumers feel the impact of their choices. This could have a positive effect: Apple and Google could enter a privacy war, raising the privacy standards in a way that positively benefits all consumers. Or it could deepen divides, with privacy becoming a luxury good afforded to the rich and powerful, and revoked from those with less.