A New Generation of Phone Hackers: The Police
By Anonymous | June 18, 2021

Hackers. I challenge you to picture the “prototypical hacker.” What comes to mind? Is it a recluse in a well-worn hoodie, sitting alone in the dark, hunched over a desktop?

That description may once have been accurate, but as technology changes, so do the people who count as “hackers.” One group in particular is increasingly associated with the title and is emerging into the spotlight: law enforcement.

A cartoon of a police officer chasing a figure composed of popular iPhone apps. Boris Séméniako

A report by Upturn found that more than 2,000 agencies across all 50 U.S. states have purchased tools to get into locked, encrypted phones and extract their data. Upturn’s researchers estimate that U.S. authorities have searched more than 100,000 phones over the past five years. The Department of Justice argues that encrypted digital communications hinder investigations and that, for protections to exist, there must be a “back door” for law enforcement. Google and Apple have not complied with these requests, but agencies have found the tools they need to hack into suspects’ phones. Agencies justify the use of these tools by their role in investigating serious offenses such as homicide, child exploitation, and sexual violence.

In July 2020, police in Europe made hundreds of arrests after hacking into an encrypted communications network called EncroChat. EncroChat provided specially altered phones, with no camera, microphone, or GPS, along with the ability to immediately erase compromising messages. Authorities in Europe hacked into these devices to detect criminal activity. The New York Times reports that, as a result of the intercepted messages and conversations, police in the Netherlands were able to seize 22,000 pounds of cocaine, 154 pounds of heroin, and 3,300 pounds of crystal methamphetamine.

However, these tools are also being used for offenses that have little or no relationship to a mobile device. Many logged offenses in the United States are not digital in nature, such as public intoxication, marijuana possession, and graffiti. It is difficult to understand why hacking into a mobile device, an extremely invasive investigative technique, would be necessary for these kinds of alleged offenses. The findings of the Upturn report suggest that many police departments can tap into highly personal and private data with little oversight or transparency. Of 110 large U.S. law enforcement agencies surveyed, only half had policies on handling data extracted from smartphones, and merely nine of those policies contained substantive restrictions.

A worker checking the innards of an iPhone at an electronics repair store in New York City. Eduardo Munoz/Reuters

An important question remains: what happens to the extracted data after it has been used in a forensic report? Few policies clearly define how long extracted data may be retained, and the lack of clarity and regulation surrounding this “digital evidence” leaves most Americans with little protection. Data extracted from the cloud poses even greater challenges. Because law enforcement has tools for siphoning and collecting data from cloud-based accounts, it can view what amounts to a continuously updating database of personal information. Some suggest this continuous flow of data should be treated as a wiretap and require a wiretap order. Upturn’s researchers, however, were unable to find a single local agency policy that provides guidance on or control over data extracted from the cloud.

Undoubtedly, the ability to hack into phones has given police leads that produced many arrests. However, the lack of regulation and oversight of these practices also puts the privacy and safety of American citizens at risk. Public institutions are often assumed to lag behind in adopting the latest technologies, and some argue that if criminals are using digital tools to commit offenses, then law enforcement should stay a step ahead with technologies of its own. This raises the question: is it fair or just for law enforcement to have the ability to hack into citizens’ phones?

References:
Benner, K., Markoff, J., & Perlroth, N. (2016, March). Apple’s New Challenge: Learning How the U.S. Cracked Its iPhone. Retrieved from New York Times: https://www.nytimes.com/2016/03/30/technology/apples-new-challenge-learning-how-the-us-cracked-its-iphone.html

Koepke, L., Weil, E., Janardan, U., Dada, T., & Yu, H. (2020, October). Mass Extraction: The Widespread Power of U.S. Law Enforcement to Search Mobile Phones. Retrieved from Upturn: https://www.upturn.org/reports/2020/mass-extraction/

Nicas, J. (2020, October). The Police Can Probably Break Into Your Phone. Retrieved from New York Times.

Nossiter, A. (2020, July). When Police Are Hackers: Hundreds Charged as Encrypted Network Is Broken. Retrieved from New York Times.

Tinder Announces Potential Background Check Feature: What Could Possibly Go Wrong?
By Anonymous | June 18, 2021

In March 2021, Tinder, along with other Match Group entities, publicly announced that it would allow users to run immediate background checks on potential dating partners. The dating service plans to partner with Garbo, a non-profit, female-founded organization that specializes in background checks based on just a name and phone number and that aims to prevent gender-based violence in the midst of an unfair justice system. According to its website, Garbo’s mission is to provide “more accountability, transparency, and access to information to proactively prevent crimes. We provide access to the records and reports that matter so individuals can make more informed decisions about their safety.” The platform provides these records at a substantially lower cost than those offered by for-profit corporations.

Though well-intentioned in seeking to ensure the safety and well-being of its users, the partnership raises questions about user protection measures and the implications of digital punishment. For one, putting public records at a user’s disposal might cause concern, especially for those who have inaccurate records attached to their name. According to a Slate article on the nature of online criminal records, getting such records removed is a taxing process, and that virtual footprint can tarnish an individual’s name for life. Public-record data is generally error prone, so there would need to be accountability and transparency about how often the data is updated and how representative it is of the general population, since the underlying data collection could well disproportionately affect marginalized communities. Additionally, offenders may use aliases or misspellings to avoid being identified in a background check.

Garbo’s technology is still relatively new, and little information is available about how it collects criminal records or maintains quality control over its database. Could a full name and phone number alone support an effective identity-matching process? What precautions are taken to ensure accurate information? How high would error rates be? If there are false positives and misidentifications, will Garbo hold itself accountable? Moreover, the non-profit’s current privacy policy does not explicitly disclose what data will be provided to a user who requests access to criminal records.
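To see why those questions matter, consider how little information a name and phone number actually carry. The sketch below is purely hypothetical: the records, names, similarity rule, and threshold are all invented for illustration and say nothing about how Garbo actually matches identities.

```python
# Purely hypothetical sketch: the records, names, similarity rule, and
# threshold below are invented and do not reflect Garbo's actual system.
from difflib import SequenceMatcher

# Toy "public records" table: (full name, phone on file, offense).
RECORDS = [
    ("Jonathan Smith", "555-0101", "assault"),
    ("Jon Smith", None, "fraud"),          # many records lack a phone number
    ("Juan Smith", "555-0199", "none on file"),
]

def name_similarity(a, b):
    """Crude string similarity in [0, 1]; real systems use richer features."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_records(query_name, query_phone, threshold=0.8):
    """Return every record whose name is 'close enough' or whose phone matches.

    With only two fields, a common name plus a missing or recycled phone
    number can pull in records belonging to different people (false
    positives) or miss an alias or misspelling (false negatives).
    """
    hits = []
    for name, phone, offense in RECORDS:
        score = name_similarity(query_name, name)
        phone_match = phone is not None and phone == query_phone
        if score >= threshold or phone_match:
            hits.append((name, offense, round(score, 2), phone_match))
    return hits

# A single search for "Jon Smith" surfaces records for three different people.
for hit in match_records("Jon Smith", "555-0101"):
    print(hit)
```

Even in this toy example, a common name and a recycled or missing phone number pull in records belonging to other people, which is exactly the false-positive scenario the questions above are concerned with.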

The types of crimes that will carry the most weight in the dating sphere have yet to be confirmed by Garbo’s management. So far, Garbo has decided not to include drug possession, citing the racial inequities associated with such charges.

In addition, despite claims that Garbo provides checks at low cost, the collaboration with for-profit Tinder could drive costs up, so that promise of accessibility may remain a distant one. This initiative is a step in the right direction, but until the public gets more information about how Garbo will maintain accountability and be transparent about the origins of its data, we can only hope for a safe, regulated, and fair experience for users of these apps.

Technology’s Hidden Biases
By Shalini Kunapuli | June 18, 2021

As the daughter of science fiction film enthusiasts, I grew up watching many futuristic movies, including 2001: A Space Odyssey, Back to the Future, Her, Ex Machina, Minority Report, and more recently the series Black Mirror. It was always fascinating to imagine what technology could look like in the future; it seemed like magic that would work at my every command. I wondered when society would reach that point, where technology would aid us and exist alongside us. However, over the past few years I’ve realized that we are already living in that futuristic world. While we may not exactly be commuting to work in flying cars just yet, technology is deeply embedded in every aspect of our lives, and there are subtle evils beneath the fantasies of our technological world.

Coded Bias exists
We are constantly surrounded by technology, from home assistants to surveillance systems to health trackers. What many people do not realize, however, is that many of these systems that are meant to serve us all are actually quite biased. Take the story of Joy Buolamwini, from the documentary Coded Bias. Buolamwini, a PhD student at the MIT Media Lab, noticed a major flaw in facial recognition software: it could not detect her face as a Black woman, yet once she put on a white mask, it recognized her. In general, “highly melanated” women had the lowest recognition accuracy. As part of her research, Buolamwini discovered that the data sets the facial recognition software was trained on consisted mostly of white males. The people building the models belong to a particular demographic, and as a result they compile datasets that look primarily like them. In ways like this, bias gets coded into the algorithms and models we use.
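A simple way to make this kind of coded bias visible is to report accuracy separately for each demographic group rather than as one aggregate number, in the spirit of Buolamwini’s audits. The snippet below is a minimal sketch with invented results; it does not come from any real face recognition system.

```python
# Minimal sketch of a disaggregated evaluation: report accuracy per
# demographic group instead of one overall number. The results below are
# invented for illustration only.
from collections import defaultdict

# Each record: (demographic group, was the face correctly recognized?)
results = [
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True),
    ("darker-skinned women", True), ("darker-skinned women", False),
    ("darker-skinned women", False), ("darker-skinned women", False),
    ("darker-skinned women", False),
]

def accuracy_by_group(records):
    """Return recognition accuracy broken out by group."""
    correct, total = defaultdict(int), defaultdict(int)
    for group, ok in records:
        total[group] += 1
        correct[group] += int(ok)
    return {group: correct[group] / total[group] for group in total}

overall = sum(ok for _, ok in results) / len(results)
print(f"overall accuracy: {overall:.0%}")   # 60%: one number hides the gap
for group, accuracy in accuracy_by_group(results).items():
    print(f"{group}: {accuracy:.0%}")       # 100% vs 20%
```

The aggregate number can look tolerable while one group’s accuracy is dramatically worse, which is exactly the pattern Buolamwini’s research exposed.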

Ex Machi-NO, there are more examples
The implications go far beyond the research walls at MIT. In London, police intimidated and searched a 14-year-old Black boy after surveillance technology flagged him, only to realize later that the software had misidentified him. In the US, an algorithm meant to guide decision making in the health sector was built to predict which patients would need additional medical help so they could receive more tailored care. Even though the algorithm excluded race as a factor, it still prioritized assistance for White patients over Black patients who were actually in greater need, largely because it used health-care costs as a proxy for medical need, and historically less is spent on Black patients with the same level of need.
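The toy numbers below are invented, not data from the real study, but they illustrate how ranking patients by a spending proxy can skew against Black patients even though race is never an input to the ranking.

```python
# Toy illustration with invented numbers: a spending proxy understates the
# need of the group on which less has historically been spent, so the
# proxy-based ranking skews even though race is never used by the "model".

# Each patient: (group, true medical need, past health-care spending).
patients = [
    ("White", 5, 5000), ("White", 6, 6200), ("White", 7, 7100),
    ("Black", 6, 4300), ("Black", 7, 5100), ("Black", 8, 5900),
]

# "Model": select patients for extra care by spending, the proxy for need.
by_spending = sorted(patients, key=lambda p: p[2], reverse=True)

# Ground truth: what a ranking by actual medical need would look like.
by_need = sorted(patients, key=lambda p: p[1], reverse=True)

print("Top 3 by spending proxy:", [(g, n) for g, n, _ in by_spending[:3]])
print("Top 3 by true need:     ", [(g, n) for g, n, _ in by_need[:3]])
# The proxy-based ranking selects mostly White patients, while the ranking
# by true need selects mostly Black patients.
```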

Other marginalized groups are also negatively impacted by different technologies. Most notably, women tend to get the short end of the stick, their accomplishments and experiences continually erased by history and gender politics. A book I read recently, “Invisible Women” by Caroline Criado Perez, details several examples of gender data bias, some so integrated into our daily lives that we rarely think about them.

For example, women are 47% more likely to be seriously injured in a car crash. Why? Women are on average shorter than men and so tend to pull their seats farther forward to reach the pedals, yet this is not the “standard” seating position cars are designed around. Airbag placement, too, is calibrated to the size of an average male body. Crash-test dummies are usually modeled on male bodies as well, so safety systems are rarely validated against female-sized bodies, leaving women at higher risk in a crash.

Additionally, women are more likely to have a heart attack misdiagnosed, because they often do not present with the “typical” symptoms. Female pianists are more likely to suffer hand injuries because piano key sizes are based on the male handspan, and the female handspan is smaller on average. The average smartphone is 5.5 inches long, which is uncomfortable for many women with smaller hands. Google Home is 70% more likely to recognize male speech, because it was trained on a male-dominated corpus of voice recordings.

The list goes on and on. All of these examples stem from the “standard” being male: the “true norm” or baseline condition is centered on male experiences. Furthermore, there is a lack of gender diversity within the tech field, so the teams developing the majority of these technologies and algorithms are primarily male as well. This in itself leads to gender data bias in the resulting systems, because teams building technology implicitly address their own needs first, without considering the needs of groups they know little about.

Wherever there is data, there is bound to be an element of bias in it, because data is based on society and society in and of itself is inherently biased. The consequences of using biased data can compound upon existing issues. This unconscious bias in algorithms and models further widens the gap between different groups, causing more inequalities.

Back to the Past: Rethinking Models
Data doesn’t just describe the world anymore; it is often used to shape it, and ever more power and reliance are being placed on data and technology. The first step is recognizing that these systems are flawed. It is easy to rely on a model, especially when it is a black box that returns a quick and supposedly straightforward result. We need to take a step back, however, and ask ourselves whether we trust that result completely. Ultimately, models are not making decisions that are ethical; they are only making decisions that are mathematical. This means it is up to us as data scientists, as engineers, as humans, to recognize that these models are biased because of the data we provide to them. As a result of her research, Buolamwini started the Algorithmic Justice League to push for laws that protect people’s rights. We can take a page out of her book and actively think about how the models we build and the code we write affect society. We need to advocate for more inclusive practices across the board, whether in schools, workplaces, hiring practices, or government. It is up to us to come up with solutions that protect the rights of groups that may not be able to protect themselves, and to be voices of reason in a world that relies more and more on technology every day. Instead of dreaming about futuristic technology through movies, let us work together and build systems now for a more inclusive future. After all, we are living in our own version of a science fiction movie.

References
* https://www.wbur.org/artery/2020/11/18/documentary-coded-bias-review
* https://www.nytimes.com/2020/11/11/movies/coded-bias-review.html
* https://www.washingtonpost.com/health/2019/10/24/racial-bias-medical-algorithm-favors-white-patients-over-sicker-black-patients/
* https://www.mirror.co.uk/news/uk-news/everyday-gender-bias-makes-women-14195924
* https://www.netflix.com/title/81328723
* https://www.amazon.com/Invisible-Women-Data-World-Designed/dp/1419729071

The Ethicality of Forensic Databases
By Anonymous | June 18, 2021

Forensic data has been used in monumental cases to identify the killers and criminals behind many unsolved crimes. One such case was the murder of a 16-year-old girl in Kollum, a small village in the Netherlands, in 1999. Iraqi and Afghan asylum seekers living in the area were widely blamed for the murder, which heightened racial and ethnic tensions. Years later, forensic evidence from a semen sample indicated that the murderer was of European descent. This was a landmark in the case and defused the racial tensions of the time. Cases like these, from wrongfully accused individuals being cleared to serial killers being identified years later, have cast forensic databases in a positive light. The particular database that aided the Kollum case was the Y-chromosome Haplotype Reference Database (YHRD). Today, the YHRD is predominantly used to solve sex crimes and paternity cases.

The YHRD is a research database used in both academic and criminal laboratories. However, the ethics of the data it contains have come into question. More than a thousand of the male profiles in the YHRD do not appear to have been submitted with the donors’ consent. Some of these profiles come from members of the Uyghur population, a predominantly Muslim group in China. This is especially concerning given growing reports that the Uyghur population is being persecuted, and potentially subjected to ethnic cleansing, by the Chinese government. DNA that may have been forcibly collected by the Chinese government could, in turn, be used against the Uyghurs in deeply harmful ways.

The ethics of forensic databases are not well regulated and deserve far more public discussion. Forensic data, such as DNA and fingerprints, is sensitive: it can implicate not only an individual but their family and descendants as well. The YHRD also opens up a discussion about other databases that do not properly regulate consent or the dissemination of data. DNA collected by police forces is usually highly secured, used only during the preliminary investigation, and typically erased once a case is closed. As large databases like the YHRD grow in prominence, new rules and regulations must be put in place to ensure both privacy and ethical use. Forensic data can be enormously beneficial in solving crimes, providing evidence, and even reconnecting loved ones. However, it can also be misused by governments and can implicate people and their relatives. For these reasons, we should take a deeper look at large-scale forensic databases and their ethics.

References
https://www.nature.com/articles/d41586-021-01584-w
https://yhrd.org/