Pakistan's experience shows that the need for personal data protection legislation is urgent. Our lives have become more digital. Since COVID-19, the physical has increasingly been replaced by the digital: work-from-home, online schooling, e-Commerce, and food and grocery delivery have left a bigger digital footprint in the form of our personal data. We have only some idea of how much data is collected, and generally close to none about how it is managed or used. This is a catch-22: we need these platforms because they are global brands and for the services they give us, but there is a cost to us in terms of our privacy and personal data.
In May 2017, The Economist declared that data had become a more valuable resource than oil, and the shift in value from oil to data is accelerating. Data analytic techniques have become the extraction-and-refining plants, and data companies the new oil giants. Data is the next “essential” facility and money maker, and this is prompting debate on how it should be regulated. Today, big data is a big driver of growth and, for some companies, of revenue. With a significant shift towards a more digital lifestyle, privacy implications have grown, along with ethical concerns about how one’s personal data is used. Given estimates that Google collects some 7,000 points of information about each person, a valid question today is: has Google’s data collection gone too far?
Much of our everyday activity now takes place online, courtesy of the numerous smartphone apps accessing the Internet over 4G and platforms such as Google, Facebook, and Amazon, alongside many regional and national platforms. Many people shop online for the convenience it offers, which became even more important as COVID-19 lockdowns were imposed. Online retailers remember your purchase history and base their recommendations for subsequent purchases on it. Likewise, services such as ride-sharing and food delivery provide convenient and affordable options.
All this is done through algorithms that analyse one’s digital footprint extensively – every click, every like, every second spent on a website, keywords in communication, demographic data, geographical location, age, gender, political orientation, etc. – and mine a wealth of usable information from it. Facebook even claims its “algorithms can enhance our personal relationships.” How can this data be safeguarded? One key challenge is that digital technology spans the globe and many data collectors are not located within national boundaries.
Our digital identities are accessible globally, often without our knowledge or permission.
Privacy from an Ethical Perspective
Any discussion of data ethics must recognize that privacy is a fundamental human right that underpins freedom of association, thought, and expression, and freedom from discrimination. Privacy is not easy to define. Different countries hold different views, as do individuals; it varies from person to person and depends on the situation. Broadly speaking, privacy is the right to be let alone, or freedom from interference or intrusion.
It is important, therefore, to enact effective legal instruments that define the individual’s right to privacy, and the threats to it, in the digital age. Data protection safeguards this fundamental right through legal frameworks that give individuals rights over their data, put accountability systems in place, and define the obligations of those who control and process that data. Equally important is imbuing a sense of data ethics in organizations. As acquirers and custodians of our data, often obtained without any deliberate effort on our part, organizations are reasonably expected to protect it and help keep us safe online. Without this expectation, online transactions and social media activity could suffer.
There is growing worry over social media companies’ non-social and non-economic influence over culture and information, and the implicit threat they pose to the jurisdiction of governments. For instance, the data of 270,000 Facebook users who consented to a personality survey was used by Cambridge Analytica to harvest the personal information of 87 million users, a breach that may have affected the 2016 U.S. elections. Similar concerns were raised during the U.K.’s Brexit process.
Tech firms have been voraciously collecting the data of their consumers, offering free services as an enticement. The major platforms – the “frightful five” or FAANGs – have played a key role in the process. Shoshana Zuboff, in her book, The Age of Surveillance Capitalism, details how Google and Facebook developed their business models to collect and monetize data – “A fundamentally illegitimate choice,” she says.
A 2015 survey by the Annenberg School of Communication, University of Pennsylvania, found that “a majority of Americans are resigned to giving up their data—and that is why many appear to be engaging in trade-offs.
Resignation occurs when a person believes an undesirable outcome is inevitable and feels powerless to stop it. Rather than feeling able to make choices, Americans believe it is futile to manage what companies can learn about them. The study reveals that more than half do not want to lose control over their information but also believe this loss of control has already happened.”
These findings were also reflected in 2014 Pew Research Center findings that “91% of Americans ‘agree’ or ‘strongly agree’ that people have lost control over how personal information is collected and used by all kinds of entities. 80% of social media users said they were concerned about advertisers and businesses accessing the data they share on social media platforms, and 64% said the government should do more to regulate advertisers.” Another Pew survey in 2017 found that “just 9% of social media users were ‘very confident’ that social media companies would protect their data. About half of users were not at all or not too confident their data was in safe hands.”
Can we expect tech companies to protect our personal information? Not if their business model or a revenue stream depends on collecting and sharing it. Protecting privacy requires a legal framework, but it will also require technological adaptation.
The Legal Framework for Privacy Protection
Privacy is a qualified, fundamental human right. The right to privacy is articulated in all major international and regional human rights instruments. Article 12 of the Universal Declaration of Human Rights proclaims that “No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence. Everyone has the right to the protection of the law against such interferences or attacks.”
Article 14 (1) of Pakistan’s Constitution gives individuals the right to privacy: “The dignity of man and, subject to law, the privacy of home, shall be inviolable.” With social media and growing digitalization, the concept of privacy today must also encompass online activities and our growing digital footprints. In 2013, the United Nations General Assembly affirmed that the rights of people offline must also be protected online, calling upon all states to respect and protect the right to privacy in digital communication.
Data protection is a trending topic and many governments have moved decisively to plug the gaps in their national laws. The British Government introduced a new draft data protection bill in 2017 to replace the 1998 law.
Key features include the “right to be forgotten” on the internet and “the right to innocence,” whereby citizens can request social media sites to remove any content they posted before the age of 18. The bill proposes tougher penalties on companies for data breaches and a requirement for businesses to inform the U.K. Information Commissioner’s Office about any breach within 72 hours.
The EU’s General Data Protection Regulation (GDPR) is the most important change in data privacy regulation in 20 years and is considered the world’s most aggressive set of internet privacy rules. The GDPR is a common set of rules and practices that apply across Europe and, it is hoped, the world. It empowers regulators to fine any company in breach as much as four percent of its total worldwide sales. It promotes three legal and business principles for firms that want to gain or retain user trust: transparency (say what you do), user control (empower your customers), and accountability (do what you say).
The GDPR states, first, that companies need a person’s consent to collect their data and, second, that a person should be required to share only the data necessary to make a service work. The more than 500 million people living in the European Union now hold two important rights: the right of erasure and the right of portability.
After GDPR, California passed a digital privacy law – the California Consumer Privacy Act (CCPA) – that gives consumers more control and insight into the spread of their personal information online.
This is one of the most significant regulations that governs the data-collection practices of technology companies in America. “The new law grants consumers the right to know what information companies are collecting about them, why they are collecting that data, and with whom they are sharing it. It gives consumers the right to tell companies to delete their information, as well as to not sell or share their data. Businesses must still give consumers who opt out the same quality of service.” This is quite a step as most of the existing laws do little to limit what companies can do with consumer information.
The Government of Pakistan has proposed various drafts of personal data protection legislation since 2018, and the preparation of a draft Personal Data Protection Bill (PDPB) is commendable.
The need for personal data protection has become more urgent since then as e-Commerce, financial, and service activities have increased because of COVID-19 and the general trend of increasing digitalization everywhere.
There have been serious incidents of data breaches in local organizations in recent years.
In April 2018, Careem, a ride-sharing service in Pakistan (acquired by Uber in January 2020), admitted that “users’ personal data was compromised in a massive data breach.” Another article said that “The hack affected user data of over 14 million users and 558,880 Captains in the 13 countries and 90 cities that Careem operates in.”
In September 2020, the Karachi Electric Supply Company (K-Electric) was targeted by a ransomware attack. The ransomware operators demanded payment of US $3.85 million worth of Bitcoin, rising to US $7.7 million after a week. K-Electric did not pay, and all user data was leaked to the dark web. The legal recourse for those affected: none.
In August 2021, the Federal Board of Revenue (FBR), the custodian of all taxpayers’ information, was affected by a cybersecurity breach. The agency claimed that all personal data was safe, but the veracity of this remains doubtful. Ten days after the event, the recovery process was still continuing.
A leading bank in Pakistan suspended international debit card transactions after discovering valuable customer data had been compromised. “Most of the debit cards running to MasterCard networks were subject to fraudulent transactions.”
It is common knowledge that data from government repositories has been accessed without permission and can be purchased cheaply. Telecom companies in Pakistan have been known to sell subscribers’ data to third parties, a practice even stated in the privacy policies of some companies. Hence, the talk of a national data regulator is timely.
Unfortunately, Pakistan’s Electronic Data Protection Act, 2005, and the Personal Data Protection Bill, 2018, remain drafts. Privacy International has highlighted concerns about the latter, notably the exemption of state-owned entities from its purview. Pakistan’s lack of data protection laws may make it difficult for international market platforms and other e-Commerce companies to operate locally, and it leaves citizens unprotected from data breaches. Weak regulation is also an entry barrier, as companies may hesitate to operate in such a regime, and the general feeling of insecurity about one’s personal data can stifle competition and innovation.
Pakistanis based in Europe will see their online transactions and activity protected under the GDPR. These include banking services, e-Commerce transactions, and activity on social media. Protection cannot be one-sided, and companies in Pakistan will need to adapt to serve clients and customers in Europe and beyond. The EU plans to limit market access for countries that do not meet Europe’s standards, and data protection laws are becoming part of trade deals. It is time Pakistan moved decisively to promulgate data protection legislation, not just for economic reasons but for personal security and privacy.
It will be a while before regulations in Pakistan for protecting data become effective, leading to the ethical question in the meantime: should data be protected or should less of it be collected? Both protection and collection have their attendant costs and risks. Even the American National Security Agency could not prevent an employee from walking off with a thumb drive full of information and releasing it to the world in 2013.
Can data-ethical behavior be encouraged by having the necessary legal frameworks in place? Or can data-dependent companies adopt principles and practices that demonstrate careful stewardship of the information they collect about people? Pakistan currently provides no legal recourse to anyone who suffers a loss of personal data. It is time to change this with a GDPR-like legal framework. But equally important is encouraging a sense of ethics. Without the fear of legal sanctions, it is difficult to change behavior, and until a National Data Protection Authority is established, there is no focal agency to advocate for data ethics and personal privacy. Given the global implications of the GDPR, Pakistan’s draft data protection regulations should be carefully compared with and harmonized against the GDPR and the CCPA. Data protection, privacy, and ethical guidelines require a global and unified response.