Chapter 2 Case study
Respecting consumers’ right of privacy
Liam Pomfret
The University of Queensland
Source: Peterhowell/iStock/Getty Images
The modern digitally connected marketplace is all but defined by ubiquitous collection,
storage and use of consumer information by companies and organisations. These data are a
highly valuable source of marketing information for companies, and the use of so-called
second-party data (customer-level information acquired from another company) is now
commonplace.1 There are organisations whose entire business is collecting and selling
everything they can about anyone and everyone’s (personal) business. But while companies
may be reaping profits from this environment of data sharing and dissemination, it is also
creating significant systemic issues for consumers.
The unintended consequences of widespread access to personal information can be seen in
the seemingly never-ending stream of tales in the media about people who’ve suffered in
some way because of how their personal information has been handled. In some cases, these
impacts may be relatively minor. As irritating as it may be when a telemarketer
interrupts their dinner, the occasional unwanted marketing communication does not
inflict any significant or lasting harm on the consumer. These
are by no means the only concerns, however, and nor are the harms of these privacy issues
limited to consumers.
For instance, data breaches resulting from companies placing insufficient protection on
consumers’ personal information have become commonplace over the past decade.2 The
2017 Equifax data breach alone is believed to have affected as many as 143 million US
citizens, or approximately 44% of the population.3 Other large organisations such as Dow
Jones, Time Warner Cable, Verizon and World Wrestling Entertainment all suffered breaches
in 2017 relating to data left completely unprotected on cloud web services.4 In April 2018 it
was revealed that more than 87 million Facebook users’ information was compromised after
it was improperly shared with the controversial data mining and analysis firm Cambridge
Analytica.5 Incidents like these facilitate identity theft, causing long-term harm to
consumers, often through damage to their credit scores. For companies, the harms go
beyond the immediate damage to their reputation and market standing, with impacts
frequently spilling over to companies entirely unrelated to the initial breach, such as
when credit cards and loans are fraudulently established and used with information
secured through a data breach. In extreme cases, impacts can even be felt at a
societal level, such as through the influence Cambridge Analytica is believed to have
had on the 2016 US presidential election and the Brexit referendum.
Wicked problems are complex social issues so complicated that it is hard to identify
their source or a clear solution, and they can be highly resistant to resolution.6 Consumer privacy is
one such wicked problem, with broad societal implications that require participation from
multiple stakeholders to identify sustainable social change solutions. Yet despite the very real
consequences of this problem, and the ever-increasing levels of privacy concern reported by
consumers, privacy issues are often sidelined or ignored. Instead, discussions tend to focus
on the possible benefits consumers and society may derive from data capitalism and the use
of consumer data for the delivery of personalised product and service offerings, and
advertising.7 Even in the academic marketing literature, privacy concerns are typically
discussed in terms of the need for companies to allay concerns through reassurance and
transparency, rather than meaningfully addressing them by limiting the collection and
use of information. This raises several pointed questions for marketing practice. Just how
much should organisations really be allowed to know about consumers? What justifications
are there for the collection and use of consumers’ information? And what responsibilities
should companies have for the protection of the consumer information they hold?
Why do you need to know that?
Discussions about consumer privacy ethics in the public sphere over the past decade have
focused on companies’ need for legal and regulatory compliance, and on the associated need
for companies to avoid legal liability. What is typically not asked is a much more fundamental
question: is it actually ethical for companies to be acquiring as much information about
consumers as possible in the first place?
In considering how information is to be used, organisations must take into consideration the
context in which personal information is disclosed, and the purpose or intentions the
consumer had in making that disclosure. Just because a consumer has shared a piece of
personal information with a company does not mean that they do not retain a desire to limit
broader access to that information. When a consumer does reveal information to a company,
such as when an individual shares their home address with an online retailer for the purpose
of completing a commercial transaction, there is an expectation of confidentiality—an
expectation that the information will not be on-sold to uninvolved third parties or used for
other unrelated purposes.
Privacy laws and regulations throughout the developed world rely on the notion of notice and
consent, giving legitimacy to practically any form of collection or use of an individual’s
personal data so long as the consumer has consented to it. This consent is framed in the
context of consumers opting in to services or an exchange relationship, agreeing to a contract
as recognised by consumer law. Consumers, however, often have little real ability to
refuse the collection of their information, and consent means little when there is
no other viable option for the consumer. In the modern
digitally connected world, withholding information stymies an individual’s ability to function
in society and to conduct everyday activities both social and commercial. Don’t want to give
out your address? Hope you don’t mind not being able to shop online. Don’t want to share
your mobile phone number or your personal email address? You can probably say goodbye
to two-factor authentication (2FA) protections on your online accounts. Today, rather than
choosing to disclose or not, consumers are essentially limited to selecting who to disclose to,
a choice that holds little meaning to the consumer if that information is to then be
disseminated further beyond the consumer’s own control.
Much of what is collected about consumers today is arguably obtained without consumers’
active knowledge. Our society is fast becoming a sensor society,8 in which devices such as our
cars and mobile phones monitor and record our everyday actions, relaying this information
back to corporations for their own benefit. While consumers may have a vague
awareness that an organisation is doing something with their information, and may
have given their ‘consent’ by agreeing to a set of terms and conditions they more
than likely never read, they rarely grasp what information is being collected, how it
is collected, or exactly how it is used. This problem is further complicated
in that the terms and conditions may change over time, with organisations using clauses in
the conditions that allow them to unilaterally change the terms at their discretion. While
consumers are aware these clauses exist, and are often notified when the terms are about to
change, exactly what these changes may mean for them and their personal information is
rarely made transparent. This leaves users with little assurance that their information
will be used only for the purposes they consented to at the time of disclosure and collection.
Organisations’ social responsibility for privacy
Rules and obligations, both formal and informal, govern how information may be collected,
disseminated and used by companies. Marketing scholars as far back as the early 1990s
framed privacy as a human right, urging decisive action on the issue for the
sake of human dignity.9,10 This decisive action was intended to take the form of strong
industry self-regulation. This was presented as a win–win scenario for consumers and
business. On the one hand, industry held specialist technical knowledge in managing
consumer data, and was well placed to identify necessary protections for consumers. On the
other hand, self-regulation would allow companies to avoid direct regulatory
intervention by government, which it was feared would lag behind the march of
technology, imposing undesirable burdens on business activities and lacking the
flexibility companies needed as they adapted to the rapidly changing digital
marketplace. Industry self-regulation has thus far
proven to be a case of wolves guarding the hen house, with programs such as TRUSTe
certification and the Digital Advertising Alliance’s AdChoices having only limited success in
mitigating consumer harms. Organisations such as the Electronic Privacy Information Center
(EPIC) have stated that industry self-regulation has simply led to the creation of a more
permissive environment for the collection and use of consumer information, enabling a
steady erosion of consumer privacy.11
Moving forward, addressing consumers’ privacy concerns will become an increasingly
important part of how products, services and brands are marketed, and adopting privacy as
a matter of corporate social responsibility will be a necessity for companies to ensure their
own long-term profit and sustainability.12 Companies will be expected to demonstrate good
corporate citizenship, going beyond their legal obligations and factoring the broader
ethical responsibilities they have to society for consumer information into how they
run their businesses.
This may include, for example, the adoption of alternative business models that engage with
consumers to give consumers more control over their data, such as the use of consent
intermediaries.13 Companies might also look into alternative data collection and analysis
approaches that offer the individual consumer ‘privacy-by-design’. A number of large
organisations have already started on this path. In Australia, internet service provider iiNet
made a name for itself with its staunch defence of consumers’ personal information in legal
battles against Village Roadshow14 and Dallas Buyers Club LLC,15 as well as its stance against
the Australian Government’s mandatory data retention laws.16
Internationally, Apple has recently placed significant focus on how its products and services
can respect consumer privacy without compromising the company’s ability to act on
information. Senior vice-president of software engineering Craig Federighi boasted in 2016
about the company’s efforts in ‘differential privacy’, using algorithms designed to learn as
much as possible about a group in aggregate while learning as little as possible about any
individual in the group.17 Actions such as this, and Apple’s refusal to create backdoors to
bypass security mechanisms and encryption for the FBI,18 have endeared the company to an
audience of vocal tech enthusiasts, helping to restore the brand’s reputation following the
widely publicised breach of its iCloud service in 2014.19 As the company moves forward with
innovations such as Face ID in the iPhone X,20 Apple must continue to uphold the weight of
consumers’ expectations that the company will act as their privacy champion.
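The intuition behind differential privacy can be illustrated with the classic randomized-response technique. The sketch below is a minimal illustration of the idea only, not Apple’s actual mechanism: each user randomly perturbs their answer before reporting it, so any individual report is plausibly deniable, yet the analyst can still recover an accurate population-level estimate from the aggregate.

```python
import random

def randomized_response(truthful_answer: bool) -> bool:
    """Flip a fair coin: on heads, report the truth; on tails, report a
    uniformly random answer. Any single report is thus deniable."""
    if random.random() < 0.5:
        return truthful_answer
    return random.random() < 0.5

def estimate_true_rate(reports) -> float:
    """Invert the noise: E[report] = 0.5*p + 0.25, so p = 2*(mean - 0.25)."""
    observed = sum(reports) / len(reports)
    return 2 * (observed - 0.25)

# Simulate 100,000 users, roughly 30% of whom have the sensitive attribute.
random.seed(0)
truth = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(t) for t in truth]
print(round(estimate_true_rate(reports), 2))  # recovers a rate close to 0.30
```

No individual report reveals whether that person truly has the attribute, but across a large group the population rate is learned accurately, which is the aggregate-versus-individual trade-off Federighi describes.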
Questions
1. What moral obligations do you believe marketers have with regard to the collection and
use of consumers’ personal data? What do you believe are the ethical limits for the
use of consumer data? Do you believe the way companies collect consumer
information is exploitative? Why, or why not?
2. If you as a consumer are not happy with how an organisation has collected or used
your data, what recourse do you have? What actions, if any, could you take to reassert
control over your information?
3. What are the key steps you’ve taken to protect your privacy online? To what extent
are you able to proactively refuse to disclose or share your information?
4. Read through the privacy policy of your favourite social networking site. Do you
believe the policy provides you with sufficient information to understand how the
website uses your information?
5. The use of consumer information for micro-targeting of social media advertising was
a significant contributing factor in the election of President Trump in 2016. Does the
use of consumer information for political marketing create any additional moral or
ethical hazards?
Notes
1. MJ Schneider, S Jagpal, S Gupta, S Li & Y Yu, ‘Protecting customer privacy when
marketing with second-party data’, International Journal of Research in Marketing,
2017, 34(3): 593–603 <https://doi.org/10.1016/j.ijresmar.2017.02.003>.
2. <https://theconversation.com/equifax-breach-is-a-reminder-of-societys-larger-cybersecurity-problems-84034>.
3. <www.cnet.com/news/equifax-data-leak-hits-nearly-half-of-the-us-population/>.
4. <www.bbc.com/news/technology-41147513>.
5. <www.theguardian.com/technology/2018/apr/08/facebook-to-contact-the-87-million-users-affected-by-data-breach>.
6. HWJ Rittel & MM Webber, ‘Dilemmas in a general theory of planning’, Policy Sciences,
1973, 4(2): 155–69 <https://doi.org/10.1007/BF01405730>.
7. SM West, ‘Data capitalism: redefining the logics of surveillance and privacy’, Business
& Society, 2017 <https://doi.org/10.1177/0007650317718185>.
8. <https://theconversation.com/detection-devices-how-a-sensor-society-quietly-takes-over-26089>.
9. C Goodwin, ‘Privacy: recognition of a consumer right’, Journal of Public Policy &
Marketing, 1991, 10(1): 149–66 <www.jstor.org/stable/10.2307/30000257>.
10. MG Jones, ‘Privacy: a significant marketing issue for the 1990s’, Journal of Public Policy
& Marketing, 1991, 10(1): 133–48 <www.jstor.org/stable/10.2307/30000256>.
11. CJ Hoofnagle, Privacy self-regulation: a decade of disappointment, 2005
<https://doi.org/10.2139/ssrn.650804>.
12. M Flyverbom, R Deibert & D Matten, ‘The governance of digital technology, big data,
and the internet: new roles and responsibilities for business’, Business & Society, 2017:
1–17 <https://doi.org/10.1177/0007650317727540>.
13. T Lehtiniemi & Y Kortesniemi, ‘Can the obstacles to privacy self-management be
overcome? Exploring the consent intermediary approach’, Big Data & Society, 2017,
4(2): 1–11 <https://doi.org/10.1177/2053951717721935>.
14. <www.iinet.net.au/about/mediacentre/copyright-case/>.
15. <www.iinet.net.au/about/mediacentre/copyright-case/>.
16. <http://blog.iinet.net.au/protecting-your-privacy/>.
17. <www.wired.com/2016/06/apples-differential-privacy-collecting-data/>.
18. <http://fortune.com/2016/02/17/apple-backdoor-order/>.
19. <https://en.wikipedia.org/wiki/ICloud_leaks_of_celebrity_photos>.
20. <https://theconversation.com/face-id-and-ios-11-a-few-lingering-security-questions-about-the-new-iphone-x-83952>.