Governance and Big Tech: Setting the tone on data, privacy and (mis)information
Technology is arguably the most powerful driver of innovation and economic growth. Technology keeps us connected and has enabled us to work and study from home in times of a global pandemic. It connects us to our loved ones and makes virtual family gatherings possible when it is not safe to meet in person. It enables us to check in real time who rang the doorbell even when we are on the other side of the planet and operates devices in our home remotely for our convenience.
Technology has evolved exponentially in the past few years and brought huge value to our lives. It makes information, healthcare, education, banking and many other services much more accessible and customised. In healthcare, it can help detect preventable diseases in their early stages; in the banking sector, data helps recognise illegal activities such as money laundering; and in meteorology, it helps study global warming.
The economy is now often referred to as the digital, or transformative, economy. This digitalisation impacts the way we work and socialise, and the competitiveness of our companies is now reliant on data. In 2017, there were 2.7 zettabytes(1) of data worldwide. Projections are that the world's data holdings will increase to 175 zettabytes by 2025.
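A quick back-of-the-envelope check of what that projection implies, sketched in Python (the zettabyte figures come from the text above; the assumption of smooth compound growth is ours):

```python
# Implied compound annual growth rate (CAGR) of global data volumes,
# using the figures cited in the text: 2.7 ZB in 2017, 175 ZB projected for 2025.
start_zb, end_zb = 2.7, 175.0
years = 2025 - 2017

# CAGR = (end / start) ** (1 / years) - 1, assuming smooth compounding
cagr = (end_zb / start_zb) ** (1 / years) - 1
print(f"Implied compound annual growth rate: {cagr:.0%}")
```

In other words, the projection implies global data holdings growing by roughly 68% every year over that period.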
Key statistics that underscore the significance of Big Tech:
• We generate 2.5 quintillion bytes of data daily.
• The big data analytics market is set to reach $103 billion by 2023.
• Poor data quality costs the US economy up to $3.1 trillion yearly.
• The number of internet users worldwide continues to increase dramatically.
• 95% of businesses cite the need to manage unstructured data as a problem for their business.
While the Information and Communication Technology (ICT) sector improves our lives in many ways, it has also profited during the Covid-19 crisis and enhanced its influence in an otherwise very challenging economic environment. In particular, the so-called 'Big Five'(2) from the US West Coast have grown and diversified significantly. Together with Tesla they represent over $8 trillion in market capitalisation—roughly 25% of the value of the S&P 500(3). Their collective market capitalisation grew by over 40% in 2020, a period in which many other companies and sectors suffered badly due to the pandemic. In certain ways this makes these companies more powerful than some countries and governments.
Against this background of dynamic growth and concentrated corporate power, the digital revolution is not without its challenges. With power comes responsibility—and the potential for abuse. In this regard, this Viewpoint highlights three fundamental areas of concern that are drawing the attention of governments, investors, and civil society:
• An economic problem relating to anti-competitive concerns and concentration of power – a 21st Century version of the early 20th Century 'robber barons' that portends a new era of regulation and possible trust-busting. This is exacerbated by the industry's intrinsic economies of scope, which encourage business models built on ranges of products and services that leverage an unprecedented concentration of knowledge and customer data.
• A social problem relating to the impacts of big tech companies on customers and society more generally, particularly social media companies. This relates to the exploitation of user data, and the potential for both behavioral manipulation and disinformation leading to dystopic social outcomes. While technology can provide many solutions that can have positive impacts on issues like climate change and fossil fuel usage (for example, Zoom meetings can eliminate the need for some travel), there are also environmental externalities that result from the ICT sector, including toxic waste from electronic components and the vast energy demand of blockchain.
• A governance problem rooted in the dual-class share-ownership structures prevalent in the ICT sector, particularly among the 'Big Five' plus Tesla. In prominent cases such structures compound the governance risks and the concentration of economic power by entrenching management and reducing accountability to shareholders. Independent boards must be prepared to protect minority shareholder interests while adapting effectively to the dynamic economic and social externalities created by Big Tech companies.
The interplay and confluence of these problems creates the potential for them to build on and reinforce one another. This 'perfect storm' creates systemic risks, with potential consequences for the companies themselves, their investors, their employees, their customers and broader society.
In this Viewpoint, we explore these issues and identify ways in which investors can engage with technology companies to achieve higher levels of accountability and corporate responsibility. We look at the role of boards to guide the management of these companies in how to make their businesses more resilient or ‘futureproof’ over the long-term and to create sustainable value for all stakeholders.
2. What is the economic problem?
As the global economy increasingly shifts to a digital, data-driven environment, the internet is a vital lifeline, a must-have. Effectively, it is a global utility lacking global regulation. Linked to this, the most fundamental economic problem of the big tech companies relates to their vast scale, power and omnipresence, and how this impacts a competitive marketplace.
The big tech companies operate in what are nominally competitive markets; however, their market positions buffer them significantly from many competitive pressures. For many big tech offerings a network effect kicks in: these services become more valuable to customers as the number of users grows.
This reinforces competitive positions and creates substantial barriers to entry that squash potential competitors—a vicious or virtuous cycle, depending on one's perspective. Competitors that do emerge become targets for acquisition and further consolidation of power by the big tech companies—for example, Facebook's acquisitions of Instagram and WhatsApp. This creates 'opportunities of scope': extending customer/user data to a wide range of products and services.
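The winner-take-most dynamic described above can be illustrated with a deliberately simplistic toy model (the platforms, user counts and joining rule below are hypothetical, and the Metcalfe-style value function is only one common proxy for network value):

```python
def network_value(users: int) -> int:
    """Proxy for a platform's value: the number of possible user-to-user connections."""
    return users * (users - 1) // 2

# Two hypothetical platforms; A starts with a modest head start.
a_users, b_users = 1_100, 1_000

# Each period, new users join whichever platform is currently more valuable.
for _ in range(5):
    if network_value(a_users) >= network_value(b_users):
        a_users += 100
    else:
        b_users += 100

print(a_users, b_users)  # the early leader captures all subsequent growth
```

Even a small initial lead compounds: because value rises super-linearly with the number of users, the larger network attracts every subsequent cohort of newcomers, mirroring the barrier-to-entry dynamic described above.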
The ethics and the social dangers relating to the abuse of customer data will be explored later in this Viewpoint, but an important dimension of the economic problem relates to tech company business models that mine and monetise customer data that they collect. For online services where the customer is not paying an access fee, the user effectively is the product, providing tech companies with valuable customer data, often without cost, to support their products and services. This, in turn, enhances their attractiveness for generating advertising revenues.
This business model challenges traditional ways of assessing the degree of competition, where price has traditionally been the focus of analysis. Where price is not the issue, as with a social media account, for example, monopolistic influence has to be captured in different ways to assess or test for market failures or externalities. However, the high profitability of the big tech companies does suggest that these firms are enjoying 'economic rents'—profits well in excess of their costs.
Other economic issues relate to corporate taxation and the concern that big tech companies can use discretion in where revenues and costs are allocated geographically to minimise tax payments through tax arbitrage. In a broader context beyond the scope of this Viewpoint, some also view the tech sector as contributing to income inequality by shifting income and wealth from labour to capital (4).
3. What are the social issues?
When it comes to social impact, tech companies face a wide range of interrelated and challenging issues, many of which have human rights at their core. These include data governance, privacy protection, addictiveness, online bullying and exploitation, and the safety and impartiality of their Artificial Intelligence (AI) products and innovations.
The exploitation of user data extends ultimately to what Shoshana Zuboff of Harvard Business School calls 'surveillance capitalism'(5). Everything that individuals do online is watched, tracked, measured and recorded. Tech companies know exactly what images individuals look at and for how long. They know when people are lonely or depressed. They know individuals' personality types. This is not limited to big tech: many smaller firms also buy, build and monetise data through data brokers and consumer surveillance. Secret surveillance scoring is employed by companies large and small. It invades personal privacy and enables manipulative practices that can affect individual behaviour(6).
Using data mining tools, tech companies collect vast amounts of individual data, and from this base of 'big data' they can apply algorithms to target existing or potential customers. Algorithms can be applied not only with a view to attracting new business, but also with a view to manipulating behaviour and influencing user attitudes and perspectives more generally, including political applications. In extremis, algorithms exploiting individual data have the potential to reinforce user biases, creating discrete 'social bubbles' that feed on disinformation and distort reality. At the most fundamental level this amounts to a violation of human rights.
There is an environmental dimension too. The energy used for mining cryptocurrencies is one example. Data centres are another: many are located in emerging markets where energy generation is still primarily fossil-fuel based. While many tech companies have committed to net zero in their own operations, this raises broader questions about their supply chains and operating channels.
Despite the complexity of the companies, their product and service offerings, and the algorithms behind them, it comes down to this: users should feel safe and be respected. One pathway from here would be for the user to refuse to be the product, which can only be achieved under the current economic paradigm by starting to pay for all the digital services we use.
A less radical, and possibly more aspirational, approach is to motivate tech companies, their paying clients and shareholders at large to understand that it is in the interest of everyone who uses online services, social media, the Internet of Things (IoT) or AI to set the bar high enough for us all to be confident in the ethics and integrity of current product and service offerings, as well as in their safe and respectful future development.
Achieving this aspiration will not come without effort and resolve. It will require good corporate governance, both by company managers and the board. It is also likely to involve further regulation by governments to address market failures and externalities.
As an important subset of the tech sector, social media calls for particular attention in this debate. It has created meaningful systemic change in the way we interact socially. Much of this is positive, but we have been naive about the flip side of the coin. Do we fully understand the consequences of the systemic changes in society brought about by social media?
Social media is addictive. We have a biological imperative to connect with other people. It directly affects the release of dopamine, linking both the brain and the body. We calibrate our lives around a perceived sense of perfection created by the tech companies. Social approval has always been part of our psyche but not at such a scale. The social platforms are simply a matchmaking service and have very few costs to absorb. Advertisers are the customers – the population is what is being sold.
Social media became available on mobiles from 2009. Since then, US hospital admissions for non-fatal self-harm are up 189% for girls between the ages of 10 and 14, compared with the decade beginning in 2001. It is the same pattern with suicide: rates for girls aged 15 to 19 are up 70% compared to the previous decade, and for pre-teen girls the figure is 151%. Social media is also leading to behavioural change: the number of kids taking driving lessons has dropped, and so has the number of teenagers dating. These social dysfunctionalities were explored in detail in the popular 2020 Netflix film, The Social Dilemma: https://www.imdb.com/title/tt11464826/
Health data is one of the most sensitive types of data that exists, and its protection is paramount. Tracking health data can be necessary for authorities to respond to a fast-moving outbreak, but mismanagement of this data can lead to mistrust and lower use of digital health tools.
It is all about changing what people do and how they think. Jaron Lanier, inventor of virtual reality and author of ‘Ten arguments for deleting your social media accounts right now’ observes that “social media’s holy grail is the gradual, slight, imperceptible change in your own behaviour and perception.”
The political dimension adds to this. Cambridge Analytica's applications of Facebook data were employed to influence, if not manipulate, the 2016 US Presidential election. Former President Donald Trump relied heavily on Twitter for his personal messaging – until his account was frozen following his instigation of the storming of the US Capitol on January 6, 2021. The ability of social media platforms to enable disinformation has distorted how individuals perceive reality – with consequential real-world impacts, including the spread of extremism. The decisions by Twitter and Facebook to freeze Trump's accounts following the raid on the Capitol have raised further questions in the US, particularly among the Republican party, regarding freedom of speech and the power of these firms to control individuals' access to their platforms.
In this context the campaign against social media has achieved a rare feat in the US; it has brought both Democrats and Republicans together in challenging them—albeit for different reasons.
4. Emergence of regulation around the world
The role of regulation in a sector is to address market failures and externalities that result in negative economic and social outcomes. In the case of the ICT sector, regulatory standards have not kept pace with the fast market dynamics, particularly in relation to anti-trust, data protection, market abuse and monopolistic behaviour. This is poised to change.
The prospect of new regulation may address many real or potential abuses stemming from the tech companies' activities and business models. The European Union has already taken steps in this direction by introducing the Digital Services Act and the Digital Markets Act in late 2020. This regulatory initiative frames the big tech companies as "gatekeepers". The Digital Services Act addresses platforms' responsibilities for illegal content such as hate speech, terrorist material or child pornography, while the Digital Markets Act targets abusive market practices by these gatekeepers. The latter lays out ex ante competition rules targeted at gatekeepers and imposes sanctions for non-compliance, which could include fines of up to 10% of the gatekeeper's worldwide turnover. The Act states that for recurrent infringers, these sanctions may also involve the obligation to take structural measures, potentially extending to divestiture of certain businesses, where no other equally effective alternative measure is available to ensure compliance(7). These comprehensive remedies have the potential to shape market and company practices and could provide a regulatory model for the rest of the world. But a key challenge is the effective implementation of these Acts and their subsequent enforcement.
Across the Channel, a post-Brexit UK is proceeding with its own approach to regulation of the tech sector. In November 2020, the government announced plans to establish a Digital Markets Unit (DMU) to introduce a new code governing the behaviour of the dominant tech platforms(8). The DMU has been described as a "hybrid" version of the EU's Digital Markets Act and Digital Services Act, and maximum fines for non-compliance could reach 10% of a company's annual global revenue(9). Alongside this, the Online Harms Act, first introduced in 2018, will be reviewed for passage in 2021. It focuses on targeting harmful internet content that can be used to spread terrorism and disinformation, undermine civil discourse, and abuse or bully other people(10).
While Europe may be leading on reforms, regulatory activity is building in other jurisdictions globally. For example, in the US the Justice Department filed a lawsuit in 2020 against Google, accusing the company of illegally protecting a monopoly. More recently the US Federal Trade Commission (FTC) has teamed up with state regulators in accusing Facebook of buying up rivals to squash competition(11). In October 2020, the Democratic-led US House Judiciary Committee's Subcommittee on Antitrust published a report on competition in digital markets, taking into consideration reform proposals ranging from structural breakup to the development of a new and bespoke regulatory body to address the tech sector. With a Biden presidency and a Democrat-controlled Congress, this study lays the foundation for a potential new policy agenda(12).
In Australia, the tension between big tech and national governments was recently illustrated by Facebook's unilateral decision in February 2021 to block the sharing of news in Australia while the government was considering a new law requiring it to negotiate and pay for news content. Facebook claims it was simply protecting itself from being penalised commercially. However, Australian Prime Minister Scott Morrison expressed concern about the behaviour of big tech companies "who think they are bigger than governments and that the rules should not apply to them". Facebook has since pledged to pay for news content in Australia, so that particular issue may be resolved for the time being, but the incident highlights the underlying tensions between big tech companies and governments.
China is another important actor in this debate, and hosts several of the world's large technology companies, including the giant retailer Alibaba. China is also taking a stand against potential abuses from its high-tech sector. Its State Administration for Market Regulation (SAMR) has proposed draft rules on anti-competitive behaviour and has begun a probe into Alibaba's market dominance. The share prices of Alibaba and other leading tech companies dropped significantly after the announcement of the SAMR draft rules(13).
The Chinese emphasis on anti-competitiveness is similar to concerns expressed in the US, Europe and Australia. However, in a geopolitical context China may be on a different trajectory, as Western observers also note China’s intent to compete with the West and assert global leadership and influence in the area of digital technology and innovation (14). This suggests that Chinese regulatory initiatives in the technology sphere may place lesser emphasis on the social and human rights concerns that are arising in Europe and the US.
5. Data protection and regulation: GDPR – a model for other markets?
While more fundamental regulatory initiatives await, the area of data protection has already featured in new regulation and is better defined. As noted previously, if properly analysed, big data can provide positive social benefits in a multitude of ways, including in healthcare, education and banking. But the data revolution is not without its challenges, and regulatory standards have not kept pace with the dynamics of the market's development. A weak legal and regulatory environment can affect everything, from our global transition to a digital economy to the profitability and sustainability of individual businesses(15).
In the EU, the General Data Protection Regulation (GDPR) was established in 2016 to protect the personal data and privacy of EU citizens. It takes a wide view of what constitutes personal identification information, and its provisions cover both transactions that occur within EU member states and the export of personal data outside the EU. GDPR thus has an extraterritorial dimension that broadens its influence beyond the EU itself.
For example, while the UK was subject to GDPR until formally leaving the EU on 1 January 2021, it passed its own Data Protection Act in 2018 containing many aspects of the EU's GDPR. As the provisions of this Act are transposed into UK law, the regulation will be referred to as the UK GDPR. Similarly, recently proposed regulation in Canada, the Consumer Privacy Protection Act, intends to reshape Canada's federal framework for privacy protection. With heightened standards that are consistent across provinces, and a co-regulatory approach, it appears to align with the EU's GDPR regulatory model(16).
6. What are the corporate governance issues?
Regulation alone is insufficient as the sole discipline guarding company behaviour against potential abuses. There is also the need for self-discipline and appropriate corporate governance. While there are many dimensions to consider in the governance of big tech companies, we focus on two key issues:
• The first relates to the governance of sustainability: how big tech managers and boards should come to understand the company's relevant sustainability issues and social impacts, and navigate the company through this challenging maze of risks and opportunities.
• The second is a byproduct of the sector's concentration of economic power and social influence over its stakeholders. It relates to ownership structure and the concentration of power held by those with strong economic and voting control. This concerns the ability of boards to ensure effective and independent oversight of powerful company managers, and the ability of a company's shareholders to engage with the company to ensure alignment of interests. For many tech companies, dual class share structures compound this problem by cementing the control of the owner/founder in a way that can also entrench managers and dilute accountability to shareholders and other stakeholders.
The governance of sustainability in Big Tech
The governance of sustainability will feature prominently in the revisions of ICGN’s Global Governance Principles for 2021 and has particular relevance in the ICT sector. It is critical for the boards of tech companies to proactively build understanding and awareness of the sector’s, and the individual company’s economic and social challenges—and to relate them to how the company is both managed and governed. This reflects the need for executive management and boards to have governance models that “futureproof” companies to the greatest extent possible and preserve the company’s ability both to generate sustainable value and serve a legitimate social purpose.
Tech companies are increasingly subject to investor and public scrutiny on their human rights due diligence, data management and the accountability of the board for questions of ethics, culture and integrity. The famous Google mantra of “don’t be evil”, which was retired from the company’s code of conduct in 2018, suggests recognition of the need for tech companies to demonstrate responsible (or at least not irresponsible) behaviour. Ultimately, it is in shareholders’ interests to be able to trust, and to see that other stakeholders, including regulators, also build trust in tech companies to address their social challenges responsibly.
Social media companies, aided by their lobbyists, often position themselves simply as neutral technology platforms for third-party content, seeking to avoid responsibility for the content itself. They seek legal safe harbour through the argument of promoting freedom of speech—a key element of the US Constitution invoked by traditional media companies. US companies also benefit from Section 230 of the Communications Decency Act (1996), which provides immunity for website platforms from liability for third-party content(17). Given increasing regulatory initiatives, social media platforms will no longer be in a position to distort their stated corporate purpose, ignore social harms or dismiss externalities relating to disinformation.
Boards of directors play a pivotal role in safeguarding the protection of human rights in the products and services tech companies offer. These are difficult waters to navigate, bearing in mind all the good that technology and social media have brought us alongside the negative externalities they create. How do boards draw the line between blocking the spread of harmful disinformation and serving as a platform for freedom of speech?
A good starting point for ICT boards is the articulation of a clear and legitimate sense of corporate purpose. This requires awareness of the salient sustainability factors that affect the individual company and its stakeholders, along with recognition of accountability for externalities and negative impacts. It can be an exercise in thinking about ‘double materiality’ – i.e., not just the impact of sustainability factors on the company, but also how the company impacts society.
Determining a digitalisation strategy will be complex amidst a continually changing business environment. A systematic approach is needed to integrate digitalisation into strategic plans, establish the right policies and design an appropriate information architecture across the organization. The first step should be to ensure that senior management and the board of directors alike are committed to the company’s digitalisation strategy and are sufficiently knowledgeable to oversee implementation.
Digital transformation requires fundamental changes in an organization's culture, operations and processes. As with ESG or human resources issues, there needs to be a dedicated digitalisation strategy, but digitalisation should also be integrated into the company's overall strategic plan. Boards need to demonstrate that they understand why their companies will benefit from digitalisation and how they can support the effort. Boards can play an important role in signaling the need for new data governance rules. The fundamental issues for boards are around data value (intellectual property and confidential information, for example) and how to retain, access, control and protect individual data.
The effectiveness of ICT boards requires not only a moral compass but also appropriate composition, skill sets and training. The boundaries of corporate governance continue to expand, and directors of ICT companies are expected to be aware of and oversee a growing number of complex economic and social issues, such as human rights due diligence, customer data privacy, applications of artificial intelligence(18) that dominate smaller developing countries in a form of 'digital colonialism', and the psychological damage teens suffer from social media addiction. While putting the genie back in the bottle is not an easy task, it still needs to be addressed if the ICT sector is to thrive in the long term.
ICT boards must build an understanding of the trade-offs between freedom of expression, a cornerstone of democracy, and deepfakes(19) leading to an 'infocalypse', which is detrimental to democracy. While the board as a whole must understand and be held accountable for sustainability-related risks in tech companies, some boards might seek particular sustainability expertise to build the board's collective skill set. A board committee focusing on sustainability factors might also be an appropriate structural tactic to address these issues. Such a committee could oversee how the human rights of stakeholders are monitored, or ensure that meaningful human rights due diligence processes are part of all product innovation and development.
The UN Guiding Principles on Business and Human Rights(20) provide a useful framework for boards to structure their approach to these issues. The correct and timely application of these principles is a work in progress: the pace of innovation means that ICT companies often find themselves in uncharted territory. Other resources exist to help tech company boards. For example:
• UN’s ‘B-Tech Project’ provides guidance and resources for implementing the UN’s Business and Human Rights Principles in the technology space(21).
• The Global Network Initiative, a membership organization, provides principles and guidance to ICT companies on human rights and content regulation(22).
• The World Benchmarking Alliance, whose mission is to facilitate benchmarking of best practice, also provides guidance to tech companies on addressing human rights challenges and ensuring open, inclusive and ethical innovation(23).
Challenges of dual class share structures
Ownership structure is a key governance factor for ICT companies, particularly for those companies with controlling owners. Many tech companies are relatively young, with founders and senior managers often maintaining significant ownership stakes along with management or board responsibilities.
Particularly for those companies that are publicly listed it is important to align interests of the controlling owners with minority shareholders. However, there can be a tension between these two shareholding groups. Many tech companies are concerned that their ability to be dynamic, inventive, and entrepreneurial in a fast-moving market space can be inhibited by institutional shareholders pressuring the company for short term results at the expense of long-term performance and growth. As a result, tech companies can be wary of the ‘animal spirits’ of institutional shareholders interfering with their businesses.
This is why dual class share structures have become a popular form of ownership structure with tech companies, particularly in the US Silicon Valley. By offering two (or more) share structures with differing voting rights, majority voting control can be maintained by the controlling shareholder even with a minority economic interest in the company. Tech giants Google, Facebook, Tesla, and Alibaba all have dual class share structures, and this sets an example for other tech companies that want to follow this model.
These perceived benefits of greater entrepreneurial freedom and buffering tech companies from short term market pressures are clearly attractive to companies. But for institutional investors who serve as a company’s minority shareholders there is a reciprocal problem that comes with differential ownership. The two main concerns are:
• Dual class shares can entrench management, even when management is no longer effective.
• Dual class shares marginalise the voice of minority investors by diluting their voting rights; this is anathema to the ambitions of investor stewardship.
To cite just one of a number of research papers on dual class shares: Robert Jackson, former Commissioner at the US Securities and Exchange Commission and currently a law professor at New York University School of Law, expresses reservations about dual class shares and suggests that if there is an advantage to dual class structures, such structures should not be permanent, as they can lead to value deterioration over time(24). The following graph from his research illustrates this point:
Source: Robert Jackson, US Securities and Exchange Commission, 2018
Ultimately this graphic reflects the private benefits of control enjoyed by the controlling shareholder, but not by minority shareholders. For these reasons ICGN has long opposed dual class shares(25), and we have explained in detail the basis of our opposition in numerous comment letters and consultations in markets around the world(26).
While the risks of dual class voting structures can ultimately be priced into a company's valuation to reflect this potential underperformance, we believe the most sensible starting point is simply to avoid introducing dual class share regimes in the first place. As a second-best alternative, dual class shares with defined sunset clauses could mitigate these downside risks. Otherwise, we fear a slippery slope to unintended consequences, even with the best of intentions.
So, in light of investor opposition to dual class shares and evidence suggesting how this might impact long-term performance, why do tech companies do this anyway? The short answer is because they can, whether minority shareholders like it or not.
For one thing, dual class issuance is permitted in several major markets, including both Nasdaq and the New York Stock Exchange (NYSE) in the US. The ability of these US exchanges to attract high profile listings such as Google, Facebook, Tesla and Alibaba, through their historically more lenient standards on dual class shares, has led to 'FOMO': fear of missing out on the next 'unicorn'(27). In turn, other leading exchanges, such as Singapore and Hong Kong, have dropped their previous restrictions on dual class shares to compete more evenly for company listings, particularly in the fast-growing ICT sector.
The UK, long a bastion of high regulatory standards, is now considering loosening them to allow dual class issuance on the London Stock Exchange (LSE) for companies with a Premium Listing(28). The potential shift is motivated in particular by a desire to attract more technology listings; proponents regard the UK and the LSE as being at a competitive disadvantage given the current inability to offer ICT companies Premium Listings (and potential membership in UK stock indices). ICGN and others have labelled this form of competition a 'race to the bottom', in which the interests of exchanges dilute regulatory standards at the expense of investors(29).
These concerns about dual class shares are general and not limited to tech companies. What singles out the ICT sector in this debate is in part what Scott Galloway, a marketing professor at New York University, calls an 'idolatry of innovators': giving particular license to innovative tech companies to resist influence from, and limit accountability to, minority shareholders. However, for the big tech sector in particular, which is already beset by regulatory concerns relating to monopolistic practices and excessive market dominance, the existence of dual class share structures only compounds this concentration of control.
Facebook is often a focus of this discussion. The technical savvy and market vision of Facebook’s founder Mark Zuckerberg created a social communications platform with a huge influence on its users and society more broadly – creating significant value for shareholders. Facebook’s dual class share structure provides Zuckerberg, and his designated heirs, indefinite, if not infinite, control of the company. So, on top of Facebook’s market dominance Zuckerberg is also effectively entrenched and shielded from accountability to his shareholders.
Is this a good thing for a public company? As Facebook evolved from a small tech start-up to its current market capitalization of over $770 billion, its management and governance requirements have clearly evolved as well. Many of the technical and entrepreneurial skills that Zuckerberg brought to Facebook are no doubt still relevant and add value. But this savvy does not automatically equip him with the wisdom and capabilities to navigate the company as a giant monolith facing a very different set of challenges, including the prospect of regulation and reprisals, as well as social challenges relating to ethics, human rights and data privacy.
The concept of the life cycle of corporate governance suggests that as a company grows and evolves, its governance and managerial needs will change(30). The problem with dual class share structures, at least those without sunset clauses, is that they can inhibit the company from making appropriate adjustments to its executive leadership and governance structures as its needs evolve over time.
Investor engagement with Big Tech: priority questions
ICT is a complex, dynamic and hugely important sector. It has contributed many positive innovations for humanity and drives the 21st Century economy. Yet the challenges relating to market dominance and the social impact of high tech companies, both individually and politically, pose huge questions for governance. For the Big Five companies in particular, it is a question of excessive market power, exacerbated in some important cases by dual class structures that diminish accountability to shareholders.
For investors, boards of directors of ICT companies should serve as the first line of defense to ensure these companies address their social impacts responsibly— and to ensure the company’s ability for sustainable value creation(31). But investors also have a role to play in engaging with ICT boards and executive management. Achieving access to direct engagement with tech companies at a senior executive and board level can be a challenge, particularly for those companies with dual class share structures intended to buffer companies from minority shareholder input. Nevertheless, investors should seek opportunities for engagement, and can also make use of other tactics if a company is not responsive to outreach for direct engagement. To cite one example, an international coalition of the largest investors in Alphabet, the parent company of Google, co-filed a resolution at its 2020 annual general meeting, calling on the company to bolster board oversight of human rights risks(32).
Investor engagement in the ICT sector is not limited to company dialogue. As the regulatory landscape develops in Europe, the US and other jurisdictions, investors should also be prepared to engage at a regulatory level, supporting proposals that are constructive and pushing back on those that are not.
While each ICT company will have its own distinctive business and social dynamics, we offer some general engagement questions for investors to consider in their interaction with tech companies:
Ethics, Human Rights and board effectiveness
• What is the Board’s stance on ethical innovation? How is this position reflected in the operations and future outlook of the company?
• In terms of board composition and expertise, does the board have the right skillsets to address the company's social impact and governance challenges? Board members should be able to understand the digital economy, and its opportunities and risks.
• Does the Board ensure the periodic independent assessment of salient human rights risks?
• What are the checks and balances in place ensuring the human rights commitments and policies are implemented?
• What grievance mechanisms are in place and how is timely remediation of human rights impacts ensured?
• Does the company have an active lobbying programme? If so, who is the company engaging and on what themes?
• Does the company make political donations related to its lobbying efforts? If so, what is the intent behind these donations?
• Can management explain where its data is coming from? Is this something discussed at board level? Data can be collected in existing datasets, generated through the deployment of new technologies or acquired from third parties. Management should be able to explain how they assess the risks associated with the use of data from third-parties.
• Where does data reside? There needs to be clarity regarding the use of cloud-based services to store datasets for secondary uses. For example, many public sector organizations in specific countries are required to ensure that cloud-based servers must be located within that country. Storing data in other jurisdictions may result in different privacy regimes applying to personal data.
• Is the company fully compliant with data privacy laws and regulations? Procedures to monitor and report on compliance to privacy laws and regulations should be in place.
• Do datasets and algorithms deployed in the organization meet acceptable ethical standards?
• Can the company plainly explain their processes, including re-use and the deployment of algorithms and machine learning tools, where data is coming from, how it is being used, and who is using it?
• Does the board understand what the company does with data? Management should identify ownership and control of data that will be subject to secondary use. Data sets should be easily retrieved, collated and labelled in order to facilitate secondary data use.
• Does management have an accountability structure for the development and ethical use of AI and machine learning? Board members should expect a defined chain of responsibility to be in place at each stage and level of data use.
Dual Class Share Structures
• Where dual class structures are in place how does the board seek to provide independent oversight over the controlling owner to ensure alignment and accountability with the long-term interests of the company and minority shareholders?
• If a dual class structure is in place without a sunset provision, does the board ever encourage the controlling owner to introduce such a provision to guard against the potential abuses of dual class shares?

References
1 What is a zettabyte: a terabyte is equal to 1,024 gigabytes; a petabyte is equal to 1,024 terabytes; an exabyte is equal to 1,024 petabytes; a zettabyte is equal to 1,024 exabytes.
2 Amazon, Apple, Facebook, Microsoft and Google. Now that Tesla has entered the S&P 500, many now refer to the Big Six.
3 Paul R. La Monica, ‘Proof Big Tech is way too big: it’s a quarter of your portfolio’ CNN Business, 6 January 2021: https://edition.cnn.com/2021/01/06/investing/stocks-sp-500-tech/index.html
4 Martin Sandbu, How the internet giants damage the economy and society, Financial Times, 20 February 2018. This is one of an excellent set of articles by Sandbu featured in the FT’s collection on The Economics of Big Tech: https://www.ft.com/economics-of-big-tech
5 “It is capitalism profiting of the infinite tracking of where people go by large tech companies. Their business model is to make sure advertising campaigns are as successful as possible. This is a marketplace that never existed before. It trades exclusively in human futures.” Zuboff, Shoshana, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: Public Affairs, 2019.
6 See the letter to the US Federal Trade Commission by the consumer advocacy body #REPRESENT, 29 April 2020: https://www.representconsumers.org/wp-content/uploads/2020/04/2020.04.29_REPRESENT-Letter-to-FTC-re-Scores.pdf
7 European Union, Europe fit for the Digital Age: Commission proposes new rules for digital platforms, 15 December 2020: https://ec.europa.eu/commission/presscorner/detail/en/ip_20_2347
8 UK Government Press Release, New competition regime for tech giants to give consumers more choice and control over their data, and ensure businesses are fairly treated, 27 November 2020: https://www.gov.uk/government/news/new-competition-regime-for-tech-giants-to-give-consumers-more-choice-and-control-over-their-data-and-ensure-businesses-are-fairly-treated
9 Saqib Shah and Gaurang Dholakia, UK's regulatory regime for big tech to take shape in 2021, Standard & Poor's Global, 5 January 2021: https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/uk-s-regulatory-regime-for-big-tech-to-take-shape-in-2021-8211-experts-61831856
10 UK Government, Online Harm White Paper, 15 December 2020: https://www.gov.uk/government/consultations/online-harms-white-paper/online-harms-white-paper
11 Cecilia Kang and Mike Isaac, U.S. and States Say Facebook Illegally Crushed Competition, New York Times, 9 December 2020.: https://www.nytimes.com/2020/12/09/technology/facebook-antitrust-monopoly.html
12 Mark MacCarthy, The House Antitrust report is a major step toward reining in Big Tech, Brookings Institution, Center for Technology Innovation, 20 October 2020: https://www.brookings.edu/blog/techtank/2020/10/20/the-house-antitrust-report-is-a-major-step-toward-reining-in-big-tech/
13 Ryan McMorrow, Nian Liu and Mercedes Ruehl, China draws up first antitrust rules to curb power of tech companies, Financial Times, 10 November 2020: https://www.ft.com/content/1a4a5001-6411-45fa-967c-0fd71ba9300b
14 James L. Schoff, Competing With China on Technology and Innovation, Carnegie Endowment for International Peace, 10 October 2019: https://carnegieendowment.org/2019/10/10/competing-with-china-on-technology-and-innovation-pub-80010
15 The Professional Accountant’s Role in Data: Draft Discussion Paper, Joint Publication, Chartered Professional Accountants of Canada and International Federation of Accountants, expected release date March 2021
16 Helen Cheng, The rise of co-regulation, from GDPR to Canada’s Bill C-11, 20 December 2020:
Michael Nadeau, General Data Protection Regulation (GDPR): What you need to know to stay compliant, CSO US, 12 June 2020 https://www.csoonline.com/article/3202771/general-data-protection-regulation-gdpr-requirements-deadlines-and-facts.html
17 See Cornell Law School, Legal Information Institute, 47 U.S. Code § 230 - Protection for private blocking and screening of offensive material: https://www.law.cornell.edu/uscode/text/47/230
18 See ICGN Viewpoint on Artificial Intelligence and Board Effectiveness, February 2020: https://www.icgn.org/artificial-intelligence-and-board-effectiveness
19 Deepfakes are synthetically generated pieces of 'information' constructed by algorithms to create an impression of credibility. They can cause unprecedented disruption through the uncontrolled proliferation of misinformation that affects the judgement and behavior of large parts of the population.
20 United Nations Global Compact, Guiding Principles for Business and Human Rights: Implementing the United Nations “Protect, Respect and Remedy” Framework, 2011: https://www.unglobalcompact.org/library/2
21 United Nations Human Rights, Office of the High Commissioner: https://www.ohchr.org/EN/Issues/Business/Pages/B-TechProject.aspx
22 Global Network Initiative: https://globalnetworkinitiative.org/
23 World Benchmarking Alliance: https://www.worldbenchmarkingalliance.org
24 See: Jackson, Robert J. Jr. (2018) Perpetual Dual-Class Stock: The Case Against Corporate Royalty, U.S. Securities and Exchange Commission: https://www.sec.gov/news/speech/perpetual-dual-class-stock-case-against-corporate-royalty#_ftn19
25 ICGN Viewpoint on Differential Ownership Rights, April 2015: https://www.icgn.org/policy/viewpoints/differential-rights
26 For ICGN’s latest comment letter on dual class shares, please see our response to the UK Hill Review on UK listings, 18 December 2020: https://www.icgn.org/sites/default/files/26.%20ICGN%20Letter%20to%20UK%20Hill%20-%20Call%20for%20Evidence%20–%20UK%20Listings%20Review.pdf
27 A 'unicorn' is a private startup company valued at over $1 billion.
28 Op. cit., ICGN comment letter to the UK Hill Review on UK listings, 18 December 2020.
29 See ICGN Viewpoint, Stock exchanges and shareholder rights: a race to the top, not the bottom, December 2018: https://www.icgn.org/stock-exchanges-and-shareholder-rights-race-top-not-bottom-0
30 Igor Filatochev and Mike Wright, The Life Cycle of Corporate Governance, Edward Elgar Publishing, 2005.
31 An excellent review of investor expectations relating to big tech and human rights was published in 2020 by the Swedish Pension Fund's Council of Ethics and the Danish Institute for Human Rights. It details a range of human rights risks, including privacy and data protection, freedom of expression, elections and public discourse, and discrimination: https://etikradet.se/wp-content/uploads/2020/12/Tech-giants-and-human-rights-Investor-expectations-.pdf
32 The investor group included Aviva Investors, Axa Investment Managers, Boston Common Asset Management, the Church of England Pension Fund, Federated Hermes, NEI Investments and Robeco. See Siobhan Riding, Alphabet faces investor backlash over human rights policies, Financial Times, 24 February 2020: https://www.ft.com/content/02f38575-1050-4f18-a49d-52bd054a5d51