{"id":23754,"date":"2021-07-21T12:28:44","date_gmt":"2021-07-21T12:28:44","guid":{"rendered":"https:\/\/www.publicknowledge.org\/?p=23754"},"modified":"2021-07-21T12:28:44","modified_gmt":"2021-07-21T12:28:44","slug":"the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat","status":"publish","type":"post","link":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/","title":{"rendered":"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat"},"content":{"rendered":"<p>If you\u2019ve been following the <a href=\"https:\/\/www.judiciary.senate.gov\/meetings\/algorithms-and-amplification-how-social-media-platforms-design-choices-shape-our-discourse-and-our-minds\" target=\"_blank\" rel=\"noopener\">Capitol Hill hearings about algorithms<\/a> and automated decision-making, then you\u2019ve probably heard technology companies talk about how they offer or <em>want <\/em>to offer \u201ctransparency and user control\u201d to consumers. Companies like Facebook and Twitter propose sharing information on how their algorithms work with the public, as well as enabling users to tweak these algorithms to change their experience on digital platforms. They argue that users are less susceptible to manipulation if they can control how a digital platform\u2019s algorithm delivers content to them. While this seems persuasive, this kind of regulatory regime poses significant dangers. 
It cannot address harms like <a href=\"https:\/\/www.vox.com\/recode\/2020\/2\/18\/21121286\/algorithms-bias-discrimination-facial-recognition-transparency\" target=\"_blank\" rel=\"noopener\">discrimination<\/a>, <a href=\"https:\/\/www.nbcnews.com\/pop-culture\/pop-culture-news\/months-after-tiktok-apologized-black-creators-many-say-little-has-n1256726\" target=\"_blank\" rel=\"noopener\">loss of economic opportunity for content creators<\/a>, or <a href=\"https:\/\/www.technologyreview.com\/2020\/01\/29\/276000\/a-study-of-youtube-comments-shows-how-its-turning-people-onto-the-alt-right\/\" target=\"_blank\" rel=\"noopener\">radicalization of users<\/a>. Luckily, we\u2019ve <a href=\"https:\/\/www.brookings.edu\/blog\/techtank\/2020\/01\/06\/hitting-refresh-on-privacy-policies-recommendations-for-notice-and-transparency\/\" target=\"_blank\" rel=\"noopener\">already had this conversation on privacy<\/a> &#8212; and can choose not to make the same mistakes twice.<\/p>\n<p>For much of the <a href=\"https:\/\/open.mitchellhamline.edu\/cgi\/viewcontent.cgi?article=1426&amp;context=wmlr\" target=\"_blank\" rel=\"noopener\">internet\u2019s history<\/a>, the U.S. has protected users\u2019 privacy online (to the extent you can say it has been protected) through a \u201cnotice and choice\u201d framework. Companies are expected to explain to users (usually through unreadable privacy policies) what data is collected and how it will be used. Sometimes, although not always, websites and applications will ask users to check boxes indicating that they have read and agreed to the privacy policy. Subsequent use of the website signals that users have been given <em>notice<\/em> about the site\u2019s data practices and have <em>chosen to accept<\/em> those data uses. 
Notice a problem?<\/p>\n<p>This framework assumes a few things that we know are not true: 1) that users <a href=\"https:\/\/medium.com\/cgo-benchmark\/no-one-reads-online-privacy-policies-42b067701efc\" target=\"_blank\" rel=\"noopener\">read privacy policies<\/a>; 2) that they understand <a href=\"https:\/\/www.nytimes.com\/interactive\/2019\/06\/12\/opinion\/facebook-google-privacy-policies.html?mtrref=www.google.com&amp;gwh=3C67BA9B4F1ABDAB90DEBB1C7A303E98&amp;gwt=regi&amp;assetType=REGIWALL\" target=\"_blank\" rel=\"noopener\">what the policies say<\/a>; and 3) that they have a <a href=\"https:\/\/www.tandfonline.com\/doi\/full\/10.1080\/17441056.2020.1839228\" target=\"_blank\" rel=\"noopener\">practical choice<\/a> about whether or not to use a website or application under those conditions. These assumptions are wrong, and, therefore, the \u201cnotice and choice\u201d framework simply <a href=\"https:\/\/scholarship.kentlaw.iit.edu\/cgi\/viewcontent.cgi?article=1567&amp;context=fac_schol\" target=\"_blank\" rel=\"noopener\">can\u2019t protect internet users<\/a>. Instead, what happens is that users drown in information they can\u2019t be expected to read and understand, while companies are allowed to siphon data from those very same users &#8212; data that the companies then exploit. And it\u2019s not 1998 anymore &#8212; not using the internet\u2019s most dominant websites is <a href=\"https:\/\/www.nytimes.com\/2020\/07\/31\/technology\/blocking-the-tech-giants.html\" target=\"_blank\" rel=\"noopener\">not a viable choice<\/a>.<\/p>\n<p>The push for \u201ctransparency and user control\u201d with regard to algorithms is reminiscent of the \u201cnotice and choice\u201d framework. Both rely on users being able to understand the information they are given about a system, and then to make a choice about whether or how they will use it. 
If you thought making privacy policies readable was tough, try <a href=\"https:\/\/freedom-to-tinker.com\/2017\/05\/31\/what-does-it-mean-to-ask-for-an-explainable-algorithm\/\" target=\"_blank\" rel=\"noopener\">explaining an algorithm<\/a>. And since the companies are the ones doing the explaining, this arrangement also provides an avenue for those very same companies to use <a href=\"http:\/\/ceur-ws.org\/Vol-2327\/IUI19WS-ExSS2019-7.pdf\" target=\"_blank\" rel=\"noopener\">dark patterns<\/a> to circumvent user control. But the lack of true transparency isn\u2019t even the major stumbling block for this method of regulation; it\u2019s actually user choice.<\/p>\n<p>In order for a regulatory system targeting digital platforms to be effective, it must maximize user benefits while minimizing the possible harms. Having users choose their algorithm does not address most of the harms that come with automated decision-making processes. First, there are plenty of algorithms that consumer data is used to power but that the consumer does not have a say in <em>how<\/em> or <em>when<\/em> they are used. Think, for example, about how businesses use algorithms tuned to the consumer\u2019s information for <a href=\"https:\/\/hbr.org\/2019\/05\/all-the-ways-hiring-algorithms-can-introduce-bias\" target=\"_blank\" rel=\"noopener\">hiring<\/a>, <a href=\"https:\/\/www.technologyreview.com\/2020\/08\/07\/1006132\/software-algorithms-proctoring-online-tests-ai-ethics\/\" target=\"_blank\" rel=\"noopener\">remote proctoring software<\/a>, <a href=\"https:\/\/www.nbcnews.com\/tech\/tech-news\/tenant-screening-software-faces-national-reckoning-n1260975\" target=\"_blank\" rel=\"noopener\">tenant screening<\/a>, and validating <a href=\"https:\/\/www.iii.org\/insuranceindustryblog\/algorithms-a-i-and-insurance-promise-and-peril\/\" target=\"_blank\" rel=\"noopener\">insurance claims<\/a>, to name a few. 
(Granted, even if a person had some choice or control in how those types of algorithms were used, there would still be serious harms associated with them.) However, let\u2019s narrow our focus to the use cases that Congress has chosen to focus on this year &#8212; content recommendation and curation algorithms. Even if we limit the \u201ctransparency and user control\u201d proposal to content delivery platforms like Facebook, Twitter, or YouTube, it still won\u2019t protect users.<\/p>\n<p>The way people talk about recommendation engines or feeds would suggest that they are operated by one centralized algorithm, but that isn\u2019t what\u2019s happening. Generally, <a href=\"https:\/\/about.fb.com\/news\/2021\/04\/incorporating-more-feedback-into-news-feed-ranking\/\" target=\"_blank\" rel=\"noopener\">there is an algorithm<\/a> tracking your <em>thousands<\/em> of interactions with a particular platform that is attempting to guess the type of content you would like to view next, but there are also <em>other<\/em> algorithmic systems at work influencing what type of content you will see. These systems include content moderation algorithms that enforce community guidelines and standards; copyright identification algorithms that attempt to identify copyrighted material so that it may be taken down; and even advertising algorithms that determine what ads you see and where. For these algorithms to do their jobs, users can\u2019t have a choice in the matter. 
Copyrighted work must be taken down per the <a href=\"https:\/\/law.stanford.edu\/wp-content\/uploads\/2016\/10\/Accountability-in-Algorithmic-Copyright-Enforcement.pdf\" target=\"_blank\" rel=\"noopener\">Digital Millennium Copyright Act<\/a>; community standards aren\u2019t really community standards if they aren\u2019t enforced; and platforms need to please their advertisers to continue making money. We are left with users being able to tinker at the <em>edges<\/em> of the algorithm, which may change their experience somewhat, but certainly won\u2019t address the harms.<\/p>\n<p>Algorithmic harms can be categorized into <a href=\"https:\/\/fpf.org\/blog\/unfairness-by-algorithm-distilling-the-harms-of-automated-decision-making\/\" target=\"_blank\" rel=\"noopener\">four major buckets<\/a>: loss of opportunity, economic loss, social detriment, and loss of liberty. To make these buckets more salient, let\u2019s look at real-world examples. Facebook\u2019s advertising system has been accused of discrimination in the delivery of <a href=\"https:\/\/www.motherjones.com\/politics\/2021\/06\/facebook-discrimination-lawsuit-ads\/\" target=\"_blank\" rel=\"noopener\">insurance<\/a>, <a href=\"https:\/\/www.npr.org\/2019\/03\/28\/707614254\/hud-slaps-facebook-with-housing-discrimination-charge\" target=\"_blank\" rel=\"noopener\">housing<\/a>, and <a href=\"https:\/\/www.propublica.org\/article\/facebook-ads-can-still-discriminate-against-women-and-older-workers-despite-a-civil-rights-settlement\" target=\"_blank\" rel=\"noopener\">employment<\/a> ads. Maybe Facebook doesn\u2019t <em>actively choose<\/em> to discriminate, but instead allows advertisers to determine who they want to see their ads. Advertisers use these targeting tools, along with their own past (often discriminatory) data, to control at a very granular level who sees their ads. 
Even if an advertiser wasn\u2019t intending to be discriminatory, the combination of granular targeting and biased data sets means people of color, women, and other marginalized groups often don\u2019t see ads that would benefit them. The result is a loss of opportunity to change jobs, switch to better insurance, or find new housing opportunities in an area.<\/p>\n<p>Furthermore, platforms aren\u2019t just vehicles for viewing content &#8212; they can also help users make money. YouTube allows content creators to become \u201cpartners\u201d so that they can monetize their content. However, in recent years, these partner creators have accused YouTube of discrimination, saying that <a href=\"https:\/\/www.washingtonpost.com\/technology\/2020\/06\/18\/black-creators-sue-youtube-alleged-race-discrimination\/\" target=\"_blank\" rel=\"noopener\">Black creators<\/a> and <a href=\"https:\/\/www.washingtonpost.com\/technology\/2019\/08\/14\/youtube-discriminates-against-lgbt-content-by-unfairly-culling-it-suit-alleges\/\" target=\"_blank\" rel=\"noopener\">LGBT creators<\/a> have seen their content disproportionately de-monetized or even outright deleted with very little explanation. Those de-monetization and deletion decisions are generally not made by humans, but by the algorithm that is checking content to see if it complies with the site&#8217;s community standards. This opaque, automated enforcement makes it difficult for marginalized creators to take full advantage of the economic opportunities presented by these kinds of platforms. 
Having users \u201cchoose\u201d their preferred algorithm wouldn\u2019t have stopped either of these harms from occurring.<\/p>\n<p>You may wonder if allowing users to choose their algorithm would stop some of the socially detrimental harms, like <a href=\"https:\/\/www.nytimes.com\/interactive\/2019\/06\/08\/technology\/youtube-radical.html?mtrref=www.google.com&amp;gwh=D1E12CBA710794A03CEEDE02372E7ED8&amp;gwt=regi&amp;assetType=REGIWALL\" target=\"_blank\" rel=\"noopener\">radicalization<\/a> or <a href=\"https:\/\/www.theverge.com\/2020\/5\/26\/21270659\/facebook-division-news-feed-algorithms\">polarization<\/a>. There is <a href=\"https:\/\/www.pnas.org\/content\/118\/9\/e2023301118#sec-8\" target=\"_blank\" rel=\"noopener\">little evidence<\/a> that users would actively choose to make their feeds more diverse and engage with a wider range of opinions and sources of information. And, in fact, Facebook itself <a href=\"https:\/\/www.facebook.com\/notes\/751449002072082\/\" target=\"_blank\" rel=\"noopener\">found<\/a> that users are more likely to engage with sensational or extreme content. In the consumer products context, flaws that created such harms would be called design flaws or defects, and it would be the manufacturer\u2019s obligation to fix them. Lawmakers should treat algorithms the same way. These are not tools consumers should be obligated to fix; the platforms, as the manufacturers, should bear that obligation.<\/p>\n<p>Also, socially detrimental harms, like polarization and radicalization, can often lead to physical harm. While platforms themselves can\u2019t take away a person\u2019s life or liberty, that doesn\u2019t mean they can\u2019t be a contributing factor. 
Beyond the immense <a href=\"https:\/\/www.brennancenter.org\/issues\/protect-liberty-security\/social-media\/police-social-media-surveillance\" target=\"_blank\" rel=\"noopener\">surveillance capabilities<\/a> these platforms possess, they are also breeding grounds for violent political action like what was seen in <a href=\"https:\/\/www.nytimes.com\/2018\/10\/15\/technology\/myanmar-facebook-genocide.html\" target=\"_blank\" rel=\"noopener\">Myanmar<\/a> and our own <a href=\"https:\/\/www.businessinsider.com\/facebook-failed-stop-the-steal-us-capitol-riots-internal-report-2021-4\" target=\"_blank\" rel=\"noopener\">Capitol<\/a>. Those violent actions began on social media and were inflamed by how those platforms curate content. Only Facebook, not users, can correct for these societal harms.<\/p>\n<p>Given the depth and breadth of harms that can arise from algorithmic decision-making, it would be incredibly unwise for Congress to limit itself to something as ineffective as \u201ctransparency and user control\u201d for regulating it. I want to make clear that Public Knowledge isn\u2019t opposed to platforms giving their users more visibility into opaque systems and more options for how to engage. That does provide benefits to users. But that is not how we are going to address harms like discrimination, radicalization, and economic inequality. 
These problems will require something more prescriptive, as part of a constellation of new tech regulations like a comprehensive <a href=\"https:\/\/www.publicknowledge.org\/blog\/moving-beyond-consent-models-in-privacy-legislation-a-panel-recap\/\" target=\"_blank\" rel=\"noopener\">federal privacy law<\/a>, <a href=\"https:\/\/www.publicknowledge.org\/press-release\/house-antitrust-subcommittee-proposes-strong-bipartisan-legislation-to-rein-in-big-tech\/\" target=\"_blank\" rel=\"noopener\">new competition rules<\/a>, and even a <a href=\"https:\/\/www.publicknowledge.org\/assets\/uploads\/documents\/Case_for_the_Digital_Platform_Act_Harold_Feld_2019.pdf\" target=\"_blank\" rel=\"noopener\">digital regulator<\/a>. Regulating algorithms is the next frontier of tech policy, and Public Knowledge will continue to explore and evaluate possible solutions for this emerging field. Fortunately, Congress has the opportunity to learn the lessons that have already been taught in privacy. Giving users the illusion of control just isn\u2019t a viable regulatory strategy.<\/p>\n<p><i><span style=\"font-weight: 400;\">Meme source: <\/span><\/i><a href=\"https:\/\/en.wikipedia.org\/wiki\/Steven_Crowder\"><i><span style=\"font-weight: 400;\">Steven Crowder&#8217;s<\/span><\/i><\/a> <a href=\"https:\/\/knowyourmeme.com\/memes\/steven-crowders-change-my-mind-campus-sign\"><i><span style=\"font-weight: 400;\">&#8220;Change My Mind&#8221; Campus Sign<\/span><\/i><\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>If you\u2019ve been following the Capitol Hill hearings about algorithms and automated decision-making, then you\u2019ve probably heard technology companies talk about how they offer or want to offer \u201ctransparency and user control\u201d to consumers. 
Companies like Facebook and Twitter propose sharing information on how their algorithms work with the public, as well as enabling users [&hellip;]<\/p>\n","protected":false},"author":195,"featured_media":0,"parent":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[5],"tags":[11,14],"class_list":["post-23754","post","type-post","status-publish","format-standard","hentry","category-insights","tag-content-moderation","tag-platform-regulation"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v26.5 (Yoast SEO v26.5) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat - Public Knowledge<\/title>\n<meta name=\"description\" content=\"Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. We work to shape policy.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat\" \/>\n<meta property=\"og:description\" content=\"Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. 
We work to shape policy.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\" \/>\n<meta property=\"og:site_name\" content=\"Public Knowledge\" \/>\n<meta property=\"article:published_time\" content=\"2021-07-21T12:28:44+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/publicknowledge.org\/wp-content\/uploads\/2021\/12\/pk_social_logo-2.png\" \/>\n\t<meta property=\"og:image:width\" content=\"400\" \/>\n\t<meta property=\"og:image:height\" content=\"200\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Sara Collins\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Sara Collins\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"8 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/#article\",\"isPartOf\":{\"@id\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\"},\"author\":{\"name\":\"Sara Collins\",\"@id\":\"https:\/\/publicknowledge.org\/#\/schema\/person\/121f9a38df070fd2f7119f6cab35cbf5\"},\"headline\":\"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall 
Flat\",\"datePublished\":\"2021-07-21T12:28:44+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\"},\"wordCount\":1534,\"publisher\":{\"@id\":\"https:\/\/publicknowledge.org\/#organization\"},\"keywords\":[\"Content Moderation\",\"Platform Regulation\"],\"articleSection\":[\"Insights\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\",\"url\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\",\"name\":\"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat - Public Knowledge\",\"isPartOf\":{\"@id\":\"https:\/\/publicknowledge.org\/#website\"},\"datePublished\":\"2021-07-21T12:28:44+00:00\",\"description\":\"Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. 
We work to shape policy.\",\"breadcrumb\":{\"@id\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/publicknowledge.org\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/publicknowledge.org\/#website\",\"url\":\"https:\/\/publicknowledge.org\/\",\"name\":\"Public Knowledge\",\"description\":\"\",\"publisher\":{\"@id\":\"https:\/\/publicknowledge.org\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/publicknowledge.org\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/publicknowledge.org\/#organization\",\"name\":\"Public Knowledge\",\"url\":\"https:\/\/publicknowledge.org\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/publicknowledge.org\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/publicknowledge.org\/wp-content\/uploads\/2021\/12\/pk_social_logo-2.png\",\"contentUrl\":\"https:\/\/publicknowledge.org\/wp-content\/uploads\/2021\/12\/pk_social_logo-2.png\",\"width\":400,\"height\":200,\"caption\":\"Public 
Knowledge\"},\"image\":{\"@id\":\"https:\/\/publicknowledge.org\/#\/schema\/logo\/image\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\/\/publicknowledge.org\/#\/schema\/person\/121f9a38df070fd2f7119f6cab35cbf5\",\"name\":\"Sara Collins\",\"image\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/publicknowledge.org\/#\/schema\/person\/image\/\",\"url\":\"https:\/\/secure.gravatar.com\/avatar\/b7130369508b15f9f477deb88bfc18764d3e852f9b41852ff93e6c98cee8712a?s=96&d=mm&r=g\",\"contentUrl\":\"https:\/\/secure.gravatar.com\/avatar\/b7130369508b15f9f477deb88bfc18764d3e852f9b41852ff93e6c98cee8712a?s=96&d=mm&r=g\",\"caption\":\"Sara Collins\"},\"url\":\"https:\/\/publicknowledge.org\/author\/sara-collins\/\"}]}<\/script>\n<!-- \/ Yoast SEO Premium plugin. -->","yoast_head_json":{"title":"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat - Public Knowledge","description":"Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. We work to shape policy.","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/","og_locale":"en_US","og_type":"article","og_title":"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat","og_description":"Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. 
We work to shape policy.","og_url":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/","og_site_name":"Public Knowledge","article_published_time":"2021-07-21T12:28:44+00:00","og_image":[{"width":400,"height":200,"url":"https:\/\/publicknowledge.org\/wp-content\/uploads\/2021\/12\/pk_social_logo-2.png","type":"image\/png"}],"author":"Sara Collins","twitter_card":"summary_large_image","twitter_misc":{"Written by":"Sara Collins","Est. reading time":"8 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"Article","@id":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/#article","isPartOf":{"@id":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/"},"author":{"name":"Sara Collins","@id":"https:\/\/publicknowledge.org\/#\/schema\/person\/121f9a38df070fd2f7119f6cab35cbf5"},"headline":"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat","datePublished":"2021-07-21T12:28:44+00:00","mainEntityOfPage":{"@id":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/"},"wordCount":1534,"publisher":{"@id":"https:\/\/publicknowledge.org\/#organization"},"keywords":["Content Moderation","Platform Regulation"],"articleSection":["Insights"],"inLanguage":"en-US"},{"@type":"WebPage","@id":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/","url":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/","name":"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat - Public 
Knowledge","isPartOf":{"@id":"https:\/\/publicknowledge.org\/#website"},"datePublished":"2021-07-21T12:28:44+00:00","description":"Public Knowledge promotes freedom of expression, an open internet, and access to affordable communications tools and creative works. We work to shape policy.","breadcrumb":{"@id":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/publicknowledge.org\/the-privacy-debate-reveals-how-big-techs-transparency-and-user-control-arguments-fall-flat\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/publicknowledge.org\/"},{"@type":"ListItem","position":2,"name":"The Privacy Debate Reveals How Big Tech\u2019s \u201cTransparency and User Control\u201d Arguments Fall Flat"}]},{"@type":"WebSite","@id":"https:\/\/publicknowledge.org\/#website","url":"https:\/\/publicknowledge.org\/","name":"Public Knowledge","description":"","publisher":{"@id":"https:\/\/publicknowledge.org\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/publicknowledge.org\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/publicknowledge.org\/#organization","name":"Public 
Knowledge","url":"https:\/\/publicknowledge.org\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/publicknowledge.org\/#\/schema\/logo\/image\/","url":"https:\/\/publicknowledge.org\/wp-content\/uploads\/2021\/12\/pk_social_logo-2.png","contentUrl":"https:\/\/publicknowledge.org\/wp-content\/uploads\/2021\/12\/pk_social_logo-2.png","width":400,"height":200,"caption":"Public Knowledge"},"image":{"@id":"https:\/\/publicknowledge.org\/#\/schema\/logo\/image\/"}},{"@type":"Person","@id":"https:\/\/publicknowledge.org\/#\/schema\/person\/121f9a38df070fd2f7119f6cab35cbf5","name":"Sara Collins","image":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/publicknowledge.org\/#\/schema\/person\/image\/","url":"https:\/\/secure.gravatar.com\/avatar\/b7130369508b15f9f477deb88bfc18764d3e852f9b41852ff93e6c98cee8712a?s=96&d=mm&r=g","contentUrl":"https:\/\/secure.gravatar.com\/avatar\/b7130369508b15f9f477deb88bfc18764d3e852f9b41852ff93e6c98cee8712a?s=96&d=mm&r=g","caption":"Sara 
Collins"},"url":"https:\/\/publicknowledge.org\/author\/sara-collins\/"}]}},"_links":{"self":[{"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/posts\/23754","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/users\/195"}],"replies":[{"embeddable":true,"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/comments?post=23754"}],"version-history":[{"count":0,"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/posts\/23754\/revisions"}],"wp:attachment":[{"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/media?parent=23754"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/categories?post=23754"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/publicknowledge.org\/wp-json\/wp\/v2\/tags?post=23754"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}