Apple – written evidence (FEO0107)


House of Lords Communications and Digital Committee inquiry into Freedom of Expression Online



Apple is grateful for the chance to submit evidence to the Committee’s inquiry. The Committee asked four questions, which we have addressed in turn below. We hope these comments are helpful and wish the Committee well in its consideration of these important issues.


  1. Apple’s Human Rights policy states: “Where [international human rights standards and national law] are in conflict, we respect national law while seeking to respect the principles of internationally recognised human rights.” Please could you provide examples of when such conflicts have arisen and how Apple has dealt with them? Is “seeking to respect” human rights ultimately secondary to compliance with national law?


Apple is committed to respecting internationally recognised human rights in our business operations, as set out in the United Nations International Bill of Human Rights and the International Labour Organisation’s Declaration on Fundamental Principles and Rights at Work. Our approach is based on the United Nations Guiding Principles on Business and Human Rights.


The UN Guiding Principles on Business and Human Rights were developed by the Special Representative of the Secretary-General on the issue of human rights and transnational corporations and other business enterprises, and endorsed by the UN Human Rights Council. As the UN Guiding Principles state, they are grounded in recognition of “[t]he role of business enterprises as specialised organs of society performing specialised functions, required to comply with all applicable laws and to respect human rights”.


Recognising the role of business enterprises, the UN Guiding Principles state that business enterprises should “[c]omply with all applicable laws and respect internationally recognised human rights, wherever they operate,” “[s]eek ways to honour the principles of internationally recognised human rights when faced with conflicting requirements,” and “[t]reat the risk of causing or contributing to gross human rights abuses as a legal compliance issue wherever they operate.”


In line with the UN Guiding Principles, Apple’s Human Rights policy states: “[w]e believe that dialogue and engagement are the best ways to work toward building a better world. In keeping with the UN Guiding Principles, where national law and international human rights standards differ, we follow the higher standard. Where they are in conflict, we respect national law while seeking to respect the principles of internationally recognised human rights.”


An example of where such conflicts may potentially arise in the course of our business is where governmental authorities request that we remove an app from the App Store based on alleged or suspected violations of national law. For example, law enforcement or regulatory agencies may suspect that an app is unlawful, or relates to or contains unlawful content. In those instances, Apple carefully reviews each legal request to ensure that it has a valid legal basis, and complies with legally valid requests. Where Apple determines that there is no valid legal basis, or where a request is unclear, inappropriate or over-broad, Apple will object to, challenge or reject the request.


Apple further seeks to respect the principles of internationally recognised human rights by, among other things, publishing Apple’s Transparency Report, which provides comprehensive information regarding specific requests from governments and third parties for customer information, account removal or suspension, and content removal, including those that were reasonably likely to limit free expression. Apple’s Transparency Report discloses, by country or region: (i) the request type (Legal Violation Request or Platform Violation Request); (ii) the number of requests received; (iii) the number of apps specified in the request; (iv) the number of requests objected to in part or rejected in full; (v) the number of requests that resulted in an app being removed; (vi) the number of apps removed; (vii) the number of appeals received; (viii) the number of appeals granted; and (ix) the number of apps reinstated. In addition, under “Matters of Note,” Apple describes the nature of the content on the apps that governments in each country or region sought to remove.


In the most recent period for which we have reported (July–December 2019), we received 54 requests for takedowns, covering a total of 258 apps. Of these, we objected to 7 requests in part or in full, and removed 207 apps. None of these requests came from the United Kingdom.


  2. Has Apple ever infringed users’ right to freedom of expression in order to comply with national law?


Apple is committed to respecting the rights of freedom of expression in our business operations. Our products help our customers communicate, learn, express their creativity, and exercise their ingenuity. We believe in the critical importance of an open society in which information flows freely, and we’re convinced the best way we can continue to promote openness is to remain engaged, even where we may disagree with a country’s laws.


As discussed above, we acknowledge the responsibility of business enterprises to comply with all applicable laws and, in accordance with the UN Guiding Principles, we seek ways to honour the principles of internationally recognised human rights when faced with conflicting requirements.




  3. Does Apple remove apps which are legal but may be harmful? If so, how does Apple balance potential harms with consideration of freedom of expression? What process is there to ensure transparency? Is there an appeals process and, if so, how does it work?


For over a decade, the App Store has provided a safe and trusted place to discover and download apps. But the App Store is more than just a storefront — it’s an innovative destination focused on bringing customers amazing experiences, a big part of which is ensuring that the apps we offer are held to the highest standards for privacy, security, and content. That is why all apps in the App Store must comply with our App Store Review Guidelines (the “Guidelines”),[1] which provide clear and transparent guidance to developers on building the best apps for our customers. The five pillars of the Guidelines — Safety, Performance, Business, Design, and Legal — require that apps offered on the App Store are safe, provide a good user experience, adhere to our rules on user privacy, secure devices from malware and threats, and use approved business models.


Apple enforces the rules and policies set forth in the Guidelines through the App Review process, in which all apps are reviewed by specialists for compliance before developers can market them through the App Store. Apple uses a combination of automated systems and hundreds of human experts, representing 81 languages across three time zones. The App Review team reviews more than 100,000 submissions per week and rejects approximately 40,000 of those submissions due to various Guidelines compliance issues.


Apple strongly supports all points of view being represented on the App Store, as long as the apps are respectful to users with differing opinions and the quality of the app experience is great. We will reject apps for any content or behaviour that we believe is over the line. To that end, our Guidelines specifically prohibit content that is offensive or upsetting, that may damage the user’s device, or that may cause physical harm. These prohibitions are set forth in Guideline 1, entitled “Safety.” Guideline 1 contains sections addressing objectionable content, user-generated content, apps for our Kids category, physical harm, developer information, and data security. For example, Guideline 1.1.1 states: “Apps should not include content that is offensive, insensitive, upsetting, intended to disgust, in exceptionally poor taste, or just plain creepy. Examples of such content include: Defamatory, discriminatory, or mean-spirited content, including references or commentary about religion, race, sexual orientation, gender, national/ethnic origin, or other targeted groups, particularly if the app is likely to humiliate, intimidate, or harm a targeted individual or group.”


Apple may also remove apps from the App Store due to non-compliance with the Guidelines, or pursuant to a legal governmental or law enforcement request. Removing an app is usually an option of last resort, as Apple prefers to work with developers to keep their apps on the App Store. Apple will notify a developer when, where, and why an app is removed, with the exception of situations in which notification would be futile or ineffective, could cause potential danger of serious physical injury, could compromise Apple’s ability to detect developer violations, or in instances related to violations for spam, phishing, and child exploitation imagery. Whenever possible, apps that are removed from the App Store will only be removed in countries and territories specific to the issue, and will remain available in locations that are not impacted.


Apple provides developers with appeals processes for both app rejection and removal. When App Review rejects a developer’s submission to the App Store, the team sends a detailed message to the developer in Resolution Center, which is accessible in App Store Connect, Apple’s App Store management software. The rejection message identifies the Guideline that the app violates; describes why the app violates the cited guideline; describes next steps to resolve the rejection; and provides additional information on resources that may be accessed to help the developer resolve the issue. Each Resolution Center rejection message includes a "Reply" text submission field that allows the developer to respond to the App Review specialist who reviewed the app, either to request additional information or to dispute the findings set forth in the message. The developer can include attachments in replies, such as screenshots and supporting documents. The App Review specialist will review the developer’s reply and respond accordingly.


Developers can also submit appeals for rejections and removals to the App Review Board (ARB). The ARB consists of senior App Review specialists who conduct a new review of each app submitted to them. Developers can submit additional details to the ARB to help it determine whether the app should be reconsidered. Developers can start this process in Resolution Center by clicking the link titled "Submit an appeal to the App Review Board," which takes them to the "Contact the App Review Team" page. Developers can also navigate directly to this page in their web browser.


Developers can request a telephone call with a member of the Developer Services team via a call request form. The App Review team makes about 1,000 calls per week to developers to help them diagnose and resolve any issues that led to rejection — so they can get their app onto the App Store.


The Resolution Center, ARB, and telephone call request process are key elements in Apple’s App Review architecture and proof of how seriously Apple takes its responsibilities to create a marketplace which works for consumers and developers, and to maintain a review process that is thorough, transparent, consistent, and fair.


  4. How does Apple protect users’ privacy? What is Apple’s view of UK privacy law?


Apple believes that privacy is a fundamental human right. We are constantly working to find even stronger new ways to keep our customers’ personal data safe. Our products and features include innovative privacy technologies and techniques designed to minimise how much of your data we — or anyone else — can access.

We make public a detailed list of the ways in which we protect our customers’ privacy, and we would encourage the Committee to consult it for a comprehensive picture of our approach. Some of the headline elements:


  - Safari was the first browser to block third-party cookies by default, as far back as 2003, and has continued to innovate since then with state-of-the-art features such as Intelligent Tracking Prevention, which protects users against cross-site tracking.


  - Apple leads the industry in the use of on-device intelligence and data minimisation techniques, such that Apple does not collect data associated with user identity. Maps is a privacy-by-design service: searches and trips are associated not with an individual but with random identifiers that we do not link to a user’s identity, so neither Apple nor anyone else can create a profile of a user’s movements and searches.


  - Siri Search also uses random identifiers, such that the things users search for are not linked with their identity.


  - Siri Assistant likewise associates the requests that users make with a random identifier that is not linked to a user’s Apple ID. A user’s utterances to Siri are retained only if the user consents to our doing so to maintain the service.


  - Searches for people, places, and things within a user’s Photos app use on-device machine learning algorithms, so searches are done entirely on the user’s device rather than in the cloud.


  - Messages and FaceTime conversations are encrypted end-to-end, so they can’t be read while they’re sent between devices.


  - Every app in the App Store is required to follow strict guidelines on protecting your privacy and to provide a self-reported Nutrition Label summary of how it uses data. Apps must also ask for user permission before accessing things like your photos or location.


As the UK GDPR is derived from the GDPR, we believe that it sets a global standard for privacy. Our CEO, Tim Cook, speaking on International Data Protection Day at the CPDP conference in Brussels, stated: “Proving cynics and doomsayers wrong, the GDPR has provided an important foundation for privacy rights around the world, and its implementation and enforcement must continue. But we can’t stop there. We must do more. And we’re already seeing hopeful steps forward worldwide, including a successful ballot initiative strengthening consumer protections right here in California.”



24 March 2021