Integrating privacy into your development process

WWDC25

    Learn how to build privacy into your app, from planning through deployment. Focusing on data minimization, on-device processing, and transparency and control, this session presents practical ways to integrate privacy at each stage of the development lifecycle. You'll learn how to use Apple's tools and frameworks to protect user data and build app experiences that respect privacy.

    Related chapters

    • 0:00 - Intro
    • 0:38 - How does Apple think of privacy?
    • 3:47 - Planning
    • 7:13 - Design
    • 9:27 - Development: User interface
    • 12:48 - Development: Client-server
    • 18:20 - Development: Local resources
    • 20:46 - Testing
    • 22:28 - Deployment

    Resources

    • AdAttributionKit
    • Configuring app groups
    • Explore the Human Interface Guidelines for privacy
    • Privacy manifest files
    • Swift Homomorphic Encryption

    Related Videos

    WWDC25

    • What's new in AdAttributionKit

    WWDC24

    • What's new in privacy
    • Bring your machine learning and AI models to Apple silicon
    • Meet AdAttributionKit
    • Deploy machine learning and AI models on-device with Core ML

    WWDC23

    • Embed the Photos Picker in your app
    • Get started with privacy manifests

    WWDC22

    • Replace CAPTCHAs with Private Access Tokens
    • Create your Privacy Nutrition Label

    WWDC21

    • Mitigate fraud with App Attest and DeviceCheck
    • What's new in CloudKit
    • Meet the Location Button

    Hello, my name is Joey Tyson and I’m a privacy engineer at Apple. When you build apps that rely on people’s data, the power of that data makes it essential to integrate privacy into your development process. At Apple, we believe that privacy is more than just a buzzword. Everyone deserves to have their data handled with care and respect. In this video, I’m excited to review key phases in building an app and share practical ways to infuse them with privacy at every step. First, though, I’ll review what I mean by “privacy”. At a conference in 2010, Steve Jobs described privacy as people knowing what they’re signing up for, in plain language, and repeatedly. In other words, people can understand and trust how you handle their data. There’s alignment between what they expect or intend and what you do in practice. Steve also expressed his optimism that people can make the right choices for themselves about their data if you build this understanding.

    He noted that some people want to share more data than other people do. Everyone brings their particular context to such choices, and it's not for anyone else to assume on their behalf. In fact, one of the best ways to stay consistent with what people intend is simply to ask them.

    No matter how you write a definition for privacy, three key concepts consistently appear. First, privacy is about people, how they interact with technology, and how it impacts them. More specifically, it involves the impact of processing data about people: what you do with what you know about the people using your app.

    Finally, in deciding what to do, you need to take into account the context of that information, especially the expectations and preferences of the people it describes.

    Privacy is about showing respect for that context when you’re entrusted with someone’s data. Knowing information about people can enable you to create exciting, personalized experiences for them. However, if that information is used in ways they don’t expect or shared in new contexts, it can have significant, tangible impacts on their lives. As people use your app to accomplish their goals, it should work in ways they expect or choose, including how it uses data to fulfill those goals. Tim Cook has shared that Apple’s approach to privacy involves collecting as little of your data as possible, then being thoughtful and respectful when it’s in our care.

    He highlighted that we ask ourselves not simply what can we do, but what should we do in the best interests of our users? This perspective informs Apple's privacy pillars of data minimization, on-device processing, transparency and control, and security protections. Now, I’m going to make this more practical by applying a privacy mindset in the context of app development.

    All of Apple’s platforms include many technologies to help make great privacy a reality in your app. These are just a few of the tools I will share with you in this video on integrating privacy into your development process.

    No matter what model you follow in your software development lifecycle, it will likely cover these five simplified phases.

    Planning, design, development, testing, and deployment. I'll go through these stages and show you opportunities for building great privacy in each one. I will also share tools available on Apple platforms to support your privacy goals, as well as a few privacy requirements.

    First, I'll talk about planning. This is where you begin to write your app’s privacy story.

    To deliver the best privacy in your app, the strongest alignment with someone’s intent when handling their personal data, privacy should be part of development from the very beginning. As with security or localization, privacy is much more challenging to retrofit later on. At Apple, we begin our planning by defining privacy assurances, high-level statements of what we want people to expect from our products when it comes to privacy.

    These product descriptions not only inform the specific engineering requirements for implementing your privacy goals, they can help you communicate those goals to the people who use your app.

    At Apple, we use the privacy pillars I mentioned before to help guide the privacy assurances we craft. To illustrate this process, suppose I’m building an app called Pal About for finding nearby points of interest and sharing photos of them with friends.

    Data minimization means using only the data you need to maximize your alignment with people’s expectations. This builds efficiency and trust. In particular, when collecting data for analytics, think carefully about problems you're trying to solve or questions you're trying to answer and ensure your data collection is proportional to those use cases.

    For my app, one assurance I will define is that “We only retain aggregate usage data.” Data minimization also reflects an approach of strong defaults, building experiences that start with the least number of assumptions about people’s privacy choices. In my app, I may decide to assure users that “When searching for nearby places, your current location is not stored by default.” I mentioned earlier that privacy involves the context of someone’s information. On-device processing is a key way you can respect that context. Anytime data crosses a boundary from one context to another, such as from the device to a server or between two different apps, think carefully about whether that flow is necessary and how it affects your users’ privacy.

    Going back to my Pal About app, I could write: “Suggested meetup locations are only generated locally using on-device data.” Once you’ve planned for strong defaults and maintaining context, set expectations for how data is used and provide options for customizing the experience.

    When giving options to share data beyond the defaults, ensure that those choices provide clear value in return.

    In particular, when building features with AI, let people know when AI is being used and empower them to make decisions about relevant data flows, such as whether data is used for training models. For a privacy assurance in my app, I might decide that “Photos you upload are only used to train generative models if you opt in to improve intelligence features.” Finally, security protections can help maintain limits on your data flows to avoid harm. This means that whenever possible, privacy should be supported by technical controls. That is, systems which maintain privacy natively, even if other parts end up not working as expected.

    In my example app, I could say, “We cannot read messages to your friends in transit between devices,” and support that goal with an architecture using end-to-end encryption. Now, I'll focus on design, constructing how users will actually interact with your product.

    Great design lets you communicate with people using your app and educate them without it feeling like a chore. If people can trust how an app uses their data, they’re more likely to engage with it and share that data. I’ll now share a few general design principles for building that trust. First, set expectations proactively. People should never feel caught off guard by what data an app collects or how it uses that data. For example, in Apple apps, we show a brief summary of key privacy aspects on first launch, along with a link to more detailed information. You can also use onboarding flows in your app to help people understand how it uses data.

    In addition to educating people up front, make state changes clear. That is, ensure people can understand in the moment any changes to data flows that might affect their privacy or change their behavior.

    As an example, we show an animation on iOS to indicate whenever Siri is invoked, along with the system indicator at the top of the screen that appears whenever the microphone is enabled.

    Finally, choices about data should be meaningful and contextual. Each option should either make a tangible difference in app behavior or include a clear explanation of impact. Time these choices to appear when they are most relevant and study what level of granularity provides the right balance for your audience.

    For example, if Pal About on macOS included a feature for connecting with streaming media devices, I would not attempt any local network connections until someone decided to access that feature.

    This makes it clear why those connections need to happen and why the system shows a prompt to enable them.
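
    For example, here's a minimal sketch of that approach, assuming the feature discovers devices over Bonjour (the service type and handler below are hypothetical). Nothing touches the local network, so no permission prompt appears, until the person opens the streaming feature:

      import Network

      // Hypothetical Bonjour service type for the streaming feature
      let serviceType = "_palabout-stream._tcp"

      var deviceBrowser: NWBrowser?

      // Called only when someone opens the streaming feature; starting the browser
      // is what triggers the local network permission prompt, so it appears in context.
      func streamingFeatureOpened() {
          let browser = NWBrowser(for: .bonjour(type: serviceType, domain: nil), using: .tcp)
          browser.browseResultsChangedHandler = { results, _ in
              // Present the discovered devices in the feature's UI
              print("Found \(results.count) devices")
          }
          browser.start(queue: .main)
          deviceBrowser = browser
      }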

    For more information on designing with privacy, check out the “Human Interface Guidelines” in the Apple Developer Documentation.

    After planning and designing for privacy, it’s time to write some code. You can use many tools and technologies to implement your privacy goals.

    In this section, I will highlight just a few examples of how you can use such tools when building your app.

    I’ll start with ways you can streamline privacy choices in your app's user interface. Accessing device resources, such as your calendar or current location, means sharing sensitive data across different contexts. So Apple’s platforms are designed to share this data only when it aligns with a user's intent.

    Broad access has the most impact on privacy, so requesting it usually triggers an explicit permission prompt, such as this photo library prompt.

    If someone only wants to share a few photos, allowing access to the entire library could reveal far more information than they intend. And that’s why “Limit Access” lets someone choose only certain photos instead.

    But you can create a smoother experience by using tools for requesting only the data you need. Since people choose what to share via these flows, their interactions signal intent implicitly, and they don’t get additional prompts. With my example app, I want to let people upload photos, but I probably don’t need access to their entire library. Instead, I can use a PhotosPicker to avoid the photo library prompt entirely. Since the system renders the picker out of process, the app only receives the photos the person actually selects, and no permission prompt is shown. You can embed the picker within the UI of your app and customize its appearance, and it even lets users select from their hidden photos. Versions of the picker are also available on other platforms, including macOS and visionOS. Watch the video, “Embed the Photos Picker in your app” from WWDC23, for more info.

    Apple offers other secure UI elements that streamline data access. Going back to my app, suppose I want to let people easily search for places nearby. Adding a Location Button lets them share their current location with just one tap. The system validates the button was triggered by a user interaction, helping to ensure yet again that the location sharing aligns with their intent.

    You can customize the look and feel of this button so it better matches your app's UI.

    The first time someone taps on a Location Button in your app, they will get a prompt confirming their intent and an explanation of what the button does. The location indicator will also be shown at the top of the screen as it would when location is shared through other APIs. After selecting “OK”, however, future taps will not trigger any prompt, and the current location will simply be shared right away.

    Watch the video “Meet the Location Button” from WWDC21 for more details.

    These are just two examples of tools for reducing the cognitive load of privacy UI. Others include: an out-of-process picker for contacts, the UIPasteControl button, and streamlined flows for setting up hardware accessories. Check for options to access only the device data you need before using a broader API.

    Of course, in some cases, permission prompts may still be necessary. To help make them meaningful and contextual, perform actions that trigger them at times when the user would expect such a question to appear. Avoid presenting such choices the first time they open your app, and write purpose strings that clearly explain why someone is encountering that prompt. Now, I’ll go over some of the privacy technologies that can work behind the scenes as part of client-server interactions. In particular, I'll share a few ways to use server-side resources without exposing sensitive data to the server, then how to reduce the need for identifiable data in addressing some common data collection scenarios.

    On Apple platforms, Advanced Data Protection is a feature that lets people enable end-to-end encryption for the vast majority of their data stored in iCloud. By adopting CloudKit, you can also enable end-to-end encryption of data stored by your app in CloudKit whenever someone enables Advanced Data Protection. This is possible without any changes to your infrastructure or additional management of encryption keys.

    First, make sure to use encrypted data types for all fields in your CloudKit schema.

    This includes CKAsset fields by default, and for most data types in CloudKit, there is an encrypted variant, such as EncryptedString.

    Then you can use the encryptedValues API to retrieve or store data on your CloudKit records.

    All encryption and decryption operations are abstracted away by this API.

    Note that CloudKit doesn’t support indexes on encrypted fields, so don’t include them in your predicate or sort descriptors when fetching records with CKQuery and CKQueryOperation.
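
    As a minimal sketch of that constraint, using a hypothetical “Meetup” record type and field names, the query filters only on a non-encrypted field, while encrypted fields are read afterward through encryptedValues:

      import CloudKit

      func fetchCoffeeMeetups() async throws -> [CKRecord] {
          // Filter only on non-encrypted fields; encrypted fields can't be indexed,
          // so they can't appear in predicates or sort descriptors
          let predicate = NSPredicate(format: "placeCategory == %@", "coffee")
          let query = CKQuery(recordType: "Meetup", predicate: predicate)

          let database = CKContainer.default().privateCloudDatabase
          let (matchResults, _) = try await database.records(matching: query)
          let records = matchResults.compactMap { _, result in try? result.get() }

          // Encrypted fields on the returned records are still read via encryptedValues
          return records
      }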

    Watch the video “What’s new in CloudKit?” from WWDC21 for more information. Another encryption technique we use at Apple is called homomorphic encryption.

    Instead of computing over raw data, this lets you compute over encrypted payloads and get the same output when decrypting the result. At no point during computation is the data decrypted.
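
    Written roughly as a formula, the homomorphic property means an operation on ciphertexts corresponds to an operation on the underlying values, for example for addition:

      decrypt(encrypt(a) ⊕ encrypt(b)) = a + b

    where ⊕ is the ciphertext operation corresponding to addition, and the server only ever sees encrypt(a), encrypt(b), and their encrypted combination.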

    By combining simple compute operations such as addition, homomorphic encryption can enable more advanced technologies, such as Private Information Retrieval. Using PIR, you can look up data from a server without the query or result ever being available on the server in a decrypted format. With PIR, the device sends an encrypted payload of the query to the server, then the server uses homomorphic encryption to compute an encrypted result. Remember, the computation itself is encrypted, so which result matches the query is not available to the server in cleartext at any point during the lifetime of the request.

    Finally, the server sends the encrypted result back to the device for it to decrypt and process locally.

    In my Pal About app, I could use PIR to get current information about meetup locations from a server without the server being able to track someone's search history. For some users, knowing what locations they explore in the app could affect their safety, so the engineering choice of whether I use PIR in my app becomes an important privacy decision. PIR is already used in features both on Apple platforms and in popular apps from developers just like you.

    For more details on implementing PIR, check out the Swift Homomorphic Encryption library and applications on GitHub.

    Two common scenarios that may at first seem to require identifiable data collection are reducing fraud and ad attribution. I’ll share some of the features Apple has built to help you accomplish your goals in these areas while reducing the need to collect data against a user ID or IP address.

    For anti-fraud, Private Access Tokens can help assure that requests come from legitimate devices while not identifying the people using them.

    Devices receive anonymous tokens after being validated, and you can verify these tokens without needing additional identity information.

    Watch the video “Replace CAPTCHAs with Private Access Tokens” from WWDC22 for more details.

    You may also need to prevent fraudulent actions specific to your app, such as trying to redeem a promotion multiple times. DeviceCheck is an API that lets you associate up to two bits of information with a given device without needing to track device identifiers. These states are maintained by Apple and persist across reinstall or device transfers. Watch the video “Mitigate fraud with App Attest and DeviceCheck” from WWDC21 for more details on how to set and retrieve them.
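
    As a minimal client-side sketch, the app generates a DeviceCheck token and hands it to its own server, which then calls Apple's server API to query or update the two per-device bits (the networking helper below is hypothetical):

      import DeviceCheck

      func redeemPromotion() {
          guard DCDevice.current.isSupported else { return }

          DCDevice.current.generateToken { token, error in
              guard let token, error == nil else { return }
              // Your server exchanges this token with Apple to read or set
              // the two bits, e.g. "promotion already redeemed"
              sendTokenToServer(token.base64EncodedString())
          }
      }

      // Hypothetical networking call for this sketch
      func sendTokenToServer(_ token: String) { /* ... */ }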

    For ad performance, AdAttributionKit is a powerful tool to measure the success of ad campaigns while maintaining privacy. It lets you attribute app installs and re-engagements to a particular campaign through a signed postback signal from the user’s device. AdAttributionKit can also help reduce prompts: calling AdAttributionKit APIs does not require showing an App Tracking Transparency prompt. For publisher apps, the key implementation steps are to add identifiers to your Info.plist from registered ad networks, then display ads that those networks have signed. For more details on how AdAttributionKit and its postback signals work, watch the video “Meet AdAttributionKit” from WWDC24.

    For advertised apps, you'll need to register conversion events within your app, along with any relevant re-engagements. Plus, you can specify an endpoint for getting a copy of postback signals. All of the relevant APIs for these updates are in the AdAttributionKit section of the Apple Developer documentation, and the video “What’s new in AdAttributionKit” details new capabilities in iOS 26.

    Finally, I will give you some tips around maintaining privacy while managing local resources, such as on-device AI models, bundled frameworks, and app processes.

    First, I’m going to highlight the power of Core ML to run and even train or fine-tune machine learning models fully on-device.

    This works for a range of open-source models and models you custom-build with Create ML. With optimization and compression techniques, you can even run popular large language models without ever calling a server.
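
    As a small sketch of keeping inference on-device, assuming a hypothetical compiled model bundled with the app, the model is loaded with a configuration that uses only local compute:

      import CoreML

      func loadOnDeviceModel() throws -> MLModel {
          // Hypothetical compiled model bundled with the app
          guard let modelURL = Bundle.main.url(forResource: "PlaceClassifier",
                                               withExtension: "mlmodelc") else {
              throw CocoaError(.fileNoSuchFile)
          }

          let configuration = MLModelConfiguration()
          // Run on CPU, GPU, and Neural Engine as available; no data leaves the device
          configuration.computeUnits = .all

          return try MLModel(contentsOf: modelURL, configuration: configuration)
      }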

    Watch “Deploy machine learning and AI models on-device with Core ML” or “Bring your machine learning and AI models to Apple silicon”, both from WWDC24, to get started with Core ML.

    App group containers not only let you share data between different apps you build, they also help protect that data. On macOS, other apps cannot access data in your app group container without user permission. To use these containers, start by registering an app group identifier in the Developer Portal under Certificates, Identifiers, and Profiles. You will then need to adopt the corresponding entitlement in your app and have it request the container directory anytime you need to store user data.
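
    As a brief sketch, using a hypothetical group identifier, the app asks the system for the shared container URL and stores user data inside it:

      import Foundation

      // Hypothetical app group identifier, registered in the Developer Portal and
      // matched by the App Groups entitlement in the app
      let appGroupID = "group.com.example.palabout"

      func sharedFavoritesURL() -> URL? {
          guard let container = FileManager.default.containerURL(
              forSecurityApplicationGroupIdentifier: appGroupID
          ) else {
              return nil
          }
          return container.appendingPathComponent("favorites.json")
      }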

    For more information, watch the video “What’s new in privacy” from WWDC24, or refer to “Configuring app groups” in the Apple Developer documentation. Also on macOS, note that all app processes should be terminated when someone decides to quit the app.

    If you load processes using fork, exec, or POSIX spawn, clean up those processes on exit. macOS Tahoe will heuristically detect if such processes continue running, then show users a prompt to confirm if this aligns with their intent. Note that users will then have the option to terminate the processes themselves.
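
    One way to handle this, sketched here with a hypothetical helper launched via Process, is to terminate it from the app delegate when the app quits:

      import AppKit

      final class AppDelegate: NSObject, NSApplicationDelegate {
          // Hypothetical helper process launched elsewhere in the app
          var helperProcess: Process?

          func applicationWillTerminate(_ notification: Notification) {
              // Don't leave helper processes running after the person quits the app
              if let helper = helperProcess, helper.isRunning {
                  helper.terminate()
              }
          }
      }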

    Finally, remember that you are responsible for all code included in your apps, including third-party SDKs. Before you submit to the App Store, SDKs may require a privacy manifest file describing the data they collect and the required reason APIs they use. Refer to “Privacy manifest files” in the Apple Developer documentation for more details on when this requirement applies. You can also learn more by watching the video “Get started with privacy manifests” from WWDC23. These technologies are just a few examples of how privacy is part of development. But now, I’ll shift to the next phase, and that’s testing. And yes, privacy is even part of this step.

    Just as you want to build tests for avoiding errors or detecting performance regressions, building tests to confirm you deliver on your app’s privacy assurances is an important best practice. At Apple, we think of tests in a pyramid model, where smaller tests form a foundation for broader ones that more closely reflect the practical goals of people using an app. Unit tests help verify single pieces of code, generally individual functions. For privacy, they can validate app logic supporting privacy controls.

    Integration tests target discrete subsystems or clusters of classes to make sure that different components behave correctly together. In a privacy context, these ensure data flows between systems work as expected. Finally, UI tests observe user-facing behaviors in your app, making sure it truly does what you expect. This includes common scenarios for onboarding or changing privacy settings. With my Pal About app, I had a privacy assurance of only training on photos if people opted in to improve intelligence features. So I would write a UI test to confirm that changing the opt-in setting always updates the relevant data flows. Apple offers many tools to enable more efficient testing of privacy features. For example, starting in iOS 15.2, you can enable the App Privacy Report in Settings to quickly review data access, sensor access, and network activity from your app and ensure they’re consistent with what people who use your app would expect to find.
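
    Going back to that UI test for the opt-in assurance, a sketch using hypothetical accessibility identifiers for the settings screen might look like this:

      import XCTest

      final class PrivacyAssuranceUITests: XCTestCase {
          func testIntelligenceTrainingIsOffByDefault() {
              let app = XCUIApplication()
              app.launch()

              // Hypothetical navigation and identifiers for this sketch
              app.buttons["settingsButton"].tap()
              let optIn = app.switches["improveIntelligenceToggle"]

              // The assurance: photo training is opt-in, so the toggle starts off
              XCTAssertEqual(optIn.value as? String, "0")
          }
      }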

    Finally, I'll cover deployment, the process of actually getting your app out into the world. At this point, you’ve done a lot of work to deliver great privacy in the features you’ve built, and it's time to share your privacy story. As you create content to promote your app, your privacy assurances can be a resource for building interest and setting your app apart.

    Deploying on the App Store also includes a few required steps that help people know what to expect in your app. In particular, privacy nutrition labels are a key way the App Store helps people understand the privacy practices of apps. The label provides concise information on what types of data your app transmits off device and how that data is used.

    Set up your nutrition label in the App Privacy section of App Store Connect. Be sure to review the “App privacy details on the App Store” section of the Apple Developer documentation for specific definitions of data types and terms such as “tracking.” When you archive a build in Xcode, select “Generate Privacy Report” from the archive’s context menu to get an aggregate report of data uses declared in privacy manifest files. Note that your nutrition label must document all potential uses of data, even if people choose whether to have that data collected as part of the app experience. Though check the Apple Developer documentation for certain exceptions, such as some feedback and reporting flows. You can update your nutrition label at any time, even without an app update.

    Watch the video “Create your Privacy Nutrition Label” from WWDC22 for more information.

    In addition to your nutrition label, be sure other privacy requirements for deployment have been met. Your app must have a clear privacy policy. Privacy manifests for your app and all third-party SDKs should be included in Xcode. And purpose strings are required for any permission prompts to access device resources.

    You can also include a Privacy Choices link in your App Store listing for people to learn more about managing data in your app, such as a portal to access, delete, or update such data.

    Planning, design, development, testing, and deployment. At every step of the process, integrating privacy can help deliver the best experience for people who use your app. Now, it’s time to apply these ideas to your process. Start by defining clear privacy assurances for your app.

    Then, use design principles to educate people on data uses and choices.

    Implement your privacy goals with strong technical foundations.

    Ensure you deliver on privacy assurances with testing.

    And finally, prepare relevant nutrition labels as you share your privacy story with the world.

    I mentioned data minimization several times in this video, but I will never minimize my thanks to you for investing in privacy. Thank you.

    • 10:29 - Create an inline Photos picker

      // Create an inline Photos picker
                      
      // Define the app's Photos picker
      PhotosPicker(
          selection: $viewModel.selection,
          matching: .images,
          preferredItemEncoding: .current,
          photoLibrary: .shared()
      ) {
          Text("Select Photos")
      }
                      
      // Configure a half-height Photos picker
      .photosPickerStyle(.inline)
      .ignoresSafeArea()
      .frame(height: 340)
    • 11:33 - Display the Location Button

      // Display the Location Button
      
      LocationButton(LocationButton.Title.currentLocation) {
          // Start updating location when user taps the button.
          // Location button doesn't require the additional
          // step of calling 'requestWhenInUseAuthorization()'.
          manager.startUpdatingLocation()
      }.foregroundColor(Color.white)
          .cornerRadius(27)
          .frame(width: 210, height: 54)
          .padding(.bottom, 30)
    • 13:48 - Encrypting data in CloudKit

      // Write a sensitive value to an encrypted field; CloudKit encrypts it automatically
      myRecord.encryptedValues["encryptedStringField"] = "Sensitive value"
      
      // Read it back; decryption is handled transparently by the encryptedValues API
      let decryptedString = myRecord.encryptedValues["encryptedStringField"] as? String
