
UIKit

UIKit is an object-oriented framework developed by Apple Inc. for building graphical user interfaces in native applications targeting iOS, iPadOS, tvOS, macOS (via Mac Catalyst), and visionOS (with compatibility support). It supplies the core infrastructure, including objects for creating windows, views, and controls, and for handling user input events such as touches and remote control inputs, enabling developers to design responsive and interactive app experiences. Introduced in 2008 with the release of the first iPhone SDK, UIKit originated as the primary toolkit for what was then called iPhone OS, evolving to support the broader range of Apple's mobile and streaming devices. The framework adheres to the Model-View-Controller (MVC) design pattern, separating data models from views and mediating between them with controllers to streamline architecture and maintenance. Central classes such as UIApplication manage the overall lifecycle, while UIView and its subclasses, such as UIButton, UILabel, and UITableView, form the building blocks for on-screen elements, supporting features like animations, gestures, and accessibility. Originally implemented in Objective-C, UIKit fully integrates with Swift, allowing modern development practices while maintaining backward compatibility across versions. Although Apple introduced SwiftUI in 2019 as a declarative alternative for UI development, UIKit remains a robust and widely used option for complex, imperative interfaces, with ongoing updates to enhance performance and integration with newer APIs. Its extensive library of customizable components and mature ecosystem continues to power the majority of iOS apps, emphasizing precision in layout, event handling, and cross-device adaptability.

Overview and History

Development and Release Timeline

UIKit originated as a core component of Cocoa Touch, Apple's object-oriented framework for developing iPhone applications, and was first released as part of the iPhone SDK 1.0 on March 6, 2008. This launch coincided with the public beta of the SDK and enabled developers to build native third-party apps for the iPhone and iPod Touch, ahead of the debut of the App Store later that year with iPhone OS 2.0. Derived from the AppKit framework used in macOS applications, UIKit adapted familiar desktop UI patterns for touch-based mobile interfaces, providing foundational classes for views, controls, and event handling. Subsequent releases of iOS introduced significant enhancements to UIKit, aligning with evolving hardware capabilities and design paradigms. In iPhone OS 3.0 (June 2009), Apple expanded MPMoviePlayerController in the MediaPlayer framework, allowing developers to embed and control video playback views within UIKit-based apps. iOS 5.0 (October 2011) brought Storyboards, a visual tool in Xcode for designing app flows and transitions without extensive code, streamlining interface prototyping. Auto Layout was previewed in this release's tools, though fully implemented on iOS in iOS 6.0 (September 2012) to enable constraint-based, device-adaptive layouts. The iOS 7.0 update (September 2013) represented a pivotal aesthetic shift, introducing flat design principles, dynamic type scaling, and visual effects like blur and vibrancy, which required UIKit updates to support translucent navigation bars and motion-based animations. Further evolution emphasized modern programming languages and advanced features. iOS 8.0 (September 2014) expanded UIKit's compatibility with Swift, Apple's new language announced at the same time, facilitating safer and more expressive iOS development while maintaining Objective-C support. iOS 13.0 (September 2019) added system-wide dark mode, with UIKit providing trait collection APIs for automatic theme adaptation, alongside multi-window support for iPadOS to handle concurrent app scenes.
iOS 16.0 (September 2022) improved widget integration through WidgetKit extensions, allowing UIKit-based apps to render custom widgets on home and lock screens. The iOS 18 release (September 2024) incorporated Apple Intelligence, enabling AI-driven UI adaptations such as generative content in text views and adaptive interface elements responsive to user context. iOS 26 (September 2025) brought additional modernizations to UIKit, including support for the new Liquid Glass design with fluid animations, enhanced menu bar APIs via UIMainMenuSystem for iPadOS, automatic observation tracking for Observable objects, a new UI update mechanism for efficient animations, and better scene management for cross-platform flexibility on iOS, iPadOS, tvOS, and visionOS. Alongside these advancements, Apple has periodically deprecated outdated APIs to encourage modern practices. For instance, in iOS 17.0 (September 2023), the traitCollectionDidChange method in UIViewController was deprecated in favor of more granular trait observation APIs, improving performance by reducing unnecessary trait change notifications. Earlier deprecations include MPMoviePlayerController in iOS 9.0 (2015), replaced by AVPlayer for more flexible media handling. These changes reflect Apple's ongoing commitment to refining UIKit for efficiency and maintainability.

Relation to Cocoa Touch and Other Apple Frameworks

UIKit serves as the primary user interface framework within Cocoa Touch, Apple's application framework layer for iOS and iPadOS, enabling developers to construct responsive and interactive applications. It builds directly upon the Foundation framework, which supplies essential data structures such as strings, arrays, and dictionaries, as well as upon Core Graphics for low-level rendering of 2D content like paths, images, and text. This layered architecture allows UIKit to abstract complex graphics operations while leveraging Foundation's object-oriented utilities for app logic. UIKit also depends on QuartzCore, the framework housing Core Animation, for layer-based compositing, efficient rendering of visual elements, and smooth transitions and animations that enhance user interactions without taxing the CPU. In contrast, macOS employs AppKit as its equivalent UI framework, which shares conceptual similarities with UIKit, such as view hierarchies and event handling, but uses platform-specific implementations tailored for desktop environments, including different window management and menu systems. This distinction ensures UIKit's optimization for touch-based, mobile-first interactions on handheld devices. Over time, UIKit has evolved to integrate with emerging Apple frameworks, extending its capabilities beyond traditional 2D interfaces. With the introduction of ARKit in 2017, UIKit supports augmented reality experiences by overlaying virtual content onto real-world views captured via the device's camera. Similarly, RealityKit, launched in 2019, enables immersive UI elements within UIKit apps, particularly for entity-based AR scenes with physics and audio. Machine learning features via Core ML, also debuted in 2017, allow UIKit-based apps to incorporate on-device inference for tasks like image recognition directly into user interfaces.
UIKit acts as a foundational bridge to other Apple platforms through shared subsets of its APIs, particularly for tvOS, where it provides full support for building television interfaces with remote and focus-based navigation. For watchOS, select UIKit components, such as certain data types and utilities, are available to facilitate code reuse, though the primary UI layer relies on WatchKit extensions. UIKit also reaches macOS via Mac Catalyst, introduced in 2019 with macOS Catalina, which ports iPad apps to the desktop while preserving UIKit's core but adding macOS-specific behaviors.

Core Architecture

Fundamental Components and Layers

UIKit's core architecture revolves around a layered system that separates visual content management from event processing and rendering. At its foundation, the UIView class serves as the base class for all visual elements in an app, managing a rectangular area on the screen and handling responsibilities such as drawing content, layout, and event handling through its inheritance from UIResponder. Every UIView instance is backed by a CALayer object from Core Animation, which encapsulates the view's visual properties like position, size, and transformations, enabling efficient composition and animation without direct CPU involvement. Subclassing UIView allows developers to customize drawing via methods like draw(_:), while standard subclasses such as UILabel or UIButton provide predefined behaviors built on this foundation. The UIWindow class acts as the root container in UIKit's hierarchy, serving as the top-level object that hosts the app's scenes and coordinates the display of content across screens. It does not render visible content itself but forwards events from the system to the root view controller and its associated view hierarchy, ensuring that user interactions reach the appropriate visual elements. In multi-window apps, multiple UIWindow instances can manage distinct scenes, such as those for external displays or multitasking on iPad. Event processing in UIKit is orchestrated through the UIApplication singleton, which manages the main event loop to handle touches, gestures, and system events. Upon receiving raw input from the system, UIApplication packages it into UIEvent objects and dispatches them via the sendEvent(_:) method to the relevant UIWindow, which then routes them down the view hierarchy to responder objects like views or view controllers. This loop, initiated by UIApplicationMain during app launch, integrates with the underlying run loop to process events continuously, supporting gesture recognition through UIGestureRecognizer subclasses attached to views.
For custom needs, developers can subclass UIApplication to override event dispatching, though this is rarely required as the default routing suffices for most interactions. Rendering in UIKit employs a backing store model powered by Core Animation's CALayer, where visual content is composited offscreen before being presented to the display. Each layer maintains its own content, such as images or drawn elements, and the framework automatically composites the layer tree into a final image, leveraging hardware acceleration to minimize CPU load and achieve smooth updates at the display's refresh rate (typically 60Hz, or 120Hz on supported devices). This pipeline integrates with Metal as the underlying graphics API, enabling GPU-accelerated rendering for complex compositions and animations directly within standard UIView hierarchies via CAMetalLayer when needed. Custom rendering can use Core Graphics in the draw(_:) method, but all updates trigger layer-based redrawing for efficiency. UIKit enforces a strict threading model to ensure correctness and responsiveness, requiring all UI updates, such as modifying views, layouts, or properties, to occur on the main thread. The main dispatch queue, accessible via DispatchQueue.main, serializes these operations to prevent race conditions, while background tasks use concurrent queues like DispatchQueue.global() to offload non-UI work, such as data loading, before dispatching results back to the main queue. Violations of this model can lead to undefined behavior or crashes, as UIKit classes are not thread-safe outside the main context. This design aligns with the app's main run loop, keeping the interface responsive by isolating heavy computations.
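As a brief sketch of this threading rule, a background task might hand its result back to the main queue before touching any view; the controller, label, and fetch method here are hypothetical placeholders, not API from the text above:

```swift
import UIKit

final class ProfileViewController: UIViewController {
    // Hypothetical label updated with fetched text.
    private let nameLabel = UILabel()

    func loadProfileName() {
        // Offload non-UI work to a background queue.
        DispatchQueue.global(qos: .userInitiated).async {
            let name = self.fetchNameFromDisk() // slow but thread-safe work

            // Hop back to the main queue before touching UIKit objects.
            DispatchQueue.main.async {
                self.nameLabel.text = name
            }
        }
    }

    private func fetchNameFromDisk() -> String {
        // Placeholder for expensive I/O or parsing.
        return "Ada Lovelace"
    }
}
```

Setting the label's text directly from the background closure would violate the main-thread rule described above.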

App Lifecycle and Delegates

The iOS app lifecycle in UIKit encompasses several distinct states that dictate how an application behaves from launch to termination: not running, in which the app is either unlaunched or has been terminated by the system; inactive, where the app runs in the foreground but does not receive events, often during state transitions; active, signifying the app is in the foreground and processing events; background, where the app executes code offscreen but remains responsive to system events; and suspended, in which the app resides in memory without executing code until potentially reactivated. These states facilitate efficient resource management, with transitions triggered by user actions or system notifications, such as applicationDidBecomeActive(_:) to resume foreground operations. The UIApplicationDelegate protocol serves as the primary interface for handling app-wide lifecycle events in UIKit. Key methods include application(_:didFinishLaunchingWithOptions:), introduced in iOS 3.0 (2009), which allows initialization of data structures and scene configurations before the app fully launches. For handling URLs in iOS 9.0 and later, methods like application(_:open:options:) enable the delegate to process incoming resource requests. In iOS 13 and later, the delegate extends to managing scene sessions, coordinating multi-window behaviors through methods like application(_:configurationForConnecting:options:). As of iOS 26 (2025), several legacy UIApplicationDelegate lifecycle methods are deprecated, encouraging developers to adopt the UIScene-based architecture for new apps. Introduced in iOS 13 (2019) to support multi-window multitasking on iPad, the UISceneDelegate protocol handles lifecycle events for individual scene instances, decoupling scene-specific management from the app delegate. Essential methods include scene(_:willConnectToSession:options:), which configures a new scene upon connection, and scene(_:didUpdate:), which synchronizes user activity states across scenes for Handoff features.
This delegate responds to per-scene transitions, such as entering the foreground or background, enabling targeted updates like pausing media in inactive windows without affecting others. To perform finite-length tasks during transitions to the background, UIKit provides the beginBackgroundTask(withName:expirationHandler:) method on UIApplication, which extends execution time beyond the system's default suspension. Apps typically receive up to about 30 seconds of additional background execution time, after which the system may terminate the process if tasks remain incomplete; developers must pair this with endBackgroundTask(_:) upon completion and implement the expiration handler for cleanup. This mechanism is crucial for operations like completing uploads or saving state, ensuring reliability without indefinite background execution.
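The begin/end pairing described above can be sketched as follows; the task name and the upload function are illustrative assumptions:

```swift
import UIKit

/// Requests extra background time to finish a short task.
func finishUploadInBackground(application: UIApplication) {
    var taskID: UIBackgroundTaskIdentifier = .invalid

    taskID = application.beginBackgroundTask(withName: "FinishUpload") {
        // Expiration handler: clean up if the time allowance runs out.
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }

    DispatchQueue.global().async {
        performUpload() // hypothetical finite-length work

        // Always balance beginBackgroundTask with endBackgroundTask.
        application.endBackgroundTask(taskID)
        taskID = .invalid
    }
}

func performUpload() { /* placeholder for the actual transfer */ }
```

Forgetting the endBackgroundTask(_:) call on either path risks the system terminating the app when the allowance expires.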

User Interface Building Blocks

Views and View Hierarchy

In UIKit, the UIView class serves as the fundamental building block for user interfaces on iOS and iPadOS, representing a rectangular region on the screen that can display content, handle touch events, and manage a hierarchy of subviews. Views form a tree-like structure where each view can contain multiple subviews, enabling developers to compose complex interfaces by nesting simpler components; this hierarchy dictates rendering order, event propagation, and layout relationships within an app. The system renders the view hierarchy efficiently using Core Animation layers, ensuring smooth updates and animations without requiring manual redrawing in most cases. Key properties of UIView define its position, size, and visual state relative to its parent. The frame property specifies the view's origin and dimensions in its superview's coordinate system, typically expressed as a CGRect with x, y, width, and height values. In contrast, the bounds property describes the view's internal coordinate space, where the origin is (0, 0) by default and the size matches the view's dimensions, making it ideal for custom drawing operations that are independent of the view's position in the hierarchy. The transform property allows application of affine transformations, such as rotation, scaling, or translation, to alter the view's appearance and position without changing its center or bounds; for example, a CGAffineTransform can rotate a view by 90 degrees around its center. For adapting to changes in the superview's size, UIView supports autoresizing masks through the autoresizingMask property, which uses bit flags like UIView.AutoresizingMask.flexibleWidth to automatically resize or reposition subviews during events such as device rotation or explicit frame adjustments. However, for more precise, constraint-based layout in complex interfaces, developers typically use Auto Layout constraints instead of or alongside autoresizing masks, defining relationships between views that the system resolves dynamically.
The view hierarchy is managed through methods that allow dynamic addition, removal, and ordering of subviews. The addSubview(_:) method appends a new view to the end of the subviews array, placing it above existing siblings in the z-order for rendering. Developers can insert views at specific indices using insertSubview(_:at:), or relative to others with insertSubview(_:aboveSubview:) and insertSubview(_:belowSubview:), enabling fine control over layering; for traversal, the superview property provides access to the parent, while the subviews property lists all direct children, facilitating recursive operations like finding or updating nested elements. To remove a view, the removeFromSuperview() method detaches it from its superview, automatically updating the hierarchy and releasing associated resources. Custom views extend UIView to render unique content by overriding the draw(_:) method, where developers use Core Graphics or UIKit drawing APIs within a provided CGRect to paint paths, images, or text; the system calls this method only when setNeedsDisplay() is invoked, optimizing performance by avoiding unnecessary redraws. All views are layer-backed by default, leveraging CALayer for hardware-accelerated rendering, compositing, and animations, which handles opacity, shadows, and borders without additional coding in the draw(_:) override. Coordinate systems in UIView place the origin at the top-left corner, with each view maintaining its own space relative to its bounds. Methods like convert(_:to:) and convert(_:from:) transform points, rects, or sizes between a view's coordinate space and another view's or the window's, essential for hit-testing touches or aligning elements across subviews; for instance, a point tapped in a child view can be converted to the root view's coordinates for global event handling. Device rotation affects the view hierarchy by resizing the window and triggering layout updates, which views handle through autoresizing masks or manual overrides of layoutSubviews() to reposition subviews accordingly.
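A minimal custom-drawing sketch of the draw(_:) mechanism described above; the class name and circle content are illustrative, not from the text:

```swift
import UIKit

/// A custom view that paints a filled circle inset within its bounds.
final class BadgeView: UIView {
    override func draw(_ rect: CGRect) {
        // draw(_:) runs only after setNeedsDisplay() marks the view
        // dirty; coordinates here are in the view's own bounds space.
        guard let context = UIGraphicsGetCurrentContext() else { return }
        context.setFillColor(UIColor.systemRed.cgColor)
        context.fillEllipse(in: bounds.insetBy(dx: 2, dy: 2))
    }
}

// Usage: add to a hierarchy and request a redraw when state changes.
// let badge = BadgeView(frame: CGRect(x: 0, y: 0, width: 24, height: 24))
// parentView.addSubview(badge)
// badge.setNeedsDisplay()
```

Because the view is layer-backed, the drawn content is cached in the layer and recomposited cheaply until the next setNeedsDisplay() call.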

Controls and User Input Elements

UIControl serves as the foundational class in UIKit for creating interactive user interface elements that respond to user input through a target-action mechanism. Developers associate actions with specific control events using the addTarget(_:action:for:) method, where the action is a selector on a target object triggered by events such as touch interactions. The sendActions(for:) method programmatically dispatches these actions for designated events, allowing controls to simulate user interactions. Controls maintain states like normal, highlighted, disabled, and selected, which influence their appearance and behavior; for instance, the disabled state prevents interaction while altering visual feedback. Key subclasses of UIControl provide specialized functionality for common input scenarios. UIButton, inheriting directly from UIControl, handles tap gestures primarily through the .touchUpInside event, enabling connections to action methods via Interface Builder or code for tasks like form submission. UISlider allows users to select a value within a continuous range, firing the .valueChanged event continuously during thumb movement unless configured otherwise with isContinuous. UITextField facilitates text entry and editing, relying on a delegate conforming to UITextFieldDelegate to manage events like return key presses and text validation. UISwitch offers a toggle interface, switching between on and off states and notifying via the .valueChanged event upon user interaction. Appearance customization for these controls enhances visual consistency and adaptability. The tintColor property, introduced in iOS 7, propagates through the view hierarchy to tint interactive elements like buttons and sliders, defining a key color for interactivity. For right-to-left (RTL) language support, the semanticContentAttribute property determines content layout direction, automatically flipping views as needed for locales like Arabic or Hebrew. Accessibility features ensure controls are usable by assistive technologies.
Setting isAccessibilityElement to true designates a control as an individual accessible item, while accessibilityLabel provides a concise, localized description read by VoiceOver. Integration with the UIAccessibility informal protocol further supports assistive technologies by exposing traits, hints, and values, allowing dynamic announcements of state changes like a switch toggle. These controls integrate into the view hierarchy as subviews to capture and respond to user input within the broader structure.
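The target-action wiring described above can be sketched as follows; the controller, button title, and handler are hypothetical examples:

```swift
import UIKit

final class LoginViewController: UIViewController {
    private let submitButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        submitButton.setTitle("Submit", for: .normal)
        submitButton.accessibilityLabel = "Submit login form"

        // Fire handleSubmit(_:) when a touch lifts inside the button.
        submitButton.addTarget(self,
                               action: #selector(handleSubmit(_:)),
                               for: .touchUpInside)
        view.addSubview(submitButton)
    }

    @objc private func handleSubmit(_ sender: UIButton) {
        // Entering the disabled state dims the button and blocks input.
        sender.isEnabled = false
    }
}
```

Calling submitButton.sendActions(for: .touchUpInside) would invoke the same handler programmatically, as the prose above notes.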

View Controllers and Navigation

View controllers in UIKit serve as the primary mechanism for managing a single view hierarchy, coordinating the presentation of user interfaces, and handling responses to system events such as orientation changes and visibility updates. The UIViewController class is the foundational component, responsible for loading, displaying, and unloading its associated view while integrating with the app's overall lifecycle. The lifecycle of a UIViewController instance follows a structured sequence of methods that allow developers to perform initialization, preparation, and cleanup tasks at appropriate times. After the view hierarchy is loaded into memory, viewDidLoad is invoked, providing an opportunity for one-time setup such as configuring subviews or data sources, regardless of whether the view was created programmatically or from a storyboard. As the view becomes visible, viewWillAppear is called to prepare the interface, such as updating content or starting animations, followed by the view appearing onscreen. Conversely, once the view has been removed from the screen, viewDidDisappear enables tasks like saving state or stopping ongoing processes. Additionally, traitCollectionDidChange responds to updates in the view controller's trait collection, including size classes that indicate horizontal and vertical space availability, allowing adaptive layouts for different device orientations or form factors. Navigation between view controllers is commonly handled by UINavigationController, a specialized container that maintains a stack of view controllers to support hierarchical navigation patterns. Developers push new view controllers onto the stack using pushViewController(_:animated:), which embeds the new view in the navigation interface and animates its appearance if specified, while popping the top controller with popViewController(animated:) reveals the previous one and updates the display accordingly.
The navigation bar, managed by the controller, features customizable elements through the navigationItem property of each view controller, which sets the title displayed in the center and allows addition of bar button items on the left (e.g., back button) or right (e.g., action buttons) via leftBarButtonItem and rightBarButtonItem. For non-hierarchical transitions, UIKit supports modal presentation, where a view controller is displayed over the existing interface without altering the navigation stack. The present(_:animated:completion:) method overlays the new view controller, with animation and a completion handler optional, while dismiss(animated:completion:) removes it and returns focus to the underlying content. Presentation styles have evolved since iOS 13, with the default shifting to adaptive card-based sheets like .pageSheet, which present as partial overlays with translucent backgrounds on larger screens, contrasting with the full-screen .fullScreen style that requires explicit configuration for complete coverage. Container view controllers extend navigation capabilities for specific paradigms, such as tab-based or split interfaces. UITabBarController organizes multiple child view controllers into a tabbed interface, displaying a tab bar at the bottom (on iPhone) where selection via selectedIndex or selectedViewController swaps the active content area. It supports up to five tabs, with excess handled by a "More" section, and allows user customization of tab order. For iPad-optimized apps, UISplitViewController facilitates side-by-side layouts, managing primary and secondary (or supplementary) columns for master-detail patterns. Since iOS 8, it includes compact mode support, collapsing to a single column on smaller screens while expanding to double- or triple-column arrangements (the latter since iOS 14) on wider displays, adapting via traits like size classes.
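The push and modal flows above can be sketched together; the controller names and titles are illustrative placeholders:

```swift
import UIKit

final class ListViewController: UIViewController {
    func showDetail() {
        let detail = UIViewController()
        detail.navigationItem.title = "Detail"
        // Pushes onto the navigation stack with the standard animation.
        navigationController?.pushViewController(detail, animated: true)
    }

    func showSettings() {
        let settings = UIViewController()
        // .pageSheet is the adaptive card-style default since iOS 13.
        settings.modalPresentationStyle = .pageSheet
        present(settings, animated: true) {
            // Completion handler runs once the transition finishes.
        }
    }

    func closeSettings() {
        // Returns focus to the presenting controller underneath.
        dismiss(animated: true)
    }
}
```

Choosing .fullScreen instead of .pageSheet would restore the pre-iOS-13 full-coverage behavior mentioned above.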

Auto Layout and Constraints

Auto Layout is a constraint-based layout system in UIKit that enables the creation of adaptive user interfaces by dynamically calculating the size and position of views based on a set of declarative constraints. Introduced to iOS as part of iOS 6 in 2012, it replaced earlier frame-based and autoresizing mask approaches, allowing layouts to respond to changes in device orientation, screen size, and content. Constraints define relationships between views or their attributes, such as edges, centers, widths, and heights, ensuring consistent and flexible designs across different devices. The core class for defining these relationships is NSLayoutConstraint, which specifies how two layout attributes are related, using the formula firstItem.firstAttribute = multiplier × secondItem.secondAttribute + constant. Constraints can be initialized programmatically in two primary ways: directly using the NSLayoutConstraint initializer or via the Visual Format Language (VFL). The direct method involves calling init(item:attribute:relatedBy:toItem:attribute:multiplier:constant:), where parameters define the items (views), attributes (e.g., .leading, .width), relation (e.g., .equal), and scaling factors. For example, to pin a view's leading edge 20 points from its superview's leading edge:
```swift
let leadingConstraint = NSLayoutConstraint(
    item: myView,
    attribute: .leading,
    relatedBy: .equal,
    toItem: superview,
    attribute: .leading,
    multiplier: 1.0,
    constant: 20.0
)
```
VFL provides a more concise, string-based syntax for creating multiple constraints simultaneously, resembling ASCII-art diagrams of the layout. For instance, the string "H:|-20-[myView]-20-|" generates horizontal constraints pinning the view 20 points from the superview's margins. Constraints are created using constraints(withVisualFormat:options:metrics:views:), which requires a dictionary mapping view names to view instances. This approach is limited to certain relations and cannot express multipliers like aspect ratios directly. Each constraint has a priority value ranging from 0 to 1000, determining its importance in the layout engine; priorities of 1000 are required and must be satisfied, while lower values make them optional. By default, constraints are created with priority 1000 (required), but developers can set lower priorities to resolve conflicts by allowing the engine to break less critical ones. Content hugging and compression resistance priorities, which influence how views resize based on their intrinsic content, default to 250 (low) for hugging and 750 (high) for compression resistance in many standard UIKit views like labels and text fields. The Auto Layout engine solves the system of constraints to determine unambiguous frames for views, prioritizing higher-priority constraints and breaking ties by deactivating lower-priority ones in conflicts. Ambiguity arises when multiple valid solutions exist, such as when there are insufficient constraints to fix a view's position; developers can detect this using the UIView property hasAmbiguousLayout, which returns true if the layout lacks uniqueness. To resolve ambiguity, additional constraints must be added to specify exact relationships. Conflicts, where no solution satisfies all required constraints, are handled by the engine deactivating the lowest-priority conflicting constraint, though this can lead to warnings in the console.
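A short sketch of turning a VFL string into active constraints; the view names and 20-point metric mirror the example above and are otherwise illustrative:

```swift
import UIKit

let container = UIView()
let myView = UIView()
container.addSubview(myView)
// Required for programmatic Auto Layout on a manually created view.
myView.translatesAutoresizingMaskIntoConstraints = false

// Map the names used in the format string to actual view instances.
let views: [String: Any] = ["myView": myView]

// "H:" selects the horizontal axis; |-20-[myView]-20-| pins both
// edges 20 points in from the superview.
let horizontal = NSLayoutConstraint.constraints(
    withVisualFormat: "H:|-20-[myView]-20-|",
    options: [],
    metrics: nil,
    views: views
)
NSLayoutConstraint.activate(horizontal)
```

NSLayoutConstraint.activate(_:) both adds the constraints to the nearest common ancestor and marks them active in one step.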
Self-sizing views leverage the intrinsicContentSize property of UIView, which returns the natural size based solely on the view's content, such as text length in a UILabel or image dimensions in a UIImageView. This property integrates with the engine to automatically compute sizes without explicit width or height constraints, reducing boilerplate while respecting content hugging and compression resistance priorities; for example, a label might hug at priority 251 to prevent unnecessary expansion. Custom views must override intrinsicContentSize to participate in self-sizing, ensuring the engine can derive appropriate dimensions. UIStackView, introduced in iOS 9 in 2015, simplifies linear layouts by automatically generating and managing Auto Layout constraints for a collection of arranged views. It arranges subviews along a specified axis, either horizontal (.horizontal) or vertical (.vertical), and controls spacing with the spacing property. The distribution property defines how space is allocated along the axis, such as .fillEqually for uniform sizing or .fillProportionally based on intrinsic content sizes. Alignment perpendicular to the axis is set via the alignment property, like .center to center views or .fill to stretch them to match the stack's bounds. Stack views pin the first and last arranged views to their edges (or margins if isLayoutMarginsRelativeArrangement is true), deriving overall size from subviews' intrinsic content while allowing adaptive behavior. Adaptive traits in Auto Layout use size classes to tailor layouts for different devices and orientations, primarily distinguishing between compact and regular classes. The compact size class applies to constrained spaces, such as iPhone in portrait (compact width, regular height) or a narrow iPad Split View pane (compact width, regular height), while regular denotes expansive areas like full-screen iPad (regular width and height).
These traits, accessed via UITraitCollection, enable conditional constraint installation or view hiding in Interface Builder or code; for instance, showing a detailed sidebar only in regular width on iPad. There are nine total size class combinations (including "any" variants for broader applicability), ensuring layouts adapt seamlessly to device and orientation differences without duplication. Developers start with the base "any-any" class and override for specific combinations, verifying non-ambiguous results across all.
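The stack view properties discussed above can be sketched in code; the labels and their text are illustrative content:

```swift
import UIKit

let titleLabel = UILabel()
titleLabel.text = "Title"
let subtitleLabel = UILabel()
subtitleLabel.text = "Subtitle"

let stack = UIStackView(arrangedSubviews: [titleLabel, subtitleLabel])
stack.axis = .vertical          // arrange children top to bottom
stack.spacing = 8               // 8 points between arranged views
stack.distribution = .fill      // size along the axis from intrinsic content
stack.alignment = .leading      // align children to the leading edge

// The stack view generates the underlying constraints itself; callers
// only need to position the stack view within its superview.
stack.translatesAutoresizingMaskIntoConstraints = false
```

Because the labels supply intrinsic content sizes, the stack derives its own overall size without any explicit width or height constraints.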

Interaction and Advanced Features

Gesture Recognition and Events

UIGestureRecognizer serves as the foundational class in UIKit for detecting and interpreting sequences of touches or other inputs as user gestures, decoupling the recognition logic from the actions taken upon detection. Developers attach instances of UIGestureRecognizer or its subclasses to any UIView to enable gesture handling without directly processing raw touch events. Upon recognition, the gesture recognizer sends action messages to designated targets, typically in response to state transitions. The class employs a state machine to track gesture progress, starting in the .possible state where it awaits input. For continuous gestures, it transitions to .began when the gesture initiates, .changed as the gesture evolves with ongoing input, and .ended upon completion; action methods are invoked at each of these transitions. Discrete gestures, by contrast, move directly from .possible to .ended or .failed without intermediate changes. The .cancelled state occurs if external factors interrupt the gesture, while .failed indicates non-matching input. To integrate a gesture recognizer, developers invoke the addGestureRecognizer(_:) method on a UIView, ensuring the view's isUserInteractionEnabled property is true. Multiple recognizers can attach to the same view, with UIKit coordinating their interactions via delegates conforming to UIGestureRecognizerDelegate for fine-tuned behavior. UIKit provides several concrete subclasses of UIGestureRecognizer for common interactions. UITapGestureRecognizer detects discrete taps, configurable via numberOfTapsRequired (default: 1) to specify sequential taps and numberOfTouchesRequired (default: 1) for finger count; it transitions to .ended once the taps complete. UIPanGestureRecognizer handles continuous panning, tracking finger drags; developers query translation(in:) for displacement relative to a view's coordinate system and velocity(in:) for movement speed in points per second. 
UIPinchGestureRecognizer recognizes continuous two-finger pinches for scaling, providing the scale property as the ratio of current to initial touch distance. UILongPressGestureRecognizer detects continuous presses, with minimumPressDuration (default: 0.5 seconds) setting the hold time threshold, numberOfTouchesRequired for fingers, and allowableMovement limiting drift before failure. For scenarios involving multiple gesture recognizers on the same view, UIKit supports simultaneous recognition through delegate methods or direct dependencies. The require(toFail:) method establishes a failure requirement chain, delaying one recognizer's progression from .possible until another fails, such as requiring a single-tap to wait for a double-tap's failure. Delegates can further customize this via gestureRecognizer(_:shouldRecognizeSimultaneouslyWith:) for concurrent gestures or gestureRecognizer(_:shouldRequireFailureOf:) for ordered dependencies. Custom gestures beyond built-in subclasses require subclassing UIGestureRecognizer and overriding touch-handling methods like touchesBegan(_:with:), touchesMoved(_:with:), touchesEnded(_:with:), and touchesCancelled(_:with:). In these overrides, developers update the state property based on touch data, for instance implementing a drag gesture that tracks movement by analyzing touch deltas in touchesMoved(_:with:). This approach enables recognition of complex patterns, such as check marks or shapes, while integrating seamlessly with UIKit's event-handling system.
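The pan-gesture state machine above can be sketched as a drag handler; the view controller and handler names are illustrative:

```swift
import UIKit

final class CanvasViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self,
                                         action: #selector(handlePan(_:)))
        view.addGestureRecognizer(pan)
    }

    @objc private func handlePan(_ recognizer: UIPanGestureRecognizer) {
        guard let dragged = recognizer.view else { return }
        switch recognizer.state {
        case .began, .changed:
            // Apply the accumulated translation, then reset it so each
            // callback reports an incremental delta.
            let t = recognizer.translation(in: dragged.superview)
            dragged.center.x += t.x
            dragged.center.y += t.y
            recognizer.setTranslation(.zero, in: dragged.superview)
        case .ended:
            // Points-per-second velocity, e.g. for a momentum fling.
            _ = recognizer.velocity(in: dragged.superview)
        default:
            break
        }
    }
}
```

The .began/.changed/.ended cases map directly onto the continuous-gesture state transitions described earlier.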

Animations and Transitions

UIKit provides robust mechanisms for creating smooth and engaging animations through integration with Core Animation, allowing developers to animate view properties such as position, scale, and opacity without manual frame-by-frame rendering. These animations enhance user interfaces by simulating natural motion, with UIKit handling the underlying rendering on the GPU for performance. The Core Animation layers backing each view enable implicit animations for certain property changes, while explicit animations offer fine-grained control. Block-based animations form the foundation of UIView animations, introduced in iOS 4 to simplify declarative animation code. The UIView.animate(withDuration:delay:options:animations:completion:) method executes changes within the animations block over a specified duration in seconds, with an optional delay before starting. Animation curves, such as .curveEaseInOut in the options parameter, provide smooth acceleration and deceleration for more realistic motion. A completion handler executes after the animation finishes, receiving a Boolean indicating whether the animation ran to completion. For spring-like effects, UIView.animate(withDuration:delay:usingSpringWithDamping:initialSpringVelocity:options:animations:completion:) simulates physical springs by adjusting the damping ratio (values near 1.0 for minimal oscillation) and initialSpringVelocity (the initial speed, e.g., 1.0 to cover the full animation distance in one second). Keyframe animations extend block-based methods for complex, multi-stage sequences using UIView.animateKeyframes(withDuration:delay:options:animations:completion:). Within the animations block, developers call addKeyframe(withRelativeStartTime:relativeDuration:animations:) to define segments, where relativeStartTime (0.0 to 1.0) sets the fractional start offset and relativeDuration (0.0 to 1.0) allocates the segment's length relative to the total duration. This approach chains property changes, such as sequential transforms, without overlapping computations.
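A brief sketch of the three block-based styles described above (the `box` view and function name are illustrative; the UIView APIs are standard):

```swift
import UIKit

// Assumes `box` is a UIView already in a visible view hierarchy.
func animateBox(_ box: UIView) {
    // Basic block animation with an ease-in-out curve and completion handler.
    UIView.animate(withDuration: 0.3, delay: 0, options: [.curveEaseInOut]) {
        box.alpha = 0.5
        box.center.x += 100
    } completion: { finished in
        print("Finished: \(finished)")
    }

    // Spring animation: damping near 1.0 minimizes oscillation.
    UIView.animate(withDuration: 0.6, delay: 0,
                   usingSpringWithDamping: 0.7,
                   initialSpringVelocity: 1.0,
                   options: []) {
        box.transform = CGAffineTransform(scaleX: 1.2, y: 1.2)
    }

    // Keyframe animation: two segments splitting the total 1-second duration.
    UIView.animateKeyframes(withDuration: 1.0, delay: 0, options: []) {
        UIView.addKeyframe(withRelativeStartTime: 0.0, relativeDuration: 0.5) {
            box.center.y += 50
        }
        UIView.addKeyframe(withRelativeStartTime: 0.5, relativeDuration: 0.5) {
            box.transform = .identity
        }
    }
}
```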
The method supports the same options and completion handling as basic animation blocks for consistent behavior. Transitions in UIKit facilitate seamless state changes between views, often used for revealing or hiding content. The UIView.transition(with:duration:options:animations:completion:) class method applies effects like .transitionCrossDissolve from UIView.AnimationOptions to a container view, animating subview additions or removals in the animations block over the specified duration. For more advanced effects, Core Animation's CATransition adds transitions to a layer via layer.add(_:forKey:), with predefined types such as kCATransitionFade (the default cross-fade) or kCATransitionPush. Subtypes like kCATransitionFromTop direct the motion, and the undocumented "pageCurl" type string simulates turning pages. The transition duration defaults to 0.25 seconds unless overridden. Layer-level animations via Core Animation offer precise control over properties not directly animatable by UIView methods. CABasicAnimation targets properties like opacity by specifying a keyPath (e.g., "opacity"), with fromValue and toValue defining the start and end states for interpolation. For example, fading a layer from opaque to transparent sets fromValue to 1.0 and toValue to 0.0. An animation's delegate conforms to CAAnimationDelegate for callbacks like animationDidStop(_:finished:), enabling completion logic such as state updates. These animations integrate seamlessly with UIKit views through their backing CALayer, running efficiently off the main thread where possible. Recent updates have enhanced UIKit's animation capabilities. In iOS 18, UIUpdateLink provides a new way to synchronize complex animations with display updates, similar to CADisplayLink but optimized for UI elements. Additionally, UIKit now supports using SwiftUI animation types to animate UIView properties, improving interoperability, and introduces a reversible zoom transition for navigation and presentations.
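The view-level transition and the layer-level CABasicAnimation can be sketched together (the function and parameter names are illustrative; the APIs are standard UIKit and Core Animation):

```swift
import UIKit

// Cross-dissolve a label swap inside a container view.
func swapLabel(in container: UIView, old: UILabel, new: UILabel) {
    UIView.transition(with: container, duration: 0.4,
                      options: .transitionCrossDissolve) {
        old.removeFromSuperview()
        container.addSubview(new)
    }
}

// Layer-level fade using CABasicAnimation on the "opacity" key path.
func fadeOut(_ layer: CALayer) {
    let fade = CABasicAnimation(keyPath: "opacity")
    fade.fromValue = 1.0
    fade.toValue = 0.0
    fade.duration = 0.25
    layer.add(fade, forKey: "fade")
    layer.opacity = 0.0  // Commit the final value so it persists after the animation.
}
```

Setting the model value (`layer.opacity`) alongside the animation is a common idiom: CABasicAnimation only animates the presentation layer, so without the explicit assignment the layer would snap back to full opacity when the animation ends.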
In iOS 26 (2025), spring animations are simplified with new duration and bounce parameters in UIView.animate methods, along with fluid animations introduced as part of the Liquid Glass design language and expanded SF Symbols effects like bounce and pulse.

Data Presentation and Integration

Table and Collection Views

Table and collection views in UIKit provide efficient mechanisms for displaying large, scrollable datasets in applications, supporting both linear lists and grid-based layouts. These views handle the rendering of reusable content cells while delegating data management and user interactions to separate protocols, ensuring performance through recycling and lazy loading. UITableView specializes in single-column, vertically scrolling rows, ideal for lists like contacts or settings, whereas UICollectionView offers flexible, customizable arrangements for more complex presentations such as photo grids or dashboards. The UITableView class presents data in a single-column format, grouping rows into optional sections for hierarchical organization. It relies on a data source object conforming to the UITableViewDataSource protocol to supply content: the numberOfRowsInSection method returns the count of rows in a specified section, while cellForRowAt configures and provides a reusable cell for display at a given index path. A delegate object, implementing UITableViewDelegate, manages interactions and appearance, such as heightForRowAt to dynamically set row heights and didSelectRowAt to respond to user taps on rows. Sections enhance navigation with headers, footers, and an index view for quick jumping, using index paths (row and section indices) to uniquely identify content. UICollectionView extends this capability to multidimensional layouts, managing an ordered set of items divided into sections and presented via a layout object. The UICollectionViewFlowLayout subclass, commonly used for grid or flow arrangements, defines properties like itemSize to specify cell dimensions and minimumLineSpacing to control the spacing between rows of items. Supplementary views, such as section headers, are provided separately from cells and positioned by the layout, allowing for enriched structures like titled galleries.
Data sourcing mirrors UITableView, with UICollectionViewDataSource handling item counts and cell provision, while the delegate oversees selections and adjustments. Introduced in iOS 13, diffable data sources simplify updates for both UITableView and UICollectionView by using identifiable items and sections, eliminating manual index path calculations. The NSDiffableDataSourceSnapshot struct captures the current data state, enabling developers to append sections and items, then apply the snapshot to the data source with animation via apply(snapshot, animatingDifferences: true). This computes efficient differences between snapshots for smooth transitions, such as insertions or deletions. For batch operations, UICollectionView supports performBatchUpdates(_:completion:) to group multiple changes with animations, while UITableView uses beginUpdates() and endUpdates() for similar coordinated updates. In iOS 18, table and collection view APIs were updated to facilitate easier cell updates, including updateConfiguration(using:) for cells, headers, and footers, as well as the contentHuggingElements property on UITableView and UICollectionLayoutListConfiguration for improved layout control. Custom cells enhance reusability and editing in these views, with dequeueReusableCell(withIdentifier:for:) on UITableView and dequeueReusableCell(withReuseIdentifier:for:) on UICollectionView efficiently recycling cell instances to minimize memory overhead in scrolling scenarios. For editing, UITableView supports swipe-to-delete gestures: swiping a row reveals a Delete button, triggering the data source's tableView(_:commit:forRowAt:) to remove the item from the data source and animate the deletion via deleteRows(at:with:). This mode integrates with the overall editing state set by setEditing(_:animated:), allowing bulk operations without displaying full reorder controls during swipes.
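The diffable data source flow described above can be sketched as follows (a minimal example; the Section enum, item type, and function names are illustrative, while the registration and snapshot APIs are standard UIKit):

```swift
import UIKit

enum Section { case main }

// Assumes a UICollectionView already configured with a list or flow layout.
func makeDataSource(for collectionView: UICollectionView)
        -> UICollectionViewDiffableDataSource<Section, String> {
    // Cell registration configures each cell from its item.
    let cellRegistration = UICollectionView.CellRegistration<UICollectionViewListCell, String> {
        cell, indexPath, item in
        var content = cell.defaultContentConfiguration()
        content.text = item
        cell.contentConfiguration = content
    }
    return UICollectionViewDiffableDataSource(collectionView: collectionView) {
        collectionView, indexPath, item in
        collectionView.dequeueConfiguredReusableCell(using: cellRegistration,
                                                     for: indexPath, item: item)
    }
}

// Applying a snapshot animates the difference from the previous state;
// no index path bookkeeping is required.
func reload(_ items: [String],
            into dataSource: UICollectionViewDiffableDataSource<Section, String>) {
    var snapshot = NSDiffableDataSourceSnapshot<Section, String>()
    snapshot.appendSections([.main])
    snapshot.appendItems(items)
    dataSource.apply(snapshot, animatingDifferences: true)
}
```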

Integration with Modern Frameworks like SwiftUI

UIKit provides seamless interoperability with SwiftUI, Apple's declarative UI framework introduced in 2019, enabling developers to build hybrid applications that combine elements from both frameworks. This integration allows existing UIKit-based apps to incorporate SwiftUI views for modern declarative interfaces while retaining the imperative control of UIKit for legacy or complex components. By leveraging specific APIs, developers can embed SwiftUI hierarchies within UIKit view controllers or wrap UIKit views for use in SwiftUI scenes, facilitating gradual migration paths without full rewrites. A key mechanism for embedding SwiftUI views into UIKit is the UIHostingController class, introduced in iOS 13. This UIKit view controller manages a SwiftUI view hierarchy, allowing it to be presented modally, pushed onto a navigation stack, or added as a child view controller within an existing UIKit interface. Upon initialization, UIHostingController takes a root SwiftUI view, which it renders into its view hierarchy; the root view can be dynamically updated via the rootView property to reflect changing app state. For example, in a storyboard-based app, developers can instantiate UIHostingController with a SwiftUI content view and embed it into a container view, ensuring smooth integration without disrupting the overall UIKit architecture. This approach is particularly useful for adding SwiftUI-driven features, such as dynamic lists or animations, to apps with established UIKit navigation flows. Conversely, the UIViewRepresentable protocol enables the incorporation of UIKit views into SwiftUI by wrapping them as SwiftUI-compatible components, also available since iOS 13. To implement this protocol, a custom type must conform to UIViewRepresentable and provide two required methods: makeUIView(context:), which creates and configures the initial UIView instance, and updateUIView(_:context:), which applies updates to the view based on changing SwiftUI state.
The context parameter in these methods provides access to environment values, transactions, and a coordinator for handling UIKit delegate or target-action patterns, ensuring bidirectional communication between the frameworks. For instance, a UITextView can be wrapped using UIViewRepresentable to support editable text fields in a SwiftUI layout, with updates propagating seamlessly as the underlying SwiftUI data model evolves. This protocol supports complex interactions, such as gesture handling or data binding, by delegating to a Coordinator class that bridges UIKit's imperative events to SwiftUI's reactive model. In iOS 18, the UIGestureRecognizerRepresentable protocol was added to simplify reusing UIKit gesture recognizers in SwiftUI, and new zoom transitions support reversible and interruptible navigation between frameworks. Additionally, UIKit can now incorporate SwiftUI animations more fluidly. In hybrid applications, coordinating navigation and state between UIKit view controllers and SwiftUI scenes requires careful design to maintain a cohesive user experience. Developers can achieve this by using UIHostingController to host SwiftUI views within a UINavigationController stack, where UIKit manages the overall navigation while SwiftUI handles the hosted subviews; state sharing occurs via bindings like @Binding in SwiftUI linked to UIKit properties or shared observable objects. For example, a UIPageViewController can host multiple UIHostingController instances, each presenting a SwiftUI view, with page transitions triggered by UIKit's data source methods that respond to state changes tracked via @State variables. This setup allows hybrid navigation flows, such as pushing a SwiftUI screen from a UIKit controller or dismissing a SwiftUI view back to a UIKit controller, while ensuring consistent state propagation across framework boundaries. Such coordination is essential for apps transitioning incrementally, avoiding disruptions in navigation patterns like tab bars or split views.
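Both bridging directions can be sketched briefly (the TextView wrapper and the navigation helper are illustrative; UIViewRepresentable, the Coordinator pattern, and UIHostingController are standard APIs):

```swift
import SwiftUI
import UIKit

// Wrapping a UIKit UITextView for use inside a SwiftUI hierarchy.
struct TextView: UIViewRepresentable {
    @Binding var text: String

    func makeUIView(context: Context) -> UITextView {
        let textView = UITextView()
        textView.delegate = context.coordinator
        return textView
    }

    func updateUIView(_ uiView: UITextView, context: Context) {
        uiView.text = text  // Push SwiftUI state into the UIKit view.
    }

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    // The coordinator bridges UIKit's delegate callbacks back to SwiftUI.
    class Coordinator: NSObject, UITextViewDelegate {
        var parent: TextView
        init(_ parent: TextView) { self.parent = parent }
        func textViewDidChange(_ textView: UITextView) {
            parent.text = textView.text
        }
    }
}

// Embedding a SwiftUI view in a UIKit navigation stack.
func pushSwiftUIScreen(from navigationController: UINavigationController) {
    let hosting = UIHostingController(rootView: Text("Hello from SwiftUI"))
    navigationController.pushViewController(hosting, animated: true)
}
```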
In iOS 26, enhancements include automatic observation tracking and a new UI update method, updateProperties(), for better coordination in hybrid setups. Performance considerations in UIKit-SwiftUI integration emphasize leveraging each framework's strengths: SwiftUI excels at rapid development and adaptive layouts for simpler interfaces, while UIKit is preferred for performance-critical scenarios involving complex UIs or custom rendering. In hybrid apps, embedding SwiftUI via UIHostingController introduces minimal overhead for most use cases, but developers should watch view hierarchy depth, as excessive nesting can impact rendering efficiency; Apple's guidance recommends retaining UIKit for intricate components like custom collection views, where fine-grained control optimizes scroll performance and memory usage. For instance, complex UIs with heavy custom drawing often remain in UIKit to avoid overhead in SwiftUI's declarative update cycles, ensuring smooth 60 FPS interactions on older devices. Overall, hybrid approaches balance SwiftUI's declarative simplicity with UIKit's mature optimization tools, guided by profiling in Instruments to identify bottlenecks at framework boundaries.

Extensions and Cross-Platform Use

Third-Party Ports and Adaptations

Uno Platform, a .NET-based cross-platform framework launched in 2018, enables developers to build applications for Windows, iOS, Android, and other platforms using a single codebase that adapts to native controls where possible, including UIKit on iOS for rendering views like buttons and text elements. On iOS targets, its native renderer backend maps XAML-defined components to platform-specific controls, ensuring consistent behavior across environments while leveraging .NET's tooling for productivity. This adaptation allows .NET developers to target multiple platforms without rewriting logic, with iOS rendering specifically utilizing UIView subclasses from UIKit for native performance. Note that while native rendering uses UIKit, the default Skia renderer provides a unified cross-platform drawing engine as of 2025. .NET Multi-platform App UI (.NET MAUI), Microsoft's cross-platform framework released in 2022, also leverages UIKit for native iOS apps. It maps XAML or C# UI definitions to UIKit components via handlers, enabling single-codebase development for iOS, Android, Windows, and macOS with native performance on each platform. On iOS, controls like buttons and lists are rendered using UIView and its subclasses, integrating seamlessly with UIKit's layout and event systems. Flutter's Cupertino widget library provides a set of components designed to replicate the iOS design language, mimicking UIKit elements to create authentic iOS-style applications on cross-platform targets including Android and the web. Key examples include CupertinoApp, which serves as the root widget analogous to UIKit's UIApplication for managing app-wide themes and navigation, and CupertinoNavigationBar, which emulates UINavigationBar for handling title displays, leading/trailing actions, and hierarchical navigation. These widgets adhere to Apple's Human Interface Guidelines, incorporating iOS-specific interactions like edge swipes for back navigation, and are built to run efficiently on Flutter's own rendering engine while preserving UIKit's visual and behavioral fidelity.
React Native facilitates integration with native iOS components through its bridge architecture, where JavaScript-defined views are mapped to UIKit elements via view managers. The RCTViewManager class acts as the core mechanism for this mapping, allowing custom or built-in React Native components to instantiate and manage corresponding UIView instances from UIKit, such as UILabel for text or UIButton for interactive elements. This bridge ensures seamless performance by delegating rendering and event handling to native code, enabling developers to extend React Native apps with platform-specific features without full rewrites. Prior to 2015, several early third-party libraries attempted to port UIKit-inspired UI patterns to Android, aiming to replicate iOS aesthetics like navigation bars and tab structures amid the platform's pre-Material Design era, but most have been deprecated since the introduction of Google's Material Design guidelines in 2014, which standardized Android's visual language. These historical efforts, often open-source projects shared on developer forums, highlighted challenges in cross-platform consistency but were supplanted by native tools and modern frameworks.

Accessibility and Theming Features

UIKit provides robust built-in support for accessibility, enabling developers to make apps usable by people with disabilities through assistive technologies like VoiceOver and features like Dynamic Type. Core to this is the UIAccessibility protocol, which allows views and controls to expose traits such as .button for interactive elements or .adjustable for components like sliders that can change values. These traits inform assistive technologies about the role and behavior of UI elements, ensuring VoiceOver accurately describes and interacts with them. Recent updates in iOS 18 (2024) simplified maintenance of accessibility code with block-based setters for attributes like labels and hints, reducing boilerplate in UIKit apps. In 2025, the App Store introduced Accessibility Nutrition Labels to highlight features like VoiceOver support, aiding discoverability. To handle focus changes, developers can post notifications using UIAccessibility.post(notification: .screenChanged, argument: nil), which alerts VoiceOver to updates in the user interface, such as when a new view gains focus. Dynamic Type further enhances readability by scaling text based on the user's preferred content size category, accessible via UITraitCollection.preferredContentSizeCategory; apps adjust fonts using UIFontMetrics to respect these settings automatically. For VoiceOver users, UIKit supports custom rotor items through accessibilityCustomRotors, allowing quick navigation to related elements like headings or links via the gesture-based rotor. Additionally, the accessibilityPerformMagicTap() method enables a two-finger double-tap gesture to trigger an app's most salient action, such as play/pause in media apps. Integration with Guided Access ensures accessibility features remain functional in restricted modes, using UIGuidedAccessAccessibilityFeature to enable or disable options such as VoiceOver or Zoom during sessions.
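The label, trait, Dynamic Type, and notification APIs above can be combined in a short sketch (the views and function name are illustrative; the UIKit accessibility calls are standard):

```swift
import UIKit

// Assumes `playButton` and `titleLabel` belong to an existing view controller.
func configureAccessibility(playButton: UIButton, titleLabel: UILabel) {
    // Expose a role and description to VoiceOver.
    playButton.isAccessibilityElement = true
    playButton.accessibilityLabel = "Play"
    playButton.accessibilityTraits = .button

    // Scale a custom font with Dynamic Type via UIFontMetrics.
    let baseFont = UIFont.systemFont(ofSize: 17, weight: .semibold)
    titleLabel.font = UIFontMetrics(forTextStyle: .body).scaledFont(for: baseFont)
    titleLabel.adjustsFontForContentSizeCategory = true

    // Announce a screen change so VoiceOver moves focus to the new content.
    UIAccessibility.post(notification: .screenChanged, argument: titleLabel)
}
```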
Theming in UIKit, introduced prominently with iOS 13, centers on adaptive interfaces that respond to user preferences, particularly through the UITraitCollection class, which encapsulates environmental traits like userInterfaceStyle for light or dark modes. This system has continued to evolve, with iOS 26 (2025) introducing the Liquid Glass design language, which updates UIKit components like tab views, split views, bars, and presentations for more dynamic and fluid adaptive theming. Developers override traitCollectionDidChange(_:) in views or view controllers, or register trait-change handlers via registerForTraitChanges(_:handler:) on iOS 17 and later, to detect and react to style changes, updating colors, images, or layouts accordingly. Semantic colors in UIColor, such as .label for primary text or .secondaryLabel for subdued elements, provide dynamic colors that automatically adapt between light and dark appearances without manual intervention. These features apply to standard views and controls, ensuring consistent theming across UIKit-based interfaces.
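A minimal sketch of a view reacting to appearance changes (the class name is illustrative; the trait-collection APIs are standard UIKit):

```swift
import UIKit

class ThemedView: UIView {
    override init(frame: CGRect) {
        super.init(frame: frame)
        // Semantic colors resolve automatically for light and dark mode.
        backgroundColor = .systemBackground
        layer.borderWidth = 1
        layer.borderColor = UIColor.label.cgColor
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func traitCollectionDidChange(_ previousTraitCollection: UITraitCollection?) {
        super.traitCollectionDidChange(previousTraitCollection)
        // React only when the light/dark appearance actually changed.
        if traitCollection.hasDifferentColorAppearance(comparedTo: previousTraitCollection) {
            // CGColors are resolved eagerly and do not auto-adapt, so refresh them here.
            layer.borderColor = UIColor.label.cgColor
        }
    }
}
```

Refreshing layer-level CGColors on trait changes is the standard workaround for the fact that dynamic UIColors only adapt automatically when used directly by UIView properties.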

References

  1. [1]
    UIKit | Apple Developer Documentation
    UIKit provides a variety of features for building apps, including components you can use to construct the core infrastructure of your iOS, iPadOS, or tvOS apps.About App Development with... · UIKit Data Types · Accessibility for UIKit
  2. [2]
    UIView | Apple Developer Documentation
    To display labels, images, buttons, and other interface elements commonly found in apps, use the view subclasses that the UIKit framework provides rather than ...
  3. [3]
    What's new in UIKit - WWDC25 - Videos - Apple Developer
    Jun 9, 2025 · Modernize your app with the latest APIs in UIKit, including enhanced menu bar support, automatic observation tracking, a new UI update method, and improvements ...
  4. [4]
    UIKit and AppKit apps | Apple Developer Documentation
    UIKit and AppKit both provide a more traditional approach to creating apps that some people might prefer. Both frameworks provide a library of objects that ...
  5. [5]
    Cocoa (Touch) - Apple Developer
    Apr 6, 2018 · You use the AppKit and UIKit frameworks for developing an application's user interface. These two frameworks are equivalent in purpose but ...
  6. [6]
    MPMoviePlayerController | Apple Developer Documentation
    This class supports programmatic control of movie playback, and user-based control via buttons supplied by the movie player.Missing: UIKit | Show results with:UIKit
  7. [7]
    What's new in UIKit - WWDC22 - Videos - Apple Developer
    Jun 6, 2022 · Discover the latest updates and improvements to UIKit and learn how to build better iPadOS, iOS, and Mac Catalyst apps.
  8. [8]
    What Is Cocoa? - Apple Developer
    Sep 18, 2013 · The application-framework layer of iOS is called Cocoa Touch. ... AppKit and UIKit are Cocoa application frameworks that are designed for ...
  9. [9]
    Survey the Major Frameworks - Apple Developer
    Apr 23, 2013 · Your apps, as well as UIKit and other frameworks, are built on the Foundation framework infrastructure. The Foundation framework provides many ...
  10. [10]
    Framework - Apple Developer
    Apr 6, 2018 · For example, iPhone application projects link by default to the Foundation, UIKit, and Core Graphics frameworks. Your code accesses the ...
  11. [11]
    Drawing model - Apple Developer
    Apr 6, 2018 · The Quartz (Core Graphics), Core Animation, and application frameworks for each platform (UIKit and AppKit) provide native graphics support ...
  12. [12]
    Migrating from Cocoa Touch - Apple Developer
    Sep 16, 2015 · Note: Whereas UIKit uses Core Graphics data types for rectangles, points, and other geometric primitives, AppKit uses its own defined types for ...
  13. [13]
    Get Started - watchOS - Apple Developer
    Get started with this easy-to-navigate collection of videos, documentation, and tools to build great apps and games.
  14. [14]
    Windows and screens | Apple Developer Documentation
    ### Summary of UIWindow as Root Container Managing Scenes
  15. [15]
    UIApplication | Apple Developer Documentation
    Your app's application object handles the initial routing of incoming user events. It dispatches action messages forwarded to it by control objects (instances ...
  16. [16]
  17. [17]
    CALayer | Apple Developer Documentation
    An object that manages image-based content and allows you to perform animations on that content.Contents · Func render(in: CGContext) · Bounds · compositingFilter
  18. [18]
    Metal Overview - Apple Developer
    Metal powers hardware-accelerated graphics on Apple platforms by providing a low-overhead API, rich shading language, tight integration between graphics and ...Explore Metal documentation · Metal Performance Shaders · Metal Developer ToolsMissing: UIKit 8
  19. [19]
    DispatchQueue | Apple Developer Documentation
    An object that manages the execution of tasks serially or concurrently on your app's main thread or on a background thread.DispatchQueue.Attributes · Func sync(execute: () -> Void) · Global(qos:)Missing: UIKit | Show results with:UIKit
  20. [20]
    About App Development with UIKit | Apple Developer Documentation
    The UIKit framework provides the core objects that you need to build apps for iOS and tvOS. You use these objects to display your content onscreen.
  21. [21]
    Managing your app's life cycle | Apple Developer Documentation
    In iOS 12 and earlier, UIKit delivers all life-cycle events to the UIApplicationDelegate object. The app delegate manages all of your app's windows, including ...
  22. [22]
    UIApplicationDelegate | Apple Developer Documentation
    In iOS 12 and earlier, you use your app delegate to manage major life cycle events in your app. Specifically, you use methods of the app delegate to update the ...didFinishLaunchingWithOptions · applicationDidBecomeActive(_:) · Window
  23. [23]
    application(_:didFinishLaunchingWithOptions:) - Apple Developer
    Tells the delegate that the launch process is almost done and the app is almost ready to run.
  24. [24]
    application(_:open:options:) | Apple Developer Documentation
    Asks the delegate to open a resource specified by a URL, and provides a dictionary of launch options.
  25. [25]
    UISceneDelegate | Apple Developer Documentation
    Use your UISceneDelegate object to manage life-cycle events in one instance of your app's user interface. This interface defines methods for responding to state ...
  26. [26]
    Extending your app's background execution time - Apple Developer
    You extend your app's runtime by calling the beginBackgroundTask(withName:expirationHandler:) method. Calling this method gives you extra time to perform ...
  27. [27]
    View Programming Guide for iOS - Apple Developer
    Sep 17, 2014 · A view is an instance of the UIView class (or one of its subclasses) and manages a rectangular area in your application window. Views are ...
  28. [28]
    draw(_:) | Apple Developer Documentation
    Specifically, UIKit creates and configures a graphics context for drawing and adjusts the transform of that context so that its origin matches the origin of ...
  29. [29]
    UIControl | Apple Developer Documentation
    The base class for controls, which are visual elements that convey a specific action or intention in response to user interactions.
  30. [30]
  31. [31]
  32. [32]
    UIControl.State | Apple Developer Documentation
    Overview. A control can have more than one state at a time. Controls can have different configurations according to their state.
  33. [33]
    UIButton | Apple Developer Documentation
    ### Summary of UIButton Specifics
  34. [34]
    UISlider | Apple Developer Documentation
    ### Summary of UISlider: valueChanged Event and Functionality as a Control
  35. [35]
  36. [36]
  37. [37]
    tintColor | Apple Developer Documentation
    tintColor. The first nondefault tint color value in the view's hierarchy, ascending from and starting with the view itself. iOS 7.0 ...
  38. [38]
    semanticContentAttribute | Apple Developer Documentation
    A semantic description of the view's contents, used to determine whether the view should be flipped when switching between left-to-right and right-to-left ...
  39. [39]
    accessibilityLabel | Apple Developer Documentation
    Discussion. The label is a very short, localized string that identifies the accessibility element, but does not include the type of the control or view.
  40. [40]
    UIAccessibility | Apple Developer Documentation
    UIAccessibility. A set of methods that provides accessibility information about views and controls in an app's user interface.
  41. [41]
  42. [42]
  43. [43]
  44. [44]
    Apple Developer Documentation
    **Summary of Modal Presentation in UIKit (iOS 13 and Later):**
  45. [45]
    UITabBarController | Apple Developer Documentation
    A container view controller that manages a multiselection interface, where the selection determines which child view controller to display.UITabBarController.Mode · selectedViewController
  46. [46]
    UISplitViewController | Apple Developer Documentation
    In iOS 14 and later, UISplitViewController supports column-style layouts. A column-style split view controller lets you create an interface with two or three ...
  47. [47]
    Understanding Auto Layout - Apple Developer
    Mar 21, 2016 · Understanding Auto Layout. Auto Layout dynamically calculates the size and position of all the views in your view hierarchy, ...Working with Constraints in... · Stack Views · Programmatically Creating...
  48. [48]
    NSLayoutConstraint | Apple Developer Documentation
    After solving for the required constraints, Auto Layout tries to solve all the optional constraints in priority order from highest to lowest. If it cannot solve ...
  49. [49]
    Auto Layout Guide: Programmatically Creating Constraints
    Mar 21, 2016 · You have three choices when it comes to programmatically creating constraints: You can use layout anchors, you can use the NSLayoutConstraint class, or you can ...
  50. [50]
    Auto Layout Guide: Views with Intrinsic Content Size
    Mar 21, 2016 · The following recipes demonstrate working with views that have an intrinsic content size. In general, the intrinsic content size simplifies the layout.
  51. [51]
    Auto Layout Guide: Ambiguous Layouts - Apple Developer
    Mar 21, 2016 · Ambiguous layouts occur when the system of constraints has two or more valid solutions. There are two main causes.
  52. [52]
  53. [53]
    intrinsicContentSize | Apple Developer Documentation
    The natural size for the receiving view, considering only properties of the view itself.Missing: Layout | Show results with:Layout
  54. [54]
    UIStackView | Apple Developer Documentation
    The stack view ensures that its arrangedSubviews property is always a subset of its subviews property. Specifically, the stack view enforces the following rules ...
  55. [55]
    Auto Layout Guide: Size-Class-Specific Layout - Apple Developer
    Mar 21, 2016 · Interface Builder recognizes nine different size classes. Four of these are the Final size classes: Compact-Compact, Compact-Regular, Regular- ...
  56. [56]
    UIGestureRecognizer | Apple Developer Documentation
    The UIGestureRecognizer class defines a set of common behaviors that can be configured for all concrete gesture recognizers. It can also communicate with its ...
  57. [57]
    About the Gesture Recognizer State Machine - Apple Developer
    A gesture recognizer always starts in the UIGestureRecognizer.State.possible state, which indicates that it is ready to start processing events. From that state ...
  58. [58]
  59. [59]
    UIGestureRecognizerDelegate | Apple Developer Documentation
    A set of methods implemented by the delegate of a gesture recognizer to fine-tune an app's gesture-recognition behavior.
  60. [60]
    UITapGestureRecognizer | Apple Developer Documentation
    A discrete gesture recognizer that interprets single or multiple taps.
  61. [61]
    UIPanGestureRecognizer | Apple Developer Documentation