I use Apple’s official naming throughout this essay: visionOS. I also want to sharpen the thesis slightly. The most plausible near-term shift is not that AI suddenly replaces the kernel-level operating system. It is that AI increasingly takes over the primary interaction layer, the one now spread across app navigation, system search, shortcuts, cross-app flows, and tool orchestration. That is the layer users actually feel every day. (Apple, OpenAI)
Spatial grammar is flowing back into mainstream operating systems
If 2013 was the year flat UI became the dominant grammar of mainstream mobile software, then 2025 looks more like the year spatial interface logic began flowing back into mainstream operating systems. Apple introduced Liquid Glass across iOS 26, iPadOS 26, macOS Tahoe 26, watchOS 26, and tvOS 26, and explicitly said the new design was inspired by the “depth and dimensionality” of visionOS. At the same time, Apple continues to define Vision Pro as a spatial computer, and visionOS as a spatial operating system built on the foundations of macOS, iOS, and iPadOS, complete with an infinite canvas, a fully three-dimensional UI, and input centered on eyes, hands, and voice. Taken together, those signals suggest that Liquid Glass is not just a visual refresh. It is Apple importing the grammar of spatial computing back into traditional 2D devices. (Apple)
The more accurate claim is not that flat UI is dead, but that flat UI is no longer sufficient as the sole organizing logic of software. Apple’s iOS 7 redesign already emphasized subtle motion, translucency, and functional layering instead of heavy skeuomorphic metaphor. If you go back further, Mac OS X Aqua also relied on transparency, fluidity, and optical richness. Apple’s design history was never a simple one-way migration from skeuomorphism to flatness. It has repeatedly rebalanced abstraction and materiality. Liquid Glass matters because it turns “material” from a look into an interaction system. (Apple, Apple Developer)
Apple’s own WWDC25 language makes that clear. Liquid Glass is presented not as literal glass, but as a digital meta-material that bends and shapes light in real time. Lensing, specular highlights, ambient spill, morphing surfaces, and real-time rendering are used to communicate focus, hierarchy, touch response, and the relationship between interface elements. Hierarchy is no longer established primarily through blocks, borders, and shadow tokens. It is increasingly established through optics, material behavior, and continuity of motion. This is not merely “more 3D-looking UI.” It is a shift from layout logic toward spatial logic. (Apple Developer)
In Apple’s new design system, Liquid Glass is repeatedly described as a functional layer that floats above content while preserving clarity and not stealing focus. Action sheets are now anchored to the action that triggered them instead of merely rising from the bottom of the screen. Scroll edge effects are framed as a way to clarify the boundary between UI and content rather than as decorative blur. The interface is no longer a pile of flat screens. It is becoming a set of surfaces with source relationships, attachment points, and perceived distance. Apple even describes those relationships as “spatial yet grounded.” (Apple Developer)
Apple’s own Human Interface Guidelines define materials as effects that create depth, layering, and hierarchy. That definition alone already signals a broader shift away from austere flatness toward interfaces that feel more embodied, more animated, and more layered. (Apple Developer)
Flat UI is not dead, but it is no longer enough
This is why I would avoid saying that Liquid Glass is a return to old skeuomorphism. The important shift is not nostalgia for realistic texture. It is the elevation of depth, attachment, optical response, and material continuity into reusable interaction primitives. Once material becomes functional rather than ornamental, the interface stops being only a layout problem. It becomes a question of how to stage focus, origin, affordance, and trust. That is a deeper change than a simple aesthetic refresh. (Apple Developer)
The consequence is practical. In a flat-UI-dominant system, clarity is often built from contrast, spacing, alignment, and a limited set of elevation cues. In a spatially informed system, clarity can also come from where a surface appears to originate, how it reacts to motion, what it seems attached to, and how it separates itself from underlying content. This makes interface behavior more legible in some cases, but it also raises the bar for restraint. If the spatial cue does not improve comprehension, it quickly turns into noise.
Why visionOS is the decisive catalyst
visionOS matters because it does not merely introduce a new aesthetic. It formalizes space itself as an OS-level UI primitive. Apple describes Vision Pro in terms of windows, volumes, 3D objects, ornaments, and an infinite canvas. The spatial layout guidance for visionOS explicitly frames depth as a way to communicate hierarchy. Ornaments allow controls and information to attach to a window without crowding or obscuring the main content. Most revealing of all, Apple describes volumes as an intermediate step between a 2D windowed experience and a fully 3D immersive app. That is almost an official version of the transition many designers are now feeling: spatial UI does not arrive all at once, and it does not have to. It can arrive through intermediate layers. (Apple)
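As a rough sketch of how those primitives surface in code, a visionOS app can declare a volume and attach its controls as an ornament. The view names below (ModelView, PlaybackControls) are placeholders, and the exact modifier signatures may vary across SDK versions:

```swift
import SwiftUI

@main
struct GalleryApp: App {
    var body: some Scene {
        // A volume: bounded 3D content, the intermediate step Apple
        // describes between a flat window and a fully immersive space.
        WindowGroup(id: "model") {
            ModelView() // placeholder for the app's 3D content view
                // An ornament attaches controls to the window's edge
                // without crowding or obscuring the main content.
                .ornament(attachmentAnchor: .scene(.bottom)) {
                    PlaybackControls() // placeholder control strip
                }
        }
        .windowStyle(.volumetric)
    }
}
```

The notable design choice is that depth and attachment are declared at the scene level, not painted on: the system, not the app, decides how the ornament floats relative to the window.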
I would refine the thesis one step further. It is not that visionOS single-handedly caused UI to become more dimensional. It is that visionOS gave Apple a place to systematize depth, attachment, material, and spatial relationships into a coherent cross-platform interaction language. Liquid Glass also depends on hardware and platform groundwork: Apple silicon, real-time graphics, rounded display geometry, and a shared foundation that spans SwiftUI, Mac Catalyst, and updated windowing behavior. So visionOS is the clearest catalyst, but not the only cause. What is really happening is that the grammar of spatial computing is beginning to reshape the entire Apple ecosystem. (Apple)
Why AI interfaces are becoming workbenches
The next part of the argument is just as important. Interface structure is increasingly moving toward something IDE-like. Apple itself is already laying the groundwork. iPadOS 26 introduces a new windowing system with fluid resizing and precise placement. UIKit and UISplitViewController now emphasize inspector columns, dynamic column resizing, and richer menu bar behavior. Apple’s WWDC25 guidance treats split views, inspectors, sidebars, search, and toolbars as foundational structures of the new design system. Put together, those are no longer the ingredients of a single linear app flow. They are the ingredients of a workspace. (Apple, Apple Developer)
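In SwiftUI terms, that workspace shape is already expressible with a handful of structural views. A minimal sketch, with placeholder pane views (SourceList, EditorPane, InspectorPane are illustrative names, not Apple APIs):

```swift
import SwiftUI

struct WorkbenchView: View {
    @State private var showInspector = true

    var body: some View {
        // Sidebar plus detail: the backbone of a workspace
        // rather than a single linear app flow.
        NavigationSplitView {
            SourceList() // placeholder: projects, files, or conversations
        } detail: {
            EditorPane() // placeholder: the main content surface
                // The inspector column keeps tools and state beside
                // the content instead of burying them behind navigation.
                .inspector(isPresented: $showInspector) {
                    InspectorPane() // placeholder: properties and tools
                }
        }
    }
}
```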
AI-native software makes the same pattern even more visible. OpenAI’s Canvas moves beyond pure chat into a dedicated collaborative workspace for writing and coding. Replit’s Workspace combines agent conversation, live preview, console output, and tools in one home base. GitHub Copilot agent mode exposes multi-step execution, terminal commands, approvals, undo, and tests inside the interface. These products do not resemble IDEs by accident. They resemble IDEs because IDEs are one of the most mature interface patterns for collaboration between human intent and machine execution. (OpenAI, Replit Docs, Visual Studio Code)
So the best version of the claim is not that every product will literally become VS Code. It is that any interface built for complex, iterative, delegated, inspectable work will tend toward a workbench structure. It will need a place for conversation, a place for content, a place for preview, a place for tools, a place for state, and a place for approval and reversal. Traditional app UI is usually a guided path. AI UI increasingly looks like a controllable, inspectable workspace.
AI may not replace the kernel, but it can redefine the interaction layer
This brings us to the operating system question. If by “replace the operating system” we mean the kernel, memory management, drivers, and low-level security infrastructure, then no, that is not the near-term story. But if we mean the layer users actually experience day to day, the layer of search, launching, cross-app coordination, shortcuts, tool orchestration, and guided action, then yes, that layer is already being redefined. OpenAI’s Computer-Using Agent shows the direction clearly: models are being trained not only to answer prompts, but to operate graphical interfaces, sequence actions, and complete tasks across existing software surfaces. That is not only response generation. It is orchestration. (OpenAI)
Apple’s trajectory points in the same direction from another angle. App Intents exists to describe app capabilities in a form the system can understand, invoke, and compose. Apple’s WWDC24 guidance says App Intents lets people take action outside the app. Then Apple makes the guidance more ambitious: rather than exposing only a few habitual tasks, the recommendation shifts toward making anything the app does available as an intent. Apple also positions App Intents as a core building block for Apple Intelligence experiences. Once app functionality is extracted from closed UI and exposed as system-callable capability, the user is no longer primarily operating an individual app. The user is operating a smart surface that can dynamically assemble app capabilities on demand. (Apple Developer)
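As an illustration of what “system-callable capability” looks like in practice, an App Intents declaration is only a few lines. The intent below is hypothetical, but the protocol shape (a title, typed parameters, and a perform method) follows Apple’s App Intents API:

```swift
import AppIntents

// Hypothetical intent: expose "open the latest draft" as a capability
// the system (Shortcuts, Siri, Apple Intelligence) can invoke and compose.
struct OpenLatestDraftIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Latest Draft"

    @Parameter(title: "Notebook")
    var notebook: String

    func perform() async throws -> some IntentResult {
        // App-specific work would happen here. The point is that the
        // capability is described to the OS, not locked inside the app's UI.
        return .result()
    }
}
```

Once every meaningful action is declared this way, the system can compose them into flows the app's own navigation never anticipated.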
This is exactly why UI and OS are becoming tightly coupled again. In the past, UI could be treated as a skin placed on top of the operating system. In the era of AI and spatial computing, that relationship flips. UI depends on the OS to expose semantic actions, permissions, windowing structures, and tool boundaries. The OS depends on the UI to make agent behavior legible: what is happening, why it is happening, what the agent is about to do next, and where the human can intervene. GitHub Copilot agent mode makes tool invocations visible in the UI and requires approval for terminal actions. In this world, good UI is not only visually refined. It is a trust surface. (Visual Studio Code, OpenAI)
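None of the types below come from a real SDK; they are a hypothetical sketch of the pattern Copilot-style agent UIs follow, where some actions stay reversible inside the workspace and others must surface for explicit approval:

```swift
// Hypothetical model of agent actions as a trust surface.
enum AgentAction {
    case editFile(path: String, summary: String)
    case runTerminal(command: String)
}

extension AgentAction {
    /// Actions whose side effects escape the reversible workspace
    /// must be shown to the user and wait for consent.
    var requiresApproval: Bool {
        switch self {
        case .editFile:    return false // undoable inside the workspace
        case .runTerminal: return true  // side effects leave the sandbox
        }
    }
}
```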
Spatiality only matters if it improves understanding
There is also a real downside to this transition. Nielsen Norman Group argued in late 2025 that Liquid Glass can obscure content and replace familiar interaction conventions with newer but not necessarily better patterns. The broader lesson is clear: as UI moves from flatness toward spatiality, the real challenge is not making software feel more magical. It is preserving legibility, accessibility, predictability, and cognitive calm while depth and materiality increase. If spatiality does not improve understanding, it becomes noise. (Nielsen Norman Group, Apple)
Conclusion: the operating surface is being rewritten
If I had to compress the argument into one statement, it would be this: Liquid Glass is not simply Apple redesigning UI. It is the beginning of Apple moving the spatial grammar of visionOS (depth, material, attachment, and source-anchored interaction) back into iPhone, iPad, and Mac. The next phase is not that everything becomes glass, but that complex interfaces become more workbench-like and more IDE-like. The phase after that is not necessarily AI replacing the operating system itself, but AI replacing the operating system’s primary interaction layer. What is being rewritten is not only UI, and not only OS, but the operating surface between them. (Apple, OpenAI, Apple Developer)
References
- Apple: Introducing Apple Vision Pro
- Apple: Apple introduces a delightful and elegant new software design
- Apple: Apple Unveils iOS 7
- Apple Developer: Get to know the new design system
- Apple Developer: Design for clarity and delight
- Apple Developer: Human Interface Guidelines, Materials
- Apple: iPadOS 26 introduces powerful new features that push iPad even further
- OpenAI: Introducing canvas
- Replit Docs: Replit Workspace
- OpenAI: Introducing Computer-Using Agent
- Apple Developer: Explore app intents
- Visual Studio Code: Introducing Copilot agent mode
- Nielsen Norman Group: Liquid Glass: A New Look, Same Old Problems