Meta Display Glasses Now Open to Third-Party App Developers
Meta is expanding the reach of its Display AI glasses by opening the platform to third-party developers, marking a significant step toward building a broader app ecosystem for the wearable device.
Released in September, the Meta Display glasses represent the company’s most advanced AI-powered eyewear to date. The device features a heads-up display and a wrist control band, offering users a richer and more interactive experience than previous models. Now, developers can build mobile and web apps for the glasses through a developer preview program, using familiar iOS and Android tools.
A New Interaction Model Beyond Touchscreens
A standout feature driving this developer push is Meta’s Neural Band, a wrist-worn controller that enables gesture-based input without touchscreens or voice commands. This opens a genuinely novel interaction paradigm for app creators: experiences that respond to subtle physical gestures, allowing discreet, real-world control that feels natural and unobtrusive.
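To make that interaction model concrete, here is a minimal Kotlin sketch of how an app might route wrist-gesture events to on-glasses actions. Meta has not published a public gesture API for the Display glasses, so every name below (WristGesture, GestureListener, GestureDispatcher) is a hypothetical stand-in invented for illustration, not part of any real SDK.

```kotlin
// Hypothetical sketch only: these types are stand-ins, not Meta's SDK.

// A few plausible band gestures an app might care about.
enum class WristGesture { PINCH, DOUBLE_PINCH, SWIPE_LEFT, SWIPE_RIGHT }

// Hypothetical callback an app would register to receive band events.
fun interface GestureListener {
    fun onGesture(gesture: WristGesture)
}

// Toy dispatcher standing in for whatever event pipeline the real
// platform provides; it simply fans events out to registered listeners.
class GestureDispatcher {
    private val listeners = mutableListOf<GestureListener>()
    fun register(listener: GestureListener) { listeners.add(listener) }
    fun emit(gesture: WristGesture) { listeners.forEach { it.onGesture(gesture) } }
}

fun main() {
    val dispatcher = GestureDispatcher()

    // The app maps subtle wrist gestures to discreet actions,
    // with no touchscreen or voice command involved.
    dispatcher.register { gesture ->
        when (gesture) {
            WristGesture.PINCH -> println("Select the highlighted overlay item")
            WristGesture.DOUBLE_PINCH -> println("Dismiss the heads-up display")
            WristGesture.SWIPE_LEFT -> println("Previous step in the guide")
            WristGesture.SWIPE_RIGHT -> println("Next step in the guide")
        }
    }

    // Simulated events, since no real hardware feed is available here.
    dispatcher.emit(WristGesture.SWIPE_RIGHT)
    dispatcher.emit(WristGesture.PINCH)
}
```

The design point is the event-driven shape: an app subscribes to gesture events and maps them onto a small set of contextual actions, rather than rendering touch targets the way a phone app would.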
Meta CTO Andrew Bosworth highlighted one early example, an app called “Darkroom Buddy” that overlays film-development guidance directly in the wearer’s field of vision, illustrating the platform’s practical, task-based potential.
Building Toward Full AR Glasses
Meta plans to gradually widen access to the Display developer program over the coming weeks, letting more use cases emerge organically from the developer community. The move is also widely seen as groundwork for Meta’s anticipated AR glasses launch next year, which is expected to deliver even more immersive and interactive wearable experiences.
With a developer ecosystem now taking shape around the glasses, Meta’s ambitions for the category are clearly accelerating.

