New York: Apple is quietly advancing its next generation of wearable technology—AI-powered smart glasses and camera-equipped AirPods—but these cutting-edge innovations may still be years from reaching consumers.
According to Bloomberg’s Mark Gurman, the tech giant is actively working on a secretive smart glasses project, codenamed N50, which Apple intends to develop as a flagship product under its “Apple Intelligence” umbrella. CEO Tim Cook is reportedly deeply invested in the initiative, with insiders claiming he is “focused on nothing else” when it comes to future hardware ambitions.
Unlike full-fledged AR headsets such as the Vision Pro, Apple’s upcoming smart glasses are expected to be far lighter and more user-friendly, blending real-world utility with intelligent onboard computing. Rather than delivering immersive AR overlays, the glasses will reportedly feature built-in cameras and sensors that analyze the wearer’s environment and provide real-time, contextual feedback via integrated AI systems.
“The product will analyse the surrounding environment and feed information to the wearer, though it will stop well short of true augmented reality,” Gurman wrote in his Power On newsletter.
However, this vision is not without significant hurdles. Technical and design limitations are reportedly delaying the project, as Apple strives to balance high-performance hardware, lightweight design, strong battery life, and premium optics. Gurman notes the product is “not close to being ready yet,” with earlier reports suggesting a launch could still be three to five years away.
AirPods May Soon Get Smarter—With Cameras
Alongside smart glasses, Apple is also developing a more intelligent version of its AirPods. Gurman reveals that Apple is experimenting with infrared cameras embedded in future models. These are not traditional cameras, but rather sensors similar to those used in Face ID on the iPhone.
These new AirPods could provide a vastly improved spatial awareness experience by capturing environmental data and feeding it to Apple’s AI systems. This could translate into personalized spatial audio, enhanced user interaction, and potentially even gesture-based controls. With infrared sensors, users may one day be able to change songs, answer calls, or interact with AR interfaces simply by moving their hands—no physical touch required.
Respected analyst Ming-Chi Kuo has previously predicted that AirPods equipped with infrared cameras might enter mass production by 2026 or 2027, pointing to a longer development timeline than some had hoped for.
Competition Heats Up: Meta Moves Faster
While Apple perfects its approach, competitors like Meta are already shipping smart glasses. Meta, in collaboration with Ray-Ban and EssilorLuxottica, launched its second-generation smart glasses in 2023. The Ray-Ban Meta Smart Glasses come with built-in cameras and speakers, allowing users to take photos, make calls, record videos, and even receive real-time translations with voice commands like “Hey Meta.”
Priced at $299 in the U.S., Meta’s glasses are expected to launch in India soon, underscoring the urgency for Apple to accelerate its own product development in the rapidly evolving wearable tech space.
While Apple is known for taking a cautious and calculated approach, its ambitions are clear: create wearable devices that are not just smart, but deeply integrated with Apple’s ecosystem and AI infrastructure. With its track record of redefining categories, the world will be watching closely—no matter how long it takes.