The premier show on VR, XR, and AR took place June 10-12, 2025, in Long Beach, CA. Over 5,000 people attended the 16th edition of this conference to get updates on what is new and hot in augmented reality.
One major shift for the show is how it has evolved its focus. In the last two years, the term “spatial” was highly promoted to define the show, as spatial was meant to embody the world of all things augmented. While the term spatial is still prominent in the show’s promotions, the show this year seemed to embrace XR as the more accurate label, reflecting the fact that this show now fully spans the VR (virtual reality), XR (extended reality), MR (mixed reality), and AR (augmented reality) worlds.
Qualcomm, Samsung, Sony, Lenovo, Google, Xreal, Snap and Snap Spectacles, Unity, Pico, and over 200 others representing new XR hardware, software, and dedicated XR service providers had booths at the show.
The show featured three days of conference sessions exploring every aspect of XR. More than 450 speakers took the stage, representing leading tech companies, Fortune 500 firms, and innovative startups. There were also over 250 sponsors and exhibitors.
Thousands of attendees—including creators, developers, industry executives, founders, entertainers, investors, top media, and more—participated both in person and virtually.
Topics ranged from enterprise case studies, the latest developer and creator tools, sales and marketing strategies, branded experiences, AR Cloud, WebXR, 5G, AI, Web3, haptics, privacy, and ethics to entertainment, education, and beyond.
One standout observation: the quality of speakers at this event was truly exceptional.
I attended one session on case studies for XR Training. The speakers were leaders of XR Training programs at Duke Energy and Volvo. Many sessions had speakers discussing the real-world usage of XR in their businesses today.
One significant takeaway from this year’s show was the role AI is playing in XR. The underlying theme of this year’s event was the marriage of AI and XR, which ran through nearly all of the sessions and exhibitions.
There is a good reason for this, as AI has become a key factor in XR applications and services. In the past, most XR content was created by developers and specialized service providers writing their own code. But this year, those same people highlighted how AI now empowers them to build more powerful XR solutions and speeds up the delivery of creative programs and services for their customers.
Another surprising part of the show was that many sessions stressed the importance of XR’s human impact. Over the years, most of the talk at tech shows has focused on technology with little thought about how it affects people. However, Jason McGuigan of Lenovo said in his main stage presentation that we as an industry have to be more aware of how XR can and will augment the human experience.
He pointed out that today, people see, feel, and taste their world through their five senses. XR will add dimensions to that experience by giving them new information and experiences that augment their current world. He noted that the concept of a cyborg, despite its highly negative connotations, really just describes technology enhancing a person’s real-world experience.
Another major takeaway from this year’s show was the strong focus on smart glasses. With the event’s pivot to XR, smart glasses have emerged as a central theme and will likely continue to shape the show in the near future. While the XR market was previously dominated by VR, there is now a clear surge in interest in smart glasses. This was evident at AWE, where at least 20 vendors showcased new smart glasses and dozens of sessions were dedicated to smart glass technology.
I was privileged to moderate a main stage panel on smart glasses and their future. I was joined by Ralph Jodice, GM of North America and Head of Partnerships & Publicity at Xreal; Kelly Ingham, VP of AR Devices at Meta; and Jason McGuigan, Head of Commercial VR at Lenovo. Having these top executives on the panel allowed us to explore where smart glasses are today and where they will be in the next two years.
These folks are authorities on this subject and play significant roles in their companies’ XR strategy and planning. All three agreed that in the next two years, we will see more exploration of new types of smart glasses with new styles and exciting features. The panelists also explored the current types of smart glasses that are driving demand today. Meta has led the consumer smart glasses revolution with its Ray-Ban Meta Wayfarer glasses, which have sold over two million units. These represent AI smart glasses, as Meta and others in this space have added AI-powered audio feedback to these glasses. The panel agreed that AI smart glasses will likely drive the strongest demand in this category over the next two years.
The second category of smart glasses that is developing is the kind Xreal has on the market. These use “birdbath lenses” and are optimized to deliver large-screen viewing experiences. Ralph Jodice explained that the newest Xreal Pro 2 now provides a 70-degree field of view and, when tethered to a device like a PC, smartphone, or mobile gaming device, lets you view that content on what appears to be a 100-200 inch screen through the glasses. They are optimized for watching movies, playing games, and working. These glasses are shipping now.
Mr. Jodice also stated that Xreal will support Android XR and release a new version called Project Aura in 2026. Xreal is also adding AI feedback to its new smart glasses. I also got to see Viture’s new smart glasses, which are in this same category. Both companies are making great strides in developing even better versions of their products.
Another significant player in this space is Snap, which has Snap Spectacles. Although very different from what Xreal and Viture are doing, Snap has created great smart glasses that are powerful for gaming and have all types of applications for consumers and businesses.
The third type of glasses we discussed, which has a longer development cycle, is smart glasses with a display built into the lenses so that a person can get visual feedback while wearing them. There were many great sessions on optical lenses and the challenges of getting them to work well, and from the ones I attended, it is clear that breakthroughs in optical technology are needed to get this right.
The panel felt that between late 2026 and 2028, we will see more smart glasses come to market in this category. If you are interested in smart glasses’ optical challenges, I suggest you visit Karl Guttag’s KGOnTech blog for a deeper understanding of this subject. He had the best session on this topic at AWE, and his grasp of this issue is impressive.
The panel also discussed a significant topic: the future of smart glasses and the OS war on the horizon.
Currently, Snap and Meta have two dedicated operating systems for their glasses. However, Google recently introduced Android XR, a new OS for smart glasses. With support from Samsung, Xreal, and others, this will become the third OS for smart glasses. We expect Apple to deliver its smart glasses and launch a fourth OS in the near future.
In April, I wrote a column on Face Computing that sets the tone for what I see as a coming OS battle. To date, we have had two major personal computing platforms: operating systems for PCs and Macs, and operating systems for smartphones with iOS and Android.
But I believe the next big computing market will be face computing, where our faces become the next significant way we deliver and work with information. We are now laying the groundwork for the next personal computing battle, in which an OS and a software ecosystem will develop and drive the concept of wearable computing in the future.
If history is our guide, we should see a huge push to get software developers to support one or two of these face computing OS platforms and start to build a significant ecosystem of apps and services for this type of wearable computer.
If I am right about face computing being the next big thing in personal computing, AWE could evolve to become the main show for the industry within this category. As my panel of experts believes, we should see some remarkable new types of smart glasses come to market in the next two to three years.
Disclosure: Qualcomm, Samsung, Lenovo, Google, Meta, and Apple subscribe to Creative Strategies research reports, along with many other high-tech companies around the world.