At the Augmented World Expo (AWE) opening today in Silicon Valley, uSens, Inc. announced the winners of its U.S. regional uDev Challenge. A total of $200,000 in prizes was distributed among challenge entrants in the U.S. and China. These are the winners in the U.S.:

Grand Prize: Developer Chris Wren won $50,000 for his application MonsterShop. The application delivers the framework for a VR retail experience by allowing players to customize the appearance of monster avatars via drag-and-drop hand gestures. For example, players can drag different colored eyeballs and facial hair onto the monsters.

2nd Prize: Developer Wallace Lages won $25,000 for Krinkle, a mobile action game that enables hand-activated spells. Players use motion gestures to shoot fire from their palms and form water balls.

3rd Prize: Developer Tom Leahy won $10,000 for Embodied Labs, an interactive medical training tool that lets users experience macular degeneration first-hand.

In the Chinese regional competition, winning titles included the fighting games GestureMagic and Shadow Play, the self-explanatory Throwing Bomb Game, and the electronic circuit assembly experience VR Electronics Laboratory.

“We were impressed by the quality and creativity of the projects submitted by our semifinalists,” said Dr. Yue Fei, uSens CTO and co-founder. “Our winners integrated Fingo modules in imaginative ways and demonstrated the ability to make engaging, exciting use of our tracking technology to benefit end-users. I’m very pleased about the promise of future applications of our tracking tech.” Dr. Fei is speaking at AWE at 4:15 pm on June 2 on “4 Keys to Augmented Reality’s Future.”

Winning projects were selected from a pool of semifinalists and determined by a panel of judges including analyst Jon Peddie, Samsung Research America director Chris Peri, Silicon Valley Virtual Reality Meetup founder Karl Krantz, and uSens executives.
Project scores were based on several scoring categories, including use of the uSens Software Development Kit (SDK) for augmented and virtual reality applications. uSens’ SDK integrates the uSens Fingo module for 26DOF (degrees of freedom) hand tracking on mobile AR/VR platforms as well as PC-based VR systems. All uDev Challenge submissions were made for Samsung Gear VR™, Google Cardboard™, HTC Vive™, or Oculus Rift™. Semifinalists developed their applications using the uSens SDK and an entry-level uSens Fingo module to add 26DOF controller-free hand tracking. The uSens Fingo series of modules offers advanced 3D human-computer interaction capability for both mobile and tethered systems. By attaching a Fingo to the front of a head-mounted display (HMD), end users can enjoy their favorite apps without major sacrifices to system power consumption or performance – even on mobile.
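Controller-free interaction of the kind the winning apps demonstrate typically reduces to comparing distances between tracked hand joints against a threshold. The sketch below is illustrative only: the `Joint` type, field names, and 2 cm pinch threshold are hypothetical stand-ins, not the actual uSens SDK API.

```python
import math
from dataclasses import dataclass

@dataclass
class Joint:
    """One tracked hand joint, in meters (hypothetical structure)."""
    x: float
    y: float
    z: float

def distance(a: Joint, b: Joint) -> float:
    """Euclidean distance between two tracked joints."""
    return math.sqrt((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2)

def is_pinch(thumb_tip: Joint, index_tip: Joint, threshold: float = 0.02) -> bool:
    """Report a pinch when thumb and index fingertips come within ~2 cm."""
    return distance(thumb_tip, index_tip) < threshold

# Example frame: fingertips 1 cm apart -> pinch detected.
thumb = Joint(0.10, 0.00, 0.30)
index = Joint(0.10, 0.01, 0.30)
print(is_pinch(thumb, index))  # True
```

In a real integration, the per-frame joint positions would come from the tracking SDK rather than being constructed by hand; the gesture logic on top of them stays this simple.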
By Dr. Yue Fei, CTO and Cofounder of uSens

2017 will be a pivotal year for virtual and augmented reality, given the industry’s rather precarious position in the trough of disillusionment on the famed “hype cycle.” Industry predictions are starting to grow lofty again, with the most recent IDC study claiming that AR and VR headset shipments will approach 100 million in the next 5 years. VR/AR funding is at an all-time high. And the number of active users is forecast to reach 171 million by 2018. But does any of that mean the long dark night is over? Not just yet. We’re going to be disappointed if we expect the industry to continue growing unabated. The promise of AR/VR may be widely understood by the general public, but that doesn’t necessarily make it a mainstream technology. Right now, the industry is still plagued by user nausea, not to mention a lack of content and affordability issues. The knight in shining virtual armor, for the near term, will be smartphone-based VR platforms. With its greater ease of use, lower cost, and wider range of games and applications, mobile VR is the “gateway drug” that will give way to mass adoption. Eventually. There are a few big players in mobile VR already: Google, LG, and Lenovo. But given that there are approximately 2 billion active smartphones in the world capable of providing AR/VR content, we should expect more mobile makers to jump on the bandwagon and come out with their own devices and applications soon. Mobile VR presents a much lower cost of entry for consumers, and much less risk for manufacturers. Naysayers will point out that current smartphones just aren’t built to handle the intense computational load that VR requires, and they’re right. Most legacy mobile devices are still apt to kick into overdrive and overheat while running a sim, which puts a time limit on any VR fun.
But chip makers and smartphone manufacturers are already stepping up their game: 2017 is the year we’ll see VR-friendly devices that don’t bake as easily and are designed to handle the additional processing. Even with state-of-the-art mobile devices distributed by the millions across the world, if there’s no content to run on them, VR won’t catch on. Some developers have taken the plunge and created crowd favorites (à la Pokémon Go), but we still haven’t seen the “killer app.” At this point, I don’t know that there will be one. We might just see content growth via user engagement campaigns from healthcare, social, and enterprise applications. Consumers often wait until “the timing is finally right” to adopt a new technology. Such timing is rarely tied directly to one event, but rather to a collection of events that eventually accumulates enough weight to tip the scale. VR content in myriad forms will be required to supply that substance. Meeting (and then exceeding) consumer expectations will depend heavily on the ways users can immerse themselves in the content and interact with it. Which leads me to another hurdle: there are some basic human-computer interaction problems that the industry must address this year. The ability to employ familiar interfaces — seamlessly scroll, tap, or type to navigate your way through a virtual realm — would make a world of difference. Until recently, natural gesture and position tracking have been a challenge in AR/VR, but the industry is rising to meet it: HTC has years of investment in external trackers, and the Oculus Rift will eventually deploy hands-free control. There is nothing yet on the market that successfully combines mobile 3D hands-free tracking AND robust head position tracking in AR/VR, but we’re working on it. Inside the AR/VR echo chamber, it’s easy to think that the public is ready to fully embrace our technology. But it’s still early days.
Virtual reality is not for everyone, and there are going to be many more unforeseen obstacles to overcome as the sector matures. For those of us founding this entirely new industry, a clear focus on known issues as we pass beyond hype and into certainty will sustain us through 2017 and beyond.
This is a guest blog post contributed by Rezin8 of San Diego, California.

University donors, philanthropic leaders, and esteemed pillars of the UC San Diego community gathered to kick off the public phase of an ambitious $2 billion comprehensive fundraising campaign. The campaign will drive innovation that advances society, propels economic growth, and makes the world a better place. The public research university collaborated with Rezin8, a paradigm-shifting creative lab, to present attendees with a unique guided tour and presentation that served as a portal to an interactive reality. Driven by boundless curiosity and a foundational mandate to challenge traditional thinking, UCSD is pioneering breakthroughs in nanotechnology, climate science, machine learning, emerging arts, and much more. Champions of the university think it’s just the beginning. UCSD spent the evening looking back at the achievements its alumni have made in technology and medicine. More importantly, it took the time to ignite the future of innovation and advancement it wants to bring to fruition. To bring UCSD’s enthusiastic vision to reality, Rezin8 designed and developed a unique experience leveraging interactive content coupled with augmented reality, virtual reality, futuristic presentations, performances, smart bracelets, and robotics — just a few of the innovative technologies deployed as part of what was a captivating and inspirational evening for all. The event transitioned from a tunnel to a cocktail area in the gymnasium. From the entrance of the tunnel, attendees had their own unique reality experience: the visual displays were customized to each guest, and attendees were welcomed with their names digitally displayed. To augment reality, a tablet was distributed to each attendee in the cocktail area. The tablet provided a variety of special effects throughout the event.
Human interactions were minimized for this special event. A robot, programmed and coded by Rezin8, traversed from guest to guest to take photos. As seen through the tablet, the augmented reality and digital content progressed with attendees as they moved through the tunnel. Special effects like rain and clouds were part of the experience and changed depending on the attendee’s time and location within the experience. The event also included a futuristic live performance alongside the presentation: Nathan East, considered one of the most recorded bass players in music history, performed as a digital projection. Rezin8, UC San Diego’s creative partner for the event, is no stranger to high-tech digital performances. In 2012, the creative lab worked with Digital Domain to deliver one of the most legendary music performances in history – the Tupac hologram for Coachella. The event celebrated the halfway point of the fundraising goal and served as a critical moment of support for the completion of the $2 billion campaign. The evening was a fully automated, tech-fueled experience, manifested in the living spirit of innovation – the very heart of UCSD. Saturday’s event inspired action from the ever-expanding community of innovators at UC San Diego and beyond — students, alumni, faculty, and philanthropists. Established in 1960, UC San Diego has become one of the world’s 15 largest research universities.
The first fleet of self-driving cars may be ready by 2020. Google has said that its car should be ready by 2020, and Tesla CEO Elon Musk says Tesla’s vehicles will be fully autonomous by 2018. It appears that these driverless cars are still in the data-gathering stage. Since increased safety is one of the top goals for these future cars, it is important for them to gather great amounts of data so they can respond in the moment and react to potentially dangerous situations. One of the biggest worries about driverless car technology is that the cars may not be safe enough to use, so a lot more data still needs to be gathered to help relieve this concern. Tesla’s Model S already features some self-driving capabilities, known as “Autopilot,” and it uses data and driving habits from many Tesla drivers in order to keep improving. Google’s self-driving cars have been driving around and collecting data since 2009, so its software already knows how to react to various real-life scenarios. As driverless cars are still gathering data, we may need another type of car technology to break through before we go fully autonomous. In a sense, GPS technology could be seen as a tool that has been priming consumers for driverless cars for a long time. Cars that feature augmented reality may be able to further prime consumers into buying driverless cars. PSA Peugeot Citroen, a French multinational manufacturer of automobiles, is working on one of the first cars to implement built-in augmented reality in the cockpit. This technology aims to increase driver safety for a more convenient driving experience. If augmented reality cars become more popular, they have great potential to make the transition to driverless cars smoother. PSA Peugeot Citroen’s work-in-progress automobile features a built-in transparent display, projected onto the windscreen, that allows the driver to view data and information without looking away from the road.
The HUD is also able to adapt in real time to objects on the road. The display will feature the usual driving information, such as current speed and speed limits (inferred from road signs), but it could also introduce new functions. Since the HUD is capable of covering the entire width of the windshield, it could, for instance, clearly give the driver directions or highlight dangers with a prominent graphic warning. The most important benefit of augmented reality driving is increased safety. Since useful driving information can be seen extremely quickly, drivers are sure to have better awareness of the road and are less likely to be taken by surprise. Moreover, their response times may be much faster since they will be keeping their eyes on the road. For instance, they will be able to look up directions while focusing their eyes on the road instead of looking down at their phones or GPS. Since we may have to wait 5-10 years before we see autonomous cars take over the roads, the process of driving automation will have to be gradual. PSA Peugeot Citroen will be an integral member of the car industry in making sure that the jump is not so sudden, since its system retains manual/human control. The driving information displayed on the windscreen may help educate consumers and give them a better understanding of how driving aids and autonomous cars will work. There is no word yet on the release date of PSA Peugeot Citroen’s augmented reality vehicle, but it is likely to be released before 2020.
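To place a warning graphic over a hazard, a windshield HUD has to map the hazard’s 3D position (from the car’s sensors) to a 2D point on the display. Here is a minimal pinhole-projection sketch of that mapping; the focal length and screen dimensions are illustrative assumptions, not details of PSA Peugeot Citroen’s system.

```python
def project_to_hud(x: float, y: float, z: float,
                   focal: float = 800.0, cx: float = 640.0, cy: float = 360.0):
    """Project a point (x right, y up, z forward, in meters) in the driver's
    viewpoint frame onto a 1280x720 HUD plane using a pinhole camera model."""
    if z <= 0:
        raise ValueError("point must be in front of the driver")
    u = cx + focal * x / z
    v = cy - focal * y / z  # screen y grows downward
    return u, v

# A pedestrian 20 m ahead, 2 m to the right, 1 m below eye level:
u, v = project_to_hud(2.0, -1.0, 20.0)
print(round(u), round(v))  # 720 400
```

Note how the x/z and y/z terms shrink as the hazard gets farther away, which is what keeps the overlay anchored to the object as it approaches.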
Over the last several weeks, Snap Inc. jumped headfirst into the fashion industry when it introduced its first pair of sunglasses to the public. Sold from a vending machine near the Snap Inc. HQ in Venice Beach, CA, Spectacles drew customers from Southern California and beyond wanting a fresh, new pair of smart eyewear from the social media giant. After the initial release of the glasses in Venice Beach, a Spectacles pop-up vending machine appeared up the coast of California, in Loma Point, CA (near Big Sur), allowing interested Northern California customers to hop on the Spectacles train. The “Snapbots” have now been showing up throughout the U.S., including Tallahassee (FL), Catoosa (OK), Catalina Island (CA), Honolulu (HI), and several other cities. Snap Inc. is selling the sunglasses for $129.99 at retail, but according to TechCrunch, “Snapchat staff on location are apparently telling people in line the vending machine in Big Sur won’t be restocked once it sells out. The long lines from Friday, along with high selling prices on eBay that are hitting 20x the original Spectacles selling price.” Twenty times the original selling price…the question is: are the new Snapchat sunglasses even worth the initial price as a high-end, fashionable eyewear accessory, are they just a toy, or are they the future of augmented reality? As far as high-tech glasses go, immediate thoughts go to Google Glass. Starting at $1,500 for the public, Google Glass prided itself on being virtually a hands-free smartphone. Many critics dubbed Google’s foray into wearable tech a flop. The massive price tag and marketing strategies that set unrealistic expectations, among other variables, made Google Glass not worth the time or the money to the public. Google Glass prided itself on features and the potential of endless possibilities, and that ultimately proved to be a massive letdown.
Snap Inc.’s Spectacles seem to pride themselves on having only three primary uses: recording 10-second videos, protection from the sun, and fun. While their 2016 debut is made for entertainment, this could be Snap Inc.’s first step into the AR and VR realm. Snapchat recently purchased Israeli augmented reality startup Cimagine for an estimated $30 – 40 million. Cimagine has developed augmented reality technology that allows users to see on the screens of their mobile devices how appliances and furniture would look in their homes. In an article spotlighting Snapchat and Spectacles, Anita Balakrishnan of CNBC wrote: Already “the social media platform of our time,” Snapchat could now own the means of both producing and distributing its content, said Julia Sourikoff, who heads VR and 360 for Tool of North America, an award-winning commercial production company with a rapidly growing virtual reality division. For brands, that could mean a not-too-distant future where consumers could head out to stores to meet holograms of the trendy influencers who are already avid Snapchat users. Spectacles aren’t trying to completely revolutionize the high-tech, smart technology game right away. They are simply establishing that Snap Inc. is becoming a player in wearable tech and the AR/VR space. Snapchat is clearly going to be the foundation on which all of Snap Inc.’s future products are built.
uSens Inc. was featured in USA Today on Wednesday, after a successful showing at CES Unveiled in Las Vegas. In “CES 2017: The coolest tech you have to see”, USA Today wrote: “An attendee demonstrated Fingo, a device added to virtual reality goggles to incorporate hand gestures.” Check out the article and visit uSens Inc. at CES 2017 this week!
After CES Unveiled in Las Vegas on Tuesday night, Janko Roettgers, senior Silicon Valley correspondent for Variety, featured uSens Inc. Roettgers tweeted: “The Fingo VR hand tracking module is one of the cooler things I got to see at CES Unveiled.” Check out the article and visit uSens Inc. at CES 2017 this week!
Facebook F8 is an annual conference that took place on April 12 and 13 at Fort Mason in San Francisco, CA. The event was directed at developers and entrepreneurs who build products and services for the social network. During the keynote, Zuckerberg mentioned that virtual reality could be the best form of social networking because of its strong communal immersion. He illustrated this idea by showing off the Toybox demo for Oculus Touch. Facebook also presented an open source camera rig that it is calling the “Surround 360 Camera”. The rig, made up of 17 cameras, will be able to shoot 3D 360-degree 8K video at a smooth 60 frames per second. According to Chris Cox, Facebook’s vice president of product, the company plans to share the hardware design and stitching algorithm on GitHub this summer. The Oculus team also demoed its VR selfie stick, which aims to make VR social. With the virtual stick, users will be able to take photos of their avatars in front of famous landmarks. Once the photo is taken, the user can share it to their Facebook friends’ virtual mailboxes so they can post it on the site. After talking about VR, Zuckerberg shifted the topic to augmented reality. Surprisingly, it was the first time that Zuckerberg revealed Facebook’s interest in AR. The Facebook CEO said that we could use AR apps to replace physical objects, such as an AR TV set. Facebook also showed that it wants to combine both AR and VR into its devices, which is also what we are working on here at uSens Inc. Near the end of the conference, Zuckerberg presented a photo of a pair of black glasses. These glasses were only a glimpse into the future of Facebook tech, with Zuckerberg saying that VR/AR headsets will look like a regular pair of eyeglasses in 10 years. VR and AR were also featured at the end of Facebook’s 10-year road map, along with artificial intelligence and connectivity.
Now that Facebook is officially on the AR train with Google and Microsoft, it should be interesting to see how this competition will help bring AR to new heights. -Amanze Ugoh
The 2016 Vision Summit conference took place on February 10 and 11, 2016 in Los Angeles. It was hosted by Unity 3D, maker of the wildly popular middleware engine. Above, John Riccitiello, Unity’s CEO, welcomes the elated standing-room-only crowd of global attendees to VS16. This special two-day, VIP, developer-focused event may have ushered in the official sea change for 3D development moving towards 4D and beyond. The conference gathered together 1,400+ hardcore Unity 3D, digital, and interactive content and business developers focused on the business of AR and VR. The energy was unfiltered AR/VR passion for 48 hours straight. The keynote presentation consolidated impressive influencers in the contemporary AR and VR landscapes. Executives from Google, Sony, Oculus, Valve/Steam, NASA, and Hollywood, as well as Unity’s CTO and VR Skunkworks team, showed off their current AR and VR efforts. Attendees learned curious and interesting facts in the keynote from people like Alex McDowell, who explained how the Minority Report movie storyline and world came from AR technology visionaries rather than traditional script writers. Director Jeff Norris, who leads much of NASA’s virtual outreach program from JPL in Pasadena, took the Vision Summit keynote crowd to a virtual Mars, where we viewed a glimpse of our real world with ever-changing virtual content. Jeff’s team and all of NASA’s virtual content developers are actively using VR and AR tools for almost every part of their outreach these days. And, of course, Unity 3D is there in the hands of these developers, allowing for rapid design and output of 3D interactive VR and AR content to take us all into space. More excellent keynote content was provided by Unity’s CTO, Joachim, and other lead engineers, who demoed the new Unity 3D v5.4 capabilities that optimize and streamline rendering as well as enable in-engine VR editing, such as below.
Additional keynote highlights came separately from Gabe Newell (Valve), followed by Palmer Luckey (Oculus), each with their own gifts for the crowd. Gabe had his self-anointed “Oprah” moment, gifting every attendee a free commercial HTC Vive unit. Oculus followed at the end with a generous gift of months of free Unity Pro subscriptions for all attendees. Needless to say, the Vive chatter never stopped until a few days ago, when we all got our shipment notifications. The keynote set a high benchmark for the entire two days of panels, presentations, networking, and parties. It was unarguably the epicenter of AR and VR development and education for those awesome days. The hosts and planning team ensured there were thoughtful and well-planned tracks focusing on specific AR/VR production tools and workflow, business and investment advice from related domain experts, and, most importantly, opportunities to share with each other in a very well-orchestrated professional social environment. The Expo hall and demo showcase halls deserve special attention too. These areas included many of the major players, from Google with Tango and Epson with their AR glasses to indies showing off beautiful VR worlds rendered directly from Houdini and other high-end art and audio tools. Also included were independent developers who came from Denmark, Israel, and farther afield to impress us with their content and technology demos. Below are some examples from the expo and demo areas. VR Unicorns have built a really fun and compelling immersive tennis experience for the HTC Vive system. Shape Space VR had a mesmerizing visual mobile VR experience. Altspace VR is now providing multiplayer social media VR experiences. Vizuality Studios combines physical peripherals with immersive VR experiences that feel “real”.
The entire Unity team, the VS advisory board, all participating companies and all of the developers who attended should be commended for the incredible energy that flowed for two days non-stop. The location was convenient, there were tons of drink stations, great food, and a really respectful atmosphere to interact in. Can’t wait until next year! -Mark T. Morrison
We recently brought Impression Pi’s ‘Moon Walk’ demo to the Silicon Valley Technology Innovation & Entrepreneurship Forum (SVIEF) 2015. SVIEF 2015 had over 100 high-profile speakers, including Ashton Kutcher. The virtual reality technology from Impression Pi transports the user to the surface of Earth’s moon. Impression Pi’s gesture recognition capabilities also allow the user to ‘touch’ each planet in our solar system and view its information. We were happy to showcase our technology to the attendees of SVIEF 2015, and were ecstatic to be given the opportunity to have children interact with it as well. We had a great time at SVIEF 2015 and enjoyed having a diverse audience interact with our technology! For more about uSens, Inc. and Impression Pi, please visit our other sites:

Twitter: @usensinc
Facebook: uSens, Inc. – Impression Pi
Google+: uSens, Inc. – Impression Pi
LinkedIn: uSens, Inc. Company Profile