Super Reality

Augmented Reality (AR) and Virtual Reality (VR) Industry News and Technology Updates

UC San Diego Hosts a Distinctly Unconventional Event to Ignite a Future of Innovation

This is a guest blog post contributed by Rezin8 of San Diego, California.

University donors, philanthropic leaders, and esteemed pillars of the UC San Diego community gathered to kick off the public phase of an ambitious $2 billion comprehensive fundraising campaign, a project meant to drive innovation that advances society, propels economic growth, and makes the world a better place. The public research university collaborated with Rezin8, a paradigm-shifting creative lab, to give attendees a unique guided tour and presentation that served as a portal into an interactive reality. Driven by boundless curiosity and a foundational mandate to challenge traditional thinking, UCSD is pioneering breakthroughs in nanotechnology, climate science, machine learning, emerging arts, and much more, and champions of the university think it's just the beginning. UCSD spent the evening looking back at the achievements of its alumni in technology and medicine. More importantly, it took the time to ignite the future of innovation and advancement it wants to bring to fruition.

To bring UCSD's enthusiastic vision to life, Rezin8 designed and developed a unique experience built on interactive content coupled with augmented reality, virtual reality, futuristic presentations, performances, smart bracelets, and robotics; these were just a few of the innovative technologies deployed as part of a captivating and inspirational evening for all. The event flowed from an entrance tunnel into a cocktail area in the gymnasium. From the moment attendees entered the tunnel, each guest had their own personalized experience: the visual displays adapted to each person, and attendees were welcomed with their names digitally displayed. In the cocktail area, every attendee received a tablet that provided a variety of augmented reality special effects throughout the event. Human interaction was kept to a minimum; a robot, programmed and coded by Rezin8, traveled from guest to guest taking photos. As seen through the tablets, the augmented reality and digital content progressed with attendees as they moved through the tunnel, with effects such as rain and clouds changing depending on each attendee's time and location within the experience.

The event also included a futuristic live performance accompanying the presentation: Nathan East, considered one of the most recorded bass players in music history, performed as a digital projection. Rezin8, UC San Diego's creative partner for the event, is no stranger to high-tech digital performances. In 2012, the creative lab worked with Digital Domain to deliver one of the most legendary music performances in history, the Tupac hologram for Coachella.

The event celebrated the halfway point of the fundraising goal and served as a critical moment in the push to complete the $2 billion campaign. The evening was a fully automated, tech-fueled experience, manifested in the living spirit of innovation, the very heart of UCSD. Saturday's event was meant to inspire action from the ever-expanding community of innovators at UC San Diego and beyond: students, alumni, faculty, and philanthropists. Established in 1960, UC San Diego has become one of the world's 15 largest research universities.

USENS and #GDC17

As five days in sunny San Francisco came to a close, it was clear that virtual reality was all the buzz throughout the Game Developers Conference, especially with the first two days of the conference being completely dedicated to VR. VRDC brought VR tutorials, VR boot camps, and industry-relevant speakers into an immersive AR/VR experience at the Moscone Center. This was VRDC's second year, and it featured two tracks for attendees: VR and AR for game development, and VR and AR for other forms of entertainment, from CG movies to filmed experiences and beyond. VRDC's return for a second year at arguably the largest gaming conference in the world makes it clear that virtual, augmented, and mixed reality are the future, and that they represent some of the most important steps in developing games for users and moving the industry forward.

The following three days opened the exhibition floor to companies and organizations to show their relevance to the gaming industry and how their technologies and products can push things to the next level. Oculus brought a full array of systems and demos to GDC, sending an army of employees donning distinctive indigo shirts to recruit attendees to try its system with Oculus Touch controllers. PlayStation and its deep lineup of games let GDC-goers sample everything from Gran Turismo Sport in a sit-in pod with a PS4 Pro to the return of the infamous Crash Bandicoot of the early PlayStation days, and showcased its wide variety of PSVR games as well. However, while controllers still play a prevalent part in VR and gaming in general, hand tracking sparked the interest of industry professionals and the thousands at GDC.

uSens was a trending topic of conversation over the three days on the exhibition floor of the South Hall at the Moscone Center. We demonstrated our hand-tracking capabilities with several FINGO demos set up, and thousands of attendees flocked to the uSens booth (which happened to be close to the hall's primary restrooms for maximum foot traffic, very strategic 😉). Throughout GDC, uSens interacted in person and across social media channels with tech, AR/VR, and gaming influencers, journalists, and industry professionals, all raving about uSens and FINGO. Here are just a couple of examples:

@DennisScimeca: Seriously: If you’ve never tried hand tracking in VR and have time, stop by the @usensinc booth. Tech works exactly as advertised. #gdc17

@Alexis_Macklin: Enjoyed testing out @usensinc at #gdc17.

From verified Twitter users who promoted uSens to their thousands of followers to individuals posting purely out of passion for their interests, uSens and FINGO were trending topics coming out of GDC. uSens was also featured in an article by Gao Yun of CGTN highlighting FINGO and our tracking capabilities: San Jose, California-based uSens, founded by two Chinese developers, created a technology that uses a camera to recognize all the individual bones inside the hand, and then relays that information to the application. "Right now, people cannot interact directly in VR, but holding a controller is unnatural," said Fei Yue, co-founder and CTO of uSens, adding that they are now letting people do whatever they want to do in the real world. As VR moves further into mainstream society, technologists agree that the experience needs to become more natural and, ironically, more like everything in the real world.
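To picture what "relaying the hand skeleton to the application" looks like from a developer's point of view, here is a rough, hypothetical sketch. The types, joint indices, and function names are invented for illustration and are not taken from the actual FINGO SDK; the point is only that once per-joint positions are exposed, controller-free gestures reduce to simple geometry.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

// Hypothetical per-frame hand data, roughly what a camera-based tracker
// might report: a 3D position for each skeletal joint.
struct Vec3 { float x, y, z; };

struct HandFrame {
    // 21 joints is a common hand-skeleton layout (wrist plus four joints per finger).
    std::array<Vec3, 21> joints;
    bool tracked = false;
};

// Assumed joint indices for this sketch.
constexpr int THUMB_TIP = 4;
constexpr int INDEX_TIP = 8;

float distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Example application logic: treat a small thumb-to-index distance as a
// "pinch", the kind of gesture hand tracking enables without a controller.
bool isPinching(const HandFrame& hand, float thresholdMeters = 0.02f) {
    if (!hand.tracked) return false;
    return distance(hand.joints[THUMB_TIP], hand.joints[INDEX_TIP]) < thresholdMeters;
}

int main() {
    HandFrame frame;
    frame.tracked = true;
    frame.joints[THUMB_TIP] = {0.10f, 0.05f, 0.30f};
    frame.joints[INDEX_TIP] = {0.11f, 0.05f, 0.30f};  // tips 1 cm apart -> pinch
    std::printf("pinching: %s\n", isPinching(frame) ? "yes" : "no");
    return 0;
}
```

In a real integration the joint data would arrive from the tracker every frame; gestures such as pinching, grabbing, or pointing then become a matter of geometry on joint positions rather than button presses.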

The Future of Driverless Car and Augmented Reality

The first fleet of self-driving cars may be ready by 2020. Google has said that its car should be ready by 2020, and Tesla CEO Elon Musk says Teslas will be fully autonomous by 2018. These driverless cars still appear to be in the data-gathering stage. Since increased safety is one of the top goals for these future cars, they need to harvest great amounts of data in order to respond in the moment and react to potentially dangerous situations. One of the biggest worries about driverless car technology is that it may not yet be safe enough to use, so a lot more data still needs to be gathered to relieve this concern. Tesla's Model S already features some self-driving capabilities, known as "Autopilot," and it uses data and driving habits from many Tesla drivers in order to keep improving. Google's self-driving cars have been driving around and collecting data since 2009, so its software already knows how to react to various real-life scenarios.

While driverless cars are still gathering data, we may need another type of car technology to break through before we go fully autonomous. In a sense, GPS technology can be seen as a tool that has been priming consumers for driverless cars for a long time. Cars that feature augmented reality may be able to further prime consumers to buy driverless cars. PSA Peugeot Citroen, a French multinational manufacturer of automobiles, is working on one of the first cars to implement built-in augmented reality in the cockpit. The technology aims to increase driver safety and provide a more convenient driving experience. If augmented reality cars become more popular, they have great potential to make the transition to driverless cars smoother.

PSA Peugeot Citroen's work-in-progress automobile features a built-in transparent display, projected onto the windscreen, that allows the driver to view data and information without looking away from the road. The HUD is also able to adapt in real time to objects on the road. The display will show the usual driving information, such as current speed and speed limits (inferred from road signs), but it could also introduce new functions. Since the HUD can cover the entire width of the windshield, it could, for instance, clearly give the driver directions or highlight dangers with a prominent graphic warning.

The most important benefit of augmented reality driving is increased safety. Since useful driving information can be taken in extremely quickly, drivers should have better awareness of the road and are less likely to be taken by surprise. Moreover, their response times may be much faster since they will be keeping their eyes on the road; for instance, they will be able to look up directions while focusing on the road instead of looking down at their phones or GPS.

Since we may have to wait 5 to 10 years before autonomous cars take over the roads, the process of driving automation will have to be gradual. PSA Peugeot Citroen will be an integral part of the car industry's effort to make sure the jump is not so sudden, since its system retains manual, human control. The driving information displayed on the windscreen may help educate consumers and give them a better understanding of how driving aids and autonomous cars will work. There is no word yet on the release date of PSA Peugeot Citroen's augmented reality vehicle, but it is likely to be released before 2020.
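To make the windscreen HUD idea above a bit more concrete, here is a minimal, hypothetical sketch of the kind of decision logic such a display might run each frame: combine the vehicle's state with what the cameras currently see, and keep only what is worth projecting. The types and thresholds are invented for illustration and are not based on PSA Peugeot Citroen's actual system.

```cpp
#include <cstdio>
#include <optional>
#include <string>
#include <vector>

// Hypothetical inputs an AR HUD might receive from the car's perception stack.
struct DetectedSign { int speedLimitKph; };                    // a recognized speed-limit sign
struct DetectedObject { std::string label; float distanceMeters; };

struct HudOverlay {
    float currentSpeedKph;
    std::optional<int> speedLimitKph;   // shown only once a sign has been read
    std::vector<std::string> warnings;  // prominent graphics for nearby hazards
};

// Build the overlay for one frame: show speed, the last inferred speed limit,
// and a warning graphic for anything detected closer than 30 metres.
HudOverlay buildOverlay(float currentSpeedKph,
                        const std::optional<DetectedSign>& lastSign,
                        const std::vector<DetectedObject>& objects) {
    HudOverlay overlay{currentSpeedKph, std::nullopt, {}};
    if (lastSign) {
        overlay.speedLimitKph = lastSign->speedLimitKph;
    }
    for (const auto& obj : objects) {
        if (obj.distanceMeters < 30.0f) {
            overlay.warnings.push_back("CAUTION: " + obj.label + " ahead");
        }
    }
    return overlay;
}

int main() {
    auto overlay = buildOverlay(72.0f, DetectedSign{80},
                                {{"cyclist", 22.0f}, {"car", 55.0f}});
    std::printf("speed %.0f km/h, limit %d km/h, %zu warning(s)\n",
                overlay.currentSpeedKph, *overlay.speedLimitKph,
                overlay.warnings.size());
    return 0;
}
```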

Snap Inc. Spectacles: A Spectacle to be Worn

Over the last several weeks, Snap Inc. jumped head first into the fashion industry when it introduced its first pair of sunglasses to the public. Sold out of a vending machine near the Snap Inc. HQ in Venice Beach, CA, Spectacles drew customers from Southern California and beyond wanting to get their hands on a fresh, new pair of smart eyewear from the social media giant. After the initial release in Venice Beach, a Spectacles pop-up vending machine appeared up the coast of California, in Loma Point, CA (near Big Sur), letting interested Northern California customers hop on the Spectacles train. The "Snapbots" have since been showing up throughout the U.S., including Tallahassee (FL), Catoosa (OK), Catalina Island (CA), Honolulu (HI), and several other cities. Snap Inc. is selling the sunglasses for $129.99 at retail, but according to TechCrunch, "Snapchat staff on location are apparently telling people in line the vending machine in Big Sur won't be restocked once it sells out." TechCrunch also pointed to the long lines from Friday and eBay resale prices hitting 20x the original Spectacles selling price. Twenty times the original selling price. The question is: are the new Snapchat sunglasses even worth the initial price as a high-end, fashionable eyewear accessory, are they just a toy, or are they the future of augmented reality?

As far as high-tech glasses go, immediate thoughts turn to Google Glass. Starting at $1,500 for the public, Google Glass prided itself on being virtually a hands-free smartphone. Many critics dubbed Google's foray into wearable tech a flop: the massive price tag and marketing that set unrealistic expectations, among other variables, convinced the public that Google Glass was not worth the time or the money. Google Glass prided itself on features and the potential of endless possibilities, and that ultimately proved to be a massive letdown. Snap Inc.'s Spectacles, by contrast, pride themselves on having only three primary uses: recording 10-second videos, protection from the sun, and fun.

While their 2016 debut is aimed at entertainment, this could be Snap Inc.'s first step into the AR and VR realm. Snapchat recently purchased Israeli augmented reality startup Cimagine for an estimated $30 to 40 million. Cimagine has developed augmented reality technology that lets users see on the screens of their mobile devices how appliances and furniture would look in their own homes. In an article spotlighting Snapchat and Spectacles, Anita Balakrishnan of CNBC wrote: Already "the social media platform of our time," Snapchat could now own the means of both producing and distributing its content, said Julia Sourikoff, who heads VR and 360 for Tool of North America, an award-winning commercial production company with a rapidly growing virtual reality division. For brands, that could mean a not-too-distant future where consumers head out to stores to meet holograms of the trendy influencers who are already avid Snapchat users.

Spectacles aren't trying to completely revolutionize the high-tech, smart technology game right away. Snap is simply establishing itself as a player in the wearable tech and AR/VR space, and Snapchat is clearly the foundation on which all of Snap Inc.'s future products will be built.

Top 5 Mobile VR Apps That Would Benefit from Hand-Tracking

If you have never tried virtual reality before, you are likely to be introduced to Google Cardboard or Samsung Gear VR first. Almost every time I have watched someone experience VR for the first time on these devices, they tried to reach out and touch what they were seeing, but they couldn't, since Cardboard and Gear VR don't come with hand-tracking capabilities out of the box. Because of this, what VR needs most is hand-tracking, to help users retain their interest and immersion in the VR world. So I rounded up a list of the top 5 Gear VR and Cardboard apps that would benefit greatly from hand-tracking technology. (A head strap mount would be needed for Google Cardboard.)

1. Proton Pulse (Cardboard/GearVR) - Proton Pulse is a fun game with incredibly simple head-tracking controls. It is basically a 3D version of the classic game Breakout: the player controls the paddle simply by looking around. Hand-tracking could make the game even more impressive and natural, though. With hand-tracking, the paddle could be controlled with one hand in front of the VR headset, and an applied force mechanic could give players full control of how fast or slow they hit the ball based on their strike (see the sketch after this list).

2. Oculus Social Beta (GearVR) - Oculus Social Beta is in desperate need of a feature like hand-tracking. As of now, its users' avatars are nothing more than floating heads, which limits the overall experience because all a player can do is talk to other players and watch videos. Since this app is online only, it would be ground-breaking to be able to move your hands around in the real world and have your friends see your avatar move accordingly in the virtual world. Hand-tracking could introduce a whole range of new activities in Oculus' virtual world, such as card games or playing catch.

3. Sisters VR (Cardboard/GearVR) - Sisters is a VR horror story. It could benefit from hand-tracking by making the story more interactive. As it stands, the story always has the same ending, but with hand-tracking there could be more possibilities, making the story less linear and more reliant on the user's actions. This would ultimately make the app even scarier than it already is because of the added immersion. Even if the story retains its linearity, hand-tracking could be used to interact with the characters and objects around the room to trigger more scares.

4. BAMF (Cardboard) - BAMF is a platformer adventure game that only uses teleportation for movement. For a game about exploration, it is sure to benefit from hand-tracking. There could be interactive puzzles that you reach out and solve with your hands (i.e. activating switches and pushing buttons).

5. EndSpace VR (Cardboard/GearVR) - EndSpace VR is a solid space shooter with a vomit-inducing "turn head to steer" control method that is in need of a more hands-on piloting experience. An "air-steering" control method, where the user can see and interact with the yoke (steering) and other controls and buttons in the cockpit, could work out great for this game. Yes, the game can be played with a controller, but that diminishes its accessibility.
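Here is the sketch promised in item 1: a minimal, hypothetical take on an "applied force" mechanic, where the tracked hand's velocity is estimated by finite differences and used to scale the ball's rebound. All names and numbers are invented for illustration; nothing here comes from Proton Pulse or any particular SDK.

```cpp
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(const Vec3& v, float s) { return {v.x * s, v.y * s, v.z * s}; }

// Estimate the hand (and therefore paddle) velocity from two consecutive
// tracked positions, one frame apart.
Vec3 paddleVelocity(const Vec3& prevPos, const Vec3& currPos, float dt) {
    return (currPos - prevPos) * (1.0f / dt);
}

// On contact, let the paddle's velocity along the strike axis (+z here, toward
// the ball) add to the rebound speed: a hard strike sends the ball back
// faster, a gentle one slower.
float reboundSpeed(float incomingSpeed, const Vec3& paddleVel, float strikeGain = 0.8f) {
    float speed = incomingSpeed + strikeGain * paddleVel.z;
    return speed < 0.5f ? 0.5f : speed;  // never let the ball stall completely
}

int main() {
    Vec3 prev{0.0f, 0.0f, 0.26f};
    Vec3 curr{0.0f, 0.0f, 0.30f};                          // hand pushed 4 cm toward the ball
    Vec3 vel = paddleVelocity(prev, curr, 1.0f / 60.0f);   // 60 fps tracking
    std::printf("rebound speed: %.2f m/s\n", reboundSpeed(3.0f, vel));
    return 0;
}
```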
Even if these apps don't have hand-tracking (yet), I still suggest trying them out if you haven't; all five experiences show great potential for the future of VR. Here at uSens, one of our goals is to bring AR/VR hand-tracking into the rapidly growing mobile VR market. VR needs to be simple and intuitive to gain mass appeal with a mainstream audience. Samsung Gear VR and Google Cardboard do a great job of making mobile VR simple, but hand-tracking would be a great asset in making the experience more natural. -Amanze Ugoh

Follow uSens on Twitter/Facebook/Instagram

uSens Developer Challenge!

We’re super excited to announce the uSens Developer Challenge, an opportunity for AR/VR developers to use cutting-edge human-computer interaction tech to build amazing apps and compete for a total of $100,000 in cash prizes. Participants will have a chance to show off their submissions to industry leaders at conferences throughout the United States and China. You could also be in the running for a $50,000 grand prize. Original apps made for Samsung GearVR, Google Cardboard, HTC Vive, or Oculus are eligible! The vertical does not matter: games, educational, and industrial apps are all fine; the app just needs to be compatible with our SDK. (Please check out our SDK documentation here.)

Submissions will be evaluated based on:
Application of the Fingo SDK (40%)
Creativity (20%)
User Experience (20%)
Business Potential (10%)
Polish (10%)

Here's the process in a nutshell:
Submit your application on the contest page by October 15th
We will review applications and announce the 10 semifinalists by November 1st
Our panel of judges will review your submissions and select winners
We will announce the winners at GDC 2017 during 02/27-03/03

Prizes:
Grand Prize: $50,000
Second Place: $25,000
Third Place: $10,000

We are thrilled to see what our applicants will come up with! More details and the application page can be found on the uSens contest page. Apply now! Follow us on Twitter for updates: https://twitter.com/usensinc

uSens Fingo SDK Release Event

Fingo is now in open beta! On August 24, 2016, we hosted a launch event at The Village Event Space in San Francisco, where we announced the open beta of our software development kit (SDK) and pre-order availability for Fingo. We also launched our uDev developer network to help the developer community integrate our hand- and head-tracking technology with their AR/VR (augmented reality and virtual reality) projects. The event drew over 200 attendees, ranging from VR enthusiasts to VR developers and designers.

Fingo is a hand-tracking sensor module that supports mobile and tethered systems such as Samsung Gear VR, HTC Vive, Oculus Rift, and Google Cardboard. The module attaches to the front of head mounted displays (HMDs) to provide expanded tracking capabilities while reducing power consumption and processing load on mobile devices. During the event, we revealed that there will be three different versions of Fingo, each with different use cases. The entry-level Fingo allows standard hand-tracking and head position tracking to be added to mobile and tethered systems. The Color Fingo is very similar in design to the regular Fingo, but it adds inside-out position tracking: being inside-out capable means Color Fingo looks out at the space around it and uses the changing perspective of its surroundings to determine its own position, without external sensors (a simplified sketch of the idea is included at the end of this post). In addition, Color Fingo offers an augmented reality (AR) overlay and seamless transitions between AR and VR. The Power Fingo is the biggest and most powerful of the Fingo family; it includes the same capabilities as the Color Fingo but adds its own battery and Qualcomm Snapdragon processor. With Power Fingo, AR and VR can easily be enjoyed on the broadest range of mobile devices, including those found in emerging markets.

uSens CTO Dr. Yue Fei and uSens engineers performed several live demos with Fingo and an HTC Vive HMD during the event. One demo showed Fei interacting with toys and instruments inside a virtual bedroom. Near the end of the event, our guest panel spoke on the current and future status of AR/VR and human-computer interaction (HCI), followed by a Q&A session. The speakers included Jon Peddie, Samsung's Christopher Perl, SVVR founder Karl Krantz, CTO of ODG John Haddock, Gestigon's Stefan Bartschat, and uSens' Mark Morrison, and their immense insight seemed to have a profound impact on the audience. After the panel, representatives from uSens' backers, Fosun Capital, Fortune Capital, and IDG Ventures (China), also spoke to the audience about our partnership.

With Fingo, we here at uSens aim to create a more visceral experience with AR and VR. It is apparent that AR/VR without hand-tracking is not nearly as immersive as it could be. Hand-tracking makes it possible for users to actually reach out and interact with the virtual objects they are viewing, turning the user into an actual participant in the virtual world rather than a mere spectator. Stay tuned on our Twitter and at developers.usens.com for more Fingo-related updates. -Amanze Ugoh
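As promised above, here is a deliberately simplified, hypothetical sketch of what "inside-out" tracking means in practice: the device estimates small frame-to-frame motions from its own view of the world and composes them into a global pose, with no base stations or external markers. The 2D types and numbers below are invented for illustration and do not reflect Fingo's internals, which work in 3D with feature matching and sensor fusion.

```cpp
#include <cmath>
#include <cstdio>

// Accumulated pose of the headset in the room (2D simplification).
struct Pose2D {
    float x = 0.0f, y = 0.0f;   // position
    float heading = 0.0f;       // orientation in radians
};

// One frame's motion estimate, as the vision pipeline might report it.
struct FrameMotion {
    float forward;              // metres moved along the current heading
    float turn;                 // radians rotated this frame
};

// Compose one frame's estimated motion onto the accumulated pose.
Pose2D integrate(Pose2D pose, const FrameMotion& m) {
    pose.x += m.forward * std::cos(pose.heading);
    pose.y += m.forward * std::sin(pose.heading);
    pose.heading += m.turn;
    return pose;
}

int main() {
    Pose2D pose;
    // Pretend these per-frame motion estimates came from the cameras.
    const FrameMotion frames[] = {{0.02f, 0.00f}, {0.02f, 0.05f}, {0.03f, 0.05f}};
    for (const auto& m : frames) pose = integrate(pose, m);
    std::printf("estimated pose: x=%.3f m, y=%.3f m, heading=%.3f rad\n",
                pose.x, pose.y, pose.heading);
    return 0;
}
```

Real systems also correct the drift that accumulates from stacking small estimates, for example by recognizing previously seen parts of the scene; the sketch only shows the basic pose-composition step.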

VR at E3 2016

This year's Electronic Entertainment Expo (E3 2016) was virtual reality's opportunity to make its big break with a mainstream audience. Since companies like Oculus and HTC have already made their mark on the VR industry, E3's audience was mainly awaiting VR-related announcements from the three gaming giants: Sony, Microsoft, and Nintendo.

At its conference, Sony finally announced the official release date for PlayStation VR: PSVR will be released on October 13, 2016 and will retail for $399 USD. Sony is also partnering with Best Buy to offer PSVR demos in select locations. Alongside PSVR, Sony announced many software titles that will support it, including Resident Evil VII, Final Fantasy XV, and Farpoint, which uses a Sony peripheral called the PSVR Aim Controller. The special controller aims to make VR first-person shooter games more immersive.

Arguably the biggest announcement of E3 2016 was Microsoft revealing that its next VR- and 4K-compatible Xbox, codenamed Project Scorpio, will launch next year. This was a bold move considering that Microsoft also announced a slimmer version of the Xbox One called the Xbox One S. Project Scorpio's VR capabilities may be just as good as the HTC Vive or Oculus Rift, with a smooth 90 frames per second (FPS).

Surprisingly, Nintendo did not announce any upcoming VR technology. It's surprising because Nintendo brought motion controls to the forefront, and the natural next step would be to become a leader in virtual reality gaming as well. Nintendo did release a VR system back in 1995, but it was deemed a failure quickly after release; obviously, it was far too early to launch a VR-capable system at that time. Nintendo now plans to wait and watch the virtual reality market flourish before getting involved once again. Nintendo did announce that the highly anticipated mobile augmented reality game Pokemon GO will be launching sometime in July.

It's refreshing to see gaming begin to shift in a different direction with the implementation of VR and AR at this year's E3. Next year's E3 should be a lot more impressive, considering there will be further refinement and expertise in VR/AR content and technology. -Amanze Ugoh

Augmented World Expo (AWE 2016)

The Augmented World Expo (AWE 2016) was the 7th annual edition of the trade show, held at the Santa Clara Convention Center on June 1st and 2nd. The expo focused on augmented reality (AR), virtual reality (VR), and wearable technology. The event itself was very similar to the Silicon Valley Virtual Reality Expo (SVVR), but with a stronger focus on AR. The main purpose of AWE is to help developers, start-ups, and mobile/hardware companies cultivate new technology into a productive, sustainable, and entertaining medium. The expo was also very interactive, encouraging plenty of attendees to try out demos from exhibitors.

This year's AWE was the biggest one yet, featuring over 4,000 attendees from 47 countries, 200 speakers, and 200 exhibitors. We here at uSens Inc. were part of the exhibitor list, along with a balanced mix of large, medium, and small companies and startups such as Leap Motion, DAQRI, Intel, and Epson. The audience was comparable to SVVR 2016: developers, content creators, designers, entrepreneurs, and investors involved in AR, VR, and wearable technology. Our booth was extremely popular, and we enjoyed the audience's consistent enthusiasm for our AR/VR hand-tracking technology.

On June 1st, our CTO, Dr. Yue Fei, was one of the speakers at AWE 2016. He presented a case study on inside-out tracking and demonstrated our Fingo technology, which tracks the hand skeleton using infrared stereo cameras mounted on a mobile head mounted display (a toy sketch of the stereo geometry involved appears at the end of this post). The next day, our development team gave a developer tutorial and demonstration on AR/VR inside-out tracking. The session covered recommended hand-tracking use cases and examples for the uSens hardware and software development kit, and our developers demonstrated how to integrate our hand-tracking technology with Unity 3D, Java, or C++. At the end of AWE, attendees of our developer session were able to receive our Beta Dev Kits, which included the hardware and software needed to develop with uSens hand tracking.

AWE 2016 also featured the Auggie Awards, which showcase the best in AR technology. The 11 categories included best app, best hardware, and best game. We were nominated for Best Tool, but Vuforia by PTC ended up winning the award. One of my favorite new products that I demoed at AWE was the winner of "best headset or smart glasses," the Epson Moverio BT-300. These smart glasses, reminiscent of Google Glass, feature an OLED HD display, a 5-megapixel camera, and a clever design that can fit over regular glasses. The BT-300 is scheduled to ship in October this year. The complete list of the other winners at AWE 2016 can be found here.

The excitement at AWE was amazing this year, and it was one of the busiest tech expos we have been a part of. I did not expect this much popularity from an augmented reality focused event, given the massive virtual reality trend right now. Judging by this event, AR has a great chance of hitting the mainstream faster than VR. Below is a montage video that I filmed at AWE 2016. Enjoy! -Amanze Ugoh
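As promised above, here is a toy, hypothetical sketch of why hand trackers use stereo cameras: with two calibrated views, the depth of a point such as a fingertip follows from the disparity between the images. The pinhole model and numbers are made up for illustration and say nothing about Fingo's actual optics or algorithms.

```cpp
#include <cstdio>

// For a rectified stereo pair with focal length f (in pixels) and baseline b
// (in metres), a point seen at horizontal pixel positions xL and xR has
//   depth Z = f * b / (xL - xR).
float depthFromDisparity(float focalPx, float baselineM, float xLeftPx, float xRightPx) {
    const float disparity = xLeftPx - xRightPx;
    if (disparity <= 0.0f) return -1.0f;  // point not in front of both cameras
    return focalPx * baselineM / disparity;
}

int main() {
    // Assumed parameters: 500 px focal length, 6 cm between the two cameras.
    const float f = 500.0f, b = 0.06f;
    // A fingertip seen 40 px apart between the left and right images...
    const float z = depthFromDisparity(f, b, 320.0f, 280.0f);
    std::printf("fingertip depth: %.3f m\n", z);  // 500 * 0.06 / 40 = 0.75 m
    return 0;
}
```

Repeating this for every joint in both hands, every frame, is what turns a pair of infrared images into a tracked hand skeleton.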

Facebook F8 / Facebook’s Fate

Facebook F8 is an annual developer conference that took place on April 12 and 13 at Fort Mason in San Francisco, CA. The event was directed at developers and entrepreneurs who build products and services for the social network.

During the keynote, Zuckerberg mentioned that virtual reality could be the best form of social networking because of its strong communal immersion, and he illustrated the idea by showing off the Toybox demo for Oculus Touch. Facebook also presented an open source camera rig that it is calling the "Surround 360 Camera." The rig, built from 17 individual cameras, will be able to shoot 3D-360 video at 8K resolution and a smooth 60 frames per second. According to Chris Cox, Facebook's vice president of product, the company plans to share the hardware design and stitching algorithm on GitHub this summer (a rough sketch of the geometry behind such stitching appears at the end of this post). The Oculus team also demoed its VR selfie stick, which aims to make VR social: with the virtual stick, users will be able to take photos of their avatars in front of famous landmarks and share them to their Facebook friends' virtual mailboxes so they can be posted on the site.

After talking about VR, Zuckerberg shifted the topic to augmented reality. Surprisingly, it was the first time Zuckerberg revealed Facebook's interest in AR. The Facebook CEO said that we could use AR apps to replace physical objects, such as an AR TV set. Facebook also showed that it wants to combine both AR and VR in its devices, which is also what we are working on here at uSens Inc. Near the end of the conference, Zuckerberg presented a photo of a pair of black glasses. These glasses were only a glimpse into the future of Facebook tech, with Zuckerberg saying that VR/AR headsets will look like a regular pair of eyeglasses within 10 years. VR and AR were also featured at the end of Facebook's 10-year roadmap, along with artificial intelligence and connectivity.

Now that Facebook is officially on the AR train with Google and Microsoft, it should be interesting to see how this competition helps bring AR to new heights. -Amanze Ugoh
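As referenced above, here is a rough, hypothetical sketch of the geometric core of 360 stitching: every pixel of an equirectangular panorama corresponds to a viewing direction on the sphere, and the stitcher decides which physical camera on the rig best covers that direction (and blends overlapping views). Facebook's actual pipeline is far more involved, using optical flow to synthesize the stereo views; the types, camera layout, and numbers below are invented for illustration.

```cpp
#include <cmath>
#include <cstdio>

constexpr float kPi = 3.14159265f;

struct Vec3 { float x, y, z; };

// Map normalized panorama coordinates (u, v) in [0,1] to a unit direction:
// u spans longitude (-pi..pi), v spans latitude (pi/2..-pi/2).
Vec3 panoramaDirection(float u, float v) {
    const float lon = (u - 0.5f) * 2.0f * kPi;
    const float lat = (0.5f - v) * kPi;
    return {std::cos(lat) * std::sin(lon), std::sin(lat), std::cos(lat) * std::cos(lon)};
}

// Pick the camera whose optical axis is most aligned with the direction.
int bestCamera(const Vec3& dir, const Vec3* cameraAxes, int cameraCount) {
    int best = 0;
    float bestDot = -2.0f;
    for (int i = 0; i < cameraCount; ++i) {
        const Vec3& a = cameraAxes[i];
        const float dot = a.x * dir.x + a.y * dir.y + a.z * dir.z;
        if (dot > bestDot) { bestDot = dot; best = i; }
    }
    return best;
}

int main() {
    // Four side-facing cameras as a stand-in for a full 17-camera rig.
    const Vec3 axes[] = {{0, 0, 1}, {1, 0, 0}, {0, 0, -1}, {-1, 0, 0}};
    const Vec3 dir = panoramaDirection(0.70f, 0.5f);  // a pixel toward the right side
    std::printf("pixel maps to camera %d\n", bestCamera(dir, axes, 4));
    return 0;
}
```

A real stitcher would then sample and blend the chosen cameras' images for each output pixel, and repeat the process per eye to produce the stereoscopic (3D-360) result.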