We’re super excited to announce the uSens Developer Challenge, an opportunity for AR/VR developers to use cutting-edge human-computer interaction tech to build amazing apps and compete for a total of $100,000 in cash prizes.
Participants will have a chance to show off their submissions to industry leaders at conferences throughout the United States and China, and could even be in the running for the $50,000 grand prize.
Original apps made for Samsung Gear VR, Google Cardboard, HTC Vive, or Oculus are eligible! Any vertical is welcome: games, education, and industrial applications are all fine, as long as the app is compatible with our SDK. (Please check out our SDK documentation here.)
Fingo is now in open beta! On August 24, 2016, we hosted a launch event at The Village Event Space in San Francisco, where we announced the open beta of our software development kit (SDK) and pre-order availability for Fingo. We also launched our uDev developer network to help the developer community integrate our hand- and head-tracking technology with their AR/VR (augmented reality and virtual reality) projects. The event drew over 200 attendees, ranging from VR enthusiasts to VR developers and designers.
Fingo is a hand-tracking sensor module that supports mobile and tethered systems such as the Samsung Gear VR, HTC Vive, Oculus Rift, and Google Cardboard. The module attaches to the front of a head-mounted display (HMD) to provide expanded tracking capabilities while reducing power consumption and processing load on mobile devices.
During the event, we revealed that there will be three versions of Fingo, each with different use cases. The entry-level Fingo brings standard hand tracking and head position tracking to mobile and tethered systems.
The Color Fingo is very similar in design to the regular Fingo, but it adds inside-out position tracking: the device looks out at the space around it and uses the changing perspective of its surroundings to determine its own position. Color Fingo also supports an augmented reality (AR) overlay and seamless transitions between AR and VR.
The Power Fingo is the biggest and most powerful of the Fingo family. It includes the same capabilities as the Color Fingo but adds its own battery and Qualcomm Snapdragon processor. With Power Fingo, AR and VR can easily be enjoyed on the broadest range of mobile devices, including those found in emerging markets.
uSens CTO Dr. Yue Fei and uSens engineers performed several live demos with Fingo and an HTC Vive HMD during the event. One demo showed Fei interacting with toys and instruments inside a virtual bedroom.
Near the end of the event, our guest panel spoke on the current and future state of AR/VR and human-computer interaction (HCI), followed by a Q&A session. The speakers included Jon Peddie, Samsung's Christopher Perl, SVVR founder Karl Krantz, ODG CTO John Haddock, Gestigon's Stefan Bartschat, and uSens' Mark Morrison, and their insights clearly resonated with the audience. After the panel, representatives from uSens' backers Fosun Capital, Fortune Capital, and IDG Ventures (China) also spoke about our partnership.
With Fingo, we here at uSens Inc. aim to create a more visceral experience with AR and VR. It is apparent that AR/VR without hand-tracking is not nearly as immersive as it could be. Hand-tracking makes it possible for users to actually reach out and interact with the virtual objects that they are viewing, turning the user into an actual participant of the virtual world rather than a mere spectator.
Pokemon GO has been out for a little over a month now and the hype is still real.
Aside from the car accidents and robberies, one of the main issues with Pokemon GO right now is battery drain, since the app has to stay open at all times during gameplay. Another huge issue is safety: some players are so glued to their phones that they become completely unaware of their surroundings, leading to serious injuries and even deaths. This happens because the game requires the app to be open in order to catch Pokemon and be alerted when they are near you. The upcoming peripheral, “Pokemon GO Plus”, plans to change this and solve the many problems that GO players have come across.
Pokemon GO Plus, arriving in September, was originally scheduled for release in July but was pushed back so that Niantic could focus on polishing the app experience. The device pairs with smartphones via Bluetooth. Its most interesting feature is that players will be able to play Pokemon GO without using their phones: the device blinks and vibrates when Pokemon and PokeStops are near, and players can tap its button to catch Pokemon (of a kind they have already caught) and swipe to grab items from PokeStops.
When the Plus device releases, it may reduce the risks of playing Pokemon GO. The recent tragedies related to the game likely happened because players' eyes were glued to their phones. The vibration and interactive features of the peripheral should free players from staring at their screens, meaning they can play while remaining fully aware of their surroundings: a quick glance, tap, or swipe at their wrist is all they need. In fact, getting players to look away from their phones seems to be Niantic's ultimate goal. Niantic CEO John Hanke has even envisioned contact lenses for Pokemon GO to make the game more realistic and immersive. Technology is not at that point yet, but Pokemon GO Plus looks to be a nice alternative for the time being.
Pokemon GO Plus is sure to make playing Pokemon GO a more natural experience for exploration, and hopefully it will reduce the number of accidents and misfortunes as well. Its watch-like appearance and functionality may even disrupt the wearable/fitness device market, which includes the Apple Watch and Fitbit, since many players have reported that the game has helped them lose weight. If the device sells well, upgraded versions with displays and even more gameplay possibilities may follow.
Pokemon GO is a location-based augmented reality mobile game developed by Niantic Labs. The objective of the game is to capture and train the animal-like creatures known as Pokemon. Players travel the game's overworld, a Google Maps-like 2-D representation of the real world's landscape, with their customized GPS-tracked avatars. One of the greatest aspects of the game is that players can find different types of Pokemon depending on their surroundings: water-type Pokemon usually appear near the ocean, for instance, and ghost-type Pokemon normally appear near cemeteries. This incentivizes players to roam around their environment and makes the game extremely social. In fact, there will be a “Pokemon GO Crawl” in San Francisco on July 20th, and an astounding 27,000 Pokemon GO players may show up to the event.
Although many have claimed that Pokemon GO has made AR mainstream, the game only exemplifies a fraction of what true AR has to offer. During encounters, players cannot get a full 360-degree view of Pokemon, the Pokemon appear to float in mid-air most of the time, and players cannot see the real relative sizes of different Pokemon. More advanced AR technology will be able to fix all of those faults and create a more believable AR experience. That said, this version of Pokemon GO is a great start that shows a ton of opportunity and potential. Huge core elements of the established Pokemon games, such as trading and battling Pokemon with other players, have yet to be added. Niantic CEO John Hanke has confirmed that these features are coming, and he has also hinted that Pokemon GO may support VR with Google Cardboard and more advanced AR with Microsoft HoloLens in the future.
Many people have been taking advantage of Pokemon GO's use of AR. For instance, some businesses only allow paying customers to catch Pokemon and capture PokeStops within their premises, and some people have even used the game to commit robberies. Players should heed the warning on the game's loading screen and stay alert and aware of their surroundings to avoid harm. If anything, though, the negative press has only fueled Pokemon GO's virality.
It will be extremely exciting to see this game evolve. Niantic seems to be working hard to provide steady, progressive updates that are sure to delight the mass market of Pokemon GO players. The game is fantastic for the future of AR software because it gives the mainstream audience a peek at what AR can do and provides a decent reference point. Perhaps our hand tracking could even be implemented into the game so that players can interact with their Pokemon like virtual pets (petting, feeding, and so on). Judging by the continued success of Pokemon GO, the future of AR is sure to be a lot more social and practical than VR.
This year's Electronic Entertainment Expo (E3 2016) was virtual reality's opportunity to make its big break with the mainstream audience. Since the Oculus Rift and HTC Vive have already made their mark on the VR industry, E3's audience was mainly awaiting VR-related announcements from the three gaming giants: Sony, Microsoft, and Nintendo.
At its conference, Sony finally announced the official release date for PlayStation VR: PSVR will be released on October 13, 2016 and retail for $399 USD. Sony is also partnering with Best Buy to offer PSVR demos in select locations. Alongside PSVR, Sony announced many software titles that will support it, including Resident Evil VII, Final Fantasy XV, and Farpoint, which utilizes a Sony peripheral called the PSVR Aim Controller. The special controller aims to make first-person shooters in VR more immersive.
Arguably the biggest announcement of E3 2016 was Microsoft's reveal that its next VR- and 4K-capable Xbox, codenamed Project Scorpio, will launch next year. This was a bold move considering that Microsoft also announced a slimmer version of the Xbox One called the Xbox One S. Project Scorpio's VR capabilities may match the HTC Vive and Oculus Rift, targeting a smooth 90 frames per second (FPS).
Surprisingly, Nintendo did not announce any upcoming VR technology. It's surprising because Nintendo brought motion controls to the forefront, and the natural next step would be to become a leader in virtual reality gaming as well. Nintendo did release a VR system, the Virtual Boy, in 1995, but it was deemed a failure soon after release; clearly, it was far too early to launch a VR-capable system at that time. Nintendo now plans to wait and watch the virtual reality market mature before getting involved again.
Nintendo did announce that their highly anticipated mobile augmented reality game, Pokemon GO, will be launching sometime in July.
It’s refreshing to see gaming begin to shift in a different direction with the implementation of VR and AR at this year’s E3. Next year’s E3 should be a lot more impressive considering that there will be further refinement and expertise in VR/AR content and technology.
The Augmented World Expo (AWE 2016) was the 7th annual edition of the trade show, held at the Santa Clara Convention Center on June 1st and 2nd. The expo focused on augmented reality (AR), virtual reality (VR), and wearable technology. The event was very similar to the Silicon Valley Virtual Reality Expo (SVVR) but more focused on AR. The main purpose of AWE is to help developers, start-ups, and mobile/hardware companies cultivate new technology into a productive, sustainable, and entertaining medium. The expo was also very interactive, with plenty of attendees encouraged to try out demos from exhibitors.
This year's AWE was the biggest one yet, featuring over 4,000 attendees from 47 countries, 200 speakers, and 200 exhibitors. We here at uSens Inc. were on the exhibitor list along with a balanced mix of large, medium, and small companies and startups such as Leap Motion, DAQRI, Intel, and Epson. The audience was comparable to SVVR 2016: developers, content creators, designers, entrepreneurs, and investors involved in AR, VR, and wearable technology. Our booth was extremely popular, and we enjoyed the audience's consistent enthusiasm for our AR/VR hand-tracking technology.
On June 1st, our CTO, Dr. Yue Fei, spoke at AWE 2016, presenting a case study on inside-out tracking and demonstrating our Fingo technology, which tracks the hand skeleton using infrared stereo cameras on a mobile head-mounted display. The next day, our development team gave a developer tutorial and demonstration of AR/VR inside-out tracking. The session covered recommended hand-tracking use cases and examples for the uSens hardware and software development kit, and our developers showed how to integrate our hand-tracking technology with Unity 3D, Java, or C++. At the end of AWE, attendees of our developer session received our Beta Dev Kits, which included the hardware and software needed to develop with uSens hand tracking.
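Our SDK's actual API isn't reproduced here, but most hand-tracking integrations boil down to the same idea: read a per-frame hand skeleton from the sensor and derive gestures from joint positions. Below is a minimal, hypothetical Python sketch of that pattern (the real uSens SDK targets Unity 3D, Java, and C++; the frame layout and all names here are invented for illustration):

```python
import math

# Hypothetical joint data for one tracked frame: each joint is an
# (x, y, z) position in meters relative to the sensor. A real SDK
# would deliver a frame like this on every render tick.
FRAME = {
    "thumb_tip": (0.010, 0.020, 0.300),
    "index_tip": (0.015, 0.022, 0.305),
    "palm":      (0.000, 0.000, 0.320),
}

PINCH_THRESHOLD_M = 0.02  # fingertips within 2 cm count as a pinch

def distance(a, b):
    """Euclidean distance between two 3-D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def is_pinching(frame):
    """Detect a pinch gesture: thumb tip and index tip nearly touching."""
    return distance(frame["thumb_tip"], frame["index_tip"]) < PINCH_THRESHOLD_M

if __name__ == "__main__":
    print("pinch" if is_pinching(FRAME) else "open")
```

In a real integration, the app would poll a frame like this from the sensor every tick and feed the recognized gestures (pinch, grab, point) into its interaction logic, e.g. picking up a virtual object while the pinch is held.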
AWE 2016 also featured the Auggie Awards, which showcase the best in AR technology. The 11 categories ranged from best app and best hardware to best game. We were nominated for Best Tool, but Vuforia by PTC ended up winning the award. One of my favorite new products that I demoed at AWE was the winner of best headset or smart glasses, the Epson Moverio BT-300. These smart glasses, reminiscent of Google Glass, feature an OLED HD display, a 5-megapixel camera, and a clever design that fits over regular glasses. The BT-300 is scheduled to ship in October this year. The complete list of AWE 2016 winners can be found here.
The excitement at AWE was amazing this year, and it was one of the busiest tech expos we have been a part of. I did not expect this much popularity from an augmented-reality-focused event, given the massive virtual reality trend right now. Judging by this event, AR has a great chance of hitting the mainstream faster than VR.
Below is a montage video that I filmed at AWE 2016. Enjoy!
Even though it was held in a small venue, the Silicon Valley Virtual Reality Expo drew attendees who were immensely enthusiastic about the latest virtual reality technology. The 3rd annual SVVR (2016), a VR-focused expo and conference, took place April 27-29 at the San Jose Convention Center. The goal of the event is to bring VR supporters together and build up the VR network. The audience included developers, content creators, designers, investors, and entrepreneurs involved in or interested in virtual reality.
Over 100 VR companies came to San Jose to showcase their products at the event. It was a nice mix because there were several big companies like Oculus and NVIDIA, some medium sized companies including Leap Motion and OSVR, and dozens of small indies bringing tools or content to the VR landscape. The variety of VR technology exhibited at the expo included 360 video, motion & gesture control, 3D point cloud mapping, audio, controllers, and more.
In addition to demoing our product to attendees at the show, uSens CTO Dr. Yue Fei had the opportunity to present our inside-out tracking for mobile and tethered AR/VR. Dr. Fei also discussed how uSens is working to release this technology to Unity 3D, C++, and Java developers on June 1, 2016.
There was an overwhelming number of VR tech demos to experience at the show, so it was nice to try out a few of them. One of the first demos I tried was the Pico Neo, which was very interesting because a small SNES-like controller attaches to the head-mounted display. Many of the internals are actually located inside the controller, which allows the HMD to be more lightweight. The game I played was a space-shooter demo that had me looking around to aim and pushing a button on the controller to shoot. It was pretty difficult to attain a steady aim with my shaky head, but I got better the more I played. With the Pico Neo reportedly launching in China in June, it should be interesting to see whether controller-attached HMDs catch on.
SculptrVR is another exciting game I tried that allows users to sculpt objects in virtual reality, with an aesthetic very similar to Minecraft. A unique aspect of SculptrVR is that I could scale myself to my desired height at any time, which makes it very intuitive to sculpt your own detailed world. Users are also able to upload and share their worlds over Steam.
VirZOOM was one of the most mesmerizing products that I tried at SVVR. The product is a bicycle controller that can be hooked up to PSVR, Oculus Rift, and HTC Vive HMDs. The HMD provided position tracking so that I could lean to the left or right while maneuvering through the virtual landscapes. The game demo I played for this system was great: in one of the mini-games, I rode a flying horse and genuinely had the sensation that I was hovering in the air. After about eight minutes of play, I realized I had gotten a decent workout without even noticing while I was in action. This product is sure to introduce VR to the fitness world at a rapid rate. VirZOOM is available for pre-order, with mobile VR support coming soon.
Noitom's Project Alice was the last demo I tried, and it was the stand-out virtual reality experience at SVVR. It was so popular that attendees had to set up appointments and wait for hours just to try it. Luckily, some of the uSens team and I experienced the demo together. It took place in a private room, but once we donned our headsets, the room became completely virtualized. We were then given Nintendo Wii Remotes that let us create virtual objects and interact with them. The twist was that we could also interact with real-world objects, whose movements were simulated in real time (with little to no latency!) via the HMD. Being able to interact with virtual and real objects at the very same time, all while inside a virtual reality experience, was truly magical.
Getting the chance to see and try out our VR peers' demos greatly inspires us to continue working on our own technology. We are so excited for next year! Below is a short montage video that I filmed at the show.
Facebook F8 is an annual developer conference that took place on April 12 and 13 at Fort Mason in San Francisco, CA. The event was directed at developers and entrepreneurs who build products and services for the social network. During the keynote, Zuckerberg mentioned that virtual reality could be the best form of social networking because of its strong communal immersion, illustrating the idea by showing off the Toybox demo for Oculus Touch.
Facebook also presented an open-source camera rig that it calls the “Surround 360 Camera”. The rig, built from 17 individual cameras, will be able to shoot 3D 360-degree 8K video at a smooth 60 frames per second. According to Chris Cox, Facebook's vice president of product, the company plans to share the hardware design and stitching algorithm on GitHub this summer.
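Facebook hasn't published its stitching code yet, but at the heart of any 360 video pipeline is the standard equirectangular projection, which maps every 3-D viewing direction onto a pixel in a 2:1 panorama; the stitcher resamples each camera's frames through a mapping like this. Here is a small Python sketch of that projection (generic textbook math, not Facebook's actual algorithm):

```python
import math

def direction_to_equirect(dx, dy, dz, width, height):
    """Map a 3-D unit viewing direction to (column, row) pixel coordinates
    in an equirectangular (360 x 180 degree) image. Rows are measured from
    the bottom: row 0 is the south pole, row height-1 is the north pole."""
    lon = math.atan2(dx, dz)           # longitude in [-pi, pi], 0 = straight ahead
    lat = math.asin(dy)                # latitude in [-pi/2, pi/2]
    u = (lon / math.pi + 1.0) / 2.0    # normalize longitude to [0, 1]
    v = (lat / (math.pi / 2) + 1.0) / 2.0  # normalize latitude to [0, 1]
    return u * (width - 1), v * (height - 1)

# Straight ahead (+z) lands in the exact center of an 8K panorama.
print(direction_to_equirect(0.0, 0.0, 1.0, 8192, 4096))
```

An 8192x4096 frame keeps the 2:1 aspect ratio that equirectangular video needs, which is why "8K" is the natural resolution target for a rig like this.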
The Oculus team also demoed a VR selfie stick that aims to make VR social. With the virtual stick, users will be able to take photos of their avatars in front of famous landmarks. Once a photo is taken, the user can send it to a Facebook friend's virtual mailbox so it can be posted on the site.
After talking about VR, Zuckerberg shifted the topic to augmented reality. Surprisingly, it was the first time Zuckerberg revealed Facebook's interest in AR. The Facebook CEO said that AR apps could replace physical objects, such as a TV set. Facebook also showed that it wants to combine AR and VR in its devices, which is what we are working on here at uSens Inc. as well. Near the end of the conference, Zuckerberg presented a photo of a pair of black glasses, a glimpse into the future of Facebook tech: he said that VR/AR headsets will look like a regular pair of eyeglasses in 10 years. VR and AR also featured at the end of Facebook's 10-year road map, along with artificial intelligence and connectivity.
Now that Facebook is officially on the AR train with Google and Microsoft, it should be interesting to see how this competition will help bring AR to new heights.
The mobile picture and video messaging app Snapchat has come a long way from its simple and humble beginnings. Originally launched as “Picaboo” in 2011, the company now boasts a staggering $16 billion valuation and has sustained its momentum by constantly adding new features and updates. The company's video traffic is even catching up with Facebook's, at 7 billion monthly views to Facebook's 8 billion. Plenty of the videos being viewed on the app utilize augmented reality thanks to a recent camera feature. In fact, Snapchat has been expressing interest in the AR field for quite a while now.
Snapchat's interest in AR first surfaced around the time Google Glass publicly launched for a limited period and gained attention in 2014. That winter, Snapchat spent $10 million to acquire Vergence Laboratories, a company in Venice, CA that created the video-recording glasses known as Epiphany Eyewear. Around the same time, Snapchat also purchased Scan.me, a QR-code generator that powers the app's “Snaptags” function, providing a quick and easy way to add friends. The company has also hired many people from augmented reality groups such as Microsoft's HoloLens, PTC's Vuforia, and more. This could mean either that Snapchat will soon enter the wearable technology space with smart glasses, or that it will continue to develop the augmented reality software already thriving inside the app.
In September 2015, Snapchat deployed an innovative feature called “Lenses”, which reinvents the selfie with an ever-changing variety of augmented reality animations. Users experience augmented reality by pointing their cameras at their faces; after a second or two of scanning, they can choose between many different facial animations that change daily. If users find a lens they like, they can purchase it for $0.99 to own it indefinitely. Snapchat has also been selling personalized Lenses to corporate sponsors, like Fox.
It is quickly becoming apparent that augmented and virtual reality will take over the mainstream. The social media giant Facebook has already acquired Oculus, and now Snapchat is showing major interest in the AR field. Historically, AR has been hugely overlooked in comparison to VR, but Snapchat seems to be looking to change that. With Snapchat's 100+ million daily users, more people are being exposed to augmented reality, and no other company is introducing AR to the millennial audience at such a rapid rate. Snapchat is the undisputed leader of ephemeral messaging, and at this rate it will probably become a leader in AR as well.
Here at uSens Inc, we are aiming to lessen the gap between AR and VR so we are delighted to see Snapchat help popularize AR even further. Snapchat may even move the focus of AR to social networking just as how VR is closely associated with gaming.
We had the lucky opportunity to attend three important technology conferences over the last few weeks: REAL 2016, the Game Developers Conference (GDC), and South by Southwest (SXSW). The events were all very different, but at each one we showed off our current tech demos.
The 2nd annual REAL 2016 conference took place on March 8 and 9 at Fort Mason in San Francisco, CA. The event theme was “Reality Computing,” which Autodesk defines as “…the principal concept for how technologies are breaking down the barriers between the physical and digital worlds for anyone engaged in the design, delivery, or management of physical things.”
The majority of the REAL audience was heavily involved in business and enterprise technology, specifically AEC (architecture, engineering, and construction) and BIM (building information modeling). The conference centered on products and technologies that digitally capture existing conditions, manipulate and analyze that data in design software, and output the result back into the physical world through digital fabrication methods (i.e., 3-D printing) or visually through augmented reality or projection technologies. Examples include laser scanning, UAVs, depth cameras, photogrammetry, and virtual reality headsets.
Our CTO, Dr. Yue Fei, gave a short but meaningful presentation on how hand and position tracking make AR and VR experiences better for human-computer interaction.
The 27th annual Game Developers Conference (GDC) also took us back to San Francisco on March 14-18. We split our time between the Open Gaming Alliance Lounge and the main GDC Expo hall in the ARM booth.
The GDC audience we catered to included game developers, media, and analysts. This popular conference (27,000 attendees) is a place where game developers gather to discuss and exchange ideas, tools, and content to advance the gaming industry. Virtual reality was extremely popular at the event. While the VR hype still seems to consume the media, we're beginning to see the extreme value of uSens technology for AR and VR content makers as well as related tool and hardware makers. We demoed hundreds of times and attracted dozens of new early DEVNET (Developer Network) beta users, who will start using our technology in May.
The 29th annual SXSW (South by Southwest) festival ran from March 11-20 in Austin, Texas. SXSW is a yearly event that features games, film, interactive media, music festivals, and tech conferences. We attended and showed off our demos in the Gaming Expo. For the first time this year, the festival also included a VR/AR Experience side event featuring a limited collection of mainly 360 video and photo VR technology.
We felt at home during the SXSW Gaming Expo, because many of our VR tech peers joined us there for the massive consumer crowds, including kids, families, educators, and many different types of developers.
One of the AR highlights at SXSW was NASA's HoloLens presentation. Dr. Jeff Norris and Victor Luo, both fans of uSens tech, provided one of the very first public debuts of their interactive apps for Microsoft's HoloLens device. Jeff and Victor showed attendees how NASA is using AR and VR technology to build, execute, and virtualize current and future missions. With this newfound VR and AR technology, NASA has produced some of the most accurate 3D reconstructions of spacecraft, space physics, and Mars. Jeff and Victor shared the inception and construction of new projects and the potential of virtual reality space exploration for everyone on Earth.
It was great to see our AR/VR technology appreciated and examined across so many different verticals, including business and enterprise, game developers, and consumers and educators. Despite the differences among these groups, we've seen that many people are excited for hand and position tracking to immerse them in their experiences.