HoloLens UX Work In Progress | Oculus Rift Blog

HoloLens UX Work In Progress

(No Sound)
Demo of a beta application running on the Microsoft HoloLens. In the video, the data values are not the 'real-time' portion of the data because the equipment wasn't running at the time. A follow-up video will be uploaded to show the real-time data portion of the project.
What's shown in this video is a simple way of generating a UI/UX that is locked to the user. The environment is industrial, so voice input was not viable. That ruled out a large portion of the HoloLens input model, so we relied on gaze and hand inputs to make up for the lack of voice. The menu stays with the user at 40 degrees off the horizon and at a 0.75-meter distance.
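The placement described above boils down to a small piece of geometry: project a point 0.75 m out from the head, pitched 40 degrees off the horizon. A minimal Python sketch (illustrative only, not actual HoloLens code; the function name and the assumption that the menu sits *below* the horizon are mine):

```python
import math

def menu_position(head_pos, yaw_deg, distance=0.75, pitch_deg=-40.0):
    """Place the menu `distance` meters from the head, `pitch_deg`
    degrees off the horizon, in the direction the user is facing
    (y-up coordinate system)."""
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    # Forward vector built from the head's yaw and the fixed pitch.
    forward = (math.cos(pitch) * math.sin(yaw),
               math.sin(pitch),
               math.cos(pitch) * math.cos(yaw))
    return tuple(h + distance * f for h, f in zip(head_pos, forward))

# User at eye height 1.6 m, facing straight ahead (yaw 0):
x, y, z = menu_position((0.0, 1.6, 0.0), yaw_deg=0.0)
```

Re-evaluating this each frame with the current head pose is what makes the menu "tag along" with the user.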
The menu stays with the user through all rotation and movement. Only when the user gazes at the menu does it allow gaze movement over the UI. This approach could be used in just about any other HoloLens application. It also keeps the menu out of the user's line of sight so they can focus on the task at hand. Current UI designs for the HoloLens have you place panels in or on the environment.
This can be helpful in most situations where the user is limited to a small area, but it is less useful when the user is walking around a large space where occlusion may break the direct line of sight.
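The gaze-unlock behavior described above can be sketched as an angle test between the gaze direction and the direction from the head to the menu (illustrative Python, not HoloLens code; the function name and the 15-degree threshold are assumptions):

```python
import math

def gaze_on_menu(gaze_dir, head_pos, menu_pos, half_angle_deg=15.0):
    """Return True only while the user's gaze ray points at the menu,
    which is when gaze movement over the UI should be enabled.
    `gaze_dir` is assumed to be a unit vector."""
    to_menu = tuple(m - h for m, h in zip(menu_pos, head_pos))
    norm = math.sqrt(sum(c * c for c in to_menu))
    to_menu = tuple(c / norm for c in to_menu)
    dot = sum(g * t for g, t in zip(gaze_dir, to_menu))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= half_angle_deg

head = (0.0, 1.6, 0.0)
menu = (0.0, 1.1, 0.6)  # roughly 40 degrees below the horizon
looking_at_menu = gaze_on_menu((0.0, -0.643, 0.766), head, menu)
looking_ahead = gaze_on_menu((0.0, 0.0, 1.0), head, menu)
```

While the test returns False, the menu simply tags along; once it returns True, gaze input is routed to the UI.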
When I switch between the materials, I show the standard Microsoft HoloLens wireframe, their occlusion material, and a third partially faded material. Switching materials lets a shader render the red 'X-ray' view of the internal parts.
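The switching itself is just a cycle through the three display materials. A minimal sketch (illustrative Python; in the actual app this would be material assignment in the engine, and the names here are hypothetical):

```python
# The three display modes mentioned above, in cycling order.
MATERIALS = ("wireframe", "occlusion", "faded")

def next_material(current):
    """Advance to the next display material, wrapping around."""
    i = MATERIALS.index(current)
    return MATERIALS[(i + 1) % len(MATERIALS)]

# Cycling from the wireframe through all modes and back:
sequence = ["wireframe"]
for _ in range(3):
    sequence.append(next_material(sequence[-1]))
```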
Alignment was handled with a Vuforia marker and then stored on-device through the Microsoft World Anchor system, so Vuforia only needs to run once, or whenever the user wants to re-align their objects. If you opt to use Vuforia for extended tracking, you cannot also stream the content, because Vuforia takes over the camera. We use Vuforia for about two seconds to find our marker, and once it is found we hand that responsibility over to the World Anchor system.
From that point on, the user no longer needs the marker as long as the room file is intact on the device.
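The handoff described above can be summarized as a tiny state machine: the marker tracker owns the camera only while scanning, and once the pose is stored as an anchor the camera is free for streaming. A minimal sketch (illustrative Python, not the Vuforia or World Anchor APIs; class and method names are hypothetical):

```python
class AlignmentFlow:
    """Marker tracking runs only until the marker is found; afterwards
    the stored anchor pose is used and the camera is released."""

    def __init__(self):
        self.state = "scanning"      # marker tracker owns the camera
        self.anchor_pose = None

    def marker_found(self, pose):
        self.anchor_pose = pose      # persist the pose as a world anchor
        self.state = "anchored"      # tracker released, camera free

    def camera_available(self):
        # Streaming is only possible once marker tracking has stopped.
        return self.state == "anchored"

    def realign(self):
        # User-requested re-alignment: scan for the marker again.
        self.anchor_pose = None
        self.state = "scanning"

flow = AlignmentFlow()
flow.marker_found((1.0, 0.0, 2.0))   # marker seen after ~2 seconds
streaming_ok = flow.camera_available()
```

The `realign` transition corresponds to the one case where Vuforia must run again: when the user asks to re-align the objects or the stored room file is lost.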

All 3D objects were created in Google Blocks using an HTC Vive.
Those items can be found here:

