After a long, exciting week at CES, we packed up and flew directly from Las Vegas to Hong Kong to oversee the pilot run at the factory. We lived on the factory floor for just over two weeks setting up production, assembly, and QA for the developer kits.
The result? We produced over 40 complete Oculus Rift “pilot run” developer kits and we’re on schedule to start shipping the full kits in March.
None of this would have been possible without your support. Thank you again for making this happen!
We want to take you behind the scenes of the pilot run process to see how it all works. But before that, we have good news that we’ve been keeping under wraps at Oculus VR HQ.
A handful of hardware detectives in the community have speculated on the unexplained mechanisms shown in our photos of the early SLA prototypes. What are those engravings on the side of the headset? Why the plastic around the lenses? What diabolical plans has Oculus cooked up?!
A Few New Features
When we extended the ship date to change the display, we wanted to take advantage of the expanded schedule. Because we couldn’t make any drastic design changes, we decided to pursue two small features we hoped would improve the kits for everyone.
The majority of the team at Oculus wear glasses or contacts. Our early prototypes required that you either:
- Wear contact lenses.
- Press your glasses up against the lenses, which can scratch them (and is uncomfortable).
After prototyping a number of potential solutions, we picked what we thought was the best short-term bet and ran with it. The results were:
#1 Removable Eyecups for Vision Correction
The Oculus Rift developer kit now comes with three pairs of removable eyecups, instead of a single, permanent set. The different eyecups allow you to switch the focal distance of the developer kit between three predefined settings. This means that if you’re nearsighted and your vision isn’t too bad, you may be able to use the developer kit without glasses or contacts.
Pilot run unit with left eyecup removed. Different eyecups below.
Here’s how they work:
- If you have great vision (or you wear contacts), your vision inside the Rift will match your vision in real life. You’ll use eyecup set A.
- If you’re farsighted, you’ll have no vision problems in the Rift because the optics are focused at infinity (which makes your brain think it’s looking at something far away). You’ll also use set A.
- If you’re nearsighted, the additional eyecups, B and C, allow you to see inside the Rift as if you were wearing glasses. Again, this is because the lens cups change the focal distance. If you’re moderately nearsighted, you’ll use set B. If you’re very nearsighted, you’ll use set C.
This isn’t the perfect solution: the B and C cups won’t be ideal for everyone, but we’re hoping that they help some of the nearsighted developers. If you have other eye issues like astigmatism, the additional lens cups may not be sufficient. In short, your mileage may vary.
#2 Adjustable Assembly
The developer kit now has a geared mechanism that allows you to extend and retract the assembly that holds the screen and the eyecups to position it comfortably. This has a few advantages:
- You can extend the assembly to provide extra clearance for glasses or a larger brow.
- If you’re using either of the shorter eyecups, the lenses will be further away from your eyes. By retracting the assembly, you can bring the lenses closer to your eyes, significantly increasing your field of view.
The mechanism shown here is on both sides, allowing for assembly adjustment.
We’re designing better vision solutions for the consumer version of the Rift. These are just a few last-minute additions that we hope help developers in the meantime.
This pilot run is the final test for our calibration, assembly, and testware pipeline before we kick off mass production. The photos below provide a peek behind the scenes at the factory during pilot run week. Remember that everything you see in the photos below is still subject to change.
The very first Oculus Rift developer kit from our pilot run.
We covered tooling in a previous update (available here), but we wanted to provide a few shots of the actual tooling used for the pilot run:
Tooling for the developer kit’s headset assembly.
Tooling for the faceplate of the developer kit.
When you’re creating a hardware product, it’s crucial to test every function and feature. These tests need to be simple, robust, and streamlined so that they’re easily integrated into the manufacturing process. For the developer kits, we needed testware for everything, from the complex components (display and tracker) to the simplest buttons and switches on the control box.
Sensor programming and testing software.
We’ve touched on this in previous updates, but building the Oculus motion tracker from scratch was an exciting challenge. Perfect gyroscopes and accelerometers would report the exact angular velocities and acceleration in all axes. Unfortunately, due to the physical limitations of the manufacturing process, the tracker data may contain scale, offset, and cross-axis sensitivity errors that are unique to each unit. To make matters worse, minor misalignments in chip placement can introduce additional errors.
Pick and place machine for the Oculus tracker.
To counter this, every Oculus tracker is put through a thorough calibration process at our factory where it’s placed on each of its six orthogonal sides and rotated on a high-speed turntable to test each vector. The calibration produces error correction matrices unique to that particular sensor. These matrices are saved and later used to correct for errors when the kit is actually being used. The sensor can also recalibrate itself to adapt to changes in temperature that would otherwise affect the orientation data.
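The correction step described above can be sketched as a small piece of linear algebra. This is only an illustrative sketch, not Oculus's actual code: the function name, matrix values, and offset vector are hypothetical, assuming the factory calibration yields a 3×3 correction matrix (covering scale and cross-axis errors) and a zero-offset vector per unit.

```python
import numpy as np

def correct_sample(raw, correction, offset):
    """Apply per-unit calibration to one raw 3-axis sensor sample.

    `correction` is a 3x3 matrix compensating for scale and
    cross-axis sensitivity errors; `offset` removes the zero bias.
    (Hypothetical values -- real matrices would come from the
    turntable calibration rig at the factory.)
    """
    return correction @ (np.asarray(raw, dtype=float) - offset)

# Example unit: X axis reads 2% high, slight X<->Y cross-axis
# coupling, and a constant bias on Z.
correction = np.array([
    [1.0 / 1.02, -0.01, 0.0],
    [-0.01,       1.00, 0.0],
    [0.0,         0.00, 1.0],
])
offset = np.array([0.0, 0.0, 0.05])

corrected = correct_sample([1.02, 0.0, 0.05], correction, offset)
# X is scaled back to 1.0 and the Z bias is removed.
```

The same machinery extends naturally to temperature compensation: if the calibration run characterizes how the offsets drift with temperature, the sensor can swap in the appropriate correction at runtime.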
Sensor calibration system, inside the sensor calibration room.
That last bit is actually trickier than you might think. To dial in temperature correction, we built a mobile room at the factory that heats the sensors to operating temperature during calibration. To throw one last wrench in the system, the rigs need to stay perfectly level for accurate calibration. You’d be surprised just how hard it is to make a surface perfectly level to 0.1° and keep it that way.
Building a mobile sauna to calibrate our sensors is definitely a little crazy, but pretty neat.
Nirav peering into the sensor calibration room.
Nirav, Nate, and Jack looking over testware results.
Once we’ve tested and calibrated all the individual components, it’s time to assemble the developer kit. The kit is produced on a standard assembly line by the factory’s team. Assembly is broken up into sub-tasks like attaching the plastic pieces together, sealing foam on the facemask, and connecting display and tracker components.
Assembly adjustment parts.
Display controllers for control box.
Plastic shells for the headset.
A sheet of Oculus trackers.
The assembly line’s QA process relies on custom testing software that walks the user through a series of display, tracker, and overall functionality tests. The last step in the QA software places the user in a virtual room with targets that they have to look at in a set order. As they look at each target, the target lights up green and the computer plays a sound, notifying the user of success.
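That final gaze-target check can be sketched in a few lines. This is a minimal illustration under our own assumptions, not the actual QA software: the `looking_at` helper and the 5° tolerance are invented for the example, and we assume the tracker supplies a normalized forward-gaze vector.

```python
import math

def looking_at(gaze, target, threshold_deg=5.0):
    """Return True if the unit gaze vector is within `threshold_deg`
    of the unit target direction (both 3-vectors, assumed normalized).

    Hypothetical check -- the real QA app's tolerance isn't public.
    """
    dot = sum(g * t for g, t in zip(gaze, target))
    # Clamp before acos to guard against floating-point drift.
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle <= threshold_deg

# Targets must be hit in order; each lights up once matched.
targets = [(0.0, 0.0, -1.0), (1.0, 0.0, 0.0)]
hit_first = looking_at((0.0, 0.0, -1.0), targets[0])   # facing target 1
miss_second = looking_at((0.0, 0.0, -1.0), targets[1])  # 90 degrees off
```

Because the test sweeps the user's gaze across the room in a fixed sequence, it exercises the tracker through a wide range of orientations, which is exactly what you want from a last-step functional check.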
QA application. The orientation target here has not been activated yet.
After QA, all that’s left is to print out the labels, wrap up the cables and eyecups, and pack it all into the box that ships to your doorstep.
Manufacturing is a tough business, whether it’s in China or the United States. You can’t whip something up in CAD, email the files over to a factory, and receive your perfect finished product in the mail. Unexpected issues always pop up, and concepts can be miscommunicated. In the end, nothing beats having boots on the ground where the product’s actually being built, so your team can analyze problems and create solutions on the fly.
As an example, we noticed in our early SLA prototypes that the Oculus tracker was angled by about a degree and a half. While that number may seem insignificant, it’s a disaster if your virtual world is permanently stuck at an angle. We worked with the factory to create a simple solution: three tiny plastic strips beneath the tracker that keep it completely level and parallel to the screen. This resolved the issue, kept us within the factory’s manufacturing tolerances, and only cost us a few days of R&D and testing.
The plastic strips can just barely be seen at the bottom left of the tracker.
When you’re building hardware, every day’s a new challenge. Luckily, tackling issues like these to build a great product is a lot of fun.
If you found all of this fascinating, we recommend reading through the “On the Factory Floor” blog series by hardware guru Bunnie, which details the finer points of manufacturing a product in China:
On the Factory Floor Part 1 – The Quotation (or, How to Make a BOM)
On the Factory Floor Part 2 – On Design for Manufacturing
On the Factory Floor Part 3 – Industrial Design for Startups
On the Factory Floor Part 4 – Picking (and Maintaining) a Partner
Back at Oculus VR HQ
As always, you can stay up to date with the very latest Oculus news by following us on Twitter (@Oculus3D), liking us on Facebook (www.facebook.com/oculusvr), or checking out our website: www.oculusvr.com.
Late night Unreal Engine 3 deathmatch playtesting in Oculus VR
Special thanks to Joshua Topolsky and Jimmy Fallon who demoed the Rift last week on Late Night with Jimmy Fallon on NBC! If you missed the show, you can watch the clip here: The Verge – The Oculus Rift on Late Night with Jimmy Fallon
Thanks again for all the support! We will have more information on the Oculus SDK to share as we draw closer to the developer kit launch.
We’ll see you in the game!
— Palmer and the Oculus team