The project explored the form factor and interaction model of the company's next-next-generation MR product, and shaped its product roadmap. As the prototyper, I led the design, fabrication, and experimentation on the prototype.
* Project Aurora is the code name for Rokid Vision.
The Vision leverages the computing power of pervasive personal devices. Instead of packing every functional unit around the user's head, it acts as a universal "accessory" for devices that already have computing power, such as mobile phones, tablets, and laptops.
To make the prototype fit, I developed a Rhinoceros plugin called Uranus that generates the physical form from head data provided by our research team. With Uranus, the prototype could be driven by vital fit parameters such as eye relief, eye distance, and tolerance.
To learn more about the Uranus plugin, see my talk at AWE USA 2019.
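The idea behind Uranus can be sketched as follows. The data structures, field names, and formulas below are hypothetical stand-ins: the actual plugin consumes full head-scan geometry inside Rhinoceros, while this minimal Python sketch only shows how measured head data might be turned into the drive parameters of a parametric frame model.

```python
from dataclasses import dataclass

# Hypothetical head-scan measurements in millimetres; the real plugin
# works on richer scan data provided by the research team.
@dataclass
class HeadScan:
    ipd: float            # interpupillary distance
    nose_to_eye: float    # depth from the nose bridge to the corneal plane
    head_width: float     # width measured at the temples

# Hypothetical drive parameters for the parametric frame model.
@dataclass
class FitParameters:
    eye_relief: float     # lens-to-eye distance
    lens_spacing: float   # centre-to-centre distance of the lens barrels
    temple_span: float    # inner width of the temple arms

def derive_fit(scan: HeadScan, clearance: float = 2.0) -> FitParameters:
    """Derive frame parameters from one head scan.

    `clearance` stands in for the manufacturing tolerance mentioned in
    the text; the formulas are illustrative, not the plugin's own rules.
    """
    return FitParameters(
        eye_relief=scan.nose_to_eye + clearance,
        lens_spacing=scan.ipd,                     # lenses centred on the pupils
        temple_span=scan.head_width + 2 * clearance,
    )

fit = derive_fit(HeadScan(ipd=63.0, nose_to_eye=12.0, head_width=150.0))
print(fit.eye_relief, fit.lens_spacing, fit.temple_span)  # 14.0 63.0 154.0
```

In the plugin itself, parameters like these would then drive the regeneration of the frame geometry, so each derivative prototype could be re-fitted without remodelling by hand.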
Because different functional teams placed different demands on the prototype, it adopted a modular, extensible design that could spawn various derivatives.
Integrated a set of electronic parts similar to the final hardware selection, later serving as a fully functional platform for SDK, optics, and electronics engineers.
Adopted a Vive Tracker for stable 6-DOF tracking, allowing creators to design and develop content in parallel during the early stage of the project.
Provided a ready-made platform for software engineers to evaluate and benchmark the overall performance of the phone-AR combination.
After three months of hackathon-style development, we presented Rokid Aurora at CES 2019 in Las Vegas with its initial feature set complete: 6-DOF tracking, binocular stereo display, phone/laptop compatibility, and gesture control.
Project Aurora was formally named Rokid Vision after CES 2019. It was then introduced at AWE USA 2019 in Santa Clara with an upgraded industrial design, optical system, VSLAM sensor, and SDKs.