MARSHA, AI SPACEFACTORY’s Mars habitat design, is the machine we would depend on to keep us alive and well on Mars. After our design won the NASA Centennial Challenge, we presented “Moving to Mars”, a road show across the ocean. The exhibition began in Hong Kong en route to its final destination at the London Design Museum in 2019. We planned to use AR technology together with physical models to better demonstrate MARSHA’s structure, inner space, and interior design.
Why AR?
Working with a team of architects, our initial goal was to communicate an architectural language to a general audience in a simplified, effective way. Having previously experimented with AR and 3D-printing technologies, I decided to combine them in our demonstration to make it more accessible through an immersive, interactive experience.
Models and Animation
Digital Models
Autodesk Maya and Rhinoceros played critical roles in both the modeling and animation phases. To recreate realistic landscapes of Mars and Earth, Maya’s FX tools and XGen were the better choice, e.g. for simulating sand on the Martian surface, flowing water, huge amounts of grass, and walking buffaloes on Earth.
Physical Models
We designed the physical models in Fusion 360 for AR recognition and fabricated them on a stereolithography (SLA) printer. Of the two physical models, one shows visitors the Mars environment, while the other presents a section of the habitat to better illustrate the interior space.
Animation
Swipe Left: Go back to main menu
Swipe Right: Switch scenes between Mars and Earth
Swipe Up: Start animation
Swipe Down: Set the animation to initial state
Final Exhibition
Developing with iOS touch phases is fun and easy to start. Apple defines how the screen is touched in three phases: touches began, touches moved, and touches ended. A swipe left, for instance, can be recognized when endTouchPosition.x < startTouchPosition.x.
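As a minimal sketch of that logic (the function name, threshold, and enum are illustrative, not the exhibition's actual code), a completed touch can be classified into one of the four gestures by comparing its start and end positions:

```swift
import Foundation

enum Swipe { case left, right, up, down }

// Classify a finished touch by comparing the point recorded in the
// "touches began" phase with the point from "touches ended".
// A small threshold filters out taps and accidental drags.
func classifySwipe(start: CGPoint, end: CGPoint, threshold: CGFloat = 50) -> Swipe? {
    let dx = end.x - start.x
    let dy = end.y - start.y
    if abs(dx) >= abs(dy) {
        // Mostly horizontal movement.
        if dx > threshold { return .right }   // switch Mars/Earth scenes
        if dx < -threshold { return .left }   // back to main menu
    } else {
        // UIKit's y axis grows downward, so negative dy means swipe up.
        if dy < -threshold { return .up }     // start animation
        if dy > threshold { return .down }    // reset animation
    }
    return nil
}
```

In a view controller, `start` would be captured in `touchesBegan(_:with:)` and `end` in `touchesEnded(_:with:)`, with the returned case mapped to the scene actions listed above.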
Apple developers can download and create machine learning models with Xcode. The AR recognition model in this project was trained in TensorFlow and converted to the Core ML model format.
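Once the converted model ships in the app, its classification output still has to be mapped to a scene. The hypothetical helper below (the label names and confidence cutoff are assumptions, not the project's actual values) sketches that post-processing step: picking the most confident label, e.g. deciding which physical model the camera is looking at:

```swift
import Foundation

// A Core ML image classifier (run e.g. through Vision) reports a
// confidence per label. This helper returns the winning label, or nil
// when no label is confident enough to trigger a scene change.
func topLabel(from confidences: [String: Double],
              minimumConfidence: Double = 0.5) -> String? {
    guard let best = confidences.max(by: { $0.value < $1.value }),
          best.value >= minimumConfidence else { return nil }
    return best.key
}
```

The cutoff keeps the AR overlay from flickering between scenes when the classifier is unsure.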