SpaceCard® Mobile VR Platform

"The Metaverse in a Postcard"

2018-2022

SpaceCard® is a Toronto-based startup and multi-year R&D project completed in collaboration with the Federal Government and University Research Labs. Milan has led SpaceCard® development and IP in machine vision, optics, industrial design and UX to help solve some of the biggest challenges facing VR today.

Founder & CEO

Inventor, Patent Holder

Head of Product

Head of Design

Technical Lead, SDK

The Challenge

Consumer Virtual Reality devices emerged out of an abundance of repurposed high-volume, low-cost smartphone components. The case for consumer VR was made when Oculus released its Rift headset and transformed the market, a shift cemented by its acquisition by Facebook. The issue with dedicated VR systems is their considerable price point and friction-filled setup process, which results in low scalability and limited use cases, hence the challenges VR manufacturers experience today. On the other side of the market, the issue with smartphone-based VR systems is that the experience itself is lackluster: lower fidelity, uncomfortable cheap enclosures, and most notably a lack of meaningful interaction. These two basic designs can therefore either provide high utility at the expense of scalability, or scalability at the expense of utility.

Summary of Results

  • Founded, commercialized and patented innovative vision-based HCI and manufacturing technology for a high-velocity VR system in an envelope form factor, at the highest price/performance in the industry.
  • Led, coached and mentored a multi-year research and development team of 14 members across design, engineering, computer vision and manufacturing disciplines.
  • Established and managed concurrent product roadmaps across software, hardware and SDK with iterative release cycles, in partnership with Autodesk Technology Centers and utilizing lean methodologies.
  • Developed and led product design efforts across a holistic visual language, including mobile UI design systems, VR UI design systems, industrial design and brand identity.
  • Shipped v1.0 of the hardware and software platform for iOS and Android for a pilot program with enterprise early adopters including Celcom Malaysia and Ericsson.
  • Galvanized and managed partnerships with federal government agencies, academia, technology partners, international wireless carriers and architectural design firms.
  • Raised over $600,000 CAD in non-dilutive capital via federal and provincial research and development grants, pitch competitions and subsidy programs.
  • Invented and filed 8 provisional utility and design patents for smartphone-based VR systems for immersive environments.

Strategy
& Insights

Current Landscape of VR Devices

Consumer Virtual Reality devices emerged out of an abundance of repurposed high-volume, low-cost smartphone components. The case for consumer VR was made when Oculus released its Rift headset and transformed the market, a shift cemented by its acquisition by Facebook. Since then, many companies have taken the concept of repurposing smartphone components and pursued various strategies and embodiments of the idea. Most notable was Google, with its introduction of the Cardboard VR headset and SDK. Since the start of these two movements, many companies have jumped in to create variations of two basic designs:

  1. Create a dedicated VR system built from augmented smartphone components
  2. Create a smartphone-based VR ‘enclosure’ that simply repurposes users’ existing devices

Each approach has its own advantages and disadvantages; however, only the former emerged as a sustainable long-term strategy.

Pros

The benefit of dedicated VR systems lies in the immersion and quality of the experience. Low-latency processing and sensors provide more fluid and interactive experiences with high utility and engagement. For smartphone VR, the main advantage is the cheap form factor of the smartphone enclosure, which allows easy distribution and high scalability considering the abundance of smartphone devices users already have.

Cons

The issue with dedicated VR systems was that they were yet another device that had to be purchased, often at a considerable price point and with a friction-filled onboarding and setup process. This results in low scalability and limited use cases, hence the challenges VR manufacturers experience today. Conversely, the issue with smartphone-based VR systems was that the experience itself is lackluster: lower fidelity, uncomfortable cheap enclosures, and most notably a lack of meaningful interaction. This results in typically short, 30-second ‘gaze’ experiences that are a novelty but don’t provide meaningful utility. The two basic designs can therefore either provide high utility at the expense of scalability, or scalability at the expense of utility.

Hard Trends and Soft Trends

The hard trend in personal computing is that it will become more engaging and immersive. As smartphone screen sizes and processing power increase, their form factor will also evolve. The direction of that evolution is driven primarily by the visual experience, which is why the glasses form factor seems the logical conclusion for personal computing devices. In the meantime, we can take advantage of soft trends particular to smartphones and their current form factor. So what are the truths here? Every year, smartphones gain incremental improvements in both performance and the variety of sensors: larger, more pixel-dense displays, upgrades in radio technology like 5G, more capable cameras, and additional inputs like proximity sensors and microphone arrays.

Smartphone Component Utilization

If we are to solve for scalability and utility together, we first need to address the cost of the potential solution. If we are constrained to the same cost range as existing smartphone-based VR enclosures, how can we then address utility? What other avenues can we explore to deliver that value? Have we considered the full capabilities of smartphones? Have all of those capabilities been utilized to their maximum potential? That is the key question. Therefore, to solve for utility while basing the solution on an existing form factor like the VR enclosure, we need to revisit how we are utilizing the smartphone and see if we can gain additional utility from the VR experience. Let’s look at what is needed for utility in a VR environment (a brief illustrative mapping follows the list below):

  1. Stereoscopic Vision  VR requires stereoscopic displays, or one display for each eye, with accompanying orientation sensors to properly render a ‘viewport’ into a virtual environment. Google Cardboard proved this was possible by repurposing a smartphone display with low-cost optics, as well as MEMS sensors, to facilitate the VR ‘viewport’. Therefore additional electronics outside the smartphone may not be required.
  2. Hand Interaction  VR also requires controllers, typically Bluetooth controllers with buttons and orientation sensors, to provide virtual hands or reticles in a VR environment. If we revisit the smartphone, there is one sensor we may be able to take advantage of to recreate virtual hands, and it happens to be positioned appropriately to detect hand movement in real time: the rear-facing camera and its accompanying LED light. If we find a way to use these existing components, along with computer vision technology, we may be able to fulfill the requirement of virtual hands in VR. Therefore additional electronics outside the smartphone may not be required.
  3. Connectivity  VR environments also require low-latency connections with multiple participants sharing an experience from remote locations. New smartphones are coming equipped with 5G radios that may be able to facilitate these low-latency connections. Therefore additional electronics outside the smartphone may not be required.
  4. Audio  VR also requires audio input and output. Again, smartphones come equipped with ever louder and more capable speakers and microphones that can be repurposed. Therefore additional electronics outside the smartphone may not be required.
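
To make this concrete, here is a purely illustrative Python sketch of the mapping above, showing each VR utility requirement resolved by on-board smartphone components. The component names are placeholders for illustration, not a product specification.

```python
# Illustrative mapping of VR utility requirements to on-board smartphone components,
# mirroring the list above. Names are placeholders, not a product specification.
VR_REQUIREMENTS = {
    "stereoscopic_vision": {"display_panel", "mems_imu", "low_cost_optics"},
    "hand_interaction":    {"rear_camera", "led_torch", "computer_vision"},
    "connectivity":        {"5g_radio", "wifi"},
    "audio":               {"loudspeaker", "microphone"},
}

ONBOARD_COMPONENTS = {
    "display_panel", "mems_imu", "low_cost_optics", "rear_camera", "led_torch",
    "computer_vision", "5g_radio", "wifi", "loudspeaker", "microphone",
}

def needs_external_electronics(requirements, onboard):
    """True only if some requirement cannot be met with on-board components alone."""
    return any(not parts <= onboard for parts in requirements.values())

print(needs_external_electronics(VR_REQUIREMENTS, ONBOARD_COMPONENTS))  # False
```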

The Utility Threshold

In order to reach a minimum level of utility within a VR experience, I derived a progressive scale from the lowest to the highest perceived level of VR experience. The following is a linear progression of utility based on the technical capability of the VR device. The goal was to target the minimum viable utility while being realistic about the technical capabilities and constraints of smartphone technology. The graph below shows the lowest-utility VR experience on the left and the highest level of utility on the right. The goal was to reach 80% of the utility using 20% of the technical capabilities of the available smartphone device. Again, this is a graph of VR utility, as in what capabilities the user has within the VR environment, not of VR experience quality (which would be a different scale).

A New VR Category is Needed

Therefore, a new category of device is needed to solve the problem. According to the hypothesis above it may be possible to provide a high utility VR experience using existing smartphone technology repurposed to perform new functions. This is the basis of the product strategy and requirements.

The SpaceCard Platform Definition

SpaceCard would be an interactive VR environment built as a universal sandbox system, with both mobile and VR user interfaces, that allows users to upload 3D spaces and objects to the cloud. Users would be able to invite others to collaborate, chat and interact in VR environments in real time. These spaces would be accessible through an Instagram-like experience that lets users jump into VR environments using nothing more than a mobile app and a smartphone accessory.

Approach to Hardware & Software

The product design decisions for SpaceCard were vastly different from those of most other VR platforms. Because of the cost-per-unit constraint (less than $10 USD/unit), we could not implement any form of electronics in the physical device; instead, it is powered entirely by the available technology and sensors in the smartphone to drive interactive VR sessions. This of course meant that we had to manage resources on the smartphone effectively so as not to overload the electronics, and also manage VR fidelity expectations with the user. As SpaceCard would essentially be a mobile app with an accompanying accessory, we had to focus on developing key technologies that would unlock new capabilities from the smartphone.

Positioning Opportunities

Relying on existing smartphone technology to fulfill the majority of VR functionality would give SpaceCard a new level of cost efficiency. If this technical strategy were successful, we could significantly reduce or eliminate the need for electronics in the physical product, allowing us to approach price-per-unit ranges that could let SpaceCard enter new categories. Because of this, the target price for SpaceCard was less than $10 USD/unit. If we were able to reach this price, our product positioning possibilities would greatly expand and fundamentally change how consumers thought of VR devices. We also considered the scalability opportunities this would create, and thought hard about other opportunities for cost cutting and efficiency improvements. At the sub-$10 USD/unit threshold, shipping costs become vastly more critical. This is why we treated the form factor of the physical device as a major consideration: it could greatly impact the total cost of the unit (device + shipping). The cheapest form of shipping is lettermail, which would be a tremendously difficult form factor to fit a VR device into, but we felt this would be a worthwhile pursuit.

Much inspiration can be drawn from hardware and software startups like Square, a mobile payments platform with a freemium accessory. Its revenue model is versatile and relies on low-cost hardware to enable recurring revenue models.

As noted above, it may be possible to provide a high-utility VR experience using existing smartphone technology repurposed to perform new functions; this is the basis of the product strategy and requirements. If we were able to achieve a physical cost per unit under $10 USD with a lettermail form factor, we could open product positioning possibilities never before possible, including:

  • VR as a mobile-first experience
  • Use Case-Driven VR Adoption
  • Convert VR devices into in-app purchases or impulse buys
  • Turn VR devices into an
  • Increase VR scalability on the level of direct mail
  • Compete with Electronic Promotional Products and Sales Catalogs or Brochures
  • Move up VR touchpoints and distribution towards the top of the Sales Funnel
  • Make high volume or wholesale B2B packages more palatable
  • Increase Product Bundling Opportunities
  • Eliminate the need for B2C brick-and-mortar retail
  • Subsidize B2B2C Costs
  • Subsidize Device Costs with SaaS Revenue Models (ie. free device with subscription)

Technology
Invention

Today’s mobile devices provide much more capability than just a couple of years ago and, with the right platform, can deliver higher-quality and higher-utility experiences. We wanted to see what was possible with interactive, free-to-roam spaces combined with the speed of 5G mobile devices, particularly for interactive spatial collaboration.

Smartphone Limitations

The smartphone could only provide a finite amount of computation and resources to deliver an interactive VR experience. We used the smartphone’s electronics to provide the following functionality:

  1. Display panel as stereoscopic display
  2. Ambient Sensor override for display brightness consistency
  3. Battery Low Power override for performance consistency
  4. Audio Control override for sound consistency
  5. MEMS sensors (Accelerometer and Gyroscope) for Head Tracking
  6. Rear-facing Camera for Hand Tracking
  7. LED light for low light Hand Tracking
  8. CPU for Machine Vision Algorithm Calculations
  9. GPU for VR Graphics Rendering
  10. Front Facing Camera for state detection (inside or outside of the VR enclosure)
  11. 5G radio and Wifi for Peer-to-Peer collaboration
  12. Microphone for VR Conference Input
  13. Loudspeaker for interaction feedback sounds and VR Conference Output

The challenge with the above scenario was that all of these processes were carried out simultaneously, in real time, on the smartphone, essentially creating a ‘stress test’ environment that pushed the thermal limits of the device. Our technical challenge was to manage these resources effectively, allowing us to ‘ride’ the thermal limitations of the device in a manner that was advantageous for the user experience. To do this, we had to address CPU and GPU computation loads so we could run the VR session at near-maximum load while extending the time the device could operate at maximum performance before hitting its thermal limit. This meant our two biggest computational processes, visual graphics and computer vision processing, had to be exceptionally lightweight and efficient.
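
As a rough illustration of this kind of load management (not the actual SpaceCard implementation), the sketch below shows a simple governor that sheds graphics and vision workload as the device approaches a thermal limit. The temperature thresholds and budget steps are assumed values for illustration only.

```python
# Hypothetical thermal/load governor in the spirit described above: keep the VR
# session near maximum performance as long as possible, then back off the render
# resolution scale and hand-tracking frame rate as the device heats up.
THERMAL_SOFT_LIMIT_C = 42.0   # begin shedding load (assumed value)
THERMAL_HARD_LIMIT_C = 46.0   # force a cooldown break (assumed value)

def thermal_governor_step(temp_c, render_scale, vision_fps):
    """One control step: returns (render_scale, vision_fps, needs_break).

    temp_c would come from a platform-specific battery/SoC temperature query.
    """
    if temp_c >= THERMAL_HARD_LIMIT_C:
        return render_scale, vision_fps, True            # ask the user to take a break
    if temp_c >= THERMAL_SOFT_LIMIT_C:
        render_scale = max(0.6, render_scale - 0.05)      # lower render resolution scale
        vision_fps = max(15, vision_fps - 5)              # throttle hand-tracking rate
    return render_scale, vision_fps, False

# Example: as the device heats up, graphics and vision budgets are dialled back.
print(thermal_governor_step(38.0, 1.0, 30))   # (1.0, 30, False)  -> full budget
print(thermal_governor_step(43.5, 1.0, 30))   # (0.95, 25, False) -> shed some load
print(thermal_governor_step(47.0, 0.8, 20))   # (0.8, 20, True)   -> cooldown break
```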

Computer Vision Controllers

The hands-on gesture control turns the user’s smartphone camera into a real-time gesture sensor. We developed computer vision processing algorithms that utilize the smartphone camera and LED light (torch) to detect and track a left and right marker in real time in all lighting conditions. The controllers are magnetically latched on either side of the SpaceCard visor and can be held in the user’s hands. From tapping to grabbing and swiping, the SpaceCard controllers provide full multitouch-style interaction similar to the experience of a large touchscreen. This technology converts the camera into a Virtual Reality sensor.
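
To give a sense of how camera-based marker tracking of this kind can work, here is a minimal OpenCV sketch that finds two distinctly coloured markers in a camera frame. The SpaceCard algorithm itself is proprietary; the colour ranges, the use of colour thresholding, and the use of blob radius as a depth proxy are assumptions for illustration.

```python
import cv2
import numpy as np

# Minimal marker-tracking sketch (OpenCV 4.x). The HSV colour ranges below are
# illustrative assumptions, not the SpaceCard vision pipeline.
LEFT_HSV_RANGE  = ((35, 80, 80), (85, 255, 255))    # e.g. a green marker (assumed)
RIGHT_HSV_RANGE = ((100, 80, 80), (130, 255, 255))  # e.g. a blue marker (assumed)

def find_marker(frame_bgr, hsv_range):
    """Return (x, y, radius) of the largest blob in the given HSV range, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_range[0]), np.array(hsv_range[1]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (x, y), radius = cv2.minEnclosingCircle(largest)
    return (x, y, radius)  # radius is a rough proxy for marker depth (Z)

def track_hands(frame_bgr):
    """Track both paddles in one camera frame."""
    return {
        "left":  find_marker(frame_bgr, LEFT_HSV_RANGE),
        "right": find_marker(frame_bgr, RIGHT_HSV_RANGE),
    }
```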

Gesture Recognition

Because our controllers had no Bluetooth, buttons or additional feedback mechanisms that could communicate back to the smartphone, we developed gesture recognition to allow for full controller functionality:

  • Tap (or click) intensity  Measuring the positional delta in the Z axis enables the algorithm to detect a fast Z movement in real time. Z movements that accelerate in the positive direction and then suddenly slow down represent a click or ‘tap’. If the virtual hand reticle is hovering over a button or object during this state, the object is tapped or clicked. How fast the user taps increases or decreases the Z delta in a given time frame and thus changes the intensity of the tap. For example, when hitting a virtual button or drum, the user can increase the intensity, and the loudness of the drum hit, by tapping with higher acceleration. (A minimal sketch of this detection logic appears after this list.)

  • Rotate/Orbit  The drag gesture can be utilized to rotate an object in 3D space, such as a dial, when performed on top of that virtual object. The drag gesture can also be utilized to orbit around virtual orbit ‘anchors’, similar to orbiting around an object in 3D modeling software, with one hand.

  • Scale/Zoom  The drag gesture can be utilized with two hands simultaneously over an object. When the distance between the two ‘drag’-state hands changes over an object, it changes the size or scale of that object in 3D space, similar to pinching and zooming on a multi-touch display.
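
The sketch below illustrates the tap-intensity idea: watch the tracked hand’s Z value over a short window and report a tap, with intensity proportional to the peak Z velocity, when a fast forward motion suddenly decelerates. The thresholds, window size and Z units are illustrative assumptions, not the shipped tuning.

```python
from collections import deque

# Hypothetical tap detector based on Z-axis motion, as described above.
TAP_MIN_PEAK_VELOCITY = 0.8   # assumed units: Z-delta per frame
TAP_DECEL_RATIO = 0.3         # "sudden slow down": velocity drops below 30% of its peak

class TapDetector:
    def __init__(self, window=6):
        self.z_history = deque(maxlen=window)

    def update(self, z):
        """Feed one Z sample; return tap intensity if a tap completed, else None."""
        self.z_history.append(z)
        if len(self.z_history) < self.z_history.maxlen:
            return None
        history = list(self.z_history)
        velocities = [b - a for a, b in zip(history, history[1:])]
        peak, latest = max(velocities), velocities[-1]
        if peak >= TAP_MIN_PEAK_VELOCITY and latest <= peak * TAP_DECEL_RATIO:
            self.z_history.clear()
            return peak  # faster taps -> larger peak velocity -> higher intensity
        return None
```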

Ergonomic Hand Height Problem

Because the smartphone camera has a limited field of view, it creates restrictions around the boundaries of the space we call the interaction area. Because the smartphone display and camera are used simultaneously by SpaceCard (one to render the virtual reality scene, the other to track the hands), the interaction area sits uncomfortably high for the user, causing strain on the arms and shoulders. As we cannot reliably increase the field of view of the rear-facing camera with mirrors or fisheye lens adapters, or add additional cameras, our only option is to rotate the camera, and thus the entire smartphone assembly, down at least 30 degrees to get a better view of the hands. However, this creates a problem for the user’s ability to see the display, as it too rotates with the camera. Typical VR headsets keep the facial plane and the display plane perfectly parallel so the user’s eyes can properly focus on the image rendered by the display. Our solution, then, is a unique optics system that allows the display plane of the smartphone to be offset from the user’s facial plane while still keeping the image in focus. There are three main components to this optics system (a small worked example of the geometry follows the list):

  1. Facial Plane: this is the surface of the visor that rests on the user’s face and is in line with the position of the face. Consider this plane to be at a 0-degree angle in the pitch axis. In some versions of SpaceCard this plane contains fresnel lenses (one for each eye); in other versions it is simply a hollowed-out opening for the eyes.
  2. Display Plane: this is the surface where the display of the phone rests, typically offset at least 25 degrees downwards in the pitch axis. This angle allows the smartphone to be housed at a 25 to 35 degree downward angle, pointing the camera in a much more comfortable direction for a more ergonomic interaction area.
  3. Optical Instrument Plane: the lens assembly sits between the facial and display planes and is typically parallel to the display plane. This plane contains a combination fresnel lens, with a 43FL fresnel lens on one side and a fresnel prism with a 40 diopter on the other, that bends the light toward the eyes at a particular angle so the image stays in focus.
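
The worked example below is illustrative geometry only, not the patented optics design: tilting the smartphone assembly down by the display-plane offset lowers the centre of the camera’s field of view (and hence the interaction area), and the optical instrument plane must redirect the displayed image back toward the eyes by roughly that same offset. The 30-degree tilt and 50 cm hand distance are assumed example values.

```python
import math

# Assumed example values: a tilt within the 25-35 degree range described above,
# and a rough comfortable arm distance for the hands.
DISPLAY_PLANE_OFFSET_DEG = 30.0
HAND_DISTANCE_M = 0.50

# How far the centre of the interaction area drops at arm's length when the
# camera (and phone) is pitched down by the display-plane offset.
drop_at_hands_m = HAND_DISTANCE_M * math.tan(math.radians(DISPLAY_PLANE_OFFSET_DEG))

print(f"Interaction area centre drops by ~{drop_at_hands_m:.2f} m at arm's length")
print(f"Optics must redirect the image by ~{DISPLAY_PLANE_OFFSET_DEG:.0f} degrees toward the eyes")
```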

Patents Filed

3,058,447

"A SYSTEM FOR GENERATING AND DISPLAYING INTERACTIVE MIXED-REALITY VIDEO ON MOBILE DEVICES"

6,205,0432

"SYSTEM FOR INTERACTIVE ENVIRONMENTS"


Design & Experience

Target Personas

SpaceCard would be positioned as the entry-level VR experience for mass consumption. We wanted to focus on a target persona that was most open to using VR outside of a gaming setting, and based on research we decided to focus the product on students from K-12 to post-secondary. This demographic is particularly open to new technologies and modalities of interaction, with limited investment in existing systems. Use cases for SpaceCard would focus on introducing younger students to the metaverse in fun and collaborative spaces, while also catering to students in post-secondary education. College and university students needing to visualize their designs in retail, interior or architectural design courses would also benefit from this technology.

Industrial Design

Patent-pending optics and industrial design enable a full VR system to collapse into an envelope form factor that ships like lettermail, with a three-step pop-up setup ready for use within seconds. These two technologies put SpaceCard® at the very top of VR ‘price performance’ and simplicity, for an unmatched experience at 20X the scalability of the industry average. Zero-setup magnetic paddles, powered by machine-vision technology, give users VR hands for interacting with virtual spaces; users can grab or stow the paddles from either side of the visor during VR sessions.

Collapsible Hardware

For v1.0 we manufactured the visor using techniques from the packaging industry to cut and score all major components out of a single sheet of polypropylene plastic.

The origami-inspired, collapsible VR system is made of a single sheet of polypropylene plastic manufactured with a ‘package dieline’-inspired process. B2B customers can customize the look of SpaceCard® as part of a campaign to suit their brand, with a wide variety of traditional and sustainable materials. SpaceCard®’s patent-pending optics and industrial design allow the full VR system to collapse into an envelope form factor that ships like a postcard, with a three-step pop-up setup ready for use within seconds. The SpaceCard hardware system has two states:

  • Collapsed envelope state: a sub-10 mm thick collapsed state that enables SpaceCard to be stored and shipped in a letter-sized envelope at less than 200 grams, making the hardware eligible for fast, low-cost shipping with ease of transport and delivery.
  • Expanded visor state: a one step assembly process and pop-up design allows SpaceCard to be easily expanded into a full VR headset/visor that houses the smartphone and is worn by the user.

Multidisciplinary User Experience Design

SpaceCard UX design was a unique challenge because it required multi-level thinking to solve complex problems across many different areas. The user experience had to be solved across six user epics, all interconnected with each other.

  1. Visor Popup UX  The task to complete for the user here is assembly of the VR visor from the collapsed to expanded state.
  2. Mobile Mode UX  The task to complete for the user is to select a virtual space to begin a VR session.
  3. Controller Onboarding UX  The task to complete here is for the user to place the smartphone in the visor, grab the controllers and ‘tap’ or click a virtual button.
  4. VR Sandbox Mode UX The key task here is for the user to place and move virtual objects in the sandbox environment.
  5. Mobile Summary UX  The task here is for the user to remove the visor and take the phone out of the enclosure.
  6. Collaboration UX  The task here is for the user to invite others to join them in the virtual environment.

Design Around Limitations

The industry average VR session duration at the time was between 3 and 5 minutes for a first-time user. Because of thermal limitations, we knew there was a limit to how long VR sessions could run on the smartphone. Because of this limitation, we focused on creative solutions to continue the application experience beyond the VR session and have it bleed back into the ‘mobile’ mode. We called this the 'Dual Mode Back and Forth Operation'.

Mobile VR has a unique advantage over every other VR platform: it inherently enables one app to have two modes of operation based on device orientation.

  • Vertical  The first is the 2D, or mobile, mode with traditional handheld operation, where the device is in a vertical orientation.
  • Horizontal  The second is the VR, or immersive, mode where the device is ‘worn’ on the user's face in a horizontal orientation inside the VR enclosure.

Therefore we can use the VR session length limitation to our advantage by encouraging a back-and-forth experience between the vertical and horizontal orientations of the smartphone. This allows us to create a sense of focus around the VR session by explicitly placing a time limit on it, while also providing a continuation of the experience in the vertical orientation. This strategy works well with many different use cases: a task or selection is performed in mobile (vertical) mode, the experience continues in VR (horizontal) mode for immersive tasks, and then returns to mobile mode to provide supporting information, such as a session summary, a questionnaire or other tasks that augment the immersive experience or encourage a switch back to VR. We enabled this functionality at any point in the experience simply by changing the orientation of the device. The switch back to vertical also gives the smartphone an opportunity to cool down and restore its thermals for further VR sessions. To the user, this is presented as ‘taking a break’, with a 3-minute lockout from VR until the device is ready for the next session.
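
A small sketch of this dual-mode idea follows: device orientation decides whether the app is in mobile or VR mode, and a cooldown lockout follows each VR session. The 10-minute session timer and 3-minute break come from this case study; the state machine itself and the orientation hook are illustrative assumptions.

```python
# Hypothetical 'Dual Mode Back and Forth Operation' controller.
VR_SESSION_SECONDS = 10 * 60   # in-VR countdown described later in this case study
COOLDOWN_SECONDS = 3 * 60      # 'taking a break' lockout

class DualModeController:
    def __init__(self):
        self.vr_started_at = None
        self.cooldown_until = 0.0

    def on_orientation_change(self, is_horizontal, now):
        """Return the app mode for the current orientation and time (seconds)."""
        if not is_horizontal:
            if self.vr_started_at is not None:          # leaving VR: start cooldown
                self.cooldown_until = now + COOLDOWN_SECONDS
                self.vr_started_at = None
            return "mobile_mode"
        if now < self.cooldown_until:
            return "taking_a_break"                      # lockout until thermals recover
        if self.vr_started_at is None:
            self.vr_started_at = now
        if now - self.vr_started_at >= VR_SESSION_SECONDS:
            return "session_over_remove_visor"
        return "vr_mode"
```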

Visor Popup UX

This is the opening and unboxing experience of the product itself. The goal of this experience is to minimize the steps, friction and setup required to start a VR session. Smartphone VR systems typically have an assembly phase where the user is required to set up and ‘build’ the enclosure in a DIY fashion. Our aspiration with the UX was to completely outdo any other VR system when it comes to time-to-VR. We measured the effectiveness of this experience by the number of steps required for the user to have the visor on, in the VR state, starting from the collapsed envelope state. We devised a system requiring only three steps to full assembly, from the product’s collapsed state to its expanded visor state. This helped set the expectation for the rest of the user experience.

Mobile Mode UX

Once the visor is set up, the user is required to download the mobile app, which of course begins in the mobile or ‘vertical’ state. The mobile UX contained user stories that started with device calibration and, once completed, moved into the vertical version of the app designed around space selection. From a UI design perspective, we wanted to mimic an Instagram-like experience where various spaces would be shown in an infinite-scroll manner. Once selected, each space would be shown in a 3D preview that rendered the space from a dollhouse perspective, allowing the user to ‘jump in’ via a yellow walking symbol. This would begin the transition into VR mode.

Controller Onboarding UX

Once the user has selected to begin a VR session, they need to be onboarded with SpaceCard’s controller or ‘paddle’ system. This onboarding process gets the user comfortable and familiar with using the controllers in VR. Because the controllers were magnetically latched onto the sides of the visor, they were more a part of the visor design than standalone controllers like in most VR systems. We decided to first familiarize the user with the location of the controllers by having them grab the controllers and explore their shape and function before putting the headset on. This experience was guided by an animated tutorial with voice-overs while the smartphone was still in vertical mode.

Once the user was comfortable with the location and ergonomics of the controllers, they would be asked to place them back in their nested location and then put on the visor. The transition into VR started with the user placing the smartphone into the visor, and putting the visor on where a VR version of the controller onboarding would commence. Because the user already had familiarity with the controller and its location on the visor, we were able to dramatically reduce friction and task failure.

VR Sandbox Mode UX

Once the user has performed the necessary onboarding, they are ready to engage, interact and perform tasks within the VR environment. There were several key innovations in the VR experience that allowed it to flourish within the limitations we had.

User Interface Layout  The user interface was split into three vertical zones. The top zone, just above the user’s main line of sight, was dedicated to status information and menu toggles. The second, central zone was the interactive environment, which took up the majority of the central visual real estate. The third zone, just below the user’s main line of sight, was dedicated to locomotion.

Top Vertical Zone Menu & Countdown  The top vertical zone was used to engage the main menu and to focus the user on the time remaining in the session via a horizontal progress bar. These two components were always within sight and were engaged with a gaze gesture, to avoid the user lifting their hands into high, uncomfortable positions. The menu was used to manage objects in the space and change the perspective of the experience. Once the 10-minute counter expired, the user would be asked to take off the visor and take a break.

Central Vertical Zone Object Mechanics  The central vertical zone was used to tap, drag and interact with the virtual environment. Gestures could be used to manipulate the environment in ways never before possible in mobile VR. Objects also had their own interactive menus that provided simple functionality like move, delete, rotate, etc.

Bottom Vertical Zone Locomotion  Due to the lack of 6DoF tracking, we decided to use hand gestures to drive user movement within the virtual environment. We created a single button below the user's line of sight in the bottom vertical zone. Once this button was engaged, the user would be moved forward based on their hand movement. This was an extremely difficult technical challenge, as many users would be susceptible to motion sickness with Z translation in VR. We solved this problem by using hand Z-movement acceleration to drive the virtual camera movement in real time and simulate a ‘swimming’ experience. The gesture to move forward was similar to tapping a set of drums with both hands, and it created a comfortable and physiologically acceptable method of moving through the space while ‘fooling’ the user into thinking they were moving via a button press. The inclusion of stepping sound effects further reinforced the sensation of walking.
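
To illustrate the ‘swimming’ locomotion idea, the sketch below maps positive hand Z acceleration to forward camera speed while the locomotion button is engaged, with damping so the motion eases out smoothly. The gain and damping values are illustrative assumptions, not the shipped tuning.

```python
# Hypothetical 'swimming' locomotion: hand Z acceleration drives forward camera speed.
FORWARD_GAIN = 0.8    # metres/second of camera speed per unit of hand Z acceleration
DAMPING = 0.92        # per-frame decay so the camera glides to a stop

class SwimLocomotion:
    def __init__(self):
        self.forward_speed = 0.0

    def step(self, button_engaged, hand_z_accel, dt):
        """Return the forward camera displacement for this frame."""
        if button_engaged and hand_z_accel > 0.0:
            self.forward_speed += FORWARD_GAIN * hand_z_accel * dt
        self.forward_speed *= DAMPING
        return self.forward_speed * dt
```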

Perspective & Space Toggle  We felt it was critically important for the user to be able to easily manipulate the space they are in by offering various vantage and access points to teleport to. This need for easy movement was further enhanced by allowing the user to toggle between variations of the space, and between different spaces altogether, in a seamless manner.

Mobile Summary UX

Once the user has completed their VR session, the application offboards them from VR mode, prompting them to remove the visor, and provides a summary of their VR experience. This was a useful segue into picking up the phone and continuing the experience from a handheld perspective. We believe this back-and-forth mechanism provides many useful opportunities for the future.

Collaboration UX

Multi-user collaboration was an important part of the SpaceCard experience and needed to take advantage of as many smartphone modalities as possible to reduce friction. Because SpaceCard is fundamentally a mobile app, it can take advantage of native capabilities like notifications and pop-ups. We wanted to make inviting users into shared VR experiences as easy as making a phone call. The user interface for inviting people was split into two user stories: a) invite users via notifications, and b) share room codes and have users manually join. Every VR experience took place within a room session, and each session had its own unique room code that could be shared with and used by other SpaceCard users.

Likewise, hosts had the ability to send invitations to other SpaceCard users via app notifications, and invitees could either accept or decline the invite. During VR sessions, we used real-time head-tracking coordinates, location and hand position to create abstract avatars of each user. Users could also speak to one another in real time using the smartphone’s onboard loudspeaker and microphone. We utilized Unity3D-based Photon networking functionality to bring collaboration to life on SpaceCard. This feature would result in the world’s first mobile VR metaverse.
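
The sketch below illustrates the room-code flow described above: each session gets a short, shareable code that other users can enter to join manually. The code format and length are assumptions; the real SpaceCard sessions ran on Unity3D/Photon networking, which is not shown here.

```python
import secrets
import string

# Illustrative room-code generation and manual-join flow (assumed format/length).
ROOM_CODE_ALPHABET = string.ascii_uppercase + string.digits
ROOM_CODE_LENGTH = 6

def create_room_code():
    return "".join(secrets.choice(ROOM_CODE_ALPHABET) for _ in range(ROOM_CODE_LENGTH))

class RoomSession:
    def __init__(self, host):
        self.code = create_room_code()   # unique code shared by the host
        self.participants = {host}

    def join(self, user, code):
        """Manual join path: the user enters a shared room code."""
        if code != self.code:
            return False
        self.participants.add(user)
        return True

# Example: host shares the code, a second user joins with it.
session = RoomSession(host="host_user")
print(session.join("guest_user", session.code))  # True
```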


Launch & Showcase

Creative Brief

SpaceCard® is a mobile technology and pop-up VR system that delivers an ‘Instagram-like’ experience for exploring and jumping into VR spaces using your smartphone. Unlike other systems, SpaceCard® uses the smartphone camera and machine-vision technology to track the user’s hands. This allows users to interact in drag-and-drop environments, perform walkthroughs and configure shared spaces in real time. We needed to communicate that SpaceCard was a cutting-edge technology available to everyone at a low cost. One of the biggest challenges with VR in general is the stigma that the experience will be cumbersome and complicated. We wanted to avoid that by leaning on smartphone modalities, simple interfaces, clear calls to action and familiar terminology from the App Store world. At the end of the day, SpaceCard is just a mobile app.

Target Audience

We needed to distinguish between who our users are and who our customers are. Targeting the education space meant that our users would be students from ages 12 to 25 (K-12 to post-secondary) who would use the technology, while our customers would be the procurement officers, teachers and professors at those institutions. We needed to speak to the new generation of students to create pull for the product, but also address implementation and cost concerns with the schools and universities. Our marketing and launch strategy therefore needed two distinct voices to address both our users and our customers.

Brand

SpaceCard was a name derived from the fundamental concept of the product: a postcard for interactive spaces. We paired this name with an aesthetic and visual style reminiscent of NASA space programs, using black, yellow and silver to denote a futurist aspiration, as if the product came from outer space!

Web Experience

With such a complex product, with dozens of features and capabilities, we wanted to incorporate as many visual elements of the user interface as possible. With a heavy emphasis on animated GIFs, the web experience had many requirements to fulfill in explaining the product as simply as possible.

Launch Video

Much like our web experience, the launch video had to run through all six user epics within 60 seconds. This was no small feat, but we were able to include all the necessary content in the video with the help of environmental cues, real-time video captures and 3D animations. The idea was to imprint on the viewer the full capabilities of the product, while highlighting that it all came from a postcard and a mobile app.

Unity3D SDK

The software development kit program was an offshoot of our developer evangelism efforts that resulted in us building a full repository of documentation, videos and supporting material for developers building custom Unity3D apps for our platform.

Presentation Decks

As part of our product validation, we targeted Architectural Design schools, programs and courses along with commercial design studios to pilot the product and provide initial feedback. To do this we designed variations of our standard presentation deck that targeted various stakeholders and decision makers in the space.
