AR Studio

I N T R A T E R R E S T R I A L by Maya Pruitt

INTRATERRESTRIAL is an AR narrative experience about extraterrestrial communication that interweaves interactive augmented reality with cinematic storytelling.

How? This AR experience was designed using Vuforia and ARKit in Unity. Users become the main character of the story and are initially guided through video with subtitles that act as their inner thoughts. Once the supporting character of HQ is introduced, instructions for finding the next AR components are provided through her voice and UI elements on the screen of the user's device. The mechanics work like a scavenger hunt, inviting users to explore their realities to progress through the story.
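
Under the hood, this flow can be handled by something as simple as a list of story steps that advances every time the user finds the next AR component, updating the on-screen instruction and playing HQ's next voice line. The sketch below is only an illustration of that idea in Unity C#; the class and field names are placeholders, not the actual project scripts:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the scavenger-hunt flow: each AR discovery
// advances the story, plays HQ's next voice line, and updates the
// on-screen instruction. Names here are placeholders.
public class StoryManager : MonoBehaviour
{
    [System.Serializable]
    public class StoryStep
    {
        public string instruction;    // text shown in the UI
        public AudioClip hqVoiceLine; // HQ's spoken direction
    }

    public Text instructionText;
    public AudioSource hqVoice;
    public StoryStep[] steps;

    private int current = -1;

    void Start() => Advance();

    // Called whenever the user finds the next AR component.
    public void Advance()
    {
        current++;
        if (current >= steps.Length) return;

        instructionText.text = steps[current].instruction;
        hqVoice.clip = steps[current].hqVoiceLine;
        hqVoice.Play();
    }
}
```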

Why? This project seeks to push the boundaries of augmented reality not only in content but also in purpose. Each AR segment of the story is designed to create a more seamless augmentation: rather than having objects randomly appear in the user's physical space, Intraterrestrial seeks to change that space before the user's eyes. In addition, the AR segments serve as crucial plot points that drive the overarching story. While it embodies characteristics of the more passive experiences of film or video games, its ultimate goal is to combine the digital and physical realms and encourage users to actively venture out into New York City. For there may be extraterrestrials among us!

Intraterrestrial incorporates technical applications of Vuforia, ARKit, UI, 3D modeling, animation, texture mapping, image tracking, and ground plane recognition in Unity.


Read more about the development process here.

Final Project Progress: Final Video Draft by Maya Pruitt

First draft of the final video. This cut helped me see the main flow of the narrative. The AR components have not been added yet and are depicted with After Effects animations as placeholders. For the final version, I am making it a requirement for myself to showcase the AR components working with image tracking in real life.

UPDATE: After meeting with Mark, he gave me some major notes, and I worked actively to address them for the final documentation:
- slowed down the map animation
- made the shatter move inward to improve the illusion
- added a visual or audio cue when it switches to HQ on the sci-fi interface
- panned back to the map when HQ asks if you've seen anything strange lately
- got all the AR components working in real life!
- asked a friend and stranger to run away from the UFO to add some drama!

Final Project Progress: Details by Maya Pruitt

While developing the UFO, I remembered Mark saying that textures can be super effective. Rather than trying to do everything in 3D, I worked on creating a realistic texture to map onto the UFO ground model. The ground needs to resemble the final location of the scene, Union Square Park, and make it appear as if the hexagon tiles have really been uprooted. The texture is created from my own photographs combined with researched imagery of sinkholes.

Texture as flat UV map

Texture applied to UFO ground plane

I created the visual screen designs for the UI from scratch as well. In a future iteration, this would be fully interactive, which is why I created icons for a messages window and a map feature. The parallelograms on the right-hand side would serve as a progress bar, filling one at a time as the user discovers strange occurrences in the story.
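
In Unity, each parallelogram could simply be a UI Image that gets recolored when an occurrence is found. Here is a rough sketch of how that might be wired up; the names are placeholders, not actual project scripts:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the progress bar: each parallelogram is a UI
// Image, and one fills in whenever the user discovers a strange
// occurrence. Field names are placeholders.
public class ProgressBar : MonoBehaviour
{
    public Image[] parallelograms;   // assigned in the Inspector, left to right
    public Color filledColor = Color.cyan;

    private int discovered = 0;

    // Called from the story logic when an occurrence is found.
    public void MarkDiscovered()
    {
        if (discovered >= parallelograms.Length) return;
        parallelograms[discovered].color = filledColor;
        discovered++;
    }
}
```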



Final Project Progress: Demo Video by Maya Pruitt

This demo video shows scene one of the final video and the setup for the narrative. The first AR component is introduced in the context of the story. I also wanted to begin playing with the subtitle mechanism. It is important to me that the main character is the user, so this feels like the best way to give them their own voice. The demo was shown to Mark, classmates, and guest critics for feedback.
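
The subtitle mechanism itself can be as simple as a timed text loop in Unity. This is only a sketch of the general approach, with placeholder names, not the script used in the demo:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.UI;

// Illustrative sketch of the subtitle mechanism: lines representing the
// user's inner thoughts are shown one at a time for a fixed duration.
public class SubtitlePlayer : MonoBehaviour
{
    public Text subtitleText;
    public float secondsPerLine = 3f;

    [TextArea]
    public string[] lines;           // the user's "inner thoughts"

    void Start() => StartCoroutine(PlayLines());

    IEnumerator PlayLines()
    {
        foreach (var line in lines)
        {
            subtitleText.text = line;
            yield return new WaitForSeconds(secondsPerLine);
        }
        subtitleText.text = "";      // clear when the scene moves on
    }
}
```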


Final Project Progress: Testing ARKit by Maya Pruitt

Took a deeper look into Unity and ARKit this week. These are some tests of different functionality that may be incorporated into the final version.

Tracking multiple image targets:

Testing animated UFO:
So far it can be placed in the scene (with a tap) and the animation does run, but the lighting is off and it seems to be spawning over and over. Goal: fly in?
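
The endless spawning is most likely just the tap handler instantiating a new UFO on every touch, so a "placed" flag should fix it. Below is a rough sketch of guarded tap-to-place, assuming ARFoundation's ARRaycastManager (the older Unity ARKit plugin has an equivalent hit test); ufoPrefab is a placeholder name:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Sketch of tap-to-place that only spawns the UFO once.
public class PlaceUFO : MonoBehaviour
{
    public ARRaycastManager raycastManager;
    public GameObject ufoPrefab;

    private bool placed = false;   // guards against repeated spawning
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (placed || Input.touchCount == 0) return;

        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast against detected planes and place the UFO at the hit pose.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;
            Instantiate(ufoPrefab, pose.position, pose.rotation);
            placed = true;         // never spawn a second copy
        }
    }
}
```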

UI test (how HQ voice and subtitles would be displayed):


Final Project Progress by Maya Pruitt

This week I did some visual research to get an idea of the aesthetics of my project. I want the interactions with the HQ voice-over to feel very futuristic and spy-like.


I started working with ARKit and boy was I having a LOT of trouble. My main accomplishment was getting a successful build to the phone and getting the camera to open. For several tries the app would crash on launch.
Unfortunately, I am having trouble with image tracking now. I tried a script from a tutorial I watched on YouTube, but it produces no results. This video is just of getting the app to open; celebrate the small victories! But I'm sad it isn't responding to the image. Please, if anyone has resources on how to get a cube to appear triggered by an image target in ARKit, let me know!
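
One approach I plan to try is ARFoundation's ARTrackedImageManager: build a reference image library, then spawn the cube when the image is recognized. This is an untested sketch of what that script might look like (cubePrefab is a placeholder), and the tutorial I followed may have used a different, older ARKit plugin API:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: spawn a cube when an image from the reference library is detected.
// Attach next to the ARTrackedImageManager on the AR Session Origin.
[RequireComponent(typeof(ARTrackedImageManager))]
public class SpawnCubeOnImage : MonoBehaviour
{
    public GameObject cubePrefab;

    private ARTrackedImageManager imageManager;

    void Awake() => imageManager = GetComponent<ARTrackedImageManager>();
    void OnEnable() => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        // A new image was recognized: parent a cube to it so the cube
        // follows the printed image in the world.
        foreach (ARTrackedImage image in args.added)
        {
            Instantiate(cubePrefab, image.transform);
        }
    }
}
```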

Subsequent builds I tried kept changing my bundle identifier to "Unity-Target-New" when I didn't tell them to, and it says my iPhone architecture is unsupported. An hour before, this wasn't the case.

This is the error (still appearing even after following its directions to change the architecture settings). I'm very confused about what's going on.


That's all for now! My goal is to get image targets working before getting into the detailed animations. I want to test the responsiveness of image targets too: do PNGs work? Can an image represent all versions of an object that would appear in real life?

Week 5: Final Project Proposal by Maya Pruitt

Working title: Among Us

An AR narrative about alien contact with Earth. Using the theme of revealing the unseen, users would be prompted to find different locations in their realities and watch them be altered before their eyes. These scenes connect to reveal a final climax: extraterrestrials have already landed on Earth.

PRESENTATION DECK

Week 4: What's in the Box? by Maya Pruitt

AIR MAIL

The USPS collection and storage boxes are a common sight in NYC, so I began imagining what could possibly be inside. Not just mail, of course!

This narrative piece was a play on the Harry Potter world. In the USA, eagles deliver young city witches and wizards their letters of acceptance to the school of Ilvermorny. If you've revealed what's inside the box, maybe you'll get yours too :) The eagle is also an homage to the USPS logo.

ASSIGNMENT: Create a reveal using augmented reality. How can you make the unseen, seen?

PROCESS:

Initial test - getting the box to open and playing with the eagle animation.
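
One way to fire these animations is to listen for Vuforia's trackable state changes and set Animator triggers when the image target is found. The sketch below is illustrative only, assuming the Vuforia 8-era ITrackableEventHandler API; "Open" and "Fly" are placeholder trigger names, not the project's actual setup:

```csharp
using UnityEngine;
using Vuforia;

// Sketch: trigger the box-open and eagle animations when the Vuforia
// image target is detected or tracked.
public class PlayOnTargetFound : MonoBehaviour, ITrackableEventHandler
{
    public Animator boxAnimator;
    public Animator eagleAnimator;

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        if (newStatus == TrackableBehaviour.Status.DETECTED ||
            newStatus == TrackableBehaviour.Status.TRACKED)
        {
            boxAnimator.SetTrigger("Open");
            eagleAnimator.SetTrigger("Fly");
        }
    }
}
```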

Basic 3D Letter model made in Maya:
