My First Journey with Augmented Reality

This is my first project related to augmented reality. I took it as part of the Advanced Graphics and Interaction course at KTH Royal Institute of Technology.

This project aims to provide small movie productions with a collaborative tool for working with animated storyboards. Teams can edit complex scenes in real time using their phones and a PC, and then shoot a storyboard using the PC's graphics card.

In this article, I describe the main parts I worked on during the implementation phase of this project.

Networking

For this project, I was responsible for building the networking system of the Storybo-AR-ding app. Storybo-AR-ding is an augmented reality app for collaborative previsualization of movie scenes. To achieve the collaborative feature, the app needs to run on multiple phones that interact with each other, which means it requires a networking system.

After conducting some research, I found two candidate systems for implementing the networking:

  • Photon Unity Networking
  • Cloud Anchor

Photon Unity Networking (PUN) is a Unity package for multiplayer games. It provides functionality such as room creation, RPCs, and matchmaking. Communication is server-based, so all phones connect to a Photon server to communicate with each other [1].

Cloud Anchors can be used to create multiplayer or collaborative AR experiences by giving multiple phones the same perspective of the virtual world. As a result, all phones see virtual objects at the same positions even though the phones themselves are in different positions. Cloud Anchors work by sending relevant visual mapping data from the user's environment to Google's servers, where the data is processed into a sparse point map. This point map is then sent to the other phones so they can determine the exact location of a virtual object [2].

After researching both systems, I came up with two different approaches for the networking system.

  • Photon Unity Networking Approach


For this approach, I need to put multiple phones into the same room. Once the phones are connected to the same room, they can read the position, rotation, and scale of an object; this is done by putting a view component on the prefab we want to instantiate. To instantiate a game object on the server so that the other phones can see it, I need to use the PhotonNetwork.Instantiate function [3].
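As an illustration, here is a minimal sketch of such an instantiation call, assuming PUN 2; the prefab name "StoryboardProp" is a placeholder, and the prefab must live in a Resources folder with a PhotonView component attached:

using Photon.Pun;
using UnityEngine;

public class PropSpawner : MonoBehaviour
{
    // Name of a prefab inside a Resources folder; it must carry a PhotonView
    // so its transform can be synchronized across phones.
    // "StoryboardProp" is a placeholder, not the project's actual prefab.
    public string prefabName = "StoryboardProp";

    public void SpawnAt(Vector3 position, Quaternion rotation)
    {
        // PhotonNetwork.Instantiate creates the object on every client
        // in the room, not just locally.
        PhotonNetwork.Instantiate(prefabName, position, rotation);
    }
}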

Once the networking system works, the phones still need the same perception of the virtual world. My approach is to start the phones side by side, facing the plane together. Because the zero position of the augmented reality coordinate system is the phone's position at startup, the phones need to start in the same position and face the same direction to share the same understanding of the virtual world.

  • Cloud Anchor Approach

Cloud Anchors provide a host and a resolve method: one phone hosts the anchor, and the other phones resolve it to get the anchor's position and rotation from the hosting phone. However, a cloud anchor can only place one game object into the virtual world. To solve this problem, I came up with a new approach that combines Cloud Anchors with Photon Unity Networking. Once the anchor has been set, I know its location in the virtual world. To place another game object, I use the PhotonNetwork.Instantiate function and then make that game object a child of the cloud anchor, so it inherits the anchor's position as its parent [4].
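A minimal sketch of this combined approach, assuming the ARCore Unity SDK's cross-platform Cloud Anchor API and PUN 2 (the method structure is illustrative, not our exact code):

using GoogleARCore;
using GoogleARCore.CrossPlatform;
using Photon.Pun;
using UnityEngine;

public class SharedAnchorSpawner : MonoBehaviour
{
    private XPAnchor _cloudAnchor;  // the one anchor shared between phones

    // Host phone: turn a local ARCore anchor into a cloud anchor.
    public void Host(Anchor localAnchor)
    {
        XPSession.CreateCloudAnchor(localAnchor).ThenAction(result =>
        {
            if (result.Response == CloudServiceResponse.Success)
                _cloudAnchor = result.Anchor;  // its CloudId is shared with the other phones
        });
    }

    // Other phones: resolve the same anchor from its cloud id.
    public void Resolve(string cloudId)
    {
        XPSession.ResolveCloudAnchor(cloudId).ThenAction(result =>
        {
            if (result.Response == CloudServiceResponse.Success)
                _cloudAnchor = result.Anchor;
        });
    }

    // Any phone: spawn a networked object and parent it under the anchor,
    // so its position is expressed relative to the shared anchor [4].
    public void SpawnUnderAnchor(string prefabName)
    {
        GameObject go = PhotonNetwork.Instantiate(
            prefabName, _cloudAnchor.transform.position, Quaternion.identity);
        go.transform.parent = _cloudAnchor.transform;
    }
}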

Between these two approaches, we decided to go with the first one, because it gives better flexibility and more opportunities: with the Photon approach, users can work remotely or in the same location, whereas Cloud Anchors only work in the same location. We also still ran into errors where the cloud anchor could not connect to multiple phones, and it requires all phones to be on the same network, which would be a limitation for our system. In addition, the first approach performs faster and is better suited for collaborative work; we just need to align the phones so they share the same understanding of the virtual world.

The first approach worked really well with our plan and connected easily across multiple devices. To make the networking system work, I need to create a game object in the hierarchy and attach the connection script to it.

The important parts of this script are the RoomOptions, which set the number of players allowed in the room, and PhotonNetwork.JoinOrCreateRoom, which needs the same roomName on every device to connect them to the same room. The Debug.Log calls are also quite important: they report the current state, so we can tell whether we have connected to the room and how many players have joined.
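A minimal sketch of such a connection script, assuming PUN 2; the room name and maximum player count are placeholder values:

using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

public class NetworkLauncher : MonoBehaviourPunCallbacks
{
    public string roomName = "StoryboARding";  // must be identical on every device

    private void Start()
    {
        PhotonNetwork.ConnectUsingSettings();  // connect to the Photon cloud
    }

    public override void OnConnectedToMaster()
    {
        Debug.Log("Connected to master server, joining room...");
        RoomOptions options = new RoomOptions { MaxPlayers = 4 };
        // All devices that pass the same roomName end up in the same room.
        PhotonNetwork.JoinOrCreateRoom(roomName, options, TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        Debug.Log("Joined room with " + PhotonNetwork.CurrentRoom.PlayerCount + " player(s)");
    }
}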

Ownership

To create the collaborative experience, we need to give users the freedom to control any object in the AR environment. However, the default pattern in the networking system is that a user can only control objects they created themselves, not other users' objects. To overcome this, I changed the ownership setting of the game object's view component to takeover: when a user tries to control another user's object, ownership is automatically transferred to them. With this method, we achieved the collaborative feature.
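A minimal sketch of the takeover idea, assuming PUN 2, where the PhotonView's ownership transfer option is set to Takeover:

using Photon.Pun;
using UnityEngine;

public class TakeoverOnGrab : MonoBehaviourPun
{
    private void Awake()
    {
        // With Takeover, any client may claim ownership without asking
        // the current owner (this can also be set in the Inspector).
        photonView.OwnershipTransfer = OwnershipOption.Takeover;
    }

    // Called when this user starts manipulating the object.
    public void OnGrab()
    {
        if (!photonView.IsMine)
        {
            // Ownership switches to the requester, so any user can move
            // an object that another user created.
            photonView.RequestOwnership();
        }
    }
}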

Connecting Different Platforms

Besides the phone version, we also developed a PC version. Its purpose is to act as a controller of the scene and to give an overview of the world the users have created with their phones. To achieve this, I needed to install the Photon Unity package in the PC project. Because we cannot use Google's ARCore framework on PC, I had to write the corresponding behavior from scratch in Unity: creating a plane based on a game object's position, deleting an object so that it is also deleted in the phone version, and reading object positions from the phones to instantiate them in the PC version.

  • Creating the plane is done by reading the first object a user instantiates: I take that object's position and instantiate a plane under it.
  • To delete an object on the server, I need to call PhotonNetwork.Destroy on the game object the user wants to delete. The user picks the game object with raycasting: when the user clicks an object, a ray is cast, and if the ray collides with an object, it returns the name of that game object. Knowing the object, we can delete it; a sketch of this is shown after the list.
  • We need to keep game objects attached to the plane in the phone version. That way, the objects in the PC version are also attached to the plane, so the phone and PC versions share the same understanding of positions in the virtual world.
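Here is a minimal sketch of the delete-by-raycast step, assuming PUN 2; note that PhotonNetwork.Destroy only succeeds for objects the caller owns (or, for the master client, any object):

using Photon.Pun;
using UnityEngine;

public class NetworkObjectDeleter : MonoBehaviour
{
    private void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Cast a ray from the mouse click into the scene.
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit))
            {
                PhotonView view = hit.collider.GetComponentInParent<PhotonView>();
                if (view != null)
                {
                    // Destroys the object on every client, so the phones
                    // see the deletion too.
                    PhotonNetwork.Destroy(view.gameObject);
                }
            }
        }
    }
}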

However, after the implementation we faced a lot of problems, especially with keeping the Resources folders consistent. To connect the two platforms, we need to store the same prefabs in the Resources folder of both projects, so that Photon networking can instantiate the same object by name. During development we sometimes hit errors because a file had changed and differed from the other version. To solve this, I cleaned up both projects so they contained the same object files, by going through both versions and comparing the objects for differences. I found an interesting gap here: when developing an application across different platforms that need to communicate with each other, Unity still does not provide a feature for this, for example letting two projects share the same folder, because each project normally has its own folder.

Feedback from Users

We received several pieces of feedback from the audience at Forskar Fredag and from our partner Previs. Most of the audience feedback asked us to add some UI explaining how to move, scale, and rotate objects; right now, we use a text shown before the app starts to explain this. Previs gave feedback about object precision and about how this app could enhance collaborative work in movie production. They were interested in exploring whether AR can give accurate measurements of an object, since that precision would help with planning the set before the shoot. Also, sometimes not all of the crew can attend a previsualization meeting, and this app could keep everyone updated and give the whole crew the ability to collaborate without attending. Based on this feedback, we plan to focus on building a better UI and smoothing the interaction, and we also plan to add the capability to record a video of the scene.

Goals Achieved

  1. Make objects move only inside the plane

During the implementation, when we tried to move an object outside the plane, the app threw an error and the object disappeared. To solve this, we need to know how ARCore represents the detected surface plane. Based on the ARCore documentation, we can use the GoogleARCore.TrackableHit type, which contains information about a raycast hit against a physical object tracked by ARCore [5]. One of its properties is called Trackable. By checking TrackableHit.Trackable and verifying that its value is a DetectedPlane, we can restrict the object to move only within that area. DetectedPlane is another ARCore type: a planar surface in the real world detected and tracked by ARCore [6].
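A minimal sketch of this check, assuming the GoogleARCore Unity SDK (the MoveTo method name is illustrative):

using GoogleARCore;
using UnityEngine;

public class PlaneBoundMover : MonoBehaviour
{
    // Move the object to the touch position, but only if the raycast
    // hit a surface that ARCore tracks as a DetectedPlane [5][6].
    public void MoveTo(Vector2 touchPosition)
    {
        TrackableHit hit;
        if (Frame.Raycast(touchPosition.x, touchPosition.y,
                          TrackableHitFlags.PlaneWithinPolygon, out hit))
        {
            if (hit.Trackable is DetectedPlane)
            {
                transform.position = hit.Pose.position;
            }
            // If the hit is not on a detected plane, the object stays put,
            // which prevents it from disappearing outside the plane.
        }
    }
}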

  2. Show multiple camera displays for better previsualization

This was done by extending a function developed by another group member. At that point, we had a function to place multiple cameras using AR, and the cameras were also instantiated in the PC version. However, the cameras in the PC version had no preview mode showing what they would shoot. To solve this, I used a RenderTexture for each placed camera: a RenderTexture provides a texture containing whatever the camera sees. I put that texture on a quad that works as a screen, so we get a camera preview in the PC version.
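A minimal sketch of the RenderTexture preview; the resolution and field names are placeholders:

using UnityEngine;

public class CameraPreview : MonoBehaviour
{
    public Camera placedCamera;   // the camera instantiated from the phone
    public Renderer screenQuad;   // a quad in the PC scene acting as a screen

    private void Start()
    {
        // Render what the placed camera sees into a texture...
        RenderTexture preview = new RenderTexture(512, 512, 16);
        placedCamera.targetTexture = preview;

        // ...and show that texture on the quad that works as a screen.
        screenQuad.material.mainTexture = preview;
    }
}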

 

References

[1] Photon Engine. PUN Intro. https://doc.photonengine.com/en-us/pun/current/getting-started/pun-intro

[2] Google. Cloud Anchors overview for Android. https://developers.google.com/ar/develop/java/cloud-anchors/overview-android

[3] Photon Engine. Instantiation. https://doc.photonengine.com/en-us/pun/current/gameplay/instantiation

[4] Unity. Transform.parent, Unity Scripting API. https://docs.unity3d.com/ScriptReference/Transform-parent.html

[5] Google. (2018, August 2). GoogleARCore.TrackableHit. https://developers.google.com/ar/reference/unity/struct/GoogleARCore/TrackableHit

[6] Google. (2018, August 2). Unity API Reference for ARCore. https://developers.google.com/ar/reference/unity/
