Multi-User Classroom Environment in Virtual Reality - Creating and Implementing new Features


                          Filip Renman

       Computer Game Programming, bachelor's level
                         2020

                     Luleå University of Technology
      Department of Computer Science, Electrical and Space Engineering
               Filip Renman
               Bachelor Programme in Computer Engineering
               Department of Computer Science, Electrical and Space
               Engineering
               Supervisors: Teemu Laine & Henrique Souza Rossi
               Luleå University of Technology
               Skellefteå 2020
Summary
A research group from Luleå University of Technology in Skellefteå is working on creating a virtual classroom to be used for educational purposes. In this classroom, a teacher should be able to hold a lesson for their students in an intuitive and smooth way. In this part of the project I have focused on implementing various features for the classroom using the game engine Unity. The result is a program in which a teacher can create a virtual classroom that students can connect to. User tests have been performed to evaluate the program's user-friendliness, and they showed that the classroom was functional and that it did not induce any feeling of motion sickness, so-called cybersickness.

Abstract
A research group from Luleå University of Technology in Skellefteå is working to create a virtual
reality classroom to be used for educational purposes in mining education. In this classroom, a
teacher should be able to hold a lesson for their students in an intuitive and pleasant way. In this part
of the project I have focused on implementing functionality to the classroom with the help of the
game engine Unity. The result is a program where a teacher can now create a virtual classroom that
students can connect to. User tests have been performed to verify the user-friendliness of the
program, which showed that the classroom was functional and that it did not cause any feeling of
motion sickness, known as cybersickness.
Abbreviations
API      Application Programming Interface
LTU      Luleå University of Technology
PTSD     Post-traumatic stress disorder
SDK      Software Development Kit
UI       User Interface
VR       Virtual Reality
Table of contents
1. Introduction
        1.1 Background
        1.2 User interface
        1.3 Cybersickness
2. Goals and purpose
3. Materials and methods
        3.1 User testing
        3.2 Limitation
        3.3 Scrum
4. Results
        4.1 Implemented features
                4.1.1 Annotations
                4.1.2 Highlighting
                4.1.3 Animations
                4.1.4 Student status list
                4.1.5 Raise hand
                4.1.6 Teleportation
                4.1.7 Calling for attention
                4.1.8 Drawing
                4.1.9 Permissions
        4.2 User test results
5. Discussion
        5.1 Features
        5.2 User test
6. Conclusion
7. Future perspectives
8. Social, ethical and environment considerations
9. Acknowledgements
10. References
Appendix 1
Appendix 2
1. Introduction
The concept of virtual reality (VR) has been around for a long time [1, 2], but has regained a lot of
popularity during the last decade [3]. VR refers to the use of computer technology to create a
simulated environment in which the user is immersed. Instead of viewing this environment through a
computer screen as we have done traditionally, the user wears a VR headset [4], a head-mounted
display that shows two different images, one for each eye. The VR headset projects the simulated
world so that it feels like you are standing in it; moving your head moves the camera in the
environment. To further immerse the user, motion-tracked controllers can be used to interact with
objects, and a pair of speakers or headphones can provide directional sound.

The possibility to immerse a user in a virtual environment opens up many interesting uses of the
technology. One of the more commonly known uses of VR is in entertainment, especially in video
games [5]. The appeal of being immersed in a fantasy universe, seeing your favorite game characters
standing in front of you, and being able to fight and interact with them on the battlefield is
something no other medium can achieve [6]. Other areas also use VR. The military has used VR to
train recruits, walk them through different scenarios and help them familiarize themselves with
complex military vehicles in a safe and controlled environment [7]. VR has also been used in
medicine to treat post-traumatic stress disorder (PTSD) patients through exposure therapy, the
practice of recalling a traumatic memory while talking to a therapist. In the case of war veterans, it
might be hard to re-create specific scenarios from their time in service, but with VR you can recreate
that moment and allow the patient to relive the past in the safety of the therapist's office [8, 9]. The
ability to create different scenes and scenarios could have great potential in education, as it would
allow students to travel to different locations and practice in a safe and controlled environment [10].

1.1 Background

This project is part of an already ongoing EIT Raw Materials project, MiReBooks, to create a virtual
classroom for educational purposes [11]. A teacher will be able to take the class to different virtual
environments and hold a lecture there. The teacher should not feel restricted by missing the
functionality necessary to hold a proper lecture. The virtual classroom has been developed in Unity
[12] and the targeted virtual reality headset is the Oculus Quest [13]. A few features were already in
place, such as basic teleporting movement controls, a server-client set-up and some networking to
sync the users' positions in the environment. There were also some graphical assets available, such
as machine models used to decorate the current environment and some user interface (UI) icons for
a menu system.

1.2 User interface

To create a UI for a VR environment, some adjustments compared to a normal desktop UI must be
made. When using a VR headset there is no conventional screen to point at, so a screen-space
interface cannot be used; any interaction between the user and the UI must instead happen in world
space. This is easily done by placing a canvas object in the scene and setting its render mode to
world space, which places the canvas in the world. The user can then interact with the UI by casting
a ray from one of the

controllers to see where it intersects the canvas. Oculus already has this feature implemented in
one of the scripts that come with the Oculus SDK for Unity [14].
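Conceptually, the controller-to-canvas interaction boils down to a ray-plane intersection test. The following sketch (in Python for brevity; the project itself uses Unity C# and the Oculus SDK, whose actual implementation differs, and all names here are illustrative) shows the underlying math:

```python
# Illustrative ray-plane intersection: where does the controller ray hit the
# plane of a world-space canvas? Vectors are (x, y, z) tuples.
def ray_plane_hit(ray_origin, ray_dir, plane_point, plane_normal):
    """Return the intersection point, or None if the ray misses the plane."""
    denom = sum(d * n for d, n in zip(ray_dir, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the canvas plane
    # solve (ray_origin + t * ray_dir - plane_point) . plane_normal == 0
    t = sum((p - o) * n for p, o, n in zip(plane_point, ray_origin, plane_normal)) / denom
    if t < 0:
        return None  # canvas is behind the controller
    return tuple(o + d * t for o, d in zip(ray_origin, ray_dir))
```

Once the hit point is known, it can be converted into canvas-local coordinates to decide which UI element was targeted.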

Placement of UI elements, while important on desktop, is even more important in VR, since the user
must physically move their head and body to look at them. Placing UI elements in a bad position
forces the user into poor ergonomics to read or interact with the UI. For example, if the UI is too far
down, it forces the user to bend their neck, causing unnecessary pressure on the spine, which could
potentially lead to injuries [15]. Placing UI elements at eye level and in a way that promotes good
ergonomics is the way to go.

Because of the display resolution of VR headsets, UI elements that look crisp and clean on a regular
monitor will look pixelated when viewed through the VR headset. This means that text can become
hard to read if it is a big block of text or if the font size is too small. Highly detailed UI elements will
also create high levels of aliasing, which can be disturbing to look at. It is also important to think
about how the UI elements are placed and designed to counteract so-called cybersickness, which is
covered in the section below [16].

1.3 Cybersickness

Symptoms similar to motion sickness are common among inexperienced VR users. This so-called
cybersickness has to be considered while developing the product. Cybersickness [17] is caused by a
disconnect between what the eyes see and what the body perceives. When the user is walking
around in the VR environment, the eyes register this as the user moving, and the body expects to
experience movement too. When it does not, the user may experience symptoms similar to motion
sickness [18]. To prevent users from experiencing cybersickness, the user should only need to
perform actions that cause minimal disconnect between eyes and body [17].

Since cybersickness is a big problem for VR developers, a lot of research has been done, which has
led to guidelines to aid developers in creating VR software [19]. Hardware manufacturers have
acknowledged these studies and publish their own guidelines, tailored to their products, on their
websites [20].

2. Goals and purpose
The purpose of this project is for two students to jointly create an educational environment in VR
where a teacher can hold a lecture for students. With the help of the VR environment, the teacher
does not have to hold the lecture in a standard classroom but can instead take the class to, for
example, a mine where they can look at mining equipment and interact with mining vehicles. One of
the students will focus on the functionality and the other on the networking. The goals of this
project are to create features that allow:

    •   The teacher to give out permissions to students to use certain features in the classroom. The
        teacher will also be able to revoke these permissions at any time.
    •   The teacher to highlight objects in the scene, play animations and bring up annotations for all
        students to see. Students may bring up the annotations and play animations locally.
    •   The teacher to see the status of all the connected students and to be able to see if someone
        disconnected from the classroom.
    •   The students to raise their hand to get attention from the teacher.

    •   The teacher to teleport the whole class to a specific point. The teacher can also toggle an
        automatic follow event which teleports students closer to the teacher if they are too far
        away.
    •   The teacher to call for attention, which will display arrows that guide the students' view to
        the teacher.
    •   The teacher to draw lines in the environment.

3. Materials and methods
This work will be a further development of an already ongoing project at Luleå University of
Technology (LTU) [11]. When this project started, a backlog that contained several features that had
to be implemented to the project was already created. These features were divided between two
students and therefore this report will only cover certain parts of the project, mainly the
functionality.

The virtual classroom was originally created in the game engine Unity [12] version 2019.2.2f1, and
development continued in the same engine and version. Some of the assets and functionality were
already implemented ahead of time. These include basic movement in the VR environment,
interaction with UI components in VR with the help of the Oculus SDK [14] for Unity, and some
textures and models to place in the classroom. In addition, lobby creation and transform-syncing
functionality was already in place, provided by a handwritten networking Application Programming
Interface (API). In this work it was replaced with the networking API Mirror [21].

The targeted platform for this project is the Oculus Quest [13], a standalone VR headset with a pair
of wireless controllers. This headset suits the project well, since its wireless nature makes it more
practical in a school environment. During development, mouse and keyboard controls were
introduced to help test the program without the Oculus Quest. This means that the VR environment
can also be used on desktop, but that mode is heavily unpolished and some VR-specific features do
not function properly.

3.1 User testing

Since a larger-scale test with the targeted userbase for this project was not possible due to Covid-19,
the tests were designed around collecting data about the general user experience, for example
whether cybersickness occurs and how intuitive the functionality in the virtual environment is.
Following the government's recommendations to reduce the risk of spreading Covid-19, the tests
were performed on three family members, one user at a time, using the Oculus Quest, and they
were monitored by a developer who connected to the session as a student via desktop. All the
participants experienced the system from the teacher's perspective, which gave them the
opportunity to try its various features in the environment. During the test, the users were asked to
perform simple tasks in the environment such as highlighting an object or bringing up an annotation.
The participants performed the tasks twice: first standing up, then sitting in an office chair.

The tasks that the participants were asked to perform were as follows:

      •   Start a new class session.
      •   Open an annotation and go to page 5.
      •   Highlight a truck in the scene.
      •   Start an animation of one of the trucks.
      •   Teleport the student to a specific position.
      •   Give and revoke permissions from the student.

When the participants were done with all the tasks, they answered these four questions about their
thoughts on the environment (Appendix 1):

1a.              Did you ever feel motion sick during the test?
1b.              If yes, when did you feel the motion sickness started?
2a.              Was there something you felt was missing?
2b.              If yes, what did you feel was missing?
3a.              Was there a task that was hard to complete?
3b.              Explain what was hard.
4.               Was it better standing up or sitting down?

3.2 Limitation

Because of the ongoing COVID-19 pandemic, all group members were working from home. To keep
all members of the project up to date, a meeting was held each week to inform everyone of the
progress and to give the opportunity to discuss problems that had occurred during the week.

User testing was supposed to take place during the development of the project in order to get
feedback from the targeted userbase. Due to the COVID-19 situation that arose during development,
testing the project to the extent that was planned was not possible. Testing could still be performed,
but on a smaller scale. The tests' new focus became the user experience and how intuitive the
controls feel.

3.3 Scrum

During the project we have used the agile workflow Scrum. Scrum is a framework that was initially
designed with software development in mind but has been adopted in other industries to help
teams work together. In the Scrum workflow, a backlog of tasks is created and used during a period
called "sprint planning" to decide which tasks will be worked on during the upcoming days or weeks.
This upcoming period is called a "sprint", and each day during a sprint a short meeting is held to
inform all members of how progress is going. After a sprint, a "sprint review" is held where the
product is presented to the product owner. Afterwards, a "sprint retrospective" is held where the
team documents and discusses what did and did not work during the sprint [22].

4. Results

4.1 Implemented Features

4.1.1   Annotations

Annotations are game objects that can be placed in the scene to bring up notes for everyone to see.
An annotation can contain multiple pages, each with a unique title and notes. The teacher can bring
up an annotation that is displayed to all students in the classroom but that only the teacher can
interact with. The students can bring up a local version of the annotation, visible only to them, if
they have permission from the teacher to do so.

The annotation consists of:

    •   A model which will act as a button to toggle the annotation on and off.
    •   Two canvases with two text objects. One for the title and one for the notes.
    •   Two buttons on each canvas to go to the next page or the previous page.

One of those canvases is network-synced, so when the button is pressed by the teacher, the canvas
becomes visible to every student in the classroom. If the teacher changes the page, it changes for
everyone else too. If a student joins the classroom late or reconnects, the annotation is
automatically synced to the teacher's state. If the teacher has granted the students a special
permission, the students can bring up a local version of the same annotation. This annotation is only
visible to the student and has the same functionality as the global one (Figure 1).

Figure 1.      Example of an annotation. When a teacher presses the yellow exclamation object, a
               global annotation will be brought up. If a student does it instead, a local version will be
               brought up but only if the student has permission to do so.
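The paging and visibility state described above can be sketched as follows (a simplified Python sketch of the logic only; the actual project implements this in Unity C# with network syncing, and all names here are illustrative):

```python
class Annotation:
    """Simplified model of an annotation with multiple titled pages."""
    def __init__(self, pages):
        self.pages = pages       # list of (title, notes) tuples
        self.current = 0         # index of the page being shown
        self.visible = False     # toggled by pressing the annotation model

    def toggle(self):
        self.visible = not self.visible

    def next_page(self):
        # clamp at the last page instead of wrapping around
        self.current = min(self.current + 1, len(self.pages) - 1)

    def prev_page(self):
        self.current = max(self.current - 1, 0)

    def state(self):
        # the state a late-joining student would be synced to
        return (self.visible, self.current)
```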

4.1.2   Highlighting

To easily call attention to an object in the scene during the lecture, the teacher can highlight an
object, giving it a light blue outline. The highlighting is network-synced and can be toggled on and
off by the teacher.

To aid the teacher in giving out permissions to the students, a highlighting effect is also applied to
the students' player models. The student gains the same light blue outline, but also a solid gray
texture instead of the normal white-gray transparent texture. This highlight is only visible to the
teacher and only appears while the teacher hovers the pointer over one of the students (Figure 2).

Figure 2.      Before and after using the highlighting functionality on an object and the hover
               highlighting on a student.

4.1.3   Animations

Certain game objects in the scene have animations attached to them. These animations can be
played by pressing the play button located above the game object. If a teacher starts the animation,
it is played for all students at the same time. If the students have been given a certain permission by
the teacher, they can play the animation locally. If the teacher starts the global animation while a
local animation is running, the local animation is reset and synced to the global animation (Figure 3).

Figure 3.      Example of a truck with an animation that can be played via the play button above the
               truck.

4.1.4   Student status list

To assist the teacher in keeping track of the students and checking whether someone has
disconnected from the classroom, a display of all the connected students has been placed at the
teacher's feet. The display starts off blank and fills up as students connect to the classroom. Each
student is represented by an icon that contains:

    •    A text object that displays the device ID of the machine the student is currently using.
    •    A text object that displays the distance between the teacher and the student.
    •    An image that acts as a status icon, showing whether the student is connected to or has
         disconnected from the classroom.

When a student connects to the classroom for the first time in a session, they are immediately
added to the display. The device ID of the student's machine is stored as the key in a dictionary on
the server side, and the value contains a struct with a reference to the student game object and the
icon on the display. If the student leaves the classroom, the status icon turns red, indicating that the
student has lost connection, and the distance between the student and the teacher stops updating.
If the student reconnects, the device ID is checked against the dictionary. If it is already there, the
stored struct is updated to reference the new student game object and the status icon becomes
green again (Figure 4). A code snippet can be viewed in Appendix 2.

Figure 4.      Example of the status list. The image on the left is when the student is online and the
               image on the right is when the student has disconnected from the classroom.
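The reconnection handling described above can be sketched roughly as follows (a Python sketch of the server-side dictionary logic; the actual code is Unity C#, see Appendix 2, and all names here are illustrative):

```python
from dataclasses import dataclass

@dataclass
class Entry:
    student: object     # reference to the student game object
    icon: object        # reference to the icon on the status display
    connected: bool

registry = {}           # device ID -> Entry, kept on the server side

def on_connect(device_id, student, make_icon):
    if device_id in registry:
        # reconnect: keep the existing icon, reference the new game object
        entry = registry[device_id]
        entry.student = student
        entry.connected = True      # status icon becomes green again
    else:
        registry[device_id] = Entry(student, make_icon(device_id), True)

def on_disconnect(device_id):
    # status icon turns red; the distance text stops updating
    registry[device_id].connected = False
```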

4.1.5   Raise hand

Students in the virtual environment can raise their hand to attract the attention of the teacher. This
does not require the student to physically raise their hand in the real world; instead, they press a
button which plays a looping animation of a red hand waving above their head. Once the student
has the teacher's attention, the student can take their hand down or the teacher can do it for them
(Figure 5).

Figure 5.      Example of the raise hand feature.

4.1.6   Teleportation

To ease the work of keeping the class together in the virtual environment, the teacher can teleport
all students to a specific location by aiming the controller pointer at a position and pressing a
button. The students' screens fade to black before teleporting and fade back in once they have
arrived at the location. This is to prevent cybersickness from a sudden teleport. Upon arriving at the
selected location, the students always face towards the teacher.

The teacher can also prevent students from traveling too far away from the teacher. This can be
toggled on and off by the teacher at any time. If a student travels too far away, the student's screen
fades to black and they are teleported closer to the teacher, with the camera facing towards the
teacher. If the student would end up inside a game object, the student is instead teleported to the
teacher's position, to prevent situations where a student might get stuck inside an object.
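The automatic follow behavior described above can be sketched like this (an illustrative Python sketch; the follow radius and the 90 % factor are assumptions, not values from the project):

```python
import math

MAX_DISTANCE = 10.0  # assumed follow radius

def distance(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def follow_target(student_pos, teacher_pos, blocked):
    """Where should the student be teleported? None means stay put.

    `blocked(pos)` reports whether a position lies inside a game object.
    """
    d = distance(student_pos, teacher_pos)
    if d <= MAX_DISTANCE:
        return None  # close enough, no teleport needed
    # move the student to a point just inside the follow radius
    t = 1 - (MAX_DISTANCE * 0.9) / d
    target = tuple(s + (te - s) * t for s, te in zip(student_pos, teacher_pos))
    if blocked(target):
        return teacher_pos  # fall back to the teacher's own position
    return target
```

The screen fade would wrap the actual teleport: fade to black, move the player, then fade back in.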

4.1.7   Calling for attention

The teacher can call for attention at any time. Enabling this option causes arrows to appear on the
students' screens that guide their view to the teacher. While the teacher is in a student's view the
arrows disappear, and they reappear once the teacher leaves the student's viewport (Figure 6).

Figure 6.      Two examples of the calling for attention feature from the student's point of view.
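The check for whether the teacher is in a student's view can be sketched as an angle test against the camera's forward vector (an illustrative Python sketch; the field-of-view value is an assumption, and `camera_forward` is assumed to be unit length):

```python
import math

def in_view(camera_pos, camera_forward, teacher_pos, fov_degrees=90.0):
    """True when the teacher is inside the camera's viewing cone."""
    to_teacher = [t - c for t, c in zip(teacher_pos, camera_pos)]
    length = math.sqrt(sum(v * v for v in to_teacher))
    if length == 0:
        return True
    # cosine of the angle between the view direction and the teacher
    dot = sum(f * v / length for f, v in zip(camera_forward, to_teacher))
    angle = math.degrees(math.acos(max(-1.0, min(1.0, dot))))
    return angle < fov_degrees / 2

def show_arrows(camera_pos, camera_forward, teacher_pos):
    # arrows appear only while the teacher is outside the student's view
    return not in_view(camera_pos, camera_forward, teacher_pos)
```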

4.1.8    Drawing

To further aid the teacher in bringing attention to certain objects in the environment, the teacher
can draw lines on objects that have a collider. The lines drawn by this tool are displayed on the
surface of the collider. The drawing tool has parameters for how far from the collider surface the
line is drawn, the line's material, and its width. The teacher can remove drawn lines at any time. The
drawing is synced over the network, so the students can see the lines drawn by the teacher
(Figure 7).

Figure 7.      Example of the drawing feature.
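The offset-from-the-collider parameter mentioned above typically serves to keep a drawn line from flickering against the surface it lies on: each point is lifted slightly along the surface normal (an illustrative Python sketch; the project does this with Unity raycast hits, and the names here are assumptions):

```python
def offset_point(hit_point, hit_normal, surface_offset=0.01):
    """Lift a drawn point slightly off the collider surface along its normal."""
    return tuple(p + n * surface_offset for p, n in zip(hit_point, hit_normal))
```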

4.1.9    Permissions

The students do not have many features they can use when they first join the session. To gain
access to more features, the students need special permissions, which the teacher gives out by
aiming at a student with the right controller and pressing a button. The teacher can give out
permissions to open a local version of an annotation and to play an animation locally. If the teacher
wants to revoke a student's permissions, they can do so by aiming at the student again and pressing
the same button.
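One common way to represent such per-student permissions is a bitmask per device (an illustrative Python sketch; the flag names and storage are assumptions, not the project's actual implementation):

```python
# permission flags
ANNOTATION = 1   # may open a local version of an annotation
ANIMATION = 2    # may play animations locally

permissions = {}  # device ID -> bitmask of granted flags

def toggle(device_id, flag):
    # the same button grants and revokes, so XOR the flag
    permissions[device_id] = permissions.get(device_id, 0) ^ flag

def allowed(device_id, flag):
    return bool(permissions.get(device_id, 0) & flag)
```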

4.2     User test results

Three people participated in the user test. The demographic data of the participants can be seen in
Table 1. After they had completed all the tasks, both standing up and sitting in an office chair, they
answered the questionnaire. The results from the questionnaire are shown in Table 2.

Table 1.           Demographic data of the test participants.

                                                       Person 1                    Person 2                   Person 3
 Gender                                                female                      female                     male
 Age (years)                                           19                          50                         61
 Education* (years in school)                          11                          >11                        11
 Occupation                                            Student                     Worker (nurse)             Worker (banker)
 Computer experience (scale 1-10)**                    8                           2                          5
* Compulsory school + upper secondary school = 11 years; compulsory school + upper secondary school + university level > 11 years.
** Scale 1-10, where 1 stands for "none" and 10 for "a lot" of computer gaming experience.

Table 2.           Answers on the questionnaire – Person 1.

                                                                    Yes            No               Comments (n)
 1. Did you ever feel motion sick during the                                       x                -
 test?
 2. Was there something you felt was                                               x                -
 missing?
 3. Was there a task that was hard to                                              x                -
 complete?
                                                                    Standing Sitting                No differences
 4. Was it better standing up or sitting down?                                                      x

Table 2.           Answers on the questionnaire – Person 2.

                                                                    Yes            No               Comments (n)
 1. Did you ever feel motion sick during the                                       x                -
 test?
 2. Was there something you felt was                                x                               Instruction display (1)
 missing?
 3. Was there a task that was hard to                               x                               Teleportation (1)
 complete?
                                                                    Standing Sitting                No differences
 4. Was it better standing up or sitting down?                               x

Table 2.           Answers on the questionnaire – Person 3.

                                                                    Yes            No               Comments (n)
 1. Did you ever feel motion sick during the                                       x                -
 test?
 2. Was there something you felt was                                               x                -
 missing?
 3. Was there a task that was hard to                                              x                -
 complete?
                                                                    Standing Sitting                No differences
 4. Was it better standing up or sitting down?                      x

5. Discussion
5.1 Features

Since this work is a further development of an already ongoing project, its leader had already
created a backlog of the features that had to be implemented when this part of the project started.
The project leader also already had a vision of how the features should work and look, which gave
few opportunities to create anything of my own. When this part of the project started, the workload
was divided between two students: me and my student colleague. My task was to focus mainly on
the functionality, and my intention when creating the features was that they should be intuitive and
that the experience in the virtual environment should be pleasant. My colleague's main task was to
focus on the networking part of the project. As we got further into the development, it became clear
that it was hard to divide the tasks between functionality and networking, since many features also
had to be networked, which led both of us into each other's working areas. Therefore, some of the
features in this report were made by my colleague. On the other hand, I have been developing some
networking for the features I created, such as getting the annotations and the highlighting synced
for all clients. In addition, I have worked on the teleportation code, which was created by my
colleague, adding the fade in and out when teleporting. The calling for attention and raise hand
features were also created by my colleague.

When creating the annotations, I wanted them to feel like PowerPoint slides, since teachers are
often already accustomed to working with presentations in PowerPoint. To avoid limiting the text
space, I gave the annotations multiple pages.

In previous testing of the project, highlighting objects was something the project leader had felt was
missing. Those tests had shown that it was hard for the students to understand which item the
teacher was talking about, since the teacher was not able to point it out to them. To fix this
problem, a feature that highlights objects was implemented in the program. To make it easier for the
teacher to see which student he or she is pointing at, a feature that also highlights students was
created and implemented. The highlighting of students was inspired by other VR applications that I
have used before. When using VR, especially when the user is unfamiliar with the program, it can be
hard to understand where in the scene the pointer is located. Highlighting on hover helps the user a
lot, since it gives a visual indication of where in the scene they are currently aiming. After I had
consulted the project leader about my idea, we decided to also implement highlighting of the
students in this program. Hopefully, this will aid the teacher when he or she is giving out
permissions to the students.
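The hover-highlighting described above boils down to remembering an object's original appearance and restoring it when the pointer leaves. A minimal sketch of that logic outside Unity, with a plain string standing in for a Unity material colour; all names here (HoverHighlight, OnHoverEnter, OnHoverExit) are illustrative, not taken from the project code:

```csharp
// Sketch of the hover-highlight state: remember the original colour,
// swap in a highlight colour while hovered, restore on exit.
public class HoverHighlight
{
    private readonly string originalColour;
    public string CurrentColour { get; private set; }

    public HoverHighlight(string originalColour)
    {
        this.originalColour = originalColour;
        CurrentColour = originalColour;
    }

    // Called when the laser pointer starts hovering over the object.
    public void OnHoverEnter() => CurrentColour = "yellow";

    // Called when the pointer leaves the object; restore the original colour.
    public void OnHoverExit() => CurrentColour = originalColour;
}
```

In the actual project, the same idea is realised by changing the colour of the object's Unity material rather than a string field.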

Getting the animations to play was just a matter of playing the animation when the button was
pressed. Besides that, there was also a check of whether the student had permission to play the
animation locally when he or she pressed the button.
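The permission check described above amounts to a simple gate in front of the button press. A hedged sketch; `AnimationButton` and its members are hypothetical names, not the project's actual classes:

```csharp
// Hypothetical sketch of the local permission check: the animation only
// plays when the student presses the button AND currently holds the
// play-animation permission granted by the teacher.
public class AnimationButton
{
    public bool HasPermission { get; set; }   // granted/revoked by the teacher over the network
    public bool IsPlaying { get; private set; }

    // Called locally when the student presses the button on the object.
    public void OnButtonPressed()
    {
        if (HasPermission)
            IsPlaying = true;   // in the project, this would start the Unity animation instead
    }
}
```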

The creation of the student status list was inspired by another project where the teacher, when
looking down at his or her feet, sees a display with information about all the devices that have
connected to the server. I created and implemented a similar status list that contains a green icon
for each connected student. Since it can be hard for the teacher to see whether any student is
missing during the lesson just by looking around in the environment, the icon on the status list turns
red if the student disconnects from the program. Red was chosen since it stands out against all the
green icons and alerts the teacher that a student is missing from the classroom.

In real life, a student can raise his or her hand to get the teacher's attention. In this program it is
not possible to move the arm of the virtual student; therefore, a raise-hand function was added. The
hand's colour is red to stand out against the dark atmosphere in the mine.

One of the advantages of having the classroom in a program is that you can manipulate the students'
positions to make sure that they stay close to the teacher, or bring them to a certain point without
hassle. The teleport feature was implemented to let the teacher gather the class at a specific point
or regroup the class if anyone got lost in the classroom. Parts of this teleportation feature were used
in something we called the "tour mode", a mode that can be toggled on and off and will teleport
students closer to the teacher if they walk too far away. To prevent the user from feeling
cybersickness, the screen always fades to black before teleporting [18].
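The tour-mode check can be summarized as: measure the student's distance to the teacher and, past a threshold, move the student back to a point near the teacher (with the fade to black wrapped around the move). A sketch of that logic using System.Numerics instead of Unity types; the threshold and offset values are assumptions for illustration, not the project's actual numbers:

```csharp
using System.Numerics;

public static class TourMode
{
    // Assumed values for illustration only.
    public const float MaxDistance = 10f;   // how far a student may drift from the teacher
    public const float RegroupOffset = 2f;  // how close to the teacher they are placed

    // Returns the student's (possibly new) position: unchanged while within
    // range, otherwise a point RegroupOffset metres from the teacher along
    // the line between them. In the project, applying the new position is
    // preceded by fading the student's screen to black.
    public static Vector3 TourModeCheck(Vector3 teacher, Vector3 student)
    {
        if (Vector3.Distance(teacher, student) <= MaxDistance)
            return student;
        Vector3 direction = Vector3.Normalize(student - teacher);
        return teacher + direction * RegroupOffset;
    }
}
```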

The call for attention helps the teacher get the attention of students who are looking away. It is a
toggle option which displays arrows on the sides of the screen to guide the student's view without
obscuring it too much.
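Choosing which edge of the screen the guiding arrow appears on comes down to whether the teacher is to the student's left or right. One way to decide this is the sign of the horizontal cross-product term between the student's view direction and the direction to the teacher; this is an illustration of the idea, not the project's actual code:

```csharp
using System.Numerics;

public static class AttentionArrow
{
    // Returns "left" or "right": the side the arrow should appear on so
    // that turning that way brings the teacher into view. Uses the y (up)
    // component of forward x toTeacher in a y-up coordinate system.
    public static string ArrowSide(Vector3 forward, Vector3 toTeacher)
    {
        float y = forward.Z * toTeacher.X - forward.X * toTeacher.Z;
        return y >= 0 ? "right" : "left";
    }
}
```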

The drawing functionality was developed by Samuel Lundsten and given to us by Denis Kalkofen,
who worked on the paper "Tools for teaching mining students in virtual reality based on 360° video
experiences" [11]. My student colleague integrated the script into the project and made some
changes to make it work together with Mirror.

To summarize, all features listed in the backlog were completed well ahead of time, which opened
up the opportunity to fix bugs that occurred in the features we implemented. We also had time to
implement a quality-of-life feature by making the screen fade to black when teleporting.

During this project, we have been working with the Scrum workflow [22]. Each week, we were
assigned different features to implement. At the next weekly meeting, we showed the progress we
had made and were assigned new tasks and features to implement until the following week. Since
the COVID-19 situation got worse during the development of this project, each weekly meeting was
held online via Zoom instead of as a physical meeting. This has gone well, even though the data
connection has sometimes been poor.

5.2 User test

Since the COVID-19 situation got worse during the development of this project, the large-scale user
test had to be cancelled and replaced with a much smaller one. This also means that the test was
not performed on the targeted userbase. In the testing that was performed, none of the participants
got symptoms of cybersickness. This is probably because the sensory conflict explained in the paper
"A Discussion of Cybersickness in Virtual Environments" [18] is not as relevant here: all movement is
made through teleportation, and the Oculus Quest's six degrees of freedom alleviate the sensory
conflict.

From the observations made during the testing, the participant with the most computer experience
had the fewest problems and quickly learned how to complete all the tasks. She also had little to no
trouble learning the button layout of the Oculus controllers and instinctively pressed the right
buttons to perform certain tasks without even knowing the controls.

The only person who struggled with completing the tasks was the test participant with the least
computer experience. She had trouble moving around in the environment and locating the buttons
to press on the controllers. This did not improve when she did the tasks a second time. She did,
however, figure out rather quickly which objects in the environment were used to open annotations
and play animations for the trucks, which is a good indication that the icons, and what they do once
pressed, are clear. She suggested having some sort of virtual instruction display or handbook that
shows the controls for the various features in the program. This seems like a good idea for both
teachers and students alike.

6 Conclusion
The development of this project turned out well. We managed to create and implement the features
listed in the backlog well ahead of time and without any major problems. According to the user test,
we also managed to make a pleasant virtual environment with no occasions that triggered
cybersickness. The program is also intuitive, at least for those with above-average computer
experience, who did well in the user test.

7 Future perspectives
Regarding the annotations feature, it may be desirable for the teacher to also be able to include
pictures or videos in the slides. That would give teachers more options to customize the annotations
to their own liking.

To get test results that are accurate for the target group, another test on the targeted user group
needs to be performed when the COVID-19 pandemic is over. If the program is going to be used by
people with little computer experience, it would be wise to add a user manual feature.

An option to toggle a colour-blind mode should be implemented later down the line, so that people
with colour-blindness are also able to participate in the lecture.

8 Social, ethical and environmental considerations
Since all meetings during the development of this project were held online, the project did not
generate any environmental pollution. Once introduced in schools, the program will also benefit the
environment by reducing educational trips to the mines.

Due to the cost of VR hardware, this program might only be accessible to students who can afford to
invest in this type of equipment. It would therefore be desirable for the school to provide the
equipment, to avoid discriminating against socially disadvantaged groups.

The program is usable by both women and men, and the teacher and student avatars are genderless
silhouettes, so everyone can feel comfortable using the program.

To also give people with colour-blindness the opportunity to use this program, a colour-blind mode
is suggested under the heading Future perspectives.

9 Acknowledgements
First of all, I want to thank Teemu H. Laine, Henrique Souza Rossi and the rest of the team for
letting me be a part of their project and for all the experience you have given me!

Another thanks to Andreas Lindblom for a wonderful cooperation on this project, and to Samuel
Lundsten for providing design assets and the drawing script.

I also want to thank my supervisor Patrik Holmlund and my teachers Johannes Hirche and Fredrik
Lindahl for all the support and valuable knowledge you have taught me!

Last, but not least, thanks to all my classmates who have made these past three years a wonderful
part of my life!

10. References
1. Brockwell, Holly. (April 03, 2016). Forgotten genius: the man who made a working VR machine in
1957. Tech Radar. Available https://www.techradar.com/news/wearables/forgotten-genius-the-
man-who-made-a-working-vr-machine-in-1957-1318253/2 (Retrieved April 17, 2020).

2. Ellis, S. R. What are Virtual Environments. IEEE Computer Graphics and Applications 1994; 14:17-22.

3. Montegriffo, Nicholas. (December 23, 2019). Hindsight is 2020: looking back on the decade of VR.
Pcinvasion. Available https://www.pcinvasion.com/hindsight-2020-looking-back-on-the-decade-of-
virtual-reality/ (Retrieved April 17, 2020).

4. Bardi, Joe. (March 26, 2019). What is VR? [Definition and Examples]. Marxent. Available
https://www.marxentlabs.com/what-is-virtual-reality/ (Retrieved April 17, 2020).

5. Garbade, Michael. (November 08, 2018). 10 amazing Uses of Virtual Reality. Available
https://readwrite.com/2018/11/08/10-amazing-uses-of-virtual-reality/ (Retrieved May 4, 2020).

6. Evans, L. (2019). Barriers to VR use in HE (Ed.), Proceedings of the Virtual and Augmented Reality to
Enhance Learning and Teaching in Higher Education Conference 2018, page 1-13. Swansea University.
Swansea University AR/VR conference.

7. Strickland, Jonathan (August 27, 2007). How VR Military Applications Work. HowStuffWorks.
Available https://science.howstuffworks.com/virtual-military.htm (Retrieved April 25, 2020).

8. U.S. Department of Veterans Affairs. (March 6, 2020). Life After Service: StrongMind technology to
treat Veterans with PTSD. Available https://www.blogs.va.gov/VAntage/72224/life-service-
strongmind-technology-treat-veterans-ptsd/ (Retrieved April 25, 2020).

9. Rothbaum, B. O., Hodges, L., Alarcon, R., Ready, D., Shahar, F., Graap, K., Pair, P., Hebert, P., Gotz,
D., Wills, B., and Baltzell, D. Virtual Reality Exposure Therapy for PTSD Vietnam Veterans: A Case
Study. Journal of Traumatic Stress 1999; 12:263-271.

10. Liou, W-K. and Chang, C-Y. Virtual Reality Classroom Applied to Science Education. 2018 23rd
International Scientific-Professional Conference on Information Technology (IT), Zabljak,
Montenegro. Available https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=8350861
(Retrieved July 1, 2020).

11. Kalkofen, D., Mori, S., Ladinig, T., Daling, L., Abdelrazeq, A., Ebner, M., Ortega, M., Feiel, S., Gabl,
S., Shepel, T., Tibbett, J., Laine, T. H., Hitch, M., Drebenstedt, C. and Moser, P. (2020). Tools for
Teaching Mining Students in Virtual Reality based on 360° Video Experiences, 2020 IEEE Conference
on Virtual Reality and 3D User Interfaces, Atlanta (online).

12. Unity.com. Available https://unity.com/ (Retrieved April 17, 2020).

13. Oculus Quest. Available https://www.oculus.com/quest/ (Retrieved April 18, 2020).

14. Oculus SDK. Available https://developer.oculus.com/downloads/ (Retrieved May 4, 2020).

15. Purwar, Sourabh. (March 4, 2019). Designing User Experience for Virtual Reality (VR) applications.
Available https://uxplanet.org/designing-user-experience-for-virtual-reality-vr-applications-
fc8e4faadd96 (Retrieved May 4, 2020).

16. Gera, Emily. (August 19, 2013). Oculus Rift is working to solve simulator sickness. Available
https://www.polygon.com/2013/8/19/4636508/oculus-rift-is-working-to-solve-simulator-sickness
(Retrieved July 1, 2020).

17. Kiryu, Tohru & So, Richard H. Y. Sensation of presence and cybersickness in applications of
virtual reality for advanced rehabilitation. J Neuroeng Rehabil 2007; 4:34. Available
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2117018/ (Retrieved April 17, 2020).

18. LaViola, J. J. (January, 2000). A Discussion of Cybersickness in Virtual Environments. ACM SIGCHI
Bulletin. Available
https://dl.acm.org/doi/pdf/10.1145/333329.333344?casa_token=vkhvOa7eMlEAAAAA:GoeKjxifoG26
unBInZq4p5PiuZXeRVJJ1nm3NtP5HWFTQqDv37WViElsv2GUcG2AdvXyfcXHBfDKAw (Retrieved July 1,
2020).

19. Porcino, T., Clua, E., Vasconcelos, C. N., Trevisan, D. G. and Valente, L. (November 19, 2016).
Minimizing cyber sickness in head mounted display systems: design guidelines and applications.
Available
https://www.researchgate.net/publication/310610513_Minimizing_cyber_sickness_in_head_mount
ed_display_systems_design_guidelines_and_applications (Retrieved July 1, 2020).

20. Oculus Developers. Available https://developer.oculus.com/design/bp-locomotion/ (Retrieved
July 1, 2020).

21. Mirror-networking.com. Available https://mirror-networking.com/docs/General/index.html
(Retrieved April 17, 2020).

22. Scrum. Available https://www.atlassian.com/agile/scrum (Retrieved July 2, 2020).

Questionnaire                                                                               Appendix 1

Demographic data of the test person:
Gender:                                     Male                                          Female

Age:                                        ______ years

Education:                                  Compulsory school              (9 years)

                                            Upper secondary school         (3 years)

                                            University level               (> 11 years)

Occupation:                                 Student

                                            Teacher

                                            Worker

Computer experience:                         1 _______________________________________________10

                                            none                                                   a lot

1a. Did you ever feel motion sick during the test?             Yes     go to 1b               No

1b. If yes, when do you feel the motion sickness started? ___________________________________

2a. Was there something you felt was missing?                  Yes     go to 2b               No

2b. If yes, what do you feel was missing?                      ___________________________________

3a. Was there a task that was hard to complete?                Yes     go to 3b               No

3b. Explain what was hard.                                     ___________________________________

4. Was it better standing up or sitting down?                              Standing

                                                                           Sitting

                                                                           No differences

Appendix 2
    //Checks whether a student that connects to the classroom has previously been in it.
    public void StudentConnected(MB_Student student)
    {
        //Check if the student has been connected to the classroom before. If so,
        //reuse the icon that they previously had on the canvas.
        if (dict.ContainsKey(student.deviceID))
        {
            //Set the student as online again and pair the icon with the new gameobject.
            GameobjectPair gp;
            gp.icon = dict[student.deviceID].icon;
            gp.icon.GetComponentInChildren<Image>().color = new Color32(0, 255, 0, 100);
            gp.student = student;
            dict[student.deviceID] = gp;
        }

        //Otherwise add a new icon to the canvas and pair it together with the new
        //student gameobject.
        else
        {
            GameobjectPair gp;
            gp.student = student;
            gp.icon = Instantiate(statusIcon) as GameObject;
            gp.icon.transform.SetParent(canvas.transform);
            gp.icon.transform.localPosition = new Vector3(x_Start + x_Gap * x_Step,
                y_Start + y_Gap * y_Step, 0);
            gp.icon.transform.localRotation = Quaternion.identity;
            gp.icon.transform.localScale = new Vector3(1.5f, 1.5f, 1.5f);
            gp.icon.transform.Find("DeviceIDText").GetComponent<Text>().text = student.deviceID;
            dict.Add(gp.student.deviceID, gp);

            x_Step++;
            if (x_Step + 1 == x_Max)
            {
                x_Step = 0;
                y_Step++;
            }
        }
    }

    //Set the student as disconnected and set the student's gameobject to null, since
    //the original gameobject is destroyed upon leaving the classroom.
    public void StudentDisconnected(MB_Student student)
    {
        GameobjectPair gp;
        gp.icon = dict[student.deviceID].icon;
        gp.icon.GetComponentInChildren<Image>().color = new Color32(255, 0, 0, 100);
        gp.student = null;
        dict[student.deviceID] = gp;
    }
