An intelligent navigation experimental system based on multi-mode fusion

Virtual Reality & Intelligent Hardware       2020 Vol 2 Issue 4:345—353

·Article·

An intelligent navigation experimental system based on
multi-mode fusion
Rui HAN1,2, Zhiquan FENG1,2*, Jinglan TIAN1,2, Xue FAN1,2, Xiaohui YANG1,2,
Qingbei GUO1,2
1. School of Information Science and Engineering, University of Jinan, Jinan 250022, China
2. Shandong Provincial Key Laboratory of Network Based Intelligent Computing, Jinan 250022, China

* Corresponding author, ise_fengzq@ujn.edu.cn
Received: 7 May 2020       Accepted: 5 July 2020

Supported by the National Key R&D Program of China (No. 2018YFB1004901) and the Independent Innovation Team
Project of Jinan City (No. 2019GXRC013).

Citation: Rui HAN, Zhiquan FENG, Jinglan TIAN, Xue FAN, Xiaohui YANG, Qingbei GUO. An intelligent navigation
           experimental system based on multi-mode fusion. Virtual Reality & Intelligent Hardware, 2020, 2(4): 345—353
           DOI: 10.1016/j.vrih.2020.07.007

Abstract       At present, most experimental teaching systems lack operational guidance, and thus users
often do not know what to do during an experiment. This increases the user load and decreases the
students' learning efficiency. To solve the problem of insufficient system interactivity and guidance, an
experimental navigation system based on multi-mode fusion is proposed in this paper. The
system first obtains user information by sensing the hardware devices, intelligently perceives the user
intention and progress of the experiment according to the information acquired, and finally carries out a
multi-modal intelligent navigation process for users. As an innovative aspect of this study, an intelligent
multi-mode navigation system is used to guide users in conducting experiments, thereby reducing the user
load and enabling the users to effectively complete their experiments. The results prove that this system
can guide users in completing their experiments, and can effectively reduce the user load during the
interaction process and improve the efficiency.

Keywords         Navigation interaction; Chemical experiment system; Multi-mode fusion

1     Introduction
As important emerging technologies, human-computer interaction and virtual reality are becoming widely
used in many fields. In the area of education, to better assist teachers and students, many companies
have applied human-computer interaction and virtual reality technology to products, and virtual
experiment schemes, such as NOBOOK's virtual experiment system and NetDragon's 101VR classroom,
have been proposed. However, most virtual experimental systems lack guidance for users during the
experimental process, preventing users from understanding how to conduct the experiment, thereby
increasing the user load and reducing the learning efficiency. Therefore, developing a way to intelligently
guide the user to conduct an experiment and remind the user about particular problems during an
experiment is significantly important.

2096-5796/©Copyright 2020 Beijing Zhongke Journal Publishing Co. Ltd., Publishing services by Elsevier B.V. on behalf of KeAi Commu‐
nication Co. Ltd. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by/4.0/).

    To solve the problem of insufficient guidance in experimental systems, in this paper, an intelligent
navigation system based on multi-mode fusion is proposed. The system first obtains user information by
sensing the hardware devices, intelligently perceives the user's intention and experimental progress according
to the acquired information, and finally carries out multi-modal intelligent navigation for users. The innovative
aspects of this paper are as follows: Intelligent multi-mode navigation is used to guide users in conducting
experiments, thereby reducing the user load and enabling users to better complete their experiments. It can
effectively reduce the user load during an interaction and improve the experimental efficiency.
    The rest of this paper is organized as follows: Section 2 introduces previous studies related to the
teaching system. Section 3 introduces the system design and implementation. Section 4 describes and
analyzes the experimental results. Section 5 introduces the user experience. Finally, Section 6 provides
some concluding remarks.

2     Related studies
In the field of education, many experts and scholars are committed to providing better learning and
teaching methods for students and teachers.
    In 2002, Lesta et al. proposed logic-ITA, an intelligent teaching assistant system for propositional logic
teaching[1]. In 2003, Zhao et al. proposed the structure of a computer-aided speech teaching system. The
system was designed to help native Chinese speakers improve their English pronunciation[2]. In 2011,
Huang et al. proposed a piano teaching system based on unlabeled augmented reality, which can naturally
track the real keyboard of a piano[3]. With the addition of virtual hands on a keyboard, beginning piano
students can practice playing the piano. In 2013, Özyurt et al. proposed an innovative web-based adaptive
intelligent network learning system called UZWEBMAT[4]. In addition, Yan et al. proposed an ISIC-CDIO
teaching experiment system based on Internet of Things RFID technology[5]. GAO et al. designed a
polymer chemistry visualization teaching system[6]. Sun et al. proposed a computer-aided teaching system
to distinguish the self-teaching of Chinese tones[7]. In 2014, Xie proposed a maintenance teaching model of
faulty electronic equipment and applied the model to the design of a fault principle and maintenance
teaching system for such equipment[8]. In addition, Lin et al. proposed a new teaching system that enables
users to easily operate robot arms and complete various teaching tasks[9]. The teaching system consists of a
teaching pen, an optical mark on the pen, a motion capture system, and a pen tip estimation algorithm.
Using the motion-capture system to capture the mark position, a pen-point algorithm is used to accurately
calculate the posture of the teaching pen and control the robot frame. In 2015, Luan proposed a virtual
simulation teaching system for working on hydraulic transmissions with convenient operational and
interactive functions[10]. In 2016, Li et al. proposed an adaptive network teaching system based on a
learning analysis[11]. Moreover, Hsiao et al. proposed a system of adaptive remedial teaching materials for
TCSL learners[12]. The Chinese listening and speaking diagnostic and remedial teaching system integrates
computer diagnostic tests and instructional materials to diagnose errors during listening comprehension
and speaking tasks. Chen et al. proposed the use of a sports simulation in a virtual basketball-shooting
teaching system, providing scientific reference data for the training of basketball players[13]. In 2017, Wang
et al. proposed an online video teaching system for electronic majors based on micro-lessons[14]. In
addition, Lou proposed a virtual reality teaching system for graphic design courses[15]. In 2018, Deng
proposed a design method of a human-computer conversational English teaching system based on cloud
computing[16]. Feng et al. realized a measurement learning system using virtual reality (VR), in which
various measuring instruments used by students can be constructed through VR[17]. Liu et al. proposed a
teaching system of four mathematics operations used in primary schools[18]. Li et al. proposed a teaching
method using a virtual simulation system for numerical control engineering training, which can help
students master their theoretical knowledge and practical and problem-solving skills of different numerical
control machine tools[19]. In 2020, Gao et al. used a virtual simulation technology to develop and design an
experimental teaching system for the maintenance of a high-speed train[20].
    At present, many educational systems lack guidance for users, which increases the user load. Therefore,
to reduce the user load, an intelligent navigation system based on multimodal fusion is proposed. This
system can realize multi-modal intelligent navigation for users through an intelligent perception of the
user's intention and experimental progress. This interactive mode, which provides navigation throughout
the experiment, guides the user to complete the experiment more effectively, reducing the user load and
improving the experimental efficiency.

3     System design and implementation

3.1    Hardware structure design of experimental equipment

Figure 1 shows a photograph of the hardware device. The sensors used by each device and their locations
are marked in the diagram. The device captures the user's actions through multiple sensors and transmits
them to the computer software application using the MQTT protocol. In addition, a Kinect hardware device
is used to capture and transmit data on human hand displacement to the computer software application.

                                      Figure 1   Hardware design diagram.

    For the hardware equipment, a Kinect 2.0, a computer, and 3D-printed experimental simulation
equipment (including sensors) are used. For the software environment, Windows 10, Unity 2018, Visual
Studio 2015, and Baidu voice are applied. Finally, C# is used as the programming language.
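The sensor-to-software pipeline described above can be illustrated with a short sketch. The paper does not give a message format, so the JSON fields (`device`, `action`, `value`) and the function name below are illustrative assumptions, shown in Python for brevity; the real system is implemented in C# with Unity and communicates over MQTT.

```python
import json

def parse_sensor_message(payload: str) -> dict:
    """Decode one sensor message (hypothetical schema) into a user-action event."""
    msg = json.loads(payload)
    return {
        "device": msg["device"],               # which piece of simulated equipment
        "action": msg["action"],               # what the user did with it
        "value": float(msg.get("value", 0.0))  # e.g. a tilt angle from the sensor
    }

# Example message from a 3D-printed conical flask fitted with a tilt sensor
event = parse_sensor_message('{"device": "conical_flask", "action": "tilt", "value": 35.0}')
print(event["device"], event["action"], event["value"])
```

In the actual system, such messages would arrive through an MQTT subscription callback, alongside the hand-displacement data streamed from the Kinect.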

3.2    Navigational interaction mode

Navigational interactions are designed based on intentional behavior nodes and virtual scene information.
An intentional behavior node is the perception description of the user's multimodal information of the
system and the basis of an interaction. The specific interaction design is as follows (Figure 2):
    (1) The nodes in the intentional-behavior node set NQ are filtered, and the node set WNQ to be executed
is obtained (see 3.2.1 for details; NQ is a set of intentional-behavior nodes composed of actions, objects,
and attributes, and WNQ is the set of nodes awaiting a check of whether they can be executed directly).

                                              Figure 2   Interaction design.
  (2) The intentional-behavior node in WNQ is executed if possible; otherwise, a voice prompt is
provided to guide the user's operations (see 3.2.2 for details).
  (3) The progress of the experiment is monitored, and users are guided through the experiment by the
navigation system (see 3.2.3 for details).

3.2.1    Filtering of intent behavior nodes
Because a chemical experiment may require simultaneous operations, the system supports the perception
of dual-operation intentions. Since one or two intent-behavior nodes may be filtered, selecting the nodes
that the user actually wants to execute is a prerequisite for interaction. The filter focuses on two nodes that
have the same active object but different intents, because such nodes may not be executable at the same
time (an intentional-behavior node contains the active object, which is generally the experimental
equipment that sends the main signals). The processing method is as follows:
  (1) The number of elements in NQ is determined. If there is only one element, it is directly added to the
WNQ node set to be executed. If there are two elements, and if their active objects are the same but with
different intents, step 2 is executed; otherwise, NQ is used as the set of nodes, WNQ, to be executed.
  (2) According to the shortest intention-transformation path method SRP(Node) (see 3.2.2 for details),
whether the active objects of the two elements can reach their new intentions from their current intentions
is determined. If both can be reached directly, step 3 is executed; if not, NQ is used as the set of nodes,
WNQ, to be executed.
  (3) By asking the user to select an intention node to be executed, another node is set as an invalid node.
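Under the assumption that each node is represented as a mapping with `object` and `intent` fields (names invented here; the paper does not specify a data layout), steps (1)-(3) above can be sketched as follows, with `directly_reachable` standing in for the SRP(Node) check and `ask_user` for the system's disambiguation prompt:

```python
def filter_nodes(nq, directly_reachable, ask_user):
    """Return WNQ, the set of intent nodes to be executed (sketch of 3.2.1)."""
    if len(nq) == 1:
        return list(nq)                # step (1): a single node passes through
    a, b = nq
    if a["object"] != b["object"] or a["intent"] == b["intent"]:
        return list(nq)                # step (1): no conflict between the two nodes
    if directly_reachable(a) and directly_reachable(b):
        return [ask_user(a, b)]        # step (3): both executable, let the user choose
    return list(nq)                    # step (2): at least one node still needs path planning

# Two nodes acting on the same flask with conflicting intents
nq = [{"object": "flask", "intent": "pour"}, {"object": "flask", "intent": "heat"}]
wnq = filter_nodes(nq, directly_reachable=lambda n: True, ask_user=lambda a, b: a)
print(wnq)  # only the user-selected node survives
```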

3.2.2    Execution of the intended behavior node
The system is instructed to interact according to the filtered nodes. During node execution, path planning
of the intention transformation of the nodes in WNQ (the set of nodes to be executed) is carried out
according to the shortest intention-transformation path method SRP(Node). If the intention can be
converted directly, the intent node is executed directly; if not, the user is prompted with the planned
shortest intention-transformation path.
  The shortest transformation path method SRP(Node) is based on graph theory. For each object, an
intention-transformation graph is created to represent the transitions between different intents. The shortest
transformation path table SR_intent, the transformation requirement table TPR_intent, and the necessity
intention table TK_intent (which indicates the intention to execute first when the necessary conditions are
not met) are saved in the knowledge base.
  Figure 3 shows the SRP(Node) method. First, according to the TPR_intent table, this method determines
whether the node satisfies the intent-conversion condition.

                                            Figure 3    SRP(Node) method.

   If the conversion condition is satisfied, the shortest path R_node converting the object's current intention
into the new intention is obtained from the SR_intent table. If the path indicates that the new intent can be
reached directly without other operations, then the node is an executable node Node'; if additional
operations are required before execution, then R_node becomes the planned shortest intention-transformation
path IR_node.
   If the transformation condition is not met, the intent that must be executed first (the required intent) is
derived from the TK_intent table. Then, according to the SR_intent table, the shortest path from the current
intention to the necessary intention is spliced with the shortest path from the necessary intention to the new
intention to obtain the final planned transformation path IR_node.
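Since the paper describes SRP(Node) only at the level of table lookups, the sketch below approximates it with breadth-first search over a per-object intent-transition graph; the graph, the intent names, and the `required` argument (modeling the TK_intent necessity lookup) are illustrative assumptions:

```python
from collections import deque

def shortest_transform_path(graph, current, target):
    """BFS shortest path between intents; stands in for the SR_intent table."""
    queue, seen = deque([[current]]), {current}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # target intent unreachable

def srp(graph, current, target, required=None):
    """Sketch of SRP(Node): when a necessary intent must run first, splice the
    current->required path with the required->target path (Section 3.2.2)."""
    if required is None:
        return shortest_transform_path(graph, current, target)
    head = shortest_transform_path(graph, current, required)
    tail = shortest_transform_path(graph, required, target)
    return None if head is None or tail is None else head + tail[1:]

# Toy intent graph for a conical flask: reagent must be added before the funnel goes on
graph = {"empty": ["has_reagent"], "has_reagent": ["funnel_installed"]}
print(srp(graph, "empty", "funnel_installed", required="has_reagent"))
```

A path that reaches the target directly corresponds to an immediately executable node; a longer path plays the role of the planned transformation path IR_node that the system speaks back to the user.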

3.2.3    System navigation design
The interactive navigation method is used to monitor the user's operation and experiment progress in real
time, including the voice navigation and visual navigation.
   (1) Voice navigation
   To reduce the user load, a method for navigation during an experiment is proposed herein. When the
user makes common-sense mistakes (such as violating the operational method of an experimental
instrument), the user is prompted according to the planned path, which not only tells the user that the
operation is unreasonable but also reduces the risk that the experiment cannot continue. For the key
experimental knowledge, the system supports exploratory experiments (that is, the phenomena caused by
incorrect operations can be observed), so that students gain a deeper understanding of the key chemistry
knowledge. During this process, the system gives feedback, explains the experimental phenomena, and
guides the user to apply the correct operation. In addition, the system will automatically
monitor the progress of the experiment and set up a voice navigation at the key nodes to guide the user
during an operation. Compared with a traditional method (in which a simple navigation is applied only at the
beginning of the experiment), this method of navigation during the experiment can better reduce the user load
and reduce the risk that the experiment will not be completed due to unknown system operation methods.
   (2) Visual navigation
   The system presented herein uses a virtual electronic screen to guide the users. The key steps in the
experiment are presented on the screen, allowing the user to follow the prompts on the screen during the
operation. Visual and voice navigation are used together. Visual navigation is focused more on providing
the experimental steps, whereas voice navigation is focused more on the navigation of dynamically
generated operations during the experimental process.

4     Experimental results and analysis
4.1     Experimental results

This system mainly uses Unity3D for the design and transmits multi-mode signals to Unity3D for fusion.
In this study, the effectiveness of the proposed system is verified through a dilution experiment conducted
on concentrated sulfuric acid and a carbonization experiment using sucrose.
    Figure 4(a) shows an operational check conducted during the intent transformation. In the concentrated
sulfuric acid dilution experiment, reagent must be contained in the conical flask before the funnel is
installed on it. Therefore, when there is no reagent in the conical flask, the user is required to first add
reagent to the flask and then install the separating funnel. At this point, the user is intelligently prompted
based on the SRP(Node) method. Figure 4(b) shows the intelligent voice navigation. In the sucrose
carbonization experiment, a glass rod should be used to stir the reagent to accelerate the reaction; thus, the
system prompts the user to stir via voice navigation.

    Figure 4   Navigation experimental system based on multi-mode fusion. (a) Operation check; (b) Voice navigation.

5      User experience
To evaluate the proposed navigation chemical experiment system based on multimodal fusion, 41 students
from the Affiliated Primary School of Jinan University, Zhangqiu Middle School, Zhangqiu Wuzhong
School, and Shenxian Experimental High School were invited to participate. In addition, 12 teachers were
invited, for a total of 53 participants (the experiment on diluting concentrated sulfuric acid was selected).
    Figure 5 shows users using NOBOOK's virtual experiment system (referred to as the NOBOOK system)
and the navigation chemical experiment system based on multi-mode fusion proposed in this paper
(referred to as the proposed system). The NOBOOK system interacts based on a mouse or touch screen and
applies navigation only at the beginning of the experiment. The proposed system uses an interaction based
on the simulation equipment, voice, and vision, and conducts an intelligent navigation during the entire
experiment process. It allows the students to conduct an experiment, and the teachers to demonstrate the
experiment, in person. Through a questionnaire and description, the users evaluated their experience with
the different systems.

                         Figure 5   User experience. (a) NOBOOK system; (b) Proposed system.
    Figure 6 shows a statistical graph of the user experience with both experimental systems. Each value in
the figure is the mean score across participants, with each item scored from 0 to 5. For the first four items
(i.e., mental requirements, physical requirements, degree of frustration, and difficulty of independently
completing the experiment), a lower score indicates a better experience; for the remaining items, a higher
score indicates a better experience.

                                    Figure 6   Comparison of user experience.

    Compared with the NOBOOK virtual experiment, the navigation system proposed in this paper
effectively reduces the users' mental requirements (the need to remember how to operate the system),
physical demands, degree of frustration, and difficulty in completing the experiment independently. The
users' learning efficiency is effectively improved, and the users feel more relaxed.

6     Conclusion
To solve the problem of an insufficient interaction and guidance found in most experimental systems, an
intelligent navigation system based on multi-mode fusion is proposed. The system first obtains user
information by sensing hardware devices, intelligently perceives the user intention and experimental
progress according to the acquired information, and finally carries out multi-modal intelligent navigation
for users.
    As the innovative aspect of this paper, multi-mode intelligent navigation is applied to guide users in
conducting experiments, reducing their interactive load and enabling them to complete the experiments
effectively.
    The experiment results prove that the navigation interactive system proposed herein can effectively
reduce the mental requirements, physical demands, degree of frustration, and difficulty of completing an
experiment independently. It effectively improves the user's learning efficiency, and makes the user feel
more relaxed. Therefore, the proposed system can effectively reduce the user load and improve the
learning efficiency.

References

1     Lesta L, Yacef K. An intelligent teaching assistant system for logic. In: Intelligent Tutoring Systems. Berlin, Heidelberg,
      Springer, 2002, 421–431
      DOI:10.1007/3-540-47987-2_45
2     Zhao T L, Jia L, Lu Y F, Han S P, Li C L. An automatic pronunciation teaching system for Chinese to learn English. In:
      IEEE International Conference on Robotics, Intelligent Systems and Signal Processing. Changsha, Hunan, China, IEEE,
      2003, 1157–1161
      DOI:10.1109/rissp.2003.1285754
3     Huang F, Zhou Y, Yu Y, Wang Z Q, Du S D. Piano AR: a markerless augmented reality based piano teaching system. In:
      2011 Third International Conference on Intelligent Human-Machine Systems and Cybernetics. Zhejiang, China, IEEE,
      2011, 47–52
      DOI:10.1109/ihmsc.2011.82
4     Özyurt H, Baki A. Design and development of an innovative individualized adaptive and intelligent e-learning system
      for teaching-learning of probability unit: details of UZWEBMAT. Expert Systems with Applications, 2013, 40(8): 2914–
      2940
      DOI:10.1016/j.eswa.2012.12.008
5     Yan H, Hu H Y. Research and realization of ISIC-CDIO teaching experimental system based on RFID technology of
      web of things. Journal of Bionanoscience, 2013, 7(6): 696–702
      DOI:10.1166/jbns.2013.1172
6     GAO J, Zhang Z, Song Q, Ding Y, Li Q, Lin Y. Design and practice of visualized teaching system in polymer chemistry.
      Polymer Bulletin, 2013, 35(2): 94–98
7     Sun Q, Liu S, Sunaoka K, Hiki S. Visual displays of the voice pitch pattern for the CAI self-teaching system to
      discriminate Chinese tones. Journal of the Acoustical Society of America, 2012, 131(4): 060007
      DOI:10.1121/1.4887505
8     Xie J. Design of electronic fault principle and maintenance teaching system for missile equipment. In: China
      Conference on System Simulation Technology and Its Application, 2014
9     Lin H I, Lin Y H. A novel teaching system for industrial robots. Sensors (Basel, Switzerland), 2014, 14(4): 6012–6031
      DOI:10.3390/s140406012
10 Luan F. Development of hydraulic transmission virtual simulation teaching system based on Unity3D. 2015
11 Li J, Su Z, Huang Y, Gou X. Adaptive network teaching system design based on learning analysis. Modern Education
      Technology, 2016, 26 (6): 113–118
12 Hsiao H S, Chang C S, Lin C Y, Chen B, Wu C H, Lin C Y. The development and evaluation of listening and speaking
      diagnosis and remedial teaching system. British Journal of Educational Technology, 2016, 47(2): 372–389
      DOI:10.1111/bjet.12237
13 Chen G, Chen N. Motion simulation in virtual basketball shooting teaching system. International Journal of Online
      Engineering (IJOE), 2016, 12(2): 55–57
      DOI:10.3991/ijoe.v12i02.5049
14 Wang B, Li Y, Yang L. Design of micro-course teaching system for electronic majors based on JSP. Henan Science and
      Technology, 2017(5): 15–18
15 Lou M Y. A virtual reality teaching system for graphic design course. International Journal of Emerging Technologies in
      Learning (IJET), 2017, 12(9): 117
      DOI:10.3991/ijet.v12i09.7492
16 Deng L M. Design of English teaching system for human-computer dialogue based on cloud computing. In: 2018
      International Conference on Intelligent Transportation, Big Data & Smart City (ICITBS). Xiamen, China, IEEE, 2018,
      283–286
      DOI:10.1109/icitbs.2018.00079

17 Feng J, Zhang D, Li W, Dong L. Design of an auxiliary teaching system based on virtual reality. China Science and
    Technology Information, 2018(1): 57–58
18 Liu T, Min P, Xiao H. Design and implementation of elementary school mathematics teaching system for arithmetic
    based on JAVA. Computer and Digital Engineering, 2018, 46(4): 655–658
19 Li Y X, Zhang D, Guo H X, Shen J Y. A novel virtual simulation teaching system for numerically controlled machining.
    The International Journal of Mechanical Engineering Education, 2018, 46(1): 64–82
    DOI:10.1177/0306419017715426
20 Gao B, Liu Z, Huo K, Jiao F. Development of experimental teaching system for maintenance technology of high-speed
    emu based on virtual simulation. Experimental Technology and Management, 2020, 37 (3): 139–142
