Tool Extension in Human–Computer Interaction
                              Joanna Bergström, Aske Mottelson, Andreea Muresan, Kasper Hornbæk
                                                    University of Copenhagen

ABSTRACT
Tool use extends people’s representations of the immediately actionable space around them. Physical tools thereby become integrated in people’s body schemas. We introduce a measure for tool extension in HCI by using a visual-tactile interference paradigm. In this paradigm, an index of tool extension is given by response time differences between crossmodally congruent and incongruent stimuli: tactile on the hand and visual on the tool. We use this measure to examine if and how findings on tool extension apply to interaction with computer-based tools. Our first experiment shows that touchpad and mouse both provide tool extension over a baseline condition without a tool. A second experiment shows a higher degree of tool extension for a realistic avatar hand compared to an abstract pointer for interaction in virtual reality. In sum, our measure can detect tool extension with computer-based tools and differentiate interfaces by their degree of extension.

CCS CONCEPTS
• Human-centered computing → HCI theory, concepts and models; HCI design and evaluation methods; Laboratory experiments; User studies; Empirical studies in HCI;

KEYWORDS
Tool extension, body schema, peripersonal space, input

ACM Reference Format:
Joanna Bergström, Aske Mottelson, Andreea Muresan, Kasper Hornbæk. 2019. Tool Extension in Human–Computer Interaction. In CHI Conference on Human Factors in Computing Systems Proceedings (CHI 2019), May 4–9, 2019, Glasgow, Scotland, UK. ACM, New York, NY, USA, 11 pages. https://doi.org/10.1145/3290605.3300798

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.
CHI 2019, May 4–9, 2019, Glasgow, Scotland, UK
© 2019 Association for Computing Machinery.
ACM ISBN 978-1-4503-5970-2/19/05...$15.00
https://doi.org/10.1145/3290605.3300798

1 INTRODUCTION
Computer systems may be viewed as tools to achieve specific goals, and human-computer interaction (HCI) may thereby be seen as tool use [e.g., 23, 30]. Although computer-based tools are complex, under this view findings about tool use may be applied to HCI. We investigate whether the advances during the past 20 years in understanding the influence of tools on our perception of our bodies, called tool extension, apply to HCI, too.
   Tool extension was first empirically characterised by Iriki and colleagues [24]. They trained macaque monkeys to retrieve objects using a rake. When so trained, the neural areas associated with the hand also registered visual activity around the rake and the area reachable with the rake. Since Iriki’s study, these findings have been extended to humans, showing a similar influence of manual tools on our perception and thinking. In particular, the use of a tool has been shown to change our motor control [11] and perception around the tool [3]. Recent reviews provide further details on how tools act as extensions of our bodies, modify the representations of our peripersonal space, and incorporate into our body schema [6, 20, 27].
   The methodologies used to establish tool extension, and their resulting findings, have only rarely been applied to computer-based tools. In one exception, Bassolino and colleagues [3] used the audio-tactile integration paradigm to find that use of a mouse extends peripersonal space representation. With the audio-tactile paradigm, extension shows as a lack of difference between response times to tactile stimuli during concurrent sounds near or far from the hand holding the tool. Therefore, it cannot be used to assess degrees of tool extension. In another exception, Sengül and colleagues [35] employed virtual tools, but merely replicated tool extension findings instead of attempting to use tool extension to compare ways of interacting. Despite these exceptions, researchers in neuropsychology have lamented a lack of studies on more complex tools [16]. We are similarly unaware of previous studies having applied the methodologies for studying tool extension in HCI. The associated findings have not been used to evaluate specific interaction styles, although they are mentioned in discussions of embodied interaction [e.g., 25].
   We introduce a measure of tool extension to HCI, and validate it in two studies. The first study demonstrates that the visual-tactile interference paradigm shows tool extension with two common input devices: a mouse and a touchpad. By comparing the mouse and the touchpad to a condition where no input is given, we show that the paradigm applies beyond manual tools to computer-based tools. The second study shows that minute details of the interaction styles may change the degree of tool extension. We find that the presence or absence of an avatar’s arm provides different degrees of extension. Thereby, we introduce a measure of tool extension to HCI that can quantify claims about embodiment of computer-based tools. We contribute a validation of this measure by showing that it can be used to characterize different interaction styles.

2 RELATED WORK
We first review work on tool extension from neuroscience and psychology, which has established the influence of using tools on our perception of the space around us. Those studies have mainly been done using physical tools. Nevertheless, a few studies have investigated computer-based tools; we review those as well. Finally, we discuss how the methodology used in these studies can be made usable in HCI.

Tool Extension in Neuroscience and Psychology
Both neuroscience and psychology have been concerned with understanding tool use in general, and in particular how tool use affects our perception of our body and the environment around us (for reviews, see [6, 20, 27]). Whereas the idea that our representation of our body (often called body schema) is affected by tool use is more than a hundred years old [22], the study by Iriki and colleagues was a key, modern introduction of this idea [24]. They showed that the brains of monkeys, in the intraparietal cortex, which integrates visual and somatosensory information, are affected by tool use in a surprising way. Before using a rake (and when passively holding it), bimodal neurons in monkeys’ brains only responded to visual stimuli near the hand. However, after using the rake, brain activity was also seen for stimuli along the tool. Thus, the tool seems to be incorporated into the body.
   It was later shown that tools affect humans similarly, using a behavioural task rather than neural measures [28]. Maravita and colleagues used toy golf clubs with vibrotactile stimuli at their handles (in an upper and a lower location) and visual distractors at their tips (also in an upper and a lower location). Participants had to judge the location of the vibrotactile stimuli while ignoring the visual stimuli. This setup is based on the visual-tactile interference paradigm, also known as the crossmodal congruency task [37]. This task assumes that stimuli in different modalities (e.g., tactile and visual) interact; in particular, stimuli that are incongruent, say visual in the upper location and tactile in the lower, lead to higher response times (or lower accuracy) because they require effort to ignore. In the study by Maravita, participants held two golf clubs and indeed showed a crossmodal congruency effect. Crucially, this effect changed when participants crossed the golf clubs, so that it depended on the tip of the held club rather than on the tip closest to the hand. Despite the complexity of the setup, it shows that tools appear to integrate with our hands.
   This finding has been extended and validated in several ways. Besides the crossmodal congruency task, the effect has been shown for length estimations. For instance, tool use makes people perceive their forearm as being longer than before tool use [11], and merely observing tool use compresses the perception of distance [7]. Targets are also perceived as nearer in space when holding a stick able to reach them, compared to when participants do not hold a stick [39]. Tool extension has also been shown for mechanical grabbers [11], rakes [17], canes [36], and wheelchairs [15].
   These findings are bounded in important ways. Practice with the tool is critical. Several studies show that if you are highly practiced with a tool, merely holding it is enough to cause tool extension to occur. In a study of tennis rackets, for instance, holding your own racket is enough for tool extension, whereas generic rackets require use for tool extension [5]. The tip of the tool also seems particularly relevant, whereas non-active parts of the tool do not seem to be incorporated into the body schema [21].

Tool Extension from Computer-based Tools
In a review of tool extension, Holmes and Spence [22] distinguished physical tools (e.g., rakes, golf clubs; used in the studies just cited) from pointing tools (e.g., laser pointers, the mouse) and detached devices (e.g., telesurgical devices). The latter two categories are of relevance to HCI because of their resemblance to input techniques in HCI; however, both in the review and in recent work, these two categories are rarely explored.
   In the second category, Bassolino and colleagues [3] presented a study of whether using a computer mouse provides tool extension. Bassolino used an audio-tactile interaction paradigm in which sounds appeared near the hand and near the pointer controlled by the mouse (70 cm away from the hand). The assumption in this setup is that audio from far and from near might interfere with response times to a tactile stimulus on the hand in different ways. In particular, far sounds should increase response times only if the far space is considered separate from near space (thereby this paradigm is different from the crossmodal congruency task). The study results show that when not using a mouse, the response time to tactile stimuli with sounds near the hand is lower compared to far sounds, whereas when either holding or actively using a mouse, there is no difference between those response times. Bassolino and colleagues interpret this as showing extension of peripersonal space by the mouse. However, their study concerns the familiarity of the used tool (the mouse) and does not intertwine real-world human-computer interaction with a measure of tool extension.
   In the third category, a few studies have recently experimented with applying tool extension paradigms to other types of computer-based tools [8, 35]. Sengül and colleagues [35] transferred earlier findings on tool extension with golf clubs (reported in [28]) to virtual reality (VR), in which participants used a haptic device. However, their findings are mainly about replication of the original study in VR and not about interaction styles therein.
   Physical tools (Holmes and Spence’s first category) are relevant in HCI, too, as they might be coupled with computer-based tools. We are unaware of work on tangible or physical interfaces using tool extension methodology. Yet, the study by Alzayat and colleagues [1] is relevant here. They studied an object manipulation task using a physical and a virtual tool. While doing the manipulation task, participants had to

react to visual stimuli near the hand and near the tool (at the handle and the tip, respectively, of a wrench-like device). The underlying assumption is that for fluid interaction (e.g., when tools are ready to hand, in the sense of Heidegger, to which the paper refers), attention would shift to the tool, resulting in better performance in detecting visual stimuli there. Indeed, this is the case in the empirical study. However, this study is about readiness-to-hand instead of tool extension, and introduces a new measure instead of using any of the established methodologies discussed above.

Issues in Transferring Tool Extension to HCI
Tool extension is often discussed as a concept that is important to HCI [e.g., 18, 25]. However, as seen above, few studies use this concept for computer-based tools; none within HCI. We believe this is due to three issues in adapting the methodology to provide a useful measure for HCI.
   One issue in transferring tool extension methodology is that HCI studies require participants to actually use the tool or user interface, beyond merely completing the tasks necessary for measuring extension. Tool use rarely happens in existing work. For instance, in the paper by Bassolino and colleagues [3], the description of the experiment leaves it unclear what participants do with the mouse, in particular, whether they do any real-world interactions, such as aiming and selecting. Similarly, neuropsychology has used tool extension to show inclusion of artificial limbs within the body schema (e.g., in the Rubber Hand Illusion [32]). The Rubber Hand Illusion, however, does not require action but merely that participants experience visuo-tactile synchronization to induce tool extension. With physical tools in neuroscience, action has been incorporated by small movements, such as crossing the tools, instead of using them for their real functions (e.g., clubs for golf [28]). For HCI, however, we need to find a way to intertwine the task for measuring extension and the tool-specific interactions. Doing so is critical because action is widely believed to be essential for the representation of peripersonal space [26] and is, of course, key in human-computer interaction.
   A second issue is to apply the findings from physical tools to computer-based tools. In particular, tool extension has been studied mainly for physical tools that are grasped. The role of active and passive grasp is widely studied, as is the effect of previous experience in using a particular tool. From the literature, however, it remains unclear if tools that are not grasped, such as a computer touchpad, can induce tool extension. Further, the findings from psychology and neuroscience on the differences between an empty hand and a tool are confounded in that we found no examples where the empty hand functions as a tool. For HCI, however, many interactions do not require grasping, and in many cases an empty hand works as the tool (as it would, for instance, in mid-air pointing or non-controller interaction in VR).
   A third issue is to transfer the stimuli for the crossmodal congruency task from physical tools to computer-based tools. Experiments on visual-tactile congruency are often done with very particular hand orientations and modes of grip (e.g., holding a golf club with a thumb up). With physical tools, the experiments use congruency between the stimuli within a single plane and thus obtain a direct spatial mapping (e.g., thumb up for the tactile stimulus and the same upper side of the grasped golf club for a visual stimulus). Computer-based tools, however, pose a challenge to that mapping. For instance, movements of a mouse on a table plane map to cursor movements on a display, which is often positioned on a plane orthogonal to the table. But how to map the tactile stimuli on the index finger and thumb to the cursor on the display? Earlier work has not faced this problem. Therefore, it remains unclear if we can provide visual and tactile stimuli on those two distinct planes in a way that makes tool extension methodology applicable to computer-based tools and HCI.
   We find that these three issues are the reason why tool extension methodologies have not been used in HCI. Transferring methods from neuroscience and psychology often requires adaptation work when applied in HCI. The sense of agency, for instance, plays a vital role in neuropsychology, and was initially transferred to HCI by Coyle and colleagues in 2012 [13]. After that, several papers have been required to further test it [4] and extend it to work in actual task performance, for instance in VR [12]. In the experiments that follow, we first show that the three issues discussed can be dealt with and that this methodology allows HCI to reason about computer-based tools in new ways. But first, we introduce the basic idea of a tool extension measure in HCI.

3 A MEASURE OF TOOL EXTENSION FOR HCI
We introduce a measure of tool extension to HCI by applying a visual-tactile interference paradigm. The purpose of our measure is to clear the stumbling blocks discussed in the previous section.
   The visual-tactile interference paradigm is suitable for HCI because most user interfaces rely on visual output and feedback. Therefore, this paradigm allows intertwining our measure with interaction, such as with controlling a cursor on a display. Figure 1 presents the procedure for the visual-tactile interference paradigm, using the interfaces and the setup of our Experiment 1 as examples.
   In the visual-tactile interference paradigm, participants make speeded discriminations between two tactile stimuli (e.g., up/down) in the presence of a congruent (e.g., up/down) or incongruent (e.g., down/up) visual distractor stimulus. The visual distractors can be embedded in the visual feedback of the user interface, such as above or below a cursor (Figure 1). The actuators for tactile stimuli are placed in a way which can be mapped to the visual stimuli. For example, in interaction with a mouse, forward movements of the hand are mapped to upward movements of a cursor. As the posture of the hand controlling a mouse or a touchpad entails the index finger tip being located more forward than the thumb, the actuators

[Figure 1 shows three steps: “Move the cursor to select a target circle (or observe it move)”; “Discriminate between the vibrotactile stimuli (in the presence of a visual distractor)”, during which the response time is measured; and “Move the cursor to the next target circle (or observe it move)”.]

Figure 1: The procedure in our measure using the visual-tactile interference paradigm. The procedure uses the setup of our Experiment 1 to exemplify the role of input interfaces and the visual feedback. However, the procedure applies beyond this particular setup with modifications according to the spatial configurations of the interface.
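To make the paradigm concrete, the trial structure in Figure 1 can be sketched in code. This is a minimal illustration, not the authors’ implementation: the stimulus-pairing and response-mapping names below are our own placeholder choices, following the congruent mapping described in the text (index finger/above the cursor, thumb/below the cursor).

```python
import random

# Congruent mapping used in the setup above: a tactile stimulus on the
# index finger pairs with a visual distractor above the cursor, and a
# stimulus on the thumb pairs with a distractor below the cursor.
CONGRUENT_VISUAL = {"index": "above", "thumb": "below"}

def make_trial(congruent: bool) -> dict:
    """Draw one visual-tactile interference trial."""
    tactile = random.choice(["index", "thumb"])
    visual = CONGRUENT_VISUAL[tactile]
    if not congruent:
        # Incongruent trials flip the distractor to the opposite side.
        visual = "below" if visual == "above" else "above"
    return {"tactile": tactile, "visual": visual, "congruent": congruent}

def correct_pedal_response(trial: dict) -> str:
    """Participants respond to the tactile site only, ignoring the
    visual distractor: lift the toes for an index-finger stimulus,
    the heel for a thumb stimulus."""
    return "lift toes" if trial["tactile"] == "index" else "lift heel"
```

In each trial, the time from stimulus onset to the pedal lift is the response time that later feeds the CCE calculation.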

for tactile stimuli (such as vibration motors) can be placed on those fingers to correspond to the locations of visual stimuli above and below the cursor.
   Responses to tactile stimuli are given with a modality that has minimal interference with conducting the experiment task or interacting with the interface. For example, in our experiments the participants gave input with a hand, and therefore we used foot pedals to record responses (which are also common in previous studies on visual-tactile interference [e.g., 28, 35]). Foot pedals can be used, for instance, with one foot to respond to a tactile stimulus on the index finger by lifting the toes and to a stimulus on the thumb by lifting the heel (Figure 1). Another way to record responses to tactile stimuli is to use key presses with the hand not stimulated, for instance by responding with left and right keys to tactile stimuli on the index and middle finger [e.g., 40]. To tightly intertwine interaction with the measure, a discrimination trial of tactile stimuli is performed after each instance of action (e.g., selecting a target with the cursor, as in Figure 1 and in our Experiment 1).
   Our measure for tool extension in HCI is the crossmodal congruency effect (CCE). The CCE is validated in studies ranging from monkeys’ visuo-somatosensory neural activity [24] to behavioural measures with brain-damaged patients [27]. It has been shown to increase for objects that can be easily integrated in body representation (e.g., in the Rubber Hand Illusion [32, 40]), and it reflects changes in peripersonal space representation after tool use [e.g., 20, 27, 28]. The CCE is defined as the difference between mean response times (RTs) in incongruent conditions and congruent conditions within the visual-tactile interference paradigm. Figure 2 presents the four stimuli conditions, and the correct responses to tactile stimuli, again using the setup of our Experiment 1 as an example. The CCE is calculated from RTs measured in repetitions of each of the four congruence conditions as presented in Figure 2.
   With tool extension, the RTs in incongruent conditions are longer than in congruent conditions. Therefore, tools that are easily integrated into the peripersonal space or body representation result in increased CCEs compared to tools that are not. Thereby, our measure allows comparison of interaction styles by the degree of tool extension they provide.

4 EXPERIMENT 1
The purpose of the first experiment is to investigate if our measure shows tool extension with computer-based tools. For this purpose, we use two common input devices: a mouse and a touchpad. We also include a passive condition as a baseline, where no input device is used.
   A mouse has previously been suggested to provide tool extension with an audio-tactile interaction paradigm. Finding extension by using a mouse in this experiment can therefore validate the suitability of our measure for tool extension.

Vibrotactile stimulus    Visual distractor    Correct response

Congruent conditions
On the index finger      Above the cursor     Lift the toes
On the thumb             Below the cursor     Lift the heel

Incongruent conditions
On the index finger      Below the cursor     Lift the toes
On the thumb             Above the cursor     Lift the heel

CCE = Mean RT in Incongruent Conditions - Mean RT in Congruent Conditions

Figure 2: The stimuli and responses for incongruent and congruent tactile and visual stimuli, and the calculation of the CCEs based on response times (RTs).
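The CCE computation itself is a simple difference of mean response times. A minimal sketch, with made-up RT values rather than data from the paper:

```python
def cce(incongruent_rts, congruent_rts):
    """Crossmodal congruency effect: mean RT in incongruent conditions
    minus mean RT in congruent conditions. Larger positive values
    indicate stronger interference from the incongruent visual
    distractor, i.e., a higher degree of tool extension."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)

# Hypothetical response times in milliseconds (not from the paper):
print(cce([520, 545, 560, 530], [480, 495, 510, 470]))  # prints 50.0
```

Because the CCE is a difference of means, it can be compared across input conditions: a larger CCE for one interface than another indicates a higher degree of tool extension for that interface.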

   Tool extension has previously been shown mainly with manual tools that are grasped. We use a touchpad to examine whether findings on tool extension apply to tools that are not grasped, as such input interfaces are common in HCI.
   In this experiment, we also intersperse interaction and our measure for tool extension. For example, the participants use the mouse and the touchpad to move and select with a cursor in a setting wherein we do not control their grip and hand posture with the input devices. We refrain from using other means for high control seen in previous studies, such as chin rests. By relaxing some of the control in the experiment, we examine if our measure can show tool extension when integrated with human–computer interaction.

Participants
Data were collected from 12 right-handed participants (7 identified themselves as female and 5 as male, with a mean age of 25.92 ± SD of 3.85 years). Six of them used a mouse and ten of them a touchpad daily.

Task
The experiment task was to discriminate between two locations for a vibrotactile stimulus while ignoring visual distractor stimuli. The tactile stimuli were presented either on the index finger or the thumb. The response to a tactile stimulus was given with a foot pedal by lifting the toes to indicate a stimulus on the index finger or lifting the heel to indicate a stimulus on the thumb.

Figure 3: Study setup in Experiment 1. Participant performing a trial on the left. Input conditions on the right, from the top: a mouse, a touchpad, and the passive condition.

                                                                     Procedure
Design
                                                                     The participants were instructed to fixate at the cursor at all
The experiment followed a two-by-three within-subjects de-
CHI 2019, May 4–9, 2019, Glasgow, Scotland, UK. Bergström et al.

sign with the congruency between the tactile stimuli and the visual distractor as the first factor and input condition as the second.

In two active input conditions, the participants used a mouse or a touchpad to control a cursor on a computer display (Figure 3). In the passive condition, no input device was held and no input was given – the cursor was only observed to move.

The visual distractor stimuli were presented by highlighting either of two dots above or below the cursor. The congruent and incongruent conditions consisted of a visual stimulus above or below the cursor paired with a tactile stimulus on the index finger or thumb, as presented in Figure 2.

The order of the input conditions was counterbalanced. Each of the four congruency conditions was presented 20 times with each input condition in a randomised order. The experiment thus consisted of 360 trials in total (3 input conditions × 4 congruency conditions × 20 repetitions). The experiment lasted approximately 30 minutes. The participants received small gifts as compensation for their time.

times, and to respond to the tactile stimulus as quickly as possible by lifting either of the foot pedals, while ignoring the visual stimulus.

Before starting the experiment and before starting each input condition, the participants were asked to lean back on a chair, and to rest their foot on the pedals and lift the heel and toes a few times, to ensure both a firm enough press to not activate the pedals accidentally and a relaxed and comfortable posture for responding to stimuli. The participants were then reminded of their task.

In the active conditions with input devices, the participants were also instructed to move the cursor around the window and to click with the mouse or tap with the touchpad, to get used to the movement-display ratio and the feel of a selection. They were instructed to tap, not click, the touchpad. The procedure is presented in Figure 1. In the active conditions, the participants started the experiment by moving the crosshair cursor to a target circle on the display. There were two target circles, on the left and on the right, and the previous target circle acted as the starting point for the next trial. In the passive condition, the participants did not control the cursor, but only observed it move to the targets. The movement speed of the cursor was the average speed recorded in pilot studies.
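The within-trial stimulus timing described next (the visual distractor presented 900 ms after the target selection, the tactile stimulus 100 ms after the visual one, and both lasting 200 ms) can be summarised as follows. This is a simplified Python sketch, not the authors' Processing implementation:

```python
# Trial timing in Experiment 1, relative to the click on the target
# circle. The constants are the values reported in the paper; the
# sketch is illustrative only.
DELAY_AFTER_CLICK_MS = 900   # visual distractor onset after selection
VISUAL_TO_TACTILE_MS = 100   # delay shown to maximise the CCE
STIMULUS_DURATION_MS = 200   # duration of both stimuli

def trial_schedule():
    """Onset and offset times (ms) of both stimuli in one trial."""
    visual_on = DELAY_AFTER_CLICK_MS
    tactile_on = visual_on + VISUAL_TO_TACTILE_MS
    return {
        "visual": (visual_on, visual_on + STIMULUS_DURATION_MS),
        "tactile": (tactile_on, tactile_on + STIMULUS_DURATION_MS),
    }

schedule = trial_schedule()
# The two stimuli overlap from 1000 ms to 1100 ms after the click,
# i.e., they are simultaneously elevated for 100 ms.
overlap_ms = schedule["visual"][1] - schedule["tactile"][0]
```

Under this schedule the visual and tactile stimuli overlap for 100 ms, matching the description in the text.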
After successfully selecting the target or observing the cursor reach the target (in the passive condition), the target disappeared. Then, both a tactile stimulus and a visual distractor stimulus were presented. The visual distractor stimulus on either of the dots above or below the cursor was presented 900 ms after the click on a target circle. With this delay, we aimed to balance between having a short enough delay after moving the cursor to maintain active tool use (in the active conditions), but long enough to avoid possible effects of sensory attenuation on perception of the tactile stimuli. Such attenuation may occur when the body part receiving a stimulus is moved during or immediately before the stimulus. Similar delays have been used in previous work [e.g., 40].

The tactile stimulus was given 100 ms after the visual distractor stimulus, because previous work has shown that such a delay maximises the CCE. The duration of both the visual and tactile stimuli was 200 ms (the two therefore being simultaneously elevated for 100 ms). These durations, too, are similar to previous work, in which durations vary from 100 ms [e.g., 35] to 250 ms [e.g., 28, 40].

The participants responded to the tactile stimulus by lifting a foot pedal. The participants were instructed to not move the cursor (in the active conditions) before responding to the stimulus and pressing the foot pedal back down. After a foot pedal was pressed back down, the next target circle appeared on the screen. The participants were allowed to take a short break and rest their foot between the input conditions.

Setup
The experiment setup is presented in Figure 3. The experiment interface was presented in a full-screen window on the display. We chose to use a crosshair as the cursor due to its neutral appearance, so as to avoid presenting unrelated directional cues that, for instance, an arrow cursor could have given. The dots for the visual distractor stimuli were 100 pixels in diameter, 60 pixels above and below the cursor. The target circles were 60 pixels in diameter, at the vertical midpoint and at a horizontal distance of 1/6 of the width of the window from the left and right sides.

The participants sat in front of a desk. The display was placed on the desk one meter away from the back of the chair the participants sat on. The participants were asked to lean back on the chair to keep the distance from their body to the display constant. The input devices and the hand in the passive condition were placed and used within a marked 30 cm wide and 20 cm deep area. The nearest edge of that area was set at a 25 cm distance from the display. The foot pedals were Andoer Game Foot Switch USB-connected pedals, attached to the floor underneath the participant's heel and toes. The foot pedals were located on the floor in relation to the chair such that the participant's right knee would form a maximum of a 90 degree angle when the right foot rested on the pedals. This posture was chosen to maximise the press on the pedals with a relaxed foot, in order to avoid accidental lifts of the pedals.

Figure 4: The average CCEs for the input conditions in Experiment 1. The error bars denote .95 confidence intervals.

Apparatus
For the tactile stimuli, we used shaftless 3V, 60mA, 13000±3000 rpm coin vibration motors with a diameter of 10 mm and a thickness of 3.4 mm. The vibration motors were controlled with an Arduino Uno. The vibration motors were placed on the medial phalanx of the index finger and the proximal phalanx of the thumb. The motors were attached firmly to the fingers with a slightly flexible medical self-adhesive bandage.

The experimental interface for the visual stimuli, for controlling the vibration motors, and for logging the foot pedal responses was developed with Processing. The interface was presented on a Samsung SyncMaster 244T LCD display with a 1920 × 1200 resolution. The experimental software ran on a MacBook Pro (2013). The input devices were an Apple Magic Mouse and an Apple Magic Trackpad 2. The tracking speed of both the mouse and the touchpad was set to the third fastest level in the MacBook's system preferences.

Results
Figure 4 and Table 1 show the results from Experiment 1. For analysing the RTs, trials with an incorrect response to the tactile stimuli were discarded (the error rates are presented in Table 1). Trials with RTs smaller than 200 ms or larger than 1500 ms were also discarded (3.02% of trials with the mouse, 2.19% with the touchpad, and 1.88% in the passive condition), as those represent either anticipations or omissions of stimuli (these procedures in analysis follow previous studies [e.g., 20, 28, 35]).

Table 1: Mean response times (RT) in milliseconds and percentage of errors for Experiment 1.

               Mouse            Touchpad         Passive
               RT       Errors  RT       Errors  RT       Errors
Congruent      674.14   1.46    652.06   1.46    662.93   1.88
Incongruent    829.15   7.08    824.71   8.33    760.69   3.96
CCE            155.01   5.63    172.65   6.88    97.76    2.08
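The screening and the CCE computation described above can be sketched as follows. This is a minimal Python sketch over made-up trials, not the study's data or analysis code:

```python
from statistics import mean

def cce(trials, lo=200, hi=1500):
    """Crossmodal congruency effect: mean incongruent RT minus mean
    congruent RT, after discarding error trials and RTs outside the
    lo..hi window (anticipations and omissions)."""
    kept = [(rt, congruent) for rt, congruent, correct in trials
            if correct and lo <= rt <= hi]
    congruent_rts = [rt for rt, congruent in kept if congruent]
    incongruent_rts = [rt for rt, congruent in kept if not congruent]
    return mean(incongruent_rts) - mean(congruent_rts)

# Made-up trials for one participant: (RT in ms, congruent?, correct?).
trials = [
    (640, True, True), (150, True, True),     # 150 ms: anticipation, dropped
    (700, False, True), (1600, False, True),  # 1600 ms: omission, dropped
    (820, False, True), (660, True, False),   # incorrect response, dropped
    (680, True, True), (790, False, True),
]
# cce(trials) == 110: mean(700, 820, 790) minus mean(640, 680).
```

A positive CCE thus indicates slower responses under incongruent visual distractors, the index of tool extension used throughout the paper.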
We used Bonferroni-corrected paired-sample t-tests to analyse the differences between each input condition and the passive condition, using the mean values per condition for each participant. The t-tests show a significant difference for both input condition pairs: mouse and passive (t(11)=3.66, p = .004), and touchpad and passive (t(11)=4.72, p < .001).
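A paired-sample t-test on per-participant means can be sketched in plain Python as follows. The per-participant CCE values below are invented for illustration; in practice a statistics package such as SciPy would supply the p-value, with the Bonferroni correction applied by multiplying it by the number of comparisons (here, two):

```python
import math
from statistics import mean, stdev

def paired_t(xs, ys):
    """t statistic and degrees of freedom for a paired-sample t-test."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Invented per-participant mean CCEs (ms), one value per participant;
# not the study's data.
cce_mouse = [150, 160, 170, 140, 155, 165, 148, 152, 158, 162, 145, 154]
cce_passive = [90, 100, 110, 95, 98, 102, 88, 92, 96, 104, 91, 99]

t, df = paired_t(cce_mouse, cce_passive)
# With two comparisons against the passive baseline, the Bonferroni
# correction multiplies each p-value by 2 (equivalently, tests at .025).
```

With 12 participants this yields df = 11, matching the t(11) reported in the text.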
Finding a higher CCE with a mouse compared to the passive condition confirms that our measure can show tool extension. Finding a higher CCE with a touchpad compared to the passive condition shows that tools that are not grasped can also extend the representation of peripersonal space. Together, these findings also show that our measure for tool extension can be intertwined with human-computer interaction in user experiments.

Figure 5: Screenshots from VR of the appearances of the conditions in Experiment 2. Top left: the realistic appearance for a female participant at the beginning of a trial. Top middle: the disconnected hand moving towards a spotlight. Top right: the abstract appearance waiting for a stimulus. Bottom: the realistic appearance for a male participant and a visual distractor stimulus on the top sphere.

5 EXPERIMENT 2
The primary purpose of the second experiment is to examine
whether our measure can distinguish degrees of tool extension for different interaction styles. Tool extension measures changes in the representation of the actionable space (i.e., the body schema), independently of whether the tool is physical or a virtual projection of the body (as shown with neural measures in monkeys [27]). Therefore, another purpose is to examine whether our measure also shows tool extension in free-hand interaction (neither holding nor touching an input device). Hands are important tools in HCI, as they are often used for input, for instance, with mid-air, gesture, augmented reality, and VR interfaces. Here, too, we intertwine interaction and our measure for tool extension; this time in a VR setting.

We chose to vary the appearance of an avatar hand across three conditions, presented in Figure 5: a realistic hand, a disconnected hand, and an abstract appearance. These appearances were chosen because previous findings on tool extension suggest that the appearance of artificial bodies can influence the magnitude of the CCE.

First, we chose to use a realistic avatar hand with an arm as one appearance condition. We hypothesize that a realistic hand shows the highest degree of tool extension, because previous work suggests increased CCE and inclusion of artificial limbs (e.g., in the Rubber Hand Illusion [32, 40]) in the body representation.

As a second appearance condition we used a disconnected but realistic avatar hand. CCE has been presented as an objective predictor of ownership [40]. Perez-Marcos et al. [33] showed that subjective ownership depends on the connectivity of a virtual arm, but did not find such a dependency on the alignment of a virtual hand that was connected. Therefore, a disconnected hand was hypothesized to show a lower CCE than a realistic hand connected to an arm. The realistic hand with an arm was gender-matched to participants, as gender has been shown to influence body ownership of avatars [34]. In contrast, the disconnected avatar hand had a static, neutral appearance.

As a third appearance condition we used an abstract cursor. This appearance was included because previous work has found differences between natural appearances and abstract appearances. For example, Pavani et al. [31] found a larger CCE with stimuli next to a natural shadow of a real hand, compared to an abstract, polygonal-shaped shadow. In another study, Pavani et al. [32] showed with rubber gloves that having the gloves visible increased the CCE compared to not having gloves at all (only distractor lights). We derive the abstract appearance from these studies: using only spheres at the locations of the fingertips.

Participants
Data were collected from 18 right-handed participants (9 identified as female and 9 as male; mean age 24.39 years, SD = 3.24). Twelve of them had tried a VR headset before.

Task
The experiment task was the same as in Experiment 1.

Design
The experiment followed a two-by-three within-subjects design with the congruency between the tactile stimuli and the visual distractor as the first factor and appearance condition as the second. The three appearances were a realistic
hand and arm, a realistic disconnected hand (without an arm), and an abstract condition wherein only white spheres at the locations of the fingertips were visible (Figure 5). The spheres followed the participants' fingertip movements, thus inducing body ownership by visuo-motor synchronisation, which has been shown to correlate with tool extension [31]. All conditions in this experiment were active.

The visual distractor stimuli were presented by highlighting either of the white spheres. The congruent conditions consisted of a visual stimulus on the white sphere at the fingertip corresponding to the finger receiving the tactile stimulus. The incongruent conditions consisted of the opposite.

The order of the appearance conditions was counterbalanced. Each of the four congruency conditions was presented 20 times with each appearance condition in a randomised order. The experiment thus consisted of 360 trials in total (3 appearance conditions × 4 congruency conditions × 20 repetitions). The experiment lasted approximately 30 minutes. The participants received small gifts as compensation for their time.

Figure 6: Study setup in Experiment 2. A participant performing a trial, wearing the vibration motors on the index finger and the thumb, the HTC VIVE headset, and the reflective markers used for tracking the hand motion with the OptiTrack cameras surrounding the desk.

Procedure
The participants were instructed to look at the spheres at all times, to not close their eyes, and to respond to the tactile stimulus as quickly as possible, while ignoring the visual stimulus, by lifting the corresponding foot pedal.

Before starting the experiment and before starting each appearance condition, participants were asked to rest their foot on the pedals and lift the heel and toes a few times, to ensure a firm enough press to not activate the pedals accidentally, but also a relaxed and comfortable posture for responding to stimuli. The participants were then reminded of their task.

The participants rested their elbow on a table, kept their hand in the air pointing forward, and kept their index finger and thumb apart so as to keep the spheres on top of each other (Figure 6). The other fingers were kept relaxed (slightly flexed).

All appearance conditions consisted of the same interaction: moving the hand between two spotlights on the right and on the left. After holding the spheres under a spotlight and not moving the fingers (the centroid between them) more than 0.5 cm for 2 seconds, the visual distractor stimulus was presented, followed by the tactile stimulus 100 ms later. The durations of both the visual and tactile stimuli were 200 ms, the same as in Experiment 1.

The participants responded to the tactile stimulus by lifting a foot pedal. The participants were instructed to not move their hand before responding to the stimulus and pressing the foot pedal back down. After a foot pedal was pressed back down, the next spotlight appeared in their view. The participants were allowed to take a short break and rest their foot between conditions.

Apparatus
The tactile stimuli and foot pedal setup were identical to those used in Experiment 1. In addition, we used an HTC Vive with an OptiTrack motion capture system for finger tracking and VR. The VR scene was created in Unity 2018 and ran on a laptop PC (2.8GHz Intel i7, 16GB RAM, NVIDIA GTX 1060) running Windows 10 Pro. Finger tracking was done with eight Prime13 cameras at a 120 Hz frequency, placed around the table where the study was conducted. We tracked the thumb and index finger using reflective markers attached to a 3D-printed frame (Figure 6). We used FinalIK as the inverse kinematics model for more realistic hand and arm movements.

Results
Table 2: Mean response times (RT) in milliseconds and percentage of errors for Experiment 2.

               Realistic        Disconnected     Abstract
               RT       Errors  RT       Errors  RT       Errors
Congruent      710.45   1.25    713.00   1.81    705.65   0.97
Incongruent    891.74   5.42    872.75   5.56    858.03   4.44
CCE            181.29   4.17    159.75   3.75    152.38   3.47

Figure 7 and Table 2 show the results from Experiment 2. As in Experiment 1, trials with an incorrect response to the tactile stimuli were discarded from the analysis of the RTs (those error rates are presented in Table 2). Trials with RTs smaller than
200 ms (anticipations) or larger than 1500 ms (omissions) were also discarded (5.21% of trials with the realistic arm, 4.79% with the disconnected hand, and 2.50% in the abstract condition).

We used Bonferroni-corrected paired-sample t-tests to analyse the differences between the appearance conditions, using the mean values per condition for each participant. The t-tests show a significant difference for one appearance condition pair: the realistic arm and the abstract condition (t(17)=3.10, p = .009). The error rates did not show significant differences.

Finding a higher CCE with a realistic avatar hand compared to the abstract appearance confirms that our measure can distinguish degrees of tool extension. This finding, together with the average CCEs, is in alignment with previous studies on tool extension: characteristics of avatar hands similar to those applied to shadows [31] and rubber hands [32, 40] influence tool extension in similar ways. For example, a natural appearance increases the CCE over an abstract appearance, and a disconnected hand decreases the CCE compared to a hand connected to the body with an arm. Our findings also show that tool extension can be found in free-hand interaction with computer-based tools. Together, these findings imply that our measure shows tool extension for avatars, and can be intertwined with interaction in VR.

Figure 7: The average CCEs for the three appearance conditions in Experiment 2. The error bars denote .95 confidence intervals.

6 DISCUSSION
Extending our action space by a tool is central to pointing at and manipulating the world. Tool extension is an established concept in psychology and neuroscience describing how a person's body schema may change as a result of tool use. Previous work has claimed it relevant to human-computer interaction (HCI), but so far the empirical methods for establishing tool extension have not been applicable to HCI. Our experiments show that tool extension can work in HCI: We have presented a measure of it that may be collected during interaction, without restrictions in grip or movement. Moreover, the measure helps discriminate levels of extension, for instance between mouse, touchpad, and a passive condition, as well as between a full representation of an avatar arm in VR and an abstract pointer. Thereby, we have created a measure for use in HCI that quantifies a central aspect of computer-based tools, namely whether they incorporate into our body schema. Next, we discuss details of the studies and their implications.

Our Measure for Tool Extension
The key contribution of this paper is introducing and validating a measure of tool extension for HCI. This measure is objective but captures qualities of interaction that HCI has talked about in subjective terms for decades. Those terms include 'acting through the interface' [9], 'ready-to-hand' [1], 'transparent' [2], 'embodied' [25], and 'natural' [38]. Tool extension does not help quantify the entirety of these qualities, but it does attempt to capture parts of them. As such, it represents a milestone in working quantitatively with the experience of user interfaces.

The adaptations in the present paper are substantial over the methodology in psychology and neuroscience. First, our measure does not require chin-rests, fixed arms, or unchanging grips. Relaxing these constraints allows using the measure, for instance, in VR and with devices that require a grip (mouse) or not (touchpad). Second, users may do interactive tasks (rather than just the discrimination task, as in most studies on tool extension [e.g., 28]). Thereby, the measure can be intertwined with task performance.

The validity of the methodology has been shown mainly as discriminant validity. It was able to distinguish active (mouse, touchpad) and passive conditions in Experiment 1 (as expected from research on physical tools [5]). It was also able to distinguish degrees of tool extension with variants of VR avatars in Experiment 2, which subjective body ownership reports from earlier work have suggested to be dissimilar [19, 29]. The effect size there was medium (Cohen's d=0.548) for the difference between the realistic and abstract appearances.

We find that visual-tactile interference has some benefits for HCI over the audio-tactile interference paradigm, as used by Bassolino and colleagues [3]. First, the visual modality is dominantly used for output and feedback in user interfaces. Most input is mapped to visual output, from keyboard events to text and mouse or hand movements to pointers. Therefore, visual distractor stimuli are easy to embed in real-world interactions. Second, the audio-tactile paradigm requires physical distance from the tool to the interface. Therefore, it may not apply to use with interfaces for VR. Third, the audio-tactile paradigm shows extension of peripersonal space with a tool when no difference is found between the response times for the near and far sounds concurrent with the tactile stimuli. As the measure of extension is based on finding no difference, it requires a passive condition as a baseline wherein a difference is found. Fourth, as the audio-tactile paradigm is indeed based on indifference in response times, it cannot be used
for comparing interaction styles. In contrast, our measure is based on the magnitude of the CCE, and can thus be used for comparing the degree of tool extension between interaction styles, as we did in Experiment 2.

Limitations and Open Questions
Our study has several limitations. Most importantly, we did not quantify prior experience with the tools tested. Earlier work has shown strong effects due to familiarity [e.g., 5], and we suspect this could moderate our findings, too. Future work should attempt to relate such measures to CCE values.

The availability of an objective way of quantifying tool extension is in itself useful across a wide range of areas, but our study also leaves many research questions open. First, we are curious about how the CCE relates to other measures of experience, in particular, those on body ownership [e.g., 14, 29] and subjective satisfaction [e.g., 10]. We are particularly curious about the relation between the CCE and novelty effects, because this measure could be one way to evaluate interfaces objectively where expectations and novelty play less of a role compared to subjective reports.

Second, because our method allows intertwining with interactive tasks, we may for the first time explore the relation between performance and tool extension. For instance, using a Fitts's law task, it would be possible to vary task difficulty, and thereby performance, and quantify its influence on tool extension. Earlier work in psychology and neuroscience has not been able to do this.

Third, investigating the methodology with other interfaces seems interesting. We are interested in using it on tangible user interfaces, possibly comparing it to the approach by Alzayat and colleagues [1]. The purpose of using VR in this work was to examine whether our measure of tool extension applies to free-hand interaction. Virtual tools were not used, because those could bring other effects into play, such as a conflict between a lack of haptic feedback from not grasping a tool and seeing an avatar hand grasping a virtual one. Using the measure with VR controllers or other devices that give tactile feedback also needs careful validation, because of possible sensory attenuation or even interference with the tactile stimuli during the cross-modal congruency task.

Fourth, investigating the influence of distances on tool extension provided by computer-based tools seems particularly relevant for free-hand interaction. The distance at which tool extension applies has remained unclear, as manual tools used in neuropsychology pose physical limitations to their users, such as a weight that increases with their length. Computer-based tools do not pose such limitations. For example, Feuchtner and Müller [14] presented elongated and disconnected augmented reality hands to act beyond the user's peripersonal space. The distance from the hand (physically or in VR) to the target likely influences extension of the body schema, but we leave this question for future work.

The measure we have presented is easy to use but requires experimenters to get many details correct (as we experienced in numerous pilot studies). In particular, instructions to participants are crucial, both concerning which stimuli they need to react to (the tactile ones, not the visual ones) and that they need to fixate on the targets for the visual distractors (rather than elsewhere, or closing their eyes).

7 CONCLUSION
Tool extension suggests that using tools may extend our representation of action possibilities in our surroundings. It is widely studied in psychology and neuroscience but so far only discussed in human-computer interaction. We have introduced a measure of tool extension to HCI, based on the visual-tactile interference paradigm. In two experiments we have shown how this measure can discriminate between interfaces and produce an index of how much they are sensed as extensions of one's body. This methodology helps to study embodiment and represents a significant advance in working quantitatively with the incorporation of computer-based tools in our peripersonal space.

8 ACKNOWLEDGEMENTS
This project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation program (grant agreement 648785). We thank Antti Oulasvirta and Adrian Alsmith for their helpful comments.

REFERENCES
[1] Ayman Alzayat, Mark Hancock, and Miguel A. Nacenta. 2017. Measuring Readiness-to-Hand Through Differences in Attention to the Task vs. Attention to the Tool. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (ISS '17). ACM, New York, NY, USA, 42–51. https://doi.org/10.1145/3132272.3134114
[2] Jakob E. Bardram and Olav W. Bertelsen. 1995. Supporting the development of transparent interaction. In International Conference on Human-Computer Interaction. Springer, 79–90.
[3] Michela Bassolino, Andrea Serino, Silvia Ubaldi, and Elisabetta Làdavas. 2010. Everyday use of the computer mouse extends peripersonal space representation. Neuropsychologia 48, 3 (2010), 803–811.
[4] Joanna Bergstrom-Lehtovirta, David Coyle, Jarrod Knibbe, and Kasper Hornbæk. 2018. I Really Did That: Sense of Agency with Touchpad, Keyboard, and On-skin Interaction. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). ACM, New York, NY, USA, Article 378, 8 pages. https://doi.org/10.1145/3173574.3173952
[5] Monica Biggio, Ambra Bisio, Laura Avanzino, Piero Ruggeri, and Marco Bove. 2017. This racket is not mine: The influence of the tool-use on peripersonal space. Neuropsychologia 103 (2017), 54–58.
[6] Olaf Blanke. 2012. Multisensory brain mechanisms of bodily self-consciousness. Nature Reviews Neuroscience 13, 8 (2012), 556.
space, and their modulation on the experienced ownership.           [7] Emily K Bloesch, Christopher C Davoli, Noam Roth, James R Brockmole,
We kept distance to targets constant in Experiment 1; distance          and Richard A Abrams. 2012. Watch this! Observed tool use affects
                                                                        perceived distance. Psychonomic Bulletin & Review 19, 2 (2012), 177–183.
Tool Extension in Human–Computer Interaction                                                    CHI 2019, May 4–9, 2019, Glasgow, Scotland, UK
 [8] Daniel Blustein, Adam Wilson, and Jon Sensinger. 2018. Assessing the quality of supplementary sensory feedback using the crossmodal congruency task. Scientific Reports 8, 1 (2018), 6203.
 [9] Susanne Bødker. 1991. Through the Interface: A Human Activity Approach to User Interface Design. DAIMI Report Series 16, 224 (1991).
[10] John Brooke et al. 1996. SUS: A quick and dirty usability scale. Usability Evaluation in Industry 189, 194 (1996), 4–7.
[11] Lucilla Cardinali, Francesca Frassinetti, Claudio Brozzoli, Christian Urquizar, Alice C Roy, and Alessandro Farnè. 2009. Tool-use induces morphological updating of the body schema. Current Biology 19, 12 (2009), R478–R479.
[12] Patricia I. Cornelio Martinez, Emanuela Maggioni, Kasper Hornbæk, Marianna Obrist, and Sriram Subramanian. 2018. Beyond the Libet Clock: Modality Variants for Agency Measurements. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI '18). ACM, New York, NY, USA, Article 541, 14 pages. https://doi.org/10.1145/3173574.3174115
[13] David Coyle, James Moore, Per Ola Kristensson, Paul Fletcher, and Alan Blackwell. 2012. I Did That! Measuring Users' Experience of Agency in Their Own Actions. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '12). ACM, New York, NY, USA, 2025–2034. https://doi.org/10.1145/2207676.2208350
[14] Tiare Feuchtner and Jörg Müller. 2017. Extending the body for interaction with reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 5145–5157.
[15] Giulia Galli, Jean Paul Noel, Elisa Canzoneri, Olaf Blanke, and Andrea Serino. 2015. The wheelchair as a full-body tool extending the peripersonal space. Frontiers in Psychology 6 (2015), 639.
[16] Georg Goldenberg and Atsushi Iriki. 2007. From Sticks to Coffee-Maker: Mastery of Tools and Technology by Human and Non-Human Primates. Cortex 43, 3 (2007), 285–288.
[17] Arvid Guterstam, Joanna Szczotka, Hugo Zeberg, and H Henrik Ehrsson. 2018. Tool use changes the spatial extension of the magnetic touch illusion. Journal of Experimental Psychology: General 147, 2 (2018), 298.
[18] Antal Haans and Wijnand A. IJsselsteijn. 2012. Embodiment and telepresence: Toward a comprehensive theoretical framework. Interacting with Computers 24, 4 (2012), 211–218. https://doi.org/10.1016/j.intcom.2012.04.010
[19] A. Haans, W. A. IJsselsteijn, and Y. A. de Kort. 2008. The effect of similarities in skin texture and hand shape on perceived ownership of a fake limb. Body Image 5, 4 (Dec 2008), 389–394.
[20] Nicholas P Holmes. 2012. Does tool use extend peripersonal space? A review and re-analysis. Experimental Brain Research 218, 2 (2012), 273–282.
[21] Nicholas P Holmes, Gemma A Calvert, and Charles Spence. 2004. Extending or projecting peripersonal space with tools? Multisensory interactions highlight only the distal and proximal ends of tools. Neuroscience Letters 372, 1–2 (2004), 62–67.
[22] Nicholas P Holmes and Charles Spence. 2006. Beyond the body schema: Visual, prosthetic, and technological contributions to bodily perception and awareness. In Human Body Perception From the Inside Out (2006), 15–64.
[23] Kasper Hornbæk and Antti Oulasvirta. 2017. What Is Interaction? In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems (CHI '17). ACM, New York, NY, USA, 5040–5052. https://doi.org/10.1145/3025453.3025765
[24] Atsushi Iriki, Michio Tanaka, and Yoshiaki Iwamura. 1996. Coding of modified body schema during tool use by macaque postcentral neurones. Neuroreport 7, 14 (1996), 2325–2330.
[25] David Kirsh. 2013. Embodied Cognition and the Magical Future of Interaction Design. ACM Trans. Comput.-Hum. Interact. 20, 1, Article 3 (April 2013), 30 pages. https://doi.org/10.1145/2442106.2442109
[26] Angelo Maravita. 2006. From "body in the brain" to "body in space": Sensory and intentional components of body representation. In Human Body Perception From the Inside Out (2006), 65–88.
[27] Angelo Maravita and Atsushi Iriki. 2004. Tools for the body (schema). Trends in Cognitive Sciences 8, 2 (2004), 79–86.
[28] Angelo Maravita, Charles Spence, Steffan Kennett, and Jon Driver. 2002. Tool-use changes multimodal spatial interactions between vision and touch in normal humans. Cognition 83, 2 (2002), B25–B34.
[29] Antonella Maselli and Mel Slater. 2013. The building blocks of the full body ownership illusion. Frontiers in Human Neuroscience 7 (2013), 83. https://doi.org/10.3389/fnhum.2013.00083
[30] Bonnie A Nardi. 1996. Context and Consciousness: Activity Theory and Human-Computer Interaction. MIT Press.
[31] Francesco Pavani and Umberto Castiello. 2004. Binding personal and extrapersonal space through body shadows. Nature Neuroscience 7, 1 (2004), 14.
[32] Francesco Pavani, Charles Spence, and Jon Driver. 2000. Visual capture of touch: Out-of-the-body experiences with rubber gloves. Psychological Science 11, 5 (2000), 353–359.
[33] Daniel Perez-Marcos, Maria V Sanchez-Vives, and Mel Slater. 2012. Is my hand connected to my body? The impact of body continuity and arm alignment on the virtual hand illusion. Cognitive Neurodynamics 6, 4 (2012), 295–305.
[34] Valentin Schwind, Pascal Knierim, Cagri Tasci, Patrick Franczak, Nico Haas, and Niels Henze. 2017. These Are Not My Hands! Effect of Gender on the Perception of Avatar Hands in Virtual Reality. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 1577–1582.
[35] Ali Sengül, Michiel Van Elk, Giulio Rognini, Jane Elizabeth Aspell, Hannes Bleuler, and Olaf Blanke. 2012. Extending the body to virtual tools using a robotic surgical interface: Evidence from the crossmodal congruency task. PLoS ONE 7, 12 (2012), e49473.
[36] Andrea Serino, Michela Bassolino, Alessandro Farnè, and Elisabetta Làdavas. 2007. Extended multisensory space in blind cane users. Psychological Science 18, 7 (2007), 642–648.
[37] Charles Spence. 2011. Crossmodal correspondences: A tutorial review. Attention, Perception, & Psychophysics 73, 4 (May 2011), 971–995. https://doi.org/10.3758/s13414-010-0073-7
[38] Daniel Wigdor and Dennis Wixon. 2011. Brave NUI World: Designing Natural User Interfaces for Touch and Gesture. Elsevier.
[39] Jessica K Witt, Dennis R Proffitt, and William Epstein. 2005. Tool use affects perceived distance, but only when you intend to use it. Journal of Experimental Psychology: Human Perception and Performance 31, 5 (2005), 880.
[40] Regine Zopf, Greg Savage, and Mark A Williams. 2010. Crossmodal congruency measures of lateral distance effects on the rubber hand illusion. Neuropsychologia 48, 3 (2010), 713–725.