IMA Journal of Management Mathematics (2021) 00, 1–12
https://doi.org/10.1093/imaman/dpab035

                                                                                                                                                       Downloaded from https://academic.oup.com/imaman/advance-article/doi/10.1093/imaman/dpab035/6377514 by guest on 04 January 2022
          Improving service use through prediction modelling: a case study of a
                              mathematics support centre

                                                  Emma Howard†
                           School of Psychology, University College Dublin, Dublin, Ireland
                                † Corresponding author. Email: emma.howard@ucd.ie

                                                                      and
                                            Anthony Cronin
               School of Mathematics and Statistics, University College Dublin, Dublin, Ireland

                              [Received on 29 October 2020; accepted on 26 August 2021]

      In higher education, student learning support centres are examples of walk-in services with nonstationary
      demand. For many centres, the major expenditure is tutor wages; thus, optimizing tutor numbers and
      ensuring value for money in this area are key. In University College Dublin, the mathematics support
      centre (MSC) has developed a software system, which electronically records the time each student enters
      the queue, their start time with a tutor and time spent with a tutor. In this paper, we show how data
      analysis of 25,702 student visits and tutor timetable data, spanning 6 years, is used to identify busy and
      quiet periods. Prediction modelling is then used to estimate the waiting time for future MSC visitors.
      Subsequently, we discuss how this is used for staffing optimization, i.e. to ensure there is sufficient
      coverage for busy times and no resource wastage during quieter periods. The analysis described resulted
      in the MSC reducing the number of queue abandonments and releasing funds from overstaffed hours to
      increase opening hours. The methods used are easily adapted for any busy walk-in service, and the code
      and data referenced are freely available: https://github.com/ehoward1/Math-Support-Centre-.

      Keywords: staffing optimization; mathematics support; nonstationary demand; predictive modelling.

1. Background
A central part of management for any service is the effective staffing of trained personnel. In walk-
in services, where customer arrival is nonstationary, the issue of staffing can be complex. Defraeye
& Van Nieuwenhuyse (2016) provide a literature review of the research on staffing and rostering
for services with nonstationary demand. Komarudin et al. (2020) discuss the difference between
staffing and rostering optimization problems. They explain that staffing is ‘concerned with making
decisions regarding the quantity and characteristics of human resources in the organisation’ (p. 254),
whereas rostering involves assigning work shifts to the available personnel while taking into account
predetermined constraints. For example, Duenas et al. (2008) discuss the nurse roster problem whereby
nursing staff need to be scheduled for different work shifts (day, evening, night) in a hospital setting
according to specific constraints such as a nurse cannot work a night shift and then subsequently work
a day shift. Saccani (2012) focuses on the staffing optimization problem. He used an action research
approach to examine the problem in a call centre—forecasting call volumes using the time series
Holt-Winters exponential smoothing method with additive trend and multiplicative seasonality. Overall,
he emphasizes the need for transparency in forecasting processes, evaluation of the implementation

                       © The Author(s) 2021. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.

and involvement of managers. In our paper, we examine the staffing problem for a busy academic
support centre with nonstationary demand, focusing on the quantity of staff needed to satisfy that
demand.

     The mathematics support centre (MSC) has become an essential academic support component for
students of higher education in Ireland, the UK, Australia, Germany and elsewhere (Lawson et al.,
2020). While some MSCs offer bookable appointments, the most common form of student support
is the dedicated drop-in space where students of the university can access expert tutors for help with
mathematical queries from their programme. MSCs tend to use a first-come first-served or skill-based
routing queuing policy (Defraeye & Van Nieuwenhuyse, 2016). MSCs are complementary to regular
timetabled teaching activities such as formal lectures, tutorials and laboratories. However, unlike these
timetabled activities, students’ visits to an MSC do not occur at a regular rate, and this is one of the
most challenging problems MSC managers face. For example, in University College Dublin (UCD), the
number of students attending the MSC oscillates as each semester progresses (12 weeks of teaching
followed by one revision week and two examination weeks) with peaks in Weeks 7 and 10–13 of both
semesters, coinciding with the mid-term and end-of-semester examinations, respectively. However, the
service is well utilized from Week 1 and therefore meeting students’ needs requires sufficient tutors with
the relevant mathematical backgrounds to be on duty when students access the centre. Without sufficient
tutors, learners can experience lengthy waiting times and overcrowding of the centre. Similar to other
walk-in services subject to nonstationary demand, this in turn may lead to learners leaving without
receiving tuition (abandonment of the queue) and conflicts arising between disappointed students and
overloaded tutors. These factors can result in students developing negative perceptions of the service that
can affect word-of-mouth recommendations and service promotion. In a survey of mathematics support
practitioners in Ireland (Cronin et al., 2016), the top two suggestions for enhancing provision were
‘longer opening hours’ (39% of respondents) and ‘more tutors’ (26% of respondents). However, hiring
more staff and extending opening hours increases costs. Hence, an MSC service must look to maximize
its current offering in terms of minimizing waiting times for students and staffing appropriately qualified
tutors at relevant times. This requires the accurate identification of busy and quiet periods within the
MSC.
     There has been limited research investigating usage patterns of MSCs (Edwards & Carroll, 2018)
and subsequently staffing or rostering optimization. How students make use of the MSC space and
resources affects the length of time they spend there and their waiting times. For example, in addition
to regular attendees, the MSC can be suddenly ‘flooded’ by students prior to assessments (Wilson &
Gillard, 2008). As this ‘flooding’ of MSCs tends to be infrequent, Lawson et al. (2020, p. 13) note that
‘to double staff at all times simply for these peaks would be wasteful’. Using computational approaches
for the staffing of tutors in MSC settings is rare. An example, by Gillard et al. (2016), uses queueing
theory to reformulate the problem of rostering 8 tutors over 10 MSC opening hours as a finite-source
queueing model. Here the students represent machines breaking down where a breakdown corresponds
to a student requiring assistance from a tutor. Using their MSC service data, they calculate that students
on average sought support twice per hour and that tutors spent on average 10 min with a student. They
add further complexity by considering the skills of the tutors versus the queries asked by the students.
They treat the assignment of staff as ‘an optimization problem, the objective [being] to maximise a
linear cost function to ensure that as many fields of mathematics... are covered as possible within each
shift’ (p. 205). A limitation of this method is its scalability.
     We approach the staffing problem by examining historic MSC data of student wait times and by
using prediction modelling to estimate student wait times over the course of a semester; the chosen
prediction method, K-nearest neighbours, achieves a mean absolute error of 8.2 min using 10-fold cross-
validation. We show how the MSC manager uses predictions of student wait times for the next semester
to staff tutors for that semester. This has resulted in the MSC reducing the number of abandonments of
the queue and releasing funds from overstaffed hours to extend opening hours. Additionally, we use
prediction modelling in two further ways. Firstly, when a student enters the MSC, they can see on the
projector screen their expected waiting time as well as their current queue position. This prediction can
be important in assisting a student to prepare for their visit appropriately and to inform them of how long
they can expect to wait before engaging a tutor. The second instance is through the implementation of
an RShiny app (Chang et al., 2019) hosted on the UCD MSC website, which allows students (external
to the centre) to look up the current waiting time and see predicted waiting times for the semester by
day and semester week. This allows students to plan their MSC visit for quieter periods.
    In order to analyse students’ use of an MSC, and subsequently use prediction modelling for staffing,
accurate historical data on when student visits occurred are needed. A limiting factor to the utilization
of data analytics in an MSC is the limited use of electronic systems to capture engagement data within
MSCs (Cronin et al., 2016). However, provided a robust record of client waiting times is captured,
the methods used in this paper can be utilized by any busy walk-in service to reduce waiting times
and improve client satisfaction, e.g. call centres, airports, motor tax offices, academic writing centres,
vaccination centres, hospital A&E, etc.
    The structure of our paper is as follows: in Section 2, we describe the MSC data collected and
analysis conducted. In Section 3, we show how data analytics can be used to identify busy and quiet
periods in an MSC and discuss how this knowledge impacted staffing decisions. We also provide details
of an application developed for students’ use which incorporates prediction modelling. In Section 4,
we discuss the performance metrics used to evaluate the impact of the prediction modelling and the
limitations of the approach taken.

2. Method
2.1 Overview of the electronic system
The UCD MSC regular opening hours operate on a first-come, first-served drop-in basis (similar to
walk-in customer service centres). The MSC uses a bespoke session management system, developed
in-house, to electronically record the time each student enters the queue by logging in
on a computer at the MSC entrance. We will refer to this as ‘the system’ from now on. Using the
system, each MSC tutor can initialize, pause or end a student–tutor session, and all these time points
are recorded on the system. In addition, the tutor can categorize the mathematics query, both at a high
level (e.g. Linear Algebra) and granular level (e.g. addition of matrices) and describe the help provided
through logging free-form feedback comments on the session (Cronin & Meehan, 2016, 2020). Through
the weekly automated sending of these comments to the relevant lecturer, the feedback loop is closed.
The software system and details on how a tutor operates a session with a student can be viewed at:
https://www.youtube.com/watch?v=pbswG50OuCs&feature=youtube.

2.2 Participants
In UCD, the MSC supports in excess of 5,500 student visits per year from over 250 distinct modules
across all 6 colleges of the university. The MSC has been using the system since Semester 2 of the
2014/15 academic year. When students log in, they are given the option to consent or not to their

Table 1     Data extracted from the system

Start                    Wait      Tutor_time    Wall_time
13 October 2017 13:02    3,660     233           833
17 October 2017 16:39    295       3,374         3,375
17 October 2017 16:41    136       2,130         4,959
17 October 2017 15:45    30        1,096         4,536

visit data being used for research and evaluation purposes. To date, in excess of 99% of all visitors
have consented to this request. For this study, we are interested in students’ waiting time to see a
tutor. A number of visitor entries on the system were removed from analysis where their waiting time
was deemed not applicable. For example, the MSC runs a number of workshops called ‘Hot Topics’.
These are a proactive way of supporting a significant minority of students within a module who may be
lacking some prerequisite material. Hot topic groups range in size from 8 to 35 students, run more like
a traditional tutorial session and are booked in advance. Thus, while these students log in on the system
for the purposes of maintaining the total visits record, their waiting time is not related to the regular
drop-in MSC waiting time. In other cases, the waiting times were beyond normal expectations, defined
as anything in excess of 90 min, and likely to have been caused by tutor error. For example, tutors
may not have logged students into the system upon sitting with a student, thus extending their recorded
waiting time. Removing these instances gives 25,702 student visits as historical data for analysis from
the period of Semester 2 of 2014/2015 to Semester 1 2019/2020 inclusive.

2.3 The data
Table 1 gives an extract of the data retrieved from the system. The ‘start’ variable is the time when
a student logs into the MSC queue with their student number. The ‘wait’ variable denotes how many
seconds have elapsed until a tutor logs the student into their interface, i.e. when the tutor is available to
support the student. This signals the beginning of the period that the tutor spends with the student. The
‘Tutor_time’ variable is the amount of time in seconds that the tutor spends with a student providing
support. The ‘Wall_time’ variable is the amount of time the tutor spends with a student, in addition to
any paused time when a tutor leaves the student to work on their own independently or a paused period
of time when the student is swapped between attending tutors.
    For the purposes of conducting data analysis and prediction modelling, we wish to extract the
maximum amount of information from these four variables (see Table 2). In addition, the number of
tutors on duty in the MSC can be calculated for each time period from the MSC records and has been
included in Table 2. ‘New System’ refers to the time period from October 2015, when the MSC moved
from supporting all university students to supporting only those students from preparatory, first- or
second-year programmes. Hence, from October 2015, third- and fourth-year undergraduate students in
addition to postgraduate students were no longer assisted at the MSC unless they were registered to
earlier stage modules more typical of a first- or second-year student (e.g. an elective module). ‘Tutor
start time’ is considered to be a student’s start time with their wait time added on. For example, if a
student logs into the system at ‘13 November 2019 11.21’ and waits 480 s, or 8 min, the ‘Tutor Start
Time’ is ‘13 November 2019 11.29’. ‘Tutor End time’ is considered to be the ‘Tutor Start Time’ with the
time spent with a tutor added on. Continuing with the same example, if this tutor supports the student
for 720 s, or 12 min, then the ‘Tutor End Time’ is ‘13 November 2019 11.41’.
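This construction of the derived time variables can be sketched as follows. This is a minimal illustration in Python (the authors' published code is in R, and the function name here is ours), reproducing the worked example above:

```python
from datetime import datetime, timedelta

def derive_session_times(start, wait_s, tutor_time_s):
    """Derive 'Tutor start time' and 'Tutor end time' from the raw
    system variables; durations are recorded in seconds."""
    tutor_start = start + timedelta(seconds=wait_s)
    tutor_end = tutor_start + timedelta(seconds=tutor_time_s)
    return tutor_start, tutor_end

# Worked example from the text: log-in at 13 November 2019 11.21,
# a 480 s (8 min) wait, then 720 s (12 min) with a tutor.
login = datetime(2019, 11, 13, 11, 21)
tutor_start, tutor_end = derive_session_times(login, 480, 720)
print(tutor_start)  # 2019-11-13 11:29:00
print(tutor_end)    # 2019-11-13 11:41:00
```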

2.4 Data analysis
To identify busy and quiet periods in the MSC, we are interested in the length of time students wait until
they start a session with a tutor, i.e. their waiting time. To investigate this, we initially ran descriptive
statistics, predominantly boxplots, of the waiting time controlling for specific variables from Table 2.
In a boxplot, the middle line of each box represents the median waiting time, and the upper and lower
lines of the boxes represent the upper and lower quartiles, respectively. The code for further descriptive
analysis and figures to those shown in Section 3 are available at https://github.com/ehoward1/Math-
Support-Centre-.
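The quantities behind such a boxplot can be computed directly from the recorded waits. The sketch below is in Python rather than the authors' R, and the waiting times are invented purely for illustration:

```python
import statistics

def boxplot_summary(waits):
    """Five-number summary underlying a boxplot of waiting times (min)."""
    q1, median, q3 = statistics.quantiles(waits, n=4)
    return {"min": min(waits), "q1": q1, "median": median,
            "q3": q3, "max": max(waits)}

# Hypothetical waiting times (minutes), grouped by tutors on duty
waits_by_tutors = {
    2: [4, 7, 10, 14, 22, 31],
    5: [1, 2, 3, 3, 5, 9],
}
for n_tutors, waits in sorted(waits_by_tutors.items()):
    print(n_tutors, boxplot_summary(waits))
```

Controlling for a variable from Table 2 then amounts to grouping the waits by that variable before summarizing, as above.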
    To identify future waiting times, prediction modelling was employed using historical MSC data
from the 12-week teaching period of the semester, i.e. not including revision week (Week 13) and
examination weeks (Weeks 14 and 15). The revision and examination weeks are excluded as waiting
times have unusually high variance during this period and the tutor timetables also change significantly
over this period. The prediction methods compared were random forests (Breiman, 2001), principal
components regression (Ilin & Raiko, 2010), K-nearest neighbours (Hechenbichler & Schliep, 2004),
support vector machines (Karatzoglou et al., 2004) and splines (Friedman, 1991). The accuracy of these
methods was compared using 10-fold cross-validation and mean absolute error. K-nearest neighbours
achieve the lowest mean absolute error at 8.2 min, with 50% of errors between -1.7 min and +7.4 min.
At the time of implementation of the RShiny app (2017/18), random forests gave the best prediction
and was subsequently implemented for the prediction modelling. To predict future waiting times, a
dataset was created for every time point in the following semester. The prediction modelling analysis
was approved by the university ethics committee.
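The comparison procedure can be sketched generically. The Python fragment below (illustrative only; the paper's implementation is in R) runs 10-fold cross-validation of mean absolute error for a trivial mean-only baseline, with the `fit`/`predict` arguments standing in for any of the candidate methods:

```python
from statistics import mean

def kfold_mae(X, y, fit, predict, k=10):
    """Estimate mean absolute error by k-fold cross-validation:
    hold out each fold in turn, train on the rest, average |error|."""
    folds = [list(range(i, len(X), k)) for i in range(k)]
    errors = []
    for fold in folds:
        train = [i for i in range(len(X)) if i not in fold]
        model = fit([X[i] for i in train], [y[i] for i in train])
        errors += [abs(predict(model, X[i]) - y[i]) for i in fold]
    return mean(errors)

# Toy demonstration with a mean-only baseline model (illustrative data;
# the paper compares random forests, PCR, KNN, SVMs and splines).
X = list(range(20))
y = [5.0, 7.0] * 10
mae = kfold_mae(X, y, fit=lambda X, y: mean(y),
                predict=lambda m, x: m, k=10)
print(mae)
```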
    As mentioned, for this dataset, the two best methods for prediction modelling have been K-nearest
neighbours and random forests. K-nearest neighbours calculate a predicted waiting time (response
variable) for a test case by identifying the ‘k’ training cases that are the most similar to the test case
and averaging the waiting times of those ‘k’ training cases. Here, k = 5 is used. In the case of a non-numeric
response variable, the prediction is instead made by majority vote. K-nearest neighbours is a
nonparametric method (Hechenbichler & Schliep, 2004). Random forests is an ensemble method, which
constructs multiple decision trees and averages the waiting time across the decision trees constructed.
To allow trees to be independent from each other, for each tree, a bootstrap sample of data is chosen and
a random subset of variables is considered at each split of the tree.
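The K-nearest-neighbours prediction step can be sketched in a few lines of pure Python. The feature encoding here (semester week, hour of day, tutors on duty) and the data are invented for illustration; the paper's analysis follows Hechenbichler & Schliep (2004) in R:

```python
import math
from statistics import mean

def knn_predict(train_X, train_y, x, k=5):
    """K-nearest-neighbours regression: average the waiting times of
    the k training cases closest (in Euclidean distance) to query x."""
    order = sorted(range(len(train_X)),
                   key=lambda i: math.dist(train_X[i], x))
    return mean(train_y[i] for i in order[:k])

# Hypothetical features: (semester week, hour of day, tutors on duty)
train_X = [(7, 11, 2), (7, 12, 2), (7, 11, 3), (2, 14, 4),
           (12, 10, 2), (12, 11, 2), (2, 15, 4), (7, 10, 2)]
train_y = [18.0, 22.0, 12.0, 3.0, 25.0, 21.0, 4.0, 16.0]  # minutes

# Predicted wait for Week 7, 11 am, two tutors on duty (k = 5)
print(knn_predict(train_X, train_y, (7, 11, 2), k=5))  # 17.8
```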

3. Results
3.1 Distribution of waiting times
Initially, we investigated the waiting times from 2014/2015 Semester 2 until 2019/2020 Semester 1
using a histogram (see Fig. 1). The average waiting time is 14.5 min with the median waiting time
being 7.6 min. As the waiting times are exponentially distributed, they were log-transformed for the
prediction analysis. This truncated exponential pattern is persistent for the waiting times for every year
and semester.
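The fitted rate in Fig. 1 is consistent with a maximum-likelihood exponential fit, for which the rate estimate is simply the reciprocal of the mean wait. A short Python check (the `log1p` safeguard against zero waits is our assumption, not necessarily the authors' exact transform):

```python
import math

# Maximum-likelihood fit of an exponential distribution: the rate
# estimate is the reciprocal of the sample mean.
mean_wait_min = 14.5          # mean waiting time reported in the text
lam = 1 / mean_wait_min
print(round(lam, 2))          # 0.07, matching the rate fitted in Fig. 1

# Log-transform applied before prediction modelling; log1p guards
# against zero waits (an illustrative safeguard).
waits = [0.5, 3.2, 7.6, 14.5, 40.0]
log_waits = [math.log1p(w) for w in waits]
```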
    Of the 25,702 student visits to the MSC, 62% occurred in Semester 1. Also, Semester 1 has on
average a longer waiting time of 14.9 min as compared to 13.7 min in Semester 2. Historically, 10%
        Fig. 1. Distribution of waiting times in the MSC with a truncated exponential distribution fitted (λ = 0.07).

Table 2     Data variables extracted from the system data
                                            New variables
Year                                        Week in year                                            Tutor start time
Month                                       Semester week                                           Tutor end time
Day                                         Semester                                                Number in queue
Hour                                        New system                                              Number of tutors

of student visits have occurred in both Week 7 and Week 12 of the semester, and 44% of student visits
occurred in the first 3 h of opening for the MSC (10 am–1 pm). The average waiting time for students
is longest on Fridays (at 17.7 min) and shortest on Mondays (at 12.5 min). The longer waiting time
for Friday is likely a reflection of the shorter opening times on Friday coinciding with the morning
peak.
    Waiting times could also be examined based on any variable from Table 2. For example, Fig. 2 shows
the waiting times and the number of student visits based on the number of tutors working in the MSC
at any given time. The number of student visits is given above the upper left of each box. For example,
during periods when five tutors were on duty concurrently in the MSC, there have been 518 student
visits to the UCD MSC. Unsurprisingly, as the number of tutors increased, the waiting times decreased.
For example, the median waiting time when two tutors are on duty is 10 min, whereas when five tutors
are on duty, it decreases to approximately 3 min. The difference between the waiting times when four or
five tutors are on duty is marginal. Examining the waiting times based on different variables gives MSC
management greater insight into when long waiting times may be expected and how best resources can
then be managed.
    Examining waiting times for specific periods can also prove useful for management. For example,
Fig. 3 displays the waiting time for Weeks 1–12 of Semester 1 of 2018/2019 and 2019/2020 based
on the day of the week. For 2018/2019, Tuesdays had a longer waiting time in comparison with the
other days. Compared to Wednesdays, the MSC on Tuesdays had an additional 109 student visits.
              Fig. 2. Boxplot of 2014/2015–2019/2020 waiting times based on number of tutors on duty.

However, on Mondays, the MSC had 37 more student visits than Tuesdays and the lowest average
waiting time. Since university timetables tend to stay consistent between academic years, this type of
information can be used by MSC management for staffing for the corresponding semester in the next
academic year.
    To reduce the longer waiting times on Tuesdays, more tutor hours were assigned to Tuesdays in
Semester 1 of 2019/2020 in the UCD MSC. The median waiting time for Tuesdays reduced substantially
from 16 min in 2018/2019 to 7.5 min in 2019/2020, an 8.5-min reduction in the median waiting time (see
Fig. 3). While the coordinator assigned an increased number of tutor hours to Tuesdays in 2019/2020,
the waiting time would presumably have also been impacted by the decrease in MSC visits (658 reduced
from 754). The reduced waiting time is likely a combination of both of these factors. Of note for MSC
management is the change in attendance on Mondays across the 2 years. It would be interesting to observe
whether this was owing to a change in module timetabling or a random effect. UCD MSC management
could investigate this on the system by examining the topics of the queries and the module type logged
by the tutors.

3.2 Incorporating prediction modelling into the MSC
Prediction modelling has been incorporated into the MSC in three ways. The first instance occurs in
the MSC itself. When a student enters the MSC, the student can see on a projector screen: the expected
        Fig. 3. Waiting times for Weeks 1–12 of the 2018/2019 and 2019/2020 Semester 1 by the day of the week.

waiting time of the next person to see a tutor; how long the queue currently is (in terms of number of
students in the queue) and the mathematical strengths of the tutors on duty (see Fig. 4).
    Unlike the two other uses of prediction modelling discussed later, this prediction is built into the
system design. It is based on live MSC visitor and tutor information of: the number of students working
by themselves; the number of students working with tutors; the number of student groups working by
themselves; the number of student groups working with tutors; the number of students waiting for a
tutor (i.e. current queue size) and how long the student at the top of the queue has been waiting.
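The paper does not give the formula used for this live estimate. Purely as an illustration of how such inputs might combine, a naive back-of-the-envelope estimate could look like the following sketch (all names and the 17-min mean session length are our assumptions, not the system's actual model):

```python
def naive_wait_estimate(queue_size, tutors_on_duty, busy_with_students,
                        mean_session_min=17.0):
    """Illustrative rough wait estimate from live queue state. NOT the
    system's actual model, which also uses paused sessions, groups and
    the head-of-queue wait."""
    free = max(tutors_on_duty - busy_with_students, 0)
    if free > 0:
        return 0.0  # a tutor is free now
    # Each tutor frees up roughly once per mean session, so the queue
    # drains at about tutors_on_duty students per mean_session_min.
    return queue_size * mean_session_min / tutors_on_duty

# Four students queuing, two tutors both occupied
print(naive_wait_estimate(4, 2, 2))  # 34.0 minutes
```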
    Both tutors and students in the MSC benefit from these predictions. Tutors can take a glance at the
projector screen to see if and how the queue is building, and also how many other tutors are on duty with
them and their relative strengths should they need to transfer a student to one of them. In comparison,
students benefit by checking the screen to see if they have time to wait to see a tutor. If their time is
limited, the student can decide to come back later. If they join the MSC queue, they have an estimate of
how long they have to prepare at an MSC table before they start a session with a tutor.
Prediction modelling has also been incorporated through an RShiny application on the UCD MSC
website. This was trialled in 2017/2018 and proved popular with students, as evidenced by the increase in
traffic to this part of the MSC website and via the embedding of this app into the university’s central
Student Desk suite of apps. Through accessing the application, students were informed of the current
estimated waiting time to see a tutor. Also, students could see a graph of the estimated waiting time
                                  Fig. 4. Queue image from the MSC projector.

for that day. Students had the option of seeing the predicted waiting time for any day and week in
the semester (see Fig. 5). The aim of this application was to allow students, particularly students who
lived off campus and needed to make considerable journeys to campus, to plan their study day more
effectively.
    Finally, prediction modelling has been fitted over the waiting times for every time point of the
semester of interest, thus allowing the MSC manager to identify periods of longer or shorter waiting times
and to schedule tutors accordingly.

4. Discussion
We have described a number of ways that prediction modelling can be used to introduce both
cost and resource effective measures in a busy student learning support setting. Defraeye & Van
Nieuwenhuyse (2016) provide a list of performance metrics such as number in system/queue, waiting
time, abandonments/throughput, length of stay, and utilization. To evaluate the impact of this current
work, we consider the metrics of waiting times for visitors, length of time for tutor–student interactions
and change in the number of visitors left without being seen. Descriptive statistics of these metrics are
basic indicators, and subsequently, the next phase of research will involve more formal assessment of
how to evaluate the impact of using the prediction models.
    Firstly, using the historical data for the teaching semester, the median waiting time to see an MSC
tutor reduced from 7 min in 2015/2016 (n = 4,112) to 5 min in 2016/2017 (n = 4,681). In 2017/2018
and 2018/2019, the waiting time was 7 min (n = 4,970) and 8 min (n = 4,864), respectively. While
there was a marginal increase in waiting time over the past 2 years, there was also an increase in the
number of student visits to the MSC. Balancing the waiting time and time spent with tutors is a challenge
for a busy walk-in service. The median time spent with a tutor for the teaching semester has remained
steady over these years. In 2015/2016, this was 16.9 min; in 2016/2017, it rose to 17.5 min before
dipping to 15.6 min in 2017/2018, with a further increase to 17 min in 2018/2019. This is again despite the
increase in annual student visits and no extra expenditure on tutors. It should be noted that this statistic
refers strictly to the time that a student spends with a tutor receiving support and does not include the
                                 Fig. 5. MSC app for estimating waiting times.

time durations when a student is paused to do work independently of the tutor or when their session is
paused while a tutor tends to another student/table.
    The implementation of the predictive models has accrued savings of over €12,000 as tutor
expenditure was utilized in a more efficient way, i.e. fewer tutors in quieter periods. While the
number of annual tutor hours remained the same, the allocation of these hours changed. For example,
opening hours were extended to 7.30 pm Monday to Thursday in Semester 1 2017/2018 and to 8 pm in
2018/2019, and until 7.30 pm Monday to Thursday for Semester 2 of both 2017/2018 and 2018/2019.
Friday opening hours were also extended from 10 am–1 pm to 10 am–2 pm in both 2017/2018 and
2018/2019 using this model. Spreading the tutor allocation in a more efficient manner using the model
has also meant we have been able to hire more tutors (with a more diverse array of specialisms), from
21 tutors in 2017/2018 to 28 tutors in 2018/2019 albeit using the same number of annual tutor hours
(Cronin, 2020).
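The reallocation step can be pictured with a minimal sketch, assuming demand predictions per time slot: a fixed budget of tutor-hours is spread across slots in proportion to predicted visits. The slot names and figures below are invented for illustration and are not taken from the MSC timetable.

```python
# Hypothetical predicted visit counts per time slot.
predicted_visits = {"Mon 10-13": 40, "Mon 17-20": 25, "Fri 10-12": 10, "Fri 12-14": 5}
total_tutor_hours = 16  # fixed budget: hours are reallocated, not added

total_demand = sum(predicted_visits.values())

# Allocate tutor-hours proportionally to predicted demand, rounded to whole hours.
allocation = {
    slot: round(total_tutor_hours * demand / total_demand)
    for slot, demand in predicted_visits.items()
}
```

In practice rounding can leave the allocated total slightly above or below the budget, so a final adjustment pass (or manual judgement by the coordinator) is still needed.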
    Lastly, we analysed the number of visitors who left the MSC without being supported by a tutor. In
2015/2016, 2016/2017, 2017/2018, 2018/2019 and 2019/2020, the numbers of visitors who left without
being seen were 335, 280, 212, 306 and 193, respectively. While it is regrettable that any student leaves
a support centre without receiving assistance, the use of such models to optimize staffing has helped
minimize this event.
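Counting such departures is straightforward once visits are logged electronically: a visitor who entered the queue but has no recorded tutor start time left unseen. The record layout below is hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical queue log: tutor_start is None when a visitor left before being seen.
visits = [
    {"entered_queue": "10:00", "tutor_start": "10:05"},
    {"entered_queue": "10:10", "tutor_start": None},
    {"entered_queue": "10:15", "tutor_start": "10:30"},
]

# Visitors with a queue entry but no tutor start time left without support.
left_unseen = sum(1 for v in visits if v["tutor_start"] is None)
```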

    There are limitations in using prediction modelling for staffing; for example, accurate and robust
visitor records are needed, and this requires buy-in from all staff. This prediction modelling approach
is in some ways reliant on the static nature of module timetabling and on the knowledge that a large
number of MSC visits arrive prior to assessments (Edwards & Carroll, 2018). In UCD, modules tend
to be timetabled at the same time slots each year for convenience, and module assessment due dates
also tend to be consistent. If there was substantial change to the timetabling of modules and/or their
assessments, we would expect poorer predictions. Other academic changes, such as new modules being
introduced, a change of module format, a new continuous assessment regime or the offering of a
module being switched from Autumn to Spring, would also impact the accuracy of the predictions.
The effect of academic changes to a module on MSC visits can potentially be reduced via maintaining
strong communication links with the heads of teaching and learning, module coordinators and the MSC
coordinator (see Cronin & Meehan, 2020). The authors would be interested in whether improvements can be
made to further utilize the model; for example, unlike our model, the approach taken by Gillard et al.
(2016) considered tutors' mathematical strengths.
    The next phase of this research will also involve examining the impact that a student’s background
has on their time spent with an MSC tutor in terms of their programme, stage in their degree, the specific
module and subject with which they are seeking help, in addition to their gender, international status,
prior mathematical learning and other potentially determining demographics. Predicting a student's length
of stay in an MSC is vital because it may be considered a reliable and valid proxy for the
consumption of resources.
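As a toy sketch of that next phase, a single background variable (stage in degree, with invented data) can be related to session length by ordinary least squares; the planned models would combine many such features and richer methods. Everything below is illustrative, not a result from the MSC data.

```python
# Invented data: stage in degree (1-3) vs. minutes spent with a tutor.
stage   = [1, 1, 2, 2, 3, 3]
minutes = [20, 18, 16, 15, 12, 11]

n = len(stage)
mean_x = sum(stage) / n
mean_y = sum(minutes) / n

# Ordinary least squares for a single predictor.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(stage, minutes)) \
        / sum((x - mean_x) ** 2 for x in stage)
intercept = mean_y - slope * mean_x

# Expected session length for a stage-2 student under this toy model.
predicted = slope * 2 + intercept
```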

5. Conclusions
While each customer/student walk-in service centre has its own requirements and demands, there is a
certain amount of commonality when it comes to meeting the needs of customers/students to ensure
satisfaction with the service. Obviously, long queues, excessive waiting times and clients leaving the
service without being supported by tutors will reflect negatively on a service’s reputation and success.
Also, the assignment of sufficient tutors with requisite skills at the required times avoids resource
wastage in quieter times and in our experience can prove useful when applying for sustained or increased
funding. Thus, we hope by sharing our experiences and the efforts made to minimize these issues in the
MSC context we can help other busy academic support centres and customer-facing services do the
same.
     As mentioned, we have provided the (anonymized) data and code in a GitHub repository, which
is flexible in that it can be easily adapted to suit the conditions (the number of staff, opening hours,
etc.) of other such services including academic support centres and customer call centres. In terms of
supporting decision-making for budget planning, staffing levels and other plans around expected peak
waiting times, we hope this paper and the complementary code can help ease the management of walk-in
support services in planning for and providing the best support experience for those who use the service.

Acknowledgements
We would like to thank Dr Raja Mukherji for all his support with designing, creating and maintaining
the UCD MSC session management system. We would also like to thank the editor and the anonymous
reviewers for their very helpful comments on the previous version of this paper.
Disclosure statement
No potential conflict of interest was reported by the authors.

                                                  References
Breiman, L. (2001) Random forests. Machine Learning, 45, 5–32.
Chang, W., Cheng, J., Allaire, J., Xie, Y. & McPherson, J. (2019) Shiny: Web application framework for R.
     CRAN. Retrieved from https://CRAN.R-project.org/package=shiny.
Cronin, A., Cole, J., Clancy, M., Breen, C. & O’Se, D. (2016) An audit of Mathematics Learning Support
     provision on the island of Ireland in 2015. Dublin: National Forum for the Enhancement of Teaching and
     Learning in Higher Education.
Cronin, A. (2020) University College Dublin Maths Support Centre: Annual Report 2018/19. Dublin: UCD. Available
     at https://www.ucd.ie/msc/t4media/MSC_Annual_Report_2018_19-3.pdf (Accessed: 15 July 2020).
Cronin, A. & Meehan, M. (2016) The development and evolution of an advanced data management system
     in a mathematics support centre. Proceedings of the CETL-MSOR Conference 2015. London, UK: Higher
     Education Academy, pp. 21–27.
Cronin, A. & Meehan, M. (2020) Utility of mathematics support centre data to lecturers of large first year
     university mathematics courses. International Journal of Mathematical Education in Science and Technology.
     Published online 2 June 2020.
Defraeye, M. & Van Nieuwenhuyse, I. (2016) Staffing and scheduling under nonstationary demand for service:
     A literature review. Omega, 58, 4–25.
Duenas, A., Tütüncü, G. Y. & Chilcott, J. B. (2008) A genetic algorithm approach to the nurse scheduling
     problem with fuzzy preferences. IMA Journal of Management Mathematics, 20, 369–383.
Edwards, A. & Carroll, D. (2018) Predictors of demand for mathematics support. Proceedings of the 42nd
     Conference of the International Group for the Psychology of Mathematics Education (E. Bergqvist, M.
     Österholm, C. Granberg & L. Sumpter eds), vol. 2. Umeå, Sweden: PME, pp. 395–402.
Friedman, J. H. (1991) Multivariate adaptive regression splines. The Annals of Statistics, 19, 1–67.
Gillard, J., Knight, V., Vile, J. & Wilson, R. (2016) Rostering staff at a mathematics support service using a
     finite-source queueing model. IMA Journal of Management Mathematics, 27, 201–209.
Hechenbichler, K. & Schliep, K. P. (2004) Weighted k-Nearest-Neighbor techniques and ordinal classification.
     Discussion Paper 399, SFB, Ludwig-Maximilians University Munich, pp. 1–16.
Ilin, A. & Raiko, T. (2010) Practical approaches to principal component analysis in the presence of missing values.
     Journal of Machine Learning Research, 11, 1957–2000.
Karatzoglou, A., Smola, A., Hornik, K. & Zeileis, A. (2004) Kernlab – an S4 package for kernel methods in
     R. Journal of Statistical Software, 11, 1–20.
Komarudin, De Feyter, T., Guerry, M-A. & Berghe, G. V. (2020) The extended roster quality staffing problem:
     addressing roster quality variation within a staffing planning period. Journal of Scheduling, 23, 253–264.
Lawson, D., Grove, M. & Croft, T. (2020) The evolution of mathematics support: A literature review.
     International Journal of Mathematical Education in Science and Technology, 51, 1224–1254.
Saccani, N. (2012) Forecasting for capacity management in call centres: Combining methods, organization, people
     and technology. IMA Journal of Management Mathematics, 24, 189–207.
Wilson, R. & Gillard, J. W. (2008) Some problems associated with running a Maths Support Service. CETL-
     MSOR Conference 2008 Conference Proceedings (D. Green ed). Birmingham: The Maths, Stats & OR
     Network, pp. 94–99.