eVision Review Project – Engagement Monitoring

Simon McLean, Head of Web & IT Support, Information & Data Services
What is Usability?

Why Bother?

Types of usability testing

Usability Testing in eVision

Report findings
What is Usability?

Definition from Wikipedia:
• “Usability is the ease of use and learnability of a human-made object…” (physical or virtual)
• “…Usability includes methods of measuring usability, such as needs analysis and the study of the principles behind an object's perceived efficiency or elegance…”
• “…usability studies the elegance and clarity with which the interaction with a computer program or a web site (web usability) is designed.”
Why Bother?
• In e-commerce, it could save millions based on speed/transaction times alone.

• In education, it could improve the ‘Student Experience’. We go beyond ‘it works functionally’ to perhaps ‘that helped me achieve my objective in the easiest possible way’.

• For staff, it means efficiency and speed of operation for routine administrative tasks, plus a positive, intuitive user experience.

• It provides consistency.

• It highlights areas that need further work which would otherwise not be picked up by conventional testing.

• It picks up any real ‘clangers’ before a live launch.

Note the need to address Novices (public web access) separately from Experts (staff who use the system regularly for administrative duties).
A selection of methods available:
• Cognitive Modelling (pre-design stage)
• Eye Tracking
• Cognitive Walkthrough
• Heuristics (a single person can test)
• Think Aloud
• Scenario-based tests
• Interviews
• Grounded Theory
• Observation

Heuristics (Nielsen):
• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, and recover from errors
• Help and documentation
E.g. Nielsen’s ‘F-shaped pattern’ –
http://www.useit.com/alertbox/reading_pattern.html

“Users first read in a horizontal movement, usually across the upper part of the content area. This initial element forms the F's top bar.

Next, users move down the page a bit and then read across in a second horizontal movement that typically covers a shorter area than the previous movement. This additional element forms the F's lower bar.

Finally, users scan the content's left side in a vertical movement. Sometimes this is a fairly slow and systematic scan that appears as a solid stripe on an eyetracking heatmap. Other times users move faster, creating a spottier heatmap. This last element forms the F's stem.” (Source: website linked above)

See also Wired magazine for other uses of eyetracking, e.g.
http://www.wired.co.uk/magazine/archive/2011/06/start/this-page-has-been-eye-tracked
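
As background on how such heatmaps are produced: an eyetracker emits a stream of (x, y) fixation coordinates, and a heatmap is essentially a count of fixations per screen region. A toy Python sketch with invented coordinates (real data clustering along the top rows and the left column is what produces the F shape):

    from collections import Counter

    # Invented fixation points as (x, y) pixel coordinates - illustration
    # only, not real eyetracking data.
    fixations = [(120, 40), (300, 45), (520, 50), (130, 160), (310, 165),
                 (125, 280), (118, 400), (122, 520)]

    CELL = 200  # coarse grid cell size in pixels

    # A heatmap is a per-region tally of fixations; plotting tools then
    # colour each cell by its count.
    heat = Counter((x // CELL, y // CELL) for x, y in fixations)
    for (col, row), n in sorted(heat.items(), key=lambda kv: (kv[0][1], kv[0][0])):
        print(f"cell (col {col}, row {row}): {n} fixation(s)")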
Usability Testing on eVision
• Testing was performed independently of the Portico Services group (for objectivity).
• Scenario-based testing plus the ‘Think Aloud’ method (Lewis & Rieman).
• Minimalist approach – the resources were one laptop with a recording mic and one meeting room. The environment is artificial – it is not the workplace. Camtasia software recorded screen and audio.
• “It takes only five users to uncover 80 percent of high-level usability problems.” (Jakob Nielsen). A range of administrators was tested for Engagement Monitoring (see the sketch after this list).
• Permission to record was granted (permission forms used).
• Interview questions were combined with the testing process to make it more fluid.
• Analysis included audio transcription and axial coding (grounded theory).
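
Where Nielsen’s five-user figure comes from: he models the share of usability problems found by n test users as 1 - (1 - L)^n, where L is the share a single user uncovers (about 31% averaged across his published studies). A minimal Python sketch of that curve, using Nielsen’s published average rather than anything measured in this project:

    # Nielsen's model: share of problems found by n users, where a single
    # user uncovers a share L of them (~0.31 in Nielsen's studies).
    def problems_found(n_users: int, l_per_user: float = 0.31) -> float:
        return 1 - (1 - l_per_user) ** n_users

    for n in (1, 3, 5, 10):
        print(f"{n:2d} users -> {problems_found(n):.0%} of problems found")
    # Five users land at roughly 84% - hence the 'five users' rule of thumb.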
• A permissions form is signed to allow recording of data.
• The tests normally follow a ‘script’ used during the test process. The script should follow a realistic scenario of usage.
• The participant should state what they are doing during the test (think aloud).
• For the eVision tests with students we moved away from prescribed scripts, since the test setup was close enough to a real environment.
• Camtasia Studio software was used for the recordings – transcribe and observe.
Part of an Interview Transcript (audio record -> transcribe)

• Grounded theory can tell you a lot about the field of expertise, people's emotional state, process, and other criteria you wish to look for (a toy coding sketch follows below).
• Key codes: Emotional, Environment, Productivity.

Q: What about those sites that are good?
A: They are very logical and very user friendly. I don't find myself getting frustrated with them.

Q: How does eVision compare?
A: It has a function and most of the time I can use it as I need to. Sometimes I get frustrated with it. It logging you out for no good reason I find frustrating. I spoke to someone in Portico Services recently who said it was to do with the number of tabs. And it never occurred to me before, but yes, sometimes it won't actually tell you why there is a problem. If you're trying to get a report and can't get any data it doesn't tell you – just says confined – (Business Objects issue). I don't see why there is a timeout.
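
To make the axial coding step concrete, here is a toy Python sketch of how open codes assigned to transcript segments can be grouped and tallied under the three core categories named above. The segment texts and code assignments are illustrative only, not taken from the real eVision transcripts:

    from collections import Counter

    # Illustrative (segment, axial category) pairs from open coding.
    coded_segments = [
        ("gets frustrated by the timeout", "Emotional"),
        ("logged out for no good reason", "Environment"),
        ("can't get any data for the report", "Productivity"),
        ("won't tell you why there is a problem", "Environment"),
    ]

    # Axial coding: tally how often each core category appears.
    tally = Counter(category for _segment, category in coded_segments)
    for category, count in tally.most_common():
        print(f"{category}: {count} coded segment(s)")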
Report Findings
• The full report of findings will be made available on the PUG Meetings system.
• Measurements: average time taken to undertake the task.

For Engagement Monitoring:
• Slow system responses
• Lack of system feedback
• Insufficient Help
• Wording/Labels unclear on some screens (see picture)
• Assumptions about the UI sometimes incorrect
• Increase the font size.
• Produce a standard method of use for buttons – size, location, and wording. Increasing the size of buttons makes it easier for the user to move the mouse to them (Fitts's law).
• Labels need to be more consistent and clear (standardise).
• Clearly distinguish hyperlinks from buttons in context.
• Too much text content on screen – can this be minimised into a help link? Some graphical cues were suggested (pictures/icons at key points).
• Give the administrator more feedback at key points when submitting form data. Provide status messages when core actions have been submitted.
• All administrators indicated that a longer timeout would be useful, so the system stays logged in in the background when switching between tasks during the day.

• Other issues: exams and progression calculation – although unrelated to Engagement Monitoring, these seemed to be causing difficulties and had been raised with the Portico Services group. Another was reports on a particular degree course, where there was insufficient applicant data on the system to make use of the report functions.
Contact: s.mclean@ucl.ac.uk

• Slides beyond this point are informational and were not shown at the PUG meeting.
Heuristics (Nielsen)
• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition rather than recall
• Flexibility and efficiency of use
• Aesthetic and minimalist design
• Help users recognize, diagnose, and recover from errors
• Help and documentation

Nielsen:
• http://www.useit.com/papers/heuristic/heuristic_list.html

• http://designingwebinterfaces.com/6-tips-for-a-great-flex-ux-part-5
• Interviews provide an opportunity to gain advice from experts in the field. They can also be part of the process for scenario-based testing – e.g. with eVision administrators you might want to record their screen actions.
• Novices are often used for scenario-based testing to reveal new issues – they should be in the target web audience group, e.g. students.
• You can combine scenario-based tests with interview techniques and record the results. Screen capture software (e.g. Camtasia Studio) can be used to track the paths users take, as well as the audio, for later analysis.
• Interviews generate a lot of data that you can examine later if recorded. Beware of reactions to being recorded and of what people will reveal under interview conditions.
• Usability results from a range of methods cover different types of testing from different perspectives.
• Triangulation of results helps confirm findings.
• In practical terms, the usability methods employed might be restricted by time and budget constraints.
• These are largely qualitative measurements. To measure the success of changes made to websites on the basis of usability findings, you might, for example, count user accesses or look for a reduction in helpdesk calls.
• You need to be specific about what you want to address through the usability study.
• Cognitive Walkthrough
• Eye Tracking (e.g. Nielsen's ‘F’)
• Cognitive Modelling

Q & A or break before continuing?
Walking through the tasks
• After the task analysis has been made, the participants perform the walkthrough by asking themselves a set of questions for each subtask (see the sketch after this list). Typically four questions are asked:
1. Will the user try to achieve the effect that the subtask has? (Does the user understand that this subtask is needed to reach their goal?)
2. Will the user notice that the correct action is available? (E.g. is the button visible?)
3. Will the user understand that the wanted subtask can be achieved by the action? (E.g. the right button is visible but the user does not understand the text and will therefore not click on it.)
4. Does the user get feedback? (Will the user know that they have done the right thing after performing the action?)
By answering the questions for each subtask, usability problems will be noticed.
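
As an illustration only, the four questions can be recorded per subtask as a simple checklist, where a ‘no’ answer flags a potential usability problem. A minimal Python sketch; the subtask name is hypothetical, not one of the eVision test subtasks:

    # The four standard cognitive walkthrough questions.
    QUESTIONS = [
        "Will the user try to achieve the effect that the subtask has?",
        "Will the user notice that the correct action is available?",
        "Will the user understand that the action achieves the subtask?",
        "Does the user get feedback after performing the action?",
    ]

    def review(subtask: str, answers: list[bool]) -> list[str]:
        """Return the questions answered 'no' - each is a potential problem."""
        return [q for q, ok in zip(QUESTIONS, answers) if not ok]

    # Hypothetical subtask: questions 3 and 4 were answered 'no'.
    for q in review("Submit an engagement record", [True, True, False, False]):
        print("Potential problem:", q)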
• Cognitive Modelling examines areas such as task analysis – the hierarchical breakdown of subtasks needed to achieve goals. It allows us to predict some of the human processes that occur.

Measuring task completion times:
- CogTool (open source) is a downloadable tool for working out the times taken to achieve tasks such as searching a website. It simulates the effect of an expert user on your designs: http://cogtool.hcii.cs.cmu.edu/
- Changing the interface design in the prototype and re-modelling can be done to reduce these times.
- Cognitive modelling as applied within CogTool follows the KLM-GOMS task analysis model, which predicts the times taken to move the mouse, type on the keyboard, think, and select items on a screen (see the sketch below).
- CogTool is used at the design stage, for prototyping.
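
As a rough illustration of the KLM idea, the classic Keystroke-Level Model assigns standard operator times (approximately: keystroke K = 0.2 s, point with mouse P = 1.1 s, mouse button press B = 0.1 s, homing hands between keyboard and mouse H = 0.4 s, mental preparation M = 1.35 s, from Card, Moran & Newell) and sums them over the action sequence. A minimal Python sketch; the task sequence is made up for illustration, not taken from CogTool or the eVision tests:

    # Standard KLM operator times in seconds (Card, Moran & Newell averages).
    KLM = {"K": 0.2, "P": 1.1, "B": 0.1, "H": 0.4, "M": 1.35}

    def task_time(operators: str) -> float:
        """Sum operator times for a sequence like 'MPB' (think, point, click)."""
        return sum(KLM[op] for op in operators)

    # Hypothetical task: think, point at a search box, click, home to the
    # keyboard, type a 6-character query, think, point at 'Submit', click.
    sequence = "MPBH" + "K" * 6 + "MPB"
    print(f"Predicted expert completion time: {task_time(sequence):.2f} s")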

- It is also possible to predict or reduce human error through website re-design and error theory – for example, ‘just in time’ indicators that point to the final step in a sequence of web tasks defined in the interface.
• Usability tests to perform should be based on experts (administrators) and students/other casual users (novices).
• Suggest that budget and time constraints would mean a more qualitative approach is taken.
• Define what we wish to improve. There are many aspects to examine, such as design (use of colour, navigation, visibility of options) and ‘user experience’.
• Suggest working out some scenario-based tests for both administrators and end users, with use of screen-recording software and audio recording.
• Note/understand the constraints on how we can define websites and structure within the SITS interface.
• Heuristics are useful as an individual starting point, as this can be done before the test participants are available.
Books on Usability
Designing Web Usability – Jakob Nielsen
The Design of Everyday Things – Donald Norman
Interaction Design: Beyond Human-Computer Interaction – Helen Sharp et al.
The Elements of User Experience – Jesse James Garrett
Don't Make Me Think – Steve Krug