
            In the research article “Evaluating
the Usability of ERP Systems: What Can Critical Incidents Tell Us?”, authors
Mari-Klara and Wendy describe the research they conducted on the usability of
ERP (Enterprise Resource Planning) systems. More specifically, they ask what
incidents that occur while using an ERP system can tell us about its usability.

Mari-Klara and Wendy took the approach of, instead of looking at the scope and
functionalities of the ERP system itself, searching for answers about how that
scope and those functionalities affect the end user. They conducted a study to
determine how difficult it is for an end user to find information and perform
needed transactions. The authors stated that despite other existing research on
this topic, prior studies often failed to produce results that would provide
valuable information for practically improving the usability of ERP systems.
For this reason, they decided to conduct a study that would offer additional
insights by examining actual problems that users and experts encountered.


            This study was conducted with users of
the SAP ERP system. They self-reported critical incidents while using the
system, which involved “performing real tasks under normal working conditions”
(1). Since prior studies on empirical usability evaluation determined that
only three to five users are enough to uncover about 80% of a system’s
usability problems, Mari-Klara and Wendy decided to conduct this study with a
total of three users. The users were males between 20 and 60 years of age,
enrolled in a business university in the north-eastern United States. The
investigators were able to record 53 incidents and categorize them into 10
usability problems. Tasks were determined in advance by the investigators, and
users reported incidents in two ways: as they occurred, and retrospectively
with the investigators after finishing all tasks, while watching the recording
of their session. These session recordings were then reviewed again by the
investigators to identify any incidents that had gone unreported.

            This study was done with each user
individually; two had previous experience using an ERP system, while one had
none. Each user had a 120-minute recorded session in which to perform the
pre-determined tasks and report usage incidents. Prior to this session, all
users received 30 minutes of training on SAP ERP and were able to take notes.

All users also received about 10 minutes of training on how to identify
critical incidents related to usability. “A usability issue was defined as
anything that is overly confusing, difficult to understand, requires too much
effort, or causes difficulties in task fulfillment” (1). The pre-determined
tasks were: an Authorization task (create a role, create a user, assign the
role to the user, test the role), a Purchasing task (create a purchase order),
and a Sales Reporting task (run, sort, and export a sales report). The first
task (Authorization) was randomly assigned to two users, while the second and
third tasks (Purchasing and Sales Reporting) were done by the remaining user.

            After all user sessions were
finished, the users had detected 33 incidents, while independent experts found
an additional 21 after reviewing the recordings. Out of these 54 critical
incidents, 53 were treated as valid, since one of the user-recorded incidents
was found not to be valid for this study. As already mentioned, these 53
critical incidents were then ranked by severity and categorized into 10
usability problems. Ranking was done on pre-determined scales for problem
impact and persistence (from 1 for a minor annoyance to 5 for a show stopper)
and problem frequency (1 for some, 2 for most, and 3 for everyone will
encounter it), and the problems were later sorted on a severity scale of 1-7.
The top-ranked usability problem (UP) was that users had difficulties finding
the next step; it had a severity of 6-7. The second UP was that the system did
not provide clear feedback, or provided unhelpful information; its severity
was 5-6.

Further problems included: procedures for data entry were tedious, basic data
entry rules were not always obvious, users lacked orientation within the
system and information about their current location, search was inconsistent
and unclear, the visual design and placement of buttons were unclear to users,
users could not customize settings, and basic navigation and selection within
lists was not obvious to users.
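The article gives the two input scales (impact/persistence 1-5, frequency 1-3) and the final 1-7 severity scale, but not the exact combining rule. One plausible reconstruction, an assumption on my part rather than the authors' stated formula, is severity = impact + frequency - 1, since 1-5 plus 1-3, minus 1, spans exactly 1-7:

```python
def severity(impact: int, frequency: int) -> int:
    """Combine impact/persistence (1-5) and frequency (1-3) into a 1-7 score.

    The additive rule below is a hypothetical reconstruction; the article
    only states the ranges of the three scales, not how they are combined.
    """
    assert 1 <= impact <= 5, "impact/persistence must be 1-5"
    assert 1 <= frequency <= 3, "frequency must be 1-3"
    return impact + frequency - 1

# A show stopper that everyone encounters maps to the top of the scale,
# and a minor annoyance that only some encounter maps to the bottom:
print(severity(5, 3))  # 7
print(severity(1, 1))  # 1
```

Under this reconstruction, the top-ranked problem (severity 6-7) would correspond to high-impact incidents encountered by most or all users.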

            Our team (group 6) decided to use a
re-design of DePaul’s Campus Connect as our final project. We all get
frustrated with the usability of the system all the time, more specifically
with its Navigation, Presentation, and Learnability (2). We dissected Campus
Connect using all three categories and agreed that it would be a perfect
candidate for our re-design. We determined that Campus Connect is in fact an
ERP (Enterprise Resource Planning) system, and hence doing secondary research
on the usability of ERP would be a great starting point for us. I found this
study and decided to use it since it provides us with valuable information
about the categorization of usability problems, data about what methodology to
use and how valid it is, and also some results that might correspond with our
own findings.

            It’s interesting that in order to
uncover about 80% of a system’s usability problems one needs only 3-5 users.
This information can be of the essence if we undertake such a study, since
finding 3 Campus Connect users sounds feasible enough. Also, having the
problems ranked by their severity will help us tackle the main problem of the
re-design: where to focus. If we focus on the problems that frustrate Campus
Connect users the most and provide them with a better, more user-friendly
solution that improves their experience, our goal will be achieved. This study
can help us ask questions like: How easily can information be accessed? Is a
specific functionality easy to find and access? Is the visual layout well
designed? What is the quality of the information provided by the system? What
is the layout of the menus? How fast can the user learn to use the system? How
well can the user find a specific function just by exploring the system,
rather than by receiving training?
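The "three to five users uncover about 80% of problems" rule the study leans on comes from the classic problem-discovery model, where the share of problems found by n users is 1 - (1 - L)^n for a per-user discovery probability L. A quick sketch, using L = 0.31, the commonly cited average from Nielsen and Landauer's work, which is an assumption here rather than a figure taken from this article:

```python
def proportion_found(n: int, lam: float = 0.31) -> float:
    """Expected share of usability problems found by n test users.

    lam is the per-user probability of discovering any given problem;
    0.31 is the often-quoted average, assumed here for illustration.
    """
    return 1 - (1 - lam) ** n

# Coverage for small test groups like the one in the study:
for n in (3, 4, 5):
    print(n, round(proportion_found(n), 2))
```

With these assumed numbers, three users reach roughly two thirds coverage and five users pass 80%, which is consistent with the 3-5 user range the authors cite.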

            If our team uses this study as a
starting point for our project and builds on it, it will provide us with
valuable information. For example, the top-severity usability issue in this
study was difficulty finding the next step. If we first ask ourselves this
question, and then ask other users the same question, we might gain
information that we didn’t have before. Our main concern about Campus Connect
was Navigation, but asking questions like how fast a user can find a function
just by exploring might surface another issue we weren’t aware of. This study
gives us solid data and a proper methodology, and it is up to us to use them
properly and make our project more successful.