Hi everybody, welcome to Designing for Somebody. Thank you for getting up this early to come to this Saturday morning session, and thank you to the MidCamp planning committee for putting on this event. My name is Kylie. I'm not a typical MidCamp attendee, let alone a typical MidCamp presenter. Before I tell you why, let me remind you of our keynote's first principle for inclusion from yesterday: don't ever judge anybody speaking at your conference. I have never worked with Drupal before, full disclosure, and I never really knew much about Drupal before attending some sessions on it yesterday. So why the hell am I here? Um, I
am a graduate student at the University of Michigan School of Information, currently in an agile project management methodologies course taught by Michael Hess, who is on the Drupal security team. Towards the beginning of the semester he encouraged our class to submit proposals to MidCamp based on whatever sort of value we thought we could provide to conference attendees. So I figured, you know, I'll try, and a few weeks, maybe months, later, I'm here. And I'm here to talk to you about something that's a little different than what I think is par for the course for this conference, but something that I hope will nonetheless be valuable to you, and that is a high-level overview of the processes of contextual design and contextual inquiry. So, some
groundwork for the next however long I talk: this session will cover why you shouldn't design a system for just anybody or for everybody; it will cover why you shouldn't design, develop, or implement a system without knowing the system or your users; and it will cover how you can come to truly know your users through contextual design and inquiry. I'll be drawing on some of my own experiences with contextual inquiry, in the classroom and in my internships, just to provide some examples that I can point to. The session won't cover anything really technical or anything explicitly related to Drupal, so if that's not your speed I won't be offended if you walk out now; but just so you know what you're getting into, that's what we'll be talking about today. So:
designing for somebody, not just anybody, and not everybody. What does that mean? Designs that claim to meet the needs of anybody and designs that claim to meet the needs of everybody are somewhat similar. A design for anybody would be a design that any random person off the street would be able to use. This type of design would necessarily be as non-technical, non-complicated, non-industry-specific, and non-complex as possible, so as to guarantee that literally anybody would find it intuitive to use, if such a thing exists. This design would necessarily be very generic, appealing to the lowest common denominator of any potential user's background, skills, and experiences. Similarly, a design for everybody is a design that every single person in this room, at this conference, in Chicago, throughout the country, would be able to find a use for. This type of design would necessarily be very bloated, with any and all features that everybody could ever want related to a particular system or service. Accordingly, neither of these user group examples should ever be used for a design, development, or implementation process: using these as the user groups to design for or develop for can only lead to, like I said, very generic or very bloated outcomes, which no user will really find meets their specific needs. So, if we're not trying to design or develop or implement for just anybody or for everybody, we want to develop for somebody. We want to create useful stuff, obviously, so we want to design or architect or build a specific solution that meets a specific need of a specific group of users; we want to design things for that somebody and what that somebody does. So how do we go about doing that? This is where contextual design comes in. Contextual design is one way to design for somebody, and it's the way that I'll talk about this morning. It's a user-centered design process developed by Karen Holtzblatt and Hugh Beyer, and it incorporates ethnographic methods for collecting data that's relevant to a system or product or service being developed or revised or improved. Martin and Hanington have defined contextual design as a customer-centered design process in which every step is anchored in customer data, making it feel less like design magic. So this is a very data-driven design methodology, and the process of contextual design has, at least on paper, a lot of steps.
There are many methods of data collection, intermediary steps of data synthesis and analysis, and then some research deliverables that hopefully result in a usable, useful design product. This is just a brief overview of these steps as laid out by Holtzblatt and Beyer, but we will really only be focusing on the first one within the scope of this session, somewhat touching on the two that follow it, because I think these are the most valuable at the beginning stages of a design and implementation process. So, as I had mentioned, contextual inquiry will be the focus of this presentation as it pertains to the scope of contextual design. Before we get into this first step of contextual design, which is contextual inquiry, I want everybody to think about what they do every day when they come into work. Just in your heads; you don't have to talk or tell me or anything like that.
So, accordingly and with that, one of Holtzblatt and Beyer's principles underlying the contextual design process states that people are experts at what they do but unable to articulate their own work practices. Essentially, what this boils down to is this: we know what we do, but not exactly how we do it. When you thought about what you do every day, it was likely at a fairly mid-level range: you didn't consider the individual keystrokes you perform to do your job, nor is it likely that you considered how your work fits into the larger organization's functional workflow. That is, our work processes and how we perform them are really tacit, especially when we're asked to consider or recount them when we're not in our typical work environment or mindset, just like I asked you to do. So this begs the question of how designers, developers, or researchers are supposed to actually learn how their users do what they do, and that's through contextual inquiry.
According to Martin and Hanington again, contextual inquiry is an immersive method of observing and interviewing that reveals underlying and somewhat invisible work structures. According to me, contextual inquiry is really observing somebody doing their job in their actual workspace and asking them questions about what they're doing. This is fieldwork at its core, for my field at least. If you're able to watch somebody do their work, you'll be able to understand and observe, even if you have to ask follow-up questions, the tacit parts involved in their task performance, beyond what they'd be able to articulate to you in a decontextualized setting, as in a sterile interview or something similar. So this is something I've done in a couple of different instances: it's something I did in my coursework at the University of Michigan with a local public library in a consulting methods class last semester, and it's something I'm currently doing on a larger scale to support a CRM implementation project with U of M's Office of Enrollment Management. In those cases, when an organization has identified a problem with their workflow, or when an organization is introducing or has just introduced a new system, contextual inquiries can be helpful in exploring how work is actually being done in the organization, and recommendations and requirements can follow from the data resulting from that exploration. So these are the key parts
of a contextual inquiry process. First, we have context, obviously: we have to understand users and their needs in the actual context in which they do work. This is done by observing them doing actual work in their natural workspaces, not in a conference room, not at a conference, but right where they work, because we want to understand how they do it in that particular context. Another key part is partnership: we have to work with users in this method as partners throughout the inquiry process. A good piece of advice I've heard from my professors is that you, as the contextual inquirer, should situate yourself as an apprentice to the user, the expert in this situation, ready and waiting for the user, for the expert, to guide you through their actual work activities. Another key part is interpretation: similarly, related to the partnership aspect of contextual inquiry, we have to work with users to come to a mutual interpretation, a common understanding, of the work they're doing, because our insights are only valuable insofar as they accurately assess what the user is actually doing. And the last key part is focus: a contextual inquiry in most cases should not require a predefined set of questions that you go through and ask the user; it shouldn't be a standard interview. You should come to a contextual inquiry session with a clear project focus, that is, a clear idea of what you want to explore, the work processes you want to get more involved with, and maybe a protocol that lists the points that you want to be sure you cover in your observation session, but that's all. You shouldn't use a script, but a protocol, really, to guide you through the process. And so this protocol is really
a map, I guess you could consider it, for a successful contextual inquiry. You can write this protocol out as a guide for the first few times you conduct a contextual inquiry, but it really will become second nature after a while. The first key part of this protocol is the introduction, and this is important for setting the stage for what's to come in your contextual inquiry, because your users, your experts in this case, aren't really going to understand what it is that you're doing or why you're observing them do their work. So setting the stage for exactly what you're doing and the purpose it serves is important here, and similarly you should introduce yourself and the scope of your work, so that they know what they should be working on when you're with them. You should set expectations for the length of the interview/observation session, which is usually about an hour, I've found; after anything longer than that, people start to lose focus. And then you should be sure here to set up the user as the expert in that situation, situating yourself as just there to learn from them. Then, during the observation/interview, the contextual inquiry itself, you should, as I've said, observe the user's tasks and ask questions according to your research focus, and you can be nosy: you can say, hey, what are you doing here, what are you doing there? We're there to learn. Accordingly, you should take notes and audio-record the conversation, with their consent, and you should collect any artifacts, you know, documents they reference, things like that.
At a minimum, you should set out to learn the specific things they do to complete specific work processes, their pain points with how things are now, and their goals for their own work processes. And here it's super important to ask the users to talk about their actual work and provide specific examples of their processes, reconstructing situations when possible. If they begin to talk abstractly or summarize, that information is not really useful in a contextual inquiry, because it's decontextualized: it's based on their memory, and its accuracy is reliant on their memory, which is, as we discussed, not infallible. So we don't want to rely on what they think they do; we want to rely on what we're actually seeing them do. And this is hard to do; it's hard not to ask them to predict what they would do given X, what they would do with better tools, what they would do with better systems. So this is one of the hardest parts, I've found, of contextual inquiries: making it a concrete conversation about what they're actually doing, and not straying into summary or abstraction or predictions, I guess, on their end. And then, after this main bulk of your interview/observation, you should try to start the wrap-up, in which you, with the user, create a shared larger interpretation of what you've observed and what you've learned, so that you're on the same page about what just happened.
Next up, after the contextual inquiry: ideally you would complete a series of contextual inquiries, because nobody works in isolation, so you would conduct a set of interviews and observations with people performing similar jobs or people working on a team. And after you have a series of these done, you would conduct interpretation sessions: after these, gather with your team to go over your notes and observations, to essentially interpret your data, capturing any key findings. In this interpretation session, as you could call it, you should start by creating a user profile, which includes demographic information about the user's job and the roles they fill, and you should assign them a code that you'll use to refer to them throughout the inquiry process. Then the person who conducted the contextual inquiry should walk the team through their interview without summarizing, just by going through their data points, while a designated note-taker takes notes on the walkthrough, with other team members asking questions throughout. Any key issue, interpretation, characteristic breakdown, or insightful quote on the part of the user should be recorded as a standalone affinity note attributed to the code of the respective user profile. An affinity note is just that: a notable data point from a contextual inquiry, essentially. And a
general rule of thumb that I've found is that you end up with somewhere around fifty affinity notes per contextual inquiry conducted. So, with
your contextual inquiries done and interpreted, the next step usually is affinity diagramming. This affinity diagram will bring together issues and insights across all customers into a wall-sized hierarchical diagram, to reveal the scope of the problem or the scope of the user tasks that you were observing. To do this, after you have all your however-many hundreds of affinity notes from a handful of inquiries, it's suggested, like I said, to write each one on either a small sticky note or in an online tool like Mural. And then, with your massive collection of affinity notes, it's time to create an affinity diagram; let me show you, I guess, what that looks like real quick.
So this is an example of an affinity wall that I created for a different research process. Essentially, what I had done was conduct a series of contextual inquiries with users, and with my team we recorded each individual data point on the yellow sticky notes, and then we grouped them under blue sticky notes, which summed up any key themes or interpretations that we found, and then we just kept going higher, until the green sticky notes, where we found kind of the key problems or findings of our research process. And that was just based on, you know, a shared interpretation of what the data meant and what the data highlighted.
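To make that yellow/blue/green structure concrete, here is a minimal sketch in Python (my own illustration, not a tool from the talk; the notes and theme names are hypothetical). An affinity wall is essentially a tree whose leaves are the individual affinity notes:

```python
# A nested-dict model of an affinity wall:
# green key findings -> blue themes -> yellow affinity notes (the leaves).
wall = {
    "Feedback is collected inconsistently": {        # green: key finding
        "Storage practices vary by librarian": [     # blue: theme
            "Uploads feedback forms to the shared server",
            "Leaves paper forms on the desk for weeks",
        ],
        "Feedback rarely reaches the full staff": [  # blue: theme
            "No regular meeting to review feedback",
            "Some staff don't know the shared folder exists",
        ],
    },
}

def count_notes(node):
    """Count the yellow leaf notes under any level of the wall."""
    if isinstance(node, list):  # a list is a stack of yellow notes
        return len(node)
    return sum(count_notes(child) for child in node.values())

print(count_notes(wall))  # 4 notes in this toy wall
```

The same structure scales to the hundreds of notes mentioned in the talk; only the grouping labels change as groups are merged and split.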
So, like I said, to go back to these recommended steps for contextual design: we've focused mostly on the contextual inquiry, where we talk to users and observe what they do. We talked briefly about the interpretation sessions; those are pretty open-ended in terms of their structure, but essentially, like I said, it's just a walkthrough of what your data found. And then we talked briefly about the affinity diagrams, wherein you bring your data points together on this large, actual physical diagram, though based on my experiences I think it's probably going to be more common for that to be digital, using online tools, because it is a pain to get all those sticky notes together and go through that process. So that is the process that I have mostly been following in my own work. I haven't really been in a position where I can do the visioning, storyboarding, user environment design, and paper mock-ups, but depending on your role in an organization and what you're really looking to get out of your research process, those steps can follow from there as well.
So, with your affinity diagram, what's revealed, as I had said, is a critical set of findings related to how your users currently perform their day-to-day tasks, their pain points with the current processes, and their goals for their processes. In some cases, like I mentioned, depending on your role, this may be enough; in other cases it may not be sufficient, and you could go ahead and learn more about your users using the rest of the process laid out by Holtzblatt and Beyer that I had mentioned. These two readings are things that I would definitely suggest, and they're, not coincidentally, my works cited for this presentation. As I had mentioned, Holtzblatt and Beyer kind of pioneered this method of contextual design, with all the steps underneath it, and the Martin and Hanington text is a good primer, with brief introductions to various methods of design research.
Okay, a little short, but: any questions?

Audience: How does this research inform choices made in a wireframe or a design, or particular features? Like, how does this...
Okay, yeah. So, um, to go to my internship right now: I'm supporting a CRM implementation project with the Office of Enrollment Management at the University of Michigan. What I'm doing is sitting down with users in, like, financial aid and in admissions, and watching them do their jobs, and from what I'm learning I'm making these affinity notes and looking at their discrete work processes. From there, I'm making recommendations to the technical implementation team about what features we need in this CRM tool, so that we don't disrupt the users' workflows when we switch to the new tool in a year or so. Does that make sense? So that's just one specific example. In another instance, in a class I had taken, a public library came to the class and said, you know, we're having problems collecting feedback from patrons. So we looked at how they were currently collecting data from their patrons, and then from our affinity diagram we made recommendations about how they could better do that work process.
Audience: So, first thing: this sounds like it's focused on, kind of, like with the websites that I work on... [partially inaudible] ...is that accurate, or...?
Yeah, I think, similarly to, like, usability-test recruiting, where you would want end users to look at... where you'd want to see how end users use a tool, I think that this process would definitely be appropriate for looking at how end users use a system and then making changes based on your observations. You could probably use it; it just might take a little more recruiting, similar to how you would recruit for standard usability testing, if that makes sense.
Yeah, right, right. So, um, it's a small public library in Flint, Michigan, actually, and what we did was sit in on, like, family events, kids' events, or adult events that they conducted for the community, and we took notes. We just observed what the librarians or the program facilitators did, specifically paying attention to how they distributed feedback requests at the end of a program. From there we found, you know, patterns between how specific librarians got feedback and what they did with that feedback afterwards to share it with the larger staff, and we saw a lot of work-process breakdowns, where some librarians were uploading this information to a shared server space, and others were just throwing the feedback forms on their desks and forgetting about them for weeks on end. So, with our observations of what we saw and our kind of holistic interpretation of the data points, we were able to make recommendations for, like, a shared space where they have to upload their participant feedback after they host a program, which they can then use to drive data-driven decisions about what programs to bring back, what programs to focus on, things like that.
Audience: You know, for your current project, are you going to use analytics to see how the system's working now for the site, to compare against?

Well, what's interesting about that is that this CRM that's being used in the Office of Enrollment Management, they don't have one right now, so all data is just thrown wherever in each of these individual offices, and the CRM is hoping to bring all the data together into one spot. And so we want to just make sure that we're not breaking anything when we introduce this new system, so I'm kind of just looking at their current processes now and how we can sort of support those in the future. So we're just looking at, I guess, the current state of how they work, rather than how they work with a specific tool or system.

Audience: What's, like, a profound observation you've had so far? What was it?

Well, so, full disclosure: like, I
have always been just, like, amazed by, you know, the University of Michigan. It's this huge school that's, like, really prestigious; it was beyond my wildest dreams that I'd be studying there. So I figured their admissions process, I figured everything would be pretty clean on the back end, you know, not really stuck together with duct tape or anything like that. But many of the admissions processes for specific schools are just copying and pasting application data into Excel spreadsheets and sharing them, and I'm like, what? Like, the curtain has dropped, I guess, on anything that I thought. So personally it's just been weird to see how piecemeal this system is. But in terms of how that relates to the CRM, essentially it's just more fodder for the rationale for needing a system like that, because there's so much error that can be introduced in such a manual process. So that's kind of what my data is suggesting now: that whatever's going on now is just not foolproof enough.

Audience: In the
case of the librarian who just throws all the feedback forms over there on her desk when she's not supposed to, and they're not supposed to do that, how do you get them to show you that? I mean, I feel like if you were to interview them, they would want to do it correctly, like how they're supposed to, right, and not really show you that they're kind of being lazy, right?

Yeah, that's definitely an issue. People want to clean up their desks before you come, and they want to, you know... honestly, with the library example, a big pushback we had was them wanting to meet, like, in a conference room, where, you know, it wouldn't be crowded with all their books and all their papers everywhere. But it's really important for this method to make sure that it's in context, and, you know, the first few times I did this I probably wasn't successful in getting them to show me their actual work process, but I've learned to be pretty adamant that I want to see exactly what they do. And so, with the library example in particular, sometimes, you know, we would tell them we were coming, but we weren't telling them exactly, like, oh, we're going to be just observing this session, but we also want to come see how you do this data process. So it was a thing where, not that we didn't set expectations, but we didn't tell them exactly what we would be observing, I'd say just so that we could make sure they were being as authentic as we could observe in their process.

Audience: We find it hard to find the audience for a specific project; having outside people say we need to include these people and these people and these people is part of the process.

Yeah, and just,
particularly with the library example, the director of the library was our main contact, and she just wanted us to talk to other, you know, not executives, but, like, managers at the library. It was really hard, actually, to get in to talk to the actual librarians, because they were so busy with everything else. And what's a good kind of benefit of this, if you can spin it the right way, is that it shouldn't take up much of the participant's time, because they are still technically working; you might just have to take time out for questions during your contextual inquiry process. But it is a bit challenging to squeeze into people's schedules, so anything you can build into the context to compensate for that is definitely a good thing, in that case.
Audience: Did you do any reporting of your findings on the employee processes, where the managers see that and say, well, no, that's not how things are done here at all?

Right, that definitely is a big concern with this process, I think, especially with the librarians, you know, because they didn't want their bosses to know that they weren't doing things according to protocol, which is why those user profiles and user codes are really important: we sort of anonymize the data with that. But there have been cases, in my CRM project in particular, where two people who are supposed to be, like, the admissions directors for a particular school, for example, are kind of in disagreement about what the actual work process is. But it's the goal of the contextual inquiry session to come to kind of a shared interpretation of what is actually going on, and even if there is some conflict, that just highlights the need for the CRM tool to implement, you know, a more streamlined work process for them to follow.
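Those user profiles and codes are easy to picture in code. Here is a minimal sketch (my own illustration; the roles and notes are hypothetical, not from the actual project) of how affinity notes can be attributed to an anonymized code rather than a name:

```python
from dataclasses import dataclass

@dataclass
class UserProfile:
    """Demographic and role info captured during the interpretation session."""
    code: str        # anonymized code, e.g. "U1", used everywhere downstream
    job_title: str
    roles: list

@dataclass
class AffinityNote:
    """One standalone data point: an issue, breakdown, or insightful quote."""
    text: str
    user_code: str   # notes reference the code, never the person's name

# The code -> profile mapping stays with the research team only.
profiles = {"U1": UserProfile("U1", "Admissions coordinator", ["application intake"])}
notes = [
    AffinityNote("Copies application data into Excel by hand", "U1"),
    AffinityNote("Feedback forms pile up on the desk for weeks", "U1"),
]

# Group notes by code for the walkthrough; reports show codes, not names.
by_user = {}
for note in notes:
    by_user.setdefault(note.user_code, []).append(note.text)
```

Because every downstream artifact carries only the code, findings can be reported to managers without exposing which individual described a workaround.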
Audience: You talked a little bit about evaluating this within the context of a larger process, but I was kind of curious: if you repeat this and iterate over it, once you get the results, do you refine it over time and see the business change? If some of the recommendations can't all be done at once, do they get integrated over multiple iterations? Kind of, what is the range of applications?
Let me think about that. Yeah, I can kind of guess at an answer, but in terms of my experience with it, I've only been included on kind of the front end of the process, you know; I haven't been there for any revision or any kind of iteration in the process. But I imagine that, for example, with the CRM system, after it's implemented in a year or so, I can definitely see an application where we'd kind of do this process again to see if it's any better. But I would kind of venture to guess that it would be more about collecting user pain points to specifically make fixes in the system, rather than mapping out a large-scale plan for a revision; more about how the implementation team can meet users' specific needs in the near future, if that makes sense. So, kind of a different purpose with it, I guess, after each iteration.

Audience: Once this process has kind of defined the new system you're building, to educate the users, are you going to take this forward to, like, content people, after you have the system defined?

Right, yeah. So that's, as it
stands, beyond the scope of what I will be involved with as an intern, but from what I'm understanding, there are going to be a lot of training sessions for users and a lot of content, like help documentation, developed for users. But in terms of what I will be involved with, I can't really explicitly speak to anything, because that chapter is really yet to develop. Right, yeah, that'll definitely be in the works; I'm just not really privy to that end of the conversation. Yep.
Audience: When you're constructing the affinity wall, how do you decide how many levels of hierarchy you group things under? I think you had three levels.

Um, honestly, that was what our professor told us to do, in terms of that one in particular, but I think a general rule of thumb was to have, I want to say, between, like, three and seven discrete data points under each larger heading before building on to the next one, just so that it's a manageable range. Anything smaller can probably be grouped into something else, and anything larger can probably be broken up a little bit. So I would say that three-to-seven range is probably the sweet spot for that, but I don't know what that's based on; that's kind of just what I've been taught.
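That three-to-seven rule of thumb is easy to check mechanically. A small sketch (my own illustration; the headings are hypothetical, and the thresholds are just the rule of thumb described above, not a fixed standard) that flags groups to merge or split:

```python
# Flag affinity groups that fall outside the 3-7 notes-per-heading rule of thumb.
groups = {
    "Manual data entry": ["note 1", "note 2"],                      # too small
    "Feedback handling": ["note 1", "note 2", "note 3", "note 4"],  # fine
    "Tool frustrations": [f"note {i}" for i in range(1, 10)],       # 9: too big
}

def flag_groups(groups, lo=3, hi=7):
    """Return (merge, split): headings with fewer than lo or more than hi notes."""
    merge = [h for h, notes in groups.items() if len(notes) < lo]
    split = [h for h, notes in groups.items() if len(notes) > hi]
    return merge, split

merge, split = flag_groups(groups)
# merge -> ["Manual data entry"], split -> ["Tool frustrations"]
```

On a physical wall the team just eyeballs this; the point is only that undersized groups get merged upward and oversized groups get broken apart.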
Audience: Can you explain this affinity wall thing in a little bit more detail? What is this?

Yeah, so I'll show you a video of it, actually; I think we have a video somewhere. Yeah, this was about a five-hour process. This is just my work team from last semester. We had all the sticky notes on the table, and here we're just kind of constructing the rough groupings at this point, kind of the common themes we found. Like I said, during our interpretation sessions, any discrete data point, any particularly strong quote, anything that could stand alone as a data point, any particular breakdown in the workflow, we recorded as an affinity note on a little sticky note, and we had probably close to 500 at that point. This is, I realize, a pretty big time commitment for a typical development process, so this is maybe not realistic in a lot of cases, but we had those 500 sticky notes and we kind of just started grouping, and we started merging groups and taking groups apart, until my team and I came to a common understanding of what the main groupings should be, and then we just kept building upwards from those groupings. And that's kind of it; I don't know, did that answer your question for you?
Okay, okay, yeah.

Audience: Is it possible to employ this when you're having these conversations with your clients? Because I find, as often as not, clients will hire your team to fix a problem, but in order to fix the problem you've got to talk to them about, you know, pain points and weaknesses, and these can be difficult conversations. So...
I think one of the things that I have found is crucial to that is situating the user as the expert and situating yourself as just there to learn from them. By putting your confidence in them and situating them as the expert in the situation, with you simply there to learn from them, I think, for me at least, it has really helped, and it breaks down any barrier that might arise. But I think it also depends on how you talk to them, how you come into the session: if you come in really straight-laced and, you know, almost intimidating in a way... we want to be sure that it doesn't come across like we're auditing them; we want to avoid kind of the evaluative tone of voice, the evaluative questions, the evaluative expressions. We just want to learn, in that case. So I think situating it as a learning session, an observation session, will sort of work to break down the barriers, but then also, you know, standard good interviewing techniques come into play, with body language, what certain words you choose, and things like that. So, definitely, interpersonal skills are really important in this particular research method.
You're recording stuff? Oh yeah, because... oh, whoops, thank you. That's what that means; okay, thank you.
I'm going to add to that, because I know some things that I do that help me out with this. The more you can get them talking about their processes, the better; you ask one very simple question and then they're just going on and on and on. Right, yeah.
I'm just wondering, what if they're against that? You know, like, management doesn't want you talking to the folks doing the work; they don't want to take time away from them, for time reasons, but I think it's more...
Right, so with my current scope I'm just kind of along for the ride on the implementation team in particular, and I haven't had to deal with that necessarily. But I think persuading them really comes down to making the case that this is useful for your end product, this is useful for productivity, this is useful for streamlining work processes, things like that. And what I've found is that being really flexible on availability has been important. With this project I'm currently working on, I started scheduling interviews in January and I'm scheduling out to April at this point. I'm trying to be really flexible with when people are available and taking any time I can get. As I mentioned, a typical standard session would be about an hour; if I can get 15 minutes, I'll take it. It requires patience on my end. But I think there's a lot of negotiation and buy-in that has to happen at a higher level beyond that, which I can't really intelligently speak to right now.
Do you think an hour is enough? Because I cannot explain our working environment in an hour.

Right, right. So yeah, I mean, like I said, I've done up to two hours, and I found that after the first hour focus is lost and people start trying to multitask. Ideally, it's good to be able to observe one discrete task that takes an hour or so, but if it comes down to where you need to observe a multi-hour-long process, I think that's appropriate. We just want to be sure that we're staying genuine to the project focus, I guess, if that makes sense.
[Music]
And the client was a law firm assisting people who were losing their homes, and they had mapped it out: as people came in, they had to show their income, show any assets they had, and tell the story of what happened. So it was set up as about six screens — a discrete screen for your income, a screen for your assets — and it looked like a really great process. So we all went out to see one of these days. We walk in, and it's a room of 75 people coming in to talk about how they're losing their houses, and we realized that what would happen is people would say, "okay, here's a bunch of stuff," and they'd find one document if they could find it, and then five screens later it's all the way back to receipts. You couldn't really anticipate that they were going to be going back and forth between screens. And it was a great aha moment: oh, we have to consolidate this down to a couple of screens so these people can jump around quickly. And that's the kind of thing we wouldn't have gotten unless we actually saw it happen.
Right, so like I said, this scale might be unrealistic — excuse me — for a lot of organizations, because it takes so long if you don't have a specific research team in place. But even just observing contextual use, even just for a few minutes or a few hours, is super useful in terms of how usable you can make the system later on.
[Music]
Do you find people get off task as they're showing you, and say, "well, I think it should do this"?
Sorry? Oh, definitely, definitely. I try to steer clear of that as much as I can, but if somebody goes off talking about a specific point they have, I just let them go; there's a point where I rein in the things people think it should do, yeah, definitely. I found, specifically with this CRM implementation process I'm working on, in one of the interviews I did, the woman I was speaking with, before I even introduced myself or anything, just went, "our process sucks, I hate doing all this," blah blah blah. I had to reorganize a little bit, so it threw off my internal protocol for the session, but it was really useful information, because I could see what she was doing and what she was struggling with, and I just kind of took it. Any data that you get is useful; it might just require some finagling on your end to bring it back around to observing the actual work process.
Awesome. Okay, thank you, guys.