>> IAN GEIMAN: Thanks, Kathryn, and thank you to Kathryn Thomas for doing the captions for this session. My name's Ian and I'm here with Amanda, Mariah, and Casey.
We're students at the University of Michigan studying research and design.
So I would like to welcome everyone to the Dark side. I would do my best Darth Vader impression, but it's frankly not a good time to sound sick. I hope everyone is staying healthy.
To those of you who are familiar with the field of usability, and user experience, welcome back. If you haven't heard of user experience, or UX, we'll have a primer for you before we get into the dark side of things.
So I do have an unfortunate announcement to make. We had originally planned this as a more interactive session, but things will be different since we are holding it virtually.
So to make up for the lack of in-person audience participation, we are using sli.do, a tool that allows you to ask questions of us and lets us interact with you. It doesn't require an account, so enter the meeting code, #MidCamp.
I will give you a moment to get that up.
Or to stare at the delicious cookies. Before we start, I will be talking about user experience! Take a moment to think about something you enjoy using, whether it's a software experience or a physical product experience. Please let us know on sli.do what you think of. I'll give you a moment with some music.
[Music]
All right. Thanks, everyone, for your answers.
I'm seeing a lot of software here. I'm seeing ‑‑ I saw briefly the Delta website, which I hope is working these days.
But ‑‑ so all of these are pretty good user experiences. Toaster. Good example. Really simple. Works like you expect it to.
So one of my favorite experiences is a feature that I wasn't expecting, and really impressed me the first time I used it.
It's Apple's WiFi password sharing, and it's such a small feature, yet it saves so much time when people don't know their password. If you haven't used it: if you're trying to get onto a network and you're near a contact of yours who is already on the network, they can share the password with you, so neither of you has to go find it. It's great for coffee shops or a home network environment.
These positive experiences can happen outside of software as well. Another favorite of mine is Fujifilm cameras. They have a familiar feel. Everything is exactly where I think it should be, with the controls and dials laid out so you can use them without looking at the camera. The experience of using that camera, and all of the experiences you thought of, were intentionally designed.
The reason you like the experience is that it is easy to use. It could be argued that the best user experience is one that you don't notice at all. Like an easy checkout process. Not that I need a new camera.
A good maps application.
But basically, anything that works like you expect it to.
This is all good UX. And whether you notice it or not, UX is there to help you get things done.
The Nielsen Norman Group defines UX as encompassing all aspects of the end user's interaction with the company, its services, and its products. Anything we experience can be designed, as long as the person doing the designing thinks about how the end user thinks. In our industry, we say human-centered design to indicate our goal of making experiences that revolve around how we as people think about doing things.
This can cover software, hardware, services, and even how people get information in buildings and public spaces.
In its simplest form, UX means: does this thing work like I expect it to work?
So how does good user experience happen?
Ideally, when a product is being designed for the first time, that product is based on an actual user need.
A UX researcher or designer would be integrated into the design process to better understand the needs the user has and to make recommendations for project direction based on those needs.
All of that is to understand the mental model of the user so the product can work like the user intends. But as soon as it doesn't, the user gets frustrated and if a competitor is doing the UX better, the user may switch based on that experience.
But back to designing the product. Once the mental model is understood, the product can be made to match that model.
However, this can get complicated when you have many types of users.
So some systems get built on commonalities between many types of stakeholders that have many differing needs. UX works to balance those needs and find the best way to handle them. These commonalities extend to how we use visual systems and interfaces, like a check mark to confirm something and an X to cancel something, and how certain icons are commonly understood, such as these here. These are accepted in many cultures.
What is Dark UX? Hopefully you want to learn more about it. The dark side is everywhere and it's been around longer than you think.
Basically, Dark UX is the experience when using a product that tricks, coerces or confuses users into putting their information or money somewhere they wouldn't otherwise intend to.
These experiences are created with the intent to deceive or take advantage of users. They are also known as dark patterns.
Some dark UX is based on bad business practices. When we get into specific examples, you may be reminded of situations that you've experienced outside of software.
Some of you may be thinking, well, I've used some really annoying products before. How is dark UX different from bad UX?
Well, annoying doesn't mean the company is trying to intentionally profit off of you or steal your information.
It could just be as simple as bad design, without enough thought or research put into it.
So let's take a look at some examples of a product we likely all use: A car.
Now, anyone who has jumped into an unfamiliar vehicle and failed to find something in its complicated menu system knows: this is bad UX.
For example, this is Subaru's system.
Or a feature gets added that's different from how people want to do something. Which can have a negative effect on the experience.
This is the Cadillac CTS interior, which was at one point criticized for taking away knobs and adding touch-only controls. So there are a lot of reasons automotive user experience has historically been so poor over the years.
Trying to satisfy many types of customers by building on those commonalities I was talking about before. Or just not having enough communication between the teams working on these complex projects.
So last example, this is the interior of a Mustang that had climate controls on top of other controls. None of these designs were malicious. They just negatively impacted the user experience. We're starting to see a positive change in the UX in vehicles being released today.
A more humorous example is this: frustrating phone number inputs. A variety of reasons could have caused all of these frustrating entry experiences to happen in the real world.
Thankfully, with something as simple as a phone input, we worked out a solution a long time ago, and we don't have to draw out our numbers, go through dropdowns, or play number-entry roulette.
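To make that concrete, here is a minimal sketch of that settled pattern in TypeScript against the browser DOM. It's our own illustration, not code from any site shown; the function names and placeholder number are assumptions:

```typescript
// One plain field that accepts digits and common separators, normalized
// afterward -- no dials, dropdowns, or number-entry roulette.
function createPhoneInput(): HTMLInputElement {
  const input = document.createElement("input");
  input.type = "tel";         // brings up the numeric keypad on mobile
  input.autocomplete = "tel"; // lets the browser autofill a saved number
  input.placeholder = "(555) 555-0100";
  return input;
}

// Normalize whatever the user typed instead of forcing a rigid format.
function normalizePhone(raw: string): string {
  return raw.replace(/[^\d+]/g, ""); // strip everything except digits and "+"
}

document.body.append(createPhoneInput());
```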
Or, one of my favorite intentionally bad user experiences on the internet: the most frustrating volume sliders. They're all annoying ‑‑ [Laughs.] ‑‑ in their individual ways. But bad UX happens all the time. It is mostly an accident, with no malicious intent behind it; it happens from not understanding the user.
But dark UX occurs when the company understands the user and acts on that understanding for its own profit. The commonalities I mentioned before, between users and cultures, and the interfaces we all use, can be leveraged against us.
So finally, here are some of the types of dark UX that have found their way into the products we interact with. I will turn it over to Amanda.
>> AMANDA: Thank you for that wonderful explanation of a little bit more about what UX is, as well as an explanation of dark UX. I'm going to start getting into some of the specific types of patterns that we see on the internet every day.
So the first one that I have is called trick questions, which sounds self‑explanatory, but it's definitely one of the more common dark UX patterns that we see every day. This happens when you are filling in a form, and you respond to a question that seems to be asking you one thing on a first glance but then when you take a second look, it's actually asking you something else entirely.
A very common one is filling in different forms about subscriptions. This screenshot from the Royal Mail and Post Office appears to say that when you click on the boxes next to post, telephone, and email, you will receive notifications. Most people, when just going through a form like this, would probably just scroll past it, saying, you know, I don't want to receive any more emails. I already get enough of those. But upon a second glance you see that there's actually opt-out language buried in this long paragraph. Most people will probably miss it, and it probably won't be earth-shattering. They might receive extra email or some mail they can recycle. But you can see how this can be more detrimental if, for instance, there's a financial contribution involved.
Sneak into basket. You try to purchase something on a website, but somewhere along the way the site sneaks in an extra item. Perhaps it's through the use of an opt-out button, or the site makes it easy for the person to accidentally upgrade.
Take booking tickets on Delta.com. When you purchase a basic economy seat, you come to the page that you can see on the screen that says review your choice. It tells you that you can upgrade for an additional $50, with a huge red button for the upgrade and only a small bit of text that allows you to continue with your original choice. Perhaps a better way would be to swap those two things: make the red button help the customer go on with their original choice, or give them two similarly sized buttons.
It puts autonomy back into the hands of the users and doesn't force them to buy something they weren't intending.
The next one is roach motel. You get into a situation easily, but then when you actually go to unsubscribe or get out of the situation, it's difficult for you. The New York Times is infamous for doing this. You can see in the screenshots on the right, they say that they offer several ways to cancel your subscription.
But you can see from the frustrated text messages on the right that the person trying to cancel their subscription is trying to get in touch with a customer care advocate, and it's taking them nearly eight hours to get someone to respond to them.
So this is putting a lot of undue burden on the user, making it hard for them to do something simple such as unsubscribe from a newsletter.
Perhaps The New York Times could have implemented a one-click unsubscribe or cancel option, making users' lives a lot easier.
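As a rough illustration of what that one-click option could look like, here is a minimal TypeScript sketch of a cancel endpoint. This is our own hypothetical, not The New York Times' actual system; the in-memory token store stands in for a real database:

```typescript
import { createServer } from "node:http";

// Hypothetical store mapping signed tokens (one per emailed link) to accounts.
const subscriptions = new Map<string, { active: boolean }>();

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  if (url.pathname === "/cancel") {
    const sub = subscriptions.get(url.searchParams.get("token") ?? "");
    if (sub) sub.active = false; // one click, done -- no phone queue, no chat wait
    res.end("Your subscription is canceled. No further steps needed.");
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);
```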
The next one is called privacy Zuckering. This is one of the more dangerous patterns we see on the internet every day, and Facebook is a really good example. This is when you're tricked into sharing more information about yourself than you really wanted to.
And I'm sure we have all heard the different things in the news about Facebook and how they're handling issues like data privacy and security. This dark pattern was named for the CEO, Mark Zuckerberg. Anyway, this type of dark pattern can extend to other things, such as transparency about data use and user privacy as well.
So another point of contention with privacy Zuckering is the use of data for targeted advertising.
Data is generally collected about us to send out targeted advertising through something called cookies, which your web browser uses to store data. I have two different examples of websites using cookies in different ways. We have Loft on the left. They tell you why they need the data and how they will use it, and you can go through a link that says "manage preferences" to opt out, or at least learn more about how the data will be used and whether it's going to be sold.
On the right-hand side is the Essie website. They say cookies help us improve your website experience. If you want to use the site, you have to consent to cookies. There's no way to opt out or learn how your data is being used.
So anyway, there is definitely a difference in the way that Loft and Essie are handling the cookie situation, and Loft is putting more autonomy in the hands of the users in regards to their data privacy.
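To show how small the difference is in code, here is a minimal sketch of a banner in the spirit of the Loft approach, where declining is exactly as easy as accepting and nothing optional is set until the user chooses. The names and cookie here are our own illustration, not Loft's actual implementation:

```typescript
function showCookieBanner(onChoice: (analyticsAllowed: boolean) => void): void {
  const banner = document.createElement("div");
  banner.textContent =
    "We use cookies to improve the site. Analytics cookies are optional.";

  const accept = document.createElement("button");
  accept.textContent = "Accept analytics cookies";

  const decline = document.createElement("button");
  decline.textContent = "Decline"; // equally prominent, no shaming copy

  accept.onclick = () => { onChoice(true); banner.remove(); };
  decline.onclick = () => { onChoice(false); banner.remove(); };

  banner.append(accept, decline);
  document.body.append(banner);
}

// Only set the optional cookie after an explicit yes.
showCookieBanner((allowed) => {
  if (allowed) document.cookie = "analytics_consent=1; max-age=31536000";
});
```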
The next one is called price comparison prevention, and here is yet another airline example.
The retailer makes it difficult for you to compare the prices of items, limiting your ability to make an informed decision. I have an example from British Airways, where there's an option for basic versus plus. In basic, you can choose your seat for a fee, or check a bag for a fee, but it doesn't tell you how much that fee will be.
Perhaps it would be easier for the user if British Airways put all the information they needed for an informed decision in the side-by-side chart. Even though you are able to find out what the fees are later on, it would be easier for the user to make a decision and understand which cabin is the best choice for them.
We also see this in slightly more innocuous situations.
Take Sainsbury's and two types of apples, where one is priced per kilogram and the other is priced per unit. Even though the user can figure out which one will be more expensive based on the information that's given here, it would be a lot easier for them if they could go through their grocery shopping without needing to do mental math to figure out which one costs more. It's just making it a little bit easier on the user, removing some of the burden and not making them jump through hoops to do their grocery shopping.
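That mental math is easy to spell out. Here is a minimal sketch that normalizes both listings to a price per apple so they're directly comparable; the average apple weight is an assumed figure for illustration, not Sainsbury's data:

```typescript
const AVG_APPLE_KG = 0.2; // assumption: a typical apple weighs about 200 g

// Convert either pricing scheme to a comparable per-apple price.
function pricePerApple(listing: { perKg?: number; perUnit?: number }): number {
  if (listing.perUnit !== undefined) return listing.perUnit;
  if (listing.perKg !== undefined) return listing.perKg * AVG_APPLE_KG;
  throw new Error("listing has no price");
}

console.log(pricePerApple({ perKg: 2.0 }));    // 0.4 per apple
console.log(pricePerApple({ perUnit: 0.35 })); // 0.35 per apple
```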
The next one is called misdirection.
So this is a design that purposely focuses your attention on one thing to distract you from another. During the Skype setup, maybe you're rushing to a meeting, trying to set it up quickly, and not paying attention. You might not notice that during the process, it makes Bing and MSN your defaults. Most users will not notice, and it may not have been what they intended. Putting this choice in a more salient place, where you'll see it even when you're rushing through the setup, would make things easier for users and keep them from making decisions they wouldn't have otherwise.
The last one is friend spam. This is when a product asks for your email or social media permissions under the guise of helping you find friends, and this is LinkedIn. They had a class action lawsuit and ended up paying out over $13 million a few years ago over this pattern.
So in this instance, LinkedIn has you enter your email address to set up your profile and import your address book to help you find friends. It doesn't tell you that it will send out emails to perhaps all the other people you've interacted with at this email address. People were having issues with potentially hundreds of people that they had interacted with over the years being sent emails, leading to personal and professional embarrassment. LinkedIn doesn't do this anymore because of the lawsuit I mentioned earlier, but it's still something we see on smaller websites, especially things like dating websites.
I will turn it over to Mariah to talk about more dark patterns.
>> MARIAH JACOBS: Thank you, Amanda. Unfortunately but maybe fortunately for this presentation, there's still more patterns. I will pick up where Amanda left off.
Hidden costs is another UX dark pattern.
This one is pretty self‑explanatory, and it's probably something that we've all experienced.
You're about to check out, and finally, in the last steps of the process, you're hit with extra fees which don't have any explanation. Here's one example. Back in December, when we actually paid money to be stuck inside a room, I was booking an escape room in Portland, Oregon, for a group of friends. To start, I selected the number of tickets that I wanted. As the top image on this slide shows, I have one ticket, and there's a description under the word ticket saying that it's $35 each.
When I went to finalize the booking, I was hit with this screen similar to the one on the bottom.
The second image here shows an itemized list of my purchase. The ticket is listed for $35, and then there are taxes and fees for an additional $3.12 per ticket.
The tax appeared and was added to the total. It's important to note in this situation that Oregon has no sales tax, and there wasn't an explanation for this.
And that is an example of the hidden cost dark pattern.
The next dark pattern that we have is one that I named, though another name for it probably exists out there somewhere ‑‑ and I call it the final countdown, because a countdown is what websites use to pressure you into thinking that time is running out.
This creates a sense of perceived scarcity among buyers and it's a common tactic, especially with travel sites, like Booking.com that we see here. In the top, there's a booking that no longer exists. It says you missed it.
Even though this hotel location ‑‑ that's what that is ‑‑ is no longer available, the website chose to display it anyway because they want you to know what you missed out on.
In the bottom, there's a warning in red text that lets you know the number of bookings is limited, so you better act fast.
Other online retailers outside the travel industry also use this.
Etsy also uses this strategy, which is not somewhere I necessarily expect to find that dark UX.
Bait and switch is a UX dark pattern that actually originated in marketing. It's when you set out to do one thing, but a different, undesirable thing happens instead.
An example of this is if you go to purchase something, but when you try, you're offered a different thing at a higher cost instead.
The example here is a screen from TurboTax, because my friend recently mentioned the experience she had when she was trying to file her taxes. As she was getting towards the end of the process, she was hit with a screen that said: you're less than a minute away, but okay, let's get your free credit score and report. This won't hurt your score. And it asks for personal information such as street address, apartment, suite, and other things like that. This is an example of both privacy Zuckering and bait and switch, because it's asking for information that it doesn't necessarily need and that is not related to the process of completing your taxes. My friend, when she was filing her taxes, just wanted to get the process done. She didn't want to get her credit score and report as part of this process.
So confirm‑shaming is our next example here.
And it's a fun example of how dark UX directly tries to emotionally manipulate you by guilting users into opting into something. They do this by including a decline option that's worded in such a way that you're likely to just agree and confirm what they're asking, rather than reject the offer.
One example is Wikipedia around the holiday times. They post large banners that ask you to donate.
In this top image, which asks the user for a donation, they don't include a decline button at all. The user is offered two payment options, credit card and PayPal, even though a donation isn't mandatory.
The second image is an example of one of those forced email subscriptions that tries to sign you up for a mailing list. In this case, the mailing list is the free beginner's garden guide. There's a text box for an email address and a big red button that says, get your free guide. At the bottom there's a small button that says, "No thanks, I know everything about gardening." It's worded flippantly, making you more likely to do the polite thing and subscribe.
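For comparison, here is a minimal sketch of the same prompt with a neutral decline option, which is all it takes to avoid the shaming. This is our own illustrative example, not any site's actual code:

```typescript
function newsletterPrompt(): void {
  const dialog = document.createElement("dialog");

  const email = document.createElement("input");
  email.type = "email";
  email.placeholder = "you@example.com";

  const subscribe = document.createElement("button");
  subscribe.textContent = "Get the free guide";

  const decline = document.createElement("button");
  decline.textContent = "No thanks"; // neutral, not "I know everything already"
  decline.onclick = () => dialog.close();

  dialog.append(email, subscribe, decline);
  document.body.append(dialog);
  dialog.showModal();
}
```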
If you've played a game or downloaded software, you've run into disguised ads. They're designed to look like something that would make sense on the page, like a download button. If you look closer, they're an advertisement, and clicking one will redirect you to who knows where. Here, I was trying to install pgAdmin. On the left side, there's the correct download button for the application. On the right-hand side, there's a big green "start now" button. If you hover over it, you can see it redirects to an advertisement, but an unsuspecting user could click on it. This is a disguised ad.
Another UX dark pattern that we probably all fall victim to is forced continuity. Maybe you don't know the name, because I didn't know it either before I came to the dark side.
This strategy is used by businesses who have lured you in with a free or discounted trial.
But with the catch that you'll pay full price for the service after a certain time period elapses.
This means the dreaded entering of your credit card number upon signing up. And of course, they are unlikely to remind you when your free or reduced trial period ends.
Audible, Amazon Prime, and Spotify are subscription services that are likely to catch you. I went through this battle with Spotify in December, when they told me that my student account was expiring because my four years were up. So Spotify hates grad students. Even if you sign up under a new email account, you still only get it for four years. They love to trap you with forced continuity: once you're on a trial period of a service, there's a good chance they can get you to pay the full price for the service after all.
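One defense is tracking the deadline yourself, since the service is unlikely to remind you. Here is a minimal sketch of that calendar-reminder habit in code; the 30-day trial length is an assumed example:

```typescript
// Compute the last safe day to cancel before full-price billing starts.
function cancelByDate(signupDate: Date, trialDays = 30): Date {
  const deadline = new Date(signupDate);
  deadline.setDate(deadline.getDate() + trialDays - 1); // day before billing
  return deadline;
}

console.log(cancelByDate(new Date(2020, 11, 1)).toDateString()); // "Wed Dec 30 2020"
```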
So we have one final dark pattern here for you today before we transition to something a little bit more interactive.
This dark pattern is called irrelevant notification overload. And sites like LinkedIn and Facebook are notorious for this.
The product attempts to lure you back with the red notification icon by updating you about information that doesn't matter to you.
This is a snippet of my notification feed on LinkedIn. I receive notifications for things like Google being mentioned in the news or a topic that is trending now, titled "helping other job seekers can pay big dividends."
These notifications didn't have anything to do with me at all, or with my network connections, but they know that if they get you to respond to a notification, they're more likely to keep you engaged.
And so with that, we are now going to give you a chance to use your new knowledge. On sli.do, again, that's s‑l‑i.d‑o, if you enter the code #MidCamp, you should shortly be able to participate in an activity that we have for you here today.
We are calling this game: what type of dark UX is this?
So what we're going to do is, for a few different examples of dark UX, provide you with a description and an image, and then ask you to vote on what you think the answer is through sli.do.
In this case, this is a product that leads you through a lengthy process of inputting your desired parameters for an apartment, and then it requires you to enter your email, name, and phone number, and agree to be contacted by third parties, before you can view matches.
So right about now, you should be able to see the poll voting on sli.do.
And go ahead and select what you think ‑‑ what type of dark UX you think this is.
[Music].
[Music ends].
All right. It looks like we got 32 responses, and the majority of people said privacy Zuckering. We think that one best describes this situation, because you are basically being forced to hand out your information, and then third parties will receive it, which is not necessarily what you wanted when you were just trying to look for an apartment; suddenly your information is being given out to all of these other people.
So good job. Bait and switch also seems like it could possibly be argued as the dark pattern here too.
All right. Now, moving on to the next one here.
This is a website with an unsubscribe feature that isn't the worst ever, but it does hide the confirmation that you've opted out in a paragraph of text. So Red Bubble says: you have successfully opted out; we will no longer send you delightful, life-affirming, money-saving emails. Then there's a big button to resubscribe, and users who are not reading carefully are likely to click it. The poll should be enabled, so vote for what type of dark UX you think this is.
[Music].
Great. Okay.
Looks like we have 33 people respond, and the majority said misdirection.
I can see also why this would be potentially a confirm‑shaming, based on the way that they worded their, like, description.
But misdirection is what we were thinking here, primarily because they sort of disguised the fact that you have already been unsubscribed and put up that big opt-back-in button. If users are going to unsubscribe, they expect to click something to unsubscribe; they don't expect that the page has already unsubscribed them, and they may end up opting back in by accident.
Our third question for you here.
If you turn off your notifications on Instagram, the app bothers you to turn them back on, again and again.
It does this with a popup window like the one on the right, which says: turn on notifications; know right away when people follow you or "like" and comment on your photos. And there's an option to turn on, or not now.
Even if disabled notifications are what the user wants, they are constantly reminded of their choice.
So you should right now be able to see a poll, and submit your vote.
[Music].
All right. Let's see. It looks like we have confirm-shaming as our leading answer here, and that is correct. The fact that the user has already said that they want to opt out or disable their notifications, but Instagram keeps bothering them over and over again, trying to get them to change their mind, is an example of what we consider confirm-shaming.
So we have one last example for you today.
This is a website that is attempting to pressure its shoppers into making a purchase based on the perceived scarcity of an item. You can see on the image there's an item for sale, and below it, there is text that says only three available, and it's in more than 20 people's carts.
So you should just now be able to see a poll that allows you to vote on our fourth and final dark UX question.
[Music] All right, it looks like 100% of people got the final countdown, and that's absolutely correct. This website was Etsy.
It's actually trying to instill a sense of urgency and perceived scarcity by pressuring shoppers to think that a bunch of people have the item in their carts, whether they do or not.
And this might lead to more buying than if that text was not there.
So great job with reviewing those.
And now I'm going to hand it off to Casey, and she will talk about how to spread awareness of dark UX.
>> CASEY TIN: Thank you. So we've seen some examples of dark UX in practice and you might be wondering why it's still seen frequently. Oftentimes, dark UX practices are implemented as a result of shady business practices that try to manipulate users.
In some cases, however, the intention behind the user experience is not meant to be harmful. Sometimes dark UX is implemented due to a lack of attention, or insufficient user testing to inform the designer. This idea has been explained by Hanlon's razor:
Never attribute to malice that which is adequately explained by stupidity.
To rephrase: the harm might not always be intended. Sometimes designs that follow dark UX patterns are just due to bad design. We should remember that humans are prone to error and can't get everything right the first time around.
And we see this all the time in the media, with things like fake news. For example, users were outraged when YouTube's policies restricted videos with LGBT content, but an algorithm was responsible, and it was not meant to be harmful.
What is being done?
The UX industry and research institutions are making efforts to explain, and bring awareness to, dark UX patterns. We've included a few examples of what others are doing to mitigate the usage of dark UX.
As we can see from some of these sources, dark patterns have been around for a long time.
The Verge published an article in 2013 to bring awareness to the topic, yet dark patterns are still widespread across businesses. In recent years, many efforts have been made to mitigate this. In April 2019, senators introduced a bill. A draft of the bill notes that it would be illegal for large online services with more than 100 million monthly active users to design, modify, or manipulate a user interface with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy.
So it's more prevalent than you might think. What can we do on our part? The best way to avoid being trapped by dark UX patterns is to spread awareness about them.
Spreading awareness will help others identify patterns more easily, which can lead to actions taken against these practices.
First and foremost, your users.
These are the people that you should be most concerned about, as they are directly impacted by dark UX practices. It is important to assess their user experience.
And if they express that something doesn't feel right, it might be an indicator of dark UX patterns.
As people who aim to deliver good user experiences, we should be cognizant of how our design choices influence the spread of dark patterns. You might be asked to work on a project that includes them.
Ask why. Pointing to the research surrounding the argument will help.
Dark patterns are not meant to be in favor of the user. We want to ensure our users are satisfied.
Happy users are long-term users, and unhappy users are short-term users. Happy users will result in long-term user engagement.
And next, what you can do for yourself.
Apart from your users, being aware of how you as a user can be affected by dark patterns is essential. Be aware of how you can identify the dark patterns we discussed earlier, and do research about what other patterns exist and examples of where they're being implemented.
And if you don't feel comfortable about a situation, try to get out of the situation.
As a user, be aware of how your information is handled. You should keep note where your data goes and what it is used for and how your money is being handled. For the sake of user privacy, it's important to voice these concerns and stay protected.
And lastly, staying informed and updated.
There are tons of resources on the internet that talk about dark patterns. Such as UX Twitter feeds, Reddit forums on UX design, UX design podcasts, and many more.
And finally, what you can do for friends and family.
For your friends and family: assure them that dark UX is real, that it appears more often than they might think, and that it's typically a result of shady business strategies. Making sure they're aware of it can protect them from harmful consequences regarding their data.
One way is to talk about the dark patterns that you've learned about and come across.
They might be able to relate or realize they're in a situation with similar practices.
And lastly, sharing resources with them to help them stay updated.
For example, sharing this talk.
Or sharing some of the other resources that we've mentioned. Reddit forums, darkpatterns.org, Twitter hashtags, et cetera.
And thanks for listening to our presentation. We hope you've learned more about the dark side.
>> IAN GEIMAN: Now we're open to take questions. Post them in the chat, which we're monitoring, as well as on sli.do.
We had one question for Mariah earlier. Did you ask them about the taxes and fees for the escape room? Or did you just walk away?
>> MARIAH JACOBS: That's a good question. So, yeah, when I was booking ‑‑ and I was booking for six people, I think ‑‑ the taxes and fees were more significant than $3.12.
Because the time frame that I was working with was limited, I did book it. I looked on the website to see if there was an explanation, and I couldn't find one. I didn't want to track down the information, so I just booked it. If I had more time, and maybe if I cared a lot more, then I could have potentially either not gone through with the booking, or tried to seek out that information.
>> IAN GEIMAN: I'm seeing a couple more in sli.do. First one, will you FedEx us cookies?
Unfortunately, no. [Laughs.]
But you're more than welcome to bring your own, as someone else suggested in the chat.
Second question: how do you balance the business needs against dark UX? Casey cut out for a second when she was talking about this, but there's a tradeoff between short-term and long-term users. If you implement a dark UX practice for the sake of making money, and users end up not liking it, you might lose that user for the duration of your time working on that product. If you do the other thing, you might not have an immediate payback from it, but you might have created a lifelong user who respects what your organization is doing. More broadly, I talk about this a lot: UX does need to have an idea of the business need in mind when you are designing.
But that doesn't always mean that you have to go to doing something like a dark pattern to achieve the goal.
Someone asked, do you have any statistics on the percentage of users that actually care about this? Off the top of my head, no. This is a very product-by-product thing.
I would say, you know, turn this back and ask, when have you been frustrated by something like this?
Is it something you want to keep doing? Are you annoyed if you try a free trial of something and it charges your credit card, and one or two months later you look and notice it?
So we know some of these things are frustrating, and some of the resources we linked may have broader studies but off the top of my head I don't have an answer for you.
I'm going to ask, do you have a job in the fall?
That's a separate question.
Will you bake cookies for next year's MidCamp?
Yes, if I present next year, I will definitely be bringing cookies with me to make up for this year's.
Someone mentioned they stopped using Facebook and that its model is based on a lot of dark UX. I also stopped. It bothered me to the point where I didn't want to be on it anymore. A lot of their practices are just not great across the board.
Another app that's really kind of bad about this is Uber. Both on the user side and the driver side. For some time they had negative business practices in place.
And of course, they have a constant balance of trying to make money and trying to, you know, make sure people want to use their app.
But unfortunately, some of that has led to them, you know, mistreating their drivers for some time, and trying to find a way to constantly cheat them out of the amount of money they think they will be making.
So ...
All right.
I can keep answering the questions, unless someone else wants to jump in. [Chuckles].
>> You're doing a good job.
>> IAN GEIMAN: Will Facebook be illegal in Nebraska? I don't know. Can you write more about this?
Are there any companies that avoid dark UX, companies that do not have to resort to tricks? That's a good question. Pay-for-service models can work without dark UX.
In terms of big companies, that gets tough. You have smaller instances of companies that are definitely trying to do the right thing.
I mean, the problem is, like, a lot of companies that offer something like a free service are trying to find another way to make money. A lot of the apps you use and everything else that might be free, that look free upfront, aren't inherently free at the end of the day, because they're trying to get your information, you know, and learn from you and your trends, or you and your data, and do something with it. And they can profit off of that. But I have to think a little bit more about companies that are avoiding dark UX and actually doing it well, mostly because most of the time you see the companies that are actually engaging in these practices.
I think ‑‑ I don't know if this is necessarily dark UX, but I use Adobe products a lot.
Love them, for the most part. They have their quirks, but they get the job done. I've been using Illustrator and Photoshop for years. When they switched to a subscription model, they forced an entire industry to start doing that. That's not necessarily an example of forced continuity; it's more a dark business practice to say, "I'm going to go in here and make an entire worldwide community of designers have to pay for this to keep using it and getting the updates they're going to want."
Yeah, I would say it's kind of hard to find companies that are fully avoiding this.
What is your favorite dark pattern?
I will let someone else start this.
>> MARIAH JACOBS: That's hard, because, like, favorite in the sense, like, it catches me the most, and so I'm like, oh, yep, that one is here again to fool me. Or just favorite as far as, like, the name? I'm kind of biased towards final countdown for the name, because it's just something that I ‑‑ I know exists out there.
And so I try to think of, like, what would describe it as far as a good name. That was my name for this one. I have to say, probably that.
But otherwise, the one that probably, like, trips me up the most, and that I can appreciate more after doing this presentation, I think I would say is forced continuity.
That's the one where they trick you into paying, like, the full price of a service after you've signed up for a free or reduced trial period, because I've had to put it in my calendar so many times: okay, my Audible subscription is set to expire on this date, so I need to cancel it or I will pay more. That's probably my favorite/least favorite pattern.
>> CASEY TIN: I will say forced continuity gets me the most and I have it in my calendar also to cancel a subscription. But I think the one that annoys me the most is probably sneak into basket. Like the example with Delta. That happens every single time I book a ticket, and it's so frustrating. And it's still there. And it's just something we have to deal with.
Being aware is helpful.
>> AMANDA: For me it's confirm-shaming. It's so common. Every time I visit ‑‑ [indistinct] ‑‑ there's a pop-up getting you to subscribe. Okay, I don't want to receive life-affirming emails.
Oh, but maybe I should subscribe to them. It's definitely a dark pattern.
>> IAN GEIMAN: Privacy Zuckering. The name aside, which is fantastic, it's an issue that's becoming more and more present in the public consciousness, I think. I think people are beginning to think a lot more about what privacy means online, and it's a fun one to talk about for that reason. There's a lot of stuff happening in that space, and the internet has been a fantastic, innovative place for a lot of things over the years, but a lot of development happened that wasn't forward-thinking about how information would be used.
To go back to a previous question someone had about companies that avoid dark UX: I was just looking at my iPad for examples. Apps that are just single-payment apps, like Procreate if you're an artist, are great. You pay for it once, they continue updating it. They don't guilt you into anything else. They don't have any, like, subscriptions to remove ads. It's just a one-time payment and it's done.
So that's good in that it's lacking any of the other dark UX patterns.
All right.
I'm seeing some people talking about avoiding free trials. Something called privacy.com is a good solution when signing up for free subscriptions.
I know also, I don't know how this would interact with free subscriptions down the line, but there were credit card companies that were proposing temporary card numbers for things like this. If there was a trial you didn't want to continue, you could set up a temporary card number and the charge would just bounce. But that's a whole separate conversation.
I think we only have about a couple of minutes before we should get out and let the next ones prep. Thank you for listening, and we hope you enjoy the rest of the sessions here at MidCamp.
And here's the feedback slide. Also, contribution day is Saturday, 10 to 4; don't forget about that. Thanks, everyone.