>> DAN MORIARTY: All right! And I'm going to switch over to my slide presentation. Hopefully you all can hear me okay. Excellent. Thanks for joining me. I'm just navigating a couple of controls with Zoom and my presentation on the screen, so this will take a second here. All right. Great, let's get started. Here we go.
About me. My name is Dan Moriarty. I have been doing web design for 20 plus years, working in Drupal for 20 plus years. I'm on Twitter and Drupal.org as MinneapolisDan. I work for Electric Citizen out of Minneapolis, where hopefully we will be hosting you for DrupalCon at some point this year. Let's get right to what we're going to cover today in this talk. It's all about privacy. The state of privacy today: what is at stake, why I would call it a battle, and who is winning. The law and privacy: what is changing, and what you need to know to comply for your organization. Best practices that you can start following now, so that you will not only comply with the law, but also be a good citizen in regards to privacy. And then lastly, technical considerations, such as modules or things like cookies or third-party scripts that affect privacy.
I am going to touch on the law. I am not a lawyer, of course. You may not know that, but it's true. We will just touch on some legal issues for your awareness, and you can do further research on your own.
I thought it would be fun, particularly if we were all in a room together, to ask why you are here. We could try this as a group, but I'm going to skip that. I was going to ask: why are you here? Are you here as a privacy advocate? Are you here because you're interested in keeping up with new laws and regulations? Do you just want to understand what the best practices are in regards to privacy, or some other reason?
>> Dan, you could have everyone use the chat function.
>> DAN MORIARTY: Absolutely. What's funny is that I would have to close out of my presentation briefly to switch applications. Once I'm in my slide show, I can't just jump over to Zoom, I have found. So we'll get to that later.
All right. I'm going to start off on a bit of a downer here: the battle for privacy is not going so well. In fact, you can make the case that it's a losing battle, and some would say unwinnable. Hopefully not, but why is this? Let's take a look.
So let's start with data breaches, the kind that get the big headlines: cases where a mass of personal information gets stolen from a trusted source online. Companies are hoarding tons of personal information about you on online servers and websites, and determined hackers and poor security lead to these breaches. There's not a lot that we have been able to do about it so far. One of the big examples is Equifax, where a lot of personal data was stolen, including my own. You can still look up online to see if you were affected, although apparently the end of last year was the deadline to add your name to be considered for whatever settlement they're offering. So I missed that.
Yahoo! is a famous one: 3 billion accounts, with names, passwords, and security question answers all hacked. One of the more infamous ones is Cambridge Analytica, where 87 million people were affected and the information was used for political ads.
With all of these data breaches, what can we do? One point is that if websites weren't collecting so much data, we would be a bit less vulnerable. Of course, security could be better. Really, there's only a handful of solutions to these problems right now, including freezing your credit and paying for identity protection. You could opt to quit using some of these services, but when it comes to something like a credit verification service, that's an awfully hard thing to quit. As I will point out later in this talk, a lot of these services that may seem optional are really such a part of our lives that it's hard to leave them behind.
So one of the scarier threats to privacy is facial recognition. A great example is what has been happening in China, as we'll watch here; if this works, this is just a brief video clip of what that might look like. All right. So facial recognition is a big threat to privacy, and it's not just repressive governments that are taking advantage of it. I don't know if any of you have heard of a company called Clearview AI. They are an American company, started by a single individual taking advantage of technology that companies like Google and Facebook were well aware of years ago but declined to turn into a product, because they feared the privacy implications. But because there's no law against it, someone did start a company called Clearview, and essentially what they do is scrape the internet and social media, compiling a massive database of everyone in the country, and use facial recognition technology to instantly match a face to all of that person's personal information. This is something they have been selling to law enforcement across the United States. There was a story about a "New York Times" reporter who uncovered this, who met with one of the law enforcement officers using the software and wanted a demonstration of what information they had on her. When he scanned her face, there was no record whatsoever of this reporter, which she thought was odd, because she had a fairly lengthy social profile. The next day the law enforcement officer got a call asking why he was talking to this reporter, and he was then banned from using the Clearview software. So it's a really chilling story of what's out there, because there's very little regulation about this. And if a facial recognition tool like this got into the wrong hands, a stalker or some kind of predator, it could be used to great harm.
Another example is NEC, a Japanese company that has been around for years and is known for many other technologies, but has quietly become a leader in facial recognition technology as well. They are currently set up in 20 different states across the country, working with law enforcement agencies using this new technology.
All right. So geofencing, another privacy concern. I don't know if you're familiar with this technique, which relies on maps to set up invisible fences or boundaries around a certain area. In my example on the screen, it is around a college; it could be around a church or a political rally. Essentially, if you have location sharing turned on on your mobile device, then these trackers can know whether you have attended a certain event. It can be used to determine if you go to church, how often you attend, which church you attend, or if you went to a Black Lives Matter rally. There are all sorts of implications for how this marketing could be used or abused.
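Under the hood, geofencing is just a point-in-boundary check against the location a device shares. A minimal sketch, assuming a circular fence and hypothetical coordinates (real trackers apply the same math at scale):

```python
import math

def in_geofence(lat, lon, fence_lat, fence_lon, radius_m):
    """Return True if (lat, lon) falls inside a circular geofence.

    Uses the haversine formula for great-circle distance.
    All coordinates here are hypothetical examples.
    """
    R = 6371000  # Earth radius in meters
    phi1, phi2 = math.radians(lat), math.radians(fence_lat)
    dphi = math.radians(fence_lat - lat)
    dlmb = math.radians(fence_lon - lon)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    distance_m = 2 * R * math.asin(math.sqrt(a))
    return distance_m <= radius_m

# A device reporting from a point roughly 100 m from the fence center
print(in_geofence(44.9740, -93.2277, 44.9750, -93.2280, 500))  # True
```

Once a tracker has your location history, running every point through checks like this is trivial, which is why "location sharing on" quietly answers questions like "did this person attend that rally?"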
It's also in your home with smart devices, which I'm sure many of you have. Smart speakers are a great place to start. They are designed to only work when they're triggered by their keyword, but there have been many cases of accidental triggering. I don't know if you are aware that they are saving everything you say to them. In the name of improving their AI, they are compiling massive databases, saving your "dialogue chunks," as they are called, when you speak certain phrases to an Alexa or Google Home. These databases contain entire records of years of data on where you drove, what you buy, who you call, and what you're listening to. I think a fair question is: do these companies deserve to keep an ongoing record of your private life? And consider how this could be used against you. Apple, for their part, claims to anonymize your recordings, not making them available for any other use or tying them to personal records. In regards to Amazon's Alexa, you can view and download the entire record of what they have on you, if you know where to look online. Google has finally switched to letting you opt out of tracking, but of course it's opt out, not opt in. So you have to know where to look online and take the time and trouble to have yourself removed from this tracking. All of this tends to default to tracking unless you act.
I'm sure many of you have seen one of the most popular, well-recognized ads from the Super Bowl, about an elderly gentleman asking his Google Assistant to remember details about his late wife. It was touching. A lot of people cried. It was all over Twitter. Let's play a few seconds of it. Not to rain on the parade of a touching video, but I think it's worth asking the question: when you see a video like that, all of this "remembering" is also Google collecting personal information on this individual, which will then be fed back to him in the form of targeted ads. Sometimes when people are at their most vulnerable, is that something that, you know, a private company needs to own?
Beyond smart speakers, it's a whole ecosystem of smart devices. It's Nest thermostats reporting back to Google when you're awake, when you're home, when you're moving in your house. It's Ring reporting to authorities when people are moving through your neighborhood. It's Philips Hue reporting when you turn on your lights and your energy use, and Sonos reporting what songs you're listening to. Roku, and I love my Roku, doesn't even consider itself a technology or entertainment company; they bill themselves as an advertising company. That's where their revenue comes from: tracking all of your watching habits and using that for advertising. Car companies are taking advantage of the same technology Google puts in, tracking your data, where you drive, where you're going, all private data they collect. It's even toys. I don't know if you have heard about this famous example. Anyone out there? The My Friend Cayla doll could be easily hacked and monitored by other people. It was recording and saving phrases that kids spoke to the doll and sending them back to the company. These phrases, again, are called dialogue chunks. The company was collecting and selling them to Nuance Communications, who in turn was selling them to agencies like the CIA, who use dialogue chunks to improve their voice recognition software for things like profiling terrorists. So even at a young age, we're getting people used to being spied on.
The takeaway is: if it's called a smart device, it's tracking you in some way. A smart fridge, smart TV, smart car, smart anything, that's the trade-off. And what's the point of all of this tracking? It's building your public profile, knowing your school, your religion, your work, your politics, and really feeding the data brokers. Some of these are well-known companies, but there are hundreds of unknown ones. They are doing things like private reporting, advertising, marketing, deciding whether you are approved for a loan, or whether you get a job. There's very little regulation, but billions of dollars at stake.
One of the more forceful speakers on this is Shoshana Zuboff. Her book is dense; I highly recommend listening to her on a podcast or reading some articles if you don't want to read through the entire book. In it she essentially lays out how surveillance capitalism describes the new controlling class. She contrasts this with a previous age, when the robber barons controlled our raw natural resources. Today we're seeing the same dynamic play out, but with a new controlling class, which she calls the surveillance capitalists. Rather than controlling and exploiting our coal, lumber, and land, as the robber barons of the past did, this time it's our personal experiences: our human behavior, our private thoughts, opinions, and experiences. And the key to all of that is not only are they doing this, but we're giving all of that raw material away for free.
I can't see you all, so hopefully you're all still there and following along. I made this ‑‑ jumping ahead here.
>> SPEAKER: Quick note, there's nothing on your video.
>> DAN MORIARTY: It really all started with Google. In the early years, Google established itself as a leader in search, and one of the ways it did that was by using people's search history to continually refine and improve its search engine until it became the best search engine available. At the time, there was still a lot of question about how it would become a profitable company.
But one of the side products of all the data they were collecting to improve their search was all of this personal information about people: what they searched for, what they were interested in, what they thought about. And they quickly realized how incredibly valuable this data was, initially starting with AdWords. It brought in billions of dollars, feeding this machine with more products, better products, and gigantic profits. Today it's an entire ecosystem of tracking, or some would say spying, from your phone, your browser, your e-mail, your documents, your photos, all of it going into Google. Microsoft jumped on board soon afterwards, Amazon as well, and it's really expanded from there. Some of the more dangerous examples out there are companies like Verizon, which can circumvent any browser or privacy choice you make by starting the tracking at the baseline of your device and internet connection itself.
I made this nice little graphic to explain how this dynamic works. On the one hand, you have you, your family, and friends. On the other hand, on the right side of your screen, you have your Googles and Facebooks and companies like that. It starts with them offering free and low-cost apps and services like search, documents, entertainment, and chat, all things that most of us use every day. The trade-off is that once we accept these things, they start to track your data. They are scraping your e-mails. They are looking at your documents. They are using all of these things to create profiles. What they then do, every day, is extract private data from you and send it back to the same companies, who in turn package your private data, which they got essentially for free, and sell it to numerous companies like the data brokers. The middle man is making a fortune doing this. The dynamic is that you have the large companies extracting your private data for free and making a ton of money, and on the other side you have companies targeting our own private data back at us and making a ton of money, leaving us as the target, the raw material for company profit. So that's the dynamic of surveillance capitalism.
So what can we do? The first option is to get off the grid, but that's not really realistic for any of us. And that's really one of the big catches with all of this. Being online is such a central part of our lives that staying anonymous while remaining an engaged, productive member of society is impossible. Think of the ways we live today: a lot of our banking is done online. Finding a job, applying for jobs, or searching for jobs is mostly online. Dating is online. Filling out college applications is online. So much of our lives is centered around this dynamic that it's really hard to avoid. Of course there are privacy policies out there that are offered as an option: you can see what is being tracked and maybe find a way to opt out. But it's really a tough choice. The graphic you see on the screen plots a number of privacy policies, highlighting some of the more well-known companies, on a scale of how many minutes it takes to read each policy versus how easy or hard it is to read. A study well over ten years ago showed that, back then, it would take an average person 244 hours per year to read all the privacy policies for the sites they use. That's about 40 minutes a day. And that was over ten years ago, when people used the internet just a little over an hour per day, a number that has now grown to well over three hours; I'm sure for many of you, as for me, it's far more than that. You have a company like Airbnb, where not only does it take 40 minutes to read their privacy policy, but it's high on the scale of difficulty. You've got one at the middle school reading level, and that's the BBC, which is what all writing online should be targeted towards. But the vast majority of privacy policies require reading levels at the college or professional, AKA lawyer, level. So it's really a false choice to be offering it. Option two is to just get over it.
I attended a keynote last year at a conference where the speaker, Erik Qualman, made the statement that privacy is dead, and that really we need to lean into that and worry more about what he called your digital stamp: your running record of personal data and the legacy you're leaving behind. So worry about your tweets, your Facebook posts, your blog posts, whatever public bio is out there. That is what matters for getting hired or really being an active citizen; if you do not participate, you do not exist. That's the dynamic that was put out there, and that's where a lot of us are today: focus on our best social profile and just trust that the markets will look out for our best interests. Or you could say, I like the personalized ads. You could say, I love that this company knows that I like this particular boot, so at least I'm seeing that instead of a pair of shoes I don't like. Maybe that's great. Or there's option three: try to fight back against this big monster. The way to do that is to follow and support new privacy laws, best practices, and new tools, and to choose responsible open source. I say responsible because open source in and of itself does not mean it's good for privacy; it has to be a choice. The Android operating system is open source, and it's a wonderful privacy-tracking machine. All right.
So the rest of my talk is going to be about that. We will start with the law and the GDPR. It has been around for two years. I will not go into the details of the law, because that's a whole talk in and of itself, and others have done it quite well. To quickly summarize, GDPR was created to establish guaranteed rights in regards to your personal data: how it's collected, limiting what can be collected, and empowering you with certain rights. They include things like the right to be forgotten, which means you can ask companies to stop tracking you and remove all data they have collected on you. There's privacy by design, which means you opt into tracking, as opposed to the current paradigm where you have to find a way to opt out. And companies must be prepared to respond when a breach happens. Data breaches seem to be inevitable, so you need policies and procedures in place to handle them.
Who does it affect? It's a very broad law. It affects both for-profits and nonprofits, small companies and large. There's no size limit. It covers anyone who's a European Union citizen, living in Europe or abroad, and it can cover you as an American citizen while you're living in a European Union state. It covers a broad audience. I also wanted to give a shoutout to the EU privacy directive, which came out years before the GDPR and helps define the use of cookies. As many of you have seen, when GDPR became law, you really saw an uptick in cookie banners; they seem to be one of the primary outcomes of the law. Everywhere you go, you see a pop-up saying, hey, this site uses cookies, and usually there's a single button that says tell me more, or I agree. I want to highlight that this is the wrong approach. You shouldn't have to agree to be tracked; that shouldn't be the only option. You should be able to choose to be tracked or not to be tracked. What has happened since the law took effect? There have been billions of dollars in fines. It's not something small organizations need to worry about at this point; they're going after the worst, most blatant offenders. It has also led to new laws in the United States, which we will touch on next.
So, privacy in the USA: the CCPA, the California Consumer Privacy Act. It became state law on January 1, 2020. Did anyone notice a flurry of e-mails about updated privacy policies? Anybody bother to read those updates? I'm imagining some of you raising your hands, but I can't see right now, so that's cool. What does it do? Again, I'm not going to get into the details, just a quick overview. It's similar to the GDPR. At its essence, it says companies have to tell you what data they're collecting. You have to have easy ways to opt out or to request that your data be deleted. And you can't be discriminated against for exercising your privacy rights. I find that part particularly interesting, going back to this cookie concept of tracking: it guarantees a right to use and navigate a site whether or not you agree to its tracking policies. Nothing can be disabled just because you don't agree. Who must comply? It's anyone doing business in California, which obviously doesn't include all Americans, but when it comes to the web, it's really hard to know where people are coming from when they're visiting your site. Unlike GDPR, this currently affects only for-profits, and only larger companies with higher revenues. But it also covers anyone collecting personal info on 50,000 people or households or devices per year. That might sound like a lot, but if, for example, you're running a blog that averages 10,000 visitors a month and you're tracking analytics, you would qualify. So then you would need to make sure you're in compliance. It's really about visible opt-out links, implementing security procedures, and updating your privacy policy. What's happened since it became law? Well, there are billions being spent on compliance. Companies are scrambling to make private data accessible to users; being able to view and export your personal data is something most sites did not offer.
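That 50,000-record threshold is easy to cross without noticing. A quick sanity check of the blog example, with the figures from the talk:

```python
# Hypothetical figures: a blog averaging 10,000 tracked visitors a month,
# checked against the CCPA's 50,000 records-per-year threshold.
MONTHLY_VISITORS = 10_000
CCPA_THRESHOLD = 50_000  # consumers, households, or devices per year

yearly_records = MONTHLY_VISITORS * 12
print(yearly_records)                    # 120000
print(yearly_records >= CCPA_THRESHOLD)  # True
```

So a modest blog with analytics enabled collects more than twice the threshold in a year.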
So that's a steep requirement. One of the more notable cases was the retailer Hanna Andersson, which was fined about $7.5 million for violating this law, which for a relatively small retailer is a significant hit. A new space is being created by companies offering to help with compliance, so there are billions of dollars being raised there as well. And when it comes to exporting data, I thought it would be fun to explore a few options on my own. So I went to one of these data brokers called Sift, and like many, they collect profiles on individuals, and you can ask for your data to be exported. I went through the whole process of providing my ID and filling out applications, and after a few weeks, I got my data. It was shorter than I expected, but it had some interesting information in there. You can see on the screen here: I didn't expect that it would be tracking every individual menu item I ordered from a particular take-out service, but yet it does, including what I tipped and what I paid. It's a little disconcerting to see the amount of personal data that might be out there.
In regards to Facebook, you can go right now to your Facebook account settings, go to Your Facebook Information, and you'll see options including downloading a copy of all the information they have on you, or at least most of it. I thought it was a pretty well done operation, to be honest. I went ahead and did that. I was going to show you a live demo, but I'm not going to do that today. Just know that here on the right side of my screen, I was able to view things like advertisers who have uploaded a contact list with my information: hundreds and hundreds of names. I'm able to clear off-Facebook activity, where they're tracking you on third-party sites. So while Facebook is certainly guilty of a lot of privacy violations, when it comes to these new privacy tools, I thought they were pretty well done.
All right. So it's not only California that you need to be concerned about. One of the biggest repercussions is that other states have joined in; the four states I have listed here have all been considering various privacy laws. And you can easily see how this could become a giant mess for anyone trying to keep their website in compliance on a state-by-state basis. The web doesn't respect borders, and people can access websites from anywhere. If all the laws are different, it's kind of a nightmare. It would seem that this has to lead to a national law, and yet, in the current presidential election, no one is really talking about this. There was one candidate, Andrew Yang, who talked about establishing an office at the federal level for privacy protection, but he's no longer in the race. The current thinking is that despite the potential for a lot of confusion, a federal privacy law is still several years away.
Again, it's not just the United States or Europe; other countries too. Canada has its own privacy laws, which you would want to familiarize yourself with if you're doing business with any of these other nations. Australia has its own privacy law, and even though it's been several years now, they are looking at huge fines relating to the Cambridge Analytica scandal. Brazil has a new law going into effect this year. It's really becoming something that is hard to ignore. I also thought it would be fun to show this graph from last year, which tries to put a score to each state in the country for how well it is doing on privacy rights, including things like whether the state requires a warrant to track someone's location, or whether companies have to delete personal data on request. As you can see, California, especially with its new law, is leading the country right now, and a lot of states are far behind. Illinois, home of MidCamp, has a slightly better score because of some laws regulating the use of biometrics like facial recognition. But as a whole, we could all probably be doing a lot better. Regardless of the law, there's a practice you can start today, and it's called privacy experience, or PX. We're still adapting to accessibility laws; we're looking at ways to improve our DevOps and our UX. PX is another discipline and stage of projects that we should consider with every site build and redesign, so add it to your planning process. What do we mean by PX? We mean planning for a user's privacy and security before there's a crisis. So as you're starting out with a project, identifying your project goals and doing your technical architecture, have a discussion about privacy. What data are you collecting? Are you limiting data collection to only the items you need? How will you use web forms, cookies, and site analytics?
What third-party tools are you using? These can be short or longer discussions, and obviously they affect budgets. But just like accessibility and other concerns, this is becoming more important. It's something we will have to consider.
I'm not going to read this list to you, but you can look up PII, personally identifiable information, online. It includes obvious things like your name and date of birth, where you live, your face, your fingerprint. One of the more controversial, but I think most worthwhile, identifiers to point out is your IP address. A lot of people would consider that not to be personally identifying information, but it has been proven that companies can use that data to associate individuals with personal profiles. Particularly if ISPs or cell phone providers continue to implement tracking, it's not too hard for them to take what is supposed to be an anonymous identifier and associate it with a particular individual.
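One common mitigation is to truncate IP addresses before storing them, the way Google Analytics' anonymizeIp option does. A minimal sketch using Python's standard library; the /24 and /48 truncation widths follow that convention, but they are a policy choice, not a standard:

```python
import ipaddress

def anonymize_ip(addr: str) -> str:
    """Zero out the host portion of an IP so the stored value
    identifies a network neighborhood, not a single device."""
    ip = ipaddress.ip_address(addr)
    # Keep the first 24 bits of IPv4, the first 48 bits of IPv6.
    prefix = 24 if ip.version == 4 else 48
    net = ipaddress.ip_network(f"{addr}/{prefix}", strict=False)
    return str(net.network_address)

print(anonymize_ip("203.0.113.42"))  # 203.0.113.0
```

Truncating at collection time means the raw address never lands in your analytics database, so there's nothing precise to leak or subpoena later.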
All right. So what can you do? You can set clear policies. You can set expiration dates on data. Following the spirit of GDPR and CCPA, you can make it clear to users what data you're collecting. Let them view and download it. Let them ask to have it removed. And really set a plan for privacy. So many projects wait until the end to tack on a privacy policy, right? Just like the copyright notice that goes at the bottom of a site, you say, oh, we need a privacy policy, and throw something out there. A lot of times it's a cut-and-paste job from somewhere else that does not take into account what particular data you're collecting. I know that not everyone can budget for this, but there are policy generators out there; I have seen third-party services where, for a small fee, you can draft a custom privacy policy. It's something you can explore.
So I like to say accessibility isn't required everywhere yet either, but we do it because we know it's the right thing to do. We want to be good people. And judging by the way things are going, it's going to be a requirement soon anyway, so why not get on board? I think that privacy compliance is also something that we're going to be dealing with sooner than later. So, it's wise to start learning about what you need to do today.
All right. So with my last minutes here, I'm going to touch on some technical issues as they relate to privacy, starting with developers. Plan how you will manage personal data starting with the initial planning stages of the project, and measure the impact if you have any data-intensive projects. I know the GDPR requires something called a privacy impact assessment, which is a document where you discuss, audit, and inventory any privacy risks in your project. You may not have to go that far, but you can practice some common sense things. Most developers, when they work on a site, are working locally. That means they're copying all the data from a server, bringing it down to their local machine, working on their various tasks, and pushing it back up to the server. As they're doing so, they could ask: what personal data am I bringing down from the server onto my machine? Do I need to be exposed to that specific personal data? How am I protecting it? Is it possible I could purge some of that personal data and replace it with fake data? It certainly doesn't need to be a real person's data when you're testing a bio field; it could be dummy data. Ask, when you're putting data on servers, is it encrypted? Who has access? If you're on the content marketing side, ask yourself: what data am I collecting? Do I need to collect it? Can I limit what I'm collecting? And how am I protecting users? You need to give people an opt-out. This includes things like first-party cookies; generally those are supposed to be limited to what's required for the site to function. If you're on an e-commerce site, you need the shopping cart functionality, and that's fine. One module I thought was nice is the EU Cookie Compliance module, which will help install that banner that informs users. There are also third-party integration services that do similar things.
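The purge-and-replace idea above can be as simple as one SQL update run against your local copy before you start work. A sketch, assuming a hypothetical `users(id, name, email)` table in SQLite; on a real Drupal project the same idea is usually done with `drush sql:sanitize` against the actual schema:

```python
import sqlite3

def sanitize_users(conn: sqlite3.Connection) -> None:
    """Overwrite PII in a local database copy with dummy values,
    so real names and emails never sit on a developer machine."""
    with conn:  # commits on success
        conn.execute(
            "UPDATE users SET "
            "name  = 'user_' || id, "
            "email = 'user_' || id || '@example.com'"
        )

# Example: sanitize a throwaway copy
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'Dan Moriarty', 'dan@real-address.com')")
sanitize_users(conn)
print(conn.execute("SELECT name, email FROM users").fetchone())
# ('user_1', 'user_1@example.com')
```

The sanitized rows still exercise the same code paths (a bio field doesn't care whose bio it is), but a lost or compromised laptop no longer exposes real people.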
I thought that WordPress had a particularly nice GDPR cookie compliance plugin, which I would like to see more of in the Drupal world. It essentially looks like this: you're not only alerted to cookie settings, but when you go to more information, you can see what the essential cookies are, then go to analytics or advertising cookies; it tells you what you are getting, and you can individually opt in or out of these choices. A more robust experience like this is what the law is asking for, and I think this plugin does it very well. Site analytics: again, one of the main issues there is that it's supposed to be an anonymous experience, yet it is collecting IP addresses. Is there a way you can avoid doing so? There are modules that can help with that. Analytics are important, and I am not opposed to them, but can you make sure you're not overly tracking user data? Online forms: when it comes to Drupal webforms, how long do you keep the data people fill out? Is there a way you can set up a cron job or some script to delete old submissions? Can users see what data you have collected? These are the kinds of questions you can be asking. I think third-party scripts are a big one. Tools like NPM and Composer are widely used on Drupal sites, installing dozens or hundreds of third-party scripts and libraries. Who is tracking the privacy policies of all of those? Is there an opportunity for those to be abused? There certainly seems to be, so that's something to be aware of. Even something as simple as embedding a YouTube video on your page means that, whether a user chooses to play it or not, once it's loaded it is tracking that user for the rest of their session. Same with those nice sharing widgets you might put on a blog post: regardless of whether anyone uses them, once they're in place, they are tracking. So those are things to consider. Really, just be privacy smart.
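The retention question for web forms can be answered with a scheduled cleanup job. A sketch against a hypothetical `submissions(sid, created)` table with Unix timestamps; in Drupal the same idea would run from a cron hook against the webform submission storage, and the 90-day window is an example policy, not a legal requirement:

```python
import sqlite3
import time

RETENTION_DAYS = 90  # hypothetical retention policy

def purge_old_submissions(conn: sqlite3.Connection) -> int:
    """Delete form submissions older than the retention window
    and return how many rows were removed."""
    cutoff = time.time() - RETENTION_DAYS * 86400
    with conn:  # commits on success
        cur = conn.execute(
            "DELETE FROM submissions WHERE created < ?", (cutoff,)
        )
    return cur.rowcount
```

Run something like this from cron so old personal data ages out automatically instead of accumulating forever; you only keep what you can still justify keeping.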
Relying on encryption of data. That's the number one tool.
Supporting advocacy organizations like the ones listed here. Considering ethical design and open source, and making ethical choices when it comes to user privacy. And of course, when I'm talking about ethics, what I'm really talking about is tethics: the Gavin Belson pledge of tethics, which you should all read and sign if you're a fan of "Silicon Valley."
And Drupal-specific, there are a number of modules that will do things like anonymize IP addresses, prevent videos from loading until the user consents, or sanitize personal data when you're working on sites. Really, the final takeaway is that I just want to encourage everyone to avoid getting numb to the way things are today. Understand the new laws, and know that all of us, especially those building the web, have a role to play in making the web safe and protecting privacy for everyone. So that's my talk. Thank you so much. I'm going to exit my slide show now so I can go back to Zoom and see you all.
And thank you so much.
>> If you have questions, please use the raise-hand button. You can find it under the manage participants option in Zoom; there should be an option by your name to raise your hand. Does anybody have a question? Can people unmute themselves?
>> Yes. Well, excellent talk. I was just thinking that perhaps you had yet to find sites such as takeout.google.com, which allows you to export all of the data they have on you, including data that smart devices may capture. There is another, myaccount.google.com, that allows you to delete accounts.
>> All right. Great. That's good to know. I know I didn't mention it, but in regards to cookies, Chrome is one of the browsers that does not block third-party cookies from tracking. So if that's a concern of yours, as of today you still have to go to Firefox or Safari or lesser-known browsers. There's a question here. Yep. A question in the chat: are you aware of any tools to help automate privacy audits or scores? I'm thinking of how 1Password ‑‑
>> I use 1Password and I appreciate what they're doing. I am not aware of any tools for automating privacy scores or audits. Maybe the question is in regards to how your site is performing? Is that the gist of it? I know there are tools for checking accessibility. I know it's hard to do, because it would mean looking at how your site is constructed; it could see how you're loading a script or a cookie. There is an opportunity there. Yeah. If I find anything, I will certainly share it. That's a good question.
>> First of all, I loved the talk. I love this idea of, like, privacy experience management. That's a really interesting idea that I think we're going to see pop up a lot more. I'm curious if you have dealt with companies like Acquia, which is pushing their product and this whole suite of products for, you know, content customization for customers. Have you seen anything from them about privacy concerns? Or do you know anything about how companies like that would fit into this whole privacy experience discussion?
>> Again, a good question. I apologize for not being up to speed on Acquia's offerings. I can tell you that one of their competitors, or what they view as one of their competitors, Adobe, has been moving rapidly into the Google space of harvesting personal data and using it for profit. So I would be surprised if they are not feeling pressure to do the same. But that is something for further research, and something we will look into. Thanks.
>> Yeah, thanks.
>> All right. Any other questions out there? Well, if not, feel free to take yourself off of mute and give a round of applause to Dan. I can see you.
[ Laughter ]
>> I can see you applauding. Thank you so much for going through this with me. It was certainly a new experience for me to give a talk and not be able to see my audience. So thanks for bearing with me. And hope you have a great rest of your day.