I recently had the pleasure of chatting with the Bloor Group’s Eric Kavanagh for an Inside Analysis podcast on the ethics of data. Along with Collibra’s Stan Christiaens and data governance consultant John Ladley, we discussed the importance of an ethical approach to data and the ethics challenges companies need to consider when they launch their data initiatives.
The following is a condensed version of the more interesting exchanges we had during the conversation. Listen to the full Ethics of Data podcast.
ERIC KAVANAGH: We’re going to talk today about the ethics of data management. Big regulations are changing things in the world. GDPR, of course, and we’ve already seen the ripple effects across the U.S., like the California Consumer Privacy Act, the CCPA. The bottom line is that companies have to be more responsible about how they use your data, and you have to be alerted to the fact that companies are capturing your data. Companies like Facebook are poster children for bad ethics: Even though we give them permission in many cases, it’s still hard to manage. What does this mean for you, and what can you do to stay on top of your game? To start us off, Stan, tell us your thoughts about responsibility with data and data ethics.
STAN CHRISTIAENS: Cambridge Analytica brought data ethics into the headlines, especially in the context of social media platforms. Now the stories are everywhere, like the one about the big tech company that trained one of its chatbots on Internet conversations and watched it turn racist pretty quickly.
Ethics brings you to morals pretty quickly, and then it gets difficult. Look at the classic MIT trolley-problem experiment, where you have to choose between a trolley killing one person or three people. Research has shown that people actually make different choices depending on their culture, so there are no universally right or wrong answers here. Given that it’s such a hot topic in this “algorithm economy,” I want to talk about five things I believe you should think about to avoid these nasty situations.
First, take a proactive approach to data ethics. Don’t wait until somebody steals your data or you build a racist chatbot to start thinking about it. You can make the right decisions when you think ahead about the problems that could arise, rather than just reacting to situations after they happen.
Second, understand the journey that data makes. Data moves around. It’s consumed by AI systems. People use it in their data warehouses and analytics tools. You can put it on a USB stick. So you have to understand where it comes from and how it moves.
Third, data is not just a byproduct of what you do as a business. It’s an asset in its own right. That’s why it’s important to explain the value of data to the business: How can they actually make money from it? You also have to think about the risks. If you explain to the business why data is important, also explain to them why ethics is important.
Fourth, involve a wide range of stakeholders. It’s not just about the Chief Data Officer or whichever CxO owns the data. It’s about all the stakeholders: the IT team, the analysts, even the CEO. If you’re out there bringing a message about ethics and privacy, you really have to make sure that your actions match your words. You also have to make sure there’s diversity among your data people. If everyone working on your algorithms is between 20 and 30 years old and comes from only one slice of the overall population that makes up your customer base, you’re going to have problems. They’re going to miss things.
Fifth, this is not a one-time exercise. Data is being used all the time, it delivers benefits to the business, and the business changes because of it. When you build an AI solution, you have to take into account that your customer base is changing. The demographics are changing. So you have to monitor data use on an ongoing basis to make sure your bases are covered from an ethics perspective at all times.
EK: We have Heidi Maher from the Compliance, Governance and Oversight Council. Stan hit all the major points there, Heidi, but what are your thoughts about the ethics of data management in the modern world?
HEIDI MAHER: I agree with Stan that the first step is to be proactive about it. A lot of companies have put the cart before the horse because of the rate of information acceleration. We’re looking at 4.4 billion users in 2019, and projections have that number increasing by 9% year over year. So big data is a fact of life for a lot of companies, and now that they have it in-house, they want to monetize it. They want to get insights into customer behavior. They’re gung-ho about putting in analytics and data science tools and hiring data scientists and CDOs to get the meat out of this raw data.
Those are all great ideas, but data ethics needs to be brought in at the very beginning. There are good ideas out there and good ways of finding out things we could never find out before, like what common DNA leads to a certain disease. If we could see that ahead of time, we could help people get the care that might slow its progression. Now there are analytics and data science around crime. When I was an assistant district attorney in Austin, Texas, it would have been great to have tools that could show us high-crime areas, houses most likely to be burglarized, or people who are high risk or low risk, to determine their bond or make a sentencing recommendation.
Those are all great ideas in theory, but when you are developing these programs, you should also look at them from a negative perspective: How can this data be misused? Are we going to start targeting people before they’ve even done anything? When it comes to healthcare, are we going to use that information to jack up their insurance rates? There are all kinds of ways that things can be misused. So when you’re creating your algorithm, you also have to ensure there are no internal biases baked into it. A lot of this is in the GDPR, which is a wonderful starting point, but data ethics goes deeper than privacy concerns alone.
EK: What’s your advice for organizations to get it right with data ethics?
HM: As a lawyer, I think it’s better to be careful and perhaps go overboard with compliance than to skip it and lose your reputation over it, or receive some fines. Some companies are putting ingenuity ahead of compliance, but the consequences of not having data ethics and just letting it be the Wild Wild West are terrible, because we’re talking about people’s personal information. We’re talking about things that could affect them for their entire lives.
EK: Think about how much location data there is now because of your apps on your phone. If you aggregate enough of that, you can figure out where people are, at this store or at that store, at this restaurant chain or that restaurant chain. And there are people who have access to this… Just think of Google, for example, and Facebook, and some of the bigger ones. They have access to so much real-world data about just where people are during business hours.
John Ladley is a consultant on data governance issues. John, what do you think about this preponderance of real-world data at scale and how it’s going to change economies?
JOHN LADLEY: I’ve said this for years, and finally people are saying I’m not a crackpot. We’ve built society around land, labor and capital. Every economics textbook has land, labor and capital in it, and we have all this oversight, governance and established behavior patterns for land, labor and capital. We’ve reached a point in human history where we need to add a fourth item to that list, and that’s data. We’ve created a new thing that defies our existing abstractions. We’ve invented something new that affects our lives and our behaviors, but we have no rules around it, and it’s no different from the early days of the Industrial Revolution and the early financial models.
We’re going to have data bubbles like we had tulip bubbles. We’re going to have ethical challenges around the treatment of people. Labor was considered dispensable in the early Industrial Revolution, and, to Heidi’s point as an attorney, we had to build protections around things. We had to manifest society’s desires in the code of law because we came up with principles. The basic issue with all this data and the Internet is that no one has been held accountable. It’s been the Wild West. It’s easy to be a coward on the Internet and hide behind a fake ID and wild beliefs, and we’ve ended up thinking that that’s the rule, and we’ve ended up inviting Big Brother in because there’s been no accountability. We’ve got to step back a bit and put some principles around this.
EK: I’ll just throw it out to Heidi for some closing comments here. We are in a transformative time, and it’s a big deal, so I think it’s good that we have folks such as yourself out there paying attention to these things and crafting policies. What would your advice be for someone who wants to do the right thing? How can they get started?
HM: You should look at it like dating. Have some minimum standards. Any company that doesn’t meet your minimum standards, don’t do business with them. If they’re not using the data according to the terms under which you provided it, then don’t ever use them again. Cancel your Facebook account. Cancel your Twitter account. Make a statement from the consumer point of view.
EK: Yeah, that’s a good idea.
Learn more about the CGOC community and become part of our global forum to get the insights and information you need to make better business decisions.