
Ethics in Tech: An Underexplored Topic in the ESG Space

Charles Radclyffe is an ESG specialist focusing on technology governance issues such as AI ethics. He is a co-founder and Managing Partner at EthicsGrade, an ESG ratings agency specialising in evaluating companies on the maturity of their digital strategy governance, such as AI or data ethics best practices. EthicsGrade makes its company scorecards freely available on its website and provides a detailed data feed for investment professionals, helping them align their own values on these questions with those of the companies in which they invest.

You have built three tech companies. Along that journey, were there any particular experiences or issues that got you thinking about ESG and tech combined?

My first company was an IT support business, so I was living the circular-economy problems that are only now reaching the headlines, 20 years ago. We received laptops in need of repair, but they'd been glued together in such a way that they'd break instantly if you tried to remove any components, so they were unserviceable. I've always had a frustration with the design of technology. I've seen a paradigm shift from what was once an engineering focus (producing a product that would last) to the current commercial focus. There seems to be a greater focus on exploiting tech innovation nowadays, but I believe the old disciplines will make a return - they have to, if we're going to save this planet for our grandchildren!

The early 2000s was when I started exploring how the web might have a dark side as well as being an incredible agent for positive change. I started to do more public speaking and writing around the ethics of tech. Back then it was more of a hobby; in my role at the time there wasn't really an opportunity to push it any further. Then I joined Fidelity, where two things happened:

  1. Firstly, I was heading up AI for the firm, which meant I was responsible for figuring out where the biggest impact we could make from machine learning and autonomous technology might be. When looking at all the different business operations and getting to understand ESG, I realised that there was a massive opportunity to improve the quality of ESG data as well as tighten up the processes surrounding the analysis of the data in order to meet the ESG goals of the firm and our clients.
  2. Secondly, I realised that I was able to push this question of governance and responsibility - both of our own technology and through the investments we make - quite easily, and certainly much more easily than in previous roles on the sell-side.

Tell us about EthicsGrade, and then about your role there in particular.

We’re an ESG research company that provides ratings on companies (both tech and non-tech) concerning the level of maturity of their governance in digitalisation strategies. We then licence that data to people such as investors and proxy advisors.

 “We are hyper-focused on what we call ‘digital ethics’, so when we look at ESG issues, we do so through the lens of digitalisation.”

However, our coverage is not exclusive to the tech industry; we cover many other industries, including oil and gas, pharmaceuticals, FMCG, transport and so on. But what we look at remains the same: how organisations are being responsible with technology.

My role is commercial and operational. I'm one of the co-founders, and I lead the business from a strategy perspective. We also have an analyst team doing research and a tech team busy automating our research capability. We've got a very small commercial team that services clients and licences data. We have relationships with the companies we research because we have to talk to them to find out what they're doing; they are also interested in how they compare to peers. However, these companies are not our clients and we don't provide them with services; we only licence our data to investors.

EthicsGrade provides company scorecards, what do these scorecards show?

Currently they are the first iteration of what we want to be able to do in the future. Down the line, we want to be able to offer an automated and personalised assessment of any organisation in real time, based on data we can pick up from public sources and data that companies provide to us. We look at four risk drivers, which pertain to where tech might go wrong, and analyse these across six research dimensions, such as technical, ethical or environmental risks. We're also developing maturity assessments; these aren't yet finished and we will be rolling them out in the next quarter. Digital risks and digital assets are much more relevant to a car company that makes self-driving cars than to an airline, and clearly these issues are much more relevant to online dating platforms than they are to companies that make paint. So we need to start looking at the materiality of each factor for each industry, and that's something that will soon be going onto our scorecards.
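To make the structure concrete, here is a minimal sketch of how a materiality-weighted score could be aggregated across research dimensions for different industries. The dimension names, industries and weights are hypothetical illustrations, not EthicsGrade's actual drivers, dimensions or methodology.

```python
# Hypothetical sketch of a materiality-weighted scorecard.
# Dimension names, industries and weights are illustrative only.

DIMENSION_SCORES = {          # 0-100 maturity score per research dimension
    "technical_risk": 72,
    "ethical_risk": 58,
    "environmental_risk": 64,
}

MATERIALITY = {               # industry-specific weighting of each dimension
    "automotive":    {"technical_risk": 0.5, "ethical_risk": 0.2, "environmental_risk": 0.3},
    "online_dating": {"technical_risk": 0.2, "ethical_risk": 0.6, "environmental_risk": 0.2},
}

def weighted_score(scores: dict, industry: str) -> float:
    """Aggregate dimension scores using the industry's materiality weights."""
    weights = MATERIALITY[industry]
    return sum(scores[d] * weights[d] for d in scores)

print(weighted_score(DIMENSION_SCORES, "automotive"))     # 66.8
print(weighted_score(DIMENSION_SCORES, "online_dating"))  # 62.0
```

The same underlying scores produce different headline numbers depending on which issues are material to the industry, which is the point the scorecards are moving towards.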

However, what most people take value from is not necessarily the score itself; it's where an organisation sits against its peers. For me, that's the most interesting thing. Which is the more ethical social media company: TikTok or Twitter? Who has the more robust self-driving car technology: Toyota or Tesla? It's questions like these that we want to be the place people come to answer.

 “We hope that the scorecards will help inform interested parties of the risks associated with particular organisations and industries. Of course, it's always very interesting to compare the likes of TikTok against Twitter, or Toyota against Tesla, but in other industries too you can get a real sense of which organisations are starting to break out in terms of their investment in data, AI and autonomous systems, as good governance is a leading indicator of digital strategy maturity.”

What issues do you see as most salient within ESG and tech?

Starting with the environmental (E) sustainability of digital systems, one issue is the computational efficiency of technology; this relates to how an organisation thinks about the environmental impact of its tech footprint, which entails things such as policies on digital waste (disposal, repair, upgrades) and the location of its data centres. Over the last few years we've had the trend of moving to the cloud, and now there's a trend towards ‘edge compute’. Edge is where you push the computation down to the point where the data is collected; the problem is that you don't necessarily have as much control over two things: 1) the efficiency of the computation when it happens at the edge, and 2) the carbon footprint, particularly if the edge is no longer within your organisation. Take Waze, purely for the sake of example. Waze might make the design choice that every time somebody uses the app, the user's phone does all the computation of its routes rather than Waze doing it centrally. That would be great for Waze, because they can lower their energy costs and reduce their own carbon footprint by not performing so much computation, but in terms of overall impact that phone likely isn't anywhere near as efficient at the computation as a data centre. These environmental factors are very relevant to organisations in the aggregate, and also betray the extent to which ESG strategy is holistically considered and implemented across the organisation.
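As a rough illustration of that trade-off, the sketch below compares the aggregate energy of running a workload centrally versus on end-user devices. Every figure (requests per day, joules per request, PUE) is a made-up placeholder rather than a measurement of Waze or any real service; the point is only that shifting work to millions of less efficient devices can raise the total footprint even as it lowers the provider's own bill.

```python
# Hypothetical edge-vs-data-centre comparison; all numbers are placeholders.

def total_energy_kwh(requests: int, joules_per_request: float, pue: float = 1.0) -> float:
    """Total energy for a workload; PUE covers cooling/overheads in a data centre."""
    return requests * joules_per_request * pue / 3.6e6  # joules -> kWh

REQUESTS = 100_000_000  # hypothetical daily route calculations

centralised = total_energy_kwh(REQUESTS, joules_per_request=2.0, pue=1.2)
on_device   = total_energy_kwh(REQUESTS, joules_per_request=8.0, pue=1.0)

print(f"data centre:  {centralised:,.0f} kWh/day")
print(f"edge devices: {on_device:,.0f} kWh/day")
```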

On a social justice (S) front, when an organisation is using data, it must consider how it ensures the data isn't having a disproportionate impact on certain groups. Bias in AI is an obvious area, but an often-overlooked one is automation. When an organisation has a strong automation function, it's probably looking at which tasks it can automate easily and with acceptable risk. It's looking primarily at the return on investment of automation: it will have a list of all the different tasks and be prioritising the order in which it tackles them. That's certainly how we do it. But organisations are seldom looking at whether these roles are performed predominantly by men or women, by people of colour, or by people with disabilities. Therefore, they're not looking at the wider impact on their workforce demographics, and it's very hard to see that impact in the short term. My concern is that over a 5-, 10- or 15-year time horizon, an organisation can have a devastatingly negative impact on its workforce diversity and never be aware of it, until it's headline news.

From a corporate governance (G) perspective, the controls around the actual tech itself tend to be governed by engineering teams, who will be looking at factors such as uptime and performance. They're not necessarily looking at whether the board of an organisation has a policy in place on particular points, or at how those policies connect through corporate governance from the board down to those on the front line building or marketing such digital systems. If a company has a data breach, it's likely a failure of corporate governance as much as a failure of its technology systems.

Energy efficiency has previously been referenced as a priority for tech companies. Can you explain how opting for renewable or sustainable energy suppliers might actually lead to digital waste issues?

This is a really big area that I've been talking about for at least five or six years now. In my previous experience running blockchain and crypto projects at Deutsche Bank, my number one criticism concerned the architecture of such systems burning a runaway level of electricity just for the sake of it. I then joined a startup that was all about computational efficiency, and I saw this as the answer to a lot of the blockchain madness. The problem is that when people look purely at the source of energy for a data centre and conclude that if it's renewable then it's OK, they're limiting the scope of their thinking.

 “An example might be where a company is claiming to be carbon neutral - which is great, but might not be the whole story. This company may be continuing to run things that aren't necessary and are just a waste of resources. Let's remember, it's not just about CO2 and kWh; it's also about the Earth's precious mineral resources. Sustainability requires minimising footprints, not excusing wasteful behaviour because it's more ‘green’ than other methods.”

Take email as an example: Wholegrain Digital did a study on the impact of not clearing out your inbox. The problem is, you send an email and copy in ten other people - there are now ten more copies of that email. If you then reply to all, you're adding even more copies of the same message; the original email is replicated potentially hundreds of times. Because of our love for instant access to data, the architecture that sits behind this means the email has to be available a few microseconds after it's requested, so the original email sits in multiple data centres for high availability, as well as being backed up. As a result, we end up with unnecessary usage of the world's resources from something as simple as not being efficient with how we store data. Today we're sending a much higher volume of communication. On the face of it, that looks greener than sending the same number of letters, but by the time we multiply it all out, we can see that email chains come with their own set of issues.
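A back-of-the-envelope sketch of that multiplication, with purely hypothetical figures for recipients, replicas and backups, shows how quickly one message becomes many stored copies:

```python
# Illustrative arithmetic only; the figures below are hypothetical.

recipients  = 10   # people copied in on the original email
replies_all = 5    # reply-to-all messages, each quoting the original
replicas    = 3    # data centres keeping a live copy for instant access
backup_sets = 2    # additional backup copies per mailbox

mailboxes     = recipients + 1                      # recipients plus the sender
messages      = 1 + replies_all                     # original plus each reply carrying it
stored_copies = mailboxes * messages * (replicas + backup_sets)

print(stored_copies)  # 330 stored copies of essentially the same content
```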

To address these issues, we can turn off HTML and stick to plain text with email, and we can archive emails. If you need to keep a backup for regulatory purposes, it doesn't need to be always on; it could be backed up to tape storage, which is something like 100 times more efficient from an energy perspective than keeping it live in a data centre. There are also policies for people working from home: encouraging people to use wired headphones rather than Bluetooth ones would be great, because once the battery dies in many Bluetooth headphones, they become landfill waste. We could also encourage people not to use virtual backgrounds, as they add something like 5% to the energy cost: with a virtual background, the computer has to do a lot more processing simply figuring out which part of the image is you and which part is your messy home-office setup.

“If you want to save the planet, perhaps the best place to start is cleaning up your home office and switching off the virtual background when you log on tomorrow.”

One criticism of this argument is that one fewer email or virtual background won't make a shred of difference. The problem with that mindset is that in the tech industry we assume everything is abundant and that we can just keep creating, consuming and storing data forever. As a result, we become very lazy with technology; our answer is just to have more tech to overcome the limitations of our thinking. We've conditioned ourselves to believe that with every new feature that is built, regardless of its necessity, we are innovating and progressing. But a 108-megapixel camera on a mobile phone isn't a necessity - yet examples like this are presented in the media in a very positive light. The consequence is that as companies digitalise, they tend to adopt this very mindset and understanding of what innovation is. Simple things, such as reducing the resolution of images on your corporate website and getting into a ‘sustainability’ mindset, really do add up and make a difference.

Does this mean that you either upgrade your equipment and cause environmental destruction or you don’t?

In my more extreme view, I'm running EthicsGrade deliberately on very old equipment. I recently took great delight in buying a five-year-old MacBook for one of my team members. I make a point of buying an old laptop that could easily have been destined for landfill, when I could easily (and more conveniently) have bought a new device. All my colleague needs is email and access to the web, and it's perfectly capable of that if you don't upgrade the software to the latest version. Yes, we've got a cybersecurity risk from running an old version of macOS, but that's offset by the fact that all of our data is held in the cloud in a data centre. Cybersecurity itself is an ESG risk, and there are great companies out there like BitSight doing a super job of evaluating the security of companies and therefore the risks investors face in their positions in them. But cybersecurity should never be a trade-off against environmental factors, and very often it's the design of systems that means these trade-offs are being made - and it's the environment that's losing out.

I think organisations should essentially have this conversation with their stakeholders. If you're an organisation where having the latest technology is very important to your most important stakeholders (perhaps your employees), then fine - do it. But I think we're seeing a shift in attitudes, especially in other domains such as upcycling and fixing old furniture or clothes. While this trend hasn't caught on in tech yet, it's one we're heading towards. Who knows, in the future we may all be bragging not about how shiny and new our latest gadget is, but about how we've resurrected some vintage tech from the grave and given it a new lease of life!

Can you highlight some of the social issues related to technology and how companies can address them?

One example is a bank I've recently talked to that has a workforce of around 100,000 people. About five people are dedicated entirely to looking at diversity and equality in the workforce; this is their role and they are specialists in it. They are supported by a wider team of about 35 people who don't have D&I in their job title but for whom it's part of what they do. So, in total, 40 people are spending time on these issues. Their aim is to help women and minority groups have more equitable futures in the workplace in the face of (largely) unintended barriers and discrimination. At the same time, the bank has a team of 200 people doing automation, and nowhere in that team is anyone considering, as a KPI, how the automation and digitalisation of tasks might disproportionately impact these very same groups. That's the real shocker: essentially you have one team incentivised in a way that misaligns with the organisation's very public commitment to diversity. The truth is that if you automate tasks within an organisation, those tasks are highly likely to be performed by underrepresented or minority groups, so you have to think very carefully about digital skills and the career paths of these individuals. This is not something whose impact you see over a short time horizon. Very few people are talking about it so far, but it carries huge reputational risk for organisations and it's a very easy thing to mitigate.

 “Those running automation need to involve the diversity and inclusion teams in a conversation about digital skills.”
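As a sketch of what that conversation could look like in practice, here is a hypothetical automation backlog ranked by ROI with a simple demographic-impact flag added, so the D&I team is pulled in before a task is automated. The tasks, figures and threshold are invented for illustration and are not taken from the bank in the example.

```python
# Hypothetical sketch: ROI-ranked automation backlog with a D&I review flag.
from dataclasses import dataclass

@dataclass
class AutomationCandidate:
    task: str
    annual_roi: float            # estimated saving from automating the task
    headcount: int               # people currently performing the task
    pct_underrepresented: float  # share of that headcount from underrepresented groups

backlog = [
    AutomationCandidate("invoice matching", 400_000, 60, 0.72),
    AutomationCandidate("trade reconciliation", 650_000, 25, 0.30),
    AutomationCandidate("customer onboarding checks", 300_000, 80, 0.65),
]

# Rank by ROI as before, but flag tasks whose workforce is disproportionately
# drawn from underrepresented groups so reskilling plans are agreed first.
for c in sorted(backlog, key=lambda c: c.annual_roi, reverse=True):
    flag = "REVIEW WITH D&I" if c.pct_underrepresented > 0.5 else ""
    print(f"{c.task:<30} ROI £{c.annual_roi:>9,.0f}  {flag}")
```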

One thing that interested us was your term ‘Watermelon Organisation’, could you explain what this is?

I'm not sure when I first started using the term in relation to ESG, but my ex-boss used to talk about watermelons in the context of project management and how he “doesn't want to have any watermelons” in our programme office. He was referring to projects that look green when they are reported to senior leadership but, when you scratch the surface and look deeper, are actually all red (or at high risk of turning so at any point). For me, it's a perfect analogy for ESG, because there are truly very few bad actors out there, but many (if not most) organisations like to paint a thin veneer of ‘green’ on the surface of some very questionable practices or absence of controls.

An example might be big tech organisations going out of their way to talk about their virtues. They may present a green view of themselves, not only from an environmental perspective but also from a tech ethics perspective. But once you look beyond the marketing, you can see systemic risks all over the shop. These are easy to spot retrospectively, once the headlines have broken, but by then it's a very bad day for the portfolio manager who had that stock in their fund. A recent example in the news is the controversy around social media companies. But how would investors know that these risks are there? Only by looking for the existence of best-practice governance around the design, implementation and use of these systems - and that's something that specialists like us can really help unpack. Our view is that the likes of Facebook aren't terrible organisations; they're actually doing some really great work in pioneering quality engineering and the management of issues that lead to all sorts of online harms. But the fact that they don't score better on our system despite a $120m investment in governance really highlights that there is something significantly wrong with their approach.

At the moment there is a singular view where we see companies only as their rating or score according to some external standard. At EthicsGrade we offer our clients the ability to see the world according to their own view of ESG. We'll be rolling out some new features early next year so that when you visit the website and see a company and its values, you'll be able to compare these against your own. We believe that what is a watermelon to one person might well be a kiwi to another - or indeed a rotten tomato to someone else. We all have different values, and moreover different degrees to which we care about those values. We like to think of what we're doing as offering a personalised watermelon-detection service.
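A minimal sketch of the idea behind that personalised view, assuming hypothetical issue names, scores and user weightings (not EthicsGrade's actual model), is below: the same company scores are re-weighted by how much each user cares about each issue.

```python
# Hypothetical sketch of a personalised ESG view; names and numbers are invented.

COMPANY_SCORES = {"privacy": 40, "algorithmic_bias": 75, "digital_waste": 55}

USER_WEIGHTS = {
    "privacy_first_investor": {"privacy": 0.7, "algorithmic_bias": 0.2, "digital_waste": 0.1},
    "climate_first_investor": {"privacy": 0.1, "algorithmic_bias": 0.2, "digital_waste": 0.7},
}

def personalised_score(scores: dict, weights: dict) -> float:
    """Re-weight a company's issue scores by how much this user cares about each issue."""
    return sum(scores[k] * weights[k] for k in scores)

for user, weights in USER_WEIGHTS.items():
    print(user, round(personalised_score(COMPANY_SCORES, weights), 1))
# privacy_first_investor 48.5
# climate_first_investor 57.5
```

One person's watermelon really can be another's kiwi: the same underlying data produces a different verdict for each set of values.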

How can someone get in touch?

https://ethicsgrade.io/

If you're a journalist or an academic, you can get our data for free. We charge a licence fee to investment professionals. We don’t offer services to the companies we cover, but we’re always delighted to talk to corporates who see their company ratings on our website and would like to discuss the reasons why and what they need to do to improve.


Learn more about Nossa Data!

Beyond our educational content, see how we help companies globally better collate their ESG data and improve internal processes.

Request a Demo