Human at heart: how charities can make the most of AI

Artificial intelligence (AI) is revolutionising efficiency in the charity sector, but the role of people will be crucial to its success, writes Melissa Moody

In recent years, artificial intelligence (AI) has emerged as a transformative force across various industries, and the charitable sector is no exception. AI offers a range of capabilities that can revolutionise how charities operate, from data analysis and donor management to program optimisation and disaster response. However, like any technology, AI also comes with limitations and potential drawbacks. In this feature, we delve into the capabilities of AI in a charitable context, exploring its positive impacts, limitations, and potential negatives.

If you hadn’t realised already, that introduction was written by AI, ChatGPT to be specific. It’s not a bad introduction; I’m sure if you look back over features in past issues of Charity Times you’ll find very similar ones.

So if AI can write features for writers like myself, what else can it do? “You can use AI to automate pretty much anything you want,” explains Tricia Blatherwick, chief commercial officer at Autogen AI. But the term ‘artificial intelligence’ is relatively unpopular among those working in the industry. Instead, there’s a preference for ‘augmented intelligence’, “because we think human beings need to stay at the heart of AI,” Blatherwick adds. “AI has no judgement. Human beings do.”

AI, in some form or another, has been around for a while. Joe Reddington, founder of charity eQuality Time, has been using it for nearly 10 years, and chatbots or similar tools have been a staple of many websites for years now.

It’s generative AI and large language models that have stormed onto the scene in 2023. These are models that have essentially read a vast number of words and have been trained to generate language that reads and sounds as though a human being has written it. Taken at face value, the applications could be endless.

“From a charity perspective, generative AI can be used to do any kind of writing that the charity would ever need to do,” Blatherwick says. “So it could be a charity chief executive writing a thought piece about what their charity does. It might be an email going out to all of your volunteers or supporters or contributors. It might be a grant application, or it could be a response to a government consultation.”

Autogen AI is currently working with a number of charities whose employees are using generative AI for grant and funding applications, as well as for internal and external communications and for responding to government consultations.

But if it sounds this good, why aren’t more charities using it? The sector has traditionally been both reluctant and slow to adopt new technologies. Even charities’ use of data still isn’t up to par with that of many other industries. In 2021, the Law Family Commission on Civil Society found that there was a “serious data gap” within the sector. At the time, Sarah Vibert, CEO at NCVO, said that “[charities] need to significantly improve data about, for, and from the social sector.”

Since then, charities have been improving their use of data, but with AI added into the mix, the technology is moving faster than the sector can keep up with.

Fact vs fiction

But with the emergence of any new technology comes the hype that surrounds it (and you’d have to be living under a rock to be immune from the hype surrounding AI). “But because of all the hype there are lots of myths and misunderstandings,” Blatherwick says.

She explains that large language models have been trained on everything that has ever been written and digitised, a feat that would take a human being (with no sleep!) 23,000 years. They use this training to generate language, complete sentences, that read as though a human has written them. But they are not a search engine. They make this language up.

Fabrication, also known as hallucination, is one of the major limitations of large language models. Those who have tried to use ChatGPT to write news stories or reports with references or quotes have found this out the hard way when someone points out that every source was manufactured by the AI.

If a charity is using a large language model, it should be tethered to factual data; this stops it hallucinating. Organisations like Autogen AI marry the models with data from the charity itself, which means that when someone asks the model for a grant application, “you don’t get rubbish back. You get factual information, which is enhanced with your own case studies, references and example material.”

Another barrier is adoption. “There’s a very real fear that AI is either going to take over the world or, if it doesn’t, it will take your job,” Blatherwick states. But is it capable of either? Probably not.

For those with jobs that would be affected, it’ll likely make their lives easier, she believes. “The introduction of Excel spreadsheets did not eliminate finance managers. Therefore the introduction of a tool that can generate language is not going to eliminate writers. It means that writers’ skills become higher level; honing, editing, perfecting, enhancing with a very full and credible first draft [the AI] has created.”

Regulation is going to be needed, Blatherwick and Reddington agree, but it’s not going to be easy to do. The ethics and security around AI also need to be handled carefully.
“Governments are starting to grapple with that and starting to engage… and it’s not a brand new problem. Governments have had to grapple with emerging technology forever,” says Blatherwick.

Reddington, however, is more cynical on the subject. Some of the push is profit-motivated, he believes, particularly from those organisations saying that the use of AI should be left to the ‘experts’. “I am very cynical,” he admits. “I don’t know how many optimistic people work in the job, but I’m not expecting workable legislation, and if there was legislation I’m not expecting it to be fast enough to work.”

Add in some ethics

Even with the bells of caution ringing, it’s hard not to want to get involved. So what should a charity do if it wants to be a part of the hype? Well, first of all, don’t be taken in by everything, says Blatherwick. “It’s got the biggest hype curve on the internet at the moment and therefore there’s a lot of scaremongering.”

There are also a lot of businesses popping up and trying to solve problems with AI, and not all of these businesses will last. “Do your due diligence around the businesses you engage with and make absolute certain that what they’re developing and showing to you is secure at enterprise level and is not one man and his dog in a back room which then hooks to something like ChatGPT, which is not secure for business use.”

One of the issues with a staff member using ChatGPT rather than a secure AI, for example, is that anything put into the model can be used to train it, and others may be able to access that information. If someone from a charity enters personal data covered by GDPR, for instance, that could constitute a data breach and should be taken seriously.

Reddington believes that although it can be used for writing content, using it for more specific problem solving is probably a way off. “No charity, except a couple of the really big ones, will have the technical expertise to be able to run a piece of artificial intelligence that can make decisions for end users… the problem is that the training data can be misogynistic, it can be racist, homophobic.”

There have already been examples of this. In 2016, a Microsoft AI chatbot released via Twitter was shut down only 16 hours after its launch because it began to post racist and sexually charged messages. Facial recognition AI, meanwhile, is notoriously bad at detecting non-white faces.

“I think [charities] should not use it unless they have in-house technical expertise and an ethical framework,” Reddington adds. This is important, particularly for things like AI-generated images or content, where a charity is attempting to make more conscious decisions to reflect diversity. Ultimately, AI reflects the input it’s been given, and the fact is that a lot of content on the internet is not representative of diverse backgrounds.

Looking positive

But when AI is done well, it can be one of the most useful tools a charity has. “The people who are using them are saying, never take this tool away from me because I will not write using a blank piece of paper ever,” Blatherwick jokes.

Reddington agrees. “I can say without exaggeration that it has doubled my output.” Although Reddington already works efficiently, and comes from a computer science background, he finds overcoming emotional barriers easier with AI. He can use the model to write a draft if he doesn’t have the mental energy. “It has genuinely doubled my output because I’m no longer stopped by these emotional roadblocks.”

Particularly for small charities and SMEs, it can be an incredibly effective tool, helping with everything from writing content to standardisation. As CEO of a small charity, Reddington uses it for everything from writing social media posts to drafting letters of complaint.

Blatherwick agrees. “It’ll give you a beautifully written, consistent piece of prose even if the inputs have been from people who aren’t naturally good writers. This can include people where English isn’t their first language. It might include people with dyslexia, dyspraxia, people with hidden disabilities… those who struggle with grammar or have low literacy levels. This is an absolute goldmine for these people where everything is in their head. They can get it down on paper and the AI helps them structure, rephrase and correct any grammatical errors.”

For charities that work internationally, it can even translate work. Because AI has been trained on everything that’s ever been written on the internet, it’s like Google Translate on steroids.

Rather than translating word for word, large language models perform a semantic search. For example, if you’re translating into French, the model finds the word that usually follows the previous one, and the result is about 99% accurate. Autogen AI has tested it on both European and non-European languages; as long as they’ve been digitised, it’s “astonishingly accurate.”

With all of the discussion surrounding the new technology and a lot of questions still yet to be answered, should it be something for people to be excited about? “Of course they should,” Blatherwick exclaims. “It’s like the invention of the internet, it’s like the invention of electricity, it’s like the invention of the wheel. It’s an absolutely pivotal moment in human history… we must embrace it. If you don’t embrace it, you’ll fall behind.”

Remember that introduction I asked ChatGPT to write? Well, the rest of the feature wasn’t much to write home about, and instead I’ve spent hours putting this piece together. So although AI can be useful, I think my job, and yours, are safe.

Meanwhile, charities should experiment with AI, but with the right processes in place. For the technology to have a significant impact, humans are pivotal and people will be required to monitor due diligence and to work side-by-side with the technology (at a faster and more efficient pace than before), rather than seeing it as a threat. Whilst AI robots may not provide the best lunch companion or make the best cup of tea, there’s no doubt that their contribution to the workplace, when implemented carefully, will make working life significantly easier.
