AI not ‘sufficiently sophisticated’ to make ethical fundraising decisions

Artificial intelligence is not currently able to make ethical decisions in fundraising, a report warns.

The analysis by think tank Rogare warns that artificial intelligence “currently doesn’t have access to sufficiently sophisticated thinking about ethics to be able to tackle ethical dilemmas in fundraising”.

As a result, AI’s use in fundraising needs to be closely monitored, and fundraisers need training in ethical and data literacy issues “to ensure the most rigorous human oversight”, says Rogare’s report.

It also warns charities that “generic concerns and guidance” about the use of AI “can’t simply be transferred to and overlain on to fundraising” due to ethical challenges involved in asking for and receiving donations.

As an example, researchers asked ChatGPT3 a range of questions around ethics in fundraising.

These included what the technology knows of ethics in the profession, how fundraisers balance their duties to donors and beneficiaries, whether to accept a donation from a fossil fuel company, and donor consent around their data.

But in all its answers ChatGPT3 “never offered any information about donor power/privilege and donor dominance, unless prompted to do so by asking a specific question”.

Rogare warns that “for ChatGPT to tell you about these things, you need to already know about them”.

“The answers provided by ChatGPT3 suggest that AI can give fundraisers a better understanding of what the ethical issues are,” states its report.

“But it seems unlikely AI will be able to use that information to resolve ethical dilemmas, because it does not know enough about fundraising ethics in sufficient depth and nuance – particularly about normative ethical theories/lenses – to be able to make such informed decisions.”

It adds: “At this point in the use of AI in fundraising, we would strongly caution fundraisers against relying on AI for decision making and instead use it as part of a larger process of assessment, such as a system to walk the fundraiser through a set of specific questions.”

This includes whether “to accept or refuse potentially problematic donations”.

Project lead Cherian Koshy said these concerns show the importance of effective training within charities to monitor AI’s use in fundraising.

“Not only does this oversight require a high degree of ethical literacy on the part of human fundraisers, it also requires a high degree of data literacy,” he said.

“However, it is questionable whether both the ethics and data skills, knowledge and competencies exist to the required degree across the entirety of the fundraising workforce that will be tasked with oversight of the use of AI in fundraising.

“As AI enters and becomes widespread in fundraising practice, we must upskill the human overseers with this knowledge and these competencies. Skilled and knowledgeable human oversight of AI in fundraising is absolutely essential.”

Last year research by the charity Money4You found that AI funding searches are “littered with errors”.

Two leading AI tools, Google’s Bard and Bing’s chatbot, were asked a range of detailed questions around funding opportunities.

But when Bard was asked “where can I get funding for non-profit work on race equality in northeast England?”, out of six funding programmes recommended “only one actually exists”.


