
The National Lottery Community Fund warns that AI-supported applications may not 'tell the unique story of your community'
Photo: Ron Lach/Pexels
National Lottery Community Fund and ACE ‘not opposed to AI-drafted funding applications’
Funders say artificial intelligence may be used to help draft grant applications but urge caution in its use.
The National Lottery Community Fund (NLCF), whose recipients include cultural projects, says it will not reject applications just because AI was used to write them.
But it has warned applicants to “use AI with caution”, saying its output is often “not as strong as it might appear”. Arts Council England has adopted a similar stance.
In a statement on its website, NLCF acknowledges that AI tools such as ChatGPT and Gemini are becoming “increasingly embedded in our lives”, helping save time and improve accessibility.
The NLCF says that applicants who do not speak English as their first language, or are new to the process, may be among the many who find AI helps them “write their applications faster and with less effort”.
“AI can provide a useful starting point but often what it produces for you is not as strong as it might appear,” says the statement.
“AI supported applications do not tell the unique story of your community and how you want to support them. Being too generic in content may disadvantage your application.”
Asked to comment on its position on grant applications written using AI, an ACE spokesperson said: “People from all sectors are increasingly turning to digital tools to support them in the delivery of their work, and the cultural sector is no exception.
“We recognise that individuals and organisations may employ generative AI to support the development of funding applications; it is important, however, to emphasise that applicants remain personally responsible for everything they submit to us.”
‘Free assistant’
The use of AI in funding applications has become an increasingly prominent issue in recent years as the technology develops.
The non-profit sector consultant Christina Poulton, for example, has offered training on “using AI for writing Arts Council Project Grants”.
In a LinkedIn post in 2023, Poulton said that “lots of arts management work”, including writing funding bids, “takes time you don’t have, can be repetitive admin and isn’t the best use of your creative brain”.
“Think of ChatGPT as your new, free assistant”, said Poulton, adding that it can be “a pretty useful access tool if you are neurodivergent and find starting writing a challenge, or if you’d prefer to speak rather than write”.
But some have expressed reservations about the impact of AI on applications. The Paul Hamlyn Foundation (PHF) says that alongside “potential benefits”, there are also “a number of ethical concerns” around the technology – including its in-built biases and use of material without consent to train its models.
The funder says that using AI can lead to similarities in language and content across applications, which is likely to disadvantage bids by making them less distinctive.
It expects applicants to use generative AI tools responsibly and openly, making sure they convey what is “unique, valuable and distinctive” about their work and carefully assessing ethical concerns.
‘Bland points and flowery language’
In November, the philanthropy consultant Emma Beeston wrote in a blog post that AI-written grant applications can make “tough reading” for those assessing them, since the tools tend to “come up with fairly bland points written in flowery language”.
She added that there can be a “motivational” impact on those assessing such applications, saying: “What enters your head is ‘what is the point of my trying so hard to assess something that nobody wrote?’”
Beeston advised applicants: “By all means use AI for your first drafts, but please then translate it back into human and make sure it is a credible match to your work”.
Paul Hamlyn Foundation’s approach broadly aligns with a statement from a group of major research funders in 2023 on the use of AI in applications.
The Research Funders Policy Group, which includes the Wellcome Trust as well as UK Research and Innovation, says applicants must use generative AI responsibly and acknowledge any outputs.
Last year Stian Westlake, executive chair of the Economic and Social Research Council, said: “Generative AI has great potential for research and innovation, but must be used in appropriate ways that preserve the integrity and fairness of funding systems.
“We must uphold high standards and ensure public funds are used effectively, considering both opportunities and risks.”