
AI and Grants: Divine Tool or Slippery Slope?


When I was about ten years old (this would have been the late 70s), my aunt told me this joke:


One day, all the world’s top researchers met to design an experiment to test the existence of God. They built a HUGE computer and spent years entering EVERY BIT of human knowledge into it. 


Finally, the day of the experiment arrived. The scientists entered the question, “Is there such a thing as God?”


The computer buzzed for a few seconds and spat out the answer: 


“There is now.”


As a person of faith, I believe God is far greater than the sum of all human knowledge -- but I also understand the existential fear of technology’s achieving divine-level omniscience. 


As a grant professional, I have an enduring faith in the myriad talents needed to craft comprehensive and convincing proposals -- talents that can’t be automated by artificial intelligence (AI) -- but I also understand the existential fear of our profession being devalued or compromised through technology.


Nonetheless, I am fascinated by ChatGPT and other emerging tools. I know they will alter the way we work, whether we want them to or not.


I attended the recent GPA iLearnLive conference dedicated to AI and its impact on our profession. Sessions included both practical how-to demonstrations of emerging tools and thoughtful debate on the theoretical and ethical implications of the oncoming digital apocalypse…oops…sorry…I mean, the brave new world of AI and how it will make our lives better.


In this article, I will highlight three ways I believe AI will benefit grant-seeking and three concerns I have about its impact on our profession. Let’s start with the negatives.


1. A Robot Took My Job

At least once a week, someone asks me if I’m worried about AI making my job obsolete. I always say no, but some threats are concerning.


In the budget-strapped nonprofit world, staff are habitually expected to do the job of more than one person. According to GrantStation’s 2023 State of Grantseeking report, 18% of respondents represented all-volunteer organizations or had only one employee doing the job of many. We all know (or are) program managers who pull double-duty as grant writers or grant professionals who juggle multiple responsibilities: maintaining the website, managing social media, coordinating board meetings, or other roles.


In the same report, grant seekers listed “lack of time” as the top challenge to grant seeking. Nonprofit leaders who are stretching each dollar as far as possible may give in to the temptation of “outsourcing” application preparation to AI in the name of good stewardship, rather than investing scarce resources in skilled professionals (ahem…GPA members) who can provide a solid return on investment and adherence to ethical best practices.


2. Inaccuracies and Plagiarism


I was recently playing with Canva’s AI image creator. I entered, “Man riding a horse” and it gave me dozens of banal images. To challenge the system, I entered “Horse riding a man.” Here are two of the nightmare-inducing results:



If you believe that AI platforms don’t make errors or hallucinate content, please explain these photos to me. 


I want to believe that determining the difference between a reliable source and fake news is always as simple as it is in these photos. Sadly, this belief is regularly shattered, leaving me worried that grant seekers will assume AI content is always reliable and accurate. Frankly, I don’t believe this is any more dangerous than someone quoting the results of a sloppy Google search, but it is a concern, as AI may be more likely to “sound” legitimate.


While sloppy fact-checking may damage the grant seeker’s reputation and/or further a false narrative, the impact of plagiarism -- even unintentional plagiarism -- has the potential to be far more damaging. While the geniuses behind AI often claim it is NOT plagiarism, we also know their algorithms pull information from existing content -- it comes from somewhere. The risk that a creator’s original work will be unintentionally stolen may be one of the most dangerous pitfalls of relying on AI.


For example, in the horrific photos above, how do I know the faces shown don’t belong to real people who have no idea their likenesses are being used? I can only ensure I am following fair use standards and trust that Canva is being honest about how it creates images.


3. Increased Competition for Funding


At Friday’s iLearnLive debate about AI, Maura Harrington expressed a concern that increasing the number of excellent proposals submitted could overwhelm grantors. As a result, foundations may simply close their doors. 


This argument assumes the only reason nonprofits avoid grant applications is that they can’t come up with answers. When grantors require 2,500-character answers to questions like, “What is your organization’s ‘secret sauce’?” or “How will you still fulfill the objectives of this project if you don’t get funding?”, the argument is reasonable. 


When I stop thinking selfishly and look at the philanthropic landscape, the worst-case scenario isn’t really that bad. Perhaps nonprofit leaders can better navigate these questions with the help of AI tools. If funders are overwhelmed with too many well-written applications, maybe they will start making the application process simpler and more trust-based. 


Or, instead of placing the burden on the organization to make a case for funding, perhaps grantors will accept responsibility for discovering and vetting organizations they want to support. While this may not be in my best interest as a grant professional, it could benefit philanthropic missions. After all, making the world a better place is what we all want, right?


Speaking of making the world better, here are three areas in which AI can benefit our profession, the organizations we serve, and people in general. 


1. Saving Time

If you didn’t get to participate in the iLearnLive event, I highly recommend you enroll. Some of the tools explored left me in awe of how much time they can save.


Creating your own ChatGPT database: With a paid subscription to ChatGPT, you can create your own universe from which to search. Upload all your previous grant narratives, link to the organization’s website, and voilà! You can start entering prompts and have ChatGPT find the answer! Patrick Kirby offered an excellent session demonstrating this tool.


Seeking funders: At the iLearnLive event, Margit Brazda Poirier demonstrated how she uses AI to find information about different funders. Her handouts offered various prompts such as “Is anyone from the ABC Foundation Board of Trustees affiliated with Anytown, USA?” Some of her searches didn’t yield much, but a few of them were gold mines! ChatGPT or Google Gemini could easily become another valuable tool for seeking alignment with funders.


Streamlining the writing process: A few months ago, I had a grant that required letters of support from five partners. I wrote one letter, then asked ChatGPT to rewrite it four times. The results needed tweaks, but I saved at least an hour not having to rewrite essentially the same letter four times. 


AI tools can also reduce a 2,000-character paragraph to 1,500 characters, unstick writer’s block, or offer creative suggestions for how to answer a daunting question (like the “secret sauce” question). 
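For readers who like to tinker, the letter-rewriting trick described above can even be scripted instead of done by hand in the chat window. Here is a minimal sketch using OpenAI’s Python library; the model name, prompt wording, and function names are my own illustrative assumptions, not an official recipe.

```python
# Rough sketch of the "rewrite the same letter for several partners"
# workflow. Assumes the openai package is installed and OPENAI_API_KEY
# is set in your environment; model and prompt text are illustrative.

def build_rewrite_prompt(letter: str, partner: str) -> str:
    """Compose the instruction sent to the model for one partner."""
    return (
        "Rewrite the following letter of support so it reads as if it were "
        f"written by {partner}. Keep the same facts, tone, and length.\n\n"
        + letter
    )

def rewrite_for_partners(letter: str, partners: list[str]) -> dict[str, str]:
    """Ask the model for one rewritten letter per partner."""
    # Imported here so the prompt helper above works even without the package.
    from openai import OpenAI

    client = OpenAI()
    drafts = {}
    for partner in partners:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # assumption -- use whichever model you have
            messages=[{"role": "user",
                       "content": build_rewrite_prompt(letter, partner)}],
        )
        drafts[partner] = response.choices[0].message.content
    return drafts
```

Each draft would still need the human tweaks mentioned above; the script only removes the copy-paste drudgery of asking four separate times.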


2. Improving Quality 

As writers, we know it is usually easier to edit something that already exists than to create it from scratch. Grant professionals shouldn’t rely on AI to create their final proposal, but maybe AI can generate a decent first draft. The generated draft may not have the right tone or even include accurate data, but it can provide an excellent starting point. 


By using these tools, grant professionals can invest more time in organizational and program development, relationship building, and other soft skills that can still only be done by humans (for now). As a result, organizations and programs can respond to community needs more quickly, follow best practices and evidence-based principles, and have improved communication and partnerships.


3. Leveling the Playing Field


Some AI tools -- such as the ubiquitous grammar and spell-checkers that have been around for years -- have the potential to level the playing field for people whose first language may be different from their audience’s, who are overcoming challenges such as dyslexia, or who otherwise struggle to communicate their brilliant ideas. 


Tools like Otter AI (which provides a transcript and summary of a verbal meeting) can help people with auditory or other communication limitations interact with team members and stay abreast of developments. People who struggle with language or literacy issues can use AI tools to either read text aloud or translate it into an accessible language or reading level.


At the organizational level, AI can help small or new nonprofits that are doing life-saving work but cannot afford a grant professional to craft lovely language. Of course, I would prefer these organizations hire me to do this work for them - but sometimes they can’t. If I erect barriers to using AI simply to protect my profession, I can’t claim to be philanthropic or say I want to make the world a better place.


Making the world a better place is the purpose of any tool -- whether it is a hammer or artificial intelligence. Just as a hammer can be destructive in the hands of an unskilled or unscrupulous individual, the tools provided by AI can (and will) be abused by some. Regardless, it’s a tool that is here to stay. 


As a professional, I find peace in knowing the GPA and its members are committed to the ongoing examination of the practical benefits and ethical concerns related to technology and emerging tools. With new developments being released constantly, we are in the pioneering, Wild West days of AI, and no one can predict the outcome of society’s having these tools on hand. 


As committed grant professionals, it is imperative that we continue to seek information, engage in discussions, and monitor the intended and unintended consequences of using AI tools. 


Based on what I learned at the iLearnLive event, here are a few emerging tools I am going to try:

  • Perplexity - Perfect for grant seekers. It will give you an answer BUT it will provide the sources of that information, so you can double-check.

  • Otter AI - Turn it on during meetings and it will transcribe AND summarize the meeting for you. In addition, it will give you a summary of how often you talked versus others. Prepare to feel judged.

  • Google Gemini - Until recently, this was called Bard. It is a Google-based AI that will be incorporated across the platform.

  • Microsoft Copilot - Automatically included in your Office apps. This feature was just released last week.

  • Canva - Magic Studio will create slide decks for you, generate the images you need, and more (just be ready for weird horse photos…)

  • And, of course, ChatGPT - The OG of AI. New bells and whistles to ChatGPT include the ability to create your own private GPT database. 


The above article was NOT written or assisted by AI tools. But I did enter this text into ChatGPT and asked it to write a poem, based on my article. Here is just one of the dozens of stanzas it came up with, which I believe sums everything up quite nicely.


But beyond the jest, lies a deeper fear,

Of technology's rise, drawing ever near.

As a grant professional, I too must face,

The challenges of AI, in this evolving race.

