Text analysis of thousands of grant abstracts shows that writing style matters
Research described with more words and more markers of verbal confidence received more award money from the National Science Foundation.
Is there a financial dimension to what people communicate and how they communicate it?
Placing a value on words can feel crude or highfalutin -- unless you’re in academia, where words are often tied to money. More publications can lead to a promotion, and receiving grant aid can fund new research.
In a paper published on Jan. 30, I evaluated the financial value of words based on a sample of funded National Science Foundation grant abstracts. The data indicate that what researchers say, and how they say it, can foretell the amount of funding they are awarded. They also show that the writing funders idealize may not always match what they actually prefer.
The worth of words
Prior research shows a relationship between language patterns and the funding of personal online loans. Loan applications with more complex writing -- such as those with more words in the description -- were more likely to receive full funding. Applicants were also more likely to receive money if their text contained markers of verbal confidence, such as words that convey certainty (“definitely,” “always,” “clearly”).
To assess complexity and confidence indicators in the NSF sample, I ran over 7.4 million words through an automated text analysis program. The grants covered all NSF directorates, U.S. locations and nearly nine years of funding from 2010 to 2018.
Consistent with the online loans data, grant abstracts with more words and more markers of verbal confidence received more award money.
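To give a sense of what such an automated analysis involves, here is a minimal sketch that counts total words and a handful of certainty words in an abstract. The word list is a hypothetical placeholder for illustration only, not the dictionary used in the published study.

```python
# Illustrative sketch only: counts total words and a few certainty markers
# in a grant abstract. The certainty word list is a hypothetical placeholder,
# not the dictionary the published study relied on.
import re

CERTAINTY_WORDS = {"definitely", "always", "clearly", "certainly", "undoubtedly"}

def confidence_profile(abstract: str) -> dict:
    words = re.findall(r"[a-z']+", abstract.lower())
    certainty_hits = sum(1 for w in words if w in CERTAINTY_WORDS)
    return {
        "word_count": len(words),
        "certainty_count": certainty_hits,
        # Rate per 100 words, a common way to compare texts of different lengths.
        "certainty_rate": 100 * certainty_hits / len(words) if words else 0.0,
    }

print(confidence_profile(
    "This project will clearly advance the field and definitely broaden participation."
))
```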
In fact, each additional word in a grant abstract was associated with a US$372 increase in award funding. The optimal word count across NSF directorates was 681 words; beyond this threshold, additional words were associated with a decrease in award funding.
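One way to picture that pattern is an inverted-U (quadratic) relationship between abstract length and award size, peaking near 681 words. The sketch below uses made-up coefficients chosen only so the peak lands at 681 words; they are not the values estimated in the study.

```python
# Illustrative sketch only: a quadratic (inverted-U) model of award size versus
# abstract length, with hypothetical coefficients chosen so the peak sits at
# 681 words. These are NOT the coefficients estimated in the study.
BASE = 150_000.0   # hypothetical intercept, in dollars
C1 = 681.0         # hypothetical linear term
C2 = -0.5          # hypothetical quadratic term (negative -> inverted U)

def predicted_award(word_count: int) -> float:
    return BASE + C1 * word_count + C2 * word_count ** 2

def marginal_value(word_count: int) -> float:
    # Derivative of the quadratic: the value of one more word at this length.
    return C1 + 2 * C2 * word_count

peak = -C1 / (2 * C2)  # vertex of the parabola: 681 words
print(peak, marginal_value(400), marginal_value(800))
# At 400 words an extra word still adds money; past 681 its marginal value turns negative.
```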
Two other results from the NSF data were telling. First, using fewer common words -- that is, a rarer vocabulary -- was associated with receiving more award funding, which is inconsistent with the NSF’s call for, and commitment to, plain writing.
Second, the amount of award funding was related to the writing style of the grant. Prior evidence suggests that we can infer social and psychological traits about people, such as intelligence, from small “junk” words called function words. High rates of articles and prepositions, for example, indicate complex thinking, while high rates of storytelling words such as pronouns indicate simpler thinking.
NSF grant abstracts with a simpler style -- that is, abstracts written as a story with many pronouns -- tended to receive more money. A personal touch may simplify the science and make it more relatable.
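A minimal sketch of that kind of style measure appears below: it computes the rate of articles and prepositions (an analytic signal) versus pronouns (a narrative, story-like signal) in a text. The word lists here are abbreviated illustrations, not the full function-word dictionaries used in psycholinguistic research.

```python
# Illustrative sketch only: rates of two kinds of function words.
# The word lists are short, hand-picked illustrations, not the full
# function-word dictionaries used in psycholinguistic research.
import re

ARTICLES_PREPOSITIONS = {"a", "an", "the", "of", "in", "on", "for", "with", "by", "to"}
PRONOUNS = {"i", "we", "you", "he", "she", "it", "they", "them", "us", "our", "their"}

def style_rates(text: str) -> dict:
    words = re.findall(r"[a-z']+", text.lower())
    total = len(words) or 1
    return {
        # Higher article/preposition rates are read as a more analytic style,
        # higher pronoun rates as a more narrative, story-like style.
        "analytic_rate": 100 * sum(w in ARTICLES_PREPOSITIONS for w in words) / total,
        "narrative_rate": 100 * sum(w in PRONOUNS for w in words) / total,
    }

print(style_rates("We will test our model, and they will evaluate it with us in the field."))
```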
Changing words to receive more change?
The data include only funded grants, and the relationships may not indicate a direct cause and effect. Therefore, such patterns are not a recipe for a marginal proposal to receive funding nor a “how-to” guide to outfund the competition.
Instead, the results demonstrate that real-world language data have rich psychological value. Just counting words can provide new insights into institutional processes such as grant funding allocation.
Most grant writers believe, and are even told by funders, that a competitive proposal starts with a great idea. This study suggests that another part of grantsmanship may be the proposal’s word patterns and writing style. Since most funded grants will contribute knowledge to science, one way to potentially secure more award money for a fundable proposal is to consider how the science is communicated during the writing phase.
Poet George Herbert suggested, “Good words are worth much, and cost little.” The NSF data offer a different perspective: More complex and confident stories tend to cost the NSF a lot. For researchers looking to support their work with more money, word patterns may be an inexpensive place to start.
This article was first posted on The Conversation.