July 2006


Study Shows Overhead Ratios Have Minimal Influence on Donors

After years of media focus on "efficiency ratios" for charities, the donating public isn't buying it.

A paper by Woods Bowman of DePaul University (EIN 36-2167048 Form 990) investigates the question of whether donors are influenced by the operating ratios of charities. The paper appears in the latest Nonprofit & Voluntary Sector Quarterly and unfortunately requires a subscription to go beyond the abstract. Prof. Bowman analyzed giving to the Combined Federal Campaign in the Chicago area over several years to determine whether changes in so-called "overhead" ratios had an impact on designations to charities. The ratios are available in the CFC published materials, so donors have them readily available.
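For readers unfamiliar with the metric, here is a minimal sketch of how an "overhead" ratio of this kind is typically computed from Form 990-style expense figures. The function and field names are illustrative, not the actual IRS or CFC labels:

```python
def overhead_ratio(management_expenses: float,
                   fundraising_expenses: float,
                   total_expenses: float) -> float:
    """Share of total spending that goes to overhead
    (management plus fundraising) rather than programs."""
    if total_expenses <= 0:
        raise ValueError("total expenses must be positive")
    return (management_expenses + fundraising_expenses) / total_expenses

# Hypothetical charity: $150k management + $100k fundraising
# out of $1M total expenses
ratio = overhead_ratio(150_000, 100_000, 1_000_000)
print(f"{ratio:.0%}")  # prints "25%"
```

The debate Bowman's paper addresses is whether a number like this, reported at a single point in time, tells donors anything useful at all.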

I've just clipped a couple of tables from the article that show the key findings.  There was an impact, all right, showing that an increase in "overhead" resulted in a decrease in both the number of designations and the dollar amount.  But the key statistic is the R-squared, which shows how much of the variance in number and dollar amounts is explained by the change in "overhead."  It shows that only 3-5% of the variance in number of designations and 2-4% of the variance in dollar amounts is explained by the ratio changes. 
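For readers who want to see concretely what "variance explained" means, here is a self-contained sketch of the ordinary R-squared statistic for a simple linear fit. The numbers in the example are made up for illustration; they are not the paper's data:

```python
def r_squared(xs: list[float], ys: list[float]) -> float:
    """Fraction of the variance in ys explained by an
    ordinary least-squares linear fit on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    # Residual sum of squares after the fit
    ss_res = sum((y - (slope * x + intercept)) ** 2
                 for x, y in zip(xs, ys))
    return 1 - ss_res / syy

# A perfectly linear relationship explains 100% of the variance
print(r_squared([1, 2, 3, 4], [2, 4, 6, 8]))  # prints 1.0
```

An R-squared of 0.03, as in the paper's tables, means that 97% of the variation in designations is driven by something other than the change in the overhead ratio.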

Prof. Bowman concludes that the focus on overhead ratios in public policy is "exaggerated." 

But what Prof. Bowman does not note is that this result comes after more than a decade of concerted publicity encouraging the public to use financial ratios as a measure of charity performance. Mass-circulation Forbes magazine runs an annual review of charities based on efficiency ratios. Trent Stamp of Charity Navigator, the leading advocate of charity ratio analysis, is a regular talking head on news programs from Bill O'Reilly to the Daily Show. That's just a small sample of the drumbeat of editorials and articles impressing on the public the importance of ratio measures for rating charities. But the public just isn't buying what the self-appointed charity watchdogs are selling.

Comments

Just one point of clarification: my paper argues that overhead ratios measured at one moment in time are not meaningful. Changes in overhead ratios are theoretically meaningful. I tested response to changes, and these tests are the source of data in the tables.

Thank you for the excellent service to the overall good of non-profit organizations and to the public. I am especially pleased to see the article/reference today made to “Efficiency Ratios.” I’m just an old non-profit fund-raising practitioner, but with experience spanning thirty-four years, I have learned a thing or two regarding the dangers to non-profit organizations coming from the faulty financial manipulations made by self-styled “charity regulators.” My article, posted on my website, and reprinted elsewhere a number of times, was written in October of 2002, and I have not changed my opinion a bit about it since then. I would be interested in knowing what you think.

--- The Fallacy Of Financial Ratios:
Why Outcome Evaluation Is The Better Gauge Of Grant Worthiness
http://www.raise-funds.com/100402forum.html

Many of us in the non-profit world are not happy, content, or enamored with what Charity Navigator is doing. CN “ranks” non-profits to a four-star system, mainly from the numbers reported on IRS Form 990 and other scant, widely diverse spreadsheet information.

But, who “ranks” Charity Navigator? How many "stars" do they get---or give to themselves, as a non-profit organization taking it upon themselves to rate others with all of the much less-than-meaningful data they use?

What would CN’s donors think if they knew that the money they give to that non-profit organization is wasted to a great degree, and causes harm in the non-profit world? Talk about efficiency in the manner they judge other organizations! Imagine the CN staff kept busy, spending money and wasting time as they do for a poor-quality result, causing harm to innocent non-profits---as they use direct contributions from their donors to carry out their own flawed and misguided mission.

When non-profits are victimized in this way by CN, you can bet that some of those organizations’ donors are concerned about how their money is being spent. CN’s donors should demand the same accountability.

About two years ago, when I sent an email to CN protesting the damage they were doing to innocent and unwary non-profits with the use of very little relevant data, the CN Executive Director, Trent Stamp, replied personally to me with the following rather appalling statement.

“While our ratings are indeed only a piece of the puzzle, I think they’re a necessary piece that heretofore wasn’t presented in a meaningful fashion.”

So, here we have rankings of non-profits taken very seriously, for example, by donors and prospective donors, based on “only a piece of the puzzle.” To me, that means half-baked assessments are being passed off as Gospel.

In that email to me, Mr. Stamp further justified the use of only a partially developed formula by blaming the non-profits for, as he said to me, “resisting evaluation” and not “embracing it.”

If ever there was a distasteful force-feed, this is one of them, in my opinion. Non-profits resist what? Non-profits evaluate what?

It’s bad enough when an organization is so poorly evaluated as to receive, say, an undeserved one-star classification, but worse when it’s public, as CN floods the country with press releases which name non-profits and announce whether they are ranked with one, two, three, or four stars.

Charity Navigator’s flawed evaluation system some months ago seriously affected several of our Cleveland, Ohio non-profits, which were awarded only two stars, while a number of those “anointed” by CN garnered four stars. The repercussions were loud and many. And justifiably so. One of the publicly low-ranked organizations has a history of being solvent, but was at the beginning of a very large capital campaign when CN got its hands on the organization’s financials. Thus, the organization’s startup costs for the major fund-raising effort were, as expected, significant, and out of balance with income, at least until the expected money was raised and received. The two-star rating made on that basis, which CN made public via their press release, did hurt the organization.

My article on the subject argues that no one is now in a position to make such meaningful evaluations, be they grant makers or academics---and especially not Charity Navigator.

With thanks, and regards,
Tony Poderis
