Modules | Module A: overview | Module B: plan | Module C: build | Module D: evaluate | Module E: report |
Other Resources | Orientation | Logic Model | Cases | Glossary | Credits | Enhanced version |
[Start Screen A-1 of 16/Module A>Introduction (1)]
This module provides an overview of the course and its goals.
[Graphic of street sign containing the word Introduction]
[End Screen A-1 of 16]
[Start Screen A-2 of 16 /Module A>What is OBPE? (2)]
OBPE is a systematic way to plan user-centered programs and to measure whether they have achieved their goals. OBPE goes beyond documenting what you did and measures what difference you made in the life of your audience—how has your audience changed?
[Box with bird image containing the words: Outcome - “not how many worms the bird feeds its young, but how well the fledgling flies.” -- United Way of America, 2002]
[End Screen A-2 of 16]
[Start Screen A-3 of 16/Module A>What is OBPE? (3)]
First, a few terms.
[Rectangle containing the words: What is an outcome? Arrow pointing from the rectangle to the words: Desired change in an end user]
[Rectangle containing the words: What are some kinds of outcomes? Arrow pointing from rectangle to the words: Behavior, Attitude, Skills, Knowledge, Conditions, Status]
[End Screen A-3 of 16]
[Start Screen A-4 of 16/Module A>What is OBPE? (4)]
Using OBPE, you develop a logic model (a planning and evaluation tool) that helps you connect needs and institutional mission to programs and their outcomes:
[Graphic of square containing the words needs, institutional mission. Arrow from square to circle containing the word "program". Arrow from circle to triangle containing the word "outcome"]
[End Screen A-4 of 16]
[Start Screen A-5 of 16/Module A>Why OBPE? (5)]
OBPE has benefits not only for the public but inside the institution as well.
What are professionals saying about OBPE?
[Moving the mouse over the image of two people talking labeled "public programs director" presents the following text: “Believe it or not, OBPE has helped with some of the politics of program cuts. When ‘pet programs’ are challenged, we can look objectively at what is actually making a difference.”- Public programs director]
[Moving the mouse over the image of two people talking labeled "librarian in charge of evaluation" presents the following text: “We’ve been collecting tons of data on our programs for years, but none of it really told us if we were making a difference in our patrons’ lives. Outcomes evaluation gives us a way to know that.”- Librarian in charge of evaluation]
[Moving the mouse over the image of a man in chair labeled "exhibit team member" presents the following text: “Even if we never implemented the plan, developing a logic model has helped our staff articulate the purpose of our programs more clearly- it has even generated healthy discussion of some key issues!”-Exhibit team member]
[Moving the mouse over the image of woman labeled "museum director " presents the following text: “Even if our funders didn’t want to know outcomes, our staff has come to think of this museum as audience-centered. OBPE shows them the real human impact of their work.”-Museum director]
[Moving the mouse over the image of man labeled "library director" presents the following text: “Funding streams are drying up and yet our community needs keep growing – we have to know what’s working and what isn’t so we can invest in success.”- Library director]
[Moving the mouse over the image of older man labeled "grant writer" presents the following text: “This way of documenting our impact in the community has been a powerful tool in writing grants. We don’t just say we’re making a difference, we show them the data!”-Grant writer]
[Moving the mouse over the image of woman labeled "school services coordinator" presents the following text: “Sure, I was reluctant at first- more paperwork I don’t need- but it has been a useful planning tool and really pays off when it comes time to write reports.”- School services coordinator]
[Start DIG DEEPER text]
United Way of America has been promoting program outcome measurement for more than a decade. A survey of agency experiences with outcome measurement shows the following.
Agency executives agree or strongly agree – outcome measurement helps their
Source: Agency Experiences with Outcome Measurement: Survey Findings. United Way of America, 2000. Used by permission.
[End DIG DEEPER text]
[End Screen A-5 of 16]
[Start Screen A-6 of 16/Module A>A tale of two programs (6)]
Let’s see what two typical small programs involve, one for a museum and one for a library, without worrying about outcome-based planning and evaluation. Then let’s consider what OBPE might add.
[End Screen A-6 of 16]
[Start Screen A-6_M1 of 16/Module A>A tale of two programs>Museum example (6-M1)]
Let's explore a museum example. What happened?
[Graphic of young woman thinking with the text: You get an idea - It is the 100th anniversary of the birth of a local artist of some fame. Your county museum decides to mount an exhibition of her landscape paintings, a major part of your collection.]
[Graphic of desk containing paper and pencil with the text: You plan the program, budget resources and costs, and get funding - Program planners identify and research key works, develop the interpretive context, plan related public programming, and craft a budget (arguing successfully for museum support, funded in part by a corporate sponsor).]
[Graphic of computer with the text: You offer the services and monitor results - The exhibit opens to positive reviews and strong attendance, and your museum board is particularly pleased to hear that school group visits showed a marked increase.]
[Graphic of young woman smiling with the text: What difference did your program make? How do you know? -“We built it and they came! Attendance was strong, press was good, and the corporate sponsor was pleased that lots of people saw their logo.” ]
But what difference does the exhibit make in the lives of the visitors? How can you show your program contributes to the public good in support of the institution’s mission?
[End Screen A-6_M1 of 16]
[Start Screen A-6_M2 of 16/Module A>A tale of two programs>Museum example (6-M2)]
Let's explore the same museum example again. This time we've added OBPE. What outcome do you want? Your mission statement gives one of your institutional goals: “to inspire and educate the public about the historical, cultural, and artistic heritage of our county.”
What happened when OBPE was applied?
[Graphic of young woman thinking with the text: You get an idea -The 100th anniversary of a prominent local artist is an opportunity to increase knowledge of her work, connect students with the aesthetic and cultural heritage of their county, and promote an understanding of local landscape.]
[Graphic of desk containing paper and pencil with the text: You plan the program, budget resources and costs, and get funding -Your exhibit development team identifies a target audience for the exhibit, agrees on outcomes (changes in knowledge, skills, attitudes, behavior) and indicators of the change, and plans an exhibit and related programming to produce those desired outcomes. You argue successfully for museum support, funded in part by a corporate sponsor. With this support, you approach an arts foundation to fund outcomes that are central to your mission and theirs.]
[Graphic of computer with the text: You offer the services - The exhibit opens to positive reviews and strong attendance, and your museum board is pleased to hear that school group visits showed a marked increase. They and the arts foundation are especially pleased to see the results of the pre- and post-visit surveys that showed an increased level of knowledge about the artist and significantly higher interest in local landscape history and preservation. And the corporate sponsor is pleased that lots of people saw their logo.]
[End Screen A-6_M2 of 16]
[Start Screen A-6_L1 of 16/Module A>A tale of two programs>Library example (6-L1)]
Let's explore a library example.
[Graphic of man thinking with the text: You get an idea -Your library owns a collection of documents concerning Native Americans. Local schools often have lessons on local tribes, and state educational standards now require use of primary materials.]
[Graphic of desk containing paper and pencil with the text: You plan the program, budget resources and costs, and get funding - You make documents available on the Web and get a modest allocation of funds to develop study modules for teachers.]
[Graphic of computer with the text: You offer the services and monitor results - You promote the availability of the items with local teachers and check to see how many web hits your site gets.]
[Graphic of man smiling with the text: What difference did your program make? How do you know? -“We got over 2,000 hits on our website in the three-month period.”]
But this describes the effect on the library. What difference does the program make in the lives of the visitors? How can you show your program contributes to the public good in support of the institution’s mission?
[End Screen A-6_L1 of 16]
[Start Screen A-6_L2 of 16/Module A>A tale of two programs>Library example (6-L2)]
Let's explore the library example again. This time we've added OBPE. What outcome do you want? Your mission statement gives one of your institutional goals: “Serving as an educational and cultural resource to the people of the northeastern State region.”
What happened when OBPE was applied?
[Graphic of man thinking with the text: You get an idea -Your library owns a collection of documents concerning Native Americans. State educational standards now require use of primary materials and teachers who do lessons on local tribes may want to include some of your documents.]
[Graphic of desk containing paper and pencil with the text: You plan the program, budget resources and costs, and get funding - You identify interesting documents, partner with local Native American groups and school teachers to develop sample packs of materials, get modest funding from a grant (a Library Services and Technology Act grant, administered by your State Library), and mount materials on your website.]
[Graphic of computer with the text: You offer the services and monitor results - The Native American group schedules a pow-wow for the opening and tribal representatives speak when school groups visit. Your Library Board is pleased at increased attendance figures. Your website hits increase 25%. The local Board of Education and your Board are especially pleased to hear that children participating in the program have incorporated documents into their history fair projects. Results of the pre- and post-visit surveys show an increased level of knowledge about local tribes and an awareness of how Native Americans live today (outcomes: changes in the target audience).]
[End Screen A-6_L2 of 16]
[Start Screen A-7 of 16/Module A>OBPE benefits (7)]
When you start with the desired outcome, you plan the program to make the intended changes in your audience. The focus is on questions such as:
[End Screen A-7 of 16]
[Start Screen A-8 of 16/Module A>OBPE benefits (8)]
The planning process doesn’t change from bad to good with OBPE. But because you have pictured what success would look like, you plan to make that happen. Here’s what happens to your planning process:
Planning Process - You get an idea for a program to further the mission of your institution.
Planning Process with OBPE - Identify specific individuals or groups (target audience) with a defined need central to your institution’s mission.
Planning Process - You plan the program, budgeting resources and costs, and argue successfully for funding.
Planning Process with OBPE - Establish clear program outcomes to meet that need. Develop ways to measure those program outcomes (indicators). Design program to reach that audience and produce the desired outcomes.
Planning Process - You offer the program and monitor the results.
Planning Process with OBPE - Offer the program having planned what to monitor to show changes in the target audience.
[Start COACH text]
You may be worrying, “What will happen if I don’t start by identifying an audience need but instead get a program idea from a funder’s Request for Proposals?”
Don’t worry. First, perhaps the funder has already identified a need. Second, the OBPE police don’t give tickets!
Our point is that it pays to think about outcomes early on because it’s so central in planning what you will actually carry out. And here’s a hint if you need help with terms like “outcome” and “indicator.” We’ll explain these carefully in the other modules, but meanwhile check the Glossary if a term seems puzzling or unfamiliar.
[End COACH text]
[End Screen A-8 of 16]
[Start Screen A-9 of 16/Module A >OBPE benefits (9)]
Benefits of OBPE for libraries and museums
Because you will plan for outcomes, OBPE will help you to:
[Start Dig Deeper]
The Colorado Digitization Project (CDP) illustrates the benefits of OBPE nicely. The CDP has grown from a consortium of Colorado libraries, museums, and archives that was awarded a small grant in 1999 to the current Collaborative Digitization Project, which includes four other Western states and has received a total of over $2.6 million in grants. See their website at http://www.cdpheritage.org for copies of grant documents, reports, and evaluations of their projects. Or click on the Cases tab at the top of the screen and select Teaching Colorado’s Heritage with Digital Sources for an overview of their accomplishments and links on our server to selected documents they have posted.
[End of Dig Deeper]
[End Screen A-9 of 16]
[Start Screen A-10 of 16/Module A>Course overview (10)]
This first module introduces the course. Each module answers crucial questions. Click on the titles below to find out what you'll learn in each module.
[Clicking on “Overview” presents text: Why and how can OBPE help me?]
[Clicking on “Plan” presents text: How can I plan a program, understanding audience needs, working with stakeholders and partners, and answering the questions: What do we do? For whom? And for what benefit (outcomes)?]
[Clicking on “Build” presents text: How do I carry out my plans for reaching desired outcomes: What activities need to be carried out within the organization? What services need to be delivered to participants? What input of resources is needed?]
[Clicking on “Evaluate” presents text: How can I measure the results of the program: What would success look like? What would indicate that success? Have I achieved the desired outcomes?]
[End Screen A-10 of 16]
[Start Screen A-11 of 16/Module A>Course overview (11)]
What will I be doing in this course?
Each module is a self-guided learning experience that you can take at your own pace.
At the end of each module is an opportunity to check and apply your understanding.
Participants will receive directions from their instructor about assignments, group activities, and learning assessment.
[Graphic of man smiling and reading documents.]
[End Screen A-11 of 16]
[Start Screen A-12 of 16/Module A>Course overview (12)]
There are several features in the course to help you.
In the modules:
Scenario-based examples help you connect the general principles to practice.
A “Check your understanding” screen lets you review before you move on to the assignments and the next module.
An “Apply your understanding” screen leads you to apply the OBPE concepts to your own program. Check with your own instructor about what you need to do for each module.
“Coach” icons link you to tips for success. “Dig deeper” icons link to more in-depth information or examples.
[Graphic of books and papers with folded eye glasses sitting on top.]
[End Screen A-12 of 16]
[Start Screen A-13 of 16/Module A>Course overview (13)]
Across the top of each screen, you will find links to useful learning aids:
[End Screen A-13 of 16]
[Start Screen A-14 of 16/Module A>Check your understanding (14)]
OBPE provides you with a valuable tool for planning and evaluation. Think about each of the statements below. Is it something you can realistically expect OBPE to do for your museum or library?
[Clicking on the graphic “Statement 1: Prove your program benefited the community." presents text: No, while OBPE can demonstrate some results, its methodology (for example, sampling strategy, variable controls) can’t “prove” definitively that a program caused an impact.]
[Clicking on the graphic “Statement 2: Make writing reports easy.” presents text: Nothing makes writing easy. But the planning documents you prepare in OBPE will usually make writing reports (to funders, to your Board) and press releases a lot easier.]
[Clicking on the graphic “Statement 3: Help keep staff, partners and funders in agreement on goals by stressing outcomes (“so what?”) rather than the process.” presents text: Yes. By making end results the basis of planning and evaluation, everyone directs their efforts toward a common, agreed-upon outcome: the human impact you want to have on your target audience.]
[Clicking on the graphic “Statement 4: Assist in fundraising and grant writing by providing data on outcomes.” presents text: Yes, OBPE helps you demonstrate the value of the organization to the community in concrete terms that connect what you are doing to what you’ve achieved.]
[Clicking on the graphic “Statement 5: Indicate needed improvements to programs and services.” presents text: No, OBPE can indicate what is working and what isn’t, but it can’t tell you what changes need to be made to improve the outcome. That’s the job of the program planners.]
[End Screen A-14 of 16]
[Start Screen A-15 of 16/Module A>Apply your understanding (15)]
You have reached the end of the instruction for this module. Follow your instructor’s directions for any assignments:
If you are not familiar with OBPE, you may find it useful to look through examples of best practice in the Cases archive.
Think about talking to your governing board about the value of OBPE for your organization. After reading only the first module, what would you say were the three or four most valuable “promises” about the value of OBPE for your organization? Can you anticipate any objections?
Be sure to check your instructor’s postings for additional or alternate assignments.
[End Screen A-15 of 16]
[Start Screen A-16 of 16/Module A>Resources (16)]
These resources are available throughout the course whenever you feel the need for them.
Clegg & Associates, Inc. (2005). The logic model game. Retrieved August 8, 2005, from http://www.cleggassociates.com/Resources/LogicModel/index.asp
Clegg & Associates, Inc. incorporates outcomes-based evaluation in its strategic planning, evaluation, and facilitation work for nonprofit and public sector organizations. This game is a tool to help build and design a logic model. It provides a concise overview of the logic model elements: resources, activities, outputs, outcomes, and goals. Some of the vocabulary differs a little from the Shaping Outcomes OBPE course, and it uses health services and environmental examples, but the principles are the same.
Diamond, Judy. (1999). Practical Evaluation Guide: Tools for Museums and Other Informal Educational Settings. Walnut Creek, CA: AltaMira Press.
This handbook provides an introductory guide to tools and approaches for assessing how programs and exhibits communicate their intended messages to museum audiences. It includes samples of strategies for collecting information on museum learning and describes how to construct and use them.
Florida Department of State, Division of Library and Information Services. (2000). Workbook: outcome measurement of library programs. Tallahassee, FL: FDS.
Chapters include Introduction and Overview; Organizational Readiness Survey; Preparing for Outcomes; Logic Model; and Data Collection Plan.
Ondaatje, E.H., Zakaras, L., Brooks, A., & McCarthy, K.F. (2004). Gifts of the muse: reframing the debate about the benefits of the arts. Santa Monica, CA: RAND Corporation.
RAND Corporation is a nonprofit research organization providing analysis and solutions for public and private sector organizations globally. The book provides a comprehensive view of how the arts create public and private benefits and highlights how the arts are both instrumentally and intrinsically beneficial. This publication has been peer reviewed for research quality and objectivity.
Rubin, Rhea Joyce. (2006). Demonstrating Results: Using Outcome Measurement in Your Library. Chicago: ALA Editions.
Written by a library consultant, this book uses familiar task breakdowns along with key terms in a step-by-step, service-oriented format. Readers learn to master the outcome measurement process, enhance library programs using evaluation techniques, customize the 14 step-by-step workforms to address unique needs, gather and interpret statistically accurate data to demonstrate outcomes, and measure, evaluate, and present outcomes to attract funding. Examples and two running case studies demonstrate the application of the principles, and a toolkit provides tips on creating evaluations, coding data, and selecting a sample. It has excellent bibliographies of both evaluation in general and of evaluative tools.
United Way of America. (1999, April). Achieving and measuring community outcomes: challenges, issues, some approaches. Retrieved August 8, 2005, from http://national.unitedway.org/outcomes/files/cmtyout1.pdf
United Way is often regarded as a successful pioneer of outcomes-based evaluation. This 34-page document provides introductory information on the logic models, action plans, theories, and strategies the organization has used for successful program planning and evaluation.
Weil, Stephen. (2003, Nov.-Dec.). “Beyond Big & Awesome: Outcome-Based Evaluation.” Museum News.
Discussion of outcome-based evaluation -- what museums can learn from visitors' experiences and the need for museums to develop credible systems of feedback.
Weil, Stephen, & Rudd, Peggy D. (n.d.). Perspectives on outcome evaluation for libraries and museums [booklet of the Institute of Museum and Library Services]. Retrieved August 18, 2005, from http://www.imls.gov/pubs/pdf/pubobe.pdf
This Institute of Museum and Library Services-published booklet is recommended to Outcomes Based Evaluation workshop participants. It contains the articles “Transformed from a Cemetery of Bric-a-Brac” by Stephen Weil and “Documenting the Difference: Demonstrating the Value of Libraries through Outcome Measurement” by Peggy D. Rudd as well as a breadth of resources for further exploration. The booklet concisely articulates the value of OBPE for museum and library practitioners.
Zweizig, Douglas. (1976). “With our Eye on the User: Needed Research for Information and Referral in the Public Library.” Drexel Library Quarterly, 12, 48-58.
In his influential review of existing research on information seeking, Zweizig found that the patron's perspective was largely overshadowed by a library-centered focus.
Institute of Museum and Library Services (IMLS)
The Institute of Museum and Library Services is the primary source of federal support for the nation’s 122,000 libraries and 17,500 museums. Its mission is to grow and sustain a “Nation of Learners." The web site has lots of useful information about OBPE including an introduction to and resources for successful program evaluation and a project planning tutorial.
United Way of America: Outcome Measurement Resource Network
The United Way has championed the adoption of outcome measurement by health and human service programs. The Resource Network offers information, downloadable documents, and links to resources related to the identification and measurement of program- and community-level outcomes.
[End Screen A-16 of 16]
[End of Module A]