The Improve Group is always looking for creative ways to measure and evaluate program outcomes and their long-term impacts. Ripple effect mapping brings something new to the table by framing analysis around a program's initial outcomes and how they connect to and interact with the larger service area and community. It is a participatory strategy for measuring program outcomes, particularly those requiring collaboration among stakeholders or sectors.
Ripple effect mapping is usually conducted about 12 months after program completion and aims to capture socially complex interactions, social capital outcomes, and multi-causality. Steps in conducting ripple effect mapping include:
- Identification of the program intervention
- Scheduling a group mapping event (~2 hrs.) and inviting participants (a mix of stakeholders)
  - Participants must have a clear understanding of what the program is and why it exists
  - Aim for a moderately sized group, usually 12-20 participants plus 2 moderators: 1 facilitator and 1 mapper
  - Use appreciative inquiry interviewing
- Holding the group mapping event
  - Map live during the discussion
  - Recommendation: probe using the Community Capitals Framework (Cornelia and Jan Flora, 2008, http://www.soc.iastate.edu/staff/cflora/ncrcrd/capitals.html)
- Follow-up interviews
- Cleaning, coding, and analysis
  - The original map can be added to 1 year later or on an ongoing basis to continually capture impacts; a developmental evaluation can emerge from this
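Conceptually, the finished map is simply a tree rooted at the intervention, with each ring of "ripples" one level deeper. Purely as an illustration (the project and effects below are invented for a fictional park clean-up), such a map could be represented and summarized like this:

```python
# Illustrative sketch only: a ripple effect map as a nested tree.
# The intervention is the root; each level down is a further ripple.
ripple_map = {
    "Park clean-up project": {                 # intervention (root)
        "Volunteers met neighbors": {          # first-order effect
            "New block club formed": {},       # second-order effect
        },
        "Park usage increased": {
            "City added a picnic shelter": {},
        },
    },
}

def print_map(node, depth=0):
    """Print the map as an indented outline, one effect per line."""
    for effect, ripples in node.items():
        print("  " * depth + "- " + effect)
        print_map(ripples, depth + 1)

def count_effects(node):
    """Count every node in the map, including the intervention itself."""
    return sum(1 + count_effects(children) for children in node.values())

print_map(ripple_map)
print(count_effects(ripple_map) - 1)  # ripple effects beyond the intervention itself
```

A structure like this is what free mind-mapping tools store behind the scenes, which is why the live map can later be cleaned, coded, and extended as new impacts surface.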
Before choosing this method for gathering information about program impacts, carefully consider the following benefits and challenges of implementing this strategy.
Benefits:
- Ripple effect mapping is participatory and engages a mix of stakeholders or sectors. The appreciative inquiry activity, in particular, motivates participants to think about the successes of the intervention and to continue collaborating and building connections into the future.
- Including multiple stakeholders allows for cross-validation from members of the group. The live activity encourages people to comment as topics or outcomes arise in the discussion.
- The discussion that results from the mapping activity captures both intended and unintended impacts of an intervention. The results can help a client or an organization think about outcomes that they may not have identified when designing the intervention.
- Ripple effect mapping is a low-cost option for collecting data. The group session is more cost-effective than conducting many separate interviews. Mapping software is available online for free! Some examples of free mind mapping software include XMind, FreeMind, and MindMeister.
- The final map is a useful graphic to help clients and organizations understand and communicate program impacts to their stakeholders. In addition, the mapping results can be part of an ongoing evaluation process that can be used to track changes and new developments.
Limitations and Challenges:
- It is important to have a skilled facilitator to lead the ripple effect mapping activity. The facilitator should understand what information is most important to collect, and be clear about the types of “probes” or follow-up questions to ask in order to gather this information from participants. It is also ideal to have an external facilitator rather than program staff to ensure participants feel comfortable sharing both positive and negative impacts.
- There is potential for inconsistent implementation. The facilitator as well as others who assist in this process (moderators and mappers) should remain the same through all iterations of the mapping activity.
- There is a risk of bias as a result of participant selection. Not all participants may have information about all of the outcomes experienced by the group they represent. This can be avoided by carefully selecting participants and conducting supplementary interviews with additional stakeholders from that group.
The snapshot below is an example of a segment of a ripple effect map for a fictional park clean-up project created using XMind:
To sum up, ripple effect mapping is a unique data collection approach with the potential to be a powerful evaluation tool. When implemented carefully, the results can greatly benefit a program and its surrounding community, and can inform future decisions across stakeholder priorities or sectors.
PowerPoint and Sample Agenda: http://comm.eval.org/eval/resources/viewdocument?DocumentKey=a04a9a28-6c0b-4953-91f9-ed753f120f3f
Posted: March 20th, 2013 | Author: igmain | Filed under: About evaluation, Improve Groove Newsletter, Knowledge exchange | Tags: AEA, American Evaluation Association, Cami Connell, Danielle Hegseth, evaluation, Freemind, Improve Groove, Improve Group, Leah Goldstein Moses, MindMeister, newsletter, Ripple effect mapping, Xmind | Comments Off
Evaluation for Improvement: A Seven-Step Empowerment Evaluation Approach provides an in-depth analysis and practical application of Empowerment Evaluation (EE), an approach that is gaining a lot of attention in the evaluation community. Its overarching goal is to give a program’s stakeholders the capacity to effectively plan, implement, and evaluate their own programs. Underlying this is a focus on social justice: stakeholders, including staff, participants, volunteers, and people indirectly affected by the program, have the power to decide what is evaluated and how. The approach involves extensive planning and effort that, when carried out effectively, allows organizations to transition to conducting program evaluations without the assistance of the original evaluator. In most cases, program participants are actively involved and are also empowered. This skill is enormously practical, as the organization establishes a foundation for improving its strategies for years to come.
Whether you are giving or receiving an Empowerment Evaluation, it is important to understand the roles involved and when the approach is appropriate.
What are the roles of participants in an Empowerment Evaluation?
Empowerment Evaluation is a collaborative effort between the evaluator and the stakeholders of the program being evaluated. The evaluator must carry out his or her evaluation while also engaging the stakeholders; the stakeholders’ role is to actively participate in the evaluation so that they learn to conduct these methods effectively in the future.
Although the evaluator determines the structure of each EE process, his or her role is usually to coach the organization’s staff to engage stakeholders, describe the strategy, choose an evaluation design, write reports describing conclusions, and work to ensure that evaluation results are used to improve organizational evaluation capacity (A Seven-Step Empowerment Evaluation Approach).
There are many things program stakeholders might experience during an empowerment evaluation. They may conduct interviews with program participants, develop surveys for the evaluation, analyze and interpret data, etc. By practicing these evaluation methods, stakeholders will gain skills to transfer to their own future evaluations.
When is Empowerment Evaluation appropriate or inappropriate?
Even when resources are scarce, organizations do not lose opportunities for reflection if program stakeholders have the capacity to evaluate their own work.
Empowerment Evaluation is appropriate if the organization would like to assess its successes, its impact on the communities it serves, and its role in advancing equity on an ongoing basis. If this is the case, the organization should look to EE to build its own evaluation capacity.
There are a few occasions when empowerment evaluation is not an appropriate approach to take. For example, in some instances an external evaluator can add credibility or overcome biases or assumptions. In other cases, the evaluation questions are only going to be answered once, and building for future capacity is unnecessary.
If this topic has sparked your interest, I recommend reading Evaluation for Improvement: A Seven-Step Empowerment Evaluation Approach, discussed above. It provides a more in-depth analysis of the principles of EE and even offers a step-by-step approach for those who would like to conduct or receive their own EE.
Had you heard of EE before reading this? Has this answered any questions you may have had about EE? Would you consider an EE approach in the future? Please feel free to respond; I would love to hear your feedback.
Posted: March 18th, 2013 | Author: igmain | Filed under: About evaluation, Knowledge exchange, Research Tidbits | Tags: Dan Goldstein, empowerment evaluation, evaluation, Improve Group | Comments Off
The following blog, Before Spending on a Client Tracking Database, Consider Your Goals, first appeared on MAP for Nonprofits MapTechWorks on March 15, 2013.
Post Date: March 15, 2013 | Author: Karen Graham | Category: Nonprofit Software
What can data do for you? In this guest post Leah Goldstein Moses of The Improve Group shares one organization’s evolution toward more sophisticated data management and tools. I spoke with Leah while researching for my session the Ultimate Guide to Client Tracking and Case Management Software. Join me on April 10th at the Minnesota Nonprofit Technology & Communication Conference for that session and learn about the powerful tools and techniques I’ve learned from nonprofits like yours.
- Karen Graham, director of innovation & technology, MAP for Nonprofits
What is the most important thing to understand when investing in new data management tools?
Software should work for you, not vice versa. Before you spend money on software, think about what you want it to do for you. When your systems work well and you are using them fully, they can meet a number of needs. Some possibilities:
- Keep track of your participants in real time
- Track results, such as specific outcomes your participants have achieved
- Document your activities
Start by considering how you will use the software, when, and for what purposes. Then, think about how you can make small, initial investments before committing to a costly, custom or off-the-shelf product.
When you don’t have good data, you run the risk of facing a difficult decision unprepared. Imagine a potential supporter approaching you and asking for outcomes, or a board member asking where resources should be placed. Bad or missing data leaves you unable to answer those questions. With great data, you can easily assess opportunities and risks and answer questions. While decisions may still be difficult, you can at least eliminate some of the uncertainty.
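As a concrete illustration of the participant, results, and activity tracking needs listed above, here is a minimal, purely hypothetical sketch of what such a data store could look like. The table and column names are invented for demonstration; real client-tracking products are far more elaborate:

```python
import sqlite3

# Hypothetical sketch: three tables covering the three data needs above.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE participants (id INTEGER PRIMARY KEY, name TEXT, enrolled DATE);
    CREATE TABLE activities   (id INTEGER PRIMARY KEY, participant_id INTEGER,
                               description TEXT, date DATE);
    CREATE TABLE outcomes     (id INTEGER PRIMARY KEY, participant_id INTEGER,
                               outcome TEXT, achieved DATE);
""")
con.executemany("INSERT INTO participants VALUES (?, ?, ?)",
                [(1, "Participant A", "2013-01-05"),
                 (2, "Participant B", "2013-02-10")])
con.executemany("INSERT INTO outcomes VALUES (?, ?, ?, ?)",
                [(1, 1, "completed financial literacy course", "2013-03-01")])

# A funder or board member asks: how many participants have achieved an outcome?
achieved, = con.execute(
    "SELECT COUNT(DISTINCT participant_id) FROM outcomes").fetchone()
print(achieved)  # participants with at least one recorded outcome
```

The point is not the particular tool, but that structured data like this lets you answer a supporter's or board member's question with a single query instead of a scramble through paper files.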
Several years ago, Nonprofits Assistance Fund began using a customized Microsoft Access database to track all of the data they collected as organizations sought financial support. As they began offering additional technical assistance, more fields were added, including an organizational assessment developed to capture critical issues their loan and technical assistance clients faced. They could use the data to help them serve individual clients and aggregate data to understand trends and outcomes.
While Access met their needs for a while, over time they needed a database that was more aligned with the loan fund, which has very specific data requirements. Their search for a new system led them to one that was built specifically for loan management but could be adapted for other needs. They’ve gradually been adding other components, such as assessment data and technical assistance tracking.
This is a great example of how major data needs can force compromises, or at least sequencing, of other tools. Other organizations may find themselves in a similar situation – after assessing what they hope software, data management, and analysis will do for them, they will likely find that one need dominates their initial plans.
Guest writer Leah Goldstein Moses founded The Improve Group in 2000 to help the public and non-profit sectors make better use of available information and find creative, data-supported ways to answer questions. Drawing on her decade of experience with dozens of organizations, Leah is recognized as an expert in evaluation, community-based research, planning programs and services, and engaging stakeholders.
Click here to learn more at MAPTechWorks.
Posted: March 18th, 2013 | Author: igmain | Filed under: About evaluation, About planning, Knowledge exchange | Tags: blog, data tracking, evaluation, evaluation software, Improve Group, Karen Graham, Leah Goldstein Moses, MAP for Nonprofits, MAP TechWorks, nonprofit software | Comments Off
Jon Pratt recently reported in Minnesota Council of Nonprofits’ Nonprofit News that MNCN has collaborated with Native Americans in Philanthropy to prepare The Native American Nonprofit Economy Report. The report focuses on the scope and impact of Native American-led organizations, including financial activity, sources of support, employment, major activities and other emerging issues.
This first ever, comprehensive assessment of Minnesota organizations led by and serving Native Americans provides insights about the sector’s strengths, challenges and opportunities.
The Need for Native American-led Nonprofits
“In 1952, the Urban Indian Relocation Program encouraged Native Americans living on reservations to move to urban areas, such as the Twin Cities” (Pratt, 2013). But as more and more people arrived, they found fewer culturally appropriate services and opportunities available than expected. This prompted the development of organizations committed to social and cultural programming. Today, many of these organizations continue to support Native American communities around Minnesota.
The Economy Report
“The research provides the results of 49 interviews with Native leaders and reports on 122 organizations with 1,335 employees and $5,646,671 annual expenditures across economic development, arts and culture, social services, health care and tribal sovereignty” (Pratt, 2013).
One finding of the study is that the vast majority of Native nonprofits are located in urban areas and are less common in rural tribal areas, representing an opportunity for funding and growth there. One hope for the study is that its findings will offer new insights to urban and rural Native-led nonprofit organizations as they come together to discover and better understand how they can more effectively serve their communities.
The Native American Nonprofit Economy Report will be fully presented on March 1st at All Nations Church, 1515 E 23rd St. in Minneapolis. It will provide insight for those who would like to learn more about these nonprofits and the communities that they serve. For more information on this report, please visit www.minnesotanonprofits.org or contact Jane Harstad at email@example.com.
Posted: February 19th, 2013 | Author: igmain | Filed under: Knowledge exchange, Research Tidbits | Tags: All Nations Church, Dan Goldstein, Improve Group, Improve Group blog, intern, Jon Pratt, Minnesota Council of Nonprofits, MNCN, Native American Economy Report, Native Americans in Philanthropy, Native noprofits, Urban Indian Relocation program | Comments Off
Everybody has to start somewhere on the road to their personal success. In some cases, that starting point begins with an internship. Internships are great opportunities to learn more about the field you are interested in while also developing significant, on-the-job experiences that future employers will value.
However, this knowledge and experience will not come to you unless you really apply yourself. As a current intern, I would like to share my advice for those looking to get the most out of their internship opportunity. The following are a few suggestions of what you can do outside of your assigned tasks that can lead to success during and after your internship.
Become acquainted with everyone
One of the most important things you can get out of an internship is a strong network of people. Developing professional relationships within the organization is important because it helps you stand out. You never know what a work connection may have to offer in the future, whether it’s an open position or knowledge of somebody with one.
Find a mentor by getting to know the employees in the organization. A mentor is important because they have already gone through the process of reaching a desired position and can teach from experience. Odds are good that a mentor will have many connections that he or she can point you towards for potential positions in the future.
Know the Organization and its competitors
The best way to demonstrate that you care about the organization is to know as much as possible about it. Having this knowledge can help you avoid mistakes or coming off as ignorant. Also, try to learn about your organization’s competitors so that you can discover what sets your workplace apart.
Be prepared to do grunt work
As an intern, you may be given small tasks like getting coffee or filing paperwork. These are not the most mentally stimulating jobs, but most successful professionals had to do this kind of work for someone else at some point. It is important to focus on the big picture: look beyond the task and see how every contribution builds goodwill with your co-workers and the organization.
Set goals and keep yourself busy
Come prepared with goals that you want to achieve through your internship. These goals should include what you want to accomplish and also what you want to learn. After setting these goals, map out what needs to be done to achieve them. Discuss these goals with your supervisor, and he or she will likely assign tasks that lead in those directions.
If you have some down time, ask your supervisors for more tasks. They will gladly assign something and you will demonstrate your strong work-ethic. If there is no available work at the time, ask if it is okay to read informational articles to learn more about exciting new trends within the industry. As long as you keep busy in a relevant manner you will stand out to your employers while also learning on the job.
It is important to remember that you are an intern, not a manager; therefore, you do not know everything there is to know about improving the organization, nor are you expected to. If you are confused about what you should be doing, do not be afraid to ask for assistance or explanations. An internship is a learning experience, so if you are asking questions, you are doing something right.
Be excited to be there!
Lastly, be enthusiastic about the great opportunity that you have. Whether or not you would like to be hired full-time within the company in the future, your fellow employees and supervisors will appreciate your passion and excitement. It will improve the atmosphere within the organization and they are likely to view you in a positive light in the future. These are just a few suggestions for interns looking to gain the most out of their position.
For those looking to learn even more, visit the Internship Success Guide provided by About.com and also check out the “suggested readings” at the bottom of that excerpt. Did this guide provide useful tips for you? Do you have any stories you would like to share pertaining to your success as an intern? Do you have any other advice that you would like to give? Please comment below!
Posted: February 14th, 2013 | Author: igmain | Filed under: Around the office, Knowledge exchange | Tags: Dan Goldstein, Improve Group, intern tips, interns, successful internships | Comments Off
The following blog on Small Business by Leah Goldstein Moses was first published on the American Evaluation Association AEA 365 blog this week:
IC Week: Leah Goldstein Moses on Small is Beautiful–Growing a Small Practice
Posted by jgothberg in Independent Consulting
Hello, I am Leah Goldstein Moses, founder and CEO of the Improve Group and 2012 President of the Minnesota Evaluation Association. When I founded the Improve Group in 2000, I was learning to be a consultant at the same time I was refining my evaluation skills. My practice has grown from me and a loose network of other independent consultants to a consulting firm of 18 staff. Running a company is different from independent consulting. It took nearly four years before I had my first employee. From that first day, I had new responsibilities: making payroll, setting HR policies, and developing a new network of advisers and resources. I’ve learned a lot of lessons along the way about how to be successful as a ‘not so small’ independent evaluation firm.
- Take strategic risks – but prepare for the consequences. As our company grew, the risks also grew. I now take a strategic approach to big decisions, such as a new hire or pursuing a large project. I ask myself: what if this doesn’t work out? What are the potential risks? What choices will we need to make in reaction to those risks?
- Find a focus and identity. We love evaluation. We also do strategic planning and research, but always with an underlying evaluation focus. We are known as evaluators but we decided to work across sectors. So we do evaluation in arts, human services, formal education and informal learning, health, transportation, development, and corrections. We are experts in evaluation and have found ways to supplement our expertise in these sectors. You might find a different identity and focus – either in one sector, a set of methods, or something else. If you can describe who you are to your clients, collaborators, and community, you will be fine.
- Interested in learning more about growing and sustaining an evaluation business? Attend Improving Evaluation Practice Management During Chaotic Economic Times: Three CEOs Reflect on Strategic and Innovative Diversification, Budgeting, and Employee Support and Development at the AEA conference on Thursday, Oct 25, 4:30-6:00 PM, where I’ll be presenting with Gary Ciurczak, Richard Hezel and Samantha Hagel.
- I use the AEA365 blog, the mande listserv and the evaltalk listserv to stay fresh on current evaluation topics.
- I enjoyed The Momentum Effect by J.C. Larreche, a book that examines the factors that help companies grow year after year.
Posted: October 8th, 2012 | Author: igmain | Filed under: About evaluation, Knowledge exchange | Tags: AEA, AEA Conference, American Evaluation Association, conference, evaluation, growing a business, Improve Group, J C Larreche, Leah Goldstein Moses, lessons on growing business, MN Evaluation Association, research, resources, Small Business, strategic planning, The Momentum Effect | 1 Comment »
Many organizations seek external evaluators – whether to gain objectivity, expertise, or capacity. They often find their external evaluator through a competitive process using a request for proposals (RFP), which invites multiple evaluators to submit a proposal.
We see hundreds of RFPs each year at the Improve Group, and have found that, when done right, an RFP clearly articulates what the organization wants to gain from the evaluation in the short and long term, describes the information the organization needs in order to decide which evaluator to work with, and leaves some flexibility for the evaluator to be creative in approaching the project.
If you are looking for advice on drafting an effective RFP, here are some resources that define what needs to be included, along with some excellent samples of actual RFPs:
- For writing an RFP specifically for evaluation, Janet Kerley has prepared a useful guide on “How to Prepare an Evaluation Scope of Work,” that outlines what needs to be included and gives a brief example of what your RFP could look like. About.com also has created an informational page for writing a more general RFP, not related to evaluation.
- A Spectrum Science blog posted by John Seng highlights the “Top 10 Tips for Writing a Great RFP.” It can be a great resource if you feel like your RFP might be missing something.
- The RFP Library at Techsoup.org also provides tips & techniques along with several samples of effective RFPs. Though the samples are not requests for an evaluator, they can still give you ideas on how to format your RFP.
Not really sure what to look for in an evaluator? Here are a few resources that may help you identify which evaluator is the right fit for the services that you seek:
- The Substance Abuse and Mental Health Services Administration provides a great tutorial detailing the steps you need to take before “Hiring an Evaluator”. It offers information on reasons to involve an evaluator, locating one, deciding what your organization wants and what it can afford, screening candidates, and then preparing to interview those candidates.
- Childtrends.org details “Five Steps for Selecting an Evaluator.” With a little less evaluation background provided than the previous resource, this page is a bit more guided and also goes into more detail on how to choose the most qualified evaluator.
- Chapter 5 of the “W.K. Kellogg Foundation Evaluation Handbook” offers a more in-depth process for planning an evaluation than the other two resources. The first section of the chapter covers four planning steps that will help you decide which direction to take leading up to choosing the right evaluator. The chapter begins on page 47.
By developing a clear RFP, and setting criteria in advance for choosing an evaluator, you will get higher-quality responses and will be more likely to find a good match. Once the process is complete, you should feel confident that you have made the right decision.
Has this guide been insightful in your RFP process? Are there any other resources that you would like to provide for others? Please feel free to add on.
Posted: October 8th, 2012 | Author: igmain | Filed under: About evaluation, Knowledge exchange | Comments Off
Below is a list of our Improve Group staff members and links to the sessions they will be presenting at Evaluation in Complex Ecologies: Relationships, Responsibilities, Relevance, the 26th Annual Conference of the American Evaluation Association, in Minneapolis next month.
Jill Lipski Cain – Ignite Your Data and Data Collection Methods: Ignite Presentations on Better Data and Best Practices
Danielle Dryke, Deborah Mattila & Stacy Johnson – Longitudinal Studies: Getting the Data You Need and Working With the Data You’ve Got
Elizabeth Radel Freeman – Moving Beyond the Standard Bar Graph: Data Visualization Basics for Evaluators
Stacy Johnson – Reception and Poster Exhibition
Leah Goldstein Moses & Samantha Hagel – Improving Evaluation Practice Management During Chaotic Economic Times: Three CEOs Reflect on Strategic and Innovative Diversification, Budgeting, and Employee Support and Development
Susan Murphy & Leah Goldstein Moses – To Bid or Not to Bid…that is just ONE of the Questions: Practical Tips and Tactics to a Successful Approach With Requests for Proposals
Elizabeth Radel Freeman & Stacy Johnson – Managing Evaluation Projects From a Distance: Strategies, Tips and Tools
Rebecca Stewart – Evaluation in Foundations: Why is it Sometimes Harder to Give than Receive?
Rebecca Stewart – Image Grouping: A New Pictorial, Participatory Method for Data Collection
Posted: September 26th, 2012 | Author: igmain | Filed under: About evaluation, Knowledge exchange, Staff Activities, Where are they now? | Tags: AEA conference 2012, conference presentations, Danielle Dryke, Deborah Mattila, Elizabeth Radel Freeman, Evaluation conference, IG staff presenting, Improve Group, Leah Goldstein Moses, Minnepapolis Convention Cnter, Rebecca Stewart, research, Samantha Hagel, sessions, Stacy Johnson, Susan Murphy | Comments Off
As a result of the Improve Group’s partnership with Cecelia Dodge & Associates, LLC, we are offering a series of articles this fall that highlight the ways data can be used to improve instruction and transform schools for the achievement of ALL learners. You can see our first blog, giving an overview of response to intervention, here.
A core assumption of responsive, tiered education systems is that high quality instruction is already in place. The idea of putting a system in place that intervenes when students need it relies on the notion that most of the students are already successful in the regular classroom. Since interventions are designed for individuals or small groups, they are much more labor intensive. The system is defeated when too many students need individualized interventions, because no school has the resources to provide that much individual attention. (You can hear a story about a similar challenge in the healthcare system here). Therefore, attention should first go to shoring up the core instructional program so that it supports most students. Schools should not move forward to implement a tiered system of intervention until this key foundation is in place.
Buffum, Mattos and Weber (2009) describe this core instructional program as “coherent and viable core curriculum that embeds ongoing monitoring for all students” (p.113), forming the foundation of Tier 1. Specifically, Tier 1 must include:
- Universal screening
- Standards-based, scientifically research-based curriculum
- Effective instruction that is engaging, rigorous and relevant
Once the core instructional program is shored up, a school can move forward with implementation, including these stages suggested by the National Association of State Directors of Special Education (NASDSE)’s Response to Intervention Blueprints:
1. Consensus building – where RtI concepts are communicated broadly to implementers and the foundational “whys” are taught, discussed and embraced.
2. Infrastructure building – where districts and sites examine their implementation against the critical components of RtI, identifying aspects that are being implemented well and gaps that need to be addressed. Infrastructure building centers on closing these practice gaps.
3. Implementation – where the structures and supports are put in place to support, stabilize and institutionalize RtI practices into a new “business as usual.”
Buffum, A., Mattos, M., & Weber, C. (2009). Pyramid Response to Intervention (p. 131). Bloomington, IN: Solution Tree.
National Association of State Directors of Special Education. (2008). Response to Intervention: Blueprints for Implementation. Accessed at http://www.nasdse.org
Posted: September 25th, 2012 | Author: igmain | Filed under: About evaluation, Guest authors, Knowledge exchange | Tags: analysis, article 2, Article series, Buffum, Cecelia Dodge, data, education, Improve Group, Leah Goldstein Moses, Mattos, NASDSE, Response to Intervention, schools, transforming schools, Weber | 1 Comment »
As a result of the Improve Group’s partnership with Cecelia Dodge & Associates, LLC, we are offering a series of articles this fall that highlight the ways data can be used to improve instruction and transform schools for the achievement of ALL learners.
Key Components of Response to Intervention
Data review is one of the critical components of many efforts focused on building tiered systems to help students with both academics and behavior. Schools across the country are at various stages of implementing these systems, referred to as Response to Intervention (RtI), Positive Behavior Interventions and Supports (PBIS), Multi-tiered Systems of Support (MTSS), or other locally created names. When implemented correctly, leaders expect these systems to transform schools into places where achievement is accelerated for all students. These systems have five key components:
1. Be clear about what we want students to learn and do (objectives)
2. Recognize when objectives are not met (evaluation)
3. Have a system of additional supports ready to be activated when necessary (plan)
4. Use the supports to intervene at a level appropriate to the student’s needs (act)
5. Loop back to the original objectives when necessary (reflect).
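Purely as an illustration, the objectives-evaluate-plan-act loop above can be sketched as a simple decision rule. The benchmark, cutoff, and screening scores below are invented for demonstration; real RtI decision rules combine multiple measures and professional judgment:

```python
# Hypothetical sketch: universal screening scores mapped to tiers of support.
BENCHMARK = 80        # objective: the score we want all students to reach
TIER2_CUTOFF = 60     # below benchmark but close: small-group support

def assign_tier(score):
    """Map a screening score to a tier of support (the plan/act steps)."""
    if score >= BENCHMARK:
        return 1      # core instruction is working for this student
    if score >= TIER2_CUTOFF:
        return 2      # targeted small-group intervention
    return 3          # intensive, individualized intervention

# Invented screening data for three students (the evaluate step).
screening = {"student_a": 92, "student_b": 71, "student_c": 48}
tiers = {name: assign_tier(score) for name, score in screening.items()}
print(tiers)  # {'student_a': 1, 'student_b': 2, 'student_c': 3}
```

The reflect step then closes the loop: after an intervention period, students are re-screened against the same objectives and tiers are reassigned.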
These components are met through high quality instruction, real-time student data, and clear decision-making processes that are informed by data. In our next blog posted tomorrow, Cecelia will describe the characteristics of high quality instruction and why it is necessary for successfully implementing responsive, tiered education systems. Following that, we’ll discuss the role of data and how we’re seeing districts make data-driven decisions.
Posted: September 24th, 2012 | Author: igmain | Filed under: About evaluation, Guest authors, Knowledge exchange, Learning opportunities | Tags: 5 key components, Article one of series, Cecelia Dodge, data driven decisions, education, Improve Group, Leah Goldstein Moses, Multi-tiered Systems of Support, PBIS, Positive Behavior Interventions and Supports, research, Response to Intervention, school districts | 1 Comment »