So far in this series, we’ve addressed data security, consent, and bad survey questions. Now we turn to a problem that isn’t specific to evaluation: over-promising and under-delivering. Two factors, however, make evaluation work especially vulnerable to it:
- We rely on others to share information and data with us
- The data we get may not be structured the way we expect, requiring more manipulation than planned or leaving us unable to answer certain questions at all
Two practices have helped us keep our promises in line with what we can deliver:

- Clearly articulating what our clients expect, and when. After a couple of experiences in which we weren’t clear on what our clients wanted, and ultimately went through more iterations and drafts than either of us cared for, we created a tool to help us talk through expectations for each deliverable. Though very simple (it covers questions about audience, tone, style, and length), it helps ensure that we are in agreement about the final products.
- Knowing as much as we can about the available data before deciding how it will be used. For example, one client had a wide variety of data elements stored in different tables and documents, all of which informed one another. As we prepared a report outline for the client, we summarized which elements we had used and described where the information fell short of what we had hoped for.