
Was it worth it? Designing resources that people read.


I graduated from Weber State University in 2003. As an undergraduate health major, I consumed any information related to health and well-being. I understood, even then, that textbook authors were writing for students just like me; they had a specific audience and learning/communication objectives in mind. The costly textbooks were worth the investment.


Fast forward to 2022 at AKA. We create reports, toolkits, resources, and guides.

We are evaluators, researchers, professors, and leaders. People often look to us for reliable information, recommendations, and evidence. But resources have little value if they are not read or understood. Because of this, we allocate about 20% of our overall budget toward creating print and electronic information that people will read, know, and understand. Our talented designers at AKA make ideas and information come alive through visual design, colors, symbols, and images.

We want to know if the cost of creating resources resulted in a benefit. We want to know who read the resource, what they thought of it, what they liked, what they hated, and how they will use the information. But we have never systematically assessed the reach or impact of resources with our clients. One reason we have not done this is that it is hard. It is unfunded. And we might find out that our efforts and resources are not being spent in the best way. Every journey begins with just one step, so we created a process to assess print and electronic reports and information developed by AKA.


#1 What is the process?


Here is a simple process that we created.

  1. Compile a list of all designed reports generated within a given time period. We used Excel, but you can use anything that makes sense to you.

  2. Include the date, description, target audience, dissemination method, estimated reach, status, cost, and notes. Notes may include unstructured client feedback or information about how reports were used to create program changes, support grant applications, policy changes, and other distal outcomes.

  3. Develop a method for collecting feedback that documents the estimated reach and qualitative feedback from clients. Our clients are busy; they do not have time to complete feedback forms, surveys, and interviews about the materials we create. So we are left with questions: Did they love it? Hate it? Somewhere in between? Did it reach the intended audience? Was the main message clear? Will the information be used? How?

  4. Summarize the findings from the list based on the best available data (see the sketch after this list).
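To make steps 1, 2, and 4 concrete, here is a minimal sketch in Python of how the tracking list and summary might look. The field names, the example record, and the summarize helper are our own illustrative choices, not part of any actual AKA tooling; any spreadsheet or script with the same columns would work equally well.

```python
# Minimal sketch of steps 1, 2, and 4: a tracking list and a simple summary.
# Field names and the example record are illustrative, not real client data.
from statistics import mean

resources = [
    {
        "date": "2021-01-15",                 # hypothetical entry
        "description": "Bi-annual report",
        "target_audience": "Program staff",
        "dissemination": "Print and email",
        "estimated_reach": 40,
        "status": "Completed on time",
        "design_cost": 350.00,
        "notes": "Positive feedback from Program Director",
    },
    # ...one row per designed report or resource
]

def summarize(rows):
    """Summarize the best available data across all tracked resources (step 4)."""
    total = len(rows)
    with_feedback = sum(1 for r in rows if r["notes"])
    return {
        "resources": total,
        "with_feedback": with_feedback,
        "without_feedback": total - with_feedback,
        "total_design_cost": sum(r["design_cost"] for r in rows),
        "average_design_cost": mean(r["design_cost"] for r in rows) if rows else 0.0,
    }

print(summarize(resources))
```

In practice we keep the same columns in an ordinary Excel sheet; the sketch only shows how little structure is needed to start answering "Was it worth it?"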


#2 Was this a good use of resources?

We cannot assess every report or communication, so in this example, we will begin with just one client and one year. We seek objectivity in the process, even though this is difficult to achieve.

Here is an example of the first row of the Excel spreadsheet we created to record all designed reports and resources.



While we are not including all of the information in the Excel sheet here, from December 2020 to December 2021 our team created 29 designed reports or resources. Examples include bi-annual reports, literature reviews, clinical protocols, and graphical abstracts. No feedback was available for 38% (n = 11) of the resources; 41% (n = 12) received positive feedback; and 21% (n = 6) were reviewed and approved by funding agency staff. The Program Director was the primary person who provided feedback on reports, and staff who are community members provided feedback on some of the resources. The annual design cost was $12,078.50, printing and shipping costs were $5,600, and AKA associate time to create content for the various resources was estimated at $45,000. All reports were completed on time and sent to the client in both print and electronic formats. The average cost to create a resource was $1,968.22 (design cost plus associate time, divided by the 29 resources); this does not include printing or shipping costs. Resources varied in length and content (1 page to 56+ pages).
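For transparency, the figures above follow from simple arithmetic on the tracking list. The short sketch below reproduces them using only the numbers reported in this paragraph; the variable names are ours.

```python
# Reproducing the summary figures reported above (all numbers from the text).
n_resources = 29
design_cost = 12_078.50
printing_shipping = 5_600.00
associate_time = 45_000.00

# Feedback breakdown across the 29 resources
for label, n in [("no feedback", 11),
                 ("positive feedback", 12),
                 ("funder reviewed and approved", 6)]:
    print(f"{label}: {n / n_resources:.0%} (n = {n})")

# Average cost per resource, excluding printing and shipping
average_cost = (design_cost + associate_time) / n_resources
print(f"average cost per resource: ${average_cost:,.2f}")   # -> $1,968.22

# Total spend for the year, including printing and shipping
print(f"total spend: ${design_cost + printing_shipping + associate_time:,.2f}")
```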


Our client feels that designed reports are more likely to be read than reports that are not designed (for example, plain MS Word documents without colors, symbols, imagery, or interactive content via hyperlinks). Overall, we feel that designing reports was worth the investment.


Future work to explore the question, “Was it worth it?” will include informal follow-up meetings with clients six months or a year after a resource has been disseminated. In these conversations, we will collect qualitative feedback about how the resource was used, what readers liked, and what could be improved.


Lessons Learned

  • You may never have complete data; use the best available data to assess the reach of resources

  • Start small, with one client or one project

  • Document costs, dissemination methods, completion status, and notes in a systematic way

  • Continue to create, connect, and promote health and well-being

