What’s in a Name?

For more than a decade, colleges and universities have been working to develop survey tools to inventory civic engagement programs and initiatives on their campuses. Duke will soon issue its third attempt to capture an overview of civic engagement across the university. The first summary of volunteer, service-learning, and engagement programs was completed in the mid-1990s. In 2003-2004, John Burness, Duke’s former Senior Vice President for Public Affairs and Government Relations, commissioned a second inventory of civic engagement. [1] Collecting data for these inventories is a daunting task, especially given that there are currently no agreed-upon standards across institutions for collecting, reporting, or interpreting data on civic engagement.[2]

Most college and university campuses do not have electronic systems in place to consistently and reliably pull relevant data. This problem is compounded by the fact that units, schools, and constituencies on any given campus may not agree on what “civic engagement” means at their college or university, much less is there any consensus about its meaning across institutions. At present, most campuses must invent their own definition of what civic engagement means on their particular campus and develop survey instruments to gather relevant data; sometimes they rely on instruments developed by another institution. Some information can be reliably culled from surveys that colleges and universities routinely issue for other purposes. For example, to collect and compare data on civic engagement across state system campuses, the University of North Carolina’s Task Force on Community Engagement was able to pull and filter data collected for annual surveys issued by campus admissions and registrar’s offices. The UNC Task Force summarized this data in its recently published first annual summary report of civic engagement across UNC system campuses (see Engagement Report, page 96).

Like Duke, several campuses in the UNC system developed ad hoc surveys over the years to gather information for their institutions’ applications to the national President’s Higher Education Community Service Honor Roll, first launched in 2006. Four years later, many campuses attempted to collect comprehensive data on civic engagement using a different framework, developed by the Carnegie Foundation for applications to its Elective Community Engagement Classification, which the foundation issues every five years, beginning in 2010.

The Corporation for National and Community Service developed a survey framework, different from the Carnegie Foundation’s, for institutions to use as part of their applications for recognition on its Honor Roll.

Even if we agree that civic engagement practiced on our campuses must recognize teaching and learning as important priorities, we still face the challenge of accounting for nuances in the learning goals and outcomes of individual courses, as Emily Janke, who chaired UNC’s Task Force on Community Engagement, notes in the recent UNC report. Janke expresses a growing concern among many colleges and universities about whether survey tools designed to gather quantitative data can meaningfully assess the educational or community value of university engagement (Engagement Report, page 8). An appendix to the UNC report states:

“While there is great value, short- and long-term, to university and community collaborators for student involvement in and with the community, it is important to begin to understand the qualitative differences between these experiences as the learning goals and outcomes vary. For example, developing civic attitudes and habits is an expressed outcome of the majority of service-learning courses, but is not necessarily an expressed value of all internships. Internships frequently have a greater emphasis on and explicit goals for career clarification, readiness and post-graduation placement” (Engagement Report, page 96).

How do we effectively assess the meaningful nuances in both the educational goals and the community impacts of different university engagement programs and initiatives? What steps are necessary to develop a survey instrument to assess civic engagement across institutions in higher education?

As Janke and her Task Force point out, current measures also don’t take into account variation in the way engaged learning is taught, even within a single institution. An ideal community-based learning course might be measured by the degree to which it demonstrates attributes such as duration, reflection, integration with curricular requirements, and priorities and service to partner communities. But currently there is no consensus on using these attributes as assessment standards. Happily, we can look forward to the soon-to-be-released prototype of a new instrument for tracking civic engagement activities, The Community Engagement Collaboratory, designed by Janke and her colleagues (Medlin and Holland) and developed by TreeTop Commons. This instrument will provide the basis for targeted and comprehensive measures and evaluations of community and university contributions and outcomes.

 

[1] This information was provided by Elaine Madison, Associate Director for Programs, DukeEngage and Director of Duke’s Community Service Center.

[2] At our request, Duke’s Office of Institutional Research sent an email request in June 2014 to the thirty-five member institutions of the Consortium on Financing Higher Education, asking them to share civic engagement survey instruments they had developed on their campuses. Several institutions responded that they had no such instrument. The surveys we did receive varied greatly in how and what they chose to measure.