USC Community Based Learning Collaborative

Today I went to a meeting organized by USC’s Community Based Learning Collaborative (CBLC). People from across the university were in the room, all involved in some kind of community partnership linking faculty and students to community-based organizations. The JEP folks, who administer the Service Learning program that allows USC undergrads to gain course credit for community-based work, were there, as were people from the USC Volunteer Center and the university-wide Office of City and Community Relations.

JEP places about 2,000 students each year for service learning, through courses as well as literacy programs. There has been some JEP work on immigration (http://www.usc.edu/dept/LAS/jep/jep/immigrant.htm).

We were divided into tables by interest area: computer science, arts, education, health, and so on. I was at the education table. A series of panel presentations covered various approaches to assessing the impact of service learning, followed by round-table discussions at each table. Unfortunately, the format didn’t provide much opportunity to meet or network with people at other tables, but it was still interesting.

The first presentation was by Adrianna Kezar, about assessment and evaluation of Service Learning. Here’s a summary:

Adrianna Kezar, Evaluation of Service Learning:

  • When you measure outcomes, don’t just look at traditional measures. Also look at:
    - partnership
    - learning outcomes
    - student experience
    - impact
    - student and faculty perspectives
    - partner impact
  • In the vision for assessment, it’s important to determine the level you’re assessing. Is it the partnership level? School-wide? A cross-campus comparison? If it’s more local, action-based and community-based models might be better; if more global, standardized instruments might be better. Think about who the audience is for the evaluation.
  • Assessment areas for partnerships: common vision and goals; ongoing planning; policies; designated leads and roles; communication; decision-making processes. It’s best to track these things along the way, not just wait until the end. It’s important to be flexible and able to make changes as you go, and to understand what went well and what didn’t.
  • These kinds of assessments often don’t happen because of time and energy limitations. An office or institution at the university level can help, because it makes you sit down and do the evaluation.
  • How to use research to inform assessment: what are we measuring? Proponents of service learning and community partnerships know that it works, but then they have to prove it to the provost in order to get money. If you have to do this, one approach might be a pretest and posttest with control and experimental groups (see the sketch after this list), though that may not be the best way to test what people actually learn.
  • Learning outcomes: knowledge, behaviors, attitudes, and beliefs. Researchers have looked at these over a four-year college education, and understanding that research might help us think about how to assess outcomes. Service learning can have an impact that isn’t visible for years, which makes outcomes very hard to measure; portfolios and other qualitative measures can help, as can having students choose what they think was the most important outcome. We have been able to see measurable impact on prosocial attitudes in students who have participated in service learning.
  • Two goals: creating good citizens vs. using real-life systems as the best place to learn particular bodies of knowledge. These often get fuzzy; choose which one you are focusing on.
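
To make that pretest/posttest design concrete, here’s a minimal Python sketch with invented numbers (the 1-5 attitude scale, the scores, and the simple difference-in-gains calculation are all my own illustration, not anything presented at the meeting):

```python
# Pretest/posttest with control and experimental groups: a toy
# difference-in-gains calculation on invented attitude scores.

def mean(xs):
    return sum(xs) / len(xs)

# Hypothetical civic-attitude scores (1-5 scale) before and after a
# semester, for a service learning section (experimental) and a
# matched lecture section (control). All numbers are made up.
exp_pre  = [3.1, 2.8, 3.4, 3.0, 2.9]
exp_post = [3.9, 3.5, 4.1, 3.8, 3.6]
ctl_pre  = [3.0, 3.2, 2.9, 3.1, 3.0]
ctl_post = [3.2, 3.3, 3.0, 3.2, 3.1]

exp_gain = mean(exp_post) - mean(exp_pre)  # change in the SL group
ctl_gain = mean(ctl_post) - mean(ctl_pre)  # change in the control group

# Subtracting the control gain nets out whatever change both groups
# would have shown anyway over the semester.
print(f"experimental gain: {exp_gain:+.2f}")
print(f"control gain:      {ctl_gain:+.2f}")
print(f"estimated effect:  {exp_gain - ctl_gain:+.2f}")
```

A real evaluation would use a validated instrument, larger samples, and an actual significance test rather than a bare difference of means, which is part of Kezar’s point that this design may not capture what people actually learn.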

Eric Wat, Asian Pacific American Community Research Roundtable (APACRR)

The next presentation was by Eric Wat from APACRR, a project of the Asian Pacific Policy and Planning Council (http://www.a3pcon.org). He went over the core values they use to develop partnerships, as well as various kinds of collaborative projects.

  • Values: equality between campus and community; research as a tool for change in the community; shared standards for community-based research; community members as producers of knowledge; nurturing the next generation of API researchers.
  • Challenges and strengths for CBOs, faculty, and students.
  • Potential roles in a partnership, drawn from focus groups and committee decisions:
    - facilitator
    - researcher
    - disseminator
    - technical assistance provider
  • Think about the difference between short-term and long-term impact.
  • Group vs. individual projects; school year vs. summer; division of labor between the university and the CBO. Who manages the students? Who evaluates and assesses what?
  • Incentives for students: course credit; major requirements; $$$.
  • A regular service learning class is better than an independent study, and a requirement for a major is even better, but the best motivator is money: $3,000-4,000 can make a big difference in student motivation and might not be that big a dent in your budget.
In all, the presentation covered the elements to consider in setting up, designing, managing, and evaluating community-university partnerships.

Gary Kosman, America Learns (http://americalearns.net)

The next presentation, by Gary Kosman of America Learns, was about how to set up Service Learning projects with a deliverable.

Gary talked about the problem where a service learner is placed with a CBO and does good work, but what they produce is never actually used by the CBO. He showed us a draft ‘Application to Give a Service Learner an Incredible Learning Experience that Helps Our Organization in a Meaningful Way’: a form for the CBO to fill out that establishes project clarity, who the project manager will be, the project plan, and so on.
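
Here’s a hypothetical sketch of the fields such an application might capture, based on the items Gary mentioned (the class name and fields are my own invention, not America Learns’ actual form):

```python
# Hypothetical data structure for a CBO's service-learner application;
# field names are invented for illustration.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ServiceLearnerApplication:
    organization: str
    project_summary: str   # what the deliverable is and who will use it
    project_manager: str   # named staffer who will supervise the student
    milestones: List[str] = field(default_factory=list)  # the project plan

    def is_ready(self) -> bool:
        # A project with no named manager or no plan is a warning sign
        # that the student's work may never actually be used.
        return bool(self.project_summary and self.project_manager
                    and self.milestones)

app = ServiceLearnerApplication(
    organization="Example Neighborhood CBO",
    project_summary="A volunteer-shift scheduling tool our staff will use weekly",
    project_manager="Program Director",
    milestones=["Week 2: requirements", "Week 6: draft", "Week 10: handoff"],
)
print(app.is_ready())  # True
```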

Barry Boehm, Computer Science Department

Finally, Barry Boehm (http://en.wikipedia.org/wiki/Barry_Boehm) and another presenter talked about their experience developing a service learning course within the Computer Science department. Comp Sci 577 is a class where student programmers get credit for working with real-world community clients to develop software that can actually be used for community needs [I can’t resist: that sounds way better to me than Dr. Boehm’s previous gig as director of Information Science and Technology at the DoD’s Defense Advanced Research Projects Agency (DARPA)]. Read more about the community software development class at http://viterbi.usc.edu/news/news/2007/viterbi-computer-science.htm.

The class runs on weekly effort, progress, and risk reports; grading criteria for each artifact; and two live reviews per semester with the client in the room, where student and client together present how it’s going, what was done right, and what could improve. Client evaluations consist of 20 questions with 1-5 ratings plus comments, including ‘What did you learn about how to learn?’
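
As a rough sketch of how those client evaluations might be tallied (the question numbers and ratings below are placeholders, not the actual Comp Sci 577 instrument):

```python
# Toy tally of client evaluations: 20 questions rated 1-5, plus
# free-text comments handled separately. All data here is made up.
from statistics import mean

# One dict per client review, mapping question number -> rating.
# (Only 3 of the 20 questions are shown.)
reviews = [
    {1: 4, 2: 5, 3: 3},  # client A
    {1: 5, 2: 4, 3: 4},  # client B
]

# Report the mean rating per question across all clients.
for q in sorted({q for r in reviews for q in r}):
    ratings = [r[q] for r in reviews if q in r]
    print(f"Q{q}: mean {mean(ratings):.1f} across {len(ratings)} clients")
```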

Overall: a lot of interesting people and ideas in the room, and definitely some good contacts if we want to figure out how to hook USC undergrads into our project.
