Continuous Evaluation Report for 2021-22

Methodology, Takeaways, and Future Changes

Methodology
Participants from 16 long-form workshops/institutes run by IWT CLASP between January 2021 and August 2022 were targeted for feedback. Participants were contacted by email in November and December 2022. Out of a total of 186 possible unique faculty respondents (lower than the number of unique participants due to faculty contact info changing), there were 36 usable responses, for a response rate of about 19%. Respondents filled out an anonymous evaluation survey which included both quantitative and qualitative questions designed to gather information about workshop impact on faculty and their classrooms, as well as background information about respondents and their institutions. The evaluation has two goals:
  1. To provide CLASP with actionable feedback that can be used to direct improvements to workshop content and structure
  2. To measure CLASP impact on respondents’ teaching and their observations about CLASP impact on their classrooms with a high level of specificity
Results were cleaned, and qualitative feedback was grouped into categories for analysis. The summary and analysis of results below does not include every question on the survey. We attempted to split results by respondent background and workshop participation, though this proved difficult (more on this below in Future Changes). Responses are represented with charts, graphs, and visualizations, and tables with the data used to create the visual summaries are provided. Notes mark sections of the evaluation that we hope to change in the future.
Takeaways
  • The evaluation provided valuable information about faculty and why they attend CLASP workshops.  Some of this data, like that about why respondents chose to attend, begins to paint a broader picture about faculty needs and interests across the network.
  • Respondents feel that CLASP workshops are well aligned with their needs and interests.  Enough workshop content is new without being overwhelming, and that content is useful.
  • CLASP workshop themes are designed to provide faculty with student-centered, writing-based teaching practices that enable them to realize the goals of OSUN in their classrooms.  Respondents reported that workshops helped them significantly in making changes to their teaching practice around these themes.
  • Respondents were asked about the changes they observed after applying new methods, approaches or ideas learned in the workshop to their classrooms.  Respondents reported large, positive changes in student learning behavior across 9 target outcomes.
  • The evaluation provided detail about the differences in success for both workshop themes and classroom changes.  This detail enables CLASP to make targeted changes to workshops going forward in order to equalize success.
  • Respondents use CLASP workshops as a networking opportunity, and reach out to their fellow participants in order to collaborate outside of workshops.
  • Respondents share what they’ve learned in CLASP workshops widely at their institutions, and the reception is positive.  In both this and the above way, CLASP is a network builder for OSUN.
  • Respondents are overwhelmingly satisfied with CLASP workshops and want to attend more.  The evaluation points to areas of change and improvement in CLASP workshop content and structure, but these are changes designed to improve already very successful programming.
Future Changes
Our evaluation survey is designed to do the following three things:
  1. Measure that workshops have the results we anticipate. For example: Do workshops help participants develop practices for applying activities that increase student engagement in the classroom?
  2. Measure that those results have the classroom impacts we anticipate. For example: Does applying activities that increase student engagement in the classroom, using practices learned in the workshop, lead to faculty reporting an increase in student motivation to engage with course content and class activities?
  3. If the evaluation shows that workshops aren’t having the results we anticipate, or that the results aren’t having the classroom impacts we anticipate, allow us to differentiate and narrow down responses so that we can target groups of faculty to gather qualitative feedback about why things aren’t working as intended.
Two challenges to our current process have emerged as a result of the analysis:
  1. Respondents reported on #1 well after they had taken the workshop, and on #2 potentially before they had had time to apply the workshop.  We had to start somewhere with our first survey, and it was impossible to perfectly time distribution to faculty while also collecting enough responses.

    This makes it difficult to conclude things like “Faculty report that participating in workshops led to them learning X, trying to apply X in their classroom, and observing that applying X led to an increase in Y in their classroom”. We are limited to concluding something broader, like “Faculty who took a CLASP workshop observed an increase in Y in their classrooms at some point after taking a workshop where they recall learning about X”.

    Though the second conclusion above is still valuable (see the results and analysis below), it’s not as specific as we would like.  The goal going forward is to establish the link present in the first conclusion from specific aspects of workshop design, to specific aspects of workshop results, to specific aspects of classroom impacts.

  2. Not enough faculty filled out the evaluation for our evaluation process to really do #3, which is why it’s not included in the results and analysis below.

    We can’t adequately split up results to draw conclusions like “Faculty from X school feel that workshops are a poorer fit than faculty from Y school” or “Faculty who have been teaching for 5-19 years report a bigger change in confidence after our workshops than those who have been teaching for 0-4 or 20+”.

    This is a problem because it means that we can observe, for example, 19% of respondents feel that the new content in their workshop was “Somewhat” useful or less, but we can’t really use our data to figure out whether there is a pattern to those respondents.  This precludes us from targeting groups of faculty or contacts at institutions to gather more feedback about potential changes.
We will make two changes to the evaluation process going forward in order to address these challenges:  
  1. Ask participants in long-form workshops, like the ones listed in the results below, to complete two separate evaluation forms that are linked together:
    1. A “Results” evaluation immediately after the workshop, where they answer questions like How would you describe the fit between training and your professional needs and interests? or What aspects of the workshop had the most influence on changes in your knowledge, thinking or practice?
    2. A rolling “Impacts” evaluation a semester after the workshop, where they answer questions like To what extent have these changes helped increase student engagement with class discussions and assignments? and To what extent have you shared your workshop experience with colleagues or leaders on your campus? This will require us to collect respondents’ names and emails so that we can link the “Results” and “Impacts” evaluations together.  Collection can be done in such a way that respondents’ names and emails are not visible when looking at their responses, preserving the anonymity of the survey.
  2. Require participants in long-form workshops to fill out both evaluation surveys.  Our hope is that participants will understand our earnest desire to gather feedback that we can use to improve our workshop structure and content.  We want participants to see the process of providing feedback on both their workshop and classroom experience as part of the reflective practices that are integral to CLASP and as an opportunity to help shape our offerings in the future.
Thank you for your interest in IWT CLASP’s work and in the results of our first Continuous Evaluation Report. If you have any questions about our evaluation process, the report itself, or CLASP’s programming, please email us at [email protected].

Respondents

Workshops Evaluated

Respondents were asked to choose a workshop from the list below to base their evaluation on.  The large date range, from February 2021 to July 2022, was necessary to ensure a high response rate in our first evaluation.

Workshops are divided into the following categories.

 

  • IWT Annual Workshops  are typically a day long, though the July Weeklong Workshops last from Sunday to Friday.  Most of the workshops below were held online, with the exception of the July Weeklong Workshops 2022.  These workshops are offered through IWT and are open to non-OSUN participants.  Workshops are experiential, and typically focus on applying specific IWT practices to participants’ contexts.
  • Intensive Institutes – The Experiential Learning Institute is a semester-long institute on the design of courses that incorporate experiential learning.  The institute culminates in the design of such a course.  Some respondents to the evaluation have gone on to teach all or part of the course that they designed in the institute.
  • CLASP Workshops are slightly shorter than day-long IWT workshops, and exclusive to OSUN faculty.  The multi-hour workshops listed below and shorter 90-minute workshops have historically made up the bulk of IWT CLASP programming.

Faculty have to apply to participate in both the Experiential Learning Institute and the July Weeklong Workshops.  All other workshops are open access.

The pattern of responses below highlights some shortcomings that we will address by conducting more frequent evaluations in the future:

 

  1. Though respondents may be applying workshop learning to their classroom, it is likely hard for some to recall what specific practices or topics were covered in the workshop they chose to evaluate.
  2. Though respondents chose one workshop to base their evaluation on, it is possible that they took more than that workshop during the date range.
  3. Some workshop series, like the IWT New Kinds of Attention Workshops, received few or no responses.  This is slightly surprising, as New Kinds of Attention Workshops are particularly popular among OSUN faculty.  Possible explanations are faculty choosing a different workshop to base their evaluations on, or the relatively small total number of evaluations.

Workshops Open for Evaluation

Internal Category | Workshop Title | # Evals | % Evals
IWT Annual Workshops | 2021-02-05 - New Kinds of Attention Feb 2021 | 0 | 0.00%
 | 2021-03-05 - New Kinds of Attention Mar 2021 | 0 | 0.00%
 | 2021-04-23 - April Conference 2021 | 0 | 0.00%
 | 2021-07-12 - July Weeklong Workshops 2021 | 3 | 8.33%
 | 2021-10-01 - Writer as Reader Oct 2021 | 2 | 5.56%
 | 2021-11-05 - Writer as Reader Nov 2021 | 2 | 5.56%
 | 2022-02-04 - New Kinds of Attention Feb 2022 | 0 | 0.00%
 | 2022-03-04 - New Kinds of Attention Mar 2022 | 0 | 0.00%
 | 2022-04-28 - April Conference 2022 | 4 | 11.11%
 | 2022-07-10 - July Weeklong Workshops 2022 | 8 | 22.22%
Intensive Institutes | 2021-01-11 - 2021 Experiential Learning Institute | 7 | 19.44%
 | 2022-02-24 - 2022 Experiential Learning Institute | 1 | 2.78%
CLASP Workshops | 2021-04-07 - Practical Introduction to Facilitating Classroom Debates | 2 | 5.56%
 | 2021-07-19 - Debate in the Classroom: Writing, Speaking, Teaching | 1 | 2.78%
 | 2022-02-03 - Teaching Critical Thinking through Writing - Section 1 | 3 | 8.33%
 | 2022-03-03 - Teaching Critical Thinking through Writing - Section 2 | 3 | 8.33%
Total | - | 36 | 100.00%
Demographics

Respondents came from 12 separate OSUN institutions (combining BHSECs).  Distribution of respondents across OSUN institutions mirrors participant distribution in IWT CLASP workshops during the same period fairly closely.

Primary OSUN Institution | # Evals | % Evals
Al-Quds Bard College of Arts and Sciences | 4 | 11%
American University of Central Asia | 5 | 14%
Bard College | 4 | 11%
Bard College Berlin | 2 | 6%
Bard High School Early Colleges | 3 | 8%
BRAC University | 8 | 22%
Central European University | 3 | 8%
European Humanities University | 1 | 3%
Haitian Education and Leadership Program | 2 | 6%
Universidad de los Andes | 2 | 6%
University of the Witwatersrand | 1 | 3%
Total | 36 | 100%

Years Teaching

Respondents were asked to provide an exact number; responses are grouped.

Years Teaching | # Evals | % Evals
0-4 | 9 | 25.00%
5-9 | 9 | 25.00%
10-19 | 9 | 25.00%
20+ | 9 | 25.00%
Grand Total | 36 | 100.00%

Half of respondents have been teaching for less than 10 years. The target audience for longer term projects like the CLASP Fellows Program is primarily early to mid career faculty. We typically think of faculty who have been teaching for 5-19 years as falling into this category.

 

Years at Primary OSUN Institution

Respondents were asked to provide an exact number; responses are grouped.

Years at Institution | # Evals | % Evals
0-4 | 18 | 50%
5-9 | 11 | 31%
10-19 | 4 | 11%
20+ | 3 | 8%
Grand Total | 36 | 100%

Most respondents have been at their institutions for considerably less time than they have been teaching overall.

Respondents were given a list of teaching fields and asked to select any that best described the primary fields in which they teach. Their responses were distilled into the broader groups below for the purpose of analysis.

The teaching field selection process is one that will be refined in future evaluations. The default set of fields in the survey platform includes a range of specificities (“Humanities” and “Theology” are both options) and some fields with a great deal of overlap (“English as a Second Language” and “Foreign Language”), as well as an “Other” field. In the future, we plan to ask respondents to select as many of the below broad categories as they feel are applicable, and then write in their specific fields.

Primary Teaching Fields

Respondents could select more than one option from a more specific list of fields.  Responses are grouped into 6 broad areas, plus “Other”.

Subject Area | # Evals | % Evals
Literature / Writing | 16 | 44%
Foreign Languages | 11 | 31%
Social Sciences | 7 | 19%
Other | 6 | 17%
History | 5 | 14%
STEM | 3 | 8%
Arts | 1 | 3%

Other Roles Beyond Teaching
Respondents answered in a text box, and not all respondents listed a field. Responses are grouped into 3 broad areas.

Other Roles | # Evals | % Evals
Oversight/Policy Making | 10 | 28%
Professional Dev/Capacity Building | 2 | 6%
Ed Tech/Support | 2 | 6%
No Other Roles | 22 | 61%

Most respondents don’t play any role at their institution beyond teaching. Of those who do play a different role, administration is common: nearly a third of respondents are involved in oversight/policy making at their institutions.

Workshop Interest

Respondents were asked three basic questions about why they attended the workshop under evaluation.

 

How did you learn about the workshop?
Respondents could select more than one option.

From | # Evals | % Evals
Chair/Institution leader | 14 | 39%
Email from IWT/CLASP | 14 | 39%
OSUN Newsletter/Email | 11 | 31%
Peer | 4 | 11%

An “Other” option was provided for the above question, but no respondents selected it. It appears that the current mix of broader mailings and targeted communications to administrators is working well.

How many IWT CLASP activities did you attend prior to this one?

Prior Activities | # Evals | % Evals
This is my first | 14 | 39%
2-3 | 10 | 28%
4 or more | 7 | 19%
1 | 5 | 14%
Total | 36 | 100%

The majority of respondents (61%) had attended at least one workshop prior to the one they evaluated. It will be interesting to monitor this number over time.

 

Why did you choose to attend?
Respondents could select more than one option.

Why? | # Evals | % Evals
Improve current teaching skills | 26 | 72%
Learn new teaching methods | 26 | 72%
Broaden network | 23 | 64%
Learn new teaching skills | 22 | 61%
Gain confidence in student-centered teaching | 19 | 53%
Learn new teaching theory | 17 | 47%

The two most popular reasons for attending were to “Improve current teaching skills” and to “Learn new teaching methods”, while the least popular reason was to “Learn new teaching theory”. This aligns with workshop content: workshops are highly experiential, and though they have a basis in teaching theory, it is not emphasized.

 

Despite the fact that all workshops focus on a dimension of student-centered teaching, only half (53%) of respondents listed “Gain confidence in student-centered teaching” as a reason they attended. A possible explanation is that respondents are already fairly confident in student-centered teaching, but are looking to add new methods to their practice or to refine their existing methods. This explanation is reinforced by the two most popular reasons for attending.

Workshop Impact on Faculty

Workshop Fit

Fit between training and respondents’ professional needs and interests

Fit | # Evals | % Evals
Excellent | 16 | 44%
Good | 16 | 44%
Fair | 4 | 11%
Poor | 0 | 0%
Total | 36 | 100%

The vast majority (88%) of respondents felt that the workshop evaluated had a “Good” or “Excellent” fit with their professional needs or interests. No faculty felt that the workshops were a “Poor” fit. Participants tended to feel that longer workshops were better fits than shorter ones, possibly because longer workshops have a greater chance of covering personally relevant content.

 

The fit of workshop content to respondents’ needs and interests is not necessarily a measure of participant satisfaction.

Relationship to Content

How much of the content (readings, lectures, methods, skills, ideas, approaches) was new to you?

New Content | # Evals | % Evals
All/Almost all | 4 | 11%
Most | 16 | 44%
About half | 14 | 39%
A little | 1 | 3%
None at all | 1 | 3%
Grand Total | 36 | 100%

Ideally most participants in a workshop will be in a position to both acquire new knowledge and expand on their current understanding.  It appears that most respondents are positioned in this way.  83% of respondents felt that “Most”  or “About half” of the content in their workshop was new, putting them in a good position to learn new content and draw connections to their existing knowledge without being either bored or overwhelmed.

 

Which of the following best describes the usefulness of the new content?

Usefulness | # Evals | % Evals
Greatly | 10 | 28%
Very much | 19 | 53%
Somewhat | 5 | 14%
A little | 1 | 3%
Not at all | 1 | 3%
Grand Total | 36 | 100%

81% of respondents felt that the new content in their workshop was either “Greatly” or “Very much” useful.

Reflection

To what extent has the workshop helped you to reflect on your teaching?

Helped Reflect | # Evals | % Evals
Greatly | 12 | 33%
Very much | 15 | 42%
Somewhat | 8 | 22%
A little | 1 | 3%
Not at all | 0 | 0%
Grand Total | 36 | 100%

All respondents reported that the workshop helped them reflect on their teaching.  Despite the fact that 75% of respondents felt that the workshop “Very much” or “Greatly” helped them to reflect, this is still an area that could be improved.  The practices participants are intended to take from workshops to their classrooms are reflective.  IWT Annual Workshops always conclude with time to reflect on participants’ classrooms and actionable ways to implement the content covered in the workshop.  This structure is one that can be incorporated into other workshops.

Themes and Impacts

What aspects of the workshop had the most influence on changes in your knowledge, thinking or practice?

Respondents were instructed to select up to three options.

Workshop Content Category | # Evals | % Evals
Practical Exercises | 25 | 69%
Peer Discussion or Presentations | 24 | 67%
Facilitator Modeling | 16 | 44%
Faculty Presentations | 11 | 31%
Readings | 11 | 31%

Our goal is that all workshops will lead to an increase in participant ability to incorporate the following into their classrooms: Writing in Curricula and Student-Centered Activities.  Respondents were asked to describe their confidence in incorporating these two universal themes before and after their participation.

 

 

Understanding the rationale, key terms, and methods for incorporating Writing in Curricula for my discipline.

Confidence Change | # Evals | % Evals
Before: A little; After: Moderate | 2 | 6%
Before: A little; After: Very | 1 | 3%
Before: Moderate; After: Moderate | 2 | 6%
Before: Moderate; After: Very | 11 | 31%
Before: Moderate; After: Extremely Confident | 3 | 8%
Before: Very; After: Very | 7 | 19%
Before: Very; After: Extremely Confident | 5 | 14%
Before: Extremely Confident; After: Extremely Confident | 4 | 11%

Visualization of participants’ change in confidence in incorporating Writing in Curricula.
The thickness of a bar indicates the number of participants who reported that range of change. For example, 31% of respondents reported a change in confidence from “Moderate” to “Very”, so that bar is the thickest. Arrows indicate the direction of change; participants who reported no change are represented with a box.

Understanding the rationale, key terms and methods for incorporating more student-centered activities in my discipline.

Confidence Change | # Evals | % Evals
Before: A little; After: Moderate | 1 | 3%
Before: A little; After: Very | 2 | 6%
Before: Moderate; After: Moderate | 2 | 6%
Before: Moderate; After: Very | 6 | 17%
Before: Moderate; After: Extremely Confident | 2 | 6%
Before: Very; After: Very | 8 | 22%
Before: Very; After: Extremely Confident | 6 | 17%
Before: Extremely Confident; After: Extremely Confident | 8 | 22%

Visualization of participants’ change in confidence in incorporating Student-Centered Activities.

The thickness of a bar indicates the number of participants who reported that range of change. For example, 22% of respondents reported continuing to feel “Very” confident after the workshop, so that bar is the thickest. Arrows indicate the direction of change; participants who reported no change are represented with a box.

Most CLASP workshops incorporate more specific themes in addition to the universal themes of Writing in Curricula and Student-Centered Activities.  Respondents were asked to choose which of the following 6 themes were part of the workshop they participated in:

 

  1. New pedagogical methods for integrating writing into the classroom
  2. Skills necessary to apply one or more writing activities in the classroom
  3. Practices for preparing activities that bring students’ lived experiences into the class
  4. Practices for applying activities that increase student engagement in the classroom
  5. Practices for applying activities that give students more control and responsibility for their learning
  6. Practices to generate peer-to-peer exploration of diverse perspectives and learning from each other

Number of respondents that indicated each theme was part of their workshop

Theme | # Selected | % Selected
Applying activities that give students more control and responsibility | 27 | 75%
Applying activities that increase student engagement | 23 | 64%
Skills to apply one or more writing activities | 23 | 64%
Preparing activities that bring students' lived experiences | 21 | 58%
Generate peer-to-peer exploration of diverse perspectives | 20 | 56%
New pedagogical methods for integrating writing | 20 | 56%

If a respondent indicated that one of the above themes was part of their workshop they were asked to answer the following “Extent” questions about the efficacy of the work they did around that theme:

 

 

  1. To what extent has the workshop helped you understand new pedagogical methods for integrating writing into the classroom?
  2. To what extent has the workshop helped you develop skills necessary to apply one or more writing activities in the classroom?
  3. To what extent has the workshop helped you develop practices for preparing activities that bring students’ lived experiences into the class?
  4. To what extent has the workshop helped you develop practices for applying activities that increase student engagement in the classroom?
  5. To what extent has the workshop helped you develop practices for applying activities that give students more control and responsibility for their learning?
  6. To what extent has the workshop helped you develop practices to generate peer-to-peer exploration of diverse perspectives and learning from each other?

The split evaluation process we intend to implement going forward will enable us to draw a clear line between:

 

  • The extent to which the workshop was designed to help participants with a specific aspect of their classroom practice
  • The extent to which respondents felt a workshop helped them with that aspect of their classroom practice
  • The extent to which respondents observed a positive change in their classrooms as a result of implementing that practice

See Future Changes for more details.

Extent to which respondents felt the workshop helped them for each of the six additional themes.

Percentages are of the number of respondents who selected each theme.

Theme | Greatly | Very Much | Somewhat | A little | Not at all
New pedagogical methods for integrating writing (N=20) | 9 (45%) | 8 (40%) | 3 (15%) | 0 (0%) | 0 (0%)
Skills to apply one or more writing activities (N=23) | 10 (43%) | 8 (35%) | 3 (13%) | 2 (9%) | 0 (0%)
Preparing activities that bring students' lived experiences (N=21) | 10 (48%) | 6 (29%) | 4 (19%) | 1 (5%) | 0 (0%)
Applying activities that increase student engagement (N=23) | 8 (30%) | 11 (41%) | 8 (30%) | 0 (0%) | 0 (0%)
Applying activities that give students more control and responsibility (N=27) | 12 (52%) | 4 (17%) | 6 (26%) | 1 (4%) | 0 (0%)
Generate peer-to-peer exploration of diverse perspectives (N=20) | 6 (30%) | 7 (35%) | 4 (20%) | 2 (10%) | 1 (5%)

The majority of respondents felt that the workshop helped them “Greatly” or “Very Much” across all 6 additional themes.  There is a fairly large spread between the most successful theme, “Understanding new pedagogical methods for integrating writing into the classroom” (85% “Very much” or higher), and the least successful theme, “Generating peer-to-peer exploration of diverse perspectives and learning from each other” (65% “Very much” or higher).  This 20-point spread is something that CLASP is working to address from multiple angles, particularly through the CLASP Fellows Program.  The first cohort of CLASP Fellows will graduate this year (2023) and will internationalize the pool of faculty available to lead CLASP workshops.

Applying Workshop Learning

To what extent have you tried to apply new methods, approaches or ideas learned in the workshop to your classroom?

Extent Applied | # Evals | % Evals
Greatly | 10 | 28%
Very much | 10 | 28%
Somewhat | 11 | 31%
A little | 3 | 8%
Not at all | 2 | 6%

Note that two respondents said they did not try at all to apply new methods, approaches, or ideas learned in the workshop to their classrooms. All further responses in this section are therefore out of a total of 34 respondents who did try to apply something. Because the survey was not distributed at a fixed point in time after workshop participation, some respondents may not yet have had time to apply the workshop “Greatly” or “Very much”.  The extent to which respondents tried to apply the workshop in their classrooms is therefore not necessarily representative of their motivation to apply content.

Workshop Outcomes

After applying the methods, approaches, and ideas learned in the workshop to their classrooms, faculty should observe a positive change in students’ learning.  In order to measure this change, respondents were asked to reflect on 9 specific learning outcomes that all CLASP workshops aim to affect.  The outcomes are as follows:

 

  1. Student interaction with each other
  2. Student engagement with class discussion and assignments
  3. Student motivation to engage with course content and class activities
  4. Student ability to be articulate orally and in written work
  5. Student confidence in voicing ideas and opinions
  6. Student self-reflection
  7. Student ability to listen well
  8. Student open mindedness to other perspectives and ideas
  9. Student empathy with classmates and their experiences

Respondents were asked to what extent they observed an increase in each of the 9 outcomes after applying the workshop to their classrooms.

 

Observed increase in all outcomes.

This chart does not take into account the extent to which faculty tried to apply the workshop to their classrooms.  A visualization which takes this into account follows below. 

Outcome | Greatly | Very Much | Somewhat | A little | Not at all
Student self-reflection | 12 (35%) | 13 (38%) | 7 (21%) | 1 (3%) | 1 (3%)
Student confidence in voicing ideas and opinions | 9 (26%) | 15 (44%) | 6 (18%) | 4 (12%) | 0 (0%)
Student engagement with class discussions and assignments | 9 (26%) | 14 (41%) | 10 (29%) | 1 (3%) | 0 (0%)
Student open mindedness to other perspectives and ideas | 9 (26%) | 13 (38%) | 9 (26%) | 3 (9%) | 0 (0%)
Student motivation to engage with course content and class activities | 10 (29%) | 12 (35%) | 9 (26%) | 3 (9%) | 0 (0%)
Student ability to be articulate orally and in written work | 7 (21%) | 12 (35%) | 10 (29%) | 4 (12%) | 1 (3%)
Student empathy with classmates and their experiences | 9 (26%) | 9 (26%) | 14 (41%) | 2 (6%) | 0 (0%)
Student ability to listen well | 5 (15%) | 11 (32%) | 15 (44%) | 3 (9%) | 0 (0%)
Student interaction with each other | 5 (15%) | 10 (29%) | 17 (50%) | 0 (0%) | 2 (6%)

Average observed increase in all outcomes.

This chart does not take into account the extent to which faculty tried to apply the workshop to their classrooms.  A visualization which takes this into account follows below. 

Outcome (0 is "Not at all", 4 is "Greatly") | Average Increase
Student self-reflection | 3.00
Student engagement with class discussions and assignments | 2.91
Student confidence in voicing ideas and opinions | 2.85
Student motivation to engage with course content and class activities | 2.85
Student open mindedness to other perspectives and ideas | 2.82
Student empathy with classmates and their experiences | 2.74
Student ability to be articulate orally and in written work | 2.59
Student ability to listen well | 2.53
Student interaction with each other | 2.47

Participants who try to apply CLASP workshops to their classrooms observe a large positive increase in all 9 learning outcomes. The charts above are an average of all respondents. We expect the observed increase in learning outcomes to scale with the extent faculty try to apply the workshop to their classrooms. Responses show that this is the case.
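The "Average Increase" figures above are consistent with a simple weighted mean over the 0-4 response scale. As a minimal sketch (the function and variable names are illustrative, not part of the evaluation tooling), the average for "Student self-reflection" can be reproduced from its observed-increase counts (12, 13, 7, 1, 1 for "Greatly" through "Not at all", N=34):

```python
# Weighted mean of Likert-style responses on a 0-4 scale,
# where "Greatly" = 4 and "Not at all" = 0.
SCALE = {"Greatly": 4, "Very much": 3, "Somewhat": 2, "A little": 1, "Not at all": 0}

def average_increase(counts: dict) -> float:
    """Return the mean response value, weighting each scale point by its count."""
    total = sum(counts.values())
    return sum(SCALE[label] * n for label, n in counts.items()) / total

# Counts from the "Student self-reflection" row of the observed-increase table.
self_reflection = {"Greatly": 12, "Very much": 13, "Somewhat": 7,
                   "A little": 1, "Not at all": 1}
print(round(average_increase(self_reflection), 2))  # → 3.0
```

The same calculation over the 10 respondents who applied the workshop "Greatly" yields the second column of the comparison table below it.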

Comparison of average observed increase in all outcomes for all respondents vs those who “Greatly” tried to apply the workshop to their classroom (n=10).

Outcome (0 is "Not at all", 4 is "Greatly") | Average Increase | Average Increase for respondents who applied "Greatly"
Student self-reflection | 3.00 | 3.60
Student engagement with class discussions and assignments | 2.91 | 3.40
Student confidence in voicing ideas and opinions | 2.85 | 3.50
Student motivation to engage with course content and class activities | 2.85 | 3.70
Student open mindedness to other perspectives and ideas | 2.82 | 3.40
Student empathy with classmates and their experiences | 2.74 | 3.40
Student ability to be articulate orally and in written work | 2.59 | 3.30
Student ability to listen well | 2.53 | 2.90
Student interaction with each other | 2.47 | 3.20

Faculty who “Greatly” tried to apply the workshop to their classrooms observed a greater increase in learning outcomes than the average.  Most of these differences are quite large, with all but “Student ability to listen well” at least half a step higher than the average.

Comparison of average observed increase by the extent respondents tried to apply the workshop to their classrooms.

The 4 learning outcomes with the overall highest and lowest average observed increases are shown.

For all 4 outcomes shown, respondents who tried to apply the workshop in their classrooms only “A little” report the same increase in outcomes of “Somewhat”. The trends in outcome increase vs. extent applied diverge as respondents apply the workshops more and more, which indicates that some outcomes are harder to achieve than others. This is valuable information as CLASP refines its workshops: it suggests that workshops may need to focus more on outcomes that faculty have more difficulty achieving, or that the current techniques used in workshops are a tighter fit for some outcomes than they are for others.

Barriers to Application

Respondents who tried to apply the workshop to their classrooms (34/36) were asked to describe potential barriers to further applying workshop knowledge in their classrooms. These self-described barriers were condensed into three categories: “Time Constraints”, “Institutional Barriers”, and “Student Resistance”.

The evaluation will be refined in the future to also ask respondents who did not try to apply the workshop to their classrooms about potential barriers to application.

Barriers to Application.

17/34 or 50% of respondents listed a barrier.

View as table
Barrier | # Evals | % Evals
Institutional Barriers | 12 | 35%
Time Constraints | 8 | 24%
Student Resistance | 4 | 12%
Listed 1+ Barriers | 17 | 50%

Respondents who tried to apply the workshop to their classrooms (34/36) were asked to describe factors or supports that might facilitate their effort to further apply workshop knowledge in their classrooms. These self-described supports were condensed into four categories: “Changes to Teaching Practice”, “Cross-Campus Collaboration”, “Further CLASP Participation”, and “Institutional Changes”.

Likewise, the evaluation will be refined in the future to also ask respondents who did not try to apply the workshop to their classrooms about potential supports.

Supports to Application.

17/34 or 50% of respondents listed a support.

View as table
Support | # Evals | % Evals
Institutional Changes | 7 | 21%
Further CLASP Participation | 5 | 15%
Changes to Teaching Practice | 3 | 9%
Cross-Campus Collaboration | 3 | 9%
Listed 1+ Supports | 17 | 50%

Peer Exchange, Networking, and Collaboration

To what extent did your interaction with peers in the workshop inform your knowledge or perspectives?

View as table
Extent | # Evals | % Evals
Greatly | 10 | 28%
Very much | 13 | 36%
Somewhat | 11 | 31%
A little | 2 | 6%
Not at all | 0 | 0%
Grand Total | 36 | 100%

To what extent did learning with peers from diverse backgrounds and cultures improve your capacity to navigate diversity?

View as table
Extent | # Evals | % Evals
Greatly | 12 | 33%
Very much | 15 | 42%
Somewhat | 5 | 14%
A little | 2 | 6%
Not at all | 2 | 6%
Grand Total | 36 | 100%

The majority of respondents feel that their peers “Greatly” or “Very much” informed their knowledge and perspectives (64%) and improved their capacity to navigate diversity (75%).  Meaningful interaction with faculty from across OSUN in participatory workshops is one of the most valuable aspects of CLASP’s work.

How likely are you to reach out to other workshop participants to share experiences, seek advice or plan collaboration?

View as table
Likelihood | # Evals | % Evals
Extremely Likely | 8 | 22%
Very | 12 | 33%
Moderately | 10 | 28%
A little | 2 | 6%
Not at all Likely | 4 | 11%
Grand Total | 36 | 100%

The majority of respondents (56%) are “Extremely” or “Very” likely to reach out to other workshop participants to share experiences, seek advice, or plan collaboration.  This suggests that CLASP is a significant driver of collaboration across OSUN.

Knowledge of OSUN and Institutional Impact

To what extent have you shared your workshop experience with colleagues or leaders on your campus?
Respondents could select more than one option.


View as table
Party | # Evals | % Evals
Yes, with Colleagues | 32 | 89%
Yes, with Dean | 11 | 31%
Yes, with Department Chair | 10 | 28%
Not at all | 4 | 11%
Other | 4 | 11%
Yes, with OSUN Chief Academic Officer | 2 | 6%

Nearly all (89%) of respondents shared their workshop experience with colleagues or leaders on their home campuses.  This points to excitement about workshop content on the part of participants, but even more importantly to CLASP’s reach beyond the faculty who participate in a given workshop.  Particularly striking is that nearly a third of participants shared their workshop experience with those in leadership positions at their institutions.

How interested were they in what you learned?

View as table
Response | # Evals | % Evals
Interested in broadly introducing on campus | 7 | 19%
Enthusiastic and supportive | 17 | 47%
Mild interest | 8 | 22%
Not at all | 0 | 0%
Didn't Share | 4 | 11%
Grand Total | 36 | 100%

Not only are respondents sharing their workshop experience, they are being met with a warm reception when they do: 66% of respondents report that those they shared with were “Enthusiastic and supportive” or “Interested in broadly introducing on campus”.

To what extent has your knowledge about OSUN improved as a result of your participation in this IWT CLASP workshop?

View as table
Improvement | # Evals | % Evals
Greatly | 16 | 44%
Very much | 10 | 28%
Somewhat | 6 | 17%
A little | 2 | 6%
Not at all | 2 | 6%
Grand Total | 36 | 100%

CLASP promotes knowledge about OSUN.  72% of respondents felt that their knowledge about OSUN improved either “Greatly” or “Very much” as a result of the workshop.  It is likely that CLASP is an indirect promoter of knowledge about OSUN across institutions given that 89% of respondents share their workshop experience with colleagues or leaders.

How would you describe the reputation of IWT CLASP on your home campus?

View as table
Reputation | # Evals | % Evals
Positive | 27 | 75%
Negative | 0 | 0%
Not well known | 9 | 25%
Grand Total | 36 | 100%

Where CLASP is well known on a campus, respondents feel that it has a positive reputation.  Campus knowledge of CLASP should improve naturally as OSUN continues to mature and participants continue to share their workshop experiences.  It will be particularly interesting to monitor the share of campuses where CLASP is not well known over time.

Satisfaction and Future Plans

How would you describe your overall satisfaction with your participation in this event?

View as table
Satisfaction | # Evals | % Evals
Very satisfied | 22 | 61%
Satisfied | 11 | 31%
Neither | 1 | 3%
Dissatisfied | 2 | 6%
Very Dissatisfied | 0 | 0%
Grand Total | 36 | 100%

How likely would you be to participate in another IWT CLASP event?

View as table
Likelihood | # Evals | % Evals
Extremely Likely | 21 | 58%
Very | 10 | 28%
Moderately | 3 | 8%
A little | 1 | 3%
Not at all Likely | 1 | 3%
Grand Total | 36 | 100%

92% of respondents were satisfied with the workshop they participated in. Nearly two-thirds of respondents (61%) were “Very satisfied”.


86% of respondents are “Extremely” or “Very” likely to participate in another IWT CLASP workshop.


Thank you for your interest in IWT CLASP’s work and in the results of our first Continuous Evaluation Report. If you have any questions about our evaluation process, the report itself, or CLASP’s programming, please email us at [email protected].