The Impact Report of the Quality Enhancement Plan

Motlow State Community College was approved for and implemented a QEP project entitled Internationalizing the Curriculum—Improving Student Learning through International Education: Preparing Students for Success in a Global Society. The intent of this project was to create international modules for placement into selected courses and then continue to add new courses each year; the outcome was to increase student awareness of diversity and connect students to the world beyond the classroom.  

From the initial startup in 2008 through the final assessments in 2013, the QEP project allowed students to experience unique and diverse learning opportunities; furthermore, it allowed college personnel to unite around a common project and created an atmosphere that fostered growth in the areas of project implementation and assessment design.  Throughout this project, the College remained true to the established student learning outcomes but, based on the data, realized that the implementation and assessment measures had to change from the original design.  This report summarizes the journey of growth and development that occurred at Motlow State Community College as a result of this project.

The original assessment design of the QEP set forth in 2008 was a combination of both internal and external assessments, but the design proved to be ineffective for a variety of reasons that will be discussed throughout the report.  In addition to assessment design problems, the College realized that the projected number of courses with an international module was insufficient to produce any type of measurable change.    These conclusions were based on instructor feedback and on data gathered by the College.  Key points in the evolution of the process are summarized below with all applicable data referenced. 

Initial Design and Implementation:  2008-2009 and 2009-2010

The QEP assessment had six components: (1) establish control and experimental classes for each course going through the module development phase; (2) employ a pre-/post-test module learning assessment; (3) grade a presentation on the module content as pass/fail; (4) use the Munroe Multicultural Attitude Scale Questionnaire (MASQUE) in a pre-/post-test format for all classes incorporating a module; (5) track progress on targeted questions on the Community College Survey of Student Engagement (CCSSE) that relate to cultural diversity; and (6) track progress on three additional questions added to the CCSSE instrument that directly measure QEP gains.

The use of the pre-/post-test assessment methodology seemed appropriate because the College has a history of using pre-/post-tests in several of its disciplines for measuring general education outcomes.  The control and experimental class designation was selected as a validation test for the module construction: the control class did not receive the instruction in the international module, and the experimental class did. When the QEP was piloted in fall 2008 with a General Psychology I class, the statistical analysis showed positive growth in the base knowledge of the experimental class compared to the control class, and all the students received a passing grade on the presentation assignment.  The successful module was then ready to be incorporated into all sections of general psychology for the next semester, and this cleared the way to replicate the development and testing process in other disciplines.  For spring 2009 the next class to be piloted was American History II, but the instructor incorrectly gathered the data; therefore, the first year ended with only the psychology module in place.  Unfortunately, no data were captured for the psychology class in spring 2009.

The original plan called for the development of one to three new courses each year.  In an effort to rectify the errors in the history class in spring 2009, the creation and testing of the international module for that class was carried over to the fall 2009 semester, and the development process was expanded to include U.S. History I and Experiencing Literature.  The psychology module was to be fully implemented in all on-ground sections of the class.  Even with additional training, faculty had difficulty understanding the assessment design, resulting in minimal participation in the pre-/post-test strategy.  The original research design called for t-test analysis, a test that assumes a normal distribution of items in a population, and the data gathered did not meet this criterion.  Consequently, the data could not be properly analyzed and showed poor results. In an effort to keep working with the faculty and these classes, the same classes were carried over to 2010-2011, but with little improvement.
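The normality requirement behind the original design can be illustrated with a short sketch.  The scores and section sizes below are hypothetical (invented for illustration, not actual Motlow data); the point is only to show the kind of two-sample comparison the design called for, and that the resulting t statistic is meaningful only when each sample is roughly normally distributed:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Welch's t statistic for two independent samples.

    Assumes each sample is approximately normally distributed;
    with skewed or sparse classroom data the statistic is unreliable.
    """
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    return (mean(sample_a) - mean(sample_b)) / ((va / na + vb / nb) ** 0.5)

# Hypothetical post-test scores for an experimental section (received
# the international module) and a control section (did not).
experimental = [78, 82, 75, 88, 90, 73, 85]
control = [70, 72, 68, 75, 74, 71, 69]

t = welch_t(experimental, control)  # large positive t suggests growth
```

With incomplete faculty participation, the samples collected were too small and too irregular for this kind of analysis, which is the problem the College encountered.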

The use of the Munroe Multicultural Attitude Scale Questionnaire (MASQUE) also proved difficult, especially for students.  The MASQUE is a questionnaire that uses a six-point scale ranging from strongly agree to strongly disagree to capture cultural attitudes based on reactions to 28 statements.  During the initial pilot year in the psychology class, only two statements showed a substantial difference between the pre-/post-test results of the experimental and control groups.  In short, the MASQUE identified minimal cultural attitudinal changes in the students. In the second year the results were worse than the first, the third year saw the lowest scores, and students documented their displeasure with the instrument in comments such as “the survey was repetitive, lengthy and invasive.”

An initial consideration of the QEP topic came from the items that relate to cultural awareness and diversity on the Community College Survey of Student Engagement (CCSSE).  These scores had been consistently below those of other small colleges.  The CCSSE is a proprietary and nationally normed instrument that gives institutions national benchmarks by which to compare assessment results and progress toward a goal.  The CCSSE provided an external data source against which to compare the College’s expected gains.  In addition to the standard CCSSE questions referenced above, the College added three questions to the CCSSE instrument that directly measure QEP gains.  While no small-college means are available for comparison on these add-on questions, the College would track progress to demonstrate historical improvement, with 2008 forming the benchmark.  As with the other assessment measures designed for the QEP project, the results from the initial years of CCSSE assessment were not tracking toward success.  Upon reflection, the project was not touching enough students for an indirect measure like the CCSSE to detect a change in the attitudes of the group as a whole.

The Transition Phase: 2010-2011

By the 2010-2011 year, the QEP project needed to be redefined and re-envisioned for the campus community as it was clear from the data that progress was not being made.  In spring 2011, the Provost initiated a series of meetings with an ad hoc committee consisting of the Coordinator of International Education, Director of Institutional Research Planning and Effectiveness, and the former Chair of the QEP Development Committee, to evaluate the QEP and its implementation timeline.   The following exigencies were identified as reasons necessitating changes to the QEP: 

  • The scope of the QEP has involved a limited number of academic disciplines. The only disciplines involved in the project to date are psychology, history, and English, which is eight percent of the academic disciplines.

  • The originally prescribed method (control groups, pre-/post-tests with t-test analysis) for assessing module creation and achievement of the student learning outcomes has proven to be cumbersome and has led to non-statistically quantifiable results. 

  • Of the courses that added, implemented, and assessed an international module since the QEP implementation in fall 2008, only two have shown a statistically significant difference in the knowledge and/or attitudinal changes of the experimental group over the control group.

  • Fewer than 200 students have been enrolled in a course that taught and assessed an international module.  This is small compared to the size of the student body, which is approximately 4,500. Additionally, the CCSSE reaches approximately one-quarter of the College’s population, so the impact of the project is too limited. 

  • The MASQUE instrument used to measure attitudinal changes regarding multicultural issues was poorly received by a number of students who found it to be repetitive, lengthy and invasive.

During the 2011-2012 year, a new pilot was held in BIOL 2230—Microbiology.  The students in the class were divided into teams to research a worldwide infectious disease of their choosing.  The students created a poster presentation and were graded with a rubric.  Students in this pilot were engaged in the project, and the faculty member was able to document the learning that was achieved.   The pilot assessed 54 students: 93 percent achieved mastery on student learning outcomes 1a and 2b, and 88 percent achieved mastery on 3a.  The benchmark for success was 70 percent. 

The success of the microbiology pilot assisted the institution in reframing the following elements of the QEP:

  • The intended learning outcomes of the original QEP would remain unchanged.

  • The QEP would move away from both the pre-test/post-test assessment method and the experimental versus control classes. Instead, a streamlined methodology for assessing international modules was devised utilizing a common grading rubric aligned with the QEP learning outcomes. This would greatly simplify the overall assessment process for the faculty as well as the data collection and analysis processes. Because the MASQUE was proving more controversial than valuable, it was discontinued as an external measure.  The CCSSE and the add-on CCSSE questions would remain as an external benchmark.

  • Faculty internationalizing courses would be provided with training in the development of grading rubrics.

  • The international modules themselves would not change, only the assessment methods.

  • An updated approval process for modules (see Academic Leadership Meeting Minutes), which provided peer and supervisory feedback, would be implemented. All collected data would be distributed to department chairs/directors and would be discussed with individual faculty and/or departments and at academic affairs leadership team meetings. Department chairs/directors would submit weekly reports outlining data assessment and improvements to the modules to the Provost, who would then forward the reports to the President. 

  • Department chairs and academic affairs leaders would develop an accelerated schedule of courses, which would greatly exceed the original QEP benchmark of one to three courses per year to include an international module. The new schedule would internationalize 25 percent of all Motlow courses by the 2012-2013 year and include internationalized courses in every academic department instead of only two departments.

  • Non-curricular international events already occurring such as international festivals and guest speakers would be assessed for gains/changes in student international learning and awareness.

  • Because Motlow does not control the sample of students selected to participate in the CCSSE and students in internationalized versus non-internationalized courses may be underrepresented, the three QEP specific add-on questions would also be put into a newly developed Student Perception Survey to be administered to all students in courses participating in QEP.

In order to familiarize the campus with the changes to the QEP, the President included comments about the QEP in her opening remarks (see slide two) at Fall Assembly, and all faculty attended a session on the QEP during the afternoon of Fall Assembly.  The afternoon session was conducted by the microbiology instructor who taught the new pilot, the Director of Institutional Research, Planning, and Communication, and the Provost. This session was repeated three times so that faculty could attend in smaller groups to facilitate discussion and questions.

QEP Redux: 2011-2012 & 2012-2013                                                         

The 2011-2012 year opened with renewed vigor for the QEP project, and through course modules and extra-curricular activities, all things international came to the forefront of the campus.  The 2012-2013 year followed through on the successes achieved as a result of the changes.

  • Faculty assigned to develop the international modules and grading rubrics for fall 2011 or spring 2012 courses listed on the QEP Schedule of Courses attended training sessions on rubric development.

  • International modules were developed that included assignments such as essays, presentations, or poster/visual arts. Assignments could be group or individual. A common grading rubric incorporating at least three of the QEP international student learning outcomes was designed and used to grade the assignment.

  • Both the international modules and grading rubrics for all classes with an international component were reviewed and approved prior to implementation by department chairs, the International Education Committee, the Assistant Vice President for Academic Affairs, and the Provost.

  • All classes that were scheduled to receive a module did so, and the number of impacted students vastly improved over the initial years (see Course Sections and Enrollment in QEP Courses).

  • Analysis was done on assessment data obtained through the following sources: rubric results to measure attainment of student learning outcomes; responses to culture-specific questions on the Student Perception Survey; responses to culture-specific questions on the CCSSE survey; and responses to cultural and diversity items on the add-on portion of the CCSSE survey.

  • The use of results was documented in each department’s institutional effectiveness plans.

  • A number of multicultural experiences were offered including international festivals at all four campus locations and multicultural programs (see Summary of Multicultural Events AY 2011-12 and Summary of Multicultural Events AY 2012-13.)

  • The events and programs were promoted in local newspapers, with on-campus posters and closed circuit television, and on the college web sites and social media sites (see Press Releases for International Events in AY 2011-12 and 2012--2013).

  • Analysis was done on assessment data obtained from the responses to culture-specific questions on the Survey of Multicultural Events given to all multicultural event attendees.

Cumulative Assessment Results

The revised QEP project produced swift and successful results for students enrolled in an internationalized course and/or who attended an international event, which was what had been envisioned when the project started.  Grading rubrics and Student Perception Surveys served as direct assessment instruments in the classroom, and the Surveys of Multicultural Events provided direct assessments for the extracurricular multicultural activities. The CCSSE questions provided indirect measures for the project as a whole.

The effectiveness standard set for the common grading rubrics required that 70 percent of all students completing an international assignment score three or higher on each International Education Student Learning Outcome. The attainment of the learning outcomes summarized in Table 1 shows that in each semester from fall 2011 through spring 2013 more than 70 percent of students who completed the international education assignment scored a three or higher (see Grading Rubric Summary for All Participating Courses Fall 2011, Grading Rubric Summary for All Participating Courses Spring 2012, Grading Rubric Summary for All Participating Courses Fall 2012, and Grading Rubric Summary for All Participating Courses Spring 2013 for individual course rubric assessment results).  Issues in individual courses that did not meet the benchmark were addressed.
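The 70 percent effectiveness standard reduces to simple arithmetic on the rubric scores: the mastery rate is the share of students scoring at "Meets Standards" (3) or "Exceeds Standards" (4). As a hypothetical sketch (the score list below is invented for illustration, not actual Motlow data):

```python
def mastery_rate(scores):
    """Fraction of rubric scores at Meets (3) or Exceeds (4) Standards,
    on the 1-4 common grading rubric scale."""
    return sum(1 for s in scores if s >= 3) / len(scores)

# Hypothetical rubric scores for one student learning outcome
# in one course section (1 = Attempt Made ... 4 = Exceeds Standards).
scores = [4, 3, 3, 2, 4, 4, 3, 1, 3, 4]

rate = mastery_rate(scores)
meets_benchmark = rate >= 0.70  # the QEP's 70 percent standard
```

Rates like those in Table 1 are this calculation aggregated across all participating course sections.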

Table 1 – Summary of International Education Student Learning Outcomes
for AY 2011-12 & AY 2012-13


| International Education Student Learning Outcome | Semester | Exceeds Standards (4) | Meets Standards (3) | Not All Standards Met (2) | Attempt Made (1) | Mastery (3 or 4) |
|---|---|---|---|---|---|---|
| 1a. Students will increase their knowledge of cultures in the world around them. | Fall 2011 | 55% | 35% | 7% | 2% | 90% |
| | Spring 2012 | 56% | 31% | 10% | 2% | 87% |
| | Fall 2012 | 54% | 35% | 8% | 3% | 88% |
| | Spring 2013 | 57% | 32% | 7% | 3% | 89% |
| 1b. Students will recognize the role that differing cultural perspectives play in shaping world events. | Fall 2011 | 54% | 30% | 12% | 3% | 84% |
| | Spring 2012 | 50% | 31% | 13% | 5% | 81% |
| | Fall 2012 | 47% | 39% | 10% | 4% | 86% |
| | Spring 2013 | 46% | 39% | 11% | 4% | 85% |
| 2a. Students will understand how international cultural diversity shapes the foundational elements, theory, research, and practice of various academic disciplines and related occupations/professions. | Fall 2011 | 44% | 35% | 13% | 8% | 79% |
| | Spring 2012 | 46% | 33% | 15% | 6% | 80% |
| | Fall 2012 | 46% | 39% | 14% | 6% | 80% |
| | Spring 2013 | 47% | 38% | 11% | 4% | 85% |
| 2b. Students will describe how the course-related international content impacts their own occupational/professional development. | Fall 2011 | 46% | 38% | 7% | 9% | 84% |
| | Spring 2012 | 49% | 33% | 12% | 7% | 82% |
| | Fall 2012 | 48% | 33% | 12% | 8% | 81% |
| | Spring 2013 | 47% | 35% | 11% | 5% | 82% |
| 3a. Students will recognize how events in other nations affect the United States and how events in this country affect other nations. | Fall 2011 | 56% | 34% | 9% | 1% | 90% |
| | Spring 2012 | 49% | 33% | 14% | 3% | 82% |
| | Fall 2012 | 46% | 38% | 12% | 4% | 84% |
| | Spring 2013 | 51% | 39% | 10% | 3% | 91% |
| 3b. Students will articulate the perspectives of other cultures and nations when analyzing world events. | Fall 2011 | 44% | 39% | 13% | 4% | 83% |
| | Spring 2012 | 46% | 36% | 12% | 6% | 82% |
| | Fall 2012 | 48% | 36% | 11% | 4% | 85% |
| | Spring 2013 | 51% | 35% | 10% | 5% | 86% |

The CCSSE scores show that since the revisions to the project, the College has been able to turn its declining scores around.  The work toward beating the CCSSE mean is still ongoing, with currently only one indicator surpassing the CCSSE mean.  As the College continues its international festivals and many courses retain their international modules, work toward this goal will continue.

Table 2 – Means Comparison of CCSSE Cultural Awareness and Diversity Items


| CCSSE Item | CCSSE 2008 | CCSSE 2009 | CCSSE 2010 | CCSSE 2011 | CCSSE 2012 | MSCC 2008 | MSCC 2009 | MSCC 2010 | MSCC 2011 | MSCC 2012 |
|---|---|---|---|---|---|---|---|---|---|---|
| Had serious conversations with students of a different race or ethnicity other than your own | 2.29 | 2.29 | 2.28 | 2.30 | 2.42 | 2.21 | 2.22 | 2.43 | 2.25 | 2.30 |
| Had serious conversations with students who differ from you in terms of their religious beliefs, political opinions, or personal values | 2.30 | 2.31 | 2.30 | 2.31 | 2.35 | 2.25 | 2.29 | 2.40 | 2.29 | 2.31 |
| Encouraging contact among students from different economic, social, and racial or ethnic backgrounds | 2.45 | 2.46 | 2.47 | 2.50 | 2.54 | 2.26 | 2.37 | 2.51 | 2.39 | 2.56 |
| Understanding people of other racial and ethnic backgrounds | 2.34 | 2.34 | 2.35 | 2.38 | 2.42 | 2.23 | 2.23 | 2.44 | 2.38 | 2.41 |

(4) Very much (3) Quite a bit (2) Some (1) Very little

After three solid years of decline, the add-on questions that directly assess the impact of the QEP are showing steady improvement.   The final year of CCSSE add-on data will be included when it becomes available.

Table 3 – Means Comparison for CCSSE Add-on Items


| CCSSE Add-on Items | MSCC Mean 2008 | MSCC Mean 2009 | MSCC Mean 2010 | MSCC Mean 2011 | MSCC Mean 2012 | MSCC Mean 2013 |
|---|---|---|---|---|---|---|
| How much has your experience at this college contributed to your knowledge of cultures other than your own? | 2.72 | 2.62 | 2.30 | 2.37 | 2.58 | 2.59 |
| How much has your experience at this college contributed to your comprehension of how international events and the peoples of other cultures impact your chosen major and future occupation? | 2.90 | 2.85 | 2.11 | 2.15 | 2.59 | 2.55 |
| How much has your experience at this college given you an awareness and understanding of the interdependency and consequences of international events and issues? | 2.83 | 2.78 | 2.22 | 2.21 | 2.58 | 2.56 |

(4) Very much (3) Quite a bit (2) Some (1) Very little

Because the College has no control over the sample of students selected to participate in the CCSSE, the Student Perception Survey, which uses the CCSSE add-on questions, allowed for more immediate feedback and a solid instrument for gauging impact.  The survey is administered to all students in courses participating in the QEP to ensure that the CCSSE add-on questions are answered by students who have actively participated in the QEP. 

The results of the Student Perception Survey (Table 4) showed means that are comparable to the CCSSE Add-on Surveys (Table 3). The means also show improvement over the semesters in the courses where a module was taught.

Table 4 – Means for Student Perception Survey Items for AY 2011-12 & AY 2012-13


| Questions | Fall 2011 | Spring 2012 | Fall 2012 | Spring 2013 |
|---|---|---|---|---|
| How much has your experience at this college contributed to your knowledge of cultures other than your own? | 2.63 | 2.67 | 2.72 | 2.72 |
| How much has your experience at this college contributed to your comprehension of how international events and the peoples of other cultures impact your chosen major and future occupation? | 2.39 | 2.45 | 2.52 | 2.53 |
| How much has your experience at this college given you an awareness and understanding of the interdependency and consequences of international events and issues? | 2.56 | 2.60 | 2.67 | 2.68 |

(4) Very much (3) Quite a bit (2) Some (1) Very little

The various multicultural events held in 2011-2012 and in 2012-2013 were well attended, with approximately 1,500 people attending the six events in AY 2011-12 (see Summary of Multicultural Events AY 2011-12) and approximately 2,300 people attending the eight events in AY 2012-13 (see Summary of Multicultural Events AY 2012-13).  The Survey of Multicultural Events was administered to attendees at each of the events. When compared to the means of the CCSSE Add-on Items in Table 3, the means for the Survey of Multicultural Events items were significantly higher for most events.  

The final two years of QEP implementation at Motlow College were exceptional years.  After setbacks and declining results, the overall trend showed improvement. Faculty members were more engaged in the module development because the focus of the assignments was seen as a more creative way to teach concepts that were already a part of the course.  The grading rubric gave faculty a tool that demonstrated achievement and offered solid feedback for the students as well. 

Reflections

Internationalizing the Curriculum—Improving Student Learning through International Education: Preparing Students for Success in a Global Society has left its mark on Motlow State Community College in some very meaningful ways and has produced some lasting lessons for the campus community. 

First, the College learned a lesson in camaraderie as demonstrated by the overwhelming participation in the campus international festivals. Students at all four Motlow locations saw staff, administrators, and faculty working together to organize these events. These festivals have become a part of the campus culture and will continue.

Second, the College learned to make data-driven changes to large-scale projects. While data-driven change is at work every day in the classroom and in the programmatic changes required to keep a curriculum healthy, change in a project of this size and magnitude pushed the institution to grow and embrace the process of change when a project does not proceed as originally planned.

Third, the use of a grading rubric gave all faculty—both full-time and part-time—a new tool for effective grading of student work. The QEP project moved the use of the rubric out of a few disciplines like education and nursing and introduced the concept to a broader audience. Additionally, the use of the rubrics gave the faculty measurable data to demonstrate that learning was taking place in the classroom with regard to the identified student learning outcomes.

Fourth, the College learned that a skill-based QEP topic is preferable to one designed to change feelings or perceptions. While changing minds and hearts is admirable and arguably one of the intangible goals of a college education, accurately framing and measuring a personally transformative project such as this one is difficult. As diligently as the campus community tried, the College’s CCSSE scores are still below the national norm, and work on this will continue. As a result of the rubric assessment of the student learning outcomes and the engagement of the students in the activities, several modules will continue as a permanent part of the curriculum. These modules are in education, music, composition, literature, mass communication, psychology, and nursing.

Fifth, the College learned that, in approaching the next QEP topic, faculty training and a more aggressive pilot project schedule prior to the official start of the QEP make more sense. By piloting earlier than the initial implementation schedule allowed, the College will be able to adapt more quickly to indications that some aspect of the project is not living up to expectations.

Documentation

SACS Listing of Approved QEP Topics (including Motlow College)
Intended Objectives and Learning Outcomes for QEP
Pre and Post Test Results F08
Pre and Post Test Results F09
Pre and Post Test Results / MASQUE Results F10
MASQUE Questions
MASQUE Results AY 2008-09
MASQUE Results AY 2009-10
MASQUE Results AY 2010-11
CCSSE Scores 2005-08
CCSSE Add-on 2008
CCSSE Add-on AY 2009-10 and 2010-11
BIOL 2230 Rubric
Common Grading Rubric
QEP Rubric Training
Academic Leadership Meeting Minutes
QEP Weekly Reports
Documents Forwarded to the President
Accelerated Schedule of QEP Courses
Student Perception Survey
Fall Assembly Presentation
International Modules
Common Grading Rubric
Approvals
Course Sections and Enrollment in QEP Courses
Institutional Effectiveness Plans
Press Releases for International Events in AY 2011-12 & 2012-2013
Survey of Multicultural Events
International Education Student Learning Outcome
Grading Rubric Summary for All Participating Courses Fall 2011
Grading Rubric Summary for All Participating Courses Spring 2012
Grading Rubric Summary for All Participating Courses Fall 2012
Grading Rubric Summary for All Participating Courses Spring 2013
BTCH-04
Student Perception Survey
Summary of Multicultural Events in AY 2011-12
Summary of Multicultural Events in AY 2012-2013
Survey of Multicultural Events
CCSSE QEP, Student Perception and Multicultural Event Scores 2008-13