The institution engages in ongoing assessment of student learning as part of its commitment to the educational outcomes of its students.
The institution has effective processes for assessment of student learning and for achievement of learning goals in academic and cocurricular offerings.
Assessment efforts at IU Southeast embrace course-level, program-level, and institutional-level learning. Direct and indirect evidence of student learning is a part of annual review, promotion review, tenure review (24-0088, pp. 27-28, 35), consideration for campus teaching awards, such as the Distinguished Teaching Award (24-0434) and the Trustees Teaching Award (24-0435), and membership with the Faculty Academy on Excellence in Teaching (24-0500).
IU Southeast has undertaken systematic efforts to clearly articulate goals and learning outcomes for student learning at the course (24-0501), program (24-0436), and institutional levels (24-0437).
Course level: In October 2015, the Faculty Senate approved a policy (24-0501) requiring all course syllabi to include measurable student learning outcomes (24-0088, p. 52). These student learning outcomes (SLOs) are designed to support direct assessment of how well students are achieving learning goals. Appropriate general education (GE) SLOs must also be included on all GE course syllabi. In addition, multi-section courses must share the same SLOs regardless of modality or location (online, hybrid, face-to-face, or a dual credit ACP class). Finally, SLOs should distinguish among 100/200-level, 300/400-level, and graduate courses. The Deans of each school met and discussed these requirements with their faculty, campus-wide professional development examining SLOs and syllabi was held, and each Dean established a policy or process to ensure compliance in their areas (24-0211; 24-0212; 24-0213; 24-0214; 24-0215; 24-0216; 24-0217).
To ensure that appropriate SLOs appear on ACP syllabi, the Deans consulted with the coordinator who oversees all ACP classes across the regional campuses, confirming that each ACP syllabus in our region carried the proper SLOs. This SLO syllabus project highlighted for all faculty the importance of consistency and clarity in syllabus SLOs. To reinforce that importance, Academic Affairs created and instituted a Syllabus Check Policy (24-0502). At the beginning of each fall and spring semester, all syllabi are checked to ensure the required criteria are present, and all GE syllabi are examined to determine whether the appropriate SLOs appear. This feedback is then given to the Deans and the relevant leaders of each syllabus area (school Deans, the FYS director, the coordinator of ACP classes, and the General Education Committee). The Associate Vice Chancellor of Academic Affairs also examines a sampling of syllabi to determine whether multi-section courses share the same SLOs and whether the SLOs are the same across modalities. Additionally, an analysis using Bloom's taxonomy is conducted on the SLOs of a sampling of syllabi to determine whether there is appropriate differentiation across course levels. The results of this review are given to the Deans of each school to share and discuss with their program coordinators and faculty (24-0218; 24-0219; 24-0220; 24-0221).
The Dean of the School of Business used this report as the basis of discussion in a faculty meeting, resolving any problems noted (24-0438). The Dean of Natural Sciences sent the report to her program coordinators and asked them to review their areas and institute procedures to ensure oversight continues (24-0215; 24-0216). The Dean of Social Sciences discussed the results of the syllabus check report with her faculty and put in place a policy (24-0214; 24-0440) to ensure syllabus review rotation (24-0439) and appropriate development of learning outcomes. The Dean of Arts and Letters instituted a policy on learning outcomes describing a process to ensure SLOs match across courses and are appropriate for course levels (24-0211). The Dean of Nursing and the Dean of Education also crafted policies relating to a syllabus check process (24-0212 and 24-0213, respectively).
Program level: Program assessment begins with identifying SLOs, which are used to build a layered assessment program in which many assessment products are embedded in courses. These program-level SLOs form the foundation for assessment goals. At the program level, SLOs have been established by the faculty and staff members responsible for each program and are posted throughout the IU Southeast 2021-2023 Bulletin and on the Office of Institutional Effectiveness (OIE) website via Watermark (24-0436). The Academic Assessment Committee (AAC), a Faculty Senate committee, monitors assessment of academic achievement in undergraduate and graduate programs, provides oversight and guidance for academic units in the development of their assessment programs, and makes recommendations to support the accreditation process.
IU Southeast faculty work directly with OIE to develop, implement, operate, and maintain assessment programs (24-0441). OIE works with faculty throughout the assessment process to ensure assessment tools are aligned with outcomes and include both direct and indirect measures. Where possible, assessment tools are designed to automatically feed data to OIE. OIE then analyzes the information, providing feedback to the AAC and to individual programs. Program efforts are scored using a rubric (24-0442) and placed into one of three assessment cycles (annual, biennial, or triennial), incentivizing good assessment practice and requiring more frequent reporting for programs with underdeveloped assessment (24-0426). Achieving the highest assessment cycles requires programs to demonstrate the use of assessment data to improve their program (24-0672).
As mentioned above, OIE scores assessment reports using the Faculty Senate-approved Assessment Report Scoring Rubric (24-0442). The Assessment Specialist drafts detailed feedback for each criterion and returns the rubric and feedback to the program coordinator and dean within 90 days of the coordinator’s submission. Scores range from 0-4 for five of the six criteria and from 0-8 for the Feedback Loop criterion (using the data to make improvements), for a possible total of 28 points. The Feedback Loop criterion was weighted more heavily starting in Fall 2020 to emphasize the importance of using data to drive change. Reports scoring 22 points or greater (with no criterion scores of ‘0’) are placed on a triennial cycle; these programs submit a full report every three years, with update reports due in the off years. Reports earning 18-21 points (with no criterion scores of ‘0’) are placed on a biennial cycle and submit a full report every other year, with an update report due in the off year. Reports scoring below 18 points are placed on an annual cycle and are expected to submit a full report every year. As of Fall 2022, 51.4% of programs had earned a triennial cycle, 16.2% a biennial cycle, and 32.4% an annual cycle. Once the program coordinator receives the feedback and rubric score, there is an opportunity to revise and resubmit within 90 days, and the coordinator is encouraged to use the feedback to improve program assessment practices. The revised report is once again submitted to OIE for review, additional feedback, and the opportunity to move to a longer cycle. To provide a glimpse into these assessment discussions, each academic and co-curricular area was asked to provide minutes from a recent meeting at which assessment was discussed (24-0673).
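For readers who find the cycle-placement rule easier to follow as logic than as prose, the following is a minimal illustrative sketch of the thresholds described above. The criterion names in the example are hypothetical placeholders, not the actual rubric labels; only the point ranges, the 28-point total, the zero-score rule, and the 22/18-point cutoffs come from the rubric description (24-0442).

```python
# Illustrative sketch only: the reporting-cycle placement rule described
# above, expressed as a small function. Criterion names below are
# hypothetical; the point values reflect the rubric description (24-0442).

def assign_cycle(scores):
    """Place a program on a reporting cycle based on its rubric scores.

    `scores` maps each criterion to its points: five criteria scored
    0-4 each, and the Feedback Loop criterion scored 0-8, for a
    possible total of 28 points. Any criterion scored 0 forces the
    annual cycle regardless of the total.
    """
    total = sum(scores.values())
    if 0 in scores.values():
        return "annual"      # a zero on any criterion overrides the total
    if total >= 22:
        return "triennial"   # full report every 3 years, updates in off years
    if total >= 18:
        return "biennial"    # full report every other year
    return "annual"          # full report every year

# Hypothetical example: a strong report with no zero-scored criteria
example = {"goals": 4, "outcomes": 4, "measures": 3, "results": 4,
           "analysis": 3, "feedback_loop": 6}
assign_cycle(example)  # total of 24 with no zeros -> "triennial"
```

This is only a reading aid for the thresholds; the actual placement decision rests with OIE and the rubric feedback process.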
Institutional level: Institutional-level learning outcomes (ILOs) have been developed at both the undergraduate and graduate levels. Two new committees (one undergraduate and one graduate) were formed in Spring 2022 and charged with creating ILOs representing the campus and its values. Undergraduate ILOs (24-0437) were developed and approved by Faculty Senate in November 2022 (24-0192), and graduate ILOs (24-0503) were developed and approved by Faculty Senate in April 2023 (24-0504). For the undergraduate ILOs, five categories were agreed upon and seven ILOs were created; for the graduate ILOs, five categories were agreed upon and eight ILOs were created. The ILOs are not targeted solely at curricular programs; co-curricular and student service unit assessment data will also align with and contribute to the measurement of ILOs. The ILOs were submitted to the Academic Policies Committee (APC) for review and approval, and the campus gathered for a town hall to discuss the ILOs, explain the process for measuring each outcome in the future, and answer questions from the faculty and staff present. OIE will ask all academic programs and co-curricular units to map their specific SLOs to the new ILOs; this mapping will be entered in Watermark for tracking and reporting.
Evaluation of General Education also occurs at the institutional level. Effective Fall 2013, IU Southeast adopted the updated Statewide Transfer General Education Core, now called the Indiana College Core (ICC: 24-0443), for all incoming students. The ICC framework maximizes the transferability of a general education "package" from one campus or state institution to another while allowing each institution to determine and maintain its own distinctive general education curriculum. General Education at IU Southeast includes both campus-wide requirements that apply to all baccalaureate degrees and requirements that are specific to each degree. The Faculty Senate General Education Committee (GEC) requires programs to design assessment tools before submitting courses to the general education curriculum (24-0444; 24-0445). SLOs (between three and nine outcomes per requirement) have been established by the faculty based on the curriculum and state-mandated standards. These goals (24-0222) and SLOs (24-0047) are posted on the IU Southeast website. The GEC oversees the assessment of general education at IU Southeast and has established a cycle, begun in Fall 2018 and ongoing, that ensures student learning in all major goals of general education is continually assessed and formally reported at consistent intervals (24-0446). In Fall 2023, the GEC chair introduced flowcharts depicting the primary duties of the GEC to enhance understanding of the committee’s role in the assessment process (24-0641). The flowcharts indicate the steps to be taken to evaluate incoming courses, remove courses, and complete the review of assessment in each general education area. In addition, the work of current and past general education committees has been summarized in a General Education Review Tracker (24-0642).
The GEC receives information from the AVCAA confirming that syllabi of instructors teaching general education courses align with course learning outcomes and map to the corresponding general education outcomes. All general education syllabi are reviewed during the syllabus check process. In Fall 2022, 91% of general education courses had the proper learning outcomes present across all areas (24-0505); in Fall 2023, 93% did (24-0644).
All faculty teaching approved general education courses develop assessment measures in consultation with OIE. Typically, assessment data is collected each term for general education classes. OIE aids with the collection of assessment data and summarizes it annually; the summary is then provided to the faculty coordinating the general education courses and to the GEC. The GEC subcommittees, each composed of one or two committee members, report to the GEC at the end of the evaluation year, evaluating progress and recommending improvements based on the data. The GEC reports to the Faculty Senate each year (e.g., 24-0643).
Assessing and reviewing general education courses follows a three-part, three-year sequence for each category: in year one, the GEC reviews data, identifies gaps and weaknesses, addresses missing data, and creates an action plan for improving the data collection process; in year two, the programs review the GEC's reports, analyze data from their own programs and courses to identify gaps and strengths in student learning, and create an action plan for improving student learning in their programs; in year three, the GEC reviews outcomes, follows up on action plans, and reviews the previous year’s reporting. The general education curriculum is divided into nine categories, with three categories entering the sequence each year. To carry out the review, the GEC contacts the coordinators of all courses in a category and asks them to examine their assessment data and submit a report with an action plan to the GEC by February 1. The reports are divided among the committee members, who check them against the syllabi gathered by Academic Affairs, read the reports submitted by course coordinators, write evaluations, and complete a rubric for each course to report back to program coordinators.
The AY 2022-2023 GEC offered a report (24-0643) to specifically address the concerns of HLC that:
The Focused Visit Report stated that the General Education Committee was to conduct a review in spring 2022 focusing on the alignment of General Education curriculum with its mission and values, student learning assessment, integration and enrichment, resources and support, and communication. (p.13).
HLC has requested we use current and available program and course data to show that our general education curriculum is aligned with our mission and values, student learning assessment, integration and enrichment, resources and support, and communication. The GEC concluded:
We believe that the General Education program at Indiana University Southeast is effective and robust, covering the nine competencies across a broad range of courses, designed to deliver optimal educational experiences for our students. Furthermore, we believe that our assessment of these courses, while still in a process of improvement ... the vast majority of our students (over 70%) are meeting the competency at the average or mastery levels. Finally, when we evaluate our program across the offerings and across the range of competencies and courses of all the IU campuses, we see that our program is in line with the requirements of our peers. We have identified several areas in which we can make minor improvements, which we hope to accomplish over the next years.
Processes for assessment of student learning and achievement of learning goals in cocurricular offerings:
Cocurricular assessment of student learning and achievement of learning goals is a more recent development on the campus, built systematically in collaboration between OIE and individual units. Each year, an assessment workshop titled Co-Curricular & Administrative Assessment Day, specifically designed for these units, is provided through Academic Affairs and OIE. OIE sets the schedule (24-0448) and process for all campus units (24-0449), as well as a scoring rubric (24-0450).
All assessment programs begin with the development of outcomes. Cocurricular units engage directly with students and have developed both direct and indirect assessment of student learning and achievement of learning goals. One example is the Library, which has developed a robust assessment program with the assistance of OIE. Working as a team, Library faculty developed measurable learning outcomes, collected data, and met regularly to review results. These results led Library faculty to rework some operational outcomes into student learning outcomes, to identify additional opportunities to improve services and resources, and to make improvement goals more specific, measurable, and meaningful. To track these changes over time, please see the following resources:
- 2020 – 2021 Assessment Plan: 24-0451
- OIE Feedback: 24-0452
- 2021 – 2022 Assessment Plan: 24-0453
- OIE Feedback: 24-0454
- 2022 – 2023 Assessment Plan: 24-0455
- OIE Feedback: 24-0456
- 2023-2024 Assessment Plan: 24-0457
A second co-curricular unit demonstrating continuous improvement through assessment is Residence Life & Housing (RLH). Through its assessment plan, RLH has shown with direct and indirect measures that students who live on campus demonstrate a higher commitment to their academic success and personal development than those who do not. For example, residential students were retained to the university from Fall 2021 to Fall 2022 at 72.5%, compared to XX% of those who did not live on campus. In addition, RLH noticed a gap in residential students' conflict management skills and developed a direct post-mediation measure to ensure residential students are acquiring important conflict mediation skills (24-0645).
Academic programs (24-0506, pp. 4,5) and Co-curricular/Administrative units (24-0507, p. 5) have been asked by OIE to map student learning outcomes to the Institutional Learning Outcomes in their 2023 assessment reports.
The institution uses the information gained from assessment to improve student learning.
Academic Programs, General Education, and Co-Curricular units are expected to use the results of assessment processes to drive data-informed pedagogical change on behalf of student learning. Evidence of using assessment data to improve student learning is found in the annual Academic Program Assessment Reporting process. Program coordinators and unit heads report changes to their assessment process through Watermark's Taskstream Accountability Management System. These reports are submitted to OIE, which then consults with coordinators and provides feedback for continuous improvement.
The faculty in each program are responsible for determining the most effective ways of using assessment data to enhance student learning. For example, the faculty in the Geosciences program have successfully implemented a plan that includes updated goals and outcomes, clearly defined success criteria, and an established working feedback loop (24-0458). Assessment data has enabled them to identify deficiencies in their assessment plan and program curriculum. One change made to the Geoscience degree requirements was the replacement of GEOG-G107 Physical Systems of the Environment with GEOG-G201 World Regional Geography as a Geoscience core requirement. The faculty determined the curriculum was not sufficiently preparing students to meet the success criteria of Learning Goal 1 (Geoscience majors will acquire substantive knowledge of global spatial patterns and concepts central to geosciences), so this change was implemented to support the global spatial awareness component. Another advancement was the addition of two measurement tools that directly measure Learning Outcome 3.A (Students demonstrate employability by using scientific methodology and communication skills). Before these measures were added, the only measurement tool for Learning Outcome 3.A was an indirect measure of student learning through dialogue with graduates.
A second example is illustrated in the work of Modern Languages (24-0459), including both Spanish and German. Using their assessment data, Spanish faculty found that upper-level students struggled with composing and supporting a thesis statement. To address this, the faculty increased scaffolded assignments and added formative assessments and additional resources to assist students in 300-level Spanish classes. In response to assessment data indicating that lower-level students were struggling with reading comprehension (inference, analysis), the faculty added interpretive assignments, activities, and formative assessments using authentic texts. The German faculty developed new tools for assessing writing skills in upper-level courses and have realigned these courses in response to data on student performance.
In the School of Business, to ensure student learning, faculty regularly reflect on curricula and assessment data during departmental meetings. Student learning goals, outcomes, and areas for improvement, along with the overall quality of the programs, are a recurring discussion (24-0646). As one example, consider the outcome: Students will demonstrate/apply knowledge of terminology, theories, and principles of management. In the Industrial (Labor) Relations course, course-embedded exam questions showed that 14.3% of students failed to achieve this outcome. As a result, the course modality was changed to online, and a learning facilitation project was modified to cover multiple topics, allowing students to expand the breadth of their learning. In Spring 2022, all students reached the mastery level for this learning outcome (100% Excellent). Additional examples from Business can be found in the Business Curriculum Revisions Driven by Assessment document (24-0647).
A final example comes from the School of Education, where the persistence rate once students enter the blocks (cohorts) is 90%. However, many students enter IU Southeast as pre-elementary education majors but leave before getting into the blocks. To address this gap, a plan is being developed for a cohort of students who, beginning in their freshman year, take the same sections of General Education courses so they can get to know one another (24-0648). The impact of this change will be assessed in the future to see whether student persistence increases.
The institution’s processes and methodologies to assess student learning reflect good practice, including the substantial participation of faculty, instructional and other relevant staff members.
IU Southeast’s processes and methodologies to assess student learning reflect effective practices and are outlined in the seventh chapter of the Faculty Assessment Handbook (24-0460, pp. 18-21). Student learning is at the core of our mandate, and thus at the heart of annual faculty reviews, promotion, tenure, and teaching awards. Faculty create their own SLOs within their programs and work with OIE to assess them. The Faculty Senate monitors assessment of student learning by way of its General Education Committee. Faculty members participate in workshops offered by ILTE and Academic Affairs, including programs created specifically for our adjunct faculty (e.g., the Adjunct Faculty Scholars Program (24-0461) and the Adjunct Faculty Scholars Conference (24-0255)). Additional ILTE programs focus on inclusive teaching, online professional development modules, and a Faculty Learning Community focused on Teaching for Student Success. These programs provide training in best practices for in-person, hybrid, and online teaching and attest to faculty intentionality when it comes to teaching methodologies (24-0649). Additionally, the OIE rubric (24-0442) for annual academic program assessment encourages the substantial participation of all faculty and instructional staff in reviewing student data and making informed decisions based on those data.
The regular cycle of General Education and program review requires intentionality and faculty discussion and collaboration within and across departments and disciplines. Degree in Four, which contains mentoring and scholarship components, and the First Year Seminar (FYS) both involve the leadership of faculty from design through assessment. An example of this type of collaboration is the annual FYS full day retreat, which includes all FYS instructors and contains a component of intentional discussion each year across areas relating to FYS issues. During 2022, FYS faculty worked collaboratively to develop a shared learning outcome, embedding the assessment of this shared learning outcome into all sections of FYS (24-0650).
We launched a mandatory yearly training event called "Assessment Day" for academic programs that remained on an annual review cycle. The first installment, held on October 2, 2020, was open to all members of the campus who wanted to advance their assessment acumen, and the Vice Chancellors sent co-curricular and administrative representatives or encouraged their units to attend. The inaugural Assessment Day drew eighty-nine faculty and thirty-two attendees from co-curricular units and generated further activity through the individualized instruction offered by OIE throughout the year. The campus hosted a distinct Co-Curricular Assessment Day in July 2021, given that many cocurricular programs had put forward goals and outcomes and were ready to work on generating measures and data to support their plans.
Separate Academic and Co-Curricular/Administrative Assessment Days were held in July and August 2022 (for agenda items, please see 24-0462 and 24-0463, respectively). For the first time, both sessions were in person. The Academic and Co-Curricular Assessment Committees planned the schedules with the assistance of OIE. The Academic Assessment Day took on a mini-conference style and offered a variety of special-topic sessions for faculty, including "Qualitative Assessment," "Overcoming and Managing Resistance to Program Assessment," and "Leveling Up: How to Advance Your Program from Annual to Biennial." Seventy faculty members attended in Fall 2023 (24-0651). Each year, an Assessment Ambassador Award is announced (see below) and a presentation detailing the assessment process is made to the group (e.g., 24-0464; 24-0652).
Prioritizing assessment of student learning means the campus must reward those who engage in best practices. After studying awards issued by several other universities, the Faculty Senate’s Assessment Committee developed an Assessment Ambassador Award for curricular programs. The award prioritizes best practices in assessment while advancing the value of academic programs teaching others how to create model assessment programs (24-0510). Assessment Ambassadors work alongside the Faculty Assessment Committee, the General Education Committee, OIE, and ILTE in teaching academic programs how to engage in improvement-centered assessment. The academic program selected for the Assessment Ambassador Award receives $1,500, which can be used to advance the program’s work in teaching, learning, and assessment. All degree-granting undergraduate and graduate programs, the Honors Program, and the Library may apply for the Academic Assessment Ambassador Award. We also developed a new award and team for co-curricular and administrative unit assessment (24-0508): a new committee of staff, headed by the OIE director, helps develop programming and oversight alongside the Faculty Senate committee. This team scores applications using a rubric (24-0509), and the recipient receives $1,000. This award will be announced in late Spring 2023 by the Co-Curricular Assessment Committee.
Each year, approximately nine Trustees Teaching Awards are given to deserving faculty (24-0465), with each winner receiving $2,500. Faculty may self-nominate by submitting the teaching portion of their annual report for consideration. The Faculty Senate Improvement of Learning Committee reviews all candidates and evaluates each using a rubric that highlights the defining characteristics of teaching that generates effective learning (24-0435):
- Clear alignment between teaching goals, learning outcomes, and aligned assessments
- Methods which clearly articulate how teaching goals are achieved
- Use of multiple forms of evidence to demonstrate student learning including direct evidence, indirect evidence, and reflection
- Demonstration of professional development
- Use of assessment data to inform future changes to improve teaching and student learning
It is worth noting that assessment and evidence of student learning are essential components of the rubric used to evaluate candidates. These awards are available to both full-time and part-time candidates.
Distinguished Teaching Awards (DTAs) are also given to full-time and adjunct faculty. These awards recognize those who achieve a consistent pattern of excellence in teaching across multiple years. The rubric likewise emphasizes the impact on learning and the use of quality processes to ensure student learning (24-0434). Typically, one full-time faculty member and one part-time faculty member are awarded the DTA each year; a list of past winners is available on the website. Full-time winners receive $1,000 added to their base salary, and part-time winners receive $1,000.