Assessment Structure at the University of Scranton

The Mission of the Office of Educational Assessment is to:

  • Develop and refine structures and processes for the collection, management, analysis, and application of educational assessment data (using a quality improvement framework – plan, implement, review, and improve).
  • Collaborate with faculty and directors of co-curricular programs to develop strategies for effective assessment at the program and institutional levels.
  • Consult with programs, including GE, to help implement, revise and/or update assessment plans.
  • Assist programs governed by discipline specific external accreditation to articulate evidence of the use of assessment data to improve teaching and learning.
  • Provide consultation for review and analysis of assessment data.
  • Communicate the most current information on best practices in assessment to the University community.
  • Communicate the most up-to-date assessment requirements of the Middle States Commission on Higher Education (currently Standard V: Educational Effectiveness Assessment, formerly Standard 14).
  • Synthesize and demonstrate evidence of a sustained and organized approach to educational assessment at the University of Scranton to stakeholders and the Middle States Commission on Higher Education.
  • Plan and provide opportunities for continuing education about best practices in assessment.

The goal of assessment of student learning at the University of Scranton is the development and transformation of the individual student, who is a member of a community dedicated to excellence in the freedom of inquiry in the Catholic, Jesuit tradition. Educational assessment at the University of Scranton applies methods of inquiry to collect information on student learning experiences and uses this information for the continuous improvement of teaching and learning.

The Office of Educational Assessment (OEA) comprises a director, who is a member of the full-time faculty, and five faculty fellows representing the three colleges at the University of Scranton. The OEA reports to the Associate Provost for Institutional Effectiveness and Assessment.

OEA Co-Director and PCPS Assessment Fellow
Vanessa Ann Jensen, Ed.D., BCBA-D
Dr. Jensen is an associate professor in the Counseling and Human Services Department at the University of Scranton. She teaches courses in Applied Behavior Analysis and also works as a consultant with adults with autism. She has been a faculty member since 2004 and a consultant since 1996. She received her bachelor's degree from the University of Scranton in 1996 in psychology, philosophy, theology and religious studies. In 1998, she received her master's degree in elementary school counseling, also from the University of Scranton. She obtained her Ed.D. in Curriculum and Instruction, PK-12, from Indiana University of Pennsylvania in 2004. She completed her studies in Applied Behavior Analysis through Penn State University and obtained her Behavior Analyst Certification Board® certification in 2004. She currently holds certifications in Elementary School Counseling, grades K-6; Special Education, grades N-12; English, grades 7-12 and 4-8; and Mathematics, grades 4-8. She has served as the chair of the Middle States Periodic Review Report Subcommittee on Institutional Assessment and Planning, Standard 7: Institutional Assessment. She also chaired the Assessment Committee in the Education Department from 2012 to 2017. She served as a member of the Special Education PK-8 and 7-12 Program Development for PDE and a member of the TEAC Inquiry Brief Proposal Committee, as well as the CAEP EPP Annual Report Committee. She chaired the TEAC Internal Audit Committee for the Education Department in 2012 and 2013. She also served as a member of the Curriculum Committee for PDE Program Revisions in 2008 and 2009. She currently serves as the Director of the Graduate Programs in Applied Behavior Analysis (ABA) and has developed the ABA Program Learning Outcomes and Assessment Plan. She has served as an Assessment Fellow in the Office of Educational Assessment since 2014.

 


OEA Co-Director and KSOM Assessment Fellow
Satyajit Ghosh, Ph.D.
During his long teaching career, Dr. Ghosh has always been interested in assessment. He served on various assessment committees: the Subcommittee on Assessment, Middle States Accreditation Committee (1997); the Committee on Program Evaluation (1994-1996); and the GE Subcommittee of the Middle States Accreditation Committee (2006). But it was in 2007-2008 that he became directly involved with educational assessment, when he began working closely with KSOM Assistant Dean Paul Perhach, the main architect of Kania School's assessment program. Dr. Ghosh worked as a faculty assessor on every KSOM assessment day held between 2008 and 2012. During this time, with the help of his departmental colleagues, he also developed the student learning goals for Economics and Finance majors. He continued his assessment-related activities through two recent committee appointments. In 2013 he served on the Institutional Learning Outcomes Working Group and helped to create a "draft" of the institutional learning outcomes—an integral part of the University's assessment plan. He served on the Middle States Monitoring Report Coordinating Committee in 2014. Over the past few years Dr. Ghosh has been developing various interactive teaching tools, based on "active learning models," to "close the loop" in individual course assessment plans and help students attain their learning goals. He has presented research papers on these alternative pedagogical tools and their assessment at several national and international conferences. Dr. Ghosh has served as a Faculty Assessment Fellow and as Director of General Education Assessment, and served on the 2022 Faculty Senate General Education Review Committee.
Email: satyajit.ghosh@scranton.edu

 

OEA CAS Assessment Fellow
Gerard Dumancas, Ph.D.
Gerard G. Dumancas holds a PhD in Analytical Chemistry from Oklahoma State University (OSU) and a BS in Chemistry from the University of the Philippines. Gerard was an Associate Professor of Chemistry and a Huie Dellmon Trust Endowed Professor of Science at Louisiana State University-Alexandria (LSUA). He also worked in 2016 as a Visiting Scientist at GlaxoSmithKline in King of Prussia, Pennsylvania, as a Program to Empower Partnerships with Industry (PEPI) Fellow. He is presently a Visiting Professor and a Department of Science and Technology Balik Scientist Awardee of the University of the Philippines and the University of San Agustin-Philippines. In 2019, he was awarded tenure and promotion to Associate Professor of Chemistry and was named a Huie Dellmon Trust Endowed Professor of Science (2019-2021) at LSUA. He also served as the Coordinator of Chemistry Programs (2019-2021) and as the Director of Honors Experience (2020-2021) at LSUA. In April 2021, he was awarded, as Principal Investigator, two major five-year National Science Foundation (NSF) grants from the Robert Noyce Teacher Scholarship Program and the S-STEM Program, amounting to approximately $2 million. As such, he has directed or co-directed several NSF programs (NSF S-STEM, NSF Noyce, and NSF LAMP scholarships) at LSUA. Over the years, Gerard has generated more than $2 million in external research grants. His core research interests involve the development of novel spectroscopic and computational tools with a wide array of applications in edible oils, food science, and biomedical research. In Spring 2022, he began his appointment as an Associate Professor of Chemistry and Director of the NSF Noyce Program at the University of Scranton in Scranton, PA. At Scranton, he has also been mentoring MS in Chemistry students.

 


Introduction and Background

Student learning assessment has long been part of the academic life of the University of Scranton. The University instituted a decentralized model for assessment of student learning in the late 1990s. Following an effort to centralize assessment under a Comprehensive Assessment Plan in 2004, the University returned to a decentralized model in the latter part of that decade, in which each administrative area with a role in student learning assumed responsibility for assessment: The College of Arts & Sciences (CAS), The Panuska College of Professional Studies (PCPS), the Kania School of Management (KSOM), The Weinberg Memorial Library (WML), and Student Affairs (now the Division of Student Life). Following its 2013 Periodic Review by the Middle States Commission on Higher Education (MSCHE), the University launched a significant overhaul of its student learning assessment structure and processes, returning to a more centralized model that retained key areas of distributed responsibility. The development and launch of this new Comprehensive Plan for Assessment of Student Learning and its systematic approach to academic and co-curricular learning, guided by a new faculty-led Office of Educational Assessment (OEA) and a collaborative Educational Assessment Advisory Committee (EAAC),[1] are at the heart of this approach.

Purpose

This Comprehensive Plan for Sustaining Assessment Practices to Enhance Student Learning at the University of Scranton outlines a comprehensive, systematic strategy for the University's approach to student learning assessment. The plan describes processes and cycles for the development and assessment of learning outcomes, along with associated reporting and application procedures. Grounded in learning outcomes at the program and institutional levels, and for the general education program,[2] improvements to student learning are part of a formal cycle of gathering, analyzing, disseminating, and acting upon evidence. Key terms used throughout this document are defined in Appendix E.

 

In shaping our direction, the University community considers best practices in higher education, including those developed and endorsed by scholars and practitioners within the field of learning assessment. Our ethical commitment to reflective accountability, evaluating our programs and activities with honest candor in the spirit of better serving our students through the best possible programming, is closely tied to our Catholic, Jesuit mission and ways of proceeding. Our practices are also designed to address external accountability obligations, including Middle States' Standards of Accreditation and those of other programmatic and disciplinary accreditation bodies.

 Mission Connections

The Plan pays particular attention to the importance of the University’s Catholic and Jesuit mission: namely, its dedication to freedom of inquiry and to the development of wisdom and integrity of all its members.  Drawing on underlying concepts from the Ignatian pedagogical paradigm,[3] the University’s student learning assessment plan ensures ongoing evaluation to build a sustained, evidence-based process for assessing student learning outcomes across programs and curricula.

 In 1599 the Jesuit Ratio Studiorum [Rule of Studies, which is largely understood to  outline the educational system of the Jesuits] articulated five key elements of Jesuit Education: (1) context, through which the material conditions of the student’s learning are considered, as well as the predispositions of the student; (2) experience, through which students move beyond rote learning to something more active and personal; (3) reflection, during which students apply the subject matter to their own lives and processes, and where meaning is said to be made in this paradigm; (4) action, which involves change in students’ attitudes and behaviors through the application of and reflection upon knowledge; and (5) evaluation, through which students’ mastery of subject matter is assessed with a view toward identifying gaps in students’ knowledge, the need for alternate methods of teaching, and individualized approaches to encouraging and advising students.[4] These same principles for evaluation of individual students can be applied to evaluation of groups of students who are enrolled in various programs and General Education. It is the last of these elements, Evaluation, with which Educational Effectiveness Assessment most closely aligns as a discipline.

 Goals & Guiding Principles: Assessment and Overall Institutional Effectiveness

The University of Scranton has developed a set of guiding principles that outline our commitment and approach to assessment at the institution. The goal of institutional effectiveness at the University is to evaluate, document, and communicate what the University does well, identify areas where we can improve, and apply assessment results to guide our allocation of resources and realize improvements. This discipline combines both institutional assessment (the practices used to assess achievement of mission and goals, and evaluation of non-academic areas) and educational and student learning assessment (the practices used to assess achievement of student learning outcomes, both directly within academic programs and through co-curricular and related learning and formation activities).

At each level, goals and outcomes are monitored and measured through both formative and summative assessment strategies. Findings from these evaluations are used to inform decision making, planning and improvement, and resourcing of programs and services. 

  • Assessment is mission-driven, in the particular context of the Ignatian educational paradigm and our Catholic, Jesuit character.
  • Assessment is integrated within appropriate advisory and decision-making processes and structures.
  • Assessment is iterative, adapting to changing needs and new opportunities.
  • Assessment is collaborative and participatory, engaging all members of the University community in reflection.
  • Assessment is transparent, its processes and outcomes communicated clearly and frequently.
  • Assessment is evidence-based, supported by quality data and evidence that show how institutional and student learning goals are being met.
  • Assessment is useful, designed and pursued in ways that are practical and relevant to unit and program needs, and cycles for decision making and resource allocation.
  • Assessment results are used to “close the loop,” with results applied through planning, resourcing, and continuous improvement of programs and services.
  • Assessment is ongoing and cumulative, reflecting our performance over time.
  • Assessment is itself assessed, its processes and structures evaluated and refined through ongoing reflection and planned cycles of review.

Institutional Goals: Our Strategic Plan and Institutional Learning Outcomes

Supporting our mission, the University's Strategic Plan outlines our institutional goals. In addition to guiding institutional growth and development, the strategic plan also addresses broad aims for student learning and formation, including educational emphases in the liberal arts and humanities; ethical, cultural, social, and ecological justice and global awareness; and co-curricular learning through high-impact practices.[5] A variety of institutional assessment processes are employed to monitor and reflect on the progress and impact of our mission and institutional goals.

 Student learning assessment efforts are connected to this framework via the close relationship of its goals to the University’s institutional learning outcomes (ILOs), including student learning in the humanities; student support and success efforts; and programming that engages students in high impact practices.

 The ILOs seek to ensure all students:

  1. Develop and use the intellectual and practical competencies that are the foundation of personal and professional development and lifelong learning, including oral and written communication, scientific and quantitative reasoning, critical analysis and reasoning, and technological competency and information literacy. 
  2. Exhibit broad knowledge of the human condition, understanding the world in its physical and natural aspects, as well as the philosophical and theological basis for modern thought, faith and belief. 
  3. Demonstrate competence in their chosen field of study, using their knowledge and ability to address the most significant questions and advancing towards positions of leadership. 
  4. Employ their knowledge and intellect to address situations in a way that demonstrates devotion to the spiritual and corporal welfare of other human beings and a special commitment to the pursuit of social justice and the common good of the entire human community. 

The Office of Educational Assessment (OEA)

Overall responsibility for educational assessment rests with the Provost and Senior Vice President for Academic Affairs and, as assigned by the Provost, the Associate Provost for Academic Affairs. Working with these and other academic leaders, and with the advice and guidance of the EAAC, the Office of Educational Assessment (OEA) is the central hub for student learning assessment activities at the University of Scranton. As a faculty-led and faculty-driven office, it serves the institution as both a coordinating and a consultative body, developing faculty and staff expertise in methods of collection, analysis, and action so that program improvements, including curricular changes, are driven by constructive attention to evidence. The OEA oversees and documents assessment processes and cycles, developing a repository of teaching and learning information that is central to evidence-based decision-making.

The OEA comprises a Director/Co-Director and Faculty Fellows representing each of the three academic colleges. Through an application and selection process, the Associate Provost for Academic Affairs appoints the OEA Director and Fellows from among the full-time faculty. The OEA is responsible to the Associate Provost and, in turn, the Provost/Senior Vice President for Academic Affairs, for reporting on the state of learning assessment at the University.

The Educational Assessment Advisory Committee (EAAC), comprised of faculty, professional staff, and a student representative, advises the OEA. The Faculty Senate approves faculty appointments to the EAAC. In addition to the EAAC, other advisory groups involved in the learning assessment process include college curriculum and assessment committees (chaired by the Dean of the college) and assessment advisory committees within the Weinberg Memorial Library and Student Life. The Assistant Provost for Planning & Institutional Effectiveness/MSCHE Accreditation Liaison Officer engages regularly with OEA leadership to provide support and build linkages between learning and institutional assessment. The Assistant Provost for Institutional Reporting and Data Analytics plays an important role in supporting the technical and data needs of the OEA.

The recently launched Institutional Assessment Committee (IAC), chaired by the Assistant Provost for Planning & Institutional Effectiveness, brings the OEA director and others responsible for learning assessment together with personnel responsible for institutional (non-academic) assessment to identify and address shared opportunities for assessment activities, processes, and the application of results, including points of collection for assessment evidence and the use of national surveys and other evaluations.

Educational Assessment Procedures & Responsibilities[6]

1. Academic Programs

This plan requires that all academic degree programs conduct assessment of all Program Learning Outcomes (PLOs) on a three-year cycle (Appendix A, Figure 1). This means that programs will: a) identify appropriate approaches and artifacts for direct assessment, which may be embedded in courses whose Student Learning Outcomes (SLOs) most closely map to PLOs for the given cycle and/or b) identify and assess indirect evidence of PLOs.[7]

Academic departments and programs shall:

  1. Develop learning outcomes appropriate to the program(s) of study. 
  2. Post and maintain up-to-date PLOs on the program and/or department web page and notify the OEA of any changes to PLOs via annual reports or other methods. 
  3. Ensure that SLOs for every course are communicated in course syllabi. 
  4. As a recommended best practice, ensure that, for every course, one or more SLOs align with a PLO. Departments should also demonstrate alignment of PLOs to relevant ILOs. 
  5. Develop and refine a plan to assess all PLOs on a three-year cycle. The plan should include both direct and indirect assessment evidence, using the Program Assessment Report (PAR) template (see Appendix A, Figure 1) or other approved reporting method(s). 
  6. Gather and analyze evidence collected from key assessments according to the plan. 
  7. Describe how evidence is used to improve student learning and promote overall program improvement. Report and describe evidence of student achievement of PLOs according to the OEA's established procedure for each college. This includes submission of annual assessment-related materials and documentation via the Annual Planning and Reporting process required of all academic departments.[8] 
  8. The OEA collates and maintains an active list of current PLOs, which are posted on the OEA web site. 

Timeline:

Academic degree-granting programs shall conduct assessment planning and reporting on the cycle developed and communicated by their Dean's Office in consultation with the OEA. Assessment reports must be submitted each spring as part of the Annual Planning & Reporting materials submitted by the department in which the program resides. PLOs should be reviewed regularly as part of each assessment reporting cycle, and/or as part of program or accreditation reviews. In addition, interdisciplinary and other high-impact programs may also conduct regular learning assessment activities in line with the expectations or requirements of their departments and/or college. Such assessment should also be reported to the OEA via the Annual Planning & Reporting process.

If an Academic Program Review is being conducted for a program and/or department, the program review documents may serve as a PAR in the spring of the academic year in which the program review took place. Likewise, disciplinary accreditation reports that address learning assessment may serve as a PAR in the spring before the scheduled site visit. Departments should consult with the OEA to discuss the status of these activities.

 

2. Academic Deans

 The Dean of each college and the Weinberg Memorial Library, in keeping with their responsibility to oversee improvement of programs in their areas, will: 

  1. Review assessment plans and reports for each academic program in their college/division. 
  2. Ensure that academic program learning outcomes and assessment plans are reviewed and/or updated on a regular cycle, such as with Program Review (currently every 5 years) or in accordance with an accreditation cycle. 
  3. Document, disseminate, and communicate assessment results throughout their respective colleges/division and, where appropriate, with other campus groups and members of the faculty, staff and administration, through committee and other meetings, annual assessment days/retreats, and other mechanisms. 
  4. Lead discussions related to the use of assessment data for program improvement within their college with college-level Dean's Conferences and Curriculum and Assessment Committees, which will review assessment evidence reported by programs and identify and recommend to their Dean opportunities for improvement based upon those data. 
  5. Provide a report to the OEA on college-wide assessment evidence, demonstrating the way in which attainment of PLOs in the college supports ILOs, and any programmatic changes or improvements made to address assessment results. 
  6. As part of their own Annual Report to the Provost, summarize assessment activities within the college/division, including successful outcomes, and the application of assessment results for improvement and/or resource allocation. 

 

3. Weinberg Memorial Library

The Weinberg Memorial Library faculty are integral to student learning, especially regarding Library initiatives in assessment of information literacy.[9] The Library’s Information Literacy Program reflects the framework and the standards for information literacy developed by the Association of College and Research Libraries (ACRL). Library faculty have representation on both the EAAC and IAC to assure their active engagement and input to our overall assessment strategies.   

Information literacy is a fundamental component of the general education curriculum and is included amongst our ILOs. To support information literacy assessment, 

  1. Library faculty will identify direct and indirect evidence that information literacy classes, research services interactions, and other activities assist students in the achievement of one or more outcomes. 
  2. Library faculty will articulate changes or improvements in the methods used in instruction based on assessment results. 

Timeline:

 The Library conducts Information Literacy Program assessment on an annual cycle. Faculty submit assessment reports to the Information Literacy Coordinator each Spring, which are then posted on the Library’s Information Literacy Curriculum and Assessment web pages. Through public posting to the Library's website, these documents are also made available to the Assistant Provost for Planning & IE and the OEA. 

 

4. Student Life

 The Division of Student Life strives to foster extraordinary student formation of mind, body, and soul through a distinctly Jesuit educational experience that prepares reflective, compassionate, courageous, and capable graduates who thrive in justice, spirit and truth. To support this goal, Student Life departments engage in a variety of learning assessment activities[1]. These departments include but are not limited to the Roche Family Career Development, Office of Student Conduct, Office of Residence Life, Cultural Centers, Center for Student Engagement, Center for Health Education & Wellness, University Police, Counseling Center, and Student Health Services. The general assessment approach utilized by the Division of Student Life is as follows: 

  1. Departments identify relevant learning outcomes appropriate to their type of programming and intended student learning & formation.
  2. Departments gather direct and indirect evidence that programs and services are assisting students in the achievement of one or more SLOs, which often map to one or more ILOs. 
  3. Departments articulate changes or improvements in programs or services based on assessment results. This application of assessment findings is described in annual assessment reports prepared by each department. 

Departments submit assessment reports and plans to the Vice President for Student Life & Dean of Students and the Director of Student Conduct and Conflict Resolution each June. Based upon reporting lines, the Vice President for Student Life & Dean of Students or the Assistant Vice President reviews and provides feedback to departments, including feedback related to assessment activity, following receipt of these reports. This information is also made available to the Assistant Provost for Planning & IE and the OEA via the Annual Planning & Reporting system.

 

5. Institutional Planning, Effectiveness, and Reporting

The Offices of Planning & Institutional Effectiveness (OPIE) and Institutional Reporting and Data Analytics (OIRDA) regularly work with administrative departments to provide data and information for planning, and other improvement and decision-making needs. 

These offices assist in learning assessment in the following ways: 

  1. Develop and maintain a calendar of institutional assessments and other surveys. Administer key surveys (e.g., NSSE, Noel Levitz), conduct analysis of survey data, and assist others in exploring the use of these surveys for their own assessment needs. 
  2. Prepare reports of institutional assessment and other data. Share and guide review of assessment results with members of the campus community, including the EAAC and the IAC. 
  3. Archive information on surveys and other evaluative tools currently in use across the University that capture indirect evidence of academic and co-curricular student learning. 
  4. Consult with the OEA, Academic Programs, Colleges, and Student Life for the purpose of identifying and supplying evidence for indirect assessment, including data relative to the University's ILOs. 
  5. Coordinate the Annual Planning and Reporting infrastructure and cycle.

 

6. The Office of Educational Assessment

The OEA will guide the development of effective assessment processes; gather and collate assessment documentation; and review evidence of educational effectiveness assessment in academic and co-curricular programs, including General Education.  The OEA shall:

  1. Ensure that PLOs are in place for each academic program and are made available to students via program web pages and other communication vehicles. Monitor changes to PLOs, and prepare an annual, comprehensive listing of all PLOs on its own web site. 
  2. Monitor PLO connections to the ILOs, reviewing evidence of ILO assessment provided by Deans' reports and other sources. Prepare broad reporting on the state of ILO assessment and ILO outcomes. 
  3. Develop and oversee templates, reporting tools, and data management platforms for collecting, analyzing and reporting evidence of student learning. 
  4. Review assessment evidence submitted through the Annual Planning & Reporting System. 
  5. Consult with colleges, departments, and individual faculty and others on best practices in assessment. Review assessment plans and other core assessment documents prepared by departments. Make recommendations for improvements to program assessment processes. 
  6. Identify areas for faculty and staff development with regard to assessment of student learning; plan, implement, and evaluate resources and programs for faculty and staff development. Work with other University departments, such as the Center for Teaching Excellence, to provide assessment-related skills development. 
  7. Host gatherings and events to facilitate broad discussion of the use of assessment results to monitor and improve academic and co-curricular programs. 
  8. Prepare and submit an annual report on the plans, goals, and activities of the OEA to the Associate Provost for Academic Affairs via the Annual Planning & Reporting process. 
  9. Report evidence of student achievement of PLOs and ILOs, as well as the use of evidence for academic programs and co-curricular offerings, to the following entities:
     • EAAC, IAC, and Faculty Senate
     • The Office of Planning & Institutional Effectiveness, for communication to the Board of Trustees, MSCHE, and other internal and external stakeholders
     • The Provost and Associate Provost, to apprise academic leadership of assessment activities and their application, and
     • Students, faculty, staff and others, by way of internal communications and the OEA website 
  10. Develop and communicate resources and programming to support best practices in student learning assessment, conveying these to the University community via means that include the OEA website, scranton.edu/assessment. 
  11. Coordinate specific duties related to the assessment of the General Education Program, as described below. 

General Education Assessment Coordinator

Under the leadership of the Co-Coordinators of General Education Assessment, the OEA will oversee a regular GE assessment cycle. The GE assessment coordinators will: 

  1. Maintain an internally available dashboard of assessment results and how evidence is used for program improvement and decision-making.
  2. Promote best practices in GE assessment through information sharing, the annual Intersession Institute, and summer workshops.
  3. Routinely communicate and collaborate with the GE program coordinator and the Faculty Senate Executive Committee. 

 

7. Educational Assessment Advisory Committee

In support of these efforts, the Educational Assessment Advisory Committee (EAAC) will: 

  1. Advise the Director/s of the OEA on the impact and effectiveness of OEA processes. Regularly review the Comprehensive Plan and other core assessment materials to ensure they are current, understood, and appropriately promulgated. 
  2. Serve as a liaison between the OEA and the Faculty Senate and Student Government. 
  3. Consider and monitor the state of assessment at the University, including reflection on the sufficiency of evidence of student learning assessment to ensure attainment of our broader assessment goals and MSCHE standards. 
  4. Review assessment results, including institutional assessment materials such as student survey data, to identify, consider, and make recommendations related to indirect measures of student learning.

 

Appendix A: Program Assessment Report (PAR) Template (Figure 1)


 

Appendix B: Assessment Brief: A Guide for Using Results for Program Improvement

Student learning assessment is all about determining essential student learning outcomes – what we want students to know or be able to do as a result of their learning – and how well they are meeting those goals. To help illustrate this process, a visual such as the one below is commonly used:

[Figure: assessment cycle diagram]

 

The fourth phase of assessment planning – using results – is often referred to as "closing the loop." Taking the time to review, discuss, and reflect on assessment results is an important part of supporting continuous improvement in our programs. To facilitate this process, it is essential to share assessment findings amongst faculty, as well as others involved in academic leadership – department chairs, college curriculum and assessment committees, deans, and governance groups. Sample questions to guide the review of assessment results:

  • Do the results suggest the need to pay more particular attention to the predisposition and life experiences of the learner? What changes might be made? How and when will they be made? How and when will the effects of these changes be assessed?
  • What did the assessment results indicate about the level of achievement of the student learning outcomes?
  • Do the results suggest areas where improvements or changes should be made within the program, its curriculum, or its courses? Or, are there outcomes that we can celebrate – which describe success?
  • Do results describe or connect to broader learning goals, such as those at the program level and/or institutional learning outcomes or other goals?

Appendix C: The Evolution of the Comprehensive Plan

Following the University’s Periodic Review, in November 2013, the Middle States Commission on Higher Education (MSCHE) issued a warning expressing concerns with the University’s compliance with its Standard 14: Assessment of Student Learning (now Standard V: Educational Effectiveness Assessment). The University responded by creating a more visible and coherent infrastructure, namely the faculty-led Office of Educational Assessment (OEA). The Office operates under the supervision of the Associate Provost for Academic Affairs and is closely aligned with the offices of Planning & Institutional Effectiveness and Institutional Reporting and Data Analytics. Presently it is staffed by two faculty Co-Directors and Faculty Fellows. The Educational Assessment Advisory Committee (EAAC) counsels the OEA.[1] 

In considering a more effective approach to learning assessment processes, the OEA undertook an analysis in AY 2014-15 of the structures and processes already in place. Data for the analysis included college and University documents, as well as formal and informal conversations with those involved in assessment at all levels. From this analysis, the OEA concluded the following: 

  • Existing assessment structures and processes operated in silos.
  • Evidence of student learning was inconsistently reported and communicated.
  • Program improvement was infrequently driven by evidence.
  • Program assessment was more limited and inconsistent across programs that do not have external, professional accreditation requirements. 

This analysis, coupled with guidance and input from Academic Affairs leadership, the Faculty Senate, and other stakeholders, guided the Office of Educational Assessment in preparing a new Comprehensive Plan for Student Learning to outline a systematic approach to student learning assessment for the University. The Comprehensive Plan was approved by the Assessment Advisory Committee and Faculty Senate in early 2016, and has guided our efforts since that time.

Following the University's MSCHE Self-Study in 2019, the OEA began discussions with the AAC (now the EAAC) and other stakeholders to identify opportunities to further improve student learning assessment processes, including opportunities for the refinement and renewal of the Comprehensive Plan. Adding to this reflection were findings from a 2022 audit of the OEA and a review of updated MSCHE accreditation standards and evidence expectations. An updated plan document was presented to the EAAC in November 2023.

[1] First named the Assessment Advisory Committee (AAC), renamed in 2023 to more clearly communicate its role in educational assessment activities.

[2] As of October 2023, the University is in the midst of a multi-phase review of the general education curriculum. The general education assessment framework remains in place, though may adjust based upon the outcomes of that process.

[3] A framework describing core tenets and approaches to education and learning in the Jesuit tradition. See: Duminuco, V. J. (Ed.). (2000). The Jesuit Ratio Studiorum: 400th anniversary perspectives (1st ed.). New York, NY: Fordham University Press.

[4] Witek, D. and Grettano, T. (2016). Revising for metaliteracy: Flexible course design to support social media pedagogy. In T. E. Jacobson and T. P. Mackey (Eds.), Metaliteracy in practice (pp. 1-22). Chicago, IL: Neal-Schuman. (Citation is on page 5)

[5] "Our Core, Our Community, Our Commitments." 2020. See: www.scranton.edu/strategicplan. As described in our planning model, each college, division, and department (including academic departments) outlines goals in support of the strategic plan. Assessment results, including program learning assessment, may be used to inform goal setting and help monitor progress and outcomes.

[6] See “Assessment Roles and Responsibilities at The University of Scranton” document for additional details.

[7] These activities, however, do not preclude programs or individual faculty members from conducting assessment at the course level for their own interest, curriculum development, or to align with expectations of disciplinary (specialty) accreditation. Dean’s Offices consult with the OEA on determining any scheduling/cycle changes appropriate for departments within their respective college.

[9] For the Library’s long-standing assessment of information literacy, including their present plan and results, see: http://www.scranton.edu/academics/wml/infolit/assessment.shtml

[1] Non-learning focused Student Life departments also engage in non-educational evaluation as part of departmental and divisional planning and institutional effectiveness processes.

Provost and Senior Vice President for Academic Affairs:  Provides administrative oversight and support for a campus-wide program of educational assessment. Works closely with the Associate Provost for Academic Affairs, academic deans, and the Director(s) of the Office of Educational Assessment, to make available to faculty a variety of opportunities to develop, implement, and evaluate student learning. Collaborates with these leaders to monitor assessment activities across academic units.

Associate Provost for Academic Affairs: Reporting to the Provost and Senior Vice President for Academic Affairs, provides managerial oversight for student learning assessment and institutional assessment activities. Supervises the operations of the Office of Educational Assessment, Planning & Institutional Effectiveness, and Institutional Reporting & Data Analytics.

Assistant Provost for Planning & Institutional Effectiveness: Reporting to the Associate Provost, leads institutional planning, institutional assessment and effectiveness activities, and a variety of related compliance activities. Serves as the University’s Middle States Accreditation Liaison Officer, and, in this capacity, as a guide and resource to internal groups in understanding and conforming to Middle States expectations and requirements for institutional assessment and the assessment of student learning. Responsible for the coordination and official submission of all Middle States reporting and documentation.

Office of Educational Assessment: The Office of Educational Assessment provides oversight and coordination of the development of a campus-wide culture of assessment at The University of Scranton. It provides support for developing faculty and staff expertise in the measurement and analysis of student learning outcomes, program learning outcomes, and institutional learning outcomes. In consultation with faculty, departments, and schools/colleges, the office provides direction and consultation for analyzing and reporting assessment results. The office provides resources to faculty and staff for the effective assessment of the student learning experience and the promotion of best practices in assessment. The OEA comprises a Director/Co-Directors, a General Education Assessment Coordinator, and faculty assessment fellows representing the different colleges.

Director/Co-Directors, Office of Educational Assessment: Reporting to the Associate Provost for Academic Affairs, the Director of the Office of Educational Assessment leads the OEA and is responsible for the implementation of learning assessment activities at the University, including those outlined in the Comprehensive Plan for Student Learning Assessment. The Director/Co-Director provides leadership in support of each school/college's evaluation activities to assess evidence of student learning, program quality and effectiveness, and the student learning experience. The Director/Co-Director promotes best practices in assessment and supports other departments within the University in developing and implementing assessment strategies and in promulgating the results and implications of various student learning assessment projects. Consults with faculty and staff on developing and implementing assessment plans and reports, developing appropriate student learning outcomes, and choosing effective approaches and measurement instruments. Reports to the Office of the Provost on the plans and activities of the OEA, the overall state of educational assessment, assessment reporting and analysis, and related initiatives.

 

Coordinator, General Education Assessment: Reporting to the OEA director/co-directors, the GEA Coordinator provides leadership for the assessment, coordination, and improvement of the General Education curriculum to support the needs of colleges and programs conducting assessment activities for student learning, program quality and effectiveness, and the student learning experience. The GEA Coordinator is responsible for informing other departments within the University of the results and implications of various student learning assessment projects related to the General Education Program. In addition, the coordinator assists faculty and staff in planning, designing, implementing, analyzing, and reporting University-wide general education assessment efforts. Responsibilities include planning, locating, or developing tools to assess student learning outcomes and GE Program evaluation needs; providing opportunities for faculty and staff to learn and share methodologies and results; assisting University offices and leadership in ensuring that results of student learning outcomes assessment are used effectively for program improvement; and producing periodic reports on the results of GE assessment. In addition, the GEA Coordinator reports findings to members of the University community to improve the assessment of student learning across the GE Program.

 

Faculty Assessment Fellows: Facilitate assessment activity between and amongst faculty. In specific connection to their home college, Fellows support student learning assessment and collaborate with deans, department chairs, and program directors to assure the development of learning outcomes and assessment plans, timely reporting, and end-of-semester/academic-year program reports. Introduce and encourage student learning assessment activities, as well as:

  • Support faculty development, in collaboration with the CTE and academic departments, to promote faculty participation in assessment.
  • Advise faculty, departments, and colleges on assessment procedures and methods.
  • Assist departments and programs in completing the assessment process (closing the loop) to maintain and improve the student learning experience.
  • Review institutional and program-level learning assessment reports.
  • In addition to the above OEA responsibilities, Fellows may also serve as members of their home college's curricular/assessment committee.

 

Educational Assessment Advisory Committee:[1] Chaired by the OEA director/co-directors, the Educational Assessment Advisory Committee serves to develop and enrich effective assessment of the student learning experience at The University of Scranton. Members include Faculty Fellows; assessment staff from CAS, KSOM, PCPS, and the Library, including Associate Deans and Assistant Deans; representatives from the CTE, Institutional Reporting, and Student Affairs; and faculty representatives with experience in assessment from CAS, KSOM, and PCPS. The committee collaborates with and makes recommendations to the OEA to support ongoing systems of assessment, including development, implementation, and maintenance of the plan for assessment of the student learning experience.

Dean’s Group: Chaired by the Provost and comprised of the Provost, Associate Provost, and Deans. In addition to other academic responsibilities, provides a forum for discussion of assessment activities and results amongst the colleges.  

 

Dean CAS:  Responsible for the leadership of assessment activities within the CAS, including the development, maintenance, and implementation of the CAS assessment plan and related activities. Sets expectations for annual cycles of assessment reporting at the course, department/program, and college levels, and communicates those expectations to individual faculty, department chairs, and others. Reports to the OEA and Office of the Provost on college-level learning assessment activities and their application for improvement. Oversees the Academic Program Review for all CAS programs, which includes a review of the development and implementation of program assessment plans.

Associate Dean CAS:  The Associate Dean in the College of Arts and Sciences. The function of this role is to assure a close relationship between assessment at the program and College levels, Academic Program Review, and program development. The Associate Dean serves on the Assessment Advisory Committee and reports directly to the Dean.


CAS Dean's Conference: To assure a broad-based review of programs and a fuller understanding of assessment practices, assessment in The College of Arts and Sciences includes consultation and input from the College's two governance bodies: the Dean's Conference and the Curriculum and Assessment Committee. The Dean's Conference includes all department chairs in the College, as well as the Dean, the Associate Dean, and the Assistant Deans. The Dean's Conference meets regularly throughout the academic year—ordinarily once each month in addition to occasional meetings of smaller disciplinary groups—to advise the Dean on a wide range of issues affecting the College. In addition, at its final meeting of the Fall semester (ordinarily in December), the Dean's Conference reviews the results of assessment at the College level, according to the sequence and structure outlined in the "College of Arts and Sciences Student Learning Outcomes and Assessment Structure," and provides feedback to the departments through the minutes of these meetings. Department Chairs also consult with their Department faculty, as needed, on issues raised by College-level assessment.

CAS Curriculum and Assessment Committee: The College of Arts and Sciences Curriculum and Assessment Committee facilitates curriculum and program development in the College. This Committee will attend to the curriculum as a matter of ongoing deliberation, focusing not only on specific proposals but also on the continuous development of the curriculum as a whole, including the development of new programs. While department and program faculty have primary responsibility and authority for the assessment of student learning at the program level, the Curriculum and Assessment Committee also reviews and provides feedback on this assessment and Academic Program Review. Each fall and spring semester, the Curriculum and Assessment Committee reviews the results of College-level assessment, according to the sequence and structure outlined in the “College of Arts and Sciences Student Learning Outcomes and Assessment Structure” and provides feedback to the departments. The members of the Curriculum and Assessment Committee also consult with the College faculty, as needed, on issues raised by College-level assessment.

Dean PCPS:  Responsible for the leadership of assessment activities within the PCPS, including the development, maintenance, and implementation of the PCPS assessment plan and related activities. Sets expectations for annual cycles of assessment reporting at the course, department/program, and college levels, and communicates those expectations to individual faculty, department chairs, and others.  Reports to the OEA and Office of the Provost on college-level learning assessment activities and their application for improvement. With other college deans, collaborates with Provost to monitor assessment activities across academic units.

Assistant Dean of Assessment, Finance and Communications, PCPS: Aids departments in gathering data and in developing electronic alumni surveys on employment. Publishes assessment data to the web and directs licensure results to the University's consumer index page.

PCPS Dean's Conference: The monthly PCPS Dean's Conference includes department chairs and program directors and serves as a forum where assessment information is shared. The PCPS Dean's Conference also reviews the results of assessment information from the PCPS Board of Visitors. 

PCPS Curriculum and Assessment Committee: Reviews PCPS program curricular maps, which include specific direct and indirect strategies for assessing mastery of learning competencies, in order to ensure high quality, academically rigorous learning experiences for all PCPS students. Verifies that programmatic student learning outcome assessment activities are taking place and consults with programs about how their plans (rationale and method) may be strengthened and improved. Provides a structured forum for faculty from all PCPS programs to discuss best practices, what works and what doesn't work, and for showcasing examples of closing the loop by using assessment data to improve student learning.

Dean KSOM: Responsible for the leadership of assessment activities within the KSOM, including the development, maintenance, and implementation of the KSOM assessment plan and related activities. Sets expectations for annual cycles of assessment reporting at the course, department/program, and college levels, and communicates those expectations to individual faculty, department chairs, and others.  Reports to the OEA and Office of the Provost on college-level learning assessment activities and their application for improvement. With other college deans, collaborates with Provost to monitor assessment activities across academic units.

KSOM Assessment Committee: Chaired by a member of the KSOM faculty and composed of faculty and the Dean and Associate Dean of the college, the committee coordinates the college-wide assessment process and the collection, documentation, review, and communication of assessment data and information to faculty, staff, and students.

Dean Library and Library Faculty: The Weinberg Memorial Library, under the administrative leadership of the Dean of the Library in collaboration with the Library Faculty, supports the development of information literacy in University of Scranton students at all levels of study. This is accomplished through collaboration with course faculty to integrate information literacy student learning outcomes into their courses. The Library contributes to students' formation and development toward University institutional learning outcomes through an outcomes-based approach to assessing information literacy instruction sessions and through Library programming and activities in all functional areas of the Library. In addition, the Association of College and Research Libraries (ACRL) Framework for Information Literacy for Higher Education is integrated into the Library’s Information Literacy Program.

 

Faculty Senate Curriculum Committee (FSCC): A standing committee of the Faculty Senate, responsible for the review of all proposed changes to curriculum, including new courses and programs.

 

Conference Committee on Curriculum (CCC):  A standing committee of the Faculty Senate Curriculum Committee, primarily responsible for the implementation and oversight of the general education curriculum, and the assignment of general education attributes.

 

Academic Department Chairs and Program Directors: Provide leadership within the department and/or program for the development of student learning outcomes for their programs (some of which are in support of general education program learning outcomes, and/or institutional learning goals) and a program assessment plan. Oversee the implementation of the program assessment plan, and department/program reporting of assessment activities as guided by their department/program, college, or other assessment plans. 

 

Individual Faculty: Responsible for the development of learning outcomes for courses, some of which are in support of program learning outcomes, general education outcomes, and/or institutional learning outcomes, and the inclusion of those outcomes on all course syllabi, as required by the Faculty Handbook. Administer course-embedded assessment of learning outcomes and report assessment activities as guided by their department/program, college, or other assessment plans.

 

Office of Institutional Reporting and Data Analytics: Provides data and technical support for institutional studies and surveys, including those that provide indirect evidence of student learning of institutional learning outcomes, and the operational needs of the OEA. Provides support to the Assistant Provost for Planning & Institutional Effectiveness for annual reporting of planning and institutional effectiveness measures, including those within the University’s strategic plan.

   

Student Life: The Director of Student Conduct and Conflict Resolution serves as a support for assessment efforts within the Division of Student Life. Each department within the Division is responsible for its own assessment, but the Director helps to support these efforts while focusing on increasing communication, clarity, and collaboration.

Institutional Assessment Committee (IAC): Advisory to the Assistant Provost for Planning & Institutional Effectiveness, provides support and builds connections between institutional assessment activities across the University, especially those related to the University's mission and goals. Leads analysis of data collected via the University's institutional survey cycle. A peer group to the EAAC, with intentional cross-membership to facilitate mutual awareness and collaboration on assessment activities that overlap between institutional and learning assessment.

 

[1]Renamed from the Assessment Advisory Committee (AAC) in 2023 to better reflect the educational assessment focus of the group and distinguish from the peer Institutional Assessment Committee.

Past Director of Educational Assessment


Mary Jane DiMattio, RN, Ph.D.
Dr. Mary Jane DiMattio, Associate Professor of Nursing, served as the OEA Director. Dr. DiMattio is an alumna of The University of Scranton, has a master's degree in nursing education, and holds a research-focused doctorate in Nursing from the University of Pennsylvania. She has served in a variety of capacities within and outside of the University in roles related to research, assessment, and continuous quality improvement. Most notably, Dr. DiMattio chairs the Nursing Department's Evaluation Committee, charged with collecting, monitoring, and reporting outcome data for programmatic improvement and Commission on Collegiate Nursing Education accreditation. In addition, she chaired the Quality Committee of the Board at Mercy Hospital, continues to serve on the Board of Trustees at Regional Hospital of Scranton, and participates on a Nursing Research Council at Geisinger Health System. She also participated on the University committee that developed the initial draft ILOs in 2013. Having been a member of the Monitoring Report Coordinating Committee, Dr. DiMattio helped to maintain important continuity and institutional knowledge in the development of our assessment processes through and beyond the context of the Monitoring Report.

Email: maryjane.dimattio@scranton.edu

 

Past Faculty Assessment Fellows


Tara Fay, M.S.
Prof. Fay currently serves as the Laboratory Supervisor for Human Anatomy and Physiology and teaches Human Anatomy and Physiology lecture and laboratory, General Physiology laboratory, and a travel course called Extreme Physiology.  She became interested in using assessment as a tool to help her continually improve her courses and has attended numerous teaching workshops and conferences related to pedagogy and assessment. 

Linda Ledford-Miller, Ph.D.
Dr. Ledford-Miller first became involved with assessment under then-CAS Dean Dreisbach. She attended several workshops by external assessment consultants sponsored by the Dean, and subsequently led the Department of World Languages and Cultures in the creation of its Program Assessment Plan for modern language majors, completed in 2008. The Department revised its Assessment Plan in fall 2013. In spring 2014 she collaborated with Dr. Joseph Wilson to begin an Assessment Plan for Classics. Dr. Ledford-Miller was an inaugural Assessment Fellow from November 2013 until Fall 2016, and was an active member of the Monitoring Report Coordinating Committee. Together with colleagues from the MRCC, she attended a special workshop in Philadelphia on "Creating and Selecting Assessment Tools." She served as co-author of the 2014 Middle States Monitoring Report.

John Deak, Ph.D.
Dr. Deak learned about assessment practices and expectations when co-chairing the committee responsible for drafting the Middle States periodic review report in 2012-2013. He has attended Middle States workshops regarding assessment of student learning, participated in the Intersession 2014 CAS Assessment Pilot, and participated in several assessment related workshops and activities on campus. During his tenure as an assessment champion and fellow, Dr. Deak has assisted faculty and departments in refining their assessment practices and in conveying faculty concerns regarding assessment.

Harry Dammer, Ph.D.
Dr. Dammer has been an advocate for assessment since he developed an assessment plan for the Sociology and Criminal Justice Department in 2007. Since that time he has worked on assessment planning and implementation in his Department and has also become active in assessment in the Academy of Criminal Justice Sciences (ACJS). His recent activity with ACJS includes reviewing the assessment procedures of criminal justice programs across the country to determine if they meet accreditation standards. Dr. Dammer feels strongly that assessment need not be onerous; it has improved his own teaching and has helped his Department focus on improving student learning for all who take Sociology and Criminal Justice classes.

Cyrus P. Olsen III, D.Phil. (Oxon.)
Dr. Olsen is Associate Professor of Theology/Religious Studies at The University of Scranton, where he has been employed since 2006. A graduate of the Comparative History of Ideas Program at the University of Washington who earned his doctorate in Systematic Theology from the University of Oxford, he focuses his research and teaching on religion and society from 1750 to the present. In addition to his primary academic appointment, Dr. Olsen is also Research Fellow and Network Co-Lead for AI&Faith, and an Affiliate Research Fellow through a Templeton World Charity Foundation Grant at the Human Network Initiative, a subsidiary of the Dhand Lab specializing in Neurology at Brigham and Women's Hospital, Harvard Medical School, located in Cambridge, MA.

Nicholas P. Truncale, M.S.
Prof. Truncale participated in the pilot Intersession Assessment Group in 2014, assessing "It's Only Rocket Science," a GE Natural Science course he developed. Since then he has created a new course, Foundations of Physics and Engineering, which serves as his department's first-year Eloquentia Perfecta (EP) Level I oral communication and digital technology requirement. He assessed all of the EP student learning outcomes, earning the course permanent EP status, and presented these assessment results at a physics education conference. He co-authored a physics program review, participated in a departmental student retention study, and "closed the loop" by helping to enact changes based upon the results of the retention study. A Faculty Assessment Fellow since 2014, he currently directs the Provost Assessment Scholars Program.

Adam Pratt, Ph.D.
Dr. Pratt joined the OEA in September 2017. As a member of the History Department, he teaches courses on the Age of Jackson, the American Civil War, and Native American History, as well as introductory courses in U.S. history. He also teaches courses on historical research methods for undergraduates. In addition to the OEA, he sits on the General Education Assessment Committee, and is a member of the CAS Curriculum and Assessment Committee and the Faculty Senate Curriculum Committee.
