
2016 Conference Schedule

To download a PDF copy of any presentation, find the presentation below and click the PDF button to the right of the title. If there is no PDF button, a copy of the presentation was not provided to the Conference Committee.

Program Materials


Sep 7, 2016
2:15 PM - 3:15 PM

Concurrent Session 1

PDF

DIY General Education Assessment: A Campus-Wide Assessment Program Overnight with What You Already Have Jason Adsit and Gina Camodeca, D'Youville College PISB 104

So you've gotten an MSCHE warning about your assessment limitations (or you fear you're about to), and you don't know what to do. Your faculty means well, but they don't understand general education assessment. Your administration means well, but they don't understand general education assessment either. Maybe you have bought some tools from a snazzy vendor that you thought might help, but nobody really knew what to do with them, so they just drained resources and didn't help. You're in a situation of learned helplessness around assessment and can't see the way out. This session is for you. We will describe this context of learned helplessness around assessment, which is very common but not insurmountable. We will explain how we organized and implemented a straightforward, human-resource-driven, campus-wide, course-embedded assessment plan. In short, this is a general education assessment model that your administration won't find onerous in terms of investment and your faculty won't find to be a bummer. The session will include a 30-40 minute PowerPoint presentation and 20-30 minutes of audience participation, during which we will use one of our own embedded rubrics to quickly render outcomes data on the session itself for discussion.

Learning Outcomes:

1. Participants will gain a basic working knowledge of what embedded assessment is and how it works
2. Participants will be able to develop assessment teams, develop assessment rubrics, and render assessment data quickly and meaningfully for general education

Audience: Beginner

PDF

Accreditation Drove Quality Assessment: Making Lemon Sorbet from a Case of Lemons Dale Trusheim, Phyllis Blumberg and John E. Connors - University of the Sciences PISB 106

MSCHE told us that we needed to produce, within three months, an extra monitoring report assessing student learning outcomes (SLO) for every academic program. They wanted to see all outcomes, how each was assessed, and the latest results. They also wanted us to perform a gap analysis of the achievement of the outcomes. As a result, we created a Google sheet with column headings for all of the required information and a row for every educational program. This sheet was shared with all department chairs and program directors. We learned several generalizable lessons from this spreadsheet exercise: 1. The data collection meetings served as a teachable moment for faculty and administrators. 2. Having the entire spreadsheet available for all chairs and directors to inspect also helped people learn what others were doing and could lead to some improvements. 3. It was easy to monitor progress toward completion. 4. Even the most reluctant assessment laggards cooperated because there was a perception that our accreditation status might change. 5. We plan to use a similar Google sheet for future annual assessment reports. A gap analysis determined the present state of the university's assessment processes and assisted in determining the steps necessary to move assessment of student learning outcomes from its current state toward more transformative goals and larger impacts.

Learning Outcomes:

1. Participants will be able to apply lessons learned while complying with an accreditation request to their own accreditation process
2. Participants will be able to perform a gap analysis of the achievement of their own student learning outcomes and learn how to strive for more transformational goals

Audience: Intermediate

PDF

Degree Qualifications Profile (DQP) = Institutional Learning Outcomes (ILOs): What's Next Christina Dryden - American Public University System PISB 108

In this presentation we discuss our journey in adopting the DQP as our university's institutional learning outcomes (ILOs). The Lumina Foundation's Degree Qualifications Profile (DQP) framework, along with digital literacy, was adopted by APUS as our institutional learning outcomes in 2013. Phases I and II of this endeavor occurred over 2014-2015 and included mapping of program objectives (PO) and identification of signature assignments. In order to provide some feedback about the alignment of the POs and signature assignments, a quality assurance review was completed. During the process of ILO mapping and identifying signature assignments, we realized a need: the creation of rubrics that aligned with the newly adopted ILO areas and could be used university-wide, regardless of the program or signature assignment. Participants will be polled to evaluate their use of the rubric. Discussion will be invited to see how other campuses pursue university-wide rubrics. We will also have the audience look at the alignment of program objectives and the institutional learning outcomes as part of a quality assurance process. Here we will invite the audience to discuss areas where a quality assurance process could be applied on their campus.

Learning Outcomes:

1. Participants will be able to demonstrate proficiency using our ILO (DQP) rubrics
2. Participants will be able to describe areas across their programs where our quality assurance process might be used.

Audience: Intermediate

PDF

To Flip or Not to Flip Steven Billis and Nada Anid - New York Institute of Technology Pearlstein 101

The flipped classroom is gaining popularity as a teaching strategy that allows instructors to create an active learning environment. It places the responsibility for learning on the students and changes their role from listeners to learners. In the flipped classroom, instructors typically assign online recorded lectures as homework and use face-to-face time for active learning exercises and direct engagement with students in the classroom. A study conducted by faculty in the Electrical and Computer Engineering Department of the University of Florida, Gainesville, and published in the IEEE Transactions on Education reported significant gains in both students' performance and retention rates after flipping a Circuits I class. This presentation will describe a flipped-classroom approach to a one-semester Fundamentals of Digital Design required course for Electrical and Computer Engineering majors, adopted in order to lower the failure rate and to further motivate students so as to reduce overall attrition. Drawing on my own formative and summative assessment activities, and acknowledging the many ways to implement this model, the presentation will describe the characteristics and challenges that the most successful flipped classrooms typically share, and may serve as a resource for instructors who are deciding whether to flip or not to flip.

Learning Outcomes:

1. Participants will learn, through my own formative and summative assessment activities, about the potential challenges they will face when flipping a class for the first time.
2. Participants who are considering whether to flip will receive data so that they may make an informed decision.

Audience: Intermediate

Honoring Faculty Well-Being to Build a Culture of Assessment Carolyn Haynes, Renee Baernstein and Rose Marie Ward - Miami University (Ohio) Pearlstein 102

Six years ago, assessment at Miami University was almost invisible.  Rather than offer a story of success, the facilitators of this session will give participants a short summary of our journey to build a culture of assessment, a journey that is very much still in progress and characterized by some modest successes along with several colossal mistakes.  We believe that our journey would have been easier and more productive if we had better leveraged Charles Walker's five conditions for faculty well-being (2002, 2003): (1) honoring faculty expertise; (2) enabling control of one's work; (3) providing reliable sources of support; (4) offering feedback on the quality of one's work; and (5) setting challenging and meaningful goals. After hearing a brief summary of the five-year journey of building a culture of assessment at Miami University (Ohio), participants will work in small groups to analyze the challenges and benefits of the approaches Miami used and then to generate 3-4 guidelines or pieces of advice institutions can use to cultivate a culture of assessment that relies upon (rather than works against) faculty well-being.

Learning Outcomes:

1. Participants will be able to apply Charles Walker's five conditions for faculty well-being to the process of fostering assessment of student learning across a campus
2. Participants will be able to generate tips and guidelines for building a faculty-oriented culture of assessment

Audience: Intermediate

PDF

Our QuEST for Healthier Outcomes: Evaluating Revisions to the General Education Wellness Requirement Susan Donat and Mindy Smith - Messiah College Gerri C Lebow Hall 109

Messiah College reviewed and revised the Wellness component of the General Education curriculum, known as QuEST (Qualities Essential for Student Transformation). The new curriculum integrates regular participation in student-selected physical activity with wellness research, discussions, goal-setting and seminars. Our revised general education wellness model seeks to instill wellness behaviors that can be practiced during college and then maintained beyond. The College offers approximately 15 different wellness courses each semester. Embedded into the different courses are common core components, including wellness seminars, current research articles, and goal-setting programs, which encourage a common conversation throughout campus yet allow for different physical activities and pedagogies.  Our revised curricular wellness program provides students with an opportunity to develop a pattern of regular physical activity throughout the semester. Simultaneously, students are now also engaged in conversation regarding the integrated connection of physical, relational, emotional, and spiritual wellness.  By addressing potential barriers and courses of action for pursuing holistic wellness, students work toward goals through problem solving, realistic evaluation of their actions, and adaptation to changing situations.  Our assessment data indicate that the revised program is significantly more successful in meeting student learning objectives, in student satisfaction, and in self-reported activity level.

Learning Outcomes:

1. Attendees will be able to identify the process of reviewing and revising a General Education Wellness component and discuss potential pitfalls within the process.
2. Attendees will review the assessment findings of the Wellness Program revisions, discussing potential future refinements.

Audience: Intermediate

An Integrative Approach to Managing Curriculum and Assessment Processes: A Discussion of Leadership and Technology Jacob Amidon, Debera Ortloff and Gigi Devanney - Finger Lakes Community College Gerri C Lebow Hall 209

As educational leaders engaged in curriculum and assessment, regardless of our college affiliation, whether two-year or four-year, small or large, all of us are in some way, shape, and form trying to create a robust culture of assessment -- one which truly supports continuous improvement and integrates the whole campus, from classroom to co-curricular to service area.  There are fairly well-documented resources on the importance of creating such a culture as well as best practices for working toward it.  Yet buy-in will continue only if the bureaucracy supporting the process is efficient, well-managed, and reflective of the campus culture. We argue in this presentation that managing curriculum and assessment requires an integrative approach combining educational and technological leadership.  Based on our creation of a fully online, customized curriculum and assessment management process, using an integrative software solution including Chalk and Wire and SurveyGizmo, we will detail the lessons we learned in ultimately finding a solution for efficiently managing the bureaucratic processes, such as recording curricular change, that must accompany a robust culture of assessment.

Learning Outcomes:

1. Participants will learn strategies for avoiding pitfalls in connecting assessment reform with technology
2. Participants will learn about thinking through and creating efficient systems for curriculum and assessment management

Audience: Intermediate

Sep 7, 2016
3:30 PM - 4:30 PM

Concurrent Session 2

PDF

Bridging General Education and the Major: Critical Thinking, the Mid-Curriculum, and Learning Gains Assessment Jane Detweiler and Russell Stone - University of Nevada, Reno PISB 104

Now more than ever, regional accreditors require institutions to report on university-wide learning outcomes and learning gains across curricula.  Critical thinking is vital to the integration of a general education program within undergraduate majors.  With a brief overview of critical thinking and integrative learning, teaching and assessment strategies, and a UNR degree program as a test case, we will guide a thirty-minute discussion (designed for an intermediate-level audience) on the role of critical thinking at the beginning, middle, and end of a curriculum.  Participants will draw on their own general education plans and degree programs to answer three questions over the next thirty minutes: Can we identify assessable critical thinking at the introductory and senior levels? Can we trace how students progress in critical thinking from the one to the other? And how can we lead faculty in curriculum mapping that effectively articulates their expectations as teachers?   Working in small-group discussions, participants will draft critical thinking-related SLOs and identify appropriate assignments and assessment strategies. Participants will also develop a plan to align critical thinking across the disciplines with integrative learning in the majors.

Learning Outcomes:
1. Participants will be able to apply concepts of critical thinking that bridge general education and degree program curricula
2. Participants will be able to develop an assessment model that aligns general education and degree program outcomes

Audience: Intermediate

Snapshot Sessions (5 minute Mini-sessions) PISB 106

Formative Assessment in the Online Classroom
Krys Adkins, Drexel University

Calibrating Teaching Assistant Scoring in Large Lecture Sections; Identifying Standards and a Strategy for Intervention
Dylan Audette, University of Delaware

Assessment Quality: The Test Blueprint for Validity
Diane Depew, Drexel University

A Collaborative Approach to Creating a Graduate Student Survey
MacKenzie Lovell and Amanda Albu, Temple University

"No Stakes" Direct Assessment, With "Carrot and Stick" to Inculcate Professional Development in Student Pharmacists
Diane Morel, Nicole Salamantin and Lisa Charneski, Philadelphia College of Pharmacy, University of the Sciences

How Well Do You Know Your Off-campus Clinical Sites?
Jonette Owen - Salus University

Pieces of the Program Assessment Puzzle
Bernice Purcell - Holy Family University

Stop Doing Assessment by Hand: Using Assessment Software for Your Small School
Ruth Sandberg and Rosalie Guzofsky - Gratz College

Best Practices in the Assessment of Adult Learning - New Contexts and Paradigms
Adrian Zappala - Peirce College

PDF

Categories of Student Learning: A Concept Model for Aligning MSCHE Standards for Educational Effectiveness Assessment Krishna Dunston, Amy Birge and Elisa Seeherman - Community College of Philadelphia PISB 108

The new MSCHE Standards allow institutions to think critically about how they are designing, delivering, supporting, and assessing student learning, not as disparate silos but as a cohesive institutional whole. They demand that we think about learning as the student experiences our institution. As institutions seek to improve assessment at every level and across the institution, there is a risk of developing specific, targeted evaluations which do not provide a view of the whole.  I have found it useful to develop categories of student learning as an organizing principle. The purpose of categories is to create an umbrella under which an institution can define program objectives, student learning outcomes, student support services goals, and co-curricular experiences. It also serves as a framework for introducing new initiatives and integrating them into diverse program structures.  In this presentation I will be joined by my former colleague at the University of the Arts, Elisa Seeherman, Director of Career Services, to present our collaboration in defining a cross-institutional assessment of professional preparedness as a case study of the categories model; and my current colleague at the Community College of Philadelphia, Dr. Amy Birge, Coordinator of Curricular Development, to present our synthesis of faculty resources for writing and improving student learning outcomes, framed by newly developed categories.

Learning Outcomes:

1. Participants will be able to investigate a case study in which categories were utilized to define a cooperative institutional assessment between career services and multiple program areas
2. Participants will be able to discuss the use of categories to design curricular development resources for faculty

Audience: Advanced

PDF

Holding the T: Making Rubrics Work for You Belinda Blevins-Knabe, Joanne Liebman Matson and Brian Ray - University of Arkansas at Little Rock Pearlstein 101

The question, "Are our students learning what we want them to learn?" is a driving force behind assessment of student learning.  This question unifies the interests of accrediting bodies, institutions, and faculty. Our objective is to identify meta-principles that can guide assessment in any context and to demonstrate through an example how they can be adapted to fit a particular context.  In our session, we will discuss and facilitate a short workshop on a rubric that is both common and contextual. Rubrics are used across the nation for direct assessment of student learning outcomes.  The introduction of AAC&U's VALUE rubrics in the LEAP project helped faculty understand the possibilities for using common rubrics for multiple assignments and even multiple disciplines. We will describe how we, as faculty, have used rubrics as a direct measure of student learning in writing, moving toward contextually based common rubrics that share central dimensions but are adaptable to different contexts.  Overall, we aim to present a case for the value and flexibility of rubrics.   We will share sample writing rubrics and lead the group through the process of adapting the rubrics to assess student writing at their institutions.

Learning Outcomes:

1. Participants will learn about the national movement toward "common rubrics"
2. Participants will be able to adapt a broadly used rubric to their own local university and disciplinary contexts

Audience: Intermediate

Creating Climate Change: Increasing Faculty Engagement to Generate Results Susan Deane, Cheryle Levitt and Jennifer Lusins - SUNY Delhi Pearlstein 102

This session will focus on strategies for standardizing assessment initiatives and increasing faculty engagement among three different programs in a school of nursing.  When we began the standardization of program assessment, there was a wide disparity in faculty interest, understanding, commitment, and willingness to engage in the assessment process. It was clear that a variety of creative strategies were needed to promote assessment in a meaningful and manageable way. Over the course of an academic year a number of initiatives were launched to promote and standardize assessment.  The program assessment directors will present how the following initiatives served to increase faculty engagement in program assessment: revisions of the Compliance Assist template, faculty training videos, individual tutorials, establishing a set schedule for data entry, and bi-monthly reporting of assessment data at faculty meetings.  Attendees will learn about the successes and challenges in establishing a climate that is supportive of ongoing and productive program assessment. The audience will participate in a Think-Pair-Share exercise, applying these initiatives to their own assessment processes and identifying potential resources and challenges in implementation.

Learning Outcomes:

1. Participants will be provided with strategies to promote increased faculty engagement in program assessment.
2. Participants will be able to identify resources and potential challenges in the implementation of these assessment strategies in their own academic programs.

Audience: Intermediate

PDF

Ethics Assessment in Marketing Courses Using a Business Ethics Simulation Game Lawrence Duke - Drexel University Gerri C Lebow Hall 109

The purpose of this session is to propose a new approach to ethics education and assessment in marketing courses. Most US business schools embrace an institutional mission that includes moral development as a desired student outcome. While the Association to Advance Collegiate Schools of Business (the premier business school accreditation agency) requires business schools to meet ethics education expectations, it does not specify any courses or program template for delivering ethics education to business students (AACSB International, 2015). This allows for ample flexibility among business schools on how this policy should be implemented. Given the above, my primary research question is whether an ethics education intervention based on a business ethics simulation game will significantly increase marketing students' moral reasoning skills as assessed by a "gold standard" instrument.

Learning Outcomes:

1. Participants will be able to recognize the potential benefits of moral judgment assessment in promoting more effective ethics education approaches
2. Participants will be able to identify effective teaching opportunities through the use of experiential learning approaches, such as a business ethics simulation game

Audience: Intermediate

PDF

Course Level Assessment: No, it is Not Punitive and Yes, it Can Be Fun! Karen Bull - Syracuse University Gerri C Lebow Hall 209

Faculty are brilliant in their content areas, but many of them have not been taught how to design, deliver, and assess their courses. In most instances, when discussing assessment, faculty balk and immediately think of negative ramifications for themselves.  The presenter will discuss what one institution did in order to shed a positive and impactful light on assessment. As an instructional designer, the presenter leverages the required course syllabus template as a roadmap to assist faculty in aligning their student learning outcomes and their course assessments. Further, the opportunity is seized to discuss the importance of objective grading and to encourage faculty to incorporate at least one rubric into the course design for a single assessment. Additionally, all primary documents (course syllabi, assessments, etc.) are gathered in a single database which can then be used to drive curriculum program reviews and performance improvement plans, and ultimately to inform the completion of the periodic review report for accreditation.   This intermediate session is designed for administrators who help faculty design and develop courses and are involved in assessment and accreditation activities at their institutions.

Learning Outcomes:

1. Participants will be able to articulate a process for aiding faculty in successfully aligning their course-level outcomes with course-level assessments, and then with program-level objectives and institutional goals.
2. Participants will be able to compare their own processes for aligning course-level assessments with course-level outcomes, program objectives, and institutional goals.

Audience: Intermediate

Sep 8, 2016
7:30 AM - 8:30 AM

Continental Breakfast 1

PISB Atrium

Sep 8, 2016
10:00 AM - 11:00 AM

Concurrent Session 3

PDF

Let's Give Them Something to Talk About: How About Gen Ed Outcomes? Jeff Bonfield, Roberta Harvey and Bharathwaj Vijayakumar, Rowan University PISB 104

Unlike many assessments, which are designed to answer how well students are learning and developing, Rowan University engaged in an assessment that was designed to explore faculty assumptions and expectations about what students are or should be learning.  Over the last six years, the University has undertaken a significant reform of its general education program, centered on the adoption of six core literacies (Artistic, Communicative, Global, Humanistic, Quantitative, and Scientific) and associated learning outcomes.  The new outcomes will shape the institution's general education program, called the Rowan Core.  The Fall 2017 class will be the first students to whom the Rowan Core requirements apply.  Attendees at this presentation will be able to design an intervention that directly addresses the challenge of integrating general education outcomes into students' major requirements and indirectly promotes students' understanding of those outcomes.  Attendees will be able to describe a novel data visualization tool (using Tableau) that could be replicated on their campus.  The intent of the discussion will be to generate ways to modify Rowan's technique to address the unique challenges of each institution.

Learning Outcomes:

1. Participants will be able to design an intervention that directly addresses the challenge of integrating general education outcomes into students' major requirements and indirectly promotes students' understanding of those outcomes.
2. Participants will be able to describe a novel data visualization tool (using Tableau) that could be replicated on their campus

Audience: Intermediate

PDF

Learning from the CAEP Assessment Process Within HBCU Environments: Examining our Strengths and Challenges in Classroom and Program Review Pamela Felder, Michael Reed, Kimberly Poole-Sykes and Nomsa Geleta, University of Maryland Eastern Shore PISB 106

The purpose of this 60-minute panel presentation is to discuss the strengths and challenges associated with facilitating Council for the Accreditation of Educator Preparation (CAEP) assessment within an HBCU environment. Panelists will discuss their experiences interpreting CAEP assessment guidelines and how their participation in classroom and program review served to inform the process of accreditation. Classroom and program assessment are at the nexus of educational and curricular developments. In meeting the needs of national, statewide, local, and institutional assessment criteria, peer review and interaction are essential to understanding the effectiveness of classroom and program strategies. Understanding the impact of classroom and program effectiveness relative to review standards is the basis for facilitating CAEP assessment. In particular, peer review and interaction can serve to illuminate what strategies support and/or hinder teaching and learning goals in an effort to address CAEP standards and expectations. Overall, the goal of this session is to provide the audience with information about an institutionally driven CAEP assessment process, relative to classroom and program guidelines that serve to support specific student populations.

Learning Outcomes:

1. Participants will be able to discuss strengths and challenges associated with classroom and program review during CAEP Assessment
2. Participants will be able to understand strengths and challenges associated with classroom and program review during CAEP Assessment within HBCU environments

Audience: Intermediate

After the Review Team Leaves: Planning for Improvement Post-Periodic Program Review Gina Calzaferri, Temple University PISB 108

Periodic program review is a valuable process for encouraging the continuous improvement of programs and departments, with the goals of assessing what programs do; clarifying expectations for teaching, research, and service; reviewing indicators of quality and student outcomes; and establishing plans for improvement, among others.  Sustaining a high quality program review process and experience requires thoughtful planning, resources, institutional support, and coordination among various offices across campus.  Yet the most critical aspects of program review occur after the review team has left campus, when the program considers recommendations from its own self-study and the visiting team report and decides how (and if) to use this information for program improvement.  This session demonstrates how one large research institution has implemented a “Plan-for-Improvement” procedure to ensure that program review remains a central activity for evaluating program effectiveness and informing planning and the allocation of resources. Participants will be briefly introduced to the university's model for Periodic Program Review and will learn in more detail about the post-review process, including the follow-up Plan-for-Improvement Survey.

Learning Outcomes:

1. Participants will gain an understanding of Temple's Periodic Program Review model
2. Participants will learn about the post-review process and receive a sample of the follow-up Plan for Improvement Survey

Audience: Beginner

PDF

Teaching Quality Should Drive Assessment Phyllis Blumberg - University of the Sciences Pearlstein 101

The purpose of education is to help students learn and succeed after graduation. Helping students learn and succeed is an essential aspect of quality teaching. Suskie (1) summarized the massive literature on helping students learn and succeed through two consistent themes: (1) student engagement in the learning process, and (2) faculty and students sharing responsibility for learning.  Assessing teaching should become an integral aspect of the teaching process.  Three principles of good assessment define how to assess teaching: 1. Use explicit, objective, and uniform criteria. 2. Triangulate data from a variety of different sources of information. 3. Tie into evidence-based literature and data. Faculty performance evaluations should be linked to desired student learning outcomes. Therefore, assessments should include measurements of student learning and of teaching strategies to foster learning. If quality teaching is evidence-based, then assessment of teaching should also be evidence-based. At the acceptable level, faculty need to show evidence of critically reflecting on the information given in student evaluations, student learning outcomes, and course artifacts to determine how well they are teaching. At the middle level, faculty use evidence-based literature to support their teaching. At the highest level, they engage in the scholarship of teaching and learning.

Learning Outcomes:

1. Participants will be able to describe implementation examples of evidence-based, best practices of quality teaching
2. Participants will be able to discuss why best practices should drive assessment of teaching and identify objective ways to assess teaching using these best practices

Audience: Beginner

PDF

Popping the Question--Time to Get Engaged! Salvatore D'Amato - D'Youville College Pearlstein 102

How often have you heard members of the faculty express concern about their students' lack of participation and apparent lack of interest?  How often have professors said that they feel they must rely on a handful of students to keep the conversation alive and to ask and answer questions?  This interactive workshop offers faculty a number of easily implementable strategies that encourage all students to pay attention and participate in class discussions.  After we explain our rationale and some cueing and pacing strategies, participants will assume the role of students in a few exercises.  Snowballing encourages students to share experiences and thoughts about sensitive or controversial topics.  Dialectic Journaling helps pairs of students share their understanding of concepts and issues as they construct meaning together and consider each other's perspectives.  Tabletop Round Robin adapts dialectic journaling for groups of students.  We will also present a list of questions that may discourage learners from participating.  A benefit of these activities is that each provides professors with observational and tangible evidence of student learning, so that they can make informed decisions about future instruction and provide more purposeful feedback to students.  Handouts include guidelines for questioning and directions for activities.

Learning Outcomes:

1. Faculty will be able to engage all students in deeper conversations by adopting and adapting questioning strategies and activities.
2. Faculty will be able to use observational and tangible records from questioning as evidence of student learning and to inform future instruction.

Audience: Beginner

PDF

Ethical Reasoning: Defining, Teaching, Assessing Keston Fulcher - James Madison University Gerri C Lebow Hall 109

This presentation focuses on a particular type of learning, ethical reasoning, which we consider essential to student success. Regardless of how proficient students are academically, this learning is for naught if their skills are applied unethically. Despite widespread concerns about definition, assessment, and teaching strategies, JMU forged ahead with ethical reasoning because of what is at stake: students' ability to navigate complicated ethical situations.  After several years of planning, JMU created a learning system to integrate the learning, teaching, and assessment of ethical reasoning. Experts in ethical reasoning, teaching, and assessment worked collaboratively to create the Eight Key Questions (8KQ) process.  Students deliberate through the following considerations before making a decision: fairness, outcomes, rights, character, liberty, empathy, authority, responsibilities. Pedagogy, curriculum, and assessment were explicitly integrated around this framework. Students' ethical reasoning skills, at the university level, have increased substantially as a result. Indeed, we can show that participation in these high impact practices truly translates into demonstrable impact. The purpose of this presentation is NOT to mind-numbingly recount the details of JMU's project. Rather, it is to actively involve attendees in a thought process about how to integrate learning, teaching, and assessment to move the needle on an important student learning outcome.

Learning Outcomes:

1. Participants will be able to cite a definition of ethical reasoning (the Eight Key Questions framework)
2. Participants will be able to explain why teaching, learning, and assessment should be integrated

Audience: Intermediate

PDF

Drexel Outcomes Transcript: Building Academic Innovation and Renewal Using an Effective Assessment Process Mustafa Sualp and Caitlin Meehan - AEFIS; Stephen DiPietro and Donald McEachron - Drexel University Gerri C Lebow Hall 209

In higher education, courses and instructors are often functionally siloed, and students fail to see the connections between curricular elements. Outcomes-based design and assessment should address this problem but often does not, due to a significant disconnect between what students and faculty understand about the significance of student learning outcomes. In an effort to address these issues, a complete assessment management solution approach and software are being designed and implemented to create ‘learning outcomes transcripts’ which transcend individual courses and educational experiences. By providing developmentally relevant feedback to students in real time, these transcripts may promote significant student ownership of learning outcomes, creating a stronger sense of purpose and curricular continuity. That, in turn, should promote more effective student learning and academic performance.

Learning Outcomes:

Audience: Advanced

Getting them in the Game: A Participatory Approach to the Evaluation of Assessment Infrastructure Sade Walker and Zornitsa Georgieva - Prince George's Community College PISB 120

By now most institutions have developed approaches to assessing student learning in individual courses, programs, and the institution as a whole, with the focus on ensuring that students acquire necessary skills and abilities. The development and improvement of learning outcomes, assessment tools, data collection methods, and the use of assessment results require the presence of an assessment infrastructure. We define infrastructure as the policies and procedures that guide the day-to-day assessment processes, in addition to an organizational support structure.  An infrastructure is necessary for a culture of assessment to be established and to flourish, allowing time and space for assessment to develop organically rather than as an add-on.  With this in mind, we turned the assessment lens on ourselves to study how our current assessment infrastructure supports the assessment of student learning outcomes (SLO) at our institution. In using this participatory approach we were able to move from "I am going to hear you" to "I am going to involve you."  By the end of the investigation, participants had taken ownership of the decision-making process and crafted the improvements to the assessment infrastructure. We will share lessons learned, including the benefits of actively involving stakeholders.

Learning Outcomes:

1. Participants will be able to describe a stakeholder-driven approach to evaluating their current SLO assessment infrastructure.
2. Participants will be able to facilitate active participation of stakeholders in the decision making and vetting process of assessment infrastructure improvements

Audience: Intermediate

Sep 8, 2016
11:15 AM - 12:15 PM

Concurrent Session 4

PDF

One Size Fits All: Using AAC&U Rubrics to Facilitate Interdisciplinary Assessment of General Education Carolyn LaMacchia, Michael McFarland, Mindi Miller and Tom Kresch - Bloomsburg University PISB 104

General Education (GE) involves core courses and experiences to promote better communication and problem-solving abilities in higher education graduates. The aim of General Education is to foster awareness and skills in students beyond the focus of a declared major. Measuring GE outcomes and comparing results across disciplines is not easy, but faculty and accrediting bodies recognize the importance of GE assessment.  The Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics of the Association of American Colleges and Universities (AAC&U) provide a method for assessing GE outcomes and comparing aggregate data.  Bloomsburg University (BU) based its revised GE program on VALUE rubrics in order to give departments and divisions flexibility in selecting elements from the rubrics that could be aligned with course objectives and expected student learning outcomes (SLOs). Academic faculty members and co-curricular staff members are using VALUE rubrics to assess benchmark-to-capstone SLOs for the goals within their GE-approved courses.  Results each semester are organized via TracDat and SharePoint software with the assistance of the Office of Planning and Assessment.  Specialty groups organized by the General Education Council are following a yearly plan to address each goal.

Learning Outcomes:

1. Participants will be able to discuss how General Education can extend beyond a single discipline to include campus-wide GE assessment
2. Participants will be able to critique the BU model to address challenges and potential solutions to intra-department and inter-disciplinary GE assessment

Audience: Advanced

What'd You Say?: How to Communicate During the Self-Study Process Gail Fernandez, Shyamal (Sony) Tiwari and Larry Hlavenka Jr. - Bergen Community College PISB 106

Organizing and navigating an institutional self-study requires consideration of how information will be communicated to those involved in the process, the college community, and the public at large.  It is therefore essential that, early in the process, a clearly articulated and actionable communication plan be developed to ensure transparency and provide clear benchmarks to evaluate progress.  In this session, we will share Bergen Community College's communication plan, including how we (1) created the plan; (2) crafted the themes; and (3) delivered a consistent and meaningful message.  Several foci guided the team's work: (a) educating the college community about the significance and relevance of the self-study; (b) combating misinformation and faulty perceptions about the scope and mechanics of the reaccreditation process; (c) earning support from key constituencies by publicizing the process and sharing information at key intervals; and (d) recognizing the efforts of working groups.  A number of factors contributed to the team's success, above all the need to maintain credibility and trust among those involved.  This session will benefit institutions beginning their self-study or looking to improve coordination in other college-wide initiatives.  The presenters will share their communication plan, sample correspondence, and other artifacts from the process.

Learning Outcomes:

1. Participants will learn about the key components of an actionable self-study communication plan
2. Participants will learn how a robust and transparent communication plan maintains credibility and builds trust

Audience: Intermediate

PDF

Using Simulation, 360-degree Feedback, and AARs to Assess Individual/Team Performance in Different Delivery Formats Jim Caruso, Drexel University PISB 108

This session will describe the creative and innovative integration of various assessment tools, along with business acumen and leadership content, in a graduate management course using an engaging total enterprise team-based simulation. Examples will be provided describing the simulation pedagogy and the design options for implementing this course in three different delivery formats.  This course develops skills for two of the university's student learning priorities (information literacy and leadership) by integrating finance, accounting, strategic planning and implementation, marketing segmentation, and operations with teamwork, managing conflict, feedback, influence, alignment, and communication and presentation skills.    The TeamMATE evaluation tool is built into the simulation; it helps evaluate individual team behavior and team performance and lets students diagnose their own behaviors and overall team functionality in real time.  Students receive 360-degree feedback on their individual performance, reflect on their blind spots, and practice giving and receiving feedback. Finally, student teams conduct a three-part after action review (AAR) after each round in which they assess and reflect on what they intended to do, what actually happened, and what they need to do to improve performance the next round. Participants will be able to apply best practices on how to integrate simulation, multi-discipline content, instructional design resources, and various assessment tools in their own settings.

Learning Outcomes:

1. Participants will be able to use a 360-degree feedback tool (Capsim's TeamMATE) to assess individual and team performance in their courses
2. Participants will be able to use After Action Reviews (AARs) in their courses to enable individuals and teams to analyze, reflect on, and improve their performance

Audience: Intermediate

PDF

Using Data Analytics to Drive Continuous Improvement for Academic Quality Su Dong and Rollinda Thomas - Fayetteville State University Pearlstein 101

This session will illustrate how institutions can effectively use data analytics to improve academic quality and other measures related to major components of the institution's mission. The session also highlights the challenges and myths of using data analytics in higher education.  Institutions of higher education are facing various challenges, such as increasing competition, declining government funding, and growing demands for accountability (Daniel, 2015). These challenges require institutional leaders to make informed and timely decisions on a regular basis, with recourse to vast data sources (Daniel, 2015). To unlock the value of data analytics, institutions need to implement processes for data collection, data analysis, and data visualization.  This session highlights one successful framework, the Continuous Improvement Report (CIR), adopted by Fayetteville State University.  The CIR is an innovative tool for rewarding academic departments’ performance on ten metrics related to major components of the institution's mission. The effectiveness of the CIR derives from two essential features: 1) its emphasis on department-specific data that indicate the extent to which each department is contributing to institutional progress on key metrics, and 2) its provision of budgetary rewards for high performance and improvement.

Learning Outcomes:

1. Attendees will be able to recognize the need for using data analytics to assess and improve academic quality
2. Attendees will be able to identify ways of using an innovative data tool (Continuous Improvement Report) to drive continuous improvement for academic quality

Audience: Intermediate

Faculty Assessment Liaisons and the Consultation Model: From Astrophysics to Theology Seth Matthew Fishman and Valentina DeNardis - Villanova University Pearlstein 102

Faculty buy-in related to assessment is difficult to achieve, yet the research literature overwhelmingly supports the notion that faculty-owned assessment is the most successful and sustainable approach to student learning outcomes assessment (e.g., Bresciani et al., 2009; Nilson, 2010; Palomba & Banta, 2015). This session will review the consultative assessment approach I have been utilizing for over three years at Villanova University. The outcomes assessment liaison model is the highlight of this approach. We now have over 45 faculty assessment liaisons, representing undergraduate and graduate programs in our College of Liberal Arts & Sciences. We will be adding several more in the interdisciplinary programs and micro-majors for Fall 2016. Using an active presentation approach throughout, including Q&A, I will candidly discuss our philosophy, structure, successes, and challenges.

Learning Outcomes:

1. Participants will be able to articulate at least one strategy to gain faculty involvement in assessment
2. Participants will be able to identify challenges faced when utilizing a consultation model

Audience: Intermediate

Creating Academic Quality through Planning and Technology Mark Green, Maryann Godshall, Mary Yost and Danielle Devine - Drexel University Gerri C Lebow Hall 109

This presentation reviews the process a work group of faculty and staff used to gather data to evaluate student and program outcomes using multiple-choice examinations.  The presentation will demonstrate how this group chose and mapped key concepts from course-level outcomes to program-level outcomes and applied test items written at designated Bloom's taxonomy levels. Attendees will then see example reports generated from these data that are used to help inform faculty practices and curriculum decisions.

Learning Outcomes:

1. Participants will be able to create a curriculum map
2. Participants will explore a process to trend student learning and performance outcomes

Audience: Intermediate

PDF

Make the Best of Multiple Choice Tests: Improving Question Writing Skills Kirsten Grant - Hunter College Gerri C Lebow Hall 209

The presentation will provide the audience with a hands-on opportunity to utilize a Test Item Checklist, a set of criteria used to create or modify multiple choice questions by focusing on the learning objective being assessed.  For large lecture sections, multiple choice exams are mandatory; therefore, the need for quality assessments utilizing multiple choice questions is evident.  The quality of each question rests in its ability to test the student's mastery of one learning objective at a time.  The complexity of each question is limited by the method itself: assessment of learning above Bloom's synthesis level is difficult. Consequently, during this presentation, emphasis will be placed on applying the criteria to each question while addressing one learning outcome at a time, using language relevant to the course level.  The session is designed to provide the audience with a tool to improve their exam questions to better assess student learning for individual learning outcomes.  The audience will practice using the tool on old exam questions to gain a better understanding of each criterion being addressed.  The intention is that audience members will be able to immediately use the skills obtained to improve their multiple choice questions, starting with their very next assessment.

Learning Outcomes:

1. Attendees will be able to identify test questions that do not meet the Test Item Checklist criteria
2. Attendees will be able to analyze, modify and create multiple choice questions for specificity, clarity, and relevance to individual learning outcomes

Audience: Beginner

Sep 8, 2016
12:30 PM - 1:45 PM

Luncheon & Plenary

PDF

“Developing a Culture of Assessment, Learning, Inquiry, Innovation…What culture am I developing now?” Jane Marie Souza - University of Rochester Behrakis Grand Hall

Jane Marie Souza is an assistant provost for academic administration at the University of Rochester. In her role, Souza serves as the University’s chief assessment officer and manages academic policies in areas that require coordination among schools.
 
In addition, she serves as a liaison with the New York State Education Department and the Middle States Commission on Higher Education. Souza has served on accreditation teams for the Middle States Commission on Higher Education, the New England Association of Schools and Colleges, the Council on Podiatric Medicine, and the American Association of Colleges of Pharmacy, and has offered assessment workshops for higher education institutions and presented at national and international assessment conferences. She has also written for the National Institute for Learning Outcomes Assessment publications and for the journal Assessment Update.
Souza believes that her primary objective is to translate and document assessment and accreditation information for multiple constituencies, including state boards, accrediting agencies, alumni, and, most importantly, Rochester’s campus community.
 
Prior to her role at Rochester, Souza was assistant dean of assessment and chair of the assessment leadership team at St. John Fisher College. She has also served as chief academic officer for the New England Institute of Art and executive director of CONNECT, a six-university consortium in southeastern Massachusetts.
 
Souza received her PhD from the University of Nebraska–Lincoln and a master’s in education from Curry College in Milton, Massachusetts. Her undergraduate degree is from the University of Massachusetts–Boston.

Sep 8, 2016
2:00 PM - 3:00 PM

Concurrent Session 5

Creating and Adopting Institutional Learning Outcomes: 4 Case Studies from the Trenches Kristel Kemmerer - Dutchess Community College, Victoria Ferrara - Mercy College, Heather Maldonado - Buffalo State University PISB 104

Using institutional learning outcomes (ILOs) is well established in the assessment of student learning (Bers, 2008).  The National Institute for Learning Outcomes Assessment (NILOA) views ILOs as a critical marker in assessment process maturity, using the adoption of ILOs as a variable in its ongoing study of assessment (Kuh et al., 2014).   Likewise, the use of institutional learning outcomes is an important means of connecting student affairs assessment to academic assessment (Yousey-Elsener et al., 2015), which in turn helps institutions meet the best practices laid out in most national accreditation processes.  Yet, despite the strong argument for using ILOs, very little attention has been paid to the process through which ILOs can be most effectively adopted and used.  In this panel presentation we will present 4 case studies of ILO adoption, purposely selected because they differ from each other. Finger Lakes Community College will highlight an approach to values-driven ILOs; Dutchess Community College will highlight its approach to developing ILOs in response to Middle States; Mercy College will discuss using General Education outcomes as ILOs; and Buffalo State will share the resistance to establishing ILOs at its institution.

Learning Outcomes:

1. Participants will be able to articulate the importance of ILOs in a robust assessment process
2. Participants will learn about multiple approaches to ILO development across different college contexts

Audience: Intermediate

Snapshot Sessions (5 minute Mini-sessions) PISB 106

Retrofitting Outcomes Assessment to the General Education Curriculum: Lessons Learned at Hofstra University
J. Bret Benington, S. Stavros Valenti and Terri Shapiro, Hofstra University

Inviting Students to Lead the Conversation: Student-Driven Assessment Efforts on Campus
Will Miller, Flagler College

What Can't a Sticky Note Do?! #Curricularmapping
Laura Farrell, Longwood University

Mapping an Entire University's Curriculum to New General Education Goals
Kevin Guidry and Kathleen Langan Pusecker, University of Delaware

Building a Culture of Assessment:  A Nuts and Bolts Approach
Debbie Kell, Deborah E. H. Kell, LLC

A Sustainable Method for Outcomes Assessment Applied to Information Literacy, Quantitative Reasoning, and Oral Communication
S. Stavros Valenti, J. Bret Benington and Terri Shapiro, Hofstra University

Creating a Scholarship of Teaching and Learning Group to Frame an Assessment Culture
Antonis Varelas, Alisa Roost, Jacqueline DiSanto and Nelson Nunez Rodriguez, Hostos Community College, CUNY

Innovations in Internationalizing Curricula
Adam Zahn, Ahaji Schreffler and Harriet Millan, Drexel University

Translating Data into Action: Helping Faculty Use Assessment Data to Make Qualitative Change Anthony Fulton and Margaret Jenkins - Prince George's Community College PISB 108

Closing the assessment loop means using assessment data to make qualitative changes to courses and/or programs. Unfortunately, faculty members and academic leaders are not always sure of how to interpret assessment data in a way that makes qualitative change possible. That is where assessment professionals come in; our job is to help academics, who often have little background in data analysis, make sense of assessment data so that they can make informed decisions about the best ways to revise their courses. This session will walk participants through a process we have used at Prince George's Community College to train faculty members in the effective use of assessment data.  We will divide the participants into groups of three to four and give them an assessment scenario through which they will be asked to negotiate. The scenario will provide the groups with the pertinent details about how the data for a course-level assessment was acquired, along with the data itself. Participants will then be asked to recommend a course of action for revising the inputs that went into generating the assessment data. The goal will be to help them craft specific recommendations that lead to more valid and meaningful assessment data in the future.

Learning Outcomes:

1. Participants will be able to explain the connection between assessment inputs and assessment data to faculty members who are novices in the culture of assessment
2. Attendees will see a model of an effective faculty development workshop for training faculty in the effective use of assessment data

Audience: Intermediate

The Highs and Lows of Writing Assessment: Connecting Outcomes, Rubrics, and Data (Student Work) in Meaningful Ways William FitzGerald and Brynn Kairis, Rutgers University, Camden Pearlstein 101

When the local assessment czar turns to directing the (First Year) Writing Program, there is a burden to do assessment right.  This session addresses efforts to institute best practices in a foundational General Education requirement, with a particular responsibility to put into place modes of assessment that meet, even exceed, expectations for responsive, data-driven assessment. In this session, we present our efforts to bring a writing program, foundational to Gen Ed, into an assessment culture mindful of the visibility of those efforts. The “highs and lows” of our title refer not to good or bad practices or results but to the high-level formulation of program learning goals, goals that are themselves responsive to hierarchical, or top-down, standards, and to what those goals look like at lower levels, on the ground where instruction meets actual student performance. We will discuss the insights raised in the process of revising existing learning goals and converting those goals into measurable outcomes along a developmental spectrum through the use of a program-specific rubric. In the second half of the presentation, we generalize these insights beyond our local FYW program into practical advice for assessing writing across the curriculum.

Learning Outcomes:

1. Attendees will gain a greater appreciation of best practices in assessing writing
2. Attendees will gain an increased awareness of challenges in articulating and measuring learning goals

Audience: Intermediate

Trickle Up Assessment: Using Charrettes to Build an Outcomes-based Assessment Plan Molly Kerby, Stacy S. Wilson and Wren Mills - Western Kentucky University Pearlstein 102

This presentation outlines the implementation of a Quality Enhancement Plan (QEP) at a south central Kentucky public university aimed at teaching students the skills of evidence-gathering, sense-making, and argumentation, or Evidence & Argument. The QEP leadership team established an Evidence & Argument (E&A) Fellows Program. This faculty development initiative was designed, in collaboration with the Center for Faculty Development (CFD), to provide curriculum expansion opportunities using workshops and charrettes. The term charrette refers to a collaborative session in which participants design solutions to problems. Initially, a group of 11 faculty (E&A Fellows) was selected through an application process that included developing an outcomes-based plan to revise or enhance the curriculum in an area of identified need and building a shared understanding and vocabulary in argumentation pedagogy in order to align the curriculum for maximum impact in addressing QEP student learning outcomes. These groups of E&A Fellows will each work interactively over a two-year period to integrate projects into the curriculum and to assess student learning using AAC&U LEAP rubrics. The session will focus particularly on the process of using charrettes to “tune” class assignments and/or curriculum to organically (from the bottom up) build a tiered assessment plan. Those attending the session will be given the opportunity to actively participate in a mock charrette.

Learning Outcomes:

1. Participants will be able to develop an interdisciplinary professional learning community focused on the implementation and assessment of new academic initiatives using assignment charrettes
2. Participants will be able to create a multipronged, "trickle up" approach to meeting accreditation assessment standards

Audience: Intermediate

Moving from Compliance to Improving Student Learning: Reframing Academic Quality Natasha Jankowski and David Marshall - National Institute for Learning Outcomes Assessment (NILOA) Gerri C Lebow Hall 109

What is the ideal relationship between accreditation, assessment, and academic quality? Various surveys have indicated that accreditation is a driver of assessment practices at institutions, leading in part to a compliance-driven mentality that disconnects assessment processes from teaching and learning. Yet the relationship among the three is not clear, and on most campuses it is disconnected, leading faculty and staff to develop a healthy skepticism of assessment efforts and to make little use of assessment results to improve teaching and learning. This session will examine assessment, accreditation, and academic quality by reframing the relationships among the three. The session will draw upon the work of the National Institute for Learning Outcomes Assessment (NILOA), including the NILOA policy statement, Higher Education Quality: Why Documenting Learning Matters, and the NILOA-authored book, Using Evidence of Student Learning to Improve Higher Education. In the first part of the session, the presenters will explore, with audience participation, the current relationship among accreditation, assessment, and academic quality. We will then present alternative framings by exploring what "academic quality" means in relation to learning outcomes, and how accreditation reinforces certain notions of quality assessment as played out in peer-review feedback. Having redefined quality, the session will continue with a discussion of assessment principles that encourage reflective efforts across campuses to focus on student learning. The second part of the session will include examples of how the reframing has played out in practice at various institution types.

Learning Outcomes:

1. Participants will be able to present alternative principles of quality in student learning that can inform campus practices in assessment and accreditation
2. Participants will apply principles to their local contexts, leaving with action plans including next steps

Audience: Intermediate

The Wizards of Assessment: Peel Back the Curtain and Experience the Art and Science of the Assessor Ray Lum and Mark Green - Drexel University Gerri C Lebow Hall 209

During this hands-on session, conference attendees will be invited to join in a light-hearted but rigorous process of creating assessment tools. Whether the subject is complex or simple, evidence-based assessment techniques will be the foundation of the process. Be surprised as we demystify assessment by using the most unassuming subjects. Participants will work collaboratively with one another to develop assessment tools that are reliable and measurable. A panel of distinguished evaluators will determine the efficacy and validity of the tools. Alternatively, conference attendees may observe the quick-witted panel as participants gain insightful feedback and quips regarding their assessment tools. They will witness an array of techniques used. In addition, attendees will identify themes of best practice and tips for improvement. While networking during the session is prize enough for some, the top assessment tools presented will receive additional recognition and, of course, bragging rights for the year.

Learning Outcomes:

1. Participants will gain feedback on their ability to develop assessment tools and apply them in grading situations
2. Participants will be able to network and engage in meaningful dialogue with other conference attendees

Audience: Beginner

Sep 8, 2016
3:15 PM - 4:15 PM

Concurrent Session 6

PDF

How to Design and Implement a Comprehensive Assessment Plan Under Pressure Satyajit Ghosh, Richard Walsh and Nicholas Truncale - University of Scranton PISB 104

In Fall 2014 we developed a comprehensive assessment plan guided by the following principles: 1. Program-level assessment should be the primary focus. 2. The plan must reflect assessment as a continuous process open to improvement. 3. Although necessarily well-structured, continuous assessment should not impose undue burdens on programs but rather augment existing initiatives to facilitate program development. 4. The plan should be consistent with current assessment practices, especially for programs accountable to external accrediting bodies, such as AACSB's assessment requirements for the business programs. The three-year cycle renders assessment a continual process rather than a one-time effort to address the issues raised by an accrediting body. The plan focuses on assessment of program learning outcomes (PLOs), which in turn need to be consistent with the Institutional Learning Outcomes (ILOs). We also created our own electronic annual reporting system; we now utilize an annual Assessment Program Plan (APP) and Assessment Program Report (APR) to ensure consistency. We will provide an example concerning mathematical preparedness among our undergraduate population and how deficiencies contributed to lower performance in Physics/EE. We shall also focus upon the institutional challenges faced by universities with respect to assessment platforms and electronic tools for information management.

Learning Outcomes:

1. Participants will be able to evaluate a comprehensive assessment plan, designed around a three-year cycle, along with a specific example of evidence-based program improvement.
2. Participants will encounter specific methods of evidence collection from within existing institutional structures as well as the benefits and challenges of creating electronic tools.

Audience: Beginner

Aligning Program Review: Academic Quality and the New Middle States Standards Robert Wilson - Cedar Crest College, LaMont Rouse - The College of New Jersey PISB 106

The comprehensive self-study is the most significant regional accreditation event that any institution will undergo, and its results will have a significant impact on the institution in the short and long term. Ensuring that your institution complies with MSCHE's Standards for Accreditation and Requirements of Affiliation in this contentious higher education and accreditation environment is essential. The challenge for most institutions is to create a comprehensive and meaningful internal program review process that also meets the external requirements of MSCHE. This presentation focuses on the essential elements of program review and offers a learning-outcomes framework for what academic quality looks like at a mid-sized regional college (The College of New Jersey) and a private liberal-arts women's college (Cedar Crest College). The presentation offers a proactive framework for building a multipurpose assessment system that ensures academic quality aligned with the Standards for Accreditation.

Learning Outcomes:

1. Participants will be able to identify various program review models within the context of institutional mission, including key indicators of academic quality and student learning
2. Participants will be able to integrate the new MSCHE Standards for Accreditation and Requirements of Affiliation into their institutional program-review process

Audience: Intermediate

PDF

What A Difference Assessment Can Make! Rebecca Haggerty and Daniel Haggerty - The University of Scranton PISB 108

The presenters will share their experience implementing assessment for a unique, specialized honors program. They will identify the challenges of assessing such a program, as well as the surprising benefits realized in initiating various assessment measures, such as greater faculty engagement and meaningful collaboration with academic support and non-academic offices. This presentation will provide suggestions and concrete examples for integrating assessment into similar programs or programs identified as High-Impact Practices (HIPs). The topic may be of interest to audiences charged with assessing interdisciplinary or co-curricular programs to improve academic quality. Consistent with recent research in higher education that emphasizes the value (and necessity) of providing students access to support services on campus, such as tutoring and career services, the assessment data have led to broad and meaningful collaborations with other departments within the university. These have become opportunities for additional improvements to academic quality and student learning. Higher education is complex, and the presenters will share how, through assessment, they have been able to create meaningful partnerships for improved student experiences and success.

Learning Outcomes:

1. Attendees will be able to apply the assessment practices of a specialized honors program to a high-impact practice through the use of informal interviews, personal reflections, field-based assignments, and quantitative measures
2. Attendees will discuss examples of how assessment helped create a model of collaboration between university departments and specialized programs for improvement of student educational experiences

Audience: Intermediate

PDF

Stubborn Numbers: Driving Writing Assessment with Targeted Professional Development Moe Folk, Amy Lynch-Biniek and Doug Scott - Kutztown University Pearlstein 101

How can institutions improve the results of writing assessments that have stagnated? To address that question, the presenters will 1) discuss why improving professional development is key to improving writing assessment results; 2) explain how to identify specific writing outcomes that lend themselves to professional development remediation; 3) describe how to build sustainable and worthwhile professional development efforts between administration and faculty; and 4) help audience members develop targeted professional development plans for their own stubborn writing outcomes. We will offer strategies that are rooted in our university's assessment of first-year composition over the last seven years but are also applicable to a broader range of general education courses where writing across the curriculum is practiced. We will also demonstrate the targeted asynchronous professional development designed by a faculty-administration collaborative team and explain how the content was created and what effects it is having on instruction and assessment.

Learning Outcomes:

1. Participants will be able to identify stubborn learning objectives that can benefit from targeted professional development
2. Participants will be able to develop targeted professional development strategies for their own assessment efforts

Audience: Intermediate

PDF

Integrating Assessment & Faculty Development to Improve Course-Learning Outcomes Achievement Using the Critical Thinking Assessment Test (CAT) Elizabeth Lisic, Tennessee Tech University and Kim Gagne, Keene State College Pearlstein 102

The Critical Thinking Assessment Test (CAT) was developed by faculty from a wide variety of institutions and disciplines, with guidance from colleagues in the cognitive/learning sciences and assessment, and with support from the National Science Foundation (NSF). The instrument engages faculty in scoring students' short-answer essay responses to increase awareness of student weaknesses and stimulate discussion of methods to improve learning. The engagement of faculty in the scoring process is an essential feature of the CAT, strengthening the link between assessment and the improvement of learning. As such, the CAT contributes not only to the assessment of learning but to the development of faculty teaching strategies. This session will focus on research surrounding a new framework designed to assist faculty in developing discipline-specific assessments (CAT Apps) that target critical-thinking skills similar to those measured by the CAT. Our hope is that these activities will support faculty in their desire to develop targeted skills and merge the assessment of discipline-specific content with that of critical thinking. Attendees will learn the skills associated with the framework, how to identify content related to these skills, and how to develop a rubric to evaluate potential student responses.

Learning Outcomes:

1. Participants will understand the role of assessment in student learning and the importance of experience-based training to drive change in course teaching practices
2. Participants will learn about a new framework and associated skill sets that can be utilized to develop content-based assessments that integrate critical thinking skills through engaging in interdisciplinary activities

Audience: Intermediate

PDF

Systematic Curriculum Review: Establishing a Process That's Worth the Time Jennifer Kirwin and Margarita DiVall - Northeastern University Gerri C Lebow Hall 109

We will describe the steps our school used to create and implement a process to systematically review courses in the Doctor of Pharmacy curriculum, as required by accreditation standards for schools and colleges of pharmacy. We identified several guiding needs that shaped the design of our program, including the need for a process that would help the school meet ongoing programmatic assessment needs, facilitate preparation of materials needed for accreditation, and allow for digital archiving of materials. We will describe the environment and goals that influenced creation of the initial process, as well as its policies and procedures. Participants will have the opportunity to engage in a discussion about guiding needs that should be considered when designing a Systematic Curriculum Review (SCR) process for their own institutions. The SCR process evolved over time; the rationale for changes will be explored, and results of a 2015 comprehensive evaluation will be described. The session will conclude with a discussion of barriers to implementation and alternatives or recommendations that might be considered, and participants will have the opportunity to discuss topics of interest during a question-and-answer period at the end of the presentation.

Learning Outcomes:

1. Participants will identify important elements of a systematic process for curricular review designed to aid in routine programmatic assessment efforts
2. Participants will be able to discuss the logistics of implementation and ongoing evaluation of a systematic curricular review process

Audience: Intermediate

Training for Success with Automated Assessment: A Model for Training Faculty in Academia. Kenneth McCurdy, Gannon University Gerri C Lebow Hall 209

Automated assessment of student learning is a challenging initiative that provides colleges and universities opportunities to automate assessment processes; collect evidence and centralize data storage; report results in a formalized and consistent manner; communicate results to others; and foster continuous improvement and support institutional effectiveness. It is imperative that faculty and administrators receive effective training and support to transition successfully to automated assessment. In keeping with the Kotter model of change leadership and the Adlerian constructs of the Crucial Cs, attendees will be presented with a proven training curriculum to foster a climate of change and success in implementing automated assessment of student learning. Reference will be made to using the Blackboard Outcomes Assessment Module. During this interactive workshop, attendees will learn about taking programs and departments from an evaluation of assessment readiness through a structured training curriculum that leads to a fully implemented automated student learning assessment process. Attendees will explore the three-stage training model, identify how the model can be adapted to their institution, and leave with a draft template for implementation.

Learning Outcomes:

1. Attendees will be able to take programs and departments from an evaluation of assessment readiness through a structured training curriculum that leads to a fully implemented automated student learning assessment process.
2. Attendees will be able to explore the three-stage training model, identify how the model can be adapted to their institutions, and leave with a draft template for implementation.

Audience: Beginner

Sep 8, 2016
5:00 PM - 7:00 PM

Reception

The Franklin Institute

Sep 9, 2016
7:30 AM - 8:30 AM

Continental Breakfast 2

PISB Atrium

Sep 9, 2016
8:45 AM - 9:45 AM

Concurrent Session 7

PDF

Triple A: Aligning, Accelerating, Achieving! Using Strengths to Drive Your Strategic & Assessment Plan Forward Carol Thurman, Juana Cunningham and Liz Kazungu - Georgia Institute of Technology PISB 104

On most university campuses, strategic planning and assessment are viewed as mutually exclusive activities. Very rarely do those who are charged with developing strategic and assessment plans come together to produce plans that align with each other, regardless of level within the university. And yet it is critical that institutions of higher learning reaffirm their unique missions and seek out creative processes that will attain institutional goals as effectively as possible (Aloi, 2005). Georgia Tech's Office of Undergraduate Education collaborated with the Office of Strategic Consulting to create a unified strategic and assessment plan that aligned program assessment outcomes and metrics with the vision, mission, goals, and objectives for the division. Additionally, the division used the principles of the Clifton StrengthsFinder to leverage individual strengths to accelerate the implementation of strategic plan goals and objectives. Since research shows that anywhere from 60-90% of strategic plans fail due to the organization's inability to execute its strategies (Kaplan & Norton, 2005, 2008), this collaborative work served to ensure that the division's planning work would not be in vain.

Learning Outcomes:

1. Participants will be able to explain the benefits of aligning strategic planning and assessment processes, and leave with tools that can be used to align these often distinct processes in their own universities
2. Participants will be able to articulate how faculty and staff can use the Clifton StrengthsFinder to support implementation of an organization's strategic and assessment plans

Audience: Intermediate

PDF

At the Mercy of Many Masters: Assessment Planning in a College of Health Professions Jody Bortone, Robin Danzak and Beverly D. Fein - Sacred Heart University PISB 106

This presentation describes the model we created for developing a college-wide assessment plan for a College of Health Professions with "many masters", including accredited and non-accredited, undergraduate and graduate, pre-professional, and professional programs. We offer an analysis of the process and suggestions for teams developing assessment plans in similar contexts. We outline the assessment plan's trajectory: the development of college-wide goals, the alignment of goals with diverse program curricula, the development of rubrics to assess each goal, and the identification of appropriate course artifacts to which the rubrics will be applied. Each component of the assessment plan was designed to meet the expectations of a range of internal demands and external accreditors. We also discuss the strengths and challenges of our approach and offer suggestions for facilitating a collaborative, interprofessional way of developing an assessment process while avoiding pitfalls. Session participants will have the opportunity to work collaboratively to create rubrics that can be applied to their own institution's goals.

Learning Outcomes:

1. Attendees will be able to describe the process of developing an assessment plan that meets the demands of many masters
2. Attendees will be able to create rubrics for assessment of goals specific to their institutions.

Audience: Beginner

PDF

Innovations in Conceptualizing and Assessing Civic Competency and Engagement in Higher Education Javarro Russell - Educational Testing Service (ETS) PISB 108

Most educators agree that one of the goals of higher education is to develop the skills of civic competency and engagement that allow students to participate effectively in a democracy. To determine whether students are developing these skills at higher education institutions, it is first critical to develop a clear definition of what these competencies and skills are. A recently published research report from the Educational Testing Service (ETS), entitled Assessing Civic Competency and Engagement in Higher Education: Research Background, Frameworks, and Directions for Next-Generation Assessment, has done just that. This paper: 1. Presents a comprehensive review of existing frameworks, definitions, and assessments of civic-related constructs; 2. Includes a discussion of challenges related to assessment design and implementation; 3. Synthesizes existing information and proposes an assessment framework to guide the development of a next-generation assessment of civic competency and engagement; and 4. Discusses assessment considerations such as item formats, task types, and accessibility. The framework paper is comprehensive and research-driven, providing useful information about assessing civic competency and engagement in higher education. Using this assessment framework, ETS has been developing a new assessment tool called HEIghten Civic Competency and Engagement. HEIghten (www.ets.org/heighten) is a suite of six computer-based assessments measuring different student learning outcomes that can be used by institutions in conjunction with internal assessments for accreditation and curriculum improvement.

Learning Outcomes:

1. Participants will be able to identify several ways to assess civic competency and engagement
2. Participants will be able to identify components of a new framework to measure civic competency and engagement

Audience: Intermediate

PDF

Get the Assessment Train Moving: Assessment Readiness Strategies to Support Program and/or Institutional Assessment Catherine Datte and Ruth Newberry - Gannon University Pearlstein 101

Assessing an institution's readiness to move forward with program or institutional assessment provides valuable information to support success in any assessment project. Readiness involves acceptance and understanding of the process, acquisition of needed resources, and implementation of strategies. It also involves a thoughtful, realistic project plan driven by a coalition and supported by a “volunteer army” whose members can serve as spokespersons, role models, and leaders moving the effort forward. During this interactive workshop attendees will review best practices, complete a strengths, weaknesses, opportunities, and challenges (SWOCh) analysis, and identify gaps that would prevent creation of a common vision, sense of urgency, and implementation plan. This organized approach will enable attendees to identify and prioritize critical actions associated with best practices in program or institutional assessment while documenting practical, individualized action steps to get the assessment train moving.

Learning Outcomes:

1. Attendees will be able to complete a preliminary readiness assessment to launch program or institutional assessment
2. Attendees will be able to identify strategies and action steps to launch program or university assessment

Audience: Beginner

PDF

Methodologically Rigorous Assessment: Engaging Faculty in Data Collection for Assessment and Publication Laura Maki - St. Olaf College Pearlstein 102

Direct, embedded assessment of student learning outcomes reflects best practices and reduces the burden on students for producing evidence of learning, but it also relies heavily on faculty investment and involvement. Embedded assessment can speak to issues that stakeholders care deeply about, reflecting the American Association for Higher Education's principles of good practice (Banta, Lund, Black, & Oblander, 1996). Moreover, a well-designed and methodologically rigorous embedded assessment plan can also provide a foundation for faculty publications in discipline-specific fields as well as in the scholarship of teaching and learning. Minimizing the burden of assessment while recognizing and rewarding faculty for their investment helps promote a culture of assessment (Suskie, 2004). This session will focus on designing and implementing methodologically rigorous student learning outcomes assessments that meet the standards for educational research. Specifically, the presentation will include examples of research questions, research designs, sampling methods, and data collection procedures that meet criteria for empirical research and that could be included in a manuscript for presentation or publication. The session will also include dialogue with audience members around discipline-specific needs in assessment and research, perceived barriers to transforming assessment into research, and ideas for collaborations across campus to increase faculty involvement in assessment.

Learning Outcomes:

1. Participants will advance their knowledge of assessment and educational research
2. Participants will generate ideas for collaborative assessment research and actions to begin the research process

Audience: Advanced

PDF

Assessing Student Engagement to Improve Academic Quality: Applying Findings from NSSE Jillian Kinzie - Center for Postsecondary Research, National Survey of Student Engagement (NSSE) Gerri C Lebow Hall 109

The National Survey of Student Engagement (NSSE) annually collects information at hundreds of colleges and universities about student engagement in educational experiences and activities that foster learning and personal development. Institutional results provide participating colleges and universities with specific evidence of educational effectiveness, while aggregate project findings point to aspects of the undergraduate experience of broad concern to higher education. NSSE research findings highlight strengths and shortcomings in undergraduate education and focus attention on areas for improvement. Several recent topical findings, related to the extent to which students are challenged to do their best work, students' experiences with effective teaching practice and academic advising, and engagement in high-impact practices (HIPs) including undergraduate research, service-learning, and internships, have garnered significant interest among educators seeking to improve educational quality. In particular, findings about these experiences have worked their way into a variety of quality initiatives including accreditation self-studies, quality improvement projects, and faculty development. For example, results pointing to low levels of challenge in some majors have prompted departments and faculty to evaluate the rigor of assignments in gateway-to-the-major courses, while findings about inequities in participation in HIPs have advanced efforts to expand access for under-represented students.

Learning Outcomes:

1. Attendees will gain familiarity with several current findings about student engagement
2. Attendees will apply student engagement findings to the improvement of academic quality

Audience: Intermediate

PDF

Me, Myself, & I: Self-assessment as a Means to Enhancing Academic Quality Janet McNellis and Lisa D. Belfield - Holy Family University Gerri C Lebow Hall 209

Many faculty believe that the academic quality of a program refers to more than just book learning. These faculty posit that students who go through their academic programs should, in addition to mastering academic content, achieve growth in areas such as critical thinking, drive for life-long learning, ability to collaborate, and other non-academic domains. However, assessment offices often discourage faculty from setting these types of student abilities as formal program outcome objectives because growth in these areas is difficult to measure. In this presentation we will discuss one solution to this issue: the use of student self-assessments to measure non-academic growth. Throughout the presentation we will provide real-life examples from two academic programs, one undergraduate and one graduate, in a School of Education. Our presentation will focus on the following content: how student self-assessments may yield higher outcomes in academic domains; guidelines that faculty should follow to achieve the most benefit from utilizing self-assessments; and how to conduct quantitative and qualitative analyses of self-assessment results. Participants will be actively engaged at the beginning of the presentation by completing a short self-assessment on presentation-related areas.

Learning Outcomes:

1. Participants will gain an understanding of the academic and non-academic benefits of student self-assessment.
2. Participants will be able to create a useful self-assessment instrument for students.

Audience: Beginner

Implementing ExamSoft: Using Technology to Improve Quality in Assessment Caitlyn Goldschmidt - Drexel University PISB 120

At our college, faculty requested tools to measure and improve the quality of student learning. The Assessment Department of Nursing Operations, along with faculty, vetted assessment software and decided to implement ExamSoft for its efficacy in providing insight into the reliability of exams and feedback for improvement. Faculty can use the feedback from the software to improve the quality of exam questions and assessments. Learn how the College of Nursing and Health Professions is using the numerous features of this software, including a collaborative test bank for sharing questions and exams, exam management between faculty and administration, data that are tracked and formed into customizable reports, and computer-based testing. We'll look at how our college implemented the software with our Physician Assistant department and how we continue to use it in our Nursing Program. Learn about some of the challenges we've come across and how we've dealt with them. Questions will be answered about the various ways this software has benefited our college, the challenges we've faced, and how it can be implemented into any program of study to improve the measurement of student learning.

Learning Outcomes:

1. Participants will learn what ExamSoft is, ideas for implementation, and how they can use specific features of the software to improve the quality of their exams.
2. Participants will be able to gauge student learning through direct assessment with ExamSoft.

Audience: Beginner

Sep 9, 2016
10:00 AM - 11:00 AM

Concurrent Session 8

PDF

Faculty at the Wheel: Assessment Education and the Map toward Data-driven Decisions Emily Zank, Jim Eck and Brittany Hunt - Louisburg College PISB 104

Louisburg College recently completed its reaffirmation process with no recommendations, due in large part to its strategic planning process, Department of Education Title III grant, and faculty professional development. Student success cannot be improved significantly at the course, program, or even degree level if faculty do not embrace assessment. However, when faculty are faced with students who are increasingly underprepared for college-level work, the burden of collecting and assessing data may become a low priority, especially when faculty work on a small campus and are very much focused on the needs of individual students. Our presentation will share assessment processes, resources, and tools to increase faculty buy-in, empower faculty to collect and analyze data for decision-making, and educate them on demonstrating use of results for continuous improvement. As a result, faculty will be better equipped to improve the rate at which students meet learning outcomes without compromising the ever-important faculty-student interactions that are the foundation of a small college. To frame our presentation, we will briefly explain our strategic planning process and subsequent U.S. Department of Education Title III Strengthening Institutions grant, which resulted in the College's ability to better demonstrate capacity for sustained improvement.

Learning Outcomes:

1. Participants will be able to formulate steps and professional development opportunities that will foster a faculty culture of data-driven decisions across their campuses
2. Participants will be able to select the appropriate mixture of locally developed and national measures in order to develop an assessment calendar to meet their campuses' needs and budgets

Audience: Beginner

PDF

Strategies and Tools for Engaging in a Middle States Self-Study Using the Revised Standards Karen Rose - Widener University PISB 106

Implementing a self-study design for re-accreditation using the revised Middle States standards necessitates critical reflection, extensive collaboration, and transparency in communication. Learn about strategies and tools developed and used by one workgroup at a Collaborative Implementation Pilot (CIP) institution to systematically analyze evidence and report findings for the educational effectiveness standard (Standard 5). Participate in pair-share and large group conversation about these resources and future self-study preparations.

Learning Outcomes:

1. Participants will be able to describe one approach to review, analyze, and reflect on accreditation self-study evidence, especially with respect to expectations for educational effectiveness
2. Participants will be able to develop practical tools that may be adapted to their institution's self-study to guide the process for analyzing and summarizing findings based on the evidence

Audience: Intermediate

PDF

Quantitative Assessment for Qualitative Practices: Creating Effective Rubrics and Assessment Practices for Studio Based Courses Dana Scott - Philadelphia University PISB 108

All assessment begins with outcomes. The difficulty is in effectively measuring these outcomes for creative, studio-based courses. The audience will examine a performance task assessment rubric for aesthetic and creative practices that is both norm-referenced and criterion-referenced. The rubric was designed to apply to objectives and competencies, not to specific aspects of an assignment, and has been used successfully across an array of studio projects and disciplines. It was designed to promote balance among critical thinking and problem solving, creativity, and craftsmanship, encouraging students to take risks and push boundaries. This has allowed students not only to get a clear idea of their competency in completing a project, but also to see how that competency relates to their grade. The rubric was also used for self- and peer evaluation by the students, enabling comparisons using the same descriptive criteria. Consistent use of a uniform language for the competencies promoted a greater understanding of the grading system and a better self-awareness of growth as a student. This research produced a series of examples that give direction and insight to those wishing to use competency-based rubrics for creative practices. This information was presented in a poster session at the 14th Annual Faculty Conference on Teaching Excellence at Temple University.

Learning Outcomes:

1. Attendees will recognize a process for creating quantitative performance task assessment for aesthetic and creative practices
2. Attendees will be able to align course objectives with rubric criteria

Audience: Beginner

PDF

From Visual Literacy to Literary Proficiency: An Instructional and Assessment Model Using Graphic Novels Lynn Kutch and Julia Ludewig - Kutztown University Pearlstein 101

This session, part theory and part hands-on, will demonstrate how college language instructors can effectively implement the visual and multi-modal methods often used in beginner and intermediate courses as building blocks for developing skills of literary analysis. The presentation introduces aspects of a mini-curriculum based on the graphic-novel adaptation of Kafka's Die Verwandlung (The Metamorphosis), and it also outlines a detailed assessment model that aligns with the AAC&U's Reading VALUE Rubric. Corresponding to the format of a graphic novel, which combines pictorial and verbal information, items in the featured curriculum and assessment consistently incorporate methods that build visual literacy to move students toward verbal literacy. The presentation contributes to the scholarly fields of instruction and assessment in higher-education language pedagogy. Field-tested examples demonstrate ways students learn to use illustrations and literary products to support higher-level analysis. The presentation will show how carefully crafted questions can emphasize the visual exploration that has typically gone into reading graphic texts but has less frequently been associated with building skills of literary exploration. I recently published a description of this curriculum and assessment in Die Unterrichtspraxis/Teaching German in Spring 2014.

Learning Outcomes:

1. Attendees will be able to implement graphic texts as tools for building literary proficiency.
2. Attendees will be able to apply concepts from AAC&U's Reading VALUE Rubric to assessing reading and literary proficiency with graphic texts.

Audience: Intermediate

Faculty Assessment Fellows: A Model for Building Capacity, Advancing Goals and Sustaining Success Beth Roth, Scott Davidson and Kathy McCord - Alvernia University Pearlstein 102

For three summers, Alvernia University's faculty Assessment Fellows gathered to collaboratively learn about assessment, run calibration exercises, score artifacts, add to a centralized database, analyze results, refine rubrics, produce written reports, and present findings to all faculty at an August workshop. This model has proven overwhelmingly successful, both in the quality of the work and in the positive experience communicated. The shared accomplishment emanating from this endeavor has enriched faculty and administration in ways that go well beyond service. Faculty have leveraged this project to enhance their teaching and develop scholarship that expands knowledge of assessment and institutional research. The Assessment Fellows serves as a sustainable model for any campus striving to build capacity, with results that achieve short-term and long-term gains. The three presenters will provide an overview and unique perspectives on the Assessment Fellows model, beginning with the environment that led to its germination and implementation. We anticipate there will be hypotheses proven as well as surprises revealed during the facilitated discussion.

Learning Outcomes:

1. Participants will learn about a faculty Assessment Fellows model that can be adopted at any educational institution to build capacity and sustain momentum for assessment.
2. Participants will receive a resource packet of materials with information about compensation, job description, timeline, readings, activities and reports for implementation of the Assessment Fellows at any educational institution.

Audience: Intermediate

Promoting Academic Quality through Development of Meaningful Rubrics for First-Year Courses Elizabeth Jones and Dianna Sand - Holy Family University Gerri C Lebow Hall 109

In this session, presenters will discuss (1) the processes used to develop several common rubrics for multiple sections of a first-year college-success course; (2) the piloting of several rubrics and how this information was used to inform the development of the final rubrics applied to required student assignments; and (3) the processes used to obtain faculty buy-in. Presenters will share sample rubrics and sample assignments that all students across different sections of a first-year college-success course were required to complete. The rubrics assess essential skills, including critical thinking and communication, that are transferable across different courses. Participants will critique these rubrics to determine how they might apply to their own courses. Discussion will make up approximately 30 minutes of the presentation, with approximately 10 minutes anticipated for questions and answers. Participants will take part in round-table discussions and interactive learning for approximately 20 minutes of the presentation.

Learning Outcomes:

1. Participants will be able to apply a common rubric to different assignments.
2. Participants will develop a plan for using a rubric in their own teaching and will explore how they might collaborate with their colleagues to use a common rubric.

Audience: Intermediate

PDF

Critical Thinking? It's not what you Think! Janet Thiel - Georgian Court University Gerri C Lebow Hall 209

This session will examine the academic quality of various intellectual skills currently classified as critical thinking. Participants will consider the various nuances of critical thinking and its assessment. The definition of critical thinking will be teased out as problem-solving, reflective, self-aware, metacognitive, creative, and critique thinking. Appropriate teaching methods and ways to assess the above intellectual skills will be presented. Participants will consider how critical thinking is defined and assessed on their own campus and within its various programs, both with learning inside and outside the classroom.

Learning Outcomes:

1. Participants will be able to analyze conceptions of critical thinking beyond the testing parameters of inferential reading ability.
2. Participants will be able to review appropriate assessment of various intellectual skills classified as critical thinking.

Audience: Intermediate

Sep 9, 2016
11:15 AM - 12:00 PM

Closing Remarks

Closing Remarks & Raffle PISB 120

Come join us as we close the conference and hold a raffle drawing for free registrations for the 2017 conference.
Sep 12, 2018
9:00 AM - 12:00 PM

Pre-Conference Workshops

Implementing Curriculum Review: From Designing the Process to Using the Findings Jane Marie Souza, University of Rochester Pearlstein 302

Periodic curriculum review is essential to maintaining a quality educational program.  While faculty and administrators may clearly agree with that statement, implementation of the review process may be much less evident. Questions abound: How do we schedule the review?  How long should it take? How are duties assigned? How do we manage the process? What should we look at when reviewing individual courses? What evidence do we use to support our conclusions?  And perhaps most importantly: How do we plan for use of our findings? The answers to these and other common questions will be explored in this pre-conference workshop. This workshop will present a strategy for establishing a Curriculum Review timeline and distributing the workload. Then a review process will be outlined employing a series of questions that can be researched through an established evidence bank. It will be demonstrated how questions posed for the review process can be aligned with targeted goals and specific sources of evidence. Finally, a plan will be suggested for the important step of following through on resulting recommendations. Participants in the workshop will be provided handouts including a set of possible research questions, a sample evidence bank, and tools to align course-level assessments. They will then be tasked with using the tools to outline a process to fit their unique educational settings.

Outcomes:

At the conclusion of this workshop participants will be able to:

  • Outline a plan and timeline for a curriculum review process.
  • Draft research questions to guide an effective curriculum review.
  • Identify appropriate sources of evidence to address research questions.
  • Outline a process to follow-through on review findings.

Developing Direct and Indirect Measures of Student Learning Jodi Levine-Laufgraben, Temple University Pearlstein 303

This pre-conference workshop will focus on strategies for selecting the right assessment approach with which to measure student learning outcomes. We will discuss how to design and implement direct measures of student learning, and how best to use indirect measures of student learning to complement your direct assessment efforts.

Outcomes

At the conclusion of this workshop participants will be able to:

  • Identify direct and indirect measures of student learning outcomes
  • Select assessment strategies that best align with their learning outcomes
  • Design a direct measure of student learning

Winning Arts and Minds: Assessing the Creative Disciplines Krishna Dunston, Community College of Philadelphia Pearlstein 307

Assessment advocates and leaders in the creative disciplines often find themselves squeezed between the right and left brains of the college campus. We comprehend the urgent need to demonstrate student competency, but find that what fits most easily into a spreadsheet has little or nothing to do with creative success. It is easy to get stuck collecting meaningless data that neither improve student outcomes nor allow for informed program improvement. This can prove frustrating to institutions that need arts programs to “close the loop.” A well-constructed plan can lift the blinders from both sides and reveal the way arts pedagogies provide some of the most important skills of a 21st-century education: collaboration, self-assessment, innovation, discipline, and adaptability.

This workshop is for those looking to revitalize their own course or program assessment plan; preparing to build new creative programs; seeking inspiration as an assessment facilitator; or wanting to learn more about authentic assessment. Participants will experiment with a variety of tools and discuss their use in an assessment structure that balances the evaluation of artistic product with an examination of creative process. The presenter will share how unconventional assessment metaphors (the elementary school science fair, NASA vs. Google, Venn diagrams, and even reality cooking shows) have proved useful models for opening dialogues, breaking the cycle of useless reporting, and encouraging the creation of meaningful assessment processes.

Outcomes

At the conclusion of this workshop participants will be able to:

  • Identify and build from existing pedagogies;
  • Emphasize process in authentic assessments;
  • Discuss the balance of process, product and reflection; and
  • Investigate new models and metaphors for program mapping.

Assessment Toolbox: Supercharge the Direct Assessment of Student Services Michael Sachs, East Stroudsburg University Pearlstein 101

The Middle States Commission on Higher Education’s publication Student Learning Assessment: Options and Resources, Second Edition states that “the characteristics of good evidence of student learning include considerations of direct and indirect methods for gathering evidence of student learning.” Creating direct student learning assessment tools within student support services can be challenging for student service professionals. Many student service programs rely only on indirect assessment techniques such as focus groups, evaluations, satisfaction surveys, and NSSE results.

This workshop will explore the many direct student learning assessment tools available to Offices of Student Affairs and other service offices on campus. These techniques and tools are both qualitative and quantitative in intention and design. This workshop will also enable participants to develop program goals, rubrics, and direct student learning outcomes for their student service areas – linked, of course, to their college’s mission and/or strategic plan. Participants should bring copies of their institutional strategic goals and mission.

Outcomes

At the conclusion of this workshop participants will be able to:

  • Explain the importance of direct assessment for planning, resource allocation and student learning.
  • Recognize and understand the differences between direct and indirect assessment in student services.
  • Create and use rubrics for student learning outcomes.
  • Create direct assessment of Student Learning Outcomes for their individual areas / programs that can be incorporated into assessment plans.

How I Learned to Stop Worrying and Love Accreditation: Working with the New MSCHE Standards Sean McKitrick, PhD, Vice President, Middle States Commission on Higher Education Pearlstein 308

In accordance with 34 CFR 602.21, Review of Standards, the Commission conducts a regular review of its accreditation standards. During spring 2013 the Commission began its latest comprehensive review of the standards. These efforts were led by a Steering Committee representing MSCHE member institutions, the MSCHE staff, and the general public. The Steering Committee followed a set of Guiding Principles. These four Guiding Principles were developed by the Commission to reflect the areas identified as most important to the membership of the Commission: Mission-Centric Quality Assurance, the Student Learning Experience, Continuous Improvement, and Supporting Innovation.

The Commission approved a plan to implement the revised standards through a unique Collaborative Implementation Project. The project involves a cohort of 15 institutions that are scheduled to submit their self-studies and host evaluation teams during the 2016-2017 academic year. Throughout the next two years these 15 institutions will undergo a “high touch” experience in which they will speak frequently with members of the Commission staff and with each other, as they engage in self-study. They will also play an active role in preparing other institutions to use the revised standards. All institutions hosting an evaluation team visit in the 2017-2018 academic year and beyond will engage in self-studies guided by the revised standards.

Outcomes

At the conclusion of this workshop participants will be able to:

  • Discuss and explain the new MSCHE standards
  • Demonstrate how the new standards focus on the student learning experience

From A - Z: An Assessment Toolkit Joanna Campbell, Professor, Bergen Community College; Maureen Ellis-Davis, Associate Professor, Bergen Community College; Gail Fernandez, Associate Professor, Lead Assessment Fellow, Center for Institutional Effectiveness, Bergen Community College Pearlstein 102

You are charged with developing a robust and formal assessment program at your institution.

How do you get started? What are the necessary components of a successful program? Who is in charge of the process? Who are the stakeholders? What value does the institution place on assessment, as demonstrated by institutional resources and commitment?

In this workshop, the assessment team at Bergen Community College will (1) help you identify key components that lay the foundation for an effective assessment program, and (2) share “add-ons” that may help sustain and nurture the assessment program at your institution.

Session attendees will have an opportunity to begin building their assessment platforms and will receive an assessment toolkit to bring back to their institutions.

Outcomes

  • Participants will be able to describe the components of an effective and sustainable assessment program.
  • Participants will identify the assessment tools to use at their institutions.
  • Participants will begin the process of building an assessment platform that meets the needs of their institution.
  • Participants will be able to explain how institutional commitment translates into necessary resources.
Sep 12, 2018
1:00 PM - 2:00 PM

Welcome and Opening Plenary

Welcome Brian Blake, Provost, Drexel University Mandell 424

The expectations placed on higher education to foster and document students’ active and deep learning have never been higher. We live in a time of economic uncertainty, global interdependence, and urgent challenges. If our students are to be equipped with the skills to succeed in such a future, we must reject any claims of quality learning that do not include as their focus students’ active learning and understanding and our ability to assess such claims.
At Drexel, our assessment activities are based on institutional values that aim to produce relevant and functional data for aligning curricular design, course content, and pedagogical approaches with Drexel’s mission and values. In all assessment activities, the faculty and staff endeavor to take full consideration of the different educational and cultural backgrounds of our increasingly diverse student population. The primary objective of our assessment program is to establish a practice of action research that informs planning and results in tangible improvements for our students.
In attending the Annual Conference on Teaching & Learning Assessment, you will enjoy three days of thought-provoking speakers, workshops, and invaluable networking on Drexel's beautiful campus, just minutes from the heart of historic Philadelphia and the birthplace of our nation. Come join us as we work together to ensure that all students have continuous opportunities to apply their learning to the significant, real-world challenges which, no doubt, lie ahead for them.
PDF

Opening Plenary: Relying on Assessment to Chip Away at Academic Injustices in the Classroom Todd Zakrajsek, University of North Carolina Mandell 424

Todd Zakrajsek, an Associate Research Professor and Associate Director of Fellowship Programs in the Department of Family Medicine at the University of North Carolina, will open the conference with spirited insight into how research has allowed us to change our teaching to help students be successful, how teaching myths can be particularly detrimental to some students, and how concepts like growth mindset can turn things around for students who have had fewer opportunities while growing up. This topic is a natural extension of his current body of academic work and publications, which focus on faculty development, effective instructional strategies, and student learning. In addition to his work at UNC, Todd serves on several boards, among them: the Journal of Excellence in College Teaching; International Journal for the Scholarship of Teaching and Learning; Higher Education Teaching Learning Portal; Technology Enriched Instruction (Microsoft); and Communicating Science in K-12 (Harvard). Todd is also currently serving terms as an elected steering committee member for both the Professional and Organizational Development (POD) Network and the National Academies Collaborative.
Sep 12, 2018
2:00 PM - 2:15 PM

Break 1

Sep 12, 2018
3:15 PM - 3:30 PM

Break 2

Sep 12, 2018
4:45 PM - 5:30 PM

Ice Cream Social

Networking Event PISB Atrium

Come join your colleagues for ice cream and conversation during Network 101 Hour in the Papadakis [PISB] Atrium, sponsored by AEFIS. “Ice cream is constant proof that others want us to be loved and be happy.” – Benjamin Franklin
Sep 13, 2018
8:45 AM - 9:45 AM

Morning Plenary

“A Revolution in Higher Education: Tales from Unlikely Allies” Richard DeMillo, Georgia Institute of Technology Mandell 424

Richard DeMillo is the Charlotte B. and Roger C. Warren Chair of Computer Science and Professor of Management at Georgia Tech. He founded and directs the Center for 21st Century Universities, Georgia Tech’s living laboratory for fundamental change in higher education. He is responsible for educational innovation at Georgia Tech and is a national leader and spokesman in the online revolution in higher education. Under his leadership, Georgia Tech has developed a pipeline of 50 Massive Open Online Courses (MOOCs) that together enroll a million learners. Georgia Tech’s innovation projects include new research in blended learning and a groundbreaking MOOC-based Master’s degree in computer science that offers a Georgia Tech degree for under $7,000. He was named a Lumina Foundation Fellow in recognition of his work in higher education.
 
He was previously the John P. Imlay Dean of Computing at Georgia Tech where he led the design and implementation of the Threads program, which has helped transform undergraduate engineering education in the US and around the world. His influential 2011 book “Abelard to Apple: The Fate of American Colleges and Universities,” which helped spark the national discussion of the future of higher education, was inspired by this experience.
 
He was Hewlett-Packard’s first Chief Technology Officer, with worldwide responsibility for technology. He led HP through technology revolutions in supercomputing, printing, open source software, information security, and nanotechnology. Prior to joining HP, he was in charge of research at Bellcore, where he oversaw the development of many Internet and web-based innovations. He has also directed the Computer and Computation Research Division of the National Science Foundation. During his twenty-year academic career, he held academic positions at Purdue University, the University of Wisconsin, and the University of Padua (Italy).
 
The author of over 100 articles, books, and patents, Rich has pursued research spanning computer science, including fundamental innovations in computer security, software engineering, and mathematics. He is a Fellow of both the American Association for the Advancement of Science and the Association for Computing Machinery. “Abelard to Apple” was published by MIT Press in 2011; a sequel, “Revolution in Higher Education: How a Small Band of Innovators Will Make College Accessible and Affordable,” followed from MIT Press in 2015.
Sep 13, 2018
11:00 AM - 11:15 AM

Break 3

Sep 13, 2018
3:00 PM - 3:15 PM

Break 4

Sep 14, 2018
9:45 AM - 10:00 AM

Break 5