
Sep 12, 2018
9:00 AM - 12:00 PM

Pre-Conference Workshops

Visualizing Assessment Data through Interactive Dashboards Frederick Burrack: Director of Assessment, Kansas State University Chris Urban: Assistant Director of Assessment, Kansas State University Pearlstein 101

This pre-conference workshop will walk users through transforming assessment scores into interactive dashboards that enable faculty, staff, and administrators to engage more deeply with direct assessment results. Data preparation, model creation, and report design will be shown using provided sample data files. To fully engage with the session, participants will need a Windows laptop with the free Power BI Desktop client installed. Additionally, the workshop provides a framework to automate and deepen data analysis, reducing the burden placed on faculty, program coordinators, and assessment staff. The workshop will also utilize a tool that allows easy report customization at the program level, allowing assessment staff to automate reports while still tailoring analyses to the needs of individual programs. Finally, the workshop will demonstrate how to link indirect and direct assessment data to get a richer view of student learning.
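As a rough illustration of the data-preparation step, the sketch below joins direct (rubric) and indirect (survey) evidence into one tidy table of the kind a dashboard tool such as Power BI can consume. It uses Python/pandas rather than Power BI's own tooling, and the column names and values are illustrative assumptions, not the workshop's actual sample files.

```python
import pandas as pd

# Illustrative stand-ins for the workshop's sample files: direct
# rubric scores and indirect survey results, keyed by learning outcome.
direct = pd.DataFrame({
    "outcome":     ["SLO1", "SLO1", "SLO2", "SLO2"],
    "program":     ["BIO",  "CHEM", "BIO",  "CHEM"],
    "rubric_mean": [3.2,    2.9,    3.5,    3.0],
})
indirect = pd.DataFrame({
    "outcome": ["SLO1", "SLO2"],
    "pct_students_confident": [0.78, 0.64],
})

# One tidy, merged table ready to load into a dashboarding tool.
merged = direct.merge(indirect, on="outcome", how="left")
print(merged)
```

The same join can of course be done in Power Query inside Power BI Desktop; the point is simply that direct and indirect evidence keyed by a shared outcome ID merge into a single model-ready table.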

At the end of this workshop, participants will be able to…

  • Create interactive reports and dashboards from direct and indirect assessment data.
  • Learn several techniques for preparing assessment-related data for visualization.
  • Understand how data tools can be used to automate data extraction, transformation, analysis and reporting.

Accreditation and Accountability for Education Programs: Selecting and Using the Most Compelling Evidence Brigitte Valesey, Ph.D.: Director of Assessment and Accreditation, School of Education, Drexel University Sarah Ulrich, Ed.D.: Associate Dean for Teacher Education and Undergraduate Affairs, and Clinical Professor of Teacher Education, Drexel University Pearlstein 102

Education programs preparing graduates for PK-12 schools face rigorous expectations for external accountability and compliance, especially at the federal and state levels. Likewise, professional accreditation for educator preparation providers (EPPs) has changed dramatically with the evolution of education accrediting organizations from the former NCATE and TEAC to the Council for the Accreditation of Educator Preparation (CAEP). This pre-conference workshop is intended for educator preparation leaders, for whom the recruitment and preparation of high-quality PK-12 teachers, specialists, and administrators is an imperative. While each EPP prepares PK-12 educators according to its unique mission and vision, these providers share a commitment to rigorous and impactful educational experiences that produce ground-ready PK-12 teachers and leaders for the diverse communities in our region. This session seeks to bring participants together with education program providers, state education department leaders, and members of the CAEP national accreditation organization to discuss and strategize evidence-centric practices and design approaches related to candidate success and continuous program improvement.

    At the conclusion of this workshop, participants will be able to…

  • Apply strategies to identify strong evidence for continuous improvement in education candidate success and program effectiveness.
  • Explore ways to leverage learning evidence and program review findings for considering and pursuing accreditation self-study.
  • Explore ways to collaborate with state and local LEAs to identify evidence-based strategies to enhance candidate/completer success and program effectiveness

Overcoming Curricular Fragmentation: An Experiential Learning Approach to Curriculum Mapping Jennifer Harrison: Associate Director of Assessment, University of Maryland Baltimore County Vickie Williams: Director of Student Services, University of Maryland Baltimore County Pearlstein 302

This pre-conference workshop will focus on curriculum mapping that helps build collaboration, continuity, and connection across students’ learning opportunities, fostering higher-level integrative learning and a more cohesive learning experience. However, programs at many institutions struggle to form shared visions of how their curricula work. How can we collaborate to build programs that scaffold deeper student learning, measure and improve that learning, and align learning opportunities? Collaborative curriculum mapping is one solution. In this gamified workshop, participants will gain experiential learning in curricular and co-curricular mapping: aligning vertically and horizontally to build common ground, optimizing direct measures, triangulating direct and indirect evidence, closing the loop, using double-loop analysis, and creating continuity mechanisms. The presenters have created a gamified approach to curriculum mapping, so participants will experience a collaborative process that can help them internalize the key benefits and challenges involved in mapping a curriculum. Our learning activity is scaffolded by mini-presentations and interactive clicker-like quizzes that empower participants to apply the threshold concepts to an authentic program example. By participating in a kinesthetic, hands-on mapping challenge, participants will expand their cognitive, affective, and psychomotor competencies related to curriculum design. In particular, they will identify at least one closing-the-loop intervention to implement.

    At the conclusion of this workshop, participants will be able to…

  • Analyze curriculum mapping and vertical and horizontal outcome alignment.
  • Design a curriculum map for a sample program.
  • Apply double-loop analysis to propose curricular improvements

Learn and Practice Dynamic Individual Mindfulness Techniques to Increase Awareness of Faculty Engagement Ken Mawritz, Ph.D.: Professor, Drexel University Pearlstein 303

Workshop participants engage in individual mindfulness and in practices that increase awareness of the “social body.” By heightening their sense of interpersonal connection, participants are better able to access creativity, facilitate groups, and provide effective and compassionate leadership. The use of non-verbal techniques to inquire into “stuck” situations in one’s professional and organizational life leads to surprising insights and possibilities. Social Presencing Theater (SPT) heightens sensitivity to current experience, brings attention to shifts in social justice or a contextual field, and is a method for prototyping seeds of the future. Arawana Hayashi, the designer of SPT, leads the creation of SPT for the Presencing Institute (Scharmer, 2016). Leading by Convening (Cashman et al., 2014) is a process by which educational leaders pursue authentic engagement. This authentic engagement has three prongs: (a) ensuring relevant participation from all stakeholder groups, (b) coalescing around issues, and (c) collaborating to solve problems (Cashman et al., 2014). The workshop’s teaching implications include the development of a person’s knowledge base regarding facilitation techniques, an “outright” application of lessons learned, and modeling effective teaching and learning interaction.

    At the conclusion of this workshop, participants will be able to…

  • Interact and collaborate within a team while utilizing empathic listening
  • Understand several key leadership principles
  • Utilize a Social Presencing Theater technique(s)

Powerpoint Presentation

    Nuts and Bolts of Leading an Accreditation Self-Study Janet Thiel: Director of Assessment, Georgian Court University Pearlstein 307

While MSCHE offers support for the accreditation process, that support concentrates on understanding the Requirements of Affiliation and Standards for Accreditation, as well as the approved process for completing a self-study. What is not covered are the internal processes a university needs to consider as it begins and executes its self-study. These processes include how to engage senior leadership in supportive roles and decisions, how to manage meetings and data, the best fit for steering committees and subcommittees, and how to engage the university community. This workshop will allow participants anticipating a self-study to think through the necessary internal processes as applicable to their institution.

    At the conclusion of this workshop, participants will be able to…

  • Identify internal processes already in place that can facilitate a self-study and ensure its relevance “after the visit”.
  • Develop a network of peers for professional support during a self-study process.
  • Analyze the internal assets (personnel, technology, leadership) and capacity useful for a successful self-study at the institution.

Moving to a Learning Systems Paradigm Natasha Jankowski: Director, National Institute for Learning Outcomes Assessment (NILOA) and Research Assistant Professor, University of Illinois Pearlstein 308

Participants in this workshop will collaboratively explore the Learning Systems paradigm, derived from research with over 800 institutions working on large-scale, institution-wide change initiatives. Those that moved forward successfully with sustainable change efforts went through a paradigm shift more extensive than Barr and Tagg's shift from an instructional paradigm to a learning paradigm: it involved developing a collective understanding with constituents throughout the institution. The four elements of the paradigm are consensus-based, aligned, learner-centered, and communicated. We will discuss the paradigm elements, including examples of them in action and means to begin the conversation through curriculum mapping approaches, assignment design efforts, and spaces for collaborative dialogue. The impacts of the learning systems paradigm will be presented, and participants will learn about various resources available to assist in the effort. Finally, participants will apply their learning to develop plans to implement steps toward a learning systems approach within their institution or program.

    At the conclusion of this workshop, participants will be able to…

  • Access various resources that support the development of a learning system.
  • Explain the four elements of the learning systems paradigm and how it differs from existing approaches to teaching and learning.
  • Develop a plan to implement a learning systems approach within their institution or program.

Sep 12, 2018
9:00 AM - 4:45 PM

AALHE All-Day Assessment Institute

AALHE Assessment Institute (Full Day) Jane Marie Souza, PhD: Assistant Provost for Academic Administration, University of Rochester Catherine M. Wehlburg: Associate Provost for Institutional Effectiveness, Texas Christian University Skyview Room

This full-day learning opportunity will be offered to an inaugural cohort of 40 participants maximum. Dr. Jane Marie Souza and Dr. Catherine Wehlburg, both members of the Board of AALHE, will lead this workshop-style institute. These facilitators will bring a mix of theory and practice along with an engaging and participatory mix of information, practice, feedback, and skill-building. Participants will leave this institute with a solid foundation in the assessment of student learning, multiple resources, and a network of colleagues from across the country. Using their experiences at the course, program, institution, and national levels, the facilitators will foster lively conversations about what has worked, what hasn’t worked, and how higher education can best focus on improving and enhancing the quality of student learning at our institutions. An important aspect of this Institute is the cohort-based approach. Participants will spend the day learning together, lunching together, and attending a plenary session together. By creating a network, participants will have access to each other, the facilitators, and many other resources long after the end of the program. This Institute will provide a framework for ways to better understand how to use information and data to inform decision making. The facilitators will work to use examples from many different types of institutions and will encourage dialogue among all participants in order to model good practices for determining how, when, and why to use assessment.

    Sep 12, 2018
    1:00 PM - 2:00 PM

    Welcome and Opening Plenary

    Welcome Remarks M. Brian Blake: Provost, Drexel University Mandell Theater

Opening Plenary: Why Assessment Coordinators Look Grumpy So Often: My Contributions as a Faculty Member Todd Zakrajsek: University of North Carolina Mandell Theater

There are just a few areas in which we see a good deal of consensus in higher education. One area with fairly high consensus is assessment, and it isn't in a positive direction. As a campus-wide assessment coordinator, I had more than one person duck into a hallway to avoid me, particularly whenever accreditation reports were being drafted. Actually, it is not just faculty members, and it is not only higher education. Workers dislike performance reviews, and students hold their breath a bit when receiving feedback on papers and grades on midterms. Even formative assessment, which is specifically designed to assist the learner, is often accepted with hesitancy. In this session, we will look at a few of the factors that make assessment, evaluation, and feedback difficult for assessment coordinators, and consider a few suggestions on how to make the process more positive for everyone involved.

    Biography:

    Dr. Todd Zakrajsek is an Associate Professor and Associate Director of the Faculty Development Fellowship in the UNC School of Medicine. Dr. Zakrajsek has founded or reconfigured, and then directed, centers for teaching and learning at UNC-Chapel Hill, Central Michigan University, and Southern Oregon University (where he also taught as a tenured Associate Professor of Psychology). Dr. Zakrajsek has served on many educational boards, including The Journal of Excellence in College Teaching; International Journal for the Scholarship of Teaching and Learning; Technology Enriched Instruction (Microsoft); Communicating Science in K-12 (Harvard); and Education Research Initiative (Lenovo). Currently, he directs 5 national Lilly Conferences on Evidence-Based Teaching and Learning.

Todd’s recently co-authored books include Dynamic Lecturing: Research-based Strategies to Enhance Lecture Effectiveness (2017); Teaching for Learning: 101 Intentionally Designed Education Activities to Put Students on the Path to Success (2015); and The New Science of Learning: How to Learn in Harmony with Your Brain (2013). He has been a visiting professor and delivered keynote addresses at approximately 300 campuses and teaching conferences in 46 states, 11 countries, and 4 continents.

    Sep 12, 2018
    2:00 PM - 2:15 PM

    Break 1

    Sep 12, 2018
    2:15 PM - 3:15 PM

    Concurrent Session 1

    Powerpoint Presentation (PDF)

    Identification of Bottlenecks: Useful Assessment Tool for the Combined Purposes of Accountability and Improvement. Phyllis Blumberg: University of the Sciences PISB 104

The two main purposes of assessment are accountability (demonstrating what is happening) and improvement (progression toward the ideal). While accreditation accountability often takes priority over improvement, both purposes are important. When assessments, such as identification of bottlenecks, are able to combine both purposes, they become more valued and useful. When assessment is conducted for accreditation purposes, faculty and administrators hesitate to reveal weaknesses. We will explore how bottleneck obstacles can deter student success and persistence within a discipline or program. Specifically, we will practice (1) identifying bottlenecks as an assessment tool for student learning and programmatic assessments and (2) developing measurement techniques that gauge the effectiveness of interventions intended to reduce a bottleneck. Without knowing these weaknesses, identifying opportunities for improvement is much harder. Faculty and administrators can use bottleneck identification to assess learning and programs because it provides data that enhances decision-making, closing the assessment loop.

    Learning Outcomes:

1. Participants will be able to explain how they can conduct assessments that serve the dual purposes of accountability and improvement by borrowing a technique used in course improvement.
    2. Participants will learn how to identify bottlenecks to student learning and efficient program operation and identify appropriate assessment measures.

    Audience: Beginner

    Powerpoint Presentation (PDF)

    A Blueprint on How to Maximize Assessment Information and Minimize Cheating in Online Multiple-choice Tests Pia Di Girolamo & Alan Hecht: Drexel University PISB 106

A major challenge in online exams is implementing a setup that tests students’ performance at different skill levels within Bloom’s taxonomy while also ensuring fairness and integrity. Unlike traditional randomization, the proposed setup ensures that all students have different exams while targeting their performance by assessment goal. Participants will learn how to map a multiple-choice exam using pools of questions organized by skill level and difficulty. They will also learn about the use of lock-down browsers and webcams to ensure integrity, the pros and cons, possible issues that may arise, and how the presenters have been resolving them. Attendees will identify how to map and set up a pooled online test targeted to optimize assessment diagnostics. Attendees will also see how a web camera and a lock-down browser can be embedded in online tests, helping to ensure integrity while preserving the pooled banks for future testing.
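As a rough sketch of the pooling idea, the code below assembles a per-student exam by drawing a fixed number of items from pools keyed by Bloom level and difficulty; seeding the generator on the student ID keeps each student's draw reproducible for later review. The pool contents, blueprint, and seeding scheme are illustrative assumptions, not the presenters' actual setup.

```python
import random

# Hypothetical question pools, keyed by (Bloom level, difficulty).
# Each pool holds interchangeable items testing the same objective.
pools = {
    ("remember", "easy"):   ["R-E1", "R-E2", "R-E3", "R-E4"],
    ("apply",    "medium"): ["A-M1", "A-M2", "A-M3", "A-M4"],
    ("analyze",  "hard"):   ["An-H1", "An-H2", "An-H3"],
}

# Exam blueprint: how many items to draw from each pool.
blueprint = {
    ("remember", "easy"):   2,
    ("apply",    "medium"): 2,
    ("analyze",  "hard"):   1,
}

def build_exam(student_id: str) -> list[str]:
    """Draw a per-student exam from the pools per the blueprint."""
    # Seeding on the student ID makes the draw reproducible for
    # review while still differing across students.
    rng = random.Random(student_id)
    exam = []
    for key, n in blueprint.items():
        exam.extend(rng.sample(pools[key], n))
    rng.shuffle(exam)  # also randomize question order
    return exam

print(build_exam("student-001"))
print(build_exam("student-002"))
```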

    Learning Outcomes:

1. Attendees will be able to perform online testing with a setting that targets assessment of competence by Bloom’s taxonomy and level of difficulty.
2. Attendees will be able to use web-monitoring instruments in online testing to preserve integrity and maximize assessment outcomes.

    Audience: Beginner

    Powerpoint presentation

    Peer to Peer: Designing Collaborative Campus Assessment Round Tables Seth Matthew Fishman & Gabriele Bauer: Villanova University PISB 108

Peer-to-peer sharing is an effective way to engage faculty. We often hear that faculty do not like attending assessment "lectures" but seem more engaged in round tables. We will discuss how we initiated a campus-wide assessment forum focused on faculty-led round tables on specialized topics. We will then guide participants in designing a draft of their own round table event. Part of this session focuses on basic planning and implementation issues, as well as facilitating this half-day activity, with lessons learned as we enter our fourth year of the faculty forum. The forum involves and engages faculty in a half-day assessment activity. We've had 35-50 faculty attend each year, and the evaluation feedback suggests they are implementing ideas gleaned from the forum and promoting it among their colleagues. Conference participants will design their own forum during our session.

    Learning Outcomes:

    1. Participants will be able to create a draft for a half-day assessment forum
    2. Participants will be able to convey benefits and challenges of implementing a half-day campus-wide assessment event

    Audience: Beginner

    Powerpoint Presentation (PDF)

Using the Student’s Journey to Transform Institutional Assessment at BCC Jesse Jacondin, Tonia McKoy & Monica Rodriguez: Bergen Community College Pearlstein 101

Institutional Research (IR) supports assessment practices at Bergen Community College. This presentation describes how IR developed an applied approach to data analysis and dissemination. This approach, based on Joseph Campbell’s The Hero’s Journey, is a holistic framework that bridges the gap between stakeholders and meaningful institutional outcomes. It offers an alternative view of institutional assessment that focuses on the student’s experience. The student’s journey through higher education is contextualized through the use of data, illuminating the impact of the institution and its departments on the lives of students. The session will show how institutions can break down silos by focusing on how each department fits into the larger framework of the student’s journey. Within this context, they can assess their effectiveness in relation to the mission of the institution.

    Learning Outcomes:

    1. Participants in this session will leave with a basic understanding of Joseph Campbell’s storytelling device The Hero’s Journey and how it can be related to The Student’s Journey and institutional assessment.
2. Participants will be able to connect institutional outcomes with decision-makers and stakeholders.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Tracking Experiential Learning Outcomes across University Campuses Alyssa Martin, N. John Dinardo & Adam Fontecchio: Drexel University Pearlstein 102

Experiential learning opportunities are viewed as important to the undergraduate experience but vary in breadth, intensity, and integration across institutions. It is therefore difficult to track outcomes from these experiences systematically, especially since no validated assessment tool exists. Drexel’s Center for the Advancement of STEM Teaching and Learning Excellence (CASTLE) has partnered with the College of Charleston and the University of North Carolina at Wilmington to develop a framework for long-term tracking of experiential learning outcomes across campuses. This session will explore challenges and opportunities in developing a validated tool to assess experiential learning across differing experiences and describe the collaborative efforts across the three institutions for long-term tracking of experiential learning outcomes. This work will identify validated core characteristics of experiential learning that promote the student experience and increase understanding of whether specific student populations gain more from particular types of experiential learning.

    Learning Outcomes:

    1. Participants will be able to define learning outcomes for experiential learning activities and create assessment queries that can be broadly applied.
    2. Participants will be introduced to challenges and opportunities in the development of a standardized, validated assessment tool.

    Audience: Intermediate

    Powerpoint Presentation

    Using Focus Groups for Holistic Assessment on Campus Will Miller: Campus Labs Gerrie C LeBow Hall 108

Ultimately, data today helps campuses explain what is happening, but it does not always help us understand why. When we take the time to sit down and talk with students about their experiences, we are able to discover insights and information not available from a cursory look. Complementing survey research and other assessment methodologies with more in-depth focus groups can help to triangulate student-based data on campus. In this session, we discuss how focus groups can be used as a key piece of the larger assessment pie. The overarching goal of this workshop session is to send attendees back to campus armed with a new way to gather insights from their students. They will leave with actionable templates and design ideas for maximizing student information on campus. An overview of design and analysis strategies is included.

    Learning Outcomes:

1. Participants will be able to explain the value of focus groups for holistic assessment on campus.
    2. Participants will be able to describe the necessary steps for successfully utilizing focus groups as part of a holistic assessment culture.

    Audience: Intermediate

    Avoiding the Dreaded "Common Assignment" - an Assessment Model for Large Course Sections Antonis Varelas, Stacey Cooper, Eugena Griffin, Karen Steinmayer & Kate Wolfe: Hostos Community College, CUNY Gerrie C LeBow Hall 109

    Developing a course-level assessment program can be an arduous process that is often made more difficult by attempting to develop a “common assignment” that typically needs to be agreed upon and used by all faculty teaching a particular course. We believe the course-level assessment model we will present takes advantage of and respects faculty’s unique approach to pedagogy while maintaining the integrity and goals of the assessment program. In this session the presenters will outline a model of assessment, appropriate for courses with multiple sections and faculty, that avoids the laborious creation of a “common assignment.” Instead, this model leverages what faculty are already doing inside the classroom and avoids the imposition of a common exercise or exam. We will present a less burdensome and time-consuming approach. We believe it is a more efficient approach than the “common assignment,” and will afford you greater buy-in from faculty.

    Learning Outcomes:

    1. Participants will be able to develop an assessment program without the need of a "common assignment"
    2. Participants will be able to leverage faculty's unique approach to teaching in the assessment process

    Audience: Intermediate

    Sep 12, 2018
    3:15 PM - 3:30 PM

    Break 2

    Sep 12, 2018
    3:30 PM - 4:30 PM

    Concurrent Session 2

    Powerpoint Presentation (PDF)

    Radical Collaboration to Define and Design the Student Experience Jacqueline Snyder: SUNY Fulton-Montgomery Community College Mary Ann Carroll: Herkimer County Community College SUNY PISB 104

Most institutions possess a solid grasp of their student demographics; however, this data only provides a hazy outline and does not identify attitudes, needs, or how students experience their education. To truly revolutionize assessment processes and planning, two SUNY community colleges led initiatives to identify and collect student psychographic data as a way to address institutional planning for student success. Through collaboration, student psychographic data is being used to lead an institutional shift from "who are our students" to "why our students are our students." The national focus on student retention and success has become a priority for institutions, particularly with declining enrollments. Understanding what students value, what their priorities are, and what they expect from an institution will assist in developing plans at all institutional levels that address the educational experience through the lens of the student. Attendees will learn how to design student experience profiles and plans that will result in student-centered planning at all levels of the institution.

    Learning Outcomes:

    1. Participants will be able to recognize elements and strategies needed to lead student psychographic data collection.
2. Participants will be able to develop student experience plans for student success that integrate common institutional plans (strategic, master, operational,...).

    Audience: Advanced

    Snapshot Sessions (A Collection of Mini Sessions) Various Speakers PISB 108

    SS1: Connecting the Dots: Assessing Learning Across Sequential Research Courses
    Suzan Harkness: Notre Dame of Maryland University

    SS2: Institutional Transformation? What’s Really Different?: Maximizing the use of Evidence to Assess Dramatic Institutional Change.
    Virginia Bender: Saint Peter's University

    SS3: Lighten the Load through Peer Review
    Shantelle Jenkins: Cabrini University

    Doug Trimble: Eastern University

    SS4: How To Assess Class Participation and Get Away With It
    Greg Jewell: Drexel University

    SS5: Assessing Experiences Abroad for Continuous Improvement: The Siena Livable Cities Program
    Thomas Paradis: Butler University

    SS6: A Sustainable Method for Institutional Outcomes Assessment Applied to Quantitative Reasoning
Stavros Valenti, Bret Bennington, Frank Gaughan & Terri Shapiro: Hofstra University

    SS7: Consumers of Assessment Data - Who are Our Clients? How Can we Engage Them and Meet their Needs?
    Adrian Zappala: Peirce College

    SS8: Fearing Faculty: How to get Faculty to Embrace the Online World of Assessment, Curriculum, and Program Review
    Danielle Zimecki-Fennimore: Rowan College at Gloucester County

    Powerpoint presentation

    Taming the General Education Hydra Susan Donat & Robin Lauermann: Messiah College PISB 108

What objectives? Why do I have to take this gen ed course? What data? What right do you have to demand data? For every assessment question we field, two (or more!) grow! Sometimes the questions take on a life of their own and derail the larger conversation. Clarifying and effectively communicating outcomes shapes meaningful conversations regarding the nature and purpose of the General Education curriculum. Potentially adversarial territorial disputes are defused into engaged conversations among faculty focused on how the general education curriculum contributes to student development. Participants will develop strategies to improve and reinforce the value of the General Education curriculum at their campus. The result is easier data collection and engaging conversations about improving student learning.

    Learning Outcomes:

    1. Participants will be able to articulate strategies to clarify General Education outcomes
    2. Participants will be able to identify approaches to build institutional consensus regarding the nature, purpose and value of the General Education curriculum

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Providing the Full Cycle of Assessment: Guides and Samples from the (Fictional) Bicycle Riding Program Krishna Dunston: Delaware County Community College Pearlstein 101

A community college is home to a variety of disciplines, pedagogies, and assessment methods. One fictional program creates consistency in resource materials. Resources are key: new faculty are always joining the institution, and program leadership may ebb and flow. An assessment facilitator needs to be able to offer faculty resource materials that are usable, approachable, and clear. This session will provide an overview of the guides, examples, and resources produced as references for faculty at Delaware County Community College utilizing a fictional Bicycle Riding Program. The Bicycle Riding Program makes assessment as easy as riding a bike. Examples remain consistent and allow faculty to better visualize how assessment language and documentation apply to their own discipline.

    Learning Outcomes:

1. Participants will be able to discuss the advantages of creating a consistent example that can clarify an institution's full cycle of assessment.
2. Participants will be able to examine the many artifacts that can be created from a consistent example.

    Audience: Advanced

Asking Questions of Work-Integrated Learning (WIL): Using Qualitative Data to Assess Cooperative Education and Internship Experiences Karen Nulton & Joanne Ott: Drexel University Pearlstein 102

This session highlights the limits of quantitative data in telling the story of experiential learning and explores the value of qualitative methods. The session will guide participants through qualitative coding and analysis and discuss how qualitative coding at Drexel showed small but important differences based on students' gender and international status, highlighting how diversity and inclusion should be integral elements of every assessment plan. This session helps participants to see that evaluating only "easy" data (for example, Likert-scale questions of employers and students) can lead to missing vital information. While we focus on work-integrated learning, the tools and skills are applicable beyond analyzing experiential learning. We offer a very accessible, pragmatic, hands-on exploration of how to gather, code, and assess qualitative data related to work-integrated learning. Participants will learn (1) how to create learning goals for WIL experiences; (2) how to gather and query data about these goals; (3) how to perform basic coding of qualitative questions; and (4) how to use this data to inform curricular change. The session should make participants feel capable of creating and analyzing qualitative data at their home institutions.
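As a minimal illustration of what basic coding can yield once reflections are tagged, the sketch below tallies hypothetical theme codes within student subgroups; the records, codes, and attribute names are invented for the example, not Drexel's data.

```python
from collections import Counter

# Hypothetical coded excerpts: each reflection tagged with theme
# codes plus student attributes (all names here are illustrative).
coded = [
    {"student": "s1", "gender": "F", "intl": False, "codes": ["mentorship", "skill_growth"]},
    {"student": "s2", "gender": "M", "intl": True,  "codes": ["belonging", "mentorship"]},
    {"student": "s3", "gender": "F", "intl": True,  "codes": ["belonging"]},
    {"student": "s4", "gender": "M", "intl": False, "codes": ["skill_growth"]},
]

def code_counts_by(records, attr):
    """Tally theme codes within each level of a student attribute."""
    groups = {}
    for rec in records:
        groups.setdefault(rec[attr], Counter()).update(rec["codes"])
    return groups

for group, counts in code_counts_by(coded, "intl").items():
    print(f"international={group}: {dict(counts)}")
```

Even a simple tally like this surfaces the kind of subgroup differences the presenters describe, which a Likert-scale average would hide.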

    Learning Outcomes:

    1. Participants will be able to perform basic qualitative coding linked to internships/co-ops
    2. Participants will be able to use data from qualitative coding to inform curricular change

    Audience: Beginner

    Performance and Artistry: How to Best Quantify Aspects of Art Through Subjective and Objective Assessment Valerie Joyce: Villanova University Gerrie C LeBow Hall 108

Passion for the Fine Arts engenders environments where students are encouraged to explore personal terrain and expand their imagination. This safe environment often collapses when facing the cold hard reality of grades. Grading performance, whether in dance, theatre, or studio art, or in public speaking or group project presentations, often leads to reliance on emotional responses rather than objective assessment. Teachers can improve the detail of their feedback and instruction by crafting rubrics that focus on the specific skills of performance. This session examines subjectivity in art appreciation and objectivity in learning assessment in Fine Arts classes. The subjective nature of grading performance leads to the most grading complaints and challenges. Developing core values and basic structures around performance assessment will create a stronger base for evaluation and help in communicating the purpose and meaning of grades clearly.

    Learning Outcomes:

    1. Participants will be able to develop approaches to and a process for the evaluation and assessment of student learning in Arts courses
    2. Participants will be able to identify the execution of the elements of the craft and to define a student's abilities as separate from a student's identity in performance

    Audience: Beginner

    Powerpoint Presentation (PDF)

    Promoting and Assessing Professional Identity Formation Michelle Schmude & Matt Marriggi: Geisinger Commonwealth School of Medicine Gerrie C LeBow Hall 109

Few studies have examined the benefits of ePortfolio use for enhancing students’ knowledge and skills regarding professional identity formation, and none have commented on the ability of required ePortfolio use to change student behaviors and attitudes regarding reflection (adapted from an APHC 2018 session proposal). Understanding how reflection is utilized to identify one’s strengths and growth areas, and the ability to develop a professional development plan for future growth, is vital to professional identity formation. ePortfolio use in education has been evolving over the past decade. This session introduces how an ePortfolio can promote and assess students’ evolving professional identity formation. By embedding ePortfolio activities and reflections within a graduate course, we developed a structure to assess this competency. Finally, the Groningen Reflective Ability Scale was administered to determine if a change in reflection occurred. This session can assist attendees in understanding this critical competency for students’ lifelong learning. Attendees can also utilize the information to improve their own lives.

    Learning Outcomes:

    1. Participants will be able to use an ePortfolio to promote and assess professional identity formation.
2. Participants will be able to utilize the Groningen Reflective Ability Scale (GRAS) as a pre- and post-assessment of student learning regarding professional identity formation in a course.

    Audience: Intermediate

Empowering Students and Faculty with Evidence of Learning using Effective Assessment - Drexel Outcomes Transcript and Competency Portfolio Mustafa Sualp: AEFIS; Stephen DiPietro & Donald McEachron: Drexel University Gerri C. LeBow Hall 121

In higher education, courses and instructors are often functionally siloed, and students fail to see the connections between curricular elements. Outcomes-based design and assessment should address this problem but often does not due to a significant disconnect between what students and faculty understand about the significance of student learning outcomes. In an effort to address these issues, a complete assessment management solution approach and software are being designed and implemented to create ‘learning outcomes transcripts’ which transcend individual courses and educational experiences. By providing developmentally relevant feedback to students in real-time, these transcripts may promote significant student ownership of learning outcomes, creating a stronger sense of purpose and curricular continuity. That, in turn, should promote more effective student learning and academic performance.

    Learning Outcomes:

    1. Participants will be able to explain the workflow involved in an extended or value-added transcript from direct assessment planning to data entry to the resulting transcript
    2. Participants will be able to discuss the steps involved in the development of such a project from conception to fruition.

    Audience: Intermediate

    Sep 12, 2018
    4:45 PM - 5:30 PM

    Ice Cream Social

    Sep 12, 2018
    5:30 PM - 7:30 PM

    CASTLE Pedagogical Happy Hour

    Hosting Happy Hours as a Strategy to Improve Teaching and Learning Jennifer Stanford, Eric Brewe, Adam Fontecchio & Jason Silverman: Drexel University CASTLE Idea Lab, 3401 Market St, Suite 110

Many faculty are unfamiliar with effective pedagogical approaches shown to enhance student learning and retention. Events such as the CASTLE Pedagogical Happy Hour build faculty learning communities across disciplines and serve as professional development for faculty interested in developing as evidence-based educators. Participants will learn how to use Pedagogical Happy Hours as a strategy to improve faculty understanding of evidence-based teaching and learning and to build faculty networks at their own institution. Attendees will also see how the strategy works in practice by participating in a mini Happy Hour event. Attendees will leave with information on hosting Pedagogical Happy Hours that they can bring back to their own institutions. Implementation of these strategies can improve the working environment by promoting learning communities of diverse faculty members. Student learning will also be enhanced by implementation of effective pedagogical techniques.

    Learning Outcomes:

    1. Participants will learn how they can build a faculty/staff learning community event at their home campuses.
    2. Participants will experience what a successful pedagogical happy hour looks like in practice.

    Sep 13, 2018
    7:30 AM - 8:30 AM

    Continental Breakfast 1

    Sep 13, 2018
    8:45 AM - 9:45 AM

    Morning Plenary

Thursday Morning Plenary: Assessment's Final Frontier: Making Good on the Promise to Improve Student Learning and Success Jillian Kinzie, National Survey of Student Engagement (NSSE) Mandell Theater

Increased attention to assuring quality student learning and equity has helped usher in noteworthy shifts in assessment practice. Looking across the current landscape of higher education assessment processes and practices, the emerging trend is an authentic form of assessment that values evidence produced in the context of teaching and learning, represents students’ work, supports faculty use of evidence, and is connected to a range of institutional learning initiatives. Despite occasional gripes about assessment being a waste of time and important prods for evidence of real improvements to student learning, there is much to celebrate about the maturation of assessment practice. This session will highlight implications of the shifts in institutional assessment practice, propose some ideas about where the field is headed, and encourage reflection on models for making good on assessment to improve student learning and assure success for all students.

    Biography:

Jillian Kinzie is a Senior Scholar and Associate Director of Indiana University's Center for Postsecondary Research & NSSE Institute, where she is responsible for organizing the day-to-day activities related to the case study portion of this project. She played a similar role in managing the field research reported in Student Success in College, for which she was a senior co-author. In her role as associate director of the NSSE Institute for Effective Education Practice, she has worked with dozens of campuses to develop assessment strategies and advise on using the data to improve student learning. She was the co-principal investigator on the Teagle-funded project to evaluate the efficacy of its investment in supporting the development and use of assessment instruments in liberal arts college settings.

    Sep 13, 2018
    10:00 AM - 11:00 AM

    Concurrent Session 3

    Powerpoint Presentation (PDF)

    Investigating Cognitive & Non-Cognitive Factors: A Schoolwide Approach to Increasing Early Student Success and Completion Kristen Betts, Toni Sondergeld & Penny Hammrich: Drexel University PISB 104

There are now “fewer students studying on campus than at any point since 2012” (Seaman, 2018). Although universities provide flexible course formats including onsite, blended, and online, undergraduate graduation rates still hover around 60%. Concurrently, graduate student completion rates are challenged by the elusive balance of academics, work, and family. This session focuses on institutional approaches to support student success. Drexel University’s School of Education launched a longitudinal study in fall 2017-18 to investigate cognitive and non-cognitive factors associated with early student success, retention, and completion. Data will be shared on GRIT, Mindset, Imposter Phenomenon, and Creativity, and on relationships between key demographic factors and student anxiety. The session will also discuss the importance of evaluating non-cognitive factors for incoming students to decrease melt and attrition. School-wide programming and demonstrations will be shared. Attendees will also share their best practices.

    Learning Outcomes:

    1. Attendees will be able to articulate the importance of investigating the role of non-cognitive factors associated with retention and attrition
    2. Attendees will leave with assessments and proactive strategies that can be implemented within their institutions

    Audience: Intermediate

    Don’t Buy Wholesale: A Better VALUE in Learning Outcomes Assessment Jeff Bonfield & Vincent Lacovara: Rowan University PISB 106

    Technology has made assessment work easier, but for the most part it has been built to execute existing processes rather than to enable better ones. The presenters will share a methodology that makes novel use of existing technology to improve the scope and validity of learning outcomes assessment while providing data that departments can use to improve curricula, pedagogy, and classroom assessment. Typical general education and university outcomes assessment models involve collecting samples of student work that faculty evaluate using a common rubric, often AAC&U’s VALUE Rubrics. Doing so can provide reliable evidence of student learning, but it is time-consuming and often does little to directly inform pedagogy or classroom assessment. The assessment strategy detailed in this presentation has the advantage of measuring student attainment of learning outcomes across departments and throughout students’ undergraduate education, from a baseline in general education courses through upper-level courses. Employing this strategy can benefit institutions equally, regardless of whether they utilize the VALUE Rubrics.

    Learning Outcomes:

    1. Participants will be able to identify flaws in typical outcomes assessment methodologies.
2. Participants will be able to implement an assessment methodology that has advantages over AAC&U’s suggested application of the VALUE Rubrics, both in the reliability of the data and the pedagogical and curricular relevance to faculty.

    Audience: Advanced

    Powerpoint Presentation (PDF)

    We’re in this Together: How Facilitated Peer Feedback Leads to Continuous Improvement Katherine Cermak & Lon Olson: US Naval Academy PISB 108

While many institutions have one or two assessment experts, cultivating assessment expertise and a culture of assessment continues to be challenging. This session will describe ways that assessment committees can actively model the behavior we would like to see documented in reports and expand the number of experts on campus. It will describe an annual assessment report review process featuring self-assessment and formative feedback. The presenters will describe a process that uses best practices (rubric creation and norming) to increase committee member confidence and knowledge, ultimately resulting in faculty-led discussions in which formative feedback is used to improve assessment and learning. Attendees will leave this session with ideas for setting expectations for assessment reports, using a rubric to educate and empower their faculty, and creating a collegial environment that empowers faculty to see assessment as both meaningful and manageable.

    Learning Outcomes:

    1. Participants will consider the process and rubric in comparison to existing structures and tools at their home institutions and identify ways that it can be used or adapted to fit their needs.
    2. Participants will be able to structure peer review to improve faculty confidence and understanding of good assessment practices while creating a collegial environment where formative assessment feedback is more easily received and used to improve assessment.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    “NOT Using ‘Rubrics’ to Assess Institutional Learning Outcomes” Robert McLoughlin & Gerald Kobyski: United States Military Academy Pearlstein 101

Assessing institution-level (or general education) outcomes that have performance, moral, civic, leadership, and social components can be challenging because of the unavailability of the right data, or too much data. We will share a different approach for assessing these outcomes that does not rely on rubrics. Data overload on student learning, whether data on hand or data that could be collected, can be overwhelming for an institutional assessment committee. The presented approach is a different way of deriving meaningful conclusions; it concurrently integrates numerous campus stakeholders, resulting in evidence-based conclusions and recommendations for the way forward. Drawing meaningful and useful conclusions on student outcome achievement becomes more complex with rubrics when voluminous data is available. The presented research-based method facilitates this integration without the use of rubrics.

    Learning Outcomes:

    1. Participants will be able to apply a simple methodology for implementing an effective process for assessing institutional learning (or general education) outcomes, an approach different from applying rubrics to evidence.
    2. Participants will practice the above methodology on their own outcomes and data, or examples on which to work will be provided.

    Audience: Beginner

    Surveys and Standards and Strategies, Oh My!: How to Strategically Align Surveys with Accreditation Standards Jessica Sears, Amanda Dudley and John Lyons: New York University Pearlstein 102

The number of specialized accreditations is growing as accreditation is no longer a “voluntary” process, and accreditors are becoming more assessment-focused. As assessment of student learning grows, more programs are being asked to collect and analyze data in ways they may not understand. This session is a starter for how to collect clinical data that can be used for program improvement. We will take participants through the life cycle of collecting data from clinical/internship/field experience evaluations and using it for program improvement. We will discuss creating clinical surveys that show student progress and developing a matrix of standards mapping, connecting evaluations and program components such as coursework to standards. The session will also cover best practices for survey development and distribution, including preparing data for analysis. Systems demonstrated include Tableau, Qualtrics, and the Google suite. This session provides processes for accredited programs to use, including handouts.
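As a rough sketch of what a standards-mapping matrix can look like once evaluation data is collected, the snippet below pivots hypothetical clinical-evaluation items against the standards they evidence. Item names, standard codes, and scores are illustrative; the session's own materials use Tableau, Qualtrics, and the Google suite.

```python
import pandas as pd

# Hypothetical mapping of clinical-evaluation items to accreditation
# standards (item IDs and standard codes are illustrative).
items = pd.DataFrame([
    {"item": "Q1 lesson planning",     "standard": "1.1", "mean_score": 3.4},
    {"item": "Q2 classroom climate",   "standard": "1.2", "mean_score": 3.1},
    {"item": "Q3 assessment literacy", "standard": "1.1", "mean_score": 2.8},
    {"item": "Q4 reflection",          "standard": "2.3", "mean_score": 3.6},
])

# Matrix view: which evaluation items provide evidence for which standard.
matrix = items.pivot_table(index="item", columns="standard",
                           values="mean_score", aggfunc="mean")
print(matrix.fillna(""))

# Roll up to one evidence summary per standard for the accreditation report.
print(items.groupby("standard")["mean_score"].agg(["count", "mean"]))
```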

    Learning Outcomes:

1. Attendees will be able to lead the mapping of an accreditation’s standards to their clinical evaluations
    2. Attendees will be able to use best practices in online survey distribution, to have data ready for reporting

    Audience: Beginner

Data-Informed Curriculum Mapping in 3 Easy Steps Jane Marie Souza and Andrew Wolf: University of Rochester Gerrie C LeBow Hall 108

In higher education, peer review is an established method of ensuring the quality of publications. Furthermore, anything that is truly important to us in our professional lives, we tend to peer review. Sadly, however, some things of utmost importance to our students – test/exam items – are often not given the benefit of peer feedback. Poor-quality test items may unfairly impact student grades and have the potential to undermine motivation, engagement, and learning. This session addresses this issue by demonstrating how to evaluate test items for content accuracy, applicability to audience, clarity of writing, and alignment to student learning outcomes as well as teaching methods. Using a test item review guide provided, participants will actively engage first in critiquing sample test questions, then in drafting their own questions using the strategies learned. Participants will discuss how the peer review process can be used to improve quality, enhance awareness of, and expand capacity for assessment at their institutions.

    Learning Outcomes:

    1. Participants will be able to choose the appropriate type of test item for the learning outcomes to be assessed
    2. Participants will be able to generate quality test items

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Continuous Improvement- One Course at a Time Janet Thiel: Georgian Court University Gerrie C LeBow Hall 109

    Efficiency is critical for assessment compliance in academics. If one course and its learning artifacts can be used for course-level through institutional-level assessment, all involved in the assessment of learning benefit. The goal of assessment practices is continuous improvement. A process that emphasizes quality of reflection rather than quantity of results enriches the profession and its practitioners. This session examines a process for course reflection based on Brookfield’s reflective praxis. Participants will review a survey that leads faculty into an in-depth reflection on one course. The participants will also consider how data collected across the university from these reflections can inform faculty development and assess academic programs. The heart of an institution’s learning mission is the classroom. The immediacy of actions taken to improve classroom learning should not be impeded by remote assessment practices. Practitioner reflection on course learning puts assessment actions in the hands of the classroom instructor.

    Learning Outcomes:

1. Participants will be able to apply Brookfield's reflective praxis to course assessment.
2. Participants will be able to examine how individual reflection on course learning can be utilized to determine university-wide faculty development and assessment of student learning.

    Audience: Beginner

    Turning Technologies (Vendor Session) Turning Technologies PISB 105

    Sep 13, 2018
    11:00 AM - 11:15 AM

    Break 3

    Sep 13, 2018
    11:15 AM - 12:15 PM

    Concurrent Session 4

    Powerpoint Presentation (PDF)

    Don’t Assess in a Vacuum: Team up to Clean up Assessment Ellen Derwin: Brandman University PISB 104

    Many institutions are challenged to support and encourage faculty to meaningfully engage in assessment practices. Faculty can feel isolated as they strive to create effective assessment tools, such as signature assignments and rubrics. To respond, Brandman University implemented peer review and mentoring strategies to encourage faculty interest and collaboration. Signature assignments and rubrics are key components of assessment, and they are the primary tools used to measure mastery of learning outcomes. Assuring that signature assignments and rubrics are well-aligned, well written, and academically rigorous is relevant to assurance of higher education learning today and in the future. Participants will take away strategies to support faculty collaboration in assessment, and they may find these strategies apply to additional aspects of their work. Additionally, they will gain access to checklists and protocols that can be used and adapted for assessment approaches and projects at their own institutions.

    Learning Outcomes:

    1. Participants will be able to apply supportive peer review and mentorship strategies.
    2. Participants will be able to evaluate well-written assignments and rubrics that align to learning outcomes.

    Audience: Intermediate

Comparing Apples and Oranges: Effective Data Analysis for Program Assessment, Accreditation, and Comparison Raymond Francis & Mark Deschaine: Central Michigan University PISB 106

One of the largest challenges in educator preparation in higher education is the effective use of data for actionable and meaningful decision making about programs. Within programs there are many types of students. However, in all instances the program (i.e., college, university, school, etc.) is charged with using data in a meaningful manner to guide program decisions, make revisions to curriculum and clinical experiences, and determine the overall quality of the program. Visual Scaling is an effective, research-based practice that allows for the comparison of data within and between programs. Examples and online resources will be provided. This session addresses the need for effective and flexible data analysis and communication strategies by demonstrating and exploring scaling strategies that (1) provide an effective strategy for using data in an actionable manner and (2) establish a process for comparison of individual and group data within and between peer groups. The presenters will utilize 3 specific sets of predetermined data to engage participants in the Visual Scaling process. Participants will explore patterns within data, explore themes and patterns across program data, and explore effective uses of scaled data in the process of data-based decision making.
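The presenters' Visual Scaling process is their own; as a generic illustration of why scaling helps comparison, the sketch below min-max rescales three hypothetical program measures onto a common 0-1 range so they can sit on one visual axis. Programs, measures, and values are invented for the example.

```python
# Illustrative program-level measures on very different raw scales.
programs = {
    "Elementary Ed": {"GPA": 3.4, "PRAXIS": 172, "Disposition": 4.1},
    "Secondary Ed":  {"GPA": 3.1, "PRAXIS": 168, "Disposition": 4.4},
    "Special Ed":    {"GPA": 3.6, "PRAXIS": 175, "Disposition": 3.9},
}

def min_max_scale(values):
    """Rescale raw values to 0-1 so different measures are comparable."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

measures = ["GPA", "PRAXIS", "Disposition"]
names = list(programs)
for m in measures:
    scaled = min_max_scale([programs[p][m] for p in names])
    row = ", ".join(f"{p}={s:.2f}" for p, s in zip(names, scaled))
    print(f"{m}: {row}")
```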

    Learning Outcomes:

    1. Participants will become familiar with the general aspects of Visual Scaling processes, and identify key elements in the scaling process that are directly applicable to their professional practice.
2. Participants will be able to use the scaling processes in the assessment, evaluation, and accreditation processes related to their professional practice.

    Audience: Beginner

    Powerpoint Presentation (PDF)

Seeing Assessment: Track and Communicate Students’ Progress with Innovative “Visual Assessment Maps” Dawn Hayward & Jing Gao: Gwynedd Mercy University PISB 108

Because assessment processes occur through time and across multiple dimensions of learning, tracking and communicating the plan and results can overwhelm faculty and staff. A coherent, visual method for mapping the assessment helps everyone better understand the big picture. In this hands-on workshop, participants will explore an innovative approach to “mapping” program-level assessment plans. Invented in preparation for a national accreditation review with the Council for the Accreditation of Educator Preparation (CAEP), the creative “Visual Assessment Diagrams” have proven a successful tool for tracking and communicating students’ progress across multiple key assessments and complex arrays of key data indicators. This visual mapping tool helps simplify the complex task of tracking and demonstrating students' progress and improvement through time, across “implicit” and “explicit” curricula, through multiple types of assessment (e.g., skills and knowledge, dispositions, performance in the field), and by key data indicators (e.g., retention/persistence, completion/graduation rates, employment rates, licensure pass rates).

    Learning Outcomes:

    1. Participants will explore innovative methods for creating visual assessment plans
    2. Participants will discover how mapping can support complex requirements of specialized and professional accreditors

    Audience: Beginner

    Powerpoint Presentation (PDF)

    Utilizing and Prioritizing Assessment Results with Finite Resources Kathryn Strang: Rowan College at Burlington County Pearlstein 101

    As colleges face the dilemma of lower enrollments combined with less funding, making sound budget decisions based upon outcomes is critical to the success of the college. The presentation will share Rowan College at Burlington County’s methods, tools, and strategies used to establish a financially responsible decision-making process. RCBC has developed two mechanisms to assess the college’s functional areas: operational unit reviews and institutional effectiveness plans. Using the outcomes to demonstrate continuous quality improvement, a method of prioritizing all strategic budget requests was developed. This method is applied to all outcomes/recommendations with budget implications to determine their value and most effectively allocate the College’s finite resources. Participants will see how to implement contingency tables, matrices, and steps to closing the loop. The session will be led through a PowerPoint presentation followed by a learning activity and a Q&A session designed for professionals with experience in assessment and knowledge of institutional effectiveness plans and budget planning.

    Learning Outcomes:

    1. Participants will be able to construct a responsible decision making process for allocating finite resources.
    2. Participants will be able to acquire methods, tools and strategies used in the decision making process.

    Audience: Advanced

    Using Leadership Assessment to Develop Outcomes Assessment Leaders in Higher Education Terri Shapiro & Comila Shahani-Denning: Hofstra University Pearlstein 102

    In higher education, faculty/junior administrators often move into leadership roles with minimal, and typically informal, training. We assume they will naturally develop leadership skills as they take on new accreditation/assessment responsibilities. That is an unrealistic expectation, given the particular challenge of leading accreditation/assessment efforts at the institutional and program level. We will present a pilot study in a private northeastern university in which we employed a validated leadership assessment tool (the Hogan Leader Focus report), based on personality and values, as the basis for a leadership development program aimed at faculty and administrators taking on new roles, including assessment and accreditation. Assessment leaders must lead faculty, a group often thought of as “independent professionals,” toward shared, sustainable, data-based assessment strategies. Higher education institutions need to provide these leaders with the tools and skills to help their institutions meet the increasingly difficult demands of accreditation and ongoing, meaningful assessment effectively.

    Learning Outcomes:

    1. Participants will understand that using a leadership assessment tool is a worthwhile basis for engaging faculty and administrators in leadership development discussions.
    2. Participants will understand that successful leadership development is not a one-shot deal; programs aimed at helping both faculty and administrators become better leaders should be conducted over time.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Assessing College Athlete Engagement with High Impact Practices (HIPs) Ellen Staurowsky: Drexel University James DeVita & Anthony Weaver: University of North Carolina at Wilmington & Elon University Gerrie C LeBow Hall 108

    Although high impact practices have been shown to be beneficial to the student experience, their definition and assessment continue to be a challenge in higher education. This session presents preliminary findings from a study focusing on identifying how and in what ways NCAA Division I college athletes have access to HIPs. College athletes constitute a student cohort who often experience high time demands and resource constraints, which limit their opportunities to participate in HIPs. Assessing the access college athletes have to HIPs provides an opportunity to better understand their educational experience. The session will share results about potential challenges when assessing the engagement of athletes in HIPs. Conversation around the ability to effectively measure HIPs for athletes is timely because of public concerns regarding the integrity of the educational experiences offered to that segment of the student population.

    Learning Outcomes:

    1. Participants will be able to identify challenges with defining and assessing HIPs at multiple institutions
    2. Participants will be able to explain the inequities associated with the college athlete experience and HIP participation

    Audience: Intermediate

    Drexel Faculty Assessment Fellows Awards Panel Drexel Faculty Assessment Fellows: Drexel University Gerrie C LeBow Hall 121

    Each year, Drexel University awards an individual, group and/or program with the Drexel University Assessment and Pedagogy Award. The award recognizes individuals and teams that have utilized assessment to improve teaching and learning initiatives and, as a result, have significantly impacted curriculum design and the overall quality of teaching and learning at Drexel. This year’s awardee, along with the first runners up, will participate in a panel discussion describing their efforts, successes, problems and constraints when implementing assessment approaches at Drexel University. The panel will show how various strategies can be implemented at the same institution and that assessment is not a ‘one size fits all’ proposition.

    AEFIS (Vendor Session) AEFIS PISB 105

    Sep 13, 2018
    12:30 PM - 1:45 PM

    Luncheon & Plenary

    A Panel Discussion with College/University Presidents Moderator: John Fry – President, Drexel University Behrakis Grand Hall

    Panel:

    Richard M. Englert: President, Temple University
    Donald Generals: President, Community College of Philadelphia
    Colleen M. Hanycz: President, La Salle University
    Mark C. Reed: President, Saint Joseph's University

    As part of the 5th anniversary of Drexel University’s Annual Assessment Conference, we are sponsoring a President’s Panel to be held during our Thursday Luncheon in Behrakis Grand Hall in the Creese Student Center. President John Fry of Drexel University will moderate the panel, which will include the Presidents of the Community College of Philadelphia, La Salle University, St. Joseph’s University and Temple University. The panel will provide an opportunity for conference attendees to solicit the Presidents’ perspectives on the future of higher education in light of advances in instructional delivery modalities, heightened federal regulatory compliance, enrollment troughs, budget compaction, etc. We are so pleased that the Presidents have agreed to add their insights and thoughts to this critical discussion. The Presidents will comment on the above topics in their opening statements and then respond to your questions from the floor. Your questions may touch on any area within the higher education environment.
    Sep 13, 2018
    2:00 PM - 3:00 PM

    Concurrent Session 5

    Powerpoint Presentation (PDF)

    Feeding the Pig: An Integrated System for Program Learning Improvement Kristen Smith: University of North Carolina, Greensboro Diane Lending: James Madison University PISB 104

    “A pig never fattened up because it was weighed.” Similarly, students do not learn more simply because they are assessed. We will present a model for student learning improvement and an empirical example of how one program evidenced learning improvement at the program level. In this session, we will describe the process of assessing a learning objective, designing and implementing an intervention that crosses the curriculum, and re-assessing student learning. We will present an example of a successful learning improvement initiative from JMU. We will explain the step-by-step procedures used at JMU and share assessment results and implications. Lastly, we will share techniques used to obtain and maintain faculty involvement, which had impacts on faculty and student learning. The objective chosen is the soft skill of conducting an interview, and we will show videos from before and after the intervention.

    Learning Outcomes:

    1. Participants will be able to identify the steps of a model for evidencing program learning improvement (weigh pig, feed pig, weigh pig)
    2. Participants will practice integrating assessment practices with teaching and pedagogy

    Audience: Advanced

    Snapshot Sessions (A Collection of Mini Sessions) Various Speakers PISB 106

    SS1: Reflecting on Self-Reflective Assessments: Influencing the Use of Metacognition in Inquiry-Based STEM Classes
    Thomas Heverin: Drexel University

    SS2: Seeing the VALUE in Numbers: A Quantitative Reasoning Assessment Study in a Chemistry Course
    Punita Bhansali: Queensborough Community College

    Nelson Nunez Rodriguez & Travis Bernardo: Hostos Community College

    SS3: Everything You Wanted to Know about "An Inclusive and Collaborative Assessment Process" but Were Afraid to Ask
    Steven Billis: New York Institute of Technology

    SS4: Development and Validation of Mathematics Computer-Based Assessment Instrument for Senior Secondary School Students
    Collin Okorie Ifere: Ebonyi State College of Education, Ikwo, Ebonyi State, Nigeria

    Boniface G. Nworgu: University of Nigeria, Nsukka, Enugu State, Nigeria

    SS5: Using Telepresence Robots in Formative and Summative Assessment
    Dana Kemery: Drexel University

    SS6: What Makes That True? Building Academic Communities Where Assessment Lives/Leads/Rules
    April Massey: University of the District of Columbia

    SS7: Enhancing Academic Program Review to Include Program Viability Review
    Clevette Ridguard: Montgomery College

    SS8: Beyond Data: The Need for Holistic Assessment of Academic Library Services
    Susan Van Alstyne: Berkeley College

    Can Our Students Write?: A Program Evaluation of Writing and Implementation of a Writing Rubric Amy Bowser & Briane Greene: University of Pittsburgh PISB 108

    The University of Pittsburgh School of Nursing completed a writing evaluation across all programs, which resulted in the implementation of a school-wide writing rubric, new writing assignments, and faculty development for creating assignments and peer review lessons. We will present this assessment process and brainstorm modifications for other university programs. Assessing learning outcomes for accreditation at the program level is prevalent among universities. When benchmarks are not met, we must implement a quality improvement process. We will present an example of how faculty across programs can develop methods to assess writing and improve both writing instruction and student writing. Through presenting the specific case of addressing and improving student writing outcomes, the session will (1) help generate ideas on how to evaluate school-wide student learning outcomes in a systematic way and (2) provide an example of implementing an intervention for improving student learning outcomes at the course level.

    Learning Outcomes:

    1. Participants will be able to plan an evaluation of student writing across all programs.
    2. Participants will be able to develop action plans for faculty to use to improve student writing.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Surveying Student Success: Engaging Everyone in the Conversation Amanda Mae Dudley, John Lyons and Jessica Sears: New York University Pearlstein 101

    Traditionally, retention and graduation metrics have been at the center of student success conversations. Administrators add student voices to the conversation by surveying withdrawn students. Withdrawn student surveys, however, suffer from low response rates and leave out an important population from the student success conversation: the students who stay. Understanding this limitation in available information around student success, NYU Steinhardt’s Office of Institutional Research, in collaboration with NYU Steinhardt Student Affairs, piloted a student success survey. The goal of the survey was to collect student perceptions of their own success and stress and to connect them to university resources where appropriate. In this presentation, we will walk participants through crafting targeted survey items, and creating understandable and insightful dashboards to inform the student success conversation. Defining student success is an important conversation for all university administrators. The tools demonstrated in this presentation ensure that student voices are not left out.

    Learning Outcomes:

    1. Participants will be able to identify and engage appropriate stakeholders in the student success conversation.
    2. Participants will be able to develop targeted survey items to support conversations around student success.

    Audience: Beginner

    Powerpoint Presentation (PDF)

    Developing Faculty Investment and Expertise in General Education Assessment Laura Edelman & Kathleen E. Harring: Muhlenberg College Pearlstein 102

    The structure and process for General Education assessment are a critical first step in getting faculty involved. This session will outline Muhlenberg’s General Education Assessment Plan and share the process of faculty coordination and oversight for assessment of each curricular element. Details will be provided about the assessment of the Diversity and Reasoning requirements. The presenters will share strategies for addressing challenges and for disseminating results to faculty stakeholders to revise curriculum and improve student learning. Additionally, it is important for attendees to see that different curricular requirements need different assessment techniques. This session will provide attendees with an example of a successful model of how to structure General Education assessment and how to involve faculty in the assessment plan and process. Attendees will also learn about two different assessment projects that they can adapt to their home institutions.

    Learning Outcomes:

    1. Participants will learn about a model for general education assessment that they can adapt to their campus climate and needs.
    2. Participants will gain an understanding of general education assessment methods that can be adapted to their curricular structure.

    Audience: Beginner

    Making Your Data Count: A Taxonomy, Process, and Rubric to Achieve Broader Institutional Impact Jennifer Harrison & Sherri N. Braxton: University of Maryland, Baltimore County Gerrie C LeBow Hall 108

    Our session explores technologies that enable institutions to systematize outcomes data, so direct learning evidence can add depth and nuance to learning and predictive analytics and deepen our understanding of student learning. Assessment technologies can help contextualize learning analytics with student learning outcome evidence, but how can institutions integrate these data? Institutions need tools that integrate multiple measures of student success—especially direct evidence—to deepen insights about student learning. To bridge student success and outcomes data, we need software that enables institutions to aggregate outcomes data by rolling up direct evidence to the institutional level. Our goal is to help faculty, staff, and other campus leaders create a culture of data-informed decision making by interacting with three tools we created to help institutional leaders begin to systematize learning assessment data: a taxonomy, a process, and a rubric.

    Learning Outcomes:

    1. Participants will be able to classify technology tools and their assessment uses
    2. Participants will be able to customize a planning process to their institutional culture and develop criteria to evaluate technologies for specific uses

    Audience: Intermediate

    Civic Learning and Intercultural Competency: Key Tools and Strategies for Assessment Javarro Russell: Educational Testing Service (ETS) Gerrie C LeBow Hall 109

    Many courses, programs, and institutions are acknowledging the importance of constructs such as civic learning and intercultural competence. However, traditional assessment tactics, such as learning outcomes, alignment, and data collection, can face certain challenges when assessing these constructs, and the data can come from a variety of sources. This session will review innovative strategies and tools to address the assessment of civic learning and intercultural skills. These include frameworks of knowledge, skills, and dispositions that articulate complex constructs, frameworks for aligning various institutional efforts, and the integration of multiple sources of data.

    Learning Outcomes:

    1. Participants will identify the knowledge, skill, and attitudinal components of civic learning and intercultural competence.
    2. Participants will compare and contrast multiple data sources in the assessment of civic learning and intercultural competence.

    Audience: Intermediate

    Locks and Keys of GPS: Guided Pathways to Success - Standardized Program Maps with Prerequisites M. Christopher Brown & Beverly Schneller: Kentucky State University Gerri C. LeBow Hall 121

    This session will present standardized program map templates to increase efficiency and consistency in advising, retention, and program completion. The lock and key program map allows students and advisors to actively visualize courses that serve as or require prerequisites, thereby encouraging students to follow a prescribed course of study leading to program completion. The session will provide attendees with a template to standardize their institution's program maps and provide a visual to help students identify courses that serve as and/or require prerequisites. This will decrease the likelihood that students will drop classes that are necessary for their academic progression.

    Learning Outcomes:

    1. Participants will learn about the importance of creating standardized, easy-to-follow program maps using locks and keys to show courses that serve as or require prerequisites.
    2. Participants will use the tools obtained through the session to design program maps for their curriculum.

    Audience: Intermediate

    Sep 13, 2018
    3:00 PM - 3:15 PM

    Break 4

    Sep 13, 2018
    3:15 PM - 4:45 PM

    Concurrent Session 6

    Powerpoint Presentation (PDF)

    Two 45 Minute Sessions PISB 104

    3:15-4:00pm

    Where is the Sweet Spot?
    Fiona Chrystall: Asheville-Buncombe Technical Community College

    What is the difference between establishing an assessment system and sustaining one? Establishment requires structure, and adherence to it long enough to identify the pros and cons of the system, rather than human responses to it. Sustaining the system requires focusing on the human response to the work of assessment. Without this shift in focus, an assessment system can become an empty vessel. Customizing assessment to the varying needs of programs aids relevance and meaningfulness, but how much customization is enough without losing integrity? Balancing the needs of internal and external stakeholders in the assessment process is tricky. Participants will have the opportunity to evaluate the current state of their institution’s assessment system and determine future potential directions. Through discussion, use of an evaluation exercise, and reflection, they will generate ideas for how to maximize the effectiveness and efficiency of assessment while maintaining a system with integrity.

    Learning Outcomes:

    1. Participants will determine the “health” of their assessment system by working through a set of structured questions during the session.
    2. Participants will generate ideas for how to maximize the effectiveness and efficiency of their assessment system in the long term, potentially increasing the perceived value of assessment while maintaining the integrity of the system.

    Audience: Intermediate

    4:00-4:45pm

    Assessment as Leadership for Change
    Kathleen Wise: Center of Inquiry and the Higher Education Data Sharing Consortium

    Catherine Andersen: University of Baltimore

    Cynthia Crimmins: York College of Pennsylvania

    Good assessment requires more than technical skills. It requires leaders who can work with colleagues to find ways to respond formatively to evidence. This includes setting goals, planning, and implementing projects across different and disconnected parts of the institution. Our session will highlight the skills necessary for successful assessment leadership. Many people see the final product of assessment as a report for administrators and/or accreditors, but its most important goal is to improve student learning. Evidence alone won’t do this. Evidence must be embedded in our unique institutional environments and mechanisms, and this requires a distinct kind of leadership. Participants will learn leadership skills necessary for helping their colleagues use assessment to improve student learning. These include: (1) setting ambitious but reasonable goals, (2) planning for the entire assessment cycle, not just collecting evidence and writing reports, and (3) implementing assessment plans in the unpredictable environments of our institutions.

    Learning Outcomes:

    1. Participants will be able to establish ambitious but realistic and sustainable assessment plans that include commonly overlooked components
    2. Participants will be able to structure assessment plans so that colleagues focus on work that impacts student learning, not just minimally complying with assessment requirements.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Two 45 Minute Sessions PISB 106

    3:15-4:00pm

    Cooperative Education and an Increasingly Entrepreneurial Workforce: In Step?
    Kristen Gallo-Zdunowski & Liza Herzog: Drexel University

    Self-sufficiency, resilience, and opportunity recognition are emerging competencies central to today's hiring practices. The Steinbright Career Development Center and the Close School of Entrepreneurship at Drexel University will present early findings from a collaborative study to better understand the relationship between Drexel's cooperative education program and those competencies essential to entrepreneurial education. Today's fast-moving information platforms, together with finite resources, mean that employers must continuously adapt their ways of doing business. In order to best position students for the future of work, it's critical to understand shifting employer needs and priorities, and align academic and professional programs accordingly. Session participants will come away with actionable insights into their own academic and cooperative education programming and assessment practices, and with tools to help explore the connection between student and employer interests and needs.

    Learning Outcomes:

    1. Participants will explore the ways in which employer needs are evolving, as evidenced by job descriptions and job titles within a cooperative education program.
    2. Participants will be able to identify competencies critical for student candidates, and determine whether these competencies are being developed within their own academic units.

    Audience: Intermediate

    4:00-4:45pm

    Building an Effective Assessment Process: How to Know That Your Students Are Learning
    Sonya Ricks: Bennett College

    This session will provide a model for building an effective assessment planning process to determine what students are learning. Teacher Education programs will see assessment as a process, not a thing. The foundational concepts of validity, reliability, and fairness are at the core of the inferences faculty can make about student data, but building and implementing effective processes is what helps programs capitalize on good measures of student learning. This session will help educators turn data into information and information into action by focusing on an effective assessment process, including critical features, common pitfalls, and keys to effective implementation in practical settings. In this current era of high-stakes accountability in higher education, the ability to effectively engage in an assessment process that can accurately tell the story of what is being accomplished with students in our programs is essential. An effective assessment process can turn data into information and information into productive action that can bring about student success.

    Learning Outcomes:

    1. Participants will be able to identify the key components of a quality student learning outcomes assessment process; the session will include an assessment activity packet providing a step-by-step guide to identifying and evaluating student learning outcomes.
    2. Participants will be able to describe the relationship among assessment process components through an interactive strategic engagement activity for each component.

    Audience: Beginner

    The Last Assessor: Feel the Force of Assessment that Awakens Revolutionary Change in Ancillary Support Services Mark Green & Ray Lum: Drexel University PISB 108

    Join your colleagues ‘a long time ago, in a faraway university’ while you wrestle with assessment in ancillary support services. Do battle in a lighthearted but rigorous hands-on session that develops assessment techniques for ancillary support services. Be surprised as we awaken the Force within you using unassuming topics. Participants will work collectively to develop assessments that are reliable and measurable. A panel of Jedi Assessors will determine the efficacy and validity of the developed evaluations. Conference attendees may observe the masterfulness of quick-witted Jedi Assessors as participants gain sage advice on the meaningfulness and strategic nature of closing the loop. In addition, attendees will differentiate themes and best practices of ancillary support services from those of the curriculum. While camaraderie during the session is a prize for some, the top assessment tools presented will receive galactic recognition and, of course . . . bragging rights as the Jedi Assessor of the Year. This presentation will focus on producing assessment relatable to all audience members and will illustrate how to engage in a robust discussion about assessment that is fun at the same time.

    Learning Outcomes:

    1. Participants will have the opportunity to network and engage in meaningful dialogue with other conference attendees.
    2. Participants will be able to differentiate assessment tools and apply them accordingly.

    Audience: Beginner

    A Collaborative Approach to Faculty Development: Using Human-Centered Design Tools to Help Advance Assessment Dana Scott: Thomas Jefferson University Pearlstein 101

    This hands-on workshop will present how a community-based model of assessment can be effective on multiple levels of faculty development. Participants will learn how to use a variety of strategic design tools, and work collaboratively to identify key issues and insights to advance their assessment practices. The session will introduce a series of tools that use both convergent and divergent strategies to evaluate and improve assessment practices. It is designed to allow participants to collaboratively learn from each other, making it accessible to individuals with introductory to advanced levels of knowledge. Participants will be provided with strategies to cover faculty development on assessment from large-scale, university-wide presentations, to smaller group workshops, down to one-on-one sessions with assessment specialists. The information presented will allow participants to continue to generate ideas for impactful faculty development.

    Learning Outcomes:

    1. Participants will be able to recognize a process for creating a holistic system of faculty development.
    2. Participants will be able to examine key issues and insights in their assessment procedures in order to gain a “big picture” overview of their assessment.

    Audience: Intermediate

    Hands-on with Digital Communication Poppy Slocum, Jaime Riccio & Patricia Sokolski: LaGuardia Community College Pearlstein 102

    This workshop will explore why our college chose to integrate digital communication in student learning outcomes, what digital communication means, and how we assess students’ digital work. Participants will get hands-on experience creating and evaluating assignments that highlight a student’s ability to communicate digitally. While digital communication is an ever-growing part of our students’ lives, creating assignments and assessing digital communication continues to be challenging for faculty. Much of the current research on digital assessment in college highlights the lack of academic scholarship in this area or the gap between faculty knowledge and practice. Participants will evaluate wiki-based assignments with our assessment rubric before creating a wiki assignment for use in their own classes. As an accessible and free platform for faculty and students, wikis are easy to learn, easy to teach, easy to assess, and can be used in any course or discipline.

    Learning Outcomes:

    1. Participants will use a rubric to assess digital communication ability
    2. Participants will create a wiki-based assignment for one of their courses

    Audience: Beginner

    Assessment Building and Time-Saving Grading with Google Forms and Flubaroo - Come Effectively Form! Susan Marie Terra & Hayet Bensetti-Benbader: New Jersey City University Gerrie C LeBow Hall 108

    Google Forms and Flubaroo make building and evaluating assessments easy. A majority of high school students today use Google Forms in conjunction with Google Classroom, so why not create assessments for your students using a platform they are already familiar with? These assessments maintain student engagement based on how they are developed using new literacies. For instance, teachers use Google Forms to create assessments that incorporate watching videos or viewing photos and require immediate student feedback directly in the form (Dutton, 2015). This session will provide self-made, classroom-ready assessments to serve as models for future assessments. Attendees will learn how to use Google Forms to create classroom-ready (brick and mortar or virtual) assessments. Google Forms will save attendees time upon return to their classrooms because assessments can be sent to students electronically and graded within minutes once complete, saving time and paper.

    Learning Outcomes:

    1. Participants will be able to create a classroom-ready assessment and grading rubric to use immediately with their students.
    2. Participants will learn how to convert their existing assignments into a Google Form.

    Audience: Beginner

    Engaged Conversations: Assessing Internship and Co-op Experiences as Part of the Educational Curriculum Donald McEachron, Karen Nulton and Joanne Ott: Drexel University. Nancy Johnston: Simon Fraser University (British Columbia, Canada) Gerrie C LeBow Hall 109

    Traditional classroom and experiential learning environments differ in significant ways related to the value of information, the delivery of information, control of the curriculum, learner and teacher roles, etc. In this session, participants will explore how differences in experiential and classroom formats might impact program design, delivery, and assessment. Valid and useful assessments allow for the effective design of work-integrated educational experiences which can be integrated with classroom learning. Such designs are essential if work experiences are to be transformative in the development of professional skills, now required by regional and professional accreditors as well as by industry. Participants will explore the benefits of qualitative data using samples gathered from Drexel's extensive co-op data set and examine which work-integrated learning goals can best be assessed with qualitative and quantitative instruments, along with statistical approaches. This will allow participants to implement more effective, learner-centered work-integrated experiences into a curriculum.

    Learning Outcomes:

    1. Participants will be able to develop learning outcomes appropriate to work-integrated learning environments
    2. Participants will be able to apply analytic techniques to qualitative data obtained from work-integrated learning experiences

    Audience: Intermediate

    QSR International (Vendor Session) QSR International PISB 105

    Sep 13, 2018
    3:15 PM - 4:45 PM

    AEFIS User's Meeting

    AEFIS Users Meeting AEFIS Gerri C. LeBow Hall 121

    The AEFIS Team is excited to host the inaugural AEFIS Users Meeting as part of the 5th Annual Drexel Assessment Conference. The meeting will offer the opportunity for users to share their strategies for using AEFIS to support their assessment, continuous improvement and accreditation efforts. Come share and learn while you network with colleagues from across the country and the AEFIS Team.

    This event is by-invitation only.

    Sep 13, 2018
    4:45 PM - 5:15 PM

    Transportation

    Sep 13, 2018
    5:30 PM - 7:30 PM

    Reception

    Evening Reception at the Betsy Ross House The Betsy Ross House: 239 Arch St. Philadelphia

    This year, the Thursday evening reception will be held at the famous Betsy Ross House courtyard and home located in the historic district of Philadelphia. There you will be treated to food, beverages, and, if you so choose, a tour of the house, or you may enjoy the company of newfound friends and colleagues at one of the nation's historic properties. There is even a rumor that George Washington and Betsy herself will be among the guests! Shuttle transportation will be provided to and from the event.
    Sep 14, 2018
    7:30 AM - 8:30 AM

    Continental Breakfast 2

    Sep 14, 2018
    8:45 AM - 9:45 AM

    Concurrent Session 7

    Powerpoint Presentation (PDF)

    What if There was no Failure, Only Feedback? How Visible Learning Provides a Platform for Lifelong Learning Suzanne Carbonaro & Carolyn Giordano: University of the Sciences Caitlin Meehan: AEFIS PISB 104

    Receiving real-time feedback of learning within a modular course structure is essential for students, faculty and administrators as they strive for success and meet their goals. Using a rich technology base, our assessment system is couched in the research of educator John Hattie (2012) and his work on factors that affect student learning. This session reveals an innovative approach to curricular, co-curricular and experiential education assessment design integrated within a modular program. This transparent, cloud-based system is the foundation of an assessment infrastructure, which provides visible, real-time, meaningful feedback to all stakeholders to inform student learning and instructional effectiveness. Participants will be able to infuse the assessment model in their own teaching and learning practices with their students and university stakeholders. Rubrics, curricular mapping, effective instructional practices and use of data to drive improvement can be adapted for individual program and institutional needs.

    Learning Outcomes:

    1. Participants will learn, through backward design and curricular mapping, how to design a robust student dashboard that reveals progress toward meeting competencies for graduation.
    2. Participants will learn the components of visible learning and discuss how to apply them in their assessment processes.

    Audience: Intermediate

    Solving the Rubrics Puzzle: Exploring the Use and Perceptions of Rubrics by Faculty Diane DePew & Mark Green: Drexel University PISB 106

    Rubrics provide a means for faculty to express assignment criteria, guide students in assignment expectations, and facilitate feedback from teacher to learner. We undertook this survey to understand where faculty stood with the use of rubrics. We expect that a discussion of the data will provide some clarity on the use of rubrics and determine a direction for the future needs of faculty in developing and using rubrics in their practice. This session will provide an exploratory analysis of the results of a University-wide survey on the development and use of rubrics by faculty. The presenters will highlight key findings and trends while trying to answer the question: what do they mean and where do we go from here? This session addresses the subtheme, “Learning and contemporary higher education assessment.”

    Learning Outcomes:

    1. Participants will discuss the current state of faculty development and use of rubrics.
    2. Participants will be able to identify future needs in the development and use of rubrics in faculty practice

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    It Took a Village: How One Department Created a Viable Assessment Model through Shared Ownership Jacqueline M. DiSanto, Sarah Brennan & Catherine Lewis: Hostos Community College, CUNY PISB 108

    Faculty in the Hostos Community College Education Department joined a campus-wide initiative to use rubrics to assess the development of critical skills from the first course through the capstone course. Using common assignments and the corresponding rubrics allows for the monitoring of these skills through criteria relevant to the academic program, and faculty can use these tools in planning instruction. A one-day retreat was held where faculty across four distinct units successfully collaborated to pair key general-education competencies with each program-learning outcome. The results of this joint effort include highly usable assessment tools to measure growth in both career and general-education skills; shared ownership by administration and faculty of the assessment process; and the means for students to self-assess and revise their work. This process can be easily replicated. Degree programs must provide instruction that supports the development of key skills.

    Learning Outcomes:

    1. Participants will be able to create a rubric for a common assignment that pairs general-education competencies with program-learning outcomes.
    2. Participants will be able to replicate the step-by-step plan, including the retreat, at their home campuses.

    Audience: Intermediate

    Prior Learning Assessment (PLA): Opportunities, Challenges, and Institutional Commitment Victoria Ferrara: Mercy College Pearlstein 101

    Mercy College is committed to providing Prior Learning Assessment (PLA) offerings to assist adult learners with degree completion. This session will engage participants in a discussion about opportunities, challenges, and the need for institutional commitment to PLA. More adults are returning to college to finish their undergraduate degrees. Prior Learning Assessment (PLA) is a viable option for adults to earn credit for learning acquired outside of the traditional classroom setting. PLA is one way to attract, retain, and graduate adults who have significant life and work experience. This session will engage attendees in meaningful conversation about the critical role of PLA to institutional success. Time will be spent discussing potential opportunities for PLA, challenges institutions may face when implementing PLA programs, and the critical nature of institutional commitment to the PLA process.

    Learning Outcomes:

    1. Participants will be able to explain how PLA can be used to attract, retain, and graduate adult learners
    2. Participants will be able to identify opportunities for and challenges relating to PLA on their campuses

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Change is the Only Constant: Using Assessment Best Practices to Guide Strategic Planning Kate Oswald Wilkins & Susan Donat: Messiah College Pearlstein 102

    A strategic plan can serve as a springboard to forward improvement efforts through institutional politics and layers of governance while documenting improvements over time. Using a synthesis of assessment best practices, participants reflect on the status of their institution’s culture and assessment efforts to create an assessment strategic plan. Developing a strategic plan involves prioritizing long-term needs and goals, identifying existing and needed resources, planning concrete tasks aligned with those goals, and setting realistic timelines. Strategic plans drive annual planning and the ongoing prioritization of assessment tasks while communicating big-picture goals to high-level leadership (president, provosts and deans). Reflecting on the status of assessment work and prioritizing long-term goals helps assessment professionals be strategic and purposeful in leveraging limited resources and accomplishing daunting tasks (e.g. developing a culture of continuous improvement and learning). A well-crafted strategic plan helps make overwhelming long-term goals manageable and achievable.

    Learning Outcomes:

    1. Participants will be able to evaluate current assessment practices and culture at their home institution.
    2. Participants will be able to develop a strategic plan to move towards assessment best practices on their campus.

    Audience: Intermediate

    Powerpoint Presentation

    Creating a Culture of Assessment - Lynn University's Evolution from Panic to Plan Michael Petroski & Katrina Carter-Tellison: Lynn University Gerrie C LeBow Hall 108

    Most faculty dread anything to do with assessment and want it to be done by someone else. Accreditation notwithstanding, involving all faculty in the process helps them understand the benefits of assessment and leads to better academic decision making. Assessment closes the loop on continuous improvement. This session will outline how a small private university went from viewing assessment as an accreditation-time exercise (panic) to a fully integrated and continuous part of our regular operations (plan). Our culture of assessment is allowing better data-driven academic decisions, not just in accreditation years, but every year. This session will share some best practices and lessons learned about faculty participation in the assessment process. It will provide a relatively easy way to improve student learning and can help make the much-feared accreditation site visit less stressful.

    Learning Outcomes:

    1. Participants will be able to explain the benefits of a continuous faculty-driven assessment process.
    2. Participants will be able to create a comprehensive assessment plan based on current courses, assignments, and student learning outcomes.

    Audience: Beginner

    Powerpoint Presentation (PDF)

    “I can do that!” - Helping Faculty and Staff Understand Assessment and Its Purpose in a Fun and Simple Way Nicole Tomasello-Rodriguez: Trocaire College Gerrie C LeBow Hall 109

    We have all seen it - that one faculty or staff member who looks like a deer in headlights when we mention the word, “ASSESSMENT.” They become obviously anxious, twiddling their thumbs, shaking their foot and explaining how they have too many other tasks to complete than to worry about “completing all of these forms.” I once had a staff member tell me that she “hates” assessment and was “scared to death” to present at our Institutional Effectiveness Committee Meeting. Ouch! That meant I needed to make some changes in how assessment was being portrayed to the college community. Once I was able to implement some positive, simple and fun initiatives, the staff were able to understand why assessment is important, how it can help them in their careers and how they can work with others to accomplish assessment goals. When faculty and staff see assessment as important to their own specific initiatives, they are able to use data to implement changes that will improve their offices and ultimately the college as a whole. This presentation will provide examples of specific activities that allowed faculty and staff to embrace assessment at the program and college level.

    Learning Outcomes:

    1. Participants will be able to teach faculty and staff about the importance of assessment in a fun and simple way
    2. Participants will be able to implement a specific learning activity with faculty, staff or students about the cycle of assessment

    Audience: Beginner

    Sep 14, 2018
    9:45 AM - 10:00 AM

    Break 5

    Sep 14, 2018
    10:00 AM - 11:00 AM

    Concurrent Session 8

    Powerpoint Presentation (PDF)

    Connecting Teaching & Learning: Measuring the Effects of Pedagogy on Course Evaluations Ethan Ake-Little & Dana Dawson: Temple University PISB 104

    In this session, we will explore how two types of pedagogical approaches (skills-based and experience-based) mediate the relationship between student demographic and academic profiles and three measures of course evaluation (student interest and preparation, instructor quality, and the overall quality of teaching and learning in the course). This session discusses our forthcoming study, which examines how more than ten pedagogical techniques self-reported by instructors in over 8,000 General Education sections influence students' perceptions of instructors and courses. Few studies explore this issue, and our ultimate goal is to encourage both administrators and instructors to incorporate findings into policy and practice. Our study aims to provide a comprehensive quantitative model designed to help educators understand the relationships between instructors, students, and the dynamic nature of teaching and learning. The study is part of a larger initiative to not only conduct rigorous assessment, but also “close the loop” between assessment and practice.

    Learning Outcomes:

    1. Participants will understand how skills- vs. experience-based pedagogy influences the relationship between a class's demographic profile and its perception of the course.
    2. Participants will understand how skills- vs. experience-based pedagogy influences the relationship between a class's academic profile and its perception of the course.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Days and Nights at the Round Table: Collaborative Program Review Tracy Kaiser-Goebel, Stefanie Crouse & Gaetan Giannini: Montgomery County Community College PISB 106

    We will explain MC3's revised approach to the Annual Program Review. Historically the responsibility of each program coordinator, the review now brings together a team of academic program stewards for a collaborative meeting where we identify common barriers across programs at the institution and create meaningful, long-term action plans. Historically, program information from Student Affairs was not included in the APR process. The team structure allows us to target areas of concern and perform real-time analysis of possible solutions, in partnership, across the institution. We discuss equity, course sequencing, and student success and retention from a variety of perspectives. The session will provide examples of how the process created a renewed sense of shared ownership and achievable action steps. The audience will learn of a collaborative, data-driven process improvement, which they can apply to refining institutional effectiveness and assessment of student learning in a variety of settings and scales.

    Learning Outcomes:

    1. Participants will be able to assemble a team of stakeholders for institution-wide review of academic programs.
    2. Participants will be able to apply this model to reform of other processes, where a more inclusive approach can bring similar gains.

    Audience: Intermediate

    Expanding Mission-Based Student Learning Assessment from the Classroom to Across the Institution Diane Pevar & Marc Minnick: Manor College PISB 108

    The assessment of student learning outcomes begins in the classroom, and that data is used to drive program assessment and planning within the silo of the specific program. However, evaluation of institutional effectiveness can be markedly enhanced through an interdisciplinary examination of how well the institution’s mission and goals are linked to student learning. This presentation will examine ways of crossing program and discipline lines to measure institution-wide student learning outcomes and trends, and how academic assessment outcomes can be scaffolded into institutional assessment, planning, and effectiveness, particularly in the area of mission. Participants will examine a hypothetical university and use its mission and institutional goals to create an interdisciplinary rubric that can be used across the curricula to track the linkages between student learning and advancement of the institution’s mission and goals. This session will provide both beginners to assessment and those already performing course- and program-level assessment with additional tools to extend examination of mission reach. This session will provide attendees with a rubric tool that can be used and adapted to evaluate student learning across an institution.

    Learning Outcomes:

    1. Participants will explore and understand how academic assessment can be used as a measure of how effectively the institution’s mission is being communicated.
    2. Participants will collaborate in the development of a rubric that can be used by an institution’s diverse academic learning pathways to assess whether institutional goals for student success are being achieved.

    Audience: Beginner

    Entrustable Professional Activities for Competency-Based Education and Assessment Laura Vearrier: Drexel University Pearlstein 101

    This session provides an introduction to the concept of entrustable professional activities (EPAs) and how EPAs can be used to assess competency-based learning. EPAs translate milestones and competencies into measurable units of professional abilities to determine a learner’s readiness to perform tasks independently. EPAs assess a learner’s ability to perform professional activities, which goes beyond knowledge retention and individual skills. EPAs require a synthesis of knowledge and skill sets in order to perform complex professional duties required for real-world performance. Attendees will develop tools to think about the ultimate end-goal of professional assessment. A purpose of assessment is to determine if an individual can competently and independently perform duties. Thinking about how achievements translate into real-world success will facilitate the process of assessment.

    Learning Outcomes:

    1. Participants will be able to describe the concept of entrustable professional activities (EPAs).
    2. Participants will understand how to map milestones and competencies to EPAs.

    Audience: Advanced

    Powerpoint Presentation (PDF)

    Celebrating your Students' Research Efforts Through Mid-point and Senior-level Performance Benchmarks and Processes Penny Weidner & Albert Sarvis: Harrisburg University of Science and Technology Pearlstein 102

    Yes, celebrating is the perfect way to describe this session. This presentation outlines a comprehensive assessment process and set of tools grounded in Harrisburg University’s student-research requirement for each academic program. This process allows students, faculty, and staff to collaborate across programs and operational units in support of student-research projects. The model’s strengths include flexibility and innovation at the academic-program level while maximizing self-directed student research. After 12 months of planning and several considerations, HU faculty and staff have embraced a standardized process that streamlined their work. And yes, the University community is now celebrating the students’ research accomplishments. Students generated 110 research projects, known as Projects I and II, during the 2017-18 academic year. Projects occurred at the 200 and 400 course levels and embedded two objectives: first, the content mapped to three program goals; second, the deliverables were measurable and demonstrated performance associated with the University’s eight competencies.

    Learning Outcomes:

    1. Participants will learn strategies for integrating student-based research across programs and course levels that can be adapted to their institutional capacity.
    2. Participants will understand the benefits associated with implementing this concept including its impact on capturing performance measurements, planning, student advising, and celebrating the student’s accomplishments.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    Leveraging “Bad News” into Transformative Change – Faculty Lead the Way Kalina White & Caroline Evans: Community College of Allegheny County Gerrie C LeBow Hall 108

    Failure to meet accreditation guidelines is a clarion call for change. Academic standards are strengthened and maintained when faculty take ownership of the problem solving process. In this session, CCAC’s experiences using its core academic values to empower local control to improve the student experience will be discussed. Increasing faculty leadership and engagement in institutional accreditation processes can be difficult. Shifting from a top-down mandate to faculty-led control can lead to rapid change in how assessment and accreditation are viewed. When faculty internalize accreditation guidelines via direct involvement, a collaborative network of peer mentorship can be developed. Participants will learn how a problem (in this case, a threat to accreditation) can be transformed into a chance to build engagement, and drive innovation, cutting through years of inertia, while still remaining true to fundamental institutional values.

    Learning Outcomes:

    1. Participants will understand the strategy used at CCAC to subvert federal mandates to spark local change, for potential use at their home institutions.
    2. Participants will gain insight and tips on how to build a robust faculty-led network for change regarding future accreditation requirements.

    Audience: Intermediate

    Powerpoint Presentation (PDF)

    The American Idol of Assessment - External Program Reviewer Anne Willkomm: Drexel University Bobbijo Grillo Pinnelli: Immaculata University Gerrie C LeBow Hall 109

    Since the external review is an essential aspect of program review/assessment, the act of reviewing is a helpful service to external colleagues and helps the reviewer see their program(s) more objectively. This session will provide participants the dos/don'ts of being an external reviewer, which will further broaden their assessment expertise. External reviewers often learn by doing because there is no training. In this presentation, we will discuss our experience and provide the dos and don’ts to faculty who would like to take on the valuable assessment role of external reviewer, specifically detailing the expectations for preparation, site visit, and reporting. The external reviewer brings their knowledge, skills, and experience to evaluate program(s) and offer constructive feedback. Not only will they be able to take on the role of an external reviewer, but through that role, they will bring their experience to their own review practices, further enhancing their assessment practices.

    Learning Outcomes:

    1. Participants will be able to describe the role and expectations of an external reviewer.
    2. Participants will be able to identify how skills, knowledge, and expertise can be valuable as an external reviewer.

    Audience: Intermediate

    Sep 14, 2018
    11:15 AM - 12:00 PM

    Closing Remarks

    Closing Remarks with the Conference Committee Stephen DiPietro: Drexel University PISB 120