
An 'un-conference' is a participant-driven meeting.

All who attend should plan to actively participate.

LUAU will be held:

April 4, 2014, from 9am to 4pm
White Stag Building / UO Portland
70 NW Couch St
Portland, OR
The LUAU planning team has been selected for the ACRL-Oregon Award for Excellence 2014!

Way to go, Assessment Team!

Chair: Rick Stoddart, OSU
Council Liaison: Donna Reed, PCC
Chair-Elect: Steve Hiller, UW
Members: Diane Prorak, UI
Frank Haulgren, WWU
Kate Rubick, L&C
Laura Zeigen, OHSU
Lori Ricigliano, UPS
Meredith Farkas, PSU
Steve Borrelli, WSU
Alliance Staff: Anya Arnold


LUAU Roundtable Talks:

1. Judy Solberg – Seattle University

Writing rubrics to assess user services. Description: How do we measure the success of our various user services, including research/reference service, research consultations, information services, circulation services, etc.?

Notes:
  • Before you can measure, you must define expectations for performance
  • Use RUSA behavioral standards or something similar
  • READ Scale
  • Decide what evidence you will use to judge against your standards
  • Benchmarking changes to standards is needed
  • Rubrics may vary across disciplines
2. Karen Munro and Bronwyn Dorhofer – University of Oregon

Reaching out to users who "don't need" the library. How the UO Portland Library & Learning Commons has reached out to professional graduate programs with low library awareness and usage.

Notes:
  • Email students/faculty to set up interview meetings
  • Questions
    • Demographic
    • Cultural
    • Library services
      • If you were to use the library, what would you use?
    • Review syllabi: where would library services fit in?
  • Faculty buy-in is critical and required
  • Outreach strategies
    • Workshops
    • Extended hours
    • E-newsletters
    • Visibility in the department
  • Don't need
    • Lit reviews
    • Print newsletters

3. Michelle DeSilva – Central Oregon Community College

Meaningful use stats: how could we better use or creatively supplement COUNTER-type use stats to get more meaningful information about how users actually use our resources (and possibly how that affects their work) - beyond knowing how many full-text journal articles are downloaded, for example?

Notes:

  • Some other sources of data
    • Custom reports from vendors
    • EZproxy/CAS authentication data to identify users by school or department
  • Qualitative methods
    • Bibliographic analysis
    • Interviewing users
    • Unobtrusive observations
    • Self-reported use
    • Subject specialists relearning
    • Librarian and faculty input
  • Can use rubrics to evaluate use data
  • Can use data visualization to analyze and communicate use stats
  • Cost per use (see the sketch after this list)
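
The notes stop at "cost per use" without spelling out the arithmetic, so here is a minimal sketch, assuming a hand-built CSV that joins each title's annual subscription cost to its COUNTER-reported full-text downloads. The file name and column names are illustrative assumptions, not a standard COUNTER layout.

```python
import csv

# Cost-per-use sketch: annual subscription cost divided by COUNTER-style
# full-text download counts. "journal_use.csv" and its column names
# ("Journal", "AnnualCost", "FullTextDownloads") are assumptions.
with open("journal_use.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for row in rows:
    uses = int(row["FullTextDownloads"])
    # Zero-use titles get infinite cost per use, floating them to the
    # top of the review list rather than crashing on division by zero.
    row["CostPerUse"] = float(row["AnnualCost"]) / uses if uses else float("inf")

# Rank titles from worst to best value to flag cancellation candidates.
for row in sorted(rows, key=lambda r: r["CostPerUse"], reverse=True):
    print(f"{row['Journal']}: ${row['CostPerUse']:.2f} per use")
```

The same per-title figures can feed the data-visualization idea above, e.g. a scatter of cost against use.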

4. Peggy Burge – University of Puget Sound

Longitudinal Assessment of Students' Information Literacy Competencies. Research shows that the acquisition of information literacy competencies is an iterative, lengthy process. How can we use long-range assessment both to measure true learning and to inform our teaching?

Notes:

  • Standardized test for first-years and seniors
  • Partner with other campus units, like the writing center, on cohort evaluations
    • Librarians participate in evaluation of student papers
  • Quick way to see results
    • Data helps guide redesign of programs and courses
  • Tracking individual students helps identify strengths and weaknesses by major/background

5–6. Laura Zeigen – Oregon Health and Science University; Barbara Valentine and Brian McQuirk – Linfield College and Portland Community College (combined session):

Primo in the Alliance / Primo: Usability and User Experience. How shall we collectively assess our users' experiences with the new discovery layer? We can learn from each other about what works well (or not), but what kind of system, if any, will we collectively set up to track the experiences of users in the new interface?

Notes:

  • Prepare for pain in a productive way
  • Set the bar low
  • Silent usability
  • See something, say something

7. Diane Prorak – University of Idaho

Campus collaboration for library instruction assessment

Notes:

  • Program faculty
    • Saves time
    • They got ‘free assessment’
  • Student affairs
  • Tutoring
  • Women's center
  • VA
    • Share data
    • Makes the data more meaningful
  • Writing center
    • Skills
    • Common assessment
  • Career services
    • Lifelong learning
    • PIL
  • Center for teaching and learning
  • Institutional Research
  • Academic Affairs
  • Students
  • General Ed
  • FYE
  • Alumni
  • Elevator speeches

8. Sara Seeley – Portland Community College

Threshold concepts in learning and assessment

Notes:

  • Metacognition and self-reflection are vital to assessing threshold concepts
  • Threshold concepts are in their infancy; they make sense to us, but it isn't clear how to assess them
  • Threshold concepts bring up discomfort and lack of certainty
  • Disciplinary faculty have to be included, because we can't just undermine certainty and then split; we have to help students find a path in a nuanced world

9. Michael Paulus – Seattle Pacific University

Documenting and demonstrating library contributions to student success

Notes:

  • Student success is wide-ranging, extending beyond graduation
  • Partner with others on campus
  • How to measure engagement outside the library
  • ACRL Assessment in Action
  • Different libraries make different cases
  • Difficult to tie to data such as GPAs; many external factors
  • Dedoose: inexpensive software for analyzing qualitative data
  • University of Minnesota report on 13 things to tie to student success

10. Lisa Molinelli – Portland State University

Silencing the Bells and Whistles: what are the ways we can carry out assessment activities that are easy, low-tech, and inexpensive, but also effective and useful?

Notes:

  • High cost = not necessarily money for tech or supplies, but more often time and people
  • Low-cost assessment can be achieved with tech solutions
  • Get students to contribute tech, such as 90-minute videos on what they are interested in
  • Time is worth something

11. Melody Steiner – Seattle University

Patron Satisfaction Assessment. How libraries can respectfully and positively assess patron satisfaction, and how we can make use of this information.

Notes:

  • Are they finding what they want?
    • Is what they want what they need?
    • Are they getting it when they need it?
  • Long-term satisfaction versus short-term satisfaction
    • How to address both using assessment
  • Exploring creative ways to communicate changes to services or policies based on feedback
  • Micro-surveys and focus groups

12. Kristen Shyler – Seattle University

Assessing library spaces: unobtrusive observation and data collection by student workers. Plus: can we brainstorm ways to assess campus satisfaction with library spaces?

Notes:

  • Sitting in student focus groups to hear their desires for space
  • Working with students in relevant disciplines to collect and analyze data on spaces
  • Whiteboard questions, asking for feedback
  • In-person surveys during events, when you know they are coming in to linger
  • Questions:
    • How to gather meaningful, ongoing metrics in a sustainable way?
    • How do we use these metrics to communicate our space's importance to others on campus?
    • How do we better redesign our spaces based on what we learned?
    • How do we show that it was a success?
    • How do we show the impact?

13. Lynn Deeken – Seattle University

Assessment as Conversation: How do we assess library effectiveness (beyond our instruction program) in ways that are more interesting and meaningful than endless surveys? How can we create assessment opportunities that make our users feel like they are in conversation with us?

Notes:

  • How do you ask? Other than surveys
    • Ask student workers
    • Suggestion box
    • At service desk
    • Transfer students
  • Has anyone done online focus groups?
    • Synchronous
    • Google hangouts
    • Virtual formats
  • How to make surveys more fun and engaging
    • Paper
    • Dancing robots
    • Noise
    • Clickers

14. Barbara Oldham and Lori Hilterbrand – Wenatchee Valley College and Oregon State University

Focusing on non-users: Assessing and engaging non-library users in stakeholder discussions.

Notes:

  • How do you identify non-users?
    • EZproxy login data to determine who has logged in (see the sketch after this list)
    • Mapping by program affiliation, circulation stats, and library instruction
  • What are different avenues for connecting with students who are not users?
    • CMS
    • Friends
  • Do libraries have the capacity to serve everyone? Should they?
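
A minimal sketch of the EZproxy idea above: diff a campus roster against the usernames seen in proxy logins to surface non-users by program. The file names, roster columns, and the position of the username field are assumptions; real EZproxy log layouts vary with each site's LogFormat directive.

```python
import csv
from collections import Counter

def logged_in_users(log_path):
    """Collect usernames from an EZproxy log, assuming a combined-style
    format where the third whitespace-separated field holds the
    authenticated username ("-" when anonymous)."""
    users = set()
    with open(log_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) > 2 and parts[2] != "-":
                users.add(parts[2].lower())
    return users

users = logged_in_users("ezproxy.log")

# Assumed roster columns: "username" and "program".
with open("roster.csv", newline="") as f:
    non_users = [r for r in csv.DictReader(f)
                 if r["username"].lower() not in users]

# Tally non-users by program to target outreach where usage is lowest.
for program, count in Counter(r["program"] for r in non_users).most_common():
    print(program, count)
```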

15. Rick Stoddart – Oregon State University

Let's figure out an Alliance-wide assessment project for student learning! A facilitated discussion on prioritizing an assessment tool or resource to measure student learning across the Alliance. This might be an information literacy rubric, a critical-thinking instruction assignment, or a coordinated sample of student work that can be used for accreditation, benchmarking, or campus reporting. Let's discuss the potential of leveraging the Alliance for student learning assessment.

Notes:

  • Develop a rubric for threshold concepts
  • Establishing guidelines for working with other technologists on campus/digital library
  • Grading bibliographies
  • Is there a way to track how different disciplines or user groups are using different library resources?
  • How do we broaden the definition of info literacy?
  • Standards for evaluating library instruction as related to ACRL
  • Buy consortial access to the NSSE Information Literacy module
  • For the Alliance to officially adopt the AAC&U learning outcomes



16. Eli Gandour-Rood – University of Puget Sound

Assessing reference models: how are we delivering our reference services, and how are our users benefiting (or not!) from our service model? What are the outcomes that we want our users to have, and how can we measure them?

Notes:

  • Reference Models
    • What is success?
      • Did they get what they want?
      • Did that help?
      • Next time, come to me first
      • How many repeat appointments?
      • Length of appointments
      • Does the user leave happy?
  • Concerns
    • Staff time
    • Efficiencies
    • Number of bodies
      • Can we rely on students?
      • Multiple modes: are we at the point of need?
      • Timing
        • Gaps in coverage
    • Types of Models
      • Multiple desks
      • Traditional desk
      • By appointment
      • Triage desk
      • On call
      • Trained students
        • Better pay
        • Grad students
        • Peer model
  • Assessment methods
    • Ask for feedback
    • Faculty feedback
    • READ Scale
    • Students enter questions for review
    • Google Docs
    • Gimlet
    • Wufoo
  • What even is reference?
    • Ready reference or known-item requests
    • Research consultations
    • Chat reference
  • Who can do it?
    • Librarians
    • Paraprofessional staff
    • Students
    • Special students

17. Tina Hovekamp – Central Oregon Community College

Methods of assessment as they relate to an institution's (and library's) strategic planning.

Notes:

  • Accreditation reasons
  • Budget accountability: budgeting is a foundation and informs the strategic plan; assessment feeds budget proposals and planning in the process
  • How to tie board themes to measurable outcomes
  • Outputs versus outcome measures
  • What makes a strategic plan assessable? Time and prioritization of activities need to be clarified
  • Needs to be phrased in terms of outcomes and not just activities
  • Take the list of activities and wishes and prioritize them so that you tie the most important activities to measurable outcomes in the strategic plan

18. Candise Branum – Oregon College of Oriental Medicine

Assessing and modifying classroom instruction on-the-fly

Notes:

  • Pre-class survey/assignment/tutorial
  • Picklist/choice of topics and ideas: a menu of options
  • Instant polling, via LibGuides or otherwise
  • How to implement and assess?
  • What is challenging about research?
  • Problem-based interventions
  • Let them flounder, then launch the lifeboat
  • Last-minute check-ins: what was helpful, what do you still need to know?
  • Different levels or times for implementation: first year vs. thesis
  • Instructor involvement is key
  • Contextual or assignment-based
  • Flipping the classroom


19. Dawn Lowe-Wincentsen – Oregon Institute of Technology

A Question of Syntax: Asking the right questions in the right way can make all the difference in your assessment. We will look at how different question syntax will net different results, and how to ask effective questions.

Notes:

  • What do you want to know?
  • How do your users think about what you want to know? What is their terminology?
  • What can you do (are able to do) with the data collected?
    • Never ask anything if you are not going to use the information
  • Make questions meaningful and actionable

20. Allie Flanary – Portland Community College

UX for Libraries: Beyond the Screen. In the whirl of web-scale software migrations, it can be easy to lose sight of user experience needs that arise beyond the screen. In this session, participants will share and discuss best practices (and opportunities for improvement) pertaining to the user experience of names, services, and physical spaces within the library.

Notes:

  • Empathy
    • Patron involvement and communication
      • UX designer
      • Embrace beginner's mind
      • Empathize
      • Show, don't tell
      • Get unstuck; be uncomfortable
      • Collaborate
      • Problems are opportunities

21. Evviva Weinraub Lajoie – Oregon State University

Assessing Impact of Library Technology: Folks who work in library technology know that anyone who uses the library, whether as a researcher, a community member, a staff member, or a student, depends on the services provided by their geeks. In this session, we will focus on ways to gauge the impact of library technology beyond web stats and usability testing. We'll discuss how the importance of library technology can be articulated to our administrators, how we can show impact on student learning, and brainstorm ways to elucidate the fundamental importance of library technology.

Notes:

  • How to assess and show the impact of library technology?
    • Edge workbook
      • Setting benchmarks
      • Strategic plan
    • Equipment-checkout surveys
    • Website usability testing
    • Google Analytics (see the sketch after this list)
    • Redesign periodically
    • Take small steps
      • Online survey: needs to ask the right questions to get data that is actionable
      • Digital collections: identify actual people and tell a story about the usage experience
      • Build relationships
      • How to be heard at larger campus levels
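
The Google Analytics bullet above is easy to make concrete. Below is a minimal sketch using the Core Reporting API (v3, the version current in 2014) through the Google API Python client; the service-account key file, view ID, and metric choices are assumptions, and the service account must first be granted read access to the GA view.

```python
import httplib2
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]

# "service-key.json" and the view ID below are placeholders.
credentials = ServiceAccountCredentials.from_json_keyfile_name(
    "service-key.json", SCOPES)
service = build("analytics", "v3",
                http=credentials.authorize(httplib2.Http()))

# Top ten library pages by pageviews over the last 30 days.
result = service.data().ga().get(
    ids="ga:12345678",
    start_date="30daysAgo",
    end_date="today",
    metrics="ga:pageviews,ga:avgTimeOnPage",
    dimensions="ga:pagePath",
    sort="-ga:pageviews",
    max_results=10,
).execute()

for path, views, avg_time in result.get("rows", []):
    print(f"{path}: {views} views, {float(avg_time):.0f}s average time on page")
```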

22. Nancy Slight-Gibney – University of Oregon

Consortial purchase of Counting Opinions. I would be interested in knowing if other places are using or are considering using this product.

Notes:

  • Counting Opinions
  • ACRL metrics
    • Consortial subscription
      • Data Portal
      • Set of key performance indicators
      • Other possibilities
    • User assessment data gathering

23. Sara Q. Thompson – Oregon State University

Online vs. In-Person: how could we use assessment to figure out which aspects of library instruction work best as online chunks vs. face-to-face delivery?

Notes:

  • In-Person
    • One shot
    • Class visit
    • Tours
    • Service desk
    • Workshops
    • Consultations
  • Online
    • Tutorials
    • Prep for class or f2f
    • Chat
      • Ref
      • Instruction
    • Guides
    • DIY
    • Videos
    • Embedded course
  • Crossover
    • Brief class visits combined with online tutorials
    • Library DIY
    • Articulate Storyline
    • Guide on the Side
    • Qualtrics
    • Managing expectations
    • Audience and Purpose
    • Workflow and purpose

24. Torie Scott – Portland College

Walking the razor’s edge between assessment of student learning and evaluation of teaching

Notes:

  • You have to have goals before assessing anything, but which goals?
    • Teaching goals
    • Institutional goals
      • Tension between authentic, meaningful feedback and the institutional need for data to report out
      • At the institutional level, it doesn't matter where/how students achieve outcomes.
      • This doesn't inform or relate to our teaching. Takeaway: teach less, but engage students deeply and ask them to reflect on the process.

Lightning Talks

1. Lori Ricigliano – University of Puget Sound

Assessing Library Spaces. Using the seat-sweep method and a photo survey, library staff conducted a study to observe the ways in which students are using spaces. Activity, floor use, and furniture preference were tracked.

2. Joyce Wong – Langara College Library (Vancouver, B.C.)

Using feedback from students and partners to bridge the physical learning commons with a virtual learning commons.

3. Joe Marquez – Reed College

Service Design: Assessing the Library's Physical Services and Touchpoints. Usability is not just a tool for the electronic library. Service design looks at the physical elements in the library in order to refine service touchpoints from a user perspective.

4. Benjamin Moll and Garrett Trott – Corban University

Assessing information literacy as a university-wide endeavor. At Corban University, we assess information literacy using a variety of means. We will share how we assess it and how we use the data to track progress on a university-wide scale.

5. Sally Mielke – Eastern Oregon University

Identifying information literacy instruction and assessment already occurring in academic programs / collaboration with faculty.

6. Candise Branum – Oregon College of Oriental Medicine

Assessing and modifying classroom instruction on-the-fly.

7. Tony O’Kelley – British Columbia Institute of Technology

Improving Library Interaction for Patrons through User Experience (UX). At BCIT Library we did a walk-through of the library with students, examining the various service points as well as the library home page, and analyzing usability in terms of how easily, efficiently, and satisfactorily students can use our services to achieve their research goals.

8. Ben DeCrease – Wenatchee Valley College

What are the necessary elements of a library home page, and what should they be called for maximal patron comprehension?

9. Dawn Lowe-Wincentsen – Oregon Institute of Technology

A Question of Syntax: Asking the right questions in the right way can make all the difference in your assessment. We will look at how different question syntax will net different results, and how to ask effective questions.

10. Bahram Refaei – Linfield College

Alma Analytics: how to collect the data you need. Create analytical reports to gather fulfillment and collection-management stats.
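
For pulling such reports programmatically rather than through the Analytics UI, here is a minimal sketch against the Alma Analytics REST API documented on the Ex Libris Developer Network; the hostname is the North America gateway, and the report path and API key are placeholders.

```python
import requests
import xml.etree.ElementTree as ET

BASE = "https://api-na.hosted.exlibrisgroup.com/almaws/v1/analytics/reports"
params = {
    "path": "/shared/Institution/Reports/Fulfillment Stats",  # placeholder
    "limit": 100,
    "apikey": "YOUR_API_KEY",  # placeholder
}

resp = requests.get(BASE, params=params)
resp.raise_for_status()

# The report comes back as XML wrapping an OBI rowset; print each row.
ROW_TAG = "{urn:schemas-microsoft-com:xml-analysis:rowset}Row"
for row in ET.fromstring(resp.content).iter(ROW_TAG):
    print([col.text for col in row])
```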

11. Jennifer Ward – University of Washington

Update/info sharing from the Primo Usability Working Group

12. Laura Zeigen – Oregon Health and Science University

Primo in the Alliance. How shall we collectively assess our users' experiences with the new discovery layer? We can learn from each other about what works well (or not), but what kind of system, if any, will we collectively set up to track the experiences of users in the new interface?

13. Kate Rubick and Meredith Farkas – Lewis & Clark College and Portland State University

Assessing and Teaching at the Same Time: Using survey software to design pre-assessment tools that also instruct.

14. Laura Zeigen – Oregon Health and Science University

What Can Library Assessment Learn from Doctor Who? No, seriously! Without knowledge of how different alien cultures operated and the specific needs of each, how could the Doctor have survived, much less saved the universe time and again? Doctor Who can prove instructive for anyone in libraries interested in assessment.