The Measuring Non-cognitive Competence Report Task Force was assembled to inform the ICE community on the topic of non-cognitive assessment and its relevance to professional credentialing. To this end, this question-and-answer document serves as an initial introduction and provides references for those interested in a deeper dive into the underlying literature.
This on-demand webinar provides a focused introduction to computerized adaptive testing (CAT), allowing credentialing professionals to better understand the benefits of the approach and evaluate its applicability to their organization.
This chapter discusses job analysis by presenting the rationale for using a job analysis as the basis for validating content specifications for a certification assessment, explaining the process of a job analysis, conducting a job analysis meeting, validating the job analysis, and developing the test specifications. The last section of the chapter discusses trends in job analysis at the time of publication.
This chapter covers examination development, identifying the professional standards related to examination development and describing examination development as an ongoing endeavor that requires process management and control tools. The chapter is organized around two stages of examination development: design and construction.
This chapter covers performance testing methodologies, recommending practices, with associated samples, for performance tests to meet NCCA Standards. The chapter addresses performance test design, types, and construction, and presents performance test design considerations including psychomotor, affective, and cognitive testing; context-based testing; criterion-referenced testing; and process and results testing.
This chapter discusses how the paper-and-pencil and computer administration of achievement tests used for credentialing purposes can be designed to achieve the greatest consistency and accuracy possible.
This chapter describes "best practice" guidelines available for setting performance standards, reviews the numerous methodologies available to certification bodies to set performance standards for their certification programs, compares the available methodologies to methods being used in practice by NCCA-accredited programs, and provides additional guidance and suggestions about standard setting.
This chapter provides an overview of how technology has been used by certifying agencies and suggests criteria for making better technology choices. Unlike the subjects of other chapters in the book, technology is not specifically addressed in the NCCA Standards.
This publication summarizes current principles related to performance examinations and accepted methods for their development, validation, and use. It is an update to Guidelines for Non-Written Examinations, published by the National Commission for Certifying Agencies (NCCA) in January 1991.
In 2010 ICE conducted a large-scale job analysis, the first in the field, to identify the responsibilities and duties of the credentialing professional. The study consisted of the following major phases, which provide the organization of this report: 1) initial development and validation, 2) pilot study, 3) validation study, and 4) development of specifications.
The Psychometric Concepts and Analyses module from the Fundamental Topics in Certification online course explains basic psychometric principles and examines the types of analyses used to evaluate the quality of a certification examination.
The Assessment Development and Delivery module from the Fundamental Topics in Certification online course reviews the major steps in developing fair and accurate assessments and how to best administer these assessments based on the nature of the program.
Designed for new credentialing staff, new volunteer board members, or staff who are developing a new certification program, this course sets up your staff and volunteers for success by providing them with basic knowledge and good practices for administering a certification program.
This 2011 white paper, authored by Timothy Muckle, PhD, explores a number of facets of one credentialing organization's foray into the development of innovative item formats (the National Board of Certification and Recertification for Nurse Anesthetists, or NBCRNA).
In addition to summarizing the implementation of alternative item types, the paper will discuss overall benefits of the project, with a particular focus on value added to the credential, as well as lessons learned and recommendations for decision-making for other organizations.
Recruiting and managing high-quality, dependable SMEs is a major challenge for organizations. This session will discuss real-life efforts to recruit SMEs, train them on sound item development techniques, and manage their work efforts.
In this webcast, NAMSS staff and volunteers will share their resources and best practices for developing and maintaining a strong exam development process, covering:
- How to identify and recruit top volunteers to participate in exam development activities
- How to run effective exam development meetings
- How strong processes help achieve NCCA accreditation and re-accreditation
- Tips and tricks for staying on track
- Lessons learned from past experiences
This package offers you recordings of 12 concurrent sessions (1 hour each) from the 2013 ICE Exchange that focus on testing concepts and issues for certification programs. Most sessions are suitable for all levels of credentialing professionals in both the public and private sectors.
This webcast will discuss ways to identify discussions or disclosure of exam content on the internet. Heather Mullen & Aimée Hobby Rhodes will review an approach to positively identify the individuals who have posted the exam content and how to determine the manner in which the information was gathered.
Arm yourself with a fundamental working knowledge of the varied aspects of job analysis methodology so that you can better navigate the considerations that are necessary when conducting a job analysis. Increase your awareness of its many functions as well as observe its key contributions to program development. Join Deborah Ford, PhD and Robert Ployhart, PhD as they discuss the nature of work, steps in a job analysis, methodological considerations, data analytic considerations, and applied usage of job analysis.
ICE's Research & Development (R&D) committee noted the limited guidance and resources available on how to set eligibility requirements. To meet this need, the committee conducted interviews with existing certification programs, completed a thorough review of the topic, and produced the Eligibility Requirements Report. The report provides a glimpse of industry perspectives and best practices, alongside a review of the policies in place for determining credential eligibility and the methods for setting and changing these requirements.
The "Credentialing Specialist" Assessment-based Certificate Program is designed for certification staff who want to gain insights into the ecosystem of certification from a program's conception, through its development and delivery, to maintenance and improvement.
Use Advanced Topics in Certification online modules to guide your staff's conversations and decisions on expanding, elevating, or differently marketing your established program. This purchase is for a single user of the Legal Issues module.
Lynn Webb will have you speaking like a psychometric nerd in just one hour’s time. Join her as she divulges secret phrases and meanings. Dr. Webb’s favorite parts of the test development cycle will be highlighted in the web meeting.
The session will commence with a description of the essential elements of a successful score report for passing candidates and separately for failing candidates. Following that, the various standards that pertain to score reporting for certification examinations will be provided along with some guidance and commentary about their relevancy and meaning. The third part will focus on best practices for score reporting. Topics that will be covered include score metrics (e.g., raw, percent, or scaled scores), the reporting of subscores, error metrics, other potentially useful metrics (such as number of items), the use of graphics, and when/how to provide score reports.
In this webcast, Donald Truxillo identifies the information you could collect in your job analysis to help set your cutoff scores, particularly for content valid tests. Topics will include guidelines for setting cutoff scores, as well as methods for setting cutoff scores for different assessment types.
During this webcast, consider the pros and cons of each setting, and identify programmatic considerations (goals, budget, resources, etc.) that should factor in to the decision to develop test items in-person or in the virtual environment.
This overview includes a walk-through of the steps used in a Job Task Analysis, a common method used to gather current information and stay abreast of new developments in the field, and a discussion of how the results inform the development of a test blueprint.
A case study is provided that documents the Chartered Financial Analyst (CFA) exam’s transition to a three-option item format. The CFA program is the gold standard credential among investment professionals globally, with 124,000 CFA charterholders residing in 145 countries. This case study is intended to be of value to other testing programs considering a similar change with respect to understanding the motivation for change, the challenges faced with stakeholders and constituents, and the psychometric impacts of change that align with literature-based expectations.
The 2015 ICE Exchange concurrent session, A Game Plan for Managing Test Security Incidents, provides a case study approach to introduce a security incident response framework and a flow chart to help navigate game-changing test security incidents.
The 2015 ICE Exchange concurrent session, Considering Online Proctoring? Tips, Guidelines, & Lessons Learned, shares insights on the psychometric, policy and standards, product feature, privacy, and security aspects of the decision to use online (remote) proctoring.
Learn how to take a proactive approach to including security in communications before, during, and after the test experience; analyze current communications (e.g., handbooks, candidate agreements) for security language; and present examples. The presentation will take an in-depth look at the steps to achieve these objectives by exam vendors and sponsors.
This session introduces test users to the topic of examination performance and the exam features that are measured to indicate that performance. The presenters discuss which features of an exam program should be monitored, how they can be evaluated, what impact they have on the candidate experience, and what an exam program can do about the results.
This webcast will outline the key considerations for programs that are thinking about implementing remote proctoring. By the end of the session, attendees should be armed with sufficient insight into the benefits, drawbacks, and challenges of remote proctoring to determine if it is worth further consideration for their testing program and, if so, how to move forward.
A digital copy of the ICE Handbook, a step-by-step guide to designing and implementing effective professional certification examinations, written by recognized experts with applied knowledge and practical experience.
This webcast covers the basics of building a strong security plan from the ground up, provides recommendations for prioritizing limited budget resources, and offers advice on maintaining your security plan as your program grows.
This webinar will provide a focused introduction to computerized adaptive testing (CAT), allowing credentialing professionals to better understand the benefits of the approach and evaluate its applicability to their organization.
The concurrent session, Lost In Translation: Facilitating The Conversation Between Psychometricians and Non-Psychometricians, provides a framework for understanding and resolving conflicting program priorities, and illustrates a way to create a common understanding so all stakeholders can work together for the betterment of the program.
This concurrent session, Establishing and Maintaining Firewalls Between Certification and Education, presents the five most frequent problem areas with applications for initial and continuing accreditation under the NCCA Standards for the Accreditation of Certification, and is based on the ICE white paper with the same title.
The Online Proctoring: Dipping Your Toes or Cannon Balling - It's Time to Jump In concurrent session shares a framework for navigating security/privacy risks; an understanding of psychometric considerations, candidate reactions, and NCCA's perspective on this emerging delivery format; and some key decision points before "jumping" into online proctoring.
This Institute for Credentialing Excellence (ICE) white paper provides insight into how one testing organization—the Association of Social Work Boards (ASWB)—created a strong SME culture with low turnover and high engagement at all levels. This paper presents a case study, but is also intended to provide a set of conceptual tools for organizations interested in creating not only a sense of investment among their SMEs but also a sense of group cohesion and overall program continuity.
Explore cognitive level schemes and how the various levels apply to item writing, and discuss ways to guide item writers in developing higher-order questions. By the end of the webcast, you will know two strategies for improving the quality of your items.