ACM Exam Development

The content of the ACM Accredited Case Manager Examination is defined through a national job analysis study. The study involves surveying practitioners in the field to identify routine tasks considered important to competent practice. Practitioners are selected from a wide variety of work environments, settings and geographical areas. The examination is developed and maintained through a combined effort of qualified subject-matter experts and testing professionals who construct the examination in accordance with the ACM Accredited Case Manager Examination content outline.

This practice analysis study, conducted in 2015, identified critical tasks performed by case managers. Study results were used to make recommendations to the advisory committee (AC) that influence examination assembly for the ongoing Accredited Case Manager (ACM) credentialing program of the American Case Management Association (ACMA).

Members of the practice analysis committee (the Committee) supervised the study and made the decisions affecting data gathering and results evaluation. These decisions included selecting the tasks for the detailed content outline and determining the test specifications. Committee members represented different settings across Canada and the United States.

This survey-based study was conducted in phases that included survey development, distribution, and response analysis. The Committee developed task statements and items to collect background information about respondents. After survey response analyses were completed by AMP, the Committee created exclusion rules for classifying tasks as performed or not performed and as significant or not significant. The Committee also specified item distributions by content domain and cognitive level for a test specifications table. Study results were provided to the Certification Committee as a set of recommendations for assembling forms of the examination beginning in 2016.
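
A test specifications table of this kind pairs each content domain with an item count at each cognitive level. As a rough illustration only, the Python sketch below assembles such a table; the domain names, cognitive levels, and item counts are invented placeholders, not the actual ACM examination specifications.

# Hypothetical test specifications table: domain names, cognitive levels,
# and item counts are illustrative assumptions, not the ACM specifications.
specs = {
    "Screening and Assessment":       {"Recall": 6, "Application": 10, "Analysis": 4},
    "Planning and Care Coordination": {"Recall": 8, "Application": 14, "Analysis": 8},
    "Transition and Discharge":       {"Recall": 5, "Application": 9,  "Analysis": 6},
    "Evaluation and Outcomes":        {"Recall": 4, "Application": 8,  "Analysis": 3},
}

levels = ["Recall", "Application", "Analysis"]

# Print one row per content domain, with totals per row and per cognitive level.
print(f"{'Content domain':<34}" + "".join(f"{lv:>13}" for lv in levels) + f"{'Total':>8}")
for domain, counts in specs.items():
    print(f"{domain:<34}" + "".join(f"{counts[lv]:>13}" for lv in levels) + f"{sum(counts.values()):>8}")
totals = [sum(d[lv] for d in specs.values()) for lv in levels]
print(f"{'Total':<34}" + "".join(f"{t:>13}" for t in totals) + f"{sum(totals):>8}")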

An invitation asking recipients to participate in the online survey was distributed by email to 19,963 potential respondents. A volunteer sample of 2,520 recipients provided usable responses in time for the analysis. The approximate response rate among potential respondents was 13.7%.

Among survey respondents, at least 98.8% found that the list of tasks adequately covered the scope of their job activities. The lowest intraclass correlation value among the domains under which tasks were organized was 0.991, indicating that similar ratings would be highly likely in other samples drawn from the population. The lowest coefficient alpha value among the content areas was 0.95, indicating that the tasks within each content domain had received consistent ratings.
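
For readers unfamiliar with the coefficient alpha statistic cited above, the following Python sketch computes it for a small, invented matrix of task ratings; the respondents, tasks, rating scale, and values are assumptions for illustration only and are unrelated to the study data.

# Minimal sketch of coefficient alpha (Cronbach's alpha) for task ratings in one
# content domain. The matrix is invented: rows are respondents, columns are tasks,
# values are significance ratings on an assumed 1-5 scale.
def coefficient_alpha(ratings):
    n_items = len(ratings[0])
    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    item_vars = [variance([row[j] for row in ratings]) for j in range(n_items)]
    total_var = variance([sum(row) for row in ratings])
    return (n_items / (n_items - 1)) * (1 - sum(item_vars) / total_var)

sample_ratings = [
    [4, 5, 4, 5],
    [3, 4, 4, 4],
    [5, 5, 5, 5],
    [4, 4, 3, 4],
    [2, 3, 3, 3],
]
print(f"coefficient alpha = {coefficient_alpha(sample_ratings):.2f}")  # about 0.95 for this toy data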

The Committee assessed the degree to which the study sample represented known subgroups (e.g., by region, by years of experience) within the population of case managers. Committee members detected no disproportionate representation. Even so, the Committee decided to use a task exclusion method that gave sample subgroups the opportunity to exclude tasks in case representation bias was present but undetected.

After examining the task-rating results, the Committee established exclusion rules designed to narrow the full list of 91 tasks to the subset of tasks critical to practice. The rules first identified tasks that were extensively performed; the average significance of the surviving tasks was then assessed so that only significant tasks would be retained. Applying the 5 decision rules excluded 7 tasks and retained 84 tasks across 4 content domains. Committee members assigned a cognitive complexity designation to each critical task by consensus, according to their perceptions of the mental processes by which practitioners perform the task competently. The Committee was confident that candidates’ scores would reflect critical job content associated with the demands of the job when an examination composed of multiple-choice items is developed to the new specifications.
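
The exclusion rules described above amount to screening each task on how widely it is performed and how significant respondents rate it. The Python sketch below shows that kind of screen; the cutoffs, rating scale, and task data are invented assumptions for demonstration, not the Committee’s actual decision rules.

# Illustrative task-exclusion screen. Thresholds (70% performing, mean
# significance of 2.5 on an assumed 1-4 scale) and the task data are hypothetical.
tasks = [
    {"id": 1, "pct_performing": 0.94, "mean_significance": 3.4},
    {"id": 2, "pct_performing": 0.52, "mean_significance": 3.1},  # not extensively performed
    {"id": 3, "pct_performing": 0.88, "mean_significance": 2.1},  # performed but not significant
    {"id": 4, "pct_performing": 0.91, "mean_significance": 3.0},
]

PERFORMED_CUTOFF = 0.70      # minimum proportion of respondents performing the task
SIGNIFICANCE_CUTOFF = 2.5    # minimum mean significance rating

retained = [t for t in tasks
            if t["pct_performing"] >= PERFORMED_CUTOFF
            and t["mean_significance"] >= SIGNIFICANCE_CUTOFF]
print("Retained task ids:", [t["id"] for t in retained])  # -> [1, 4]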

ACM Test Validation

Validation of the ACM core exam is determined using a modified Angoff Method, which is applied during the performance of a Passing Point Study by a panel of experts in the field. This universally accepted psychometric procedure relies on content experts to estimate, for each item on the examination, the probability that a minimally qualified candidate will answer it correctly. These judgments determine the number of correct answers necessary to demonstrate the knowledge and skills required to pass. A candidate’s ability to pass the examination depends on the knowledge and skill displayed, not on the performance of other candidates.

Passing scores may vary slightly for each version of the examination. To ensure fairness to all candidates, a process of statistical equating is used. This involves selecting, for each version of the examination, a mix of individual questions that meets the content distribution requirements of the examination content outline. Because each question has been pre-tested, a difficulty level can be assigned. The process then considers the difficulty level of each question selected for each version of the examination, attempting to match the difficulty levels of the versions as closely as possible. Slight variations in difficulty that remain are addressed by adjusting the passing score, depending on the overall difficulty statistics for the group of scored questions that appear on a particular version of the examination.
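
As a rough sketch of the two ideas above, a judgment-based passing point and a difficulty-based adjustment between forms, the Python example below uses invented judge estimates and difficulty statistics. Operational equating relies on formal statistical models; the simple adjustment shown here is only an assumption made for illustration.

# Modified Angoff sketch: each judge estimates, per item, the probability that a
# minimally qualified candidate answers correctly. Estimates below are invented.
judge_estimates = [
    [0.70, 0.85, 0.60, 0.90, 0.75],
    [0.65, 0.80, 0.55, 0.95, 0.70],
    [0.75, 0.90, 0.65, 0.85, 0.80],
]

# Average the judges' estimates per item, then sum across items for the raw passing point.
n_items = len(judge_estimates[0])
item_means = [sum(j[i] for j in judge_estimates) / len(judge_estimates) for i in range(n_items)]
passing_point = sum(item_means)
print(f"Recommended passing score: {passing_point:.1f} of {n_items} items")

# If a new form is slightly harder (lower average pre-tested difficulty), nudge the
# passing score down so candidates are held to the same standard across forms.
base_form_difficulty = 0.72   # mean proportion correct on the reference form (assumed)
new_form_difficulty = 0.69    # mean proportion correct on the new form (assumed)
adjusted = passing_point + n_items * (new_form_difficulty - base_form_difficulty)
print(f"Adjusted passing score for the new form: {adjusted:.1f}")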

The passing point of the Specialty Simulation Examination is set by an examination committee using a criterion-referenced method similar to the modified Angoff Method. The exact passing point may vary from one form of the examination to another, depending on the scored problems included on the examination form attempted. The examination committee follows strict guidelines in selecting the problems for each examination form to ensure that the versions of the examination are parallel in difficulty.

Announcements

SAVE THE DATE | ACMA 2025 National Conference

Join us for the ACMA 2025 National Conference, April 3-6, 2025 in Denver, CO. Don't miss out on the premier case management and transitions of care conference of the year!
Experience ACMA at National!

Save The Date | 2024 Leadership and Physician Advisor Conference

Our 2024 Leadership and Physician Advisor Conference is November 18-20 in Huntington Beach, CA! Health plans and providers, this conference has everything you need.
Find out more here!

Share Your Research | Collaborative Case Management

Do you have a project or measurable initiative you've instituted at your organization? Have you conducted research on a current issue in the field? Share your experiences and results with your professional community! Email your proposal or questions: vmatthews@acmaweb.org.
Learn More!

Get ACMA News in the Palm of Your Hand

Join the conversation with ACMA! Text the keyword ACMA to 844-554-2497 to stay up to date on all the latest news and announcements, delivered straight to your phone!

Now Accepting Presentations for Chapter Conferences

We are now accepting presentations for upcoming ACMA chapter conferences. If you have a unique solution, intervention or strategy to improve case management, this is a great opportunity to share your knowledge and be a part of ACMA's national-caliber education at the local level. There is no deadline to submit; presentations will be accepted throughout the year so you can prepare a submission as your schedule allows.
Submit a presentation!

SAVE THESE DATES | National Case Management Week

2024 - October 13-19
2025 - October 12-18
2026 - October 11-17
2027 - October 10-16
2028 - October 8-14

American Case Management Association
17200 Chenal Parkway Ste 300 #345
Little Rock, AR 72223
Phone: 501-907-ACMA (2262)
Fax: 501-227-4247