Webinar Recordings

60 minutes

Globalization and innovative web technologies have removed geographic boundaries and opened new opportunities for awarding bodies, technology certification programs and credentialing organizations. However, with these opportunities come new challenges: How do you localize items and assessments so they will reliably measure knowledge, skills and abilities across multiple languages and cultures?

Presented by Steve Dept, CEO of cApStAn, a leading provider of linguistic quality assurance services, and John Kleeman, Executive Director and Founder of Questionmark, this webinar will help you understand good practice so that your tests measure the same thing in other languages and cultures. We will explore both strategic and procedural issues that must be considered in undertaking an assessment localization project and offer tips and strategies for anticipating and overcoming common pitfalls. Finally, the session will provide a brief overview of some of the technologies and services available to manage the translation of test items and assessments.

Download slides

Go to video page  »

James R. Parry, Test Development Manager, US Coast Guard Training Center Yorktown, VA, explains how to use the Angoff method for setting cut scores, plus a simple spreadsheet and metatags, to ensure all participants receive an assessment of equal difficulty.

View webinar slides as PDF

Go to video page  »
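The Angoff method described above can be sketched in a few lines of code. This is a minimal illustration with hypothetical data, not the presenter's spreadsheet: each judge estimates the probability that a minimally competent candidate answers each item correctly; averaging the ratings per item and summing the averages yields the recommended raw cut score.

```python
# Minimal sketch of the (modified) Angoff standard-setting method.
# Ratings are each judge's estimated probability that a minimally
# competent candidate answers the item correctly (hypothetical data).

ratings = {
    "item_1": [0.60, 0.70, 0.65],  # one rating per judge
    "item_2": [0.80, 0.75, 0.85],
    "item_3": [0.50, 0.55, 0.45],
}

# Average the judges' ratings for each item, then sum the item
# averages to get the recommended raw cut score for the test form.
item_means = {item: sum(r) / len(r) for item, r in ratings.items()}
cut_score = sum(item_means.values())

print(f"Recommended cut score: {cut_score:.2f} out of {len(ratings)} points")
```

With three items, a cut score near the sum of the per-item averages means a candidate must answer roughly that many points' worth of items correctly to pass.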

Video of a webinar presented by a Questionmark customer describing how to determine a cut score using the Angoff method.

Go to video page  »


3 minutes

In this video we’ll show you just how easy it is to access Questionmark OnDemand from SuccessFactors with SAML-based single-sign-on (SSO) enabled.

Go to video page  »
9 minutes

This video explains how to integrate Questionmark and Moodle.

Go to video page  »
8 minutes

This video explains how to integrate Questionmark and Cornerstone OnDemand.

Go to video page  »

This video explains how to integrate Questionmark and SuccessFactors Learning.

Learn more about Publish to LMS and how it enables you to launch and track Questionmark assessments from within a learning management system.

Go to video page  »

This video explains how to integrate Questionmark and Canvas.

Go to video page  »

Results API


Getting Started with Results API - Creating dashboard views with Microsoft Power BI

Go to video page  »


This video provides a high-level explanation of the OData protocol and shows how Questionmark's Results API for Analytics (which uses OData) can be used to easily access assessment results data. It also explains how common business intelligence tools such as Excel, Tableau, and SAP BusinessObjects can use the Results API as a data source for creating custom reports and dashboards. Learn more about analysis and reporting of assessment results in Questionmark.

Go to video page  »
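Because the Results API speaks standard OData, any client that can build `$`-prefixed query options and parse a JSON `value` array can consume it. The sketch below uses only Python's standard library; the service URL and entity set name are placeholders, not Questionmark's documented endpoint, and the response shown is sample data.

```python
# Hedged sketch of querying an OData feed such as a results API.
# BASE_URL and the entity set name are hypothetical placeholders.
import json
from urllib.parse import quote, urlencode

BASE_URL = "https://example.com/odata/AssessmentResults"  # placeholder

# OData exposes filtering, sorting, and paging as $-prefixed options.
options = {"$filter": "Score ge 80", "$orderby": "Score desc", "$top": "5"}
url = BASE_URL + "?" + urlencode(options, quote_via=quote)
print(url)

# A typical OData JSON response wraps the rows in a "value" array.
# In real use this would come from an authenticated HTTP GET.
sample = json.loads('{"value": [{"Participant": "jsmith", "Score": 92}]}')
for row in sample["value"]:
    print(row["Participant"], row["Score"])
```

BI tools such as Excel, Power BI, and Tableau build these same query URLs behind the scenes when you point them at an OData feed as a data source.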


This video shows how to use SAP BusinessObjects to access your organization's Questionmark assessment results via OData feeds from Questionmark's Results API.

Go to video page  »


In this video, you'll learn how to use Questionmark's Results API for Analytics as an OData data source for the SAP BusinessObjects Business Intelligence Suite. Once BusinessObjects is connected to your assessment results data through the Results API, you can easily create custom reports and dashboards.

Go to video page  »

Users Conference Webcast Recordings

90 minutes

Questionmark CEO Eric Shepherd moderates this session, calling on members of our product management teams to present and demonstrate the latest releases and new features in assessment authoring, delivery, and reporting.

Go to video page  »

68 minutes

You got your test scores, but what do they mean? Assessments include an evaluative component for interpreting scores in relation to the domain or measured construct, and test users require documentation on how to make those interpretations. Defining performance level descriptors and setting meaningful cut scores help meet these requirements. In this session, we will discuss interpretation differences between norm-referenced and criterion-referenced assessments. We will review the role of performance levels in score interpretation, and we will introduce several common methods for setting performance standards that are used to link scores to interpretations. We will illustrate concepts with examples and activities. 

Presenter: Austin Fossey, Reporting & Analytics Manager, Questionmark

Go to video page  »


We have all been exposed to some sort of assessment in our lives. Think back to your school days: you would sit in class and then take a test on what you were supposed to know. Most of the time the test was just a regurgitation of rote memory. If you paid attention and read the assignments, you probably passed. But how long did you remember the material after the assessment? You probably forgot most of it by the end of the day so you could cram more into your head for the next test. Did your score really demonstrate what you could actually do? All it measured was how well you could memorize.

Writing test items at higher cognitive levels is a challenge that most test writers are apprehensive about tackling. This session explains how to write meaningful test items at all levels of Bloom's Taxonomy: Knowledge, Comprehension, Application, Analysis, Synthesis, and Evaluation. Additionally, tips on writing items at various levels of complexity will be discussed.


  • James R. Parry, Test Development Manager, U.S. Coast Guard Training Center, Yorktown, VA, 
  • LT Carlos E. Schwarzbauer, IT Lead, US Coast Guard Force Readiness Command (FC-Tadl)

Go to video page  »