Video of a webinar presented by a Questionmark customer describing how to determine a cut score using the Angoff method.
In this video we’ll show you just how easy it is to access Questionmark OnDemand from SuccessFactors with SAML-based single sign-on (SSO) enabled.
This video explains how to integrate Questionmark and Moodle.
This video explains how to integrate Questionmark and Cornerstone OnDemand.
This video explains how to integrate Questionmark and SuccessFactors Learning.
Learn more about Publish to LMS and how it enables you to launch and track Questionmark assessments from within a learning management system.
This video explains how to integrate Questionmark and Canvas.
Getting Started with Results API - Creating dashboard views with Microsoft Power BI
This video provides a high-level explanation of the OData protocol and shows how Questionmark's Results API for Analytics (which uses OData) can be used to easily access assessment results data. It also explains how common business intelligence tools such as Excel, Tableau, and SAP BusinessObjects can use the Results API as a data source for creating custom reports and dashboards. Learn more about analysis and reporting of assessment results in Questionmark.
How to use SAP BusinessObjects to access your organization's Questionmark assessment results via OData feeds from Questionmark's Results API.
In this video, you'll learn how to use Questionmark's Results API for Analytics as an OData data source for the SAP BusinessObjects Business Intelligence Suite. Once BusinessObjects is connected to your assessment results data through the Results API, you can easily create custom reports and dashboards.
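Because OData feeds are plain HTTPS URLs with standard query options, any client can narrow the results data before a BI tool ever sees it. The sketch below shows how such a query URL is built in Python; the base URL and the `Results` entity-set name are hypothetical placeholders for illustration, not Questionmark's documented endpoints.

```python
# Sketch: building an OData query URL with standard system query options
# ($top, $filter, $orderby). The base URL and entity-set name are
# hypothetical placeholders -- check your Results API documentation
# for the real feed address and entity names.
from urllib.parse import urlencode

def build_odata_query(base_url: str, entity_set: str, **options) -> str:
    """Return an OData URL; keyword names map to $-prefixed query options."""
    params = {f"${key}": value for key, value in options.items()}
    query = urlencode(params)
    return f"{base_url.rstrip('/')}/{entity_set}" + (f"?{query}" if query else "")

# Example: the 10 most recent results for one (hypothetical) assessment id.
url = build_odata_query(
    "https://example.questionmark.example/odata",  # placeholder base URL
    "Results",
    top=10,
    filter="AssessmentId eq 12345",
    orderby="EndDateTime desc",
)
print(url)
```

A BI tool pointed at such a URL would then handle paging and refresh itself; the value of doing the filtering in the query string is that only the rows you need cross the wire.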
Users Conference Webcast Recordings
Questionmark CEO Eric Shepherd will moderate this session and call on members of our product management teams to present and demonstrate the latest releases and new features in assessment authoring, delivery, and reporting.
You got your test scores, but what do they mean? Assessments include an evaluative component for interpreting scores in relation to the domain or measured construct, and test users require documentation on how to make those interpretations. Defining performance level descriptors and setting meaningful cut scores help meet these requirements. In this session, we will discuss interpretation differences between norm-referenced and criterion-referenced assessments. We will review the role of performance levels in score interpretation, and we will introduce several common methods for setting performance standards that are used to link scores to interpretations. We will illustrate concepts with examples and activities.
Presenters: Austin Fossey, Reporting & Analytics Manager, Questionmark
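One common standard-setting method the session alludes to (also covered in the Angoff webinar above) is the modified Angoff procedure: each judge estimates the probability that a minimally qualified candidate answers each item correctly, and the cut score is the sum of the per-item mean ratings. The judge ratings below are invented for illustration.

```python
# Modified Angoff cut score, a common standard-setting method:
# each judge rates the probability that a minimally qualified candidate
# answers each item correctly; the cut score is the sum of per-item means.
# These ratings are hypothetical example data.
judge_ratings = [
    [0.60, 0.75, 0.90, 0.55],  # judge 1, items 1-4
    [0.70, 0.80, 0.85, 0.50],  # judge 2
    [0.65, 0.70, 0.95, 0.60],  # judge 3
]

def angoff_cut_score(ratings):
    """Sum of per-item mean ratings: the expected raw score of a borderline candidate."""
    n_judges = len(ratings)
    n_items = len(ratings[0])
    item_means = [sum(judge[i] for judge in ratings) / n_judges
                  for i in range(n_items)]
    return sum(item_means)

cut = angoff_cut_score(judge_ratings)
print(round(cut, 2))  # 2.85 of 4 items, i.e. about 71% correct
```

In practice the ratings would come from a trained panel and be iterated with impact data, but the arithmetic that links judgments to a raw cut score is just this sum of means.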
We have all been exposed to some sort of assessment in our lives. Think back to your school days: you would sit in class and then get a test on what you were supposed to know. More often than not, the test simply asked you to regurgitate memorized facts. If you paid attention and read the assignments, you probably passed. But how long did you remember the material after the assessment? You probably forgot most of it by the end of the day so you could cram more into your head for the next test. Did your score really demonstrate what you could actually do? All it measured was how well you could memorize.
Writing test items at higher cognitive levels is a challenge that most test writers are apprehensive about tackling. This session will explain how to write meaningful test items at all levels of Bloom’s Taxonomy – Knowledge, Comprehension, Application, Analysis, Synthesis and Evaluation. Additionally, tips on writing items at various levels of complexity will be discussed.
- James R. Parry, Test Development Manager, U.S. Coast Guard Training Center, Yorktown, VA
- LT Carlos E. Schwarzbauer, IT Lead, U.S. Coast Guard Force Readiness Command (FC-Tadl)