Videos

Integration

08:09

This video explains how to integrate Questionmark and SuccessFactors Learning.

Learn more about Publish to LMS and how it enables you to launch and track Questionmark assessments from within a learning management system.

Go to video page  »

11:39

This video explains how to integrate Questionmark and Canvas.

Go to video page  »

Results API

09:40

Getting Started with Results API - Creating dashboard views with Microsoft Power BI

Go to video page  »

06:49

This video provides a high-level explanation of the OData protocol and shows how Questionmark's Results API for Analytics (which uses OData) can be used to easily access assessment results data. It also explains how common business intelligence tools such as Excel, Tableau, and SAP BusinessObjects can use the Results API as a data source for creating custom reports and dashboards. Learn more about analysis and reporting of assessment results in Questionmark.

Go to video page  »
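To make the protocol concrete, here is a minimal sketch of pulling assessment results from an OData feed in Python. The base URL, the "Results" entity set, and the credentials below are placeholders for illustration, not Questionmark's documented endpoints; consult the Results API documentation for your organization's actual values.

```python
import requests

# Minimal sketch of querying an OData feed with the requests library.
# BASE_URL, the "Results" entity set, and the credentials are placeholders,
# not Questionmark's documented values.
BASE_URL = "https://example.questionmark.com/resultsapi/odata"

resp = requests.get(
    f"{BASE_URL}/Results",                     # hypothetical entity set
    params={"$top": "10", "$format": "json"},  # standard OData query options
    auth=("analytics_user", "secret"),         # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
payload = resp.json()

# The JSON envelope varies by OData version: v2 wraps rows in
# payload["d"]["results"], v4 in payload["value"].
rows = payload.get("value") or payload.get("d", {}).get("results", [])
for row in rows:
    print(row)
```

The same feed URL is what BI tools such as Excel, Tableau, and SAP BusinessObjects point at when you register the Results API as an OData data source.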

07:22

How to use SAP BusinessObjects to access your organization's Questionmark assessment results using OData feeds via Questionmark's Results API.

Go to video page  »

07:22

In this video, you'll learn how to use Questionmark's Results API for Analytics as an OData data source for the SAP BusinessObjects Business Intelligence Suite. Once BusinessObjects is connected to your assessment results data through the Results API, you can easily create custom reports and dashboards.

Go to video page  »

Users Conference Webcast Recordings

90 minutes

Questionmark CEO Eric Shepherd moderates this session, calling on members of our product management team to present and demonstrate the latest releases and new features in assessment authoring, delivery, and reporting.

Go to video page  »

68 minutes

You got your test scores, but what do they mean? Assessments include an evaluative component for interpreting scores in relation to the domain or measured construct, and test users require documentation on how to make those interpretations. Defining performance level descriptors and setting meaningful cut scores help meet these requirements. In this session, we will discuss interpretation differences between norm-referenced and criterion-referenced assessments. We will review the role of performance levels in score interpretation, and we will introduce several common methods for setting performance standards that are used to link scores to interpretations. We will illustrate concepts with examples and activities. 

Presenter: Austin Fossey, Reporting & Analytics Manager, Questionmark

Go to video page  »

45:00

We have all been exposed to some sort of assessment in our lives. Think back to your school days: you would sit in class and then get a test on what you were supposed to know. Most times, the test was just a regurgitation of rote memorization. If you paid attention and read the assignments, you probably would pass. How long did you remember the material after the assessment? You probably forgot most of it by the end of the day so you could cram more into your head for the next test. Did your score really demonstrate what you could actually do? All it did was test your knowledge and how well you could memorize.

Writing test items at higher cognitive levels is a challenge that most test writers are apprehensive about tackling. This session will explain how to write meaningful test items at all levels of Bloom’s Taxonomy: Knowledge, Comprehension, Application, Analysis, Synthesis, and Evaluation. Additionally, tips on writing items at various levels of complexity will be discussed.

Presenters:

  • James R. Parry, Test Development Manager, U.S. Coast Guard Training Center, Yorktown, VA
  • LT Carlos E. Schwarzbauer, IT Lead, U.S. Coast Guard Force Readiness Command (FC-Tadl)

Go to video page  »