More Control

For most categories of stimulus events, Psykinematix provides additional control over their onset, offset, and rendering stages, as well as over the trigger signals they emit to, or wait for from, external devices for synchronization purposes:

Onset
Offset
Rendering
Trigger

Such additional controls are available when a small switch appears in the upper right-hand corner of the properties panel as depicted below:

Clicking on the switch opens a properties palette that specifies which sets of controls are being used for the current event.

Onset

The "Onset" control specifies which action to take at the onset of the event.

It is possible to clear the background with a specified color and transparency value (alpha channel). It is also possible to keep the visual stimulus as a fixation during inter-trial intervals (particularly useful in dichoptic presentation). Note that both options are mutually exclusive. One can also specify whether the standard fixation mark should be displayed for the stimulus.
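
For readers familiar with OpenGL, clearing the background with a given color and transparency corresponds roughly to the calls sketched below. This is only an illustration of the underlying operation, not Psykinematix code, and the RGBA values are arbitrary examples.

#include <OpenGL/gl.h>   /* macOS framework header; <GL/gl.h> elsewhere */

/* Illustrative sketch: clear the drawing surface to a mid-gray
 * background with 50% opacity (alpha channel). */
void clear_background(void)
{
    glClearColor(0.5f, 0.5f, 0.5f, 0.5f);   /* R, G, B, A in [0, 1] */
    glClear(GL_COLOR_BUFFER_BIT);           /* fill the color buffer */
}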

Offset

The "Offset" control indicates which action triggers the offset of the event.


The offset can be contingent upon the event's duration (the default behavior), upon its inputs (e.g., a subject response), or upon both. When the offset depends on an input, the device (e.g., "Mouse" or "Keyboard") and its event (a "Button" click or "Key" stroke) must be specified. The offset can also be set to trigger only once the device event has been released. Here is a list of supported devices and input events (if supported by the device):

For more details about specifying the inputs, see the 'inputs' section of the Procedure chapter.
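
The contingency logic described above can be summarized by the sketch below. It is a hypothetical illustration only, not Psykinematix code: the elapsed_time(), input_event_occurred(), and input_event_released() helpers are placeholders, and it assumes that when both contingencies are enabled, whichever occurs first terminates the event.

#include <stdbool.h>

/* Placeholder helpers standing in for the real timing and input queries. */
extern double elapsed_time(void);          /* seconds since event onset     */
extern bool   input_event_occurred(void);  /* e.g. button click, key stroke */
extern bool   input_event_released(void);  /* the device event was released */

/* Returns true when the event should end, given which contingencies
 * are enabled in the "Offset" panel. */
bool offset_reached(double duration, bool on_duration,
                    bool on_input, bool require_release)
{
    if (on_duration && elapsed_time() >= duration)
        return true;                                   /* default behavior */
    if (on_input && input_event_occurred())
        return !require_release || input_event_released();
    return false;
}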

Note: You may refer to the tutorial Creating Retinotopic Mapping Stimuli as it uses the Offset options to trigger the trial sequence.

Rendering

The "Render" control specifies which action to take during the presentation of the event, in particular the OpenGL functions to be applied to the stimuli (see the official OpenGL documentation to learn more about them) in terms of:

Note that the Alpha (which defines opacity), Scaling Factor, and Orientation parameters can be specified using time-varying expressions to generate hardware-accelerated animations. For example, a temporal contrast modulation can be created on the fly by applying alpha blending (see below) with a time-varying Alpha, which saves the large amount of video memory that would be required if the effect were pre-computed.
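
As an illustration of the alpha-blending example above, the sketch below draws a textured quad each frame with a sinusoidally modulated alpha. It assumes a legacy fixed-function OpenGL context and is not Psykinematix code; the 1 Hz modulation rate and the texture handle are arbitrary placeholders.

#include <math.h>
#include <OpenGL/gl.h>   /* macOS framework header; <GL/gl.h> elsewhere */

/* Draw one frame at time t (in seconds), modulating opacity at 1 Hz.
 * 'texture' stands for an already-created texture object. */
void draw_frame(double t, GLuint texture)
{
    /* Transparency blending: source weighted by alpha,
     * destination by (1 - alpha). */
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

    /* Time-varying alpha in [0, 1]: the temporal contrast modulation
     * is computed on the fly instead of being pre-rendered. */
    double alpha = 0.5 * (1.0 + sin(2.0 * M_PI * 1.0 * t));
    glColor4d(1.0, 1.0, 1.0, alpha);

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, texture);
    glBegin(GL_QUADS);
        glTexCoord2f(0.f, 0.f); glVertex2f(-1.f, -1.f);
        glTexCoord2f(1.f, 0.f); glVertex2f( 1.f, -1.f);
        glTexCoord2f(1.f, 1.f); glVertex2f( 1.f,  1.f);
        glTexCoord2f(0.f, 1.f); glVertex2f(-1.f,  1.f);
    glEnd();
}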

For texture-based stimuli (the Grating, Checkerboard, and Custom kinds), a Texture Mode specifies how the texture is applied to the destination surface or current background. Four texture functions are available (see the 'Texture Mapping' chapter in the OpenGL documentation):
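
In OpenGL terms, the texture mode maps onto the texture environment function. The minimal sketch below uses GL_MODULATE purely as an example; which OpenGL functions correspond to each of Psykinematix's four modes is not restated here.

#include <OpenGL/gl.h>   /* macOS framework header; <GL/gl.h> elsewhere */

/* Select how the texture combines with the underlying surface.
 * GL_MODULATE multiplies the texture and fragment colors; other classic
 * texture functions include GL_REPLACE, GL_DECAL, and GL_BLEND (see the
 * 'Texture Mapping' chapter of the OpenGL documentation). */
void set_texture_mode(void)
{
    glTexEnvi(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_MODULATE);
}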

For all kinds of stimuli, a Blending Mode is available to indicate how the stimulus is blended with the background by specifying the source and destination factors (see the 'Blending' chapter in the OpenGL documentation). Several pre-defined blending modes are available to perform common operations on visual stimuli:

A custom mode is also available to specify any other blending function supported by OpenGL; however, it should only be used by OpenGL experts or by those who have thoroughly read the OpenGL documentation.
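
For reference, the sketch below shows how blending modes translate into OpenGL source/destination factor pairs. The two pairs shown (standard transparency and additive blending) are common examples, not a listing of Psykinematix's pre-defined presets.

#include <OpenGL/gl.h>   /* macOS framework header; <GL/gl.h> elsewhere */

/* Transparency blending: the stimulus is weighted by its alpha and
 * the background by (1 - alpha). */
void set_transparency_blending(void)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
}

/* Additive blending: stimulus and background values are summed,
 * e.g. to superimpose two stimuli. */
void set_additive_blending(void)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE);
}

/* A custom mode amounts to passing any other pair of factors accepted
 * by glBlendFunc (see the 'Blending' chapter of the OpenGL documentation). */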

Note: You may refer to the tutorial Visual Acuity: Lesson 1 as it uses the Rendering options to properly display the text optotypes, and the tutorial Orientation Discrimination: Lesson 4 as it uses the Rendering options to properly overlap center and surround stimuli using the transparency blending mode.

Trigger

The "Trigger" control specifies whether the event emits an output trigger at its onset. The external device is selected using the pop-up menu attached to the "Trigger" checkbox. The selectable devices are only those that have been enabled in the I/O Devices preferences.

Supported external devices:

Events that send information to an external device (such as NetStation) by specifying variables with values can use the [SELECTION] system-defined variable to pass the value of the currently selected stimulus when it is chosen from a list, as is the case for "Multimedia" stimuli chosen through the selection mode. Note that the value is converted to the indicated type (e.g., a stimulus name should be of 'TEXT' or 'string' type).

Note that for Dynamic Composing (except in the fused mode), a trigger may be emitted either at the onset of the dynamic event or at the onset of the composing events if these are set to do so, allowing, for example, a trigger to be sent at every cycle in the 'Temporal Frequency' mode or at each polarity change in the 'Contrast Reversal' mode.
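
As a rough illustration of what emitting an output trigger involves, the sketch below writes a one-byte code to a serial port at stimulus onset. It is a generic POSIX example with a hypothetical device path, not the mechanism Psykinematix uses for any particular device.

#include <fcntl.h>
#include <unistd.h>

/* Hypothetical example: emit a one-byte trigger code over a serial port.
 * The device path is a placeholder. */
int send_trigger(unsigned char code)
{
    int fd = open("/dev/tty.usbserial", O_WRONLY | O_NOCTTY);
    if (fd < 0)
        return -1;                      /* device not available */
    ssize_t n = write(fd, &code, 1);    /* send the trigger byte */
    close(fd);
    return (n == 1) ? 0 : -1;
}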

Note: You may refer to the tutorial Creating Retinotopic Mapping Stimuli as it uses the Trigger options to send information to an external device.

© 2006-2024 KyberVision Japan LLC. All rights reserved.

OpenGL is a registered trademark of Silicon Graphics, Inc. NetStation and the names of EGI products referenced herein are either trademarks and/or service marks or registered trademarks and/or service marks of Electrical Geodesics, Inc. (EGI).