A key part of application development is configuring the app to respond to touch, gestures, and input from the end user.
Touchscreen input is a very platform-specific consideration. Storyboard works with a number of standard input devices and abstracts the implementation-specific behaviours for press, release, motion, and multi-touch system events into standard Storyboard events. These input events are described in the Storyboard standard event definitions section of this document.
The configuration details for setting up and troubleshooting two popular Linux input systems are described in this document. For other system- or touchscreen-specific configurations, consult the operating system or touchscreen vendor's documentation.
On some systems, Storyboard Engine runs in a window. When running in a window there is specific behaviour for mouse or touch input leaving the screen. If the pointer leaves the window while pressed, Storyboard Engine will generate a gre.release event; if it enters the window while pressed, Storyboard Engine will generate a gre.press event. Entering or leaving the window will not generate events if no mouse buttons are pressed down.
Storyboard gesture support is provided by the libgre-plugin-gesture plugin. The gesture plugin options are described in detail in the Storyboard plugin option appendix of this document. The gesture plugin interprets the inbound press, release, and motion events and, based on those observed events, generates custom gesture events. Gestures are only emitted once a release occurs and a pattern has been matched.
Gestures are made up of a series of numbers. The numbers represent the direction that the cursor was traveling, arranged as a grid numbered from one (1) to eight (8) ordered clockwise:

1. Up
2. Up and to the right
3. Right
4. Down and to the right
5. Down
6. Down and to the left
7. Left
8. Up and to the left
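The eight-way encoding above can be sketched as a quantization of a movement vector into 45-degree sectors. This is an illustrative model only, not the plugin's actual code, and it assumes screen coordinates where x grows to the right and y grows downward:

```python
import math

def direction_code(dx, dy):
    """Map a cursor movement vector to the 1-8 direction codes:
    1=up, 2=up-right, 3=right, ... 8=up-left (clockwise)."""
    # Angle measured clockwise from "up" (negative y), in degrees.
    angle = math.degrees(math.atan2(dx, -dy)) % 360.0
    # Each direction covers a 45-degree sector centred on its axis.
    return int(((angle + 22.5) % 360.0) // 45.0) + 1
```

For example, a movement straight up yields 1 and a movement to the right yields 3, matching the list above.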
By default the gesture plugin registers the following gestures:

| Event | Sequence |
|---|---|
| gre.gesture.up | 1 |
| gre.gesture.down | 5 |
| gre.gesture.right | 3 |
| gre.gesture.left | 7 |
Other gestures can be created by registering them in a custom gesture definition file that is loaded by the gesture plugin.
The gesture definition file is a comma-separated value text file that contains a field for the name of the event followed by the numeric gesture sequence string that needs to be matched to generate the event. For example, to define a Z gesture, you could put the following in a gesture-definition.txt file:

gre.gesture.zee,363

This definition indicates that when the gesture plugin detects a right, down and to the left, right motion sequence, it should generate a gre.gesture.zee event.
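A definition file may contain one entry per line. For instance, the following (the gre.gesture.ell and gre.gesture.vee event names are our own illustrative examples, not defaults registered by the plugin) defines the Z gesture above plus an L shape (down, then right) and a V shape (down-right, then up-right):

```
gre.gesture.zee,363
gre.gesture.ell,53
gre.gesture.vee,42
```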
You can point the gesture plugin at the custom gesture definition file by running Storyboard Engine with the option -ogesture,file=filename, where filename is the name of the project-relative file, for example gesture-definition.txt.
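A full engine invocation using this option might look like the following (the application bundle name is illustrative):

```
sbengine -ogesture,file=gesture-definition.txt my-application.gapp
```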
Gesture sequences are currently limited to 30 movements; for longer sequences, a warning will be generated and the gesture entry will be ignored.
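The loading and matching behaviour described above can be sketched as follows. This is a minimal illustrative model, not the plugin's implementation; in particular, collapsing runs of identical direction codes is our assumption about how raw motion samples are smoothed into a sequence:

```python
MAX_MOVEMENTS = 30  # entries longer than this are warned about and ignored

def load_definitions(lines):
    """Parse gesture definition lines of the form 'event.name,sequence'."""
    registered = {}
    for line in lines:
        name, _, sequence = line.strip().partition(",")
        if len(sequence) > MAX_MOVEMENTS:
            continue  # the plugin warns and skips such entries
        registered[sequence] = name
    return registered

def match_gesture(direction_codes, registered):
    """On release, collapse runs of identical direction codes and look
    the resulting sequence up among the registered gestures."""
    collapsed = []
    for code in direction_codes:
        if not collapsed or collapsed[-1] != str(code):
            collapsed.append(str(code))
    return registered.get("".join(collapsed))
```

With the Z definition from above, the observed codes [3, 3, 6, 6, 3] collapse to "363" and match gre.gesture.zee; an unmatched sequence produces no event.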
Unlike the single-touch gestures, which are reported once the gesture has been completed, the multi-touch gestures are events that fire whenever more than one finger is on the touchscreen. The plugin tracks up to five contact points; if six or more are present, the extra points are ignored.
The events the plugin listens to are gre.press, gre.release, and gre.motion, which track the touchscreen while only one finger is present, and gre.mtpress, gre.mtrelease, and gre.mtmotion, which track the touchscreen while multiple touches are present.
Note
When using a multi-touch enabled device, the press, release, and motion events will be sent only while there is one touch point present. As soon as there are multiple touch points present, all events will be mt events.
After listening to the events, if more than one touch point is present and one or more touch points move, the plugin will do an update where it compares the old touch locations to the updated touch locations and generate the related multi-touch gesture events. To determine how many fingers are currently being used to generate these events, there is an npoints field in the event data.
This event uses x_move and y_move to communicate the difference in x and y of the midpoint of all present touch points between the current and last event sent from the touchscreen.
Data

uint32_t button
uint32_t timestamp
int16_t subtype
int16_t x
int16_t y
int16_t z
int16_t id
int16_t spare
float value
float x_move
float y_move
int16_t npoints
Where:
- x_move
-
The x difference between this event and the last event
- y_move
-
The y difference between this event and the last event
- npoints
-
The number of touch points used to generate this event
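The midpoint arithmetic this event describes can be sketched as follows; this is an illustrative model of the described behaviour, not the plugin's code:

```python
def pan_delta(old_points, new_points):
    """Difference in the midpoint of all touch points between the previous
    and current touchscreen update, i.e. the x_move/y_move values."""
    def midpoint(points):
        return (sum(x for x, _ in points) / len(points),
                sum(y for _, y in points) / len(points))
    (ox, oy), (nx, ny) = midpoint(old_points), midpoint(new_points)
    return nx - ox, ny - oy
```

For example, two fingers both moving 2 pixels right and 3 pixels down yield x_move = 2 and y_move = 3.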
This event uses the value data field, which will be the scale factor of the average spacing of all current touch points compared to the spacing of all the old touch points. The scale factor is calculated as newspacing/oldspacing, so a value of 1.1 indicates a growth of 10% and a value of 0.9 indicates a shrink of 10%.
Data

uint32_t button
uint32_t timestamp
int16_t subtype
int16_t x
int16_t y
int16_t z
int16_t id
int16_t spare
float value
float x_move
float y_move
int16_t npoints
Where:
- value
-
The scale factor between this event and the last event
- npoints
-
The number of touch points used to generate this event
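The newspacing/oldspacing calculation can be sketched as below. Measuring "spacing" as the average distance of each touch point from the midpoint is our assumption; the plugin may measure spacing differently:

```python
import math

def scale_factor(old_points, new_points):
    """Ratio of the average touch-point spacing now versus previously,
    i.e. the pinch/zoom value field (1.1 = 10% growth, 0.9 = 10% shrink)."""
    def avg_spacing(points):
        mx = sum(x for x, _ in points) / len(points)
        my = sum(y for _, y in points) / len(points)
        return sum(math.hypot(x - mx, y - my) for x, y in points) / len(points)
    return avg_spacing(new_points) / avg_spacing(old_points)
```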
This event uses the value data field, which will be the difference in rotation between the average angle of all current touch points compared to the average angle of all the previous touch points. The value will be in degrees.
Data

uint32_t button
uint32_t timestamp
int16_t subtype
int16_t x
int16_t y
int16_t z
int16_t id
int16_t spare
float value
float x_move
float y_move
int16_t npoints
Where:
- value
-
The rotation difference, in degrees, between this event and the last event
- npoints
-
The number of touch points used to generate this event
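The rotation difference can be sketched as follows. This is an illustrative model only: rather than naively averaging raw angles (which breaks at the ±180° branch cut), it averages the per-point angle changes about the touch midpoint, which is our assumption about how the described average-angle comparison is made robust:

```python
import math

def rotation_delta(old_points, new_points):
    """Average change, in degrees, of each touch point's angle about the
    touch midpoint between the previous and current update."""
    def midpoint(points):
        return (sum(x for x, _ in points) / len(points),
                sum(y for _, y in points) / len(points))
    omx, omy = midpoint(old_points)
    nmx, nmy = midpoint(new_points)
    total = 0.0
    for (ox, oy), (nx, ny) in zip(old_points, new_points):
        old_angle = math.atan2(oy - omy, ox - omx)
        new_angle = math.atan2(ny - nmy, nx - nmx)
        delta = math.degrees(new_angle - old_angle)
        # Normalise into (-180, 180] so wraparound does not distort the average.
        delta = (delta + 180.0) % 360.0 - 180.0
        total += delta
    return total / len(old_points)
```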
By default the gesture events are treated as custom events and are not included in the available events list for actions to bind with. You will need to add them manually the same way that custom user events are added as described in the chapter Connecting Events to Actions.
Right-click the control you want to add the action to and select Add > Action. Then click the Add button to the right of the Event Filter text box. You will see the gre.gesture.up, gre.gesture.down, gre.gesture.left, and gre.gesture.right events listed. Now that you have added the gesture events to the application, you will be able to select them in the events list of the dialog to trigger an action.

By default, sbengine will search your application for use of any gesture events and gestures will be enabled as required.
The gesture plugin can be disabled or forced to single or multi-touch mode by passing one of the following options to Storyboard Engine. See the section on Plugin Options for details about command line arguments.
When running from Storyboard Designer, the gesture plugin can be configured from the Simulation Configuration Dialog.
- auto
-
This is the default setting; Storyboard Engine will choose between disabled, single, and multi.
- single
-
Only single-touch gestures will be generated.
- multi
-
Single and multi-touch gestures will be generated.
- disabled
-
No gesture events will be generated.
These gesture plugin options are discussed in more detail in the gesture plugin options section of this document.
Once you have configured the gesture plugin options in the Simulation Configuration Dialog, click Apply and Run to see your changes applied to the Storyboard Engine command line arguments.