GStreamer Backend

Playing media with gstreamer is possible in Storyboard through the use of the gstreamer-backend application, which is available on most of the Linux targets that Storyboard supports. In order to use gstreamer-backend, the gstreamer base plugins library needs to be installed on the system.

The media that gstreamer-backend is able to play depends on which gstreamer plugins are installed. To find out what is installed on the system, use the gst-inspect application.
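To get a feel for what is available, a couple of hedged examples (assuming the 1.0 version of gstreamer; use gst-inspect-0.10 on a 0.10 system): running gst-inspect with no arguments lists every installed plugin and element, and passing an element name prints the details for that element.

gst-inspect-1.0
gst-inspect-1.0 decodebin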

Once you have had a chance to look through the plugins available to you, you can test playing the media with the gst-launch application. On systems that have both the 0.10 and 1.0 versions of gstreamer, this application is called gst-launch-0.10 and gst-launch-1.0 respectively. A good place to start is something like the following:

gst-launch -v playbin uri=file:///path/to/somemedia.file

The arguments after the -v option are called a pipeline. The above is a very simple pipeline that uses the playbin element to automatically detect how to play the media file located at the specified URI. This is the quickest and simplest way to play media with gstreamer. For gstreamer-0.10 it requires that the playbin2 element is installed, and for gstreamer-1.0 it requires the playbin element. To check that these elements are installed you can use the gst-inspect application.
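For example, on a system with both versions of gstreamer installed, the following commands will print the element details if the appropriate playbin element is available, or report an error if it is not:

gst-inspect-0.10 playbin2
gst-inspect-1.0 playbin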

Playing Audio Files with GStreamer Backend

Playing audio is a simpler process than playing video because the audio does not need to be displayed on a graphical display. For this reason, playing audio just requires emitting the new audio action with the name of the media that you wish to play. Therefore, if you can play the audio file using gst-launch on the target, you will be able to play it with gstreamer-backend.
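As a quick sanity check on the target, a sketch of a command-line audio test (the file path is a placeholder, and autoaudiosink is an assumption; substitute the audio sink your platform actually provides):

gst-launch-1.0 filesrc location=/path/to/somemedia.mp3 ! decodebin ! audioconvert ! autoaudiosink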

A note about the channel argument in the media actions

The channel that is part of the media actions is not a greio channel. Think of it more as a TV channel or a radio station. If you wanted to show four different video feeds at the same time, you would create four different channels, for example video1, video2, video3, and video4. This makes it possible to control each video stream independently of one another by specifying the correct channel in subsequent media events.

Playing Video Files with GStreamer Backend

Playing video with gstreamer and having it display in a Storyboard application is a little more complicated. There are two different approaches. One is to use the Storyboard external render library and have gstreamer place video frames in shared memory, where the Storyboard engine can grab them and display them on the screen. The other is to have gstreamer render the video frames to the screen directly and either leave an area in the Storyboard application with nothing drawn to it or, if available, make use of hardware layers to composite the Storyboard application and the video playback together. To control the video playback properly, a basic pipeline setup should be:

filesrc name=media-src ! decodebin ! ffmpegcolorspace ! appsink name=video-sink

The above pipeline is one that would be used to play video through the external render extension. The name=media-src is important because it allows the gstreamer-backend application to find the filesrc element and set its location to the media file to play. The name=video-sink is important only if you are using the external render extension configuration for showing video; it allows gstreamer-backend to find the appsink element and register to be notified when new buffers are available.
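To verify the decode portion of this pipeline on the target before involving Storyboard, a hedged gst-launch test (gstreamer 0.10 syntax, with the file path as a placeholder) that swaps the appsink for a fakesink is:

gst-launch-0.10 filesrc location=/path/to/somemedia.file ! decodebin ! ffmpegcolorspace ! fakesink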

Using the External Render extension

If you would like to use the external render extension, then the video needs to be rendered to an appsink element and that element must be named video-sink. The Storyboard engine external render extension can only handle buffers that are in an RGB format, with a depth of 16 or 32 bits per pixel. Most video codecs output video frames in the YUV format, so a color space converter will need to be used. The ffmpegcolorspace converter is a software converter that is typically available in gstreamer 0.10; for gstreamer 1.0 the equivalent is typically the videoconvert element.

On some platforms, software color space conversion will impact performance. For this reason, the gst-inspect application should be used to see if a hardware color space converter is available; these typically have the term ipu in the element name. Scaling of the video will also impact playback performance if it needs to be done in software.

Also note that on ARM boards the size and depth of the external buffer will have an impact on the size of the shared memory region. If alignment needs to be taken into consideration, the external render extension will need to be sized accordingly.
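As a sketch of what the equivalent external render pipeline might look like under gstreamer 1.0, assuming the software videoconvert element and 16 bits-per-pixel RGB output (the caps string is an assumption; adjust the format and depth to match how your external render extension is configured):

filesrc name=media-src ! decodebin ! videoconvert ! video/x-raw,format=RGB16 ! appsink name=video-sink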

Rendering frames to the screen directly

If you would like to have gstreamer render the frames of the video to the screen directly, then you don't need to provide an appsink in the pipeline. A video sink that renders to a display can be used, and it does not need to be named because the Storyboard engine will not be attempting to grab frames from it in this case. If the system has hardware layers, and you wish to have video controls on top of the video during playback, then play the video on the back-most layer and place the Storyboard application on the front-most layer.
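A minimal sketch of a direct-render pipeline (gstreamer 1.0 syntax), assuming autovideosink is available to pick a suitable display sink for the platform; substitute a platform-specific framebuffer or hardware-layer sink if you have one:

filesrc name=media-src ! decodebin ! videoconvert ! autovideosink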

If video layers are not available, then you can tell the video sink to render the video at a specific location and size, and place controls around that area. In the Storyboard application, do not render anything in the area where the video will be playing.
