This document serves as a quick reference that aims to answer questions about the various ways gstreamer can be set up in your Storyboard application, and to explain the building blocks along the way.
Gstreamer is an open-source library that supports audio and video playback/streaming/mixing. It is a tool that Storyboard utilizes, but it is not a Crank Software product. Gstreamer-backend is a media service which uses the gstreamer framework to control audio and video. While media can be played using just the base gstreamer framework via `gst-launch`, it’s usually better to use the gstreamer-backend service because it allows Media Control features like Pause, Resume, and Seek.
Before delving into any gstreamer work in Storyboard, if you have never worked with gstreamer before, please take some time to read their documentation and tutorials. In particular, make sure to read this page on pipelines:
https://gstreamer.freedesktop.org/documentation/tutorials/basic/gstreamer-tools.html?gi-language=c
Having good foundational knowledge will only make the next steps easier.
Okay, so now that we’ve decided we want to use gstreamer-backend, how do we actually implement it?
Note: it’s generally a good idea to get your media successfully playing using basic gstreamer via ‘gst-launch-1.0 [pipeline]’ before attempting to integrate it into Storyboard. This will help you identify potential issues with your video, platform, pipeline, etc. before introducing it into the gstreamer-backend/Storyboard environment.
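For example, a quick test on the target might look something like this (the file path is a placeholder, and playbin assumes your platform's gstreamer plugins can auto-select a suitable decoder and video sink):

gst-launch-1.0 playbin uri=file:///path/video.mov

or, with an explicit pipeline:

gst-launch-1.0 filesrc location=/path/video.mov ! decodebin ! videoconvert ! autovideosink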
If your Storyboard Engine’s plugin directory contains the ffmpeg plugin (libgre-plugin-ffmpeg.so), this will either need to be disabled or removed. This is because the ffmpeg plugin is loaded automatically when it is present. If it is not disabled/removed, it will compete with gstreamer-backend to service media requests.
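One simple way to do this (a sketch, assuming the runtime directory used in the launch script below) is to move the library out of the plugin directory so the engine no longer finds it:

mkdir -p $ENGINE/disabled-plugins
mv $ENGINE/plugins/libgre-plugin-ffmpeg.so $ENGINE/disabled-plugins/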
Once the ffmpeg plugin is out of the way, all media requests will be serviced by gstreamer-backend. However, gstreamer-backend must be launched separately, and it should be launched before your application.
Here is an example of what a launch script might look like at this point:
killall sbengine gstreamer-backend
export ENGINE=/usr/crank/runtimes/linux-imx6yocto-armle-opengles_2.0-obj
export SB_PLUGINS=$ENGINE/plugins
export LD_LIBRARY_PATH=$ENGINE/lib
# Launch gstreamer-backend
$ENGINE/bin/gstreamer-backend -v &
# Launch your application
$ENGINE/bin/sbengine -v -ogreio,channel=MediaPlayer /path/MediaPlayer.gapp &
Now that gstreamer-backend has been successfully set up, let’s look at two ways we can render media playback in our application: using an external render extension, and using the dual framebuffer approach. In a nutshell, if you wish to play low-fidelity videos, it’s quicker and more convenient to use external render extensions. If your video is medium to high fidelity, or your main concern is performance, use the dual framebuffer approach. The following sections discuss these two approaches in detail.
Please read both, as they share some crossover information.
The external render extension creates a buffer for other system applications or tasks, such as video players and web browsers, to render into. The MediaPlayer sample that comes with Storyboard uses an external render extension, so let’s use that sample as our reference. To import a sample, go to File > Import > Storyboard Development > Storyboard Sample > select MediaPlayer > Finish.
Once the MediaPlayer application is open, select the external_buf control.
As you can see in the Properties tab on the right-hand side, this control contains an external render extension. In Designer, external render extensions appear as a magenta square. Since external render extensions are a built-in part of your application, they are easier to work with: you can overlay controls simply by re-arranging the layer/control order in the application model, which adds a level of convenience.
The key properties here, Buffer Name and Object Path, don’t have any strict naming convention, but you do need to keep track of what you’ve set them to, as they will come into play momentarily.
Buffer Name: this is simply an identifier for your buffer.
Object Path: this is the path to a shared memory object.
Again, these can be named whatever you wish, as long as the platform you are running on does not already have a conflicting shared memory object at that path.
Let’s take a look at how to trigger video playback. Video playback is triggered via the gra.media.new.video action, so let’s find where that’s being used in this sample. In this case, the gra.media.new.video action can be found at the application level, triggered by an event called MediaPlay.
The Properties of this event are where the majority of the work for successful playback is done.
For greater detail on what each individual option here controls, please refer to our Plugin Action Definitions page, under gra.media.new.video:
(https://support.cranksoftware.com/hc/en-us/articles/360040001012-Plugin-Action-Definitions)
For now, we’ll focus on the most important ones.
Right away, one of the first things you will want to do is populate the External Buffer Name and Object Name fields. Make sure these fields contain the same values as the Buffer Name and Object Path within the external render extension properties.
Channel Name: The channel name the new video is to be played on
Media Name: The name of the media to play, full path to a video file
Extra Data: Any extra data that should be passed to the backend (very important)
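For example, a hypothetical set of values for this action might look like the following (the names, path, and pipeline are placeholders, and the pipeline details are covered below):

External Buffer Name: video_buffer
Object Name: /video_object
Channel Name: video_channel
Media Name: /path/video.mov
Extra Data: pipeline: [pipeline];use_external;

Here video_buffer and /video_object must match the Buffer Name and Object Path set on the external render extension.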
The Extra Data field is where you will specify the gstreamer pipeline to be used. The pipeline specified here can be similar to the pipeline used with gst-launch, with only some minor changes.
Most notably, a pipeline that might’ve looked like:
gst-launch-1.0 [pipeline]
Translated for use by gstreamer-backend would look like:
pipeline: [pipeline]
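For example, a command line you might have tested on the target (the file path and elements here are placeholders, assuming decodebin and autovideosink are available on your platform):

gst-launch-1.0 filesrc location=/path/video.mov ! decodebin ! videoconvert ! autovideosink

would, before any of the Storyboard-specific changes described below, become:

pipeline: filesrc location=/path/video.mov ! decodebin ! videoconvert ! autovideosink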
There are a few extra flags and elements that we will have to add or modify in the pipeline for Storyboard usage:
- use_external: This is only to be used if you are writing your pipeline within Storyboard, i.e. in the Extra Data field. Otherwise use “-e”. If you are using an external render extension, as we are in this example, you need to add the “use_external” flag at the end of your pipeline, separated by a semi-colon.
“pipeline: [pipeline];use_external;”
- -e: This has the same functionality as the use_external flag; the difference in when to use which one simply comes down to where you are specifying the pipeline. Use this one if you are writing your pipeline from the command line or a launch script.
$ENGINE/bin/gstreamer-backend -e -p [pipeline]
- -p: Use the following defined pipeline to play the gstreamer content.
- When it comes to specifying which video you wish to play, there are a couple of ways to go about it.
You could hard-code the path to the video you wish to play:
“pipeline:filesrc location=/path/video.mov”
Or you could tell gstreamer-backend to look for whatever you’ve specified in the Media Name field of the gra.media.new.video properties:
“pipeline:filesrc name=media-src”
- appsink name=video-sink: In order to use gstreamer-backend with an external render extension, appsink needs to be used as opposed to any other kind of video sink. The external render extension options we’ve specified (use_external/-e) work by looking for an ‘appsink’ element named ‘video-sink’, from which the video frames are pulled and rendered into the Storyboard application.
appsink does not inherently know the format you wish to display your content in, so you will have to specify this. For example, if I wanted my content to be 800x480 using the BGRA color space I would write:
video/x-raw,height=480,width=800,format=BGRA
You will also need to add videoconvert and videoscale into your pipeline before the caps that set the dimensions and color format, as these elements are what allow the scaling and format conversion to happen.
All together this would look like:
videoconvert ! videoscale ! video/x-raw,height=480,width=800,format=BGRA
Here’s an example pipeline with these modifications:
pipeline:filesrc name=media-src ! videoconvert ! videoscale ! video/x-raw,height=480,width=800,format=BGRA ! appsink name=video-sink;use_external;
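Note that, depending on your media container and codec, you will likely also need decode elements between filesrc and videoconvert. A sketch using decodebin (assuming it is available on your platform) might look like:

pipeline:filesrc name=media-src ! decodebin ! videoconvert ! videoscale ! video/x-raw,height=480,width=800,format=BGRA ! appsink name=video-sink;use_external;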
At this point, these are the only pipeline modifications required to get your media playing on an external render extension. However, it is not uncommon to experience performance issues depending on your video or platform. Using an external render extension is slower for a few reasons: for every frame of video, a memory copy is performed to move the buffer data into the external render extension, and because we are using appsink the video must also be converted and scaled, which can be an intensive process on certain platforms.
If your platform has hardware acceleration capabilities, then you can add the corresponding elements to your pipeline to boost performance. If you are unsure what a particular element is called, or what other elements you have access to, run `gst-inspect-1.0` and it will print a list of the elements available on your platform.
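For example, on an i.MX platform you might look for the vendor-specific elements like this (the grep filter is just an illustration):

gst-inspect-1.0 | grep -i imx

and then inspect a specific element’s properties and supported formats with:

gst-inspect-1.0 imxipuvideosink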
If, after all of your optimization tweaking, you still find performance lacking, then you may have to switch implementations and take the dual framebuffer approach.
The dual framebuffer approach is simple in principle, and in general it is the ideal approach when it comes to performance. Since we won’t be using an external render extension, we instead render the Storyboard application to one framebuffer and have gstreamer-backend render to a separate framebuffer.
You are going to want to render gstreamer-backend to the framebuffer below your application’s framebuffer.
What this usually means is that your Storyboard application will need to have a “hole” punched out of it to allow the video to peek through.
Luckily, the MediaPlayer sample already does this: if you simply remove the external_buf control, you will see that the background has a transparent section.
An example of “punching a hole” in your application.
For gstreamer, it’s actually quite simple to specify the framebuffer to be used. Video sink elements should have a parameter to specify the framebuffer. I’ll use imxipuvideosink as an example:
"imxipuvideosink window-width=880 window-height=520 window-x-coord=10 window-y-coord=75 framebuffer=/dev/fb0"
In this case, all you need to do is add the “framebuffer” property and provide the path to the desired framebuffer. Note that I am also specifying the window dimensions and coordinates at which the video is to be displayed.
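Putting this together, a hypothetical pipeline for the dual framebuffer approach (whether placed in the Extra Data field or passed on the command line with -p) might look like the following; note there is no appsink and no use_external, since the hardware sink renders the video directly, and the element names will vary by platform:

pipeline:filesrc name=media-src ! decodebin ! imxipuvideosink framebuffer=/dev/fb0 window-width=880 window-height=520 window-x-coord=10 window-y-coord=75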
It is also simple to specify the framebuffer that Storyboard is to use. The render manager has options to specify the framebuffer, but they will vary depending on your platform. Take a look at our plugin option documentation, under “render_mgr” to determine which one is right for you:
https://support.cranksoftware.com/hc/en-us/articles/360040000752-Storyboard-Engine-Plugin-Options
You will either need to use ‘display=[x]’ or ‘fb=[x]’. Again, these will vary from platform to platform. One of the key differences between the two options is that ‘display’ takes an index while ‘fb’ takes a path. This just means that when you launch Storyboard, you simply include something like ‘-orender_mgr,display=1’ as part of your launch command line.
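For example, a launch line using the ‘display’ variant might look like this (the gapp path and channel are the placeholders used in the earlier launch script):

$ENGINE/bin/sbengine -v -orender_mgr,display=1 -ogreio,channel=MediaPlayer /path/MediaPlayer.gapp &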
You may find that you will have to set up your secondary framebuffer via fbset before you are able to see content that you are rendering to that specific framebuffer. For example, that may look something like:
fbset -fb /dev/fb1 -g 1024 600 1024 600 32
Simply execute this command before trying to render to that buffer. Here the -g arguments are the visible resolution, the virtual resolution, and the color depth (1024x600 visible, 1024x600 virtual, 32 bits per pixel); adjust them to match your display.
If at this point you are encountering some visual artifacts like a buffer having half-transparency, then you will have to modify the way alpha is handled in that buffer. Luckily, we do have a utility called fbalpha that can handle this for you.
The fbalpha utility provides added control over rendering content on the secondary framebuffer.
Installing fbalpha
When installing fbalpha, make sure to install it in your runtime’s ‘bin’ directory. Then add a call to start up the fbalpha utility and direct it to the top-most framebuffer:
$ENGINE/bin/fbalpha -l1 -f /dev/fb1
At this point your launch script may look like:
$ENGINE/bin/fbalpha -l1 -f /dev/fb1
$ENGINE/bin/gstreamer-backend -v ...etc.
$ENGINE/bin/sbengine -v ...etc.
The next time you launch your application, the alpha issues should be resolved. The provided download also includes the source code, so if you wish to adjust the utility to suit your needs, you can.
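Putting the pieces together, a complete launch script for the dual framebuffer approach might look something like the following sketch (the runtime directory, resolutions, and paths are the same placeholders used earlier in this document):

killall sbengine gstreamer-backend
export ENGINE=/usr/crank/runtimes/linux-imx6yocto-armle-opengles_2.0-obj
export SB_PLUGINS=$ENGINE/plugins
export LD_LIBRARY_PATH=$ENGINE/lib
# Configure the secondary framebuffer before rendering to it
fbset -fb /dev/fb1 -g 1024 600 1024 600 32
# Handle alpha on the top-most framebuffer
$ENGINE/bin/fbalpha -l1 -f /dev/fb1
# Launch gstreamer-backend (the pipeline's video sink is expected to target /dev/fb0, as in the imxipuvideosink example above)
$ENGINE/bin/gstreamer-backend -v &
# Launch the application on the upper framebuffer
$ENGINE/bin/sbengine -v -orender_mgr,display=1 -ogreio,channel=MediaPlayer /path/MediaPlayer.gapp &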
Streaming is purely gstreamer pipeline work. If your stream uses RTSP, then instead of filesrc you will use rtspsrc, which ends up looking like:
"rtspsrc location=[address]”
There are many pipeline parameters unique to rtsp streaming which you can familiarize yourself with here: (https://gstreamer.freedesktop.org/documentation/rtsp/rtspsrc.html?gi-language=c)