Getting started with GStreamer with Python

You know, there are tonnes of undocumented things out there. Really, really cool technologies that should be getting used more are not, simply because they lack decent docs. And, to make matters worse, the developers naturally just want to get on and write the software. So, I would like to urge everyone who reads this (and I am thinking of you ‘orrible lot on Planet GNOME in particular) to write an article about something you have discovered that isn’t particularly well documented. This could be a technique, a technology, a skill or something else. Let’s get some Google juice pumping and get some extra docs out there to help people get started. :)

So, with this in mind, I am going to write a simple first guide to getting started with GStreamer using the excellent Python bindings. This tutorial should be of particular interest if you want to hack on Jokosher, Pitivi or Elisa as they, like many others, are written in Python and use GStreamer.

Ready? Right, let’s get started with the prerequisites. You will need the following:

  • GStreamer 0.10
  • Python
  • PyGTK (often packaged as python-gtk2)

You will also need a text editor. Now, some of you will want to have a big ‘ole argument about which one that is. Come back in four hours and we can continue. :P

An overview

So, what is GStreamer and how does it help you make multimedia applications? Well, GStreamer is a multimedia framework that allows you to easily create, edit and play multimedia by building pipelines out of special multimedia elements.

GStreamer has a devilishly simple way of working. With GStreamer you create a pipeline, and it contains a bunch of elements that make that multimedia shizzle happen. This is very, very similar to pipelines on the Linux/BSD/UNIX command line. As an example, on the normal command line you may enter this command:

foo@bar:~$ ps ax | grep "apache" | wc -l

This command first grabs a process listing, then returns all the processes called “apache” and then feeds this list into the wc command which counts the number of lines with the -l switch. The result is a number that tells you how many instances of “apache” are running.

From this we can see that each command is linked with the | symbol, and the output of the command on the left of the | is fed into the input of the command on the right of the |. This is eerily similar to how GStreamer works.

With GStreamer you string together elements, and each element does something in particular. To demonstrate this, find an Ogg file (such as my latest tune :P ), save it to a directory, cd to that directory in a terminal and run the following command:

foo@bar:~$ gst-launch-0.10 filesrc location=jonobacon-beatingheart.ogg ! decodebin ! audioconvert ! alsasink

(you can press Ctrl-C to stop it)

When you run this, you should hear the track play. Let’s look at what happened.

The gst-launch-0.10 command can be used to run GStreamer pipelines. You just pass the command the elements you want to use one by one, and each element is linked with the ! symbol. You can think of the ! as the | in a normal command-line list of commands. The above pipeline contains a bunch of elements, so let’s explain what they do:

  • filesrc – this element loads a file from your disk. Next to the element you set its location property to point to the file you want to load. More on properties later.
  • decodebin – you need something to decode the file from the filesrc, so you use this element. This element is a clever little dude, and it detects the type of file and automatically constructs some GStreamer elements in the background to decode it. So, for an Ogg Vorbis audio file, it actually uses the oggdemux and vorbisdec elements. Just mentally replace the decodebin part of the pipeline with oggdemux ! vorbisdec and you get an idea of what is going on.
  • audioconvert – the kind of information in a sound file and the kind of information that needs to come out of your speakers are different, so we use this element to convert between them.
  • alsasink – this element spits audio to your sound card using ALSA.

So, as you can see, the pipeline works the same as the command-line pipeline we discussed earlier – each element feeds into the next element to do something interesting.
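
To get a feel for what decodebin is doing behind the scenes with our Ogg Vorbis file, you can spell the decoding out explicitly. This is just an illustrative variation on the pipeline above, assuming the file really is Ogg Vorbis audio:

foo@bar:~$ gst-launch-0.10 filesrc location=jonobacon-beatingheart.ogg ! oggdemux ! vorbisdec ! audioconvert ! alsasink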

At this point you can start fiddling with pipelines and experimenting. To do this, you need to figure out which elements are available. You can do this by running the following command:

foo@bar:~$ gst-inspect-0.10

This lists all available elements, and you can use the command to find out details about a specific element, such as the filesrc element:

foo@bar:~$ gst-inspect-0.10 filesrc

More about GStreamer

OK, let’s get down and dirty with some of the GStreamer terminology. Some people get quite confused by terms such as pads and caps, not to mention bins and ghost pads. It is all rather simple to understand once you get your head around it, so let’s have a quick run around the houses and get to grips with it.

We have already discussed what a pipeline is, and that elements live on the pipeline. Each element has a number of properties. These are settings for that particular element (like knobs on a guitar amp). As an example, the volume element (which sets the volume of a pipeline) has properties such as volume, which sets the volume level, and mute, which can be used to mute the element. When you create your own pipelines, you will set properties on a lot of elements.
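
You can play with properties from the command line too. As a quick sketch using the volume element mentioned above (run gst-inspect-0.10 volume to check its exact properties), this should play our Ogg file at half volume:

foo@bar:~$ gst-launch-0.10 filesrc location=jonobacon-beatingheart.ogg ! decodebin ! audioconvert ! volume volume=0.5 ! alsasink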

Each element has virtual plugs through which data can flow in and out, called pads. If you think of an element as a black box that does something to the information fed into it, on the left and right sides of the box would be sockets into which you can plug a cable to feed information in and out of the box. This is what pads do. Most elements have an input pad (called a sink) and an output pad (called a src). Using my l33t ASCII art mad skillz, this is how our pipeline above looks in terms of the pads:

[src]  !  [sink  src]  !  [sink  src]  !  [sink]

The element on the far left only has a src pad as it only provides information (such as the filesrc). The next few elements take information and do something to it, so they have sink and src pads (such as the decodebin and audioconvert elements), and the final element only receives information (such as the alsasink). When you use the gst-inspect-0.10 command to look at an element’s details, it will tell you which pads the element has.
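
You can also poke at pads from Python. Here is a tiny sketch (it assumes the audioconvert element is available and uses the pygst 0.10 bindings covered later in this article) that creates an element and prints the name and direction of each of its pads:

#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst

# Create an element and list its pads
element = gst.element_factory_make("audioconvert", "conv")
for pad in element.pads():
    print pad.get_name(), pad.get_direction()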

So, we know we have pads, and that data flows through them from the first element on the pipeline to the last, and now we need to talk about caps. Each element has particular caps (short for capabilities), and these say what kind of information the element takes (such as whether it takes audio or video). You can think of caps as being like the rating on a power socket that says it only takes electricity of a particular voltage.
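
To see the caps an element will accept, you can ask one of its pads. Here is a minimal sketch, again using audioconvert as the example element, which prints the caps of its sink pad (you should see the raw audio formats it accepts):

#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst

# Print the caps of the audioconvert sink pad
element = gst.element_factory_make("audioconvert", "conv")
print element.get_pad("sink").get_caps().to_string()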

Let’s now talk about bins. A lot of people get confused about bins, but they are pretty simple. A bin is just a convenient way of collecting elements together into a container. As an example, you may have a bunch of elements that decode a video and apply some effects to it. To make this easier to handle, you could put these elements into a bin (which is like a container), and then you can just refer to that bin to refer, in turn, to those elements. As such, the bin itself becomes an element. As an example, if your pipeline was a ! b ! c ! d, you could put them all into mybin, and when you refer to mybin you are actually using a ! b ! c ! d. Cool, huh?

Finally, this brings us onto ghost pads. When you create a bin and shove a bunch of elements in there, the bin then becomes your own custom element which in turn uses those elements in the bin. To do this, your bin naturally needs its own pads that hook up to the elements inside the bin. This is exactly what ghost pads are. When you create a bin, you create the ghost pads and tell them which elements inside the bin they hook up to. Simple. :)
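
To make bins and ghost pads concrete, here is a minimal Python sketch. The choice of elements (audioconvert and volume) is just an example, and the code assumes the pygst 0.10 bindings used in the rest of this article:

#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst

# Create a bin and put a couple of elements inside it
mybin = gst.Bin("mybin")
conv = gst.element_factory_make("audioconvert", "conv")
vol = gst.element_factory_make("volume", "vol")
mybin.add(conv)
mybin.add(vol)
conv.link(vol)

# Create ghost pads on the bin that map to the pads of the inner elements
mybin.add_pad(gst.GhostPad("sink", conv.get_pad("sink")))
mybin.add_pad(gst.GhostPad("src", vol.get_pad("src")))

# mybin can now be added to a pipeline and linked just like any other element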

Writing some code

To make this GStreamer goodness happen in a Python script, you only need to know a few core skills to get started. These are:

  • Create a pipeline
  • Create elements
  • Add elements to the pipeline
  • Link elements together
  • Set it off playing

So, let’s get started. We are going to create a program that does the equivalent of this:

foo@bar:~$ gst-launch-0.10 audiotestsrc ! alsasink

Here we use the audiotestsrc element, which just outputs an audible tone, and then feed that into an alsasink so we can hear it via the sound card. Create a file called gstreamertutorial-1.py and add the following code:

#!/usr/bin/python

import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk

class Main:
    def __init__(self):
        self.pipeline = gst.Pipeline("mypipeline")

        self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")
        self.pipeline.add(self.audiotestsrc)

        self.sink = gst.element_factory_make("alsasink", "sink")
        self.pipeline.add(self.sink)

        self.audiotestsrc.link(self.sink)

        self.pipeline.set_state(gst.STATE_PLAYING)

start=Main()
gtk.main()

Download the code for this script here.

So, let’s explain how this works. First we import some important Python modules:

import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk

Here the GStreamer modules (pygst and gst) are imported, and we also import the GTK modules. We use the GTK modules so we can use the GTK mainloop. A mainloop is a loop that keeps the program running and processing events, and we need some kind of mainloop for the pipeline to run in, so we are using the GTK one.

Now let’s create a Python class and its constructor:

class Main:
    def __init__(self):

Now, to the meat. First create a pipeline:

self.pipeline = gst.Pipeline("mypipeline")

Here you create a pipeline that you can reference in your Python script as self.pipeline. The mypipeline bit in the brackets is a name for that particular instance of a pipeline. This is used in error messages and the debug log (more on the debug log later).

Now let’s create an element:

self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")

Here you create the audiotestsrc element by using the element_factory_make() method. This method takes two arguments – the type of element you want to create and, again, a name for that particular instance of the element. Now let’s add it to the pipeline:

self.pipeline.add(self.audiotestsrc)

Here we use the add() method that is part of the pipeline to add our new element.

Let’s do the same for the alsasink element:

self.sink = gst.element_factory_make("alsasink", "sink")
self.pipeline.add(self.sink)

With our two elements added to the pipeline, let’s now link them:

self.audiotestsrc.link(self.sink)

Here you take the first element (self.audiotestsrc) and use the link() method to link it to the other element (self.sink).

Finally, let’s set the pipeline to play:

self.pipeline.set_state(gst.STATE_PLAYING)

Here we use the set_state() method from the pipeline to set the pipeline to a particular state. There are a bunch of different states, but here we set it to PLAYING which makes the pipeline run. Other pipeline states include NULL, READY and PAUSED.

Finally, here is the code that creates the Main instance and runs the mainloop:

start=Main()
gtk.main()

To run this script, set it to be executable and run it:

foo@bar:~$ chmod a+x gstreamertutorial-1.py
foo@bar:~$ ./gstreamertutorial-1.py

You should hear the audible tone through your speakers. Press Ctrl-C to cancel it.

Setting properties

Right, let’s now add a line of code to set a property for an element. Underneath the self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio") line, add the following line:

self.audiotestsrc.set_property("freq", 200)

This line uses the set_property() method as part of the element to set a particular property. Here we are setting the freq property and giving it the value of 200. This property specifies what frequency the tone should play at. Add the line of code above (or download an updated file here) and run it. You can then change the value from 200 to 400 and hear the difference in tone. Again, use gst-inspect-0.10 to see which properties are available for that particular element.

You can change properties while the pipeline is playing, which is incredibly useful. As an example, you could have a volume slider that sets the volume property in the volume element to adjust the volume while the audio is being played back. This makes your pipelines really interactive when hooked up to a GUI. :)
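
As a rough sketch of that idea, assuming your pipeline contains a volume element stored as self.volume and that a gtk.HScale in your GUI has its value-changed signal hooked up to this handler, the callback could be as simple as:

    def on_volume_changed(self, widget):
        # Adjust the volume property live while the pipeline keeps playing
        self.volume.set_property("volume", widget.get_value())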

Hooking everything up to a GUI

Right, so how do we get this lot working inside a GUI? Well, again, it’s fairly simple. This section assumes that you know how to get a Glade GUI working inside your Python program (see this excellent tutorial if you have not done this before).

Now, go and download this glade file and this Python script. The Python script has the following code in it:

#!/usr/bin/python
import pygst
pygst.require("0.10")
import gst
import pygtk
import gtk
import gtk.glade

class Main:
    def __init__(self):

        # Create gui bits and bobs

        self.wTree = gtk.glade.XML("gui.glade", "mainwindow")

        signals = {
            "on_play_clicked" : self.OnPlay,
            "on_stop_clicked" : self.OnStop,
            "on_quit_clicked" : self.OnQuit,
        }

        self.wTree.signal_autoconnect(signals)

        # Create GStreamer bits and bobs

        self.pipeline = gst.Pipeline("mypipeline")

        self.audiotestsrc = gst.element_factory_make("audiotestsrc", "audio")
        self.audiotestsrc.set_property("freq", 200)
        self.pipeline.add(self.audiotestsrc)

        self.sink = gst.element_factory_make("alsasink", "sink")
        self.pipeline.add(self.sink)

        self.audiotestsrc.link(self.sink)

        self.window = self.wTree.get_widget("mainwindow")
        self.window.show_all()

    def OnPlay(self, widget):
        print "play"
        self.pipeline.set_state(gst.STATE_PLAYING)

    def OnStop(self, widget):
        print "stop"
        self.pipeline.set_state(gst.STATE_READY)

    def OnQuit(self, widget):
        gtk.main_quit()

start=Main()
gtk.main()

In this script you basically create your pipeline in the constructor (as well as the code to present the GUI). We then have a few different class methods that run when the user clicks the different buttons. The Play and Stop buttons simply set the state of the pipeline to PLAYING (Play button) or READY (Stop button).

Debugging

Debugging when things go wrong is always important, and there are some useful techniques you can use to peek inside what is going on in your GStreamer programs. The first thing to know is how to generate a debug log file from your program. You do so by setting some environment variables before you run your program. As an example, to run the previous program and generate a debug log called log, run the following command:

foo@bar:~$ GST_DEBUG=3,python:5,gnl*:5 ./gstreamertutorial.py > log 2>&1

This will generate a file called log that you can have a look into. Included in the file are ANSI codes to colour the log lines to make it easier to find errors, warnings and other information. You can use less to view the file, complete with the colours:

foo@bar:~$ less -R log

It will mention it is a binary file and ask if you want to view it. Press y and you can see the debug log. Inside the log it will tell you which elements are created and how they link together.
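
If you would rather not type the environment variable every time, you can also set it from inside the script. This is just a sketch of the idea; the important bit is that GST_DEBUG must be set before gst is imported, because GStreamer reads it when it initialises:

import os
os.environ["GST_DEBUG"] = "3"   # must happen before importing gst

import pygst
pygst.require("0.10")
import gst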

Onwards and upwards

So there we have it, a quick introduction to GStreamer with Python. There is of course much more to learn, but this tutorial should get you up and running. Do feel free to use the comments on this blog post to discuss the tutorial, add additional comments and ask questions. I will answer as many questions as I get time for, and other users may answer other questions. Good luck!

…oh and I haven’t forgotten. I want to see everyone writing at least one tutorial like I said at the beginning of this article. :)

  • James

    You should submit this to Gnome Journal.

  • http://www.cin.ufpe.br/~cinlug Setanta

    We have a little article (in portuguese) obscenelly illustrated (not illustrated with obscenity) here (http://www.cin.ufpe.br/~cinlug/drupal/?q=node/59), and we’re planning for another one with code in python and ruby, so your article is of great help. Thanks.

  • http://macslow.thepimp.net MacSlow

    You’re so right, Jono! We’ll do something about this, I’ve no doubt about that :)

    Best regards…

    MacSlow

  • http://yonkeltron.com Yonkeltron

    Slick work man. Well done. Beard.

  • http://diggit eelco
  • http://SSPAETH.DE spaetz

    Nice tutorial. Contrary to earlier beliefs, there doesn’t even seem to be any witchcraft involved in getting gstreamer to do anything. I feel inspired to try myself at a python powered transcription application that is a little better suited than amarok at playing interview data with a foot switch. Thanks Jono

  • jb

    Thanks Jono for making this clearer :-) If readers need not be scared of GStreamer, do you not fear listeners might be scared by this “Beating Heart” release ?

    jk, the tune is impressive.

  • jono

    Setanta – is there any chance you could translate that awesome looking tutorial into English? It would be awesome if you could. :)

    spaetz – awesome – if this tutorial helps you get started with the incredible GStreamer framework, then I am happy. :)

    Oh, and /me thwacks jb. :P

  • gm

    Great work Jono, I’ve been looking for something like this for a long time, it’s helped me understand how gstreamer works a lot better than previously.

    It would also be great if you’d follow this up with a tutorial dealing with more advanced stuff like metadata handling etc. (wink wink, nudge nudge :wink:)

    The complete lack of documentation for gst-python is incredibly frustrating, and I’m sure it’s absence has caused a lot of potential media-related app developers to shy away from using the brilliant framework.

  • flesse_bleu
  • minkwe

    Great stuff! I was able to code a TV player (no tuning yet) using “v4lsrc” and “autovideosink” in python. Now I would like to draw some lines and text on the life video feed.

    I suppose there is an “element” or “bin” that permits one to do this? Any ideas will be appreciated.

  • http://www.cin.ufpe.br/~cinlug Setanta

    Setanta – is there any chance you could translate that awesome looking >>tutorial into English? It would be awesome if you could. :) lol Thanks. So, I will try my english writing skills next weekend. And after that, suppose you allow me to translate your article to portuguese. :smile:

  • Aneglus

    actually translating the c code of the official gst documentation into python without knowing c nor glib (nor python) works pretty well too! thx

  • jono

    minkwe – that is great news! I am really pleased! :)

    Setanta – feel free to go ahead and translate the article. :)

  • flesse_bleu

    could you give examples with decodebin please :mrgreen: Thanks

  • flesse_bleu

    oops :oops: correction !!! could you give examples with bins and ghost pads please Thanks :smile:

  • jono

    flesse_bleu – when I get time I will see if I can do something. I still plan on making another tutorial about gnonlin.

    To sum up bins, though: you first create one:

    self.bin = gst.element_factory_make("bin", "mybin")

    Then create elements like normal and add them to the bin with self.bin.add() and link them like normal.

    To create the Ghost Pads, you need to do three things:

    (1) Grab the pads from the elements inside the bin that you want to map the ghost pads to. So if your elements in the bin are foo ! bar ! boogle, you would need to grab the sink pad from foo and the src pad from boogle. You can do this with get_pad() for each of those elements.

    (2) You then need to create the Ghost Pads. This is done with the following commands:

    sinkpad = gst.GhostPad("sink", foopad)
    srcpad = gst.GhostPad("src", booglepad)

    In the above lines, foopad and booglepad are the pads you grabbed from the elements.

    (3) Finally, add the Ghost Pads to the bin:

    self.bin.add_pad(sinkpad)
    self.bin.add_pad(srcpad)

    With that, your bin is complete. You can now add it where you need in a pipeline. :)

    Hope this helps. :)

  • flesse_bleu

    yes that helps me much thank you again. The goal of my small project is to combine various audio files (mp3, flac, mpc….) into a single ogg/mp3 file, but with a crossfade of 15 seconds between each. I don’t know which module of gstreamer I must use, so I am looking for a solution.

  • jono

    flesse_bleu – to do this I recommend you use gnonlin. Gnonlin will look after decoding the source files, and gnonlin includes a special GnlOperation that can be used to process bits of audio – such as a adding a fade to a particular portion. You would actually put a volume element inside a GnlOperation (a GnlOperation is a bin) and then you can use a GstControl to set the start and end point of the fade.

    I plan on writing up a quick Gnonlin tutorial over the next few weeks so stay tuned. :)

  • jono

    s/Gstcontrol/GstController :)

  • flesse_bleu

    :smile: yes yes yes :smile: I don’t move :lol:

  • Aneglus

    So it was about Gnonlin-love… :roll:

  • jono

    Aneglus – ?

  • flesse_bleu

    Hi Jono

    I have to look in dapper. I do not find GnlOperations

    gst-inspect-0.10 | grep gnl*
    gnonlin: gnlfilesource: GNonLin File Source
    gnonlin: gnlcomposition: GNonLin Composition
    gnonlin: gnlsource: GNonLin Source

  • jono

    flesse – yeah GnlOperations are a feature in the current developer version of Gnonlin and will be in the next official release. So, you need to build Gnonlin yourself from CVS to try them out.

  • BrentC

    I think maybe it’s time for jono to write “The Definitive Guide to Gstreamer” published by O’Reilly Corp. :lol:

    No, really…I’m serious

  • http://www.madinatek.com Khaled

    Hey Jono,

    nice work!!! I would like to add the article to mono-project.com with examples ported to C#. Is that ok with you?

    Thanks

  • jono

    Khaled – sure! Just add attribution. :)

  • http://www.openroadtrip.net Scott

    minkwe: textoverlay will overlay text and timeoverlay will overlay the time.

  • http://www.david-web.co.uk/ DJ

    Good tutorial, there’s certainly a serious lack of documentation related to using GStreamer in Python.

    Now what I really need is the same tutorial but dealing with video, specifically how to get it to be displayed within a GTK window/object. I’ve been trying to figure this out for months…

    And yes, Jono writing an O’Reilly guide to GStreamer is a very, very good idea!

  • http://www.david-web.co.uk/blog/?p=166 David-Web :: Blog » Video goodness with Python and GStreamer

    [...] I’ve been working on a project for some time which involves using Python and GStreamer to display some video. The lack of documentation has been a PITA and I’ve been trying for some time to figure out how you get GStreamer to display its video output within a GTK widget. Well today, I figured it out. Thanks to this blog post by Jono Bacon I learned a lot more about GStreamer which motivated me to start looking at the problem again. His post urges everyone who can to document what they’re doing in Python and GStreamer, so here’s some example code to display some video within a GTK widget. The code is based on Jono’s example for an audio player. #!/usr/bin/python import pygst pygst.require(”0.10″) import gst import pygtk import gtk [...]

  • LC

    Thank you, thank you, thank you… I had been looking for info o python-gstreamer for quite a while and this was the first thing that I actually found helpful.

    Now, I saw this the day you posted it (or maybe the next one), so if I were only writing to thank you, I’d have done so quite earlier. (Not that I didn’t want to do so, I’m just lazy, and believed you could live without having a total stranger thank you :P)

    So, what I wanted to ask to you (or to anyone else that reads this and knows the answer, of course ;) is:

    Is there a better/more-recommended way of seeing if a pad has certain caps than this:

    def on_oggdemux_new_pad(demuxer, pad):
        caps_str = str(pad.get_caps())
        if (caps_str.count("audio/x-vorbis")):
            demuxer.link(self.vorbisdec)
    
    self.oggdemux.connect("pad-added", on_oggdemux_new_pad)
    

    ?… I bet there is ;)

  • http://therning.org/magnus/archives/200 therning.org/ magnus » Interesting stuff

    [...] Thinking of writing a media app in Python? This seems like a good place to start. funny stuff, microsoft, stuff worth reading, trusted computing, vista [...]

  • http://www.john-hunt.com John

    Cheers Jono, absolutely fantastic intro to gstreamer python stuff. So easy I infact managed to make a cool siren sound with no python knowledge!! wooo woooo wooo woooo wooo etc. Brings back memories of BEEP on the zx speccy!

  • http://www.jonobacon.org/?p=810 jonobacon@home » GStreamer Dynamic Pads, Explained

    [...] (for an intro to GStreamer, see my previous article) [...]

  • http://www.john-hunt.com John Hunt

    Fantastic guide, even if you’re not looking to do anything musical, it’s a great intro to some basic ‘real-world’ python stuff. I like it a lot.

    Infact, it’s inspired me to do a lot of python. Python is blatantly cool.

  • http://e-ploko.livejournal.com/ E-ploko

    Wow. This is an amazing tutorial. I’d been sure before that the GStreamer Framework is a kewl thing, and your tutorial made me sure even more. %) I guess now I can finally try to get rid of the mecoder-sox-blah-blah-blah pipeline in my app, as that’s really hard to manage from the app…

  • jono

    Woo! :)

    I have my gnonlin one cued up too. :)

  • http://www.sheepeatingtaz.co.uk/blog/2006/11/28/we-demand-a-shubbery/ sheepeatingtaz » We Demand A Shubbery

    [...] … I have even tried little things in the past, like trying to learn through existing applications (for example: mrben’s Lugradio Mirroring Script) and had a look through the Jokosher code, and Jono Bacon’s Python/GStreamer tutorial. These however (especially Jokosher!) seemed like blindfolded extreme diving headfirst into python! If anyone can recommend a really good book, or online tutorial, preferably that they can recommend from experience, please leave a comment, as this is something I would really like to move forward on Participate! Leave your comment. [...]

  • John

    Now this is how a good startup tutorial should be written! Thanks! :grin:

  • http://www.jonobacon.org/?p=851 jonobacon@home » Using Gnonlin with GStreamer and Python

    [...] Right, lets dig in and write some code. Once again, I am going to make the assumption that you are familiar with using a Glade GUI (if you are not see this fantastic tutorial. You should also be familiar with the basics of GStreamer and Python – see my own tutorial for a primer on this. [...]

  • http://www.jejik.com/articles/2007/01/streaming_audio_over_tcp_with_python-gstreamer Lone Wolves – Web, game, and open source development

    [...] Anoher thing I wanted an excuse for was learning gstreamer, so naturally I picked python-gstreamer. Sadly there’s a huge lack of documentation for it. There are only a few good tutorials around, first and foremost Jona Bacon’s excellent tutorials [1][2][3]. I wanted my hack to work over TCP as well, because I have multiple music libraries. One on my home server, one on my desktop, a bit more on my laptop, etcetera. I want to be able to stream from other machines to my server which I’ll hook up to my stereo. I could not find any tutorials on using tcpserversource and tcpclientsink elements, so honoring Jona’s request that everyone “should write an article about something that you have discovered that isn’t particularly well documented”, here’s mine about tcpserversource and tcpclientsink. [...]

  • kthakore

    great tut, but I was wondering how I can set a property of a plugin in python. for example I am thinking this

    fileout = gst.element_factory_make(“filesink”, “sink”) fileout.location = “/home/Desktop/test.ogg”

    but it dosn’t seem to work

  • http://www.jonobacon.org/?p=989 jonobacon@home » Debugging Jokosher Guide

    [...] Getting started with GStreamer with Python [...]

  • http://ubuntu-linux.withishow.com/2007/06/11/jono-bacon-debugging-jokosher-guide/ Ubuntu | Jono Bacon: Debugging Jokosher Guide

    [...] Getting started with GStreamer with Python [...]

  • http://azulebanana.com/bluey/2007/06/12/mixing-no-linux-com-python-gtk2-e-gstreamer/ Mixing no Linux com Python, GTK2 e GStreamer at Liberdade na era tecnológica?

    [...] Graças ao esforço do Jono em incentivar quem pegue nesta matéria, estive a brincar com os exemplos dele, no Eclipse+PyDev e no Glade. [...]

  • http://dlai.jafu.dk dlai

    Hey Jono,

    I was planning on doing a small rhythmbox plugin… I’m trying to dump the audio stream of rhythmbox to a file… It seams to be relatively simple, using filesink.

    just like kthakore did it above

    fileout = gst.element_factory_make(”filesink”, “sink”) fileout.location = “/home/Desktop/test.ogg”

    but for some reason the ogg never gets made??

    And how do I grab the audio stream of rhythmbox?

  • http://pythonnoob.wordpress.com/2008/01/21/some-gstreamer-tips/ Some GStreamer Tips « The Python Noob

    [...] find a tutorial, a reference manual and some example programs. There is also a blog post called Getting started with GStreamer with Python, haven’t checked it out yet but it probably contains some useful [...]

  • http://www.tuxjournal.net/?p=2237 TuxJournal.net » Un’introduzione all’audio su Linux – PyAudio e PyGst con Gstreamer (5/6)

    [...] di interfaccia grafica. Per chi non conoscesse già Gstreamer una buona introduzione può essere l’articolo di Jono Bacon che descrive i lavori di Gstreamer e anche la libreria che si appoggia su di esso PyGst. Gstreamer [...]

  • vlkodlak

    Thanks for the tutorial. Exactly what I was looking for.

  • Neurostu

    Amazing, tutorial. I can’t tell you how long I spent on google trying to find some documentation about gstreamer in python… Luckily I found this tutorial. I’m amazed that you got this far without any documentation.

  • Keith Wright

    Good introduction! It would be nice to see the code to make the sound file play from the GUI, or even in the CLI in python scripts…:cool:

  • http://www.ettklickforskogen.se skorpan

    Thanks a bunch! Seriously, this is an awesome tutorial!

  • kab

    I have run some testing python scripts now and I love gstreamer. The only thing I have not found out, is how I can give gstreamer multi channel audio inputs. I have 6 audio files encoded with FLAC, on for every speaker.

    I like to play with the files from xiph.org, wich are very high quality: http://media.xiph.org/BBB/

    Regards

  • Ajay Gautam

    Can anybody tell that what is difference between Pipeline and Bin?

  • http://www.earobinson.org Edward A Robinson

    Download link for the full script is broken :(

  • http://codebad.com/ Donny Viszneki

    less’s -R option doesn’t always work very well because it doesn’t know how to account for the spacing of many control characters properly

  • http://www.chaz6.com Chaz6

    The file “gui.glade” needed for one of the examples seems to have gone awol. Any chance of restoring it? Thanks!

  • http://linux.leunen.com Michel Leunen

    Hi Jono, Thanks for this great startup tutorial. Just a little problem I want you to know. The links you provide to download the samples code just lead to nowhere (404 – not found).

  • suyash gogte

    hi i tried to run gstreamertutorial1.py which is frst tutorial or code , but after runnning it I couldn’t hear nay sound,I checked log file it shows evrything fine..can u plz tell me what can be d problem

  • varun aka samurai

    After running the code, u couldnt hear any sound ! ok… let me think… hmmmmmmm…….. yeah ! got it ! See, i m sure u havnt turn on ur speakers! Plz, put it on ! Then try ! If problem still exists, then call me – I can make plenty of different kinds of sounds with my mouth! Wish u all the best ! [:)]

  • http://ipass.wordpress.com/2009/09/11/%e0%b8%aa%e0%b8%99%e0%b8%b8%e0%b8%81%e0%b8%81%e0%b8%b1%e0%b8%9a-gstreamer/ สนุกกับ Gstreamer « iPAS’s Technical Note
  • http://www.hindsightlabs.com/blog/2010/02/09/pyclutter-video-tutorial/ PyClutter video tutorial « Hindsight Labs

    [...] libraries (Cairo, Box2D, GTK), it does so by thinly covering their APIs. Read this great article: Getting started with GStreamer and Python. I’m as ADD as a pug at Thanksgiving dinner, so it took me about an hour and a half to get [...]

  • http://ratiosoftware.com Clayton Gulick

    Dude, seriously, thanks. If you’re ever near Dallas, beers are on me.

  • http://www.cynapsys.de Hamadi MESSOUD

    hello, anyone knows the method to create dynamic pads? I use the mpegtsdemux plugin to demultiplex audio and video, so I need to create a dynamic pad to link the mpegtsdemux pad and the queues (queuevideo, queueaudio). Here is my pipeline, which I must code in python or C:

    gst-launch filesrc location=/home/hamadi/Bureau/test.ts ! mpegtsdemux name=demux program-number=12041 ! queue ! mpeg2dec ! ffmpegcolorspace ! xvimagesink demux. ! queue ! mad ! audioconvert ! audioresample ! alsasink

    thank you

  • Ravindran

    (gst-launch-0.10:6312): GLib-WARNING **: g_set_prgname() called multiple times
    Setting pipeline to PAUSED …
    Pipeline is PREROLLING …
    ERROR: from element /GstPipeline:pipeline0/GstDecodeBin:decodebin0: A text/html decoder plugin is required to play this stream, but not installed.
    Additional debug info:
    gstdecodebin.c(986): close_pad_link (): /GstPipeline:pipeline0/GstDecodeBin:decodebin0:
    No decoder to handle media type ‘text/html’
    ERROR: pipeline doesn’t want to preroll.
    Setting pipeline to NULL …
    Freeing pipeline …

    Returns an Error Like this. Can you help ?

  • http://codingweasel.blogspot.com/2006/12/talk-3-gstreamer-and-python.html The Coding Weasel: talk #3 – gstreamer and python

    [...] the talk a couple of posts by Jono Bacon that provide a good introduction – they can be found here and here.Labels: gstreamer, python, [...]

  • http://blog.kunalk.co.cc Kunal

    Probably you don’t have the ogg decoder. Try it with a mp3 file.

  • Sam

    Hello .. i am doing a project in building a media player using gstreamer.. I find this very very useful to start with .. do you have any tutorial or example for building a pipeline ..

    many thanx

  • http://squaregalaxy.com Jacob

    What if I wanted to display video rather than play audio?

  • ajaxhe

    ??? I want recording from microphone and use “tcpclientsink” to send the voice to remote computer. The problem is, how can I get the data from microphone? thanks!

  • ajaxhe

    fixed! using command line like: gst-launch-0.10 alsasrc ! audiorate ! wavenc ! filesink location=testsound.wav

  • Glebzykov

    Properties are set with the set_property() method. Try fileout.set_property("location", "/home/desktop/test.ogg")

  • Udayasri

    Thanks Jono ..It is awesome..Really helpful for me tounderstand GStreamer :)

  • Tb. Ifan Firmansyah

    Thank you for sharing and post this article man, good job and keep going ….

  • Miguisg

    Wonderfull guide! Congratulations! You explain easily what others can not!

  • Moeko Ramone

    I really enjoyed the tutorial and I’ll try to post one like this one in the near future.

  • hualet

    Thank you for always doing that for our rookies….

  • George

    This debugging stuff is magical. Set the envars and boom straight to stdout. Love it.

  • Mo

    Thank You, Extremely helpful. :)