Shared by 元心科技: A First Look at GStreamer (5) (GTK+)

A First Look at GStreamer (5)

(GTK+)

In this tutorial we show how to integrate GStreamer into a GTK+ application. Roughly speaking, GStreamer takes care of the media handling while GTK+ takes care of the user interaction. The most interesting parts are how GStreamer outputs video into a GTK+ window, and how GTK+ forwards user actions to GStreamer for control.

While following this tutorial, pay attention to the following points:

  1. How to tell GStreamer to output video to a particular window (instead of creating its own window)
  2. How to continuously refresh the GUI with information from GStreamer
  3. How to update the GUI from GStreamer's multiple threads
  4. How to subscribe only to the messages you are interested in, instead of being notified of all of them
Introduction

We are going to build a media player using GTK+ and GStreamer; some familiarity with GTK+ will help in understanding this tutorial.

The main point is getting GStreamer to output the video stream to a window we created with GTK+. The exact mechanism depends on the operating system (or, more precisely, on the windowing system), but GStreamer provides a platform-independent abstraction layer through the GstVideoOverlay interface, which lets the application tell the video sink the handle of the window that should receive the rendering.

GObject interfaces

A GObject interface (which GStreamer uses) is a set of functions that an element can implement; if it does, the element is said to support that particular interface. For example, video sinks usually create their own window to display the video, but if they are also capable of rendering into an external window, they can choose to implement the GstVideoOverlay interface and provide functions to specify that external window. From the application developer's point of view, if an element supports a given interface, you can use the interface directly without caring which element actually implements it. Moreover, playbin automatically exposes some of the interfaces implemented by its internal elements, so they can be used on playbin itself.
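As a quick illustration (not part of the original tutorial's code), the generic pattern is to check at runtime whether an element implements the interface before calling into it. The some_sink variable and the window_handle value below are assumptions made purely for the example:

/* Minimal sketch: only use the GstVideoOverlay interface if the element
 * actually implements it. 'some_sink' and 'window_handle' are hypothetical. */
if (GST_IS_VIDEO_OVERLAY (some_sink)) {
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (some_sink), window_handle);
} else {
  g_printerr ("This sink cannot render into an external window.\n");
}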

We also need to keep GStreamer's multithreading in mind. GUI toolkits usually only allow manipulating the graphical "widgets" from the main (or application) thread, whereas GStreamer usually spawns multiple threads to take care of different tasks. Calling GTK+ functions from inside a callback will usually fail, because the callback executes in the calling thread, which does not have to be the main one. This problem can be solved by posting a message on the GStreamer bus from the callback: the main thread will receive the message and react accordingly.

In the previous tutorial we registered a handle_message function that was called every time a message appeared on the bus, which forced us to parse every single message. In this tutorial a different method is used: a callback is registered for each type of message we care about, so there is less parsing and less code overall.

basic-tutorial-5.c

#include <string.h>

#include <gtk/gtk.h>
#include <gst/gst.h>
#include <gst/video/videooverlay.h>

#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif

/* Structure to contain all our information, so we can pass it around */
typedef struct _CustomData {
    
  GstElement *playbin;           /* Our one and only pipeline */

  GtkWidget *slider;              /* Slider widget to keep track of current position */
  GtkWidget *streams_list;        /* Text widget to display info about the streams */
  gulong slider_update_signal_id; /* Signal ID for the slider update signal */

  GstState state;                 /* Current state of the pipeline */
  gint64 duration;                /* Duration of the clip, in nanoseconds */
} CustomData;

/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the VideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
    
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstVideoOverlay!");

  /* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif
  /* Pass it to playbin, which implements VideoOverlay and will forward it to the video sink */
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}

/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
    
  gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
    
  gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
    
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
    
  stop_cb (NULL, data);
  gtk_main_quit ();
}

/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
    
  if (data->state < GST_STATE_PAUSED) {
    
    GtkAllocation allocation;

    /* Cairo is a 2D graphics library which we use here to clean the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
  }

  return FALSE;
}

/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
    
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64)(value * GST_SECOND));
}

/* This creates all the GTK+ widgets that compose our application, and registers the callbacks */
static void create_ui (CustomData *data) {
    
  GtkWidget *main_window;  /* The uppermost window, containing all other windows */
  GtkWidget *video_window; /* The drawing area where the video will be shown */
  GtkWidget *main_box;     /* VBox to hold main_hbox and the controls */
  GtkWidget *main_hbox;    /* HBox to hold the video_window and the stream info text widget */
  GtkWidget *controls;     /* HBox to hold the buttons and the slider */
  GtkWidget *play_button, *pause_button, *stop_button; /* Buttons */

  main_window = gtk_window_new (GTK_WINDOW_TOPLEVEL);
  g_signal_connect (G_OBJECT (main_window), "delete-event", G_CALLBACK (delete_event_cb), data);

  video_window = gtk_drawing_area_new ();
  gtk_widget_set_double_buffered (video_window, FALSE);
  g_signal_connect (video_window, "realize", G_CALLBACK (realize_cb), data);
  g_signal_connect (video_window, "draw", G_CALLBACK (draw_cb), data);

  play_button = gtk_button_new_from_icon_name ("media-playback-start", GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (play_button), "clicked", G_CALLBACK (play_cb), data);

  pause_button = gtk_button_new_from_icon_name ("media-playback-pause", GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (pause_button), "clicked", G_CALLBACK (pause_cb), data);

  stop_button = gtk_button_new_from_icon_name ("media-playback-stop", GTK_ICON_SIZE_SMALL_TOOLBAR);
  g_signal_connect (G_OBJECT (stop_button), "clicked", G_CALLBACK (stop_cb), data);

  data->slider = gtk_scale_new_with_range (GTK_ORIENTATION_HORIZONTAL, 0, 100, 1);
  gtk_scale_set_draw_value (GTK_SCALE (data->slider), 0);
  data->slider_update_signal_id = g_signal_connect (G_OBJECT (data->slider), "value-changed", G_CALLBACK (slider_cb), data);

  data->streams_list = gtk_text_view_new ();
  gtk_text_view_set_editable (GTK_TEXT_VIEW (data->streams_list), FALSE);

  controls = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
  gtk_box_pack_start (GTK_BOX (controls), play_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), pause_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), stop_button, FALSE, FALSE, 2);
  gtk_box_pack_start (GTK_BOX (controls), data->slider, TRUE, TRUE, 2);

  main_hbox = gtk_box_new (GTK_ORIENTATION_HORIZONTAL, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), video_window, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_hbox), data->streams_list, FALSE, FALSE, 2);

  main_box = gtk_box_new (GTK_ORIENTATION_VERTICAL, 0);
  gtk_box_pack_start (GTK_BOX (main_box), main_hbox, TRUE, TRUE, 0);
  gtk_box_pack_start (GTK_BOX (main_box), controls, FALSE, FALSE, 0);
  gtk_container_add (GTK_CONTAINER (main_window), main_box);
  gtk_window_set_default_size (GTK_WINDOW (main_window), 640, 480);

  gtk_widget_show_all (main_window);
}

/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
    
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;

  /* If we didn't know it yet, query the stream duration */
  if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    
    if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
    
      g_printerr ("Could not query current duration.\n");
    } else {
    
      /* Set the range of the slider to the clip duration, in SECONDS */
      gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
    }
  }

  if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
    
    /* Block the "value-changed" signal, so the slider_cb function is not called
     * (which would trigger a seek the user has not requested) */
    g_signal_handler_block (data->slider, data->slider_update_signal_id);
    /* Set the position of the slider to the current pipeline position, in SECONDS */
    gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
    /* Re-enable the signal */
    g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
  }
  return TRUE;
}

/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
    
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin,
    gst_message_new_application (GST_OBJECT (playbin),
      gst_structure_new_empty ("tags-changed")));
}

/* This function is called when an error message is posted on the bus */
static void error_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    
  GError *err;
  gchar *debug_info;

  /* Print error details on the screen */
  gst_message_parse_error (msg, &err, &debug_info);
  g_printerr ("Error received from element %s: %s\n", GST_OBJECT_NAME (msg->src), err->message);
  g_printerr ("Debugging information: %s\n", debug_info ? debug_info : "none");
  g_clear_error (&err);
  g_free (debug_info);

  /* Set the pipeline to READY (which stops playback) */
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when an End-Of-Stream message is posted on the bus.
 * We just set the pipeline to READY (which stops playback) */
static void eos_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    
  g_print ("End-Of-Stream reached.\n");
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

/* This function is called when the pipeline changes states. We use it to
 * keep track of the current state. */
static void state_changed_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    
  GstState old_state, new_state, pending_state;
  gst_message_parse_state_changed (msg, &old_state, &new_state, &pending_state);
  if (GST_MESSAGE_SRC (msg) == GST_OBJECT (data->playbin)) {
    
    data->state = new_state;
    g_print ("State set to %s\n", gst_element_state_get_name (new_state));
    if (old_state == GST_STATE_READY && new_state == GST_STATE_PAUSED) {
    
      /* For extra responsiveness, we refresh the GUI as soon as we reach the PAUSED state */
      refresh_ui (data);
    }
  }
}

/* Extract metadata from all the streams and write it to the text widget in the GUI */
static void analyze_streams (CustomData *data) {
    
  gint i;
  GstTagList *tags;
  gchar *str, *total_str;
  guint rate;
  gint n_video, n_audio, n_text;
  GtkTextBuffer *text;

  /* Clean current contents of the widget */
  text = gtk_text_view_get_buffer (GTK_TEXT_VIEW (data->streams_list));
  gtk_text_buffer_set_text (text, "", -1);

  /* Read some properties */
  g_object_get (data->playbin, "n-video", &n_video, NULL);
  g_object_get (data->playbin, "n-audio", &n_audio, NULL);
  g_object_get (data->playbin, "n-text", &n_text, NULL);

  for (i = 0; i < n_video; i++) {
    
    tags = NULL;
    /* Retrieve the stream's video tags */
    g_signal_emit_by_name (data->playbin, "get-video-tags", i, &tags);
    if (tags) {
    
      total_str = g_strdup_printf ("video stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      gst_tag_list_get_string (tags, GST_TAG_VIDEO_CODEC, &str);
      total_str = g_strdup_printf ("  codec: %s\n", str ? str : "unknown");
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      g_free (str);
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_audio; i++) {
    
    tags = NULL;
    /* Retrieve the stream's audio tags */
    g_signal_emit_by_name (data->playbin, "get-audio-tags", i, &tags);
    if (tags) {
    
      total_str = g_strdup_printf ("\naudio stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_AUDIO_CODEC, &str)) {
    
        total_str = g_strdup_printf ("  codec: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
    
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      if (gst_tag_list_get_uint (tags, GST_TAG_BITRATE, &rate)) {
    
        total_str = g_strdup_printf ("  bitrate: %d\n", rate);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
      }
      gst_tag_list_free (tags);
    }
  }

  for (i = 0; i < n_text; i++) {
    
    tags = NULL;
    /* Retrieve the stream's subtitle tags */
    g_signal_emit_by_name (data->playbin, "get-text-tags", i, &tags);
    if (tags) {
    
      total_str = g_strdup_printf ("\nsubtitle stream %d:\n", i);
      gtk_text_buffer_insert_at_cursor (text, total_str, -1);
      g_free (total_str);
      if (gst_tag_list_get_string (tags, GST_TAG_LANGUAGE_CODE, &str)) {
    
        total_str = g_strdup_printf ("  language: %s\n", str);
        gtk_text_buffer_insert_at_cursor (text, total_str, -1);
        g_free (total_str);
        g_free (str);
      }
      gst_tag_list_free (tags);
    }
  }
}

/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    
  if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
    
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}

int main(int argc, char *argv[]) {
    
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);

  /* Connect to interesting signals in playbin */
  g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);

  /* Create the GUI */
  create_ui (&data);

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
  gst_object_unref (bus);

  /* Start playing */
  ret = gst_element_set_state (data.playbin, GST_STATE_PLAYING);
  if (ret == GST_STATE_CHANGE_FAILURE) {
    
    g_printerr ("Unable to set the pipeline to the playing state.\n");
    gst_object_unref (data.playbin);
    return -1;
  }

  /* Register a function that GLib will call every second */
  g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);

  /* Start the GTK main loop. We will not regain control until gtk_main_quit is called. */
  gtk_main ();

  /* Free resources */
  gst_element_set_state (data.playbin, GST_STATE_NULL);
  gst_object_unref (data.playbin);
  return 0;
}

Compile the code with gcc basic-tutorial-5.c -o basic-tutorial-5 `pkg-config --cflags --libs gstreamer-video-1.0 gtk+-3.0 gstreamer-1.0` and run it to test.

The example opens a GTK+ window and displays a movie with its accompanying audio. The media is fetched from the Internet, so the window might take a few seconds to appear, depending on your connection speed. The window has some GTK+ buttons to Play, Pause and Stop the movie, and a slider that shows the current position of the stream, which can be dragged to change it. Also, information about the streams is shown in a column at the right edge of the window.

There is no latency management (buffering), so on slow connections the movie might stop after a few seconds. Later tutorials address this issue.

Regarding the structure of this tutorial: functions are defined before they are used. Also, for clarity of explanation, the order in which the code snippets are presented does not always match the order of the program.

#include <gdk/gdk.h>
#if defined (GDK_WINDOWING_X11)
#include <gdk/gdkx.h>
#elif defined (GDK_WINDOWING_WIN32)
#include <gdk/gdkwin32.h>
#elif defined (GDK_WINDOWING_QUARTZ)
#include <gdk/gdkquartz.h>
#endif

First, GTK+ adapts to the different windowing systems: we include the GDK header that matches the windowing system we are going to use. Not many are supported here: X11 for Linux, Win32 for Windows and Quartz for Mac OS X.

This tutorial is composed mostly of callback functions, which will be called from GStreamer or GTK+, so let's start with main, which registers all of these callbacks.

int main(int argc, char *argv[]) {
    
  CustomData data;
  GstStateChangeReturn ret;
  GstBus *bus;

  /* Initialize GTK */
  gtk_init (&argc, &argv);

  /* Initialize GStreamer */
  gst_init (&argc, &argv);

  /* Initialize our data structure */
  memset (&data, 0, sizeof (data));
  data.duration = GST_CLOCK_TIME_NONE;

  /* Create the elements */
  data.playbin = gst_element_factory_make ("playbin", "playbin");

  if (!data.playbin) {
    
    g_printerr ("Not all elements could be created.\n");
    return -1;
  }

  /* Set the URI to play */
  g_object_set (data.playbin, "uri", "https://www.freedesktop.org/software/gstreamer-sdk/data/media/sintel_trailer-480p.webm", NULL);

Nothing new here: standard GStreamer initialization and playbin pipeline creation, plus GTK+ initialization.

 /* Connect to interesting signals in playbin */
  g_signal_connect (G_OBJECT (data.playbin), "video-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "audio-tags-changed", (GCallback) tags_cb, &data);
  g_signal_connect (G_OBJECT (data.playbin), "text-tags-changed", (GCallback) tags_cb, &data);

We want to be notified whenever new tags (metadata) appear on the stream. For simplicity, all kinds of tags (video, audio and text) are handled from the same callback, tags_cb.

/* Create the GUI */
create_ui (&data);

All GTK+ widget creation and signal registration happens in this function. It contains only GTK-related calls, so we skip over its definition. The signals it registers convey user commands, as shown below when reviewing the callbacks.

/* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (data.playbin);
  gst_bus_add_signal_watch (bus);
  g_signal_connect (G_OBJECT (bus), "message::error", (GCallback)error_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::state-changed", (GCallback)state_changed_cb, &data);
  g_signal_connect (G_OBJECT (bus), "message::application", (GCallback)application_cb, &data);
  gst_object_unref (bus);

Calling gst_bus_add_signal_watch() instructs the bus to emit a signal every time it receives a message. This signal has the name message::detail, where detail is the message that triggered the signal emission. For example, when the bus receives an EOS message, it emits a signal with the name message::eos.

For bus watches to work (either gst_bus_add_watch() or gst_bus_add_signal_watch()), a GLib main loop must be running.

In this case we use the GTK+ main loop, which, under the hood, wraps the GLib main loop.
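For reference (this is not part of the tutorial's code), a program without GTK+ would have to run a GLib main loop itself for the watch to fire; a minimal sketch, reusing the bus and data variables from above:

/* Sketch: driving the bus signal watch with a plain GLib main loop (no GTK+) */
GMainLoop *loop = g_main_loop_new (NULL, FALSE);
gst_bus_add_signal_watch (bus);
g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback)eos_cb, &data);
g_main_loop_run (loop);    /* blocks here; callbacks are dispatched from this loop */
g_main_loop_unref (loop);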

/* Register a function that GLib will call every second */
g_timeout_add_seconds (1, (GSourceFunc)refresh_ui, &data);

Before transferring control to GTK+, we use g_timeout_add_seconds() to register yet another callback, this time with a timeout, so it gets called every second. We will use it to refresh the GUI from the refresh_ui function.

After this, we are done with the setup and can start the GTK+ main loop. We will regain control from our callbacks when interesting things happen.

/* This function is called when the GUI toolkit creates the physical window that will hold the video.
 * At this point we can retrieve its handler (which has a different meaning depending on the windowing system)
 * and pass it to GStreamer through the VideoOverlay interface. */
static void realize_cb (GtkWidget *widget, CustomData *data) {
    
  GdkWindow *window = gtk_widget_get_window (widget);
  guintptr window_handle;

  if (!gdk_window_ensure_native (window))
    g_error ("Couldn't create native window needed for GstVideoOverlay!");

  /* Retrieve window handler from GDK */
#if defined (GDK_WINDOWING_WIN32)
  window_handle = (guintptr)GDK_WINDOW_HWND (window);
#elif defined (GDK_WINDOWING_QUARTZ)
  window_handle = gdk_quartz_window_get_nsview (window);
#elif defined (GDK_WINDOWING_X11)
  window_handle = GDK_WINDOW_XID (window);
#endif
  /* Pass it to playbin, which implements VideoOverlay and will forward it to the video sink */
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (data->playbin), window_handle);
}

Here we retrieve the handle of the window where GStreamer should render the video (an XID on X11, an HWND on Windows, or an NSView on Quartz). We simply fetch it from the windowing system and pass it to playbin through the GstVideoOverlay interface with gst_video_overlay_set_window_handle(). playbin will locate the video sink and forward the handle to it, so the sink does not create its own window and uses this one instead.

playbin and GstVideoOverlay greatly simplify this process!
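For a pipeline that is not built around playbin, the overlay-capable element has to be located by hand. A minimal sketch (not part of the original tutorial), where the pipeline and window_handle variables are assumed to exist in the surrounding code:

/* Sketch: find whichever element in a custom pipeline implements GstVideoOverlay
 * and hand it the native window handle. */
GstElement *overlay = gst_bin_get_by_interface (GST_BIN (pipeline), GST_TYPE_VIDEO_OVERLAY);
if (overlay) {
  gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (overlay), window_handle);
  gst_object_unref (overlay);  /* gst_bin_get_by_interface() returns a new reference */
}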

/* This function is called when the PLAY button is clicked */
static void play_cb (GtkButton *button, CustomData *data) {
    
  gst_element_set_state (data->playbin, GST_STATE_PLAYING);
}

/* This function is called when the PAUSE button is clicked */
static void pause_cb (GtkButton *button, CustomData *data) {
    
  gst_element_set_state (data->playbin, GST_STATE_PAUSED);
}

/* This function is called when the STOP button is clicked */
static void stop_cb (GtkButton *button, CustomData *data) {
    
  gst_element_set_state (data->playbin, GST_STATE_READY);
}

These three callbacks are associated with the PLAY, PAUSE and STOP buttons in the GUI. They simply set the pipeline to the corresponding state. Note that in the STOP callback we set the pipeline to READY. We could have gone all the way down to NULL, but the transition back would then be slower, because some resources (such as the audio device) would need to be released and re-acquired.

/* This function is called when the main window is closed */
static void delete_event_cb (GtkWidget *widget, GdkEvent *event, CustomData *data) {
    
  stop_cb (NULL, data);
  gtk_main_quit ();
}

gtk_main_quit() will eventually make the gtk_main() call in main return, which, in this case, finishes the program. We call it here when the main window is closed, after stopping the pipeline.

/* This function is called every time the video window needs to be redrawn (due to damage/exposure,
 * rescaling, etc). GStreamer takes care of this in the PAUSED and PLAYING states, otherwise,
 * we simply draw a black rectangle to avoid garbage showing up. */
static gboolean draw_cb (GtkWidget *widget, cairo_t *cr, CustomData *data) {
    
  if (data->state < GST_STATE_PAUSED) {
    
    GtkAllocation allocation;

    /* Cairo is a 2D graphics library which we use here to clean the video window.
     * It is used by GStreamer for other reasons, so it will always be available to us. */
    gtk_widget_get_allocation (widget, &allocation);
    cairo_set_source_rgb (cr, 0, 0, 0);
    cairo_rectangle (cr, 0, 0, allocation.width, allocation.height);
    cairo_fill (cr);
  }

  return FALSE;
}

When there is data flow (in the PAUSED and PLAYING states), the video sink takes care of refreshing the content of the video window. In the other cases, however, it does not, so we have to do it ourselves. In this example we just fill the window with a black rectangle.

/* This function is called when the slider changes its position. We perform a seek to the
 * new position here. */
static void slider_cb (GtkRange *range, CustomData *data) {
    
  gdouble value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME, GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT,
      (gint64)(value * GST_SECOND));
}

Here GStreamer and GTK+ collaborate to keep the slider and the media stream in sync. When the slider is dragged to a new position, we tell GStreamer to seek to that position with gst_element_seek_simple().

It is worth mentioning that some performance (and responsiveness) can be gained with a little throttling, that is, by not responding to every single user request. Since a seek operation necessarily takes some time, it is often better to wait, say, half a second after a seek before allowing another one. Otherwise the application might look unresponsive if the user drags the slider frantically.
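As a hedged sketch of such throttling (this is not in the original code), slider_cb could remember when it last issued a seek and silently drop requests that arrive within half a second of it; the static variable is used here only for brevity, in real code it would live in CustomData:

/* Sketch: drop seek requests arriving less than 500 ms after the previous one. */
static void slider_cb (GtkRange *range, CustomData *data) {
  static gint64 last_seek = 0;              /* microseconds, monotonic clock */
  gint64 now = g_get_monotonic_time ();
  gdouble value;

  if (now - last_seek < 500 * G_TIME_SPAN_MILLISECOND)
    return;                                 /* too soon after the last seek, ignore */
  last_seek = now;

  value = gtk_range_get_value (GTK_RANGE (data->slider));
  gst_element_seek_simple (data->playbin, GST_FORMAT_TIME,
      GST_SEEK_FLAG_FLUSH | GST_SEEK_FLAG_KEY_UNIT, (gint64)(value * GST_SECOND));
}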

/* This function is called periodically to refresh the GUI */
static gboolean refresh_ui (CustomData *data) {
    
  gint64 current = -1;

  /* We do not want to update anything unless we are in the PAUSED or PLAYING states */
  if (data->state < GST_STATE_PAUSED)
    return TRUE;

This function will move the slider to reflect the current position of the media. First of all, if we are not in the PAUSED or PLAYING state there is nothing to do here (plus, position and duration queries will normally fail).

/* If we didn't know it yet, query the stream duration */
if (!GST_CLOCK_TIME_IS_VALID (data->duration)) {
    
  if (!gst_element_query_duration (data->playbin, GST_FORMAT_TIME, &data->duration)) {
    
    g_printerr ("Could not query current duration.\n");
  } else {
    
    /* Set the range of the slider to the clip duration, in SECONDS */
    gtk_range_set_range (GTK_RANGE (data->slider), 0, (gdouble)data->duration / GST_SECOND);
  }
}

If we do not know the duration of the stream yet, we query it and use it to set the range of the slider, in seconds.

if (gst_element_query_position (data->playbin, GST_FORMAT_TIME, &current)) {
    
  /* Block the "value-changed" signal, so the slider_cb function is not called
   * (which would trigger a seek the user has not requested) */
  g_signal_handler_block (data->slider, data->slider_update_signal_id);
  /* Set the position of the slider to the current pipeline position, in SECONDS */
  gtk_range_set_value (GTK_RANGE (data->slider), (gdouble)current / GST_SECOND);
  /* Re-enable the signal */
  g_signal_handler_unblock (data->slider, data->slider_update_signal_id);
}
return TRUE;

We query the current pipeline position and set the slider position accordingly. This would normally trigger the emission of the value-changed signal, which we use to know when the user is dragging the slider. Since we do not want the playback position to change unless the user requested it, we disable the value-changed signal emission during this operation with g_signal_handler_block() and g_signal_handler_unblock().

Returning TRUE from this function keeps it scheduled to be called again in the future. If we returned FALSE, the timer would be removed.

/* This function is called when new metadata is discovered in the stream */
static void tags_cb (GstElement *playbin, gint stream, CustomData *data) {
    
  /* We are possibly in a GStreamer working thread, so we notify the main
   * thread of this event through a message in the bus */
  gst_element_post_message (playbin,
    gst_message_new_application (GST_OBJECT (playbin),
      gst_structure_new_empty ("tags-changed")));
}

This is one of the key points of this tutorial. This function is called when new tags are found in the media, from a streaming thread, that is, from a thread other than the application (or main) thread. What we want to do here is update a GTK+ widget to reflect the new information, but GTK+ does not allow operating from threads other than the main one.

The solution is to make playbin post a message on the bus and return to the calling thread. When appropriate, the main thread will pick up this message and update GTK+.

gst_element_post_message() makes a GStreamer element post the given message to the bus. gst_message_new_application() creates a new message of the APPLICATION type. GStreamer messages have different types, and this particular one is reserved to the application: it goes through the bus untouched by GStreamer. The list of types can be found in the GstMessageType documentation.

Messages can carry additional information inside their embedded GstStructure, which is a very flexible data container. Here we create a new, empty structure with gst_structure_new_empty() and name it tags-changed, to avoid confusion in case we later want to send other application messages.
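If extra data needed to travel with the message (the original code does not do this), the structure could carry named fields instead of being empty. A minimal sketch, reusing the stream argument of tags_cb on the posting side and the msg argument of application_cb on the receiving side:

/* Sketch: attach a field when posting the application message... */
gst_element_post_message (playbin,
    gst_message_new_application (GST_OBJECT (playbin),
        gst_structure_new ("tags-changed",
            "stream", G_TYPE_INT, stream,
            NULL)));

/* ...and read it back inside application_cb */
gint which_stream = -1;
gst_structure_get_int (gst_message_get_structure (msg), "stream", &which_stream);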

Later, once in the main thread, the bus will receive this message and emit the message::application signal, which we have associated with the application_cb function:

/* This function is called when an "application" message is posted on the bus.
 * Here we retrieve the message posted by the tags_cb callback */
static void application_cb (GstBus *bus, GstMessage *msg, CustomData *data) {
    
  if (g_strcmp0 (gst_structure_get_name (gst_message_get_structure (msg)), "tags-changed") == 0) {
    
    /* If the message is the "tags-changed" (only one we are currently issuing), update
     * the stream info GUI */
    analyze_streams (data);
  }
}

Once we are sure the message is the tags-changed one, we call the analyze_streams function, which basically recovers the tags from the stream and writes them into the text widget in the GUI.

The error_cb, eos_cb and state_changed_cb callbacks are not really worth explaining, since they do the same as in all previous tutorials, only now they live in their own functions.

And that is it! The amount of code in this tutorial might seem daunting, but the required concepts are few and simple.

Summary

This tutorial has shown:

How to output video to a particular window handle using gst_video_overlay_set_window_handle().

How to refresh the GUI periodically by registering a timeout callback with g_timeout_add_seconds().

How to convey information to the main thread by posting application messages on the bus with gst_element_post_message().

How to be notified only of the messages we are interested in, by making the bus emit signals with gst_bus_add_signal_watch() and using the signal detail to tell the message types apart.

(This article is a translation of the official tutorial, shared in the spirit of exchanging technical ideas; corrections are welcome.)

Copyright notice: this is an original article by the blogger, released under the CC 4.0 BY-SA license. Please include the original source link and this notice when reposting.
Original link: https://blog.csdn.net/m0_49214895/article/details/110127270
