From: Nikola Pavlica <pavlica.nikola@gmail.com>
Because some VMs in QEMU can get GPU virtualization (using technologies
such as iGVT-g, as mentioned previously), they can produce video output
with a higher refresh rate than the one the GTK display is running at
(e.g. playing a video game inside a Windows VM at 60 Hz while the
output stays locked at 33 Hz because of the defaults set in
include/ui/console.h).
QEMU does have an internal mechanism for determining frame times; it is
defined in ui/console.c. That code reads a variable called
update_interval and uses it for its timing calculations. This variable,
however, is never set anywhere in ui/gtk.c, so ui/console.c falls back
to GUI_REFRESH_INTERVAL_DEFAULT, which is 30.

update_interval is the number of milliseconds per display refresh, and
doing the math gives 1000 / 30 = 33.33... Hz.

This creates the problem mentioned above. What this patch does is query
the display refresh rate reported by GTK itself (which we can take as a
safe value) and convert it back to a number of milliseconds per display
refresh.
Signed-off-by: Nikola Pavlica <pavlica.nikola@gmail.com>
---
include/ui/gtk.h | 2 ++
ui/gtk.c | 10 ++++++++++
2 files changed, 12 insertions(+)
diff --git a/include/ui/gtk.h b/include/ui/gtk.h
index d9eedad976..d1b230848a 100644
--- a/include/ui/gtk.h
+++ b/include/ui/gtk.h
@@ -28,6 +28,8 @@
#include "ui/egl-context.h"
#endif
+#define MILLISEC_PER_SEC 1000000
+
typedef struct GtkDisplayState GtkDisplayState;
typedef struct VirtualGfxConsole {
diff --git a/ui/gtk.c b/ui/gtk.c
index 692ccc7bbb..eb0efc70ee 100644
--- a/ui/gtk.c
+++ b/ui/gtk.c
@@ -1966,6 +1966,10 @@ static GSList *gd_vc_gfx_init(GtkDisplayState *s, VirtualConsole *vc,
GSList *group, GtkWidget *view_menu)
{
bool zoom_to_fit = false;
+ int refresh_rate_millihz;
+ GdkDisplay *dpy = gtk_widget_get_display(s->window);
+ GdkWindow *win = gtk_widget_get_window(s->window);
+ GdkMonitor *monitor = gdk_display_get_monitor_at_window(dpy, win);
vc->label = qemu_console_get_label(con);
vc->s = s;
@@ -2026,6 +2030,12 @@ static GSList *gd_vc_gfx_init(GtkDisplayState *s, VirtualConsole *vc,
vc->gfx.kbd = qkbd_state_init(con);
vc->gfx.dcl.con = con;
+
+ refresh_rate_millihz = gdk_monitor_get_refresh_rate(monitor);
+ if (refresh_rate_millihz) {
+ vc->gfx.dcl.update_interval = MILLISEC_PER_SEC / refresh_rate_millihz;
+ }
+
register_displaychangelistener(&vc->gfx.dcl);
gd_connect_vc_gfx_signals(vc);
--
2.24.1
On 1/8/20 1:13 PM, pavlica.nikola@gmail.com wrote:
> From: Nikola Pavlica <pavlica.nikola@gmail.com>
> [...]
> Signed-off-by: Nikola Pavlica <pavlica.nikola@gmail.com>

Reviewed-by: Philippe Mathieu-Daudé <philmd@redhat.com>
On Wed, Jan 08, 2020 at 01:13:42PM +0100, pavlica.nikola@gmail.com wrote:
> From: Nikola Pavlica <pavlica.nikola@gmail.com>
> [...]
> Signed-off-by: Nikola Pavlica <pavlica.nikola@gmail.com>

Added to ui queue.

thanks,
  Gerd