33
votes

Context: Writing a drawing program that's supposed to be cross-platform. So I have multiple frontends responsible for providing a backend class access to a Cairo context, basic event handling, and widget size information. I recently decided to add the current UI scale (for high-DPI displays) to that last bit, primarily because I want to switch graphics to being rendered into a tile cache, so I need to know the highest level of detail the display can support.

In my little cross-platform world I expect the front-end adapter for the backend class I'm talking about to properly configure the Cairo context to work in virtualized pixels before handing it over to me. I just need the scale to limit how much scaling I apply to my tiles.

On AppKit this was easy: ask NSView to scale an NSSize of 1 virtual pixel into "backing store coordinates", and hand that to the backend class. Apple was also smart enough to provide a pre-scaled CoreGraphics context so all I have to do is request flipped coordinates and shove the CGContext into Cairo.

On GTK I'm a bit more confused. There seem to be way too many ways to do this. GtkWidget has a scale-factor getter, gint gtk_widget_get_scale_factor (GtkWidget *widget), but it returns an integer. That seems too restrictive, because you can't handle screens that fall in between. E.g. the 28" 3840x2160 monitor I've been eyeing is supposed to be a 1.5x monitor, but under GTK everything will be either too small or too big.
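One workaround I'm considering, since gtk_widget_get_scale_factor() only returns integers: combine it with the font DPI the screen reports via gdk_screen_get_resolution(), which desktop environments often set to communicate text scaling. A minimal sketch of that combination — combine_scale() is a hypothetical helper name of my own, and the 96-DPI baseline is an assumption:

```c
/* Hypothetical helper: combine GTK's integer device scale with the
 * font DPI reported for the screen into one fractional UI scale.
 * A caller would feed it gtk_widget_get_scale_factor(widget) and
 * gdk_screen_get_resolution(gtk_widget_get_screen(widget)). */
static double combine_scale(int widget_scale, double screen_dpi)
{
    /* 96 DPI is the conventional 1.0x baseline on X11 desktops
       (an assumption; gdk_screen_get_resolution returns -1 if unset). */
    double font_scale = (screen_dpi > 0.0) ? screen_dpi / 96.0 : 1.0;
    return (double) widget_scale * font_scale;
}
```

For example, combine_scale(1, 144.0) gives 1.5, which is exactly the factor I'd want for that 28" monitor.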

Ubuntu throws its own wrench in the works too, because it has its own DPI scale factor which seems to be different from everything else, and only a few applications actually seem to support it. I turned it up to 1.5: Firefox and Brackets didn't scale, Nautilus and the terminal did, and Empathy does this weird thing where everything but the conversation text scales. So it's clearly not system-wide, or even built into GTK... blarrgh.

Also, X11 has its own way of reporting DPI information, which I've heard is hilariously inaccurate and not worth thinking about, since Mir and Wayland are going to replace it anyway.

I can't find any information about this super-special-awesome Unity API for getting the user's specified UI scale, though. So I'm going to grab GTK's integer scale for now, which I don't think is adequate, but whatever. I'd really like to know how to grab the Unity scale parameter (or any other desktop environment's proprietary UI-scale setting), but I have a feeling I'll also have to ship a custom preference for configuring UI scaling manually in this particular frontend.
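Until I find that Unity API, one stopgap I'm considering is respecting the GDK_SCALE / GDK_DPI_SCALE environment variables when the user has exported them, since recent GTK honours those itself. A sketch, assuming a multiplicative combination — env_ui_scale() is a made-up name:

```c
#include <stdlib.h>

/* Hypothetical fallback: multiply GDK's integer scale and fractional
 * DPI scale taken from the environment, defaulting to 1.0 when unset
 * or unparsable. */
static double env_ui_scale(void)
{
    double scale = 1.0;
    const char *s = getenv("GDK_SCALE");      /* integer device scale */
    const char *d = getenv("GDK_DPI_SCALE");  /* fractional correction */
    if (s && atof(s) > 0.0) scale *= atof(s);
    if (d && atof(d) > 0.0) scale *= atof(d);
    return scale;
}
```

With GDK_SCALE=2 and GDK_DPI_SCALE=0.75 exported, this yields 1.5.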

(For those curious: my code is available here - the backend class I'm talking about is in src/canvasview.cpp, and examples of front-end "view adapters" are in frontends/gtk/src/CanvasWidget.cpp and frontends/appkit/src/ICAKCanvasView.m. Keep in mind that the code is currently in a broken state, as I am busy debugging tile-based rendering.)

2
Gtk+ 3.x only recently got full framework support for high-DPI devices (in an unstable release!). What version do you use? – drahnr
Officially added in 3.10, but you need a very fresh Cairo to make it work: mail.gnome.org/archives/gtk-list/2013-September/msg00021.html – drahnr
@Elle Please stop. All of your latest edits have been harmful in some way - you think you're improving grammar, but you are introducing subtle errors instead. I have rolled back edits which harmed posts and notified the moderators. If you wish, I can explain what was wrong with them. – Xan

2 Answers

4
votes

I haven't seen a DPI getter in GTK, but with a few lines of code borrowed from open-source projects and some changes, I usually ask the X server with this code snippet for the DPI values in the x and y directions. For the display name you can pass the value you want via argv[].

If you name it 'getdpi.c', compile it with:

gcc -Wall -std=c99  -o getdpi getdpi.c -lX11

If it helps you, a vote would be appreciated. :-)

#include <X11/Xlib.h>

#include <stdbool.h>
#include <stdio.h>
#include <stdlib.h>

/* Return the DPI of screen scr, in the x (xRes == true) or y direction. */
static int getDpi(Display *dpy, int scr, bool xRes)
{
    /*
     * An inch is 25.4 millimeters, so:
     * dpi = N pixels / (M millimeters / (25.4 millimeters / 1 inch))
     *     = N pixels / (M inch / 25.4)
     *     = N * 25.4 pixels / M inch
     */
    double res = xRes ? (((double) DisplayWidth(dpy, scr)) * 25.4) / ((double) DisplayWidthMM(dpy, scr))
                      : (((double) DisplayHeight(dpy, scr)) * 25.4) / ((double) DisplayHeightMM(dpy, scr));

    return (int) (res + 0.5);   /* round to the nearest integer */
}

int main(int argc, char *argv[])
{
    /* Use argv[1] as the display name if given; NULL falls back to $DISPLAY. */
    char *displayname = (argc > 1) ? argv[1] : NULL;

    Display *dpy = XOpenDisplay(displayname);   /* X connection */
    if (!dpy)
    {
        fprintf(stderr, "xdpi: unable to open display \"%s\".\n",
                XDisplayName(displayname));
        exit(1);
    }

    for (int i = 0; i < ScreenCount(dpy); ++i)
        printf("Xdpi: %d, Ydpi: %d\n",
               getDpi(dpy, i, true),
               getDpi(dpy, i, false));

    XCloseDisplay(dpy);
    return 0;
}
1
votes

export GDK_DPI_SCALE=1.5; export GDK_SCALE=1; run_your_application_to_be_scaled

What's more, you can create a dedicated launcher for specific applications, setting their own scale factor :-)
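As a concrete sketch of such a launcher, here's a tiny POSIX-shell wrapper function (run_scaled is a made-up name, and the 1.5 factor is just an example):

```shell
# Hypothetical wrapper: run any program with GTK fractional scaling.
# GDK_SCALE stays at 1 (the integer device scale); GDK_DPI_SCALE
# supplies the fractional part. The variables apply only to the
# launched process, not the whole session.
run_scaled() {
    GDK_SCALE=1 GDK_DPI_SCALE=1.5 "$@"
}

# Example: run_scaled gedit
```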

For Qt these are the variables:

export QT_AUTO_SCREEN_SCALE_FACTOR=0
export QT_SCALE_FACTOR=1
export QT_FONT_DPI=128

For EFL / Enlightenment these are the variables:

export ELM_SCALE=1.5