I've got an app that programmatically moves its window around the user's screen. My problem:
- User has two physical monitors
- User starts app in primary monitor
- App moves window, causing more of it to "overflow" into the secondary monitor than is on the primary monitor
- This causes the app's window to entirely jump to that secondary monitor and disappear from the primary monitor.
I would like the window to stay on the primary monitor unless I deliberately send it to the secondary monitor, or until some defined point (e.g. the center) crosses into the secondary monitor; it would even be fine if the window were split and shown on both. Is there any way to prevent this jump from happening as soon as the window overlaps the secondary monitor more than the primary?
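For reference, the kind of rule I have in mind looks roughly like the sketch below. This is just an untested illustration using Electron's `screen` module; `displayForWindowCenter` is a made-up helper name, not something in my app:

```js
const { screen } = require('electron');

// The rule I'd prefer: the window "belongs" to whichever display contains
// its center point, not whichever display it overlaps the most.
// displayForWindowCenter is purely illustrative.
function displayForWindowCenter(win) {
  const { x, y, width, height } = win.getBounds();
  const center = { x: x + Math.round(width / 2), y: y + Math.round(height / 2) };
  return screen.getDisplayNearestPoint(center);
}
```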
To reiterate, I'm moving the window programmatically and am using Electron on macOS.
Also, this doesn't appear to happen when I move the window manually, but I think that's because manual dragging doesn't use the percentage rule; it goes by whether or not the mouse pointer has entered the secondary monitor.
Also, I'm open to any kind of solution, including C/C++ or Swift.
EDIT: Here's how I'm currently moving the window:
    win.setPosition(x, y);

where `win` is an instance of Electron's `BrowserWindow`.
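For context, a stripped-down main-process sketch that reproduces the jump looks roughly like this (not my actual code; the window size, coordinates, and timer are made up just to trigger the behavior):

```js
const { app, BrowserWindow } = require('electron');

app.whenReady().then(() => {
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadURL('about:blank');

  // Nudge the window toward the secondary display. Once more of it overlaps
  // the secondary display than the primary, macOS snaps the whole window over.
  let x = 100;
  const timer = setInterval(() => {
    x += 50;
    win.setPosition(x, 100);
  }, 1000);

  win.on('closed', () => clearInterval(timer));
});
```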
The desired behavior is to be able to move the window anywhere on a display. Currently, if enough of the window goes off the edge of the current display, it jumps to another display. This is because macOS's default behavior is to automatically move a window to the display with which it overlaps the most and hide it on all other displays. It is worth noting that this is not the behavior when dragging a window manually, only when moving it programmatically.
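For what it's worth, I assume Electron's `screen.getDisplayMatching` applies the same "most overlap" rule, so something like the sketch below can at least detect when a programmatic move is about to cause the jump; it doesn't prevent it (`willJumpOffPrimary` is a hypothetical helper, not part of my app):

```js
const { screen } = require('electron');

// Assumption: getDisplayMatching returns the display that the given bounds
// overlap the most, i.e. the display macOS would snap the window onto.
function willJumpOffPrimary(targetBounds) {
  const target = screen.getDisplayMatching(targetBounds);
  return target.id !== screen.getPrimaryDisplay().id;
}

// Usage sketch: check the intended bounds before calling win.setPosition().
// const { width, height } = win.getBounds();
// if (willJumpOffPrimary({ x, y, width, height })) { /* decide what to do */ }
```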