0
votes

I'm trying to capture the text a user types on the keyboard. Everything seems to work fine when I have the English keyboard layout, but when I change it to Russian it does not work. I have used

private static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
{
    if (nCode >= 0 && wParam == (IntPtr)WM_KEYDOWN)
    {
        int vkCode = Marshal.ReadInt32(lParam);
        if ((Keys)vkCode == Keys.Enter || (Keys)vkCode == Keys.Tab || (Keys)vkCode == Keys.LButton || (Keys)vkCode == Keys.RButton)
            WriteNewLine(_text);
        else if (ShiftKey)
            _text += GetCharsFromKeys((Keys)vkCode, true, false);
        else
            _text += GetCharsFromKeys((Keys)vkCode, false, false);
    }
    return CallNextHookEx(_hookID, nCode, wParam, lParam);
}

This function gets the key code. As you can see, I then call the following function:

static string GetCharsFromKeys(Keys keys, bool shift, bool altGr)
{
    var buf = new StringBuilder(256);
    var keyboardState = new byte[256];
    if (shift)
        keyboardState[(int)Keys.ShiftKey] = 0xff;
    if (altGr)
    {
        keyboardState[(int)Keys.ControlKey] = 0xff;
        keyboardState[(int)Keys.Menu] = 0xff;
    }
    ToUnicode((uint)keys, 0, keyboardState, buf, 256, 0);
    return buf.ToString();
}
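For what it's worth, the usual culprit with this pattern is that `ToUnicode` translates using the keyboard layout of the *calling* thread, not the layout active in the window actually receiving the keystrokes. A hedged sketch of a layout-aware variant follows: `GetForegroundWindow`, `GetWindowThreadProcessId`, `GetKeyboardLayout`, and `ToUnicodeEx` are real Win32 calls, but the helper name `GetCharsFromKeysEx` is my own, and this is a sketch rather than tested production code:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Text;
using System.Windows.Forms;

static class LayoutAwareKeys
{
    [DllImport("user32.dll")]
    static extern IntPtr GetForegroundWindow();

    [DllImport("user32.dll")]
    static extern uint GetWindowThreadProcessId(IntPtr hWnd, IntPtr lpdwProcessId);

    [DllImport("user32.dll")]
    static extern IntPtr GetKeyboardLayout(uint idThread);

    [DllImport("user32.dll", CharSet = CharSet.Unicode)]
    static extern int ToUnicodeEx(uint wVirtKey, uint wScanCode, byte[] lpKeyState,
        StringBuilder pwszBuff, int cchBuff, uint wFlags, IntPtr dwhkl);

    // Hypothetical replacement for GetCharsFromKeys: translate the virtual key
    // using the layout of the thread that owns the foreground window.
    public static string GetCharsFromKeysEx(Keys keys, bool shift, bool altGr)
    {
        var buf = new StringBuilder(8);
        var keyboardState = new byte[256];
        if (shift)
            keyboardState[(int)Keys.ShiftKey] = 0xff;
        if (altGr)
        {
            keyboardState[(int)Keys.ControlKey] = 0xff;
            keyboardState[(int)Keys.Menu] = 0xff;
        }

        // Layout of the foreground window's thread, i.e. the one the user types in.
        uint threadId = GetWindowThreadProcessId(GetForegroundWindow(), IntPtr.Zero);
        IntPtr layout = GetKeyboardLayout(threadId);

        return ToUnicodeEx((uint)keys, 0, keyboardState, buf, buf.Capacity, 0, layout) > 0
            ? buf.ToString()
            : string.Empty;
    }
}
```

With the Russian layout active in the foreground window, this should map VK `H` (0x48) to `р` instead of `h`, which is exactly the mismatch shown in the question.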

It returns the actual key the user types, but it only works for the ENG keyboard layout. If you know how to add international keyboard layout handling, please let me know. P.S. When I enter:

Руский

I get :

Hecrbq

My keyboard

1
@NickA I know, this is why I use the ToUnicode function. It translates the specified virtual-key code and keyboard state to the corresponding Unicode character - Victor Semeniuk
Looking at the documentation for ToUnicode, I don't think you understand what it's saying: if you hit the H key, its keycode is 0x48, which funnily enough is 0x0048 in Unicode, not 0x0420 like Р is - Nick
Is this for a thread-local hook or a system-wide hook? Also, why don't you trap WM_CHAR instead of WM_KEYDOWN? - Marc Durdin

1 Answer

0
votes

You need to use InstalledInputLanguages and CurrentInputLanguage. There is an example here that you can follow; I will copy the code here in case the link dies. This code recognizes entry of the @ symbol on a European keyboard, so you will need to tweak it to meet your needs.

[DllImport("user32.dll")]
public static extern short VkKeyScanEx(char ch, IntPtr dwhkl);

public static void GetKeyboardShortcutForChar(char c, InputLanguage lang, out Keys key, out bool shift)
{
    var keyCode = VkKeyScanEx(c, lang.Handle);
    key = (Keys)(keyCode & 0xFF);
    shift = (keyCode & 0x100) == 0x100;
}
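For example, to find which key combination produces @ under the layout currently active for the thread (a hedged usage sketch; the output depends entirely on the installed layout):

```csharp
// Look up the physical key and shift state that produce '@'
// under the current input language (varies per layout).
Keys key;
bool shift;
GetKeyboardShortcutForChar('@', InputLanguage.CurrentInputLanguage, out key, out shift);
Console.WriteLine("@ is " + (shift ? "Shift+" : "") + key + " on this layout");
```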

[DllImport("user32.dll", CharSet = CharSet.Unicode)]
public static extern int ToUnicodeEx(int wVirtKey, uint wScanCode, byte[] lpKeyState, StringBuilder pwszBuff, int cchBuff, uint wFlags, IntPtr dwhkl);

public static char? FromKeys(int keys, bool shift, bool capsLock)
{
    var keyStates = new byte[256];
    if (shift)
        keyStates[16] = 0x80;  // VK_SHIFT held down (high-order bit)
    if (capsLock)
        keyStates[20] = 0x01;  // VK_CAPITAL toggled (low-order bit)

    var sb = new StringBuilder(10);
    int ret = ToUnicodeEx(keys, 0, keyStates, sb, sb.Capacity, 0, InputLanguage.CurrentInputLanguage.Handle);

    return ret == 1 ? (char?)sb[0] : null;
}
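Tying this back to the question's hook, the callback could then route keystrokes through FromKeys instead of the layout-blind GetCharsFromKeys. This is a sketch under the assumption that `_text`, `_hookID`, `WriteNewLine`, `ShiftKey`, and `WM_KEYDOWN` are the asker's existing members; `Control.IsKeyLocked` is a real WinForms helper for reading the Caps Lock toggle:

```csharp
private static IntPtr HookCallback(int nCode, IntPtr wParam, IntPtr lParam)
{
    if (nCode >= 0 && wParam == (IntPtr)WM_KEYDOWN)
    {
        int vkCode = Marshal.ReadInt32(lParam);
        if ((Keys)vkCode == Keys.Enter || (Keys)vkCode == Keys.Tab)
        {
            WriteNewLine(_text);
        }
        else
        {
            // Translate using the current input language rather than assuming English.
            char? c = FromKeys(vkCode, ShiftKey, Control.IsKeyLocked(Keys.CapsLock));
            if (c.HasValue)
                _text += c.Value;
        }
    }
    return CallNextHookEx(_hookID, nCode, wParam, lParam);
}
```

Note that CurrentInputLanguage reflects the hooking application's own thread; if you are logging keys typed into other applications, you would still want to query the foreground window's layout as discussed under the question.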