User Input Events

Views automatically capture user keyboard and mouse input events on the client, and then send this user input to the service as commands via the server.

On the service side, you use the postKeyEvent and postMouseEvent functions of the IRenderedView interface to map the client-side user input events to the service-side functionality.

To handle touch screen gestures on mobile devices, you can either send a command directly, or map the input to a PureWeb API mouse or keyboard event, as shown at the bottom of this page.

Mapping Mouse and Keyboard Input

Mouse and keyboard events that occur on the client application are automatically captured by the view and sent to the service application, where they are handled by the IRenderedView interface’s postMouseEvent and postKeyEvent functions. You define how the service application responds based on the content of these events.

The actual implementation of these methods depends on which framework the service application uses (Windows Forms, Microsoft Foundation Class (MFC), Qt, Swing, etc.).

Below is a .NET example of how to capture and handle a mouse event using the Windows Forms framework.

public void PostMouseEvent(PureWebMouseEventArgs mouseEvent){
    // Translate the PureWeb button flags into Windows Forms button flags
    System.Windows.Forms.MouseButtons buttons = System.Windows.Forms.MouseButtons.None;
    if (0 != (mouseEvent.Buttons & PureWeb.Ui.MouseButtons.Left)) buttons |= System.Windows.Forms.MouseButtons.Left;
    if (0 != (mouseEvent.Buttons & PureWeb.Ui.MouseButtons.Right)) buttons |= System.Windows.Forms.MouseButtons.Right;
    if (0 != (mouseEvent.Buttons & PureWeb.Ui.MouseButtons.Middle)) buttons |= System.Windows.Forms.MouseButtons.Middle;
    if (0 != (mouseEvent.Buttons & PureWeb.Ui.MouseButtons.XButton1)) buttons |= System.Windows.Forms.MouseButtons.XButton1;
    if (0 != (mouseEvent.Buttons & PureWeb.Ui.MouseButtons.XButton2)) buttons |= System.Windows.Forms.MouseButtons.XButton2;

    // Dispatch the event to the matching Windows Forms handler
    switch (mouseEvent.EventType){
        case MouseEventType.MouseEnter:
            OnMouseEnter(EventArgs.Empty);
            break;
        case MouseEventType.MouseLeave:
            OnMouseLeave(EventArgs.Empty);
            break;
        case MouseEventType.MouseMove:
            OnMouseMove(new MouseEventArgs(buttons, 0, (int)mouseEvent.X, (int)mouseEvent.Y, (int)mouseEvent.Delta));
            break;
        case MouseEventType.MouseDown:
            OnMouseDown(new MouseEventArgs(buttons, 0, (int)mouseEvent.X, (int)mouseEvent.Y, (int)mouseEvent.Delta));
            break;
        default:
            Trace.WriteLine(string.Format("Received unknown mouse event type {0}.", (int)mouseEvent.EventType));
            return;
    }
}

And below is an example of how to capture and handle a keyboard event.

// Win32 keyboard message constants
const int WM_KEYDOWN = 0x100;
const int WM_KEYUP = 0x101;
const int WM_SYSKEYDOWN = 0x104;
const int WM_SYSKEYUP = 0x105;

[return: MarshalAs(UnmanagedType.Bool)]
[DllImport("user32.dll", SetLastError = true)]
static extern bool PostMessage(IntPtr hWnd, UInt32 Msg, IntPtr wParam, IntPtr lParam);

(...)

public void PostKeyEvent(PureWebKeyboardEventArgs keyEvent){
    bool isAltDown = 0 != (keyEvent.Modifiers & Modifiers.Alternate);
    int wParam = (int)keyEvent.KeyCode;
    int lParam = isAltDown ? (1 << 29) : 0; // bit 29 is the "context code" (Alt key state)
    int message;

    // Alt+key combinations and F10 generate system key messages;
    // everything else generates regular key messages
    if (isAltDown || keyEvent.KeyCode == KeyCode.F10){
        message = keyEvent.EventType == KeyboardEventType.KeyDown ? WM_SYSKEYDOWN : WM_SYSKEYUP;
    } else {
        message = keyEvent.EventType == KeyboardEventType.KeyDown ? WM_KEYDOWN : WM_KEYUP;
    }
    PostMessage(this.Handle, (UInt32)message, new IntPtr(wParam), new IntPtr(lParam));
}

Handling Touch-Screen Input

Views do not automatically handle touch-screen input the way they capture mouse and keyboard events.

Instead, you can set up your client to send a command directly to the service whenever a touch screen input event occurs.
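For instance, a client could translate a gesture into an application-specific command. The sketch below assumes a hypothetical "Zoom" command with a "Scale" parameter and reuses the "AsteroidsView" path from the samples further down this page; none of these names are part of an actual sample.

```javascript
// Hypothetical example: translate a pinch gesture into an
// application-specific "Zoom" command. The command name and its
// parameters are illustrative only.
function buildZoomCommand(scale) {
    return {
        name: 'Zoom',
        parameters: { 'Path': 'AsteroidsView', 'Scale': scale }
    };
}

function onPinch(scale) {
    var cmd = buildZoomCommand(scale);
    // pureweb.getClient() returns the PureWeb client singleton,
    // as used in the HTML5 sample below
    pureweb.getClient().queueCommand(cmd.name, cmd.parameters);
}
```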

Alternately, you could map the touch screen input to a PureWeb API mouse or keyboard event. The Asteroids sample applications in Android and HTML5 illustrate how this can be accomplished.
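A minimal sketch of the mouse-mapping approach in an HTML5 client might look like the following. The InputEvent parameter names for mouse events (X, Y, Buttons) are assumptions modeled on the PureWebMouseEventArgs fields used in the service-side example above, not code taken from the samples.

```javascript
// Build the parameters for a mouse InputEvent command from a Touch
// object. Field names X, Y, and Buttons are assumptions based on
// PureWebMouseEventArgs; 1 is assumed to mean the left button.
function touchToMouseParameters(eventType, touch) {
    return {
        'EventType': eventType,   // e.g. 'MouseDown', 'MouseMove', 'MouseUp'
        'Path': 'AsteroidsView',
        'X': touch.clientX,
        'Y': touch.clientY,
        'Buttons': 1,
        'Modifiers': 0
    };
}

// Touch handler that forwards the first touch point as a MouseDown
function onTouchStart(e) {
    e.preventDefault();
    var touch = e.touches[0];
    pureweb.getClient().queueCommand('InputEvent',
        touchToMouseParameters('MouseDown', touch));
}
```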

Converting Touch-Screen Input to Keyboard Events

It is possible for PureWeb clients to simulate key presses on touch-screen devices. Fundamentally, this is achieved by creating a key-down command when certain parts of the screen are touched.

An example of this approach is provided in the HTML5 and Android Asteroids sample applications.

HTML5

When the HTML5 application runs on a mobile device, arrow buttons on the left-hand side of the screen allow users to play the Asteroids game using touch-screen input.

The code that implements this button panel, described below, can be found in the AsteroidsApp.js sample file.

When the user touches a button on the panel, the client application simulates a key press by sending a command. See Commands.

//Simulate a keyboard event
function simKeyDown(e, keyCode) {
    queueKeyboardEvent('KeyDown', keyCode);
    //Suppress the default action
    e.preventDefault();
}

//Send a keyboard event using a PureWeb command
function queueKeyboardEvent(eventType, keyCode) {
    //Create the keyboard event as a JS object
    var parameters = {
        'EventType': eventType,
        'Path': 'AsteroidsView',
        'KeyCode': keyCode,
        'CharacterCode': 0,
        'Modifiers': 0
    };
    //Send the PureWeb command
    pureweb.getClient().queueCommand('InputEvent', parameters);
}
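The simKeyDown and queueKeyboardEvent functions above might be wired to the on-screen arrow buttons roughly as follows. The element IDs, the key-code mapping, and the key-up handling are illustrative assumptions, not code from the sample.

```javascript
// Map each (hypothetical) button element ID to the key code it simulates
function gameButtonKeyCode(buttonId) {
    var map = {
        'btnLeft': 37,   // left arrow
        'btnUp': 38,     // up arrow
        'btnRight': 39,  // right arrow
        'btnDown': 40,   // down arrow
        'btnFire': 32    // space bar
    };
    return map.hasOwnProperty(buttonId) ? map[buttonId] : null;
}

// Attach touch handlers that simulate key-down on touchstart and
// key-up on touchend, using the helpers from the sample above
function bindGameButton(buttonId) {
    var keyCode = gameButtonKeyCode(buttonId);
    if (keyCode === null) return;
    var el = document.getElementById(buttonId);
    el.addEventListener('touchstart', function (e) { simKeyDown(e, keyCode); });
    el.addEventListener('touchend', function (e) {
        e.preventDefault();
        queueKeyboardEvent('KeyUp', keyCode);
    });
}
```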

Android

The onTouch method handles game-button touches, translating button presses and releases into keyboard events. It calls the queueKeyboardEvent method, which is described below.

@Override
public boolean onTouch(android.view.View v, MotionEvent event)
{
    // determine whether this is a key down or up event
    KeyboardEventType eventType;
    int action = event.getAction();

    switch (action)
    {
        case MotionEvent.ACTION_DOWN:
        case MotionEvent.ACTION_POINTER_1_DOWN:
        case MotionEvent.ACTION_POINTER_2_DOWN:
        case MotionEvent.ACTION_POINTER_3_DOWN:
            eventType = KeyboardEventType.KeyDown;
            break;

        case MotionEvent.ACTION_UP:
        case MotionEvent.ACTION_POINTER_1_UP:
        case MotionEvent.ACTION_POINTER_2_UP:
        case MotionEvent.ACTION_POINTER_3_UP:
            eventType = KeyboardEventType.KeyUp;
            break;
        ...
    }
    pointerIndex = event.getPointerId(pointerIndex);

    // determine which button was pressed (if any)
    if (eventType == KeyboardEventType.KeyDown)
    {
        if (pointerIndex == 0)
        {
            pressedButtonId[0] = v.getId();
        } else
        {
            int[] xy = new int[2];
            getScreenCoordinates(v.getId(), (int)event.getX(pointerIndex), (int)event.getY(pointerIndex), xy);
            pressedButtonId[pointerIndex] = getGameButtonId(xy[0], xy[1]);
            ...
            queueKeyboardEvent(pressedButtonId[pointerIndex], eventType);
        ...
}

The queueKeyboardEvent method creates an InputCommand for the keyboard events that correspond to the game buttons. It uses a switch statement with a case for each of the game buttons (fire, shields, forward, reverse, left, and right). The information is put into a queue to be sent to the Asteroids service application.

private void queueKeyboardEvent(int button, KeyboardEventType eventType) {
    KeyCode keyCode;

    if (button == R.id.fire) {
        keyCode = KeyCode.Space;
    } else if (button == R.id.shields) {
        keyCode = KeyCode.S;
    } else if (button == R.id.forward) {
        keyCode = KeyCode.Up;
    } else if (button == R.id.reverse) {
        keyCode = KeyCode.Down;
    } else if (button == R.id.left) {
        keyCode = KeyCode.Left;
    } else if (button == R.id.right) {
        keyCode = KeyCode.Right;
    } else {
        return; // not a game button - ignore!
    }

    Map<String, Object> parameters = new HashMap<String, Object>();
    parameters.put("EventType", eventType);
    parameters.put("Path", "AsteroidsView");
    parameters.put("KeyCode", keyCode.getKeyCode());
    parameters.put("CharacterCode", 0);
    parameters.put("Modifiers", Modifiers.None.toInt());

    framework.getWebClient().queueCommand("InputEvent", parameters);

    if (eventType == KeyboardEventType.KeyDown) {
        view.performHapticFeedback(HapticFeedbackConstants.VIRTUAL_KEY);
    }
}