
I am using Kinect with SimpleOpenNI and Processing, and I was trying to use the Z position of a hand to emulate a button press. So far it works really well with one hand; however, when I try to get it to work with a second hand, only one of the hands works. (I know it could be more efficient by moving everything except the fill out of the if statements, but I kept it that way in case I want to change the sizes or something.)

irz and ilz are the initial Z positions of the hands when they are first recognized by onCreateHands, and rz and lz are the current Z positions. As of now, the code works fine with one hand, but the other hand will either stay pressed or stay unpressed. If I comment one of the sections out, it works fine as well.

if (rz - irz > 0) {
 pushStyle();
 fill(60);
 ellipse(rx, ry, 10, 10);
 popStyle();
 rpressed = true;
}
else {
 pushStyle();
 noFill();
 ellipse(rx, ry, 10, 10);
 popStyle();
 rpressed = false;
}

if (lz - ilz > 0) {
 pushStyle();
 fill(60);
 ellipse(lx, ly, 10, 10);
 popStyle();
 lpressed = true;
}
else {
 pushStyle();
 noFill();
 ellipse(lx, ly, 10, 10);
 popStyle();
 lpressed = false;
}

I tried printing the values of rz - irz and lz - ilz: lz - ilz ranges from small negative to small positive values (around -8 to 8), but rz - irz outputs numbers around 8-30 that differ each time I run it and are never consistent. Also, when I comment out the code for lz - ilz, the values for rz - irz look fine and it operates as intended. Is there a reason tracking both Z positions throws off one hand? And is there a way to get it to work?

Thanks!


1 Answer


I've a couple of ideas:

  1. Use the NITE "Click" gesture
  2. Use the z position of the hand as you are now, but also keep track of differences in z movements

SimpleOpenNI seems to like one hand better than two with NITE gestures like "Click" (you should see a message printed after the hand is picked up; then move your hand forward and backward). Below is a quick example. Note that I'm keeping track of +/- differences in Z and using a threshold so a press only triggers past a certain distance; this could be a range instead, for example.

import SimpleOpenNI.*;
SimpleOpenNI context;
boolean      handsTrackFlag = false;
PVector      handVec = new PVector();
PVector      handVec2D  = new PVector();//just for drawing
String       lastGesture = "";
float        lastZ = 0;
boolean      isPushing,wasPushing;
float        yourClickThreshold = 20;//set this up as you see fit for your interaction

void setup(){
  size(640,480);  
  context = new SimpleOpenNI(this);
  context.enableDepth();
  // enable hands + gesture generation
  context.enableGesture();
  context.enableHands();
  // add focus gestures / on the Mac I sometimes only get RaiseHand recognized; maybe CPU performance?
  context.addGesture("Wave");
  context.addGesture("Click");
  context.addGesture("RaiseHand");

}

void draw()
{
  context.update();
  image(context.depthImage(),0,0);
  // draw the tracked hand
  if(handsTrackFlag){
    context.convertRealWorldToProjective(handVec,handVec2D);
    float diff = (handVec.z-lastZ);     // frame-to-frame change in depth
    isPushing = diff < 0;               // negative diff = hand moving towards the sensor
    if(abs(diff) > yourClickThreshold){ // only react to changes larger than the threshold
      if(!wasPushing && isPushing) fill(255,0,0);//push started
      if(wasPushing && !isPushing) fill(0,255,0);//push released
    }else fill(255);
    lastZ = handVec.z;
    wasPushing = isPushing;
    ellipse(handVec2D.x,handVec2D.y,10,10);
  }

}


// -----------------------------------------------------------------
// hand events

void onCreateHands(int handId,PVector pos,float time){
  println("onCreateHands - handId: " + handId + ", pos: " + pos + ", time:" + time);

  handsTrackFlag = true;
  handVec = pos;
}

void onUpdateHands(int handId,PVector pos,float time){
  //println("onUpdateHandsCb - handId: " + handId + ", pos: " + pos + ", time:" + time);
  handVec = pos;
}

void onDestroyHands(int handId,float time){
  println("onDestroyHandsCb - handId: " + handId + ", time:" + time);
  handsTrackFlag = false;
  context.addGesture(lastGesture);
}

// -----------------------------------------------------------------
// gesture events

void onRecognizeGesture(String strGesture, PVector idPosition, PVector endPosition){
  if(strGesture.equals("Click")) println("onRecognizeGesture - strGesture: " + strGesture + ", idPosition: " + idPosition + ", endPosition:" + endPosition);

  lastGesture = strGesture;
  context.removeGesture(strGesture); 
  context.startTrackingHands(endPosition);
}

void onProgressGesture(String strGesture, PVector position,float progress){
  //println("onProgressGesture - strGesture: " + strGesture + ", position: " + position + ", progress:" + progress);
}

An alternative way to get two hands is to use the SKEL_PROFILE_HEAD_HANDS profile when doing skeleton tracking, but note that the hand precision is lower.
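
If you want to go down that route, a rough sketch could look something like this. It assumes the same older SimpleOpenNI release as the example above; the user/calibration callbacks (onNewUser, onStartPose, onEndCalibration, onLostUser) and the Psi-pose calibration flow come from the bundled user-tracking examples, and their exact signatures vary between releases, so treat this as a starting point rather than drop-in code:

import SimpleOpenNI.*;
SimpleOpenNI context;
int     trackedUser = -1;            // id of the first calibrated user
PVector leftHand    = new PVector(); // real-world joint positions
PVector rightHand   = new PVector();
PVector leftHand2D  = new PVector(); // projective positions, just for drawing
PVector rightHand2D = new PVector();

void setup(){
  size(640,480);
  context = new SimpleOpenNI(this);
  context.enableDepth();
  // head + hands profile: lighter than SKEL_PROFILE_ALL, but the hand joints are less precise
  context.enableUser(SimpleOpenNI.SKEL_PROFILE_HEAD_HANDS);
}

void draw(){
  context.update();
  image(context.depthImage(),0,0);
  if(trackedUser != -1 && context.isTrackingSkeleton(trackedUser)){
    // fetch both hand joints, then project them into screen space for drawing
    context.getJointPositionSkeleton(trackedUser,SimpleOpenNI.SKEL_LEFT_HAND,leftHand);
    context.getJointPositionSkeleton(trackedUser,SimpleOpenNI.SKEL_RIGHT_HAND,rightHand);
    context.convertRealWorldToProjective(leftHand,leftHand2D);
    context.convertRealWorldToProjective(rightHand,rightHand2D);
    ellipse(leftHand2D.x,leftHand2D.y,10,10);
    ellipse(rightHand2D.x,rightHand2D.y,10,10);
    // leftHand.z and rightHand.z are now available every frame, so each hand
    // can keep its own initial Z and be checked independently (rz-irz, lz-ilz)
  }
}

// -----------------------------------------------------------------
// user events

void onNewUser(int userId){
  println("onNewUser - userId: " + userId);
  context.startPoseDetection("Psi",userId);
}

void onStartPose(String pose,int userId){
  context.stopPoseDetection(userId);
  context.requestCalibrationSkeleton(userId,true);
}

void onEndCalibration(int userId,boolean successfull){
  if(successfull){
    trackedUser = userId;
    context.startTrackingSkeleton(userId);
  }else{
    context.startPoseDetection("Psi",userId);
  }
}

void onLostUser(int userId){
  println("onLostUser - userId: " + userId);
  if(userId == trackedUser) trackedUser = -1;
}

Once the skeleton is being tracked, both hand joints come back every frame, so the press check from the question can run per hand without the two hand trackers interfering with each other.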