Author Archives: pgfeldman

Results?

Looks like we got some results with the headset system. Still trying to figure out what it means (other than the obvious: that it’s easier to find the source of a single sound).

HeadsetPrelimResults

Here are the confidence intervals:

confidenceIntervals

Next I try to do something with the Phantom results. I think I may need some more data before anything shakes out.

Strain Relief and Shorts

IMG_2194

Yesterday, just as I was about to leave work, one of my coworkers dropped by to see what I was doing and thought it would be fun to be experimented upon. Cool.

I fired up the system, created a new setup file, and ran the test. Everything ran perfectly, and I got more good results. When I came in this morning, though, the rig was pretty banged up. A wiring harness that had been fine while I was working out bugs was nowhere near robust enough to run even one person through a suite of tasks. It’s the Law of Enemy Action.

You’ve heard of Murphy’s Law (everything that can go wrong, will). The Law of Enemy Action is similar: “People will use your product as if they are trying to destroy it.” In a previous life I designed fitness equipment, and it was jaw-dropping to see the amount of damage a customer could inflict on a product. Simply stated: you need to overdesign and overbuild if at all possible.

With that in mind, I pulled all the hardware off the Phantom and started over. New, lighter, more flexible wire. Strain-relieved connections. Breakaway connections. The works.

When it was done, I fired it up and started to test. Sensors – check. Actuators – check. Yay! And then the right pressure sensor started to misbehave. It was kind of beat up, so it made sense to replace it. But when I went to test, the new sensor was misbehaving in the same way. And it seemed to be related to turning on the vibro-acoustic actuators.

Time to open the box up and poke around. Nope – everything looked good. Maybe the connector? Aha! My new, more flexible cable was stranded rather than solid, and a few strands from one of the wires were touching the right sensor connection.

So I pulled everything apart and replaced the cable that went into the connection with 22-gauge solid wire, which then connected to my stranded cable. All fixed. And an example that, even though Murphy’s Law is bad enough on its own, you should always be prepared for Enemy Action.

 

First Results :-)

Downloaded several WAV files of sine wave tones, ranging from 100 Hz to 1,000 Hz. The files are created using this generator, and are 0.5 sec in length.
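For reference, a comparable tone file can also be produced offline. Here is a minimal sketch that writes a 0.5 s, 16-bit mono PCM WAV of a sine tone; the function name is my own, it assumes a little-endian machine, and it stands in for the online generator that actually produced the files above:

```cpp
#include <cmath>
#include <cstdint>
#include <fstream>
#include <string>

// Write a 16-bit mono PCM WAV containing a sine tone.
// Assumes a little-endian host, so multi-byte fields can be written directly.
void writeSineWav(const std::string& path, double freqHz,
                  double seconds = 0.5, int sampleRate = 44100) {
    const double kPi = 3.14159265358979323846;
    const int numSamples = static_cast<int>(seconds * sampleRate);
    const uint32_t dataBytes = numSamples * 2;          // 2 bytes per sample
    std::ofstream f(path, std::ios::binary);

    auto w32 = [&](uint32_t v) { f.write(reinterpret_cast<char*>(&v), 4); };
    auto w16 = [&](uint16_t v) { f.write(reinterpret_cast<char*>(&v), 2); };

    // RIFF header, then the "fmt " and "data" chunks.
    f.write("RIFF", 4); w32(36 + dataBytes); f.write("WAVE", 4);
    f.write("fmt ", 4); w32(16);
    w16(1); w16(1);                                     // PCM, 1 channel
    w32(sampleRate); w32(sampleRate * 2);               // sample rate, byte rate
    w16(2); w16(16);                                    // block align, bits/sample
    f.write("data", 4); w32(dataBytes);

    for (int i = 0; i < numSamples; ++i) {
        double t = static_cast<double>(i) / sampleRate;
        int16_t s = static_cast<int16_t>(
            32000.0 * std::sin(2.0 * kPi * freqHz * t));
        f.write(reinterpret_cast<char*>(&s), 2);
    }
}
```

A 0.5 s file at 44.1 kHz comes out to 22,050 samples plus the 44-byte header.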

Glued the tactor actuators in place, since they kept coming loose during testing.

Fixed the file output. Each test result is now ordered.

Fixed a bug where the number of targets and the number of goals were not being recorded

Added a listing of the audio files used in the experiment.

Got some initial results based on my self-testing today:

firstResults

The pure haptic and tactor times to perform the task are all over the place, but it’s pretty interesting to note that Haptic/Tactor and Open Loop are probably significantly different. Hmmmm.

Packaging!

Ok, here it is, all ready to travel:

IMG_2192

It’s still a bit of a rat’s nest inside the box, but I’ll clean that up later today.

Adding a “practice mode” to the app. It will read in a setup file and allow the user to try any of the feedback modalities using srand(current milliseconds) – done
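One way to get that srand(current milliseconds) seed in C++11 is via <chrono>; this is a sketch with a hypothetical helper name, not the app’s actual code:

```cpp
#include <chrono>
#include <cstdlib>

// Return the current time in milliseconds, truncated to an unsigned int,
// suitable for srand(). Hypothetical helper; the app's exact call may differ.
unsigned currentMillisSeed() {
    using namespace std::chrono;
    auto ms = duration_cast<milliseconds>(
        system_clock::now().time_since_epoch()).count();
    return static_cast<unsigned>(ms);
}
```

Seeding from milliseconds rather than seconds means two practice runs started within the same second still get different random sequences.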

Sent an email off to these folks asking how to get their C2-transducers.

Need to look into perceptual equivalence WRT speech/nonspeech tactile interaction. Here’s one paper that might help: http://www.haskins.yale.edu/Reprints/HL0334.pdf

Fixed my truculent pressure sensor and glued the components into the enclosure. Need to order a power strip.

Life can be a drag sometimes

  • Cleaning up commands. Mostly done
  • While testing the “test” part of the app, I’m realizing that my “ratio” calculations have some issues. Before trying to fix them directly, I’m going to try just making a different gripper that has three “sensor spheres” on each finger. Then I can just let my “drag-based” physics do the whole job. Finished. That’s much better
  • Quick! What’s wrong with the following code?
	for(int i = 0; i < 3; ++i){
		position[i] += velocityVec[i];
		if(velocityVec[i] > drag*ratio){
			velocityVec[i] -= drag*ratio;
		}else{
			velocityVec[i] = 0;
		}
	}
  • Yep, the drag is only being applied to objects moving in a positive direction. This is a problem that has been driving me crazy for days. I thought it was some artifact of the communication between the Phantom control loop (1 kHz) and the simulation loop (100 Hz). Nope. Simple math mistake. Facepalm.
  • Pretty picture for the day. Notice that the grippers now have multiple points of contact:

BetterGripper

  • I’ve also started to notice how feedback changes the speed at which you can perform the task. Haptic and tactor seem pretty close. Open loop is much worse, at least subjectively. Let’s see what the data says.
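A sign-aware version of the drag update quoted above might look like this; std::copysign applies the drag against the direction of motion, so negative velocities slow down too (position, velocityVec, drag, and ratio as in the original loop):

```cpp
#include <cmath>

// Apply drag symmetrically: subtract drag*ratio from the speed regardless
// of the velocity's sign, clamping to zero once the speed drops below it.
void applyDrag(double position[3], double velocityVec[3],
               double drag, double ratio) {
    for (int i = 0; i < 3; ++i) {
        position[i] += velocityVec[i];
        if (std::fabs(velocityVec[i]) > drag * ratio) {
            velocityVec[i] -= std::copysign(drag * ratio, velocityVec[i]);
        } else {
            velocityVec[i] = 0;
        }
    }
}
```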

Zeno’s development schedule

Ok, it’s not as bad as that paradox, but the amount of work remaining always grows when you get closer to the end. Still, I think I’ll be ready by the end of the week.

  • Goal Box – done. I had to add collision detection for determining if a TargetSphere was touching (for setup) or inside (achieving the goal). I basically wrote an axis-aligned bounding box test where the radius of the TargetSphere was either subtracted from (inside) or added to (touching) the size of the GoalBox. I’m not calculating penetration, just looking for sign change on the line segment.
  • Added <map> of UI_cmd to handle commands coming from the control system and results coming from the sim. Worked right the first time. Yay, C++ templates!
  • Enable/disable haptics/tactors – done. It’s interesting to see how the behavior and feel of the system changes with the different capabilities enabled/disabled.
  • Started working on the experiment session management
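The inside/touching test described in the Goal Box bullet could be sketched like this: shrink the box’s half-size by the sphere radius for “inside,” grow it for “touching.” The struct and function names here are illustrative, not the actual TargetSphere/GoalBox classes:

```cpp
#include <cmath>

// Illustrative stand-ins for the app's TargetSphere and GoalBox.
struct Sphere { double c[3]; double r; };          // center, radius
struct Box    { double c[3]; double half[3]; };    // center, half-extents

// Sphere entirely inside the box: center within the box shrunk by r.
bool sphereInsideBox(const Sphere& s, const Box& b) {
    for (int i = 0; i < 3; ++i)
        if (std::fabs(s.c[i] - b.c[i]) > b.half[i] - s.r) return false;
    return true;
}

// Sphere touching (or overlapping) the box: center within the box grown by r.
bool sphereTouchingBox(const Sphere& s, const Box& b) {
    for (int i = 0; i < 3; ++i)
        if (std::fabs(s.c[i] - b.c[i]) > b.half[i] + s.r) return false;
    return true;
}
```

An axis-by-axis comparison like this is cheap enough to run every simulation tick, and no penetration depth is needed, just the boolean result.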

Picture for the day:

GoalBox

Blew my hand off for a while

I’m in the process of turning the Phantom testbed code into a research tool. This means that a lot of items that have been #defines now need to be variables and such.

One of the mechanisms that the shared memory app uses to communicate is a char[255] message. I basically sprintf whatever I want into that, and I can then debug both applications simultaneously.

However, after checking that some data were coming across correctly, I took the formatting argument out of the sprintf statement and left the value in. Suddenly I was overflowing the 255-character limit and causing all kinds of havoc. It took a few hours to chase that one down. That’s what you get for playing with C/C++. Moving on.

Anyway, I now have an event-handling loop and am able to load target spheres into the application and associate each with a sound file. Tomorrow we’ll try getting the sounds associated with the targets to play. There are some issues, primarily that the gripper can touch multiple targets simultaneously. Still, it looks pretty straightforward. After that I’ll start to roll the TestManager and TestResults classes into the application.

The other thing to do for the day is to check out the headset code with Brian this evening in the lab and see if the output file bug has either disappeared or can be replicated.

Deadlines and schedules

I was just asked to see how many hours I have left to work on this research. It turns out that at the rate I’m going, I can continue until mid-October. This is basically a big shout-out to Novetta, which has granted a continuation of the 20% time that was originally a hiring condition when I went to work for Edge. Thanks. And if you’d like a programming job in the DC area that supports creativity, give them a call.

I just can’t make the audio code break when writing out results. Odd. Maybe a corrupt input file can have unforeseen effects? Regardless, I’m going to stop pursuing this particular bug without more information.

Fixing the state problem. Done.

Fixing the saving issue. Also changing the naming of the speakers to reflect Dolby or not. Done.

New version release built and deployed.

And back to Phantom++

TestScreenV1

I started to add in the user interface that will support experiments. Since it was already done, I pulled in most of the Fluid code from the vibrotactile headset, which made things pretty easy. I needed to add an enclosing control-system class that can move commands between the various pieces.

I’ve also decided that each sound will have an associated object. This allows each object to have a simple “acoustic” texture that doesn’t require any fancy data structure.

At this point, I’m estimating that the first version of the test program should be ready by Friday.