I think I know what the vibroacoustic study should be: I put an actuator on the Phantom and drive wav files based on the material associated with the collision, and I can use the built-in haptic pattern playback as a control. Making the wav files might be as simple as recording the spoken word for each material, or using a microphone to contact a material, move across it, and lift off. (Personally, I like the microphone approach because it mimics what could be done with telepresence.) Multiple sensor/actuator pairs can wait for a later study. A rough sketch of the collision-to-wav lookup follows.
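This is only a minimal sketch, assuming a Windows build (PlaySound from winmm, linked against winmm.lib) and a hypothetical onCollision() hook fired by the haptics loop; the material names and wav paths are placeholders, not the real project assets.

    // Minimal sketch: map a material name to a wav and fire it on contact.
    // Assumes Windows/winmm; onCollision() and the table contents are
    // placeholders, not the actual project code.
    #include <windows.h>
    #include <mmsystem.h>   // PlaySoundA; link with winmm.lib
    #include <map>
    #include <string>

    static std::map<std::string, std::string> materialWavs = {
        { "wood",  "sounds/wood.wav"  },
        { "metal", "sounds/metal.wav" },
        { "cloth", "sounds/cloth.wav" },
    };

    // Called when the Phantom proxy first contacts a surface tagged with a material.
    void onCollision(const std::string& material)
    {
        std::map<std::string, std::string>::const_iterator it = materialWavs.find(material);
        if (it != materialWavs.end()) {
            // SND_ASYNC returns immediately, so the haptic servo loop is never blocked.
            PlaySoundA(it->second.c_str(), NULL, SND_FILENAME | SND_ASYNC);
        }
    }

The same table could later hold the built-in haptic pattern ids for the control condition.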
This means I don’t actually need the Phidgets code in the new KF hand codebase. I’m going to include it anyway, simply because I’m so close and can use it later.
Come to think of it, could I put an actuator on a mouse as well and move it over materials?
Tasks for today:
- Finish getting the Phidgets code working in KF_Hand_3 – done
- Start to add sound classes – done, inasmuch as sounds are loaded and played using the library I wrote (a stand-in loading sketch appears after this list). More detail will come later.
- Start to integrate the Phantom – got HelloHapticDevice2 up and running again, as well as quite a few demos; a touch-callback sketch also follows after this list
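My sound library itself isn't reproduced here, so the following is only a stand-in for the load-then-play pattern it provides, again leaning on winmm; preloadWav, playLoaded, and the buffer map are hypothetical names, not the library's API.

    // Stand-in sketch for preloading wavs so playback never touches disk.
    // preloadWav/playLoaded are placeholder names, not the real library's API.
    #include <windows.h>
    #include <mmsystem.h>   // PlaySoundA; link with winmm.lib
    #include <fstream>
    #include <iterator>
    #include <map>
    #include <string>
    #include <vector>

    static std::map<std::string, std::vector<char> > loadedWavs;

    // Read a whole RIFF/WAVE file into memory at startup.
    bool preloadWav(const std::string& material, const std::string& path)
    {
        std::ifstream in(path.c_str(), std::ios::binary);
        if (!in) return false;
        loadedWavs[material].assign(std::istreambuf_iterator<char>(in),
                                    std::istreambuf_iterator<char>());
        return !loadedWavs[material].empty();
    }

    void playLoaded(const std::string& material)
    {
        std::map<std::string, std::vector<char> >::iterator it = loadedWavs.find(material);
        if (it != loadedWavs.end() && !it->second.empty()) {
            // SND_MEMORY plays from the buffer; the buffer must outlive playback.
            PlaySoundA(it->second.data(), NULL, SND_MEMORY | SND_ASYNC);
        }
    }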
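For contact detection, one option above the raw HD servo loop that HelloHapticDevice2 uses is the OpenHaptics HL event API. The sketch below assumes the scene's shapes are registered via hlBeginShape/hlEndShape and that a shape-id-to-material table is maintained wherever shapes are created; everything except the HL calls themselves is a placeholder.

    // Sketch: route HL touch events to the material playback above.
    // shapeMaterials and touchCB are placeholders; the HL calls are the
    // OpenHaptics API, used here assuming an HL-based scene.
    #include <HL/hl.h>
    #include <map>
    #include <string>

    extern void onCollision(const std::string& material);   // from the earlier sketch
    extern std::map<HLuint, std::string> shapeMaterials;    // filled where shapes are created

    void HLCALLBACK touchCB(HLenum event, HLuint object, HLenum thread,
                            HLcache* cache, void* userdata)
    {
        std::map<HLuint, std::string>::const_iterator it = shapeMaterials.find(object);
        if (it != shapeMaterials.end())
            onCollision(it->second);   // start the wav for this material
    }

    void registerTouchCallback()
    {
        // HL_OBJECT_ANY: fire for any shape; HL_CLIENT_THREAD keeps the
        // audio call out of the servo thread.
        hlAddEventCallback(HL_EVENT_TOUCH, HL_OBJECT_ANY, HL_CLIENT_THREAD,
                           touchCB, NULL);
    }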