Hello everyone
A few weeks ago I ordered the CH3-R hexapod. My goal (right now) is to control the hexapod from a PC using Java and to add basic capabilities over time (walking, basic object detection, sensors on the feet, etc.).
At the moment the software can make my bot walk with different gait types and move the hexapod's body around.
I also added a small simulation panel to the software. This lets me debug the gait, and I can also see when a servo is driven to a position that is mechanically impossible on the real bot.
The simulation is written in Java using Java 3D (which renders through OpenGL).
I can now visualize the hexapod and control the real hexapod at the same time. The servo positions are sent to the bot over a serial cable.
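To sketch what the serial link looks like: the CH3-R kit typically ships with an SSC-32-style servo controller, which takes plain ASCII group-move commands of the form "#<channel>P<pulse width>T<time>\r". This is an illustrative sketch under that assumption, not my actual code; the class and method names here are made up.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

// Sketch of the serial link, assuming an SSC-32-style servo controller.
// The command format "#<ch>P<us>...T<ms>\r" moves all listed servos so they
// arrive at their targets at the same time.
public class ServoLink {
    private final OutputStream port; // e.g. the serial port's output stream

    public ServoLink(OutputStream port) {
        this.port = port;
    }

    /** Builds one group-move command: all servos reach their targets together. */
    static String buildCommand(int[] channels, int[] pulseWidthsUs, int moveTimeMs) {
        StringBuilder cmd = new StringBuilder();
        for (int i = 0; i < channels.length; i++) {
            cmd.append('#').append(channels[i])
               .append('P').append(pulseWidthsUs[i]); // pulse width in microseconds
        }
        return cmd.append('T').append(moveTimeMs).append('\r').toString();
    }

    public void move(int[] channels, int[] pulseWidthsUs, int moveTimeMs) throws IOException {
        port.write(buildCommand(channels, pulseWidthsUs, moveTimeMs)
                .getBytes(StandardCharsets.US_ASCII));
        port.flush();
    }
}
```

For example, moving servos 0 and 1 to 1500 and 1600 µs over 500 ms would send "#0P1500#1P1600T500\r".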
http://img486.imageshack.us/img486/7338/ch3rgg5.png
The white panels with the red dot in the middle are used to control the most important functions:
(1, leftmost): center point of the body relative to the legs
(2): controls pitch and yaw of the body
(3, 4): controls the rotation point, which sets the direction of the hexapod (middle -> turn in place, rightmost -> walk straight ahead, up -> walk sideways, etc.)
The gait control panel (top right) controls the type of the gait.
The gait algorithm uses an input value between 0.0 and 1.0 to control the phase of the step.
Leg Ground Time: controls how long the foot stays on the floor during each step cycle.
1, 2, 3, 4, 5, 6: control the phase offset of each leg
This is quite a general way to specify gaits: I can switch between tripod, ripple, and wave gaits just by changing these values (see video 02:12-02:35).
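The phase-offset idea above can be sketched in a few lines of Java. This is only an illustration of the principle, not my actual gait code; the names (legStride, groundTime, phaseOffset) are my own. Each leg takes the global 0.0-1.0 phase, adds its offset, and spends the first groundTime fraction of its local cycle on the floor (stance) and the rest in the air (swing):

```java
// Minimal sketch of phase-offset gait generation; hypothetical names.
public class GaitPhase {
    /**
     * Stride position of one leg for a global gait phase in [0, 1).
     * Result sweeps from +0.5 to -0.5 while the foot is on the ground
     * (stance) and back from -0.5 to +0.5 while it swings forward.
     */
    static double legStride(double globalPhase, double phaseOffset, double groundTime) {
        double p = (globalPhase + phaseOffset) % 1.0; // this leg's local phase
        if (p < groundTime) {
            // stance: foot on the floor, body moves forward over it
            return 0.5 - p / groundTime;
        }
        // swing: foot lifted, returning forward for the next step
        return -0.5 + (p - groundTime) / (1.0 - groundTime);
    }

    public static void main(String[] args) {
        // Tripod gait: alternate legs offset by half a cycle, with half the
        // cycle spent on the ground. Ripple/wave gaits just use different
        // offsets and a larger ground-time fraction.
        double[] offsets = {0.0, 0.5, 0.0, 0.5, 0.0, 0.5};
        for (double t = 0.0; t < 1.0; t += 0.25) {
            StringBuilder row = new StringBuilder(String.format("t=%.2f:", t));
            for (double off : offsets)
                row.append(String.format(" %+.2f", legStride(t, off, 0.5)));
            System.out.println(row);
        }
    }
}
```

With these values, legs 1/3/5 and 2/4/6 are always in opposite phases; setting offsets of 0, 1/6, 2/6, ... with a ground time of 5/6 instead would give a wave gait.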
Here is the video of the simulation software in action:
youtube.com/watch?v=QkrTDEijRIM
I was only able to record the 3D part of the UI, because the recording software captures the rendered frames directly from the OpenGL pipeline.
Martin



