Simulated Articulated Entities

This tutorial teaches you how to create six-degree of freedom joints and compose them with simple capsule shapes to build articulated, motor driven arms.

Simulated Articulated Entities Visualizer - Articulated Arm in Simulation Visualizer

Simulated Articulated Entities Physics - Articulated Arm in Simulation Physics View

This tutorial is provided in the C# language. You can find the project files for this tutorial at the following location under the Microsoft Robotics Developer Studio installation folder:

Samples\SimulationTutorials\Advanced\Simulated Articulated Entities

This tutorial teaches you how to:

  • Create and Start Services.
  • Configure a Joint.
  • Understand Entity Implementation, Define the Segments and Joints.
  • Run the Tutorial.

See Also:

  • Getting Started

Getting Started

The service for this tutorial already exists and is available under the Microsoft Robotics Developer Studio installation folder, in the samples\SimulationTutorials\Advanced\SimulatedArticulatedEntities directory.

Begin by loading SimulatedArticulatedEntities.csproj in Microsoft Visual Studio.

Required References

This tutorial requires the following references in addition to those used in the "Simulation Empty Project" tutorial:

  • RoboticsCommon.Proxy - Type definitions for abstract services such as the Articulated Arm Service.
  • SimulatedLBR3Arm.2006.M07.Proxy - Proxy library for the simulated LBR3 arm service.
  • SimulationCommon.Proxy - Proxy library for the common simulation types. Required for sending requests to simulated arm service.

Simulated Articulated Entities References - SimulatedArticulatedEntities References

The following additional using statements were added:

using arm = Microsoft.Robotics.Services.ArticulatedArm.Proxy;

using Microsoft.Robotics.PhysicalModel;
using physicalmodelproxy = Microsoft.Robotics.PhysicalModel.Proxy;

Using Types from the PhysicalModel Namespace

The Microsoft.Robotics.PhysicalModel namespace defines types for describing the physical characteristics of robots. Since these types are also used by the simulation to describe entities and by simulation services, the namespace and types from Microsoft.Robotics.Simulation.Physics are unified into the PhysicalModel namespace. However, to avoid a runtime dependency on XNA, where common types such as Vector3 and Quaternion are defined, new basic types with those names are also defined in the new namespace.

For services talking to simulation services, or to real services that use these new physical model types, there is no additional complexity: you simply link with the proxy for the remote service and send messages as usual. For simulation services, however, or for a service that needs to perform calculations involving quaternions, things are less straightforward. The primary issues are:

  • Because static methods are not carried over into generated proxies, the static helper methods for vector and quaternion manipulation are not available in the physicalmodelproxy namespace. This means you have to reference both RoboticsCommon.dll (where the physical model is defined) and RoboticsCommon.Proxy.dll. It is good practice to use a namespace alias for the proxy namespace to avoid confusion, since the types have the same names (for example, the code above uses the physicalmodelproxy alias).
  • Entities are linked against RoboticsCommon.dll and the Microsoft.Robotics.PhysicalModel namespace. This means that if your simulation service uses the physical model proxy types, you need to convert them to the non-proxy types before passing them to entities.

A good example for dealing with these issues is the SimulatedLBR3Arm service under samples\simulation\ArticulatedArms\LBR3.
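
For example, a pair of small helper methods along the following lines (illustrative only, not part of the SDK) can be used inside a simulation service to convert between the proxy and non-proxy Vector3 types, so that the static math helpers defined on the non-proxy types remain usable:

// Illustrative helpers only (not part of the SDK). With the using statements
// shown above, the unqualified Vector3 is the non-proxy
// Microsoft.Robotics.PhysicalModel.Vector3 type.
static Vector3 FromProxy(physicalmodelproxy.Vector3 v)
{
    return new Vector3(v.X, v.Y, v.Z);
}

static physicalmodelproxy.Vector3 ToProxy(Vector3 v)
{
    return new physicalmodelproxy.Vector3(v.X, v.Y, v.Z);
}

// The same pattern applies to Quaternion and the other physical model types.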

Step 1: Create and Start Services

Service Start

This tutorial starts by calling the PopulateWorld method, which adds entities to the simulation world. The earlier simulation tutorials generally loaded scenes from SimulationEngineScene state files; this tutorial shows an alternate way of adding entities to a scene programmatically.

protected override void Start()
{
    base.Start();
    // Orient sim camera view point
    SetupCamera();
    // Add objects (entities) in our simulated world
    PopulateWorld();
}

private void SetupCamera()
{
    // Set up initial view
    CameraView view = new CameraView();
    view.EyePosition = new Vector3(1.8f, 0.598689f, -1.28f);
    view.LookAtPoint = new Vector3(0, 0, 0.2830455f);
    SimulationEngine.GlobalInstancePort.Update(view);
}

private void PopulateWorld()
{
    AddSky();
    AddGround();
    AddDomino(new Vector3(0.3f, 0, 0.3f), 10);
    SpawnIterator(AddArticulatedArm);
}

Simulation Arm Service and Entity

The method AddArticulatedArm creates a simulation entity for an industrial articulated arm robot and a corresponding service used to control the entity. An iterator is used because the method performs multiple asynchronous operations in sequence.

IEnumerator<ITask> AddArticulatedArm()
{
    Vector3 position = new Vector3(0, 0, 0);

    // Create an instance of our custom arm entity.
    // Source code for this entity can be found under
    // Samples\Simulation\Entities\Entities.cs
    KukaLBR3Entity entity = new KukaLBR3Entity(position);

    // Name the entity
    entity.State.Name = "LBR3Arm";

    // Insert entity in simulation.
    SimulationEngine.GlobalInstancePort.Insert(entity);

    // create simulated arm service
    DsspResponsePort<CreateResponse> resultPort = CreateService(
        Microsoft.Robotics.Services.Simulation.LBR3Arm.Proxy.Contract.Identifier,
        Microsoft.Robotics.Simulation.Partners.CreateEntityPartner("https://localhost/" + entity.State.Name));

    // asynchronously handle service creation result.
    yield return Arbiter.Choice(resultPort,
        delegate(CreateResponse rsp)
        {
            _armServicePort = ServiceForwarder<arm.ArticulatedArmOperations>(rsp.Service);
        },
        delegate(Fault fault)
        {
            LogError(fault);
        });

    if (_armServicePort == null)
        yield break;

    // we re-issue the get until we get a response with a fully initialized state
    do
    {
        yield return Arbiter.Receive(false, TimeoutPort(1000), delegate(DateTime signal) { });

        yield return Arbiter.Choice(
            _armServicePort.Get(new GetRequestType()),
            delegate(arm.ArticulatedArmState state)
            {
                _cachedArmState = state;
            },
            delegate(Fault f)
            {
                LogError(f);
            });

        // exit on error
        if (_cachedArmState == null)
            yield break;
    } while (_cachedArmState.Joints == null);

    // Start a timer that will move the arm in a loop
    // Comment out the line below if you want to control
    // the arm through SimpleDashboard
    // Spawn<DateTime>(DateTime.Now, MoveArm);
}

An entity is created and inserted into the simulation engine, along with a service for that entity. In addition, the following steps are taken:

  1. The response from the CreateService method is handled asynchronously and a port is created for communicating with the new service.
  2. A DSSP Get operation retrieves the service state. The articulated arm state definition contains a list of Joint instances that describe the joints between the arm segments.
  3. The MoveArm method is spawned (when uncommented).

The MoveArm method issues requests to the simulated articulated arm service to control the various joints that link the arm shapes. Since the simulation service uses the abstract definition of an articulated arm service, this method works with any service implementing those types.

void MoveArm(DateTime signal)
{
    _angleInRadians += 0.005f;
    float angle = (float)Math.Sin(_angleInRadians);

    // Create set pose operation. Notice we have specified
    // null for the response port to eliminate a roundtrip
    arm.SetJointTargetPose setPose = new arm.SetJointTargetPose(
        new arm.SetJointTargetPoseRequest(),
        null);

    // specify by name, which joint to orient
    setPose.Body.JointName = _cachedArmState.Joints[0].State.Name;
    setPose.Body.TargetOrientation = new physicalmodelproxy.AxisAngle(
        new physicalmodelproxy.Vector3(1, 0, 0), angle);

    // issue request to arm service for joint0 move.
    _armServicePort.Post(setPose);

    // now move other joints.
    setPose.Body.JointName = _cachedArmState.Joints[1].State.Name;
    _armServicePort.Post(setPose);
    setPose.Body.JointName = _cachedArmState.Joints[2].State.Name;
    _armServicePort.Post(setPose);
    setPose.Body.JointName = _cachedArmState.Joints[3].State.Name;
    _armServicePort.Post(setPose);
    setPose.Body.JointName = _cachedArmState.Joints[4].State.Name;
    _armServicePort.Post(setPose);

    // re-issue the timer request so we wake up again
    Activate(Arbiter.Receive(false, TimeoutPort(15), MoveArm));
}

The method issues a SetJointTargetPose request every 15 milliseconds, gradually moving each joint around its twist axis over the range of -1 to +1 radians. For the simulated LBR3 robotic arm, all joints have been set up with a single unlocked degree of freedom.

Step 2: Configure a Joint

Before going into more detail on the code implementing the simulated arm, look at how joints are specified in the simulation and the underlying physics engine. For a more detailed overview, download the latest NVIDIA PhysX Technology documentation and review the section on joints, especially 6DoF joints. The JointProperties data type (the state of a joint) is defined in Samples\Common\PhysicalModel.cs as follows:

/// <summary>
/// Joint properties
/// </summary>
[DataContract]
public class JointProperties
{
    /// <summary>
    /// Joint name must be unique for all joints between an entity pair.
    /// </summary>
    [DataMember]
    [Description("The descriptive identifier for the joint.")]
    public string Name;

    /// <summary>
    /// Pair of entities joined through this joint
    /// </summary>
    [DataMember]
    [Description("The pair of entities connected through this joint.")]
    public EntityJointConnector[] Connectors = new EntityJointConnector[2];

    /// <summary>
    /// Maximum force supported by the joint
    /// </summary>
    [DataMember]
    [Description("The maximum force supported by the joint.")]
    public float MaximumForce;

    /// <summary>
    /// Maximum torque supported by the joint
    /// </summary>
    [DataMember]
    [Description("The maximum torque supported by the joint.")]
    public float MaximumTorque;

    /// <summary>
    /// Enables collision modelling between entities coupled by the joint
    /// </summary>
    [DataMember]
    [Description("Enables collision between entities couples by the joint.")]
    public bool EnableCollisions;

    /// <summary>
    /// Underlying physics mechanism to compensate joint simulation errors
    /// </summary>
    [DataMember]
    [Description("Underlying physics mechanism to compensate joint simulation errors.")]
    public JointProjectionProperties Projection;

    /// <summary>
    /// If set, defines a joint with translation/linear position drives
    /// </summary>
    [DataMember]
    [Description("Identifies if the joint supports translation/linear position drives.")]
    public JointLinearProperties Linear;

    /// <summary>
    /// If set, defines a joint with angular drives.
    /// </summary>
    [DataMember]
    [Description("Identifies if the joint supports angular drives.")]
    public JointAngularProperties Angular;


    // ... Constructors omitted for clarity
}

The simulation engine exposes only one type of joint, which can be configured to express all common joint configurations. The joint is essentially a six degree-of-freedom (6DoF) joint for which you select which degrees of freedom are unlocked or limited. A joint connects two physics entities and constrains their motion relative to each other. The primary elements of a joint are:

  • The degrees of freedom that are unlocked (free), locked or limited.
  • The entities being connected and the local point, in each entity's local coordinate frame, at which the joint attaches.
  • The joint axis, specified for each entity. The primary joint axis is assigned to the "twist" angular degree of freedom.
  • The joint normal that shows which way the joint is oriented.
  • An optional drive and spring mechanism for each unlocked DoF.

The Joint Frame

The joint orientation is defined by specifying two vectors which define a joint frame at each connection point. The three axes which define the joint frame are called the Local or Joint Axis, the Normal Axis, and the Binormal Axis as shown in the following diagram.

Simulated Articulated Entities Joints - The axes that define the joint frame.

The joint is oriented by specifying the Local Axis vector and the Normal Axis vector. The third vector, the Binormal Axis, is derived from the other two by calculating their cross product.

Each vector in the joint frame corresponds to a degree of freedom of the joint. The Joint Axis corresponds to the Twist degree of freedom. If this degree of freedom is unlocked, the joint will move freely around the Joint Axis. The Normal Axis corresponds to the Swing1 degree of freedom and the Binormal Axis corresponds to the Swing2 degree of freedom.

For example, if you want the joint to freely rotate around the X axis, you could specify the Joint Axis as (1,0,0) and the Normal Axis as (0,1,0) and then unlock the Twist degree of freedom.

If a different frame from connector 0 is specified for connector 1, the entity specified in connector 1 will be rotated so that the two frames line up.
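
To make this concrete, the following sketch (not taken from the tutorial source; parentEntity and childEntity are hypothetical entity references) configures a hinge that rotates freely around the X axis, using the same types that appear later in this tutorial:

// Unlock only the twist degree of freedom, which corresponds to the Joint Axis.
JointAngularProperties hingeAngular = new JointAngularProperties();
hingeAngular.TwistMode = JointDOFMode.Free;

// Joint Axis (local axis) is +X, Normal Axis is +Y; the Binormal Axis (+Z)
// is derived from their cross product. Both connectors attach at the
// entities' local origins in this sketch.
EntityJointConnector[] hingeConnectors = new EntityJointConnector[2]
{
    new EntityJointConnector(parentEntity, new Vector3(0, 1, 0), new Vector3(1, 0, 0), new Vector3(0, 0, 0)),
    new EntityJointConnector(childEntity, new Vector3(0, 1, 0), new Vector3(1, 0, 0), new Vector3(0, 0, 0))
};

Joint hinge = new Joint();
hinge.State = new JointProperties(hingeAngular, hingeConnectors);
hinge.State.Name = "HingeJoint";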

Spherical Joint Example

To model a spherical joint, you lock all linear degrees of freedom (or simply do not specify anything for JointProperties.Linear) and unlock all angular degrees of freedom. An example of a spherical joint configuration is given below:

JointAngularProperties angular = new JointAngularProperties();
angular.TwistMode = JointDOFMode.Free;
angular.Swing1Mode = JointDOFMode.Free;
angular.Swing2Mode = JointDOFMode.Free;

angular.TwistDrive = new JointDriveProperties(JointDriveMode.Position,
    new SpringProperties(500, 10, 0), 1000);
angular.SwingDrive = new JointDriveProperties(JointDriveMode.Position,
    new SpringProperties(500, 10, 0), 1000);

JointProperties sphericalJoint = new JointProperties(angular, null, null);

The code snippet above unlocks the three angular degrees of freedom and enables drive mechanisms for the twist and swing DoFs. The joint can now be controlled with the SetJointTargetPose request, shown in the MoveArm method. To control the joint drive using SetJointTargetVelocity instead, set the drive mode to Velocity:

angular.TwistDrive = new JointDriveProperties(JointDriveMode.Velocity,
    new SpringProperties(500, 10, 0), 1000);

Notice that you can also attach a spring to each joint DoF drive mechanism. The spring coefficient in SpringProperties models how much force is applied to move the joint toward its target position. The damping coefficient determines whether the joint oscillates as it approaches the target.
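
For example, assuming the SpringProperties constructor arguments are the spring coefficient, the damping coefficient, and the equilibrium point (the ordering used in the snippets above), different tunings behave roughly as follows:

// Illustrative values only. A large spring coefficient with little damping
// pulls the joint to its target quickly but tends to overshoot and oscillate;
// raising the damping coefficient settles the motion more smoothly.
SpringProperties stiffAndBouncy = new SpringProperties(50000, 10, 0);
SpringProperties stiffAndDamped = new SpringProperties(50000, 5000, 0);

// Attach the chosen spring to the twist drive defined earlier in this step.
angular.TwistDrive = new JointDriveProperties(
    JointDriveMode.Position, stiffAndDamped, 1000);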

Step 3: Understand Entity Implementation, Define the Segments and Joints

The source code for all entities supplied with Robotics Developer Studio is available as part of the samples. The samples\simulation\entities\entities.cs file under the Robotics Developer Studio installation directory contains the implementation for all built-in entities and is compiled as part of SimulationEngine.dll. This tutorial focuses on the KukaLBR3Entity, which simulates an articulated arm with 7 arm segments and 6 joints.

The implementation of this entity takes advantage of the ParentJoint that is created when a child entity is attached to a parent entity. The ParentJoint is usually a rigid joint that simply glues the entities together. In this entity, we define a class called CustomJointSingleShapeEntity which subclasses the SingleShapeEntity class and adds functionality to replace the rigid ParentJoint with a custom joint. The code for this class is shown below.

/// <summary>
/// Defines a new entity type that overrides the ParentJoint with 
/// custom joint properties.  It also handles serialization and
/// deserialization properly.
/// </summary>
[DataContract]
public class CustomJointSingleShapeEntity : SingleShapeEntity
{
    private Joint _customJoint;

    /// <summary>
    /// A custom joint definition used for the ParentJoint
    /// </summary>
    [DataMember]
    public Joint CustomJoint
    {
        get { return _customJoint; }
        set { _customJoint = value; }
    }

    /// <summary>
    /// Default constructor
    /// </summary>
    public CustomJointSingleShapeEntity() { }

    /// <summary>
    /// Initialization constructor
    /// </summary>
    /// <param name="shape"></param>
    /// <param name="initialPos"></param>
    public CustomJointSingleShapeEntity(Shape shape, Vector3 initialPos)
        : base(shape, initialPos)
    {
    }

    /// <summary>
    /// Initialization override which adds support for custom joint initialization
    /// </summary>
    /// <param name="device"></param>
    /// <param name="physicsEngine"></param>
    public override void Initialize(Microsoft.Xna.Framework.Graphics.GraphicsDevice device, PhysicsEngine physicsEngine)
    {
        base.Initialize(device, physicsEngine);

        // update the parent joint to match our custom joint parameters
        if (_customJoint != null)
        {
            if ((ParentJoint != null) && (ParentJoint.InternalHandle != (IntPtr)0))
                PhysicsEngine.DeleteJoint((PhysicsJoint)ParentJoint);

            PhysicsEntity.SolverIterationCount = 128;
            Parent.PhysicsEntity.SolverIterationCount = 128;

            _customJoint.State.Connectors[0].Entity = Parent;
            _customJoint.State.Connectors[1].Entity = this;

            ParentJoint = _customJoint;
            PhysicsEngine.InsertJoint((PhysicsJoint)ParentJoint);
        }
    }
}

A property called CustomJoint is added and given the [DataMember] attribute so that it serializes properly. The interesting code is in the override of the Initialize method: if CustomJoint has been defined, the old ParentJoint is deleted (if it is valid) and replaced with the custom joint. This makes it possible to define very complex articulated entities simply by building segments from CustomJointSingleShapeEntity and adding each one as a child of the entity it is attached to with a joint.

The code for the first part of the entity is shown below.

/// <summary>
/// Models KUKA LBR3 robotic arm
/// </summary>
[DataContract]
[CLSCompliant(true)]
[EditorAttribute(typeof(GenericObjectEditor), typeof(System.Drawing.Design.UITypeEditor))]
public class KukaLBR3Entity : SingleShapeEntity
{
    const float ARM_MASS = 1f;
    const float ARM_THICKNESS = 0.03f;
    const float ARM_LENGTH = 0.075f;
    const float DELTA = 0.000f;

    // approximation of the Lbr3 arms for arms 1->5
    // base is considered arm/link 0, and the end effectors are 6 and 7
    float ARM_LENGTH2 = ARM_LENGTH + ARM_THICKNESS * 2;

    List<Joint> _joints = new List<Joint>();
    /// <summary>
    /// Joints
    /// </summary>
    [Description("All of the joints in the entity.")]
    public List<Joint> Joints
    {
        get
        {
            if (_joints.Count == 0)
            {
                VisualEntity entity = this;
                while (true)
                {
                    if (entity.Children.Count == 0)
                        return _joints;

                    if (entity.Children[0].ParentJoint != null)
                        _joints.Add(entity.Children[0].ParentJoint);

                    entity = entity.Children[0];
                }
            }
            return _joints;
        }
        set { _joints = value; }
    }

Some physical attributes of the arm are defined, along with a Joints property that gathers and returns a list of all the joints in the entity by traversing the chain of child entities. Following the definition of the CustomJointSingleShapeEntity, the SegmentDescriptor class holds the information describing each segment of the arm. Its CreateSegment method creates a CustomJointSingleShapeEntity object with the specified attributes, including the custom joint attributes.

private class SegmentDescriptor
{
    public string Name;
    public string Mesh;
    public float Radius;
    public float Length;
    public float Mass;
    public Vector3 MeshRotation;
    public Vector3 MeshOffset;
    public Vector3 NormalAxis;
    public Vector3 LocalAxis;
    public Vector3 Connect0;
    public Vector3 Connect1;
    public Shape Shape;
    public SegmentDescriptor()
    {
        Name = Mesh = string.Empty;
        Radius = Length = 0f;
        Mass = 1;
        NormalAxis = LocalAxis = Connect0 = Connect1 = new Vector3();
        MeshRotation = MeshOffset = new Vector3();
        Shape = null;
    }

    public CustomJointSingleShapeEntity CreateSegment()
    {
        CustomJointSingleShapeEntity result = null;
        if (Shape != null)
        {
            result = new CustomJointSingleShapeEntity(Shape, new Vector3());
        }
        else
        {
            CapsuleShape capsule = new CapsuleShape(new CapsuleShapeProperties(
                Mass,
                new Pose(new Vector3(0, Length / 2 + Radius, 0)),
                Radius,
                Length));

            result = new CustomJointSingleShapeEntity(capsule, new Vector3());
        }
        result.State.Name = "Segment"+Name;
        result.State.Assets.Mesh = Mesh;
        result.State.Pose.Orientation = new Quaternion(0, 0, 0, 1);
        result.MeshTranslation = MeshOffset;
        result.MeshRotation = MeshRotation;
        result.State.MassDensity.AngularDamping = 5000;
        result.State.MassDensity.LinearDamping = 5000;


        JointAngularProperties angular = new JointAngularProperties();
        angular.TwistMode = JointDOFMode.Free;
        angular.TwistDrive = new JointDriveProperties(
            JointDriveMode.Position,
            new SpringProperties(500000, 100000, 0),
            1000000);

        EntityJointConnector[] connectors = new EntityJointConnector[2]
        {
            new EntityJointConnector(null, NormalAxis, LocalAxis, Connect0),
            new EntityJointConnector(null, NormalAxis, LocalAxis, Connect1)
        };

        result.CustomJoint = new Joint();
        result.CustomJoint.State = new JointProperties(angular, connectors);
        result.CustomJoint.State.Name = "Joint" + Name;

        return result;
    }
}

This entity, just like all the others, provides a default constructor that takes no parameters. This is the constructor that is used when the entity is deserialized. For this entity, the default constructor is empty because all of the necessary state is restored when the entity is deserialized.

The non-default constructor builds the entity by defining the arm segments, creating them, and then joining them together. Notice that KukaLBR3Entity is a subclass of SingleShapeEntity; it contains the capsule shape that forms the base of the arm. By default, this base entity is kinematic, which means that the physics engine will not move it. If you wish to attach this entity to another entity, clear the Kinematic flag from the State.Flags after the constructor is called; you can then add the base entity as a child of another entity. The code for the non-default constructor is shown below.

/// <summary>
/// Default constructor
/// </summary>
public KukaLBR3Entity()
{
}

/// <summary>
/// Initialization constructor
/// </summary>
/// <param name="position"></param>
public KukaLBR3Entity(Vector3 position)
{
    State.Pose.Position = position;

    const float BaseMass = 1;
    const float BaseRadius = 0.03f;
    const float BaseHeight = 0.015f;
    Vector3 XAxis = new Vector3(1, 0, 0);
    Vector3 YAxis = new Vector3(0, 1, 0);
    Vector3 ZAxis = new Vector3(0, 0, 1);
    Vector3 NegativeZAxis = new Vector3(0, 0, -1);

    // Base (this entity)
    this.CapsuleShape = new CapsuleShape(new CapsuleShapeProperties(
        BaseMass,
        new Pose(new Vector3(0, BaseRadius + BaseHeight / 2, 0)),
        BaseRadius,
        BaseHeight));

    // define arm segments and connection points
    SegmentDescriptor[] _segments = new SegmentDescriptor[7];

    // Segment 0
    _segments[0] = new SegmentDescriptor();
    _segments[0].Name = "0 " + Guid.NewGuid().ToString();
    _segments[0].Mesh = "lbr3_j1.obj";
    _segments[0].Radius = ARM_THICKNESS;
    _segments[0].Length = ARM_LENGTH;
    _segments[0].Mass = 1;
    _segments[0].NormalAxis = ZAxis;
    _segments[0].LocalAxis = YAxis;
    _segments[0].Connect0 = new Vector3(0, BaseRadius * 2 + BaseHeight, 0);
    _segments[0].Connect1 = new Vector3(0, 0, 0);

    // Segment 1
    _segments[1] = new SegmentDescriptor();
    _segments[1].Name = "1 " + Guid.NewGuid().ToString();
    _segments[1].Mesh = "lbr3_j1.obj";
    _segments[1].Radius = ARM_THICKNESS;
    _segments[1].Length = ARM_LENGTH;
    _segments[1].Mass = 1;
    _segments[1].MeshOffset = new Vector3(0, ARM_LENGTH + 2 * ARM_THICKNESS, 0);
    _segments[1].MeshRotation = new Vector3(180, 0, 0);
    _segments[1].NormalAxis = YAxis;
    _segments[1].LocalAxis = NegativeZAxis;
    _segments[1].Connect0 = new Vector3(0, ARM_LENGTH2, 0);
    _segments[1].Connect1 = new Vector3(0, 0, 0);

    // Segment 2
    _segments[2] = new SegmentDescriptor();
    _segments[2].Name = "2 " + Guid.NewGuid().ToString();
    _segments[2].Mesh = "lbr3_j1.obj";
    _segments[2].Radius = ARM_THICKNESS;
    _segments[2].Length = ARM_LENGTH;
    _segments[2].Mass = 1;
    _segments[2].MeshOffset = new Vector3(0, 0, 0);
    _segments[2].MeshRotation = new Vector3(0, 180, 0);
    _segments[2].NormalAxis = ZAxis;
    _segments[2].LocalAxis = YAxis;
    _segments[2].Connect0 = new Vector3(0, ARM_LENGTH2, 0);
    _segments[2].Connect1 = new Vector3(0, 0, 0);

    // Segment 3
    _segments[3] = new SegmentDescriptor();
    _segments[3].Name = "3 " + Guid.NewGuid().ToString();
    _segments[3].Mesh = "lbr3_j1.obj";
    _segments[3].Radius = ARM_THICKNESS;
    _segments[3].Length = ARM_LENGTH;
    _segments[3].Mass = 1;
    _segments[3].MeshOffset = new Vector3(0, ARM_LENGTH + 2 * ARM_THICKNESS, 0);
    _segments[3].MeshRotation = new Vector3(0, 0, 180);
    _segments[3].NormalAxis = YAxis;
    _segments[3].LocalAxis = NegativeZAxis;
    _segments[3].Connect0 = new Vector3(0, ARM_LENGTH2, 0);
    _segments[3].Connect1 = new Vector3(0, 0, 0);

    // Segment 4
    _segments[4] = new SegmentDescriptor();
    _segments[4].Name = "4 " + Guid.NewGuid().ToString();
    _segments[4].Mesh = "lbr3_j5.obj";
    _segments[4].Radius = ARM_THICKNESS;
    _segments[4].Length = ARM_LENGTH;
    _segments[4].Mass = 1;
    _segments[4].MeshOffset = new Vector3(0, 0, 0);
    _segments[4].MeshRotation = new Vector3(0, 0, -90);
    _segments[4].NormalAxis = ZAxis;
    _segments[4].LocalAxis = YAxis;
    _segments[4].Connect0 = new Vector3(0, ARM_LENGTH2, 0);
    _segments[4].Connect1 = new Vector3(0, 0, 0);

    // Segment 5
    _segments[5] = new SegmentDescriptor();
    _segments[5].Name = "5 " + Guid.NewGuid().ToString();
    _segments[5].Mesh = "lbr3_j6.obj";
    _segments[5].Radius = ARM_THICKNESS;
    _segments[5].Length = ARM_LENGTH;
    _segments[5].Mass = 1;
    _segments[5].MeshOffset = new Vector3(0, 0, 0.01f);
    _segments[5].MeshRotation = new Vector3(0, 0, -90);
    _segments[5].NormalAxis = YAxis;
    _segments[5].LocalAxis = NegativeZAxis;
    _segments[5].Connect0 = new Vector3(0, ARM_LENGTH2 - 0.01f, 0);
    _segments[5].Connect1 = new Vector3(0, 0, 0.02f - 0.006f);
    _segments[5].Shape = new CapsuleShape(new CapsuleShapeProperties(1, new Pose(), 0.02f, 0.01f));

    // Segment 6
    _segments[6] = new SegmentDescriptor();
    _segments[6].Name = "6 " + Guid.NewGuid().ToString();
    _segments[6].Mesh = string.Empty;
    _segments[6].Radius = ARM_THICKNESS;
    _segments[6].Length = ARM_LENGTH;
    _segments[6].Mass = 0.1f;
    _segments[6].MeshOffset = new Vector3(0, 0.015f, 0);
    _segments[6].MeshRotation = new Vector3(0, 0, 0);
    _segments[6].NormalAxis = XAxis;
    _segments[6].LocalAxis = YAxis;
    _segments[6].Connect0 = new Vector3(0, 0.025f, 0.01f);
    _segments[6].Connect1 = new Vector3(0, 0, 0);
    _segments[6].Shape =  new BoxShape(new BoxShapeProperties(0.1f,
        new Pose(new Vector3(0, 0, 0),
        TypeConversion.FromXNA(xna.Quaternion.CreateFromAxisAngle(new xna.Vector3(0, 1, 0), -(float)Math.PI / 2))
        ),
        new Vector3(0.05f, 0.002f, 0.05f)));

    Vector3 initialPosition = new Vector3(0, ARM_LENGTH, 0);
    // start out with the base entity as parent
    VisualEntity parent = this; 
    // insert each entity as a child of the previous
    foreach (SegmentDescriptor desc in _segments)
    {
        if (desc != null)
        {
            CustomJointSingleShapeEntity child = desc.CreateSegment();
            child.State.Pose.Position = initialPosition;
            child.CustomJoint.State.Connectors[0].Entity = parent;
            child.CustomJoint.State.Connectors[1].Entity = child;
            parent.InsertEntity(child);
            parent = child;
            initialPosition += new Vector3(0, ARM_LENGTH, 0);
        }
    }

    // Make the base kinematic.  Clear this flag after calling the constructor
    // to attach to another entity.
    State.Flags |= EntitySimulationModifiers.Kinematic;
    State.Assets.Mesh = "lbr3_j0.obj";
}

Each segment is defined by specifying its attributes in the _segments array. The attributes include the name, mesh, and physical attributes of the segment. Mesh rotation and offset is also specified for each segment to allow the same mesh to be reused for multiple segments. The orientation and connection points for each joint are also specified.

When all of the segments have been defined, each one is created and then added as a child to the previous segment. The rough position of each entity is also specified. If all of the segments are given the same position, the entity will move violently for the first few frames after it is created as the joints pull the pieces of the entity together.
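
As noted earlier, the constructor marks the base entity as kinematic. A minimal sketch of mounting the arm on another entity (mobileBase is a hypothetical parent entity you have already created) might look like this:

// Create the arm slightly above the parent's origin (illustrative offset).
KukaLBR3Entity armEntity = new KukaLBR3Entity(new Vector3(0, 0.5f, 0));
armEntity.State.Name = "MountedLBR3Arm";

// Clear the Kinematic flag set by the constructor so the physics engine is
// free to move the arm base together with its new parent.
armEntity.State.Flags &= ~EntitySimulationModifiers.Kinematic;

// Attach the arm as a child of the parent entity.
mobileBase.InsertEntity(armEntity);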

The remaining methods allow the Pose or Velocity target of each joint to be set. These joint properties can only be changed when the physics engine is not actively simulating, so the operations are deferred and executed during the Update method.

/// <summary>
/// Sets orientation only for angular drives
/// </summary>
/// <param name="j"></param>
/// <param name="axisAngle"></param>
public void SetJointTargetOrientation(Joint j, AxisAngle axisAngle)
{
    Task<Joint, AxisAngle> deferredTask = new Task<Joint, AxisAngle>(j, axisAngle, SetJointTargetOrientationInternal);
    DeferredTaskQueue.Post(deferredTask);
}

/// <summary>
/// Sets position and orientation depending on the DOF configuration of the joint
/// </summary>
/// <param name="j"></param>
/// <param name="pose"></param>
public void SetJointTargetPose(Joint j, Pose pose)
{
    Task<Joint, Pose> deferredTask = new Task<Joint, Pose>(j, pose, SetJointTargetPoseInternal);
    DeferredTaskQueue.Post(deferredTask);
}

/// <summary>
/// Sets angular or linear velocity
/// </summary>
/// <param name="j"></param>
/// <param name="velocity"></param>
public void SetJointTargetVelocity(Joint j, Vector3 velocity)
{
    Task<Joint, Vector3> deferredTask = new Task<Joint, Vector3>(j, velocity, SetJointTargetVelocityInternal);
    DeferredTaskQueue.Post(deferredTask);
}

void SetJointTargetPoseInternal(Joint j, Pose pose)
{
    if (j.State.Linear != null)
        ((PhysicsJoint)j).SetLinearDrivePosition(pose.Position);

    if (j.State.Angular != null)
        ((PhysicsJoint)j).SetAngularDriveOrientation(pose.Orientation);
}

void SetJointTargetOrientationInternal(Joint j, AxisAngle axisAngle)
{
    if (j.State.Angular != null)
    {
        // the physics engine doesn't deal well with axis angles in the range -2*Pi to -Pi
        // and Pi to 2*Pi.  Work around this problem by adjusting the angle to the range -Pi to Pi
        // and then map the angle to the range 2*Pi to 3*Pi if positive and -2*Pi to -3*Pi if negative.
        // This seems to get the physics engine to do the right thing.
        float fullRotation = (float)(2 * Math.PI);

        while (axisAngle.Angle < -Math.PI)
            axisAngle.Angle += fullRotation;

        while (axisAngle.Angle > Math.PI)
            axisAngle.Angle -= fullRotation;

        if (axisAngle.Angle < 0.0f)
            axisAngle.Angle -= fullRotation;
        else
            axisAngle.Angle += fullRotation;

        Quaternion target =
            TypeConversion.FromXNA(
            xna.Quaternion.CreateFromAxisAngle(TypeConversion.ToXNA(axisAngle.Axis), axisAngle.Angle)
            );
        ((PhysicsJoint)j).SetAngularDriveOrientation(target);
    }
}

void SetJointTargetVelocityInternal(Joint j, Vector3 velocity)
{
    if (j.State.Linear != null)
        ((PhysicsJoint)j).SetLinearDriveVelocity(velocity);
    else
        ((PhysicsJoint)j).SetAngularDriveVelocity(velocity);
}

Re-Use of Complex Geometry for Visual Representation

Each segment uses a capsule shape for its physics representation, except for the last one, which represents the actuator on the end of the arm. For the visual rendering, only four distinct meshes are actually used for the segments:

  • Lbr3_j0.obj is the mesh file for the base.
  • Lbr3_j1.obj is the mesh file used for arm links 1 through 4 (with 0 being the base link).
  • Lbr3_j5.obj is the mesh file for link 5.
  • Lbr3_j6.obj is the mesh file for the spherical link 6.

Simulated Articulated Entities Combined - Visual and Physics Representation

The mesh from Lbr3_j1.obj is rotated and transformed to show how you can use one mesh to visually render a complex robot with similar links. Also note that the real LBR3 prototype robot arm has two more end effector links that are not shown here for the sake of simplicity.

Joint Visualization

In the picture below, the base and the first arm segment both have the joint axis represented by the red arrow (positive Y-axis).

Simulated Articulated Entities Arm - Physics View - Base and First Segment

The joint that connects the base and the first segment illustrates two critical pieces of information that you must define properly when connecting entities with joints:

  • The joint axis, in the coordinate frame of each shape being linked by the joint. Notice that the Joint Axis is set to be the +Y axis. This is because the first segment rotates vertically around the center of the base.
  • The entity connection point which specifies where the joint connects with each entity. The coordinates are specified in the local coordinate frame for the entity. The point where the joint connects to the base is one half of the base height above its center (or the top of the base). The point where the joint connects to the first segment is at its origin. You might expect the joint to connect at the bottom of the first segment rather than its center. Notice that the LocalPose for the capsule shape in the first segment offsets the capsule shape in the Y direction so that the origin of the entity is actually the bottom of the capsule shape.

Step 4: Run the Tutorial

After you have built the project, you can run it from the DSS Command Prompt:

bin\dsshost32 -p:50000 -m:"samples\config\SimulatedArticulatedEntities.manifest.xml"

By default, Microsoft Visual Studio is configured to start each sample automatically. You can review these settings from within Visual Studio: in Solution Explorer, double-click the project's Properties node. On the Debug tab, Start external program references dsshost32.exe, which is in the bin subdirectory of the Microsoft Robotics Developer Studio installation folder (for example, C:\Users\[YourUserName]\Microsoft Robotics Dev Studio 2008\bin\dsshost32.exe, where [YourUserName] is the name you log in with).

Use the following command line arguments:

-p:50000 -m:"samples\config\SimulatedArticulatedEntities.manifest.xml"

Use the Microsoft Robotics Developer Studio installation folder as the working directory. You can then run the sample by pressing F5 or choosing the Debug -> Start Debugging menu command. The simulation window should open, displaying a moving, articulated arm. Then start your web browser and go to the following URL:

https://localhost:50000/directory

Simulated Articulated Entities Browser - Directory Service in Browser

From the directory listing above, click on the simulated arm service URL that starts with /simulatedlbr3. Now you should see the state of the simulated arm and all the individual joint properties.

Simulated Articulated Entities Arm State - State of The Simulated Arm

Note on Initial Obj File Conversion

The simulation engine automatically converts OBJ mesh files to a binary cache format for fast loading. The first time you run the tutorial, you will notice a significant delay (a minute or so) before Simulated Articulated Entities starts rendering properly.

After the OBJ file conversion is complete, you should see the LBR3 in its "zero" pose:

Simulated Articulated Entities Dominos - KUKA LBR3 Arm with Dominos

Using the Simple Dashboard

The default manifest for Simulated Articulated Entities also starts the Simple Dashboard service in the same node. In the dashboard User Interface, type localhost in the Machine text box, and click the Connect button. You should now see the following:

Simulated Articulated Entities Dashboard - Using Simple Dashboard to Move The Simulated Arm

Click the Connect button in the Articulated Arm group at the bottom of the form. A list of joints appears:

Simulated Articulated Entities Joints List - List of Joints Under Articulated Arm

Joint0 is selected by default, but you can double-click any joint to make it active. Specify an angle, in degrees, in the Angle text box and click the Apply Changes button. You should see the arm rotate around the joint you specified (Joint0 makes the entire arm rotate). Perform the following steps to see the arm interact with the domino boxes placed next to it:

  1. Select Joint1 from the list (Active Joint should display Joint1) and enter 50 for the angle. Click Apply Changes.
  2. Select Joint0 and enter -90 for the angle. Click Apply Changes.
  3. Watch the arm rotate and knock over the first box, which starts the domino cascade!

Simulated Articulated Entities Arm Action - Simulated Arm In Action

Summary

In this tutorial, you learned how to:

  • Create and Start Services.
  • Configure a Joint.
  • Understand Entity Implementation, Define the Segments and Joints.
  • Run the Tutorial.

The concepts in this tutorial are advanced; you should review the NVIDIA PhysX Technology documentation, especially the sections explaining 6DoF joint modeling in detail. Familiarity with 3D coordinate transforms is also essential. We chose quaternions to capture orientation for joints and shapes because they are more concise to serialize and easier to use. Unfortunately, this requires people familiar with the 4x4 matrices commonly used in the robotic arm domain to translate them to quaternions. The XNA rendering library used by the simulation engine offers a simple translation routine, Matrix.CreateFromQuaternion, to get a matrix from a quaternion.
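
For example, the following short sketch (assuming the xna alias maps to Microsoft.Xna.Framework, as in the entity source shown earlier) converts an axis-angle rotation to a quaternion and then to a 4x4 matrix:

// Rotate 90 degrees around the +Y axis.
xna.Quaternion q = xna.Quaternion.CreateFromAxisAngle(
    new xna.Vector3(0, 1, 0), (float)Math.PI / 2);

// Convert the quaternion to a 4x4 rotation matrix, and back again if needed.
xna.Matrix rotation = xna.Matrix.CreateFromQuaternion(q);
xna.Quaternion roundTrip = xna.Quaternion.CreateFromRotationMatrix(rotation);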

 

 
