As computers moved from government and corporate tools to personal tools, there was a shift from batch processing to interactive computing. One of the fronts where this was most visible was the rise of computer graphics. Early computer graphics were limited to simple line drawings, but as hardware improved, more complex graphics became possible.
Physical Simulation vs. Computer Graphics
Both worked together, but had different loops:
Computer Graphics
- Define the position, orientation, and shape of each object at time $t$.
- Render
- Define the position, orientation, and shape of each object at time $t + \Delta t$.
While this is a loop, it was still mostly driven by humans, “feeling out” how to move the objects in a scene from one time step to the next.
With the arrival of MathEngine in 2000, physics engines started to be used to simulate the physical world. This allowed for more realistic simulations, since the physics engine could handle the complex calculations needed to approximate real-world physics.
Physical Simulation
- Define the position, orientation, shape, mass, and friction properties of each object.
- Define how objects are attached together
- Start the simulation
Basic Physics Simulation
Say you have a ball of mass 2 kg placed a few meters above the ground. The physics engine will apply a gravitational force of roughly $F = mg \approx 2\,\mathrm{kg} \times 9.8\,\mathrm{m/s^2} \approx 19.6\,\mathrm{N}$ downwards on the ball. The acceleration of the ball will be $a = F/m = g \approx 9.8\,\mathrm{m/s^2}$, directed downwards.
Physics Engines
While we won't be using ODE in this class, it's important to understand the basic components of a physics engine.
dInitODE2(0); // initialize ODE
world = dWorldCreate(); // this holds all the physical objects and simulates their interactions
space = dHashSpaceCreate (0); // manages the spatial organization of objects for collision detection
contactgroup = dJointGroupCreate (0); // holds the contact joints generated during collision detection
dWorldSetGravity (world,0,0,-GRAVITY); // set gravity in the world
dWorldSetCFM (world,1e-5); // set the constraint force mixing parameter for stability
This piece of code initializes the ODE physics engine, creates a world to hold the physical objects, sets up a space for collision detection, and configures gravity and stability parameters.
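Each of these setup calls has a matching teardown call in the ODE API; a typical shutdown sequence at the end of the program looks like:
dJointGroupDestroy (contactgroup); // free the contact joints
dSpaceDestroy (space); // free the collision space and the geoms it contains
dWorldDestroy (world); // free the world and the bodies it contains
dCloseODE(); // shut the ODE library down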
Bodies
Bodies in ODE are the fundamental physical objects that can move and interact within the simulation. Each body has properties such as mass, position, velocity, and orientation.
We create a body and set its mass and position like so:
dBodyID body = dBodyCreate(world); // create a new body in the world
dMass mass;
dMassSetBox(&mass,1,1,1,1); // set mass properties for a box of size 1x1x1 with density 1
dMassAdjust(&mass,0.2f); // adjust mass to have a total mass of 0.2
dBodySetMass(body,&mass); // assign the mass properties to the body
dBodySetPosition(body,0,6,0); // set the initial position of the body to (0,6,0)
What this does is create a body in the world, give it the mass distribution of a 1x1x1 box, rescale its total mass to 0.2, and set its initial position to (0,6,0).
Bodies in this case will always be rectangular prisms, but you can define other shapes as well.
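For example, a ball can be set up the same way by giving the body a spherical mass distribution. A minimal sketch (the radius of 0.5 and density of 1 are placeholder values):
dBodyID ball = dBodyCreate(world); // create a new body for the ball
dMass ballMass;
dMassSetSphere(&ballMass,1,0.5); // mass properties of a sphere with density 1 and radius 0.5
dBodySetMass(ball,&ballMass); // assign the spherical mass properties to the body
dBodySetPosition(ball,0,6,0); // drop it from the same starting position as the box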
Orientation for Bodies
We use quaternions here. I love them and am used to them, but for this class we will only assume a layman's understanding of them. The main thing is that they let you avoid the gimbal lock problem that comes with Euler angles.
Without going too deep into them, specifying an orientation in 3D generally requires 4 numbers (x, y, z, w). The first 3 specify the axis of rotation, and the last specifies the amount of rotation. Encoded as a quaternion, a rotation by angle $\theta$ about the unit axis $(x, y, z)$ is $q = (x \sin\frac{\theta}{2},\ y \sin\frac{\theta}{2},\ z \sin\frac{\theta}{2},\ \cos\frac{\theta}{2})$.
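In practice you rarely fill in the four numbers by hand; ODE's dQuaternion type stores its components in (w, x, y, z) order, so it is easiest to build one from an axis and an angle. A minimal sketch (the z axis and 45-degree angle are placeholder values, and M_PI comes from math.h):
dQuaternion q;
dQFromAxisAndAngle (q,0,0,1,M_PI/4); // quaternion for a 45-degree rotation about the z axis
dBodySetQuaternion (body,q); // apply the orientation to the body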
If the object is a cylinder, we can specify its orientation with three numbers giving the direction its long (z) axis should point:
dMatrix3 R;
dRFromZAxis(R,x1,y1,z1);
dBodySetRotation(body,R);
Joints - How Bodies Move Relative to Each Other
Joints define how bodies are connected and how they can move relative to each other. There are several types of joints in ODE, including hinge joints, slider joints, and ball-and-socket joints.
dJointID hinge; // create a hinge joint
hinge = dJointCreateHinge (world,0); // create a hinge joint in the world
dJointAttach (hinge,body[0],body[1]); // attach the hinge joint to two bodies
dJointSetHingeAnchor (hinge,0,0,1); // set the anchor point of the hinge joint at (0,0,1)
dJointSetHingeAxis (hinge,x1,y1,z1); // set the axis of rotation for the hinge joint
Joints have to be defined before the simulation starts, and they determine how the bodies can move relative to each other.
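A hinge can also be given limits, and its current angle can be read back during the simulation. A brief sketch using ODE's standard hinge parameters (the ±45-degree limits are placeholder values; M_PI comes from math.h):
dJointSetHingeParam (hinge,dParamLoStop,-M_PI/4); // don't let the hinge bend below -45 degrees
dJointSetHingeParam (hinge,dParamHiStop,M_PI/4); // or above +45 degrees
dReal angle = dJointGetHingeAngle (hinge); // read the current angle back during the simulation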
Important Part - Be Aware of Degrees of Freedom
Possibly the most important part of joints is understanding how many degrees of freedom (DOF) a joint has. This defines how many independent movements the joint allows.
More degrees of freedom means more complex movement, but also more complexity in the simulation. We are usually looking to minimize the degrees of freedom to make the simulation more stable and efficient.
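To make that concrete, each joint type in ODE allows a different number of independent movements between the two bodies it connects. A small sketch (the anchor coordinates are placeholders):
dJointID hingeJoint = dJointCreateHinge (world,0); // 1 DOF: rotation about one axis
dJointID sliderJoint = dJointCreateSlider (world,0); // 1 DOF: translation along one axis
dJointID ballJoint = dJointCreateBall (world,0); // 3 DOF: rotation about a shared anchor point
dJointAttach (ballJoint,body[0],body[1]);
dJointSetBallAnchor (ballJoint,0,0,1); // the bodies may rotate freely about this shared point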
Insects
Insects like ants have a LOT of legs and joints. However, each joint only has a few degrees of freedom. This allows them to move efficiently without overcomplicating the simulation.
Think about their “elbows” and “knees” - they mostly just bend in one direction, which simplifies the joint mechanics.
Geoms - How Bodies Interact
Geoms define the shape of bodies for collision detection. They are used to determine when and how bodies collide with each other in the simulation.
We distinguish between the inertial properties of a body (mass, position, velocity) and the geometric properties (shape, size) used for collision detection.
Below creates a box geom and associates it with a body:
dGeomID geom = dCreateBox(space,1,1,1); // create a 1x1x1 box geom in the collision space
dGeomSetBody(geom,body); // associate the geom with the body
To keep pairs of geoms from interpenetrating, the engine roughly does the following:
- Compute the distance $d$ between each geom pair.
- If $d < 0$ (the geoms overlap), push the bodies apart proportionally to the penetration depth.
Why is collision detection so computationally expensive?
It involves checking every pair of objects in the simulation to see if they are colliding, which can be a lot of calculations, especially as the number of objects increases. Efficient algorithms and data structures are needed to manage this complexity.
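As a rough sketch of why the cost grows so quickly: a naive collision check tests every pair of geoms, which is n*(n-1)/2 tests per time step for n objects. (checkCollision, geoms, and numGeoms below are hypothetical names; ODE's dSpaceCollide instead uses the space, e.g. the hash space created earlier, to prune most pairs before calling the callback.)
// naive approach: test every pair of geoms -> n*(n-1)/2 checks per time step
for (int i = 0; i < numGeoms; i++)
for (int j = i + 1; j < numGeoms; j++)
checkCollision (geoms[i],geoms[j]); // hypothetical narrow-phase test for one pair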
Contacts - How Objects Collide
Contacts define the interaction between colliding bodies. When two geoms collide, a contact point is generated, and the physics engine calculates the forces needed to resolve the collision.
Something you might not be used to is the idea of event-driven programming. Instead of operating in a linear fashion, the computer keeps track of important events, and you implement functions called callbacks that get executed when those events happen.
static void nearCallback (void *data, dGeomID o1, dGeomID o2)
{ int i;
// exit without doing anything if the two bodies are connected by a joint
dBodyID b1 = dGeomGetBody(o1);
dBodyID b2 = dGeomGetBody(o2);
if (b1 && b2 && dAreConnectedExcluding (b1,b2,dJointTypeContact))
return;
In ODE, contacts are calculated by a callback function that is called whenever two geoms are close enough to potentially collide. The nearCallback function above checks if the two bodies are connected by a joint, and if not, it proceeds to generate the contact points (see the sketch below).
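The contact-generation half of the callback is not shown above; here is a sketch of the rest, modeled on ODE's standard demos (the limit of 8 contact points and the infinite friction are placeholder choices):
dContact contact[8]; // room for up to 8 contact points for this pair of geoms
int numc = dCollide (o1,o2,8,&contact[0].geom,sizeof(dContact)); // generate the contact points
for (i = 0; i < numc; i++) {
contact[i].surface.mode = 0; // default contact behaviour
contact[i].surface.mu = dInfinity; // infinite friction at the contact
dJointID c = dJointCreateContact (world,contactgroup,&contact[i]); // temporary contact joint
dJointAttach (c,b1,b2); // constrain the two bodies at this contact point
}
}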
SimLoop
The simulation loop is where the main simulation happens. It involves checking for collisions, updating the physics world, and rendering the objects.
static void simLoop (int pause) {
dSpaceCollide (space,0,&nearCallback);
if (!pause)
dWorldStep(world,0.05);
dJointGroupEmpty (contactgroup);
for (int object = 0; object < totalObjects ; object++) {
dsSetColor (obj[object].r, obj[object].g, obj[object].b); // set the drawing color for this object
Draw(obj[object]); // draw this object
}
}
We first check for collisions using dSpaceCollide, then, if the simulation is not paused, we step the world forward in time using dWorldStep. Finally, we clear the contact group and render each object.
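simLoop itself is typically handed to ODE's drawstuff front end, which opens a window and calls it once per frame. A minimal sketch (the window size and texture path are arbitrary, and start is a setup callback you would write yourself, e.g. to position the camera):
dsFunctions fn; // table of drawstuff callbacks
fn.version = DS_VERSION;
fn.start = &start; // called once before the loop begins
fn.step = &simLoop; // called every frame
fn.command = 0; // no keyboard handling in this sketch
fn.stop = 0; // nothing special to do on shutdown
fn.path_to_textures = "textures"; // where drawstuff finds its textures (adjust for your setup)
dsSimulationLoop (argc,argv,640,480,&fn); // open a 640x480 window and run the loop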
Sensors
Simulating robots involves not just physical bodies, but also sensors that allow the robot to perceive its environment. Common sensors include touch sensors, distance sensors, and cameras.
Simulating a Touch Sensor
Touch sensors can be simulated by checking for collisions between the robot and other objects in the environment. When a collision is detected, the touch sensor can be activated.
In nearCallback, check whether dCollide finds any contact points between the object's geom and the floor geom; if it does, the touch sensor fires for that time step (a sketch follows).
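A sketch of what that check might look like inside nearCallback, after the contact points for o1 and o2 have been generated (robotGeom, floorGeom, and touchSensor are hypothetical names, not part of ODE):
// hypothetical touch sensor: did the robot's geom just contact the floor geom?
if (((o1 == robotGeom && o2 == floorGeom) || (o1 == floorGeom && o2 == robotGeom)) && numc > 0)
touchSensor = 1; // record the touch for this time step; reset it to 0 at the start of each step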
Simulating a Light Sensor
Light sensors can be simulated by calculating the distance between the light source and the sensor. The intensity of the light can be modeled to decrease with distance.
const dReal *posObj1 = dBodyGetPosition(obj1); // pointer to the first body's (x,y,z) position
const dReal *posObj4 = dBodyGetPosition(obj4); // pointer to the second body's (x,y,z) position
double lightLevel = 1 / pow(EuclideanDistance(posObj1,posObj4),2); // intensity falls off with the square of the distance
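EuclideanDistance is not an ODE function; it's a small helper you would write yourself, for example:
#include <math.h>
double EuclideanDistance (const dReal *a, const dReal *b) {
double dx = a[0]-b[0], dy = a[1]-b[1], dz = a[2]-b[2];
return sqrt(dx*dx + dy*dy + dz*dz); // straight-line distance between the two positions
}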
Motors
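In ODE, the usual way to motorize a hinge joint is to set a desired angular velocity and a maximum torque on it; each step, the engine applies up to that torque to reach the velocity. A minimal sketch (the numbers are placeholders):
dJointSetHingeParam (hinge,dParamVel,2.0); // desired angular velocity of the motor, in rad/s
dJointSetHingeParam (hinge,dParamFMax,10.0); // maximum torque the motor may apply to reach it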