
2D Rotations and Orientation¶

While a "point" only has a location, a robot also has a heading. In robotics, this is called the orientation or attitude.

To track how a robot is facing, we add a fifth state variable: the angle $\theta$.

Introduction: The Angular State¶

Orientation ($\theta$) is usually measured in radians. Unlike linear position, which can grow infinitely, $\theta$ is periodic (wrapping around at $2\pi$). In this model, we assume the robot rotates around its center, and we measure its "spinning speed" using angular velocity ($\omega$).
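Because $\theta$ wraps around at $2\pi$, angles are typically normalized to a fixed interval before being compared or filtered. A minimal sketch of such a normalization (the function name `wrap_angle` and the target interval $[-\pi, \pi)$ are choices made here, not part of the text):

```python
import numpy as np

def wrap_angle(theta):
    """Wrap an angle in radians to the interval [-pi, pi)."""
    return (theta + np.pi) % (2 * np.pi) - np.pi

# 3*pi/2 (270 deg) is the same heading as -pi/2 (-90 deg)
print(wrap_angle(3 * np.pi / 2))
```

Without such wrapping, two equal headings like $0.1$ and $0.1 + 2\pi$ would look maximally different to a filter.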

Differential Equations of Rotation¶

The relationship between orientation and angular velocity is the rotational analog of the position-velocity relationship. If $\omega$ is the rate of change of the angle:

$$\omega = \frac{d\theta}{dt} = \dot{\theta}$$

Discretization of Rotation¶

To use this in a Kalman Filter, we must discretize it. Following the same logic as linear motion, we assume the measured angular velocity ($\omega_{k-1}$) is constant over the time step $\Delta t = t_k - t_{k-1}$. Integrating the differential equation:

$$\int_{\theta_{k-1}}^{\theta_k} d\theta = \int_{t_{k-1}}^{t_k} \omega_{k-1} \, dt$$

Final Discrete Orientation Equation:

$$\theta_k = \theta_{k-1} + \omega_{k-1} \Delta t$$
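The discrete update above is a one-liner in code. A minimal sketch (the wrapping to $[-\pi, \pi)$ is an added assumption to keep $\theta$ periodic, as discussed in the introduction):

```python
import numpy as np

def update_orientation(theta_prev, omega_prev, dt):
    """theta_k = theta_{k-1} + omega_{k-1} * dt, wrapped to [-pi, pi)."""
    theta = theta_prev + omega_prev * dt
    return (theta + np.pi) % (2 * np.pi) - np.pi

# e.g. turning at 0.5 rad/s over a 0.1 s time step, starting at theta = 0
print(update_orientation(0.0, 0.5, 0.1))  # 0.05
```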

The Connection: Body Frame vs. Global Frame¶

In your measurement model, you have $(a_1, a_2, \omega)$. While $\omega$ directly updates $\theta$, the accelerations $a_1$ and $a_2$ are often measured by an onboard IMU (Inertial Measurement Unit).

  • Body Frame ($B$): The sensor measures acceleration relative to the robot's nose.
  • Global Frame or World Frame ($W$): The Kalman Filter tracks the robot relative to the world's map.

To bridge this, we use the orientation $\theta$ to rotate the body-frame accelerations into the world frame using a rotation matrix:

$$\begin{bmatrix} a_1^W \\ a_{2}^W \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} a_{1}^B \\ a_{2}^B \end{bmatrix} = \begin{bmatrix} a_{1}^B \cos\theta - a_{2}^B \sin\theta \\ a_{1}^B \sin\theta + a_{2}^B \cos\theta \end{bmatrix} $$

or, written compactly with time indices ($\theta_k$, hence $\mathbf{R}_k$):

$$ \mathbf{a}_k^W = \mathbf{R}_k \mathbf{a}_k^B $$

and conversely, since $\mathbf{R}_k$ is orthogonal ($\mathbf{R}_k^{-1} = \mathbf{R}_k^T$):

$$ \mathbf{a}_k^B = \mathbf{R}_k^T \mathbf{a}_k^W $$
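These two transformations are exact inverses of each other, which is easy to check numerically. A small sketch (the helper name `rot` and the example numbers are chosen here for illustration):

```python
import numpy as np

def rot(theta):
    """2D rotation matrix R mapping body-frame vectors to world-frame vectors."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

theta = 0.3
a_body = np.array([2.0, -1.0])
a_world = rot(theta) @ a_body        # body -> world
a_back  = rot(theta).T @ a_world     # world -> body, using R^T = R^{-1}
print(np.allclose(a_back, a_body))   # True
```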

Defining Orientation ($\theta$): The Bridge Between Frames¶

Crucially, the orientation angle $\theta_k$ is defined as the angle of the Body Frame relative to the World Frame. By standard convention, this is the counter-clockwise angle measured from the fixed World X-axis ($X_W$) to the robot's forward-pointing Body X-axis ($X_B$). This definition is what allows our rotation matrix $\mathbf{R}_k$ to function as a Body-to-World transformer, taking accelerations measured relative to the robot's nose ($\mathbf{a}^B$) and projecting them onto the global map axes ($\mathbf{a}^W$). Conversely, the transpose $\mathbf{R}_k^T$ represents the inverse World-to-Body transformation.

In [1]:
import matplotlib.pyplot as plt
import matplotlib.patches as patches
import numpy as np

def visualize_frames(theta_deg):
    """
    Visualizes the relationship between World Frame (W) and Body Frame (B)
    rotated by angle theta.
    """
    theta_rad = np.radians(theta_deg)
    
    fig, ax = plt.subplots(figsize=(8, 8))
    
    # 1. Setup Plot Limits and Aspect
    ax.set_xlim(-1.5, 2.5)
    ax.set_ylim(-.55, 2.)
    ax.set_aspect('equal')
    ax.grid(True, linestyle='--', alpha=0.3)
    ax.set_title(f"Coordinate Frames with $\\theta = {theta_deg}^\\circ$")
    ax.set_xlabel("World X position")
    ax.set_ylabel("World Y position")

    origin = np.array([0, 0])
    scale = 1.5 # Length of axis arrows

    # 2. Draw WORLD Frame (Fixed, Black)
    # World X-axis
    ax.quiver(*origin, scale, 0, color='k', angles='xy', scale_units='xy', scale=1, width=0.005)
    ax.text(scale + 0.1, 0, "$X_W$ (World East)", fontsize=12, va='center')
    # World Y-axis
    ax.quiver(*origin, 0, scale, color='k', angles='xy', scale_units='xy', scale=1, width=0.005)
    ax.text(0, scale + 0.1, "$Y_W$ (World North)", fontsize=12, ha='center')

    # 3. Calculate BODY Frame Axes based on theta
    # Body X points in direction [cos(theta), sin(theta)]
    bx_vec = np.array([np.cos(theta_rad), np.sin(theta_rad)]) * scale
    # Body Y is rotated 90 deg CCW from Body X: [-sin(theta), cos(theta)]
    by_vec = np.array([-np.sin(theta_rad), np.cos(theta_rad)]) * scale

    # 4. Draw BODY Frame (Rotated, Red)
    # Body X-axis (Nose direction)
    ax.quiver(*origin, *bx_vec, color='r', angles='xy', scale_units='xy', scale=1, width=0.008)
    ax.text(*(bx_vec + 0.1), "$X_B$ (Body Nose)", fontsize=12, color='r', va='center')
    # Body Y-axis (Left wing direction)
    ax.quiver(*origin, *by_vec, color='r', angles='xy', scale_units='xy', scale=1, width=0.008)
    ax.text(*(by_vec + 0.1), "$Y_B$ (Body Left)", fontsize=12, color='r', ha='center')

    # 5. Visualize the angle Theta
    # Draw an arc from World X to Body X
    arc_radius = 0.7
    arc = patches.Arc(origin, arc_radius*2, arc_radius*2, angle=0.0, 
                      theta1=0.0, theta2=theta_deg, color='blue', linewidth=2)
    ax.add_patch(arc)
    
    # Add label for theta roughly in the middle of the arc
    label_angle = theta_rad / 2
    label_x = (arc_radius + 0.2) * np.cos(label_angle)
    label_y = (arc_radius + 0.2) * np.sin(label_angle)
    ax.text(label_x, label_y, f"$\\theta={theta_deg}^\\circ$", fontsize=14, color='blue', ha='center')

    # Optional: Add a simple robot body triangle to make it concrete
    robot_poly = np.array([
        [scale*0.8, 0],                  # Nose tip
        [-scale*0.2, scale*0.3],         # Back left
        [-scale*0.2, -scale*0.3]         # Back right
    ])
    # Rotate the robot body points
    rot_matrix = np.array([[np.cos(theta_rad), -np.sin(theta_rad)],
                           [np.sin(theta_rad),  np.cos(theta_rad)]])
    rotated_poly = (rot_matrix @ robot_poly.T).T
    robot_patch = patches.Polygon(rotated_poly, closed=True, facecolor='r', alpha=0.1, edgecolor='r')
    ax.add_patch(robot_patch)

    plt.show()

# Run the visualization with an example angle (e.g., 45 degrees)
visualize_frames(theta_deg=45)
(Figure: the fixed World Frame axes and the Body Frame axes rotated by $\theta = 45^\circ$, with the angle $\theta$ marked by an arc and the robot body drawn as a triangle.)

Example: Projecting a local vector to the world frame¶

Suppose the robot is oriented at $\theta = 45^\circ$ and its sensors measure an acceleration vector $\mathbf{a}^B = [1, 1]^T$ (meaning the robot is accelerating equally "forward" and "to its left"). To find this acceleration in the world frame, we apply the rotation matrix:

$$\mathbf{a}^W = \mathbf{R} \mathbf{a}^B = \begin{bmatrix} \cos 45^\circ & -\sin 45^\circ \\ \sin 45^\circ & \cos 45^\circ \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0.707 & -0.707 \\ 0.707 & 0.707 \end{bmatrix} \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1.414 \end{bmatrix}$$

Result: The vector $[0, 1.414]^T$ points exactly along the World Y-axis.
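The worked example above can be reproduced numerically (the variable names are chosen here for illustration):

```python
import numpy as np

theta = np.radians(45)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

a_body = np.array([1.0, 1.0])   # "forward" and "to the left", equally
a_world = R @ a_body

print(np.round(a_world, 3))     # approximately [0, 1.414]
```

The first component vanishes because the forward and leftward parts cancel along World X, leaving $\sqrt{2} \approx 1.414$ along World Y.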
