diff --git a/doc/source/camera_3dtrajectory/cam_calibration.rst b/doc/source/camera_3dtrajectory/cam_calibration.rst
new file mode 100644
index 0000000000000000000000000000000000000000..40254d721712ee6bec469dbaa11238cbf472858b
--- /dev/null
+++ b/doc/source/camera_3dtrajectory/cam_calibration.rst
@@ -0,0 +1,27 @@
+Camera Calibration
+==================
+
+A camera is an optical instrument for acquiring images. Cameras are composed of a sensor (converting photons into electrical signals) and a lens (focusing light rays onto the sensor). \
+Many experimental paradigms require the recording of animal motion with one or more cameras. \
+The experimenter is more interested in the position of the animal in the arena (world \
+coordinate system) than in the position of the animal within the image (camera coordinate system). Therefore, we need to transform the position of the animal in the image to its position in the world. To do so, we need to calibrate our camera. This calibration requires two steps:
+
+* remove distortion from the lens (intrinsic calibration)
+* determine the position and orientation (i.e. pose) of the camera in the environment (extrinsic calibration)
+
+Intrinsic calibration
+---------------------
+
+`With Matlab <https://de.mathworks.com/help/vision/ug/single-camera-calibrator-app.html>`_
+
+`With Opencv <https://docs.opencv.org/3.3.1/dc/dbb/tutorial_py_calibration.html>`_
+
+Extrinsic calibration
+---------------------
+
+.. automodule:: bodorientpy.calib
+
+Alternatively you can use Matlab:
+
+`With Matlab <https://de.mathworks.com/help/vision/ref/estimateworldcamerapose.html>`_
+
diff --git a/doc/source/camera_3dtrajectory/cam_triangulation.rst b/doc/source/camera_3dtrajectory/cam_triangulation.rst
new file mode 100644
index 0000000000000000000000000000000000000000..ed0efd7b69364f1f9feb0e8788a5a541fd5c7800
--- /dev/null
+++ b/doc/source/camera_3dtrajectory/cam_triangulation.rst
@@ -0,0 +1,11 @@
+Triangulation
+=============
+
+In computer vision, triangulation refers to the process of determining a point in 3D space given its projections onto two or more images. In order to solve this problem it is necessary to know the parameters of the camera projection function from 3D to 2D for the cameras involved, in the simplest case represented by the camera matrices. Triangulation is sometimes also referred to as reconstruction.
+
+
+The triangulation problem is in theory trivial. Since each point in an image corresponds to a line in 3D space, all points on that line in 3D are projected to the point in the image. If a pair of corresponding points in two or more images can be found, they must be the projection of a common 3D point x. The set of lines generated by the image points must intersect at x (the 3D point).
+
+Richard Hartley and Andrew Zisserman (2003). Multiple View Geometry in Computer Vision. Cambridge University Press.
+
+.. automodule:: bodorientpy.triangulate
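+
+For intuition, the linear (DLT) solution for two views can be written in a few lines of NumPy. This sketch is independent of the ``bodorientpy`` API; the 3x4 projection matrices ``P1`` and ``P2`` and the pixel coordinates are assumed to be known (e.g. from the calibration described above).
+
+.. code-block:: python
+
+   import numpy as np
+
+   def triangulate_dlt(P1, P2, xy1, xy2):
+       """Triangulate one 3D point from two views (linear DLT).
+
+       P1, P2 : (3, 4) camera projection matrices (assumed known)
+       xy1, xy2 : (2,) pixel coordinates of the same point in each view
+       """
+       # Each view contributes two linear equations in the homogeneous 3D point X
+       A = np.vstack([xy1[0] * P1[2] - P1[0],
+                      xy1[1] * P1[2] - P1[1],
+                      xy2[0] * P2[2] - P2[0],
+                      xy2[1] * P2[2] - P2[1]])
+       # The solution is the right singular vector with the smallest singular value
+       _, _, vt = np.linalg.svd(A)
+       X = vt[-1]
+       return X[:3] / X[3]  # back from homogeneous coordinates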
diff --git a/doc/source/camera_3dtrajectory/examples/extrinsic_calibration.py b/doc/source/camera_3dtrajectory/examples/extrinsic_calibration.py
new file mode 100644
index 0000000000000000000000000000000000000000..9a94e47c27fc03246b9c856a798a96d2a68a6b06
--- /dev/null
+++ b/doc/source/camera_3dtrajectory/examples/extrinsic_calibration.py
@@ -0,0 +1,50 @@
+"""
+    Script to calibrate your camera system, knowing the intrinsic \
+calibration of the camera lenses
+"""
+import pandas as pd
+import pkg_resources
+from bodorientpy.io import opencv as opencv_io
+from bodorientpy.calib import calibrates_extrinsic
+
+# The filename used for the manhattan
+manhattan3d_filename = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/calib/manhattan.hdf')
+manhattan3d_key = 'manhattan'
+# where to save the calibration
+filename_fullcalib = 'full_calibration.xml'
+# a file list of intrinsic calibrations
+filenames_intrinsics = dict()
+filenames_intrinsics[0] = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/calib/calib_20180117_cam_0.xml')
+filenames_intrinsics[1] = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/calib/calib_20180117_cam_2.xml')
+# a file list for the manhattan
+filenames_manhattans = dict()
+filenames_manhattans[0] = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/calib/manhattan_cam_0_xypts.csv')
+filenames_manhattans[1] = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/calib/manhattan_cam_2_xypts.csv')
+
+# number of cameras
+ncams = 3
+# threshold to exclude points in the top-left corner of the manhattan
+corner_th = 60  # in pixels
+
+# Load the manhattan 3d
+manhattan_3d = pd.read_hdf(manhattan3d_filename, manhattan3d_key)
+
+# Load the intrinsic calibrations
+cameras_intrinsics = opencv_io.cameras_intrinsic_calibration(
+    filenames_intrinsics)
+
+# Compute the extrinsic calibrations from the manhattans
+cameras_extrinsics = calibrates_extrinsic(filenames_manhattans,
+                                          manhattan_3d,
+                                          cameras_intrinsics,
+                                          corner_th)
+# Save the full calibration
+opencv_io.save_cameras_calibration(cameras_intrinsics,
+                                   cameras_extrinsics,
+                                   filename_fullcalib,
+                                   overwrite=True)
diff --git a/doc/source/camera_3dtrajectory/examples/manhattan.py b/doc/source/camera_3dtrajectory/examples/manhattan.py
new file mode 100644
index 0000000000000000000000000000000000000000..323d27e1cd1816c3be388b761b31770fdf4cd65e
--- /dev/null
+++ b/doc/source/camera_3dtrajectory/examples/manhattan.py
@@ -0,0 +1,52 @@
+"""
+    Building your own manhattan
+"""
+import pandas as pd
+import bodorientpy.calib.plot as cplot
+
+manhattan_filename = 'manhattan.hdf'
+manhattan_key = 'manhattan'
+num_points = 27  # Number of points
+manhattan_3d = pd.DataFrame(index=range(num_points),
+                            columns=['x', 'y', 'z'])
+# Then you manually fill in the manhattan measurements
+manhattan_3d.loc[0, :] = [-10, -10, 4.66]
+manhattan_3d.loc[1, :] = [-10, -5, 4.39]
+manhattan_3d.loc[2, :] = [-10, -0, 4.63]
+manhattan_3d.loc[3, :] = [-10, +5, 4.38]
+manhattan_3d.loc[4, :] = [-10, +10, 4.85]
+
+manhattan_3d.loc[5, :] = [-8.09, -0.25, 10.13]
+
+manhattan_3d.loc[6, :] = [-5, -10, 4.52]
+manhattan_3d.loc[7, :] = [-5, -5, 4.57]
+manhattan_3d.loc[8, :] = [-5, -0, 4.36]
+manhattan_3d.loc[9, :] = [-5, +5, 4.45]
+manhattan_3d.loc[10, :] = [-5, +10, 4.43]
+
+manhattan_3d.loc[11, :] = [0, -10, 4.70]
+manhattan_3d.loc[12, :] = [0, -5, 4.57]
+manhattan_3d.loc[13, :] = [0, +5, 4.16]
+manhattan_3d.loc[14, :] = [0, +10, 4.43]
+
+manhattan_3d.loc[15, :] = [+5, -10, 4.70]
+manhattan_3d.loc[16, :] = [+5, -5, 4.56]
+manhattan_3d.loc[17, :] = [+5, -0, 4.27] +manhattan_3d.loc[18, :] = [+5, +5, 4.49] +manhattan_3d.loc[19, :] = [+5, +10, 4.50] + +manhattan_3d.loc[20, :] = [+8.62, -8.38, 10.19] +manhattan_3d.loc[21, :] = [+8.55, +8.72, 10.09] + +manhattan_3d.loc[22, :] = [+10, -10, 4.51] +manhattan_3d.loc[23, :] = [+10, -5, 4.33] +manhattan_3d.loc[24, :] = [+10, -0, 4.69] +manhattan_3d.loc[25, :] = [+10, +5, 4.69] +manhattan_3d.loc[26, :] = [+10, +10, 4.71] + +manhattan_3d *= 10 # Because millimeters are nicer to use than centimeters + +# And save it +manhattan_3d.to_hdf(manhattan_filename, manhattan_key) + +cplot.manhattan(manhattan_3d, unit='mm') diff --git a/doc/source/camera_3dtrajectory/examples/manhattan2d.py b/doc/source/camera_3dtrajectory/examples/manhattan2d.py new file mode 100644 index 0000000000000000000000000000000000000000..b1187959c08553554d075b80be59aa880243d679 --- /dev/null +++ b/doc/source/camera_3dtrajectory/examples/manhattan2d.py @@ -0,0 +1,44 @@ +import pkg_resources +import matplotlib.pyplot as plt +import pandas as pd + + +manhattan_imfile = list() +manhattan_imfile.append(pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/20180117_cam_0.tif')) +manhattan_imfile.append(pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/20180117_cam_2.tif')) +# a filelist for manhattan +filenames_manhattans = list() +filenames_manhattans.append(pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/manhattan_cam_0_xypts.csv')) +filenames_manhattans.append(pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/manhattan_cam_2_xypts.csv')) + + +f, axarr = plt.subplots(1, 2, figsize=(15, 8)) +for i, ax in enumerate(axarr): + image = plt.imread(manhattan_imfile[i]) + manhattan_2d = pd.read_csv(filenames_manhattans[i], + names=['x', 'y'], + header=0) + + # Show the manhattan image + ax.imshow(image, cmap='gray') + toplotx = manhattan_2d.x + # Because inverted y axis in image + toploty = image.shape[0]-manhattan_2d.y + markersize = 10 + # Plot marker + ax.plot(toplotx, + toploty, 'o', + markersize=markersize, + markerfacecolor="None", + markeredgecolor='red', + markeredgewidth=2) + # Plot marker label + for mi, xy in enumerate(zip(toplotx, toploty)): + ax.text(xy[0]+2*markersize, + xy[1]+2*markersize, + '{}'.format(mi), color='r', + horizontalalignment='left') diff --git a/doc/source/camera_3dtrajectory/examples/manhattan_0001.hdf b/doc/source/camera_3dtrajectory/examples/manhattan_0001.hdf new file mode 100644 index 0000000000000000000000000000000000000000..dd5dc1c44c110835e5e420f7584cf9a1766d5c66 Binary files /dev/null and b/doc/source/camera_3dtrajectory/examples/manhattan_0001.hdf differ diff --git a/doc/source/camera_3dtrajectory/examples/npairwise_triangulation.py b/doc/source/camera_3dtrajectory/examples/npairwise_triangulation.py new file mode 100644 index 0000000000000000000000000000000000000000..2af84e1520744409ec51d483633f6548271eef69 --- /dev/null +++ b/doc/source/camera_3dtrajectory/examples/npairwise_triangulation.py @@ -0,0 +1,29 @@ +import pkg_resources +import matplotlib.pyplot as plt +from mpl_toolkits.mplot3d import Axes3D # noqa F401 +from bodorientpy.io import opencv +from bodorientpy.io import marker_cam_fromfiles, marker_cam2pts_cam +from bodorientpy.triangulate import triangulate_ncam_pairwise + +# Load full calibration +full_calibfile = pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/full_calibration_20180117.xml') +cameras_calib = opencv.load_cameras_calibration(full_calibfile) +# Load markers on camera 
+filenames = [pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/bee_005_cam_0_bodyxypts.csv'), + pkg_resources.resource_filename( + 'bodorientpy', 'resources/calib/bee_005_cam_2_bodyxypts.csv')] +marker_cam = marker_cam_fromfiles(filenames) +nmarkers = len(marker_cam.columns.levels[0]) +# Plot +fig = plt.figure() +ax = fig.add_subplot(111, projection='3d') +for mark_i in range(3): + pts_cam = marker_cam2pts_cam(marker_cam, mark_i) + pts3d, _, _ = triangulate_ncam_pairwise(cameras_calib, pts_cam) + ax.plot(pts3d[:, 0], pts3d[:, 1], pts3d[:, 2]) + +ax.set_xlabel('x') +ax.set_ylabel('y') +ax.set_zlabel('z') diff --git a/doc/source/camera_3dtrajectory/index.rst b/doc/source/camera_3dtrajectory/index.rst new file mode 100644 index 0000000000000000000000000000000000000000..5f58af94dda5c95d70f5ee488cfec33b558927b3 --- /dev/null +++ b/doc/source/camera_3dtrajectory/index.rst @@ -0,0 +1,9 @@ +=================== +Video to trajectory +=================== + +.. toctree:: + :maxdepth: 2 + + cam_calibration.rst + cam_triangulation.rst diff --git a/doc/source/classification/examples/saccade_extraction.py b/doc/source/classification/examples/saccade_extraction.py new file mode 100644 index 0000000000000000000000000000000000000000..5f3a86effd91823b1566dd397f080e9fa6dd30d3 --- /dev/null +++ b/doc/source/classification/examples/saccade_extraction.py @@ -0,0 +1,38 @@ +from bodorientpy.trajectory.random import saccadic_traj +from bodorientpy.classification.sacintersaccade import extract + + +import matplotlib.pyplot as plt +import numpy as np + +# fix the seed for constant example +np.random.seed(0) +# generate fake saccadic data +trajectory, saccade_df = saccadic_traj() +# +thresholds = [2, 3] +sacintersac, angvel = extract(trajectory, thresholds) + +f, axarr = plt.subplots(3, 1) +ax = axarr[0] +trajectory.loc[:, ['x', 'y', 'z']].plot(ax=ax) +ax = axarr[1] +trajectory.loc[:, ['angle_0', 'angle_1', 'angle_2']].plot(ax=ax) +ax = axarr[2] +angvel.plot(ax=ax) +ax.axhline(min(thresholds), color='r', linestyle=':') +ax.axhline(max(thresholds), color='r', linestyle='-.') +ylim = ax.get_ylim() +ax.fill_between(sacintersac.index, min(ylim), max(ylim), + where=sacintersac.saccade >= 0, + facecolor='green', + alpha=0.2) +ax.fill_between(sacintersac.index, min(ylim), max(ylim), + where=sacintersac.intersaccade >= 0, + facecolor='black', + alpha=0.2) +f.show() +print(sacintersac) + +import time +time.sleep(6) diff --git a/doc/source/classification/examples/saccade_loliplot.py b/doc/source/classification/examples/saccade_loliplot.py new file mode 100644 index 0000000000000000000000000000000000000000..2710cb70cc7048029824723b351ee833f251a556 --- /dev/null +++ b/doc/source/classification/examples/saccade_loliplot.py @@ -0,0 +1,26 @@ +import numpy as np +from bodorientpy.trajectory.random import saccadic_traj +from bodorientpy.trajectory.plot import get_color_frame_dataframe +from bodorientpy.trajectory.plot import lollipops +import matplotlib.pyplot as plt + +# fix the seed for constant example +np.random.seed(0) +# generate fake saccadic data +trajectory, saccade_df = saccadic_traj() +# for colors +colors, sm = get_color_frame_dataframe(frame_range=[ + trajectory.index.min(), + trajectory.index.max()], cmap=plt.get_cmap('jet')) + +f = plt.figure(figsize=(15, 10)) +ax = f.add_subplot(111, projection='3d') +lollipops(ax, trajectory, + colors, step_lollipop=20, + offset_lollipop=1, lollipop_marker='o') +ax.set_xlabel('x') +ax.set_ylabel('y') +ax.set_zlabel('z') +cb = f.colorbar(sm) +cb.set_label('frame 
number')
+f.show()
diff --git a/doc/source/classification/index.rst b/doc/source/classification/index.rst
new file mode 100644
index 0000000000000000000000000000000000000000..f7834e53ac29808a66619b0ca293cf58cbcbce1e
--- /dev/null
+++ b/doc/source/classification/index.rst
@@ -0,0 +1,5 @@
+==============
+Classification
+==============
+
+.. include:: sac_intersaccade.rst
diff --git a/doc/source/classification/sac_intersaccade.rst b/doc/source/classification/sac_intersaccade.rst
new file mode 100644
index 0000000000000000000000000000000000000000..3d262ded7e6e51f30e944a8eeadde41cc9aeeeb2
--- /dev/null
+++ b/doc/source/classification/sac_intersaccade.rst
@@ -0,0 +1,34 @@
+Saccade and intersaccade
+========================
+
+The trajectories of flying insects are often composed of saccades, \
+rapid rotations, and intersaccades, periods without rapid turns. During \
+a saccade the angular velocity can reach several hundreds or thousands of \
+degrees per second.
+
+An example of a saccadic flight
+-------------------------------
+
+.. plot:: classification/examples/saccade_loliplot.py
+
+Extraction
+----------
+
+To extract the saccades from a trajectory, we use the angular \
+velocity and two thresholds. When the angular velocity is above the \
+highest threshold, we have a saccade. When the angular velocity is \
+below the lowest threshold, we have an intersaccade. Parts of the \
+trajectory between the thresholds are unclassified. Indeed, they could be part \
+of a saccade or an intersaccade, depending on the noise present in the \
+trajectory.
+
+.. literalinclude:: examples/saccade_extraction.py
+   :lines: 2,13,14
+
+.. plot:: classification/examples/saccade_extraction.py
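+
+For intuition, the two-threshold rule described above can be sketched on a plain NumPy array of angular velocities. This is only an illustration of the rule, not the ``bodorientpy`` implementation; the array ``angvel`` and the thresholds below are made up for the example.
+
+.. code-block:: python
+
+   import numpy as np
+
+   angvel = np.array([10., 20., 150., 400., 350., 80., 15., 5.])  # deg/s, toy data
+   low_th, high_th = 50., 300.
+
+   saccade = angvel > high_th               # clearly saccadic samples
+   intersaccade = angvel < low_th           # clearly non-saccadic samples
+   unclassified = ~saccade & ~intersaccade  # in between: left undecided
+
+   print(np.vstack([saccade, intersaccade, unclassified]).astype(int))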
+
+Prototypical movement
+=====================
+
+See Elke Braun's paper (under construction)
diff --git a/doc/source/error_propagation/error_propagation.rst b/doc/source/error_propagation/error_propagation.rst
new file mode 100644
index 0000000000000000000000000000000000000000..c4bc28a08d026d4e7397a55fcee24251c575d119
--- /dev/null
+++ b/doc/source/error_propagation/error_propagation.rst
@@ -0,0 +1 @@
+.. automodule:: bodorientpy.errorprop
diff --git a/doc/source/error_propagation/examples/error_propagation_univariate.py b/doc/source/error_propagation/examples/error_propagation_univariate.py
new file mode 100644
index 0000000000000000000000000000000000000000..37b73290e035b71c4e1226436f087eb1bee28492
--- /dev/null
+++ b/doc/source/error_propagation/examples/error_propagation_univariate.py
@@ -0,0 +1,37 @@
+import numpy as np
+import matplotlib.pyplot as plt
+from bodorientpy.errorprop import propagate_error
+
+
+def plot_error_f(ax, func_f, x, delta_x, delta_y, x0):
+    y0 = func_f(x0)
+    xdelta = np.linspace(x0-delta_x, x0+delta_x, len(x))
+    ydelta = np.linspace(y0-delta_y, y0+delta_y, len(x))
+    y = func_f(x)
+    ax.fill_between(xdelta, min(y), max(y), facecolor='k',
+                    alpha=0.3, edgecolor='None')
+    ax.fill_between(x, min(ydelta), max(ydelta),
+                    facecolor='r', alpha=0.5, edgecolor='None')
+    ax.plot(x, y, 'k', linewidth=2)
+
+    ax.set_xlim([min(x), max(x)])
+    ax.set_ylim([min(y), max(y)])
+    ax.set_xlabel('x')
+    ax.set_ylabel('y=f(x)')
+
+
+def func_f(x):
+    return (x+1)*(x-2)*(x+3)
+
+
+delta_x = 0.5
+x = np.linspace(-5, 4.5, 100)
+
+fig, axarr = plt.subplots(1, 2, figsize=(15, 5))
+x0 = -1
+delta_y = propagate_error(func_f, x0, delta_x)
+plot_error_f(axarr[0], func_f, x, delta_x, delta_y, x0=x0)
+x0 = 3
+delta_y = propagate_error(func_f, x0, delta_x)
+plot_error_f(axarr[1], func_f, x, delta_x, delta_y, x0=x0)
+fig.show()
diff --git a/doc/source/index.rst b/doc/source/index.rst
index 0ccef0ddf457df97c61e0109da51951cd6c0e5be..244a4ec3eea6945251985ae90ae3c1d641bf76ad 100644
--- a/doc/source/index.rst
+++ b/doc/source/index.rst
@@ -24,6 +24,9 @@ Content
 
    gettingstarted
    overview/index
+   classification/index
+   camera_3dtrajectory/index
+   orientation/index
    tutorials/index
    references/index
    tests/index
diff --git a/doc/source/media/marked_bumblebee.png b/doc/source/media/marked_bumblebee.png
new file mode 100644
index 0000000000000000000000000000000000000000..663b78695be367f35f1cd0d08654d4708df3ca7b
Binary files /dev/null and b/doc/source/media/marked_bumblebee.png differ
diff --git a/doc/source/orientation/Orientation_two_markers.rst b/doc/source/orientation/Orientation_two_markers.rst
new file mode 100644
index 0000000000000000000000000000000000000000..f22ab42ff3592276cf43b2a00774b95534970ef9
--- /dev/null
+++ b/doc/source/orientation/Orientation_two_markers.rst
@@ -0,0 +1,149 @@
+Orientation from two markers
+============================
+
+We have seen in the previous section how to describe the orientation of a rigid body marked by three markers, i.e. forming a local reference frame (X,Y,Z). Having three markers is not always possible, due to physical constraints such as space and camera resolution. The task is then to find the solutions (they are not unique) to the system:
+
+.. math::
+
+   \begin{equation}
+   R.v^{ref}=v^{bee}
+   \end{equation}
+
+Here :math:`v^{bee}` is the vector given by the 3D coordinates of the two markers, and :math:`v^{ref}` is the vector given by the 3D coordinates of the two markers when the orientation of the agent is null, i.e. when the orientation matrix is equal to the identity matrix.
+
+The system has 3 equations and 9 unknown variables: the elements of the orientation matrix.
+
+**Note** Why 9 unknown variables if we have three rotation angles? The rotation angles appear in the orientation matrix through cosine and sine functions, both nonlinear. Therefore the sine and the cosine of a rotation angle have to be treated separately. Moreover, the product of two cosines, two sines, or a cosine and a sine has to be treated as a separate variable, because multiplication is... nonlinear.
+
+.. math::
+
+   \begin{align}
+   v_x^{bee} & = +v_x^{ref} \cos\alpha \cos\beta + v_y^{ref}(\cos\alpha \sin\beta \sin\gamma - \sin\alpha \cos\gamma) + v_z^{ref} (\cos\alpha \sin\beta \cos\gamma + \sin\alpha \sin\gamma) \\
+   v_y^{bee} & = +v_x^{ref} \sin\alpha \cos\beta + v_y^{ref}(\sin\alpha \sin\beta \sin\gamma + \cos\alpha \cos\gamma) + v_z^{ref} ( \sin\alpha \sin\beta \cos\gamma - \cos\alpha \sin\gamma )\\
+   v_z^{bee} & = -v_x^{ref}\sin\beta + v_y^{ref} \cos\beta \sin\gamma + v_z^{ref} \cos\beta \cos\gamma
+   \end{align}
+
+or equivalently we can look at :math:`v^{ref}=R^Tv^{bee}`, because :math:`R^T=R^{-1}`
+
+.. math::
+
+   \begin{align}
+   v_x^{ref} & = +v_x^{bee} \cos\alpha \cos\beta +v_y^{bee} \sin\alpha \cos\beta -v_z^{bee}\sin\beta \\
+   v_y^{ref} & = +v_x^{bee}(\cos\alpha \sin\beta \sin\gamma - \sin\alpha \cos\gamma) +v_y^{bee}(\sin\alpha \sin\beta \sin\gamma + \cos\alpha \cos\gamma)+ v_z^{bee} \cos\beta \sin\gamma \\
+   v_z^{ref} & = +v_x^{bee}(\cos\alpha \sin\beta \cos\gamma + \sin\alpha \sin\gamma) + v_y^{bee} ( \sin\alpha \sin\beta \cos\gamma - \cos\alpha \sin\gamma )+v_z^{bee} \cos\beta \cos\gamma
+   \end{align}
+
+To simplify the problem, we need to remove terms from the system of equations. Removing terms is easily done by setting certain variables to zero. We want to determine :math:`\alpha`, :math:`\beta`, and :math:`\gamma`. Thus we can only set to zero :math:`v_x^{ref}`, :math:`v_y^{ref}`, or :math:`v_z^{ref}`.
+
+A simple case
+-------------
+
+If we assume that the two markers are aligned with the roll axis,
+i.e. :math:`v^{ref}=(1,0,0)^T`, the problem can easily be solved as follows:
+
+.. math::
+
+   \begin{align}
+   v_x^{bee} & = +\cos\alpha \cos\beta \\
+   v_y^{bee} & = +\sin\alpha \cos\beta \\
+   v_z^{bee} & = -\sin\beta
+   \end{align}
+
+.. math::
+
+   \begin{align}
+   \tan\alpha & = \frac{\pm v_y^{bee}}{\pm v_x^{bee}} &\quad \text{from L1 and L2} \\
+   \tan\beta & = \frac{-v_z^{bee}}{\pm\sqrt{\left(v_y^{bee}\right)^2+\left(v_x^{bee}\right)^2 }} &\quad\text{from all}
+   \end{align}
+
+We remark that the solution does not depend on the angle :math:`\gamma`, \
+the roll angle. Thus, when we do not care about the full orientation of \
+a body but simply care about pitch and yaw, two markers are sufficient.
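+
+As an illustration of this simple case (independent of the ``bodorientpy`` helpers used later), yaw and pitch can be recovered with two ``arctan2`` calls; the two marker positions below are made up for the example:
+
+.. code-block:: python
+
+   import numpy as np
+
+   mark0 = np.array([0.0, 0.0, 0.0])    # marker at the origin of the body frame
+   mark1 = np.array([0.8, 0.4, -0.2])   # second marker, along the roll axis
+
+   v = mark1 - mark0
+   v = v / np.linalg.norm(v)            # v plays the role of v^bee
+
+   yaw = np.arctan2(v[1], v[0])                      # alpha, from L1 and L2
+   pitch = np.arctan2(-v[2], np.hypot(v[0], v[1]))   # beta, from all three lines
+   print(np.rad2deg([yaw, pitch]))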
+
+Other cases
+-----------
+
+The two markers may be aligned with the pitch (or yaw) axis of the body. \
+In such a situation, the problem is slightly more complex. We can still \
+solve the system of equations, but we cannot find a solution independent of \
+the pitch (or yaw) angle; we need to guess the pitch (or yaw) angle.
+
+Angles from two markers
+-----------------------
+Here we illustrate how the solution of the system varies as a function of \
+the a priori known angle in three cases: roll-aligned, pitch-aligned, and \
+yaw-aligned markers.
+
+Note that the a priori known angle is the third angle of a convention internally \
+used by the orientation estimator, namely:
+
+* The rotation around x, with convention :math:`R_zR_yR_x`, for x-aligned markers
+* The rotation around y, with convention :math:`R_zR_xR_y`, for y-aligned markers
+* The rotation around z, with convention :math:`R_yR_xR_z`, for z-aligned markers
+
+Thus none of the yaw, pitch, and roll angles may match the a priori \
+known angle (except for x-aligned markers, because the internal convention is the yaw pitch roll convention).
+
+.. literalinclude:: examples/two_markers.py
+   :lines: 5, 13, 43-44
+
+Here the alignment is defined by a string:
+
+.. literalinclude:: examples/two_markers.py
+   :lines: 36, 47, 58
+
+.. plot:: orientation/examples/two_markers.py
+
+Solving pitch-aligned markers with the yaw, pitch, roll convention
+------------------------------------------------------------------
+:math:`v^{ref}=(0,-1,0)^T`
+
+.. math::
+
+   \begin{align}
+   v_x^{bee} & = -\cos\alpha \sin\beta \sin\gamma + \sin\alpha \cos\gamma &\quad \text{from L1}\\
+   v_y^{bee} & = -\sin\alpha \sin\beta \sin\gamma - \cos\alpha \cos\gamma &\quad \text{from L2}\\
+   v_z^{bee} & = -\cos\beta \sin\gamma &\quad \text{from L3}
+   \end{align}
+
+for :math:`v_z^{bee}\neq0 \Rightarrow \sin\gamma\neq0`
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+.. math::
+
+   \begin{align}
+   \cos\gamma & = v_x^{bee}\sin\alpha -v_y^{bee}\cos\alpha &\quad \text{from L1 and L2} \\
+   \cos\beta & = -\frac{v_z^{bee}}{\sin\gamma } &\quad\text{from L3}
+   \end{align}
+
+From the last two equations, :math:`\gamma` and :math:`\beta` can be found as a function of :math:`\alpha` and :math:`v^{bee}`
+
+for :math:`v_z^{bee}=0`
+~~~~~~~~~~~~~~~~~~~~~~~
+
+.. math::
+
+   \begin{align}
+   \tan\alpha & = -\frac{v_x^{bee}}{v_y^{bee}} &\quad \text{from L1 and L2}\\
+   \cos\gamma & =\pm\sqrt{ \left(v_x^{bee}\right)^2 + \left(v_y^{bee}\right)^2 } &\quad \text{from L1 and L2}\\
+   \end{align}
+
+for :math:`\beta` known, and :math:`\beta\neq\pm\pi/2`:
+
+.. math::
+
+   \begin{align}
+   \sin\gamma & = \frac{v_z^{bee}}{-\cos\beta} \\
+   \tan\left(\alpha+\theta\right) & = \frac{v_x^{bee}}{-v_y^{bee}} \\
+   \tan\theta&=\frac{v_z^{bee}\tan\beta}{\pm\sqrt{1-\sin^2\gamma}}
+   \end{align}
+
+for :math:`\gamma` known, and :math:`\gamma\neq 2k\pi`:
+
+.. math::
+
+   \begin{align}
+   \sin\left(\alpha+\theta\right) & = \frac{\cos\gamma}{\sqrt{ \left(v_x^{bee}\right)^2+\left(v_y^{bee}\right)^2}} \\
+   \cos\beta & = -\frac{v_z^{bee}}{\sin\gamma } \\
+   \tan\theta&=\frac{v_y^{bee}}{v_x^{bee}}
+   \end{align}
diff --git a/doc/source/orientation/bumblebee_orientation.rst b/doc/source/orientation/bumblebee_orientation.rst
new file mode 100644
index 0000000000000000000000000000000000000000..5c066ca6f95b2f345966009b37cd86435ec02fd1
--- /dev/null
+++ b/doc/source/orientation/bumblebee_orientation.rst
@@ -0,0 +1,43 @@
+Bumblebee orientation during a learning flight
+==============================================
+
+Bumblebees are eusocial insects. The bees of the species *Bombus terrestris* have their hive underground, and can access it through a hole. When leaving the hive, they perform a learning and orientation flight to remember the location of the hive entrance and to decide in which direction they should forage or explore the environment. Bees can be filmed while performing their learning flight. As we have seen before, three points or markers need to be placed on the bee's thorax to determine its orientation. Bees have therefore been marked with three points, forming a triangle, on their thorax.
+
+.. image:: media/marked_bumblebee.png
+
+**Note** The dataset used in this example comes from recordings made at the University of Bielefeld in Martin Egelhaaf's group, by Charlotte Doussot.
+
+Position of the thorax's markers along time
+-------------------------------------------
+
+**Note** Due to occlusions during the experiment, the positions of the markers have not been recorded at several time-points.
+
+.. plot:: orientation/examples/bumblebee_markers.py
+
+
+Yaw, pitch and roll along time
+------------------------------
+
+We first need to load the markers
+
+.. literalinclude:: examples/bumblebee_thorax_orientation.py
+   :lines: 9-10
+
+To calculate the orientation, we need to choose the Euler axis convention as well as to specify how our \
+markers are placed relative to the three rotational axes of the thorax.
+
+.. literalinclude:: examples/bumblebee_thorax_orientation.py
+   :lines: 13-14
+
+We now just have to loop frame by frame through our trajectory and \
+calculate the orientation from the markers.
+
+.. literalinclude:: examples/bumblebee_thorax_orientation.py
+   :lines: 16-29
+
+The angles are in radians. Some people prefer to read angles in degrees.
+
+.. literalinclude:: examples/bumblebee_thorax_orientation.py
+   :lines: 32
+
+.. plot:: orientation/examples/bumblebee_thorax_orientation.py
diff --git a/doc/source/orientation/examples/bumblebee_markers.py b/doc/source/orientation/examples/bumblebee_markers.py
new file mode 100644
index 0000000000000000000000000000000000000000..56a7873f2ddd3e5965fef8864f640ce87b1a9a83
--- /dev/null
+++ b/doc/source/orientation/examples/bumblebee_markers.py
@@ -0,0 +1,19 @@
+import pandas as pd
+import matplotlib.pyplot as plt
+import pkg_resources
+
+bumblebee_flight = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/bee007.hdf')
+markers = pd.read_hdf(bumblebee_flight, 'markers')
+markers_thorax = [0, 1, 2]
+
+fig, axarr = plt.subplots(1, len(markers_thorax),
+                          figsize=(15, 3),
+                          sharey=True)
+for plt_i, cmark_name in enumerate(markers_thorax):
+    markers.loc[:, cmark_name].plot(ax=axarr[plt_i],
+                                    linewidth=2)
+    axarr[plt_i].set_title(cmark_name)
+    axarr[plt_i].set_xlabel('time [frame]')
+    axarr[plt_i].set_ylabel('position [mm]')
+fig.show()
diff --git a/doc/source/orientation/examples/bumblebee_thorax_orientation.py b/doc/source/orientation/examples/bumblebee_thorax_orientation.py
new file mode 100644
index 0000000000000000000000000000000000000000..11c0926cdcb7416632c30f3bd80e9b4244336413
--- /dev/null
+++ b/doc/source/orientation/examples/bumblebee_thorax_orientation.py
@@ -0,0 +1,39 @@
+import pandas as pd
+import numpy as np
+import matplotlib.pyplot as plt
+import pkg_resources
+from bodorientpy.markers.transformations import markers2decompose
+
+bumblebee_flight = pkg_resources.resource_filename(
+    'bodorientpy', 'resources/bee007.hdf')
+markers = pd.read_hdf(bumblebee_flight, 'markers')
+markers = markers.astype(float)
+markers_thorax = [0, 1, 2]
+
+axes = 'rzyx'
+triangle_mode = 'y-axis=2-1'
+
+yawpitchroll = pd.DataFrame(index=markers.index,
+                            columns=['yaw', 'pitch', 'roll'],
+                            dtype=float)
+
+# loop frame by frame over the non-missing markers
+for frame_i, markers_frame in markers.dropna().iterrows():
+    _, _, angles, _, _ = markers2decompose(
+        markers_frame.loc[markers_thorax[0]],
+        markers_frame.loc[markers_thorax[1]],
+        markers_frame.loc[markers_thorax[2]],
+        triangle_mode=triangle_mode,
+        euler_axes=axes)
+    yawpitchroll.yaw[frame_i] = angles[0]
+    yawpitchroll.pitch[frame_i] = angles[1]
+    yawpitchroll.roll[frame_i] = angles[2]
+
+
+yawpitchroll_deg = np.rad2deg(yawpitchroll)
+
+fig = plt.figure()
+ax = fig.gca()
+yawpitchroll_deg.plot(ax=ax)
+ax.set_xlabel('time [frame]')
+ax.set_ylabel('euler angle [deg]')
+fig.show()
diff --git a/doc/source/orientation/examples/triangle_isocel.py b/doc/source/orientation/examples/triangle_isocel.py
new file mode 100644
index 0000000000000000000000000000000000000000..dc0b4c731c6813d247b4da4f03ee5f1556f4f3c3
--- /dev/null
+++ b/doc/source/orientation/examples/triangle_isocel.py
@@ -0,0 +1,26 @@
+import pandas as pd +from matplotlib import pyplot as plt +from bodorientpy.markers.transformations import triangle2homogeous_transform +from bodorientpy.markers.triangle import Triangle +from bodorientpy.plot import draw_frame + + +apex0 = pd.Series(data=[0., 0.25, 0.], + index=['x', 'y', 'z']) +apex1 = pd.Series(data=[0.5, -0.5, 0.], + index=['x', 'y', 'z']) +apex2 = pd.Series(data=[0.5, 0.5, 0.], + index=['x', 'y', 'z']) + +mytriangle = Triangle(apex0, apex1, apex2) +frame = triangle2homogeous_transform( + mytriangle, triangle_mode='x-axis=median-from-0') + +fig = plt.figure(figsize=(5, 5)) +ax = fig.add_subplot(111, projection='3d') +draw_frame(ax=ax, frame=frame) +mytriangle.plot(ax=ax, apex_marker='wo') +ax.set_xlim([-1, 1]) +ax.set_ylim([-1, 1]) +ax.set_zlim([-1, 1]) +fig.show() diff --git a/doc/source/orientation/examples/triangle_pitch.py b/doc/source/orientation/examples/triangle_pitch.py new file mode 100644 index 0000000000000000000000000000000000000000..59e1ce6b018873ac418f8e9960d494ed3e8022e5 --- /dev/null +++ b/doc/source/orientation/examples/triangle_pitch.py @@ -0,0 +1,26 @@ +import pandas as pd +from matplotlib import pyplot as plt +from bodorientpy.markers.transformations import triangle2homogeous_transform +from bodorientpy.markers.triangle import Triangle +from bodorientpy.plot import draw_frame + + +apex0 = pd.Series(data=[0., 0.25, 0.], + index=['x', 'y', 'z']) +apex1 = pd.Series(data=[0.5, -0.5, 0.], + index=['x', 'y', 'z']) +apex2 = pd.Series(data=[0.5, 0.5, 0.], + index=['x', 'y', 'z']) + +mytriangle = Triangle(apex0, apex1, apex2) +frame = triangle2homogeous_transform( + mytriangle, triangle_mode='y-axis=1-2') + +fig = plt.figure(figsize=(5, 5)) +ax = fig.add_subplot(111, projection='3d') +draw_frame(ax=ax, frame=frame) +mytriangle.plot(ax=ax, apex_marker='wo') +ax.set_xlim([-1, 1]) +ax.set_ylim([-1, 1]) +ax.set_zlim([-1, 1]) +fig.show() diff --git a/doc/source/orientation/examples/two_markers.py b/doc/source/orientation/examples/two_markers.py new file mode 100644 index 0000000000000000000000000000000000000000..dac6aa731c0b6e2776a4840fac4e20c5ba2e1ae1 --- /dev/null +++ b/doc/source/orientation/examples/two_markers.py @@ -0,0 +1,88 @@ +import pandas as pd +import numpy as np +import matplotlib.pyplot as plt +from bodorientpy.homogenous.transformations import compose_matrix +import bodorientpy.markers.transformations as mtf +from bodorientpy.markers.triangle import Triangle + + +yaw_t = +3*np.pi/4 +pitch_t = -1*np.pi/6 +roll_t = -1*np.pi/12 +angles = [yaw_t, pitch_t, roll_t] +euler_axes = 'rzyx' +# Create a triangle in a given orientation +# and get the x,y,z axis used as our two markers +triangle_mode = 'x-axis=median-from-0' +transform = compose_matrix(angles=angles, + axes=euler_axes) +markers = pd.Series(data=0, + index=pd.MultiIndex.from_product( + [[0, 1, 2], ['x', 'y', 'z']])) +markers.loc[(0, 'x')] = -1 +markers.loc[(2, 'y')] = np.sin(np.pi / 3) +markers.loc[(1, 'y')] = -np.sin(np.pi / 3) +markers.loc[(1, 'x')] = np.cos(np.pi / 3) +markers.loc[(2, 'x')] = np.cos(np.pi / 3) +equilateral = Triangle(markers.loc[0], + markers.loc[1], + markers.loc[2]) +equilateral.transform(transform) +_, x_axis, y_axis, z_axis = mtf.triangle2bodyaxis( + equilateral, triangle_mode) + +known_angles = np.linspace(-np.pi, np.pi, 180) + +axis_alignement = 'x-axis' +mark0 = pd.Series(data=0, index=['x', 'y', 'z']) +mark1 = pd.Series(x_axis, index=['x', 'y', 'z']) +solution_x_axis = pd.DataFrame(index=known_angles, columns=['yaw', + 'pitch', + 'roll']) +for known_angle in 
known_angles: + angles_estimate = mtf.twomarkers2euler( + mark0, mark1, axis_alignement, known_angle, euler_axes) + solution_x_axis.loc[known_angle, :] = angles_estimate + +axis_alignement = 'y-axis' +mark0 = pd.Series(data=0, index=['x', 'y', 'z']) +mark1 = pd.Series(y_axis, index=['x', 'y', 'z']) +solution_y_axis = pd.DataFrame(index=known_angles, columns=['yaw', + 'pitch', + 'roll']) +for known_angle in known_angles: + angles_estimate = mtf.twomarkers2euler( + mark0, mark1, axis_alignement, known_angle, euler_axes) + solution_y_axis.loc[known_angle, :] = angles_estimate + +axis_alignement = 'z-axis' +mark0 = pd.Series(data=0, index=['x', 'y', 'z']) +mark1 = pd.Series(z_axis, index=['x', 'y', 'z']) +solution_z_axis = pd.DataFrame(index=known_angles, columns=['yaw', + 'pitch', + 'roll']) +for known_angle in known_angles: + angles_estimate = mtf.twomarkers2euler( + mark0, mark1, axis_alignement, known_angle, euler_axes) + solution_z_axis.loc[known_angle, :] = angles_estimate + +fig, axarr = plt.subplots(1, 3, figsize=(15, 4), + sharey=True) +ax = axarr[0] +solution_x_axis.plot(ax=ax) +ax.set_title('x-aligned') +ax = axarr[1] +solution_y_axis.plot(ax=ax) +ax.set_title('y-aligned') +ax = axarr[2] +solution_z_axis.plot(ax=ax) +ax.set_title('z-aligned') + +for ax in axarr: + ax.set_xlabel('input angle [rad]') + ax.set_ylabel('euler angle [rad]') + +fig.show() + +import time +time.sleep(5) diff --git a/doc/source/orientation/examples/yawpitchroll.py b/doc/source/orientation/examples/yawpitchroll.py new file mode 100644 index 0000000000000000000000000000000000000000..413dc9d126742bd060d737f63a7aba58e9337c9c --- /dev/null +++ b/doc/source/orientation/examples/yawpitchroll.py @@ -0,0 +1,58 @@ +import numpy as np +from bodorientpy.homogenous.transformations import compose_matrix, \ + decompose_matrix +import pandas as pd +from matplotlib import pyplot as plt +from bodorientpy.markers.transformations import triangle2homogeous_transform +from bodorientpy.markers.triangle import Triangle +from bodorientpy.plot import draw_frame + + +apex0 = pd.Series(data=[0., 0.25, 0.], + index=['x', 'y', 'z']) +apex1 = pd.Series(data=[0.5, -0.5, 0.], + index=['x', 'y', 'z']) +apex2 = pd.Series(data=[0.5, 0.5, 0.], + index=['x', 'y', 'z']) + +triangle_orig = Triangle(apex0, apex1, apex2) +frame_orig = triangle2homogeous_transform( + triangle_orig, triangle_mode='y-axis=1-2') +triangle = Triangle(apex0, apex1, apex2) + +pos = np.array([1.0, 1.0, 1.0]) +yaw = +1*np.pi/3 +pitch = +1*np.pi/6 +roll = -1*np.pi/6 +axes = 'rzyx' + +transform = compose_matrix(translate=pos, + angles=[yaw, pitch, roll], + axes=axes) +triangle.transform(transform) +frame = triangle2homogeous_transform( + triangle, triangle_mode='y-axis=1-2') +_, _, angles, translate, _ = decompose_matrix(frame, axes=axes) + +fig = plt.figure(figsize=(5, 5)) +ax = fig.add_subplot(111, projection='3d') +draw_frame(ax=ax, frame=frame_orig) +triangle_orig.plot(ax=ax, apex_marker='wo') +draw_frame(ax=ax, frame=frame) +triangle.plot(ax=ax, apex_marker='wo') + +celltext = [['{:0.2f}'.format(np.rad2deg(yaw)), + '{:0.2f}'.format(np.rad2deg(angles[0]))], + ['{:0.2f}'.format(np.rad2deg(pitch)), + '{:0.2f}'.format(np.rad2deg(angles[1]))], + ['{:0.2f}'.format(np.rad2deg(roll)), + '{:0.2f}'.format(np.rad2deg(angles[2]))]] +ax.table(cellText=celltext, + rowLabels=['yaw', 'pitch', 'roll'], + colLabels=['theoric', 'estimate'], + loc='bottom') + +ax.set_xlim([-1, 1.5]) +ax.set_ylim([-1, 1.5]) +ax.set_zlim([-1, 1.5]) +fig.show() diff --git a/doc/source/orientation/index.rst 
b/doc/source/orientation/index.rst
new file mode 100644
index 0000000000000000000000000000000000000000..328d26294041e3514a4bfd69cb310e1e25de85e9
--- /dev/null
+++ b/doc/source/orientation/index.rst
@@ -0,0 +1,11 @@
+===========
+Orientation
+===========
+
+.. toctree::
+   :maxdepth: 2
+
+   rotation.rst
+   orientation.rst
+   bumblebee_orientation.rst
+   Orientation_two_markers.rst
diff --git a/doc/source/orientation/media/marked_bumblebee.png b/doc/source/orientation/media/marked_bumblebee.png
new file mode 100644
index 0000000000000000000000000000000000000000..401aea47dce0e5679a6f1c7ed14a309828f72b21
Binary files /dev/null and b/doc/source/orientation/media/marked_bumblebee.png differ
diff --git a/doc/source/orientation/orientation.rst b/doc/source/orientation/orientation.rst
new file mode 100644
index 0000000000000000000000000000000000000000..205fc4095eb4c4ce83e453c9c7ae12b4aea5253f
--- /dev/null
+++ b/doc/source/orientation/orientation.rst
@@ -0,0 +1,172 @@
+From reference points to an orientation matrix
+==============================================
+
+In geometry the orientation (also: angular position, or attitude) of an object is part of the description of how it is placed in the space it is in. Namely, it is the imaginary rotation that is needed to move the object from a reference placement to its current placement. For example, for the head of an animal it is the description of the direction in which the animal is looking (forward), the direction from left to right (sideward), and the direction from top to bottom (downward).
+Each direction is mathematically represented by a unit vector. Furthermore those vectors are orthogonal to each other, and form a directly oriented basis.
+To get an intuitive idea, use your right hand: point the thumb toward a wall, the index finger parallel to the floor, and the middle finger downward. Your right hand forms a directly oriented orthogonal basis :) .
+
+The orientation of an agent is described by three vectors. Those vectors can be obtained, for example, from the apexes of a triangle. Let's take an equilateral triangle for simplicity, but any triangle could work. All vectors will originate from the same point, so let's choose one apex to be the origin, and mark it with a black point. The forward vector points toward the edge facing the origin, and is along the median, i.e. it crosses the middle of the facing edge. The sideward vector is parallel to the facing edge. The last vector is simply the cross product of the two other vectors.
+
+This example illustrates how one can define the orientation of an agent from a triangle. A triangle is described by only three points in space. To know the orientation of the head of an animal, one only needs three points on its head, neat :).
+
+For an aircraft, the forward, sideward, and downward vectors are named roll axis, pitch axis, and yaw axis, respectively. We will from now on stick to this convention.
+
+Isosceles triangle
+------------------
+
+Let's assume that we have one marker (mark0) centered on the center of mass \
+of the agent. The two others (mark1 and mark2) are placed such that
+
+* the markers form an isosceles triangle, here the two equal sides are \
+mark0-mark1 and mark0-mark2,
+* the median of the triangle, i.e. the vector going from mark0 to the \
+middle of the segment between mark1 and mark2, is along the roll axis
+
+The yaw, pitch, and roll axes are then calculated as:
+
+* The roll_axis is along the median between the 2nd and 3rd apexes \
+  (mark1 and mark2)
+* The yaw_axis is the cross-product between the vectors from the 1st apex to the 2nd and \
+  from the 1st apex to the 3rd apex.
+* The pitch_axis is the cross-product between the roll_axis and the opposite \
+of the yaw_axis
+
+.. literalinclude:: examples/triangle_isocel.py
+   :lines: 15-17
+
+.. plot:: orientation/examples/triangle_isocel.py
+
+Pitch-aligned triangle
+----------------------
+However the markers may not always be correctly placed; therefore the \
+local reference frame may differ a bit from the axis convention for aircraft. To compensate for this error, one may calculate the rotation between the correct local reference frame and the estimated local reference frame by identifying other reference points when the agent is at the null orientation. This procedure is sadly rarely feasible with insects. It is rather difficult to define unambiguously the local reference frame at null orientation. The experimentalist may trust one of the axes of the triangle more than the others. For example, fewer errors are made when placing markers aligned with the pitch axis on the head, because the pitch axis is aligned with the axis connecting the two eyes.
+
+When the pitch axis can be trusted, the yaw, pitch, and roll axes are calculated as:
+
+* The pitch_axis is the vector between the 2nd and the 3rd apex
+* The yaw_axis is the cross-product between the vectors from the 1st apex to the 2nd and from the 1st apex to the 3rd apex.
+* The roll_axis is the cross-product between the pitch_axis and yaw_axis
+
+.. literalinclude:: examples/triangle_pitch.py
+   :lines: 15-17
+
+.. plot:: orientation/examples/triangle_pitch.py
+
+
+Yaw, pitch, and roll rotations
+------------------------------
+A rotation is a circular movement of an object around a center (or point) of rotation. A three-dimensional object always rotates around an imaginary line called a rotation axis. In three dimensions, the orientation of an object is given by three rotations, i.e. by three rotation axes. Several conventions exist for defining the rotations. Here we will talk only about the ZYX convention, also known as yaw, pitch, and roll rotations.
+
+
+(from: http://planning.cs.uiuc.edu/node102.html)
+
+A 3D body can be rotated about three orthogonal axes, as shown in Figure 3.8. Borrowing aviation terminology, these rotations will be referred to as yaw, pitch, and roll:
+
+
+A yaw is a counterclockwise rotation of :math:`\alpha` about the :math:`z`-axis. The rotation matrix is given by
+
+.. math::
+   R_z(\alpha) = \begin{pmatrix}\cos\alpha & -\sin\alpha & 0 \\ \sin\alpha & \cos\alpha & 0 \\ 0 & 0 & 1 \end{pmatrix}
+
+Note that the upper left entries of :math:`R_z(\alpha)` form a 2D rotation applied to the :math:`x` and :math:`y` coordinates, whereas the :math:`z` coordinate remains constant.
+
+A pitch is a counterclockwise rotation of :math:`\beta` about the :math:`y`-axis. The rotation matrix is given by
+
+.. math::
+   R_y(\beta) = \begin{pmatrix}\cos\beta & 0 & \sin\beta \\ 0 & 1 & 0 \\ -\sin\beta & 0 & \cos\beta \end{pmatrix}
+
+
+A roll is a counterclockwise rotation of :math:`\gamma` about the :math:`x`-axis. The rotation matrix is given by
+
+.. math::
+   R_x(\gamma) = \begin{pmatrix}1 & 0 & 0 \\ 0 & \cos\gamma & -\sin\gamma \\ 0 & \sin\gamma & \cos\gamma \end{pmatrix}
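+
+These three matrices are easy to write down directly; the short NumPy sketch below (not part of ``bodorientpy``) builds them, composes them in the ZYX order used in this section, and shows that changing the order changes the result:
+
+.. code-block:: python
+
+   import numpy as np
+
+   def Rz(a):  # yaw
+       return np.array([[np.cos(a), -np.sin(a), 0],
+                        [np.sin(a),  np.cos(a), 0],
+                        [0,          0,         1]])
+
+   def Ry(b):  # pitch
+       return np.array([[np.cos(b),  0, np.sin(b)],
+                        [0,          1, 0],
+                        [-np.sin(b), 0, np.cos(b)]])
+
+   def Rx(g):  # roll
+       return np.array([[1, 0,          0],
+                        [0, np.cos(g), -np.sin(g)],
+                        [0, np.sin(g),  np.cos(g)]])
+
+   a, b, g = np.deg2rad([60, 30, -30])   # arbitrary yaw, pitch, roll
+   R_zyx = Rz(a) @ Ry(b) @ Rx(g)         # roll first, then pitch, then yaw
+   R_xyz = Rx(g) @ Ry(b) @ Rz(a)         # a different order...
+   print(np.allclose(R_zyx, R_xyz))      # ...gives a different matrix: False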
+
+Each rotation matrix is a simple extension of the 2D rotation matrix, (3.31). For example, the yaw matrix, :math:`R_z(\alpha)`, essentially performs a 2D rotation with respect to the :math:`x` and :math:`y` coordinates while leaving the :math:`z` coordinate unchanged. Thus, the third row and third column of :math:`R_z(\alpha)` look like part of the identity matrix, while the upper right portion of :math:`R_z(\alpha)` looks like the 2D rotation matrix.
+The yaw, pitch, and roll rotations can be used to place a 3D body in any orientation. A single rotation matrix can be formed by multiplying the yaw, pitch, and roll rotation matrices to obtain
+
+.. math::
+   \begin{split}
+   R(\alpha,& \beta,\gamma) = R_z(\alpha) \, R_y(\beta) \, R_x(\gamma) = \\
+   & \begin{pmatrix}
+   \cos\alpha \cos\beta &
+   \cos\alpha \sin\beta \sin\gamma - \sin\alpha \cos\gamma &
+   \cos\alpha \sin\beta \cos\gamma + \sin\alpha \sin\gamma \\
+   \sin\alpha \cos\beta &
+   \sin\alpha \sin\beta \sin\gamma + \cos\alpha \cos\gamma &
+   \sin\alpha \sin\beta \cos\gamma - \cos\alpha \sin\gamma \\
+   -\sin\beta & \cos\beta \sin\gamma & \cos\beta \cos\gamma \\
+   \end{pmatrix}
+   \end{split}
+
+
+It is important to note that :math:`R(\alpha,\beta,\gamma)` performs the roll first, then the pitch, and finally the yaw. If the order of these operations is changed, a different rotation matrix would result. Be careful when interpreting the rotations. Consider the final rotation, a yaw by :math:`\alpha`. Imagine sitting inside of a robot :math:`{\cal A}` that looks like an aircraft. If :math:`\beta = \gamma = 0`, then the yaw turns the plane in a way that feels like turning a car to the left. However, for arbitrary values of :math:`\beta` and :math:`\gamma`, the final rotation axis will not be vertically aligned with the aircraft because the aircraft is left in an unusual orientation before :math:`\alpha` is applied. The yaw rotation occurs about the :math:`z`-axis of the world frame, not the body frame of :math:`{\cal A}`. Each time a new rotation matrix is introduced from the left, it has no concern for the original body frame of :math:`{\cal A}`. It simply rotates every point in :math:`{\mathbb{R}}^3` in terms of the world frame. Note that 3D rotations depend on three parameters, :math:`\alpha`, :math:`\beta`, and :math:`\gamma`, whereas 2D rotations depend only on a single parameter, :math:`\theta`. The primitives of the model can be transformed using :math:`R(\alpha,\beta,\gamma)`, resulting in :math:`{\cal A}(\alpha,\beta,\gamma)`.
+
+Orientation matrix
+------------------
+
+When introducing the body frame, we talked about an imaginary rotation of the reference frame to the animal's head frame. This is equivalent to finding the matrix :math:`R` so that:
+
+.. math::
+   R.F^\text{ref} = F^\text{bee} \\
+   F^\text{ref} = \begin{pmatrix}
+   roll^\text{ref}_x & pitch^\text{ref}_x & yaw^\text{ref}_x \\
+   roll^\text{ref}_y & pitch^\text{ref}_y & yaw^\text{ref}_y \\
+   roll^\text{ref}_z & pitch^\text{ref}_z & yaw^\text{ref}_z \\
+   \end{pmatrix}
+
+Linear algebra tells us that if the inverse of :math:`F^\text{ref}` exists, the rotation matrix :math:`R` is equal to :math:`F^\text{bee}.\left(F^\text{ref}\right)^{-1}`
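+
+As a quick numerical check of this relation (a sketch independent of ``bodorientpy``, using made-up rotations), one can build :math:`F^\text{bee}` by rotating a reference frame and then recover :math:`R` from the two frames:
+
+.. code-block:: python
+
+   import numpy as np
+
+   def rot_z(a):
+       return np.array([[np.cos(a), -np.sin(a), 0],
+                        [np.sin(a),  np.cos(a), 0],
+                        [0,          0,         1]])
+
+   # Reference frame: roll, pitch and yaw axes stored as columns
+   F_ref = rot_z(np.deg2rad(10))    # an arbitrary, non-trivial reference frame
+   R_true = rot_z(np.deg2rad(40))   # the unknown rotation we want to recover
+   F_bee = R_true @ F_ref           # frame measured on the animal
+
+   # R = F_bee . (F_ref)^-1, as stated above
+   R_estimated = F_bee @ np.linalg.inv(F_ref)
+   print(np.allclose(R_estimated, R_true))  # True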
+
+The rotation matrix :math:`R` has nine values, but we know that only three angles are necessary to describe the orientation of the rigid body. So how can we obtain the yaw, pitch, and roll angles from the rotation matrix :math:`R`?
+
+Determining yaw, pitch, and roll from a rotation matrix
+-------------------------------------------------------
+(adapted from http://planning.cs.uiuc.edu/node103.html)
+
+It is often convenient to determine the :math:`\alpha`, :math:`\beta`, and :math:`\gamma` parameters directly from a given rotation matrix. Suppose an arbitrary rotation matrix
+
+.. math::
+   \begin{pmatrix}r_{11} & r_{12} & r_{13}\\ r_{21} & r_{22} & r_{23}\\ r_{31} & r_{32} & r_{33} \end{pmatrix}
+
+is given. By setting each entry equal to its corresponding entry in (3.42), equations are obtained that must be solved for :math:`\alpha`, :math:`\beta`, and :math:`\gamma`. Note that :math:`r_{21}/r_{11} = \tan\alpha` and :math:`r_{32}/r_{33} = \tan \gamma`. Also, :math:`r_{31} = - \sin\beta` and :math:`\pm\sqrt{r^2_{32}+r^2_{33}} = \cos\beta`. Solving for each angle yields
+
+.. math::
+   \displaystyle \alpha = \tan^{-1} (\pm r_{21}/\pm r_{11})
+
+.. math::
+   \beta = \tan^{-1} \Big(-r_{31} \big/ \pm\sqrt{r^2_{32}+r^2_{33}}\Big)
+
+.. math::
+   \gamma = \tan^{-1} (\pm r_{32}/\pm r_{33})
+
+Note that the ambiguity on :math:`\pm` comes from the sign of :math:`\cos\beta`, which is a priori unknown.
+
+There is a choice of four quadrants for the inverse tangent functions. How can the correct quadrant be determined? Each quadrant should be chosen by using the signs of the numerator and denominator of the argument. The numerator sign selects whether the direction will be above or below the :math:`x`-axis, and the denominator selects whether the direction will be to the left or right of the :math:`y`-axis. This is the same as the :math:`\arctan_2` function in the C programming language, which nicely expands the range of the arctangent to :math:`[0,2\pi)`. This can be applied to express (3.44), (3.45), and (3.46) as
+
+.. math::
+   \alpha = \arctan_2(\pm r_{21},\pm r_{11})
+
+.. math::
+   \beta = \arctan_2\Big(-r_{31},\pm\sqrt{r^2_{32}+r^2_{33}}\Big)
+
+.. math::
+   \gamma = \arctan_2(\pm r_{32},\pm r_{33})
+
+Note that this method assumes :math:`r_{11} \not = 0` and :math:`r_{33} \not = 0`.
+
+Note that the choice of :math:`\pm` can be determined by comparing the estimated orientation matrix from :math:`\alpha`, :math:`\beta`, and :math:`\gamma` and the orientation matrix of the agent.
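+
+A minimal NumPy version of these formulas (choosing the :math:`\cos\beta \geq 0` branch, and independent of the ``decompose_matrix`` helper used in the example below) could look like this:
+
+.. code-block:: python
+
+   import numpy as np
+
+   def yaw_pitch_roll(R):
+       """Euler angles (ZYX convention) from a 3x3 rotation matrix."""
+       alpha = np.arctan2(R[1, 0], R[0, 0])                     # yaw
+       beta = np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2]))  # pitch
+       gamma = np.arctan2(R[2, 1], R[2, 2])                     # roll
+       return alpha, beta, gamma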
+
+.. literalinclude:: examples/yawpitchroll.py
+   :lines: 32-34
+
+.. plot:: orientation/examples/yawpitchroll.py
+
+
+Axis convention for aircraft
+----------------------------
+
+The local reference frame is composed of three axes.
+
+* X axis is the longitudinal axis pointing ahead
+* Z axis is the vertical axis pointing downwards
+* Y axis is the lateral one, pointing in such a way that the frame is right handed (from left to right when looking ahead)
diff --git a/doc/source/orientation/rotation.rst b/doc/source/orientation/rotation.rst
new file mode 100644
index 0000000000000000000000000000000000000000..ca8c6a3516ff8d32970942088ed6f717409ca77d
--- /dev/null
+++ b/doc/source/orientation/rotation.rst
@@ -0,0 +1,51 @@
+Rotation
+========
+
+The world is full of animated agents, e.g. animals, aircraft, and robots. They move in the world avoiding predators, foraging and transporting back supplies, or looking for the perfect (or a suitable) mate. As observers of the curiosities and strangeness of the world, we wonder where such agents are looking; we wonder about their orientation. In geometry the orientation (also: angular position, or attitude) of an object is part of the description of how it is placed in the space it is in. Namely, it is the imaginary rotation that is needed to move the object from a reference placement to its current placement.
+
+To introduce the different concepts related to the orientation of an agent, we will first work in an imaginary world composed of only two dimensions. The position of an agent in such a world can be expressed as a function of two variables. Your screen is indeed a two-dimensional space. The position of the mouse is expressed in terms of pixels along the height and the width of your monitor. We could for example place the mouse at the position :math:`(100,200)`.
+Wait, where should I start counting? Where is the position :math:`(0,0)`? We need to define an origin, and the directions in which we count, e.g. from left to right and from bottom to top for the first and second variables respectively. Without knowing it, we are defining a reference frame. A reference frame is composed of unit vectors, i.e. the directions in which we have to count and the unit used (here the unit is the pixel), and an origin (here the bottom left corner of the monitor).
+
+The reference frame allows us to position an agent in the world, but what about its orientation? The orientation of an agent requires another frame, one linked to the agent itself. We need an origin, e.g. the center of mass, and two unit vectors (because we are in a 2D space, remember). One unit vector can be chosen along the long axis of the body, and the second one orthogonal to the first one. The orthogonality will ease the formalism later. We can then place the agent in its resting position, i.e. at null orientation. When the agent moves, its orientation will change; the orientation is the imaginary rotation that is needed to move the reference frame at the resting position to the reference frame attached to the agent.
+
+A rotation is a circular movement of an object around a center (or point) of rotation. In linear algebra the rotation of an angle :math:`\alpha` is defined by the matrix:
+
+.. math::
+   R =
+   \begin{bmatrix}
+   \cos \alpha & -\sin \alpha \\
+   \sin \alpha & \cos \alpha \\
+   \end{bmatrix}
+
+A vector :math:`v_0` can be rotated by an angle :math:`\alpha` by applying the matrix :math:`R` to :math:`v_0`, i.e. :math:`v=Rv_0`
+
+We will say that the agent is looking in the direction :math:`\alpha` when the frame linked to the agent is the rotation by the angle :math:`\alpha` of the frame at the resting position, i.e. when
+
+.. math::
+   F_a=RF_0
+
+Here :math:`F_a` is the actual frame of the agent, :math:`F_0` is the frame at the resting position, and :math:`R` the imaginary rotation.
+
+
+Finding the orientation of an agent (2D)
+----------------------------------------
+
+Usually we do not know the imaginary rotation made by the agent. To find it we need to invert the linear system introduced above.
+
+.. math::
+   R=F_a(F_0)^{-1}
+
+Once we have the orientation matrix we can find the angle :math:`\alpha` by using combinations of elements of the matrix. For the present case we can get :math:`\alpha` by using the first column of the matrix.
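+
+As a small worked example of this 2D case (plain NumPy, with a made-up angle), we can rotate a resting frame and recover :math:`\alpha` from the first column of the estimated :math:`R`:
+
+.. code-block:: python
+
+   import numpy as np
+
+   alpha_true = np.deg2rad(30)
+   R = np.array([[np.cos(alpha_true), -np.sin(alpha_true)],
+                 [np.sin(alpha_true),  np.cos(alpha_true)]])
+
+   F_0 = np.eye(2)   # frame at the resting position
+   F_a = R @ F_0     # frame attached to the agent
+
+   R_estimated = F_a @ np.linalg.inv(F_0)
+   # the first column of R is (cos(alpha), sin(alpha))
+   alpha_estimated = np.arctan2(R_estimated[1, 0], R_estimated[0, 0])
+   print(np.rad2deg(alpha_estimated))  # ~30.0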
+
+Reference frame in the real world (3D)
+--------------------------------------
+
+The real world does not have only two dimensions but three. The reference frames will then have three unit vectors. The first unit vector can still be chosen along the longitudinal axis of the agent. But how do we define the two other ones? We can no longer determine the 2nd vector by using the 1st vector and orthogonality, because infinitely many unit vectors are orthogonal to the 1st vector. We need to introduce a convention. Scientists and engineers have converged on a convention for aircraft. The first vector is along the longitudinal axis, the 2nd from left to right when seated in the aircraft, and the 3rd and last one pointing downward. The last vector is used to measure height, and it makes sense for an aircraft to measure height positively downward.
+
+Once the reference frame has been introduced we need to have a look at the orientation matrix. This time it will be a 3x3 matrix, i.e. composed of 9 elements.
+
+A rotation in a three-dimensional space is made around a line, i.e. an axis or a vector. We have already defined three vectors, and, you know what, the orientation can be defined by three rotations.
+
+**Note** The frame of the agent can be computed from three non-collinear points. One will be the origin; the 1st axis can go from the origin to a point between the two other points. The second axis is orthogonal to the plane formed by the three points. The last vector is the cross product of the two other ones. This process assumes that the distance between any two given points of a rigid body remains constant in time regardless of external forces exerted on it, i.e. the points are placed on a rigid body.
+
+**Note** A set of vectors is called, in linear algebra, a basis if the vectors are linearly independent and every vector in the vector space is a linear combination of this set. In physics, it is called a frame of reference, i.e. it consists of an abstract coordinate system and the set of physical reference points that uniquely fix (locate and orient) the coordinate system and standardize measurements.
diff --git a/setup.py b/setup.py
index 44587fd0658a5535f5987f6a517b33836da9cc4e..4aae5a327c8be9ba4dae2da6de88ba9a42720fcf 100644
--- a/setup.py
+++ b/setup.py
@@ -62,7 +62,8 @@ setup_dict = {'name': 'navipy',
                   'tox',
                   'pyyaml',
                   'Pillow',
-                  'tables'],
+                  'tables',
+                  'nbsphinx'],
     'package_data': {'navipy': package_data_files("navipy")},
     'include_package_data': True,