Commit 31fab465 authored by Olivier Bertrand's avatar Olivier Bertrand

Add trajectory example bodybee

parent 6f5115a9
%% Cell type:code id: tags:
 
``` python
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
 
from navipy.processing.mcode import optic_flow
from navipy.maths.coordinates import cartesian_to_spherical
from navipy.moving.agent import posorient_columns
from navipy.moving.agent import velocities_columns
from navipy.scene import __spherical_indeces__
```
 
%% Cell type:markdown id: tags:
 
# Optic Flow
 
Optic flow is defined as the change of the light pattern in the image, e.g. on the retina or the camera's sensor, due to relative motion between the eye or camera and the scene.
 
In this section we focus on calculating the optic flow from the animal's or camera's motion and the distance to surrounding objects. Note that in applied tasks the problem of interest is usually the opposite of our current focus, namely estimating the animal's or camera's motion and the distance to surrounding objects from the optic flow extracted from a series of images.
 
To avoid confusion between the two problems, we qualify the optic flow computed from known animal or camera motion and known distances to surrounding objects as geometrical, and will call it geometrical optic flow, or gOF.
 
Koenderink and van Doorn, in their 1987 article 'Facts on optic flow', derived the geometrical optic flow and obtained the following equation
 
$$ gOF_i = -\mu_i\left(t-(t\cdot d_i)\,d_i\right)-R\times d_i $$
 
Here:
* $gOF_i$ is the geometrical optic flow in the $i$-th viewing direction $d_i$.
* $t$ and $R$ are the translational and angular velocity, respectively.
* $\mu_i$ is the nearness (inverse of the distance) to the closest object in the $i$-th viewing direction.
 
The equation can be rewritten using the "apparent rotation" due to the translation, $A_i=\mu_i\, d_i\times t$, and becomes
 
$$ gOF_i = -(A_i+R)\times d_i $$
 
The optic flow is thus the sum of the apparent rotation due to the translation and the angular velocity, projected onto a plane orthogonal to the viewing direction.
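The two formulations can be checked against each other numerically. The sketch below is a plain-NumPy illustration (the variable names are ours, not part of navipy): it evaluates both expressions for a single viewing direction and verifies that the resulting gOF has no component along that direction.

``` python
import numpy as np

rng = np.random.default_rng(0)
d = rng.standard_normal(3)
d /= np.linalg.norm(d)           # unit viewing direction d_i
t = rng.standard_normal(3)       # translational velocity
R = rng.standard_normal(3)       # angular velocity
mu = 0.1                         # nearness (inverse distance)

# First formulation: -mu*(t - (t . d) d) - R x d
gof1 = -mu * (t - np.dot(t, d) * d) - np.cross(R, d)

# Second formulation, with the apparent rotation A = mu * d x t
A = mu * np.cross(d, t)
gof2 = -np.cross(A + R, d)

print(np.allclose(gof1, gof2))         # the two formulations agree
print(np.isclose(np.dot(gof1, d), 0))  # gOF is orthogonal to d
```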
 
The eye is sometimes described as a spherical apparatus (especially in insect research); each viewing direction can then be expressed in a spherical coordinate system. The gOF in a spherical system has three components, but the one along the viewing direction (i.e. the $\rho$ axis of the coordinate system) is zero, because the gOF is orthogonal to that direction.
 
In the remaining sections we will assume that the distances to surrounding objects have already been calculated (see the rendering tutorials) and will only look at a single point along a trajectory, at which the differences in x, y, z, and Euler angles can be obtained. Furthermore, we will use the yaw-pitch-roll (zyx) convention for the Euler angles.
 
 
%% Cell type:code id: tags:
 
``` python
convention = 'zyx'
tuples_posvel = posorient_columns(convention)
tuples_posvel.extend(velocities_columns(convention))
index_posvel = pd.MultiIndex.from_tuples(tuples_posvel,
                                         names=['position',
                                                'orientation'])
velocity = pd.Series(index=index_posvel, data=0)
 
# Define the eye
elevation = np.linspace(-np.pi/2,np.pi/2,11)
azimuth = np.linspace(-np.pi,np.pi,21)
ma, me = np.meshgrid(azimuth, elevation)
imshape = me.shape
viewing_directions = np.zeros((ma.shape[0], ma.shape[1], 2))
viewing_directions[..., __spherical_indeces__['elevation']] = me
viewing_directions[..., __spherical_indeces__['azimuth']] = ma
# Useful for quiver plots
vdir_az = viewing_directions[..., __spherical_indeces__['azimuth']]
vdir_el = viewing_directions[..., __spherical_indeces__['elevation']]
# Create a scene
scene = np.random.rand(imshape[0],imshape[1])
scene = np.tile(scene[...,np.newaxis],4)
scene = scene[...,np.newaxis]
```
 
%% Cell type:markdown id: tags:
 
## Optic flow in an x,y plane
 
When all objects are equidistant
 
%% Cell type:code id: tags:
 
``` python
vel = velocity.copy()  # avoid overwriting the template
vel.loc[('location', 'dx')] = np.random.randn()
vel.loc[('location', 'dy')] = np.random.randn()
sce = scene.copy()  # avoid overwriting the template
sce = 0*sce + 10  # equidistant
# Direction of the translation, i.e. the focus of expansion
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow')
plt.plot([az], [el], 'ro', label='Focus of Expansion')
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
 
%% Output
 
 
%% Cell type:markdown id: tags:
 
With one object closer than the rest
 
%% Cell type:code id: tags:
 
``` python
vel = velocity.copy()  # avoid overwriting the template
vel.loc[('location', 'dx')] = 1.0
vel.loc[('location', 'dy')] = 1.0
sce = scene.copy()  # avoid overwriting the template
sce = 0*sce + 10  # equidistant
azindeces = np.arange(7, 10)
sce[:, azindeces, ...] = 5  # one region closer than the rest
# Direction of the translation, i.e. the focus of expansion
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow', scale=5)
plt.quiver(vdir_az[:, azindeces], vdir_el[:, azindeces],
           hof[:, azindeces], vof[:, azindeces],
           color='b',
           label='at the objects', scale=5)
plt.plot([az], [el], 'ro', label='Focus of Expansion')
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
 
%% Output
 
 
%% Cell type:markdown id: tags:
 
## Optic flow around the yaw axis
 
%% Cell type:code id: tags:
 
``` python
vel = velocity.copy()  # avoid overwriting the template
vel.loc[(convention, 'dalpha_0')] = 0.2  # yaw rate
sce = scene.copy()  # avoid overwriting the template
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])
 
plt.figure(figsize=(15,6))
rof, hof, vof= optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az,vdir_el,hof,vof, label='optic flow', scale=5)
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
 
%% Output
 
 
%% Cell type:markdown id: tags:
 
## Optic flow around the pitch axis
 
%% Cell type:code id: tags:
 
``` python
vel = velocity.copy()  # avoid overwriting the template
vel.loc[(convention, 'dalpha_1')] = 0.2  # pitch rate
sce = scene.copy()  # avoid overwriting the template
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])
 
plt.figure(figsize=(15,6))
rof, hof, vof= optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az,vdir_el,hof,vof, label='optic flow', scale=5)
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
 
%% Output
 
 
%% Cell type:markdown id: tags:
 
## Optic flow around the roll axis
 
%% Cell type:code id: tags:
 
``` python
vel = velocity.copy()  # avoid overwriting the template
vel.loc[(convention, 'dalpha_2')] = 0.2  # roll rate
sce = scene.copy()  # avoid overwriting the template
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])
 
plt.figure(figsize=(15,6))
rof, hof, vof= optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az,vdir_el,hof,vof, label='optic flow', scale=5)
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
 
%% Output
 
 
%% Cell type:markdown id: tags:
 
## Optic flow of relevant points
We consider N relevant points, named as follows.
%% Cell type:code id: tags:
``` python
points_name = ['nest', '1', '2', '3', '4']
relevant_points = pd.Series(
    index=pd.MultiIndex.from_product([points_name, ['x', 'y', 'z']]))
relevant_points['nest'] = [0, 0, 0]  # [x, y, z] of the point
relevant_points['1'] = [1, 0, 0]
relevant_points['2'] = [-1, 0, 0]
relevant_points['3'] = [0, 1, 0]
relevant_points['4'] = [0, -1, 0]
```
%% Cell type:markdown id: tags:
and want to calculate the optic flow of these points along the following trajectory
%% Cell type:code id: tags:
``` python
from navipy.trajectories import Trajectory

mytraj = Trajectory().read_hdf(
    '../../../navipy/resources/sample_experiment/Doussot_2018a/bodybee05.hdf')
mytraj.dropna().head()
```
%% Output
location zyx
x y z alpha_0 alpha_1 alpha_2
0
1445 64.346281 12.479517 14.689387 0.767312 0.006603 0.178150
1446 64.633030 12.514757 14.383999 0.738181 0.126137 0.167408
1447 64.781736 12.358705 14.548036 0.759602 -0.000240 0.158936
1448 64.908390 12.285283 14.674417 0.718028 0.099787 0.051479
1449 65.156870 12.207232 14.615128 0.720968 0.244536 0.256502
%% Cell type:markdown id: tags:
We convert the points from the world coordinate system to the bee coordinate system, and then express them in spherical coordinates, i.e. the coordinate system used by the optic flow functions.
%% Cell type:code id: tags:
``` python
markers2bee = mytraj.world2body(relevant_points)
tmp = markers2bee.swaplevel(axis=1)
el, az, radius = cartesian_to_spherical(tmp.x,
                                        tmp.y,
                                        tmp.z)
markers2bee_sh = pd.DataFrame(index=markers2bee.index,
                              columns=markers2bee.columns)
d = dict(zip(['x', 'y', 'z'], ['elevation', 'azimuth', 'radius']))
markers2bee_sh = markers2bee_sh.rename(columns=d, level=1)
markers2bee_sh = markers2bee_sh.swaplevel(axis=1)
markers2bee_sh.elevation = el
markers2bee_sh.azimuth = az
markers2bee_sh.radius = radius
markers2bee_sh = markers2bee_sh.swaplevel(axis=1)
markers2bee_sh.dropna().head()
```
%% Output
/home/bolirev/.virtualenvs/toolbox-navigation/lib/python3.6/site-packages/navipy-0.1-py3.6.egg/navipy/trajectories/__init__.py:537: UserWarning: Prior to 12/09/2018:
world2body was doing a reverse transformed
Please use body2world instead
warnings.warn(msg)
%% Cell type:markdown id: tags:
The optic flow requires not only the trajectory, but also its time derivative.
%% Cell type:code id: tags:
``` python
mytrajdiff = mytraj.diff(periods=1)
d = dict(zip(mytrajdiff.columns.levels[1],
             ['d' + col for col in mytrajdiff.columns.levels[1]]))
mytrajdiff = mytrajdiff.rename(columns=d, level=1)
mytrajdiff.dropna().head()
```
%% Cell type:markdown id: tags:
TODO:
* implement obpc in optic flow
* convert markers2bee_sh into obpc format. Need to create viewing dir for el and az, and scene from radius.
%% Cell type:markdown id: tags:
## Remarks
 
### misconception about rotational optic flow
 
In the formulation of the optic flow, the angular velocity $R$ of the animal is used. This velocity is a vector expressed in the animal's coordinate system, and is **not** equal to the vector formed by differentiating the Euler angles between two time points $t$ and $t-1$, i.e. $[yaw(t)-yaw(t-1), pitch(t)-pitch(t-1), roll(t)-roll(t-1)]$, because each Euler angle describes a rotation from one intermediate frame to the next, and only the final frame corresponds to the coordinate system of the bee. For the optic flow this implies that, at an arbitrary orientation with a non-zero droll, we cannot expect the optic flow to be a pure rotation around the bee's x-axis.
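This can be made concrete with the standard kinematic relation between zyx (yaw-pitch-roll) Euler-angle rates and the body-frame angular velocity. The helper below is an illustrative sketch of ours, not a navipy function: at zero orientation the naive Euler-rate vector and $R$ coincide (up to ordering), but at a pitched orientation they do not.

``` python
import numpy as np

def body_angular_velocity_zyx(pitch, roll, dyaw, dpitch, droll):
    """Body-frame angular velocity from zyx Euler angles and their rates.
    (It depends on pitch and roll, but not on yaw itself.)"""
    wx = droll - dyaw * np.sin(pitch)
    wy = dpitch * np.cos(roll) + dyaw * np.cos(pitch) * np.sin(roll)
    wz = dyaw * np.cos(pitch) * np.cos(roll) - dpitch * np.sin(roll)
    return np.array([wx, wy, wz])

# At zero orientation, R equals the (reordered) Euler-angle rates ...
w_zero = body_angular_velocity_zyx(0.0, 0.0, dyaw=0.1, dpitch=0.0, droll=0.2)
# ... but at a pitched orientation the naive difference is wrong:
w_pitched = body_angular_velocity_zyx(1.0, 0.0, dyaw=0.1, dpitch=0.0, droll=0.2)
print(w_zero)     # -> (0.2, 0.0, 0.1), i.e. (droll, dpitch, dyaw)
print(w_pitched)  # x and z components differ from (0.2, 0.0, 0.1)
```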
 
### Calculating the translational and rotational optic flow
 
It may sometimes be interesting to look at the optic flow field as the animal would have experienced it, had it not rotated or translated. In other words, it may be interesting to look at the translational and rotational optic flow separately instead of the full optic flow.
 
In `navipy` you can calculate such optic flow by using `optic_flow_translational` and `optic_flow_rotational`.
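Because the gOF equation is linear in $t$ and $R$, the full flow is exactly the sum of the translational and rotational components. A minimal NumPy sketch of that decomposition (the helper names below are ours for illustration; consult the navipy documentation for the actual signatures of `optic_flow_translational` and `optic_flow_rotational`):

``` python
import numpy as np

def gof_translational(mu, d, t):
    # Flow due to translation only: -mu * (t - (t . d) d)
    return -mu * (t - np.dot(t, d) * d)

def gof_rotational(d, R):
    # Flow due to rotation only: -R x d
    return -np.cross(R, d)

rng = np.random.default_rng(1)
d = rng.standard_normal(3)
d /= np.linalg.norm(d)          # unit viewing direction
t = rng.standard_normal(3)      # translational velocity
R = rng.standard_normal(3)      # angular velocity
mu = 0.25                       # nearness

full = gof_translational(mu, d, t) + gof_rotational(d, R)
direct = -mu * (t - np.dot(t, d) * d) - np.cross(R, d)
print(np.allclose(full, direct))  # the components sum to the full flow
```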
@@ -568,6 +568,21 @@ class Trajectory(pd.DataFrame):
                                          dtype=float)
        return transformed_markers

    def differentiate(self, periods=1):
        """differentiate the trajectory and rename columns as d+col

        :param periods: periods as in pd.diff()
        :returns: Diff of the trajectory
        :rtype: pd.DataFrame with MultiIndex
        """
        mytrajdiff = self.diff(periods=periods)
        d = dict(zip(mytrajdiff.columns.levels[1],
                     ['d' + col for col in mytrajdiff.columns.levels[1]]))
        mytrajdiff = mytrajdiff.rename(columns=d, level=1)
        return mytrajdiff

    def velocity(self):
        """ Calculate the velocity on a trajectory
        """