%% Cell type:code id: tags:
``` python
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline
from navipy.processing.mcode import optic_flow
from navipy.maths.coordinates import cartesian_to_spherical
from navipy.moving.agent import posorient_columns
from navipy.moving.agent import velocities_columns
from navipy.scene import __spherical_indeces__
```
%% Output
%% Cell type:markdown id: tags:
# Optic Flow
Optic flow is defined as the change of light in the image, e.g. on the retina or the camera’s sensor, due to a relative motion between the eye or camera and the scene.
In this section we focus on calculating the optic flow from the animal's or camera's motion and the distance to surrounding objects. Note that in applied tasks the problem of interest is the opposite of our current focus, namely how to recover the animal's or camera's motion and the distance to surrounding objects from an estimate of the optic flow obtained from a series of images.
To avoid confusion, we qualify the optic flow derived from known animal or camera motion and distance to surrounding objects as geometrical, and will call it geometrical optic flow or gOF.
Koenderink and van Doorn, in their 1987 article 'Facts on optic flow', derived the geometrical optic flow and obtained the following equation:
$$ gOF_i = -\mu_i\left(t-(t\cdot d_i)d_i\right)-R\times d_i $$
where
* $gOF_i$ is the geometrical optic flow in the $i$-th viewing direction $d_i$.
* $t$ and $R$ are the translational and angular velocity, respectively.
* $\mu_i$ is the nearness (inverse of the distance) to the closest object in the $i$-th viewing direction.
The equation can be rewritten using the "apparent rotation" due to the translation, $A_i=\mu_i d_i\times t$, and becomes
$$ gOF_i = -(A_i+R)\times d_i $$
The optic flow is thus the sum of the apparent rotation due to the translation and the angular rotation, projected onto a plane orthogonal to the viewing direction.
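%% Cell type:markdown id: tags:
The equivalence of the two formulations can be checked numerically. Below is a minimal sketch in plain numpy; the velocity values, viewing direction, and nearness are made-up illustrative numbers, not taken from the tutorial's trajectory:
%% Cell type:code id: tags:
``` python
import numpy as np

# Made-up example values: translational velocity, angular velocity,
# a unit viewing direction and the nearness (inverse distance).
t_vel = np.array([1.0, 0.2, -0.1])   # translational velocity t
R_vel = np.array([0.05, -0.3, 0.1])  # angular velocity R
d = np.array([1.0, 2.0, 2.0])
d = d / np.linalg.norm(d)            # viewing direction d_i (unit vector)
mu = 1.0 / 3.0                       # nearness mu_i = 1 / distance

# First formulation: gOF_i = -mu_i (t - (t . d_i) d_i) - R x d_i
gof_direct = -mu * (t_vel - np.dot(t_vel, d) * d) - np.cross(R_vel, d)

# Second formulation: gOF_i = -(A_i + R) x d_i with A_i = mu_i d_i x t
A = mu * np.cross(d, t_vel)
gof_apparent = -np.cross(A + R_vel, d)

assert np.allclose(gof_direct, gof_apparent)
# The flow has no component along the viewing direction d_i.
assert np.isclose(np.dot(gof_direct, d), 0.0)
```
%% Cell type:markdown id: tags: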
The eye is sometimes described as a spherical apparatus (especially in insect research); each viewing direction can then be expressed in a spherical coordinate system. The gOF in a spherical system has three components, but the one along the viewing direction (i.e. the $\rho$ of the coordinate system) equals zero, because the gOF is orthogonal to that direction.
In the remaining sections we will assume that the distances to close objects have already been calculated (see the rendering tutorials) and will only look at a single point along a trajectory, at which the differences in x, y, z, and Euler angles can be obtained. Furthermore, we will use the yaw-pitch-roll (zyx) convention for the Euler angles.
We convert the points from the world coordinate system to the bee coordinate system, and then express them in spherical coordinates, i.e. the coordinate system used in the `optic_flow` functions.
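%% Cell type:markdown id: tags:
For reference, the Cartesian-to-spherical conversion can be sketched in plain numpy as below. This only illustrates one common convention (azimuth in the x-y plane, elevation towards z) and is not the navipy implementation; check `cartesian_to_spherical` and `__spherical_indeces__` for the library's exact angle convention:
%% Cell type:code id: tags:
``` python
import numpy as np

def cart2spherical(x, y, z):
    """Convert Cartesian coordinates to (elevation, azimuth, radius).

    Convention assumed here: azimuth measured in the x-y plane from
    the x-axis, elevation from the x-y plane towards the z-axis.
    """
    radius = np.sqrt(x**2 + y**2 + z**2)
    azimuth = np.arctan2(y, x)
    elevation = np.arcsin(z / radius)
    return elevation, azimuth, radius

elev, azim, radius = cart2spherical(1.0, 1.0, np.sqrt(2.0))
print(elev, azim, radius)  # pi/4, pi/4, 2.0
```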
%% Cell type:code id: tags:
``` python
markers2bee = mytraj.world2body(relevant_points)
```
%% Output
/home/bolirev/.virtualenvs/toolbox-navigation/lib/python3.6/site-packages/navipy-0.1-py3.6.egg/navipy/trajectories/__init__.py:537: UserWarning: Prior to 12/09/2018:
world2body was doing a reverse transformed
Please use body2world instead
%% Cell type:markdown id: tags:
The optic flow requires not only the trajectory but also its time derivative, so we merge the trajectory with its velocities; the optic flow also depends on the agent's orientation.
A database of rendered views along a trajectory can be created with the tool `blendalongtraj`. Once the database has been created, the optic flow can be obtained from the differences in position and orientation and from the distance to surrounding objects (the 'D' channel).
%% Cell type:code id: tags:
``` python
print('Optic flow was calculated on {} frame in {}s'.format(mytrajvel.dropna().shape[0],
                                                            t_elapse))
```
%% Output
Optic flow was calculated on 24 frame in 104.01745867729187s
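%% Cell type:markdown id: tags:
One simple way to obtain velocities and merge them with the trajectory is a frame-to-frame difference, sketched below with plain pandas. The column names and the 1-sample difference are illustrative assumptions; navipy's `posorient_columns` and `velocities_columns` define the actual naming used by the library:
%% Cell type:code id: tags:
``` python
import pandas as pd

# Illustrative trajectory: position and zyx Euler angles per frame.
traj = pd.DataFrame({'x': [0.0, 0.1, 0.3],
                     'y': [0.0, 0.0, 0.1],
                     'z': [1.0, 1.0, 1.1],
                     'yaw': [0.0, 0.05, 0.1],
                     'pitch': [0.0, 0.0, 0.02],
                     'roll': [0.0, 0.01, 0.01]})

# Frame-to-frame differences as a crude velocity estimate
# (divide by the frame period to get proper units).
vel = traj.diff().add_prefix('d')

# Merge trajectory and velocities; the first row has NaN velocities
# and is dropped, as in the tutorial's dropna() call above.
trajvel = pd.concat([traj, vel], axis=1).dropna()
print(trajvel.shape)  # (2, 12)
```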
%% Cell type:markdown id: tags:
## Remarks
### Misconception about rotational optic flow
In the formulation of the optic flow, the angular velocity $R$ of the animal is used. This velocity is a vector expressed in the animal's coordinate system, and is **not** equal to the vector formed by differentiating the Euler angles between two time points $t$ and $t-1$, i.e. $[yaw(t)-yaw(t-1),pitch(t)-pitch(t-1),roll(t)-roll(t-1)]$. Each Euler angle describes a rotation from one frame to the next, and only the final frame corresponds to the coordinate system of the bee. For the optic flow this implies that, at an arbitrary orientation, we cannot expect the optic flow to be a rotation around the bee's x-axis whenever the roll difference is non-zero.
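%% Cell type:markdown id: tags:
This can be checked numerically: the body-frame angular velocity follows from the skew-symmetric matrix $\Omega = R^\top \dot R$, where $R$ is the rotation matrix built from the zyx Euler angles. The sketch below uses plain numpy, a finite-difference approximation of $\dot R$, and made-up angle values; it shows that the angular velocity differs from the raw Euler-angle rates:
%% Cell type:code id: tags:
``` python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix for zyx (yaw-pitch-roll) Euler angles."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

# Made-up Euler angles and angle rates at a non-trivial orientation.
angles = np.array([0.5, 0.3, -0.2])   # [yaw, pitch, roll]
rates = np.array([0.4, -0.2, 0.3])    # [dyaw, dpitch, droll]

dt = 1e-6
R0 = rot_zyx(*angles)
R1 = rot_zyx(*(angles + rates * dt))
Omega = R0.T @ (R1 - R0) / dt         # body-frame skew matrix ~ R^T dR/dt
omega = np.array([Omega[2, 1], Omega[0, 2], Omega[1, 0]])  # [wx, wy, wz]

# The body angular velocity is NOT the vector of Euler-angle rates:
euler_rate_vec = rates[::-1]          # [droll, dpitch, dyaw]
assert not np.allclose(omega, euler_rate_vec, atol=1e-3)
```
%% Cell type:markdown id: tags: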
### Calculating the translational and rotational optic flow
It can sometimes be interesting to look at the optic flow field as the animal would have experienced it had it not rotated, or had it not translated. In other words, it can be interesting to look at the translational or the rotational optic flow instead of the full optic flow.
In `navipy` you can calculate such optic flow by using `optic_flow_translational` and `optic_flow_rotational`.
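%% Cell type:markdown id: tags:
In terms of the formulation above, the split corresponds to setting the angular velocity to zero (translational part) or the translational velocity to zero (rotational part). A minimal numpy sketch with made-up values, not the navipy API:
%% Cell type:code id: tags:
``` python
import numpy as np

t_vel = np.array([1.0, 0.2, -0.1])   # translational velocity t
R_vel = np.array([0.05, -0.3, 0.1])  # angular velocity R
d = np.array([0.0, 0.6, 0.8])        # unit viewing direction d_i
mu = 0.5                             # nearness mu_i

# Translational part: angular velocity set to zero.
gof_trans = -mu * (t_vel - np.dot(t_vel, d) * d)
# Rotational part: translational velocity set to zero.
gof_rot = -np.cross(R_vel, d)
# Their sum is the full geometrical optic flow.
gof_full = gof_trans + gof_rot
```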