Commit ac3ca4d0 authored by Olivier Bertrand

Update database to save frame # as well in order to avoid confusion in case of nans

parent 09d2d16f
%% Cell type:code id: tags:

``` python
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline

from navipy.processing.mcode import optic_flow
from navipy.maths.coordinates import cartesian_to_spherical
from navipy.moving.agent import posorient_columns
from navipy.moving.agent import velocities_columns
from navipy.scene import __spherical_indeces__
```
   
%% Output


%% Cell type:markdown id: tags:

# Optic Flow
   
Optic flow is defined as the change of light in the image, e.g. on the retina or the camera's sensor, due to a relative motion between the eye or camera and the scene.

In this section we focus on calculating the optic flow from the animal or camera motion and the distance to surrounding objects. Note that in applied tasks the problem of interest is usually the opposite of our current focus, namely how to recover the animal or camera motion and the distance to surrounding objects from an estimate of the optic flow obtained from a series of images.

To avoid confusion between the two, we qualify the optic flow computed from known animal or camera motion and known distance to surrounding objects as geometrical, and will call it geometrical optic flow or gOF.

Koenderink and van Doorn, in their 1987 article 'Facts on optic flow', derived the geometrical optic flow and obtained the following equation

$$ gOF_i = -\mu_i\left(t-(t\cdot d_i)d_i\right)-R\times d_i $$

Here
* $gOF_i$ is the geometrical optic flow in the $i$-th viewing direction $d_i$.
* $t$ and $R$ are the translational and angular velocity, respectively.
* $\mu_i$ is the nearness (inverse of the distance) to the closest object in the $i$-th viewing direction.

The equation can be rewritten by using the "apparent rotation" due to the translation, $A_i=\mu_i d_i\times t$, and becomes

$$ gOF_i = -(A_i+R)\times d_i $$

The optic flow is thus the sum of the apparent rotation due to the translation and the angular rotation, projected onto a plane orthogonal to the viewing direction.

The eye is sometimes described as a spherical apparatus (especially in insect research); each viewing direction can then be expressed in a spherical coordinate system. The gOF in a spherical system has three components, but the one along the viewing direction (i.e. the $\rho$ of the coordinate system) equals zero, because the gOF is orthogonal to that direction.
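The equivalence of the two formulations above, and the orthogonality of the gOF to the viewing direction, can be checked numerically. The following sketch uses plain numpy with arbitrary values for $t$, $R$, $\mu_i$ and $d_i$, independently of navipy:

``` python
import numpy as np

rng = np.random.default_rng(42)
t = rng.standard_normal(3)    # translational velocity
R = rng.standard_normal(3)    # angular velocity
d = rng.standard_normal(3)
d /= np.linalg.norm(d)        # unit viewing direction d_i
mu = 1 / 10.0                 # nearness mu_i (object at distance 10)

# First formulation: gOF = -mu*(t - (t.d)d) - R x d
gof_1 = -mu * (t - np.dot(t, d) * d) - np.cross(R, d)
# Second formulation, with the apparent rotation A = mu * d x t
A = mu * np.cross(d, t)
gof_2 = -np.cross(A + R, d)

print(np.allclose(gof_1, gof_2))        # True: both formulations agree
print(np.isclose(np.dot(gof_1, d), 0))  # True: gOF is orthogonal to d
```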
   
In the remaining sections we will assume that the distances to close objects have already been calculated (see the rendering tutorials) and will only look at a single point along a trajectory, at which the differences in x, y, z, and Euler angles could be obtained. Furthermore we will use the yaw-pitch-roll (zyx) convention for the Euler angles.
   
   
%% Cell type:code id: tags:

``` python
convention = 'zyx'
tuples_posvel = posorient_columns(convention)
tuples_posvel.extend(velocities_columns(convention))
index_posvel = pd.MultiIndex.from_tuples(tuples_posvel,
                                         names=['position',
                                                'orientation'])
velocity = pd.Series(index=index_posvel, data=0)

# Define the eye
elevation = np.linspace(-np.pi/2, np.pi/2, 11)
azimuth = np.linspace(-np.pi, np.pi, 21)
[ma, me] = np.meshgrid(azimuth, elevation)
imshape = me.shape
viewing_directions = np.zeros((ma.shape[0], ma.shape[1], 2))
viewing_directions[..., __spherical_indeces__['elevation']] = me
viewing_directions[..., __spherical_indeces__['azimuth']] = ma
# Useful for quiver plots
vdir_az = viewing_directions[..., __spherical_indeces__['azimuth']]
vdir_el = viewing_directions[..., __spherical_indeces__['elevation']]
# Create a scene
scene = np.random.rand(imshape[0], imshape[1])
```
   
%% Cell type:markdown id: tags:

## Optic flow in an x,y plane

When all objects are equidistant
   
%% Cell type:code id: tags:

``` python
vel = velocity.copy()  # to avoid overwriting
vel.loc[('location', 'dx')] = np.random.randn()
vel.loc[('location', 'dy')] = np.random.randn()
sce = scene.copy()  # to avoid overwriting
sce = 0*sce + 10  # equidistant
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow')
plt.plot([az], [el], 'ro', label='Focus of Expansion')
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
   
%% Output


%% Cell type:markdown id: tags:

With one object closer than the rest
   
%% Cell type:code id: tags:

``` python
vel = velocity.copy()  # to avoid overwriting
vel.loc[('location', 'dx')] = 1.0
vel.loc[('location', 'dy')] = 1.0
sce = scene.copy()  # to avoid overwriting
sce = 0*sce + 10  # equidistant
azindeces = np.arange(7, 10)
sce[:, azindeces, ...] = 5  # closer
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow', scale=5)
plt.quiver(vdir_az[:, azindeces], vdir_el[:, azindeces],
           hof[:, azindeces], vof[:, azindeces],
           color='b',
           label='at the objects', scale=5)
plt.plot([az], [el], 'ro', label='Focus of Expansion')
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
   
%% Output


%% Cell type:markdown id: tags:

## Optic flow around the yaw axis

%% Cell type:code id: tags:

``` python
vel = velocity.copy()  # to avoid overwriting
vel.loc[(convention, 'dalpha_0')] = 0.2
sce = scene.copy()  # to avoid overwriting
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow', scale=5)
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
   
%% Output


%% Cell type:markdown id: tags:

## Optic flow around the pitch axis

%% Cell type:code id: tags:

``` python
vel = velocity.copy()  # to avoid overwriting
vel.loc[(convention, 'dalpha_1')] = 0.2
sce = scene.copy()  # to avoid overwriting
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow', scale=5)
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
   
%% Output


%% Cell type:markdown id: tags:

## Optic flow around the roll axis

%% Cell type:code id: tags:

``` python
vel = velocity.copy()  # to avoid overwriting
vel.loc[(convention, 'dalpha_2')] = 0.2
sce = scene.copy()  # to avoid overwriting
#
el, az, _ = cartesian_to_spherical(vel.loc[('location', 'dx')],
                                   vel.loc[('location', 'dy')],
                                   vel.loc[('location', 'dz')])

plt.figure(figsize=(15, 6))
rof, hof, vof = optic_flow(sce, viewing_directions, vel)
plt.quiver(vdir_az, vdir_el, hof, vof, label='optic flow', scale=5)
plt.legend()
plt.xlabel('azimuth [rad]')
plt.ylabel('elevation [rad]');
```
   
%% Output


%% Cell type:markdown id: tags:

## Optic flow of relevant points

We consider N relevant points, named as follows

%% Cell type:code id: tags:

``` python
from navipy.trajectories import Trajectory
points_name = ['nest', '1', '2', '3', '4']
relevant_points = pd.Series(index=pd.MultiIndex.from_product([points_name, ['x', 'y', 'z']]))
relevant_points.nest = [0, 0, 0]  # [x,y,z] of the point
relevant_points['1'] = [1, 0, 0]
relevant_points['2'] = [-1, 0, 0]
relevant_points['3'] = [0, 1, 0]
relevant_points['4'] = [0, -1, 0]
```
   
%% Cell type:markdown id: tags:

and want to calculate the optic flow of these points along the following trajectory

%% Cell type:code id: tags:

``` python
mytraj = Trajectory().read_hdf('../../../navipy/resources/sample_experiment/Doussot_2018a/bodybee05.hdf')
```
   
%% Cell type:markdown id: tags:

We convert the points from the world coordinate system to the bee coordinate system, and then express them in spherical coordinates, i.e. the coordinate system used by the optic_flow functions.

%% Cell type:code id: tags:

``` python
markers2bee = mytraj.world2body(relevant_points)

from navipy.maths.coordinates import cartesian_to_spherical
from navipy.scene import spherical_indeces
tmp = markers2bee.swaplevel(axis=1)
el, az, radius = cartesian_to_spherical(tmp.x,
                                        tmp.y,
                                        tmp.z)
markers2bee_sh = pd.DataFrame(index=markers2bee.index,
                              columns=markers2bee.columns)
d = dict(zip(['x', 'y', 'z'], ['elevation', 'azimuth', 'radius']))
markers2bee_sh = markers2bee_sh.rename(columns=d, level=1)
markers2bee_sh = markers2bee_sh.swaplevel(axis=1)
markers2bee_sh.elevation = el
markers2bee_sh.azimuth = az
markers2bee_sh.radius = radius
markers2bee_sh = markers2bee_sh.swaplevel(axis=1)
```
   
%% Output %% Output
   
/home/bolirev/.virtualenvs/toolbox-navigation/lib/python3.6/site-packages/navipy-0.1-py3.6.egg/navipy/trajectories/__init__.py:537: UserWarning: Prior to 12/09/2018: /home/bolirev/.virtualenvs/toolbox-navigation/lib/python3.6/site-packages/navipy-0.1-py3.6.egg/navipy/trajectories/__init__.py:537: UserWarning: Prior to 12/09/2018:
world2body was doing a reverse transformed world2body was doing a reverse transformed
Please use body2world instead Please use body2world instead
   
%% Cell type:markdown id: tags:

The optic flow requires not only the trajectory but also its derivative. We merge the trajectory with its velocity because the optic flow also depends on the agent's orientation.

%% Cell type:code id: tags:

``` python
mytrajdiff = mytraj.differentiate(periods=1)
mytrajvel = mytraj.join(mytrajdiff)
mytrajvel.dropna().head()
```
   
%% Output

       location                                           zyx                     location  \
              x          y          z   alpha_0   alpha_1   alpha_2        dx
0
1446  64.633030  12.514757  14.383999  0.738181  0.126137  0.167408  0.286749
1447  64.781736  12.358705  14.548036  0.759602 -0.000240  0.158936  0.148706
1448  64.908390  12.285283  14.674417  0.718028  0.099787  0.051479  0.126654
1449  65.156870  12.207232  14.615128  0.720968  0.244536  0.256502  0.248480
1450  65.322761  12.144090  14.666038  0.754467  0.097272  0.039248  0.165891

           zyx
            dy        dz  dalpha_0  dalpha_1  dalpha_2
0
1446  0.035240 -0.305388 -0.029131  0.119534 -0.010741
1447 -0.156052  0.164037  0.021421 -0.126377 -0.008473
1448 -0.073422  0.126382 -0.041574  0.100027 -0.107457
1449 -0.078051 -0.059289  0.002941  0.144750  0.205023
1450 -0.063142  0.050910  0.033498 -0.147265 -0.217254
   
%% Cell type:markdown id: tags:

We loop through all time points in the velocity and calculate the optic flow of the selected points.

%% Cell type:code id: tags:

``` python
my_opticflow = pd.DataFrame(index=mytrajvel.index,
                            columns=pd.MultiIndex.from_product([points_name, ['rof', 'hof', 'vof']]))
my_opticflow = my_opticflow.swaplevel(axis=1)
tmp = markers2bee_sh.swaplevel(axis=1)
for ii, velocity in mytrajvel.dropna().iterrows():
    # get the elevation, azimuth of the seen point of interest
    # and the distance to it
    elevation = tmp.loc[ii, 'elevation']
    azimuth = tmp.loc[ii, 'azimuth']
    distance = tmp.loc[ii, 'radius'].values
    # Build viewing direction of all objects
    viewing_dir = np.zeros((elevation.shape[0], 2))
    viewing_dir[..., spherical_indeces()['elevation']] = elevation
    viewing_dir[..., spherical_indeces()['azimuth']] = azimuth
    # Calculate the optic flow of these points
    rof, hof, vof = optic_flow(distance, viewing_dir, velocity)
    # save the results in df
    my_opticflow.loc[ii, 'rof'] = rof
    my_opticflow.loc[ii, 'hof'] = hof
    my_opticflow.loc[ii, 'vof'] = vof
# swap level again, to index optic flow with point of interest
my_opticflow = my_opticflow.swaplevel(axis=1)
```
   
%% Cell type:code id: tags:

``` python
fig = plt.figure()
ax = fig.gca()
of_norm = my_opticflow.swaplevel(axis=1).hof**2 + my_opticflow.swaplevel(axis=1).vof**2
of_norm = np.sqrt(of_norm.astype(float))
ax.plot(of_norm.nest)
ax.set_ylim([0, 0.4])
```
   
%% Output

(0, 0.4)

%% Cell type:markdown id: tags:

## Optic flow from a database

A database of rendered views along a trajectory can be created by using the tool `blendalongtraj`. Once the database has been created, the optic flow can be obtained by using the differences in position and orientation, and the distance to the surrounding objects (the 'D' channel).

%% Cell type:code id: tags:

``` python
import pkg_resources
from navipy.database import DataBase
mydbfile = pkg_resources.resource_filename(
    'navipy', 'resources/database.db')
mydb = DataBase(mydbfile)
mytraj = mydb.posorients
mytrajdiff = mytraj.differentiate(periods=1)
mytrajvel = mytraj.join(mytrajdiff)
mytrajvel.dropna().head()
my_opticflow = pd.DataFrame(index=mytrajvel.index,
                            columns=pd.MultiIndex.from_product([points_name, ['rof', 'hof', 'vof']]))
my_opticflow = my_opticflow.swaplevel(axis=1)
horizontal_of = np.zeros((mytrajvel.shape[0], *mydb.viewing_directions[..., 0].shape))
vertical_of = horizontal_of.copy()
for ii, (frame_i, velocity) in enumerate(mytrajvel.dropna().iterrows()):
    scene = mydb.scene(velocity)
    distance = scene[..., 3, 0]
    # Calculate the optic flow of these points
    _, hof, vof = optic_flow(distance, mydb.viewing_directions, velocity)
    # save the results in df
    horizontal_of[ii, ...] = hof
    vertical_of[ii, ...] = vof
```
   
%% Cell type:markdown id: tags:

## Remarks

### Misconception about rotational optic flow

The formulation of the optic flow uses the angular velocity $R$ of the animal. This velocity is a vector expressed in the animal's coordinate system, and is **not** equal to the vector formed by differentiating the Euler angles between two time points $t$ and $t-1$, i.e. $[yaw(t)-yaw(t-1),pitch(t)-pitch(t-1),roll(t)-roll(t-1)]$, because each Euler angle describes a rotation from one intermediate frame to the next, and only the final frame corresponds to the coordinate system of the bee. For the optic flow this implies that, at an arbitrary orientation, if droll is not zero we cannot expect the optic flow to be a rotation around the bee's x-axis.
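A minimal numerical sketch of this point, in plain numpy (the orientations and helper functions below are made up for illustration): the body-frame rotation vector between two orientations is, in general, not the vector of Euler-angle differences.

``` python
import numpy as np

def rot_zyx(yaw, pitch, roll):
    """Rotation matrix for the zyx (yaw-pitch-roll) Euler convention."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def rotvec(Rmat):
    """Rotation vector (axis * angle) of a rotation matrix (log map)."""
    angle = np.arccos(np.clip((np.trace(Rmat) - 1) / 2, -1, 1))
    axis = np.array([Rmat[2, 1] - Rmat[1, 2],
                     Rmat[0, 2] - Rmat[2, 0],
                     Rmat[1, 0] - Rmat[0, 1]]) / (2 * np.sin(angle))
    return angle * axis

euler_prev = np.array([0.5, 0.3, 0.2])   # yaw, pitch, roll at t-1
euler_now = np.array([0.6, 0.25, 0.35])  # yaw, pitch, roll at t
deuler = euler_now - euler_prev          # naive Euler-angle differences

# Relative rotation between the two time points, expressed in the
# body frame at t-1; its rotation vector approximates R (the body
# angular velocity, per unit time step)
omega = rotvec(rot_zyx(*euler_prev).T @ rot_zyx(*euler_now))

# Even after reordering dEuler to (droll, dpitch, dyaw) to match the
# body (x, y, z) axes, it differs from the body rotation vector
print(np.max(np.abs(omega - deuler[::-1])) > 1e-3)  # True: they differ
```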
   
### Calculating the translational and rotational optic flow

It may sometimes be interesting to look at the optic flow field the animal would have experienced had it not rotated, or not translated. In other words, it may be interesting to look at the translational or the rotational optic flow instead of the full optic flow.

In `navipy` you can calculate such optic flow by using `optic_flow_translational` and `optic_flow_rotational`.
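Because the cross product is linear in its first argument, this decomposition follows directly from the gOF equation above. A small numpy sketch with hypothetical values for one viewing direction (navipy's functions perform the equivalent computation over all viewing directions):

``` python
import numpy as np

t = np.array([0.3, 0.1, 0.0])   # translational velocity
R = np.array([0.0, 0.0, 0.2])   # angular velocity (pure yaw rotation)
d = np.array([1.0, 0.0, 0.0])   # unit viewing direction
mu = 1 / 5.0                    # nearness of the object seen along d

A = mu * np.cross(d, t)              # apparent rotation due to translation
gof_translational = -np.cross(A, d)  # flow if the agent only translated
gof_rotational = -np.cross(R, d)     # flow if the agent only rotated
gof_full = -np.cross(A + R, d)       # full geometrical optic flow

# The full gOF is exactly the sum of the two components
print(np.allclose(gof_full, gof_translational + gof_rotational))  # True
```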
@@ -137,6 +137,7 @@ class DataBase():
         self.tablecolumns['position_orientation']['q_1'] = 'real'
         self.tablecolumns['position_orientation']['q_2'] = 'real'
         self.tablecolumns['position_orientation']['q_3'] = 'real'
+        self.tablecolumns['position_orientation']['frame_i'] = 'real'
         self.tablecolumns['position_orientation']['rotconv_id'] = 'string'
         self.tablecolumns['image'] = dict()
         self.tablecolumns['image']['data'] = 'array'
@@ -292,6 +293,12 @@ class DataBase():
                 raise Exception(msg)
         found_convention = False
         index = posorient.index
+        if not isinstance(posorient.name, int):
+            msg = 'posorient.name should give the frame #'
+            self._logger.exception(msg)
+            raise Exception(msg)
+        frame_i = posorient.name
         if isinstance(index, pd.MultiIndex):
             convention = index.levels[0][-1]
         else:
@@ -353,79 +360,35 @@ class DataBase():
             msg = 'posorient must not contain nan\n {}'.format(posorient)
             self._logger.exception(msg)
             raise ValueError(msg)
-        cursor = self.db_cursor.execute('select * from position_orientation')
-        names = list(map(lambda x: x[0], cursor.description))
         where = ""
-        params = None
-        if 'alpha_0' in names:
-            self._logger.warning("you are loading a database with an\
-                                 old convention")
-            where += """x>=? and x<=?"""
-            where += """and y>=? and y<=?"""
-            where += """and z>=? and z<=?"""
-            where += """and alpha_0>=? and alpha_0<=?"""
-            where += """and alpha_1>=? and alpha_1<=?"""
-            where += """and alpha_2>=? and alpha_2<=?"""
-            params = (
-                posorient['location']['x'] - self.__float_tolerance,
-                posorient['location']['x'] + self.__float_tolerance,
-                posorient['location']['y'] - self.__float_tolerance,
-                posorient['location']['y'] + self.__float_tolerance,
-                posorient['location']['z'] - self.__float_tolerance,
-                posorient['location']['z'] + self.__float_tolerance,
-                posorient[convention][naming_map[0]] - self.__float_tolerance,
-                posorient[convention][naming_map[0]] + self.__float_tolerance,
-                posorient[convention][naming_map[1]] - self.__float_tolerance,
-                posorient[convention][naming_map[1]] + self.__float_tolerance,
-                posorient[convention][naming_map[2]] - self.__float_tolerance,
-                posorient[convention][naming_map[2]] + self.__float_tolerance)
-        elif convention != 'quaternion':
-            where += """x>=? and x<=?"""
-            where += """and y>=? and y<=?"""
-            where += """and z>=? and z<=?"""
-            where += """and q_0>=? and q_0<=?"""
-            where += """and q_1>=? and q_1<=?"""
-            where += """and q_2>=? and q_2<=?"""
-            where += """and rotconv_id =?"""
-            params = (
-                posorient['location']['x'] - self.__float_tolerance,
-                posorient['location']['x'] + self.__float_tolerance,
-                posorient['location']['y'] - self.__float_tolerance,
-                posorient['location']['y'] + self.__float_tolerance,
-                posorient['location']['z'] - self.__float_tolerance,
-                posorient['location']['z'] + self.__float_tolerance,
-                posorient[convention][naming_map[0]] - self.__float_tolerance,
-                posorient[convention][naming_map[0]] + self.__float_tolerance,
-                posorient[convention][naming_map[1]] - self.__float_tolerance,
-                posorient[convention][naming_map[1]] + self.__float_tolerance,
-                posorient[convention][naming_map[2]] - self.__float_tolerance,
-                posorient[convention][naming_map[2]] + self.__float_tolerance,
-                convention)
-        else:
-            where += """x>=? and x<=?"""
-            where += """and y>=? and y<=?"""
-            where += """and z>=? and z<=?"""
-            where += """and q_0>=? and q_0<=?"""
-            where += """and q_1>=? and q_1<=?"""
-            where += """and q_2>=? and q_2<=?"""
-            where += """and q_3>=? and q_3<=?"""
-            where += """and rotconv_id =?"""
-            params = (
-                posorient['location']['x'] - self.__float_tolerance,
-                posorient['location']['x'] + self.__float_tolerance,
-                posorient['location']['y'] - self.__float_tolerance,
-                posorient['location']['y'] + self.__float_tolerance,
-                posorient['location']['z'] - self.__float_tolerance,
-                posorient['location']['z'] + self.__float_tolerance,
-                posorient[convention][naming_map[0]] - self.__float_tolerance,
-                posorient[convention][naming_map[0]] + self.__float_tolerance,
-                posorient[convention][naming_map[1]] - self.__float_tolerance,
-                posorient[convention][naming_map[1]] + self.__float_tolerance,
-                posorient[convention][naming_map[2]] - self.__float_tolerance,
-                posorient[convention][naming_map[2]] + self.__float_tolerance,
-                posorient[convention][naming_map[3]] - self.__float_tolerance,
-                posorient[convention][naming_map[3]] + self.__float_tolerance,
-                convention)
+        where += """x>=? and x<=?"""
+        where += """and y>=? and y<=?"""
+        where += """and z>=? and z<=?"""
+        where += """and q_0>=? and q_0<=?"""
+        where += """and q_1>=? and q_1<=?"""
+        where += """and q_2>=? and q_2<=?"""
+        where += """and rotconv_id =?"""
+        where += """and frame_i = ?"""
+        params = (
+            posorient['location']['x'] - self.__float_tolerance,
+            posorient['location']['x'] + self.__float_tolerance,
+            posorient['location']['y'] - self.__float_tolerance,
+            posorient['location']['y'] + self.__float_tolerance,
+            posorient['location']['z'] - self.__float_tolerance,
+            posorient['location']['z'] + self.__float_tolerance,
+            posorient[convention][naming_map[0]] - self.__float_tolerance,
+            posorient[convention][naming_map[0]] + self.__float_tolerance,
+            posorient[convention][naming_map[1]] - self.__float_tolerance,
+            posorient[convention][naming_map[1]] + self.__float_tolerance,
+            posorient[convention][naming_map[2]] - self.__float_tolerance,
+            posorient[convention][naming_map[2]] + self.__float_tolerance,
+            convention,
+            frame_i)
+        if convention == 'quaternion':
+            where += """and q_3>=? and q_3<=?"""
+            params = params + (
+                posorient[convention][naming_map[3]] - self.__float_tolerance,
+                posorient[convention][naming_map[3]] + self.__float_tolerance)
         self.db_cursor.execute(
             """
             SELECT count(*)
...@@ -445,8 +408,8 @@ class DataBase(): ...@@ -445,8 +408,8 @@ class DataBase():
self.db_cursor.execute( self.db_cursor.execute(
""" """
INSERT INSERT
INTO position_orientation(x,y,z,q_0,q_1,q_2,q_3,rotconv_id) INTO position_orientation(x,y,z,q_0,q_1,q_2,q_3,rotconv_id,frame_i)
VALUES (?,?,?,?,?,?,?,?) VALUES (?,?,?,?,?,?,?,?,?)
""", ( """, (
posorient['location']['x'], posorient['location']['x'],
posorient['location']['y'], posorient['location']['y'],
...@@ -455,13 +418,13 @@ class DataBase(): ...@@ -455,13 +418,13 @@ class DataBase():
posorient[convention]['alpha_1'], posorient[convention]['alpha_1'],
posorient[convention]['alpha_2'], posorient[convention]['alpha_2'],
np.nan, np.nan,
convention)) convention, frame_i))
else: else:
self.db_cursor.execute( self.db_cursor.execute(
""" """
INSERT INSERT
INTO position_orientation(x,y,z,q_0,q_1,q_2,rotconv_id) INTO position_orientation(x,y,z,q_0,q_1,q_2,rotconv_id,frame_i)
VALUES (?,?,?,?,?,?,?,?) VALUES (?,?,?,?,?,?,?,?,?)
""", ( """, (
posorient['location']['x'], posorient['location']['x'],
posorient['location']['y'], posorient['location']['y'],
...@@ -470,7 +433,7 @@ class DataBase(): ...@@ -470,7 +433,7 @@ class DataBase():
posorient[convention]['q_1'], posorient[convention]['q_1'],
posorient[convention]['q_2'], posorient[convention]['q_2'],
posorient[convention]['q_3'], posorient[convention]['q_3'],
convention)) convention, frame_i))
rowid = self.db_cursor.lastrowid rowid = self.db_cursor.lastrowid
self.db.commit() self.db.commit()
return rowid return rowid
...@@ -532,15 +495,20 @@ class DataBase(): ...@@ -532,15 +495,20 @@ class DataBase():
return self.__convention return self.__convention
@property @property
def posorients(self): def posorients(self, indexby='frame_i'):
"""Return the position orientations of all points in the \ """Return the position orientations of all points in the \
database database
:params indexby: index posorients by 'frame_i' (default) or 'id'
:returns: all position orientations :returns: all position orientations
:rtype: list of pd.Series :rtype: list of pd.Series
""" """
posorient = pd.read_sql_query( posorient = pd.read_sql_query(
"select * from position_orientation;", self.db) "select * from position_orientation;", self.db)
posorient.set_index('id', inplace=True) if indexby in posorient.columns:
posorient.set_index('frame_i', inplace=True)
else: # Handle older db version
print('Could not index by {}'.format(indexby))
posorient.set_index('id', inplace=True)
if self.rotation_convention is not None: if self.rotation_convention is not None:
posorients = Trajectory() posorients = Trajectory()
posorients.from_dataframe(posorient, rotconv=self.__convention) posorients.from_dataframe(posorient, rotconv=self.__convention)
...@@ -562,6 +530,14 @@ class DataBase(): ...@@ -562,6 +530,14 @@ class DataBase():
# Read from database # Read from database
# #
def read_posorient(self, posorient=None, rowid=None): def read_posorient(self, posorient=None, rowid=None):
"""Read posorient with a given posorient or rowid
:param posorient: pd.Series with MuliIndex
:param rowid: integer, rowid of the database
:returns: return posorient
:rtype: pd.Series
"""
if rowid is not None: if rowid is not None:
if not isinstance(rowid, int): if not isinstance(rowid, int):
msg = 'rowid must be an integer' msg = 'rowid must be an integer'
......
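The idea behind this commit can be sketched with a minimal, self-contained example. The schema below is a simplified stand-in (only x, y, z and frame_i; no rotation columns) for navipy's actual `position_orientation` table: storing an explicit frame number with each row means that when frames are dropped (e.g. because they contained NaNs), the remaining rows can still be indexed by their original frame number rather than by the gapless autoincremented `id`, which would otherwise renumber them and cause confusion.

```python
import sqlite3

# Simplified stand-in for navipy's position_orientation table
db = sqlite3.connect(':memory:')
cur = db.cursor()
cur.execute("""CREATE TABLE position_orientation
               (id INTEGER PRIMARY KEY AUTOINCREMENT,
                x REAL, y REAL, z REAL, frame_i INTEGER)""")
# Frames 0 and 2 are stored; frame 1 was skipped (e.g. it contained NaNs)
cur.execute("INSERT INTO position_orientation(x,y,z,frame_i) VALUES (?,?,?,?)",
            (0.0, 0.0, 0.0, 0))
cur.execute("INSERT INTO position_orientation(x,y,z,frame_i) VALUES (?,?,?,?)",
            (1.0, 0.5, 0.0, 2))
db.commit()

# Index rows by frame_i when the column exists, fall back to id for
# databases created before this commit (same fallback as in posorients)
cols = [c[1] for c in cur.execute("PRAGMA table_info(position_orientation)")]
indexby = 'frame_i' if 'frame_i' in cols else 'id'
rows = {r[0]: r[1:] for r in
        cur.execute("SELECT %s, x, y, z FROM position_orientation" % indexby)}
print(sorted(rows))  # [0, 2] -- the gap at frame 1 is preserved
```

With the old `id`-based indexing the two rows would come back as 1 and 2, silently shifting every frame after a dropped one; indexing by `frame_i` keeps the gap visible.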