Coverage for navipy/comparing/__init__.py: 65%

""" Comparing """
"""Compute the difference between the current and memorised place code
:param current: current place code :param memory: memorised place code :returns: the image difference :rtype: float
..ref: Zeil, J., 2012. Visual homing: an insect perspective. Current opinion in neurobiology
""" raise TypeError('current place code should be a numpy array') raise TypeError('memory place code should be a numpy array') have the same shape') elif is_obpc(current): return diff else: raise TypeError('place code is neither an ibpc nor obpc')
"""Compute the root mean square difference between the current and memorised place code
:param current: current place code :param memory: memorised place code :returns: the image difference :rtype: float #array(1,4) for ibpc and float for obpc
""" elif is_obpc(current): return np.sqrt(diff.mean(axis=0).mean(axis=0)) else: raise TypeError('place code is neither an ibpc nor obpc')
"""Compute the rotational image difference between the current and memorised place code.
:param current: current place code :param memory: memorised place code :returns: the rotational image difference :rtype: (np.ndarray)
..ref: Zeil, J., 2012. Visual homing: an insect perspective. Current opinion in neurobiology ..note: assume that the image is periodic along the x axis (the left-right axis)
""" should be image based') should be image based') # ridf is a NxM matrix, # because one value per azimuth (N) and n values per channel # (M) # Perform a counter clock wise rotation
"""Computes the direction of motion from current to memory by using the optic flow under the constrain that the brightness is constant, (small movement), using a taylor expansion and solving the equation: .. math::
0=I_t+\delta I*<u,v> or I_x+I_y+I_t=0
afterwards the aperture problem is solved by a Matrix equation Ax=b, where x=(u,v) and .. math::
A=(I_x,I_y) and b = I_t
The intput parameters are the following: :param current: current place code :param memory: memorised place code :returns: a directional vectors :rtype: (np.ndarray)
..ref: aperture problem: Shimojo, Shinsuke, Gerald H. Silverman, and Ken Nakayama: "Occlusion and the solution to the aperture problem for motion." Vision research 29.5 (1989): 619-626. optic flow: Horn, Berthold KP, and Brian G. Schunck.: "Determining optical flow." Artificial intelligence 17.1-3 (1981): 185-203. """ should be image based') should be image based')
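# Illustration (not part of the covered module): the least-squares step the
# docstring above describes. With the image gradients I_x, I_y and the
# temporal difference stacked over all pixels into A and b, the flow (u, v)
# is recovered from the normal equations of A x = b. The arrays and the
# sign convention used here are assumptions made only for this example.
import numpy as np

rng = np.random.default_rng(0)
n_pixels = 1000
true_uv = np.array([0.3, -0.2])            # hypothetical motion between views
A = rng.normal(size=(n_pixels, 2))         # columns play the role of I_x, I_y
b = A @ true_uv                            # brightness-constancy right-hand side
uv = np.linalg.solve(A.T @ A, A.T @ b)     # aperture problem, least-squares sense
print(np.allclose(uv, true_uv))            # -> True
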
# Only the body of this placeholder is visible in the extract; its name and
# signature are an assumption.
def gradient(current, memory):
    return 0

# The function name is not visible in this extract and is an assumption.
def weighted_irdf(current, mem_scenes, viewing_directions):
    """Weighted image rotational difference

    Return a homing vector direction based on an image rotational
    difference weighted between some reference snapshots

    :param current: actual scene, np.array
    :param mem_scenes: list of memorised views
    :returns: dx, dy, dz, dyaw, dpitch, droll.
    :rtypes: pd.Series
    """
    if not isinstance(mem_scenes, (list, tuple)):
        msg = 'mem_scenes should be of type '
        msg += 'list or tuple and not {}'
        msg = msg.format(type(mem_scenes))
        raise TypeError(msg)
    for scene in mem_scenes:
        check_scene(scene)
    check_scene(current)

    # A dataframe to store the minimum of the irdf and the angle
    # at which the minimum takes place
    df_svp = pd.DataFrame(index=range(0, len(mem_scenes)),
                          columns=['irdf', 'angle'])
    for i, scene in enumerate(mem_scenes):
        irdf = rot_imagediff(current, scene)
        idx = np.argmin(irdf[..., 0])
        value = np.min(irdf[..., 0])
        df_svp.loc[i, 'angle'] = \
            viewing_directions[idx, __spherical_indeces__['azimuth']]
        df_svp.loc[i, 'irdf'] = value

    min_irdf = df_svp.irdf.min()
    # Take the best snapshot irdf and use its ratio to each of the others
    # as the weight of that snapshot
    w_svp = min_irdf / df_svp.irdf
    # Weighting of the vector direction based on circular statistics
    j = complex(0, 1)
    H = w_svp * np.exp(df_svp.angle * j)
    return np.sum(H)

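# Illustration (not part of the covered module): the circular weighting used
# above, on made-up numbers. Each snapshot contributes a unit vector at its
# best-matching azimuth, scaled by how good its match is relative to the
# best snapshot; the angle of the summed complex number is the resulting
# homing direction. The values are assumptions made only for this example.
import numpy as np

best_angles = np.array([0.1, 0.2, 2.5])     # radians, one per snapshot
irdf_minima = np.array([1.0, 1.2, 8.0])     # lower = better match
weights = irdf_minima.min() / irdf_minima   # best snapshot gets weight 1
H = np.sum(weights * np.exp(1j * best_angles))
print(np.angle(H))                          # ~0.2, dominated by the good matches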