diff --git a/doc/source/tutorials/03-rendering-along-trajectory.ipynb b/doc/source/tutorials/03-rendering-along-trajectory.ipynb
index 176760946fd9546ac6f35f565b5079c411278fe2..b63c55b4f5665d3e8f1d4f1d97b257aa44e65eb2 100644
--- a/doc/source/tutorials/03-rendering-along-trajectory.ipynb
+++ b/doc/source/tutorials/03-rendering-along-trajectory.ipynb
@@ -4,7 +4,7 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "# Rendering along a trajectory\n",
+    "# Rendering\n",
     "\n",
     "In this tutorial, you will learn to take advantage of navipy and the rendering modules (blender) to render what has been seen by the animal along its trajectory. \n",
     "\n",
@@ -13,7 +13,9 @@
     "* the environment (a blender file)\n",
     "* the trajectory formatted for navipy (see 02-recording-animal-trajectory)\n",
     "\n",
-    "## Ploting the position and orientation of the animal\n",
+    "## Along a trajectory\n",
+    "\n",
+    "### Ploting the position and orientation of the animal\n",
     "\n",
     "The trajectory of the animal can be plotted in different manners. Here we will look at two representations:\n",
     "\n",
@@ -115,7 +117,7 @@
    "source": [
     "\n",
     "\n",
-    "## Checking the position of the animal within the environment\n",
+    "### Checking the position of the animal within the environment\n",
     "\n",
     "Rendering a trajectory will take time, and thus we want to check - prior to rendering - the correctness of the animal within the environment. The best way to check, is to overlay the trajectory within the blender world, and by rotating the environment, one could look for: part of the trajectory crossing objects, a wrong height, etc... \n",
     "\n",
@@ -133,7 +135,7 @@
     "> along the trajectory and thus creating a 3D shaped. (Don't forget to enable rendering)\n",
     "\n",
     "\n",
-    "## Rendering the trajectory in a database\n",
+    "### Rendering the trajectory in a database\n",
     "\n",
     "Once we know that the trajectory is correctly placed within the environment, it is time to render along the trajectory. We, however, recommand to render only the first view frame of your trajectory first in order to check for the orientation. Indeed we only checked the position of the animal, but not its orientation. Navipy supports all 24 Euler convention, and also quaternion. You need to figure out which one you used. Sadly we did not find until now an easy way to do it... Having said that, if you \"only\" tracked the yaw of the animal, you can safely use the 'rzyx' convention, here alpha_0 of your trajectory correspond to the yaw of the animal and all other angles are set to 0. \n",
     "\n",
@@ -151,7 +153,30 @@
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Database to list of images\n",
+    "## On a grid\n",
+    "\n",
+    "In certain situation we want to have an image for every point on a rectangular and equally spaced grid.\n",
+    "\n",
+    "```bash\n",
+    "blendongrid --blender-world='pathtomyworld.blend'\n",
+    "            --config-file='pathtomyconfig.yaml'\n",
+    "            --output-file='pathtodatabase.db'\n",
+    "```\n",
+    "\n",
+    "here ```pathtomyworld.blend```, ```'pathtomytrajectory.csv'```, ```pathtodatabase.db```, ```pathtomyconfig.yaml``` are the path to your blender environment, to your trajectory, to the file to store the iamges, and to the configuration parameters (see for examples ```navipy/resources/configs```), respectively.\n",
+    "\n",
+    "\n",
+    "> Note that ```blendongrid```, ```blendalongtraj```, and other blend navipy functions can also generate a log of the steps in rendering, sometime useful for debugging. \n",
+    "> You enable the loging by adding the following options: ```--logfile='pathtolog.log' -vvv```"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "## Database and images\n",
+    "\n",
+    "### Database to images\n",
     "\n",
     "The database store all position-orientation and images along the trajectory in a single file. On the one hand, it is convenient, because you always know at which position and in which orientation an image has been rendered. On the other hand, we can not easily visualise the images. \n",
     "\n",
@@ -204,38 +229,11 @@
     "    imsave(database_template.format(frame_i), to_plot_im[::-1,...] )"
    ]
   },
-  {
-   "cell_type": "code",
-   "execution_count": 6,
-   "metadata": {},
-   "outputs": [],
-   "source": [
-    "database = '/home/bolirev/Bielefeld/Research/Toolboxes/toolbox-navigation/navipy/resources/database2.db'\n",
-    "mydb = DataBase(database)"
-   ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": 8,
-   "metadata": {},
-   "outputs": [
-    {
-     "name": "stdout",
-     "output_type": "stream",
-     "text": [
-      "Could not index by frame_i\n"
-     ]
-    }
-   ],
-   "source": [
-    "mydb.posorients.to_csv('/home/bolirev/Bielefeld/Research/Toolboxes/toolbox-navigation/navipy/resources/gridforest.csv')"
-   ]
-  },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
-    "## Plotting an image from the database"
+    "### Plotting an image from the database"
    ]
   },
   {
@@ -286,6 +284,13 @@
     "\n",
     "f.show()"
    ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "### Database from images"
+   ]
   }
  ],
  "metadata": {