{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "MotionClouds may be considered as a control stimulus - it seems more interesting to consider more complex trajectories. \n", "\n", "\n", "\n", "Let's start with the classical Motion Cloud:" ] }, { "cell_type": "code", "execution_count": 1, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:06:58.608090Z", "start_time": "2018-01-16T15:06:56.480814Z" } }, "outputs": [], "source": [ "name = 'trajectory'\n", "import os\n", "import numpy as np\n", "import MotionClouds as mc\n", "fx, fy, ft = mc.get_grids(mc.N_X, mc.N_Y, mc.N_frame)\n", "\n", "mc.figpath = '../files/2018-01-16-testing-more-complex'\n", "if not(os.path.isdir(mc.figpath)): os.mkdir(mc.figpath)" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:07:14.108893Z", "start_time": "2018-01-16T15:06:58.614799Z" } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ "/usr/local/lib/python3.7/site-packages/vispy/visuals/isocurve.py:22: UserWarning: VisPy is not yet compatible with matplotlib 2.2+\n", " warnings.warn(\"VisPy is not yet compatible with matplotlib 2.2+\")\n" ] }, { "data": { "text/html": [ "\n", "
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_dense'\n", "seed = 42\n", "mc1 = mc.envelope_gabor(fx, fy, ft)\n", "mc.figures(mc1, name_, seed=seed, figpath=mc.figpath)\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "The information is distributed densely in space and time. \n", "\n", "## one definition of a trajectory\n", "\n", "It is also possible to show the impulse response (\"texton\") corresponding to this particular texture (be patient to see a full period):" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:07:33.397357Z", "start_time": "2018-01-16T15:07:14.115329Z" } }, "outputs": [ { "data": { "text/html": [ "\n", "
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_impulse'\n", "seed = 42\n", "mc1 = mc.envelope_gabor(fx, fy, ft)\n", "mc.figures(mc1, name_, seed=seed, impulse=True, figpath=mc.figpath)\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "To generate a trajectory, we should just convolve this impulse response to a trajectory defined as a binary profile in space and time:" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:07:56.663100Z", "start_time": "2018-01-16T15:07:33.425386Z" } }, "outputs": [ { "data": { "text/html": [ "\n", "
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_straight'\n", "seed = 42\n", "x, y, t = fx+.5, fy+.5, ft+.5\n", "width_y, width_x = 0.01, 0.005\n", "events = 1. * (np.abs(y - .5) < width_y )* (np.abs(x - t) < width_x )\n", "mc1 = mc.envelope_gabor(fx, fy, ft)\n", "mc.figures(mc1, name_, seed=seed, events=events, figpath=mc.figpath)\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "It is possible to make this trajectory noisy:" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:08:17.558980Z", "start_time": "2018-01-16T15:07:56.666386Z" } }, "outputs": [ { "data": { "text/html": [ "\n", "
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_noisy'\n", "noise_x = 0.02\n", "noise = noise_x * np.random.randn(1, 1, mc.N_frame)\n", "events = 1. * (np.abs(y - .5) < width_y )* (np.abs(x + noise - t) < width_x )\n", "mc1 = mc.envelope_gabor(fx, fy, ft)\n", "mc.figures(mc1, name_, seed=seed, events=events, figpath=mc.figpath)\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, it is possible to make the amplitude of the texton change as a function of time:" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:09:22.521657Z", "start_time": "2018-01-16T15:08:41.248865Z" } }, "outputs": [ { "data": { "text/html": [ "\n", "
\n", " \n", " \n", " \n", " \n", " \n", " \n", " \n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_noisier'\n", "noise = noise_x * np.random.randn(1, 1, mc.N_frame)\n", "events = 1. * (np.abs(y - .5) < width_y )* (np.abs(x + noise - t) < width_x )\n", "A_noise_x = 0.02\n", "A_noise = A_noise_x * np.random.randn(1, 1, mc.N_frame)\n", "phase_noise = 2 * np.pi * np.random.rand(1, 1, mc.N_frame)\n", "A_noise = np.cumsum(A_noise, axis=-1) / np.sqrt(t+1)\n", "phase_noise = np.cumsum(phase_noise, axis=-1)\n", "mc1 = mc.envelope_gabor(fx, fy, ft)\n", "mc.figures(mc1, name_, seed=seed, events=A_noise*np.exp(phase_noise*1j)*events, figpath=mc.figpath)\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## addition of a trajectory to the incoherent noise\n", "\n", "It is now possible to add this trajectory to any kind of background, such as a background texture of the same \"texton\" but with a null average motion:" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:09:22.521657Z", "start_time": "2018-01-16T15:08:41.248865Z" }, "scrolled": false }, "outputs": [ { "data": { "text/html": [ "\n", "
\n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_overlay'\n", "movie_coh = mc.rectif(mc.random_cloud(mc1, seed=seed, events=A_noise*np.exp(phase_noise*1j)*events))\n", "mc0 = mc.envelope_gabor(fx, fy, ft, V_X=0)\n", "movie_unc = mc.rectif(mc.random_cloud(mc0, seed=seed+1))\n", "rho_coh = .9\n", "mc.anim_save(rho_coh*movie_coh+(1-rho_coh)*movie_unc, os.path.join(mc.figpath, name_))\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "ExecuteTime": { "end_time": "2018-01-16T15:09:53.872831Z", "start_time": "2018-01-16T15:09:22.526189Z" }, "scrolled": false }, "outputs": [ { "data": { "text/html": [ "\n", "
\n", "
" ], "text/plain": [ "" ] }, "metadata": {}, "output_type": "display_data" } ], "source": [ "name_ = name + '_overlay_difficult'\n", "rho_coh = .5\n", "mc.anim_save(rho_coh*movie_coh+(1-rho_coh)*movie_unc, os.path.join(mc.figpath, name_))\n", "mc.in_show_video(name_, figpath=mc.figpath)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Though it is difficult to find the coherent pattern in a single frame, one detects it thanks to its coherent motion (see work from Watamaniuk, McKee and colleagues)." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## some book keeping for the notebook" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "ExecuteTime": { "end_time": "2018-11-07T16:19:23.177738Z", "start_time": "2018-11-07T16:19:23.125993Z" } }, "outputs": [ { "data": { "application/json": { "Software versions": [ { "module": "Python", "version": "3.7.2 64bit [Clang 10.0.0 (clang-1000.11.45.5)]" }, { "module": "IPython", "version": "7.3.0" }, { "module": "OS", "version": "Darwin 17.7.0 x86_64 i386 64bit" }, { "module": "numpy", "version": "1.16.2" }, { "module": "matplotlib", "version": "3.0.2" }, { "module": "MotionClouds", "version": "20180606" } ] }, "text/html": [ "
" ], "text/latex": [ "\\begin{tabular}{|l|l|}\\hline\n", "{\\bf Software} & {\\bf Version} \\\\ \\hline\\hline\n", "Python & 3.7.2 64bit [Clang 10.0.0 (clang-1000.11.45.5)] \\\\ \\hline\n", "IPython & 7.3.0 \\\\ \\hline\n", "OS & Darwin 17.7.0 x86\\_64 i386 64bit \\\\ \\hline\n", "numpy & 1.16.2 \\\\ \\hline\n", "matplotlib & 3.0.2 \\\\ \\hline\n", "MotionClouds & 20180606 \\\\ \\hline\n", "\\hline \\multicolumn{2}{|l|}{Tue Feb 26 22:35:19 2019 CET} \\\\ \\hline\n", "\\end{tabular}\n" ], "text/plain": [ "Software versions\n", "Python 3.7.2 64bit [Clang 10.0.0 (clang-1000.11.45.5)]\n", "IPython 7.3.0\n", "OS Darwin 17.7.0 x86_64 i386 64bit\n", "numpy 1.16.2\n", "matplotlib 3.0.2\n", "MotionClouds 20180606\n", "Tue Feb 26 22:35:19 2019 CET" ] }, "execution_count": 9, "metadata": {}, "output_type": "execute_result" } ], "source": [ "%load_ext version_information\n", "%version_information numpy, matplotlib, MotionClouds" ] } ], "metadata": { "kernelspec": { "display_name": "Python 3", "language": "python", "name": "python3" }, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.7.2" }, "nikola": { "category": "", "date": "2018-01-16 11:43:46 UTC+02:00", "description": "", "link": "", "slug": "2018-01-16-testing-more-complex-trajectories", "tags": "motionclouds, code, trajectory", "title": "Testing more complex trajectories", "type": "text" }, "toc": { "nav_menu": {}, "number_sections": true, "sideBar": true, "skip_h1_title": false, "toc_cell": false, "toc_position": {}, "toc_section_display": "block", "toc_window_display": false } }, "nbformat": 4, "nbformat_minor": 2 }