Instructor: Peter Asaro asarop AT
Teaching Assistant: Athira Murali a920 AT newschool.edu
Course Numbers: NMDS 5283 & NCOM 1283
Time: Monday, 7:00 - 9:45 pm
Location: 79 5th Ave, Room 1645
Course webpage is here: http://peterasaro.org/courses/2017Studio.html
Course blog is here: http://robotstudio2017blog.wordpress.com
This course explores the potential of robotic media platforms and computer vision for cinematic expression. As a Co-Lab, students will work in collaborative groups that will utilize the latest robotic and computer vision technologies to make short films.

The first half of the semester will consist of an introduction to these technologies and in-class group exercises that will familiarize you with advanced digital camera techniques and robotic camera control. These techniques and platforms will include advanced computer vision methods such as Time-lapse, High Dynamic Range Imagery, Motion Magnification, Facial Recognition, Object Tracking, Optic Flow, and others, as well as 3D active-vision systems such as the Xbox Kinect. Robotic camera control will be explored through the use of remote-operated and computer-controlled servo-driven cameras, including RC vehicles, mobile robot dollies, robotic arms, and quadrotors (drones). We will explore a variety of control methods, from remote control to pre-programmed and 3D model-driven control, as well as how these can be combined with vision techniques for the interactive control of cameras with gestures. We will also explore how these cinematographic techniques relate to visual storytelling and expression.

In the second half of the course, students will pursue projects of their own design in groups, with the goal of producing a short experimental or narrative video utilizing these techniques. Previous programming experience is not required, but students will be expected to learn and apply basic programming skills in this course. You will be introduced to programming languages such as Processing, Python, and Java, and to programming platforms and libraries such as Arduino, ROS, and OpenCV.
Please email me to set up an appointment.
You are expected to have thoroughly and thoughtfully read the assigned texts and to have prepared yourself to contribute meaningfully to the class discussions. Your participation will be evaluated in terms of both quantity and quality.
As this is a studio course, and a co-lab, regular attendance is essential. Because you will be doing group projects, if you do not come to class your fellow group members will be at a disadvantage. You will be permitted two excused absences (you must notify me and your group partners of your inability to attend before class, via email or phone). Any subsequent absences and any un-excused absences will adversely affect your grade.
For the first half of the class, studio time will be devoted to in-class group exercises designed to teach you fundamental concepts and techniques, and how to apply them toward an aesthetic goal. Your results for each exercise should be posted to the course blog, along with a description. Generally, this will be a short video, a program/software, or both. You should also note the names of the group members, and their roles, in your description. This should be done by the end of each class, but you may edit or update it later. You should also create (or already have) an account on YouTube or Vimeo to upload any videos that you produce.
You will be required to create an account on WordPress, and send me an email with the EMAIL ADDRESS used to create the account, so that you can be added as an author of the collective course blog. Everyone will be posting to a common blog page, and this will be publicly accessible. When writing and making comments, you are expected to treat other students with the same respect and courtesy as you would in the classroom. You are also expected to respect rules of academic integrity, research ethics, and copyright when posting to the blog.
At the start of each class, we will review questions and concerns from the previous week, as well as review and critique the films that you have produced.
Proposals Due: April 17
Paper Due: May 15
There will be no final exam. Instead, your Final Project will be due May 15th at 7:00 PM. If that time does not work for you, you need to make other arrangements by May 8th at the latest.
The Final Project could take different forms, but should contain two key elements: robots and video/media. Final Projects will be evaluated in terms of technological and aesthetic innovation and quality. Ideally, aesthetic goals should drive the technology.
Some examples of good Final Projects:
Use of robotic platforms and/or computer vision for a short film
Use of robots in a short film
Development of advanced tools/technology for robotic/computational filmmaking
Development of an interactive media project that employs robotics and/or computer vision
A research paper that explores some advanced aspect of robotic media
In addition to submitting the final video clip, software, and/or hardware that you created, you will also be asked to write an extended blog entry or webpage that describes how you created and used technologies in your final project. These will be posted to the course blog. There may also be opportunities to submit films to festivals and events.
Because we have a limited number of robotics technologies and computers, and in an effort to build teamwork early, all of the studio projects will be done in groups of 3 or 4 people (depending on the number of students). On the first day of class, you will be asked to fill out a questionnaire to identify your skills and interests. You will then be put in a group that complements those skills. The goal is to have at least one person with film/video experience and at least one person with some programming experience in each group.
You will be in the same group for all of the in-class exercises over the first 10 weeks. If one or more people miss a class, we may adjust the groups as needed.
You are free to form your own groups for the Final Project, or to continue with your exercise group.
Because we will need Linux (Ubuntu Studio 12.04) for the in-class exercises, there will be laptops provided each week. Depending on the number of students, each group will get at least 1 laptop.
For various exercises, equipment will be more limited. In these cases, there will be a sign-up sheet in advance, so if your group is eager to work with a particular item, be sure to sign up for it early.
During the exercise period, equipment will be signed out at the beginning of class and signed back in at the end. After that, or by special arrangement, equipment can be checked out for longer periods to complete the Final Project outside of class.
We will be programming real robots and drones which move around in the world. They are mostly small and safe, but you should always use care and caution when working with them to protect yourself, classmates, and the public.
Don't run code if you are unsure of what it will do, or if you think it might be hazardous!
Anyone who behaves recklessly or endangers others will not be allowed to work with the robots any more.
We will review more detailed safety considerations when we start flying the quadrotors.
All readings will be available electronically, via the web, in PDF, MS Word, HTML, or similar format.
Most of the films and TV programs that will be assigned are available from a variety of sources. Many are available through the New School Library on DVD. In addition, they can be purchased from most book or video stores, rented from most video shops, or found through Netflix. For the videos that cannot be obtained easily in these ways, other means will be provided for you to view them prior to class.
Course and Syllabus Overview
Watch: Robot Media Videos:
Bot & Dolly "Box"
Behind the Scenes: Bot & Dolly "Box"
Sun Yuan & Peng Yu's "Can't Help Myself"
Timo vs. Kuka
Hong Kong Hyperlapse
RC Car chase
Behind the Scenes: RC Car chase
Neill Blomkamp "Tetra Vaal"
Spike Jonze "I'm Here"
How to use Ubuntu and the Linux terminal.
Programming with Processing
Reference: Learning Processing, Daniel Shiffman
Do the Fun Programming tutorials, starting with "2. Download Processing. Use point() and line()", and work through as many of these as you have time for: 2-4, 9-12, 18-23, 29-32, 35-41, 47, 54, 74, 94.
If you prefer the textbook format, feel free to work through the Shiffman book above instead.
If you are already familiar with the Processing language, come talk to the TAs or professor.
Once you get comfortable with the language, or finish the exercises, try writing your own original program and post it to the blog!
Bill Gates (2007) "A Robot in Every Home: The leader of the PC revolution predicts that the next hot field will be robotics," Scientific American, January 2007.
Hans Moravec (2009) "Rise of the Robots--The Future of Artificial Intelligence," Scientific American, March 23, 2009
Watch: Rodney Brooks says robots will invade our lives, TED Talk 2003, 19 min.
Watch: BBC Horizon, Where's My Robot?, BBC, 2008, 50 min.
Watch: Dennis Hong: My seven species of robot, TED Talk 2009, 16 min.
How Digital Cameras Work http://www.astropix.com/HTML/I_ASTROP/HOW.HTM
The History and Science of Lenses
The Science of Camera Sensors
Capturing Digital Images
A Bit of History on Data
How Blurs & Filter Work
Finding the Edges
Canny Edge Detectors
Geometric Face Detection
Canny Edge Detector in OpenCV
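As a sketch of what the readings above describe, here is the gradient step at the heart of edge detection, written in plain Python on a tiny image. This is only the first stage of the Canny detector; OpenCV's implementation adds Gaussian smoothing, non-maximum suppression, and hysteresis thresholding on top of this gradient computation.

```python
# Sobel gradient magnitude: the core of edge detection.
# The two kernels estimate horizontal and vertical intensity change.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Return a gradient-magnitude map for a 2D list of pixel values."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):          # borders are left at 0
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A 5x5 grayscale image: left columns black (0), right columns white (255),
# so there is a sharp vertical edge between columns 1 and 2.
image = [[0, 0, 255, 255, 255] for _ in range(5)]
edges = sobel_magnitude(image)
```

The magnitude map is large along the black/white boundary and zero in the flat regions, which is exactly what a thresholding stage would then turn into a binary edge image.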
https://github.com/atduskgreg/opencv-processing (should be installed)
You should already have Processing installed, if not: Install Processing
Install OpenCV for Processing:
1. In the Processing IDE, go to “Import Library” under the “Sketch” menu
2. Search for ‘OpenCV’
3. Install “OpenCV for Processing” by Greg Borenstein.
Studio 2: Complete the Edge Detection and Face Detection Exercises
(2010) "Toward robotic cars," Communications of the ACM, Volume 53, Issue 4, April 2010, pp. 99-106.
Erico Guizzo, (2011) "How Google's Self-Driving Car Works," IEEE Spectrum Automation Blog, October 18, 2011.
Watch: The Evolution of Self-Driving Cars, YouTube, 20 min.
Explore: Navlab: The Carnegie Mellon University Navigation Laboratory
Ernst D. Dickmanns (1997) "Vehicles Capable of Dynamic Vision," Proceedings IJCAI, 1997.
Watch: Ernst Dickmanns (2011) Keynote Lecture, YouTube, 77 min.
Read: "Bringing Impressionism to Life with Neural Style Transfer in Come Swim"
Watch: How HDR Works
Introduction to HDR with Luminance
HDR with Canon Powershot: http://www.ragingsloth.com/photography/S95/s95hdrhowto.html
HDR with Canon EOS: http://www.youtube.com/watch?v=pLOME_WVm-A
Intro to HDR (1 hour 30 min) http://www.youtube.com/watch?v=v3CPavb2NWs
HDR with RAW file in Photoshop http://www.youtube.com/watch?v=YY7cYyXjQ-I
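The tutorials above use full HDR toolchains, but the per-pixel idea can be sketched in a few lines: merge a bracketed exposure stack by weighting each pixel by how close it is to mid-gray, so well-exposed pixels dominate while blown-out and crushed ones are down-weighted. This is a toy version of exposure fusion, with the weighting function chosen for illustration; real tools also smooth across the image.

```python
def exposure_weight(v, mid=127.5):
    """Weight in [0, 1]: highest at mid-gray, lowest at 0 and 255."""
    return 1.0 - abs(v - mid) / mid

def fuse_pixel(values):
    """Fuse one pixel across an exposure stack of 0-255 values."""
    # Tiny floor keeps the division safe if every frame is clipped.
    weights = [max(exposure_weight(v), 1e-6) for v in values]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

# The same scene pixel at three exposures:
# underexposed (20), well exposed (130), blown out (250).
stack = [20, 130, 250]
fused = fuse_pixel(stack)
```

The fused value stays close to the well-exposed frame while still pulling some detail from the other two exposures.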
Watch: How Kinect 3D works
The New Xbox Kinect
Kinect and Processing Tutorials
The Depth Image in Kinect and Processing
Kinect & JNI
Open NI http://www.openni.org/about/
John MacCormick, "How Does the Kinect Work?"
J. J. Gibson https://en.wikipedia.org/wiki/James_J._Gibson
Time-of-Flight Camera http://en.wikipedia.org/wiki/Time-of-flight_camera
Bullet Time http://en.wikipedia.org/wiki/Bullet_time
Free Viewpoint Television http://en.wikipedia.org/wiki/Free_viewpoint_television
Watch: The Science of Rendering Photorealistic CGI
Point Clouds in Kinect and Processing
Hand tracking in Kinect and Processing
Videos on SLAM
Capturing 3D data as a PointCloud and replaying it from different angles.
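The point-cloud material above rests on one piece of geometry: each depth pixel is back-projected through a pinhole camera model into a 3D point. Here is a minimal sketch of that step; the focal lengths and principal point below are illustrative values, not the Kinect's actual calibration.

```python
def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a 2D depth map (meters) into (x, y, z) points."""
    points = []
    for v, row in enumerate(depth):
        for u, z in enumerate(row):
            if z <= 0:          # 0 means "no depth reading" on the Kinect
                continue
            x = (u - cx) * z / fx   # pinhole model: shift by the principal
            y = (v - cy) * z / fy   # point, then scale by depth / focal length
            points.append((x, y, z))
    return points

# A 3x3 depth map with only the center pixel returning a reading (2 m away).
depth = [[0, 0, 0], [0, 2.0, 0], [0, 0, 0]]
cloud = depth_to_points(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
```

A pixel at the principal point maps straight down the optical axis, which is why replaying the cloud from other angles (as in the exercise above) just means re-rendering these (x, y, z) points from a new viewpoint.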
Microsoft Robotics Studio
Willow Garage ROS (Robot Operating System)
ROS & Blender
ROS & Arduino http://wiki.ros.org/rosserial
Watch: Craig Gillespie, Lars and the Real Girl, 2007, 106 min.
Watch: David Hanson: Robots that "show emotion", 2010 TED talk, 5 min.
Watch: Caleb Chung plays with Pleo, 2010 TED talk
Neural Networks Demystified
Inside Google's "Daydreaming" Computer
Neural Network that Changes Everything
Deep Dream (Google)
Journey on the Deep Dream
Creating Videos of the Future
Prior to class:
If you have an iPhone, iPad or Android device, please download and install this app from iTunes, GooglePlay or the Android Market: AR.FreeFlight
It is also strongly recommended that you download
this simulator, and practice with it before class: ARDrone Sim
Note: This app costs $1.99 (iOS) or $2.39 (Android)
For the Simulator options choose: AR.Drone 2.0, Indoor Hull, Standard Battery
Press the green Takeoff button to launch and hover, and the green Landing button for an automatic controlled landing.
The left joystick tilts the drone, causing it to move laterally.
The right joystick moves the drone up and down, and rotates it clockwise or counterclockwise.
The Camera icon will shift your viewing perspective.
If you crash, it will simulate damage, so you may need to go to Settings on the main screen and Restore Defaults to fly straight again.
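The stick layout described above can be written down as a simple mapping. This sketch uses hypothetical command names; the real AR.Drone SDK expects normalized roll/pitch/yaw/vertical-speed values, but the axis-to-motion assignment is the point here.

```python
def sticks_to_command(left_x, left_y, right_x, right_y):
    """Map two joystick axes (each in [-1.0, 1.0]) to a drone command.

    Command keys are illustrative names, not the SDK's actual API.
    """
    return {
        "roll":  left_x,    # left stick left/right  -> tilt sideways
        "pitch": left_y,    # left stick up/down     -> tilt forward/back
        "yaw":   right_x,   # right stick left/right -> rotate CW/CCW
        "gaz":   right_y,   # right stick up/down    -> climb/descend
    }

# Push the right stick fully up: climb straight up, no tilt or rotation.
cmd = sticks_to_command(0.0, 0.0, 0.0, 1.0)
```

Keeping this mapping explicit is also how gesture or vision input can later be substituted for the joysticks: anything that produces four normalized axes can drive the same command structure.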
AR 2.0 Tutorial #1 SETUP
AR 2.0 Tutorial #2 PILOT
AR 2.0 Tutorial #3 RECORD
AR 2.0 Best Of SDK
Watch: "Spidercam US Open 2010 Opening," YouTube, 1 min.
Watch: "Spidercam European swimming Championships Budapest," YouTube, 3 min.
Watch: "Spidercam Real Madrid Barcelona El Classico," YouTube, 5 min.
Watch: "How Does Skycam Work?" YouTube, 2 min.
ROS & Parrot, Processing & Parrot
Watch: Omer Fast (2011) 5,000 Feet is the Best, 30 min. [select from "Online Preview" menu]
Crane & Aerial shots
Paul Virilio, "Cinema isn't I See, it's I Fly," in War and Cinema: The Logistics of Perception, London: Verso, 1989: 11-30.
Beth Herst, "Review: The Disembodied Eye," PAJ: A Journal of Performance and Art, Vol. 24, No. 1, Intelligent Stages: Digital Art and Performance (Jan., 2002), pp. 122-126.
Ryan Calo (2012) "Robots and Privacy," Robot Ethics: The Ethical and Social Implications of Robotics, MIT Press, pp. 187-201.
Ryan Calo (2011) "The Drone as Privacy Catalyst," Stanford Law Review Online, 64, December 12, 2011, p. 29.
American Civil Liberties Union (2011) Protecting Privacy From Aerial Surveillance: Recommendations for Government Use of Drone Aircraft, ACLU, December 2011.
"Unblinking eyes in the sky," The Economist, March 3, 2012.
Watch: "Protester Films Polish Riots Using Drone 2011" YouTube, 3 min.
Meghan Keneally (2012) "Drone plane spots a river of blood flowing from the back of a Dallas meat packing plant," Daily Mail, January 24, 2012.
Watch: "Citizens Shoot Down Animal Rights Group's Surveillance Drone" The Blaze, 3 min.
Belton, John. "The Bionic Eye: Zoom Esthetics." Cineaste (1980): 20-27.
Brown, Garrett, "It's a Bird... It's a Plane... It's a... Camera!", American Film (Archive: 1975-1992) 8.10 (Sept. 1, 1983): 59-61.
Watch: Alex Rivera, Sleep Dealer, Likely Story, 2008, 90 min.
Steve Dixon, "Metal Performance: Humanizing Robots, Returning to Nature, and Camping About," and "A Brief History of Robots and Automata," TDR: The Drama Review, Volume 48, Number 4 (T 184), Winter 2004, pp. 15-46.
Eduardo Kac, "Foundation and Development of Robotic Art," Art Journal, Vol. 56, No. 3, Digital Reflections: The Dialogue of Art and Technology, (Autumn, 1997), pp. 60-67.
Edward A. Shanken, "Tele-Agency: Telematics, Telerobotics, and the Art of Meaning,"Art Journal, Vol. 59, No. 2 (Summer, 2000), pp. 65-77.
Watch: Jonathan Mostow, Surrogates, Touchstone Pictures, 2009, 89 min.
http://en.wikipedia.org/wiki/Uncanny_valley
Brian Fung (2012) "The Uncanny Valley: What Robot Theory Tells Us About Mitt Romney," The Atlantic, January 31, 2012.
Watch: Errol Morris, Fast, Cheap and Out of Control, Sony Pictures Classics, 1997, 80 min.