Introduction: Mini Projection-Mapped Landscape

This Instructable is about creating a simple sculpture that is augmented with projection mapping, designed to run on the Raspberry Pi, and built with openFrameworks.

Projection mapping is a process that relies on knowledge of the physical space, graphics software, and installation of the hardware necessary to complete the illusion of precisely mapped light. There are myriad tools and methods available for projection mapping objects, some relying on camera vision and other auto-calibrating techniques. My goal for this Instructable is to streamline the process of creating a sculpture and having it precisely projection mapped.

Material List:

  • Projector: Vivitek Qumi Q5 Super Bright HD Pocket Projector
  • Computer: Raspberry Pi Model B+
  • Projection Mapping Software: custom app written with openFrameworks
  • Modeling Software: Rhino + Grasshopper
  • Sculpture Fabrication: MakerBot Duo (PLA)

Step 1: Sculpture Design

I designed a landscape with Rhino3D/Grasshopper. You can use any software you're comfortable with. The projection software is designed for polygonal shapes and is not optimized for curves, so curves or a highly tessellated mesh will create more work in the mapping phase.

This design process could be translated to other modeling software; I'm using Rhino3D/Grasshopper. The design is based on a series of points placed in space, which are converted into a mesh with the Delaunay node in Grasshopper.

Open Rhino and Grasshopper.

Place a few vertices in Rhino's 3D space by repeating the following process:

Type "Point". Move the cursor the place where you want to place the point Click the left mouse button. Click the right mouse button to repeat the point command.

Once you have a group of points, select them in the Rhinoceros window. Then, in the Grasshopper window, create a Point collection by typing 'Point', right-clicking on the node, and selecting 'Set Multiple Points'.

Then, in Grasshopper, connect your Point collection to a Delaunay Mesh node. The Rhinoceros window will show the result.

For each new point you add to your mesh in Rhino, you'll need to right-click on the Point collection in Grasshopper and select 'Set Multiple Points' again so the new points are included.

Add, move, and modify your points in Rhinoceros to create a form that you like.

I've also included my landscape model in a variety of formats for you to use.

Step 2: 3D Print the Sculpture

I printed the sculpture on a MakerBot Duo in white PLA. I've also printed a few in different colors; gray seems to work well. The prints are a bit shiny, so I may scuff them up with sandpaper.

Attached to this step is the x3g file which can be dropped directly on your SD card and printed on a MakerBot.

Step 3: Setup Software on the Raspberry Pi

We're now going to set up the Raspberry Pi.


1. Setup Raspberry Pi

If you haven't set up a Raspberry Pi before, check out Scott Kildall's awesome Ultimate Raspberry Pi Configuration Guide.


2. Install OpenFrameworks on the Raspberry Pi

OpenFrameworks has a great tutorial on setting up OpenFrameworks on the Raspberry Pi.

Install openFrameworks into your home directory. I'm assuming your username is the default (pi). The result should be a directory structure that looks like this:

/home/pi/openFrameworks/
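To get there, the terminal steps look roughly like the following. The archive name and script locations below are placeholders that vary between openFrameworks releases, so follow the official guide for the exact ones:

$ cd /home/pi
$ tar -xzf of_vX.X.X_linuxarmv6l_release.tar.gz
$ mv of_vX.X.X_linuxarmv6l_release openFrameworks
$ cd /home/pi/openFrameworks/scripts/linux/debian
$ sudo ./install_dependencies.sh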


3. Install Addons

You'll need one ofxAddon called ofxCameraSaveLoad. You can install it by downloading it from http://ofxaddons.com/, directly from the author's GitHub, or from the zip file I've attached to this step. Extract the files into /home/pi/openFrameworks/addons/
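For example, if you download the addon as a zip into your home directory, the terminal steps would look something like this (the zip filename here is just an example and may differ):

$ cd /home/pi/openFrameworks/addons/
$ unzip ~/ofxCameraSaveLoad-master.zip
$ mv ofxCameraSaveLoad-master ofxCameraSaveLoad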

Step 4: Install Of-meshMappingExample

This software is written in openFrameworks/C++ and runs on a Raspberry Pi; the code will also work on your desktop or laptop. The app has tools for distorting a polygonal mesh imported from other software, with controls for a virtual camera, rotation, translation, and manual vertex tweaking.

Keep in mind that a mesh with a lot of vertices can be very hard to map, since you'll want to place each vertex by hand for fine-detail tweaks when you get to the mapping phase.

Features:

  • Single-Vertex Mesh Editor
  • Camera Position Editor
  • PLY Mesh Object Import
  • Save/Load scenes and camera settings

Install meshMappingExample software

Fork or Download from GitHub:

https://github.com/quilime/of-meshMappingExample

Extract the files into /home/pi/openFrameworks/apps/myApps/meshMappingExample/
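If you'd rather use git than download a zip (assuming git is installed on the Pi), the equivalent is roughly:

$ cd /home/pi/openFrameworks/apps/myApps/
$ git clone https://github.com/quilime/of-meshMappingExample.git meshMappingExample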

Change directories to the meshMappingExample folder:

$ cd /home/pi/openFrameworks/apps/myApps/meshMappingExample/

Run Make:

$ make

If all goes well, you can test the app by running:

$ make run

Step 5: Set Up Projector and Pi With Tripod Mount

I am using the Vivitek Qumi Q5 as my projector. A nice feature of this projector is that it has a powered USB port which can power the Pi. This means there's only one power cord for the entire setup.

I used the shortest HDMI and USB cables I could find and connected the Pi to the projector. I then zip-tied the Pi to the projector, making sure not to obscure any ventilation ports.

For initial setup, I plugged in a mouse and keyboard to do any mesh tweaking.

Once you get the projected image in focus, place the 3D model in the path of the projected area.

Step 6: Map the Sculpture

Plug a keyboard and mouse into the Raspberry Pi. You can also run the software on your laptop if that's easier for compiling, and then copy your code over and compile it on the Raspberry Pi.

Set the sculpture in the projected light from the projector, and run the meshMappingExample app by cd'ing into the application folder

$ cd /home/pi/openFrameworks/apps/myApps/meshMappingExample/

and run it:

$ make run

The first time the app boots up, you will get a blank screen because you haven't created a scene yet. Press 'h' to show the help menu, and then press 'TAB' to enter mesh-editing mode.

Mapping the sculpture takes a series of camera movements to get the shape in the general area, and then fine-tuning the mesh so the points line up as precisely as possible.

Key commands:

  • 'h' toggles the help window
  • 'f' toggles fullscreen
  • 'c' toggles camera control
    • With camera control set to 'ON', dragging the mouse will modify the virtual camera position. You'll want to match the position of the virtual mesh to the real mesh as closely as possible.
  • 'TAB' toggles mesh-edit mode
    • With mesh-editing mode set to 'ON', you can drag individual vertices to match their real-life positions on the sculpture. This allows for fine edits.
  • 's' saves the scene
    • Save the current mesh and camera settings.
  • 'l' loads the scene
    • Load previously saved mesh and camera settings.

Toggle mesh-edit mode with TAB to check your progress; you may have to scoot the sculpture around a little to get it to align exactly. Make sure to save your progress as you go by hitting 's', so you don't lose your work!

Step 7: Experience the Sculpture!

Once the form is mapped and you exit edit mode, you will see your sculpture being lit by two virtual lights. These lights revolve around the object, lighting each face as if it were in virtual space. Try tweaking ofApp.cpp to modify the light color or speed, or to add more lights; if you've got some experience with openFrameworks, try adding textures.
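As a starting point, here's a rough sketch of the kind of edit you could make in ofApp.cpp. It assumes the app uses standard ofLight objects; the variable names are illustrative and won't match the repo exactly.

// in ofApp.h (illustrative member): ofLight light1;

// in ofApp::setup(): tint the light
light1.setDiffuseColor(ofFloatColor(1.0, 0.3, 0.3)); // warm red

// in ofApp::update(): revolve the light around the scene origin
float speed = 0.5;                         // radians per second
float angle = ofGetElapsedTimef() * speed; // elapsed time drives the orbit
light1.setPosition(300 * cos(angle), 200, 300 * sin(angle));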

The Qumi projector is 500 lumens, so it works reasonably well in ambient light; the dimmer the ambient light, the better the projection will appear. The Qumi is a DLP projector, so banding will appear when filming on a cellphone or a point-and-shoot camera without a rolling shutter. This banding can be eliminated by using an LCD projector.

Some ideas for additions that aren't covered in this Instructable, but would be cool to investigate with this setup:

  • Start up the mapping application when the Pi boots by placing a startup script in /etc/init.d/ (a minimal sketch follows this list).
  • Remove all keyboard interactions so the app only relies on the mouse, allowing you to plug in only one peripheral to do the mapping task.
  • Add the Raspberry Pi Camera Module and use OpenCV to do camera-based projector calibration, based on structured light.
  • Include multiple meshes in the software to allow for multiple projection modes on various objects.
  • Include texture-mapping to map video onto mesh faces.
  • Include GLSL shaders applied to mesh faces for even more visual control.
  • Set up the Raspberry Pi with a USB WiFi adapter configured as a hotspot, allowing you to log in and modify the software remotely from your laptop.
  • ... and so much more!
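Here's a minimal sketch of the first idea above, a startup script. It assumes make produced the binary at bin/meshMappingExample and that the app can run straight from the console; treat it as a starting point rather than a finished init script.

#!/bin/sh
### BEGIN INIT INFO
# Provides:          meshmapping
# Required-Start:    $all
# Required-Stop:
# Default-Start:     2 3 4 5
# Default-Stop:      0 1 6
# Short-Description: Start the meshMappingExample app at boot
### END INIT INFO

case "$1" in
  start)
    cd /home/pi/openFrameworks/apps/myApps/meshMappingExample
    ./bin/meshMappingExample &
    ;;
  stop)
    killall meshMappingExample
    ;;
esac

Save it as /etc/init.d/meshmapping, then make it executable and register it:

$ sudo chmod +x /etc/init.d/meshmapping
$ sudo update-rc.d meshmapping defaults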

This basic example represents a proof-of-concept in getting a simple projection mapped scene running on a Raspberry Pi. I hope this tutorial gives you a starting point to designing and mapping your own sculptural forms.