Adding 3D Objects to 2D Images With Ease

On Friday, I wrote about the Throwable Panoramic Ball Camera, a device that captures panoramic images when you simply toss it into the air. One use case that came to mind was capturing HDR sphere images for realistically lighting 3D models placed in real-world scenes. If Kevin Karsch’s research makes its way into public tools, using HDR images for that purpose may soon be a thing of the past.

Kevin Karsch is a Computer Science PhD student at the University of Illinois at Urbana-Champaign who is currently researching computer graphics and computer vision. What makes Kevin’s research so interesting is what he calls “physically grounded photo editing.” From the description:

Current image editing software only allows 2D manipulations with no regard to the high level spatial information that is present in a given scene, and 3D modeling tools are sometimes complex and tedious for a novice user. Our goal is to extract 3D scene information from single images to allow for seamless object insertion, removal, and relocation. This process can be broken into three somewhat independent phases: luminaire inference, perspective estimation (depth, occlusion, camera parameters), and texture replacement. We are working on developing novel solutions to each of these phases, in hopes of creating a new class of physically-aware image editors.

In other words: the software aims to allow people to easily insert 3D objects into existing 2D photographs. Kevin has posted the following video on his Vimeo page, describing the process and results with examples:

Found via PhotoWeeklyOnline INC


Throwable Panoramic Ball Camera

Imagine if capturing a panoramic image of your surroundings were as simple as tossing a ball into the air. The Computer Graphics Group at TU Berlin apparently had the same vision and created the Throwable Panoramic Ball Camera: a ball containing 36 cameras, an accelerometer, and software to stitch the images together.
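The clever part is the trigger: in free fall an accelerometer reads roughly zero, so the ball can integrate the acceleration of the throw to estimate its launch velocity and fire all the cameras at the apex, where it is momentarily at rest and motion blur is lowest. Here is a minimal sketch of that idea in Python; the sampling scheme and names are my own assumptions for illustration, not the TU Berlin implementation:

```python
G = 9.81  # gravitational acceleration, m/s^2

def apex_delay(accel_samples, dt):
    """Estimate the time from release to the apex of the throw.

    accel_samples: accelerometer magnitude readings (m/s^2) taken while
    the ball is still in the thrower's hand; dt: sample spacing (s).
    Integrating (a - g) over the throw approximates the release
    velocity v; the ball then coasts for v / g seconds to its apex.
    """
    v = sum((a - G) * dt for a in accel_samples)  # launch velocity, m/s
    return max(v, 0.0) / G  # never negative: a dropped ball has no coast time

# A 0.2 s throw at a steady 40 m/s^2 gives v of about 6.0 m/s,
# so the cameras would fire roughly 0.62 s after release:
delay = apex_delay([40.0] * 20, 0.01)
```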

The ball will be demonstrated at SIGGRAPH Asia 2011, but you won’t need to wait until December to see it in action. Below is a video demo the group published on their page:

The first thought that came to mind after watching the demo was HDR lighting using a sphere. The ball’s panoramas could make it incredibly easy for 3D designers to place models in real-world scenes with realistic lighting.
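For readers unfamiliar with the technique: “HDR lighting using a sphere” refers to image-based lighting, where a renderer looks up the incoming light for any direction by sampling a panoramic HDR image, typically stored in a latitude-longitude (equirectangular) layout. Below is a small sketch of that direction-to-pixel lookup; it uses a standard equirectangular mapping (axis conventions vary between renderers) and is not code from either project:

```python
import math

def direction_to_latlong_uv(x, y, z):
    """Map a unit direction vector to (u, v) texture coordinates in an
    equirectangular (latitude-longitude) environment map.

    u spans longitude (0..1 around the horizon); v spans latitude
    (0 at the top of the sphere, 1 at the bottom). Assumes +y is up
    and -z is the viewing direction, a common graphics convention.
    """
    u = 0.5 + math.atan2(x, -z) / (2 * math.pi)
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

# A horizontal direction lands on the middle row of the panorama:
uv = direction_to_latlong_uv(1.0, 0.0, 0.0)  # (0.75, 0.5)
```

A renderer would use these (u, v) coordinates to fetch an HDR radiance value from the panorama for each sampled light direction, which is exactly the data a camera like this could capture in a single toss.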