
Imaging Systems and Applications

25 June 2018 – 28 June 2018 Wyndham Orlando Resort International Drive, Orlando, Florida United States

IS brings together experts from many different scientific and engineering disciplines who contribute to the design and integration of optics, sensors, digital processing and displays in imaging systems. IS captures the state-of-the-art in unique light gathering optics, image sensor architectures and technology, on and off chip digital image processing, and methods for compression, storage, transmission, and utilization.

The meeting highlights the leading-edge use of imaging systems in consumer imaging, automotive and drone imaging, photography and digital cinematography capture and projection, remote sensing, microscopy, invasive and non-invasive surgery, and airborne and astronomical observations and imaging.
 


Topics

Applications of military, industrial, medical and consumer imaging with special sessions on:
  • Wearable sensors and displays
  • 3-D imaging - capture methods and displays
  • Imaging for autonomous vehicles
  • Advances in biomedical imaging
  • Aerospace imaging
  • Optical sensing
Imaging systems and components with special sessions on:
  • Image sensors
  • Image systems design and simulation
  • Image processing
  • Multispectral, hyperspectral and thermal imaging
  • Computational imaging
  • Novel imaging optics including metamaterials, freeform optics, diffractive optics, polarization, and programmable optics



Speakers

  • K. Vijayan Asari, University of Dayton, United States
  • Claude Boccara, Institut Langevin, France
    Ultra-High Resolution Full-Field OCT (FFOCT) for Cornea and Retina
  • Dmitry Dylov, Skolkovo Institute of Science and Technology, Russia
    Integrated tissue analytics for clinical imaging systems
  • Katsumasa Fujita, Osaka University, Japan
    Super-resolution confocal microscopy using optical nonlinearity
  • Juliet Gopinath, University of Colorado at Boulder, United States
    Adaptive electrowetting optical devices for imaging
  • Boris Gramatikov, Johns Hopkins University, United States
    Integrating Retinal Birefringence Scanning and Optical Coherence Tomography for Pediatric Retinal Imaging
  • Sanjeev Koppal, University of Florida, United States
    Toward Miniature Computer Vision Sensors
  • Geoffrey McKnight, HRL Laboratories, LLC
    Spherically Curved Image Sensors
  • Hirotaka Murakami, Sony Electronics
    CMOS Image Sensor Evolution toward Sensing World
  • Shuo Pang, University of Central Florida, CREOL, United States
    Spatiotemporally Encoded Fast Fluorescence Microscopy with Full Field-of-View
  • Clara Rivero-Baleine, Lockheed Martin, United States
    Engineered Materials for Next Generation EO/IR Sensors
  • Antonio Robles-Kelly, CSIRO Australia Telescope National Facility, Australia
    A Multispectral Light Field Camera for 3D Imaging from a Single Lens
  • David Sampson, University of Surrey, United Kingdom
    Angiography, Lymphangiography, Elastography, and Polarisation Contrast Extensions of Optical Coherence Tomography
  • Jane Sprigg, Tarsier Optics, United States
    Incoherent Super-resolution Imaging
  • Nelson Tabiryan, Beam Engineering for Advanced Measurements Co, United States
    Electrically Switchable Large, Thin, and Fast Optics
  • C Alex Young, NASA, United States
    Eclipse Ballooning Project
  • Ignacio Zuleta, Planet, United States



Committee

  • Matthew Arnison, Canon Info. Sys. Research Australia, Australia, Chair
  • Michael Groenert, US Army RDECOM CERDEC, United States, Chair
  • Ginni Grover, Intel Corporation, United States, Program Chair
  • Kristina Irsch, Johns Hopkins University & Sorbonne Univ, United States, Program Chair
  • Kenneth Barnard, US Air Force Research Laboratory, United States
  • Peter Catrysse, Stanford University, United States
  • Christopher Dainty, FotoNation, Ireland
  • Mini Das, University of Houston, United States
  • Aristide Dogariu, University of Central Florida, CREOL, United States
  • Joyce Farrell, Stanford University, United States
  • Boyd Fowler, Omnivision Technologies, United States
  • Kevin Gemp, MITRE Corp, United States
  • Francisco Imai, Apple Inc., United States
  • Chulmin Joo, Yonsei University, South Korea
  • Byoungho Lee, Seoul National University, South Korea
  • Ofer Levi, University of Toronto, Canada
  • Dale Linne von Berg, US Naval Research Laboratory, United States
  • Rajesh Menon, University of Utah, United States
  • Lise Randeberg, Norges Teknisk Naturvitenskapelige Univ, Norway
  • Maitreyee Roy, University of New South Wales, Australia
  • Todd Sachs, Apple Inc., United States
  • Casey Streuber, Raytheon Missile Systems, United States
  • Jay Vizgaitis, optX imaging systems, United States
  • Laura Waller, University of California Berkeley, United States
  • Zeev Zalevsky, Bar-Ilan University, Israel



Plenary Session

Paul Debevec

Google VR, USA

Light Fields and Light Stages for Photoreal Movies, Games, and Virtual Reality

This talk will present work from USC ICT and Google VR in creating actors and environments for movies, games, and virtual reality. The Light Stage computational illumination and facial scanning systems are geodesic spheres of inward-pointing LED lights which have been used to create digital actor effects in movies such as Avatar, Benjamin Button, and Gravity, and have recently been used to create photoreal digital actors based on real people in movies such as Furious 7, Blade Runner 2049, and Ready Player One. The lighting reproduction process of light stages allows omnidirectional lighting environments captured from the real world to be accurately reproduced in a studio, and has recently been extended with multispectral capabilities that enable LED lighting to accurately mimic the color rendition properties of daylight, incandescent, and mixed lighting environments. The team has also recently used its full-body light stage in conjunction with natural language processing and automultiscopic projection to record and project interactive conversations with survivors of the World War II Holocaust. Debevec will conclude by discussing the technology and production processes behind "Welcome to Light Fields", the first downloadable virtual reality experience based on light field capture techniques, which allow the visual appearance of an explorable volume of space to be recorded and reprojected photorealistically in VR, enabling full 6DOF head movement.

About the Speaker

Paul Debevec is a research professor at the University of Southern California and the associate director of graphics research at USC's Institute for Creative Technologies. Debevec's Ph.D. thesis (UC Berkeley, 1996) presented Façade, an image-based modeling and rendering system for creating photoreal architectural models from photographs. Using Façade he led the creation of virtual cinematography of the Berkeley campus for his 1997 film The Campanile Movie, whose techniques were used to create virtual backgrounds in The Matrix. Subsequently, Debevec pioneered high dynamic range image-based lighting techniques in his films Rendering with Natural Light (1998), Fiat Lux (1999), and The Parthenon (2004); he also led the design of HDR Shop, the first high dynamic range image editing program. At USC ICT, Debevec has led the development of a series of Light Stage devices for capturing and simulating how objects and people reflect light, used to create photoreal digital actors in films such as Spider-Man 2, Superman Returns, The Curious Case of Benjamin Button, and Avatar, as well as 3D display devices for telepresence and teleconferencing. He received ACM SIGGRAPH's first Significant New Researcher Award in 2001, co-authored the 2005 book High Dynamic Range Imaging from Morgan Kaufmann, and chaired the SIGGRAPH 2007 Computer Animation Festival. He serves as Vice President of ACM SIGGRAPH and is a member of the Visual Effects Society, the Academy of Motion Picture Arts and Sciences, and the Academy's Science and Technology Council.

Jason Eichenholz

Luminar Technologies, USA

OSA Light the Future Presentation: The Role of Optics and Photonics in the Vehicles of Tomorrow

In this presentation, OSA Fellow Jason Eichenholz will take a high-level look at the future of optics and photonics technologies in autonomous vehicles. Optics are a crucial component in an industry headed for extreme disruption over the next few decades and will play a critical role in shaping the future of navigation, passenger experience and the ultimate safety of the autonomous trip. Eichenholz will discuss the key optical components, including LiDAR, laser headlights, passenger monitoring, and interior lighting and displays; the role each plays inside a future automobile; and their impact on the transportation industry.

About the Speaker

Jason Eichenholz is a serial entrepreneur and pioneer in laser, optics and photonics product development and commercialization. Over the course of his twenty-five-year career, his unique blend of business and technical leadership has brought hundreds of millions of dollars of new photonics products to market. Eichenholz is an inventor on ten U.S. patents covering new types of solid-state lasers, displays and photonic devices.

Laurent Pueyo

Space Telescope Science Institute, USA

Exoplanet Imaging: From Precision Optics to Precision Measurements

During this plenary talk, Laurent Pueyo will present recent observational results in exoplanet imaging and discuss prospects for similar experiments on NASA missions such as the upcoming James Webb Space Telescope and the currently studied Large UV/Optical/IR Surveyor.

About the Speaker

Laurent Pueyo is an astronomer at the Space Telescope Science Institute, in Baltimore, Maryland. He earned his doctorate from Princeton University in 2008 and conducted his post-doctoral work as a NASA Fellow at the Jet Propulsion Laboratory and as a Sagan Fellow at the Johns Hopkins University. His research focuses on imaging faint planets around nearby stars. He has pioneered advanced data analysis methods that are now standard tools used to study extrasolar planets, and invented an optical technique that is now baselined for future NASA missions. At STScI his duties include optimizing the extrasolar-planet imaging capabilities of NASA's James Webb Space Telescope (JWST), scheduled to launch in late 2019. He is also a member of the Science and Technology Definition Team for the Large Ultraviolet Optical and Infrared telescope, a future observatory that will identify Earth-sized planets and assess their habitability.

Top


Special Events

 

Digital Holographic Microscopy: Present and Future Panel Discussion

Monday, 25 June, 12:30–14:00
Join the OSA Holography and Diffractive Optics Technical Group for a panel discussion exploring potential breakthroughs in digital holographic microscopy. Brief presentations from our featured panelists will be followed by a moderated question and answer session, helping facilitate the exchange of information with our community. Contact TGactivities@osa.org to register, pending availability.
 

Congress Reception

Monday, 25 June; 18:30–20:00
Come join your colleagues for drinks, networking and thoughtful discussion. Enjoy light fare while networking. The reception is open to all full conference attendees.
Conference attendees may purchase extra tickets for their guests.

Student & Early Career Professional Development & Networking Lunch and Learn

Tuesday, 26 June; 12:30-14:00
This program will provide a unique opportunity for students and early career professionals who are close to finishing, or who have recently finished, their doctoral degree to interact with experienced researchers. Key industry and academic leaders in the community will be matched with each student based on the student's preferences or similarity of research interests. Students will have an opportunity to discuss their ongoing research and career plans with their mentor, while mentors will share their professional journeys and provide useful tips to those who attend. Lunch will be provided.

This workshop is complimentary for OSA Members, and space is limited. Because of space limitations, not all who apply will be able to attend; priority will be given to those who have recently graduated or are close to graduation.
Hosted by OSAF

50th Anniversary of Introduction to Fourier Optics by Joseph Goodman

Tuesday, 26 June; 13:30-19:30
This year marks the 50th anniversary of the publication of Introduction to Fourier Optics by Joseph Goodman, a book that has had a fundamental influence on the field of optical imaging. To commemorate this anniversary, a special series of talks will be presented, ranging from Fourier optics in the classroom to the evolution of the field.

Join the Image Sensing and Pattern Recognition Technical Group for a small reception immediately following the conclusion of the program.
 
  • Joseph W. Goodman, Stanford University, USA, Origins and Evolution of Introduction to Fourier Optics
  • James Fienup, University of Rochester, USA, ABCD Matrix Analysis for Fourier-Optics Imaging
  • Raymond Kostuk, University of Arizona, USA, A review of the wonderful discussion of Holography by Professor Goodman in his book: The Introduction to Fourier Optics.
  • James Leger, University of Minnesota Twin Cities, USA, What’s the Problem? Insight and Inspiration Derived from Solving the Exercises in J. Goodman’s Classic Book Introduction to Fourier Optics
  • Masud Mansuripur, University of Arizona, USA, Fourier Optics in the Classroom
  • Demetri Psaltis, Ecole Polytechnique Federale de Lausanne, Switzerland, The Transition of Fourier Optics Towards Computational Imaging and Digital Holography
  • William T. Rhodes, Florida Atlantic University, USA, Teaching Fourier Optics: What I do Differently after 50 Years
  • Bahaa Saleh, University of Central Florida, USA
 

Illumicon II

Tuesday, 26 June 2018, 19:00 – 21:00
You are invited to join the OSA Display Technology Technical Group for Illumicon II, an exclusive members-only event. Building on the declarations established at the inaugural Illumicon, convened in 2016, attendees will come together to discuss and debate emerging trends, technologies and opportunities in advanced 3D displays. Our discussions will also seek input on how the Display Technology Technical Group can further engage the 3D community in the years ahead. Illumicon II attendees will converge over drinks and appetizers at a confidential location. Entrance will be granted to those able to provide the secret Illumicon II event password. RSVP to tgactivities@osa.org to receive the event location and password.

 

Applications of Visual Science Technical Group Networking Lunch

Wednesday, 27 June 2018, 12:00–13:00
Members of the OSA Applications of Visual Science Technical Group are invited to join us for a networking lunch on Wednesday. The event will provide an opportunity to connect with fellow attendees who share an interest in this field and to learn more about this technical group. Contact TGactivities@osa.org to register, pending availability.



Tour of Laser Propagation Facilities at Kennedy Space Center

Thursday, 28 June (13:00-18:00) and Friday (07:00-12:00)
Additional Fee: $25 per person.
 *Fee includes only transportation. Transportation will leave from the conference hotel and return to the Orlando International Airport and the conference hotel. 

During this tour at Kennedy Space Center (KSC), you will see various facilities used for outdoor field experiments such as laser propagation measurements. The tour will include UCF's Townes Institute Science and Technology Experimentation Facility (TISTEF), the Shuttle Landing Facility (SLF), and the Vehicle Assembly Building (VAB). TISTEF is a site for experiments that require deployment in a fielded setting; it consists of a 1 km grass range equipped with atmospheric monitoring instruments and multiple scintillometers, as well as capabilities for optical tracking and remote sensing. From this site, slant path measurements can be made over the 13 km path to the top of the VAB. The 5 km long SLF is ideal for longer path measurements because of its homogeneity and flatness (the Earth's curvature has been removed). This tour is made possible by the pcAOP committee and the University of Central Florida.

 

Student Grand Challenge: The Optical Systems of the Future

The challenge is open to OSA student members and their advisors interested in presenting concepts for enhanced machine vision, or for systems that enhance human vision by augmenting or extending another human sense.
 
Individuals or teams are invited to submit ideas for a novel passive or active optical system in the form of a 35-word abstract and a two-page summary highlighting the novelty, originality and feasibility of the concept. (Additional materials may include videos highlighting a system mock-up or demos.)
 
Up to four finalists will be chosen to attend the Imaging and Applied Optics Congress, 25-28 June 2018 in Orlando, FL, USA, to present a 3-minute synopsis of their concept and to host a poster during the conference poster session. Finalists will receive a travel stipend of up to $2,000 USD to cover airfare and hotel, as well as full technical registration for the congress.
 
Two winners will be announced on-site and will receive a recognition plaque and a $250 prize.

Sponsored by Lockheed Martin and the OSA Foundation.

Passive Optical System Challenge Problem

The image processing community strives to duplicate human vision. For certain specific and well-defined tasks, machine vision has matched or surpassed human capability, but it still struggles in poorly defined and dynamic environments. Category comparisons between machine vision and human vision include:
  • Spectrum: Machine vision is superior, since human vision is limited to the visible spectrum. Machine vision can also resolve narrower spectral steps and larger dynamic ranges than our eyes.
  • Resolution: Human vision is superior. Current machine vision systems approaching 8K x 8K formats are starting to close the gap, but only in the visible.
  • Focus: Human vision is superior, able to focus from very close to very far with a single lens element. The eye's aperture is limited by the size of the pupil for distant objects. Machine vision systems are designed specifically for very close or very far ranges and do not suffer from being aperture limited, but they require many lens elements to accomplish the same tasks as the human eye.
  • Optical Processing: Human vision plus the brain is superior to machine vision in pattern recognition and decision making.
The passive optical systems challenge is to create a novel concept, technology or system for improving results in one of the eight categories below:
  • Image Processing: Ideas focused on detection and categorization of objects in the view field.
  • Lens Technology: Ideas that are focused on optical sensors.
  • High Speed Data Transport: Ideas that are focused on fast and efficient transport of high resolution image data and streams.
  • Adaptable Lens Technology: Ideas that are focused on adaptable optical sensors.
  • Liquid Lens Optical Sensors: Ideas focused on liquid lens.
  • Artificial Intelligence: Ideas that focus on image-based cognition.
  • AV/VR Technology: Ideas that are focused on Augmented and Virtual Reality technologies.
  • Other: Ideas that do not fall into one of the existing categories.

Active Optical System Challenge

Given that the human vision system is intrinsically passive, how would you use active sensing techniques to augment that vision system to mimic or extend the human senses? Augmentation could mean adding higher-precision 3D vision, active foveated imaging, active IR-assisted sensing, vibrometry, polarimetry, sensing motion in the FOV, chemical/biological sensing, looking through fog or turbulence, etc. Many such systems have been demonstrated, but they are often large, heavy, and costly.
 
The active optical system challenge is to come up with novel sensor concepts that mimic at least two of the human senses at a distance of at least 10 m, with the sensor fitting into one third of the volume of the human brain (roughly 0.5 liters). More sensing modalities are encouraged, especially those that extend what humans can do.
 
  1. Sight (e.g., producing 2D or 3D images)
  2. Hearing (e.g., measuring object vibrations through optical means)
  3. Smell (e.g., chemical/biological sensing)
  4. Taste (e.g., chemical/biological sensing)
  5. Touch (e.g., characterization of surface texture and/or temperature)

Rules

  • Participation is limited to undergraduate and graduate students.
  • Teams must include at least one OSA student member and at least one advisor who is an OSA member.
  • Required submission format: PDF with a 35-word abstract and a two-page summary.
    • Optional submission material: videos, system mock-ups, demonstrations.

Key Criteria

  • Compliance: Is the submission complete, and does it comply with the rules of the challenge?
  • Novelty: Does the idea describe a novel approach to providing a solution?
  • Originality: How original is the proposed technology or use of existing technology?
  • Relevance: How well does the idea relate to the topic and provide a solution aligned with the goals of this challenge?
  • Feasibility: How likely is it that the idea can be prototyped?

Evaluation

Submissions will be evaluated by a committee of Imaging and Applied Optics Congress leadership and Lockheed Martin executives.

