Visual Depth and Velocity Mapping

[[File:Disparity.jpg|300px|center|alt=]]
  • Sponsors: Dan Schneider, University of Idaho [1]
  • Team Name: Spectralink
  • Duration: Fall 2018 - Spring 2019
  • Faculty Adviser: Dr. Bruce Bolden
  • Client: Dan Schneider / Schweitzer Engineering Labs
  • Team Members: Bailey Lind-Trefts, Dustin Pierce, Matthew Mills

This project provides a sight substitution system for the visually impaired. Information about the surrounding environment is gathered by stereo vision cameras and used to generate disparity maps. These maps are then used to calculate the depth and velocity of surrounding objects. The depth and velocity values are expressed as vectors, which are then translated into audio signals of varying pitch and synchronization.
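As a rough illustration of the depth and velocity step described above, the sketch below converts a disparity map into per-pixel depth with Z = f·B/d and differences two consecutive depth maps over the frame interval. The focal length, baseline, and frame rate are placeholder values, not the project's actual camera calibration.

<syntaxhighlight lang="python">
import numpy as np

# Placeholder calibration values (not the project's actual hardware).
F_PX = 700.0           # focal length in pixels
BASELINE_M = 0.12      # distance between the two camera centers in meters
FRAME_DT = 1.0 / 30.0  # time between frames at 30 FPS

def depth_from_disparity(disparity_px):
    """Z = f * B / d; pixels with no valid disparity are left at 0."""
    depth = np.zeros_like(disparity_px, dtype=np.float32)
    valid = disparity_px > 0
    depth[valid] = F_PX * BASELINE_M / disparity_px[valid]
    return depth

def depth_velocity(disp_prev_px, disp_curr_px):
    """Per-pixel rate of change of depth in m/s (negative = approaching).
    Pixels that are invalid in either frame should be masked before use."""
    return (depth_from_disparity(disp_curr_px)
            - depth_from_disparity(disp_prev_px)) / FRAME_DT
</syntaxhighlight>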


Problem Definition

Although there are extensive resources available for most disabilities, very few aids exist for the visually impaired. This project aims to write software that maps visual stimuli to an auditory field using Real Time Disparity Mapping (RTDM) with a dual camera system and a simple processor. The unit should be wearable and functional enough that a user could walk around campus with it on their head. We expect to have our first prototype running and functional by November 27th.
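One common way to produce a disparity map from a dual camera system is semi-global block matching. The hedged sketch below runs OpenCV's StereoSGBM matcher on a rectified stereo pair; the file names and matcher settings are placeholders, not the team's final configuration.

<syntaxhighlight lang="python">
import cv2
import numpy as np

# Load a rectified stereo pair (placeholder file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching produces a dense disparity map.
matcher = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,  # search range in pixels; must be a multiple of 16
    blockSize=5,
)

# compute() returns 16.4 fixed-point disparities, so divide by 16 to get pixels.
disparity_px = matcher.compute(left, right).astype(np.float32) / 16.0
</syntaxhighlight>

Running a matcher like this on every incoming frame is the "real time" part of RTDM, which is why the choice of processing board matters (see Design Considerations).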

Background

It has been demonstrated that people who are visually impaired can learn to navigate by echolocation. This project is therefore based on the intuition that it is possible to learn to translate variations in an audio signal into meaningful information about the spatial relationships of objects in the surrounding environment.

Deliverables

The deliverables for this project include the following:

1. Functioning hardware (ZED stereo camera integrated with Nvidia GPU processing board)

2. Proprietary modifications to SIFT and SNAP disparity mapping algorithms (a baseline SIFT matching sketch follows this list)

3. Portfolio containing documentation
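As a hedged baseline for deliverable 2, the sketch below estimates sparse disparities by matching SIFT keypoints between a rectified left/right pair. It illustrates the general idea only; it is not the proprietary modifications themselves, and the file names and thresholds are placeholders.

<syntaxhighlight lang="python">
import cv2

# Rectified stereo pair (placeholder file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_l, des_l = sift.detectAndCompute(left, None)
kp_r, des_r = sift.detectAndCompute(right, None)

# Ratio-test matching; keep matches that lie on (nearly) the same scanline,
# as rectified epipolar geometry requires, and that have positive disparity.
matches = cv2.BFMatcher().knnMatch(des_l, des_r, k=2)
disparities = []
for m, n in (p for p in matches if len(p) == 2):
    if m.distance < 0.75 * n.distance:
        xl, yl = kp_l[m.queryIdx].pt
        xr, yr = kp_r[m.trainIdx].pt
        if abs(yl - yr) < 2.0 and xl > xr:
            disparities.append(xl - xr)

print(f"{len(disparities)} sparse disparity samples")
</syntaxhighlight>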

Specifications

  • User Interface Requirements:
    • A set of headphones and a camera shall be attached to a helmet or hat.
    • The device should be secured to the body in a configuration suitable for walking and sitting.
    • The device should not impede the user's ability to interact with objects in front of or to either side of them.
    • The device should not expose the user to any severe electrical or mechanical hazards (electric shock or burns).
  • What the product should do:
    • This product will relay 3D position and velocity information to the user in an audio format (see the sketch after this list). The resolution and accuracy of the data should be sufficient for traversing a room without the assistance of eyesight.
  • Physical Requirements:
    • Maximum weight: 10 lbs
    • Power: 12.6 W (processing board), 3.8 W (camera)
    • Framerate: 15 / 30 / 60 FPS
    • Depth of field (max): 15 / 20 / 30 m
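The sketch below shows one possible mapping from an object's depth and horizontal position to an audio cue: nearer objects get a higher pitch, and left/right position is encoded as stereo panning. The frequency range and panning law here are assumptions for illustration, not the team's final design.

<syntaxhighlight lang="python">
import numpy as np

SAMPLE_RATE = 44100  # samples per second

def depth_to_tone(depth_m, azimuth, duration_s=0.2,
                  max_depth_m=20.0, f_low=200.0, f_high=2000.0):
    """Return a stereo float32 buffer encoding one object.

    depth_m : distance to the object in meters
    azimuth : horizontal position, -1.0 (far left) to +1.0 (far right)
    """
    # Closer objects map to higher frequencies.
    closeness = 1.0 - min(depth_m, max_depth_m) / max_depth_m
    freq = f_low + closeness * (f_high - f_low)

    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    tone = np.sin(2.0 * np.pi * freq * t).astype(np.float32)

    # Simple linear pan between the left and right channels.
    left_gain = 0.5 * (1.0 - azimuth)
    right_gain = 0.5 * (1.0 + azimuth)
    return np.stack([left_gain * tone, right_gain * tone], axis=1)

# Example: an object 3 m away, slightly to the user's left.
buffer = depth_to_tone(depth_m=3.0, azimuth=-0.4)
</syntaxhighlight>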

Design Considerations

To create a Real Time Disparity Mapping system that could be used in a mobile setup, it was necessary to find hardware small enough to be worn on the user's body yet powerful enough to process the images. The first prototype we proposed was a Tara stereo vision camera connected to a Raspberry Pi 3 Model B+, powered by a battery pack.

[[File:Tara.jpg|center|Tara]]

[[File:Pi.png|center|Raspberry Pi]]


However, after discussing this option with our client, we determined that the Raspberry Pi was not powerful enough for the image processing we needed and that the viewing angle of the Tara camera was too narrow. After more research, we found another camera, the ZED by StereoLabs, which had the desired viewing angle and depth perception, as well as a processing board, the Nvidia Jetson TX2, with sufficient computational capability for our tasks.
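For reference, the hedged sketch below shows roughly how a depth map can be retrieved from the ZED through StereoLabs' Python API (pyzed). Exact class and enum names vary between SDK versions, so treat this as an outline under those assumptions rather than tested code from this project.

<syntaxhighlight lang="python">
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()
init.depth_mode = sl.DEPTH_MODE.PERFORMANCE  # favor frame rate over accuracy
init.coordinate_units = sl.UNIT.METER

if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("could not open the ZED camera")

depth = sl.Mat()
runtime = sl.RuntimeParameters()
if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
    # Per-pixel depth in meters; get_value() returns (error_code, value).
    zed.retrieve_measure(depth, sl.MEASURE.DEPTH)
    err, center_depth = depth.get_value(depth.get_width() // 2,
                                        depth.get_height() // 2)
    print("depth at image center:", center_depth)

zed.close()
</syntaxhighlight>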


[[File:Zed.jpg|center|ZED]]

[[File:Jetson.jpg|center|Jetson]]


Another problem our team deliberated over was where to place the hardware on the user's body. We initially considered mounting the camera on the user's chest, but decided instead to fix it to the user's head, with the Jetson board and a battery pack carried in a backpack.


[[File:Body.png|center]]


Another discovery our team made was that we need to develop the code for the project with Nvidia's own development kit (JetPack) rather than OpenCV alone. Links to each of these development environments are given below:

  • OpenCV: [2]
  • Nvidia JetPack: [3]

Project Learning

Final Design

Validation

Team Members

Cs480 fall2018 spectralink bailey.jpg

Bailey Lind-Trefts
Major: Mechanical Engineering
Hometown: Coeur d'Alene, ID
Responsibility: Team Lead
Email: lind3831@vandals.uidaho.edu


Cs480 fall2018 spectralink dustin.jpg

Dustin Pierce
Major: Computer Science
Hometown: Sandpoint, ID
Responsibility: Team Member
Email: pier2205@vandals.uidaho.edu


Cs480 fall2018 spectralink matthew.jpg

Matthew Mills
Major: Computer Science
Hometown: Coeur d'Alene, ID
Responsibility: Team Member
Email: matt9751@vandals.uidaho.edu


Additional Documentation

Project Schedule

Media:Spectralink schedule.pdf

Meeting Minutes

Presentations



Client Interview
