
This company has developed stereo cameras for driverless cars

Video: This sensor company wants to power the driverless car revolution (2:56). Source: Seegrid

There are at least 33 major corporations working on getting driverless cars on the road, from Google to Ford to Tesla, according to a CB Insights report.

Now, a new player wants to join the game — a manufacturing robotics company.

Pittsburgh-based Seegrid has been making vehicles drive autonomously for a dozen years. The company, founded in 2003, develops sensors and software to move industrial trucks around manufacturing facilities and distribution centers for companies like Whirlpool, Amazon, Volvo, BMW and Jaguar.

Seegrid CEO Jim Rock said the technology has proven itself to be ready for the roads.

"We've done nearly 500,000 miles in actual customer production environments with no accidents," Rock said.

Seegrid is powering this driverless car prototype with its dual-camera sensor technology.
Jeniece Pettitt | CNBC

Seegrid's vehicles navigate entirely using proprietary stereo cameras that give the robots depth perception.

"It's just like the two eyes that are in human beings," said Jeff Christensen, Seegrid's vice president of products. "You get a depth of field."

Autonomous vehicles use all kinds of sensors — including radar, ultrasound and cameras — but Christensen said the industry has not yet settled on a winning sensor design.

The primary sensor used by Google's and Uber's autonomous vehicles is Lidar, or light detection and ranging. It's the spinning contraption that you see on the top of Google's driverless cars.

But Tesla, which recently announced it was adding autonomous driving hardware to all of its vehicles, ditched the pricey Lidar for cameras, which are more affordable. Tesla's cars will come equipped with eight monocular cameras around the perimeter of the vehicle, paired with radar to capture depth of field.

Christensen said he applauds Tesla for using cameras as a primary sensor, but that stereo cameras would be a better choice.

"Then you can get both the image data and the ranging data from a single sensor," he said.

A look at what Seegrid’s dual-camera sensors “see” using their proprietary software. The longer the blue line, the closer the object.
Source: Seegrid

Seegrid is not looking to build its own cars, but instead to partner with an automaker and sell its sensor kit.

The company has a prototype using a Nissan Leaf to test its sensors outside of a manufacturing setting for the first time. It jury-rigged its vision sensors into the car and has been testing it on the roads to gather data.

Rock said the company's "evidence grid" software, which captures and translates the sensor data, is what sets Seegrid apart from others in the industry.
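
Seegrid has not published how its evidence grid works, but in robotics the term generally refers to a map whose cells accumulate probabilistic evidence of being occupied as range readings come in. The snippet below is a minimal, generic sketch of that idea, with a made-up sensor model, and is not Seegrid's implementation.

```python
import math

# Minimal evidence-grid sketch (generic occupancy-grid idea, not Seegrid's code).
# Each cell stores the log-odds that it is occupied; a range reading that ends
# in a cell adds positive evidence, one that passes through adds negative evidence.

LOG_ODDS_HIT = math.log(0.7 / 0.3)   # assumed sensor model: reading ends in the cell
LOG_ODDS_MISS = math.log(0.4 / 0.6)  # assumed sensor model: reading passes through

class EvidenceGrid:
    def __init__(self, width: int, height: int):
        self.cells = [[0.0] * width for _ in range(height)]  # log-odds, 0 = unknown

    def update(self, x: int, y: int, hit: bool) -> None:
        self.cells[y][x] += LOG_ODDS_HIT if hit else LOG_ODDS_MISS

    def occupancy(self, x: int, y: int) -> float:
        """Convert accumulated log-odds back to a probability in [0, 1]."""
        return 1.0 / (1.0 + math.exp(-self.cells[y][x]))

grid = EvidenceGrid(100, 100)
grid.update(10, 20, hit=True)  # a depth reading landed in this cell
grid.update(10, 20, hit=True)
print(round(grid.occupancy(10, 20), 2))  # ~0.84 after two hits
```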

The prototype cannot yet drive on its own, but the company says the technology is ready and expects to have a more advanced prototype next year.

The global autonomous vehicles market is expected to reach $65.3 billion by 2027, according to a report by Market Research Future.