How AI Is Unlocking New Value From Onboard Cameras
As the use of onboard cameras has expanded in the trucking industry, so have the capabilities of these video-based systems, which increasingly utilize artificial intelligence to more efficiently extract insights from video and ultimately improve fleet safety.
Predictive analysis of driver behavior, customized driver coaching and exoneration from liability in crashes are among the benefits that fleet operators can realize from the use of AI-enabled camera systems on heavy trucks, trucking and technology executives said.
“These days professional drivers really demand a certain level of protection from any fleet that they want to work for,” said Michael Lasko, an executive at Boyle Transportation and Skelton USA. That includes “exonerations from false claims or frivolous claims [and] coaching and skills development.”
Boyle Transportation and Skelton USA started using onboard cameras in 2016 and switched to AI-enabled video systems from Netradyne about three years ago.
Together, Boyle and Skelton operate 150 trucks, nearly all of which are used in over-the-road operations and driven by teams. Those vehicles are fitted with a road-facing dashcam and cameras on each side.
“The lion’s share of accidents — at least what we experience — occurs along the side of the truck,” Lasko said.
A common example is an inattentive car driver drifting into the side of the commercial vehicle.
Boyle Transportation and Skelton USA, which together employ some 300 drivers, are headquartered in Billerica, Mass., and are owned by Vaughan, Ontario-based health care logistics firm Andlauer Healthcare Group.
“We try to keep our safety program positive,” said Lasko, who is assistant general manager and vice president of environmental, health and safety and quality for the two trucking companies.
Fleets should avoid calling drivers only when there is a negative situation to discuss, Lasko said. “That’s usually the death of a good safety program.”
The Netradyne system captures drivers’ good behaviors as well as the bad so managers can recognize and reward instances of exemplary driving.
Broad-brush or unfocused training can cause drivers to become skeptical, Lasko said.
For example, a driver who has never had a backing accident in 25 years might wonder why he is being asked for additional training on backing.
The AI-enabled camera system can identify specific behaviors by specific drivers.
“Gone are the days when every driver got brought into the office to do training about X, Y and Z, whether they needed it or not,” Lasko said. “Now you’re able to really identify things that are meaningful and impactful to a driver and provide them with that training. Drivers appreciate that.”
Boyle uses AI-enabled video safety systems from Netradyne to provide tailored training to its drivers. (Boyle Transportation)
With a driver app that Netradyne offers, drivers log in to track their performance, including their safety score and any coaching comments that have been added to event videos.
“That whole punitive [approach] should go the way of the dinosaur,” Lasko said. “These days, it’s really about what kind of support you can give the driver to help them get to that next level of their career.”
Alliance Industries, a Marietta, Ohio-based manufacturing firm whose companies include Hi-Vac Corp., Mole-Master Services and Terra Sonic International, is using Motive’s safety system with road- and driver-facing AI-enabled cameras on some vehicles, according to Mitchell Wagner, IT business analyst for Alliance Industries, and Paula Starkey, operations assistant for Mole-Master.
The fleet operates heavy, medium and light trucks.
Wagner said the AI cameras provide “huge insight,” especially into the actions of Mole-Master’s new drivers with little experience. Last year, data from the system showed that one driver had accumulated more than 200 close following alerts.
“We would never have known that” without the camera system, Wagner said. “The silver lining is we talked to the guy and after that conversation there was one more incident. He’s good.”
The information generated by the system is being used to support performance reviews as well. “Also, on an annual basis it helps us understand who’s being safe out there in general,” Wagner said, “and we want to reward them for that.”
A facial recognition feature in the AI-enabled camera system helps identify the operator of a vehicle, Starkey said. She also cited an incident in which a vehicle struck a Mole-Master truck. After reviewing video that she sent, law enforcement found that the other driver was operating a stolen vehicle, Starkey said.
AI-Enabled Safety Alerts
Developers of AI-enabled camera systems have introduced capabilities such as in-cab collision warnings as these systems ingest growing volumes of video data that can be sorted and combined with pertinent information from engine control modules and telematics systems.
“We use machine vision and artificial intelligence to analyze behavior patterns that correlate with risks, such as distraction, fatigue, not wearing a seat belt, and so on,” said Rajesh Rudraradhya, chief technology officer at Lytx Inc.
Algorithms analyze video and vehicle data to assess risk.
“We use machine vision and artificial intelligence to analyze behavior patterns that correlate with risks,” says Lytx’s Rajesh Rudraradhya. (Lytx Inc.)
“With telematics and access to the vehicle bus, we also have rich information about the vehicle [and] with video we have rich information about the driver, the roads and situational risk,” Rudraradhya said.
The combination, he said, is “magical.”
For instance, he said, “We can use [engine control module] speed and video to determine if the driver came to a full stop. We can use engine RPM to determine if the [vehicle] is idling and wasting fuel. Combined with video, we can see the circumstances in which the vehicle was idling and try to find systemic solutions that can help reduce fuel use and emissions at scale.”
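The kind of cross-referencing Rudraradhya describes can be sketched in a few lines. This is an illustrative example only, not any vendor's actual system; the function names, signals and thresholds are all assumptions:

```python
# Hypothetical sketch: pairing ECM telematics samples with a
# camera-triggered event to classify stop behavior and idling.
# Thresholds and signal names are illustrative assumptions.

def came_to_full_stop(speeds_mph, stop_threshold=0.5):
    """True if speed dropped to ~zero during the event window."""
    return min(speeds_mph) <= stop_threshold

def is_idling(speed_mph, rpm, rpm_threshold=400):
    """Engine running while the vehicle is stationary."""
    return speed_mph < 1.0 and rpm > rpm_threshold

# ECM speed samples (mph) around a stop-sign event flagged by video:
rolling_stop = [22, 14, 6, 3, 4, 11, 19]   # speed never reaches zero
full_stop = [18, 9, 2, 0, 0, 5, 15]

print(came_to_full_stop(rolling_stop))   # False -> coachable rolling stop
print(came_to_full_stop(full_stop))      # True
print(is_idling(speed_mph=0.0, rpm=650)) # True -> idle time, wasted fuel
```

Video supplies the context the numbers lack: the same idling reading means something different at a loading dock than at a rest stop, which is what lets a fleet look for systemic fixes rather than blaming individual drivers.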
Rudraradhya noted, “There is still a human component with AI that is often forgotten.”
AI can scan through hours of video in a flash, “but we still need people who are experienced … to tune these algorithms to the multitude of scenarios in the real world,” Rudraradhya said.
A trash truck in urban New England and a longhaul truck operate in “vastly different” circumstances, he said. “It’s always going to be the combination of human intelligence and AI that creates the best solutions.”
A growing number of fleet technology vendors have been introducing safety capabilities that provide drivers with proactive safety alerts in the cab.
Samsara, for one, recently launched its Drowsiness Detection feature, which uses its AI Dash Cams to monitor criteria such as eye movement and yawning and then warns drivers when they are dangerously fatigued.
Increasingly, technology vendors are able to differentiate distracted, drowsy and fatigued driving.
Lytx’s Rudraradhya said machine vision combined with AI can be trained to detect distracted driving behaviors “such as when a driver is inattentive, using a mobile device, smoking, eating and drinking, reading papers, pulling things out of wallets, fiddling with the instrument dash, and so on.”
Using thousands of images of drivers exhibiting signs of drowsiness or falling asleep, machine vision and AI can be trained to predict when behavior is highly correlative to fatigue. Fatigue patterns are detected over a longer period of time as compared with distracted driving behaviors, which are more immediate and discrete. In both cases Lytx uses machine vision to detect risky driving behavior and provide real-time alerts to help prevent accidents.
Drowsiness is a condition that could be detected by an AI-enabled camera system, says Motive’s Abhishek Gupta. (John Sommers II for Transport Topics)
“Where AI can make a difference is in harder-to-catch situations,” said Abhishek Gupta, vice president of product for Motive. “Drowsiness has been extremely challenging as a problem for our customers and for the industry. It’s been one of the biggest causes of accidents.”
Drowsiness could be detected by an AI-enabled camera system catching an isolated indicator, such as eyes closing, however briefly, Gupta said.
Fatigue, however, is something that may be viewed as happening over “multiple hours” of driving and involving several indicators such as blinking, fluctuations in speed, lane shifting and head drooping.
“These things may not happen all at once,” Gupta said. “It’s important we look at them not as individual behaviors, but as some things we can pull together to consider as fatigue.”
Once the system reaches that determination, it could issue an alert announcing that “for the last three hours your driver has exhibited a fatigue level that’s extremely high,” Gupta said.
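The pooling of individually weak signals that Gupta describes can be illustrated with a simple weighted score. This is a hypothetical sketch of the general idea, not Motive's actual model; the indicator names, weights and alert threshold are all assumptions:

```python
# Illustrative sketch: aggregating sporadic fatigue indicators
# observed over a long driving window into one score. Weights and
# threshold are invented for demonstration purposes.
from collections import Counter

WEIGHTS = {"prolonged_blink": 3, "lane_drift": 2,
           "speed_fluctuation": 1, "head_droop": 4}

def fatigue_score(events):
    """events: indicator names logged during the monitoring window."""
    counts = Counter(events)
    return sum(WEIGHTS.get(name, 0) * n for name, n in counts.items())

def fatigue_alert(events, threshold=20):
    return fatigue_score(events) >= threshold

# Three hours of sporadic indicators -- none alarming on its own:
window = (["prolonged_blink"] * 4 + ["lane_drift"] * 3 +
          ["speed_fluctuation"] * 2)
print(fatigue_score(window))  # 20
print(fatigue_alert(window))  # True -> flag sustained high fatigue
```

The design point is the one Gupta makes: no single blink or lane drift triggers anything, but their accumulation over hours crosses a threshold that an event-by-event system would never see.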
For AI to recognize cellphone use or tailgating in snow or rain, it has to have been fed data that allows it to learn, Gupta and other tech executives said.
A 400-plus “safety team” at Motive reviews “every single video that our cameras capture” and labels what it is, Gupta said, citing “cellphone detection” as an example.
“That allows us to build our models based on that data,” he said.
Further contributing to the model-building is generative AI, which Motive and other companies use to generate data “synthetically.” The method is used “for situations that you may not have [real-world] examples of,” that need to be anticipated and accounted for, Gupta said.
Generative AI “can be helpful for us to be able to generate different situations, different datasets that we can use to train our models,” he said.
Mitigating Insurance Costs
The deployment of onboard video and AI-enabled safety alerts can also give fleet operators a way to help control their insurance costs.
Sentry Insurance offers “a nominal discount” to its trucking customers that share data from telematics systems provided by Motive, said Nick Saeger, Sentry’s assistant vice president of products and pricing for transportation.
The insurance company has safety consultants around the country who work with fleets to tailor driver training and coaching based on analysis of the telematics and camera data from their Motive systems.
“The cameras, and in particular now the AI-driven ones, that capture more data [are] used to create dashboards” showing drivers that need more coaching and those that are meeting or exceeding the fleet’s standards, Saeger said.
He added a note of caution: “Just because they have the cameras doesn’t necessarily mean they’re going to be better fleets.”
As Sentry continues analyzing data, it is getting closer to documenting a reliable indicator that could connect certain driving behaviors with the likelihood of an accident, Saeger said, “but we’re not quite there yet.”
Sentry’s director of claims, Larry Harlow, said that a frequently overlooked benefit of the AI-enabled systems is prompt notification that a crash has occurred, which speeds the arrival of claims adjusters and can help with resolutions.
Data-Driven Fleet Management
The advent of AI-enabled in-cab cameras is changing fleet management practices, observed Gary Falldin, senior director of industry solutions for Trimble.
“With all that information that’s coming off the truck, triggering videos … it gives you something to act on,” he said. “A good carrier will know what to act on and what not to act on. You can’t chase every single thing out there. You don’t have time to do that. You’ve got to prioritize.”
Falldin recommended that fleet operators start by focusing on the riskiest drivers and making sure training is in place for them.
Moving forward, AI can give fleet managers more complete information about what is happening out on the road.
Simply capturing video of a driver looking down doesn’t provide the full context of the event.
“What are they looking down at? Are they looking down at their cellphone between their legs?” Falldin asked.
In that example, the driver’s eyes are not shut, but they are looking in a different direction.
Some fleets have deployed side cameras to provide greater visibility around their vehicles, particularly in the event of a crash. (Trimble)
“That’s the difference between a fatigue and a distraction, and that’s where AI comes into play,” he said.
For distinguishing between drowsy and distracted driving, a sensor placed lower and closer to the driver, such as on the A-pillar or dashboard behind the steering wheel, provides a better angle for determining if the driver’s eyes are actually closed versus looking down and distracted, said David Julian, Netradyne chief technology officer.
The closer view also allows the system to monitor pupil dilation. As drivers become drowsier, they lose muscle control and the pupil diameter begins to oscillate more, he said.
Videos captured by onboard video safety systems can be submitted to the Federal Motor Carrier Safety Administration’s Crash Preventability Determination Program, which reviews crashes and modifies information in the agency’s Safety Measurement System to distinguish crashes that were not preventable. The program accepts applications for 21 different crash types.