Vehicles stopping for red lights, idling while they wait for the signal to change, and accelerating back up to speed waste fuel and add pollutants to the air. Idling vehicles waste more than 6 billion gallons of gasoline and diesel combined every year, according to Department of Energy (DOE) estimates.
Seeking a better way, the DOE last year awarded $1.89 million to researchers at the University of Tennessee-Chattanooga, the University of Pittsburgh, Georgia Institute of Technology, Oak Ridge National Laboratory and the City of Chattanooga to create a new model for traffic intersections that reduces energy consumption and improves the flow of traffic.
The goal of the project is to develop an automated traffic control system that would reduce corridor-level fuel consumption by 20% while maintaining a safe and efficient transportation environment. The researchers intend to apply artificial intelligence and machine learning to support a number of smart transportation applications, including emergency vehicle preemption, transit signal priority, and pedestrian safety.
“Our vehicles and phones have combined to make driving safer while nascent intelligent transportation systems have improved traffic congestion in some cities. The next step in their evolution is the merging of these systems through AI,” stated Aleksandar Stevanovic, director of the Pittsburgh Intelligent Transportation Systems Lab. “Creation of such a system, especially for dense urban corridors and sprawling exurbs, can greatly improve energy and sustainability impacts,” he said, noting that transportation will rely heavily on gasoline-powered vehicles for some time.
Oak Ridge National Laboratory is working on part of the problem in a project that uses overhead cameras and roadway sensors to identify gas-guzzling commercial trucks in traffic. AI and machine learning algorithms identify the least efficient vehicles, then track their path and speed in order to adjust the traffic signals ahead, reducing inefficient stopping and starting at intersections and cutting fuel consumption.
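The production system is certainly more sophisticated, but the underlying logic can be sketched simply. The snippet below is a minimal illustration of the idea, assuming hypothetical interfaces (a TrackedVehicle record and a green-extension planner) rather than anything Oak Ridge has published: vehicles carry an estimated fuel-efficiency figure, and the downstream signal is asked to hold its green just long enough for the least efficient trucks to clear without stopping.

```python
# Minimal sketch, not ORNL's system: given tracked vehicles with an estimated
# fuel-efficiency figure and a speed/position estimate, compute how much extra
# green time the next downstream signal would need so that low-efficiency trucks
# avoid a stop. TrackedVehicle and the thresholds below are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TrackedVehicle:
    vehicle_id: str
    efficiency_mpg: float        # estimated from camera/sensor classification
    distance_to_signal_m: float  # along-corridor distance to the next signal
    speed_mps: float

def seconds_to_signal(v: TrackedVehicle) -> float:
    """Rough arrival time at the signal, guarding against near-zero speeds."""
    return v.distance_to_signal_m / max(v.speed_mps, 0.1)

def plan_green_extension(vehicles, mpg_threshold=8.0, max_extension_s=10.0):
    """Extra green time (seconds) needed for low-efficiency trucks to clear."""
    arrivals = [seconds_to_signal(v) for v in vehicles
                if v.efficiency_mpg < mpg_threshold]
    if not arrivals:
        return 0.0
    return min(max(arrivals), max_extension_s)
```

In a real deployment the decision would also have to weigh cross-street demand, pedestrian phases, and coordination with neighboring signals.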
The testing is being conducted on an existing smart corridor built through a 2014 partnership between Oak Ridge National Laboratory and the Electric Power Board (EPB) of Chattanooga as part of an effort to develop new energy technologies. The corridor employs cameras, LIDAR, radar, software-defined radios, wireless communications, and air quality and audio sensors, mounted on poles along a 10-block section of Martin Luther King Boulevard in the city’s downtown. A 10 Gbps fiber network underlies the smart city testbed, enabling real-time data transmission.
Smart AI cameras are expected to transform traffic management by 2025, enabling machine vision applications such as pedestrian detection and alerting.
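As a rough illustration of the kind of machine vision task these cameras perform, the sketch below runs OpenCV's stock HOG-based pedestrian detector over a video feed. It is a simple stand-in for the proprietary neural-network models vendors actually deploy; the feed name, confidence threshold, and alert action are placeholder assumptions.

```python
# Illustrative only: a classical HOG pedestrian detector from OpenCV standing in for
# the neural-network models that commercial smart cameras run on dedicated accelerators.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def detect_pedestrians(frame):
    """Return bounding boxes of likely pedestrians in a single frame."""
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8), scale=1.05)
    # Keep only reasonably confident detections; the threshold is a guess to tune per camera.
    return [box for box, w in zip(boxes, weights) if w > 0.5]

cap = cv2.VideoCapture("intersection_feed.mp4")  # hypothetical camera feed
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    if detect_pedestrians(frame):
        print("Pedestrian detected: raise alert")  # placeholder for a real alerting hook
cap.release()
```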
More than 155,000 AI-based cameras are projected to be in use by 2025, up from 33,000 in 2020, according to the research report "Edge Analytics Cloud Use Cases in Smart Cities and Intelligent Transportation," which identifies traffic management applications including adaptive traffic lights, vehicle prioritization and preemption, parking access and detection, and electronic tolling.
Camera system revenue will grow from $46 million in 2020 to $189 million in 2025, according to Dominique Bonte, a vice president at ABI Research. “Advanced AI-capable processors featuring hardware acceleration for high performance neural net software frameworks from silicon vendors like Intel, Nvidia, and Qualcomm are propelling smart cameras into the mainstream, offering more features and flexibility at lower price points compared with legacy traffic and electronic toll collection (ETC) sensors like magnetic loops and radio frequency identification (RFID),” he stated.
A low-latency computer network is one optimized to process a high volume of data messages with minimal delay. The deployment of 5G and vehicle-to-everything (V2X) connectivity will allow low-latency analytics to move to the edge of the telco network, variously referred to as the edge cloud, network cloud, multiaccess edge computing (MEC), or distributed cloud, enabling a new range of application categories across larger geographical areas.
These will include:
road intersection management: cooperative adaptive traffic lights and remote traffic management;
safety and security operations: crowdsourced hazard and security alerts and remotely controlled response management systems installed on light poles, buildings and other street furniture; and
autonomous asset management: remote control and operation of driverless vehicles, drones and robots.
“In most cases the edge cloud will not replace the roadside edge but rather complement and enhance local safety and security systems into more aggregated, collective, cooperative, and holistic solutions including feeding urban digital twins with actionable local intelligence,” stated Bonte.
Texas A&M Team Using Deep Neural Network for Signal Controller
Researchers at Texas A&M University are applying reinforcement learning to traffic management. The team uses learning algorithms that reward favorable outcomes to optimize the signal controller, training it to make decisions that improve operations, in this case by reducing the buildup of traffic delays.
The model uses a deep neural network (DNN), a class of machine learning algorithms that tend to be unpredictable and inconsistent in their decision-making, which makes them challenging to work with, said Guni Sharon, a professor in the Department of Computer Science and Engineering at Texas A&M, in an account from Futurity. To overcome this, Sharon and his team defined and validated an approach that can successfully train a DNN in real time while transferring what it has learned from observing the real world to a different control function that can be better understood and regulated by engineers.
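To make the general reinforcement-learning setup concrete, the toy sketch below (not the Texas A&M team's code) trains a bare-bones Q-network on a crude two-approach intersection simulation: the state is the pair of queue lengths, the action is which approach gets the green, and the reward penalizes total queue length as a rough proxy for delay. It omits the replay buffers, target networks, richer detector state, and the transfer to an interpretable controller described above.

```python
# Toy illustration of reinforcement learning for signal control, not the published model:
# a small Q-network learns which of two approaches to serve, rewarded for short queues.
import random
import torch
import torch.nn as nn

class QNet(nn.Module):
    def __init__(self, n_state=2, n_action=2):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_state, 32), nn.ReLU(), nn.Linear(32, n_action))

    def forward(self, x):
        return self.net(x)

def step(queues, action, arrival_prob=0.4, service_rate=2):
    """Crude queue dynamics: cars may arrive on either approach; the served approach discharges."""
    queues = [q + (1 if random.random() < arrival_prob else 0) for q in queues]
    queues[action] = max(0, queues[action] - service_rate)
    return queues, -sum(queues)   # shorter total queue -> higher reward

qnet = QNet()
opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)
queues, gamma, eps = [0, 0], 0.95, 0.1

for t in range(5000):
    state = torch.tensor(queues, dtype=torch.float32)
    action = random.randrange(2) if random.random() < eps else int(qnet(state).argmax())
    queues, reward = step(queues, action)
    next_state = torch.tensor(queues, dtype=torch.float32)
    with torch.no_grad():
        target = reward + gamma * qnet(next_state).max()
    loss = (qnet(state)[action] - target) ** 2     # one-step temporal-difference error
    opt.zero_grad()
    loss.backward()
    opt.step()
```

A deployed controller would learn from much richer state (detector occupancy, phase timers, pedestrian calls) and, as the researchers describe, its learned behavior would be transferred into a control function that traffic engineers can inspect and regulate.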
Using a simulation of a real intersection, the team found that their approach was effective for optimizing their interpretable controller, resulting in up to a 19.4% reduction in vehicle delay in comparison to commonly deployed signal controllers. The researchers said it took about two days for the controller to understand what actions help to mitigate traffic congestion.
“Our future work will examine techniques for jump starting the controller’s learning process by observing the operation of a currently deployed controller while guaranteeing a baseline level of performance and learning from that,” Sharon stated.