To address a problem inherent in American life, the city of Chattanooga went looking for a smart, innovative approach to congestion. It found one in a supercomputer that powers a digital twin: a model that monitors traffic throughout the city and collects data from a variety of sources, helping officials understand the causes of congestion and run a smoother, more sustainable traffic system.
Traffic congestion is a familiar sight in growing urban areas around the world. It appears to be an inevitable consequence of modern living patterns and a standing challenge for public policy. While a definitive solution remains out of reach, there are several ideas and approaches for mitigating it.
The phenomenon is, of course, shaped by many factors, and its severity varies from country to country; unsurprisingly, the problem is more acute in large countries. In the United States, the average citizen is estimated to spend 42 hours a year stuck in traffic jams, and according to the Environmental Protection Agency, the transportation system accounts for 26% of total energy consumption and 29% of greenhouse gas emissions.
Many cities suffer from traffic jams, especially those located where highways from two or more states intersect. Chattanooga is a case in point: a small city with mountainous terrain and a modest population of only 180,000, it nonetheless ranks among the 20 most congested cities in the country, ahead of cities far larger in area and population. What sets Chattanooga apart is that it has more than 350 intersections and sits on the seventh-largest freight corridor in the country. And these American freight corridors see intense traffic: 11 billion tons of cargo pass through them annually, representing $32 billion in business activity every day.
The city lies on the border between Georgia and Tennessee and belongs administratively to the latter, a state that loses $1.1 billion each year to traffic congestion, with residents wasting about 11.5 billion liters of fuel and 8.8 billion hours of productivity.
In search of a way to ease traffic congestion, Oak Ridge National Laboratory and the National Renewable Energy Laboratory joined forces, drawing on funding provided jointly by the Federal Highway Administration and the Department of Transportation. This exceptional partnership, together with the region's burgeoning technology sector, helped pave the way for one of the largest smart intersection networks in the country.
The project uses a supercomputer called Eagle, the latest system developed by teams at the National Renewable Energy Laboratory. It can perform 8 million billion (8 × 10^15) calculations per second against previously collected data, turning that data into cumulative knowledge that supports real-time conclusions, something that was not possible just five years ago.
Work started in 2018, when the team began studying local traffic data drawn from some 500 sources, including Department of Energy databases, satellite imagery, GPS, automated cameras, radar, weather stations, and records of vehicle locations, and even observations from residents and traffic police.
Based on this data, scientists created a digital twin of the city's traffic: an intelligent computer model that simulates what is happening on the ground and is capable of machine learning, a form of artificial intelligence that lets a system learn from data rather than from explicit programming. In practice, this means developing algorithms and statistical models with which computer systems make decisions and perform tasks by inferring from previous examples, without explicit instructions for each individual case.
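As a rough illustration of that idea (a minimal sketch, not the laboratories' actual model), the snippet below fits a model that learns the relationship between traffic conditions and intersection delay from example data. Every feature, number, and the model choice here are hypothetical.

```python
# Minimal sketch of "learning from data rather than explicit programming":
# the mapping from traffic conditions to intersection delay is inferred
# from past observations, not hand-written as rules. All features and
# numbers are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic history: [vehicles/min, hour of day, raining (0/1)] -> delay (s)
X = np.column_stack([
    rng.uniform(5, 60, 1000),    # vehicle arrivals per minute
    rng.integers(0, 24, 1000),   # hour of day
    rng.integers(0, 2, 1000),    # raining or not
])
y = 0.8 * X[:, 0] + 5.0 * X[:, 2] + rng.normal(0, 3, 1000)  # toy ground truth

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)  # no per-case instructions: the pattern comes from the data

# Ask about a rainy rush-hour scenario the model never saw verbatim.
print(model.predict([[45.0, 17.0, 1.0]]))
```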
This system gives the team a comprehensive, real-time view of traffic in all its details. The digital twin is also used to measure the impact of proposed strategies and schemes: modifications are introduced virtually into the electronic model, which then produces results that closely simulate what would happen if they were applied on the ground.
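Conceptually, this "what-if" use of the twin amounts to replaying the same traffic against a baseline model and a modified one and comparing the outcomes. The toy simulator below is a hypothetical sketch of that workflow, not the Chattanooga model: it compares the average vehicle wait at a single signal under two timing plans, with all parameters invented.

```python
# Hypothetical "what-if" experiment on a toy digital twin: the same arrival
# pattern is replayed against two signal-timing plans and the resulting
# average waits are compared. All parameters are illustrative.
import random

def simulate(green_s, cycle_s, arrivals_per_s=0.3, horizon_s=3600, seed=42):
    rng = random.Random(seed)
    queue, total_wait, served = 0, 0.0, 0
    for t in range(horizon_s):
        if rng.random() < arrivals_per_s:   # one-lane Bernoulli arrivals
            queue += 1
        green = (t % cycle_s) < green_s     # fixed-cycle signal phase
        if green and queue > 0:
            queue -= 1                      # serve one vehicle per second
            served += 1
        total_wait += queue                 # queued vehicles accrue wait
    return total_wait / max(served, 1)

baseline = simulate(green_s=20, cycle_s=60)  # current timing plan
proposal = simulate(green_s=30, cycle_s=60)  # proposed retiming
print(f"avg wait: baseline {baseline:.1f}s, proposal {proposal:.1f}s")
```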
Here, problematic traffic patterns come into view, and the team analyzes them to find the causes of the jams; one example was the layout of traffic lights on a secondary road leading to the city center and the lack of coordination between them. That case was solved by going back to the traffic light controllers and finding the equation that best regulates the timing of the lights, a form of coordination sketched below.
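A classic instance of such coordination is a "green wave": each signal's green phase starts later than the previous one's by the travel time between them (modulo the shared cycle length), so a platoon moving at the design speed meets green at every intersection. The sketch below computes those offsets for hypothetical spacings and speeds; it illustrates the general technique, not the actual equation used by Chattanooga's controllers.

```python
# Hypothetical "green wave" offsets: each signal's green is delayed by the
# travel time from the start of the corridor, modulo the shared cycle, so a
# platoon at the design speed arrives on green. Distances, speed, and cycle
# length are invented for illustration.
CYCLE_S = 90             # shared cycle length for all signals (seconds)
SPEED_MS = 13.4          # design speed, ~48 km/h, in meters per second

distances_m = [0, 400, 650, 900]   # signal positions along the corridor

offsets = [(d / SPEED_MS) % CYCLE_S for d in distances_m]
for i, (d, off) in enumerate(zip(distances_m, offsets)):
    print(f"signal {i} at {d:4d} m: green starts at t = {off:5.1f} s")
```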
Perhaps the most prominent challenge facing machine learning technology is that it cannot determine the cause of the problem or phenomenon it observes, but it does provide data accurate enough to help experts close that gap themselves. Once that happens, a simple change can have a large impact: the traffic light retiming described above cut the street's energy consumption by 16%, and the team is studying how to apply the technique to other streets to reach 35%, a goal it expects to meet over the next two years.
The project's objectives are varied. According to estimates, a successful trial will help cut the transport sector's fuel consumption by approximately 20% and will also reduce car emissions across the United States.
Moreover, lifting the burden of waiting in traffic will raise citizens' productivity, which is expected to bring $100 billion into the U.S. treasury over the next decade. All of this depends on the success of the Chattanooga experiment, in preparation for deploying the system in cities and along regional corridors across the country.
References:
https://www.ornl.gov/news/clearing-congestion-ornl-computing-researchers-help-unclog-traffic-jams
https://www.weforum.org/agenda/2021/05/supercomputer-reduces-traffic-jams-cities/