Entrepreneurial Leadership and Management . . . and Other Stuff

Modeling a “Jamiton” – The Mathematics of a Traffic Jam

My good friend Dave just sent me an article from Wired magazine, “MIT Hopes to Exorcise ‘Phantom’ Traffic Jams,” about research going on at MIT into mathematically modeling randomly occurring traffic jams to discover their source as well as potential remedies. Since I’m not a mathematician, I developed a somewhat less scientific theory as to why such traffic jams happen: morons and ignorant motorists who shouldn’t have licenses. If people paid more attention, the chance of randomly occurring traffic jams would be greatly reduced. Read the signs: “Slower Traffic Keep Right.” But if you’re interested in a more rational theory, check out the article.
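
For the curious, here is a rough sense of what “mathematically modeling a phantom jam” can look like. The sketch below is my own toy illustration in the spirit of the classic Nagel-Schreckenberg cellular-automaton traffic model, not the MIT group’s actual math: cars circulate on a single-lane ring road, nothing ever blocks the lane, and each driver merely taps the brakes at random every so often. At a busy enough density, those tiny slowdowns compound into a stop-and-go wave all on their own.

```python
import random

# A toy single-lane "ring road" in the spirit of the classic
# Nagel-Schreckenberg cellular automaton -- my own illustration, not the
# MIT group's jamiton math. Nothing ever blocks the road: each driver just
# taps the brakes at random now and then, and at this density that alone
# is enough to breed a persistent stop-and-go wave.

ROAD_LEN = 200    # road cells (at most one car per cell)
N_CARS   = 50     # 25% occupancy: busy, but nowhere near bumper to bumper
V_MAX    = 5      # top speed, in cells per time step
P_BRAKE  = 0.15   # chance a driver dawdles on any given step

def step(pos, vel):
    """Advance every car one time step (parallel update) and return the new state."""
    occupied = set(pos)
    new_pos, new_vel = [], []
    for x, v in zip(pos, vel):
        v = min(v + 1, V_MAX)                     # accelerate toward the limit
        gap = 1                                   # distance to the car ahead, capped below
        while (x + gap) % ROAD_LEN not in occupied and gap <= V_MAX:
            gap += 1
        v = min(v, gap - 1)                       # never run into the car ahead
        if v > 0 and random.random() < P_BRAKE:   # random, pointless braking
            v -= 1
        new_pos.append((x + v) % ROAD_LEN)
        new_vel.append(v)
    return new_pos, new_vel

# Start evenly spaced and moving, then watch the phantom jam appear on its own.
pos = [i * (ROAD_LEN // N_CARS) for i in range(N_CARS)]
vel = [V_MAX] * N_CARS

for t in range(1, 301):
    pos, vel = step(pos, vel)
    if t == 1 or t % 50 == 0:
        stopped = sum(1 for v in vel if v == 0)
        print(f"t={t:3d}  avg speed={sum(vel) / N_CARS:.2f}  cars at a dead stop={stopped}")
```

Run it and stopped cars soon start showing up, and keep showing up, even though there is never an accident or an on-ramp; that traveling knot of stopped cars is roughly the kind of “jamiton” the MIT researchers are talking about.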

June 23rd, 2009 | Will | Stuff with a Motor

2 Responses to Modeling a “Jamiton” – The Mathematics of a Traffic Jam

  1. I first observed this phenomenon 25+ years ago when commuting from Palo Alto to South San Francisco up Highway 101. I read more about this after the ’84 Olympics in LA failed to generate horrendous traffic jams; instead, traffic for the 10 days was actually better.

    In the first case, the vestiges of an earlier event during the commute hour could be seen an hour later, i.e., you would be stuck in traffic, come to a 15 MPH slowdown, then accelerate, and there was nothing there, nothing. I figured that if I could get 3 other friends to drive 4 abreast with me down 101, slow down to 10 MPH (in a safe manner, of course), then quickly accelerate back up to highway speed, we could get off the highway, double back, park, and watch the phantom traffic jam for an hour.

    The theory that was developed after the Olympics was essentially this: each lane has a traffic capacity (I want to say 800-1,200 cars/hour) within which cars all move smoothly. Once you approach the high end of this range, the slightest disturbance causes a cascade of overreaction and the system grinds to a near halt; until the traffic flow drops below some critical value and the blockage clears up, the slowed traffic will remain in that state indefinitely. (During the Olympics, many employers staggered their start hours, which dropped peak traffic levels down to the low-to-mid range of the flow capacity, so traffic jams never got triggered.) There is a rough simulation sketch of this threshold effect after the comments.

    They say this behaviour closely mimics fluid flow and the transition from laminar to turbulent flow. So this would mean the average driver is no smarter than a water molecule. Hmmm, Will, I guess your theory is in fact correct.

    John


  2. Yes, I saw this on Wired too. Actually, the theory has been around for 10 years (or more). Looks like a guy from Seattle got the idea moving (no pun intended). Also, I believe the UK Highways Agency was developing models for this long ago.

    Here’s an article from 2000 on it. Back when I was managing geo-data for the M25 (the busiest motorway in Europe), they had just installed a new, enforced variable speed limit in the most congested sections. As the congestion increased, the speed limit would decrease, and if you went over it, you would get a ticket from one of the many speed cameras in the overhead gantries. The difference before and after was marked: despite the lower speeds, traffic flows a lot more consistently.

    It used the wave principle as its basis: as speeds increase in heavy traffic, the severity of the halts in flow increases dramatically. By bringing the speed down, the halts are less likely to happen and you get from A to B faster. (A rough simulation sketch of this effect also appears after the comments.)

    Of course, this would never catch on in Boston for a number of reasons, notably the pitiful standard of driving required to get a driver’s license here. Then, of course, there are the privacy issues with speed cameras that would never fly here; despite the best efforts of Brits to blow up, burn down, or knock down the cameras over there, the cameras are there to stay.

    http://news.bbc.co.uk/2/hi/uk_news/1017242.stm

    I suggest the smart kids at MIT use Google 🙂

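
For anyone who wants to poke at the two effects described in the comments above, here is one more rough, self-contained sketch using the same toy ring-road model as in the post. It is an illustration only, not the MIT model or the Highways Agency’s actual control scheme: it runs the automaton three ways (light traffic at a high speed limit, heavy traffic at a high limit, and the same heavy traffic with an enforced lower limit) and reports average speed plus how often cars sit at a dead stop. The densities and limits are arbitrary picks; the 800-1,200 cars/hour figure and the M25’s real thresholds are not modeled.

```python
import random

# The same toy ring-road automaton as in the post above, run three ways to
# illustrate the two comments: (1) flow stays smooth until density crosses a
# threshold, after which dead stops appear (the capacity / staggered-start
# point), and (2) at a heavy density, an enforced lower speed limit removes
# most of the dead stops (the M25 variable-limit point). The densities and
# limits are arbitrary picks for illustration -- this is not the MIT model
# or the Highways Agency's actual control scheme.

random.seed(1)                                    # fixed seed so the comparison repeats
ROAD_LEN, P_BRAKE, STEPS, WARMUP = 200, 0.15, 600, 100

def run(n_cars, v_max):
    """Return (mean speed, mean share of cars at a dead stop) after warm-up."""
    pos = [i * (ROAD_LEN // n_cars) for i in range(n_cars)]
    vel = [0] * n_cars
    speed = stopped = 0.0
    for t in range(STEPS):
        occupied = set(pos)
        nxt_pos, nxt_vel = [], []
        for x, v in zip(pos, vel):
            v = min(v + 1, v_max)                     # accelerate toward the limit
            gap = 1                                   # distance to the car ahead, capped below
            while (x + gap) % ROAD_LEN not in occupied and gap <= v_max:
                gap += 1
            v = min(v, gap - 1)                       # never run into the car ahead
            if v > 0 and random.random() < P_BRAKE:   # random dawdling
                v -= 1
            nxt_pos.append((x + v) % ROAD_LEN)
            nxt_vel.append(v)
        pos, vel = nxt_pos, nxt_vel
        if t >= WARMUP:
            speed += sum(vel) / n_cars
            stopped += sum(1 for v in vel if v == 0) / n_cars
    return speed / (STEPS - WARMUP), stopped / (STEPS - WARMUP)

for label, n_cars, v_max in [
    ("light traffic, high limit  ", 20, 5),   # below the threshold: smooth sailing
    ("heavy traffic, high limit  ", 50, 5),   # above it: persistent stop-and-go
    ("heavy traffic, capped limit", 50, 2),   # same density, enforced lower limit
]:
    speed, halted = run(n_cars, v_max)
    print(f"{label}  mean speed={speed:.2f}  cars at a standstill={halted:.1%}")
```

In this crude model the capped run gives up some top speed but should show far fewer cars at a standstill and a much steadier flow. It is far too simple to reproduce the travel-time gains reported for the M25 scheme, but the staggered-start point from the Olympics story and the “slower but smoother” point behind the variable limit should both come through.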