Flannel shirts. Dial-up connections. "The X-Files." Nirvana and NAFTA. Baggy pants and "Seinfeld." The Bridges Of Madison County and Rent. The phrases "At the end of the day," "Generation X," and "Think outside the box."

And ... higher mathematics?

Like any decade, the 1990s had its trends, from the sartorial (backwards baseball caps) to the musical (grunge and, far more painfully, post-grunge), from the televisual (all those "Simpsons" ripoffs) to the political (Bill Clinton's "New Democrats" and Newt Gingrich's "Contract With America"). And, as in any decade, any number of difficult academic ideas also somehow wound up in the trend-blender. Ideas and theories, after all, can turn trendy as easily as anything else, and during the Clinton years, the difficult and abstruse mathematics of chaos theory found itself under scrutiny from a spate of newspaper articles, books, and even movies. The decade that gave us dumbed-down, hyped-up versions of multiculturalism and deconstruction also put chaos theory on the bestseller lists.

Alas, most Americans will probably always remember chaos theory as the thing described by Jeff Goldblum's character in Jurassic Park - something about a butterfly flapping its wings in Japan causing a hurricane in Hawaii a month later. The then-emerging theories of chaos and complexity did indeed provide author Michael Crichton with the hook on which to hang a successful 1990 novel - and provided Steven Spielberg, in turn, with his most successful movie (in 1993), second only to Titanic on the all-time-highest-grossing list. But chaos theory's real contributions are not to the movie theatres and home-entertainment centers of America (where Spielberg's dinosaur thriller is already half-forgotten), but to the unfolding human understanding of the rules by which nature is governed.

Chaos theory is a difficult and fascinating idea, growing out of mathematicians' research in many areas. As Ian Stewart points out in his 1995 study Nature's Numbers, there were three chief precursors. First, some mathematicians turned their attention from simple patterns to complex ones. Second, advances in computer technology allowed both better modeling of complex systems and quicker solutions to dynamic equations. Third, around the same time, mathematicians began using geometry, rather than numbers alone, to understand dynamic systems - systems that involve high levels of change.

Chaos theory does not mean (as one common misconception holds) that the world's a mess, you can't predict anything, and nothing happens as you expect it to (though all of these may well be arguable positions!). Rather, it has to do with the imperfection of human measurements - and, ultimately, with the concept of infinity. (Much of the following exposition, by the way, is indebted to the aforementioned Ian Stewart book, Nature's Numbers.)

For starters, let's go back to the eighteenth and nineteenth centuries - a heady period for mathematicians and scientists. After all, Isaac Newton's Principia Mathematica provided thinkers with what seemed almost a blueprint of creation - his physical theories had proven so powerful in predicting the behavior of bodies that, for some thinkers, the ability to know (in theory) nearly everything must have seemed within science's grasp. The mathematician Pierre-Simon de Laplace had argued, in 1812, that with enough knowledge, scientists would someday be able to predict the future behavior of any particle - and, by extension, any body, including a human body - simply by knowing its present position and the nature of the forces acting upon it.

The problem is that we can't actually know the position of a particle accurately enough - in fact, we never can. That can't is no overstatement, because the measurement of a particle's position can always be refined further. You can always take your measurement to another decimal place, and to know the "exact" position needed for the kind of exact prediction Laplace had in mind, you would have to know the particle's position to an infinite number of decimal places. Plainly, that isn't possible.

Any measurement that goes to fewer than an infinite number of decimal places has some inaccuracy in it - no matter how infinitesimal. And here's the part that Laplace and other nineteenth-century thinkers didn't expect: that inaccuracy doesn't stay put. If my measurements stop at, say, the hundredth decimal place, the leftover error grows and amplifies. It widens like a crack over time, so that, after a relatively short series of changes, the object is behaving in ways my earlier measurements would never have predicted.

This is true of any system and any object. Even the accuracy the greatest, most powerful computers can now attain will lose its predictive power after a few changes, as the little bit of error left over in any finite series of measurements quickly metastasizes. And that's your "butterfly effect" - like a flapping of butterfly wings that becomes a hurricane a month later, the tiniest bit of error grows and grows, until we can no longer predict, with certainty, the behavior of any particle.
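This runaway growth of tiny errors is easy to see for yourself. The short Python sketch below uses the logistic map (a textbook chaotic system, not one mentioned in this article) to follow two starting points that agree to ten decimal places - far better agreement than any real measurement could achieve - and watches the gap between them explode:

```python
# Sensitive dependence on initial conditions ("the butterfly effect"),
# illustrated with the logistic map x -> r*x*(1-x) at r = 4, where the
# map is known to behave chaotically.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two initial "measurements" differing only in the tenth decimal place.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-10)

# The initial error does not stay put: it roughly doubles at each step,
# until the two trajectories bear no resemblance to one another.
for step in (0, 10, 20, 30, 40, 50):
    print(step, abs(a[step] - b[step]))
```

Running this, the difference starts out microscopic but, within a few dozen iterations, becomes as large as the values themselves - exactly the crack-widening behavior described above. No extra decimal places of accuracy fix the problem; they only postpone it by a few steps.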
