"The Physics of Happening"

 

(i.e., decoding the "eventfulness" of change... reading natural system organizational changes directly from their shifting continuities)

 

What's so different about my methods is that I take the "naive" view of nature, which I tested and apparently found sound: that modern science made a great mistake in accepting the view of Niels Bohr and Heisenberg, among others, that science can only study its information, and so must treat information as the nature of physical reality.  Science studies nature as limited by the information we have, and by the practice of representing nature as determined by theory, when it has always been the exact opposite.  That's been very successful in helping science focus on what's definable, but of course it then excludes the indefinable, but highly organized, natural processes from study.  That probably sounds strange.  If you think about it, though, what humans excel in is learning to work with complex systems in contexts we can't define, but can read the signals of and respond to.  It's how we relate to each other.  So it also offers another way to work with the world, especially when we find the problem sets indefinable.

 

The main subject here is finding and confirming contextual signals of system change, helped by algorithms tuned to recognize the dynamic signals that distinguish system change from fluctuation (for example).  For more general introductions, see the updated Reading Nature's Signals journal.  The ten or twenty-year-old notes and reports below are on the technical methods found to work, centered on the routine called "derivative reconstruction."  The aim is to recognize the flows in time-series data that tell you what kind of system is producing it, a diagnostic approach, rather than trying to represent ever-changing systems with models.  That work was most detailed in the papers on:

 

1.      Reconstructing the Physical Continuity of Events (1995)

2.      Features of derivative continuity in shape (1999)

3.      Flowing processes in a punctuated species change, G. plesiotumida to G. tumida, displaying feedback-driven evolution (2007)

______________________________________

 

The work archived here, mostly from the 90's and based on fundamental research in the 70's and 80's, seems to answer a question I had when doing my junior and senior physics labs in college: "Why does it seem every run of every experiment somewhat misbehaves?"  The answer appears to be that it's because every chain of events needs to involve the explosive local self-organization of the energy-using processes doing it, as the way nature turns "the switch" on and off.   It reframes the equations of science as describing boundary conditions for natural processes that all individually find their own paths of continuity in order to develop.   That's then "what you see happening" to create the vast proliferation of organically developing "S" curves at seemingly every natural transition.

 

I'm not sure what prompted me to mentally experiment with reversing the assumptions of modern science that way, interpreting our information as only helping us understand the limits of natural phenomena.   The "naive" view that the phenomena of the natural world work by themselves, and not by our theories, did open doors, helping me to get a whole lot further in investigating my subjects of interest.   I had by the 70's expanded my general question beyond "what makes things misbehave" to "what makes life lively" and studying all sorts of lively phenomena.   I found more leads assuming that nature works by means other than our information.

So, to unlearn the view that nature is made of theory we'd need to teach ourselves to "go back" to our naive point of view, as the door to discovering the more organic processes of nature behind the data we collect, which are really so important to us.   To discover the natural systems behind our data what I study are the changing continuities in the data, implying flowing organizational change in the systems producing the data.  I use equations sometimes as boundary conditions, but not to study nature.

The equations one might fit to the data only serve to reduce the complex reorganization of systems in nature to just one single organization, invented by a scientist.  So they serve to erase all information about the system of nature producing the data.   Equations then just don't prompt questions about the multi-scale organizational changes occurring in the systems being studied.

The basic lens used here to help expose the good questions to ask is a "cipher", composed of mirror "S" curves  (¸¸¸.·´ ¯ `·.¸¸¸).  It expresses the absolute "continuity of change over time" as being composed of a four part succession of continuous developmental phases, from start to finish, and the transient "changes in the directions of change" between them.  Those transient changes are implicitly emerging from other scales of behavior, to initiate and terminate each period of progressive change, which the physics theory requires and observation seems to strongly confirm is how physical organization in natural systems generally develops.
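As a rough illustration only (my own sketch, using a logistic curve as a convenient stand-in for one "S" shape; the cipher itself is not an equation), the four phases and the transients between them can be read off the signs of a curve's derivatives:

import numpy as np

# A hedged sketch: a logistic curve standing in for one "S" curve.
# Growth first accelerates (second derivative > 0), then decelerates
# (second derivative < 0); the mirrored decay "S" repeats the pair in
# reverse, giving the four-part succession of developmental phases.
t = np.linspace(-6, 6, 601)
s = 1.0 / (1.0 + np.exp(-t))     # one "S" curve of progressive change
ds = np.gradient(s, t)           # rate of change
dds = np.gradient(ds, t)         # the "change in the direction of change"

inflection = t[np.argmax(ds)]    # transient where acceleration flips sign
print(f"growth accelerates until t = {inflection:.2f}, then decelerates")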

The key, though, is what I first mentioned.  We interpret this model in a "new" old way, not as what nature is doing, but as the boundary conditions we can deduce from our data, which what nature is doing has to fit into somehow, the naive ontology that "where you see events taking place is right where they're happening".  The challenge is to find "what's happening", not quite so easy to put in a phrase.  The physics theory is derived from the conservation of energy and described in the chapters in the story of events and in lots of good short essays on all kinds of subjects on my blog "Reading Nature's Signals".    jlh

 


Table of Contents... Related Subjects... Introduction...
Ongoing Studies by J.L. Henshaw 

     To learn to think like nature, first learn to watch nature think.
Always under construction.  Experiments in progress.  Some time I'll read the whole thing and fix everything that's out of date!


The General Method In Brief:     ...an accumulation of different experiments and models...  Since this work was done very independently, mostly in the 80's and 90's, raising some quite different kinds of questions than the community using the terms "emergence", "complexity" and "self-organization" for the same general subject, the reader needs to recognize that their familiar terms may be being used differently here, as well as the likelihood that a lot of updating may be needed considering how long ago these pages were done... 6/16/06 10/13

Physics of Conserved Change - Emergence, "little bangs" & "big booms" 07/08
Key Principles 12/07 and an overview Observing Systems - 6/18/06
Key Shapes - the Shapes that Connect [4 cascades],
Inflation, Integration, Disintegration & Decay - the mechanisms
The General Idea - recognizing complex processes from shapes of change
How Derivative Reconstruction really works - a complete reverse test of the shape reconstruction method
Absolute Growth Scale - measure stages of development in their own evolutionary scale 10/06
Discoveries and Results - some fairly solid and surprising findings in the curves
A paradox - why nature's rules have no location or way to be followed

Archived introductions to the new scientific methods I developed for studying self-organization in nature, mostly 2008 and earlier views, that previously were at the top of this "Physics of Happening" page

Theory & Applications: a series of short treatments with links to other figures and discussion 

Introduction - the method of time series shape study

Theory - laws and measures of physical continuity
Tools - how it's done
Stat Technique - technical details & measures of confidence
 

Applications - various sample studies

Related Subjects: 

Publications Articles, Letters & Web Archives
Others - Who thinks like I do? Influences, Reading Lists & Fields of Interest
time-series data source - Rob Hyndman

·         Common sense, Key Methods, Physics Theory, Systems Thinking Principles 12/07
·         the Basics of Steering -
·         the Principal Principle of Cybernetics - pump it up... just enough
·         What Approximation Leaves Out - Proposed For NECSI talk 10/06
·         Why we're all mostly out of the loop 
·         Page History & 9/06 Current View  View from 06 -
·         Models, Emergence & Complexity View from 06
·         Experimental Outline of a new Science - Physics of natural time (a bit old)
·         A Letter in the Science Times  on mathematical truth & beauty
·         Evolving Air Currents  The experimental origin - and all time favorite page
·         These Pages... Until recently were called 'Derivative Reconstruction'...
·         Notes & clippings... Some comments on related issues

The general idea of the research method... observing change

ed Apr 00
Most of these pages are about a careful way of observing change over time, using hard data and soft mathematical tools, to develop a sensitive guide for studying the connecting flows of change of all kinds.  Changes in the flow of events reflect evolving organizational change in the natural systems that produce it.   The tools are fairly conventional scientific tools, guided by conventional scientific interpretations of events, but the object is to look beyond the measure to study complex organizational processes, the inner workings of nature's deft transformations.

What results are some surprising discoveries, and a greater appreciation of the emergence and layering of patterns of change, and of causation in general.  It is probable that what you'll find here will not live up to its promise, at least not until you are able to use it with the kinds of change you are most interested in and most familiar with.  Much of the focus here is also on mathematical methods and problems.  More on how to use the approach might be helpful, but it is given somewhat less attention.     Some of the techniques involve new areas of mathematics (empirical measures of derivative continuity, the paradox of indescribable simple pattern in the beginning and ending of events) and contribute to new methods in computer vision (defining landmarks of organizational change), as well as providing a new empirical method of inquiry.   The method also involves a new point of view, a super-realism.   Why 'super'-realism?   It involves watching for, and draws you into, the deep inner workings of particular individual events.    There's lots there.

For example, one might ask why intricate patterns develop in fluid flow, say watching cream in your coffee.  Even if we can see various scales of pattern, can name a cause such as heat or drag, and make some predictions, there's still nothing but the kinetic interaction of molecules to produce any complex pattern.  Somehow the molecules interact to build complex evolving pattern.   There are no other communicating forces, no imposed guides, no embedded memories, only accumulative original pattern development in molecular motion, with every part participating independently.

This is a common trait of nature's confusing but highly organized behaviors: they happen all at once, with unified orchestration but no central point of control.   So much is happening at the same time, and so smoothly going through changes, one has to watch it very, very slowly....   A one dimensional trace, say temperatures at just one point, may provide particularly good information on when and at what rates organizational transitions develop.   That may point to exactly when, where and what kind of thing to look for to find the mechanism that develops to do it.   The same is true for tracing the inner workings of your business market, looking for particular kinds of turning points and then looking more closely for what each one is and where it might go.

Measures of change over time directly reflect, but poorly describe, the complex accumulations of events involved in any process of change.  They're not the answer, they're a guide you can use for looking beyond.  The common indicator of local organizational change in the inner workings of a system is a growth curve, reflecting far more complex events, but accurately displaying their timing.   These are typically represented by the 'S' curves, the connecting shapes that are found in virtually every transition.   Their shape draws a smooth curve between the static measures, as the underlying organizational process develops a smooth change between systems of behavior.   After identifying a growth curve a few simple checks usually identify what process it is reflecting and at least hint at some of its distributed system of interactions.

Beyond the effort to test scientific ideas, what drives the inquiry is the wonderful depth of unique detail that can be found in individual things and events of all kinds.    Nature is indescribably deep.   Individual events  of all sorts display a fantastic variety of local organization at many scales.    When you look at them in detail, it is obvious why any mathematical description needs to include uncertainty, there's simply too much there, too much going on to describe.   Patterns that can't be understood are called random, and in our data much of the intricacy of nature appears random.  Data is such a very poor story telling device.   The solution is not to abandon data, but to use it differently, i.e. not as the description of what's happening but as the map or guide for your own exploration of a territory of understanding the map can't directly provide.

ed Jan 99
The particular mathematical aspect of natural complexity focused on here is the continuity, or flow, of change.   Much of the work concerns a disciplined analytical tool, derivative reconstruction (DR).  It's not magic, but a tool which allows you to discover much more information about the history and progression of change than you would first expect to be available.   It's an extremely general technique, applicable to any subject of change over time whatever.   Change in anything usually progresses in a self-consistent manner, developmentally, displaying the presence and accumulative modification of coherent systems of relationships.   By very sensitively representing smooth underlying flows of change, DR helps make the details of these patterns more visible.

The property of infinitely smooth progression in mathematics is called 'derivative continuity'.   It is one of the most fundamental and useful properties of mathematical functions.  Physical measures of change often display a similar physical property, the organizational continuity of nature; call it flow.  DR uses the mathematical definition of the derivative in reverse to represent dynamic organizational change reflected in measures of physical systems.  It doesn't always work, but often does.  When it does it often exposes otherwise invisible natural structures and the links between them.
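As a minimal sketch of what looking for that property can mean in practice (my own toy example, not the DR routines themselves), one can take successive numerical derivatives of a series and ask whether they still vary smoothly:

import numpy as np

def numerical_derivatives(t, y, order=3):
    """Successive numerical derivatives of a time series, rough empirical
    stand-ins for derivative continuity (a sketch, not the DR library)."""
    out = [np.asarray(y, dtype=float)]
    for _ in range(order):
        out.append(np.gradient(out[-1], t))
    return out

# A series reflecting an underlying flow keeps its higher derivatives
# coherent; pure noise makes them erratic at every scale.
t = np.linspace(0.0, 10.0, 200)
y = np.tanh(t - 5.0)             # a smooth underlying transition
y0, y1, y2, y3 = numerical_derivatives(t, y)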

Recent investigation indicates that there is a distinct similarity between the primitive elements of DR and those being rapidly explored and developed in the fields of computer vision, artificial intelligence and parallel distributed processing. In the terms of those fields DR is a form of curve generalization, that treats randomly sampled data as compressed information about multi-scale continuities.

Who else thinks like I do?... others.    Back to contents

............

Reconstructing the flow of individual events

DR is a method of reconstructing the shapes of physical processes directly from their measures, without theories and equations. It uses the idea of derivative continuity, a structural property of mathematical functions. With derivative continuity, change in location takes a period of velocity, and change in velocity a period of acceleration, etc. It all takes time, forming a seamless flow of rates of change without any gaps. In the derivatives of an ordinary equation near any point there is information about the entire future and past of the curve. The physical systems underlying flows of nature are more complex and changeable, the information about them usually less complete and less accurate, and the range over which distributed information about the curve can be found varies a great deal. The point, though, is that the distributed information in a series of measures about the flows of a physical system is of the same kind as that of a continuous mathematical curve, and that you can find much more when you look for it than when you don't.

Sampling a smooth but complex flow of change may produce a pattern of points that appears randomly scattered. If you have no information about the underlying process then you can treat the data as a form of compressed information about a flow. You might have some other reason to expect an underlying continuity, or simply want to see if recognizable forms appear when it is looked at as if it had continuity. Using the implied progression of the derivatives to reconstruct some details of what went on in-between the data points is then a method of data decompression. When it works, it displays previously unseen fine dynamic structures directly, without requiring any preconceived theory of behavior. It also produces a new kind of mathematical entity called a 'proportional walk', having the mathematical property of derivative continuity, but no equation, composed instead of a finite sequence of points and a parsimonious rule for how to connect them with continuous derivatives.
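A loose analogy, not the unpublished routine itself (my sketch, assuming scipy is available): a cubic spline also connects a finite point sequence with continuous derivatives, and shows the sense in which such interpolation "decompresses" sparse samples of a smooth flow:

import numpy as np
from scipy.interpolate import CubicSpline

# Sparse samples assumed to lie on a smooth underlying flow
t = np.array([0.0, 1.0, 2.5, 3.0, 4.5, 6.0])
y = np.sin(t) + 0.5 * t

# A C2 spline: a finite point sequence plus a rule for connecting it
# with continuous derivatives, reconstructing detail between samples.
flow = CubicSpline(t, y)
t_fine = np.linspace(t[0], t[-1], 200)
y_fine = flow(t_fine)            # the "decompressed" curve
y_accel = flow(t_fine, 2)        # its continuous second derivative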

Throughout all fields of scientific research there is a tendency for investigators to see whatever it is they are looking for. In a way that is both more and less of a hazard with derivative reconstruction. The curves produced are more strongly data driven than perhaps any other method, so the likelihood that they represent some real feature of the data is quite high. The problem lies in interpretation, generally requiring a thorough immersion in the detail of the subject, to know what the data represents and what assumptions about the different features of its shape can be made in the process.

There is also a philosophical problem. Reconstructing and studying the individual dynamic flows of change raises a question concerning whether to view nature with either a deterministic or probabilistic model of events. Neither seems appropriate. Individual processes are not probabilistic, since that paradigm applies only to classes of events. Flows of individual events are also not even remotely deterministic. You rarely find any fixed patterns. At the beginning and end of any apparently deterministic pattern (because they all begin and end somewhere) there is an indeterminate change in pattern. To make a possibly long story short, the inevitable conclusion is that individual events need to be seen as something else, call it 'opportunistic', displaying processes of discovery rather than of following rules. How individual events actually come to have the unique and intricate structures they display is clearly not by humanistic volition and intention etc., and it is not that sense of their being 'opportunistic' that is meant. It is only that new behavior begins to develop when, and only when, the appropriate circumstances arise, and then develops in a manner displaying continuity. A theoretical study shows that this implies that change begins with a very specific and recognizable mathematical shape.

The computation routines used (see Tools & StatTechnique) were developed in AutoLISP, the programming language of AutoCAD, and run on a PC. When proper care is taken the immediate usefulness is that the method tends to uncover evidence of surprising patterns of change. In long studied data these newly appearing patterns are often different from what others have found. For example DR offers a direct means of identifying dynamic coupling between complex systems, without having any behavioral theory for either one. (warming3.gif, econcync.gif). Sometimes this brings into question strongly held notions about very well studied processes.

(to Applications ) back to contents


Theory top 

 - Emergence and the Law of "little bangs" 

Equations are usually made to fit many sets of data at once, to represent idealized behavioral structures. Here the proposal is to use constructed curves fitting the detailed shape of individual sets of data to represent the unique structures of individual events. It is used where it can be reasonably assumed or concluded that the data reflects a continuous physical process which can be treated as having derivative continuity.  The more general principles of organizational continuity follow from general observation and a physics of conserved change.   The critical point of physics is a straightforward derivation from the conservation laws, assuming only that rates of energy transfer must be finite.   It demonstrates that all physical change must satisfy a principle of continuity & divergence.   It seems to imply that the only ultimate discontinuity in nature is in information.    That doesn't indicate when you have enough information to identify the continuities, or "little bangs", that may be present, of course.    It does point to where you'd need to look for them, though.    Quantum mechanics, for example, concerns a range of behaviors beyond the known limit where a knowledge of continuity is possible.   There's no conflict in a lack of information, just various kinds of assumptions you can make about it.
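A hedged sketch of the kind of derivation the text points to (my paraphrase, with E standing for any conserved quantity):

E(t_2) - E(t_1) = \int_{t_1}^{t_2} \dot{E}(t)\, dt, \qquad |\dot{E}(t)| \le R < \infty
\;\Longrightarrow\; |E(t_2) - E(t_1)| \le R\, |t_2 - t_1|

If transfer rates are finite, E cannot jump: any change of state takes a period of development. Applying the same finite-rate assumption to \dot{E} in turn suggests continuity of the successive derivatives as well, which is how I read the continuity claim here.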

What one starts with in observation is information without meaning, like a series of points with no necessary relationship between them.   It could be a history with a fascinating story to tell, if you could connect the dots with the processes that produced them.    To interpret a curve as suggesting a process, one needs to ask how to determine when a series of points can be treated as information about a continuous process.  There's a series of general questions to ask.    Was each point recorded the same way?  Are there any recognizable shapes?  Are there enough points for the kinds of mathematical tests you might try?   Is there any reason to think from the environment there was any accumulative change?  Are there other things that might be affecting the points?    One exercise to begin learning the technique is to look at a time series graph and see if you can think of two or three possible explanations for each bump.   Having explored the context of the shapes and processes that might be involved, ask if there's any overall shape.    That can be by simple inspection, learning the specific steps taken to collect the data, using statistical measures, or by other means of determining that the data most likely represents some derivative continuities.

Just because there are breaks in the shape of a curve does not necessarily mean there is a break in the process producing the curve.    The activity of the process may have switched to somewhere else, for example.   Studying complex systems involves developing an intuition for questions like that.    Others would have other lists.   Four of my lists of them are arranged as a list of principles.    Some advanced methods are described for identifying cybernetic body parts and things in complex systems and learning from them.   This site is mostly about the work I've done with analyzing one dimensional curves for the questions they raise about the processes that produce them.    The analytical methods I developed are described below in 'Tools', a little within the AutoLISP program files in Curve.zip, and in a long 1995 paper. The discussion of analytical principles begins on p12 and is followed by an appendix, starting on p35, describing the routines developed at that time.

back to contents


Tools

There are various ways to use the derivative continuity of time-series data to reconstruct the shifts in continuity of underlying processes.  The trick is to see the difference between complex fluctuation and noise.  One is meaningful, and the other not.    Many people start with curve fitting, perhaps a spline curve, to represent the data as a smooth continuous curve, which treats all fluctuation as noise.   Here the routine is derivative interpolation (DIN), which projects interpolated points by matching 3rd derivatives from before and after.    For validity it is necessary to have good reason to guess that your time-series data points lie on the path of some continuous process (i.e. not random or a random walk).

Commonly some noise suppression is used first, with the one hard and fast rule being to use the very least amount needed.    Validation of the process, as with any other scientific method, is whether it's useful and everything knits together in the end.   For very noisy data there are good mathematical tests for the degree of smoothing that will reveal the continuities without erasing them.   The full methodology is not published, as I've invested most of my time in applications, and in trying to explain why anyone would want to closely study the real dynamics displayed by independent natural systems...  Contact for assistance: id @ synapse9.com

Scales of larger and smaller fluctuation are the most common finding.   Estimating the inflection points and growth rates for the larger scale fluctuating processes is usually the interest.   That can be greatly improved by representing larger scale fluctuating processes as curves drawn through the centers of the smaller scale fluctuation.   That can be done by connecting the inflection points (TLIN) of the smaller fluctuations, called 'integral interpolation', or finding the 'dynamic mean'.    The fluctuations in a process are usually symmetric and represent elastic variation in the underlying behavior.   Sometimes the variation is one sided and the maxima and minima of the fluctuations are used to represent the norms of the underlying process.  The effect is quite often to display surprising structural features of the real underlying processes that would not have been visible by any other kind of representation.
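A simplified sketch of the integral interpolation idea as described above (my reading of it, not the actual TLIN routine): locate the inflection points of the small-scale fluctuation and draw the larger curve through them:

import numpy as np

def dynamic_mean(t, y):
    """Connect the inflection points of the smaller fluctuation to trace
    the larger process (a simplification of the TLIN / 'dynamic mean'
    idea described in the text, not the actual routine)."""
    d2 = np.gradient(np.gradient(y, t), t)
    idx = np.where(np.sign(d2[:-1]) != np.sign(d2[1:]))[0]  # curvature flips
    return np.interp(t, t[idx], y[idx])

t = np.linspace(0.0, 20.0, 400)
y = np.tanh(t - 10.0) + 0.2 * np.sin(5.0 * t)   # large trend + small ripple
trend = dynamic_mean(t, y)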

The routines available are written in AutoLISP (one of the programming languages of AutoCAD). This platform was chosen for the purpose because it allows curves with any number and spacing of points to be related in the same 'table', and is fully programmable. For conventional statistical analysis the data is transferred to a statistical package, such as JMP. Suggestions of alternate development platforms would be quite welcome. Current updates available on request.

Mar 97 reverse test - How it really works, a complete demonstration of the DR method.

Sep. 95 drtools.pdf - description of the AutoLISP analytical command library

Apr 97 Curve.zip - a basic collection of AutoLISP DR command functions for AutoCad 13 or earlier

1/2/08 notes, back to contents



Applications
 - various examples of successes and failures -

Each of the following sections is a brief study of a particular time series data set. The graphs display the data, a series of interpretive curves and some notes on methods and interpretation.   I am always interested in finding other interesting subjects of dynamic environmental change for study.  
 



Gamma Ray Bursts

One of the most astonishing events in the heavens is the enormous energy burst of gamma rays that now seems to occur during the birth of black holes.   Gamma ray bursts are the largest explosions in the universe, producing the energy of 1000 supernovae at once, in a period of a few seconds to a few hundredths of a second. They are observable from earth once or twice a day, from apparently random directions.    This study examines the microscopic dynamics of one such event, based on the counts of gamma rays recorded per second from a gamma ray observing satellite.

April 1998 BATSE Trigger 551
May 1999 The classic "backwards wave" shape of GRB's, perhaps indicating implosion
NASA Introduction to Gamma Ray Bursts
Information on NASA Compton Gamma Ray Observatory

back to contents



 

Plate Tectonic Flow...

The motion of the earth's continental plates is very gradual, and any changes in the rate of movement might be assumed to be imperceptible.   Good time-series measurements of the physical location of 'fixed' sites on the earth are now becoming available, gathered with satellite observation by the USGS Global Positioning System (GPS).    A search for better data and method refinements is needed to make this approach more revealing.    The directly observed rates of plate motion are minute, about 1 cm/yr, and the data is significantly noisy. The particular kind of noise is problematic for the current DR technique, in that the noise is large scale and often clustered and one-sided. Derivative reconstructions of the motion of neighboring sites in the San Francisco/Oakland basin (called PBL1 and TIBB) appear to display both matching accelerations in the tectonic flow and matching noise events in the calibration of the measure.

Mar. 1997 PBL1 , TIBB , Match

back to contents


Ice Core CO2

The CO2 concentrations in an Antarctic ice core over the past 150,000 years follow a remarkably smooth curve with a semi-regular 3,100 year period of fluctuation.   There is also a distinct semi-regular 12,500 year period, and a couple of large scale singular events. The overall shape is dominated by two major jumps, 140,000 years ago and beginning 20,000 years ago, with a general slide in-between. The recent atmospheric CO2 trend is shown for contrast. The shapes found in the curve are not definitive, but suggest a long term stable system occasionally perturbed by great events. One of those great events is clearly the current relatively explosive rate of increase, considerably higher than, and growing at 50 times, the steepest rate found throughout the past 150,000 years.

The most interesting feature of the ancient record is the amazingly smooth progression of the data, seen close up in fig. 2. The underlying process appears perfectly regular and does not appear to have any fluctuation more frequent on average than 3,000 years. That is extraordinary long term regularity! Perhaps it is driven by something like magma upwelling, or a glacial solar cycle, or there is some diffusion process within the ice which perfectly suppresses the shorter fluctuations without suppressing the larger ones over time. The 100,000 year decline might reflect the general rate at which carbon is withdrawn from the biosphere. The dramatic rise that began just 20,000 years ago might just possibly represent the extensive use of fire.

The study provides an interesting demonstration of the ability of DR to highlight, and repair, a bothersome calibration irregularity found in the original data. The second derivative of the data is still remarkably smooth, except for what appear to be occasional single point spikes due to calibration errors in data collection. The analysis of underlying behavior is that what appears as a general trend of decline interrupted by events of increase may actually be the reverse. This is shown by the second derivative of the trend of 12,500 year fluctuation showing a positive norm punctuated by transient negative periods.
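A minimal sketch of how single-point calibration spikes of this kind might be flagged and repaired (my own version, in the spirit described, not the original analysis):

import numpy as np

def flag_spikes(y, k=5.0):
    """Flag single-point spikes as outliers of the second difference
    (a sketch of the calibration-repair idea, not the DR code)."""
    d2 = np.diff(y, n=2)
    med = np.median(d2)
    scale = 1.4826 * np.median(np.abs(d2 - med))   # robust sigma estimate
    return np.where(np.abs(d2 - med) > k * scale)[0] + 1  # center indices

def repair(y, spikes):
    """Replace each flagged point with the mean of its two neighbors."""
    y = np.asarray(y, dtype=float).copy()
    for i in spikes:
        if 0 < i < len(y) - 1:
            y[i] = 0.5 * (y[i - 1] + y[i + 1])
    return y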

Jan. 1996, Sept 1997, iceco2-1.gif , iceco2-2.gif , iceco2-3.gif

back to contents


Punctuated Equilibrium in Plankton Evolution

evidence of a continuous growth system in a speciation event  - (2007 draft of full paper, 2006 PowerPoint)

[Figure: 95 data points, G. tumida size]  Bjorn Malmgren gathered data on the size of a plankton species over 7 million years from an Indian Ocean deep sea sediment core covering the transition from one stable form of a plankton species to another (G. plesiotumida to G. tumida).   During the transition in shape the organism also tripled in size, giving a good corollary indicator of the progression. The data only hints at it at first, but yields clear evidence of continuous growth processes, repeated eruptions accelerating and decelerating progressive rates of change, producing a continuous developmental transition between two steady states. There are other interpretations to consider, but the appearance is of mutation having proceeded by a complex feedback regulated process.

The validity of applying continuity analysis to evolution comes in part from finding a strong linear relationship between the mean sizes of each sample and their standard deviations.  That variation in size is correlated with size is not surprising, perhaps, but it does contradict the commonly held 'null hypothesis' assuming that the data was produced by what is called a 'random walk'.  The variance is neither invariant nor constantly increasing, so the random walk assumption for individual lineages or the whole population does not fit.  A second, and more specific, contradiction of random walk and demonstration of underlying continuity in the data is provided by the step variance test.    It shows that the variation between widely and narrowly spaced points does not differ as it would for random walks.   In a random walk the variances would multiply with the number of points.   The variances of the data do not.  This indicates that the irregularity of the data is noise, and that recognizable shapes of continuous processes that are visible through the noise probably indicate the presence of a corresponding mechanism.    A rapid continuous developmental process that began and ended seems indicated, and the kinds of mechanisms that could produce that are considered.
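The description suggests a simple form of the check (a sketch of my reading of it, not necessarily the exact statistic used): for a random walk the variance of k-step differences grows in proportion to k, while for independent noise around a continuous path it stays roughly flat:

import numpy as np

def step_variances(y, steps=(1, 2, 4, 8, 16)):
    """Variance of k-step differences. A random walk gives var ~ k * var(1);
    noise around a smooth path stays near twice the noise variance.
    (A sketch of the step variance test described in the text.)"""
    return {k: float(np.var(y[k:] - y[:-k])) for k in steps}

rng = np.random.default_rng(0)
walk = np.cumsum(rng.standard_normal(1000))               # a random walk
path = np.linspace(0.0, 3.0, 1000) + 0.1 * rng.standard_normal(1000)
print(step_variances(walk))   # variances multiply with the step size
print(step_variances(path))   # variances stay roughly constant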

Because the standard deviations are related to plankton size by a linear relation, both can be used as indicators of the shape of the transitional events.   The values of the standard deviations were rescaled by that linear relation so that their shapes would be properly scaled for comparison.   Both the size and std. deviation curves were derivative smoothed and interpolated, and both show an appearance of a similar progression.
 
 

[Figure: Rates of Change in Plankton Size]  The first derivatives of the two curves make the point much more clear.   They are remarkably similar despite seemingly larger differences in the shapes of the curves themselves, and both display the presence of a dramatic transient underlying process.
 

back to contents



 

Global Warming - 1  01/1996, 02/2009   *Note continuity of fluctuation observed, and anticipation of the "climate denial" period 12/2015

20th Century Global Surface Temperature & Economic Activity 

The average global surface temperature over the past 100 years has tended continually upward, but with both short and long wave fluctuation, and in the 90's was rising at the highest average rate of the period, on the upslope of the long range curve.    Comparing the rate of change of warming and the rate of change of economic activity with US GDP, the derivative reconstruction curves(1) showing the long term trends display no apparent dynamic link between warming and the general increase in economic activity.   Economic activity is related to energy use and fossil fuel consumption, and the contribution of CO2 and other pollutants to the atmosphere.    This should be further studied.

1) [The derivative reconstruction method (DR) uses mathematical routines to identify continuities underlying natural process fluctuations, the waves underneath the ripples, in this case by taking the mid-points of the smaller scale fluctuations and using a special smoothing kernel to reduce the higher derivative fluctuations without changing the local integral of the curve.   It seems to be a very sensitive and effective means of identifying natural system behaviors.]
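A crude sketch of the footnote's two steps as I read them (my simplification; the actual kernel is not published here): take midpoints between successive local extremes, then smooth lightly with a unit-sum kernel, which approximately conserves the local integral:

import numpy as np

def fluctuation_midpoints(y):
    """Midpoints between successive local extremes, a stand-in for taking
    the centers of the smaller-scale fluctuation (my simplification)."""
    ext = np.array([i for i in range(1, len(y) - 1)
                    if (y[i] - y[i - 1]) * (y[i + 1] - y[i]) < 0])
    return ext[:-1], 0.5 * (y[ext[:-1]] + y[ext[1:]])

def smooth(y, passes=3):
    """Repeated [1,2,1]/4 smoothing; the unit-sum kernel damps higher
    derivative fluctuation while approximately conserving local area."""
    kernel = np.array([0.25, 0.5, 0.25])
    y = np.asarray(y, dtype=float)
    for _ in range(passes):
        y = np.convolve(y, kernel, mode="same")
    return y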

One possible reason why the underlying dynamics of the two systems do not match is a variable lagging and leading response of the climate system to the forcing of CO2, visible in the data as a long wave fluctuation above and below a rising norm.   It doesn't mean that atmospheric CO2 doesn't influence global surface temperature.   It just means that something else, which alternately speeds and slows the process, is happening too.   Having now found data (next study below) that does appear to demonstrate a direct dynamic link between warming and CO2, the question is what the larger effect is, and how it will exaggerate or hide the warming effect.

The long wave curve looks like a climate system fluctuation that might have momentum and be repeated. That is quite common for system change, that there are multiple scales of fluctuation.   If so, that might begin to produce a period of actual atmospheric cooling, or a pause in warming, hiding continued CO2 induced warming and giving people a false impression that warming has stopped*.   I've drawn what looks like the probable scale of the momentum of the large scale fluctuation, but this should be revisited with better data.    Click the images, the temp. curve (A.1) and rate of change curve (A.2), to enlarge.

In 2007 a slowing of warming of just this type was noticed, associated with increasing cloud formation in the tropical latitudes, the Iris effect, Lindzen 2007, and seemingly substantiated by Spencer 2007.   Both authors consider the evidence of a pause in warming a permanent effect, but I think, especially if its timing matches the 'long wave' seen here, it is probably a new kind of "El Nino" effect, but in the atmosphere.

Jan 1996 Air temp warming1.gif , 1st deriv (compared to GDP) warming2.gif , 2nd deriv (compared to GDP) warming3.gif

back to contents


Global Warming - 2

Trends of the 70's, 80's & 90's  - finding synchrony in the timing of CO2 & Temperature turning points, comparing Hansen Global Monthly Temperature Index and Mauna Loa atmospheric CO2

[I had thought that this little piece deserved credit for being the first definite human fingerprints on global warming in July 96...but it didn't get notice.   The real prize and credit go to Michael Mann, Ray Bradley (u.mass) and Malcolm Hughes (u.ariz).  Their 3/15/99 paper in Geophysical Research Letters shows a detailed 1000 year temperature record (based on a broad group of climate indicators) which quite abruptly changes direction in ~1900.][link to earlier 4 century curve] [link to Mann's 1999 study]

The study linked above displays an appearance of dynamic coupling between earth temperature and atmospheric CO2, in the alignment of first derivative turning points for recent temperature and CO2 measurements. The Hansen temperature index is a global aggregate of temperature measurements. The NOAA trends in CO2 used come from measurements made at the top of Mauna Loa. The close correlation between changes in direction of increasing CO2 and changes in direction of earth temperature appears to be evidence of there being a connection (unspecified). The fact that during this period earth temperature tends to decline when the rate of CO2 increase slackens indicates that other factors were dominant at the time.   There appears to be a link, though, and atmospheric CO2 is increasing radically (see comparison with ice core CO2 data).  ed 2/4/06
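As a sketch of the comparison described (my minimal version; temp and co2 are hypothetical smoothed series on a common time axis t): find each series' first derivative turning points and compare their timing:

import numpy as np

def turning_points(t, y):
    """Times where the first derivative changes sign, i.e. the turning
    points whose alignment the study compares (a sketch, not the
    original routine)."""
    dy = np.gradient(y, t)
    flips = np.where(np.sign(dy[:-1]) != np.sign(dy[1:]))[0]
    return t[flips]

# With temp and co2 as smoothed series on a common axis t (hypothetical):
# offsets = [np.min(np.abs(tp - turning_points(t, co2)))
#            for tp in turning_points(t, temp)]
# Consistently small offsets would suggest the dynamic coupling described.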

back to contents


History of Economic Growth

The underlying pattern of economic growth rates over the past 130 years shows a tendency for rates to steadily decline.  Steady decline is found throughout the period, interrupted only by one dramatic rise, corresponding to the period of the great depression and WW2.  The resurrection of growth rates following long decline could be evidence of a global turnover in the capital structure, as new directions get a fresh start and the old regime is abandoned.   Paradigm shifts in technologies, methods and values should show up in the long term growth rate somehow, if they were actually transformational.    That a resurrection of the growth rates after a long period of decline is cyclic is possible, but doubtful.    It is, of course, also unclear how long any paradigm shift will sustain growth, or whether another paradigm shift will automatically appear any time an old one is exhausted.

Reconstructing the path of GNP discusses the statistical methodology involved.

Aug. 1995 GNP08.gif , GNP10.gif , GNP12.gif , GNP13.gif

back to contents


Dynamic Synchrony between Economic Measures

A fascinating regularity is apparent in the relationship between the second derivatives of the underlying trends of US GNP and Unemployment Rates. There is a remarkable synchrony between the turning points of the underlying derivative rates of change, seeming to indicate that neither is a leading indicator of the other, acting as if the two are expressions of some other strongly unified inclusive process.

Aug. 1995 econcync.gif, Charts

Aug. 1995 econcync.htm, An old methods reference

back to contents


Childhood Epidemics

The reported cases of Measles, Chicken Pox and Mumps in New York City from 1928 to 1963 are data sets dominated by chaotic variation (Olsen & Schaffer, Science 3/8/90). The DR technique was able to identify underlying regularities of some possible interest, but with poor confidence.

I keep some of my more obviously problematic methods and interpretations around to keep me and others honest and on their toes.    Nothing on the site is predigested for unthinking acceptance.    I tried constructing a statistical measure, which I called 'DAR', to help identify when strings of dots were likely to represent a curve.   Great intent, but this version doesn't do it.    My ~2000 step variance test, which successfully disproves random walk in time series where variation is symmetric, is my first really useful step in that direction, and I have some other ideas, but haven't gotten any further.   ed 2/4/06

Apr. 1996 NYmeasl.gif, NYchick.gif, NYmumps.gif

back to contents


Sparks

The development of a spark is a very complicated thing

Nov. 1999  Sparks; a first look in a new area.  I'd love it if someone would tell me where I can find detailed data on the growth phenomena of electrical discharges, sparks, lightning etc.

back to contents


The Great Crimewave

Waves of popular (and unpopular) culture are ecological events with patterns that suggest where to look for the particular processes that are developing

Sep. 1999  Patterns in Crime.  The record shows clear events in youth culture that are separate from the society at large, though this brief look only points to events that are local.   What that suggests is that their dynamics were local and internal, and raises the question of where they were occurring and what was erupting to produce their dynamics.

Oct. 2005  Crimewave's birth and collapse.  I studied the great crack epidemic of the late 80's in some detail, because of its dramatic end.   The linked report is only a summary research note, about the fascinating secret.   The dynamics show a clear true collapse, starting about three years before the reputed influence of Mayor Giuliani.   There's no bend in the curve for Giuliani's mayoralty at all, really, reinforcing the idea that it was the cultural change that broke the crime wave that got Mayor Giuliani elected, not the reverse.   Perhaps it sounds a bit speculative, but in fact the great crime wave passed rather abruptly, like an intense fever, as the whole community of addicts and their families rejected the crime culture in their midst.   It was an abrupt dramatic silence, bringing to an end the 30 years of youth violence that began in the 60's with the rage following recognition of civil rights for the black American and Latino communities, which resulted in the emergence of major youth crime cultures in all American cities.

So I went out and did a number of interviews on the streets of Harlem and the Bronx to see if people remembered what it was that happened, finding that 15 years later they only vaguely remembered.   I pieced some of it together, I think, from talking to enough people and having lived in one of the high crime neighborhoods of Manhattan when it broke, and so recalling having watched it first hand.   There may be other scientists who have studied it, but I have not read about it. (http://www.synapse9.com/cw/cw_interview_notes_10-22_audio.pdf) The study was not meant to be exhaustive, but to explore my method with a minimal amount of effort, to see if some better questions would come out of it.

back to contents


The Emergence of Sustainability

Oct 2006 Emergence of Sustainability  This brief research note shows how to trace the frequency of a key word or phrase to investigate the dynamics of changing ideas.    In this case the word 'sustainability' in NY Times articles shows a dynamic long term growth phenomenon punctuated by dramatic flurries of conversation linked to the individual articles discussing the subject.  
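A minimal sketch of the word-frequency tracing involved (hypothetical inputs; the original counts came from searches of NY Times articles):

from collections import Counter

def term_frequency_by_year(records, term):
    """Count articles per year mentioning a term; the resulting series is
    what the note examines for growth dynamics.  ('records' is a
    hypothetical iterable of (year, text) pairs.)"""
    counts = Counter()
    term = term.lower()
    for year, text in records:
        if term in text.lower():
            counts[year] += 1
    return sorted(counts.items())

# e.g. term_frequency_by_year(articles, "sustainability")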

 The Collapse of General Systems Theory

Feb 2006 General System Theory's Collapse  This brief research note is another example of using word use frequency to trace the evolution of an idea.  



top of file