Global Warming Debate

July 2, 2014

Whatever comes of all today's study of the numbers, of the way the calculations are done, and of what errors were made, the thing you are all missing is that the fundamental ongoing issue is data quality! Suppose we debate and eventually settle on the correct methodology for handling the data, computing averages, and so on. The fact remains that every day, as new data are entered and things change (however and for whatever reasons those changes come about), anyone depending on those numbers for serious work needs tools to ensure data quality.

What does that mean? It means that NOAA and the other reporting agencies should publish new statistics and tools alongside their data. They should tell us things like:

a) the number of infilled data points and changes in infilled data points
b) the percentage of infilled vs. real data
c) changes in averages caused by infilling
d) areas where adjustments have resulted in significant changes
e) areas with a significant number of anomalous readings
f) measures of the number of anomalous readings reported
g) correlation of news stories with reported results in specific regions
h) the average size and direction of corrections
i) the number of each kind of adjustment, and a comparison of these numbers with previous periods
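To make this concrete, here is a minimal sketch of the kind of report items (a), (b), (h), and (i) describe. The record fields (value, is_infilled, adjustment) are hypothetical, invented for illustration; real station records would look different.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    station: str
    value: float        # reported value
    is_infilled: bool   # True if the value was estimated rather than measured
    adjustment: float   # correction applied to the raw measurement

def quality_report(readings):
    """Summarize infilling and adjustments for a batch of readings."""
    n = len(readings)
    infilled = [r for r in readings if r.is_infilled]
    adjusted = [r for r in readings if r.adjustment != 0.0]
    return {
        "infilled_count": len(infilled),
        "infilled_pct": 100.0 * len(infilled) / n if n else 0.0,
        "adjustment_count": len(adjusted),
        "mean_adjustment": sum(r.adjustment for r in readings) / n if n else 0.0,
    }

readings = [
    Reading("A", 14.2, False, 0.0),
    Reading("A", 14.9, True, 0.3),
    Reading("B", 13.1, False, -0.2),
    Reading("B", 13.4, True, 0.1),
]
report = quality_report(readings)
```

Comparing a report like this against the previous period's report is what item (i) asks for.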

What I am saying has to do with the constant doubt that plagues me and others: that the data is being manipulated, purposely or accidentally, too frequently. We need to know this, but the agency itself NEEDS to know this, because how can they be certain of their results without such data? They could be fooling themselves. There could be a mole in the organization futzing with the data or doing mischief. Even if they believe nothing is wrong and everything is perfect, they should do this anyway, because outsiders continue to regard their data with suspicion.

This is standard procedure in the financial industry, where data means money. If we see a number that jumps by a higher percentage than expected, we have automated and manual ways of checking. We check news stories to see if the data makes sense. We cross-correlate the data with other data to see if it makes sense. Maybe this data is not worth billions of dollars, but if these agencies want to look clean and put some semblance of transparency into this so they can be removed from the debate (which I hope they would want), then they should institute data quality procedures like the ones I've described.
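The automated jump check described above can be sketched very simply; this assumes nothing about any real financial system, and the threshold is an arbitrary example.

```python
# Minimal sketch of an automated jump check: flag any value whose
# period-over-period change exceeds a threshold percentage.
def flag_jumps(series, threshold_pct=10.0):
    flags = []
    for i in range(1, len(series)):
        prev, cur = series[i - 1], series[i]
        if prev != 0:
            change_pct = 100.0 * abs(cur - prev) / abs(prev)
            if change_pct > threshold_pct:
                flags.append((i, change_pct))
    return flags

# The step from 99.5 to 120.0 is a ~20.6% jump; every other step is under 10%.
prices = [100.0, 101.0, 99.5, 120.0, 119.0]
flagged = flag_jumps(prices)
```

Anything flagged would then go to the manual checks: news stories, cross-correlation with related series, and so on.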

Further, of course, we need a full vetting of all the methods they use for adjusting data, so that everyone understands the methods and parameters used and can analyze and debate the efficacy of these methods. The data quality statistics can then ensure those methods are being applied correctly. Then the debate can move on from all of this constant doubt.

As someone has pointed out, if the amount of adjustment is large, either in magnitude or in the number of adjustments, that reduces confidence in the data. Calculated data CANNOT improve the quality of the data or its accuracy. If the amount of raw data declines, then the certainty declines, all else being equal. The point is that knowing the amount and number of adjustments helps define the certainty of the results. If 30% of the data is calculated, that is a serious problem. If the magnitude of the adjustments is on the order of the total variation, that is a problem. We also need to understand the accuracy of the adjustments themselves. We need continuing statistical validation: not a one-time check, but ongoing proof over time that our adjustments make sense and are accurate.
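The two red flags in the paragraph above can be turned into simple automated checks. The 30% figure and the comparison of adjustment size to the data's total variation come from the text; the function and field names are my own invention.

```python
import statistics

# Sketch of the confidence checks suggested above: warn when the
# calculated (infilled) fraction is large, or when the typical
# adjustment is as large as the variation in the data itself.
def confidence_checks(raw_values, adjustments, infilled_fraction,
                      max_infilled=0.30):
    warnings = []
    if infilled_fraction >= max_infilled:
        warnings.append("calculated data fraction %.0f%% exceeds %.0f%%"
                        % (100 * infilled_fraction, 100 * max_infilled))
    spread = statistics.pstdev(raw_values)
    typical_adjustment = statistics.mean(abs(a) for a in adjustments)
    if spread and typical_adjustment >= spread:
        warnings.append("typical adjustment is on the order of the data's spread")
    return warnings

checks = confidence_checks(
    raw_values=[14.0, 14.5, 13.8, 14.2],
    adjustments=[0.1, -0.1, 0.05, 0.2],
    infilled_fraction=0.35,
)
```

Run over each new batch of data, a check like this would provide exactly the continuing validation argued for above, rather than a one-time sign-off.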

In academia we have people to validate papers, and rigor is applied, to an extent, to a particular static paper for some period of time. However, when you are in business applying something repeatedly, where data is coming in continuously and we have to depend on things working, we have learned that what works and seems good in academia may be insufficient. I have seen egregious errors by these agencies over the years. I don't think they can take many more hits to their credibility.

The Brain and our state of understanding NOT

July 2, 2014
http://www.technologyreview.com/featuredstory/528131/cracking-the-brains-codes/
Here is an article telling us the state of the art of understanding the brain. Some people are waiting for the singularity, the point at which the brain can be read out and stored in digital form so that we can store and replicate individual human minds. From this article it is clear we are far from that day.
The article tries to be optimistic, but I think the big distinction that needs to be made is between the brain's contents and cognition on the one hand, and observation of the brain's action on the external world and its input signals on the other. The former is completely obscure, whereas we have some ability to observe the latter. We shouldn't be so foolish as to think that seeing a signal from the brain travel down a nervous pathway means we understand anything more about how that signal was created, i.e., the cognitive process in between. What we have today, observed using electrical methods, are indicators of what kind of information the brain is receiving from our senses, and ideas about what the brain does electrically that seems to cause muscle firing and movement of the human body. We also seem able to recognize some other blunt, possibly secondary effects in the brain that appear to indicate the end result of the brain's activity. The basic point is that the actual operation of the brain remains completely impenetrable: how it stores information, creates higher-level concepts, recalls exactly, and correlates information; how it has consciousness, a seeming direction of thought, and a sense of identity; how it processes information at unbelievable speeds in some cases (for instance, an athlete performing very complex actions with body and senses in tandem with incredible precision).
In the article the author talks about how individual neurons seem to be responsible for recognizing whole people. Given that the brain's input is a series of spikes, how is it possible that a single neuron gets the information to encode something as specific and yet as general as a particular Hollywood star, for instance? How would a single neuron have enough complexity to record or detect things like this?
The article mentions over 1000 neuron types, and that, for instance, some types of neurons may be able to detect lines or the movement of vertical or horizontal lines. This may be so, but the existence of a thousand or more neuron types, and of any preprogramming in the brain at all, is a terribly complex matter. I have always been staggered by the low number of genes in the human genetic code. I now understand that genes are simply blueprints for building nano-machines. Fragments of a gene code up components of these machines, so the machines are composed of common building blocks: repeating patterns of DNA that construct levers and detection mechanisms. A gene is simply a factory for making a particular type of machine. There is separate coding in the billions of other DNA fragments that directs how many machines to make, when to make them, and so on. Therefore what was once dismissed as junk DNA is now considered the most crucial DNA, because it holds the actual instructions for operating the factories and the machines that operate the body.
There is an information problem I don't think anyone has really thought through: the sheer complexity of all the chemicals, the machines, the different cell types, the processes to operate cells and each organ or group of cells, and the instructions to manage the interoperation of this giant machine must amount to extremely complex code. This code is not like a fixed computer program, which is unchangeable and breaks with the slightest unconsidered input. This machine has adaptive capabilities it can use to repair itself, to handle scenarios it has seen in the past (possibly the far past, thousands or millions of years ago, that some previous DNA had to deal with), and to bring together forces to combat detected attacks. It has the ability to constantly change and to acquire augmentations. We can think of each cell as operating autonomously, but we know that in fact the system has more global capabilities: far-removed systems can be triggered and action taken, and even thoughts and moods in the brain can affect how the factories and the machines themselves work. There is an awful lot of complexity in any machine composed of so many components, and so many different kinds of components. I can't even imagine how many different situations the body deals with daily to maintain homeostasis, keep everything operating, repair itself, grow muscle, create copies of cells, and create copies of humans.
The point is that this is an incredibly complex program. As big as the DNA is, there is a question in my mind whether the coding could POSSIBLY be sufficient to represent the complexity of this machine, let alone how we might understand what the coding mechanism is. Knowing how many lines of code are needed for even the simplest program, it is disturbing to think how much more complex a human body is, with so many different cell types, so many chemicals, so many parts of the puzzle, and all the unbelievable complexity of writing a program to reliably manage it, let alone the repair mechanisms, the adaptation mechanisms, and the retention of knowledge of past experiences so those situations can be handled when they arise again, i.e., learning. It is mind-boggling, and no computer program is imaginable that could do all this. We have never imagined, let alone written, anything remotely as complex.
Over and over we are faced with the problem that we don't understand the basic language of the genetic code, the brain's coding of information and concepts, the process by which instructions are issued and action is taken, or how decisions are made or processes work beyond the simplest observable action. It is hard to believe this is all done with chemical reactions, because we know these chemical reactions are slow compared to the speed needed to operate the machine and to react. Electrical signals seem so far to be insufficient to explain the brain's function. We have studied these electrical patterns and not discerned their meaning. Possibly there is no meaning, and we are observing a secondary effect rather than the primary one.
I have no answers. I am not calling for mystical answers, i.e., god, etc. I am simply pointing out the pathetic state of our understanding. We are like little worms crawling around this stuff with so little understanding, yet thinking we have a clue. I am reminded how clear it is to me that animals recognize and communicate somehow. Studies have shown they can tell each other things. They work together in some cases. Yet we have never been able to understand the language of animals. If these animals are so dumb and we are so smart, shouldn't we be able to figure out what one penguin is saying to another, or one dog to another? There is clearly something more going on here than we grasp, some more detailed language at play whose regularity we can't even see.
Maybe we are closer than I think, and if we get one simple break, understanding one little thing about how the brain or the junk code in the DNA really works, we will suddenly have a path to complete understanding. Maybe we are close. It is just so bloody hard to see how this all works. If the brain were a completely organized thing, a small number of cell types arranged in a repeating pattern, we could understand how a limited number of instructions could be used to construct a brain and how it could operate with a simple set of procedures. I remember a book I read which outlined the basic structure of the cortex. There is a regularity to the cortex: it is composed of 6 layers of cells with some repeating structure. I could see how such a structure could become a general learning machine, perhaps, but the more we learn about things like the eye and all these different neurons, the more complex it turns out to be. The cortex itself is more complex than that simple explanation; there is a lot of variation across regions of the cortex. How could the programming in the genetic structure be detailed enough to lay out how to build a cortex and brain (forget the rest of the body)? Shouldn't we see massive amounts of DNA related to this? Yet 80% of the DNA of a fly is similar to a human's. The brain is composed of many regions beyond the cortex, with 1000 cell types and many electrical and chemical processes going on at the same time. We have recently discovered microtubules, which further increase the complexity of interaction between neurons.
Maybe instead of thinking of ourselves as near the end point of learning about nature, and of how smart we are, we should all take a humble pill and realize we really have no clue. This realization came upon me when I considered where we are with our physics knowledge.
We should really think of ourselves as being at a Cro-Magnon stage. We have a rudimentary ability to understand things based on our observation of macro phenomena. We are like the early observers of the human body who talked about humours and had no idea what the organs did. We are blind to so much of what's probably really going on, which is why it seems so hard to understand how it could possibly operate. It is exciting to think there is so much to learn ahead of us, and depressing to think I will probably not be around to see it unfold.

 

Thoughts about Physics and the nature of everything

April 13, 2014

I’ve read two books recently on the topic of life, the universe, and everything. One is called Biocentrism by Robert Lanza and the other is Our Mathematical Universe by Max Tegmark. Both are flawed, but both made me think more deeply about the problem, and while I can’t offer any new physics, I have made some observations.

The Collapse of the Classical View

In the early 20th century, the fundamental change that essentially de-virginized us (excuse the analogy, but it is actually appropriate) occurred with the unbelievable result of the double slit experiment. This fundamental, inexplicable result, which has baffled scientists to this day, represents the complete collapse of the classical deterministic view of the world. I’m not the only one to say so. Several physicists have called this the fundamental experiment exposing the quantum weirdness that turned physics from a scholarly, straightforward pursuit of linear reasoning into a mind-bending discipline, producing ever more bizarre experiments and results and, in turn, ever more bizarre and unbelievable theories.

Please note: I am not criticizing physicists for any of this. I have no better explanation than they do for what we are seeing. But the fact is that this experiment revealed that nature is far more complex and baffling than we ever imagined, and we have had to construct ever more bizarre theories to explain what we see as we do experiment after experiment.

Scientists with a straight face will try to tell you the world consists of the following facts:

1) 94% of the universe is composed of dark matter and dark energy, of which we have no actual understanding. We don’t know what these things are and have never seen them, yet they are pervasive, filling space all around us. We need dark matter fundamentally because calculation after calculation has shown that galaxies would fly apart without the addition of 5 times as much matter as all the visible matter. Somehow, invisible and all around us, giving this necessary boost to keep galaxies from flying apart, is matter amounting to 5 times more than what we see. Okay, so there are ghosts flying around us all the time, but don’t worry: since they don’t interact with us, just trust us, they’re there.

2) However, dark matter is still a minority of the energy in the universe. According to our new understanding we also need dark energy, because simultaneously with the huge amount of attractive dark matter that keeps our galaxies together, there is a repulsive energy pushing the galaxies and everything in the universe apart. If we do not accept dark matter and dark energy, we have no other theory, or even plausible conceptual theory, that could account for the observed behavior of galaxies and the undeniable fact that the universe, to our surprise and mystification, is actually flying apart.

The data behind these observations is essentially indisputable. It has now been observed in countless experiments that the universe is expanding faster and faster, and observations of galaxies clearly show the existence of matter we cannot account for, which somehow appears to be hidden from us. Other theories that have tried to explain these phenomena any other way have so far not worked.

3) In the first 10^-30th of a second of the universe’s existence, a force called inflation caused an expansion of the universe by a factor in excess of 10^100 in size in less than 10^-6 seconds. After this unbelievably sudden explosion of the universe into existence, the expansion stopped. Various theories purport to explain this inflation, but the fact that inflation occurred at all is bizarre. It seems so contrived and convenient that this explosion happened to enable our universe.

4) This inflation was so big that large parts of our universe are out there today that we can never see. The universe is so large now that we assume from calculations that identical copies of the earth must exist, with human beings on them like you and me, every 10^…. so many light years in all directions, and therefore there are virtually an infinite number of copies of you and me living out lives. This is called the level 1 multiverse.

5) Quantum mechanics tells us that the most likely explanation for the bizarre results observed is that our world lives in what is called superposition with other worlds: a virtually infinite number of parallel worlds in which all possible outcomes of all possible quantum states exist. This is the level 2 multiverse, in which there are an infinite number of copies of you and me living all possible combinations of lives.

6) There are 32 constants that physicists have found that cannot be tied by necessity to any other quantity: things such as the ratio of the mass of the electron to the proton, the speed of light, the ratio of dark matter to regular matter, the Planck constant, the strength of the strong force, and so on. These constants appear to be randomly selected. In fact, one book says a statistical analysis has been made and they appear, to a significant degree, perfectly random. And yet they cannot be arbitrary, because they turn out to be incredibly brittle. The slightest change in any one of them would make life as we know it in our universe impossible. Maybe life is possible with the constants slightly or even largely modified, but we know that with a change of less than 1 in a million in the strong force constant we would not have solar systems like those we see today, composed of carbon and heavy atoms; stars would fail to produce these materials in their explosions. If the ratio of dark matter to regular matter were changed by even the smallest amount, the entire universe would have imploded, or exploded outward in such a way that no solar systems could have formed, or the universe would have lasted practically no time at all. We have 3 dimensions of physical space and one dimension of time, and we’ve known for centuries that you cannot form stable orbits in anything other than 3 dimensions; any other dimensionality would mean no planets, no orbits, no stars. Each of these constants appears to have been tuned to produce the universe we are in, and yet we have no explanation for why these constants are what they are. The chance these constants would arise at random to be what they are is < 1 in 10^500, according to one book. So the fact that these constants are selected the way they are points to an almost irrefutable conclusion:
either there is an explanation for why these constants are as they are, because some law in fact forces them to be what they are, or there must be at least 10^500 universes with all these possibilities existing in them, and we are simply lucky to be in the one where humans can live and think.

7) We are to believe that quantum strangeness is so bizarre that doing things in the future appears to have the effect of making things you do now different. While this is confusing, the result is still one of these bizarre-beyond-bizarre ideas. Since entangled particles have to obey certain properties depending on what we know about their behavior, we can only live in universes where these things work out, so that what we do in the past corresponds to what we do in the future; this entanglement makes us unable to do or see some things that we should be able to, because of things we do or don’t do in the future. This is confusing, but it doesn’t violate causality. It is simply that some possible sequences of actions that we think should be doable aren’t. Universes exist in which we perform only select combinations of actions; not all actions are possible independently.

8) Because we have no good reason to believe that the laws of physics are unique, we probably have to accept the notion of a fourth level of multiverse in which all possible consistent laws of physics exist. This is of course the biggest multiverse of all.

Let me recount the truly staggering state of our current experimental results:

1) Galaxies should be flying apart, so we need something called dark matter, which we have never seen.

2) The universe is expanding faster and faster, and the only explanation we have is something called dark energy, which is itself more than 10 times the energy of all matter in the universe and also has never been seen.

3) We discovered that a massive inflation occurred in the early universe, in which, bizarrely, the universe expanded by more than 10^100 in < 10^-10 seconds and then stopped.

4) The universe is so large now that it is virtually infinite in size, not limited to the 10^13 billion light years across we thought just a few years ago. This gives us the level 1 multiverse.

5) The bizarreness of quantum physics forces us to a worldview in which infinitely many universes exist in superposition at any time, with all possible quantum states elaborated from all possible previous quantum states, and so on ad infinitum.

6) 32 constants have been precisely picked that make the universe the way it is; these constants are all extremely brittle, and our universe would collapse or be inhospitable to life as we know it if any of them changed by even a very, very small amount. The probability that these constants would arise at random appears to be 1 in 10^500, which means essentially that either there is a god or there are infinite universes with other, inhospitable constant values.

7) Causality in time is more complicated than it would appear, and some sequences of actions at different times and places that we think should be doable aren’t.

8) There are probably an infinite number of universes in which different physical laws are possible.

I will conclude this post here and continue later with where I’ve gone with some of these things, and with other strangeness and bizarre things we are asked to believe.

Bigdata and Privacy

March 28, 2014

I believe we need new laws to deal with the accuracy of information being held about people and the duration for which that data can be held. For instance, no company should keep information about you for more than 3 years without your explicit permission: not permission buried in a 20-page “legal disclosure,” but a separate, explicit acknowledgement that you are okay with someone keeping data longer than that. If you are under 21, the limit should be 1 year by law. Any data kept after 3 years (1 year if under 21) must be kept in such a way that you can dispute it and find out who holds it by consulting a central registry. Disputes over the data should be resolved in the consumer’s favor unless the holder of the data wants to fight and prove its legitimacy.
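The retention rule proposed above (3 years, or 1 year for anyone under 21, unless separate explicit consent is given) can be expressed as a simple check. The thresholds come from the proposal; the function names and record format are hypothetical.

```python
from datetime import date

# Sketch of the proposed retention rule: 3 years by default,
# 1 year if the data subject is under 21, unless the subject has
# given separate explicit consent to longer retention.
def retention_limit_days(subject_age):
    return 365 if subject_age < 21 else 3 * 365

def must_delete(record_date, subject_age, today, explicit_consent=False):
    if explicit_consent:
        return False
    return (today - record_date).days > retention_limit_days(subject_age)

today = date(2014, 3, 28)
old_record = must_delete(date(2010, 1, 1), 35, today)     # well past the 3-year limit
recent_record = must_delete(date(2013, 6, 1), 19, today)  # within the 1-year limit
consented = must_delete(date(2010, 1, 1), 35, today, explicit_consent=True)
```

A central registry could run exactly this kind of check against every holder of a person’s data.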

Every company I talk to is accumulating vast information about you and me. While I am a big fan of, and excited about, using bigdata to provide better service and smarter, more intelligent services, I also worry that it is an invasion of privacy, or that improper or inaccurate data will cause people problems. My company WSO2 is trying to build secure solutions and bigdata solutions to enable companies to be intelligent. It is an awesome responsibility to hold personal information about people. It’s not just a legal responsibility but a personal and societal responsibility to make sure that everyone is treated fairly by the systems we build.

As an Open Source company, WSO2 has an obligation to promote transparency and responsible use of data. We provide our source code for everything we do; there are no “enterprise licenses.” I believe our advocacy of open source is also a statement about transparency. Please let me know if you think my personal ideas about privacy above are reasonable and sound. I feel very passionately that while a new cyber world is being built, that world shouldn’t be something we fear or are hurt by. The goal of all this new technology is to make life better. We must find a way to build this new world so that we and our children want to live in it, and so that it is compassionate and fair.

Job for recent graduate – any interest

June 13, 2013

WSO2 is looking for an Account Manager in the US, working in Palo Alto, CA. This is an entry-level position, ideal for a recent graduate with an interest in business and sales. Does anybody know of someone?

The ideal candidate should have:

An IT background OR 2+ years of experience selling products/services
Comfort interfacing daily with Fortune 500 clients located globally
A moderate knowledge and understanding of service oriented architecture, the middleware market, and open source software
Sound sales knowledge of IT
A positive can-do attitude and a desire to succeed in a fast-paced environment

Requirements:

Able to travel internationally
Highly computer literate and comfortable using Web 2.0 tools such as LinkedIn and Facebook
Exceptional English verbal and written communication skills required; proficiency in a foreign language a plus
 

Hello world!

January 30, 2010

Welcome to WordPress.com. This is your first post. Edit or delete it and start blogging!

