A football team wins the FIFA World Cup, a society becomes progressive, a complex software system fails utterly in the field, a strategy group fails to deliver a good strategy, a promising youth turns out to be a pathetic failure later in life because of his tiny compromises, a kid who practices continuously and passionately every day turns out to be a distinguished performer in the arts.
Let us look at this from a software perspective.
A few years back, a personal computer with 2 GB of RAM was very rare; today it is common, and so is parallel processing (particularly after multi-core processors became commonplace). The sheer performance of the hardware has increased at least a few fold. Yet Microsoft Windows froze a few years back, and it continues to freeze even now. The freezing actually has little to do with MS Windows itself (though it plays a role); most of the damage is done by application developers.
A few years back, developers used to assume the system had 512 MB of RAM and that part of it (maybe half) was theirs. Today we tend to assume that 2 GB of RAM is available (since that is the minimum requirement we state in the user guide or release notes) and that half of it is ours. Our thinking doesn't change over time: when we develop an application, we assume only a few other applications will coexist with ours.
Let us assume that with 2 GB of RAM and 2 GB of swap, at least four applications can run, each consuming 1 GB. The math works perfectly well on paper, and for the first few days, until the processes together grow beyond 2 GB. Beyond 2 GB, the operating system has to swap some pages of RAM to disk. Once a few pages are swapped out, a method call or a heap access may cause a page miss, and the operating system has to fetch the page back from swap; an interesting complexity gets added. Because of this, every intended operation takes a little longer, which in turn increases both the running time and the memory footprint (delayed execution delays the reclaiming of memory that would have become eligible for reclaim had the program run as planned). This complexity compounds until a few processes crash, or the system crawls and you decide to give it a rebirth by rebooting, as the sketch below illustrates.
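Here is a minimal sketch of that feedback loop, with made-up numbers: four processes, each planning to peak at 1 GB on a machine with 2 GB of RAM and 2 GB of swap. The slowdown factor and growth rates are hypothetical; the point is only that once the combined footprint spills into swap, every process slows down, memory is held longer, and the whole run takes far more time than any process would need on its own.

```python
RAM_GB = 2.0
SWAP_GB = 2.0

# Four processes, each planning to peak at 1 GB and run 100 units of work.
procs = [{"mem": 0.25, "work": 100.0} for _ in range(4)]

def slowdown(total_mem_gb):
    """1x when everything fits in RAM; grows once pages spill into swap."""
    over = max(0.0, total_mem_gb - RAM_GB)
    return 1.0 + 10.0 * over  # hypothetical page-miss penalty

tick = 0
while any(p["work"] > 0 for p in procs):
    tick += 1
    total = sum(p["mem"] for p in procs)
    if total > RAM_GB + SWAP_GB:
        print(f"tick {tick}: {total:.2f} GB exceeds RAM + swap; something gets killed")
        break
    factor = slowdown(total)
    for p in procs:
        if p["work"] <= 0:
            continue
        p["work"] -= 1.0 / factor             # swapping slows every process down
        p["mem"] = min(1.0, p["mem"] + 0.02)  # footprint creeps toward the planned 1 GB
        if p["work"] <= 0:
            p["mem"] = 0.0                    # memory is reclaimed only when work finishes

print(f"done after {tick} ticks; each process running alone would need about 100")
```

Run in isolation, each process would stay within RAM and finish in roughly 100 ticks; run together, the shared slowdown stretches the whole run out many times longer, even though no single process did anything "wrong".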
Mathematically, there seems to be no problem with the parts (each process); the entire issue is with the whole (the operating system and the processes put together). This is called emergent behavior: the whole exhibits behavior that cannot be explicitly attributed to any of its parts, and each part believes none of its own behavior is responsible for the behavior of the whole. There is inherent complexity in the interaction of the parts, the environment, and the processing of events that are external to the parts (and in no way related to any of them).
This emergent behavior affects operating systems, makes climate-change efforts fail, and gets blamed on human stupidity. Emergent behavior suggests that software can be built by the same process that built this universe, and that the same process can be used to attain spiritual enlightenment.
Think about how an image is formed. An image is formed from pixels. Does removing one pixel disfigure the image? No; the disfigurement is almost negligible. So the whole has a power beyond its parts because of emergent behavior. People who talk about "the bigger picture" should focus more on studying emergent behavior. Generalizing from individual events can be done well by understanding emergent behavior (and it actually helps us not to generalize from the exceptions).
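A tiny sketch of the pixel analogy, using a synthetic grayscale "image" built as a plain 2D list (the image contents and the difference metric are made up for illustration): removing a single pixel changes the whole by a negligible amount.

```python
WIDTH, HEIGHT = 100, 100

# A simple diagonal gradient image with brightness values 0..255.
image = [[(x + y) * 255 // (WIDTH + HEIGHT - 2) for x in range(WIDTH)]
         for y in range(HEIGHT)]

# "Remove" a single pixel by blanking it out.
damaged = [row[:] for row in image]
damaged[50][50] = 0

# Mean absolute difference across the whole image.
total_diff = sum(abs(image[y][x] - damaged[y][x])
                 for y in range(HEIGHT) for x in range(WIDTH))
print(f"changed pixels: 1 of {WIDTH * HEIGHT}")
print(f"mean difference per pixel: {total_diff / (WIDTH * HEIGHT):.4f} out of 255")
```

One pixel out of ten thousand changes, and the average difference per pixel is a tiny fraction of the brightness range; the picture, as a whole, is still the picture.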