Population stabilisation:
Most societies, even in the developing world, are now seeing population growth stagnate at or below replacement levels. For that reason alone, it will be challenging to sustain today's economic growth rates: when population stagnates, labour becomes expensive, consumption decreases, and growth flattens. Even the last remaining regions of Africa are urbanising rapidly and will likely follow this trend within the next 20 years. In a way, evolution itself is forcing humanity to cut down on its wasteful ways. The authors of this article should have touched upon population deflation to make a more balanced case.
Biological systems are logical too. The genome is the program hardwired into them, and they run on the mathematics of chemical reactions. Even the simplest organisms, like the single-celled amoeba, have more complicated and highly optimized survival algorithms programmed into their genomes than the most complex system ever built by humans (like the LHC). An amoeba behaves the way it does because the laws of physics, embodied in the mathematics of its chemical reactions, caused it to rewrite its own genetic code so as to optimize the probability of its own survival. Mathematics is merely an abstraction of our minds for understanding the reality of our world. The programs we create solve problems in our world and are constrained by the same kinds of variables that constrain biological systems; both biological and software systems can be interpreted in terms of mathematics.
No real-world biological or physical system is 'simple'. Every program starts simple, but even that is open to interpretation: a one-liner in any language is extremely complicated once you factor in the OS, the compiler, and the interpreter that will actually run it. Once the program takes on more complicated functionality, and multiple programmers, it becomes anything but simple. Simplicity is good for organizing our thoughts and designing systems, but it is a whole different matter to expect that the designed system will stay simple. Hence, just like any other system in nature, software has to evolve, and there shouldn't be anything wrong with that. A mathematical proof is not a panacea for the problems in software development. The purpose of software changes, its constraints change, and proofs would have to be rewritten every time. You don't expect the underlying laws of mathematics to change; that is why, once proved, a theorem stays proved. But in software it is expected that the system's underlying purpose will change with time. So proving the current state is all we can do, which is of little consequence as time goes by.
This was written in 2009. Perhaps one can be kinder to the author in his pleading that looking at cells as individual entities was not getting adequate research focus. Also, things like changes after organ donation do not yet have any scientific explanation.
> This was written in 2009. Perhaps one can be kinder to the author in his pleading that looking at cells as individual entities was not getting adequate research focus.
It wasn't that under-researched in 2009 either; microbiology has always been an active field.
> Also, things like changes after organ donation do not yet have any scientific explanation.
> This was written in 2009. Perhaps one can be kinder to the author in his pleading that looking at cells as individual entities was not getting adequate research focus.
What does the first sentence have to do with the second? What's changed since 2009 that would be relevant?
From my own experience, Firefox is a better product than Chrome in a lot of ways. For my daily needs, Firefox can easily survive 150 tabs, while Chrome nearly chokes my machine at 30. Firefox on Android has a handy reader mode, which I use to read at night, and it supports extensions. It remains the most standards-compliant browser implementation that I know of. I have been using it for over 5 years, and though I am occasionally forced to use Chrome (for Chromecast-related work), it remains my primary browser. I will use it as long as I can.
You can tell a company has built a stellar product when it can retain customers even without discounts and cashback offers. Look at Amazon's rivals: in India at least, Flipkart and Snapdeal are biting the dust, while Amazon doesn't even give customers any special offers. They have perfected the things they must do well to succeed in their business: fast delivery and easy returns. Amazon is the flag bearer of e-commerce and is doing a good job advancing the state of the art in this segment of human endeavour.
I think the author has not thought of frame reordering in error scenarios. If UDP is used, frames could be dropped mid-stream or arrive out of order (TCP would deliver them in order, at the cost of retransmission delays). That would wreak havoc with the odd-odd numbering sequence of the encoding.
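To make the failure mode concrete, here is a minimal sketch (the class and method names are hypothetical, and I'm assuming each frame carries a monotonically increasing sequence number) of how a receiver can detect the drops and reorderings described above before they corrupt a numbering-dependent decoder:

```java
import java.util.List;

/** Hypothetical sketch: validate that frames arrived in order with no gaps. */
public class FrameChecker {

    /**
     * Returns true only if the observed sequence numbers are exactly
     * expectedStart, expectedStart+1, ... with no gap (a dropped frame)
     * and no inversion (a reordered frame).
     */
    public static boolean isIntact(List<Integer> seqNums, int expectedStart) {
        int expected = expectedStart;
        for (int seq : seqNums) {
            if (seq != expected) {
                return false; // gap or reordering detected
            }
            expected++;
        }
        return true;
    }
}
```

Any decoder whose correctness depends on frames alternating in a fixed numbering pattern would need a check like this (plus a reorder buffer or retransmission request) in front of it.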
Not at all convincing. I have been meditating for more than 10 years and it has always been an exhilarating experience. The author is trying to make up a story, it seems.
That's an anecdote, just as lots of people have great experiences with psychedelics. If you search, you'll find "some" stories of people suffering derealization/depersonalization after going on a meditation retreat, for instance.
Kafka can be used when you want to treat data as streams for processing (think producer-consumer scenarios). You can start reading the stream from any offset, replaying events 'as and when things happened'. Kafka's own nodes have replication enabled, and the data it produces can be consumed in a distributed setting as well (multiple consumers acting as a single high-level consumer). But it is not a traditional database like MongoDB or MySQL.
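The "read from any point in time" idea is easy to model. This is not the real Kafka API — just an in-memory sketch of the core abstraction: an append-only log where each event gets an offset, and a consumer can replay from any offset it chooses:

```java
import java.util.ArrayList;
import java.util.List;

/** In-memory sketch (not Kafka's actual API) of an append-only event log. */
public class MiniLog {
    private final List<String> entries = new ArrayList<>();

    /** Appends an event and returns its offset in the log. */
    public long append(String event) {
        entries.add(event);
        return entries.size() - 1;
    }

    /** Replays every event from the given offset onward, in original order. */
    public List<String> readFrom(long offset) {
        return new ArrayList<>(entries.subList((int) offset, entries.size()));
    }
}
```

The contrast with MongoDB or MySQL falls out of this model: a traditional database answers queries about the current state, while a log like this answers "what happened, in what order, starting from where I left off".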
I have been using Kafka 0.8.2 in production for six months, consuming real-time event traffic from our caching layer. The most difficult part of the experience has been occasional consumer lag that erupted without warning or obvious cause in the high-level Java consumer APIs. A lot of experimentation with their configuration proved futile, and we have had to build a feedback system that triggers alerts to change the group IDs of our high-level consumers every time some consumers start lagging.
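The trigger condition in a feedback system like ours is simple to state: consumer lag is the log-end offset minus the consumer's committed offset, and an alert fires when it crosses a threshold. A minimal sketch (class and method names are mine, not Kafka's):

```java
/** Hypothetical sketch of the alerting rule in a consumer-lag feedback system. */
public class LagMonitor {
    private final long threshold;

    public LagMonitor(long threshold) {
        this.threshold = threshold;
    }

    /** Lag = newest offset in the partition minus the consumer's committed offset. */
    public long lag(long logEndOffset, long committedOffset) {
        return logEndOffset - committedOffset;
    }

    /** True when remediation (in our case, rotating the group ID) should be triggered. */
    public boolean shouldAlert(long logEndOffset, long committedOffset) {
        return lag(logEndOffset, committedOffset) > threshold;
    }
}
```

In practice the offsets would be polled from the broker and from the consumer's commit state; the sketch only captures the decision rule.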
Otherwise, Kafka's performance has been impressive (a throughput of up to 15,000 packets/sec to an 8-consumer pool), though I have not had the chance to compare it with any other such tool/library.
Nevertheless, I think this update is a long-awaited one, and Kafka Connect may really be a good starting point for building more (and better) endpoints.
I have been writing Java code for 6 years. Although I can't say that I have programmed in many other languages (I have coded in C/C++ and JavaScript), Java has always been very comfortable to use. It's easy to think of a solution in the object-oriented paradigm, and refactoring (when a requirement changes) has not been much of a problem for me. It's only bad designs that make code more verbose than it should be, and verbosity is actually a good way to articulate your thoughts and see them evolve in front of you. Recently, when I started learning a bit of Haskell, I found there are much faster and more concise ways of writing code, but I still prefer Java for the clarity with which it lets me express my logic. People say that one line of Haskell does what 10 lines of Java would, but in the long run I have found that spelling out programming paraphernalia like interfaces and classes actually helps the coder in his job, rather than necessarily wasting his time.