What’s next after the cloud? – Java Code Geeks


Today, the software world is dominated by “Big Data” and “AI” work taking place in “The Cloud”. Thinking back to the mainframe era of the 1950s, it must have been exceptionally difficult to foresee the coming of microcomputers, cell phones and the Internet, let alone a future in which mainframes would be obsolete, left behind to run only legacy software. That is exactly where the cloud stands today. We are simply at one more point in the evolution of computing machines, and something else is coming. But what?

In Thomas Kuhn’s seminal book on the sociology of science, The Structure of Scientific Revolutions, scientists work within a well-defined and accepted paradigm. During these periods of “normal science”, scientists make discoveries that are useful and exciting, but not particularly surprising or upsetting. They carry out the day-to-day work of advancing the state of the art in a steady, incremental way.

Within a paradigm, one of the main functions of science is to provide models that accurately predict observations. If a model fails to accurately predict what we observe, there are three possibilities:

  1. Experimental error
  2. The model can be made to work with some refinements
  3. The model cannot be made to work and a new model is required

When evidence begins to emerge that cannot be accounted for by the current model, the scientific community strongly prefers options 1 and 2. Adopting or developing a new model may be what is really needed, but the community tends to resist by:

  1. Denying the anomalous evidence
  2. Asserting authority
  3. Harshly criticizing new ideas
  4. Insisting that experimental error must be the cause
  5. Suggesting workarounds and refinements to reduce the predictive errors
  6. Attacking other scientists

The history of science is littered with ruined careers and missed opportunities caused by this group resistance to better models. In the short term, science is hardly a rational endeavor. But in the long run it wins out, or at least it has as far as we know.

A new and better model is adopted only when it appears with the ability to explain the anomalous observations, and when the workarounds have become more cumbersome than the new model itself. This shift, of course, forms a new paradigm to work within and marks the start of a new period of normal science.

Interestingly, Kuhn made these observations about science, not about technology or other human activities. But technology has a sociological structure too, and we see a similar pattern: (1) periods of “normal technology”, followed by (2) resistance, and then (3) the emergence of a new technological paradigm that begins a new period of normal technology.

The pace of this is particularly fast in computing. In terms of the evolution of electronic computing systems, the progression to date looks (only very roughly) like this:

1880s: Relays, logic gates
1890s: Hollerith tabulator
1930s: Special-purpose computers
1940s: Memory, transistors, the von Neumann architecture
1950s: Mainframe computers
1960s: Microprocessors, graphical interfaces
1970s: Minicomputers, UNIX, Ethernet, microcomputers, distributed computing
1980s: Personal computers, AI
1990s: Cell phones, the Internet
2000s: Cloud computing, big data
2010s: Quantum computers

Today, the cloud is our era of “normal technology” (quantum computers exist, but no one seems to know quite what to do with them). So, given that major changes have occurred virtually every decade in the past, what can we expect next?

One likely possibility is that an earlier paradigm will be upset by its inability to keep functioning as a model (just as in Kuhn’s account). In particular, two problems are currently holding computing back:

  1. Heat dissipation
  2. Compute and storage density

It is possible that an incremental change like gallium arsenide (GaAs) chips could extend the current computing paradigm. It could give us faster chips (THz rather than GHz) and less heat. But this would not represent a real paradigm shift. In particular, it would not give us anything qualitatively different from the current era of computing under The Cloud.

So what would actually be new?

Perhaps the von Neumann architecture, with us from the 1940s to the present day, will be the next thing to go. We already see evidence of the need to work around von Neumann’s design in modern chips and vector processors. But how might the whole paradigm change?

One possibility is “molecular computing”, where the density of computation and storage could increase dramatically. The Internet can seem like an enormous thing. Its total data is estimated to be on the order of a zettabyte (10^21 bytes, or roughly 10^22 bits). That is an interesting number because it is very close to 10^23, which is the order of magnitude of a mole (of any substance; one mole of water weighs about 18 grams). So if we could store data in water molecules, with each molecule holding one bit, we could store about 10^23 bits, or roughly ten Internets’ worth of data. In 18 grams of water.
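
To make that back-of-the-envelope arithmetic concrete, here is a minimal sketch in Java. The zettabyte figure and the one-bit-per-molecule idea come straight from the paragraph above; Avogadro’s number and the 18-gram molar mass of water are standard constants, and the class name is purely illustrative.

```java
/**
 * Back-of-the-envelope check of the "Internet in 18 grams of water" estimate.
 * Assumes (as above) the Internet holds roughly one zettabyte and that a
 * hypothetical molecular store keeps one bit per water molecule.
 */
public class MolecularStorageEstimate {

    public static void main(String[] args) {
        double internetBytes = 1e21;              // ~1 zettabyte
        double internetBits  = internetBytes * 8; // ~10^22 bits

        double avogadro = 6.022e23;               // molecules per mole
        double bitsPer18gWater = avogadro;        // one bit per molecule, ~10^23 bits
        double gramsOfWater = 18.0;               // mass of one mole of water

        double internetsPerMole = bitsPer18gWater / internetBits;

        System.out.printf("Internet:      ~%.1e bits%n", internetBits);
        System.out.printf("18 g of water: ~%.1e bits%n", bitsPer18gWater);
        System.out.printf("That is about %.0f Internets in %.0f grams of water%n",
                internetsPerMole, gramsOfWater);
        // Rounding both figures to orders of magnitude (10^22 vs 10^23 bits)
        // gives the article's "roughly ten Internets" in 18 grams; the exact
        // constants make the ratio a few dozen, the same ballpark.
    }
}
```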

It will take time for computing to move in this direction, but it surely will. And with it will go the von Neumann model. Molecular computers will have to be qualitatively different from von Neumann machines. They will have enormous amounts of memory, but they will likely have to perform computation far more locally, since a centralized bus and clock pulse will not scale to that size. Whatever the actual shape of molecular computers, they will (probably temporarily) obviate our current need for the cloud. So what will we have instead? A new kind of localized computing.

The driving force behind this paradigm shift will be the same one that drove the last major shifts:

Why would you want your huge pile of sensitive, proprietary data to be under someone else’s control on an internet cloud like AWS, when you can put it on one or two devices that are under your full control? If your big data business can meet all of its needs with a small number of machines, why would you even want to connect those machines to the Internet?

Of course, a decade or so later we will probably have workloads so demanding that some exotic applications (computational biology, perhaps) will require molecular computers to be pooled into a “molecular cloud”.

The transitions between computing paradigms have had more to do with the ownership and control of computation and data than with the technology itself:

1880s to 1950s: The mainframe era (centralized)
1960s to 1980s: The microchip era (decentralized)
1990s to 2020s: The Internet era (centralized)
Unknown: The molecular era (decentralized)

The decentralization that occurred in the microchip era was driven by the demands of managers and executives who wanted full control over the systems they used (in the 1950s, IBM mainframes were leased from and maintained by IBM). Will the same thing happen in reaction to the cloud age? Is AWS the new IBM, leasing compute to companies that ultimately want to take back control of their computing resources? The only thing stopping businesses from moving back to desktop machines is that those machines cannot handle the required compute load. If the molecular computers of the not-so-distant future can do what entire cloud systems do today, why would anyone use the cloud except to run legacy software?

Published on Java Code Geeks with permission by Jonathan Locke, partner at our JCG program. See the original article here: What’s next after the cloud?

The opinions expressed by contributors to Java Code Geeks are their own.


