Blockchain & Artificial Intelligence are the most transformational and disruptive of all new technologies
The thriving world of new technologies has never failed to fascinate us. ‘Emerging technologies’, a term now part of the popular tech jargon, broadly refers to any technology that is currently under development and is expected to be available widely in the coming years.
These technologies are going to permeate every way of life in the near future and might create momentous socio-economic disruptions.
At a recent World Economic Forum meeting in Davos, a panel discussion on emerging technologies highlighted some of the most innovative and promising among them: Blockchain, Artificial Intelligence (AI), Machine Learning, 3D Printing, and Cloud Computing were listed and marked as the engines of the Fourth Industrial Revolution.
Of these, blockchain and AI are the most transformational and disruptive of all.
Read: As Bitcoin loses steam, blockchain moving into next generation (September 10, 2021)
Before exploring further, let's look briefly at the history of computing.
History
In the 1820s, a mathematician named Charles Babbage (1791-1871) designed the first mechanical computing devices: the Difference Engine and, later, the general-purpose Analytical Engine.
These machines did not get far, as they were very complex devices. Today's computer systems are instead based on the Boolean algebra created by George Boole (1815-1864) in the mid-1850s, a number system built on just two symbols, 0 and 1 (that is, base two).
Back then, nobody realized its utility. Boole nonetheless went ahead and built an entire gamut of algebra around his theory, which became the foundation of present-day digital computing.
Later, in the 1880s, Herman Hollerith (1860-1929) was given the contract to tabulate the 1890 US Census. He invented a series of calculating machines based strictly on electro-mechanical devices.
His company was eventually absorbed into what became IBM, which, under the leadership of Thomas J. Watson (1874-1956), went on to become the world's foremost digital computer manufacturer and revolutionized the field of information processing.
In 1936, Alan Turing (1912-1954) described the universal Turing machine, the first theoretical design of a programmable computer. It remains the conceptual basis for how computer systems are designed.
This was followed by John von Neumann (1903-1957), who by the mid-1940s had devised the actual architecture of the modern digital computer.
Von Neumann's architecture was very close to today's systems: a memory unit, a central processing unit, an arithmetic unit, and input and output, all built into one box.
Initially, such systems were designed and built with vacuum tubes. The first vacuum-tube computer was ENIAC, parts of which are preserved in Philadelphia museums today.
The invention of the transistor in 1947 revolutionized the entire digital computer industry. Machine language, used for programming until then, was very complex and difficult to code, so in 1959 the US Department of Defense constituted a committee to design a higher-level programming language.
This resulted in the CODASYL (Conference on Data Systems Languages) committee and the design of COBOL (Common Business-Oriented Language). Meanwhile, IBM defined and propagated a new language for scientific calculation, called Fortran.
In the 1960s, IBM came out with much more versatile computers, the IBM System/360 series, which could handle commercial data processing as well as scientific calculation.
The 360 series also introduced the first of the database management systems: a hierarchical system called IMS (Information Management System).
In 1970, the mathematician and computer scientist Edgar F. Codd (1923-2003) proposed the relational data model for databases, later elaborated and championed by Christopher J. Date (b. 1941); it is used by all the major databases even today.
In 1971, Intel Corporation came out with the first 4-bit microprocessor chip; the subsequent progression of Intel chips from 4-bit to 8-bit to 16-bit made the microcomputer possible.
Until then, all the computational work was done by giant mainframe computers which were limited to universities and corporate sectors.
In 1973, IIT Madras got one of the fastest computer systems in the world at the time, an IBM 370-155. It probably had substantially less computing power than today's Apple iPhone.
The world has changed so much since. In the 1980s, microcomputers and desktops became very popular, and Microsoft Corporation came out with two new operating systems, which further proliferated the use of computers for small business and personal use.
The telecommunications industry was also being transformed. AT&T, a very powerful corporation at the time, was later split into several companies by the US government.
The 1980s also saw the development of computer-to-computer networking. In the early 1970s, we had terminal-to-computer connectivity, using IBM's TSO (Time Sharing Option) facility.
But real networking took off in the late 1980s with the Microsoft operating systems. Around that time, Prof. James Martin (1933-2013), a fellow at IBM, came out with a series of textbooks that may be termed a 'trilogy of information engineering', wherein he defined software development as an engineering process rather than a creative art.
Texas Instruments came up with a CASE (Computer-Aided Software Engineering) tool, which could be used to design a whole application system, from basics to planning to execution, without any programming. That was a novelty.
Although CASE tools and other technologies were emerging, computer programming had still not become a true engineering discipline, for many reasons.
This resulted in high software creation costs and significant failure rates. The next revolutionary step came in 1993, when the US government released internet technology to the world. Until then, the internet had been available only to US Defense personnel, Defense agencies, and some major universities.
The Internet started an information technology revolution. However, a glitch came in between, called "Y2K", a problem created by the programmers themselves.
In those days, computer programs had to be short and crisp, as memory was a major constraint. Programmers therefore compressed the four-digit year to two digits, which was fine through the 1980s and 1990s, but at the start of 2000 the two-digit year would roll over to 00 and wreck the programs' date arithmetic.
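A toy sketch makes the failure concrete (the function name is purely illustrative, not from any actual legacy system):

```python
# Toy illustration of the Y2K problem: with only two digits stored,
# the year 2000 ("00") comes out *smaller* than 1999 ("99").
def two_digit_age(birth_yy: int, current_yy: int) -> int:
    """Age computed the way many legacy programs did: current - birth."""
    return current_yy - birth_yy

# Works fine within the 1900s...
print(two_digit_age(65, 99))   # born '65, year '99 -> 34

# ...but breaks the moment the century rolls over:
print(two_digit_age(65, 0))    # born '65, year '00 -> -65, not 35
```

Any program that sorted, compared, or subtracted these truncated years faced the same rollover, which is why the fixes had to be made program by program.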
That sent a real shock wave through computer corporations, governments, and financial institutions, which spent substantial amounts of money redesigning programs.
This presented many opportunities for Asian immigrants, especially Indians, to come to the United States to work on Y2K remediation, leading to the first wave of IT professionals moving to the US and the Western world.
The real revolution of the Internet started at the beginning of 2000. Y2K surplus funds were available, and a lot of companies promised great products, but not much materialized.
The result was the dot-com bubble of the early 2000s. However, the ingenuity of people prevailed: in the roughly two decades since 2001-02, the internet has reached every corner of the world, and today more than four billion people are connected to it.
The Internet came to function through the integration of information technology, telecommunications technology, and software engineering. Its basis is a telecommunications standard called TCP/IP (Transmission Control Protocol / Internet Protocol).
TCP/IP is a protocol suite for transferring data from one location to another. The infrastructure was built everywhere, so TCP data transfer itself was never a problem for programmers.
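From a programmer's point of view, what TCP provides is a reliable, ordered byte stream between two endpoints. A minimal sketch, using Python's standard `socket` module with a local connected socket pair standing in for two networked machines (this illustrates the byte-stream abstraction only, not the protocol internals):

```python
import socket

# A connected pair of sockets stands in for two networked endpoints;
# TCP itself guarantees ordered, reliable delivery of the byte stream.
sender, receiver = socket.socketpair()

sender.sendall(b"hello over a reliable byte stream")
sender.close()  # closing signals end-of-stream to the peer

# Read until the stream is closed, exactly as a TCP receiver would.
chunks = []
while True:
    data = receiver.recv(1024)
    if not data:
        break
    chunks.append(data)
receiver.close()

message = b"".join(chunks)
print(message.decode())
```

Because the transport layer handles delivery, ordering, and retransmission, application programmers could build everything above it (email, the web, file transfer) without worrying about how the bytes travel.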
Read: Lure of Bitcoins: Greed blinds even the wise to risks (February 24, 2021)
The rest of the edifice was much easier to build. Today, every conceivable application has already been migrated from legacy systems to the internet.
However, the internet is not inherently reliable, secure, safe, or protected. With the advent of more computer applications and the opening up of the internet, data security became a real challenge.
On average over the past few years, data privacy and security violations alone have cost US insurance and financial institutions about $50 billion.
Computer privacy and data security have also entered the realm of espionage: Russian actors were reportedly able to log into American computer systems and influence US elections.
To resolve this problem, we must design new innovative technologies. Emerging technologies are promising to be a major solution provider in this field.
Blockchain
The most promising among emerging technologies is blockchain. The concept was originally proposed in a white paper in the 1990s and was first implemented in 2008 by Satoshi Nakamoto.
Bitcoin is a blockchain-based platform for mining cryptocurrency, floated by Nakamoto and his associates; it enabled registered users of the Bitcoin system to transfer funds among themselves.
Normally only sovereign countries are authorized to issue currencies, but Bitcoin's founders, who believed in less government, created the first cryptocurrency, the popular bitcoin.
If they could create a currency, they could also manage it without the government ever knowing. As a result, undisclosed funds from various individuals and corporations went into cryptocurrencies like Bitcoin.
Bitcoin was designed and developed using blockchain, one of the most secure internet systems today. Blockchain builds its own infrastructure for carrying out transactions on top of the existing TCP/IP infrastructure.
For example, in the early days of the internet, email was not standardized, so a user on one email system could not communicate with a user on another.
Later, a standard called SMTP (Simple Mail Transfer Protocol) unified the protocols within email systems and enabled inter-communication between different ones.
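The standardization applies not just to the transfer protocol but to the message format SMTP carries, which is why any mail system can parse mail from any other. A small sketch with Python's standard `email` library (the addresses are hypothetical):

```python
from email.message import EmailMessage

# Build a message in the standardized headers-plus-body format
# that any SMTP-speaking mail system can relay and parse.
msg = EmailMessage()
msg["From"] = "alice@example.com"       # hypothetical sender
msg["To"] = "bob@example.org"           # hypothetical recipient
msg["Subject"] = "Interoperable mail"
msg.set_content("Any SMTP server can relay this message.")

# as_string() yields the wire format a mail server would transmit.
print(msg["Subject"])
```

Sending it would only require handing `msg` to an SMTP server; the point here is that the format itself is the shared standard.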
Peer-to-peer communication is a newer concept in information technology. In the past, all IT devices were connected to a central processing system: one central computer to which everything else was attached and on which all information was stored.
In the current internet system, that central computer is effectively distributed all over the world.
In a peer-to-peer architecture, there is no central computer at all; each node is equal to every other node, and all are connected on an equal-status basis.
This is accomplished with the help of cryptographically secured data structures. Data in a blockchain is transmitted cryptographically to address security and privacy concerns: an algorithm transforms the underlying data before transfer and transforms it back on retrieval.
In addition, there is a private key and a public key for accessing any data in blockchain technology. An interesting feature of blockchain is that it is open-source technology: nobody controls it centrally. A community provides the open-source code, available to everybody, and anyone can participate and contribute to its design and development, subject to evaluation and acceptance.
The two keys are each about 32 bytes (256 bits) long, usually rendered as alphanumeric strings. The private key is for individual use and need not be disclosed to anybody, while the public key is used by the system to distribute data across the network. Thus, blockchain is very robust in terms of data transmission and data recording.
Another aspect of blockchain technology is that data is written in the form of blocks. The physical data is first transcribed and then converted cryptographically.
The key, date, and timestamp of the record are also created and stored cryptographically. Each record contains all this information, and the next record refers back to it, and so on: a chain of records is systematically created.
No record can be inserted in between or deleted. Together, this chained record of blocks forms a ledger, open and available on the network for everybody to use.
Anybody with access to the network can view it, but cannot tell whom a record belongs to or what information it contains, since the recording is cryptographic; only the owner of a node can interpret his own data.
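The chaining idea described above can be sketched in a few lines of Python with the standard `hashlib` module. This is a minimal toy (the field names and block format are illustrative, not Bitcoin's actual format): each block stores a timestamp, its payload, and the hash of the previous block, so altering or inserting any earlier block breaks every later link.

```python
import hashlib
import json
import time

def make_block(payload: str, prev_hash: str) -> dict:
    """Create a block whose hash covers its timestamp, payload,
    and the hash of the block before it."""
    block = {"timestamp": time.time(), "payload": payload, "prev": prev_hash}
    body = {k: block[k] for k in ("timestamp", "payload", "prev")}
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain: list) -> bool:
    """Every block must point at the hash of its predecessor."""
    for prev, cur in zip(chain, chain[1:]):
        if cur["prev"] != prev["hash"]:
            return False
    return True

genesis = make_block("genesis", "0" * 64)
b1 = make_block("Alice pays Bob 5", genesis["hash"])
b2 = make_block("Bob pays Carol 2", b1["hash"])
chain = [genesis, b1, b2]
print(chain_is_valid(chain))       # True

b1["payload"] = "Alice pays Mallory 500"   # tamper with history...
b1["hash"] = "forged"
print(chain_is_valid(chain))       # False: b2 no longer links to b1
```

A real blockchain adds signatures, consensus among peers, and proof-of-work on top of this linkage, but the tamper-evidence comes from exactly this chained-hash structure.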
Bitcoin is essentially for financial transactions only. Soon after its release, one Bitcoin cost about five cents; by 2017 its price had climbed to $20,000.
It fell back to about $7,000 in 2019, and has since skyrocketed to around $58,000 today. Bitcoin is, in effect, a speculative currency.
Despite this instability, many people who want to keep their funds beyond government control and knowledge are fascinated with Bitcoin. Today, there are more than 10,000 cryptocurrencies in the marketplace.
Of late, there is another platform called Ethereum, whose principals also worked within the Bitcoin community, and which came up with a similar framework.
Their framework can be used not only for financial transactions but also for asset-based transactions: any asset-based online system can be developed using Ethereum.
Ethereum and Bitcoin are, again, essentially platforms; they are used to develop, implement, and operate software, so that the entire stack of TCP/IP and blockchain layers need not be built again.
Ethereum, which is also open source, has thus become the second most popular platform among blockchain developers today.
As peer-to-peer architecture is not ideal for every kind of data processing on the internet, migrating and transferring all existing applications to blockchain is a real challenge.
Read: Cryptocurrency: A serious threat; Blockchain: A revolution underway (July 6, 2021)
So a third group, under the umbrella of the Linux Foundation with IBM and others, came out with a new open-source framework called Hyperledger Fabric, a foundation framework for developing applications rapidly.
It provides a virtual cloud computing environment with tools for data analysis, database design, screen design, logic design, and so on, without the need for much programming.
Earlier, a technology called the client-server model was used, in which a server serves a group of clients without either side needing to know exactly where the other is.
In cloud computing, a computer can be used without knowing its location at all. Cloud computing can also be customized for a specific application, called virtual cloud computing, and it allows database access.
Normally, blockchain does not interface with outside programs. For reasons of data security and privacy, blockchain systems maintain strict limitations on interfacing with other programs and databases, though Ethereum relaxes this somewhat.
IBM's Hyperledger Fabric, on the other hand, operates as a permissioned network with access restrictions and can be used to develop internet applications, including ones backed by databases.
Artificial Intelligence (AI)
AI is a proven technology now, with widespread applications already in place. GPT-3, for instance, is a recent neural-network-based language model built with machine learning; such models are mathematically complex and need large volumes of data for training.
The data the world created in a whole year around 1980 is roughly equivalent to the data created in one second today. With this massive amount of data available, AI is getting more popular and robust.
The future of information technology depends increasingly on complex mathematical algorithms, and this will be the most fascinating evolution of the next 10 years.
These algorithms evolve from conceptual mathematical principles, driven by the need to solve real problems. For example, the spam blocker in email systems works on a simple machine learning concept.
Read: The world of Artificial Intelligence (September 6, 2020)
An algorithm analyses emails to figure out whether they could be spam. Similarly, for networks, there are machine learning-based technologies to detect and prevent illegal intrusions.
AI needs a high volume of data to come up with potential solutions. For example, when you use a web browser and click on different things, you are captured as an individual along with your preferences, location, and other details. AI is still at an early stage, but it is going to grow substantially.
3D Printing
3D printing is another emerging technology. Two-dimensional (2D) printing started nearly 600 years ago, when Johannes Gutenberg (c. 1400-1468) designed the printing press for mass publication of the Bible.
Now, with the advent of information technology, innovative programming, and precise use of materials, three-dimensional printing has become a reality.
A mathematical algorithm creates the three-dimensional geometry of the product, and finite element techniques with precise material injection are used to print it. It is a sophisticated mechanical engineering application.
Today, 3D printing is finding applications even in the arena of biotechnology. In the US, it has already been demonstrated that certain tissues can be 3D-printed.
It is predicted that even complicated organs of the human body can be 3D-printed using tissue culture and other advanced biotechnology tools. Thus, 3D-printing technology is going to make our lives much more comfortable, and the industry is going to grow.
Conclusion
The world is changing faster than ever, and the future is for the most part, uncertain. Among the different emerging technologies analysed, the two most promising are blockchain and AI.
Blockchain is a disruptive technology because it is going to fundamentally alter the way the system works at present. All existing applications are going to be migrated to blockchain in the next 10-20 years. Imagine the massive amount of work required!
In one of its recent reports, the Gartner Group, the popular technology evaluation company, predicted that about 10% of all computer applications would be converted to blockchain technology by 2022, and that about $3.1 trillion worth of business would be done via blockchain by 2030.
Emerging technologies are growing faster than we anticipate and will create vital challenges for tech entrepreneurs and the common man alike. By discovering synergies and crafting innovative solutions, we can drive these technological disruptions in our favor.
It’s a great opportunity for everyone, particularly the younger generation, to take advantage of the wealth of resources, and create a fortune for themselves and the world.
(Krish Pillai is a former business owner and Information Technology consultant for over thirty years. This is an abstract from a guest lecture delivered by the former IIT Delhi student and IIT Madras staff member at the Indian Institute of Technology, Guwahati, India. Transcribed and edited by Nandakishore Nair.)