Digitalization

Digitalization can be defined as the conversion of information (text, pictures, sound, and essentially any other type of information) into a series of the digits 0 and 1, which can be automatically processed by a computer. Transistors in integrated circuits are the building blocks for the transmission, processing, and storage of such digits, and therefore of digitalization. The transistor was invented in 1947 by John Bardeen, Walter Brattain, and William Shockley, three researchers working at Bell Labs, the research branch of the American telephony monopoly. They were awarded the Nobel Prize in Physics for an invention that would transform the world. Shockley moved to his family’s hometown, Palo Alto, to launch his transistor manufacturing company and attracted some of the world’s best talent to his venture. Shockley’s venture did not succeed, but as his first employees (the so-called “traitorous eight”) started new ventures, the history of the Silicon Valley cluster began. Growing computing power has dramatically expanded the capacity to process and store huge amounts of data at ever-lower cost.

Digitalization allows the creation of “digital twins” that simplify the coordination of complex systems such as network industries. A data layer is emerging on top of physical reality, virtually recreating it in silico. Algorithms can then identify opportunities to improve the organization of the system, thus increasing efficiency. The underlying physical reality can subsequently be transformed and improved. This is also the case for infrastructures and network industries,5 such as aviation,6 railways,7 and electricity.8

Sensors, cameras, meters, and other devices can be installed in physical assets to capture and transmit data to the infrastructure manager. Such data can recreate in the data layer the status of the infrastructure (location, damage, collapse, and so on), as well as its usage for the provision of services (capacity, traffic flows, and payments).
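The data-layer record described above can be sketched as a minimal digital-twin object, here in Python. All names (`AssetTwin`, `apply_reading`) and the message format are hypothetical, invented purely for illustration; a real infrastructure manager’s system would be far richer.

```python
from dataclasses import dataclass

@dataclass
class AssetTwin:
    """Hypothetical data-layer record mirroring one physical asset."""
    asset_id: str
    location: tuple          # (latitude, longitude)
    condition: str = "ok"    # e.g. "ok", "damaged", "collapsed"
    traffic_count: int = 0   # cumulative usage reported by sensors

    def apply_reading(self, reading: dict) -> None:
        """Fold one incoming sensor message into the twin's state."""
        if "condition" in reading:
            self.condition = reading["condition"]
        self.traffic_count += reading.get("vehicles", 0)

# Example: a bridge twin receives two sensor messages.
bridge = AssetTwin(asset_id="bridge-17", location=(40.4168, -3.7038))
bridge.apply_reading({"vehicles": 120})
bridge.apply_reading({"vehicles": 95, "condition": "damaged"})
print(bridge.traffic_count, bridge.condition)  # 215 damaged
```

The point of the sketch is only that the twin is updated from sensor messages, so the data layer always reflects both the status and the usage of the asset.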

Just as important as the processing and storage of digits by computers is the transmission of such digits between computers. On October 29, 1969, at 10:30 pm, the computer network of the University of California, Los Angeles and the computer network of the Stanford Research Institute were interconnected, creating ARPANET, the network of computer networks of ARPA, the research arm of the US Department of Defense. This was the seed for what would later become the internet,9 an abbreviation of “interconnected networks.” In its essence, the internet is a protocol (the Transmission Control Protocol/Internet Protocol, or TCP/IP), developed by Vinton Cerf and Bob Kahn, that enables the interaction of independent computer networks. The very nature of the internet is to facilitate the interaction of previously fragmented and isolated computer networks.

Digitalizing infrastructure depends on the availability of connectivity for the transmission of data, as high-speed internet access is a key enabler for other digital technologies.10 The development of 5G wireless networks, not only in densely populated areas, but also in the remote areas crossed by infrastructure, is a fundamental condition for the digitalization of infrastructures.

Algorithms are enabling the full exploitation of Big Data.11 Sophisticated algorithms are necessary to give order to the massive amounts of data generated by sensors and other data-collecting devices, thus making them meaningful. Furthermore, algorithms are incorporating machine learning tools, or “artificial intelligence.” They are no longer a set of fixed commands rigidly linking a fact to a consequence. On the contrary, algorithms comb through the available data to learn from previous experience, transforming data into information and linking it to consequences. Algorithms improve with each interaction; they are becoming more and more predictive.12 Automation is the ultimate goal of many digitalization projects.
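The shift from fixed rules to learning from past data can be illustrated with the simplest possible predictive algorithm: an ordinary least-squares line fitted to past observations, written in plain Python. The traffic figures below are invented for the example; real predictive models for infrastructure are of course far more sophisticated.

```python
def fit_line(ys):
    """Least-squares slope and intercept for y observed at x = 0, 1, 2, ..."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Five days of (invented) traffic counts: the algorithm "learns" the trend...
daily_traffic = [100, 104, 108, 112, 116]
slope, intercept = fit_line(daily_traffic)

# ...and turns it into a prediction for day 6.
prediction = slope * 5 + intercept
print(round(prediction))  # 120
```

The more observations the model is given, the better its estimate of the trend becomes, which is the sense in which such algorithms “improve with each interaction.”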

 