Sfetcu, Nicolae (2022). The supremacy of IT&C technologies, DOI: 10.13140/RG.2.2.33253.27362, in Telework, https://www.telework.ro/en/the-supremacy-of-itc-technologies/
Abstract
Over the last 60 years, computing power has grown explosively, driven mainly by space, military, and industrial research and accelerated by the recent COVID-19 pandemic and geopolitical conflicts. The defining features of the near future will be greatly increased computing power, smarter devices, the datafication of all aspects of our lives, increased reliance on digital technologies, and increased automation of industrial processes. The most targeted areas are the Internet of Things, virtual reality, augmented reality, and artificial intelligence. IT trends in business, the military, and geopolitics will reshape our future socially. Some technologies, including 5G networks, blockchain, and virtual reality, have not yet reached a stable and mature stage, but analysts see great potential in them.
Keywords: technologies, IT&C, predictions
The supremacy of IT&C technologies
Nicolae Sfetcu
Over the last 60 years, computing power has grown explosively, driven mainly by space, military, and industrial research and accelerated by the recent COVID-19 pandemic and geopolitical conflicts. The defining features of the near future will be greatly increased computing power, smarter devices, the datafication of all aspects of our lives, increased reliance on digital technologies, and increased automation of industrial processes. The most targeted areas are the Internet of Things, virtual reality, augmented reality, and artificial intelligence.
The optimal solution to the current problems is to accelerate digital transformation by prioritizing specific processes and functions. A typical example is the development of applications for education and video conferencing during the pandemic. Among the issues hindering digital transformation is the lack of comprehensive macro-level datasets, so the overall effect of digital transformation still appears quite small.
Artificial intelligence has been in use since its inception in the 1950s. We now find it in home appliances, communications, healthcare, agriculture, military applications, and more; it is ubiquitous in all human activities that involve even a minimal degree of automation and control. At the same time, it is contested, both because of the possibility of losing control over it if robots surpass human intelligence and because of the social problems that arise when AI-based automation eliminates jobs. Artificial intelligence increases efficiency, is less error-prone, can work continuously without breaks, and can be used in situations that are risky for humans. It will increasingly be used to analyze interactions to uncover underlying connections and insights, to help predict demand for services, and to detect changing patterns of behavior. Some current studies focus on the life cycle of an AI system, which defines the phases an organization should follow to take advantage of AI techniques, and especially machine learning models, to achieve business value in practice. A current limitation is that, for now, applying artificial intelligence involves high costs. Research in artificial intelligence currently focuses on reasoning, knowledge representation, planning, learning, natural language processing, perception, and the ability to move and manipulate objects. These goals are pursued through mathematical search and optimization, formal logic, artificial neural networks, and methods based on statistics, probability, and economics. If research in artificial general intelligence produced sufficiently intelligent software, it might be able to reprogram and improve itself; its intelligence would grow exponentially in an intelligence explosion and could dramatically surpass that of humans, in line with the concept of transhumanism. In the near future, artificial intelligence will play a predominant role in facial recognition, school education, medicine, business, and diplomacy.
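As a minimal, illustrative sketch of the machine learning component mentioned above, the Python example below trains a simple statistical classifier and evaluates it on held-out data; the scikit-learn library and the sample dataset are assumptions chosen for illustration, not part of any system described in this article.

```python
# Minimal sketch of a supervised machine-learning workflow (illustrative only).
# Assumes scikit-learn is installed; the iris dataset stands in for business data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)              # features and labels
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42)     # hold out data for evaluation

model = LogisticRegression(max_iter=200)       # a simple statistical model
model.fit(X_train, y_train)                    # "learning" phase of the life cycle

predictions = model.predict(X_test)            # "inference" phase
print("Accuracy:", accuracy_score(y_test, predictions))
```

The split between a learning phase and an inference phase mirrors, in miniature, the phases of the AI life cycle discussed above.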
5G communications networks have recently become a reality after a decade of development. 5G is the fifth-generation technology standard for cellular broadband networks. The new networks offer higher download speeds and more bandwidth, improving the quality of Internet services in congested areas. 5G-Advanced is the name for Release 18 of the 3GPP specifications. 5G enables complex operations that are now becoming achievable, such as drone control, autonomous cars, and smart cities. The 5G architecture results from the analysis of various publicly accessible reports published by standards, research, and scientific bodies, aimed at establishing a common and coherent understanding of its components; for this purpose, it is necessary to visualize the various components in a modular and general way. By 2024, 5G networks are expected to cover 40% of the world and carry about 25% of global mobile data traffic.
The Internet of Things (IoT), after a period of slower growth during the pandemic, has resumed its accelerated development and is expected to grow much more in the coming years due to the convergence of several technologies, including commodity sensors, increasingly powerful embedded systems, and machine learning. The concept of the “smart home”, more relevant than ever in light of the current energy crisis, is almost impossible to develop without the Internet of Things, despite the privacy and security risks. The current IoT trend is the explosive growth of devices connected to and controlled over the Internet with the help of web servers, allowing a more direct integration of the physical world into computer-based systems.
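As a hedged illustration of how a connected device integrates the physical world into computer-based systems, the sketch below sends a (simulated) sensor reading to a web server; the endpoint URL, device identifier, and payload fields are hypothetical placeholders.

```python
# Illustrative sketch only: an IoT node reporting a sensor reading to a web server.
# The endpoint URL and payload fields are hypothetical placeholders.
import json
import random
import time
import urllib.request

ENDPOINT = "https://example.com/api/telemetry"  # hypothetical ingestion endpoint

def read_temperature():
    # Stand-in for a real sensor driver.
    return round(20.0 + random.uniform(-2.0, 2.0), 2)

def publish(reading):
    payload = json.dumps({"device_id": "sensor-01",
                          "temperature_c": reading,
                          "timestamp": time.time()}).encode("utf-8")
    request = urllib.request.Request(ENDPOINT, data=payload,
                                     headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    print(publish(read_temperature()))
```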
In recent years, software developers have had at their disposal processors with multiple execution units, or multiple processors performing computations together, and computing has become a much more concurrent activity than in the past. Software design and implementation vary with the complexity of the software: designing and creating Adobe Photoshop, for example, took much longer than designing and developing a simple text editor because the former offers far more functionality. In the coming years, new and more powerful integrated development environments are expected to emerge that can simplify the process and compile software through application programming interfaces.
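A minimal sketch of this kind of concurrency, assuming a standard Python installation, is shown below: independent computations are distributed across processor cores instead of running one after another.

```python
# Minimal sketch of spreading independent computations across processor cores.
# Uses only the Python standard library; the workload is an arbitrary example.
from concurrent.futures import ProcessPoolExecutor

def heavy_computation(n):
    # Placeholder for CPU-bound work.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    inputs = [10_000, 20_000, 30_000, 40_000]
    with ProcessPoolExecutor() as pool:   # one worker process per core by default
        results = list(pool.map(heavy_computation, inputs))
    print(results)
```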
Data science is an emerging interdisciplinary field concerned with identifying, collecting, analyzing, and presenting data of interest in a wide range of application domains. It is closely related to big data, data mining, machine learning, and artificial intelligence, using specific techniques and both general-purpose programming languages (such as Python) and specialized ones (the R programming language, for example). There is still no consensus on the definition of data science. Data scientists are responsible for breaking massive datasets (big data) down into usable information and for creating software (for example, in R) and algorithms that help companies and organizations determine optimal operations.
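The following sketch illustrates, under the assumption that the pandas library is available and with invented sample records, the basic data-science step of condensing raw data into usable information.

```python
# Illustrative data-science step: condensing raw records into a usable summary.
# Assumes pandas is installed; the sales records are invented for the example.
import pandas as pd

records = pd.DataFrame({
    "region": ["north", "south", "north", "south", "north"],
    "revenue": [120.0, 90.5, 150.0, 60.0, 110.0],
})

# Aggregate raw rows into per-region indicators that support decisions.
summary = records.groupby("region")["revenue"].agg(["count", "mean", "sum"])
print(summary)
```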
Cybersecurity has become particularly important due to our increased dependence on computer systems (the Internet, Bluetooth, Wi-Fi, etc.) and smart devices (smartphones, TVs, and the Internet of Things in general). In the near future, cybersecurity will be one of the significant challenges because of the complexity of information systems. In the context of current armed conflicts, there is growing concern that cyberspace will become the predominant theater of war. To protect against cyberattacks and cyberterrorism, technologies are being developed to combat deepfakes and open-source vulnerabilities, and advances are being made in blockchain security and homomorphic encryption. In this context, big data security also involves adhering to concepts of right and wrong ethical behavior with regard to data, especially personal data; big data ethics focuses on the collectors and disseminators of structured or unstructured data. Information security and confidentiality are supported by extensive documentation, through which concrete solutions are being sought in the coming years to maximize the value of information without sacrificing fundamental human rights.
Blockchain is a technology based on a distributed ledger consisting of records (blocks) linked to each other using cryptography, which therefore provides much greater security. In addition to cryptocurrency transactions, blockchain technology is used in a host of applications with far-reaching consequences for the economy and society, such as smart contracts and the Internet of Things (IoT). The advantage of this technology is that it is based on consensus, and data, once added, cannot be removed or changed, eliminating the need for the trusted third party required in traditional transactions. Ontology engineering, together with Semantic Web technologies, enables the semantic modeling and development of the operational flows required for blockchain technology design, with a future focus on enterprise modeling systems.
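A much simplified sketch of the hash-linking idea, using only the Python standard library and omitting consensus and proof-of-work entirely, shows why data once added cannot be quietly changed: altering an earlier block invalidates every later link.

```python
# Simplified sketch of a hash-linked ledger (no consensus or proof-of-work).
# Shows why changing an earlier record breaks the chain's integrity.
import hashlib
import json

def block_hash(contents):
    # Hash the block's contents, including the previous block's hash.
    return hashlib.sha256(json.dumps(contents, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    previous = chain[-1]["hash"] if chain else "0" * 64
    block = {"data": data, "prev_hash": previous}
    block["hash"] = block_hash({"data": data, "prev_hash": previous})
    chain.append(block)

def is_valid(chain):
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev_hash"] != expected_prev:
            return False
        if block["hash"] != block_hash({"data": block["data"],
                                        "prev_hash": block["prev_hash"]}):
            return False
    return True

chain = []
add_block(chain, {"from": "A", "to": "B", "amount": 10})
add_block(chain, {"from": "B", "to": "C", "amount": 4})
print(is_valid(chain))            # True
chain[0]["data"]["amount"] = 999  # tampering with an old record
print(is_valid(chain))            # False: the stored hash no longer matches
```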
Through the dramatic expansion of the Internet, social media can be used to represent, identify, or influence a culture. Social media can help enhance a person’s sense of connection with real or online communities and can be an effective communication (or marketing) tool, but its use as a communication, propaganda, and organizing tool has also developed in times of political unrest. Social media analytics is a new and emerging field, poised to enable companies to improve their performance management initiatives across business functions by measuring the effectiveness of promotional campaigns, gathering information about customer needs and preferences, discerning brand perceptions, obtaining feedback on product performance, and capturing market trend data. Current research on social media analytics focuses on business intelligence-based approaches to obtaining data that are promising, albeit challenging to exploit.
Cloud computing involves the on-demand availability of computing resources, especially data storage (in the cloud) and computing power, without direct active management by the user. It involves sharing resources, with benefits to users such as reduced capital expenditure, reliability, improved disaster recovery, and easier collaboration. However, the technology also has disadvantages, such as ongoing costs, security issues, and dependence on the Internet. Alternative technologies to cloud computing are currently being developed, including edge computing, a distributed computing paradigm that brings computation and data storage closer to the data sources, improving response times and saving bandwidth. The Internet of Things (IoT) is an example of edge computing, although the two concepts are not synonymous. Another possibility is the development of fog computing, an architecture that uses edge devices to carry out a substantial amount of computation, storage, and communication locally, routed through the Internet backbone.
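A minimal sketch of the edge-computing pattern, with invented readings and a placeholder upload function, shows the basic idea of processing data close to its source and forwarding only a compact summary.

```python
# Illustrative edge-computing pattern: aggregate raw readings locally and
# forward only a compact summary toward the cloud, saving bandwidth.
import statistics

def summarize_locally(raw_readings):
    # Runs on the edge device, close to the data source.
    return {
        "count": len(raw_readings),
        "mean": round(statistics.mean(raw_readings), 2),
        "max": max(raw_readings),
    }

def send_to_cloud(summary):
    # Placeholder for the network call to a central (cloud) service.
    print("uploading summary:", summary)

raw_readings = [21.3, 21.4, 22.1, 35.9, 21.2, 21.5]  # e.g., one minute of samples
send_to_cloud(summarize_locally(raw_readings))        # a few bytes instead of all samples
```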
Virtual reality gives users an immersive experience of a virtual world, with uses in entertainment, education, and business, among others. Standard virtual reality systems currently use special headsets or multi-projected environments. Efforts are underway to integrate virtual reality with personalized cognitive behavioral therapy. Virtual reality techniques and technologies currently focus on training and simulation, for example in surgery or in aviation flight simulation. It is hoped that in the coming period some of the problems that limit it, such as device size and graphics performance, will be solved.
Augmented reality combines the real world with computer-generated content, which may include visual, auditory, haptic, somatosensory, and olfactory information. The superimposed sensory information can be constructive or destructive, and the experience interweaves the virtual and physical worlds so seamlessly that it is perceived as an immersive aspect of the real environment. Augmented reality is used to enhance natural environments or situations and to provide perceptually enriched experiences. It already has many applications, such as matching products from online catalogs with the user’s environment, to see how well a bookcase, for example, fits into the user’s room. Neurosurgeons use augmented reality to project a 3D image of the brain to aid in surgery, and military pilots wear headsets that create a more realistic experience in simulation tests.
A chatbot is a software application used to conduct an online chat conversation via text or text-to-speech, without direct contact with a human agent. Chatbots are used for various purposes, such as customer service, request routing, or information gathering. Advanced applications use extensive word-classification processes, natural language processors, and sophisticated artificial intelligence, and some of the research in this area focuses on natural language processing. More recent chatbots combine real-time learning with evolutionary algorithms that optimize their ability to communicate based on each conversation they have.
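A minimal rule-based chatbot sketch is shown below; the keyword rules and answers are invented, and real systems replace such rules with natural language processing and machine-learned intent models.

```python
# Minimal rule-based chatbot sketch; production systems replace these keyword
# rules with natural language processing and machine-learned intent models.
RULES = {
    "price": "Our plans start at 10 EUR per month.",          # invented answer
    "hours": "Support is available Monday to Friday, 9-17.",  # invented answer
    "human": "I will route you to a human agent.",
}

def reply(message):
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    return "Sorry, I did not understand. Could you rephrase?"

print(reply("What are your support hours?"))
print(reply("Can I talk to a human?"))
```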
The IT landscape is constantly changing, driven by the recent pandemic and the geopolitical conflicts of recent years. IT trends in business, the military, and geopolitics will reshape our future socially. Some technologies, including 5G networks, blockchain, and virtual reality, have not yet reached a stable and mature stage, but analysts see great potential in them.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License CC BY 4.0 (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.