
Web technologies

Web technologies are the set of technologies that make up and build upon the World Wide Web (generally abbreviated Web) and its standards. The web was created in 1990 as an information-sharing application and has since become a platform in its own right, on which new technologies are regularly developed. These technologies rest on two foundations: the Hypertext Transfer Protocol (HTTP), standardized by the IETF, and the HTML document format (Hypertext Markup Language), standardized by the W3C.

Designed by its creator, Tim Berners-Lee, at CERN in Switzerland to link one document to another via a text tag pointing to another page, following the principle of hypertext, the web has become one of the most widely used exchange systems. Its evolution has been steady since its launch. Riding on the popularity of the underlying Internet, the World Wide Web is, together with email, the most common use of the Internet, and it has been extended well beyond its original purpose. Its URLs and URIs, intended to identify and locate a document in a unique way that is stable over time, today also identify applications whose documents are generated dynamically in dedicated programming languages, complete computer applications, and even abstract concepts in the Semantic Web.

The HTML format now accommodates many types of content beyond the web page itself: images, sound and video, 3D interfaces, videoconferencing, and increasingly complex design tools. Web technologies today make it possible to build computer applications that in the past would only have been possible as native applications, with the bonus of being network applications by nature, which makes collaborative work easy and makes them accessible from any point of access to the Internet. This has led to the emergence of publishing platforms for collaborative documents, such as wikis. Web browsers can be found on every type of computer terminal, from servers without graphical interfaces, through simple text browsers such as w3m, to mobile phones and tablets. Interoperability and accessibility are important concerns, and the ubiquity of the platform makes it a prime target for developers.


Context of the invention

The Internet was born during the Cold War, in 1969, from a US Department of Defense project aimed at keeping communication possible between research sites and United States military sites, even in the event of war and partial destruction of the infrastructure. Each site had its own local network.

TCP/IP technology was developed in the 1970s and adopted for the Internet. This suite of protocols interconnects networks and gives the illusion of a single network. The connected networks can use different telecommunication technologies (Ethernet, switched networks, etc.), and the data transmitted over TCP/IP networks can follow different application protocols, such as HTTP or FTP, in line with the layered OSI model.

Invention and evolution

The web was created in 1989 at CERN by Tim Berners-Lee and was initially intended for researchers in particle physics. The aim was to facilitate the exchange of information between members of the scientific community over the Internet. In 1991 he released the first combined web editor and browser, called WorldWideWeb, and the first HTTP server, CERN httpd; the server was later rewritten in Java and has been maintained by the W3C under the name Jigsaw since 1996.

The source code of the WorldWideWeb application, initially developed on a NeXTSTEP system, was placed in the public domain to promote its dissemination and encourage improvements.

Four years after the creation of the web at CERN, the NCSA released in 1993 the second major web browser: Mosaic. Through a graphical interface, this program let users consult the documents placed on servers. The documents are in HTML format, and images can be embedded directly in the pages, the main evolution compared to CERN's earlier software.

In 1994, the Netscape Navigator browser was released, developed by some of the NCSA Mosaic developers. Its main advantage over Mosaic was its ability to display the content of a page before all the images had loaded, which made it particularly attractive at a time when dial-up (PSTN) modems offered a bandwidth of only 1 to 3 KiB per second. It was a multiplatform browser, able to run on the UNIX workstations of university and research-center computing departments. It remained the most popular browser until it was overtaken in 1997 by Microsoft Internet Explorer, and it is also the ancestor of Mozilla Firefox.

Following requests from website developers for a standard for dynamic sites, the implementation of the Common Gateway Interface (CGI) was discussed at the 5th International World Wide Web Conference, held May 6-10, 1996 in Paris and organized by the W3C.
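A minimal sketch of the CGI convention in Python: the server passes request data, such as the query string from the URL, to an external program (classically through environment variables like QUERY_STRING), and the program writes an HTTP-style response, headers, a blank line, then the body, to its standard output. The function name and the example query string below are invented for illustration.

```python
# Toy CGI-style "script" written as a function for illustration.
# A real CGI program would read os.environ and print to stdout directly.
import urllib.parse

def cgi_app(environ: dict) -> str:
    """Build a CGI response (headers, blank line, body) from request data."""
    query = urllib.parse.parse_qs(environ.get("QUERY_STRING", ""))
    name = query.get("name", ["world"])[0]
    body = f"<html><body><p>Hello, {name}!</p></body></html>"
    # CGI output: at least one header, then an empty line, then the body.
    return "Content-Type: text/html\r\n\r\n" + body

# The HTTP server would set QUERY_STRING from the URL before invoking the script.
output = cgi_app({"QUERY_STRING": "name=CERN"})
print(output)
```

The key point of the convention is the strict separation between the header block and the body, which lets the server forward the program's output to the browser with minimal processing.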

Semantic Web

In its initial form, the web lets a person retrieve information; sorting, classifying and extracting what is relevant remains a manual operation. This can require a lot of work: a search engine can return more than 30,000 results. In 1994, Tim Berners-Lee, the inventor of the web, launched the idea of a semantic web, in which the sorting and classification of information is automated by artificial-intelligence techniques and the content of documents is described by metadata that computers can manipulate. These ideas have since given rise to projects such as Google's Knowledge Graph; Freebase, a semantic database; and wiki-like software variants such as Semantic MediaWiki, with the field of formal ontologies as theoretical background and, as exchange-technology standards, RDF and OWL, standardized by the W3C.

The idea behind these projects is to serve as a generic database able to store any type of data without being constrained by a pre-established schema, unlike most relational databases, which fix a schema for a given domain in advance. A semantic database lets users define their own storage schema, and the standards allow data to be exchanged and a Giant Global Graph to be built, in which each object carries an identifier in the form of a URI about which information can be expressed in a decentralized way. Techniques for annotating web pages, such as those of schema.org, have also emerged in the spirit of the Semantic Web: by adding metadata, they make the work of indexing robots easier, letting them better understand the content of web pages without having to interpret natural language.
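The graph-of-triples idea can be sketched with a toy store of (subject, predicate, object) statements whose subjects are URIs, so facts about the same resource can be merged from decentralized sources. Every URI and predicate below is made up for illustration; real systems would use RDF tooling rather than plain tuples.

```python
# A toy triple store in the spirit of RDF: no schema is fixed in advance,
# any statement about any resource can be added at any time.
Triple = tuple[str, str, str]

def query(store: list[Triple], subject=None, predicate=None, obj=None):
    """Return every triple matching the given pattern (None = wildcard)."""
    return [t for t in store
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Illustrative, invented URIs and predicates:
store: list[Triple] = [
    ("http://example.org/TimBernersLee", "invented", "http://example.org/Web"),
    ("http://example.org/Web", "createdIn", "1989"),
    ("http://example.org/Web", "standardizedBy", "http://example.org/W3C"),
]

# Everything known about one resource, regardless of who stated it:
facts = query(store, subject="http://example.org/Web")
```

Because the identifier is a globally unique URI rather than a row in a local table, two independent stores can state facts about the same resource and a consumer can merge them without coordinating schemas first.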

Web 2.0

The web has changed the way people communicate, trade and share information. In its early days, around 1990, the web was a collection of pages that rarely changed; usage then shifted to applications for sharing photos and videos, doing business, and taking part in group activities. This is Web 2.0, an evolution due as much to technological change as to the growth of Internet coverage and the evolution of users' habits.


HTML5 is another major advance. It brings benefits on the semantic side, where everything can be indexed by automated agents, and on the side of dynamic multimedia content: WebGL for 3D, native audio and video, videoconferencing, and JSON for data exchange and dynamic client-side queries of service databases.
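As a small illustration of JSON as an exchange format, the record below (its fields are invented) is serialized to text on one side and reconstructed identically on the other, which is what lets a client query a service and interpret the answer without sharing code with it.

```python
# JSON round trip: structured data -> text on the wire -> structured data.
import json

record = {"title": "Web technologies", "tags": ["HTTP", "HTML"], "views": 1024}

payload = json.dumps(record)    # what travels over the network, plain text
restored = json.loads(payload)  # what the receiving side reconstructs
```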

How it works

The web allows information scattered around the world to be shared and exchanged using the HTTP protocol. The conventional form of the information exchanged is the HTML document.

HTTP (short for HyperText Transfer Protocol) is the underlying protocol of the web. This convention defines how messages are formatted and transmitted, and how the HTTP server and the web browser should respond to them. It provides, for example, for the transmission of URLs between the browser and the HTTP server. It is one of the two main standards of the web, the other being HTML, which governs how documents are encoded and displayed.
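A minimal sketch of this request/response convention, using Python's standard library to run a throwaway local server and query it over TCP. The port is chosen by the operating system; nothing here refers to any real site.

```python
# One HTTP round trip on localhost: a server and a browser-like client.
import http.server
import http.client
import threading

# Start a basic HTTP server on an ephemeral localhost port
# (SimpleHTTPRequestHandler serves the current directory).
server = http.server.HTTPServer(
    ("127.0.0.1", 0), http.server.SimpleHTTPRequestHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The browser side: open a TCP connection, send a GET request for a URL path.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/")
response = conn.getresponse()

print(response.status)                       # 200 if the request succeeded
print(response.getheader("Content-Type"))    # how the body should be displayed

conn.close()
server.shutdown()
```

The status line and headers are the protocol's "convention" part: the browser decides what to do (render HTML, save a file, follow a redirect) from these fields before even looking at the body.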

Lower network layer

TCP/IP is one of the Internet's communication technologies and the underlying network protocol over which HTTP runs. HTTP can also be carried over UDP, which allows, for example, a better bit rate for audio or video streams.

Serving pages

Languages used

Among the first technologies used, we can cite the Perl scripting language, already widely used in UNIX administration. Thanks to its good handling of regular expressions, it is particularly well suited to processing the character strings that make up web pages.
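The kind of string processing described above can be sketched with a regular expression pulling hyperlink targets out of an HTML fragment. Python is used here for illustration in place of Perl, and the fragment itself is invented.

```python
# Extract the href targets of all links in an HTML snippet with a regex.
import re

html = '<p>See <a href="/docs">the docs</a> and <a href="/faq">the FAQ</a>.</p>'
links = re.findall(r'href="([^"]*)"', html)
# links now holds the two URL paths, in document order: /docs then /faq
```

Regexes like this are fine for quick extraction from known-regular markup; robust tools use a real HTML parser, since HTML in the wild is rarely this tidy.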

PHP, a scripting language intended for serving web pages, has existed since 1994. In 2010, PHP was the dominant language, used by almost 70% of sites, followed by ASP with 30%, and, with less than 1% each, JSP and Ruby on Rails. PHP has served as the basis for applications such as forums since its creation; for example, the once very popular phpBB forum is named in reference to the language, as are many other pieces of software developed around the web.

In 1995, NeXT gave the first presentation of WebObjects, which was officially released in March 1996. Success was not long in coming.

JavaServer Pages, Active Server Pages and Java Servlets were very common in the 1990s.

The two most used HTTP servers today are Apache and nginx, which between them held, in August 2013, 61.5% of the server market and 71.8% of the top million most active sites. They usually use FastCGI technology, or one of its variants, to call the scripts that produce the HTML pages.

The Content Management System principle has also provided a common software base for many categories of site (blog, merchant site, news site, webmail, etc.), leaving only minor, site-specific aspects to be developed: presentation, layout and content.

The web has evolved from an information-sharing device into a technical platform for building application software. These web applications use the standards and technologies of the web and can be operated through a browser.

Between 1995 and 2005, the web was the cradle of several new technologies. Some of them were introduced to replace a predecessor or to compete with one: Java Servlet, for example, was intended to succeed CGI, and Active Server Pages was then launched as a competing technology. The need to develop code in two parts, client and server, has spawned technologies able to handle both sides with a single platform and language, such as the Google Web Toolkit.

On the client side, JavaScript was designed as a browser extension language for dynamically handling certain tasks on the client machine in an asynchronous, event-driven way, such as form validation and DOM manipulation of web pages. It underwent a major evolution with the introduction of Ajax, a technique for communicating asynchronously with servers and building more dynamic applications that require fewer page reloads. The language receives a great deal of attention, with entire code libraries dedicated to it such as jQuery, and it has even been extended to run on the server side with Node.js. Browser developers need to optimize the engines that run this code for platforms with limited power, such as some smartphones, and many virtual machines have emerged, such as V8 and SpiderMonkey. JavaScript has even been called "the assembler of the web", an idea seemingly confirmed by projects such as asm.js, which defines a subset of JavaScript intended to be generated by compilers in order to optimize performance, and by the long list of compilers that target JavaScript.

Translated from Wikipedia
