The future of the internet

In historical terms, it wasn't that long ago that we relied on a dial-up modem to connect to a newborn Internet.

Back then, you could either talk on the phone or surf the web, but never both at the same time. If you lived through that first stage, you surely remember plenty of anecdotes about that first mass-adoption technology.

In that context, "access speed" was not even on the table. Either you had the internet, which was a real privilege (comparable at the time to having a color TV), or you didn't (and you had to go to a friend's house or a public internet café). Websites all seemed pretty and flashy (read "innovative"), although, seen today, they would make our eyes bleed. HTML, the language of web pages, reached version 4 at the end of the nineties and stopped evolving for many years (in fact, until October 2014). In its first version, HTML 1, the novelty was that text could change size and color, and images could be added (CSS, the style sheets behind every modern website, only saw effective adoption from around the year 2000). Everything being new, fast or slow, it was all equally wonderful.

On that first internet, depending on a dial-up modem was hard: the device made an atrocious noise, it did not always connect to the provider, it dropped at the slightest line noise, and pages loaded with a slowness that is unthinkable today. Throughout the 1990s, the average person accessing the Internet did so with a 28.8k modem (that is, browsing at about 3.6 kB per second). Today, a standard 10-megabit connection lets us download about 1,250 kB per second, roughly 350 times faster.
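If you want to check that arithmetic yourself, here is a quick back-of-the-envelope sketch (the figures are the nominal speeds quoted above; real-world throughput is always a bit lower):

```python
# Back-of-the-envelope comparison of the speeds quoted above.
# Connections are advertised in bits per second; downloads are felt in bytes.
modem_bps = 28_800          # a 28.8k dial-up modem
broadband_bps = 10_000_000  # a standard "10 mega" (10 Mbit/s) connection

modem_kb_per_s = modem_bps / 8 / 1000          # ~3.6 kB per second
broadband_kb_per_s = broadband_bps / 8 / 1000  # 1250 kB per second

print(f"{modem_kb_per_s:.1f} kB/s vs {broadband_kb_per_s:.0f} kB/s "
      f"(~{broadband_kb_per_s / modem_kb_per_s:.0f}x faster)")
# Output: 3.6 kB/s vs 1250 kB/s (~347x faster)
```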

Feeding the beast

As access speeds grew exponentially in the current millennium, it was to be expected that human beings would change. And we really did change.

Beyond globalization, social networks, audio and video streaming and the proliferation of blogs, in the 21st century we became impatient. Field studies show that today, if a website takes more than five seconds to fully load, 40% of visitors go elsewhere. That is practically half the traffic lost in five seconds; and as a few more seconds pile up, the bulk of the remaining traffic flees in terror. But technology is on our side, once again, so that everything gets faster and our impatience never has to take a long stride... Technology, yes, but above all business interests.

Give me speed (and more money)

The first and most resounding study of visitor behavior in relation to load time was carried out by Walmart. The international supermarket chain paid for a technical audit to measure conversion (actual purchases) on its online sales site. It turned out that for every 100 milliseconds shaved off page load time, the company saw 1% more revenue. Conclusion: a slow internet means losses for any business purpose (in many cases, losses in the millions).

When this pot was uncovered, companies all over the world replicated the study on their own online businesses, and more than one must have suffered a heart attack. Needless to say, the investments made since then in connectivity and faster sites add up to billions of dollars (money recovered later, because the rule of one hundred milliseconds less, one percent more revenue, tends to hold). Above all, users were, and continue to be, the ones who benefit. Because time is money, of course.

The main actor

Let's not lose sight, even for a second, of one of the most important players in the shape of the internet: Google. It handles 90% of global search traffic, and this "authorizes" it to steer where the web is headed. For one thing, the Mountain View-based company has for years been giving lighter sites the top spots in its search results (which is the same as saying it punishes slow sites). In fact, Google invested in the development of faster and more efficient transfer protocols, the result being its own protocol, SPDY (read "speedy"). While at first we all raced to have that technology on our sites (since Chrome and Firefox soon supported it), SPDY development eventually dissolved into the HTTP/2 protocol (standardized in 2015, with SPDY support retired soon after). And yes, HTTP/2 is based on Google's SPDY. Although the deprecated SPDY is no longer supported today, its part in the story is further evidence of just how interested Google is in everyone having a faster and more secure internet.

HTTP/2… HTTP/3… HTTP/4?

Since we mentioned it, let's review what HTTP means: Hyper-Text Transfer Protocol, the protocol (the "language") in which communication between your computer or cell phone and a server is carried out. When we say that SPDY was merged into the (then new) HTTP/2, we are talking about a connection speed improvement of up to 40%. An outrage, achieved only by improving the "language", or connection protocol. It is a perfect example of how software matters as much as, or even more than, a server's hardware. While we may surf the internet indifferent to the HTTP version or to what on earth it means, the truth is that HTTP/2 allowed the creation of more complex websites, with more elements and more calls to the server, without that becoming a drag on speed. Which brings us to the fresh and shiny new "Hypertext Transfer Protocol over QUIC", finally named HTTP/3.

QUIC is a transport-layer network protocol whose development involves (once again) a well-known actor: Google… How "strange" to see you here.

HTTP/3 is already supported by recent versions of Chrome, Firefox, Edge and Safari, and at Duplika it is built into all of our hosting plans (so that you are at the forefront, obviously).

While HTTP/3 is faster than its predecessor (essentially because it handles multiple transfers independently of one another), it is also more secure: in HTTP/2, connections are encrypted only if the site's developer chooses to; with HTTP/3, data is always encrypted, without exception. This is a very clear advance in web security for everyone because, of course, not everything is about speed.
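To make the protocol versions a bit less abstract, here is a minimal sketch (assuming the third-party Python library httpx, installed with its HTTP/2 extra) that prints which HTTP version a server actually negotiates; checking HTTP/3 specifically still requires a QUIC-capable client, such as a recent build of curl:

```python
# Minimal sketch: request a page and report the negotiated HTTP version.
# Assumes the third-party "httpx" library: pip install "httpx[http2]"
import httpx

with httpx.Client(http2=True) as client:
    response = client.get("https://www.google.com")
    # Prints "HTTP/2" if the server supports it, otherwise "HTTP/1.1".
    print(response.http_version)
```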

But let's not run out of oxygen: the full rollout of HTTP/3 will be slow, it is still under scrutiny, and the servers that offer it (or the sites that use it) are still a select few. Keep in mind that embracing a new protocol implies changes to servers, reconfigurations, tests, fine-grained optimizations... Cloudflare, the most popular of the big CDN (Content Delivery Network) companies, has publicly detailed the development cost involved in incorporating HTTP/3.

And one more reflection, looking ahead to an eventual HTTP/4: HTTP was created at a time when few could foresee the explosive growth of web traffic over the years. In other words, HTTP was conceived at a point in history when the traffic reaching a hosting server was minimal and manageable. HTTP/3, for all the progress it represents, is still an improvement on an "old" protocol, and the time will come when we must rethink the connection between a server and a web browser in a new, necessary light, however high the cost of implementation. Over the years we will see whether an eventual HTTP/4 becomes a reality, or whether a completely new protocol emerges (backward compatibility would be a big problem to solve, but we will see how things turn out).

From the games, with love

There is a piece of hardware that is here to stay: the GPU, better known in the jargon as the "video card". If you were, or are, a fan of video games, you know exactly what I am talking about. If not, a GPU is an electronic circuit designed to speed up the creation of images on any device, be it a PC, a desktop Mac, or even tablets and cell phones. A GPU sharpens a YouTube video, renders many frames per second (lighting, shadows, perspective) from 3D models, and speeds up video and photo editing, among many other things. And among those "many things" there is one in particular related to the reason for this article.

GPU development has been favored by large investments of money, especially since everyone started carrying a mobile phone and most of us began playing video games on it (even if it is just Candy Crush). Let's dig deeper: a normal CPU, from Intel or AMD, may have 4 or 6 physical cores. The same goes for your cell phone; an iPhone X, for example, has six CPU cores plus three additional cores dedicated to graphics acceleration. On PC and Mac, a current video card, whether from NVIDIA or AMD, packs no fewer than 4,000 processing cores inside its GPU. Surprised?

It was only a matter of time before this potential found other uses.

These days, the contexts in which GPU power is put to work are growing exponentially, and a very good example is Apache Spark. Apache Spark, the unified analytics engine for big data processing (with built-in modules for streaming, SQL and machine learning), can now run its workloads on NVIDIA GPUs, as sketched below.
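As a rough illustration of what that looks like in practice (this is a sketch based on NVIDIA's RAPIDS Accelerator for Apache Spark; the exact plugin versions, jars and GPU resources depend on your cluster, so treat the configuration below as an assumption to verify against NVIDIA's documentation):

```python
# Sketch: enabling NVIDIA's RAPIDS Accelerator in a PySpark session.
# The plugin class and config keys follow NVIDIA's public docs, but the
# surrounding cluster setup (jars, executor GPUs) is assumed.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("gpu-accelerated-analytics")
    # Load the RAPIDS plugin so supported SQL/DataFrame work runs on the GPU
    .config("spark.plugins", "com.nvidia.spark.SQLPlugin")
    .config("spark.rapids.sql.enabled", "true")
    # Tell Spark each executor has one GPU available (cluster-dependent)
    .config("spark.executor.resource.gpu.amount", "1")
    .getOrCreate()
)

# Supported operations are offloaded to the GPU; anything unsupported
# silently falls back to the regular CPU path.
df = spark.range(0, 10_000_000).selectExpr("id % 100 AS bucket")
df.groupBy("bucket").count().show(5)
```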

Cloud: clouds, but no storm

The cloud concept refers to the on-demand availability of the resources of a remote computer system, especially data storage (cloud storage) and computing power, without direct active management by the user. The term is generally used to describe data centers available to users over the Internet.

With "data storage" we can think of a Google email account: the emails are in the cloud, and we can access them from anywhere and with any device. But our Netflix account, our preferred game sales console or service, Chrome or Firefox, etc. are also valid examples. The idea is that our information, settings, preferences and other data are present on any device connected to the internet. It is enough to log into our Firefox account on our new cell phone so that our pages marked as favorites, the browsing history, the forms we have filled out, the components we use and others are present. With the PlayStation®Plus service, we are allowed to upload our progress in any game for download on a different PS4. Similarly, if we connect to WhatsApp on a new phone, all groups and contacts will be downloaded, including chats. All this is Cloud...

…And it is much more. The not-so-distant future of the Internet will be mostly Cloud: a network of interconnected servers geographically dispersed across the planet. What for? Well, on the one hand, the data that makes up a website will be cloned across multiple servers, guaranteeing its safety. On the other, our cell phone or desktop computer will automatically connect to the nearest server to download a site, regardless of the country we are in.

The mesh of Cloud servers identifies the origin of a request (us typing Google.com) and delegates the data delivery to the closest data center. Today this is already a reality with Google Cloud, with CDN services and with various account-based services such as Steam, Netflix, Adobe, Apple TV and Spotify (which uses Google Cloud), among a huge etcetera. As internet speeds keep doubling, one of the things that will happen is that we will no longer need processors, hard drives or even GPU accelerators in our devices. A PC or a mobile phone will have a screen, a battery and a modem to connect to the Internet, transmitting our requests and receiving the response at the speed of light. Our applications, emails, information, games: everything will live in the cloud, but the speed of a not-so-distant internet will make it feel as though all the data were inside our device. Like old times.

Cloud hosting

The Cloud will clearly mark the difference between quality hosting and mediocre hosting. More than ever, a hosting service's technical support will be the yardstick by which a company's efficiency is measured when a client needs to scale, resolve conflicts and deploy mobile apps that simply work. At Duplika we train constantly, so we are ready to welcome the future with open arms.

You may be interested in evaluating our professional cloud hosting plan for demanding customers. We offer you all the power of the Cloud so that you can enjoy its powerful (and secure) benefits.

Complete immediacy

It is not hard to suppose that the true speed of the Internet will arrive when sites and apps take no more than milliseconds to load. Is that far off? No, but it requires a joint movement of technologies, servers and devices. Take 5G, the fifth generation of mobile connectivity, as an example. In principle, this technology allows speeds of around 1,250 MB per second (10 gigabits) with a latency of 4 ms. Eventually, after many years of investment and development, it is predicted to reach 20 gigabits per second. However, a slap in the face: most of the smartphone generation arriving in 2020 will ship without 5G support.

And the manufacturers have a point, because there is still a long way to go before 5G is deployed in large cities, let alone in more modest towns. The world of 5G promises self-driving cars, machinery that works without operators, systematized public signage, interconnected appliances, smart homes, virtual reality like we have never seen it, and even automated energy grids. It all sounds great, but until 5G networks are deployed across the regions of the world, its true scope remains a promise. 4G, the prevailing generation of mobile connectivity today, has not been fully deployed in every corner of the world, and where it has, it does not necessarily deliver to your cell phone the 20 to 100 Mbps it promises (the subject has more nuances than the ones we mentioned, but you get the idea). The fact that a technology is capable of a given download speed does not mean carriers actually provide it. In short, there is a lot of noise and few nuts. Even though the internet and global traffic have grown, as we said, exponentially, certain changes do not come as fast as we would like (we are still waiting for flying cars).

The future that has already arrived

On the server side, the future holds super-fast computers that can handle more simultaneous requests and host sites and apps without any throttling. Not only do processors have more and more cores, but GPU power is being added to the mix, and quantum technology has arrived to push past the nanometer barrier. Software is increasingly efficient, secure and fast, and if we add the progress of machine learning systems, we will get computers that manage traffic flow and resource usage with amazing, predictive precision (even stopping attacks before chaos ensues).

On the connectivity side, the limitations of broadband over coaxial cable, and the cost of installing fiber optics in the home (provided we find companies that offer it; Google is ahead here, again), are issues that will be relegated to the sidelines of history once the internet is as common and accessible as the air we breathe. If the companies and states of the world invest in technologies such as 5G, needing dedicated home wiring to connect to the web will feel medieval. And everything indicates that this is a probable reality. On the other hand, the day connections are fast enough (and accessible to everyone), the devices we browse with will need no more than a modem that reaches the Cloud (Cloud Computing), where all the computation will happen remotely, returning the only thing that is needed: the information to be presented on our screens. Spreadsheets, emails, apps, web pages, games... Everything. Our devices can be thought of as a mere monitor connected to a server (a mega computer, really) through a very long cable. And everything we see on screen will be happening in "The Cloud".

The technology is ready; we will have to wait for its implementation. In the meantime, let us remember that twenty years ago we would have been reading this article printed in a magazine.

Thank you for reading.

We are Duplika

Give your site the hosting it deserves
