Answer a few emails, text your kids, watch a few videos and read the news.
This is probably what most of you do on the internet.
I won’t mention the other activity. It’s too early in the day for that.
You only have to go back three generations, though, to see an entirely different world. A world without Twitter and YouTube.
A world where you might grow up on the farm, sheltered in a community that knows only what the local paper and passers-by tell them.
Today, you have the world at your fingertips. You can work from anywhere. You can find answers to almost any question.
It’s amazing what we’ve seen in a little over a lifetime.
But it’s just the beginning. So says the father of the web…
We’re still in our infancy
Back in 1989, Sir Tim Berners-Lee was a researcher at CERN, a research institute in Switzerland. He came up with an idea that would let fellow scientists share information.
It didn’t matter if they were in the next room or on the other side of the world. And…you know what happened next…
We saw everything from e-commerce to social media spring up out of this information sharing plan.
I want you to take a look at our connected world for a moment. Internet penetration rates greater than 90% are not uncommon…
Source: We Are Social
The data we consume is rising every year…
Source: We Are Social
In fact, you can visit https://www.vpnsrus.com/data-consumption/ to watch how much data the world consumes each second you’re on the page.
In the minute I was there, the world consumed 2.2 billion megabytes of data…
Source: VPN’S ‘R US
And being social creatures, we increasingly use our connective infrastructure to reach out to others…
Source: We Are Social
But Berners-Lee says these are still early days. In an interview with Quartz, he said:
‘It’s the utopian vision growing up…
‘When it was 19 years old, it felt a little bit adolescent. When it was three, it might have had a very simple, star-spangled vision of itself. When it was small enough not to have been noticed by criminals, it was a wonderful time. When the National Science Foundation didn’t allow commercial use of it, we didn’t have to worry about advertising, because until they changed the acceptable-use policy, advertising wasn’t allowed. It was, in a way, in a protected, embryonic state before the NSF policy change.
‘At that point, [the web] was in transition to growing up.’
And it continues to grow today. We’re still seeing an adolescent connectivity infrastructure finding its feet.
Just look at what’s happening over in China…
The tech race rages on
Had you told Chairman Mao about supercomputers, he’d have had no idea what you were talking about. He was too busy making China’s farmers pump out useless scrap metal.
But today, China is speeding along in its tech race with the US, developing exactly such computers.
South China Morning Post (SCMP) writes:
‘China is aiming for its newest Shuguang supercomputers to operate at about 50 per cent faster than the current best US machines, which assuming all goes to plan should help China wrest the title back from the US in this year’s rankings of the world’s fastest machines, according to people, who asked not to be named discussing private information.’
Despite the name, supercomputers are not just big, superfast computers. They work in an entirely different way.
They use parallel processing instead of serial processing. The difference between the two is like doing one task after another versus splitting that task into multiple smaller tasks and working on them all at once.
Source: Explain That Stuff
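To make the serial-versus-parallel distinction concrete, here’s a minimal Python sketch. The task (counting primes) and the chunk sizes are my own illustrative choices, not a real supercomputer workload. The serial version works through one big range from start to finish; the parallel version splits the same range into chunks and hands them to separate processes at once:

```python
from concurrent.futures import ProcessPoolExecutor

def count_primes(rng):
    """Do the work for one chunk: count primes in range(start, stop)."""
    start, stop = rng
    count = 0
    for n in range(max(start, 2), stop):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

def serial(limit):
    # Serial processing: one worker handles the whole job.
    return count_primes((0, limit))

def parallel(limit, workers=4):
    # Parallel processing: split the job into equal chunks,
    # farm them out to separate processes, combine the results.
    step = limit // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]
    chunks[-1] = (chunks[-1][0], limit)  # cover any remainder
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(count_primes, chunks))

if __name__ == "__main__":
    # Same answer either way -- the parallel version just gets
    # there by doing the smaller pieces simultaneously.
    assert serial(10_000) == parallel(10_000)
```

This split-combine pattern is essentially what a supercomputer does, only spread across thousands of processors instead of a handful of processes on one machine.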
In some cases, many of these supercomputers are linked together via the internet, spreading the processing power across many machines.
But why develop them in the first place? SCMP explains:
‘The ability to produce state-of-the-art supercomputers is an important metric of any nation’s technical prowess as they are widely deployed for tasks ranging from weather predictions and modelling ocean currents to energy technology and simulating nuclear explosions. Demand for supercomputing in commercial applications is also on the rise, driven by developments in artificial intelligence.’
And with time, even these futuristic computers will be old news.
Technology like driverless cars and supercomputers, aided by our expanding connective infrastructure, might become as mundane as Twitter and YouTube.
Of course, I can only guess what the Internet 2.0 world might look like.
All I can be sure of is that it will look different from today.