Cloud and Edge Computing

When the internet was in its infancy, the word ‘cloud’ was used as a metaphor for the complex telephone networks that connected everything. This made sense, since the only way to get online was over a telephone modem, and at that time most of the telephone wires were strung up in the air. Now many people and organizations refer to it as ‘THE cloud’, but it is not a single entity, and it does not exist in just one place. The cloud is a model of computing in which servers, networks, storage, development tools, and even applications (apps) are delivered over the internet. Cloud computing is believed to have been pioneered by Joseph Carl Robnett Licklider, an American psychologist and computer scientist, through his 1960s work on ARPANET (the Advanced Research Projects Agency Network), which aimed to connect people and data from anywhere at any time. The project grew out of efforts to develop technology that would let two or more people use a computer simultaneously. Licklider envisioned an ‘Intergalactic Computer Network’ that would let anyone on the globe be interconnected through computers and access information from anywhere, at any time.

The cloud is basically a collection of servers connected to the internet that can be leased as part of a software or application service. Cloud computing is the delivery of computing services, including servers, storage, databases, networking, software, analytics, and intelligence, over the internet to offer faster innovation, flexible resources, and economies of scale. The simplest picture of the cloud is a data center full of identical hardware that no one ever touches except to unpack it on day one and throw it away when it fails; in between, every deployment, update, investigation, and management process is automated. A cloud server is powerful physical or virtual infrastructure that handles application processing and information storage. Rather than owning their own computing infrastructure or data centers, companies can rent access to anything from applications to storage from a cloud service provider. Cloud storage can hold every type of information, such as text, graphics, photos, video, and music, so basically whatever you see and access on the internet. When a user’s mailbox approaches its storage limit, their email service provider may move older messages into cloud storage. People with computers are constantly acquiring digital data and then need to find enough storage space to hold their files, and that is when they turn to the cloud.

It is estimated that the internet contained 74 zettabytes of data in 2020. That is a lot of storage space, as a zettabyte is a trillion gigabytes. We are now creating data faster than we can store it, and by 2025 the amount of data generated each day is expected to reach 463 exabytes globally, where an exabyte is equal to one billion gigabytes. The datasphere will keep on growing, and eventually speculative names for ever-larger units, like the yottabyte, xenottabyte, shilentnobyte, and domegemegrottebyte, may become commonplace. Nearly 60% of all the people on the planet are digitally active, and in North America and Europe the figure is around 90%.
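
For readers who want to see the unit arithmetic spelled out, here is a minimal sketch in Python (assuming decimal SI units, and taking the 74-zettabyte and 463-exabyte figures above as given estimates):

```python
# Storage-unit arithmetic for the figures quoted above (decimal SI units).
GIGABYTE = 10**9    # bytes
EXABYTE = 10**18    # bytes -> 1 EB = one billion GB
ZETTABYTE = 10**21  # bytes -> 1 ZB = one trillion GB

datasphere_2020 = 74 * ZETTABYTE  # estimated data on the internet in 2020
daily_data_2025 = 463 * EXABYTE   # data expected to be generated per day by 2025

print(f"1 ZB = {ZETTABYTE // GIGABYTE:,} GB")  # 1,000,000,000,000 GB
print(f"1 EB = {EXABYTE // GIGABYTE:,} GB")    # 1,000,000,000 GB

# At the 2025 daily rate, how many days of output equal the 2020 datasphere?
print(f"{datasphere_2020 / daily_data_2025:.0f} days")  # roughly 160 days
```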

At the moment, Google stands as the undisputed leader in search, handling a staggering 1.2 trillion searches every year. Every day, 306.4 billion emails are sent and 500 million tweets are posted, and it is thought that 1.145 trillion MB of data is created each day. The number of devices connected to the internet, and the volume of data being produced by those devices and used by businesses, is growing so quickly that it is becoming difficult for traditional data center infrastructures to keep up.

Companies large and small are continually moving their applications to the cloud. More than 28 percent of an organization’s total IT budget is now set aside for cloud computing, and today 70 percent of organizations have at least one application in the cloud, indicating that enterprises are realizing the benefits of cloud computing and gradually adopting it. Still, some experts believe that the cloud may not be the best place to store every kind of data, and they are embracing the benefits of edge computing. A delay in a machine’s decision-making caused by latency (the combined delay between an input or command and the desired output) can translate directly into losses for an organization, so speed is a primary concern for systems that use AI. An AI doesn’t strictly need high-speed computing to work, and for many tasks the wait is merely annoying, but for real-time decisions it does have to respond faster than a human could.

As the volume of data grows, it becomes harder to accommodate everything in the cloud. The weakness of cloud computing services today is that they can be slow, because of the transmission time involved in reaching wherever the data is stored. Edge computing means running fewer processes in the cloud and moving those processes to local places, such as a user’s computer, an IoT device, or an edge server. Since edge computing is all a matter of location, edge processors respond faster because they sit on premises, closer to where the data is collected. By bringing computation closer to the source of the data, the need for long-distance communication between client and server is minimized. If your process involves time-sensitive data, edge computing should be used, while cloud computing is fine for data that is not time-critical. Edge computing enables processing at greater speed and volume, leading to action in real time. This proximity to data at its source can deliver faster insights, improved response times, and better bandwidth availability. Even as companies and industry analysts predict continued growth for cloud computing, some experts believe the cloud has reached the end of its run at the top, and they are betting on the growing popularity and benefits of edge computing.
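
To make “moving processing closer to the data” concrete, here is a minimal sketch in Python of one common edge pattern: aggregate sensor readings locally and upload only a compact summary. The `read_sensor` and `send_to_cloud` functions are hypothetical stand-ins for a real sensor driver and a real upload call, not any particular vendor’s API:

```python
import random
import statistics

def read_sensor():
    """Hypothetical stand-in for a real sensor read; returns a temperature in C."""
    return 20 + random.random() * 5

def send_to_cloud(payload):
    """Hypothetical stand-in for an upload to a cloud endpoint (e.g. an HTTPS POST)."""
    print("uploading:", payload)

def edge_loop(samples_per_batch=60):
    """Collect readings locally and ship only a compact summary upstream."""
    batch = [read_sensor() for _ in range(samples_per_batch)]
    summary = {
        "count": len(batch),
        "mean": round(statistics.mean(batch), 2),
        "max": round(max(batch), 2),
    }
    send_to_cloud(summary)  # one small message instead of 60 raw readings

edge_loop()
```

The design point is simply that one small message crosses the long-distance link instead of sixty raw readings, which is where the latency and bandwidth savings described above come from.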

Some examples of edge use cases include self-driving cars, autonomous robots, smart equipment data, and automated retail. We already use devices that do edge computing every day: smart speakers, watches, and phones. These devices collect and process data locally while touching the physical world; they compute locally and talk to the cloud. Edge computing doesn’t require a separate “edge network” to exist, as it can live on individual edge devices or on a router. As organizations generate data from Internet of Things (IoT) devices, smart sensors, and other devices on the edge of their networks, that data must be collected, stored, and processed. To extract business insight from it, the data must flow seamlessly between edges, clouds, data centers, and users in a wide variety of work locations and environments. IT architects have therefore shifted their focus from the central data center to the logical edge of the infrastructure, taking storage and computing resources out of the data center and moving them to the point where the data is generated, as close to the originating source as possible.
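
As an illustration of how a smart speaker might “compute locally and talk to the cloud”, here is a hypothetical sketch; `local_wake_check`, `cloud_transcribe`, the threshold, and the sample values are all made up for illustration:

```python
def local_wake_check(audio_level, threshold=0.8):
    """Cheap on-device check; only loud-enough audio wakes the cloud path."""
    return audio_level >= threshold

def cloud_transcribe(clip_id):
    """Hypothetical stand-in for the heavier, cloud-side processing step."""
    print(f"sending clip {clip_id} to the cloud for transcription")

# Simulated stream of on-device audio energy levels (made-up values).
stream = [0.12, 0.31, 0.05, 0.92, 0.40]

for clip_id, level in enumerate(stream):
    if local_wake_check(level):
        cloud_transcribe(clip_id)  # rare, heavyweight round trip
    # Otherwise nothing leaves the device at all.
```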

Data is the lifeblood of modern business, providing valuable insight and supporting real-time control over critical business processes and operations. Today’s businesses are awash in an ocean of data, and huge amounts of it are routinely collected from sensors and IoT devices operating in real time in remote locations and inhospitable environments almost anywhere in the world. Edge computing reduces how much of that raw data has to travel to distant data centers for processing. It allows devices in remote locations to process data at the edge of the network, either on the device itself or on a local server. With edge computing, latency is minimized, because edge devices act like mini data centers.

Time changes everything, and computers go out of date and become obsolete: the old hardware can’t handle the new software, or they run out of memory, become too slow, get infected with a virus, or simply develop problems. Most computers are expected to last five to eight years before they need to be replaced. A computer does not have an expiration date like food, so even if yours is ancient, if it still works and lets you do what you want, well, then you have a perfectly fine computer. We’re entering a period of intensified innovation in PC hardware, so computers are bound to keep getting faster and smaller, but what do you think computers will be like in the future?

Respond to this post if you think that computers in the future will be radically different from what we are using today. Will we still be using keyboards and mice, or will we actually live inside a partially digital world? You could write about whether you are happy with your computer, or whether you are thinking about getting a new one. You could write about how much stuff you have stored in the cloud, whether that is email or photos. Have you ever Googled yourself? If so, you could write about what you found. If you have used Ancestry or 23andMe, you could write about what you learned, or perhaps you have a profile on LinkedIn. You could write about texting replacing email, if you think that is going to happen. I would be happy if you basically wrote anything about a computer or the internet.

10 comments

  1. Reblogged this on A Unique Title For Me and commented:

    Moore’s law (an observation made by Gordon Moore in 1965) predicts that the number of transistors that can be packed onto a silicon integrated circuit will double roughly every two years. While it’s not exactly a direct relationship, you can interpret that to mean that computers will double in processing power every two years, and this has been called the greatest technological prediction of the last half-century. That means that between 2010 and 2050, computer processing power will double 20 times if Moore’s law holds true. Moore’s prediction has fueled many of today’s breakthroughs in artificial intelligence, giving machine-learning techniques the ability to chew through massive amounts of data to find answers.
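
    As a quick back-of-the-envelope check of that claim (a sketch in Python, assuming one clean doubling every two years):

    ```python
    # Rough arithmetic behind the "double 20 times" figure above.
    years = 2050 - 2010
    doublings = years // 2           # one doubling every two years -> 20 doublings
    growth_factor = 2 ** doublings   # 2**20 = 1,048,576
    print(f"{doublings} doublings -> roughly a {growth_factor:,}x increase in processing power")
    ```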

    There is breaking news that Apple is warning of a serious security flaw affecting iPhones, iPads, and Macs that allows hackers to access devices. Apple’s explanation of the vulnerability indicates that a hacker could get “full admin access” to the device. That would allow intruders to impersonate the device’s owner and subsequently run any software in their name.

    • Thanks Reena and I commented on your post about your forgotten password, but I think it went to your SPAM folder because I included a video to show you how you can recover it.

Comments are closed.