Gabelli Funds                                         November 18, 2020
  One Corporate Center
  Rye, NY 10580-1422
  (914) 921-7738
  www.gabelli.com

                            Edge: The Next Frontier

               Source: Cloudflare

Christopher Ward, CFA                                 Gabelli Funds 2020
(914) 921-7738
cward@gabelli.com

                          “Computing is no longer confined to a device or even a
                          single data center. Instead it’s a ubiquitous fabric, it’s
                          distributed from the cloud to the edge, closer to where data
                          is generated, and with the ability to adapt to the wide range
                          of inputs, whether it’s touch, speech, vision or gestures.”

                          Satya Nadella, CEO, Microsoft, Mobile World Congress
                          2019

OVERVIEW
Investors should always be on the lookout for emerging technological paradigms. The theme in software for the past
several years has been the migration of on-premises workloads to public cloud infrastructure and software-as-a-service
(SaaS) applications. COVID-19 has acted as a forcing function for cloud adoption. In a recent Fortune survey, 75% of
Fortune 500 CEOs expect digital transformation projects to accelerate. The traditional benefits of cloud have included
lower total cost of ownership, scalability and access to best-of-breed software. But even more important today, cloud is
enabling business continuity for remote workforces. As more workloads move to the cloud, businesses are being
forced to rethink their entire IT stack, including networking and security. Historically, organizations would purchase
physical networking hardware, such as routers, switches and firewalls, which would sit next to server racks in a data
center. But corporate networks are increasingly being accessed remotely, and legacy models of backhauling internet
traffic through centralized on-prem data centers are slow and expensive. The on-prem model fails to match the speed,
security and cost efficiency of cloud native networking and security solutions, which some refer to as edge networks.
However, there remains a disconnect between investor perception and the utility of edge networks. The objective of
this whitepaper is to explore the role edge networks will play in a world increasingly demanding compliant, secure,
reliable, and low-latency applications.

DEFINING THE EDGE

Exhibit 1: Edge to Core Value Chain
Source: Gartner

In the technical sense of the word, the edge is anywhere things and people can connect with the networked digital
world. At Google's 2019 Hardware Event, Rick Osterloh, Senior Vice President of Devices and Services, described
the company's vision for "ambient computing": "Your devices work together with services and AI, so help is
anywhere you want it, and it's fluid. The technology just fades into the background when you don't need it."
Osterloh is describing a world in which internet-of-things (IoT) devices are pervasive: smartphones, tablets,
speakers, wearables and even home appliances. These are just some examples of edge devices. The edge is poised to
play a growing role in the computing ecosystem. McKinsey expects the number of IoT devices to surge to 43 billion
in 2023, a threefold increase from 2018. This growth will be
enabled by new sensors, more computing power and 5G wireless connectivity.

   "The current practice of IoT data being transported to a central location for processing does not scale well and
   will not meet the real-time latency requirements of some key use cases."

   – The Future X Network, A Bell Labs Perspective

However, in order for IoT to realize its full potential of driving ubiquitous ambient computing, the hardware on edge
devices will need to be low cost with long battery life. Dispatching technicians ("truck rolls") to troubleshoot or
replace batteries will be cost prohibitive in a world where connected devices number in the tens of billions. To
mitigate the need for extra compute hardware on the device and to reduce the power consumption of sensors, data
processing can be outsourced from the device to edge networks. For these reasons, and more discussed below,
compute resources are expected to accrue to edge networks as IoT devices proliferate. The remainder of this paper
will focus not on edge devices, but on edge
networks, as defined by Gartner: “a part of a distributed computing topology in which information processing is
located close to the edge – where things and people produce or consume that information.”

CONTENT DELIVERY NETWORKS

Exhibit 2: Conversion rates decline with load times
Source: Walmart.com study, February 2012

Any discussion about edge networks must begin with Content Delivery Networks (CDNs). As stated in a recent
Federal Communications Commission whitepaper, "Distributed edge computing is analogous to, and can be
regarded as an extension of, the evolution of content distribution over the last few decades." The value proposition
of CDNs is that they can significantly enhance the performance of content delivery over the internet. The public
internet was simply not architected to handle today's ever-increasing speed, security and data demands. In an era
where web speed can dictate business outcomes, fast load time is critical. Exhibit 2 shows how slow load times
reduce conversion rates.

Exhibit 3: Example of Content Delivery Network
Source: CDN Review

CDNs sit between an endpoint device (e.g. mobile phone or laptop) and an origin server (a public cloud or enterprise
data center). CDN servers are typically placed at internet exchange points (IXPs) geographically dispersed around
the world, with the goal of being physically close to the end user. These edge locations are referred to as points of
presence (POPs). When a user requests content (webpages, images, video) for the first time from a service using a
CDN, the CDN will retrieve the content from the origin server and save a copy on the edge server for subsequent
requests. The process of origin servers offloading content onto a CDN server is known as caching. Placing copies of
files in a cache closer to the endpoint results in much faster loading times. In the event a user requests content that is
not cached (a "cache miss"), the CDN retrieves the content from the origin server (of course, this is slower than a
"cache hit").
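The hit/miss flow described above can be sketched as a toy edge cache. This is an illustrative model, not any vendor's implementation; the origin dictionary and TTL are assumptions for the example.

```python
import time

# Stand-in for the origin server; in practice this is a slow trip across the network.
ORIGIN = {"/index.html": "<html>...</html>", "/logo.png": "PNGDATA"}

class EdgeCache:
    """Toy CDN point of presence: serve cached copies, fall back to origin on a miss."""
    def __init__(self, ttl_seconds=60):
        self.store = {}            # path -> (content, expiry timestamp)
        self.ttl = ttl_seconds
        self.hits = 0
        self.misses = 0

    def fetch_from_origin(self, path):
        # Simulates the request back to the origin server on a cache miss.
        return ORIGIN[path]

    def get(self, path):
        entry = self.store.get(path)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            self.hits += 1         # cache hit: served directly from the edge
            return entry[0]
        self.misses += 1           # cache miss: retrieve and cache for later requests
        content = self.fetch_from_origin(path)
        self.store[path] = (content, now + self.ttl)
        return content

pop = EdgeCache()
pop.get("/index.html")             # first request: miss, pulled from origin
pop.get("/index.html")             # second request: hit, served from the edge
print(pop.hits, pop.misses)        # -> 1 1
```

The second request never leaves the POP, which is the entire performance argument for caching close to the user.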

CDNs also have many benefits that are less visible to the end user, including reduced bandwidth costs (by reducing
the traffic that traverses telco pipes), redundancy (improved uptime) and security (e.g. distributed denial-of-service
protection, web application firewalls). CDNs play a critical role in the internet ecosystem, but the industry is
relatively commoditized, with annual price compression of ~20%.

   "A website on Cloudflare sees 65% fewer requests to the origin, yielding a 60% reduction in bandwidth
   consumption on your origin web servers."

   – Cloudflare CDN Whitepaper

CASE STUDY: NETFLIX'S OPEN CONNECT

Exhibit 4: Netflix Open Connect Network
Source: Netflix

If you've ever wondered how Netflix is able to instantaneously load video, even faster than peer streaming services,
the answer lies in its purpose-built, globally distributed CDN, Open Connect. While Netflix utilizes Amazon Web
Services (AWS) for storage and compute, it maintains its own CDN infrastructure to optimize the delivery of video
traffic. Prior to developing Open Connect in 2011, Netflix used a variety of third-party CDNs, such as Akamai,
Level 3 and Limelight Networks. But in response to its growing scale, Netflix decided it could better optimize video
delivery by building its own customized CDN, streamlining both the hardware and the software. In a digital world
where milliseconds can impact customer engagement, Netflix can't rely on the public internet, or even third-party
CDNs, to deliver the consumer experience it strives for. While this do-it-yourself (DIY) CDN approach can make
sense for the biggest internet properties in the world, the economics of shared infrastructure are far more attractive
for the vast majority of applications.

EDGE NETWORKS

   "The enterprise perimeter is no longer a location; it is a set of dynamic edge capabilities delivered when needed
   as a service from the cloud."

   – Gartner, The Future of Network Security Is in the Cloud

While CDNs focus on the delivery of media content, modern edge networks go one step further by facilitating
networking and network security at the edge. In a traditional "hub-and-spoke" network architecture, all applications
would be hosted in a corporate data center, which users and branch offices could access through a localized private
network or VPN. This gave rise to the "castle-and-moat" security model: protect the network perimeter with
firewalls and trust users within the perimeter by default. In recent years, the growth of cloud applications and
distributed workforces has challenged these legacy models. The attack surface has broadened as fewer users and
applications reside within the castle. As a result, new cloud-based security models have emerged. Secure access
service edge (SASE) combines software-defined networking and network security into a single solution. Gartner
estimates that by 2024, at least 40% of enterprises will have explicit strategies to adopt SASE, up from less than 1%
at year-end 2018. SASE is being leveraged to implement Zero Trust models, which, in contrast to legacy castle-and-
moat models, remove all trust assumptions when users connect to applications. Edge networks are uniquely
positioned to provide these cloud-based security solutions, which won't just enhance security, but will also reduce
internet backhaul requirements, saving on costs and reducing latency. Importantly, the success of these new security
models is not dependent on the demise of the corporate data center. The corporate data center will still exist, but no
longer as the center of the network universe.

   "Employees have left the building. Applications have left the building. And we now live in this distributed world
   of applications and employees… The security stack needs to become ubiquitous and omnipresent… and delivered
   at the edge."

   – Dr. Robert Blumofe, EVP, Platform & GM, Enterprise, Akamai

Cloudflare and Fastly are two disruptors in the edge networking space. Both have CDN heritage and have established
hundreds of POPs around the globe. Cloudflare currently has about 200 POPs, while Fastly maintains about half of
that. Zscaler, an edge network focused solely on security, has a footprint of over 150 edge locations.
Meanwhile, Akamai, a legacy CDN vendor, has over 325,000 servers in more than 4,000 locations. Here, we believe,
rest some of the biggest misconceptions about edge networks. While some argue that more locations are always
better, Cloudflare and Fastly edge servers are far more powerful and capable of caching far more content. This
results in much higher cache hit rates, and ultimately, a faster network.
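Why cache hit rate dominates can be made concrete with a back-of-the-envelope calculation. The latency figures below are illustrative assumptions, not measured values for any vendor:

```python
def expected_latency_ms(hit_rate, edge_ms, origin_ms):
    """Average response time as a hit-rate-weighted blend of edge and origin latency."""
    return hit_rate * edge_ms + (1 - hit_rate) * origin_ms

# Assumed figures: ~20 ms from a nearby POP, ~200 ms for a trip back to origin.
for hit_rate in (0.80, 0.95):
    avg = expected_latency_ms(hit_rate, 20, 200)
    print(f"{hit_rate:.0%} hit rate -> {avg:.0f} ms average")
```

Under these assumptions, lifting the hit rate from 80% to 95% cuts average latency roughly in half (56 ms to 29 ms), which is why a smaller number of larger, well-stocked POPs can outperform a larger number of thinly provisioned ones.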

Even more important than cache hit rates, Cloudflare and Fastly were architected from the ground up to enable
programmability. Both Cloudflare and Fastly built software-defined networks on top of commodity hardware. To
loosely quote a Fastly engineer at the 2015 Fastly Altitude conference: rather than the network being the
infrastructure, the network is now the application. This enables features that are highly valuable to engineers, like
customized cache policy and real-time performance monitoring. The capabilities of these edge networks have
attracted a customer base including some of the world's most sophisticated internet and software properties, like
Shopify, Google, Pinterest and Slack. As consumer use cases evolve, which they surely will with the advent of 5G
and IoT, the programmable nature of edge networks will allow them to continuously adapt through software
iteration.

   "Think of legacy POPs like convenience stores. You'll find one on the corner of almost every street. Since
   they're close to you, you can reach these stores quickly, but they only have the bare essentials. That means you
   may need to go online to order the rest of the items on your shopping list and have them shipped overnight
   (analogous to a cache miss forcing a request back to the origin server). Alternatively, you can go to a large
   supermarket a few miles away (in this analogy, this would be one of Fastly's powerful POPs). It may take
   slightly longer to reach, but the vast selection means that you have a much better chance of finding everything on
   your shopping list quickly (nearly everything is cached, which results in a higher cache hit ratio)."

   – Fastly Blog, June 8, 2016

IDC estimates the networking and security market to reach $47B in 2022. Edge networks have tremendous growth
runway as companies shift spending away from on-premises networking and security hardware and towards cloud
services. However, there’s an even more exciting opportunity that has only recently become a reality: edge compute.

EDGE COMPUTE – A NEW PARADIGM

   "As sensor datasets grow larger, traditional data management techniques (such as loading data into a SQL
   database and then querying it) will clearly prove inadequate. To avoid moving massive amounts of data around,
   computations will need to be distributed and pushed as close to data sources as possible."

   – The Fourth Paradigm: Data-Intensive Scientific Discovery

Jim Gray, famed Microsoft researcher, was decades ahead of his time when, in the mid-1990s, he predicted that
advancements in computing would drive a "data deluge", which would overwhelm existing computing architectures
and necessitate a "fourth paradigm". The framework Gray put forward to accommodate this data deluge was a
distributed architecture in which compute was brought to the data, rather than data to the compute.

It is helpful to view the potential for edge compute through the lens of the Von Neumann Bottleneck, a computing
concept that suggests the total processing power of a computer is limited by the bandwidth of the network. In other
words, a computing architecture is only as strong as its weakest link – a powerful CPU can't overcome a slow rate-
of-transfer to deliver low latency. NVIDIA CEO Jensen Huang has used this rationale extensively to justify the
company's acquisition of networking company Mellanox: "When NVIDIA accelerates compute by 10-50x, moving
data becomes the bottleneck."

   "While distributed CDNs mostly revolve around storage caches, enabling applications with edge compute
   extends this to both compute and storage, and the more general cloud service stack necessary to on-board and
   run 3rd-party applications."

   – FCC Technological Advisory Council: 5G Edge Computing Whitepaper

   "As we look toward the future dominated by much simpler, lower power 'things' with minimal storage and with
   the requirement of a 10-year battery life, this phenomenon of shifting computing power to the cloud will be
   accelerated. The network will again become the bottleneck."

   – The Future X Network, A Bell Labs Perspective

Bottlenecks in the computing value chain have evolved with every computing paradigm. The original mainframes
were expensive and took up considerable space. Compute resources were scarce and had to be shared amongst
multiple users. During this era, the bottleneck was compute. With the introduction of multi-user operating systems,
like Unix, multiple terminals could share the compute resources of a single mainframe. The bottleneck was no
longer compute, but the network. The network bottleneck was removed with the introduction of TCP/IP networking,
which enabled local area networks and eventually the "network of networks", the internet. As data centers scaled out
to accommodate the growth of the internet, the bottleneck was no longer compute or networking, but locating useful
information in a cyberspace defined by abundance. This void was eventually filled by browsers and search engines.
Today, most mobile and desktop applications continue to rely on these centralized data centers for processing data.
However, due to the physical distance between cloud data centers and endpoints, the speed of light imposes a degree
of latency that is not tolerable for certain use cases. In a future of ubiquitous 5G providing low-latency wireless
connectivity for the "last mile", paired with the abundant compute resources residing in the cloud, the "middle mile"
network will, once again, become the bottleneck. This is what the next iteration of cloud, edge compute, is seeking
to solve. By moving compute to edge servers, minimizing the distance from endpoint devices, latency can be
dramatically improved. Edge networks are in a unique position to enable edge computing by leveraging their
globally distributed, powerful, and programmable infrastructure.

   "As enterprises complete the shift to the cost-saving measures provided by the central cloud, their next frontier is
   to move logic, compute power and security to the edge in order to more effectively meet their customers in the
   digital-first way that consumers have come to expect and rely on."

   – Fastly Q3 2020 shareholder letter
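The physics behind the distance argument is simple to quantify. A rough lower bound on round-trip time, ignoring routing, queuing and protocol overhead, is set by the speed of light in fiber, commonly approximated as two-thirds of c; the distances below are illustrative assumptions:

```python
SPEED_OF_LIGHT_FIBER_KM_S = 200_000  # ~2/3 of c, a common approximation for glass fiber

def min_round_trip_ms(distance_km):
    """Best-case round-trip time over fiber for a given one-way distance."""
    return 2 * distance_km / SPEED_OF_LIGHT_FIBER_KM_S * 1000

# A user ~4,000 km from a centralized cloud region vs. ~100 km from an edge POP.
print(f"Cloud region: {min_round_trip_ms(4000):.0f} ms")  # -> 40 ms
print(f"Edge POP:     {min_round_trip_ms(100):.0f} ms")   # -> 1 ms
```

No amount of server-side optimization can beat this floor; only shortening the distance, which is precisely what edge placement does, can.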

SERVERLESS AT THE EDGE

Exhibit 5: Evolution of computing architectures
Source: Cloudflare

New software architectures have emerged with each computing paradigm. Prior to the introduction of virtual
machines (VMs), a single server would be dedicated to a single application. VMs significantly increased the
efficiency of compute resources by making it possible to host multiple environments on one server. Containers
increased efficiency even further, using a fraction of the memory of a VM and allowing developers to package
applications into units that can be deployed in any operating environment. Serverless computing is an even more
efficient, lightweight architecture. Serverless, sometimes referred to as Function-as-a-Service (FaaS), completely
abstracts infrastructure management tasks and allows developers to focus on writing code and deploying it
everywhere at once. Developers don't have to pick a cloud availability region or scale up an instance. While servers
are still responsible for executing the code, no specific machine is assigned to a given function or application. In a
serverless environment, servers aren't continuously running. Instead, code is only executed when called upon,
eliminating the problem of idle server capacity. Developers are charged based on usage rather than bandwidth or
number of servers. Serverless architectures are inherently scalable and generally lower cost (users don't pay for idle
server time). IDC expects the Function-as-a-Service market to grow at a 27% CAGR to reach over $10B in 2023.
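The FaaS model can be illustrated with a toy dispatcher: functions sit idle until invoked, and billing meters invocations rather than servers. This is a conceptual sketch, not any vendor's actual API; the function names are made up for the example.

```python
import collections

class ToyFaaS:
    """Toy function-as-a-service platform: register functions, meter invocations."""
    def __init__(self):
        self.functions = {}
        # Usage-based billing counts calls, not reserved servers or idle time.
        self.invocations = collections.Counter()

    def register(self, name, fn):
        self.functions[name] = fn      # nothing runs yet; no capacity is provisioned

    def invoke(self, name, *args):
        self.invocations[name] += 1    # metered per call
        return self.functions[name](*args)

faas = ToyFaaS()
faas.register("resize", lambda w, h: (w // 2, h // 2))
print(faas.invoke("resize", 800, 600))   # -> (400, 300)
print(faas.invocations["resize"])        # -> 1
```

The key contrast with a VM or container is that nothing about "resize" consumes resources between calls, which is what eliminates the idle-capacity cost the paragraph above describes.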

Exhibit 6: Serverless response times
Source: Cloudflare

Serverless computing is not a new concept. AWS Lambda, Amazon's serverless platform, was launched in 2014.
Microsoft (Azure Functions) and Google (Cloud Functions) have competing offerings. What is new, however, is
running a serverless platform at the edge. Cloudflare launched its Workers platform in 2018. Fastly's
Compute@Edge recently transitioned from beta to limited availability. And Akamai launched its EdgeWorkers in
October 2019. It is notable, as illustrated in Exhibit 6, that next-gen edge networks, like Cloudflare, can deliver
significantly lower latency compared to public cloud offerings, including Amazon's Lambda@Edge. Part of
Cloudflare's advantage lies in the architecture of the software (Cloudflare is built on V8 isolates, whereas most
serverless platforms are container-based).

   "Because of the latency to the cloud, there will be what we call edge clouds, you'll start to see cloud extending
   itself very close to the network to take advantage of the low-latency you have with 5G… you could do
   computational offload, for example… we think it is going to be a very significant driver of the future."

   – Dr. James Thompson, EVP, Engineering & CTO, Qualcomm, 2019 Analyst Day

What we are really describing here, platforms for developers to build and deploy applications at the edge for low-
latency use cases, could be considered a new type of cloud: "edge clouds".

EDGE COMPUTING USE CASES

Exhibit 7: 5G a catalyst for new technologies
Source: Deloitte Insights

Edge computing is not a new concept. Akamai launched an edge computing platform in 2002 (only to shut it down
due to lack of traction). So why the excitement today? Key technologies, such as 5G, IoT and software-defined
networking, are converging and reaching critical inflection points. These technologies will require massive amounts
of low-latency processing in order to deliver on the vision of ambient computing. There is a bit of a chicken-or-egg
dilemma with 5G and IoT applications. Developers can be hesitant to write applications ahead of the deployment of
widely available 5G networks. Meanwhile, without applications to run on the network, it can be hard to attract the
capital to build out the physical infrastructure. But after many years of preparation and anticipation, the rollout of
5G is happening. Just as Uber would have been hard to fathom prior to 4G, the speed, latency and massive
connectivity of 5G will provide the foundation for the next generation of developers to spawn entirely new
categories of applications. With the consumer well served by existing applications, many edge applications will be
tailored to machines rather than humans. Edge computing could potentially address any use case requiring low-
latency, two-way bandwidth. Autonomous vehicles, drones and industrial robots will all carry a certain amount of
local compute capability, but the onboard compute will need to continuously retrieve real-time insights (e.g.
navigation maps, weather) delivered from edge servers. Smart cities could be retrofitted with security surveillance
and traffic management systems. Hospitals could deploy more sophisticated patient monitoring systems and remote
robotic surgery tools. Farms could utilize precision agriculture and soil condition sensing to improve yield.
Predictive maintenance on heavy machinery could improve productivity and safety. Voice recognition and natural
language processing could be nearly instantaneous, making these applications practical for more use cases, like
customer service centers and smart home devices. And lastly, edge compute might be the key unlock for video game
streaming, which has so far failed to gain widespread adoption due to latency issues. AR and VR applications would
be able to render their surroundings in real-time, avoiding the motion sickness commonly caused by as little as 15
milliseconds of latency.

Source: Fastly Q2 2020 Investor Presentation

While it is easy to get excited about autonomous vehicles and drones, the more mundane use cases might offer the
biggest near-term opportunities. Compliance is one such underappreciated near-term use case. Historically, bits have
flowed unencumbered across borders, but the global internet is becoming increasingly fragmented. This goes well
beyond China's Great Firewall to now include Europe with the implementation of the General Data Protection
Regulation (GDPR). The growing trend towards data sovereignty will require all citizens' data to be stored and
processed within a country's borders. The centralized nature of today's public clouds makes them ill-equipped to
comply with these requirements. Meanwhile, the distributed, programmable nature of edge networks makes them
uniquely suited to the task. Matthew Prince, CEO of Cloudflare, sees compliance as one of the killer features for
edge computing: "as governments impose new data sovereignty regulations, having a network that, with a single
platform, spans every regulated geography will be critical for companies seeking to keep and process locally to
comply with these new laws while remaining efficient."

   "Incoming data can be triaged and acted upon at the point of collection. This can deliver greater security and
   privacy by keeping critical information such as personally identifiable data at the endpoint rather than moving
   over networks, as well as meeting data residency requirements mandating that personal data must be confined
   within specific jurisdictions."

   – Deloitte Insights

EDGE AS ESG

Exhibit 9: Data center efficiency has plateaued
Source: Google 2019 Environmental Report

Edge networks also have a lot to offer ESG-minded investors. In addition to enabling security, a central feature of
these networks, edge networks also have the potential to make a positive environmental impact. Some, like
economist Jeremy Rifkin, believe IoT paired with an "energy internet" could serve as the foundation for a more
efficient, distributed, renewable energy infrastructure. Edge networks could serve as the backbone for these smart
grids, monitoring energy consumption, managing energy distribution, and orchestrating information management
across the entire value chain. Energy consumption for commercial buildings could be optimized. Scarce natural
resources, like water, could be more effectively monitored from collection to delivery and consumption.

Edge networks could also reduce energy consumption in data centers. A recent study published in the academic
journal Science estimates that data centers accounted for 1% of global electricity consumption in 2018. Though data
center operators have become more energy efficient over the last decade, efficiency gains have plateaued, as seen in
Exhibit 9. Moving forward, data center energy demand, driven by growth in data and compute demands, is likely to
rise faster than energy efficiency gains. Edge networks can help reduce energy consumption in a few ways. By
reducing the volume of traffic traversing the network to the origin server, edge networks reduce not only latency, but
also cost and energy consumption. Meanwhile, serverless computing improves server utilization rates compared to a
virtual machine, which requires reserved capacity.

SIZING EDGE COMPUTE

Exhibit 10: Edge compute market to grow at a 37% CAGR
Source: Grand View Research

These are just some of the edge use cases that will drive the "data deluge" forecasted by Jim Gray. IDC estimates the
global "datasphere", fueled by the explosion of connected devices, will grow from 33 zettabytes (33 trillion
gigabytes) in 2018 to 175 zettabytes by 2025. Gartner predicts that data generated outside of data centers will grow
from 10% of total data in 2018 to 50% in 2022 and 75% by 2025. If public clouds were to absorb all of this data,
there would be tremendous bandwidth and infrastructure costs, as origin servers would need to scale with the growth
in data. Edge networks can help. By processing more data locally and filtering out irrelevant data, edge networks
can reduce the compute burden on origin servers. The explosion in data is driving edge industry growth. Gartner
expects that over 50% of large enterprises will deploy at least six edge computing use cases by the end of 2023, up
from less than 1% of enterprises in 2019. IDC predicts that by 2023, over 50% of new enterprise IT infrastructure
deployed will be at the edge rather than in corporate datacenters, and that edge networks will represent 60% of all
deployed infrastructure. Putting this into dollar context, Grand View Research estimates the edge computing market
will grow at a 39% CAGR to reach $43 billion by 2027, with software being the fastest growing sub-segment. Other
industry experts, such as Mark Thiele, CEO of edgevana
                                                          -8-
and thought leader in the space, believe “the $43 billion number by 2027 is off by 300% and that in
the same 14 years it took cloud/SaaS/PaaS to reach $218 billion, all Edge services will be worth
closer to $300 billion.” While the exact addressable market can be debated, it is fair to say the opportunity is
somewhere between big and huge.
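As a quick sanity check on Exhibit 10's figures, simple compound-growth arithmetic backs out the market size the $43 billion 2027 endpoint and 37% CAGR imply for today (the base year and the derived starting value are our assumptions, not figures quoted from the report):

```python
# Back out the implied starting market size from the cited endpoint and CAGR:
#   value_2027 = value_2020 * (1 + cagr) ** years  =>  solve for value_2020.
value_2027 = 43.0   # $ billions by 2027, per Grand View Research (Exhibit 10)
cagr = 0.37         # 37% compound annual growth rate, per Exhibit 10
years = 7           # assumes a 2020 base year for the forecast

implied_2020 = value_2027 / (1 + cagr) ** years
print(f"implied 2020 edge compute market: ${implied_2020:.1f}B")  # ≈ $4.7B
```

The implied starting point of roughly $4.7 billion underscores how early the market is relative to its projected end state.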

THE RACE TO THE EDGE
Edge computing remains in its infancy, but with the catalyst of 5G on the horizon, operators are jockeying for position.
Perhaps the most important leading indicator of success for any new computing platform is developer mindshare. Steve
Ballmer’s infamous “developers, developers, developers!” chant at Microsoft’s 2000 Windows Conference highlights
just how critical developers are. The influence of developers within organizations will only continue to increase as
digital transformation accelerates and every business becomes a digital business. Edge developer data is scant, but
some recent datapoints stand out. Cloudflare had more than 27,000 developers write and deploy their first Cloudflare
Workers application in the third quarter, up from 15,000 a year ago. Further, more developers deploy code on Workers
each month than on every other edge computing platform combined. Cloudflare clearly has early developer momentum.

Source: OliverGeary

One should never underestimate the resources and technical capabilities of the hyperscalers, Amazon, Microsoft and
Google. Fortunately, today’s modern edge networks have some structural advantages. First, they can claim neutrality,
as they don’t compete with their customers the way Amazon might in retail, or Google might in digital advertising.
Additionally, the modern edge networks are truly cloud agnostic, ideal for developers with multi-cloud environments.

CONCLUSION
While some hypothesize that edge networks could subsume public cloud, the more likely role of edge networks will be
to augment centralized clouds. Much of the “data deluge” that is expected from connected devices will be incremental,
not cannibalistic, to the data residing in public clouds. Furthermore, there are certain use cases that benefit from data
centralization. Big data analytics and machine learning are best suited for public cloud.

                    “Critically, the intelligent edge is not a replacement for enterprise
                    and hyperscale cloud data centers but a way to distribute tasks
                    across the network based on timeliness, connectivity, and security.”

                    – Deloitte Insights

“Hybrid” cloud has historically described the mix of workloads between on-premises and public cloud. In the future,
the definition of hybrid is likely to extend further to encapsulate edge networks. Workloads will be optimized across
these various environments, each serving a specific purpose. While the total addressable market can be debated, edge
networks are in a unique position to capture share of the $47 billion networking and security market and the $43
billion edge compute market.

Christopher Ward, CFA                                                                     © Gabelli Funds 2020
(914) 921-7738
cward@gabelli.com

This whitepaper was prepared by Christopher Ward, CFA. The examples cited herein are based on public information,
and we make no representations regarding their accuracy or usefulness as precedent. The Research Analyst’s views are
subject to change at any time based on market and other conditions. The information in this report represents the
opinions of the individual Research Analyst as of the date hereof and is not intended to be a forecast of future events,
a guarantee of future results, or investment advice. The views expressed may differ from those of other Research
Analysts or of the Firm as a whole.

As of October 31, 2020, affiliates of GAMCO Investors, Inc. beneficially owned less than 1% of all other companies
mentioned.

This whitepaper is not an offer to sell any security nor is it a solicitation of an offer to buy any security.
Investors should consider the investment objectives, risks, sales charges, and expenses of the fund carefully before
investing.

               For more information, visit our website at: www.gabelli.com or call: 800-GABELLI

                            800-422-3554 • 914-921-5000 • Fax 914-921-5098 • info@gabelli.com
