
Q&A: How to Find Value at the Edge Featuring Michele Pelino


We recently held a webinar, “Find Value at the Edge: Innovation Opportunities and Use Cases,” where Forrester Principal Analyst Michele Pelino was our guest speaker. After the event, we held a Q&A with Pelino highlighting edge infrastructure solutions and their benefits. Here’s a look at the interview: 

SUSE: What technologies (containers, Kubernetes, cloud native, etc.) enable workload affinity in the context of edge? 

Michele: The concept of workload affinity enables firms to deploy software where it runs best. Workload affinity is increasingly important as firms deploy AI code across a variety of specialized chips and networks. As firms explore these new possibilities, running the right workloads in the right locations — cloud, data center, and edge — is critical. Increasingly, firms are embracing cloud native technologies to achieve these deployment synergies. 

Many technologies enable workload affinity — for example, cloud native integration tools and container platforms, which bring the benefits of the cloud to applications wherever they run. Kubernetes, a key open source system, lets enterprises automate the deployment, scaling, and management of containerized applications in cloud native environments. Kubernetes platforms also give developers software design, deployment, and portability strategies to extend applications in a seamless, scalable manner. 
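As a concrete illustration of "running the right workloads in the right locations," here is a minimal sketch of a Kubernetes Deployment that pins a containerized application to labeled edge nodes. The workload name, node label, and image are hypothetical:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: edge-inference            # hypothetical workload name
spec:
  replicas: 2
  selector:
    matchLabels:
      app: edge-inference
  template:
    metadata:
      labels:
        app: edge-inference
    spec:
      nodeSelector:
        node-role.example.com/edge: "true"   # schedule only onto edge-labeled nodes
      containers:
        - name: inference
          image: registry.example.com/edge-inference:1.0   # hypothetical image
          resources:
            limits:
              cpu: "500m"
              memory: "256Mi"
```

Labeling nodes (for example, `kubectl label node <node-name> node-role.example.com/edge=true`) and selecting on that label is one simple way to express workload affinity; Kubernetes also offers richer `affinity` and `anti-affinity` rules for more nuanced placement.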

SUSE: What are the benefits of using cloud native technology in implementing edge computing solutions? 

Michele: Proactive enterprises are extending applications to the edge by deploying compute, connectivity, storage, and intelligence close to where it’s needed. Cloud native technologies deliver massive scalability, as well as enable performance, resilience, and ease of management for critical applications and business scenarios. In addition, cloud functions can analyze large data sets, identify trends, generate predictive analytics models, and remotely manage data and applications globally. 

Cloud native apps can leverage development principles such as containers and microservices to make edge solutions more dynamic. Applications running at the edge can be developed, iterated, and deployed at an accelerated rate, which reduces the time it takes to launch new features and services. This approach improves end user experience because updates can be made swiftly. In addition, when connections are lost between the edge and the cloud, those applications at the edge remain up to date and functional. 

SUSE: How do you mitigate/address some of the operational challenges in implementing edge computing at scale? 

Michele: Edge solutions drive real-time decisions across key operational processes in distributed sites and local geographies, so firms must address the resulting impacts on network operations and infrastructure. It is essential to ensure interoperability across edge computing deployments, which often have different device, infrastructure, and connectivity requirements. Third-party partners can help stakeholders deploy seamless solutions across edge environments, as well as connect to the cloud when appropriate. Infrastructure spread across geographically diverse locations is harder to maintain, which highlights the need for automated and orchestrated management systems spanning the various edge environments. 

Other operational issues include assessing the data response requirements of edge use cases and the distance between edge environments and computing resources, which affects response times. On the connectivity side, firms must evaluate bandwidth limitations and determine which processing happens at the edge. It is also important to ensure that deployment initiatives enable seamless orchestration and maintenance of edge solutions. Finally, firms should assess employee expertise to identify skill-set gaps in areas such as mesh networking, software-defined networking (SDN), analytics, and development. 

SUSE: What are some of the must-haves for securing the edge? 

Michele: Thousands of connected edge devices across multiple locations create a fragmented attack surface for hackers, while business-wide networking fabrics interweave business assets, customers, partners, and digital services across the ecosystem. This complex environment elevates the importance of edge security and of implementing strong end-to-end protection, from sensors to data centers, to mitigate security threats. 

Implementing a Zero Trust edge (ZTE) policy for the networks and devices powering edge solutions, using a least-privileged approach to access control, addresses these security issues. ZTE solutions securely connect and transport traffic in and out of remote sites using Zero Trust access principles, leveraging mostly cloud-based security and networking services. They protect businesses as customers, employees, contractors, and devices at remote sites connect through WAN fabrics to more open, dangerous, and turbulent environments. When designing a system architecture that incorporates edge computing resources, technology stakeholders need to ensure that the architecture adheres to cybersecurity best practices and to the regulations that govern data wherever it is located. 
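The least-privileged access control described above can be sketched as a default-deny policy check. This minimal Python illustration uses hypothetical identities, resources, and grants — a sketch of the principle, not a production ZTE implementation:

```python
# A minimal sketch of least-privileged, default-deny access control --
# the core idea behind a Zero Trust edge (ZTE) policy.
# All identities, resources, and grants here are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class Grant:
    identity: str   # a device or service account at an edge site
    resource: str   # an API, data store, or edge service
    action: str     # e.g. "read", "write"


# Explicit allow-list: any (identity, resource, action) not listed is denied.
GRANTS = {
    Grant("sensor-gw-01", "telemetry-api", "write"),
    Grant("dashboard-svc", "telemetry-api", "read"),
}


def is_allowed(identity: str, resource: str, action: str) -> bool:
    """Default-deny check: access requires an explicit, narrow grant."""
    return Grant(identity, resource, action) in GRANTS
```

With this policy, `is_allowed("sensor-gw-01", "telemetry-api", "write")` succeeds, while any combination without an explicit grant — including the same gateway attempting a read — is denied by default, which is the essence of least privilege.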

SUSE: Once cloud radio access network (RAN) becomes a reality, will operators be able to monetize the underlying edge infrastructure to run customer applications side by side? 

Michele: Cloud RAN can enhance network versatility and agility, accelerate introduction of new radio features, and enable shared infrastructure with other edge services, such as multiaccess edge computing or fixed-wireless access. In the future, new opportunities will extend use cases to transform business operations and industry-focused applications. Infrastructure sharing will help firms reduce costs, enhance service scalability, and facilitate portable applications.

Cloud RAN and cloud native application development will extend private 5G in enterprise and industrial environments by reducing latency from the telco edge to the device edge. Enabling compute functions closer to the data will power AI and machine-learning insights to build smarter infrastructure, smarter industry, and smarter city environments. Sharing insights and innovations through open source communities will facilitate evolving innovation in cloud RAN deployments and emerging applications that leverage new hardware features and cloud native design principles.

What’s next? 

Register for and watch the “Find Value at the Edge: Innovation Opportunities and Use Cases” webinar today! Also, get a complimentary copy of the Forrester report, The Future of Edge Computing.