What does the edge mean for AI (Artificial Intelligence)?

The edge is an endpoint at which data is generated via an interface, device, or sensor. The technology itself is nothing new. However, given the rapid innovation across a variety of categories, the edge has become an important growth business.

“The edge brings intelligence as close as possible to the data source and the action point,” said Teresa Tung, executive director at Accenture Labs. “This is important because centralized cloud computing makes processing data on a large scale easier and cheaper. However, sometimes it doesn’t make sense to send data to the cloud for processing.”

This is especially critical for AI. The fact is, consumers and businesses alike want their applications to perform at blazing speed.

“At the moment, AI training generates large amounts of data that are processed and stored almost exclusively in the cloud,” said Flavio Bonomi, board advisor at Lynx Software. “But by placing computation at the edge, you can spot patterns locally. We believe this can allow training models to evolve to become simpler and more effective.”
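Bonomi's point about spotting patterns locally, rather than streaming every raw sample to the cloud, can be sketched as a simple rolling-threshold anomaly detector running on the device. This is an illustrative sketch, not any vendor's implementation; all names are hypothetical:

```python
from collections import deque
from statistics import mean, stdev

class EdgeAnomalyDetector:
    """Keep a rolling window of sensor readings on-device and flag
    outliers locally; only flagged events need to go upstream."""

    def __init__(self, window=50, z_threshold=3.0):
        self.readings = deque(maxlen=window)
        self.z_threshold = z_threshold

    def observe(self, value):
        flagged = False
        if len(self.readings) >= 10:          # wait for a baseline
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                flagged = True                # the one event worth sending to the cloud
        self.readings.append(value)
        return flagged

detector = EdgeAnomalyDetector()
normal = [20.0 + 0.1 * (i % 5) for i in range(50)]   # stable sensor trace
alerts = [detector.observe(v) for v in normal]        # no alerts expected
spike_alert = detector.observe(45.0)                  # sudden spike is flagged
```

The design choice here mirrors the quote: the raw trace stays on the device, and only the rare, high-signal event crosses the network.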

The edge can even enable improved privacy in AI models. “Federated learning means that no end-user data is centralized or communicated between nodes,” said Sean Leach, Fastly’s chief product architect.
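The privacy property Leach describes can be sketched in a few lines: each node trains on its own private data, and only model weights (never raw samples) are averaged centrally. A minimal federated-averaging sketch in plain NumPy, with all names and parameters hypothetical rather than taken from any particular framework:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=20):
    """Train a linear model on one node's private data; raw data never leaves."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)    # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_average(weight_list):
    """The server aggregates only the weights, not the underlying data."""
    return np.mean(weight_list, axis=0)

# Two edge nodes, each holding private samples of the same y = 2*x relationship
rng = np.random.default_rng(0)
w_global = np.zeros(1)
for _ in range(10):                          # communication rounds
    updates = []
    for _ in range(2):                       # each node trains locally
        X = rng.normal(size=(32, 1))
        y = X @ np.array([2.0])
        updates.append(local_update(w_global, X, y))
    w_global = federated_average(updates)    # only weights cross the network
```

After a few rounds the shared model converges toward the true coefficient even though neither node ever revealed its data.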


What can be done at the edge

The most notable use case for the edge and AI is the self-driving car. The complexity is mind-boggling, which is why the technology has taken so long to develop.

But of course there are many other use cases across a wide variety of industries. Just look at manufacturing. “When monitoring manufacturing processes, where seconds or minutes can cost millions of dollars in losses, machine learning models embedded in the sensors and devices that collect the data can enable operators to preventively mitigate serious production problems and optimize performance,” said Santiago Giraldo, senior product marketing manager for machine learning at Cloudera.

Here are some other examples:

  • Chris Bergey, Senior Vice President and General Manager of Infrastructure at Arm: “AI and the edge can help study the effects of urbanization and climate change with software-defined sensor networks, use data to determine the causes of blackouts in smart grids, or improve public-safety initiatives through data streaming.”
  • Adam Burns, Vice President of IoT and Director of Edge Inference Products at Intel: “CORaiL, a project with Accenture and the Sulubaaï Environmental Foundation, can analyze coral reef resilience using intelligent cameras and video analytics powered by Intel Movidius VPUs, Intel FPGAs and CPUs, as well as the OpenVINO toolkit.”
  • Jason Shepherd, Vice President of Ecosystems at ZEDEDA: “TinyML will enable AI in more devices, connected products, healthcare wearables, etc., for fixed functions performed locally and triggered by simple voice and gesture commands, common sounds (a baby crying, water running), location and orientation, environmental conditions, vital signs, and so on.”
  • Michael Berthold, CEO and co-founder of KNIME: “In the future we will also see models that update themselves and possibly intentionally recruit new data points for retraining.”
  • Ari Weil, Akamai’s global vice president of product and industry marketing: “Look at medical devices like pacemakers and heart rate monitors in hospitals. When they indicate distress or a condition that requires immediate attention, AI processing on or near the device can mean the difference between life and death.”

However, successful implementation of AI at the edge will face challenges and will likely take years to reach critical mass. “The edge has far less resource capacity than a data center. Edge deployments require lightweight solutions that are security-centric and support low-latency applications,” said Brons Larson, PhD, AI strategy lead at Dell Technologies.

There is also a need to invest heavily in infrastructure and to upgrade existing technologies. “This is a great opportunity for NetApp, but we need to reinvent our storage to support it,” said Ross Ackerman, director of customer experience and Active IQ data science at NetApp. “Much of the typical ONTAP value proposition is lost at the edge because clones and snapshots have less value there. Data at the edge is mostly short-lived, needed only briefly to make a recommendation.”

Then there are the cybersecurity risks. In fact, because of their impact on the physical world, edge threats could prove more dangerous than typical ones.

“Because the edge is used in applications and workflows, there is not always consistent security in place to provide centralized visibility,” said Derek Manky, chief of security insights and global threat alliances at Fortinet’s FortiGuard Labs. “Centralized visibility and unified controls are sometimes sacrificed in favor of performance and agility.”

Given the issues with the edge and AI, the focus needs to be on building quality systems, but also on rethinking traditional approaches. Here are some recommendations:

  • Prasad Alluri, vice president of corporate strategy at Micron: “The rise of AI also means that it is increasingly important for edge computing to sit close to 5G base stations. Soon, every tower and base station could contain compute and storage nodes.”
  • Debu Chatterjee, Senior Director for AI Platform Engineering at ServiceNow: “We will need newer chips with tensor capabilities in GPUs or their alternatives, or specialized inference models burned into FPGAs. A hardware/software combination will be required to provide a zero-trust security model at the edge.”
  • Abhinav Joshi, Global Product Marketing Leader for the OpenShift Kubernetes Platform at Red Hat: “Many of these challenges can be successfully tackled at the outset by approaching the project with a focus on an end-to-end solution architecture built on containers, Kubernetes, and DevOps best practices.”

When it comes to AI and the edge, starting with the low-hanging fruit is probably the best strategy. This should help avoid failed projects.

“Enterprises should apply AI to smaller, non-business-critical applications first,” said Bob Friday, chief technology officer at Mist Systems, a Juniper Networks company. “By paying close attention to details such as choosing the right location and the operational cloud stack, management of the deployment can be simplified.”

Regardless of the approach, however, the future looks bright. And AI efforts really do need to consider the edge's potential use cases to achieve their full value.

Tom (@ttaulli) is an advisor and board member to startups and the author of Artificial Intelligence Basics: A Non-Technical Introduction, The Robotic Process Automation Handbook: A Guide to Implementing RPA Systems, and Implementing AI Systems: Transform Your Business in 6 Steps. He has also developed various online courses on programming languages such as COBOL and Python.
