Medical Imaging AI Advances Muscular Dystrophy Diagnosis

Muscular dystrophy, an inherited disease with several variants that can appear anywhere from early childhood to middle age, is exceptionally tricky to diagnose. First, the patient’s genetic profile must be plotted out in great detail. Then it is examined and compared with large sets of genomic data stored at research centers and hospitals. Analysis is painstaking, requiring physicians to do a great deal of manual work. The entire process can take 24 weeks to deliver results.

A new method uses high-performance computing and AI inferencing to do much of the heavy lifting, relieving doctors of tedious manual labor. By pointing them in the right direction, it can shorten the time to diagnosis to 16 weeks or less, depending on the patient’s condition. That’s also good news for patients and their families, who can start treatments sooner, and for researchers, who can advance their knowledge of the disease.

Collaboration Leads to IoT Healthcare Solutions

This new diagnosis method came about through the close collaboration of two Taiwanese companies. Avalue Technology Inc., a provider of IoT computing equipment, has deep experience with hospitals and labs. Biomdcare Corporation specializes in medical software, imaging, and screening tools. Together, the companies developed a muscular dystrophy screening kit that can analyze massive genomic data quickly and deliver results that are 97% accurate.

To create the Genomics Analysis Platform – Muscular Dystrophy Screening Kit, both companies had to overcome the vexing challenges that make working with medical data so time-consuming and difficult. “We worked together to develop hardware and software with the latest technology,” says Rus Lu, Senior Product Manager at Avalue.

Avalue was tasked with finding a way to efficiently transport and process the enormous sets of data to be analyzed.

“We knew the solution required a very high-performance CPU, so we applied the latest Intel® processors,” Lu says. The company also built an extra graphics card slot into its server to accommodate the dense medical images and used a 10-gigabit Intel Ethernet chipset to avoid the bottlenecks that commonly slow data transmission.

The Biomdcare software had to identify the patient’s disease variant and classify any correlations among the vast stores of genomic data it was able to use for comparison. Working with data sets this large is beyond the processing capability of most medical laboratories, and is usually confined to research centers and universities.

Biomdcare used the Intel® OpenVINO Toolkit to develop an AI-assisted software program that combs through all the data, filters out irrelevant results, and homes in on promising correlations. It then fully analyzes these correlations before handing them over to doctors.
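
As a rough illustration of what that inference step can look like, the sketch below loads a classifier already converted to OpenVINO’s IR format and keeps only high-scoring candidate correlations for physicians to review. The model file, feature shapes, and confidence threshold are illustrative assumptions, not Biomdcare’s actual pipeline.

```python
# Hypothetical sketch: scoring candidate genomic correlations with an
# OpenVINO-compiled classifier and keeping only high-confidence hits.
# Model path, input shape, and threshold are illustrative assumptions.
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("variant_classifier.xml")       # pre-converted IR model (assumed)
compiled = core.compile_model(model, "CPU")             # run inference on the lab's CPU

def score_candidates(feature_batch: np.ndarray) -> np.ndarray:
    """Run the classifier and return per-candidate relevance scores."""
    result = compiled([feature_batch])                  # single-input, single-output model
    return result[compiled.output(0)]

THRESHOLD = 0.9  # assumed cutoff for "promising" correlations
features = np.random.rand(64, 512).astype(np.float32)  # stand-in feature vectors
scores = score_candidates(features).squeeze()
promising = np.flatnonzero(scores > THRESHOLD)
print(f"{promising.size} of {len(features)} candidates flagged for full analysis")
```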

#AI algorithms have an insatiable appetite for #data, since the more they crunch, the more accurate their results will be. Avalue Technology Inc. and Biomdcare Corporation via @insightdottech

Analyzing Medical Images with AI

Ironically, given the enormous amount of data it must sort through, one of the main problems Biomdcare encountered was having a small amount of relevant data to work with.

There are several reasons for this. The first is that AI algorithms have an insatiable appetite for data, since the more they crunch, the more accurate their results will be. And unlike the products and machinery AI systems were originally designed to analyze, humans are unique, their genetic data imbued with exponentially more permutations.

“In a factory, a lot of data and photos are generated about defective items and problems on the production line every day. But in healthcare, we don’t have as much data compared to the overall size of the patient population. Low data counts are always a problem for AI solutions in healthcare,” explains Richard Lin, Marketing Director of Biomdcare.

Diseases involving genetic mutations are especially complex, and analyzing them is never a straightforward task. The problem is compounded for muscular dystrophy, a rare disease roughly estimated to affect fewer than four per 100,000 people globally, according to the National Institutes of Health. In addition, its variants are so different from one another that medical organizations often refer to the condition as a group of diseases.

These issues would normally make it difficult to find enough specific correlations to confidently point toward a diagnosis. But Biomdcare developed a proprietary process to make it work. “We use a smaller database to generate a more accurate AI model. It’s a key value of our solution,” Lin says.

After the data has been analyzed, a report is prepared for physicians and stored—along with all the genomic data and the patient’s records—on Avalue servers. Medical professionals can access it on a software platform containing simple annotation and workflow tools, which they use to arrive at a final diagnosis.

Smart Healthcare Solutions Help Patients and Researchers

In addition to making doctors’ work easier, the muscular dystrophy screening kit provides a faster way for patients to learn if they have the disease. “In Taiwan, about 1 in 40 people may have the recessive gene but show no symptoms,” says Olivia Wang, Product Manager at Biomdcare.

People who have a family member with the disease or couples starting a family can use the screening kit to learn if they have it or are carriers. Though there is currently no cure, treatments have been developed to improve muscle strength, and in some cases, slow disease progression. A faster diagnosis enables patients and their families to prepare for the future and seek treatment options sooner.

As more people use the screening kit, the data it produces will expand medical repositories, furthering research and helping AI systems achieve even better accuracy. “We hope our solution will help the research community build more reference data to help other patients who need these tests,” Wang says.

The companies are also extending their technology to screen for other diseases, including breast cancer and human papillomavirus. “We believe AI model-training solutions can help in many situations,” Lin says. “We anticipate developing more disease screening solutions in the future.”

 

Edited by Georganne Benesch, Associate Editorial Director for insight.tech.

Hardware Key to Meeting 5G and Edge Computing Challenges

5G is rapidly expanding the possibilities for edge computing and advanced networking in industrial IoT (IIoT), allowing manufacturers to better take advantage of data, automation, and artificial intelligence to truly transform operations.

But while the sheer speed of 5G technology is what first catches manufacturers’ attention, its high-bandwidth, low-latency connectivity delivers benefits that go well beyond raw throughput—enabling networking experts to build more sophisticated and efficient industrial networks.

“In 5G development, the innovation is the virtualization of the entire network structure,” says Jesse Chiang, Senior Director of Product at IBASE, a manufacturer of edge computing and networking hardware for the industrial sector. “Network functions virtualization (NFV) enables technologies like software-defined wide area networks (SD-WANs), making it easier to manage the data flow and helping to bring down data transfer costs.”

With this in mind, it’s clear why industrial systems integrators (SIs) and networking providers are starting to explore the possibilities that 5G offers. Unfortunately, what they learn early on is that the 5G network hardware element of the industrial IoT equation can become a major stumbling block.

“Hardware for industrial IoT really has to check a lot of boxes: versatile, high performance, high availability, secure, easy to install, and rugged enough to handle harsh environmental conditions,” says Chiang. “For most SIs and software specialists, it’s just not cost-effective to engineer such complex 5G network hardware on their own—to say nothing of the business opportunities that would be lost during a lengthy development process.”

Thankfully, to overcome this difficulty, industrial 5G hardware manufacturers are starting to leverage their expertise and build products that allow SIs and software experts to take advantage of the technology standard’s promise.

Building for the Industrial Edge

Hardware manufacturers can help address the unique challenges of industrial 5G and edge computing in two principal ways: equipment design and component selection.

IBASE, for example, made several design choices to meet the demands of IIoT environments (Video 1):

  • Modularization to enable SIs and end customers to configure hardware platforms to their exact specifications, and to expand and scale up as needed.
  • Redundant power supplies and cooling fans so that hardware components can be serviced or replaced without interrupting factory operations, ensuring high availability.
  • Thermal design tested in simulations to guarantee that equipment will function in harsh operating conditions.
  • A compact form factor to help devices fit into cramped or limited spaces if required.

Video 1. IBASE builds 5G intelligent solutions for the factory with modularization, performance, connectivity, and thermal dissipation in mind. (Source: IBASE)

On the component side, Chiang explains, manufacturers must be able to maximize performance while eliminating unknowns. “To develop edge or networking solutions for IIoT, you need a reliable platform on which to build. That means 5G hardware built with high-performance components that have well-understood, well-defined capabilities—so that they won’t fail you in the field,” he explains.

IBASE leverages its technology partnership with Intel to accomplish this goal. For example, its equipment includes:

  • Intel® processors to handle control, compute, management, and packet processing while optimizing for networking performance.
  • Intel® QuickAssist Technology (Intel® QAT) to provide acceleration for data encryption and compression/decompression processing tasks.
  • Intel Hyperscan, a regular expression matching library that IBASE uses to accelerate deep packet inspection (DPI) in its industrial security products; the sketch after this list illustrates the pattern-matching idea.
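
Hyperscan itself is a C library built to match thousands of compiled patterns simultaneously at line rate, so the sketch below only illustrates the underlying DPI idea using Python’s built-in re module; the signatures are invented for the example and are far simpler than production rule sets.

```python
# Conceptual sketch of regex-based deep packet inspection (DPI).
# Real DPI engines like Hyperscan compile thousands of patterns into one
# database and scan at line rate; Python's re module stands in here
# purely to show the matching idea. Signatures are made up.
import re

SIGNATURES = {
    "sql_injection": re.compile(rb"(?i)union\s+select"),
    "plain_http_exfil": re.compile(rb"POST\s+/upload"),
    "suspicious_ua": re.compile(rb"User-Agent:\s*sqlmap"),
}

def inspect_payload(payload: bytes) -> list[str]:
    """Return the names of all signatures that match the packet payload."""
    return [name for name, pattern in SIGNATURES.items() if pattern.search(payload)]

packet = b"POST /upload HTTP/1.1\r\nUser-Agent: sqlmap/1.7\r\n\r\nunion select *"
print(inspect_payload(packet))  # ['sql_injection', 'plain_http_exfil', 'suspicious_ua']
```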

“In addition to tremendous computing power, Intel has equipped its processors with excellent networking capabilities. Intel is also putting a lot of effort into 5G development, which will help us keep up with the most advanced technology in the years ahead,” he says.

“#Hardware for industrial #IoT really has to check a lot of boxes: versatile, high performance, high availability, secure, easy to install, and rugged enough to handle harsh environmental conditions” – Jesse Chiang, @IBASE_Tech via @insightdottech

A Growing Range of 5G and Edge Computing Products

With the emergence of hardware built for 5G IIoT solutions, IBASE is already preparing for what comes next in the industrial computing space.

For example, its product roadmap includes a range of 5G-compatible multi-access edge computing (MEC) servers designed to provide a stable, performant platform for AI-enabled solutions at the industrial edge. To meet the growing need for secure 5G networking—especially in situations where network nodes are spread out and local IT resources are limited—the company is also developing a line of Universal Customer Premises Equipment (uCPE) devices that will enable SD-WANs and network security applications.

In the coming years, hardware designed for 5G and edge computing will allow industrial SIs, edge AI specialists, and secure networking providers to deliver a number of important benefits to their customers, according to Chiang.

“SD-WANs will make managing networks much more efficient, reducing labor and equipment costs,” says Chiang. “And with more and more 5G deployments, edge computing is going to become an everyday reality in factories, ushering in the next wave of digital transformation in the manufacturing sector.”

 

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

An Inside Look at the 13th Gen Intel® Core™ Processors

Intel kicked off the year with the release of its 13th Gen Intel® Core Processors, codenamed Raptor Lake, designed to improve both desktop and mobile performance. For desktop PC and gateway-class systems, the release takes advantage of the hybrid microarchitecture designs first introduced in 12th Gen Intel® Core processors—giving users new capabilities and improved performance they need when they need it. On the mobile side, the processors come with more multitasking power and harness new performance cores to handle even more demanding workloads.

In this podcast, we look at what this latest-generation CPU release means to the overall IoT and enterprise space, performance and capability advantages of hybrid microarchitecture design, and advantages that 13th Gen Intel® Core processors offer over previous generations.


Our Guest: Intel

Our guest this episode is Jeni Barovian Panhorst, Vice President & General Manager, Network & Edge Compute Division at Intel Corporation. Jeni has held a variety of roles at Intel, starting off as a rotation engineer, then platform solutions architect before joining the Networks Group. In her current position leading the Network & Edge Compute Division, Jeni is responsible for the silicon portfolio that services network infrastructure and edge computing and the platform software that unleashes the technology and capabilities within that silicon to service use cases.

Podcast Topics

Jeni answers our questions about:

  • (1:31) Exciting new features of 13th Gen Intel® Core processors
  • (6:49) Advantages of hybrid microarchitecture
  • (9:58) How it compares to previous generations of processors
  • (14:30) New opportunities made possible by the latest processors

Related Content

To learn more about the capabilities and hybrid microarchitecture of 13th Gen Intel® Core Processors, read Intel Boosts Edge Productivity with Processor Innovations. For the latest innovations from Intel, follow them on Twitter and LinkedIn.

Transcript

Christina Cardoza: Hello, and welcome to the IoT Chat, where we explore the latest developments in the Internet of Things. I’m your host, Christina Cardoza, Editorial Director of insight.tech. And today we’re talking about the Intel® 13th Generation Core processors with Jeni Barovian Panhorst from Intel. Jeni, welcome to the show.

Jeni Barovian Panhorst: Thanks so much, it’s great to be here.

Christina Cardoza: So, before we get into the conversation, I know you’ve been at Intel for a long time in various different roles, so I would love to hear more about yourself and what you do at Intel.

Jeni Barovian Panhorst: Yeah, so I lead the Network and Edge Compute Division in Intel’s Network and Edge Group. So, basically what that means is that I have responsibility for our silicon portfolio that services network infrastructure and edge computing across a number of different sectors. And then also the platform software that unleashes the technology and capabilities within that silicon to service all those exciting use cases. So, really excited to be here today and talk specifically about one of the products, or a couple of the products, in that portfolio, specifically the 13th Generation Intel Core processors.

Christina Cardoza: Absolutely. And I know Intel just made that release at the beginning of the year. And, like you said, you are working in edge roles and network roles, and I think this release is going to be huge for those markets. So I would love to start off the conversation to learn a little bit more about the release—the 13th Generation Core processors, codenamed Raptor Lake. What makes this so exciting for the network and edge markets today?

Jeni Barovian Panhorst: Yeah. So, we’re here to talk about the new 13th Generation Intel Core processors for the IoT edge. And those are really our top choice to maximize performance and memory and IO in edge deployments. And also we want to talk about the new 13th Generation Intel Core mobile processors, which are focused on combining power efficiency and performance and flexibility, as well as industrial-grade features that focus specifically in areas that are important for network and IoT edge, including AI and graphics and ruggedized edge use cases.

And if we look specifically at some of the capabilities of these 13th Gen Intel Core mobile processors, they’re focused on delivering a boost in performance compared to the prior generation, while also offering really a range of options for different power design points. So this allows our customers to get exactly the performance per watt that they’re looking for, in what are often space- and power-constrained deployments. And so what our customers have an opportunity to do—and the solution providers that deliver solutions to those customers—they’re able to benefit from higher single-threaded performance, higher multi-threaded performance, graphics, AI performance, but they also are really benefiting from increased flexibility to run more applications simultaneously, more workloads and more connected devices, which are very critical at the IoT edge.

And so I just wanted to talk about a couple of features that are really exciting about these product launches. First of all, we offer up to 14 cores and 20 threads with our performance-hybrid architecture. And we have a technology called Intel® Thread Director that allows us to match our—match the cores specifically to the needs of our customers’ workloads. We also have really great graphics performance. And this is really essential at the edge for use cases like autonomous mobile robotics, optical inspection use cases; and the 13th Gen Intel Core mobile processors, they feature integrated Intel® Iris® Xe graphics with up to 96 graphics execution units for fast visual processing. And that high number of graphics EUs also enables parallel processing for a number of AI workloads. And so then when you combine that with the capabilities of the processor, that has technologies like Intel® DL Boost with VNNI instructions, and combining that with developer tools like the Intel OpenVINO toolkit, all of this in combination has an opportunity to further enhance AI inference optimization on Intel-enabled solutions, which help reduce dependence on external accelerators.

Another thing I wanted to talk about was that the 13th Gen Intel Core mobile processors are the first generation of mobile processors to introduce PCI Express Gen 5 connectivity. This is enabled on the H-series SKUs specifically, and actually it was previously already available on the 12th Gen Intel Core processors. So PCIe Gen 5 allows our customers to really focus on deploying more demanding workloads in more places, because there’s a much bigger data pipeline and the ability to provide faster, more-capable connections to a variety of different peripherals.

And, last but not least, I talked a little bit about industrial-grade features. The 13th Gen Intel Core mobile processors also are focused on really redefining industrial intelligence by bringing flexibility and scalability and durability to the edge. And so select SKUs in the portfolio are compliant with industrial-grade use cases—that really allows our customers to operate at 100% utilization over 10 years. Also stringent environments; so they offer extended temperature ranges of -40 °C to 100 °C. And also support for in-band ECC memory to improve reliability of those use cases, and deliver the types of performance and capability that are needed in harsh environments for installations in areas like machine control, AMR, avionics, and other really exciting use cases for the IoT edge.

Christina Cardoza: So, sounds like it’s packed with a lot of great new features and capabilities. You know, I love to hear all the multitasking power. And you mentioned some use cases like autonomous robots; I can just imagine that that’s taking so much power, so many things running behind the scenes and memory to make that happen. So it’s great to see the processors are improving to make sure that things don’t slow down, that performance is improved. And to handle those more demanding workloads, it seems like Intel, you guys are always on top of things, always thinking of what’s next, and always releasing new processors like this.

It feels like the 12th Generation processors were just released, and then you guys updated to the 13th Generation processors. But I think these core processors actually build on the hybrid microarchitecture that we first saw in the 12th Generation release. Can you talk a little bit more about how you guys are utilizing that architecture?

Jeni Barovian Panhorst: Yeah, absolutely. And it is really important to address those complex workloads that you were just talking about just now. And as you said, we introduced that performance-hybrid architecture in 12th Gen, which is really about bringing together the best of two Intel architectures: our Performance cores, or P-cores; and then also our Efficient cores, or our E-cores, onto a single SoC. And so you know, really the primary advantage of bringing this into a single product, this Intel performance-hybrid architecture, is to be able to scale up multi-threaded performance by using these P-cores and E-cores optimally for the workloads at hand.

You know, it’s pretty intuitive that certain multi-threaded application performance scales with the number of cores that are available to them. But, really, that performance scale-up is dependent upon how efficiently a given application is divided into multiple different tasks and the number of available CPUs to deliver that parallel execution of those tasks. So, to cater to that vast array—that vast diversity of client applications and usages of cores—we focused on designing an SoC architecture where the larger cores are utilized and unleashed in performance to go after single-threaded performance and limited-threading scenarios. And then, simultaneously, the efficient cores or the E-cores can help extend scalability of multi-threaded performance over prior generations. So, by putting these together, that’s where we were able to deliver this performance-hybrid technology that achieves the best performance on multi-threaded, as well as limited-threaded and power-constrained workloads.

And, as I mentioned briefly before, that performance-hybrid architecture is coupled with Intel Thread Director, which optimizes performance for concurrent workloads across these P-cores and E-cores. So how that works is that the Thread Director just monitors that instruction mix in real time. And it provides that runtime feedback to the operating system, and it dynamically then provides guidance to the scheduler in the operating system, allowing it to make more intelligent and data-driven decisions on how to schedule those threads.

And so performance threads are prioritized on the P-cores, delivering responsive performance where maybe there aren’t as many limitations in terms of power requirements. And then the E-cores are utilized for highly parallel workloads, and other power-constrained conditions where power might be needed elsewhere in the system, such as the graphics engines that I mentioned earlier, or perhaps other accelerators in the platform. And then combined this delivers the best user experience.

Christina Cardoza: I know that hybrid microarchitecture was really exciting for a lot of people in the last release, so it’s great to see that carryover in this release. And you touched upon a lot of the new capabilities and experiences, but I’m wondering if we can expand a little bit more about what the top capabilities or features, improvements and differences you think users are really going to gain from this release over previous generations.

Jeni Barovian Panhorst: Yeah. So, performance is always top of mind for people. And there are certainly significant gains in the 13th Gen by comparison to the 12th Gen Intel Core processors. So, if we look at performance gains within the same power envelope for the mobile family of products, we’ve got up to 1.08x faster single-threaded performance. In the desktop processors we have up to 1.34x faster multi-threaded performance. And if we look specifically at AI performance, which is so critical for the edge, we’ve got up to 1.25x gains in CPU classification inference workloads. So that’s what we’re seeing in terms of exciting performance gains.

Another area that’s important to our customers is an easy upgrade path. And so these 13th Gen processors are socket compatible with the 12th Gen Intel Core processors to deliver that easy upgradability, both for our ecosystem as well as customers who have deployed solutions—they have an opportunity to more easily upgrade. I mentioned before PCI Express Gen 5 connectivity. So, in our mobile products it is the first generation to include that to deliver a faster pipeline for more data throughput. A great example of use cases that benefit from that would be medical imaging, which requires a tremendous amount of visual data.

I wanted to talk a little bit about a specific customer example of where we’re seeing improvements, gen-on-gen from 12th Gen to 13th Gen. And one particular company that we’re working with is Hellometer. They’re a great example of a company that is digging into those gen-on-gen performance gains and also achieving platform flexibility in the process. So, Hellometer, what they’re focused on is they are a startup; they have a SaaS solution specializing in AI for restaurant automation. And if you look at what 13th Gen is capable of delivering for their application, they can deliver now more AI performance at the edge cost effectively for their target market, which is fast food restaurants and quick service restaurants.

So if you look at these restaurants, and specifically in the drive-through or consumers in the dining room, time is truly of the essence; it translates directly to revenue for these restaurants. And if a line is too long guests will drive past, or they won’t go back and they’ll find something to eat elsewhere. So that’s why these brands are really focused on utilizing Hellometer’s technology, which is computer vision–based technology. It’s a restaurant-automation solution, and it uses our prior generation of Intel Core mobile processors with the built-in AI acceleration in the processor itself—I mentioned DL Boost and OpenVINO before. So, Hellometer uses those technologies, and as a result of using their solution, these restaurant and franchisee operators can learn how to optimize their guest experience and get those meals out swiftly and build that customer and brand loyalty.

And Hellometer’s CEO joined us for the launch, and he has talked about how the 13th Generation Intel Core processors will enable them to add an extra video stream to their solution, which actually increases their ability to process customer data by over 30% for real-time inferencing without a discrete AI accelerator. It’s just using the technology that is integrated into the processor. And so this is really exciting for us to talk about these examples, because it really gives our customers the ability to win business by better understanding their guests’ experiences, and it delivers innovations that really drive business value. And we’re really excited to partner with our customers in examples like this.

Christina Cardoza: I love hearing all those examples. I mean, we’ve talked about manufacturing, healthcare, retail hospitality—so this is really hitting all sectors and improving businesses across all these different industries. And one thing that I’ve noticed that you’ve mentioned is that these businesses and organizations, they’re just getting smarter and smarter. And so that is having an increase on their network workloads. And everybody wants to move closer to the edge to get those real-time insights, like you were just describing with Hellometer. So I’m wondering, how else do you see the 13th Generation Core processors being able to provide new opportunities, provide new improvements as network workloads get larger, and we just move closer and closer to the edge?

Jeni Barovian Panhorst: Yeah, there’s just an incredible breadth of use cases that we’re supporting, and a lot going on. If we look in military applications, we have an opportunity to support embedded computing for vehicles and aircraft, or edge devices for intelligence and safety and recon. Next-generation avionics with multitasking performance and durability requirements for space-constrained and stringent-use conditions. We’ve got healthcare advancements. If you look at enabling ultrasound imaging, endoscopy, clinical devices—all these different use cases—it’s really a massive amount of visual data that has to be processed.

So customers can really take advantage of multitasking on that performance-hybrid architecture that we talked about—high data throughput that’s enabled through that PCI Express Gen 5, AI tools that enable developers to unleash the power of the underlying silicon to support these imaging workloads and inferencing workloads. And also, when you talk about a lot of these markets, they have very long life cycles for qualification and certification. And they’re in service for a very long period of time as well. So the long-life availability of our products ensures consistent supply for repairs, for maintenance, and to really drive value from these long life cycles.

I talked a little bit about hospitality when I was talking about the Hellometer example. There’s all kinds of other applications as well, including video walls and digital signage. You know, AI-driven, in-store advertising, interactive flat-panel displays—these can all take advantage of our 13th Generation Core processors to offer a great solution for retail and service and hospitality industries as well. Industrial applications, like AI-based industrial process control, we’re seeing incredible innovation from our partners, leveraging 13th Gen Intel Core processors to really converge powerful compute and AI workloads in situations where you’ve got space constraints and power constraints.

And there’s another example I wanted to talk about in this area, and that is our partner Advantech. They’re focusing specifically in AMRs—which I mentioned before, autonomous mobile robotics—which are truly becoming a new normal in areas like warehousing, logistics, and manufacturing environments. And this market is just growing incredibly quickly. It’s growing over the course of the next couple of years at over a 40% compounded annual growth rate. And so, tremendous opportunity. It’s a question of how do we address that opportunity and enable our customers to extract that value. So, these AMRs and these other computer vision applications are really challenged by the need to provide powerful AI and camera-based inputs, but in very small form factors. And AMRs, in particular, may need to process data from multiple different cameras, as well as proximity sensors, so that they can navigate safely around their environment.

So if we look at what Advantech is doing, they’ve got a couple of offerings that leverage the 13th Generation Intel Core mobile processors to address what’s needed for both compute and graphics-processing performance, but also the power-efficiency needs of automated applications that are being used in AMRs, as well as in optical inspection. So, each of these solutions really benefits from the fact that you can get adaptive performance from the 13th Gen Intel Core mobile processors that feature that performance-hybrid architecture, but also that intensive graphics processing that we get from those Iris Xe graphics integrated, as well as the memory support that we get from DDR5.

And then, last but not least, they certainly benefit from the really great power efficiency in the latest processor generation. And that also contributes to helping our customers improve their total cost of ownership, and focusing on areas that are important to AMRs, like longer battery life to boost operational duration of robots on the factory or the warehouse floor. So, just a lot of really exciting innovation going on across these different edge and IoT use cases.

Christina Cardoza: Absolutely, and I can’t wait to see what else partners come up with when they’re powered by the 13th Generation Core processors. Just talking about the AMRs—autonomous mobile robots—this is just like, I can’t believe we have these robots operating across factory floors, helping out in the production line. And it’s all thanks to Intel technology, and it’s only going to get smarter and better from here.

But, unfortunately, we are running out of time. Before we go, I just want to throw it back to you one last time if there’s any final thoughts or key takeaways you want to leave our listeners with today.

Jeni Barovian Panhorst: Yeah, absolutely. You know, as we’ve discussed already, there’s a huge diversity of use cases and deployment models across network and edge computing infrastructure, and Intel’s product portfolio needs to comprehend all of those needs. Our mission really is to deliver the hardware and software platforms that enable infrastructure operators and enterprises of all types to adopt an edge-native strategy. And we’re really guided by those priorities with the goal of delivering that workload-specific performance and truly leadership performance for our customers at the right power at the right design points, servicing all of our customers’ needs, and really ultimately improving their total cost of ownership and their value.

And so we need to meet all of these design points across the spectrum—whether we’re talking about the devices themselves, the edge infrastructure, the network infrastructure, the cloud—our customers also want to be able to scale their software investment in whatever parts of our portfolio that they’re using. And so we’re really focused on this mission of being a catalyst for that digital transformation, and improving that business value. We’re focused on driving and democratizing AI, and making it accessible across the full ecosystem. So, with these latest 13th Gen Intel Core processors, we’re really proud to be delivering that next generation of diverse, edge-ready processors, and giving our customers more choices in leveraging this hybrid microarchitecture to unlock all these possibilities. So I’m really excited to partner with everyone here in the audience to realize the full breadth of these technology innovations, and really the promise of the future that’s built on AI-enabled edge computing.

Christina Cardoza: Absolutely, and I think we talked a lot about edge in this conversation. I see the intelligent edge being a huge trend, not only this year, but the next couple of years. And so it’s great to see how Intel is supporting that and helping organizations reach their goals and really realize the full value of their operations. So, thank you so much, Jeni, for joining the podcast. It’s been a pleasure talking to you and a great conversation.

Jeni Barovian Panhorst: Thanks so much. It’s been great.

Christina Cardoza: And thank you to our listeners for tuning in. Until next time, this has been the IoT Chat.

The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.

This transcript was edited by Erin Noble, copy editor.

Using a Hyperscaler in an Edge AI Project? Read This First

So you want to build an AI system. Where do you start?

Take multimodal sentiment analysis, for example, which relies on multiple natural language processing and/or computer vision models that are refined through rigorous data labeling and training. Implementing it in, say, a retail customer service kiosk requires infrastructure like databases and visualization tools, web app development environments, deployment and delivery services, and, of course, an AI model training framework or two.

If this is your first foray into AI-enabled system creation, your organization probably doesn’t have all these tools configured in a way that’s conducive to fast AI system prototyping—if it has all the necessary components at all. In these instances, AI engineering hopefuls often turn to hyperscaler cloud platforms like Microsoft Azure, AWS, and Google Cloud. In addition to essentially infinite data capacity and infrastructure services, many of these hyperscalers support end-to-end AI development out of the box or offer API-based integration for specific third-party tools in a few clicks.

And best of all, you can get started relatively cheaply and add capabilities later as a service. So why assemble an ecosystem of technology partners when you can get started so quickly and easily on your own?

Hidden Cost of Hyperscalers for Edge AI Engineers

In early-stage proof of concepts (PoCs), hyperscaler cloud platforms are great for fleshing out ideas. But as you move into prototyping that more closely resembles the end product, their limitations are quickly exposed.

“What is difficult with hyperscalers is a real bespoke PoC because hyperscalers are based on standards. You use those standards, or you don’t use the hyperscaler,” says Glenn Fitzgerald, Chief Data Officer in the Product Business at Fujitsu Limited, a global information and communication technology company. “That applies both to the infrastructure and the application stacks that they use.”

“There’s also the issue of data sovereignty and residency, which isn’t so relevant in PoCs but certainly is if you get to prototyping,” Fitzgerald continues. “The hyperscalers don’t like you taking data out of their clouds, and they structure things to discourage it. Legal and regulatory issues can significantly complicate data-driven projects—those that use AI—in a hyperscaler environment.”

#AI #technology depends on increasing amounts of data being funneled into training models to improve the accuracy and performance of neural networks, making #edge-core-comms and #data management critical factors. @Fujitsu_Global via @insightdottech

The data is the key. AI technology depends on increasing amounts of data being funneled into training models to improve the accuracy and performance of neural networks, making edge-core-comms and data management critical factors. Data storage is a key revenue generator for hyperscalers.

It’s not hard to imagine starting an AI PoC in a hyperscaler environment with a few images, only to have it balloon into multiple databases with hundreds of thousands of images as prototypes evolve. And since extracting data from a hyperscaler cloud can be difficult, what began as innocuous platform selection can quickly become a costly platform trap.

An AI Identity Crisis

At this point you should also be asking whether you need to develop AI at all. For example, most companies don’t sell sentiment classification. Instead, they use it as an enabler of solutions like retail kiosks or market research software. That’s because, out of the box, AI isn’t a solution but rather a new capability that can solve existing problems.

“AI is not a solution to anything,” Fitzgerald explains. “If you think of AI in its traditional meanings of machine learning or natural language processing or neural networks, 99% of the time it’s a component in a solution, not a solution in and of itself.

“Where companies should start is, ‘This is my business issue.’ Far too many of them start with, ‘I need to be doing AI,’” says Fitzgerald. “But if you start with, ‘We need to do AI,’ you’ll end up doing nothing.”

In many cases, a better strategy is to leverage technology ecosystems that offload the overhead of AI model creation while keeping costs low. Done right, this approach allows OEMs and system integrators to capitalize on AI’s advantages while concentrating on the end application.

Accelerate AI Inference with a Partner Ecosystem

Fujitsu has partnered with Intel and British consultancy Brainpool.AI to provide an onramp for AI prototypers. Through “co-creation workshops,” companies can access Brainpool.AI’s network of more than 600 leading AI academics, who advise on the infrastructure components required to achieve the desired outcome. Fujitsu operates as an integrator, orchestrating additional partners and establishing the necessary infrastructure to scale AI from PoC through prototyping.

To facilitate this process, Fujitsu created AI Test Drive, a purpose-built AI infrastructure based on web app components, data services, monitoring tools, and AI suites from SUSE Linux, NetApp, and Juniper Networks. This software is packaged in a demo cluster that runs on Intel® processor-based servers and lets users stress-test AI designs while retaining 100% control of their data for curation, ingestion, and cleaning.

Free trials of AI Test Drive can be accessed through a portal. To deliver best-in-class model accuracy, latency, and performance across the gamut of AI use cases, it makes use of the Intel® OpenVINO Toolkit. The toolkit is an AI model optimization suite that compresses and accelerates neural networks built in a variety of environments for deployment on a range of hardware. It’s compatible with the Open Model Zoo so that pre-trained models can be imported easily into prototyping pipelines.

As shown in Figure 1, OpenVINO accelerated an FP32 BERT sentiment classification model by 2.68x compared to the same unoptimized PyTorch FP32 model.

Figure 1. The Intel® OpenVINO Toolkit optimizes different types of AI inference in the Fujitsu Test Drive platform. (Source: Fujitsu)
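
For readers curious how a comparison like Figure 1 is typically produced, here is a minimal sketch: the same FP32 model is timed in native PyTorch and again after OpenVINO conversion. The sentiment model, input shape, and run count are assumptions for illustration, not Fujitsu’s published benchmark setup.

```python
# Illustrative latency comparison: one FP32 sentiment model timed in
# PyTorch, then converted with OpenVINO and timed again on CPU.
import time
import torch
import openvino as ov
from transformers import AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"  # stand-in model
pt_model = AutoModelForSequenceClassification.from_pretrained(model_id).eval()

input_ids = torch.randint(0, 30000, (1, 128))   # dummy tokenized batch
attention_mask = torch.ones_like(input_ids)

def avg_latency_ms(fn, runs=50):
    fn()                                        # warm-up run
    start = time.perf_counter()
    for _ in range(runs):
        fn()
    return (time.perf_counter() - start) / runs * 1000

with torch.no_grad():
    pt_ms = avg_latency_ms(lambda: pt_model(input_ids, attention_mask=attention_mask))

ov_model = ov.convert_model(pt_model, example_input=(input_ids, attention_mask))
compiled = ov.compile_model(ov_model, "CPU")
ov_ms = avg_latency_ms(lambda: compiled([input_ids.numpy(), attention_mask.numpy()]))

print(f"PyTorch: {pt_ms:.1f} ms | OpenVINO: {ov_ms:.1f} ms | speedup: {pt_ms / ov_ms:.2f}x")
```

Measured speedups will vary with hardware, model, and precision; the 2.68x figure above reflects Fujitsu’s specific configuration.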

“You have to build an ecosystem that’s appropriate to the problem you’re trying to solve,” Fitzgerald says. “An organization like Fujitsu, which can bring other organizations into it and cover all those bases, is how you get the optimum team to solve a problem.”

Start with the Business Problem

Today there’s an industry-wide fear of missing out on edge AI, visual AI, and machine learning. But before getting swept up in the frenzy, understand how to avoid chasing red herrings into competencies that aren’t your own.

“Start with the business problem,” Fitzgerald advises. “If you understand the business problem, then you can work with your stakeholders, your trusted partners, and third parties to solve that problem.”

 

Edited by Georganne Benesch, Associate Editorial Director for insight.tech.

Improving Public Safety and Traffic with AI

Modern cities are abuzz with traffic, and the heavier it is, the more people try to take dangerous shortcuts. Hazardous situations multiply as cars swerve to avoid jaywalkers and buses park at odd angles, jutting into the street because motorcycles occupy their loading zones.

Many cities have installed traffic-light cameras at major intersections, allowing them to address some violations. But these cameras are expensive and miss problems on side streets, as well as damaged infrastructure and dangerous behavior on buses, such as a driver nodding off.

Martin Ting, CEO of 7StarLake, a Taiwanese company that makes high-performance computing equipment and developed a sensor system for the country’s semi-autonomous shuttles, learned all about city traffic problems when meeting with government and transportation officials in Taiwan, Canada, and the U.S.

“They always asked me one question: ‘Can you help us prevent accidents and improve public safety?’” he says.

In response, 7StarLake developed a bus-mounted edge AI computer vision system that spots both traffic problems and dangerous bus driver behavior in real time, issuing warnings of impending hazards. By analyzing data from the system over time, city officials gain in-depth knowledge of traffic and behavioral patterns, helping them improve public safety and traffic management.

Preventing Accidents with Smart Buses

Like many cities around the world, Taiwan’s cities have cameras mounted at main intersections—Taipei, for example, has 14,000. But they cost $150,000 apiece to install and an equal amount for annual maintenance. And they can’t interpret information in real time or spot dangerous situations outside their limited field of vision. Obtaining thorough coverage would require an average large city to install 100,000 cameras, Ting explains.

In contrast, 7StarLake’s Time Eye Smart Traffic Solution installs much less expensive computer vision cameras—plus a GPS sensor—on buses, capturing both exterior and interior information that can help prevent accidents. For example, if a driver’s eyes are closing, the seat can vibrate as an alert. Data is also sent to transit officials, who can issue a warning.
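
A hypothetical sketch of that alerting logic, assuming an upstream vision model emits a per-frame eyes-closed prediction; the frame threshold and alert functions are stand-ins, since 7StarLake’s implementation details aren’t public.

```python
# Hypothetical driver-drowsiness alert: if the driver's eyes stay closed
# for too many consecutive frames, vibrate the seat and notify officials.
# Thresholds and the alert calls below are illustrative stand-ins.
from collections import deque

CLOSED_FRAMES_LIMIT = 15                    # ~0.5 s of closed eyes at 30 fps
recent_eye_states = deque(maxlen=CLOSED_FRAMES_LIMIT)

def trigger_seat_vibration() -> None:
    print("ALERT: vibrating driver seat")   # stand-in for a hardware call

def notify_transit_officials(reason: str) -> None:
    print(f"Dispatch notified: {reason}")   # stand-in for a network call

def on_frame(eyes_closed: bool) -> None:
    """Feed one per-frame eye-state prediction into the rolling window."""
    recent_eye_states.append(eyes_closed)
    if len(recent_eye_states) == CLOSED_FRAMES_LIMIT and all(recent_eye_states):
        trigger_seat_vibration()
        notify_transit_officials("driver drowsiness detected")
        recent_eye_states.clear()           # avoid re-firing on every frame

# Simulated stream: 20 consecutive closed-eye frames fires the alert once
for closed in [True] * 20:
    on_frame(closed)
```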

Exterior cameras capture traffic activity, including trucks illegally parked on side streets and buses unable to enter loading zones occupied by other vehicles (Video 1).

Video 1. 7StarLake’s Time Eye computer vision cameras identify vehicles and capture information about traffic violations. (Source: 7StarLake)

These incidents can cause drivers to dart around the larger vehicles without seeing what’s on the other side—an extremely dangerous situation.

In Taiwan, accidents involving trucks and buses are the major cause of traffic fatalities. Time Eye instantly relays information about serious violations to officials, who can issue tickets or arrange a tow if necessary. For less serious infractions, the system records license plate numbers and, in some cases, sends tickets without having to summon an officer.

Time Eye’s cameras also relay information about infrastructure hazards, such as rocks or tree limbs blocking a street or missing manhole covers on sidewalks. Cities can immediately notify maintenance crews and alert bus drivers to slow down or avoid hazardous areas.

By analyzing #data from the system over time, city officials gain in-depth knowledge of #traffic and behavioral patterns, helping them improve public #safety and traffic management. 7StarLake Co., Ltd. via @insightdottech

Computer Vision Cameras Reveal the Truth

In attempting to avoid an accident, a bus driver may speed up or stop suddenly, which could cause a passenger to fall. Fall claims are a frequent source of municipal lawsuits in Taiwan.

Time Eye computer vision cameras capture time, traffic conditions, bus location, acceleration, and braking. They also show whether passengers are standing or seated with seatbelts fastened, as required by law except when boarding or exiting. This information serves as an objective source of truth, streamlining investigations and legal proceedings. After adopting Time Eye, a city in south Taiwan experienced an 80% decline in passenger lawsuits, according to Ting.

While cameras record passenger activity, they do not use facial recognition software. Data is encrypted and sent to city computers through VPNs with enterprise-grade security.

Smarter Traffic Management Improvements

7StarLake customizes its system for cities, which choose the information they want to collect. Engineers use the Intel® Distribution of OpenVINO Toolkit to save time developing algorithms that recognize traffic conditions, vehicles, and human behaviors.

“OpenVINO has thousands of pre-trained algorithms and can do about 70% of the AI model development, so my engineers don’t have to build it from scratch,” Ting says. “We can save a lot of time, and I don’t need as many engineers.”
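
As a rough illustration of that head start, the sketch below runs one of Open Model Zoo’s pre-trained vehicle detectors on a single bus-camera frame. The model name (vehicle-detection-0200) is a real Open Model Zoo entry, but the file paths, input size, and confidence cutoff here are assumptions drawn from its published description.

```python
# Sketch: vehicle detection with a pre-trained Open Model Zoo model.
# Assumes the model files were fetched first, e.g.:
#   omz_downloader --name vehicle-detection-0200
import cv2
import numpy as np
import openvino as ov

core = ov.Core()
model = core.read_model("intel/vehicle-detection-0200/FP32/vehicle-detection-0200.xml")
compiled = core.compile_model(model, "CPU")

frame = cv2.imread("bus_camera_frame.jpg")             # one frame from a bus camera
h, w = frame.shape[:2]
blob = cv2.resize(frame, (256, 256)).transpose(2, 0, 1)[np.newaxis].astype(np.float32)

# SSD-style output: rows of [image_id, label, confidence, x_min, y_min, x_max, y_max]
detections = compiled([blob])[compiled.output(0)]
for det in detections[0][0]:
    confidence = float(det[2])
    if confidence > 0.5:                               # assumed confidence cutoff
        x_min, y_min, x_max, y_max = det[3:7]
        box = (int(x_min * w), int(y_min * h), int(x_max * w), int(y_max * h))
        print(f"vehicle at {box}, confidence {confidence:.2f}")
```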

A city bus can capture an enormous amount of information—up to a terabyte a day. City administrators receive real-time data and images of up to 2 megabytes, relayed by high-speed Intel® Core processors. The rest is conveyed later and stored on city computers, where officials can analyze it to learn about traffic problems at specific locations and times. They can then better allocate resources to improve traffic flow and public safety, whether that means posting signs, sending police to the right locations, or creating new parking facilities.

The 5G Smart Traffic Management Future

As more cities adopt 5G connectivity and bandwidth costs decrease, higher transmission speed and low latency will allow systems like Time Eye to deliver even more information in real time—a capability city officials are clamoring for, Ting says. Transit authorities and emergency technicians will be able to view not just data and still images but full video footage of events as they unfold, helping them better understand and respond to critical incidents.

“Smart traffic management can help cities deploy resources better and save lives,” Ting says. “I firmly believe that once 5G is fully deployed, it will take off.”

 

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

All-in-One Platforms: Simplify Edge AI Product Development

Edge AI can be used to solve problems across a diverse range of verticals. In manufacturing, it offers process optimization, real-time data visualization, and supply chain management benefits. In medicine, it can power better diagnostic tools and patient management systems that relieve overstretched healthcare workers. And for smart cities, AI at the edge can help to address everything from traffic congestion and energy efficiency to public health and safety issues.

But ironically, despite the general usefulness and benefits of edge AI, it is often very difficult to get these solutions onto factory floors, into hospitals, and out on city streets. To put it simply, few systems integrators (SIs) and organizations in these sectors have the internal resources to build an end-to-end edge solution.

“Experience with edge AI is growing, but it’s still rare to find a business or SI with all of the pieces of the puzzle,” says Tiana Shao, Product Marketing Manager at AEWIN, a provider of smart networking and edge AI solutions for digital transformation. “They may understand the hardware, but not how to build a suitable AI application. Or they might be AI software specialists, but are unsure of how to select hardware that will meet their end users’ specifications.”

But all-in-one edge AI platforms may be the answer to this digital transformation skills gap. These comprehensive platforms enable flexible, high-performance edge AI solutions that run on secure, reliable hardware—streamlining product development work and speeding time to market.

Meeting the Challenges of Edge AI Solution Development

The key to simplifying edge AI product engineering is to build on a platform that gives you a head start on the hardest parts of the process: AI software development, device optimization, performance enhancement, and flexible design.

All-in-one platforms accomplish this by drawing on the strengths of both hardware experts and AI specialists. While AEWIN, for example, has expertise in edge hardware, AI software development was more difficult for the company. As a result, it partnered with AI specialist InfinitiesSoft to incorporate that company’s AI software stack into its solution. Because of this, AEWIN is able to provide a platform that addresses hardware as well as software challenges—speeding up the development work and offering device optimization for a wide range of processors and servers.

By removing many of the traditional #edge #AI implementation barriers, all-in-one platforms will help to increase AI adoption and drive #DigitalTransformation in multiple industries. @IPC_aewin via @insightdottech

On the hardware side, the solution also leverages the capabilities of Intel® processors. Shao says that this brings several benefits:

“Computational speed is extremely important for AI at the edge, and Intel processors help us build a high-performance computing platform that is ideal for edge applications. In addition, these processors offer important security features such as Intel® Software Guard Extensions (Intel® SGX), Intel® Platform Firmware Resilience (Intel® PFR), and execution controls.”

Last, all-in-one platforms are built with flexibility in mind, since they must be adaptable to a variety of use cases. For instance, AEWIN’s platform is installable in almost any edge environment, and comes with expansion slots for network interface cards and hardware-based accelerators.

“This is meant to be one platform with lots of options,” says Shao. “Different end users, of course, will have different needs. We designed the solution so that it can be customized according to our customers’ requirements.”

The end result is a platform that can be used in many scenarios—even if a relatively sophisticated solution is called for.

Smarter Cities on a Shorter Timeline

An example of how this works in practice is AEWIN’s smart traffic management use case.

Cities all over the world are under pressure to solve urban traffic issues, alleviate commuter congestion, improve quality of life, and cut down on carbon emissions to meet sustainability targets. Edge AI-enabled traffic management systems present a promising approach to the challenge, but they are technically demanding to design and implement. And unfortunately, this makes systems integrators and city managers shy away from what might otherwise be an effective way to address these problems.

But using an all-in-one edge AI platform like AEWIN’s, SIs and cities can work together to develop customized traffic management solutions in a far shorter time frame.

The platform enables real-time computer vision processing of traffic camera video at the edge—removing the need to send large amounts of raw data to the cloud for processing. The AI software stack, meanwhile, can be used to create a tailored solution without a lengthy development process, offering capabilities like data visualization, traffic flow optimization, integration with traffic control signals, and automatic alerting when traffic violations are detected.

“Traffic management for smart cities is just one possible use case,” says Shao, “but it’s a good example of how an all-in-one platform can help SIs shorten time to market when complex solutions are needed, and build scalable, repeatable products that they can then sell to other customers.”

A Multitool for Digital Transformation

The versatility of AI at the edge means that it can be used to solve business problems in nearly any setting. By removing many of the traditional edge AI implementation barriers, all-in-one platforms will help to increase AI adoption and drive digital transformation in multiple industries—and will present lucrative opportunities for both SIs and solutions providers.

In the long term, by making edge AI so accessible, these platforms will also help to power the next wave of digital transformation. AEWIN sees its edge AI appliances supporting next-generation technologies like digital twins, which are virtual representations of physical devices. Digital twins enable real-time simulations, precise predictive analytics, and more effective automation, offering tremendous benefits in medicine, automotive engineering, manufacturing, and smart cities, Shao explains.

“Digital twin technology requires a powerful, reliable edge AI server to handle that volume of real-time data processing, and some fairly sophisticated AI software development as well,” remarks Shao. “Until now, that made digital twins a nonstarter for most organizations, but in the future, all-in-one edge AI solutions will bring this exciting technology within reach.”

 

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

Unified Platforms + Edge Intelligence = Intelligent Spaces

When you can communicate and act in real time thanks to intelligence derived from data, that’s when the magic happens. Picture a fire alarm going off in a 12-story hotel. The default course of action for management would be to evacuate all guests. But if a central intelligence platform could pinpoint the precise location of the alarm that’s going off, the front desk could activate a camera near the source and scope out the size of the problem.

Even further, guests on just that floor could be evacuated with guided signage for the closest exit, delivered in real time, to avoid crowding. Management can also relay live information about the problem to the fire department. Such gains might seem minor, but they add up systematically over time, making business operations more efficient, according to Surya Varanasi, Chief Product Officer at Kloudspot, an edge intelligence service provider.

To help businesses get there, Kloudspot offers a situational awareness and intelligence platform, which ingests Wi-Fi and sensor data to deliver such intelligence. Depending on enterprise needs, insights can take the form of intelligent spaces, smart surveillance, or even hybrid work solutions.

“With a central #edge intelligence platform, we’re able to cut across all these disparate views and create outcomes across systems” – Surya Varanasi, @kloudspot via @insightdottech

A Unified Data Platform for Edge Intelligence

The advantage of IoT is that it gathers data from a wide variety of sensors: HVAC and lighting controls, plumbing controls, parking lot cameras, and more. The problem most enterprises contend with is wrangling all that data to deliver contextual information in real time to the right people. “When there’s so much data coming in, it’s very hard to actually process it all and make sense of it,” Varanasi points out.

By running on Wi-Fi or other available connectivity, Kloudspot’s platform can gather information from various IoT edge devices and camera feeds and layer relevant intelligence—whether it’s real-time reporting or historical data. The net result is “an intelligent space where all these sensors come together and we’re able to deliver outcomes in a very simple way,” Varanasi says. “All this allows you to make intelligent decisions based on what you see.”

Since all communication devices funnel into the central platform, whether hotel signage, an in-room display, or a mobile app, system administrators can route relevant information to the right device. Management can also program alerts into the system to help ensure a superior guest experience.
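
The routing idea can be illustrated with a short sketch. The device registry and floor-based rule below are hypothetical stand-ins for whatever the platform actually maintains; the point is that targeted delivery replaces a building-wide broadcast:

```python
# A minimal sketch of floor-targeted alert routing; the registry and
# device types are invented for illustration, not Kloudspot's API.
from dataclasses import dataclass

@dataclass
class Device:
    device_id: str
    kind: str   # "signage", "guestroom", or "mobile"
    floor: int

REGISTRY = [
    Device("sign-7a", "signage", 7),
    Device("room-703", "guestroom", 7),
    Device("app-guest-88", "mobile", 7),
    Device("sign-3b", "signage", 3),
]

def route_alert(message: str, floor: int) -> None:
    # Deliver the alert only to devices on the floor where the sensor fired
    for device in REGISTRY:
        if device.floor == floor:
            print(f"[{device.kind}:{device.device_id}] {message}")

route_alert("Fire alarm on floor 7: proceed to the east stairwell", floor=7)
```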

Depending on the kind of information a business’ intelligent space needs, system administrators can set up the Kloudspot platform with different layers of data. “If you look at any space today, you have a few views of what’s happening,” Varanasi says. “We have the building management view, the physical security view, and the third is the view from wireless networks and Bluetooth access points. With a central edge intelligence platform, we’re able to cut across all these disparate views and create outcomes across systems.”

In essence, Kloudspot unites otherwise siloed views to get the most out of data. Especially important, according to Varanasi, is the platform’s ability to work with existing systems. “You can use us in a way that’s not rip-and-replace,” he says. “Once clients see that we can augment data through this platform to deliver very unique outcomes across the board, it’s very easy to show the business value.”

An Edge Intelligence Prescription for Healthcare

Improved outcomes across the board are what Kloudspot facilitated for its client Aspen Medical, a global healthcare company. Aspen Medical won a contract to provide primary healthcare in underserved areas in Abu Dhabi, and Kloudspot helped execute a digital transformation strategy. Kloudspot’s unified platform used existing Wi-Fi service to improve the patient registration and wayfinding experience through the medical facility. Patients can log into the guest Wi-Fi portal and become part of Kloudspot’s platform so the facility can route custom messages and deliver a better experience.

The Kloudspot platform also provides a gateway service that spans multiple Wi-Fi service providers, so an outage at any one provider does not interrupt business operations.

Kloudspot works with system integrators to deliver its solutions and uses Intel® Xeon® CPUs with integrated graphics engines for vision processing. Its software runs in Docker containers on Intel-based servers. The Aspen Medical solution uses Intel® Core™ i5-based NUC devices, blade servers, and the Intel® OpenVINO Toolkit.

The Many Uses of Unified Data Platforms

Healthcare and building management are not the only implementations for edge intelligence through unified data platforms.

For example, Kloudspot’s Immersive Work solution enables hybrid workers to access the same environment no matter where they’re located. Through Kloudspot’s unified intelligent spaces and smart surveillance platform solutions, airports can move security lines faster so passengers can spend more time at concessions and increase airport revenue. Similarly, camera feeds can conduct license plate tracking to ensure proper parking. Management can search video feeds with tagged metadata using natural language commands to track suspicious activity across terminals and improve security operations.

The future of intelligent spaces is about presenting actionable information in a visual and easily digestible format so relevant stakeholders can make decisions even more quickly, Varanasi says. After all, businesses save money and improve efficiencies when they make timely and contextual information available to relevant stakeholders through a single pane of glass.

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

Video AI OS Enables Large Scale Adoption of AI Apps

It’s a brave new world for video AI application development and consumption. The variety of video AI use cases is exploding, with the global computer vision market valued at $11.22 billion. And it’s expected only to continue to grow at a compound annual rate of 7.0% over the next couple of years.

Catalyzing this rapid growth are deep learning techniques, which are expanding the possibilities for computer vision solutions across a multitude of industries. Recent developments in this area have improved neural network architecture and training algorithms, increased the affordability of hardware to power AI algorithms, and made data more accessible across sectors.

With these technological advancements, the potential for video AI use cases is almost endless and certainly no longer limited to select industries.

It’s showing up in cities for safety detection, in the transportation industry for wrong-way detection, and even in manufacturing for defect detection. Additionally, warehouse management is becoming more streamlined than ever before with SKU counting and visual inventory management. And forensic specialists can now view the contents of a lengthy surveillance video in just a few minutes. These are just a small set of the possibilities with video AI today.

The Duality of Video AI Market Challenges

But this explosion of AI possibilities also comes with market fragmentation challenges, which are highly pertinent to developers creating these applications and the organizations that leverage them.

On the development side, developers are struggling to get their applications discovered by potential users. And on the organizational side, businesses are struggling to consume AI applications at scale.

With these #technological advancements, the potential for #VideoAI use cases is almost endless and certainly no longer limited to select industries. @awirosweb via @insightdottech

Part of the problem is that computer vision applications have traditionally been developed by different companies focusing on niche areas with limited geographical spread. But customer needs in video AI now span industries, and it is becoming harder and harder for organizations to find video AI apps that meet their specific requirements.

According to Yatin Kavishwar, Co-Founder of Awiros, a video AI OS and marketplace, this diversity is impossible to address without a centralized platform to scale.

Even if an organization does find a reputable developer with a suitable app or two, that doesn’t solve its scalability issues, and it makes it hard for niche app developers to justify their offerings. Considering the cost of the critical elements that enable video AI adoption, such as networks, hardware, infrastructure, and cameras, no company is going to achieve a favorable return on investment from one or two siloed applications, explains Kavishwar.

“In our experience, enterprise customers seriously exploring video AI applications are aiming to purchase eight to ten apps minimum,” he explains.

Solving Computer Vision Market Fragmentation

As a result, Awiros is working to solve this market fragmentation with its software platform and operating system, Awiros OS. The solution is designed to enable enterprise customers to achieve diverse insights and business outcomes from static video content and real-time camera streams.

Through its centralized marketplace Awiros AppStack, customers can source a collection of video AI apps in a quick and integrated manner, and third-party developers can access tools to build, deploy, train, scale, and manage video AI apps.

For example, when a leading global luxury car manufacturer was looking for a set of video AI apps to cater to its 14 unique use cases, it turned to Awiros. With Awiros OS, the company found a centralized way of discovering, hosting, and managing its large number of video AI apps under a single platform. And the Awiros AppStack gave it the ability to search for existing apps that solved its immediate needs.

In just two months, the Awiros team successfully executed the proof of concept (POC) using Awiros OS, integrating multiple applications, websites, and servers across several geographic locations. Not only did Awiros offer video AI applications that addressed each of the customer’s use cases, but its application marketplace AppStack (Video 1) provided solutions for a variety of future use cases, all hosted on the same platform.

Developers and Enterprise Customers Benefit from an AI OS

One of the biggest advantages Awiros brings to customers is eliminating camera-specific limitations and expanding the possibilities of existing deployments.

Enterprise customers can enjoy the freedom to choose from the 60 applications currently available in the Awiros AppStack, specify a camera stream, deploy an application for whatever time is required, and automatically schedule its redeployment on other cameras. All of this enables more efficient management of resource-hungry apps and drives greater ROI, Kavishwar explains.
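
A toy sketch shows what time-sharing one resource-hungry app across cameras might look like. The app names, camera IDs, and schedule structure below are invented for illustration and are not the Awiros scheduling API:

```python
# A minimal sketch of scheduled redeployment of a video AI app across
# cameras; names and the schedule format are hypothetical.
from dataclasses import dataclass
from datetime import time

@dataclass
class Deployment:
    app: str
    camera: str
    start: time
    end: time

SCHEDULE = [
    # Run people counting on the lobby camera during business hours,
    # then redeploy the same app to the parking camera overnight
    Deployment("people-counter", "cam-lobby", time(8, 0), time(18, 0)),
    Deployment("people-counter", "cam-parking", time(18, 0), time(23, 59)),
]

def active_deployments(now: time) -> list[Deployment]:
    return [d for d in SCHEDULE if d.start <= now < d.end]

for d in active_deployments(time(9, 30)):
    print(f"{d.app} running on {d.camera}")
```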

Video 1. Awiros AppStack is a video intelligence marketplace and aggregator of computer vision applications across industries. (Source: Intel)

Developers also benefit from a low-code environment that allows them to bring their applications to market quickly without the hassle of containerization. Additionally, Awiros provides a platform for niche developers to create domain-specific applications that require localized data. This closes a critical gap, since the best-performing algorithms are trained on localized data, which is often siloed itself, according to Kavishwar.

Awiros has successfully deployed its operating system in cloud, on-premises, and hybrid environments, thanks to Intel’s help. The company leverages Intel hardware such as Intel® Xeon® processors for some of its most critical projects, as well as the Intel® OpenVINO Toolkit for its entire computer vision library.
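
One reason OpenVINO suits this kind of multi-environment deployment is that a single model artifact can target whatever hardware is present. A minimal sketch, with a placeholder model path:

```python
# A minimal sketch of device-portable inference with OpenVINO;
# "detector.xml" is a placeholder model path.
from openvino.runtime import Core

core = Core()
print("Available devices:", core.available_devices)  # e.g., ['CPU', 'GPU']

# "AUTO" lets the runtime pick the best device present, so the same
# application can run unchanged on a Xeon cloud server or an edge box
model = core.read_model("detector.xml")
compiled = core.compile_model(model, "AUTO")
```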

The Future of Video AI Is Bright

Despite the barriers and fragmentation of video AI development, Kavishwar expects adoption to continue to rise over the next couple of years.

“The role of camera-as-a-sensor and IoT as a technology are increasing,” he observes. “The more these technologies proliferate in solving the day-to-day problems businesses face, the more relevant video AI will become.”

Going forward, Awiros aims to make it easier than ever to discover relevant solutions to business challenges by building a marketplace of 1,000 video AI applications by 2025, with the majority developed by third parties.

 

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

Open Hybrid Cloud: The Key to Manufacturing Automation

Manufacturing companies know very well that technology is a means to an end. At the end of the day, they just want their systems to work. Unfortunately, that is easier said than done—especially for those who are late to Industry 4.0, the sector’s version of digital transformation.

Part of the problem is that the COVID-19 pandemic blindsided many and delayed the process, according to John Archer, Senior Principal Business Development Manager of AI/edge at software company Red Hat. Owners of operational technology (OT) have not had time to retrain, and industry is realizing that OT teams must be shown the improvements they can expect from Industry 4.0 before they can adopt what they sometimes perceive to be a “black box.”

“There’s an enormous transformational aspect to Industry 4.0 which is really impacting end users and creating some level of resistance,” says Reza Mokhtari, Global Telco Alliance Executive at Red Hat. He points out that while hardware providers are ready with small form factor edge computing devices that can process sensor data and deliver efficiencies, there’s limited know-how about how to put it all into practice.

Giving OT the IT Treatment

Complicating the landscape even further are legacy systems that are incompatible with modern-day operations. “You’ll go into a shop and there’ll be proprietary hardware running software that no one is maintaining except for ‘hair-on-fire’ types of situations,” Archer explains. “So today if manufacturers want to make a change to the production line, they sometimes have to shut down the line for weeks to get software updated.”

Eager not to get stuck with such antiquated systems again, manufacturers want to upgrade and rearchitect toward flexible solutions. To do so, current processes will have to change. For instance, achieving Industry 4.0 initiatives requires OT assets to be managed like IT assets.

Red Hat offers open interoperable systems designed to manage OT more efficiently, according to Archer. Because open systems are compatible and can talk to one another, they can be deployed at scale, just like IT.

In addition, Archer says, “We’re making it easier to conduct edge machine learning operations while increasing security.”

But before open interoperable systems can be deployed at scale to facilitate manufacturing automation, enterprises need another component: strong connectivity. Latency and lag in data processing won’t fly, which is why modernized 5G infrastructure is also a key component of Industry 4.0. Red Hat OpenShift and Intel® Xeon® solutions help manufacturing companies that want the best of all worlds: a private 5G network with the reliability of a local area network (LAN) and the flexibility and mobility of Wi-Fi.

Hybrid #cloud enables #manufacturers to distribute workloads between on-prem and other cloud resources, as needed. @RedHat via @insightdottech

Why Open Hybrid Cloud Matters

Red Hat’s emphasis on flexibility extends to every aspect of digital transformation. Part of that is delivering computing resources to customers wherever their data resides. “You can call it hybrid or multi-cloud or whatever, but it’s really architecture for the realities of where things are,” Archer says.

Hybrid cloud enables manufacturers to distribute workloads between on-prem and other cloud resources, as needed. Red Hat products such as Ansible help manufacturers prioritize workloads and automate operational aspects of such management, Mokhtari says.
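
As a generic illustration of the idea, not Red Hat’s actual logic, a placement policy might keep latency-sensitive or data-heavy workloads on-prem and burst everything else to the cloud:

```python
# A generic illustration of a hybrid-cloud placement policy; the
# workloads and thresholds are invented for the example.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: int      # hard latency budget
    data_on_prem_gb: float   # plant-floor data the workload consumes

def place(w: Workload) -> str:
    # Control loops and heavy local-data consumers stay at the edge;
    # everything else can burst to cloud capacity as needed
    if w.max_latency_ms < 50 or w.data_on_prem_gb > 100:
        return "on-prem"
    return "cloud"

for w in [Workload("line-control", 10, 2.0),
          Workload("quality-vision", 40, 500.0),
          Workload("monthly-report", 60_000, 0.5)]:
    print(w.name, "->", place(w))
```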

Getting to Industry 4.0

The many moving facets of digital transformation can feel intimidating, so Red Hat gives manufacturers the confidence that “systems will just run and be supported,” Archer says. “Part of our value proposition is to give you guidance on how to right-size operations and manage them at scale. We ensure that ‘if it runs our stuff, you should expect it to operate and be managed in a certain way,’ whether the load is running on bare-metal Linux or in containerized and virtualized computing environments,” he adds.

To that end, Red Hat is test-driving Intel’s software development kit to certify it and “get it to a place where customers can consume it more readily and scale out production.” Red Hat and Intel have also launched the Intelligent Edge Solution Center, a lab environment designed to advance the Industry 4.0 ecosystem. With locations around the world, this collaboration helps the two technology companies develop potential edge solutions centered on computer vision and machine telemetry models. The locations serve as test beds to develop custom solutions for customers and partners.

As part of setting up the intelligent edge for automation in manufacturing, Red Hat has been using the Intel® Distribution of OpenVINO Toolkit and oneAPI. The two companies also collaborate on device management and provisioning workflows at the edge, using Intel® Edge Insights for Industrial.

Manufacturing is moving steadily forward with many companies ushering in digital transformation. No matter where manufacturers are in their digital maturity journey, they can use the open hybrid cloud and the intelligent edge to their advantage, Archer says.

“We’re super focused on how to make edge infrastructure work without our clients having to manage a bunch of different bespoke platforms,” Archer says. “Our clients want a single pane of glass with insights, and they want us to show them how to manage digital transformation at scale. That’s the kind of guidance we deliver.”

 

This article was edited by Christina Cardoza, Associate Editorial Director for insight.tech.

Video Safety and Security Lead to Business Efficiencies

Security video and sensor data can deliver plenty of business insights to organizations looking to optimize building utilization and user safety. So when organizations invest in expensive video infrastructure for safety and security reasons, they also want to leverage data for business purposes.

Take retail. Images of shoppers in stores not only help deter theft and keep people safe but also tell retailers how patrons move through a store and where they spend most of their time.

The challenge is to get these insights cost-effectively. Cloud-based management of video solutions helps control costs, but using the cloud to analyze the volume of data cameras capture gets expensive. It’s cheaper to do that analysis at the edge.

“The cloud-based management platform is excellent for a lot of workloads. But for workloads like video, where it’s very demanding and very expensive to move that workload off-premises, there needs to be some level of flexibility,” says David Grey, Senior Manager for the Video Appliances Product Group at Genetec, a video physical security solutions vendor.

Genetec solves the problem by providing customers with a combination of technology and global services. The company’s unified security platform combines video with access-control data from sensors. An appliance, called Streamvault Edge, sits on the edge at customer sites to analyze the data and give it context.

About 70% of the data is handled at the edge. Genetec transfers anything requiring further analysis to the cloud. Then the company converts data insights to an easily consumable format that customers access through a web interface.
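
The edge-first pattern can be sketched in a few lines. The event fields and the confidence rule below are illustrative assumptions, not Genetec’s actual pipeline; the point is that routine events are resolved locally and only ambiguous ones travel to the cloud:

```python
# A minimal sketch of edge-first event triage; fields and thresholds
# are invented for illustration.
from dataclasses import dataclass

@dataclass
class Event:
    camera: str
    kind: str        # e.g., "motion", "door-forced", "loitering"
    confidence: float

def handle(event: Event) -> str:
    # High-confidence, routine events are logged at the edge; only
    # ambiguous ones are escalated, keeping most traffic off the cloud
    if event.confidence >= 0.8:
        return f"edge: logged {event.kind} on {event.camera}"
    return f"cloud: uploaded {event.kind} on {event.camera} for deeper analysis"

print(handle(Event("dock-3", "motion", 0.95)))
print(handle(Event("lobby-1", "loitering", 0.55)))
```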

Beyond Safety and Security

“Only 5% of video collected from security systems is played back—to investigate incidents,” says Grey. “The other 95% in a traditional deployment is just getting thrown away, essentially. At the end of a 30-day cycle or at the end of the retention period, it gets rewritten.”

Organizations that make substantial investments in video frown on that kind of waste. One way to derive value is to use real-time analytics to prevent security incidents. Someone captured on camera acting suspiciously can be stopped before slipping into a building illegally or stealing something.

Today, companies look for more capabilities from their existing video #SecuritySystems. They look for actionable business #intelligence. @genetec via @insightdottech

But today, companies look for more capabilities from their existing video security systems. They look for actionable business intelligence. For example, by knowing the number of individuals entering and exiting a facility, companies can measure occupancy trends. Visibility into these trends can drive decisions from energy management to conference-room planning.
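
Deriving an occupancy trend from entry and exit counts is simple in principle, as this sketch with a simulated event stream shows; the field names and data are illustrative:

```python
# A minimal sketch of occupancy tracking from entry/exit events;
# the event stream is simulated for the example.
from collections import defaultdict

events = [  # (hour, direction) pairs as a doorway camera might emit
    (8, "in"), (8, "in"), (9, "in"), (9, "out"), (10, "in"), (10, "out"),
]

occupancy = 0
by_hour = defaultdict(int)
for hour, direction in events:
    occupancy += 1 if direction == "in" else -1
    by_hour[hour] = occupancy  # occupancy at the end of each hour

print("current occupancy:", occupancy)
print("hourly trend:", dict(by_hour))
```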

“Operationally, if you’re an office, there’s a lot of value in understanding what’s actually happening in your premises,” says Grey.

Video Data Leads to New Opportunities

Whether you’re a retailer, a sports stadium, or an airport, operational efficiencies derived from security data can create commercial opportunities. Retailers, for instance, are always interested in tracking the customer journey to create better shopping experiences. Genetec helps them by monitoring traffic flow and checkout queues.

“Once a certain number of people have lined up at a cashier, is it time to open another cashier? Can you count people who have decided not to purchase because the queue was too long? There’s a lot of insight into how to operate a retail store more efficiently,” says Grey.

Other industries, such as financial services, also can benefit. One financial services customer with global reach needed to monitor locations without access to the company’s network. Extending the network to each site would have been costly.

So the company tapped Genetec to deploy Streamvault Edge at those sites. Because the solution is managed through the cloud, it required only two minor changes to the company’s network for security monitoring and access control.

The company saved money in two ways: The solution didn’t require major network integration, and the customer got access control without having to post security guards in the buildings, Grey says.

Continuous Improvements: From Video Security to Cybersecurity

Beyond video security, analytics, and business intelligence, cybersecurity plays a big part in Genetec’s solution and service strategy. Grey says all of the company’s hardware and software are hardened and penetration-tested by a third party. And all of its software partners go through cybersecurity training twice a year.

The platform tracks cyber risks in real time and issues alerts if someone tampers with a camera. Grey says Genetec won’t work with cameras lacking security controls such as password protection: “Cybersecurity is a huge part of our credibility with our customers.”

Partnering with Intel furthers that credibility. Genetec uses Intel CPUs to power Streamvault Edge and works closely with Intel on R&D and go-to-market strategies.

Going forward, Grey is confident Streamvault Edge will open inroads into new markets. The solution is quick and easy to deploy, and its software is updated continuously. “When new features become available, those features are automatically available to our customers so that they can always be consuming our latest and greatest,” says Grey. “It’s our desire to protect the flow of everyday life by providing organizations around the globe with a means to improve their business intelligence, operational awareness, and security.”

 

This article was edited by Georganne Benesch, Associate Editorial Director for insight.tech.