Private 5G Predictions with CCS Insight

Richard Webb


With more spectrum recently allocated to enterprises, private 5G is rapidly making inroads in industries like manufacturing. The engineering company EXOR International is just one example: It is using 5G spectrum to capitalize on the benefits of Industry 4.0—enabling engineers to quickly collaborate, experiment, test, and deploy new technology.

Wondering how you and your industry can benefit? Listen to this podcast as we explore the use cases for private 5G networks, the role of cloud providers like Amazon Web Services, Google, and Microsoft, and ways to incorporate 5G into existing networks with solutions from companies like Cisco, Dell, HP, and IBM.

Our Guest: CCS Insight

Our guest this episode is Richard Webb, Director of Network Infrastructure at CCS Insight, a mobile and wireless research firm. Richard has been an industry analyst for more than 20 years, with a focus on the 5G networking landscape and markets like RAN architecture, 5G network enterprise considerations, Wi-Fi, and 6G. Prior to joining CCS Insight, he worked for a business conference company, where he was introduced to the telecommunications industry and gradually moved to an analyst role.

Podcast Topics

Richard answers our questions about:

  • (4:14) The trend toward private 5G networks
  • (7:56) Different 5G use cases
  • (11:08) Wi-Fi 6 versus 5G
  • (13:44) How organizations can implement private 5G networks
  • (17:35) The role of cloud providers in private 5G
  • (19:57) Where enterprise solution providers fit in
  • (22:00) Enterprise network technology providers in a 5G/Wi-Fi world
  • (24:04) What private 5G looks like in practice

Related Content

To learn more about the future of private 5G, read Challenges and Requirements of the Private 5G Network Market. For the latest innovations from CCS Insight, follow them on Twitter at @CCSInsight and on LinkedIn at CCS-Insight.

 

This podcast was edited by Christina Cardoza, Senior Editor for insight.tech.

 

Apple Podcasts  Spotify  Google Podcasts  

Transcript

Kenton Williston: Welcome to the IoT Chat, where we explore the trends that matter for consultants, systems integrators, and end users. I’m Kenton Williston, the Editor-in-Chief of insight.tech. Every episode we talk to a leading expert about the latest developments in the Internet of Things. Today I’m exploring the rise of private 5G networks with Richard Webb, Director of Network Infrastructure at analyst firm CCS Insight.

With every new generation of cellular technology, the hype is always there. But with the rise of private networks, 5G is set to be a true game changer for manufacturers and other businesses. To successfully take advantage of everything 5G has to offer, you need collaboration between telecom operators, cloud service providers, systems integrators, and even enterprise solution providers.

In this podcast, we’ll talk about these partners and their role in the 5G ecosystem, the reasons enterprises should deploy private 5G networks in the first place, and the differences between Wi-Fi and 5G. But before we get started, let me introduce our guest. Richard, welcome to the show.

Richard Webb: Thank you very much. Pleasure to be here.

Kenton Williston: Can you tell me what CCS Insight does and what your role is there?

Richard Webb: Sure. CCS Insight is an industry analyst firm focused on a range of different emerging digital technologies. Everything from AI, machine learning, automation, analytics, mobile network infrastructure, enterprise, digital transformation, IoT, smartphones and mobile devices, including wearables, VR, AR, a range of different technologies. And really a lot of the work we do is consulting on the market opportunities for the intersectionality of those different emerging technologies. We’re a relatively small firm based in the UK, but with a global remit, so we look at markets across the whole world. We serve a customer base of mobile operators, cloud service providers, network equipment providers, software solution providers, and everything in between.

Kenton Williston: I look forward to getting into that, but I’m curious. Actually earlier in my career I did a little bit of work as an analyst myself. So I’m curious what got you into this line of work and what you did before your current role.

Richard Webb: Okay. So I’ve been an industry analyst for over 20 years now, but I started my career actually working in entertainment. I was running a venue, and it held a number of different corporate events, as well as public entertainment. One of the corporate events was a business conference focused on technology. And I got speaking to the organizer who was putting the show on and hiring our venue, and I got really interested in that as a possible career. I jumped lanes, as it were, and got into a conference organization and media company, and got put into the telecommunications division. A lot of the European telecoms markets were liberalizing. Really interesting time: GSM, 2G mobile networks were just beginning to proliferate around the world. We were all getting mobile phones for the first time. A huge range of business opportunities and lots of dynamism there to investigate. And that evolved into more in-depth research, and I sort of stepped sideways into the analyst environment. That was a very natural move. And I’ve been an analyst for over 20 years.

Kenton Williston: You know that history where you started with the emergence of 2G and the changes in the European landscape is really reminiscent in some ways of where we are now in the telecom space, right? Every G that we’ve experienced so far has been the one that’s really going to revolutionize everything, right? The hype, the hype’s always there, but I think in a lot of ways, 5G really is a pretty meaningful step change. And especially for the world that we care about here at insight.tech, which I should mention is an Intel® publication. You know, 5G is incredibly important for the Internet of Things. I would love to hear a little bit from your perspective, what in particular is driving the trend toward private 5G networks. Because this is something that’s relatively new. Private networks really haven’t been a thing before now. And like I said, this is something that’s going to be incredibly important for the Internet of Things. What are you seeing in this area?

Richard Webb: Well, firstly, I think you’re absolutely right about 5G. I think its previous iterations, from 2G to 3G and then 3G to 4G, have really been about developing mobile communications, but on a fairly straight line: adding data to voice and then improving the speeds and feeds of that data capability in particular. But you come to 5G, and absolutely, you’re right: this is a real game changer, simply because 5G brings along new capabilities. It’s not really just about faster broadband. I think that’s very much how it might be presented to the consumer market, but I think it means different things to other markets, particularly enterprise and industrial vertical sectors.

And I think this is really where we get to see the kind of full-flavored 5G reach fruition. And that’s because some of the capabilities 5G has, yes, better capacity, but also lower latency, emerge at the same time as the evolution of other digital technologies, such as multi-access edge computing, big data analytics, AI, and machine learning. And really it’s the combination of 5G with those other digital capabilities that makes it more powerful.

5G is a real foundational infrastructure and service environment for those other capabilities. And when you put those in combination with each other and with other technologies as well, you get a much richer environment. And I think that’s something that you’re going to see industry really take advantage of. Early-phase 5G was very much around non-standalone 5G infrastructure that really was a slightly better iteration of mobile broadband. As we get into the second phase, or the later phase, of 5G, it’s going to be built around standalone. And there’s a lot that’s happening in the core infrastructure there that really releases some of those capabilities and some of that ability to interact with those other technologies. When you look at private 5G networks, you’re looking at an industrial environment that’s already going through its own changes. Those industries, those enterprises, are going through their own digital transformation.

So really they’re putting more and more of their processes into the cloud. They’re doing a lot more that’s data-driven and IT-centric and computing-centric within their own processes. And 5G comes along at a very opportune time, because it can really play a role in supporting that digital transformation in enterprises, and also accelerating it by bringing those other digital capabilities to an enterprise. It could say, “Hey, there are possibilities here that you haven’t ever had before.” So there are all sorts of new use cases emerging for enterprise and industrial verticals, built around that 5G connectivity as a kind of means of accessing those other digital capabilities I was talking about. Particularly edge computing, which is incredibly powerful.

Kenton Williston: Yeah, absolutely. And like you said, the overall trajectory of 2G, 3G, et cetera, has been greater bandwidth and more consumer-facing things. On 5G networks there are flavors, though. Granted, these sorts of things were starting to come to fruition with LTE, but they’re much more fully fleshed out in 5G: flavors that are really designed specifically for things like industrial environments, where the emphasis is not so much on having huge, fat pipes to pump the latest episode of your HBO Max show or whatever it might be. It’s more about having low-latency connections to an awful lot of devices, which is really perfect, like you said, for an industrial setting.

Richard Webb: Yeah, absolutely. I think once you stop thinking in terms of the connected device being attached to a human, necessarily, and think of it being attached to a machine instead, gathering data that is related to the activity that that machine is there for, then you’ve got different data streams that could be maybe only sending small pieces of data, but scaled up over tens or hundreds of thousands of data points, of connectivity points, within an organization. These could be sensors that are just relaying one single piece of information, but on a very regular basis, as a small part of a very complicated manufacturing line, for example. Or a very complicated set of different processes around lots of different types of machinery within a smart healthcare facility, for example, or a smart grid environment. They’re very specialized environments with very different requirements from what was even possible over 4G. But the different devices and the different data streams they’re capturing offer a number of very, very interesting, but very sophisticated and often complex, use cases.

There’s a great deal of possibility, but there’s a great deal of requirements as well. And this is where 5G really plays its ace. Because it’s got capabilities that, yes, are around capacity and latency, but it’s also the mobility, bringing that into an industrial environment where previously those machines were tethered to wired networks, for example. It’s the flexibility that 5G offers in terms of moving machinery around, its actual mobility within a location, as well. Its interrelation with other network technologies, like Wi-Fi, can be a very useful capability for 5G too. And of course, there’s that nationwide infrastructure, that mobile network that goes beyond a single enterprise premises, which is another dimension to it as well. So there’s an awful lot that 5G can bring to the table, but these are very demanding, very exacting requirements. And that is not just shaping the technological landscape, it’s shaping the commercial landscape around private 5G networks.

Kenton Williston: Yeah, absolutely. And so you touched on the idea of the relationship between 5G and Wi-Fi, and I’m curious where you see that positioning exactly. And especially as Wi-Fi itself continues to advance. We’re seeing Wi-Fi 6, for example, coming out. Where do you see the interface between those two technologies? Why would you use one versus the other?

Richard Webb: Yeah, it is a good question. And it’s something that I’ve seen quite a bit of debate around. I don’t really see it as being a zero-sum, kind of one or the other standoff between Wi-Fi and 5G. Wi-Fi is incredibly broadly deployed within enterprises. I don’t think it’s going to go away, and I see no reason why it should go away just because 5G becomes an option. There’s been a very progressive and coordinated conversation between Wi-Fi development camps and mobile development. So Wi-Fi and 4G, 5G are not strangers to each other. These are technologies that can now talk to each other. They can be integrated within a network, or they can operate as two separate simultaneous networks. And this is how I think we’re going to see both of them coexist within an enterprise location. Wi-Fi is very well deployed. It scales very well indeed.

Wi-Fi may be better suited to non-critical communications because of its contention-based nature. You can have Wi-Fi get a little bit overloaded when you’ve got too much going on on that network, so it may not be suited to mission-critical connectivity. 5G is much better suited to that mission-critical capability. You connect a device and you can typically, if not guarantee performance, then work within performance parameters to a much greater and more consistent extent. So it’s not necessarily that 5G is better than Wi-Fi. Certainly Wi-Fi and Wi-Fi 6 have their own capabilities that make them very applicable in many use case environments. It’s just that 5G can be better at certain things or in certain scenarios. If you think in terms of how best to get performance out of Wi-Fi and how best to get performance out of 5G, think of them as complementary. When you’ve got both of them working simultaneously, they’ll actually help the performance of each other.

Kenton Williston: One of the things that comes to mind that’s really important about the distinction between these two technologies, beyond all the points you’ve been making, is just the fact that Wi-Fi is so familiar, right? Whereas cellular technologies are, to many organizations, a new idea. It’s not something they’ve broadly deployed, not something their IT department is necessarily particularly familiar with. And that makes me curious about how organizations might go about implementing a private 5G network. Is this something that the IT team can develop expertise on in house? Is there a role here for partners they should be looking to bring in to deploy these technologies? How do you see that playing out?

Richard Webb: 5G is almost like a reset opportunity for telcos to reshape their game for enterprise and industrial verticals. To think differently about how they position not just their services, but 5G as a kind of technology platform within the enterprise. And to bring on board all those other capabilities, some of which enterprises and industrial verticals are already grappling with, capabilities like edge computing, AI analytics, and so on. So 5G can really harness that and be an accelerant of those digital technologies, fueling the transformation that’s already in play for a lot of enterprises. Again, there’s a lot they have to do on the commercial side, not just figuring out the right way to approach the market and their channels to market, but really being open about what their capabilities are. And here is where I think the market environment is going to get very, very different for mobile, really based around that private 5G opportunity.

It’s around telcos being honest that they can’t do everything for everyone, particularly within an enterprise environment that itself is becoming more complex and more demanding in terms of its digital transformation processes, its potential use cases, and so on. They’ve got to put in place an ecosystem of solutions. That’s both network hardware and software. It involves cloud service capabilities, could involve systems integrators, could involve players with deep vertical-specific knowledge for some of those markets that telcos want to address. And figuring out that value chain is by no means an easy thing to do. It’s almost something that you have to do on a customer-by-customer basis, or certainly on a vertical-by-vertical basis. And telcos might have different strategies for approaching different verticals. In one vertical, let’s say healthcare, they may have a good customer base. They may know a lot about the technology needs of that healthcare industry.

And so they feel they could be the direct-touch lead for a particular healthcare-transformation environment. Pick a different vertical and that telco may not be as strong. So it may take a different approach, being more of a wholesale provider, where the lead could be a cloud partner that is the direct interface with the customer in that particular environment. Or it could be a network-equipment provider, when you see offers from the likes of Ericsson and Nokia; certainly Nokia has a direct-touch approach among a number of different strategies for reaching out to different vertical environments. It isn’t necessarily something that the telco has to front up, but they do have to play a part in positioning that solution for a particular customer as part of an ecosystem of solutions that’s going to pull in resources from a number of different players. So there’s a technology component to it, but there’s very much a commercial component to it as well.

Kenton Williston: Yeah. One of the things that was really interesting in your list of potential partners and who is involved in the ecosystem are the cloud service providers, right? These are not folks who would traditionally think of having a role in telecom networks. So I’m thinking here, let’s say the Amazons, the Googles, the Microsofts, what role do these cloud providers have in these private 5G networks?

Richard Webb: I think these providers, like Amazon with its Wavelength services, like Microsoft Azure, Google Cloud, and so on, have incredibly important roles to play, and a great opportunity, within the private 5G network environment. Many of them have existing relationships with enterprises, partly because those enterprises are already undergoing their own digital transformation. A lot of that revolves around the cloudification of their own processes; their processes are becoming more data-centric. And so they’re already customers of those cloud providers. And secondly, because I think there’s a scale and a reach to those cloud providers that can often outperform or out-scale what a telecoms operator can potentially offer. Don’t forget these are global organizations in many cases. And whilst, yes, an operator has 5G infrastructure, those cloud providers have a great deal of investment in infrastructure of their own, in servers and data centers and so on. And a great range of skill sets and a flexibility that’s very powerful.

It’s not really a case of them usurping the operators, or at least I don’t believe so. But I do see, from some operators that I’m speaking to, that there is a little bit of tension around exactly what the role of the cloud providers is, given their scale and reach. But I think it’s again about that powerful combination of different service providers in the mix. And so I think there’s still quite a bit to be figured out with regard to telcos and how they interact with those cloud providers. But I think there is a growing market, a growing pie, if you like, and there’s room for everyone to coexist, so everyone can get a piece of that pie.

Kenton Williston: Yeah. So on that point, it’s not just the telcos, not just the cloud providers, but there are also enterprise-solution providers, the likes of IBM, Dell, and HP, who are also getting involved in this private 5G space. Can you tell me a little bit more about what role they play in this ecosystem?

Richard Webb: I think they’re also incredibly important. Many of these organizations have very longstanding relationships and reputations within the enterprise space. You mentioned IBM; that’s got decades of heritage as an enterprise-computing provider. Others like HPE and Dell are absolutely the same, and have been very progressive in their moves into the telco environment as solution providers. A lot of them, particularly HP, are very active in 5G-core environments, leveraging their software capabilities. They’ve already been moving in this direction.

Private 5G networks are really just an extension of that strategy to be part of telecommunications in a networking sense, and not just in a devices sense. It’s still about access to computing and processing capabilities, but it’s much more in tune with the virtualization of networking. I don’t see them as any more of a dangerous threat or a predator in this environment. I see them having a role that is valuable to the market alongside 5G network operators and alongside those cloud providers. Those IT providers have got heritage with mobile. And what that really means is that often they have deep knowledge of how those different vertical sectors are evolving, and whereabouts they are on their roadmap of digital transformation. That’s a really important piece, if you like, of the jigsaw puzzle in putting solutions together.

Kenton Williston: Yeah, for sure. And of course there’s one other element of this ecosystem I think is going to be quite important, and that’s from the enterprise-network-technology providers. I’m thinking here, for example, of the Ciscos of the world. Can you tell me where they fit into this? And really, I suppose, not just from a 5G perspective, from the larger perspective of 5G, Wi-Fi, and everything networking that’s going on in the enterprise space.

Richard Webb: You know, this is meat and drink in some ways to Cisco. It’s just figuring out what their strategy is and where they fit. And I think that’s where a lot of the different players are still figuring it out. It’s not so much, have we got the right technology, but where do we fit best to offer the most value in that value chain? Where can we build business for ourselves? For Cisco, it may well be around the integration capabilities, sitting in the middle of telco and enterprise networking, having capabilities across that whole gamut of technology intersection, if you like.

There’s a number of different ways they can position within this market. But really, I don’t think it’s necessarily even right to think of private 5G as a single market. I think it is going to be a very diverse market, perhaps according to industry sector. But you can be one thing to one market and a different thing to another market. And it’s really about looking at your channels, your opportunities, your customer base, your partnerships, and so on, and figuring out, almost on a case-by-case basis: What is our best opportunity? Who can we work with? How are we going to put solutions together and run it on a project-by-project basis?

Kenton Williston: You’ve mentioned quite a few times, and I absolutely agree with you, that exactly what private 5G networks will look like will depend entirely on the context, not just the industry that you’re talking about, but even the particulars of the given organization’s needs. It’d be great to talk in a little bit more detail about a specific example, and I’ll point to the report that we’ve got hosted on insight.tech, and I’d encourage our listeners to go over there and check out the CCS Insight report on 5G private networks. And in that report there’s some discussion of deployment that’s happened with Exor in one of their manufacturing facilities. I’d love to hear from your perspective what is significant about that example, and what takeaways folks can have from it. And I suppose even really what did Exor do in that deployment exactly anyhow?

Richard Webb: Exor is a technology manufacturer. And within its Verona smart-manufacturing facility in Italy it has covered that site with 5G as part of a private mobile network environment, in partnership with Italian mobile operator TIM, and also in partnership with JMA Wireless and Intel. And what it’s done is not just operate a 5G private network for its own processing capabilities, but build a 5G smart-lab environment. It’s actually testing new use cases within its own network environment that it can then deploy within its own network, but also present as part of its solution suite to customers as well. It’s opened its doors to allow other companies into its 5G lab, to explore how they can interact with Industry 4.0 wireless applications based on 5G.

It’s not just an example of how to deploy it, but it’s really a sharing partner for those learnings as well. It’s an incredibly powerful environment to get a sense of not only how use cases are deployed within a smart-manufacturing environment, but to experiment with what more could be done. Particularly where you are looking at the interface between industrial PCs and human machines, and so on. A lot of rich potential coming out of there. And I’m really interested to see how that story evolves over time.

Kenton Williston: Yeah, likewise. There’s a lot of exciting things happening in that example. And again, I’d really encourage our readers to go check out the full report. Not only for more details on that example, but more details of all of your thinking on the future of private 5G networks. We’re getting close-ish to the end of our time together. I want to leave a little bit of time here just to see if there are some particularly important things that we may have wanted to cover from your point of view.

Richard Webb: Thank you very much. One thing I’ve talked about a little bit in some of my responses has been around complexity, around the sophistication of use cases and the high demand that that can put on connectivity and data processing and so on, and how 5G can be a platform for a combination of digital technologies. And that sounds great, but you need those technologies to be integrated. This is a complicated environment. So what I think is important for the acceleration of 5G private mobile network adoption are solutions that can cut away some of that complexity. And I think we’re beginning to see some of that addressed in some recent announcements. I’m thinking particularly of AWS re:Invent, which took place recently, at which it launched its private 5G service. Really, this was about providing, if you like, a one-stop-shop, plug-and-play 5G network as a service, presented directly to enterprises in conjunction with 5G telco partners.

That’s probably one example of an attempt to cut through some of that complexity, make it really easy for enterprises to pull the trigger and green-light 5G private mobile network adoption within their organizations. It’s really about that collaborative environment on the technology side, and having a commercial framework that enables that. That is what’s really powerful potentially, but you’ve got to get 5G adopted into the enterprise first, and to do that, simplicity and reducing complexity is going to be very, very important.

Kenton Williston: Yeah, absolutely. We mentioned briefly some of the enterprise solution providers like IBM, Dell, and HP, and I think one of the things that’s important to note here is that it’s not just about the private 5G–specific solutions that these folks are starting to introduce, although that is very important. It’s also the wider sphere of how these folks are starting to deploy server technologies and edge-computing technologies that are well suited for being out on a factory floor, or for other sorts of edge-computing use cases. And this is very much an integral part of being able to make use of these private 5G networks.

Richard Webb: Yeah, I’d agree. And that’s why I think the likes of Dell and HP and IBM are very powerful partners for enterprise within this environment. Because they’re already partners. They’re already holding the hand of their enterprise customers as they go through this digital transformation journey. And don’t think of private 5G networks as something separate. Private 5G is really part of that digital transformation. It doesn’t exist as an island. It’s in many cases a very natural place to arrive at as part of that ongoing process of evolving your data processes within an organization. It’s simply a better way of connecting those different parts of data functionality with processing and computing capability over a resilient and flexible network. So it’s really kind of a natural evolution. And that’s why I think it’s absolutely right that the likes of HPE, Dell, IBM, and others are very much integral to this adoption of private mobile networks. They have a role to play because they’ve already been playing that role, and that’s not going to change.

Kenton Williston: Absolutely. I think this is a great place for us to wrap things up. Richard, this has been a really fascinating and wide-ranging conversation, and really appreciate you sharing your thoughts with us.

Richard Webb: Thank you very much. Pleasure to be here and thank you very much for having me.

Kenton Williston: And thanks to our listeners for joining us. To keep up with the latest from CCS Insight, follow them on Twitter at @CCSInsight, and on LinkedIn at CCS-Insight.

If you enjoyed listening, please support us by subscribing and rating us on your favorite podcast app. This has been the IoT Chat. We’ll be back next time with more ideas from industry leaders at the forefront of IoT design.

This transcript has been edited by Erin Noble, proofreader.

 

The preceding transcript is provided to ensure accessibility and is intended to accurately capture an informal conversation. The transcript may contain improper uses of trademarked terms and as such should not be used for any other purposes. For more information, please see the Intel® trademark information.

The IIoT As-a-Service Streamlines Digital Transformation

Industrial machines produce an inordinate amount of data. But manufacturers have predominantly relied on this valuable information to control their production flow rather than to drive automation or support forward-looking activities like predictive maintenance.

But big data is gradually bringing about big disruption in manufacturing. Along with the IoT and edge computing, digital transformation may finally take hold within the industry. q.beyond, a leading IT service provider, is making this happen by employing a “digitization-as-a-service” approach. In this way, the company is helping midsize manufacturing customers fully harness the power of data in their business.

Why Manufacturing Digital Transformation Now?

Many companies have been slow to adopt transformative technologies due to the lack of time and expertise for innovation outside their core business. Yet digitization is critical now more than ever for manufacturers. With a shortage of skilled labor, increased global competition, and profitability pressures, they must foster more agile operations.

But modernizing a business with legacy processes into a digitally driven organization isn’t that simple. Innovation is often time- and cost-intensive, which is why some companies have turned to managed services to accelerate their transformation.

From cloud modernization to IT outsourcing, q.beyond’s platform-based services empower manufacturers to digitize their business. The company combines edge computing infrastructure and hardware devices with advanced engineering software to help manufacturers innovate without having to overhaul their existing systems or significantly expand their IT capacity.

“Companies do not have to develop software themselves and don’t have to integrate it themselves. That saves them time and risks,” says Uwe Schnepf, Head of Industrial IoT Solutions and Strategic Partnerships at q.beyond.

The #Edge is advancing shop floor connectivity and transforming #manufacturing companies into digitally-enabled businesses. @qbeyondag via @insightdottech

Transforming Industrial Processes with the Edge

q.beyond’s model is anchored around its EasyEdge platform service, which uses Intel® processor-based edge gateways to securely collect, process, and analyze data from industrial equipment on the factory floor. The solution is interoperable and extensible, so manufacturers can use it to expand the capabilities of existing systems and to connect any device they use in production without encountering vendor lock-in or issues with data sharing.

Partnerships are key to q.beyond’s overall solution and services. One of the company’s partners is IoT supplier Advantech, which provides edge devices that can be directly connected to machines via industrial fieldbus or ethernet protocols. Software from another partner, cbb software GmbH, is integrated into the edge device to facilitate OT-IT integration and to collect machine data, which can be stored locally on the device or in the cloud.

q.beyond also has a middleware solution called Edgizer, used by EasyEdge, that serves as the operating system for the edge gateways and provides centralized device management.

Combining software and hardware from its partners allows q.beyond to deliver several benefits for customers, including the ability to decouple data collection, abstraction capabilities that reduce data complexity, and support for various protocols such as OPC UA, MQTT, and REST APIs. Customers get real-time shop floor connectivity and insights, numerous interfaces for integration, and remote access to machines.
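
To make the protocol side more concrete, here is a minimal sketch of how an edge gateway might publish machine telemetry over MQTT, one of the protocols named above. It is written against the open-source paho-mqtt Python client and is purely illustrative: the broker address, topic names, and payload fields are hypothetical and do not represent q.beyond’s EasyEdge or Edgizer implementations.

```python
# Illustrative sketch only: a generic edge gateway loop that publishes one
# machine's telemetry to an MQTT broker. The broker address, topic, and payload
# fields are hypothetical; this is not q.beyond's EasyEdge/Edgizer code.
import json
import time

import paho.mqtt.client as mqtt  # pip install paho-mqtt

BROKER_HOST = "broker.factory.local"        # hypothetical on-site broker
BROKER_PORT = 1883
TOPIC = "plant1/line3/machine42/telemetry"  # hypothetical topic hierarchy


def read_machine_sample() -> dict:
    """Stand-in for a fieldbus or OPC UA read; returns one telemetry sample."""
    return {
        "timestamp": time.time(),
        "spindle_rpm": 1480,       # example values only
        "temperature_c": 61.3,
    }


def main() -> None:
    # paho-mqtt 1.x constructor; version 2.x additionally expects a
    # CallbackAPIVersion as the first argument.
    client = mqtt.Client(client_id="edge-gateway-42")
    client.connect(BROKER_HOST, BROKER_PORT)
    client.loop_start()  # network I/O handled in a background thread

    try:
        while True:
            sample = read_machine_sample()
            # QoS 1: broker acknowledges receipt, a common choice for telemetry.
            client.publish(TOPIC, json.dumps(sample), qos=1)
            time.sleep(5)  # publish interval in seconds
    finally:
        client.loop_stop()
        client.disconnect()


if __name__ == "__main__":
    main()
```

In a real deployment this pattern would normally sit behind TLS, broker authentication, and local store-and-forward buffering so samples are not lost when the uplink to the cloud drops.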

q.beyond offers all these services to customers through a cost-efficient, pay-as-you-grow model. Schnepf says manufacturers can start with just one edge gateway connected to a single machine and the Edgizer platform for centralized device management. Once a customer decides to continue with the service, they can then connect additional machines and pay a monthly service fee per device.

Driving Innovation for Manufacturers

One example of how q.beyond helps customers achieve more agile operations is its work with Schütte, a Germany-based leading global manufacturer of machine tools.

Schütte uses edge gateways and the Edgizer platform, along with its own proprietary software. This allows the company to collect data and run analytics in real time to measure machine performance, and to remotely access and update equipment on its customers’ shop floors.

“They use the processes running on the edge gateway to do their analytics and to visualize it. Then they talk to the people operating the machine at shop floor level about system status,” Schnepf says.

Schütte also has used the platform for around-the-clock remote monitoring of machines at customer sites and to remotely access machines and perform updates. This has minimized on-site deployments and increased uptime of its machines worldwide.

Schnepf says Schütte showcases how other manufacturers can use this emerging technology to digitize their businesses.

“You do not need people onsite to do setup. That helps a lot in terms of lifetime support and establishing new models for smart service,” he says. “Also, having the data about how their machine is used to produce parts provides a lot of feedback to the manufacturer about how to optimize the machine itself.”

Bringing the Smart Factory to Life

The edge is advancing shop floor connectivity and transforming manufacturing companies into digitally enabled businesses. But this all starts and ends with data. Data will serve as the foundation for so many future manufacturing innovations — whether it’s AI-driven predictive maintenance or smarter product development.

“We think there are many, many ways of improving services and products using this type of technology,” Schnepf says. And with all these edge-driven services and capabilities, manufacturing’s long-anticipated digital transformation is coming to fruition.

 

This article was edited by Georganne Benesch, Associate Content Director for insight.tech.

Forecasting the Future of IoT with CCS Insight

Who in the business world doesn’t wish they could predict the future? But there are some out there who try to gaze into the crystal ball—and are brave enough to publish the results. Analyst firm CCS Insight is one such soothsayer with the release of its annual IoT predictions. They’ve surveyed the whole CCS Insight staff on topics from sustainability to the future of the cloud and even the evolving definition of “IoT.” They’ve prepared a special version of the 2022 report that focuses on the IoT transformation space for readers of insight.tech.

We’ll get some of the highlights in our conversation with Martin Garner, COO and Head of IoT Research at CCS Insight. He’ll look back at 2021 to see where the crystal ball was clear—and where it was cloudy—as well as forward into the trends to watch in 2022 and beyond.

How did your predictions for 2021 play out: What went right, and what went wrong?

There were a few we were quite pleased we got right. One was that COVID would accelerate the adoption of robots, automation, and IoT across sectors. There was an initial pause in investment, but people realized they needed this stuff to keep their operations going. Another one was that security and privacy in AI and machine learning would become much stronger areas of concern. Machine learning has quite a big attack surface, and it could be initially really hard to detect a hack.

Now we did get a few predictions wrong as well. We did predict that somebody would buy Nokia, and no one did. We also predicted that the regulation of the big tech players would slow down, and it’s actually moving faster than we expected.

And then there are a few that were longer-term predictions, which we’re still waiting on. For example, a big cloud player will offer the full range of mobile network solutions by 2025. Another one is that tiny AI will move up to being 20% of all AI workloads. There is a lot going on here—especially in IoT—and the role is growing, but we’re not at that level yet.

What’s on your mind for IoT in 2022?

We have nearly a hundred predictions for 2022 and beyond, and obviously we can’t go through all those here. What we did do was a cut of those that are relevant in some way for the IoT community, and we’ve packaged that up in a report that is available as a download from insight.tech.

I’ll just highlight a few that caught my attention. Several were follow-ons from the impact of COVID. By 2025 there will be somewhat less use of office space in the developed world—down about 25%. And there will be much more use of 5G as an additional home broadband; we think maybe 10% of households will have it.

“We think #IoT is going to fade away as a term. There will be much more focus on the intelligence—the way people use it, and the value they get out of exploiting the #data they’ve got.” –Martin Garner, COO and Head of IoT Research at @CCSInsight

We also saw, coming out of last year, much higher attention paid to sustainability. We really think clean cloud is going to be something of a battlefield this year. We also think that IoT can really benefit from using sustainability in its marketing. IoT is great news for sustainability, generally speaking, and we’re mostly not making enough use of that. We also think sustainability will be built into the specifications for 6G—when we get there.

And then there’s quite a lot around IoT itself. A much greater focus on software and machine learning—a shift toward higher intelligence of things. Also a much greater linkage between smart grid and wide-area networking. We actually expect to see pan-utilities—where one company is both an energy provider and a network provider—by 2025, because those two networks are becoming remarkably similar.

How will the role of cloud providers such as Amazon, Google, and Microsoft evolve going forward?

One area where they’re all pushing very hard is telecoms networks. And they’re doing more in the 5G world—especially as 5G moves from its current consumer phase more into an industrial phase. If you are, say, a global automotive manufacturer and you want a 5G private network in all of your manufacturing sites across the globe, who is best placed to provide that? Well, I don’t think it’s the local telco, because they’re not global enough. It’s more likely to be your big cloud provider. So we think they’re going to become a really key distribution channel for some of the telecoms products. And I think this is a good example of where the domain between what the cloud providers do and what the telecoms guys do is going to blur quite a lot over the coming years.

Where do on-prem cloud-like experiences fit into the landscape?

What we’re seeing now is that companies like Dell, HP, and other computing providers are offering cloud-like experiences, and—this is really important—they’re offering them under an as-a-service business model for on-premises computing. You don’t have to have the big capital costs in order to get started with quite a major computing program—you can do it all on OpEx. We’re also seeing the big cloud providers offering local cloud containers in on-premises devices—AWS Greengrass, Azure Stack, and so on—and they’re offering as-a-service hardware.

Our expectation is that on-premises will, if anything, make a bit of a comeback, and that will tend to slow the growth of public cloud. We also think that IoT is a really, really big part of this because of the strength of edge computing—the fact that we’re generating such a lot of data in industrial IoT systems, and the fact that we often need to act on that data really quickly. We can’t do everything just in the cloud; we need the on-premises side. And as IoT grows and grows and grows, we think that will enhance that trend back toward a stronger on-premises suite.

Where do you see technologies like AI, machine learning, and computer vision going?

There will be a huge focus on the intelligence, rather than on the IoT itself. What we see at the moment is that there’s a very strong focus on the tools for machine learning and AI—making it easier for ordinary engineers in ordinary companies around the world to choose algorithms, set them up for use, and build them into development. It’s actually really challenging for ordinary people to choose and use systems in this area, so we’re also expecting a lot more focus on providing finished systems for machine learning and AI. We may even increasingly see some of the finished AI bundled into things like market-ready solutions.

We are also expecting the role of smaller or specialist systems integrators to grow a lot here. They can take on a lot of the training and configuration for you, because it’s still true that the widgets that you make in your factory are not the same widgets that other people use, and you need to train the models on images of what you are doing.

There’s also a little caveat here. It’s a large task to get thousands and thousands of specialist systems integrators up to speed in this area. Maybe they originally trained as installers for surveillance systems, and they may not be very skilled in machine learning. One of our other predictions, left over from a couple of years ago, is that we will move over time toward much more distributed training, rather than centralized training. And then, having done that, you will need to trust it enough to run your operations off it.

What do you think will be the impact of making AI more trustworthy and democratized?

I think this is one of the most fascinating areas in the whole tech sector at the moment. But I want to sound just a little bit of a warning here. We think that AI is a special category of technology—small assumptions or biases introduced by a designer or an engineer at the design stage can cause huge difficulties in society. So we need more layers of support and regulation in place before we can all be comfortable that AI is being used appropriately and properly.

Another key aspect is the formation of ethics groups that are not tied to specific companies. I think we need to take away the commercial-profit focus, and instead focus purely on the ethics. It’s also clear that to build strong user trust we’re going to need a mix of other things, like external regulation. But we also then need industry best practices and standards, and we need sector-level certification of AI systems.

Then we need to certify the practitioners. There have got to be professional qualifications for people who develop AI algorithms. All these layers are being developed and introduced, but we’re just not there yet. So one prediction we have in this area is that 80% of large enterprises will formalize human oversight of their AI systems by 2024. There’s going to be a whole layer of quality control that we put in place with human oversight before we let it loose.

Tell me more about your prediction of the Internet of Things becoming the Intelligence of Things.

Actually very few people buy IoT. What they do is they buy a solution to a business issue. And somewhere inside that is IoT, which is used as a technology to make it work. The real value of IoT is not in the connection that you’ve created with the things, but in how you use the data that you now have access to. With computer vision on a production line, you don’t care much about the camera; you do care about what it’s telling you.

The trouble is, we are now generating so much data that we increasingly need lots of machine learning and AI to analyze it. And then it has to be done at the edge to do it really quickly, and so on. So getting the maximum value out of those systems is going to become all about the intelligence that can be applied to the data.

Monitoring something is useful, but you still need good analytics to help you focus on the right data and not get distracted. Controlling something is even more useful—you can make huge savings by controlling things better. And, finally, with suitable intelligence, you can now optimize a machine, a system, or a whole supply chain in ways you never could before.

We think IoT is going to fade away as a term. There will be much more focus on the intelligence—the way people use it, and the value they get out of exploiting the data they’ve got. Then you will need suitable systems for aggregating and analyzing the data: data lakes, analytics, digital twins, machine learning, AI, and so on. And many, many companies are already well down this path, but actually there’s still a lot to learn.

I think the ecosystem angle is a really important theme to bring out here. Very few companies can do this on their own. There’s also an interesting organizational point here for a lot of IoT suppliers. From what I can tell, most IoT suppliers are 80% engineers working on the product and 20% other—which includes HR, marketing, sales, and so on. I think it needs to be the other way around. They need to have big customer-engagement groups. If you’re in healthcare, you employ ex-nurses and ex-doctors—people who really understand what’s going on within the customer organizations, and who feed that back into the product.

Assuming you get all of that done, really a lot of the value you get comes from then applying it across the organization. And that’s a people issue more than a technology issue. It comes back to one of the truisms of digital transformation, which is that success depends on taking people with you more than on the technology that you’re using to make it all work.

Related Content

To learn more about the future of IoT, listen to IoT Predictions: What to Expect in 2022 and Beyond and read CCS Insight’s IoT predictions for 2022. For the latest innovations from CCS Insight, follow them on Twitter at @ccsinsight and on LinkedIn at CCS-Insight.

 

This article was edited by Christina Cardoza, Senior Editor for insight.tech.

Global SI Reimagines Security with AI and Computer Vision

“Culture is the foundation for our company,” says Eric Yunag, Vice President of Technology for Convergint, a global systems integrator. “Our top priority is service in every way—service to our customers, our colleagues, and our community.”

Founded 20 years ago, the service provider goes beyond simply offering security, fire, and life safety solutions, to dedicating time and resources for actions that help secure a better future. For example, Convergint participates in local service projects that are meaningful for their frontline businesses. They provide safety and security resources for underfunded organizations. And they give financial help to employees who are experiencing hardships through a colleague emergency fund launched in honor of one of its founders.

When it comes to serving their customers, Convergint works to stay accountable for their number-one objective: “We want to be our customers’ best service provider in any category across the board,” says Yunag.

Global Systems Integrators: Partners in Innovation

They accomplish this mission by leveraging artificial intelligence (AI), computer vision, and other IoT technologies to create transformational opportunities, moving the role of security beyond surveillance and into data collection that provides insights.

“AI, IoT, and computer vision are fundamentally changing the way the world operates,” says Yunag. “Convergint helps its customers assess opportunities and create a new technology roadmap that serves overall organizational objectives. These technologies can re-architect security, fire, and life safety systems that can drive better customer and employee experiences.”

Connecting a variety of different cameras and sensors, Convergint pieces together situational intelligence models that connect the physical world to the digital. And Intel® has been an important partner. “The vision that Intel® is casting around IoT and the intelligent edge is an important catalyst for a lot of conversations,” says Yunag. “At a practical level, the Intel® Solutions Marketplace is an important way for us to package and position solutions that solve customers’ problems.”

By leveraging #ConnectedIntelligence and reimagining how security #systems provide value, businesses can take advantage of opportunities that weren’t possible before. @Convergint via @insightdottech

IoT Security Solutions Transform Industries

Convergint’s customer list is a who’s who of Fortune 5000 companies in energy, transportation, retail, government, healthcare, and finance industries. For example, a manufacturing customer in the mining space operates expensive equipment that digs for a specific type of material in a water-based pit. Objects like tree stumps or rocks can damage the equipment. Convergint developed a model that identifies these objects and, when detected, shuts down the equipment before damage is done.

Convergint deployed cameras connected to business intelligence infrastructure at a food processing company that does high-volume production of packaged foods. A machine learning model that looks for defects was created based on the company’s specific requirements.

And brick-and-mortar retailers leverage computer vision to generate analytics, such as how many people come into the store, where they go, what they look at, and how long they dwell in certain areas. The technology can also determine if customers convert through a cash register or leave the store without purchasing.

“All of the metrics that can be collected for eCommerce can now be collected in the physical world,” says Yunag. “Bringing those types of metrics into the physical space gives brick-and-mortar retailers a competitive advantage, connecting the physical to the digital world.”

IoT Solutions Transform the Future of Security

Connecting the dots with edge intelligence can solve previously unsolved problems with new technologies. And as innovative technology continues to emerge, security systems of the future can be leveraged for revenue-generating activities and process automation.

Visual intelligence is a completely disruptive theme and powerful outcome driver for organizations. Anything a camera can capture can be turned into data that can drive those customer experiences and outcomes in very different ways.

The challenge for traditional security buyers is to break the mental paradigm, stop thinking of cameras as security devices, and start thinking about camera technology as a visual intelligence platform for their whole organization to leverage. The new approach opens the door to conversations with other areas within organizations, such as marketing or operations. By leveraging connected intelligence and reimagining how systems provide value, businesses can take advantage of opportunities that weren’t possible before.

“Security is going from being a profit-and-loss expense line item to being leveraged for revenue-generating or cost-reduction activities in other areas of the business,” says Yunag. “We think about the hundreds of billions of IoT devices that will be put into the world in the next 10 or 20 years and the impact that it’ll have on the world. It’s exciting to think about the opportunities ahead and the changes that start to unfold as those things come together.”

This article was edited by Georganne Benesch, Associate Content Director for insight.tech.

The On-Premises Performance and Reliability Guarantee

Contrary to popular belief, not everyone is rushing to the cloud. While the cloud does promise better flexibility, scalability, and speed, many organizations remain suspicious of its performance, availability, and security. Especially when every major cloud provider has suffered through multiple massive outages.

“Many companies have the mindset of ‘It’s my data, and I don’t want it on someone else’s computer,’” says Philip Elder, owner of MPECS, a hyperconverged infrastructure (HCI) provider. Because of this, he believes some workloads will never make their way to the cloud.

On-Premises Versus Cloud Computing

But questions around reliability are just one of the reasons companies decide to stay on-premises. Other factors include performance and costs. “The marketing has been very successful as far as the cloud is concerned; however, the reality is that public cloud gets very expensive fast when the workloads require performance,” Elder says.

For instance, Elder recently worked with a customer in the surveying and mapping industry that wanted to run everything in the cloud. But the amount of data they generated, stored, and processed was massive.

“We are talking about terabytes of data per minute. As soon as we get into a high-performance solution set where we need volume, the public cloud doesn’t make sense,” he explains. “After doing an assessment of their company, best practices, and business model, we found the cost to run their business in a Tier One cloud would be seven figures per month.”

The #cloud pendulum is swinging back toward #OnPrem solutions as customers realize they can achieve the same if not better benefits on-premises. @MPECSInc via @insightdottech

This is one of the reasons why Elder founded MPECS: to give businesses a choice to keep critical workloads out of the public cloud and run HCI-based solutions either fully on-premises or in a hybrid cloud approach.

The company focuses on small to medium-size businesses, providing HCI solutions that Elder says deliver the performance, reliability, and cost public clouds cannot match.

The survey and mapping customer was able to save hundreds of thousands of dollars using a hyperconverged cluster solution.

According to Elder, MPECS was one of the first companies to build a Hyper-V cluster on the Intel® Modular Server platform. This allowed the company to deliver resilient storage and networking. As the industry shifted to the cloud, MPECS stuck with the on-premises cluster approach, adding a hybrid option for customers that wanted some level of cloud capabilities.

The Promise of On-Premises

With MPECS, Elder says organizations get an on-premises guarantee that enables them to confidently keep their services online with no interruptions.

This capability was particularly important to one of MPECS’ nonprofit customers, which relies on donations for 100% of its revenue. The nonprofit organization often operates on deadlines, sending out time-sensitive emails to collect funds. An outage that would interrupt its operations is out of the question, Elder says: “So the preference in that case, where there’s a sensitivity to timing, is for on-premises solutions.”

Organizations that need to keep their data in-house span all industries, according to Elder. “Some companies want everything on-premises. They want everything to be hands-on,” he says. “The cost per hour for downtime or being locked out of a cloud service, for instance—for even a small accounting firm of 18 to 20 seats—can be in the thousands of dollars per hour.”

Elder thinks of MPECS solutions as an insurance policy that ensures servers keep working for the duration of their lifetime. For instance, one accounting customer has been running an MPECS solution that leverages Intel servers for more than six years, and the solution has run since Day One without a single interruption.

MPECS does perform regular maintenance to keep its solutions running, but unscheduled outages do not occur. “It’s like owning a car,” Elder says. “You have to change your oil, you have to change your tires and your brakes, and all the obvious maintenance that needs to be done.”

Hybrid Cloud Security You Can Trust

In addition to performance and reliability, MPECS solutions are designed with security in mind. The host infrastructure, whether it is a standalone Intel server or a Microsoft Storage Spaces Direct cluster, is separate from the production layer.

“The advantage we have with the on-premises model versus a public cloud or even a hybrid situation is we have control over the security,” Elder says. For example, if a user clicks a ransomware link, MPECS can isolate the machine and prevent the infection from spreading.
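
What that isolation looks like depends on the environment, but on a Hyper-V-based cluster it can be as simple as disconnecting the suspect virtual machine’s network adapters. The sketch below is illustrative only; the VM name is hypothetical, and the script assumes a Windows management host with the Hyper-V PowerShell module installed.

```python
# Illustrative sketch only: cutting a compromised Hyper-V VM off the network
# from a management host. The VM name is hypothetical.
import subprocess

def isolate_vm(vm_name: str) -> None:
    """Disconnect every virtual NIC on the named VM from its virtual switch."""
    subprocess.run(
        ["powershell", "-Command",
         f"Disconnect-VMNetworkAdapter -VMName '{vm_name}'"],
        check=True,
    )

if __name__ == "__main__":
    isolate_vm("FINANCE-WS-07")  # hypothetical workstation VM flagged by monitoring
```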

But even more important, Elder says, is user awareness. Since the company focuses on small to medium businesses, MPECS works with customers to ensure they understand the consequences of what they do and how to spot vulnerabilities. “The first rule of thumb of on-premises or cloud security is to train the human. That’s the first line of defense, and then the second line of defense is in the way the system is architected,” he says.

As Elder looks toward the future, he believes the cloud pendulum is swinging back toward on-premises solutions as customers realize they can achieve the same if not better benefits on-premises.

“I see that pendulum continuing as more and more people take an honest evaluation of their costs and realize that the little box in the corner and software licensing is all they really need to run their business at a much-reduced cost,” he says.

 

This article was edited by Christina Cardoza, Senior Editor for insight.tech.

Smart Factories Close Data Gaps Between OT and IT

Industry 4.0 has the potential to be a real game-changer. As manufacturers overcome data-related challenges, they become more agile and achieve considerable operational improvements. The promise of digital transformation in manufacturing lies in its ability to make data “talk” and deliver insights to act on. But challenges persist.

While data-driven manufacturing is not a particularly new idea, what is new is using data in context and applying learned lessons at scale. With Industry 4.0, data from new sources such as machines outfitted with IoT sensors adds to the holistic picture and delivers context.

Manufacturers can see the forest and the trees—and plan accordingly. They can relieve some of the many pressures they face, including meeting sustainability goals, working with a talent shortage, and a growing need for remote monitoring of production processes.

Closing the Gaps Across OT and IT

“Unfortunately, the lofty ambitions of digital transformation frequently come to a grinding halt when the rubber meets the road,” says Florian Hoenigschmid, Vice President of Strategy and Sales at azeti, an IoT platform provider.

For one thing, because machines have been phased in over decades, the data they generate is not always uniform. “You can’t apply a one-size-fits-all solution when shop floors have to work with a lot of different machine interfaces, different protocols, and different standards,” Hoenigschmid says. Heterogeneous data is a challenge because it is not easy to process.

Patchwork solutions proliferate. Soon, you start missing the trees in the forest.

Second, even if you wrangle all the data in one format, granularity is a problem. “Getting data at a certain quality and scale that is useful for applications like machine learning is another challenge,” Hoenigschmid says. Low-resolution fuzzy data paints an inaccurate picture.

Gaps in operational technology (OT) and information technology (IT) integration further complicate the process of deriving insights. “There’s the software piece, the network piece, how to get data while still following corporate security policies—plus the OT piece of working with controllers and machines on the shop floor,” Hoenigschmid says. Getting data from these silos to talk to one another and in one language is no easy feat.

The azeti #IoT platform connects heterogeneous protocols and #machines on the shop floor to a central #software platform—solving many headaches in one fell swoop. @azeti_gmbh via @insightdottech

Digital Transformation in Manufacturing

The data challenges might be frustrating but are not insurmountable.

Case in point: The azeti IoT platform connects heterogeneous protocols and machines on the shop floor to a central IoT software platform, solving many headaches in one fell swoop. “We are able to connect most of the machines which are already on the production line into our software stack. On the software side, we convert raw data in an understandable way and make that available to our customers,” Hoenigschmid says.

The industrial rugged computers that azeti uses to bridge data gaps all run on Intel® processor-based hardware. “Intel technology helps us actually do the job of connecting and talking to machines, understanding the different protocols, and converting them to something that makes sense for our platform,” Hoenigschmid says.
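
As a rough illustration of what converting raw data “in an understandable way” can involve (a generic sketch, not azeti’s actual code), the function below maps hypothetical payloads from two different machine interfaces into one common record format.

```python
# Illustrative sketch: normalizing readings that arrive in different shapes from
# different machine protocols into a single common schema. Names are hypothetical.
from datetime import datetime, timezone

def normalize(source: str, raw: dict) -> dict:
    """Map protocol-specific payloads to one {asset, metric, value, unit, ts} record."""
    if source == "modbus_press":          # hypothetical press reporting raw registers
        value, unit = raw["reg_40001"] / 10.0, "bar"
        metric = "hydraulic_pressure"
    elif source == "opcua_furnace":       # hypothetical furnace exposing an OPC UA node
        value, unit = raw["Temperature.Value"], "degC"
        metric = "zone1_temperature"
    else:
        raise ValueError(f"unknown source: {source}")
    return {
        "asset": raw.get("asset_id", source),
        "metric": metric,
        "value": value,
        "unit": unit,
        "ts": datetime.now(timezone.utc).isoformat(),
    }

print(normalize("modbus_press", {"asset_id": "press-12", "reg_40001": 873}))
```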

Real-World Industry 4.0 Examples

Beyond building a robust data foundation, azeti also helps clients with the insights part of the equation.

For example, a large metal producer faced the following challenge: how to best operate the ventilation systems depending on the varying conditions of the smelting process. Conventional SCADA and PLC systems are used to control the operational conditions of the industrial ventilators, but they fall short for maintenance planning. Progressive deteriorations are difficult to recognize, SCADA data must be manually evaluated by the personnel, and temporary anomalies are recognized with delay.

The azeti solution supported the manufacturer by means of a flexible platform for data collection, analysis, and visualization. After data acquisition, the relevant measurements are clustered over longer periods, while charts and a derived indicator are created on a dashboard. This gives maintenance personnel a tool to quantitatively compare deterioration effects, detect anomalies in real time, and easily identify persistent patterns for reliable maintenance planning.
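
The derived indicator and real-time anomaly detection described here can be approximated with simple statistics. The sketch below is illustrative only; the window size, threshold, and sample values are invented rather than taken from the azeti deployment.

```python
# Illustrative sketch of a derived deterioration indicator and a simple anomaly
# flag over a ventilation measurement (e.g., vibration). Values are made up.
from statistics import mean, stdev

def rolling_indicator(samples: list[float], window: int = 24) -> list[float]:
    """Average each trailing window so slow deterioration shows up as a drifting baseline."""
    return [mean(samples[max(0, i - window + 1): i + 1]) for i in range(len(samples))]

def anomalies(samples: list[float], sigma: float = 3.0) -> list[int]:
    """Return indexes of readings more than `sigma` standard deviations from the mean."""
    mu, sd = mean(samples), stdev(samples)
    return [i for i, x in enumerate(samples) if abs(x - mu) > sigma * sd]

vibration = [2.1, 2.2, 2.0, 2.3, 2.2, 9.8, 2.4, 2.5, 2.6, 2.7]  # one obvious spike
print(rolling_indicator(vibration, window=4))
print(anomalies(vibration))
```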

“The client now has a holistic and unified view of production-critical assets, which helps them plan resources more efficiently,” Hoenigschmid says. “This provides the foundation to transition from preventive to predictive maintenance at some point. The process has been kicked off but is not yet concluded. Improved productivity and maximizing asset use are the welcome outcomes of the solution.”

The IoT platform also helps metal companies use resources more efficiently. When workers fill dosing furnaces with metal to start production, many eyeball the amount of material to the fill line and miss the mark. Furnaces end up wasting energy.

The platform connects to sensors and controllers that are part of the furnace so workers can see the fill volume and know when to kick-start the melting process. “Maximizing utilization of these furnaces and using resources like gas efficiently can drive up efficiency,” Hoenigschmid says. “And that can add millions of dollars to the bottom line.”

Ushering in a Data Revolution

Digital transformation in manufacturing can unlock a range of additional possibilities. It can drive digital twins that simulate production processes so manufacturers can use resource-efficient methods and meet sustainability goals.

Product quality assessments can be fine-tuned. Data itself becomes valuable currency that can be traded as a service, enabling companies to rent their facilities to specialty manufacturers.

The digitalization of the shop floor may also enable manufacturers to negotiate lower insurance rates, Hoenigschmid foresees. He is excited about the future of smart manufacturing: “We can forecast the future in a very precise way, and that is possible with very good and lots of granular data.”

In turn, manufacturers will finally work with the right toolkits in their digital transformation arsenal.

 

This article was edited by Georganne Benesch, Associate Content Director for insight.tech.

Inside the Latest Intel® Processors with ASRock Industrial

Kenny Chang

[podcast player]

Calling a CPU “revolutionary” is a big claim—but the 12th Generation Intel® Core processors have a lot of features to back up that boast. From an all-new hybrid architecture to dramatically better graphics, the latest Intel® processors can be used for high-performance AI, workload consolidation, and so much more.

Join us as we explore the most exciting new IoT features, why they matter, and how industries are already leveraging the new processors, in this podcast episode with ASRock Industrial.

Our Guest: ASRock Industrial

Our guest this episode is Kenny Chang, Vice President of System Product BU at ASRock Industrial, a leading industrial computer provider. Kenny has a wide range of experience covering server, edge AIoT, embedded computer, hardware and software technologies, as well as leadership positions in product and engineering management. Before joining ASRock Industrial, he was the Vice President of Product Development at AEPX Global, and Director of IoT Business Development at Compal.

Kenny answers our questions about:

  • (1:52) The most exciting features of the 12th Generation Intel® Core processors
  • (3:19) Why this release has the potential to revolutionize IoT applications
  • (10:13) How companies can benefit from the GPU upgrade
  • (13:32) The software capabilities of the new core processors
  • (16:07) How ASRock is helping customers quickly take advantage of the new features
  • (20:13) The power of Intel® to deliver and support development efforts
  • (22:25) How companies are already using the latest Intel® Core processors
  • (24:35) The importance of the new hardware for security features

Related Content

To learn more about the 12th Generation Intel® Core Desktop and Mobile processors, read CES 2022: Intel® Launches Revolutionary CPU Architecture. For the latest innovations from ASRock Industrial, follow them on LinkedIn at ASRock-Industrial.

 

This podcast was edited by Christina Cardoza, Senior Editor for insight.tech.

 

Apple Podcasts  Spotify  Google Podcasts  

Transcript

Kenton Williston: Welcome to the IoT Chat, where we explore the trends that matter for consultants, systems integrators, and enterprises.  I’m Kenton Williston, the Editor-in-Chief of insight.tech. Every episode, we talk to a leading expert about the latest developments in the Internet of Things. Today, I’m discussing the new 12th Gen Intel® Core processors with Kenny Chang, Vice President of the System Product Business Unit at ASRock Industrial.

The new CPUs, formerly code-named Alder Lake, were just announced at CES 2022. They pack a ton of cool features, like an all-new hybrid architecture, massively upgraded GPUs, and real-time capabilities. As one of the first companies to come to market with the new chips, ASRock has unique insights on these processors. So I’m really looking forward to hearing Kenny’s thoughts.

So with that, Kenny, I would like to welcome you to the podcast.

Kenny Chang: Hi, thank you for having me and it’s my honor to join the podcast.

Kenton Williston: Yeah, absolutely. I’m curious about your career. What did you do before joining ASRock?

Kenny Chang: I was in charge of the product-development division as the vice president, and our products were mainly medical devices for what we call IoM, the Internet of Medical. The major equipment we were developing was the flat panel detector used in X-ray systems. That’s what I did before I joined ASRock Industrial.

Kenton Williston: So as I was just saying, there are so very many new features in the latest Intel core processors. So there are a lot of things we could spend our time talking about, but the first thing I would like to know is what features you are most excited about in the latest Intel core processors and why?

Kenny Chang: I think the most amazing feature is the hybrid architecture, combining the Performance-cores as well as the Efficient-cores. That gives us a very flexible way to manage workloads. Especially with software-defined everything, we can adjust which core does what kind of job. And I think this is the major benefit when we adopt the Alder Lake-S processor into our products.

Kenton Williston: Yeah, that totally makes sense. And I should mention too that this podcast as well as the overall insight.tech program are produced by Intel, so of course, we have very specific reasons to want to talk about the latest Intel technologies. But having said that I agree the hybrid architecture is very interesting. This is something that’s become more popular in a variety of CPU designs and, just like you said, having both high performance and good efficiency is a really amazing combination.

You can combine these two things together and not have to give up low power to achieve high performance or vice versa. That’s very, very helpful. One of the things that Intel has been saying about these new processors is that they are revolutionary, which is, of course, a very strong claim and I’m wondering how you think they might revolutionize IoT applications.

Kenny Chang: IoT applications are very diversified, spanning vertical applications such as automation, automotive, smart city, energy, or even smart retail. We have so many software applications on the end products. We are moving to microservices enabled by containers. That means there are many containers running simultaneously on one edge platform. The edge is especially important. There is a major migration from the centralized data center to decentralized computing, which means we have edge computing running on local sites. That can reduce the latency between the cloud and the devices. So the edge is the best solution for this kind of situation.

Back to the 12th Generation Alder Lake processor: we can embed this powerful processor into edge devices. With that, we need not only a powerful but also a flexible architecture to deal with all the microservices or tasks. That’s the first thing I would like to highlight. The second is that we run lots of mission-critical workloads on the edge. Real time is really a must for every operation, and the 12th Generation Alder Lake-S processor features real-time control through TSN and TCC.

So that’s a good feature that gives us more confidence that all tasks will occur in synchronization, in real time. Those are the two major benefits for IoT applications.

Kenton Williston: Yeah, I think those are all really good points. So, let’s see if I can summarize those and add hopefully something useful on top of it. The point you made about microservices and containers, I think, is a very good one. It’s very reflective of a big change in how edge computing is done. I hate to date myself like this, but I’ve been working in this space now for, let’s see, I guess it’s going to be 22 years this year. When I began my career, the things that you would find in what we’d now call edge computing, which back then would be called embedded computing, were very specialized code. You had to have very specific knowledge to write for these devices. And now people are looking more and more to use the same kind of coding practices that you would find people using in the cloud.

And I think this is a good change because one thing that’s very good about it is it gives you so much more flexibility. Because it is important, I think, to move some things out of the cloud to the edge, but of course there’s always this back and forth. Sometimes things need to be decentralized, sometimes they need to be centralized. And I really like the way that with modern application development processes, you gain a lot of flexibility. The microservice can just run wherever it makes the most sense. But of course, in order for that to happen, you need to have a platform that will support running these microservices. And I think the Alder Lake platform is a very good one for that.

Kenny Chang: Yeah, exactly.

Kenton Williston: And then you mentioned the importance of the hybrid architecture, and one of the things that’s important here is you can configure your system in such a way so that the less performance hungry microservices, and just in general the less performance hungry tasks, can run on the efficient cores, so that if you don’t actually need the high performance cores at the moment, you’re running very efficiently, very low power, which is important for all kinds of reasons. Obviously, if you have something that is battery powered, it’s very important not to draw too much power, but even if you have something that is plugged into the wall, in many situations, it’s very important not to have too high of thermals because then you start having much more complicated systems that are less reliable and have more moving parts and all these sort of things.

And of course, in addition to this overall trend of edge equipment looking more and more, at least from a software perspective, like what’s running in the cloud, there’s been this very strong trend toward IT/OT convergence. And so a lot of business workloads increasingly overlap with IoT devices. And so it’s very useful to have a platform that can run different business services as well as edge computation. So there are all kinds of use cases for this, where you might want to combine things in a lot of new and interesting ways.

And one thing in particular: you were talking about the importance of moving things out of the cloud to the edge for the purposes of minimizing bandwidth utilization and latency. And similarly for the real-time computing capabilities, which I believe make these the first Core processors to offer that feature. These are both important capabilities in many applications, but one of the things that’s been growing really quickly has been AI applications, which can potentially be very, very data hungry. And I think this is a perfect example of where the computing really needs to happen at the edge. Would you agree with that?

Kenny Chang: Yeah, sure. Absolutely.

Kenton Williston: And that leads me to another thing I wanted to get your opinion on: the GPU. It’s very, very heavily upgraded. And of course, when Intel announced these new parts, the main thing they showed off was how you could play the latest and greatest games on these processors, which is maybe not the most relevant thing for industrial and healthcare applications. But there is a lot of relevance here that might surprise people, because you can actually use these GPUs to accelerate AI workloads quite a bit. Is that something that you’re seeing as an important way to use these processors?

Kenny Chang: As you mentioned, AI is the mega trend. And right now we can see the Alder Lake-S has a big improvement in GPU performance. As I understand it, they have 1.94x faster graphics performance, and up to 2.81x faster GPU image classification performance, compared to the previous generation. The great benefit for us is we can eliminate the additional GPU card in our box. That’s good for us, especially in industrial use cases, because it reduces a lot of maintenance cost. And we don’t have to sacrifice any performance, either.

Kenton Williston: Yeah, I think that’s all very true. And I think there’s an awful lot of benefit to be had from being able to execute image classification and other AI workloads directly on the CPU. You get a less complex, easier-to-maintain, lower-cost system if you don’t have to add a graphics card. And that, I think, has always been true. If you can avoid adding more parts, it’s always better, but especially at the moment, graphics cards are very hard to obtain. And of course they’re very expensive when you can get a hold of them. So, I think it’s really nice to have a platform where you can get a tremendous amount of performance out of the GPU right out of the box without having to add any cards. Are there any other benefits to the GPU upgrade? Like I said, your customers are probably not too worried about gaming, but the GPUs are still a major upgrade, and I’m wondering if you’re seeing any other use cases beyond things like image classification.

Kenny Chang: Another case is factory automation. We have AOI integrated with AI capability to enhance the capability and the productivity of defect inspection without a GPU card. The systems also have a more compact size, which can be put into the enclosure. That’s the other key feature for us: to have the same performance, but with a more compact size integrated into the production line.

Kenton Williston: Yeah. That makes sense. And you’re making a very good point that, just the size of the solution by itself can be very important. There are a lot of applications, such as smart city applications, where you might need to squeeze the equipment into an existing space that’s quite limited, or on a manufacturing line where it’s already crowded, any space savings you can get will be very beneficial. That is a very good point. And I’m glad you brought it up. One thing I’m wondering, beyond the hardware attributes, of course, you need to be able to program these things. And I’m wondering from a software perspective, how your customers can best take advantage of all of these new features?

Kenny Chang: Yeah. With the hybrid architecture, some heavy workloads need a hugely powerful processor to deal with them. They can be assigned to the P-cores, what we call the Performance-cores. But some background tasks, such as management tasks, don’t need such a powerful processor. They can use the Efficient-cores to take those jobs. This also means developers can easily assign which task runs on which core. So I think that’s the major benefit for the software development side: to leverage the technology here.
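
To make the task-to-core idea concrete, here is a minimal, Linux-only sketch using the standard CPU-affinity call. Which core IDs are Performance-cores versus Efficient-cores varies by SKU and firmware, so the ID sets below are placeholders; in practice many deployments simply let the OS scheduler and Intel Thread Director place threads.

```python
# Illustrative, Linux-only sketch of pinning work to specific cores.
# The core ID sets are placeholders, not a real topology.
import os

P_CORES = {0, 1, 2, 3}   # hypothetical Performance-core IDs
E_CORES = {4, 5, 6, 7}   # hypothetical Efficient-core IDs

def run_on(cores: set[int], task) -> None:
    """Restrict the current process to `cores`, then run the task."""
    os.sched_setaffinity(0, cores)   # 0 = the calling process
    task()

def heavy_inference():
    print("inference running on cores", os.sched_getaffinity(0))

def background_housekeeping():
    print("housekeeping running on cores", os.sched_getaffinity(0))

if __name__ == "__main__":
    run_on(P_CORES, heavy_inference)
    run_on(E_CORES, background_housekeeping)
```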

Kenton Williston: And are your customers using things like Intel® oneAPI to take advantage of this hardware?

Kenny Chang: They are mostly using OpenVINO for AI inference tasks. Intel also has oneAPI. I think it is good for them to get the APIs they want on one platform. I hear from them that this platform takes a lot of the work off their plates.
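
For readers who want to experiment with the OpenVINO flow Kenny describes, here is a minimal sketch in the 2022.x-style Python API. It assumes a model already converted to OpenVINO’s IR format with a static input shape; the model path and the random input are placeholders for a real preprocessed camera frame.

```python
# Minimal OpenVINO sketch (2022.x-style API) of running inference on the
# integrated GPU. The model path and input array are placeholders.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model/defect_classifier.xml")     # hypothetical IR model
compiled = core.compile_model(model, device_name="GPU")    # "CPU" also works unchanged

input_layer = compiled.input(0)
dims = tuple(int(d) for d in input_layer.shape)            # assumes a static input shape
frame = np.random.rand(*dims).astype(np.float32)           # stand-in for a real frame
result = compiled([frame])[compiled.output(0)]
print("top class:", int(np.argmax(result)))
```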

Kenton Williston: That’s really great. And I’m glad you mentioned the OpenVINO toolkit. This is a platform that provides a layer of software abstraction so that you can create and implement different kinds of AI algorithms without having to know all the fine details of the architecture. And it’s very useful when Intel does things like this latest generation of Intel Core processors, which has a very much faster GPU. You don’t necessarily have to worry about rewriting your code. You just get a performance boost, which is very, very helpful. So, I’m wondering in practice how some of this is playing out. I understand you worked with a company called DMS, and you mentioned one of your major lines of business is automated optical inspection. And I understand you did some work with DMS to use the 12th Gen Intel® Core processors. Can you tell me about some of the challenges that company was facing and how they benefited from using Alder Lake?

Kenny Chang: When we first engaged with this customer, they had introduced AI into their AOI to enhance accuracy and efficiency. They did a really good job compared to AOI without AI. But they also encountered challenges around data transmission: moving data between computer A and computer B took too long. As you know, image sizes are very big, a few megabytes per image. We saw this challenge and heard the pain points from their viewpoint. We thought the Alder Lake-S processor would ease their headache a lot, so we made a proposal to them: we would build a workload-consolidation solution. That means we integrate computer A and computer B into one platform. Most important, we use virtualization, with KVM combining the Windows OS and the Linux OS on one hardware platform. And we address the data transmission with shared-memory technology. That makes it 100 times faster compared to the previous setup.
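
The shared-memory approach Kenny describes can be illustrated with a small, generic sketch (not ASRock’s actual middleware): two local processes stand in for the consolidated workloads and exchange an image frame through shared memory instead of copying it over a network link. The frame size and names are placeholders.

```python
# Conceptual sketch only: passing an image frame between two local processes
# through shared memory instead of a network copy. Sizes and names are placeholders.
import numpy as np
from multiprocessing import Process, shared_memory

FRAME_SHAPE = (1080, 1920, 3)   # one hypothetical full-HD BGR frame
FRAME_DTYPE = np.uint8

def consumer(shm_name: str) -> None:
    shm = shared_memory.SharedMemory(name=shm_name)
    frame = np.ndarray(FRAME_SHAPE, dtype=FRAME_DTYPE, buffer=shm.buf)
    print("consumer sees mean pixel value:", frame.mean())  # e.g., hand off to inference
    shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=int(np.prod(FRAME_SHAPE)))
    frame = np.ndarray(FRAME_SHAPE, dtype=FRAME_DTYPE, buffer=shm.buf)
    frame[:] = 128                                           # "capture" a dummy frame
    p = Process(target=consumer, args=(shm.name,))
    p.start(); p.join()
    shm.close(); shm.unlink()
```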

Kenton Williston: Yeah. And I think this is a really good example. I really appreciate you sharing that with us, Kenny. I think many, many factories and other industrial use cases are in a similar situation right now. AI is the big mega trend. It’s being deployed in all kinds of use cases, but it is very difficult, if you’ve got some existing equipment, to just keep adding equipment to perform AI, because, like you said, many times the AI is processing a huge amount of data, whether that’s images or other high-bandwidth sensor data. So sometimes it’s really just the network that is the constraint, even just the local network, never mind going to the cloud, that would prevent you from adding AI to your system. So the fact that you have a very IT-friendly, standards-based platform helps. It can run Windows, it can run Linux, you can have all kinds of different virtualized environments. And you can bring things together on a single platform, making it much easier to add these new capabilities, because you don’t have to have an old-machine-A, new-machine-B scenario.

Like you said, you can just run everything on one machine: port your existing software to the new box, and then start adding all the new things you want. And in addition to consolidating those workloads, you now have a platform that is very well suited to consolidating other kinds of workloads that you might have in nearby machines, or taking things that are currently running in the cloud and bringing them out to the edge.

You have a lot of options there. So, I’m curious, I know that you’re one of the first companies to come to market with a solution for the industrial market based on the 12th Gen Intel® Core processors, how is that possible and how are you working with Intel to deliver these solutions to market quickly and to not just bring in early solutions to your customers, but really the most advanced kind of solutions?

Kenny Chang: We have had a partnership with Intel for a very, very long time. ASRock Industrial is a leading company for industrial motherboards and systems, especially for industrial applications. As you mentioned earlier, it is very good for us to get early samples through the early-access program with Intel. We also receive lots of performance and technology updates, such as tools and architectures that help us address vertical markets, such as Edge Insights for Industrial and Edge Controls for Industrial. These bring more insight into how to address customer needs and how to help our customers, especially systems integrators, reduce their development time.

They just focus on what they are good at. They don’t need to waste time to deal with the hardware and software integration. So they just put their application stuff onto the box and that’s done. So they save lots of development and working time from there, and they can get the quick-win solution.

Kenton Williston: So, Kenny, I’m interested: you’re talking about how your close relationship with Intel, the early-access program, and access to their roadmap really help you build application-ready boxes. Can you give me an example of what some of the features might be for some of the solutions you’ve got available now using the latest Intel Core processors?

Kenny Chang: Well, that’s a good question. Initially, the feature was workload consolidation. More precisely, we can say it is the middleware. What we did for customers, just like the AI AOI case I mentioned before, is put the virtualization middleware, KVM, onto our hardware box. We know how to enable the shared memory and then expose it as an API for our customer. So if a customer would like to use this solution, they can just buy our box with that software installed, open the box, put their software application on it, and it’s up and running quickly.

Kenton Williston: Yeah, that’s really interesting. So, it sounds like what you’re telling me is ASRock goes beyond simply putting the hardware together and sending it to your customers; you actually offer a certain level of services to provide the appropriate setup and middleware, so that when the system arrives at the customer, it’s ready for them to start putting their software on it. They don’t have to think about those things. Do I have the right idea there?

Kenny Chang: I think it’s at a very beginning stage. It’s an optional item. If a customer needs this, we can do such a service for them. Right now we will take some time to educate our customers on the idea before they adopt the solution.

Kenton Williston: Got it. That makes sense. Speaking of education, I wanted to touch on something we haven’t talked about yet. Some of the other new features of the platform are new hardware security features. Do you think these will be important for your customers as well?

Kenny Chang: Well, yes, it’s very, very important. Cybersecurity is the hot topic all over the world. In most cases attacks happen in IT, but right now, as we introduce the industrial IoT into industrial automation, there are lots of OT devices, and they are very vulnerable.

Let me bring up one case I have. The case is about a 5G smart pole for a smart city. The smart pole is integrated with a lot of devices: not only street lighting, but also sensors for air quality. It also has smart cameras to monitor traffic across the city. We provide our Alder Lake-S platform solution in the smart pole, working as the edge server.

All the camera data goes into the edge server for image classification or other rule-based checking. But all this data is very sensitive. The benefit of adopting the Intel processor is that it integrates Intel Software Guard Extensions, what we call SGX, as well as PTT technology. So the data can be secured at the hardware level. And system integrators can leverage this, putting their software on top and ensuring security is in place.

Kenton Williston: That all makes total sense. You’re raising some good points about Intel Software Guard Extensions, or SGX, and the PTT features that are in there as well. It’s very important. Like you said, many more systems are coming under attack, and earlier I was mentioning how it’s a good thing that IT and OT are converging, and that the systems that used to have very specialized code are now much more often becoming, by necessity, more IT-friendly systems that run very familiar operating systems.

This is a good thing because it helps innovation move forward more quickly, but also makes these systems more open to attack than they used to be. I like this example you gave of a smart pole where you might have lighting and cameras and other sensors, and it’s very useful to have real-time visibility into what’s happening in the city, whether it’s air pollution or traffic levels or whatever.

But of course, especially with cameras, there’s always a risk that people could get access to video feeds that they really shouldn’t have access to. And so it is very important to keep in mind security, and of course just the very fact you’re talking about something that’s connected back to government systems.

There can be all sorts of very subtle ways, once you’ve gotten into a system, of getting into very sensitive data. There have been cases of people showing how you can access, say, a printer, and then with just a couple of jumps you’re into a very sensitive database. So, security is very, very important, and I’m glad we had a chance to talk about that. We’ve covered a lot of different topics. I’m wondering if there is anything that I didn’t ask you about that you would like to add.

Kenny Chang: I just want to add that ASRock Industrial is not only a hardware provider. We also think about and work with Intel verticals to help customers get a better solution. That’s our goal for our end customers. And by co-creating in this way, we also help make the world much better than ever.

Kenton Williston: I love that. That’s great. Well, Kenny, I want to thank you again for joining us today. I really appreciate your time.

Kenny Chang: Yeah. Thank you.

Kenton Williston: And thanks to our listeners for joining us. To keep up with the latest from ASRock Industrial, follow them on LinkedIn at ASRock-Industrial, that’s ASRock-Industrial.

If you enjoyed listening, please support us by subscribing and rating us on your favorite podcast app. This has been the IoT Chat. We’ll be back next time, with more ideas from industry leaders at the forefront of IoT design.

Democratizing AI: It’s Not Just for Big Tech Anymore

AI and computer vision have become commonplace in manufacturing, where if you can see the data, there’s something you can do with it. But not every industry has that mindset, or the luxury of data scientists on-site. That shouldn’t stop the many new and exciting use cases for AI—from medicine to traffic to agriculture—from taking advantage of these tools.

We talk with Elizabeth Spears, Co-Founder and Chief Product Officer at Plainsight (formerly known as Sixgill), a machine learning lifecycle management provider for AIoT platforms, and with Bridget Martin, Director of Industrial AI and Analytics of the Internet of Things Group at Intel®, about the accessibility and democratization of AI, and how these factors are key to getting the most out of this crucial technology—for companies and, ultimately, for consumers.

Where do things stand right now in terms of new applications in the manufacturing space?

Bridget Martin: There are two different perspectives. Some manufacturers we would consider more mature—where there are automated compute machines already on the factory floor, or in individual processes on the manufacturing floor. They are automating processes, but also—and this is critical when we’re talking about AI—outputting data. And that could be the metadata of the sensors, or of the processes that that automated tool is performing.

These manufacturers are really looking to take advantage of the data that’s already being generated in order to predict and avoid unplanned downtime for those automated tools. This is where we’re seeing an increase in predictive maintenance–type applications and usages.

But then you also have a significant portion of the world that is still doing a lot of manual manufacturing applications. And those less mature markets want to skip some of the automation phases by leveraging computer vision—deploying cameras to identify opportunities to improve their overall factory production, as well as the workflow of the widgets going through the supply chain within their factories.

Can you talk about some new applications making use of this technology?

Elizabeth Spears: A really cool one, which is just becoming possible, is super resolution. One of the places where they’re researching its application is in using less radiation in CT scans. Think of one of those FBI investigation movies where they’re looking for a suspect, and there’s some grainy image of a license plate or a person’s face. And the investigator says, “Enhance that image.” All of a sudden it becomes this sharp image, and they know who did the crime. That technology really does exist now.

Another example is in simulating environments for training purposes in cases where the data itself is hard to get. Think about car crashes, or gun detection. In those cases, you want your models to be really accurate, but it’s hard to get data to train your models with. So just like in a video game, where you have a simulated environment, you can do the same thing to create data. Companies like Tesla are using this for crash detection.

It’s really across industries, and there’s so much low-hanging fruit where you can really build on quick wins. My favorite cases around computer vision are the really practical ones. And they can be small cases, but they provide really high value.

One that we’ve worked on is just counting cattle accurately—and that represents tens of millions of dollars in savings for that company.

How can organizations recognize their AI use cases and leverage computer vision?

Elizabeth Spears: I feel like we often talk about AI as though an organization has to go through a huge transformation to take advantage of it, and it has to be this gigantic investment of time and money. But what we find is that when you can implement solutions in weeks, you can get these quick wins. And that is really what starts to build value.

For us it’s really about expanding AI through accessibility—AI isn’t just for the top-five largest companies in the world. And we want to make it accessible not just through simplified tools but also simplified best practices. When you can bake some of those best practices into the platform itself, companies can have a lot more confidence using the technology. We do a lot of education in our conversations with customers, and we talk to a lot of different departments; we’re not just talking to data scientists. We like to really dig into what our customers need, and then be able to talk through how the technology can be applied.

“For us it’s really about expanding #AI through #accessibility—AI isn’t just for the top-five largest companies in the world.” – Elizabeth Spears, Co-Founder and Chief Product Officer @plainsightAI via @insightdottech

Hiring machine learning and data science talent is really difficult right now. And even if you do have those big teams, building out an end-to-end platform to be able to build these models, train them, monitor them, deploy them, keep them up to date, and provide the continuous training that many of these models require to stay accurate—that all requires a lot of different types of engineers.

So, it’s a huge undertaking—if you don’t have a tool for it. That’s why we built this platform end-to-end, so that it would be more accessible and simpler for organizations to be able to just adopt it.

What are some of the challenges to democratizing AI and what is Intel® doing to address those?

Bridget Martin: Complexity is absolutely the biggest barrier to adoption. As Elizabeth mentioned, data scientists are few and far between, and they’re extremely expensive in most cases. This concept of democratizing AI and enabling, say, the farmers themselves to create these AI-training pipelines and models, and to deploy, retrain, and keep them up to date—that’s going to be the holy grail for this technology.

We’re talking about really putting these tools in the hands of subject-matter experts. It gets us out of the old cycle—take a quality-inspection use case—where you have a factory operator who would typically be manually inspecting each of the parts going through the system. When you automate that type of scenario, typically that factory operator needs to be in constant communication with the data scientist who is developing the model so that the data scientist can ensure that the data they’re using to train their model is labeled correctly.

Now, what if you’re able to remove multiple steps from that process and enable that factory operator or that subject-matter expert to label that data themselves—give them the ability to create a training pipeline themselves. It sounds like a crazy idea—enabling non–data scientists to have that function—but that’s exactly the kind of tooling that we need in order to actually properly democratize AI.
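
One hedged illustration of how low that barrier can be: a subject-matter expert who sorts inspection images into one folder per class has effectively labeled a dataset. The sketch below, with hypothetical paths, turns that folder convention into a manifest that a training pipeline or no-code platform could consume.

```python
# Illustrative sketch: folder-per-class labeling by a subject-matter expert.
# Directory layout (hypothetical): inspection_images/ok/, inspection_images/scratch/, ...
from pathlib import Path
import csv

DATASET_ROOT = Path("inspection_images")   # hypothetical path

def build_manifest(root: Path, out_csv: str = "labels.csv") -> int:
    """Write (image, label) rows derived from the folder names; return the row count."""
    rows = [(str(img), class_dir.name)
            for class_dir in sorted(root.iterdir()) if class_dir.is_dir()
            for img in sorted(class_dir.glob("*.jpg"))]
    with open(out_csv, "w", newline="") as f:
        csv.writer(f).writerows([("image", "label"), *rows])
    return len(rows)

if __name__ == "__main__":
    print("labeled examples:", build_manifest(DATASET_ROOT))
```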

Because when you start to put these tools in people’s hands, and they start to think of new, creative ways to apply those tools to build new things—that’s when we’re really going to see a significant explosion of AI technologies. We’re going to start to see use cases that I, or Elizabeth, or the plethora of data scientists out there, have never thought about before.

Intel is doing a multitude of things in this space to enable deployment into unique scenarios and to lower the complexity. For example, with Intel® Edge Insights for Industrial we help stitch together an end-to-end pipeline as well as provide a blueprint for how users can create these solutions. We also have configuration-deployment tools to help system integrators install technology. For example, if an SI is installing a camera, our tools can help determine the best resolution and lighting. All these factors have a great impact on the ability to train and deploy AI pipelines and models.

How can organizations go about starting this journey?

Elizabeth Spears: There are so many great resources on the internet now—courses and webinars and things like that. There’s a whole learning section on the Plainsight website, and we do a lot of “intro to computer vision” events for beginners.

But we also have events for experts—where they can find out how to use the platform, and how to speed up their process and have more reliable deployments. We really like being partners with our customers. So, we research what they’re working on, and we find other products that might apply as well. We like taking them from idea all the way to a solution that’s production ready and really works for their organization.

How is Intel working through its ecosystem to enable its partners, end users, and customers?

Bridget Martin: One of my favorite ways of approaching this is to really partner with that end customer to understand what they’re ultimately trying to achieve, and then work backward. Also, one of the great things about AI is that you don’t have to take down your entire manufacturing process in order to start playing with it. It’s relatively easy to deploy a camera and some lighting and point it at a tool or a process. And so that is really going to be one of the best ways to get started.

And of course, we have all kinds of ecosystem partners and players that we can recommend to the end customers—partners who really specialize in the different areas that the customer is either wanting to get to, or that they’re experiencing some pain points in.

How does Plainsight address scalability, and how does Intel help make an impact here?

Elizabeth Spears: We look at scale from the start, because our customers have big use cases with a lot of data. But another way you can look at it is to scale through the organization, which really comes back to educating more people. We’ll talk to a specific department within a company, and someone will say, “I have a colleague in this other department that has a different problem. Would it work for that?”

Concerning Intel—because we’re a software solution, Intel’s hardware is definitely one of the places that we utilize them. But they’re also really amazing with their partners—bringing partners together to give enterprises great solutions. 

What do you both see as some of the most exciting emerging opportunities for computer vision?

Bridget Martin: One, I would say, is actually that concept of scalability. Not just scaling to different use cases, but also scaling to different hardware—there’s no realistic scenario where there is just one type of compute device involved. I think that’s going to be extremely influential, and really help transform the different industries that are going to be leveraging AI.

But what’s really exciting is this move toward democratization of AI—really enabling people who don’t necessarily have a PhD or specialized education in AI or machine learning to take advantage of that technology.

Elizabeth Spears: I agree. Getting accessible tools into the hands of subject-matter experts and end users, making it really simple to implement solutions quickly, and then being able to expand on that. It’s less about really big AI transformations, and more about identifying all of these smaller use cases or building blocks that you can start doing really quickly, that over time make a really big difference in a business.

Related Content

To learn more about the future of democratizing AI, listen to Democratizing AI for All with Plainsight and Intel® and read Build ML Models with a No-Code Platform. For the latest innovations from Plainsight, follow them on Twitter at @PlainsightAI and on LinkedIn at Plainsight.

 

This article was edited by Christina Cardoza, Senior Editor for insight.tech.

CES 2022: Intel® Launches Revolutionary CPU Architecture

Intel® made big news at CES 2022 with the launch of their 12th Gen Intel® Core Desktop and Mobile processors, formerly known as “Alder Lake”.

What makes these chipsets groundbreaking compared to previous-generation processors? First, increasing workload diversity—driven by the demands of IoT, AI, and visual edge computing—means we need enabling technologies that are more flexible. And this new reality calls for a brand-new approach to processing.

The 12th Gen Intel® Core processors introduce a hybrid core architecture across Desktop and Mobile SKUs for the first time in x86 history. Here’s what you need to know:

New Hybrid Core Architecture: The Benefits Are in the Benchmarks

The new hybrid architecture is built with both Performance- and Efficient-cores, combined to deliver Intel’s biggest desktop performance gains in more than a decade—without demanding additional power. Want proof? Check out these benchmarks in Figure 1.

Figure 1. 12th Gen Intel® Core™ desktop processors’ performance compared to 10th Gen Intel® Core processors, the previous generation in this series for IoT. For workloads and configurations, visit intel.com/PerformanceIndex. Results may vary. (Source: Intel®)

The new processors give retail, healthcare, digital signage, industrial automation, and other edge system designers unprecedented platform control, allowing them to transition seamlessly between top-line productivity and resourceful task completion.

With as many as eight cores of each type—supporting multiple execution threads—developers can consolidate multiple workloads on a single device. For example, a modern POS system could analyze video and run price checks using object recognition algorithms on Performance-cores while the Efficient-cores simultaneously read barcode scans, tally receipts, and accept payment.

Check out this video highlighting all the goodness of the 12th Gen Intel Core processors, from performance to graphics—from media to AI.

Whether you’re designing a video wall, test equipment, #medical imaging system, or a #MachineVision solution, 12th Gen Intel® Core #processors building blocks are now available from @IntelTech partners. via @insightdottech

Digital Signage Struts Its Stuff

Want more proof? A remarkable Intel IoT Video Wall Solution Demo consists of four ViewSonic VP3268A-4K LCD displays. Behind the scenes is a media player powered by a 12th Gen Intel® Core i9 Desktop Processor Reference Validation Platform (RVP).

In a demo at CES, the 12th Gen Intel Core Desktop processors’ four video outputs are synchronized into a continuous large image of a video playlist that spans the four-screen display at 4K resolution. But there’s a lot more going on beneath the surface. New IoT features like Genlock and Pipelock will be key components of the video wall designs of the future.

And although they’re obviously tailored to digital signage, these features represent just a fraction of the capabilities introduced on 12th Gen Intel Core processors.

Available Now for the IoT Edge

Whether you’re designing a video wall, test equipment, medical imaging system, or machine vision solution, off-the-shelf, long-lifecycle 12th Gen Intel Core processor building blocks are now available from Intel partners. These subsystems can jump-start your next design, but they aren’t demo platforms. They’re the real, production-ready deal.

For example, manufacturers are leveraging the new 12th Gen Intel Core Desktop processors in the ASRock Industrial iEPF-9010S to enable workload consolidation and data acceleration in automated optical inspection systems. Others are turning to the SECO CHPC-D80-CSA, a COM-HPC client module that integrates hardware security and time-sensitive networking (TSN) alongside the 12th Gen Intel Core processors’ high-performance graphics processing.

In semiconductor testing, the Advantech SOM-C350 COM-HPC Client module is setting new standards for data throughput with 12th Gen Intel processors that combine PCIe Gen 5 and DDR5 support. High-end sockets also benefit from devices based on the new chipsets, such as the Avnet Embedded C6B-ALP COM Express Type 6 module and BCM Advanced Research MX670QD Mini-ITX motherboard. These have found homes in robotic surgery and medical imaging, respectively.

Digital signage like the demo video wall can be built on solutions such as Shenzhen Decenta Technology’s new Mobile Series-based OPS Module. It sports Intel® Wi-Fi 6E and USB4 interfaces, plus local AI inferencing via the Vector Neural Network Instructions (VNNI). AI workloads on all 12th Gen Intel Core processors can also be accelerated by the Intel® OpenVINO Toolkit.

Any developer can use the Intel® oneAPI Toolkit to harness the hardware-accelerated features mentioned above from the friendly confines of their software stack. And it’s easy to get these stacks initialized thanks to native support for robust software offerings, including the UEFI BIOS Slim Bootloader, hypervisors, multiple Linux distributions, and the Microsoft Windows 10 IoT Enterprise 2021 Long-Term Servicing Channel (LTSC).

It’s all ready for you out of the box.

A New Era of Edge Computing Starts Now

As more, different types of objects leverage electronic intelligence, workloads are changing, and the emphasis is shifting from having the most processing to the right processing. Developers have been waiting for hardware that offers a path forward, and 12th Gen Intel Core processors deliver.

To learn more, check out the 12th Gen Intel Core Desktop processors and 12th Gen Intel Core Mobile processors product briefs.

Related Content

Read about advances in visual computing driven by the Intel OpenVINO Toolkit in Intel Innovation: The Event Designed by Developers for Developers.

2 Performance results are based on testing as of dates shown in configurations and may not reflect all publicly available updates. For more complete information about performance and benchmark results, visit intel.com/PerformanceIndex.

IoT Virtualization Jump-Starts Collaborative Robots

The next evolution in manufacturing automation is being conceptualized around collaborative robots, or cobots. Cobots—a type of autonomous robot—are capable of laboring safely alongside human workers. But while their advantages seem obvious, the design of these complex systems is anything but.

Yes, most of the enabling technologies required to build a cobot exist today. And many are already mainstream, from high-resolution cameras that let robots see the world to multicore processors with the performance to locally manage IoT connectivity, edge machine learning, and control tasks.

The challenge is not so much the availability of technology as it is the process of bringing it all together—and doing so on a single platform in a way that reduces power consumption, cost, and design complexity. A logical starting point to achieve this would be replacing multiple single-function robotic controllers with one high-end module. But even that’s not so simple.

“Collaborative robots have to perform multiple tasks at the same time,” says Michael Reichlin, Head of Sales & Marketing at Real-Time Systems GmbH, a leading provider of engineering services and products for embedded systems. “That starts with real-time motion control and goes up to high-performance computing.”

“The increasing number of sensors, interactivity, and communication functionality of collaborative robots demands versatile controllers capable of executing various workloads that have very different requirements,” Reichlin continues. “You need to have these workloads running in parallel and they cannot disturb each other.”

This is where things start to get tricky.

IoT Virtualization and Collaborative Robots in Manufacturing

One of the benefits of multicore processing technology is that software and applications can view each core as a standalone system with its own dedicated threads and memory. That’s how a single controller can manage multiple applications simultaneously.

Historically, the downside of this architecture in robotics has been that viewing cores as discrete systems doesn’t mean they are discrete systems. For example, memory resources are often shared between cores, and there’s only so much to go around. If tasks aren’t scheduled and prioritized appropriately, sharing can quickly become a resource competition that increases latency, and that’s obviously not ideal for safety-critical machines like cobots.

How do you construct a multi-purpose system on the same #hardware that can safely share computational resources without sacrificing #performance? The answer is a real-time #hypervisor. @CongatecAG via @insightdottech

Even if there were ample memory and computational resources to support several applications at once on a multicore processor, you still wouldn’t be able to assign just one workload to one core and call it a day. Because many applications in complex cobot designs must pass data to one another (for example, a sensor input feeds an AI algorithm that informs a control function), there’s often a real need for cores and software to share memory.

This returns us to the issue of partitioning, or as Reichlin put it previously, the ability for workloads to run in parallel and not disturb one another. But how do you construct a multi-purpose system on the same hardware that can safely share computational resources without sacrificing performance?

The answer is a real-time hypervisor. Hypervisors manage different operating systems, shared memory, and system events to ensure all workloads on a device remain isolated while still receiving the resources they need (Figure 1).

Figure depicting the Real-Time Hypervisor’s multi-core and multi-OS systems.
Figure 1. The Real-Time (bare metal) Hypervisor provides hardware separation and rigid determinism. (Source: Real-Time Systems GmbH)

Some hypervisors are software layers that separate different applications. But to meet the deterministic requirements of cobots, bare metal versions like the Real-Time Hypervisor integrate tightly with IoT-centric silicon like 6th gen Intel® Atom and 11th gen Intel® Core processors.

The Atom x6000E and 11th gen Core families support Intel® Virtualization Technology (Intel® VT-x), a hardware-assisted abstraction of compute, memory, and other resources that enables real-time performance for bare-metal hypervisors.
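
As a quick sanity check before committing to a virtualization-based design, developers can confirm that a Linux target exposes Intel VT-x at all. The sketch below is a generic check, not part of the Real-Time Systems tooling; note the flag can also be hidden if virtualization is disabled in firmware.

```python
# Quick Linux-side check that the CPU advertises Intel VT-x (the "vmx" flag).
def has_vtx(cpuinfo_path: str = "/proc/cpuinfo") -> bool:
    with open(cpuinfo_path) as f:
        return any("vmx" in line.split() for line in f if line.startswith("flags"))

if __name__ == "__main__":
    print("Intel VT-x flag present:", has_vtx())
```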

“To keep the determinism on a system, you cannot have a software layer in between your real-time application and hardware. We do not have this software layer,” Reichlin explains. “Customers can just set up their real-time application and have direct hardware access.

“We start with the bootloader and separate the hardware to isolate different workloads and guarantee that you will have determinism,” he continues. “We do not add any jitter. We do not add any latency to real-time applications because of how we separate different cores.”

Data transfer between cores partitioned by the RTS Hypervisor can be conducted in a few ways depending on requirements. For example, developers can either use a virtual network or message interrupts that send or read data when an event occurs.

A third option is transferring blocks of data via shared memory that can’t be overwritten by other workloads. Here, the RTS Hypervisor leverages native features of Intel® processors like software SRAM available on devices that support Intel® Time-Coordinated Computing (Intel® TCC). This new capability places latency-sensitive data and code into a memory cache to improve temporal isolation.

Features like software SRAM are automatically leveraged by the Real-Time Hypervisor without developers having to configure them. This is possible thanks to years of co-development between Real-Time Systems and Intel®.

Hypervisors Split Processors So Cobots Can Share Work

The rigidity of a bare metal, real-time hypervisor affords design flexibility in systems like cobots. Now, systems integrators can pull applications with different timing, safety, and security requirements from different sources and seamlessly integrate them onto the same robotic controller.

There’s no concern over interference between processes or competition for limited resources as all of that is managed by the hypervisor. Real-Time Systems is also developing a safety-certified version of their hypervisor, which will further simplify the development and integration of mixed-criticality cobot systems.

Reichlin expects industrial cobots ranging from desktop personal assistants to those that support humans operating heavy machinery will become mainstream over the next few years. And most will include a hypervisor that allows a single processor to share workloads, so that the cobot can share the work.


This article was edited by Georganne Benesch, Associate Content Director for insight.tech.