Telehealth Platforms: The Foundation for Digital Transformation

Related Content

To learn more about how telehealth platforms are transforming the healthcare world, read our articles: From Building Automation to Health Tech, Remote Monitoring: The Vision for Patient Safety, and The Doctor Will View You Now.

Transcript

Corporate Participants

Kenton Williston
insight.tech – Editor-in-Chief

Matthew Tyler
Wachter – VP of Strategic Innovation

Peter Smith
IAconnects – Head of Sales and Marketing

Jim Wright
Siemens Healthineers – Business Development Innovation & Digital Business

Presentation

Kenton Williston: Hello and welcome to the webinar “Telehealth Platforms: The Foundation for Digital Transformation.” I’m Kenton Williston, the Editor-in-Chief of insight.tech and your host for this webinar, and I’m joined today by our illustrious panel of guests from Siemens Healthineers, Wachter, and IAconnects. So, I’ll let our guests introduce themselves. Jim, if you could give us a start. Who are you and what do you do at Siemens?

Jim Wright: Thanks, Kenton. Jim Wright, I’m in business development for Innovation and Digital Business for North America. So, what I do, Kenton, is I look at partnerships, I look at innovation in the marketplace, and I evaluate those new digital solutions and decide if they fit into where Siemens wants to go.

Kenton Williston: Fabulous. All right, Matt Tyler, if you could give us your credentials.

Matthew Tyler: Sure, I’m Matt Tyler, I’m VP of Strategic Innovation for Wachter, mainly responsible for looking at the marketplace for systems and manufacturers that we can incorporate into the solutions we deliver across North America.

Kenton Williston: Fabulous, and last but not least, Peter Smith.

Peter Smith: Thanks, Kenton. So, I’m Peter Smith, Head of Sales and Marketing at IAconnects. We have a wide range of solutions in what’s now known as the IoT space, and assisted living is one of those areas. So, we provide solutions alongside partners, which include the sensors and the transfer of that data to third-party platforms for end users and carers alike.

Kenton Williston: All right, before we get into our conversation today, just a quick preview of our agenda. We’ll be talking at a broad level about the role of digital transformation in the healthcare space, and particularly where telehealth fits into the picture. We’ll take a little bit of a look at the state of healthcare today. One of the important things that’s been going on is an influx of new data, so we’ll also take a look at how to make sense of and use that data, and then we’ll also talk about the actual benefits of all these activities, both to the patients as well as to the caregivers themselves.

So, to start with, what in the world is digital transformation in the healthcare space? I’ll go ahead and I’ll start in the same order we introduced ourselves. So, Jim, from your perspective, what does digital transformation mean in a healthcare context?

Jim Wright: Yes, so I think you can look at digital transformation in healthcare as a building block of a patient-focused approach, if you will: helping care providers streamline their operations, understand what the patient requires, and build the loyalty and trust that make for a better healthcare experience.

Kenton Williston: Yes, and I think at the end of the day, it’s experience that’s a really big part of this, and I look forward to getting into that more.

Peter, from your point of view, I know one of the big things you’re doing at IAconnects is remote patient care. Of course, digital transformation, kind of a big concept, where would that remote care fit in, and what does that mean to you?

Peter Smith: Yes, so firstly, it’s having that added element for carers, for family, for next of kin, neighbors, whoever it may be who’s got a vested interest in that individual. We can pull that data into applications where anyone across the world, as long as they have the credentials to access the data, can see how their loved one is doing, whether in their own home or in a care setting. So, it’s being able to provide that data for someone else to then provide the real value to the end users. That’s a business term, but in reality it’s going to be the relations and the friends of those people. So, that’s where we see it having a big impact.

Kenton Williston: Yes, totally, and again, like I said in our kickoff here in our agenda, I think data is really the thing that matters when we talk about digital transformation and how you use that data. So, Matt, how are you seeing this play out with your clients?

Matthew Tyler: Well, the ability to connect what was unconnected in prior years is really where we’re starting to see acceleration. Where you may have trained professionals in one geography when they’re needed in another geography at the same time, we can utilize video and audio to connect patients to the specialists whose attention they require. Ultimately, in healthcare, just like any other industry, workforce is an issue, whether it’s nursing, whether it’s doctors. We’re seeing a huge need for additional manpower. So, utilizing technology to be in more places simultaneously is where we’re really starting to see the transformation happening.

Kenton Williston: Yes, absolutely. Matt, you mentioned a really important word there, which is acceleration, and I think there’s no doubt in anyone’s mind that the last 18 months drastically accelerated all kinds of digital transformation efforts.

So, Jim, I want to come back to you. In what ways did you see these efforts accelerate and particularly in the telehealth sphere?

Jim Wright: Yes, I think the current pandemic put a spotlight on telehealth technologies, remote patient monitoring and virtual visits, for example, for a lot of reasons, right? One was reducing potential exposure to the virus for patients and healthcare providers alike. So, as we saw this come of age, if you will, during the pandemic, it was a step change, and it was enabled by different factors: the increased consumer willingness, if you will, to use telehealth, the increased provider willingness as well, and then, of course, the regulatory changes that came into play and adapted very quickly to accommodate the telemedicine/telehealth situation.

Kenton Williston: Yes, absolutely. So, you could say, in broad strokes, that digital transformation was accelerated pretty much everywhere during this pandemic, but I think the healthcare sector has had some specific constraints, and specific ways in which that acceleration happened, that are pretty different from other industries.

So, Matt, I want to come back to you. If you had to characterize how digital transformation is different in the healthcare sector versus other sectors, how would you describe that?

Matthew Tyler: I think there’s a level of regulation as it pertains to technology and how it’s utilized within the patient setting. Take the use of video: everybody says a picture’s worth a thousand words, and video is so powerful in patient care, yet with privacy concerns there’s some strict guidance that we need to follow as technologists in applying that technology to those spaces. So, I really think that staying within regulation, working with regulators to better understand what the technology is capable of doing, and then implementing that technology in a responsible manner is what falls on our shoulders.

Kenton Williston: Yes, absolutely. So, on that note, we’ve talked a lot here about telehealth/telemedicine in particular. Of course, the video applications are a huge, huge part of that. But of course, that’s not the only place where telehealth has become really big.

So, Peter, what do you see big picture, where do telehealth and telemedicine fit into the bigger picture of what’s going on in the healthcare space, and in particular, this idea of having care on demand?

Peter Smith: Yes, so I’m speaking primarily from a UK and European standpoint on this—I’m the only European guy on this today—but we’re seeing that patients and individuals, their loved ones, and even the actual organizations and employers are looking to find out how well people are coping. The pandemic has been one driver, but in general now we see mental health as one of the biggest areas for organizations looking to help individuals, whether it’s working from home and that social anxiety about going back, or coming into the office and only wanting to be around five different people in an area so each person can be six feet or three meters apart. The mental health side of it seems to have gone from maybe just creeping into the top 10 to now definitely being in the top three of things that organizations are looking at and trying to find solutions around. So it’s not just the on-call thing you mentioned, with doctors getting appointments for patients. It’s organizations now looking to have a huge impact on their employees, and we’re seeing a big, big change in that in the UK in particular.

Kenton Williston: Yes, absolutely, and I think looking forward, I only see this growing. I just can’t imagine going back to the way things were. And gentlemen, I’m wondering from your perspective where you see this concept of on-demand care, you get it from home. Heck, the providers can be at home. Where do you see this expanding as we go forward?

Jim Wright: Well, good question. I think there’s a lot of discussion around that. I believe, with what’s happened in the market because of the pandemic, remote patient monitoring is definitely in the future: this idea of keeping tabs on the patient at home, not only through visits, but also through devices that will send that vital information back to the caregiver.

And there’s a lot of reasons why, right? It improves the flow of that information, it improves the patient’s adherence to instructions, and the cost of care is reduced. Gosh, it reduces patient expense, improves productivity. And probably one of the more important things: the patients want to be at home, right. They don’t necessarily want to be in the hospital. And of course, that data that’s gathered—if it can be harnessed—is a valuable asset as the clinician treats that patient.

Kenton Williston: Yes, absolutely. And so, on that point, I think it’d be good just to take a survey, as it were, of where things stand today. We’ve talked a little bit about some of the use cases that have risen up, so I think, you know, Peter mentioned a really good one here, that mental health has become one of the key areas for telemedicine, which really wasn’t a thing at all prior to the pandemic. But I think, at the same time, there have been some challenges, because a lot of these new approaches to care that have come online in the last 18 months have been done in a very ad hoc manner, as necessitated by the pandemic, and maybe are not integrated into the larger healthcare infrastructure as well as they could be.

So, as we think about the current state of affairs, Matt, I’d love to get your thoughts on how well these sorts of ad hoc solutions have been implemented, and how well they fit into the rest of the healthcare infrastructure.

Matthew Tyler: Yes, coming from the physical-layer perspective as a solutions integrator, we see a lot of technologies being implemented, especially when it comes to video around the pandemic. You know, we’ve had customers come back and say, “Hey, we’ll use a baby monitor if we can get away with it.” Those systems don’t tend to integrate very well. So, we do see a lot of siloed approaches when it comes to very specific use cases.

Where I think, if we as the integrator, or any of the manufacturing partners we work with, sit down and really try to address the bigger picture with our customers, they can have a better way to collect all of the data that’s being acquired throughout a healthcare organization, bring it back into a central database, and use it more to their advantage.

So, I would say the state of the industry is still very segmented at this point, although they are trying to become more integrated, to be able to collect that data, and even more of the data beyond what they’re already capturing today.

Kenton Williston: Yes, for sure. And so, Jim, I know one of the big things that Siemens Healthineers is doing to address some of these things, you’ve got some platforms that are intended, specifically, to collect and collate and understand this data. So, what do you see are some of the key things that need to happen to better pull all of these systems together?

Jim Wright: Well, as Matt said, there’s this idea of aggregation of data, not only, as I said earlier, from remote patient monitoring devices in the home. But then it becomes: how do we take this data, aggregate it into a patient-centric record, and serve it up to a clinician in a manner that gives them better insights into how to treat that patient in the best possible way?

The next evolution of that, obviously—well, maybe not obviously—would be this idea of building algorithms around artificial intelligence and being a partner to that clinician as they’re diagnosing and treating that patient, as close to real time as possible.

So, I think—I hope that answers your questions, what you were asking.

Kenton Williston: Yes, absolutely. And I want to come back and touch on that idea of AI in just a moment. But before I get to that, I think there’s one other point that’s really important, which—Pete, you brought up some of the regulatory concerns, and I know that’s been a big issue, right. So, for example, you know, we’re on a Zoom call right now, recording this webinar. Maybe it’s not the right platform to meet the regulatory requirements, you might need to use something more specialized, right. So, there’s a regulatory element of things, but there’s lots of other concerns too, like security comes to mind. You know, we’re talking about a lot of very sensitive data, so even above and beyond the regulations, healthcare providers might want to take extra safeguards to really protect that information.

So, where do you see that, I guess, first, appetite for risk? And what are some of the key considerations healthcare providers need to keep in mind as they’re looking to further their telehealth and overall digital transformation efforts?

Peter Smith: So, I think concerns about risk have definitely dwindled a little bit over the last 18 months, where people have seen more positives than negatives about being able to remotely monitor people. I think people are willing to take slightly more risks than they may have been two years ago.

But on the flipside, because there are so many different providers of remote healthcare, or healthcare in general, the bigger the name of the company providing that solution, the better. Take an IBM, for instance, which has a huge security system behind it: if someone just sees a brand name like IBM or Siemens and thinks, “Right, they’re involved with it, it must be secure,” then, if anything, that’s more important to people than actually looking at what security is behind it.

So, having the right partnership and ecosystem within a solution is definitely a big thing that we see, and that has been the same for the five or six years we’ve been involved in this area. That hasn’t changed, but it’s just meant that more people are willing to take a risk on something if those big names are involved.

Kenton Williston: Yes, absolutely. And you know, the other thing that comes to mind there when I’m thinking about risk is not just the risk of, will this project actually do what I want it to do, will it protect the patient data? But there’s also the question of, will I be able to pay for all this? Right, and I think one of the things that has changed a lot during the pandemic is the reimbursement structure from insurers has grown to accommodate a lot more of these telehealth solutions.

So, Jim, I’m wondering what you’re seeing in that area.

Jim Wright: Exactly, Kenton. There’s coding and reimbursement, and not only from CMS. Folks may know that last November, I believe it was, CMS Medicare came out with the Hospitals Without Walls program and a reimbursement structure for those patients who would be admitted as inpatients but treated at home, along with coding for RPM. And of course, right on the heels of that will be the therapeutics.

So, there’s certainly an eye on how do we pay for this and reimburse providers, not only from the government in the way of Medicare, but private insurers have stepped up to the plate too and are certainly active in making sure that that reimbursement is addressed.

Kenton Williston: Absolutely. So, with that, Jim, I want to come back to the point you made earlier about AI. I promised I’d get back to you on that. So, again, I think one of the key things that’s happened here is, on one hand, a lot of these services were implemented over the last 18 months out of necessity, but I think that also means we’ve got all new kinds of data that we’re collecting now, or at least able to collect now that just weren’t possible before, because things are being done digitally. But there’s, I think, a question there of, you know, first of all, are we capturing that data to begin with? And then secondly, how can we actually make use of this data, so it’s not just sitting in a repository somewhere gathering dust, as it were.

So, Jim, can you give me a little more of your thoughts on that?

Jim Wright: Yes, so again, it’s this idea of how you aggregate this data, and then how you monetize it or use it for better clinical care. So, by building those algorithms around AI and then delivering that information up, we feel that we augment what the clinician is trying to achieve, not only in the diagnosis, but in the treatment: say, best practices or best pathways.

As we know, when you’re diagnosing a patient and treating a patient, there are a lot of pathways that folks can use, and the data that becomes available will help that physician. We almost partner with that physician or clinician to help them make more informed, better decisions.

Kenton Williston: Yes, absolutely. And Matt, I know that’s something really important to Wachter as well. Can you tell me a little bit about your perspective on how to better capture and utilize all this data?

Matthew Tyler: Absolutely. So, there are a lot of areas that I don’t think healthcare providers take into account. Patient satisfaction is a huge scoring factor, especially in reimbursement. So, if we go back to the whole payment topic: if we’re putting technology into a patient bed space, we should be collecting as much data as possible. We can understand, with the solutions that we already provide, how many times a nurse has been redirected, how many times they’ve had to intervene. But how can we take into account other factors such as ambient lighting conditions, temperature, humidity, all of the environmental conditions that a patient has to live through while staying in a hospital? And then, how can we improve those conditions automatically, without having to send a human in to take care of them for the patients?

With some of the technology that’s available, which is very inexpensive, we’re able to collect a heck of a lot more data now and provide it back to, say, a Siemens, where they can slice and dice the data to provide a better patient experience while the patient is in the care of the caregiver.

Kenton Williston: Yes, absolutely. And I think this is an important thing for the industry to keep in mind, right. It’s not just about the outcomes, although those are very important, but it’s also about the quality of care and the patient experience. So, I’d like to get into that a little bit more now.

So, Pete, I’ll turn this over to you. How has the patient experience changed during the pandemic and what new expectations might they have going forward?

Peter Smith: Yes, so I think particularly in the UK, again, we’ve had periods of three or four months at a time where the country has been in lockdown, and unless you’re in a specific bubble or you’re allowed to travel for work, you don’t get to see many family members or friends during those lockdown periods.

So, speaking from personal experience, my grandma is in her 90s, and she lives probably only about 40 minutes away. But because she was at such high risk, due to some health factors and also her age, the only time I or other family members were allowed to see her was when taking shopping to the door: you unlock the door, you put the shopping in, you come back out again, and then you talk to her through the window.

So, having that as the only interaction with them, from a personal point of view, isn’t very nice. You can’t really tell how well they’re doing. But by having these monitoring solutions in there, it gives not only the patient but other people as well peace of mind that that person is being checked on, and in a non-invasive way. They know there are sensors around the house, but they’re not cameras; they’re just picking up environmental factors or movement.

So, having that and then alerts being able to be created from them as well gives that overall peace of mind.

A very quick example: my grandma wears one of these little—it looks like a watch, it’s not quite—but a little band on her wrist, and if she was to fall over, it would detect the fall. Being as stubborn as she is, she took that band off, put it on the worktop, and knocked it on the floor. Obviously, that then sent an alert to me, to a neighbor, and to the emergency services. And within about 17 minutes, there were three people at her house checking if she was OK. And of course, she was; she was making a cup of tea, and she doesn’t like wearing the band.

But every older person is different, and there are lots of different solutions out there, which I think can help both patients and their loved ones as well.

Kenton Williston: Your experiences resonate incredibly strongly with me, Peter. I’ve got an elderly aunt who, happily, is only on the other side of town in an adult care facility, but boy oh boy, it was very frustrating. Much like you, the only time we got to see her for months on end was through a pane of glass. And much like you said, she is a stubborn old lady, and the various technology solutions that the facility tried, she just didn’t want to have anything to do with them. So, I very much understand that experience.

Peter Smith: Yes, there’s lots of different things out there. I think there’s a happy medium for almost everyone out there with different solutions. So, that’s a good thing to hear as well.

Kenton Williston: So, Jim, I’m really interested, you know, we’ve touched briefly on some of the AI and, you know, data aggregation ideas. So, how do you see these technologies being applied to—like we’re talking about here—give the patient, really, a better experience?

Jim Wright: We think about that quite a bit, Kenton. We’re observing a quick evolution of this space and its innovation, beyond the convenience of virtual urgent care. These innovations around virtual longitudinal care, enabling care at home and remote patient monitoring, and investment in this digital front door, if you will, are all coming to be, let’s just say, adopted and accepted by the medical community, not only by the physician, but by the patient.

So, I think you’re going to see this grow. I think it’s going to help clinicians and providers, as they stretch their resources, become more attentive and, let’s just say, be able to do more with less.

Kenton Williston: Yes, absolutely. And I think you’ve touched on an important point, so a perfect segue, thank you for setting me up, that it’s not just about the patients. We also want to take care of the caregivers. And I think, boy, this has just been a really rough period. You know, I’m really good friends with a nurse in a step-down ICU and it’s—I mean, boy, it has been a rough go of it these last 18 months. And I think the world at large has shown a lot of appreciation for our caregivers, and I think there’s, you know, a real mandate, in my mind, for the industry to carry that thoughtfulness forward and find new ways of making life better for the caregivers.

So, on that point, Matt, I know that Wachter has some really cool solutions that can help in this regard. How do you see this whole concept of digital transformation facilitating a better workload balance and, generally, a better quality of work experience for the caregivers?

Matthew Tyler: Sure. So, when we started putting video into patient bed spaces, we got a lot of pushback from nursing unions, as well as other caregivers; they had this perception that Big Brother was going to be watching over them, and that their evaluations were going to be performed based off the video being captured. But once the video got into the space, and the observation techs were monitoring the patients without the nurses having to be in the rooms at all times, we actually got a lot of feedback from the customers that nurses’ anxiety and stress levels were coming down, because they knew that someone else had eyes on their patients. And they realized that it wasn’t Big Brother; it was really just better patient observation and satisfaction that came out of it.

So, you know, early on prior to the pandemic, that’s where we really faced the obstacles and noticed that it was providing a better quality of life for the caregivers themselves.

And then during the pandemic, the use of two-way video and two-way audio in the patient bed spaces has allowed a reduction in PPE. Maybe a nurse or a doctor doesn’t need to don all their PPE to go in and just have a quick conversation with the patient; they can actually do it remotely from a monitoring station or their phone. That also gives a whole level of comfort, and a lot less anxiety or stress about having to enter a bed space where the patient may be suffering from the virus.

Kenton Williston: Yes, absolutely. And Pete, I think those are some of the big advantages that the IAconnects solutions bring as well, right? The reduction in workload, continuous monitoring so you don’t have to worry as much, and, last but certainly not least, keeping both parties safe from any sort of transmission.

So, can you speak a little bit more to where you see these factors playing a role going forward?

Peter Smith: Yes, so I think when we’re finally out of the pandemic, the ability to decide which patients you’re going to see first, based on the data from the remote monitoring solutions you’ve already acquired, can be a huge, huge benefit. Typically, if it’s in a care facility, you would go from bed one right the way through to however many beds there are. If it’s in a retirement village, you probably start at one side and go to the other side. But the data will be telling you that you need to go and see Pete in house 17, followed by Matt in house 33, and so on. And being able to build your rounds, as it were, based on the data is something that can benefit both the caregivers, because they know exactly what they’re going into and why, and the patients, with the obvious benefit that the people who need to be seen sooner will be seen sooner.

But some of the environmental factors are being monitored too, CO2 being an example. In the past, we’ve had CO2 monitoring solutions because CO2 can affect productivity levels: the higher the level of CO2, the lower your concentration and the lower your productivity, as a general rule. But it has a similar effect on the transmission rates of COVID and other diseases.

So, when CO2 levels increase, there’s a higher chance that COVID can be transmitted. If you can reduce the CO2 levels as much as possible, by having fewer face-to-face meetings and fewer people in a certain space, then that can only be a good thing for everyone involved, including the carers.
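To make that logic concrete, here is a minimal sketch of the kind of rolling-average threshold check such a monitoring solution might run. The 1,000 ppm guideline, the window size, and the sensor interface are illustrative assumptions, not IAconnects’ actual implementation.

    from collections import deque

    # Illustrative guideline only: many ventilation guidelines treat sustained
    # CO2 readings above ~1,000 ppm as a sign that a space needs fewer
    # occupants or more fresh air. Adjust the threshold to your own policy.
    CO2_ALERT_PPM = 1000
    WINDOW = 10  # average the last 10 readings to ignore brief spikes

    readings = deque(maxlen=WINDOW)

    def ingest_co2_reading(ppm: float) -> bool:
        """Record one CO2 sample; return True if the room should be flagged."""
        readings.append(ppm)
        if len(readings) < WINDOW:
            return False  # not enough data for a stable average yet
        return sum(readings) / len(readings) > CO2_ALERT_PPM

    # Example: a slow climb in CO2 eventually pushes the average past the
    # threshold, at which point the room is flagged for reduced occupancy.
    for sample in [650, 720, 800, 880, 950, 1020, 1090, 1160, 1230, 1300, 1370]:
        if ingest_co2_reading(sample):
            print(f"Flag room for reduced occupancy (latest reading {sample} ppm)")

Averaging over a window rather than alerting on a single sample is what keeps a briefly crowded doorway from paging a carer.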

Kenton Williston: So, Peter, that’s a great opportunity, I think, for us to talk about where all of you have actually seen some implementations of these technologies we’re talking about. So, I know, for example, you’ve done some work with IBM to implement your remote patient monitoring technologies. So, if you could just tell me a little bit about that and what kinds of outcomes you saw from that engagement.

Peter Smith: Yes, thanks, Kenton. So, a few years ago now, we were approached by IBM, who wanted to build an ecosystem of partners to offer an assisted living solution, or a smart assisted living solution, as they called it at the time. They wanted to use their Watson IoT software, which incorporated the AI piece; they wanted the non-invasive IoT sensors from IAconnects; and there was another partner in the mix as well who built the actual mobile application, which can be used by carers, family, and patients alike.

So, we provided sensors that looked at environmental factors such as temperature, humidity, and CO2; vibration monitoring for when people sat in their favorite chair or got in and out of bed; and occupancy sensing for bathrooms and the other main rooms in the place they live. And finally, some magnetic contacts for windows and doors, mainly for security purposes, though the initial use case was for patients with Alzheimer’s and dementia: knowing when a door or window has been opened. If it has been opened at three o’clock in the morning, you know there’s probably something that ought to be looked at, and you can send someone around to check that that person is still in their house and hasn’t gone walking about in the middle of the night.

So, by being able to collect all of this primarily environmental data, along with the occupancy data, we could send it out to this application, which then used AI on the IBM side to monitor and build up a picture of what a patient’s day-to-day routine might look like. So, it knows that Pete gets up at about 6:50 every morning. The first thing he does is go to the bathroom, then he goes and makes a cup of coffee, so he’s been downstairs. Then he comes back upstairs again to get dressed. And then from about 8 a.m. until probably 8 p.m. most days at the moment, he sits at his desk, does a lot of work, does a lot of Zoom calls, and then he goes back to bed at, say, 10 or 11 o’clock at night.

For someone who doesn’t have any kind of medical condition, that might change quite a lot depending on what time of year it is, whether he goes to play sports, or whatever it might be. But a patient who has medical conditions generally has the same routine every day, so being able to spot when something has changed (like they’ve left the house at three o’clock in the morning, or they haven’t got out of bed by 8:30 that day) allows alerts to be sent to the people who need to see them: the neighbors, the carers, the next of kin. They can then go and check on those people without having to physically do it every day. That has been a blessing throughout the pandemic, something which obviously nobody knew was going to happen. But having that solution pre-pandemic was definitely beneficial, and it could be adapted slightly during the last 18 months.
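As a rough illustration of the routine-learning idea Peter describes, the sketch below flags an event that falls far outside a learned daily baseline. It uses a simple per-event average rather than anything like IBM Watson’s actual models, and the event name, sample timings, and thresholds are all hypothetical.

    from statistics import mean, stdev

    # Hypothetical history: minutes after midnight when the "got out of bed"
    # event fired on previous days (6:50 a.m. = minute 410). A real system
    # would derive these timestamps from motion or vibration sensor events.
    wake_history = [410, 405, 415, 412, 408, 411, 407]

    def is_anomalous(event_minute: int, history: list, max_sigma: float = 3.0) -> bool:
        """Flag an event time that falls far outside the learned baseline."""
        baseline, spread = mean(history), stdev(history)
        spread = max(spread, 5.0)  # floor the spread so tight routines don't over-alert
        return abs(event_minute - baseline) > max_sigma * spread

    # 8:30 a.m. is minute 510: far from the ~6:50 a.m. routine, so alert.
    if is_anomalous(510, wake_history):
        print("Routine deviation: notify the carers and next of kin to check in")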

Kenton Williston: Yes, absolutely. And I think that’s one of the most important things to take away from all these innovations is that, you know, as we move forward, these things will continue to be of great value and, you know, the world continues to be an uncertain place. So, you know, having these systems in place will be fantastically valuable, no matter what comes next.

So, speaking of complexity and the great unknown. You know, I know Siemens, Jim, has just a tremendous amount of technology for the healthcare space. We’ve been talking about data aggregation, for example, AI, all these sorts of things. So, in this great world of all the things Siemens is doing, is there a use case that comes to mind that’s relevant to this conversation? And if so, what were the outcomes there?

Jim Wright: Yes, there is. We stood up a program called HerzConnect, a technical partnership, if you will, between the Heart and Diabetes Center in Bad Oeynhausen, Germany, and Siemens. The program aimed to provide patients with care according to the guidelines and to slow disease progression, certainly through close monitoring of the patient. The outcome was a marked improvement in their condition, and that ability to keep tabs on a patient who had heart failure or cardiovascular problems, if you will.

As we developed that program—which has been in existence, oh, probably three or four years now—there was this idea of connecting with the patient, not only through virtual visits, but through remote patient monitoring and collecting that information, whether it’s weight readings from scales or blood pressure cuffs or other types of equipment. A lot of that technology we’re just now starting to introduce into North America: harnessing not only the collection of that data, but then, like I said earlier in the program, how you use that data for better clinical insights and augment what the caregiver is already doing with that patient, whether it’s diagnostics or treatment.

Kenton Williston: That’s fantastic. And Matt, I want to come back to you here at the end. You talked a little bit earlier about some of the things Wachter is doing in relation to monitoring and working with nurses. So, I’d love to hear a little more detail about a specific use case that you’ve had there.

Matthew Tyler: Sure. So, we have a solution that was tailored to reduce the need for one-to-one sitters. Any fall-risk patient may require a nurse or a caregiver to sit with them 24 hours a day, and we found, working with our customers, that that’s an extreme strain on their workload. So, what we were able to do is utilize video and audio capabilities so that one person, or maybe two, is able to monitor a number of other patients, going from that one-to-one ratio to one-to-12.

So, we’ve seen a tremendous amount of success around reducing that workload, but then also offering the data that we’re capturing. We’re charting the number of redirects, the number of interventions, things like that, and we’re able to integrate with the customer’s EMR of choice. And then we’re even taking it beyond that.

So, mental health is always an issue. Mental health regulations and requirements have changed recently, and now we can utilize video and audio within those spaces, at the lower risk levels in those spaces. We were able to develop, in conjunction with some of our partners, some anti-ligature devices that work well in mental health situations. So, we’re really seeing this expansion of the use of video and audio throughout the hospital, and it’s really up to the customer’s imagination where the best fit is.

Kenton Williston: Well, that’s a great segue into our last topic, which is where are things going to go next, and how can healthcare providers continue their journey, or start their journey.

So, Matt, I’ll stay with you. As healthcare organizations are thinking forward, where should they be focusing their efforts?

Matthew Tyler: Every one of our customers seems to be unique in where their deficiencies are. So, it’s really about doing that investigation of where they stand, setting that benchmark, and then looking for the gaps that need to be filled. Not everyone’s issues or deficiencies are unique, but certain customers have deficiencies where others do not.

We’re seeing a big play in 5G and in the private networking space in connecting very rural, very remote locations. We’re seeing a tremendous amount of success there in being able to provide quality and equity of care, right? If you’re out in a rural area, a rural setting, you can receive the same level of care, utilizing some of these new technologies out there, that a very urban environment may already have within the confines of that city.

Kenton Williston: Yes, absolutely. That totally makes sense. And Pete, I bet you’re seeing some similar things, and I’d love to hear your thoughts on some of the steps healthcare organizations can take to set themselves up for success in these efforts.

Peter Smith: Yes, so a lot of the time now we’re getting healthcare organizations coming to us asking for things and suggesting things, whereas even two years ago they would come and say, “Have you got something that can monitor patients for us?” rather than “We need to monitor occupancy, we need to monitor CO2.” It was very much a one-way street before, whereas now we can have very open conversations with customers to work out what the best solution is for people. And the fact that the pandemic has driven mass production, and a huge increase in the number of different organizations building sensors, for example, means there’s now such a wide range of sensors on the market that it’s driving down the price of hardware. There are even increased problems actually getting hold of some of the sensors and component pieces at the moment, but the increase in products is definitely driving down cost, which then allows the solutions to be accessible to a lot more people, which is ultimately where we all want to get to. We want to be able to offer these solutions to as many people as possible who can benefit from them.

So, with the continued increase in products, the continued decrease in price, and the different API and MQTT connections, data has taken a huge, huge step forward, and I only see that continuing as the months go by.
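To ground that, here is a minimal sketch of publishing sensor readings over MQTT with the paho-mqtt Python client. The broker address, topic name, and payload fields are placeholders for illustration, not IAconnects’ actual integration.

    import json
    import time

    import paho.mqtt.client as mqtt  # pip install paho-mqtt

    # Placeholders: point these at your own broker and topic naming scheme.
    BROKER_HOST = "broker.example.com"
    TOPIC = "site/house-17/environment"

    # paho-mqtt 1.x style; version 2.x also requires a callback API version,
    # e.g. mqtt.Client(mqtt.CallbackAPIVersion.VERSION1).
    client = mqtt.Client()
    client.connect(BROKER_HOST, 1883)
    client.loop_start()  # handle network traffic on a background thread

    def publish_reading(temperature_c: float, humidity_pct: float, co2_ppm: float) -> None:
        """Publish one environmental sample as JSON for downstream platforms."""
        payload = json.dumps({
            "ts": int(time.time()),
            "temperature_c": temperature_c,
            "humidity_pct": humidity_pct,
            "co2_ppm": co2_ppm,
        })
        client.publish(TOPIC, payload, qos=1)  # qos=1: at-least-once delivery

    publish_reading(21.4, 48.0, 820.0)

A JSON payload on a well-named topic is what lets third-party platforms subscribe to the data without any custom integration work.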

Kenton Williston: It totally makes sense. And Jim, I’ll leave some of the final words with you. I’m wondering even just from an organizational perspective if there are different ways healthcare providers can be even conceptualizing what they do to set themselves up for continued success, better patient outcomes, better patient experiences, and better experiences for the workers themselves.

Jim Wright: Yes, good question. We think about that a lot, and it goes somewhat to what Peter was talking about. If you talk to clients, customers, providers about what’s on their radar screen for the future, they’ll talk about increasing that digital transformation readiness, if you will, and the greater investment that needs to be made as it relates to transformation. And of course, the security considerations that come into play with that.

And so, there’s this whole idea around the continuing increase of telehealth and, as we mentioned earlier, the aspects around collecting information off of remote patient monitoring equipment, and using artificial intelligence to harness the data that’s collected.

So, I think as we see this unfold, the conversations that are happening with our customers are very in tune with how we harness this technology to deliver better, quicker, less expensive care. And more importantly, what that does for the outcomes for the patient and their care.

So, I think you’re—I really believe you’re going to see this continue. I don’t think even down the road, two or three years after the pandemic hopefully is behind us, that we’re going to go back to where it was. I think—let’s just say, to use the term the genie is out of the bottle, right. So, I think you’re going to see a continued interest in these types of solutions.

Kenton Williston: Yes, I certainly agree. Well, that just leaves me to say thank you to all of you for joining us today: our panelists, I really appreciate your time, as well as our audience. Thank you so much for your kind attention. And I would just like to encourage all of our attendees to visit insight.tech to learn more. We’ve got a ton of great information there about the latest in healthcare technology, not least of which are some fantastic articles featuring the technologies that we talked about today from Siemens, Wachter, and IAconnects.

So, with that, I’ll just say once again, thanks so much for your time and look forward to seeing you on insight.tech.

Interactive Digital Displays Transform Retail Banking

If you’re like most people, you probably do the bulk of your banking on your phone or online. But what if you want to refinance your home, open an account, or apply for a student loan? In that case, personal service is a must. And like any retail experience, when we walk into a bank, we hope to speak to someone who’s friendly, knowledgeable, and can solve our problems fast.

In many ways, this model is the one many retail banks have been targeting for years. With routine transactions offloaded onto technology, networks can be streamlined and the branch itself can be reconceived as a professional services firm focused on helping customers the way only humans can.

But fewer physical locations and a new staffing model are just the beginning. Just like retailers, banks all over the world are experimenting as they search for the ideal omnichannel strategy.

What do we know already? That many people, even younger demographics, still rely on brick-and-mortar banks. In fact, “according to one study, nearly 40% of bank customers believe they need more advice now,” says Nancy Radermecher, President of JohnRyan, Inc.—a full-service marketing communications firm that specializes in retail banking.

The best strategy, then, is one that not only improves the bottom line but also takes advantage of all the ways technology can improve the customer experience—both online and in-store.

The Many Faces of Digital Signage in Banks

More and more, digital signage is an important part of that equation. Picture a 24/7 interactive bank branch with heat sensors on the façade to detect when a person is close—and QR codes that deliver personalized brochures from the sidewalk.

Inside, a range of interactive displays could play targeted marketing messages based on how many customers are there, the logged reasons for their visit, or which staff are in the office. But they might also help waiting customers pass the time. Some banks are using digital displays for games like trivia, which can be played after downloading the bank’s app.

Clearly, the more targeted the messaging, and the more app downloads, the greater value to banks. “The more you know about your customers—where they are, what they’re doing, how long they’re dwelling here and there—the better you can optimize the placement, duration, and content of the messaging,” explains Radermecher. And the increasing number of visual analytics tools on the market, including ones that detect sentiment, are helping banks perfect their messaging even further.

Innovating with the Innovators

Innovation in retail bank marketing didn’t start with technology. It started in 1985, when JohnRyan first thought to use printed point-of-sale marketing materials in banks (which wasn’t done at the time). Since then, the company has continued to be a pioneer in the space by offering technical solutions to its bank customers in addition to communications strategy.

“It just became obvious that the better way to message to consumers in the bank branch would be through digital displays,” explains Radermecher. And not least because people who take bank brochures tend to flip them over right away and use them as scratch paper.

But in-branch digital marketing messages are also superior because they’re easier to produce, can vary by location based on the customers served, and most important—can be updated in real time. “So if there’s movement in the market, you can respond to it immediately,” adds Radermecher.

Which is why, when it became clear the industry was moving in this direction, JohnRyan developed the Digital Communications Network for Financial Services—a platform that allows banks to create more effective marketing messages. The idea behind it was to make in-branch marketing more akin to an immersive website, which can be easily updated on the fly.

Banking Interactive Displays in Action

This model is a big departure from the traditional route of hiring agencies to produce expensive, static A/V media to play on loop—and it’s an important differentiator for JohnRyan. Another is its emphasis on integrating digital signage with other systems, such as appointment check-in, that make retail banking more enjoyable for customers.

For example, one client used to rely on the “clipboard mechanism,” where visiting customers signed in on paper and then waited, hoping someone would notice them. After installing JohnRyan’s digital signage with integrated check-in, customers could check the display to see their place in line instead of wondering how long they’d have to wait.

Knowing why they were there also allowed bankers to better prepare to receive them—so they could address them by name and have a more productive discussion. Customer satisfaction improved, sales increased, and the bank benefitted from having access to data they could never gather before: average wait time and length of sales sessions, for example, or foot traffic by day and time.

Technology Is the Key to Better Banking

Intel vPro® technology is vital for keeping branch systems up and running. “It maximizes the level of remote control we have over these devices, which allows us to avoid the kinds of technical outages that often plague banks,” says Radermecher. “And having access to their engineers has helped us achieve a level of mastery we wouldn’t have otherwise.”

Despite its industry-specific challenges, retail banking is well on its way to digital transformation. And while we don’t know exactly what that will look like in five or 10 years, one thing is certain: It will only get better, for banks and their customers. With increasingly powerful IoT, AI, and digital signage technology—and with partners like JohnRyan—we can expect an ultra-efficient, personalized banking experience the likes of which we haven’t even imagined.

Telehealth Is the Future of Care, and the Future Is Now

Imagine recovering from a stroke in your own bedroom or receiving infusion therapy right from your most comfortable living room chair. A combination of healthcare expertise and technology is driving the future of telehealth, expanding what’s possible with solutions that meet a growing set of patient needs.

Telehealth dates to the ‘60s when NASA provided one-on-one care in space. But its mainstream beginnings were in 1968 when a Massachusetts General Hospital pulmonologist used closed-circuit television to examine ill travelers who were arriving at Boston’s Logan Airport. The solution provided proof of concept that care is possible when there is distance between the provider and patient.

But widespread adoption has faced a variety of roadblocks, from technology and regulatory issues to privacy and lack of aligned financial incentives. In the 1990s, using technology to bridge distances started to take hold, but health system leadership was more focused on face-to-face care than virtual opportunities.

“Senior leadership of most health systems were more focused on electronic health systems — first medical imaging PACS followed by electronic health records. If there isn’t a financial incentive, it’s not likely to happen,” says Dr. Richard Bakalar, Chief Strategy Officer for ViTel Net, provider of scalable virtual care solutions.

And Bakalar should know. His first job out of medical school was as the U.S. president’s flight surgeon on Marine One. He served 25 years in the Navy, as an internist and in nuclear medicine, and he set up the Navy’s operational fleet telemedicine program.

“I learned the importance of not being isolated,” says Bakalar. “And that’s why I really got interested in telemedicine. Setting up the Navy’s telemedicine program is where I learned how to connect primary care providers with specialists, when needed. That was the beginning of this transition.”

The Sudden Shift to Telemedicine

COVID-19 provided an incentive, suddenly accelerating the adoption of telehealth from a “nice to have” to a “must have.”

“We had to bring healthcare to the patient, protecting them and providing access to those who couldn’t travel,” says Bakalar.

There was a financial incentive, too: On the business side, healthcare facilities had to furlough providers because they didn’t have the capacity, or demand, for nonessential surgery or onsite primary care. The only way to retain revenue and provide services was to use telehealth.

But most organizations were not prepared with technology, training, or support in place. And very few had governance or a business model that would allow them to rapidly transition and scale from in-person to telehealth care.

“Before the pandemic hit, telehealth made up only about 5% of encounters,” says Bakalar. “During COVID, it was as much as 80%. Post-COVID, we anticipate a leveling off to about 25%. Going from 5% to 25% of an organization’s business will create more visibility at the board level, and require a more systematic, disciplined level of reporting, accountability, and compliance.”

A Need to Connect Health Tech Sources

To create an effective telehealth system, hospitals and clinics need to marry disconnected information sources. Most organizations take an app-store approach to choosing healthcare technology, picking products that solve specific service lines. Each solution has a separate database and workflow.

Few organizations had the foresight to take a platform approach, which does a better job across service lines. With a greater emphasis on telehealth, organizations will need to go from an app-store model to a platform model.

ViTel Net has a unique telehealth solution and consulting approach, working with customers to understand their current workflow, business needs, and goals, and providing technical services, compliance, and deployment. Its cloud-based vCareCommand modular platform uses high-compute Intel® processor-based platforms and integrates into a provider’s existing information systems.

Live videoconferencing hosts patient encounters and embedded tools provide clinical documentation and access to medical imaging. The system also records the amount of time spent with patients for proper billing, coding, and enhanced reporting. Instead of being a system of record, it’s a cache of information that aggregates the collaboration and images from the encounter and provides a report to the system of record, usually the EHR, for continuity of care.
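As a loose illustration of that flow (and not ViTel Net’s actual API), a telehealth encounter record might capture start and end times, derive billable minutes, and emit a summary payload for the system of record. All of the names and fields below are hypothetical.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone
    from typing import Optional

    @dataclass
    class Encounter:
        """Hypothetical telehealth encounter record for billing and EHR hand-off."""
        patient_id: str
        provider_id: str
        started_at: datetime
        ended_at: Optional[datetime] = None
        notes: list = field(default_factory=list)

        def close(self) -> None:
            self.ended_at = datetime.now(timezone.utc)

        def billable_minutes(self) -> int:
            end = self.ended_at or datetime.now(timezone.utc)
            return max(1, round((end - self.started_at).total_seconds() / 60))

        def ehr_report(self) -> dict:
            """Summary handed off to the system of record (usually the EHR)."""
            return {
                "patient_id": self.patient_id,
                "provider_id": self.provider_id,
                "duration_min": self.billable_minutes(),
                "notes": self.notes,
            }

    encounter = Encounter("pt-001", "dr-042", datetime.now(timezone.utc))
    encounter.notes.append("Virtual follow-up; vitals reviewed from home monitors.")
    encounter.close()
    print(encounter.ehr_report())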

“Our platform allows us to meet the local workflow needs of the customer, share data with the systems of record that have governance associated with them, and provide real-time business intelligence,” says Bakalar.

Deploying the solution is easier because ViTel Net’s vCareCommand technology can be assembled like Lego blocks. Customers can start small by adding application modules to their current technology, then scale as needed. For example, in 2018 a large academic medical center successfully integrated the solution into its platform for tele-stroke care.

“In 2020, they were struck like everybody else with clinics being shut down,” says Bakalar. “They added a module called ‘TeleUrgent Care,’ where patients who were at home could be evaluated through the platform. They had not anticipated that need, but it was something we could rapidly add because they had the existing capability.”

A similar situation happened at a hospital in Washington, D.C., which treated children with serious infectious diseases pre-COVID. “They were able to put our technology in their ICU isolation rooms,” says Bakalar. “Nurses from outside the ICU patient room could monitor the patients without having to go into their rooms with full PPE.”

Telehealth Trends of the Future

While there are times when a patient needs to come to a health system, such as for surgery, technology is creating a growing number of care services that can be delivered from home.

“Bringing the health system to the patient saves money and leads to better patient engagement and medical outcomes,” says Bakalar. “It isn’t just medical surveillance. We can treat them at home, too. We can send technicians to the home to perform services like infusions or set up patients on ventilators and monitor them. The proportion of face-to-face and virtual care will continue to shift in a positive way toward virtual care. Telehealth’s time has come. It’s taken us 30 years, but we’re getting close.”

Intel Innovation: The Event Designed by Developers for Developers

First it was the hyperconvergence of IT and OT. Then it was developing ultra-portable applications. Today, it’s increasingly a cloud-native world where pervasive connectivity and ubiquitous intelligence with AI everywhere are the new normal. How do you keep your development skills on track when the landscape keeps shifting?

Intel Innovation—an event designed by developers for developers—delivers precisely what you need. It’s free, it’s global, it’s all digital, and it’s right around the corner—October 27 & 28—so register today.

Here’s your guide to must-see presentations, demos, and training sessions:

Start at the top with CEO Pat Gelsinger’s keynote, where you’ll hear exciting news and insights from Intel and industry thought leaders—along with in-depth technical sessions across a range of products, technologies, and developer-enabling tools like OpenVINO—and much more.

Then take a deep dive into the world of edge computing and 5G networks with Intel technologists Rita Wouhaybi and Rajesh Gadiyar. They’ll walk you through free, ready-to-deploy tools for network, AI, and IoT developers, solutions architects, and data scientists, to name a few. And get answers to your questions at the live “Meet the Geek” Q&A with Rita and Rajesh immediately following their presentation.

Then roll up your sleeves.

AI Everywhere Through the Lens of OpenVINO

See how a worldwide ecosystem of partners is working alongside Intel to accomplish amazing things across industries, use cases, and geographies.

From healthcare to retail to auto manufacturing, there’s a massive need for intelligence via AI-enabled computer vision, natural language processing, and more. That’s easier said than done. The Edge & 5G track at Intel Innovation demonstrates how organizations use OpenVINO to streamline AI development—and how you can, too. Plus, you’ll see that exciting new common framework integrations, features, and capabilities are right around the corner.
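As a taste of what those sessions cover, below is a minimal OpenVINO inference sketch. It assumes a model already converted to OpenVINO’s IR format with a static input shape ("model.xml" is a placeholder path), uses the openvino.runtime Python API found in 2022-era releases (older versions expose a different inference-engine module), and fabricates random input in place of a real image pipeline.

    import numpy as np
    from openvino.runtime import Core  # 2022-era API; older releases differ

    MODEL_XML = "model.xml"  # placeholder: an IR model from OpenVINO's converter

    core = Core()
    model = core.read_model(MODEL_XML)
    compiled = core.compile_model(model, device_name="CPU")  # or "GPU", "AUTO"

    # Fabricate an input matching the model's expected (static) shape;
    # a real pipeline would feed preprocessed image frames here instead.
    shape = tuple(int(d) for d in compiled.input(0).shape)
    frame = np.random.rand(*shape).astype(np.float32)

    # Run one synchronous inference and fetch the first output tensor.
    result = compiled([frame])[compiled.output(0)]
    print("Output shape:", result.shape)

The same compiled model can be retargeted from CPU to GPU or another accelerator by changing only the device name, which is the portability the track's sessions demonstrate in depth.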

An Eye on Healthcare

Learn how GE Healthcare teamed up with Intel on a medical imaging solution using OpenVINO to train and tune CV models for fast inferencing at the edge. The technical session will address the increasingly urgent needs in critical interventional medicine brought on by the COVID-19 pandemic. See a step-by-step, in-session demonstration with live code used to build an AI-assisted X-ray intubation procedure.

And in the marquee demo showcase, see how OpenVINO-optimized models for kidney segmentation are accelerating and improving diagnostic accuracy in radiology departments at the point of care. The live code demonstration shows exactly how they did it.

Route Planning and New Age Retail

Hear how Intel partner Pathr.ai taps the potential of location data for Spatial Intelligence. Retailers and mall operators can analyze customer foot traffic and dwell time to improve product placement, merchandising, tenant lease pricing, and safety. Most important, the solution protects personal privacy by anonymizing images of shoppers.

And they aren’t the only ones creating smart retail solutions on OpenVINO. Quividi uses it with great success for digital out-of-home (DOOH) advertising.

Check out how these and other ecosystem partners put OpenVINO to use in the IoT Edge Innovation Zone.

Keeping Automotive in Drive with Open Source

Discover how BMW and Robotron built a manufacturing quality control system that obscures production processes and anonymizes factory workers to comply with European GDPR privacy regulations. You can see the solution’s range of anonymization capabilities enabled by OpenVINO in the Edge & 5G showcase.

And this is just one example of how Intel tech is transforming transportation. See how IntelliSite software powers VectoLabs’ Smart Driver Behavior Analytics and the Supply Chain-as-a-Service for Intelligent Transportation.

Speaking of factories, the industrial segment has been one of the early adopters and beneficiaries of AI at the edge. BlueSkies.AI will illustrate how it’s done in a walkthrough of AI-based product inspection on commercial-off-the-shelf (COTS) industrial PCs, which is a great way to get acquainted with the tech.

From there you can gain an understanding of the actual AI models powering these applications. Another must-see demo is the Baidu PaddlePaddle deep learning-based worker safety system and the open Industry 4.0 platform from the IndustryFusion Foundation (IFF)—each uses OpenVINO as a key optimization enabler and solution building block.

There’s plenty happening beyond the world of AI, like next-generation industrial networking and cloud-native application development. Intel Innovation has you covered here as well.

Discover how Microsoft and Intel are enabling 5G in a joint session on everything from the new radio access network architecture to how it works with time-sensitive networking (TSN) in factories. Then apply that knowledge in a collaboration with Real-Time Systems GmbH and Intel called “Building Modern Real-Time Applications with Intel Time-Coordinated Computing (TCC)”. You’ll see how industrial networks are getting an overhaul.

You can also uncover ways to unify applications across the cloud-edge continuum in “Introducing the Intel® Smart Edge Software Portfolio,” where Intel will reveal an edge-native software stack for managing private 5G networks.

Innovation You Can Take to Market

If you’re just starting your journey, Intel Innovation shows how developers can get off the ground with prototyping platforms and open-source code.

For example, Sergio Velmay of consulting firm Bravent, winner of the ADLINK and Intel 20/20 Vision Hackathon, used OpenVINO and a Vizi-AI development kit to generate 3D simulation models of industrial work cells. Find out how ADLINK helped take Sergio’s design to production—and start tinkering for yourself.

If you’re ready to go a step further, Intel Principal Engineer Hassnaa Moustafa presents “Rapidly Build Commercial Offerings with Edge Reference Solutions.” This practical training session reviews off-the-shelf platforms that Intel has tailored for use in specific applications. And best of all, it shows how you can leverage them to get to market quickly.

Regardless of your skill level, role, or industry, there is a path into the hyperconnected era of AI everywhere. Take a big step forward at Intel Innovation.

Learn more, register for free, and start building your personal agenda today.

Omnichannel Retail: Put Your Commerce Tech in One Basket

The global pandemic has caused retailers to come face to face with their customers’ adoption of hybrid digital and in-store shopping. Business survival has forced them to follow a customer’s journey across multiple verticals, channels, and devices. In today’s environment, merchants need to deliver a consistent and data-driven customer experience across every touchpoint.

The convergence of the physical and the digital is redefining retail. The beauty and strength of the physical storefront is being combined with the efficiency and intelligence of digital technologies.

One example is the rising trend for consumers to have retailer apps on their phones. And retailers are equipping store personnel with mobile devices to improve productivity, customer service, merchandising, and sales.

Edge Computing Powers Retail Agility

The dramatic swing of consumer shopping preferences, such as purchasing online and picking up in-store, has led to demand for consistent experiences regardless of interface, location, or device. To meet these multiple new operational challenges, retailers need:

  • Data integration that enables omnichannel marketing potential, including information on all customer interactions during the different stages of their journey.
  • Compelling marketing attribution to know the impact of various touchpoints on shopping behavior and measure the ROI of their marketing spend.
  • A corporate focus on consumer privacy protection that meets legal requirements of national laws and social expectations of all customers.

All this demands a sophisticated edge computing strategy, from point of sale to the often-overlooked “back-of-house” operations.

The beauty and strength of the physical #storefront is being combined with the efficiency and intelligence of #digital #technologies. @MakeItFlooid via @insightdottech

Flooid, a global provider of enterprise-scale solutions for retailers, helps address these rapidly changing dynamics. Its Flooid Unified Commerce solution is a platform for delivering retail innovation (Video 1).

https://www.youtube.com/watch?v=9_ySrfHSDRo

Video 1. Retailers gain more agility with the Flooid Unified Commerce platform. (Source: Flooid)

“The platform at its heart is a commerce engine that does all the fundamentals of the selling system: product, price, promotion, basket, calculation, and payment,” says Martyn Osborne, Flooid’s CEO, EMEA and Group Chief Product Officer. “For example, it gives retailers options like consumer mobile, where customers go into the store, scan products with their camera, use their fingerprint to pay, and then go.”

Microservice Infrastructure

By using open-source technology, the Flooid platform embraces a modern and flexible infrastructure. This design framework ensures scalable and robust operation that provides resilience from the in-store edge to the cloud.

It supports many different formats and touchpoints—grocery, general store, department store, fashion, pharmacy, food service, specialty, and more—through a single application architecture.

For example, Marks & Spencer, a large department store and grocery retailer in the U.K., wanted to build a consumer-facing mobile app called “Scan Pay & Go”. Its in-house digital team developed the application, but the engine behind it is the same one that handles pricing for all of the endpoint applications running in the store.

“They don’t have to come back to us every time they want to change that mobile app,” says Osborne. “But what they are getting behind the scenes is all the heavy lifting, such as pricing and promotions, payments, taxes, and more, which is complicated.”

By using on-demand services for its core functionality, Flooid ensures that the customer can easily re-engage through other devices or channels no matter where a feature is first accessed. Its modern architecture extensively uses microservices to deliver operational speed and resilience through small-footprint edge POS devices, manager and associate tablets, and consumer personal devices.

This consistent and integrated approach to the customer experience embraces an omnichannel strategy while simultaneously respecting consumers’ privacy.

Scan, Bag, and Go

A fast-growing sign of how technology is redefining shopping is the broadening use of scan-bag-and-go technology. The first British chain to adopt this customer-facing technology was Flooid customer Waitrose, a high-end grocer. The key to its investment in payment technology was ensuring that it met customers’ needs.

The system developed through Flooid allows people entering a store to pick up a handset or download an app and use their mobile phones to scan products as they shop. This approach greatly improves the checkout experience and now represents about 40% of all in-store transactions.

As part of the company’s broader commerce strategy, the Flooid architecture includes a basket pricing system that services the self-scan solution to look up prices and customers on its commerce platform.

“They took our APIs and built their own front-end application on our engine. Other than our initial training, the Waitrose team developed the solution on their own,” says Osborne.
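
As an illustration of the pattern Osborne describes, a retailer front end calling a vendor commerce engine, here is a hypothetical Python sketch. The base URL, endpoint, and fields are invented for illustration and are not Flooid’s actual API:

  import requests

  # Hypothetical basket-pricing service; not Flooid's real API.
  BASE_URL = "https://commerce.example.com/api/v1"

  def add_scan_to_basket(basket_id: str, barcode: str) -> dict:
      """Post a scanned barcode and get back the re-priced basket
      (items, promotions, total) from the commerce engine."""
      resp = requests.post(
          f"{BASE_URL}/baskets/{basket_id}/items",
          json={"barcode": barcode, "quantity": 1},
          timeout=5,
      )
      resp.raise_for_status()
      return resp.json()

  basket = add_scan_to_basket("b-1234", "5000169001234")
  print(basket["total"], basket.get("promotions", []))

The value of the pattern is that pricing, promotion, and tax logic stay in one engine, so every front end (handset, app, or POS) returns the same total.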

Staff Mobility Enriches Experiences

Still another innovative example is the recent deployment by a global grocery chain. With stores in Canada and the U.K., the organization was stuck using traditional in-store grocery processes. Margin pressures and competitive challenges motivated the retailer to adopt mobile 10-inch tablets to enhance customer interaction.

“We’ve done quite a lot with mobile technology,” Osborne says. “We’ve got a big acceleration now around many of our customers wanting to push mobile POS and other devices in their stores.”

This adoption also required the data access, durability, storage volume, and data transport bandwidth of a global cloud infrastructure. Flooid Unified Commerce, deployed on Google Cloud Platform, met this challenge by leveraging its robust microservices architecture to deliver speed and resilience to in-store edge devices.

Intel® technology is an integral part of Flooid solution deployments. Its presence extends from back-end cloud servers to handheld edge devices, accelerating both POS and mobile device performance. And network edge operations feature small-footprint devices that can consume and process large amounts of data at the edge.

Along with its innovative technology, the effectiveness of the Flooid solution comes down to the company’s operation mindset. “We have a full 365-degree kind of visibility of the whole life cycle from inception, development, production, operations, and support,” says Osborne. “We deliver the full-stack service.”

AI-Driven Predictive Analytics Powers Machine Health

Machine builders often remotely monitor the equipment they sell to their manufacturing customers, but monitoring alone does little good. IIoT sensors collect enormous amounts of raw data about equipment health and performance, but it isn’t organized in a way that lets either machine builders (OEMs) or their customers create a transformational impact on their processes.

As a result, manufacturers can experience frustrating breakdowns on the factory floor. And machine builders miss out on opportunities to learn how their equipment is used and develop offerings that can significantly increase service revenue.

Applying advanced predictive analytics to the torrent of machine data opens a world of new possibilities.

OEMs can create AI algorithms that optimize machine performance in real time and reveal which equipment might fail and when, allowing technicians to fix machines before a breakdown. They can offer this help through as-a-service contracts that provide a steady source of income—and apply the same predictive models they use for their machines to determine optimal pricing.
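
To make that concrete, here is a minimal sketch of failure prediction on synthetic sensor data using scikit-learn. Everything here, from features to labels, is made up for illustration; a real project would train on curated historian data:

  import numpy as np
  from sklearn.ensemble import RandomForestClassifier
  from sklearn.metrics import classification_report
  from sklearn.model_selection import train_test_split

  # Synthetic stand-in for IIoT telemetry: rows are hourly readings,
  # columns are sensor features; label = "failed within 48 hours."
  rng = np.random.default_rng(0)
  X = rng.normal(size=(5000, 4))  # e.g., vibration, temperature, current, pressure
  y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=5000) > 2).astype(int)

  X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)
  model = RandomForestClassifier(n_estimators=200, random_state=0)
  model.fit(X_train, y_train)
  print(classification_report(y_test, model.predict(X_test)))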

#Edge applications, #PredictiveMaintenance, and as-a-service business models represent a new, more efficient, and more profitable future for #machine builders. @TCS via @insightdottech

“As they progress in their digital transformation journey, manufacturers are looking for ways to improve efficiency, extend the life of their equipment, and especially, to monetize data,” says Senthil Kumar, business head, connected products and services at global IT services company Tata Consultancy Services (TCS). “Today’s IIoT technology allows them to achieve these goals.”

Improve Service with Predictive Maintenance

Using predictive analytics allows manufacturers to keep their equipment in better shape, cutting down on expensive service visits and providing customers with more uptime.

For example, a construction equipment manufacturer continued to receive customer complaints about downtime. Remote monitoring alone won’t solve problems like these. Applying predictive analytics to the data is the key to success.

“That is where we jumped in and said, ‘Instead of just having you monitor the equipment, why don’t we run analytics in real time?’” Kumar says.

To set up real-time problem detection and failure prevention, TCS’s Equiptix platform collected historical machine data and ran models predicting weaknesses and break points for one of the 10 subsystems in the company’s excavators. It demonstrated a 50% reduction in downtime for that subsystem. The company now plans to extend Equiptix to the other nine subsystems.

Anticipating problems also saves the equipment manufacturer money on service calls. “Using data collected from their installed base, they can plan service visits in advance, instead of running out when they receive a call,” Kumar says.

AI-Powered Predictive Analytics Optimize Service Contracts

Analyzing data from hundreds or thousands of customer machines over time gives OEMs the confidence they need to offer the gold standard of customer service: performance guarantees.

“In the past, manufacturers tried to sell customers software designed to help them improve machine performance, but it was never successful,” Kumar says. “Customers want to know, ‘What’s in it for me?’ They want something that moves the needle.”

Performance contracts take away the headaches of managing complicated software and give customers results they can count on. In conjunction with real-time condition monitoring and predictive maintenance, they represent a new business model for manufacturers—one that provides a steady source of income from services, as explained in the comprehensive video below (Video 1).

Video 1. The TCS Equiptix platform enables machine builders to offer condition monitoring, predictive maintenance, and performance-based service contracts. (Source: TCS)

Service contracts also involve risks from variables outside the manufacturer’s control, such as weather, transportation problems, fuel costs, and the pricing of competitors’ offerings. Equiptix simulation models analyze all these factors to create contracts that are attractive to customers while providing manufacturers with an acceptable level of financial risk.

“We ensure that each contract is profitable, and all risks are managed properly,” Kumar says.

Equiptix also examines all the digital products a manufacturer makes and identifies opportunities for creating new service contracts, helping to expand sources of reliable revenue.

Unleashing the Power of Edge Applications

In addition to preventing breakdowns and enabling service contracts, analyzing machine data allows manufacturers to optimize ongoing machine performance for specific customer uses. “People want to process information at the edge, but most equipment today does not extract enough data to run the analysis,” Kumar says.

Working with TCS engineers, manufacturers can select the right data needed to create and deploy targeted performance-enhancing applications.

Edge applications provide TCS customers with complex capabilities—such as machine vision, mixed reality, and digital twins—in real time. High-performance Intel® processors allow customers to use these services on edge machines without connecting to the cloud. Manufacturers can boost efficiency, improve quality control, and tweak machine processes in simulation before spending money to deploy them in the field.

“That’s the power of the edge,” Kumar says.

A New IIoT Frontier

Edge applications, predictive maintenance, and as-a-service business models represent a new, more efficient, and more profitable future for machine builders. Getting there will take time, but manufacturers are starting to make the changes they need to reap the rewards.

“It’s not just a matter of technology—you’re changing the whole business model,” Kumar says. “It’s a long journey, but we are already seeing a lot of traction.”

Creating a Data-Driven Manufacturing Culture with RoviSys

[podcast player]

Are you tired of hearing the term digital transformation? That’s probably because it’s been misused and widely misunderstood. Manufacturers are under the impression that they must make large investments to be successful. But what ends up happening is they just waste time, resources, and money.

A successful manufacturing digital transformation effort starts small and grows from there. In this podcast episode, we uncover what’s behind successful digital transformation initiatives, how tools and technologies can help, and the common pitfalls to avoid.

Our Guest: RoviSys

Our guest this episode is Bryan DeBois, Director of Industrial AI at RoviSys, a leading automation and information solutions provider. Bryan has been with RoviSys for more than 20 years in various roles, including programming, software development, and software group manager. Today, Bryan is focused on implementing advanced technologies like artificial intelligence into the manufacturing and industrial space.

Podcast Topics

Bryan answers our questions about:

  • (4:34) What digital transformation means for manufacturers
  • (7:16) The benefits manufacturers are looking to achieve
  • (11:29) The state of digital transformation in the industry
  • (12:19) Why manufacturing digital transformation projects fail
  • (14:46) How to successfully embark on a digital transformation journey
  • (18:08) Where artificial intelligence comes into play
  • (22:19) How to get all stakeholders aligned
  • (25:35) The role technology plays in digital transformation

Related Content

To learn more about the digitalization of industrial operations, read Demystifying Digital Transformation for Manufacturers. For the latest innovations from RoviSys, follow it on Twitter at @RoviSys and on LinkedIn at RoviSys.

Apple Podcasts  Spotify  Google Podcasts  

Transcript

Kenton Williston: Welcome to the IoT Chat, where we explore the trends that matter for consultants, systems integrators, and end users. I’m Kenton Williston, the Editor-in-Chief of insight.tech.

Every episode I talk to a leading expert about the latest developments in the Internet of Things. Today I’m talking about digital transformation in manufacturing with Bryan DeBois, the Director of Industrial AI at RoviSys.

Honestly, the term digital transformation has been so overused in the last few years that it’s hard to know what it even means. I want to get Bryan’s perspective on questions like: what’s the hype and what’s the reality of digital transformation? Why is digital transformation taking so long? And how can you get your team to buy into your efforts?

But first, let me introduce our guest. Bryan, welcome to the show.

Bryan DeBois: Yeah, thanks for having me.

Kenton Williston: Bryan, tell me, what is your role at RoviSys, and what does RoviSys do?

Bryan DeBois: Yeah. So, I am our Director of Industrial AI. In that role I’m focused on the application of advanced technologies to this manufacturing and industrial space that RoviSys focuses on. My background was software. I’ve been with RoviSys for about 20 years now. And we look at—when we say industrial AI, we look at that as a very broad brush; so yes, you’ve got the AI and machine learning. But then my division gets involved in things like computer vision.

We’ve got drone projects going on. Really anything that’s advanced technology. And, again, the manufacturing world tends to be five to seven years behind most other industries in terms of technology. So some of the stuff that may be somewhat old news in other industries is just coming into the manufacturing and industrial world right now.

Kenton Williston: Got it. And, out of curiosity, what did you do before your current role?

Bryan DeBois: I’ve spent my whole professional career at RoviSys, but before I was in this role, I was focused on our information solutions and MES projects. For some of your listeners, if you’re not familiar—because I may use this term again—MES is manufacturing execution systems. These are big, complex software systems. They’re comparable to ERP-type of rollouts. They sit below the ERP in a manufacturing stack, but above the plant floor. And I was involved in those projects before moving into this industrial AI role.

Kenton Williston: Got it. Now, you’re touching on a, I think, really pertinent point that I’m sure we’re going to get into in our conversation today, which is that a lot of the technologies that you’re dealing with are unfamiliar to various groups within your customers. What I mean by that is, on the manufacturing side there’s all this industrial-automation-specific technology and MES systems. What does that acronym even mean if you’re an IT person?

And then, conversely, the IT systems have all this stuff going on that are kind of foreign territory to a traditional manufacturing expert, for example, typically working with systems that are super isolated in an industrial-automation context, and you don’t have things like security patches or. . .

Bryan DeBois: Right.

Kenton Williston: . . . anything like that going on. It’s like, “Just leave it alone. Let the IT guys keep doing their CI/CD approach. We’re going to just have that sit here for 20 years and be super solid, and please don’t touch my system.” It’s like a totally different worldview on the two sides.

Bryan DeBois: You’re absolutely right because, now with this IT/OT convergence, now we’ve got a whole new audience of folks who have not heard of any of this. And, frankly, these are not technologies that the typical IT person has ever been exposed to—has ever had to manage.

The second point there that you made, about something as simple as patching and life cycle management of an operating system on the plant floor—it has to operate differently and has to be on a different cycle than what you can typically do in an IT space.

When people talk about this IT/OT convergence, it’s the traditional IT roles, technologies, the servers, the governance, the security—all of that coming into this OT world, which, as you can imagine, has been a good and a bad thing. But it definitely has been a little bit of culture shock there from both sides in that invasion of IT into the OT space.

Kenton Williston: Yeah, for sure. And I think this has been something that’s been an ongoing challenge just in general, as everything gets increasingly digitized—how folks can work together in this very much not shared context.

And so that brings me really to the topic of our conversation today, which is digital transformation. It’s interesting to me—this idea has certainly been prevalent in the IT space for a few years. And, in fact, I’d say it’s gotten to the point where it’s been overused, so it’s hard to even know what it really means. But it is still a relatively new concept in the OT, plant-floor kind of space.

So, what does digital transformation mean to you in a manufacturing context? And why is this something that folks should be paying attention to?

Bryan DeBois: In the manufacturing world we’ve gone through a number of these digitalization movements, we’ll say. So, I’ve been around long enough to have seen Industry 4.0—but then even before that we had smart technology, and smart manufacturing was kind of one of the buzzwords. And these are great concepts and they’re important concepts, but ultimately so much gets kind of hung on these terms that they start to lose all meaning. Because if they mean something different to everyone, then they don’t really have any meaning at all.

The way I look at it is that there is still so much room to grow in manufacturing. I’d say there’s so much value that digitalization can bring to manufacturing still. Because, like I said, we’re typically behind the times—we’re typically five to seven years behind the times in terms of adoption of technology. But then, also unique to manufacturing is this idea of legacy equipment.

You have equipment—we regularly see equipment that is 15, 20, 25 years old. One of the stories that I like to talk about is, we’ve got a customer right now that is operating in their powerhouse. They’re operating a generator that was actually installed by Thomas Edison. So that was a 100, 105 years, something like that, that this generator has been operating. It actually operates at a different frequency—it was before we standardized on 60 hertz, so it has to be stepped up to 60 hertz. But it’s still there, it’s still running, and we’ve instrumented it and we have control over it. So we’ve got modern tools around it, but there’s just not really a compelling case to replace that generator for that customer at this point. And that’s the case we see in a lot of customers.

So the combination of legacy equipment, the combination of the—our industry is highly risk averse. So they’re not going to adopt the bleeding-edge technology. All of those things combine to bring about a great opportunity for digitalization. I don’t want to downplay the importance of digital transformation. It is something that—that we’ve been working on, again, for a long time, at trying to—sometimes kicking and screaming—bring the OT space and OT customers into this modern era, into the 21st century.

Kenton Williston: And why is that? What benefits are you looking to achieve?

Bryan DeBois: I know for me, personally, I do believe that manufacturing and production is a world-changing process, that I think that manufacturing is the lifeblood of any economy. But more so than that, the fact that we have so much available to us that’s of higher quality—and it’s cheaper and it’s prevalent—is a direct result of our ability as humans to manufacture things and produce products at high quality.

And so the more that we can increase these manufacturers’ efficiency and the more that we can—yeah, it makes them profitable and that’s all great and everyone wants to make money, but I really see it as a deeper need as a society, because that’s what raises the bar and the quality of life for everyone, frankly.

And so I really believe strongly that digitalization is the path to make these manufacturers more productive, more efficient. And, frankly, also to make the lives of those folks who actually operate these plants better.

Kenton Williston: It’s interesting. So I think there’s a couple of things there that really caught my attention. So, one I want to come back to in a minute is that it’s not just about the dollars and cents, right? It’s about the environment people are working in and the company culture. I think there’s a lot to talk about there.

But the other thing that I’m hearing is that fundamentally digital transformation is about using data, but you’ve got to instrument things to get the data and then apply intelligence, i.e., AI—if I want to use too many abbreviations next to each other—you’ve got to apply that AI to do something with that data, and then of course the human beings are engaged in a way that allows them to—like you said—operate at a higher level of abstraction, do higher-level work that’s more meaningful and more valuable.

Bryan DeBois: Mm-hmm.

Let’s talk about the human factor first, because of a question I get a lot. I work for a company that consults on and implements projects to bring technology to manufacturers, and one of the most common questions I get—even at parties and things like that, when people find out what I do—is, “Oh, so you’re putting people out of work.” Let’s address that head on. I’ve literally never seen, in my 20 years, one of our customers do a workforce reduction because of one of our projects. And, as we’re seeing right now, they don’t have enough people. They need more people than are available right now; in every single case those people are typically reallocated to higher-value tasks like you’re talking about, and, frankly, tasks that they’d rather be doing. Nobody wants to compile Excel reports. If I can put in logging that automatically records all those values—besides the fact that you’re going to get more accurate data out of that equipment—that’s just a better situation for that operator. They can stay focused on what they typically like to do, which is to operate the machinery.

But the other thing you talked about, that’s so critical to digital transformation—ultimately what we’re trying to create is a data-driven culture. And so a great example of that—we worked with a company that presses aluminum wheels, and they described the situation before we came in and did a digital transformation project. They were focused on continuous improvement—which was great, and that’s one of the things that I think separates some of the great manufacturers from not—is those that focus on continuous improvement. And so they had dollars allocated every year for continuous improvement.

However, where those dollars went, oftentimes, would simply be the loudest voice in the room. And so they said that after we were there and after we did this project, that they now consider themselves a data-driven culture. And those funds, those limited funds, would now go to whoever had the best data story. And so your ability to go into the system and find the data that supported your particular project—those were the people that won.

Kenton Williston: What would you say is the state of affairs? It sounds like some of the folks you talked to have been—you mentioned sometimes folks will come in with a big sales pitch about digital transformation and AI—you kind of promise the moon and the stars, and it’s just not very realistic.

Bryan DeBois: Yes.

Kenton Williston: I’m sure you encounter people who have been disappointed, and I’m sure you’re encountering people who are just trying to figure out what all this means. What would you say, overall, is the state of digital transformation in the industry?

Bryan DeBois: I definitely think that we’re past the hype cycle now in digital transformation, and we’re unfortunately starting to see some of these projects fail. The good news is that there’s still lots of projects that are successful in some form or another. But, yeah, I think that there’s definite trepidation around this.

And, as you mentioned, there’s been a lot of vendors—there’s been some really big consulting companies—that have come in and promised the moon and said, “Give us 10, 20 million dollars and we’ll transform you digitally,” without a real clear strategy on how that’s going to be done.

Kenton Williston: Yeah, absolutely. So, on that point of the misfires that folks have experienced—what do you think are some of the biggest pitfalls that have caused these projects to not succeed? Or at least not succeed to the extent that manufacturers were hoping.

Bryan DeBois: So, those pitfalls are also the things that, if you want a road map on how to do it right, these are the things to watch out for. The first one I would say is that they didn’t start small—and so we are very big on: walk before you run, phased approach—whatever you want to call it—small projects, small wins, and having that momentum.

Then you roll into the next project and the next project. That’s where we’ve seen the most value. That’s where we’ve seen the most progress. As you’re defining those smaller projects, you need to be tying them to use cases. One of the advantages, too, to leading with a use case is that it solves the problem of pilot purgatory, right? Or POC purgatory. We hear this all the time: “Well, we did a couple of pilots on that, but then it didn’t really go anywhere.”

Typically, the reason why is because you didn’t have a clearly stated use case; you didn’t have this clearly stated goal as to a business case typically tied to financials. That was, “Here’s what we’re going to do or save because of this project.” So, what we do is, we put the problem first and so we prioritize that. We say: “What’s the most important problem that we can solve with a certain budget, with a certain constraint on timeline,” and things like that. And then we go and solve that problem with technology.

So, now that customer has a proven solution to a problem. Now the rollout part—the part that’s past pilot purgatory—that part’s easy because you have a working solution to a problem. Where else do you have that problem? “Well, we have it at five other plants.” Great. Let’s roll it out to five other plants. It’s a whole different way of thinking about these types of things than the traditional: “Well, let’s do a POC, and then why don’t you give us $2 million or $5 million in licensing, and we’ll roll it out everywhere.” That doesn’t work. So, leading with use cases and making sure that we’re starting with that is so critical as well.

Kenton Williston: Got it. So, you’ve partially addressed this question, but I want to dig a little bit deeper on how RoviSys approaches this digital transformation journey, and how you help companies navigate the path. And it sounds like a key part of that, like you said, is starting small. And I suspect just from what I’m hearing you say that there’s another part of it which is that you’re coming in without an agenda, so to speak, right?

Bryan DeBois: Yes.

Kenton Williston: So, maybe you could expand on that, and then if there’s more, you could tell me about just the overall process of how you would help a manufacturer embark and successfully navigate this digital transformation journey.

Bryan DeBois: Absolutely. So, two parts to that answer. The first part that you touched on is so important, and this isn’t unique to us, but most systems integrators—they don’t have to push any one particular product. We are independent. So, we make that very clear to all of our vendors. We are ready to simply look at the problems that you have, and then we’ve got a whole toolbox of products and platforms that we can implement to solve that problem.

The second part of your question is, how do we go about solving that problem? So, we do have kind of a—I wouldn’t even call it a playbook, but I would call it kind of a loose path that we typically follow that’s proven and seems to be the right way to approach these digital transformation projects.

So, just a quick glimpse into it at a high level. The first thing we’re going to look at in a digital transformation project is their OT data infrastructure. I mentioned historian before. We think the historian is such a critical, foundational part of these projects. Typically the very first thing we ask is, “Do you have a historian? Is it comprehensive?”

“Yeah, we got a couple.”

“What kind of coverage do you have in your historian? Is it capturing 100% of your data? 50%, 20%?”

“Well, you know, maybe 20%, 25% of our data is being captured in the historian.”

So, one of the first things that we’ll do is we’re going to try to expand that coverage so that they’ve got more data to work with. The other thing that historians do is they become that OT data infrastructure.

They become—the thing that everyone’s asking for is: “We’ve got all these different vendors and the plant floor, and this plant was an acquisition.” Whatever. “I want one system I can query to get to that process data.” Well, that’s the historian. So, it becomes that common platform to query. As part of that, there’s typically networking upgrades that have to happen too.

And then the next thing we’re going to look at is, we’re going to look at OEE and visibility. So, OEE has been around for a long time. It’s availability, throughput, and quality. And while it’s been around for a long time, there’s a reason why it’s lasted as long as it has. It’s one of the easiest ways and most approachable ways to start on a digital transformation journey. So we oftentimes will reach for that next in our toolbox.

And then, along with OEE, comes just visibility. So many of these projects, once we’ve put in a historian for them, and we’ve got some displays that visualize their process—that’s transformative. Sometimes that in itself can lead to so many wins. But now that you’ve got that data maturity, you’ve got OEE as a KPI that’s talked about, and you’ve got this data-driven culture—now you can start looking at things like OT data warehouse. Now you can start looking at combining all that process data, other sources of data on the plant floor, MES—there’s relational and transactional sources of data on the plant floor. Now you can look at combining all of that into a single data warehouse and starting to query that. And that oftentimes is the carrot that’s dangling in front of IT—that’s where they want to get. They want that OT data warehouse that makes it easy for them to unlock all of the value of that data on the plant floor.
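
For concreteness, here is the conventional OEE arithmetic (availability × performance × quality) as a quick sketch. The shift numbers are made up, and note that what Bryan calls throughput is usually labeled performance:

  # OEE = availability x performance x quality (conventional definition);
  # all figures below are invented for illustration.
  planned_minutes = 480        # one 8-hour shift
  downtime_minutes = 47
  ideal_cycle_sec = 1.5        # ideal seconds per part
  total_parts = 14000
  good_parts = 13650

  run_minutes = planned_minutes - downtime_minutes
  availability = run_minutes / planned_minutes
  performance = (ideal_cycle_sec * total_parts) / (run_minutes * 60)
  quality = good_parts / total_parts

  oee = availability * performance * quality
  print(f"A={availability:.1%}  P={performance:.1%}  Q={quality:.1%}  OEE={oee:.1%}")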

Kenton Williston: So, with that all in mind, one thing you haven’t mentioned yet that I’m very curious about is where AI enters the picture here. Because, of course, that’s your title—you’re the Director of Industrial AI. We haven’t really talked about where that factors in. So, could you tell me a little bit more about that?

Bryan DeBois: Yeah. So, I talked previously about the carrot that we dangle for IT, and that’s unlocking this data on the plant floor. So AI then becomes a further carrot. Sometimes I will say it’s a little frustrating being in the role that I am, because you walk in to customers, and they’re super excited to talk to you about AI, and they’ve got big vision about what they want to do around analytics. And, unfortunately, then you take a look at their data infrastructure, and you’re like, “Uh, you’re not anywhere near where you need to be in terms of OT data infrastructure to even be able to take advantage of some of this stuff.”

One of the things that AI needs to feed on is data, and it needs lots of data. And not only does it need a ton of data, it needs very, very clean data. That’s just not what we find.

And so I tend to have to be the bearer of bad news that there’s a lot of foundational work that we’re going to have to do before we can really take advantage of some of this AI.

The other piece that’s important to remember is that they oftentimes, again, have been sold a bill of goods from some of these big IT vendors: “Well, we can build a model for you in a week.” And they can, but there are some caveats there. First off, that model is going to be built with CSV exports. So it’s all going to be data that was extracted and manually cleaned by somebody—a data scientist, typically—on the vendor’s team. That’s what they’re going to build their model off of. And they’re going to build a model, and it’ll have predictive qualities, but they won’t have any idea of how to actually deploy that model and put it into operation on the plant floor.

And one of the key aspects of that is hooking it up to those real-time data sources—the same ones that they manually export data from and manually cleaned all that data. Those are the data sources that have to feed that model then in the future for it to actually be able to predict anything.

And one of the other aspects of AI that I like to emphasize is that all of that work that I just described, all of that—you’ve made no money at this point off that; that was all invested dollars. So, until an operator, a supervisor, someone on the plant floor is making decisions based on the predictions of that model, you haven’t seen a dime from your investment. Everything up to that point has been a giant science experiment. And so it turns out that operationalizing the AI is actually the hard part.

And that’s the part that we’re focused on. We’re trying to marry, in our division, 30 years of plant floor experience and information-solution experience with the AI and the data science. And that, I think, is where—and we’re not the only ones doing it—but that I think is where we’re going to move the needle: focusing on how we actually put those AI systems into operation on a plant floor that has to run 24/7, with operators who oftentimes may not trust that AI versus their own eyes, ears, and smell that they’ve developed over many, many decades.
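
A minimal sketch of that operationalization step might look like the polling loop below. The read_latest_features helper is a hypothetical stand-in for a historian or OPC UA query, and the model file, tag names, and alert threshold are all assumptions:

  import time
  import random
  import joblib  # assumes a model trained offline was saved with joblib

  model = joblib.load("failure_model.pkl")  # placeholder path

  TAGS = ["vibration", "temperature", "current", "pressure"]

  def read_latest_features(tags):
      """Hypothetical stand-in: a real deployment would query the plant
      historian or an OPC UA server for the current value of each tag,
      applying the same cleaning used at training time."""
      return [random.gauss(0, 1) for _ in tags]

  while True:
      features = [read_latest_features(TAGS)]     # same features, same order as training
      risk = model.predict_proba(features)[0, 1]  # probability of the "failure" class
      if risk > 0.8:
          print(f"ALERT: failure risk {risk:.0%}; notify maintenance")
      time.sleep(60)  # poll once a minute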

Kenton Williston: Yeah, absolutely all good points, and I think, again, just so much of this is about data. Going back to our top-level point here—that digital transformation equals data-driven culture—well, you have to have the right data, right place, at the right time, even the right format, right? You’ve got to get all those things right for this data to be in any way useful towards driving a difference in how you operate.

But I think there’s another piece to this, too, that I really wanted to come back to, which is the cultural part. And you touched on a little bit of that just now with the operators who’ve got their own ideas about how to run things, which are well founded, by the way—they know what they’re doing; that’s the reason to trust their eyes, ears, and nose, and all the rest. Because they’ve got the experience.

So you’ve got that perspective, and then you’ve got, of course, the engineers who were building all the industrial-automation equipment, who have a totally different perspective. And then, of course, the IT folks, like we talked about, have yet again another perspective, and last but not least, the people-who-hold-the-purse-strings perspective as well. Right?

Bryan DeBois: We’ve got to respect their perspective too, yeah.

Kenton Williston: So, how do you get all these stakeholders aligned on the digital transformation goals and process? Where do you start first, if you’re going to start small and walk before you run—what does that first thing look like? How do you achieve that alignment?

Bryan DeBois: If you’re leading with use cases, you’re going about it the right way. So, you start by identifying what the problems are, and then prioritizing the problems that are the right combination: achievable in a certain amount of time, maybe least expensive to bite off, but with the biggest impact. We want to identify those pretty quickly, pretty early on. And we’ve got certain workshops and things that we do to bring all those stakeholders together, to generate those ideas and then prioritize them. And so those are great opportunities right from the beginning to try to get everyone engaged.

So the second thing that we do is, we’ve got ideas about what projects are going to have the biggest impact. So now we take a couple of those off the top, and we go around and we do an assessment.

And so that process—we talk to all of those stakeholders, we talk to maintenance, we talk to management, we talk to operations, we talk to everyone involved—and we talk about those specific couple of use cases that we’re going to pursue. We talk about, what data do we need to make that project a success? What data are you going to—can your systems contribute to this? And what kind of organizational change management is going to be required to change the way that we operate in this future state? So we capture all of that into an assessment. And then, typically, we’re getting in front of those purse string holders that you talked about, and we’ve got all the documentation now.

We’ve got a pretty clear plan of how to get from A to B. Again, we’re focused on specific solutions—forget digital transformation and all of the buzzwords. Here’s a problem you have, and here’s a road map for a solution that would solve that problem, and here’s roughly what it would cost. And so when you lay it all out like that, it’s actually pretty easy to get everyone on board, to get everyone excited, and to get those purse string holders to say, “Okay, yeah, let’s do it. Let’s try this first one.”

Kenton Williston: One thing I’m curious about here. We haven’t really talked about the underlying technology, and I think there are some important things happening there. So, for example, we’ve talked about AI, and how there are some cases where it’s been oversold, and I think this is still a very young technology. I think just from the perspective of how well AI techniques have been developed there’s been a lot of progress recently, and I think that progress is continuing. So, there are more and more use cases where AI can actually deliver real benefits.

And there’s also, of course, the ongoing march of technology on the hardware side. Personal note here, that this is an Intel-produced podcast, so of course I’m going to be biased towards all the amazing things Intel’s doing. Nonetheless, it’s true on both of the sides.

I’m thinking about things like, on the software side, the OpenVINO toolkit, which is great for many reasons. One of the things, like I mentioned, is there are ever more use cases that are available out of the box, so to speak—predefined algorithms that you can use as a starting point to help accelerate things. Then, of course, from the hardware side, obviously the hardware doesn’t just continue to get faster; there are more and more specialized AI features built in. So, how do you see technology helping these digital transformations along, whether it’s AI or any other elements?

Bryan DeBois: It’s been really exciting to see the advancements that have happened even in the last five years in this space. And Intel, of course, is leading the charge on a lot of that. When we talk about digital transformation, one of the projects that we oftentimes will do is our MES or MES-light types of rollouts—and those oftentimes require new kiosks at each of the different work cells. So each work cell now needs its own kiosk so that it can give work instructions to the operator that maybe were on paper before, and so that it can record information about the operations that operator did at that work cell. All of that now requires industrial PCs. So you’ve got smart PCs that have to be rolled out—25, 50 different industrial PCs across all these different work cells where they didn’t exist before.

So there’s a ton of value that these industrial PCs bring to the equation. Obviously you’ve got the edge play, the IoT play, more and more smart devices everywhere. But then, specifically from the AI perspective, it’s been really exciting to see these chip vendors focus on specifically AI workloads. One of the interesting things—that the focus right now seems to be on vision. And so there’s a lot of value that computer vision can bring to a manufacturing facility. But what I’m excited about is what’s next. Whereas there’s definitely problems that computer vision can solve, we see just a whole lot of data that is coming from the plant floor that is not vision related, but still has a lot of value in AI applications.

So we’re excited for what’s next, where we start to see some dedicated hardware for processing AI workloads outside of vision, because that’s going to be really exciting.

And then, of course, you’ve got—when we talk about historian rollouts and MES rollouts and things like that—these typically require a good amount of hardware, and so, the Intel servers and things like that that we would typically roll these things out to.

And then, finally, you have the cloud. So there’s definitely a lot of customers that are saying, “Look, we’re about to make this big investment in a comprehensive enterprise historian, but, frankly, we don’t really want to allocate all of those server resources on site and have all of that IT footprint on site.” And so they’re looking to the cloud to roll out these really big OT deployments. And so, of course, I know that Intel makes a big impact there, too, across the different cloud providers.

So, yeah, I think that there’s impact all across with Intel and the other hardware providers. I think that they are really pushing the envelope on what’s possible.

Kenton Williston: Well, Bryan, I’ve really enjoyed talking to you, and we’re getting close to the end of our time together. So I just wanted to give you a chance if there’s any kind of like key takeaway you would like to leave with folks who are thinking about embarking on a digital transformation effort—what would that be?

Bryan DeBois: Yeah. So my pitch has always been: involve OT early. A lot of these projects nowadays are being driven almost exclusively by IT, and that makes a lot of sense for a number of reasons, but it’s so critical to get OT to the table early in these projects. And then I would take it a step further and say: also involve an OTSI (an OT systems integrator).

We have a great amount of knowledge about the different technologies and platforms and things that are out there; and can definitely help guide during the ideation process; can guide the conversation on what’s feasible, what’s not; where the lowest hanging fruit is; and all of that kind of thing. Start small, focus on use cases, and build that business case early on and get those wins. Build that momentum, and start to develop that culture for digital transformation.

Kenton Williston: Got it. Well, with that, let me just say, thank you so much, Bryan, for sharing your thoughts with us today.

Bryan DeBois: Absolutely. Thank you, Kenton.

Kenton Williston: And thanks to our listeners for joining us. To keep up with the latest from RoviSys, follow them on Twitter at @RoviSys and on LinkedIn at RoviSys. If you enjoyed listening, please support us by subscribing and rating us on your favorite podcast app. This has been the IoT Chat. We’ll be back next time with more ideas from industry leaders at the forefront of IoT design.

Take the Pain Out of IoT Projects with Microsoft

[Podcast Player]

The pandemic accelerated nearly every IoT project. Whether it is curbside pickup, online ordering, telehealth, wellness monitoring, or work-from-home services, organizations across the globe are putting IoT applications on a fast track.

But getting these new applications deployed has never been more complicated. Supply chains are experiencing disruption. New workloads like AI are pushing the boundaries of compute power. Interoperability challenges are slowing development. And many organizations are missing key skill sets and resources.

In this podcast, we talk about the key pain points of IoT development, the importance of having a developer-friendly platform, and how to address IoT interoperability challenges.

Our Guest: Microsoft

Our guest this episode is Pete Bernard, Senior Director of Silicon and Telecom in the Azure Edge Devices Platform and Services Group at Microsoft. Pete has been with Microsoft for about 16 years, and is currently focused on telecommunications, mobility, IoT, and how to successfully drive disruption with innovative engineering and strong partners.

Podcast Topics

Pete answers our questions about:

  • (4:20) The major trends and challenges of IoT development
  • (5:27) The rise of intensive workloads and the role of edge computing
  • (6:43) How the pandemic has changed IoT app considerations
  • (8:43) Why being IoT developer-friendly matters
  • (12:57) The pain points of IoT interoperability
  • (22:05) How Microsoft is trying to reduce IoT friction
  • (24:43) The types of skill sets and resources organizations need to have
  • (29:03) A look at medium- to long-term IoT priorities
  • (31:47) How to make the life of an IoT developer easier

Related Content

For the latest innovations from Microsoft, follow them on Twitter at @Microsoft and LinkedIn at Microsoft.

Apple Podcasts  Spotify  Google Podcasts  

Transcript

Kenton Williston: Welcome to the IoT Chat, where we explore the trends that matter for consultants, systems integrators, and end users. I’m Kenton Williston, the Editor-in-Chief of insight.tech. Every episode I talk to a leading expert about the latest developments in the Internet of Things.

Today I’m talking about ways to remove friction from IoT projects with Pete Bernard, Senior Director of Silicon and Telecom in the Platform and Services Group at Microsoft.

So, what makes commercial IoT applications so challenging? Well, tight schedules, growing complexity, and the need for new skills in areas like AI and security are just a few of the issues that come to my mind. But I want to know how Microsoft is looking at these issues.

So let’s get right to it. Pete, welcome to the show.

Pete Bernard: Thanks for having me.

Kenton Williston: Can you tell me a little bit more about your position?

Pete Bernard: I’m responsible for all of our silicon partnerships for the edge, as well as our, kind of, telco work around 5G edge and AI. I’m part of the Azure Edge Devices, Platforms, and Services Group, which is a real mouthful. Sometimes we call it AED-Plus or AED-Please, if you want to be polite about it—which is all part of the wonderful world of Azure.

Kenton Williston: Perfect. And what brought you into this position—what did you do before this?

Pete Bernard: Well, I’ve been at Microsoft about 16 years. Came from Silicon Valley, came from sort of software-meets-hardware background. I was originally a BIOS engineer at Phoenix Technologies and did that for a while. And then did a little startup stuff, and did some mobile Java stuff, and eventually got to Microsoft. And so I’ve had a number of roles through the years here, mostly in that, kind of, more edge space, and especially where edge means telco, but also where semiconductor partners are involved.

Kenton Williston: Very cool. I didn’t know that—the BIOS stuff has always been really interesting to me. And of course it’s one of the things that’s very differentiated about when you’re talking about embedded systems for IoT versus PCs, it’s one of the things that starts to be really important.

Pete Bernard: It is. It’s one of those things you ask yourself: what happens once the power turns on, and what exactly happens on a board, how do all the chips get initialized, and how do they get powered—and all that kind of messy stuff under the covers that we sort of take for granted when we use these devices, right? These days, a lot of software development—very high-level software development—is very visual, but when you get down there onto the board and to the metal, that’s pretty tactical and practical, but it’s fun.

Kenton Williston: That’s a really awesome segue into what I wanted to talk to you about today. So, I’ve been working in the embedded/IoT space for—I’d rather not say that it was 20 years, but it is. And back in the day—and even until quite recently—these things, like just the frustrations of trying to get a board working and trying to get your operating system loaded—I mean, just getting those basic things done has just been a huge headache. And, as time has progressed, of course these systems become more and more complicated, more and more interconnected. So the level of effort that’s required to, kind of, from the ground up build something has just got to be huge.

Pete Bernard: Right.

Kenton Williston: And I think there’s a real pressing need today to really simplify things. So you can get to the value, and not just be spending all your time on the basic bring-up-and-try-and-get-everything-just-working.

Pete Bernard: It’s a very solutions-oriented market out there, I would say, especially in what I would consider the edge ecosystem. If you’re in other ecosystems maybe it’s a little more homogeneous, a little more straightforward. But out here on the edge, if you talk to commercial customers, like we do all day, they have some pretty complicated problems, and those problems are not solved with a single device or a single app.

And so quite often it’s about how do these systems work together, how do they work together securely, how can they be managed, how can they be deployed. You’ve got an ROI envelope, right? That all this stuff needs to get into. There are CapEx costs, your OpEx costs; what workloads run on the cloud versus the edge or the near edge. So the solutions right now are quite complicated. And, yeah, I mean, you can’t afford to spend that much time sort of bringing up your board. You’ve got other fish to fry.

Kenton Williston: Yeah, absolutely. And so, to that point, you mentioned some of the key areas of concern you’re seeing, such as, for example, security, which is a huge one, as these things get all interconnected. So, what do you see as some of the major trends/challenges that folks are looking to tackle these days?

Pete Bernard: Like I said, it’s a heterogeneous space. So, quite often you’ve got lots of different things connected. And how do you have a kind of a developer fabric over that solution so that you can develop and deploy solutions? So, for example, AI models that are maybe trained on the cloud and Azure, and then get deployed to the edge—how are they optimized for the silicon on the edge? And I know we’ve been working a lot with Intel® on OpenVINO™ and working well with Azure, and a platform we call Azure Percept, which we launched in March. But that’s just one example of where the silicon characteristics and the capabilities of the silicon—which are really becoming pretty amazing—you really need to be able to take advantage of those to get the performance and the power to solve those problems. So, that’s just, kind of, one area that we’ve seen.
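
As a concrete sketch of the edge-to-cloud wiring Bernard describes, here is a minimal example that reports a local inference result to Azure IoT Hub using Microsoft’s Python device SDK (pip install azure-iot-device). The connection string and payload are placeholders:

  import json
  from azure.iot.device import IoTHubDeviceClient, Message

  # Placeholder; a real connection string comes from your IoT Hub device registry.
  CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<device>;SharedAccessKey=<key>"

  client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
  client.connect()

  # For example, report the result of a local (edge) inference up to the cloud
  payload = {"device": "camera-01", "label": "person", "confidence": 0.93}
  client.send_message(Message(json.dumps(payload)))

  client.shutdown()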

Kenton Williston: Perfect. And you give me a perfect opportunity—speaking of things that are perfect—to note that this podcast is an Intel production, so take everything we say from that perspective.

I do absolutely agree, though, with that said, that I think some of the things around AI and machine learning and computer vision and all that cluster of technologies have been incredibly important. And I think that’s probably one of the single biggest things that has changed, I would say, over the last two or three years—is the rise of these really intensive workloads that really need to be done at the edge because you’re talking about just such massive amounts of data. Like, if you’ve got a vision system, you can’t just pipe that stuff back up to the cloud; it’s just not practical. So you need a lot of compute at the edge.

Pete Bernard: It’s interesting. We’ve seen sort of the evolution of, kind of, standalone systems, or disconnected systems. And systems connected to the cloud that can just send data up to, kind of, simple sensors. And now we have the cloud talking to the edge, and the edge talking to the cloud. And now we have the edge itself becoming more—I don’t know what the right term is—more, I don’t want to say self-aware, that sounds pretty scary. But basically you can run a lot of compute on the edge and sort of asynchronously from the cloud, and you have to really figure out how those things can work together. But you’re right, there’s a lot of new scenarios out there where you need some pretty high-performance edge computing, and maybe that’s for privacy reasons, maybe that’s for bandwidth and kind of egress reasons, like you mentioned. But it’s pretty exciting to see some of the innovation that’s happening out there, and the way people are using it.

Kenton Williston: So, a big question that’s on my mind, if we’re going to be talking about recent trends, is, obviously the world’s been turned upside down over the past 18 months or so, with the pandemic. And has that changed what people are considering with their IoT applications?

Pete Bernard: I think so. I mean, for sure. I mean, there’ve been many heroes of the pandemic, but the internet is one of the heroes of the pandemic, right? It’s kept us all connected and working throughout this whole thing.

But I think people are thinking a lot more about—beyond just general acceleration—like, everything’s sort of accelerated. All of the experiments we had cooking—that we were going to do two or three years from now—have all been deployed. And you’re seeing a lot more AI-vision work, and things like curbside pickup and online ordering, and all kinds of systems like that.

Automation has really accelerated a lot, I think, through the pandemic. And then we’re seeing a lot more things around healthcare and optimizing that, and using a lot more information at the edge to make sure people can have a smooth experience, and an authenticated experience. The pandemic has just sort of hyper-accelerated a lot of the stuff that’s been in the pipeline.

Kenton Williston: Boy oh boy—supply chain management has become so much more challenging, and having some technology help with that, I think, is going to be a really good thing in the long term.

Pete Bernard: I think actually one of the unsung heroes of the pandemic has been the QR code. Who would have thought QR codes would make such a comeback? When you go to a restaurant, everyone knows what a QR code is and how to look one up on their phone. That’s become an interesting little by-product.

Kenton Williston: So, I want to come back—you were talking about the large suite of offerings that Microsoft has, and how that relates to this question of, okay—in pretty much any industry you look at, we’ve already deployed a lot of technologies that people have been talking about for some time, and it all just got hyper-accelerated.

Pete Bernard: Right.

Kenton Williston: And we have to figure out where to go from here, and that could be just a question of: how do we integrate these things back into our larger enterprise systems? Because this was all done in kind of an ad hoc way. And how do we secure them? How do we build on this going forward? And I think an important part of that conversation, like you said, is making these things a lot more developer-friendly and providing a platform where you can readily make new decisions and bring systems together in new ways. So, I just want to hear a little bit from you—what does that mean in practical terms?

Pete Bernard: Right. Yeah, no—I think you bring up a good point, because a lot of these systems that are getting deployed are not—they can’t be—sort of one-off, bespoke systems, right? You can’t, sort of, hard code your solution this year, and then hard code another solution next year. So you have to sort of think about what problems you’re going to tackle today, versus two or three years from now, by adding more and more capabilities. So, we’re seeing people do—one example in retail is point-of-sale terminals, right? They’re pretty prevalent right now, as point-of-sale terminals have been for many years. People are saying, “Well, now I have these point-of-sale platforms. What else can I do with those platforms? Can I add AI vision to those point-of-sale terminals to provide some kind of security monitoring in my retail store?” Or, “Can I then add some other sensors, and other things?”

And so we’re seeing platforms like point-of-sale terminals, humble platforms like that, actually becoming kind of edge endpoints in and of themselves, that can do quite a bit of compute. And, actually, you can start adding more and more capabilities to platforms like that that you’ve already deployed. Another thing that we’re seeing is, quite often we’ll go into a retail store or something like that—I won’t name who it is—but there was a supermarket we were talking to recently—not in the US—and the first thing they said was, “We already bought the cameras. Last year we bought all these cameras, and they’re sitting in boxes.” Okay. Well, as I say, it’s all been depreciated—or appreciated, whatever. So a lot of people have legacy equipment.

So, how do you connect legacy equipment, and make that more AI capable, and more secure, and more manageable, right? So, we’re seeing a real explosion in the use of gateways and, kind of, medium edge equipment that can connect a lot of brownfield—we call it brownfield, or legacy—equipment into the cloud securely. It can be managed, and then you can run AI workloads against this kind of legacy equipment. And so that’s another big thing that I think people are thinking about as well. So, you sort of have to think about it systemically, right? It’s like you have some problems you need to solve, but what equipment do you currently have that you can leverage to solve that? And then, how do you want to scale that as, inevitably—right, we’re talking about Intel—the chips get better and faster and cheaper, and all that good stuff? So next year, or the year after, there’s going to be even better equipment to connect into the system.
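That gateway pattern is easy to picture in code: pull frames from an existing camera, run inference locally, and send only events upstream. The sketch below is illustrative; the RTSP address is a placeholder and detect() stands in for a real edge AI model.

    # Sketch: an edge gateway reading a legacy RTSP camera and running
    # local AI. The camera URL and detect() are hypothetical stand-ins.
    import cv2  # pip install opencv-python

    def detect(frame):
        # Placeholder for a real model (e.g., an OpenVINO or ONNX
        # Runtime session). Returns a fake detection count here.
        return 0

    cap = cv2.VideoCapture("rtsp://192.0.2.10/stream1")  # legacy camera
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        if detect(frame) > 0:
            # Send compact events upstream, never the raw video.
            print("detection event -> publish to cloud")
    cap.release()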

Kenton Williston: Yeah. And so, specifically, what’s Microsoft doing, you know, with this whole—you’re going to have to help me with that acronym—the Azure A, D, IOP—all those, all those letters…

Pete Bernard: Right.

Kenton Williston: Yeah.

Pete Bernard: One of the things we’re doing is—obviously Azure is an incredible cloud business. I think it’s something like 95% of the Fortune 500 running their businesses on Azure—some incredible statistic like that. And what we’re doing, from our group, is really helping out on the edge. So, what are all the interesting things connecting to Azure that are kind of getting superpowers from Azure? That’s what we’re really focused on—whether that’s working with Intel, for example, on Windows IoT. And we have something called EFLOW now (Azure IoT Edge for Linux on Windows), with which you can run Linux workloads on Windows.

So you can have a secure and managed Windows platform that everyone knows how to manage and deploy. And, on top of that, you can now run legacy Linux workloads that can be run from the cloud. Those could be AI workloads, or any kind of legacy Linux workloads, as opposed to having a separate box that runs Linux. So, that’s kind of one example of something that we’re doing out of our team that’s kind of really helping customers solve problems with the equipment they have, with a little bit of new software, which is pretty cool.

Kenton Williston: Yeah. So I think what you’re really pointing to here is one of the key pain points, or friction points, you might say: interoperability, right? So you’ve got, like you said, all of this brownfield/legacy equipment, which doesn’t necessarily mean it’s horrendously old—it could be something you just bought last year—like you said, “What do I do with this box of cameras now that I have them?” Right? And, in fact, you may have stuff that’s very old. Maybe you’ve got a point-of-sale device that’s three or five years old; maybe you’ve got an industrial automation system that’s 10 or 20 years old, right? You’ve got a really broad array of things that were built at different points in time, with different ideas about whether they would be connected or not, and what kind of electronic infrastructure they would be interfacing to if they were connected. So, I imagine this is just one of the key things that people have trouble with. Would you agree with that?

Pete Bernard: Yeah, no. I think so. Because, like you said, I mean, everyone’s got sort of their own unique circumstances, right? They have a problem to solve. They have kind of an ROI envelope they need to get that problem to fit into—it’s worth only so much money for them to fix it. They probably have some equipment already that they don’t want to chuck. And hopefully we and our partners are able to work with these companies and find creative solutions that help solve those problems. I mean, getting pretty practical about it, frankly. And what’s exciting, I think, is that there’s no shortage of ways to solve problems these days, right? So even the topology of solutions—how much workload am I running at the far edge, versus a gateway, versus a co-located heavy edge server, versus the cloud—there’s a lot of variability there.

So that’s good news, in that there are lots of different ways to solve problems in a very cost-efficient, very low-CapEx way. But I guess the downside of that—I guess that’s why we get paid—is that it’s complicated, and to your earlier point, it can’t be so complicated that nobody can actually deploy the darn thing. So you have to take friction out—with developer tools, with platforms. If you look at all the SDKs we have up on GitHub, and the developer outreach and engagement—we do that enablement with Intel and all of our partners—so that people have blueprints and reference platforms and reference designs, so that they can get, sort of, 80% there right out of the box. That’s really what we’re trying to do: take the friction out and give people the power of all the optionality that’s out there, without making it too complicated.

Kenton Williston: Yeah. And something that comes to mind—hearing you lay out this landscape—is how much things have turned from an “I’ve got a box that does something” sort of mindset about how you solve things, to “How do I create some software to solve this problem?” Right? And then you get into questions, of course, like where does this software live, and all these sorts of things. But, fundamentally, I think that, pretty much across the board, Internet of Things applications have really evolved to a place where the questions are more about the software model than the particulars of the hardware.

Pete Bernard: Yeah. And I think so, because the software has to really work across many different pieces of a system, right? Different pieces of hardware have to all work together. And that’s where the software really comes into play. You have your business logic and your Power Apps, and all that stuff running on top. And then the hardware is there either to collect the data—if it’s kind of a sensor or AI-vision thing—or, in some cases, obviously, to provide some kind of UI and input, if it’s a point-of-sale terminal or a McDonald’s menu-ordering kiosk, et cetera. So it really is a very software-driven exercise. I mean, we’ve seen—I forget the stats—something like 7% of all Tesla employees are software engineers. And I think at General Motors it was somewhere like 1% or 2%.

It just shows that, like, newer companies, or companies that are disrupting spaces, have a lot of software capability. I actually do a lot of mentorships for college students, and they always ask me, “What companies should I work for if I want to be in tech?” And I always tell them, “Well, pretty much any company is in tech these days.” Right? Because. . .

Kenton Williston: Absolutely.

Pete Bernard: Everyone, whether you’re at McDonald’s, or Chevron, or whoever—it’s tech, and you have to be tech capable, and you have to have software capability built into the company. So, from a career perspective, that’s exciting for a lot of people coming out of college and getting trained, because they can pretty much work anywhere as long as they have software capability. But to your earlier point, yes, the software is really critical, and we’re taking advantage of a lot of new semiconductor capability—lower power, higher TOPS, higher performance, lower cost. And that’s going to continue, but really it’s the software. And maybe I’m a little biased, coming from Microsoft, right? It’s really the software that can unlock some of these amazing scenarios.

Kenton Williston: Yeah. And I will co-sign that bias, because I think, again, what I’m seeing is that the capabilities of the hardware have gotten to be so incredible. What you can do with a relatively inexpensive, relatively low-power piece of hardware is amazing. And I think you can see this even in the consumer-IT world, right? The iPhone 13 just dropped, right?

Pete Bernard: Right.

Kenton Williston: It’s a nice phone, right? But it’s like, well, how much better is it than the iPhone 12? Well, I mean, it’s pretty similar, right? Because the hardware is so powerful that you just don’t really notice a big difference with a hardware upgrade in your day-to-day use. And it’s the same kind of thing with laptops and lots of other places—everyday, individual experiences where you can see for yourself that, yes, the hardware is getting better. But what really matters is—okay, the iPhone 13 is out, but what’s really cool is what’s in iOS 15, what the software does. That’s really unlocking something new and exciting.

Pete Bernard: I’d say that’s true. Although we’re also seeing, with new fabrication processes and other things—especially in the AI space, where I spend a lot of time—that the AI models require a lot of horsepower. And in some cases, you go back to the problems customers want to solve, and they require AI models that are still pretty heavy edge, pretty processor intensive. So we’re starting to see that AI acceleration capability in silicon get into lower-cost, more edge-oriented platforms, but it’s still pretty heavy-edge oriented. It’ll be interesting to see over the next five years how that pushes out to the farther edge and gets even lower power, right? And less costly. But there’s still a lot of headroom, and I think an opportunity for the hardware to accelerate—maybe not in the consumer space as much, but on the commercial side, everyone’s looking for higher performance, lower power consumption, lower cost.

Kenton Williston: Yeah, absolutely. And I think the point you mentioned about AI is a prime example of that—where a lot of accelerators are being added into, for example, the latest-generation Intel® Core™ lineup—that makes a big difference in what you can do, and, again, at relatively low power and low cost. And what’s really interesting to me is, I mean, we can all kind of see where this is going, and I feel like a big part of having a framework that’s forward looking—like we’ve been talking about: don’t just make some bespoke thing, but make a solution that’s going to work for what you’re trying to do tomorrow as well—a lot of that has to do, in my opinion, with having an underlying platform that’s very flexible and scalable. So that you can readily say, “Okay, maybe today we’re doing this workload in the cloud. Maybe tomorrow it’s in a gateway. Maybe next year it’s on the edge.” Right? And you don’t have to feel too pinned down to anything.

Pete Bernard: Yeah. And so the management and deployment of workloads is a big deal for us at Microsoft and what we’re doing with Azure and Azure Arc and Kubernetes, and a bunch of these technologies, where people can develop and deploy applications, quote unquote “software,” and be able to manage where that software is running and where those workloads are running. And I think we’ll see even more flexibility over the years, the next few years, in that area. And that’s pretty exciting because, like you said, things are going to evolve pretty quick.

Kenton Williston: Absolutely. So, I’m curious—one of the big challenges, beyond just finding the right framework to give us a forward-looking path with that kind of flexibility, is the scale, right? Oftentimes it’s relatively easy to put together a lab-based proof of concept that shows, yes, you can actually execute some AI workload to recognize whatever you’re trying to recognize. But then you get into questions of scale, and it’s not only how you deploy this heavy workload in a sensible way, but all the other things that come with it—the security, the device management—all these things that are really important beyond just running whatever application you’re trying to run. So, what do you see as some of the key pain points there, and how is Microsoft trying to reduce the friction of those?

Pete Bernard: Yeah, that’s a good question. So, yeah, you’re right. I mean, there’s the “show your boss that your camera could recognize a banana,” and then there’s actually deploying that. And one of the things we’re trying to do is minimize the steps between the demo to your boss and deployment, for lack of a better term. So, Azure Percept is something we introduced in March, and it uses some pretty cool Intel technology as well. It’s really a developer kit that enables people to quickly and easily recognize bananas and import their own AI models, or use a bunch of models out of the box. But then—because you’re now deploying that on Azure—you’ve got full Azure IoT and Azure IoT Hub device management and deployment, and you’ve got Azure Device Update capabilities, so you’ve got everything you need, actually, to really go to production with the whole software framework that you’ve developed for your proof of concept.

So you don’t have to start over again when you move to production hardware, which is pretty cool. We’re trying to give developers a way to really harvest the work that they’ve done in the POC stage, and not have to do anything over again to get to a full deployment stage. And, like you said, the production hardware may change, and you may change models or get something weatherized, or whatever, but the software and AI models that you developed and trained, and the way you’ve managed and deployed them—that’s all production-level code. And that’s, I guess, one of the benefits of working with a platform like Azure and working with partners like Intel: this is what we call GA—general availability—type stuff. Being able to develop and deploy on Azure gets you pretty far along when you want to actually do a full production deployment.
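In practice, the same device-side code written against Azure IoT Hub during a proof of concept carries over to production hardware. Here is a minimal sketch using the azure-iot-device Python SDK, with a placeholder connection string and payload.

    # Sketch: send telemetry to Azure IoT Hub from a device. The
    # connection string and payload are placeholders.
    import json
    from azure.iot.device import IoTHubDeviceClient, Message

    CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=<id>;SharedAccessKey=<key>"

    client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
    client.connect()

    # The same code path runs on POC and production devices; IoT Hub
    # handles identity, routing, and device management around it.
    payload = {"label": "banana", "confidence": 0.97}
    client.send_message(Message(json.dumps(payload)))
    client.shutdown()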

Kenton Williston: Yeah, absolutely. So, this is also bringing up a question in my mind of what kind of skill sets organizations need. We were talking earlier about the fact that it doesn’t matter what kind of business you’re in—you’re in the tech business, period, because you can’t not be. I think the good news is, everyone’s got at least some sense of what that means. You know, any kind of organization is going to have an IT department and things like that. But surely there are things that are unique when you are starting to make these very complex decisions about having a highly scalable, forward-looking architecture for your IoT deployments. That gets, like we’ve been talking about, to be very complicated. So, are there any particular kinds of skill sets or resources that companies should be making sure they have shored up?

Pete Bernard: Yeah. I think one of the things we’ve seen is a confluence of embedded developers, data science developers, and AI developers. And one of the things we’re trying to do with our tools and Visual Studio and the whole Azure platform is: how do we enable embedded developers to become smarter AI developers, and vice versa? So you don’t necessarily have two different types of software developers—one software developer can skill up and become really good at all of those things, right? To develop and train AI models, and also to write code to develop and deploy applications on embedded or edge devices. And there’s a ton of Microsoft Learn material—learning paths and things like that. But certainly data science and AI capabilities are a new skill set that, I think, is really required at lots of companies these days.

And then there’s the ability to understand these platforms—common platforms, obviously Azure, but there’s AWS and GCP, and also things like Kubernetes and workload management. So I think all those things are feeding into each other. And coming myself from Phoenix—where we were firmware developers, BIOS developers, kind of a unique species—the developers today are a lot more well-rounded, I would say. And I think AI is just the latest skill set for people to add to the résumé.

Kenton Williston: So it sounds to me like maybe part of this is reconceiving what even your tech department is, and that it’s not just IT—it needs to be bigger than that. And the other thing that’s coming to mind here is, maybe this is a really golden opportunity to think about the relationships you have with the Microsofts of the world, with the Intels of the world, and with whoever your vendors and systems integrators, and so forth, are.

Pete Bernard: Right. Yeah, I mean, there are some incredible solutions providers out there that have done a great job taking a lot of this tech. And, yeah, you’re right—you don’t have to be a firmware developer when you work at a shoe store; you can buy a packaged solution for AI vision and security from a solutions provider and be on your way, right? So it does make a bit of a difference, but, yeah, I think there are definitely some interesting opportunities there. The solutions-provider space is pretty fantastic. And in the old days—like when I was working out in Silicon Valley—the IT department would be behind a Dutch door—one of those halfway doors—you’d walk up to it and say your laptop didn’t work or something, and they’d take it from you and tell you that you could come back in a few hours, or something.

So, that doesn’t really happen anymore. Your IT department is about security and productivity, and probably doing some custom application development—and, hopefully, buying or sourcing some of these solutions and adding your own Power Apps and other business logic on top for your particular business. So, yeah—I think it’s an incredible opportunity for developers these days to get out of their comfort zone a little bit, and to start experimenting with things like AI and building on top of some of the solutions out there that are really part of a kind of basket of solutions.

Kenton Williston: And what are Microsoft and Intel doing together to support this ecosystem of IT departments and solutions providers?

Pete Bernard: Well, there’s a ton of work, obviously, we’re doing on the platforms for manageability and security. So, one of the interesting things we’re doing with Intel with Azure Percept—we’re actually encrypting the AI models on Azure and then decrypting them on the device.

Kenton Williston: Oh, cool.

Pete Bernard: And, so, that’s kind of an interesting new thing, right? Because no one else has really done that before, and the reason is that the AI models themselves are really important IP. And so that is an attack vector, right? Someone could take some important AI model IP that you’ve spent millions of dollars developing and training. So we’re doing encryption end to end, making that invisible to the developer, and working with Intel on that.
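The details of how Percept and Intel protect models in hardware are not spelled out here, but the basic idea (model weights encrypted in the cloud and only decrypted on the target device) can be sketched with ordinary symmetric encryption. This is purely illustrative, not Microsoft’s implementation.

    # Illustrative only: encrypt model weights "in the cloud," decrypt
    # them "on the device." Not Azure Percept's actual mechanism.
    from cryptography.fernet import Fernet  # pip install cryptography

    # Cloud side: in a real system the key lives in a secure key
    # store and is provisioned to hardware-protected device storage.
    key = Fernet.generate_key()
    model_bytes = b"\x00" * 1024  # stand-in for trained weights
    encrypted_model = Fernet(key).encrypt(model_bytes)

    # Device side: weights are only ever decrypted on the device,
    # so the model IP is never exposed in transit or at rest.
    assert Fernet(key).decrypt(encrypted_model) == model_bytes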

Kenton Williston: You’ve brought up a lot of really cool, cutting-edge stuff that’s brand new—around Percept and around EFLOW—and there’s just a lot going on. And, really, even the whole Azure IoT platform is still kind of new in the grand scheme of things, right? There’s just a ton of new capabilities coming out. What do you see in, sort of, the medium to long term of where your priorities lie? And where do you want the IoT ecosystem to be?

Pete Bernard: So, like I mentioned, the brownfield area is a big one, right? Every company has a problem, and every company has equipment. So one of the things we’re seeing a lot of action on is: how do I leverage my legacy equipment in the brownfield, as opposed to greenfield? Greenfield means buying new equipment, and brownfield means I have some—I have my box of cameras. So we’re seeing a lot of activity there, and thinking about the existing platforms, and how can I write new software and applications to work on those platforms? So, for example, EFLOW—back to that—can actually be deployed on Windows IoT today. You don’t need to buy a new box with Windows IoT on it, or a new instantiation of that. You can take existing Windows IoT and go download EFLOW to run Linux containers on it today, which is pretty cool, right?

So now you’ve sort of married legacy workloads with existing equipment that’s in the field. So there’s a lot of brownfield work that’s going on right now. And people are just trying to do smarter software with a lot of their existing equipment, and thinking about, “How do I solve my problem with some of the stuff I already have around the store or in my manufacturing plant, et cetera?” And, at the same time, people are planning for the next big hardware cycle, right? And how do I use 5G and private 5G, and how do I use Wi-Fi 6, and how do I do all kinds of new things with these new vision processors, right? So that’s all happening in parallel, but I think brownfield is kind of where there’s a lot of near-term action.

Kenton Williston: Yeah. That makes sense. And we haven’t touched much on it—nor do we have a lot of time left to dig into it—but I think beyond the silicon, beyond the software, the connectivity side of things has really changed a lot. And I do think 5G and Wi-Fi 6 are going to continue to unlock a lot of really exciting new possibilities—the ability to do things at scale that just couldn’t be done before.

Pete Bernard: Yeah. And I would say my advice for folks is to keep an open mind when it comes to connectivity, right? Yes, there’s Wi-Fi, but there’s also 5G. There’s something called LPWA—low-power wide-area—which can include NB-IoT. I mean, there’s an alphabet soup here. And there’s Bluetooth Low Energy, which has gotten really good. So there are lots of different ways to connect these things together, and people should really keep an open mind about the best way to do that, because there are so many options these days.

Kenton Williston: Yeah, for sure. And I think that’s part of what makes this time so exciting, is that there’s just—there are so many options and so many possibilities just coming online all the time. Well, listen, Pete, it’s really been great talking to you today. Any key takeaways you’d like to leave with our audience? Kind of the big picture of how do you make the life of an IoT developer easier?

Pete Bernard: Yeah. We really need to be—they say “customer obsessed.” It sounds a little trite, but it really does mean something. And being customer obsessed means thinking about the solutions, not just the technology. So, think about how you can help solve problems for your company or your customers holistically, and assume that there’s a heterogeneous ecosystem out there. And part of your value add is being able to glue that stuff together in a seamless way to solve some of those problems. So, really thinking at that altitude is pretty helpful.

Kenton Williston: Excellent. Well, with that, I’d just like to thank you again for joining us. Really interesting conversation.

Pete Bernard: Sure. Appreciate it. Thanks for having me.

Kenton Williston: And thanks to our listeners for joining us. To keep up with the latest from Microsoft, follow them on Twitter @Microsoft, and LinkedIn @Microsoft. If you enjoyed listening, please support us by subscribing and rating us on your favorite podcast app. This has been the IoT Chat. We’ll be back next time with more ideas from industry leaders at the forefront of IoT design.

AI-Driven Metals Fabrication Leads to Zero Waste

Clearly, no company manufactures products just to send them to a scrap pile, but in the metals casting process, one in five cast components winds up there. While defective metal is recyclable, it takes time and energy to rework. With a lofty goal of zero waste—material, energy, and more—the metal industry is hyper-focused on reducing defects during manufacturing and improving sustainability.

Perhaps the biggest challenge is getting an industry to let go of decades-old practices and embrace the innovative tools and processes of the future—and that starts on the shop floor.

“To get the industrial world to zero waste, it’s important to recognize that 80% of the KPIs (key performance indicators) in manufacturing come from operations,” says Rahul Prajapat, founder of Tvarit GmbH, creator of end-to-end customized AI solutions for manufacturers. “Using AI, we can start to predict the important KPIs, including equipment efficiency, production quality, energy usage, and carbon footprint.”

And these predictions can result in a 20% to 50% reduction in scrap rate and a considerable reduction in energy bills—leading to sustainable and zero-waste manufacturing processes.

Starting from the familiar helps companies discover what’s possible by comparing new methods to what they already have in place. Many use Six Sigma, the long-adopted methodology for defining and improving manufacturing quality and processes.

“People on the shop floor use this Six Sigma methodology to achieve required efficiencies,” says Prajapat. “They will start by defining specific types of defects which are occurring in the production line, measure data points on Excel sheets, and do trend analysis. As a result, managers can better understand anomalies and make improvements.”

The Tvarit AI-driven quality control system follows steps that overlap closely with Six Sigma (a simplified sketch of the predict-and-diagnose loop appears below):

  • Determine customer readiness by understanding its problem statement and the amount of data available
  • Identify the missing measurements and install sensors that can collect additional information
  • Prepare an AI approach to predict desired output
  • Perform a root cause analysis to find areas of improvement
  • Prescribe recommended settings to shop floor engineers

“This overlap gives confidence to shop floor engineers and managers who can go from an old-style improvement that used to gain maybe 1% or 2% reduction in the scrap rate to two-digit improvements that transform their KPIs,” says Prajapat.
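Tvarit’s models themselves are proprietary, but the measure, predict, and diagnose loop those steps describe maps onto standard supervised learning. Below is a minimal sketch on synthetic sensor data; the feature names and defect rule are invented for illustration.

    # Sketch: predict casting scrap from process sensors, then surface
    # root causes via feature importances. All data here is synthetic.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    features = ["melt_temp", "injection_pressure", "cooling_rate"]
    X = rng.normal(size=(1000, 3))
    # Invented ground truth: hot melts with slow cooling scrap out.
    y = ((X[:, 0] > 0.8) & (X[:, 2] < -0.2)).astype(int)

    model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

    # Predict: flag at-risk parts before final inspection.
    print("scrap risk:", model.predict_proba(X[:3])[:, 1])
    # Diagnose: rank which process variables drive scrap.
    for name, importance in zip(features, model.feature_importances_):
        print(f"{name}: {importance:.2f}")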

#AI predictions can result in a 20% to 50% reduction in scrap rate and a considerable reduction in energy bills—leading to #sustainable and zero waste #manufacturing processes. @TvaritAI via @insightdottech

AI + Domain Knowledge Drive Better Outcomes

But AI alone isn’t a complete solution. A platform like the Tvarit Industrial AI Solution creates a more accurate hybrid model by adding domain knowledge from simulation patterns. These patterns consist of metallurgical knowledge built from an understanding of certain types of defects occurring in products.

The platform is made up of sensors connected to an edge server, which sends information through the cloud to the Tvarit AI software, where predictive modeling, evaluation, and risk analysis take place.

“Using data from the machine, the production line, and the plant, we can make predictions based on patterns,” says Juergen Halt, research and development director and senior partner of Tvarit GmbH. “When we bring it all together and create a hybrid model, we can simulate situations and make those predictions much more accurate. It’s like having automatic cruise control in a car, but because you now have radar, you don’t crash against the wall.”
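One simple way to picture the hybrid idea is a data-driven prediction corrected by a physics-grounded estimate. The weighting and both functions below are invented placeholders, not Tvarit’s model.

    # Sketch: blend a learned defect probability with a simulation-
    # based one. The weight and both estimates are invented.
    def ml_defect_prob(sensors):
        return 0.30  # stand-in for a trained model's output

    def sim_defect_prob(sensors):
        # Stand-in for metallurgical simulation: solidification
        # physics flags this parameter region as higher risk.
        return 0.60

    def hybrid_defect_prob(sensors, w=0.7):
        # Trust the learned model, but let domain knowledge pull
        # the estimate toward physically plausible values.
        return w * ml_defect_prob(sensors) + (1 - w) * sim_defect_prob(sensors)

    print(hybrid_defect_prob({"melt_temp": 705}))  # 0.39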

The Industrial AI solution includes three components:

  • Tvarit Industrial AI (TIA), where specialized pretrained AI modules for metal manufacturing processes are stored—for instance, aluminum die-casting.
  • Tvarit Intelligent Monitoring (TIM), which calculates KPIs, including machine availability and performance, product quality, energy consumption, and more. It runs on infrastructure powered by Intel® Xeon® servers and cloud platforms such as AWS and Azure.
  • Tvarit Observant Module (TOM), a hardware module, which includes edge devices that collect data and mark the component for traceability.

“TIA is our unique selling proposition,” says Prajapat. “It evaluates the data and predicts risk. It’s an end-to-end solution where customers can pick the modules needed for their operation.”

Predictive Analytics in Aluminum Casting Reduces Waste

An aluminum casting manufacturer uses the Tvarit AI Solution in its facility to generate real-time visibility that helps improve its scrap rate.

“There are two big challenges with aluminum casting,” says Prajapat. “First, it takes hours or sometimes one or two days to learn whether the coil produced is good or bad. A company can’t do logistics or supply chain planning because it’s lacking real-time visibility. And the second challenge is the scrap rate, which in this industry can range from 6% to 10%.”

Leveraging predictive-analytics AI models, the aluminum manufacturer resolved these challenges using Tvarit’s process-specific AI models for die-casting, welding, and cold forming. The company also used plant-specific customizations, which made the solution easy to scale through transfer learning for sensor data analytics.
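Transfer learning here means reusing a model trained on one plant’s sensor data and fine-tuning only part of it on a new plant’s data. Below is a minimal PyTorch sketch of that pattern; the architecture, dimensions, and data are placeholders.

    # Sketch: adapt a "source plant" model to a "target plant" by
    # freezing the shared feature extractor. All data is synthetic.
    import torch
    import torch.nn as nn

    backbone = nn.Sequential(nn.Linear(8, 32), nn.ReLU())  # pretrained part
    head = nn.Linear(32, 2)                                # plant-specific part

    # In practice the backbone weights would be loaded from the
    # source plant's trained model; here we simply freeze them.
    for p in backbone.parameters():
        p.requires_grad = False

    X = torch.randn(256, 8)          # target plant sensor features
    y = torch.randint(0, 2, (256,))  # good/scrap labels

    opt = torch.optim.Adam(head.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(100):             # fine-tune only the head
        opt.zero_grad()
        loss = loss_fn(head(backbone(X)), y)
        loss.backward()
        opt.step()
    print("final loss:", float(loss))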

“The biggest impact was real-time analytics,” says Prajapat. “If there are any defects occurring in the process, they can get those insights in real time. We were able to reduce the defect rate by 35% for this customer. And this is the impact we are driving for all of our customers in the metal industry.”

“The keys to change are proof of concept and agility,” says Halt. “It’s also teamwork with the customer. These are the most important elements for creating positive change.”

AI Uncovers Bioinformatic Treasure in a Sea of Data

There’s a paradox in the world of bioinformatics. The rapid growth of data is creating countless new opportunities—but all this data is useless if researchers can’t make sense of it.

Indeed, this is already a problem. Researchers are spending so much time trying to organize and comprehend the data that little room is left for meaningful discoveries. And the challenges will only increase as technological advances increase the availability of bioinformatic data.

“The field of bioinformatics has rapidly evolved and is indispensable in this current era of big data and fourth industrial revolution. Many areas such as pharmaceuticals, clinical research, disease research, and healthcare industries have to deal with big data. People who work in those areas need a bioinformatics approach to analyze big data,” says Seung-cheon Yong, Senior Consultant for the AI-driven bioinformatics company Insilicogen.

The future of bioinformatics, Yong explains, will rely on artificial intelligence techniques to uncover biological impacts.

A New Approach to In Silico Experiments

Human genome data alone can come from blood, saliva, bone samples, X-rays, and MRIs. Researchers are also collecting data from animals, plants, and microorganisms. This type of data can provide more insights into traits, rare diseases, drug resistance, and personalized nutrition.

But collecting this data can be a slow and tedious process—hence the emergence of in silico research, which uses computer programs to simulate biologic systems and perform scientific experiments. This has been adequate at validating outcomes and visualizing data, but limited in its ability to help researchers analyze, interpret, and comprehend all the data.

The future of #bioinformatics will rely on #ArtificialIntelligence techniques to uncover biological impacts. @insilicogen via @insightdottech

Insilicogen is transforming its in silico approach by working with global partners to incorporate artificial intelligence techniques. Yong says with the use of AI, researchers can gain valuable insights faster without as much time or effort.

“We are living in an age of large-scale data. You can let AI learn from the data and interpret it. Hence, we can get fast and accurate results,” says Yong.

The company implements techniques such as structuring, interconnecting, machine learning, and feature selection and extraction to find hidden value in the data (Figure 1).

Figure 1. Insilicogen’s concept of machine learning, which runs from customer needs through EDA, statistics, and visualization to tailored knowledge: models are deployed on customer data to better visualize and understand biological information. (Source: Insilicogen)

Insilicogen’s machine learning models analyze the data structures, assess and refine the data, and perform feature selection and extraction. Deep learning neural nets are used to extract valuable information from text sequences, images, videos, and natural language processing. Insilicogen also leverages its tailored knowledge to provide web-based results, real-time insights, and further develop AI and analysis algorithms.
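As a concrete, much-simplified example of the feature selection and extraction step, here is a sketch using scikit-learn on a synthetic samples-by-genes expression matrix; all dimensions and the planted signal are invented.

    # Sketch: feature selection + extraction on a synthetic gene-
    # expression-style matrix. Dimensions and signal are invented.
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5000))      # 100 samples, 5,000 "genes"
    y = rng.integers(0, 2, size=100)      # e.g., disease vs. control
    X[y == 1, :10] += 2.0                 # plant signal in 10 genes

    # Selection: keep the genes most associated with the label.
    selected = SelectKBest(f_classif, k=50).fit_transform(X, y)

    # Extraction: compress the selected genes into a few components.
    components = PCA(n_components=5).fit_transform(selected)
    print(components.shape)  # (100, 5)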

“As a result, customers can deal with their data and do bioinformatics analysis and research,” says Yong.

AI-Driven Bioinformatics

One area that benefits from applying AI to bioinformatics is the food service industry. For example, the data food company DP, a subsidiary of Insilicogen, combines AI, big data, and bioinformatics to create a customized fruit subscription service.

Fruit Compatibility looks at a user’s biological data such as gender, age, height, weight, blood pressure, blood sugar, and health goals to recommend certain fruits for healthier living.

DP is also working with Hanbio Gene Co., Ltd. to launch a food and diet recommendation platform based on individual genetic information and AI.

In addition, Insilicogen helps several other organizations and research institutions apply AI to their biological research. For example, it works with:

  • The Korea Institute for Animal Products Quality Evaluation to help analyze and detect the quality of beef using image analysis and machine learning
  • The Korean National Research Institute of Cultural Heritage to establish a machine learning model that predicts termite wood damage
  • The Korean National Institute of Fisheries Science to predict the high-temperature tolerance of abalone
  • The Food Industry Technology Support Center to analyze ingredients based on function, culture, and geographic location

Understanding and Storing Bioinformatic Data

Beyond making worthwhile analyses of bioinformatic data, Yong says there is still a challenge in storing and handling it.

“The technology keeps improving and tons of biological data is being produced, but compute power and capability is insufficient. It’s not good enough,” he says. “Many people don’t know how to handle the data to find answers to clinical or biological questions. In order to overcome these issues, a national supercomputing center or cloud computing system is needed.”

Insilicogen’s partnership with Intel® has enabled the company to provide high-spec computers with NGS (next-generation sequencing) data analysis solutions to customers at an affordable price. “We created business synergy to utilize an Intel solution with high-spec computers and Insilicogen’s technical support to customers,” says Yong.

The company combines its bioinformatic analysis consulting service with Intel to create the Inco X Intel Select Solution designed to process, store, and analyze disease genome sequencing data.

Yong explains Insilicogen is responsible for interpreting and storing more than 60% of South Korea’s bio-big-data.

Educating the Next Generation of Bioinformatic Researchers

According to Insilicogen’s CEO Namwoo Choi, the future of bioinformatics driven by AI requires more experience and understanding of the field.

Insilicogen offers an active bioinformatics education program to help more researchers understand how to use this data for biological research and development.

As part of its on-site human resource training, Insilicogen has experts in data processing, genomics, transcriptomics analysis, and programming. Customers can benefit from basic education programs or programs tailored to their specific needs as well as an online education center, workshops, and hands-on training with bioinformatics and genomic data analysis.

“The future of bioinformatics is promising. There are thousands of data sets that need data mining and bioinformatics analysis. Insilicogen understands the situation, and we will keep providing bioinformatics analysis support, education, and collaboration with other global/local companies to create a solution based on future needs,” says Yong.