Utilities Edge Closer to a More Sustainable Future

Environmentalists everywhere have circled 2050 on their calendars. By then, scientists say, we will have to have stopped, if not reversed, the effects of decades of greenhouse gas emissions if we hope to stabilize the Earth’s climate.

The alternative is bleak, but for these efforts to succeed, we need cooperation that scales from the smallest, most remote edges of our environment to some of the largest multinational corporations in the world.

For instance, VINCI Energies (VE), a French energy infrastructure provider that develops technology solutions for building, factory, and IT customers worldwide, is working toward becoming carbon neutral by mid-century. The company consists of 1,800 specialized business units across 55 countries and manages about 400,000 projects each year. Needless to say, its carbon footprint is immense (Figure 1).

A pie chart depicting VINCI Energies’ direct greenhouse gas emissions by source: worksite machines, company cars, industrial activities, and buildings.
Figure 1. VINCI Energies estimated a total of 2.2 million metric tons of greenhouse gas emissions in 2020. (Source: VINCI Energies)

VE hopes to reduce its carbon emissions 40% by 2030. To do this, it has adopted an “Avoid and Reduce” strategy based on a plan consisting of three scopes. The first focuses on emissions from buildings and offices. The second focuses on gas-powered service vehicles. And the third, still in development, will include long-distance travel.

But they’ve realized that setting clean-energy goals and achieving them are two very different things.

To avoid carbon-intensive activities in some places and reduce emissions in others, the company must first be able to track and monitor the environmental impact of its operations. The sheer size and variety of use cases addressed by VE made this challenging, as off-the-shelf carbon-monitoring solutions with the scalability and flexibility it required simply don’t exist.

“The main challenge most companies have is automated #data ingestion that provides accurate, up-to-date information when visualizing your own #carbon emissions.” – Natali Velozo, Head of Operations, @AxiansGlobal via @insightdottech

The capabilities VE was looking for included the ability to:

  • Paint an accurate picture of CO2 emissions at the company, organization, and business unit levels
  • Analyze data from third-party agencies like fleet management, garbage handling, transport, and other service providers
  • Offer actionable benchmarks that allowed improvements to be measured

Avoid and Reduce Sustainability Initiatives in Action

Fortunately, the company was able to find a path to build its ideal solution internally. VINCI Energies leveraged Axians IoT Operations, a VE subsidiary that designs industry PaaS offerings around its microServiceBus.com device management solution.

microServiceBus.com is a protocol-agnostic device management solution that runs small-footprint software agents on IoT gateways. These agents relay sensor data between edge nodes and the cloud using protocols like Modbus, M-Bus, OPC UA, LoRa, Bluetooth, IEC 61850, and others.
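To make the pattern concrete, here is a minimal sketch of a protocol-agnostic agent loop, with stub readers standing in for real fieldbus drivers. It is illustrative only, assuming invented handler names, and is not the microServiceBus.com implementation:

    # Hypothetical sketch of a protocol-agnostic agent: each protocol
    # handler normalizes readings into a common envelope before upload.
    import json, time

    def read_modbus_stub():
        # Placeholder for a real Modbus read via a fieldbus library
        return {"register": 40001, "value": 21.7}

    def read_mbus_stub():
        # Placeholder for a real M-Bus meter read
        return {"meter_id": "MB-17", "kwh": 1042.3}

    HANDLERS = {"modbus": read_modbus_stub, "mbus": read_mbus_stub}

    def poll_and_forward(protocol, send):
        reading = HANDLERS[protocol]()        # protocol-specific read
        envelope = {"protocol": protocol, "ts": time.time(), "payload": reading}
        send(json.dumps(envelope))            # cloud-bound, transport-agnostic

    if __name__ == "__main__":
        for proto in HANDLERS:
            poll_and_forward(proto, send=print)   # stand-in for an IoT Hub client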

With microServiceBus.com as a foundation, Axians IoT Operations built the GreenEdge PaaS solution, a real-time environmental footprint reporting system. GreenEdge is currently used by VE Sweden to automatically update the three scopes with real-world data from IoT sensors and other business systems. The solution is built on Microsoft Azure IoT Hub but also integrates with major cloud platforms like AWS, Google Cloud, and IBM Watson.

“The main challenge most companies have is automated data ingestion that provides accurate, up-to-date information when visualizing your own carbon emissions,” says Natali Velozo, Head of Operations at Axians IoT. “For example, VE’s management structure, with different regions and business units and divisions, is a complex structure, and being able to see different emissions on different levels depending on your needs was quite a challenge.”

The GreenEdge platform not only eliminates manual data imports; it also gives stakeholders the opportunity to visualize and respond to carbon emissions indicators in real time.

“We developed GreenEdge to be able to follow different customers’ management structures and aggregate data depending on your role and what you need to see,” Velozo explains.
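As a rough illustration of that kind of role-based aggregation, the sketch below rolls hypothetical emission records up a company/region/business-unit hierarchy so each management level sees its own total. Paths and figures are invented for the example, not GreenEdge data:

    # Roll emissions (kg CO2e) up every level of a management hierarchy.
    from collections import defaultdict

    records = [
        ("VE/Sweden/BU-North", 120.0),   # hypothetical sensor-derived totals
        ("VE/Sweden/BU-South", 95.5),
        ("VE/France/BU-Paris", 210.2),
    ]

    totals = defaultdict(float)
    for path, kg_co2e in records:
        parts = path.split("/")
        for depth in range(1, len(parts) + 1):   # credit every ancestor level
            totals["/".join(parts[:depth])] += kg_co2e

    for level in sorted(totals):                 # company, region, and unit views
        print(f"{level}: {totals[level]:.1f} kg CO2e")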

Carbon Emissions Data at the Green Edge

For all the flexibility and scalability of the GreenEdge platform, carbon emissions data doesn’t originate in the cloud. It is created at the edge.

microServiceBus.com agents run on hardware targets that host Ubuntu and Yocto Project Linux distributions, Arm mbed, Docker, or Node.js runtimes. In the VE GreenEdge deployment in Sweden, the agents reside on hundreds of Intel® Next Unit of Computing (Intel® NUC) for Industrial mini-PCs. The rugged edge gateways provide a cost-effective solution for IoT applications that demand 24/7 operation, and natively support Intel® vPro technology for remote monitoring and management.

They also host an Intel® Celeron®, Pentium®, or Core processor to deliver the performance scalability needed at the edge.

“Edge gateways are designed to take on delegated workloads that would otherwise have been processed in the cloud. The reasons are large volumes of data or requirements such as low latency. Therefore, our preferred gateway is the industrial Intel® NUC, and that’s something we can use for video processing and machine learning,” Velozo explains.

Bringing Sustainability Online for Everyone

VE and Axians IoT realize that emissions aren’t limited to just air pollution. Therefore, they have built provisions into GreenEdge to support water conservation and have another goal of recycling 80% of company waste, which includes treating and recycling 100% of hazardous waste.

But the biggest impact of VE’s clean-energy efforts likely won’t come from the company at all. As a utility infrastructure provider, the organization is at the core of energy decisions for a customer base it estimates has a carbon footprint 25x greater than its own.

That figure is humbling, and without action its implications are dire. But more than anything, it represents a vast opportunity to bring sustainability monitoring online to clean up life at the edge. And everywhere else, for that matter.

 

This article was edited by Christina Cardoza, Associate Content Director for insight.tech.

Airport Efficiency Ready for Takeoff with Digital Tech

Going to the airport is typically not an easy or smooth experience. Passengers may rush to make their plane, only to find someone else is holding up the flight. And for travelers on a tight schedule, such as those catching a connecting flight, delays can disrupt or ruin the entire trip.

All this can end up hurting an airline’s reputation and bottom line, especially if the situation could have been avoided. Disjointed airport operations are often to blame. Many tasks must be completed before a plane taxis to the runway: bags loaded, refueling and cleaning done, meals delivered, passengers accounted for, etc. And all this information must be relayed to the pilot and other operational staff. But each servicing team has its own communications channel, and they can’t always inform others of their status. The result is confusion and unnecessary delays that erode airport efficiency and cost airlines and passengers $33 billion a year, according to the United States Federal Aviation Administration.

There is a better way. Recent technology advancements can reduce flight delays by giving all service teams a common, instant, and secure communications system connecting them with operational staff on any device. Airports can also build stronger connections with passengers, learning more about them through smart digital signage and location-based Wi-Fi—and generating new revenue in the process.

Making Airport Operations More Efficient

For airports, getting planes in and out as quickly as possible is a key metric of success.

“The goal is to improve the turn time of the plane, because if you can do that, you make the airport more efficient,” says Andy Manuel, Global Solutions Architect and Business Development Manager for Transportation at Cisco, a global technology leader.

At Bangalore International Airport Limited (BIAL), improving turnaround time was especially critical. The third-busiest airport in India, BIAL was growing at a rate of 20% a year before the pandemic. By 2019, it was managing 240,000 flights and 33 million passengers a year. Airport executives realized that expanding the facility to meet this growing demand would be only a short-term fix.

BIAL worked with Cisco and its partners to develop a plan for implementing IoT, computer vision, analytics, and unified communications technologies. The goal was to streamline and improve the flow of information and gain new insights into operations (Video 1).

Video 1. Strategic deployment of Cisco’s IoT, computer vision, analytics, and unified communications technologies improved efficiency at Bangalore International Airport Limited. (Source: Cisco)

The airport installed specialized sensors to track information about the location, speed, and altitude of arriving and departing aircraft with greater precision than radar. Other IoT sensors were attached to fuelers, baggage loaders, catering vehicles, stepladders, and other equipment. This enabled staff in the central control center to “see” the objects—and the activity surrounding them—in real time. IoT data was analyzed by an application focused on turnaround time metrics, helping the airport find ways to improve.
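A turnaround-time metric of this sort can be derived from little more than gate event timestamps. The sketch below, with invented on-block and off-block times and an assumed 60-minute target, shows the idea; it is not the application BIAL uses:

    # Derive per-flight turnaround time from gate event timestamps.
    from datetime import datetime

    events = {  # hypothetical on-block / off-block times at one stand
        "AI101": ("2022-06-01 08:05", "2022-06-01 08:52"),
        "AI205": ("2022-06-01 09:10", "2022-06-01 10:21"),
    }

    TARGET_MIN = 60  # assumed turnaround target

    for flight, (on_block, off_block) in events.items():
        fmt = "%Y-%m-%d %H:%M"
        turn = (datetime.strptime(off_block, fmt) -
                datetime.strptime(on_block, fmt)).total_seconds() / 60
        flag = "OK" if turn <= TARGET_MIN else "SLOW"
        print(f"{flight}: {turn:.0f} min turnaround [{flag}]")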

Cisco also unified BIAL’s communications. Ground staff and airline operators can now instantly bring one another up to speed on any kind of device without worrying about radio frequency or location interference, a common problem at gates and tarmacs.

“There are a number of areas that require constant communication,” Manuel says. “Staff needs to make sure food service arrives at the gate at the right time and the plane is cleaned and sanitized before passengers reembark. You may be able to leave five minutes earlier if you know in real time you’ve got everything loaded and ready to go.”

As more #data is collected and analyzed, it will point the way for #airports like @BLRAirport, and others across the globe to handle additional traffic with machine-like precision. @Cisco via @insightdottech

With so much information flying back and forth, security is a paramount concern. With Cisco’s solution, it starts at the chip level, with Intel®-embedded protections. A zero-trust system extends granular policy controls across all networks, applications, and devices.

The processors also deliver analytical insights in real time and scale to meet airports’ growing needs. Speedier communication and data insights have greatly improved efficiency at BIAL, and the airport is now able to get two additional airplane “turns” per day in each stand, or aircraft parking area.

Improving the Passenger Experience with Airline Technology

In addition to boosting efficiency, technology can make the journey smoother for travelers. An airport in the UK is experimenting with a computer vision and AI system, developed by Cisco and its partner WaitTime. Together, they can analyze anonymized passenger count and behavior in real time, helping to improve traffic flows and reduce passenger congestion. It also provides useful information to passengers themselves.

For instance, a digital sign or app can tell passengers the wait time at certain shops.
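One plausible way such an estimate could be computed from anonymized counts is Little’s Law, which relates average wait time W to queue length L and throughput λ as W = L/λ. A minimal sketch with made-up numbers, not the WaitTime algorithm:

    # Little's Law: average wait = people in line / service rate.
    def estimated_wait_minutes(queue_length, served_per_minute):
        if served_per_minute <= 0:
            return float("inf")          # no throughput, no estimate
        return queue_length / served_per_minute

    # e.g., 18 people counted in line, 2.5 customers served per minute
    print(f"Estimated wait: {estimated_wait_minutes(18, 2.5):.1f} min")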

Analytics gathered from mobile devices of passengers who opt into Cisco’s system could allow airlines to deliver a new level of customized service. If the airport knows a returning customer always gets a coffee before his flight and he is running late, it could have the coffee delivered to the gate for him.

Going even further, the airline can detect the location of a late passenger and send someone to collect them, instead of trying to alert them over the loudspeaker—which many people ignore or don’t hear.

In other cases, if the operations staff can see a passenger hasn’t even arrived at the airport yet, they could substitute a standby passenger and start to get ready for takeoff.

Generating More Revenue for Airlines

Using the right technology can not only make airports more efficient; it can also boost the bottom line. AI and computer vision can tell airports how many people congregate in front of a shop and how many decide to enter. With this information, airports can charge more for retail rentals in high-traffic areas.
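The underlying arithmetic is simple. As a hedged illustration with invented counts, a storefront conversion rate a landlord could use when pricing space might be computed like this:

    # Conversion = entries / passersby, from anonymized CV counts.
    passersby, entries = 1250, 210
    conversion = entries / passersby
    print(f"Foot-traffic conversion: {conversion:.1%}")   # -> 16.8%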

Sharing traffic data with advertisers could also create another income stream.

Another potentially large source of income could come from providing unified Wi-Fi connectivity. At most airports, cellular service providers like AT&T, Verizon, Vodafone, and others build out their own infrastructure.

“If you have multiple service providers, you might have four separate installations and separate networks,” Manuel says. “The opportunity would exist to build out a neutral host infrastructure for the airport. With Cisco technology powered by Intel® inside, the airport can provide a single tower and back-end infrastructure foundation for all of them. Cellular service providers can leverage this infrastructure and create a potential additional revenue stream for the airport.”

As more data is collected and analyzed, it will point the way for airports like BIAL and others across the globe to handle additional traffic with machine-like precision while at the same time improving customer service.

These are just some of the ways digital technology is transforming the airport experience for operational staff, commercial tenants, and passengers—before, during, and after the journey.

 

This article was edited by Christina Cardoza, Associate Content Director for insight.tech.

Wildfire Detection: Follow the Smoke with Smart Systems

It’s too easy to start a wildfire. All it takes is a small spark from a cigarette tossed on dry ground, a controlled burn run amok, or a lightning strike to a utility pole. The ensuing fire can burn for weeks, consuming millions of acres and claiming numerous lives.

From the forests in California to wooded lands in Spain and even large farming operations across the world, the risk of wildfires is everywhere. And traditional efforts to prevent them are not enough.

For instance, utilities, forest rangers, and firefighting agencies traditionally rely on staffed observation towers to spot fires and initiate a response. But it is virtually impossible to dedicate staff around the clock to fire detection, says Laura Moreno Sánchez, manager of Phygital Assets for Minsait, a subsidiary of Spain’s Indra Sistemas specializing in digital transformation.

“We are talking about kilometers and kilometers that would have to be looked after constantly,” she says.

Taking images by satellite is another option, but it is an expensive one. And then there’s the six-hour lag between each set of photos. That’s plenty of time for a fire to spread, putting infrastructure, property, human lives, and livestock at risk.

It doesn’t have to be this way. Recent technology advancements are making automated wildfire monitoring, detection, and prevention a reality—giving new hope to wildfire-prone areas across the globe.

The Pressing Need for Wildfire Prevention Technology

Minsait is working to help combat the start and spread of wildfires in Central Spain and other areas in the Iberian Peninsula that are plagued by raging wildfires during dry months. It has developed the Smart Wildfire Detection (SWD) solution that operates on a simple directive: “Follow the smoke.”

Utilities in Spain have been testing the Minsait system for the better part of a year with fleets of AI-enabled cameras connected through the Internet of Things (IoT) to collect and analyze fire and fire-causing data.

Originally, the company had been working on an artificial intelligence and visual detection system for the manufacturing industry. But after a conversation with Naturgy, the third-largest Spanish distribution company, it realized it could leverage its existing technology to build the smart wildfire detection solution.

Recent #technology advancements are making automated #wildfire monitoring, detection, and prevention a reality—giving new hope to wildfire-prone areas across the globe. @IndraCompany via @insightdottech

The company also found the problem with wildfires extended beyond the destruction of life and property. Wildfires can cause millions of dollars in infrastructure losses and rebuilding, Sánchez explains.

With climate change increasing the potential of wildfires, Spanish authorities have enacted stringent prevention regulations on utilities, according to Sánchez. For instance, terrain under power lines must be clear of vegetation and trees, essentially creating an access road. “The law says that they have to have 50 meters free of any type of vegetation underneath the lines,” Sánchez says.

Visual data captured by Minsait’s SWD technology is designed to help utilities adhere to the 50-meter rule. “Having these cameras there can also help them to decide when to send people to prune or to cut the trees,” Sánchez explains. The cameras provide a 360-degree view and can cover up to two kilometers.

The cameras also evaluate environmental conditions such as humidity, temperature, and wind, and calculate the dew point. AI software reviews the images to detect and confirm any signs of fire. Having this information about wind speed and direction can also help in situations where firefighters need to forecast the spread of the fire.
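Dew point itself follows from temperature and relative humidity via the well-known Magnus approximation. The sketch below uses commonly cited constants, not Minsait’s published parameters:

    # Dew point via the Magnus formula.
    import math

    def dew_point_c(temp_c, rel_humidity_pct):
        b, c = 17.62, 243.12   # widely used Magnus constants
        gamma = math.log(rel_humidity_pct / 100) + (b * temp_c) / (c + temp_c)
        return (c * gamma) / (b - gamma)

    # e.g., a hot, dry afternoon: 34 C at 20% RH -> dew point near 8 C
    print(f"Dew point: {dew_point_c(34.0, 20.0):.1f} C")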

Wildfire Detection That’s Cost-Effective and Sustainable

To achieve these results, the company leverages its phygital platform, which blends physical and digital components to deliver information to users. The platform leverages Intel® processors for its computing power and hybrid architecture as well as Intel’s VPU technology to make its vision of phygital possible.

According to Mariano Ortega de Mues, Minsait Phygital IoT/Edge Computing Director, Intel’s technology combined with Minsait’s work in AI and phygital systems has the potential to really address the climate change and wildfire problem. The solution combines IoT and edge computing technology to cover vast areas and transmit data in real time to a monitoring site. When the solution detects a potential fire, it sends an alarm to a cloud-based central monitoring location using 4G or LTE wireless networks.

Since utilities are always under budget and energy conservation pressures, Minsait built the solution to be sustainable. It leverages solar power through a small photovoltaic panel on the cameras to conserve energy and lower costs.

The solution also goes into “sleep mode” to conserve power and can “wake up” at regular intervals during dry months to capture data when needed the most, Ortega explains.
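A duty-cycled loop of this kind might look like the following sketch, which stands in a random score for the real AI inference and a print statement for the 4G/LTE alarm. It is illustrative only, not Minsait’s firmware:

    # Wake, sense, infer, alarm only above a confidence threshold.
    import random, time

    ALARM_THRESHOLD = 0.85
    WAKE_INTERVAL_S = 1          # minutes-to-hours in a real deployment

    def run_fire_detector():
        # Placeholder for AI inference on a captured camera image
        return random.random()   # pretend smoke-probability score

    def send_alarm(score):
        print(f"ALARM -> control center (confidence {score:.2f})")  # via 4G/LTE

    for _ in range(3):           # three wake cycles for the demo
        score = run_fire_detector()
        if score >= ALARM_THRESHOLD:
            send_alarm(score)
        time.sleep(WAKE_INTERVAL_S)  # back to low-power sleep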

For security, the solution is built around the Intel Trusted Platform Module 2.0. “This allows us to have a solution that can live in the middle of nowhere. If someone tries to open the device, they can open the box, but they’re not going to be able to access the information in it,” says Ortega.

The Future of Wildfire Detection Technology

While the Minsait solution has been used only by utilities, Sánchez says the company plans to expand its use cases in the future.

She explains that in addition to wooded areas and power lines, the detection system is suited to environments such as farms and rail lines, which often cut through fire-prone areas. The company has already conducted some tests in farmland areas where the system was able to accurately detect fires and issue an alarm.

This summer the SWD system will be fully operational in Spain, where it will work to make a significant dent in the region’s wildfire problem.

 

This article was edited by Christina Cardoza, Associate Content Director for insight.tech.

AI Robots Transform E-Commerce Fulfillment

The Enlightenment marked a transition toward understanding natural phenomena through intellect rather than intuition, folklore, or cultural belief. At the time, noted chemist Antoine-Laurent de Lavoisier described the principles of physics as follows: “Nothing is created, nothing is destroyed, everything is transformed.” While his observation was documented in the 18th century, it well describes our modern hyper-connectedness and the draw of e-commerce.

The appeal of purchasing something from the comfort of your couch and having it delivered to your door lies not only in its convenience; it is often more affordable than shopping in a big-box store. And it has been better for public health as we navigate an ongoing global pandemic.

To meet the growing demand of e-commerce, large online marketplaces continue to spring up, focusing on streamlined warehouse operations and delivery services to remain competitive and meet customers’ expectations for same-day delivery.

But despite the many advantages of e-commerce, there are also drawbacks: from job losses and environmental impacts to questions about the continued existence of brick and mortar. We have seen store closures and a devaluation in commercial real estate among temporary tenants and anchor stores alike.

Yet when viewed through another lens, jobs and real estate values have shifted across industries and sectors to meet growing demands. Above all, e-commerce has created countless opportunities for technology experts, has resulted in large spaces being rented to support business operations and logistics, and has fostered long-term partnerships with delivery companies.

Although times have certainly changed, Lavoisier’s insights remain constant and relevant; namely, that nothing is created, nothing is destroyed, and everything is transformed. I would simply add that transformations are always underway.

AI Robots Power the Future of Retail

I have been collaborating with Intel® for more than four years, which has enabled me to get to know the Intel® Partner Alliance program. Essentially, its goal is to help innovative companies refine their inventions via strategic partnerships and state-of-the-art technology. Through this ecosystem, I discovered Fabric, a company making profitable on-demand e-commerce a reality. Its aim is to transform the sector into local e-commerce, using a technique it refers to as “micro-fulfillment.”

Fabric’s proprietary solution is poised to shift e-commerce toward a model that is not only efficient but also inclusive and sustainable. Leveraging advanced Intel® technology, Fabric’s team of experts has created a highly automated proximity logistics model. Combining robotics and AI has allowed them to set up sorting centers within city centers. All warehouse operations are executed by AI-controlled robots, limiting the need for human involvement to the final phase of the delivery process (Video 1).

https://www.youtube.com/watch?v=mizez0yqioM

Video 1. Thousands of orders can be filled per day by hundreds of AI-powered robots. (Source: Fabric)

Thanks to this advanced automation, customers can benefit from deliveries within an hour and a guarantee that their items will be available in-store at the designated time. The space required to install the system depends on the product catalog and numbers, and it can accommodate virtually any business model. With careful planning, it is possible to achieve retail goals that were previously unthinkable.

One of Fabric’s design principles is that micro-fulfillment must be easy to scale. To that end, the company monitors each fulfillment center from its support center in Tel Aviv. Centralized support means most issues can be solved by Fabric personnel before the local operation even notices that something has gone wrong.

“Our technology is set up so that service personnel can see a 3D map. They see where every robot is moving and what tasks it’s doing,” says Phil Godden, Fabric’s Director of Sales Engineering. “They can see every sensor, every solenoid. And they have full control, even though it could be a problem taking place in Brooklyn and the person fixing it is in Tel Aviv.”
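The kind of per-robot state such a remote 3D view needs can be pictured as a compact telemetry envelope. The field names below are assumptions for illustration, not Fabric’s actual schema:

    # Hypothetical per-robot telemetry message for a remote 3D map.
    import json, time

    def robot_status(robot_id, x, y, z, task, sensors_ok):
        return {
            "robot": robot_id,
            "pos": {"x": x, "y": y, "z": z},   # drives the 3D map marker
            "task": task,
            "sensors_ok": sensors_ok,          # per-sensor health rollup
            "ts": time.time(),
        }

    # A support engineer in Tel Aviv might see this for a Brooklyn site:
    print(json.dumps(robot_status("R-042", 12.5, 3.0, 1.2, "pick", True)))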

Thanks to this advanced #automation, customers can benefit from deliveries within an hour and a guarantee that their items will be available in-store at the designated time. via @insightdottech

Micro-Fulfillment Transforms Business

In fact, Fabric’s solution is already being used by multinational corporations and franchises with great success. Since 2018, Super-Pharm—the largest health and beauty retailer in Israel—has benefited from the pioneering micro-fulfillment process. To date, the business has migrated 90% of its home deliveries to micro-fulfillment, with exceptional results, including:

  • Same-day delivery
  • A 250% increase in fulfillments
  • Auto-adjusting to peak demand or low demand times, with minimal labor costs
  • Optimizing inventory and stock, and allowing for predictive planning
  • Unmatched customer experience

“The 660-square-meter site can currently do more than a thousand orders per day, which is amazing from that space,” says Shirley Bachar, Commercial Director at Fabric. “As it ramped really quickly when COVID started, we were able to add more robots and increase the capacity.”

On a deeper level, by challenging the concept of “centralization,” businesses can exploit unused real estate assets within cities and create jobs. Without the need to compromise on operational independence, this decentralized approach increases the benefits of e-commerce by mitigating its impact on local communities and the environment.

After almost 40 years in the tech industry, this is the kind of transformation I like to see: one that does not involve trade-offs but rather fosters sustainability and inclusion. Through Fabric and Intel®, retail business operations have been reconceptualized with an eye to affordability, enhanced customer experience, and respect for the environment and workers.

 

This article was edited by Georganne Benesch, Associate Content Director for insight.tech.

The Future of Retail? Great Customer Experiences

Today’s shoppers want more than choices; they want experiences. In New York, Nike is delivering just that with a basketball court inside its 55,000-square-foot SoHo store. Coming in for new shoes or a jersey? Stick around and shoot some hoops.

Like Nike, other retailers focus on delivering experiences, not just transactions. Dick’s Sporting Goods’ House of Sport includes a climbing wall and wellness services. Lowe’s plans to provide services such as windshield cleaning and air stations to contractors. In-store innovations elsewhere include vaccination stations and community meetups.

Customer experience reigns supreme, and retailers know it. After two years of a pandemic during which shoppers got used to ordering online, retailers are getting creative about drawing people back into stores with a combination of experience and technology.

“You’re going in and you’re not just shopping. You’re trying things. You’re tasting; you’re sampling the product,” says Andy Szanger, Director of Strategic Industries at CDW, a multi-brand provider of technology solutions. “You’re seeing retailers start to have different events within the store, whether that’s a happy hour, musical performance, or spoken word.”

Part of the in-store attraction is providing a digital-like experience to expedite shopping. So retailers are investing in fast checkouts, inventory management systems, and seamless online/in-person shopping.

“If you truly want to be a tier-one retailer, you’re going to need both e-commerce and physical bricks because your shoppers want to shop with you in both ways,” says David Dobson, Industry Director, Retail and Hospitality, at Intel.

But as they try to strike a balance between digital and in-store experiences, retailers are still coping with pandemic-related challenges and an acute labor shortage.

After two years of a pandemic, #retailers are getting creative about drawing #shoppers back to stores with a combination of experience and #technology. @CDWCorp via @insightdottech

Digital Transformation Trends in Retail

The pandemic taught retailers to be more agile and adaptable as shoppers embraced habit-forming practices such as BOPIS (buy online, pick up in store) and curbside pickup.

To support those practices and survive temporary shutdowns and reduced hours, many retailers accelerated their digital journeys. They invested in seamless integration between online, mobile, and physical channels; contactless payments; and tighter integration between POS, inventory, and ordering systems.

“There was about 10 years of innovation in about six months,” Szanger says.

Digital transformation trends still drive a lot of investment, but drawing customers back to stores is a priority. Shipping orders is costly, cutting into profits. More important, customers buy more when they can see, touch, and smell products, which leads to unplanned purchases.

“Some of the most profitable purchases that are made for a retailer are often through impulse buys,” says Szanger.

CDW is helping its retail business customers with solutions for POS modernization. These include automated inventory management, smart shelves, and AI-driven analytics that capture online and in-store data to drive supply chain decisions.

“Retailers are looking for customers to want to go to the store instead of need to go to the store,” says Szanger. “Although it’s a subtle difference, it’s an important one because it’s about that shopping experience.”

Coping with Labor Shortages

Running stores is no easy task during the “Great Resignation.” To make up for staff shortages, retailers are investing in automation and modernization with a focus on employee productivity, Szanger says.

For example, handheld devices enable checkouts anywhere in a store. Handhelds also allow associates to answer customer inquiries, access inventory, and manage curbside pickup in an efficient manner.

So-called microservices, which give users access to in-store functions through mobile apps, can play a key role. For instance, shoppers can download an app that replicates checkout screens so they don’t have to stand in line to make purchases.

Retailers can also use microservices for omnichannel marketing. If a retailer launches a new promotion, it appears on the POS screen. At the same time, the promotion appears in multiple other places, such as self-checkouts, smartphone shopping basket icons, and curbside pickup messages.
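At its core, this is a publish/subscribe fan-out. The following minimal sketch, with invented channel names, shows one promotion event reaching every touchpoint at once; it illustrates the pattern, not CDW’s implementation:

    # One promotion event fanned out to every retail touchpoint.
    subscribers = {}

    def subscribe(channel, handler):
        subscribers.setdefault(channel, []).append(handler)

    def publish(promo):
        for handlers in subscribers.values():
            for handler in handlers:
                handler(promo)

    for ch in ("pos", "self_checkout", "mobile_app", "curbside"):
        subscribe(ch, lambda promo, ch=ch: print(f"[{ch}] showing: {promo}"))

    publish("2-for-1 cold brew, today only")   # appears everywhere at once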

Partnerships Prevail

To help retailers with technology decisions, CDW provides a consultative approach.

“We help our customers with full-stack solutions throughout every step of their technology journey, whether it’s figuring out what to buy, procuring the gear, setting it up, and ultimately managing the system as-a-service,” says Szanger. “We have fully hosted solutions that we offer to our clients as well.”

What makes all of this possible is a combination of CDW’s fulfillment and service capabilities and partnerships with industry leaders such as Intel®.

Addressing customer needs takes an ecosystem of technology and service partners that can step in when needed. “The real magic happens when we’re working with our partners to consult with our clients and help them look at technology and their business in new ways,” Szanger says. “Intel has always been an extremely strategic partner for us.”

With ongoing innovations and partnerships, CDW facilitates success. Whether through in-store mini tradeshows, community gatherings, happy hours, or vaccination drives, retailers are finding various ways to bring people into their stores.

So if all customers want is to grab a couple of items and leave, new technologies can usher them through quickly. If they want to hang out and play for a while, meet people, or learn about new offerings, increasingly they have that choice as well.

 

This article was edited by Georganne Benesch, Associate Content Director for insight.tech.

IIoT Opens the Floodgates for Smart Water Management Systems

You expect water to be there when you turn on a faucet, but utilities must clear several obstacles to get it there. Infrastructure is deteriorating, equipment is reaching end of life, and water scarcity issues make it even harder to guarantee reliable water.

The traditional approaches to managing water assets are no longer sustainable. The current method of stretching equipment life and patching and repairing the underlying infrastructure puts these systems at even greater risk. But it’s not that water utilities are unwilling to change. They are dealing with constrained budgets that constantly force them to rethink and reprioritize where and how they should invest in modern technologies.

“Long-term under-investments, often leading to costly reactive break/fix maintenance models or preventative maintenance at best, pose a mounting threat to water quality and availability. The fact that many water utilities have limited visibility and control over these assets amplifies the challenge,” says Jamal Shareef, Chief Executive Officer for Zotera, an industrial IoT and Industry 4.0 solution provider.

Pumping Toward a Digital Transformation

Water utilities already have extensive experience with the “things” component of the Industrial Internet of Things (IIoT), given the many sensors and other devices already deployed throughout their operations. Today they use Supervisory Control and Data Acquisition (SCADA) systems to monitor the status of operations. But SCADA is a 40-plus-year-old technology, and maintaining the increasing number of distributed assets and complex infrastructures has become cost prohibitive and error prone. Utilities need a more innovative approach to water management to meet and adapt to these emerging challenges.

So where can they start? According to Shareef, the best entry point for initial IIoT investments is at the pumping station, given that they are present in every facet of water operations. And there can be hundreds and, in some cases, thousands of pumping stations across a metro area.

“Providing deeper insight into operations, with real-time #DataAnalytics that enable meaningful and effective management decisions is key.”–Jamal Shareef, CEO, Zotera via @insightdottech

By upgrading and replacing older pump control technologies—and in some cases, simply augmenting assets with intelligent remote monitoring and control systems—utilities can improve operational efficiencies and system reliability, extend the operational life of pumps, as well as reduce energy and maintenance costs.

Smart Water Management Systems for Increased Efficiency

Water utilities must find solutions that complement the infrastructure and technologies they already have in place. For instance, as old as SCADA technology is, it’s not going to go away overnight. “But it’s more than just replacing pumps, adding sensors to equipment, and advanced monitoring capabilities. Providing deeper insight into operations, with real-time data analytics that enable meaningful and effective management decisions is key,” says Shareef.

SCADA systems in water utilities are already collecting all types of data in compliance with regulatory, environmental, and citizen needs. Traditionally, that data goes to a control center where operators monitor and analyze it. When data shows abnormalities, operators must often manually tend to the pumps and perform the necessary remediation service.

But if a particular area experiences heavy rainfall, an operator in charge of monitoring flood conditions can be hit with several different alarms at once and forced to make quick decisions.

Adoption of IIoT technology can help accurately analyze the data, predict flow volumes, and determine next steps in real time, explains Mo Kotaiche, Chief Technology Officer at Zotera.

Zotera’s Radius solutions add intelligence to the pumps, allowing automated processes to reduce manual work and eliminate latency and delays.

Zotera’s Radius Edge Computing Platforms allow real-time analytics to be hosted close to the sensors, at the location where data is generated, enabling real-time decisions. Alerts and alarms are sent to the control center and the cloud in real time. The remaining data is sent to the cloud for predictive and prescriptive analytics.
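In outline, that edge-first split might look like the sketch below, where readings above an assumed alarm threshold go out immediately and routine samples are buffered for bulk upload. This is illustrative only, not Zotera’s Radius code:

    # Analyze pump readings at the edge; alarm now, batch the rest.
    ALARM_PSI = 95.0
    cloud_buffer = []

    def on_reading(psi):
        if psi >= ALARM_PSI:
            print(f"ALARM: discharge pressure {psi} psi")  # control center + cloud, real time
        else:
            cloud_buffer.append(psi)                       # uploaded later in bulk

    for sample in (72.1, 74.8, 96.3, 73.5):                # hypothetical samples
        on_reading(sample)
    print(f"{len(cloud_buffer)} routine readings queued for cloud upload")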

“This approach informs operators of the time before maintenance is required, eliminating guesswork and reducing unscheduled downtimes,” says Shareef.

Coexisting in the Water Industry

According to Shareef, the solution would not be possible without collaboration. Zotera works to bring partners together inside and outside its solutions.

For instance, Zotera fosters collaboration between IT and OT teams, which traditionally have been isolated from each other. The company hopes that by converging IT and OT in one system, it will develop more robust solutions that provide a standard approach to implementing solutions at the edge of the network. It may also foster a closer working relationship between IT and OT teams.

On the Radius product development side, Zotera worked in technology provider Arrow Electronics’ Colorado Open Lab to demonstrate how its solutions would work in the real world without impacting its clients’ environments and help bring OT and IT together. It provides a collaborative space for companies to explore other new capabilities with their partners.

When small vapor-filled cavities form in the water, they can wreak havoc on pumping systems. Zotera is currently working with utility technology leader ABB on a firmware upgrade to its Variable Frequency Drive, which enables Radius to perform cavitation detection. The plan is to have ABB come out to the Open Lab, install the firmware, and test the use case.
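One common cavitation heuristic, sketched below purely for illustration, flags a jump in short-window signal variance, since vapor cavities make pressure and vibration readings erratic. The actual ABB/Zotera firmware approach may differ:

    # Flag cavitation when recent signal variance spikes.
    import statistics

    WINDOW, STD_LIMIT = 5, 1.5

    def cavitation_suspected(signal):
        recent = signal[-WINDOW:]
        return statistics.pstdev(recent) > STD_LIMIT

    steady     = [50.1, 50.3, 49.9, 50.2, 50.0]   # smooth flow
    cavitating = [50.1, 47.2, 53.8, 46.5, 54.1]   # erratic pressure
    print(cavitation_suspected(steady), cavitation_suspected(cavitating))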

Zotera’s edge computing platforms are built with Intel® technologies and include industrial ruggedized servers, industrial HMI panel PCs, rugged compact computers, Cisco ruggedized industrial routers, and internal SSDs. Zotera also partners with Intel to provide edge computing capabilities and prepare for new technology advancements. Advanced communications built on the Cisco Industrial Automation solution deliver high-speed connectivity, scalability, and high availability to connect Zotera’s Radius Edge Computing Platforms to the cloud and operations centers.

Last, Zotera works closely with system integrators on their end-to-end solutions. “SIs are out there on the frontlines talking to customers every day. They can be very helpful in terms of how we develop solutions. They tell us what they are hearing, and we talk about what is coming down the pike in terms of technologies and determine together how best to apply it,” says Shareef.

He hopes going forward that pumping stations will become a catalyst for further adoption of IIoT technologies within the water industry. “From opportunities in water treatment and quality to reducing power generation, we envision a lot more good changes to come in the future. This is really just the beginning,” he says.

 

This article was edited by Georganne Benesch, Associate Content Director for insight.tech.

Cloud Migration Accelerates as Confidence Grows

How secure is the cloud? For years, that has been the number-one question on everyone’s mind when considering moving critical workloads to the cloud. And it’s been the main reason businesses have stayed off the cloud. But all that is starting to change.

“That security concern in the early days was paramount. People just couldn’t get comfortable. But as time has passed, I think it’s pretty much been proven that the cloud can be as secure, if not more secure, than on-premises data centers,” says Jim Jordan, President and CEO of RiverMeadow, a multi-cloud migration company.

Over time, the growing extent of capabilities and services has fundamentally increased the cloud’s ability to run mission-critical, or indeed any, workloads. The benefits can no longer be ignored. Businesses want lower CapEx costs, elasticity, and the on-demand capabilities that the cloud offers, according to Jordan.

But migration challenges haven’t changed. And most organizations still lack the in-house expertise to do it on their own.

On-Demand Cloud Migration Expertise

Jordan finds that when businesses start to move to the cloud, they quickly realize they don’t even know what’s running in their data centers. For instance, how many servers and applications do they have, and what servers make up an application?

“Now that security has kind of been checked, the key blocker is that organizations still don’t have their people trained on how to run their business in the cloud,” Jordan says.

RiverMeadow saw the need for automated cloud migrations early on when the company launched in 2013. But at the time, the market wasn’t ready yet.

It hasn’t been until the past three to four years that market maturity and confidence have started to grow, Jordan explains. But he estimates that 70% of the market still has yet to move entire estates to the cloud—making it a perfect time for companies to leverage the expertise of a multi-cloud migration specialist like RiverMeadow.

“Think of RiverMeadow as the moving truck,” says Jordan. “We show up at your house, we pack up all your assets—the infrastructure and applications—we move them to your new place, we plug them in, and they work.”

RiverMeadow assesses an organization’s data center and then determines what workloads can and cannot be moved.

70% of the market still has yet to move entire estates to the #cloud—making it a perfect time for companies to leverage the expertise of a #MultiCloud migration specialist like @RiverMeadow1. via @insightdottech

In the past, migration was limited to simpler, single applications such as email or productivity apps, according to Magnus Rusk, Technical Manager EMEA at RiverMeadow. Customers didn’t trust applications to work as well in the cloud. But as more companies experiment in the cloud, they are quickly realizing all the possibilities.

“It’s database applications, it’s ERP systems, it’s just everything out there,” Rusk says. “Customers are starting to migrate more complex applications because they have more trust in the cloud.”

A Lift, Shift, and Optimize Cloud Migration Strategy

When organizations need to get to the cloud as quickly and easily as possible, RiverMeadow takes a lift-and-shift cloud migration approach. That’s why assessing an organization’s application landscape and all its dependencies upfront is so vital. It allows a cloud migration to be swift and efficient, with little disruption, as the applications are effectively “lifted” from their on-prem environment and “shifted,” as is, to the target private or public cloud.

If businesses are running on legacy operating systems, they can also “lift and optimize” their workloads as they move to the cloud with RiverMeadow’s OS modernization capability. This enables them to upgrade their legacy Windows and Linux operating systems to more current versions, helping them to reduce operational risk and avoid extended support costs.

RiverMeadow also helps businesses right-size for the cloud so they don’t over-provision resources and end up paying too much for memory or CPU. Businesses can scale resources and memory up or down as needed, which is often a problem with on-premises data centers, according to Rusk.

“We have a customer that bought a whole bunch of new gear for its on-prem data center. And it was two to three months before they could physically get to use that gear. The benefit of cloud is, effectively, with the swipe of a credit card, I can spin up more storage capacity if that is what I need. I can spin up virtual machines,” he says.
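Right-sizing logic can be as simple as matching observed peak utilization, plus headroom, to the smallest adequate instance. A hedged sketch with made-up instance names and sizes, not RiverMeadow’s algorithm:

    # Pick the smallest instance covering p95 utilization plus headroom.
    SIZES = [("small", 2, 4), ("medium", 4, 8), ("large", 8, 16)]  # name, vCPU, GiB

    def right_size(p95_vcpu, p95_mem_gib, headroom=1.2):
        for name, vcpu, mem in SIZES:                 # smallest first
            if vcpu >= p95_vcpu * headroom and mem >= p95_mem_gib * headroom:
                return name
        return "needs custom sizing"

    # A server peaking at 2.8 vCPU / 5 GiB fits "medium", not "large"
    print(right_size(2.8, 5.0))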

The Need for Cloud Migration Speed

RiverMeadow recently did a migration for Mitel Communications in less than 90 days (Video 1). “They were under a time pressure,” says RiverMeadow’s Global Marketing Director Emma Tompkins. “And we were able to do a phenomenally quick migration for them.” The migration consisted of moving 1,000 VMware workloads that run customer services to Google Cloud VMware Engine.

Video 1. RiverMeadow lifts, shifts, and optimizes workloads for customers in a fast, efficient way. (Source: RiverMeadow Software)

Another project involved moving the IT infrastructure of Cambridge University Press to the cloud—which resulted in substantial CapEx and OpEx savings, according to Tompkins. RiverMeadow was able to migrate about 750 servers from two on-premises data centers to Amazon’s AWS cloud services in about six months.

Post-migration, RiverMeadow performed acceptance tests to validate that applications and components were running as expected. It helped automate common operational tasks such as application monitoring, operating system monitoring, endpoint protection, and patch management.

Making Way for More Cloud Migrations

According to Tompkins, what those customer migrations had in common was a top-down mandate to move to the cloud. Whether it is about controlling costs or reducing OpEx, top business leaders are the ones making the decision to migrate.

“Everyone is saying, ‘We may not know everything we need to know about this thing called cloud, but we have to have a cloud strategy as a corporation, as a strategic mandate,’” says Jordan. And then it is up to the IT teams to figure it out and do the heavy lifting.

To address these challenges, RiverMeadow works with partners like Intel®. Not only is Intel providing the hardware that makes moving to the cloud possible, it is also funding pilots to help businesses understand private, public, or hybrid clouds. This breaks down any lingering fear, uncertainty, or doubt a customer has about migration, Jordan says.

“We can go in and demonstrate our capability to businesses. Show them how fast and easy it is and open their eyes to move to the cloud a lot faster,” he says.

As businesses head into a cloud-native future, Jordan expects rapid adoption and acceleration over the next couple of years. Customers will not only get more comfortable with migrating but also with using multiple clouds and moving between clouds to leverage pricing and capabilities, Jordan explains.

Increasingly, they will want to perform all these actions themselves. RiverMeadow will continue to support these transitions with their fixed-price, easy-to-use, self-service, multi-cloud migration platform. “That way,” says Jordan, “customers can start to move their workloads to and between any clouds on their own, with a single pane of glass.”

This article was edited by Christina Cardoza, Senior Editor for insight.tech.

Processor Innovation + Virtualization Power Edge Computing

When we look back on the early 2020s, we’ll remember how adversity compelled us to adapt, and in many cases, revealed parts of us we didn’t know we had. Globally, we have experienced shopping, attending classes, working with colleagues, and even spending time with families in a whole new light.

The same is true in the tech world. Faced with industry-wide digital transformation, electronics companies had been maneuvering for position before the COVID-19 pandemic. Then like dominos, quarantines turned into production slowdowns, which resulted in supply chain disruptions.

ASRock Industrial, an IoT edge solutions OEM, was one of them. The company was less than four years removed from spinning out of ASRock Inc., a provider of industrial-grade products, when COVID hit. Suddenly, demand from its low- to medium-volume factory automation, robotics, and security customers became uncertain. The ASRock Industrial team realized it could no longer rely solely on its traditional off-the-shelf hardware business.

“We are moving from pure hardware to gradual value-adds,” says James Lee, President of ASRock Industrial. “We build application-ready platforms for our clients, which include the industrial PC itself, middleware, and containers to host different types of operating systems, and shared memory technology to speed up the transmission of data.”

Flexibility Is Key to IoT Edge Computing Evolution

For a hardware OEM to evolve into an IoT edge solutions provider, it needs a flexible foundation with the performance to satisfy multiple use cases. Otherwise, completely custom designs would be required for every customer, which is neither scalable nor cost-effective.

Understanding this, ASRock Industrial doubled down on its heritage of Intel® technology-based designs by adding support for 12th gen Intel® Core processors (formerly code-named “Alder Lake”). These new processors introduce a hybrid architecture with up to 16 Performance- and Efficient-cores that adapt seamlessly to edge workloads.

According to ASRock Industrial engineers, performance improvements in the 12th gen Intel Core processors’ single- and multi-threaded processing combine with the platform’s real-time capabilities to enable container-based microservices on industrial automation machines.

As a result, ASRock Industrial can help customers consolidate functions like Intel® OpenVINO toolkit-driven AI inferencing, motion control, and other capabilities onto a single device, as it did for one industrial customer that builds an automated optical inspection (AOI) system.

Consolidating IoT Optical Inspection on a Single IPC

For years, the customer used a multi-PC setup to perform product quality inspections at a manufacturing plant. The system included a Windows-based industrial control PC for machine vision and image processing, and a separate Linux platform that ran AI inference for the image inspection models.

The 12th gen Intel Core processor introduces a hybrid architecture with up to 16 Performance- and Efficient-cores that adapt seamlessly to #edge workloads. via @insightdottech

Because the two systems had to constantly pass imaging data back and forth over a physical LAN, latency became an issue. The delays were so pronounced that eventually the AOI became the bottleneck in the entire manufacturing process. This prompted the customer to look for a new single, self-contained system architecture. Partnering with ASRock Industrial, it arrived at the 12th gen Intel® Core Desktop Series processor-powered IPC.

As shown in Figure 1, the iEPF-9010S supports all the functionality the AOI requires by hosting applications previously run on the two systems in different virtual machines. Hardware-assisted Intel® Virtualization Technology (Intel® VT) and native real-time connectivity for the application’s deterministic tasks make this possible. Plus, it’s compatible with OpenVINO, which can be used to accelerate AI workloads.

iEPF-9010S consolidates edge devices with improved performance
Figure 1. A combination of engineering expertise and the iEPF-9010S consolidates a multi-system edge computing setup—improving data throughput 100x. (Source: ASRock Industrial)

But the IPC didn’t support the AOI application out of the box. To facilitate the new environment, ASRock Industrial partitioned the workloads using a KVM hypervisor and developed a virtual LAN that replaced the physical data exchange connection. The company also designed a software tool that lets users optimize the platform even further by sharing memory between the virtual machines.
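The speedup from shared memory is easy to appreciate in miniature. The sketch below uses Python’s multiprocessing shared memory between two OS processes as a rough stand-in for the VM-to-VM mechanism: the “frame” is written once and read in place, with no LAN hop. It illustrates the concept, not ASRock Industrial’s tool:

    # Two processes exchange a frame buffer with zero copies.
    from multiprocessing import Process, shared_memory

    def consumer(name, size):
        shm = shared_memory.SharedMemory(name=name)
        print(bytes(shm.buf[:size]).decode())   # read the "frame" in place
        shm.close()

    if __name__ == "__main__":
        frame = b"inspection frame 0001"
        shm = shared_memory.SharedMemory(create=True, size=len(frame))
        shm.buf[:len(frame)] = frame            # producer writes once
        p = Process(target=consumer, args=(shm.name, len(frame)))
        p.start(); p.join()
        shm.close(); shm.unlink()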

In all, the AOI system’s data transmission speeds increased by 100x compared to the dual-platform configuration.

A New Era of IoT Edge Solutions & Suppliers

The outcome of the project is better than either party could have hoped for. The customer, of course, not only removed the bottleneck in its manufacturing plant but also benefitted from the reduced cost and complexity of having to manage one system instead of two. And on the surface, ASRock Industrial got a design win and a happy customer.

But going deeper, the technology enabled by 12th gen Intel Core processors probably transformed the IoT edge computing provider as much as it transformed its customer. With the processors’ inherent versatility, ASRock Industrial can now iterate on top of its own open-architecture hardware with middleware and design services that get customer solutions to market faster.

“We see this kind of customization as a trend,” Lee explains. “Right now, we’re introducing this concept to our customers because they tend to be able to deploy their software. In terms of the business right now, we will gradually add on customization and software capabilities into our products, and my expectation is after five or ten years it will be more than 30% of our business.”

This adaptability, born out of adversity, is what separates success from failure in uncertain times.

 

Listen to Inside the Latest Intel® Processors with ASRock Industrial on our IoT Chat podcast to discover even more about ASRock Industrial and how they are leveraging the latest 12th gen Intel Core Processors.

 

This article was edited by Georganne Benesch, Associate Content Director for insight.tech

Standing on the Edge of Smart Retail Possibilities

Most of us grew up knowing retail and hospitality as hands-on businesses—the server handing you a menu, the ring of a cash register (not to mention cash!), perhaps even a string tied around your bakery box. But times change and needs change. And a lot of legacy businesses out there—like many shops and restaurants—still struggle with the hows, whens, and whys of the transition to digital. Enter companies like edge computing platform provider Reliant.

We talk with its CTO and Co-Founder Richard Newman about how brick and mortar is transforming into smart retail businesses with cloud- and edge-based architectures, what can be done about the supply chain problem, and the best way to transition out of legacy systems.

How is edge computing addressing the changing needs of retailers and restaurants?

Because of COVID and the pandemic, a lot of the industry has doubled down on contactless—whether that be payment, or buy online and pick up in-store, or delivery-based services and fulfillment out of physical stores or restaurants. And so a lot of older brands, ones that hadn’t invested as heavily in technology as they might have, either shrank substantially or even went out of business.

The innovative companies, the ones that were already pushing the envelope with things like autonomous shopping, self-checkout, tight integrations and delivery, advanced omnichannel-based order management—they’re the ones that did the best coming through COVID.

But brick and mortar isn’t going away by any stretch. It’s really becoming much more of a hybrid model. That has forced changes in the systems deployed in that environment, changes that require modernization and re-architecting of those actual physical systems. Take quick-serve restaurant operators. Suddenly they’re seeing a much larger percentage of orders being processed outside of their restaurants, and their current systems and the way their kitchens are configured can’t necessarily keep up unless they are investing in next-generation technologies.

How can businesses go about rethinking their systems to effect these changes?

It’s an incremental approach, but edge computing provides some great ways to do it. Existing legacy systems can be virtualized and combined: virtual machines can run at the same time, and workloads can move from a monolithic architecture to delivery as lightweight containers. Then all of that can be connected much more closely to cloud-based systems and services. That’s how to make a start. It could be one simple application at a time, but along the way they’ll be able to develop a much more agile architecture.

And changing up the architecture, going from legacy to an edge-based approach, provides an opportunity. If you think about the way things used to be back when we were talking about data centers, it wasn’t so easy. You had to actually order a server from somebody, wait for that server to arrive, rack it, connect it to the network, provision it, and load software on it. Whereas if you’re running a cloud-based infrastructure, people think nothing of adding another virtual machine or virtual host or another set of container-based workloads in the cloud. But many operators, when they need a new system deployed in their physical store or physical restaurant, are still stuck in the older mindset.

Edge changes that. In an edge-based workload, they’ll have a foundational system—like Reliant’s platform—that will sit at their physical store or physical restaurant. And if they want to add another virtual machine, they provision it and it pops up. If they want to add another set of containers connected to a Docker container registry—a set of applications they may have—all those actions are done through the cloud, and they’re up and running in the store or restaurant. It’s really that easy, and that’s the big change.

Tell me about the importance of Intel® to achieving these scenarios you’re describing.

Customers want something that’s highly leverageable, but that’s also going to be highly reliable. Suddenly they’re concentrating more workloads on what might be a cluster, and they want scalable capacity. That moves them off of what might be consumer-grade components and hardware to something that’s more data-center grade.

That’s where Intel® has been doing tremendous work, and the silicon it produces and the architectures it supports lead the industry in that space. And it’s coming out with new stuff all the time, which is just fantastic for these types of applications.

Are you seeing any hesitation in your customers to transform?

What often happens in the physical world in retail and hospitality is that everyone goes through various levels of hardware upgrade cycles, or network upgrade cycles, or major application shifts. So it’s at those moments of transition that they can take advantage of moving to an edge architecture instead of doing the same old thing. Take advantage of the changes they have to make, to make the changes they need.

“Many operators, when they need a new system deployed in their physical #store or physical #restaurant, are still stuck in the older mindset. #Edge changes that.” –Richard Newman, CTO and Co-Founder @reliantio, via @insightdottech

In most cases, when they dig in, there’s real ROI associated with moving architectures. They end up with less physical gear overall, which means there’s less to break and lower break-fix costs. Higher reliability, better uptime. It’s easier to support and service the edge-based products, too.

How does it all work in terms of the technical complexity?

Edge-based systems should look and act like cloud-based systems in how they are managed and operated. Businesses should be able to configure them through easy-to-use GUI tools, automate that configuration, and manage configuration orchestration through an API. And that mirrors the cloud.
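As a hedged illustration of what API-driven configuration orchestration can look like, the sketch below declares a site’s desired workloads and pushes them to a management plane. The endpoint, token, and payload schema are hypothetical, not an actual Reliant API.

```python
# Sketch of API-driven configuration orchestration for a fleet of edge
# sites. Endpoint, token, and payload schema are hypothetical placeholders.
import requests

API = "https://edge-mgmt.example.com/api/v1"   # hypothetical management API
HEADERS = {"Authorization": "Bearer <token>"}  # placeholder credential

# Desired configuration, declared once in the cloud...
desired_config = {
    "site": "store-0042",
    "workloads": [
        {"name": "pos", "image": "registry.example.com/pos:2.1", "replicas": 1},
        {"name": "signage", "image": "registry.example.com/signage:3.0", "replicas": 2},
    ],
}

# ...and pushed to the management plane, which rolls it out to the site.
resp = requests.put(f"{API}/sites/store-0042/config", json=desired_config,
                    headers=HEADERS, timeout=10)
resp.raise_for_status()
print("rollout accepted:", resp.json().get("rollout_id"))  # hypothetical field
```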

Anyone who’s been through a cloud-migration project knows what’s involved with doing it. They understand that there are new technical skill sets that come up along the way. At some point organizations have to look at what they’re doing in their physical premises, and they’re going to say, “What are things going to look like going forward? How are we going to evolve to accommodate that?” That’s where companies like Reliant come in. We’re rolling up our sleeves and helping our customers achieve success.

Where do you see edge technology being critical to this evolution?

When we look forward, we’re going to see a world where there are many more opportunities to use AI and ML to support facilitated retail, facilitated restaurant operations, etc. Let’s just pick the most obvious and easy one—walk-in-walk-out frictionless checkout. We’ve grown up in a world where we’re used to waiting on checkout lines, and having our products scanned and weighed, our shopping carts emptied out, our products handled—all by a cashier.

Machine learning, machine vision, and related technologies like LiDAR and shelf sensors, wrapped in a web-to-edge architecture, eliminate the need for all that. The benefits are numerous, and they result in happier customers. They result in employees who are able to do jobs that involve more than just weighing a bag of brussels sprouts.

Another example—every restaurant has food prep going on. But if someone says they’re allergic to dairy yet sour cream ends up on their burrito, that’s an expensive mistake and it’s an unpleasant customer experience. But it’s really not much of a job for a camera to alert someone that there’s a problem. And it’s not just order quality, but food safety. How long has something been sitting out? What’s the temperature of a refrigerator or a cooking surface? A computer can catch mistakes, or help people make smarter decisions.
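As a simple illustration of the monitoring logic involved, the sketch below flags a cooler that drifts out of its safe range or food left out too long. The sensor-reading function and thresholds are hypothetical stand-ins, not an actual product API.

```python
# Toy food-safety checks: flag a cooler out of range, or an item held
# out past its safe time. Sensor read is a hypothetical stand-in for a
# real driver or gateway call.
import time

SAFE_MAX_C = 4.0           # common refrigeration guideline (~40°F)
MAX_TIME_OUT_S = 2 * 3600  # e.g., perishables left out beyond two hours

def read_cooler_temp_c() -> float:
    """Hypothetical sensor read; replace with real hardware integration."""
    return 3.2

def check_cooler(temp_c: float) -> None:
    if temp_c > SAFE_MAX_C:
        print(f"ALERT: cooler at {temp_c:.1f}°C exceeds {SAFE_MAX_C}°C")

def check_holding_time(placed_at: float) -> None:
    if time.time() - placed_at > MAX_TIME_OUT_S:
        print("ALERT: item has been out past its safe holding time")

check_cooler(read_cooler_temp_c())
check_holding_time(placed_at=time.time() - 3 * 3600)  # simulated: out 3 hours
```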

Are these technologies providing some new benefits to the supply chain?

Certainly. Edge computing can provide options for getting much smarter about what physical inventory or food products are being maintained in a store or restaurant. It’s better than having a manager run around with a clipboard trying to figure out what’s going on. And the data can be real time. So it provides opportunities for more dynamic pricing. It provides opportunities to potentially tell a customer, “We don’t have this product at this store, but we have it at this other store and we’ll find a way to get you what you need.”

Of course there are challenges with supply chain that can go halfway around the world to where products are manufactured. And the technologies we’re talking about here can provide visibility into: Where is something in the manufacturing process? Where is something in shipping? How long has it been at the dock? When am I going to get it in my store? And how can I use all this data to optimize my ability to give the customer the product or service they want at the best possible price?

How is Reliant helping customers make the transition?

We’re focused on making it as easy as possible for our customers to take their existing legacy applications and run them as virtual machines. Or to define new container-based workloads that they want to run, and plug it all together.

That puts us in a great position. Customers count on us to be smart and knowledgeable about the retail or the restaurant application stack. So we’re very conversant in payment, point of sale, signage, kitchen automation, order management, RFID—all these components that drive the modern store or the modern restaurant. We bring that business knowledge together with the customer’s requirements on edge computing on a platform that’s as open and as agnostic as possible.

Management of systems at scale becomes both an opportunity and potentially a challenge. All of these systems that are running in cloud and edge with a high degree of integrity have got to be managed. So an important part of what we try to do is to really make configuration highly automated, but also highly visible.

Another focus we have is making sure that, when customers are thinking about their workloads, they end up in situations where those workloads can run even if cloud connectivity is compromised, not available at all, or just degraded. And that’s very important, because the business needs to happen no matter what. It can’t be, “Sorry, we’re not going to be able to make your burger today because we’ve lost cloud connectivity. Our stove works, but we just don’t know how to cook a burger without the cloud-based system telling us what to do.” Resiliency is super important. It’s part of what you do with edge-based computing.
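The pattern behind that resiliency is often called store-and-forward: keep taking orders locally, and sync to the cloud when connectivity returns. Below is a minimal sketch of the idea, with a hypothetical sync endpoint; it is not Reliant’s implementation.

```python
# Store-and-forward sketch: the edge system records orders locally first,
# so the business keeps running even when the cloud is unreachable.
from collections import deque

import requests

CLOUD_SYNC_URL = "https://cloud.example.com/orders"  # hypothetical endpoint
pending = deque()

def take_order(order: dict) -> None:
    pending.append(order)  # record locally first: the burger still gets made
    flush_pending()

def flush_pending() -> None:
    while pending:
        try:
            requests.post(CLOUD_SYNC_URL, json=pending[0],
                          timeout=2).raise_for_status()
            pending.popleft()  # synced successfully; drop from local queue
        except requests.RequestException:
            break  # cloud unreachable; keep the queue and carry on locally

take_order({"item": "burger", "qty": 1})
```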

We’re really very excited about the pace of change right now. This couldn’t be a better time. Obviously COVID has driven a lot of work to us as organizations have scrambled to adapt. But coming out of COVID we’re seeing another phenomenon, which is everyone suddenly saying, “All these projects I had—for whatever reason they’ve been on hold. Now I’ve got to get going with them.” So we’re busy. It’s delightful. And I just want to keep it going.

Related Content

To learn more about edge and cloud retail experiences, read Edge + Cloud = Advanced Retail Operations and listen to the podcast Smart Retail Needs Edge Computing with Reliant. For the latest innovations from Reliant, follow them on Twitter at @reliantio and on LinkedIn at Reliantdotio.


This article was edited by Erin Noble, copy editor.

Why Big Memory Is a Big Deal for Big Data

Ever hear the saying “too much of anything is a bad thing”? That is exactly what is happening with data today. While information has become the lifeblood for businesses to make decisions and improve operations, the size of data is outpacing the memory available to store it. When that happens, performance and progress slow down or come to a halt.

This problem is only expected to grow as real-time workloads and data-intensive applications continue to rise.

“We are in an age of information. The amount of data being created is significantly increasing every day. And it needs to be processed rapidly. Today’s computer systems, based on the von Neumann architecture, are no longer able to keep up with this influx,” says Jonathan Jiang, Chief Operating Officer at MemVerge, a big memory software provider.

Storage I/O Is Not the Answer

This is especially difficult in the biosciences space, where it is not uncommon to have a data set exceed a terabyte. “When data is bigger than memory, the research often cannot be completed. In many cases, the program will just report an error and exit,” Jiang explains.

To get around this, researchers have traditionally had to swap data between memory and disk. This process, known as storage I/O, results in a lot of time wasted just reading from and writing to disk. For example, when researchers are in the middle of an analysis or experiment, they store data for persistence to protect against any program failures or for future reproducibility.

While the data is being copied to or read from storage, the researcher is forced to sit around and wait for that to be completed. This can equate to hours of downtime. Additionally, if the workload fails midstream, then the researcher has lost all their progress.
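To make that cost concrete, here is a small, hedged illustration (not Analytical Biosciences’ actual pipeline) of how a checkpoint to disk stalls a workload while an in-memory copy does not. Sizes here are deliberately modest; the research data sets described above can exceed a terabyte.

```python
# Illustration of checkpoint cost: persisting a working data set to disk
# stalls the pipeline for the full duration of the write, while an
# in-memory copy is comparatively instant.
import time

import numpy as np

data = np.random.rand(50_000_000)  # ~400 MB of doubles held in memory

start = time.perf_counter()
np.save("checkpoint.npy", data)    # traditional storage I/O checkpoint
print(f"disk checkpoint: {time.perf_counter() - start:.2f}s of pure wait")

start = time.perf_counter()
snapshot = data.copy()             # in-memory copy of the same data
print(f"in-memory copy: {time.perf_counter() - start:.2f}s")
```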

“One of the fundamental bottlenecks for performance of the von Neumann model is that data needs to be moved between memory and storage. And when you need to move data between a fast media and a slow media, your performance drops. As the amount of data continues to explode, that weakness in the computing infrastructure will be more pronounced,” says Jiang.

To address this problem, MemVerge has pioneered a new category of computing: big memory (Video 1), which allows applications to bypass traditional storage systems in favor of persistent memory. Jiang explains that this can result in a 10x performance improvement for data-intensive applications.

But for big memory to really take off, it will require software innovations like the ones MemVerge has made. The company’s snapshot technology eliminates I/O to storage and recovers terabytes of data from persistent memory in seconds.

https://www.youtube.com/watch?v=843_ibMpXAI

Video 1. MemVerge is leading the next generation of in-memory computing called big memory computing. (Source: MemVerge)
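MemVerge’s snapshot implementation is proprietary, but the underlying idea (load/store access to a persistent region rather than read/write calls to storage) can be sketched with ordinary memory mapping. On a DAX-mounted persistent-memory filesystem, the same calls map the media directly; the path below is purely illustrative.

```python
# General sketch of memory-mapped persistence, not MemVerge's SDK: map a
# persistent region so data survives a crash without a separate
# save-to-storage step. On a DAX-mounted pmem filesystem, these calls
# give direct load/store access to the persistent media.
import mmap
import os

PATH = "/mnt/pmem/snapshot.bin"  # hypothetical DAX-mounted pmem file
SIZE = 1 << 20                   # 1 MiB region for the sketch

fd = os.open(PATH, os.O_RDWR | os.O_CREAT)
os.ftruncate(fd, SIZE)
region = mmap.mmap(fd, SIZE)     # load/store access, no read()/write() syscalls

region[:5] = b"hello"            # ordinary memory writes...
region.flush()                   # ...made durable in place

region.close()
os.close(fd)
```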

Big Memory Offers Big Results

This is the exact answer Analytical Biosciences, a leader in single-cell genomics, was looking for in its research to fight cancer and help stop the spread of COVID-19. The organization found that more than 50% of its multistage analytic pipeline’s time was spent just loading data from storage.

To overcome this storage bottleneck, accelerate its discoveries, and be able to make faster predictions, Analytical Biosciences turned to MemVerge for help.

#BigMemory can result in a 10x performance improvement for #data-intensive applications. @MemVerge via @insightdottech

“Our goal and what we enable is big memory computing, which allows us to keep all the data in memory all the time, thereby eliminating that storage I/O,” says Jiang. “Even the fastest high-end storage solutions available today are still an order of magnitude slower than what memory can do.”

With MemVerge’s big memory computing platform, Memory Machine, Analytical Biosciences was able to leverage the solution’s memory snapshot technology, clone the data, and write it to persistent memory. This enabled Analytical Biosciences to load data 800 times faster, eliminate 97% of its storage I/O, and reduce the overall pipeline time by over 60%.

“In another use case for snapshots, you can move workloads seamlessly from on-prem data centers to cloud data centers and between different clouds. There are many interesting new operational concepts that can be enabled with big memory,” Jiang explains.

A New Era of In-Memory Computing

In the past, it has been too expensive to put all data in memory all the time. But recent advancements in persistent memory from Intel® have made the price point much lower per gigabyte than traditional DRAM.

By utilizing Intel® Optane technology in its Machine Memory solution, MemVerge provides more capacity and persistence in memory—improving application performance, scalability, and reliability.

As applications become more data intensive and memory becomes faster, Jiang predicts every industry will change its applications to take advantage of a big-memory infrastructure.

For instance, it is critical for the financial industry to provide services with high performance and low latency. Big memory will be crucial for it to stay competitive and to move and share data faster. Big-memory computing can also help the media and entertainment industry, which deals with a lot of the same interruptions in its pipelines as the biosciences space because of its large data sets.

App developers who have made accommodations for calling data sets from storage, bringing them in piece by piece for performance reasons, will have to rethink how they write their applications. “When the application developers and the IT operations organizations shift their mindset to big-memory computing, a lot more can be done,” says Jiang.

To make it easier to adopt its big-memory technology, MemVerge provides an SDK that allows customers to take advantage of the underlying infrastructure, develop new applications, and make use of its capabilities directly.

“This will change the face of the data center. When memory can cross physical machine boundaries, the focus of applications will change. They won’t need to optimize around memory usage,” says Jiang. “When that happens, that’s when big memory will really take off.”


This article was edited by Georganne Benesch, Associate Content Director for insight.tech.