Smart Factories and Next Gen IIoT

A Look at the Refinery of the Future

Recently, Daniel Newman, Principal Analyst at Futurum Research, discussed Video as a Sensor with Linda Salinas of Texmark Chemicals, Stan Galanski of CBT, and HPE’s Tripp Partain. A key element of the Refinery of the Future, the Video as a Sensor solution combines video sensor technology, edge computing devices, artificial intelligence (AI), and machine learning (ML) to bring enhanced, real-time data to plant operators so they are better positioned to address the most strategic aspects of their jobs. The solution is led and integrated by CBT, with critical technologies provided by HPE and Intel.

The Refinery of the Future, deployed at Texmark Chemicals in Galena Park, Texas, addresses some of the most pressing issues facing the oil and gas industry, including marketplace volatility, stringent environmental standards, competitive pressures, and workplace safety.

For Texmark, a chemical processing facility that manufactures hazardous materials in the petroleum product supply chain, the Refinery of the Future has sparked heightened performance through uninterrupted productivity, increased uptime, and better process analytics, while also delivering a safer environment for employees. Another key element of the Refinery of the Future is how it has cultivated a more connected and enterprising workforce. So let’s look at how Video as a Sensor contributes in these areas, and at Smart Factories and Next Gen IIoT.

Video as a Sensor

CBT was invited to work on this project with strategic partners HPE and Intel. Texmark needed a way to relieve the significant burden of activities on its operations control staff, and it wanted to deploy a Video as a Sensor capability. To make sure we understood what Texmark needed and left no stone unturned, we used our methodology, Innovation Delivery as a Service (IDAAS).

That methodology kicked off with a discovery series we call Jumpstart, where we sit down with the customer at length for a couple of days to fully understand and inspect their processes: how they do their jobs, who they work with (both inside and outside the company), and then map that out and architect it. We say it back to them and ask further questions until we converge on a very clear understanding of what their business is and how it needs to change in order to build more safety and security into their workforce. In this case, that meant specifically out in the rail yards.

Providing Solutions

It began about three years ago, when Texmark identified a need to upgrade its mechanical integrity program and was also looking for partners to upgrade or enhance its distributed control system (DCS). At that point, Doug Smith, Texmark’s CEO, and Linda Salinas, VP of Operations, reached out to some of their technology partners at HPE and said, “We’ve got these two challenges. How can technology help us?” That’s when HPE introduced Texmark to the Internet of Things (IoT). Texmark toured HPE’s IoT lab, and from that the Refinery of the Future was born. The Refinery of the Future consists of five solutions. In this article, we are going to discuss Video as a Sensor.

Very quickly, the other solutions are: Predictive Maintenance and Analytics, where we have instrumented a couple of critical pumps in the plant with sensors; Worker Safety and Security; Connected Worker, which CBT is leading; and Asset Integrity Management. So that gives you a little bit of the history of Texmark and the Refinery of the Future.

Partnerships 

When you’re pulling this many resources together (CBT, HPE, Intel, and more), it is important to focus on building a cohesive solution. Oil and gas is an industry that is very compliance- and governance-driven, safety-driven, and held to rigorous standards. One might think of Hewlett Packard Enterprise as an IT company, CBT as a system integrator, and Intel as a computer chip company. However, what we learned very quickly as we formed our partnership is that within each of these organizations is an oil and gas vertical.

It’s important to consider that with the collective experience of our three major solution partners, we have the ability to deeply understand your business, your organization, and your challenges in the plant, even the physical challenges of working in a hot, dirty, dusty environment and in a Class 1, Division 1 situation.

We understand that you can’t just grab a computer off the shelf, plug it into the control room, and have it perform the way it would in a nice clean data center. So it was very useful and helpful that each of the partners – Intel, CBT, HPE, and others – had that internal knowledge about chemical manufacturing.

Working with Texmark

Our team started with a simple question: what is the situation today? CBT and its partners planted ourselves right in the operations control room. We lived with the operators for several days, watching how they did their jobs, what resources they had, the demands on their time, the interruptions, and where they had a difficult time fully providing safety observation and direction to their teams throughout the day. What IoT brings to the table is a chance to automate and reduce those pressures, and actually fulfill some of those functions, so that operators can make decisions faster.

By working with them on a daily basis and using our IDAAS process, we saw that they cared not only about possible intruders coming into the plant through the rail yard. They were also concerned that equipment that was supposed to stay still might become dislodged, that a worker might become fatigued and fall rather than remain upright, or that somebody wearing improper safety attire would need to be identified.

We ended up expanding the requirements based on this discovery process and agreed on a baseline scope for the mission: give them the ability to sense that something was abnormal in this part of the plant, using an artificial intelligence-based program running on state-of-the-art computers that would alert them and save them the time of manual intervention.

That solution is what we now call Video as a Sensor. It started as a proof of concept, but now it’s being scaled up. We have a well-defined process that we follow: capture, categorize, and analyze. It’s important to spell that out, because many of the IoT projects people hear about are quite simply sensors collecting data. With this solution, we’re actually using the data to make important strategic decisions at the plant.
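
To make the capture, categorize, and analyze flow a little more concrete, here is a minimal sketch of what such a loop could look like in Python. The camera URL, the placeholder categorize step, and the alerting hook are illustrative assumptions, not the production pipeline CBT deployed at Texmark.

```python
# Illustrative capture -> categorize -> analyze loop.
# The stream URL, model, and alerting hook are hypothetical placeholders.
import cv2  # OpenCV, used here only for frame capture


def capture_frames(stream_url: str):
    """Yield frames from an RTSP or USB video source."""
    cap = cv2.VideoCapture(stream_url)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            yield frame
    finally:
        cap.release()


def categorize(frame):
    """Placeholder: in practice, operator-labeled data trains a model
    that tags each frame as 'normal' or a specific anomaly class."""
    return "normal"  # e.g. return model.predict(frame) in a real deployment


def analyze(label, frame):
    """Act on categorized frames: only anomalies trigger an alert."""
    if label != "normal":
        send_alert(label)


def send_alert(label):
    print(f"ALERT: {label} detected")  # hypothetical alerting hook


# Hypothetical rail yard camera; the loop exits if the stream is unavailable.
for frame in capture_frames("rtsp://camera.railyard.local/stream"):
    analyze(categorize(frame), frame)
```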

Scope: What It Isn’t

It is uncommon to hear about projects like this that integrate something as complex as video, ML, and AI. And if you do hear about one, you usually don’t get practical, real-world examples in real workplace environments. That is exactly what we’ve done.

This was not a lab; we were working in a chemical plant that operates 24 hours a day, 7 days a week, which makes it all the more remarkable. We were able to capture data, as anybody can, through video cameras. Once that data was captured, we then had to categorize it to determine what is a good situation, what is a normal situation, and what is an anomaly.

The only way to do that is to take the data, go back, share it with the customer, and have them start making the judgment calls: this is good, this is bad, and so on. Once we have that categorization, we can feed it into the software, and the software starts its first cycle of learning, determining what indicates that something is abnormal. Then, as we collect more data, the software assesses it and comes up with a smaller set of anomalies. We share that with the customer, they categorize it again, and through several cycles we were able to hone the independent abilities of the application.

This is the machine learning (ML) aspect, taken to the point that the system could distinguish the key features that would prevent a calamity or a potential safety infraction.
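
For readers who want a more concrete picture of that label-train-review cycle, here is a minimal sketch using synthetic data. The feature extraction, the logistic regression model, and the review threshold are assumptions chosen for illustration; they are not the actual software used in the project.

```python
# Sketch of an iterative label -> train -> review cycle with synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)


def extract_features(n):
    """Stand-in for feature vectors derived from video frames."""
    return rng.normal(size=(n, 8))


def operator_labels(features):
    """Stand-in for the customer reviewing candidates and marking
    each one as normal (0) or an anomaly (1)."""
    return (features[:, 0] > 1.0).astype(int)


model = LogisticRegression()
labeled_X = extract_features(200)
labeled_y = operator_labels(labeled_X)

# Several review cycles: score new data, surface likely anomalies,
# have operators confirm or reject them, then retrain on the growing set.
for cycle in range(3):
    model.fit(labeled_X, labeled_y)
    new_X = extract_features(100)
    scores = model.predict_proba(new_X)[:, 1]
    candidates = new_X[scores > 0.5]           # likely anomalies to review
    confirmed = operator_labels(candidates)    # operator feedback
    labeled_X = np.vstack([labeled_X, candidates])
    labeled_y = np.concatenate([labeled_y, confirmed])
    print(f"cycle {cycle}: {len(candidates)} candidates reviewed")
```

Each pass narrows the set of anomalies the operators need to look at, which mirrors the honing described above.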

Building Trust

After discussions, planning, analysis, and data collection, CBT, HPE, and Intel went back to Texmark and presented our findings and a plan for what the completed project would look like.

From the time Texmark had the IDAAS workshop with CBT, to the rollout of the first demo, until now, the ideation and the whole iterative process with Texmark employees has been fantastic.

It actually helped build trust between Texmark’s employees and CBT’s team, and between the employees and the software package itself. Texmark felt that if CBT, HPE, and Intel had simply gotten in a room together and said, “Hey, this solution is great, we’re going to roll it out to your employees. Here you go, here’s your new Video as a Sensor package. Click this button and an alarm will go off as needed,” the employees would not have accepted it as readily as they have now that they’ve been intimately involved in the development.

There is trust in the system because the employees gave input from the beginning, as the program was being developed. The solution continues to improve through ML because Texmark’s employees are teaching the system what is a hit and what is a false alarm as it detects railcars, a fallen worker, an intruder, and so forth.
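
As an illustration of how that kind of operator feedback could be captured for later retraining, here is a small sketch that logs each alert disposition as a hit or a false alarm. The file format and field names are hypothetical, not the system Texmark uses.

```python
# Minimal sketch: record operator judgments on alerts so they can be
# folded back into the training set. Format and fields are assumptions.
import csv
import datetime

FEEDBACK_LOG = "alert_feedback.csv"


def record_feedback(alert_id: str, label: str, disposition: str):
    """Append one operator judgment; disposition is 'hit' or 'false_alarm'."""
    with open(FEEDBACK_LOG, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.datetime.now().isoformat(),
            alert_id,
            label,
            disposition,
        ])


# Example: operator confirms a fallen-worker alert and rejects a railcar one.
record_feedback("alert-0412", "fallen_worker", "hit")
record_feedback("alert-0413", "railcar_movement", "false_alarm")
```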

Real-World Solution

It’s an ongoing process with a lot of iteration. In fact, Texmark had a recent example where a railcar was delivered and an operator detected a leak from the gasket around the top of the railcar. This was particularly important because, for compliance reasons, they can’t have any hazardous materials leaking. Immediately, the operators knew: “Hey, we’ve got the cameras down there at the railcar loading area. Can we roll the tape and find out if it was leaking upon arrival, or when it started to leak?” All of that footage became of interest to the railroad operator and the railcar product supplier, who will be using it as part of their investigation. So what did that trigger? It had operators asking, “Well, we detected the leak upon inspection, but can those cameras help detect leaks more quickly and less manually?” That example was so perfect you would have thought we planned it!

Tripp Partain, CTO of Converged Servers, Edge and IoT Systems at HPE, said, “Within HPE we’ve added vertical focuses, so that we could take what a lot of folks would consider generic data center technologies and some of the other technologies that HPE has had over the years, and really be able to focus those to solve problems that are a little more vertical-centric. So that at least the way you use it, the way you explain it, the way you align, it really moves the focus toward what makes sense for that particular vertical. When you start to look at Edge and IoT, because it’s a much newer area in tech, and while it’s been talked about a lot, I would say you’re probably still early. Maybe not early in the hype curve, but definitely early in the adoption and real value-generation curve.”

On the Same Journey

Technologists themselves have to be very close to the problem sets, closer to the vertical than they would have been in the past, to make sure that the way they’re developing and enabling technologies is actually going to be adoptable by the end user and fit for the problem. So it’s one thing to have a customer that’s willing to take that journey with you and allow you to get close enough, but it’s even more critical to have the right provider, the solution provider, the system integrator, the partner that can find and integrate the really cool technologies being brought to the forefront. It’s going to take a combination of that, plus a number of other pieces, to really solve the problem.

Partain goes on to say, “Having a key partner like CBT to work with, so that we can then go and focus these new Edge and IoT solutions toward the Texmark use cases and the problems they presented, really is the only way for us to make sure that these newer solutions are fit for purpose and have the right adaptability to a very different environment than what a company with a data center background is used to. Now granted, we started in a garage, so you would think we would have the ability to work outside of a data center. But over the years, we’ve really sort of migrated to much more pristine data center environments. And now with the emergence of Edge and IoT, we’re kind of working ourselves back toward the garage again.”

On the Edge with the Edgeline EL300

HPE is truly building hardened Edge solutions that take these environments into account and are really designed to run at the Edge, not a repurposed box or a box with a slightly different wrapper around it. There was a product, the Edgeline EL300 Converged Edge System, that was almost as if it had been designed for this application, and HPE and its partners quickly realized it was the right solution for Texmark.

When you start to look at very complex environments involving AI, ML, and video as a sensor, what you end up with are really the key tenets behind the whole reason HPE created the Edgeline Converged Edge System. Out of the box, anything that fits the Edgeline description has to be rugged enough and compact enough to work in a non-data center environment.

Take that a step further: this is not just a non-data center environment, this is a hazardous plant environment, so you’re taking it to that next extreme. And when you start talking about using AI, ML, and video as a sensor, or video analytics in general, you’re talking about a very high level of computing. If you look at the way a lot of companies approach IoT today, it’s a gateway focus: a very small device with a minimal amount of compute power and not much expandability, where the whole idea is to grab data and send it somewhere else.

Well, in the case of worker safety and plant safety, that doesn’t work. You can’t wait for the data to be moved and for that computing to happen elsewhere. So one of the key tenets of Edgeline is not only to meet the ruggedization requirements but also to offer some of the most capable edge compute in the marketplace.
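
To illustrate the difference between the gateway pattern and computing at the edge, here is a small sketch in which frames are classified locally and only lightweight alert metadata is forwarded upstream. The function and queue names are hypothetical and are not HPE or Intel APIs.

```python
# Sketch contrasting the gateway pattern (ship raw data elsewhere) with
# edge inference (classify locally, forward only alerts). All names are
# illustrative placeholders.
import queue
import time

alerts_upstream = queue.Queue()  # small messages to the control room / cloud


def classify_locally(frame):
    """Stand-in for on-device inference (e.g. on an edge system with an
    accelerator); returns a label and a confidence score."""
    return {"label": "normal", "score": 0.02}


def process_at_edge(frames):
    for frame in frames:
        result = classify_locally(frame)
        if result["label"] != "normal":
            # Forward a few bytes of metadata rather than megabytes of raw
            # video, so an operator can be alerted within seconds.
            alerts_upstream.put({"ts": time.time(), **result})


process_at_edge([b"frame-1", b"frame-2"])  # placeholder frames
print(alerts_upstream.qsize(), "alerts forwarded upstream")
```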

The Right Solution

Also crucial is the ability to expand with industry-standard VPUs, GPUs, and other accelerators that are really fit for purpose. In the case of Texmark and video as a sensor, when you combine the right-sized device, the right level of ruggedization, and the right level of compute with the ability to take special capabilities from Intel and embed them directly into the device to handle video sensor analytics, you really start to see the vision HPE had for Edgeline and the way HPE is now adapting it toward Edge and IoT solutions. It was a very good fit for what CBT and Texmark were looking for.

CBT looked across the entire industry and found that the Edgeline compute platform from HPE was far above most of the competition. That’s why we used it on Connected Worker, on Predictive Maintenance and Analytics, and on other solutions at Texmark. But when it came to putting compute power way out in the field, right there in the rail yard, knowing that we would have to process streaming video quickly, we found that Intel and HPE had collaboratively designed the EL300 to do just that. So it became a perfect fit. That’s why, when we deployed and tested it, we brought in the Intel and HPE engineers to make sure we were tuning it correctly and optimizing it for exactly what it was made for. That was one of the beauties of this solution and the way it works.

We looked at everything out there and determined in the end that it was the right solution. When we were designing the system, we were contemplating questions like: where are we going to store a month of data? Are we going to do playbacks at certain intervals? Do we have to reposition the cameras to remove glare from certain quadrants of the picture? We are technologists, and as far as the technology is concerned, we sit down with the customer and they say, “Nope, I don’t need that much data. This is how much I need. The glare is not important here. This is what’s important.” That allowed us to really home in on the strategic features, and it saved us a lot of time. We started learning; we started thinking like an operator rather than a technologist. Technologies are enablers to solving problems, but solutions are what really solve problems, and to get to a solution, you have to have the right buy-in from the end customer.
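
As a rough illustration of the kind of sizing conversation described above, here is a back-of-the-envelope calculation for storing a month of video. The camera count, bitrate, and retention period are assumptions chosen for the example, not Texmark’s actual figures.

```python
# Back-of-the-envelope storage sizing for a month of streaming video.
# All inputs below are assumptions for illustration only.
cameras = 4
bitrate_mbps = 4        # per-camera compressed stream, an assumed figure
retention_days = 30

seconds = retention_days * 24 * 3600
# Mbps -> MB/s (divide by 8), multiply by seconds, convert MB -> GB (decimal).
total_gb = cameras * bitrate_mbps / 8 * seconds / 1000
print(f"~{total_gb:,.0f} GB for {cameras} cameras over {retention_days} days")
```

Under these assumptions the answer comes out to roughly 5 TB, which is exactly the kind of number that changes once the customer says how much footage they actually need to keep.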

Critical Components

The most critical piece is having the right partner, an integrator who can take the components and turn them into the right solution. And it’s still early days for Edge and IoT. CBT is uniquely positioned to solve these kinds of problems and to be the hands-on system integrator these types of solutions require. Every partner has a role to play, but the right partnerships are critical to creating solutions that really solve the end customer’s problems.

Strategic partnerships can add a tremendous amount of value in delivering a solution. They drive performance and productivity and help companies move their digital transformation forward.

For more information, visit our Industrial IoT page, where this solution and the other solutions related to the Refinery of the Future are captured.

