DeltaV PKM for One-Click Tech Transfer Podcast


For Life Sciences manufacturers, technology transfer, the scale-up from research and development to commercial production, has traditionally been a slow and cumbersome process. Capturing process knowledge along the way and managing changes can slow the overall process down even further.

In this Emerson Automation Experts podcast, Bob Lenich joins me to discuss how the Process Knowledge Management (PKM) digital collaboration platform can help streamline the process and minimize errors, resulting in faster time-to-market.

Give the podcast a listen and visit the DeltaV Process Knowledge Management section on Emerson.com for more on how to drive performance improvements across your pharmaceutical and biopharmaceutical development lifecycle.

Transcript

Jim: Hi everyone, I’m Jim Cahill with another Emerson Automation Experts podcast. Digital technology transfer is a growing opportunity to reduce time to market for new therapies. The current approach for developing and transferring a manufacturing process to different product lines across the globe is difficult. The development phase within a global supply chain is very cumbersome, inefficient, and time-consuming. I’m joined here today by Bob Lenich, business director for Emerson’s Process Knowledge Management digital collaboration platform to discuss what is being done to improve this key activity and reduce time to market. Welcome Bob.

Bob: Hey Jim, happy to be here.

Jim: Well, I’m glad that you’re here to share some of your knowledge with our listeners. So Bob, I’ve been around here at Emerson a very long time and I know you have too. Can you share a little bit about your background with our listeners?

Bob: Sure, I hate to talk about how long it’s been because it has been a while but I’ve been at Emerson for several decades and most of that has been in the Life Sciences, process management and operations space. I’ve worked on projects, I’ve worked in product development, I’ve worked in marketing and consulting and again I have a really broad background in all things related to Life Sciences, manufacturing, operations, and overall supply chain.

I’m very excited about understanding both the opportunities that are coming and what Emerson can do to help make them a reality, especially given everything that’s going on in Life Sciences right now, because it is such a dynamic space and a lot of the things that we’re talking about doing are really important to just help the industry get better and better.

Jim: Well, that sounds like a wealth of background that you bring to our conversation today. So let’s dive into it. What is driving this change in, I guess, digitizing tech transfer? What is technology transfer, and what are the problems today?

Bob: Sure, so the idea is that today there are just a growing number of new therapies being developed. Some of the more recent data we have shows that there are more than 40,000 clinical trials currently underway here in 2024. Development starts in an early stage, goes to later-stage development, goes to clinical, and then goes to commercial, assuming it’s successful. As you’re going through each of those phases of development, someone has to define here’s how we want to make this particular thing, and then as it moves from phase to phase, someone has to say, okay, here’s the next thing that we want to do. So capturing all of the information, defining all of those things, and making sure that what was done in an early stage is effectively passed on to a subsequent stage, and that what’s done there is captured and continues to be passed on, is the opportunity that’s in front of us.

Today, doing things in an individual stage can represent thousands of parameters, and parameters include things like equipment definition and specification for capabilities, material information, operating parameters, quality parameters, just all kinds of things around that. And so if you say, great, there’s 40,000 clinical trials happening, there are thousands of pieces of information to manage around each of these, you’re going to go through multiple phases. You can just see the growth in terms of the amount of information to be managed and how to do that.
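
As a rough, back-of-the-envelope illustration of the scale Bob describes (the per-phase parameter count and number of phases below are assumptions for illustration, not figures from the podcast):

```python
# Hypothetical order-of-magnitude estimate of the information volume involved.
clinical_trials = 40_000          # trials underway (figure cited in the podcast)
parameters_per_phase = 2_000      # "thousands of parameters" per stage (assumed value)
phases = 4                        # early, late-stage, clinical, commercial (assumed)

total_parameter_records = clinical_trials * parameters_per_phase * phases
print(f"{total_parameter_records:,} parameter records to define, verify, and hand off")
# -> 320,000,000 parameter records, before any change management or transfers
```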

Today, all of that is typically done through a paper process or spreadsheets or things like that, which works because it reflects how things have been done in the past, but it’s really not very effective. And so the challenge is people want to accelerate what they’re doing. They have a large and growing volume of all of this and they really would like to apply technology to start doing it. There are two other things that have kind of really impacted this. One, during COVID, everybody saw that if you throw lots of people and bodies at things, you could actually get things done much more quickly. I mean, everybody saw that vaccines were developed in 18 months versus what normally takes 10 years. But again, that was because they threw lots of people at it.

Also, there have been some incentive changes: recent laws in the U.S., like the Inflation Reduction Act, have combined to reduce the amount of time patents are effectively in play for the development that’s done. So the combination of all of these new therapies coming, the reduced time you’ve got patent protection, and the fact that managing all this information is becoming more and more challenging as the number of these things grows, says that we really need to figure out how to apply technology to this process to make it more efficient and effective.

Jim: Wow, my brain hurts from that, 40,000 times the tens of thousands for each of these. That’s really a lot of information to manage, I guess, especially with all the compliance and validation requirements in the industry. So I guess what tools or procedures exist today for this?

Bob: Right. And today, like I was saying earlier, this kind of reflects the way things have historically been done, which is a combination of people and paper. Traditionally, it’s a paper thing. You know, I mean, you basically have to write things down in documents, create reports, people sign those reports, and then they manage those documents through a paper process. Today, a lot of that has been, you know, kind of converted from paper to paper on glass, but it’s still very much a paper process. And the thought here is that just because it’s in a digital document and it’s digitally available doesn’t mean that all of the information in that particular report is easily available, because getting information out of a Word document can be challenging. Getting information out of a spreadsheet can be challenging.

And just to give you a quick example, even though things are digital today, when someone is working on a spreadsheet to characterize, you know, how to make something or how to scale a piece of processing equipment up, a scientist will sit down and start working on their calculations and typing them in. And when they figure it out, in order to meet the regulatory and validation requirements, a second person has to sit with them and verify that they typed in the calculation correctly and sign off on the fact that, oh yeah, you typed it correctly. Guess what? That’s not what those people got their degrees for, to watch somebody else type in Excel. So if there are things that we can do to eliminate that kind of a step, it’s a huge savings to those people because it gets them away from the tedium of just managing the process to actually doing the kind of work that they want.

And then by the same token, when you take it from site one to site two and you go from early stage to late stage to clinical, every time you do a transfer, two people have to get involved in that exact same kind of a process. The source site has to say, here’s what we did. The destination site says, okay, we got it. Here’s how we’re updating it. Everybody signs off on that. And so again, just the tedium of managing the exchange, setting up the meetings to do that, making sure that people are available, you know, all of those kinds of things, just make it a real headache to try and do it with, you know, today’s typical tools.

Once you have those things, even though you do then have those documents in something like a Word document or a spreadsheet, those are typically then managed on SharePoint. And again, SharePoint has a lot of benefits in terms of making things easily available. But finding stuff on SharePoint is a real headache. And if you’re trying to find a particular parameter in a particular file on SharePoint and you want to find out, oh, where was that pH value referenced in five different places across five different SharePoint sites, again, that’s a major headache to try and go in and do. So while things work today, they don’t work very efficiently or effectively. And so there’s a lot of opportunity to go in and start converting them.

So what we’re really talking about doing is trying to take what had been this, you know, traditionally paper process, even though it’s somewhat digital, and convert it from a paper process to more of a data process. So instead of having a report or a spreadsheet, all of the elements that are going into that document or into that spreadsheet are now pieces of data that we want to manage independently through some kind of a centralized application. And then we want to do collaboration, change management and things around that so that you’re converting it from the way you’ve done things to something that takes advantage of new technology.

Jim: Yeah, I think you describe that challenge really well, that, you know, one step was just moving from pieces of paper to pieces of paper behind a computer monitor, which is a step in the right direction. But it certainly doesn’t give you the benefit of what could be possible. So I guess what are some of the new approaches to replace and improve this traditional approach?

Bob: Sure. So one of the things that’s out there is this idea of digital process knowledge management is a growing opportunity and there are growing solutions out there. So about three years ago, three, four years ago, we had been working in this space and we were trying to figure out good ways of doing that. There was a company out there called Fluxa. They had a package called Process Knowledge Management. Emerson did an acquisition of that company and we’ve been working now to embed that product platform into the overall Emerson portfolio.

The idea here again is that instead of having SharePoint that’s managing all these individual documents and spreadsheets and things, what we’re doing is we’re extracting the information out of those sources. We’re putting that into an enterprise-wide database application so that all of those things are available, changeable, traceable, etc. so that you can do your job much more effectively. And so the thought here is that particularly when you’re defining a process, a manufacturing process, and you need to say things like, okay, what are the step sequences that are required in order to go through and make this? What are all the associated operating parameters and then attributes around that? If it’s a pH for making sure that the bioreactor is operating at the right level for growth, what’s the appropriate operating range? What’s the appropriate calibration range? Just making sure you’ve got all of those things captured.

Similarly, when we talk about the equipment and that bioreactor, what are the characteristics and capabilities of that bioreactor or other things in the process so that you can really say, yep, I’ve got a good understanding of how that piece of equipment is going to work. And then as I go from stage or phase-to-phase and development line to development line, I can confirm that, yep, I know that the new piece of equipment I’m going to use reflects what the original one had and therefore we can confirm that, yep, it’s going to work and you’re going to get the same kind of result.

So this idea of converting all of this information from something that was captured in a spreadsheet or a document, and making each of those things an individual element, an individual recipe object, is one of the first things that happens in PKM. And the key thought there is that when you have all of those things broken down to this level, then you can provide traceability and versioning and automation. So the thought here is that when somebody then says, great, we defined the pH value, but you know what, we found out after running some tests that instead of running at 3.7, we want to run at 3.9, then great, you’ve got to go through and do change management. Doing that, again, in SharePoint with spreadsheets is a real headache. With something like PKM, though, you make the change in one place and then every place that particular value is referenced is automatically updated. So that’s again taking advantage of the propagation characteristics of new technology and taking advantage of this new model.
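
To make that idea concrete, here is a minimal, hypothetical sketch (not Emerson’s actual PKM data model or API) of managing a parameter as a single referenced object, so that one change-managed update propagates to every place the value is used:

```python
from dataclasses import dataclass, field

@dataclass
class Parameter:
    """A single process parameter managed as data, not as a copied spreadsheet cell."""
    name: str
    value: float
    unit: str
    operating_range: tuple[float, float]
    history: list[float] = field(default_factory=list)

    def update(self, new_value: float) -> None:
        # Keep prior values for traceability and versioning.
        self.history.append(self.value)
        self.value = new_value

# Central registry: every recipe step references the same object by ID.
registry = {"bioreactor.ph_setpoint": Parameter("pH setpoint", 3.7, "pH", (3.5, 4.0))}

# Steps at different stages or sites reference the parameter ID, not a pasted number.
seed_step = {"action": "adjust pH", "parameter_id": "bioreactor.ph_setpoint"}
production_step = {"action": "control pH", "parameter_id": "bioreactor.ph_setpoint"}

# One change-management action updates every place the value is referenced.
registry["bioreactor.ph_setpoint"].update(3.9)

for step in (seed_step, production_step):
    p = registry[step["parameter_id"]]
    print(step["action"], "->", p.value, p.unit, "history:", p.history)
```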

Similarly, once you start combining things together and you’ve got all those building blocks in, then you can start building things like templates to say, okay, let’s define how a bioreactor works. What are the key operating activities, key step sequences, and then all of the associated parameters with that so that you’ve got a standard approach to say, you know, this is how bioreactors generally work. And then you can customize that for each individual product that you do.

So having this template gives you the ability to be much more efficient. And then again, as people make changes to the template, any place that template was used, you can reference that, you can propagate the updates, and you can ensure that any changes that are made are appropriately reflected everywhere it was used. And you have confidence because you’re using digital technology to track all of that versioning, all of the updates, and everything else.
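
Again purely as an illustrative sketch (the template structure and parameter names below are assumptions, not how PKM implements templates), a shared bioreactor template that individual products customize, and whose updates propagate to every product that has not overridden them, might look like this:

```python
# A generic template: a standard step sequence plus default parameters for a unit operation.
BIOREACTOR_TEMPLATE = {
    "steps": ["inoculate", "grow", "harvest"],
    "parameters": {"ph_setpoint": 7.0, "temperature_c": 37.0, "dissolved_oxygen_pct": 40.0},
}

class ProductRecipe:
    """A product-specific recipe that stores only its deviations from the template."""
    def __init__(self, template: dict, overrides: dict):
        self.template = template      # shared reference, not a copy
        self.overrides = overrides    # only what this product changes

    def parameters(self) -> dict:
        # Template updates propagate automatically unless a product overrides the value.
        return {**self.template["parameters"], **self.overrides}

product_a = ProductRecipe(BIOREACTOR_TEMPLATE, {"ph_setpoint": 6.8})
product_b = ProductRecipe(BIOREACTOR_TEMPLATE, {})

# A change to the template is reflected in every product that hasn't overridden it.
BIOREACTOR_TEMPLATE["parameters"]["dissolved_oxygen_pct"] = 45.0
print(product_a.parameters()["dissolved_oxygen_pct"])  # 45.0
print(product_b.parameters()["ph_setpoint"])           # 7.0
```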

Then once you’ve got the building blocks done, you can start writing recipes. And there has historically been something called ISA-88. ISA-88 is a standard that the industry has adopted for a number of years. What it really does is talk about having general recipes, which are scale-agnostic but define in general how things work. Then there are site recipes, which are more specific and start getting very scale-specific, as in the distinction between a benchtop scale with a five-liter bioreactor and a commercial scale with a 20,000-liter bioreactor. And then you go to master recipes and control recipes.

Historically, the industry has never really been able to do the general and site recipes in a very efficient manner, so that’s where PKM really focuses. And then what it allows you to do is, once you’ve got general and site recipes defined, you can very easily export that information and integrate it with the execution systems that handle the master recipes and the control recipes to actually run things. So this combination of defining things across the full scope of the recipe hierarchy and then being able to integrate them is another one of the major capabilities you get.
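
To make the ISA-88 hierarchy Bob describes concrete, here is a minimal, hypothetical sketch of the four recipe levels and a hand-off from the site level to an execution system (the class names, fields, and export function are assumptions for illustration only):

```python
from dataclasses import dataclass, field
from enum import Enum

class RecipeLevel(Enum):
    GENERAL = "general"   # scale-agnostic process definition (PKM focus)
    SITE = "site"         # adapted to a specific site and scale (PKM focus)
    MASTER = "master"     # equipment-specific, managed by the execution system
    CONTROL = "control"   # a single batch run derived from a master recipe

@dataclass
class Recipe:
    product: str
    level: RecipeLevel
    parameters: dict = field(default_factory=dict)

def export_to_execution_system(site_recipe: Recipe) -> Recipe:
    """Hand a site recipe off to the execution layer, where it becomes a master recipe."""
    assert site_recipe.level is RecipeLevel.SITE
    return Recipe(site_recipe.product, RecipeLevel.MASTER, dict(site_recipe.parameters))

general = Recipe("mAb-X", RecipeLevel.GENERAL, {"ph_setpoint": 7.0})
site = Recipe("mAb-X", RecipeLevel.SITE, {**general.parameters, "bioreactor_volume_l": 2000})
master = export_to_execution_system(site)
print(master.level, master.parameters)
```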

And then the last part is that when you look at all of this, you also want to be able to go through and do risk assessments, because again, it’s great to be able to say, here’s the definition of everything and here’s how it should work. But in order to ensure that the pharmaceutical manufacturing is meeting all of the safety requirements that are out there, then you want to be able to say, great, what are all the key risk areas at each stage against each piece of equipment, against each parameter that I’m talking about, and be able to again capture that and go, Okay, yep, this is a real significant risk. Let’s identify that. Let’s identify how we’re going to deal with that and turn that into something that we’re going to actually manage. And so, again, capturing that definition, capturing that information, and then being able to reference that as you’re actually doing things to address those risks is something that’s captured by PKM.
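
As another illustrative sketch (the structure and severity scale below are assumptions, not how PKM models risk), linking a risk assessment entry to the same managed parameters shows why a parameter change can automatically flag the risks that need re-review:

```python
from dataclasses import dataclass

@dataclass
class RiskEntry:
    """A risk assessment item tied to a specific stage, equipment item, and parameter."""
    stage: str
    equipment: str
    parameter_id: str
    description: str
    severity: int            # e.g. 1 (low) to 5 (critical); an assumed scale
    mitigation: str

risk_register = [
    RiskEntry(
        stage="commercial",
        equipment="bioreactor-01",
        parameter_id="bioreactor.ph_setpoint",
        description="pH excursion outside the operating range reduces product titre",
        severity=4,
        mitigation="Automated pH control with alarm limits and a deviation workflow",
    ),
]

# Because risks reference parameter IDs, a change to that parameter can flag affected entries.
affected = [r for r in risk_register if r.parameter_id == "bioreactor.ph_setpoint"]
print(len(affected), "risk entries to re-review after the pH setpoint change")
```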

So a lot of stuff there. It’s a really powerful package and it really takes advantage of new technology so that you’ve got everything in a centralized database. And then you can have multiple people work on common areas, you know, collectively. They don’t have to be in the same time zone. They don’t have to be in the same location. They can see changes that are done. You can add comments and annotations. You know, you can have chats back and forth. There are just a lot of things that make this a much more effective way of taking advantage of digital technologies for people to share information about how to do something.

Jim: Yeah, it sounds like it’s not just organizing things better in a database so that an update here is reflected in all the places where it needs to be; it’s that collaborative element of people being able to work together and see what each other is doing that really fosters that. So, Bob, can you share some details, like where this has been done or what kind of results and benefits have been achieved so far?

Bob: Sure. Like I said, this is a really growing opportunity and a number of leading-edge life science companies are actually applying this and have been doing this for a number of years. So it’s really interesting what they’ve been able to accomplish.

The first thing that’s done is that they’ve been able to go through a standardization exercise. One of the things that companies have found is that when they’ve started out, they’ve had a lot of differences that are minor differences, and they’ve always kind of agreed, Okay, the way we do it in site one versus site two, they’re close, but they’re not identical. And that’s okay. And we’re not saying you need to make them identical. But what’s been interesting is when you start with this combination of general and site recipe approach, getting some kind of common naming convention and standardization really provides a lot of benefits because it helps you eliminate redundancies and it makes communications much clearer, both within your organization and then when you actually talk to the regulatory groups.

So when you’re doing a regulatory submission for a filing for a new indication or a new therapy, having common names for things, so that people understand and realize, oh yeah, I called it bioreactor and I called it titre, and exactly what that means, and knowing that it’s the same thing across the different sites that you have and the different products that you’re talking about, is a really valuable thing. It’s something people didn’t necessarily realize when they started, but it’s become very evident as they’ve been using it. And again, the thought here is that people were doing the same things, but they were calling them slightly different names. And so aligning on that is a really big benefit.

The second thing that’s been going on is the opportunity to really reduce the amount of time this takes. Look at what we’re talking about here and what I said earlier: people are using spreadsheets, so you’ve got two people typing into a spreadsheet, you’ve got people reviewing it, and you’ve got to transfer it over to another group, who have to do more typing. Then when you connect it to an automation system, somebody’s got to extract the information from the spreadsheet and put it into the automation system, somebody’s got to verify they did that correctly, and you’ve got to go through all this testing. There’s just a whole bunch of data integration and management activity that doesn’t provide a lot of value add, but you have to do it just because today all of those separate systems have to work together in order to make this happen.

So as a great example, Roche recently won facility of the year in 2023 for their clinical supply chain facility in San Francisco. And it’s one of the best applications of taking advantage of Process Knowledge Management and connecting that into execution systems. And so the idea here is that they’ve established, you know, “general recipes.” So they have process families defined for general classifications of things like monoclonal antibodies or other things like that. And they have a general way of approaching those. Then when they’re doing an individual product, they will define that product. And as they’re going from, you know, an initial site for development to a launch site to a full-blown commercial site, they will have site versions of that one general recipe for that particular product. And they’ll be able to automatically scale things.

So you can take advantage of the information and embed calculations that refer to the scale of the site and automatically scale things up, so that if you’re at a 2K site versus a 20K site, you know, the calculations for how to size something, how to figure out the amount of materials that you need, things like that, those kinds of things can automatically be done and then tracked as part of what you’re doing. So Roche has embedded that. They’ve also integrated it with a lot of their existing systems. So they’ve tied their knowledge management system into their materials system. So they know that the standardized material names and suppliers that exist in their overall enterprise planning system are the ones that are being referenced by the scientists. So, again, nobody’s having to go through and cross-check those kinds of things.
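
As a simplified, hypothetical example of the kind of scale-dependent calculation that could be embedded in a site recipe (linear scaling of material charges by working volume is an assumption here; real scale-up logic is process-specific):

```python
# Quantities defined once at a reference scale in the general recipe (assumed values).
REFERENCE_VOLUME_L = 2_000          # "2K" development/launch site
reference_charges_kg = {"glucose_feed": 12.0, "base_titrant": 3.5}

def scale_charges(charges_kg: dict, reference_volume_l: float, site_volume_l: float) -> dict:
    """Scale material charges linearly with bioreactor working volume."""
    factor = site_volume_l / reference_volume_l
    return {material: round(qty * factor, 2) for material, qty in charges_kg.items()}

# Generating the 20K commercial-site version from the same definition.
commercial_charges = scale_charges(reference_charges_kg, REFERENCE_VOLUME_L, 20_000)
print(commercial_charges)   # {'glucose_feed': 120.0, 'base_titrant': 35.0}
```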

So the idea is that all the information that’s required, from material definitions, to equipment characterizations, to telling the automation system what to do, is connected together and integrated. And the real benefit of that is, one, they’re able to do things on a much faster basis. And Roche will tell you they have been able to dramatically reduce the amount of time and effort it takes them to create a new product. And in the clinical supply center, they have their digital knowledge management system integrated with their execution systems. And they will tell you that they have eliminated the need for transcriptions, you know, so that people are not involved at all in moving something from system one to system two, because it’s all integrated. Therefore, all the transcription errors have gone away. All of the work to review that and sign off on that has gone away. And they will clearly document that they have been able to save significant amounts of calendar time and significant amounts of cost as they’re going through and doing this. And this is now a core part of what they’re doing at that facility. And they have certainly been working to extend that to other sites.

So the real plus here is that you can significantly reduce the amount of calendar time that historically is taken to go, you know, from site one to site two or to go from version one to version two, eliminating a lot of the testing, you know, verification, confirming that there aren’t any errors, things like that because you’re doing all of this with digital integration and you’re able to confirm that, you know, that the thing’s transferred correctly and you’ve got the traceability, you’ve got the data integrity, you’ve got all of the elements around that. It just significantly reduces the amount of time and effort that it takes.

It does take a chunk of work. You do have to do the standardization exercise up front, and you do have to do the work to make the integration happen. So what we’re also working on is eliminating those kinds of barriers, making it easier to start and easier to do the integration, so that you can get to this end result of dramatically reducing the amount of calendar time and effort it takes to bring a new product to a new facility. Those things are coming. But at the moment, you can certainly go out and look; there are a number of papers and presentations out there that document it. And like I said, that Facility of the Year award from ISPE for 2023 is one of the best ones we’ve seen.

Jim: You can just see the workflow, the whole process, getting way more efficient with far fewer errors involved. And I liked what you said at the start about, you know, kind of bringing a common tribal language between facilities all across the world. Everyone’s doing things a little bit differently, and this gets away from that, so it’s really a common language of communication, which is a huge benefit, I think. So where can a life sciences manufacturer get started to learn more about the Process Knowledge Management solution?

Bob: You know, there are some really good places to go. I mean, obviously, we’ve got a lot of information on the Emerson websites. If you go to Emerson.com and you start searching for Process Knowledge Management, it’ll take you to the pages on the Emerson portals where we describe what we’re doing and show examples of what has been done there. And again, you can see videos of things people have done. We can show you the kind of capabilities that we have. And obviously, we are ready to talk to you about anything else we can do to help you understand things and potentially start showing you and giving you demonstrations of this.

The other place you can go, and hopefully people in the life science world are familiar with it, is BioPhorum. BioPhorum is an industry consortium that consists of a large number of end users, technology providers like Emerson, equipment providers like, you know, Cytiva and Sartorius, regulatory groups, academic groups, etc. And, you know, we work collaboratively together to capture what the industry needs. And we do this from, I’ll say, an agnostic perspective, so that we’re capturing the needs but we’re not doing anything that’s proprietary there. And if you go to the BioPhorum site, you can also get a lot of good information about what’s going on relative to tech transfer and, more generally, the things that are happening. But if you want to start looking at specific examples and how to start targeting it and making it real for you, then I would say you need to talk to someone like us that’s actually working and doing it with people. But you need both. You need to understand, here’s in general what’s coming and how it’s being looked at by the industry, the regulatory groups, etc. And then you can look at, here are places to go to show you specific examples and the real benefits that have been achieved.

Jim: Yeah, and I know we’ve covered a number of things that BioPhorum has put out in separate blog posts. And I’ll provide a BioPhorum link to where we have some of those located here on the blog, along with some of the other things that you mentioned. So, yeah, there’s a whole lot of information and videos and other documents to learn more, or connect with the experts that we have around the world. So, Bob, thank you so much for joining us today and sharing your expertise with our listeners.

Bob: Happy to do so. Again, this is a really exciting time in Life Sciences because making these kinds of changes is something that everybody’s been interested in for a long time. And the reality is it’s happening now. People are really excited about actually making these kinds of things real and then getting the benefits from it. It is a really exciting place to be. Thank you.

-End of transcript-

