Our Solutions

We have a solution for everything, and the experience to go with it. In our current seven areas of expertise, we stand by your side with comprehensive know-how and hands-on support.

All solutions



With our Werum PAS-X MES, installed on-site or in the cloud, and our software solutions for data analytics, track & trace, connected factories, and intelligent packaging, we are the world's leading provider and partner to the pharma and biotech industries.

Software overview



We are specialists in complete transport systems for pharmaceutical and medical products. Our solutions set the standard for the contactless, secure transport of products such as glass syringes.

Transport systems overview



As the world's leading inspection expert, we develop solutions for the pharma and biotech industries. Our portfolio ranges from high-performance machines and semi-automatic machines to laboratory units and inspection applications for in-process control.

Inspection overview




We are a leading provider of packaging machines for liquid and solid pharmaceutical as well as medical products. With our blister, sachet, and stickpack machines, we offer solutions for primary packaging. Our sideload and topload cartoners set worldwide standards for secondary packaging.

Packaging machines overview

K.Pak Topload Case Packer

Introducing our latest solution from Körber: the K.Pak Topload Case Packer! Created specifically for the pharmaceutical industry, the K.Pak solution provides operator-friendly machines to complete any production line. Our solution focuses on innovative technology, high-quality design, and expert handling and packaging of your product. It's time to start connecting the dots with Körber!



As long-standing specialists, we develop packaging solutions for innovative, high-quality secondary pharmaceutical packaging made of cardboard. We offer solutions for counterfeit protection, standard folding cartons, and much more.

Packaging solutions overview



After analyzing your requirements, our experts advise you, identify optimization potential, and support you in implementing projects in all areas of the pharma, biotech, and medical device industries.

Consulting overview

Christopher Taylor


The Role of Culture in Bioprocess Development

What makes a national soccer team successful? Is it the technique of the individual players or of the team as a whole? And, most importantly, how can we best compare bioprocess development to the glamour of football?

Just in time for the World Cup, the Economist published a statistical review of all national football team results to parse out which effects drive the goal differential between countries. Interestingly, but perhaps unsurprisingly, up to 40% of the variation was explainable by culture, government support, and the stable organization of the national football league. While the other 60% presumably lies in the talent and development of the team, a surprising amount of a football team's success can be traced not to technique but to the cultural and social management of the environment in which the team plays.
Dare we compare biotechnologists to football stars? Sure, let's have a go: first of all, because we are all at least as good-looking as football stars, on average, if not better. Secondly, because the variation in the success of bioprocess development depends on much more than having the best hardware and software systems available.

Organization of bioprocess data and data analytics is Koerber's bread and butter. However, no matter how solid our collaborative results are, if the culture and management around bioprocess development are not sound, the results will either not be accepted or, if accepted, will not be carried forward in the future.

In creating a state-of-the-art bioprocess development department, culture and organization are at least as important as all hardware and software systems combined. Here are some of the key principles, from our experience in process development, that will ensure a sustainable, advanced development environment:

Structure the bioprocess experimental approach from the beginning

The first thing development department heads must ward off is the tendency, in early development phases, to play around with unstructured experiments. Developers will often say: a bit more oxygen here, bump the temperature there, and see what happens. It cannot really be done any other way, right?

We believe that structuring the early process development experiments is as critical as the later, more formal process characterization studies. By planning out even the earliest runs in any given process unit operation or equipment, a number of advantages are generated:

  • The data remains organized so that it may be used in the creation of the digital twin
  • The data is more likely to be stored in a format usable for future modeling
  • The results are more likely to be comprehensible for knowledge transfer to future development projects
  • The results can be used to augment experimental designs in the characterization phase

Say, for example, that with absolutely zero prior knowledge, we set up a screening DoE with a maximum number of settings. Let's further assume that half of these settings lead to complete failure to produce results. No problem! With clever use of statistical techniques and software packages, we can either rerun the experiments in a constrained D-optimal design or augment further experiments with full knowledge of the edge of failure. Moreover, the remaining half of the DoE can still be evaluated for the main effects of the remaining parameters. This is a win-win.
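The salvage operation described above can be sketched in a few lines. The factor names, the failure pattern, and the response values below are hypothetical illustrations, not data from a real screening study:

```python
# Sketch: recover main effects from a screening DoE in which half the runs failed.
import itertools
import numpy as np

# 2-level full factorial in 3 coded factors (-1 / +1): 8 runs
factors = ["temperature", "pH", "DO"]
design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical responses; NaN marks a complete failure to produce results.
# The failed runs cluster around a low-temperature "edge of failure".
response = np.array([np.nan, np.nan, np.nan, 4.7, np.nan, 5.7, 6.3, 6.7])

ok = ~np.isnan(response)                               # keep only successful runs
X = np.column_stack([np.ones(int(ok.sum())), design[ok]])  # intercept + main effects
coef, *_ = np.linalg.lstsq(X, response[ok], rcond=None)

for name, c in zip(["intercept"] + factors, coef):
    print(f"{name:12s} {c:+.2f}")
```

In this toy case the surviving runs still estimate all three main effects; in practice the next step would be to augment the design (e.g. with a constrained D-optimal design in a DoE package) that keeps new runs away from the failed corner.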

This change must be both cultural and organizational. From our perspective, the whole department should understand that no experiment should be run without the big picture in mind. If correctly supported by management, this change can be extremely effective and surprisingly sustainable.

Design of Experiments (DoE) unless otherwise justified

To put it bluntly: in this age of bioprocess development, DoE should be the rule, not the exception, even in the earliest phases. We simply have too little time and too few resources not to wring every last piece of information out of every experimental plan. Is OFAT always the wrong choice? No! (see next point). But it should always be the second choice after a DoE.

All scientists must be on board with this approach, or some will go back to the old ways and slow down the development process. Management must support their scientists in staying up to date with the most efficient design approaches and must constantly warn against inefficient experimental plans.

And elegant DoE designs are getting easier and easier to generate. Some experts even skip choosing between the classical designs altogether and instead run optimal-design algorithms for any given configuration, allowing for complete standardization of design generation (see Jones and Goos).

Software set-ups in high throughput systems are also increasingly automating the ability to design and run DoEs to maximize information at a micro-scale.

Simply put, there is no reason why DoEs should be considered exotic or mysterious. They should be the standard and accepted by every developer in the department.

OFAT done correctly

Many developers will correctly point out that OFAT (one-factor-at-a-time experimentation) does not equal bad design. These developers should be encouraged, however, only within the overall DoE framework of the department.

There are situations that will definitely call for an OFAT approach to process parameter modeling. However, even in these cases, we must consider what the future use of the results will be. If we simply test a single point in space, we are not actually gaining information. We have no idea what the variation is, or how often we would be able to repeat that result. We learn almost nothing.

However, if we cleverly modulate one single factor (and even replicate it) over a specific range, we not only learn more about the behavior of that parameter (at least in a univariate space), but we can also still fit this simple model into later, more complex models, such as the digital twin (more on that in a minute).

We are going to need all our data to maximize process understanding; therefore, we need to ensure that all our development runs may be useful and useable in later modeling.
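A minimal sketch of an OFAT run "done correctly", assuming a hypothetical feed-rate parameter and simulated titer data (the ranges, model, and noise level are illustrative assumptions):

```python
# One factor, varied over a range with replicates, instead of a single setpoint.
import numpy as np

rng = np.random.default_rng(0)
feed_rate = np.repeat(np.linspace(0.5, 2.5, 5), 3)   # 5 levels, 3 replicates each
titer = 2.0 + 1.8 * feed_rate - 0.5 * feed_rate**2   # hypothetical true behavior
titer += rng.normal(0.0, 0.05, size=titer.size)      # measurement noise

# Quadratic univariate model: useful on its own, and a building block that can
# later be folded into more complex multivariate models (e.g. the digital twin).
coeffs = np.polyfit(feed_rate, titer, deg=2)
model = np.poly1d(coeffs)

# The replicates also give an estimate of run-to-run variation.
residual_sd = np.std(titer - model(feed_rate), ddof=3)
print("fitted coefficients:", coeffs)
print("residual sd: %.3f" % residual_sd)
```

Compare this with a single setpoint: here we recover the curvature of the response and an estimate of the noise, both of which remain usable in later modeling.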

Structure and store the data and models for future use

The data must be available to everyone who needs it. If a company remains in a silo mentality, where each scientist keeps their experimental plan and results in an Excel spreadsheet in a local folder, the culture will without a doubt lead to underperformance of the department.

Most companies are now on the road toward the central organization of data, even in development. However, controlling and filtering development data for active use, both in the current development project and in future projects, is critical.

Digital Twin: Reap the rewards of your effort

If you have come this far, it is time to be rewarded by compiling all your data into a digital twin. The digital twin is simply a synthesis of all active process models, yielding a complete in-silico version of your process.

By the end of process characterization, and including all data from even the early stages of development, you should be able to construct this virtual version of the process easily and use it to:

  • Establish the correct NORs
  • Warn against likely OOS
  • Predict outliers in real-time
  • Reduce useless deviations
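As a rough illustration of what such a synthesis can look like, the sketch below chains two hypothetical fitted unit-operation models and uses Monte Carlo sampling within assumed parameter variation to estimate OOS risk. None of the models, ranges, or limits come from a real process:

```python
# Digital twin as a chain of fitted unit-operation models, queried by Monte Carlo.
import numpy as np

def bioreactor_titer(temp, ph):
    """Hypothetical fitted response-surface model for the production step."""
    return 5.0 + 0.8 * temp - 0.3 * temp**2 + 0.5 * ph

def capture_yield(load_titer):
    """Hypothetical fitted model for the capture chromatography step."""
    return np.clip(0.95 - 0.02 * (load_titer - 5.0) ** 2, 0.0, 1.0)

rng = np.random.default_rng(1)
n = 10_000
temp = rng.normal(0.0, 0.3, n)   # coded deviations within the assumed NOR
ph = rng.normal(0.0, 0.2, n)

titer = bioreactor_titer(temp, ph)
final_amount = capture_yield(titer) * titer      # chain the two unit operations
oos_risk = float(np.mean(final_amount < 4.0))    # hypothetical spec limit

print(f"predicted OOS risk: {oos_risk:.2%}")
```

The same pattern extends to more unit operations and real fitted models; tightening or widening the sampled parameter distributions is exactly how such a twin helps establish the correct NORs and warn against likely OOS.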

The digital twin should become the end goal of the development team, and completing one should be considered a huge victory. This is no easy task, but the results are beautiful, and it serves as a very clear end-point that can be celebrated as such. Don't simply turn over a development report. Wow the group and management with the stunning plots and predictions of the digital twin.

Transfer your knowledge to the next process

Lastly, the teams must work together to bring the process information derived from all this work forward to the next project. Even vastly different processes can still benefit from the experience and technical achievements of the previous project.

For example, by leveraging the development database, one could notice that a single CPP nearly always behaves in the same way, despite the different processes. This could serve to create a standard experimental configuration for the optimization of this parameter.

This can only work if the culture of the development team promotes the time to talk through the lessons learned at the end of each project. And if the data is allowed to be shared between groups. We are all working on this together, and as much as Ronaldo dominates the Portuguese team, he still cannot play alone.


The talent of the development team, and the techniques and strategies they use, are extremely important, just as World Cup teams require star players to make the biggest difference. But without the culture and management surrounding the team, it is impossible to sustainably enable the important changes that the bioprocess industry requires.

If you would like to know more about our vision and approach to the organization of bioprocess development groups, email us here.

