Configit Technology Center First Look
What is your current role in Configit, Anders?
I am involved in what we call the Technology Center (TC), a group of people who take care of our core technology. In Configit’s software, “core technology” means the compilers, the Configit Runtime, and many of the advanced capabilities built on the Runtime, such as BOM analysis, BOM validation, searching for rules, and so on.
How long have you been in this role?
I have been working with our core technology since we founded Configit 15 years ago; however, TC as a concept or a group is quite new – about a year old.
By forming TC, we are putting the focus back on core technology. In general terms, we are not just building on what we can already do with our technology, but expanding its actual footprint.
What are some of TC’s current initiatives?
We are currently working on a new implementation of our previous offerings, with the goal of more accessible APIs (not components) that enable us to do things we couldn’t do before. For instance, we are moving beyond compilation and configuration and opening up for analytics such as Bill-of-Material (BOM) validation.
We have new ways of applying product models. Instead of just looking for valid combinations, it is possible to ask more sophisticated questions.
You mention we have a new implementation of everything we have been able to do until now. Can you elaborate?
Previously we had a very batch-oriented approach to configuration, with tools to define product models, a compiler to turn them into VT files, and an application to load the VT files into Runtime. This meant a separate two-step approach.
With our new implementation process, we are merging this into one. There is no longer a need to make a clear distinction between compile-time and runtime. There is still a process of compiling for certain operations before data can be consumed, but with a more holistic approach. When someone is working with modeling, they quite often want feedback about what each rule implies. For instance, “Am I breaking other rules? Can I get a different view of the solution space?” We think of this as a closed cycle, referred to as compose-compile-consume. Typically, as someone composes they want to quickly compile and consume what they are composing to see the effects. This new implementation works as a unified offering where a closed loop replaces the gap between composing and consuming.
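The compose-compile-consume loop can be illustrated with a small sketch. This is a hypothetical Python illustration, not Configit’s API: the variables, rules, and function names are all invented, and a naive enumeration stands in for the real compiler.

```python
from itertools import product

# "Compose": variables with domains plus rules written as predicates.
variables = {"engine": ["petrol", "diesel"], "trailer_hitch": [True, False]}
rules = [lambda c: not (c["engine"] == "petrol" and c["trailer_hitch"])]

def compile_model(variables, rules):
    """'Compile' by enumerating the solution space: every assignment
    of values to variables that satisfies all rules."""
    names = list(variables)
    space = []
    for values in product(*(variables[n] for n in names)):
        candidate = dict(zip(names, values))
        if all(rule(candidate) for rule in rules):
            space.append(candidate)
    return space

# "Consume": query the compiled space for immediate feedback while modeling.
space = compile_model(variables, rules)
print(len(space))                              # how many valid configurations?
print(any(c["trailer_hitch"] for c in space))  # is a trailer hitch still possible?
```

In a real system the modeler edits a rule, recompiles in the background, and immediately sees what the rule implies; the point of the sketch is only that compile and consume sit in one tight loop rather than in two separate tools.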
Could you say that this merges pmc (product model compiler), the runtime and Configit Model into a single unit?
From a technological point of view – yes, it does. Essentially, we are putting all the available capabilities into one coherent package, reducing the separate compilation and runtime steps previously necessary. Today, users get very little analysis and explanation when they discover errors in new rules, or assignments that behave differently than expected. Our goal is to build an offering that can be consumed by all our products – Model, Ace and Build – as well as being incorporated into other Configit products.
Will this implementation have a formal name?
Yes, the working title is Configit Core, with an internal nickname “Noddy” for historic reasons. In the early days, pre-Configit, there was a brilliant programmer who made a BDD package called Buddy, which we turned into a variant for Configit called “Cuddy” (‘C’ for Configit), and eventually this evolved into Noddy. Noddy still serves as a module, which is the very lowest level of Configit Core. In simple terms, Noddy handles BDD operations and forms the foundation for everything we do in Configit – although there are many layers on top of this; some for authoring/composing (allowing for input in some form: text, BDD format or other) and others for allowing users to configure against it. Runtime also serves as an additional layer.
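To give a feel for what a BDD layer provides, here is an illustrative stand-in: a boolean function over named variables represented explicitly as its set of satisfying assignments. A real BDD package like Buddy stores this compactly as a decision diagram; the operations below (conjunction, disjunction, negation, restriction) only mirror the core BDD operations conceptually and are not Noddy’s actual interface.

```python
from itertools import product

VARS = ("a", "b", "c")

# Every assignment is a tuple of booleans, one per variable in VARS.
FULL = set(product([False, True], repeat=len(VARS)))

def var(name):
    """The function that is true exactly when `name` is true."""
    return {assign for assign in FULL
            if dict(zip(VARS, assign))[name]}

def conj(f, g):  # f AND g
    return f & g

def disj(f, g):  # f OR g
    return f | g

def neg(f):      # NOT f
    return FULL - f

def restrict(f, name, value):
    """Fix one variable to a constant (the BDD 'restrict' operation)."""
    i = VARS.index(name)
    return {assign for assign in f if assign[i] == value}

# A typical rule: a implies b, i.e. (NOT a) OR b.
rule = disj(neg(var("a")), var("b"))
print(len(rule))                        # satisfying assignments out of 8
print(len(restrict(rule, "a", True)))   # with a fixed true, b must be true
```

The explicit-set representation blows up exponentially, which is exactly why the real foundation uses BDDs instead; the algebra of operations is the same.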
Who is a primary customer for Core?
Core was conceived to provide a closer integration between the technologies in Ace. Technically speaking, the first user of Configit Core is Ace, however we have already had a number of projects where Services transforms customer data into something that can be configured and exposed to customers. Instead of trying to import customer data into Configit Model, re-export it to another intermediate step, and pass it through pmc to get a VT-file before configuring, Core provides a shortcut that allows you to go from customer data to configuration in one smooth step.
Is Core already in use by Ace today?
What examples can be shared?
To protect client confidentiality, we will discuss two cases, “Customer A” and “Customer B”. “Customer A” is currently moving towards a full-scale Ace implementation; however, they have another authoring tool and want to take content from it and make it available for configuration. The link from their authoring tool to configuration is Configit Core. The beauty of this approach is that they can use their own authoring tool today and Ace tomorrow, and the service that consumes the configurations and performs solves will be the same service. Through Core, they are able to use the future solve service in Ace.
“Customer B” is also interesting, because they have very different requirements. They do not have configuration needs. They have a PLM system. This system is capable of authoring everything – from writing and editing rules to controlling change management. However, the system does not do much with the rules – what the client wants to know is, “Are there any inconsistencies in our rules? Are they sane? What can be done with them? Do they look right? Are the BOMs aligned?”
Essentially, we are responsible for taking their system rules and creating reports to reveal inconsistencies. Again, we use Configit Core to extract the rules and compile them into the new core VT-format and then, instead of doing configuration, we do analysis and ask if the rules are OK, if the values and intervals are possible, if there are inconsistencies etc.
If It Can Be Configured… It Can Be Built
We have heard about Solve Services and Configuration Services and now also Analysis Services. What about BOM Services – can you describe this concept?
BOM validation is an example of the most advanced form of analysis services Configit works with. The challenge is that quite often, when working with configuration data, the definition of what is sold is disconnected from the definition of what can be engineered, and again from the definition of what can be manufactured.
Ace allows users to mix marketing intent with engineering intent. Ace captures what marketing wants to expose – for which markets and when – while at the same time capturing the technical requirements, so when choosing some features users automatically get all implied required features. The challenge is to make sure that what we offer to the customer is something that can be manufactured.
We can take the definitions of all the parts that can go into a product – the super BOM – and match it against all the rules used for both selling and engineering the product. The end result is a guarantee: “Everything you can configure can be built!”
All the individual parts of a product are organized into components (pieces). Configit can guarantee that for every possible valid configuration there will be exactly one part in every component and that we will never create an incomplete or incorrect end product.
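The exactly-one-part property can be sketched as a check over the solution space. This is a hypothetical illustration: the components, part names, and conditions are invented, and enumeration stands in for the symbolic analysis Configit actually performs.

```python
from itertools import product

# A super BOM maps each component to candidate parts, each guarded by a
# condition on the configuration (all names and rules are invented examples).
super_bom = {
    "seat": [
        ("basic_seat",  lambda c: not c["heated"]),
        ("heated_seat", lambda c: c["heated"]),
    ],
    "steering": [
        ("left_hand_drive",  lambda c: c["market"] == "EU"),
        ("right_hand_drive", lambda c: c["market"] == "UK"),
    ],
}

def valid_configurations():
    for heated, market in product([False, True], ["EU", "UK"]):
        yield {"heated": heated, "market": market}

def check_exactly_one(super_bom, configurations):
    """For every valid configuration, every component must select exactly
    one part; zero or several parts would break the build guarantee."""
    problems = []
    for config in configurations:
        for component, parts in super_bom.items():
            chosen = [name for name, cond in parts if cond(config)]
            if len(chosen) != 1:
                problems.append((config, component, chosen))
    return problems

print(check_exactly_one(super_bom, valid_configurations()))  # [] means OK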
Under Engineering vs Over Engineering
What do you call a BOM if it does not have this property?
We use different names, but one of the most frequently used is “Under Engineering”. This happens when engineers have “forgotten” to describe how to engineer a certain configuration.
To further complicate matters, effectivity must also be considered: different parts in different components replace each other at different times, all while configurations change validity over time. For instance, when an order is taken there is no guarantee that it can be manufactured at a specific point in time. Nonetheless, the unit BOM needs to be checked and validated against a specific point in time – the point at which we estimate the product will be manufactured. The unit BOM is what appears when you apply the configuration to the super BOM; the result is all the concrete parts needed to manufacture a given configuration. Complications arise if an order is rescheduled. When an order (or part of an order) is delayed, the parts in the order have to be recalculated and re-confirmed, and even the configuration driving the parts might need to change. The challenge here is that BOMs cannot be reconfigured or re-exploded for every single point in time, which is very challenging for production.
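Effectivity can be made concrete with a small sketch. The part names, dates, and function are invented for illustration; the point is only that the same configuration explodes to different unit BOMs depending on the estimated manufacturing date.

```python
from datetime import date

# Each part carries a validity window (part, effective_from, effective_to).
parts = [
    ("bracket_v1", date(2014, 1, 1), date(2015, 6, 30)),
    ("bracket_v2", date(2015, 7, 1), date(2099, 12, 31)),
]

def explode_at(parts, manufacturing_date):
    """Pick the parts effective at the estimated manufacturing date.
    If an order slips past an effectivity boundary, the result changes,
    which is why delayed orders must be re-exploded."""
    return [name for name, start, end in parts
            if start <= manufacturing_date <= end]

print(explode_at(parts, date(2015, 6, 1)))   # before the changeover
print(explode_at(parts, date(2015, 8, 1)))   # after: the order was delayed
```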
Configit’s guarantee comes into play in exactly such scenarios: “For every possible product that can be configured at any possible time, the criteria applied to the BOM will be valid.”
Under Engineering is interesting because companies want to guarantee that what is promised to customers can be produced. We have seen real examples where products have been configured and sold, yet when they reach the manufacturing line parts are missing.
We also look at Over Engineering. Over Engineering is when parts are defined – saying that this part belongs in this place in the BOM under these conditions – but the conditions can never be met. As an automobile example, a company could introduce an advanced heated seat, automatic temperature controls and possibly some other top-of-the-line gadgets. However, it could well be that this combination will never be allowed. This means there is a part – the advanced heated seat – that could sit unused, never being placed into any automobile. Even though the conditions look correct, such parts end up costing a lot of money: parts go unused, and engineers spend time creating unsolvable puzzles and then later figuring out how to make them solvable. The process of checking for Over Engineering is really just a sanity check that the BOM is valid.
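The over-engineering check can be sketched as looking for parts whose condition no valid configuration ever triggers. This is a hypothetical illustration with invented rules and part names, not Configit’s implementation, which works symbolically rather than by enumeration.

```python
from itertools import product

# The forbidden combination from the example: heated seat plus automatic
# temperature control is never allowed (invented rule).
rules = [lambda c: not (c["heated_seat"] and c["auto_temp"])]

def valid_configurations():
    for heated, temp in product([False, True], repeat=2):
        config = {"heated_seat": heated, "auto_temp": temp}
        if all(rule(config) for rule in rules):
            yield config

# Part conditions from the BOM; the luxury module requires the forbidden combo.
part_conditions = {
    "standard_seat": lambda c: not c["heated_seat"],
    "luxury_seat_module": lambda c: c["heated_seat"] and c["auto_temp"],
}

def over_engineered(part_conditions, configurations):
    """Parts whose condition no valid configuration can ever trigger."""
    configs = list(configurations)
    return [part for part, cond in part_conditions.items()
            if not any(cond(c) for c in configs)]

print(over_engineered(part_conditions, valid_configurations()))
```

The symmetric check – a valid configuration for which some component yields no part at all – is exactly Under Engineering.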
Believing in BOM
What are the primary challenges in the field of BOM validation?
Many customers find it difficult to understand the impact of this type of technology. The automotive industry as a whole accepts that this problem exists. Their options are to accept it (i.e. do nothing), hand-craft custom solutions using brute-force checking methods, or have engineers validate the BOM definitions manually. The realization that this can be completely automated can feel like science fiction.
Do you think the automotive industry is the most mature industry for BOM Validation and Analysis?
The automotive industry is a good fit because there are a few things needed in order for solution space management or BOM validation to make sense; i.e., a highly configurable product with an advanced bill of material. This is the case for passenger vehicles, trucks and many commercial vehicles. There are other parts of the automotive industry where there is a lot of engineer-to-order (ETO). Several truck manufacturers accept that while they may have a super BOM (configurable BOM) they always have to perform a fair degree of ETO; it is likely or even certain that a large part of orders cannot go through without special engineering and the BOM functions as a guideline, not a rule.
What about other industries – have you seen similar requirements there?
The approach is the same regardless of industry, but not every company uses super BOMs to the same extent. For example, pumps, where we know super BOMs are used, are not as highly configurable. Pump companies have a very broad spectrum of products, but the configuration challenge is often more about finding the right product and then adjusting the options of that specific product, whereas the automotive industry has fewer products that are in themselves much more configurable. The point remains that the challenges are the same.
Configuration Lifecycle Management (CLM)
Does CLM make sense for the automotive industry?
CLM starts in the early phases of a product lifecycle; as part of this, we specifically look at the engineering phase. I have talked about being able to produce what we can sell, but the engineering phase is not about what can be sold (the sales configurator). Here we take the Engineering BOM (e-BOM), which in most cases is the foundation for the other BOMs (the Sales BOM, the Manufacturing BOM, the Service BOM, etc.). It starts with the e-BOM. The e-BOM expresses how the engineers look at their products, and it is typically the e-BOM we validate. When we talk about BOM validation, our customers understand that we are talking about the early phases of defining the products, not the later sales phases that are typically in focus when talking about configuration.
Exploring the Solution Space
Why do you think Configit is a first mover?
We have some technology that is unique in this area. Compared to other configuration vendors, we have the capability to work with solution spaces. This may sound like a subtle change from constraint solvers or other configuration approaches, but we can unify the solution space for what you wish to sell with the solution space for what you can engineer, and match that with solution spaces for customers. So while other configuration technologies might be able to do some of the same things, we have an advantage because our technology is extremely well suited to this challenge. In the end it is really about finding overlaps and gaps in solution spaces, and that fits extremely well with our VT technology.
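Overlaps and gaps between solution spaces reduce to set operations. In this hypothetical sketch, plain Python sets of configurations stand in for the symbolic solution spaces that the VT technology handles compactly; all configurations are invented.

```python
# Sales view and engineering view of a product as sets of configurations.
sellable = {
    frozenset({("engine", "petrol"), ("hitch", False)}),
    frozenset({("engine", "diesel"), ("hitch", False)}),
    frozenset({("engine", "diesel"), ("hitch", True)}),
}
buildable = {
    frozenset({("engine", "petrol"), ("hitch", False)}),
    frozenset({("engine", "diesel"), ("hitch", False)}),
}

# Overlap: everything that is both sellable and buildable.
safe = sellable & buildable

# Gap: sellable but not buildable -- exactly the under-engineering problem.
gap = sellable - buildable

print(len(safe))
print(sorted(dict(c)["engine"] for c in gap))
```

Explicit sets explode combinatorially, which is why operating on whole spaces symbolically, rather than point by point, is the differentiator being described.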
Is it an oversimplification to say that traditional technologies are oriented towards points in solution spaces, whereas we operate on entire spaces?
Yes – it is more complex than that. Constraint-based solvers really just try to figure out whether something has a solution. To do this they search and search. Some of those technologies can be used for part of the BOM validation challenge; however, we don’t just validate with a yes/no. We also find counterexamples and identify the individual rules involved, which helps analyze and debug the solution space.
What I have heard from our customers is that there are a few cases where they actually try to do BOM validation. The way some of them do this is brute force – enumerating every valid combination of a certain set of variables to see if it ends up with a correct BOM. A truck manufacturer who actually does this explains that you cannot do it very often, because just checking one component can be quite complicated. They have a single component with 128 million different valid combinations of the driving variables, which, in combination with thousands of parts, explodes into a huge number of combinations to validate – so it is something they do not do too often.
Now, we want to do this continuously, and this is where we bring the unique perspective of an integrated compile-consume approach. What we try to do with BOM validation is provide a service for the users who define materials in PLM systems, giving them immediate feedback as they author their materials on their PLM platform. They immediately see if they have rules that can never be satisfied. This way we extend the footprint of where configuration data is available, and we make it available across systems. Configit is not about authoring BOMs – but we support the authoring from a Configuration Lifecycle Management perspective.
Do you have feedback from the market about the value of CLM authoring and BOM-validation?
We have indications from an automotive manufacturer who produces quite a lot of vehicles. They sometimes make engineering decisions, which may seem harmless, but they cannot gauge the implications on parts. They would really like to know if an engineering decision – say opening up for a certain combination of features – would be covered by the parts in the BOM. They also want to know the extended costs of introducing new parts. They are working with small margins of a few euros per part but with thousands of parts in hundreds of thousands of vehicles each year this translates into huge sums. Even subtle changes can have a huge impact and this is why it is valuable to be able to analyze the impact of changes.
Keeping Your Promise
I believe our message is not just about savings or increased revenue. We emphasize the value of our customers keeping their promise to their customers – making sure they can indeed manufacture what the customer has chosen. It translates into efficiency, quality and, ultimately, goodwill.
It is the same thing we are doing with configurators. Instead of just taking orders and hoping for the best, we use a sales configurator. There are two aspects. One is support: streamlining processes, making sure companies don’t waste engineering hours on something they don’t need, and making sure they provide a quality experience to customers. The other is finding the right parts, removing those that do not work, and identifying part-related savings.