Every time I see this picture, it reminds me of my role as an interoperability engineer. Both drawers work great individually, but when placed into a production environment, there are some glaring problems.
Not taking the time to set our products up and test with other solutions prior to production installation could land us in the same situation.
My role in interoperability testing is to take our core products and make sure they “play well” with others. This could be anything from a SIP trunk on our SCM Compact to an analytics package attached to our Wireless Enterprise Access Point Controller.
The key to a successful interoperability test is a well-thought-out test plan, which develops into a well-written interoperability guide for our partners. This typically involves taking the designer's original feature guide and listing every possible combination of how those features can be used together.
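To give a feel for how quickly "every possible combination" grows, here is a minimal sketch in Python. The feature names are illustrative, not our actual feature guide, and a real plan also covers single features and higher-order interactions, not just pairs.

```python
from itertools import combinations

# Illustrative feature list; a real plan is built from the designer's feature guide.
features = ["call hold", "call transfer", "conference", "call forward", "music on hold"]

# Pairwise interactions alone already multiply the work: n features
# yield n * (n - 1) / 2 two-feature scenarios.
pairs = list(combinations(features, 2))
print(len(features), "features produce", len(pairs), "pairwise scenarios")
```

Even five features produce ten pairwise scenarios; a realistic feature set pushes the plan into the hundreds, which is why the guide matters.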
In the case of our Samsung Call Manager and a 3rd party SIP trunk, this could include testing features like call hold, call transfer, conference calling, and about 150 other, less common scenarios. Yes, both products speak SIP, but it is my goal to make sure they are speaking the exact same dialect of SIP. Think of it as both systems speaking a Latin-based language: perhaps one speaks Italian and the other speaks Spanish. Chances are they will understand each other, but some things will be lost in translation. I need to make sure they are speaking the exact same language, down to the region.
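One concrete way the "dialect" problem shows up is in the capabilities each side advertises. The sketch below is hypothetical (the header values are invented, not taken from any real trunk), but it shows the idea: both messages are perfectly valid SIP, yet one side advertises methods and extensions the other never mentions, which is exactly where features like transfer quietly break.

```python
def parse_headers(raw):
    """Parse a raw SIP message into a dict of lowercase header name -> value."""
    headers = {}
    for line in raw.strip().splitlines()[1:]:  # skip the request line
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()
    return headers

def capability_set(headers, name):
    """Split a comma-separated header value into a set of tokens."""
    return {item.strip() for item in headers.get(name, "").split(",") if item.strip()}

# Two hypothetical INVITEs: both valid SIP, different capabilities.
pbx_invite = """INVITE sip:alice@example.com SIP/2.0
Allow: INVITE, ACK, BYE, CANCEL, OPTIONS, UPDATE, REFER
Supported: replaces, timer"""

trunk_invite = """INVITE sip:bob@example.net SIP/2.0
Allow: INVITE, ACK, BYE, CANCEL, OPTIONS
Supported: timer"""

pbx = parse_headers(pbx_invite)
trunk = parse_headers(trunk_invite)

# REFER drives blind transfer and "replaces" drives attended transfer;
# if the trunk never advertises them, those features fail in the field.
print("Methods missing on trunk:", capability_set(pbx, "allow") - capability_set(trunk, "allow"))
print("Extensions missing on trunk:", capability_set(pbx, "supported") - capability_set(trunk, "supported"))
```

In a lab, a diff like this against real traces flags the mismatch in minutes; in production, it surfaces as "transfer doesn't work."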
In a properly set up lab environment, it is easy for an engineer to run through a test plan, diligently testing each scenario, with all the logs and traces like Wireshark running to capture the activity. If something doesn't work just right, the engineer can review the traces, determine the misalignment, adjust, and test again without impacting a customer. Attempting the same in production is another story: by the time the technician or engineer coordinates a maintenance window and works with the local IT department to get all the traces in place, something that would take minutes in a lab can take hours or even days. Not only that, a major burden is placed on the customer.
At the end of the process, when everything works as expected, it is important to document the changes needed so that a solutions installer can configure the system apples-to-apples.
Now, while every effort is made to test every possible scenario, sometimes a solutions provider will do something in the field that we didn't imagine. That isn't wrong; it is simply a use of our products we hadn't anticipated. The key here is that when problems are identified and resolved in these cases, that feedback should be incorporated into the interoperability guide. I would be wary of an interoperability guide with "Version 1.0" listed on the title page.
An example of poor interop testing could play out like this: if a partner deploys a Samsung solution with a 3rd party SIP trunk that we have not completed interop testing with, the partner may get complaints of dropped calls. The typical end user knows they have a Samsung solution, but they are not aware that they have a Samsung PBX with 3rd party SIP trunks. In the eyes of the end user, it’s the Samsung solution that is not working. They reach out to their installing partner who rolls a truck to site, sets up a trace to watch it happen, and learns that a simple timer change resolves the issue.
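The story above doesn't name the specific timer, but SIP session timers (RFC 4028's Session-Expires and Min-SE headers) are a common culprit for this class of dropped call, so here is a hedged sketch of that one possibility. The numbers are hypothetical; the point is that a single configured value on one side decides whether calls complete or fail.

```python
def negotiate_session_timer(offer_expires, answer_min_se):
    """Sketch of RFC 4028-style negotiation: the answering side rejects an
    offered Session-Expires below its Min-SE with a 422 response."""
    if offer_expires < answer_min_se:
        return ("422 Session Interval Too Small", answer_min_se)
    return ("200 OK", offer_expires)

# Hypothetical mismatch: the PBX offers a 90-second session interval,
# but the trunk provider requires at least 1800 seconds.
status, value = negotiate_session_timer(90, 1800)
print(status, "- trunk demands at least", value, "seconds")

# After the "simple timer change" on the PBX, the call proceeds.
status, value = negotiate_session_timer(1800, 1800)
print(status, "- session refresh every", value, "seconds")
```

If the offering side doesn't handle the 422 and retry with a larger interval, the symptom the end user sees is exactly the one in the story: calls that set up and then drop.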
If this product had completed interoperability testing, the issue most likely never would have presented itself.
Granted, this is a very simplistic example. In the real world, a product that has not completed interoperability testing typically has many symptoms presenting at the same time, and it takes a manufacturer support engineer like myself to peel back the layers and determine the core issues. The fixes are usually very simple changes, but they can be difficult to dissect when all of the symptoms are combined.
The absolute best advice I can give to a potential customer or solutions provider is to verify every single component they plan to install has been tested as a solution. If one or more of the components have not been tested, insist they be tested prior to installation. Without this due diligence, the customer may end up with two corner pull-out drawers.