Synergis Software Blog

Making a Positive Impact on Time to Knowledge

Written by Synergis Software | Aug 10, 2017 4:12:43 PM

It is easy to get bogged down in details when discussing topics like engineering data management (also known as product data management, or PDM). Sometimes it is good to step back and look at the big picture. For me, the details of PDM are the bricks in a building; the building is Time to Knowledge.

I define “Time to Knowledge” as the time it takes someone to get the specific, accurate information needed to answer a question. A typical day in engineering has hundreds of moments that trigger a Time to Knowledge event. Questions such as “What is the status of yesterday’s engineering change request?” or “Which document is the right revision, and where is it?” are specific questions with specific answers already present in your engineering data. If the answer is quickly accessible, productivity is enhanced. If the answer lies at the end of an uncertain quest, the human tendency too often is to settle for an imprecise workaround or to avoid the question entirely.

In 1982 IBM released a research paper on the economic value of rapid response time. The immediate subject was the speed of computer processors, but the principles apply to the broader discussion of Time to Knowledge. The IBM researchers identified a critical response-time threshold below which the computer no longer gets in the way of a user’s thought process. In those days, even word processors were slow enough that response time could be an issue. Today computers are so much faster that we are rarely annoyed by response times for text-based tasks. Even graphics-based tasks are falling into line for all but the most demanding uses (such as engineering simulation).

Now the critical measure is not how long it takes the computer to respond to a keystroke, but how long it takes your data infrastructure to respond to your question. And if your data infrastructure is so basic that the only way to ask it a question is to comb through files and folders, reducing Time to Knowledge becomes imperative.

When engineering companies first invested in computer automation, the money went to CAD. These tools help engineers communicate product designs, but they add complexity to data management. Too many companies lived with the added complexity and never moved beyond operating system and LAN folder management. Today those companies are still hobbled by the decision not to automate data use along with data creation. The end result is a wealth of information hidden behind a gossamer wall of archaic access methods.

Making sure the mountain of engineering data is organized for fast access is key. Ken Lechner is VP of Engineering for AMETEK Technical and Industrial Products. He considers data accessibility an integrity issue for his company. “The key is product integrity. With PDM, we don’t wonder what the right revision is and don’t need to go to multiple places to find it. We know it is the latest revision and that it has gone through the approval process.”

Companies like AMETEK have eliminated the wonder factor that comes with imprecise Time to Knowledge methods. The experience of people like Ken Lechner is the subject of a white paper by Jim Brown, president of Tech-Clarity, an independent research and consulting firm that specializes in analyzing the true business value of software technology and services. You can read the complete report here.

Randall S. Newton is the principal analyst and managing director at Consilia Vektor, a consulting firm serving the engineering software industry. He has been directly involved in engineering software in a number of roles since 1985. More information is available at https://www.linkedin.com/in/randallnewton.