Control & Automation
COLUMN – Data Models Alter Industry Dynamics
June 6, 2013 - I just returned from the first-ever, and very successful, Haystack Connect event in Chattanooga. What I learned is that it is not just the naming of data but a consistent data model that frees our data for our own purposes across a world of dynamic dimensions. Data no longer needs to be predefined before use if an accurate, self-discoverable model is present. This new way of viewing data opens a world in which the same data can be used in several different ways, as a dynamic subset of many scenarios.
By Ken Sinclair
So from every successful event there needs to be a takeaway that changes our point of view. For me it was captured when a couple of hayseeds took the stage, planted the seeds of change, and showed us all how a connected Haystack using data modeling could alter industry dynamics as we know them. A great show with an amazing cast in this historical and mildly hysterical production…. smile.
Links to this video plus several other presentations from the event are included in my review of the event, Something Happening Here.
Take the time to view it all; it starts with building on the Niagara Framework, but that is not where it ends. You will be surprised to see how the Haystack concept can be used almost everywhere and at several levels.
In Marc Petock's review of the event, he states: It's also important to note that Haystack doesn't have to be embedded in an end device for it to be useful. For example, one of the demonstrations presented at the Haystack Connect event showed products from six different suppliers being integrated with a software application that was able to discover the data in controllers and automatically build a database that a user could navigate, even automatically assembling graphic displays of the equipment (an air handler and VAVs).
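That self-discovery works because meaning is carried by tags rather than by vendor-specific point names. Here is a minimal sketch, in Python, of the idea; it is not the official Project Haystack API, and the point IDs and tag names are illustrative only:

```python
# Each point is just a bag of tags; meaning comes from the model,
# not from any vendor-specific naming convention.
points = [
    {"id": "ahu1-dat",    "ahu": True, "discharge": True, "air": True,
     "temp": True, "sensor": True},
    {"id": "vav3-zt",     "vav": True, "zone": True, "temp": True,
     "sensor": True},
    {"id": "vav3-damper", "vav": True, "damper": True, "cmd": True},
]

def find(points, *tags):
    """Return every point that carries all of the requested tags."""
    return [p for p in points if all(p.get(t) for t in tags)]

# A client can now "discover" all temperature sensors without knowing
# any point names in advance -- the basis for auto-built databases
# and auto-assembled graphics.
temps = find(points, "temp", "sensor")
print([p["id"] for p in temps])  # → ['ahu1-dat', 'vav3-zt']
```

The same query mechanism, pointed at six different suppliers' controllers, yields one navigable database, which is what the demonstration showed.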
In George Thomas's review and interview from the event, he states: Linux is especially useful in IP routing applications where much of the routing technology is freely available to use without royalties. Being part of the Linux community allows us to seek help on issues and to help others solve issues. The Linux community is large and willing to help, and many processor manufacturers support Linux.
Yet another connection community I was not aware of, how about you?
Data modeling is happening at several levels, not just at the dynamic data level but also at the building information level. In his article Modeling Building Automation and Control Systems, Jim Sinopoli, Smart Buildings, LLC, provides this insight:
Many building owners and facility managers lack good documentation for their automation control systems. Documentation has value; lack of system documentation costs an organization and increases risk. It means troubleshooting and work orders take longer and are more expensive, and it extends the time it takes to resolve issues for tenants or occupants. It may also mean preventative maintenance isn't done because no one knows what the PM schedule is, possibly shortening the life of the equipment. Or it may mean that facility personnel really do know a lot about their systems, but if they move to another organization or company, or retire, all that knowledge or "system documentation" leaves with them.
This lack of documentation for automation and control systems is caused by inadequate organization and planning in the handoff from construction to building operations, and by the fact that much of the documentation is in paper format.
Help is on its way. The buildingSMART alliance with the input of the US Army Corps of Engineers has developed and proposed a data structure for representing information related to automation and controls. It falls under a large umbrella called Building Information Modeling or BIM.
If you've had any involvement in new building design and construction you're probably familiar with BIM. You may be aware of, or have been exposed to, the 3-D modeling of a building and its components, and understand the value it can provide in avoiding potential "collisions" between the designs created by different engineers, as well as the usefulness this modeling provides to contractors in fabricating building systems and components. Major designers and construction companies have embraced BIM, and rightfully so; it can reduce change orders, help maintain schedules, and generally produce better buildings.
The larger picture is that BIM should be approached as a life-cycle asset management tool: data created and acquired during design and construction is then delivered to facility management. Building operations account for 85% to 95% of a building's lifecycle.
Once we all get our minds around the power of data models, we can start to understand Toby's excitement in his column, A Path to Multi-Agent Operation of Buildings, by Toby Considine, TC9 Inc.
The energy manager proposed that each space be represented by a software agent; that agent would understand the unique needs of each space. During times of shortage, the agents would compete to acquire power to support the needs of their space. The general model for competition leading to optimum control would follow the pattern outlined by Huberman and Clearwater (“Multi-Agent control of Building Environments”, 1995). Unlike that work, the agents would compete over multiple dimensions of building services.
As we talked, I began to tie these two projects, the BIM-based services of oBIX 2.0 and software agents for spaces, together.
Through BIM-based queries, one can create collections of points that are each, in effect, a virtual BAS. Each virtual BAS can be represented by an agent. A traditional general purpose BAS can be treated as many special purpose BAS, the interests of each supported by an agent. From the perspective of a high-level architecture, it makes no difference whether an agent is “near” a system or in the cloud controlling a system. Binding oBIX through BIM creates a simple path to multi-agent transactive operation of buildings.
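To make the agent competition concrete, here is a toy sketch in the spirit of Huberman and Clearwater's market-based control; it is an assumed illustration, not taken from the cited paper. Each space agent bids according to how far its zone is from setpoint, and scarce power is allocated in proportion to the bids:

```python
def bid(setpoint, actual):
    """An agent's bid grows with its zone's discomfort (deviation from setpoint)."""
    return abs(setpoint - actual)

def allocate(available_kw, agents):
    """Split the available power in proportion to each agent's bid."""
    bids = {name: bid(sp, t) for name, (sp, t) in agents.items()}
    total = sum(bids.values())
    return {name: available_kw * b / total for name, b in bids.items()}

# Three zones as (setpoint °C, current temperature °C); only 10 kW available.
agents = {"zone-a": (22.0, 25.0), "zone-b": (22.0, 23.0), "zone-c": (21.0, 21.5)}
shares = allocate(10.0, agents)
for name, kw in shares.items():
    print(f"{name}: {kw:.1f} kW")
```

During a shortage the zone furthest from comfort wins the largest share, and no central program needs to know anything about the zones in advance; that is the appeal of the multi-agent pattern. A real system would, as the column notes, compete over multiple dimensions of building services, not just one.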
If you want to find out more about high level architectures for transactive operations, drop me a line, or look for the paper “Understanding Microgrids as the essential Smart Energy Architecture”, (Considine, Cazalet, & Cox, 2012).
It is a brave new world forming fast in the cloud, and standard data models provide the feedstock that the cloud can automatically reconfigure for the purpose of the moment.
Data is set free, without the restrictions that normally come with data supplied by multiple sources and vendors. And of course, once a strong data model travels with your data, rule-based routines can automatically convert it to a different model whenever you need one.
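A minimal sketch of what such a rule-based conversion routine might look like, assuming hypothetical tag names and target fields of my own choosing: each rule maps a combination of source tags onto a field in the target model.

```python
# Rules: (required source tags, (target field, target value)).
# The tag vocabulary and target schema here are illustrative only.
RULES = [
    ({"discharge", "air", "temp"}, ("measurement", "DischargeAirTemperature")),
    ({"zone", "temp"},             ("measurement", "ZoneTemperature")),
    ({"sensor"},                   ("role", "Sensor")),
    ({"cmd"},                      ("role", "Command")),
]

def convert(point_tags):
    """Apply every matching rule to build a record in the target model."""
    record = {}
    for required, (field, value) in RULES:
        if required <= point_tags:  # all required tags present?
            record[field] = value
    return record

print(convert({"discharge", "air", "temp", "sensor"}))
# → {'measurement': 'DischargeAirTemperature', 'role': 'Sensor'}
```

Because the rules operate on the model rather than on point names, the same routine works unchanged across data from any vendor that carries the tags.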