The IT industry is approaching a point where we’ll need to distinguish between “cognitive” and “semantic” computing. The terms are blurring into each other; in truth, the distinction between them has never been clear. Sometimes it’s easier to lump them both under the even vaguer “smart computing” to allude to the practical magic they enable.
I recently came across an article that promotes the concept of “cognition as a service” while stating bluntly that the “semantic Web [has] failed,” as if the two concepts were mutually exclusive or aimed at exactly the same objectives. For many years, I’ve blogged my thoughts on the “semantic Web” (a Tim Berners-Lee initiative at the W3C) vs. “semantic interoperability” (a long-standing data-integration imperative). I won’t rehash that debate here, other than to say that semantic technologies have permeated every aspect of the enterprise-architecture and cloud universe. They have most certainly not failed to take hold.
Cognitive computing can’t achieve its potential without a strong semantic-processing substrate that operates across diverse content sources. It’s good to see that the above-cited article’s author, Nova Spivack, references IBM Watson in this regard. The cloud service’s DeepQA technology incorporates semantic approaches into its very core, balancing the use of strict and shallow semantics and leveraging many loosely formed ontologies to deliver precise answers to natural-language queries.
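To make the strict-vs.-shallow distinction concrete, here is a toy sketch of the general idea: first try a strict lookup against a small hand-built ontology, and fall back to shallow keyword matching over raw documents when the ontology has no answer. Everything here (the ontology, documents, and `answer` function) is invented for illustration; it is emphatically not how DeepQA is implemented, only a minimal analogue of mixing the two semantic styles.

```python
# Toy sketch: strict ontology lookup with a shallow keyword fallback.
# All data and names are illustrative, not drawn from IBM Watson/DeepQA.

ONTOLOGY = {
    # (subject, predicate) -> object
    ("IBM Watson", "is_a"): "question-answering system",
    ("IBM Watson", "built_by"): "IBM",
    ("DeepQA", "part_of"): "IBM Watson",
}

DOCUMENTS = [
    "IBM Watson famously won the Jeopardy! quiz show in 2011.",
    "DeepQA balances strict and shallow semantic analysis.",
]

def answer(question: str) -> str:
    q = question.lower()
    # Strict pass: match the question against known subjects and predicates.
    for (subject, predicate), obj in ONTOLOGY.items():
        if subject.lower() in q and predicate.replace("_", " ") in q:
            return obj
    # Shallow pass: fall back to the document with the most keyword overlap.
    keywords = set(q.replace("?", "").split())
    return max(DOCUMENTS,
               key=lambda d: len(keywords & set(d.lower().split())))

print(answer("IBM Watson is a what?"))  # strict hit from the ontology
print(answer("Who won Jeopardy?"))      # shallow fallback to a document
```

A real system would, of course, use far richer linguistic analysis and confidence scoring to arbitrate between the two passes; the point is only that precise ontological knowledge and loose textual evidence complement rather than exclude each other.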