Few areas of the current data landscape remain untouched by the self-service movement, which has placed analytics firmly within the grasp of the enterprise and its myriad users, from novices to the most accomplished IT personnel.
Cognitive computing and its self-service analytics have arguably been at the forefront of this effort, as their capability to integrate and analyze disparate sources of big data and deliver rapid results with explanations and recommendations demonstrates.
Historically, machine learning and its penchant for predictive analytics have functioned as the most accessible of the cognitive computing technologies, which include natural language processing, neural networks, semantic modeling and vocabularies, and other aspects of artificial intelligence. According to indico co-founder and CEO Slater Victoroff, however, the crux of machine learning's utility might actually revolve around deep learning and, specifically, transfer learning.
By accessing these technologies at scale via the cloud, enterprises can now deploy cognitive computing analytics on sets of big data without data scientists and the inordinate volumes of data required to develop the models and algorithms that function at the core of machine learning.
From Machine Learning to Deep Learning
The cost, scale, and agility advantages of the cloud have given rise to numerous Machine Learning-as-a-Service vendors, some of which substantially enhance enterprise utility with Deep Learning-as-a-Service. Machine learning is widely conceived of as a subset of predictive analytics in which new models and algorithms are informed by the results of previous ones, so that future models are formed more quickly and analytics can be tailored to a use case or data type. According to Slater, deep learning algorithms and models "result in better accuracies for a wide variety of analytical tasks." Largely considered a subset of machine learning, deep learning is understood as a more mature form of the former. That difference is conceptualized in multiple ways: "Instead of trying to handcraft specific rules to solve a given problem (relying on expert knowledge), you let the computer solve it (deep learning approach)," Slater noted.
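The handcrafted-rules-versus-learning distinction Slater draws can be sketched with a toy sentiment task. The word lists, training examples, and perceptron below are illustrative assumptions of ours, not indico's implementation:

```python
from collections import defaultdict

# Handcrafted-rules approach: an expert enumerates sentiment words by hand.
POSITIVE = {"great", "love", "excellent"}
NEGATIVE = {"awful", "hate", "terrible"}

def rule_based_sentiment(text):
    tokens = text.lower().split()
    score = sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)
    return "pos" if score >= 0 else "neg"

# Learned approach: let the computer infer word weights from labeled data.
def train_weights(examples, epochs=10, lr=1.0):
    """Simple perceptron over bag-of-words features."""
    w = defaultdict(float)
    for _ in range(epochs):
        for text, label in examples:          # label is +1 or -1
            tokens = text.lower().split()
            score = sum(w[t] for t in tokens)
            pred = 1 if score >= 0 else -1
            if pred != label:                 # update only on mistakes
                for t in tokens:
                    w[t] += lr * label
    return w

train = [
    ("the plot was gripping", 1),
    ("gripping and superb acting", 1),
    ("a dull and tedious story", -1),
    ("tedious plot", -1),
]
w = train_weights(train)

def learned_sentiment(text, w):
    score = sum(w[t] for t in text.lower().split())
    return "pos" if score >= 0 else "neg"

# The expert's word list knows nothing about "tedious"; the learned
# weights picked it up from the examples.
print(rule_based_sentiment("a tedious film"))   # "pos" -- rule list misses it
print(learned_sentiment("a tedious film", w))   # "neg" -- learned from data
```

The point of the sketch is the division of labor: in the first function a human encodes the problem knowledge; in the second, the training loop extracts it from examples.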
Transfer Learning and Scalable Advantages
Slater completes the parallel with an analogy that likens machine learning to an infant and deep learning to a child. Whereas an infant must be taught everything, "a child has automatically learnt some approximate notions of what things are, and if you can build on these, you can get to higher level concepts much more efficiently," Slater commented. "This is the deep learning approach." That distinction in efficiency is critical in terms of scale and data science requirements: according to Slater, there is a "100 to 100,000 ratio" in the amount of data required to form the aforementioned "concepts" (modeling and algorithm principles to solve business problems) with a deep learning approach versus a machine learning one. That difference is accounted for by transfer learning, a subset of deep learning that "lets you leverage generalized concepts of knowledge when solving new problems, so you don't have to start from scratch," Slater explained. "This means that your training data sets can be one, two or even three orders of magnitude smaller in size and this makes a big difference in practical terms."
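Why a reused "concept" shrinks the training-data requirement can be shown with a toy geometric task. The frozen feature extractor below stands in for the lower layers of a pretrained deep network; everything here is an illustrative assumption, not an indico API:

```python
# Task: classify points as inside or outside a circle. In raw (x, y)
# coordinates this needs a nonlinear model and lots of data; with a
# pretrained "concept" it needs one threshold and a handful of examples.

def pretrained_features(x, y):
    """A generalized concept learned on some large prior task -- here
    frozen and reused, like the lower layers of a pretrained network."""
    return x * x + y * y  # 'squared distance from origin' concept

def fit_threshold(examples):
    """Train only a tiny 'head': one cutoff on top of the frozen feature."""
    inside = [pretrained_features(x, y) for x, y, label in examples if label == 1]
    outside = [pretrained_features(x, y) for x, y, label in examples if label == 0]
    return (max(inside) + min(outside)) / 2  # midpoint between the classes

# Four labeled examples suffice, because the hard part -- the nonlinear
# feature -- was already learned elsewhere.
train = [(0.1, 0.2, 1), (0.3, 0.1, 1), (0.9, 0.9, 0), (1.0, 0.2, 0)]
threshold = fit_threshold(train)

def classify(x, y):
    return 1 if pretrained_features(x, y) < threshold else 0

print(classify(0.2, 0.2))  # inside the circle -> 1
print(classify(1.2, 0.5))  # outside -> 0
```

Training from scratch would mean learning the circular boundary itself from raw coordinates, which is where the orders-of-magnitude gap in data requirements comes from.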
Image and Textual Analytics on "Messy" Unstructured Data
Those practical terms expressly denote the difference between staffing multiple data scientists to formulate algorithms on exorbitant sets of big data and leveraging a service provider's library of preset models tailored to vertical industries and use cases. These models are also readily modified by competent developers. Providers such as indico offer these solutions to companies tasked with analyzing the most challenging "messy data sets," as Slater characterized them. In fact, the text and image analytics required of unstructured data are ideal applications for deep learning and transfer learning. "Messy data, by nature, is harder to cope with using handcrafted rules," Slater observed. "In the case of images things like image quality, lighting conditions, etc. introduce noise. Sarcasm, double negatives, and slang are examples of noise in the text domain. Deep learning allows us to effectively work with real world noisy data and still extract meaningful signal."
The foregoing library of models utilizing this technology can derive insight from an assortment of textual and image data, including personality characteristics, emotions, various languages, content filtering, and more. These cognitive computing analytics capabilities are particularly well suited to social media monitoring and sentiment analysis in verticals such as finance, marketing, and public relations.
Sentiment Analysis and Natural Language Processing
The difference with a deep learning approach lies in both the rapidity and the granularity of the analytics performed. Conventional natural language processing tools are adept at identifying specific words and spellings, and at determining their meaning in relation to additional vocabularies and taxonomies. NLP informed by deep learning expands this utility to entire phrases and a plethora of subtleties such as humor, sarcasm, irony, and meaning that is implicit to native speakers of a particular language. Such accuracy is pivotal to gauging sentiment.
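The gap between word-level and phrase-level understanding shows up even in a toy example of ours (not indico's method): a lexicon scores each word in isolation and misreads negation, while even a crude phrase-aware rule, standing in for what a deep model learns from data, catches it.

```python
# A minimal word-level sentiment lexicon, as a conventional NLP tool uses.
LEXICON = {"good": 1, "great": 1, "bad": -1, "terrible": -1}
NEGATORS = {"not", "never", "hardly"}

def word_level(text):
    """Score each token in isolation -- the conventional lexicon approach."""
    return sum(LEXICON.get(t, 0) for t in text.lower().split())

def phrase_aware(text):
    """Flip a word's polarity when it follows a negator -- a crude stand-in
    for the phrase-level understanding a deep model learns from data."""
    score, negate = 0, False
    for t in text.lower().split():
        if t in NEGATORS:
            negate = True
            continue
        s = LEXICON.get(t, 0)
        score += -s if negate else s
        negate = False
    return score

print(word_level("the service was not good"))    # 1  -- 'good' read alone
print(phrase_aware("the service was not good"))  # -1 -- negation flips it
```

Double negatives, sarcasm, and idiom compound the problem well beyond what hand-written flips can cover, which is Slater's case for learning these patterns rather than enumerating them.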
Additionally, the necessity of image analysis as part of sentiment analysis and other forms of big data analytics is only increasing. Slater characterized this propensity of deep learning in terms of popular social media platforms such as Twitter, in which images are frequently incorporated. Image analysis can detect when someone is holding up a "guitar, and writes by it 'oh, wow'," Slater said. Without that image analysis, organizations lose the context of the text and the meaning of the entire post. Moreover, image analysis technologies can also discern meaning in facial expressions, gestures, and other visual cues that yield insight.
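Slater's guitar example can be reduced to a toy sketch of ours: the text alone is an ambiguous exclamation, and a tag produced by image analysis supplies the missing subject. The tag names and phrase list are illustrative assumptions, not a real image-analysis output format.

```python
# Phrases that signal enthusiasm but name no subject on their own.
AMBIGUOUS_EXCLAMATIONS = {"oh, wow", "wow", "amazing"}

def interpret_post(text, image_tags):
    """Attach ambiguous enthusiasm to the object image analysis found."""
    if text.lower().strip("!. ") in AMBIGUOUS_EXCLAMATIONS and image_tags:
        return f"positive reaction to {image_tags[0]}"
    return "context unknown"

# With the image tag, the post's meaning is recoverable; without it,
# the organization loses the context of the entire post.
print(interpret_post("oh, wow", ["guitar"]))  # positive reaction to guitar
print(interpret_post("oh, wow", []))          # context unknown
```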
Cognitive Computing Analytics for All
The provisioning of cognitive computing analytics via MLaaS and DLaaS illustrates once again exactly how pervasive the self-service movement is. It also demonstrates the democratization of analytics and the fact that with contemporary technology, data scientists and massive sets of big data (augmented by expensive physical infrastructure) are not required to reap the benefits of some of the fundamental principles of cognitive computing and other applications of semantic technologies. Those technologies and their applications, in turn, are responsible for increasing the very power of analytics and of data-driven processes themselves.
In fact, according to Cambridge Semantics VP of Marketing John Rueter, many of the self-service facets of analytics that are powered by semantic technologies "are built for the way that we think and the way that we analyze information. Now, we're no longer held hostage by the technology and by solving problems based upon a technological approach. We're actually addressing problems with an approach that is more aligned with the way we think, process, and do analysis."