Riding the Waves of Smart Data

This blog post started out titled “What’s in a name”, and was going to cover the nuances of the various terms used in the data science and analytics industry, which can be a little confusing at times.

However, it quickly evolved into a look at how the industry has progressed in technological waves, each offering different capabilities under different marketing terms. Perhaps this has been driven in part by the software industry, perhaps by academia. There is likely no single driver, but the waves are clear.

Let’s dig in and find out what has happened and a little bit of what is going to happen!

Business Intelligence (BI)

Business Intelligence (BI) “comprises the strategies and technologies used by enterprises for the data analysis of business information”, Wikipedia

When we look at Google Trends (data available from 2004 onwards) as a proxy for interest and activity in the field/discipline, we can clearly see that interest in Business Intelligence was quite volatile in the earlier years and has since declined overall and stabilised.

This makes sense to me, as business has always maintained a strong, ongoing interest in it in the pursuit of competitive advantage.
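As an aside, this kind of Google Trends pull is straightforward to reproduce. Below is a minimal sketch using the unofficial pytrends Python library; the keyword, timeframe and geography are illustrative assumptions, not the exact queries behind the charts discussed in this post.

```python
# Minimal sketch using the unofficial pytrends library (pip install pytrends).
# The keyword, timeframe and geography below are illustrative assumptions.
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(
    kw_list=["Business Intelligence"],   # term whose relative interest we track
    timeframe="2004-01-01 2020-12-31",   # Google Trends data starts in 2004
    geo="AU",                            # Australia, matching the Trends link in the references
)

interest = pytrends.interest_over_time()  # DataFrame of relative interest (0-100)
print(interest.head())
```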

Analytics

Analytics “is the systematic computational analysis of data or statistics. It is used for the discovery, interpretation, and communication of meaningful patterns in data. It also entails applying data patterns towards effective decision making. It can be valuable in areas rich with recorded information; analytics relies on the simultaneous application of statistics, computer programming and operations research to quantify performance.”, Wikipedia

In my mind, “Business Intelligence” has traditionally been seen as focusing on analysing what has happened (descriptive), whereas Analytics is a more modern term concerned with what could or should happen; it is more about prediction and optimisation.

Analytics rises in popularity but then there is a blip …

We can see that the term rose quickly in popularity from 2004 to 2010, continued to climb slowly through to 2014, and then gradually lost interest until July 2020, at which point it took a nosedive during the pandemic.

At this point, some see Analytics as a subset of BI, and others the reverse. Then along comes “Big Data”…

Big Data

Big Data “is a term that applies to the growing availability of large datasets in information technology. Big data analytics is a field that treats ways to analyze, systematically extract information from, or otherwise deal with data sets that are too large or complex to be dealt with by traditional data-processing application software. Data with many cases (rows) offer greater statistical power, while data with higher complexity (more attributes or columns) may lead to a higher false discovery rate. Big data analysis challenges include capturing data, data storage, data analysis, search, sharing, transfer, visualization, querying, updating, information privacy and data source. Big data was originally associated with three key concepts: volume, variety, and velocity. The analysis of big data presents challenges in sampling, and thus previously allowing for only observations and sampling. Therefore, big data often includes data with sizes that exceed the capacity of traditional software to process within an acceptable time and value.”, Wikipedia

I think it is likely that everyone in business has heard of “Big Data”; it really did have big hype. That hype was driven, in part, by the “relative” nature of defining something as big. At the time, the technologies were just appearing on the scene; now those technologies and capabilities are well embedded, and what was once “big” is now “meh”. But don’t get me wrong, there are certainly organisations that have to manage large volumes of data, and it is still challenging at times. However, I think technologies like Hadoop, Spark, Microsoft Azure Data Lake, Microsoft Azure Synapse Analytics and Microsoft Azure Databricks, et al., have certainly helped.
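To make that concrete, here is a minimal PySpark sketch of the kind of distributed aggregation these platforms make routine. The input path and column names (event_time, event_type, bytes) are hypothetical, used purely to illustrate the pattern.

```python
# Minimal PySpark sketch of a "big data" style aggregation.
# The input path and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("big-data-sketch").getOrCreate()

# Read a large, partitioned dataset; Spark distributes the work across executors.
events = spark.read.parquet("/data/events/")

# Aggregate billions of rows down to a small daily summary.
daily_volume = (
    events
    .groupBy(F.to_date(F.col("event_time")).alias("event_date"), "event_type")
    .agg(F.count("*").alias("events"), F.sum("bytes").alias("total_bytes"))
)

daily_volume.write.mode("overwrite").parquet("/data/summaries/daily_volume/")
```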

IoT (Internet of Things)

The Internet of Things “describes the network of physical objects—“things”—that are embedded with sensors, software, and other technologies for the purpose of connecting and exchanging data with other devices and systems over the Internet.”, Wikipedia

New technology comes over the horizon …

BAM! In early 2015, IoT began rising on the coat-tails of “Big Data”, and along comes the flood of data from sensors everywhere. As clearly highlighted by Gartner, it follows a Hype Cycle: interest initially peaks in June 2016 and then begins to stabilise as everyone starts to appreciate how and where the technology plays its role.

Again, a drop in interest during the pandemic… everyone has begun searching for COVID or coronavirus!

Data Science

Data Science “is an inter-disciplinary field that uses scientific methods, processes, algorithms and systems to extract knowledge and insights from structured and unstructured data. Data science is related to data mining, machine learning and big data.”, Wikipedia

“Data Science” sometimes seems to be really about the person. There is a lot of talk about the same activities of prediction, optimisation, etc. I find that this new brand is often a brand of the person rather than the discipline, although data science really should have scientific inquiry at its core, for example “reproducibility” and “replicability”.

Data Science, the new kid on the block (or is it) …

Just as Big Data is reaching its peak, the new kid “Data Science” comes onto the scene. Notably, this field or discipline has been around for a long time, but it skyrocketed in popularity from 2014 through to its peak in July 2019. From there it appears to have plateaued, albeit with some volatility.

Again, you guessed it, the pandemic hits… interest drops off. If only we didn’t have that pesky pandemic!

Artificial Intelligence

Artificial Intelligence (AI) “is intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality. The distinction between the former and the latter categories is often revealed by the acronym chosen. ‘Strong’ AI is usually labelled as AGI (Artificial General Intelligence) while attempts to emulate ‘natural’ intelligence have been called ABI (Artificial Biological Intelligence). Leading AI textbooks define the field as the study of ‘intelligent agents’: any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals. Colloquially, the term ‘artificial intelligence’ is often used to describe machines (or computers) that mimic ‘cognitive’ functions that humans associate with the human mind, such as ‘learning’ and ‘problem solving’.”, Wikipedia

Just when you thought it was safe to go back in the water …

Clearly, AI is now becoming a mainstream technology and has significant overlaps with the areas we have already talked about. One example that immediately comes to mind when we think of AI is chatbots.

As the history of AI shows (see the Harvard article in the references), AI garnered a lot of attention in its early days, especially in the 1960s, and faced challenges in the 1970s, a likely fallout of what we now recognise as a hype cycle.

The good news is that it has picked up steam again, likely as a result of improved compute power (including the cloud), the increased availability of data and, of course, ongoing research.

Pulling it all together

It seems like IoT and Data Science really did become more mainstream around the same time, on the back of Big Data. Data Science has continued to steam ahead in recent years, whereas interest in IoT has begun to fall away, albeit slowly.

Damn that pandemic!

Note: AI is missing from this comparison because Google Trends limits a single comparison to five terms (it is included below).

If we exclude the overwhelming interest in the broader term “Analytics”, then we get a clearer picture of how interest in the other technological capabilities has played out.
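For anyone who wants to recreate that comparison chart, here is a hedged sketch using the unofficial pytrends library and pandas plotting. The term list and timeframe are assumptions, and Google Trends caps a single comparison at five terms, which is why AI is charted separately.

```python
# Sketch: compare the remaining terms once the dominant "Analytics" is excluded.
# Term list and timeframe are illustrative assumptions.
import matplotlib.pyplot as plt
from pytrends.request import TrendReq

terms = ["Business Intelligence", "Big Data", "Internet of Things", "Data Science"]

pytrends = TrendReq(hl="en-US", tz=360)
pytrends.build_payload(kw_list=terms, timeframe="2004-01-01 2020-12-31", geo="AU")

interest = pytrends.interest_over_time().drop(columns=["isPartial"])
interest.plot(figsize=(10, 5), title="Relative search interest (Google Trends, AU)")
plt.show()
```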

Wow! What a ride this discipline / industry has been on… from Business Intelligence through to Analytics, Big Data, IoT and Data Science. And yet there is more on the horizon, with edge computing / analytics and more… just look at this emerging technology Hype Cycle from Gartner.

What will the impact be for business of the upcoming:

  • 5G
  • Progress and trust in AI
  • Lower cost hardware and moving beyond silicon
  • Edge compute
  • Digital me

Regardless of what the terms mean to you, at the end of the day what you should care about is the capability that you leverage to shape and execute your business strategy!

What do you think? 

References

Wikipedia: https://en.wikipedia.org/wiki/Business_intelligence

Wikipedia: https://en.wikipedia.org/wiki/Analytics

Wikipedia: https://en.wikipedia.org/wiki/Big_data

Wikipedia: https://en.wikipedia.org/wiki/Internet_of_things

Wikipedia: https://en.wikipedia.org/wiki/Data_science

Wikipedia: https://en.wikipedia.org/wiki/Artificial_intelligence

Google Trends: https://trends.google.com/trends/?geo=AU

The History of Artificial Intelligence, Harvard University, Graduate School of Arts & Sciences: http://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence/

5 Trends Drive the Gartner Hype Cycle for Emerging Technologies, 2020: https://www.gartner.com/smarterwithgartner/5-trends-drive-the-gartner-hype-cycle-for-emerging-technologies-2020/

Are you interested in knowing more?

Let’s talk, we can help you!

Contact | Lucid Insights

Check out the Lucid Insights blog

There is a variety of content that may help you to improve your business!