Leveraging Curated Data for Strategic Decision Making

Navigating today’s volatile business landscape without top-tier data is like trying to predict a hurricane with last month’s weather report. It’s not just reckless; it’s downright dangerous. Quality, up-to-date information is the Doppler radar for your business, helping you see through the unpredictable market conditions to make decisions that aren’t just reactive guesses but proactive strategies. After all, facts are as unyielding as the laws of nature: they don’t bend to our wishes or fears.

Cautionary tales underscore the critical importance of data accuracy in our interconnected, information-driven world. The 1999 NASA Mars Climate Orbiter failure serves as a stark reminder of how a simple unit conversion error, undetected due to inadequate data validation, led to the spacecraft’s destruction. Similarly, the 2008 financial crisis highlighted the dangers of relying on flawed historical data models that failed to account for the risks of subprime mortgages, resulting in a global economic downturn. More recently, the 2020 UK A-Level grading algorithm controversy demonstrated the pitfalls of not thoroughly validating the impact of data models on diverse student groups, leading to widespread perceived unfairness and policy reversal.

So what’s the solution? It’s not just one thing. Instead, multiple factors come into play: the breadth and curation of your data, as well as its integration into a traditional programming environment or into high-level AI/LLM analysis.

Enter the Wolfram Knowledgebase, a comprehensive data system that underpins Wolfram Language, with vast, curated datasets that are immediately computable—no data cleanup or excessive tagging/categorization is required because that’s already been done for you.

Data Scope

The Wolfram Knowledgebase is a comprehensive repository of curated data across multiple areas: from physical sciences (elements and molecules), medical data (gene properties) and astronomical information to economic indicators (stock prices), country statistics and cultural data, including historical events and notable personalities, along with internet-based metrics.

Knowledgebase entities

As Alan Joyce, Director of Content Development at Wolfram, describes it, “Basically, the Wolfram Knowledgebase covers pretty much every type of known information about the world that you can think of.”

This data is then organized into hundreds of different entity types that represent either physical “things,” including types of animals (dogs and cats), movies (Casablanca to Barbie) and countries (the US and France), or mathematical and other scientific concepts such as space curves, mathematical functions and periodic tilings.
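As a small illustration of how this looks in practice, here is a sketch of retrieving curated properties from built-in entities ("Population" and "GDP" are typical Country properties in the Entity framework):

```wolfram
(* Look up a curated property of a built-in entity *)
Entity["Country", "France"]["Population"]

(* Fetch the same kind of data for several entities at once *)
EntityValue[
 {Entity["Country", "UnitedStates"], Entity["Country", "France"]},
 "GDP"]
```

Because the results come back as symbolic Quantity values rather than raw numbers, they can be fed directly into further computation or visualization.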

Plus, you’re not limited to only the data in the Wolfram Knowledgebase. You can also add entities for your own datasets and import that information to use with what’s already in the Wolfram entity framework.
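A minimal sketch of what that can look like, using a hypothetical "product" entity type (the type name, entities and "Price" property here are invented for illustration):

```wolfram
(* Register a hypothetical custom entity type backed by your own data *)
store = EntityStore[
   "product" -> <|
     "Entities" -> <|
       "widget" -> <|"Label" -> "widget", "Price" -> Quantity[9.99, "USDollars"]|>,
       "gizmo" -> <|"Label" -> "gizmo", "Price" -> Quantity[24.50, "USDollars"]|>
     |>
   |>];
EntityRegister[store];

(* Custom entities now behave like built-in ones *)
Entity["product", "widget"]["Price"]
```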

Data Management

The breadth of available datasets is the first thing that sets the Wolfram Knowledgebase apart from other online databases. But this information is not just a hodgepodge of material pulled from different sources. Instead, as detailed by CEO Stephen Wolfram, it’s carefully evaluated, selected and processed for integration within the Wolfram tech stack.

Data management process

The process of transforming raw data into coherent computable data follows several stages:

1. Collection from records, surveys and experiments, followed by conversion into digital formats such as .txt, .pdf and .xls files.

2. Systematic storage in cloud repositories with metadata, organized in structured formats using JSON and XML.

3. Canonical symbolic representation of quantitative elements like quantities, dates and geolocations, as well as of standard entities such as countries, species and chemicals.

4. Normalization and manual or automated analysis, so that all entities, including custom ones, are uniformly consistent.

5. Addition of computations for derived properties like interpolations, formulas and models, along with natural language mappings and access via tools like Wolfram|Alpha.

The final goal is to ensure the data is suitable for repeated, systematic computations in Wolfram Language.
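The canonicalization step described above can be seen in miniature with Interpreter, which maps free-form strings to the symbolic forms the Knowledgebase uses internally:

```wolfram
(* Free-form strings become canonical symbolic objects *)
Interpreter["Quantity"]["3 kilograms"]   (* Quantity[3, "Kilograms"] *)
Interpreter["Date"]["June 23, 1912"]     (* a symbolic DateObject *)
Interpreter["Country"]["USA"]            (* Entity["Country", "UnitedStates"] *)
```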

In contrast, a resource such as the Google-curated Dataset Search indexes available online datasets far more indiscriminately, and many of those datasets receive limited (or no) manual curation, unlike the Wolfram Knowledgebase. And other information sources such as Quandl and OpenWeatherMap are more narrowly focused, which can make cross-referencing multiple dataset points more labor intensive due to the need for data alignment across multiple file types and categories.

Programming Integration

The final piece is the Wolfram Knowledgebase’s integration into Wolfram Language through sophisticated semantic representations such as Quantity and DateObject. This makes data analysis and visualization part of a unified framework as opposed to juggling data libraries from varied sources, multiple pieces of software and different programming languages.
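A quick sketch of what those semantic representations buy you: units and dates are first-class symbolic objects, so arithmetic on them is automatically unit- and calendar-aware.

```wolfram
(* Unit-aware arithmetic: meters are converted before adding *)
Quantity[5, "Kilometers"] + Quantity[300, "Meters"]

(* Explicit unit conversion *)
UnitConvert[Quantity[1, "LightYears"], "Kilometers"]

(* Calendar-aware date arithmetic *)
DateObject[{2024, 1, 1}] + Quantity[90, "Days"]
```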

UK air temperature map

“The Wolfram Knowledgebase isn’t just a big database that you can go and pluck data out of to put somewhere else,” says Joyce. “Its power comes from the fact that it’s all integrated with the Wolfram tech stack for analysis, visualization and deployment to the web.”

For example, with just a few lines of Wolfram Language code, you can easily generate an up-to-date UK air temperature map.
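A minimal sketch of that kind of map, assuming a handful of hand-picked city coordinates (the published example presumably samples a denser grid and adds contour styling):

```wolfram
(* Approximate coordinates for London, Manchester and Edinburgh *)
locs = GeoPosition /@ {{51.51, -0.13}, {53.48, -2.24}, {55.95, -3.19}};

(* Fetch live air temperatures and plot them on a map *)
GeoBubbleChart[AssociationThread[locs -> (AirTemperatureData /@ locs)]]
```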

But the possibilities of data analysis and visualization with Wolfram Language extend even further. As George Danner, president of Business Laboratory, says about the software, “It can work on very large, very diverse datasets, including unstructured datasets, [and] can do just about whatever we ask and perform any kind of function. We’ve used statistical functions. We’ve used the optimization functions… to put together very fine-grained visualizations of the systems at hand, and these visualizations are critical for allowing our complicated models to communicate in non-mathematical ways to our audience, which are often company executives.”

Sustaining Success through Data-Driven Insights

Utilizing the Wolfram technology stack for data analysis can drive significant advancements and innovative solutions across multiple fields. For example, perhaps you need to cross-reference geographic, seismic exploration and imaging data to find new oil reservoirs. Or if you’re in power generation, maybe you want to use thousands of different inputs—everything from weather forecasts to plant repair time—to create predictive energy load models for power plants. And in finance, you may have to pull together data such as asset prices, cost of goods and inflation to analyze and predict economic trends.

One fact remains clear: in today’s digital age, quality data is the currency of success, and that requires multiple information streams to stay ahead of the curve. As you navigate the business landscape, remember that it’s not about weathering the storm—it’s about harnessing the power of data to create your own blue sky opportunities.

Contact the Wolfram Consulting Group to learn more about using the Wolfram tech stack, curated data and custom AI tools to generate actionable business insights.