Building Software Applications Is Not Like Lego Bricks

When it comes to constructing software infrastructure, companies should avoid playing with independent Lego bricks. Instead, why not invest in a complete Lego castle? Michael Feindt, Chief Scientist and Strategic Advisor at Blue Yonder, discusses.
As organizations’ data sets expand exponentially, the need to harness that data and channel it in the right direction is becoming more pronounced. The role of artificial intelligence (AI) and machine learning (ML) in this process is now vital as companies look to leverage historical data for future efficiencies and revenues.

However, creating and building an AI system for this purpose is more complicated than many would like to believe. As data expands, businesses tend to see their infrastructure as a game of Lego: when the infrastructure needs broadening, a new Lego brick can simply be added.

Unfortunately, given the millions or billions of variables and data sets that need transporting and managing with each new connection, this is simply not a feasible solution. Instead, businesses should look at the Lego castles, offered by third-party AI experts, that already contain the connected bricks needed to fulfill their AI ambitions.

Big Data Means Big Complexity

Delving further into this complexity, it’s not just the sheer volume of data that makes independent brick building an unsuccessful method. It’s the way that data and pertinent information intermingle.

In the big data era, a retailer starts with seemingly simple single-case questions, such as how many stores and how many items it has. But these then cross over with the selling history of those items, the store locations, the prices of different products and seasonal trends. The amount of information per single case keeps growing.

From an AI standpoint, these dimensions culminate in millions of predictions conducted in real time, every day. Each resulting decision follows from innumerable overlapping inputs, a tower of Lego blocks in which every factor depends on other factors.

Even more complicated is the element of time causality. ML learns from the past, so time ordering becomes another critical element to take into account, all while the database grows larger and the stakes of accurate forecasting grow higher.
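To make the time-ordering point concrete, here is a minimal sketch, assuming a Python stack with pandas and scikit-learn; the data frame and column names are invented for illustration, and this is not Blue Yonder’s implementation. Scikit-learn’s TimeSeriesSplit only ever validates a forecasting model on data that comes after the data it was trained on, so no information leaks backward in time.

```python
# Minimal sketch: respecting time causality when validating a forecast.
# The columns ("date", "units_sold") are hypothetical stand-ins.
import pandas as pd
from sklearn.model_selection import TimeSeriesSplit

# Hypothetical daily sales history, oldest rows first.
sales = pd.DataFrame({
    "date": pd.date_range("2023-01-01", periods=365, freq="D"),
    "units_sold": range(365),  # stand-in for real demand data
}).sort_values("date")

# Each split trains on the past and tests on the future only,
# so the model never "sees" data that postdates the forecast.
tscv = TimeSeriesSplit(n_splits=5)
for train_idx, test_idx in tscv.split(sales):
    train, test = sales.iloc[train_idx], sales.iloc[test_idx]
    assert train["date"].max() < test["date"].min()  # past -> future only
```

A random shuffle split would break this guarantee and quietly inflate accuracy estimates, which is exactly the kind of subtlety that gets lost when forecasting is treated as one more interchangeable brick.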

The Struggle For Simplicity

It is this idea of complexity that companies are struggling with. The issue has as much to do with human instinct as with technical understanding.

People’s instinct is to crave simplicity. Modularization fits that instinct perfectly: as tasks become more complicated, we simply bolt on another component to cope with the challenge.

Even in science, this is a familiar concept. To draw conclusions, scientists often isolate variables and make simple connections to inform wider theories. It’s a process people love: very few inputs yielding small outputs via a simple formula. It’s much like playing with Legos.

To modularize a process that accounts for millions of evolving, single factors and millions of ultimate decisions feels perfect and logical. But in data science, it can’t be that simple.

In fact, this way of developing a forecasting infrastructure costs companies more money and time, and makes it harder to enact their AI ambitions.

Finding A Fit-For-Purpose Approach

The inclination to make a complex world more manageable by simplifying it often takes the form of assuming that things are independent or linear. If we can deal with situations on a case-by-case basis, each one becomes much easier for a human brain to compute and resolve.

That’s why the Lego scenario is so appealing. One bit of input impacts the next and so on. And the end result is something grand, despite all the small simple steps taken to get there.

From a data engineering perspective, this approach can work. It used to be the de facto approach because organizations were dealing with very simple data sets. If that is still the case, then a small connection process can cater to that need. But very few companies are dealing with small quantities of information.

To try that same approach with billions of strands of information means transporting data and recalculating at every new module, an extremely time-consuming and expensive proposition. The alternative is to build an elaborate and complex interface in-house to exchange more information. However, given how new most companies are to this level of technology, and given the existing bias toward simplicity, that is unlikely to result in a fit-for-purpose system.

You Need A Castle To Become A King

We need to veer away from the idea, even among IT specialists, that independence is efficient and that working with simple formulas can streamline a large data project.

This level of wishful thinking can quickly get out of control upon the realization that you have to duplicate so many strands of raw data across innumerable new building blocks, with no real concept of an end tower in mind.

A trend in recent years shows companies running into this hurdle and trying to counter it by simply acquiring new “bricks,” then working out how to piece them together. At this stage, not only are time and money being wasted, but the likelihood of making the right connections is also put at risk, given how advanced this level of data science is.

To use another analogy, it would be like ordering a modular sofa, but instead of requesting four or five components to form a very recognizable final structure, you request every nut, bolt, fabric and thread and hope you end up with something comfortable.

Striving for simplicity is a psychological hurdle to overcome, but the impact of the Covid-19 pandemic has pushed organizations in the right direction. With more pressure on financial efficiency and on making every investment count, getting large-scale digital transformation projects right the first time is pivotal.

With so much emphasis on making stronger AI-based predictions, it’s time to stop playing on your own with Lego bricks and instead outsource the building of a complete castle.

