Operations

Do you really need all this data?

By EconLearner · January 15, 2026 · 6 Mins Read

Not every decision requires a deep reservoir of data. Many business and government decisions can be made with a smaller amount of information, provided it is the right information. So the big question is: which data do you need to make the best decision?

A new algorithm co-created by Amine Bennouna, an assistant professor of operations at the Kellogg School, guides decision makers toward exactly that critical information.

Developed with MIT collaborators Omar Bennouna, Saurabh Amin, and Asuman Ozdaglar, the method identifies the critical data decision makers need to find the optimal solution to a specific problem, whether in hiring, supply-chain optimization, or large public-works projects. As a result, the algorithm can help decision makers arrive at the best solution while minimizing the time and money they invest in gathering data.

It flips the script on data-driven decision making: the answer lies not in simply throwing more and more data at a problem, but in being smart about which data to collect.

“It’s not about the size [of the data] itself; it’s about which data matters,” says Bennouna. “Instead of scaling and scaling, it’s more strategic to target where to study your system or where to get data.”

Optimization under uncertainty

You may not realize it, but the mathematical method of linear optimization is ever-present in the modern world. From routing flows on energy networks to balancing investment portfolios, linear optimization uses algorithms to calculate the best possible solution from a universe of possibilities, based on the available data.
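To make the idea concrete, here is a minimal sketch of a linear program in Python, solved with SciPy’s general-purpose linprog routine. The products, profits, and resource limits are invented for illustration and are not drawn from the researchers’ work.

```python
# A toy linear program: choose production quantities x1, x2 to maximize profit
# subject to limited machine hours and raw material.
# SciPy's linprog minimizes, so the profit coefficients are negated.
from scipy.optimize import linprog

profit = [-3.0, -5.0]            # maximize 3*x1 + 5*x2 by minimizing its negation
A_ub = [[1.0, 2.0],              # machine hours used per unit of each product
        [3.0, 1.0]]              # raw material used per unit of each product
b_ub = [100.0, 90.0]             # available machine hours and raw material

result = linprog(c=profit, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
print("optimal production plan:", result.x)
print("maximum profit:", -result.fun)
```

Real applications swap in thousands of variables and constraints, but the structure, a linear objective under linear constraints, is the same.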

That said, there are limits to how reliably linear optimization can identify the optimal decision. The biggest is uncertainty: some inputs can only be estimated as a range, not an exact number. The more estimates a model relies on, the less accurate its result, which is a problem for some applications.
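As a toy illustration of why range estimates are a problem (the numbers below are invented): suppose a planner must pick between two options, where option A’s cost is known exactly and option B’s is only known to lie in a range that straddles A’s. Until that range is narrowed, neither choice can be certified as optimal.

```python
# Two candidate decisions; option B's cost is only known as a range.
cost_a = 10.0                  # known exactly
cost_b_range = (8.0, 13.0)     # only an estimate: somewhere between 8 and 13

# At the low end of the range, B is the better choice; at the high end, A is.
for true_cost_b in cost_b_range:
    best = "B" if true_cost_b < cost_a else "A"
    print(f"if B truly costs {true_cost_b}, the optimal choice is {best}")

# Because the range straddles cost_a, no decision can be certified as optimal
# until the uncertainty about B is reduced, for example by commissioning a study.
```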

“Linear optimization is a beautiful discovery. It has allowed us to solve many very important decision-making problems that we couldn’t before,” says Bennouna. “But if you just assume that’s the perfect model of your problem and solve it, you’re probably going to be disappointed. Reality isn’t exactly your model, so things are going to deviate.”

Decision makers can reduce uncertainty and get clearer answers by conducting more studies and adding more and better data to their models. But this process can quickly become costly.

For example, imagine that you are the chief engineer for the construction of a new subway line through a large city. You know the start and end points of the line, but you must determine the path between them that minimizes construction cost. Many of the factors that drive that cost are highly uncertain and can only be pinned down through extensive field studies.

In a perfect world, the engineer would conduct one study after another throughout the city to determine the exact cost of building each possible route. But in the real world, this isn’t financially feasible, and it’s probably also a waste – some studies will give you more useful information than others.

“A million data points can be equivalent to two data points depending on how relevant they are to what we’re trying to do with them,” says Bennouna. “We want to reduce the uncertainty that matters most to the decision—determining exactly the data that allows you to find the optimal decision.”

A more practical solution

Previous mathematical efforts to solve this problem have focused on settings where the decision maker collects some data, runs their model, and then uses the results to decide where to look next.

A classic example is the “secretary problem,” a scenario in which an employer interviews applicants one at a time and must decide on the spot whether to hire each one. Mathematicians have devised stopping rules that tell the employer how many candidates to observe before committing to the next standout, maximizing the chance of landing that dream hire.
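For the curious, a rough sketch of the classic stopping rule for the secretary problem is shown below: skip roughly the first n/e applicants, then hire the first one who beats everyone seen so far. The interview scores are simulated purely for illustration.

```python
import math
import random

def secretary_rule(scores):
    """Classic 1/e stopping rule: observe the first n/e candidates without hiring,
    then hire the first candidate better than everyone seen so far."""
    n = len(scores)
    cutoff = int(n / math.e)
    benchmark = max(scores[:cutoff]) if cutoff > 0 else float("-inf")
    for i in range(cutoff, n):
        if scores[i] > benchmark:
            return i                      # hire this candidate
    return n - 1                          # otherwise, forced to take the last one

random.seed(0)
applicants = [random.random() for _ in range(20)]   # simulated interview scores
hired = secretary_rule(applicants)
print("hired candidate", hired, "with score", round(applicants[hired], 3))
print("best score in the pool was", round(max(applicants), 3))
```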

But this sequential process doesn’t work in many real-world scenarios. In the subway example, the engineers cannot wait for the results of one study to come back before starting the next; otherwise, the preparation alone would take years.

“Sometimes, these experiments take so long that we can’t wait for one to finish before moving on to the next. And in a company, an approach with too much customization and change makes it even more difficult to implement,” says Bennouna.

Bennouna and colleagues’ algorithm takes a different approach. It calculates the minimum sufficient data set, that is, the smallest set of data that can be used to reach an optimal decision. This gives the decision maker a more selective and manageable set of factors to investigate up front to reduce uncertainty.

In the recruitment scenario, for example, this might mean first identifying a subset of candidates that should be moved to the interview stage, rather than selecting the next candidate after each interview outcome. In the subway example, this could translate into finding the set of locations where cost studies should be prioritized.
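The article doesn’t spell out the algorithm itself, but the underlying idea can be sketched with a brute-force toy in the spirit of the subway example: given interval estimates for each segment’s cost, find the smallest set of segments that, once studied and pinned down, guarantees one route is optimal no matter how the remaining uncertainty resolves. All routes, cost ranges, and the exhaustive search below are invented for illustration and are not the researchers’ method.

```python
from itertools import combinations, product

# Toy routing problem (all numbers invented). Each segment's construction cost
# is only known as a (low, high) range until a field study pins it down.
ranges = {"A": (4, 9), "B": (2, 8), "C": (1, 7), "D": (10, 16)}
true_cost = {"A": 5, "B": 7, "C": 2, "D": 12}          # revealed only by a study
routes = {"route1": {"A", "B"}, "route2": {"A", "C"}, "route3": {"D"}}

def guaranteed_optimum(studied):
    """Return a route that is optimal under every possible realization of the
    unstudied segments, or None if no route can be certified yet."""
    free = [s for s in ranges if s not in studied]
    for name, segs in routes.items():
        certified = True
        # Checking the extreme values of each unstudied range is enough here,
        # because every cost difference between routes is linear in the segment costs.
        for values in product(*[ranges[s] for s in free]):
            realization = dict(zip(free, values))
            realization.update({s: true_cost[s] for s in studied})
            cost = {r: sum(realization[s] for s in rs) for r, rs in routes.items()}
            if cost[name] > min(cost.values()):
                certified = False
                break
        if certified:
            return name
    return None

def smallest_sufficient_studies():
    """Enumerate study sets from smallest to largest and return the first one
    that certifies an optimal route."""
    segments = list(ranges)
    for size in range(len(segments) + 1):
        for studied in combinations(segments, size):
            route = guaranteed_optimum(set(studied))
            if route is not None:
                return set(studied), route

studies, route = smallest_sufficient_studies()
print("smallest sufficient set of studies:", studies or "none")
print("route certified optimal:", route)
```

On this invented instance, pinning down just two of the four segments is enough to certify one route as cheapest; the remaining studies would add cost without changing the decision, which is the intuition behind a minimum sufficient data set.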

“We’re thinking about collecting data in a more practical way in settings where you have to experiment at the same time,” says Bennouna. “The key idea is to find data that informs decisions in the best possible way.”

More effective data-driven decisions

But there is another real constraint that interferes with optimal decision-making: budgets. Even if governments and companies want to choose the best possible option, sometimes there is only enough money for a “good enough” solution.

Bennouna and his colleagues are now working on an extension of their algorithm that takes this reality into account.

“Maybe you just want to know what’s the best thing you could do with that budget and how that would change your data requirements,” says Bennouna. “We want to be able to quantify the trade-off between the optimal decision and the type and size of the data.”

The team is also looking at how their approach could be applied to different types of decision-making tasks beyond those modeled by linear optimization, such as the process online retailers use to optimize their inventory across locations.

The researchers’ concept of “data efficiency” could be extended to other problems. For example, it could potentially help improve the environmental performance of energy-intensive computations used by large language models by selecting the most relevant data for training the models.

“We have these models that take all the data online and extract knowledge, and we’re getting better and better at it. But the more data [there is], the more expensive these algorithms are, and we are already approaching the limit of what we can do,” says Bennouna. “So the question is going to be, really, what specific data do you need and how do you be effective in that sense?”
