Chapter 1. Why Does the IT Sector Lack Foresight?
Despite a long history of mutual influence between Strategic Foresight (SF) and Information Technology, foresight remains a rare beast in modern Big Tech. Even as these companies loudly proclaim that they "operate strategically," SF and Futures Studies (FS) practices are largely absent from the industry.
This isn't inherently good or bad, and the tide is slowly turning. But before we explore what’s emerging, we need to examine the historical baggage that shaped this landscape—and establish a baseline for why tech is so remarkably blind to its own future.
1. The Bubble Ego
We often use the "bubble" metaphor to describe the IT sector's 30+ year expansion, but we forget what it feels like to live inside one. When you’re in a rapidly inflating bubble, it’s easy to believe you are the one defining the future. Who else could it be?
Even if the facts don't back it up, the perceived potential does. Even a junior dev in the middle of nowhere "knows" that it was their tribe that gave the world Windows, Google, and GenAI. It’s your party reshaping the world through "Digital Transformation," your global nation of digital nomads, your ecosystem redesigning everything else in its image. In this worldview, there are no alternative futures—only the one you are currently coding. Why bother thinking about alternatives when you’re the one holding the pen?
2. The Infant Behemoth
Computer scientists love tracing their lineage back to Ada Lovelace and Charles Babbage, but the global IT industry is still a screaming infant. A very large infant, admittedly, but an infant nonetheless. It’s only been since 1998 that tech took the top spot in the S&P 500—right before the dot-com crash sent it to the nursery for a decade[1].
Beyond the "dinosaurs" like IBM or Microsoft, most industry leaders are startlingly young. Meta has never even experienced a CEO transition or a major generational leadership change. For a company barely two decades old, the world hasn't "changed" much: it emerged into a reality where cloud computing and iPhones were already the status quo. This breeds a dangerous delusion: a belief in the eternal validity of the founder’s vision and their innate ability to define the future by fiat.
3. The "Not My Problem" Defense
A company can run charity marathons, donate blood, and plaster its annual report with ESG buzzwords, but that doesn't make it responsible for societal well-being. Capitalist firms rarely adopt self-restraint unless someone holds a metaphorical gun to their head[2].
If you aren't Amazon—directly touching millions of physical doorsteps—and your core product sits comfortably above the "survival" layers of Maslow’s hierarchy, societal consequences are easy to dismiss as externalities. The sector generally feels zero accountability for ecological damage, cultural erosion, or long-term social cohesion—even while practicing the art of enshittification.[3] If you don’t feel responsible for the climate, you don't bother checking the weather forecast for ten years out.
4. The Gartner Curse
Foresight and strategy aren't for everyone; they require a level of organizational maturity that many tech firms haven't reached. But by the time a company is old enough to care, it’s already infested by a dense ecosystem of VCs, advisors, and MBA-toting managers.
These people arrive with a ready-made handbook of all the "wrong right answers." Ask an IT strategy team for a budget, and they’ll ask for a Gartner subscription. It’s the safe bet. It’s easy to justify a six-figure report from a legacy analyst to a Board of Directors. Thinking through futures independently is risky; buying a pre-packaged "Trend Report" is comfortable.
5. Total Trend-Washing
In tech, "strategy" is usually what we’re doing next quarter. Everything else dissolves into the frantic rhythm of two-week sprints. We’ve reached a point where tactical discussions—the vital space between "what we’re doing Monday" and "where we’re going in 2030"—have vanished.
This is a linguistic failure. We call tactics "strategy" and mistake short-lived fashions for "trends." In this environment, a "long-term strategy" means three years[4]. In classic foresight, that’s barely a coffee break. We are obsessed with the direction of the wind because we’ve forgotten that the climate even exists.
6. The Inadequacy of Classic Foresight
To be fair, classic foresight is often a terrible fit for the tech world. Refined over 50 years, it struggles to speak the language of product-centric cultures and platform architectures.
Most "strategy sessions" in tech look like a collision between two incompatible species. You get a moderated brainstorming exercise labeled "Strategy" just to unlock the budget, but there is no bridge from a PESTEL analysis to a product roadmap. You can't easily translate a 20-year demographic shift into a Jira ticket. In the clash between long horizons and short iteration cycles, the sprint always wins.
7. It Was All Too Easy
After the dot-com bust, the industry developed a powerful vaccine: Customer Development. We learned that we can’t just build whatever we want; we have to validate it.
A/B testing and usage metrics became the new religion. This tight synchronization with the user allows us to postpone the "big" questions of responsibility until the next crisis hits. We’ve created an illusion of co-evolution: we’re sensitive to minor UI preferences, but blind to massive social outcomes. We claim responsibility for progress while remaining strikingly irresponsible for consequences.
Do you want it to look good, or do you want it to work? That is the unspoken social contract of the modern IT industry.
The landscape isn't entirely bleak, and in the next chapter, we’ll look at what is changing and why. But in the meantime, ask yourself: when was the last time you saw a tech company systematically exploring scenarios decades ahead? Not as a "visionary" quote in a PR interview, but as a grounded, rigorous practice?
I’ll wait.
- However, the Technology sector "recovered" and regained leadership only 10 years later, in 2008, when the Financial sector was hit by a crisis of its own. ↩︎
- In early 2026, Microsoft announced a "5-point plan to partner with local communities across the U.S.," committing to build a "community-first AI infrastructure". In this attempt to be a "good neighbor," the company claims to be taking steps to minimize water usage, ensure electricity prices don't rise for the community, and so on. However, this does not read as a humble move; rather, it is positioned as a political step timed around America's 250th year of independence, complete with a preliminary announcement by President Trump, and arriving after years of reports on the negative impact of data centers. ↩︎
- The term "enshittification" was popularized by Canadian writer Cory Doctorow:
"Here is how platforms die: first, they are good to their users; then they abuse their users to make things better for their business customers; finally, they abuse those business customers to claw back all the value for themselves. Then, they die. I call this **enshittification**..." ↩︎
- Interestingly, this matches the inherently short life cycle of many software products, with many "eternal" apps and services turning into bricks just a few years after release. For example, as WIRED reported, customers have no guarantee a product won't be discontinued or frozen because a major company like Meta decided to pivot from the Metaverse to AI and laid off its staff. As a result, users of the Supernatural fitness app, accustomed to using a VR headset for their workouts, are now left without updates, losing the very thing that made them love VR in the first place. ↩︎