Imagination Economy

This is a primer on the Imagination Economy.



A Blue Future

Ask an image generator to show you “the future.” The result will almost certainly be blue. Gleaming skyscrapers and smooth surfaces, a vaguely Asian cityscape glowing under a sapphire sky. Ask a different model. Same blue. Ask it in a different language. Still blue. The machines trained on decades of stock imagery, science fiction covers, and corporate pitch decks have internalized a single aesthetic register for “tomorrow.” And that register has nothing to say about care work, grief, or the distribution of power.

This is where the “Imagination Economy” starts to get interesting. The term has been circulating since around 2018 in business strategy, technology commentary, and creative economy discourse, initially as a vague gesture toward post-knowledge-economy thinking. It acquired its current urgency with the arrival of generative AI in 2022: if machines can generate text, images, and code, the ability to imagine what to generate becomes the defining economic bottleneck, more scarce than knowledge, attention, or labor. Those who can envision new products, futures, and worlds will thrive. Those who cannot will be left prompting machines for blue cities.

The claim deserves scrutiny. Because the term itself functions as what Future Imaginaries research would call a narrative artifact: a story about the future that, by circulating widely enough, begins to shape the present it describes. Treating it as a program to follow misses the point. The interesting question is what the emergence of this term reveals about where we are.

What is the “Imagination Economy”?

The “Imagination Economy” is a label for an emerging economic paradigm in which human imagination, the capacity to envision what does not yet exist, becomes the primary scarce resource and source of value creation. The premise: as AI automates knowledge work, data processing, and even creative execution, the bottleneck shifts from doing to conceiving. Organizations and individuals who can imagine genuinely novel futures, products, and solutions gain a structural advantage.

There is no canonical text. The term has appeared in at least six independent strands of discourse since 2018, used by business strategists, technologists, cultural critics, and creative economy researchers. Nobody has defined it with the precision of, say, Sheila Jasanoff’s Sociotechnical Imaginaries or Jens Beckert’s Fictional Expectations. It floats.

This looseness is a feature, not a bug. The term functions precisely because it is vague enough to mean different things to different audiences. A management consultant reads “untapped resource.” A cultural critic reads “new frontier of extraction.” A foresight practitioner reads “another future narrative to analyze.” All of them are onto something.

Precursors

The concept has a longer genealogy than most of its current advocates acknowledge. Charlie Magee coined the phrase “Imagination Age” in 1993, arguing that post-industrial economies would increasingly depend on creativity and mental modeling rather than physical production.1 Rita J. King elaborated the idea in 2007, connecting it to the emerging social web and distributed creative collaboration.2 Martin Reeves and Jack Fuller’s The Imagination Machine (2021) brought the concept into mainstream business strategy, arguing that imagination is a “muscle” organizations can systematically train and deploy.3

What changed in 2024 and 2025 was the maturation of generative AI. Tools that could produce text, images, code, and music from natural language prompts made the question visceral: if a machine can generate a serviceable novel, a photo-realistic cityscape, a working prototype, what exactly is left for humans? The answer, for proponents, was imagination itself: the capacity to know what to ask for.

Stratification of Scarcity

The Imagination Economy does not arrive in a vacuum. It sits atop a layered sequence of economic paradigms, each identified by its defining scarce resource:

Knowledge Economy (1960s onward): Peter Drucker’s insight that knowledge workers, not factory workers, drive economic value. Information becomes the critical input. Universities, research labs, and consulting firms flourish.

Attention Economy (1990s onward): Michael Goldhaber’s prescient 1997 argument that in an information-rich world, attention becomes the scarce currency. Social media platforms, algorithmic feeds, and the entire advertising industry operate on this logic.

Creator Economy (2010s): The platform era where individuals monetize their own creative output directly. YouTube, Substack, Patreon. Value shifts from institutions to individuals, but remains tethered to platform infrastructure.

Imagination Economy (2020s): The claim that AI handles execution, so the scarce input becomes what to execute. Vision, taste, the ability to specify what has never existed.

This is not a linear progression where each paradigm replaces the last. It is a stratification: each layer persists while a new scarcity emerges on top. Knowledge workers still exist. Attention is still fiercely competed for. Creators still struggle with platform dependency. But the argument goes that imagination now sits at the apex of the value chain.

The framing is seductive. It is also worth interrogating: who benefits from declaring imagination the new bottleneck? Certainly the companies selling AI tools that promise to “amplify” it. Certainly the consultants offering to “unlock” it. The economic framing is not neutral. It primes us to treat imagination as a resource to be extracted and scaled. The word “economy” does real work in this compound noun.

Four Positions in the Discourse

The conversation around the Imagination Economy clusters into four distinct positions. They are not neatly separated; most serious thinkers draw from more than one. But mapping them clarifies what is actually being debated.

1. Imagination Amplifier

Core claim: AI makes human imagination directly executable. For the first time, the gap between conceiving an idea and realizing it collapses. A person with taste but no technical skills can build a prototype, generate a visual identity, compose a score. AI is an equalizer that democratizes creative agency.

Proponents: Business strategists like Reeves and Fuller, technology optimists, the “vibe coding” community, and reports like RBC’s 2026 analysis identifying imagination as the primary barrier to AI adoption: 97% of firms that did adopt reported massive benefits, yet most struggled to envision what AI could do for their specific context.4

Strongest argument: The empirical observation that people who bring clear intent to AI tools get better results than those who don’t. The tool amplifies what the user already carries. A skilled scenario planner who prompts an AI with detailed parameters gets rich, novel output. Someone who types “show me the future” gets a blue city.
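To make this concrete, here is a minimal sketch of the two prompting styles, assuming the OpenAI Python SDK’s chat interface; the model name, the prompts, and the scenario parameters are illustrative placeholders rather than a recipe. The point is only that a specific prompt constrains the model away from its defaults, while a vague one invites them.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

vague = "Show me the future."

specific = (
    "Write a 2045 scenario for urban care work in a mid-sized European city. "
    "Assume an aging population, municipally owned AI infrastructure, and a "
    "four-day work week. Focus on who does the caring, who gets paid, and what "
    "conflicts emerge. Avoid gleaming-skyline imagery."
)

for prompt in (vague, specific):
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder; any chat-capable model works
        messages=[{"role": "user", "content": prompt}],
    )
    # Print the opening of each response to compare default output against constrained output
    print(response.choices[0].message.content[:300], "\n---")
```

The vague prompt returns the statistical center of the training data; the specific one forces the model to work inside constraints it would never volunteer on its own.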

2. Imagination Simulacrum

Science journalist Lizzie Wade offers the sharpest articulation of this counter-position in her 2025 essay “The imagination economy.”5 What AI produces, she argues, is not imagination but its simulation. The process of imagining, with its friction, dead ends, surprise, and embodied struggle, is where the value resides. A machine that generates plausible outputs from statistical patterns produces simulacra: things that look like the products of imagination without any imagining having occurred.

Wade traces a historical sequence of commodification: first labor (Marx, 19th century), then attention (social media, 21st century), now imagination (generative AI). Each step produces deeper alienation. But she insists there is a qualitative break: attention remains attention even when degraded. “Imagination is not imagination without the process of imagining.”5

Why are we vulnerable? Exhaustion. Decades of operating at the financial, physical, and psychological limit have depleted the capacity for friction-heavy creative work. AI promises imagination without effort. But the effort is the imagination. Strip it away and what remains is, in Wade’s formulation, “not imagination. Not creativity. Not real.”5

The observation that strengthens this position: AI-generated outputs converge. Different users, different prompts, same aesthetic register. The blue city problem at scale. If most users work with a handful of dominant models trained on similar data, the range of what gets imagined narrows even as the volume increases.

3. Creativity Paradox

Core claim: AI simultaneously boosts individual creative performance and degrades collective creative diversity. The benefits are real at the personal level. The costs emerge at the population level, in ways that are invisible to any single user.

Key evidence: A 2024 study in Science Advances by Doshi and Hauser found that access to GPT-4 story ideas increased individual story quality by roughly 9%, particularly for less creative writers. But across all participants, AI-assisted stories were 10.7% more similar to each other than human-only stories.6 AI raised the floor and compressed the range.
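A rough way to see what “more similar to each other” means operationally: compute the average pairwise cosine similarity between embeddings of the stories in each condition. The sketch below uses the sentence-transformers library with a small general-purpose model; it is an illustrative approximation, not the paper’s actual pipeline, and the placeholder story lists stand in for real data.

```python
# pip install sentence-transformers
from itertools import combinations
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedding model

def mean_pairwise_similarity(texts):
    """Average cosine similarity over all pairs of texts; higher means more homogeneous."""
    embeddings = model.encode(texts, convert_to_tensor=True)
    sims = [float(util.cos_sim(embeddings[i], embeddings[j]))
            for i, j in combinations(range(len(texts)), 2)]
    return sum(sims) / len(sims)

human_only = ["story text ...", "story text ..."]   # placeholder: stories written without AI
ai_assisted = ["story text ...", "story text ..."]  # placeholder: stories written with AI ideas

print("human-only :", mean_pairwise_similarity(human_only))
print("AI-assisted:", mean_pairwise_similarity(ai_assisted))
```

A gap between the two numbers, with the AI-assisted set scoring higher, is the population-level compression the study describes: invisible to any individual writer, visible only in the aggregate.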

A 2025 Harvard Business Review study found that AI collaboration increased productivity but reduced intrinsic motivation by 11% and increased boredom in subsequent solo work by 20%.7 The tools make you faster at producing but less motivated to produce on your own.

Strongest argument: The hidden cost structure. Individual gains are visible and immediate. Collective losses are statistical and delayed. This creates a classic social dilemma: every rational individual adopts the tool, and the aggregate result is homogenization that harms everyone.

4. Imagination as Infrastructure

Core claim: The real question is not about individual creativity but about the collective capacity of societies to imagine alternative futures. This is an infrastructure problem, not a productivity problem.

Key thinkers: Geoff Mulgan, whose 2020 paper “The Imaginary Crisis” argues that societies are losing the ability to generate positive visions of the future, creating a “shrinking future” that constrains political and social possibility.8 Jens Beckert, whose concept of Fictional Expectations shows how capitalist economies depend on shared imaginaries of future profit to coordinate present action.9 And the entire tradition of Critical Futures Studies, which examines whose futures get imagined and whose get suppressed.

The Imagination Economy discourse focuses almost entirely on individuals and organizations. The most consequential forms of imagination are collective: the shared assumptions about what futures are possible, desirable, or inevitable that Future Imaginaries research calls the deep structure of social behavior. If those collective imaginaries narrow, it does not matter how many individuals have access to AI tools. The range of thinkable futures shrinks anyway.

The Creativity Crisis

One empirical finding cuts across all four positions and deserves separate attention.

Kyung Hee Kim’s analysis of nearly 300,000 Torrance Tests of Creative Thinking, the most widely used measure of creative ability, found a significant decline in creativity scores since the 1990s. By 2008, 85% of children scored below the 1984 average.10 The decline predates generative AI by decades.

This matters for the Imagination Economy thesis: the term arrives precisely when the underlying capacity it names is in documented decline. If imagination is both the barrier to AI adoption (as the Amplifier position argues) and a diminishing resource at the population level, the Imagination Economy may be a framework that describes a problem it cannot solve.

The Simulacrum Critique

Lizzie Wade’s essay deserves separate attention because it articulates a position that the other three frameworks cannot easily absorb.

Her argument is not that AI produces bad outputs. It is that the category of imagination requires a process, and that process is being systematically bypassed. This is not a quality objection but an ontological one. Just as a photograph of a meal is not food, a machine-generated text that reads like the output of careful thought is not thought. The resemblance is exact. The thing itself is absent.

Wade connects this to political capacity: “Incapable of imagining, let alone fighting for, a better world.”5 The concern is not aesthetic. It is democratic. If imagination is the faculty through which citizens conceive alternatives to the present, and that faculty is outsourced to statistical models trained on the past, the political consequence is a society that can no longer envision its own transformation.

The trap, in Wade’s formulation, is that AI promises imagination without friction precisely at the moment when decades of exhaustion have depleted the capacity for friction. The offer is irresistible. And the cost is invisible because what disappears (the private struggle, the dead end, the revision, the moment of genuine surprise) was never visible in the first place. Only the outputs were.

This critique has a structural blind spot. It treats all AI interaction as consumption rather than collaboration. Practitioners who use AI as a sparring partner, pushing back against its defaults and iterating through multiple cycles, report a different experience: the friction does not disappear; it relocates. The struggle shifts from execution to direction. Whether that relocation preserves the essential quality Wade identifies, or subtly hollows it out, is an open empirical question.

A Foresight Perspective

A Foresight lens adds three dimensions that the mainstream Imagination Economy discourse tends to overlook.

The Term Itself is a Narrative Artifact

From the perspective of Future Imaginaries research, “Imagination Economy” is not a neutral description of an emerging reality. It is a narrative artifact. Analyzing it as a narrative, rather than through it, reveals something: the compound noun smuggles in a market logic. What was once a private faculty becomes a resource with a price tag, subject to the same optimization pressures as any other input.

This is what Beckert’s Fictional Expectations framework predicts: in capitalist economies, the future exists primarily as an imagined state used to coordinate present investment decisions. Declaring imagination the next scarce resource is itself a fictional expectation. It invites capital allocation. It generates consulting revenue. It shapes hiring decisions and educational curricula. The narrative becomes performative: by describing an Imagination Economy, it helps create one.

And performativity has political consequences. In an “economy” of imagination, market logic applies. Imagination that is legible to markets, that can be packaged into products, pitched to investors, and scaled through platforms, gets rewarded. Imagination that resists commodification (speculative fiction that challenges power structures, visions of degrowth, indigenous futurisms) gets marginalized. The frame itself is a sorting mechanism.

Cultural Lead, Not Cultural Lag

The sociologist William F. Ogburn coined the term “Cultural Lag” in 1922 to describe how material culture (technology) typically advances faster than adaptive culture (institutions, norms, values).11 The automobile arrived decades before traffic law. The internet arrived decades before digital regulation. Technology leads; culture follows.

AI inverts this pattern. For at least a century, popular culture has carried narratives about intelligent machines: HAL 9000, Skynet, Ex Machina, Her. The cultural imagination of AI was fully formed long before the technology caught up. ChatGPT did not introduce the idea of talking to a machine. It confirmed a story society had been rehearsing for generations.

This makes AI a case of Cultural Lead: the cultural narrative precedes and shapes the reception of the technology, rather than lagging behind it. The emotional intensity of the AI discourse, the fear, the hype, the existential anxiety, makes more sense through this lens. People are not reacting to what AI does today. They are reacting to a century of accumulated narrative about what AI will be. The fictional expectation runs ahead of the material reality.

For the Imagination Economy concept, this matters: much of what passes for “imagination” about AI’s future is actually recognition, the activation of pre-existing cultural scripts rather than novelty. When executives declare that “AI will change everything,” they are typically not imagining a specific future. They are performing a narrative they absorbed from science fiction and technology journalism. The Cultural Lead means the Imagination Economy may be less about new imagination and more about the recirculation of old ones.

“AI Amplifies What’s Already There”

The most useful synthesis of the Amplifier and Simulacrum positions may be a simple observation: AI amplifies what the user brings.

Those who arrive with a formed perspective, a specific vision, deep domain knowledge, or an unusual aesthetic sensibility get amplification. The tool extends their reach. A foresight practitioner who prompts scenarios with precise parameters and unconventional framings gets outputs that surprise and provoke, because the specificity of the input pushes the model outside its default distributions.

Those who arrive without a formed perspective get defaults. The statistical average of the training data. Blue cities. Generic business prose. Smooth surfaces. The model fills the vacuum with the most probable completion of whatever vague prompt it received. This is not amplification. It is substitution, and the user may not notice the difference.

The implication: the Imagination Economy is real, but only for those who already have imaginative capacity. For everyone else, it is a machine that produces the appearance of imagination while reinforcing the narrowest available defaults. Whether AI serves as amplifier or substitute depends entirely on what the user brings to the encounter.

This loops back to the Creativity Crisis. If imaginative capacity is declining at the population level, AI tools will increasingly function as substitution engines for most users and amplification engines for a shrinking minority. The Imagination Economy, in this reading, is an accelerant of inequality: it rewards the imaginatively rich and impoverishes the imaginatively poor.

Open Questions

The Imagination Economy discourse is young, fragmented, and evolving rapidly. Several questions remain genuinely unresolved:

Whose imagination gets amplified? The power question surfaces repeatedly but never gets systematically addressed. When Sam Altman articulates a vision of AI-driven abundance on global stages, that is imagination operating at civilizational scale, backed by billions in capital. When a community organization in Detroit imagines a neighborhood without surveillance, that imagination has no comparable amplification infrastructure. An “economy” of imagination, like any economy, will have winners and concentrations of power.

What happens during the transition? Most Imagination Economy discourse extrapolates to a stable end state: a world where AI handles execution and humans provide vision. But the transitional period, in which, say, 5% of organizations have adopted the paradigm and 95% have not, will generate its own dynamics. Coordination problems, regulatory gaps, labor displacement, and cultural backlash will shape the terrain long before any equilibrium emerges.

Can collective imagination be rebuilt? Mulgan’s “imaginary crisis” predates AI. Societies were already losing the capacity to envision positive collective futures before generative models arrived. The question is whether AI tools can serve as infrastructure for rebuilding collective imagination (through simulation, visualization, and scenario generation at scale) or whether they will further fragment it into billions of individually optimized but collectively incoherent visions.

Where are the material limits? The Imagination Economy discourse is strikingly silent about physical infrastructure. AI models require enormous energy and computational resources. Hardware supply chains are constrained. The “imagination” the economy promises depends on a material base that is neither unlimited nor equally distributed. An imagination economy for knowledge workers in San Francisco funded by rare earth extraction in the Congo is not a new paradigm. It is an old one wearing new clothes.

Is the term useful or misleading? This may be the most important question. “Imagination Economy” performs two functions simultaneously: it names a genuine shift in what economic activity rewards, and it smuggles in an ideological frame that treats imagination as a commodity. Whether the term illuminates or obscures depends entirely on whether users are aware of this double function. For those who treat it as an analytical lens, it reveals real dynamics. For those who treat it as a program, it reproduces exactly the commodification it claims to transcend.


1. Magee, C. (1993). The Age of Imagination: Coming Soon to a Civilization Near You. Cited in various secondary sources on post-industrial economic theory.

2. King, R. J. (2007). The Imagination Age. Various publications and presentations elaborating Magee’s original concept for the social web era.

3. Reeves, M., & Fuller, J. (2021). The Imagination Machine: How to Spark New Ideas and Create Your Company’s Future. Harvard Business Review Press.

4. RBC (2026). Bridging the Imagination Gap: How Canadian Companies Can Become Global Leaders in AI Adoption. RBC Thought Leadership.

5. Wade, L. (2025). The imagination economy. The Lizzie Wade Weekly.

6. Doshi, A. R., & Hauser, O. P. (2024). Generative AI enhances individual creativity but reduces the collective diversity of novel content. Science Advances, 10(28).

7. Harvard Business Review (2025). Research: Gen AI Makes People More Productive, and Less Motivated. Harvard Business Review, May 2025.

8. Mulgan, G. (2020). The Imaginary Crisis (and how we might quicken social and public imagination). Demos Helsinki & UCL.

9. Beckert, J. (2016). Imagined Futures: Fictional Expectations and Capitalist Dynamics. Harvard University Press.

10. Kim, K. H. (2011). The Creativity Crisis: The Decrease in Creative Thinking Scores on the Torrance Tests of Creative Thinking. Creativity Research Journal, 23(4), 285–295.

11. Ogburn, W. F. (1922). Social Change with Respect to Culture and Original Nature. B.W. Huebsch.
