Information Processing and Entrepreneurship
The Foundations of Information Processing Theory
Information Processing Theory was originally articulated by Allen Newell and Herbert A. Simon in their 1972 seminal work, Human Problem Solving. The core of their research focuses on the mechanics of human cognition, specifically how we take in, store, and retrieve information. They conceptualize the human mind as a complex system composed of specific subsystems, including sensory input, memory storage, and arousal levels, which work in tandem to solve problems.
Environmental Complexity and Information Flow
Hansen and Allen (1992) later adapted this theory to explain and predict the creation of new business ventures. Their premise rests on the idea that different business environments generate information in varying volumes and degrees of diversity. For example, a simple environment—like a local landscaping business—might produce a manageable stream of repetitive data regarding seasonal schedules and fuel costs. Conversely, a complex environment produces a vast quantity of heterogeneous information that can be difficult for one person to navigate.
Industry Dynamics: Simple vs. Complex
To visualize this, consider that a complex environment often mirrors a fast-paced, high-tech industry where regulations, competitor breakthroughs, and consumer preferences shift daily. An example would be the artificial intelligence sector, where a founder must track global hardware shortages, ethical legislation, and rapid software updates simultaneously. In contrast, a simple environment aligns with a traditional, slower-moving industry, such as a small-town bakery, where the core variables—flour prices and local foot traffic—remain relatively stable over decades.
The Power of the Entrepreneurial Network
A single individual often finds it impossible to cope with the "information overload" created by complex markets. However, a team can function as a distributed processor; each individual absorbs and filters a specific portion of the environmental data. Through consistent communication, these individuals share their findings, allowing the collective network to make sense of the "noise." For instance, in a biotech startup, one co-founder might process clinical trial data while another monitors venture capital trends, ensuring the company reacts to the full picture rather than just a fragment.
Communication Frequency and Inter-connectivity
The structure of these networks is critical. Both the frequency of communication and the level of inter-connectivity (or network density) determine how effectively a group can seize opportunities. When individuals communicate frequently with a larger share of the network—rather than siloed groups—the "collective intelligence" rises. This high-density interaction is often the catalyst for formalizing a loose group of collaborators into a structured organization.
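The notion of network density used above has a standard graph-theoretic definition: the share of actual communication links out of all possible pairwise links. The sketch below illustrates the contrast between a siloed team and a fully connected one; the team members and link sets are hypothetical, invented purely for illustration.

```python
from itertools import combinations

def network_density(members, channels):
    """Density = actual communication links / possible pairwise links, n*(n-1)/2."""
    possible = len(list(combinations(members, 2)))
    return len(channels) / possible if possible else 0.0

# Hypothetical four-person founding team
team = ["alice", "bob", "carol", "dan"]

# Siloed structure: two pairs that never talk across the divide
siloed = {("alice", "bob"), ("carol", "dan")}

# High-density structure: everyone communicates with everyone
dense = set(combinations(team, 2))

print(network_density(team, siloed))  # 2 of 6 possible links ~ 0.33
print(network_density(team, dense))   # 6 of 6 possible links = 1.0
```

In this toy model, moving from the siloed to the dense structure triples the density, which is the kind of shift Hansen and Allen associate with rising collective intelligence.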
Strategic Implications for Founders
A key implication of this theory is that prospective entrepreneurs entering complex fields should avoid "going it alone." To succeed, they must team up with others to distribute the cognitive load. By building networks that process information efficiently, they create the necessary infrastructure for organizational growth. Supporting this, research indicates that solo entrepreneurs are less likely to survive and more frequently remain low-growth ventures (Hansen, 1992). Similarly, lone inventors tend to produce less innovative technologies and at a significantly slower pace than collaborative organizations.
Related Theories
Cognition is the ultimate bottleneck. These frameworks explore the mechanics of mental maps, the power of distributed networks, and the adaptive routines needed to survive "Information Overload":
1. Cognitive Infrastructure
- Sensemaking: How founders build plausible maps to filter the noise of complex markets.
- Sleep & Performance: Protecting the biological subsystems that intake and store data.
2. Collective Intelligence
- Network Density: Why high inter-connectivity is required to share findings effectively.
- Dynamic Capabilities: Reconfiguring routines to keep up with high-velocity information flow.