Journal of Policies and Recommendations

OPEN ACCESS

Decision-Making Models in Design Thinking: A Management Perspective

Khatune Jannat Snigdha 1
ORCID: https://orcid.org/0009-0000-9058-7340
Md Seam Hasan 1
ORCID: https://orcid.org/0009-0001-0579-1855
Md. Ashraful Haq Saeed 2
ORCID:
Tanisha Tabashum Mim 2
ORCID: https://orcid.org/0009-0009-9968-8290
1 Department of Graphic Design & Multimedia, Shanto-Mariam University of Creative Technology, Dhaka, Bangladesh
2 Department of Interior Architecture, Shanto-Mariam University of Creative Technology, Dhaka, Bangladesh

Prof. Dr Kazi Abdul Mannan
Department of Business Administration, Faculty of Business
Shanto-Mariam University of Creative Technology
Dhaka, Bangladesh
Email: drkaziabdulmannan@gmail.com
ORCID: https://orcid.org/0000-0002-7123-132X

Corresponding author: Khatune Jannat Snigdha (snigdhojannat2003@gmail.com)

J. polic. recomm. 2026, 5(1); https://doi.org/10.64907/xkmf.v5i1.jopr.2

Submission received: 3 October 2025 / Revised: 9 November 2025 / Accepted: 17 December 2025 / Published: 2 January 2026

Abstract

Design thinking has emerged as a dominant paradigm in contemporary management practice, blending creative problem-solving with user-centred methods. While much literature describes the phases and tools of design thinking, less attention has been devoted to the specific decision-making models that designers, managers, and cross-functional teams enact during design processes. This paper synthesises decision-making theories (rational, bounded rationality, intuitive/heuristic, recognition-primed, dual-process, participatory and collaborative models) and situates them within the praxis of design thinking. Using a qualitative multiple-case study approach, data were collected through semi-structured interviews, participant observation in design workshops, and document analysis of three organisations that adopted design thinking for strategic innovation. Findings show that effective decision-making in design thinking is dynamic, contextually contingent, and often hybrid — combining structured analytic methods with abductive reasoning, rapid prototyping feedback loops, and stakeholder co-creation. The paper outlines a management-oriented conceptual model linking decision model selection to project phase, risk profile, team composition, and organisational culture. Implications for managers include guidelines for choosing and scaffolding decision processes, training recommendations, and suggestions for integrating evidence-based and participatory decision practices into design routines. Limitations and directions for future research are discussed.

Keywords: design thinking, decision making, management, bounded rationality, recognition-primed decision model, qualitative research

1. Introduction

Design thinking has diffused across corporate, governmental, and non-profit sectors as a means to tackle complex, ill-defined problems by combining empathic inquiry, iterative prototyping, and cross-disciplinary collaboration (Brown, 2008; Liedtka, 2018). As organisations adopt design thinking, managers face a practical question: How are decisions made during design processes, and which decision models best support innovation and implementation? Traditional managerial decision frameworks emphasise optimisation and formal analysis (Simon, 1947/1976), yet design work frequently relies on abductive reasoning, intuitive judgment, and collaborative sensemaking (Dorst, 2011; Schön, 1983). Integrating decision-making theory with design practice is essential for managers seeking to structure teams, allocate resources, and formalise learning without constraining creativity.

This paper addresses integration by reviewing decision-making models relevant to design thinking, proposing a theoretical framework that links decision models to stages of the design process and organisational variables, and reporting qualitative empirical findings from organisations employing design thinking. The goal is to produce actionable guidance for managers on how to recognise, encourage, and scaffold appropriate decision processes in design projects.

2. Literature Review

2.1 Design Thinking: Phases and Practices

Design thinking is often represented by iterative stages—empathise, define, ideate, prototype, test—though models vary and many practitioners emphasise fluidity and loops rather than strict sequence (Brown, 2008; IDEO, 2015; Liedtka, 2018). Core practices include user research, rapid prototyping, cross-functional collaboration, reframing of problems, and a tolerance for ambiguity (Cross, 2006; Dorst, 2011).

2.2 Classical Decision-Making Models in Management

The rational model assumes well-defined problems, clear objectives, full information, and optimisation (Simon, 1947/1976). Bounded rationality relaxes the full rationality assumption, arguing that decision makers satisfice using heuristics under cognitive and informational constraints (Simon, 1957). Organisational decision research (March & Olsen, 1976; March & Simon, 1958) emphasises organisational routines, political processes, and incrementalism.

2.3 Heuristics, Intuition, and Recognition-Primed Decision Making

Naturalistic decision making (NDM) highlights experts making rapid, near-real-time decisions under uncertainty using experience-based pattern recognition; Klein’s Recognition-Primed Decision (RPD) model describes how experts match situations to typical courses of action and mentally simulate outcomes (Klein, 1993). Tversky and Kahneman (1974) characterise the cognitive heuristics and biases that shape judgment under uncertainty, and Kahneman (2011) differentiates between fast, intuitive (System 1) and slow, deliberative (System 2) thinking.

2.4 Participatory, Collaborative and Co-Creation Models

Participatory decision making engages stakeholders in problem framing and solution selection, and is central to human-centred design and co-creation approaches (Sanders & Stappers, 2008; Steen, Manschot, & De Koning, 2011). Collaborative decision making emphasises shared mental models, boundary objects, and negotiation of value tradeoffs within multidisciplinary teams (Carlile, 2004; Edmondson & Harvey, 2018).

2.5 Abductive Reasoning and Reflection-in-Action

Design reasoning frequently uses abduction—inference to the best explanation—to generate hypotheses and design moves (Peirce, 1931–1958; Dorst & Cross, 2001). Schön’s (1983) notion of reflection-in-action emphasises problem framing, reframing, and the simultaneous processes of doing and thinking that challenge linear decision models.

2.6 Decision Support and Evidence-Based Design

Evidence-based decision making integrates empirical data and testing into design choices (Briggs & de la Haye, 2017). In design thinking, prototyping and user feedback serve as rapid evidence sources that inform decisions iteratively (Ries, 2011; Brown, 2008).

2.7 Gaps and Opportunities

Although many decision theories exist, scholarship has not fully mapped how specific decision models operate within the micro-practices of design projects or provided managerial heuristics for selecting models based on context, phase, and risk. This gap motivates the theoretical framework and empirical study presented below.

3. Theoretical Framework

This section develops a conceptual framework linking decision-making models to design thinking phases, team characteristics, project risk profiles, and organisational supports. The framework synthesises classic decision theory with design reasoning and translates it into managerial variables that can guide practice.

3.1 Framework Overview

At the core, the framework views decision-making during design as contextually adaptive: teams shift among decision models (rational/analytic, bounded rationality/heuristic, recognition-primed/intuitive, collaborative/participatory, abductive/reflexive) according to four moderators: project phase, time pressure and uncertainty, expertise composition, and organisational culture and governance. Managers influence decision outcomes by choosing scaffolds (process templates, evidence infrastructures, facilitation methods, and boundaries for autonomy) that support appropriate model use (Simon, 1957; Klein, 1993; Dorst, 2011).

3.2 Project Phase as Primary Moderator

Design thinking’s phases map onto decision-making needs:

Empathise & Define (Exploratory phase): High ambiguity, low certainty about problem frames. Decision tasks include selecting which user insights to pursue and reframing problem definitions. Here, abductive reasoning and participatory decision models are salient: teams generate hypotheses, use stakeholder co-creation to validate problem frames, and tolerate multiple competing framings (Dorst, 2011; Schön, 1983). Managerial implication: create open sensemaking sessions, use boundary objects (prototypes, personas), and enable divergent exploration.

Ideate (Divergent design): Rapid generation of options; decisions focus on idea selection filters and which ideas to prototype. Heuristic and collaborative models dominate, using structured criteria (feasibility, desirability, viability) combined with team judgment. Use techniques like dot-voting and design sprints to make prompt choices while preserving novelty (Brown, 2008; Liedtka, 2018).

Prototype & Test (Convergent learning): Decisions about which prototypes to invest in, which metrics to track, and when to pivot. Evidence-based and bounded rationality approaches operate: teams use small-N empirical data to update beliefs, satisfice on viable directions, and engage in quick experiments (Ries, 2011). Managers should ensure rapid feedback loops and low-cost testing infrastructures.

Implement & Scale (Execution): Implementation demands formal decision processes for resource allocation, risk management, and operationalisation. Rational/analytic and political organisational decision models become more prominent (Simon, 1947/1976; March & Simon, 1958). Managers must institutionalise learnings via KPIs, governance mechanisms, and change management.

Mapping phases to decision models is not deterministic; hybridisation is typical (e.g., analytical tools are used in ideation to prioritise ideas; intuition guides prototyping under time pressure). The phase mapping is a heuristic to inform managerial scaffolding.
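
As a minimal illustration (a sketch in Python, not a tool used by the case organisations), the phase-to-model heuristic of this subsection can be encoded as a small lookup structure together with the moderators discussed in Section 3.3; the phase labels, model names, and the suggest_models helper below are assumptions introduced here for illustration only.

```python
# Illustrative sketch only: Section 3.2's phase-to-model heuristic as a lookup table.
# Phase names, model labels, and suggest_models() are hypothetical.
from dataclasses import dataclass

PHASE_MODEL_MAP = {
    "empathise_define": ["abductive/reflexive", "participatory/collaborative"],
    "ideate":           ["heuristic (bounded rationality)", "collaborative"],
    "prototype_test":   ["evidence-based", "bounded rationality"],
    "implement_scale":  ["rational/analytic", "political/organisational"],
}

@dataclass
class ProjectContext:
    phase: str
    time_pressure: bool = False    # moderator from Section 3.3
    safety_critical: bool = False  # high-risk domains constrain reliance on intuition

def suggest_models(ctx: ProjectContext) -> list[str]:
    """Return candidate decision models for a phase, adjusted by the moderators."""
    models = list(PHASE_MODEL_MAP.get(ctx.phase, []))
    if ctx.time_pressure:
        # Time pressure pushes teams toward recognition-primed strategies (Klein, 1993).
        models.insert(0, "recognition-primed (RPD)")
    if ctx.safety_critical:
        # Safety-critical risk profiles require formal analysis and stakeholder oversight.
        models.append("rational/analytic (mandatory review)")
    return models

if __name__ == "__main__":
    print(suggest_models(ProjectContext(phase="ideate", time_pressure=True)))
```

The point of such an encoding is not automation but transparency: making the heuristic explicit allows managers to discuss and revise it as the hybridisation noted above occurs.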

3.3 Time Pressure, Uncertainty, and Risk

High time pressure and ambiguous environments push teams toward recognition-primed and heuristic strategies: fast pattern matching with rapid mental simulation (Klein, 1993). Conversely, when time and information permit, teams can engage in deliberative, analytic evaluation (Kahneman, 2011). Risk profile (e.g., regulatory or safety-critical domains) constrains reliance on intuitive models; such domains require formal analysis and stakeholder oversight. Managers must calibrate acceptable risk and provide decision rules (e.g., thresholds for escalation to System 2 review).

3.4 Expertise Composition and Cognitive Diversity

Teams with deep domain expertise may validly rely on RPD and intuitive judgments, but cognitive diversity enhances idea generation and mitigates shared biases (Page, 2007). Collaborative decision-making benefits from asymmetric knowledge distribution: domain experts, designers, and business managers contribute complementary perspectives. Managers should design team composition to balance expertise and facilitation to surface tacit assumptions.

3.5 Organisational Culture and Governance

Organisational norms (hierarchy vs. empowerment), reward systems, and decision governance influence which models can be used. A culture that tolerates failure and supports experimentation enables abductive and heuristic decision-making (Edmondson, 2011). Governance structures must provide clear escalation paths and criteria for transitioning from exploratory to formal decision modes.

3.6 Scaffolds and Managerial Interventions

Managers control levers that scaffold decision processes:

  • Procedural scaffolds: Checklists, decision matrices, stage-gate criteria for shifting from prototyping to scale (Cooper, 2008).
  • Information scaffolds: Dashboards, ethnographic reports, and user metrics to increase evidence availability.
  • Cognitive scaffolds: Facilitation, reflective practices (e.g., debriefs), and design critique structures that surface assumptions.
  • Social scaffolds: Stakeholder workshops, governance forums, and cross-functional liaisons to integrate voices.

These scaffolds enable switching between System 1 and System 2 processing, balance creativity with control, and reduce catastrophic risks associated with blind intuition.

3.7 Propositions

From this framework, the paper advances several propositions for empirical assessment:

  • The project phase will systematically predict dominant decision models: exploratory phases favour abductive and participatory models; implementation phases favour analytic/rational models.
  • Time pressure and uncertainty increase reliance on recognition-primed and heuristic decision making.
  • Cognitive diversity within teams reduces bias and increases the number of viable prototypes selected during ideation.
  • Organisational scaffolds (procedural and information) moderate the effectiveness of intuitive decision models by providing lightweight validation mechanisms.

These propositions guided the empirical design and analysis.

4. Research Methodology

This study adopts a qualitative, multi-case study approach to explore how decision-making models operate in real design thinking practice and how managers can facilitate appropriate decision choices. The methodology emphasises depth, contextual richness, and theory building (Yin, 2014; Stake, 1995).

4.1 Research Design and Rationale

A qualitative multiple-case study enables comparison across contexts and identification of patterns while preserving contextual detail (Eisenhardt, 1989; Yin, 2014). The goal is theory refinement rather than statistical generalisation. Given the exploratory nature of the research question—how decision models manifest in design thinking—the qualitative approach is appropriate for uncovering processes, meanings, and managerial levers.

4.2 Case Selection

Three organisations were purposefully selected using theoretical sampling to maximise variation on variables theorised to influence decision models: industry sector, organisational size, and design maturity. Criteria included (a) explicit adoption of design thinking methods for product or service innovation, (b) willingness to grant access to teams and artefacts, and (c) variation in governance (start-up vs. corporate vs. public sector). The three cases, anonymised as Case A (technology start-up), Case B (large financial services firm), and Case C (municipal service design unit), provided contrastive contexts.

4.3 Data Collection

Data collection combined semi-structured interviews, participant observation, and documentary analysis over 9 months.

Semi-structured interviews: 32 interviews (20–90 minutes) with designers, product managers, executives, and frontline staff. Interview guides probed decision practices, use of tools, instances of success and failure, and perceived governance. Interviews were audio-recorded and transcribed verbatim.

Participant observation: The researcher attended 15 design workshops/sprints and observed team decision points, facilitation methods, and prototype testing sessions. Detailed field notes captured interactions, artefacts, and temporal sequences.

Document analysis: Project artefacts (journey maps, prototypes, test reports), governance documents (stage-gate criteria, reporting dashboards), and training materials were collected.

Triangulation across data types strengthened internal validity (Denzin, 1978).

4.4 Data Analysis

Analysis followed iterative, thematic coding and cross-case synthesis procedures (Miles, Huberman, & Saldaña, 2014; Braun & Clarke, 2006; Gioia, Corley, & Hamilton, 2013).

Familiarisation: Transcripts and field notes were read multiple times to identify salient decision episodes.

Open coding: Initial codes captured actions (e.g., “rapid vote,” “escalate to committee”), cognitive processes (“heuristic selection,” “mental simulation”), and contextual conditions (“deadline pressure,” “risk threshold”).

Axial coding: Codes were grouped into categories corresponding to decision models, moderators, and outcomes. The theoretical framework guided axial categories, but inductive themes were also allowed to emerge.

Cross-case synthesis: Patterns were compared across cases to identify commonalities and divergences.

Reliability checks: Intercoder reliability was established by coding a subset of transcripts with an independent researcher; discrepancies were reconciled through discussion.

Member checking: Preliminary findings were shared with participants for validation and correction.

Analysis emphasised process tracing of decision episodes—linking antecedent conditions, choice heuristics, and subsequent outcomes.
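
As a concrete illustration of the intercoder reliability check described above, the sketch below computes percentage agreement and Cohen's kappa for two coders over a shared subset of decision episodes; the codes and episode data shown are hypothetical and are not the study's actual coding results.

```python
# Illustrative sketch: percentage agreement and Cohen's kappa for two coders.
# The code labels and episodes below are hypothetical, not the study's data.
from collections import Counter

def cohens_kappa(coder_a: list[str], coder_b: list[str]) -> float:
    """Cohen's kappa: chance-corrected agreement between two coders."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

coder_a = ["heuristic", "RPD", "analytic", "participatory", "heuristic", "analytic"]
coder_b = ["heuristic", "RPD", "heuristic", "participatory", "heuristic", "analytic"]

agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
print(f"Percentage agreement: {agreement:.2f}")
print(f"Cohen's kappa:        {cohens_kappa(coder_a, coder_b):.2f}")
```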

4.5 Ethical Considerations

Ethical approval was obtained from the author’s institutional review board. Participants provided informed consent; organisations and individuals are anonymised. Confidentiality was maintained in data storage and reporting. The researcher was reflexive about positionality, acknowledging the potential influence of the observer role on team behaviour (Berger, 2015).

4.6 Trustworthiness and Limitations

Trustworthiness was addressed through triangulation, member checking, and transparent documentation of coding procedures (Lincoln & Guba, 1985). Limitations include potential Hawthorne effects during observation, limited generalisability beyond the three cases, and reliance on retrospective accounts in interviews. Nonetheless, the methodology provides rich empirical grounding for theory refinement and managerial guidance.

5. Findings

5.1 Overview: Hybrid and Phase-Contingent Decisioning

Across all cases, decision-making in design thinking was rarely singular; teams regularly combined multiple models. Decisions were phase-contingent: abductive and participatory approaches dominated early phases, while analytic and governance-driven decisions increased during implementation. Time pressure, expertise distribution, and governance constraints shaped the hybrid mixes.

5.2 Empathise & Define: Participatory Framing and Abduction

In Case C (municipal service design), workshops with citizens produced a profusion of needs. Teams used participatory decision routines—structured co-creation sessions with voting and affinity mapping—to converge on problem statements. Rather than choosing a single ‘correct’ problem, teams framed multiple opportunity areas. Managers in Case C explicitly encouraged “holding options open” for two sprints before committing; this institutional allowance enabled abductive leaps where ethnographic insights reframed assumptions (participant quote: “We let the stories speak first before we try to solve them”).

Similarly, Case A used quick ethnography and storyboarding; designers applied abductive inference to generate “how-might-we” statements. Decisions about which insights to prioritise were often made via facilitated sensemaking sessions where managers used boundary objects (personas, journey maps) to anchor discussion.

5.3 Ideation: Structured Divergence with Lightweight Filters

During ideation, teams valued divergent thinking but needed fast selection mechanisms to decide what to prototype. All three cases used heuristic filters—feasibility, customer value, effort to learn (often phrased as “bang for buck”)—as quick satisficing criteria consistent with bounded rationality. In Case B (financial services), regulatory constraints introduced strict exclusion criteria (e.g., no user data stored offsite), which became part of the heuristic filter.

Dot-voting, idea clustering, and silent ranking were common. Importantly, these mechanisms were socialised: the facilitator presented explicit selection rules before voting, which reduced post-hoc conflict. Managers emphasised that selection rules were provisional and subject to revision after prototyping.
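
A lightweight filter of this kind can be sketched as a short scoring-and-voting routine; the criteria, exclusion flag, idea names, and vote counts below are hypothetical stand-ins for the provisional rules observed in the cases, not an artefact from any of them.

```python
# Illustrative sketch of a provisional ideation filter: heuristic criterion scores
# plus a dot-vote tally. Idea names, scores, votes, and the exclusion flag are hypothetical.

ideas = [
    # Each idea is scored 1-5 on feasibility and customer value; "effort" (to learn) is 1-5, lower is better.
    {"name": "Idea A", "feasibility": 4, "value": 5, "effort": 2, "excluded": False},
    {"name": "Idea B", "feasibility": 2, "value": 4, "effort": 4, "excluded": False},
    {"name": "Idea C", "feasibility": 5, "value": 3, "effort": 1, "excluded": True},  # e.g. would store user data offsite
]

dot_votes = {"Idea A": 6, "Idea B": 3, "Idea C": 7}  # team dot-voting results

def heuristic_score(idea: dict) -> int:
    """Satisficing-style 'bang for buck': value and feasibility count for, learning effort against."""
    return idea["feasibility"] + idea["value"] - idea["effort"]

def shortlist(candidates: list[dict], votes: dict, top_n: int = 2) -> list[str]:
    # Hard exclusion criteria (e.g. regulatory constraints) are applied before any scoring.
    eligible = [i for i in candidates if not i["excluded"]]
    # Rank by criterion score, breaking ties with dot-votes; both rules are declared before voting.
    ranked = sorted(eligible, key=lambda i: (heuristic_score(i), votes.get(i["name"], 0)), reverse=True)
    return [i["name"] for i in ranked[:top_n]]

print(shortlist(ideas, dot_votes))  # the shortlist is provisional and revisited after prototyping
```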

5.4 Prototyping & Testing: Evidence as Decision Currency

Prototyping converted speculative ideas into empirical probes. Across cases, decisions to continue, pivot, or stop were predominantly evidence-driven: user test results, quantitative metrics from usability tests, and cost estimates. Case A used A/B prototype tests to inform product roadmap choices; Case B used pilot programs to assess operational impact.

However, evidence thresholds were pragmatic. Teams rarely required statistical significance; rather, they used directional signals (e.g., “majority of users found it confusing”) to guide satisficing choices. Interviewees noted that low-cost, rapid tests provided actionable evidence and reduced reliance on managerial intuition.
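
As a hedged sketch of this "directional signal" logic, the following illustrative rule maps small-N test results to a continue, pivot, or stop recommendation using pragmatic thresholds rather than significance tests; the thresholds and function name are assumptions, not values reported by the cases.

```python
# Illustrative sketch: a pragmatic, non-statistical decision rule for small-N prototype
# tests, in the spirit of the "directional signals" teams described. Thresholds are hypothetical.

def prototype_decision(successes: int, trials: int,
                       continue_threshold: float = 0.7,
                       pivot_threshold: float = 0.4) -> str:
    """Map a directional signal from a small user test to a satisficing recommendation."""
    if trials == 0:
        return "no evidence yet: run a low-cost test before deciding"
    rate = successes / trials
    if rate >= continue_threshold:
        return "continue: directional signal is positive"
    if rate >= pivot_threshold:
        return "pivot: mixed signal, reframe and re-test"
    return "stop or rework: majority of users struggled"

# Example: 9 of 12 users completed the key task in a hallway test.
print(prototype_decision(successes=9, trials=12))
```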

5.5 Recognition-Primed Decisions Under Time Pressure

Several observed episodes during sprints revealed the use of recognition-primed decision-making. In a 48-hour design sprint at Case A, senior designers rapidly chose a solution path based on pattern recognition from previous projects and mentally simulated user interactions. Their fast choice was later validated by prototype feedback. Participants described this as “instinct backed by experience.” Managers noted that such RPD use required personnel with deep domain experience; junior staff tended to rely more on explicit criteria.

5.6 Collaborative Decision Making and Conflict Resolution

Cross-functional teams faced tensions: engineering prioritised feasibility, designers prioritised user desirability, and business stakeholders prioritised viability metrics. Collaborative decision practices—facilitated negotiation, co-creation sessions, and use of boundary objects—were effective in reconciling differences. Where governance committees intervened (Case B), decisions slowed but gained organisational buy-in.

In one documented negotiation, product managers mediated conflicting priorities by translating technical constraints into costed options, enabling a hybrid solution. Managers who cultivated psychological safety and open critique reported smoother reconciliations.

5.7 Governance, Escalation, and Transition to Analytic Modes

Transition from exploration to execution triggered formal decision gates. In Case B, stage-gate criteria required business cases with quantified ROI, regulatory sign-offs, and operational readiness. These formal analytic decisions were often made by senior management or cross-functional boards. Interviewees described tension when creative prototypes were forced into rigid analytic frames; however, clear escalation rules and metrics (e.g., minimal viable metrics) eased the transition.

5.8 Managerial Scaffolds in Practice

Managers used multiple scaffolds to shape decision-making:

  • Decision templates: Simple matrices for idea prioritisation reduced ambiguity during ideation.
  • Rapid evidence infrastructure: Pre-approved usability labs and templated survey instruments enabled quick testing.
  • Facilitation playbooks: Facilitators used scripts to prevent dominant voices from skewing votes and to ensure all perspectives contributed.
  • Boundary conditions: Explicit constraints (budget caps, compliance rules) prevented unrealistic choices.

These scaffolds allowed intuitive and heuristic processes to be validated and bounded, reducing risk while preserving creative tempo.

5.9 Outcomes Associated with Model Use

Projects that deliberately combined abductive exploration with rapid evidence checks tended to produce higher stakeholder acceptance and faster implementation. Overreliance on intuition without testing produced costly pivots; overreliance on heavy analysis in early phases reduced novelty. Cognitive diversity correlated with richer ideation outcomes but required strong facilitation to avoid decision paralysis.

6. Discussion

6.1 Integrative Interpretation

The findings confirm the theoretical framework’s central claim: decision-making in design thinking is adaptive and phase-contingent. Managers must therefore be fluent in multiple decision models and able to scaffold transitions between them. The empirical evidence supports propositions that time pressure, team expertise, and governance mechanisms shape decision model selection.

6.2 Managerial Implications

Match the decision model to the phase and risk. Managers should encourage abductive and participatory decision processes during empathise/define phases and progressively introduce analytic, evidence-based processes as projects move toward implementation. Explicitly define transition criteria (e.g., evidence thresholds) to avoid premature convergence or endless exploration.

Provide lightweight evidence pipelines. Organisations should invest in rapid testing infrastructure (templated studies, low-cost prototype labs) so that heuristic and intuitive choices can be quickly validated. Such infrastructure reduces the downside of fast, recognition-based choices while preserving tempo.

Foster cognitive diversity and facilitate effectively. Diverse teams generate more novel options, but require structured facilitation to avoid conflict and decision paralysis. Training facilitators and adopting facilitation playbooks preserves openness while enabling decision closure.

Calibrate governance to accommodate creativity. Governance processes must balance control with flexibility. Stage-gates should be adaptive (e.g., require evidence-based learning plans rather than fixed ROI) and include “safe-to-fail” criteria for exploratory work.

Develop decision literacy. Managers and designers would benefit from explicit training in decision models (System 1/System 2, RPD, satisficing) so that teams can consciously choose the most appropriate approach and recognise cognitive biases.

Use boundary objects to coordinate. Artefacts such as personas, journey maps, and prototypes serve as coordination devices between disparate stakeholders and facilitate joint decision-making.

6.3 Theoretical Contributions

This paper contributes by explicitly linking existing decision theories to design thinking practice. It extends bounded rationality and naturalistic decision making into the design context, showing how these models interact and combine through organisational scaffolds. The proposed phase-contingent model adds granularity to the literature by mapping decision models onto design phases and managerial interventions.

6.4 Reconciling Intuition and Analysis

A central managerial challenge is reconciling intuition (valuable for speed and novelty) with analysis (necessary for risk mitigation). The study suggests a practical reconciliation: permit intuitive and recognition-based choices early but systematise immediate, low-cost validation via prototypes and user testing. This “intuit-then-test” pattern leverages strengths of both systems and aligns with dual-process theories (Kahneman, 2011).

6.5 Practical Tools and Tactics

Based on findings, managers can adopt a toolkit:

  • Decision phase map: Explicitly label each project phase with recommended decision models and scaffolds.
  • Provisional decision rules: Predefine thresholds for moving from exploration to commitment (e.g., minimum n of user tests, acceptable error rates); a sketch of such a rule appears below.
  • Facilitation checklist: Steps to ensure equitable participation during idea selection.
  • Rapid evidence templates: Standardised test plans and user metrics for quick deployment.

These tools aid operationalisation of the study’s insights.
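
The provisional decision rules item above could, for example, be captured as a small, explicit configuration check; the thresholds, field names, and gate logic in this sketch are hypothetical and intended only to show how such rules can be made inspectable and revisable rather than implicit.

```python
# Illustrative sketch: provisional exploration-to-commitment rules expressed as an
# explicit, inspectable check. All thresholds and field names are hypothetical.

TRANSITION_RULES = {
    "min_user_tests": 8,          # minimum n of user tests before committing
    "max_task_error_rate": 0.25,  # acceptable error rate on key tasks
    "min_positive_signal": 0.6,   # share of users giving a positive directional signal
}

def ready_to_commit(evidence: dict, rules: dict = TRANSITION_RULES) -> tuple[bool, list[str]]:
    """Return whether a concept may move from exploration to commitment, with reasons if not."""
    reasons = []
    if evidence["user_tests"] < rules["min_user_tests"]:
        reasons.append("too few user tests")
    if evidence["task_error_rate"] > rules["max_task_error_rate"]:
        reasons.append("task error rate above the agreed threshold")
    if evidence["positive_signal"] < rules["min_positive_signal"]:
        reasons.append("directional signal not yet positive enough")
    return (not reasons, reasons)

ok, reasons = ready_to_commit({"user_tests": 10, "task_error_rate": 0.2, "positive_signal": 0.7})
print("Commit" if ok else f"Stay in exploration: {', '.join(reasons)}")
```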

6.6 Limitations and Future Research

Limitations include the small number of cases, sectoral restriction (three organisational contexts), and potential observer effects. Future research could test the propositions quantitatively across larger samples, examine longitudinal outcomes of different decision mixes on innovation performance, and explore the role of digital collaboration tools in mediating decision processes. Experimental designs could compare projects using different scaffolds to assess causal impacts.

7. Conclusion and Recommendations

Decision-making is central to design thinking, yet often implicit. This study demonstrates that design projects require hybrid decision models that evolve across phases and are moderated by time pressure, expertise, and governance. Managers play a pivotal role in enabling appropriate decision processes by providing scaffolds—procedural, informational, cognitive, and social—that allow teams to move fluidly between abductive exploration and analytic execution.

Recommendations for managers:

  • Adopt a phase-contingent decision policy. Articulate which decision models are preferred at each stage and provide clear transition criteria to avoid premature closure or endless divergence.
  • Invest in rapid evidence infrastructure. Equip teams with low-cost testing capabilities and standardised templates to validate intuitive choices quickly.
  • Train for decision literacy. Offer workshops on heuristics, recognition-primed decisioning, and cognitive biases, complemented by facilitation training.
  • Design governance to be adaptive. Replace rigid stage-gates with evidence-based checkpoints and “safe-to-fail” experiments in early stages.
  • Cultivate cross-functional diversity and facilitation. Build teams with complementary skills and ensure facilitators can manage power dynamics and integrate perspectives.

By deliberately structuring decision processes rather than leaving them implicit, organisations can retain the creative advantages of design thinking while managing risk and accelerating implementation.

References

Berger, R. (2015). Now I see it, now I don’t: Researcher’s position and reflexivity in qualitative research. Qualitative Research, 15(2), 219–234. https://doi.org/10.1177/1468794112468475

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.

Briggs, L., & de la Haye, A. (2017). Evidence-based design in product development. Journal of Design Research, 15(3), 201–222.

Brown, T. (2008). Design thinking. Harvard Business Review, 86(6), 84–92.

Carlile, P. R. (2004). Transferring, translating, and transforming: An integrative framework for managing knowledge across boundaries. Organization Science, 15(5), 555–568.

Cooper, R. G. (2008). Perspective: The Stage-Gate® idea-to-launch process—update, what’s new, and NexGen systems. Journal of Product Innovation Management, 25(3), 213–232.

Cross, N. (2006). Designerly ways of knowing. Springer.

Denzin, N. K. (1978). The research act: A theoretical introduction to sociological methods (2nd ed.). McGraw-Hill.

Denzin, N. K., & Lincoln, Y. S. (2011). The SAGE handbook of qualitative research (4th ed.). SAGE.

Dorst, K. (2011). The core of ‘design thinking’ and its application. Design Studies, 32(6), 521–532.

Dorst, K., & Cross, N. (2001). Creativity in the design process: Co-evolution of problem–solution. Design Studies, 22(5), 425–437.

Edmondson, A. C. (2011). Strategies for learning from failure. Harvard Business Review, 89(4), 48–55.

Edmondson, A. C., & Harvey, J. F. (2018). Cross-boundary teaming for innovation: Integrating research on teams and knowledge in organizations. Human Resource Management Review, 28(4), 347–360.

Eisenhardt, K. M. (1989). Building theories from case study research. Academy of Management Review, 14(4), 532–550.

Gioia, D. A., Corley, K. G., & Hamilton, A. L. (2013). Seeking qualitative rigor in inductive research: Notes on the Gioia methodology. Organizational Research Methods, 16(1), 15–31.

IDEO. (2015). The field guide to human-centered design. IDEO.org.

Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.

Klein, G. (1993). A recognition-primed decision (RPD) model of rapid decision making. In G. A. Klein, J. Orasanu, R. Calderwood, & C. E. Zsambok (Eds.), Decision making in action: Models and methods (pp. 138–147). Ablex.

Liedtka, J. (2018). Why design thinking works. Harvard Business Review, 96(5), 72–79.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. SAGE.

March, J. G., & Simon, H. A. (1958). Organizations. Wiley.

Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). SAGE.

Page, S. E. (2007). The difference: How the power of diversity creates better groups, firms, schools, and societies. Princeton University Press.

Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). SAGE.

Peirce, C. S. (1931–1958). Collected Papers of Charles Sanders Peirce (C. Hartshorne & P. Weiss, Eds.). Harvard University Press. (Original work published various years)

Ries, E. (2011). The lean startup: How today’s entrepreneurs use continuous innovation to create radically successful businesses. Crown Business.

Sanders, E. B.-N., & Stappers, P. J. (2008). Co-creation and the new landscapes of design. CoDesign, 4(1), 5–18.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. Basic Books.

Simon, H. A. (1947/1976). Administrative behavior: A study of decision-making processes in administrative organizations (3rd ed.). Free Press. (Original work published 1947)

Simon, H. A. (1957). Models of man: Social and rational. Wiley.

Stake, R. E. (1995). The art of case study research. SAGE.

Steen, M., Manschot, M., & De Koning, N. (2011). Benefits of co-design in service design projects. International Journal of Design, 5(2), 53–60.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Yin, R. K. (2014). Case study research: Design and methods (5th ed.). SAGE.