OPEN ACCESS
Project Management Tools and Their Effectiveness in Multimedia Production Workflows
Afia Tabassum Sayed¹ (ORCID: https://orcid.org/0009-0007-3470-5753), Sri Anamika¹, Fauzia Afrin¹, Abu Yousuf Pranto², Abid Hasan², Md. Ekhtakhairul Alam Anik²
¹Department of Graphic Design & Multimedia, ²Department of Drawing & Painting, Shanto-Mariam University of Creative Technology, Dhaka, Bangladesh
Prof. Dr Kazi Abdul Mannan, Department of Business Administration, Faculty of Business, Shanto-Mariam University of Creative Technology, Dhaka, Bangladesh. Email: drkaziabdulmannan@gmail.com; ORCID: https://orcid.org/0000-0002-7123-132X
Corresponding author: Afia Tabassum Sayed
Asian microecon. rev. 2026, 6(1); https://doi.org/10.64907/xkmf.v6i1.amr.1
Submission received: 1 October 2025 / Revised: 9 November 2025 / Accepted: 21 December 2025 / Published: 2 January 2026
Abstract
This study examines the role and effectiveness of project management tools within multimedia production workflows. Multimedia production—encompassing video, audio, animation, interactive media, and mixed-media experiences—presents unique challenges including tight deadlines, creative iteration, cross-disciplinary collaboration, and heavy file management. Project management tools promise to streamline communication, scheduling, resource allocation, version control, and review cycles. Using a qualitative research design, this paper investigates how practitioners in multimedia production adopt, configure, and experience contemporary project management tools (e.g., Asana, Trello, Jira, Monday.com, ShotGrid, Frame.io) and workflow-specific platforms. Data were collected through semi-structured interviews with 24 multimedia professionals across production houses, independent studios, and educational media units, and analysed via thematic coding. Findings indicate that tool effectiveness depends less on platform feature sets and more on alignment between tool affordances and production practices, governance of shared assets, and organisational culture. Recommendations emphasise tailoring tool choice to workflow stage, investing in onboarding and shared conventions, and combining lightweight visual boards with specialised media review tools. Limitations and suggestions for future research are discussed.
Keywords: project management tools, multimedia production, workflows, collaboration, qualitative research, creative industries
1. Introduction
Multimedia production combines creative practice with complex logistical coordination. Projects commonly involve directors, producers, editors, animators, sound designers, programmers, and clients, each contributing specialised work that must be integrated across time and digital assets. Historically, production coordination was supported by spreadsheets, email, and in-person meetings; however, the last two decades have seen the emergence of digital project management tools aiming to make planning, task allocation, communications, and review more efficient (Project Management Institute [PMI], 2021).
The adoption of general-purpose project management platforms (e.g., Asana, Trello, Jira) and media-specific tools (e.g., ShotGrid, Frame.io) has accelerated in multimedia contexts. Yet despite widespread uptake, evidence about which tools work best and why remains fragmented. Studies in software development and corporate project management suggest that the fit between methodological approach (e.g., Agile vs. Waterfall) and tooling strongly affects outcomes (Beck et al., 2001; Highsmith, 2009). Multimedia production introduces creative uncertainty and iteration that challenge traditional linear frameworks and raise questions about how project management tools should be configured or extended to fit creative workflows.
This study asks: What makes project management tools effective (or ineffective) in multimedia production workflows? It focuses on practitioners’ experiences to surface how tools shape, enable, or constrain production practices, and to derive implications for better tool selection and implementation.
2. Literature Review
Research on project management tools spans fields including information systems, organisational studies, and creative production research. Existing literature reveals several themes relevant to multimedia production: the relationship between methodology and tools; the affordances of visual task boards and their cognitive impacts; the role of media-focused review/versioning platforms; and sociotechnical factors such as adoption, training, and governance.
Methodological alignment is a recurring theme. Agile methodologies, emphasising short iterations, continuous feedback, and cross-functional teams, were developed in software but have been applied to creative projects with mixed results (Serrador & Pinto, 2015; Rigby, Sutherland, & Takeuchi, 2016). Scholars argue that Agile’s focus on incremental delivery suits iterative creative tasks, but that specific practices (e.g., daily stand-ups, sprint planning) require adaptation for distributed or asynchronous creative teams (Denning, 2016).
Visual task boards—whether Kanban-style boards or Scrum-style backlogs—offer affordances for shared situational awareness and cognitive offloading (Anderson, 2010; Liker & Morgan, 2006). In multimedia contexts, boards can track stages (e.g., pre-production, production, post-production, review) and make dependencies visible. However, rigid board structures may conflict with nonlinear creative processes, and boards alone do not address version control or large-file management (Berger & Johnson, 2018).
Media-specific platforms (ShotGrid, Frame.io, Wipster) provide video review, frame-accurate comments, version tracking, and integration with editing tools (Adler & Wyman, 2019). These platforms reduce friction in review cycles by allowing stakeholders to annotate timelines, request revisions, and automatically generate version histories. Yet they can become siloed if not integrated with broader project planning tools.
Finally, sociotechnical research shows that tool adoption is shaped by organisational culture, leadership support, and training (Venkatesh, Morris, Davis, & Davis, 2003). The same platform may yield different experiences and outcomes across organisations because of governance, conventions (naming standards, metadata), and explicit processes for using the tool (Hollingsworth & O’Donnell, 2014).
Taken together, the literature suggests that tool effectiveness in multimedia production is contingent: features matter, but so do fit, practices, and social coordination.
3. Theoretical Framework
This study draws on a sociotechnical perspective and the theory of task-technology fit (TTF) to frame an analysis of project management tools in multimedia workflows. The sociotechnical perspective foregrounds the interplay between social systems (people, structures, culture) and technical artifacts (tools, platforms) and has been widely used to understand how technology mediates work practices (Trist & Bamforth, 1951; Orlikowski, 1992). Task-technology fit (Goodhue & Thompson, 1995) posits that information technology will have a positive impact on individual performance when the functionalities of the technology match the tasks that users must perform. Combining these approaches provides a robust lens for exploring how project management tools support (or fail to support) the specific and often idiosyncratic tasks in multimedia production.
3.1 Sociotechnical systems in multimedia production
Multimedia production is inherently sociotechnical. Creative outputs arise through collaboration between humans with distinct professional identities (e.g., cinematographer, animator, sound engineer) and through the orchestration of technical artifacts, including cameras, editing suites, digital asset management (DAM) systems, and project management platforms. From a sociotechnical perspective, tools do not simply automate tasks; they reshape communication, bias attention, and reconfigure responsibility (Bijker, Hughes, & Pinch, 1987). For instance, a tool that surfaces task deadlines prominently may pressure editors to prioritise speed, whereas a review platform that highlights frame-accurate comments can alter how directors provide feedback.
This perspective implies that studying tools in isolation—focusing only on feature checklists—misses how technologies are embedded in practices. Instead, we must analyse how tools are appropriated, modified, and governed within organisational contexts. Appropriation includes mundane practices such as naming conventions, creation of templates, or informal rules about who posts comments and when. Governance includes role definitions, permissions, and escalation paths that make the tool a site for accountability.
3.2 Task-Technology Fit applied to creative workflows
Task-technology fit (TTF) guides us to ask whether a given tool’s capabilities match the demands of multimedia tasks. Goodhue and Thompson originally defined fit along dimensions such as data quality, timeliness, and task structure (Goodhue & Thompson, 1995). For multimedia production, TTF can be extended to include media-specific requirements: support for large binary files, frame-accurate commenting, transcoding and preview generation, metadata and versioning, and visual timeline representations.
Applying TTF to multimedia tasks yields several hypotheses:
- Tools that provide frame-accurate review and native media playback will fit post-production review tasks better than generic task trackers.
- Visual boards and dependency mapping will fit scheduling and resource allocation tasks when they allow representation of parallel creative streams (e.g., simultaneous animation and sound design) and file dependencies.
- Integration between DAM systems, editing suites, and PM tools improves fit by reducing manual file transfers and metadata mismatches.
However, TTF also acknowledges contextual factors: even high-fit tools may fail if users lack training, if workflows are poorly specified, or if organisational norms discourage centralised tracking (e.g., a culture that prizes ad-hoc verbal instructions over documented tasks). Therefore, we supplement TTF with sociotechnical attention to governance, culture, and appropriation.
3.3 Socio-materiality and affordances
Closely allied to sociotechnical thinking is the socio-materiality approach, which emphasises that technologies and social practices are mutually constitutive (Orlikowski & Scott, 2008). Affordances—what the tool allows users to do—are not absolute; they depend on users’ skills and organisational arrangements (Gibson, 1977; Norman, 2013). A platform may afford asynchronous review, but that affordance only materialises if stakeholders adopt asynchronous conventions (e.g., time-bound review windows, acceptance of recorded feedback).
By integrating TTF, sociotechnical systems, and socio-materiality, this theoretical framework positions the analysis to capture not only whether tools have necessary features, but also how those features are enacted in practice. It leads to analytic attention to affordances (what the tool enables), fit (how well features map to tasks), and appropriation/governance (how organisations shape use). Together, these concepts guide data collection and thematic coding—helping to surface patterns about why tools succeed or fail in multimedia production contexts.
4. Research Methodology
4.1 Research design
This study used a qualitative research design focused on an exploratory, interpretive understanding of practitioners’ experiences. Qualitative methods are appropriate for capturing the nuance of tool use, situated practices, and meanings that users attach to tools (Creswell & Poth, 2018). The research follows an instrumental case sampling approach: rather than studying a single organisational case in-depth, it purposively sampled individuals across a range of multimedia production settings to identify common patterns and divergent practices.
4.2 Sampling and participants
Participants were purposively sampled to capture diversity in role, organisation type, project scale, and tooling. Recruitment targeted multimedia professionals through industry networks, LinkedIn groups, and contacts at production studios. Inclusion criteria required participants to have at least two years of experience in multimedia production and recent experience (within the last 18 months) using at least one digital project management tool.
A total of 24 participants were interviewed, including:
- 6 producers/project managers from mid-sized production houses;
- 5 post-production supervisors and editors from film/TV and commercial studios;
- 4 independent multimedia producers/creative directors;
- 5 technical leads or pipeline engineers at animation/VFX studios;
- 4 academics or learning designers producing educational multimedia.
Participants represented geographically distributed teams (North America, Europe, Asia), and projects ranged from short-form marketing videos to episodic educational series and feature-length post-production work.
4.3 Data collection
Data collection centred on semi-structured interviews conducted via video conferencing software. Interview protocol covered participants’ typical workflows, tools used (task trackers, media review platforms, DAM systems), perceptions of tool strengths and weaknesses, examples of successes and failures, governance and conventions, and suggestions for improving tool fit. Interviews averaged 60 minutes and were audio-recorded with participant consent. In addition to interviews, participants were invited (when feasible) to share anonymised screenshots of their project boards or review platform interfaces and to describe particular artifacts (templates, naming conventions).
To increase credibility through triangulation, the study also collected supplementary documentation where available: sample workflow diagrams, templates, and written process notes. These artifacts helped ground participants’ narratives and provided concrete examples of how tools were configured.
4.4 Data analysis
Interviews were transcribed verbatim and uploaded to qualitative data analysis software (e.g., NVivo) for coding. Thematic analysis followed Braun and Clarke's (2006) iterative process: familiarisation with the data, generating initial codes, searching for themes, reviewing themes, defining and naming themes, and producing the report.
Coding combined deductive and inductive approaches. Deductive codes were informed by the theoretical framework (fit, affordances, governance, appropriation), while inductive codes emerged from participants’ language (e.g., “review fatigue,” “version sprawl”). Two researchers coded a subset of transcripts to establish inter-coder consistency; discrepancies were resolved through discussion and codebook refinement.
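The study reports establishing inter-coder consistency through discussion rather than a specific statistic; as a minimal illustration of how such agreement could be quantified, the sketch below computes Cohen's kappa between two coders' theme labels for the same set of excerpts (the labels and data here are invented for illustration):

```python
from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' labels over the same excerpts."""
    assert len(codes_a) == len(codes_b)
    n = len(codes_a)
    # Observed agreement: fraction of excerpts both coders labelled identically.
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    labels = set(codes_a) | set(codes_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in labels)
    return (observed - expected) / (1 - expected)

# Two coders labelling ten interview excerpts with hypothetical theme codes.
a = ["fit", "governance", "fit", "integration", "fit",
     "governance", "integration", "fit", "governance", "fit"]
b = ["fit", "governance", "fit", "fit", "fit",
     "governance", "integration", "fit", "fit", "fit"]
print(round(cohens_kappa(a, b), 2))  # 0.65
```

A kappa in this range would typically prompt exactly the codebook discussion and refinement the study describes.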
4.5 Trustworthiness and reflexivity
To bolster trustworthiness, the study used member checking—participants were offered anonymised summaries of findings and invited to correct or clarify interpretations. Thick description is provided through illustrative quotes and artifacts to allow readers to evaluate transferability. The researchers remained reflexive about their backgrounds in media production and project management, noting potential biases toward emphasising tool affordances; reflexive memos were kept to record analytic decisions and assumptions.
4.6 Ethical considerations
Ethical approval was obtained from the relevant institutional review board. Participants provided informed consent and were assured of confidentiality. Identifying details (studio names, proprietary workflow specifics) were anonymised or generalised to prevent disclosure of sensitive information. Interview recordings and transcripts are stored securely and will be deleted after archival according to the approved protocol.
5. Findings
5.1 Overview of tool usage patterns
Participants reported using a combination of general-purpose project management platforms and media-specific tools. Asana, Trello, and Jira were the most common platforms for tracking tasks and schedules; Frame.io, ShotGrid, and Vimeo Review were used for media review and client-facing feedback; and cloud storage/DAM solutions (Dropbox, Google Drive, proprietary studio servers) held master assets. Many teams employed hybrid models in which a lightweight visual board tracked high-level tasks while media review platforms handled versioned feedback.
5.2 Theme 1: Fit is contextual — “the right tool for the right stage”
A dominant theme was that tool effectiveness depends on the workflow stage. Participants described using different tools for pre-production (scheduling, script versions), production (call sheets, on-set notes), and post-production (frame-accurate review, version control). As one post-production supervisor explained,
“Our Trello board tells us what sequence is in which stage, but Frame.io is where the director leaves frame-accurate notes. They solve different problems.” (Participant P11).
5.3 Theme 2: Integration and data flow
Poor integration between tools caused friction. Participants frequently cited manual work: exporting lists from one system to another, uploading new media versions without synchronised metadata, and reconciling task status across platforms. Pipeline engineers emphasised the value of APIs and integrations:
“When our DAM updates the master file, and that change propagates to ShotGrid and the PM board automatically, we save hours a week. Without that, you have version sprawl.” (Participant P7).
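The propagation this engineer describes can be sketched generically. The following minimal Python sketch fans a DAM "new version" event out to downstream systems; the stub functions (`notify_review_platform`, `notify_task_board`) and event fields are hypothetical stand-ins, not the actual APIs of ShotGrid or any particular DAM, which would be wrapped behind these callables in a real pipeline:

```python
def propagate_version(event, targets):
    """Fan a DAM 'new version' webhook event out to downstream systems.

    `event` is a dict from a hypothetical DAM webhook; each target is a
    callable taking a normalised update payload. Keeping the payload in one
    place is what prevents the 'version sprawl' of per-system manual updates.
    """
    update = {
        "asset_id": event["asset_id"],
        "version": event["version"],
        "file_path": event["path"],
        "status": "pending_review",  # a new version re-enters the review cycle
    }
    return [target(update) for target in targets]

def notify_review_platform(update):
    # Stand-in for creating a new version entry via a review platform's API.
    return {"system": "review", **update}

def notify_task_board(update):
    # Stand-in for moving the matching card to a "Review" column on the board.
    return {"system": "board", "card": update["asset_id"], "status": update["status"]}

results = propagate_version(
    {"asset_id": "SH020_comp", "version": 4, "path": "/masters/SH020_comp_v004.mov"},
    [notify_review_platform, notify_task_board],
)
```

The design point is that each downstream system receives the same normalised payload, so status and metadata cannot drift between the DAM, the review tool, and the board.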
5.4 Theme 3: Governance, conventions, and onboarding
Tool affordances were realised only with governance. Teams that established naming conventions, versioning protocols, and explicit rules for posting comments reported smoother workflows. Conversely, teams that expected tools to self-organise work without conventions experienced chaos.
“We adopted Asana but never agreed on how to name tasks or tag deliverables. It became a mess of duplicate cards and unclear owners.” (Participant P3).
5.5 Theme 4: Social practices shape tool use
Participants highlighted that cultural practices—such as who is expected to attend review sessions or whether verbal feedback is acceptable—determined tool adoption. In some studios, senior creatives resisted centralised tracking, preferring direct Slack messages or phone calls; in others, leadership mandated updates in the PM tool, and compliance improved accountability.
5.6 Theme 5: Media review tools reduce cognitive friction
Tools offering frame-accurate comments, time-coded annotations, and easy version comparison were valued highly for post-production. These platforms reduced the cognitive load of translating spoken notes into actionable tasks.
“When a director scribbles on a printed frame, it takes time to turn that into an actionable edit. Frame.io’s annotations let us see exactly where to change things.” (Participant P14).
5.7 Theme 6: Trade-offs between flexibility and structure
Lightweight boards (e.g., Trello) were praised for flexibility and ease of onboarding, but criticised for lack of advanced features (dependencies, Gantt views, resource levelling). Robust platforms (e.g., Jira, Monday.com) supported complex scheduling but required configuration and training. Participants often adopted a layered approach: a flexible board for creative teams and a heavier planning tool for producers.
6. Discussion
6.1 Interpreting findings through the theoretical framework
The findings align with task-technology fit: tools addressing media-specific tasks (frame-accurate review, versioning) fit post-production tasks better than generic trackers, while visual boards supported task coordination and situational awareness. However, the sociotechnical lens clarifies why fit alone is insufficient. Governance and appropriation practices determined whether affordances translated into performance improvements. In other words, a high-fit tool must be coupled with social practices to realise benefits.
6.2 Practical implications for tool selection and implementation
First, organisations should adopt a staged tool strategy: match tools to production stages and purpose (scheduling vs. review vs. asset management). Second, plan for integration: invest in APIs, middleware, or manual processes to keep metadata and statuses synchronised. Third, codify conventions early—naming, tagging, versioning, and review cycles—and include onboarding materials so new collaborators align quickly.
A layered tooling approach emerged as pragmatic: lightweight visual boards for creative teams; media review tools for director/editor feedback; and a producer-facing system for budgets, timelines, and external reporting. This minimises cognitive overhead for creatives while retaining managerial oversight.
6.3 The role of leadership and culture
Because adoption is a social process, leadership matters. Producers who model consistent use of tools and enforce simple governance practices reduce ambiguity and encourage compliance. Conversely, when senior team members bypass tools, their teams often follow suit—leading to information fragmentation. Change management strategies (training, champions, incremental rollout) are therefore essential.
6.4 The tension between creative freedom and process
Participants expressed a recurring tension: structure can improve predictability but may impede creative spontaneity. The recommended approach is to design minimal constraints that enable coordination without micromanaging creative decisions—for example, using boards for milestones and deliverables but leaving task-level creative decisions to individual practitioners.
6.5 Implications for tool developers
Tool developers serving multimedia markets should prioritise: seamless media playback and frame-accurate annotation; robust versioning and metadata handling for large assets; well-documented integrations with common editing suites and DAMs; and lightweight templates that lower onboarding cost. Developers should also consider features that capture non-task artifacts such as moodboards, cut lists, and review rounds.
7. Conclusion and Recommendations
This study explored how project management tools function within multimedia production workflows, combining qualitative interviews with a sociotechnical and task-technology fit framework. Results indicate that tool effectiveness depends on three interrelated factors: functional fit to the specific production task (especially media review and versioning), integration and data flow across platforms, and organisational practices that govern tool use.
Recommendations for practitioners include:
- Adopt a staged tooling strategy: Use different tools for pre-production, production, and post-production tasks. Pair lightweight visual boards with media-specific review platforms.
- Prioritise integration: Automate metadata and version propagation where possible to reduce manual reconciliation and version sprawl.
- Establish conventions and governance: Define naming conventions, versioning protocols, owners, and review windows early in the project and document them in a shared playbook.
- Invest in onboarding and champions: Provide short training sessions and designate power users who can support others and enforce good practices.
- Balance structure and creative freedom: Apply minimal necessary structure to coordinate work while preserving space for creative iteration.
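As a concrete illustration of the "conventions and governance" recommendation, a team's naming playbook can be backed by a small validator run at delivery time. The pattern below (project, sequence, shot, task, version) is an invented example for illustration, not an industry standard; real teams would codify their own pattern in the shared playbook:

```python
import re

# Hypothetical convention: PROJECT_SEQ###_SH###_task_v###.ext,
# e.g. "ORION_SEQ010_SH020_comp_v003.mov".
NAME_PATTERN = re.compile(
    r"^(?P<project>[A-Z0-9]+)_SEQ(?P<seq>\d{3})_SH(?P<shot>\d{3})"
    r"_(?P<task>[a-z]+)_v(?P<version>\d{3})\.(?P<ext>mov|mp4|wav|exr)$"
)

def check_delivery_name(filename):
    """Return parsed fields if the name follows the convention, else None."""
    m = NAME_PATTERN.match(filename)
    return m.groupdict() if m else None

print(check_delivery_name("ORION_SEQ010_SH020_comp_v003.mov"))
print(check_delivery_name("final_FINAL_v2 (1).mov"))  # convention violation -> None
```

Because the parsed fields double as metadata (shot, task, version), a convention like this also feeds the integration work described above without manual re-entry.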
For tool vendors: focus on media-native features (frame-accurate feedback, scalable asset handling), easy integrations, and templates tailored to common multimedia workflows. For researchers: future work could quantitatively link tool configurations and practices to delivery outcomes (on-time delivery, budget adherence, quality assessments) and explore the role of AI-assisted asset management in reducing manual overhead.
7.1. Limitations
This study used purposive sampling and qualitative methods to prioritise depth over generalisability. Findings reflect the experiences of 24 practitioners and may not capture all organisational contexts, particularly large-scale feature film pipelines with highly bespoke tools. Additionally, while participants spanned geographies, the sample is biased toward organisations willing to discuss their practices; studios with highly proprietary pipelines may be underrepresented.
References
Adler, M., & Wyman, S. (2019). Collaborative review systems in post-production. Journal of Media Production, 6(1), 22–41.
Anderson, D. J. (2010). Kanban: Successful evolutionary change for your technology business. Blue Hole Press.
Beck, K., et al. (2001). Manifesto for Agile Software Development. http://agilemanifesto.org/
Berger, R., & Johnson, L. (2018). Task boards and the creative process: Visual management in film production. Journal of Creative Media Studies, 4(2), 45–62.
Bijker, W. E., Hughes, T. P., & Pinch, T. J. (1987). The social construction of technological systems: New directions in the sociology and history of technology. MIT Press.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry and research design: Choosing among five approaches (4th ed.). Sage.
Denning, S. (2016). What is Agile? Strategy & Leadership, 44(4), 10–17.
Gibson, J. J. (1977). The theory of affordances. In R. Shaw & J. Bransford (Eds.), Perceiving, acting, and knowing (pp. 67–82). Lawrence Erlbaum.
Goodhue, D. L., & Thompson, R. L. (1995). Task-technology fit and individual performance. MIS Quarterly, 19(2), 213–236.
Highsmith, J. (2009). Agile project management: Creating innovative products (2nd ed.). Addison-Wesley.
Hollingsworth, R., & O’Donnell, K. (2014). Governance and workflow in creative industries. International Journal of Cultural Policy, 20(3), 321–335.
Liker, J. K., & Morgan, J. M. (2006). The Toyota way in services: The case of lean management in service industries. McGraw-Hill.
Norman, D. A. (2013). The design of everyday things: Revised and expanded edition. Basic Books.
Orlikowski, W. J. (1992). The duality of technology: Rethinking the concept of technology in organizations. Organization Science, 3(3), 398–427.
Orlikowski, W. J., & Scott, S. V. (2008). Sociomateriality: Challenging the separation of technology, work and organisation. The Academy of Management Annals, 2(1), 433–474.
Project Management Institute. (2021). A guide to the project management body of knowledge (PMBOK Guide) (7th ed.). PMI.
Rigby, D. K., Sutherland, J., & Takeuchi, H. (2016). Embracing Agile. Harvard Business Review, 94(5), 40–50.
Serrador, P., & Pinto, J. K. (2015). Does Agile work? — A quantitative analysis of project success. International Journal of Project Management, 33(5), 1040–1051.
Trist, E., & Bamforth, K. (1951). Some social and psychological consequences of the Longwall method of coal-getting. Human Relations, 4(1), 3–38.
Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425–478.