
What's taking so long? A collaborative method of collecting designers’ insight into what factors increase design effort levels in projects

Published online by Cambridge University Press:  11 September 2020

Alexander Freddie Holliman*
Affiliation:
Department of Design, Manufacturing and Engineering Management, University of Strathclyde, James Weir Building, Glasgow G1 1XJ, UK
Avril Thomson
Affiliation:
Department of Design, Manufacturing and Engineering Management, University of Strathclyde, James Weir Building, Glasgow G1 1XJ, UK
Abigail Hird
Affiliation:
Department of Design, Manufacturing and Engineering Management, University of Strathclyde, James Weir Building, Glasgow G1 1XJ, UK
Nicky Wilson
Affiliation:
Department of Design, Manufacturing and Engineering Management, University of Strathclyde, James Weir Building, Glasgow G1 1XJ, UK
*
Author for correspondence: Alexander Freddie Holliman, E-mail: alexander.holliman@strath.ac.uk

Abstract

Design effort is a key resource for product design projects. Environments where design effort is scarce, and therefore valuable, include hackathons and other time-limited design challenges. Predicting design effort needs is key to successful project planning; therefore, understanding design effort-influencing factors (objective considerations that are universally accepted to exert influence on a subject, that is, types of phenomena, constraints, characteristics, or stimulus) will aid in planning success, offering an improved organizational understanding of product design, characterizing the design space and providing a perspective to assess project briefs from the outset. This paper presents the Collaborative Factor Identification for Design Effort (CoFIDE) Method based on Hird's (2012) method for developing resource forecasting tools for new product development teams. CoFIDE enables the collection of novel data of, and insight into, the collaborative understanding and perceptions of the most influential factors of design effort levels in design projects and how their behavior changes over the course of design projects. CoFIDE also enables design teams, hackathon teams, and makerspace collaborators to characterize their creative spaces, to quickly enable mutual understanding, without the need for complex software and large bodies of past project data. This insight offers design teams, hackathon teams, and makerspace collaborators opportunities to capitalize on positive influences while minimizing negative influences. This paper demonstrates the use of CoFIDE through a case study with a UK-based product design agency, which enabled the design team to identify and model the behavior of four influential factors.

Type
Research Article
Copyright
Copyright © The Author(s), 2020. Published by Cambridge University Press

Introduction

As with many industries, time is a valuable, irreplaceable resource for design projects, particularly for hackathons, where challenges are especially time-constrained (Raatikainen et al., Reference Raatikainen, Komssi, Bianco, Kindstöm and Järvinen2013). This resource, referred to as design effort, is typically measured in person-hours and is defined as the amount of time required to complete a project or a task (Salam et al., Reference Salam, Bhuiyan, Gouw and Raza2009; Salam and Bhuiyan, Reference Salam and Bhuiyan2016). Taking the design agency industry as an example, designers record their efforts using timesheets, and design agencies and teams will typically charge (or invoice) their clients in either hours or days. It is common practice for design agencies and teams to charge their clients for one length of time while working a greater amount. This deliberate discrepancy in project time is especially common in smaller agencies and teams, where projects run to tight margins. Quotes are therefore deliberately underestimated, especially when a project is put out to tender, in order to win the bid and secure the client's business with a view to establishing a longer-term working relationship. This presents a significant challenge when conducting research into small design teams, as anticipated project design effort levels may not be a reliable measure of design project management.

With such a universal and critical resource, there is undoubtedly a wide range of potential factors that contribute to the characterization of a design space, influencing the design effort required to complete a project. But which factors have the greatest influence, and how do they behave over time? An enhanced understanding of what these factors are and how they influence design effort is key to effective project planning and, in the case of industry, more accurate invoicing of projects.

One successful means of estimating design effort is the use of tacit knowledge and experience, which designers already apply when planning product design projects (Brauers and Weber, Reference Brauers and Weber1988; Eckert and Clarkson, Reference Eckert and Clarkson2010; Jack, Reference Jack2013; Serrat et al., Reference Serrat, Lumbreras and López2013). Yet one notable limitation of tacit knowledge is that it can be difficult to articulate and can manifest as a “gut feeling” or “hunch”. This difficulty in communicating opinions effectively can also lead to misunderstanding between design team members (Luck, Reference Luck2013). This is particularly significant during the initial stages of the design process, which is a social and collaborative process (Shai and Reich, Reference Shai and Reich2004) in which fostering a dialogue, especially within interdisciplinary teams, is a particular challenge (Bowen et al., Reference Bowen, Durrant, Nissen, Bowers and Wright2016).

Notably, this challenge is present in hackathons and makerspaces, where the creation of collaborative ad hoc interdisciplinary teams to respond to specific goals is commonplace. These teams, comprised of members who likely have only met at the start of the project (Komssi et al., Reference Komssi, Pichlis, Raatikainen, Kindström and Järvinen2015), are typically from vastly differing backgrounds and levels of experience (i.e. Jensen et al., Reference Jensen, Semb, Vindal and Steinert2016; Pe-Than et al., Reference Pe-Than, Nolte, Filippova, Bird, Scallen and Herbsleb2019). Further compounding this challenge, hackathon activities are time-constrained (Raatikainen et al., Reference Raatikainen, Komssi, Bianco, Kindstöm and Järvinen2013) limiting the level of design considerations made within their process (Saravi et al., Reference Saravi, Joannou, Kalawsky, King, Marr, Hall and Wright2018). Considering the imposed time limits in hackathons, and the ad hoc team creation found in makerspaces, there is a clear need for quick, effective development of mutual understanding and team dynamics (Raatikainen et al., Reference Raatikainen, Komssi, Bianco, Kindstöm and Järvinen2013).

A means of capturing the tacit knowledge and experience of designers can enable the successful articulation of opinions, providing and sharing tactical insight into the design and planning process and enabling designers to discuss their perceptions with one another, bringing them “onto the same page”. There is therefore a clear need for a method that can capture the tacit knowledge of factors that influence design effort levels in product design, as held by designers and design teams. Additionally, there is a clear requirement for a method that can articulate this knowledge in a manner that can be understood by all team members, further improving the overall characterization and understanding of the design space and the factors that influence it.

This paper presents the Collaborative Factor Identification for Design Effort (CoFIDE) method, a new method developed from Hird's (Reference Hird2012) method for producing resource forecasting tools for new product development (NPD). CoFIDE helps design teams characterize their creative spaces by identifying and capturing the factors perceived as most influential on design effort levels in product design projects, through the capture of tacit knowledge and experience. This tacit knowledge and experience is modeled graphically, characterizing the design space, enabling direct comparison, and providing the opportunity for better understanding between team members by illustrating how each factor changes over the course of a design project.

This paper can be considered to have two parts, starting with an outline of the current state of design effort-influencing factor identification approaches, which demonstrates the need for suitable methods. The second part is a presentation of CoFIDE using a case study example with data gathered at a UK-based product design agency, Design Agency 1 (DA1), demonstrating the novelty of the data collected and the insight it offers. This demonstration includes the factors considered by the team at DA1 to be most influential and an analysis of the method and the results. This part also discusses the output of CoFIDE in detail, including the key mean effect plots and percentage influence graphs used to model the participating designers' perceptions of design effort levels in product design. The five-person design team of DA1 acted as study participants; they are either experienced product designers and product design engineers, or members of management (at both director level and middle management), educated to degree level in either product design or product design engineering.

Literature review

A literature review was conducted to identify existing published work on design effort estimation. This was done using combinations of key words to search internet databases (Scopus and IEEExplore), identifying key papers relating to the research. Key words used in the search included “design effort”, “design project”, “product design”, “project time”, “resource estimation”, “resource forecasting”, and “project planning”, used in various combinations. Thirty-five papers were identified where the estimation of product design effort, project time, or similar was either the focus of the method covered or part of a larger method. Of the papers identified, 16 were either generalized approaches where insufficient detail was provided to determine what, if any, methods and techniques were applied, or discussed generic project management methods, thus not specifically focusing on design effort estimation. The remaining 19 papers have a range of scope, from generic product design projects to tooling design.

Papers addressing the estimation of product design project cost have been included, as there is an intrinsic link between project length and project cost (Jacome and Lapinskii, Reference Jacome and Lapinskii1997) and these methods estimate project length as part of their approach. Additionally, three papers were found using these terms that addressed design effort-influencing factors without producing a resource-estimating tool; these were also included in this review.

This review will firstly discuss the published design effort estimation methods to determine how each factor (if present) was identified. Additionally, the factors identified by each method are gathered and categorized based on the definitions stated by the authors of each publication.

Current methods to estimate design effort

Literature addressing design effort estimation was categorized in six ways, shown on the top row of Table 1. These categories were identified by considering the methods, technologies, etc. used in each approach and the sources of their data. They address whether the method identifies factors or draws conclusions from existing literature; the means of factor identification (brainstorming, data analysis, or surveys and interviews); and methods that either do not state factors or do not justify the factors used.

Table 1. Design effort estimation methods in product design that consider influential factors

Methods that identify factors

Table 1 indicates that eight of the papers reviewed identified factors as part of their overall process. Of these papers, four used a statistical analysis approach and the remaining four engaged with experts by various means to identify factors. A further eight papers reviewed made assumptions about influential factors based either on a synthesized list derived from their own literature reviews or on pre-existing research and methods. Another common approach to factor identification is to gather insight from industry. This is typically achieved through interviews with designers or through brainstorming (i.e. Andersson et al., Reference Andersson, Pohl and Eppinger1998). Such approaches rely on the tacit knowledge of designers to successfully identify these factors.

An alternative method for identifying influential factors is through various forms of data analysis. Four papers were identified using this approach, including Cho and Eppinger (Reference Cho and Eppinger2005), who further acknowledge that the influence of factors can vary over time. Many data analysis methods use regression analysis, particularly to train simulations, for example, a Monte Carlo simulation (Hellenbrand et al., Reference Hellenbrand, Helten and Lindemann2010), or other regression-based approaches.

The other main approach to identifying influential factors is through the review of literature. Eight of the methods reviewed based their assumptions about influential factors on published research or models, or on a list synthesized from a literature review. These studies look to produce a range of factors from which practitioners can identify the most influential by using their tacit knowledge of the design space (i.e. Bashir and Thomson, Reference Bashir and Thomson2001a), or use various factors from their own literature reviews to inform statistical analysis approaches using variations of neural networks or similar (Xu and Yan, Reference Xu and Yan2006; Yan and Xu, Reference Yan and Xu2007; Pollmanns et al., Reference Pollmanns, Hohnen, Feldhusen, Abramovici and Stark2013; Wang et al., Reference Wang, Tong and Huang2015). Notably, these approaches do not utilize the first-hand tacit knowledge of the design teams their methods are intended for.

Some methods use a small number of factors, which they discuss within their literature reviews. Bashir and Thomson (Reference Bashir and Thomson2001b) offer two approaches to estimate design effort through historical data analysis. In both instances, they consider product complexity to be the major influential factor, along with the severity of requirements. Other approaches with few influential factors are for specific use cases, such as those of Salam et al. (Reference Salam, Bhuiyan, Gouw and Raza2009) for aircraft engine compressor design.

Other methods using factors

Unjustified factor use

Two of the papers covered in this review use influential factors, or a similar term, without any specific justification, to produce cost estimates, drawing connections between project cost and project design effort levels and treating factors relating to productivity as influential factors, yet no sources for these are specified. The method proposed by Zhigen and Yan (Reference Zhigen and Yan2011) uses regression analysis to predict design effort with factors that have no stated justification for their use.

Methods without factors

Design effort estimation approaches were also identified for this study that did not use any influential factors. These methods opt either to model the design process in collaboration with designers and engineers (Eppinger et al., Reference Eppinger, Nukala and Whitney1997) or to use design structure matrix (DSM)-based modeling (Smith and Eppinger, Reference Smith and Eppinger1997; Yan et al., Reference Yan, Wang, Xu and Wang2010). Although these methods do not explicitly identify any influential factors, it is clearly still necessary to understand what influences the calculated probability.

It is clear that a significant number of design effort estimation approaches rely on the use and understanding of influential factors, identifying them in various ways. These methods vary in approach — some work in participation with design teams, utilizing their tacit knowledge — and in the level of structure applied. Regardless of the specific steps of these approaches, an understanding of which factors exert an influence over design effort levels is essential to the process. The following section considers what these types of factors are.

Factors influencing design effort levels found in the literature

From the analysis of the 59 factors found in the literature, shown in Table 2, 10 factor categories were identified based on the collation of definitions given by the authors. These categories are project, product, team management, business management, client, information, stakeholder, tools & technology, external influences, and retrospective-only. A further category of “Not Included” has also been added to acknowledge instances where it was not possible to confidently determine the justification or definition of a term. The distribution of factor categories is shown in Figure 1 and the full list of factors found in the literature is shown in Table 2. Many of the factors identified within this review fall into more than one category. Project-based factors refer to the project type or the activities within a project.

Fig. 1. Factor categorization analysis.

Table 2. Design effort-influencing factors

Product-based factors are those that refer to qualities or attributes of the intended product. This was one of the most common types, with the most common factor being “product complexity”. Two categories are management-based: team management-based factors refer to the makeup and management of design team members, while business management-based factors refer to the overall management of the design agency (or similar), including the business plan, strategies, etc. used business-wide. Additionally, means-based factors fall into two categories: information-based factors relate to the exchange of information (although many such factors can be assigned additional categorizations), and tools & technology-based factors refer to the use and availability of equipment or other technologies to aid in the development of a product.

Two other factor categories consider external parties involved in the design process. Client-based factors refer to any issues or characteristics displayed by the client, including factors that consider the levels of information provided to the design team by the client. Stakeholder-based factors refer to those involving stakeholders other than the client, including processes to resolve conflict with stakeholders and the geographical locations of stakeholders.

Additional factor categories include external influences-based and retrospective-only factors. The former refer to any non-stakeholder external body that may influence a design project (including political and market-based influences); the latter are factors proposed by authors that can only be assessed after a project has been completed.

Factors not included

Two factors were proposed by Pollmanns et al. (Reference Pollmanns, Hohnen, Feldhusen, Abramovici and Stark2013) which have not been included in the analysis as the source is written in German. To prevent any misinterpretation of the authors’ intent, these have been disregarded.

Many factors have been found to influence project planning and design effort; their contexts vary, and they can be specific to a product feature or project phase, or general with universal influence. Writing on the topic is significantly limited, which emphasizes the need for further study of the field. Furthermore, although this analysis shows which factor categories are more common (i.e. product and team management-related factors), the specifics of each factor vary from study to study, further emphasizing the need for study in this field. Additionally, factors identified through literature review do not allow practicing design teams to offer their own insight (based on their tacit knowledge of the design space).

Literature review summary

In this literature review, a clear link has been shown between a designer's, or design team's, understanding of the design effort levels needed for a project and their understanding of the factors that influence such levels. It has been shown that the use of experience and tacit knowledge can lead to accurate design effort estimation, demonstrating that designers have a working understanding of influencing factors. Some methods have been developed for factor identification or design effort estimation, with a range of use cases and scope. Yet research into this topic is limited: many studies assume factors have the same influence over different teams' projects, using literature review findings as a guide; others rely on the analysis of past project data to identify factors. There is therefore a clear need for more study of these factors to enable improved comprehension across the product design field; more study of the identification of factors to enhance factor discovery by design teams and effect valuable impact on practicing design teams; and more study of capturing the tacit knowledge and experience of designers to aid this discovery, using the data captured through experience. The findings of this literature review show a clear gap in methods to identify the influential factors of design effort levels in product design projects, specifically methods that utilize the tacit knowledge of design teams. In response, the authors propose the following research questions:

  • RQ1: Through the capture of tacit knowledge of design teams, what novel data and data presentations can be generated from new design effort-influencing factor identification approaches?

  • RQ2: What new insights and opportunities does this offer makerspace collaborators and hackathon participants?

The following discussion will address and answer these research questions through the application of a new approach and discuss the output of said approach in a stepwise manner, highlighting the novelty of the data produced and the value gained from it.

CoFIDE – a new method for identifying design effort-influencing factors in product design

CoFIDE provides detailed novel data from which researchers can gain insight into how design teams perceive their design space, their projects, and the factors that influence them. This insight facilitates a deeper and improved understanding of the factors that influence design effort demands, including how the behavior of each factor changes over the course of the project. By identifying which factors have the greatest influence, researchers and design teams can make efforts to minimize the negative effects of some factors while maximizing the positive effects of others. In hackathons and limited-time design challenges, this insight can help maximize the effectiveness of the design team and enable mutual understanding within the team. By repeating CoFIDE at regular intervals, researchers can determine whether the influence of each factor has changed based on design teams' efforts to manipulate these factors.

CoFIDE method background

This paper presents CoFIDE, which builds upon the resource forecasting method for NPD teams developed by Hird (Reference Hird2012). Working with NPD teams in various industries, Hird developed a method with a foundation in Fisher's Design of Experiments (Reference Fisher1949), which captures the perceptions and tacit knowledge of NPD teams' management in order to replicate it for future NPD project planning (Fig. 2).

Fig. 2. A new product development resource forecasting method. Adapted from Hird (Reference Hird2012).

Hird's method is a five-step process, following closely that of the traditional Design of Experiments approach, but with three main differences: physical (or simulated) experiments are replaced with estimations about hypothetical scenarios; objective, measurable inputs are replaced with tacit, subjective expert knowledge as the subject of modeling; and the results of the analysis are used for prediction rather than optimization.
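The first of these differences — replacing experimental runs with estimates of hypothetical scenarios — can be sketched in code: each combination of factor levels becomes a scenario put to the experts for a person-hours estimate. The following Python sketch is illustrative only; the factor names and levels are hypothetical placeholders, not drawn from Hird's or DA1's materials.

```python
from itertools import product

# Hypothetical two-level factors (low, high) -- placeholders for illustration.
factors = {
    "product_complexity": ("simple", "complex"),
    "brief_definition": ("vague", "detailed"),
    "client_experience": ("novice", "experienced"),
}

def full_factorial(factors):
    """Enumerate every combination of factor levels (a 2^k full factorial design)."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Each resulting dict is one hypothetical scenario for experts to estimate.
scenarios = full_factorial(factors)
print(len(scenarios))  # 2^3 = 8 scenarios
```

A full factorial over k two-level factors yields 2^k scenarios; fractional designs can reduce the estimation burden as k grows.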

CoFIDE builds upon Hird's method in three main ways:

  1. Collaborative approach: CoFIDE works collaboratively with every member of a design team, rather than only with team managers, which prevents users of CoFIDE from overlooking the potentially valuable insight and knowledge held by design team members.

  2. Project types it has been developed for: NPD teams typically operate within the parameters of their company – a medical device company will most likely develop other medical devices rather than, say, children's toys. The scope for new projects will be limited, so the factors that influence these projects may be niche to the field, whereas the diversity of potential project types that a design agency may take on is significant. CoFIDE has been developed for use by design agencies; therefore, the factors being considered could be broader, or more generalized.

  3. Graphical modeling of each designer's perceptions: CoFIDE graphically models the perceptions each team member has of these factors, providing a means of comparison and a greater understanding of the characteristics of each factor over the course of the project, rather than using this insight for design effort estimation.

CoFIDE method

The following section describes each stage of the application of CoFIDE in turn, providing example case study data and the analysis and findings of applying CoFIDE in a design context. The novelty of the data gathered and generated is shown, the insight it offers is demonstrated, and the potential benefits for design teams in makerspaces and hackathons are also explored.

CoFIDE method introduction

CoFIDE is a four-step method enabling research into a collaborative understanding of the most influential factors of design effort requirements in product design projects, as perceived by design team members. Each of the four stages (shown in the left-hand column of Figure 3) provides data key to the study of the practice of product design and of the product design industry, including models of factor behavior during a design project and design processes used in the industry. Case study examples of these data and the research insight they offer are included throughout this study.

Fig. 3. Collaborative Factor Identification for Design Effort (CoFIDE) method.

Case study introduction

The case study data presented in this paper were collected from a UK-based product design engineering agency, Design Agency 1 (DA1). The case study was conducted over 4 h spread across 2 months due to participant availability. DA1 has experience developing a diverse range of products for varied markets, including sports training equipment and food & beverage equipment. At the time of the study, DA1 employed five full-time product designers and product design engineers, and a studio manager, all of whom participated in the study. The participants have varying degrees of experience, from mid-level designers to design directors with over 10 years of experience, outlined in Table 3. Discussion of the findings of each step will be included where appropriate. Although in this example case study statistical analysis was conducted using Minitab 17.0, it can be replicated using MS Excel (or similar).
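The analysis can indeed be replicated without dedicated statistics software. As a minimal sketch of the calculations behind the mean effect plots and percentage influence graphs referred to in this paper — using invented numbers and a hypothetical two-level, two-factor design with one person-hours estimate per scenario:

```python
# Effort estimates (person-hours) for a 2^2 design over two hypothetical factors.
# Each row: (complexity level, brief level, estimated hours). Numbers are invented.
estimates = [
    ("simple",  "vague",    40),
    ("simple",  "detailed", 30),
    ("complex", "vague",    90),
    ("complex", "detailed", 70),
]

def mean_effect(estimates, index, high, low):
    """Main effect = mean response at the high level minus mean at the low level."""
    hi = [h for *levels, h in estimates if levels[index] == high]
    lo = [h for *levels, h in estimates if levels[index] == low]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

complexity_effect = mean_effect(estimates, 0, "complex", "simple")  # 45.0
brief_effect = mean_effect(estimates, 1, "detailed", "vague")       # -15.0

# Percentage influence: each factor's share of the total absolute effect.
total = abs(complexity_effect) + abs(brief_effect)
pct = {"complexity": 100 * abs(complexity_effect) / total,
       "brief": 100 * abs(brief_effect) / total}
```

Plotting the per-level means gives a mean effect plot; plotting `pct` gives a percentage influence chart. The exact graphs in the paper may be computed differently; this is one standard two-level main-effects calculation.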

Table 3. Design Agency 1 participant roles

Stage 1: Design process and factor identification

CoFIDE consists of semi-structured interviews and brainstorming workshops conducted by the researcher to generate and gather all relevant data to build the experimental design. This first stage provides researchers with the fundamental data to produce experimental designs for further data collection, as well as details on the types of factors considered and the design processes followed in the industry.

Mapping the design project process

DA1 has a formal design process that it uses for all its projects; gathering this information was therefore straightforward, done during a semi-structured interview conducted by the researcher with the Managing Director and Studio Manager. The process is similar to the Design Council's Double Diamond. DA1's adaptation retains the standard four stages (each with their own tasks and sub-tasks) of “Discover, Define, Develop, Deliver”, while adding an initial “pre-sign off” stage and splitting the “develop” stage into two: “design” and “detail”. DA1's design process is illustrated in Figure 4.

Fig. 4. Design Agency 1's design process.

Resource identification

Identifying resources is key for CoFIDE, as the resource is the subject of the factors' influence. As discussed in the literature review, although the resource in question is design effort and is measured in units of time, the intention of this step is to determine which specific unit will provide maximum utility for the remaining steps of CoFIDE. During the same semi-structured interviews, DA1's project resource type was identified as “person-hours”, as this is the unit used for effort tracking and invoicing. During the workshop, participants were invited to discuss alternatives; everyone agreed this was the best measure.

Factor identification

A long list of factors is generated by the participants through brainstorming facilitated by the researcher. DA1 participants, unprompted by the facilitator, approached this task by addressing each design project stage individually, identifying the factors that influenced the length of each stage. This resulted in the creation of seven distinct categories: one for each design process stage, plus one for factors that affected more than one, or all, of the stages. In total, 63 different factors were suggested, shown in the right-hand column of Table 4; these were then regrouped into 10 different categories, shown in the left-hand column of Table 4. During informal interviews, the participants agreed that this clustering process helped them identify similar terms applied to separate stages of the design process and allowed common themes to be established. An advantage of this stage-by-stage process is that participants were able to define each of the clustered factors through a varied range of terms for similar factors. However, the process also allowed some terms to be suggested that were activities or tasks rather than factors; these have been placed in parentheses in Table 4. Best practice for future uses of CoFIDE should include guidance to prevent suggestions of activities or tasks in lieu of factors.

Table 4. Factors influencing design effort levels of design projects as perceived by Design Agency 1

Factor selection

The most influential factors were individually rank-voted confidentially in order to prevent inter-participant influence on voting. Ranking factors aided in capturing which of the factors voted for were considered most influential. This voting activity, shown in Table 5, led to the selection of client “gut feeling”, definition level inputs, product complexity, delivery output complexity (DOC), and design experience as the factors perceived by DA1 to have the most influence on project length.

Table 5. Voting for shortlist of factors for design effort influence in design projects at Design Agency 1

The top factors were (in descending order): client “gut feeling” (the intuitive reaction the design team has to the client), definition level inputs, product complexity, and delivery output complexity (after a tie-breaking vote). As shown in Table 2 (in the literature review), of the seven factors categorized as client-based, none consider the design team's intuition about the perceived qualities of the client specifically. Definition level inputs relate to the brief (as shown in Table 4), specifically the levels of information provided; 10 of the identified factors in Table 2 consider information by some means. Product complexity (a product-based factor) is one of the most common specifically mentioned factors in Table 2. It is noteworthy that this was only voted third most influential by the DA1 team. DOC is also a product-based factor, one of the most common factors found in Table 2.
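The confidential rank-voting behind this shortlist amounts to a positional tally. As a sketch of one plausible scoring scheme — a Borda-style count with invented ballots; the paper does not specify DA1's exact scoring rules:

```python
from collections import Counter

# Each ballot lists one participant's factors in descending order of influence.
# Ballots and factor names are invented for illustration.
ballots = [
    ["gut_feeling", "complexity", "definition_inputs"],
    ["definition_inputs", "gut_feeling", "complexity"],
    ["gut_feeling", "definition_inputs", "output_complexity"],
]

def tally(ballots, top_n=3):
    """Borda-style count: rank 1 scores top_n points, rank 2 scores top_n - 1, ..."""
    scores = Counter()
    for ballot in ballots:
        for rank, factor in enumerate(ballot):
            scores[factor] += top_n - rank
    return scores.most_common()

print(tally(ballots))
```

Ties at the top of such a tally (as occurred at DA1) would still need a tie-breaking vote or an agreed tie-break rule.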

Each factor was assigned a minimum and maximum level, with participants using the corresponding factor elements to help define the factor's range, shown in Table 6. Two of the factors [client “gut feeling” and definition level (inputs)] were assigned a 4-point scale; product complexity was given a range of “simple” to “complex”; and DOC was given a scoring system based on a quadrant diagram with risk and complexity on the axes, giving the factor a 3-point range.

Table 6. Factor classification and elements

The novel data gathered in this first step of CoFIDE provide researchers with valuable insight into how practicing design teams conduct their projects, through the capture of the processes used in industry and (as the case study shows) how formal processes have been adapted to best suit those using them. Capturing the formal processes used across the industry makes it possible to establish a greater understanding of which of the proposed processes are used, and which are used most commonly.

This first step enables researchers to capture, through a formalized process, the factors that industry and practicing designers consider influential. By applying the process with various design teams, researchers can create a list of global factors that influence design effort. Such data would enable the identification of regional and global trends and correlations in which factors influence projects. This may offer opportunities to investigate regional differences based on design education (availability, type, etc.) and available resources (manufacturing, supply, etc.). The lists of factors synthesized from brainstorming further improve understanding of the design space by identifying industry-based definitions for these factors.

Applying the first step of CoFIDE in a hackathon or makerspace environment provides a valuable structure for newly formed design teams to follow. By mandating the discussion of design processes from the outset, hackathon teams and makerspace collaborators must agree on a process before tackling the challenge, providing valuable structure to their hack. This includes considering the tasks necessary to complete their goals, which is a particular challenge. Treating design effort as a resource gives hackathon teams context and a measure for the design process, aiding in the identification of feasible outcomes for the hackathon. Hackathon teams and makerspace collaborators can be made up of participants with varied backgrounds and levels of experience; therefore, by brainstorming influential factors and reflecting on the results of a vote (like those shown in Table 5), hackathon design teams are able to establish a mutual understanding of what will affect their project and, specifically, which factors to give the most attention.

Stage 2: Estimation collection

During Estimation Collection, CoFIDE uses statistical analysis (using software such as Minitab 17.0) to produce a half-factorial experimental design [based on Fisher's Design of Experiments approach (Fisher, 1949)] using the factors and defined levels to describe hypothetical design projects.

The gathered factors and design process were used to produce an experimental plan based on a four-factor, two-level, half-factorial Design of Experiments with Minitab 17.0, without randomization. Randomization was omitted because pilot study participants would locate an experimental run that resembled a design project they had experience of, on which they would then base all other estimates. The experimental plan was combined with the six project phases identified in the preliminary work to create the Estimation Sheet for Workshop 2 – Collect Phase, shown in Table 7.

Table 7. Estimation collection sheet
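The structure of such a plan can also be sketched outside Minitab. The following Python snippet is an illustrative reconstruction (not DA1's actual plan): it generates a two-level, four-factor, half-fractional design by applying the generator D = ABC, yielding eight hypothetical projects rather than the sixteen of a full factorial.

```python
from itertools import product

# Illustrative sketch: a 2^(4-1) half-fractional experimental plan like the
# one described in the text. Factor codes follow the case study (CGF, DL, PC,
# DOC); levels are coded -1 (minimum) / +1 (maximum).
def half_fraction():
    runs = []
    for a, b, c in product((-1, 1), repeat=3):
        d = a * b * c  # generator D = ABC defines the half-fraction
        runs.append({"CGF": a, "DL": b, "PC": c, "DOC": d})
    return runs

plan = half_fraction()
print(len(plan))  # 8 hypothetical design projects for participants to estimate
```

Because D is the product of the other three columns, every run satisfies the defining relation I = ABCD, which is what makes some two-factor interactions inseparable in the later regression step.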

During semi-structured focus group discussions conducted by the researcher, every participant estimates the design effort needed to complete each of the hypothetical design projects described by the experimental runs. DA1 participants completed their estimations simultaneously without conferring, taking less than an hour. The resulting estimation responses were gathered and used in the next phase of CoFIDE.

Stage 3: Perception model building

Regression equations derived from the participants’ estimate values are produced by the researcher using statistical analysis software, such as Minitab 17.0, enabling the modeling of participants’ perceptions. These models take two forms: the regression equation factor coefficient model and the mean effect plots. The regression equation coefficient values allow researchers to identify the perceived magnitude of each factor's influence; however, these values do not depict the behavior of each factor (i.e., whether it influences design effort positively or negatively), nor do they account for the constant value included in each regression equation. The mean effect plots, representing the average effect of each factor at each project stage, illustrate the direction of the change in influence that the factor has on the project stage length.

Using Minitab 17.0, 30 regression equations were created: six for each participant, one predicting the design effort of each project phase. Each factor was coded as follows: client “gut feeling” (CGF); definition level (inputs) (DL); product complexity (PC); and delivery output complexity (DOC). As the experimental design is a half-factorial, not all inter-factor interactions can be modeled independently: those of definition level (inputs) × product complexity, definition level (inputs) × delivery output complexity, and product complexity × delivery output complexity are aliased with other interactions. Each participant's set of regression equations is summarized in Table 8.

Table 8. Participant regression equation values for design effort levels in product design projects

Note: Project Phases are numbered: 1. Pre-sign off; 2. Discover; 3. Define; 4. Design; 5. Detail; 6. Deliver.

Factors are labelled: A. Client “Gut Feeling”; B. Definition Level (Inputs); C. Product Complexity; D. Delivery Output Complexity
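As a sketch of how such equations arise (using hypothetical estimates, not the values behind Table 8), each participant's eight estimates for a phase can be regressed on the coded factor levels to recover a constant plus one coefficient per factor:

```python
import numpy as np

# Illustrative sketch (not DA1's actual data): fitting one participant's
# regression equation for one project phase. Columns are the coded factor
# levels (-1/+1) of CGF, DL, PC, DOC from the half-fractional plan; y holds
# the participant's hypothetical design effort estimates, in days.
X = np.array([
    [-1, -1, -1, -1],
    [ 1, -1, -1,  1],
    [-1,  1, -1,  1],
    [ 1,  1, -1, -1],
    [-1, -1,  1,  1],
    [ 1, -1,  1, -1],
    [-1,  1,  1, -1],
    [ 1,  1,  1,  1],
])
y = np.array([4.0, 2.5, 3.0, 2.0, 8.0, 6.5, 7.0, 6.0])

# Add an intercept column and solve by least squares, mirroring the
# "constant + factor coefficients" form of the reported equations.
A = np.column_stack([np.ones(len(y)), X])
coefs, *_ = np.linalg.lstsq(A, y, rcond=None)
constant, (cgf, dl, pc, doc) = coefs[0], coefs[1:]
```

Because the coded columns are orthogonal, each coefficient isolates one factor's perceived influence on the estimated phase length, independently of the others.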

This step of CoFIDE provides a unique opportunity for researchers to model designers’ perceptions. Doing so provides insight not only into how factors influence design effort, but also into how that influence changes from project phase to project phase. Researchers can use these models to estimate the design effort needed for future projects that have been evaluated against the same factors. Additionally, these models enable researchers and design teams to optimize their design space by taking steps to reduce the negative influence of factors and, conversely, increase their positive influence. The mean effect values produced through CoFIDE show each factor's influence on a phase-by-phase basis, allowing researchers to map the behavior of factors over the course of the design project. Although this step of CoFIDE provides novel data, valuable to research, the value to hackathon teams and makerspace collaborators is produced in step 4.

Stage 4: Actionable information collation

The graphical models produced by CoFIDE present the analysis in two forms, linked with the data produced in step 3. In a semi-structured interview setting conducted by the researcher, participants evaluate these graph sets, reflecting on their individual perceptions of the factors, and researchers can ask whether the participants feel the graphs accurately illustrate those perceptions.

Percentage influence graphs

Percentage influence graphs, derived from the regression equation values, enable the cross-comparison of all factors for each phase of a design project by plotting the percentage of influence each factor has over the output of each regression equation.

Percentage influence graphs for DA1 produced by the researcher, shown in Figure 5, allow visual identification of which factor has the greatest influence and whether there is consensus within the group. The percentage shown in each graph is the percentage of influence each factor has over the output of the corresponding regression equation. It does not show influence relative to the regression equation's constant, as this would not allow comparison between two different regression equations (i.e., comparison between different participants). As a set, these graphs also depict the changes in levels of influence over the course of a product design project.

Fig. 5. Percentage of influence on design effort over a design project's phases.
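The paper does not state the exact calculation behind these percentages; one plausible interpretation, assuming each factor's share is taken over the absolute factor coefficients of a single regression equation (the coefficient values below are illustrative, not from Table 8), is:

```python
# Hypothetical coefficient values for one participant's regression equation
# for one phase (factor codes as in the case study).
coefficients = {"CGF": -0.625, "DL": -0.375, "PC": 2.0, "DOC": 0.5}

# Share of each factor's absolute coefficient in the equation's total
# coefficient magnitude, expressed as a percentage.
total = sum(abs(v) for v in coefficients.values())
percent_influence = {k: round(100 * abs(v) / total, 1)
                     for k, v in coefficients.items()}
print(percent_influence)  # PC dominates this hypothetical equation
```

Normalizing within each equation is what makes the percentages comparable across participants and phases, even when their raw coefficients differ in scale.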

Client “gut feeling”

When considering Figure 5 in isolation, it is challenging to determine the characteristics and level of influence that the client “gut feeling” factor has over the design time of a project (Figure 6). When considering the averages of each response, shown in Figure 7, however, it is clear that the client “gut feeling” factor has a low influence on design times, with minimal fluctuation across the entire design project. When presented with these findings in an informal interview, the participants suggested that this is likely due to the greater involvement the client has at the project's start, which reduces once designing begins. The participants further suggested that the increase in influence during the Deliver phase is likewise due to increased client involvement. Although considered among the most influential factors at the voting stage of CoFIDE, as a client's “gut feeling” score increases, the anticipated design effort levels of a project decrease. As shown by the calculated percentages, this is the least influential factor, with an average influence of 9.9% across a project.

Fig. 6. Changes in percentage of influence for factors on design project times.

Fig. 7. Average changes in percentage of factors’ influence on design effort.

Definition level (inputs)

Figure 5 shows that the definition level (inputs) factor has the greatest influence on design times during the Discover phase and that this influence gradually reduces as the project progresses, reaching its lowest at the Deliver phase. This is reinforced by the trend line shown in Figure 7. Informal interviews with the case study participants indicate that this is due to ambiguity in the project brief, reflected in the level of the factor, which would be resolved before the later stages of the project.

Product complexity

The influence of the product complexity factor (as shown in Figure 5) increases from the project start, peaking at the Design phase and maintaining a higher influence in the later project phases. Confirming what has been posited by authors such as Griffin (1997), the complexity of a product has a direct influence on design effort levels, particularly during the Design phase. From the case study data, it is clear that product complexity is the most influential factor; this is further emphasized in Figure 7, where the corresponding trend line maintains the highest percentage of influence throughout the course of the project.

Delivery output complexity

According to Figure 5, the influence of the DOC factor increases over the course of the project, with the greatest level of influence held over the Deliver phase. This is shown more clearly in Figure 6, where the trend lines both steadily increase over the course of the project. During informal interviews, the case study participants confirmed that this is because the factor represents the demands of the client and brief on what is expected as the output of the project: a more detailed, longer list of project deliverables increases its perceived complexity, and thus more time is required to fulfill the project requirements.

Mean effect plots

The second type of graphical output of CoFIDE is the mean effect plot produced by the researcher. As the name suggests, this is the graphical version of the mean effect values produced in the previous step. A mean effect plot enables researchers to clearly demonstrate the effect a single independent variable (in this case, a factor) has on the dependent variable (in this case, project time), disregarding the effects of any other factor.

The mean effect plots for DA1's design team, shown in Figure 8, give the direction of influence each factor has on project times, where the gradient of each line indicates both the direction of the relationship between factor and project time and the magnitude of that relationship. Values for each graph are included in Table 9. Each graph illustrates the mean effects for each participant, for each factor, and for each project phase.

Fig. 8. Mean effect plots for factor influence over design effort levels.

Table 9. Mean effect plot values for factor influence over design effort levels
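A minimal sketch of the underlying calculation (using hypothetical estimates, mirroring how such plots are constructed): a factor's mean effect for a phase contrasts the mean estimate at its high level against its low level, so the sign of the difference gives the direction of the plotted gradient.

```python
import numpy as np

# Coded settings of one factor across the eight experimental runs, and one
# participant's (hypothetical) design effort estimates for one phase, in days.
levels = np.array([-1, 1, -1, 1, -1, 1, -1, 1])
estimates = np.array([4.0, 2.5, 3.0, 2.0, 8.0, 6.5, 7.0, 6.0])

mean_low = estimates[levels == -1].mean()   # mean effort at the low level
mean_high = estimates[levels == 1].mean()   # mean effort at the high level
effect = mean_high - mean_low               # negative -> inverse relationship
```

Plotting `mean_low` and `mean_high` as the two endpoints of a line, per factor and per phase, reproduces the form of the plots in Figure 8.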

Client “gut feeling”

The trend lines shown in Figure 8 show an inverse correlation between client “gut feeling” and design effort levels at each project stage: the higher the “gut feeling” score, the less design effort is expected to be required. This factor has its greatest influence over the earlier phases of the project. However, for the “Detail” phase, the trend lines show a mixture of positive and negative gradients, indicating potential confusion in the design space.

Definition level (inputs)

The trend lines shown in Figure 8 show an inverse correlation between definition level (inputs) and design effort levels: the higher the level of definition, the lower the demand for design effort. This factor has its greatest influence over the earlier phases of the project.

Product complexity

Figure 8 shows a clear positive correlation between the product complexity level and design effort levels. Furthermore, it is clear that this influence increases as the project progresses, with the Design, Detail, and Deliver phases showing the greatest increase at high levels of complexity.

Delivery output complexity

Figure 8 indicates a positive correlation between the DOC level and design effort levels. Furthermore, it is clear that this influence increases as the project progresses, with the Design, Detail, and Deliver phases showing the greatest increase at high levels of complexity. However, for the “Pre-Sign Off” phase, the trend lines show a mixture of positive and negative gradients, indicating potential confusion in the design space.

A third set of graphical models was created by the researcher, comparing participants’ factor coefficients as they change per stage and per factor; an example is shown in Figure 9. Consensus and confusion within the design team were identified in a semi-structured workshop by presenting these graphical representations to the team. Insights and discussion points could also be identified during this workshop.

Fig. 9. Percentage of influence of factor per participant comparison example.

Participant evaluation of graphs in combination

When considering the graphs in combination, of the five participants interviewed by the researcher, three (1, 2, and 4) agreed that the relationships between each factor and design effort levels represented in the graphs accurately reflect their personal opinions and perceptions of all factors and phases, while the remaining two participants (3 and 5) had some reservations over particular factor-phase relationships. Participant 3 believed that the mean effect plots reflected their perceptions but felt that the percentage splits could vary. Participant 5 agreed with Participant 3 but expressed doubt in their own ability to accurately estimate design effort, stating that their responses might have been more “anomalous” due to their perceived difficulty completing the estimation task.

Additionally, three participants were able to identify at least one other member of the design team. Although this may not have utility for newly formed design teams for hackathons, etc., it has great potential for longer-standing design teams, where the benefits of mutual understanding can continue beyond a single project.

Observations on definition level (inputs)

Although each participant commented on the results of each factor, specific comments were made about the definition level (inputs) factor, particularly its influence over the latter design phases. For example, Participant 2 stated that “possible issues relating to this factor, ambiguity of brief, etc., would be resolved before later [design] phases started.”

Use of graphs in future

During these semi-structured interviews, each participant assessed the potential utility of the graphs shown to them: specifically, whether they found, or could find, any use for the relationships and correlations between factor levels perceived by themselves and their colleagues. The results are shown in Table 10.

Table 10. Utility of mean effect plots and percentage influence graphs

Three participants believed that the graphs offered some insight into the way that either they or other team members perceive the different factors and how they view project planning. Three participants also believed that the information provided by the graphs could aid unspecified future managerial decision making, with one participant stating that such information could help inform future team construction, qualifying that this would be of greater use when the design team is larger.

By producing graphical models of designers’ perceptions, it is possible to create accessible visual representations of designers’ tacit knowledge-informed perceptions. This provides a simple means of drawing direct comparisons between design team members, enabling the identification of consensus and disagreement within design teams and offering a cornerstone from which to consider the various influences on these perceptions, from background and education to personal taste and opinion. These models also depict each factor's behavior: not just the magnitude of its influence over design effort levels, but also how and when that influence has the greatest effect. Producing such insight allows for the potential identification of common traits and attitudes towards the practice of design, and the design space in general, throughout the industry. Additionally, these graphs provide valuable insight and discussion points by identifying potentially industry-wide issues for future research.

When considering the graphs produced by CoFIDE, hackathon teams and makerspace collaborators can quickly develop a mutual understanding of not only each team members’ perceptions towards these factors but also the influence and behavior of each factor during a design project. This insight further provides hackathon teams with the means to identify points of consensus and confusion between team members. This, in turn, acts as a basis and reference for open discussions around the design project, the factors themselves, and the perceptions and opinions of each member of the team. Additionally, hackathon teams can use the graphs to identify the contributing issues to the most influential factors to provide a focus for improving and optimizing of the design space.

Application of CoFIDE findings

The case study findings demonstrate that it has been possible to provide valuable insight into the design space of DA1's design team. CoFIDE has shown that product complexity is the most influential factor for DA1 and that its influence increases during the course of the project, peaking at the “Design” phase, as shown in Figure 6. Figure 7 shows that as the perceived complexity of a product increases, so too do the design effort levels for a project. By modeling the level of influence each factor has during each project phase, DA1 can identify which factors to address at each phase and can look to optimize its processes to mitigate the negative impacts of each factor. Indeed, during an informal discussion with the company director, it emerged that DA1 has used the insight attained through CoFIDE in several ways, taking steps to manage the influence of factors where possible: for example, improving the management of projects by introducing scoping studies for projects where the definition level is considered too low, and other processes to improve information collection at the pre-sign off phase. Additionally, DA1 has used the differences in perceptions illustrated by the mean effect plots (Fig. 8) to prompt discussion and reflection between team members, further enabling improved mutual understanding of the design space and each team member's role within the agency.

Limitations of method

There are a number of limitations to CoFIDE, which are discussed in this section. Using CoFIDE to identify the most influential factors depends on at least one member of the design team, hackathon team, or makerspace collaboration thinking of a given factor in some form during the process. This is a clear limitation of the method; the natural solution would be to include some predetermined factors as prompts. However, this may bias the participants, potentially placing more importance on the prompted factors than on those the participants identify themselves.

Intentionally, CoFIDE works only for the team that is using it. To achieve some form of universal insight from CoFIDE, one must apply it across a large, broad range of design teams, hackathon teams, and makerspace collaborators. Naturally, this is a challenge, as it requires gaining access to a suitable number of teams. Furthermore, as each of these groups will be working towards its own types of projects, the findings from each application of CoFIDE can only be compared, not cross-combined.

Conclusions

Design effort is a highly valuable resource for design teams, nowhere more so than in makerspaces and hackathons, where time and resources are limited to begin with. Extensive studies have shown that the design effort levels of product design projects are influenced by a number of factors. Many past studies have identified these factors through the analysis of past project data, which is not achievable for hackathon teams and makerspace collaborators. Other studies have used literature reviews to identify factors; yet design teams’ tacit knowledge and experience have been shown to be effective in the estimation of design effort for product design projects. This indicates that designers know which factors are most influential and how they exert influence; yet, in many cases, designers cannot coherently and completely articulate their perceptions of these factors to others. By sharing the understanding gained through this tacit knowledge and experience, effective planning of design projects becomes achievable. This is particularly critical for design teams at hackathons and limited-time design challenges, whose members typically do not know each other and thus lack the familiarity present between established design teams in industry.

Through a case study approach, this paper answers the proposed research questions:

  • RQ1: Through the capture of tacit knowledge of design teams, what novel data and data presentations can be generated from new design effort influencing factor identification approaches?

Through the use of the CoFIDE method, researchers can gather and generate a range of valuable data, including models of the factors most influential on design effort levels in product design projects, shown in the third column of Figure 10. The case study data in this study detail the kinds of insight that can be offered through the use of CoFIDE. By applying CoFIDE across diverse design teams, these data enable product designers to identify design processes for best practice and design researchers to establish, define, and model globally influencing factors. CoFIDE can also provide researchers and designers with the means to estimate design effort for design teams, from small and specialized to large and widely distributed. Furthermore, CoFIDE produces mathematical and graphical models of designers’ perceptions of design effort-influencing factors and of the behavior of these factors, enabling the optimization of the design space. These models further provide the means to draw direct comparisons between the perceptions of design team members; to find areas of consensus to build upon and disagreement to discuss and improve; and to determine industry-wide factor-based issues to address. Utilizing the advantages that CoFIDE offers, design teams can become more efficient and effective in product design, spend more time designing and less time planning, and save money on wasted, misallocated resources.

  • RQ2: What new insights and opportunities does this offer makerspace collaborators and hackathon participants?

Fig. 10. CoFIDE novel data and benefits of application.

When applied in a makerspace or hackathon environment, where teams have diverse backgrounds and are likely working together for the first time, CoFIDE provides a range of opportunities and benefits for design teams. CoFIDE enables hackathon teams to quickly organize themselves by structuring discussions around the design processes they could adopt for the hack. It enables design teams to have open discussions about the issues surrounding their design task, identifying the factors that are most influential and enabling members to take steps to mitigate their negative impacts. Furthermore, through the creation of graphical models, hackathon teams can quickly establish a mutual understanding of each other's perspectives (with the potential to facilitate more effective working) and of the factors influencing their design space, including how influence levels change during the course of the design project. In effect, CoFIDE creates team cohesion by rapidly developing mutual understanding and by presenting opportunities to capitalize on detailed insight into influential factors through the characterization of the design space.

Future work

The next steps for the development and use of CoFIDE will firstly be its use across a range of design teams varying both in size and in the diversity of experience among team members. Doing so will allow the capabilities of CoFIDE to be fully realized and will also aid in understanding how design team members with different backgrounds and experience perceive the challenges and influences exerted by factors on design effort levels.

Secondly, by using CoFIDE in different design spaces globally, it may be possible to identify which factors influence design effort levels worldwide. Interestingly, the opposite may also be true: the method may help identify factors that are considered influential only in a particular market, country, etc. It may then be possible to use such findings to share coping mechanisms that render factors influential in one market impotent in others. This exchange of knowledge, insight, and experience could lower the barrier to entry for future designers and makers, further democratizing the act of designing.

Thirdly, by creating a scoring system as part of CoFIDE, it may be possible to produce a project scorecard tool. Such a tool would allow design teams to evaluate design project briefs, assigning a score to each influential factor. This would allow comparisons to be drawn between projects, both past and current. Furthermore, once a range of projects had been scored, designers and managers could quickly identify projects with similar scores, enabling comparisons to be drawn, experiences to be recalled, and planned projects to be improved.
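A speculative sketch of such a scorecard follows. The factor ranges mirror the case study (Table 6); the normalization scheme and the helper names `score_brief` and `most_similar` are hypothetical illustrations of the proposed tool, not part of CoFIDE.

```python
# Factor score ranges following the case study (Table 6): 4-point scales for
# CGF and DL, 2 levels for PC ("simple"/"complex"), 3 levels for DOC.
FACTOR_RANGES = {"CGF": (1, 4), "DL": (1, 4), "PC": (1, 2), "DOC": (1, 3)}

def score_brief(ratings):
    """Normalize each factor rating to 0-1 and sum into one project score."""
    score = 0.0
    for factor, value in ratings.items():
        lo, hi = FACTOR_RANGES[factor]
        score += (value - lo) / (hi - lo)
    return round(score, 2)

def most_similar(new_ratings, past_projects):
    """Return the past project whose scorecard is closest to the new brief's."""
    target = score_brief(new_ratings)
    return min(past_projects,
               key=lambda p: abs(score_brief(p["ratings"]) - target))
```

Scoring a new brief and retrieving the closest past project would then let designers recall comparable experiences when planning, as the paragraph above envisages.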

Finally, the use of CoFIDE and the regression analysis data it produces should be extended to design effort estimation, allowing bespoke tools to be created for design teams. This could enable design agencies to save significant time, and therefore money, by quickly assessing project briefs and generating accurate design effort estimates. This could be particularly beneficial for design agencies, which are typically SMEs operating with tight budgets, where planning errors are not easily absorbed.

Alexander Freddie Holliman is working towards his PhD in Product Design at the University of Strathclyde. His main interest is in the capture of designers' perceptions to facilitate the estimation of design effort, and he works with several UK-based design companies to improve their project planning. He currently teaches product design and design sketching at the Department of Design, Manufacturing and Engineering Management.

Dr Avril Thomson is a Reader in Design Engineering in the Department of Design, Manufacturing and Engineering Management at the University of Strathclyde. Her research interests are in the management of collaborative design and health engineering. She has a keen interest in Engineering Design Education and currently acts as Faculty Associate Dean Academic.

Dr Abigail Hird has an EngD in Systems Engineering and an MEng in Product Design Engineering. She played a leading role in the Strathclyde Institute for Operations Management and has been involved in a number of challenge-led research and knowledge exchange projects in close collaboration with industry.

Dr Nicky Wilson has a PhD in Sports Engineering Design, with a research focus on enabling inclusive design through the application of sports design processes.

References

Andersson, J, Pohl, J and Eppinger, SD (1998) A design process modelling approach incorporating nonlinear elements. Proceedings of 1998 DETC: ASME Design Theory and Methodology Conference. Atlanta, Georgia: American Society of Mechanical Engineers.
Bashir, HA and Thomson, V (1999) Metrics for design projects: a review. Design Studies 20, 263–277.
Bashir, HA and Thomson, V (2001a) An analogy-based model for estimating design effort. Design Studies 22, 157–167.
Bashir, HA and Thomson, V (2001b) Models for estimating design effort and time. Design Studies 22, 141–155.
Bashir, HA and Thomson, V (2004) Estimating design effort for GE hydro projects. Computers & Industrial Engineering 46, 195–204.
Benedetto, H, Bernardes, M.M.e.S and Vieira, D (2018) Proposed framework for estimating effort in design projects. International Journal of Managing Projects in Business 11, 257–274.
Bowen, S, Durrant, A, Nissen, B, Bowers, J and Wright, P (2016) The value of designers' creative practice within complex collaborations. Design Studies 46, 174–198.
Brauers, J and Weber, M (1988) A new method of scenario analysis for strategic planning. Journal of Forecasting 7, 31–47.
Bryson, JM and Bromiley, P (1993) Critical factors affecting the planning and implementation of major projects. Strategic Management Journal 14, 319–337.
Cho, S-H and Eppinger, SD (2005) A simulation-based process model for managing complex design projects. IEEE Transactions on Engineering Management 52, 316–328.
Christensen, KS (1985) Coping with uncertainty in planning. Journal of the American Planning Association 51, 63–73.
Eckert, CM and Clarkson, PJ (2010) Planning development processes for complex products. Research in Engineering Design 21, 153–171.
Eppinger, SD, Nukala, MV and Whitney, DE (1997) Generalised models of design interaction using signal flow graphs. Research in Engineering Design 9, 112–123.
Fisher, RA (1949) The Design of Experiments, 5th Edn. Edinburgh: Oliver and Boyd.
Griffin, A (1993) Metrics for measuring product development cycle time. Journal of Product Innovation Management 10, 112–125.
Griffin, A (1997) Modeling and measuring product development cycle time across industries. Journal of Engineering and Technology Management 14, 1–24. http://dx.doi.org/10.1016/S0923-4748(97)00004-0
Hellenbrand, D, Helten, K and Lindemann, U (2010) Approach for development cost estimation in early design phases. Proceedings of DESIGN 2010, the 11th International Design Conference, Dubrovnik, Croatia, pp. 779–788.
Hird, A (2012) A Systems Approach to Resource Planning in New Product Development (EngD Thesis). Department of Design, Manufacturing and Engineering Management, Glasgow: University of Strathclyde.
Ittner, CD and Larcker, DF (1997) Product development cycle time and organizational performance. Journal of Marketing Research 34, 13–23.
Jack, H (2013) Chapter 1 – An overview of design projects. In Engineering Design, Planning, and Management. Boston, MA: Academic Press, pp. 1–32.
Jacome, MF and Lapinskii, V (1997) NREC: risk assessment and planning of complex designs. IEEE Design & Test of Computers 14, 42–49.
Jensen, MB, Semb, CCS, Vindal, S and Steinert, M (2016) State of the art of makerspaces – success criteria when designing makerspaces for Norwegian industrial companies. Procedia CIRP 54, 65–70.
Komssi, M, Pichlis, D, Raatikainen, M, Kindström, K and Järvinen, J (2015) What are Hackathons for? IEEE Software 32, 60–67.
Luck, R (2013) Articulating (mis)understanding across design discipline interfaces at a design team meeting. Artificial Intelligence for Engineering Design, Analysis and Manufacturing 27, 155–166.
Pe-Than, EPP, Nolte, A, Filippova, A, Bird, C, Scallen, S and Herbsleb, JD (2019) Designing corporate Hackathons with a purpose: the future of software development. IEEE Software 36, 15–22.
Pollmanns, J, Hohnen, T and Feldhusen, J (2013) An information model of the design process for the estimation of product development effort. In Abramovici, M and Stark, R (eds), Smart Product Engineering. Berlin, Heidelberg: Springer, pp. 885–894.
Raatikainen, M, Komssi, M, Bianco, Vd, Kindström, K and Järvinen, J (2013) Industrial experiences of organizing a Hackathon to assess a device-centric cloud ecosystem. 2013 IEEE 37th Annual Computer Software and Applications Conference, Kyoto, 2013, pp. 790–799.
Rondinelli, DA, Middleton, J and Verspoor, AM (1989) Contingency planning for innovative projects. Journal of the American Planning Association 55, 45–56.
Salam, A and Bhuiyan, N (2016) Estimating design effort using parametric models: a case study at Pratt & Whitney Canada. Concurrent Engineering 24, 129–138.
Salam, A, Bhuiyan, N, Gouw, GJ and Raza, SA (2009) Estimating design effort for the compressor design department: a case study at Pratt & Whitney Canada. Design Studies 30, 303–319.
Saravi, S, Joannou, D, Kalawsky, RS, King, MRN, Marr, I, Hall, M, Wright, PCJ, et al. (2018) A systems engineering Hackathon – a methodology involving multiple stakeholders to progress conceptual design of a complex engineered product. IEEE Access 6, 38399–38410.
Serrat, J, Lumbreras, F and López, AM (2013) Cost estimation of custom hoses from STL files and CAD drawings. Computers in Industry 64, 299–309.
Shai, O and Reich, Y (2004) Infused design. I. Theory. Research in Engineering Design 15, 93–107.
Shang, Z-G and Yan, H-S (2016) Product design time forecasting by kernel-based regression with Gaussian distribution weights. Entropy 18, 231–248.
Smith, RP and Eppinger, SD (1997) A predictive model of sequential iteration in engineering design. Management Science 43, 1104–1118.
Tatikonda, MV and Rosenthal, SR (2000) Technology novelty, project complexity, and product development project execution success: a deeper look at task uncertainty in product innovation. IEEE Transactions on Engineering Management 47, 74–87.
Wang, Z, Tong, S and Huang, L (2015) Research on the time prediction model of product variant design. 2015 IEEE International Conference on Mechatronics and Automation (ICMA), Beijing, China, IEEE, pp. 572–576.
Xu, D and Yan, H-S (2006) An intelligent estimation method for product design time. The International Journal of Advanced Manufacturing Technology 30, 601–613.
Yan, H-S and Shang, Z-G (2015) Method for product design time forecasting based on support vector regression with probabilistic constraints. Applied Artificial Intelligence 29, 297–312. http://dx.doi.org/10.1080/08839514.2015.993558
Yan, HS and Xu, D (2007) An approach to estimating product design time based on fuzzy ν-support vector machine. IEEE Transactions on Neural Networks 18, 721–731.
Yan, H, Wang, B, Xu, D and Wang, Z (2010) Computing completion time and optimal scheduling of design activities in concurrent product development process. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans 40, 76–89.
Zhigen, S and Yan, H (2011) Forecasting product design time based on Gaussian Margin Regression. IEEE 2011 10th International Conference on Electronic Measurement & Instruments, Chengdu, IEEE, Vol. 4, pp. 86–89.
Zirger, BJ and Hartley, JL (1994) A conceptual model of product development cycle time. Journal of Engineering and Technology Management 11, 229–251.
Figures and tables

Table 1. Design effort estimation methods in product design that consider influential factors
Fig. 1. Factor categorization analysis.
Table 2. Design effort-influencing factors
Fig. 2. A new product development resource forecasting method. Adapted from Hird (2012).
Fig. 3. Collaborative Factor Identification for Design Effort (CoFIDE) method.
Table 3. Design Agency 1 participant roles
Fig. 4. Design Agency 1's design process.
Table 4. Factors influencing design effort levels of design projects as perceived by Design Agency 1
Table 5. Voting for shortlist of factors for design effort influence in design projects at Design Agency 1
Table 6. Factor classification and elements
Table 7. Estimation collection sheet
Table 8. Participant regression equation values for design effort levels in product design projects
Fig. 5. Percentage of influence on design effort over design project phases.
Fig. 6. Changes in percentage of influence for factors on design project times.
Fig. 7. Average changes in percentage of factors' influence on design effort.
Fig. 8. Mean effect plots for factor influence over design effort levels.
Table 9. Mean effect plot values for factor influence over design effort levels
Fig. 9. Percentage of influence of factor per participant comparison example.
Table 10. Utility of mean effect plots and percentage influence graphs
Fig. 10. CoFIDE novel data and benefits of application.