Construction Economics and Building, Vol. 23, No. 1/2, July 2023

RESEARCH ARTICLE

Gathering and Disseminating Lessons Learned in Construction Companies to Support Knowledge Management

Luciana Debs1,*, Bryan Hubbard2
1 Assistant Professor, School of Construction Management Technology, Purdue University
2 Professor, School of Construction Management Technology, Purdue University

*Corresponding author: Luciana Debs, Ph.D., Assistant Professor, School of Construction Management Technology, Purdue University, ldecresc@purdue.edu

DOI: https://doi.org/10.5130/AJCEB.v23i1/2.8390

Article History: Received 11/05/2022; Revised 15/02/2023; Accepted 17/02/2023; Published 31/07/2023

© 2023 by the author(s). This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International (CC BY 4.0) License (https://creativecommons.org/licenses/by/4.0/).

Citation: Debs, L., Hubbard, B. 2023. Gathering and Disseminating Lessons Learned in Construction Companies to Support Knowledge Management. Construction Economics and Building, 23:1/2, 56–76. https://doi.org/10.5130/AJCEB.v23i1/2.8390

Abstract

The importance of knowledge management (KM) in the Architecture, Engineering, and Construction (AEC) industry has risen with the improvement of information and communication technologies. However, the construction industry still struggles to capture and disseminate lessons learned. The present research explores this issue by using interviews and an online questionnaire to provide updated information on lessons learned procedures and their challenges in United States (US) construction companies. To do this, the authors gathered industry professionals' perceptions about lessons learned and the methods used in their companies for harnessing and disseminating the knowledge generated by them. Our findings indicate that post-project evaluations are frequently conducted, utilize a two-step approach (review of materials followed by meetings), and are mainly organized by project managers. Most often, only key project team members are present in meetings to create a safe environment for discussion. Moreover, our findings echo previous research on the dissemination and reuse of lessons learned, indicating that these procedures are scattered and, most of the time, the information is not effectively reutilized. This suggests that US construction companies still need to revisit how information from lessons learned is currently being harnessed, stored, and especially shared (within and between companies) so that information can be effectively transformed into knowledge that can advance the AEC industry's productivity.

DECLARATION OF CONFLICTING INTEREST: The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

FUNDING: The author(s) received no financial support for the research, authorship, and/or publication of this article.
Keywords

Knowledge Management; AEC Industry; Lessons Learned; Project Retrospectives; Construction Industry

Introduction

The architecture, engineering, and construction (AEC) industry is known for its fragmented nature and reliance on professionals' lived experiences (Hwang, 2022; Forcada, et al., 2013; Pathirage, Amaratunga and Haigh, 2007; Saini, Arif and Kulonda, 2019; Vaz-Serra and Edwards, 2020). This is due mainly to its project-based nature, which can hinder the improvement of knowledge management for the industry as a whole because "there are few incentives to appraise performance, pass learning on and improve overall delivery" (Forcada, et al., 2013, p.83). Teams of professionals from several disciplines are formed based on the specific needs of the project and evolve through the project's duration, with new team members joining and leaving the process. At the end of the project, the team is dissolved, and members are assigned new projects, which may or may not include the same professionals from previous projects (Hwang, 2022). Yet, previous research indicates that the creation and reuse of knowledge generated in the AEC industry is key to improving processes and productivity (Rezgui, Hopfe and Vorakulpipat, 2010; Tan, et al., 2007).

In addition to the fragmented nature and evolving teams, much knowledge within AEC is tacit, meaning it is based on experience over time (Hwang, 2022; Nesan, 2012; Woo, et al., 2004). This reliance on tacit knowledge makes construction a 'knowledge-intensive' industry (Egbu and Robinson, 2005; Wang and Meng, 2019). In industries that rely heavily on tacit knowledge, professionals with many years of experience have a much broader knowledge base about the industry than early career graduates. As issues arise and are solved within projects, professionals acquire more expertise that will be applied to future projects as needed.

Previous research has acknowledged the value of knowledge management (KM) practices for the AEC industry and the challenges of effectively capturing and using KM within a project-based industry (Hwang, 2022; Dave and Koskela, 2009; Kamara, et al., 2002; Saini, Arif and Kulonda, 2019). Recently, due to improvements in information and communication technology (ICT) and Building Information Modeling (BIM), there has been increased interest in gathering tacit knowledge and creating knowledge databases for AEC (Yu and Yang, 2018). By transforming tacit knowledge into physical (or virtual) information that can be passed to many professionals, that knowledge becomes explicit so it can be easily shared. Research conducted in the mid-2000s in the US indicated that learning from previous issues (as knowledge gained through experience) was already common practice in many construction companies (Yohe, 2006). This learning practice may have many names, as indicated by Disterer (2002), such as debriefing, reflection, and post-project reviews. The present paper will focus on learning after the completion of the project and will use 'lessons learned' as the term to designate that process.
Lessons learned (or project retrospectives) are here defined as a KM practice focused on capturing project-based tacit knowledge and transferring it to other team members and other project teams. However, despite being a quality management procedure conducted by many companies, research indicates that efforts to successfully use previous knowledge to improve a company's competitive advantage remain low (Yap and Toh, 2020), and the failure to successfully implement knowledge management systems can frustrate AEC professionals (Vaz-Serra and Edwards, 2020). Some of the causes of the ineffective implementation of lessons learned within the construction industry include the industry's fragmented and project-based nature; increased workload and time constraints; legal issues; lack of trust; cost; and a low perceived added value for this type of activity (Carrillo, Ruikar and Fuller, 2013; Saini, Arif and Kulonda, 2019; Tan, et al., 2007; Yap and Toh, 2020). In particular, the fragmented and temporary nature of the construction industry makes it challenging for different stakeholders to learn from their mistakes. For example, contractors may only identify some constructability issues during the construction phase. Once the contractor identifies the issue, it is usually brought to the designer's attention. However, the professional on the design side might not necessarily be the one who originally produced the design documents. Given the rise of more collaborative delivery methods, such as design-build and integrated project delivery (McGraw Hill Construction, 2014), and the interdependency of industry stakeholders, it is important to understand the gathering and disseminating of lessons learned more holistically, with all AEC industry stakeholders (Saini, Arif and Kulonda, 2019).

To better understand the challenges of capturing and disseminating KM in the AEC industry, the present paper focuses specifically on construction professionals' perspectives related to lessons learned in a project, within and between companies. The choice to focus on lessons learned procedures was made because it is an industry-recognized practice and it is well defined in terms of time. The information can help construction companies benchmark their lessons learned procedures against others and provide helpful information to researchers studying ways to improve knowledge harnessing and reuse in the AEC industry.

Background literature

First, it is essential to differentiate KM from knowledge management systems (KMS). Alavi and Leidner (2001) define KM as referring to 'identifying and leveraging the collective knowledge in an organization to help the organization compete' (p. 113) and KMS as the information technology and systems that support the management of that knowledge. The present paper will focus on understanding current AEC industry KM procedures and not on developing and piloting KMS. The advantages of gathering and using previous knowledge in new projects are well established (Anumba, Egbu and Carrillo, 2005; Yang, Cheng and Wang, 2012), and previous research has identified three main points for its effective use within companies (Alavi and Leidner, 2001):

1. Data, information, and knowledge are different constructs – while data refers to the values obtained, information is considered synthesized data, and knowledge is the understanding produced by interpreting the information.
Some researchers have also included a fourth construct – wisdom – to refer to the deep knowledge needed to recognize the influence of context and the applicability of knowledge (Siemieniuch and Sinclair, 1999).

2. The transfer of knowledge must take into consideration the needs and understanding of the person or people who will be the receivers of the knowledge.

3. A large quantity of information does not necessarily equate to added value; information needs to be reflected on and processed for it to be helpful.

In construction, as in other industries, it is essential to avoid pitfalls such as repeating mistakes, reduced project performance, and "reinventing the wheel" (Kamara, et al., 2002; Vaz-Serra and Edwards, 2021; Woo, et al., 2004; Yu and Yang, 2018). However, the construction industry struggles to incentivize learning from projects largely due to its temporary, project-based nature with a strong tacit knowledge component (Carrillo and Chinowsky, 2006; Forcada, et al., 2013; Nesan, 2012). Furthermore, previous research has identified the influence of different managing models (such as procurement, contract methods, and leadership) and trust levels on how knowledge is transferred between the multiple stakeholders of a project (Kamara, et al., 2002; Saini, Arif and Kulonda, 2019). This finding stresses the importance of understanding the impact of the project-based nature of the AEC industry on its knowledge management procedures.

Furthermore, previous research has identified three generations of KM practices in this industry. These generations vary according to four criteria: ICTs, socio-technical characteristics, lifecycle focus, and knowledge perspective. Specific to the lifecycle focus of each generation, the researchers indicate that the first generation was more focused on software applications, the second generation was focused on specific disciplines, and the third generation focused on the industry as a whole, including integrating KM across AEC disciplines (Rezgui, Hopfe and Vorakulpipat, 2010). This framework (from Rezgui, Hopfe and Vorakulpipat, 2010) provides a more integrated approach to KM in the industry, encompassing not only ICT but also sociological and human factors to create value from knowledge.

When considering KM practices, several techniques can be used to harness knowledge in companies (Al-Ghassani, et al., 2005), lessons learned being one of them. Lessons learned is a broad term used in this paper to denote a gain of knowledge based on a project experience. Lessons learned can be obtained at any moment of the project, but the present paper will focus only on post-project learning. There are numerous ways lessons learned can be obtained, and the procedures followed to obtain them can have several names.
Some of the terms Disterer (2002) identified while studying the evaluation of Information Technology (IT) projects that equate to post-project lessons learned procedures, and which can be echoed in the construction industry, are post-project review, post-project appraisal, project post-mortem, debriefing, reuse planning, cooperative project evaluation, reflection, corporate feedback cycle, and experience factory. Though Disterer's (2002) work is in a different industry, it shares a similar concern about the lack of learning, or learning gap, from previous projects' experiences. Some of the reasons identified for this learning gap relate to time and budget concerns after the end of the project, the dispersion of teams to other new projects, few incentives to document and share lessons learned, and the lack of a trusting and open environment that values learning from failures and mistakes (Disterer, 2002). Very few research publications have focused specifically on lessons learned procedures; therefore, the researchers draw from previous research on lessons learned as well as on KM in the construction industry.

The challenges with the effective use of KM, including lessons learned procedures, in the AEC industry are not only related to the effective harnessing of knowledge but also to its sharing, storing, and reusing (Deng, et al., 2022; Hwang, 2020; Saini, Arif and Kulonda, 2019; Yap and Toh, 2020). For example, previous research indicates that post-project reviews (PPR) in AEC companies are mostly "not properly documented, and if documented they remain locked in archives" (Dave and Koskela, 2009, p.897), which can lead to much frustration among employees (Vaz-Serra and Edwards, 2021). In fact, some of the KM tools perceived as helpful in AEC may not be ICTs, or KM technologies, but rather KM techniques, for example, small group meetings with stakeholders, communities of practice, training, and inter- and intranets (Al-Ghassani, et al., 2005; Forcada, et al., 2013). Management and dissemination for large and geographically dispersed companies are especially challenging (Forcada, et al., 2013). Furthermore, Kamara, et al. (2002) indicated that post-project reviews are effective for those in the project to consolidate their learning but not adequate for transferring learning to other projects, because there is no assurance that knowledge will, in fact, be transferred. The reuse of information remains an issue, and finding better ways to deliver the "right knowledge to the right person at the right time" is key to improving KM practices in the construction industry (Yap and Toh, 2020, p.55).

Carrillo, Ruikar and Fuller (2013) studied lessons learned procedures in United Kingdom (UK) construction companies using survey procedures (questionnaires, interviews, and focus groups). Their results indicate that companies performed lessons learned to avoid repeating mistakes, aiming for competitive advantage, and that the content of the lessons learned was mainly related to health and safety, followed by contractual, environmental, and subcontractor procurement matters. The main participants in these lessons-learned events were managers (project and contract managers), as well as estimators (quantity surveyors) (Carrillo, Ruikar and Fuller, 2013), with other team members occasionally participating (such as design managers). Carrillo, Ruikar and Fuller (2013) also showed that lessons learned documents are not perceived as particularly useful to others in a company.
Lessons learned documents ranked sixth of seven compared to other tacit and explicit sharing tools. Barriers to effective post-project lessons learned procedures in Carrillo, Ruikar and Fuller's (2013) work echoed those mentioned by Disterer (2002) and included issues with team availability after project completion, lack of a trusting environment, and lack of perceived value. Other barriers included issues specific to the AEC industry, such as the lack of a culture of lessons learned, internal competition, a siloed environment, duplication of workload, and legal issues. Recent work on managing KM in the construction supply chain in the UK by Saini, Arif and Kulonda (2019) adds to Carrillo, Ruikar and Fuller's (2013) findings. Their research indicated that an inertia mindset, which is "the traditional ways of doing business" (p.28), lack of motivation, lack of knowledge and, indirectly, lack of trust are significant barriers to improving the industry's knowledge sharing and transfer procedures (Saini, Arif and Kulonda, 2019).

In the US context, Yohe (2006) conducted an extensive survey of lessons learned procedures in construction (and owner) companies. Their results indicate that most surveyed companies have lessons-learned procedures in place, though not always formal, and that results from these procedures are not always implemented. Most of the formal procedures are meant to be company-wide, while informal procedures are more project-specific or department-specific. In terms of content, construction and engineering/design lessons learned are frequently cited categories of lessons learned (Yohe, 2006). The collection of lessons learned is mainly performed in meetings and interviews, as well as electronically, and frequently at the end of the project, but 'there is not a general agreement on who manages Lessons Learned Programs' (Yohe, 2006, p.34). Moreover, though issues with trust (Arif, Mohammed and Gupta, 2015) and legal risk (Carrillo, Ruikar and Fuller, 2013) have been identified by previous research as concerning for KM and lessons learned procedures, in the previous work by Yohe (2006) few companies indicated they faced legal issues related to lessons learned, with contractors more concerned about this matter than owners. Legal issues, in this sense, refer to the possibility of formally documented lessons learned being used in litigation matters (Yohe, 2006).

Findings from recent research focused on US small and medium regional companies and the sharing of tacit knowledge indicate that, despite employees' interest, knowledge-sharing procedures at the company level should be improved (Hwang, 2022). This includes more training opportunities, fostering more networking, and establishing more formal procedures (Hwang, 2022). Furthermore, Hwang (2022) suggests that the value of KM procedures needs to be internalized by employees so that their participation in sharing their knowledge is voluntary, and that company managers must also recognize that time is necessary for codifying employee knowledge. Therefore, though KM is perceived as important to increase AEC industry productivity, little recent research has been performed on the current state of practice related to lessons learned within the US, with the exception of Hwang (2022).
Furthermore, given the rise of more collaborative delivery methods, the authors propose an exploration of construction companies to evaluate the sharing of knowledge within companies and between other key stakeholders of the project. This approach is more aligned with the third generation of KM in the AEC industry, as envisioned by Rezgui, Hopfe and Vorakulpipat (2010), and also expands on the research by Saini, Arif and Kulonda (2019), who explored the issue in the UK construction supply chain.

Methodology

Our goal is to explore current procedures for gathering and disseminating lessons learned in AEC companies in the US, as well as the challenges of this process. To achieve our goal, we used a sequential mixed-methods approach, consisting of interviews (phase 1) and a questionnaire (phase 2). With the sequential approach, the authors were able to utilize findings from the interviews to refine the questionnaire in phase 2. Additionally, the interviews allowed for a more in-depth discussion of the challenges faced by professionals during the process of collecting, storing, disseminating, and reusing lessons learned. Figure 1 depicts the conceptual framework for the present study.

Figure 1. Conceptual Framework

Findings from both study phases are used in triangulation with previously published literature to answer the following research questions:

RQ 1: What is the current state of practice of lessons learned procedures in construction companies in the US?

RQ 2: What are the main challenges to the effective implementation of lessons-learned procedures in construction companies in the US?

The study was deemed exempt by the Institutional Review Board (IRB) at the authors' institution under protocol IRB-2020-831.

PHASE 1: INTERVIEWS

In this first phase, the authors asked the outreach specialist of their institutional department to send the invitation email to their contact list. This list included 74 company members of the program's advisory board and companies that come twice per year to participate in the construction-specific career fair organized by this institution. The companies in this outreach list represent a broad range of company sizes and geographical locations within the US. The authors also sent invitation emails to their professional contacts, including additional construction and design firms. Interviews were conducted by phone or videoconferencing (such as Webex) and transcribed for analysis. Participants were not required to have participated in the gathering of lessons learned to take part in the interview, because the authors were also assessing dissemination efforts, which were part of the interview questions.

The interview consisted of one meeting divided into six parts: company details, about yourself, post-project evaluation, processes related to post-project evaluation, dissemination, and personal perspective. The analysis included frequency data related to demographic information (about the company and professionals). The transcript portions related to lessons learned procedures and dissemination were then coded using two passes – a first pass using structural coding and a second using descriptive coding, as described by Saldaña (2009). Frequencies of codes are presented in the results section.
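As an illustration only (not part of the original study), the short Python sketch below shows how descriptive-code frequencies of the kind reported in the results section could be tallied once transcript segments have been manually coded; the participant identifiers, field names, and code labels are hypothetical.

```python
from collections import Counter

# Hypothetical coded transcript segments after the two manual coding passes
# (structural code = interview section, descriptive code = content label).
coded_segments = [
    {"participant": "P01", "structural": "dissemination", "descriptive": "presentations in meetings"},
    {"participant": "P01", "structural": "storage", "descriptive": "shared drive"},
    {"participant": "P02", "structural": "dissemination", "descriptive": "internal newsletter"},
    {"participant": "P03", "structural": "dissemination", "descriptive": "presentations in meetings"},
]

# Count how many distinct participants mention each descriptive code,
# mirroring the n-counts reported for the interview findings.
participant_code_pairs = {(seg["participant"], seg["descriptive"]) for seg in coded_segments}
code_frequencies = Counter(code for _, code in participant_code_pairs)

for label, n in code_frequencies.most_common():
    print(f"{label}: n={n}")
```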
When presenting information about lessons learned procedures and dissemination, a few excerpts from interviews are also included to exemplify participants' narratives. Finally, the codes from this qualitative analysis helped identify emerging themes from the interviews used in the discussion of the overall results of the present research.

PHASE 2: ONLINE QUESTIONNAIRE

In addition to providing emerging themes for discussion, phase 1 (interviews) also supported drafting the questions used in phase 2. The resulting online questionnaire (Qualtrics) contained 22 questions and a brief explanation of the focus on post-project lessons learned procedures. Questions focused on company information, demographic questions for respondents, and lessons learned procedures, including dissemination. At the start of the questionnaire, the authors also included 10 common names for lessons learned procedures, based on previous research. The email invitation with a link to the online questionnaire for phase 2 was sent to design and construction professionals using the authors' contacts and the authors' school's contacts, and was posted by the researchers on their professional LinkedIn pages. Snowball sampling was used to reach more professionals, in addition to the authors' contacts.

The analysis of this phase included reporting descriptive statistics for aggregated demographic data of participants and their companies, as well as for the main aspects related to lessons learned, such as reasons for conducting project evaluations, participants and tools for this process, content discussed during this process, and dissemination information. Finally, emerging themes were identified and discussed by triangulating the findings from the interviews, the online questionnaire, and previous literature.

Results

This section presents the results of our study, starting with results from the interviews and following with those from the survey of US-based professionals. In both instances, we present a brief overview of the aggregated demographic data and results related to using and disseminating lessons learned procedures for each phase.

RESULTS PHASE 1: INTERVIEWS

For this phase, we interviewed 12 industry professionals from construction companies. Even though design companies were invited, only two participants were from companies that provided architecture or engineering services (integrated firms). These two participants worked in the construction division of these integrated companies. Most interviewees classified their companies (n=8) as general contractors, four mentioned providing construction management services, and two mentioned that their companies provided construction management at-risk services. Furthermore, one interviewee mentioned working for a subcontractor company. Participants could classify their company into more than one type. Three interviewees mentioned their company had an approximate annual revenue ranging from 20 to 60 million dollars, one interviewee mentioned working for a company that had approximately 275 million dollars in revenue, seven others worked for companies that had one billion dollars or more in annual revenue, and one interviewee did not answer this question. In addition, the company size of 7 of the 12 interviewees ranged from 25 to 6,000 employees, while the other five did not provide an estimated number of employees. Most companies (n=11) for which interviewees currently worked provided services in several construction segments.
Only one interviewee worked for a company mainly providing services to the heavy civil and industrial market. Figure 2 shows the frequency of responses for the type of construction performed by companies where interviewees worked, noting that participants could mention more than one type of construction.

Figure 2. Distribution of type of work for companies in which phase 1 interviewees worked (Commercial: 10; Institutional: 6; Healthcare: 6; Multi-Family: 4; Industrial: 4; Laboratories and Science Buildings: 3; Hospitality: 2; Heavy Civil/Industrial: 1; Data Centers: 1)

Interviewees had between 5 and 36 years of experience in the construction industry, with an average of 14.3 years. Positions occupied by respondents were in the following categories: vice president (n=2), project executive, project manager (n=4), project engineer (n=3), and lean engineer or improvement leader (n=2).

Next, this section presents the major topics that were discussed during the interviews. These topics help us answer our research questions, which relate to the current efforts and major challenges for the effective use of lessons learned in US construction companies. The five major topics are: professionals' level of awareness about lessons learned procedures and the numbers of projects evaluated, factors that influence a company or management to evaluate (or not) projects, the timing of post-project evaluations, a description of how post-project evaluations occur in their companies based on their professional experience, and how information gathered during those procedures is stored long-term and disseminated.

LEVEL OF AWARENESS AND PROJECTS EVALUATED

When asked about their level of awareness of their company's project evaluation procedures, three participants had little knowledge, one had moderate knowledge, and eight had a high level of awareness about project evaluation procedures in their company. In addition, six participants mentioned their company evaluated all projects; two mentioned their companies evaluated between 75 and 100% of their projects; two mentioned that their companies evaluated 50 to 75% of the projects; one respondent did not provide a clear answer to this question.

FACTORS AFFECTING THE DECISION TO EVALUATE OR NOT EVALUATE PROJECTS

When asked to provide more information about factors influencing the decision to evaluate or not evaluate a project, five respondents mentioned that size influences the decision, with smaller projects being less likely to be evaluated. Size was also understood as project cost, as the following interviewee indicated: "size would be a factor [for getting evaluated]. I'm going to say it is 20 million [or more will be getting] a post-mortem." One participant mentioned the value of the project for the company, such as when breaking into a new market or working with a new client. Two participants mentioned repeated projects as a reason for not evaluating a project: "if it was a repeat project team with a similar project, [project evaluations are] maybe not as important." Finally, one participant mentioned positive and negative performance as decisive factors for conducting evaluations: "[project evaluations are] more likely if something does not go well. But then [the company] is very competitive, so, if something did go well, a project manager will want to have lessons learned [performed]…" to showcase the team's performance.
TIMING OF POST-PROJECT EVALUATIONS

When asked about the average time gap between project completion and conducting the evaluation, four participants mentioned they collect information about project evaluations in an ongoing manner, as noted by the following excerpt:

[I] have been involved in some [projects] where the job is still ongoing, and we are doing a post-mortem on some work that has already been completed, but we are still going to do some more of that work in the coming months. You know, if you have several areas within a plan, and this area is done, well, you want to do your lessons-learned in that area, so that you do not repeat your mistakes on the next one.

In these cases, it seems that the goal of this process is to reuse information in the same project instead of disseminating it to others in the company. Five participants mentioned a gap between project completion and project evaluation of one to four months, two mentioned a gap of six months to one year, and one mentioned less than one month after project completion. One participant provided a rationale for choosing two months after completion:

we do not want to do a post-evaluation any sooner than two months after the project is complete. So, the sweet spot is sometime between two and four months after the project is complete. If we do it too early, then sometimes we will miss issues that come up in that first two months of occupancy.

POST-PROJECT EVALUATIONS PROCEDURES

Some common tools and procedures used during project evaluations that participants mentioned were templates provided by the company (n=5), continuous evaluation through the project (n=5), personal feedback to project participants (n=3), brainstorming with the project team about performance (n=3), online data collection procedures (n=4), trained facilitators (n=2), and affinity diagrams (n=1). Templates included Excel spreadsheets, questionnaires, and a template agenda with items to cover during the meeting. Participants usually described the process in two steps: a first step collecting information through reports and questionnaires to participants, and a second step utilizing a meeting to discuss major topics.

The range of participants in the final meeting varied. Many participants mentioned it contained the core team (such as a project executive or similar, a project manager, and a superintendent) (n=5). One participant mentioned including "as many people as were able to attend," which included not only the core team mentioned previously but also professionals from the accounting, legal, and quality assurance and quality control (QAQC) departments of the company. One-third of participants (n=4) stressed that post-evaluation meetings are internal, so design partners or clients do not participate in those debrief meetings. Three participants clearly mentioned not involving design partners in the meetings but having separate meetings to provide targeted feedback to designers; two participants mentioned including clients, but only if they requested to participate in the project evaluation meeting; one participant mentioned sometimes including designers; and one participant mentioned always including designers and other stakeholders outside of their company that were involved in the project. Two participants explained not inviting stakeholders from outside of their companies because of the possibility of this restricting conversations.
A participant explained this issue: "It is meant to be a very open discussion, frank discussion, and bringing consultants from outside of our company would be [like] airing your dirty laundry in front of others." Another participant echoed the ability to conduct safe meetings: "It is a judgment-free zone, and that is the only way we learn: if everybody is honest and open…" Finally, one participant mentioned that designers would be invited if the project was delivered using a design-build or an integrated project delivery (IPD) approach.

Eleven of the twelve participants provided more detail on which topics were included in the project evaluation. Though this is not an exhaustive list, common topics included in project evaluations and mentioned by participants were: comparisons between projected and actual costs (n=7) and schedule (n=4), which may include in-depth discussions by cost category and the profitability of the job; safety (n=3); client satisfaction (n=3); team interaction (n=2); project performance (n=2); personal performance, such as providing feedback to project personnel (n=4); evaluation of trade partners (n=1); and design issues (n=1). One participant clearly mentioned not including safety discussions in the final project evaluation meeting but instead having a separate meeting just for a safety debrief because "[safety] can end up taking the whole meeting time."

LONG-TERM STORAGE AND DISSEMINATION EFFORTS

Storage of information related to project evaluations varied by company, with participants mentioning storage in a shared drive (n=3), construction management software (n=2), or an intranet or company portal (n=2). Three participants mentioned that the information is stored but could not define its location, one participant did not clearly answer this question, and one mentioned no long-term storage. For half of the participants (n=6), long-term access to the information is available to all employees in the company; four participants mentioned it was only accessible to the project team, and two mentioned it was only accessible to the team leaders. One participant mentioned that their company is currently organizing this process, but the data is not yet accessible.

Active dissemination of information obtained during project evaluations was not performed in the companies of two interviewees. When practiced, information learned through project evaluations is disseminated to other stakeholders in the company beyond the project team mainly through presentations in meetings (n=5), followed by internal newsletters (n=3), printed handouts or reports (n=2), and email (n=1). Eight of the twelve participants noted that no information about the project evaluation is shared outside of their company. Four interviewees mentioned sharing with stakeholders outside of their companies, especially if it is relevant to those stakeholders. This was not regularly done and was typically used to prevent similar future issues with repeat trade partners.

In summary, during this interview phase, it was noted that many companies conducted project evaluations, though procedures varied greatly. The debriefs focused mainly on cost and schedule but also included more qualitative topics, such as client satisfaction and team interaction. In terms of dissemination, processes seemed scattered, and it was unclear whether information from project evaluations is meaningfully being disseminated and stored for future use.
As explained by one interviewee: "we do not do a great job with doing something [with] the information afterward." One participant suggested that the uniqueness of each building may make sharing and reusing information more complicated: "every job is so different and [this] makes information sharing so hard." This was reiterated by another participant commenting on the project-based nature of the industry and the importance of project evaluations: "[Lessons learned are] underutilized in the industry in general. I think a lot of times is because people are site-based, and you are just trying to execute one project, you get a little siloed." Finally, most companies emphasized the internal nature of project evaluations, though some mentioned sharing select information with external stakeholders if this could improve future performance on projects shared with that stakeholder.

RESULTS PHASE 2: ONLINE QUESTIONNAIRE

For phase 2, 78 responses were received, but 38 were returned without any responses after the introductory screen, and one response was received from abroad. Therefore, 39 responses were considered for analysis during phase 2.

In this phase, 30 of our 39 participants were from companies with 500 or fewer employees, while six participants were from companies with 1,000 or more employees. Most respondents were from companies with annual revenue of $50 million US dollars or more (n=20, see Figure 3) and from general contractors or construction management companies (n=21, see Figure 4). Table 1 presents the delivery methods used by the respondents' companies. It can be seen that the methods varied considerably, although design-bid-build seemed to be the most prevalent, followed by design-build and construction management at risk.

Figure 3. Annual revenue of phase 2 respondents' companies

Figure 4. Type of company of phase 2 respondents (General Contractor or Construction Manager: 21; Subcontractor: 5; Integrated Firm (Design-Constructor): 5; Designer: 4; Consulting Firm: 2; Real Estate Developer: 1; Owner or Owner's Representative: 1)

Table 1. Types of delivery methods used in participants' companies (n=39), in percent

Type of Delivery Method            Mean    Standard Deviation
Design-Bid-Build                   30.62   30.63
Design-Build                       25.23   26.10
Construction Management at Risk    24.31   27.10
Integrated Project Delivery         8.56   15.13
Other                               7.15   17.38
Construction Management Agency      5.49   12.05

Several respondents were employees with less than three years in their current companies (n=12), and no respondent had between 11 and 15 years of tenure in their current companies, as seen in Table 2. Furthermore, the current positions of respondents varied greatly, but nearly half were a combination of partners, owners, or senior management positions (n=11) and project engineers (n=9), as seen in Table 2. Most respondents were from the American Midwest (n=22), followed by the West (n=8), South (n=7), and Northeast (n=2) regions. Also, more than half of the respondents (n=24) indicated commercial as one of the top three types of projects they have worked on in their current company, as seen in Table 2.
Table 2. Years working and positions held with the current company (n=39)

Years working in the company     Frequency
Less than 3 years                12
3 – 5 years                      8
6 – 10 years                     5
11 – 15 years                    0
16 – 20 years                    5
21 – 30 years                    4
31 or more years                 5

Position held at current company         Frequency
Partner, owner, or senior management     11
Project engineer                         9
Architect, designer, or engineer         4
Project executive                        2
Quality or safety-related positions      1
Other positions                          4

Frequent project types             Frequency
Commercial and retail              24
Industrial1                        15
Residential2                       11
Healthcare                         11
Education                          11
Government and Public Assembly     5
Hospitality                        4
Sports and Recreation              4
Data Centre                        4
Infrastructure                     2

1 Includes manufacturing and pharmaceutical. 2 Includes multifamily projects.

Following the topics presented in the interview results, we use a similar structure for the survey results. In each subsection, the survey results are presented first, followed by a brief comparison with the interview results.

LEVEL OF AWARENESS AND PROJECTS EVALUATED

More than half of the respondents were very or extremely familiar with project evaluation procedures in their companies (extremely familiar n=14; very familiar n=12). Eleven considered themselves moderately familiar with project evaluation procedures, while one participant was only slightly familiar, and one was not at all familiar with project evaluation procedures in their company. Project evaluation procedures largely varied per type of project or sector within a company (n=16) or were standardized within the company (n=10), as seen in Figure 5. Approximately 76.82% of projects in survey participants' companies are evaluated (with a standard deviation of 22.79%), ranging from a minimum of 10% to a maximum of 100%.

Figure 5. Project evaluation standardization (n=39) (Procedures vary per type of project/sector: 16; Standard procedures for the whole company: 10; Procedures vary per department: 5; Procedures vary based on team decisions: 5; Procedures vary based on local headquarters definitions: 2; I do not know: 1)

For this subsection, the awareness level of participants in both phases about lessons learned procedures within their companies was similar, with survey participants being slightly more informed (71.8% very or extremely familiar) than interview participants (66.7% highly familiar). Furthermore, the average percentage of projects evaluated in the companies where survey participants worked was slightly lower than the mode from interview participants, half of whom indicated that their companies evaluated all projects. The level of standardization was not discussed thoroughly with interview participants; therefore, a separate question about the topic was added to the survey, yielding the results presented in Figure 5.

FACTORS AFFECTING THE DECISION TO EVALUATE OR NOT EVALUATE PROJECTS

The condition most frequently mentioned by participants that causes a project to be evaluated was the complexity of the project (n=33), followed by size (n=31), type of project (n=26), and budget of the project (n=24). Figure 6 includes the results for all options in this question. Furthermore, three participants included additional conditions for projects to be evaluated – two mentioned problems or complications in projects, and one indicated the assessment of company performance.
Figure 6. Conditions for project evaluation (responses of yes, sometimes, or no for complexity, size, type, budget, delivery method, contractual obligations, and company policy)

In this section, findings from the survey seem to provide more depth to the findings from the interviews. This is because, during the interviews, size was most frequently mentioned but not always well defined by participants, with its meaning varying from physical size to budget size. In addition, it seems complexity may be confounded with physical size and budget, and when included as an option in the survey, it was the most frequently chosen condition, closely followed by size. Factors that lead a project to not be evaluated were not covered in the scope of the survey but were explored in the interviews, during which two participants mentioned that repeat projects (meaning projects that have similar design aspects and similar teams) were less likely to be evaluated after their completion.

TIMING OF POST-PROJECT EVALUATIONS

Several participants mentioned that project evaluations are conducted within one month of substantial completion of a project (n=15), followed by evaluations conducted between three and six months (n=11) and between one and two months after substantial completion (n=9). Only three respondents indicated that their companies conduct project evaluations more than six months after project completion, and one participant did not answer this question. Moreover, several participants (n=15) mentioned their companies always conduct periodic evaluations of a project, that is, evaluations while the project is still under construction, followed by participants who mentioned their companies conduct continuous evaluations either half or most of the time (n=12), and then participants who mentioned their companies conduct periodic evaluations sometimes (n=10). Two participants mentioned that their companies do not conduct periodic project evaluations.

Responses on this topic seem to echo the findings from the interviews. In the interviews, five participants mentioned post-project evaluations being completed between one and six months after project completion. However, the survey allowed more granularity within this six-month window. The survey results show that a large share of companies perform post-project evaluations within the first month after a project is completed. Furthermore, the survey included an explicit question about continuous evaluation, which was not asked directly during the interviews but emerged as a discussion topic for four interview participants and was then added to the survey; the responses reveal that several companies do perform continuous evaluation procedures on most or all of their projects.

POST-PROJECT EVALUATIONS PROCEDURES

Most of the time, this evaluation process is organized by the project manager or project executive (n=24). However, seven participants mentioned that their companies did not have a specific department or person to conduct evaluations and disseminate findings. Five participants mentioned other departments or persons, including senior management, vice-president of operations, business development, construction administration, and project controls. Project evaluations are also either always or primarily conducted through meetings (n=36). When considering all responses, project managers are frequently involved in project evaluations (n=32), followed by project executives, partners, or senior management (n=29).
These positions are likely included in most of the participants' companies. Topics frequently discussed during project evaluations include budget (n=32), schedule (n=30), project team collaboration (n=27), stakeholder or subcontractor performance (n=23), design (n=20), safety (n=19), and employee evaluation (n=14). When provided with an open space to add more topics, two participants mentioned discussions of problems and outcomes, followed by additional topics suggested by participants (with one mention each): constructability, identifying value added, building performance, and liability issues.

Focusing on the results of the 31 construction-related companies (general contractor, subcontractor, or integrated firm), superintendents participated in project evaluations most of the time for about two-thirds of those companies (n=19). Interestingly, quality or safety managers were only mentioned as being present most of the time by eleven construction-related participants. Six construction-related participants also included other personnel who participate in project evaluations in their companies most of the time; examples of these participants are president, construction manager, design team (mentioned by a participant in an integrated firm), 'DFO' (respondent did not explain the acronym), estimating, and professional engineers (each mentioned once). In these companies, project evaluation also rarely involves stakeholders outside of the company, as seen in Table 3.

Table 3. Project evaluations involving stakeholders outside of construction-related companies

Stakeholder                             N    Mean   Median   Standard Deviation
Owner                                   27   1.78   1        0.85
Designers, Architects, and Engineers    26   1.62   1        0.81
Subcontractor                           27   1.22   1        0.42

Answers ranging from 'rarely' = 1, 'about half of the time' = 2, and 'most of the time' = 3. 'Not applicable' answers not included.

Information related to post-project evaluation procedures gathered in the survey mainly focused on team composition and topics covered, presenting results similar to the interviews. For example, findings from both phases suggest the core team (usually composed of the project executive, project manager, and superintendent assigned to the project) is frequently involved with post-project evaluation, with superintendents being perhaps less involved than project managers and project executives. In terms of topics covered, both phases suggest budget and schedule as the main discussion topics for post-project evaluations, though the survey suggests collaboration is also of concern, which was not as frequently mentioned during the interviews.

LONG-TERM STORAGE AND DISSEMINATION EFFORTS IN CONSTRUCTION-RELATED COMPANIES

Dissemination of lessons learned within construction-related companies is most likely to occur by email, shared drive, or in meetings. More information can be found in Table 4. Moreover, five of the 31 companies that answered this question rarely used any of the dissemination methods, while 24 had at least one method that they used most of the time. This suggests a diversification in the use of media to disseminate information within companies, using methods that 'push' information to other employees, such as email and presentations, and others that can be 'pulled' by employees, such as a shared drive.
Table 4. Methods used to disseminate project evaluation results within companies

Method                         N    Most of the time   About half of the time   Rarely   Mean   Median   Standard Deviation
Email                          29   13                 6                        10       2.10   2        0.90
Shared drive                   28   12                 5                        11       2.04   2        0.92
Presentations and meetings     29   11                 7                        11       2.00   2        0.87
Printed report                 24   9                  1                        14       1.79   1        0.98
Project management software    22   6                  4                        12       1.73   1        0.88
Newsletter                     21   2                  4                        15       1.38   1        0.67

Answers ranging from 'rarely' = 1, 'about half of the time' = 2, and 'most of the time' = 3. 'Not applicable' answers not included.

When questioned about the long-term availability of lessons learned information to other employees, responses were scattered. For example, the two most frequent responses of the 31 received indicated either that the resulting report from lessons learned was available to all employees (n=8) or that it was not available at all (n=7). Other responses indicated that information was only available to senior management (n=6), to employees in the same sector or department (n=5), or to those on the same team (n=4), or that availability depends on the project (n=1). Similarly to the involvement of outside-of-company stakeholders in project evaluations, the reports produced by those evaluations in construction-related companies are very likely never to be shared outside of the originating organization, as seen in Table 5.

Table 5. Outside-company stakeholders with whom project evaluation reports are shared (construction-related respondents)

Stakeholder          N    Mean   Median   Standard Deviation
Customer or owner    27   1.52   1        0.80
Engineers            25   1.40   1        0.65
Designers            26   1.38   1        0.64
Subcontractor        27   1.15   1        0.36

Answers ranging from 'never' = 1, 'about half of the time' = 2, and 'most of the time' = 3. 'Not applicable' answers not included.

In this subsection, we note that both phases stressed that information gathered during post-project evaluations is rarely shared outside of the companies. For dissemination within companies, email was identified in the survey as the preferred tool for internal dissemination of post-project evaluation results, while the interviews indicated it much less frequently. The differences here may be a result of the coding and coverage of the interviews, given that survey participants were able to rate all presented options, while the data from interviews was coded by the researchers based on open-ended questions.

Finally, seven participants provided additional input in the last open-ended question about project evaluations. Among those, select answers include one participant who suggested that evaluations are put on the calendar for thirty days after final completion to keep information fresh in everyone's mind; one participant mentioned that sometimes project evaluations are conducted randomly by third-party auditors; and one participant mentioned that information is collected in a spreadsheet, but the compiled information can be overwhelming, demanding time for people to review it thoroughly. This added time increases the original time planned between project completion and evaluation, or 'lag' as mentioned by the participant.
One designer mentioned conducting three to four project evaluations – at the end of schematic design, at the end of construction documents, at the end of construction, and four to ten months after the project, close to when warranties expire; furthermore, the same participant mentioned that a midway review, especially during construction, might lead to poor results due to the high stress of other stakeholders (namely the contractor and owner), causing an "uneven assessment of the overall project." Finally, one participant mentioned that "standardization of lessons learned will be the key to our future success as our business continues to grow. We have a more informal process now and do not disseminate the results enough."

Discussion

Our findings echo previous research in KM indicating that employees recognize the importance of sharing project information with others for process improvement (Hwang, 2022), and that knowledge dissemination remains a challenge (Hwang, 2022; Saini, Arif and Kulonda, 2019; Yap and Toh, 2020). Moreover, our findings suggest that lessons learned procedures are more often conducted in US construction companies than previously noted (see findings from Yohe, 2006), yet continuous project evaluation is not. This happens despite research indicating a rise in the use of lean concepts by American contractors (Ghosh and Burghart, 2021) and the focus of lean on continuous evaluation and improvement procedures (Aziz and Hafez, 2013). Moreover, our analysis resulted in three main topics of discussion specifically related to lessons learned: one focused on the specific procedures within US construction companies (typical topics and procedures), one related to KM sharing and reuse (barriers to internal and external dissemination), and one related to the current KM context in the AEC industry (lessons learned procedures remain in the first generation of KM in AEC).

Typical topics and procedures. Based on our findings, lessons learned procedures in construction-related companies are composed of a two-step approach. The process starts with the compilation of different types of information, including forms and reports. Then, a meeting is called to debrief the main findings and review more qualitative aspects of the job, such as team performance. This approach seems similar to what was mentioned by Yohe (2006), especially concerning the emphasis on people interaction and the use of meetings and interviews for debriefing. Post-project evaluations in American AEC companies seem to include project managers or project executives and superintendents, but not as many estimating professionals, which differs from Carrillo, Ruikar and Fuller (2013). Our findings indicated that the topics typically included in post-project evaluation meetings were budget, schedule, and team collaboration, which is different from previous research by Yohe (2006) and more aligned with Hwang (2022), who evaluated KM procedures, of which post-project evaluations are a part. Moreover, our findings suggest that the post-project evaluation primarily evaluates budget and schedule concerns and disparities of the project, with a secondary focus on team performance and interaction, and finally design errors and omissions.
Even though conducting post-project evaluations seems to happen consistently in most American AEC projects, the dissemination of their findings remains low, echoing previous research from Yohe (2006), and remains a challenge, echoing recent research on KM in construction (Hwang, 2022; Vaz-Serra and Edwards, 2020; Yap and Toh, 2020). Involving outside-of-the-company stakeholders in project evaluations is rare, mostly because negative findings may reflect poorly on a team, as pointed out by Carrillo, Ruikar and Fuller (2013), and also because of the lack of trust between project stakeholders (Saini, Arif and Kulonda, 2019). Our interview participants mentioned the need to make post-evaluation meetings a safe environment for participants to share their thoughts, and the presence of other stakeholders may bias how they act in those meetings. Furthermore, some of our interview participants mentioned holding a separate meeting with outside stakeholders, during which topics are targeted and, therefore, the fear of negative impressions seems to be reduced. This was done primarily for repeat trade partners or collaborators, stressing the focus on using lessons learned for competitive advantage (Forcada, et al., 2013).

Moreover, our findings indicate that the transformation of post-project evaluation information into knowledge that is reused in other projects is still limited, concurring with previous research in KM (Hwang, 2022; Yap and Toh, 2020). Two of the three most used dissemination tools identified in this research (email, and presentations and meetings) are focused on 'in-the-moment' information and are difficult to retrieve in the long term. The processes undertaken in dissemination seem scattered, and it is unclear how they are being used for the transfer of knowledge or the generation of new knowledge, concurring with previous research (Carrillo, Ruikar and Fuller, 2013; Vaz-Serra and Edwards, 2020; Yohe, 2006). Previous research outlines the drivers for improving knowledge management processes in construction companies and the reliance on a local, departmental-level focus for the sharing of information (Hwang, 2022; Kamara, et al., 2002). However, our research also indicated varying levels of sharing, from reports being available to all employees to reports not being available at all. The role of post-project evaluation organizer in our results seems to rest with project managers. Furthermore, during the interviews, no mention was made of a knowledge manager role, a position that some previous researchers have identified in construction companies (see Carrillo, Ruikar and Fuller, 2013; Forcada, et al., 2013; Tan, et al., 2007).

Lessons learned procedures remain in the first generation of KM in AEC. Despite recent advances toward more collaborative delivery methods, and despite previous research indicating that the AEC industry has the capability, in some aspects, to be on what Rezgui, Hopfe and Vorakulpipat (2010) describe as the verge of the third generation of KM (knowledge value creation), our findings place the use of post-project evaluation information as still struggling with knowledge sharing, that is, in the first generation of KM in AEC. The focus of project evaluations in the surveyed companies remains largely on capturing information to improve their efficiency rather than on creating new knowledge or processes (Kamara, et al., 2002).
This is an interesting finding because, despite improvements in the use of building information modeling (BIM) and other ICT environments that allow for the sharing of information and knowledge across different AEC stakeholders (Deng, et al., 2022), lessons learned (and the knowledge they generate) in construction companies remain primarily subject to human interpretation, with little recognition of the value they add in the creation of new value. These findings align well with Saini, Arif and Kulonda (2019), who identified that resistance to moving away from the status quo, the lack of motivation, and the lack of knowledge about KM benefits affect the improvement of KM within the construction supply chain, which would encompass learning between organizations. Taking Rezgui, Hopfe and Vorakulpipat's (2010) stages-of-KM framework, and based on the present findings, the authors question the move to the third generation of knowledge management (knowledge value creation) before the management of lessons learned has reached the second generation (knowledge conceptualization and nurturing), and even question the placement of the construction industry at this second generation. Moreover, a question emerges as to whether multiple generations of knowledge management can be present at the same time in the same organization.

Limitations

Our work has two main limitations. First, even though the researchers aimed for a broader AEC focus, respondents in both phases were mainly from construction-related companies. Therefore, our findings should be interpreted with caution by architecture and engineering companies. Second, the researchers also believe that recruitment was affected by COVID-19, especially for phase 2 of the study, for which we had hoped to obtain at least fifty completed responses. We note that recruitment for both phases occurred between 2020 and 2021.

Conclusion

The present research used a mixed-method approach, with interviews and an online survey, to gather perceptions and current procedures related to gathering and disseminating lessons learned in US AEC companies. Most respondents were from construction companies; therefore, our findings are primarily skewed toward the construction sector and its practices. The current study provides two key points. The first key point relates to common practices for post-project evaluations and suggests that those practices are frequently internal to the companies. Our findings show that post-project evaluations are frequently conducted in participants' companies. Most frequently, this is a two-step approach with a review of documents followed by a meeting organized by project management and involving key stakeholders of the project team (frequently a project executive or senior management and a superintendent). Reflecting the time-cost focus of the industry, budget and schedule performance are frequently discussed in the meetings, along with team performance. In addition, there is an effort to make the meeting a 'safe space' for participants; thus, only the internal project team is present.
Very rarely do post-project evaluations involve stakeholders outside of the company, and when they do, those stakeholders are most likely to be trade partners (designers or subcontractors) that have repeatedly worked with the company; informing them of lessons learned therefore provides the originating company with a competitive advantage for future work. Consequently, despite the rise in collaborative delivery methods, knowledge management seems to be focused on individual AEC companies, with few exceptions.

The second key point of our findings confirms previous research indicating that the information generated in post-project evaluations is poorly disseminated and rarely reused by others within the company. This suggests that, despite the emergence of new collaborative delivery methods and of processes that emphasize the reuse and effective management of information, such as BIM and project management systems, lessons learned procedures are still placed in the first generation of knowledge management, in which the focus is on sharing knowledge rather than on conceptualizing and nurturing it or creating value from it, the latter being the ultimate goal of knowledge management processes. The findings support researchers exploring how to improve KM harnessing and sharing within and between AEC companies and, from a practitioner's perspective, provide a benchmark of current lessons learned procedures in US companies. Future work could also explore the role of BIM and ICTs in improving KM harnessing and dissemination within and between AEC companies. Additionally, further research could take the barriers and limitations outlined in the present research and explore how they affect KM harnessing and sharing (such as trust and legal issues); compare different types of KM techniques among AEC companies; provide focused analyses utilizing inferential statistical tests and larger sample sizes; and explore the development of the knowledge manager role within construction companies.

References

Al-Ghassani, A., Anumba, C., Carrillo, P. and Robinson, H., 2005. Tools and Techniques for Knowledge Management. In: Anumba, C.J., Egbu, C.O. and Carrillo, P.M., eds. 2005. Knowledge management in construction. Blackwell Publishing. Ch.6. https://doi.org/10.1002/9780470759554.ch6

Alavi, M. and Leidner, D.E., 2001. Knowledge management and knowledge management systems: Conceptual foundations and research issues. MIS Quarterly, [e-journal] 25(1), pp.107-36. https://doi.org/10.2307/3250961

Anumba, C., Egbu, C. and Carrillo, P.M., 2005. Concluding Notes. In: Anumba, C.J., Egbu, C.O. and Carrillo, P.M., eds. 2005. Knowledge management in construction. Blackwell Publishing. Ch.13. https://doi.org/10.1002/9780470759554.ch13

Arif, M., Mohammed, A.-Z. and Gupta, A.D., 2015. Understanding knowledge sharing in the Jordanian construction industry. Construction Innovation, [e-journal] 15(3), pp.333-54. https://doi.org/10.1108/CI-03-2014-0018

Aziz, R.F. and Hafez, S.M., 2013. Applying lean thinking in construction and performance improvement. Alexandria Engineering Journal, [e-journal] 52(4), pp.679-95. https://doi.org/10.1016/j.aej.2013.04.008

Carrillo, P. and Chinowsky, P., 2006. Exploiting knowledge management: The engineering and construction perspective. Journal of Management in Engineering, [e-journal] 22(1), pp.2-10. https://doi.org/10.1061/(ASCE)0742-597X(2006)22:1(2)
Carrillo, P., Ruikar, K. and Fuller, P., 2013. When will we learn? Improving lessons learned practice in construction. International Journal of Project Management, [e-journal] 31(4), pp.567-78. https://doi.org/10.1016/j.ijproman.2012.10.005

Dave, B. and Koskela, L., 2009. Collaborative knowledge management—A construction case study. Automation in Construction, [e-journal] 18(7), pp.894-902. https://doi.org/10.1016/j.autcon.2009.03.015

Deng, H., Xu, Y., Deng, Y. and Lin, J., 2022. Transforming knowledge management in the construction industry through information and communications technology: A 15-year review. Automation in Construction, [e-journal] 142, 104530, pp.1-20. https://doi.org/10.1016/j.autcon.2022.104530

Disterer, G., 2002. Management of project knowledge and experiences. Journal of Knowledge Management, [e-journal] 6(5), pp.512-20. https://doi.org/10.1108/13673270210450450

Egbu, C. and Robinson, H., 2005. Construction as a Knowledge-Based Industry. In: Anumba, C.J., Egbu, C.O. and Carrillo, P.M., eds. Knowledge management in construction. Blackwell Publishing. Ch.3. https://doi.org/10.1002/9780470759554.ch3

Forcada, N., Fuertes, A., Gangolells, M., Casals, M. and Macarulla, M., 2013. Knowledge management perceptions in construction and design companies. Automation in Construction, [e-journal] 29, pp.83-91. https://doi.org/10.1016/j.autcon.2012.09.001

Ghosh, S. and Burghart, J., 2021. Lean construction: experience of US contractors. International Journal of Construction Education and Research, [e-journal] 17(2), pp.133-53. https://doi.org/10.1080/15578771.2019.1696902

Hwang, S., 2022. Sharing tacit knowledge in small-medium regional construction companies in the US: the current status and the impact of organizational ecology. International Journal of Construction Management, [e-journal] 22(9), pp.1746-55. https://doi.org/10.1080/15623599.2020.1742628

Kamara, J.M., Augenbroe, G., Anumba, C.J. and Carrillo, P.M., 2002. Knowledge management in the architecture, engineering and construction industry. Construction Innovation, [e-journal] 2(1), pp.53-67. https://doi.org/10.1108/14714170210814685

McGraw Hill Construction, 2014. Project Delivery Systems: how they impact efficiency and profitability in the building sector (Smart Market Report). [pdf] Bedford, MA: McGraw Hill Construction Research & Analytics. Available at: https://dbia.org/wp-content/uploads/2018/05/Research-Project-Delivery-Systems-SmartMarket.pdf [Accessed 15 March 2022].

Nesan, J., 2012. Factors Influencing Tacit Knowledge in Construction. Construction Economics and Building, [e-journal] 5(1), pp.48-57. https://doi.org/10.5130/AJCEB.v5i1.2943

Pathirage, C.P., Amaratunga, D.G. and Haigh, R.P., 2007. Tacit knowledge and organisational performance: construction industry perspective. Journal of Knowledge Management, [e-journal] 11(1), pp.115-26. https://doi.org/10.1108/13673270710728277

Rezgui, Y., Hopfe, C.J. and Vorakulpipat, C., 2010. Generations of knowledge management in the architecture, engineering and construction industry: An evolutionary perspective. Advanced Engineering Informatics, [e-journal] 24(2), pp.219-28. https://doi.org/10.1016/j.aei.2009.12.001
Saini, M., Arif, M. and Kulonda, D.J., 2019. Challenges to transferring and sharing of tacit knowledge within a construction supply chain. Construction Innovation, [e-journal] 19(1), pp.5-33. https://doi.org/10.1108/CI-03-2018-0015

Saldaña, J., 2009. The coding manual for qualitative researchers. 3rd ed. Thousand Oaks, CA: Sage Publications.

Tan, H.C., Carrillo, P.M., Anumba, C.J., Bouchlaghem, N., Kamara, J.M. and Udeaja, C.E., 2007. Development of a methodology for live capture and reuse of project knowledge in construction. Journal of Management in Engineering, [e-journal] 23(1), pp.18-26. https://doi.org/10.1061/(ASCE)0742-597X(2007)23:1(18)

Vaz-Serra, P. and Edwards, P., 2020. Addressing the knowledge management "nightmare" for construction companies. Construction Innovation, [e-journal] 21(2), pp.300-20. https://doi.org/10.1108/CI-02-2019-0013

Wang, H. and Meng, X., 2019. Transformation from IT-based knowledge management into BIM-supported knowledge management: A literature review. Expert Systems with Applications, [e-journal] 121, pp.170-87. https://doi.org/10.1016/j.eswa.2018.12.017

Woo, J.H., Clayton, M.J., Johnson, R.E., Flores, B.E. and Ellis, C., 2004. Dynamic Knowledge Map: reusing experts' tacit knowledge in the AEC industry. Automation in Construction, [e-journal] 13(2), pp.203-07. https://doi.org/10.1016/j.autcon.2003.09.003

Yang, L.R., Chen, J.H. and Wang, H.W., 2012. Assessing impacts of information technology on project success through knowledge management practice. Automation in Construction, [e-journal] 22, pp.182-91. https://doi.org/10.1016/j.autcon.2011.06.016

Yap, J.B.H. and Toh, H.M., 2020. Investigating the principal factors impacting knowledge management implementation in construction organisations. Journal of Engineering, Design and Technology, [e-journal] 18(1), pp.55-69. https://doi.org/10.1108/JEDT-03-2019-0069

Yohe, A.M., 2006. An analysis of lessons learned programs in the construction industry. PhD. University of Texas at Austin.

Yu, D. and Yang, J., 2018. Knowledge management research in the construction industry: a review. Journal of the Knowledge Economy, [e-journal] 9(3), pp.782-803. https://doi.org/10.1007/s13132-016-0375-7