From: "Steven Shi" <steven.shi@intel.com>
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com
Subject: [edk2-devel] [PATCH 1/4] BaseTools: store more complete output files in binary cache
Date: Tue, 19 Nov 2019 17:26:58 +0800
Message-Id: <20191119092701.22988-2-steven.shi@intel.com>
In-Reply-To: <20191119092701.22988-1-steven.shi@intel.com>
References: <20191119092701.22988-1-steven.shi@intel.com>

From: "Shi, Steven" <steven.shi@intel.com>
The binary cache uses the OutputFile method to collect the module build
output files that need to be stored in the cache, but the current
OutputFile implementation does not return the complete set of output
files. Enhance the OutputFile method to return a more complete set of
output files.

Cc: Liming Gao
Cc: Bob Feng
Signed-off-by: Steven Shi
---
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 20 ++++----------------
 1 file changed, 4 insertions(+), 16 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index 1f45bac846..be87e58f58 100755
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -1291,28 +1291,16 @@ class ModuleAutoGen(AutoGen):
     def OutputFile(self):
         retVal = set()
 
-        OutputDir = self.OutputDir.replace('\\', '/').strip('/')
-        DebugDir = self.DebugDir.replace('\\', '/').strip('/')
-        for Item in self.CodaTargetList:
-            File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
-            NewFile = path.join(self.OutputDir, File)
-            retVal.add(NewFile)
-
-        Bin = self._GenOffsetBin()
-        if Bin:
-            NewFile = path.join(self.OutputDir, Bin)
-            retVal.add(NewFile)
-
-        for Root, Dirs, Files in os.walk(self.OutputDir):
+        for Root, Dirs, Files in os.walk(self.BuildDir):
             for File in Files:
                 # lib file is already added through above CodaTargetList, skip it here
-                if not (File.lower().endswith('.obj') or File.lower().endswith('.lib')):
-                    NewFile = path.join(self.OutputDir, File)
+                if not (File.lower().endswith('.obj') or File.lower().endswith('.debug')):
+                    NewFile = path.join(Root, File)
                     retVal.add(NewFile)
 
         for Root, Dirs, Files in os.walk(self.FfsOutputDir):
             for File in Files:
-                NewFile = path.join(self.FfsOutputDir, File)
+                NewFile = path.join(Root, File)
                 retVal.add(NewFile)
 
         return retVal
--
2.16.1.windows.4
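
For readers skimming the diff: the enhanced OutputFile drops the
CodaTargetList/_GenOffsetBin special cases and simply walks the whole
module BuildDir plus the FFS output dir. A minimal standalone sketch of
that pattern follows; the function name and arguments are illustrative
only, not BaseTools API:

    import os

    def collect_output_files(build_dir, ffs_output_dir):
        # Walk the whole build tree, keeping full per-directory paths
        # (join(root, name)) instead of re-rooting everything at the
        # top-level output dir as the old implementation did.
        ret_val = set()
        for root, _dirs, files in os.walk(build_dir):
            for name in files:
                # .obj intermediates and .debug files are excluded
                if not (name.lower().endswith('.obj') or
                        name.lower().endswith('.debug')):
                    ret_val.add(os.path.join(root, name))
        for root, _dirs, files in os.walk(ffs_output_dir):
            for name in files:
                ret_val.add(os.path.join(root, name))
        return ret_val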

From: "Steven Shi" <steven.shi@intel.com>
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com
Subject: [edk2-devel] [PATCH 2/4] BaseTools: enhance the CacheCopyFile method arg names
Date: Tue, 19 Nov 2019 17:26:59 +0800
Message-Id: <20191119092701.22988-3-steven.shi@intel.com>
In-Reply-To: <20191119092701.22988-1-steven.shi@intel.com>
References: <20191119092701.22988-1-steven.shi@intel.com>

From: "Shi, Steven" <steven.shi@intel.com>

Rename the CacheCopyFile method arguments so they are clearer and more
readable.

Cc: Liming Gao
Cc: Bob Feng
Signed-off-by: Steven Shi
---
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 6 +++---
 1 file changed, 3 insertions(+), 3 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index be87e58f58..d2c23eed6d 100755
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -1633,9 +1633,9 @@ class ModuleAutoGen(AutoGen):
 
         self.IsAsBuiltInfCreated = True
 
-    def CacheCopyFile(self, OriginDir, CopyDir, File):
-        sub_dir = os.path.relpath(File, CopyDir)
-        destination_file = os.path.join(OriginDir, sub_dir)
+    def CacheCopyFile(self, DestDir, SourceDir, File):
+        sub_dir = os.path.relpath(File, SourceDir)
+        destination_file = os.path.join(DestDir, sub_dir)
         destination_dir = os.path.dirname(destination_file)
         CreateDirectory(destination_dir)
         try:
--
2.16.1.windows.4
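
The rename is cosmetic but documents the data flow: File lives under
SourceDir, and its sub-path relative to SourceDir is re-rooted under
DestDir. A self-contained sketch of that re-rooting, with shutil.copy2
standing in for the CopyFileOnChange helper the real method uses:

    import os
    import shutil

    def cache_copy_file(dest_dir, source_dir, file_path):
        # e.g. /cache/Module/a/b.efi -> /build/Module/a/b.efi
        sub_path = os.path.relpath(file_path, source_dir)
        destination = os.path.join(dest_dir, sub_path)
        os.makedirs(os.path.dirname(destination), exist_ok=True)
        shutil.copy2(file_path, destination)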

From: "Steven Shi" <steven.shi@intel.com>
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com
Subject: [edk2-devel] [PATCH 3/4] BaseTools: Leverage compiler output to optimize binary cache
Date: Tue, 19 Nov 2019 17:27:00 +0800
Message-Id: <20191119092701.22988-4-steven.shi@intel.com>
In-Reply-To: <20191119092701.22988-1-steven.shi@intel.com>
References: <20191119092701.22988-1-steven.shi@intel.com>

From: Steven Shi <steven.shi@intel.com>

Redesign the binary cache to rely on the compiler to output the
dependency header file information for every module. The binary cache
directly consumes this dependency information and no longer parses the
C source code itself. Also redesign the dependency file list format for
modules, and share common library hash results as much as possible
within the local process. Remove the unnecessary shared data access
across multiprocessing.
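
The core idea can be sketched in a few lines: read the compiler-emitted
dependency list (the deps.txt this patch reads from the module
BuildDir), hash each listed header, and persist the per-file hash chain
keyed by the overall digest. Everything below except the deps.txt name
and the "<name>.hashchain.<md5hex>" naming convention is illustrative:

    import hashlib
    import json
    import os

    def gen_hash_chain(build_dir, name="module"):
        # Compiler-produced dependency list, one path per line.
        deps_file = os.path.join(build_dir, "deps.txt")
        with open(deps_file, "r") as fd:
            headers = sorted({ln.strip() for ln in fd
                              if ln.strip().endswith(".h")})

        chain = []             # [(path, per-file md5 hex), ...]
        total = hashlib.md5()  # digest over all file contents
        for header in headers:
            with open(header, "rb") as f:
                content = f.read()
            total.update(content)
            chain.append((header, hashlib.md5(content).hexdigest()))

        # A later build proves a cache hit by re-hashing each listed
        # file and comparing against the recorded chain.
        out = os.path.join(build_dir,
                           name + ".hashchain." + total.hexdigest())
        with open(out, "w") as f:
            json.dump(chain, f, indent=2)
        return out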
Cc: Liming Gao Cc: Bob Feng Signed-off-by: Steven Shi --- BaseTools/Source/Python/AutoGen/AutoGenWorker.py | 77 +- BaseTools/Source/Python/AutoGen/DataPipe.py | 2 + BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 1082 +++++++++-------= ---- .../Source/Python/AutoGen/WorkspaceAutoGen.py | 64 +- BaseTools/Source/Python/Common/GlobalData.py | 35 +- BaseTools/Source/Python/build/build.py | 276 +++-- 6 files changed, 714 insertions(+), 822 deletions(-) diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/S= ource/Python/AutoGen/AutoGenWorker.py index 94ea61a487..40b448f5b2 100755 --- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py +++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py @@ -128,12 +128,27 @@ class AutoGenManager(threading.Thread): clearQ(taskq) clearQ(self.feedback_q) clearQ(logq) + # Copy the cache queue itmes to parent thread before clear + cacheq =3D self.autogen_workers[0].cache_q + try: + cache_num =3D 0 + while True: + item =3D cacheq.get() + if item =3D=3D "CacheDone": + cache_num +=3D 1 + else: + GlobalData.gModuleAllCacheStatus.add(item) + if cache_num =3D=3D len(self.autogen_workers): + break + except: + print ("cache_q error") + def TerminateWorkers(self): self.error_event.set() def kill(self): self.feedback_q.put(None) class AutoGenWorkerInProcess(mp.Process): - def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_loc= k,cache_lock,share_data,log_q,error_event): + def __init__(self,module_queue,data_pipe_file_path,feedback_q,file_loc= k,cache_q,log_q,error_event): mp.Process.__init__(self) self.module_queue =3D module_queue self.data_pipe_file_path =3Ddata_pipe_file_path @@ -141,8 +156,7 @@ class AutoGenWorkerInProcess(mp.Process): self.feedback_q =3D feedback_q self.PlatformMetaFileSet =3D {} self.file_lock =3D file_lock - self.cache_lock =3D cache_lock - self.share_data =3D share_data + self.cache_q =3D cache_q self.log_q =3D log_q self.error_event =3D error_event def GetPlatformMetaFile(self,filepath,root): @@ -184,12 +198,19 @@ class AutoGenWorkerInProcess(mp.Process): GlobalData.gDisableIncludePathCheck =3D False GlobalData.gFdfParser =3D self.data_pipe.Get("FdfParser") GlobalData.gDatabasePath =3D self.data_pipe.Get("DatabasePath") + + GlobalData.gUseHashCache =3D self.data_pipe.Get("UseHashCache") GlobalData.gBinCacheSource =3D self.data_pipe.Get("BinCacheSou= rce") GlobalData.gBinCacheDest =3D self.data_pipe.Get("BinCacheDest") - GlobalData.gCacheIR =3D self.share_data + GlobalData.gPlatformHashFile =3D self.data_pipe.Get("PlatformH= ashFile") + GlobalData.gModulePreMakeCacheStatus =3D dict() + GlobalData.gModuleMakeCacheStatus =3D dict() + GlobalData.gHashChainStatus =3D dict() + GlobalData.gCMakeHashFile =3D dict() + GlobalData.gModuleHashFile =3D dict() + GlobalData.gFileHashDict =3D dict() GlobalData.gEnableGenfdsMultiThread =3D self.data_pipe.Get("En= ableGenfdsMultiThread") GlobalData.file_lock =3D self.file_lock - GlobalData.cache_lock =3D self.cache_lock CommandTarget =3D self.data_pipe.Get("CommandTarget") pcd_from_build_option =3D [] for pcd_tuple in self.data_pipe.Get("BuildOptPcd"): @@ -205,10 +226,6 @@ class AutoGenWorkerInProcess(mp.Process): GlobalData.FfsCmd =3D FfsCmd PlatformMetaFile =3D self.GetPlatformMetaFile(self.data_pipe.G= et("P_Info").get("ActivePlatform"), self.data_pipe.Get("P_Info").= get("WorkspaceDir")) - libConstPcd =3D self.data_pipe.Get("LibConstPcd") - Refes =3D self.data_pipe.Get("REFS") - GlobalData.libConstPcd =3D libConstPcd - GlobalData.Refes =3D Refes while True: if 
self.module_queue.empty(): break @@ -230,27 +247,41 @@ class AutoGenWorkerInProcess(mp.Process): toolchain =3D self.data_pipe.Get("P_Info").get("ToolChain") Ma =3D ModuleAutoGen(self.Wa,module_metafile,target,toolch= ain,arch,PlatformMetaFile,self.data_pipe) Ma.IsLibrary =3D IsLib - if IsLib: - if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaF= ile.Path) in libConstPcd: - Ma.ConstPcd =3D libConstPcd[(Ma.MetaFile.File,Ma.M= etaFile.Root,Ma.Arch,Ma.MetaFile.Path)] - if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaF= ile.Path) in Refes: - Ma.ReferenceModules =3D Refes[(Ma.MetaFile.File,Ma= .MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)] - if GlobalData.gBinCacheSource and CommandTarget in [None, = "", "all"]: - Ma.GenModuleFilesHash(GlobalData.gCacheIR) - Ma.GenPreMakefileHash(GlobalData.gCacheIR) - if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR): + # SourceFileList calling sequence impact the makefile stri= ng sequence. + # Create cached SourceFileList here to unify its calling s= equence for both + # CanSkipbyPreMakeCache and CreateCodeFile/CreateMakeFile. + RetVal =3D Ma.SourceFileList + if GlobalData.gUseHashCache and not GlobalData.gBinCacheDe= st and CommandTarget in [None, "", "all"]: + try: + CacheResult =3D Ma.CanSkipbyPreMakeCache() + except: + CacheResult =3D False + traceback.print_exc(file=3Dsys.stdout) + self.feedback_q.put(taskname) + + if CacheResult: + self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "PreM= akeCache", True)) continue + else: + self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "PreM= akeCache", False)) =20 Ma.CreateCodeFile(False) Ma.CreateMakeFile(False,GenFfsList=3DFfsCmd.get((Ma.MetaFi= le.Path, Ma.Arch),[])) =20 if GlobalData.gBinCacheSource and CommandTarget in [None, = "", "all"]: - Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR) - Ma.GenMakeHash(GlobalData.gCacheIR) - if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR): + try: + CacheResult =3D Ma.CanSkipbyMakeCache() + except: + CacheResult =3D False + traceback.print_exc(file=3Dsys.stdout) + self.feedback_q.put(taskname) + + if CacheResult: + self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "Make= Cache", True)) continue else: - Ma.PrintFirstMakeCacheMissFile(GlobalData.gCacheIR) + self.cache_q.put((Ma.MetaFile.Path, Ma.Arch, "Make= Cache", False)) + except Empty: pass except: @@ -258,6 +289,8 @@ class AutoGenWorkerInProcess(mp.Process): self.feedback_q.put(taskname) finally: self.feedback_q.put("Done") + self.cache_q.put("CacheDone") + def printStatus(self): print("Processs ID: %d Run %d modules in AutoGen " % (os.getpid(),= len(AutoGen.Cache()))) print("Processs ID: %d Run %d modules in AutoGenInfo " % (os.getpi= d(),len(AutoGenInfo.GetCache()))) diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source= /Python/AutoGen/DataPipe.py index 078bafecb4..50403fbfb5 100755 --- a/BaseTools/Source/Python/AutoGen/DataPipe.py +++ b/BaseTools/Source/Python/AutoGen/DataPipe.py @@ -159,6 +159,8 @@ class MemoryDataPipe(DataPipe): =20 self.DataContainer =3D {"LogLevel": EdkLogger.GetLevel()} =20 + self.DataContainer =3D {"UseHashCache":GlobalData.gUseHashCache} + self.DataContainer =3D {"BinCacheSource":GlobalData.gBinCacheSourc= e} =20 self.DataContainer =3D {"BinCacheDest":GlobalData.gBinCacheDest} diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/S= ource/Python/AutoGen/ModuleAutoGen.py index d2c23eed6d..699c903c94 100755 --- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py +++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py @@ -6,7 +6,7 @@ # from __future__ import 
absolute_import from AutoGen.AutoGen import AutoGen -from Common.LongFilePathSupport import CopyLongFilePath +from Common.LongFilePathSupport import LongFilePath, CopyLongFilePath from Common.BuildToolError import * from Common.DataType import * from Common.Misc import * @@ -26,7 +26,6 @@ from Workspace.MetaFileCommentParser import UsageList from .GenPcdDb import CreatePcdDatabaseCode from Common.caching import cached_class_function from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo -from AutoGen.CacheIR import ModuleBuildCacheIR import json import tempfile =20 @@ -1634,6 +1633,9 @@ class ModuleAutoGen(AutoGen): self.IsAsBuiltInfCreated =3D True =20 def CacheCopyFile(self, DestDir, SourceDir, File): + if os.path.isdir(File): + return + sub_dir =3D os.path.relpath(File, SourceDir) destination_file =3D os.path.join(DestDir, sub_dir) destination_dir =3D os.path.dirname(destination_file) @@ -1645,105 +1647,73 @@ class ModuleAutoGen(AutoGen): return =20 def CopyModuleToCache(self): - self.GenPreMakefileHash(GlobalData.gCacheIR) - if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \ - not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].PreMak= efileHashHexDigest: - EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileH= ash for module: %s[%s]" % (self.MetaFile.Path, self.Arch)) - return False + # Find the MakeHashStr and PreMakeHashStr from latest MakeHashFile= List + # and PreMakeHashFileList files + MakeHashStr =3D None + PreMakeHashStr =3D None + MakeTimeStamp =3D 0 + PreMakeTimeStamp =3D 0 + Files =3D [f for f in os.listdir(LongFilePath(self.BuildDir)) if p= ath.isfile(LongFilePath(path.join(self.BuildDir, f)))] + for File in Files: + if ".MakeHashFileList." in File: + #find lastest file through time stamp + FileTimeStamp =3D os.stat(LongFilePath(path.join(self.Buil= dDir, File)))[8] + if FileTimeStamp > MakeTimeStamp: + MakeTimeStamp =3D FileTimeStamp + MakeHashStr =3D File.split('.')[-1] + if len(MakeHashStr) !=3D 32: + EdkLogger.quiet("[cache error]: wrong MakeHashFile= List file:%s" % (File)) + if ".PreMakeHashFileList." 
in File: + FileTimeStamp =3D os.stat(LongFilePath(path.join(self.Buil= dDir, File)))[8] + if FileTimeStamp > PreMakeTimeStamp: + PreMakeTimeStamp =3D FileTimeStamp + PreMakeHashStr =3D File.split('.')[-1] + if len(PreMakeHashStr) !=3D 32: + EdkLogger.quiet("[cache error]: wrong PreMakeHashF= ileList file:%s" % (File)) =20 - self.GenMakeHash(GlobalData.gCacheIR) - if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \ - not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHa= shChain or \ - not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHa= shHexDigest: - EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChai= n for module: %s[%s]" % (self.MetaFile.Path, self.Arch)) - return False + if not MakeHashStr: + EdkLogger.quiet("[cache error]: No MakeHashFileList file for m= odule:%s[%s]" % (self.MetaFile.Path, self.Arch)) + return + if not PreMakeHashStr: + EdkLogger.quiet("[cache error]: No PreMakeHashFileList file fo= r module:%s[%s]" % (self.MetaFile.Path, self.Arch)) + return =20 - MakeHashStr =3D str(GlobalData.gCacheIR[(self.MetaFile.Path, self.= Arch)].MakeHashHexDigest) - FileDir =3D path.join(GlobalData.gBinCacheDest, self.PlatformInfo.= OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceD= ir, self.MetaFile.BaseName, MakeHashStr) - FfsDir =3D path.join(GlobalData.gBinCacheDest, self.PlatformInfo.O= utputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs",= self.Guid + self.Name, MakeHashStr) + # Create Cache destination dirs + FileDir =3D path.join(GlobalData.gBinCacheDest, self.PlatformInfo.= OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceD= ir, self.MetaFile.BaseName) + FfsDir =3D path.join(GlobalData.gBinCacheDest, self.PlatformInfo.O= utputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs",= self.Guid + self.Name) + CacheFileDir =3D path.join(FileDir, MakeHashStr) + CacheFfsDir =3D path.join(FfsDir, MakeHashStr) + CreateDirectory (CacheFileDir) + CreateDirectory (CacheFfsDir) =20 - CreateDirectory (FileDir) - self.SaveHashChainFileToCache(GlobalData.gCacheIR) - ModuleFile =3D path.join(self.OutputDir, self.Name + '.inf') - if os.path.exists(ModuleFile): - CopyFileOnChange(ModuleFile, FileDir) + # Create ModuleHashPair file to support multiple version cache tog= ether + ModuleHashPair =3D path.join(FileDir, self.Name + ".ModuleHashPair= ") + ModuleHashPairList =3D [] # tuple list: [tuple(PreMakefileHash, Ma= keHash)] + if os.path.exists(ModuleHashPair): + with open(ModuleHashPair, 'r') as f: + ModuleHashPairList =3D json.load(f) + if not (PreMakeHashStr, MakeHashStr) in set(map(tuple, ModuleHashP= airList)): + ModuleHashPairList.insert(0, (PreMakeHashStr, MakeHashStr)) + with open(ModuleHashPair, 'w') as f: + json.dump(ModuleHashPairList, f, indent=3D2) + + # Copy files to Cache destination dirs if not self.OutputFile: Ma =3D self.BuildDatabase[self.MetaFile, self.Arch, self.Build= Target, self.ToolChain] self.OutputFile =3D Ma.Binaries for File in self.OutputFile: - if os.path.exists(File): - if File.startswith(os.path.abspath(self.FfsOutputDir)+os.s= ep): - self.CacheCopyFile(FfsDir, self.FfsOutputDir, File) + if File.startswith(os.path.abspath(self.FfsOutputDir)+os.sep): + self.CacheCopyFile(CacheFfsDir, self.FfsOutputDir, File) + else: + if self.Name + ".autogen.hash." in File or \ + self.Name + ".autogen.hashchain." in File or \ + self.Name + ".hash." in File or \ + self.Name + ".hashchain." in File or \ + self.Name + ".PreMakeHashFileList." 
in File or \ + self.Name + ".MakeHashFileList." in File: + self.CacheCopyFile(FileDir, self.BuildDir, File) else: - self.CacheCopyFile(FileDir, self.OutputDir, File) - - def SaveHashChainFileToCache(self, gDict): - if not GlobalData.gBinCacheDest: - return False - - self.GenPreMakefileHash(gDict) - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDi= gest: - EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileH= ash for module: %s[%s]" % (self.MetaFile.Path, self.Arch)) - return False - - self.GenMakeHash(gDict) - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain or \ - not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest: - EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChai= n for module: %s[%s]" % (self.MetaFile.Path, self.Arch)) - return False - - # save the hash chain list as cache file - MakeHashStr =3D str(GlobalData.gCacheIR[(self.MetaFile.Path, self.= Arch)].MakeHashHexDigest) - CacheDestDir =3D path.join(GlobalData.gBinCacheDest, self.Platform= Info.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.So= urceDir, self.MetaFile.BaseName) - CacheHashDestDir =3D path.join(CacheDestDir, MakeHashStr) - ModuleHashPair =3D path.join(CacheDestDir, self.Name + ".ModuleHas= hPair") - MakeHashChain =3D path.join(CacheHashDestDir, self.Name + ".MakeHa= shChain") - ModuleFilesChain =3D path.join(CacheHashDestDir, self.Name + ".Mod= uleFilesChain") - - # save the HashChainDict as json file - CreateDirectory (CacheDestDir) - CreateDirectory (CacheHashDestDir) - try: - ModuleHashPairList =3D [] # tuple list: [tuple(PreMakefileHash= , MakeHash)] - if os.path.exists(ModuleHashPair): - with open(ModuleHashPair, 'r') as f: - ModuleHashPairList =3D json.load(f) - PreMakeHash =3D gDict[(self.MetaFile.Path, self.Arch)].PreMake= fileHashHexDigest - MakeHash =3D gDict[(self.MetaFile.Path, self.Arch)].MakeHashHe= xDigest - ModuleHashPairList.append((PreMakeHash, MakeHash)) - ModuleHashPairList =3D list(set(map(tuple, ModuleHashPairList)= )) - with open(ModuleHashPair, 'w') as f: - json.dump(ModuleHashPairList, f, indent=3D2) - except: - EdkLogger.quiet("[cache warning]: fail to save ModuleHashPair = file in cache: %s" % ModuleHashPair) - return False - - try: - with open(MakeHashChain, 'w') as f: - json.dump(gDict[(self.MetaFile.Path, self.Arch)].MakeHashC= hain, f, indent=3D2) - except: - EdkLogger.quiet("[cache warning]: fail to save MakeHashChain f= ile in cache: %s" % MakeHashChain) - return False - - try: - with open(ModuleFilesChain, 'w') as f: - json.dump(gDict[(self.MetaFile.Path, self.Arch)].ModuleFil= esChain, f, indent=3D2) - except: - EdkLogger.quiet("[cache warning]: fail to save ModuleFilesChai= n file in cache: %s" % ModuleFilesChain) - return False - - # save the autogenfile and makefile for debug usage - CacheDebugDir =3D path.join(CacheHashDestDir, "CacheDebug") - CreateDirectory (CacheDebugDir) - CopyFileOnChange(gDict[(self.MetaFile.Path, self.Arch)].MakefilePa= th, CacheDebugDir) - if gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList: - for File in gDict[(self.MetaFile.Path, self.Arch)].AutoGenFile= List: - CopyFileOnChange(str(File), CacheDebugDir) - - return True - + self.CacheCopyFile(CacheFileDir, self.BuildDir, File) ## Create makefile for the module and its dependent libraries # # @param CreateLibraryMakeFile Flag indicating if or not the = makefiles of @@ -1751,10 +1721,6 @@ class 
ModuleAutoGen(AutoGen): # @cached_class_function def CreateMakeFile(self, CreateLibraryMakeFile=3DTrue, GenFfsList =3D = []): - gDict =3D GlobalData.gCacheIR - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone: - return =20 # nest this function inside it's only caller. def CreateTimeStamp(): @@ -1804,20 +1770,8 @@ class ModuleAutoGen(AutoGen): MakefileType =3D Makefile._FileType MakefileName =3D Makefile._FILE_NAME_[MakefileType] MakefilePath =3D os.path.join(self.MakeFileDir, MakefileName) - - MewIR =3D ModuleBuildCacheIR(self.MetaFile.Path, self.Arch) - MewIR.MakefilePath =3D MakefilePath - MewIR.DependencyHeaderFileSet =3D Makefile.DependencyHeaderFileSet - MewIR.CreateMakeFileDone =3D True - with GlobalData.cache_lock: - try: - IR =3D gDict[(self.MetaFile.Path, self.Arch)] - IR.MakefilePath =3D MakefilePath - IR.DependencyHeaderFileSet =3D Makefile.DependencyHeaderFi= leSet - IR.CreateMakeFileDone =3D True - gDict[(self.MetaFile.Path, self.Arch)] =3D IR - except: - gDict[(self.MetaFile.Path, self.Arch)] =3D MewIR + FilePath =3D path.join(self.BuildDir, self.Name + ".makefile") + SaveFileOnChange(FilePath, MakefilePath, False) =20 def CopyBinaryFiles(self): for File in self.Module.Binaries: @@ -1830,10 +1784,6 @@ class ModuleAutoGen(AutoGen): # dependent libraries will be cr= eated # def CreateCodeFile(self, CreateLibraryCodeFile=3DTrue): - gDict =3D GlobalData.gCacheIR - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone: - return =20 if self.IsCodeFileCreated: return @@ -1892,15 +1842,6 @@ class ModuleAutoGen(AutoGen): (" ".join(AutoGenList), " ".join(IgoredAutoGen= List), self.Name, self.Arch)) =20 self.IsCodeFileCreated =3D True - MewIR =3D ModuleBuildCacheIR(self.MetaFile.Path, self.Arch) - MewIR.CreateCodeFileDone =3D True - with GlobalData.cache_lock: - try: - IR =3D gDict[(self.MetaFile.Path, self.Arch)] - IR.CreateCodeFileDone =3D True - gDict[(self.MetaFile.Path, self.Arch)] =3D IR - except: - gDict[(self.MetaFile.Path, self.Arch)] =3D MewIR =20 return AutoGenList =20 @@ -1925,618 +1866,539 @@ class ModuleAutoGen(AutoGen): self._ApplyBuildRule(Lib.Target, TAB_UNKNOWN_FILE) return RetVal =20 - def GenModuleHash(self): - # Initialize a dictionary for each arch type - if self.Arch not in GlobalData.gModuleHash: - GlobalData.gModuleHash[self.Arch] =3D {} + def GenCMakeHash(self): + # GenCMakeHash can only be called in --binary-destination + # Never called in multiprocessing and always directly save result = in main process, + # so no need remote dict to share the gCMakeHashFile result with m= ain process =20 - # Early exit if module or library has been hashed and is in memory - if self.Name in GlobalData.gModuleHash[self.Arch]: - return GlobalData.gModuleHash[self.Arch][self.Name].encode('ut= f-8') + DependencyFileSet =3D set() + # Add AutoGen files + if self.AutoGenFileList: + for File in set(self.AutoGenFileList): + DependencyFileSet.add(File) + + # Add Makefile + abspath =3D path.join(self.BuildDir, self.Name + ".makefile") + try: + with open(LongFilePath(abspath),"r") as fd: + lines =3D fd.readlines() + except Exception as e: + EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % a= bspath, ExtraData=3Dstr(e), RaiseError=3DFalse) + if lines: + DependencyFileSet.update(lines) =20 + # Caculate all above dependency files hash # Initialze hash object + FileList =3D [] m =3D hashlib.md5() - - # Add Platform level hash - 
m.update(GlobalData.gPlatformHash.encode('utf-8')) - - # Add Package level hash - if self.DependentPackageList: - for Pkg in sorted(self.DependentPackageList, key=3Dlambda x: x= .PackageName): - if Pkg.PackageName in GlobalData.gPackageHash: - m.update(GlobalData.gPackageHash[Pkg.PackageName].enco= de('utf-8')) - - # Add Library hash - if self.LibraryAutoGenList: - for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: x.N= ame): - if Lib.Name not in GlobalData.gModuleHash[self.Arch]: - Lib.GenModuleHash() - m.update(GlobalData.gModuleHash[self.Arch][Lib.Name].encod= e('utf-8')) - - # Add Module self - with open(str(self.MetaFile), 'rb') as f: - Content =3D f.read() - m.update(Content) - - # Add Module's source files - if self.SourceFileList: - for File in sorted(self.SourceFileList, key=3Dlambda x: str(x)= ): - f =3D open(str(File), 'rb') + for File in sorted(DependencyFileSet, key=3Dlambda x: str(x)): + if not path.exists(LongFilePath(str(File))): + EdkLogger.quiet("[cache warning]: header file %s is missin= g for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch)) + continue + with open(LongFilePath(str(File)), 'rb') as f: Content =3D f.read() - f.close() - m.update(Content) - - GlobalData.gModuleHash[self.Arch][self.Name] =3D m.hexdigest() - - return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8') + m.update(Content) + FileList.append((str(File), hashlib.md5(Content).hexdigest())) =20 - def GenModuleFilesHash(self, gDict): - # Early exit if module or library has been hashed and is in memory - if (self.MetaFile.Path, self.Arch) in gDict: - if gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain: - return gDict[(self.MetaFile.Path, self.Arch)] + HashChainFile =3D path.join(self.BuildDir, self.Name + ".autogen.h= ashchain." + m.hexdigest()) + GlobalData.gCMakeHashFile[(self.MetaFile.Path, self.Arch)] =3D Has= hChainFile + try: + with open(LongFilePath(HashChainFile), 'w') as f: + json.dump(FileList, f, indent=3D2) + except: + EdkLogger.quiet("[cache warning]: fail to save hashchain file:= %s" % HashChainFile) + return False =20 - # skip if the module cache already crashed - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].CacheCrash: - return + def GenModuleHash(self): + # GenModuleHash only called after autogen phase + # Never called in multiprocessing and always directly save result = in main process, + # so no need remote dict to share the gModuleHashFile result with = main process + # + # GenPreMakefileHashList consume no dict. + # GenPreMakefileHashList produce local gModuleHashFile dict. 
=20 DependencyFileSet =3D set() # Add Module Meta file - DependencyFileSet.add(self.MetaFile) + DependencyFileSet.add(self.MetaFile.Path) =20 # Add Module's source files if self.SourceFileList: for File in set(self.SourceFileList): - DependencyFileSet.add(File) + DependencyFileSet.add(File.Path) =20 # Add modules's include header files - # Search dependency file list for each source file - SourceFileList =3D [] - OutPutFileList =3D [] - for Target in self.IntroTargetList: - SourceFileList.extend(Target.Inputs) - OutPutFileList.extend(Target.Outputs) - if OutPutFileList: - for Item in OutPutFileList: - if Item in SourceFileList: - SourceFileList.remove(Item) - SearchList =3D [] - for file_path in self.IncludePathList + self.BuildOptionIncPathLis= t: - # skip the folders in platform BuildDir which are not been gen= erated yet - if file_path.startswith(os.path.abspath(self.PlatformInfo.Buil= dDir)+os.sep): - continue - SearchList.append(file_path) - FileDependencyDict =3D {} - ForceIncludedFile =3D [] - for F in SourceFileList: - # skip the files which are not been generated yet, because - # the SourceFileList usually contains intermediate build files= , e.g. AutoGen.c - if not os.path.exists(F.Path): - continue - FileDependencyDict[F] =3D GenMake.GetDependencyList(self, self= .FileDependCache, F, ForceIncludedFile, SearchList) + # Directly use the deps.txt file in the module BuildDir + abspath =3D path.join(self.BuildDir, "deps.txt") + rt =3D None + try: + with open(LongFilePath(abspath),"r") as fd: + lines =3D fd.readlines() + if lines: + rt =3D set([item.lstrip().strip("\n") for item in line= s if item.strip("\n").endswith(".h")]) + except Exception as e: + EdkLogger.error("build",FILE_NOT_FOUND, "%s doesn't exist" % a= bspath, ExtraData=3Dstr(e), RaiseError=3DFalse) + + if rt: + DependencyFileSet.update(rt) =20 - if FileDependencyDict: - for Dependency in FileDependencyDict.values(): - DependencyFileSet.update(set(Dependency)) =20 # Caculate all above dependency files hash # Initialze hash object FileList =3D [] m =3D hashlib.md5() + BuildDirStr =3D path.abspath(self.BuildDir).lower() for File in sorted(DependencyFileSet, key=3Dlambda x: str(x)): - if not os.path.exists(str(File)): + # Skip the AutoGen files in BuildDir which already been + # included in .autogen.hash. 
file + if BuildDirStr in path.abspath(File).lower(): + continue + if not path.exists(LongFilePath(File)): EdkLogger.quiet("[cache warning]: header file %s is missin= g for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch)) continue - with open(str(File), 'rb') as f: + with open(LongFilePath(File), 'rb') as f: Content =3D f.read() m.update(Content) - FileList.append((str(File), hashlib.md5(Content).hexdigest())) - + FileList.append((File, hashlib.md5(Content).hexdigest())) =20 - MewIR =3D ModuleBuildCacheIR(self.MetaFile.Path, self.Arch) - MewIR.ModuleFilesHashDigest =3D m.digest() - MewIR.ModuleFilesHashHexDigest =3D m.hexdigest() - MewIR.ModuleFilesChain =3D FileList - with GlobalData.cache_lock: - try: - IR =3D gDict[(self.MetaFile.Path, self.Arch)] - IR.ModuleFilesHashDigest =3D m.digest() - IR.ModuleFilesHashHexDigest =3D m.hexdigest() - IR.ModuleFilesChain =3D FileList - gDict[(self.MetaFile.Path, self.Arch)] =3D IR - except: - gDict[(self.MetaFile.Path, self.Arch)] =3D MewIR - - return gDict[(self.MetaFile.Path, self.Arch)] - - def GenPreMakefileHash(self, gDict): - # Early exit if module or library has been hashed and is in memory - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest: - return gDict[(self.MetaFile.Path, self.Arch)] + HashChainFile =3D path.join(self.BuildDir, self.Name + ".hashchain= ." + m.hexdigest()) + GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)] =3D Ha= shChainFile + try: + with open(LongFilePath(HashChainFile), 'w') as f: + json.dump(FileList, f, indent=3D2) + except: + EdkLogger.quiet("[cache warning]: fail to save hashchain file:= %s" % HashChainFile) + return False =20 - # skip if the module cache already crashed - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].CacheCrash: - return + def GenPreMakefileHashList(self): + # GenPreMakefileHashList consume below dicts: + # gPlatformHashFile + # gPackageHashFile + # gModuleHashFile + # GenPreMakefileHashList produce no dict. 
+ # gModuleHashFile items might be produced in multiprocessing, so + # need check gModuleHashFile remote dict =20 # skip binary module if self.IsBinaryModule: return =20 - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDiges= t: - self.GenModuleFilesHash(gDict) - - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDiges= t: - EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesH= ashDigest for module %s[%s]" %(self.MetaFile.Path, self.Arch)) - return - - # Initialze hash object + FileList =3D [] m =3D hashlib.md5() - # Add Platform level hash - if ('PlatformHash') in gDict: - m.update(gDict[('PlatformHash')].encode('utf-8')) + HashFile =3D GlobalData.gPlatformHashFile + if path.exists(LongFilePath(HashFile)): + FileList.append(HashFile) + m.update(HashFile.encode('utf-8')) else: - EdkLogger.quiet("[cache warning]: PlatformHash is missing") + EdkLogger.quiet("[cache warning]: No Platform HashFile: %s" % = HashFile) =20 # Add Package level hash if self.DependentPackageList: for Pkg in sorted(self.DependentPackageList, key=3Dlambda x: x= .PackageName): - if (Pkg.PackageName, 'PackageHash') in gDict: - m.update(gDict[(Pkg.PackageName, 'PackageHash')].encod= e('utf-8')) + if not (Pkg.PackageName, Pkg.Arch) in GlobalData.gPackageH= ashFile: + EdkLogger.quiet("[cache warning]:No Package %s for mod= ule %s[%s]" % (Pkg.PackageName, self.MetaFile.Path, self.Arch)) + continue + HashFile =3D GlobalData.gPackageHashFile[(Pkg.PackageName,= Pkg.Arch)] + if path.exists(LongFilePath(HashFile)): + FileList.append(HashFile) + m.update(HashFile.encode('utf-8')) else: - EdkLogger.quiet("[cache warning]: %s PackageHash neede= d by %s[%s] is missing" %(Pkg.PackageName, self.MetaFile.Name, self.Arch)) - - # Add Library hash - if self.LibraryAutoGenList: - for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: x.N= ame): - if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \ - not gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHas= hDigest: - Lib.GenPreMakefileHash(gDict) - m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesH= ashDigest) + EdkLogger.quiet("[cache warning]:No Package HashFile: = %s" % HashFile) =20 # Add Module self - m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDig= est) - - with GlobalData.cache_lock: - IR =3D gDict[(self.MetaFile.Path, self.Arch)] - IR.PreMakefileHashHexDigest =3D m.hexdigest() - gDict[(self.MetaFile.Path, self.Arch)] =3D IR - - return gDict[(self.MetaFile.Path, self.Arch)] - - def GenMakeHeaderFilesHash(self, gDict): - # Early exit if module or library has been hashed and is in memory - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest: - return gDict[(self.MetaFile.Path, self.Arch)] - - # skip if the module cache already crashed - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].CacheCrash: - return - - # skip binary module - if self.IsBinaryModule: - return - - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone: - if self.IsLibrary: - if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.M= etaFile.Path) in GlobalData.libConstPcd: - self.ConstPcd =3D GlobalData.libConstPcd[(self.MetaFil= e.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)] - if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.M= etaFile.Path) in GlobalData.Refes: 
- self.ReferenceModules =3D GlobalData.Refes[(self.MetaF= ile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)] - self.CreateCodeFile() - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone: - self.CreateMakeFile(GenFfsList=3DGlobalData.FfsCmd.get((self.M= etaFile.Path, self.Arch),[])) - - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone o= r \ - not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone: - EdkLogger.quiet("[cache warning]: Cannot create CodeFile or Mak= efile for module %s[%s]" %(self.MetaFile.Path, self.Arch)) - return - - DependencyFileSet =3D set() - # Add Makefile - if gDict[(self.MetaFile.Path, self.Arch)].MakefilePath: - DependencyFileSet.add(gDict[(self.MetaFile.Path, self.Arch)].M= akefilePath) + # GenPreMakefileHashList needed in both --binary-destination + # and --hash. And --hash might save ModuleHashFile in remote dict + # during multiprocessing. + if (self.MetaFile.Path, self.Arch) in GlobalData.gModuleHashFile: + HashFile =3D GlobalData.gModuleHashFile[(self.MetaFile.Path, s= elf.Arch)] else: - EdkLogger.quiet("[cache warning]: makefile is missing for modu= le %s[%s]" %(self.MetaFile.Path, self.Arch)) - - # Add header files - if gDict[(self.MetaFile.Path, self.Arch)].DependencyHeaderFileSet: - for File in gDict[(self.MetaFile.Path, self.Arch)].DependencyH= eaderFileSet: - DependencyFileSet.add(File) + EdkLogger.quiet("[cache error]:No ModuleHashFile for module: %= s[%s]" % (self.MetaFile.Path, self.Arch)) + if path.exists(LongFilePath(HashFile)): + FileList.append(HashFile) + m.update(HashFile.encode('utf-8')) else: - EdkLogger.quiet("[cache warning]: No dependency header found f= or module %s[%s]" %(self.MetaFile.Path, self.Arch)) - - # Add AutoGen files - if self.AutoGenFileList: - for File in set(self.AutoGenFileList): - DependencyFileSet.add(File) - - # Caculate all above dependency files hash - # Initialze hash object - FileList =3D [] - m =3D hashlib.md5() - for File in sorted(DependencyFileSet, key=3Dlambda x: str(x)): - if not os.path.exists(str(File)): - EdkLogger.quiet("[cache warning]: header file: %s doesn't = exist for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch)) - continue - f =3D open(str(File), 'rb') - Content =3D f.read() - f.close() - m.update(Content) - FileList.append((str(File), hashlib.md5(Content).hexdigest())) + EdkLogger.quiet("[cache warning]:No Module HashFile: %s" % Has= hFile) =20 - with GlobalData.cache_lock: - IR =3D gDict[(self.MetaFile.Path, self.Arch)] - IR.AutoGenFileList =3D self.AutoGenFileList.keys() - IR.MakeHeaderFilesHashChain =3D FileList - IR.MakeHeaderFilesHashDigest =3D m.digest() - gDict[(self.MetaFile.Path, self.Arch)] =3D IR + # Add Library hash + if self.LibraryAutoGenList: + for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: x.M= etaFile.Path): =20 - return gDict[(self.MetaFile.Path, self.Arch)] + if (Lib.MetaFile.Path, Lib.Arch) in GlobalData.gModuleHash= File: + HashFile =3D GlobalData.gModuleHashFile[(Lib.MetaFile.= Path, Lib.Arch)] + else: + EdkLogger.quiet("[cache error]:No ModuleHashFile for l= ib: %s[%s]" % (Lib.MetaFile.Path, Lib.Arch)) + if path.exists(LongFilePath(HashFile)): + FileList.append(HashFile) + m.update(HashFile.encode('utf-8')) + else: + EdkLogger.quiet("[cache warning]:No Lib HashFile: %s" = % HashFile) =20 - def GenMakeHash(self, gDict): - # Early exit if module or library has been hashed and is in memory - if 
(self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain: - return gDict[(self.MetaFile.Path, self.Arch)] + # Save PreMakeHashFileList + FilePath =3D path.join(self.BuildDir, self.Name + ".PreMakeHashFil= eList." + m.hexdigest()) + try: + with open(LongFilePath(FilePath), 'w') as f: + json.dump(FileList, f, indent=3D0) + except: + EdkLogger.quiet("[cache warning]: fail to save PreMake HashFil= eList: %s" % FilePath) =20 - # skip if the module cache already crashed - if (self.MetaFile.Path, self.Arch) in gDict and \ - gDict[(self.MetaFile.Path, self.Arch)].CacheCrash: - return + def GenMakefileHashList(self): + # GenMakefileHashList only need in --binary-destination which will + # everything in local dict. So don't need check remote dict. =20 # skip binary module if self.IsBinaryModule: return =20 - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDiges= t: - self.GenModuleFilesHash(gDict) - if not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashD= igest: - self.GenMakeHeaderFilesHash(gDict) - - if not (self.MetaFile.Path, self.Arch) in gDict or \ - not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDiges= t or \ - not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain or \ - not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashD= igest or \ - not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashC= hain: - EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHa= sh or MakeHeaderFilesHash for module %s[%s]" %(self.MetaFile.Path, self.Arc= h)) - return - - # Initialze hash object + FileList =3D [] m =3D hashlib.md5() - MakeHashChain =3D [] + # Add AutoGen hash + HashFile =3D GlobalData.gCMakeHashFile[(self.MetaFile.Path, self.A= rch)] + if path.exists(LongFilePath(HashFile)): + FileList.append(HashFile) + m.update(HashFile.encode('utf-8')) + else: + EdkLogger.quiet("[cache warning]:No AutoGen HashFile: %s" % Ha= shFile) =20 - # Add hash of makefile and dependency header files - m.update(gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHas= hDigest) - New =3D list(set(gDict[(self.MetaFile.Path, self.Arch)].MakeHeader= FilesHashChain) - set(MakeHashChain)) - New.sort(key=3Dlambda x: str(x)) - MakeHashChain +=3D New + # Add Module self + if (self.MetaFile.Path, self.Arch) in GlobalData.gModuleHashFile: + HashFile =3D GlobalData.gModuleHashFile[(self.MetaFile.Path, s= elf.Arch)] + else: + EdkLogger.quiet("[cache error]:No ModuleHashFile for module: %= s[%s]" % (self.MetaFile.Path, self.Arch)) + if path.exists(LongFilePath(HashFile)): + FileList.append(HashFile) + m.update(HashFile.encode('utf-8')) + else: + EdkLogger.quiet("[cache warning]:No Module HashFile: %s" % Has= hFile) =20 # Add Library hash if self.LibraryAutoGenList: - for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: x.N= ame): - if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \ - not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashChain: - Lib.GenMakeHash(gDict) - if not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDigest: - print("Cannot generate MakeHash for lib module:", Lib.= MetaFile.Path, Lib.Arch) - continue - m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDige= st) - New =3D list(set(gDict[(Lib.MetaFile.Path, Lib.Arch)].Make= HashChain) - set(MakeHashChain)) - New.sort(key=3Dlambda x: str(x)) - MakeHashChain +=3D New + for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: x.M= etaFile.Path): + if (Lib.MetaFile.Path, Lib.Arch) in 
GlobalData.gModuleHashFile:
+                    HashFile = GlobalData.gModuleHashFile[(Lib.MetaFile.Path, Lib.Arch)]
+                else:
+                    EdkLogger.quiet("[cache error]: No ModuleHashFile for lib: %s[%s]" % (Lib.MetaFile.Path, Lib.Arch))
+                if path.exists(LongFilePath(HashFile)):
+                    FileList.append(HashFile)
+                    m.update(HashFile.encode('utf-8'))
+                else:
+                    EdkLogger.quiet("[cache warning]: No Lib HashFile: %s" % HashFile)

-        # Add Module self
-        m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest)
-        New = list(set(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain) - set(MakeHashChain))
-        New.sort(key=lambda x: str(x))
-        MakeHashChain += New
+        # Save MakeHashFileList
+        FilePath = path.join(self.BuildDir, self.Name + ".MakeHashFileList." + m.hexdigest())
+        try:
+            with open(LongFilePath(FilePath), 'w') as f:
+                json.dump(FileList, f, indent=0)
+        except:
+            EdkLogger.quiet("[cache warning]: fail to save MakeHashFileList: %s" % FilePath)
+
+    def CheckHashChainFile(self, HashChainFile):
+        # Assume the HashChainFile basename format is 'x.hashchain.<md5hex>',
+        # where x is the module name and <md5hex> is the 32-character md5
+        # hexdigest of all the hash chain file content
+        HashStr = HashChainFile.split('.')[-1]
+        if len(HashStr) != 32:
+            EdkLogger.quiet("[cache error]: wrong format HashChainFile: %s" % (HashChainFile))
+            return False

-        with GlobalData.cache_lock:
-            IR = gDict[(self.MetaFile.Path, self.Arch)]
-            IR.MakeHashDigest = m.digest()
-            IR.MakeHashHexDigest = m.hexdigest()
-            IR.MakeHashChain = MakeHashChain
-            gDict[(self.MetaFile.Path, self.Arch)] = IR
+        try:
+            with open(LongFilePath(HashChainFile), 'r') as f:
+                HashChainList = json.load(f)
+        except:
+            EdkLogger.quiet("[cache error]: fail to load HashChainFile: %s" % HashChainFile)
+            return False

-        return gDict[(self.MetaFile.Path, self.Arch)]
+        # Print the info of the first file that differs
+        # print(HashChainFile)
+        for idx, (SrcFile, SrcHash) in enumerate(HashChainList):
+            if SrcFile in GlobalData.gFileHashDict:
+                DestHash = GlobalData.gFileHashDict[SrcFile]
+            else:
+                try:
+                    with open(LongFilePath(SrcFile), 'rb') as f:
+                        Content = f.read()
+                        DestHash = hashlib.md5(Content).hexdigest()
+                        GlobalData.gFileHashDict[SrcFile] = DestHash
+                except IOError as X:
+                    # Cache miss if SrcFile was removed in the new version of the code
+                    GlobalData.gFileHashDict[SrcFile] = 0
+                    EdkLogger.quiet("[cache insight]: first cache miss file in %s is %s" % (HashChainFile, SrcFile))
+                    return False
+            if SrcHash != DestHash:
+                EdkLogger.quiet("[cache insight]: first cache miss file in %s is %s" % (HashChainFile, SrcFile))
+                return False
+
+        return True

     ## Decide whether we can skip the left autogen and make process
-    def CanSkipbyPreMakefileCache(self, gDict):
+    def CanSkipbyMakeCache(self):
+        # For --binary-source only
+        # CanSkipbyMakeCache consumes the dicts below:
+        #     gModuleMakeCacheStatus
+        #     gHashChainStatus
+        # GenPreMakefileHashList produces the gModuleMakeCacheStatus and gModuleHashFile dicts.
+        # All these dicts might be produced in multiprocessing, so we
+        # need to check these remote dicts.
+
         if not GlobalData.gBinCacheSource:
             return False

-        if gDict[(self.MetaFile.Path, self.Arch)].PreMakeCacheHit:
-            return True
-
-        if gDict[(self.MetaFile.Path, self.Arch)].CacheCrash:
-            return False
+        if (self.MetaFile.Path, self.Arch) in GlobalData.gModuleMakeCacheStatus:
+            return GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)]

-        # If Module is binary, do not skip by cache
+        # If the Module is binary, which has a special build rule, do not skip it by cache.
         if self.IsBinaryModule:
+            print("[cache miss]: MakeCache: Skip BinaryModule:", self.MetaFile.Path, self.Arch)
+            GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
             return False

-        # .inc contains binary information so do not skip by hash as well
+        # Treat .inc as a binary file, do not skip by hash
         for f_ext in self.SourceFileList:
             if '.inc' in str(f_ext):
+                print("[cache miss]: MakeCache: Skip '.inc' File:", self.MetaFile.Path, self.Arch)
+                GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
                 return False

-        # Get the module hash values from the stored cache and current build,
-        # then check whether the cache hit based on the hash values;
-        # if cache hit, restore all the files from the cache
-        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        ModuleCacheDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
         FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)

         ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
-        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
-        if not os.path.exists(ModuleHashPair):
-            EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair file: %s" % ModuleHashPair)
-            with GlobalData.cache_lock:
-                IR = gDict[(self.MetaFile.Path, self.Arch)]
-                IR.CacheCrash = True
-                gDict[(self.MetaFile.Path, self.Arch)] = IR
-            return False
-
+        ModuleHashPair = path.join(ModuleCacheDir, self.Name + ".ModuleHashPair")
         try:
-            with open(ModuleHashPair, 'r') as f:
+            with open(LongFilePath(ModuleHashPair), 'r') as f:
                 ModuleHashPairList = json.load(f)
         except:
+            # ModuleHashPair might not exist for a newly added module
+            GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
             EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
+            print("[cache miss]: MakeCache:", self.MetaFile.Path, self.Arch)
             return False

-        self.GenPreMakefileHash(gDict)
-        if not (self.MetaFile.Path, self.Arch) in gDict or \
-           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
-            EdkLogger.quiet("[cache warning]: PreMakefileHashHexDigest is missing for module %s[%s]" % (self.MetaFile.Path, self.Arch))
-            return False
-
-        MakeHashStr = None
-        CurrentPreMakeHash = gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest
+        # Check the PreMakeHash in ModuleHashPairList one by one
         for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
-            if PreMakefileHash == CurrentPreMakeHash:
-                MakeHashStr = str(MakeHash)
+            SourceHashDir = path.join(ModuleCacheDir, MakeHash)
+            SourceFfsHashDir = path.join(FfsDir, MakeHash)
+            PreMakeHashFileList_FilePath = path.join(ModuleCacheDir, self.Name + ".PreMakeHashFileList." + PreMakefileHash)
+            MakeHashFileList_FilePath = path.join(ModuleCacheDir, self.Name + ".MakeHashFileList." + MakeHash)

-        if not MakeHashStr:
-            return False
+            try:
+                with open(LongFilePath(MakeHashFileList_FilePath), 'r') as f:
+                    MakeHashFileList = json.load(f)
+            except:
+                EdkLogger.quiet("[cache error]: fail to load MakeHashFileList file: %s" % MakeHashFileList_FilePath)
+                continue

-        TargetHashDir = path.join(FileDir, MakeHashStr)
-        TargetFfsHashDir = path.join(FfsDir, MakeHashStr)
+            HashMiss = False
+            for HashChainFile in MakeHashFileList:
+                HashChainStatus = None
+                if HashChainFile in GlobalData.gHashChainStatus:
+                    HashChainStatus = GlobalData.gHashChainStatus[HashChainFile]
+                if HashChainStatus == False:
+                    HashMiss = True
+                    break
+                elif HashChainStatus == True:
+                    continue
+                # Convert to a path that starts with the cache source dir
+                RelativePath = os.path.relpath(HashChainFile, self.WorkspaceDir)
+                NewFilePath = os.path.join(GlobalData.gBinCacheSource, RelativePath)
+                if self.CheckHashChainFile(NewFilePath):
+                    GlobalData.gHashChainStatus[HashChainFile] = True
+                    # Save the module's own HashFile for later GenPreMakefileHashList usage
+                    if self.Name + ".hashchain." in HashChainFile:
+                        GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)] = HashChainFile
+                else:
+                    GlobalData.gHashChainStatus[HashChainFile] = False
+                    HashMiss = True
+                    break

-        if not os.path.exists(TargetHashDir):
-            EdkLogger.quiet("[cache warning]: Cache folder is missing: %s" % TargetHashDir)
-            return False
+            if HashMiss:
+                continue

-        for root, dir, files in os.walk(TargetHashDir):
-            for f in files:
-                File = path.join(root, f)
-                self.CacheCopyFile(self.OutputDir, TargetHashDir, File)
-        if os.path.exists(TargetFfsHashDir):
-            for root, dir, files in os.walk(TargetFfsHashDir):
+            # Make cache hit, restore the module build result
+            for root, dir, files in os.walk(SourceHashDir):
                 for f in files:
                     File = path.join(root, f)
-                    self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir, File)
-
-        if self.Name == "PcdPeim" or self.Name == "PcdDxe":
-            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
+                    self.CacheCopyFile(self.BuildDir, SourceHashDir, File)
+            if os.path.exists(SourceFfsHashDir):
+                for root, dir, files in os.walk(SourceFfsHashDir):
+                    for f in files:
+                        File = path.join(root, f)
+                        self.CacheCopyFile(self.FfsOutputDir, SourceFfsHashDir, File)
+
+            if self.Name == "PcdPeim" or self.Name == "PcdDxe":
+                CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
+
+            print("[cache hit]: MakeCache:", self.MetaFile.Path, self.Arch)
+            GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = True
+            return True

-        with GlobalData.cache_lock:
-            IR = gDict[(self.MetaFile.Path, self.Arch)]
-            IR.PreMakeCacheHit = True
-            gDict[(self.MetaFile.Path, self.Arch)] = IR
-        print("[cache hit]: checkpoint_PreMakefile:", self.MetaFile.Path, self.Arch)
-        #EdkLogger.quiet("cache hit: %s[%s]" % (self.MetaFile.Path, self.Arch))
-        return True
+        print("[cache miss]: MakeCache:", self.MetaFile.Path, self.Arch)
+        GlobalData.gModuleMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+        return False

-    ## Decide whether we can skip the make process
-    def CanSkipbyMakeCache(self, gDict):
-        if not GlobalData.gBinCacheSource:
+    ## Decide whether we can skip the left autogen and make process
+    def CanSkipbyPreMakeCache(self):
+        # CanSkipbyPreMakeCache consumes the dicts below:
+        #     gModulePreMakeCacheStatus
+        #     gHashChainStatus
+        #     gModuleHashFile
+        # GenPreMakefileHashList produces the gModulePreMakeCacheStatus dict.
+        # All these dicts might be produced in multiprocessing, so we
+        # need to check these remote dicts.
+
+        if not GlobalData.gUseHashCache or GlobalData.gBinCacheDest:
             return False

-        if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit:
-            return True
-
-        if gDict[(self.MetaFile.Path, self.Arch)].CacheCrash:
-            return False
+        if (self.MetaFile.Path, self.Arch) in GlobalData.gModulePreMakeCacheStatus:
+            return GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)]

-        # If Module is binary, do not skip by cache
+        # If the Module is binary, which has a special build rule, do not skip it by cache.
         if self.IsBinaryModule:
-            print("[cache miss]: checkpoint_Makefile: binary module:", self.MetaFile.Path, self.Arch)
+            print("[cache miss]: PreMakeCache: Skip BinaryModule:", self.MetaFile.Path, self.Arch)
+            GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
             return False

-        # .inc contains binary information so do not skip by hash as well
+        # Treat .inc as a binary file, do not skip by hash
         for f_ext in self.SourceFileList:
             if '.inc' in str(f_ext):
-                with GlobalData.cache_lock:
-                    IR = gDict[(self.MetaFile.Path, self.Arch)]
-                    IR.MakeCacheHit = False
-                    gDict[(self.MetaFile.Path, self.Arch)] = IR
-                print("[cache miss]: checkpoint_Makefile: .inc module:", self.MetaFile.Path, self.Arch)
+                print("[cache miss]: PreMakeCache: Skip '.inc' File:", self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
                 return False

-        # Get the module hash values from the stored cache and current build,
-        # then check whether the cache hit based on the hash values;
-        # if cache hit, restore all the files from the cache
-        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
-        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)
-
-        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
-        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
-        if not os.path.exists(ModuleHashPair):
-            EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair file: %s" % ModuleHashPair)
-            with GlobalData.cache_lock:
-                IR = gDict[(self.MetaFile.Path, self.Arch)]
-                IR.CacheCrash = True
-                gDict[(self.MetaFile.Path, self.Arch)] = IR
-            return False
-
-        try:
-            with open(ModuleHashPair, 'r') as f:
-                ModuleHashPairList = json.load(f)
-        except:
-            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
-            return False
-
-        self.GenMakeHash(gDict)
-        if not (self.MetaFile.Path, self.Arch) in gDict or \
-           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
-            EdkLogger.quiet("[cache warning]: MakeHashHexDigest is missing for module %s[%s]" % (self.MetaFile.Path, self.Arch))
-            return False
-
-        MakeHashStr = None
-        CurrentMakeHash = gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest
-        for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
-            if MakeHash == CurrentMakeHash:
-                MakeHashStr = str(MakeHash)
-
-        if not MakeHashStr:
-            print("[cache miss]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
-            return False
-
-        TargetHashDir = path.join(FileDir, MakeHashStr)
-        TargetFfsHashDir = path.join(FfsDir, MakeHashStr)
-        if not os.path.exists(TargetHashDir):
-            EdkLogger.quiet("[cache warning]: Cache folder is missing: %s" % TargetHashDir)
-            return False
-
-        for root, dir, files in os.walk(TargetHashDir):
-            for f in files:
-                File = path.join(root, f)
-                self.CacheCopyFile(self.OutputDir, TargetHashDir, File)
-
-        if os.path.exists(TargetFfsHashDir):
-            for root, dir, files in os.walk(TargetFfsHashDir):
-                for f in files:
-                    File = path.join(root, f)
-                    self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir, File)
-
-        if self.Name == "PcdPeim" or self.Name == "PcdDxe":
-            CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
-        with GlobalData.cache_lock:
-            IR = gDict[(self.MetaFile.Path, self.Arch)]
-            IR.MakeCacheHit = True
-            gDict[(self.MetaFile.Path, self.Arch)] = IR
-        print("[cache hit]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
-        return True
-
-    ## Show the first file name which causes cache miss
-    def PrintFirstMakeCacheMissFile(self, gDict):
+        # For --hash only in the incremental build
         if not GlobalData.gBinCacheSource:
-            return
-
-        # skip if the module cache already crashed
-        if gDict[(self.MetaFile.Path, self.Arch)].CacheCrash:
-            return
-
-        # skip binary module
-        if self.IsBinaryModule:
-            return
+            Files = [path.join(self.BuildDir, f) for f in os.listdir(self.BuildDir) if path.isfile(path.join(self.BuildDir, f))]
+            PreMakeHashFileList_FilePath = None
+            MakeTimeStamp = 0
+            # Find the latest PreMakeHashFileList file in the self.BuildDir folder
+            for File in Files:
+                if ".PreMakeHashFileList." in File:
+                    FileTimeStamp = os.stat(path.join(self.BuildDir, File))[8]
+                    if FileTimeStamp > MakeTimeStamp:
+                        MakeTimeStamp = FileTimeStamp
+                        PreMakeHashFileList_FilePath = File
+            if not PreMakeHashFileList_FilePath:
+                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                return False

-        if not (self.MetaFile.Path, self.Arch) in gDict:
-            return
+            try:
+                with open(LongFilePath(PreMakeHashFileList_FilePath), 'r') as f:
+                    PreMakeHashFileList = json.load(f)
+            except:
+                EdkLogger.quiet("[cache error]: fail to load PreMakeHashFileList file: %s" % PreMakeHashFileList_FilePath)
+                print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                return False

-        # Only print cache miss file for the MakeCache not hit module
-        if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit:
-            return
+            HashMiss = False
+            for HashChainFile in PreMakeHashFileList:
+                HashChainStatus = None
+                if HashChainFile in GlobalData.gHashChainStatus:
+                    HashChainStatus = GlobalData.gHashChainStatus[HashChainFile]
+                if HashChainStatus == False:
+                    HashMiss = True
+                    break
+                elif HashChainStatus == True:
+                    continue
+                if self.CheckHashChainFile(HashChainFile):
+                    GlobalData.gHashChainStatus[HashChainFile] = True
+                    # Save the module's own HashFile for later GenPreMakefileHashList usage
+                    if self.Name + ".hashchain." in HashChainFile:
+                        GlobalData.gModuleHashFile[(self.MetaFile.Path, self.Arch)] = HashChainFile
+                else:
+                    GlobalData.gHashChainStatus[HashChainFile] = False
+                    HashMiss = True
+                    break

-        if not gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain:
-            EdkLogger.quiet("[cache insight]: MakeHashChain is missing for: %s[%s]" % (self.MetaFile.Path, self.Arch))
-            return
+            if HashMiss:
+                print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+                return False
+            else:
+                print("[cache hit]: PreMakeCache:", self.MetaFile.Path, self.Arch)
+                GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = True
+                return True

-        # Find the cache dir name through the .ModuleHashPair file info
-        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        ModuleCacheDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        FfsDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name)

         ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
-        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
-        if not os.path.exists(ModuleHashPair):
-            EdkLogger.quiet("[cache insight]: Cannot find ModuleHashPair file for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
-            return
-
+        ModuleHashPair = path.join(ModuleCacheDir, self.Name + ".ModuleHashPair")
         try:
-            with open(ModuleHashPair, 'r') as f:
+            with open(LongFilePath(ModuleHashPair), 'r') as f:
                 ModuleHashPairList = json.load(f)
         except:
-            EdkLogger.quiet("[cache insight]: Cannot load ModuleHashPair file for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
-            return
+            # ModuleHashPair might not exist for a newly added module
+            GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+            EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair file: %s" % ModuleHashPair)
+            print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
+            return False

-        MakeHashSet = set()
+        # Check the PreMakeHash in ModuleHashPairList one by one
         for idx, (PreMakefileHash, MakeHash) in enumerate(ModuleHashPairList):
-            TargetHashDir = path.join(FileDir, str(MakeHash))
-            if os.path.exists(TargetHashDir):
-                MakeHashSet.add(MakeHash)
-        if not MakeHashSet:
-            EdkLogger.quiet("[cache insight]: Cannot find valid cache dir for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
-            return
+            SourceHashDir = path.join(ModuleCacheDir, MakeHash)
+            SourceFfsHashDir = path.join(FfsDir, MakeHash)
+            PreMakeHashFileList_FilePath = path.join(ModuleCacheDir, self.Name + ".PreMakeHashFileList." + PreMakefileHash)
+            MakeHashFileList_FilePath = path.join(ModuleCacheDir, self.Name + ".MakeHashFileList." + MakeHash)

-        TargetHash = list(MakeHashSet)[0]
-        TargetHashDir = path.join(FileDir, str(TargetHash))
-        if len(MakeHashSet) > 1:
-            EdkLogger.quiet("[cache insight]: found multiple cache dirs for this module, randomly select dir '%s' to search the first cache miss file: %s[%s]" % (TargetHash, self.MetaFile.Path, self.Arch))
-
-        ListFile = path.join(TargetHashDir, self.Name + '.MakeHashChain')
-        if os.path.exists(ListFile):
             try:
-                f = open(ListFile, 'r')
-                CachedList = json.load(f)
-                f.close()
+                with open(LongFilePath(PreMakeHashFileList_FilePath), 'r') as f:
+                    PreMakeHashFileList = json.load(f)
             except:
-                EdkLogger.quiet("[cache insight]: Cannot load MakeHashChain file: %s" % ListFile)
-                return
-        else:
-            EdkLogger.quiet("[cache insight]: Cannot find MakeHashChain file: %s" % ListFile)
-            return
-
-        CurrentList = gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain
-        for idx, (file, hash) in enumerate(CurrentList):
-            (filecached, hashcached) = CachedList[idx]
-            if file != filecached:
-                EdkLogger.quiet("[cache insight]: first different file in %s[%s] is %s, the cached one is %s" % (self.MetaFile.Path, self.Arch, file, filecached))
-                break
-            if hash != hashcached:
-                EdkLogger.quiet("[cache insight]: first cache miss file in %s[%s] is %s" % (self.MetaFile.Path, self.Arch, file))
-                break
-
-        return True
+                EdkLogger.quiet("[cache error]: fail to load PreMakeHashFileList file: %s" % PreMakeHashFileList_FilePath)
+                continue

-    ## Decide whether we can skip the ModuleAutoGen process
-    def CanSkipbyCache(self, gDict):
-        # Hashing feature is off
-        if not GlobalData.gBinCacheSource:
-            return False
+            HashMiss = False
+            for HashChainFile in PreMakeHashFileList:
+                HashChainStatus = None
+                if HashChainFile in GlobalData.gHashChainStatus:
+                    HashChainStatus = GlobalData.gHashChainStatus[HashChainFile]
+                if HashChainStatus == False:
+                    HashMiss = True
+                    break
+                elif HashChainStatus == True:
+                    continue
+                # Convert to a path that starts with the cache source dir
+                RelativePath = os.path.relpath(HashChainFile, self.WorkspaceDir)
+                NewFilePath = os.path.join(GlobalData.gBinCacheSource, RelativePath)
+                if self.CheckHashChainFile(NewFilePath):
+                    GlobalData.gHashChainStatus[HashChainFile] = True
+                else:
+                    GlobalData.gHashChainStatus[HashChainFile] = False
+                    HashMiss = True
+                    break

-        if self in GlobalData.gBuildHashSkipTracking:
-            return GlobalData.gBuildHashSkipTracking[self]
+            if HashMiss:
+                continue

-        # If library or Module is binary do not skip by hash
-        if self.IsBinaryModule:
-            GlobalData.gBuildHashSkipTracking[self] = False
-            return False
+            # PreMakefile cache hit, restore the module build result
+            for root, dir, files in os.walk(SourceHashDir):
+                for f in files:
+                    File = path.join(root, f)
+                    self.CacheCopyFile(self.BuildDir, SourceHashDir, File)
+            if os.path.exists(SourceFfsHashDir):
+                for root, dir, files in os.walk(SourceFfsHashDir):
+                    for f in files:
+                        File = path.join(root, f)
+                        self.CacheCopyFile(self.FfsOutputDir, SourceFfsHashDir, File)
+
+            if self.Name == "PcdPeim" or self.Name == "PcdDxe":
+                CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
+
+            print("[cache hit]: PreMakeCache:", self.MetaFile.Path, self.Arch)
+            GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = True
+            return True

-        # .inc contains binary information so do not skip by hash as well
-        for f_ext in self.SourceFileList:
-            if '.inc' in str(f_ext):
-                GlobalData.gBuildHashSkipTracking[self] = False
-                return False
+        print("[cache miss]: PreMakeCache:", self.MetaFile.Path, self.Arch)
+        GlobalData.gModulePreMakeCacheStatus[(self.MetaFile.Path, self.Arch)] = False
+        return False

-        if not (self.MetaFile.Path, self.Arch) in gDict:
+    ## Decide whether we can skip the Module build
+    def CanSkipbyCache(self, gHitSet):
+        # Hashing feature is off
+        if not GlobalData.gBinCacheSource:
             return False

-        if gDict[(self.MetaFile.Path, self.Arch)].PreMakeCacheHit:
-            GlobalData.gBuildHashSkipTracking[self] = True
-            return True
-
-        if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit:
-            GlobalData.gBuildHashSkipTracking[self] = True
+        if self in gHitSet:
             return True

         return False
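For readers following the series: a hashchain file written by the code above is plain JSON, a list of (source file path, md5 hexdigest) pairs, and the 32-hex-character suffix of its file name is the md5 of the combined content. A minimal standalone sketch of the check that CheckHashChainFile performs (not part of the patch; function and variable names are illustrative):

    import json
    import hashlib

    def verify_hashchain(hashchain_path, file_hash_memo):
        # The file name must end in a 32-char md5 hexdigest, e.g. Foo.hashchain.0123abcd...
        if len(hashchain_path.split('.')[-1]) != 32:
            return False
        with open(hashchain_path, 'r') as f:
            chain = json.load(f)              # [[source file path, md5 hexdigest], ...]
        for src_file, recorded_hash in chain:
            if src_file not in file_hash_memo:
                try:
                    with open(src_file, 'rb') as f:
                        file_hash_memo[src_file] = hashlib.md5(f.read()).hexdigest()
                except IOError:
                    return False              # source file removed -> cache miss
            if file_hash_memo[src_file] != recorded_hash:
                return False                  # content changed -> cache miss
        return True

The memo dict plays the role of GlobalData.gFileHashDict above: each source file is hashed at most once per process, no matter how many modules reference it.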
diff --git a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
index 9d8040905e..0e1404102b 100644
--- a/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/WorkspaceAutoGen.py
@@ -23,6 +23,7 @@ from Common.StringUtils import NormPath
 from Common.BuildToolError import *
 from Common.DataType import *
 from Common.Misc import *
+import json

 ## Regular expression for splitting Dependency Expression string into tokens
 gDepexTokenPattern = re.compile("(\(|\)|\w+| \S+\.inf)")
@@ -127,7 +128,7 @@ class WorkspaceAutoGen(AutoGen):

         self.CreateBuildOptionsFile()
         self.CreatePcdTokenNumberFile()
-        self.CreateModuleHashInfo()
+        self.GeneratePlatformLevelHash()

     #
     # Merge Arch
@@ -527,12 +528,12 @@ class WorkspaceAutoGen(AutoGen):
                     PcdTokenNumber += str(Pa.PcdTokenNumber[Pcd.TokenCName, Pcd.TokenSpaceGuidCName])
         SaveFileOnChange(os.path.join(self.BuildDir, 'PcdTokenNumber'), PcdTokenNumber, False)

-    def CreateModuleHashInfo(self):
+    def GeneratePlatformLevelHash(self):
         #
         # Get set of workspace metafiles
         #
         AllWorkSpaceMetaFiles = self._GetMetaFiles(self.BuildTarget, self.ToolChain)
-
+        AllWorkSpaceMetaFileList = sorted(AllWorkSpaceMetaFiles, key=lambda x: str(x))
         #
         # Retrieve latest modified time of all metafiles
         #
@@ -543,16 +544,35 @@ class WorkspaceAutoGen(AutoGen):
         self._SrcTimeStamp = SrcTimeStamp

         if GlobalData.gUseHashCache:
+            FileList = []
             m = hashlib.md5()
-            for files in AllWorkSpaceMetaFiles:
-                if files.endswith('.dec'):
+            for file in AllWorkSpaceMetaFileList:
+                if file.endswith('.dec'):
                     continue
-                f = open(files, 'rb')
+                f = open(file, 'rb')
                 Content = f.read()
                 f.close()
                 m.update(Content)
-            SaveFileOnChange(os.path.join(self.BuildDir, 'AutoGen.hash'), m.hexdigest(), False)
-            GlobalData.gPlatformHash = m.hexdigest()
+                FileList.append((str(file), hashlib.md5(Content).hexdigest()))
+
+            HashDir = path.join(self.BuildDir, "Hash_Platform")
+            HashFile = path.join(HashDir, 'Platform.hash.' + m.hexdigest())
+            SaveFileOnChange(HashFile, m.hexdigest(), False)
+            HashChainFile = path.join(HashDir, 'Platform.hashchain.' + m.hexdigest())
+            GlobalData.gPlatformHashFile = HashChainFile
+            try:
+                with open(HashChainFile, 'w') as f:
+                    json.dump(FileList, f, indent=2)
+            except:
+                EdkLogger.quiet("[cache warning]: fail to save hashchain file: %s" % HashChainFile)
+
+            if GlobalData.gBinCacheDest:
+                # Copy the platform hash files to the cache destination
+                FileDir = path.join(GlobalData.gBinCacheDest, self.OutputDir, self.BuildTarget + "_" + self.ToolChain, "Hash_Platform")
+                CacheFileDir = FileDir
+                CreateDirectory(CacheFileDir)
+                CopyFileOnChange(HashFile, CacheFileDir)
+                CopyFileOnChange(HashChainFile, CacheFileDir)

         #
         # Write metafile list to build directory
@@ -563,7 +583,7 @@ class WorkspaceAutoGen(AutoGen):
         if not os.path.exists(self.BuildDir):
             os.makedirs(self.BuildDir)
         with open(os.path.join(self.BuildDir, 'AutoGen'), 'w+') as file:
-            for f in AllWorkSpaceMetaFiles:
+            for f in AllWorkSpaceMetaFileList:
                 print(f, file=file)
         return True

@@ -571,15 +591,16 @@ class WorkspaceAutoGen(AutoGen):
         if Pkg.PackageName in GlobalData.gPackageHash:
             return

-        PkgDir = os.path.join(self.BuildDir, Pkg.Arch, Pkg.PackageName)
+        PkgDir = os.path.join(self.BuildDir, Pkg.Arch, "Hash_Pkg", Pkg.PackageName)
         CreateDirectory(PkgDir)
-        HashFile = os.path.join(PkgDir, Pkg.PackageName + '.hash')
+        FileList = []
         m = hashlib.md5()
         # Get the .dec file's hash value
         f = open(Pkg.MetaFile.Path, 'rb')
         Content = f.read()
         f.close()
         m.update(Content)
+        FileList.append((str(Pkg.MetaFile.Path), hashlib.md5(Content).hexdigest()))
         # Get include files hash value
         if Pkg.Includes:
             for inc in sorted(Pkg.Includes, key=lambda x: str(x)):
@@ -590,9 +611,28 @@ class WorkspaceAutoGen(AutoGen):
                         Content = f.read()
                         f.close()
                         m.update(Content)
-        SaveFileOnChange(HashFile, m.hexdigest(), False)
+                        FileList.append((str(File_Path), hashlib.md5(Content).hexdigest()))
         GlobalData.gPackageHash[Pkg.PackageName] = m.hexdigest()

+        HashDir = PkgDir
+        HashFile = path.join(HashDir, Pkg.PackageName + '.hash.' + m.hexdigest())
+        SaveFileOnChange(HashFile, m.hexdigest(), False)
+        HashChainFile = path.join(HashDir, Pkg.PackageName + '.hashchain.' + m.hexdigest())
+        GlobalData.gPackageHashFile[(Pkg.PackageName, Pkg.Arch)] = HashChainFile
+        try:
+            with open(HashChainFile, 'w') as f:
+                json.dump(FileList, f, indent=2)
+        except:
+            EdkLogger.quiet("[cache warning]: fail to save hashchain file: %s" % HashChainFile)
+
+        if GlobalData.gBinCacheDest:
+            # Copy the Pkg hash files to the cache destination dir
+            FileDir = path.join(GlobalData.gBinCacheDest, self.OutputDir, self.BuildTarget + "_" + self.ToolChain, Pkg.Arch, "Hash_Pkg", Pkg.PackageName)
+            CacheFileDir = FileDir
+            CreateDirectory(CacheFileDir)
+            CopyFileOnChange(HashFile, CacheFileDir)
+            CopyFileOnChange(HashChainFile, CacheFileDir)
+
     def _GetMetaFiles(self, Target, Toolchain):
         AllWorkSpaceMetaFiles = set()
         #
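The platform- and package-level hash files above follow one pattern: an aggregate md5 over all the metafiles, plus a per-file (path, md5) list dumped as JSON next to it. A rough standalone sketch of that pattern, with placeholder names (not the patch's own API):

    import json
    import hashlib
    import os

    def gen_hashchain(metafiles, out_dir, prefix='Platform'):
        # Aggregate digest over all files, plus a per-file (path, md5) list.
        file_list = []
        m = hashlib.md5()
        for file_path in sorted(metafiles, key=str):
            with open(file_path, 'rb') as f:
                content = f.read()
            m.update(content)
            file_list.append((str(file_path), hashlib.md5(content).hexdigest()))
        # e.g. 'Platform.hashchain.<aggregate md5>' records the per-file hashes
        chain_file = os.path.join(out_dir, '%s.hashchain.%s' % (prefix, m.hexdigest()))
        with open(chain_file, 'w') as f:
            json.dump(file_list, f, indent=2)
        return chain_file

The aggregate digest names the cache entry; the per-file list is what later lets a consumer report exactly which file broke the match.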
diff --git a/BaseTools/Source/Python/Common/GlobalData.py b/BaseTools/Source/Python/Common/GlobalData.py
index 74c6d0079b..0b3ebe035d 100755
--- a/BaseTools/Source/Python/Common/GlobalData.py
+++ b/BaseTools/Source/Python/Common/GlobalData.py
@@ -104,29 +104,20 @@ gUseHashCache = None
 gBinCacheDest = None
 gBinCacheSource = None
 gPlatformHash = None
-gPackageHash = {}
-gModuleHash = {}
+gPlatformHashFile = None
+gPackageHash = None
+gPackageHashFile = None
+gModuleHashFile = None
+gCMakeHashFile = None
+gHashChainStatus = None
+gModulePreMakeCacheStatus = None
+gModuleMakeCacheStatus = None
+gFileHashDict = None
+gModuleAllCacheStatus = None
+gModuleCacheHit = None
+
 gEnableGenfdsMultiThread = True
 gSikpAutoGenCache = set()
-
-# Dictionary for tracking Module build status as success or failure
-# Top Dict:     Key: Arch Type      Value: Dictionary
-# Second Dict:  Key: AutoGen Obj    Value: 'SUCCESS'\'FAIL'\'FAIL_METAFILE'
-gModuleBuildTracking = dict()
-
-# Dictionary of booleans that dictate whether a module or
-# library can be skipped
-# Top Dict:     Key: Arch Type      Value: Dictionary
-# Second Dict:  Key: Module\Library Name    Value: True\False
-gBuildHashSkipTracking = dict()
-
-# Common dictionary to share module cache intermediate result and state
-gCacheIR = None
-# Common lock for the module cache intermediate data
-cache_lock = None
 # Common lock for the file access in multiple process AutoGens
 file_lock = None
-# Common dictionary to share platform libraries' constant Pcd
-libConstPcd = None
-# Common dictionary to share platform libraries' reference info
-Refes = None
+
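Note that the new GlobalData fields are plain per-process dictionaries, replacing the previous Manager().dict() proxy guarded by cache_lock: each AutoGen process fills its own copies, and only the final hit/miss results travel back to the main process. The memoization idiom behind the status dicts is simply the following (an illustrative sketch, not code from the patch):

    # Illustrative only: the (MetaFile.Path, Arch) keyed status dict idiom.
    gModuleMakeCacheStatus = dict()

    def cached_status(module_path, arch, compute_status):
        # compute_status() returns True for a cache hit, False for a miss;
        # the result is recorded so repeated queries are O(1) in this process.
        key = (module_path, arch)
        if key not in gModuleMakeCacheStatus:
            gModuleMakeCacheStatus[key] = compute_status()
        return gModuleMakeCacheStatus[key]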
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index c48aaf2646..7159aab500 100755
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -620,7 +620,7 @@ class BuildTask:
     #
     def AddDependency(self, Dependency):
         for Dep in Dependency:
-            if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyCache(GlobalData.gCacheIR):
+            if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.CanSkipbyCache(GlobalData.gModuleCacheHit):
                 self.DependencyList.append(BuildTask.New(Dep))    # BuildTask list

     ## The thread wrapper of LaunchCommand function
@@ -633,10 +633,13 @@ class BuildTask:
             self.BuildItem.BuildObject.BuildTime = LaunchCommand(Command, WorkingDir,self.BuildItem.BuildObject)
             self.CompleteFlag = True

-            # Run hash operation post dependency, to account for libs
-            if GlobalData.gUseHashCache and self.BuildItem.BuildObject.IsLibrary:
-                HashFile = path.join(self.BuildItem.BuildObject.BuildDir, self.BuildItem.BuildObject.Name + ".hash")
-                SaveFileOnChange(HashFile, self.BuildItem.BuildObject.GenModuleHash(), True)
+            # Run hash operation post dependency to account for libs
+            # Run if --hash or --binary-destination
+            if GlobalData.gUseHashCache and not GlobalData.gBinCacheSource:
+                self.BuildItem.BuildObject.GenModuleHash()
+            if GlobalData.gBinCacheDest:
+                self.BuildItem.BuildObject.GenCMakeHash()
+
         except:
             #
             # TRICK: hide the output of threads left running, so that the user can
@@ -653,14 +656,6 @@ class BuildTask:
                 BuildTask._ErrorMessage = "%s broken\n    %s [%s]" % \
                                           (threading.currentThread().getName(), Command, WorkingDir)

-        # Set the value used by hash invalidation flow in GlobalData.gModuleBuildTracking to 'SUCCESS'
-        # If Module or Lib is being tracked, it did not fail header check test, and built successfully
-        if (self.BuildItem.BuildObject in GlobalData.gModuleBuildTracking and
-           GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject] != 'FAIL_METAFILE' and
-           not BuildTask._ErrorFlag.isSet()
-           ):
-            GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject] = 'SUCCESS'
-
         # indicate there's a thread is available for another build task
         BuildTask._RunningQueueLock.acquire()
         BuildTask._RunningQueue.pop(self.BuildItem)
@@ -835,11 +830,20 @@ class Build():
         self.AutoGenMgr = None
         EdkLogger.info("")
         os.chdir(self.WorkspaceDir)
-        GlobalData.gCacheIR = Manager().dict()
         self.log_q = log_q
         GlobalData.file_lock = mp.Lock()
-        GlobalData.cache_lock = mp.Lock()
-    def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,share_data):
+        # Init cache data for local only
+        GlobalData.gPackageHashFile = dict()
+        GlobalData.gModulePreMakeCacheStatus = dict()
+        GlobalData.gModuleMakeCacheStatus = dict()
+        GlobalData.gHashChainStatus = dict()
+        GlobalData.gCMakeHashFile = dict()
+        GlobalData.gModuleHashFile = dict()
+        GlobalData.gFileHashDict = dict()
+        GlobalData.gModuleAllCacheStatus = set()
+        GlobalData.gModuleCacheHit = set()
+
+    def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,cqueue):
         try:
             if SkipAutoGen:
                 return True,0
@@ -849,29 +853,27 @@ class Build():
             if FfsCmd is None:
                 FfsCmd = {}
             GlobalData.FfsCmd = FfsCmd
-            GlobalData.libConstPcd = DataPipe.Get("LibConstPcd")
-            GlobalData.Refes = DataPipe.Get("REFS")
-            auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,GlobalData.file_lock,GlobalData.cache_lock,share_data,self.log_q,error_event) for _ in range(self.ThreadNumber)]
+            auto_workers = [AutoGenWorkerInProcess(mqueue,DataPipe.dump_file,feedback_q,GlobalData.file_lock,cqueue,self.log_q,error_event) for _ in range(self.ThreadNumber)]
             self.AutoGenMgr = AutoGenManager(auto_workers,feedback_q,error_event)
             self.AutoGenMgr.start()
             for w in auto_workers:
                 w.start()
             if PcdMaList is not None:
                 for PcdMa in PcdMaList:
-                    if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
-                        PcdMa.GenModuleFilesHash(share_data)
-                        PcdMa.GenPreMakefileHash(share_data)
-                        if PcdMa.CanSkipbyPreMakefileCache(share_data):
-                            continue
+                    # The SourceFileList calling sequence impacts the makefile string sequence.
+                    # Create a cached SourceFileList here to unify its calling sequence for both
+                    # CanSkipbyPreMakeCache and CreateCodeFile/CreateMakeFile.
+                    RetVal = PcdMa.SourceFileList
+                    # Force cache miss for the PCD driver
+                    if GlobalData.gUseHashCache and not GlobalData.gBinCacheDest and self.Target in [None, "", "all"]:
+                        cqueue.put((PcdMa.MetaFile.Path, PcdMa.Arch, "PreMakeCache", False))

                     PcdMa.CreateCodeFile(False)
                     PcdMa.CreateMakeFile(False,GenFfsList = DataPipe.Get("FfsCommand").get((PcdMa.MetaFile.Path, PcdMa.Arch),[]))

+                    # Force cache miss for the PCD driver
                     if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
-                        PcdMa.GenMakeHeaderFilesHash(share_data)
-                        PcdMa.GenMakeHash(share_data)
-                        if PcdMa.CanSkipbyMakeCache(share_data):
-                            continue
+                        cqueue.put((PcdMa.MetaFile.Path, PcdMa.Arch, "MakeCache", False))

             self.AutoGenMgr.join()
             rt = self.AutoGenMgr.Status
@@ -1182,38 +1184,6 @@ class Build():
                 EdkLogger.error("Postbuild", POSTBUILD_ERROR, 'Postbuild process is not success!')
             EdkLogger.info("\n- Postbuild Done -\n")

-    ## Error handling for hash feature
-    #
-    # On BuildTask error, iterate through the Module Build tracking
-    # dictionary to determine whether a module failed to build. Invalidate
-    # the hash associated with that module by removing it from storage.
-    #
-    #
-    def invalidateHash(self):
-        # Only for hashing feature
-        if not GlobalData.gUseHashCache:
-            return
-
-        # GlobalData.gModuleBuildTracking contains only modules or libs that cannot be skipped by hash
-        for Ma in GlobalData.gModuleBuildTracking:
-            # Skip invalidating for Successful Module/Lib builds
-            if GlobalData.gModuleBuildTracking[Ma] == 'SUCCESS':
-                continue
-
-            # The module failed to build, failed to start building, or failed the header check test from this point on
-
-            # Remove .hash from build
-            ModuleHashFile = os.path.join(Ma.BuildDir, Ma.Name + ".hash")
-            if os.path.exists(ModuleHashFile):
-                os.remove(ModuleHashFile)
-
-            # Remove .hash file from cache
-            if GlobalData.gBinCacheDest:
-                FileDir = os.path.join(GlobalData.gBinCacheDest, Ma.PlatformInfo.OutputDir, Ma.BuildTarget + "_" + Ma.ToolChain, Ma.Arch, Ma.SourceDir, Ma.MetaFile.BaseName)
-                HashFile = os.path.join(FileDir, Ma.Name + '.hash')
-                if os.path.exists(HashFile):
-                    os.remove(HashFile)
-
     ## Build a module or platform
     #
     # Create autogen code and makefile for a module or platform, and the launch
@@ -1251,7 +1221,8 @@ class Build():
             self.Progress.Start("Generating makefile and code")
             data_pipe_file = os.path.join(AutoGenObject.BuildDir, "GlobalVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch))
             AutoGenObject.DataPipe.dump(data_pipe_file)
-            autogen_rt,errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR)
+            cqueue = mp.Queue()
+            autogen_rt,errorcode = self.StartAutoGen(mqueue, AutoGenObject.DataPipe, self.SkipAutoGen, PcdMaList, cqueue)
             AutoGenIdFile = os.path.join(GlobalData.gConfDirectory,".AutoGenIdFile.txt")
             with open(AutoGenIdFile,"w") as fw:
                 fw.write("Arch=%s\n" % "|".join((AutoGenObject.Workspace.ArchList)))
@@ -1292,7 +1263,11 @@ class Build():
                 LaunchCommand(BuildCommand, AutoGenObject.MakeFileDir)
             self.CreateAsBuiltInf()
             if GlobalData.gBinCacheDest:
-                self.UpdateBuildCache()
+                self.GenDestCache()
+            elif GlobalData.gUseHashCache and not GlobalData.gBinCacheSource:
+                # Only for --hash
+                # Update the PreMakeCacheChain files
+                self.GenLocalPreMakeCache()
             self.BuildModules = []
             return True

@@ -1326,7 +1301,11 @@ class Build():
                     LaunchCommand(NewBuildCommand, AutoGenObject.MakeFileDir,ModAutoGen)
                 self.CreateAsBuiltInf()
             if GlobalData.gBinCacheDest:
-                self.UpdateBuildCache()
+                self.GenDestCache()
+            elif GlobalData.gUseHashCache and not GlobalData.gBinCacheSource:
+                # Only for --hash
+                # Update the PreMakeCacheChain files
+                self.GenLocalPreMakeCache()
             self.BuildModules = []
             return True

@@ -1422,7 +1401,11 @@ class Build():
             AutoGenObject.BuildTime = LaunchCommand(BuildCommand, AutoGenObject.MakeFileDir)
             self.CreateAsBuiltInf()
             if GlobalData.gBinCacheDest:
-                self.UpdateBuildCache()
+                self.GenDestCache()
+            elif GlobalData.gUseHashCache and not GlobalData.gBinCacheSource:
+                # Only for --hash
+                # Update the PreMakeCacheChain files
+                self.GenLocalPreMakeCache()
             self.BuildModules = []
             return True

@@ -1870,13 +1853,7 @@ class Build():
                 if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
                     CmdListDict = self._GenFfsCmd(Wa.ArchList)

-                # Add Platform and Package level hash in share_data for module hash calculation later
-                if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
-                    GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
-                    for PkgName in GlobalData.gPackageHash.keys():
-                        GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
                 GlobalData.file_lock = mp.Lock()
-                GlobalData.cache_lock = mp.Lock()
                 GlobalData.FfsCmd = CmdListDict

                 self.Progress.Stop("done!")
@@ -1888,8 +1865,6 @@ class Build():
                     AutoGenStart = time.time()
                     GlobalData.gGlobalDefines['ARCH'] = Arch
                     Pa = PlatformAutoGen(Wa, self.PlatformFile, BuildTarget, ToolChain, Arch)
-                    GlobalData.libConstPcd = Pa.DataPipe.Get("LibConstPcd")
-                    GlobalData.Refes = Pa.DataPipe.Get("REFS")
                     for Module in Pa.Platform.Modules:
                         if self.ModuleFile.Dir == Module.Dir and self.ModuleFile.Name == Module.Name:
                             Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
@@ -1900,13 +1875,11 @@ class Build():
                             Ma.Workspace = Wa
                             MaList.append(Ma)

-                            if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
-                                Ma.GenModuleFilesHash(GlobalData.gCacheIR)
-                                Ma.GenPreMakefileHash(GlobalData.gCacheIR)
-                                if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
-                                    self.HashSkipModules.append(Ma)
-                                    EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
+                            if GlobalData.gUseHashCache and not GlobalData.gBinCacheDest and self.Target in [None, "", "all"]:
+                                if Ma.CanSkipbyPreMakeCache():
                                     continue
+                                else:
+                                    self.PreMakeCacheMiss.add(Ma)

                             # Not to auto-gen for targets 'clean', 'cleanlib', 'cleanall', 'run', 'fds'
                             if self.Target not in ['clean', 'cleanlib', 'cleanall', 'run', 'fds']:
@@ -1929,19 +1902,12 @@ class Build():
                                         return True

                                     if GlobalData.gBinCacheSource and self.Target in [None, "", "all"]:
-                                        Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR)
-                                        Ma.GenMakeHash(GlobalData.gCacheIR)
-                                        if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
-                                            self.HashSkipModules.append(Ma)
-                                            EdkLogger.quiet("cache hit: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
+                                        if Ma.CanSkipbyMakeCache():
                                             continue
                                         else:
-                                            EdkLogger.quiet("cache miss: %s[%s]" % (Ma.MetaFile.Path, Ma.Arch))
-                                            Ma.PrintFirstMakeCacheMissFile(GlobalData.gCacheIR)
+                                            self.MakeCacheMiss.add(Ma)

                                     self.BuildModules.append(Ma)
-                                    # Initialize all modules in tracking to 'FAIL'
-                                    GlobalData.gModuleBuildTracking[Ma] = 'FAIL'
                     self.AutoGenTime += int(round((time.time() - AutoGenStart)))
                     MakeStart = time.time()
                     for Ma in self.BuildModules:
@@ -1952,7 +1918,6 @@ class Build():
                                 # we need a full version of makefile for platform
                                 ExitFlag.set()
                                 BuildTask.WaitForComplete()
-                                self.invalidateHash()
                                 Pa.CreateMakeFile(False)
                                 EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                             # Start task scheduler
@@ -1962,7 +1927,6 @@ class Build():
                     # in case there's an interruption. we need a full version of makefile for platform
                     Pa.CreateMakeFile(False)
                     if BuildTask.HasError():
-                        self.invalidateHash()
                         EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                     self.MakeTime += int(round((time.time() - MakeStart)))

@@ -1971,11 +1935,14 @@ class Build():
                     BuildTask.WaitForComplete()
                     self.CreateAsBuiltInf()
                     if GlobalData.gBinCacheDest:
-                        self.UpdateBuildCache()
+                        self.GenDestCache()
+                    elif GlobalData.gUseHashCache and not GlobalData.gBinCacheSource:
+                        # Only for --hash
+                        # Update the PreMakeCacheChain files
+                        self.GenLocalPreMakeCache()
                     self.BuildModules = []
                     self.MakeTime += int(round((time.time() - MakeContiue)))
                     if BuildTask.HasError():
-                        self.invalidateHash()
                         EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)

                 self.BuildReport.AddPlatformReport(Wa, MaList)
@@ -2028,7 +1995,6 @@ class Build():
         # Save MAP buffer into MAP file.
         #
         self._SaveMapFile (MapBuffer, Wa)
-        self.invalidateHash()

     def _GenFfsCmd(self,ArchList):
         # convert dictionary of Cmd:(Inf,Arch)
@@ -2134,20 +2100,13 @@ class Build():
             self.BuildReport.AddPlatformReport(Wa)
             Wa.CreateMakeFile(False)

-        # Add ffs build to makefile
+        # Add ffs build to makefile
         CmdListDict = {}
         if GlobalData.gEnableGenfdsMultiThread and self.Fdf:
             CmdListDict = self._GenFfsCmd(Wa.ArchList)

-        # Add Platform and Package level hash in share_data for module hash calculation later
-        if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest:
-            GlobalData.gCacheIR[('PlatformHash')] = GlobalData.gPlatformHash
-            for PkgName in GlobalData.gPackageHash.keys():
-                GlobalData.gCacheIR[(PkgName, 'PackageHash')] = GlobalData.gPackageHash[PkgName]
-
         self.AutoGenTime += int(round((time.time() - WorkspaceAutoGenTime)))
         BuildModules = []
-        TotalModules = []
         for Arch in Wa.ArchList:
             PcdMaList    = []
             AutoGenStart = time.time()
@@ -2158,7 +2117,7 @@ class Build():
             ModuleList = []
             for Inf in Pa.Platform.Modules:
                 ModuleList.append(Inf)
-            # Add the INF only list in FDF
+            # Add the INF only list in FDF
             if GlobalData.gFdfParser is not None:
                 for InfName in GlobalData.gFdfParser.Profile.InfList:
                     Inf = PathClass(NormPath(InfName), self.WorkspaceDir, Arch)
@@ -2172,58 +2131,73 @@ class Build():
             Pa.DataPipe.DataContainer = {"LibraryBuildDirectoryList":Pa.LibraryBuildDirectoryList}
             Pa.DataPipe.DataContainer = {"ModuleBuildDirectoryList":Pa.ModuleBuildDirectoryList}
             Pa.DataPipe.DataContainer = {"FdsCommandDict": Wa.GenFdsCommandDict}
+            # Prepare the cache share data for multiprocessing
+            Pa.DataPipe.DataContainer = {"gPlatformHashFile":GlobalData.gPlatformHashFile}
             ModuleCodaFile = {}
             for ma in Pa.ModuleAutoGenList:
                 ModuleCodaFile[(ma.MetaFile.File,ma.MetaFile.Root,ma.Arch,ma.MetaFile.Path)] = [item.Target for item in ma.CodaTargetList]
             Pa.DataPipe.DataContainer = {"ModuleCodaFile":ModuleCodaFile}
+            # ModuleList contains all driver modules only
             for Module in ModuleList:
-                # Get ModuleAutoGen object to generate C code file and makefile
+                # Get the ModuleAutoGen object to generate the C code file and makefile
                 Ma = ModuleAutoGen(Wa, Module, BuildTarget, ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
-
                 if Ma is None:
                     continue
                 if Ma.PcdIsDriver:
                     Ma.PlatformInfo = Pa
                     Ma.Workspace = Wa
                     PcdMaList.append(Ma)
-                TotalModules.append(Ma)
-                # Initialize all modules in tracking to 'FAIL'
-                GlobalData.gModuleBuildTracking[Ma] = 'FAIL'
-
+                self.AllDrivers.add(Ma)
+                self.AllModules.add(Ma)

             mqueue = mp.Queue()
+            cqueue = mp.Queue()
             for m in Pa.GetAllModuleInfo:
                 mqueue.put(m)
+                module_file,module_root,module_path,module_basename,\
+                    module_originalpath,module_arch,IsLib = m
+                Ma = ModuleAutoGen(Wa, PathClass(module_path, Wa), BuildTarget,\
+                                   ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                self.AllModules.add(Ma)
             data_pipe_file = os.path.join(Pa.BuildDir, "GlobalVar_%s_%s.bin" % (str(Pa.Guid),Pa.Arch))
             Pa.DataPipe.dump(data_pipe_file)

-            autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList,GlobalData.gCacheIR)
-
-            # Skip cache hit modules
-            if GlobalData.gBinCacheSource:
-                for Ma in TotalModules:
-                    if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
-                        GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].PreMakeCacheHit:
-                            self.HashSkipModules.append(Ma)
-                            continue
-                    if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.gCacheIR and \
-                        GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.Arch)].MakeCacheHit:
-                            self.HashSkipModules.append(Ma)
-                            continue
-                    BuildModules.append(Ma)
-            else:
-                BuildModules.extend(TotalModules)
+            autogen_rt, errorcode = self.StartAutoGen(mqueue, Pa.DataPipe, self.SkipAutoGen, PcdMaList, cqueue)

             if not autogen_rt:
                 self.AutoGenMgr.TerminateWorkers()
                 self.AutoGenMgr.join(1)
                 raise FatalError(errorcode)
+
+            if GlobalData.gUseHashCache:
+                for item in GlobalData.gModuleAllCacheStatus:
+                    (MetaFilePath, Arch, CacheStr, Status) = item
+                    Ma = ModuleAutoGen(Wa, PathClass(MetaFilePath, Wa), BuildTarget,\
+                                       ToolChain, Arch, self.PlatformFile,Pa.DataPipe)
+                    if CacheStr == "PreMakeCache" and Status == False:
+                        self.PreMakeCacheMiss.add(Ma)
+                    if CacheStr == "PreMakeCache" and Status == True:
+                        self.PreMakeCacheHit.add(Ma)
+                        GlobalData.gModuleCacheHit.add(Ma)
+                    if CacheStr == "MakeCache" and Status == False:
+                        self.MakeCacheMiss.add(Ma)
+                    if CacheStr == "MakeCache" and Status == True:
+                        self.MakeCacheHit.add(Ma)
+                        GlobalData.gModuleCacheHit.add(Ma)
             self.AutoGenTime += int(round((time.time() - AutoGenStart)))
         AutoGenIdFile = os.path.join(GlobalData.gConfDirectory,".AutoGenIdFile.txt")
         with open(AutoGenIdFile,"w") as fw:
             fw.write("Arch=%s\n" % "|".join((Wa.ArchList)))
             fw.write("BuildDir=%s\n" % Wa.BuildDir)
             fw.write("PlatformGuid=%s\n" % str(Wa.AutoGenObjectList[0].Guid))
+
+        if GlobalData.gBinCacheSource:
+            BuildModules.extend(self.MakeCacheMiss)
+        elif GlobalData.gUseHashCache and not GlobalData.gBinCacheDest:
+            BuildModules.extend(self.PreMakeCacheMiss)
+        else:
+            BuildModules.extend(self.AllDrivers)
+
         self.Progress.Stop("done!")
         return Wa, BuildModules
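A condensed sketch of the cache-status plumbing that replaces share_data above: the AutoGen workers report per-module results over a multiprocessing queue, and the main process folds them into hit/miss sets (as done with GlobalData.gModuleAllCacheStatus). The message shape matches the patch, (MetaFilePath, Arch, "PreMakeCache"|"MakeCache", bool); the helper names are illustrative:

    import multiprocessing as mp

    def report_status(cqueue, module_path, arch, kind, hit):
        # Worker side: one message per (module, check kind), mirroring cqueue.put(...) above.
        cqueue.put((module_path, arch, kind, hit))

    def drain_status(cqueue, expected):
        # Main-process side: fold worker reports into hit/miss sets.
        hits, misses = set(), set()
        for _ in range(expected):
            module_path, arch, kind, hit = cqueue.get()
            (hits if hit else misses).add((module_path, arch, kind))
        return hits, misses

Because the results flow back through the queue, the status dictionaries themselves no longer need to be shared Manager() objects or guarded by cache_lock.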

@@ -2253,18 +2227,9 @@ class Build():
                 GlobalData.gAutoGenPhase = False

                 if GlobalData.gBinCacheSource:
-                    EdkLogger.quiet("Total cache hit driver num: %s, cache miss driver num: %s" % (len(set(self.HashSkipModules)), len(set(self.BuildModules))))
-                    CacheHitMa = set()
-                    CacheNotHitMa = set()
-                    for IR in GlobalData.gCacheIR.keys():
-                        if 'PlatformHash' in IR or 'PackageHash' in IR:
-                            continue
-                        if GlobalData.gCacheIR[IR].PreMakeCacheHit or GlobalData.gCacheIR[IR].MakeCacheHit:
-                            CacheHitMa.add(IR)
-                        else:
-                            # There might be binary modules or modules which have .inc files, not counted as cache misses
-                            CacheNotHitMa.add(IR)
-                    EdkLogger.quiet("Total module num: %s, cache hit module num: %s" % (len(CacheHitMa)+len(CacheNotHitMa), len(CacheHitMa)))
+                    EdkLogger.quiet("[cache Summary]: Total module num: %s" % len(self.AllModules))
+                    EdkLogger.quiet("[cache Summary]: PreMakecache miss num: %s " % len(self.PreMakeCacheMiss))
+                    EdkLogger.quiet("[cache Summary]: Makecache miss num: %s " % len(self.MakeCacheMiss))

                 for Arch in Wa.ArchList:
                     MakeStart = time.time()
@@ -2277,7 +2242,6 @@ class Build():
                                 # we need a full version of makefile for platform
                                 ExitFlag.set()
                                 BuildTask.WaitForComplete()
-                                self.invalidateHash()
                                 Pa.CreateMakeFile(False)
                                 EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                             # Start task scheduler
@@ -2287,7 +2251,6 @@ class Build():
                     # in case there's an interruption. we need a full version of makefile for platform

                     if BuildTask.HasError():
-                        self.invalidateHash()
                         EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)
                     self.MakeTime += int(round((time.time() - MakeStart)))

@@ -2301,7 +2264,11 @@ class Build():
                     BuildTask.WaitForComplete()
                     self.CreateAsBuiltInf()
                     if GlobalData.gBinCacheDest:
-                        self.UpdateBuildCache()
+                        self.GenDestCache()
+                    elif GlobalData.gUseHashCache and not GlobalData.gBinCacheSource:
+                        # Only for --hash
+                        # Update the PreMakeCacheChain files
+                        self.GenLocalPreMakeCache()
                     self.BuildModules = []
                     self.MakeTime += int(round((time.time() - MakeContiue)))
                 #
                 # All modules have been put in build tasks queue. Tell task scheduler
                 # that all productive tasks have been signaled.
                 #
                 if BuildTask.HasError():
-                    self.invalidateHash()
                     EdkLogger.error("build", BUILD_ERROR, "Failed to build module", ExtraData=GlobalData.gBuildingModule)

                 # Create MAP file when Load Fix Address is enabled.
@@ -2350,7 +2316,6 @@ class Build():
                 #
                 self._SaveMapFile(MapBuffer, Wa)
                 self.CreateGuidedSectionToolsFile(Wa)
-                self.invalidateHash()
     ## Generate GuidedSectionTools.txt in the FV directories.
     #
     def CreateGuidedSectionToolsFile(self,Wa):
@@ -2406,6 +2371,12 @@ class Build():
     ## Launch the module or platform build
     #
     def Launch(self):
+        self.AllDrivers = set()
+        self.AllModules = set()
+        self.PreMakeCacheMiss = set()
+        self.PreMakeCacheHit = set()
+        self.MakeCacheMiss = set()
+        self.MakeCacheHit = set()
         if not self.ModuleFile:
             if not self.SpawnMode or self.Target not in ["", "all"]:
                 self.SpawnMode = False
@@ -2423,23 +2394,16 @@ class Build():
         for Module in self.BuildModules:
             Module.CreateAsBuiltInf()

-    def UpdateBuildCache(self):
-        all_lib_set = set()
-        all_mod_set = set()
-        for Module in self.BuildModules:
+    def GenDestCache(self):
+        for Module in self.AllModules:
+            Module.GenPreMakefileHashList()
+            Module.GenMakefileHashList()
             Module.CopyModuleToCache()
-            all_mod_set.add(Module)
-        for Module in self.HashSkipModules:
-            Module.CopyModuleToCache()
-            all_mod_set.add(Module)
-        for Module in all_mod_set:
-            for lib in Module.LibraryAutoGenList:
-                all_lib_set.add(lib)
-        for lib in all_lib_set:
-            lib.CopyModuleToCache()
-        all_lib_set.clear()
-        all_mod_set.clear()
-        self.HashSkipModules = []
+
+    def GenLocalPreMakeCache(self):
+        for Module in self.PreMakeCacheMiss:
+            Module.GenPreMakefileHashList()
+
     ## Do some clean-up works when error occurred
     def Relinquish(self):
         OldLogLevel = EdkLogger.GetLevel()
-- 
2.16.1.windows.4
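For anyone trying the series locally, the binary cache is exercised through the existing build command-line options; the platform, toolchain, and cache paths below are examples only:

    build -p OvmfPkg/OvmfPkgX64.dsc -a X64 -t GCC5 --hash --binary-destination=BinCache    # populate the cache
    build -p OvmfPkg/OvmfPkgX64.dsc -a X64 -t GCC5 --hash --binary-source=BinCache         # consume it on a later build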
From nobody Sun May 5 09:42:14 2024
From: "Steven Shi" 
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com, "Shi, Steven" 
Subject: [edk2-devel] [PATCH 4/4] BaseTools: Remove redundant binary cache file
Date: Tue, 19 Nov 2019 17:27:01 +0800
Message-Id: <20191119092701.22988-5-steven.shi@intel.com>
In-Reply-To: <20191119092701.22988-1-steven.shi@intel.com>
References: <20191119092701.22988-1-steven.shi@intel.com>
Content-Type: text/plain; charset="utf-8"

From: "Shi, Steven" 

The redesigned binary cache no longer needs to keep its intermediate
results and state in memory as a ModuleBuildCacheIR class instance, so
remove CacheIR.py, which defines the ModuleBuildCacheIR class.

Cc: Liming Gao 
Cc: Bob Feng 
Signed-off-by: Steven Shi 
---
 BaseTools/Source/Python/AutoGen/CacheIR.py | 29 ---------------------------
 1 file changed, 29 deletions(-)
 delete mode 100755 BaseTools/Source/Python/AutoGen/CacheIR.py

diff --git a/BaseTools/Source/Python/AutoGen/CacheIR.py b/BaseTools/Source/Python/AutoGen/CacheIR.py
deleted file mode 100755
index 715be5273c..0000000000
--- a/BaseTools/Source/Python/AutoGen/CacheIR.py
+++ /dev/null
@@ -1,29 +0,0 @@
-## @file
-# Build cache intermediate result and state
-#
-# Copyright (c) 2019, Intel Corporation. All rights reserved.<BR>
-# SPDX-License-Identifier: BSD-2-Clause-Patent
-#
-
-class ModuleBuildCacheIR():
-    def __init__(self, Path, Arch):
-        self.ModulePath = Path
-        self.ModuleArch = Arch
-        self.ModuleFilesHashDigest = None
-        self.ModuleFilesHashHexDigest = None
-        self.ModuleFilesChain = []
-        self.PreMakefileHashHexDigest = None
-        self.CreateCodeFileDone = False
-        self.CreateMakeFileDone = False
-        self.MakefilePath = None
-        self.AutoGenFileList = None
-        self.DependencyHeaderFileSet = None
-        self.MakeHeaderFilesHashChain = None
-        self.MakeHeaderFilesHashDigest = None
-        self.MakeHeaderFilesHashChain = []
-        self.MakeHashDigest = None
-        self.MakeHashHexDigest = None
-        self.MakeHashChain = []
-        self.CacheCrash = False
-        self.PreMakeCacheHit = False
-        self.MakeCacheHit = False
-- 
2.16.1.windows.4
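For context on where the deleted state went in this series, a rough, non-normative mapping from the old ModuleBuildCacheIR fields to their replacements introduced by the earlier patches, written as a Python literal for readability:

    # Rough mapping; destinations are the dicts/files introduced earlier in this series.
    CACHE_IR_REPLACEMENTS = {
        "PreMakeCacheHit / MakeCacheHit": "GlobalData.gModulePreMakeCacheStatus / gModuleMakeCacheStatus, keyed by (MetaFile.Path, Arch)",
        "ModuleFilesChain / MakeHashChain": "on-disk <Name>.hashchain.<md5> JSON files",
        "PreMakefileHashHexDigest / MakeHashHexDigest": "the <md5> suffix embedded in the cache file and directory names",
        "CacheCrash": "dropped; a missing or corrupt cache file is now just a per-module cache miss",
    }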