From: "Steven Shi"
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com, christian.rodriguez@intel.com, michael.johnson@intel.com, "Shi, Steven"
Subject: [edk2-devel] [PATCH v2 1/4] BaseTools: Improve the cache hit in the edk2 build cache
Date: Tue, 13 Aug 2019 11:46:51 +0800
Message-Id: <20190813034654.25716-2-steven.shi@intel.com>
In-Reply-To: <20190813034654.25716-1-steven.shi@intel.com>
References: <20190813034654.25716-1-steven.shi@intel.com>

From: "Shi, Steven"

BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1927

The current cache hash algorithm does not parse and generate the makefile
to get the accurate dependency files for a module. Instead, it uses the
platform and package meta files to get the module dependency in a quick
but over-approximate way. These meta files are monolithic and introduce
many redundant dependencies for the module, which makes the module build
cache miss easily. This patch introduces one more cache checkpoint and a
new hash algorithm besides the current quick one.
The new hash algorithm leverages the module makefile to achieve more
accurate and precise dependency info for a module. When the build cache
misses with the first quick hash, BaseTools will calculate the new hash
after the makefile is generated and then check the cache again.

Cc: Liming Gao
Cc: Bob Feng
Signed-off-by: Steven Shi
---
 BaseTools/Source/Python/AutoGen/AutoGenWorker.py |  21 +
 BaseTools/Source/Python/AutoGen/CacheIR.py       |  28 +
 BaseTools/Source/Python/AutoGen/DataPipe.py      |   8 +
 BaseTools/Source/Python/AutoGen/GenMake.py       | 223 ++--
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 640 ++++++---
 BaseTools/Source/Python/Common/GlobalData.py     |   9 +
 BaseTools/Source/Python/build/build.py           | 122 ++-
 7 files changed, 859 insertions(+), 192 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
old mode 100644
new mode 100755
index e583828741..a84ed46f2e
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -182,6 +182,12 @@ class AutoGenWorkerInProcess(mp.Process):
             GlobalData.gDisableIncludePathCheck = False
             GlobalData.gFdfParser = self.data_pipe.Get("FdfParser")
             GlobalData.gDatabasePath = self.data_pipe.Get("DatabasePath")
+            GlobalData.gBinCacheSource = self.data_pipe.Get("BinCacheSource")
+            GlobalData.gBinCacheDest = self.data_pipe.Get("BinCacheDest")
+            GlobalData.gCacheIR = self.data_pipe.Get("CacheIR")
+            GlobalData.gEnableGenfdsMultiThread = self.data_pipe.Get("EnableGenfdsMultiThread")
+            GlobalData.file_lock = self.file_lock
+            CommandTarget = self.data_pipe.Get("CommandTarget")
             pcd_from_build_option = []
             for pcd_tuple in self.data_pipe.Get("BuildOptPcd"):
                 pcd_id = ".".join((pcd_tuple[0],pcd_tuple[1]))
@@ -193,10 +199,13 @@ class AutoGenWorkerInProcess(mp.Process):
             FfsCmd = self.data_pipe.Get("FfsCommand")
             if FfsCmd is None:
                 FfsCmd = {}
+            GlobalData.FfsCmd = FfsCmd
             PlatformMetaFile = self.GetPlatformMetaFile(self.data_pipe.Get("P_Info").get("ActivePlatform"),
                                                         self.data_pipe.Get("P_Info").get("WorkspaceDir"))
             libConstPcd = self.data_pipe.Get("LibConstPcd")
             Refes = self.data_pipe.Get("REFS")
+            GlobalData.libConstPcd = libConstPcd
+            GlobalData.Refes = Refes
             while True:
                 if self.module_queue.empty():
                     break
@@ -223,8 +232,20 @@ class AutoGenWorkerInProcess(mp.Process):
                         Ma.ConstPcd = libConstPcd[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
                     if (Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path) in Refes:
                         Ma.ReferenceModules = Refes[(Ma.MetaFile.File,Ma.MetaFile.Root,Ma.Arch,Ma.MetaFile.Path)]
+                    if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
+                        Ma.GenModuleFilesHash(GlobalData.gCacheIR)
+                        Ma.GenPreMakefileHash(GlobalData.gCacheIR)
+                        if Ma.CanSkipbyPreMakefileCache(GlobalData.gCacheIR):
+                            continue
+
                     Ma.CreateCodeFile(False)
                     Ma.CreateMakeFile(False,GenFfsList=FfsCmd.get((Ma.MetaFile.File, Ma.Arch),[]))
+
+                    if GlobalData.gBinCacheSource and CommandTarget in [None, "", "all"]:
+                        Ma.GenMakeHeaderFilesHash(GlobalData.gCacheIR)
+                        Ma.GenMakeHash(GlobalData.gCacheIR)
+                        if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
+                            continue
             except Empty:
                 pass
             except:
diff --git a/BaseTools/Source/Python/AutoGen/CacheIR.py b/BaseTools/Source/Python/AutoGen/CacheIR.py
new file mode 100755
index 0000000000..2d9ffe3f0b
--- /dev/null
+++ b/BaseTools/Source/Python/AutoGen/CacheIR.py
@@ -0,0 +1,28 @@
+## @file
+# Build cache intermediate result and state
+#
+# Copyright (c) 2019, Intel Corporation. All rights reserved.
+# SPDX-License-Identifier: BSD-2-Clause-Patent
+#
+
+class ModuleBuildCacheIR():
+    def __init__(self, Path, Arch):
+        self.ModulePath = Path
+        self.ModuleArch = Arch
+        self.ModuleFilesHashDigest = None
+        self.ModuleFilesHashHexDigest = None
+        self.ModuleFilesChain = []
+        self.PreMakefileHashHexDigest = None
+        self.CreateCodeFileDone = False
+        self.CreateMakeFileDone = False
+        self.MakefilePath = None
+        self.AutoGenFileList = None
+        self.DependencyHeaderFileSet = None
+        self.MakeHeaderFilesHashChain = None
+        self.MakeHeaderFilesHashDigest = None
+        self.MakeHeaderFilesHashChain = []
+        self.MakeHashDigest = None
+        self.MakeHashHexDigest = None
+        self.MakeHashChain = []
+        self.PreMakeCacheHit = False
+        self.MakeCacheHit = False
diff --git a/BaseTools/Source/Python/AutoGen/DataPipe.py b/BaseTools/Source/Python/AutoGen/DataPipe.py
old mode 100644
new mode 100755
index 2052084bdb..84e77c301a
--- a/BaseTools/Source/Python/AutoGen/DataPipe.py
+++ b/BaseTools/Source/Python/AutoGen/DataPipe.py
@@ -158,3 +158,11 @@ class MemoryDataPipe(DataPipe):
         self.DataContainer = {"FdfParser": True if GlobalData.gFdfParser else False}
 
         self.DataContainer = {"LogLevel": EdkLogger.GetLevel()}
+
+        self.DataContainer = {"BinCacheSource":GlobalData.gBinCacheSource}
+
+        self.DataContainer = {"BinCacheDest":GlobalData.gBinCacheDest}
+
+        self.DataContainer = {"CacheIR":GlobalData.gCacheIR}
+
+        self.DataContainer = {"EnableGenfdsMultiThread":GlobalData.gEnableGenfdsMultiThread}
\ No newline at end of file
diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
old mode 100644
new mode 100755
index 5802ae5ae4..79387856bd
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -906,6 +906,11 @@ cleanlib:
                             self._AutoGenObject.IncludePathList + self._AutoGenObject.BuildOptionIncPathList
                             )
 
+        self.DependencyHeaderFileSet = set()
+        if FileDependencyDict:
+            for Dependency in FileDependencyDict.values():
+                self.DependencyHeaderFileSet.update(set(Dependency))
+
         # Check if header files are listed in metafile
         # Get a list of unique module header source files from MetaFile
         headerFilesInMetaFileSet = set()
@@ -1096,7 +1101,7 @@ cleanlib:
     ## For creating makefile targets for dependent libraries
     def ProcessDependentLibrary(self):
         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
+            if not LibraryAutoGen.IsBinaryModule:
                 self.LibraryBuildDirectoryList.append(self.PlaceMacro(LibraryAutoGen.BuildDir, self.Macros))
 
     ## Return a list containing source file's dependencies
@@ -1110,114 +1115,9 @@ cleanlib:
     def GetFileDependency(self, FileList, ForceInculeList, SearchPathList):
         Dependency = {}
         for F in FileList:
-            Dependency[F] = self.GetDependencyList(F, ForceInculeList, SearchPathList)
+            Dependency[F] = GetDependencyList(self._AutoGenObject, self.FileCache, F, ForceInculeList, SearchPathList)
         return Dependency
 
-    ## Find dependencies for one source file
-    #
-    # By searching recursively "#include" directive in file, find out all the
-    # files needed by given source file. The dependencies will be only searched
-    # in given search path list.
-    #
-    # @param File              The source file
-    # @param ForceInculeList   The list of files which will be included forcely
-    # @param SearchPathList    The list of search path
-    #
-    # @retval list             The list of files the given source file depends on
-    #
-    def GetDependencyList(self, File, ForceList, SearchPathList):
-        EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
-        FileStack = [File] + ForceList
-        DependencySet = set()
-
-        if self._AutoGenObject.Arch not in gDependencyDatabase:
-            gDependencyDatabase[self._AutoGenObject.Arch] = {}
-        DepDb = gDependencyDatabase[self._AutoGenObject.Arch]
-
-        while len(FileStack) > 0:
-            F = FileStack.pop()
-
-            FullPathDependList = []
-            if F in self.FileCache:
-                for CacheFile in self.FileCache[F]:
-                    FullPathDependList.append(CacheFile)
-                    if CacheFile not in DependencySet:
-                        FileStack.append(CacheFile)
-                DependencySet.update(FullPathDependList)
-                continue
-
-            CurrentFileDependencyList = []
-            if F in DepDb:
-                CurrentFileDependencyList = DepDb[F]
-            else:
-                try:
-                    Fd = open(F.Path, 'rb')
-                    FileContent = Fd.read()
-                    Fd.close()
-                except BaseException as X:
-                    EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
-                if len(FileContent) == 0:
-                    continue
-                try:
-                    if FileContent[0] == 0xff or FileContent[0] == 0xfe:
-                        FileContent = FileContent.decode('utf-16')
-                    else:
-                        FileContent = FileContent.decode()
-                except:
-                    # The file is not txt file. for example .mcb file
-                    continue
-                IncludedFileList = gIncludePattern.findall(FileContent)
-
-                for Inc in IncludedFileList:
-                    Inc = Inc.strip()
-                    # if there's macro used to reference header file, expand it
-                    HeaderList = gMacroPattern.findall(Inc)
-                    if len(HeaderList) == 1 and len(HeaderList[0]) == 2:
-                        HeaderType = HeaderList[0][0]
-                        HeaderKey = HeaderList[0][1]
-                        if HeaderType in gIncludeMacroConversion:
-                            Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
-                        else:
-                            # not known macro used in #include, always build the file by
-                            # returning a empty dependency
-                            self.FileCache[File] = []
-                            return []
-                    Inc = os.path.normpath(Inc)
-                    CurrentFileDependencyList.append(Inc)
-                DepDb[F] = CurrentFileDependencyList
-
-            CurrentFilePath = F.Dir
-            PathList = [CurrentFilePath] + SearchPathList
-            for Inc in CurrentFileDependencyList:
-                for SearchPath in PathList:
-                    FilePath = os.path.join(SearchPath, Inc)
-                    if FilePath in gIsFileMap:
-                        if not gIsFileMap[FilePath]:
-                            continue
-                    # If isfile is called too many times, the performance is slow down.
-                    elif not os.path.isfile(FilePath):
-                        gIsFileMap[FilePath] = False
-                        continue
-                    else:
-                        gIsFileMap[FilePath] = True
-                    FilePath = PathClass(FilePath)
-                    FullPathDependList.append(FilePath)
-                    if FilePath not in DependencySet:
-                        FileStack.append(FilePath)
-                    break
-                else:
-                    EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
-                                    "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
-
-            self.FileCache[F] = FullPathDependList
-            DependencySet.update(FullPathDependList)
-
-        DependencySet.update(ForceList)
-        if File in DependencySet:
-            DependencySet.remove(File)
-        DependencyList = list(DependencySet)  # remove duplicate ones
-
-        return DependencyList
 
 ## CustomMakefile class
 #
@@ -1599,7 +1499,7 @@ cleanlib:
     def GetLibraryBuildDirectoryList(self):
         DirList = []
         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
+            if not LibraryAutoGen.IsBinaryModule:
                 DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
         return DirList
 
@@ -1735,7 +1635,7 @@ class TopLevelMakefile(BuildFile):
     def GetLibraryBuildDirectoryList(self):
         DirList = []
         for LibraryAutoGen in self._AutoGenObject.LibraryAutoGenList:
-            if not LibraryAutoGen.IsBinaryModule and not LibraryAutoGen.CanSkipbyHash():
+            if not LibraryAutoGen.IsBinaryModule:
                 DirList.append(os.path.join(self._AutoGenObject.BuildDir, LibraryAutoGen.BuildDir))
         return DirList
 
@@ -1743,3 +1643,108 @@
 if __name__ == '__main__':
     pass
 
+## Find dependencies for one source file
+#
+# By searching recursively "#include" directive in file, find out all the
+# files needed by given source file. The dependencies will be only searched
+# in given search path list.
+#
+# @param File              The source file
+# @param ForceInculeList   The list of files which will be included forcely
+# @param SearchPathList    The list of search path
+#
+# @retval list             The list of files the given source file depends on
+#
+def GetDependencyList(AutoGenObject, FileCache, File, ForceList, SearchPathList):
+    EdkLogger.debug(EdkLogger.DEBUG_1, "Try to get dependency files for %s" % File)
+    FileStack = [File] + ForceList
+    DependencySet = set()
+
+    if AutoGenObject.Arch not in gDependencyDatabase:
+        gDependencyDatabase[AutoGenObject.Arch] = {}
+    DepDb = gDependencyDatabase[AutoGenObject.Arch]
+
+    while len(FileStack) > 0:
+        F = FileStack.pop()
+
+        FullPathDependList = []
+        if F in FileCache:
+            for CacheFile in FileCache[F]:
+                FullPathDependList.append(CacheFile)
+                if CacheFile not in DependencySet:
+                    FileStack.append(CacheFile)
+            DependencySet.update(FullPathDependList)
+            continue
+
+        CurrentFileDependencyList = []
+        if F in DepDb:
+            CurrentFileDependencyList = DepDb[F]
+        else:
+            try:
+                Fd = open(F.Path, 'rb')
+                FileContent = Fd.read()
+                Fd.close()
+            except BaseException as X:
+                EdkLogger.error("build", FILE_OPEN_FAILURE, ExtraData=F.Path + "\n\t" + str(X))
+            if len(FileContent) == 0:
+                continue
+            try:
+                if FileContent[0] == 0xff or FileContent[0] == 0xfe:
+                    FileContent = FileContent.decode('utf-16')
+                else:
+                    FileContent = FileContent.decode()
+            except:
+                # The file is not txt file. for example .mcb file
+                continue
+            IncludedFileList = gIncludePattern.findall(FileContent)
+
+            for Inc in IncludedFileList:
+                Inc = Inc.strip()
+                # if there's macro used to reference header file, expand it
+                HeaderList = gMacroPattern.findall(Inc)
+                if len(HeaderList) == 1 and len(HeaderList[0]) == 2:
+                    HeaderType = HeaderList[0][0]
+                    HeaderKey = HeaderList[0][1]
+                    if HeaderType in gIncludeMacroConversion:
+                        Inc = gIncludeMacroConversion[HeaderType] % {"HeaderKey" : HeaderKey}
+                    else:
+                        # not known macro used in #include, always build the file by
+                        # returning a empty dependency
+                        FileCache[File] = []
+                        return []
+                Inc = os.path.normpath(Inc)
+                CurrentFileDependencyList.append(Inc)
+            DepDb[F] = CurrentFileDependencyList
+
+        CurrentFilePath = F.Dir
+        PathList = [CurrentFilePath] + SearchPathList
+        for Inc in CurrentFileDependencyList:
+            for SearchPath in PathList:
+                FilePath = os.path.join(SearchPath, Inc)
+                if FilePath in gIsFileMap:
+                    if not gIsFileMap[FilePath]:
+                        continue
+                # If isfile is called too many times, the performance is slow down.
+                elif not os.path.isfile(FilePath):
+                    gIsFileMap[FilePath] = False
+                    continue
+                else:
+                    gIsFileMap[FilePath] = True
+                FilePath = PathClass(FilePath)
+                FullPathDependList.append(FilePath)
+                if FilePath not in DependencySet:
+                    FileStack.append(FilePath)
+                break
+            else:
+                EdkLogger.debug(EdkLogger.DEBUG_9, "%s included by %s was not found "\
+                                "in any given path:\n\t%s" % (Inc, F, "\n\t".join(SearchPathList)))
+
+        FileCache[F] = FullPathDependList
+        DependencySet.update(FullPathDependList)
+
+    DependencySet.update(ForceList)
+    if File in DependencySet:
+        DependencySet.remove(File)
+    DependencyList = list(DependencySet)  # remove duplicate ones
+
+    return DependencyList
\ No newline at end of file
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
old mode 100644
new mode 100755
index ed6822334e..6cb7cae1c7
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -26,6 +26,8 @@ from Workspace.MetaFileCommentParser import UsageList
 from .GenPcdDb import CreatePcdDatabaseCode
 from Common.caching import cached_class_function
 from AutoGen.ModuleAutoGenHelper import PlatformInfo,WorkSpaceInfo
+from AutoGen.CacheIR import ModuleBuildCacheIR
+import json
 
 ## Mapping Makefile type
 gMakeTypeMap = {TAB_COMPILER_MSFT:"nmake", "GCC":"gmake"}
@@ -252,6 +254,8 @@ class ModuleAutoGen(AutoGen):
         self.AutoGenDepSet = set()
         self.ReferenceModules = []
         self.ConstPcd = {}
+        self.Makefile = None
+        self.FileDependCache = {}
 
     def __init_platform_info__(self):
         pinfo = self.DataPipe.Get("P_Info")
@@ -1594,12 +1598,37 @@ class ModuleAutoGen(AutoGen):
 
         self.IsAsBuiltInfCreated = True
 
+    def CacheCopyFile(self, OriginDir, CopyDir, File):
+        sub_dir = os.path.relpath(File, CopyDir)
+        destination_file = os.path.join(OriginDir, sub_dir)
+        destination_dir = os.path.dirname(destination_file)
+        CreateDirectory(destination_dir)
+        try:
+            CopyFileOnChange(File, destination_dir)
+        except:
+            EdkLogger.quiet("[cache warning]: fail to copy file:%s to folder:%s" % (File, destination_dir))
+            return
+
     def CopyModuleToCache(self):
-        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        self.GenPreMakefileHash(GlobalData.gCacheIR)
+        if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \
+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHash for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return False
+
+        self.GenMakeHash(GlobalData.gCacheIR)
+        if not (self.MetaFile.Path, self.Arch) in GlobalData.gCacheIR or \
+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashChain or \
+           not GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
+            EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChain for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return False
+
+        MakeHashStr = str(GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest)
+        FileDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName, MakeHashStr)
+        FfsDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs", self.Guid + self.Name, MakeHashStr)
+
         CreateDirectory (FileDir)
-        HashFile = path.join(self.BuildDir, self.Name + '.hash')
-        if os.path.exists(HashFile):
-            CopyFileOnChange(HashFile, FileDir)
+        self.SaveHashChainFileToCache(GlobalData.gCacheIR)
         ModuleFile = path.join(self.OutputDir, self.Name + '.inf')
         if os.path.exists(ModuleFile):
             CopyFileOnChange(ModuleFile, FileDir)
@@ -1617,38 +1646,73 @@ class ModuleAutoGen(AutoGen):
                     CreateDirectory(destination_dir)
                     CopyFileOnChange(File, destination_dir)
 
-    def AttemptModuleCacheCopy(self):
-        # If library or Module is binary do not skip by hash
-        if self.IsBinaryModule:
+    def SaveHashChainFileToCache(self, gDict):
+        if not GlobalData.gBinCacheDest:
+            return False
+
+        self.GenPreMakefileHash(gDict)
+        if not (self.MetaFile.Path, self.Arch) in gDict or \
+           not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
+            EdkLogger.quiet("[cache warning]: Cannot generate PreMakefileHash for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return False
+
+        self.GenMakeHash(gDict)
+        if not (self.MetaFile.Path, self.Arch) in gDict or \
+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain or \
+           not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest:
+            EdkLogger.quiet("[cache warning]: Cannot generate MakeHashChain for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
             return False
-        # .inc is contains binary information so do not skip by hash as well
-        for f_ext in self.SourceFileList:
-            if '.inc' in str(f_ext):
-                return False
-        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.Name, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
-        HashFile = path.join(FileDir, self.Name + '.hash')
-        if os.path.exists(HashFile):
-            f = open(HashFile, 'r')
-            CacheHash = f.read()
-            f.close()
-            self.GenModuleHash()
-            if GlobalData.gModuleHash[self.Arch][self.Name]:
-                if CacheHash == GlobalData.gModuleHash[self.Arch][self.Name]:
-                    for root, dir, files in os.walk(FileDir):
-                        for f in files:
-                            if self.Name + '.hash' in f:
-                                CopyFileOnChange(HashFile, self.BuildDir)
-                            else:
-                                File = path.join(root, f)
-                                sub_dir = os.path.relpath(File, FileDir)
-                                destination_file = os.path.join(self.OutputDir, sub_dir)
-                                destination_dir = os.path.dirname(destination_file)
-                                CreateDirectory(destination_dir)
-                                CopyFileOnChange(File, destination_dir)
-                    if self.Name == "PcdPeim" or self.Name == "PcdDxe":
-                        CreatePcdDatabaseCode(self, TemplateString(), TemplateString())
-                    return True
-        return False
+
+        # save the hash chain list as cache file
+        MakeHashStr = str(GlobalData.gCacheIR[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest)
+        CacheDestDir = path.join(GlobalData.gBinCacheDest, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+        CacheHashDestDir = path.join(CacheDestDir, MakeHashStr)
+        ModuleHashPair = path.join(CacheDestDir, self.Name + ".ModuleHashPair")
+        MakeHashChain = path.join(CacheHashDestDir, self.Name + ".MakeHashChain")
+        ModuleFilesChain = path.join(CacheHashDestDir, self.Name + ".ModuleFilesChain")
+
+        # save the HashChainDict as json file
+        CreateDirectory (CacheDestDir)
+        CreateDirectory (CacheHashDestDir)
+        try:
+            ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
+            if os.path.exists(ModuleHashPair):
+                f = open(ModuleHashPair, 'r')
+                ModuleHashPairList = json.load(f)
+                f.close()
+            PreMakeHash = gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest
+            MakeHash = gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest
+            ModuleHashPairList.append((PreMakeHash, MakeHash))
+            ModuleHashPairList = list(set(map(tuple, ModuleHashPairList)))
+            with open(ModuleHashPair, 'w') as f:
+                json.dump(ModuleHashPairList, f, indent=2)
+        except:
+            EdkLogger.quiet("[cache warning]: fail to save ModuleHashPair file in cache: %s" % ModuleHashPair)
+            return False
+
+        try:
+            with open(MakeHashChain, 'w') as f:
+                json.dump(gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain, f, indent=2)
+        except:
+            EdkLogger.quiet("[cache warning]: fail to save MakeHashChain file in cache: %s" % MakeHashChain)
+            return False
+
+        try:
+            with open(ModuleFilesChain, 'w') as f:
+                json.dump(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain, f, indent=2)
+        except:
+            EdkLogger.quiet("[cache warning]: fail to save ModuleFilesChain file in cache: %s" % ModuleFilesChain)
+            return False
+
+        # save the autogenfile and makefile for debug usage
+        CacheDebugDir = path.join(CacheHashDestDir, "CacheDebug")
+        CreateDirectory (CacheDebugDir)
+        CopyFileOnChange(gDict[(self.MetaFile.Path, self.Arch)].MakefilePath, CacheDebugDir)
+        if gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
+            for File in gDict[(self.MetaFile.Path, self.Arch)].AutoGenFileList:
+                CopyFileOnChange(str(File), CacheDebugDir)
+
+        return True
 
     ## Create makefile for the module and its dependent libraries
     #
@@ -1657,6 +1721,11 @@ class ModuleAutoGen(AutoGen):
     #
     @cached_class_function
     def CreateMakeFile(self, CreateLibraryMakeFile=True, GenFfsList = []):
+        gDict = GlobalData.gCacheIR
+        if (self.MetaFile.Path, self.Arch) in gDict and \
+          gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone:
+            return
+
         # nest this function inside it's only caller.
         def CreateTimeStamp():
             FileSet = {self.MetaFile.Path}
@@ -1687,8 +1756,8 @@ class ModuleAutoGen(AutoGen):
             for LibraryAutoGen in self.LibraryAutoGenList:
                 LibraryAutoGen.CreateMakeFile()
 
-        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
-        if not GlobalData.gUseHashCache and self.CanSkip():
+        # CanSkip uses timestamps to determine build skipping
+        if self.CanSkip():
            return
 
        if len(self.CustomMakefile) == 0:
@@ -1704,6 +1773,24 @@ class ModuleAutoGen(AutoGen):
 
         CreateTimeStamp()
 
+        MakefileType = Makefile._FileType
+        MakefileName = Makefile._FILE_NAME_[MakefileType]
+        MakefilePath = os.path.join(self.MakeFileDir, MakefileName)
+
+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
+        MewIR.MakefilePath = MakefilePath
+        MewIR.DependencyHeaderFileSet = Makefile.DependencyHeaderFileSet
+        MewIR.CreateMakeFileDone = True
+        with GlobalData.file_lock:
+            try:
+                IR = gDict[(self.MetaFile.Path, self.Arch)]
+                IR.MakefilePath = MakefilePath
+                IR.DependencyHeaderFileSet = Makefile.DependencyHeaderFileSet
+                IR.CreateMakeFileDone = True
+                gDict[(self.MetaFile.Path, self.Arch)] = IR
+            except:
+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
+
     def CopyBinaryFiles(self):
         for File in self.Module.Binaries:
             SrcPath = File.Path
@@ -1715,6 +1802,11 @@ class ModuleAutoGen(AutoGen):
     #                                       dependent libraries will be created
     #
     def CreateCodeFile(self, CreateLibraryCodeFile=True):
+        gDict = GlobalData.gCacheIR
+        if (self.MetaFile.Path, self.Arch) in gDict and \
+          gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone:
+            return
+
         if self.IsCodeFileCreated:
             return
 
@@ -1730,8 +1822,9 @@ class ModuleAutoGen(AutoGen):
         if not self.IsLibrary and CreateLibraryCodeFile:
             for LibraryAutoGen in self.LibraryAutoGenList:
                 LibraryAutoGen.CreateCodeFile()
-        # Don't enable if hash feature enabled, CanSkip uses timestamps to determine build skipping
-        if not GlobalData.gUseHashCache and self.CanSkip():
+
+        # CanSkip uses timestamps to determine build skipping
+        if self.CanSkip():
             return
 
         AutoGenList = []
@@ -1771,6 +1864,16 @@ class ModuleAutoGen(AutoGen):
                             (" ".join(AutoGenList), " ".join(IgoredAutoGenList), self.Name, self.Arch))
 
         self.IsCodeFileCreated = True
+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
+        MewIR.CreateCodeFileDone = True
+        with GlobalData.file_lock:
+            try:
+                IR = gDict[(self.MetaFile.Path, self.Arch)]
+                IR.CreateCodeFileDone = True
+                gDict[(self.MetaFile.Path, self.Arch)] = IR
+            except:
+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
+
         return AutoGenList
 
     ## Summarize the ModuleAutoGen objects of all libraries used by this module
@@ -1840,46 +1943,469 @@ class ModuleAutoGen(AutoGen):
 
         return GlobalData.gModuleHash[self.Arch][self.Name].encode('utf-8')
 
+    def GenModuleFilesHash(self, gDict):
+        # Early exit if module or library has been hashed and is in memory
+        if (self.MetaFile.Path, self.Arch) in gDict:
+            if gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain:
+                return gDict[(self.MetaFile.Path, self.Arch)]
+
+        DependencyFileSet = set()
+        # Add Module Meta file
+        DependencyFileSet.add(self.MetaFile)
+
+        # Add Module's source files
+        if self.SourceFileList:
+            for File in set(self.SourceFileList):
+                DependencyFileSet.add(File)
+
+        # Add modules's include header files
+        # Search dependency file list for each source file
+        SourceFileList = []
+        OutPutFileList = []
+        for Target in self.IntroTargetList:
+            SourceFileList.extend(Target.Inputs)
+            OutPutFileList.extend(Target.Outputs)
+        if OutPutFileList:
+            for Item in OutPutFileList:
+                if Item in SourceFileList:
+                    SourceFileList.remove(Item)
+        SearchList = []
+        for file_path in self.IncludePathList + self.BuildOptionIncPathList:
+            # skip the folders in platform BuildDir which are not been generated yet
+            if file_path.startswith(os.path.abspath(self.PlatformInfo.BuildDir)+os.sep):
+                continue
+            SearchList.append(file_path)
+        FileDependencyDict = {}
+        ForceIncludedFile = []
+        for F in SourceFileList:
+            # skip the files which are not been generated yet, because
+            # the SourceFileList usually contains intermediate build files, e.g. AutoGen.c
+            if not os.path.exists(F.Path):
+                continue
+            FileDependencyDict[F] = GenMake.GetDependencyList(self, self.FileDependCache, F, ForceIncludedFile, SearchList)
+
+        if FileDependencyDict:
+            for Dependency in FileDependencyDict.values():
+                DependencyFileSet.update(set(Dependency))
+
+        # Caculate all above dependency files hash
+        # Initialze hash object
+        FileList = []
+        m = hashlib.md5()
+        for File in sorted(DependencyFileSet, key=lambda x: str(x)):
+            if not os.path.exists(str(File)):
+                EdkLogger.quiet("[cache warning]: header file %s is missing for module: %s[%s]" % (File, self.MetaFile.Path, self.Arch))
+                continue
+            f = open(str(File), 'rb')
+            Content = f.read()
+            f.close()
+            m.update(Content)
+            FileList.append((str(File), hashlib.md5(Content).hexdigest()))
+
+
+        MewIR = ModuleBuildCacheIR(self.MetaFile.Path, self.Arch)
+        MewIR.ModuleFilesHashDigest = m.digest()
+        MewIR.ModuleFilesHashHexDigest = m.hexdigest()
+        MewIR.ModuleFilesChain = FileList
+        with GlobalData.file_lock:
+            try:
+                IR = gDict[(self.MetaFile.Path, self.Arch)]
+                IR.ModuleFilesHashDigest = m.digest()
+                IR.ModuleFilesHashHexDigest = m.hexdigest()
+                IR.ModuleFilesChain = FileList
+                gDict[(self.MetaFile.Path, self.Arch)] = IR
+            except:
+                gDict[(self.MetaFile.Path, self.Arch)] = MewIR
+
+        return gDict[(self.MetaFile.Path, self.Arch)]
+
+    def GenPreMakefileHash(self, gDict):
+        # Early exit if module or library has been hashed and is in memory
+        if (self.MetaFile.Path, self.Arch) in gDict and \
+          gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDigest:
+            return gDict[(self.MetaFile.Path, self.Arch)]
+
+        # skip binary module
+        if self.IsBinaryModule:
+            return
+
+        if not (self.MetaFile.Path, self.Arch) in gDict or \
+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
+            self.GenModuleFilesHash(gDict)
+
+        if not (self.MetaFile.Path, self.Arch) in gDict or \
+           not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest:
+            EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHashDigest for module %s[%s]" %(self.MetaFile.Path, self.Arch))
+            return
+
+        # Initialze hash object
+        m = hashlib.md5()
+
+        # Add Platform level hash
+        if ('PlatformHash') in gDict:
+            m.update(gDict[('PlatformHash')].encode('utf-8'))
+        else:
+            EdkLogger.quiet("[cache warning]: PlatformHash is missing")
+
+        # Add Package level hash
+        if self.DependentPackageList:
+            for Pkg in sorted(self.DependentPackageList, key=lambda x: x.PackageName):
+                if (Pkg.PackageName, 'PackageHash') in gDict:
+                    m.update(gDict[(Pkg.PackageName, 'PackageHash')].encode('utf-8'))
+                else:
+                    EdkLogger.quiet("[cache warning]: %s PackageHash needed by %s[%s] is missing" %(Pkg.PackageName, self.MetaFile.Name, self.Arch))
+
+        # Add Library hash
+        if self.LibraryAutoGenList:
+            for Lib in sorted(self.LibraryAutoGenList, key=lambda x: x.Name):
+                if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \
+                   not gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHashDigest:
+                    Lib.GenPreMakefileHash(gDict)
+                m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].ModuleFilesHashDigest)
+
+        # Add Module self
+        m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDigest)
+
+        with GlobalData.file_lock:
+            IR = gDict[(self.MetaFile.Path, self.Arch)]
+            IR.PreMakefileHashHexDigest = m.hexdigest()
+            gDict[(self.MetaFile.Path, self.Arch)] = IR
+
+        return gDict[(self.MetaFile.Path, self.Arch)]
+
+    def GenMakeHeaderFilesHash(self, gDict):
+        # Early exit if module or library has been hashed and is in memory
+        if (self.MetaFile.Path, self.Arch) in gDict and \
+          gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashDigest:
+            return gDict[(self.MetaFile.Path, self.Arch)]
+
+        # skip binary module
+        if self.IsBinaryModule:
+            return
+
+        if not (self.MetaFile.Path, self.Arch) in gDict or \
+           not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone:
+            if self.IsLibrary:
+                if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.M
etaFile.Path) in GlobalData.libConstPcd: + self.ConstPcd =3D GlobalData.libConstPcd[(self.MetaFil= e.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)] + if (self.MetaFile.File,self.MetaFile.Root,self.Arch,self.M= etaFile.Path) in GlobalData.Refes: + self.ReferenceModules =3D GlobalData.Refes[(self.MetaF= ile.File,self.MetaFile.Root,self.Arch,self.MetaFile.Path)] + self.CreateCodeFile() + if not (self.MetaFile.Path, self.Arch) in gDict or \ + not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone: + self.CreateMakeFile(GenFfsList=3DGlobalData.FfsCmd.get((self.M= etaFile.File, self.Arch),[])) + + if not (self.MetaFile.Path, self.Arch) in gDict or \ + not gDict[(self.MetaFile.Path, self.Arch)].CreateCodeFileDone o= r \ + not gDict[(self.MetaFile.Path, self.Arch)].CreateMakeFileDone: + EdkLogger.quiet("[cache warning]: Cannot create CodeFile or Mak= efile for module %s[%s]" %(self.MetaFile.Path, self.Arch)) + return + + DependencyFileSet =3D set() + # Add Makefile + if gDict[(self.MetaFile.Path, self.Arch)].MakefilePath: + DependencyFileSet.add(gDict[(self.MetaFile.Path, self.Arch)].M= akefilePath) + else: + EdkLogger.quiet("[cache warning]: makefile is missing for modu= le %s[%s]" %(self.MetaFile.Path, self.Arch)) + + # Add header files + if gDict[(self.MetaFile.Path, self.Arch)].DependencyHeaderFileSet: + for File in gDict[(self.MetaFile.Path, self.Arch)].DependencyH= eaderFileSet: + DependencyFileSet.add(File) + else: + EdkLogger.quiet("[cache warning]: No dependency header found f= or module %s[%s]" %(self.MetaFile.Path, self.Arch)) + + # Add AutoGen files + if self.AutoGenFileList: + for File in set(self.AutoGenFileList): + DependencyFileSet.add(File) + + # Caculate all above dependency files hash + # Initialze hash object + FileList =3D [] + m =3D hashlib.md5() + for File in sorted(DependencyFileSet, key=3Dlambda x: str(x)): + if not os.path.exists(str(File)): + EdkLogger.quiet("[cache warning]: header file: %s doesn't = exist for module: %s[%s]" % 
(File, self.MetaFile.Path, self.Arch)) + continue + f =3D open(str(File), 'rb') + Content =3D f.read() + f.close() + m.update(Content) + FileList.append((str(File), hashlib.md5(Content).hexdigest())) + + with GlobalData.file_lock: + IR =3D gDict[(self.MetaFile.Path, self.Arch)] + IR.AutoGenFileList =3D self.AutoGenFileList.keys() + IR.MakeHeaderFilesHashChain =3D FileList + IR.MakeHeaderFilesHashDigest =3D m.digest() + gDict[(self.MetaFile.Path, self.Arch)] =3D IR + + return gDict[(self.MetaFile.Path, self.Arch)] + + def GenMakeHash(self, gDict): + # Early exit if module or library has been hashed and is in memory + if (self.MetaFile.Path, self.Arch) in gDict and \ + gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain: + return gDict[(self.MetaFile.Path, self.Arch)] + + # skip binary module + if self.IsBinaryModule: + return + + if not (self.MetaFile.Path, self.Arch) in gDict or \ + not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDiges= t: + self.GenModuleFilesHash(gDict) + if not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashD= igest: + self.GenMakeHeaderFilesHash(gDict) + + if not (self.MetaFile.Path, self.Arch) in gDict or \ + not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDiges= t or \ + not gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesChain or \ + not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashD= igest or \ + not gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHashC= hain: + EdkLogger.quiet("[cache warning]: Cannot generate ModuleFilesHa= sh or MakeHeaderFilesHash for module %s[%s]" %(self.MetaFile.Path, self.Arc= h)) + return + + # Initialze hash object + m =3D hashlib.md5() + MakeHashChain =3D [] + + # Add hash of makefile and dependency header files + m.update(gDict[(self.MetaFile.Path, self.Arch)].MakeHeaderFilesHas= hDigest) + New =3D list(set(gDict[(self.MetaFile.Path, self.Arch)].MakeHeader= FilesHashChain) - set(MakeHashChain)) + New.sort(key=3Dlambda x: str(x)) + MakeHashChain +=3D New + + # 
Add Library hash + if self.LibraryAutoGenList: + for Lib in sorted(self.LibraryAutoGenList, key=3Dlambda x: x.N= ame): + if not (Lib.MetaFile.Path, Lib.Arch) in gDict or \ + not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashChain: + Lib.GenMakeHash(gDict) + if not gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDigest: + print("Cannot generate MakeHash for lib module:", Lib.= MetaFile.Path, Lib.Arch) + continue + m.update(gDict[(Lib.MetaFile.Path, Lib.Arch)].MakeHashDige= st) + New =3D list(set(gDict[(Lib.MetaFile.Path, Lib.Arch)].Make= HashChain) - set(MakeHashChain)) + New.sort(key=3Dlambda x: str(x)) + MakeHashChain +=3D New + + # Add Module self + m.update(gDict[(self.MetaFile.Path, self.Arch)].ModuleFilesHashDig= est) + New =3D list(set(gDict[(self.MetaFile.Path, self.Arch)].ModuleFile= sChain) - set(MakeHashChain)) + New.sort(key=3Dlambda x: str(x)) + MakeHashChain +=3D New + + with GlobalData.file_lock: + IR =3D gDict[(self.MetaFile.Path, self.Arch)] + IR.MakeHashDigest =3D m.digest() + IR.MakeHashHexDigest =3D m.hexdigest() + IR.MakeHashChain =3D MakeHashChain + gDict[(self.MetaFile.Path, self.Arch)] =3D IR + + return gDict[(self.MetaFile.Path, self.Arch)] + + ## Decide whether we can skip the left autogen and make process + def CanSkipbyPreMakefileCache(self, gDict): + if not GlobalData.gBinCacheSource: + return False + + # If Module is binary, do not skip by cache + if self.IsBinaryModule: + return False + + # .inc is contains binary information so do not skip by hash as we= ll + for f_ext in self.SourceFileList: + if '.inc' in str(f_ext): + return False + + # Get the module hash values from stored cache and currrent build + # then check whether cache hit based on the hash values + # if cache hit, restore all the files from cache + FileDir =3D path.join(GlobalData.gBinCacheSource, self.PlatformInf= o.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.Sourc= eDir, self.MetaFile.BaseName) + FfsDir =3D path.join(GlobalData.gBinCacheSource, 
self.PlatformInfo= .OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs= ", self.Guid + self.Name) + + ModuleHashPairList =3D [] # tuple list: [tuple(PreMakefileHash, Ma= keHash)] + ModuleHashPair =3D path.join(FileDir, self.Name + ".ModuleHashPair= ") + if not os.path.exists(ModuleHashPair): + EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair f= ile: %s" % ModuleHashPair) + return False + + try: + f =3D open(ModuleHashPair, 'r') + ModuleHashPairList =3D json.load(f) + f.close() + except: + EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair = file: %s" % ModuleHashPair) + return False + + self.GenPreMakefileHash(gDict) + if not (self.MetaFile.Path, self.Arch) in gDict or \ + not gDict[(self.MetaFile.Path, self.Arch)].PreMakefileHashHexDi= gest: + EdkLogger.quiet("[cache warning]: PreMakefileHashHexDigest is = missing for module %s[%s]" %(self.MetaFile.Path, self.Arch)) + return False + + MakeHashStr =3D None + CurrentPreMakeHash =3D gDict[(self.MetaFile.Path, self.Arch)].PreM= akefileHashHexDigest + for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairL= ist): + if PreMakefileHash =3D=3D CurrentPreMakeHash: + MakeHashStr =3D str(MakeHash) + + if not MakeHashStr: + return False + + TargetHashDir =3D path.join(FileDir, MakeHashStr) + TargetFfsHashDir =3D path.join(FfsDir, MakeHashStr) + + if not os.path.exists(TargetHashDir): + EdkLogger.quiet("[cache warning]: Cache folder is missing: %s"= % TargetHashDir) + return False + + for root, dir, files in os.walk(TargetHashDir): + for f in files: + File =3D path.join(root, f) + self.CacheCopyFile(self.OutputDir, TargetHashDir, File) + if os.path.exists(TargetFfsHashDir): + for root, dir, files in os.walk(TargetFfsHashDir): + for f in files: + File =3D path.join(root, f) + self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir= , File) + + if self.Name =3D=3D "PcdPeim" or self.Name =3D=3D "PcdDxe": + CreatePcdDatabaseCode(self, TemplateString(), TemplateString()) + + 
with GlobalData.file_lock: + IR =3D gDict[(self.MetaFile.Path, self.Arch)] + IR.PreMakeCacheHit =3D True + gDict[(self.MetaFile.Path, self.Arch)] =3D IR + print("[cache hit]: checkpoint_PreMakefile:", self.MetaFile.Path, = self.Arch) + #EdkLogger.quiet("cache hit: %s[%s]" % (self.MetaFile.Path, self.A= rch)) + return True + + ## Decide whether we can skip the make process + def CanSkipbyMakeCache(self, gDict): + if not GlobalData.gBinCacheSource: + return False + + # If Module is binary, do not skip by cache + if self.IsBinaryModule: + print("[cache miss]: checkpoint_Makefile: binary module:", sel= f.MetaFile.Path, self.Arch) + return False + + # .inc is contains binary information so do not skip by hash as we= ll + for f_ext in self.SourceFileList: + if '.inc' in str(f_ext): + with GlobalData.file_lock: + IR =3D gDict[(self.MetaFile.Path, self.Arch)] + IR.MakeCacheHit =3D False + gDict[(self.MetaFile.Path, self.Arch)] =3D IR + print("[cache miss]: checkpoint_Makefile: .inc module:", s= elf.MetaFile.Path, self.Arch) + return False + + # Get the module hash values from stored cache and currrent build + # then check whether cache hit based on the hash values + # if cache hit, restore all the files from cache + FileDir =3D path.join(GlobalData.gBinCacheSource, self.PlatformInf= o.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.Sourc= eDir, self.MetaFile.BaseName) + FfsDir =3D path.join(GlobalData.gBinCacheSource, self.PlatformInfo= .OutputDir, self.BuildTarget + "_" + self.ToolChain, TAB_FV_DIRECTORY, "Ffs= ", self.Guid + self.Name) + + ModuleHashPairList =3D [] # tuple list: [tuple(PreMakefileHash, Ma= keHash)] + ModuleHashPair =3D path.join(FileDir, self.Name + ".ModuleHashPair= ") + if not os.path.exists(ModuleHashPair): + EdkLogger.quiet("[cache warning]: Cannot find ModuleHashPair f= ile: %s" % ModuleHashPair) + return False + + try: + f =3D open(ModuleHashPair, 'r') + ModuleHashPairList =3D json.load(f) + f.close() + except: + 
EdkLogger.quiet("[cache warning]: fail to load ModuleHashPair = file: %s" % ModuleHashPair) + return False + + self.GenMakeHash(gDict) + if not (self.MetaFile.Path, self.Arch) in gDict or \ + not gDict[(self.MetaFile.Path, self.Arch)].MakeHashHexDigest: + EdkLogger.quiet("[cache warning]: MakeHashHexDigest is missing= for module %s[%s]" %(self.MetaFile.Path, self.Arch)) + return False + + MakeHashStr =3D None + CurrentMakeHash =3D gDict[(self.MetaFile.Path, self.Arch)].MakeHas= hHexDigest + for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairL= ist): + if MakeHash =3D=3D CurrentMakeHash: + MakeHashStr =3D str(MakeHash) + + if not MakeHashStr: + print("[cache miss]: checkpoint_Makefile:", self.MetaFile.Path= , self.Arch) + return False + + TargetHashDir =3D path.join(FileDir, MakeHashStr) + TargetFfsHashDir =3D path.join(FfsDir, MakeHashStr) + if not os.path.exists(TargetHashDir): + EdkLogger.quiet("[cache warning]: Cache folder is missing: %s"= % TargetHashDir) + return False + + for root, dir, files in os.walk(TargetHashDir): + for f in files: + File =3D path.join(root, f) + self.CacheCopyFile(self.OutputDir, TargetHashDir, File) + + if os.path.exists(TargetFfsHashDir): + for root, dir, files in os.walk(TargetFfsHashDir): + for f in files: + File =3D path.join(root, f) + self.CacheCopyFile(self.FfsOutputDir, TargetFfsHashDir= , File) + + if self.Name =3D=3D "PcdPeim" or self.Name =3D=3D "PcdDxe": + CreatePcdDatabaseCode(self, TemplateString(), TemplateString()) + with GlobalData.file_lock: + IR =3D gDict[(self.MetaFile.Path, self.Arch)] + IR.MakeCacheHit =3D True + gDict[(self.MetaFile.Path, self.Arch)] =3D IR + print("[cache hit]: checkpoint_Makefile:", self.MetaFile.Path, sel= f.Arch) + return True + ## Decide whether we can skip the ModuleAutoGen process - def CanSkipbyHash(self): + def CanSkipbyCache(self, gDict): # Hashing feature is off - if not GlobalData.gUseHashCache: + if not GlobalData.gBinCacheSource: return False =20 - # Initialize a 
dictionary for each arch type - if self.Arch not in GlobalData.gBuildHashSkipTracking: - GlobalData.gBuildHashSkipTracking[self.Arch] =3D dict() + if self in GlobalData.gBuildHashSkipTracking: + return GlobalData.gBuildHashSkipTracking[self] =20 # If library or Module is binary do not skip by hash if self.IsBinaryModule: + GlobalData.gBuildHashSkipTracking[self] =3D False return False =20 # .inc is contains binary information so do not skip by hash as we= ll for f_ext in self.SourceFileList: if '.inc' in str(f_ext): + GlobalData.gBuildHashSkipTracking[self] =3D False return False =20 - # Use Cache, if exists and if Module has a copy in cache - if GlobalData.gBinCacheSource and self.AttemptModuleCacheCopy(): + if not (self.MetaFile.Path, self.Arch) in gDict: + return False + + if gDict[(self.MetaFile.Path, self.Arch)].PreMakeCacheHit: + GlobalData.gBuildHashSkipTracking[self] =3D True return True =20 - # Early exit for libraries that haven't yet finished building - HashFile =3D path.join(self.BuildDir, self.Name + ".hash") - if self.IsLibrary and not os.path.exists(HashFile): - return False + if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit: + GlobalData.gBuildHashSkipTracking[self] =3D True + return True =20 - # Return a Boolean based on if can skip by hash, either from memor= y or from IO. - if self.Name not in GlobalData.gBuildHashSkipTracking[self.Arch]: - # If hashes are the same, SaveFileOnChange() will return False. 
- GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] =3D no= t SaveFileOnChange(HashFile, self.GenModuleHash(), True) - return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] - else: - return GlobalData.gBuildHashSkipTracking[self.Arch][self.Name] + return False =20 ## Decide whether we can skip the ModuleAutoGen process # If any source file is newer than the module than we cannot skip # def CanSkip(self): + # Don't skip if cache feature enabled + if GlobalData.gUseHashCache or GlobalData.gBinCacheDest or GlobalD= ata.gBinCacheSource: + return False + if self.MakeFileDir in GlobalData.gSikpAutoGenCache: return True if not os.path.exists(self.TimeStampPath): diff --git a/BaseTools/Source/Python/Common/GlobalData.py b/BaseTools/Sourc= e/Python/Common/GlobalData.py old mode 100644 new mode 100755 index bd45a43728..df10814f04 --- a/BaseTools/Source/Python/Common/GlobalData.py +++ b/BaseTools/Source/Python/Common/GlobalData.py @@ -119,3 +119,12 @@ gModuleBuildTracking =3D dict() # Top Dict: Key: Arch Type Value: Dictionary # Second Dict: Key: Module\Library Name Value: True\False gBuildHashSkipTracking =3D dict() + +# Common dictionary to share module cache intermediate result and state +gCacheIR =3D None +# Common lock for the multiple process AutoGens +file_lock =3D None +# Common dictionary to share platform libraries' constant Pcd +libConstPcd =3D None +# Common dictionary to share platform libraries' reference info +Refes =3D None \ No newline at end of file diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Pyth= on/build/build.py old mode 100644 new mode 100755 index 4de3f43c27..84540d61f5 --- a/BaseTools/Source/Python/build/build.py +++ b/BaseTools/Source/Python/build/build.py @@ -595,7 +595,7 @@ class BuildTask: # def AddDependency(self, Dependency): for Dep in Dependency: - if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.= CanSkipbyHash(): + if not Dep.BuildObject.IsBinaryModule and not Dep.BuildObject.= 
CanSkipbyCache(GlobalData.gCacheIR): self.DependencyList.append(BuildTask.New(Dep)) # BuildT= ask list =20 ## The thread wrapper of LaunchCommand function @@ -811,7 +811,7 @@ class Build(): self.AutoGenMgr =3D None EdkLogger.info("") os.chdir(self.WorkspaceDir) - self.share_data =3D Manager().dict() + GlobalData.gCacheIR =3D Manager().dict() self.log_q =3D log_q def StartAutoGen(self,mqueue, DataPipe,SkipAutoGen,PcdMaList,share_dat= a): try: @@ -820,6 +820,13 @@ class Build(): feedback_q =3D mp.Queue() file_lock =3D mp.Lock() error_event =3D mp.Event() + GlobalData.file_lock =3D file_lock + FfsCmd =3D DataPipe.Get("FfsCommand") + if FfsCmd is None: + FfsCmd =3D {} + GlobalData.FfsCmd =3D FfsCmd + GlobalData.libConstPcd =3D DataPipe.Get("LibConstPcd") + GlobalData.Refes =3D DataPipe.Get("REFS") auto_workers =3D [AutoGenWorkerInProcess(mqueue,DataPipe.dump_= file,feedback_q,file_lock,share_data,self.log_q,error_event) for _ in range= (self.ThreadNumber)] self.AutoGenMgr =3D AutoGenManager(auto_workers,feedback_q,err= or_event) self.AutoGenMgr.start() @@ -827,9 +834,21 @@ class Build(): w.start() if PcdMaList is not None: for PcdMa in PcdMaList: + if GlobalData.gBinCacheSource and self.Target in [None= , "", "all"]: + PcdMa.GenModuleFilesHash(share_data) + PcdMa.GenPreMakefileHash(share_data) + if PcdMa.CanSkipbyPreMakefileCache(share_data): + continue + PcdMa.CreateCodeFile(False) PcdMa.CreateMakeFile(False,GenFfsList =3D DataPipe.Get= ("FfsCommand").get((PcdMa.MetaFile.File, PcdMa.Arch),[])) =20 + if GlobalData.gBinCacheSource and self.Target in [None= , "", "all"]: + PcdMa.GenMakeHeaderFilesHash(share_data) + PcdMa.GenMakeHash(share_data) + if PcdMa.CanSkipbyMakeCache(share_data): + continue + self.AutoGenMgr.join() rt =3D self.AutoGenMgr.Status return rt, 0 @@ -1199,10 +1218,11 @@ class Build(): mqueue.put(m) =20 AutoGenObject.DataPipe.DataContainer =3D {"FfsCommand":FfsComm= and} + AutoGenObject.DataPipe.DataContainer =3D {"CommandTarget": sel= f.Target} 
self.Progress.Start("Generating makefile and code") data_pipe_file =3D os.path.join(AutoGenObject.BuildDir, "Globa= lVar_%s_%s.bin" % (str(AutoGenObject.Guid),AutoGenObject.Arch)) AutoGenObject.DataPipe.dump(data_pipe_file) - autogen_rt, errorcode =3D self.StartAutoGen(mqueue, AutoGenObj= ect.DataPipe, self.SkipAutoGen, PcdMaList,self.share_data) + autogen_rt,errorcode =3D self.StartAutoGen(mqueue, AutoGenObje= ct.DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR) self.Progress.Stop("done!") if not autogen_rt: self.AutoGenMgr.TerminateWorkers() @@ -1799,6 +1819,15 @@ class Build(): CmdListDict =3D None if GlobalData.gEnableGenfdsMultiThread and self.Fdf: CmdListDict =3D self._GenFfsCmd(Wa.ArchList) + + # Add Platform and Package level hash in share_data for mo= dule hash calculation later + if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest: + GlobalData.gCacheIR[('PlatformHash')] =3D GlobalData.g= PlatformHash + for PkgName in GlobalData.gPackageHash.keys(): + GlobalData.gCacheIR[(PkgName, 'PackageHash')] =3D = GlobalData.gPackageHash[PkgName] + GlobalData.file_lock =3D mp.Lock() + GlobalData.FfsCmd =3D CmdListDict + self.Progress.Stop("done!") MaList =3D [] ExitFlag =3D threading.Event() @@ -1808,20 +1837,23 @@ class Build(): AutoGenStart =3D time.time() GlobalData.gGlobalDefines['ARCH'] =3D Arch Pa =3D PlatformAutoGen(Wa, self.PlatformFile, BuildTar= get, ToolChain, Arch) + GlobalData.libConstPcd =3D Pa.DataPipe.Get("LibConstPc= d") + GlobalData.Refes =3D Pa.DataPipe.Get("REFS") for Module in Pa.Platform.Modules: if self.ModuleFile.Dir =3D=3D Module.Dir and self.= ModuleFile.Name =3D=3D Module.Name: Ma =3D ModuleAutoGen(Wa, Module, BuildTarget, = ToolChain, Arch, self.PlatformFile,Pa.DataPipe) if Ma is None: continue MaList.append(Ma) - if Ma.CanSkipbyHash(): - self.HashSkipModules.append(Ma) - if GlobalData.gBinCacheSource: - EdkLogger.quiet("cache hit: %s[%s]" % = (Ma.MetaFile.Path, Ma.Arch)) - continue - else: - if GlobalData.gBinCacheSource: 
- EdkLogger.quiet("cache miss: %s[%s]" %= (Ma.MetaFile.Path, Ma.Arch)) + + if GlobalData.gBinCacheSource and self.Target = in [None, "", "all"]: + Ma.GenModuleFilesHash(GlobalData.gCacheIR) + Ma.GenPreMakefileHash(GlobalData.gCacheIR) + if Ma.CanSkipbyPreMakefileCache(GlobalData= .gCacheIR): + self.HashSkipModules.append(Ma) + EdkLogger.quiet("cache hit: %s[%s]" % (= Ma.MetaFile.Path, Ma.Arch)) + continue + # Not to auto-gen for targets 'clean', 'cleanl= ib', 'cleanall', 'run', 'fds' if self.Target not in ['clean', 'cleanlib', 'c= leanall', 'run', 'fds']: # for target which must generate AutoGen c= ode and makefile @@ -1841,6 +1873,18 @@ class Build(): self.Progress.Stop("done!") if self.Target =3D=3D "genmake": return True + + if GlobalData.gBinCacheSource and self.Tar= get in [None, "", "all"]: + Ma.GenMakeHeaderFilesHash(GlobalData.g= CacheIR) + Ma.GenMakeHash(GlobalData.gCacheIR) + if Ma.CanSkipbyMakeCache(GlobalData.gC= acheIR): + self.HashSkipModules.append(Ma) + EdkLogger.quiet("cache hit: %s[%s]= " % (Ma.MetaFile.Path, Ma.Arch)) + continue + else: + EdkLogger.quiet("cache miss: %s[%s= ]" % (Ma.MetaFile.Path, Ma.Arch)) + Ma.PrintFirstMakeCacheMissFile(Glo= balData.gCacheIR) + self.BuildModules.append(Ma) # Initialize all modules in tracking to 'FAIL' if Ma.Arch not in GlobalData.gModuleBuildTrack= ing: @@ -1985,11 +2029,18 @@ class Build(): if GlobalData.gEnableGenfdsMultiThread and self.Fdf: CmdListDict =3D self._GenFfsCmd(Wa.ArchList) =20 + # Add Platform and Package level hash in share_data for mo= dule hash calculation later + if GlobalData.gBinCacheSource or GlobalData.gBinCacheDest: + GlobalData.gCacheIR[('PlatformHash')] =3D GlobalData.g= PlatformHash + for PkgName in GlobalData.gPackageHash.keys(): + GlobalData.gCacheIR[(PkgName, 'PackageHash')] =3D = GlobalData.gPackageHash[PkgName] + # multi-thread exit flag ExitFlag =3D threading.Event() ExitFlag.clear() self.AutoGenTime +=3D int(round((time.time() - WorkspaceAu= toGenTime))) self.BuildModules =3D 
[] + TotalModules =3D [] for Arch in Wa.ArchList: PcdMaList =3D [] AutoGenStart =3D time.time() @@ -2009,6 +2060,7 @@ class Build(): ModuleList.append(Inf) Pa.DataPipe.DataContainer =3D {"FfsCommand":CmdListDic= t} Pa.DataPipe.DataContainer =3D {"Workspace_timestamp": = Wa._SrcTimeStamp} + Pa.DataPipe.DataContainer =3D {"CommandTarget": self.T= arget} for Module in ModuleList: # Get ModuleAutoGen object to generate C code file= and makefile Ma =3D ModuleAutoGen(Wa, Module, BuildTarget, Tool= Chain, Arch, self.PlatformFile,Pa.DataPipe) @@ -2019,19 +2071,7 @@ class Build(): Ma.PlatformInfo =3D Pa Ma.Workspace =3D Wa PcdMaList.append(Ma) - if Ma.CanSkipbyHash(): - self.HashSkipModules.append(Ma) - if GlobalData.gBinCacheSource: - EdkLogger.quiet("cache hit: %s[%s]" % (Ma.= MetaFile.Path, Ma.Arch)) - continue - else: - if GlobalData.gBinCacheSource: - EdkLogger.quiet("cache miss: %s[%s]" % (Ma= .MetaFile.Path, Ma.Arch)) - - # Not to auto-gen for targets 'clean', 'cleanlib',= 'cleanall', 'run', 'fds' - # for target which must generate AutoGen code = and makefile - - self.BuildModules.append(Ma) + TotalModules.append(Ma) # Initialize all modules in tracking to 'FAIL' if Ma.Arch not in GlobalData.gModuleBuildTracking: GlobalData.gModuleBuildTracking[Ma.Arch] =3D d= ict() @@ -2042,7 +2082,22 @@ class Build(): mqueue.put(m) data_pipe_file =3D os.path.join(Pa.BuildDir, "GlobalVa= r_%s_%s.bin" % (str(Pa.Guid),Pa.Arch)) Pa.DataPipe.dump(data_pipe_file) - autogen_rt, errorcode =3D self.StartAutoGen(mqueue, Pa= .DataPipe, self.SkipAutoGen, PcdMaList,self.share_data) + autogen_rt, errorcode =3D self.StartAutoGen(mqueue, Pa= .DataPipe, self.SkipAutoGen, PcdMaList, GlobalData.gCacheIR) + + # Skip cache hit modules + if GlobalData.gBinCacheSource: + for Ma in TotalModules: + if (Ma.MetaFile.Path, Ma.Arch) in GlobalData.g= CacheIR and \ + GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.= Arch)].PreMakeCacheHit: + self.HashSkipModules.append(Ma) + continue + if (Ma.MetaFile.Path, Ma.Arch) 
in GlobalData.g= CacheIR and \ + GlobalData.gCacheIR[(Ma.MetaFile.Path, Ma.= Arch)].MakeCacheHit: + self.HashSkipModules.append(Ma) + continue + self.BuildModules.append(Ma) + else: + self.BuildModules.extend(TotalModules) =20 if not autogen_rt: self.AutoGenMgr.TerminateWorkers() @@ -2050,9 +2105,24 @@ class Build(): raise FatalError(errorcode) self.AutoGenTime +=3D int(round((time.time() - AutoGenStar= t))) self.Progress.Stop("done!") + + if GlobalData.gBinCacheSource: + EdkLogger.quiet("Total cache hit driver num: %s, cache= miss driver num: %s" % (len(set(self.HashSkipModules)), len(set(self.Build= Modules)))) + CacheHitMa =3D set() + CacheNotHitMa =3D set() + for IR in GlobalData.gCacheIR.keys(): + if 'PlatformHash' in IR or 'PackageHash' in IR: + continue + if GlobalData.gCacheIR[IR].PreMakeCacheHit or Glob= alData.gCacheIR[IR].MakeCacheHit: + CacheHitMa.add(IR) + else: + # There might be binary module or module which= has .inc files, not count for cache miss + CacheNotHitMa.add(IR) + EdkLogger.quiet("Total module num: %s, cache hit modul= e num: %s" % (len(CacheHitMa)+len(CacheNotHitMa), len(CacheHitMa))) + for Arch in Wa.ArchList: MakeStart =3D time.time() - for Ma in self.BuildModules: + for Ma in set(self.BuildModules): # Generate build task for the module if not Ma.IsBinaryModule: Bt =3D BuildTask.New(ModuleMakeUnit(Ma, Pa.Bui= ldCommand,self.Target)) --=20 2.17.1 -=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D- Groups.io Links: You receive all messages sent to this group. 
View/Reply Online (#45494): https://edk2.groups.io/g/devel/message/45494
Mute This Topic: https://groups.io/mt/32849281/1787277
Group Owner: devel+owner@edk2.groups.io
Unsubscribe: https://edk2.groups.io/g/devel/unsub [importer@patchew.org]
-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-=3D-

From nobody Fri Apr 19 18:19:06 2024
From: "Steven Shi"
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com, christian.rodriguez@intel.com, michael.johnson@intel.com, "Shi, Steven"
Subject: [edk2-devel] [PATCH v2 2/4] BaseTools: Print first cache missing file for build cache
Date: Tue, 13 Aug 2019 11:46:52 +0800
Message-Id: <20190813034654.25716-3-steven.shi@intel.com>
In-Reply-To: <20190813034654.25716-1-steven.shi@intel.com>
References: <20190813034654.25716-1-steven.shi@intel.com>
Content-Transfer-Encoding: quoted-printable
MIME-Version: 1.0
Content-Type: text/plain; charset="utf-8"

From: "Shi, Steven"

BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=3D1925

When a module build cache misses, add support to print the first cache missing file path and name.
Cc: Liming Gao
Cc: Bob Feng
Signed-off-by: Steven Shi
---
 BaseTools/Source/Python/AutoGen/AutoGenWorker.py |  2 ++
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 76 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
 2 files changed, 78 insertions(+)

diff --git a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
index a84ed46f2e..30d2f96fc7 100755
--- a/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
+++ b/BaseTools/Source/Python/AutoGen/AutoGenWorker.py
@@ -246,6 +246,8 @@ class AutoGenWorkerInProcess(mp.Process):
                     Ma.GenMakeHash(GlobalData.gCacheIR)
                     if Ma.CanSkipbyMakeCache(GlobalData.gCacheIR):
                         continue
+                    else:
+                        Ma.PrintFirstMakeCacheMissFile(GlobalData.gCacheIR)
             except Empty:
                 pass
             except:
diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index 6cb7cae1c7..a73a8a53a0 100755
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -2365,6 +2365,82 @@ class ModuleAutoGen(AutoGen):
         print("[cache hit]: checkpoint_Makefile:", self.MetaFile.Path, self.Arch)
         return True

+    ## Show the first file name which causes cache miss
+    def PrintFirstMakeCacheMissFile(self, gDict):
+        if not GlobalData.gBinCacheSource:
+            return
+
+        # skip binary module
+        if self.IsBinaryModule:
+            return
+
+        if not (self.MetaFile.Path, self.Arch) in gDict:
+            return
+
+        # Only print cache miss file for the MakeCache not hit module
+        if gDict[(self.MetaFile.Path, self.Arch)].MakeCacheHit:
+            return
+
+        if not gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain:
+            EdkLogger.quiet("[cache insight]: MakeHashChain is missing for: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return
+
+        # Find the cache dir name through the .ModuleHashPair file info
+        FileDir = path.join(GlobalData.gBinCacheSource, self.PlatformInfo.OutputDir, self.BuildTarget + "_" + self.ToolChain, self.Arch, self.SourceDir, self.MetaFile.BaseName)
+
+        ModuleHashPairList = [] # tuple list: [tuple(PreMakefileHash, MakeHash)]
+        ModuleHashPair = path.join(FileDir, self.Name + ".ModuleHashPair")
+        if not os.path.exists(ModuleHashPair):
+            EdkLogger.quiet("[cache insight]: Cannot find ModuleHashPair file for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return
+
+        try:
+            f = open(ModuleHashPair, 'r')
+            ModuleHashPairList = json.load(f)
+            f.close()
+        except:
+            EdkLogger.quiet("[cache insight]: Cannot load ModuleHashPair file for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return
+
+        MakeHashSet = set()
+        for idx, (PreMakefileHash, MakeHash) in enumerate (ModuleHashPairList):
+            TargetHashDir = path.join(FileDir, str(MakeHash))
+            if os.path.exists(TargetHashDir):
+                MakeHashSet.add(MakeHash)
+        if not MakeHashSet:
+            EdkLogger.quiet("[cache insight]: Cannot find valid cache dir for module: %s[%s]" % (self.MetaFile.Path, self.Arch))
+            return
+
+        TargetHash = list(MakeHashSet)[0]
+        TargetHashDir = path.join(FileDir, str(TargetHash))
+        if len(MakeHashSet) > 1:
+            EdkLogger.quiet("[cache insight]: found multiple cache dirs for this module, random select dir '%s' to search the first cache miss file: %s[%s]" % (TargetHash, self.MetaFile.Path, self.Arch))
+
+        ListFile = path.join(TargetHashDir, self.Name + '.MakeHashChain')
+        if os.path.exists(ListFile):
+            try:
+                f = open(ListFile, 'r')
+                CachedList = json.load(f)
+                f.close()
+            except:
+                EdkLogger.quiet("[cache insight]: Cannot load MakeHashChain file: %s" % ListFile)
+                return
+        else:
+            EdkLogger.quiet("[cache insight]: Cannot find MakeHashChain file: %s" % ListFile)
+            return
+
+        CurrentList = gDict[(self.MetaFile.Path, self.Arch)].MakeHashChain
+        for idx, (file, hash) in enumerate (CurrentList):
+            (filecached, hashcached) = CachedList[idx]
+            if file != filecached:
+                EdkLogger.quiet("[cache insight]: first different file in %s[%s] is %s, the cached one is %s" % (self.MetaFile.Path, self.Arch, file, filecached))
+                break
+            if hash != hashcached:
+                EdkLogger.quiet("[cache insight]: first cache miss file in %s[%s] is %s" % (self.MetaFile.Path, self.Arch, file))
+                break
+
+        return True
+
     ## Decide whether we can skip the ModuleAutoGen process
     def CanSkipbyCache(self, gDict):
         # Hashing feature is off
--
2.17.1

-=-=-=-=-=-=-=-=-=-=-=-

View/Reply Online (#45495): https://edk2.groups.io/g/devel/message/45495
Mute This Topic: https://groups.io/mt/32849282/1787277
-=-=-=-=-=-=-=-=-=-=-=-
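The core of the diagnostic above is walking two parallel (file, hash) chains and reporting the first position where they diverge: either a different file name appears, or the same file carries a different hash. A minimal standalone sketch of that comparison (`find_first_miss` and the sample chains are illustrative, not part of BaseTools):

```python
# Sketch of the first-cache-miss search: walk two parallel (file, hash)
# chains and report the first entry that differs.
def find_first_miss(current_chain, cached_chain):
    """Return (kind, file) for the first divergence, or None on a full match."""
    for (file, hash), (filecached, hashcached) in zip(current_chain, cached_chain):
        if file != filecached:
            # The chains list different files at this position
            return ('different file', file)
        if hash != hashcached:
            # Same file, but its content hash changed since it was cached
            return ('changed hash', file)
    return None

current = [('Module.inf', 'aaa'), ('Driver.c', 'bbb'), ('Driver.h', 'ccc')]
cached  = [('Module.inf', 'aaa'), ('Driver.c', 'ddd'), ('Driver.h', 'ccc')]
print(find_first_miss(current, cached))  # → ('changed hash', 'Driver.c')
```

Like the patch, the sketch stops at the first mismatch, since every later hash in the chain is built on top of it and would differ too.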
From: "Steven Shi"
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com, christian.rodriguez@intel.com, michael.johnson@intel.com, "Shi, Steven"
Subject: [edk2-devel] [PATCH v2 3/4] BaseTools: Change the [Arch][Name] module key in Build cache
Date: Tue, 13 Aug 2019 11:46:53 +0800
Message-Id: <20190813034654.25716-4-steven.shi@intel.com>
In-Reply-To: <20190813034654.25716-1-steven.shi@intel.com>
References: <20190813034654.25716-1-steven.shi@intel.com>

From: "Shi, Steven"

BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1951

The build cache currently uses the module's [self.Arch][self.Name] pair
as the ModuleAutoGen object key in its hash lists and dictionaries.
[self.Arch][self.Name] is not a safe module key because one platform can
contain two modules with the same module name and arch name. E.g. a
platform can override a module or library instance with one from a
different path, and the overriding module can have the same module name
and arch name as the original. Directly use the ModuleAutoGen object
itself as the key instead, because the object's __hash__ and __repr__
attributes already contain the full path and arch name.

Cc: Liming Gao
Cc: Bob Feng
Signed-off-by: Steven Shi
---
 BaseTools/Source/Python/AutoGen/GenMake.py |  6 +-----
 BaseTools/Source/Python/build/build.py     | 49 +++++++++++++++++++++----------------------------
 2 files changed, 22 insertions(+), 33 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/GenMake.py b/BaseTools/Source/Python/AutoGen/GenMake.py
index 79387856bd..de820eeb2f 100755
--- a/BaseTools/Source/Python/AutoGen/GenMake.py
+++ b/BaseTools/Source/Python/AutoGen/GenMake.py
@@ -940,16 +940,12 @@ cleanlib:
                 continue
             headerFileDependencySet.add(aFileName)

-        # Ensure that gModuleBuildTracking has been initialized per architecture
-        if self._AutoGenObject.Arch not in GlobalData.gModuleBuildTracking:
-            GlobalData.gModuleBuildTracking[self._AutoGenObject.Arch] = dict()
-
         # Check if a module dependency header file is missing from the module's MetaFile
         for aFile in headerFileDependencySet:
             if aFile in headerFilesInMetaFileSet:
                 continue
             if GlobalData.gUseHashCache:
-                GlobalData.gModuleBuildTracking[self._AutoGenObject.Arch][self._AutoGenObject] = 'FAIL_METAFILE'
+                GlobalData.gModuleBuildTracking[self._AutoGenObject] = 'FAIL_METAFILE'
                 EdkLogger.warn("build","Module MetaFile [Sources] is missing local header!",
                     ExtraData = "Local Header: " + aFile + " not found in " + self._AutoGenObject.MetaFile.Path
                     )
diff --git a/BaseTools/Source/Python/build/build.py b/BaseTools/Source/Python/build/build.py
index 84540d61f5..81f0bbb467 100755
--- a/BaseTools/Source/Python/build/build.py
+++ b/BaseTools/Source/Python/build/build.py
@@ -630,12 +630,11 @@ class BuildTask:

         # Set the value used by hash invalidation flow in GlobalData.gModuleBuildTracking to 'SUCCESS'
         # If Module or Lib is being tracked, it did not fail header check test, and built successfully
-        if (self.BuildItem.BuildObject.Arch in GlobalData.gModuleBuildTracking and
-            self.BuildItem.BuildObject in GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject.Arch] and
-            GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject.Arch][self.BuildItem.BuildObject] != 'FAIL_METAFILE' and
+        if (self.BuildItem.BuildObject in GlobalData.gModuleBuildTracking and
+            GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject] != 'FAIL_METAFILE' and
             not BuildTask._ErrorFlag.isSet()
             ):
-            GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject.Arch][self.BuildItem.BuildObject] = 'SUCCESS'
+            GlobalData.gModuleBuildTracking[self.BuildItem.BuildObject] = 'SUCCESS'

         # indicate there's a thread is available for another build task
         BuildTask._RunningQueueLock.acquire()
@@ -1169,25 +1168,24 @@ class Build():
             return

         # GlobalData.gModuleBuildTracking contains only modules or libs that cannot be skipped by hash
-        for moduleAutoGenObjArch in GlobalData.gModuleBuildTracking.keys():
-            for moduleAutoGenObj in GlobalData.gModuleBuildTracking[moduleAutoGenObjArch].keys():
-                # Skip invalidating for Successful Module/Lib builds
-                if GlobalData.gModuleBuildTracking[moduleAutoGenObjArch][moduleAutoGenObj] == 'SUCCESS':
-                    continue
+        for Ma in GlobalData.gModuleBuildTracking:
+            # Skip invalidating for Successful Module/Lib builds
+            if GlobalData.gModuleBuildTracking[Ma] == 'SUCCESS':
+                continue

-                # The module failed to build, failed to start building, or failed the header check test from this point on
+            # The module failed to build, failed to start building, or failed the header check test from this point on

-                # Remove .hash from build
-                ModuleHashFile = os.path.join(moduleAutoGenObj.BuildDir, moduleAutoGenObj.Name + ".hash")
-                if os.path.exists(ModuleHashFile):
-                    os.remove(ModuleHashFile)
+            # Remove .hash from build
+            ModuleHashFile = os.path.join(Ma.BuildDir, Ma.Name + ".hash")
+            if os.path.exists(ModuleHashFile):
+                os.remove(ModuleHashFile)

-                # Remove .hash file from cache
-                if GlobalData.gBinCacheDest:
-                    FileDir = os.path.join(GlobalData.gBinCacheDest, moduleAutoGenObj.Arch, moduleAutoGenObj.SourceDir, moduleAutoGenObj.MetaFile.BaseName)
-                    HashFile = os.path.join(FileDir, moduleAutoGenObj.Name + '.hash')
-                    if os.path.exists(HashFile):
-                        os.remove(HashFile)
+            # Remove .hash file from cache
+            if GlobalData.gBinCacheDest:
+                FileDir = os.path.join(GlobalData.gBinCacheDest, Ma.PlatformInfo.OutputDir, Ma.BuildTarget + "_" + Ma.ToolChain, Ma.Arch, Ma.SourceDir, Ma.MetaFile.BaseName)
+                HashFile = os.path.join(FileDir, Ma.Name + '.hash')
+                if os.path.exists(HashFile):
+                    os.remove(HashFile)

     ## Build a module or platform
     #
@@ -1887,10 +1885,7 @@ class Build():

                         self.BuildModules.append(Ma)
                         # Initialize all modules in tracking to 'FAIL'
-                        if Ma.Arch not in GlobalData.gModuleBuildTracking:
-                            GlobalData.gModuleBuildTracking[Ma.Arch] = dict()
-                        if Ma not in GlobalData.gModuleBuildTracking[Ma.Arch]:
-                            GlobalData.gModuleBuildTracking[Ma.Arch][Ma] = 'FAIL'
+                        GlobalData.gModuleBuildTracking[Ma] = 'FAIL'
                     self.AutoGenTime += int(round((time.time() - AutoGenStart)))
                     MakeStart = time.time()
                     for Ma in self.BuildModules:
@@ -2073,10 +2068,8 @@ class Build():
                     PcdMaList.append(Ma)
                 TotalModules.append(Ma)
                 # Initialize all modules in tracking to 'FAIL'
-                if Ma.Arch not in GlobalData.gModuleBuildTracking:
-                    GlobalData.gModuleBuildTracking[Ma.Arch] = dict()
-                if Ma not in GlobalData.gModuleBuildTracking[Ma.Arch]:
-                    GlobalData.gModuleBuildTracking[Ma.Arch][Ma] = 'FAIL'
+                GlobalData.gModuleBuildTracking[Ma] = 'FAIL'
+
                 mqueue = mp.Queue()
                 for m in Pa.GetAllModuleInfo:
                     mqueue.put(m)
--
2.17.1

-=-=-=-=-=-=-=-=-=-=-=-

View/Reply Online (#45496): https://edk2.groups.io/g/devel/message/45496
Mute This Topic: https://groups.io/mt/32849284/1787277
-=-=-=-=-=-=-=-=-=-=-=-
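The commit message of patch 3/4 argues that an [Arch][Name] key can silently merge two distinct modules, while keying on the ModuleAutoGen object itself (whose identity includes the full path) keeps them separate. A minimal sketch of that failure mode, using a hypothetical `ModuleAutoGenStub` in place of the real ModuleAutoGen class:

```python
# Two module instances can share Arch and Name while coming from different
# paths; a name-keyed dict silently collapses them into one entry, while an
# object key whose identity includes the full path keeps both.
class ModuleAutoGenStub:
    def __init__(self, path, arch, name):
        self.MetaFile, self.Arch, self.Name = path, arch, name

    def __hash__(self):
        # Identity includes the full meta-file path, mirroring the real
        # ModuleAutoGen whose __hash__/__repr__ carry path + arch
        return hash((self.MetaFile, self.Arch))

    def __eq__(self, other):
        return (self.MetaFile, self.Arch) == (other.MetaFile, other.Arch)

original = ModuleAutoGenStub('Pkg/Foo/Foo.inf', 'X64', 'Foo')
override = ModuleAutoGenStub('Platform/Override/Foo/Foo.inf', 'X64', 'Foo')

by_name = {}
by_name[(original.Arch, original.Name)] = 'FAIL'
by_name[(override.Arch, override.Name)] = 'SUCCESS'  # clobbers the first entry

by_obj = {original: 'FAIL', override: 'SUCCESS'}     # both entries survive

print(len(by_name), len(by_obj))  # → 1 2
```

With the name key, the overriding module's 'SUCCESS' state overwrites the original module's 'FAIL' state, which is exactly the tracking corruption the patch removes.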
From: "Steven Shi"
To: devel@edk2.groups.io
Cc: liming.gao@intel.com, bob.c.feng@intel.com, christian.rodriguez@intel.com, michael.johnson@intel.com, "Shi, Steven"
Subject: [edk2-devel] [PATCH v2 4/4] BaseTools: Add GenFds multi-thread support in build cache
Date: Tue, 13 Aug 2019 11:46:54 +0800
Message-Id: <20190813034654.25716-5-steven.shi@intel.com>
In-Reply-To: <20190813034654.25716-1-steven.shi@intel.com>
References: <20190813034654.25716-1-steven.shi@intel.com>

From: "Shi, Steven"

BZ: https://bugzilla.tianocore.org/show_bug.cgi?id=1923

Fix the issue that multi-threaded GenFds fails to build when the build
cache is enabled at the same time.

Cc: Liming Gao
Cc: Bob Feng
Signed-off-by: Steven Shi
---
 BaseTools/Source/Python/AutoGen/ModuleAutoGen.py | 23 +++++++++++++++++------
 1 file changed, 17 insertions(+), 6 deletions(-)

diff --git a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
index a73a8a53a0..ee8518e19c 100755
--- a/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
+++ b/BaseTools/Source/Python/AutoGen/ModuleAutoGen.py
@@ -1248,11 +1248,13 @@ class ModuleAutoGen(AutoGen):
         fStringIO.close ()
         fInputfile.close ()
         return OutputName
+
     @cached_property
     def OutputFile(self):
         retVal = set()
         OutputDir = self.OutputDir.replace('\\', '/').strip('/')
         DebugDir = self.DebugDir.replace('\\', '/').strip('/')
+        FfsOutputDir = self.FfsOutputDir.replace('\\', '/').rstrip('/')
         for Item in self.CodaTargetList:
             File = Item.Target.Path.replace('\\', '/').strip('/').replace(DebugDir, '').replace(OutputDir, '').strip('/')
             retVal.add(File)
@@ -1268,6 +1270,12 @@ class ModuleAutoGen(AutoGen):
             if File.lower().endswith('.pdb'):
                 retVal.add(File)

+        for Root, Dirs, Files in os.walk(FfsOutputDir):
+            for File in Files:
+                if File.lower().endswith('.ffs') or File.lower().endswith('.offset') or File.lower().endswith('.raw') \
+                    or File.lower().endswith('.raw.txt'):
+                    retVal.add(File)
+
         return retVal

     ## Create AsBuilt INF file the module
@@ -1638,13 +1646,16 @@ class ModuleAutoGen(AutoGen):
         for File in self.OutputFile:
             File = str(File)
             if not os.path.isabs(File):
-                File = os.path.join(self.OutputDir, File)
+                NewFile = os.path.join(self.OutputDir, File)
+                if not os.path.exists(NewFile):
+                    NewFile = os.path.join(self.FfsOutputDir, File)
+                File = NewFile
             if os.path.exists(File):
-                sub_dir = os.path.relpath(File, self.OutputDir)
-                destination_file = os.path.join(FileDir, sub_dir)
-                destination_dir = os.path.dirname(destination_file)
-                CreateDirectory(destination_dir)
-                CopyFileOnChange(File, destination_dir)
+                if File.lower().endswith('.ffs') or File.lower().endswith('.offset') or File.lower().endswith('.raw') \
+                    or File.lower().endswith('.raw.txt'):
+                    self.CacheCopyFile(FfsDir, self.FfsOutputDir, File)
+                else:
+                    self.CacheCopyFile(FileDir, self.OutputDir, File)

     def SaveHashChainFileToCache(self, gDict):
         if not GlobalData.gBinCacheDest:
--
2.17.1

-=-=-=-=-=-=-=-=-=-=-=-

View/Reply Online (#45497): https://edk2.groups.io/g/devel/message/45497
Mute This Topic: https://groups.io/mt/32849285/1787277
-=-=-=-=-=-=-=-=-=-=-=-