gyp: update gyp to 0.2.0

PR-URL: https://github.com/nodejs/node-gyp/pull/2092
Reviewed-By: Rod Vagg <rod@vagg.org>
Author: Ujjwal Sharma
Date: 2020-04-07 07:13:04 +05:30
Committed-By: Rod Vagg
Parent: 9aed6286a3
Commit: ebc34ec823
GPG key ID: C273792F7D83545D
60 changed files with 25980 additions and 22837 deletions

gyp/.flake8 (new file, +4)

@ -0,0 +1,4 @@
[flake8]
max-complexity = 10
max-line-length = 88
extend-ignore = E203,C901,E501

gyp/.github/workflows/Python_tests.yml (vendored, new file, +31)

@ -0,0 +1,31 @@
# TODO: Enable os: windows-latest
# TODO: Enable python-version: 3.5
# TODO: Enable pytest --doctest-modules
name: Python_tests
on: [push, pull_request]
jobs:
Python_tests:
runs-on: ${{ matrix.os }}
strategy:
fail-fast: false
max-parallel: 15
matrix:
os: [macos-latest, ubuntu-latest] # , windows-latest]
python-version: [2.7, 3.6, 3.7, 3.8] # 3.5,
steps:
- uses: actions/checkout@v1
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v1
with:
python-version: ${{ matrix.python-version }}
- name: Install dependencies
run: |
python -m pip install --upgrade pip
pip install -r requirements_dev.txt
- name: Lint with flake8
run: flake8 . --count --show-source --statistics
- name: Test with pytest
run: pytest
# - name: Run doctests with pytest
# run: pytest --doctest-modules

gyp/.gitignore (vendored, +143, -1)

@ -1 +1,143 @@
*.pyc
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# static files generated from Django application using `collectstatic`
media
static

gyp/AUTHORS

@ -13,3 +13,4 @@ Eric N. Vander Weele <ericvw@gmail.com>
Tom Freudenberg <th.freudenberg@gmail.com>
Julien Brianceau <jbriance@cisco.com>
Refael Ackermann <refack@gmail.com>
Ujjwal Sharma <ryzokuken@disroot.org>

gyp/CODE_OF_CONDUCT.md (new file, +4)

@ -0,0 +1,4 @@
# Code of Conduct
* [Node.js Code of Conduct](https://github.com/nodejs/admin/blob/master/CODE_OF_CONDUCT.md)
* [Node.js Moderation Policy](https://github.com/nodejs/admin/blob/master/Moderation-Policy.md)

gyp/CONTRIBUTING.md (new file, +32)

@ -0,0 +1,32 @@
# Contributing to gyp-next
## Code of Conduct
This project is bound to the [Node.js Code of Conduct](https://github.com/nodejs/admin/blob/master/CODE_OF_CONDUCT.md).
<a id="developers-certificate-of-origin"></a>
## Developer's Certificate of Origin 1.1
By making a contribution to this project, I certify that:
* (a) The contribution was created in whole or in part by me and I
have the right to submit it under the open source license
indicated in the file; or
* (b) The contribution is based upon previous work that, to the best
of my knowledge, is covered under an appropriate open source
license and I have the right under that license to submit that
work with modifications, whether created in whole or in part
by me, under the same open source license (unless I am
permitted to submit under a different license), as indicated
in the file; or
* (c) The contribution was provided directly to me by some other
person who certified (a), (b) or (c) and I have not modified
it.
* (d) I understand and agree that this project and the contribution
are public and that a record of the contribution (including all
personal information I submit with it, including my sign-off) is
maintained indefinitely and may be redistributed consistent with
this project or the open source license(s) involved.

gyp/LICENSE

@ -1,3 +1,4 @@
Copyright (c) 2019 Ujjwal Sharma. All rights reserved.
Copyright (c) 2009 Google Inc. All rights reserved.
Redistribution and use in source and binary forms, with or without

gyp/OWNERS (deleted, -1)

@ -1 +0,0 @@
*

gyp/PRESUBMIT.py (deleted, -138)

@ -1,138 +0,0 @@
# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""Top-level presubmit script for GYP.
See http://dev.chromium.org/developers/how-tos/depottools/presubmit-scripts
for more details about the presubmit API built into gcl.
"""
PYLINT_BLACKLIST = [
# TODO: fix me.
# From SCons, not done in google style.
'test/lib/TestCmd.py',
'test/lib/TestCommon.py',
'test/lib/TestGyp.py',
]
PYLINT_DISABLED_WARNINGS = [
# TODO: fix me.
# Many tests include modules they don't use.
'W0611',
# Possible unbalanced tuple unpacking with sequence.
'W0632',
# Attempting to unpack a non-sequence.
'W0633',
# Include order doesn't properly include local files?
'F0401',
# Some use of built-in names.
'W0622',
# Some unused variables.
'W0612',
# Operator not preceded/followed by space.
'C0323',
'C0322',
# Unnecessary semicolon.
'W0301',
# Unused argument.
'W0613',
# String has no effect (docstring in wrong place).
'W0105',
# map/filter on lambda could be replaced by comprehension.
'W0110',
# Use of eval.
'W0123',
# Comma not followed by space.
'C0324',
# Access to a protected member.
'W0212',
# Bad indent.
'W0311',
# Line too long.
'C0301',
# Undefined variable.
'E0602',
# Not exception type specified.
'W0702',
# No member of that name.
'E1101',
# Dangerous default {}.
'W0102',
# Cyclic import.
'R0401',
# Others, too many to sort.
'W0201', 'W0232', 'E1103', 'W0621', 'W0108', 'W0223', 'W0231',
'R0201', 'E0101', 'C0321',
# ************* Module copy
# W0104:427,12:_test.odict.__setitem__: Statement seems to have no effect
'W0104',
]
def _LicenseHeader(input_api):
# Accept any year number from 2009 to the current year.
current_year = int(input_api.time.strftime('%Y'))
allowed_years = (str(s) for s in reversed(range(2009, current_year + 1)))
years_re = '(' + '|'.join(allowed_years) + ')'
# The (c) is deprecated, but tolerate it until it's removed from all files.
return (
r'.*? Copyright (\(c\) )?%(year)s Google Inc\. All rights reserved\.\n'
r'.*? Use of this source code is governed by a BSD-style license that '
r'can be\n'
r'.*? found in the LICENSE file\.\n'
) % {
'year': years_re,
}
def CheckChangeOnUpload(input_api, output_api):
report = []
report.extend(input_api.canned_checks.PanProjectChecks(
input_api, output_api, license_header=_LicenseHeader(input_api)))
return report
def CheckChangeOnCommit(input_api, output_api):
report = []
report.extend(input_api.canned_checks.PanProjectChecks(
input_api, output_api, license_header=_LicenseHeader(input_api)))
report.extend(input_api.canned_checks.CheckTreeIsOpen(
input_api, output_api,
'http://gyp-status.appspot.com/status',
'http://gyp-status.appspot.com/current'))
import os
import sys
old_sys_path = sys.path
try:
sys.path = ['pylib', 'test/lib'] + sys.path
blacklist = PYLINT_BLACKLIST
if sys.platform == 'win32':
blacklist = [os.path.normpath(x).replace('\\', '\\\\')
for x in PYLINT_BLACKLIST]
report.extend(input_api.canned_checks.RunPylint(
input_api,
output_api,
black_list=blacklist,
disabled_warnings=PYLINT_DISABLED_WARNINGS))
finally:
sys.path = old_sys_path
return report
TRYBOTS = [
'linux_try',
'mac_try',
'win_try',
]
def GetPreferredTryMasters(_, change):
return {
'client.gyp': { t: set(['defaulttests']) for t in TRYBOTS },
}

gyp/gyp.bat (mode changed: normal file → executable file; no content changes)

gyp/gyp_main.py

@ -10,12 +10,13 @@ import subprocess
PY3 = bytes != str
# Below IsCygwin() function copied from pylib/gyp/common.py
def IsCygwin():
# Function copied from pylib/gyp/common.py
try:
out = subprocess.Popen("uname",
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
out = subprocess.Popen(
"uname", stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
stdout, stderr = out.communicate()
if PY3:
stdout = stdout.decode("utf-8")
@ -28,9 +29,9 @@ def UnixifyPath(path):
try:
if not IsCygwin():
return path
out = subprocess.Popen(["cygpath", "-u", path],
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
out = subprocess.Popen(
["cygpath", "-u", path], stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
stdout, _ = out.communicate()
if PY3:
stdout = stdout.decode("utf-8")
@ -43,8 +44,8 @@ def UnixifyPath(path):
# elsewhere on the system. Also convert to Unix style path on Cygwin systems,
# else the 'gyp' library will not be found
path = UnixifyPath(sys.argv[0])
sys.path.insert(0, os.path.join(os.path.dirname(path), 'pylib'))
import gyp
sys.path.insert(0, os.path.join(os.path.dirname(path), "pylib"))
import gyp # noqa: E402
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(gyp.script_main())

gyp/pylib/gyp/MSVSNew.py

@ -14,23 +14,25 @@ import gyp.common
try:
cmp
except NameError:
def cmp(x, y):
return (x > y) - (x < y)
# Initialize random number generator
random.seed()
# GUIDs for project types
ENTRY_TYPE_GUIDS = {
'project': '{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}',
'folder': '{2150E333-8FDC-42A3-9474-1A3956D46DE8}',
"project": "{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942}",
"folder": "{2150E333-8FDC-42A3-9474-1A3956D46DE8}",
}
# ------------------------------------------------------------------------------
# Helper functions
def MakeGuid(name, seed='msvs_new'):
def MakeGuid(name, seed="msvs_new"):
"""Returns a GUID for the specified target name.
Args:
@ -46,12 +48,24 @@ def MakeGuid(name, seed='msvs_new'):
not change when the project for a target is rebuilt.
"""
# Calculate a MD5 signature for the seed and name.
d = hashlib.md5((str(seed) + str(name)).encode('utf-8')).hexdigest().upper()
d = hashlib.md5((str(seed) + str(name)).encode("utf-8")).hexdigest().upper()
# Convert most of the signature to GUID form (discard the rest)
guid = ('{' + d[:8] + '-' + d[8:12] + '-' + d[12:16] + '-' + d[16:20]
+ '-' + d[20:32] + '}')
guid = (
"{"
+ d[:8]
+ "-"
+ d[8:12]
+ "-"
+ d[12:16]
+ "-"
+ d[16:20]
+ "-"
+ d[20:32]
+ "}"
)
return guid
# ------------------------------------------------------------------------------
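MakeGuid is deterministic by design: hashing the seed and name means the same target always maps to the same GUID, so project files do not churn between regenerations. A minimal standalone sketch of the same derivation (the make_guid name below is hypothetical, not part of the module):

    import hashlib

    def make_guid(name, seed="msvs_new"):
        # MD5 of seed+name, upper-case hex, split 8-4-4-4-12 and wrapped in braces.
        d = hashlib.md5((str(seed) + str(name)).encode("utf-8")).hexdigest().upper()
        return "{%s-%s-%s-%s-%s}" % (d[:8], d[8:12], d[12:16], d[16:20], d[20:32])

    # The same inputs always produce the same GUID.
    assert make_guid("my_target") == make_guid("my_target")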
@ -64,8 +78,7 @@ class MSVSSolutionEntry(object):
class MSVSFolder(MSVSSolutionEntry):
"""Folder in a Visual Studio project or solution."""
def __init__(self, path, name = None, entries = None,
guid = None, items = None):
def __init__(self, path, name=None, entries=None, guid=None, items=None):
"""Initializes the folder.
Args:
@ -87,15 +100,15 @@ class MSVSFolder(MSVSSolutionEntry):
self.guid = guid
# Copy passed lists (or set to empty lists)
self.entries = sorted(entries or [], key=attrgetter('path'))
self.entries = sorted(entries or [], key=attrgetter("path"))
self.items = list(items or [])
self.entry_type_guid = ENTRY_TYPE_GUIDS['folder']
self.entry_type_guid = ENTRY_TYPE_GUIDS["folder"]
def get_guid(self):
if self.guid is None:
# Use consistent guids for folders (so things don't regenerate).
self.guid = MakeGuid(self.path, seed='msvs_folder')
self.guid = MakeGuid(self.path, seed="msvs_folder")
return self.guid
@ -105,9 +118,17 @@ class MSVSFolder(MSVSSolutionEntry):
class MSVSProject(MSVSSolutionEntry):
"""Visual Studio project."""
def __init__(self, path, name = None, dependencies = None, guid = None,
spec = None, build_file = None, config_platform_overrides = None,
fixpath_prefix = None):
def __init__(
self,
path,
name=None,
dependencies=None,
guid=None,
spec=None,
build_file=None,
config_platform_overrides=None,
fixpath_prefix=None,
):
"""Initializes the project.
Args:
@ -133,7 +154,7 @@ class MSVSProject(MSVSSolutionEntry):
# Copy passed lists (or set to empty lists)
self.dependencies = list(dependencies or [])
self.entry_type_guid = ENTRY_TYPE_GUIDS['project']
self.entry_type_guid = ENTRY_TYPE_GUIDS["project"]
if config_platform_overrides:
self.config_platform_overrides = config_platform_overrides
@ -165,14 +186,16 @@ class MSVSProject(MSVSSolutionEntry):
def set_msbuild_toolset(self, msbuild_toolset):
self.msbuild_toolset = msbuild_toolset
# ------------------------------------------------------------------------------
class MSVSSolution(object):
"""Visual Studio solution."""
def __init__(self, path, version, entries=None, variants=None,
websiteProperties=True):
def __init__(
self, path, version, entries=None, variants=None, websiteProperties=True
):
"""Initializes the solution.
Args:
@ -197,18 +220,16 @@ class MSVSSolution(object):
self.variants = variants[:]
else:
# Use default
self.variants = ['Debug|Win32', 'Release|Win32']
self.variants = ["Debug|Win32", "Release|Win32"]
# TODO(rspangler): Need to be able to handle a mapping of solution config
# to project config. Should we be able to handle variants being a dict,
# or add a separate variant_map variable? If it's a dict, we can't
# guarantee the order of variants since dict keys aren't ordered.
# TODO(rspangler): Automatically write to disk for now; should delay until
# node-evaluation time.
self.Write()
def Write(self, writer=gyp.common.WriteOnDiff):
"""Writes the solution file to disk.
@ -231,13 +252,15 @@ class MSVSSolution(object):
if isinstance(e, MSVSFolder):
entries_to_check += e.entries
all_entries = sorted(all_entries, key=attrgetter('path'))
all_entries = sorted(all_entries, key=attrgetter("path"))
# Open file and print header
f = writer(self.path)
f.write('Microsoft Visual Studio Solution File, '
'Format Version %s\r\n' % self.version.SolutionVersion())
f.write('# %s\r\n' % self.version.Description())
f.write(
"Microsoft Visual Studio Solution File, "
"Format Version %s\r\n" % self.version.SolutionVersion()
)
f.write("# %s\r\n" % self.version.Description())
# Project entries
sln_root = os.path.split(self.path)[0]
@ -245,45 +268,50 @@ class MSVSSolution(object):
relative_path = gyp.common.RelativePath(e.path, sln_root)
# msbuild does not accept an empty folder_name.
# use '.' in case relative_path is empty.
folder_name = relative_path.replace('/', '\\') or '.'
f.write('Project("%s") = "%s", "%s", "%s"\r\n' % (
folder_name = relative_path.replace("/", "\\") or "."
f.write(
'Project("%s") = "%s", "%s", "%s"\r\n'
% (
e.entry_type_guid, # Entry type GUID
e.name, # Folder name
folder_name, # Folder name (again)
e.get_guid(), # Entry GUID
))
)
)
# TODO(rspangler): Need a way to configure this stuff
if self.websiteProperties:
f.write('\tProjectSection(WebsiteProperties) = preProject\r\n'
f.write(
"\tProjectSection(WebsiteProperties) = preProject\r\n"
'\t\tDebug.AspNetCompiler.Debug = "True"\r\n'
'\t\tRelease.AspNetCompiler.Debug = "False"\r\n'
'\tEndProjectSection\r\n')
"\tEndProjectSection\r\n"
)
if isinstance(e, MSVSFolder):
if e.items:
f.write('\tProjectSection(SolutionItems) = preProject\r\n')
f.write("\tProjectSection(SolutionItems) = preProject\r\n")
for i in e.items:
f.write('\t\t%s = %s\r\n' % (i, i))
f.write('\tEndProjectSection\r\n')
f.write("\t\t%s = %s\r\n" % (i, i))
f.write("\tEndProjectSection\r\n")
if isinstance(e, MSVSProject):
if e.dependencies:
f.write('\tProjectSection(ProjectDependencies) = postProject\r\n')
f.write("\tProjectSection(ProjectDependencies) = postProject\r\n")
for d in e.dependencies:
f.write('\t\t%s = %s\r\n' % (d.get_guid(), d.get_guid()))
f.write('\tEndProjectSection\r\n')
f.write("\t\t%s = %s\r\n" % (d.get_guid(), d.get_guid()))
f.write("\tEndProjectSection\r\n")
f.write('EndProject\r\n')
f.write("EndProject\r\n")
# Global section
f.write('Global\r\n')
f.write("Global\r\n")
# Configurations (variants)
f.write('\tGlobalSection(SolutionConfigurationPlatforms) = preSolution\r\n')
f.write("\tGlobalSection(SolutionConfigurationPlatforms) = preSolution\r\n")
for v in self.variants:
f.write('\t\t%s = %s\r\n' % (v, v))
f.write('\tEndGlobalSection\r\n')
f.write("\t\t%s = %s\r\n" % (v, v))
f.write("\tEndGlobalSection\r\n")
# Sort config guids for easier diffing of solution changes.
config_guids = []
@ -294,43 +322,49 @@ class MSVSSolution(object):
config_guids_overrides[e.get_guid()] = e.config_platform_overrides
config_guids.sort()
f.write('\tGlobalSection(ProjectConfigurationPlatforms) = postSolution\r\n')
f.write("\tGlobalSection(ProjectConfigurationPlatforms) = postSolution\r\n")
for g in config_guids:
for v in self.variants:
nv = config_guids_overrides[g].get(v, v)
# Pick which project configuration to build for this solution
# configuration.
f.write('\t\t%s.%s.ActiveCfg = %s\r\n' % (
f.write(
"\t\t%s.%s.ActiveCfg = %s\r\n"
% (
g, # Project GUID
v, # Solution build configuration
nv, # Project build config for that solution config
))
)
)
# Enable project in this solution configuration.
f.write('\t\t%s.%s.Build.0 = %s\r\n' % (
f.write(
"\t\t%s.%s.Build.0 = %s\r\n"
% (
g, # Project GUID
v, # Solution build configuration
nv, # Project build config for that solution config
))
f.write('\tEndGlobalSection\r\n')
)
)
f.write("\tEndGlobalSection\r\n")
# TODO(rspangler): Should be able to configure this stuff too (though I've
# never seen this be any different)
f.write('\tGlobalSection(SolutionProperties) = preSolution\r\n')
f.write('\t\tHideSolutionNode = FALSE\r\n')
f.write('\tEndGlobalSection\r\n')
f.write("\tGlobalSection(SolutionProperties) = preSolution\r\n")
f.write("\t\tHideSolutionNode = FALSE\r\n")
f.write("\tEndGlobalSection\r\n")
# Folder mappings
# Omit this section if there are no folders
if any([e.entries for e in all_entries if isinstance(e, MSVSFolder)]):
f.write('\tGlobalSection(NestedProjects) = preSolution\r\n')
f.write("\tGlobalSection(NestedProjects) = preSolution\r\n")
for e in all_entries:
if not isinstance(e, MSVSFolder):
continue # Does not apply to projects, only folders
for subentry in e.entries:
f.write('\t\t%s = %s\r\n' % (subentry.get_guid(), e.get_guid()))
f.write('\tEndGlobalSection\r\n')
f.write("\t\t%s = %s\r\n" % (subentry.get_guid(), e.get_guid()))
f.write("\tEndGlobalSection\r\n")
f.write('EndGlobal\r\n')
f.write("EndGlobal\r\n")
f.close()

gyp/pylib/gyp/MSVSProject.py

@ -4,7 +4,6 @@
"""Visual Studio project reader/writer."""
import gyp.common
import gyp.easy_xml as easy_xml
# ------------------------------------------------------------------------------
@ -21,7 +20,7 @@ class Tool(object):
attrs: Dict of tool attributes; may be None.
"""
self._attrs = attrs or {}
self._attrs['Name'] = name
self._attrs["Name"] = name
def _GetSpecification(self):
"""Creates an element for the tool.
@ -29,7 +28,8 @@ class Tool(object):
Returns:
A new xml.dom.Element for the tool.
"""
return ['Tool', self._attrs]
return ["Tool", self._attrs]
class Filter(object):
"""Visual Studio filter - that is, a virtual folder."""
@ -68,15 +68,15 @@ class Writer(object):
# Default to Win32 for platforms.
if not platforms:
platforms = ['Win32']
platforms = ["Win32"]
# Initialize the specifications of the various sections.
self.platform_section = ['Platforms']
self.platform_section = ["Platforms"]
for platform in platforms:
self.platform_section.append(['Platform', {'Name': platform}])
self.tool_files_section = ['ToolFiles']
self.configurations_section = ['Configurations']
self.files_section = ['Files']
self.platform_section.append(["Platform", {"Name": platform}])
self.tool_files_section = ["ToolFiles"]
self.configurations_section = ["Configurations"]
self.files_section = ["Files"]
# Keep a dict keyed on filename to speed up access.
self.files_dict = dict()
@ -87,7 +87,7 @@ class Writer(object):
Args:
path: Relative path from project to tool file.
"""
self.tool_files_section.append(['ToolFile', {'RelativePath': path}])
self.tool_files_section.append(["ToolFile", {"RelativePath": path}])
def _GetSpecForConfiguration(self, config_type, config_name, attrs, tools):
"""Returns the specification for a configuration.
@ -107,7 +107,7 @@ class Writer(object):
# Add configuration node and its attributes
node_attrs = attrs.copy()
node_attrs['Name'] = config_name
node_attrs["Name"] = config_name
specification = [config_type, node_attrs]
# Add tool nodes and their attributes
@ -119,7 +119,6 @@ class Writer(object):
specification.append(Tool(t)._GetSpecification())
return specification
def AddConfig(self, name, attrs=None, tools=None):
"""Adds a configuration to the project.
@ -128,7 +127,7 @@ class Writer(object):
attrs: Dict of configuration attributes; may be None.
tools: List of tools (strings or Tool objects); may be None.
"""
spec = self._GetSpecForConfiguration('Configuration', name, attrs, tools)
spec = self._GetSpecForConfiguration("Configuration", name, attrs, tools)
self.configurations_section.append(spec)
def _AddFilesToNode(self, parent, files):
@ -142,10 +141,10 @@ class Writer(object):
"""
for f in files:
if isinstance(f, Filter):
node = ['Filter', {'Name': f.name}]
node = ["Filter", {"Name": f.name}]
self._AddFilesToNode(node, f.contents)
else:
node = ['File', {'RelativePath': f}]
node = ["File", {"RelativePath": f}]
self.files_dict[f] = node
parent.append(node)
@ -181,28 +180,27 @@ class Writer(object):
raise ValueError('AddFileConfig: file "%s" not in project.' % path)
# Add the config to the file node
spec = self._GetSpecForConfiguration('FileConfiguration', config, attrs,
tools)
spec = self._GetSpecForConfiguration("FileConfiguration", config, attrs, tools)
parent.append(spec)
def WriteIfChanged(self):
"""Writes the project file."""
# First create XML content definition
content = [
'VisualStudioProject',
{'ProjectType': 'Visual C++',
'Version': self.version.ProjectVersion(),
'Name': self.name,
'ProjectGUID': self.guid,
'RootNamespace': self.name,
'Keyword': 'Win32Proj'
"VisualStudioProject",
{
"ProjectType": "Visual C++",
"Version": self.version.ProjectVersion(),
"Name": self.name,
"ProjectGUID": self.guid,
"RootNamespace": self.name,
"Keyword": "Win32Proj",
},
self.platform_section,
self.tool_files_section,
self.configurations_section,
['References'], # empty section
["References"], # empty section
self.files_section,
['Globals'] # empty section
["Globals"], # empty section
]
easy_xml.WriteXmlIfChanged(content, self.project_path,
encoding="Windows-1252")
easy_xml.WriteXmlIfChanged(content, self.project_path, encoding="Windows-1252")

gyp/pylib/gyp/MSVSSettings.py (diff suppressed: file too large)

gyp/pylib/gyp/MSVSSettings_test.py (diff suppressed: file too large)

gyp/pylib/gyp/MSVSToolFile.py

@ -4,7 +4,6 @@
"""Visual Studio project reader/writer."""
import gyp.common
import gyp.easy_xml as easy_xml
@ -20,11 +19,11 @@ class Writer(object):
"""
self.tool_file_path = tool_file_path
self.name = name
self.rules_section = ['Rules']
self.rules_section = ["Rules"]
def AddCustomBuildRule(self, name, cmd, description,
additional_dependencies,
outputs, extensions):
def AddCustomBuildRule(
self, name, cmd, description, additional_dependencies, outputs, extensions
):
"""Adds a rule to the tool file.
Args:
@ -35,24 +34,26 @@ class Writer(object):
outputs: outputs of the rule.
extensions: extensions handled by the rule.
"""
rule = ['CustomBuildRule',
{'Name': name,
'ExecutionDescription': description,
'CommandLine': cmd,
'Outputs': ';'.join(outputs),
'FileExtensions': ';'.join(extensions),
'AdditionalDependencies':
';'.join(additional_dependencies)
}]
rule = [
"CustomBuildRule",
{
"Name": name,
"ExecutionDescription": description,
"CommandLine": cmd,
"Outputs": ";".join(outputs),
"FileExtensions": ";".join(extensions),
"AdditionalDependencies": ";".join(additional_dependencies),
},
]
self.rules_section.append(rule)
def WriteIfChanged(self):
"""Writes the tool file."""
content = ['VisualStudioToolFile',
{'Version': '8.00',
'Name': self.name
},
self.rules_section
content = [
"VisualStudioToolFile",
{"Version": "8.00", "Name": self.name},
self.rules_section,
]
easy_xml.WriteXmlIfChanged(content, self.tool_file_path,
encoding="Windows-1252")
easy_xml.WriteXmlIfChanged(
content, self.tool_file_path, encoding="Windows-1252"
)

gyp/pylib/gyp/MSVSUserFile.py

@ -8,12 +8,12 @@ import os
import re
import socket # for gethostname
import gyp.common
import gyp.easy_xml as easy_xml
# ------------------------------------------------------------------------------
def _FindCommandInPath(command):
"""If there are no slashes in the command given, this function
searches the PATH env to find the given command, and converts it
@ -21,20 +21,21 @@ def _FindCommandInPath(command):
for an actual file to launch a debugger on, not just a command
line. Note that this happens at GYP time, so anything needing to
be built needs to have a full path."""
if '/' in command or '\\' in command:
if "/" in command or "\\" in command:
# If the command already has path elements (either relative or
# absolute), then assume it is constructed properly.
return command
else:
# Search through the path list and find an existing file that
# we can access.
paths = os.environ.get('PATH','').split(os.pathsep)
paths = os.environ.get("PATH", "").split(os.pathsep)
for path in paths:
item = os.path.join(path, command)
if os.path.isfile(item) and os.access(item, os.X_OK):
return item
return command
def _QuoteWin32CommandLineArgs(args):
new_args = []
for arg in args:
@ -46,11 +47,12 @@ def _QuoteWin32CommandLineArgs(args):
arg = '"%s"' % arg
# Otherwise, if there are any spaces, quote the whole arg.
elif re.search(r'[ \t\n]', arg):
elif re.search(r"[ \t\n]", arg):
arg = '"%s"' % arg
new_args.append(arg)
return new_args
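A short sketch of the whitespace case handled above; the function's first branch (for arguments that already contain quotes) is elided in this diff, so only the whitespace branch is mirrored here, and quote_if_whitespace is a hypothetical name:

    import re

    def quote_if_whitespace(arg):
        # Wrap in double quotes when the argument contains spaces, tabs or newlines.
        return '"%s"' % arg if re.search(r"[ \t\n]", arg) else arg

    print(quote_if_whitespace(r"C:\Program Files\foo"))  # -> "C:\Program Files\foo"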
class Writer(object):
"""Visual Studio XML user user file writer."""
@ -73,10 +75,11 @@ class Writer(object):
Args:
name: Configuration name.
"""
self.configurations[name] = ['Configuration', {'Name': name}]
self.configurations[name] = ["Configuration", {"Name": name}]
def AddDebugSettings(self, config_name, command, environment = {},
working_directory=""):
def AddDebugSettings(
self, config_name, command, environment={}, working_directory=""
):
"""Adds a DebugSettings node to the user file for a particular config.
Args:
@ -90,40 +93,42 @@ class Writer(object):
abs_command = _FindCommandInPath(command[0])
if environment and isinstance(environment, dict):
env_list = ['%s="%s"' % (key, val)
for (key,val) in environment.items()]
environment = ' '.join(env_list)
env_list = ['%s="%s"' % (key, val) for (key, val) in environment.items()]
environment = " ".join(env_list)
else:
environment = ''
environment = ""
n_cmd = ['DebugSettings',
{'Command': abs_command,
'WorkingDirectory': working_directory,
'CommandArguments': " ".join(command[1:]),
'RemoteMachine': socket.gethostname(),
'Environment': environment,
'EnvironmentMerge': 'true',
n_cmd = [
"DebugSettings",
{
"Command": abs_command,
"WorkingDirectory": working_directory,
"CommandArguments": " ".join(command[1:]),
"RemoteMachine": socket.gethostname(),
"Environment": environment,
"EnvironmentMerge": "true",
# Currently these are all "dummy" values that we're just setting
# in the default manner that MSVS does it. We could use some of
# these to add additional capabilities, I suppose, but they might
# not have parity with other platforms then.
'Attach': 'false',
'DebuggerType': '3', # 'auto' debugger
'Remote': '1',
'RemoteCommand': '',
'HttpUrl': '',
'PDBPath': '',
'SQLDebugging': '',
'DebuggerFlavor': '0',
'MPIRunCommand': '',
'MPIRunArguments': '',
'MPIRunWorkingDirectory': '',
'ApplicationCommand': '',
'ApplicationArguments': '',
'ShimCommand': '',
'MPIAcceptMode': '',
'MPIAcceptFilter': ''
}]
"Attach": "false",
"DebuggerType": "3", # 'auto' debugger
"Remote": "1",
"RemoteCommand": "",
"HttpUrl": "",
"PDBPath": "",
"SQLDebugging": "",
"DebuggerFlavor": "0",
"MPIRunCommand": "",
"MPIRunArguments": "",
"MPIRunWorkingDirectory": "",
"ApplicationCommand": "",
"ApplicationArguments": "",
"ShimCommand": "",
"MPIAcceptMode": "",
"MPIAcceptFilter": "",
},
]
# Find the config, and add it if it doesn't exist.
if config_name not in self.configurations:
@ -134,14 +139,15 @@ class Writer(object):
def WriteIfChanged(self):
"""Writes the user file."""
configs = ['Configurations']
configs = ["Configurations"]
for config, spec in sorted(self.configurations.items()):
configs.append(spec)
content = ['VisualStudioUserFile',
{'Version': self.version.ProjectVersion(),
'Name': self.name
},
configs]
easy_xml.WriteXmlIfChanged(content, self.user_file_path,
encoding="Windows-1252")
content = [
"VisualStudioUserFile",
{"Version": self.version.ProjectVersion(), "Name": self.name},
configs,
]
easy_xml.WriteXmlIfChanged(
content, self.user_file_path, encoding="Windows-1252"
)

gyp/pylib/gyp/MSVSUtil.py

@ -10,20 +10,20 @@ import os
# A dictionary mapping supported target types to extensions.
TARGET_TYPE_EXT = {
'executable': 'exe',
'loadable_module': 'dll',
'shared_library': 'dll',
'static_library': 'lib',
'windows_driver': 'sys',
"executable": "exe",
"loadable_module": "dll",
"shared_library": "dll",
"static_library": "lib",
"windows_driver": "sys",
}
def _GetLargePdbShimCcPath():
"""Returns the path of the large_pdb_shim.cc file."""
this_dir = os.path.abspath(os.path.dirname(__file__))
src_dir = os.path.abspath(os.path.join(this_dir, '..', '..'))
win_data_dir = os.path.join(src_dir, 'data', 'win')
large_pdb_shim_cc = os.path.join(win_data_dir, 'large-pdb-shim.cc')
src_dir = os.path.abspath(os.path.join(this_dir, "..", ".."))
win_data_dir = os.path.join(src_dir, "data", "win")
large_pdb_shim_cc = os.path.join(win_data_dir, "large-pdb-shim.cc")
return large_pdb_shim_cc
@ -54,9 +54,9 @@ def _SuffixName(name, suffix):
Returns:
Target name with suffix added (foo_suffix#target)
"""
parts = name.rsplit('#', 1)
parts[0] = '%s_%s' % (parts[0], suffix)
return '#'.join(parts)
parts = name.rsplit("#", 1)
parts[0] = "%s_%s" % (parts[0], suffix)
return "#".join(parts)
def _ShardName(name, number):
@ -83,7 +83,7 @@ def ShardTargets(target_list, target_dicts):
# Gather the targets to shard, and how many pieces.
targets_to_shard = {}
for t in target_dicts:
shards = int(target_dicts[t].get('msvs_shard', 0))
shards = int(target_dicts[t].get("msvs_shard", 0))
if shards:
targets_to_shard[t] = shards
# Shard target_list.
@ -101,18 +101,19 @@ def ShardTargets(target_list, target_dicts):
for i in range(targets_to_shard[t]):
name = _ShardName(t, i)
new_target_dicts[name] = copy.copy(target_dicts[t])
new_target_dicts[name]['target_name'] = _ShardName(
new_target_dicts[name]['target_name'], i)
sources = new_target_dicts[name].get('sources', [])
new_target_dicts[name]["target_name"] = _ShardName(
new_target_dicts[name]["target_name"], i
)
sources = new_target_dicts[name].get("sources", [])
new_sources = []
for pos in range(i, len(sources), targets_to_shard[t]):
new_sources.append(sources[pos])
new_target_dicts[name]['sources'] = new_sources
new_target_dicts[name]["sources"] = new_sources
else:
new_target_dicts[t] = target_dicts[t]
# Shard dependencies.
for t in sorted(new_target_dicts):
for deptype in ('dependencies', 'dependencies_original'):
for deptype in ("dependencies", "dependencies_original"):
dependencies = copy.copy(new_target_dicts[t].get(deptype, []))
new_dependencies = []
for d in dependencies:
@ -144,24 +145,23 @@ def _GetPdbPath(target_dict, config_name, vars):
Returns:
The path of the corresponding PDB file.
"""
config = target_dict['configurations'][config_name]
msvs = config.setdefault('msvs_settings', {})
config = target_dict["configurations"][config_name]
msvs = config.setdefault("msvs_settings", {})
linker = msvs.get('VCLinkerTool', {})
linker = msvs.get("VCLinkerTool", {})
pdb_path = linker.get('ProgramDatabaseFile')
pdb_path = linker.get("ProgramDatabaseFile")
if pdb_path:
return pdb_path
variables = target_dict.get('variables', {})
pdb_path = variables.get('msvs_large_pdb_path', None)
variables = target_dict.get("variables", {})
pdb_path = variables.get("msvs_large_pdb_path", None)
if pdb_path:
return pdb_path
pdb_base = target_dict.get('product_name', target_dict['target_name'])
pdb_base = '%s.%s.pdb' % (pdb_base, TARGET_TYPE_EXT[target_dict['type']])
pdb_path = vars['PRODUCT_DIR'] + '/' + pdb_base
pdb_base = target_dict.get("product_name", target_dict["target_name"])
pdb_base = "%s.%s.pdb" % (pdb_base, TARGET_TYPE_EXT[target_dict["type"]])
pdb_path = vars["PRODUCT_DIR"] + "/" + pdb_base
return pdb_path
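The fallback chain above is: an explicit VCLinkerTool ProgramDatabaseFile, then the msvs_large_pdb_path variable, then a name derived from the product name and the target type's extension. A small sketch of that final fallback, using a hypothetical target dict:

    # Hypothetical target with no explicit linker or variable overrides, so the
    # path falls through to <product>.<type-extension>.pdb under PRODUCT_DIR.
    TARGET_TYPE_EXT = {"executable": "exe", "shared_library": "dll"}
    target = {"target_name": "foo", "type": "executable"}

    pdb_base = target.get("product_name", target["target_name"])
    pdb_base = "%s.%s.pdb" % (pdb_base, TARGET_TYPE_EXT[target["type"]])
    print("$(PRODUCT_DIR)/" + pdb_base)  # $(PRODUCT_DIR)/foo.exe.pdb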
@ -185,7 +185,7 @@ def InsertLargePdbShims(target_list, target_dicts, vars):
target_dict = target_dicts[t]
# We only want to shim targets that have msvs_large_pdb enabled.
if not int(target_dict.get('msvs_large_pdb', 0)):
if not int(target_dict.get("msvs_large_pdb", 0)):
continue
# This is intended for executable, shared_library and loadable_module
# targets where every configuration is set up to produce a PDB output.
@ -197,10 +197,11 @@ def InsertLargePdbShims(target_list, target_dicts, vars):
for t in targets_to_shim:
target_dict = target_dicts[t]
target_name = target_dict.get('target_name')
target_name = target_dict.get("target_name")
base_dict = _DeepCopySomeKeys(target_dict,
['configurations', 'default_configuration', 'toolset'])
base_dict = _DeepCopySomeKeys(
target_dict, ["configurations", "default_configuration", "toolset"]
)
# This is the dict for copying the source file (part of the GYP tree)
# to the intermediate directory of the project. This is necessary because
@ -208,55 +209,54 @@ def InsertLargePdbShims(target_list, target_dicts, vars):
# GYP and the project may be on different drives), and Ninja hates absolute
# paths (it ends up generating the .obj and .obj.d alongside the source
# file, polluting GYPs tree).
copy_suffix = 'large_pdb_copy'
copy_target_name = target_name + '_' + copy_suffix
copy_suffix = "large_pdb_copy"
copy_target_name = target_name + "_" + copy_suffix
full_copy_target_name = _SuffixName(t, copy_suffix)
shim_cc_basename = os.path.basename(large_pdb_shim_cc)
shim_cc_dir = vars['SHARED_INTERMEDIATE_DIR'] + '/' + copy_target_name
shim_cc_path = shim_cc_dir + '/' + shim_cc_basename
shim_cc_dir = vars["SHARED_INTERMEDIATE_DIR"] + "/" + copy_target_name
shim_cc_path = shim_cc_dir + "/" + shim_cc_basename
copy_dict = copy.deepcopy(base_dict)
copy_dict['target_name'] = copy_target_name
copy_dict['type'] = 'none'
copy_dict['sources'] = [ large_pdb_shim_cc ]
copy_dict['copies'] = [{
'destination': shim_cc_dir,
'files': [ large_pdb_shim_cc ]
}]
copy_dict["target_name"] = copy_target_name
copy_dict["type"] = "none"
copy_dict["sources"] = [large_pdb_shim_cc]
copy_dict["copies"] = [
{"destination": shim_cc_dir, "files": [large_pdb_shim_cc]}
]
# This is the dict for the PDB generating shim target. It depends on the
# copy target.
shim_suffix = 'large_pdb_shim'
shim_target_name = target_name + '_' + shim_suffix
shim_suffix = "large_pdb_shim"
shim_target_name = target_name + "_" + shim_suffix
full_shim_target_name = _SuffixName(t, shim_suffix)
shim_dict = copy.deepcopy(base_dict)
shim_dict['target_name'] = shim_target_name
shim_dict['type'] = 'static_library'
shim_dict['sources'] = [ shim_cc_path ]
shim_dict['dependencies'] = [ full_copy_target_name ]
shim_dict["target_name"] = shim_target_name
shim_dict["type"] = "static_library"
shim_dict["sources"] = [shim_cc_path]
shim_dict["dependencies"] = [full_copy_target_name]
# Set up the shim to output its PDB to the same location as the final linker
# target.
for config_name, config in shim_dict.get('configurations').items():
for config_name, config in shim_dict.get("configurations").items():
pdb_path = _GetPdbPath(target_dict, config_name, vars)
# A few keys that we don't want to propagate.
for key in ['msvs_precompiled_header', 'msvs_precompiled_source', 'test']:
for key in ["msvs_precompiled_header", "msvs_precompiled_source", "test"]:
config.pop(key, None)
msvs = config.setdefault('msvs_settings', {})
msvs = config.setdefault("msvs_settings", {})
# Update the compiler directives in the shim target.
compiler = msvs.setdefault('VCCLCompilerTool', {})
compiler['DebugInformationFormat'] = '3'
compiler['ProgramDataBaseFileName'] = pdb_path
compiler = msvs.setdefault("VCCLCompilerTool", {})
compiler["DebugInformationFormat"] = "3"
compiler["ProgramDataBaseFileName"] = pdb_path
# Set the explicit PDB path in the appropriate configuration of the
# original target.
config = target_dict['configurations'][config_name]
msvs = config.setdefault('msvs_settings', {})
linker = msvs.setdefault('VCLinkerTool', {})
linker['GenerateDebugInformation'] = 'true'
linker['ProgramDatabaseFile'] = pdb_path
config = target_dict["configurations"][config_name]
msvs = config.setdefault("msvs_settings", {})
linker = msvs.setdefault("VCLinkerTool", {})
linker["GenerateDebugInformation"] = "true"
linker["ProgramDatabaseFile"] = pdb_path
# Add the new targets. They must go to the beginning of the list so that
# the dependency generation works as expected in ninja.
@ -266,6 +266,6 @@ def InsertLargePdbShims(target_list, target_dicts, vars):
target_dicts[full_shim_target_name] = shim_dict
# Update the original target to depend on the shim target.
target_dict.setdefault('dependencies', []).append(full_shim_target_name)
target_dict.setdefault("dependencies", []).append(full_shim_target_name)
return (target_list, target_dicts)

gyp/pylib/gyp/MSVSVersion.py

@ -9,7 +9,6 @@ import os
import re
import subprocess
import sys
import gyp
import glob
PY3 = bytes != str
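As elsewhere in this change, PY3 = bytes != str distinguishes the interpreters: on Python 2, bytes is an alias of str, so the expression is False; on Python 3 the types differ and it is True. A minimal sketch of how the flag gates decoding of subprocess output, mirroring the pattern in these files:

    import subprocess
    import sys

    PY3 = bytes != str  # False on Python 2 (bytes is str), True on Python 3
    out = subprocess.Popen(
        [sys.executable, "--version"], stdout=subprocess.PIPE, stderr=subprocess.STDOUT
    ).communicate()[0]
    if PY3:
        out = out.decode("utf-8")  # communicate() returns bytes under Python 3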
@ -22,9 +21,19 @@ def JoinPath(*args):
class VisualStudioVersion(object):
"""Information regarding a version of Visual Studio."""
def __init__(self, short_name, description,
solution_version, project_version, flat_sln, uses_vcxproj,
path, sdk_based, default_toolset=None, compatible_sdks=None):
def __init__(
self,
short_name,
description,
solution_version,
project_version,
flat_sln,
uses_vcxproj,
path,
sdk_based,
default_toolset=None,
compatible_sdks=None,
):
self.short_name = short_name
self.description = description
self.solution_version = solution_version
@ -35,7 +44,7 @@ class VisualStudioVersion(object):
self.sdk_based = sdk_based
self.default_toolset = default_toolset
compatible_sdks = compatible_sdks or []
compatible_sdks.sort(key=lambda v: float(v.replace('v', '')), reverse=True)
compatible_sdks.sort(key=lambda v: float(v.replace("v", "")), reverse=True)
self.compatible_sdks = compatible_sdks
def ShortName(self):
@ -62,7 +71,7 @@ class VisualStudioVersion(object):
def ProjectExtension(self):
"""Returns the file extension for the project."""
return self.uses_vcxproj and '.vcxproj' or '.vcproj'
return self.uses_vcxproj and ".vcxproj" or ".vcproj"
def Path(self):
"""Returns the path to Visual Studio installation."""
@ -77,66 +86,70 @@ class VisualStudioVersion(object):
of a user override."""
return self.default_toolset
def _SetupScriptInternal(self, target_arch):
"""Returns a command (with arguments) to be used to set up the
environment."""
assert target_arch in ('x86', 'x64'), "target_arch not supported"
assert target_arch in ("x86", "x64"), "target_arch not supported"
# If WindowsSDKDir is set and SetEnv.Cmd exists then we are using the
# depot_tools build tools and should run SetEnv.Cmd to set up the
# environment. The check for WindowsSDKDir alone is not sufficient because
# this is set by running vcvarsall.bat.
sdk_dir = os.environ.get('WindowsSDKDir', '')
setup_path = JoinPath(sdk_dir, 'Bin', 'SetEnv.Cmd')
sdk_dir = os.environ.get("WindowsSDKDir", "")
setup_path = JoinPath(sdk_dir, "Bin", "SetEnv.Cmd")
if self.sdk_based and sdk_dir and os.path.exists(setup_path):
return [setup_path, '/' + target_arch]
return [setup_path, "/" + target_arch]
is_host_arch_x64 = (
os.environ.get('PROCESSOR_ARCHITECTURE') == 'AMD64' or
os.environ.get('PROCESSOR_ARCHITEW6432') == 'AMD64'
os.environ.get("PROCESSOR_ARCHITECTURE") == "AMD64"
or os.environ.get("PROCESSOR_ARCHITEW6432") == "AMD64"
)
# For VS2017 (and newer) it's fairly easy
if self.short_name >= '2017':
script_path = JoinPath(self.path,
'VC', 'Auxiliary', 'Build', 'vcvarsall.bat')
if self.short_name >= "2017":
script_path = JoinPath(
self.path, "VC", "Auxiliary", "Build", "vcvarsall.bat"
)
# Always use a native executable, cross-compiling if necessary.
host_arch = 'amd64' if is_host_arch_x64 else 'x86'
msvc_target_arch = 'amd64' if target_arch == 'x64' else 'x86'
host_arch = "amd64" if is_host_arch_x64 else "x86"
msvc_target_arch = "amd64" if target_arch == "x64" else "x86"
arg = host_arch
if host_arch != msvc_target_arch:
arg += '_' + msvc_target_arch
arg += "_" + msvc_target_arch
return [script_path, arg]
# We try to find the best version of the env setup batch.
vcvarsall = JoinPath(self.path, 'VC', 'vcvarsall.bat')
if target_arch == 'x86':
if self.short_name >= '2013' and self.short_name[-1] != 'e' and \
is_host_arch_x64:
vcvarsall = JoinPath(self.path, "VC", "vcvarsall.bat")
if target_arch == "x86":
if (
self.short_name >= "2013"
and self.short_name[-1] != "e"
and is_host_arch_x64
):
# VS2013 and later, non-Express have a x64-x86 cross that we want
# to prefer.
return [vcvarsall, 'amd64_x86']
return [vcvarsall, "amd64_x86"]
else:
# Otherwise, the standard x86 compiler. We don't use VC/vcvarsall.bat
# for x86 because vcvarsall calls vcvars32, which it can only find if
# VS??COMNTOOLS is set, which isn't guaranteed.
return [JoinPath(self.path, 'Common7', 'Tools', 'vsvars32.bat')]
elif target_arch == 'x64':
arg = 'x86_amd64'
return [JoinPath(self.path, "Common7", "Tools", "vsvars32.bat")]
elif target_arch == "x64":
arg = "x86_amd64"
# Use the 64-on-64 compiler if we're not using an express edition and
# we're running on a 64bit OS.
if self.short_name[-1] != 'e' and is_host_arch_x64:
arg = 'amd64'
if self.short_name[-1] != "e" and is_host_arch_x64:
arg = "amd64"
return [vcvarsall, arg]
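For VS2017 and newer, the argument passed to vcvarsall.bat encodes both host and target architecture, selecting a cross-compiler when they differ. A condensed sketch of that selection with assumed inputs:

    # Hypothetical host/target combination; mirrors the VS2017+ branch above.
    is_host_arch_x64 = True
    target_arch = "x86"

    host_arch = "amd64" if is_host_arch_x64 else "x86"
    msvc_target_arch = "amd64" if target_arch == "x64" else "x86"
    arg = host_arch
    if host_arch != msvc_target_arch:
        arg += "_" + msvc_target_arch
    print(arg)  # amd64_x86: the x64-hosted compiler cross-targeting x86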
def SetupScript(self, target_arch):
script_data = self._SetupScriptInternal(target_arch)
script_path = script_data[0]
if not os.path.exists(script_path):
raise Exception('%s is missing - make sure VC++ tools are installed.' %
script_path)
raise Exception(
"%s is missing - make sure VC++ tools are installed." % script_path
)
return script_data
@ -154,19 +167,18 @@ def _RegistryQueryBase(sysdir, key, value):
stdout from reg.exe, or None for failure.
"""
# Skip if not on Windows or Python Win32 setup issue
if sys.platform not in ('win32', 'cygwin'):
if sys.platform not in ("win32", "cygwin"):
return None
# Setup params to pass to and attempt to launch reg.exe
cmd = [os.path.join(os.environ.get('WINDIR', ''), sysdir, 'reg.exe'),
'query', key]
cmd = [os.path.join(os.environ.get("WINDIR", ""), sysdir, "reg.exe"), "query", key]
if value:
cmd.extend(['/v', value])
cmd.extend(["/v", value])
p = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
# Obtain the stdout from reg.exe, reading to the end so p.returncode is valid
# Note that the error text may be in [1] in some cases
text = p.communicate()[0]
if PY3:
text = text.decode('utf-8')
text = text.decode("utf-8")
# Check return code from reg.exe; officially 0==success and 1==error
if p.returncode:
return None
@ -192,10 +204,10 @@ def _RegistryQuery(key, value=None):
"""
text = None
try:
text = _RegistryQueryBase('Sysnative', key, value)
text = _RegistryQueryBase("Sysnative", key, value)
except OSError as e:
if e.errno == errno.ENOENT:
text = _RegistryQueryBase('System32', key, value)
text = _RegistryQueryBase("System32", key, value)
else:
raise
return text
@ -219,8 +231,8 @@ def _RegistryGetValueUsingWinReg(key, value):
from winreg import HKEY_LOCAL_MACHINE, OpenKey, QueryValueEx
try:
root, subkey = key.split('\\', 1)
assert root == 'HKLM' # Only need HKLM for now.
root, subkey = key.split("\\", 1)
assert root == "HKLM" # Only need HKLM for now.
with OpenKey(HKEY_LOCAL_MACHINE, subkey) as hkey:
return QueryValueEx(hkey, value)[0]
except WindowsError:
@ -251,7 +263,7 @@ def _RegistryGetValue(key, value):
if not text:
return None
# Extract value.
match = re.search(r'REG_\w+\s+([^\r]+)\r\n', text)
match = re.search(r"REG_\w+\s+([^\r]+)\r\n", text)
if not match:
return None
return match.group(1)
@ -267,130 +279,156 @@ def _CreateVersion(name, path, sdk_based=False):
if path:
path = os.path.normpath(path)
versions = {
'2019': VisualStudioVersion('2019',
'Visual Studio 2019',
solution_version='12.00',
project_version='16.0',
"2019": VisualStudioVersion(
"2019",
"Visual Studio 2019",
solution_version="12.00",
project_version="16.0",
flat_sln=False,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v142',
compatible_sdks=['v8.1', 'v10.0']),
'2017': VisualStudioVersion('2017',
'Visual Studio 2017',
solution_version='12.00',
project_version='15.0',
default_toolset="v142",
compatible_sdks=["v8.1", "v10.0"],
),
"2017": VisualStudioVersion(
"2017",
"Visual Studio 2017",
solution_version="12.00",
project_version="15.0",
flat_sln=False,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v141',
compatible_sdks=['v8.1', 'v10.0']),
'2015': VisualStudioVersion('2015',
'Visual Studio 2015',
solution_version='12.00',
project_version='14.0',
default_toolset="v141",
compatible_sdks=["v8.1", "v10.0"],
),
"2015": VisualStudioVersion(
"2015",
"Visual Studio 2015",
solution_version="12.00",
project_version="14.0",
flat_sln=False,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v140'),
'2013': VisualStudioVersion('2013',
'Visual Studio 2013',
solution_version='13.00',
project_version='12.0',
default_toolset="v140",
),
"2013": VisualStudioVersion(
"2013",
"Visual Studio 2013",
solution_version="13.00",
project_version="12.0",
flat_sln=False,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v120'),
'2013e': VisualStudioVersion('2013e',
'Visual Studio 2013',
solution_version='13.00',
project_version='12.0',
default_toolset="v120",
),
"2013e": VisualStudioVersion(
"2013e",
"Visual Studio 2013",
solution_version="13.00",
project_version="12.0",
flat_sln=True,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v120'),
'2012': VisualStudioVersion('2012',
'Visual Studio 2012',
solution_version='12.00',
project_version='4.0',
default_toolset="v120",
),
"2012": VisualStudioVersion(
"2012",
"Visual Studio 2012",
solution_version="12.00",
project_version="4.0",
flat_sln=False,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v110'),
'2012e': VisualStudioVersion('2012e',
'Visual Studio 2012',
solution_version='12.00',
project_version='4.0',
default_toolset="v110",
),
"2012e": VisualStudioVersion(
"2012e",
"Visual Studio 2012",
solution_version="12.00",
project_version="4.0",
flat_sln=True,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based,
default_toolset='v110'),
'2010': VisualStudioVersion('2010',
'Visual Studio 2010',
solution_version='11.00',
project_version='4.0',
default_toolset="v110",
),
"2010": VisualStudioVersion(
"2010",
"Visual Studio 2010",
solution_version="11.00",
project_version="4.0",
flat_sln=False,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based),
'2010e': VisualStudioVersion('2010e',
'Visual C++ Express 2010',
solution_version='11.00',
project_version='4.0',
sdk_based=sdk_based,
),
"2010e": VisualStudioVersion(
"2010e",
"Visual C++ Express 2010",
solution_version="11.00",
project_version="4.0",
flat_sln=True,
uses_vcxproj=True,
path=path,
sdk_based=sdk_based),
'2008': VisualStudioVersion('2008',
'Visual Studio 2008',
solution_version='10.00',
project_version='9.00',
sdk_based=sdk_based,
),
"2008": VisualStudioVersion(
"2008",
"Visual Studio 2008",
solution_version="10.00",
project_version="9.00",
flat_sln=False,
uses_vcxproj=False,
path=path,
sdk_based=sdk_based),
'2008e': VisualStudioVersion('2008e',
'Visual Studio 2008',
solution_version='10.00',
project_version='9.00',
sdk_based=sdk_based,
),
"2008e": VisualStudioVersion(
"2008e",
"Visual Studio 2008",
solution_version="10.00",
project_version="9.00",
flat_sln=True,
uses_vcxproj=False,
path=path,
sdk_based=sdk_based),
'2005': VisualStudioVersion('2005',
'Visual Studio 2005',
solution_version='9.00',
project_version='8.00',
sdk_based=sdk_based,
),
"2005": VisualStudioVersion(
"2005",
"Visual Studio 2005",
solution_version="9.00",
project_version="8.00",
flat_sln=False,
uses_vcxproj=False,
path=path,
sdk_based=sdk_based),
'2005e': VisualStudioVersion('2005e',
'Visual Studio 2005',
solution_version='9.00',
project_version='8.00',
sdk_based=sdk_based,
),
"2005e": VisualStudioVersion(
"2005e",
"Visual Studio 2005",
solution_version="9.00",
project_version="8.00",
flat_sln=True,
uses_vcxproj=False,
path=path,
sdk_based=sdk_based),
sdk_based=sdk_based,
),
}
return versions[str(name)]
def _ConvertToCygpath(path):
"""Convert to cygwin path if we are using cygwin."""
if sys.platform == 'cygwin':
p = subprocess.Popen(['cygpath', path], stdout=subprocess.PIPE)
if sys.platform == "cygwin":
p = subprocess.Popen(["cygpath", path], stdout=subprocess.PIPE)
path = p.communicate()[0].strip()
if PY3:
path = path.decode('utf-8')
path = path.decode("utf-8")
return path
@ -413,63 +451,78 @@ def _DetectVisualStudioVersions(versions_to_check, force_express):
Where (e) is e for express editions of MSVS and blank otherwise.
"""
version_to_year = {
'8.0': '2005',
'9.0': '2008',
'10.0': '2010',
'11.0': '2012',
'12.0': '2013',
'14.0': '2015',
'15.0': '2017',
'16.0': '2019',
"8.0": "2005",
"9.0": "2008",
"10.0": "2010",
"11.0": "2012",
"12.0": "2013",
"14.0": "2015",
"15.0": "2017",
"16.0": "2019",
}
versions = []
for version in versions_to_check:
# Old method of searching for which VS version is installed
# We don't use the 2010-encouraged-way because we also want to get the
# path to the binaries, which it doesn't offer.
keys = [r'HKLM\Software\Microsoft\VisualStudio\%s' % version,
r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\%s' % version,
r'HKLM\Software\Microsoft\VCExpress\%s' % version,
r'HKLM\Software\Wow6432Node\Microsoft\VCExpress\%s' % version]
keys = [
r"HKLM\Software\Microsoft\VisualStudio\%s" % version,
r"HKLM\Software\Wow6432Node\Microsoft\VisualStudio\%s" % version,
r"HKLM\Software\Microsoft\VCExpress\%s" % version,
r"HKLM\Software\Wow6432Node\Microsoft\VCExpress\%s" % version,
]
for index in range(len(keys)):
path = _RegistryGetValue(keys[index], 'InstallDir')
path = _RegistryGetValue(keys[index], "InstallDir")
if not path:
continue
path = _ConvertToCygpath(path)
# Check for full.
full_path = os.path.join(path, 'devenv.exe')
express_path = os.path.join(path, '*express.exe')
full_path = os.path.join(path, "devenv.exe")
express_path = os.path.join(path, "*express.exe")
if not force_express and os.path.exists(full_path):
# Add this one.
versions.append(_CreateVersion(version_to_year[version],
os.path.join(path, '..', '..')))
versions.append(
_CreateVersion(
version_to_year[version], os.path.join(path, "..", "..")
)
)
# Check for express.
elif glob.glob(express_path):
# Add this one.
versions.append(_CreateVersion(version_to_year[version] + 'e',
os.path.join(path, '..', '..')))
versions.append(
_CreateVersion(
version_to_year[version] + "e", os.path.join(path, "..", "..")
)
)
# The old method above does not work when only SDK is installed.
keys = [r'HKLM\Software\Microsoft\VisualStudio\SxS\VC7',
r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VC7',
r'HKLM\Software\Microsoft\VisualStudio\SxS\VS7',
r'HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VS7']
keys = [
r"HKLM\Software\Microsoft\VisualStudio\SxS\VC7",
r"HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VC7",
r"HKLM\Software\Microsoft\VisualStudio\SxS\VS7",
r"HKLM\Software\Wow6432Node\Microsoft\VisualStudio\SxS\VS7",
]
for index in range(len(keys)):
path = _RegistryGetValue(keys[index], version)
if not path:
continue
path = _ConvertToCygpath(path)
if version == '15.0':
if version == "15.0":
if os.path.exists(path):
versions.append(_CreateVersion('2017', path))
elif version != '14.0': # There is no Express edition for 2015.
versions.append(_CreateVersion(version_to_year[version] + 'e',
os.path.join(path, '..'), sdk_based=True))
versions.append(_CreateVersion("2017", path))
elif version != "14.0": # There is no Express edition for 2015.
versions.append(
_CreateVersion(
version_to_year[version] + "e",
os.path.join(path, ".."),
sdk_based=True,
)
)
return versions
def SelectVisualStudioVersion(version='auto', allow_fallback=True):
def SelectVisualStudioVersion(version="auto", allow_fallback=True):
"""Select which version of Visual Studio projects to generate.
Arguments:
@ -478,39 +531,41 @@ def SelectVisualStudioVersion(version='auto', allow_fallback=True):
An object representing a visual studio project format version.
"""
# In auto mode, check environment variable for override.
if version == 'auto':
version = os.environ.get('GYP_MSVS_VERSION', 'auto')
if version == "auto":
version = os.environ.get("GYP_MSVS_VERSION", "auto")
version_map = {
'auto': ('16.0', '15.0', '14.0', '12.0', '10.0', '9.0', '8.0', '11.0'),
'2005': ('8.0',),
'2005e': ('8.0',),
'2008': ('9.0',),
'2008e': ('9.0',),
'2010': ('10.0',),
'2010e': ('10.0',),
'2012': ('11.0',),
'2012e': ('11.0',),
'2013': ('12.0',),
'2013e': ('12.0',),
'2015': ('14.0',),
'2017': ('15.0',),
'2019': ('16.0',),
"auto": ("16.0", "15.0", "14.0", "12.0", "10.0", "9.0", "8.0", "11.0"),
"2005": ("8.0",),
"2005e": ("8.0",),
"2008": ("9.0",),
"2008e": ("9.0",),
"2010": ("10.0",),
"2010e": ("10.0",),
"2012": ("11.0",),
"2012e": ("11.0",),
"2013": ("12.0",),
"2013e": ("12.0",),
"2015": ("14.0",),
"2017": ("15.0",),
"2019": ("16.0",),
}
override_path = os.environ.get('GYP_MSVS_OVERRIDE_PATH')
override_path = os.environ.get("GYP_MSVS_OVERRIDE_PATH")
if override_path:
msvs_version = os.environ.get('GYP_MSVS_VERSION')
msvs_version = os.environ.get("GYP_MSVS_VERSION")
if not msvs_version:
raise ValueError('GYP_MSVS_OVERRIDE_PATH requires GYP_MSVS_VERSION to be '
'set to a particular version (e.g. 2010e).')
raise ValueError(
"GYP_MSVS_OVERRIDE_PATH requires GYP_MSVS_VERSION to be "
"set to a particular version (e.g. 2010e)."
)
return _CreateVersion(msvs_version, override_path, sdk_based=True)
version = str(version)
versions = _DetectVisualStudioVersions(version_map[version], 'e' in version)
versions = _DetectVisualStudioVersions(version_map[version], "e" in version)
if not versions:
if not allow_fallback:
raise ValueError('Could not locate Visual Studio installation.')
if version == 'auto':
raise ValueError("Could not locate Visual Studio installation.")
if version == "auto":
# Default to 2005 if we couldn't find anything
return _CreateVersion('2005', None)
return _CreateVersion("2005", None)
else:
return _CreateVersion(version, None)
return versions[0]

gyp/pylib/gyp/__init__.py

@ -27,27 +27,30 @@ except NameError:
debug = {}
# List of "official" debug modes, but you can use anything you like.
DEBUG_GENERAL = 'general'
DEBUG_VARIABLES = 'variables'
DEBUG_INCLUDES = 'includes'
DEBUG_GENERAL = "general"
DEBUG_VARIABLES = "variables"
DEBUG_INCLUDES = "includes"
def DebugOutput(mode, message, *args):
if 'all' in gyp.debug or mode in gyp.debug:
ctx = ('unknown', 0, 'unknown')
if "all" in gyp.debug or mode in gyp.debug:
ctx = ("unknown", 0, "unknown")
try:
f = traceback.extract_stack(limit=2)
if f:
ctx = f[0][:3]
except:
except Exception:
pass
if args:
message %= args
print('%s:%s:%d:%s %s' % (mode.upper(), os.path.basename(ctx[0]),
ctx[1], ctx[2], message))
print(
"%s:%s:%d:%s %s"
% (mode.upper(), os.path.basename(ctx[0]), ctx[1], ctx[2], message)
)
def FindBuildFiles():
extension = '.gyp'
extension = ".gyp"
files = os.listdir(os.getcwd())
build_files = []
for file in files:
@ -56,9 +59,17 @@ def FindBuildFiles():
return build_files
def Load(build_files, format, default_variables={},
includes=[], depth='.', params=None, check=False,
circular_check=True, duplicate_basename_check=True):
def Load(
build_files,
format,
default_variables={},
includes=[],
depth=".",
params=None,
check=False,
circular_check=True,
duplicate_basename_check=True,
):
"""
Loads one or more specified build files.
default_variables and includes will be copied before use.
@ -68,20 +79,20 @@ def Load(build_files, format, default_variables={},
if params is None:
params = {}
if '-' in format:
format, params['flavor'] = format.split('-', 1)
if "-" in format:
format, params["flavor"] = format.split("-", 1)
default_variables = copy.copy(default_variables)
# Default variables provided by this program and its modules should be
# named WITH_CAPITAL_LETTERS to provide a distinct "best practice" namespace,
# avoiding collisions with user and automatic variables.
default_variables['GENERATOR'] = format
default_variables['GENERATOR_FLAVOR'] = params.get('flavor', '')
default_variables["GENERATOR"] = format
default_variables["GENERATOR_FLAVOR"] = params.get("flavor", "")
# Format can be a custom python file, or by default the name of a module
# within gyp.generator.
if format.endswith('.py'):
if format.endswith(".py"):
generator_name = os.path.splitext(format)[0]
path, generator_name = os.path.split(generator_name)
@ -93,7 +104,7 @@ def Load(build_files, format, default_variables={},
if path not in sys.path:
sys.path.insert(0, path)
else:
generator_name = 'gyp.generator.' + format
generator_name = "gyp.generator." + format
# These parameters are passed in order (as opposed to by key)
# because ActivePython cannot handle key parameters to __import__.
@ -103,42 +114,55 @@ def Load(build_files, format, default_variables={},
# Give the generator the opportunity to set additional variables based on
# the params it will receive in the output phase.
if getattr(generator, 'CalculateVariables', None):
if getattr(generator, "CalculateVariables", None):
generator.CalculateVariables(default_variables, params)
# Give the generator the opportunity to set generator_input_info based on
# the params it will receive in the output phase.
if getattr(generator, 'CalculateGeneratorInputInfo', None):
if getattr(generator, "CalculateGeneratorInputInfo", None):
generator.CalculateGeneratorInputInfo(params)
# Fetch the generator specific info that gets fed to input, we use getattr
# so we can default things and the generators only have to provide what
# they need.
generator_input_info = {
'non_configuration_keys':
getattr(generator, 'generator_additional_non_configuration_keys', []),
'path_sections':
getattr(generator, 'generator_additional_path_sections', []),
'extra_sources_for_rules':
getattr(generator, 'generator_extra_sources_for_rules', []),
'generator_supports_multiple_toolsets':
getattr(generator, 'generator_supports_multiple_toolsets', False),
'generator_wants_static_library_dependencies_adjusted':
getattr(generator,
'generator_wants_static_library_dependencies_adjusted', True),
'generator_wants_sorted_dependencies':
getattr(generator, 'generator_wants_sorted_dependencies', False),
'generator_filelist_paths':
getattr(generator, 'generator_filelist_paths', None),
"non_configuration_keys": getattr(
generator, "generator_additional_non_configuration_keys", []
),
"path_sections": getattr(generator, "generator_additional_path_sections", []),
"extra_sources_for_rules": getattr(
generator, "generator_extra_sources_for_rules", []
),
"generator_supports_multiple_toolsets": getattr(
generator, "generator_supports_multiple_toolsets", False
),
"generator_wants_static_library_dependencies_adjusted": getattr(
generator, "generator_wants_static_library_dependencies_adjusted", True
),
"generator_wants_sorted_dependencies": getattr(
generator, "generator_wants_sorted_dependencies", False
),
"generator_filelist_paths": getattr(
generator, "generator_filelist_paths", None
),
}
# Process the input specific to this generator.
result = gyp.input.Load(build_files, default_variables, includes[:],
depth, generator_input_info, check, circular_check,
result = gyp.input.Load(
build_files,
default_variables,
includes[:],
depth,
generator_input_info,
check,
circular_check,
duplicate_basename_check,
params['parallel'], params['root_targets'])
params["parallel"],
params["root_targets"],
)
return [generator] + result
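# --- Illustrative sketch, not part of the original commit: the generator
# contract that Load() probes with getattr above. A hypothetical minimal
# generator module needs little more than generator_default_variables and
# GenerateOutput; every attribute listed in generator_input_info above is
# optional and defaulted. All names below are placeholders, not a real gyp
# generator.
generator_default_variables = {}

def CalculateVariables(default_variables, params):
    # Optional hook: inject generator-specific default variables.
    default_variables.setdefault("OS", "linux")

def GenerateOutput(flat_list, targets, data, params):
    # Required hook: emit build files for the loaded targets.
    for qualified_target in flat_list:
        print("would generate", qualified_target)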
def NameValueListToDict(name_value_list):
"""
Takes an array of strings of the form 'NAME=VALUE' and creates a dictionary
@ -147,7 +171,7 @@ def NameValueListToDict(name_value_list):
"""
result = {}
for item in name_value_list:
tokens = item.split('=', 1)
tokens = item.split("=", 1)
if len(tokens) == 2:
# If we can make it an int, use that, otherwise, use the string.
try:
@ -161,17 +185,20 @@ def NameValueListToDict(name_value_list):
result[tokens[0]] = True
return result
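# --- Illustrative sketch, not part of the original commit: a standalone
# restatement of the NameValueListToDict contract described above. Values
# that parse as integers are stored as ints; bare names become True.
def name_value_list_to_dict(name_value_list):
    result = {}
    for item in name_value_list:
        tokens = item.split("=", 1)
        if len(tokens) == 2:
            try:
                result[tokens[0]] = int(tokens[1])  # ints stay ints
            except ValueError:
                result[tokens[0]] = tokens[1]  # otherwise keep the string
        else:
            result[tokens[0]] = True  # bare flag, e.g. "clang"
    return result

assert name_value_list_to_dict(["OS=linux", "jobs=4", "clang"]) == {
    "OS": "linux",
    "jobs": 4,
    "clang": True,
}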
def ShlexEnv(env_name):
flags = os.environ.get(env_name, [])
if flags:
flags = shlex.split(flags)
return flags
def FormatOpt(opt, value):
if opt.startswith('--'):
return '%s=%s' % (opt, value)
if opt.startswith("--"):
return "%s=%s" % (opt, value)
return opt + value
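# --- Illustrative usage, not part of the original commit: FormatOpt joins
# long options with '=' and concatenates short options directly. A compact
# standalone equivalent of the function above:
def format_opt(opt, value):
    return "%s=%s" % (opt, value) if opt.startswith("--") else opt + value

assert format_opt("--depth", "src") == "--depth=src"
assert format_opt("-D", "OS=linux") == "-DOS=linux"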
def RegenerateAppendFlag(flag, values, predicate, env_name, options):
"""Regenerate a list of command line flags, for an option of action='append'.
@ -194,6 +221,7 @@ def RegenerateAppendFlag(flag, values, predicate, env_name, options):
flags.append(FormatOpt(flag, predicate(flag_value)))
return flags
def RegenerateFlags(options):
"""Given a parsed options object, and taking the environment variables into
account, returns a list of flags that should regenerate an equivalent options
@ -204,6 +232,7 @@ def RegenerateFlags(options):
The format flag is not included, as it is assumed the calling generator will
set that as appropriate.
"""
def FixPath(path):
path = gyp.common.FixIfRelativePath(path, options.depth)
if not path:
@ -215,35 +244,43 @@ def RegenerateFlags(options):
# We always want to ignore the environment when regenerating, to avoid
# duplicate or changed flags in the environment at the time of regeneration.
flags = ['--ignore-environment']
flags = ["--ignore-environment"]
for name, metadata in options._regeneration_metadata.items():
opt = metadata['opt']
opt = metadata["opt"]
value = getattr(options, name)
value_predicate = metadata['type'] == 'path' and FixPath or Noop
action = metadata['action']
env_name = metadata['env_name']
if action == 'append':
flags.extend(RegenerateAppendFlag(opt, value, value_predicate,
env_name, options))
elif action in ('store', None): # None is a synonym for 'store'.
value_predicate = metadata["type"] == "path" and FixPath or Noop
action = metadata["action"]
env_name = metadata["env_name"]
if action == "append":
flags.extend(
RegenerateAppendFlag(opt, value, value_predicate, env_name, options)
)
elif action in ("store", None): # None is a synonym for 'store'.
if value:
flags.append(FormatOpt(opt, value_predicate(value)))
elif options.use_environment and env_name and os.environ.get(env_name):
flags.append(FormatOpt(opt, value_predicate(os.environ.get(env_name))))
elif action in ('store_true', 'store_false'):
if ((action == 'store_true' and value) or
(action == 'store_false' and not value)):
elif action in ("store_true", "store_false"):
if (action == "store_true" and value) or (
action == "store_false" and not value
):
flags.append(opt)
elif options.use_environment and env_name:
print('Warning: environment regeneration unimplemented '
'for %s flag %r env_name %r' % (action, opt,
env_name), file=sys.stderr)
print(
"Warning: environment regeneration unimplemented "
"for %s flag %r env_name %r" % (action, opt, env_name),
file=sys.stderr,
)
else:
print('Warning: regeneration unimplemented for action %r '
'flag %r' % (action, opt), file=sys.stderr)
print(
"Warning: regeneration unimplemented for action %r "
"flag %r" % (action, opt),
file=sys.stderr,
)
return flags
class RegeneratableOptionParser(argparse.ArgumentParser):
def __init__(self, usage):
self.__regeneratable_options = {}
@ -261,21 +298,21 @@ class RegeneratableOptionParser(argparse.ArgumentParser):
type: adds type='path', to tell the regenerator that the values of
this option need to be made relative to options.depth
"""
env_name = kw.pop('env_name', None)
if 'dest' in kw and kw.pop('regenerate', True):
dest = kw['dest']
env_name = kw.pop("env_name", None)
if "dest" in kw and kw.pop("regenerate", True):
dest = kw["dest"]
# The path type is needed for regenerating; for argparse we can just treat
# it as a string.
type = kw.get('type')
if type == 'path':
kw['type'] = str
type = kw.get("type")
if type == "path":
kw["type"] = str
self.__regeneratable_options[dest] = {
'action': kw.get('action'),
'type': type,
'env_name': env_name,
'opt': args[0],
"action": kw.get("action"),
"type": type,
"env_name": env_name,
"opt": args[0],
}
argparse.ArgumentParser.add_argument(self, *args, **kw)
@ -285,45 +322,100 @@ class RegeneratableOptionParser(argparse.ArgumentParser):
values._regeneration_metadata = self.__regeneratable_options
return values, args
def gyp_main(args):
my_name = os.path.basename(sys.argv[0])
usage = 'usage: %(prog)s [options ...] [build_file ...]'
usage = "usage: %(prog)s [options ...] [build_file ...]"
parser = RegeneratableOptionParser(usage=usage.replace('%s', '%(prog)s'))
parser.add_argument('--build', dest='configs', action='append',
help='configuration for build after project generation')
parser.add_argument('--check', dest='check', action='store_true',
help='check format of gyp files')
parser.add_argument('--config-dir', dest='config_dir', action='store',
env_name='GYP_CONFIG_DIR', default=None,
help='The location for configuration files like '
'include.gypi.')
parser.add_argument('-d', '--debug', dest='debug', metavar='DEBUGMODE',
action='append', default=[], help='turn on a debugging '
parser = RegeneratableOptionParser(usage=usage.replace("%s", "%(prog)s"))
parser.add_argument(
"--build",
dest="configs",
action="append",
help="configuration for build after project generation",
)
parser.add_argument(
"--check", dest="check", action="store_true", help="check format of gyp files"
)
parser.add_argument(
"--config-dir",
dest="config_dir",
action="store",
env_name="GYP_CONFIG_DIR",
default=None,
help="The location for configuration files like " "include.gypi.",
)
parser.add_argument(
"-d",
"--debug",
dest="debug",
metavar="DEBUGMODE",
action="append",
default=[],
help="turn on a debugging "
'mode for debugging GYP. Supported modes are "variables", '
'"includes" and "general" or "all" for all of them.')
parser.add_argument('-D', dest='defines', action='append', metavar='VAR=VAL',
env_name='GYP_DEFINES',
help='sets variable VAR to value VAL')
parser.add_argument('--depth', dest='depth', metavar='PATH', type='path',
help='set DEPTH gyp variable to a relative path to PATH')
parser.add_argument('-f', '--format', dest='formats', action='append',
env_name='GYP_GENERATORS', regenerate=False,
help='output formats to generate')
parser.add_argument('-G', dest='generator_flags', action='append', default=[],
metavar='FLAG=VAL', env_name='GYP_GENERATOR_FLAGS',
help='sets generator flag FLAG to VAL')
parser.add_argument('--generator-output', dest='generator_output',
action='store', default=None, metavar='DIR', type='path',
env_name='GYP_GENERATOR_OUTPUT',
help='puts generated build files under DIR')
parser.add_argument('--ignore-environment', dest='use_environment',
action='store_false', default=True, regenerate=False,
help='do not read options from environment variables')
parser.add_argument('-I', '--include', dest='includes', action='append',
metavar='INCLUDE', type='path',
help='files to include in all loaded .gyp files')
'"includes" and "general" or "all" for all of them.',
)
parser.add_argument(
"-D",
dest="defines",
action="append",
metavar="VAR=VAL",
env_name="GYP_DEFINES",
help="sets variable VAR to value VAL",
)
parser.add_argument(
"--depth",
dest="depth",
metavar="PATH",
type="path",
help="set DEPTH gyp variable to a relative path to PATH",
)
parser.add_argument(
"-f",
"--format",
dest="formats",
action="append",
env_name="GYP_GENERATORS",
regenerate=False,
help="output formats to generate",
)
parser.add_argument(
"-G",
dest="generator_flags",
action="append",
default=[],
metavar="FLAG=VAL",
env_name="GYP_GENERATOR_FLAGS",
help="sets generator flag FLAG to VAL",
)
parser.add_argument(
"--generator-output",
dest="generator_output",
action="store",
default=None,
metavar="DIR",
type="path",
env_name="GYP_GENERATOR_OUTPUT",
help="puts generated build files under DIR",
)
parser.add_argument(
"--ignore-environment",
dest="use_environment",
action="store_false",
default=True,
regenerate=False,
help="do not read options from environment variables",
)
parser.add_argument(
"-I",
"--include",
dest="includes",
action="append",
metavar="INCLUDE",
type="path",
help="files to include in all loaded .gyp files",
)
# --no-circular-check disables the check for circular relationships between
# .gyp files. These relationships should not exist, but they've only been
# observed to be harmful with the Xcode generator. Chromium's .gyp files
@ -331,29 +423,58 @@ def gyp_main(args):
# option allows the strict behavior to be used on Macs and the lenient
# behavior to be used elsewhere.
# TODO(mark): Remove this option when http://crbug.com/35878 is fixed.
parser.add_argument('--no-circular-check', dest='circular_check',
action='store_false', default=True, regenerate=False,
help="don't check for circular relationships between files")
parser.add_argument(
"--no-circular-check",
dest="circular_check",
action="store_false",
default=True,
regenerate=False,
help="don't check for circular relationships between files",
)
# --no-duplicate-basename-check disables the check for duplicate basenames
# in a static_library/shared_library project. Visual C++ 2008 generator
# doesn't support this configuration. Libtool on Mac also generates warnings
# when duplicate basenames are passed into Make generator on Mac.
# TODO(yukawa): Remove this option when these legacy generators are
# deprecated.
parser.add_argument('--no-duplicate-basename-check',
dest='duplicate_basename_check', action='store_false',
default=True, regenerate=False,
help="don't check for duplicate basenames")
parser.add_argument('--no-parallel', action='store_true', default=False,
help='Disable multiprocessing')
parser.add_argument('-S', '--suffix', dest='suffix', default='',
help='suffix to add to generated files')
parser.add_argument('--toplevel-dir', dest='toplevel_dir', action='store',
default=None, metavar='DIR', type='path',
help='directory to use as the root of the source tree')
parser.add_argument('-R', '--root-target', dest='root_targets',
action='append', metavar='TARGET',
help='include only TARGET and its deep dependencies')
parser.add_argument(
"--no-duplicate-basename-check",
dest="duplicate_basename_check",
action="store_false",
default=True,
regenerate=False,
help="don't check for duplicate basenames",
)
parser.add_argument(
"--no-parallel",
action="store_true",
default=False,
help="Disable multiprocessing",
)
parser.add_argument(
"-S",
"--suffix",
dest="suffix",
default="",
help="suffix to add to generated files",
)
parser.add_argument(
"--toplevel-dir",
dest="toplevel_dir",
action="store",
default=None,
metavar="DIR",
type="path",
help="directory to use as the root of the source tree",
)
parser.add_argument(
"-R",
"--root-target",
dest="root_targets",
action="append",
metavar="TARGET",
help="include only TARGET and its deep dependencies",
)
options, build_files_arg = parser.parse_args(args)
build_files = build_files_arg
@ -363,18 +484,18 @@ def gyp_main(args):
home = None
home_dot_gyp = None
if options.use_environment:
home_dot_gyp = os.environ.get('GYP_CONFIG_DIR', None)
home_dot_gyp = os.environ.get("GYP_CONFIG_DIR", None)
if home_dot_gyp:
home_dot_gyp = os.path.expanduser(home_dot_gyp)
if not home_dot_gyp:
home_vars = ['HOME']
if sys.platform in ('cygwin', 'win32'):
home_vars.append('USERPROFILE')
home_vars = ["HOME"]
if sys.platform in ("cygwin", "win32"):
home_vars.append("USERPROFILE")
for home_var in home_vars:
home = os.getenv(home_var)
if home != None:
home_dot_gyp = os.path.join(home, '.gyp')
if home:
home_dot_gyp = os.path.join(home, ".gyp")
if not os.path.exists(home_dot_gyp):
home_dot_gyp = None
else:
@ -389,22 +510,22 @@ def gyp_main(args):
# If no format was given on the command line, then check the env variable.
generate_formats = []
if options.use_environment:
generate_formats = os.environ.get('GYP_GENERATORS', [])
generate_formats = os.environ.get("GYP_GENERATORS", [])
if generate_formats:
generate_formats = re.split(r'[\s,]', generate_formats)
generate_formats = re.split(r"[\s,]", generate_formats)
if generate_formats:
options.formats = generate_formats
else:
# Nothing in the variable, default based on platform.
if sys.platform == 'darwin':
options.formats = ['xcode']
elif sys.platform in ('win32', 'cygwin'):
options.formats = ['msvs']
if sys.platform == "darwin":
options.formats = ["xcode"]
elif sys.platform in ("win32", "cygwin"):
options.formats = ["msvs"]
else:
options.formats = ['make']
options.formats = ["make"]
if not options.generator_output and options.use_environment:
g_o = os.environ.get('GYP_GENERATOR_OUTPUT')
g_o = os.environ.get("GYP_GENERATOR_OUTPUT")
if g_o:
options.generator_output = g_o
@ -415,9 +536,9 @@ def gyp_main(args):
# Do an extra check to avoid work when we're not debugging.
if DEBUG_GENERAL in gyp.debug:
DebugOutput(DEBUG_GENERAL, 'running with these options:')
DebugOutput(DEBUG_GENERAL, "running with these options:")
for option, value in sorted(options.__dict__.items()):
if option[0] == '_':
if option[0] == "_":
continue
if isinstance(value, string_types):
DebugOutput(DEBUG_GENERAL, " %s: '%s'", option, value)
@ -427,8 +548,7 @@ def gyp_main(args):
if not build_files:
build_files = FindBuildFiles()
if not build_files:
raise GypError((usage + '\n\n%s: error: no build_file') %
(my_name, my_name))
raise GypError((usage + "\n\n%s: error: no build_file") % (my_name, my_name))
# TODO(mark): Chromium-specific hack!
# For Chromium, the gyp "depth" variable should always be a relative path
@ -442,7 +562,7 @@ def gyp_main(args):
build_file_dir_components = build_file_dir.split(os.path.sep)
components_len = len(build_file_dir_components)
for index in range(components_len - 1, -1, -1):
if build_file_dir_components[index] == 'src':
if build_file_dir_components[index] == "src":
options.depth = os.path.sep.join(build_file_dir_components)
break
del build_file_dir_components[index]
@ -453,9 +573,11 @@ def gyp_main(args):
break
if not options.depth:
raise GypError('Could not automatically locate src directory. This is'
'a temporary Chromium feature that will be removed. Use'
'--depth as a workaround.')
raise GypError(
"Could not automatically locate src directory. This is"
"a temporary Chromium feature that will be removed. Use"
"--depth as a workaround."
)
# If toplevel-dir is not set, we assume that depth is the root of our source
# tree.
@ -468,23 +590,24 @@ def gyp_main(args):
cmdline_default_variables = {}
defines = []
if options.use_environment:
defines += ShlexEnv('GYP_DEFINES')
defines += ShlexEnv("GYP_DEFINES")
if options.defines:
defines += options.defines
cmdline_default_variables = NameValueListToDict(defines)
if DEBUG_GENERAL in gyp.debug:
DebugOutput(DEBUG_GENERAL,
"cmdline_default_variables: %s", cmdline_default_variables)
DebugOutput(
DEBUG_GENERAL, "cmdline_default_variables: %s", cmdline_default_variables
)
# Set up includes.
includes = []
# If ~/.gyp/include.gypi exists, it'll be forcibly included into every
# .gyp file that's loaded, before anything else is included.
if home_dot_gyp != None:
default_include = os.path.join(home_dot_gyp, 'include.gypi')
if home_dot_gyp:
default_include = os.path.join(home_dot_gyp, "include.gypi")
if os.path.exists(default_include):
print('Using overrides found in ' + default_include)
print("Using overrides found in " + default_include)
includes.append(default_include)
# Command-line --include files come after the default include.
@ -495,7 +618,7 @@ def gyp_main(args):
# are global across all generator runs.
gen_flags = []
if options.use_environment:
gen_flags += ShlexEnv('GYP_GENERATOR_FLAGS')
gen_flags += ShlexEnv("GYP_GENERATOR_FLAGS")
if options.generator_flags:
gen_flags += options.generator_flags
generator_flags = NameValueListToDict(gen_flags)
@ -505,22 +628,31 @@ def gyp_main(args):
# Generate all requested formats (use a set in case we got one format request
# twice)
for format in set(options.formats):
params = {'options': options,
'build_files': build_files,
'generator_flags': generator_flags,
'cwd': os.getcwd(),
'build_files_arg': build_files_arg,
'gyp_binary': sys.argv[0],
'home_dot_gyp': home_dot_gyp,
'parallel': options.parallel,
'root_targets': options.root_targets,
'target_arch': cmdline_default_variables.get('target_arch', '')}
params = {
"options": options,
"build_files": build_files,
"generator_flags": generator_flags,
"cwd": os.getcwd(),
"build_files_arg": build_files_arg,
"gyp_binary": sys.argv[0],
"home_dot_gyp": home_dot_gyp,
"parallel": options.parallel,
"root_targets": options.root_targets,
"target_arch": cmdline_default_variables.get("target_arch", ""),
}
# Start with the default variables from the command line.
[generator, flat_list, targets, data] = Load(
build_files, format, cmdline_default_variables, includes, options.depth,
params, options.check, options.circular_check,
options.duplicate_basename_check)
build_files,
format,
cmdline_default_variables,
includes,
options.depth,
params,
options.check,
options.circular_check,
options.duplicate_basename_check,
)
# TODO(mark): Pass |data| for now because the generator needs a list of
# build files that came in. In the future, maybe it should just accept
@ -532,10 +664,10 @@ def gyp_main(args):
generator.GenerateOutput(flat_list, targets, data, params)
if options.configs:
valid_configs = targets[flat_list[0]]['configurations'].keys()
valid_configs = targets[flat_list[0]]["configurations"]
for conf in options.configs:
if conf not in valid_configs:
raise GypError('Invalid config specified via --build: %s' % conf)
raise GypError("Invalid config specified via --build: %s" % conf)
generator.PerformBuild(data, options.configs, params)
# Done
@ -549,9 +681,11 @@ def main(args):
sys.stderr.write("gyp: %s\n" % e)
return 1
# NOTE: setuptools generated console_scripts calls function with no arguments
def script_main():
return main(sys.argv[1:])
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(script_main())


@ -24,6 +24,7 @@ class memoize(object):
def __init__(self, func):
self.func = func
self.cache = {}
def __call__(self, *args):
try:
return self.cache[args]
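# --- Illustrative sketch, not part of the original commit: a standalone
# equivalent of the memoize wrapper above (the KeyError branch, cut off by
# this hunk, computes the value once and caches it by positional args).
class Memoize(object):
    def __init__(self, func):
        self.func = func
        self.cache = {}

    def __call__(self, *args):
        try:
            return self.cache[args]
        except KeyError:
            self.cache[args] = self.func(*args)
            return self.cache[args]

@Memoize
def double(x):
    return 2 * x

assert double(21) == 42 and double(21) == 42  # second call hits the cache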
@ -37,6 +38,7 @@ class GypError(Exception):
"""Error class representing an error, which is to be presented
to the user. The main entry point will catch and display this.
"""
pass
@ -45,9 +47,9 @@ def ExceptionAppend(e, msg):
if not e.args:
e.args = (msg,)
elif len(e.args) == 1:
e.args = (str(e.args[0]) + ' ' + msg,)
e.args = (str(e.args[0]) + " " + msg,)
else:
e.args = (str(e.args[0]) + ' ' + msg,) + e.args[1:]
e.args = (str(e.args[0]) + " " + msg,) + e.args[1:]
def FindQualifiedTargets(target, qualified_list):
@ -62,13 +64,13 @@ def ParseQualifiedTarget(target):
# Splits a qualified target into a build file, target name and toolset.
# NOTE: rsplit is used to disambiguate the Windows drive letter separator.
target_split = target.rsplit(':', 1)
target_split = target.rsplit(":", 1)
if len(target_split) == 2:
[build_file, target] = target_split
else:
build_file = None
target_split = target.rsplit('#', 1)
target_split = target.rsplit("#", 1)
if len(target_split) == 2:
[target, toolset] = target_split
else:
@ -97,11 +99,12 @@ def ResolveTarget(build_file, target, toolset):
# interpreting it as relative to build_file. If parsed_build_file is
# absolute, it is usable as a path regardless of the current directory,
# and os.path.join will return it as-is.
build_file = os.path.normpath(os.path.join(os.path.dirname(build_file),
parsed_build_file))
build_file = os.path.normpath(
os.path.join(os.path.dirname(build_file), parsed_build_file)
)
# Further (to handle cases like ../cwd), make it relative to cwd)
if not os.path.isabs(build_file):
build_file = RelativePath(build_file, '.')
build_file = RelativePath(build_file, ".")
else:
build_file = parsed_build_file
@ -129,9 +132,9 @@ def QualifiedTarget(build_file, target, toolset):
# "Qualified" means the file that a target was defined in and the target
# name, separated by a colon, suffixed by a # and the toolset name:
# /path/to/file.gyp:target_name#toolset
fully_qualified = build_file + ':' + target
fully_qualified = build_file + ":" + target
if toolset:
fully_qualified = fully_qualified + '#' + toolset
fully_qualified = fully_qualified + "#" + toolset
return fully_qualified
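# --- Illustrative usage, not part of the original commit: the qualified
# target format documented above, built and split by hand.
build_file, target, toolset = "/path/to/file.gyp", "target_name", "host"
qualified = build_file + ":" + target + "#" + toolset
assert qualified == "/path/to/file.gyp:target_name#host"
# rsplit from the right is what keeps a Windows drive letter intact:
assert "C:/p/a.gyp:t".rsplit(":", 1) == ["C:/p/a.gyp", "t"]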
@ -154,9 +157,11 @@ def RelativePath(path, relative_to, follow_path_symlink=True):
# On Windows, we can't create a relative path to a different drive, so just
# use the absolute path.
if sys.platform == 'win32':
if (os.path.splitdrive(path)[0].lower() !=
os.path.splitdrive(relative_to)[0].lower()):
if sys.platform == "win32":
if (
os.path.splitdrive(path)[0].lower()
!= os.path.splitdrive(relative_to)[0].lower()
):
return path
# Split the paths into components.
@ -168,12 +173,13 @@ def RelativePath(path, relative_to, follow_path_symlink=True):
# Put enough ".." components to back up out of relative_to to the common
# prefix, and then append the part of path_split after the common prefix.
relative_split = [os.path.pardir] * (len(relative_to_split) - prefix_len) + \
path_split[prefix_len:]
relative_split = [os.path.pardir] * (
len(relative_to_split) - prefix_len
) + path_split[prefix_len:]
if len(relative_split) == 0:
# The paths were the same.
return ''
return ""
# Turn it back into a string and we're done.
return os.path.join(*relative_split)
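# --- Illustrative sketch, not part of the original commit: the '..'
# arithmetic above, shown via posixpath.relpath for comparison. RelativePath
# itself additionally handles symlinks and the cross-drive Windows case.
import posixpath

assert posixpath.relpath("a/b/c", "a/d") == "../b/c"
assert posixpath.relpath("a/b", "a/b") == "."  # RelativePath returns "" here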
@ -189,7 +195,7 @@ def InvertRelativePath(path, toplevel_dir=None):
"""
if not path:
return path
toplevel_dir = '.' if toplevel_dir is None else toplevel_dir
toplevel_dir = "." if toplevel_dir is None else toplevel_dir
return RelativePath(toplevel_dir, os.path.join(toplevel_dir, path))
@ -234,7 +240,7 @@ def UnrelativePath(path, relative_to):
# This does not match the characters in _escape, because those need to be
# backslash-escaped regardless of whether they appear in a double-quoted
# string.
_quote = re.compile('[\t\n #$%&\'()*;<=>?[{|}~]|^$')
_quote = re.compile("[\t\n #$%&'()*;<=>?[{|}~]|^$")
# _escape is a pattern that should match any character that needs to be
# escaped with a backslash, whether or not the argument matched the _quote
@ -262,6 +268,7 @@ _quote = re.compile('[\t\n #$%&\'()*;<=>?[{|}~]|^$')
# shells, there is no room for error here by ignoring !.
_escape = re.compile(r'(["\\`])')
def EncodePOSIXShellArgument(argument):
"""Encodes |argument| suitably for consumption by POSIX shells.
@ -278,9 +285,9 @@ def EncodePOSIXShellArgument(argument):
if _quote.search(argument):
quote = '"'
else:
quote = ''
quote = ""
encoded = quote + re.sub(_escape, r'\\\1', argument) + quote
encoded = quote + re.sub(_escape, r"\\\1", argument) + quote
return encoded
@ -295,7 +302,7 @@ def EncodePOSIXShellList(list):
encoded_arguments = []
for argument in list:
encoded_arguments.append(EncodePOSIXShellArgument(argument))
return ' '.join(encoded_arguments)
return " ".join(encoded_arguments)
def DeepDependencyTargets(target_dicts, roots):
@ -312,8 +319,8 @@ def DeepDependencyTargets(target_dicts, roots):
dependencies.add(r)
# Add its children.
spec = target_dicts[r]
pending.update(set(spec.get('dependencies', [])))
pending.update(set(spec.get('dependencies_original', [])))
pending.update(set(spec.get("dependencies", [])))
pending.update(set(spec.get("dependencies_original", [])))
return list(dependencies - set(roots))
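# --- Illustrative sketch, not part of the original commit: a standalone
# equivalent of DeepDependencyTargets above, walking 'dependencies'
# transitively and excluding the roots themselves from the result.
def deep_dependency_targets(target_dicts, roots):
    dependencies = set()
    pending = set(roots)
    while pending:
        r = pending.pop()
        if r in dependencies:
            continue
        dependencies.add(r)
        pending.update(set(target_dicts[r].get("dependencies", [])))
    return list(dependencies - set(roots))

dicts = {"a": {"dependencies": ["b"]}, "b": {"dependencies": ["c"]}, "c": {}}
assert sorted(deep_dependency_targets(dicts, ["a"])) == ["b", "c"]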
@ -343,6 +350,7 @@ def WriteOnDiff(filename):
class Writer(object):
"""Wrapper around file which only covers the target if it differs."""
def __init__(self):
# On Cygwin remove the "dir" argument because `C:` prefixed paths are treated as relative,
# consequently ending up with current dir "/cygdrive/c/..." being prefixed to those, which was
@ -351,11 +359,12 @@ def WriteOnDiff(filename):
base_temp_dir = "" if IsCygwin() else os.path.dirname(filename)
# Pick temporary file.
tmp_fd, self.tmp_path = tempfile.mkstemp(
suffix='.tmp',
prefix=os.path.split(filename)[1] + '.gyp.',
dir=base_temp_dir)
suffix=".tmp",
prefix=os.path.split(filename)[1] + ".gyp.",
dir=base_temp_dir,
)
try:
self.tmp_file = os.fdopen(tmp_fd, 'wb')
self.tmp_file = os.fdopen(tmp_fd, "w")
except Exception:
# Don't leave turds behind.
os.unlink(self.tmp_path)
@ -395,7 +404,7 @@ def WriteOnDiff(filename):
umask = os.umask(0o77)
os.umask(umask)
os.chmod(self.tmp_path, 0o666 & ~umask)
if sys.platform == 'win32' and os.path.exists(filename):
if sys.platform == "win32" and os.path.exists(filename):
# NOTE: on windows (but not cygwin) rename will not replace an
# existing file, so it must be preceded with a remove. Sadly there
# is no way to make the switch atomic.
@ -407,7 +416,7 @@ def WriteOnDiff(filename):
raise
def write(self, s):
self.tmp_file.write(s.encode('utf-8'))
self.tmp_file.write(s.encode("utf-8"))
return Writer()
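# --- Illustrative sketch, not part of the original commit: the
# write-only-if-changed pattern WriteOnDiff implements, reduced to a plain
# function. The real Writer also matches the destination's permissions; the
# os.replace() below sidesteps the Windows rename caveat noted above on
# modern Python.
import os
import tempfile

def write_on_diff(filename, new_contents):
    fd, tmp_path = tempfile.mkstemp(
        suffix=".tmp", dir=os.path.dirname(filename) or "."
    )
    with os.fdopen(fd, "w") as tmp_file:
        tmp_file.write(new_contents)
    try:
        with open(filename) as existing:
            same = existing.read() == new_contents
    except IOError:
        same = False
    if same:
        os.unlink(tmp_path)  # unchanged: keep the old file and its mtime
    else:
        os.replace(tmp_path, filename)  # swap the new contents in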
@ -423,29 +432,29 @@ def EnsureDirExists(path):
def GetFlavor(params):
"""Returns |params.flavor| if it's set, the system's default flavor else."""
flavors = {
'cygwin': 'win',
'win32': 'win',
'darwin': 'mac',
"cygwin": "win",
"win32": "win",
"darwin": "mac",
}
if 'flavor' in params:
return params['flavor']
if "flavor" in params:
return params["flavor"]
if sys.platform in flavors:
return flavors[sys.platform]
if sys.platform.startswith('sunos'):
return 'solaris'
if sys.platform.startswith(('dragonfly', 'freebsd')):
return 'freebsd'
if sys.platform.startswith('openbsd'):
return 'openbsd'
if sys.platform.startswith('netbsd'):
return 'netbsd'
if sys.platform.startswith('aix'):
return 'aix'
if sys.platform.startswith(('os390', 'zos')):
return 'zos'
if sys.platform.startswith("sunos"):
return "solaris"
if sys.platform.startswith(("dragonfly", "freebsd")):
return "freebsd"
if sys.platform.startswith("openbsd"):
return "openbsd"
if sys.platform.startswith("netbsd"):
return "netbsd"
if sys.platform.startswith("aix"):
return "aix"
if sys.platform.startswith(("os390", "zos")):
return "zos"
return 'linux'
return "linux"
def CopyTool(flavor, out_path, generator_flags={}):
@ -453,33 +462,29 @@ def CopyTool(flavor, out_path, generator_flags={}):
to |out_path|."""
# aix and solaris just need flock emulation. mac and win use more complicated
# support scripts.
prefix = {
'aix': 'flock',
'solaris': 'flock',
'mac': 'mac',
'win': 'win'
}.get(flavor, None)
prefix = {"aix": "flock", "solaris": "flock", "mac": "mac", "win": "win"}.get(
flavor, None
)
if not prefix:
return
# Slurp input file.
source_path = os.path.join(
os.path.dirname(os.path.abspath(__file__)), '%s_tool.py' % prefix)
os.path.dirname(os.path.abspath(__file__)), "%s_tool.py" % prefix
)
with open(source_path) as source_file:
source = source_file.readlines()
# Set custom header flags.
header = '# Generated by gyp. Do not edit.\n'
mac_toolchain_dir = generator_flags.get('mac_toolchain_dir', None)
if flavor == 'mac' and mac_toolchain_dir:
header += "import os;\nos.environ['DEVELOPER_DIR']='%s'\n" \
% mac_toolchain_dir
header = "# Generated by gyp. Do not edit.\n"
mac_toolchain_dir = generator_flags.get("mac_toolchain_dir", None)
if flavor == "mac" and mac_toolchain_dir:
header += "import os;\nos.environ['DEVELOPER_DIR']='%s'\n" % mac_toolchain_dir
# Add header and write it out.
tool_path = os.path.join(out_path, 'gyp-%s-tool' % prefix)
with open(tool_path, 'w') as tool_file:
tool_file.write(
''.join([source[0], header] + source[1:]))
tool_path = os.path.join(out_path, "gyp-%s-tool" % prefix)
with open(tool_path, "w") as tool_file:
tool_file.write("".join([source[0], header] + source[1:]))
# Make file executable.
os.chmod(tool_path, 0o755)
@ -491,14 +496,14 @@ def CopyTool(flavor, out_path, generator_flags={}):
# First comment, dated 2001/10/13.
# (Also in the printed Python Cookbook.)
def uniquer(seq, idfun=None):
if idfun is None:
idfun = lambda x: x
def uniquer(seq, idfun=lambda x: x):
seen = {}
result = []
for item in seq:
marker = idfun(item)
if marker in seen: continue
if marker in seen:
continue
seen[marker] = 1
result.append(item)
return result
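# --- Illustrative usage, not part of the original commit: uniquer() keeps
# the first occurrence of each key while preserving order; idfun picks the
# deduplication key.
assert uniquer([3, 1, 3, 2, 1]) == [3, 1, 2]
assert uniquer(["Ab", "aB", "cd"], idfun=str.lower) == ["Ab", "cd"]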
@ -548,15 +553,15 @@ class OrderedSet(MutableSet):
# The second argument is an addition that causes a pylint warning.
def pop(self, last=True): # pylint: disable=W0221
if not self:
raise KeyError('set is empty')
raise KeyError("set is empty")
key = self.end[1][0] if last else self.end[2][0]
self.discard(key)
return key
def __repr__(self):
if not self:
return '%s()' % (self.__class__.__name__,)
return '%s(%r)' % (self.__class__.__name__, list(self))
return "%s()" % (self.__class__.__name__,)
return "%s(%r)" % (self.__class__.__name__, list(self))
def __eq__(self, other):
if isinstance(other, OrderedSet):
@ -572,10 +577,12 @@ class OrderedSet(MutableSet):
class CycleError(Exception):
"""An exception raised when an unexpected cycle is detected."""
def __init__(self, nodes):
self.nodes = nodes
def __str__(self):
return 'CycleError: cycle involving: ' + str(self.nodes)
return "CycleError: cycle involving: " + str(self.nodes)
def TopologicallySorted(graph, get_edges):
@ -603,6 +610,7 @@ def TopologicallySorted(graph, get_edges):
visited = set()
visiting = set()
ordered_nodes = []
def Visit(node):
if node in visiting:
raise CycleError(visiting)
@ -614,26 +622,31 @@ def TopologicallySorted(graph, get_edges):
Visit(neighbor)
visiting.remove(node)
ordered_nodes.insert(0, node)
for node in sorted(graph):
Visit(node)
return ordered_nodes
def CrossCompileRequested():
# TODO: figure out how to not build extra host objects in the
# non-cross-compile case when this is enabled, and enable unconditionally.
return (os.environ.get('GYP_CROSSCOMPILE') or
os.environ.get('AR_host') or
os.environ.get('CC_host') or
os.environ.get('CXX_host') or
os.environ.get('AR_target') or
os.environ.get('CC_target') or
os.environ.get('CXX_target'))
return (
os.environ.get("GYP_CROSSCOMPILE")
or os.environ.get("AR_host")
or os.environ.get("CC_host")
or os.environ.get("CXX_host")
or os.environ.get("AR_target")
or os.environ.get("CC_target")
or os.environ.get("CXX_target")
)
def IsCygwin():
try:
out = subprocess.Popen("uname",
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT)
out = subprocess.Popen(
"uname", stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
stdout, stderr = out.communicate()
if PY3:
stdout = stdout.decode("utf-8")


@ -15,35 +15,40 @@ class TestTopologicallySorted(unittest.TestCase):
def test_Valid(self):
"""Test that sorting works on a valid graph with one possible order."""
graph = {
'a': ['b', 'c'],
'b': [],
'c': ['d'],
'd': ['b'],
"a": ["b", "c"],
"b": [],
"c": ["d"],
"d": ["b"],
}
def GetEdge(node):
return tuple(graph[node])
self.assertEqual(
gyp.common.TopologicallySorted(graph.keys(), GetEdge),
['a', 'c', 'd', 'b'])
gyp.common.TopologicallySorted(graph.keys(), GetEdge), ["a", "c", "d", "b"]
)
def test_Cycle(self):
"""Test that an exception is thrown on a cyclic graph."""
graph = {
'a': ['b'],
'b': ['c'],
'c': ['d'],
'd': ['a'],
"a": ["b"],
"b": ["c"],
"c": ["d"],
"d": ["a"],
}
def GetEdge(node):
return tuple(graph[node])
self.assertRaises(
gyp.common.CycleError, gyp.common.TopologicallySorted,
graph.keys(), GetEdge)
gyp.common.CycleError, gyp.common.TopologicallySorted, graph.keys(), GetEdge
)
class TestGetFlavor(unittest.TestCase):
"""Test that gyp.common.GetFlavor works as intended"""
original_platform = ''
original_platform = ""
def setUp(self):
self.original_platform = sys.platform
@ -56,17 +61,18 @@ class TestGetFlavor(unittest.TestCase):
self.assertEqual(expected, gyp.common.GetFlavor(param))
def test_platform_default(self):
self.assertFlavor('freebsd', 'freebsd9' , {})
self.assertFlavor('freebsd', 'freebsd10', {})
self.assertFlavor('openbsd', 'openbsd5' , {})
self.assertFlavor('solaris', 'sunos5' , {})
self.assertFlavor('solaris', 'sunos' , {})
self.assertFlavor('linux' , 'linux2' , {})
self.assertFlavor('linux' , 'linux3' , {})
self.assertFlavor("freebsd", "freebsd9", {})
self.assertFlavor("freebsd", "freebsd10", {})
self.assertFlavor("openbsd", "openbsd5", {})
self.assertFlavor("solaris", "sunos5", {})
self.assertFlavor("solaris", "sunos", {})
self.assertFlavor("linux", "linux2", {})
self.assertFlavor("linux", "linux3", {})
self.assertFlavor("linux", "linux", {})
def test_param(self):
self.assertFlavor('foobar', 'linux2' , {'flavor': 'foobar'})
self.assertFlavor("foobar", "linux2", {"flavor": "foobar"})
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()


@ -8,7 +8,7 @@ import locale
from functools import reduce
def XmlToString(content, encoding='utf-8', pretty=False):
def XmlToString(content, encoding="utf-8", pretty=False):
""" Writes the XML content to disk, touching the file only if it has changed.
Visual Studio files have a lot of pre-defined structures. This function makes
@ -49,11 +49,11 @@ def XmlToString(content, encoding='utf-8', pretty=False):
# We create a huge list of all the elements of the file.
xml_parts = ['<?xml version="1.0" encoding="%s"?>' % encoding]
if pretty:
xml_parts.append('\n')
xml_parts.append("\n")
_ConstructContentList(xml_parts, content, pretty)
# Convert it to a string
return ''.join(xml_parts)
return "".join(xml_parts)
def _ConstructContentList(xml_parts, specification, pretty, level=0):
@ -67,16 +67,18 @@ def _ConstructContentList(xml_parts, specification, pretty, level=0):
"""
# The first item in a specification is the name of the element.
if pretty:
indentation = ' ' * level
new_line = '\n'
indentation = " " * level
new_line = "\n"
else:
indentation = ''
new_line = ''
indentation = ""
new_line = ""
name = specification[0]
if not isinstance(name, str):
raise Exception('The first item of an EasyXml specification should be '
'a string. Specification was ' + str(specification))
xml_parts.append(indentation + '<' + name)
raise Exception(
"The first item of an EasyXml specification should be "
"a string. Specification was " + str(specification)
)
xml_parts.append(indentation + "<" + name)
# Optionally in second position is a dictionary of the attributes.
rest = specification[1:]
@ -85,7 +87,7 @@ def _ConstructContentList(xml_parts, specification, pretty, level=0):
xml_parts.append(' %s="%s"' % (at, _XmlEscape(val, attr=True)))
rest = rest[1:]
if rest:
xml_parts.append('>')
xml_parts.append(">")
all_strings = reduce(lambda x, y: x and isinstance(y, str), rest, True)
multi_line = not all_strings
if multi_line and new_line:
@ -99,13 +101,12 @@ def _ConstructContentList(xml_parts, specification, pretty, level=0):
_ConstructContentList(xml_parts, child_spec, pretty, level + 1)
if multi_line and indentation:
xml_parts.append(indentation)
xml_parts.append('</%s>%s' % (name, new_line))
xml_parts.append("</%s>%s" % (name, new_line))
else:
xml_parts.append('/>%s' % new_line)
xml_parts.append("/>%s" % new_line)
def WriteXmlIfChanged(content, path, encoding='utf-8', pretty=False,
win32=False):
def WriteXmlIfChanged(content, path, encoding="utf-8", pretty=False, win32=False):
""" Writes the XML content to disk, touching the file only if it has changed.
Args:
@ -115,8 +116,8 @@ def WriteXmlIfChanged(content, path, encoding='utf-8', pretty=False,
pretty: True if we want pretty printing with indents and new lines.
"""
xml_string = XmlToString(content, encoding, pretty)
if win32 and os.linesep != '\r\n':
xml_string = xml_string.replace('\n', '\r\n')
if win32 and os.linesep != "\r\n":
xml_string = xml_string.replace("\n", "\r\n")
default_encoding = locale.getdefaultlocale()[1]
if default_encoding and default_encoding.upper() != encoding.upper():
@ -124,40 +125,39 @@ def WriteXmlIfChanged(content, path, encoding='utf-8', pretty=False,
# Get the old content
try:
f = open(path, 'r')
existing = f.read()
f.close()
except:
with open(path, "r") as file:
existing = file.read()
except IOError:
existing = None
# It has changed, write it
if existing != xml_string:
f = open(path, 'wb')
f.write(xml_string)
f.close()
with open(path, "wb") as file:
file.write(xml_string)
_xml_escape_map = {
'"': '&quot;',
"'": '&apos;',
'<': '&lt;',
'>': '&gt;',
'&': '&amp;',
'\n': '&#xA;',
'\r': '&#xD;',
'"': "&quot;",
"'": "&apos;",
"<": "&lt;",
">": "&gt;",
"&": "&amp;",
"\n": "&#xA;",
"\r": "&#xD;",
}
_xml_escape_re = re.compile(
"(%s)" % "|".join(map(re.escape, _xml_escape_map.keys())))
_xml_escape_re = re.compile("(%s)" % "|".join(map(re.escape, _xml_escape_map.keys())))
def _XmlEscape(value, attr=False):
""" Escape a string for inclusion in XML."""
def replace(match):
m = match.string[match.start() : match.end()]
# don't replace single quotes in attrs
if attr and m == "'":
return m
return _xml_escape_map[m]
return _xml_escape_re.sub(replace, value)


@ -16,92 +16,97 @@ except ImportError:
class TestSequenceFunctions(unittest.TestCase):
def setUp(self):
self.stderr = StringIO()
def test_EasyXml_simple(self):
self.assertEqual(
easy_xml.XmlToString(['test']),
'<?xml version="1.0" encoding="utf-8"?><test/>')
easy_xml.XmlToString(["test"]),
'<?xml version="1.0" encoding="utf-8"?><test/>',
)
self.assertEqual(
easy_xml.XmlToString(['test'], encoding='Windows-1252'),
'<?xml version="1.0" encoding="Windows-1252"?><test/>')
easy_xml.XmlToString(["test"], encoding="Windows-1252"),
'<?xml version="1.0" encoding="Windows-1252"?><test/>',
)
def test_EasyXml_simple_with_attributes(self):
self.assertEqual(
easy_xml.XmlToString(['test2', {'a': 'value1', 'b': 'value2'}]),
'<?xml version="1.0" encoding="utf-8"?><test2 a="value1" b="value2"/>')
easy_xml.XmlToString(["test2", {"a": "value1", "b": "value2"}]),
'<?xml version="1.0" encoding="utf-8"?><test2 a="value1" b="value2"/>',
)
def test_EasyXml_escaping(self):
original = '<test>\'"\r&\nfoo'
converted = '&lt;test&gt;\'&quot;&#xD;&amp;&#xA;foo'
converted_apos = converted.replace("'", '&apos;')
original = "<test>'\"\r&\nfoo"
converted = "&lt;test&gt;'&quot;&#xD;&amp;&#xA;foo"
converted_apos = converted.replace("'", "&apos;")
self.assertEqual(
easy_xml.XmlToString(['test3', {'a': original}, original]),
'<?xml version="1.0" encoding="utf-8"?><test3 a="%s">%s</test3>' %
(converted, converted_apos))
easy_xml.XmlToString(["test3", {"a": original}, original]),
'<?xml version="1.0" encoding="utf-8"?><test3 a="%s">%s</test3>'
% (converted, converted_apos),
)
def test_EasyXml_pretty(self):
self.assertEqual(
easy_xml.XmlToString(
['test3',
['GrandParent',
['Parent1',
['Child']
],
['Parent2']
]
],
pretty=True),
["test3", ["GrandParent", ["Parent1", ["Child"]], ["Parent2"]]],
pretty=True,
),
'<?xml version="1.0" encoding="utf-8"?>\n'
'<test3>\n'
' <GrandParent>\n'
' <Parent1>\n'
' <Child/>\n'
' </Parent1>\n'
' <Parent2/>\n'
' </GrandParent>\n'
'</test3>\n')
"<test3>\n"
" <GrandParent>\n"
" <Parent1>\n"
" <Child/>\n"
" </Parent1>\n"
" <Parent2/>\n"
" </GrandParent>\n"
"</test3>\n",
)
def test_EasyXml_complex(self):
# We want to create:
target = (
'<?xml version="1.0" encoding="utf-8"?>'
'<Project>'
"<Project>"
'<PropertyGroup Label="Globals">'
'<ProjectGuid>{D2250C20-3A94-4FB9-AF73-11BC5B73884B}</ProjectGuid>'
'<Keyword>Win32Proj</Keyword>'
'<RootNamespace>automated_ui_tests</RootNamespace>'
'</PropertyGroup>'
"<ProjectGuid>{D2250C20-3A94-4FB9-AF73-11BC5B73884B}</ProjectGuid>"
"<Keyword>Win32Proj</Keyword>"
"<RootNamespace>automated_ui_tests</RootNamespace>"
"</PropertyGroup>"
'<Import Project="$(VCTargetsPath)\\Microsoft.Cpp.props"/>'
'<PropertyGroup '
'Condition="\'$(Configuration)|$(Platform)\'=='
"<PropertyGroup "
"Condition=\"'$(Configuration)|$(Platform)'=="
'\'Debug|Win32\'" Label="Configuration">'
'<ConfigurationType>Application</ConfigurationType>'
'<CharacterSet>Unicode</CharacterSet>'
'</PropertyGroup>'
'</Project>')
"<ConfigurationType>Application</ConfigurationType>"
"<CharacterSet>Unicode</CharacterSet>"
"</PropertyGroup>"
"</Project>"
)
xml = easy_xml.XmlToString(
['Project',
['PropertyGroup', {'Label': 'Globals'},
['ProjectGuid', '{D2250C20-3A94-4FB9-AF73-11BC5B73884B}'],
['Keyword', 'Win32Proj'],
['RootNamespace', 'automated_ui_tests']
[
"Project",
[
"PropertyGroup",
{"Label": "Globals"},
["ProjectGuid", "{D2250C20-3A94-4FB9-AF73-11BC5B73884B}"],
["Keyword", "Win32Proj"],
["RootNamespace", "automated_ui_tests"],
],
["Import", {"Project": "$(VCTargetsPath)\\Microsoft.Cpp.props"}],
[
"PropertyGroup",
{
"Condition": "'$(Configuration)|$(Platform)'=='Debug|Win32'",
"Label": "Configuration",
},
["ConfigurationType", "Application"],
["CharacterSet", "Unicode"],
],
['Import', {'Project': '$(VCTargetsPath)\\Microsoft.Cpp.props'}],
['PropertyGroup',
{'Condition': "'$(Configuration)|$(Platform)'=='Debug|Win32'",
'Label': 'Configuration'},
['ConfigurationType', 'Application'],
['CharacterSet', 'Unicode']
]
])
)
self.assertEqual(xml, target)
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()


@ -20,6 +20,7 @@ def main(args):
class FlockTool(object):
"""This class emulates the 'flock' command."""
def Dispatch(self, args):
"""Dispatches a string command to a method."""
if len(args) < 1:
@ -30,7 +31,7 @@ class FlockTool(object):
def _CommandifyName(self, name_string):
"""Transforms a tool name like copy-info-plist to CopyInfoPlist"""
return name_string.title().replace('-', '')
return name_string.title().replace("-", "")
def ExecFlock(self, lockfile, *cmd_list):
"""Emulates the most basic behavior of Linux's flock(1)."""
@ -40,15 +41,15 @@ class FlockTool(object):
# with EBADF, that's why we use this F_SETLK
# hack instead.
fd = os.open(lockfile, os.O_WRONLY | os.O_NOCTTY | os.O_CREAT, 0o666)
if sys.platform.startswith('aix'):
if sys.platform.startswith("aix"):
# Python on AIX is compiled with LARGEFILE support, which changes the
# struct size.
op = struct.pack('hhIllqq', fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0)
op = struct.pack("hhIllqq", fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0)
else:
op = struct.pack('hhllhhl', fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0)
op = struct.pack("hhllhhl", fcntl.F_WRLCK, 0, 0, 0, 0, 0, 0)
fcntl.fcntl(fd, fcntl.F_SETLK, op)
return subprocess.call(cmd_list)
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))


@ -65,18 +65,16 @@ then the "all" target includes "b1" and "b2".
from __future__ import print_function
import gyp.common
import gyp.ninja_syntax as ninja_syntax
import json
import os
import posixpath
import sys
debug = False
found_dependency_string = 'Found dependency'
no_dependency_string = 'No dependencies'
found_dependency_string = "Found dependency"
no_dependency_string = "No dependencies"
# Status when it should be assumed that everything has changed.
all_changed_string = 'Found dependency (all)'
all_changed_string = "Found dependency (all)"
# MatchStatus is used to indicate if and how a target depends upon the supplied
# sources.
@ -96,25 +94,37 @@ generator_supports_multiple_toolsets = gyp.common.CrossCompileRequested()
generator_wants_static_library_dependencies_adjusted = False
generator_default_variables = {
}
for dirname in ['INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR', 'PRODUCT_DIR',
'LIB_DIR', 'SHARED_LIB_DIR']:
generator_default_variables[dirname] = '!!!'
generator_default_variables = {}
for dirname in [
"INTERMEDIATE_DIR",
"SHARED_INTERMEDIATE_DIR",
"PRODUCT_DIR",
"LIB_DIR",
"SHARED_LIB_DIR",
]:
generator_default_variables[dirname] = "!!!"
for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME',
'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT',
'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX',
'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX',
'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX',
'CONFIGURATION_NAME']:
generator_default_variables[unused] = ''
for unused in [
"RULE_INPUT_PATH",
"RULE_INPUT_ROOT",
"RULE_INPUT_NAME",
"RULE_INPUT_DIRNAME",
"RULE_INPUT_EXT",
"EXECUTABLE_PREFIX",
"EXECUTABLE_SUFFIX",
"STATIC_LIB_PREFIX",
"STATIC_LIB_SUFFIX",
"SHARED_LIB_PREFIX",
"SHARED_LIB_SUFFIX",
"CONFIGURATION_NAME",
]:
generator_default_variables[unused] = ""
def _ToGypPath(path):
"""Converts a path to the format used by gyp."""
if os.sep == '\\' and os.altsep == '/':
return path.replace('\\', '/')
if os.sep == "\\" and os.altsep == "/":
return path.replace("\\", "/")
return path
@ -123,17 +133,20 @@ def _ResolveParent(path, base_path_components):
string if the path shouldn't be considered. See _AddSources() for a
description of |base_path_components|."""
depth = 0
while path.startswith('../'):
while path.startswith("../"):
depth += 1
path = path[3:]
# Relative includes may go outside the source tree. For example, an action may
# have inputs in /usr/include, which are not in the source tree.
if depth > len(base_path_components):
return ''
return ""
if depth == len(base_path_components):
return path
return '/'.join(base_path_components[0:len(base_path_components) - depth]) + \
'/' + path
return (
"/".join(base_path_components[0 : len(base_path_components) - depth])
+ "/"
+ path
)
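# --- Illustrative usage, not part of the original commit, following the
# rules above: each '..' consumes one base path component, and escaping past
# the source tree yields ''.
assert _ResolveParent("../b/x.cc", ["chrome", "renderer"]) == "chrome/b/x.cc"
assert _ResolveParent("../../../etc/x.h", ["chrome", "renderer"]) == ""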
def _AddSources(sources, base_path, base_path_components, result):
@ -145,33 +158,32 @@ def _AddSources(sources, base_path, base_path_components, result):
and tracked in some other means."""
# NOTE: gyp paths are always posix style.
for source in sources:
if not len(source) or source.startswith('!!!') or source.startswith('$'):
if not len(source) or source.startswith("!!!") or source.startswith("$"):
continue
# variable expansion may lead to //.
org_source = source
source = source[0] + source[1:].replace('//', '/')
if source.startswith('../'):
source = source[0] + source[1:].replace("//", "/")
if source.startswith("../"):
source = _ResolveParent(source, base_path_components)
if len(source):
result.append(source)
continue
result.append(base_path + source)
if debug:
print('AddSource', org_source, result[len(result) - 1])
print("AddSource", org_source, result[len(result) - 1])
def _ExtractSourcesFromAction(action, base_path, base_path_components,
results):
if 'inputs' in action:
_AddSources(action['inputs'], base_path, base_path_components, results)
def _ExtractSourcesFromAction(action, base_path, base_path_components, results):
if "inputs" in action:
_AddSources(action["inputs"], base_path, base_path_components, results)
def _ToLocalPath(toplevel_dir, path):
"""Converts |path| to a path relative to |toplevel_dir|."""
if path == toplevel_dir:
return ''
if path.startswith(toplevel_dir + '/'):
return path[len(toplevel_dir) + len('/'):]
return ""
if path.startswith(toplevel_dir + "/"):
return path[len(toplevel_dir) + len("/") :]
return path
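# --- Illustrative usage, not part of the original commit:
assert _ToLocalPath("/src", "/src/foo/bar.cc") == "foo/bar.cc"
assert _ToLocalPath("/src", "/src") == ""
assert _ToLocalPath("/src", "/elsewhere/x.cc") == "/elsewhere/x.cc"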
@ -180,27 +192,25 @@ def _ExtractSources(target, target_dict, toplevel_dir):
# source paths are always posix. Convert |target| to a posix path relative to
# |toplevel_dir_|. This is done to make it easy to build source paths.
base_path = posixpath.dirname(_ToLocalPath(toplevel_dir, _ToGypPath(target)))
base_path_components = base_path.split('/')
base_path_components = base_path.split("/")
# Add a trailing '/' so that _AddSources() can easily build paths.
if len(base_path):
base_path += '/'
base_path += "/"
if debug:
print('ExtractSources', target, base_path)
print("ExtractSources", target, base_path)
results = []
if 'sources' in target_dict:
_AddSources(target_dict['sources'], base_path, base_path_components,
results)
if "sources" in target_dict:
_AddSources(target_dict["sources"], base_path, base_path_components, results)
# Include the inputs from any actions. Any changes to these affect the
# resulting output.
if 'actions' in target_dict:
for action in target_dict['actions']:
_ExtractSourcesFromAction(action, base_path, base_path_components,
results)
if 'rules' in target_dict:
for rule in target_dict['rules']:
if "actions" in target_dict:
for action in target_dict["actions"]:
_ExtractSourcesFromAction(action, base_path, base_path_components, results)
if "rules" in target_dict:
for rule in target_dict["rules"]:
_ExtractSourcesFromAction(rule, base_path, base_path_components, results)
return results
@ -225,6 +235,7 @@ class Target(object):
is_static_library: true if the type of target is static_library.
is_or_has_linked_ancestor: true if the target does a link (e.g. executable), or
if there is a target in back_deps that does a link."""
def __init__(self, name):
self.deps = set()
self.match_status = MATCH_STATUS_TBD
@ -245,6 +256,7 @@ class Config(object):
"""Details what we're looking for
files: set of files to search for
targets: see file description for details."""
def __init__(self):
self.files = []
self.targets = set()
@ -254,24 +266,25 @@ class Config(object):
def Init(self, params):
"""Initializes Config. This is a separate method as it raises an exception
if there is a parse error."""
generator_flags = params.get('generator_flags', {})
config_path = generator_flags.get('config_path', None)
generator_flags = params.get("generator_flags", {})
config_path = generator_flags.get("config_path", None)
if not config_path:
return
try:
f = open(config_path, 'r')
f = open(config_path, "r")
config = json.load(f)
f.close()
except IOError:
raise Exception('Unable to open file ' + config_path)
raise Exception("Unable to open file " + config_path)
except ValueError as e:
raise Exception('Unable to parse config file ' + config_path + str(e))
raise Exception("Unable to parse config file " + config_path + str(e))
if not isinstance(config, dict):
raise Exception('config_path must be a JSON file containing a dictionary')
self.files = config.get('files', [])
raise Exception("config_path must be a JSON file containing a dictionary")
self.files = config.get("files", [])
self.additional_compile_target_names = set(
config.get('additional_compile_targets', []))
self.test_target_names = set(config.get('test_targets', []))
config.get("additional_compile_targets", [])
)
self.test_target_names = set(config.get("test_targets", []))
def _WasBuildFileModified(build_file, data, files, toplevel_dir):
@ -280,21 +293,26 @@ def _WasBuildFileModified(build_file, data, files, toplevel_dir):
the root of the source tree."""
if _ToLocalPath(toplevel_dir, _ToGypPath(build_file)) in files:
if debug:
print('gyp file modified', build_file)
print("gyp file modified", build_file)
return True
# First element of included_files is the file itself.
if len(data[build_file]['included_files']) <= 1:
if len(data[build_file]["included_files"]) <= 1:
return False
for include_file in data[build_file]['included_files'][1:]:
for include_file in data[build_file]["included_files"][1:]:
# |included_files| are relative to the directory of the |build_file|.
rel_include_file = \
_ToGypPath(gyp.common.UnrelativePath(include_file, build_file))
rel_include_file = _ToGypPath(
gyp.common.UnrelativePath(include_file, build_file)
)
if _ToLocalPath(toplevel_dir, rel_include_file) in files:
if debug:
print('included gyp file modified, gyp_file=', build_file,
'included file=', rel_include_file)
print(
"included gyp file modified, gyp_file=",
build_file,
"included file=",
rel_include_file,
)
return True
return False
@ -313,12 +331,14 @@ def _GetOrCreateTargetByName(targets, target_name):
def _DoesTargetTypeRequireBuild(target_dict):
"""Returns true if the target type is such that it needs to be built."""
# If a 'none' target has rules or actions we assume it requires a build.
return bool(target_dict['type'] != 'none' or
target_dict.get('actions') or target_dict.get('rules'))
return bool(
target_dict["type"] != "none"
or target_dict.get("actions")
or target_dict.get("rules")
)
def _GenerateTargets(data, target_list, target_dicts, toplevel_dir, files,
build_files):
def _GenerateTargets(data, target_list, target_dicts, toplevel_dir, files, build_files):
"""Returns a tuple of the following:
. A dictionary mapping from fully qualified name to Target.
. A list of the targets that have a source file in |files|.
@ -348,26 +368,26 @@ def _GenerateTargets(data, target_list, target_dicts, toplevel_dir, files,
while len(targets_to_visit) > 0:
target_name = targets_to_visit.pop()
created_target, target = _GetOrCreateTargetByName(name_to_target,
target_name)
created_target, target = _GetOrCreateTargetByName(name_to_target, target_name)
if created_target:
roots.add(target)
elif target.visited:
continue
target.visited = True
target.requires_build = _DoesTargetTypeRequireBuild(
target_dicts[target_name])
target_type = target_dicts[target_name]['type']
target.is_executable = target_type == 'executable'
target.is_static_library = target_type == 'static_library'
target.is_or_has_linked_ancestor = (target_type == 'executable' or
target_type == 'shared_library')
target.requires_build = _DoesTargetTypeRequireBuild(target_dicts[target_name])
target_type = target_dicts[target_name]["type"]
target.is_executable = target_type == "executable"
target.is_static_library = target_type == "static_library"
target.is_or_has_linked_ancestor = (
target_type == "executable" or target_type == "shared_library"
)
build_file = gyp.common.ParseQualifiedTarget(target_name)[0]
if not build_file in build_file_in_files:
build_file_in_files[build_file] = \
_WasBuildFileModified(build_file, data, files, toplevel_dir)
if build_file not in build_file_in_files:
build_file_in_files[build_file] = _WasBuildFileModified(
build_file, data, files, toplevel_dir
)
if build_file in build_files:
build_file_targets.add(target)
@ -375,25 +395,27 @@ def _GenerateTargets(data, target_list, target_dicts, toplevel_dir, files,
# If a build file (or any of its included files) is modified we assume all
# targets in the file are modified.
if build_file_in_files[build_file]:
print('matching target from modified build file', target_name)
print("matching target from modified build file", target_name)
target.match_status = MATCH_STATUS_MATCHES
matching_targets.append(target)
else:
sources = _ExtractSources(target_name, target_dicts[target_name],
toplevel_dir)
sources = _ExtractSources(
target_name, target_dicts[target_name], toplevel_dir
)
for source in sources:
if _ToGypPath(os.path.normpath(source)) in files:
print('target', target_name, 'matches', source)
print("target", target_name, "matches", source)
target.match_status = MATCH_STATUS_MATCHES
matching_targets.append(target)
break
# Add dependencies to visit as well as updating back pointers for deps.
for dep in target_dicts[target_name].get('dependencies', []):
for dep in target_dicts[target_name].get("dependencies", []):
targets_to_visit.append(dep)
created_dep_target, dep_target = _GetOrCreateTargetByName(name_to_target,
dep)
created_dep_target, dep_target = _GetOrCreateTargetByName(
name_to_target, dep
)
if not created_dep_target:
roots.discard(dep_target)
@ -429,13 +451,15 @@ def _DoesTargetDependOnMatchingTargets(target):
target: the Target to look for."""
if target.match_status == MATCH_STATUS_DOESNT_MATCH:
return False
if target.match_status == MATCH_STATUS_MATCHES or \
target.match_status == MATCH_STATUS_MATCHES_BY_DEPENDENCY:
if (
target.match_status == MATCH_STATUS_MATCHES
or target.match_status == MATCH_STATUS_MATCHES_BY_DEPENDENCY
):
return True
for dep in target.deps:
if _DoesTargetDependOnMatchingTargets(dep):
target.match_status = MATCH_STATUS_MATCHES_BY_DEPENDENCY
print('\t', target.name, 'matches by dep', dep.name)
print("\t", target.name, "matches by dep", dep.name)
return True
target.match_status = MATCH_STATUS_DOESNT_MATCH
return False
@ -447,7 +471,7 @@ def _GetTargetsDependingOnMatchingTargets(possible_targets):
supplied as input to analyzer.
possible_targets: targets to search from."""
found = []
print('Targets that matched by dependency:')
print("Targets that matched by dependency:")
for target in possible_targets:
if _DoesTargetDependOnMatchingTargets(target):
found.append(target)
@ -471,8 +495,7 @@ def _AddCompileTargets(target, roots, add_if_no_ancestor, result):
_AddCompileTargets(back_dep_target, roots, False, result)
target.added_to_compile_targets |= back_dep_target.added_to_compile_targets
target.in_roots |= back_dep_target.in_roots
target.is_or_has_linked_ancestor |= (
back_dep_target.is_or_has_linked_ancestor)
target.is_or_has_linked_ancestor |= back_dep_target.is_or_has_linked_ancestor
# Always add 'executable' targets. Even though they may be built by other
# targets that depend upon them it makes detection of what is going to be
@ -480,18 +503,34 @@ def _AddCompileTargets(target, roots, add_if_no_ancestor, result):
# And always add static_libraries that have no dependencies on them from
# linkables. This is necessary as the other dependencies on them may be
# static libraries themselves, which are not compile time dependencies.
if target.in_roots and \
(target.is_executable or
(not target.added_to_compile_targets and
(add_if_no_ancestor or target.requires_build)) or
(target.is_static_library and add_if_no_ancestor and
not target.is_or_has_linked_ancestor)):
print('\t\tadding to compile targets', target.name, 'executable',
target.is_executable, 'added_to_compile_targets',
target.added_to_compile_targets, 'add_if_no_ancestor',
add_if_no_ancestor, 'requires_build', target.requires_build,
'is_static_library', target.is_static_library,
'is_or_has_linked_ancestor', target.is_or_has_linked_ancestor)
if target.in_roots and (
target.is_executable
or (
not target.added_to_compile_targets
and (add_if_no_ancestor or target.requires_build)
)
or (
target.is_static_library
and add_if_no_ancestor
and not target.is_or_has_linked_ancestor
)
):
print(
"\t\tadding to compile targets",
target.name,
"executable",
target.is_executable,
"added_to_compile_targets",
target.added_to_compile_targets,
"add_if_no_ancestor",
add_if_no_ancestor,
"requires_build",
target.requires_build,
"is_static_library",
target.is_static_library,
"is_or_has_linked_ancestor",
target.is_or_has_linked_ancestor,
)
result.add(target)
target.added_to_compile_targets = True
@ -502,63 +541,62 @@ def _GetCompileTargets(matching_targets, supplied_targets):
supplied_targets: set of targets supplied to analyzer to search from."""
result = set()
for target in matching_targets:
print('finding compile targets for match', target.name)
print("finding compile targets for match", target.name)
_AddCompileTargets(target, supplied_targets, True, result)
return result
def _WriteOutput(params, **values):
"""Writes the output, either to stdout or a file is specified."""
if 'error' in values:
print('Error:', values['error'])
if 'status' in values:
print(values['status'])
if 'targets' in values:
values['targets'].sort()
print('Supplied targets that depend on changed files:')
for target in values['targets']:
print('\t', target)
if 'invalid_targets' in values:
values['invalid_targets'].sort()
print('The following targets were not found:')
for target in values['invalid_targets']:
print('\t', target)
if 'build_targets' in values:
values['build_targets'].sort()
print('Targets that require a build:')
for target in values['build_targets']:
print('\t', target)
if 'compile_targets' in values:
values['compile_targets'].sort()
print('Targets that need to be built:')
for target in values['compile_targets']:
print('\t', target)
if 'test_targets' in values:
values['test_targets'].sort()
print('Test targets:')
for target in values['test_targets']:
print('\t', target)
if "error" in values:
print("Error:", values["error"])
if "status" in values:
print(values["status"])
if "targets" in values:
values["targets"].sort()
print("Supplied targets that depend on changed files:")
for target in values["targets"]:
print("\t", target)
if "invalid_targets" in values:
values["invalid_targets"].sort()
print("The following targets were not found:")
for target in values["invalid_targets"]:
print("\t", target)
if "build_targets" in values:
values["build_targets"].sort()
print("Targets that require a build:")
for target in values["build_targets"]:
print("\t", target)
if "compile_targets" in values:
values["compile_targets"].sort()
print("Targets that need to be built:")
for target in values["compile_targets"]:
print("\t", target)
if "test_targets" in values:
values["test_targets"].sort()
print("Test targets:")
for target in values["test_targets"]:
print("\t", target)
output_path = params.get('generator_flags', {}).get(
'analyzer_output_path', None)
output_path = params.get("generator_flags", {}).get("analyzer_output_path", None)
if not output_path:
print(json.dumps(values))
return
try:
f = open(output_path, 'w')
f.write(json.dumps(values) + '\n')
f = open(output_path, "w")
f.write(json.dumps(values) + "\n")
f.close()
except IOError as e:
print('Error writing to output file', output_path, str(e))
print("Error writing to output file", output_path, str(e))
def _WasGypIncludeFileModified(params, files):
"""Returns true if one of the files in |files| is in the set of included
files."""
if params['options'].includes:
for include in params['options'].includes:
if params["options"].includes:
for include in params["options"].includes:
if _ToGypPath(os.path.normpath(include)) in files:
print('Include file modified, assuming all changed', include)
print("Include file modified, assuming all changed", include)
return True
return False
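Note: the JSON object that _WriteOutput emits above uses the keys handled there ("status", "targets", "invalid_targets", "build_targets", "compile_targets", "test_targets"). A minimal sketch of one plausible result, assuming "Found dependency" is the value of found_dependency_string (the target names are invented):

import json

# Hypothetical analyzer result mirroring the keys consumed by _WriteOutput.
result = {
    "status": "Found dependency",  # assumed value of found_dependency_string
    "test_targets": ["foo_unittests"],
    "compile_targets": ["foo", "foo_unittests"],
}
print(json.dumps(result))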
@ -577,38 +615,47 @@ def _LookupTargets(names, mapping):
def CalculateVariables(default_variables, params):
"""Calculate additional variables for use in the build (called by gyp)."""
flavor = gyp.common.GetFlavor(params)
if flavor == 'mac':
default_variables.setdefault('OS', 'mac')
elif flavor == 'win':
default_variables.setdefault('OS', 'win')
# Copy additional generator configuration data from VS, which is shared
# by the Windows Ninja generator.
import gyp.generator.msvs as msvs_generator
generator_additional_non_configuration_keys = getattr(msvs_generator,
'generator_additional_non_configuration_keys', [])
generator_additional_path_sections = getattr(msvs_generator,
'generator_additional_path_sections', [])
if flavor == "mac":
default_variables.setdefault("OS", "mac")
elif flavor == "win":
default_variables.setdefault("OS", "win")
gyp.msvs_emulation.CalculateCommonVariables(default_variables, params)
else:
operating_system = flavor
if flavor == 'android':
operating_system = 'linux' # Keep this legacy behavior for now.
default_variables.setdefault('OS', operating_system)
if flavor == "android":
operating_system = "linux" # Keep this legacy behavior for now.
default_variables.setdefault("OS", operating_system)
class TargetCalculator(object):
"""Calculates the matching test_targets and matching compile_targets."""
def __init__(self, files, additional_compile_target_names, test_target_names,
data, target_list, target_dicts, toplevel_dir, build_files):
def __init__(
self,
files,
additional_compile_target_names,
test_target_names,
data,
target_list,
target_dicts,
toplevel_dir,
build_files,
):
self._additional_compile_target_names = set(additional_compile_target_names)
self._test_target_names = set(test_target_names)
self._name_to_target, self._changed_targets, self._root_targets = (
_GenerateTargets(data, target_list, target_dicts, toplevel_dir,
frozenset(files), build_files))
self._unqualified_mapping, self.invalid_targets = (
_GetUnqualifiedToTargetMapping(self._name_to_target,
self._supplied_target_names_no_all()))
(
self._name_to_target,
self._changed_targets,
self._root_targets,
) = _GenerateTargets(
data, target_list, target_dicts, toplevel_dir, frozenset(files), build_files
)
(
self._unqualified_mapping,
self.invalid_targets,
) = _GetUnqualifiedToTargetMapping(
self._name_to_target, self._supplied_target_names_no_all()
)
def _supplied_target_names(self):
return self._additional_compile_target_names | self._test_target_names
@ -616,7 +663,7 @@ class TargetCalculator(object):
def _supplied_target_names_no_all(self):
"""Returns the supplied test targets without 'all'."""
result = self._supplied_target_names()
result.discard('all')
result.discard("all")
return result
def is_build_impacted(self):
@ -631,39 +678,44 @@ class TargetCalculator(object):
# to include the root targets during lookup. If any of the root targets
# match, we remove it and replace it with 'all'.
test_target_names_no_all = set(self._test_target_names)
test_target_names_no_all.discard('all')
test_targets_no_all = _LookupTargets(test_target_names_no_all,
self._unqualified_mapping)
test_target_names_contains_all = 'all' in self._test_target_names
test_target_names_no_all.discard("all")
test_targets_no_all = _LookupTargets(
test_target_names_no_all, self._unqualified_mapping
)
test_target_names_contains_all = "all" in self._test_target_names
if test_target_names_contains_all:
test_targets = [x for x in (set(test_targets_no_all) |
set(self._root_targets))]
test_targets = [
x for x in (set(test_targets_no_all) | set(self._root_targets))
]
else:
test_targets = [x for x in test_targets_no_all]
print('supplied test_targets')
print("supplied test_targets")
for target_name in self._test_target_names:
print('\t', target_name)
print('found test_targets')
print("\t", target_name)
print("found test_targets")
for target in test_targets:
print('\t', target.name)
print('searching for matching test targets')
print("\t", target.name)
print("searching for matching test targets")
matching_test_targets = _GetTargetsDependingOnMatchingTargets(test_targets)
matching_test_targets_contains_all = (test_target_names_contains_all and
set(matching_test_targets) &
set(self._root_targets))
matching_test_targets_contains_all = test_target_names_contains_all and set(
matching_test_targets
) & set(self._root_targets)
if matching_test_targets_contains_all:
# Remove any of the targets for all that were not explicitly supplied,
# 'all' is subsequently added to the matching names below.
matching_test_targets = [x for x in (set(matching_test_targets) &
set(test_targets_no_all))]
print('matched test_targets')
matching_test_targets = [
x for x in (set(matching_test_targets) & set(test_targets_no_all))
]
print("matched test_targets")
for target in matching_test_targets:
print('\t', target.name)
matching_target_names = [gyp.common.ParseQualifiedTarget(target.name)[1]
for target in matching_test_targets]
print("\t", target.name)
matching_target_names = [
gyp.common.ParseQualifiedTarget(target.name)[1]
for target in matching_test_targets
]
if matching_test_targets_contains_all:
matching_target_names.append('all')
print('\tall')
matching_target_names.append("all")
print("\tall")
return matching_target_names
def find_matching_compile_target_names(self):
@ -674,19 +726,22 @@ class TargetCalculator(object):
for target in self._name_to_target.values():
target.visited = False
supplied_targets = _LookupTargets(self._supplied_target_names_no_all(),
self._unqualified_mapping)
if 'all' in self._supplied_target_names():
supplied_targets = [x for x in (set(supplied_targets) |
set(self._root_targets))]
print('Supplied test_targets & compile_targets')
supplied_targets = _LookupTargets(
self._supplied_target_names_no_all(), self._unqualified_mapping
)
if "all" in self._supplied_target_names():
supplied_targets = [
x for x in (set(supplied_targets) | set(self._root_targets))
]
print("Supplied test_targets & compile_targets")
for target in supplied_targets:
print('\t', target.name)
print('Finding compile targets')
compile_targets = _GetCompileTargets(self._changed_targets,
supplied_targets)
return [gyp.common.ParseQualifiedTarget(target.name)[1]
for target in compile_targets]
print("\t", target.name)
print("Finding compile targets")
compile_targets = _GetCompileTargets(self._changed_targets, supplied_targets)
return [
gyp.common.ParseQualifiedTarget(target.name)[1]
for target in compile_targets
]
def GenerateOutput(target_list, target_dicts, data, params):
@ -696,47 +751,58 @@ def GenerateOutput(target_list, target_dicts, data, params):
config.Init(params)
if not config.files:
raise Exception('Must specify files to analyze via config_path generator '
'flag')
raise Exception(
"Must specify files to analyze via config_path generator " "flag"
)
toplevel_dir = _ToGypPath(os.path.abspath(params['options'].toplevel_dir))
toplevel_dir = _ToGypPath(os.path.abspath(params["options"].toplevel_dir))
if debug:
print('toplevel_dir', toplevel_dir)
print("toplevel_dir", toplevel_dir)
if _WasGypIncludeFileModified(params, config.files):
result_dict = { 'status': all_changed_string,
'test_targets': list(config.test_target_names),
'compile_targets': list(
config.additional_compile_target_names |
config.test_target_names) }
result_dict = {
"status": all_changed_string,
"test_targets": list(config.test_target_names),
"compile_targets": list(
config.additional_compile_target_names | config.test_target_names
),
}
_WriteOutput(params, **result_dict)
return
calculator = TargetCalculator(config.files,
calculator = TargetCalculator(
config.files,
config.additional_compile_target_names,
config.test_target_names, data,
target_list, target_dicts, toplevel_dir,
params['build_files'])
config.test_target_names,
data,
target_list,
target_dicts,
toplevel_dir,
params["build_files"],
)
if not calculator.is_build_impacted():
result_dict = { 'status': no_dependency_string,
'test_targets': [],
'compile_targets': [] }
result_dict = {
"status": no_dependency_string,
"test_targets": [],
"compile_targets": [],
}
if calculator.invalid_targets:
result_dict['invalid_targets'] = calculator.invalid_targets
result_dict["invalid_targets"] = calculator.invalid_targets
_WriteOutput(params, **result_dict)
return
test_target_names = calculator.find_matching_test_target_names()
compile_target_names = calculator.find_matching_compile_target_names()
found_at_least_one_target = compile_target_names or test_target_names
result_dict = { 'test_targets': test_target_names,
'status': found_dependency_string if
found_at_least_one_target else no_dependency_string,
'compile_targets': list(
set(compile_target_names) |
set(test_target_names)) }
result_dict = {
"test_targets": test_target_names,
"status": found_dependency_string
if found_at_least_one_target
else no_dependency_string,
"compile_targets": list(set(compile_target_names) | set(test_target_names)),
}
if calculator.invalid_targets:
result_dict['invalid_targets'] = calculator.invalid_targets
result_dict["invalid_targets"] = calculator.invalid_targets
_WriteOutput(params, **result_dict)
except Exception as e:
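The analyzer is driven by a JSON config file supplied through the config_path generator flag (see the Exception raised above when it is missing). A hedged sketch of building such a config, assuming the JSON keys "files", "test_targets", and "additional_compile_targets" correspond to the config.files, config.test_target_names, and config.additional_compile_target_names fields read above (paths and target names are invented):

import json

config = {
    "files": ["foo/foo.cc"],            # changed files, relative to toplevel_dir
    "test_targets": ["foo_unittests"],  # hypothetical
    "additional_compile_targets": ["all"],
}
# The resulting path would be handed to gyp via the config_path generator flag.
with open("analyzer_config.json", "w") as f:
    json.dump(config, f)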
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -16,63 +16,61 @@ generator_wants_sorted_dependencies = False
# Lifted from make.py. The actual values don't matter much.
generator_default_variables = {
'CONFIGURATION_NAME': '$(BUILDTYPE)',
'EXECUTABLE_PREFIX': '',
'EXECUTABLE_SUFFIX': '',
'INTERMEDIATE_DIR': '$(obj).$(TOOLSET)/$(TARGET)/geni',
'PRODUCT_DIR': '$(builddir)',
'RULE_INPUT_DIRNAME': '%(INPUT_DIRNAME)s',
'RULE_INPUT_EXT': '$(suffix $<)',
'RULE_INPUT_NAME': '$(notdir $<)',
'RULE_INPUT_PATH': '$(abspath $<)',
'RULE_INPUT_ROOT': '%(INPUT_ROOT)s',
'SHARED_INTERMEDIATE_DIR': '$(obj)/gen',
'SHARED_LIB_PREFIX': 'lib',
'STATIC_LIB_PREFIX': 'lib',
'STATIC_LIB_SUFFIX': '.a',
"CONFIGURATION_NAME": "$(BUILDTYPE)",
"EXECUTABLE_PREFIX": "",
"EXECUTABLE_SUFFIX": "",
"INTERMEDIATE_DIR": "$(obj).$(TOOLSET)/$(TARGET)/geni",
"PRODUCT_DIR": "$(builddir)",
"RULE_INPUT_DIRNAME": "%(INPUT_DIRNAME)s",
"RULE_INPUT_EXT": "$(suffix $<)",
"RULE_INPUT_NAME": "$(notdir $<)",
"RULE_INPUT_PATH": "$(abspath $<)",
"RULE_INPUT_ROOT": "%(INPUT_ROOT)s",
"SHARED_INTERMEDIATE_DIR": "$(obj)/gen",
"SHARED_LIB_PREFIX": "lib",
"STATIC_LIB_PREFIX": "lib",
"STATIC_LIB_SUFFIX": ".a",
}
def IsMac(params):
return 'mac' == gyp.common.GetFlavor(params)
return "mac" == gyp.common.GetFlavor(params)
def CalculateVariables(default_variables, params):
default_variables.setdefault('OS', gyp.common.GetFlavor(params))
default_variables.setdefault("OS", gyp.common.GetFlavor(params))
def AddCommandsForTarget(cwd, target, params, per_config_commands):
output_dir = params['generator_flags']['output_dir']
for configuration_name, configuration in target['configurations'].items():
builddir_name = os.path.join(output_dir, configuration_name)
output_dir = params["generator_flags"].get("output_dir", "out")
for configuration_name, configuration in target["configurations"].items():
if IsMac(params):
xcode_settings = gyp.xcode_emulation.XcodeSettings(target)
cflags = xcode_settings.GetCflags(configuration_name)
cflags_c = xcode_settings.GetCflagsC(configuration_name)
cflags_cc = xcode_settings.GetCflagsCC(configuration_name)
else:
cflags = configuration.get('cflags', [])
cflags_c = configuration.get('cflags_c', [])
cflags_cc = configuration.get('cflags_cc', [])
cflags = configuration.get("cflags", [])
cflags_c = configuration.get("cflags_c", [])
cflags_cc = configuration.get("cflags_cc", [])
cflags_c = cflags + cflags_c
cflags_cc = cflags + cflags_cc
defines = configuration.get('defines', [])
defines = ['-D' + s for s in defines]
defines = configuration.get("defines", [])
defines = ["-D" + s for s in defines]
# TODO(bnoordhuis) Handle generated source files.
sources = target.get('sources', [])
sources = [s for s in sources if s.endswith('.c') or s.endswith('.cc')]
sources = target.get("sources", [])
sources = [s for s in sources if s.endswith(".c") or s.endswith(".cc")]
def resolve(filename):
return os.path.abspath(os.path.join(cwd, filename))
# TODO(bnoordhuis) Handle generated header files.
include_dirs = configuration.get('include_dirs', [])
include_dirs = [s for s in include_dirs if not s.startswith('$(obj)')]
includes = ['-I' + resolve(s) for s in include_dirs]
include_dirs = configuration.get("include_dirs", [])
include_dirs = [s for s in include_dirs if not s.startswith("$(obj)")]
includes = ["-I" + resolve(s) for s in include_dirs]
defines = gyp.common.EncodePOSIXShellList(defines)
includes = gyp.common.EncodePOSIXShellList(includes)
@ -82,32 +80,39 @@ def AddCommandsForTarget(cwd, target, params, per_config_commands):
commands = per_config_commands.setdefault(configuration_name, [])
for source in sources:
file = resolve(source)
isc = source.endswith('.c')
cc = 'cc' if isc else 'c++'
isc = source.endswith(".c")
cc = "cc" if isc else "c++"
cflags = cflags_c if isc else cflags_cc
command = ' '.join((cc, defines, includes, cflags,
'-c', gyp.common.EncodePOSIXShellArgument(file)))
command = " ".join(
(
cc,
defines,
includes,
cflags,
"-c",
gyp.common.EncodePOSIXShellArgument(file),
)
)
commands.append(dict(command=command, directory=output_dir, file=file))
def GenerateOutput(target_list, target_dicts, data, params):
per_config_commands = {}
for qualified_target, target in target_dicts.items():
build_file, target_name, toolset = (
gyp.common.ParseQualifiedTarget(qualified_target))
build_file, target_name, toolset = gyp.common.ParseQualifiedTarget(
qualified_target
)
if IsMac(params):
settings = data[build_file]
gyp.xcode_emulation.MergeGlobalXcodeSettingsToSpec(settings, target)
cwd = os.path.dirname(build_file)
AddCommandsForTarget(cwd, target, params, per_config_commands)
output_dir = params['generator_flags']['output_dir']
output_dir = params["generator_flags"].get("output_dir", "out")
for configuration_name, commands in per_config_commands.items():
filename = os.path.join(output_dir,
configuration_name,
'compile_commands.json')
filename = os.path.join(output_dir, configuration_name, "compile_commands.json")
gyp.common.EnsureDirExists(filename)
fp = open(filename, 'w')
fp = open(filename, "w")
json.dump(commands, fp=fp, indent=0, check_circular=False)
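Each record appended by AddCommandsForTarget has exactly the three keys clang tooling expects in a compile_commands.json database. A minimal sketch of one such entry (the paths and flags are invented):

import json

entry = {
    "command": "cc -DFOO -I/abs/include -c /abs/src/foo.c",  # hypothetical
    "directory": "out/Debug",
    "file": "/abs/src/foo.c",
}
print(json.dumps([entry]))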
@ -1,77 +1,81 @@
from __future__ import print_function
# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import collections
from __future__ import print_function
import os
import gyp
import gyp.common
import gyp.msvs_emulation
import json
import sys
generator_supports_multiple_toolsets = True
generator_wants_static_library_dependencies_adjusted = False
generator_filelist_paths = {
}
generator_filelist_paths = {}
generator_default_variables = {
}
for dirname in ['INTERMEDIATE_DIR', 'SHARED_INTERMEDIATE_DIR', 'PRODUCT_DIR',
'LIB_DIR', 'SHARED_LIB_DIR']:
generator_default_variables = {}
for dirname in [
"INTERMEDIATE_DIR",
"SHARED_INTERMEDIATE_DIR",
"PRODUCT_DIR",
"LIB_DIR",
"SHARED_LIB_DIR",
]:
# Some gyp steps fail if these are empty(!).
generator_default_variables[dirname] = 'dir'
for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME',
'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT',
'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX',
'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX',
'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX',
'CONFIGURATION_NAME']:
generator_default_variables[unused] = ''
generator_default_variables[dirname] = "dir"
for unused in [
"RULE_INPUT_PATH",
"RULE_INPUT_ROOT",
"RULE_INPUT_NAME",
"RULE_INPUT_DIRNAME",
"RULE_INPUT_EXT",
"EXECUTABLE_PREFIX",
"EXECUTABLE_SUFFIX",
"STATIC_LIB_PREFIX",
"STATIC_LIB_SUFFIX",
"SHARED_LIB_PREFIX",
"SHARED_LIB_SUFFIX",
"CONFIGURATION_NAME",
]:
generator_default_variables[unused] = ""
def CalculateVariables(default_variables, params):
generator_flags = params.get('generator_flags', {})
generator_flags = params.get("generator_flags", {})
for key, val in generator_flags.items():
default_variables.setdefault(key, val)
default_variables.setdefault('OS', gyp.common.GetFlavor(params))
default_variables.setdefault("OS", gyp.common.GetFlavor(params))
flavor = gyp.common.GetFlavor(params)
if flavor =='win':
# Copy additional generator configuration data from VS, which is shared
# by the Windows Ninja generator.
import gyp.generator.msvs as msvs_generator
generator_additional_non_configuration_keys = getattr(msvs_generator,
'generator_additional_non_configuration_keys', [])
generator_additional_path_sections = getattr(msvs_generator,
'generator_additional_path_sections', [])
if flavor == "win":
gyp.msvs_emulation.CalculateCommonVariables(default_variables, params)
def CalculateGeneratorInputInfo(params):
"""Calculate the generator specific info that gets fed to input (called by
gyp)."""
generator_flags = params.get('generator_flags', {})
if generator_flags.get('adjust_static_libraries', False):
generator_flags = params.get("generator_flags", {})
if generator_flags.get("adjust_static_libraries", False):
global generator_wants_static_library_dependencies_adjusted
generator_wants_static_library_dependencies_adjusted = True
toplevel = params['options'].toplevel_dir
generator_dir = os.path.relpath(params['options'].generator_output or '.')
toplevel = params["options"].toplevel_dir
generator_dir = os.path.relpath(params["options"].generator_output or ".")
# output_dir: relative path from generator_dir to the build directory.
output_dir = generator_flags.get('output_dir', 'out')
qualified_out_dir = os.path.normpath(os.path.join(
toplevel, generator_dir, output_dir, 'gypfiles'))
output_dir = generator_flags.get("output_dir", "out")
qualified_out_dir = os.path.normpath(
os.path.join(toplevel, generator_dir, output_dir, "gypfiles")
)
global generator_filelist_paths
generator_filelist_paths = {
'toplevel': toplevel,
'qualified_out_dir': qualified_out_dir,
"toplevel": toplevel,
"qualified_out_dir": qualified_out_dir,
}
def GenerateOutput(target_list, target_dicts, data, params):
# Map of target -> list of targets it depends on.
edges = {}
@ -85,16 +89,16 @@ def GenerateOutput(target_list, target_dicts, data, params):
continue
edges[target] = []
for dep in target_dicts[target].get('dependencies', []):
for dep in target_dicts[target].get("dependencies", []):
edges[target].append(dep)
targets_to_visit.append(dep)
try:
filepath = params['generator_flags']['output_dir']
filepath = params["generator_flags"]["output_dir"]
except KeyError:
filepath = '.'
filename = os.path.join(filepath, 'dump.json')
f = open(filename, 'w')
filepath = "."
filename = os.path.join(filepath, "dump.json")
f = open(filename, "w")
json.dump(edges, f)
f.close()
print('Wrote json to %s.' % filename)
print("Wrote json to %s." % filename)
@ -30,58 +30,61 @@ PY3 = bytes != str
generator_wants_static_library_dependencies_adjusted = False
generator_default_variables = {
}
generator_default_variables = {}
for dirname in ['INTERMEDIATE_DIR', 'PRODUCT_DIR', 'LIB_DIR', 'SHARED_LIB_DIR']:
for dirname in ["INTERMEDIATE_DIR", "PRODUCT_DIR", "LIB_DIR", "SHARED_LIB_DIR"]:
# Some gyp steps fail if these are empty(!), so we convert them to variables
generator_default_variables[dirname] = '$' + dirname
generator_default_variables[dirname] = "$" + dirname
for unused in ['RULE_INPUT_PATH', 'RULE_INPUT_ROOT', 'RULE_INPUT_NAME',
'RULE_INPUT_DIRNAME', 'RULE_INPUT_EXT',
'EXECUTABLE_PREFIX', 'EXECUTABLE_SUFFIX',
'STATIC_LIB_PREFIX', 'STATIC_LIB_SUFFIX',
'SHARED_LIB_PREFIX', 'SHARED_LIB_SUFFIX',
'CONFIGURATION_NAME']:
generator_default_variables[unused] = ''
for unused in [
"RULE_INPUT_PATH",
"RULE_INPUT_ROOT",
"RULE_INPUT_NAME",
"RULE_INPUT_DIRNAME",
"RULE_INPUT_EXT",
"EXECUTABLE_PREFIX",
"EXECUTABLE_SUFFIX",
"STATIC_LIB_PREFIX",
"STATIC_LIB_SUFFIX",
"SHARED_LIB_PREFIX",
"SHARED_LIB_SUFFIX",
"CONFIGURATION_NAME",
]:
generator_default_variables[unused] = ""
# Include dirs will occasionally use the SHARED_INTERMEDIATE_DIR variable as
# part of the path when dealing with generated headers. This value will be
# replaced dynamically for each configuration.
generator_default_variables['SHARED_INTERMEDIATE_DIR'] = \
'$SHARED_INTERMEDIATE_DIR'
generator_default_variables["SHARED_INTERMEDIATE_DIR"] = "$SHARED_INTERMEDIATE_DIR"
def CalculateVariables(default_variables, params):
generator_flags = params.get('generator_flags', {})
generator_flags = params.get("generator_flags", {})
for key, val in generator_flags.items():
default_variables.setdefault(key, val)
flavor = gyp.common.GetFlavor(params)
default_variables.setdefault('OS', flavor)
if flavor == 'win':
# Copy additional generator configuration data from VS, which is shared
# by the Eclipse generator.
import gyp.generator.msvs as msvs_generator
generator_additional_non_configuration_keys = getattr(msvs_generator,
'generator_additional_non_configuration_keys', [])
generator_additional_path_sections = getattr(msvs_generator,
'generator_additional_path_sections', [])
default_variables.setdefault("OS", flavor)
if flavor == "win":
gyp.msvs_emulation.CalculateCommonVariables(default_variables, params)
def CalculateGeneratorInputInfo(params):
"""Calculate the generator specific info that gets fed to input (called by
gyp)."""
generator_flags = params.get('generator_flags', {})
if generator_flags.get('adjust_static_libraries', False):
generator_flags = params.get("generator_flags", {})
if generator_flags.get("adjust_static_libraries", False):
global generator_wants_static_library_dependencies_adjusted
generator_wants_static_library_dependencies_adjusted = True
def GetAllIncludeDirectories(target_list, target_dicts,
shared_intermediate_dirs, config_name, params,
compiler_path):
def GetAllIncludeDirectories(
target_list,
target_dicts,
shared_intermediate_dirs,
config_name,
params,
compiler_path,
):
"""Calculate the set of include directories to be used.
Returns:
@ -95,12 +98,16 @@ def GetAllIncludeDirectories(target_list, target_dicts,
# Find compiler's default include dirs.
if compiler_path:
command = shlex.split(compiler_path)
command.extend(['-E', '-xc++', '-v', '-'])
proc = subprocess.Popen(args=command, stdin=subprocess.PIPE,
stdout=subprocess.PIPE, stderr=subprocess.PIPE)
command.extend(["-E", "-xc++", "-v", "-"])
proc = subprocess.Popen(
args=command,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
)
output = proc.communicate()[1]
if PY3:
output = output.decode('utf-8')
output = output.decode("utf-8")
# Extract the list of include dirs from the output, which has this format:
# ...
# #include "..." search starts here:
@ -111,10 +118,10 @@ def GetAllIncludeDirectories(target_list, target_dicts,
# ...
in_include_list = False
for line in output.splitlines():
if line.startswith('#include'):
if line.startswith("#include"):
in_include_list = True
continue
if line.startswith('End of search list.'):
if line.startswith("End of search list."):
break
if in_include_list:
include_dir = line.strip()
@ -122,39 +129,40 @@ def GetAllIncludeDirectories(target_list, target_dicts,
compiler_includes_list.append(include_dir)
flavor = gyp.common.GetFlavor(params)
if flavor == 'win':
generator_flags = params.get('generator_flags', {})
if flavor == "win":
generator_flags = params.get("generator_flags", {})
for target_name in target_list:
target = target_dicts[target_name]
if config_name in target['configurations']:
config = target['configurations'][config_name]
if config_name in target["configurations"]:
config = target["configurations"][config_name]
# Look for any include dirs that were explicitly added via cflags. This
# may be done in gyp files to force certain includes to come at the end.
# TODO(jgreenwald): Change the gyp files to not abuse cflags for this, and
# remove this.
if flavor == 'win':
if flavor == "win":
msvs_settings = gyp.msvs_emulation.MsvsSettings(target, generator_flags)
cflags = msvs_settings.GetCflags(config_name)
else:
cflags = config['cflags']
cflags = config["cflags"]
for cflag in cflags:
if cflag.startswith('-I'):
if cflag.startswith("-I"):
include_dir = cflag[2:]
if include_dir not in compiler_includes_list:
compiler_includes_list.append(include_dir)
# Find standard gyp include dirs.
if 'include_dirs' in config:
include_dirs = config['include_dirs']
if "include_dirs" in config:
include_dirs = config["include_dirs"]
for shared_intermediate_dir in shared_intermediate_dirs:
for include_dir in include_dirs:
include_dir = include_dir.replace('$SHARED_INTERMEDIATE_DIR',
shared_intermediate_dir)
include_dir = include_dir.replace(
"$SHARED_INTERMEDIATE_DIR", shared_intermediate_dir
)
if not os.path.isabs(include_dir):
base_dir = os.path.dirname(target_name)
include_dir = base_dir + '/' + include_dir
include_dir = base_dir + "/" + include_dir
include_dir = os.path.abspath(include_dir)
gyp_includes_set.add(include_dir)
@ -163,7 +171,7 @@ def GetAllIncludeDirectories(target_list, target_dicts,
all_includes_list = list(gyp_includes_set)
all_includes_list.sort()
for compiler_include in compiler_includes_list:
if not compiler_include in gyp_includes_set:
if compiler_include not in gyp_includes_set:
all_includes_list.append(compiler_include)
# All done.
@ -180,22 +188,21 @@ def GetCompilerPath(target_list, data, options):
"""
# First, see if the compiler is configured in make's settings.
build_file, _, _ = gyp.common.ParseQualifiedTarget(target_list[0])
make_global_settings_dict = data[build_file].get('make_global_settings', {})
make_global_settings_dict = data[build_file].get("make_global_settings", {})
for key, value in make_global_settings_dict:
if key in ['CC', 'CXX']:
if key in ["CC", "CXX"]:
return os.path.join(options.toplevel_dir, value)
# Check to see if the compiler was specified as an environment variable.
for key in ['CC_target', 'CC', 'CXX']:
for key in ["CC_target", "CC", "CXX"]:
compiler = os.environ.get(key)
if compiler:
return compiler
return 'gcc'
return "gcc"
def GetAllDefines(target_list, target_dicts, data, config_name, params,
compiler_path):
def GetAllDefines(target_list, target_dicts, data, config_name, params, compiler_path):
"""Calculate the defines for a project.
Returns:
@ -206,50 +213,51 @@ def GetAllDefines(target_list, target_dicts, data, config_name, params,
# Get defines declared in the gyp files.
all_defines = {}
flavor = gyp.common.GetFlavor(params)
if flavor == 'win':
generator_flags = params.get('generator_flags', {})
if flavor == "win":
generator_flags = params.get("generator_flags", {})
for target_name in target_list:
target = target_dicts[target_name]
if flavor == 'win':
if flavor == "win":
msvs_settings = gyp.msvs_emulation.MsvsSettings(target, generator_flags)
extra_defines = msvs_settings.GetComputedDefines(config_name)
else:
extra_defines = []
if config_name in target['configurations']:
config = target['configurations'][config_name]
target_defines = config['defines']
if config_name in target["configurations"]:
config = target["configurations"][config_name]
target_defines = config["defines"]
else:
target_defines = []
for define in target_defines + extra_defines:
split_define = define.split('=', 1)
split_define = define.split("=", 1)
if len(split_define) == 1:
split_define.append('1')
split_define.append("1")
if split_define[0].strip() in all_defines:
# Already defined
continue
all_defines[split_define[0].strip()] = split_define[1].strip()
# Get default compiler defines (if possible).
if flavor == 'win':
if flavor == "win":
return all_defines # Default defines already processed in the loop above.
if compiler_path:
command = shlex.split(compiler_path)
command.extend(['-E', '-dM', '-'])
cpp_proc = subprocess.Popen(args=command, cwd='.',
stdin=subprocess.PIPE, stdout=subprocess.PIPE)
command.extend(["-E", "-dM", "-"])
cpp_proc = subprocess.Popen(
args=command, cwd=".", stdin=subprocess.PIPE, stdout=subprocess.PIPE
)
cpp_output = cpp_proc.communicate()[0]
if PY3:
cpp_output = cpp_output.decode('utf-8')
cpp_lines = cpp_output.split('\n')
cpp_output = cpp_output.decode("utf-8")
cpp_lines = cpp_output.split("\n")
for cpp_line in cpp_lines:
if not cpp_line.strip():
continue
cpp_line_parts = cpp_line.split(' ', 2)
cpp_line_parts = cpp_line.split(" ", 2)
key = cpp_line_parts[1]
if len(cpp_line_parts) >= 3:
val = cpp_line_parts[2]
else:
val = '1'
val = "1"
all_defines[key] = val
return all_defines
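The loop above treats a bare NAME as NAME=1, splits on the first "=" only, and keeps the first definition of a key. A standalone sketch of just that rule (inputs are invented):

def parse_defines(defines):
    # Mirrors the split("=", 1) handling in GetAllDefines above.
    parsed = {}
    for define in defines:
        split_define = define.split("=", 1)
        if len(split_define) == 1:
            split_define.append("1")  # bare defines default to "1"
        # setdefault keeps the first definition, like the explicit check above.
        parsed.setdefault(split_define[0].strip(), split_define[1].strip())
    return parsed

print(parse_defines(["FOO", "BAR=2", "BAZ=a=b", "FOO=9"]))
# {'FOO': '1', 'BAR': '2', 'BAZ': 'a=b'}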
@ -258,95 +266,125 @@ def GetAllDefines(target_list, target_dicts, data, config_name, params,
def WriteIncludePaths(out, eclipse_langs, include_dirs):
"""Write the includes section of a CDT settings export file."""
out.write(' <section name="org.eclipse.cdt.internal.ui.wizards.' \
'settingswizards.IncludePaths">\n')
out.write(
' <section name="org.eclipse.cdt.internal.ui.wizards.'
'settingswizards.IncludePaths">\n'
)
out.write(' <language name="holder for library settings"></language>\n')
for lang in eclipse_langs:
out.write(' <language name="%s">\n' % lang)
for include_dir in include_dirs:
out.write(' <includepath workspace_path="false">%s</includepath>\n' %
include_dir)
out.write(' </language>\n')
out.write(' </section>\n')
out.write(
' <includepath workspace_path="false">%s</includepath>\n'
% include_dir
)
out.write(" </language>\n")
out.write(" </section>\n")
def WriteMacros(out, eclipse_langs, defines):
"""Write the macros section of a CDT settings export file."""
out.write(' <section name="org.eclipse.cdt.internal.ui.wizards.' \
'settingswizards.Macros">\n')
out.write(
' <section name="org.eclipse.cdt.internal.ui.wizards.'
'settingswizards.Macros">\n'
)
out.write(' <language name="holder for library settings"></language>\n')
for lang in eclipse_langs:
out.write(' <language name="%s">\n' % lang)
for key in sorted(defines):
out.write(' <macro><name>%s</name><value>%s</value></macro>\n' %
(escape(key), escape(defines[key])))
out.write(' </language>\n')
out.write(' </section>\n')
out.write(
" <macro><name>%s</name><value>%s</value></macro>\n"
% (escape(key), escape(defines[key]))
)
out.write(" </language>\n")
out.write(" </section>\n")
def GenerateOutputForConfig(target_list, target_dicts, data, params,
config_name):
options = params['options']
generator_flags = params.get('generator_flags', {})
def GenerateOutputForConfig(target_list, target_dicts, data, params, config_name):
options = params["options"]
generator_flags = params.get("generator_flags", {})
# build_dir: relative path from source root to our output files.
# e.g. "out/Debug"
build_dir = os.path.join(generator_flags.get('output_dir', 'out'),
config_name)
build_dir = os.path.join(generator_flags.get("output_dir", "out"), config_name)
toplevel_build = os.path.join(options.toplevel_dir, build_dir)
# Ninja uses out/Debug/gen while make uses out/Debug/obj/gen as the
# SHARED_INTERMEDIATE_DIR. Include both possible locations.
shared_intermediate_dirs = [os.path.join(toplevel_build, 'obj', 'gen'),
os.path.join(toplevel_build, 'gen')]
shared_intermediate_dirs = [
os.path.join(toplevel_build, "obj", "gen"),
os.path.join(toplevel_build, "gen"),
]
GenerateCdtSettingsFile(target_list,
GenerateCdtSettingsFile(
target_list,
target_dicts,
data,
params,
config_name,
os.path.join(toplevel_build,
'eclipse-cdt-settings.xml'),
os.path.join(toplevel_build, "eclipse-cdt-settings.xml"),
options,
shared_intermediate_dirs)
GenerateClasspathFile(target_list,
shared_intermediate_dirs,
)
GenerateClasspathFile(
target_list,
target_dicts,
options.toplevel_dir,
toplevel_build,
os.path.join(toplevel_build,
'eclipse-classpath.xml'))
os.path.join(toplevel_build, "eclipse-classpath.xml"),
)
def GenerateCdtSettingsFile(target_list, target_dicts, data, params,
config_name, out_name, options,
shared_intermediate_dirs):
gyp.common.EnsureDirExists(out_name)
with open(out_name, 'w') as out:
out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
out.write('<cdtprojectproperties>\n')
eclipse_langs = ['C++ Source File', 'C Source File', 'Assembly Source File',
'GNU C++', 'GNU C', 'Assembly']
compiler_path = GetCompilerPath(target_list, data, options)
include_dirs = GetAllIncludeDirectories(target_list, target_dicts,
def GenerateCdtSettingsFile(
target_list,
target_dicts,
data,
params,
config_name,
out_name,
options,
shared_intermediate_dirs,
config_name, params, compiler_path)
):
gyp.common.EnsureDirExists(out_name)
with open(out_name, "w") as out:
out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
out.write("<cdtprojectproperties>\n")
eclipse_langs = [
"C++ Source File",
"C Source File",
"Assembly Source File",
"GNU C++",
"GNU C",
"Assembly",
]
compiler_path = GetCompilerPath(target_list, data, options)
include_dirs = GetAllIncludeDirectories(
target_list,
target_dicts,
shared_intermediate_dirs,
config_name,
params,
compiler_path,
)
WriteIncludePaths(out, eclipse_langs, include_dirs)
defines = GetAllDefines(target_list, target_dicts, data, config_name,
params, compiler_path)
defines = GetAllDefines(
target_list, target_dicts, data, config_name, params, compiler_path
)
WriteMacros(out, eclipse_langs, defines)
out.write('</cdtprojectproperties>\n')
out.write("</cdtprojectproperties>\n")
def GenerateClasspathFile(target_list, target_dicts, toplevel_dir,
toplevel_build, out_name):
'''Generates a classpath file suitable for symbol navigation and code
def GenerateClasspathFile(
target_list, target_dicts, toplevel_dir, toplevel_build, out_name
):
"""Generates a classpath file suitable for symbol navigation and code
completion of Java code (such as in Android projects) by finding all
.java and .jar files used as action inputs.'''
.java and .jar files used as action inputs."""
gyp.common.EnsureDirExists(out_name)
result = ET.Element('classpath')
result = ET.Element("classpath")
def AddElements(kind, paths):
# First, we need to normalize the paths so they are all relative to the
@ -359,28 +397,28 @@ def GenerateClasspathFile(target_list, target_dicts, toplevel_dir,
rel_paths.add(path)
for path in sorted(rel_paths):
entry_element = ET.SubElement(result, 'classpathentry')
entry_element.set('kind', kind)
entry_element.set('path', path)
entry_element = ET.SubElement(result, "classpathentry")
entry_element.set("kind", kind)
entry_element.set("path", path)
AddElements('lib', GetJavaJars(target_list, target_dicts, toplevel_dir))
AddElements('src', GetJavaSourceDirs(target_list, target_dicts, toplevel_dir))
AddElements("lib", GetJavaJars(target_list, target_dicts, toplevel_dir))
AddElements("src", GetJavaSourceDirs(target_list, target_dicts, toplevel_dir))
# Include the standard JRE container and a dummy out folder
AddElements('con', ['org.eclipse.jdt.launching.JRE_CONTAINER'])
AddElements("con", ["org.eclipse.jdt.launching.JRE_CONTAINER"])
# Include a dummy out folder so that Eclipse doesn't use the default /bin
# folder in the root of the project.
AddElements('output', [os.path.join(toplevel_build, '.eclipse-java-build')])
AddElements("output", [os.path.join(toplevel_build, ".eclipse-java-build")])
ET.ElementTree(result).write(out_name)
def GetJavaJars(target_list, target_dicts, toplevel_dir):
'''Generates a sequence of all .jars used as inputs.'''
"""Generates a sequence of all .jars used as inputs."""
for target_name in target_list:
target = target_dicts[target_name]
for action in target.get('actions', []):
for input_ in action['inputs']:
if os.path.splitext(input_)[1] == '.jar' and not input_.startswith('$'):
for action in target.get("actions", []):
for input_ in action["inputs"]:
if os.path.splitext(input_)[1] == ".jar" and not input_.startswith("$"):
if os.path.isabs(input_):
yield input_
else:
@ -388,22 +426,24 @@ def GetJavaJars(target_list, target_dicts, toplevel_dir):
def GetJavaSourceDirs(target_list, target_dicts, toplevel_dir):
'''Generates a sequence of all likely java package root directories.'''
"""Generates a sequence of all likely java package root directories."""
for target_name in target_list:
target = target_dicts[target_name]
for action in target.get('actions', []):
for input_ in action['inputs']:
if (os.path.splitext(input_)[1] == '.java' and
not input_.startswith('$')):
dir_ = os.path.dirname(os.path.join(os.path.dirname(target_name),
input_))
for action in target.get("actions", []):
for input_ in action["inputs"]:
if os.path.splitext(input_)[1] == ".java" and not input_.startswith(
"$"
):
dir_ = os.path.dirname(
os.path.join(os.path.dirname(target_name), input_)
)
# If there is a parent 'src' or 'java' folder, navigate up to it -
# these are canonical package root names in Chromium. This will
# break if 'src' or 'java' exists in the package structure. This
# could be further improved by inspecting the java file for the
# package name if this proves to be too fragile in practice.
parent_search = dir_
while os.path.basename(parent_search) not in ['src', 'java']:
while os.path.basename(parent_search) not in ["src", "java"]:
parent_search, _ = os.path.split(parent_search)
if not parent_search or parent_search == toplevel_dir:
# Didn't find a known root, just return the original path
@ -416,16 +456,15 @@ def GetJavaSourceDirs(target_list, target_dicts, toplevel_dir):
def GenerateOutput(target_list, target_dicts, data, params):
"""Generate an XML settings file that can be imported into a CDT project."""
if params['options'].generator_output:
if params["options"].generator_output:
raise NotImplementedError("--generator_output not implemented for eclipse")
user_config = params.get('generator_flags', {}).get('config', None)
user_config = params.get("generator_flags", {}).get("config", None)
if user_config:
GenerateOutputForConfig(target_list, target_dicts, data, params,
user_config)
GenerateOutputForConfig(target_list, target_dicts, data, params, user_config)
else:
config_names = target_dicts[target_list[0]]['configurations'].keys()
config_names = target_dicts[target_list[0]]["configurations"]
for config_name in config_names:
GenerateOutputForConfig(target_list, target_dicts, data, params,
config_name)
GenerateOutputForConfig(
target_list, target_dicts, data, params, config_name
)
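GenerateClasspathFile assembles the Eclipse .classpath document with plain ElementTree calls, as AddElements shows. A minimal sketch of the same pattern (the jar path is invented):

import xml.etree.ElementTree as ET

result = ET.Element("classpath")
entry = ET.SubElement(result, "classpathentry")
entry.set("kind", "lib")  # AddElements uses the kinds lib/src/con/output
entry.set("path", "third_party/foo.jar")  # hypothetical path
print(ET.tostring(result).decode())
# <classpath><classpathentry kind="lib" path="third_party/foo.jar" /></classpath>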
@ -32,36 +32,33 @@ to change.
import gyp.common
import errno
import os
import pprint
# These variables should just be spit back out as variable references.
_generator_identity_variables = [
'CONFIGURATION_NAME',
'EXECUTABLE_PREFIX',
'EXECUTABLE_SUFFIX',
'INTERMEDIATE_DIR',
'LIB_DIR',
'PRODUCT_DIR',
'RULE_INPUT_ROOT',
'RULE_INPUT_DIRNAME',
'RULE_INPUT_EXT',
'RULE_INPUT_NAME',
'RULE_INPUT_PATH',
'SHARED_INTERMEDIATE_DIR',
'SHARED_LIB_DIR',
'SHARED_LIB_PREFIX',
'SHARED_LIB_SUFFIX',
'STATIC_LIB_PREFIX',
'STATIC_LIB_SUFFIX',
"CONFIGURATION_NAME",
"EXECUTABLE_PREFIX",
"EXECUTABLE_SUFFIX",
"INTERMEDIATE_DIR",
"LIB_DIR",
"PRODUCT_DIR",
"RULE_INPUT_ROOT",
"RULE_INPUT_DIRNAME",
"RULE_INPUT_EXT",
"RULE_INPUT_NAME",
"RULE_INPUT_PATH",
"SHARED_INTERMEDIATE_DIR",
"SHARED_LIB_DIR",
"SHARED_LIB_PREFIX",
"SHARED_LIB_SUFFIX",
"STATIC_LIB_PREFIX",
"STATIC_LIB_SUFFIX",
]
# gypd doesn't define a default value for OS like many other generator
# modules. Specify "-D OS=whatever" on the command line to provide a value.
generator_default_variables = {
}
generator_default_variables = {}
# gypd supports multiple toolsets
generator_supports_multiple_toolsets = True
@ -71,24 +68,22 @@ generator_supports_multiple_toolsets = True
# module should use < for the early phase and then switch to > for the late
# phase. Bonus points for carrying @ back into the output too.
for v in _generator_identity_variables:
generator_default_variables[v] = '<(%s)' % v
generator_default_variables[v] = "<(%s)" % v
def GenerateOutput(target_list, target_dicts, data, params):
output_files = {}
for qualified_target in target_list:
[input_file, target] = \
gyp.common.ParseQualifiedTarget(qualified_target)[0:2]
[input_file, target] = gyp.common.ParseQualifiedTarget(qualified_target)[0:2]
if input_file[-4:] != '.gyp':
if input_file[-4:] != ".gyp":
continue
input_file_stem = input_file[:-4]
output_file = input_file_stem + params['options'].suffix + '.gypd'
output_file = input_file_stem + params["options"].suffix + ".gypd"
if not output_file in output_files:
output_files[output_file] = input_file
output_files[output_file] = output_files.get(output_file, input_file)
for output_file, input_file in output_files.items():
output = open(output_file, 'w')
output = open(output_file, "w")
pprint.pprint(data[input_file], output)
output.close()
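gypd's output is nothing more than a pprint of each input file's expanded dict, written next to the .gyp file under the naming rule above. A tiny sketch of that rule plus the identity-variable expansion (the suffix value is an assumption; it is whatever was passed on the command line):

input_file = "foo/foo.gyp"
suffix = ""  # params["options"].suffix; assumed empty here
print(input_file[:-4] + suffix + ".gypd")  # foo/foo.gypd

# Identity variables are spat back out as references:
print("<(%s)" % "PRODUCT_DIR")  # <(PRODUCT_DIR)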
@ -21,36 +21,38 @@ import sys
# All of this stuff about generator variables was lovingly ripped from gypd.py.
# That module has a much better description of what's going on and why.
_generator_identity_variables = [
'EXECUTABLE_PREFIX',
'EXECUTABLE_SUFFIX',
'INTERMEDIATE_DIR',
'PRODUCT_DIR',
'RULE_INPUT_ROOT',
'RULE_INPUT_DIRNAME',
'RULE_INPUT_EXT',
'RULE_INPUT_NAME',
'RULE_INPUT_PATH',
'SHARED_INTERMEDIATE_DIR',
"EXECUTABLE_PREFIX",
"EXECUTABLE_SUFFIX",
"INTERMEDIATE_DIR",
"PRODUCT_DIR",
"RULE_INPUT_ROOT",
"RULE_INPUT_DIRNAME",
"RULE_INPUT_EXT",
"RULE_INPUT_NAME",
"RULE_INPUT_PATH",
"SHARED_INTERMEDIATE_DIR",
]
generator_default_variables = {
}
generator_default_variables = {}
for v in _generator_identity_variables:
generator_default_variables[v] = '<(%s)' % v
generator_default_variables[v] = "<(%s)" % v
def GenerateOutput(target_list, target_dicts, data, params):
locals = {
'target_list': target_list,
'target_dicts': target_dicts,
'data': data,
"target_list": target_list,
"target_dicts": target_dicts,
"data": data,
}
# Use a banner that looks like the stock Python one and like what
# code.interact uses by default, but tack on something to indicate what
# locals are available, and identify gypsh.
banner='Python %s on %s\nlocals.keys() = %s\ngypsh' % \
(sys.version, sys.platform, repr(sorted(locals.keys())))
banner = "Python %s on %s\nlocals.keys() = %s\ngypsh" % (
sys.version,
sys.platform,
repr(sorted(locals.keys())),
)
code.interact(banner, local=locals)
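gypsh simply drops into code.interact with the generator inputs bound as locals. A self-contained sketch of the same pattern (the stand-in values are invented; running it opens an interactive prompt):

import code
import sys

local_vars = {"target_list": [], "target_dicts": {}, "data": {}}  # stand-ins
banner = "Python %s on %s\nlocals.keys() = %r\ngypsh" % (
    sys.version,
    sys.platform,
    sorted(local_vars),
)
code.interact(banner, local=local_vars)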
File diff suppressed because it is too large
File diff suppressed because it is too large
@ -15,27 +15,33 @@ except ImportError:
class TestSequenceFunctions(unittest.TestCase):
def setUp(self):
self.stderr = StringIO()
def test_GetLibraries(self):
self.assertEqual(msvs._GetLibraries({}), [])
self.assertEqual(msvs._GetLibraries({"libraries": []}), [])
self.assertEqual(
msvs._GetLibraries({}),
[])
msvs._GetLibraries({"other": "foo", "libraries": ["a.lib"]}), ["a.lib"]
)
self.assertEqual(msvs._GetLibraries({"libraries": ["-la"]}), ["a.lib"])
self.assertEqual(
msvs._GetLibraries({'libraries': []}),
[])
self.assertEqual(
msvs._GetLibraries({'other':'foo', 'libraries': ['a.lib']}),
['a.lib'])
self.assertEqual(
msvs._GetLibraries({'libraries': ['-la']}),
['a.lib'])
self.assertEqual(
msvs._GetLibraries({'libraries': ['a.lib', 'b.lib', 'c.lib', '-lb.lib',
'-lb.lib', 'd.lib', 'a.lib']}),
['c.lib', 'b.lib', 'd.lib', 'a.lib'])
msvs._GetLibraries(
{
"libraries": [
"a.lib",
"b.lib",
"c.lib",
"-lb.lib",
"-lb.lib",
"d.lib",
"a.lib",
]
}
),
["c.lib", "b.lib", "d.lib", "a.lib"],
)
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()
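Taken together, the assertions pin down _GetLibraries' semantics: "-l" flags are normalized to ".lib" names, and when a library repeats only its last occurrence survives. A rough re-implementation of just those semantics, for illustration (this is not the code under test):

def get_libraries_last_wins(libraries):
    # Normalize "-lfoo" / "-lfoo.lib" to "foo.lib".
    names = []
    for lib in libraries:
        if lib.startswith("-l"):
            lib = lib[2:]
            if not lib.endswith(".lib"):
                lib += ".lib"
        names.append(lib)
    # Keep only the last occurrence of each name, in order.
    return [n for i, n in enumerate(names) if n not in names[i + 1:]]

print(get_libraries_last_wins(
    ["a.lib", "b.lib", "c.lib", "-lb.lib", "-lb.lib", "d.lib", "a.lib"]))
# ['c.lib', 'b.lib', 'd.lib', 'a.lib']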
File diff suppressed because it is too large
@ -16,31 +16,40 @@ class TestPrefixesAndSuffixes(unittest.TestCase):
def test_BinaryNamesWindows(self):
# These cannot run on non-Windows as they require a VS installation to
# correctly handle variable expansion.
if sys.platform.startswith('win'):
writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'build.ninja', '.',
'build.ninja', 'win')
spec = { 'target_name': 'wee' }
self.assertTrue(writer.ComputeOutputFileName(spec, 'executable').
endswith('.exe'))
self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
endswith('.dll'))
self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
endswith('.lib'))
if sys.platform.startswith("win"):
writer = ninja.NinjaWriter(
"foo", "wee", ".", ".", "build.ninja", ".", "build.ninja", "win"
)
spec = {"target_name": "wee"}
self.assertTrue(
writer.ComputeOutputFileName(spec, "executable").endswith(".exe")
)
self.assertTrue(
writer.ComputeOutputFileName(spec, "shared_library").endswith(".dll")
)
self.assertTrue(
writer.ComputeOutputFileName(spec, "static_library").endswith(".lib")
)
def test_BinaryNamesLinux(self):
writer = ninja.NinjaWriter('foo', 'wee', '.', '.', 'build.ninja', '.',
'build.ninja', 'linux')
spec = { 'target_name': 'wee' }
self.assertTrue('.' not in writer.ComputeOutputFileName(spec,
'executable'))
self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
startswith('lib'))
self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
startswith('lib'))
self.assertTrue(writer.ComputeOutputFileName(spec, 'shared_library').
endswith('.so'))
self.assertTrue(writer.ComputeOutputFileName(spec, 'static_library').
endswith('.a'))
writer = ninja.NinjaWriter(
"foo", "wee", ".", ".", "build.ninja", ".", "build.ninja", "linux"
)
spec = {"target_name": "wee"}
self.assertTrue("." not in writer.ComputeOutputFileName(spec, "executable"))
self.assertTrue(
writer.ComputeOutputFileName(spec, "shared_library").startswith("lib")
)
self.assertTrue(
writer.ComputeOutputFileName(spec, "static_library").startswith("lib")
)
self.assertTrue(
writer.ComputeOutputFileName(spec, "shared_library").endswith(".so")
)
self.assertTrue(
writer.ComputeOutputFileName(spec, "static_library").endswith(".a")
)
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()
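The assertions fix the naming convention per platform: ".exe"/".dll"/".lib" on Windows, and "lib"-prefixed ".so"/".a" with extensionless executables on Linux. A standalone sketch of that mapping (a simplification, not NinjaWriter.ComputeOutputFileName itself):

def output_file_name(target_name, target_type, flavor):
    if flavor == "win":
        suffix = {"executable": ".exe",
                  "shared_library": ".dll",
                  "static_library": ".lib"}[target_type]
        return target_name + suffix
    if target_type == "executable":
        return target_name  # no extension on Linux
    suffix = {"shared_library": ".so", "static_library": ".a"}[target_type]
    return "lib" + target_name + suffix

print(output_file_name("wee", "shared_library", "linux"))  # libwee.so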
File diff suppressed because it is too large
@ -12,12 +12,14 @@ import sys
class TestEscapeXcodeDefine(unittest.TestCase):
if sys.platform == 'darwin':
if sys.platform == "darwin":
def test_InheritedRemainsUnescaped(self):
self.assertEqual(xcode.EscapeXcodeDefine('$(inherited)'), '$(inherited)')
self.assertEqual(xcode.EscapeXcodeDefine("$(inherited)"), "$(inherited)")
def test_Escaping(self):
self.assertEqual(xcode.EscapeXcodeDefine('a b"c\\'), 'a\\ b\\"c\\\\')
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()
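The two cases pin down EscapeXcodeDefine: "$(inherited)" passes through untouched, while backslashes, spaces, and double quotes get backslash-escaped. A hedged re-implementation covering only the asserted cases (the real function may escape more characters):

import re

def escape_xcode_define_sketch(value):
    if value == "$(inherited)":
        return value
    return re.sub(r'([\\ "])', r"\\\1", value)

print(escape_xcode_define_sketch('a b"c\\'))  # a\ b\"c\\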
File diff suppressed because it is too large
@ -8,13 +8,12 @@
import gyp.input
import unittest
import sys
class TestFindCycles(unittest.TestCase):
def setUp(self):
self.nodes = {}
for x in ('a', 'b', 'c', 'd', 'e'):
for x in ("a", "b", "c", "d", "e"):
self.nodes[x] = gyp.input.DependencyGraphNode(x)
def _create_dependency(self, dependent, dependency):
@ -26,65 +25,74 @@ class TestFindCycles(unittest.TestCase):
self.assertEqual([], node.FindCycles())
def test_no_cycle_line(self):
self._create_dependency(self.nodes['a'], self.nodes['b'])
self._create_dependency(self.nodes['b'], self.nodes['c'])
self._create_dependency(self.nodes['c'], self.nodes['d'])
self._create_dependency(self.nodes["a"], self.nodes["b"])
self._create_dependency(self.nodes["b"], self.nodes["c"])
self._create_dependency(self.nodes["c"], self.nodes["d"])
for label, node in self.nodes.items():
self.assertEqual([], node.FindCycles())
def test_no_cycle_dag(self):
self._create_dependency(self.nodes['a'], self.nodes['b'])
self._create_dependency(self.nodes['a'], self.nodes['c'])
self._create_dependency(self.nodes['b'], self.nodes['c'])
self._create_dependency(self.nodes["a"], self.nodes["b"])
self._create_dependency(self.nodes["a"], self.nodes["c"])
self._create_dependency(self.nodes["b"], self.nodes["c"])
for label, node in self.nodes.items():
self.assertEqual([], node.FindCycles())
def test_cycle_self_reference(self):
self._create_dependency(self.nodes['a'], self.nodes['a'])
self._create_dependency(self.nodes["a"], self.nodes["a"])
self.assertEqual([[self.nodes['a'], self.nodes['a']]],
self.nodes['a'].FindCycles())
self.assertEqual(
[[self.nodes["a"], self.nodes["a"]]], self.nodes["a"].FindCycles()
)
def test_cycle_two_nodes(self):
self._create_dependency(self.nodes['a'], self.nodes['b'])
self._create_dependency(self.nodes['b'], self.nodes['a'])
self._create_dependency(self.nodes["a"], self.nodes["b"])
self._create_dependency(self.nodes["b"], self.nodes["a"])
self.assertEqual([[self.nodes['a'], self.nodes['b'], self.nodes['a']]],
self.nodes['a'].FindCycles())
self.assertEqual([[self.nodes['b'], self.nodes['a'], self.nodes['b']]],
self.nodes['b'].FindCycles())
self.assertEqual(
[[self.nodes["a"], self.nodes["b"], self.nodes["a"]]],
self.nodes["a"].FindCycles(),
)
self.assertEqual(
[[self.nodes["b"], self.nodes["a"], self.nodes["b"]]],
self.nodes["b"].FindCycles(),
)
def test_two_cycles(self):
self._create_dependency(self.nodes['a'], self.nodes['b'])
self._create_dependency(self.nodes['b'], self.nodes['a'])
self._create_dependency(self.nodes["a"], self.nodes["b"])
self._create_dependency(self.nodes["b"], self.nodes["a"])
self._create_dependency(self.nodes['b'], self.nodes['c'])
self._create_dependency(self.nodes['c'], self.nodes['b'])
self._create_dependency(self.nodes["b"], self.nodes["c"])
self._create_dependency(self.nodes["c"], self.nodes["b"])
cycles = self.nodes['a'].FindCycles()
self.assertTrue(
[self.nodes['a'], self.nodes['b'], self.nodes['a']] in cycles)
self.assertTrue(
[self.nodes['b'], self.nodes['c'], self.nodes['b']] in cycles)
cycles = self.nodes["a"].FindCycles()
self.assertTrue([self.nodes["a"], self.nodes["b"], self.nodes["a"]] in cycles)
self.assertTrue([self.nodes["b"], self.nodes["c"], self.nodes["b"]] in cycles)
self.assertEqual(2, len(cycles))
def test_big_cycle(self):
self._create_dependency(self.nodes['a'], self.nodes['b'])
self._create_dependency(self.nodes['b'], self.nodes['c'])
self._create_dependency(self.nodes['c'], self.nodes['d'])
self._create_dependency(self.nodes['d'], self.nodes['e'])
self._create_dependency(self.nodes['e'], self.nodes['a'])
self._create_dependency(self.nodes["a"], self.nodes["b"])
self._create_dependency(self.nodes["b"], self.nodes["c"])
self._create_dependency(self.nodes["c"], self.nodes["d"])
self._create_dependency(self.nodes["d"], self.nodes["e"])
self._create_dependency(self.nodes["e"], self.nodes["a"])
self.assertEqual([[self.nodes['a'],
self.nodes['b'],
self.nodes['c'],
self.nodes['d'],
self.nodes['e'],
self.nodes['a']]],
self.nodes['a'].FindCycles())
self.assertEqual(
[
[
self.nodes["a"],
self.nodes["b"],
self.nodes["c"],
self.nodes["d"],
self.nodes["e"],
self.nodes["a"],
]
],
self.nodes["a"].FindCycles(),
)
if __name__ == '__main__':
if __name__ == "__main__":
unittest.main()
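These tests pin down the FindCycles contract: every cycle reachable from a node is reported as a path that starts and ends on the repeated node. A minimal sketch of that contract, assuming a simplified stand-in for gyp.input.DependencyGraphNode (not the library's actual implementation):

    class Node(object):
        """Tiny stand-in for DependencyGraphNode: a ref plus outgoing edges."""
        def __init__(self, ref):
            self.ref = ref
            self.dependencies = []

        def FindCycles(self):
            cycles = []
            def visit(node, path):
                for dep in node.dependencies:
                    if dep in path:
                        # Close the loop from dep's first occurrence on the path.
                        cycle = path[path.index(dep):] + [dep]
                        if cycle not in cycles:
                            cycles.append(cycle)
                    else:
                        visit(dep, path + [dep])
            visit(self, [self])
            return cycles

    a, b = Node("a"), Node("b")
    a.dependencies.append(b)
    b.dependencies.append(a)
    assert a.FindCycles() == [[a, b, a]]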

gyp/pylib/gyp/mac_tool.py
View file

@ -18,7 +18,6 @@ import os
import plistlib
import re
import shutil
import string
import struct
import subprocess
import sys
@ -48,12 +47,12 @@ class MacTool(object):
def _CommandifyName(self, name_string):
"""Transforms a tool name like copy-info-plist to CopyInfoPlist"""
return name_string.title().replace('-', '')
return name_string.title().replace("-", "")
def ExecCopyBundleResource(self, source, dest, convert_to_binary):
"""Copies a resource file to the bundle/Resources directory, performing any
necessary compilation on each resource."""
convert_to_binary = convert_to_binary == 'True'
convert_to_binary = convert_to_binary == "True"
extension = os.path.splitext(source)[1].lower()
if os.path.isdir(source):
# Copy tree.
@ -63,18 +62,18 @@ class MacTool(object):
if os.path.exists(dest):
shutil.rmtree(dest)
shutil.copytree(source, dest)
elif extension == '.xib':
elif extension == ".xib":
return self._CopyXIBFile(source, dest)
elif extension == '.storyboard':
elif extension == ".storyboard":
return self._CopyXIBFile(source, dest)
elif extension == '.strings' and not convert_to_binary:
elif extension == ".strings" and not convert_to_binary:
self._CopyStringsFile(source, dest)
else:
if os.path.exists(dest):
os.unlink(dest)
shutil.copy(source, dest)
if convert_to_binary and extension in ('.plist', '.strings'):
if convert_to_binary and extension in (".plist", ".strings"):
self._ConvertToBinary(dest)
def _CopyXIBFile(self, source, dest):
@ -87,28 +86,37 @@ class MacTool(object):
if os.path.relpath(dest):
dest = os.path.join(base, dest)
args = ['xcrun', 'ibtool', '--errors', '--warnings', '--notices']
args = ["xcrun", "ibtool", "--errors", "--warnings", "--notices"]
if os.environ['XCODE_VERSION_ACTUAL'] > '0700':
args.extend(['--auto-activate-custom-fonts'])
if 'IPHONEOS_DEPLOYMENT_TARGET' in os.environ:
args.extend([
'--target-device', 'iphone', '--target-device', 'ipad',
'--minimum-deployment-target',
os.environ['IPHONEOS_DEPLOYMENT_TARGET'],
])
if os.environ["XCODE_VERSION_ACTUAL"] > "0700":
args.extend(["--auto-activate-custom-fonts"])
if "IPHONEOS_DEPLOYMENT_TARGET" in os.environ:
args.extend(
[
"--target-device",
"iphone",
"--target-device",
"ipad",
"--minimum-deployment-target",
os.environ["IPHONEOS_DEPLOYMENT_TARGET"],
]
)
else:
args.extend([
'--target-device', 'mac',
'--minimum-deployment-target',
os.environ['MACOSX_DEPLOYMENT_TARGET'],
])
args.extend(
[
"--target-device",
"mac",
"--minimum-deployment-target",
os.environ["MACOSX_DEPLOYMENT_TARGET"],
]
)
args.extend(['--output-format', 'human-readable-text', '--compile', dest,
source])
args.extend(
["--output-format", "human-readable-text", "--compile", dest, source]
)
ibtool_section_re = re.compile(r'/\*.*\*/')
ibtool_re = re.compile(r'.*note:.*is clipping its content')
ibtool_section_re = re.compile(r"/\*.*\*/")
ibtool_re = re.compile(r".*note:.*is clipping its content")
try:
stdout = subprocess.check_output(args)
except subprocess.CalledProcessError as e:
@ -126,8 +134,9 @@ class MacTool(object):
return 0
def _ConvertToBinary(self, dest):
subprocess.check_call([
'xcrun', 'plutil', '-convert', 'binary1', '-o', dest, dest])
subprocess.check_call(
["xcrun", "plutil", "-convert", "binary1", "-o", dest, dest]
)
def _CopyStringsFile(self, source, dest):
"""Copies a .strings file using iconv to reconvert the input into UTF-16."""
@ -139,27 +148,30 @@ class MacTool(object):
# semicolon in dictionary.
# on invalid files. Do the same kind of validation.
import CoreFoundation
with open(source, 'rb') as in_file:
with open(source, "rb") as in_file:
s = in_file.read()
d = CoreFoundation.CFDataCreate(None, s, len(s))
_, error = CoreFoundation.CFPropertyListCreateFromXMLData(None, d, 0, None)
if error:
return
with open(dest, 'wb') as fp:
fp.write(s.decode(input_code).encode('UTF-16'))
with open(dest, "wb") as fp:
fp.write(s.decode(input_code).encode("UTF-16"))
def _DetectInputEncoding(self, file_name):
"""Reads the first few bytes from file_name and tries to guess the text
encoding. Returns None as a guess if it can't detect it."""
with open(file_name, 'rb') as fp:
with open(file_name, "rb") as fp:
try:
header = fp.read(3)
except Exception:
return None
if header.startswith(("\xFE\xFF", "\xFF\xFE")):
if header.startswith(b"\xFE\xFF"):
return "UTF-16"
elif header.startswith("\xEF\xBB\xBF"):
elif header.startswith(b"\xFF\xFE"):
return "UTF-16"
elif header.startswith(b"\xEF\xBB\xBF"):
return "UTF-8"
else:
return None
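The b"..." prefixes are the point of this hunk: the file is opened in binary mode, and under Python 3 bytes.startswith() raises TypeError when handed a str prefix. A quick illustration using the standard byte-order marks (the codecs constants are stdlib, not part of mac_tool):

    import codecs

    assert codecs.BOM_UTF16_BE == b"\xFE\xFF"
    assert codecs.BOM_UTF8 == b"\xEF\xBB\xBF"

    header = codecs.BOM_UTF16_LE + b"rest of file"  # illustrative .strings prefix
    assert header.startswith((codecs.BOM_UTF16_BE, codecs.BOM_UTF16_LE))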
@ -167,24 +179,24 @@ class MacTool(object):
def ExecCopyInfoPlist(self, source, dest, convert_to_binary, *keys):
"""Copies the |source| Info.plist to the destination directory |dest|."""
# Read the source Info.plist into memory.
with open(source, 'r') as fd:
with open(source, "r") as fd:
lines = fd.read()
# Insert synthesized key/value pairs (e.g. BuildMachineOSBuild).
plist = plistlib.readPlistFromString(lines)
if keys:
plist = dict(plist.items() + json.loads(keys[0]).items())
plist.update(json.loads(keys[0]))
lines = plistlib.writePlistToString(plist)
# Go through all the environment variables and replace them as variables in
# the file.
IDENT_RE = re.compile(r'[_/\s]')
IDENT_RE = re.compile(r"[_/\s]")
for key in os.environ:
if key.startswith('_'):
if key.startswith("_"):
continue
evar = '${%s}' % key
evar = "${%s}" % key
evalue = os.environ[key]
lines = string.replace(lines, evar, evalue)
lines = lines.replace(evar, evalue)
# Xcode supports various suffixes on environment variables, which are
# all undocumented. :rfc1034identifier is used in the standard project
@ -192,13 +204,13 @@ class MacTool(object):
# convert non-url characters into things that look like valid urls --
# except that the replacement character for :identifier, '_' isn't valid
# in a URL either -- oops, hence :rfc1034identifier was born.
evar = '${%s:identifier}' % key
evalue = IDENT_RE.sub('_', os.environ[key])
lines = string.replace(lines, evar, evalue)
evar = "${%s:identifier}" % key
evalue = IDENT_RE.sub("_", os.environ[key])
lines = lines.replace(evar, evalue)
evar = '${%s:rfc1034identifier}' % key
evalue = IDENT_RE.sub('-', os.environ[key])
lines = string.replace(lines, evar, evalue)
evar = "${%s:rfc1034identifier}" % key
evalue = IDENT_RE.sub("-", os.environ[key])
lines = lines.replace(evar, evalue)
# Remove any keys with values that haven't been replaced.
lines = lines.splitlines()
@ -206,17 +218,17 @@ class MacTool(object):
if lines[i].strip().startswith("<string>${"):
lines[i] = None
lines[i - 1] = None
lines = '\n'.join(line for line in lines if line is not None)
lines = "\n".join(line for line in lines if line is not None)
# Write out the file with variables replaced.
with open(dest, 'w') as fd:
with open(dest, "w") as fd:
fd.write(lines)
# Now write out PkgInfo file now that the Info.plist file has been
# "compiled".
self._WritePkgInfo(dest)
if convert_to_binary == 'True':
if convert_to_binary == "True":
self._ConvertToBinary(dest)
def _WritePkgInfo(self, info_plist):
@ -226,20 +238,20 @@ class MacTool(object):
return
# Only create PkgInfo for executable types.
package_type = plist['CFBundlePackageType']
if package_type != 'APPL':
package_type = plist["CFBundlePackageType"]
if package_type != "APPL":
return
# The format of PkgInfo is eight characters, representing the bundle type
# and bundle signature, each four characters. If that is missing, four
# '?' characters are used instead.
signature_code = plist.get('CFBundleSignature', '????')
signature_code = plist.get("CFBundleSignature", "????")
if len(signature_code) != 4: # Wrong length resets everything, too.
signature_code = '?' * 4
signature_code = "?" * 4
dest = os.path.join(os.path.dirname(info_plist), 'PkgInfo')
with open(dest, 'w') as fp:
fp.write('%s%s' % (package_type, signature_code))
dest = os.path.join(os.path.dirname(info_plist), "PkgInfo")
with open(dest, "w") as fp:
fp.write("%s%s" % (package_type, signature_code))
def ExecFlock(self, lockfile, *cmd_list):
"""Emulates the most basic behavior of Linux's flock(1)."""
@ -251,22 +263,24 @@ class MacTool(object):
def ExecFilterLibtool(self, *cmd_list):
"""Calls libtool and filters out '/path/to/libtool: file: foo.o has no
symbols'."""
libtool_re = re.compile(r'^.*libtool: (?:for architecture: \S* )?'
r'file: .* has no symbols$')
libtool_re = re.compile(
r"^.*libtool: (?:for architecture: \S* )?" r"file: .* has no symbols$"
)
libtool_re5 = re.compile(
r'^.*libtool: warning for library: ' +
r'.* the table of contents is empty ' +
r'\(no object file members in the library define global symbols\)$')
r"^.*libtool: warning for library: "
+ r".* the table of contents is empty "
+ r"\(no object file members in the library define global symbols\)$"
)
env = os.environ.copy()
# Ref:
# http://www.opensource.apple.com/source/cctools/cctools-809/misc/libtool.c
# The problem with this flag is that it resets the file mtime on the file to
# epoch=0, e.g. 1970-1-1 or 1969-12-31 depending on timezone.
env['ZERO_AR_DATE'] = '1'
env["ZERO_AR_DATE"] = "1"
libtoolout = subprocess.Popen(cmd_list, stderr=subprocess.PIPE, env=env)
_, err = libtoolout.communicate()
if PY3:
err = err.decode('utf-8')
err = err.decode("utf-8")
for line in err.splitlines():
if not libtool_re.match(line) and not libtool_re5.match(line):
print(line, file=sys.stderr)
@ -274,36 +288,38 @@ class MacTool(object):
# and the command succeeded. A bit hacky.
if not libtoolout.returncode:
for i in range(len(cmd_list) - 1):
if cmd_list[i] == "-o" and cmd_list[i+1].endswith('.a'):
if cmd_list[i] == "-o" and cmd_list[i + 1].endswith(".a"):
os.utime(cmd_list[i + 1], None)
break
return libtoolout.returncode
def ExecPackageIosFramework(self, framework):
# Find the name of the binary based on the part before the ".framework".
binary = os.path.basename(framework).split('.')[0]
module_path = os.path.join(framework, 'Modules')
binary = os.path.basename(framework).split(".")[0]
module_path = os.path.join(framework, "Modules")
if not os.path.exists(module_path):
os.mkdir(module_path)
module_template = 'framework module %s {\n' \
' umbrella header "%s.h"\n' \
'\n' \
' export *\n' \
' module * { export * }\n' \
'}\n' % (binary, binary)
module_template = (
"framework module %s {\n"
' umbrella header "%s.h"\n'
"\n"
" export *\n"
" module * { export * }\n"
"}\n" % (binary, binary)
)
with open(os.path.join(module_path, 'module.modulemap'), "w") as module_file:
with open(os.path.join(module_path, "module.modulemap"), "w") as module_file:
module_file.write(module_template)
def ExecPackageFramework(self, framework, version):
"""Takes a path to Something.framework and the Current version of that and
sets up all the symlinks."""
# Find the name of the binary based on the part before the ".framework".
binary = os.path.basename(framework).split('.')[0]
binary = os.path.basename(framework).split(".")[0]
CURRENT = 'Current'
RESOURCES = 'Resources'
VERSIONS = 'Versions'
CURRENT = "Current"
RESOURCES = "Resources"
VERSIONS = "Versions"
if not os.path.exists(os.path.join(framework, VERSIONS, version, binary)):
# Binary-less frameworks don't seem to contain symlinks (see e.g.
@ -332,7 +348,7 @@ class MacTool(object):
os.symlink(dest, link)
def ExecCompileIosFrameworkHeaderMap(self, out, framework, *all_headers):
framework_name = os.path.basename(framework).split('.')[0]
framework_name = os.path.basename(framework).split(".")[0]
all_headers = [os.path.abspath(header) for header in all_headers]
filelist = {}
for header in all_headers:
@ -342,7 +358,7 @@ class MacTool(object):
WriteHmap(out, filelist)
def ExecCopyIosFrameworkHeaders(self, framework, *copy_headers):
header_path = os.path.join(framework, 'Headers')
header_path = os.path.join(framework, "Headers")
if not os.path.exists(header_path):
os.makedirs(header_path)
for header in copy_headers:
@ -360,31 +376,51 @@ class MacTool(object):
catalogs do not contain an imageset.
"""
command_line = [
'xcrun', 'actool', '--output-format', 'human-readable-text',
'--compress-pngs', '--notices', '--warnings', '--errors',
"xcrun",
"actool",
"--output-format",
"human-readable-text",
"--compress-pngs",
"--notices",
"--warnings",
"--errors",
]
is_iphone_target = 'IPHONEOS_DEPLOYMENT_TARGET' in os.environ
is_iphone_target = "IPHONEOS_DEPLOYMENT_TARGET" in os.environ
if is_iphone_target:
platform = os.environ['CONFIGURATION'].split('-')[-1]
if platform not in ('iphoneos', 'iphonesimulator'):
platform = 'iphonesimulator'
command_line.extend([
'--platform', platform, '--target-device', 'iphone',
'--target-device', 'ipad', '--minimum-deployment-target',
os.environ['IPHONEOS_DEPLOYMENT_TARGET'], '--compile',
os.path.abspath(os.environ['CONTENTS_FOLDER_PATH']),
])
platform = os.environ["CONFIGURATION"].split("-")[-1]
if platform not in ("iphoneos", "iphonesimulator"):
platform = "iphonesimulator"
command_line.extend(
[
"--platform",
platform,
"--target-device",
"iphone",
"--target-device",
"ipad",
"--minimum-deployment-target",
os.environ["IPHONEOS_DEPLOYMENT_TARGET"],
"--compile",
os.path.abspath(os.environ["CONTENTS_FOLDER_PATH"]),
]
)
else:
command_line.extend([
'--platform', 'macosx', '--target-device', 'mac',
'--minimum-deployment-target', os.environ['MACOSX_DEPLOYMENT_TARGET'],
'--compile',
os.path.abspath(os.environ['UNLOCALIZED_RESOURCES_FOLDER_PATH']),
])
command_line.extend(
[
"--platform",
"macosx",
"--target-device",
"mac",
"--minimum-deployment-target",
os.environ["MACOSX_DEPLOYMENT_TARGET"],
"--compile",
os.path.abspath(os.environ["UNLOCALIZED_RESOURCES_FOLDER_PATH"]),
]
)
if keys:
keys = json.loads(keys)
for key, value in keys.items():
arg_name = '--' + key
arg_name = "--" + key
if isinstance(value, bool):
if value:
command_line.append(arg_name)
@ -419,16 +455,18 @@ class MacTool(object):
3. code sign the bundle.
"""
substitutions, overrides = self._InstallProvisioningProfile(
provisioning, self._GetCFBundleIdentifier())
provisioning, self._GetCFBundleIdentifier()
)
entitlements_path = self._InstallEntitlements(
entitlements, substitutions, overrides)
entitlements, substitutions, overrides
)
args = ['codesign', '--force', '--sign', key]
if preserve == 'True':
args.extend(['--deep', '--preserve-metadata=identifier,entitlements'])
args = ["codesign", "--force", "--sign", key]
if preserve == "True":
args.extend(["--deep", "--preserve-metadata=identifier,entitlements"])
else:
args.extend(['--entitlements', entitlements_path])
args.extend(['--timestamp=none', path])
args.extend(["--entitlements", entitlements_path])
args.extend(["--timestamp=none", path])
subprocess.check_call(args)
def _InstallProvisioningProfile(self, profile, bundle_identifier):
@ -445,14 +483,16 @@ class MacTool(object):
to overrides when generating the entitlements file.
"""
source_path, provisioning_data, team_id = self._FindProvisioningProfile(
profile, bundle_identifier)
profile, bundle_identifier
)
target_path = os.path.join(
os.environ['BUILT_PRODUCTS_DIR'],
os.environ['CONTENTS_FOLDER_PATH'],
'embedded.mobileprovision')
os.environ["BUILT_PRODUCTS_DIR"],
os.environ["CONTENTS_FOLDER_PATH"],
"embedded.mobileprovision",
)
shutil.copy2(source_path, target_path)
substitutions = self._GetSubstitutions(bundle_identifier, team_id + '.')
return substitutions, provisioning_data['Entitlements']
substitutions = self._GetSubstitutions(bundle_identifier, team_id + ".")
return substitutions, provisioning_data["Entitlements"]
def _FindProvisioningProfile(self, profile, bundle_identifier):
"""Finds the .mobileprovision file to use for signing the bundle.
@ -476,30 +516,42 @@ class MacTool(object):
SystemExit: if no .mobileprovision can be used to sign the bundle.
"""
profiles_dir = os.path.join(
os.environ['HOME'], 'Library', 'MobileDevice', 'Provisioning Profiles')
os.environ["HOME"], "Library", "MobileDevice", "Provisioning Profiles"
)
if not os.path.isdir(profiles_dir):
print('cannot find mobile provisioning for %s' % (bundle_identifier), file=sys.stderr)
print(
"cannot find mobile provisioning for %s" % (bundle_identifier),
file=sys.stderr,
)
sys.exit(1)
provisioning_profiles = None
if profile:
profile_path = os.path.join(profiles_dir, profile + '.mobileprovision')
profile_path = os.path.join(profiles_dir, profile + ".mobileprovision")
if os.path.exists(profile_path):
provisioning_profiles = [profile_path]
if not provisioning_profiles:
provisioning_profiles = glob.glob(
os.path.join(profiles_dir, '*.mobileprovision'))
os.path.join(profiles_dir, "*.mobileprovision")
)
valid_provisioning_profiles = {}
for profile_path in provisioning_profiles:
profile_data = self._LoadProvisioningProfile(profile_path)
app_id_pattern = profile_data.get(
'Entitlements', {}).get('application-identifier', '')
for team_identifier in profile_data.get('TeamIdentifier', []):
app_id = '%s.%s' % (team_identifier, bundle_identifier)
app_id_pattern = profile_data.get("Entitlements", {}).get(
"application-identifier", ""
)
for team_identifier in profile_data.get("TeamIdentifier", []):
app_id = "%s.%s" % (team_identifier, bundle_identifier)
if fnmatch.fnmatch(app_id, app_id_pattern):
valid_provisioning_profiles[app_id_pattern] = (
profile_path, profile_data, team_identifier)
profile_path,
profile_data,
team_identifier,
)
if not valid_provisioning_profiles:
print('cannot find mobile provisioning for %s' % (bundle_identifier), file=sys.stderr)
print(
"cannot find mobile provisioning for %s" % (bundle_identifier),
file=sys.stderr,
)
sys.exit(1)
# If the user has multiple provisioning profiles installed that can be
# used for ${bundle_identifier}, pick the most specific one (ie. the
@ -517,8 +569,9 @@ class MacTool(object):
Content of the plist embedded in the provisioning profile as a dictionary.
"""
with tempfile.NamedTemporaryFile() as temp:
subprocess.check_call([
'security', 'cms', '-D', '-i', profile_path, '-o', temp.name])
subprocess.check_call(
["security", "cms", "-D", "-i", profile_path, "-o", temp.name]
)
return self._LoadPlistMaybeBinary(temp.name)
def _MergePlist(self, merged_plist, plist):
@ -552,11 +605,11 @@ class MacTool(object):
# and if an exception is raised, convert a temporary copy to XML and
# load that copy.
return plistlib.readPlist(plist_path)
except:
except Exception:
pass
with tempfile.NamedTemporaryFile() as temp:
shutil.copy2(plist_path, temp.name)
subprocess.check_call(['plutil', '-convert', 'xml1', temp.name])
subprocess.check_call(["plutil", "-convert", "xml1", temp.name])
return plistlib.readPlist(temp.name)
def _GetSubstitutions(self, bundle_identifier, app_identifier_prefix):
@ -570,8 +623,8 @@ class MacTool(object):
Dictionary of substitutions to apply when generating Entitlements.plist.
"""
return {
'CFBundleIdentifier': bundle_identifier,
'AppIdentifierPrefix': app_identifier_prefix,
"CFBundleIdentifier": bundle_identifier,
"AppIdentifierPrefix": app_identifier_prefix,
}
def _GetCFBundleIdentifier(self):
@ -581,10 +634,10 @@ class MacTool(object):
Value of CFBundleIdentifier in the Info.plist located in the bundle.
"""
info_plist_path = os.path.join(
os.environ['TARGET_BUILD_DIR'],
os.environ['INFOPLIST_PATH'])
os.environ["TARGET_BUILD_DIR"], os.environ["INFOPLIST_PATH"]
)
info_plist_data = self._LoadPlistMaybeBinary(info_plist_path)
return info_plist_data['CFBundleIdentifier']
return info_plist_data["CFBundleIdentifier"]
def _InstallEntitlements(self, entitlements, substitutions, overrides):
"""Generates and install the ${BundleName}.xcent entitlements file.
@ -604,12 +657,10 @@ class MacTool(object):
"""
source_path = entitlements
target_path = os.path.join(
os.environ['BUILT_PRODUCTS_DIR'],
os.environ['PRODUCT_NAME'] + '.xcent')
os.environ["BUILT_PRODUCTS_DIR"], os.environ["PRODUCT_NAME"] + ".xcent"
)
if not source_path:
source_path = os.path.join(
os.environ['SDKROOT'],
'Entitlements.plist')
source_path = os.path.join(os.environ["SDKROOT"], "Entitlements.plist")
shutil.copy2(source_path, target_path)
data = self._LoadPlistMaybeBinary(target_path)
data = self._ExpandVariables(data, substitutions)
@ -634,7 +685,7 @@ class MacTool(object):
"""
if isinstance(data, str):
for key, value in substitutions.items():
data = data.replace('$(%s)' % key, value)
data = data.replace("$(%s)" % key, value)
return data
if isinstance(data, list):
return [self._ExpandVariables(v, substitutions) for v in data]
@ -642,9 +693,11 @@ class MacTool(object):
return {k: self._ExpandVariables(data[k], substitutions) for k in data}
return data
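The recursion above substitutes $(KEY) markers through nested strings, lists, and dicts. A self-contained sketch with a hypothetical expand() helper mirroring _ExpandVariables:

    def expand(data, substitutions):
        if isinstance(data, str):
            for key, value in substitutions.items():
                data = data.replace("$(%s)" % key, value)
            return data
        if isinstance(data, list):
            return [expand(v, substitutions) for v in data]
        if isinstance(data, dict):
            return {k: expand(data[k], substitutions) for k in data}
        return data

    subs = {"AppIdentifierPrefix": "TEAMID.", "CFBundleIdentifier": "com.example.App"}
    assert expand(["$(AppIdentifierPrefix)$(CFBundleIdentifier)"], subs) == ["TEAMID.com.example.App"]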
def NextGreaterPowerOf2(x):
return 2 ** (x).bit_length()
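NextGreaterPowerOf2 leans on int.bit_length(), which makes the result strictly greater than x even when x is already a power of two:

    assert 2 ** (5).bit_length() == 8
    assert 2 ** (8).bit_length() == 16   # 8 maps to 16, not 8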
def WriteHmap(output_name, filelist):
"""Generates a header map based on |filelist|.
@ -666,8 +719,18 @@ def WriteHmap(output_name, filelist):
max_value_length = max(len(value) for value in filelist.values())
out = open(output_name, "wb")
out.write(struct.pack('<LHHLLLL', magic, version, _reserved, strings_offset,
count, capacity, max_value_length))
out.write(
struct.pack(
"<LHHLLLL",
magic,
version,
_reserved,
strings_offset,
count,
capacity,
max_value_length,
)
)
# Create empty hashmap buckets.
buckets = [None] * capacity
@ -684,29 +747,30 @@ def WriteHmap(output_name, filelist):
next_offset = 1
for bucket in buckets:
if bucket is None:
out.write(struct.pack('<LLL', 0, 0, 0))
out.write(struct.pack("<LLL", 0, 0, 0))
else:
(file, path) = bucket
key_offset = next_offset
prefix_offset = key_offset + len(file) + 1
suffix_offset = prefix_offset + len(os.path.dirname(path) + os.sep) + 1
next_offset = suffix_offset + len(os.path.basename(path)) + 1
out.write(struct.pack('<LLL', key_offset, prefix_offset, suffix_offset))
out.write(struct.pack("<LLL", key_offset, prefix_offset, suffix_offset))
# Pad byte since next offset starts at 1.
out.write(struct.pack('<x'))
out.write(struct.pack("<x"))
for bucket in buckets:
if bucket is not None:
(file, path) = bucket
out.write(struct.pack('<%ds' % len(file), file))
out.write(struct.pack('<s', '\0'))
out.write(struct.pack("<%ds" % len(file), file))
out.write(struct.pack("<s", "\0"))
base = os.path.dirname(path) + os.sep
out.write(struct.pack('<%ds' % len(base), base))
out.write(struct.pack('<s', '\0'))
out.write(struct.pack("<%ds" % len(base), base))
out.write(struct.pack("<s", "\0"))
path = os.path.basename(path)
out.write(struct.pack('<%ds' % len(path), path))
out.write(struct.pack('<s', '\0'))
out.write(struct.pack("<%ds" % len(path), path))
out.write(struct.pack("<s", "\0"))
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))

File diff suppressed because it is too large

gyp/pylib/gyp/ninja_syntax.py
View file

@ -10,10 +10,11 @@ use Python.
"""
import textwrap
import re
def escape_path(word):
return word.replace('$ ','$$ ').replace(' ','$ ').replace(':', '$:')
return word.replace("$ ", "$$ ").replace(" ", "$ ").replace(":", "$:")
class Writer(object):
def __init__(self, output, width=78):
@ -21,47 +22,58 @@ class Writer(object):
self.width = width
def newline(self):
self.output.write('\n')
self.output.write("\n")
def comment(self, text):
for line in textwrap.wrap(text, self.width - 2):
self.output.write('# ' + line + '\n')
self.output.write("# " + line + "\n")
def variable(self, key, value, indent=0):
if value is None:
return
if isinstance(value, list):
value = ' '.join(filter(None, value)) # Filter out empty strings.
self._line('%s = %s' % (key, value), indent)
value = " ".join(filter(None, value)) # Filter out empty strings.
self._line("%s = %s" % (key, value), indent)
def pool(self, name, depth):
self._line('pool %s' % name)
self.variable('depth', depth, indent=1)
self._line("pool %s" % name)
self.variable("depth", depth, indent=1)
def rule(self, name, command, description=None, depfile=None,
generator=False, pool=None, restat=False, rspfile=None,
rspfile_content=None, deps=None):
self._line('rule %s' % name)
self.variable('command', command, indent=1)
def rule(
self,
name,
command,
description=None,
depfile=None,
generator=False,
pool=None,
restat=False,
rspfile=None,
rspfile_content=None,
deps=None,
):
self._line("rule %s" % name)
self.variable("command", command, indent=1)
if description:
self.variable('description', description, indent=1)
self.variable("description", description, indent=1)
if depfile:
self.variable('depfile', depfile, indent=1)
self.variable("depfile", depfile, indent=1)
if generator:
self.variable('generator', '1', indent=1)
self.variable("generator", "1", indent=1)
if pool:
self.variable('pool', pool, indent=1)
self.variable("pool", pool, indent=1)
if restat:
self.variable('restat', '1', indent=1)
self.variable("restat", "1", indent=1)
if rspfile:
self.variable('rspfile', rspfile, indent=1)
self.variable("rspfile", rspfile, indent=1)
if rspfile_content:
self.variable('rspfile_content', rspfile_content, indent=1)
self.variable("rspfile_content", rspfile_content, indent=1)
if deps:
self.variable('deps', deps, indent=1)
self.variable("deps", deps, indent=1)
def build(self, outputs, rule, inputs=None, implicit=None, order_only=None,
variables=None):
def build(
self, outputs, rule, inputs=None, implicit=None, order_only=None, variables=None
):
outputs = self._as_list(outputs)
all_inputs = self._as_list(inputs)[:]
out_outputs = list(map(escape_path, outputs))
@ -69,15 +81,16 @@ class Writer(object):
if implicit:
implicit = map(escape_path, self._as_list(implicit))
all_inputs.append('|')
all_inputs.append("|")
all_inputs.extend(implicit)
if order_only:
order_only = map(escape_path, self._as_list(order_only))
all_inputs.append('||')
all_inputs.append("||")
all_inputs.extend(order_only)
self._line('build %s: %s' % (' '.join(out_outputs),
' '.join([rule] + all_inputs)))
self._line(
"build %s: %s" % (" ".join(out_outputs), " ".join([rule] + all_inputs))
)
if variables:
if isinstance(variables, dict):
@ -91,58 +104,59 @@ class Writer(object):
return outputs
def include(self, path):
self._line('include %s' % path)
self._line("include %s" % path)
def subninja(self, path):
self._line('subninja %s' % path)
self._line("subninja %s" % path)
def default(self, paths):
self._line('default %s' % ' '.join(self._as_list(paths)))
self._line("default %s" % " ".join(self._as_list(paths)))
def _count_dollars_before_index(self, s, i):
"""Returns the number of '$' characters right in front of s[i]."""
dollar_count = 0
dollar_index = i - 1
while dollar_index > 0 and s[dollar_index] == '$':
while dollar_index > 0 and s[dollar_index] == "$":
dollar_count += 1
dollar_index -= 1
return dollar_count
def _line(self, text, indent=0):
"""Write 'text' word-wrapped at self.width characters."""
leading_space = ' ' * indent
leading_space = " " * indent
while len(leading_space) + len(text) > self.width:
# The text is too wide; wrap if possible.
# Find the rightmost space that would obey our width constraint and
# that's not an escaped space.
available_space = self.width - len(leading_space) - len(' $')
available_space = self.width - len(leading_space) - len(" $")
space = available_space
while True:
space = text.rfind(' ', 0, space)
if space < 0 or \
self._count_dollars_before_index(text, space) % 2 == 0:
space = text.rfind(" ", 0, space)
if space < 0 or self._count_dollars_before_index(text, space) % 2 == 0:
break
if space < 0:
# No such space; just use the first unescaped space we can find.
space = available_space - 1
while True:
space = text.find(' ', space + 1)
if space < 0 or \
self._count_dollars_before_index(text, space) % 2 == 0:
space = text.find(" ", space + 1)
if (
space < 0
or self._count_dollars_before_index(text, space) % 2 == 0
):
break
if space < 0:
# Give up on breaking.
break
self.output.write(leading_space + text[0:space] + ' $\n')
self.output.write(leading_space + text[0:space] + " $\n")
text = text[space + 1 :]
# Subsequent lines are continuations, so indent them.
leading_space = ' ' * (indent+2)
leading_space = " " * (indent + 2)
self.output.write(leading_space + text + '\n')
self.output.write(leading_space + text + "\n")
def _as_list(self, input):
if input is None:
@ -155,6 +169,6 @@ class Writer(object):
def escape(string):
"""Escape a string such that it can be embedded into a Ninja file without
further interpretation."""
assert '\n' not in string, 'Ninja syntax does not allow newlines'
assert "\n" not in string, "Ninja syntax does not allow newlines"
# We only have one special metacharacter: '$'.
return string.replace('$', '$$')
return string.replace("$", "$$")

gyp/pylib/gyp/simple_copy.py
View file

@ -7,11 +7,14 @@ structures or complex types except for dicts and lists. This is
because gyp copies structures so large that the small copy overhead ends up
taking seconds in a project the size of Chromium."""
class Error(Exception):
pass
__all__ = ["Error", "deepcopy"]
def deepcopy(x):
"""Deep copy operation on gyp objects such as strings, ints, dicts
and lists. More than twice as fast as copy.deepcopy but much less
@ -20,14 +23,19 @@ def deepcopy(x):
try:
return _deepcopy_dispatch[type(x)](x)
except KeyError:
raise Error('Unsupported type %s for deepcopy. Use copy.deepcopy ' +
'or expand simple_copy support.' % type(x))
raise Error(
    "Unsupported type %s for deepcopy. Use copy.deepcopy "
    "or expand simple_copy support." % type(x)
)
_deepcopy_dispatch = d = {}
def _deepcopy_atomic(x):
return x
try:
types = bool, float, int, str, type, type(None), long, unicode
except NameError: # Python 3
@ -36,15 +44,21 @@ except NameError: # Python 3
for x in types:
d[x] = _deepcopy_atomic
def _deepcopy_list(x):
return [deepcopy(a) for a in x]
d[list] = _deepcopy_list
def _deepcopy_dict(x):
y = {}
for key, value in x.items():
y[deepcopy(key)] = deepcopy(value)
return y
d[dict] = _deepcopy_dict
del d
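In use, deepcopy dispatches on the exact type and deliberately rejects anything beyond atomics, lists, and dicts, which is what keeps it cheap:

    src = {"targets": [{"sources": ["a.cc", "b.cc"]}]}
    dup = deepcopy(src)
    assert dup == src and dup["targets"] is not src["targets"]

    try:
        deepcopy((1, 2))   # tuples are unsupported by design
    except Error:
        pass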

gyp/pylib/gyp/win_tool.py
View file

@ -24,7 +24,8 @@ PY3 = bytes != str
# A regex matching an argument corresponding to the output filename passed to
# link.exe.
_LINK_EXE_OUT_ARG = re.compile('/OUT:(?P<out>.+)$', re.IGNORECASE)
_LINK_EXE_OUT_ARG = re.compile("/OUT:(?P<out>.+)$", re.IGNORECASE)
def main(args):
executor = WinTool()
@ -43,7 +44,7 @@ class WinTool(object):
if len(args) < 1:
raise Exception("Not enough arguments")
if args[0] != 'link.exe':
if args[0] != "link.exe":
return
# Use the output filename passed to the linker to generate an endpoint name
@ -52,8 +53,9 @@ class WinTool(object):
for arg in args:
m = _LINK_EXE_OUT_ARG.match(arg)
if m:
endpoint_name = re.sub(r'\W+', '',
'%s_%d' % (m.group('out'), os.getpid()))
endpoint_name = re.sub(
r"\W+", "", "%s_%d" % (m.group("out"), os.getpid())
)
break
if endpoint_name is None:
@ -62,7 +64,7 @@ class WinTool(object):
# Adds the appropriate environment variable. This will be read by link.exe
# to know which instance of mspdbsrv.exe it should connect to (if it's
# not set then the default endpoint is used).
env['_MSPDBSRV_ENDPOINT_'] = endpoint_name
env["_MSPDBSRV_ENDPOINT_"] = endpoint_name
def Dispatch(self, args):
"""Dispatches a string command to a method."""
@ -74,31 +76,33 @@ class WinTool(object):
def _CommandifyName(self, name_string):
"""Transforms a tool name like recursive-mirror to RecursiveMirror."""
return name_string.title().replace('-', '')
return name_string.title().replace("-", "")
def _GetEnv(self, arch):
"""Gets the saved environment from a file for a given architecture."""
# The environment is saved as an "environment block" (see CreateProcess
# and msvs_emulation for details). We convert to a dict here.
# Drop last 2 NULs, one for list terminator, one for trailing vs. separator.
pairs = open(arch).read()[:-2].split('\0')
kvs = [item.split('=', 1) for item in pairs]
pairs = open(arch).read()[:-2].split("\0")
kvs = [item.split("=", 1) for item in pairs]
return dict(kvs)
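The environment file parsed above is a CreateProcess-style block: NUL-separated KEY=VALUE pairs closed by a double NUL. A round-trip sketch with inline data instead of a file:

    block = "PATH=C:\\bin\0INCLUDE=C:\\inc\0\0"
    pairs = block[:-2].split("\0")          # drop the two trailing NULs
    env = dict(item.split("=", 1) for item in pairs)
    assert env == {"PATH": "C:\\bin", "INCLUDE": "C:\\inc"}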
def ExecStamp(self, path):
"""Simple stamp command."""
open(path, 'w').close()
open(path, "w").close()
def ExecRecursiveMirror(self, source, dest):
"""Emulation of rm -rf out && cp -af in out."""
if os.path.exists(dest):
if os.path.isdir(dest):
def _on_error(fn, path, excinfo):
# The operation failed, possibly because the file is set to
# read-only. If that's why, make it writable and try the op again.
if not os.access(path, os.W_OK):
os.chmod(path, stat.S_IWRITE)
fn(path)
shutil.rmtree(dest, onerror=_on_error)
else:
if not os.access(dest, os.W_OK):
@ -117,11 +121,11 @@ class WinTool(object):
This happens when there are exports from the dll or exe.
"""
env = self._GetEnv(arch)
if use_separate_mspdbsrv == 'True':
if use_separate_mspdbsrv == "True":
self._UseSeparateMspdbsrv(env, args)
if sys.platform == 'win32':
if sys.platform == "win32":
args = list(args) # *args is a tuple by default, which is read-only.
args[0] = args[0].replace('/', '\\')
args[0] = args[0].replace("/", "\\")
# https://docs.python.org/2/library/subprocess.html:
# "On Unix with shell=True [...] if args is a sequence, the first item
# specifies the command string, and any additional items will be treated as
@ -130,20 +134,37 @@ class WinTool(object):
# Popen(['/bin/sh', '-c', args[0], args[1], ...])"
# For that reason, since going through the shell doesn't seem necessary on
# non-Windows don't do that there.
link = subprocess.Popen(args, shell=sys.platform == 'win32', env=env,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
link = subprocess.Popen(
args,
shell=sys.platform == "win32",
env=env,
stdout=subprocess.PIPE,
stderr=subprocess.STDOUT,
)
out, _ = link.communicate()
if PY3:
out = out.decode('utf-8')
out = out.decode("utf-8")
for line in out.splitlines():
if (not line.startswith(' Creating library ') and
not line.startswith('Generating code') and
not line.startswith('Finished generating code')):
if (
not line.startswith(" Creating library ")
and not line.startswith("Generating code")
and not line.startswith("Finished generating code")
):
print(line)
return link.returncode
def ExecLinkWithManifests(self, arch, embed_manifest, out, ldcmd, resname,
mt, rc, intermediate_manifest, *manifests):
def ExecLinkWithManifests(
self,
arch,
embed_manifest,
out,
ldcmd,
resname,
mt,
rc,
intermediate_manifest,
*manifests
):
"""A wrapper for handling creating a manifest resource and then executing
a link command."""
# The 'normal' way to do manifests is to have link generate a manifest
@ -157,29 +178,32 @@ class WinTool(object):
# first and only link. We still tell link to generate a manifest, but we
# only use that to assert that our simpler process did not miss anything.
variables = {
'python': sys.executable,
'arch': arch,
'out': out,
'ldcmd': ldcmd,
'resname': resname,
'mt': mt,
'rc': rc,
'intermediate_manifest': intermediate_manifest,
'manifests': ' '.join(manifests),
"python": sys.executable,
"arch": arch,
"out": out,
"ldcmd": ldcmd,
"resname": resname,
"mt": mt,
"rc": rc,
"intermediate_manifest": intermediate_manifest,
"manifests": " ".join(manifests),
}
add_to_ld = ''
add_to_ld = ""
if manifests:
subprocess.check_call(
'%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo '
'-manifest %(manifests)s -out:%(out)s.manifest' % variables)
if embed_manifest == 'True':
"%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo "
"-manifest %(manifests)s -out:%(out)s.manifest" % variables
)
if embed_manifest == "True":
subprocess.check_call(
'%(python)s gyp-win-tool manifest-to-rc %(arch)s %(out)s.manifest'
' %(out)s.manifest.rc %(resname)s' % variables)
"%(python)s gyp-win-tool manifest-to-rc %(arch)s %(out)s.manifest"
" %(out)s.manifest.rc %(resname)s" % variables
)
subprocess.check_call(
'%(python)s gyp-win-tool rc-wrapper %(arch)s %(rc)s '
'%(out)s.manifest.rc' % variables)
add_to_ld = ' %(out)s.manifest.res' % variables
"%(python)s gyp-win-tool rc-wrapper %(arch)s %(rc)s "
"%(out)s.manifest.rc" % variables
)
add_to_ld = " %(out)s.manifest.res" % variables
subprocess.check_call(ldcmd + add_to_ld)
# Run mt.exe on the theoretically complete manifest we generated, merging
@ -191,31 +215,37 @@ class WinTool(object):
# Merge the intermediate one with ours to .assert.manifest, then check
# that .assert.manifest is identical to ours.
subprocess.check_call(
'%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo '
'-manifest %(out)s.manifest %(intermediate_manifest)s '
'-out:%(out)s.assert.manifest' % variables)
assert_manifest = '%(out)s.assert.manifest' % variables
our_manifest = '%(out)s.manifest' % variables
"%(python)s gyp-win-tool manifest-wrapper %(arch)s %(mt)s -nologo "
"-manifest %(out)s.manifest %(intermediate_manifest)s "
"-out:%(out)s.assert.manifest" % variables
)
assert_manifest = "%(out)s.assert.manifest" % variables
our_manifest = "%(out)s.manifest" % variables
# Load and normalize the manifests. mt.exe sometimes removes whitespace,
# and sometimes doesn't unfortunately.
with open(our_manifest, 'rb') as our_f:
with open(assert_manifest, 'rb') as assert_f:
with open(our_manifest, "r") as our_f:
with open(assert_manifest, "r") as assert_f:
our_data = "".join(our_f.read().split())
assert_data = "".join(assert_f.read().split())
if our_data != assert_data:
os.unlink(out)
def dump(filename):
sys.stderr.write('%s\n-----\n' % filename)
with open(filename, 'rb') as f:
sys.stderr.write(f.read() + '\n-----\n')
print(filename, file=sys.stderr)
print("-----", file=sys.stderr)
with open(filename, "r") as f:
print(f.read(), file=sys.stderr)
print("-----", file=sys.stderr)
dump(intermediate_manifest)
dump(our_manifest)
dump(assert_manifest)
sys.stderr.write(
'Linker generated manifest "%s" added to final manifest "%s" '
'(result in "%s"). '
'Were /MANIFEST switches used in #pragma statements? ' % (
intermediate_manifest, our_manifest, assert_manifest))
"Were /MANIFEST switches used in #pragma statements? "
% (intermediate_manifest, our_manifest, assert_manifest)
)
return 1
def ExecManifestWrapper(self, arch, *args):
@ -223,13 +253,14 @@ class WinTool(object):
(some XML blocks are recognized by the OS loader, but not the manifest
tool)."""
env = self._GetEnv(arch)
popen = subprocess.Popen(args, shell=True, env=env,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
popen = subprocess.Popen(
args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
out, _ = popen.communicate()
if PY3:
out = out.decode('utf-8')
out = out.decode("utf-8")
for line in out.splitlines():
if line and 'manifest authoring warning 81010002' not in line:
if line and "manifest authoring warning 81010002" not in line:
print(line)
return popen.returncode
@ -238,38 +269,49 @@ class WinTool(object):
|args| is a tuple containing the path to the resource file, the path to the manifest
file, and the resource name, which can be "1" (for executables) or "2" (for DLLs)."""
manifest_path, resource_path, resource_name = args
with open(resource_path, 'wb') as output:
output.write('#include <windows.h>\n%s RT_MANIFEST "%s"' % (
resource_name,
os.path.abspath(manifest_path).replace('\\', '/')))
with open(resource_path, "w") as output:
output.write(
'#include <windows.h>\n%s RT_MANIFEST "%s"'
% (resource_name, os.path.abspath(manifest_path).replace("\\", "/"))
)
def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl,
*flags):
def ExecMidlWrapper(self, arch, outdir, tlb, h, dlldata, iid, proxy, idl, *flags):
"""Filter noisy filenames output from MIDL compile step that isn't
quietable via command line flags.
"""
args = ['midl', '/nologo'] + list(flags) + [
'/out', outdir,
'/tlb', tlb,
'/h', h,
'/dlldata', dlldata,
'/iid', iid,
'/proxy', proxy,
idl]
args = (
["midl", "/nologo"]
+ list(flags)
+ [
"/out",
outdir,
"/tlb",
tlb,
"/h",
h,
"/dlldata",
dlldata,
"/iid",
iid,
"/proxy",
proxy,
idl,
]
)
env = self._GetEnv(arch)
popen = subprocess.Popen(args, shell=True, env=env,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
popen = subprocess.Popen(
args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
out, _ = popen.communicate()
if PY3:
out = out.decode('utf-8')
out = out.decode("utf-8")
# Filter junk out of stdout, and write filtered versions. Output we want
# to filter is pairs of lines that look like this:
# Processing C:\Program Files (x86)\Microsoft SDKs\...\include\objidl.idl
# objidl.idl
lines = out.splitlines()
prefixes = ('Processing ', '64 bit Processing ')
processing = set(os.path.basename(x)
for x in lines if x.startswith(prefixes))
prefixes = ("Processing ", "64 bit Processing ")
processing = set(os.path.basename(x) for x in lines if x.startswith(prefixes))
for line in lines:
if not line.startswith(prefixes) and line not in processing:
print(line)
@ -278,16 +320,19 @@ class WinTool(object):
def ExecAsmWrapper(self, arch, *args):
"""Filter logo banner from invocations of asm.exe."""
env = self._GetEnv(arch)
popen = subprocess.Popen(args, shell=True, env=env,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
popen = subprocess.Popen(
args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
out, _ = popen.communicate()
if PY3:
out = out.decode('utf-8')
out = out.decode("utf-8")
for line in out.splitlines():
if (not line.startswith('Copyright (C) Microsoft Corporation') and
not line.startswith('Microsoft (R) Macro Assembler') and
not line.startswith(' Assembling: ') and
line):
if (
not line.startswith("Copyright (C) Microsoft Corporation")
and not line.startswith("Microsoft (R) Macro Assembler")
and not line.startswith(" Assembling: ")
and line
):
print(line)
return popen.returncode
@ -295,15 +340,18 @@ class WinTool(object):
"""Filter logo banner from invocations of rc.exe. Older versions of RC
don't support the /nologo flag."""
env = self._GetEnv(arch)
popen = subprocess.Popen(args, shell=True, env=env,
stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
popen = subprocess.Popen(
args, shell=True, env=env, stdout=subprocess.PIPE, stderr=subprocess.STDOUT
)
out, _ = popen.communicate()
if PY3:
out = out.decode('utf-8')
out = out.decode("utf-8")
for line in out.splitlines():
if (not line.startswith('Microsoft (R) Windows (R) Resource Compiler') and
not line.startswith('Copyright (C) Microsoft Corporation') and
line):
if (
not line.startswith("Microsoft (R) Windows (R) Resource Compiler")
and not line.startswith("Copyright (C) Microsoft Corporation")
and line
):
print(line)
return popen.returncode
@ -324,12 +372,14 @@ class WinTool(object):
"""Executed by msvs-ninja projects when the 'ClCompile' target is used to
build selected C/C++ files."""
project_dir = os.path.relpath(project_dir, BASE_DIR)
selected_files = selected_files.split(';')
ninja_targets = [os.path.join(project_dir, filename) + '^^'
for filename in selected_files]
cmd = ['ninja.exe']
selected_files = selected_files.split(";")
ninja_targets = [
os.path.join(project_dir, filename) + "^^" for filename in selected_files
]
cmd = ["ninja.exe"]
cmd.extend(ninja_targets)
return subprocess.call(cmd, shell=True, cwd=BASE_DIR)
if __name__ == '__main__':
if __name__ == "__main__":
sys.exit(main(sys.argv[1:]))

File diff suppressed because it is too large

gyp/pylib/gyp/xcode_ninja.py
View file

@ -22,8 +22,8 @@ import xml.sax.saxutils
def _WriteWorkspace(main_gyp, sources_gyp, params):
""" Create a workspace to wrap main and sources gyp paths. """
(build_file_root, build_file_ext) = os.path.splitext(main_gyp)
workspace_path = build_file_root + '.xcworkspace'
options = params['options']
workspace_path = build_file_root + ".xcworkspace"
options = params["options"]
if options.generator_output:
workspace_path = os.path.join(options.generator_output, workspace_path)
try:
@ -31,18 +31,19 @@ def _WriteWorkspace(main_gyp, sources_gyp, params):
except OSError as e:
if e.errno != errno.EEXIST:
raise
output_string = '<?xml version="1.0" encoding="UTF-8"?>\n' + \
'<Workspace version = "1.0">\n'
output_string = (
'<?xml version="1.0" encoding="UTF-8"?>\n' + '<Workspace version = "1.0">\n'
)
for gyp_name in [main_gyp, sources_gyp]:
name = os.path.splitext(os.path.basename(gyp_name))[0] + '.xcodeproj'
name = os.path.splitext(os.path.basename(gyp_name))[0] + ".xcodeproj"
name = xml.sax.saxutils.quoteattr("group:" + name)
output_string += ' <FileRef location = %s></FileRef>\n' % name
output_string += '</Workspace>\n'
output_string += " <FileRef location = %s></FileRef>\n" % name
output_string += "</Workspace>\n"
workspace_file = os.path.join(workspace_path, "contents.xcworkspacedata")
try:
with open(workspace_file, 'r') as input_file:
with open(workspace_file, "r") as input_file:
input_string = input_file.read()
if input_string == output_string:
return
@ -50,84 +51,89 @@ def _WriteWorkspace(main_gyp, sources_gyp, params):
# Ignore errors if the file doesn't exist.
pass
with open(workspace_file, 'w') as output_file:
with open(workspace_file, "w") as output_file:
output_file.write(output_string)
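For a main gyp named foo.ninja.gyp plus the generated sources gyp, the contents.xcworkspacedata written above comes out roughly as follows (names and indentation illustrative; quoteattr supplies the double quotes):

    expected = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<Workspace version = "1.0">\n'
        '  <FileRef location = "group:foo.ninja.xcodeproj"></FileRef>\n'
        '  <FileRef location = "group:sources_for_indexing.xcodeproj"></FileRef>\n'
        '</Workspace>\n'
    )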
def _TargetFromSpec(old_spec, params):
""" Create fake target for xcode-ninja wrapper. """
# Determine ninja top level build dir (e.g. /path/to/out).
ninja_toplevel = None
jobs = 0
if params:
options = params['options']
ninja_toplevel = \
os.path.join(options.toplevel_dir,
gyp.generator.ninja.ComputeOutputDir(params))
jobs = params.get('generator_flags', {}).get('xcode_ninja_jobs', 0)
options = params["options"]
ninja_toplevel = os.path.join(
options.toplevel_dir, gyp.generator.ninja.ComputeOutputDir(params)
)
jobs = params.get("generator_flags", {}).get("xcode_ninja_jobs", 0)
target_name = old_spec.get('target_name')
product_name = old_spec.get('product_name', target_name)
product_extension = old_spec.get('product_extension')
target_name = old_spec.get("target_name")
product_name = old_spec.get("product_name", target_name)
product_extension = old_spec.get("product_extension")
ninja_target = {}
ninja_target['target_name'] = target_name
ninja_target['product_name'] = product_name
ninja_target["target_name"] = target_name
ninja_target["product_name"] = product_name
if product_extension:
ninja_target['product_extension'] = product_extension
ninja_target['toolset'] = old_spec.get('toolset')
ninja_target['default_configuration'] = old_spec.get('default_configuration')
ninja_target['configurations'] = {}
ninja_target["product_extension"] = product_extension
ninja_target["toolset"] = old_spec.get("toolset")
ninja_target["default_configuration"] = old_spec.get("default_configuration")
ninja_target["configurations"] = {}
# Tell Xcode to look in |ninja_toplevel| for build products.
new_xcode_settings = {}
if ninja_toplevel:
new_xcode_settings['CONFIGURATION_BUILD_DIR'] = \
new_xcode_settings["CONFIGURATION_BUILD_DIR"] = (
"%s/$(CONFIGURATION)$(EFFECTIVE_PLATFORM_NAME)" % ninja_toplevel
)
if 'configurations' in old_spec:
for config in old_spec['configurations']:
old_xcode_settings = \
old_spec['configurations'][config].get('xcode_settings', {})
if 'IPHONEOS_DEPLOYMENT_TARGET' in old_xcode_settings:
new_xcode_settings['CODE_SIGNING_REQUIRED'] = "NO"
new_xcode_settings['IPHONEOS_DEPLOYMENT_TARGET'] = \
old_xcode_settings['IPHONEOS_DEPLOYMENT_TARGET']
for key in ['BUNDLE_LOADER', 'TEST_HOST']:
if "configurations" in old_spec:
for config in old_spec["configurations"]:
old_xcode_settings = old_spec["configurations"][config].get(
"xcode_settings", {}
)
if "IPHONEOS_DEPLOYMENT_TARGET" in old_xcode_settings:
new_xcode_settings["CODE_SIGNING_REQUIRED"] = "NO"
new_xcode_settings["IPHONEOS_DEPLOYMENT_TARGET"] = old_xcode_settings[
"IPHONEOS_DEPLOYMENT_TARGET"
]
for key in ["BUNDLE_LOADER", "TEST_HOST"]:
if key in old_xcode_settings:
new_xcode_settings[key] = old_xcode_settings[key]
ninja_target['configurations'][config] = {}
ninja_target['configurations'][config]['xcode_settings'] = \
new_xcode_settings
ninja_target["configurations"][config] = {}
ninja_target["configurations"][config][
"xcode_settings"
] = new_xcode_settings
ninja_target['mac_bundle'] = old_spec.get('mac_bundle', 0)
ninja_target['mac_xctest_bundle'] = old_spec.get('mac_xctest_bundle', 0)
ninja_target['ios_app_extension'] = old_spec.get('ios_app_extension', 0)
ninja_target['ios_watchkit_extension'] = \
old_spec.get('ios_watchkit_extension', 0)
ninja_target['ios_watchkit_app'] = old_spec.get('ios_watchkit_app', 0)
ninja_target['type'] = old_spec['type']
ninja_target["mac_bundle"] = old_spec.get("mac_bundle", 0)
ninja_target["mac_xctest_bundle"] = old_spec.get("mac_xctest_bundle", 0)
ninja_target["ios_app_extension"] = old_spec.get("ios_app_extension", 0)
ninja_target["ios_watchkit_extension"] = old_spec.get("ios_watchkit_extension", 0)
ninja_target["ios_watchkit_app"] = old_spec.get("ios_watchkit_app", 0)
ninja_target["type"] = old_spec["type"]
if ninja_toplevel:
ninja_target['actions'] = [
ninja_target["actions"] = [
{
'action_name': 'Compile and copy %s via ninja' % target_name,
'inputs': [],
'outputs': [],
'action': [
'env',
'PATH=%s' % os.environ['PATH'],
'ninja',
'-C',
new_xcode_settings['CONFIGURATION_BUILD_DIR'],
"action_name": "Compile and copy %s via ninja" % target_name,
"inputs": [],
"outputs": [],
"action": [
"env",
"PATH=%s" % os.environ["PATH"],
"ninja",
"-C",
new_xcode_settings["CONFIGURATION_BUILD_DIR"],
target_name,
],
'message': 'Compile and copy %s via ninja' % target_name,
"message": "Compile and copy %s via ninja" % target_name,
},
]
if jobs > 0:
ninja_target['actions'][0]['action'].extend(('-j', jobs))
ninja_target["actions"][0]["action"].extend(("-j", jobs))
return ninja_target
def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec):
"""Limit targets for Xcode wrapper.
@ -138,15 +144,16 @@ def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec):
executable_target_pattern: Regular expression limiting executable targets.
spec: Specifications for target.
"""
target_name = spec.get('target_name')
target_name = spec.get("target_name")
# Always include targets matching target_extras.
if target_extras is not None and re.search(target_extras, target_name):
return True
# Otherwise just show executable targets and xc_tests.
if (int(spec.get('mac_xctest_bundle', 0)) != 0 or
(spec.get('type', '') == 'executable' and
spec.get('product_extension', '') != 'bundle')):
if int(spec.get("mac_xctest_bundle", 0)) != 0 or (
spec.get("type", "") == "executable"
and spec.get("product_extension", "") != "bundle"
):
# If there is a filter and the target does not match, exclude the target.
if executable_target_pattern is not None:
@ -155,6 +162,7 @@ def IsValidTargetForWrapper(target_extras, executable_target_pattern, spec):
return True
return False
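So, per the comments above, a target survives the wrapper filter if it matches xcode_ninja_target_pattern, or if it is an xctest bundle or a non-bundle executable that passes the optional executable pattern. A usage sketch with a hypothetical spec:

    spec = {"target_name": "base_unittests", "type": "executable", "product_extension": ""}
    assert IsValidTargetForWrapper(None, None, spec)                   # plain executable
    assert not IsValidTargetForWrapper(None, r"^chrome$", spec)        # excluded by filter
    assert IsValidTargetForWrapper(r"_unittests$", r"^chrome$", spec)  # rescued by pattern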
def CreateWrapper(target_list, target_dicts, data, params):
"""Initialize targets for the ninja wrapper.
@ -166,15 +174,15 @@ def CreateWrapper(target_list, target_dicts, data, params):
data: Dict of flattened build files keyed on gyp path.
params: Dict of global options for gyp.
"""
orig_gyp = params['build_files'][0]
orig_gyp = params["build_files"][0]
for gyp_name, gyp_dict in data.items():
if gyp_name == orig_gyp:
depth = gyp_dict['_DEPTH']
depth = gyp_dict["_DEPTH"]
# Check for custom main gyp name, otherwise use the default CHROMIUM_GYP_FILE
# and prepend .ninja before the .gyp extension.
generator_flags = params.get('generator_flags', {})
main_gyp = generator_flags.get('xcode_ninja_main_gyp', None)
generator_flags = params.get("generator_flags", {})
main_gyp = generator_flags.get("xcode_ninja_main_gyp", None)
if main_gyp is None:
(build_file_root, build_file_ext) = os.path.splitext(orig_gyp)
main_gyp = build_file_root + ".ninja" + build_file_ext
@ -186,103 +194,108 @@ def CreateWrapper(target_list, target_dicts, data, params):
# Set base keys needed for |data|.
new_data[main_gyp] = {}
new_data[main_gyp]['included_files'] = []
new_data[main_gyp]['targets'] = []
new_data[main_gyp]['xcode_settings'] = \
data[orig_gyp].get('xcode_settings', {})
new_data[main_gyp]["included_files"] = []
new_data[main_gyp]["targets"] = []
new_data[main_gyp]["xcode_settings"] = data[orig_gyp].get("xcode_settings", {})
# Normally the xcode-ninja generator includes only valid executable targets.
# If |xcode_ninja_executable_target_pattern| is set, that list is reduced to
# executable targets that match the pattern. (Default all)
executable_target_pattern = \
generator_flags.get('xcode_ninja_executable_target_pattern', None)
executable_target_pattern = generator_flags.get(
"xcode_ninja_executable_target_pattern", None
)
# For including other non-executable targets, add the matching target name
# to the |xcode_ninja_target_pattern| regular expression. (Default none)
target_extras = generator_flags.get('xcode_ninja_target_pattern', None)
target_extras = generator_flags.get("xcode_ninja_target_pattern", None)
for old_qualified_target in target_list:
spec = target_dicts[old_qualified_target]
if IsValidTargetForWrapper(target_extras, executable_target_pattern, spec):
# Add to new_target_list.
target_name = spec.get('target_name')
new_target_name = '%s:%s#target' % (main_gyp, target_name)
target_name = spec.get("target_name")
new_target_name = "%s:%s#target" % (main_gyp, target_name)
new_target_list.append(new_target_name)
# Add to new_target_dicts.
new_target_dicts[new_target_name] = _TargetFromSpec(spec, params)
# Add to new_data.
for old_target in data[old_qualified_target.split(':')[0]]['targets']:
if old_target['target_name'] == target_name:
for old_target in data[old_qualified_target.split(":")[0]]["targets"]:
if old_target["target_name"] == target_name:
new_data_target = {}
new_data_target['target_name'] = old_target['target_name']
new_data_target['toolset'] = old_target['toolset']
new_data[main_gyp]['targets'].append(new_data_target)
new_data_target["target_name"] = old_target["target_name"]
new_data_target["toolset"] = old_target["toolset"]
new_data[main_gyp]["targets"].append(new_data_target)
# Create sources target.
sources_target_name = 'sources_for_indexing'
sources_target_name = "sources_for_indexing"
sources_target = _TargetFromSpec(
{ 'target_name' : sources_target_name,
'toolset': 'target',
'default_configuration': 'Default',
'mac_bundle': '0',
'type': 'executable'
}, None)
{
"target_name": sources_target_name,
"toolset": "target",
"default_configuration": "Default",
"mac_bundle": "0",
"type": "executable",
},
None,
)
# Tell Xcode to look everywhere for headers.
sources_target['configurations'] = {'Default': { 'include_dirs': [ depth ] } }
sources_target["configurations"] = {"Default": {"include_dirs": [depth]}}
# Put excluded files into the sources target so they can be opened in Xcode.
skip_excluded_files = \
not generator_flags.get('xcode_ninja_list_excluded_files', True)
skip_excluded_files = not generator_flags.get(
"xcode_ninja_list_excluded_files", True
)
sources = []
for target, target_dict in target_dicts.items():
base = os.path.dirname(target)
files = target_dict.get('sources', []) + \
target_dict.get('mac_bundle_resources', [])
files = target_dict.get("sources", []) + target_dict.get(
"mac_bundle_resources", []
)
if not skip_excluded_files:
files.extend(target_dict.get('sources_excluded', []) +
target_dict.get('mac_bundle_resources_excluded', []))
files.extend(
target_dict.get("sources_excluded", [])
+ target_dict.get("mac_bundle_resources_excluded", [])
)
for action in target_dict.get('actions', []):
files.extend(action.get('inputs', []))
for action in target_dict.get("actions", []):
files.extend(action.get("inputs", []))
if not skip_excluded_files:
files.extend(action.get('inputs_excluded', []))
files.extend(action.get("inputs_excluded", []))
# Remove files starting with $. These are mostly intermediate files for the
# build system.
files = [ file for file in files if not file.startswith('$')]
files = [file for file in files if not file.startswith("$")]
# Make sources relative to root build file.
relative_path = os.path.dirname(main_gyp)
sources += [ os.path.relpath(os.path.join(base, file), relative_path)
for file in files ]
sources += [
os.path.relpath(os.path.join(base, file), relative_path) for file in files
]
sources_target['sources'] = sorted(set(sources))
sources_target["sources"] = sorted(set(sources))
# Put sources_to_index in its own gyp.
sources_gyp = \
os.path.join(os.path.dirname(main_gyp), sources_target_name + ".gyp")
fully_qualified_target_name = \
'%s:%s#target' % (sources_gyp, sources_target_name)
sources_gyp = os.path.join(os.path.dirname(main_gyp), sources_target_name + ".gyp")
fully_qualified_target_name = "%s:%s#target" % (sources_gyp, sources_target_name)
# Add to new_target_list, new_target_dicts and new_data.
new_target_list.append(fully_qualified_target_name)
new_target_dicts[fully_qualified_target_name] = sources_target
new_data_target = {}
new_data_target['target_name'] = sources_target['target_name']
new_data_target['_DEPTH'] = depth
new_data_target['toolset'] = "target"
new_data_target["target_name"] = sources_target["target_name"]
new_data_target["_DEPTH"] = depth
new_data_target["toolset"] = "target"
new_data[sources_gyp] = {}
new_data[sources_gyp]["targets"] = []
new_data[sources_gyp]["included_files"] = []
new_data[sources_gyp]["xcode_settings"] = data[orig_gyp].get("xcode_settings", {})
new_data[sources_gyp]["targets"].append(new_data_target)
# Write workspace to file.
_WriteWorkspace(main_gyp, sources_gyp, params)
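For orientation, the indexing target assembled above boils down to roughly the following dict (a sketch that merges the spec passed to _TargetFromSpec with the keys later set on its result; depth and sources are computed at runtime):

    sources_target = {
        "target_name": "sources_for_indexing",
        "toolset": "target",
        "default_configuration": "Default",
        "mac_bundle": "0",
        "type": "executable",
        "configurations": {"Default": {"include_dirs": [depth]}},
        "sources": sorted(set(sources)),  # every source, resource and action input gathered above
    }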

File diff suppressed because it is too large

View file

@ -16,12 +16,9 @@ import xml.dom.minidom
def _Replacement_write_data(writer, data, is_attrib=False):
"""Writes datachars to writer."""
data = data.replace("&", "&amp;").replace("<", "&lt;")
data = data.replace("\"", "&quot;").replace(">", "&gt;")
data = data.replace('"', "&quot;").replace(">", "&gt;")
if is_attrib:
data = data.replace("\r", "&#xD;").replace("\n", "&#xA;").replace("\t", "&#x9;")
writer.write(data)
@ -32,13 +29,12 @@ def _Replacement_writexml(self, writer, indent="", addindent="", newl=""):
writer.write(indent + "<" + self.tagName)
attrs = self._get_attributes()
a_names = attrs.keys()
a_names = sorted(attrs.keys())
for a_name in a_names:
writer.write(" %s=\"" % a_name)
writer.write(' %s="' % a_name)
_Replacement_write_data(writer, attrs[a_name].value, is_attrib=True)
writer.write("\"")
writer.write('"')
if self.childNodes:
writer.write(">%s" % newl)
for node in self.childNodes:

2
gyp/requirements_dev.txt Normal file
View file

@ -0,0 +1,2 @@
flake8
pytest

View file

@ -1,81 +0,0 @@
#!/usr/bin/python
# Copyright (c) 2009 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
import os.path
import shutil
import sys
gyps = [
'app/app.gyp',
'base/base.gyp',
'build/temp_gyp/googleurl.gyp',
'build/all.gyp',
'build/common.gypi',
'build/external_code.gypi',
'chrome/test/security_tests/security_tests.gyp',
'chrome/third_party/hunspell/hunspell.gyp',
'chrome/chrome.gyp',
'media/media.gyp',
'net/net.gyp',
'printing/printing.gyp',
'sdch/sdch.gyp',
'skia/skia.gyp',
'testing/gmock.gyp',
'testing/gtest.gyp',
'third_party/bzip2/bzip2.gyp',
'third_party/icu38/icu38.gyp',
'third_party/libevent/libevent.gyp',
'third_party/libjpeg/libjpeg.gyp',
'third_party/libpng/libpng.gyp',
'third_party/libxml/libxml.gyp',
'third_party/libxslt/libxslt.gyp',
'third_party/lzma_sdk/lzma_sdk.gyp',
'third_party/modp_b64/modp_b64.gyp',
'third_party/npapi/npapi.gyp',
'third_party/sqlite/sqlite.gyp',
'third_party/zlib/zlib.gyp',
'v8/tools/gyp/v8.gyp',
'webkit/activex_shim/activex_shim.gyp',
'webkit/activex_shim_dll/activex_shim_dll.gyp',
'webkit/build/action_csspropertynames.py',
'webkit/build/action_cssvaluekeywords.py',
'webkit/build/action_jsconfig.py',
'webkit/build/action_makenames.py',
'webkit/build/action_maketokenizer.py',
'webkit/build/action_useragentstylesheets.py',
'webkit/build/rule_binding.py',
'webkit/build/rule_bison.py',
'webkit/build/rule_gperf.py',
'webkit/tools/test_shell/test_shell.gyp',
'webkit/webkit.gyp',
]
def Main(argv):
if len(argv) != 3 or argv[1] not in ['push', 'pull']:
print 'Usage: %s push/pull PATH_TO_CHROME' % argv[0]
return 1
path_to_chrome = argv[2]
for g in gyps:
chrome_file = os.path.join(path_to_chrome, g)
local_file = os.path.join(os.path.dirname(argv[0]), os.path.split(g)[1])
if argv[1] == 'push':
print 'Copying %s to %s' % (local_file, chrome_file)
shutil.copyfile(local_file, chrome_file)
elif argv[1] == 'pull':
print 'Copying %s to %s' % (chrome_file, local_file)
shutil.copyfile(chrome_file, local_file)
else:
assert False
return 0
if __name__ == '__main__':
sys.exit(Main(sys.argv))

View file

@ -1,5 +0,0 @@
@rem Copyright (c) 2009 Google Inc. All rights reserved.
@rem Use of this source code is governed by a BSD-style license that can be
@rem found in the LICENSE file.
@python %~dp0/samples %*

View file

@ -4,16 +4,41 @@
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
from os import path
from setuptools import setup
here = path.abspath(path.dirname(__file__))
# Get the long description from the README file
with open(path.join(here, "README.md")) as in_file:
long_description = in_file.read()
setup(
name="gyp-next",
version="0.2.0",
description="A fork of the GYP build system for use in the Node.js projects",
long_description=long_description,
long_description_content_type="text/markdown",
author="Node.js contributors",
author_email="ryzokuken@disroot.org",
url="https://github.com/nodejs/gyp-next",
package_dir={"": "pylib"},
packages=["gyp", "gyp.generator"],
entry_points={"console_scripts": ["gyp=gyp:script_main"]},
python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*",
classifiers=[
'Development Status :: 3 - Alpha',
'Environment :: Console',
'Intended Audience :: Developers',
'License :: OSI Approved :: BSD License',
'Natural Language :: English',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
],
)
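The entry_points declaration above means an installed package exposes a gyp console script that dispatches to gyp.script_main(). A minimal sketch of the equivalent invocation (assuming pylib/ is importable, per the package_dir mapping):

    import sys
    import gyp  # package_dir maps "" to "pylib", so this resolves to pylib/gyp

    sys.exit(gyp.script_main())  # what the "gyp=gyp:script_main" entry point runs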

269
gyp/test_gyp.py Executable file
View file

@ -0,0 +1,269 @@
#!/usr/bin/env python
# Copyright (c) 2012 Google Inc. All rights reserved.
# Use of this source code is governed by a BSD-style license that can be
# found in the LICENSE file.
"""gyptest.py -- test runner for GYP tests."""
from __future__ import print_function
import argparse
import os
import platform
import subprocess
import sys
import time
def is_test_name(f):
return f.startswith("gyptest") and f.endswith(".py")
def find_all_gyptest_files(directory):
result = []
for root, dirs, files in os.walk(directory):
result.extend([os.path.join(root, f) for f in files if is_test_name(f)])
result.sort()
return result
def main(argv=None):
if argv is None:
argv = sys.argv
parser = argparse.ArgumentParser()
parser.add_argument("-a", "--all", action="store_true", help="run all tests")
parser.add_argument("-C", "--chdir", action="store", help="change to directory")
parser.add_argument(
"-f",
"--format",
action="store",
default="",
help="run tests with the specified formats",
)
parser.add_argument(
"-G",
"--gyp_option",
action="append",
default=[],
help="Add -G options to the gyp command line",
)
parser.add_argument(
"-l", "--list", action="store_true", help="list available tests and exit"
)
parser.add_argument(
"-n",
"--no-exec",
action="store_true",
help="no execute, just print the command line",
)
parser.add_argument(
"--path", action="append", default=[], help="additional $PATH directory"
)
parser.add_argument(
"-q",
"--quiet",
action="store_true",
help="quiet, don't print anything unless there are failures",
)
parser.add_argument(
"-v",
"--verbose",
action="store_true",
help="print configuration info and test results.",
)
parser.add_argument("tests", nargs="*")
args = parser.parse_args(argv[1:])
if args.chdir:
os.chdir(args.chdir)
if args.path:
extra_path = [os.path.abspath(p) for p in args.path]
extra_path = os.pathsep.join(extra_path)
os.environ["PATH"] = extra_path + os.pathsep + os.environ["PATH"]
if not args.tests:
if not args.all:
sys.stderr.write("Specify -a to get all tests.\n")
return 1
args.tests = ["test"]
tests = []
for arg in args.tests:
if os.path.isdir(arg):
tests.extend(find_all_gyptest_files(os.path.normpath(arg)))
else:
if not is_test_name(os.path.basename(arg)):
print(arg, "is not a valid gyp test name.", file=sys.stderr)
sys.exit(1)
tests.append(arg)
if args.list:
for test in tests:
print(test)
sys.exit(0)
os.environ["PYTHONPATH"] = os.path.abspath("test/lib")
if args.verbose:
print_configuration_info()
if args.gyp_option and not args.quiet:
print("Extra Gyp options: %s\n" % args.gyp_option)
if args.format:
format_list = args.format.split(",")
else:
format_list = {
"aix5": ["make"],
"freebsd7": ["make"],
"freebsd8": ["make"],
"openbsd5": ["make"],
"cygwin": ["msvs"],
"win32": ["msvs", "ninja"],
"linux": ["make", "ninja"],
"linux2": ["make", "ninja"],
"linux3": ["make", "ninja"],
# TODO: Re-enable xcode-ninja.
# https://bugs.chromium.org/p/gyp/issues/detail?id=530
# 'darwin': ['make', 'ninja', 'xcode', 'xcode-ninja'],
"darwin": ["make", "ninja", "xcode"],
}[sys.platform]
gyp_options = []
for option in args.gyp_option:
gyp_options += ["-G", option]
runner = Runner(format_list, tests, gyp_options, args.verbose)
runner.run()
if not args.quiet:
runner.print_results()
if runner.failures:
return 1
else:
return 0
def print_configuration_info():
print("Test configuration:")
if sys.platform == "darwin":
sys.path.append(os.path.abspath("test/lib"))
import TestMac
print(" Mac %s %s" % (platform.mac_ver()[0], platform.mac_ver()[2]))
print(" Xcode %s" % TestMac.Xcode.Version())
elif sys.platform == "win32":
sys.path.append(os.path.abspath("pylib"))
import gyp.MSVSVersion
print(" Win %s %s\n" % platform.win32_ver()[0:2])
print(" MSVS %s" % gyp.MSVSVersion.SelectVisualStudioVersion().Description())
elif sys.platform in ("linux", "linux2"):
print(" Linux %s" % " ".join(platform.linux_distribution()))
print(" Python %s" % platform.python_version())
print(" PYTHONPATH=%s" % os.environ["PYTHONPATH"])
print()
class Runner(object):
def __init__(self, formats, tests, gyp_options, verbose):
self.formats = formats
self.tests = tests
self.verbose = verbose
self.gyp_options = gyp_options
self.failures = []
self.num_tests = len(formats) * len(tests)
num_digits = len(str(self.num_tests))
self.fmt_str = "[%%%dd/%%%dd] (%%s) %%s" % (num_digits, num_digits)
self.isatty = sys.stdout.isatty() and not self.verbose
self.env = os.environ.copy()
self.hpos = 0
def run(self):
run_start = time.time()
i = 1
for fmt in self.formats:
for test in self.tests:
self.run_test(test, fmt, i)
i += 1
if self.isatty:
self.erase_current_line()
self.took = time.time() - run_start
def run_test(self, test, fmt, i):
if self.isatty:
self.erase_current_line()
msg = self.fmt_str % (i, self.num_tests, fmt, test)
self.print_(msg)
start = time.time()
cmd = [sys.executable, test] + self.gyp_options
self.env["TESTGYP_FORMAT"] = fmt
proc = subprocess.Popen(
cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT, env=self.env
)
proc.wait()
took = time.time() - start
stdout = proc.stdout.read().decode("utf8")
if proc.returncode == 2:
res = "skipped"
elif proc.returncode:
res = "failed"
self.failures.append("(%s) %s" % (test, fmt))
else:
res = "passed"
res_msg = " %s %.3fs" % (res, took)
self.print_(res_msg)
if (
stdout
and not stdout.endswith("PASSED\n")
and not (stdout.endswith("NO RESULT\n"))
):
print()
for l in stdout.splitlines():
print(" %s" % l)
elif not self.isatty:
print()
def print_(self, msg):
print(msg, end="")
index = msg.rfind("\n")
if index == -1:
self.hpos += len(msg)
else:
self.hpos = len(msg) - index
sys.stdout.flush()
def erase_current_line(self):
print("\b" * self.hpos + " " * self.hpos + "\b" * self.hpos, end="")
sys.stdout.flush()
self.hpos = 0
def print_results(self):
num_failures = len(self.failures)
if num_failures:
print()
if num_failures == 1:
print("Failed the following test:")
else:
print("Failed the following %d tests:" % num_failures)
print("\t" + "\n\t".join(sorted(self.failures)))
print()
print(
"Ran %d tests in %.3fs, %d failed."
% (self.num_tests, self.took, num_failures)
)
print()
if __name__ == "__main__":
sys.exit(main())
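A few hypothetical invocations of the runner, derived from the argparse options defined above (test paths are illustrative):

    python test_gyp.py -a                    # run every gyptest-*.py under test/
    python test_gyp.py -a -l                 # list the tests that would run, then exit
    python test_gyp.py -f make,ninja -G foo=bar test/gyptest-example.py

Each test is executed once per format, with the format passed to the child process via the TESTGYP_FORMAT environment variable, as run_test() shows.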

View file

@ -16,8 +16,8 @@ import sys
def ParseTarget(target):
target, _, suffix = target.partition("#")
filename, _, target = target.partition(":")
return filename, target, suffix
@ -25,7 +25,7 @@ def LoadEdges(filename, targets):
"""Load the edges map from the dump file, and filter it to only
show targets in |targets| and their dependents."""
file = open("dump.json")
edges = json.load(file)
file.close()
@ -52,9 +52,9 @@ def WriteGraph(edges):
build_file, target_name, toolset = ParseTarget(src)
files[build_file].append(src)
print("digraph D {")
print(" fontsize=8") # Used by subgraphs.
print(" node [fontsize=8]")
# Output nodes by file. We must first write out each node within
# its file grouping before writing out any edges that may refer
@ -65,8 +65,9 @@ def WriteGraph(edges):
# the display by making it a box without an internal node.
target = targets[0]
build_file, target_name, toolset = ParseTarget(target)
print(' "%s" [shape=box, label="%s\\n%s"]' % (target, filename,
target_name))
print(
' "%s" [shape=box, label="%s\\n%s"]' % (target, filename, target_name)
)
else:
# Group multiple nodes together in a subgraph.
print(' subgraph "cluster_%s" {' % filename)
@ -74,7 +75,7 @@ def WriteGraph(edges):
for target in targets:
build_file, target_name, toolset = ParseTarget(target)
print(' "%s" [label="%s"]' % (target, target_name))
print(" }")
# Now that we've placed all the nodes within subgraphs, output all
# the edges between nodes.
@ -82,21 +83,21 @@ def WriteGraph(edges):
for dst in dsts:
print(' "%s" -> "%s"' % (src, dst))
print("}")
def main():
if len(sys.argv) < 2:
print(__doc__, file=sys.stderr)
print(file=sys.stderr)
print("usage: %s target1 target2..." % (sys.argv[0]), file=sys.stderr)
return 1
edges = LoadEdges("dump.json", sys.argv[1:])
WriteGraph(edges)
return 0
if __name__ == "__main__":
sys.exit(main())
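A usage sketch (assuming dump.json already exists in the working directory; it is presumably produced by gyp's dump_dependency_json generator, which this diff does not show):

    python gyp/tools/graphviz.py some/file.gyp:my_target | dot -Tpng > deps.png

ParseTarget() accepts the fully qualified "file:target#toolset" form, and the output is a DOT digraph, so piping it through graphviz's dot is the natural next step.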

View file

@ -13,7 +13,7 @@ import re
# Regex to remove comments when we're counting braces.
COMMENT_RE = re.compile(r"\s*#.*")
# Regex to remove quoted strings when we're counting braces.
# It takes into account quoted quotes, and makes sure that the quotes match.
@ -24,25 +24,27 @@ QUOTE_RE = re.compile(QUOTE_RE_STR)
def comment_replace(matchobj):
return matchobj.group(1) + matchobj.group(2) + "#" * len(matchobj.group(3))
def mask_comments(input):
"""Mask the quoted strings so we skip braces inside quoted strings."""
search_re = re.compile(r"(.*?)(#)(.*)")
return [search_re.sub(comment_replace, line) for line in input]
def quote_replace(matchobj):
return "%s%s%s%s" % (matchobj.group(1),
return "%s%s%s%s" % (
matchobj.group(1),
matchobj.group(2),
"x" * len(matchobj.group(3)),
matchobj.group(2),
)
def mask_quotes(input):
"""Mask the quoted strings so we skip braces inside quoted strings."""
search_re = re.compile(r"(.*?)" + QUOTE_RE_STR)
return [search_re.sub(quote_replace, line) for line in input]
@ -53,11 +55,11 @@ def do_split(input, masked_input, search_re):
m = search_re.match(masked_line)
while m:
split = len(m.group(1))
line = line[:split] + r"\n" + line[split:]
masked_line = masked_line[:split] + r"\n" + masked_line[split:]
m = search_re.match(masked_line)
output.extend(line.split(r"\n"))
mask_output.extend(masked_line.split(r"\n"))
return (output, mask_output)
@ -70,8 +72,8 @@ def split_double_braces(input):
that the indentation looks prettier when all laid out (e.g. closing
braces make a nice diagonal line).
"""
double_open_brace_re = re.compile(r"(.*?[\[\{\(,])(\s*)([\[\{\(])")
double_close_brace_re = re.compile(r"(.*?[\]\}\)],?)(\s*)([\]\}\)])")
masked_input = mask_quotes(input)
masked_input = mask_comments(masked_input)
@ -87,11 +89,11 @@ def count_braces(line):
It starts at zero and subtracts for closed braces, and adds for open braces.
"""
open_braces = ["[", "(", "{"]
close_braces = ["]", ")", "}"]
closing_prefix_re = re.compile(r"(.*?[^\s\]\}\)]+.*?)([\]\}\)],?)\s*$")
cnt = 0
stripline = COMMENT_RE.sub(r"", line)
stripline = QUOTE_RE.sub(r"''", stripline)
for char in stripline:
for brace in open_braces:
@ -118,12 +120,11 @@ def prettyprint_input(lines):
"""Does the main work of indenting the input based on the brace counts."""
indent = 0
basic_offset = 2
last_line = ""
for line in lines:
if COMMENT_RE.match(line):
print(line)
else:
line = line.strip("\r\n\t ") # Otherwise doesn't strip \r on Unix.
if len(line) > 0:
(brace_diff, after) = count_braces(line)
if brace_diff != 0:
@ -137,7 +138,6 @@ def prettyprint_input(lines):
print(" " * (basic_offset * indent) + line)
else:
print("")
def main():
@ -153,5 +153,5 @@ def main():
return 0
if __name__ == "__main__":
sys.exit(main())
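An illustrative before/after for the brace-splitting step above (hypothetical input lines):

    'targets': [{        becomes        'targets': [
                                        {
    }],                  becomes        }
                                        ],

split_double_braces() puts each brace on its own line so that prettyprint_input() can give every brace its own indentation level (the diagonal line of closing braces mentioned in its docstring).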

View file

@ -19,7 +19,8 @@ import re
import sys
import pretty_vcproj
__author__ = "nsylvain (Nicolas Sylvain)"
def BuildProject(project, built, projects, deps):
# if all dependencies are done, we can build it, otherwise we try to build the
@ -31,6 +32,7 @@ def BuildProject(project, built, projects, deps):
print(project)
built.append(project)
def ParseSolution(solution_file):
# All projects, their clsid and paths.
projects = dict()
@ -40,17 +42,18 @@ def ParseSolution(solution_file):
# Regular expressions that matches the SLN format.
# The first line of a project definition.
begin_project = re.compile(
r'^Project\("{8BC9CEB8-8B4A-11D0-8D11-00A0C91BC942'
r'}"\) = "(.*)", "(.*)", "(.*)"$'
)
# The last line of a project definition.
end_project = re.compile("^EndProject$")
# The first line of a dependency list.
begin_dep = re.compile(r"ProjectSection\(ProjectDependencies\) = postProject$")
# The last line of a dependency list.
end_dep = re.compile("EndProjectSection$")
# A line describing a dependency.
dep_line = re.compile(" *({.*}) = ({.*})$")
in_deps = False
solution = open(solution_file)
@ -58,13 +61,15 @@ def ParseSolution(solution_file):
results = begin_project.search(line)
if results:
# Hack to remove icu because the diff is too different.
if results.group(1).find("icu") != -1:
continue
# We remove "_gyp" from the names because it helps to diff them.
current_project = results.group(1).replace("_gyp", "")
projects[current_project] = [
results.group(2).replace("_gyp", ""),
results.group(3),
results.group(2),
]
dependencies[current_project] = []
continue
@ -101,6 +106,7 @@ def ParseSolution(solution_file):
return (projects, dependencies)
def PrintDependencies(projects, deps):
print("---------------------------------------")
print("Dependencies for all projects")
@ -117,6 +123,7 @@ def PrintDependencies(projects, deps):
print("-- --")
def PrintBuildOrder(projects, deps):
print("---------------------------------------")
print("Build order ")
@ -130,6 +137,7 @@ def PrintBuildOrder(projects, deps):
print("-- --")
def PrintVCProj(projects):
for project in projects:
@ -141,17 +149,20 @@ def PrintVCProj(projects):
print("-------------------------------------")
print("-------------------------------------")
project_path = os.path.abspath(
os.path.join(os.path.dirname(sys.argv[1]), projects[project][2])
)
pretty = pretty_vcproj
argv = [
"",
project_path,
"$(SolutionDir)=%s\\" % os.path.dirname(sys.argv[1]),
]
argv.extend(sys.argv[3:])
pretty.main(argv)
def main():
# check if we have exactly 1 parameter.
if len(sys.argv) < 2:
@ -162,10 +173,10 @@ def main():
PrintDependencies(projects, deps)
PrintBuildOrder(projects, deps)
if "--recursive" in sys.argv:
PrintVCProj(projects)
return 0
if __name__ == "__main__":
sys.exit(main())
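Usage sketch, matching the argument checks in main() above (the solution path is illustrative):

    python gyp/tools/pretty_sln.py c:\src\chrome\chrome.sln --recursive

This prints the dependency table and build order; with --recursive it also feeds each project's .vcproj through pretty_vcproj, passing the $(SolutionDir) replacement built in PrintVCProj().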

View file

@ -20,20 +20,23 @@ import sys
from xml.dom.minidom import parse
from xml.dom.minidom import Node
__author__ = "nsylvain (Nicolas Sylvain)"
try:
cmp
except NameError:
def cmp(x, y):
return (x > y) - (x < y)
REPLACEMENTS = dict()
ARGUMENTS = None
class CmpTuple(object):
"""Compare function between 2 tuple."""
def __call__(self, x, y):
return cmp(x[0], y[0])
@ -69,7 +72,7 @@ class CmpNode(object):
def PrettyPrintNode(node, indent=0):
if node.nodeType == Node.TEXT_NODE:
if node.data.strip():
print("%s%s" % (" " * indent, node.data.strip()))
return
if node.childNodes:
@ -81,36 +84,35 @@ def PrettyPrintNode(node, indent=0):
# Print the main tag
if attr_count == 0:
print('%s<%s>' % (' '*indent, node.nodeName))
print("%s<%s>" % (" " * indent, node.nodeName))
else:
print('%s<%s' % (' '*indent, node.nodeName))
print("%s<%s" % (" " * indent, node.nodeName))
all_attributes = []
for (name, value) in node.attributes.items():
all_attributes.append((name, value))
all_attributes.sort(CmpTuple())
for (name, value) in all_attributes:
print('%s %s="%s"' % (' '*indent, name, value))
print('%s>' % (' '*indent))
print('%s %s="%s"' % (" " * indent, name, value))
print("%s>" % (" " * indent))
if node.nodeValue:
print("%s %s" % (" " * indent, node.nodeValue))
for sub_node in node.childNodes:
PrettyPrintNode(sub_node, indent=indent + 2)
print('%s</%s>' % (' '*indent, node.nodeName))
print("%s</%s>" % (" " * indent, node.nodeName))
def FlattenFilter(node):
"""Returns a list of all the node and sub nodes."""
node_list = []
if node.attributes and node.getAttribute("Name") == "_excluded_files":
# We don't add the "_excluded_files" filter.
return []
for current in node.childNodes:
if current.nodeName == "Filter":
node_list.extend(FlattenFilter(current))
else:
node_list.append(current)
@ -125,8 +127,8 @@ def FixFilenames(filenames, current_directory):
for key in REPLACEMENTS:
filename = filename.replace(key, REPLACEMENTS[key])
os.chdir(current_directory)
filename = filename.strip('"\' ')
if filename.startswith('$'):
filename = filename.strip("\"' ")
if filename.startswith("$"):
new_list.append(filename)
else:
new_list.append(os.path.abspath(filename))
@ -137,14 +139,18 @@ def AbsoluteNode(node):
"""Makes all the properties we know about in this node absolute."""
if node.attributes:
for (name, value) in node.attributes.items():
if name in [
"InheritedPropertySheets",
"RelativePath",
"AdditionalIncludeDirectories",
"IntermediateDirectory",
"OutputDirectory",
"AdditionalLibraryDirectories",
]:
# We want to fix up these paths
path_list = value.split(";")
new_list = FixFilenames(path_list, os.path.dirname(ARGUMENTS[1]))
node.setAttribute(name, ";".join(new_list))
if not value:
node.removeAttribute(name)
@ -166,12 +172,12 @@ def CleanupVcproj(node):
# remove the dups.
if node.attributes:
for (name, value) in node.attributes.items():
sorted_list = sorted(value.split(";"))
unique_list = []
for i in sorted_list:
if not unique_list.count(i):
unique_list.append(i)
node.setAttribute(name, ";".join(unique_list))
if not value:
node.removeAttribute(name)
@ -187,23 +193,22 @@ def CleanupVcproj(node):
# If the child is a filter, we want to append all its children
# to this same list.
if current.nodeName == "Filter":
node_array.extend(FlattenFilter(current))
else:
node_array.append(current)
# Sort the list.
node_array.sort(CmpNode())
# Insert the nodes in the correct order.
for new_node in node_array:
# But don't append empty tool node.
if new_node.nodeName == "Tool":
if new_node.attributes and new_node.attributes.length == 1:
# This one was empty.
continue
if new_node.nodeName == "UserMacro":
continue
node.appendChild(new_node)
@ -223,10 +228,11 @@ def GetConfiguationNodes(vcproj):
def GetChildrenVsprops(filename):
dom = parse(filename)
if dom.documentElement.attributes:
vsprops = dom.documentElement.getAttribute("InheritedPropertySheets")
return FixFilenames(vsprops.split(";"), os.path.dirname(filename))
return []
def SeekToNode(node1, child2):
# A text node does not have properties.
if child2.nodeType == Node.TEXT_NODE:
@ -256,21 +262,21 @@ def MergeAttributes(node1, node2):
for (name, value2) in node2.attributes.items():
# Don't merge the 'Name' attribute.
if name == "Name":
continue
value1 = node1.getAttribute(name)
if value1:
# The attribute exist in the main node. If it's equal, we leave it
# untouched, otherwise we concatenate it.
if value1 != value2:
node1.setAttribute(name, ";".join([value1, value2]))
else:
# The attribute does not exist in the main node. We append this one.
node1.setAttribute(name, value2)
# If the attribute was a property sheet attributes, we remove it, since
# they are useless.
if name == "InheritedPropertySheets":
node1.removeAttribute(name)
@ -291,13 +297,15 @@ def main(argv):
# check if we have exactly 1 parameter.
if len(argv) < 2:
print('Usage: %s "c:\\path\\to\\vcproj.vcproj" [key1=value1] '
'[key2=value2]' % argv[0])
print(
'Usage: %s "c:\\path\\to\\vcproj.vcproj" [key1=value1] '
"[key2=value2]" % argv[0]
)
return 1
# Parse the keys
for i in range(2, len(argv)):
(key, value) = argv[i].split("=")
REPLACEMENTS[key] = value
# Open the vcproj and parse the xml.
@ -307,11 +315,12 @@ def main(argv):
# with the vsprops they include.
for configuration_node in GetConfiguationNodes(dom.documentElement):
# Get the property sheets associated with this configuration.
vsprops = configuration_node.getAttribute("InheritedPropertySheets")
# Fix the filenames to be absolute.
vsprops_list = FixFilenames(
vsprops.strip().split(";"), os.path.dirname(argv[1])
)
# Extend the list of vsprops with all vsprops contained in the current
# vsprops.
@ -320,8 +329,7 @@ def main(argv):
# Now that we have all the vsprops, we need to merge them.
for current_vsprops in vsprops_list:
MergeProperties(configuration_node, parse(current_vsprops).documentElement)
# Now that everything is merged, we need to cleanup the xml.
CleanupVcproj(dom.documentElement)
@ -333,5 +341,5 @@ def main(argv):
return 0
if __name__ == "__main__":
sys.exit(main(sys.argv))
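Usage sketch, following the Usage string printed in main() (key=value pairs populate REPLACEMENTS and are substituted into paths by FixFilenames(); the paths are illustrative):

    python gyp/tools/pretty_vcproj.py "c:\path\to\project.vcproj" "$(SolutionDir)=c:\path\to\"

This mirrors how pretty_sln.py drives it in-process above, via the argv list it assembles before calling pretty_vcproj.main().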