pax_global_header00006660000000000000000000000064142722507250014520gustar00rootroot0000000000000052 comment=e19e9a601bd1deeae983ef02aa3aa78171472d3f cppimport-22.08.02/000077500000000000000000000000001427225072500137705ustar00rootroot00000000000000cppimport-22.08.02/.coveragerc000066400000000000000000000000271427225072500161100ustar00rootroot00000000000000[run] source=cppimport cppimport-22.08.02/.github/000077500000000000000000000000001427225072500153305ustar00rootroot00000000000000cppimport-22.08.02/.github/workflows/000077500000000000000000000000001427225072500173655ustar00rootroot00000000000000cppimport-22.08.02/.github/workflows/test.yml000066400000000000000000000047571427225072500211040ustar00rootroot00000000000000name: Test on: pull_request: types: - opened - synchronize push: branches: - main tags: - '*' jobs: test: strategy: fail-fast: false matrix: os: ["ubuntu-latest", "macos-latest"] python-version: [ "3.7", "3.8", "3.9", "3.10"] name: Test (${{ matrix.python-version }}, ${{ matrix.os }}) runs-on: ${{ matrix.os }} defaults: run: shell: bash -l {0} steps: - uses: actions/checkout@v2 - uses: conda-incubator/setup-miniconda@v2 with: mamba-version: "*" channels: conda-forge activate-environment: cppimport environment-file: environment.yml python-version: ${{ matrix.python-version }} - name: Install cppimport run: | pip install --no-deps -e . - name: Lint with flake8 run: | flake8 . - name: Check formatting with black run: | black --check . - name: Check import ordering with isort run: | isort --check . - name: Test if: ${{ matrix.os == 'macos-latest' }} run: | CFLAGS='-stdlib=libc++' pytest --cov=./ --cov-report=xml - name: Test if: ${{ matrix.os != 'macos-latest' }} run: | pytest --cov=./ --cov-report=xml - name: Upload coverage to Codecov uses: codecov/codecov-action@v1 with: fail_ci_if_error: true build-and-publish: # based on https://packaging.python.org/en/latest/guides/publishing-package-distribution-releases-using-github-actions-ci-cd-workflows/ # also see: https://github.com/marketplace/actions/pypi-publish#advanced-release-management needs: test name: Build and publish Python distributions to PyPI and TestPyPI runs-on: ubuntu-latest steps: - uses: actions/checkout@v2 - uses: conda-incubator/setup-miniconda@v2 with: mamba-version: "*" channels: conda-forge activate-environment: cppimport environment-file: environment.yml python-version: ${{ matrix.python-version }} - name: Get history and tags for SCM versioning to work run: | git fetch --prune --unshallow git fetch --depth=1 origin +refs/tags/*:refs/tags/* - name: Build sdist run: >- python setup.py sdist - name: Publish distribution 📦 to PyPI if: startsWith(github.ref, 'refs/tags') uses: pypa/gh-action-pypi-publish@master with: password: ${{ secrets.PYPI_API_TOKEN }} verbose: true cppimport-22.08.02/.gitignore000066400000000000000000000002531427225072500157600ustar00rootroot00000000000000*.swp *.pyc *.so *.cppimporthash .rendered.* __pycache__ build cppimport.egg-info dist .cache .tox .mypy_cache/ .coverage htmlcov **/.DS_Store .eggs cppimport/_version.py cppimport-22.08.02/.pre-commit-config.yaml000066400000000000000000000005061427225072500202520ustar00rootroot00000000000000repos: - repo: https://github.com/psf/black rev: 20.8b1 hooks: - id: black language_version: python3 - repo: https://gitlab.com/pycqa/flake8 rev: 3.8.4 hooks: - id: flake8 - repo: https://github.com/pycqa/isort rev: 5.7.0 hooks: - id: isort name: isort (python) 
cppimport-22.08.02/CONTRIBUTING.md

# Contributing

When contributing to this repository, feel free to add an issue or pull request! There really aren't any rules, but if you're mean, I'll be sad.

I'm happy to collaborate on pull requests if you would like. There's no need to submit a perfect, finished product.

To install in development mode and run the tests:

```
git clone git@github.com:tbenthompson/cppimport.git
cd cppimport
conda env create
conda activate cppimport
pre-commit install
pip install --no-use-pep517 --disable-pip-version-check -e .
pytest
```

# Architecture

## Entrypoints

The main entrypoint for cppimport is the `cppimport.import_hook` module, which interfaces with the Python importing system to allow things like `import mycppfilename`. For a C++ file to be a valid import target, it needs to have the word `cppimport` in its first line. Without this first-line constraint, it is possible for the importing system to cause imports in other Python packages to fail. Before the first-line constraint was added, the cppimport import hook had the unfortunate consequence of breaking some scipy modules that had adjacent C and C++ files in the directory tree.

There is an alternative, more explicit interface provided by the `imp`, `imp_from_filepath` and `build` functions:

* `imp` does exactly what the import hook does, except via a function, so that instead of `import foomodule` we would write `foomodule = imp('foomodule')`.
* `imp_from_filepath` is even more explicit, allowing the user to pass a C++ filepath rather than a module name. For example, `foomodule = imp_from_filepath('../cppcodedir/foomodule.cpp')`. This is rarely necessary but can be handy for debugging.
* `build` is similar to `imp` except that the library is only built and not actually loaded as a Python module.

`imp`, `imp_from_filepath` and `build` live in `__init__.py` to separate the external-facing API from the guts of the package, which live in internal submodules.

## What happens when we import a C++ module?

1. First, the `cppimport.find.find_module_cpppath` function is used to find a C++ file that matches the desired module name.
2. Next, we determine if there's already an existing compiled extension that we can use. If there is, the `cppimport.importer.is_build_needed` function is used to determine if the extension is up to date with the current code. If the extension is up to date, we attempt to load it. If the extension is loaded successfully, we return the module and we're done! However, if for whatever reason we can't load an existing extension, we need to build the extension, a process directed by `cppimport.importer.template_and_build`.
3. The first step of building is to run the C++ file through the Mako templating system with the `cppimport.templating.run_templating` function. The main purpose of this is to allow users to embed configuration information into their C++ file. Without some such mechanism, there would be no way of passing information to the build system, because the `import modulename` statement can't carry information. The templating provides a secondary benefit in that simple code generation can be performed if needed. However, most users probably stick to a simple header or footer similar to the one demonstrated in the README.
4. Next, we use setuptools to build the C++ extension using `cppimport.build_module.build_module`.
This function calls setuptools with the appropriate arguments to build the extension in place next to the C++ file in the directory tree.
5. Next, we call `cppimport.checksum.checksum_save` to add a hash of the appended contents of all relevant source and header files. This checksum is appended to the end of the `.so` or `.dylib` file. This seems legal according to specifications and, in practice, causes no problems.
6. Finally, the compiled and loaded extension module is returned to the user.

## Useful links

* PEP 302 that made this possible: https://www.python.org/dev/peps/pep-0302/
* The gory details of Python importing: https://docs.python.org/3/reference/import.html

cppimport-22.08.02/LICENSE

The MIT License (MIT)

Copyright (c) 2021 T. Ben Thompson

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

cppimport-22.08.02/MANIFEST.in

include README.md
include VERSION
recursive-include tests *.py
recursive-include tests *.h
recursive-include tests *.cpp
recursive-include tests *.c

cppimport-22.08.02/README.md

# cppimport - Import C++ directly from Python!

## Contributing and architecture

See [CONTRIBUTING.md](CONTRIBUTING.md) for details on the internals of `cppimport` and how to get involved in development.

## Installation

Install with `pip install cppimport`.

## A quick example

Save the C++ code below as `somecode.cpp`.

```c++
// cppimport
#include <pybind11/pybind11.h>

namespace py = pybind11;

int square(int x) {
    return x * x;
}

PYBIND11_MODULE(somecode, m) {
    m.def("square", &square);
}
/*
<%
setup_pybind11(cfg)
%>
*/
```

Then open a Python interpreter and import the C++ extension:

```python
>>> import cppimport.import_hook
>>> import somecode  # This will pause for a moment to compile the module
>>> somecode.square(9)
81
```

Hurray, you've called some C++ code from Python using a combination of `cppimport` and [`pybind11`](https://github.com/pybind/pybind11).

I'm a big fan of the workflow that this enables, where you can edit both C++ files and Python and recompilation happens transparently! It's also handy for quickly whipping together an optimized version of a slow Python function.

## An explanation

Okay, now that I've hopefully convinced you of how exciting this is, let's get into the details of how to do this yourself.

First, the comment at the top is essential for opting in to cppimport. Don't forget this! (See below for an explanation of why this is necessary.)

```c++
// cppimport
```

The bulk of the file is a generic, simple [pybind11](https://github.com/pybind/pybind11) extension. We include the `pybind11` headers, then define a simple function that squares `x`, then export that function as part of a Python extension called `somecode`.

Finally, at the end of the file, there's a section I'll call the "configuration block":

```
<%
setup_pybind11(cfg)
%>
```

This region surrounded by `<%` and `%>` is a [Mako](https://www.makotemplates.org/) code block. The region is evaluated as Python code during the build process and provides configuration info like compiler and linker flags to the cppimport build system. Note that because of the Mako pre-processing, the comments around the configuration block may be omitted. Putting the configuration block at the end of the file, while optional, ensures that line numbers remain correct in compilation error messages.

## Building for production

In production deployments you usually don't want to ship a C/C++ compiler and all the sources, or compile at runtime. Therefore, a simple CLI utility for pre-compiling all source files is provided. This utility may, for example, be used in CI/CD pipelines.

Usage is as simple as

```commandline
python -m cppimport build
```

This will build all `*.c` and `*.cpp` files in the current directory (and its subdirectories) if they are eligible to be imported (i.e. contain the `// cppimport` comment in the first line).

Alternatively, you may specify one or more root directories or source files to be built:

```commandline
python -m cppimport build ./my/directory/ ./my/single/file.cpp
```

_Note: When specifying a path to a file, the header check (`// cppimport`) is skipped for that file._

### Fine-tuning for production

To further improve startup performance for production builds, you can opt in to skipping the checksum and compiled-binary existence checks during importing, by either setting the environment variable `CPPIMPORT_RELEASE_MODE` to `true` or setting the configuration from within Python:

```python
cppimport.settings['release_mode'] = True
```

**Warning:** Make sure to have all binaries pre-compiled when in release mode, as importing any missing ones will cause exceptions.
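As a minimal sketch of how these pieces fit together in a deployment (the `./src/` directory and `app.py` entrypoint are placeholders for your own layout, not part of cppimport):

```commandline
# CI/CD build step: pre-compile every eligible extension under ./src/
python -m cppimport build ./src/

# Production start-up: import the pre-built extensions with checksum checks skipped
CPPIMPORT_RELEASE_MODE=true python app.py
```

Inside `app.py`, the imports themselves look exactly like the quick example above: `import cppimport.import_hook` followed by importing your extension module.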
## Frequently asked questions

### What's actually going on?

Sometimes Python just isn't fast enough. Or you have existing code in a C or C++ library. So, you write a Python *extension module*, a library of compiled code. I recommend [pybind11](https://github.com/pybind/pybind11) for C++ to Python bindings or [cffi](https://cffi.readthedocs.io/en/latest/) for C to Python bindings. I've done this a lot over the years. But, I discovered that my productivity is slower when my development process goes from *Edit -> Test* in just Python to *Edit -> Compile -> Test* in Python plus C++. So, `cppimport` combines the process of compiling and importing an extension in Python so that you can just run `import foobar` and not have to worry about multiple steps. Internally, `cppimport` looks for a file `foobar.cpp`. Assuming one is found, it's run through the Mako templating system to gather compiler options, then it's compiled and loaded as an extension module.

### Does cppimport recompile every time a module is imported?

No! Compilation should only happen the first time the module is imported. The C++ source is compared with a checksum on each import to determine if any relevant file has changed. Additional dependencies (e.g. header files!) can be tracked by adding to the Mako header:

```python
cfg['dependencies'] = ['file1.h', 'file2.h']
```

The checksum is computed by simply appending the contents of the extension C++ file together with the files in `cfg['sources']` and `cfg['dependencies']`.

### How can I set compiler or linker args?

Standard distutils configuration options are valid:

```python
cfg['extra_link_args'] = ['...']
cfg['extra_compile_args'] = ['...']
cfg['libraries'] = ['...']
cfg['include_dirs'] = ['...']
```

For example, to use C++11, add:

```python
cfg['extra_compile_args'] = ['-std=c++11']
```

### How can I split my extension across multiple source files?

In the configuration block:

```python
cfg['sources'] = ['extra_source1.cpp', 'extra_source2.cpp']
```

### cppimport isn't doing what I want, can I get more verbose output?

`cppimport` uses the standard Python logging tools. Please add logging handlers to either the root logger or the `"cppimport"` logger. For example, to output all debug-level log messages:

```python
root_logger = logging.getLogger()
root_logger.setLevel(logging.DEBUG)

handler = logging.StreamHandler(sys.stdout)
handler.setLevel(logging.DEBUG)
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
handler.setFormatter(formatter)

root_logger.addHandler(handler)
```

### How can I force a rebuild even when the checksum matches?

Set:

```python
cppimport.settings['force_rebuild'] = True
```

And if this is a common occurrence, I would love to hear your use case and why the combination of the checksum, `cfg['dependencies']` and `cfg['sources']` is insufficient! Note that `force_rebuild` does not work when importing the module concurrently.

### Can I import my module concurrently?

It's safe to use `cppimport` to import a module concurrently using multiple threads, processes or even machines! Before building a module, `cppimport` obtains a lockfile preventing other processes from building it at the same time - this prevents clashes that can lead to failure. Other processes will wait up to 10 minutes for the first process to build the module and will then load it. If the module does not build within 10 minutes, the import will time out.
You can increase the timeout in the settings:

```python
cppimport.settings['lock_timeout'] = 10*60  # 10 mins
```

You should not use `force_rebuild` when importing concurrently.

### How can I get information about filepaths in the configuration block?

The module name is available as the `fullname` variable and the C++ module file is available as `filepath`. For example,

```
<%
module_dir = os.path.dirname(filepath)
%>
```

### How can I make compilation faster?

In single-file extensions, this is a fundamental issue with C++. Heavily templated code is often quite slow to compile.

If your extension has multiple source files using the `cfg['sources']` capability, then you might be hoping for some kind of incremental compilation. For the uninitiated, incremental compilation involves recompiling only those source files that have changed. Unfortunately this isn't possible, because cppimport is built on top of setuptools and distutils, and these standard library components do not support incremental compilation.

I recommend following the suggestions on [this SO answer](http://stackoverflow.com/questions/11013851/speeding-up-build-process-with-distutils). That is:

1. Use `ccache` to reduce the cost of rebuilds.
2. Enable parallel compilation. This can be done with `cfg['parallel'] = True` in the C++ file's configuration header.

As a further thought, if your extension has many source files and you're hoping to do incremental compiles, that probably indicates that you've outgrown `cppimport` and should consider using a more complete build system like CMake.

### Why does the import hook need "cppimport" on the first line of the .cpp file?

Modifying the Python import system is a global modification and thus affects all imports from any other package. As a result, when I first implemented `cppimport`, other packages (e.g. `scipy`) suddenly started breaking because import statements internal to those packages were importing C or C++ files instead of the modules they were intended to import. To avoid this failure mode, the import hook uses an "opt in" system where C and C++ files can specify they are meant to be used with cppimport by having a comment on the first line that includes the text "cppimport".

As an alternative to the import hook, you can use `imp` or `imp_from_filepath`. The `cppimport.imp` and `cppimport.imp_from_filepath` functions perform exactly the same operation as the import hook but in a slightly more explicit way:

```
foobar = cppimport.imp("foobar")
foobar = cppimport.imp_from_filepath("src/foobar.cpp")
```

By default, these explicit functions do not require the "cppimport" keyword on the first line of the C++ source file.

### Windows?

The CI system does not run on Windows. A PR adding further Windows support would be welcome. I've used `cppimport` with MinGW-w64 and Python 3.6 and had good success. I've also had reports that `cppimport` works on Windows with Python 3.6 and Visual C++ 2015 Build Tools. The main challenge is making sure that distutils is aware of your available compilers. Try out the suggestion [here](https://stackoverflow.com/questions/3297254/how-to-use-mingws-gcc-compiler-when-installing-python-package-using-pip).

## cppimport uses the MIT License

cppimport-22.08.02/cppimport/

cppimport-22.08.02/cppimport/__init__.py

"""
See CONTRIBUTING.md for a description of the project structure and the internal logic.
""" import ctypes import logging import os from cppimport.find import _check_first_line_contains_cppimport try: from ._version import version as __version__ from ._version import version_tuple except ImportError: __version__ = "unknown version" version_tuple = (0, 0, "unknown version") settings = dict( force_rebuild=False, # `force_rebuild` with multiple processes is not supported file_exts=[".cpp", ".c"], rtld_flags=ctypes.RTLD_LOCAL, lock_suffix=".lock", lock_timeout=10 * 60, remove_strict_prototypes=True, release_mode=os.getenv("CPPIMPORT_RELEASE_MODE", "0").lower() in ("true", "yes", "1"), ) _logger = logging.getLogger("cppimport") def imp(fullname, opt_in=False): """ `imp` is the explicit alternative to using cppimport.import_hook. Parameters ---------- fullname : the name of the module to import. opt_in : should we require C++ files to opt in via adding "cppimport" to the first line of the file? This is on by default for the import hook, but is off by default for this function since the intent to import a C++ module is clearly specified. Returns ------- module : the compiled and loaded Python extension module """ from cppimport.find import find_module_cpppath # Search through sys.path to find a file that matches the module filepath = find_module_cpppath(fullname, opt_in) return imp_from_filepath(filepath, fullname) def imp_from_filepath(filepath, fullname=None): """ `imp_from_filepath` serves the same purpose as `imp` except allows specifying the exact filepath of the C++ file. Parameters ---------- filepath : the filepath to the C++ file to build and import. fullname : the name of the module to import. This can be different from the module name inferred from the filepath if desired. Returns ------- module : the compiled and loaded Python extension module """ from cppimport.importer import ( build_safely, is_build_needed, load_module, setup_module_data, try_load, ) filepath = os.path.abspath(filepath) if fullname is None: fullname = os.path.splitext(os.path.basename(filepath))[0] module_data = setup_module_data(fullname, filepath) # The call to try_load is necessary here because there are times when the # only evidence a rebuild is needed comes from attempting to load an # existing extension module. For example, if the extension was built on a # different architecture or with different Python headers and will produce # an error when loaded, then the load will fail. In that situation, we will # need to rebuild. if is_build_needed(module_data) or not try_load(module_data): build_safely(filepath, module_data) load_module(module_data) return module_data["module"] def build(fullname): """ `build` builds a extension module like `imp` but does not import the extension. Parameters ---------- fullname : the name of the module to import. Returns ------- ext_path : the path to the compiled extension. """ from cppimport.find import find_module_cpppath # Search through sys.path to find a file that matches the module filepath = find_module_cpppath(fullname) return build_filepath(filepath, fullname=fullname) def build_filepath(filepath, fullname=None): """ `build_filepath` builds a extension module like `build` but allows to directly specify a file path. Parameters ---------- filepath : the filepath to the C++ file to build. fullname : the name of the module to build. Returns ------- ext_path : the path to the compiled extension. 
""" from cppimport.importer import ( build_safely, is_build_needed, load_module, setup_module_data, ) filepath = os.path.abspath(filepath) if fullname is None: fullname = os.path.splitext(os.path.basename(filepath))[0] module_data = setup_module_data(fullname, filepath) if is_build_needed(module_data): build_safely(filepath, module_data) load_module(module_data) # Return the path to the built module return module_data["ext_path"] def build_all(root_directory): """ `build_all` builds a extension module like `build` for each eligible (that is, containing the "cppimport" header) source file within the given `root_directory`. Parameters ---------- root_directory : the root directory to search for cpp source files in. """ for directory, _, files in os.walk(root_directory): for file in files: if ( not file.startswith(".") and os.path.splitext(file)[1] in settings["file_exts"] ): full_path = os.path.join(directory, file) if _check_first_line_contains_cppimport(full_path): _logger.info(f"Building: {full_path}") build_filepath(full_path) ######## BACKWARDS COMPATIBILITY ######### # Below here, we pay penance for mistakes. # TODO: Add DeprecationWarning """ For backwards compatibility, support this alias for the imp function """ cppimport = imp def force_rebuild(to=True): settings["force_rebuild"] = to def turn_off_strict_prototypes(): pass # turned off by default. def set_rtld_flags(flags): settings["rtld_flags"] = flags cppimport-22.08.02/cppimport/__main__.py000066400000000000000000000034301427225072500200770ustar00rootroot00000000000000import argparse import logging import os import sys from cppimport import build_all, build_filepath, settings def _run_from_commandline(raw_args): parser = argparse.ArgumentParser("cppimport") parser.add_argument( "--verbose", "-v", action="store_true", help="Increase log verbosity." ) parser.add_argument( "--quiet", "-q", action="store_true", help="Only print critical log messages." ) subparsers = parser.add_subparsers(dest="action", required=True) build_parser = subparsers.add_parser( "build", help="Build one or more cpp source files.", ) build_parser.add_argument( "root", help="The file or directory to build. If a directory is given, " "cppimport walks it recursively to build all eligible source " "files.", nargs="*", ) build_parser.add_argument( "--force", "-f", action="store_true", help="Force rebuild." ) args = parser.parse_args(raw_args[1:]) if args.quiet: logging.basicConfig(level=logging.CRITICAL) elif args.verbose: logging.basicConfig(level=logging.DEBUG) else: logging.basicConfig(level=logging.INFO) if args.action == "build": if args.force: settings["force_rebuild"] = True for path in args.root or ["."]: path = os.path.abspath(os.path.expandvars(path)) if os.path.isfile(path): build_filepath(path) elif os.path.isdir(path): build_all(path or os.getcwd()) else: raise FileNotFoundError( f'The given root path "{path}" could not be found.' 
) else: parser.print_usage() if __name__ == "__main__": _run_from_commandline(sys.argv) cppimport-22.08.02/cppimport/build_module.py000066400000000000000000000124411427225072500210250ustar00rootroot00000000000000import contextlib import distutils import distutils.sysconfig import io import logging import os import shutil import tempfile import setuptools import setuptools.command.build_ext import cppimport from cppimport.filepaths import make_absolute logger = logging.getLogger(__name__) def build_module(module_data): _handle_strict_prototypes() build_path = tempfile.mkdtemp() full_module_name = module_data["fullname"] filepath = module_data["filepath"] cfg = module_data["cfg"] module_data["abs_include_dirs"] = [ make_absolute(module_data["filedirname"], d) for d in cfg.get("include_dirs", []) ] + [os.path.dirname(filepath)] module_data["abs_library_dirs"] = [ make_absolute(module_data["filedirname"], d) for d in cfg.get("library_dirs", []) ] module_data["dependency_dirs"] = module_data["abs_include_dirs"] + [ module_data["filedirname"] ] module_data["extra_source_filepaths"] = [ make_absolute(module_data["filedirname"], s) for s in cfg.get("sources", []) ] ext = ImportCppExt( os.path.dirname(filepath), full_module_name, language="c++", sources=( module_data["extra_source_filepaths"] + [module_data["rendered_src_filepath"]] ), include_dirs=module_data["abs_include_dirs"], extra_compile_args=cfg.get("extra_compile_args", []), extra_link_args=cfg.get("extra_link_args", []), library_dirs=module_data["abs_library_dirs"], libraries=cfg.get("libraries", []), ) args = [ "build_ext", "--inplace", "--build-temp=" + build_path, "--build-lib=" + build_path, "-v", ] setuptools_args = dict( name=full_module_name, ext_modules=[ext], script_args=args, cmdclass={"build_ext": BuildImportCppExt}, ) # Monkey patch in the parallel compiler if requested. # TODO: this will still cause problems if there is multithreaded code # interacting with distutils. Ideally, we'd just subclass CCompiler # instead. if cfg.get("parallel"): old_compile = distutils.ccompiler.CCompiler.compile distutils.ccompiler.CCompiler.compile = _parallel_compile f = io.StringIO() with contextlib.redirect_stdout(f): with contextlib.redirect_stderr(f): setuptools.setup(**setuptools_args) logger.debug(f"Setuptools/compiler output: {f.getvalue()}") # Remove the parallel compiler to not corrupt the outside environment. if cfg.get("parallel"): distutils.ccompiler.CCompiler.compile = old_compile shutil.rmtree(build_path) def _handle_strict_prototypes(): if not cppimport.settings["remove_strict_prototypes"]: return cfg_vars = distutils.sysconfig.get_config_vars() for key, value in cfg_vars.items(): if type(value) == str: cfg_vars[key] = value.replace("-Wstrict-prototypes", "") class ImportCppExt(setuptools.Extension): """ Subclass setuptools.Extension to add self.libdest specifying where the shared library should be placed after being compiled with BuildImportCppExt. """ def __init__(self, libdest, *args, **kwargs): self.libdest = libdest setuptools.Extension.__init__(self, *args, **kwargs) class BuildImportCppExt(setuptools.command.build_ext.build_ext): """ Subclass setuptools build_ext to put the compiled shared library in the appropriate place in the source tree from the ImportCppExt.libdest value. 
""" def copy_extensions_to_source(self): for ext in self.extensions: fullname = self.get_ext_fullname(ext.name) filename = self.get_ext_filename(fullname) src_filename = os.path.join(self.build_lib, filename) dest_filename = os.path.join(ext.libdest, os.path.basename(filename)) distutils.file_util.copy_file( src_filename, dest_filename, verbose=self.verbose, dry_run=self.dry_run ) # Patch for parallel compilation with distutils # From: http://stackoverflow.com/questions/11013851/speeding-up-build-process-with-distutils # noqa: E501 def _parallel_compile( self, sources, output_dir=None, macros=None, include_dirs=None, debug=0, extra_preargs=None, extra_postargs=None, depends=None, ): # these lines are copied directly from distutils.ccompiler.CCompiler macros, objects, extra_postargs, pp_opts, build = self._setup_compile( output_dir, macros, include_dirs, sources, depends, extra_postargs ) cc_args = self._get_cc_args(pp_opts, debug, extra_preargs) # Determine the number of compilation threads. Unless there are special # circumstances, this is the number of cores on the machine N = 1 try: import multiprocessing import multiprocessing.pool N = multiprocessing.cpu_count() except (ImportError, NotImplementedError): pass def _single_compile(obj): try: src, ext = build[obj] except KeyError: return self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts) # imap is evaluated on demand, converting to list() forces execution list(multiprocessing.pool.ThreadPool(N).imap(_single_compile, objects)) return objects cppimport-22.08.02/cppimport/checksum.py000066400000000000000000000057111427225072500201650ustar00rootroot00000000000000import hashlib import json import logging import struct from cppimport.filepaths import make_absolute _TAG = b"cppimport" _FMT = struct.Struct("q" + str(len(_TAG)) + "s") logger = logging.getLogger(__name__) def is_checksum_valid(module_data): """ Load the saved checksum from the extension file check if it matches the checksum computed from current source files. """ deps, old_checksum = _load_checksum_trailer(module_data) if old_checksum is None: return False # Already logged error in load_checksum_trailer. try: return old_checksum == _calc_cur_checksum(deps, module_data) except OSError as e: logger.info( "Checksummed file not found while checking cppimport checksum " "(%s); rebuilding." % e ) return False def _load_checksum_trailer(module_data): try: with open(module_data["ext_path"], "rb") as f: f.seek(-_FMT.size, 2) json_len, tag = _FMT.unpack(f.read(_FMT.size)) if tag != _TAG: logger.info( "The extension is missing the trailer tag and thus is missing" " its checksum; rebuilding." ) return None, None f.seek(-(_FMT.size + json_len), 2) json_s = f.read(json_len) except FileNotFoundError: logger.info("Failed to find compiled extension; rebuilding.") return None, None except OSError: logger.info("Checksum trailer invalid. Rebuilding.") return None, None try: deps, old_checksum = json.loads(json_s) except ValueError: logger.info( "Failed to load checksum trailer info from already existing " "compiled extension; rebuilding." ) return None, None return deps, old_checksum def checksum_save(module_data): """ Calculate the module checksum and then write it to the end of the shared object. 
""" dep_filepaths = ( [ make_absolute(module_data["filedirname"], d) for d in module_data["cfg"].get("dependencies", []) ] + module_data["extra_source_filepaths"] + [module_data["filepath"]] ) cur_checksum = _calc_cur_checksum(dep_filepaths, module_data) _save_checksum_trailer(module_data, dep_filepaths, cur_checksum) def _save_checksum_trailer(module_data, dep_filepaths, cur_checksum): # We can just append the checksum to the shared object; this is effectively # legal (see e.g. https://stackoverflow.com/questions/10106447). dump = json.dumps([dep_filepaths, cur_checksum]).encode("ascii") dump += _FMT.pack(len(dump), _TAG) with open(module_data["ext_path"], "ab", buffering=0) as file: file.write(dump) def _calc_cur_checksum(file_lst, module_data): text = b"" for filepath in file_lst: with open(filepath, "rb") as f: text += f.read() return hashlib.md5(text).hexdigest() cppimport-22.08.02/cppimport/filepaths.py000066400000000000000000000002111427225072500203300ustar00rootroot00000000000000import os def make_absolute(this_dir, s): if os.path.isabs(s): return s else: return os.path.join(this_dir, s) cppimport-22.08.02/cppimport/find.py000066400000000000000000000044331427225072500173030ustar00rootroot00000000000000import logging import os import sys import cppimport from cppimport.filepaths import make_absolute logger = logging.getLogger(__name__) def find_module_cpppath(modulename, opt_in=False): filepath = _find_module_cpppath(modulename, opt_in) if filepath is None: raise ImportError( "Couldn't find a file matching the module name: " + str(modulename) + " (opt_in = " + str(opt_in) + ")" ) return filepath def _find_module_cpppath(modulename, opt_in=False): modulepath_without_ext = modulename.replace(".", os.sep) moduledir = os.path.dirname(modulepath_without_ext + ".throwaway") matching_dirs = _find_matching_path_dirs(moduledir) abs_matching_dirs = _make_dirs_absolute(matching_dirs) for ext in cppimport.settings["file_exts"]: modulefilename = os.path.basename(modulepath_without_ext + ext) outfilename = _find_file_in_folders(modulefilename, abs_matching_dirs, opt_in) if outfilename is not None: return outfilename return None def _make_dirs_absolute(dirs): out = [] for d in dirs: if d == "": d = os.getcwd() out.append(make_absolute(os.getcwd(), d)) return out def _find_matching_path_dirs(moduledir): if moduledir == "": return sys.path ds = [] for dir in sys.path: test_path = os.path.join(dir, moduledir) if os.path.exists(test_path) and os.path.isdir(test_path): ds.append(test_path) return ds def _find_file_in_folders(filename, paths, opt_in): for d in paths: if not os.path.exists(d): continue if os.path.isfile(d): continue for f in os.listdir(d): if f != filename: continue filepath = os.path.join(d, f) if opt_in and not _check_first_line_contains_cppimport(filepath): logger.debug( "Found file but the first line doesn't " "contain cppimport so it will be skipped: " + filepath ) continue return filepath return None def _check_first_line_contains_cppimport(filepath): with open(filepath, "r") as f: return "cppimport" in f.readline() cppimport-22.08.02/cppimport/import_hook.py000066400000000000000000000014711427225072500207140ustar00rootroot00000000000000import logging import sys import traceback import cppimport logger = logging.getLogger(__name__) class Hook(object): def __init__(self): self._running = False def find_spec(self, fullname, path, target=None): # Prevent re-entry by the underlying importer if self._running: return try: self._running = True cppimport.imp(fullname, opt_in=True) except 
ImportError: # ImportError should be quashed because that simply means cppimport # didn't find anything, and probably shouldn't have found anything! logger.debug(traceback.format_exc()) finally: self._running = False # Add the hook to the list of import handlers for Python. hook_obj = Hook() sys.meta_path.insert(0, hook_obj) cppimport-22.08.02/cppimport/importer.py000066400000000000000000000107521427225072500202250ustar00rootroot00000000000000import importlib import logging import os import sys import sysconfig from contextlib import suppress from time import sleep, time import filelock import cppimport from cppimport.build_module import build_module from cppimport.checksum import checksum_save, is_checksum_valid from cppimport.templating import run_templating logger = logging.getLogger(__name__) def build_safely(filepath, module_data): """Protect against race conditions when multiple processes executing `template_and_build`""" binary_path = module_data["ext_path"] lock_path = binary_path + cppimport.settings["lock_suffix"] def build_completed(): return os.path.exists(binary_path) and is_checksum_valid(module_data) t = time() # Race to obtain the lock and build. Other processes can wait while not build_completed() and time() - t < cppimport.settings["lock_timeout"]: try: with filelock.FileLock(lock_path, timeout=1): if build_completed(): break template_and_build(filepath, module_data) except filelock.Timeout: logging.debug(f"Could not obtain lock (pid {os.getpid()})") if cppimport.settings["force_rebuild"]: raise ValueError( "force_build must be False to build concurrently." "This process failed to claim a filelock indicating that" " a concurrent build is in progress" ) sleep(1) if os.path.exists(lock_path): with suppress(OSError): os.remove(lock_path) if not build_completed(): raise Exception( f"Could not compile binary as lock already taken and timed out." f" Try increasing the timeout setting if " f"the build time is longer (pid {os.getpid()})." ) def template_and_build(filepath, module_data): logger.debug(f"Compiling {filepath}.") run_templating(module_data) build_module(module_data) checksum_save(module_data) def setup_module_data(fullname, filepath): module_data = dict() module_data["fullname"] = fullname module_data["filepath"] = filepath module_data["filedirname"] = os.path.dirname(module_data["filepath"]) module_data["filebasename"] = os.path.basename(module_data["filepath"]) module_data["ext_name"] = get_module_name(fullname) + get_extension_suffix() module_data["ext_path"] = os.path.join( os.path.dirname(filepath), module_data["ext_name"] ) return module_data def get_module_name(full_module_name): return full_module_name.split(".")[-1] def get_extension_suffix(): ext_suffix = sysconfig.get_config_var("EXT_SUFFIX") if ext_suffix is None: ext_suffix = sysconfig.get_config_var("SO") return ext_suffix def _actually_load_module(module_data): module_data["module"] = importlib.import_module(module_data["fullname"]) def load_module(module_data): if hasattr(sys, "getdlopenflags"): # It can be useful to set rtld_flags to RTLD_GLOBAL. This allows # extensions that are loaded later to share the symbols from this # extension. This is primarily useful in a project where several # interdependent extensions are loaded but it's undesirable to combine # the multiple extensions into a single extension. 
old_flags = sys.getdlopenflags() new_flags = old_flags | cppimport.settings["rtld_flags"] sys.setdlopenflags(new_flags) _actually_load_module(module_data) sys.setdlopenflags(old_flags) else: _actually_load_module(module_data) def is_build_needed(module_data): if cppimport.settings["force_rebuild"]: return True if cppimport.settings["release_mode"]: logger.debug( f"Release mode is enabled. Thus, file {module_data['filepath']} is " f"not being compiled." ) return False if not is_checksum_valid(module_data): return True logger.debug(f"Matching checksum for {module_data['filepath']} --> not compiling") return False def try_load(module_data): """Try loading the module to test if it's not corrupt and for the correct architecture""" try: load_module(module_data) return True except ImportError as e: logger.info( f"ImportError during import with matching checksum: {e}. Trying to rebuild." ) with suppress(OSError): os.remove(module_data["fullname"]) return False cppimport-22.08.02/cppimport/templating.py000066400000000000000000000043171427225072500205300ustar00rootroot00000000000000import io import logging import os import mako.exceptions import mako.lookup import mako.runtime import mako.template logger = logging.getLogger(__name__) def run_templating(module_data): module_data["cfg"] = BuildArgs( sources=[], include_dirs=[], extra_compile_args=[], libraries=[], library_dirs=[], extra_link_args=[], dependencies=[], parallel=False, ) module_data["setup_pybind11"] = setup_pybind11 buf = io.StringIO() ctx = mako.runtime.Context(buf, **module_data) filepath = module_data["filepath"] try: template_dirs = [os.path.dirname(filepath)] lookup = mako.lookup.TemplateLookup(directories=template_dirs) tmpl = lookup.get_template(module_data["filebasename"]) tmpl.render_context(ctx) except: # noqa: E722 logger.exception(mako.exceptions.text_error_template().render()) raise rendered_src_filepath = get_rendered_source_filepath(filepath) with open(rendered_src_filepath, "w", newline="") as f: f.write(buf.getvalue()) module_data["rendered_src_filepath"] = rendered_src_filepath class BuildArgs(dict): """ This exists for backwards compatibility with old configuration key names. TODO: Add deprecation warnings to allow removing this sometime in the future. """ _key_mapping = { "compiler_args": "extra_compile_args", "linker_args": "extra_link_args", } def __getitem__(self, key): return super(BuildArgs, self).__getitem__(self._key_mapping.get(key, key)) def __setitem__(self, key, value): super(BuildArgs, self).__setitem__(self._key_mapping.get(key, key), value) def setup_pybind11(cfg): import pybind11 cfg["include_dirs"] += [pybind11.get_include(), pybind11.get_include(True)] # Prefix with c++11 arg instead of suffix so that if a user specifies c++14 # (or later!) then it won't be overridden. cfg["compiler_args"] = ["-std=c++11", "-fvisibility=hidden"] + cfg["compiler_args"] def get_rendered_source_filepath(filepath): dirname = os.path.dirname(filepath) filename = os.path.basename(filepath) return os.path.join(dirname, ".rendered." 
+ filename) cppimport-22.08.02/environment.yml000066400000000000000000000002631427225072500170600ustar00rootroot00000000000000name: cppimport channels: - conda-forge dependencies: - pybind11 - mako - black - regex>=2021 - flake8 - isort - pytest - pytest-cov - pre-commit - filelock cppimport-22.08.02/pyproject.toml000066400000000000000000000004521427225072500167050ustar00rootroot00000000000000[tool.isort] profile = "black" [tool.pytest.ini_options] addopts = "-s --tb=short" [build-system] requires = ["setuptools>=45", "wheel", "setuptools_scm[toml]>=6.2"] build-backend = "setuptools.build_meta" [tool.setuptools_scm] version_scheme = "post-release" write_to = "cppimport/_version.py" cppimport-22.08.02/release000066400000000000000000000005731427225072500153400ustar00rootroot00000000000000GIT: git commit -m "yy.mm.dd" git tag yy.mm.dd git push --atomic origin main yy.mm.dd wait for github action to complete create release on github SANITY TEST: open new terminal mamba create -n testenv python=3 pip conda activate testenv pip install --force-reinstall --no-cache cppimport cd tests python -c 'import cppimport; assert(cppimport.imp("mymodule").add(1,2) == 3);' cppimport-22.08.02/setup.cfg000066400000000000000000000001521427225072500156070ustar00rootroot00000000000000[flake8] max-line-length = 88 extend-ignore = E203, E266 per-file-ignores= cppimport/__init__.py:F401 cppimport-22.08.02/setup.py000066400000000000000000000017431427225072500155070ustar00rootroot00000000000000from setuptools import setup description = open("README.md").read() setup( use_scm_version={"version_scheme": "post-release"}, setup_requires=["setuptools_scm"], packages=["cppimport"], install_requires=["mako", "pybind11", "filelock"], zip_safe=False, name="cppimport", description="Import C++ files directly from Python!", long_description=description, long_description_content_type="text/markdown", url="https://github.com/tbenthompson/cppimport", author="T. 
Ben Thompson", author_email="t.ben.thompson@gmail.com", license="MIT", platforms=["any"], classifiers=[ "Development Status :: 5 - Production/Stable", "Intended Audience :: Developers", "Operating System :: OS Independent", "Operating System :: POSIX", "Topic :: Software Development", "License :: OSI Approved :: MIT License", "Programming Language :: Python :: 3", "Programming Language :: C++", ], ) cppimport-22.08.02/tests/000077500000000000000000000000001427225072500151325ustar00rootroot00000000000000cppimport-22.08.02/tests/apackage/000077500000000000000000000000001427225072500166665ustar00rootroot00000000000000cppimport-22.08.02/tests/apackage/__init__.py000066400000000000000000000000001427225072500207650ustar00rootroot00000000000000cppimport-22.08.02/tests/apackage/inner/000077500000000000000000000000001427225072500200015ustar00rootroot00000000000000cppimport-22.08.02/tests/apackage/inner/__init__.py000066400000000000000000000000001427225072500221000ustar00rootroot00000000000000cppimport-22.08.02/tests/apackage/inner/mymodule.cpp000066400000000000000000000006011427225072500223350ustar00rootroot00000000000000/* <% import pybind11 cfg['compiler_args'] = ['-std=c++11'] cfg['include_dirs'] = [pybind11.get_include(), pybind11.get_include(True)] %> */ #include namespace py = pybind11; int add(int i, int j) { return i + j; } PYBIND11_PLUGIN(mymodule) { pybind11::module m("mymodule", "auto-compiled c++ extension"); m.def("add", &add); return m.ptr(); } cppimport-22.08.02/tests/apackage/mymodule.cpp000066400000000000000000000006131427225072500212250ustar00rootroot00000000000000/* cppimport <% import pybind11 cfg['compiler_args'] = ['-std=c++11'] cfg['include_dirs'] = [pybind11.get_include(), pybind11.get_include(True)] %> */ #include namespace py = pybind11; int add(int i, int j) { return i + j; } PYBIND11_PLUGIN(mymodule) { pybind11::module m("mymodule", "auto-compiled c++ extension"); m.def("add", &add); return m.ptr(); } cppimport-22.08.02/tests/apackage/rel_import_tester.py000066400000000000000000000001001427225072500227710ustar00rootroot00000000000000from . import mymodule def f(): return mymodule.add(1, 2) cppimport-22.08.02/tests/cpp14module.cpp000066400000000000000000000005331427225072500177740ustar00rootroot00000000000000/* <% import pybind11 cfg['compiler_args'] = ['-std=c++14'] cfg['include_dirs'] = [pybind11.get_include(), pybind11.get_include(True)] %> */ #include namespace py = pybind11; // Use auto instead of int to check C++14 auto add(int i, int j) { return i + j; } PYBIND11_MODULE(cpp14module, m) { m.def("add", &add); } cppimport-22.08.02/tests/extra_sources.cpp000066400000000000000000000004351427225072500205260ustar00rootroot00000000000000<% setup_pybind11(cfg) cfg['sources'] = ['extra_sources1.cpp'] cfg['parallel'] = True %> #include int square(int x); int square_sum(int x, int y) { return square(x) + square(y); } PYBIND11_MODULE(extra_sources, m) { m.def("square_sum", &square_sum); } cppimport-22.08.02/tests/extra_sources1.cpp000066400000000000000000000000501427225072500206000ustar00rootroot00000000000000int square(int x) { return x * x; } cppimport-22.08.02/tests/free_module.cpp000066400000000000000000000001111427225072500201150ustar00rootroot00000000000000#include int main() { std::cout << "HI!" 
<< std::endl; } cppimport-22.08.02/tests/hook_test.cpp000066400000000000000000000002421427225072500176330ustar00rootroot00000000000000/*cppimport*/ <% setup_pybind11(cfg) %> #include PYBIND11_MODULE(hook_test, m) { m.def("sub", [] (int i, int j) { return i - j; } ); } cppimport-22.08.02/tests/mymodule.cpp000066400000000000000000000006441427225072500174750ustar00rootroot00000000000000/* <% setup_pybind11(cfg) cfg['dependencies'] = ['thing.h'] %> */ #include #include "thing.h" #include "thing2.h" namespace py = pybind11; int add(int i, int j) { return i + j; } PYBIND11_MODULE(mymodule, m) { m.def("add", &add); #ifdef THING_DEFINED #pragma message "stuff" py::class_(m, "Thing") .def(py::init<>()) .def("cheer", &Thing::cheer); #endif } cppimport-22.08.02/tests/raw_extension.c000066400000000000000000000016771427225072500201760ustar00rootroot00000000000000#include #if PY_MAJOR_VERSION >= 3 #define MOD_INIT(name) PyMODINIT_FUNC PyInit_##name(void) #define MOD_DEF(ob, name, doc, methods) \ static struct PyModuleDef moduledef = { \ PyModuleDef_HEAD_INIT, name, doc, -1, methods, }; \ ob = PyModule_Create(&moduledef); #define MOD_SUCCESS_VAL(val) val #else #define MOD_INIT(name) PyMODINIT_FUNC init##name(void) #define MOD_DEF(ob, name, doc, methods) \ ob = Py_InitModule3(name, methods, doc); #define MOD_SUCCESS_VAL(val) #endif static PyObject* add(PyObject* self, PyObject* args) { int a, b; int class = 1; if (!PyArg_ParseTuple(args, "ii", &a, &b)) { return NULL; } return Py_BuildValue("i", a + b); } static PyMethodDef methods[] = { {"add", add, METH_VARARGS, ""}, {NULL} }; MOD_INIT(raw_extension) { PyObject* m; MOD_DEF(m, "raw_extension", "", methods) return MOD_SUCCESS_VAL(m); } cppimport-22.08.02/tests/test_cppimport.py000066400000000000000000000136441427225072500205700ustar00rootroot00000000000000import contextlib import copy import logging import os import shutil import subprocess import sys from multiprocessing import Process from tempfile import TemporaryDirectory import cppimport import cppimport.build_module import cppimport.templating from cppimport.find import find_module_cpppath root_logger = logging.getLogger() root_logger.setLevel(logging.DEBUG) handler = logging.StreamHandler(sys.stdout) handler.setLevel(logging.DEBUG) formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s") handler.setFormatter(formatter) root_logger.addHandler(handler) @contextlib.contextmanager def appended(filename, text): with open(filename, "r") as f: orig = f.read() with open(filename, "a") as f: f.write(text) try: yield finally: with open(filename, "w") as f: f.write(orig) def subprocess_check(test_code, returncode=0): p = subprocess.run( ["python", "-c", test_code], cwd=os.path.dirname(__file__), stdout=subprocess.PIPE, stderr=subprocess.PIPE, ) if len(p.stdout) > 0: print(p.stdout.decode("utf-8")) if len(p.stderr) > 0: print(p.stderr.decode("utf-8")) assert p.returncode == returncode @contextlib.contextmanager def tmp_dir(files=None): """Create a temporary directory and copy `files` into it. 
`files` can also include directories.""" files = files if files else [] with TemporaryDirectory() as tmp_path: for f in files: if os.path.isdir(f): shutil.copytree(f, os.path.join(tmp_path, os.path.basename(f))) else: shutil.copyfile(f, os.path.join(tmp_path, os.path.basename(f))) yield tmp_path def test_find_module_cpppath(): mymodule_loc = find_module_cpppath("mymodule") mymodule_dir = os.path.dirname(mymodule_loc) assert os.path.basename(mymodule_loc) == "mymodule.cpp" apackage = find_module_cpppath("apackage.mymodule") apackage_correct = os.path.join(mymodule_dir, "apackage", "mymodule.cpp") assert apackage == apackage_correct inner = find_module_cpppath("apackage.inner.mymodule") inner_correct = os.path.join(mymodule_dir, "apackage", "inner", "mymodule.cpp") assert inner == inner_correct def test_get_rendered_source_filepath(): rendered_path = cppimport.templating.get_rendered_source_filepath("abc.cpp") assert rendered_path == ".rendered.abc.cpp" def module_tester(mod, cheer=False): assert mod.add(1, 2) == 3 if cheer: mod.Thing().cheer() def test_mymodule(): mymodule = cppimport.imp("mymodule") module_tester(mymodule) def test_mymodule_build(): cppimport.build("mymodule") import mymodule module_tester(mymodule) def test_mymodule_from_filepath(): mymodule = cppimport.imp_from_filepath("tests/mymodule.cpp") module_tester(mymodule) def test_package_mymodule(): mymodule = cppimport.imp("apackage.mymodule") module_tester(mymodule) def test_inner_package_mymodule(): mymodule = cppimport.imp("apackage.inner.mymodule") module_tester(mymodule) def test_with_file_in_syspath(): orig_sys_path = copy.copy(sys.path) sys.path.append(os.path.join(os.path.dirname(__file__), "mymodule.cpp")) cppimport.imp("mymodule") sys.path = orig_sys_path def test_rebuild_after_failed_compile(): cppimport.imp("mymodule") test_code = """ import cppimport; mymodule = cppimport.imp("mymodule");assert(mymodule.add(1,2) == 3) """ with appended("tests/mymodule.cpp", ";asdf;"): subprocess_check(test_code, 1) subprocess_check(test_code, 0) add_to_thing = """ #include struct Thing { void cheer() { std::cout << "WAHHOOOO" << std::endl; } }; #define THING_DEFINED """ def test_no_rebuild_if_no_deps_change(): cppimport.imp("mymodule") test_code = """ import cppimport; mymodule = cppimport.imp("mymodule"); assert(not hasattr(mymodule, 'Thing')) """ with appended("tests/thing2.h", add_to_thing): subprocess_check(test_code) def test_rebuild_header_after_change(): cppimport.imp("mymodule") test_code = """ import cppimport; mymodule = cppimport.imp("mymodule"); mymodule.Thing().cheer() """ with appended("tests/thing.h", add_to_thing): subprocess_check(test_code) assert open("tests/thing.h", "r").read() == "" def test_raw_extensions(): raw_extension = cppimport.imp("raw_extension") assert raw_extension.add(1, 2) == 3 def test_extra_sources_and_parallel(): cppimport.settings["force_rebuild"] = True mod = cppimport.imp("extra_sources") cppimport.settings["force_rebuild"] = False assert mod.square_sum(3, 4) == 25 def test_import_hook(): import cppimport.import_hook # Force rebuild to make sure we're not just reloading the already compiled # module from disk cppimport.force_rebuild(True) import hook_test cppimport.force_rebuild(False) assert hook_test.sub(3, 1) == 2 def test_submodule_import_hook(): import cppimport.import_hook # Force rebuild to make sure we're not just reloading the already compiled # module from disk cppimport.force_rebuild(True) import apackage.mymodule cppimport.force_rebuild(False) assert apackage.mymodule.add(3, 1) 
== 4 def test_relative_import(): import cppimport.import_hook cppimport.force_rebuild(True) from apackage.rel_import_tester import f cppimport.force_rebuild(False) print(f()) assert f() == 3 def test_multiple_processes(): with tmp_dir(["tests/hook_test.cpp"]) as tmp_path: test_code = f""" import os; os.chdir('{tmp_path}'); import cppimport.import_hook; import hook_test; """ processes = [ Process(target=subprocess_check, args=(test_code,)) for i in range(100) ] for p in processes: p.start() for p in processes: p.join() assert all(p.exitcode == 0 for p in processes) cppimport-22.08.02/tests/thing.h000066400000000000000000000000001427225072500164020ustar00rootroot00000000000000cppimport-22.08.02/tests/thing2.h000066400000000000000000000000011427225072500164650ustar00rootroot00000000000000