{% set version = "1.1.0" %}

package:
  name: imagesize
  version: {{ version }}

source:
  url: https://pypi.io/packages/source/i/imagesize/imagesize-{{ version }}.tar.gz
  sha256: f3832918bc3c66617f92e35f5d70729187676313caa60c187eb0f28b8fe5e3b5

build:
  noarch: python
  number: 0
  script: python -m pip install --no-deps --ignore-installed .

requirements:
  host:
    - python
    - pip
  run:
    - python

test:
  imports:
    - imagesize

about:
  home: https://github.com/shibukawa/imagesize_py
  license: MIT
  summary: 'Getting image size from png/jpeg/jpeg2000/gif file'
  description: |
    This module analyzes jpeg/jpeg2000/png/gif image headers and
    returns the image size.
  dev_url: https://github.com/shibukawa/imagesize_py
  doc_url: https://pypi.python.org/pypi/imagesize
  doc_source_url: https://github.com/shibukawa/imagesize_py/blob/master/README.rst
All sections are optional except for package/name and package/version.

Headers must appear only once. If they appear multiple times, only the last is remembered. For example, the package: header should appear only once in the file.
Package section
Specifies package information.
Package name
The lower case name of the package. It may contain "-", but no
spaces.
package:
  name: bsdiff4
Package version
The version number of the package. Use the PEP-386 verlib
conventions. Cannot contain "-". YAML interprets version numbers
such as 1.0 as floats, meaning that 0.10 will be the same as 0.1.
To avoid this, put the version number in quotes so that it is
interpreted as a string.
package:
  version: "1.1.4"
Post-build versioning: In some cases, you may not know the
version, build number, or build string of the package until after
it is built. In these cases, you can perform
Templating with Jinja or utilize Git environment variables and
Inherited environment variables.
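For illustration, a hedged sketch using conda-build's Git environment variables (available when the source comes from a git repository; the package name is hypothetical, and the exact values depend on your tags):

package:
  name: mypkg                          # hypothetical package name
  version: "{{ GIT_DESCRIBE_TAG }}"    # taken from the most recent git tag at build time

build:
  number: {{ GIT_DESCRIBE_NUMBER }}    # number of commits since that tag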
Source section
Specifies where the source code of the package is coming from.
The source may come from a tarball file, git, hg, or svn. It may
be a local path and it may contain patches.
Source from tarball or zip archive
source:
  url: https://pypi.python.org/packages/source/b/bsdiff4/bsdiff4-1.1.4.tar.gz
  md5: 29f6089290505fc1a852e176bd276c43
  sha1: f0a2c9a30073449cfb7d171c57552f3109d93894
  sha256: 5a022ff4c1d1de87232b1c70bde50afbb98212fd246be4a867d8737173cf1f8f
If an extracted archive contains only 1 folder at its top level, its contents
will be moved 1 level up, so that the extracted package contents sit in the
root of the work folder.
Source from git
The git_url can also be a relative path to the recipe directory.
source:
  git_url: https://github.com/ilanschnell/bsdiff4.git
  git_rev: 1.1.4
  git_depth: 1 # (Defaults to -1/not shallow)
The depth argument relates to the ability to perform a shallow clone. A shallow clone means that you only download part of the history from Git. If you know that you only need the most recent changes, you can say git_depth: 1, which is faster than cloning the entire repo.
The downside to setting it to 1 is that, unless the tag is on that specific commit, you won't have that tag when you go to reference it in git_rev (for example). If your git_depth is insufficient to capture the tag in git_rev, you'll encounter an error. So in the example above, unless 1.1.4 is the very head commit and the one that you're going to grab, you may encounter an error.
Source from hg
source:
  hg_url: ssh://hg@bitbucket.org/ilanschnell/bsdiff4
  hg_tag: 1.1.4
Source from svn
source:
  svn_url: https://github.com/ilanschnell/bsdiff
  svn_rev: 1.1.4 # (defaults to head)
  svn_ignore_externals: True # (defaults to False)
  svn_username: username # Optional, if set must also have svn_password
  svn_password: password # Optional, if set must also have svn_username
To access a restricted SVN repository, specify both svn_username and svn_password.
Caution
Storing credentials in plaintext carries risks. Alternatively, consider
using environment variables:
source:
  svn_username: {{ environ["SVN_USERNAME"] }}
  svn_password: {{ environ["SVN_PASSWORD"] }}
Source from a local path
If the path is relative, it is taken relative to the recipe
directory. The source is copied to the work directory before
building.
source:
  path: ../src
If the local path is a git or svn repository, you get the
corresponding environment variables defined in your build
environment. The only practical difference between git_url or
hg_url and path as source arguments is that git_url and hg_url
would be clones of a repository, while path would be a copy of
the repository. Using path allows you to build packages with
unstaged and uncommitted changes in the working directory.
git_url can build only up to the latest commit.
Patches
Patches may optionally be applied to the source.
source:
  #[source information here]
  patches:
    - my.patch # the patch file is expected to be found in the recipe
Conda-build automatically determines the patch strip level.
Destination path
Within conda-build's work directory, you may specify a particular folder to
place source into. This feature is new in conda-build 3.0. Conda-build will
always drop you into the same folder (build folder/work), but it's up to you
whether you want your source extracted into that folder, or nested deeper. This
feature is particularly useful when dealing with multiple sources, but can apply
to recipes with single sources as well.
source:
  #[source information here]
  folder: my-destination/folder
Filename
The filename key is fn. It was formerly required with URL source types. It is not required now. If the fn key is provided, the file is saved on disk with that name. If the fn key is not provided, the file is saved on disk with a name matching the last part of the URL. For example, http://www.something.com/myfile.zip has an implicit filename of myfile.zip. Users may change this by manually specifying fn.
source:
  url: http://www.something.com/myfile.zip
  fn: otherfilename.zip
Source from multiple sources
Some software is most easily built by aggregating several pieces. For this,
conda-build 3.0 has added support for arbitrarily specifying many sources.
The syntax is a list of source dictionaries. Each member of this list
follows the same rules as the single source for earlier conda-build versions
(listed above). All features for each member are supported.
Example:
source:
  - url: https://package1.com/a.tar.bz2
    folder: stuff
  - url: https://package1.com/b.tar.bz2
    folder: stuff
  - git_url: https://github.com/conda/conda-build
    folder: conda-build
Here, the two URL tarballs will go into one folder, and the git repo
is checked out into its own space. Git will not clone into a non-empty folder.
Dashes denote list items in YAML syntax.
Build section
Specifies build information.
Each field that expects a path can also handle a glob pattern. The matching is
performed from the top of the build environment, so to match files inside
your project you can use a pattern similar to the following one:
"**/myproject/**/*.txt". This pattern will match any .txt file found in
your project.
The quotation marks ("") are required for patterns that start with a *.
Recursive globbing using ** is supported only in conda-build >= 3.0.
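As a hedged illustration, such a pattern could be used with any path-accepting key in the build section; for example, with has_prefix_files (the project layout here is hypothetical):

build:
  has_prefix_files:
    - "**/myproject/**/*.txt"   # quoted because the pattern starts with *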
Build number and string
The build number should be incremented for new builds of the same version. The number defaults to 0. The build string cannot contain "-". The string defaults to the default conda-build string plus the build number.
build:
  number: 1
  string: abc
A hash will appear when the package is affected by one or more variables from the conda_build_config.yaml file. The hash is made up from the "used" variables - if anything is used, you have a hash. If you don't use these variables, then you won't have a hash. There are a few special cases that do not affect the hash, such as Python and R or anything that already had a place in the build string.
The build hash will be added to the build string if these are true for any dependency:
- the package is an explicit dependency in build, host, or run deps
- the package has a matching entry in conda_build_config.yaml which is a pin to a specific version, not a lower bound
- that package is not ignored by ignore_version
- the package uses the {{ compiler() }} jinja2 function
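For illustration, a minimal sketch under assumptions: suppose a recipe lists a hypothetical dependency libfoo in its host section, and conda_build_config.yaml pins it to a specific version. That pin counts as a "used" variable, so the resulting build string gains a hash:

# conda_build_config.yaml (hypothetical)
libfoo:
  - 1.2.3

# meta.yaml fragment (hypothetical)
requirements:
  host:
    - libfoo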
You can also influence which variables are considered for the hash with:
build:
  force_use_keys:
    - package_1
  force_ignore_keys:
    - package_2
This will ensure that the value of package_2 will not be considered for the hash, and package_1 will be, regardless of what conda-build discovers is used by its inspection. This may be useful to further split complex multi-output builds, to ensure each package is built, or to ensure the right package hash when using more complex templating or scripting.
Python entry points
The following example creates a Python entry point named "bsdiff4" that calls bsdiff4.cli.main_bsdiff4().
build:
  entry_points:
    - bsdiff4 = bsdiff4.cli:main_bsdiff4
    - bspatch4 = bsdiff4.cli:main_bspatch4
Python.app
If osx_is_app is set, entry points use python.app instead of Python on macOS. The default is False.
build:
  osx_is_app: True
Track features
Adding track_features to one or more of the options will cause conda to de-prioritize it or "weigh it down." The lowest priority package is the one that would cause the most track_features to be activated in the environment. The default package among many variants is the one that would cause the least track_features to be activated.
No two packages in a given subdir should ever have the same track_feature.
build:
  track_features:
    - feature2
Preserve Python egg directory
This is needed for some packages that use features specific to setuptools. The default is False.
build:
  preserve_egg_dir: True
Skip compiling some .py files into .pyc files
Some packages ship .py files that cannot be compiled, such as those that contain templates. Some packages also ship .py files that should not be compiled yet, because the Python interpreter that will be used is not known at build time. In these cases, conda-build can skip attempting to compile these files. The patterns used in this section do not need the ** to handle recursive paths.
build:
  skip_compile_pyc:
    - "*/templates/*.py"           # These should not (and cannot) be compiled
    - "*/share/plugins/gdb/*.py"   # The python embedded into gdb is unknown
No link
A list of globs for files that should always be copied and never
soft linked or hard linked.
build:
  no_link:
    - bin/*.py # Don't link any .py files in bin/
Script
Used instead of build.sh or bld.bat. For short build scripts, this can be more convenient. You may need to use selectors to use different scripts for different platforms.
build:
  script: python setup.py install --single-version-externally-managed --record=record.txt
RPATHs
Set which RPATHs are used when making executables relocatable on Linux. This is a Linux feature that is ignored on other systems. The default is lib/.
build:
  rpaths:
    - lib/
    - lib/R/lib/
Force files
Force files to always be included, even if they are already in
the environment from the build dependencies. This may be needed,
for example, to create a recipe for conda itself.
build:
  always_include_files:
    - bin/file1
    - bin/file2
Relocation
Advanced features. You can use the following 4 keys to control the relocatability of files from the build environment to the installation environment:
- binary_relocation
- has_prefix_files
- binary_has_prefix_files
- ignore_prefix_files
For more information, see Making packages relocatable.
Binary relocation
Whether binary files should be made relocatable using install_name_tool on macOS or patchelf on Linux. The default is True. It also accepts False, which indicates no relocation for any files, or a list of files, which indicates relocation only for listed files.
build:
  binary_relocation: False
Detect binary files with prefix
Binary files may contain the build prefix and need it replaced with the install prefix at installation time. Conda can automatically identify and register such files. The default is True.
The default changed from False to True in conda-build 2.0. Setting this to False means that binary relocation (RPATH replacement) will still be done, but hard-coded prefixes in binaries will not be replaced. Prefixes in text files will still be replaced.
build:
  detect_binary_files_with_prefix: False
Windows handles binary prefix replacement very differently from Unix-like systems such as macOS and Linux. At this time, we are unaware of any executable or library that uses hardcoded embedded paths for locating other libraries or program data on Windows. Instead, Windows follows DLL search path rules or more natively supports relocatability using relative paths. Because of this, conda ignores most prefixes. However, pip creates executables for Python entry points that do use embedded paths on Windows. Conda-build thus detects prefixes in all files and records them by default. If you are getting errors about path length on Windows, you should try to disable detect_binary_files_with_prefix. Newer versions of conda, such as recent 4.2.x series releases and up, should have no problems here, but earlier versions of conda do erroneously try to apply any binary prefix replacement.
Binary has prefix files
By default, conda-build tries to detect prefixes in all files.
You may also elect to specify files with binary prefixes
individually. This allows you to specify the type of file as
binary, when it may be incorrectly detected as text for some
reason. Binary files are those containing NULL bytes.
build:
  binary_has_prefix_files:
    - bin/binaryfile1
    - lib/binaryfile2
Text files with prefix files
Text files---files containing no NULL bytes---may contain the build prefix and need it replaced with the install prefix at installation time. Conda will automatically register such files. Binary files that contain the build prefix are generally handled differently---see Binary has prefix files---but there may be cases where such a binary file needs to be treated as an ordinary text file, in which case it needs to be identified here.
build:
  has_prefix_files:
    - bin/file1
    - lib/file2
Ignore prefix files
Used to exclude some or all of the files in the build recipe from
the list of files that have the build prefix replaced with the
install prefix.
To ignore all files in the build recipe, use:
build:
  ignore_prefix_files: True
To specify individual filenames, use:
build:
  ignore_prefix_files:
    - file1
This setting is independent of RPATH replacement. Use the
Detect binary files with prefix setting to control that behavior.
Skipping builds
Specifies whether conda-build should skip the build of this recipe. Particularly useful for defining recipes that are platform-specific. The default is False.
build:
  skip: True # [not win]
Architecture independent packages
Allows you to specify "no architecture" when building a package, thus making it compatible with all platforms and architectures. Noarch packages can be installed on any platform.
Starting with conda-build 2.1 and conda 4.3, there is a new syntax that supports different languages. Assigning the noarch key as generic tells conda to not try any manipulation of the contents.
build:
  noarch: generic

noarch: generic is most useful for packages such as static javascript assets and source archives. For pure Python packages that can run on any Python version, you can use the noarch: python value instead:
build:
  noarch: python
The legacy noarch_python syntax is still valid and should be used when you need to be certain that your package will be installable where conda 4.3 is not yet available. All other forms of noarch packages require conda >=4.3 to install.
build:
  noarch_python: True
Warning
At the time of this writing, noarch packages should not make use of preprocess-selectors: noarch packages are built with the directives which evaluate to True on the platform where they are built, which will probably result in incorrect/incomplete installation on other platforms.
Include build recipe
The full conda-build recipe and rendered meta.yaml file is included in the package metadata by default. You can disable this with:
build:
  include_recipe: False
Use environment variables
Normally the build script in build.sh or bld.bat does not pass through environment variables from the command line. Only environment variables documented in Environment variables are seen by the build script. To "white-list" environment variables that should be passed through to the build script:
build:
  script_env:
    - MYVAR
    - ANOTHER_VAR
If a listed environment variable is missing from the environment
seen by the conda-build process itself, a UserWarning is
emitted during the build process and the variable remains
undefined.
Additionally, values can be set by including = followed by the desired value:
build:
  script_env:
    - MY_VAR=some value
Inheriting environment variables can make it difficult for others to reproduce binaries from source with your recipe. Use this feature with caution or explicitly set values using the = syntax.
If you split your build and test phases with --no-test and --test, you need to ensure that the environment variables present at build time and test time match. If you do not, the package hashes may use different values, and your package may not be testable because the hashes will differ.
Export runtime requirements
Some build or host requirements will impose a runtime requirement. Most commonly this is true for shared libraries (e.g. libpng), which are required for linking at build time and for resolving the link at run time. With run_exports (new in conda-build 3) such a runtime requirement can be implicitly added by host requirements (e.g. libpng exports libpng) and, with run_exports/strong, even by build requirements (e.g. GCC exports libgcc).
# meta.yaml of libpng
build:
  run_exports:
    - libpng
Here, because no specific kind of run_exports is specified, libpng's run_exports are considered "weak." This means they apply only when libpng is in the host section, in which case they add their export to the run section. If libpng were listed in the build section, the run_exports would not apply to the run section.
# meta.yaml of gcc compiler
build:
  run_exports:
    strong:
      - libgcc
Strong run_exports are used for things like runtimes, where the same runtime needs to be present in the host and the run environment, and exactly which runtime that should be is determined by what's present in the build section. This mechanism is how we line up appropriate software on Windows, where we must match MSVC versions used across all of the shared libraries in an environment.
# meta.yaml of some package using gcc and libpng
requirements:
  build:
    - gcc          # has a strong run export
  host:
    - libpng       # has a (weak) run export
    # - libgcc    <-- implicitly added by gcc
  run:
    # - libgcc    <-- implicitly added by gcc
    # - libpng    <-- implicitly added by libpng
You can express version constraints directly, or use any of the Jinja2 helper
functions listed at Extra Jinja2 functions.
For example, you may use Pinning expressions to obtain flexible version
pinning relative to versions present at build time:
build:
  run_exports:
    - {{ pin_subpackage('libpng', max_pin='x.x') }}
With this example, if libpng were version 1.6.34, this pinning expression would evaluate to >=1.6.34,<1.7.
If build and link dependencies need to impose constraints on the run environment but not necessarily pull in additional packages, then this can be done by altering the Run_constrained entries. In addition to weak/strong run_exports, which add to the run requirements, weak_constrains and strong_constrains add to the run_constrained requirements. With these, e.g., minimum versions of compatible but not required packages (like optional plugins for the linked dependency, or certain system attributes) can be expressed:
requirements:
  build:
    - build-tool           # has a strong run_constrained export
  host:
    - link-dependency      # has a weak run_constrained export
  run:
  run_constrained:
    # - system-dependency >=min   <-- implicitly added by build-tool
    # - optional-plugin >=min     <-- implicitly added by link-dependency
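For reference, a hedged sketch of how such constrained exports might be declared in the upstream recipes, using the hypothetical package names from the example above:

# meta.yaml of build-tool (hypothetical)
build:
  run_exports:
    strong_constrains:
      - system-dependency >=min

# meta.yaml of link-dependency (hypothetical)
build:
  run_exports:
    weak_constrains:
      - optional-plugin >=min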
Note that run_exports can be specified both in the build section and on a per-output basis for split packages.
run_exports only affects directly named dependencies. For example, if you have a metapackage that includes a compiler that lists run_exports, you also need to define run_exports in the metapackage so that it takes effect when people install your metapackage. This is important because, if run_exports affected transitive dependencies, you would see many added dependencies to shared libraries where they are not actually direct dependencies. For example, Python uses bzip2, which can use run_exports to make sure that people use a compatible build of bzip2. If people list python as a build time dependency, bzip2 should only be imposed for Python itself and should not be automatically imposed as a runtime dependency for the thing using Python.
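As a hedged sketch of the metapackage case described above (the metapackage itself is hypothetical), the metapackage repeats the compiler's run_exports so that they take effect for anyone installing the metapackage:

# meta.yaml of a hypothetical compiler metapackage
build:
  run_exports:
    strong:
      - libgcc    # repeated here because run_exports is not inherited from gcc
requirements:
  run:
    - gcc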
The potential downside of this feature is that it takes some control over constraints away from downstream users. If an upstream package has a problematic run_exports constraint, you can ignore it in your recipe by listing the upstream package name in the build/ignore_run_exports section:
build:
  ignore_run_exports:
    - libstdc++
You can also list the package the run_exports constraint is coming from using the build/ignore_run_exports_from section:
build:
  ignore_run_exports_from:
    - {{ compiler('cxx') }}
Pin runtime dependencies
The pin_depends build key can be used to enforce pinning behavior on the output recipe or built package. There are 2 possible behaviors:
build:
  pin_depends: record
With a value of record, conda-build will record all requirements exactly as they would be installed, in a file called info/requires. These pins will not show up in the output of conda render and they will not affect the actual run dependencies of the output package. It only adds this new file.
build:
  pin_depends: strict
With a value of strict, conda-build applies the pins to the actual metadata. This does affect the output of conda render and also affects the end result of the build. The package dependencies will be strictly pinned down to the build string level. This will supersede any dynamic or compatible pinning that conda-build may otherwise be doing.
Ignoring files in overlinking/overdepending checks
The overlinking_ignore_patterns key in the build section can be used to ignore patterns of files for the overlinking and overdepending checks. This is sometimes useful to speed up builds that have many files (large repackage jobs) or builds where you know only a small fraction of the files should be checked. Glob patterns are allowed here, but mind your quoting, especially with leading wildcards. Use this sparingly, as the overlinking checks generally do prevent you from making mistakes.
build:
  overlinking_ignore_patterns:
    - "bin/*"
Whitelisting shared libraries
The missing_dso_whitelist build key is a list of globs for dynamic shared object (DSO) files that should be ignored when examining linkage information.
During the post-build phase, the shared libraries in the newly created package are examined for linkages which are not provided by the package's requirements or a predefined list of system libraries. If such libraries are detected, either a warning (--no-error-overlinking) or an error (--error-overlinking) will result.
build:
  missing_dso_whitelist:
This key allows additions to the list of allowed libraries.
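A hedged illustration, assuming a package that deliberately links against the system CUDA driver library, which conda does not provide:

build:
  missing_dso_whitelist:
    - "*/libcuda.so*"   # hypothetical: supplied by the system driver, not by a conda package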
The runpath_whitelist build key is a list of globs for paths which are allowed to appear as runpaths in the package's shared libraries. All other runpaths will cause a warning message to be printed during the build.
build:
  runpath_whitelist:
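A hedged example, assuming your libraries deliberately embed a runpath under a hypothetical /opt/mylib prefix:

build:
  runpath_whitelist:
    - /opt/mylib/lib    # hypothetical runpath allowed without a warning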
Requirements section
Specifies the build and runtime requirements. Dependencies of
these requirements are included automatically.
Versions for requirements must follow the conda match
specification. See Package match specifications.
Build
Tools required to build the package. These packages are run on the build system and include things such as revision control systems (Git, SVN), make tools (GNU make, Autotool, CMake), compilers (real cross, pseudo-cross, or native when not cross-compiling), and any source pre-processors.
Packages which provide "sysroot" files, like the CDT packages (see below), also belong in the build section.
requirements:
  build:
    - git
    - cmake
Host
This section was added in conda-build 3.0. It represents packages that need to
be specific to the target platform when the target platform is not necessarily
the same as the native build platform. For example, in order for a recipe to be
"cross-capable", shared libraries requirements must be listed in the host
section, rather than the build section, so that the shared libraries that get
linked are ones for the target platform, rather than the native build platform.
You should also include the base interpreter for packages that need one. In other words, a Python package would list python here, and an R package would list mro-base or r-base.
requirements:
  build:
    - {{ compiler('c') }}
    - {{ cdt('xorg-x11-proto-devel') }} # [linux]
  host:
    - python
When both build and host sections are defined, the build section can be
thought of as "build tools" - things that run on the native platform, but output
results for the target platform. For example, a cross-compiler that runs on
linux-64, but targets linux-armv7.
The PREFIX environment variable points to the host prefix. With respect to
activation during builds, both the host and build environments are activated.
The build prefix is activated after the host prefix so that the build prefix,
which always contains native executables for the running platform, has priority
over the host prefix, which is not guaranteed to provide native executables (e.g.
when cross-compiling).
As of conda-build 3.1.4, the build and host prefixes are always separate when both are defined, or when {{ compiler() }} Jinja2 functions are used. The only time that build and host are merged is when the host section is absent and no {{ compiler() }} Jinja2 functions are used in meta.yaml. Because these are separate, you may see some build failures when migrating your recipes. For example, let's say you have a recipe to build a Python extension. If you add the compiler Jinja2 functions to the build section but do not move your Python dependency from the build section to the host section, your recipe will fail. It will fail because the host environment is where new files are detected, but because you have Python only in the build environment, your extension will be installed into the build environment and no files will be detected. Also, variables such as PYTHON will not be defined when Python is not installed into the host environment.
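For illustration, a minimal sketch of the corrected placement described above; the build/host split follows the text, while the exact package list is an assumption:

requirements:
  build:
    - {{ compiler('c') }}   # build tool: runs on the native build platform
  host:
    - python                # target-platform interpreter; new files are detected here
    - pip
  run:
    - python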
On Linux, using the compiler packages provided by Anaconda Inc. in the defaults meta-channel can prevent your build system from leaking into the built software by using our CDT (Core Dependency Tree) packages for any "system" dependencies. These packages are repackaged libraries and headers from CentOS6 and are unpacked into the sysroot of our pseudo-cross compilers, where they are found automatically.
Note that what qualifies as a "system" dependency is a matter of opinion. The
Anaconda Distribution chose not to provide X11 or GL packages, so we use CDT
packages for X11. Conda-forge chose to provide X11 and GL packages.
On macOS, you can also use {{ compiler() }} to get compiler packages provided by Anaconda Inc. in the defaults meta-channel. The environment variables MACOSX_DEPLOYMENT_TARGET and CONDA_BUILD_SYSROOT will be set appropriately by conda-build (see Environment variables). CONDA_BUILD_SYSROOT will specify a folder containing a macOS SDK. These settings achieve backwards compatibility while still providing access to C++14 and C++17. Note that conda-build will set CONDA_BUILD_SYSROOT by parsing the conda_build_config.yaml. For more details, see Anaconda compiler tools.
TL;DR: If you use {{ compiler() }} Jinja2 to utilize our new compilers, you must also move anything that is not strictly a build tool into your host dependencies. This includes Python, Python libraries, and any shared libraries that you need to link against in your build. Examples of build tools include any {{ compiler() }}, Make, Autoconf, Perl (for running scripts, not installing Perl software), and Python (for running scripts, not for installing software).
Run
Packages required to run the package. These are the dependencies
that are installed automatically whenever the package is
installed. Package names should follow the package match specifications.
requirements:
  run:
    - python
    - argparse # [py26]
    - six >=1.8.0
To build a recipe against different versions of NumPy and ensure that each version is part of the package dependencies, list numpy x.x as a requirement in meta.yaml and use conda-build with a NumPy version option such as --numpy 1.7.
The line in the meta.yaml file should literally say numpy x.x and should not have any numbers. If the meta.yaml file uses numpy x.x, it is required to use the --numpy option with conda-build.
requirements:
  run:
    - python
    - numpy x.x
Instead of manually specifying run requirements, since
conda-build 3 you can augment the packages used in your build and host
sections with run_exports which are then automatically
added to the run requirements for you.
Run_constrained
Packages that are optional at runtime but must obey the supplied additional constraint if they are installed.
Package names should follow the package match specifications.
requirements:
  run_constrained:
    - optional-subpackage =={{ version }}
For example, let's say we have an environment that has package "a" installed at
version 1.0. If we install package "b" that has a run_constrained entry of
"a>1.0", then conda would need to upgrade "a" in the environment in order to
install "b".
This is especially useful in the context of virtual packages, where the run_constrained dependency is not a package that conda manages, but rather a virtual package that represents a system property that conda can't change. For example, a package on linux may impose a run_constrained dependency on __glibc>=2.12. This is the version bound consistent with CentOS 6. Software built against glibc 2.12 will be compatible with CentOS 6. This run_constrained dependency helps conda tell the user that a given package can't be installed if their system glibc version is too old.
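A minimal sketch of such a virtual-package constraint, matching the __glibc example from the text:

requirements:
  run_constrained:
    - __glibc >=2.12    # virtual package: requires system glibc 2.12 or newer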
Test section
If this section exists, or if there is a run_test.[py,pl,sh,bat] file in the recipe, the package is installed into a test environment after the build is finished and the tests are run there.
Test files
Test files that are copied from the recipe into the temporary
test directory and are needed during testing. If providing a path,
forward slashes must be used.
test:
  files:
    - test-data.txt
Source files
Test files that are copied from the source work directory into
the temporary test directory and are needed during testing.
test:
  source_files:
    - test-data.txt
    - some/directory
    - some/directory/pattern*.sh
This capability was added in conda-build 2.0.
Test requirements
In addition to the runtime requirements, you can specify
requirements needed during testing. The runtime requirements that you specified
in the "run" section described above are automatically included during testing.