Add support to install collections from git repositories (#69154)

* Enable installing collections from git repositories

* Add tests for installing individual and multiple collections from git repositories

* Test to make sure recursive dependencies with different syntax are deduplicated

* Add documentation

* add a changelog

* Skip Python 2.6

* Only fail if no collections are located in a git repository

Add support for a 'type' key for collections in requirements.yml files.
Update the changelog and document the supported keys and allowed values for the type.

Add a note that the collection(s) in the repo must contain a galaxy.yml

* Add a warning about embedding credentials in SCM URLs

* Update with review suggestions

* suppress sanity compile failure for Python 2.6
Sloane Hertel 2020-05-29 13:33:32 -04:00 committed by GitHub
parent 225ae65b0f
commit e40889e711
27 changed files with 822 additions and 160 deletions


@ -0,0 +1,4 @@
minor_changes:
- ansible-galaxy - Allow installing collections from git repositories.
- ansible-galaxy - Requirement entries for collections now support a 'type' key to indicate whether the collection is a galaxy artifact, file, url, or git repo.
- ansible-galaxy - Support both 'galaxy.yml' and 'galaxy.yaml' files for collections.


@ -45,7 +45,7 @@ Collections follow a simple data structure. None of the directories are required
.. note::
- * Ansible only accepts ``.yml`` extensions for :file:`galaxy.yml`, and ``.md`` for the :file:`README` file and any files in the :file:`/docs` folder.
+ * Ansible only accepts ``.md`` extensions for the :file:`README` file and any files in the :file:`/docs` folder.
* See the `ansible-collections <https://github.com/ansible-collections/>`_ GitHub Org for examples of collection structure.
* Not all directories are currently in use. Those are placeholders for future features.
@ -343,6 +343,19 @@ installs the collection in the first path defined in :ref:`COLLECTIONS_PATHS`, w
Next, try using the local collection inside a playbook. For examples and more details see :ref:`Using collections <using_collections>`
.. _collections_scm_install:
Installing collections from a git repository
--------------------------------------------
You can also test a version of your collection in development by installing it from a git repository.
.. code-block:: bash
ansible-galaxy collection install git+https://github.com/org/repo.git,devel
.. include:: ../shared_snippets/installing_collections_git_repo.txt
.. _publishing_collections:
Publishing collections


@ -83,6 +83,10 @@ Downloading a collection for offline use
.. include:: ../shared_snippets/download_tarball_collections.txt
Installing a collection from a git repository
---------------------------------------------
.. include:: ../shared_snippets/installing_collections_git_repo.txt
Listing installed collections
-----------------------------
@ -302,6 +306,10 @@ Use the following example as a guide for specifying roles in *requirements.yml*:
scm: git
version: "0.1" # quoted, so YAML doesn't parse this as a floating-point value
.. warning::
Embedding credentials in an SCM URL is not secure. Use safer authentication options instead, such as `SSH <https://help.github.com/en/github/authenticating-to-github/connecting-to-github-with-ssh>`_, `netrc <https://linux.die.net/man/5/netrc>`_ or `http.extraHeader <https://git-scm.com/docs/git-config#Documentation/git-config.txt-httpextraHeader>`_/`url.<base>.pushInsteadOf <https://git-scm.com/docs/git-config#Documentation/git-config.txt-urlltbasegtpushInsteadOf>`_ in your Git configuration, so that your credentials are not exposed in logs.
Installing roles and collections from the same requirements.yml file
---------------------------------------------------------------------


@ -0,0 +1,84 @@
You can install a collection from a git repository by providing the URI of the repository instead of a collection name or a path to a ``tar.gz`` file. The collection must contain a ``galaxy.yml`` file, which is used to generate the collection artifact data from the directory. The URI should be prefixed with ``git+`` (or with ``git@`` to use a private repository with SSH authentication) and optionally supports a comma-separated `git commit-ish <https://git-scm.com/docs/gitglossary#def_commit-ish>`_ version (for example, a commit or tag).
.. warning::
Embedding credentials in a git URI is not secure. Use safer authentication options instead, such as `SSH <https://help.github.com/en/github/authenticating-to-github/connecting-to-github-with-ssh>`_, `netrc <https://linux.die.net/man/5/netrc>`_ or `http.extraHeader <https://git-scm.com/docs/git-config#Documentation/git-config.txt-httpextraHeader>`_/`url.<base>.pushInsteadOf <https://git-scm.com/docs/git-config#Documentation/git-config.txt-urlltbasegtpushInsteadOf>`_ in your Git configuration, so that your credentials are not exposed in logs.
.. code-block:: bash
# Install a collection in a repository using the latest commit on the branch 'devel'
ansible-galaxy collection install git+https://github.com/organization/repo_name.git,devel
# Install a collection from a private github repository
ansible-galaxy collection install git@github.com:organization/repo_name.git
# Install a collection from a local git repository
ansible-galaxy collection install git+file:///home/user/path/to/repo/.git
In a ``requirements.yml`` file, you can also use the ``type`` and ``version`` keys in addition to using the ``git+repo,version`` syntax for the collection name.
.. code-block:: yaml
collections:
- name: https://github.com/organization/repo_name.git
type: git
version: devel
Git repositories can be used for collection dependencies as well. This can be helpful for local development and testing, but built/published artifacts should only have dependencies on other artifacts.
.. code-block:: yaml
dependencies: {'git@github.com:organization/repo_name.git': 'devel'}
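For context, here is a minimal sketch of a hypothetical ``galaxy.yml`` that mixes a Galaxy dependency with a git dependency; the namespace, collection name, and repository URL are placeholders and not part of this change:
.. code-block:: yaml

# Hypothetical galaxy.yml; the git dependency is suitable for development and testing only.
namespace: my_namespace
name: my_collection
version: 1.0.0
readme: README.md
authors:
- Your Name <you@example.com>
dependencies: {'amazon.aws': '>=1.0.0', 'git@github.com:organization/repo_name.git': 'devel'}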
Default repository search locations
-----------------------------------
By default, two locations in the repository are searched for collections.
The first is a ``galaxy.yml`` file at the top level of the repository path. If that ``galaxy.yml`` file exists, it is used as the collection metadata and the individual collection is installed.
.. code-block:: text
├── galaxy.yml
├── plugins/
│   ├── lookup/
│   ├── modules/
│   └── module_utils/
└── README.md
The second is a ``galaxy.yml`` file in each directory in the repository path (one level deep). In this scenario, each directory with a ``galaxy.yml`` is installed as a collection.
.. code-block:: text
directory/
├── docs/
├── galaxy.yml
├── plugins/
│   ├── inventory/
│   └── modules/
└── roles/
Specifying the location to search for collections
-------------------------------------------------
If you have a different repository structure or only want to install a subset of collections, you can add a fragment to the end of your URI (before the optional comma-separated version) to indicate which path ansible-galaxy should inspect for ``galaxy.yml`` file(s). The fragment should point to a directory containing a collection, or to a directory of such collection directories, rather than to a ``galaxy.yml`` file itself.
.. code-block:: text
namespace/
└── name/
├── docs/
├── galaxy.yml
├── plugins/
│   ├── README.md
│   └── modules/
├── README.md
└── roles/
.. code-block:: bash
# Install all collections in a particular namespace
ansible-galaxy collection install git+https://github.com/organization/repo_name.git#/namespace/
# Install an individual collection using a specific commit
ansible-galaxy collection install git+https://github.com/organization/repo_name.git#/namespace/name/,7b60ddc245bc416b72d8ea6ed7b799885110f5e5
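The same fragment syntax can also be expressed as a ``requirements.yml`` entry. The following is a sketch only; it reuses the placeholder repository URL, fragment, and commit from the example above:
.. code-block:: yaml

collections:
- name: https://github.com/organization/repo_name.git#/namespace/name/
  type: git
  version: 7b60ddc245bc416b72d8ea6ed7b799885110f5e5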


@ -13,7 +13,11 @@ You can also setup a ``requirements.yml`` file to install multiple collections i
version: 'version range identifiers (default: ``*``)'
source: 'The Galaxy URL to pull the collection from (default: ``--api-server`` from cmdline)'
- The ``version`` key can take in the same range identifier format documented above.
+ The supported keys for collection requirement entries are ``name``, ``version``, ``source``, and ``type``.
The ``version`` key can take in the same range identifier format documented above. If you're installing a collection from a git repository instead of a built collection artifact, the ``version`` key refers to a `git commit-ish <https://git-scm.com/docs/gitglossary#def_commit-ish>`_.
The ``type`` key can be set to ``galaxy``, ``url``, ``file``, and ``git``. If ``type`` is omitted, the ``name`` key is used to implicitly determine the source of the collection.
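As an illustration only (the collection names, URL, and file path below are placeholders), a ``requirements.yml`` using each of the four types might look like:
.. code-block:: yaml

collections:
# resolved from a Galaxy server (also the default when 'type' is omitted and 'name' is namespace.name)
- name: my_namespace.my_collection
  version: '>=1.0.0'
  type: galaxy
# a locally built collection artifact
- name: /tmp/my_namespace-my_collection-1.0.0.tar.gz
  type: file
# a collection artifact served over HTTP(S)
- name: https://example.com/my_namespace-my_collection-1.0.0.tar.gz
  type: url
# a git repository; here 'version' is a git commit-ish
- name: https://github.com/organization/repo_name.git
  type: git
  version: devel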
Roles can also be specified and placed under the ``roles`` key. The values follow the same format as a requirements
file used in older Ansible releases.


@ -33,6 +33,11 @@ Installing an older version of a collection
.. include:: ../shared_snippets/installing_older_collection.txt
Installing a collection from a git repository
---------------------------------------------
.. include:: ../shared_snippets/installing_collections_git_repo.txt
.. _collection_requirements_file:
Install multiple collections with a requirements file


@ -518,6 +518,7 @@ class GalaxyCLI(CLI):
- name: namespace.collection
version: version identifier, multiple identifiers are separated by ','
source: the URL or a predefined source name that relates to C.GALAXY_SERVER_LIST
type: git|file|url|galaxy
:param requirements_file: The path to the requirements file.
:param allow_old_format: Will fail if a v1 requirements file is found and this is set to False.
@ -590,6 +591,10 @@ class GalaxyCLI(CLI):
if req_name is None:
raise AnsibleError("Collections requirement entry should contain the key name.")
req_type = collection_req.get('type')
if req_type not in ('file', 'galaxy', 'git', 'url', None):
raise AnsibleError("The collection requirement entry key 'type' must be one of file, galaxy, git, or url.")
req_version = collection_req.get('version', '*')
req_source = collection_req.get('source', None)
if req_source:
@ -601,9 +606,9 @@ class GalaxyCLI(CLI):
req_source,
validate_certs=not context.CLIARGS['ignore_certs']))
- requirements['collections'].append((req_name, req_version, req_source))
+ requirements['collections'].append((req_name, req_version, req_source, req_type))
else:
- requirements['collections'].append((collection_req, '*', None))
+ requirements['collections'].append((collection_req, '*', None, None))
return requirements
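To make the new four-element entry shape concrete, here is a hypothetical illustration (placeholder names and URL, not part of the commit) of what a parsed requirements file with one Galaxy collection and one git collection now produces:

# Illustrative only; each entry is (name, version, source, type),
# where source is a GalaxyAPI object when 'source' is set in the file, otherwise None.
requirements = {
    'collections': [
        ('my_namespace.my_collection', '>=1.0.0', None, None),
        ('https://github.com/organization/repo_name.git', 'devel', None, 'git'),
    ],
    'roles': [],
}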
@ -705,12 +710,13 @@ class GalaxyCLI(CLI):
for collection_input in collections:
requirement = None
if os.path.isfile(to_bytes(collection_input, errors='surrogate_or_strict')) or \
- urlparse(collection_input).scheme.lower() in ['http', 'https']:
+ urlparse(collection_input).scheme.lower() in ['http', 'https'] or \
collection_input.startswith(('git+', 'git@')):
# Arg is a file path or URL to a collection
name = collection_input
else:
name, dummy, requirement = collection_input.partition(':')
- requirements['collections'].append((name, requirement or '*', None))
+ requirements['collections'].append((name, requirement or '*', None, None))
return requirements
############################


@ -38,11 +38,13 @@ from ansible.module_utils import six
from ansible.module_utils._text import to_bytes, to_native, to_text
from ansible.utils.collection_loader import AnsibleCollectionRef
from ansible.utils.display import Display
from ansible.utils.galaxy import scm_archive_collection
from ansible.utils.hashing import secure_hash, secure_hash_s
from ansible.utils.version import SemanticVersion
from ansible.module_utils.urls import open_url
urlparse = six.moves.urllib.parse.urlparse
urldefrag = six.moves.urllib.parse.urldefrag
urllib_error = six.moves.urllib.error
@ -59,8 +61,7 @@ class CollectionRequirement:
def __init__(self, namespace, name, b_path, api, versions, requirement, force, parent=None, metadata=None,
files=None, skip=False, allow_pre_releases=False):
- """
- Represents a collection requirement, the versions that are available to be installed as well as any
+ """Represents a collection requirement, the versions that are available to be installed as well as any
dependencies the collection has.
:param namespace: The collection namespace.
@ -140,6 +141,45 @@ class CollectionRequirement:
return dependencies
@staticmethod
def artifact_info(b_path):
"""Load the manifest data from the MANIFEST.json and FILES.json. If the files exist, return a dict containing the keys 'files_file' and 'manifest_file'.
:param b_path: The directory of a collection.
"""
info = {}
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string." % to_native(b_file_path))
return info
@staticmethod
def galaxy_metadata(b_path):
"""Generate the manifest data from the galaxy.yml file.
If the galaxy.yml exists, return a dictionary containing the keys 'files_file' and 'manifest_file'.
:param b_path: The directory of a collection.
"""
b_galaxy_path = get_galaxy_metadata_path(b_path)
info = {}
if os.path.exists(b_galaxy_path):
collection_meta = _get_galaxy_yml(b_galaxy_path)
info['files_file'] = _build_files_manifest(b_path, collection_meta['namespace'], collection_meta['name'], collection_meta['build_ignore'])
info['manifest_file'] = _build_manifest(**collection_meta)
return info
@staticmethod
def collection_info(b_path, fallback_metadata=False):
info = CollectionRequirement.artifact_info(b_path)
if info or not fallback_metadata:
return info
return CollectionRequirement.galaxy_metadata(b_path)
def add_requirement(self, parent, requirement):
self.required_by.append((parent, requirement))
new_versions = set(v for v in self.versions if self._meets_requirements(v, requirement, parent))
@ -204,7 +244,13 @@ class CollectionRequirement:
if os.path.exists(b_collection_path):
shutil.rmtree(b_collection_path)
os.makedirs(b_collection_path)
if os.path.isfile(self.b_path):
self.install_artifact(b_collection_path, b_temp_path)
else:
self.install_scm(b_collection_path)
def install_artifact(self, b_collection_path, b_temp_path):
try:
with tarfile.open(self.b_path, mode='r') as collection_tar:
@ -235,6 +281,32 @@ class CollectionRequirement:
raise
def install_scm(self, b_collection_output_path):
"""Install the collection from source control into given dir.
Generates the Ansible collection artifact data from a galaxy.yml and installs the artifact to a directory.
This should follow the same pattern as build_collection, but instead of creating an artifact, install it.
:param b_collection_output_path: The installation directory for the collection artifact.
:raises AnsibleError: If no collection metadata found.
"""
b_collection_path = self.b_path
b_galaxy_path = get_galaxy_metadata_path(b_collection_path)
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
info = CollectionRequirement.galaxy_metadata(b_collection_path)
collection_manifest = info['manifest_file']
collection_meta = collection_manifest['collection_info']
file_manifest = info['files_file']
_build_collection_dir(b_collection_path, b_collection_output_path, collection_manifest, file_manifest)
collection_name = "%s.%s" % (collection_manifest['collection_info']['namespace'],
collection_manifest['collection_info']['name'])
display.display('Created collection for %s at %s' % (collection_name, to_text(b_collection_output_path)))
def set_latest_version(self):
self.versions = set([self.latest_version])
self._get_metadata()
@ -386,26 +458,8 @@ class CollectionRequirement:
metadata=meta, files=files, allow_pre_releases=allow_pre_release)
@staticmethod
- def from_path(b_path, force, parent=None, fallback_metadata=False):
+ def from_path(b_path, force, parent=None, fallback_metadata=False, skip=True):
- info = {}
+ info = CollectionRequirement.collection_info(b_path, fallback_metadata)
for b_file_name, property_name in CollectionRequirement._FILE_MAPPING:
b_file_path = os.path.join(b_path, b_file_name)
if not os.path.exists(b_file_path):
continue
with open(b_file_path, 'rb') as file_obj:
try:
info[property_name] = json.loads(to_text(file_obj.read(), errors='surrogate_or_strict'))
except ValueError:
raise AnsibleError("Collection file at '%s' does not contain a valid json string."
% to_native(b_file_path))
if not info and fallback_metadata:
b_galaxy_path = os.path.join(b_path, b'galaxy.yml')
if os.path.exists(b_galaxy_path):
collection_meta = _get_galaxy_yml(b_galaxy_path)
info['files_file'] = _build_files_manifest(b_path, collection_meta['namespace'], collection_meta['name'],
collection_meta['build_ignore'])
info['manifest_file'] = _build_manifest(**collection_meta)
allow_pre_release = False
if 'manifest_file' in info:
@ -442,7 +496,7 @@ class CollectionRequirement:
files = info.get('files_file', {}).get('files', {})
return CollectionRequirement(namespace, name, b_path, None, [version], version, force, parent=parent,
- metadata=meta, files=files, skip=True, allow_pre_releases=allow_pre_release)
+ metadata=meta, files=files, skip=skip, allow_pre_releases=allow_pre_release)
@staticmethod
def from_name(collection, apis, requirement, force, parent=None, allow_pre_release=False):
@ -483,8 +537,7 @@ class CollectionRequirement:
def build_collection(collection_path, output_path, force):
- """
- Creates the Ansible collection artifact in a .tar.gz file.
+ """Creates the Ansible collection artifact in a .tar.gz file.
:param collection_path: The path to the collection to build. This should be the directory that contains the
galaxy.yml file.
@ -493,14 +546,15 @@ def build_collection(collection_path, output_path, force):
:return: The path to the collection build artifact.
"""
b_collection_path = to_bytes(collection_path, errors='surrogate_or_strict')
- b_galaxy_path = os.path.join(b_collection_path, b'galaxy.yml')
+ b_galaxy_path = get_galaxy_metadata_path(b_collection_path)
if not os.path.exists(b_galaxy_path):
raise AnsibleError("The collection galaxy.yml path '%s' does not exist." % to_native(b_galaxy_path))
- collection_meta = _get_galaxy_yml(b_galaxy_path)
- file_manifest = _build_files_manifest(b_collection_path, collection_meta['namespace'], collection_meta['name'],
- collection_meta['build_ignore'])
- collection_manifest = _build_manifest(**collection_meta)
+ info = CollectionRequirement.galaxy_metadata(b_collection_path)
+ collection_manifest = info['manifest_file']
+ collection_meta = collection_manifest['collection_info']
+ file_manifest = info['files_file']
collection_output = os.path.join(output_path, "%s-%s-%s.tar.gz" % (collection_meta['namespace'],
collection_meta['name'],
@ -519,8 +573,7 @@ def build_collection(collection_path, output_path, force):
def download_collections(collections, output_path, apis, validate_certs, no_deps, allow_pre_release):
- """
- Download Ansible collections as their tarball from a Galaxy server to the path specified and creates a requirements
+ """Download Ansible collections as their tarball from a Galaxy server to the path specified and creates a requirements
file of the downloaded requirements to be used for an install.
:param collections: The collections to download, should be a list of tuples with (name, requirement, Galaxy Server).
@ -556,8 +609,7 @@ def download_collections(collections, output_path, apis, validate_certs, no_deps
def publish_collection(collection_path, api, wait, timeout):
- """
- Publish an Ansible collection tarball into an Ansible Galaxy server.
+ """Publish an Ansible collection tarball into an Ansible Galaxy server.
:param collection_path: The path to the collection tarball to publish.
:param api: A GalaxyAPI to publish the collection to.
@ -593,8 +645,7 @@ def publish_collection(collection_path, api, wait, timeout):
def install_collections(collections, output_path, apis, validate_certs, ignore_errors, no_deps, force, force_deps,
allow_pre_release=False):
- """
- Install Ansible collections to the path specified.
+ """Install Ansible collections to the path specified.
:param collections: The collections to install, should be a list of tuples with (name, requirement, Galaxy server).
:param output_path: The path to install the collections to.
@ -628,8 +679,7 @@ def install_collections(collections, output_path, apis, validate_certs, ignore_e
def validate_collection_name(name):
- """
- Validates the collection name as an input from the user or a requirements file fit the requirements.
+ """Validates the collection name as an input from the user or a requirements file fit the requirements.
:param name: The input name with optional range specifier split by ':'.
:return: The input value, required for argparse validation.
@ -859,6 +909,7 @@ def _build_files_manifest(b_collection_path, namespace, name, ignore_patterns):
# patterns can be extended by the build_ignore key in galaxy.yml
b_ignore_patterns = [
b'galaxy.yml',
b'galaxy.yaml',
b'.git',
b'*.pyc',
b'*.retry',
@ -968,6 +1019,7 @@ def _build_manifest(namespace, name, version, authors, readme, tags, description
def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, file_manifest):
"""Build a tar.gz collection artifact from the manifest data."""
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
@ -1008,6 +1060,49 @@ def _build_collection_tar(b_collection_path, b_tar_path, collection_manifest, fi
display.display('Created collection for %s at %s' % (collection_name, to_text(b_tar_path)))
def _build_collection_dir(b_collection_path, b_collection_output, collection_manifest, file_manifest):
"""Build a collection directory from the manifest data.
This should follow the same pattern as _build_collection_tar.
"""
os.makedirs(b_collection_output, mode=0o0755)
files_manifest_json = to_bytes(json.dumps(file_manifest, indent=True), errors='surrogate_or_strict')
collection_manifest['file_manifest_file']['chksum_sha256'] = secure_hash_s(files_manifest_json, hash_func=sha256)
collection_manifest_json = to_bytes(json.dumps(collection_manifest, indent=True), errors='surrogate_or_strict')
# Write contents to the files
for name, b in [('MANIFEST.json', collection_manifest_json), ('FILES.json', files_manifest_json)]:
b_path = os.path.join(b_collection_output, to_bytes(name, errors='surrogate_or_strict'))
with open(b_path, 'wb') as file_obj, BytesIO(b) as b_io:
shutil.copyfileobj(b_io, file_obj)
os.chmod(b_path, 0o0644)
base_directories = []
for file_info in file_manifest['files']:
if file_info['name'] == '.':
continue
src_file = os.path.join(b_collection_path, to_bytes(file_info['name'], errors='surrogate_or_strict'))
dest_file = os.path.join(b_collection_output, to_bytes(file_info['name'], errors='surrogate_or_strict'))
if any(src_file.startswith(directory) for directory in base_directories):
continue
existing_is_exec = os.stat(src_file).st_mode & stat.S_IXUSR
mode = 0o0755 if existing_is_exec else 0o0644
if os.path.isdir(src_file):
mode = 0o0755
base_directories.append(src_file)
shutil.copytree(src_file, dest_file)
else:
shutil.copyfile(src_file, dest_file)
os.chmod(dest_file, mode)
def find_existing_collections(path, fallback_metadata=False):
collections = []
@ -1033,9 +1128,9 @@ def _build_dependency_map(collections, existing_collections, b_temp_path, apis,
dependency_map = {}
# First build the dependency map on the actual requirements
- for name, version, source in collections:
+ for name, version, source, req_type in collections:
_get_collection_info(dependency_map, existing_collections, name, version, source, b_temp_path, apis,
- validate_certs, (force or force_deps), allow_pre_release=allow_pre_release)
+ validate_certs, (force or force_deps), allow_pre_release=allow_pre_release, req_type=req_type)
checked_parents = set([to_text(c) for c in dependency_map.values() if c.skip])
while len(dependency_map) != len(checked_parents):
@ -1070,18 +1165,84 @@ def _build_dependency_map(collections, existing_collections, b_temp_path, apis,
return dependency_map
def _collections_from_scm(collection, requirement, b_temp_path, force, parent=None):
"""Returns a list of collections found in the repo. If there is a galaxy.yml in the collection then just return
the specific collection. Otherwise, check each top-level directory for a galaxy.yml.
:param collection: URI to a git repo
:param requirement: The version of the artifact
:param b_temp_path: The temporary path to the archive of a collection
:param force: Whether to overwrite an existing collection or fail
:param parent: The name of the parent collection
:raises AnsibleError: if nothing found
:return: List of CollectionRequirement objects
:rtype: list
"""
reqs = []
name, version, path, fragment = parse_scm(collection, requirement)
b_repo_root = to_bytes(name, errors='surrogate_or_strict')
b_collection_path = os.path.join(b_temp_path, b_repo_root)
if fragment:
b_fragment = to_bytes(fragment, errors='surrogate_or_strict')
b_collection_path = os.path.join(b_collection_path, b_fragment)
b_galaxy_path = get_galaxy_metadata_path(b_collection_path)
err = ("%s appears to be an SCM collection source, but the required galaxy.yml was not found. "
"Append #path/to/collection/ to your URI (before the comma separated version, if one is specified) "
"to point to a directory containing the galaxy.yml or directories of collections" % collection)
display.vvvvv("Considering %s as a possible path to a collection's galaxy.yml" % b_galaxy_path)
if os.path.exists(b_galaxy_path):
return [CollectionRequirement.from_path(b_collection_path, force, parent, fallback_metadata=True, skip=False)]
if not os.path.isdir(b_collection_path) or not os.listdir(b_collection_path):
raise AnsibleError(err)
for b_possible_collection in os.listdir(b_collection_path):
b_collection = os.path.join(b_collection_path, b_possible_collection)
if not os.path.isdir(b_collection):
continue
b_galaxy = get_galaxy_metadata_path(b_collection)
display.vvvvv("Considering %s as a possible path to a collection's galaxy.yml" % b_galaxy)
if os.path.exists(b_galaxy):
reqs.append(CollectionRequirement.from_path(b_collection, force, parent, fallback_metadata=True, skip=False))
if not reqs:
raise AnsibleError(err)
return reqs
def _get_collection_info(dep_map, existing_collections, collection, requirement, source, b_temp_path, apis,
- validate_certs, force, parent=None, allow_pre_release=False):
+ validate_certs, force, parent=None, allow_pre_release=False, req_type=None):
dep_msg = ""
if parent:
dep_msg = " - as dependency of %s" % parent
display.vvv("Processing requirement collection '%s'%s" % (to_text(collection), dep_msg))
b_tar_path = None
if os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')):
is_file = (
req_type == 'file' or
(not req_type and os.path.isfile(to_bytes(collection, errors='surrogate_or_strict')))
)
is_url = (
req_type == 'url' or
(not req_type and urlparse(collection).scheme.lower() in ['http', 'https'])
)
is_scm = (
req_type == 'git' or
(not req_type and not b_tar_path and collection.startswith(('git+', 'git@')))
)
if is_file:
display.vvvv("Collection requirement '%s' is a tar artifact" % to_text(collection))
b_tar_path = to_bytes(collection, errors='surrogate_or_strict')
- elif urlparse(collection).scheme.lower() in ['http', 'https']:
+ elif is_url:
display.vvvv("Collection requirement '%s' is a URL to a tar artifact" % collection)
try:
b_tar_path = _download_file(collection, b_temp_path, None, validate_certs)
@ -1089,15 +1250,33 @@ def _get_collection_info(dep_map, existing_collections, collection, requirement,
raise AnsibleError("Failed to download collection tar from '%s': %s"
% (to_native(collection), to_native(err)))
if is_scm:
if not collection.startswith('git'):
collection = 'git+' + collection
name, version, path, fragment = parse_scm(collection, requirement)
b_tar_path = scm_archive_collection(path, name=name, version=version)
with tarfile.open(b_tar_path, mode='r') as collection_tar:
collection_tar.extractall(path=to_text(b_temp_path))
# Ignore requirement if it is set (it must follow semantic versioning, unlike a git version, which is any tree-ish)
# If the requirement was the only place version was set, requirement == version at this point
if requirement not in {"*", ""} and requirement != version:
display.warning(
"The collection {0} appears to be a git repository and two versions were provided: '{1}', and '{2}'. "
"The version {2} is being disregarded.".format(collection, version, requirement)
)
requirement = "*"
reqs = _collections_from_scm(collection, requirement, b_temp_path, force, parent)
for req in reqs:
collection_info = get_collection_info_from_req(dep_map, req)
update_dep_map_collection_info(dep_map, existing_collections, collection_info, parent, requirement)
else:
if b_tar_path:
req = CollectionRequirement.from_tar(b_tar_path, force, parent=parent)
collection_info = get_collection_info_from_req(dep_map, req)
collection_name = to_text(req)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, req.latest_version)
else:
collection_info = req
else:
validate_collection_name(collection)
@ -1110,6 +1289,20 @@ def _get_collection_info(dep_map, existing_collections, collection, requirement,
collection_info = CollectionRequirement.from_name(collection, apis, requirement, force, parent=parent,
allow_pre_release=allow_pre_release)
update_dep_map_collection_info(dep_map, existing_collections, collection_info, parent, requirement)
def get_collection_info_from_req(dep_map, collection):
collection_name = to_text(collection)
if collection_name in dep_map:
collection_info = dep_map[collection_name]
collection_info.add_requirement(None, collection.latest_version)
else:
collection_info = collection
return collection_info
def update_dep_map_collection_info(dep_map, existing_collections, collection_info, parent, requirement):
existing = [c for c in existing_collections if to_text(c) == to_text(collection_info)]
if existing and not collection_info.force:
# Test that the installed collection fits the requirement
@ -1119,6 +1312,32 @@ def _get_collection_info(dep_map, existing_collections, collection, requirement,
dep_map[to_text(collection_info)] = collection_info
def parse_scm(collection, version):
if ',' in collection:
collection, version = collection.split(',', 1)
elif version == '*' or not version:
version = 'HEAD'
if collection.startswith('git+'):
path = collection[4:]
else:
path = collection
path, fragment = urldefrag(path)
fragment = fragment.strip(os.path.sep)
if path.endswith(os.path.sep + '.git'):
name = path.split(os.path.sep)[-2]
elif '://' not in path and '@' not in path:
name = path
else:
name = path.split('/')[-1]
if name.endswith('.git'):
name = name[:-4]
return name, version, path, fragment
def _download_file(url, b_path, expected_hash, validate_certs, headers=None):
urlsplit = os.path.splitext(to_text(url.rsplit('/', 1)[1]))
b_file_name = to_bytes(urlsplit[0], errors='surrogate_or_strict')
@ -1216,3 +1435,13 @@ def _consume_file(read_from, write_to=None):
data = read_from.read(bufsize)
return sha256_digest.hexdigest()
def get_galaxy_metadata_path(b_path):
b_default_path = os.path.join(b_path, b'galaxy.yml')
candidate_names = [b'galaxy.yml', b'galaxy.yaml']
for b_name in candidate_names:
# Do not overwrite b_path here, or the second candidate would be joined onto the first candidate path.
b_candidate_path = os.path.join(b_path, b_name)
if os.path.exists(b_candidate_path):
return b_candidate_path
return b_default_path
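To make the new URI handling concrete, here is an illustrative sketch (not part of the commit) of what ``parse_scm`` is expected to return for two of the documented URI forms, assuming a POSIX path separator and that the function is importable from ``ansible.galaxy.collection``:

# Illustrative expectations only; the repository URLs reuse the placeholder examples from the docs above.
from ansible.galaxy.collection import parse_scm

# A fragment and a comma-separated commit-ish are both honoured.
name, version, path, fragment = parse_scm(
    'git+https://github.com/organization/repo_name.git#/my_namespace/,devel', '*')
assert (name, version, fragment) == ('repo_name', 'devel', 'my_namespace')
assert path == 'https://github.com/organization/repo_name.git'

# With no explicit version, HEAD is assumed.
assert parse_scm('git+file:///home/user/path/to/repo/.git', '*')[1] == 'HEAD'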


@ -19,20 +19,11 @@
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import tempfile
import tarfile
from subprocess import Popen, PIPE
from ansible import constants as C
from ansible.errors import AnsibleError
from ansible.module_utils._text import to_native
from ansible.module_utils.common.process import get_bin_path
from ansible.module_utils.six import string_types
from ansible.playbook.role.definition import RoleDefinition
from ansible.utils.display import Display
- from ansible.module_utils._text import to_text
+ from ansible.utils.galaxy import scm_archive_resource
__all__ = ['RoleRequirement']
@ -136,57 +127,4 @@ class RoleRequirement(RoleDefinition):
@staticmethod
def scm_archive_role(src, scm='git', name=None, version='HEAD', keep_scm_meta=False):
- def run_scm_cmd(cmd, tempdir):
+ return scm_archive_resource(src, scm=scm, name=name, version=version, keep_scm_meta=keep_scm_meta)
try:
stdout = ''
stderr = ''
popen = Popen(cmd, cwd=tempdir, stdout=PIPE, stderr=PIPE)
stdout, stderr = popen.communicate()
except Exception as e:
ran = " ".join(cmd)
display.debug("ran %s:" % ran)
display.debug("\tstdout: " + to_text(stdout))
display.debug("\tstderr: " + to_text(stderr))
raise AnsibleError("when executing %s: %s" % (ran, to_native(e)))
if popen.returncode != 0:
raise AnsibleError("- command %s failed in directory %s (rc=%s) - %s" % (' '.join(cmd), tempdir, popen.returncode, to_native(stderr)))
if scm not in ['hg', 'git']:
raise AnsibleError("- scm %s is not currently supported" % scm)
try:
scm_path = get_bin_path(scm)
except (ValueError, OSError, IOError):
raise AnsibleError("could not find/use %s, it is required to continue with installing %s" % (scm, src))
tempdir = tempfile.mkdtemp(dir=C.DEFAULT_LOCAL_TMP)
clone_cmd = [scm_path, 'clone', src, name]
run_scm_cmd(clone_cmd, tempdir)
if scm == 'git' and version:
checkout_cmd = [scm_path, 'checkout', to_text(version)]
run_scm_cmd(checkout_cmd, os.path.join(tempdir, name))
temp_file = tempfile.NamedTemporaryFile(delete=False, suffix='.tar', dir=C.DEFAULT_LOCAL_TMP)
archive_cmd = None
if keep_scm_meta:
display.vvv('tarring %s from %s to %s' % (name, tempdir, temp_file.name))
with tarfile.open(temp_file.name, "w") as tar:
tar.add(os.path.join(tempdir, name), arcname=name)
elif scm == 'hg':
archive_cmd = [scm_path, 'archive', '--prefix', "%s/" % name]
if version:
archive_cmd.extend(['-r', version])
archive_cmd.append(temp_file.name)
elif scm == 'git':
archive_cmd = [scm_path, 'archive', '--prefix=%s/' % name, '--output=%s' % temp_file.name]
if version:
archive_cmd.append(version)
else:
archive_cmd.append('HEAD')
if archive_cmd is not None:
display.vvv('archiving %s' % archive_cmd)
run_scm_cmd(archive_cmd, os.path.join(tempdir, name))
return temp_file.name


@ -0,0 +1,94 @@
# (c) 2014 Michael DeHaan, <michael@ansible.com>
#
# This file is part of Ansible
#
# Ansible is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# Ansible is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with Ansible. If not, see <http://www.gnu.org/licenses/>.
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type
import os
import tempfile
from subprocess import Popen, PIPE
import tarfile
import ansible.constants as C
from ansible.errors import AnsibleError
from ansible.utils.display import Display
from ansible.module_utils.common.process import get_bin_path
from ansible.module_utils.common.text.converters import to_text, to_native
display = Display()
def scm_archive_collection(src, name=None, version='HEAD'):
return scm_archive_resource(src, scm='git', name=name, version=version, keep_scm_meta=False)
def scm_archive_resource(src, scm='git', name=None, version='HEAD', keep_scm_meta=False):
def run_scm_cmd(cmd, tempdir):
try:
stdout = ''
stderr = ''
popen = Popen(cmd, cwd=tempdir, stdout=PIPE, stderr=PIPE)
stdout, stderr = popen.communicate()
except Exception as e:
ran = " ".join(cmd)
display.debug("ran %s:" % ran)
raise AnsibleError("when executing %s: %s" % (ran, to_native(e)))
if popen.returncode != 0:
raise AnsibleError("- command %s failed in directory %s (rc=%s) - %s" % (' '.join(cmd), tempdir, popen.returncode, to_native(stderr)))
if scm not in ['hg', 'git']:
raise AnsibleError("- scm %s is not currently supported" % scm)
try:
scm_path = get_bin_path(scm)
except (ValueError, OSError, IOError):
raise AnsibleError("could not find/use %s, it is required to continue with installing %s" % (scm, src))
tempdir = tempfile.mkdtemp(dir=C.DEFAULT_LOCAL_TMP)
clone_cmd = [scm_path, 'clone', src, name]
run_scm_cmd(clone_cmd, tempdir)
if scm == 'git' and version:
checkout_cmd = [scm_path, 'checkout', to_text(version)]
run_scm_cmd(checkout_cmd, os.path.join(tempdir, name))
temp_file = tempfile.NamedTemporaryFile(delete=False, suffix='.tar', dir=C.DEFAULT_LOCAL_TMP)
archive_cmd = None
if keep_scm_meta:
display.vvv('tarring %s from %s to %s' % (name, tempdir, temp_file.name))
with tarfile.open(temp_file.name, "w") as tar:
tar.add(os.path.join(tempdir, name), arcname=name)
elif scm == 'hg':
archive_cmd = [scm_path, 'archive', '--prefix', "%s/" % name]
if version:
archive_cmd.extend(['-r', version])
archive_cmd.append(temp_file.name)
elif scm == 'git':
archive_cmd = [scm_path, 'archive', '--prefix=%s/' % name, '--output=%s' % temp_file.name]
if version:
archive_cmd.append(version)
else:
archive_cmd.append('HEAD')
if archive_cmd is not None:
display.vvv('archiving %s' % archive_cmd)
run_scm_cmd(archive_cmd, os.path.join(tempdir, name))
return temp_file.name


@ -0,0 +1,3 @@
shippable/posix/group4
skip/aix
skip/python2.6 # ansible-galaxy uses tarfile with features not available until 2.7


@ -0,0 +1,3 @@
---
dependencies:
- setup_remote_tmp_dir


@ -0,0 +1,7 @@
- name: delete installed collections
file:
state: "{{ item }}"
path: "{{ galaxy_dir }}/ansible_collections"
loop:
- absent
- directory


@ -0,0 +1,20 @@
- name: Clone a git repository
git:
repo: https://github.com/ansible-collections/amazon.aws.git
dest: '{{ galaxy_dir }}/development/amazon.aws/'
- name: install
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/amazon.aws/.git'
args:
chdir: '{{ galaxy_dir }}/development'
- name: list installed collections
command: 'ansible-galaxy collection list'
register: installed_collections
- assert:
that:
- "'amazon.aws' in installed_collections.stdout"
- include_tasks: ./empty_installed_collections.yml
when: cleanup


@ -0,0 +1,40 @@
---
- name: set the temp test directory
set_fact:
galaxy_dir: "{{ remote_tmp_dir }}/galaxy"
- name: Test installing collections from git repositories
environment:
ANSIBLE_COLLECTIONS_PATHS: '{{ galaxy_dir }}'
vars:
cleanup: True
galaxy_dir: "{{ galaxy_dir }}"
block:
- include_tasks: ./setup.yml
- include_tasks: ./individual_collection_repo.yml
- include_tasks: ./setup_multi_collection_repo.yml
- include_tasks: ./multi_collection_repo_all.yml
- include_tasks: ./scm_dependency.yml
vars:
cleanup: False
- include_tasks: ./reinstalling.yml
- include_tasks: ./multi_collection_repo_individual.yml
- include_tasks: ./setup_recursive_scm_dependency.yml
- include_tasks: ./scm_dependency_deduplication.yml
always:
- name: Remove the directories for installing collections and git repositories
file:
path: '{{ item }}'
state: absent
loop:
- '{{ galaxy_dir }}/ansible_collections'
- '{{ galaxy_dir }}/development'
- name: remove git
package:
name: git
state: absent
when: git_install is changed


@ -0,0 +1,14 @@
- name: Install all collections by default
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/ansible_test/.git'
- name: list installed collections
command: 'ansible-galaxy collection list'
register: installed_collections
- assert:
that:
- "'ansible_test.collection_1' in installed_collections.stdout"
- "'ansible_test.collection_2' in installed_collections.stdout"
- include_tasks: ./empty_installed_collections.yml
when: cleanup


@ -0,0 +1,15 @@
- name: test installing one collection
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/ansible_test/.git#collection_2'
- name: list installed collections
command: 'ansible-galaxy collection list'
register: installed_collections
- assert:
that:
- "'amazon.aws' not in installed_collections.stdout"
- "'ansible_test.collection_1' not in installed_collections.stdout"
- "'ansible_test.collection_2' in installed_collections.stdout"
- include_tasks: ./empty_installed_collections.yml
when: cleanup


@ -0,0 +1,31 @@
- name: Rerun installing a collection with a dep
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/ansible_test/.git#/collection_1/'
register: installed
- assert:
that:
- "'Skipping' in installed.stdout"
- "'Created' not in installed.stdout"
- name: Only reinstall the collection
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/ansible_test/.git#/collection_1/ --force'
register: installed
- assert:
that:
- "'Created collection for ansible_test.collection_1' in installed.stdout"
- "'Created collection for ansible_test.collection_2' not in installed.stdout"
- "'Skipping' in installed.stdout"
- name: Reinstall the collection and dependency
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/ansible_test/.git#/collection_1/ --force-with-deps'
register: installed
- assert:
that:
- "'Created collection for ansible_test.collection_1' in installed.stdout"
- "'Created collection for ansible_test.collection_2' in installed.stdout"
- "'Skipping' not in installed.stdout"
- include_tasks: ./empty_installed_collections.yml
when: cleanup


@ -0,0 +1,14 @@
- name: test installing one collection that has a SCM dep
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/ansible_test/.git#/collection_1/'
- name: list installed collections
command: 'ansible-galaxy collection list'
register: installed_collections
- assert:
that:
- "'ansible_test.collection_1' in installed_collections.stdout"
- "'ansible_test.collection_2' in installed_collections.stdout"
- include_tasks: ./empty_installed_collections.yml
when: cleanup


@ -0,0 +1,50 @@
- name: Install all collections in a repo, one of which has a recursive dependency
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/namespace_1/.git'
register: command
- assert:
that:
- command.stdout_lines | length == 7
- command.stdout_lines[0] == "Starting galaxy collection install process"
- command.stdout_lines[1] == "Process install dependency map"
- command.stdout_lines[2] == "Starting collection install process"
- "'namespace_1.collection_1' in command.stdout_lines[3]"
- "'namespace_1.collection_1' in command.stdout_lines[4]"
- "'namespace_2.collection_2' in command.stdout_lines[5]"
- "'namespace_2.collection_2' in command.stdout_lines[6]"
- name: list installed collections
command: 'ansible-galaxy collection list'
register: installed_collections
- assert:
that:
- "'namespace_1.collection_1' in installed_collections.stdout"
- "'namespace_2.collection_2' in installed_collections.stdout"
- name: Install a specific collection in a repo with a recursive dependency
command: 'ansible-galaxy collection install git+file://{{ galaxy_dir }}/development/namespace_1/.git#/collection_1/ --force-with-deps'
register: command
- assert:
that:
- command.stdout_lines | length == 7
- command.stdout_lines[0] == "Starting galaxy collection install process"
- command.stdout_lines[1] == "Process install dependency map"
- command.stdout_lines[2] == "Starting collection install process"
- "'namespace_1.collection_1' in command.stdout_lines[3]"
- "'namespace_1.collection_1' in command.stdout_lines[4]"
- "'namespace_2.collection_2' in command.stdout_lines[5]"
- "'namespace_2.collection_2' in command.stdout_lines[6]"
- name: list installed collections
command: 'ansible-galaxy collection list'
register: installed_collections
- assert:
that:
- "'namespace_1.collection_1' in installed_collections.stdout"
- "'namespace_2.collection_2' in installed_collections.stdout"
- include_tasks: ./empty_installed_collections.yml
when: cleanup


@ -0,0 +1,19 @@
- name: ensure git is installed
package:
name: git
when: ansible_distribution != "MacOSX"
register: git_install
- name: set git global user.email if not already set
shell: git config --global user.email || git config --global user.email "noreply@example.com"
- name: set git global user.name if not already set
shell: git config --global user.name || git config --global user.name "Ansible Test Runner"
- name: Create a directory for installing collections and creating git repositories
file:
path: '{{ item }}'
state: directory
loop:
- '{{ galaxy_dir }}/ansible_collections'
- '{{ galaxy_dir }}/development/ansible_test'


@ -0,0 +1,27 @@
- name: Initialize a git repo
command: 'git init {{ galaxy_dir }}/development/ansible_test'
- stat:
path: "{{ galaxy_dir }}/development/ansible_test"
- name: Add a couple collections to the repository
command: 'ansible-galaxy collection init {{ item }}'
args:
chdir: '{{ galaxy_dir }}/development'
loop:
- 'ansible_test.collection_1'
- 'ansible_test.collection_2'
- name: Add collection_2 as a dependency of collection_1
lineinfile:
path: '{{ galaxy_dir }}/development/ansible_test/collection_1/galaxy.yml'
regexp: '^dependencies'
line: "dependencies: {'git+file://{{ galaxy_dir }}/development/ansible_test/.git#collection_2/': '*'}"
- name: Commit the changes
command: '{{ item }}'
args:
chdir: '{{ galaxy_dir }}/development/ansible_test'
loop:
- git add ./
- git commit -m 'add collections'


@ -0,0 +1,33 @@
- name: Initialize git repositories
command: 'git init {{ galaxy_dir }}/development/{{ item }}'
loop:
- namespace_1
- namespace_2
- name: Add a couple collections to the repository
command: 'ansible-galaxy collection init {{ item }}'
args:
chdir: '{{ galaxy_dir }}/development'
loop:
- 'namespace_1.collection_1'
- 'namespace_2.collection_2'
- name: Add collection_2 as a dependency of collection_1
lineinfile:
path: '{{ galaxy_dir }}/development/namespace_1/collection_1/galaxy.yml'
regexp: '^dependencies'
line: "dependencies: {'git+file://{{ galaxy_dir }}/development/namespace_2/.git#collection_2/': '*'}"
- name: Add collection_1 as a dependency on collection_2
lineinfile:
path: '{{ galaxy_dir }}/development/namespace_2/collection_2/galaxy.yml'
regexp: '^dependencies'
line: "dependencies: {'git+file://{{ galaxy_dir }}/development/namespace_1/.git#collection_1/': 'master'}"
- name: Commit the changes
shell: git add ./; git commit -m 'add collection'
args:
chdir: '{{ galaxy_dir }}/development/{{ item }}'
loop:
- namespace_1
- namespace_2


@ -63,6 +63,7 @@ lib/ansible/executor/powershell/async_watchdog.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/powershell/async_wrapper.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/powershell/exec_wrapper.ps1 pslint:PSCustomUseLiteralPath
lib/ansible/executor/task_queue_manager.py pylint:blacklisted-name
lib/ansible/galaxy/collection.py compile-2.6!skip # 'ansible-galaxy collection' requires 2.7+
lib/ansible/module_utils/_text.py future-import-boilerplate
lib/ansible/module_utils/_text.py metaclass-boilerplate
lib/ansible/module_utils/api.py future-import-boilerplate


@ -765,8 +765,8 @@ def test_collection_install_with_names(collection_install):
in mock_warning.call_args[0][0]
assert mock_install.call_count == 1
- assert mock_install.call_args[0][0] == [('namespace.collection', '*', None),
- ('namespace2.collection', '1.0.1', None)]
+ assert mock_install.call_args[0][0] == [('namespace.collection', '*', None, None),
+ ('namespace2.collection', '1.0.1', None, None)]
assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -802,8 +802,8 @@ collections:
in mock_warning.call_args[0][0] in mock_warning.call_args[0][0]
assert mock_install.call_count == 1 assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None), assert mock_install.call_args[0][0] == [('namespace.coll', '*', None, None),
('namespace2.coll', '>2.0.1', None)] ('namespace2.coll', '>2.0.1', None, None)]
assert mock_install.call_args[0][1] == collection_path assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1 assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com' assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -819,7 +819,7 @@ def test_collection_install_with_relative_path(collection_install, monkeypatch):
mock_install = collection_install[0] mock_install = collection_install[0]
mock_req = MagicMock() mock_req = MagicMock()
mock_req.return_value = {'collections': [('namespace.coll', '*', None)], 'roles': []} mock_req.return_value = {'collections': [('namespace.coll', '*', None, None)], 'roles': []}
monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req) monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req)
monkeypatch.setattr(os, 'makedirs', MagicMock()) monkeypatch.setattr(os, 'makedirs', MagicMock())
@ -831,7 +831,7 @@ def test_collection_install_with_relative_path(collection_install, monkeypatch):
GalaxyCLI(args=galaxy_args).run() GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_count == 1 assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None)] assert mock_install.call_args[0][0] == [('namespace.coll', '*', None, None)]
assert mock_install.call_args[0][1] == os.path.abspath(collections_path) assert mock_install.call_args[0][1] == os.path.abspath(collections_path)
assert len(mock_install.call_args[0][2]) == 1 assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com' assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -850,7 +850,7 @@ def test_collection_install_with_unexpanded_path(collection_install, monkeypatch
mock_install = collection_install[0] mock_install = collection_install[0]
mock_req = MagicMock() mock_req = MagicMock()
mock_req.return_value = {'collections': [('namespace.coll', '*', None)], 'roles': []} mock_req.return_value = {'collections': [('namespace.coll', '*', None, None)], 'roles': []}
monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req) monkeypatch.setattr(ansible.cli.galaxy.GalaxyCLI, '_parse_requirements_file', mock_req)
monkeypatch.setattr(os, 'makedirs', MagicMock()) monkeypatch.setattr(os, 'makedirs', MagicMock())
@ -862,7 +862,7 @@ def test_collection_install_with_unexpanded_path(collection_install, monkeypatch
GalaxyCLI(args=galaxy_args).run() GalaxyCLI(args=galaxy_args).run()
assert mock_install.call_count == 1 assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.coll', '*', None)] assert mock_install.call_args[0][0] == [('namespace.coll', '*', None, None)]
assert mock_install.call_args[0][1] == os.path.expanduser(os.path.expandvars(collections_path)) assert mock_install.call_args[0][1] == os.path.expanduser(os.path.expandvars(collections_path))
assert len(mock_install.call_args[0][2]) == 1 assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com' assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -889,8 +889,8 @@ def test_collection_install_in_collection_dir(collection_install, monkeypatch):
assert mock_warning.call_count == 0 assert mock_warning.call_count == 0
assert mock_install.call_count == 1 assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None), assert mock_install.call_args[0][0] == [('namespace.collection', '*', None, None),
('namespace2.collection', '1.0.1', None)] ('namespace2.collection', '1.0.1', None, None)]
assert mock_install.call_args[0][1] == os.path.join(collections_path, 'ansible_collections') assert mock_install.call_args[0][1] == os.path.join(collections_path, 'ansible_collections')
assert len(mock_install.call_args[0][2]) == 1 assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com' assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -913,7 +913,7 @@ def test_collection_install_with_url(collection_install):
assert os.path.isdir(collection_path) assert os.path.isdir(collection_path)
assert mock_install.call_count == 1 assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('https://foo/bar/foo-bar-v1.0.0.tar.gz', '*', None)] assert mock_install.call_args[0][0] == [('https://foo/bar/foo-bar-v1.0.0.tar.gz', '*', None, None)]
assert mock_install.call_args[0][1] == collection_path assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1 assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com' assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -958,8 +958,8 @@ def test_collection_install_path_with_ansible_collections(collection_install):
% collection_path in mock_warning.call_args[0][0] % collection_path in mock_warning.call_args[0][0]
assert mock_install.call_count == 1 assert mock_install.call_count == 1
assert mock_install.call_args[0][0] == [('namespace.collection', '*', None), assert mock_install.call_args[0][0] == [('namespace.collection', '*', None, None),
('namespace2.collection', '1.0.1', None)] ('namespace2.collection', '1.0.1', None, None)]
assert mock_install.call_args[0][1] == collection_path assert mock_install.call_args[0][1] == collection_path
assert len(mock_install.call_args[0][2]) == 1 assert len(mock_install.call_args[0][2]) == 1
assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com' assert mock_install.call_args[0][2][0].api_server == 'https://galaxy.ansible.com'
@ -1104,7 +1104,7 @@ collections:
def test_parse_requirements(requirements_cli, requirements_file): def test_parse_requirements(requirements_cli, requirements_file):
expected = { expected = {
'roles': [], 'roles': [],
'collections': [('namespace.collection1', '*', None), ('namespace.collection2', '*', None)] 'collections': [('namespace.collection1', '*', None, None), ('namespace.collection2', '*', None, None)]
} }
actual = requirements_cli._parse_requirements_file(requirements_file) actual = requirements_cli._parse_requirements_file(requirements_file)
@ -1131,7 +1131,7 @@ def test_parse_requirements_with_extra_info(requirements_cli, requirements_file)
assert actual['collections'][0][2].password is None assert actual['collections'][0][2].password is None
assert actual['collections'][0][2].validate_certs is True assert actual['collections'][0][2].validate_certs is True
assert actual['collections'][1] == ('namespace.collection2', '*', None) assert actual['collections'][1] == ('namespace.collection2', '*', None, None)
@pytest.mark.parametrize('requirements_file', [''' @pytest.mark.parametrize('requirements_file', ['''
@ -1154,7 +1154,7 @@ def test_parse_requirements_with_roles_and_collections(requirements_cli, require
assert actual['roles'][2].src == 'ssh://github.com/user/repo' assert actual['roles'][2].src == 'ssh://github.com/user/repo'
assert len(actual['collections']) == 1 assert len(actual['collections']) == 1
assert actual['collections'][0] == ('namespace.collection2', '*', None) assert actual['collections'][0] == ('namespace.collection2', '*', None, None)
@pytest.mark.parametrize('requirements_file', [''' @pytest.mark.parametrize('requirements_file', ['''
@ -1173,7 +1173,7 @@ def test_parse_requirements_with_collection_source(requirements_cli, requirement
assert actual['roles'] == [] assert actual['roles'] == []
assert len(actual['collections']) == 3 assert len(actual['collections']) == 3
assert actual['collections'][0] == ('namespace.collection', '*', None) assert actual['collections'][0] == ('namespace.collection', '*', None, None)
assert actual['collections'][1][0] == 'namespace2.collection2' assert actual['collections'][1][0] == 'namespace2.collection2'
assert actual['collections'][1][1] == '*' assert actual['collections'][1][1] == '*'
@ -1181,7 +1181,7 @@ def test_parse_requirements_with_collection_source(requirements_cli, requirement
assert actual['collections'][1][2].name == 'explicit_requirement_namespace2.collection2' assert actual['collections'][1][2].name == 'explicit_requirement_namespace2.collection2'
assert actual['collections'][1][2].token is None assert actual['collections'][1][2].token is None
assert actual['collections'][2] == ('namespace3.collection3', '*', galaxy_api) assert actual['collections'][2] == ('namespace3.collection3', '*', galaxy_api, None)
@pytest.mark.parametrize('requirements_file', [''' @pytest.mark.parametrize('requirements_file', ['''
@ -1237,7 +1237,7 @@ def test_install_implicit_role_with_collections(requirements_file, monkeypatch):
cli.run() cli.run()
assert mock_collection_install.call_count == 1 assert mock_collection_install.call_count == 1
assert mock_collection_install.call_args[0][0] == [('namespace.name', '*', None)] assert mock_collection_install.call_args[0][0] == [('namespace.name', '*', None, None)]
assert mock_collection_install.call_args[0][1] == cli._get_default_collection_path() assert mock_collection_install.call_args[0][1] == cli._get_default_collection_path()
assert mock_role_install.call_count == 1 assert mock_role_install.call_count == 1
@ -1335,7 +1335,7 @@ def test_install_collection_with_roles(requirements_file, monkeypatch):
cli.run() cli.run()
assert mock_collection_install.call_count == 1 assert mock_collection_install.call_count == 1
assert mock_collection_install.call_args[0][0] == [('namespace.name', '*', None)] assert mock_collection_install.call_args[0][0] == [('namespace.name', '*', None, None)]
assert mock_collection_install.call_args[0][1] == cli._get_default_collection_path() assert mock_collection_install.call_args[0][1] == cli._get_default_collection_path()
assert mock_role_install.call_count == 0 assert mock_role_install.call_count == 0
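
The thread running through these test updates is that each parsed collection requirement now carries a fourth element alongside the name, version constraint, and Galaxy API source. Given the feature being added here, that extra slot presumably records the requirement's source type, for instance marking git-based requirements, with None when nothing is specified. In requirements-file terms that roughly corresponds to entries like the following sketch, which mixes a plain Galaxy entry with a git one; the values reuse identifiers from these tests and are illustrative only:

collections:
  - name: namespace.collection
    version: '1.0.1'
  - name: 'git+file://{{ galaxy_dir }}/development/namespace_1/.git#collection_1'
    type: git
    version: master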

View file

@@ -787,7 +787,7 @@ def test_require_one_of_collections_requirements_with_collections():
    requirements = cli._require_one_of_collections_requirements(collections, '')['collections']
-    assert requirements == [('namespace1.collection1', '*', None), ('namespace2.collection1', '1.0.0', None)]
+    assert requirements == [('namespace1.collection1', '*', None, None), ('namespace2.collection1', '1.0.0', None, None)]
@patch('ansible.cli.galaxy.GalaxyCLI._parse_requirements_file')
@@ -838,7 +838,7 @@ def test_execute_verify_with_defaults(mock_verify_collections):
    requirements, search_paths, galaxy_apis, validate, ignore_errors = mock_verify_collections.call_args[0]
-    assert requirements == [('namespace.collection', '1.0.4', None)]
+    assert requirements == [('namespace.collection', '1.0.4', None, None)]
    for install_path in search_paths:
        assert install_path.endswith('ansible_collections')
    assert galaxy_apis[0].api_server == 'https://galaxy.ansible.com'
@@ -857,7 +857,7 @@ def test_execute_verify(mock_verify_collections):
    requirements, search_paths, galaxy_apis, validate, ignore_errors = mock_verify_collections.call_args[0]
-    assert requirements == [('namespace.collection', '1.0.4', None)]
+    assert requirements == [('namespace.collection', '1.0.4', None, None)]
    for install_path in search_paths:
        assert install_path.endswith('ansible_collections')
    assert galaxy_apis[0].api_server == 'http://galaxy-dev.com'
@@ -1184,7 +1184,7 @@ def test_verify_collections_not_installed(mock_verify, mock_collection, monkeypatch):
    found_remote = MagicMock(return_value=mock_collection(local=False))
    monkeypatch.setattr(collection.CollectionRequirement, 'from_name', found_remote)
-    collections = [('%s.%s' % (namespace, name), version, None)]
+    collections = [('%s.%s' % (namespace, name), version, None, None)]
    search_path = './'
    validate_certs = False
    ignore_errors = False

View file

@@ -702,7 +702,7 @@ def test_install_collections_from_tar(collection_artifact, monkeypatch):
    mock_display = MagicMock()
    monkeypatch.setattr(Display, 'display', mock_display)
-    collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
+    collection.install_collections([(to_text(collection_tar), '*', None, None)], to_text(temp_path),
                                    [u'https://galaxy.ansible.com'], True, False, False, False, False)
    assert os.path.isdir(collection_path)
@@ -735,7 +735,7 @@ def test_install_collections_existing_without_force(collection_artifact, monkeypatch):
    monkeypatch.setattr(Display, 'display', mock_display)
    # If we don't delete collection_path it will think the original build skeleton is installed so we expect a skip
-    collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
+    collection.install_collections([(to_text(collection_tar), '*', None, None)], to_text(temp_path),
                                    [u'https://galaxy.ansible.com'], True, False, False, False, False)
    assert os.path.isdir(collection_path)
@@ -768,7 +768,7 @@ def test_install_missing_metadata_warning(collection_artifact, monkeypatch):
    if os.path.isfile(b_path):
        os.unlink(b_path)
-    collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
+    collection.install_collections([(to_text(collection_tar), '*', None, None)], to_text(temp_path),
                                    [u'https://galaxy.ansible.com'], True, False, False, False, False)
    display_msgs = [m[1][0] for m in mock_display.mock_calls if 'newline' not in m[2] and len(m[1]) == 1]
@@ -788,7 +788,7 @@ def test_install_collection_with_circular_dependency(collection_artifact, monkeypatch):
    mock_display = MagicMock()
    monkeypatch.setattr(Display, 'display', mock_display)
-    collection.install_collections([(to_text(collection_tar), '*', None,)], to_text(temp_path),
+    collection.install_collections([(to_text(collection_tar), '*', None, None)], to_text(temp_path),
                                    [u'https://galaxy.ansible.com'], True, False, False, False, False)
    assert os.path.isdir(collection_path)