ansible/test/support/integration/plugins/cache/jsonfile.py

# (c) 2014, Brian Coca, Josh Drake, et al
# (c) 2017 Ansible Project
# GNU General Public License v3.0+ (see COPYING or https://www.gnu.org/licenses/gpl-3.0.txt)
# Make coding more python3-ish
from __future__ import (absolute_import, division, print_function)
__metaclass__ = type

DOCUMENTATION = '''
    cache: jsonfile
    short_description: JSON formatted files.
    description:
        - This cache uses JSON formatted, per host, files saved to the filesystem.
    version_added: "1.9"
    author: Ansible Core (@ansible-core)
    options:
      _uri:
        required: True
        description:
          - Path in which the cache plugin will save the JSON files
        env:
          - name: ANSIBLE_CACHE_PLUGIN_CONNECTION
        ini:
          - key: fact_caching_connection
            section: defaults
      _prefix:
        description: User defined prefix to use when creating the JSON files
        env:
          - name: ANSIBLE_CACHE_PLUGIN_PREFIX
        ini:
          - key: fact_caching_prefix
            section: defaults
      _timeout:
        default: 86400
        description: Expiration timeout in seconds for the cache plugin data. Set to 0 to never expire
        env:
          - name: ANSIBLE_CACHE_PLUGIN_TIMEOUT
        ini:
          - key: fact_caching_timeout
            section: defaults
        type: integer
'''
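
# Configuration sketch (illustrative, not part of the original plugin): with the
# standard jsonfile cache plugin, the options above map to ansible.cfg settings
# such as the following (the path and prefix values are hypothetical):
#
#   [defaults]
#   fact_caching = jsonfile
#   fact_caching_connection = /path/to/cache/dir
#   fact_caching_prefix = ansible_facts_
#   fact_caching_timeout = 86400
#
# or, equivalently, to the ANSIBLE_CACHE_PLUGIN_CONNECTION, ANSIBLE_CACHE_PLUGIN_PREFIX
# and ANSIBLE_CACHE_PLUGIN_TIMEOUT environment variables documented above.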

import codecs
import json

from ansible.parsing.ajson import AnsibleJSONEncoder, AnsibleJSONDecoder
from ansible.plugins.cache import BaseFileCacheModule


class CacheModule(BaseFileCacheModule):
    """
    A caching module backed by json files.
    """

    def _load(self, filepath):
        # Valid JSON is always UTF-8 encoded.
        with codecs.open(filepath, 'r', encoding='utf-8') as f:
            return json.load(f, cls=AnsibleJSONDecoder)

    def _dump(self, value, filepath):
        with codecs.open(filepath, 'w', encoding='utf-8') as f:
            f.write(json.dumps(value, cls=AnsibleJSONEncoder, sort_keys=True, indent=4))
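

if __name__ == '__main__':
    # Minimal sketch, not part of the original plugin: demonstrates the JSON round
    # trip that _dump() and _load() perform, written to a temporary file instead of
    # a real fact-cache path. The example data below is made up. Requires Python 3.
    import os
    import tempfile

    example = {'ansible_facts': {'ansible_os_family': 'Debian'}}
    with tempfile.NamedTemporaryFile(mode='w', encoding='utf-8', suffix='.json', delete=False) as tmp:
        # Same call as _dump(): pretty-printed, key-sorted JSON via AnsibleJSONEncoder.
        tmp.write(json.dumps(example, cls=AnsibleJSONEncoder, sort_keys=True, indent=4))
    with codecs.open(tmp.name, 'r', encoding='utf-8') as f:
        # Same call as _load(): decode with AnsibleJSONDecoder and compare.
        assert json.load(f, cls=AnsibleJSONDecoder) == example
    os.remove(tmp.name)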