commit 67b2fad49e
Author: Tulir Asokan
Date:   2024-03-26 16:24:21 +02:00

    Merge remote-tracking branch 'upstream/release-v1.104'

117 changed files with 1807 additions and 1165 deletions
@@ -14,7 +14,7 @@ jobs:
       # There's a 'download artifact' action, but it hasn't been updated for the workflow_run action
       # (https://github.com/actions/download-artifact/issues/60) so instead we get this mess:
       - name: 📥 Download artifact
-        uses: dawidd6/action-download-artifact@72aaadce3bc708349fc665eee3785cbb1b6e51d0 # v3.1.1
+        uses: dawidd6/action-download-artifact@09f2f74827fd3a8607589e5ad7f9398816f540fe # v3.1.4
         with:
           workflow: docs-pr.yaml
           run_id: ${{ github.event.workflow_run.id }}
@@ -1,3 +1,64 @@
# Synapse 1.104.0rc1 (2024-03-26)
### Features
- Add an OIDC config option to specify extra parameters for the authorization grant URL. It can be useful, for example, to pass an ACR value. ([\#16971](https://github.com/element-hq/synapse/issues/16971))
- Add support for an OIDC provider returning a JWT. ([\#16972](https://github.com/element-hq/synapse/issues/16972), [\#17031](https://github.com/element-hq/synapse/issues/17031))
### Bugfixes
- Fix a bug which meant that, under certain circumstances, we might never retry sending events or to-device messages over federation after a failure. ([\#16925](https://github.com/element-hq/synapse/issues/16925))
- Fix various long-standing bugs which could cause incorrect state to be returned from `/sync` in certain situations. ([\#16949](https://github.com/element-hq/synapse/issues/16949))
- Fix case in which `m.fully_read` marker would not get updated. Contributed by @SpiritCroc. ([\#16990](https://github.com/element-hq/synapse/issues/16990))
- Fix bug which did not retract a user's pending knocks at rooms when their account was deactivated. Contributed by @hanadi92. ([\#17010](https://github.com/element-hq/synapse/issues/17010))
### Updates to the Docker image
- Updated `start.py` to generate config using the correct user ID when running as root (fixes [\#16824](https://github.com/element-hq/synapse/issues/16824), [\#15202](https://github.com/element-hq/synapse/issues/15202)). ([\#16978](https://github.com/element-hq/synapse/issues/16978))
### Improved Documentation
- Add a query to force a refresh of a remote user's device list to the "Useful SQL for Admins" documentation page. ([\#16892](https://github.com/element-hq/synapse/issues/16892))
- Minor grammatical corrections to the upgrade documentation. ([\#16965](https://github.com/element-hq/synapse/issues/16965))
- Fix the sort order for the documentation version picker, so that newer releases appear above older ones. ([\#16966](https://github.com/element-hq/synapse/issues/16966))
- Remove recommendation for a specific poetry version from contributing guide. ([\#17002](https://github.com/element-hq/synapse/issues/17002))
### Internal Changes
- Improve lock performance when a lot of locks are all waiting for a single lock to be released. ([\#16840](https://github.com/element-hq/synapse/issues/16840))
- Update power level default for public rooms. ([\#16907](https://github.com/element-hq/synapse/issues/16907))
- Improve event validation. ([\#16908](https://github.com/element-hq/synapse/issues/16908))
- Multi-worker-docker-container: disable log buffering. ([\#16919](https://github.com/element-hq/synapse/issues/16919))
- Refactor state delta calculation in `/sync` handler. ([\#16929](https://github.com/element-hq/synapse/issues/16929))
- Clarify docs for some room state functions. ([\#16950](https://github.com/element-hq/synapse/issues/16950))
- Specify IP subnets in canonical form. ([\#16953](https://github.com/element-hq/synapse/issues/16953))
- Pass the module API to the OIDC mapping provider, as is already done for the SAML mapping provider, so that mappers can implement more complex logic. ([\#16974](https://github.com/element-hq/synapse/issues/16974))
- Allow containers building on top of Synapse's Complement container to use the included PostgreSQL cluster. ([\#16985](https://github.com/element-hq/synapse/issues/16985))
- Raise poetry-core version cap to 1.9.0. ([\#16986](https://github.com/element-hq/synapse/issues/16986))
- Patch the db conn pool sooner in tests. ([\#17017](https://github.com/element-hq/synapse/issues/17017))
### Updates to locked dependencies
* Bump anyhow from 1.0.80 to 1.0.81. ([\#17009](https://github.com/element-hq/synapse/issues/17009))
* Bump black from 23.10.1 to 24.2.0. ([\#16936](https://github.com/element-hq/synapse/issues/16936))
* Bump cryptography from 41.0.7 to 42.0.5. ([\#16958](https://github.com/element-hq/synapse/issues/16958))
* Bump dawidd6/action-download-artifact from 3.1.1 to 3.1.2. ([\#16960](https://github.com/element-hq/synapse/issues/16960))
* Bump dawidd6/action-download-artifact from 3.1.2 to 3.1.4. ([\#17008](https://github.com/element-hq/synapse/issues/17008))
* Bump jinja2 from 3.1.2 to 3.1.3. ([\#17005](https://github.com/element-hq/synapse/issues/17005))
* Bump log from 0.4.20 to 0.4.21. ([\#16977](https://github.com/element-hq/synapse/issues/16977))
* Bump mypy from 1.5.1 to 1.8.0. ([\#16901](https://github.com/element-hq/synapse/issues/16901))
* Bump netaddr from 0.9.0 to 1.2.1. ([\#17006](https://github.com/element-hq/synapse/issues/17006))
* Bump pydantic from 2.6.0 to 2.6.4. ([\#17004](https://github.com/element-hq/synapse/issues/17004))
* Bump pyo3 from 0.20.2 to 0.20.3. ([\#16962](https://github.com/element-hq/synapse/issues/16962))
* Bump ruff from 0.1.14 to 0.3.2. ([\#16994](https://github.com/element-hq/synapse/issues/16994))
* Bump serde from 1.0.196 to 1.0.197. ([\#16963](https://github.com/element-hq/synapse/issues/16963))
* Bump serde_json from 1.0.113 to 1.0.114. ([\#16961](https://github.com/element-hq/synapse/issues/16961))
* Bump types-jsonschema from 4.21.0.20240118 to 4.21.0.20240311. ([\#17007](https://github.com/element-hq/synapse/issues/17007))
* Bump types-psycopg2 from 2.9.21.16 to 2.9.21.20240311. ([\#16995](https://github.com/element-hq/synapse/issues/16995))
* Bump types-pyopenssl from 23.3.0.0 to 24.0.0.20240311. ([\#17003](https://github.com/element-hq/synapse/issues/17003))
# Synapse 1.103.0 (2024-03-19)

No significant changes since 1.103.0rc1.

Cargo.lock
@@ -13,9 +13,9 @@ dependencies = [

 [[package]]
 name = "anyhow"
-version = "1.0.80"
+version = "1.0.81"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "5ad32ce52e4161730f7098c077cd2ed6229b5804ccf99e5366be1ab72a98b4e1"
+checksum = "0952808a6c2afd1aa8947271f3a60f1a6763c7b912d210184c5149b5cf147247"

 [[package]]
 name = "arc-swap"
@@ -138,9 +138,9 @@ dependencies = [

 [[package]]
 name = "log"
-version = "0.4.20"
+version = "0.4.21"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "b5e6163cb8c49088c2c36f57875e58ccd8c87c7427f7fbd50ea6710b2f3f2e8f"
+checksum = "90ed8c1e510134f979dbc4f070f87d4313098b704861a105fe34231c70a3901c"

 [[package]]
 name = "memchr"
@@ -186,6 +186,12 @@ dependencies = [
  "windows-sys",
 ]

+[[package]]
+name = "portable-atomic"
+version = "1.6.0"
+source = "registry+https://github.com/rust-lang/crates.io-index"
+checksum = "7170ef9988bc169ba16dd36a7fa041e5c4cbeb6a35b76d4c03daded371eae7c0"
+
 [[package]]
 name = "proc-macro2"
 version = "1.0.76"
@@ -197,9 +203,9 @@ dependencies = [

 [[package]]
 name = "pyo3"
-version = "0.20.2"
+version = "0.20.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "9a89dc7a5850d0e983be1ec2a463a171d20990487c3cfcd68b5363f1ee3d6fe0"
+checksum = "53bdbb96d49157e65d45cc287af5f32ffadd5f4761438b527b055fb0d4bb8233"
 dependencies = [
  "anyhow",
  "cfg-if",
@@ -207,6 +213,7 @@ dependencies = [
  "libc",
  "memoffset",
  "parking_lot",
+ "portable-atomic",
  "pyo3-build-config",
  "pyo3-ffi",
  "pyo3-macros",
@@ -215,9 +222,9 @@ dependencies = [

 [[package]]
 name = "pyo3-build-config"
-version = "0.20.2"
+version = "0.20.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "07426f0d8fe5a601f26293f300afd1a7b1ed5e78b2a705870c5f30893c5163be"
+checksum = "deaa5745de3f5231ce10517a1f5dd97d53e5a2fd77aa6b5842292085831d48d7"
 dependencies = [
  "once_cell",
  "target-lexicon",
@@ -225,9 +232,9 @@ dependencies = [

 [[package]]
 name = "pyo3-ffi"
-version = "0.20.2"
+version = "0.20.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "dbb7dec17e17766b46bca4f1a4215a85006b4c2ecde122076c562dd058da6cf1"
+checksum = "62b42531d03e08d4ef1f6e85a2ed422eb678b8cd62b762e53891c05faf0d4afa"
 dependencies = [
  "libc",
  "pyo3-build-config",
@@ -246,9 +253,9 @@ dependencies = [

 [[package]]
 name = "pyo3-macros"
-version = "0.20.2"
+version = "0.20.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "05f738b4e40d50b5711957f142878cfa0f28e054aa0ebdfc3fd137a843f74ed3"
+checksum = "7305c720fa01b8055ec95e484a6eca7a83c841267f0dd5280f0c8b8551d2c158"
 dependencies = [
  "proc-macro2",
  "pyo3-macros-backend",
@@ -258,12 +265,13 @@ dependencies = [

 [[package]]
 name = "pyo3-macros-backend"
-version = "0.20.2"
+version = "0.20.3"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "0fc910d4851847827daf9d6cdd4a823fbdaab5b8818325c5e97a86da79e8881f"
+checksum = "7c7e9b68bb9c3149c5b0cade5d07f953d6d125eb4337723c4ccdb665f1f96185"
 dependencies = [
  "heck",
  "proc-macro2",
+ "pyo3-build-config",
  "quote",
  "syn",
 ]
@@ -339,18 +347,18 @@ checksum = "d29ab0c6d3fc0ee92fe66e2d99f700eab17a8d57d1c1d3b748380fb20baa78cd"

 [[package]]
 name = "serde"
-version = "1.0.196"
+version = "1.0.197"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "870026e60fa08c69f064aa766c10f10b1d62db9ccd4d0abb206472bee0ce3b32"
+checksum = "3fb1c873e1b9b056a4dc4c0c198b24c3ffa059243875552b2bd0933b1aee4ce2"
 dependencies = [
  "serde_derive",
 ]

 [[package]]
 name = "serde_derive"
-version = "1.0.196"
+version = "1.0.197"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "33c85360c95e7d137454dc81d9a4ed2b8efd8fbe19cee57357b32b9771fccb67"
+checksum = "7eb0b34b42edc17f6b7cac84a52a1c5f0e1bb2227e997ca9011ea3dd34e8610b"
 dependencies = [
  "proc-macro2",
  "quote",
@@ -359,9 +367,9 @@ dependencies = [

 [[package]]
 name = "serde_json"
-version = "1.0.113"
+version = "1.0.114"
 source = "registry+https://github.com/rust-lang/crates.io-index"
-checksum = "69801b70b1c3dac963ecb03a364ba0ceda9cf60c71cfe475e99864759c8b8a79"
+checksum = "c5f09b1bd632ef549eaa9f60a1f8de742bdbc698e6cee2095fc84dde5f549ae0"
 dependencies = [
  "itoa",
  "ryu",
@@ -0,0 +1 @@
+OIDC: try to JWT decode userinfo response if JSON parsing failed.

debian/changelog
@ -1,3 +1,9 @@
matrix-synapse-py3 (1.104.0~rc1) stable; urgency=medium
* New Synapse release 1.104.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 26 Mar 2024 11:48:58 +0000
matrix-synapse-py3 (1.103.0) stable; urgency=medium matrix-synapse-py3 (1.103.0) stable; urgency=medium
* New Synapse release 1.103.0. * New Synapse release 1.103.0.

View file

@@ -1,7 +1,7 @@
 [program:postgres]
 command=/usr/local/bin/prefix-log gosu postgres postgres

-# Only start if START_POSTGRES=1
+# Only start if START_POSTGRES=true
 autostart=%(ENV_START_POSTGRES)s

 # Lower priority number = starts first
@@ -32,8 +32,9 @@ case "$SYNAPSE_COMPLEMENT_DATABASE" in
     ;;

   sqlite|"")
-    # Configure supervisord not to start Postgres, as we don't need it
-    export START_POSTGRES=false
+    # Set START_POSTGRES to false unless it has already been set
+    # (i.e. by another container image inheriting our own).
+    export START_POSTGRES=${START_POSTGRES:-false}
     ;;

   *)
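In the new line, the shell expansion `${START_POSTGRES:-false}` only supplies `false` when the variable is unset, so an image built on top of this one can export `START_POSTGRES=true` before this script runs and keep that value. A sketch of the same default-if-unset logic in Python (illustrative only, not part of the image):

```python
import os

# Mirrors the shell idiom ${START_POSTGRES:-false}: respect a value exported
# by a parent image, otherwise fall back to "false".
start_postgres = os.environ.get("START_POSTGRES", "false")
```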
@@ -7,6 +7,9 @@
 #   prefix-log command [args...]
 #

-exec 1> >(awk '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0}' >&1)
-exec 2> >(awk '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0}' >&2)
+# '-W interactive' is a `mawk` extension which disables buffering on stdout and sets line-buffered
+# reads on stdin. The effect is that the output is flushed after each line, rather than being
+# batched, which helps reduce confusion due to interleaving of the different processes.
+exec 1> >(awk -W interactive '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0 }' >&1)
+exec 2> >(awk -W interactive '{print "'"${SUPERVISOR_PROCESS_NAME}"' | "$0 }' >&2)

 exec "$@"
@@ -160,11 +160,6 @@ def run_generate_config(environ: Mapping[str, str], ownership: Optional[str]) ->
     config_path = environ.get("SYNAPSE_CONFIG_PATH", config_dir + "/homeserver.yaml")
     data_dir = environ.get("SYNAPSE_DATA_DIR", "/data")

-    if ownership is not None:
-        # make sure that synapse has perms to write to the data dir.
-        log(f"Setting ownership on {data_dir} to {ownership}")
-        subprocess.run(["chown", ownership, data_dir], check=True)
-
     # create a suitable log config from our template
     log_config_file = "%s/%s.log.config" % (config_dir, server_name)
     if not os.path.exists(log_config_file):
@@ -189,9 +184,15 @@ def run_generate_config(environ: Mapping[str, str], ownership: Optional[str]) ->
         "--generate-config",
         "--open-private-ports",
     ]
+
+    if ownership is not None:
+        # make sure that synapse has perms to write to the data dir.
+        log(f"Setting ownership on {data_dir} to {ownership}")
+        subprocess.run(["chown", ownership, data_dir], check=True)
+        args = ["gosu", ownership] + args
+
     # log("running %s" % (args, ))
-    flush_buffers()
-    os.execv(sys.executable, args)
+    subprocess.run(args, check=True)

 def main(args: List[str], environ: MutableMapping[str, str]) -> None:
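The behavioural point behind this hunk: `os.execv` replaces the running process, so nothing placed after it ever executes, while `subprocess.run` waits for the child and returns control. A standalone sketch of the difference (not Synapse code):

```python
import subprocess
import sys

# subprocess.run() waits for the child and then returns, so follow-up steps
# in the caller (such as starting Synapse under the right user) still run:
subprocess.run([sys.executable, "-c", "print('config generated')"], check=True)
print("parent continues after the child exits")

# By contrast, os.execv(sys.executable, args) would replace this process with
# the child entirely; no line after the call would ever run.
```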
@@ -68,7 +68,7 @@ Of their installation methods, we recommend
 ```shell
 pip install --user pipx
-pipx install poetry==1.5.1  # Problems with Poetry 1.6, see https://github.com/matrix-org/synapse/issues/16147
+pipx install poetry
 ```
 but see poetry's [installation instructions](https://python-poetry.org/docs/#installation)
@@ -182,7 +182,7 @@ synapse_port_db --sqlite-database homeserver.db.snapshot \
     --postgres-config homeserver-postgres.yaml
 ```

-The flag `--curses` displays a coloured curses progress UI.
+The flag `--curses` displays a coloured curses progress UI. (NOTE: if your terminal is too small the script will error out)

 If the script took a long time to complete, or time has otherwise passed
 since the original snapshot was taken, repeat the previous steps with a
@@ -26,7 +26,7 @@ for most users.
 #### Docker images and Ansible playbooks

 There is an official synapse image available at
-<https://hub.docker.com/r/vectorim/synapse> or at [`ghcr.io/element-hq/synapse`](https://ghcr.io/element-hq/synapse)
+<https://hub.docker.com/r/matrixdotorg/synapse> or at [`ghcr.io/element-hq/synapse`](https://ghcr.io/element-hq/synapse)
 which can be used with the docker-compose file available at
 [contrib/docker](https://github.com/element-hq/synapse/tree/develop/contrib/docker).
 Further information on this including configuration options is available in the README
@@ -50,11 +50,13 @@ comment these options out and use those specified by the module instead.
 A custom mapping provider must specify the following methods:

-* `def __init__(self, parsed_config)`
+* `def __init__(self, parsed_config, module_api)`
   - Arguments:
     - `parsed_config` - A configuration object that is the return value of the
       `parse_config` method. You should set any configuration options needed by
       the module here.
+    - `module_api` - a `synapse.module_api.ModuleApi` object which provides the
+      stable API available for extension modules.
 * `def parse_config(config)`
   - This method should have the `@staticmethod` decoration.
   - Arguments:
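To make the new signature concrete, here is a minimal sketch of a mapping provider accepting the extra argument. It is illustrative only: the class name and config field are invented, and the other methods a real provider must implement (such as `map_user_attributes`) are omitted for brevity.

```python
from dataclasses import dataclass


@dataclass
class MapperConfig:
    # Hypothetical option; real providers define whatever they need here.
    localpart_claim: str = "preferred_username"


class ExampleOidcMappingProvider:
    def __init__(self, parsed_config: MapperConfig, module_api) -> None:
        # `module_api` is the synapse.module_api.ModuleApi object that is now
        # passed in, matching the SAML mapping provider.
        self._config = parsed_config
        self._module_api = module_api

    @staticmethod
    def parse_config(config: dict) -> MapperConfig:
        return MapperConfig(
            localpart_claim=config.get("localpart_claim", "preferred_username"),
        )
```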
@@ -88,11 +88,11 @@ process, for example:
 dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb
 ```

-Generally Synapse database schemas are compatible across multiple versions, once
-a version of Synapse is deployed you may not be able to rollback automatically.
+Generally Synapse database schemas are compatible across multiple versions, but once
+a version of Synapse is deployed you may not be able to roll back automatically.
 The following table gives the version ranges and the earliest version they can
 be rolled back to. E.g. Synapse versions v1.58.0 through v1.61.1 can be rolled
-back safely to v1.57.0, but starting with v1.62.0 it is only safe to rollback to
+back safely to v1.57.0, but starting with v1.62.0 it is only safe to roll back to
 v1.61.0.

 <!-- REPLACE_WITH_SCHEMA_VERSIONS -->
@@ -205,3 +205,12 @@ SELECT user_id, device_id, user_agent, TO_TIMESTAMP(last_seen / 1000) AS "last_s
 FROM devices
 WHERE last_seen < DATE_PART('epoch', NOW() - INTERVAL '3 month') * 1000;
 ```
+
+## Clear the cache of a remote user's device list
+
+Forces a resync of a remote user's device list: useful if you have somehow cached a bad state and the
+remote server will not send out a device list update.
+```sql
+INSERT INTO device_lists_remote_resync
+VALUES ('USER_ID', (EXTRACT(epoch FROM NOW()) * 1000)::BIGINT);
+```
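If you run this from a maintenance script rather than from psql, a parameterized statement avoids quoting mistakes. A sketch assuming psycopg2 and direct access to the Synapse database; the DSN and user ID are placeholders:

```python
import time

import psycopg2  # hypothetical setup: adjust the DSN to your deployment

conn = psycopg2.connect("dbname=synapse user=synapse_user")
with conn, conn.cursor() as cur:
    # Same insert as above: remote user ID plus current time in milliseconds.
    cur.execute(
        "INSERT INTO device_lists_remote_resync VALUES (%s, %s)",
        ("@someone:remote.example.com", int(time.time() * 1000)),
    )
```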
@@ -3349,6 +3349,9 @@ Options for each entry include:
   not included in `scopes`. Set to `userinfo_endpoint` to always use the
   userinfo endpoint.

+* `additional_authorization_parameters`: String to string dictionary that will be passed as
+  additional parameters to the authorization grant URL.
+
 * `allow_existing_users`: set to true to allow a user logging in via OIDC to
   match a pre-existing account instead of failing. This could be used if
   switching from password logins to OIDC. Defaults to false.
@@ -3473,6 +3476,8 @@ oidc_providers:
     token_endpoint: "https://accounts.example.com/oauth2/token"
     userinfo_endpoint: "https://accounts.example.com/userinfo"
     jwks_uri: "https://accounts.example.com/.well-known/jwks.json"
+    additional_authorization_parameters:
+      acr_values: 2fa
     skip_verification: true
     enable_registration: true
     user_mapping_provider:
@@ -100,10 +100,30 @@ function sortVersions(a, b) {
     if (a === 'develop' || a === 'latest') return -1;
     if (b === 'develop' || b === 'latest') return 1;

-    const versionA = (a.match(/v\d+(\.\d+)+/) || [])[0];
-    const versionB = (b.match(/v\d+(\.\d+)+/) || [])[0];
+    // If any of the versions do not conform to a semantic version string, they
+    // will be sorted behind a valid version.
+    const versionA = (a.match(/v(\d+(\.\d+)+)/) || [])[1]?.split('.') ?? '';
+    const versionB = (b.match(/v(\d+(\.\d+)+)/) || [])[1]?.split('.') ?? '';

-    return versionB.localeCompare(versionA);
+    for (let i = 0; i < Math.max(versionA.length, versionB.length); i++) {
+        if (versionB[i] === undefined) {
+            return -1;
+        }
+        if (versionA[i] === undefined) {
+            return 1;
+        }
+
+        const partA = parseInt(versionA[i], 10);
+        const partB = parseInt(versionB[i], 10);
+
+        if (partA > partB) {
+            return -1;
+        } else if (partB > partA) {
+            return 1;
+        }
+    }
+
+    return 0;
 }

 /**
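For readers skimming the diff: the old comparator compared whole version strings lexically, which sorted e.g. v1.99 above v1.104; the new one compares numeric components so newer releases sort first. An approximate Python rendering of the same ordering (illustrative only):

```python
import re

def version_key(name: str):
    # Names that do not look like versions sort behind valid ones,
    # mirroring the comment in the JavaScript above.
    m = re.search(r"v(\d+(?:\.\d+)+)", name)
    if not m:
        return (1, [])
    # Negate each numeric component so sorted() yields newest first.
    return (0, [-int(part) for part in m.group(1).split(".")])

names = ["v1.99.0", "v1.104.0", "v1.103.0", "some-branch"]
print(sorted(names, key=version_key))
# ['v1.104.0', 'v1.103.0', 'v1.99.0', 'some-branch']
```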
poetry.lock
@@ -169,29 +169,33 @@ lxml = ["lxml"]
 [[package]]
 name = "black"
-version = "23.10.1"
+version = "24.2.0"
 description = "The uncompromising code formatter."
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "black-23.10.1-cp310-cp310-macosx_10_16_arm64.whl", hash = "sha256:ec3f8e6234c4e46ff9e16d9ae96f4ef69fa328bb4ad08198c8cee45bb1f08c69"},
-    {file = "black-23.10.1-cp310-cp310-macosx_10_16_x86_64.whl", hash = "sha256:1b917a2aa020ca600483a7b340c165970b26e9029067f019e3755b56e8dd5916"},
-    {file = "black-23.10.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9c74de4c77b849e6359c6f01987e94873c707098322b91490d24296f66d067dc"},
-    {file = "black-23.10.1-cp310-cp310-win_amd64.whl", hash = "sha256:7b4d10b0f016616a0d93d24a448100adf1699712fb7a4efd0e2c32bbb219b173"},
-    {file = "black-23.10.1-cp311-cp311-macosx_10_16_arm64.whl", hash = "sha256:b15b75fc53a2fbcac8a87d3e20f69874d161beef13954747e053bca7a1ce53a0"},
-    {file = "black-23.10.1-cp311-cp311-macosx_10_16_x86_64.whl", hash = "sha256:e293e4c2f4a992b980032bbd62df07c1bcff82d6964d6c9496f2cd726e246ace"},
-    {file = "black-23.10.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7d56124b7a61d092cb52cce34182a5280e160e6aff3137172a68c2c2c4b76bcb"},
-    {file = "black-23.10.1-cp311-cp311-win_amd64.whl", hash = "sha256:3f157a8945a7b2d424da3335f7ace89c14a3b0625e6593d21139c2d8214d55ce"},
-    {file = "black-23.10.1-cp38-cp38-macosx_10_16_arm64.whl", hash = "sha256:cfcce6f0a384d0da692119f2d72d79ed07c7159879d0bb1bb32d2e443382bf3a"},
-    {file = "black-23.10.1-cp38-cp38-macosx_10_16_x86_64.whl", hash = "sha256:33d40f5b06be80c1bbce17b173cda17994fbad096ce60eb22054da021bf933d1"},
-    {file = "black-23.10.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:840015166dbdfbc47992871325799fd2dc0dcf9395e401ada6d88fe11498abad"},
-    {file = "black-23.10.1-cp38-cp38-win_amd64.whl", hash = "sha256:037e9b4664cafda5f025a1728c50a9e9aedb99a759c89f760bd83730e76ba884"},
-    {file = "black-23.10.1-cp39-cp39-macosx_10_16_arm64.whl", hash = "sha256:7cb5936e686e782fddb1c73f8aa6f459e1ad38a6a7b0e54b403f1f05a1507ee9"},
-    {file = "black-23.10.1-cp39-cp39-macosx_10_16_x86_64.whl", hash = "sha256:7670242e90dc129c539e9ca17665e39a146a761e681805c54fbd86015c7c84f7"},
-    {file = "black-23.10.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5ed45ac9a613fb52dad3b61c8dea2ec9510bf3108d4db88422bacc7d1ba1243d"},
-    {file = "black-23.10.1-cp39-cp39-win_amd64.whl", hash = "sha256:6d23d7822140e3fef190734216cefb262521789367fbdc0b3f22af6744058982"},
-    {file = "black-23.10.1-py3-none-any.whl", hash = "sha256:d431e6739f727bb2e0495df64a6c7a5310758e87505f5f8cde9ff6c0f2d7e4fe"},
-    {file = "black-23.10.1.tar.gz", hash = "sha256:1f8ce316753428ff68749c65a5f7844631aa18c8679dfd3ca9dc1a289979c258"},
+    {file = "black-24.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:6981eae48b3b33399c8757036c7f5d48a535b962a7c2310d19361edeef64ce29"},
+    {file = "black-24.2.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:d533d5e3259720fdbc1b37444491b024003e012c5173f7d06825a77508085430"},
+    {file = "black-24.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61a0391772490ddfb8a693c067df1ef5227257e72b0e4108482b8d41b5aee13f"},
+    {file = "black-24.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:992e451b04667116680cb88f63449267c13e1ad134f30087dec8527242e9862a"},
+    {file = "black-24.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:163baf4ef40e6897a2a9b83890e59141cc8c2a98f2dda5080dc15c00ee1e62cd"},
+    {file = "black-24.2.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e37c99f89929af50ffaf912454b3e3b47fd64109659026b678c091a4cd450fb2"},
+    {file = "black-24.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4f9de21bafcba9683853f6c96c2d515e364aee631b178eaa5145fc1c61a3cc92"},
+    {file = "black-24.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:9db528bccb9e8e20c08e716b3b09c6bdd64da0dd129b11e160bf082d4642ac23"},
+    {file = "black-24.2.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:d84f29eb3ee44859052073b7636533ec995bd0f64e2fb43aeceefc70090e752b"},
+    {file = "black-24.2.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1e08fb9a15c914b81dd734ddd7fb10513016e5ce7e6704bdd5e1251ceee51ac9"},
+    {file = "black-24.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:810d445ae6069ce64030c78ff6127cd9cd178a9ac3361435708b907d8a04c693"},
+    {file = "black-24.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:ba15742a13de85e9b8f3239c8f807723991fbfae24bad92d34a2b12e81904982"},
+    {file = "black-24.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:7e53a8c630f71db01b28cd9602a1ada68c937cbf2c333e6ed041390d6968faf4"},
+    {file = "black-24.2.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:93601c2deb321b4bad8f95df408e3fb3943d85012dddb6121336b8e24a0d1218"},
+    {file = "black-24.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a0057f800de6acc4407fe75bb147b0c2b5cbb7c3ed110d3e5999cd01184d53b0"},
+    {file = "black-24.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:faf2ee02e6612577ba0181f4347bcbcf591eb122f7841ae5ba233d12c39dcb4d"},
+    {file = "black-24.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:057c3dc602eaa6fdc451069bd027a1b2635028b575a6c3acfd63193ced20d9c8"},
+    {file = "black-24.2.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:08654d0797e65f2423f850fc8e16a0ce50925f9337fb4a4a176a7aa4026e63f8"},
+    {file = "black-24.2.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ca610d29415ee1a30a3f30fab7a8f4144e9d34c89a235d81292a1edb2b55f540"},
+    {file = "black-24.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:4dd76e9468d5536abd40ffbc7a247f83b2324f0c050556d9c371c2b9a9a95e31"},
+    {file = "black-24.2.0-py3-none-any.whl", hash = "sha256:e8a6ae970537e67830776488bca52000eaa37fa63b9988e8c487458d9cd5ace6"},
+    {file = "black-24.2.0.tar.gz", hash = "sha256:bce4f25c27c3435e4dace4815bcb2008b87e167e3bf4ee47ccdc5ce906eb4894"},
 ]

 [package.dependencies]
@@ -205,7 +209,7 @@ typing-extensions = {version = ">=4.0.1", markers = "python_version < \"3.11\""}
 [package.extras]
 colorama = ["colorama (>=0.4.3)"]
-d = ["aiohttp (>=3.7.4)"]
+d = ["aiohttp (>=3.7.4)", "aiohttp (>=3.7.4,!=3.9.0)"]
 jupyter = ["ipython (>=7.8.0)", "tokenize-rt (>=3.2.0)"]
 uvloop = ["uvloop (>=0.15.2)"]
@@ -461,47 +465,56 @@ files = [
 [[package]]
 name = "cryptography"
-version = "41.0.7"
+version = "42.0.5"
 description = "cryptography is a package which provides cryptographic recipes and primitives to Python developers."
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "cryptography-41.0.7-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:3c78451b78313fa81607fa1b3f1ae0a5ddd8014c38a02d9db0616133987b9cdf"},
-    {file = "cryptography-41.0.7-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:928258ba5d6f8ae644e764d0f996d61a8777559f72dfeb2eea7e2fe0ad6e782d"},
-    {file = "cryptography-41.0.7-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5a1b41bc97f1ad230a41657d9155113c7521953869ae57ac39ac7f1bb471469a"},
-    {file = "cryptography-41.0.7-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:841df4caa01008bad253bce2a6f7b47f86dc9f08df4b433c404def869f590a15"},
-    {file = "cryptography-41.0.7-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:5429ec739a29df2e29e15d082f1d9ad683701f0ec7709ca479b3ff2708dae65a"},
-    {file = "cryptography-41.0.7-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:43f2552a2378b44869fe8827aa19e69512e3245a219104438692385b0ee119d1"},
-    {file = "cryptography-41.0.7-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:af03b32695b24d85a75d40e1ba39ffe7db7ffcb099fe507b39fd41a565f1b157"},
-    {file = "cryptography-41.0.7-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:49f0805fc0b2ac8d4882dd52f4a3b935b210935d500b6b805f321addc8177406"},
-    {file = "cryptography-41.0.7-cp37-abi3-win32.whl", hash = "sha256:f983596065a18a2183e7f79ab3fd4c475205b839e02cbc0efbbf9666c4b3083d"},
-    {file = "cryptography-41.0.7-cp37-abi3-win_amd64.whl", hash = "sha256:90452ba79b8788fa380dfb587cca692976ef4e757b194b093d845e8d99f612f2"},
-    {file = "cryptography-41.0.7-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:079b85658ea2f59c4f43b70f8119a52414cdb7be34da5d019a77bf96d473b960"},
-    {file = "cryptography-41.0.7-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:b640981bf64a3e978a56167594a0e97db71c89a479da8e175d8bb5be5178c003"},
-    {file = "cryptography-41.0.7-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:e3114da6d7f95d2dee7d3f4eec16dacff819740bbab931aff8648cb13c5ff5e7"},
-    {file = "cryptography-41.0.7-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:d5ec85080cce7b0513cfd233914eb8b7bbd0633f1d1703aa28d1dd5a72f678ec"},
-    {file = "cryptography-41.0.7-pp38-pypy38_pp73-macosx_10_12_x86_64.whl", hash = "sha256:7a698cb1dac82c35fcf8fe3417a3aaba97de16a01ac914b89a0889d364d2f6be"},
-    {file = "cryptography-41.0.7-pp38-pypy38_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:37a138589b12069efb424220bf78eac59ca68b95696fc622b6ccc1c0a197204a"},
-    {file = "cryptography-41.0.7-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:68a2dec79deebc5d26d617bfdf6e8aab065a4f34934b22d3b5010df3ba36612c"},
-    {file = "cryptography-41.0.7-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:09616eeaef406f99046553b8a40fbf8b1e70795a91885ba4c96a70793de5504a"},
-    {file = "cryptography-41.0.7-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:48a0476626da912a44cc078f9893f292f0b3e4c739caf289268168d8f4702a39"},
-    {file = "cryptography-41.0.7-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:c7f3201ec47d5207841402594f1d7950879ef890c0c495052fa62f58283fde1a"},
-    {file = "cryptography-41.0.7-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:c5ca78485a255e03c32b513f8c2bc39fedb7f5c5f8535545bdc223a03b24f248"},
-    {file = "cryptography-41.0.7-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:d6c391c021ab1f7a82da5d8d0b3cee2f4b2c455ec86c8aebbc84837a631ff309"},
-    {file = "cryptography-41.0.7.tar.gz", hash = "sha256:13f93ce9bea8016c253b34afc6bd6a75993e5c40672ed5405a9c832f0d4a00bc"},
+    {file = "cryptography-42.0.5-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:a30596bae9403a342c978fb47d9b0ee277699fa53bbafad14706af51fe543d16"},
+    {file = "cryptography-42.0.5-cp37-abi3-macosx_10_12_x86_64.whl", hash = "sha256:b7ffe927ee6531c78f81aa17e684e2ff617daeba7f189f911065b2ea2d526dec"},
+    {file = "cryptography-42.0.5-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2424ff4c4ac7f6b8177b53c17ed5d8fa74ae5955656867f5a8affaca36a27abb"},
+    {file = "cryptography-42.0.5-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:329906dcc7b20ff3cad13c069a78124ed8247adcac44b10bea1130e36caae0b4"},
+    {file = "cryptography-42.0.5-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:b03c2ae5d2f0fc05f9a2c0c997e1bc18c8229f392234e8a0194f202169ccd278"},
+    {file = "cryptography-42.0.5-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8837fe1d6ac4a8052a9a8ddab256bc006242696f03368a4009be7ee3075cdb7"},
+    {file = "cryptography-42.0.5-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:0270572b8bd2c833c3981724b8ee9747b3ec96f699a9665470018594301439ee"},
+    {file = "cryptography-42.0.5-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:b8cac287fafc4ad485b8a9b67d0ee80c66bf3574f655d3b97ef2e1082360faf1"},
+    {file = "cryptography-42.0.5-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:16a48c23a62a2f4a285699dba2e4ff2d1cff3115b9df052cdd976a18856d8e3d"},
+    {file = "cryptography-42.0.5-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:2bce03af1ce5a5567ab89bd90d11e7bbdff56b8af3acbbec1faded8f44cb06da"},
+    {file = "cryptography-42.0.5-cp37-abi3-win32.whl", hash = "sha256:b6cd2203306b63e41acdf39aa93b86fb566049aeb6dc489b70e34bcd07adca74"},
+    {file = "cryptography-42.0.5-cp37-abi3-win_amd64.whl", hash = "sha256:98d8dc6d012b82287f2c3d26ce1d2dd130ec200c8679b6213b3c73c08b2b7940"},
+    {file = "cryptography-42.0.5-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:5e6275c09d2badf57aea3afa80d975444f4be8d3bc58f7f80d2a484c6f9485c8"},
+    {file = "cryptography-42.0.5-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4985a790f921508f36f81831817cbc03b102d643b5fcb81cd33df3fa291a1a1"},
+    {file = "cryptography-42.0.5-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7cde5f38e614f55e28d831754e8a3bacf9ace5d1566235e39d91b35502d6936e"},
+    {file = "cryptography-42.0.5-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:7367d7b2eca6513681127ebad53b2582911d1736dc2ffc19f2c3ae49997496bc"},
+    {file = "cryptography-42.0.5-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:cd2030f6650c089aeb304cf093f3244d34745ce0cfcc39f20c6fbfe030102e2a"},
+    {file = "cryptography-42.0.5-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:a2913c5375154b6ef2e91c10b5720ea6e21007412f6437504ffea2109b5a33d7"},
+    {file = "cryptography-42.0.5-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:c41fb5e6a5fe9ebcd58ca3abfeb51dffb5d83d6775405305bfa8715b76521922"},
+    {file = "cryptography-42.0.5-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3eaafe47ec0d0ffcc9349e1708be2aaea4c6dd4978d76bf6eb0cb2c13636c6fc"},
+    {file = "cryptography-42.0.5-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:1b95b98b0d2af784078fa69f637135e3c317091b615cd0905f8b8a087e86fa30"},
+    {file = "cryptography-42.0.5-cp39-abi3-win32.whl", hash = "sha256:1f71c10d1e88467126f0efd484bd44bca5e14c664ec2ede64c32f20875c0d413"},
+    {file = "cryptography-42.0.5-cp39-abi3-win_amd64.whl", hash = "sha256:a011a644f6d7d03736214d38832e030d8268bcff4a41f728e6030325fea3e400"},
+    {file = "cryptography-42.0.5-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:9481ffe3cf013b71b2428b905c4f7a9a4f76ec03065b05ff499bb5682a8d9ad8"},
+    {file = "cryptography-42.0.5-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:ba334e6e4b1d92442b75ddacc615c5476d4ad55cc29b15d590cc6b86efa487e2"},
+    {file = "cryptography-42.0.5-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:ba3e4a42397c25b7ff88cdec6e2a16c2be18720f317506ee25210f6d31925f9c"},
+    {file = "cryptography-42.0.5-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:111a0d8553afcf8eb02a4fea6ca4f59d48ddb34497aa8706a6cf536f1a5ec576"},
+    {file = "cryptography-42.0.5-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:cd65d75953847815962c84a4654a84850b2bb4aed3f26fadcc1c13892e1e29f6"},
+    {file = "cryptography-42.0.5-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:e807b3188f9eb0eaa7bbb579b462c5ace579f1cedb28107ce8b48a9f7ad3679e"},
+    {file = "cryptography-42.0.5-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:f12764b8fffc7a123f641d7d049d382b73f96a34117e0b637b80643169cec8ac"},
+    {file = "cryptography-42.0.5-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:37dd623507659e08be98eec89323469e8c7b4c1407c85112634ae3dbdb926fdd"},
+    {file = "cryptography-42.0.5.tar.gz", hash = "sha256:6fe07eec95dfd477eb9530aef5bead34fec819b3aaf6c5bd6d20565da607bfe1"},
 ]

 [package.dependencies]
-cffi = ">=1.12"
+cffi = {version = ">=1.12", markers = "platform_python_implementation != \"PyPy\""}

 [package.extras]
 docs = ["sphinx (>=5.3.0)", "sphinx-rtd-theme (>=1.1.1)"]
-docstest = ["pyenchant (>=1.6.11)", "sphinxcontrib-spelling (>=4.0.1)", "twine (>=1.12.0)"]
+docstest = ["pyenchant (>=1.6.11)", "readme-renderer", "sphinxcontrib-spelling (>=4.0.1)"]
 nox = ["nox"]
-pep8test = ["black", "check-sdist", "mypy", "ruff"]
+pep8test = ["check-sdist", "click", "mypy", "ruff"]
 sdist = ["build"]
 ssh = ["bcrypt (>=3.1.5)"]
-test = ["pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
+test = ["certifi", "pretend", "pytest (>=6.2.0)", "pytest-benchmark", "pytest-cov", "pytest-xdist"]
 test-randomorder = ["pytest-randomly"]

 [[package]]
@@ -988,13 +1001,13 @@ trio = ["async_generator", "trio"]
 [[package]]
 name = "jinja2"
-version = "3.1.2"
+version = "3.1.3"
 description = "A very fast and expressive template engine."
 optional = false
 python-versions = ">=3.7"
 files = [
-    {file = "Jinja2-3.1.2-py3-none-any.whl", hash = "sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61"},
-    {file = "Jinja2-3.1.2.tar.gz", hash = "sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852"},
+    {file = "Jinja2-3.1.3-py3-none-any.whl", hash = "sha256:7d6d50dd97d52cbc355597bd845fabfbac3f551e1f99619e39a35ce8c370b5fa"},
+    {file = "Jinja2-3.1.3.tar.gz", hash = "sha256:ac8bd6544d4bb2c9792bf3a159e80bba8fda7f07e81bc3aed565432d5925ba90"},
 ]

 [package.dependencies]
@@ -1459,38 +1472,38 @@ files = [
 [[package]]
 name = "mypy"
-version = "1.5.1"
+version = "1.8.0"
 description = "Optional static typing for Python"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "mypy-1.5.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f33592ddf9655a4894aef22d134de7393e95fcbdc2d15c1ab65828eee5c66c70"},
-    {file = "mypy-1.5.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:258b22210a4a258ccd077426c7a181d789d1121aca6db73a83f79372f5569ae0"},
-    {file = "mypy-1.5.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a9ec1f695f0c25986e6f7f8778e5ce61659063268836a38c951200c57479cc12"},
-    {file = "mypy-1.5.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:abed92d9c8f08643c7d831300b739562b0a6c9fcb028d211134fc9ab20ccad5d"},
-    {file = "mypy-1.5.1-cp310-cp310-win_amd64.whl", hash = "sha256:a156e6390944c265eb56afa67c74c0636f10283429171018446b732f1a05af25"},
-    {file = "mypy-1.5.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:6ac9c21bfe7bc9f7f1b6fae441746e6a106e48fc9de530dea29e8cd37a2c0cc4"},
-    {file = "mypy-1.5.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:51cb1323064b1099e177098cb939eab2da42fea5d818d40113957ec954fc85f4"},
-    {file = "mypy-1.5.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:596fae69f2bfcb7305808c75c00f81fe2829b6236eadda536f00610ac5ec2243"},
-    {file = "mypy-1.5.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:32cb59609b0534f0bd67faebb6e022fe534bdb0e2ecab4290d683d248be1b275"},
-    {file = "mypy-1.5.1-cp311-cp311-win_amd64.whl", hash = "sha256:159aa9acb16086b79bbb0016145034a1a05360626046a929f84579ce1666b315"},
-    {file = "mypy-1.5.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:f6b0e77db9ff4fda74de7df13f30016a0a663928d669c9f2c057048ba44f09bb"},
-    {file = "mypy-1.5.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:26f71b535dfc158a71264e6dc805a9f8d2e60b67215ca0bfa26e2e1aa4d4d373"},
-    {file = "mypy-1.5.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2fc3a600f749b1008cc75e02b6fb3d4db8dbcca2d733030fe7a3b3502902f161"},
-    {file = "mypy-1.5.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:26fb32e4d4afa205b24bf645eddfbb36a1e17e995c5c99d6d00edb24b693406a"},
-    {file = "mypy-1.5.1-cp312-cp312-win_amd64.whl", hash = "sha256:82cb6193de9bbb3844bab4c7cf80e6227d5225cc7625b068a06d005d861ad5f1"},
-    {file = "mypy-1.5.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:4a465ea2ca12804d5b34bb056be3a29dc47aea5973b892d0417c6a10a40b2d65"},
-    {file = "mypy-1.5.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:9fece120dbb041771a63eb95e4896791386fe287fefb2837258925b8326d6160"},
-    {file = "mypy-1.5.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d28ddc3e3dfeab553e743e532fb95b4e6afad51d4706dd22f28e1e5e664828d2"},
-    {file = "mypy-1.5.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:57b10c56016adce71fba6bc6e9fd45d8083f74361f629390c556738565af8eeb"},
-    {file = "mypy-1.5.1-cp38-cp38-win_amd64.whl", hash = "sha256:ff0cedc84184115202475bbb46dd99f8dcb87fe24d5d0ddfc0fe6b8575c88d2f"},
-    {file = "mypy-1.5.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:8f772942d372c8cbac575be99f9cc9d9fb3bd95c8bc2de6c01411e2c84ebca8a"},
-    {file = "mypy-1.5.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:5d627124700b92b6bbaa99f27cbe615c8ea7b3402960f6372ea7d65faf376c14"},
-    {file = "mypy-1.5.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:361da43c4f5a96173220eb53340ace68cda81845cd88218f8862dfb0adc8cddb"},
-    {file = "mypy-1.5.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:330857f9507c24de5c5724235e66858f8364a0693894342485e543f5b07c8693"},
-    {file = "mypy-1.5.1-cp39-cp39-win_amd64.whl", hash = "sha256:c543214ffdd422623e9fedd0869166c2f16affe4ba37463975043ef7d2ea8770"},
-    {file = "mypy-1.5.1-py3-none-any.whl", hash = "sha256:f757063a83970d67c444f6e01d9550a7402322af3557ce7630d3c957386fa8f5"},
-    {file = "mypy-1.5.1.tar.gz", hash = "sha256:b031b9601f1060bf1281feab89697324726ba0c0bae9d7cd7ab4b690940f0b92"},
+    {file = "mypy-1.8.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:485a8942f671120f76afffff70f259e1cd0f0cfe08f81c05d8816d958d4577d3"},
+    {file = "mypy-1.8.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:df9824ac11deaf007443e7ed2a4a26bebff98d2bc43c6da21b2b64185da011c4"},
+    {file = "mypy-1.8.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2afecd6354bbfb6e0160f4e4ad9ba6e4e003b767dd80d85516e71f2e955ab50d"},
+    {file = "mypy-1.8.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:8963b83d53ee733a6e4196954502b33567ad07dfd74851f32be18eb932fb1cb9"},
+    {file = "mypy-1.8.0-cp310-cp310-win_amd64.whl", hash = "sha256:e46f44b54ebddbeedbd3d5b289a893219065ef805d95094d16a0af6630f5d410"},
+    {file = "mypy-1.8.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:855fe27b80375e5c5878492f0729540db47b186509c98dae341254c8f45f42ae"},
+    {file = "mypy-1.8.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4c886c6cce2d070bd7df4ec4a05a13ee20c0aa60cb587e8d1265b6c03cf91da3"},
+    {file = "mypy-1.8.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:d19c413b3c07cbecf1f991e2221746b0d2a9410b59cb3f4fb9557f0365a1a817"},
+    {file = "mypy-1.8.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:9261ed810972061388918c83c3f5cd46079d875026ba97380f3e3978a72f503d"},
+    {file = "mypy-1.8.0-cp311-cp311-win_amd64.whl", hash = "sha256:51720c776d148bad2372ca21ca29256ed483aa9a4cdefefcef49006dff2a6835"},
+    {file = "mypy-1.8.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:52825b01f5c4c1c4eb0db253ec09c7aa17e1a7304d247c48b6f3599ef40db8bd"},
+    {file = "mypy-1.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f5ac9a4eeb1ec0f1ccdc6f326bcdb464de5f80eb07fb38b5ddd7b0de6bc61e55"},
+    {file = "mypy-1.8.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:afe3fe972c645b4632c563d3f3eff1cdca2fa058f730df2b93a35e3b0c538218"},
+    {file = "mypy-1.8.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:42c6680d256ab35637ef88891c6bd02514ccb7e1122133ac96055ff458f93fc3"},
+    {file = "mypy-1.8.0-cp312-cp312-win_amd64.whl", hash = "sha256:720a5ca70e136b675af3af63db533c1c8c9181314d207568bbe79051f122669e"},
+    {file = "mypy-1.8.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:028cf9f2cae89e202d7b6593cd98db6759379f17a319b5faf4f9978d7084cdc6"},
+    {file = "mypy-1.8.0-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:4e6d97288757e1ddba10dd9549ac27982e3e74a49d8d0179fc14d4365c7add66"},
+    {file = "mypy-1.8.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f1478736fcebb90f97e40aff11a5f253af890c845ee0c850fe80aa060a267c6"},
+    {file = "mypy-1.8.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:42419861b43e6962a649068a61f4a4839205a3ef525b858377a960b9e2de6e0d"},
+    {file = "mypy-1.8.0-cp38-cp38-win_amd64.whl", hash = "sha256:2b5b6c721bd4aabaadead3a5e6fa85c11c6c795e0c81a7215776ef8afc66de02"},
+    {file = "mypy-1.8.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:5c1538c38584029352878a0466f03a8ee7547d7bd9f641f57a0f3017a7c905b8"},
+    {file = "mypy-1.8.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:4ef4be7baf08a203170f29e89d79064463b7fc7a0908b9d0d5114e8009c3a259"},
+    {file = "mypy-1.8.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7178def594014aa6c35a8ff411cf37d682f428b3b5617ca79029d8ae72f5402b"},
+    {file = "mypy-1.8.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:ab3c84fa13c04aeeeabb2a7f67a25ef5d77ac9d6486ff33ded762ef353aa5592"},
+    {file = "mypy-1.8.0-cp39-cp39-win_amd64.whl", hash = "sha256:99b00bc72855812a60d253420d8a2eae839b0afa4938f09f4d2aa9bb4654263a"},
+    {file = "mypy-1.8.0-py3-none-any.whl", hash = "sha256:538fd81bb5e430cc1381a443971c0475582ff9f434c16cd46d2c66763ce85d9d"},
+    {file = "mypy-1.8.0.tar.gz", hash = "sha256:6ff8b244d7085a0b425b56d327b480c3b29cafbd2eff27316a004f9a7391ae07"},
 ]

 [package.dependencies]
@@ -1501,6 +1514,7 @@ typing-extensions = ">=4.1.0"
 [package.extras]
 dmypy = ["psutil (>=4.0)"]
 install-types = ["pip"]
+mypyc = ["setuptools (>=50)"]
 reports = ["lxml"]

 [[package]]
@@ -1561,15 +1575,18 @@ testing-docutils = ["pygments", "pytest (>=7,<8)", "pytest-param-files (>=0.3.4,
 [[package]]
 name = "netaddr"
-version = "0.9.0"
+version = "1.2.1"
 description = "A network address manipulation library for Python"
 optional = false
-python-versions = "*"
+python-versions = ">=3.7"
 files = [
-    {file = "netaddr-0.9.0-py3-none-any.whl", hash = "sha256:5148b1055679d2a1ec070c521b7db82137887fabd6d7e37f5199b44f775c3bb1"},
-    {file = "netaddr-0.9.0.tar.gz", hash = "sha256:7b46fa9b1a2d71fd5de9e4a3784ef339700a53a08c8040f08baf5f1194da0128"},
+    {file = "netaddr-1.2.1-py3-none-any.whl", hash = "sha256:bd9e9534b0d46af328cf64f0e5a23a5a43fca292df221c85580b27394793496e"},
+    {file = "netaddr-1.2.1.tar.gz", hash = "sha256:6eb8fedf0412c6d294d06885c110de945cf4d22d2b510d0404f4e06950857987"},
 ]

+[package.extras]
+nicer-shell = ["ipython"]
+
 [[package]]
 name = "opentracing"
 version = "2.4.0"
@@ -1856,18 +1873,18 @@ files = [
 [[package]]
 name = "pydantic"
-version = "2.6.0"
+version = "2.6.4"
 description = "Data validation using Python type hints"
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "pydantic-2.6.0-py3-none-any.whl", hash = "sha256:1440966574e1b5b99cf75a13bec7b20e3512e8a61b894ae252f56275e2c465ae"},
-    {file = "pydantic-2.6.0.tar.gz", hash = "sha256:ae887bd94eb404b09d86e4d12f93893bdca79d766e738528c6fa1c849f3c6bcf"},
+    {file = "pydantic-2.6.4-py3-none-any.whl", hash = "sha256:cc46fce86607580867bdc3361ad462bab9c222ef042d3da86f2fb333e1d916c5"},
+    {file = "pydantic-2.6.4.tar.gz", hash = "sha256:b1704e0847db01817624a6b86766967f552dd9dbf3afba4004409f908dcc84e6"},
 ]

 [package.dependencies]
 annotated-types = ">=0.4.0"
-pydantic-core = "2.16.1"
+pydantic-core = "2.16.3"
 typing-extensions = ">=4.6.1"

 [package.extras]
@@ -1875,90 +1892,90 @@ email = ["email-validator (>=2.0.0)"]
 [[package]]
 name = "pydantic-core"
-version = "2.16.1"
+version = "2.16.3"
 description = ""
 optional = false
 python-versions = ">=3.8"
 files = [
-    {file = "pydantic_core-2.16.1-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:300616102fb71241ff477a2cbbc847321dbec49428434a2f17f37528721c4948"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:5511f962dd1b9b553e9534c3b9c6a4b0c9ded3d8c2be96e61d56f933feef9e1f"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:98f0edee7ee9cc7f9221af2e1b95bd02810e1c7a6d115cfd82698803d385b28f"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9795f56aa6b2296f05ac79d8a424e94056730c0b860a62b0fdcfe6340b658cc8"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c45f62e4107ebd05166717ac58f6feb44471ed450d07fecd90e5f69d9bf03c48"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:462d599299c5971f03c676e2b63aa80fec5ebc572d89ce766cd11ca8bcb56f3f"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:21ebaa4bf6386a3b22eec518da7d679c8363fb7fb70cf6972161e5542f470798"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:99f9a50b56713a598d33bc23a9912224fc5d7f9f292444e6664236ae471ddf17"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:8ec364e280db4235389b5e1e6ee924723c693cbc98e9d28dc1767041ff9bc388"},
-    {file = "pydantic_core-2.16.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:653a5dfd00f601a0ed6654a8b877b18d65ac32c9d9997456e0ab240807be6cf7"},
-    {file = "pydantic_core-2.16.1-cp310-none-win32.whl", hash = "sha256:1661c668c1bb67b7cec96914329d9ab66755911d093bb9063c4c8914188af6d4"},
-    {file = "pydantic_core-2.16.1-cp310-none-win_amd64.whl", hash = "sha256:561be4e3e952c2f9056fba5267b99be4ec2afadc27261505d4992c50b33c513c"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:102569d371fadc40d8f8598a59379c37ec60164315884467052830b28cc4e9da"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:735dceec50fa907a3c314b84ed609dec54b76a814aa14eb90da31d1d36873a5e"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e83ebbf020be727d6e0991c1b192a5c2e7113eb66e3def0cd0c62f9f266247e4"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:30a8259569fbeec49cfac7fda3ec8123486ef1b729225222f0d41d5f840b476f"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:920c4897e55e2881db6a6da151198e5001552c3777cd42b8a4c2f72eedc2ee91"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f5247a3d74355f8b1d780d0f3b32a23dd9f6d3ff43ef2037c6dcd249f35ecf4c"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2d5bea8012df5bb6dda1e67d0563ac50b7f64a5d5858348b5c8cb5043811c19d"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:ed3025a8a7e5a59817b7494686d449ebfbe301f3e757b852c8d0d1961d6be864"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:06f0d5a1d9e1b7932477c172cc720b3b23c18762ed7a8efa8398298a59d177c7"},
-    {file = "pydantic_core-2.16.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:150ba5c86f502c040b822777e2e519b5625b47813bd05f9273a8ed169c97d9ae"},
-    {file = "pydantic_core-2.16.1-cp311-none-win32.whl", hash = "sha256:d6cbdf12ef967a6aa401cf5cdf47850559e59eedad10e781471c960583f25aa1"},
-    {file = "pydantic_core-2.16.1-cp311-none-win_amd64.whl", hash = "sha256:afa01d25769af33a8dac0d905d5c7bb2d73c7c3d5161b2dd6f8b5b5eea6a3c4c"},
-    {file = "pydantic_core-2.16.1-cp311-none-win_arm64.whl", hash = "sha256:1a2fe7b00a49b51047334d84aafd7e39f80b7675cad0083678c58983662da89b"},
-    {file = "pydantic_core-2.16.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:0f478ec204772a5c8218e30eb813ca43e34005dff2eafa03931b3d8caef87d51"},
-    {file = "pydantic_core-2.16.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f1936ef138bed2165dd8573aa65e3095ef7c2b6247faccd0e15186aabdda7f66"},
-    {file = "pydantic_core-2.16.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99d3a433ef5dc3021c9534a58a3686c88363c591974c16c54a01af7efd741f13"},
-    {file = "pydantic_core-2.16.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bd88f40f2294440d3f3c6308e50d96a0d3d0973d6f1a5732875d10f569acef49"},
-    {file = "pydantic_core-2.16.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fac641bbfa43d5a1bed99d28aa1fded1984d31c670a95aac1bf1d36ac6ce137"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:75b81e678d1c1ede0785c7f46690621e4c6e63ccd9192af1f0bd9d504bbb6bf4"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9c865a7ee6f93783bd5d781af5a4c43dadc37053a5b42f7d18dc019f8c9d2bd1"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:162e498303d2b1c036b957a1278fa0899d02b2842f1ff901b6395104c5554a45"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2f583bd01bbfbff4eaee0868e6fc607efdfcc2b03c1c766b06a707abbc856187"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b926dd38db1519ed3043a4de50214e0d600d404099c3392f098a7f9d75029ff8"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:716b542728d4c742353448765aa7cdaa519a7b82f9564130e2b3f6766018c9ec"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc4ad7f7ee1a13d9cb49d8198cd7d7e3aa93e425f371a68235f784e99741561f"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bd87f48924f360e5d1c5f770d6155ce0e7d83f7b4e10c2f9ec001c73cf475c99"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:0df446663464884297c793874573549229f9eca73b59360878f382a0fc085979"},
+    {file = "pydantic_core-2.16.3-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:4df8a199d9f6afc5ae9a65f8f95ee52cae389a8c6b20163762bde0426275b7db"},
+    {file = "pydantic_core-2.16.3-cp310-none-win32.whl", hash = "sha256:456855f57b413f077dff513a5a28ed838dbbb15082ba00f80750377eed23d132"},
+    {file = "pydantic_core-2.16.3-cp310-none-win_amd64.whl", hash = "sha256:732da3243e1b8d3eab8c6ae23ae6a58548849d2e4a4e03a1924c8ddf71a387cb"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:519ae0312616026bf4cedc0fe459e982734f3ca82ee8c7246c19b650b60a5ee4"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:b3992a322a5617ded0a9f23fd06dbc1e4bd7cf39bc4ccf344b10f80af58beacd"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8d62da299c6ecb04df729e4b5c52dc0d53f4f8430b4492b93aa8de1f541c4aac"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2acca2be4bb2f2147ada8cac612f8a98fc09f41c89f87add7256ad27332c2fda"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1b662180108c55dfbf1280d865b2d116633d436cfc0bba82323554873967b340"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e7c6ed0dc9d8e65f24f5824291550139fe6f37fac03788d4580da0d33bc00c97"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a6b1bb0827f56654b4437955555dc3aeeebeddc47c2d7ed575477f082622c49e"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e56f8186d6210ac7ece503193ec84104da7ceb98f68ce18c07282fcc2452e76f"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:936e5db01dd49476fa8f4383c259b8b1303d5dd5fb34c97de194560698cc2c5e"},
+    {file = "pydantic_core-2.16.3-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:33809aebac276089b78db106ee692bdc9044710e26f24a9a2eaa35a0f9fa70ba"},
+    {file = "pydantic_core-2.16.3-cp311-none-win32.whl", hash = "sha256:ded1c35f15c9dea16ead9bffcde9bb5c7c031bff076355dc58dcb1cb436c4721"},
+    {file = "pydantic_core-2.16.3-cp311-none-win_amd64.whl", hash = "sha256:d89ca19cdd0dd5f31606a9329e309d4fcbb3df860960acec32630297d61820df"},
+    {file = "pydantic_core-2.16.3-cp311-none-win_arm64.whl", hash = "sha256:6162f8d2dc27ba21027f261e4fa26f8bcb3cf9784b7f9499466a311ac284b5b9"},
+    {file = "pydantic_core-2.16.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:0f56ae86b60ea987ae8bcd6654a887238fd53d1384f9b222ac457070b7ac4cff"},
+    {file = "pydantic_core-2.16.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:c9bd22a2a639e26171068f8ebb5400ce2c1bc7d17959f60a3b753ae13c632975"},
+    {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4204e773b4b408062960e65468d5346bdfe139247ee5f1ca2a378983e11388a2"},
+    {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:f651dd19363c632f4abe3480a7c87a9773be27cfe1341aef06e8759599454120"},
+    {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:aaf09e615a0bf98d406657e0008e4a8701b11481840be7d31755dc9f97c44053"},
{file = "pydantic_core-2.16.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:72bf9308a82b75039b8c8edd2be2924c352eda5da14a920551a8b65d5ee89253"}, {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:8e47755d8152c1ab5b55928ab422a76e2e7b22b5ed8e90a7d584268dd49e9c6b"},
{file = "pydantic_core-2.16.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fb4363e6c9fc87365c2bc777a1f585a22f2f56642501885ffc7942138499bf54"}, {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:500960cb3a0543a724a81ba859da816e8cf01b0e6aaeedf2c3775d12ee49cade"},
{file = "pydantic_core-2.16.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:20f724a023042588d0f4396bbbcf4cffd0ddd0ad3ed4f0d8e6d4ac4264bae81e"}, {file = "pydantic_core-2.16.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cf6204fe865da605285c34cf1172879d0314ff267b1c35ff59de7154f35fdc2e"},
{file = "pydantic_core-2.16.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:fb4370b15111905bf8b5ba2129b926af9470f014cb0493a67d23e9d7a48348e8"}, {file = "pydantic_core-2.16.3-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:d33dd21f572545649f90c38c227cc8631268ba25c460b5569abebdd0ec5974ca"},
{file = "pydantic_core-2.16.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:23632132f1fd608034f1a56cc3e484be00854db845b3a4a508834be5a6435a6f"}, {file = "pydantic_core-2.16.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:49d5d58abd4b83fb8ce763be7794d09b2f50f10aa65c0f0c1696c677edeb7cbf"},
{file = "pydantic_core-2.16.1-cp312-none-win32.whl", hash = "sha256:b9f3e0bffad6e238f7acc20c393c1ed8fab4371e3b3bc311020dfa6020d99212"}, {file = "pydantic_core-2.16.3-cp312-none-win32.whl", hash = "sha256:f53aace168a2a10582e570b7736cc5bef12cae9cf21775e3eafac597e8551fbe"},
{file = "pydantic_core-2.16.1-cp312-none-win_amd64.whl", hash = "sha256:a0b4cfe408cd84c53bab7d83e4209458de676a6ec5e9c623ae914ce1cb79b96f"}, {file = "pydantic_core-2.16.3-cp312-none-win_amd64.whl", hash = "sha256:0d32576b1de5a30d9a97f300cc6a3f4694c428d956adbc7e6e2f9cad279e45ed"},
{file = "pydantic_core-2.16.1-cp312-none-win_arm64.whl", hash = "sha256:d195add190abccefc70ad0f9a0141ad7da53e16183048380e688b466702195dd"}, {file = "pydantic_core-2.16.3-cp312-none-win_arm64.whl", hash = "sha256:ec08be75bb268473677edb83ba71e7e74b43c008e4a7b1907c6d57e940bf34b6"},
{file = "pydantic_core-2.16.1-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:502c062a18d84452858f8aea1e520e12a4d5228fc3621ea5061409d666ea1706"}, {file = "pydantic_core-2.16.3-cp38-cp38-macosx_10_12_x86_64.whl", hash = "sha256:b1f6f5938d63c6139860f044e2538baeee6f0b251a1816e7adb6cbce106a1f01"},
{file = "pydantic_core-2.16.1-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:d8c032ccee90b37b44e05948b449a2d6baed7e614df3d3f47fe432c952c21b60"}, {file = "pydantic_core-2.16.3-cp38-cp38-macosx_11_0_arm64.whl", hash = "sha256:2a1ef6a36fdbf71538142ed604ad19b82f67b05749512e47f247a6ddd06afdc7"},
{file = "pydantic_core-2.16.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:920f4633bee43d7a2818e1a1a788906df5a17b7ab6fe411220ed92b42940f818"}, {file = "pydantic_core-2.16.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:704d35ecc7e9c31d48926150afada60401c55efa3b46cd1ded5a01bdffaf1d48"},
{file = "pydantic_core-2.16.1-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:9f5d37ff01edcbace53a402e80793640c25798fb7208f105d87a25e6fcc9ea06"}, {file = "pydantic_core-2.16.3-cp38-cp38-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d937653a696465677ed583124b94a4b2d79f5e30b2c46115a68e482c6a591c8a"},
{file = "pydantic_core-2.16.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:399166f24c33a0c5759ecc4801f040dbc87d412c1a6d6292b2349b4c505effc9"}, {file = "pydantic_core-2.16.3-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c9803edf8e29bd825f43481f19c37f50d2b01899448273b3a7758441b512acf8"},
{file = "pydantic_core-2.16.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:ac89ccc39cd1d556cc72d6752f252dc869dde41c7c936e86beac5eb555041b66"}, {file = "pydantic_core-2.16.3-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:72282ad4892a9fb2da25defeac8c2e84352c108705c972db82ab121d15f14e6d"},
{file = "pydantic_core-2.16.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:73802194f10c394c2bedce7a135ba1d8ba6cff23adf4217612bfc5cf060de34c"}, {file = "pydantic_core-2.16.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7f752826b5b8361193df55afcdf8ca6a57d0232653494ba473630a83ba50d8c9"},
{file = "pydantic_core-2.16.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:8fa00fa24ffd8c31fac081bf7be7eb495be6d248db127f8776575a746fa55c95"}, {file = "pydantic_core-2.16.3-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:4384a8f68ddb31a0b0c3deae88765f5868a1b9148939c3f4121233314ad5532c"},
{file = "pydantic_core-2.16.1-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:601d3e42452cd4f2891c13fa8c70366d71851c1593ed42f57bf37f40f7dca3c8"}, {file = "pydantic_core-2.16.3-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:a4b2bf78342c40b3dc830880106f54328928ff03e357935ad26c7128bbd66ce8"},
{file = "pydantic_core-2.16.1-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:07982b82d121ed3fc1c51faf6e8f57ff09b1325d2efccaa257dd8c0dd937acca"}, {file = "pydantic_core-2.16.3-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:13dcc4802961b5f843a9385fc821a0b0135e8c07fc3d9949fd49627c1a5e6ae5"},
{file = "pydantic_core-2.16.1-cp38-none-win32.whl", hash = "sha256:d0bf6f93a55d3fa7a079d811b29100b019784e2ee6bc06b0bb839538272a5610"}, {file = "pydantic_core-2.16.3-cp38-none-win32.whl", hash = "sha256:e3e70c94a0c3841e6aa831edab1619ad5c511199be94d0c11ba75fe06efe107a"},
{file = "pydantic_core-2.16.1-cp38-none-win_amd64.whl", hash = "sha256:fbec2af0ebafa57eb82c18c304b37c86a8abddf7022955d1742b3d5471a6339e"}, {file = "pydantic_core-2.16.3-cp38-none-win_amd64.whl", hash = "sha256:ecdf6bf5f578615f2e985a5e1f6572e23aa632c4bd1dc67f8f406d445ac115ed"},
{file = "pydantic_core-2.16.1-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:a497be217818c318d93f07e14502ef93d44e6a20c72b04c530611e45e54c2196"}, {file = "pydantic_core-2.16.3-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:bda1ee3e08252b8d41fa5537413ffdddd58fa73107171a126d3b9ff001b9b820"},
{file = "pydantic_core-2.16.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:694a5e9f1f2c124a17ff2d0be613fd53ba0c26de588eb4bdab8bca855e550d95"}, {file = "pydantic_core-2.16.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:21b888c973e4f26b7a96491c0965a8a312e13be108022ee510248fe379a5fa23"},
{file = "pydantic_core-2.16.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8d4dfc66abea3ec6d9f83e837a8f8a7d9d3a76d25c9911735c76d6745950e62c"}, {file = "pydantic_core-2.16.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:be0ec334369316fa73448cc8c982c01e5d2a81c95969d58b8f6e272884df0074"},
{file = "pydantic_core-2.16.1-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8655f55fe68c4685673265a650ef71beb2d31871c049c8b80262026f23605ee3"}, {file = "pydantic_core-2.16.3-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:b5b6079cc452a7c53dd378c6f881ac528246b3ac9aae0f8eef98498a75657805"},
{file = "pydantic_core-2.16.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:21e3298486c4ea4e4d5cc6fb69e06fb02a4e22089304308817035ac006a7f506"}, {file = "pydantic_core-2.16.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:7ee8d5f878dccb6d499ba4d30d757111847b6849ae07acdd1205fffa1fc1253c"},
{file = "pydantic_core-2.16.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:71b4a48a7427f14679f0015b13c712863d28bb1ab700bd11776a5368135c7d60"}, {file = "pydantic_core-2.16.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7233d65d9d651242a68801159763d09e9ec96e8a158dbf118dc090cd77a104c9"},
{file = "pydantic_core-2.16.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:10dca874e35bb60ce4f9f6665bfbfad050dd7573596608aeb9e098621ac331dc"}, {file = "pydantic_core-2.16.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c6119dc90483a5cb50a1306adb8d52c66e447da88ea44f323e0ae1a5fcb14256"},
{file = "pydantic_core-2.16.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:fa496cd45cda0165d597e9d6f01e36c33c9508f75cf03c0a650018c5048f578e"}, {file = "pydantic_core-2.16.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:578114bc803a4c1ff9946d977c221e4376620a46cf78da267d946397dc9514a8"},
{file = "pydantic_core-2.16.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:5317c04349472e683803da262c781c42c5628a9be73f4750ac7d13040efb5d2d"}, {file = "pydantic_core-2.16.3-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:d8f99b147ff3fcf6b3cc60cb0c39ea443884d5559a30b1481e92495f2310ff2b"},
{file = "pydantic_core-2.16.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:42c29d54ed4501a30cd71015bf982fa95e4a60117b44e1a200290ce687d3e640"}, {file = "pydantic_core-2.16.3-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:4ac6b4ce1e7283d715c4b729d8f9dab9627586dafce81d9eaa009dd7f25dd972"},
{file = "pydantic_core-2.16.1-cp39-none-win32.whl", hash = "sha256:ba07646f35e4e49376c9831130039d1b478fbfa1215ae62ad62d2ee63cf9c18f"}, {file = "pydantic_core-2.16.3-cp39-none-win32.whl", hash = "sha256:e7774b570e61cb998490c5235740d475413a1f6de823169b4cf94e2fe9e9f6b2"},
{file = "pydantic_core-2.16.1-cp39-none-win_amd64.whl", hash = "sha256:2133b0e412a47868a358713287ff9f9a328879da547dc88be67481cdac529118"}, {file = "pydantic_core-2.16.3-cp39-none-win_amd64.whl", hash = "sha256:9091632a25b8b87b9a605ec0e61f241c456e9248bfdcf7abdf344fdb169c81cf"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:d25ef0c33f22649b7a088035fd65ac1ce6464fa2876578df1adad9472f918a76"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:36fa178aacbc277bc6b62a2c3da95226520da4f4e9e206fdf076484363895d2c"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:99c095457eea8550c9fa9a7a992e842aeae1429dab6b6b378710f62bfb70b394"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:dcca5d2bf65c6fb591fff92da03f94cd4f315972f97c21975398bd4bd046854a"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b49c604ace7a7aa8af31196abbf8f2193be605db6739ed905ecaf62af31ccae0"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2a72fb9963cba4cd5793854fd12f4cfee731e86df140f59ff52a49b3552db241"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c56da23034fe66221f2208c813d8aa509eea34d97328ce2add56e219c3a9f41c"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b60cc1a081f80a2105a59385b92d82278b15d80ebb3adb200542ae165cd7d183"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cebf8d56fee3b08ad40d332a807ecccd4153d3f1ba8231e111d9759f02edfd05"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:cbcc558401de90a746d02ef330c528f2e668c83350f045833543cd57ecead1ad"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:1ae8048cba95f382dba56766525abca438328455e35c283bb202964f41a780b0"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:fee427241c2d9fb7192b658190f9f5fd6dfe41e02f3c1489d2ec1e6a5ab1e04a"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:780daad9e35b18d10d7219d24bfb30148ca2afc309928e1d4d53de86822593dc"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:f4cb85f693044e0f71f394ff76c98ddc1bc0953e48c061725e540396d5c8a2e1"},
{file = "pydantic_core-2.16.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:c94b5537bf6ce66e4d7830c6993152940a188600f6ae044435287753044a8fe2"}, {file = "pydantic_core-2.16.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:b29eeb887aa931c2fcef5aa515d9d176d25006794610c264ddc114c053bf96fe"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:adf28099d061a25fbcc6531febb7a091e027605385de9fe14dd6a97319d614cf"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:a425479ee40ff021f8216c9d07a6a3b54b31c8267c6e17aa88b70d7ebd0e5e5b"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:644904600c15816a1f9a1bafa6aab0d21db2788abcdf4e2a77951280473f33e1"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:5c5cbc703168d1b7a838668998308018a2718c2130595e8e190220238addc96f"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87bce04f09f0552b66fca0c4e10da78d17cb0e71c205864bab4e9595122cb9d9"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:99b6add4c0b39a513d323d3b93bc173dac663c27b99860dd5bf491b240d26137"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:877045a7969ace04d59516d5d6a7dee13106822f99a5d8df5e6822941f7bedc8"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:75f76ee558751746d6a38f89d60b6228fa174e5172d143886af0f85aa306fd89"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:9c46e556ee266ed3fb7b7a882b53df3c76b45e872fdab8d9cf49ae5e91147fd7"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:00ee1c97b5364b84cb0bd82e9bbf645d5e2871fb8c58059d158412fee2d33d8a"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:4eebbd049008eb800f519578e944b8dc8e0f7d59a5abb5924cc2d4ed3a1834ff"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:287073c66748f624be4cef893ef9174e3eb88fe0b8a78dc22e88eca4bc357ca6"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:c0be58529d43d38ae849a91932391eb93275a06b93b79a8ab828b012e916a206"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:ed25e1835c00a332cb10c683cd39da96a719ab1dfc08427d476bce41b92531fc"},
{file = "pydantic_core-2.16.1-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:b1fc07896fc1851558f532dffc8987e526b682ec73140886c831d773cef44b76"}, {file = "pydantic_core-2.16.3-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:86b3d0033580bd6bbe07590152007275bd7af95f98eaa5bd36f3da219dcd93da"},
{file = "pydantic_core-2.16.1.tar.gz", hash = "sha256:daff04257b49ab7f4b3f73f98283d3dbb1a65bf3500d55c7beac3c66c310fe34"}, {file = "pydantic_core-2.16.3.tar.gz", hash = "sha256:1cac689f80a3abab2d3c0048b29eea5751114054f032a941a32de4c852c59cad"},
] ]
[package.dependencies] [package.dependencies]
@ -2427,28 +2444,28 @@ files = [
[[package]] [[package]]
name = "ruff" name = "ruff"
version = "0.1.14" version = "0.3.2"
description = "An extremely fast Python linter and code formatter, written in Rust." description = "An extremely fast Python linter and code formatter, written in Rust."
optional = false optional = false
python-versions = ">=3.7" python-versions = ">=3.7"
files = [ files = [
{file = "ruff-0.1.14-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:96f76536df9b26622755c12ed8680f159817be2f725c17ed9305b472a757cdbb"}, {file = "ruff-0.3.2-py3-none-macosx_10_12_x86_64.macosx_11_0_arm64.macosx_10_12_universal2.whl", hash = "sha256:77f2612752e25f730da7421ca5e3147b213dca4f9a0f7e0b534e9562c5441f01"},
{file = "ruff-0.1.14-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:ab3f71f64498c7241123bb5a768544cf42821d2a537f894b22457a543d3ca7a9"}, {file = "ruff-0.3.2-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:9966b964b2dd1107797be9ca7195002b874424d1d5472097701ae8f43eadef5d"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7060156ecc572b8f984fd20fd8b0fcb692dd5d837b7606e968334ab7ff0090ab"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b83d17ff166aa0659d1e1deaf9f2f14cbe387293a906de09bc4860717eb2e2da"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a53d8e35313d7b67eb3db15a66c08434809107659226a90dcd7acb2afa55faea"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bb875c6cc87b3703aeda85f01c9aebdce3d217aeaca3c2e52e38077383f7268a"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:bea9be712b8f5b4ebed40e1949379cfb2a7d907f42921cf9ab3aae07e6fba9eb"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:be75e468a6a86426430373d81c041b7605137a28f7014a72d2fc749e47f572aa"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:2270504d629a0b064247983cbc495bed277f372fb9eaba41e5cf51f7ba705a6a"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_ppc64.manylinux2014_ppc64.whl", hash = "sha256:967978ac2d4506255e2f52afe70dda023fc602b283e97685c8447d036863a302"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:80258bb3b8909b1700610dfabef7876423eed1bc930fe177c71c414921898efa"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1231eacd4510f73222940727ac927bc5d07667a86b0cbe822024dd00343e77e9"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:653230dd00aaf449eb5ff25d10a6e03bc3006813e2cb99799e568f55482e5cae"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2c6d613b19e9a8021be2ee1d0e27710208d1603b56f47203d0abbde906929a9b"},
{file = "ruff-0.1.14-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:87b3acc6c4e6928459ba9eb7459dd4f0c4bf266a053c863d72a44c33246bfdbf"}, {file = "ruff-0.3.2-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c8439338a6303585d27b66b4626cbde89bb3e50fa3cae86ce52c1db7449330a7"},
{file = "ruff-0.1.14-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:6b3dadc9522d0eccc060699a9816e8127b27addbb4697fc0c08611e4e6aeb8b5"}, {file = "ruff-0.3.2-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:de8b480d8379620cbb5ea466a9e53bb467d2fb07c7eca54a4aa8576483c35d36"},
{file = "ruff-0.1.14-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:1c8eca1a47b4150dc0fbec7fe68fc91c695aed798532a18dbb1424e61e9b721f"}, {file = "ruff-0.3.2-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:b74c3de9103bd35df2bb05d8b2899bf2dbe4efda6474ea9681280648ec4d237d"},
{file = "ruff-0.1.14-py3-none-musllinux_1_2_i686.whl", hash = "sha256:62ce2ae46303ee896fc6811f63d6dabf8d9c389da0f3e3f2bce8bc7f15ef5488"}, {file = "ruff-0.3.2-py3-none-musllinux_1_2_i686.whl", hash = "sha256:f380be9fc15a99765c9cf316b40b9da1f6ad2ab9639e551703e581a5e6da6745"},
{file = "ruff-0.1.14-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:b2027dde79d217b211d725fc833e8965dc90a16d0d3213f1298f97465956661b"}, {file = "ruff-0.3.2-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:0ac06a3759c3ab9ef86bbeca665d31ad3aa9a4b1c17684aadb7e61c10baa0df4"},
{file = "ruff-0.1.14-py3-none-win32.whl", hash = "sha256:722bafc299145575a63bbd6b5069cb643eaa62546a5b6398f82b3e4403329cab"}, {file = "ruff-0.3.2-py3-none-win32.whl", hash = "sha256:9bd640a8f7dd07a0b6901fcebccedadeb1a705a50350fb86b4003b805c81385a"},
{file = "ruff-0.1.14-py3-none-win_amd64.whl", hash = "sha256:e3d241aa61f92b0805a7082bd89a9990826448e4d0398f0e2bc8f05c75c63d99"}, {file = "ruff-0.3.2-py3-none-win_amd64.whl", hash = "sha256:0c1bdd9920cab5707c26c8b3bf33a064a4ca7842d91a99ec0634fec68f9f4037"},
{file = "ruff-0.1.14-py3-none-win_arm64.whl", hash = "sha256:269302b31ade4cde6cf6f9dd58ea593773a37ed3f7b97e793c8594b262466b67"}, {file = "ruff-0.3.2-py3-none-win_arm64.whl", hash = "sha256:5f65103b1d76e0d600cabd577b04179ff592064eaa451a70a81085930e907d0b"},
{file = "ruff-0.1.14.tar.gz", hash = "sha256:ad3f8088b2dfd884820289a06ab718cde7d38b94972212cc4ba90d5fbc9955f3"}, {file = "ruff-0.3.2.tar.gz", hash = "sha256:fa78ec9418eb1ca3db392811df3376b46471ae93792a81af2d1cbb0e5dcb5142"},
] ]
[[package]] [[package]]
@ -3056,13 +3073,13 @@ files = [
[[package]] [[package]]
name = "types-jsonschema" name = "types-jsonschema"
version = "4.21.0.20240118" version = "4.21.0.20240311"
description = "Typing stubs for jsonschema" description = "Typing stubs for jsonschema"
optional = false optional = false
python-versions = ">=3.8" python-versions = ">=3.8"
files = [ files = [
{file = "types-jsonschema-4.21.0.20240118.tar.gz", hash = "sha256:31aae1b5adc0176c1155c2d4f58348b22d92ae64315e9cc83bd6902168839232"}, {file = "types-jsonschema-4.21.0.20240311.tar.gz", hash = "sha256:f7165ce70abd91df490c73b089873afd2899c5e56430ee495b64f851ad01f287"},
{file = "types_jsonschema-4.21.0.20240118-py3-none-any.whl", hash = "sha256:77a4ac36b0be4f24274d5b9bf0b66208ee771c05f80e34c4641de7d63e8a872d"}, {file = "types_jsonschema-4.21.0.20240311-py3-none-any.whl", hash = "sha256:e872f5661513824edf9698f73a66c9c114713d93eab58699bd0532e7e6db5750"},
] ]
[package.dependencies] [package.dependencies]
@ -3103,24 +3120,24 @@ files = [
[[package]] [[package]]
name = "types-psycopg2" name = "types-psycopg2"
version = "2.9.21.16" version = "2.9.21.20240311"
description = "Typing stubs for psycopg2" description = "Typing stubs for psycopg2"
optional = false optional = false
python-versions = ">=3.7" python-versions = ">=3.8"
files = [ files = [
{file = "types-psycopg2-2.9.21.16.tar.gz", hash = "sha256:44a3ae748173bb637cff31654d6bd12de9ad0c7ad73afe737df6152830ed82ed"}, {file = "types-psycopg2-2.9.21.20240311.tar.gz", hash = "sha256:722945dffa6a729bebc660f14137f37edfcead5a2c15eb234212a7d017ee8072"},
{file = "types_psycopg2-2.9.21.16-py3-none-any.whl", hash = "sha256:e2f24b651239ccfda320ab3457099af035cf37962c36c9fa26a4dc65991aebed"}, {file = "types_psycopg2-2.9.21.20240311-py3-none-any.whl", hash = "sha256:2e137ae2b516ee0dbaab6f555086b6cfb723ba4389d67f551b0336adf4efcf1b"},
] ]
[[package]] [[package]]
name = "types-pyopenssl" name = "types-pyopenssl"
version = "23.3.0.0" version = "24.0.0.20240311"
description = "Typing stubs for pyOpenSSL" description = "Typing stubs for pyOpenSSL"
optional = false optional = false
python-versions = ">=3.7" python-versions = ">=3.8"
files = [ files = [
{file = "types-pyOpenSSL-23.3.0.0.tar.gz", hash = "sha256:5ffb077fe70b699c88d5caab999ae80e192fe28bf6cda7989b7e79b1e4e2dcd3"}, {file = "types-pyOpenSSL-24.0.0.20240311.tar.gz", hash = "sha256:7bca00cfc4e7ef9c5d2663c6a1c068c35798e59670595439f6296e7ba3d58083"},
{file = "types_pyOpenSSL-23.3.0.0-py3-none-any.whl", hash = "sha256:00171433653265843b7469ddb9f3c86d698668064cc33ef10537822156130ebf"}, {file = "types_pyOpenSSL-24.0.0.20240311-py3-none-any.whl", hash = "sha256:6e8e8bfad34924067333232c93f7fc4b369856d8bea0d5c9d1808cb290ab1972"},
] ]
[package.dependencies] [package.dependencies]
@ -3434,4 +3451,4 @@ user-search = ["pyicu"]
[metadata] [metadata]
lock-version = "2.0" lock-version = "2.0"
python-versions = "^3.8.0" python-versions = "^3.8.0"
content-hash = "e4ca55af1dcb6b28b8064b7551008fd16f6cdfa9cb9bf90d18c6b47766b56ae6" content-hash = "b510fa05f4ea33194bec079f5d04ebb3f9ffbb5c1ea96a0341d57ba770ef81e6"

View file

@ -96,7 +96,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry] [tool.poetry]
name = "matrix-synapse" name = "matrix-synapse"
version = "1.103.0" version = "1.104.0rc1"
description = "Homeserver for the Matrix decentralised comms protocol" description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"] authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later" license = "AGPL-3.0-or-later"
@ -321,7 +321,7 @@ all = [
# This helps prevent merge conflicts when running a batch of dependabot updates. # This helps prevent merge conflicts when running a batch of dependabot updates.
isort = ">=5.10.1" isort = ">=5.10.1"
black = ">=22.7.0" black = ">=22.7.0"
ruff = "0.1.14" ruff = "0.3.2"
# Type checking only works with the pydantic.v1 compat module from pydantic v2 # Type checking only works with the pydantic.v1 compat module from pydantic v2
pydantic = "^2" pydantic = "^2"
@ -382,7 +382,7 @@ furo = ">=2022.12.7,<2025.0.0"
# runtime errors caused by build system changes. # runtime errors caused by build system changes.
# We are happy to raise these upper bounds upon request, # We are happy to raise these upper bounds upon request,
# provided we check that it's safe to do so (i.e. that CI passes). # provided we check that it's safe to do so (i.e. that CI passes).
requires = ["poetry-core>=1.1.0,<=1.8.1", "setuptools_rust>=1.3,<=1.8.1"] requires = ["poetry-core>=1.1.0,<=1.9.0", "setuptools_rust>=1.3,<=1.8.1"]
build-backend = "poetry.core.masonry.api" build-backend = "poetry.core.masonry.api"

View file

@ -191,30 +191,39 @@ charset-normalizer==3.1.0 ; python_full_version >= "3.8.0" and python_full_versi
constantly==15.1.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ constantly==15.1.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:586372eb92059873e29eba4f9dec8381541b4d3834660707faf8ba59146dfc35 \ --hash=sha256:586372eb92059873e29eba4f9dec8381541b4d3834660707faf8ba59146dfc35 \
--hash=sha256:dd2fa9d6b1a51a83f0d7dd76293d734046aa176e384bf6e33b7e44880eb37c5d --hash=sha256:dd2fa9d6b1a51a83f0d7dd76293d734046aa176e384bf6e33b7e44880eb37c5d
cryptography==41.0.7 ; python_version >= "3.8" and python_full_version < "4.0.0" \ cryptography==42.0.5 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:079b85658ea2f59c4f43b70f8119a52414cdb7be34da5d019a77bf96d473b960 \ --hash=sha256:0270572b8bd2c833c3981724b8ee9747b3ec96f699a9665470018594301439ee \
--hash=sha256:09616eeaef406f99046553b8a40fbf8b1e70795a91885ba4c96a70793de5504a \ --hash=sha256:111a0d8553afcf8eb02a4fea6ca4f59d48ddb34497aa8706a6cf536f1a5ec576 \
--hash=sha256:13f93ce9bea8016c253b34afc6bd6a75993e5c40672ed5405a9c832f0d4a00bc \ --hash=sha256:16a48c23a62a2f4a285699dba2e4ff2d1cff3115b9df052cdd976a18856d8e3d \
--hash=sha256:37a138589b12069efb424220bf78eac59ca68b95696fc622b6ccc1c0a197204a \ --hash=sha256:1b95b98b0d2af784078fa69f637135e3c317091b615cd0905f8b8a087e86fa30 \
--hash=sha256:3c78451b78313fa81607fa1b3f1ae0a5ddd8014c38a02d9db0616133987b9cdf \ --hash=sha256:1f71c10d1e88467126f0efd484bd44bca5e14c664ec2ede64c32f20875c0d413 \
--hash=sha256:43f2552a2378b44869fe8827aa19e69512e3245a219104438692385b0ee119d1 \ --hash=sha256:2424ff4c4ac7f6b8177b53c17ed5d8fa74ae5955656867f5a8affaca36a27abb \
--hash=sha256:48a0476626da912a44cc078f9893f292f0b3e4c739caf289268168d8f4702a39 \ --hash=sha256:2bce03af1ce5a5567ab89bd90d11e7bbdff56b8af3acbbec1faded8f44cb06da \
--hash=sha256:49f0805fc0b2ac8d4882dd52f4a3b935b210935d500b6b805f321addc8177406 \ --hash=sha256:329906dcc7b20ff3cad13c069a78124ed8247adcac44b10bea1130e36caae0b4 \
--hash=sha256:5429ec739a29df2e29e15d082f1d9ad683701f0ec7709ca479b3ff2708dae65a \ --hash=sha256:37dd623507659e08be98eec89323469e8c7b4c1407c85112634ae3dbdb926fdd \
--hash=sha256:5a1b41bc97f1ad230a41657d9155113c7521953869ae57ac39ac7f1bb471469a \ --hash=sha256:3eaafe47ec0d0ffcc9349e1708be2aaea4c6dd4978d76bf6eb0cb2c13636c6fc \
--hash=sha256:68a2dec79deebc5d26d617bfdf6e8aab065a4f34934b22d3b5010df3ba36612c \ --hash=sha256:5e6275c09d2badf57aea3afa80d975444f4be8d3bc58f7f80d2a484c6f9485c8 \
--hash=sha256:7a698cb1dac82c35fcf8fe3417a3aaba97de16a01ac914b89a0889d364d2f6be \ --hash=sha256:6fe07eec95dfd477eb9530aef5bead34fec819b3aaf6c5bd6d20565da607bfe1 \
--hash=sha256:841df4caa01008bad253bce2a6f7b47f86dc9f08df4b433c404def869f590a15 \ --hash=sha256:7367d7b2eca6513681127ebad53b2582911d1736dc2ffc19f2c3ae49997496bc \
--hash=sha256:90452ba79b8788fa380dfb587cca692976ef4e757b194b093d845e8d99f612f2 \ --hash=sha256:7cde5f38e614f55e28d831754e8a3bacf9ace5d1566235e39d91b35502d6936e \
--hash=sha256:928258ba5d6f8ae644e764d0f996d61a8777559f72dfeb2eea7e2fe0ad6e782d \ --hash=sha256:9481ffe3cf013b71b2428b905c4f7a9a4f76ec03065b05ff499bb5682a8d9ad8 \
--hash=sha256:af03b32695b24d85a75d40e1ba39ffe7db7ffcb099fe507b39fd41a565f1b157 \ --hash=sha256:98d8dc6d012b82287f2c3d26ce1d2dd130ec200c8679b6213b3c73c08b2b7940 \
--hash=sha256:b640981bf64a3e978a56167594a0e97db71c89a479da8e175d8bb5be5178c003 \ --hash=sha256:a011a644f6d7d03736214d38832e030d8268bcff4a41f728e6030325fea3e400 \
--hash=sha256:c5ca78485a255e03c32b513f8c2bc39fedb7f5c5f8535545bdc223a03b24f248 \ --hash=sha256:a2913c5375154b6ef2e91c10b5720ea6e21007412f6437504ffea2109b5a33d7 \
--hash=sha256:c7f3201ec47d5207841402594f1d7950879ef890c0c495052fa62f58283fde1a \ --hash=sha256:a30596bae9403a342c978fb47d9b0ee277699fa53bbafad14706af51fe543d16 \
--hash=sha256:d5ec85080cce7b0513cfd233914eb8b7bbd0633f1d1703aa28d1dd5a72f678ec \ --hash=sha256:b03c2ae5d2f0fc05f9a2c0c997e1bc18c8229f392234e8a0194f202169ccd278 \
--hash=sha256:d6c391c021ab1f7a82da5d8d0b3cee2f4b2c455ec86c8aebbc84837a631ff309 \ --hash=sha256:b6cd2203306b63e41acdf39aa93b86fb566049aeb6dc489b70e34bcd07adca74 \
--hash=sha256:e3114da6d7f95d2dee7d3f4eec16dacff819740bbab931aff8648cb13c5ff5e7 \ --hash=sha256:b7ffe927ee6531c78f81aa17e684e2ff617daeba7f189f911065b2ea2d526dec \
--hash=sha256:f983596065a18a2183e7f79ab3fd4c475205b839e02cbc0efbbf9666c4b3083d --hash=sha256:b8cac287fafc4ad485b8a9b67d0ee80c66bf3574f655d3b97ef2e1082360faf1 \
--hash=sha256:ba334e6e4b1d92442b75ddacc615c5476d4ad55cc29b15d590cc6b86efa487e2 \
--hash=sha256:ba3e4a42397c25b7ff88cdec6e2a16c2be18720f317506ee25210f6d31925f9c \
--hash=sha256:c41fb5e6a5fe9ebcd58ca3abfeb51dffb5d83d6775405305bfa8715b76521922 \
--hash=sha256:cd2030f6650c089aeb304cf093f3244d34745ce0cfcc39f20c6fbfe030102e2a \
--hash=sha256:cd65d75953847815962c84a4654a84850b2bb4aed3f26fadcc1c13892e1e29f6 \
--hash=sha256:e4985a790f921508f36f81831817cbc03b102d643b5fcb81cd33df3fa291a1a1 \
--hash=sha256:e807b3188f9eb0eaa7bbb579b462c5ace579f1cedb28107ce8b48a9f7ad3679e \
--hash=sha256:f12764b8fffc7a123f641d7d049d382b73f96a34117e0b637b80643169cec8ac \
--hash=sha256:f8837fe1d6ac4a8052a9a8ddab256bc006242696f03368a4009be7ee3075cdb7
hiredis==2.3.2 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ hiredis==2.3.2 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:01b6c24c0840ac7afafbc4db236fd55f56a9a0919a215c25a238f051781f4772 \ --hash=sha256:01b6c24c0840ac7afafbc4db236fd55f56a9a0919a215c25a238f051781f4772 \
--hash=sha256:02fc71c8333586871602db4774d3a3e403b4ccf6446dc4603ec12df563127cee \ --hash=sha256:02fc71c8333586871602db4774d3a3e403b4ccf6446dc4603ec12df563127cee \
@ -430,9 +439,9 @@ importlib-resources==5.12.0 ; python_version >= "3.8" and python_version < "3.9"
incremental==22.10.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ incremental==22.10.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:912feeb5e0f7e0188e6f42241d2f450002e11bbc0937c65865045854c24c0bd0 \ --hash=sha256:912feeb5e0f7e0188e6f42241d2f450002e11bbc0937c65865045854c24c0bd0 \
--hash=sha256:b864a1f30885ee72c5ac2835a761b8fe8aa9c28b9395cacf27286602688d3e51 --hash=sha256:b864a1f30885ee72c5ac2835a761b8fe8aa9c28b9395cacf27286602688d3e51
jinja2==3.1.2 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ jinja2==3.1.3 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:31351a702a408a9e7595a8fc6150fc3f43bb6bf7e319770cbc0db9df9437e852 \ --hash=sha256:7d6d50dd97d52cbc355597bd845fabfbac3f551e1f99619e39a35ce8c370b5fa \
--hash=sha256:6088930bfe239f0e6710546ab9c19c9ef35e29792895fed6e6e31a023a182a61 --hash=sha256:ac8bd6544d4bb2c9792bf3a159e80bba8fda7f07e81bc3aed565432d5925ba90
jsonschema-specifications==2023.6.1 ; python_version >= "3.8" and python_full_version < "4.0.0" \ jsonschema-specifications==2023.6.1 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:3d2b82663aff01815f744bb5c7887e2121a63399b49b104a3c96145474d091d7 \ --hash=sha256:3d2b82663aff01815f744bb5c7887e2121a63399b49b104a3c96145474d091d7 \
--hash=sha256:ca1c4dd059a9e7b34101cf5b3ab7ff1d18b139f35950d598d629837ef66e8f28 --hash=sha256:ca1c4dd059a9e7b34101cf5b3ab7ff1d18b139f35950d598d629837ef66e8f28
@ -629,9 +638,9 @@ msgpack==1.0.7 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:f6ffbc252eb0d229aeb2f9ad051200668fc3a9aaa8994e49f0cb2ffe2b7867e7 \ --hash=sha256:f6ffbc252eb0d229aeb2f9ad051200668fc3a9aaa8994e49f0cb2ffe2b7867e7 \
--hash=sha256:f9a7c509542db4eceed3dcf21ee5267ab565a83555c9b88a8109dcecc4709002 \ --hash=sha256:f9a7c509542db4eceed3dcf21ee5267ab565a83555c9b88a8109dcecc4709002 \
--hash=sha256:ff1d0899f104f3921d94579a5638847f783c9b04f2d5f229392ca77fba5b82fc --hash=sha256:ff1d0899f104f3921d94579a5638847f783c9b04f2d5f229392ca77fba5b82fc
netaddr==0.9.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ netaddr==1.2.1 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:5148b1055679d2a1ec070c521b7db82137887fabd6d7e37f5199b44f775c3bb1 \ --hash=sha256:6eb8fedf0412c6d294d06885c110de945cf4d22d2b510d0404f4e06950857987 \
--hash=sha256:7b46fa9b1a2d71fd5de9e4a3784ef339700a53a08c8040f08baf5f1194da0128 --hash=sha256:bd9e9534b0d46af328cf64f0e5a23a5a43fca292df221c85580b27394793496e
packaging==23.2 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ packaging==23.2 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5 \ --hash=sha256:048fb0e9405036518eaaf48a55953c750c11e1a1b68e0dd1a9d62ed0c092cfc5 \
--hash=sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7 --hash=sha256:8c491190033a9af7e1d931d0b5dacc2ef47509b34dd0de67ed209b5203fc88c7
@ -743,89 +752,89 @@ pyasn1==0.5.1 ; python_version >= "3.8" and python_full_version < "4.0.0" \
pycparser==2.21 ; python_version >= "3.8" and python_full_version < "4.0.0" \ pycparser==2.21 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9 \ --hash=sha256:8ee45429555515e1f6b185e78100aea234072576aa43ab53aefcae078162fca9 \
--hash=sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206 --hash=sha256:e644fdec12f7872f86c58ff790da456218b10f863970249516d60a5eaca77206
pydantic-core==2.16.1 ; python_version >= "3.8" and python_full_version < "4.0.0" \ pydantic-core==2.16.3 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:06f0d5a1d9e1b7932477c172cc720b3b23c18762ed7a8efa8398298a59d177c7 \ --hash=sha256:00ee1c97b5364b84cb0bd82e9bbf645d5e2871fb8c58059d158412fee2d33d8a \
--hash=sha256:07982b82d121ed3fc1c51faf6e8f57ff09b1325d2efccaa257dd8c0dd937acca \ --hash=sha256:0d32576b1de5a30d9a97f300cc6a3f4694c428d956adbc7e6e2f9cad279e45ed \
--hash=sha256:0f478ec204772a5c8218e30eb813ca43e34005dff2eafa03931b3d8caef87d51 \ --hash=sha256:0df446663464884297c793874573549229f9eca73b59360878f382a0fc085979 \
--hash=sha256:102569d371fadc40d8f8598a59379c37ec60164315884467052830b28cc4e9da \ --hash=sha256:0f56ae86b60ea987ae8bcd6654a887238fd53d1384f9b222ac457070b7ac4cff \
--hash=sha256:10dca874e35bb60ce4f9f6665bfbfad050dd7573596608aeb9e098621ac331dc \ --hash=sha256:13dcc4802961b5f843a9385fc821a0b0135e8c07fc3d9949fd49627c1a5e6ae5 \
--hash=sha256:150ba5c86f502c040b822777e2e519b5625b47813bd05f9273a8ed169c97d9ae \ --hash=sha256:162e498303d2b1c036b957a1278fa0899d02b2842f1ff901b6395104c5554a45 \
--hash=sha256:1661c668c1bb67b7cec96914329d9ab66755911d093bb9063c4c8914188af6d4 \ --hash=sha256:1b662180108c55dfbf1280d865b2d116633d436cfc0bba82323554873967b340 \
--hash=sha256:1a2fe7b00a49b51047334d84aafd7e39f80b7675cad0083678c58983662da89b \ --hash=sha256:1cac689f80a3abab2d3c0048b29eea5751114054f032a941a32de4c852c59cad \
--hash=sha256:1ae8048cba95f382dba56766525abca438328455e35c283bb202964f41a780b0 \ --hash=sha256:21b888c973e4f26b7a96491c0965a8a312e13be108022ee510248fe379a5fa23 \
--hash=sha256:20f724a023042588d0f4396bbbcf4cffd0ddd0ad3ed4f0d8e6d4ac4264bae81e \ --hash=sha256:287073c66748f624be4cef893ef9174e3eb88fe0b8a78dc22e88eca4bc357ca6 \
--hash=sha256:2133b0e412a47868a358713287ff9f9a328879da547dc88be67481cdac529118 \ --hash=sha256:2a1ef6a36fdbf71538142ed604ad19b82f67b05749512e47f247a6ddd06afdc7 \
--hash=sha256:21e3298486c4ea4e4d5cc6fb69e06fb02a4e22089304308817035ac006a7f506 \ --hash=sha256:2a72fb9963cba4cd5793854fd12f4cfee731e86df140f59ff52a49b3552db241 \
--hash=sha256:21ebaa4bf6386a3b22eec518da7d679c8363fb7fb70cf6972161e5542f470798 \ --hash=sha256:2acca2be4bb2f2147ada8cac612f8a98fc09f41c89f87add7256ad27332c2fda \
--hash=sha256:23632132f1fd608034f1a56cc3e484be00854db845b3a4a508834be5a6435a6f \ --hash=sha256:2f583bd01bbfbff4eaee0868e6fc607efdfcc2b03c1c766b06a707abbc856187 \
--hash=sha256:2d5bea8012df5bb6dda1e67d0563ac50b7f64a5d5858348b5c8cb5043811c19d \ --hash=sha256:33809aebac276089b78db106ee692bdc9044710e26f24a9a2eaa35a0f9fa70ba \
--hash=sha256:300616102fb71241ff477a2cbbc847321dbec49428434a2f17f37528721c4948 \ --hash=sha256:36fa178aacbc277bc6b62a2c3da95226520da4f4e9e206fdf076484363895d2c \
--hash=sha256:30a8259569fbeec49cfac7fda3ec8123486ef1b729225222f0d41d5f840b476f \ --hash=sha256:4204e773b4b408062960e65468d5346bdfe139247ee5f1ca2a378983e11388a2 \
--hash=sha256:399166f24c33a0c5759ecc4801f040dbc87d412c1a6d6292b2349b4c505effc9 \ --hash=sha256:4384a8f68ddb31a0b0c3deae88765f5868a1b9148939c3f4121233314ad5532c \
--hash=sha256:3fac641bbfa43d5a1bed99d28aa1fded1984d31c670a95aac1bf1d36ac6ce137 \ --hash=sha256:456855f57b413f077dff513a5a28ed838dbbb15082ba00f80750377eed23d132 \
--hash=sha256:42c29d54ed4501a30cd71015bf982fa95e4a60117b44e1a200290ce687d3e640 \ --hash=sha256:49d5d58abd4b83fb8ce763be7794d09b2f50f10aa65c0f0c1696c677edeb7cbf \
--hash=sha256:462d599299c5971f03c676e2b63aa80fec5ebc572d89ce766cd11ca8bcb56f3f \ --hash=sha256:4ac6b4ce1e7283d715c4b729d8f9dab9627586dafce81d9eaa009dd7f25dd972 \
--hash=sha256:4eebbd049008eb800f519578e944b8dc8e0f7d59a5abb5924cc2d4ed3a1834ff \ --hash=sha256:4df8a199d9f6afc5ae9a65f8f95ee52cae389a8c6b20163762bde0426275b7db \
--hash=sha256:502c062a18d84452858f8aea1e520e12a4d5228fc3621ea5061409d666ea1706 \ --hash=sha256:500960cb3a0543a724a81ba859da816e8cf01b0e6aaeedf2c3775d12ee49cade \
--hash=sha256:5317c04349472e683803da262c781c42c5628a9be73f4750ac7d13040efb5d2d \ --hash=sha256:519ae0312616026bf4cedc0fe459e982734f3ca82ee8c7246c19b650b60a5ee4 \
--hash=sha256:5511f962dd1b9b553e9534c3b9c6a4b0c9ded3d8c2be96e61d56f933feef9e1f \ --hash=sha256:578114bc803a4c1ff9946d977c221e4376620a46cf78da267d946397dc9514a8 \
--hash=sha256:561be4e3e952c2f9056fba5267b99be4ec2afadc27261505d4992c50b33c513c \ --hash=sha256:5c5cbc703168d1b7a838668998308018a2718c2130595e8e190220238addc96f \
--hash=sha256:601d3e42452cd4f2891c13fa8c70366d71851c1593ed42f57bf37f40f7dca3c8 \ --hash=sha256:6162f8d2dc27ba21027f261e4fa26f8bcb3cf9784b7f9499466a311ac284b5b9 \
--hash=sha256:644904600c15816a1f9a1bafa6aab0d21db2788abcdf4e2a77951280473f33e1 \ --hash=sha256:704d35ecc7e9c31d48926150afada60401c55efa3b46cd1ded5a01bdffaf1d48 \
--hash=sha256:653a5dfd00f601a0ed6654a8b877b18d65ac32c9d9997456e0ab240807be6cf7 \ --hash=sha256:716b542728d4c742353448765aa7cdaa519a7b82f9564130e2b3f6766018c9ec \
--hash=sha256:694a5e9f1f2c124a17ff2d0be613fd53ba0c26de588eb4bdab8bca855e550d95 \ --hash=sha256:72282ad4892a9fb2da25defeac8c2e84352c108705c972db82ab121d15f14e6d \
--hash=sha256:71b4a48a7427f14679f0015b13c712863d28bb1ab700bd11776a5368135c7d60 \ --hash=sha256:7233d65d9d651242a68801159763d09e9ec96e8a158dbf118dc090cd77a104c9 \
--hash=sha256:72bf9308a82b75039b8c8edd2be2924c352eda5da14a920551a8b65d5ee89253 \ --hash=sha256:732da3243e1b8d3eab8c6ae23ae6a58548849d2e4a4e03a1924c8ddf71a387cb \
--hash=sha256:735dceec50fa907a3c314b84ed609dec54b76a814aa14eb90da31d1d36873a5e \ --hash=sha256:75b81e678d1c1ede0785c7f46690621e4c6e63ccd9192af1f0bd9d504bbb6bf4 \
--hash=sha256:73802194f10c394c2bedce7a135ba1d8ba6cff23adf4217612bfc5cf060de34c \ --hash=sha256:75f76ee558751746d6a38f89d60b6228fa174e5172d143886af0f85aa306fd89 \
--hash=sha256:780daad9e35b18d10d7219d24bfb30148ca2afc309928e1d4d53de86822593dc \ --hash=sha256:7ee8d5f878dccb6d499ba4d30d757111847b6849ae07acdd1205fffa1fc1253c \
--hash=sha256:8655f55fe68c4685673265a650ef71beb2d31871c049c8b80262026f23605ee3 \ --hash=sha256:7f752826b5b8361193df55afcdf8ca6a57d0232653494ba473630a83ba50d8c9 \
--hash=sha256:877045a7969ace04d59516d5d6a7dee13106822f99a5d8df5e6822941f7bedc8 \ --hash=sha256:86b3d0033580bd6bbe07590152007275bd7af95f98eaa5bd36f3da219dcd93da \
--hash=sha256:87bce04f09f0552b66fca0c4e10da78d17cb0e71c205864bab4e9595122cb9d9 \ --hash=sha256:8d62da299c6ecb04df729e4b5c52dc0d53f4f8430b4492b93aa8de1f541c4aac \
--hash=sha256:8d4dfc66abea3ec6d9f83e837a8f8a7d9d3a76d25c9911735c76d6745950e62c \ --hash=sha256:8e47755d8152c1ab5b55928ab422a76e2e7b22b5ed8e90a7d584268dd49e9c6b \
--hash=sha256:8ec364e280db4235389b5e1e6ee924723c693cbc98e9d28dc1767041ff9bc388 \ --hash=sha256:9091632a25b8b87b9a605ec0e61f241c456e9248bfdcf7abdf344fdb169c81cf \
--hash=sha256:8fa00fa24ffd8c31fac081bf7be7eb495be6d248db127f8776575a746fa55c95 \ --hash=sha256:936e5db01dd49476fa8f4383c259b8b1303d5dd5fb34c97de194560698cc2c5e \
--hash=sha256:920c4897e55e2881db6a6da151198e5001552c3777cd42b8a4c2f72eedc2ee91 \ --hash=sha256:99b6add4c0b39a513d323d3b93bc173dac663c27b99860dd5bf491b240d26137 \
--hash=sha256:920f4633bee43d7a2818e1a1a788906df5a17b7ab6fe411220ed92b42940f818 \ --hash=sha256:9c865a7ee6f93783bd5d781af5a4c43dadc37053a5b42f7d18dc019f8c9d2bd1 \
--hash=sha256:9795f56aa6b2296f05ac79d8a424e94056730c0b860a62b0fdcfe6340b658cc8 \ --hash=sha256:a425479ee40ff021f8216c9d07a6a3b54b31c8267c6e17aa88b70d7ebd0e5e5b \
--hash=sha256:98f0edee7ee9cc7f9221af2e1b95bd02810e1c7a6d115cfd82698803d385b28f \ --hash=sha256:a4b2bf78342c40b3dc830880106f54328928ff03e357935ad26c7128bbd66ce8 \
--hash=sha256:99c095457eea8550c9fa9a7a992e842aeae1429dab6b6b378710f62bfb70b394 \ --hash=sha256:a6b1bb0827f56654b4437955555dc3aeeebeddc47c2d7ed575477f082622c49e \
--hash=sha256:99d3a433ef5dc3021c9534a58a3686c88363c591974c16c54a01af7efd741f13 \ --hash=sha256:aaf09e615a0bf98d406657e0008e4a8701b11481840be7d31755dc9f97c44053 \
--hash=sha256:99f9a50b56713a598d33bc23a9912224fc5d7f9f292444e6664236ae471ddf17 \ --hash=sha256:b1f6f5938d63c6139860f044e2538baeee6f0b251a1816e7adb6cbce106a1f01 \
--hash=sha256:9c46e556ee266ed3fb7b7a882b53df3c76b45e872fdab8d9cf49ae5e91147fd7 \ --hash=sha256:b29eeb887aa931c2fcef5aa515d9d176d25006794610c264ddc114c053bf96fe \
--hash=sha256:9f5d37ff01edcbace53a402e80793640c25798fb7208f105d87a25e6fcc9ea06 \ --hash=sha256:b3992a322a5617ded0a9f23fd06dbc1e4bd7cf39bc4ccf344b10f80af58beacd \
--hash=sha256:a0b4cfe408cd84c53bab7d83e4209458de676a6ec5e9c623ae914ce1cb79b96f \ --hash=sha256:b5b6079cc452a7c53dd378c6f881ac528246b3ac9aae0f8eef98498a75657805 \
--hash=sha256:a497be217818c318d93f07e14502ef93d44e6a20c72b04c530611e45e54c2196 \ --hash=sha256:b60cc1a081f80a2105a59385b92d82278b15d80ebb3adb200542ae165cd7d183 \
--hash=sha256:ac89ccc39cd1d556cc72d6752f252dc869dde41c7c936e86beac5eb555041b66 \ --hash=sha256:b926dd38db1519ed3043a4de50214e0d600d404099c3392f098a7f9d75029ff8 \
--hash=sha256:adf28099d061a25fbcc6531febb7a091e027605385de9fe14dd6a97319d614cf \ --hash=sha256:bd87f48924f360e5d1c5f770d6155ce0e7d83f7b4e10c2f9ec001c73cf475c99 \
--hash=sha256:afa01d25769af33a8dac0d905d5c7bb2d73c7c3d5161b2dd6f8b5b5eea6a3c4c \ --hash=sha256:bda1ee3e08252b8d41fa5537413ffdddd58fa73107171a126d3b9ff001b9b820 \
--hash=sha256:b1fc07896fc1851558f532dffc8987e526b682ec73140886c831d773cef44b76 \ --hash=sha256:be0ec334369316fa73448cc8c982c01e5d2a81c95969d58b8f6e272884df0074 \
--hash=sha256:b49c604ace7a7aa8af31196abbf8f2193be605db6739ed905ecaf62af31ccae0 \ --hash=sha256:c6119dc90483a5cb50a1306adb8d52c66e447da88ea44f323e0ae1a5fcb14256 \
--hash=sha256:b9f3e0bffad6e238f7acc20c393c1ed8fab4371e3b3bc311020dfa6020d99212 \ --hash=sha256:c9803edf8e29bd825f43481f19c37f50d2b01899448273b3a7758441b512acf8 \
--hash=sha256:ba07646f35e4e49376c9831130039d1b478fbfa1215ae62ad62d2ee63cf9c18f \ --hash=sha256:c9bd22a2a639e26171068f8ebb5400ce2c1bc7d17959f60a3b753ae13c632975 \
--hash=sha256:bd88f40f2294440d3f3c6308e50d96a0d3d0973d6f1a5732875d10f569acef49 \ --hash=sha256:cbcc558401de90a746d02ef330c528f2e668c83350f045833543cd57ecead1ad \
--hash=sha256:c0be58529d43d38ae849a91932391eb93275a06b93b79a8ab828b012e916a206 \ --hash=sha256:cf6204fe865da605285c34cf1172879d0314ff267b1c35ff59de7154f35fdc2e \
--hash=sha256:c45f62e4107ebd05166717ac58f6feb44471ed450d07fecd90e5f69d9bf03c48 \ --hash=sha256:d33dd21f572545649f90c38c227cc8631268ba25c460b5569abebdd0ec5974ca \
--hash=sha256:c56da23034fe66221f2208c813d8aa509eea34d97328ce2add56e219c3a9f41c \ --hash=sha256:d89ca19cdd0dd5f31606a9329e309d4fcbb3df860960acec32630297d61820df \
--hash=sha256:c94b5537bf6ce66e4d7830c6993152940a188600f6ae044435287753044a8fe2 \ --hash=sha256:d8f99b147ff3fcf6b3cc60cb0c39ea443884d5559a30b1481e92495f2310ff2b \
--hash=sha256:cebf8d56fee3b08ad40d332a807ecccd4153d3f1ba8231e111d9759f02edfd05 \ --hash=sha256:d937653a696465677ed583124b94a4b2d79f5e30b2c46115a68e482c6a591c8a \
--hash=sha256:d0bf6f93a55d3fa7a079d811b29100b019784e2ee6bc06b0bb839538272a5610 \ --hash=sha256:dcca5d2bf65c6fb591fff92da03f94cd4f315972f97c21975398bd4bd046854a \
--hash=sha256:d195add190abccefc70ad0f9a0141ad7da53e16183048380e688b466702195dd \ --hash=sha256:ded1c35f15c9dea16ead9bffcde9bb5c7c031bff076355dc58dcb1cb436c4721 \
--hash=sha256:d25ef0c33f22649b7a088035fd65ac1ce6464fa2876578df1adad9472f918a76 \ --hash=sha256:e3e70c94a0c3841e6aa831edab1619ad5c511199be94d0c11ba75fe06efe107a \
--hash=sha256:d6cbdf12ef967a6aa401cf5cdf47850559e59eedad10e781471c960583f25aa1 \ --hash=sha256:e56f8186d6210ac7ece503193ec84104da7ceb98f68ce18c07282fcc2452e76f \
--hash=sha256:d8c032ccee90b37b44e05948b449a2d6baed7e614df3d3f47fe432c952c21b60 \ --hash=sha256:e7774b570e61cb998490c5235740d475413a1f6de823169b4cf94e2fe9e9f6b2 \
--hash=sha256:daff04257b49ab7f4b3f73f98283d3dbb1a65bf3500d55c7beac3c66c310fe34 \ --hash=sha256:e7c6ed0dc9d8e65f24f5824291550139fe6f37fac03788d4580da0d33bc00c97 \
--hash=sha256:e83ebbf020be727d6e0991c1b192a5c2e7113eb66e3def0cd0c62f9f266247e4 \ --hash=sha256:ec08be75bb268473677edb83ba71e7e74b43c008e4a7b1907c6d57e940bf34b6 \
--hash=sha256:ed3025a8a7e5a59817b7494686d449ebfbe301f3e757b852c8d0d1961d6be864 \ --hash=sha256:ecdf6bf5f578615f2e985a5e1f6572e23aa632c4bd1dc67f8f406d445ac115ed \
--hash=sha256:f1936ef138bed2165dd8573aa65e3095ef7c2b6247faccd0e15186aabdda7f66 \ --hash=sha256:ed25e1835c00a332cb10c683cd39da96a719ab1dfc08427d476bce41b92531fc \
--hash=sha256:f5247a3d74355f8b1d780d0f3b32a23dd9f6d3ff43ef2037c6dcd249f35ecf4c \ --hash=sha256:f4cb85f693044e0f71f394ff76c98ddc1bc0953e48c061725e540396d5c8a2e1 \
--hash=sha256:fa496cd45cda0165d597e9d6f01e36c33c9508f75cf03c0a650018c5048f578e \ --hash=sha256:f53aace168a2a10582e570b7736cc5bef12cae9cf21775e3eafac597e8551fbe \
--hash=sha256:fb4363e6c9fc87365c2bc777a1f585a22f2f56642501885ffc7942138499bf54 \ --hash=sha256:f651dd19363c632f4abe3480a7c87a9773be27cfe1341aef06e8759599454120 \
--hash=sha256:fb4370b15111905bf8b5ba2129b926af9470f014cb0493a67d23e9d7a48348e8 \ --hash=sha256:fc4ad7f7ee1a13d9cb49d8198cd7d7e3aa93e425f371a68235f784e99741561f \
--hash=sha256:fbec2af0ebafa57eb82c18c304b37c86a8abddf7022955d1742b3d5471a6339e --hash=sha256:fee427241c2d9fb7192b658190f9f5fd6dfe41e02f3c1489d2ec1e6a5ab1e04a
pydantic==2.6.0 ; python_version >= "3.8" and python_full_version < "4.0.0" \ pydantic==2.6.4 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:1440966574e1b5b99cf75a13bec7b20e3512e8a61b894ae252f56275e2c465ae \ --hash=sha256:b1704e0847db01817624a6b86766967f552dd9dbf3afba4004409f908dcc84e6 \
--hash=sha256:ae887bd94eb404b09d86e4d12f93893bdca79d766e738528c6fa1c849f3c6bcf --hash=sha256:cc46fce86607580867bdc3361ad462bab9c222ef042d3da86f2fb333e1d916c5
pymacaroons==0.13.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \ pymacaroons==0.13.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:1e6bba42a5f66c245adf38a5a4006a99dcc06a0703786ea636098667d42903b8 \ --hash=sha256:1e6bba42a5f66c245adf38a5a4006a99dcc06a0703786ea636098667d42903b8 \
--hash=sha256:3e14dff6a262fdbf1a15e769ce635a8aea72e6f8f91e408f9a97166c53b91907 --hash=sha256:3e14dff6a262fdbf1a15e769ce635a8aea72e6f8f91e408f9a97166c53b91907

View file

@ -1040,10 +1040,10 @@ class Porter:
return done, remaining + done return done, remaining + done
async def _setup_state_group_id_seq(self) -> None: async def _setup_state_group_id_seq(self) -> None:
curr_id: Optional[ curr_id: Optional[int] = (
int await self.sqlite_store.db_pool.simple_select_one_onecol(
] = await self.sqlite_store.db_pool.simple_select_one_onecol( table="state_groups", keyvalues={}, retcol="MAX(id)", allow_none=True
table="state_groups", keyvalues={}, retcol="MAX(id)", allow_none=True )
) )
if not curr_id: if not curr_id:
@ -1132,13 +1132,13 @@ class Porter:
) )
async def _setup_auth_chain_sequence(self) -> None: async def _setup_auth_chain_sequence(self) -> None:
curr_chain_id: Optional[ curr_chain_id: Optional[int] = (
int await self.sqlite_store.db_pool.simple_select_one_onecol(
] = await self.sqlite_store.db_pool.simple_select_one_onecol( table="event_auth_chains",
table="event_auth_chains", keyvalues={},
keyvalues={}, retcol="MAX(chain_id)",
retcol="MAX(chain_id)", allow_none=True,
allow_none=True, )
) )
def r(txn: LoggingTransaction) -> None: def r(txn: LoggingTransaction) -> None:
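For context, the `simple_select_one_onecol` calls reshaped in this hunk boil down to a single-row, single-column query. A minimal sketch against an in-memory SQLite database — the table and column names come from the hunk above; the sample ids are invented:

```python
# Hedged sketch: what the reshaped simple_select_one_onecol call amounts to.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE state_groups (id INTEGER)")
conn.executemany("INSERT INTO state_groups VALUES (?)", [(1,), (7,), (3,)])

# retcol="MAX(id)" with keyvalues={} is effectively this aggregate SELECT:
(curr_id,) = conn.execute("SELECT MAX(id) FROM state_groups").fetchone()
assert curr_id == 7  # with allow_none=True, an empty table would surface None
```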

View file

@ -43,7 +43,6 @@ MAIN_TIMELINE: Final = "main"
class Membership: class Membership:
"""Represents the membership states of a user in a room.""" """Represents the membership states of a user in a room."""
INVITE: Final = "invite" INVITE: Final = "invite"
@ -130,6 +129,8 @@ class EventTypes:
Reaction: Final = "m.reaction" Reaction: Final = "m.reaction"
CallInvite: Final = "m.call.invite"
class ToDeviceEventTypes: class ToDeviceEventTypes:
RoomKeyRequest: Final = "m.room_key_request" RoomKeyRequest: Final = "m.room_key_request"

View file

@ -370,9 +370,11 @@ class RoomVersionCapability:
MSC3244_CAPABILITIES = { MSC3244_CAPABILITIES = {
cap.identifier: { cap.identifier: {
"preferred": cap.preferred_version.identifier "preferred": (
if cap.preferred_version is not None cap.preferred_version.identifier
else None, if cap.preferred_version is not None
else None
),
"support": [ "support": [
v.identifier v.identifier
for v in KNOWN_ROOM_VERSIONS.values() for v in KNOWN_ROOM_VERSIONS.values()
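This hunk, like several below, appears to be a purely mechanical restyle (consistent with Black's 2024 stable style, which wraps multi-line conditional expressions in their own parentheses). A tiny before/after sketch with invented names, not the real `RoomVersionCapability` objects:

```python
# Hedged sketch of the restyle only; "preferred" is an invented stand-in.
preferred = None

# old style: the conditional expression hung bare off the dict key
# "preferred": preferred.identifier
#              if preferred is not None
#              else None,

# new style: the whole conditional expression gets its own parentheses
capabilities = {
    "preferred": (preferred.identifier if preferred is not None else None),
}
print(capabilities)  # {'preferred': None}
```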

View file

@ -188,9 +188,9 @@ class SynapseHomeServer(HomeServer):
PasswordResetSubmitTokenResource, PasswordResetSubmitTokenResource,
) )
resources[ resources["/_synapse/client/password_reset/email/submit_token"] = (
"/_synapse/client/password_reset/email/submit_token" PasswordResetSubmitTokenResource(self)
] = PasswordResetSubmitTokenResource(self) )
if name == "consent": if name == "consent":
from synapse.rest.consent.consent_resource import ConsentResource from synapse.rest.consent.consent_resource import ConsentResource

View file

@ -362,16 +362,16 @@ class ApplicationServiceApi(SimpleHttpClient):
# TODO: Update to stable prefixes once MSC3202 completes FCP merge # TODO: Update to stable prefixes once MSC3202 completes FCP merge
if service.msc3202_transaction_extensions: if service.msc3202_transaction_extensions:
if one_time_keys_count: if one_time_keys_count:
body[ body["org.matrix.msc3202.device_one_time_key_counts"] = (
"org.matrix.msc3202.device_one_time_key_counts" one_time_keys_count
] = one_time_keys_count )
body[ body["org.matrix.msc3202.device_one_time_keys_count"] = (
"org.matrix.msc3202.device_one_time_keys_count" one_time_keys_count
] = one_time_keys_count )
if unused_fallback_keys: if unused_fallback_keys:
body[ body["org.matrix.msc3202.device_unused_fallback_key_types"] = (
"org.matrix.msc3202.device_unused_fallback_key_types" unused_fallback_keys
] = unused_fallback_keys )
if device_list_summary: if device_list_summary:
body["org.matrix.msc3202.device_lists"] = { body["org.matrix.msc3202.device_lists"] = {
"changed": list(device_list_summary.changed), "changed": list(device_list_summary.changed),

View file

@@ -342,6 +342,9 @@ def _parse_oidc_config_dict(
         user_mapping_provider_config=user_mapping_provider_config,
         attribute_requirements=attribute_requirements,
         enable_registration=oidc_config.get("enable_registration", True),
+        additional_authorization_parameters=oidc_config.get(
+            "additional_authorization_parameters", {}
+        ),
     )
@@ -444,3 +447,6 @@ class OidcProviderConfig:
     # Whether automatic registrations are enabled in the OIDC flow. Defaults to True
     enable_registration: bool
+
+    # Additional parameters that will be passed to the authorization grant URL
+    additional_authorization_parameters: Mapping[str, str]
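
To make the new option concrete, here is a minimal sketch of how a parsed provider entry carries it through the parsing above; the surrounding keys and the acr_values parameter are illustrative, not something Synapse requires:

    # Hypothetical provider entry as it might look after YAML parsing.
    oidc_config = {
        "idp_id": "example",
        "issuer": "https://id.example.com/",
        "client_id": "synapse",
        "additional_authorization_parameters": {"acr_values": "2fa"},
    }

    # Mirrors the parsing above: a missing key falls back to an empty dict.
    additional = oidc_config.get("additional_authorization_parameters", {})
    assert additional == {"acr_values": "2fa"}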

View file

@@ -171,9 +171,9 @@ class RegistrationConfig(Config):
         refreshable_access_token_lifetime = self.parse_duration(
             refreshable_access_token_lifetime
         )
-        self.refreshable_access_token_lifetime: Optional[
-            int
-        ] = refreshable_access_token_lifetime
+        self.refreshable_access_token_lifetime: Optional[int] = (
+            refreshable_access_token_lifetime
+        )

         if (
             self.session_lifetime is not None

View file

@@ -201,9 +201,9 @@ class ContentRepositoryConfig(Config):
             provider_config["module"] == "file_system"
             or provider_config["module"] == "synapse.rest.media.v1.storage_provider"
         ):
-            provider_config[
-                "module"
-            ] = "synapse.media.storage_provider.FileStorageProviderBackend"
+            provider_config["module"] = (
+                "synapse.media.storage_provider.FileStorageProviderBackend"
+            )

         provider_class, parsed_config = load_module(
             provider_config, ("media_storage_providers", "<item %i>" % i)

View file

@@ -88,8 +88,7 @@ class _EventSourceStore(Protocol):
         redact_behaviour: EventRedactBehaviour,
         get_prev_content: bool = False,
         allow_rejected: bool = False,
-    ) -> Dict[str, "EventBase"]:
-        ...
+    ) -> Dict[str, "EventBase"]: ...


 def validate_event_for_room_version(event: "EventBase") -> None:

View file

@@ -93,16 +93,14 @@ class DictProperty(Generic[T]):
         self,
         instance: Literal[None],
         owner: Optional[Type[_DictPropertyInstance]] = None,
-    ) -> "DictProperty":
-        ...
+    ) -> "DictProperty": ...

     @overload
     def __get__(
         self,
         instance: _DictPropertyInstance,
         owner: Optional[Type[_DictPropertyInstance]] = None,
-    ) -> T:
-        ...
+    ) -> T: ...

     def __get__(
         self,
@@ -161,16 +159,14 @@ class DefaultDictProperty(DictProperty, Generic[T]):
         self,
         instance: Literal[None],
         owner: Optional[Type[_DictPropertyInstance]] = None,
-    ) -> "DefaultDictProperty":
-        ...
+    ) -> "DefaultDictProperty": ...

     @overload
     def __get__(
         self,
         instance: _DictPropertyInstance,
         owner: Optional[Type[_DictPropertyInstance]] = None,
-    ) -> T:
-        ...
+    ) -> T: ...

     def __get__(
         self,

View file

@@ -612,9 +612,9 @@ class EventClientSerializer:
         serialized_aggregations = {}

         if event_aggregations.references:
-            serialized_aggregations[
-                RelationTypes.REFERENCE
-            ] = event_aggregations.references
+            serialized_aggregations[RelationTypes.REFERENCE] = (
+                event_aggregations.references
+            )

         if event_aggregations.replace:
             # Include information about it in the relations dict.

View file

@@ -169,9 +169,9 @@ class FederationServer(FederationBase):
         # We cache responses to state queries, as they take a while and often
         # come in waves.
-        self._state_resp_cache: ResponseCache[
-            Tuple[str, Optional[str]]
-        ] = ResponseCache(hs.get_clock(), "state_resp", timeout_ms=30000)
+        self._state_resp_cache: ResponseCache[Tuple[str, Optional[str]]] = (
+            ResponseCache(hs.get_clock(), "state_resp", timeout_ms=30000)
+        )
         self._state_ids_resp_cache: ResponseCache[Tuple[str, str]] = ResponseCache(
             hs.get_clock(), "state_ids_resp", timeout_ms=30000
         )

View file

@@ -88,9 +88,9 @@ class FederationRemoteSendQueue(AbstractFederationSender):
         # Stores the destinations we need to explicitly send presence to about a
         # given user.
         # Stream position -> (user_id, destinations)
-        self.presence_destinations: SortedDict[
-            int, Tuple[str, Iterable[str]]
-        ] = SortedDict()
+        self.presence_destinations: SortedDict[int, Tuple[str, Iterable[str]]] = (
+            SortedDict()
+        )

         # (destination, key) -> EDU
         self.keyed_edu: Dict[Tuple[str, tuple], Edu] = {}

View file

@@ -192,10 +192,9 @@ sent_pdus_destination_dist_total = Counter(
 )

 # Time (in s) to wait before trying to wake up destinations that have
-# catch-up outstanding. This will also be the delay applied at startup
-# before trying the same.
+# catch-up outstanding.
 # Please note that rate limiting still applies, so while the loop is
-# executed every X seconds the destinations may not be wake up because
+# executed every X seconds the destinations may not be woken up because
 # they are being rate limited following previous attempt failures.
 WAKEUP_RETRY_PERIOD_SEC = 60

@@ -428,18 +427,17 @@ class FederationSender(AbstractFederationSender):
             / hs.config.ratelimiting.federation_rr_transactions_per_room_per_second
         )

-        self._external_cache = hs.get_external_cache()
-        self._destination_wakeup_queue = _DestinationWakeupQueue(self, self.clock)
-
         # Regularly wake up destinations that have outstanding PDUs to be caught up
-        self.clock.looping_call(
+        self.clock.looping_call_now(
             run_as_background_process,
             WAKEUP_RETRY_PERIOD_SEC * 1000.0,
             "wake_destinations_needing_catchup",
             self._wake_destinations_needing_catchup,
         )

+        self._external_cache = hs.get_external_cache()
+        self._destination_wakeup_queue = _DestinationWakeupQueue(self, self.clock)
+
     def _get_per_destination_queue(self, destination: str) -> PerDestinationQueue:
         """Get or create a PerDestinationQueue for the given destination

View file

@@ -118,10 +118,10 @@ class AccountHandler:
         }

         if self._use_account_validity_in_account_status:
-            status[
-                "org.matrix.expired"
-            ] = await self._account_validity_handler.is_user_expired(
-                user_id.to_string()
-            )
+            status["org.matrix.expired"] = (
+                await self._account_validity_handler.is_user_expired(
+                    user_id.to_string()
+                )
+            )

         return status

View file

@@ -2185,7 +2185,7 @@ class PasswordAuthProvider:
             # result is always the right type, but as it is 3rd party code it might not be
             if not isinstance(result, tuple) or len(result) != 2:
-                logger.warning(
+                logger.warning(  # type: ignore[unreachable]
                     "Wrong type returned by module API callback %s: %s, expected"
                     " Optional[Tuple[str, Optional[Callable]]]",
                     callback,
@@ -2248,7 +2248,7 @@ class PasswordAuthProvider:
             # result is always the right type, but as it is 3rd party code it might not be
             if not isinstance(result, tuple) or len(result) != 2:
-                logger.warning(
+                logger.warning(  # type: ignore[unreachable]
                     "Wrong type returned by module API callback %s: %s, expected"
                     " Optional[Tuple[str, Optional[Callable]]]",
                     callback,

View file

@@ -18,9 +18,11 @@
 # [This file includes modifications made by New Vector Limited]
 #
 #
+import itertools
 import logging
 from typing import TYPE_CHECKING, Optional

+from synapse.api.constants import Membership
 from synapse.api.errors import SynapseError
 from synapse.handlers.device import DeviceHandler
 from synapse.metrics.background_process_metrics import run_as_background_process
@@ -168,9 +170,9 @@ class DeactivateAccountHandler:
         # parts users from rooms (if it isn't already running)
         self._start_user_parting()

-        # Reject all pending invites for the user, so that the user doesn't show up in the
-        # "invited" section of rooms' members list.
-        await self._reject_pending_invites_for_user(user_id)
+        # Reject all pending invites and knocks for the user, so that the
+        # user doesn't show up in the "invited" section of rooms' members list.
+        await self._reject_pending_invites_and_knocks_for_user(user_id)

         # Remove all information on the user from the account_validity table.
         if self._account_validity_enabled:
@@ -194,34 +196,37 @@ class DeactivateAccountHandler:

         return identity_server_supports_unbinding

-    async def _reject_pending_invites_for_user(self, user_id: str) -> None:
-        """Reject pending invites addressed to a given user ID.
+    async def _reject_pending_invites_and_knocks_for_user(self, user_id: str) -> None:
+        """Reject pending invites and knocks addressed to a given user ID.

         Args:
-            user_id: The user ID to reject pending invites for.
+            user_id: The user ID to reject pending invites and knocks for.
         """
         user = UserID.from_string(user_id)
         pending_invites = await self.store.get_invited_rooms_for_local_user(user_id)
+        pending_knocks = await self.store.get_knocked_at_rooms_for_local_user(user_id)

-        for room in pending_invites:
+        for room in itertools.chain(pending_invites, pending_knocks):
             try:
                 await self._room_member_handler.update_membership(
                     create_requester(user, authenticated_entity=self._server_name),
                     user,
                     room.room_id,
-                    "leave",
+                    Membership.LEAVE,
                     ratelimit=False,
                     require_consent=False,
                 )
                 logger.info(
-                    "Rejected invite for deactivated user %r in room %r",
+                    "Rejected %r for deactivated user %r in room %r",
+                    room.membership,
                     user_id,
                     room.room_id,
                 )
             except Exception:
                 logger.exception(
-                    "Failed to reject invite for user %r in room %r:"
+                    "Failed to reject %r for user %r in room %r:"
                     " ignoring and continuing",
+                    room.membership,
                     user_id,
                     room.room_id,
                 )
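
The rejection loop above treats both membership lists uniformly; a toy illustration of the itertools.chain pattern (room IDs are made up):

    import itertools

    pending_invites = ["!invited:example.org"]
    pending_knocks = ["!knocked:example.org"]

    # One pass over both lists, without building a combined copy first.
    for room_id in itertools.chain(pending_invites, pending_knocks):
        print("rejecting pending membership in", room_id)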

View file

@@ -270,9 +270,9 @@ class DirectoryHandler:
     async def get_association(self, room_alias: RoomAlias) -> JsonDict:
         room_id = None
         if self.hs.is_mine(room_alias):
-            result: Optional[
-                RoomAliasMapping
-            ] = await self.get_association_from_room_alias(room_alias)
+            result: Optional[RoomAliasMapping] = (
+                await self.get_association_from_room_alias(room_alias)
+            )

             if result:
                 room_id = result.room_id

View file

@@ -1001,11 +1001,11 @@ class FederationHandler:
         )

         if include_auth_user_id:
-            event_content[
-                EventContentFields.AUTHORISING_USER
-            ] = await self._event_auth_handler.get_user_which_could_invite(
-                room_id,
-                state_ids,
-            )
+            event_content[EventContentFields.AUTHORISING_USER] = (
+                await self._event_auth_handler.get_user_which_could_invite(
+                    room_id,
+                    state_ids,
+                )
+            )

         builder = self.event_builder_factory.for_room_version(

View file

@@ -1367,9 +1367,9 @@ class FederationEventHandler:
             )

             if remote_event.is_state() and remote_event.rejected_reason is None:
-                state_map[
-                    (remote_event.type, remote_event.state_key)
-                ] = remote_event.event_id
+                state_map[(remote_event.type, remote_event.state_key)] = (
+                    remote_event.event_id
+                )

         return state_map

View file

@@ -34,6 +34,7 @@ from synapse.api.constants import (
     EventTypes,
     GuestAccess,
     HistoryVisibility,
+    JoinRules,
     Membership,
     RelationTypes,
     UserTypes,
@@ -1325,6 +1326,18 @@ class EventCreationHandler:
             self.validator.validate_new(event, self.config)
             await self._validate_event_relation(event)

+        if event.type == EventTypes.CallInvite:
+            room_id = event.room_id
+            room_info = await self.store.get_room_with_stats(room_id)
+            assert room_info is not None
+
+            if room_info.join_rules == JoinRules.PUBLIC:
+                raise SynapseError(
+                    403,
+                    "Call invites are not allowed in public rooms.",
+                    Codes.FORBIDDEN,
+                )
+
         logger.debug("Created event %s", event.event_id)

         return event, context
@@ -1656,9 +1669,9 @@ class EventCreationHandler:
                 expiry_ms=60 * 60 * 1000,
             )

-            self._external_cache_joined_hosts_updates[
-                state_entry.state_group
-            ] = None
+            self._external_cache_joined_hosts_updates[state_entry.state_group] = (
+                None
+            )

     async def _validate_canonical_alias(
         self,

View file

@@ -65,6 +65,7 @@ from synapse.http.server import finish_request
 from synapse.http.servlet import parse_string
 from synapse.http.site import SynapseRequest
 from synapse.logging.context import make_deferred_yieldable
+from synapse.module_api import ModuleApi
 from synapse.types import JsonDict, UserID, map_username_to_mxid_localpart
 from synapse.util import Clock, json_decoder
 from synapse.util.caches.cached_call import RetryOnExceptionCachedCall
@@ -421,9 +422,19 @@ class OidcProvider:
         # from the IdP's jwks_uri, if required.
         self._jwks = RetryOnExceptionCachedCall(self._load_jwks)

-        self._user_mapping_provider = provider.user_mapping_provider_class(
-            provider.user_mapping_provider_config
-        )
+        user_mapping_provider_init_method = (
+            provider.user_mapping_provider_class.__init__
+        )
+        if len(inspect.signature(user_mapping_provider_init_method).parameters) == 3:
+            self._user_mapping_provider = provider.user_mapping_provider_class(
+                provider.user_mapping_provider_config,
+                ModuleApi(hs, hs.get_auth_handler()),
+            )
+        else:
+            self._user_mapping_provider = provider.user_mapping_provider_class(
+                provider.user_mapping_provider_config,
+            )
+
         self._skip_verification = provider.skip_verification
         self._allow_existing_users = provider.allow_existing_users
@@ -442,6 +453,10 @@ class OidcProvider:
         # optional brand identifier for this auth provider
         self.idp_brand = provider.idp_brand

+        self.additional_authorization_parameters = (
+            provider.additional_authorization_parameters
+        )
+
         self._sso_handler = hs.get_sso_handler()
         self._device_handler = hs.get_device_handler()
@@ -818,14 +833,38 @@ class OidcProvider:
         logger.debug("Using the OAuth2 access_token to request userinfo")
         metadata = await self.load_metadata()

-        resp = await self._http_client.get_json(
+        resp = await self._http_client.request(
+            "GET",
             metadata["userinfo_endpoint"],
-            headers={"Authorization": ["Bearer {}".format(token["access_token"])]},
+            headers=Headers(
+                {"Authorization": ["Bearer {}".format(token["access_token"])]}
+            ),
         )

-        logger.debug("Retrieved user info from userinfo endpoint: %r", resp)
-        return UserInfo(resp)
+        body = await readBody(resp)
+        content_type_headers = resp.headers.getRawHeaders("Content-Type")
+        assert content_type_headers
+        # We use `startswith` because the header value can contain the `charset` parameter
+        # even if it is useless, and Twisted doesn't take care of that for us.
+        if content_type_headers[0].startswith("application/jwt"):
+            alg_values = metadata.get(
+                "id_token_signing_alg_values_supported", ["RS256"]
+            )
+            jwt = JsonWebToken(alg_values)
+            jwk_set = await self.load_jwks()
+            try:
+                decoded_resp = jwt.decode(body, key=jwk_set)
+            except ValueError:
+                logger.info("Reloading JWKS after decode error")
+                jwk_set = await self.load_jwks(force=True)  # try reloading the jwks
+                decoded_resp = jwt.decode(body, key=jwk_set)
+        else:
+            decoded_resp = json_decoder.decode(body.decode("utf-8"))
+
+        logger.debug("Retrieved user info from userinfo endpoint: %r", decoded_resp)
+        return UserInfo(decoded_resp)

     async def _verify_jwt(
         self,
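
The content-type dispatch above can be summarised in isolation. A minimal sketch using authlib, with a hypothetical parse_userinfo helper standing in for the handler method (the JWKS argument is whatever key set load_jwks returned):

    import json
    from authlib.jose import JsonWebToken

    def parse_userinfo(body: bytes, content_type: str, jwk_set) -> dict:
        # Signed JWT responses are verified against the provider's JWKS;
        # plain JSON bodies are decoded directly. startswith() tolerates a
        # trailing "; charset=..." parameter on the header value.
        if content_type.startswith("application/jwt"):
            jwt = JsonWebToken(["RS256"])
            return dict(jwt.decode(body, key=jwk_set))
        return json.loads(body.decode("utf-8"))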
@@ -971,17 +1010,21 @@ class OidcProvider:
         metadata = await self.load_metadata()

+        additional_authorization_parameters = dict(
+            self.additional_authorization_parameters
+        )
         # Automatically enable PKCE if it is supported.
-        extra_grant_values = {}
         if metadata.get("code_challenge_methods_supported"):
             code_verifier = generate_token(48)
             # Note that we verified the server supports S256 earlier (in
             # OidcProvider._validate_metadata).
-            extra_grant_values = {
-                "code_challenge_method": "S256",
-                "code_challenge": create_s256_code_challenge(code_verifier),
-            }
+            additional_authorization_parameters.update(
+                {
+                    "code_challenge_method": "S256",
+                    "code_challenge": create_s256_code_challenge(code_verifier),
+                }
+            )

         cookie = self._macaroon_generaton.generate_oidc_session_token(
             state=state,
@@ -1020,7 +1063,7 @@ class OidcProvider:
             scope=self._scopes,
             state=state,
             nonce=nonce,
-            **extra_grant_values,
+            **additional_authorization_parameters,
         )

     async def handle_oidc_callback(
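
As a worked example of the merge order (the acr_values entry is made up): the provider-configured parameters are copied first and the PKCE values are layered on top, so static config cannot clobber the per-request challenge.

    from authlib.common.security import generate_token
    from authlib.oauth2.rfc7636 import create_s256_code_challenge

    # Copied from a hypothetical provider config.
    additional_authorization_parameters = dict({"acr_values": "2fa"})

    code_verifier = generate_token(48)
    additional_authorization_parameters.update(
        {
            "code_challenge_method": "S256",
            "code_challenge": create_s256_code_challenge(code_verifier),
        }
    )
    # Both the configured and the PKCE parameters end up as extra query
    # parameters on the authorization URL, alongside scope/state/nonce.
    assert "acr_values" in additional_authorization_parameters
    assert "code_challenge" in additional_authorization_parameters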
@@ -1583,7 +1626,7 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
     This is the default mapping provider.
     """

-    def __init__(self, config: JinjaOidcMappingConfig):
+    def __init__(self, config: JinjaOidcMappingConfig, module_api: ModuleApi):
         self._config = config

     @staticmethod

View file

@@ -493,9 +493,9 @@ class WorkerPresenceHandler(BasePresenceHandler):
         # The number of ongoing syncs on this process, by (user ID, device ID).
         # Empty if _presence_enabled is false.
-        self._user_device_to_num_current_syncs: Dict[
-            Tuple[str, Optional[str]], int
-        ] = {}
+        self._user_device_to_num_current_syncs: Dict[Tuple[str, Optional[str]], int] = (
+            {}
+        )

         self.notifier = hs.get_notifier()
         self.instance_id = hs.get_instance_id()
@@ -818,9 +818,9 @@ class PresenceHandler(BasePresenceHandler):
         # Keeps track of the number of *ongoing* syncs on this process. While
         # this is non zero a user will never go offline.
-        self._user_device_to_num_current_syncs: Dict[
-            Tuple[str, Optional[str]], int
-        ] = {}
+        self._user_device_to_num_current_syncs: Dict[Tuple[str, Optional[str]], int] = (
+            {}
+        )

         # Keeps track of the number of *ongoing* syncs on other processes.
         #

View file

@@ -320,9 +320,9 @@ class ProfileHandler:
             server_name = host

         if self._is_mine_server_name(server_name):
-            media_info: Optional[
-                Union[LocalMedia, RemoteMedia]
-            ] = await self.store.get_local_media(media_id)
+            media_info: Optional[Union[LocalMedia, RemoteMedia]] = (
+                await self.store.get_local_media(media_id)
+            )
         else:
             media_info = await self.store.get_cached_remote_media(server_name, media_id)

View file

@@ -60,12 +60,12 @@ class ReadMarkerHandler:
         should_update = True

         # Get event ordering, this also ensures we know about the event
-        event_ordering = await self.store.get_event_ordering(event_id)
+        event_ordering = await self.store.get_event_ordering(event_id, room_id)

         if existing_read_marker:
             try:
                 old_event_ordering = await self.store.get_event_ordering(
-                    existing_read_marker["event_id"]
+                    existing_read_marker["event_id"], room_id
                 )
             except SynapseError:
                 # Old event no longer exists, assume new is ahead. This may
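
For intuition, the surrounding logic only moves the marker forward; a toy comparison (orderings here are plain integers standing in for stream positions):

    old_event_ordering = 10  # position of the current m.fully_read event
    event_ordering = 12      # position of the event named in the request

    # The marker is only updated when the new event is ahead of the old one.
    should_update = event_ordering > old_event_ordering
    assert should_update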

View file

@@ -188,13 +188,13 @@ class RelationsHandler:
         if include_original_event:
             # Do not bundle aggregations when retrieving the original event because
             # we want the content before relations are applied to it.
-            return_value[
-                "original_event"
-            ] = await self._event_serializer.serialize_event(
-                event,
-                now,
-                bundle_aggregations=None,
-                config=serialize_options,
-            )
+            return_value["original_event"] = (
+                await self._event_serializer.serialize_event(
+                    event,
+                    now,
+                    bundle_aggregations=None,
+                    config=serialize_options,
+                )
+            )

         if next_token:

View file

@@ -151,7 +151,7 @@ class RoomCreationHandler:
             "history_visibility": HistoryVisibility.SHARED,
             "original_invitees_have_ops": False,
             "guest_can_join": False,
-            "power_level_content_override": {},
+            "power_level_content_override": {EventTypes.CallInvite: 50},
         },
     }

@@ -538,10 +538,10 @@ class RoomCreationHandler:
         # deep-copy the power-levels event before we start modifying it
         # note that if frozen_dicts are enabled, `power_levels` will be a frozen
         # dict so we can't just copy.deepcopy it.
-        initial_state[
-            (EventTypes.PowerLevels, "")
-        ] = power_levels = copy_and_fixup_power_levels_contents(
-            initial_state[(EventTypes.PowerLevels, "")]
-        )
+        initial_state[(EventTypes.PowerLevels, "")] = power_levels = (
+            copy_and_fixup_power_levels_contents(
+                initial_state[(EventTypes.PowerLevels, "")]
+            )
+        )

         # Resolve the minimum power level required to send any state event
@@ -1374,9 +1374,11 @@ class RoomCreationHandler:
         visibility = room_config.get("visibility", "private")
         preset_name = room_config.get(
             "preset",
-            RoomCreationPreset.PRIVATE_CHAT
-            if visibility == "private"
-            else RoomCreationPreset.PUBLIC_CHAT,
+            (
+                RoomCreationPreset.PRIVATE_CHAT
+                if visibility == "private"
+                else RoomCreationPreset.PUBLIC_CHAT
+            ),
         )

         try:
             preset_config = self._presets_dict[preset_name]
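
To illustrate the preset change: for a public-chat room, the initial m.room.power_levels content would now carry an entry for call invites. A hypothetical excerpt (other keys elided):

    # Illustrative only: the shape of the events map after room creation,
    # showing the new default for m.call.invite in public rooms.
    power_levels_content = {
        "users_default": 0,
        "events": {
            "m.call.invite": 50,  # EventTypes.CallInvite: moderators and up
        },
    }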

View file

@@ -1216,11 +1216,11 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
             # If this is going to be a local join, additional information must
             # be included in the event content in order to efficiently validate
             # the event.
-            content[
-                EventContentFields.AUTHORISING_USER
-            ] = await self.event_auth_handler.get_user_which_could_invite(
-                room_id,
-                state_before_join,
-            )
+            content[EventContentFields.AUTHORISING_USER] = (
+                await self.event_auth_handler.get_user_which_could_invite(
+                    room_id,
+                    state_before_join,
+                )
+            )

         return False, []

View file

@@ -150,7 +150,7 @@ class UserAttributes:
     display_name: Optional[str] = None
     picture: Optional[str] = None
     # mypy thinks these are incompatible for some reason.
-    emails: StrCollection = attr.Factory(list)  # type: ignore[assignment]
+    emails: StrCollection = attr.Factory(list)


 @attr.s(slots=True, auto_attribs=True)

View file

@@ -41,6 +41,7 @@ from synapse.api.constants import (
     AccountDataTypes,
     EventContentFields,
     EventTypes,
+    JoinRules,
     Membership,
 )
 from synapse.api.filtering import FilterCollection
@@ -675,13 +676,22 @@ class SyncHandler:
                 )
             )

-            loaded_recents = await filter_events_for_client(
+            filtered_recents = await filter_events_for_client(
                 self._storage_controllers,
                 sync_config.user.to_string(),
                 loaded_recents,
                 always_include_ids=current_state_ids,
             )

+            loaded_recents = []
+            for event in filtered_recents:
+                if event.type == EventTypes.CallInvite:
+                    room_info = await self.store.get_room_with_stats(event.room_id)
+                    assert room_info is not None
+                    if room_info.join_rules == JoinRules.PUBLIC:
+                        continue
+                loaded_recents.append(event)
+
             log_kv({"loaded_recents_after_client_filtering": len(loaded_recents)})

             loaded_recents.extend(recents)
@@ -1014,30 +1024,6 @@ class SyncHandler:
                 if event.is_state():
                     timeline_state[(event.type, event.state_key)] = event.event_id

-            if full_state:
-                # always make sure we LL ourselves so we know we're in the room
-                # (if we are) to fix https://github.com/vector-im/riot-web/issues/7209
-                # We only need apply this on full state syncs given we disabled
-                # LL for incr syncs in https://github.com/matrix-org/synapse/pull/3840.
-                # We don't insert ourselves into `members_to_fetch`, because in some
-                # rare cases (an empty event batch with a now_token after the user's
-                # leave in a partial state room which another local user has
-                # joined), the room state will be missing our membership and there
-                # is no guarantee that our membership will be in the auth events of
-                # timeline events when the room is partial stated.
-                state_filter = StateFilter.from_lazy_load_member_list(
-                    members_to_fetch.union((sync_config.user.to_string(),))
-                )
-            else:
-                state_filter = StateFilter.from_lazy_load_member_list(
-                    members_to_fetch
-                )
-
-            # We are happy to use partial state to compute the `/sync` response.
-            # Since partial state may not include the lazy-loaded memberships we
-            # require, we fix up the state response afterwards with memberships from
-            # auth events.
-            await_full_state = False
         else:
             timeline_state = {
                 (event.type, event.state_key): event.event_id
@@ -1045,9 +1031,6 @@ class SyncHandler:
                 if event.is_state()
             }

-            state_filter = StateFilter.all()
-            await_full_state = True
-
         # Now calculate the state to return in the sync response for the room.
         # This is more or less the change in state between the end of the previous
         # sync's timeline and the start of the current sync's timeline.
@@ -1057,132 +1040,29 @@ class SyncHandler:
         # whether the room is partial stated *before* fetching it.
         is_partial_state_room = await self.store.is_partial_state_room(room_id)

         if full_state:
-            if batch:
-                state_at_timeline_end = (
-                    await self._state_storage_controller.get_state_ids_for_event(
-                        batch.events[-1].event_id,
-                        state_filter=state_filter,
-                        await_full_state=await_full_state,
-                    )
-                )
-
-                state_at_timeline_start = (
-                    await self._state_storage_controller.get_state_ids_for_event(
-                        batch.events[0].event_id,
-                        state_filter=state_filter,
-                        await_full_state=await_full_state,
-                    )
-                )
-            else:
-                state_at_timeline_end = await self.get_state_at(
-                    room_id,
-                    stream_position=now_token,
-                    state_filter=state_filter,
-                    await_full_state=await_full_state,
-                )
-
-                state_at_timeline_start = state_at_timeline_end
-
-            state_ids = _calculate_state(
-                timeline_contains=timeline_state,
-                timeline_start=state_at_timeline_start,
-                timeline_end=state_at_timeline_end,
-                previous_timeline_end={},
-                lazy_load_members=lazy_load_members,
+            state_ids = await self._compute_state_delta_for_full_sync(
+                room_id,
+                sync_config.user,
+                batch,
+                now_token,
+                members_to_fetch,
+                timeline_state,
             )
-        elif batch.limited:
-            if batch:
-                state_at_timeline_start = (
-                    await self._state_storage_controller.get_state_ids_for_event(
-                        batch.events[0].event_id,
-                        state_filter=state_filter,
-                        await_full_state=await_full_state,
-                    )
-                )
-            else:
-                # We can get here if the user has ignored the senders of all
-                # the recent events.
-                state_at_timeline_start = await self.get_state_at(
-                    room_id,
-                    stream_position=now_token,
-                    state_filter=state_filter,
-                    await_full_state=await_full_state,
-                )
-
-            # for now, we disable LL for gappy syncs - see
-            # https://github.com/vector-im/riot-web/issues/7211#issuecomment-419976346
-            # N.B. this slows down incr syncs as we are now processing way
-            # more state in the server than if we were LLing.
-            #
-            # We still have to filter timeline_start to LL entries (above) in order
-            # for _calculate_state's LL logic to work, as we have to include LL
-            # members for timeline senders in case they weren't loaded in the initial
-            # sync. We do this by (counterintuitively) by filtering timeline_start
-            # members to just be ones which were timeline senders, which then ensures
-            # all of the rest get included in the state block (if we need to know
-            # about them).
-            state_filter = StateFilter.all()
-
+        else:
             # If this is an initial sync then full_state should be set, and
             # that case is handled above. We assert here to ensure that this
             # is indeed the case.
             assert since_token is not None
-            state_at_previous_sync = await self.get_state_at(
+
+            state_ids = await self._compute_state_delta_for_incremental_sync(
                 room_id,
-                stream_position=since_token,
-                state_filter=state_filter,
-                await_full_state=await_full_state,
+                batch,
+                since_token,
+                now_token,
+                members_to_fetch,
+                timeline_state,
             )
-
-            if batch:
-                state_at_timeline_end = (
-                    await self._state_storage_controller.get_state_ids_for_event(
-                        batch.events[-1].event_id,
-                        state_filter=state_filter,
-                        await_full_state=await_full_state,
-                    )
-                )
-            else:
-                # We can get here if the user has ignored the senders of all
-                # the recent events.
-                state_at_timeline_end = await self.get_state_at(
-                    room_id,
-                    stream_position=now_token,
-                    state_filter=state_filter,
-                    await_full_state=await_full_state,
-                )
-
-            state_ids = _calculate_state(
-                timeline_contains=timeline_state,
-                timeline_start=state_at_timeline_start,
-                timeline_end=state_at_timeline_end,
-                previous_timeline_end=state_at_previous_sync,
-                # we have to include LL members in case LL initial sync missed them
-                lazy_load_members=lazy_load_members,
-            )
-        else:
-            state_ids = {}
-            if lazy_load_members:
-                if members_to_fetch and batch.events:
-                    # We're returning an incremental sync, with no
-                    # "gap" since the previous sync, so normally there would be
-                    # no state to return.
-                    # But we're lazy-loading, so the client might need some more
-                    # member events to understand the events in this timeline.
-                    # So we fish out all the member events corresponding to the
-                    # timeline here, and then dedupe any redundant ones below.
-                    state_ids = await self._state_storage_controller.get_state_ids_for_event(
-                        batch.events[0].event_id,
-                        # we only want members!
-                        state_filter=StateFilter.from_types(
-                            (EventTypes.Member, member)
-                            for member in members_to_fetch
-                        ),
-                        await_full_state=False,
-                    )

         # If we only have partial state for the room, `state_ids` may be missing the
         # memberships we wanted. We attempt to find some by digging through the auth
         # events of timeline events.
@@ -1244,6 +1124,227 @@ class SyncHandler:
             )
         }

+    async def _compute_state_delta_for_full_sync(
+        self,
+        room_id: str,
+        syncing_user: UserID,
+        batch: TimelineBatch,
+        now_token: StreamToken,
+        members_to_fetch: Optional[Set[str]],
+        timeline_state: StateMap[str],
+    ) -> StateMap[str]:
+        """Calculate the state events to be included in a full sync response.
+
+        As with `_compute_state_delta_for_incremental_sync`, the result will include
+        the membership events for the senders of each event in `members_to_fetch`.
+
+        Args:
+            room_id: The room we are calculating for.
+            syncing_user: The user that is calling `/sync`.
+            batch: The timeline batch for the room that will be sent to the user.
+            now_token: Token of the end of the current batch.
+            members_to_fetch: If lazy-loading is enabled, the memberships needed for
+                events in the timeline.
+            timeline_state: The contribution to the room state from state events in
+                `batch`. Only contains the last event for any given state key.
+
+        Returns:
+            A map from (type, state_key) to event_id, for each event that we believe
+            should be included in the `state` part of the sync response.
+        """
+        if members_to_fetch is not None:
+            # Lazy-loading of membership events is enabled.
+            #
+            # Always make sure we load our own membership event so we know if
+            # we're in the room, to fix https://github.com/vector-im/riot-web/issues/7209.
+            #
+            # We only need apply this on full state syncs given we disabled
+            # LL for incr syncs in https://github.com/matrix-org/synapse/pull/3840.
+            #
+            # We don't insert ourselves into `members_to_fetch`, because in some
+            # rare cases (an empty event batch with a now_token after the user's
+            # leave in a partial state room which another local user has
+            # joined), the room state will be missing our membership and there
+            # is no guarantee that our membership will be in the auth events of
+            # timeline events when the room is partial stated.
+            state_filter = StateFilter.from_lazy_load_member_list(
+                members_to_fetch.union((syncing_user.to_string(),))
+            )
+
+            # We are happy to use partial state to compute the `/sync` response.
+            # Since partial state may not include the lazy-loaded memberships we
+            # require, we fix up the state response afterwards with memberships from
+            # auth events.
+            await_full_state = False
+            lazy_load_members = True
+        else:
+            state_filter = StateFilter.all()
+            await_full_state = True
+            lazy_load_members = False
+
+        if batch:
+            state_at_timeline_end = (
+                await self._state_storage_controller.get_state_ids_for_event(
+                    batch.events[-1].event_id,
+                    state_filter=state_filter,
+                    await_full_state=await_full_state,
+                )
+            )
+
+            state_at_timeline_start = (
+                await self._state_storage_controller.get_state_ids_for_event(
+                    batch.events[0].event_id,
+                    state_filter=state_filter,
+                    await_full_state=await_full_state,
+                )
+            )
+        else:
+            state_at_timeline_end = await self.get_state_at(
+                room_id,
+                stream_position=now_token,
+                state_filter=state_filter,
+                await_full_state=await_full_state,
+            )
+
+            state_at_timeline_start = state_at_timeline_end
+
+        state_ids = _calculate_state(
+            timeline_contains=timeline_state,
+            timeline_start=state_at_timeline_start,
+            timeline_end=state_at_timeline_end,
+            previous_timeline_end={},
+            lazy_load_members=lazy_load_members,
+        )
+        return state_ids
+
+    async def _compute_state_delta_for_incremental_sync(
+        self,
+        room_id: str,
+        batch: TimelineBatch,
+        since_token: StreamToken,
+        now_token: StreamToken,
+        members_to_fetch: Optional[Set[str]],
+        timeline_state: StateMap[str],
+    ) -> StateMap[str]:
+        """Calculate the state events to be included in an incremental sync response.
+
+        If lazy-loading of membership events is enabled (as indicated by
+        `members_to_fetch` being not-`None`), the result will include the membership
+        events for each member in `members_to_fetch`. The caller
+        (`compute_state_delta`) is responsible for keeping track of which membership
+        events we have already sent to the client, and hence ripping them out.
+
+        Args:
+            room_id: The room we are calculating for.
+            batch: The timeline batch for the room that will be sent to the user.
+            since_token: Token of the end of the previous batch.
+            now_token: Token of the end of the current batch.
+            members_to_fetch: If lazy-loading is enabled, the memberships needed for
+                events in the timeline. Otherwise, `None`.
+            timeline_state: The contribution to the room state from state events in
+                `batch`. Only contains the last event for any given state key.
+
+        Returns:
+            A map from (type, state_key) to event_id, for each event that we believe
+            should be included in the `state` part of the sync response.
+        """
+        if members_to_fetch is not None:
+            # Lazy-loading is enabled. Only return the state that is needed.
+            state_filter = StateFilter.from_lazy_load_member_list(members_to_fetch)
+            await_full_state = False
+            lazy_load_members = True
+        else:
+            state_filter = StateFilter.all()
+            await_full_state = True
+            lazy_load_members = False
+
+        if batch.limited:
+            if batch:
+                state_at_timeline_start = (
+                    await self._state_storage_controller.get_state_ids_for_event(
+                        batch.events[0].event_id,
+                        state_filter=state_filter,
+                        await_full_state=await_full_state,
+                    )
+                )
+            else:
+                # We can get here if the user has ignored the senders of all
+                # the recent events.
+                state_at_timeline_start = await self.get_state_at(
+                    room_id,
+                    stream_position=now_token,
+                    state_filter=state_filter,
+                    await_full_state=await_full_state,
+                )
+
+            # for now, we disable LL for gappy syncs - see
+            # https://github.com/vector-im/riot-web/issues/7211#issuecomment-419976346
+            # N.B. this slows down incr syncs as we are now processing way
+            # more state in the server than if we were LLing.
+            #
+            # We still have to filter timeline_start to LL entries (above) in order
+            # for _calculate_state's LL logic to work, as we have to include LL
+            # members for timeline senders in case they weren't loaded in the initial
+            # sync. We do this by (counterintuitively) by filtering timeline_start
+            # members to just be ones which were timeline senders, which then ensures
+            # all of the rest get included in the state block (if we need to know
+            # about them).
+            state_filter = StateFilter.all()
+
+            state_at_previous_sync = await self.get_state_at(
+                room_id,
+                stream_position=since_token,
+                state_filter=state_filter,
+                await_full_state=await_full_state,
+            )
+
+            if batch:
+                state_at_timeline_end = (
+                    await self._state_storage_controller.get_state_ids_for_event(
+                        batch.events[-1].event_id,
+                        state_filter=state_filter,
+                        await_full_state=await_full_state,
+                    )
+                )
+            else:
+                # We can get here if the user has ignored the senders of all
+                # the recent events.
+                state_at_timeline_end = await self.get_state_at(
+                    room_id,
+                    stream_position=now_token,
+                    state_filter=state_filter,
+                    await_full_state=await_full_state,
+                )
+
+            state_ids = _calculate_state(
+                timeline_contains=timeline_state,
+                timeline_start=state_at_timeline_start,
+                timeline_end=state_at_timeline_end,
+                previous_timeline_end=state_at_previous_sync,
+                lazy_load_members=lazy_load_members,
+            )
+        else:
+            state_ids = {}
+            if lazy_load_members:
+                if members_to_fetch and batch.events:
+                    # We're returning an incremental sync, with no
+                    # "gap" since the previous sync, so normally there would be
+                    # no state to return.
+                    # But we're lazy-loading, so the client might need some more
+                    # member events to understand the events in this timeline.
+                    # So we fish out all the member events corresponding to the
+                    # timeline here. The caller will then dedupe any redundant ones.
+                    state_ids = await self._state_storage_controller.get_state_ids_for_event(
+                        batch.events[0].event_id,
+                        # we only want members!
+                        state_filter=StateFilter.from_types(
+                            (EventTypes.Member, member) for member in members_to_fetch
+                        ),
+                        await_full_state=False,
+                    )
+        return state_ids
+
     async def _find_missing_partial_state_memberships(
         self,
         room_id: str,
@@ -1332,9 +1433,9 @@ class SyncHandler:
                         and auth_event.state_key == member
                     ):
                         missing_members.discard(member)
-                        additional_state_ids[
-                            (EventTypes.Member, member)
-                        ] = auth_event.event_id
+                        additional_state_ids[(EventTypes.Member, member)] = (
+                            auth_event.event_id
+                        )
                         break

         if missing_members:
@@ -2745,7 +2846,7 @@ class SyncResultBuilder:
         if self.since_token:
             for joined_sync in self.joined:
                 it = itertools.chain(
-                    joined_sync.timeline.events, joined_sync.state.values()
+                    joined_sync.state.values(), joined_sync.timeline.events
                 )
                 for event in it:
                     if event.type == EventTypes.Member:
@@ -2757,13 +2858,20 @@ class SyncResultBuilder:
                         newly_joined_or_invited_or_knocked_users.add(
                             event.state_key
                         )
+                        # If the user left and rejoined in the same batch, they
+                        # count as a newly-joined user, *not* a newly-left user.
+                        newly_left_users.discard(event.state_key)
                     else:
                         prev_content = event.unsigned.get("prev_content", {})
                         prev_membership = prev_content.get("membership", None)
                         if prev_membership == Membership.JOIN:
                             newly_left_users.add(event.state_key)
+                            # If the user joined and left in the same batch, they
+                            # count as a newly-left user, not a newly-joined user.
+                            newly_joined_or_invited_or_knocked_users.discard(
+                                event.state_key
+                            )

-        newly_left_users -= newly_joined_or_invited_or_knocked_users
         return newly_joined_or_invited_or_knocked_users, newly_left_users
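
A toy run of the corrected bookkeeping (user IDs are made up): because state events are now processed before timeline events, a leave-then-rejoin within one batch ends up only in the newly-joined set, and a join-then-leave only in the newly-left set.

    newly_joined = set()
    newly_left = set()

    # (user, new membership, previous membership), in batch order.
    events = [
        ("@a:example.org", "join", "leave"),   # @a rejoined after leaving
        ("@b:example.org", "leave", "join"),   # @b left after joining
    ]

    for user, membership, prev in events:
        if membership == "join":
            newly_joined.add(user)
            newly_left.discard(user)       # a rejoin cancels an earlier leave
        elif prev == "join":
            newly_left.add(user)
            newly_joined.discard(user)     # a leave cancels an earlier join

    assert newly_joined == {"@a:example.org"}
    assert newly_left == {"@b:example.org"}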

View file

@@ -182,12 +182,15 @@ class WorkerLocksHandler:
         if not locks:
             return

-        def _wake_deferred(deferred: defer.Deferred) -> None:
-            if not deferred.called:
-                deferred.callback(None)
-
-        for lock in locks:
-            self._clock.call_later(0, _wake_deferred, lock.deferred)
+        def _wake_all_locks(
+            locks: Collection[Union[WaitingLock, WaitingMultiLock]]
+        ) -> None:
+            for lock in locks:
+                deferred = lock.deferred
+                if not deferred.called:
+                    deferred.callback(None)
+
+        self._clock.call_later(0, _wake_all_locks, locks)

     @wrap_as_background_process("_cleanup_locks")
     async def _cleanup_locks(self) -> None:
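
The gain comes from scheduling one callback that wakes every waiter instead of one callback per waiter. A minimal sketch with plain Twisted deferreds (the scheduling itself is elided):

    from twisted.internet import defer

    waiters = [defer.Deferred() for _ in range(3)]

    def wake_all_locks(deferreds) -> None:
        # A single scheduled call resolves every waiter that hasn't fired yet.
        for d in deferreds:
            if not d.called:
                d.callback(None)

    wake_all_locks(waiters)
    assert all(d.called for d in waiters)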

View file

@@ -390,6 +390,13 @@ class BaseHttpClient:
                 cooperator=self._cooperator,
             )

+        # Always make sure we add a user agent to the request
+        if headers is None:
+            headers = Headers()
+        if not headers.hasHeader("User-Agent"):
+            headers.addRawHeader("User-Agent", self.user_agent)
+
         request_deferred: defer.Deferred = treq.request(
             method,
             uri,
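
For reference, the twisted.web.http_headers.Headers calls used above behave as follows (the user-agent string is illustrative):

    from twisted.web.http_headers import Headers

    headers = Headers()
    if not headers.hasHeader("User-Agent"):
        headers.addRawHeader("User-Agent", "Synapse/1.104")

    # Raw headers come back as a list of values for the header name.
    assert headers.getRawHeaders("User-Agent") == ["Synapse/1.104"]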

View file

@@ -931,8 +931,7 @@ class MatrixFederationHttpClient:
         try_trailing_slash_on_400: bool = False,
         parser: Literal[None] = None,
         backoff_on_all_error_codes: bool = False,
-    ) -> JsonDict:
-        ...
+    ) -> JsonDict: ...

     @overload
     async def put_json(
@@ -949,8 +948,7 @@ class MatrixFederationHttpClient:
         try_trailing_slash_on_400: bool = False,
         parser: Optional[ByteParser[T]] = None,
         backoff_on_all_error_codes: bool = False,
-    ) -> T:
-        ...
+    ) -> T: ...

     async def put_json(
         self,
@@ -1140,8 +1138,7 @@ class MatrixFederationHttpClient:
         ignore_backoff: bool = False,
         try_trailing_slash_on_400: bool = False,
         parser: Literal[None] = None,
-    ) -> JsonDict:
-        ...
+    ) -> JsonDict: ...

     @overload
     async def get_json(
@@ -1154,8 +1151,7 @@ class MatrixFederationHttpClient:
         ignore_backoff: bool = ...,
         try_trailing_slash_on_400: bool = ...,
         parser: ByteParser[T] = ...,
-    ) -> T:
-        ...
+    ) -> T: ...

     async def get_json(
         self,
@@ -1236,8 +1232,7 @@ class MatrixFederationHttpClient:
         ignore_backoff: bool = False,
         try_trailing_slash_on_400: bool = False,
         parser: Literal[None] = None,
-    ) -> Tuple[JsonDict, Dict[bytes, List[bytes]]]:
-        ...
+    ) -> Tuple[JsonDict, Dict[bytes, List[bytes]]]: ...

     @overload
     async def get_json_with_headers(
@@ -1250,8 +1245,7 @@ class MatrixFederationHttpClient:
         ignore_backoff: bool = ...,
         try_trailing_slash_on_400: bool = ...,
         parser: ByteParser[T] = ...,
-    ) -> Tuple[T, Dict[bytes, List[bytes]]]:
-        ...
+    ) -> Tuple[T, Dict[bytes, List[bytes]]]: ...

     async def get_json_with_headers(
         self,

View file

@@ -61,20 +61,17 @@ logger = logging.getLogger(__name__)

 @overload
-def parse_integer(request: Request, name: str, default: int) -> int:
-    ...
+def parse_integer(request: Request, name: str, default: int) -> int: ...


 @overload
-def parse_integer(request: Request, name: str, *, required: Literal[True]) -> int:
-    ...
+def parse_integer(request: Request, name: str, *, required: Literal[True]) -> int: ...


 @overload
 def parse_integer(
     request: Request, name: str, default: Optional[int] = None, required: bool = False
-) -> Optional[int]:
-    ...
+) -> Optional[int]: ...


 def parse_integer(
@@ -105,8 +102,7 @@ def parse_integer_from_args(
     args: Mapping[bytes, Sequence[bytes]],
     name: str,
     default: Optional[int] = None,
-) -> Optional[int]:
-    ...
+) -> Optional[int]: ...


 @overload
@@ -115,8 +111,7 @@ def parse_integer_from_args(
     name: str,
     *,
     required: Literal[True],
-) -> int:
-    ...
+) -> int: ...


 @overload
@@ -125,8 +120,7 @@ def parse_integer_from_args(
     name: str,
     default: Optional[int] = None,
     required: bool = False,
-) -> Optional[int]:
-    ...
+) -> Optional[int]: ...


 def parse_integer_from_args(
@@ -172,20 +166,17 @@ def parse_integer_from_args(

 @overload
-def parse_boolean(request: Request, name: str, default: bool) -> bool:
-    ...
+def parse_boolean(request: Request, name: str, default: bool) -> bool: ...


 @overload
-def parse_boolean(request: Request, name: str, *, required: Literal[True]) -> bool:
-    ...
+def parse_boolean(request: Request, name: str, *, required: Literal[True]) -> bool: ...


 @overload
 def parse_boolean(
     request: Request, name: str, default: Optional[bool] = None, required: bool = False
-) -> Optional[bool]:
-    ...
+) -> Optional[bool]: ...


 def parse_boolean(
@@ -216,8 +207,7 @@ def parse_boolean_from_args(
     args: Mapping[bytes, Sequence[bytes]],
     name: str,
     default: bool,
-) -> bool:
-    ...
+) -> bool: ...


 @overload
@@ -226,8 +216,7 @@ def parse_boolean_from_args(
     name: str,
     *,
     required: Literal[True],
-) -> bool:
-    ...
+) -> bool: ...


 @overload
@@ -236,8 +225,7 @@ def parse_boolean_from_args(
     name: str,
     default: Optional[bool] = None,
     required: bool = False,
-) -> Optional[bool]:
-    ...
+) -> Optional[bool]: ...


 def parse_boolean_from_args(
@@ -289,8 +277,7 @@ def parse_bytes_from_args(
     args: Mapping[bytes, Sequence[bytes]],
     name: str,
     default: Optional[bytes] = None,
-) -> Optional[bytes]:
-    ...
+) -> Optional[bytes]: ...


 @overload
@@ -300,8 +287,7 @@ def parse_bytes_from_args(
     default: Literal[None] = None,
     *,
     required: Literal[True],
-) -> bytes:
-    ...
+) -> bytes: ...


 @overload
@@ -310,8 +296,7 @@ def parse_bytes_from_args(
     name: str,
     default: Optional[bytes] = None,
     required: bool = False,
-) -> Optional[bytes]:
-    ...
+) -> Optional[bytes]: ...


 def parse_bytes_from_args(
@@ -355,8 +340,7 @@ def parse_string(
     *,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> str:
-    ...
+) -> str: ...


 @overload
@@ -367,8 +351,7 @@ def parse_string(
     required: Literal[True],
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> str:
-    ...
+) -> str: ...


 @overload
@@ -380,8 +363,7 @@ def parse_string(
     required: bool = False,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> Optional[str]:
-    ...
+) -> Optional[str]: ...


 def parse_string(
@@ -437,8 +419,7 @@ def parse_enum(
     name: str,
     E: Type[EnumT],
     default: EnumT,
-) -> EnumT:
-    ...
+) -> EnumT: ...


 @overload
@@ -448,8 +429,7 @@ def parse_enum(
     E: Type[EnumT],
     *,
     required: Literal[True],
-) -> EnumT:
-    ...
+) -> EnumT: ...


 def parse_enum(
@@ -526,8 +506,7 @@ def parse_strings_from_args(
     *,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> Optional[List[str]]:
-    ...
+) -> Optional[List[str]]: ...


 @overload
@@ -538,8 +517,7 @@ def parse_strings_from_args(
     *,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> List[str]:
-    ...
+) -> List[str]: ...


 @overload
@@ -550,8 +528,7 @@ def parse_strings_from_args(
     required: Literal[True],
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> List[str]:
-    ...
+) -> List[str]: ...


 @overload
@@ -563,8 +540,7 @@ def parse_strings_from_args(
     required: bool = False,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> Optional[List[str]]:
-    ...
+) -> Optional[List[str]]: ...


 def parse_strings_from_args(
@@ -625,8 +601,7 @@ def parse_string_from_args(
     *,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> Optional[str]:
-    ...
+) -> Optional[str]: ...


 @overload
@@ -638,8 +613,7 @@ def parse_string_from_args(
     required: Literal[True],
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> str:
-    ...
+) -> str: ...


 @overload
@@ -650,8 +624,7 @@ def parse_string_from_args(
     required: bool = False,
     allowed_values: Optional[StrCollection] = None,
     encoding: str = "ascii",
-) -> Optional[str]:
-    ...
+) -> Optional[str]: ...


 def parse_string_from_args(
@@ -704,22 +677,19 @@ def parse_string_from_args(

 @overload
-def parse_json_value_from_request(request: Request) -> JsonDict:
-    ...
+def parse_json_value_from_request(request: Request) -> JsonDict: ...


 @overload
 def parse_json_value_from_request(
     request: Request, allow_empty_body: Literal[False]
-) -> JsonDict:
-    ...
+) -> JsonDict: ...


 @overload
 def parse_json_value_from_request(
     request: Request, allow_empty_body: bool = False
-) -> Optional[JsonDict]:
-    ...
+) -> Optional[JsonDict]: ...


 def parse_json_value_from_request(
@@ -847,7 +817,6 @@ def assert_params_in_dict(body: JsonDict, required: StrCollection) -> None:
 class RestServlet:
-
     """A Synapse REST Servlet.

     An implementing class can either provide its own custom 'register' method,

View file

@@ -744,8 +744,7 @@ def preserve_fn(

 @overload
-def preserve_fn(f: Callable[P, R]) -> Callable[P, "defer.Deferred[R]"]:
-    ...
+def preserve_fn(f: Callable[P, R]) -> Callable[P, "defer.Deferred[R]"]: ...


 def preserve_fn(
@@ -774,15 +773,10 @@ def run_in_background(
 @overload
 def run_in_background(
     f: Callable[P, R], *args: P.args, **kwargs: P.kwargs
-) -> "defer.Deferred[R]":
-    ...
+) -> "defer.Deferred[R]": ...


-def run_in_background(  # type: ignore[misc]
-    # The `type: ignore[misc]` above suppresses
-    # "Overloaded function implementation does not accept all possible arguments of signature 1"
-    # "Overloaded function implementation does not accept all possible arguments of signature 2"
-    # which seems like a bug in mypy.
+def run_in_background(
     f: Union[
         Callable[P, R],
         Callable[P, Awaitable[R]],
View file
@ -388,15 +388,13 @@ def only_if_tracing(func: Callable[P, R]) -> Callable[P, Optional[R]]:
@overload @overload
def ensure_active_span( def ensure_active_span(
message: str, message: str,
) -> Callable[[Callable[P, R]], Callable[P, Optional[R]]]: ) -> Callable[[Callable[P, R]], Callable[P, Optional[R]]]: ...
...
@overload @overload
def ensure_active_span( def ensure_active_span(
message: str, ret: T message: str, ret: T
) -> Callable[[Callable[P, R]], Callable[P, Union[T, R]]]: ) -> Callable[[Callable[P, R]], Callable[P, Union[T, R]]]: ...
...
def ensure_active_span( def ensure_active_span(
View file
@ -1002,9 +1002,9 @@ class MediaRepository:
) )
t_width = min(m_width, t_width) t_width = min(m_width, t_width)
t_height = min(m_height, t_height) t_height = min(m_height, t_height)
thumbnails[ thumbnails[(t_width, t_height, requirement.media_type)] = (
(t_width, t_height, requirement.media_type) requirement.method
] = requirement.method )
# Now we generate the thumbnails for each dimension, store it # Now we generate the thumbnails for each dimension, store it
for (t_width, t_height, t_type), t_method in thumbnails.items(): for (t_width, t_height, t_type), t_method in thumbnails.items():
View file
@ -42,14 +42,12 @@ class JemallocStats:
@overload @overload
def _mallctl( def _mallctl(
self, name: str, read: Literal[True] = True, write: Optional[int] = None self, name: str, read: Literal[True] = True, write: Optional[int] = None
) -> int: ) -> int: ...
...
@overload @overload
def _mallctl( def _mallctl(
self, name: str, read: Literal[False], write: Optional[int] = None self, name: str, read: Literal[False], write: Optional[int] = None
) -> None: ) -> None: ...
...
def _mallctl( def _mallctl(
self, name: str, read: bool = True, write: Optional[int] = None self, name: str, read: bool = True, write: Optional[int] = None
View file
@ -455,7 +455,7 @@ class SpamCheckerModuleApiCallbacks:
# mypy complains that we can't reach this code because of the # mypy complains that we can't reach this code because of the
# return type in CHECK_EVENT_FOR_SPAM_CALLBACK, but we don't know # return type in CHECK_EVENT_FOR_SPAM_CALLBACK, but we don't know
# for sure that the module actually returns it. # for sure that the module actually returns it.
logger.warning( logger.warning( # type: ignore[unreachable]
"Module returned invalid value, rejecting message as spam" "Module returned invalid value, rejecting message as spam"
) )
res = "This message has been rejected as probable spam" res = "This message has been rejected as probable spam"
View file
@ -469,8 +469,7 @@ class Notifier:
new_token: RoomStreamToken, new_token: RoomStreamToken,
users: Optional[Collection[Union[str, UserID]]] = None, users: Optional[Collection[Union[str, UserID]]] = None,
rooms: Optional[StrCollection] = None, rooms: Optional[StrCollection] = None,
) -> None: ) -> None: ...
...
@overload @overload
def on_new_event( def on_new_event(
@ -479,8 +478,7 @@ class Notifier:
new_token: MultiWriterStreamToken, new_token: MultiWriterStreamToken,
users: Optional[Collection[Union[str, UserID]]] = None, users: Optional[Collection[Union[str, UserID]]] = None,
rooms: Optional[StrCollection] = None, rooms: Optional[StrCollection] = None,
) -> None: ) -> None: ...
...
@overload @overload
def on_new_event( def on_new_event(
@ -497,8 +495,7 @@ class Notifier:
new_token: int, new_token: int,
users: Optional[Collection[Union[str, UserID]]] = None, users: Optional[Collection[Union[str, UserID]]] = None,
rooms: Optional[StrCollection] = None, rooms: Optional[StrCollection] = None,
) -> None: ) -> None: ...
...
def on_new_event( def on_new_event(
self, self,
View file
@ -377,12 +377,14 @@ class Mailer:
# #
# Note that many email clients will not render the unsubscribe link # Note that many email clients will not render the unsubscribe link
# unless DKIM, etc. is properly set up. # unless DKIM, etc. is properly set up.
additional_headers={ additional_headers=(
"List-Unsubscribe-Post": "List-Unsubscribe=One-Click", {
"List-Unsubscribe": f"<{unsubscribe_link}>", "List-Unsubscribe-Post": "List-Unsubscribe=One-Click",
} "List-Unsubscribe": f"<{unsubscribe_link}>",
if unsubscribe_link }
else None, if unsubscribe_link
else None
),
) )
async def _get_room_vars( async def _get_room_vars(
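For context, the headers being reflowed here implement RFC 8058 one-click unsubscribe, and are only attached when a link exists. The same conditional construction as a standalone sketch (the helper name is an assumption, not Synapse code):

    from typing import Dict, Optional

    def unsubscribe_headers(unsubscribe_link: Optional[str]) -> Optional[Dict[str, str]]:
        # No unsubscribe link available: omit the headers entirely.
        if unsubscribe_link is None:
            return None
        return {
            "List-Unsubscribe-Post": "List-Unsubscribe=One-Click",
            "List-Unsubscribe": f"<{unsubscribe_link}>",
        }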
View file
@ -259,9 +259,9 @@ class ReplicationEndpoint(metaclass=abc.ABCMeta):
url_args.append(txn_id) url_args.append(txn_id)
if cls.METHOD == "POST": if cls.METHOD == "POST":
request_func: Callable[ request_func: Callable[..., Awaitable[Any]] = (
..., Awaitable[Any] client.post_json_get_json
] = client.post_json_get_json )
elif cls.METHOD == "PUT": elif cls.METHOD == "PUT":
request_func = client.put_json request_func = client.put_json
elif cls.METHOD == "GET": elif cls.METHOD == "GET":
View file
@ -70,9 +70,9 @@ class ExternalCache:
def __init__(self, hs: "HomeServer"): def __init__(self, hs: "HomeServer"):
if hs.config.redis.redis_enabled: if hs.config.redis.redis_enabled:
self._redis_connection: Optional[ self._redis_connection: Optional["ConnectionHandler"] = (
"ConnectionHandler" hs.get_outbound_redis_connection()
] = hs.get_outbound_redis_connection() )
else: else:
self._redis_connection = None self._redis_connection = None
View file
@ -237,10 +237,12 @@ class PurgeHistoryStatusRestServlet(RestServlet):
raise NotFoundError("purge id '%s' not found" % purge_id) raise NotFoundError("purge id '%s' not found" % purge_id)
result: JsonDict = { result: JsonDict = {
"status": purge_task.status "status": (
if purge_task.status == TaskStatus.COMPLETE purge_task.status
or purge_task.status == TaskStatus.FAILED if purge_task.status == TaskStatus.COMPLETE
else "active", or purge_task.status == TaskStatus.FAILED
else "active"
),
} }
if purge_task.error: if purge_task.error:
result["error"] = purge_task.error result["error"] = purge_task.error
View file
@ -1184,12 +1184,14 @@ class RateLimitRestServlet(RestServlet):
# convert `null` to `0` for consistency # convert `null` to `0` for consistency
# both values do the same in ratelimit handler # both values do the same in ratelimit handler
ret = { ret = {
"messages_per_second": 0 "messages_per_second": (
if ratelimit.messages_per_second is None 0
else ratelimit.messages_per_second, if ratelimit.messages_per_second is None
"burst_count": 0 else ratelimit.messages_per_second
if ratelimit.burst_count is None ),
else ratelimit.burst_count, "burst_count": (
0 if ratelimit.burst_count is None else ratelimit.burst_count
),
} }
else: else:
ret = {} ret = {}
View file
@ -112,9 +112,9 @@ class AccountDataServlet(RestServlet):
self._hs.config.experimental.msc4010_push_rules_account_data self._hs.config.experimental.msc4010_push_rules_account_data
and account_data_type == AccountDataTypes.PUSH_RULES and account_data_type == AccountDataTypes.PUSH_RULES
): ):
account_data: Optional[ account_data: Optional[JsonMapping] = (
JsonMapping await self._push_rules_handler.push_rules_for_user(requester.user)
] = await self._push_rules_handler.push_rules_for_user(requester.user) )
else: else:
account_data = await self.store.get_global_account_data_by_type_for_user( account_data = await self.store.get_global_account_data_by_type_for_user(
user_id, account_data_type user_id, account_data_type
View file
@ -313,12 +313,12 @@ class SyncRestServlet(RestServlet):
# https://github.com/matrix-org/matrix-doc/blob/54255851f642f84a4f1aaf7bc063eebe3d76752b/proposals/2732-olm-fallback-keys.md # https://github.com/matrix-org/matrix-doc/blob/54255851f642f84a4f1aaf7bc063eebe3d76752b/proposals/2732-olm-fallback-keys.md
# states that this field should always be included, as long as the server supports the feature. # states that this field should always be included, as long as the server supports the feature.
response[ response["org.matrix.msc2732.device_unused_fallback_key_types"] = (
"org.matrix.msc2732.device_unused_fallback_key_types" sync_result.device_unused_fallback_key_types
] = sync_result.device_unused_fallback_key_types )
response[ response["device_unused_fallback_key_types"] = (
"device_unused_fallback_key_types" sync_result.device_unused_fallback_key_types
] = sync_result.device_unused_fallback_key_types )
if joined: if joined:
response["rooms"][Membership.JOIN] = joined response["rooms"][Membership.JOIN] = joined
@ -543,9 +543,9 @@ class SyncRestServlet(RestServlet):
if room.unread_thread_notifications: if room.unread_thread_notifications:
result["unread_thread_notifications"] = room.unread_thread_notifications result["unread_thread_notifications"] = room.unread_thread_notifications
if self._msc3773_enabled: if self._msc3773_enabled:
result[ result["org.matrix.msc3773.unread_thread_notifications"] = (
"org.matrix.msc3773.unread_thread_notifications" room.unread_thread_notifications
] = room.unread_thread_notifications )
result["summary"] = room.summary result["summary"] = room.summary
if self._msc2654_enabled: if self._msc2654_enabled:
result["org.matrix.msc2654.unread_count"] = room.unread_count result["org.matrix.msc2654.unread_count"] = room.unread_count
View file
@ -191,10 +191,10 @@ class RemoteKey(RestServlet):
server_keys: Dict[Tuple[str, str], Optional[FetchKeyResultForRemote]] = {} server_keys: Dict[Tuple[str, str], Optional[FetchKeyResultForRemote]] = {}
for server_name, key_ids in query.items(): for server_name, key_ids in query.items():
if key_ids: if key_ids:
results: Mapping[ results: Mapping[str, Optional[FetchKeyResultForRemote]] = (
str, Optional[FetchKeyResultForRemote] await self.store.get_server_keys_json_for_remote(
] = await self.store.get_server_keys_json_for_remote( server_name, key_ids
server_name, key_ids )
) )
else: else:
results = await self.store.get_all_server_keys_json_for_remote( results = await self.store.get_all_server_keys_json_for_remote(
View file
@ -603,15 +603,15 @@ class StateResolutionHandler:
self.resolve_linearizer = Linearizer(name="state_resolve_lock") self.resolve_linearizer = Linearizer(name="state_resolve_lock")
# dict of set of event_ids -> _StateCacheEntry. # dict of set of event_ids -> _StateCacheEntry.
self._state_cache: ExpiringCache[ self._state_cache: ExpiringCache[FrozenSet[int], _StateCacheEntry] = (
FrozenSet[int], _StateCacheEntry ExpiringCache(
] = ExpiringCache( cache_name="state_cache",
cache_name="state_cache", clock=self.clock,
clock=self.clock, max_len=100000,
max_len=100000, expiry_ms=EVICTION_TIMEOUT_SECONDS * 1000,
expiry_ms=EVICTION_TIMEOUT_SECONDS * 1000, iterable=True,
iterable=True, reset_expiry_on_get=True,
reset_expiry_on_get=True, )
) )
# #
View file
@ -52,8 +52,7 @@ class Clock(Protocol):
# This is usually synapse.util.Clock, but it's replaced with a FakeClock in tests. # This is usually synapse.util.Clock, but it's replaced with a FakeClock in tests.
# We only ever sleep(0) though, so that other async functions can make forward # We only ever sleep(0) though, so that other async functions can make forward
# progress without waiting for stateres to complete. # progress without waiting for stateres to complete.
def sleep(self, duration_ms: float) -> Awaitable[None]: def sleep(self, duration_ms: float) -> Awaitable[None]: ...
...
class StateResolutionStore(Protocol): class StateResolutionStore(Protocol):
@ -61,13 +60,11 @@ class StateResolutionStore(Protocol):
# TestStateResolutionStore in tests. # TestStateResolutionStore in tests.
def get_events( def get_events(
self, event_ids: StrCollection, allow_rejected: bool = False self, event_ids: StrCollection, allow_rejected: bool = False
) -> Awaitable[Dict[str, EventBase]]: ) -> Awaitable[Dict[str, EventBase]]: ...
...
def get_auth_chain_difference( def get_auth_chain_difference(
self, room_id: str, state_sets: List[Set[str]] self, room_id: str, state_sets: List[Set[str]]
) -> Awaitable[Set[str]]: ) -> Awaitable[Set[str]]: ...
...
# We want to await to the reactor occasionally during state res when dealing # We want to await to the reactor occasionally during state res when dealing
@ -742,8 +739,7 @@ async def _get_event(
event_map: Dict[str, EventBase], event_map: Dict[str, EventBase],
state_res_store: StateResolutionStore, state_res_store: StateResolutionStore,
allow_none: Literal[False] = False, allow_none: Literal[False] = False,
) -> EventBase: ) -> EventBase: ...
...
@overload @overload
@ -753,8 +749,7 @@ async def _get_event(
event_map: Dict[str, EventBase], event_map: Dict[str, EventBase],
state_res_store: StateResolutionStore, state_res_store: StateResolutionStore,
allow_none: Literal[True], allow_none: Literal[True],
) -> Optional[EventBase]: ) -> Optional[EventBase]: ...
...
async def _get_event( async def _get_event(
View file
@ -836,9 +836,9 @@ class BackgroundUpdater:
c.execute(sql) c.execute(sql)
if isinstance(self.db_pool.engine, engines.PostgresEngine): if isinstance(self.db_pool.engine, engines.PostgresEngine):
runner: Optional[ runner: Optional[Callable[[LoggingDatabaseConnection], None]] = (
Callable[[LoggingDatabaseConnection], None] create_index_psql
] = create_index_psql )
elif psql_only: elif psql_only:
runner = None runner = None
else: else:
View file
@ -773,9 +773,9 @@ class EventsPersistenceStorageController:
) )
# Remove any events which are prev_events of any existing events. # Remove any events which are prev_events of any existing events.
existing_prevs: Collection[ existing_prevs: Collection[str] = (
str await self.persist_events_store._get_events_which_are_prevs(result)
] = await self.persist_events_store._get_events_which_are_prevs(result) )
result.difference_update(existing_prevs) result.difference_update(existing_prevs)
# Finally handle the case where the new events have soft-failed prev # Finally handle the case where the new events have soft-failed prev
View file
@ -273,8 +273,10 @@ class StateStorageController:
await_full_state: bool = True, await_full_state: bool = True,
) -> Dict[str, StateMap[str]]: ) -> Dict[str, StateMap[str]]:
""" """
Get the state dicts corresponding to a list of events, containing the event_ids Get the room states after each of a list of events.
of the state events (as opposed to the events themselves)
For each event in `event_ids`, the result contains a map from state tuple
to the event_ids of the state event (as opposed to the events themselves).
Args: Args:
event_ids: events whose state should be returned event_ids: events whose state should be returned
@ -347,7 +349,7 @@ class StateStorageController:
await_full_state: bool = True, await_full_state: bool = True,
) -> StateMap[str]: ) -> StateMap[str]:
""" """
Get the state dict corresponding to a particular event Get the state dict corresponding to the state after a particular event
Args: Args:
event_id: event whose state should be returned event_id: event whose state should be returned
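The clarified docstrings pin down that these methods describe the state *after* each event, returned as event IDs rather than event objects. An illustrative, made-up result shape for the multi-event variant:

    # {event_id: {(event_type, state_key): state_event_id}}
    state_after = {
        "$event_a": {
            ("m.room.member", "@alice:example.org"): "$alice_join",
            ("m.room.name", ""): "$room_name",
        },
    }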
View file
@ -111,8 +111,7 @@ class _PoolConnection(Connection):
A Connection from twisted.enterprise.adbapi.Connection. A Connection from twisted.enterprise.adbapi.Connection.
""" """
def reconnect(self) -> None: def reconnect(self) -> None: ...
...
def make_pool( def make_pool(
@ -914,9 +913,9 @@ class DatabasePool:
try: try:
with opentracing.start_active_span(f"db.{desc}"): with opentracing.start_active_span(f"db.{desc}"):
result = await self.runWithConnection( result: R = await self.runWithConnection(
# mypy seems to have an issue with this, maybe a bug? # mypy seems to have an issue with this, maybe a bug?
self.new_transaction, # type: ignore[arg-type] self.new_transaction,
desc, desc,
after_callbacks, after_callbacks,
async_after_callbacks, async_after_callbacks,
@ -935,7 +934,7 @@ class DatabasePool:
await async_callback(*async_args, **async_kwargs) await async_callback(*async_args, **async_kwargs)
for after_callback, after_args, after_kwargs in after_callbacks: for after_callback, after_args, after_kwargs in after_callbacks:
after_callback(*after_args, **after_kwargs) after_callback(*after_args, **after_kwargs)
return cast(R, result) return result
except Exception: except Exception:
for exception_callback, after_args, after_kwargs in exception_callbacks: for exception_callback, after_args, after_kwargs in exception_callbacks:
exception_callback(*after_args, **after_kwargs) exception_callback(*after_args, **after_kwargs)
@ -1603,8 +1602,7 @@ class DatabasePool:
retcols: Collection[str], retcols: Collection[str],
allow_none: Literal[False] = False, allow_none: Literal[False] = False,
desc: str = "simple_select_one", desc: str = "simple_select_one",
) -> Tuple[Any, ...]: ) -> Tuple[Any, ...]: ...
...
@overload @overload
async def simple_select_one( async def simple_select_one(
@ -1614,8 +1612,7 @@ class DatabasePool:
retcols: Collection[str], retcols: Collection[str],
allow_none: Literal[True] = True, allow_none: Literal[True] = True,
desc: str = "simple_select_one", desc: str = "simple_select_one",
) -> Optional[Tuple[Any, ...]]: ) -> Optional[Tuple[Any, ...]]: ...
...
async def simple_select_one( async def simple_select_one(
self, self,
@ -1654,8 +1651,7 @@ class DatabasePool:
retcol: str, retcol: str,
allow_none: Literal[False] = False, allow_none: Literal[False] = False,
desc: str = "simple_select_one_onecol", desc: str = "simple_select_one_onecol",
) -> Any: ) -> Any: ...
...
@overload @overload
async def simple_select_one_onecol( async def simple_select_one_onecol(
@ -1665,8 +1661,7 @@ class DatabasePool:
retcol: str, retcol: str,
allow_none: Literal[True] = True, allow_none: Literal[True] = True,
desc: str = "simple_select_one_onecol", desc: str = "simple_select_one_onecol",
) -> Optional[Any]: ) -> Optional[Any]: ...
...
async def simple_select_one_onecol( async def simple_select_one_onecol(
self, self,
@ -1706,8 +1701,7 @@ class DatabasePool:
keyvalues: Dict[str, Any], keyvalues: Dict[str, Any],
retcol: str, retcol: str,
allow_none: Literal[False] = False, allow_none: Literal[False] = False,
) -> Any: ) -> Any: ...
...
@overload @overload
@classmethod @classmethod
@ -1718,8 +1712,7 @@ class DatabasePool:
keyvalues: Dict[str, Any], keyvalues: Dict[str, Any],
retcol: str, retcol: str,
allow_none: Literal[True] = True, allow_none: Literal[True] = True,
) -> Optional[Any]: ) -> Optional[Any]: ...
...
@classmethod @classmethod
def simple_select_one_onecol_txn( def simple_select_one_onecol_txn(
@ -2501,8 +2494,7 @@ def make_tuple_in_list_sql_clause(
database_engine: BaseDatabaseEngine, database_engine: BaseDatabaseEngine,
columns: Tuple[str, str], columns: Tuple[str, str],
iterable: Collection[Tuple[Any, Any]], iterable: Collection[Tuple[Any, Any]],
) -> Tuple[str, list]: ) -> Tuple[str, list]: ...
...
def make_tuple_in_list_sql_clause( def make_tuple_in_list_sql_clause(
View file
@ -1701,9 +1701,9 @@ class DeviceStore(DeviceWorkerStore, DeviceBackgroundUpdateStore):
# Map of (user_id, device_id) -> bool. If there is an entry that implies # Map of (user_id, device_id) -> bool. If there is an entry that implies
# the device exists. # the device exists.
self.device_id_exists_cache: LruCache[ self.device_id_exists_cache: LruCache[Tuple[str, str], Literal[True]] = (
Tuple[str, str], Literal[True] LruCache(cache_name="device_id_exists", max_size=10000)
] = LruCache(cache_name="device_id_exists", max_size=10000) )
async def store_device( async def store_device(
self, self,
View file
@ -256,8 +256,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
self, self,
query_list: Collection[Tuple[str, Optional[str]]], query_list: Collection[Tuple[str, Optional[str]]],
include_all_devices: Literal[False] = False, include_all_devices: Literal[False] = False,
) -> Dict[str, Dict[str, DeviceKeyLookupResult]]: ) -> Dict[str, Dict[str, DeviceKeyLookupResult]]: ...
...
@overload @overload
async def get_e2e_device_keys_and_signatures( async def get_e2e_device_keys_and_signatures(
@ -265,8 +264,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
query_list: Collection[Tuple[str, Optional[str]]], query_list: Collection[Tuple[str, Optional[str]]],
include_all_devices: bool = False, include_all_devices: bool = False,
include_deleted_devices: Literal[False] = False, include_deleted_devices: Literal[False] = False,
) -> Dict[str, Dict[str, DeviceKeyLookupResult]]: ) -> Dict[str, Dict[str, DeviceKeyLookupResult]]: ...
...
@overload @overload
async def get_e2e_device_keys_and_signatures( async def get_e2e_device_keys_and_signatures(
@ -274,8 +272,7 @@ class EndToEndKeyWorkerStore(EndToEndKeyBackgroundStore, CacheInvalidationWorker
query_list: Collection[Tuple[str, Optional[str]]], query_list: Collection[Tuple[str, Optional[str]]],
include_all_devices: Literal[True], include_all_devices: Literal[True],
include_deleted_devices: Literal[True], include_deleted_devices: Literal[True],
) -> Dict[str, Dict[str, Optional[DeviceKeyLookupResult]]]: ) -> Dict[str, Dict[str, Optional[DeviceKeyLookupResult]]]: ...
...
@trace @trace
@cancellable @cancellable
View file
@ -1292,9 +1292,9 @@ class PersistEventsStore:
Returns: Returns:
filtered list filtered list
""" """
new_events_and_contexts: OrderedDict[ new_events_and_contexts: OrderedDict[str, Tuple[EventBase, EventContext]] = (
str, Tuple[EventBase, EventContext] OrderedDict()
] = OrderedDict() )
for event, context in events_and_contexts: for event, context in events_and_contexts:
prev_event_context = new_events_and_contexts.get(event.event_id) prev_event_context = new_events_and_contexts.get(event.event_id)
if prev_event_context: if prev_event_context:
View file
@ -263,13 +263,13 @@ class EventsWorkerStore(SQLBaseStore):
5 * 60 * 1000, 5 * 60 * 1000,
) )
self._get_event_cache: AsyncLruCache[ self._get_event_cache: AsyncLruCache[Tuple[str], EventCacheEntry] = (
Tuple[str], EventCacheEntry AsyncLruCache(
] = AsyncLruCache( cache_name="*getEvent*",
cache_name="*getEvent*", max_size=hs.config.caches.event_cache_size,
max_size=hs.config.caches.event_cache_size, # `extra_index_cb` Returns a tuple as that is the key type
# `extra_index_cb` Returns a tuple as that is the key type extra_index_cb=lambda _, v: (v.event.room_id,),
extra_index_cb=lambda _, v: (v.event.room_id,), )
) )
# Map from event ID to a deferred that will result in a map from event # Map from event ID to a deferred that will result in a map from event
@ -459,8 +459,7 @@ class EventsWorkerStore(SQLBaseStore):
allow_rejected: bool = ..., allow_rejected: bool = ...,
allow_none: Literal[False] = ..., allow_none: Literal[False] = ...,
check_room_id: Optional[str] = ..., check_room_id: Optional[str] = ...,
) -> EventBase: ) -> EventBase: ...
...
@overload @overload
async def get_event( async def get_event(
@ -471,8 +470,7 @@ class EventsWorkerStore(SQLBaseStore):
allow_rejected: bool = ..., allow_rejected: bool = ...,
allow_none: Literal[True] = ..., allow_none: Literal[True] = ...,
check_room_id: Optional[str] = ..., check_room_id: Optional[str] = ...,
) -> Optional[EventBase]: ) -> Optional[EventBase]: ...
...
@cancellable @cancellable
async def get_event( async def get_event(
@ -800,9 +798,9 @@ class EventsWorkerStore(SQLBaseStore):
# to all the events we pulled from the DB (this will result in this # to all the events we pulled from the DB (this will result in this
# function returning more events than requested, but that can happen # function returning more events than requested, but that can happen
# already due to `_get_events_from_db`). # already due to `_get_events_from_db`).
fetching_deferred: ObservableDeferred[ fetching_deferred: ObservableDeferred[Dict[str, EventCacheEntry]] = (
Dict[str, EventCacheEntry] ObservableDeferred(defer.Deferred(), consumeErrors=True)
] = ObservableDeferred(defer.Deferred(), consumeErrors=True) )
for event_id in missing_events_ids: for event_id in missing_events_ids:
self._current_event_fetches[event_id] = fetching_deferred self._current_event_fetches[event_id] = fetching_deferred
@ -1871,14 +1869,14 @@ class EventsWorkerStore(SQLBaseStore):
" LIMIT ?" " LIMIT ?"
) )
txn.execute(sql, (-last_id, -current_id, instance_name, limit)) txn.execute(sql, (-last_id, -current_id, instance_name, limit))
new_event_updates: List[ new_event_updates: List[Tuple[int, Tuple[str, str, str, str, str, str]]] = (
Tuple[int, Tuple[str, str, str, str, str, str]] []
] = [] )
row: Tuple[int, str, str, str, str, str, str] row: Tuple[int, str, str, str, str, str, str]
# Type safety: iterating over `txn` yields `Tuple`, i.e. # Type safety: iterating over `txn` yields `Tuple`, i.e.
# `Tuple[Any, ...]` of arbitrary length. Mypy detects assigning a # `Tuple[Any, ...]` of arbitrary length. Mypy detects assigning a
# variadic tuple to a fixed length tuple and flags it up as an error. # variadic tuple to a fixed length tuple and flags it up as an error.
for row in txn: # type: ignore[assignment] for row in txn:
new_event_updates.append((row[0], row[1:])) new_event_updates.append((row[0], row[1:]))
limited = False limited = False
@ -1905,7 +1903,7 @@ class EventsWorkerStore(SQLBaseStore):
# Type safety: iterating over `txn` yields `Tuple`, i.e. # Type safety: iterating over `txn` yields `Tuple`, i.e.
# `Tuple[Any, ...]` of arbitrary length. Mypy detects assigning a # `Tuple[Any, ...]` of arbitrary length. Mypy detects assigning a
# variadic tuple to a fixed length tuple and flags it up as an error. # variadic tuple to a fixed length tuple and flags it up as an error.
for row in txn: # type: ignore[assignment] for row in txn:
new_event_updates.append((row[0], row[1:])) new_event_updates.append((row[0], row[1:]))
if len(new_event_updates) >= limit: if len(new_event_updates) >= limit:
@ -1997,16 +1995,18 @@ class EventsWorkerStore(SQLBaseStore):
return rows, to_token, True return rows, to_token, True
@cached(max_entries=5000) @cached(max_entries=5000)
async def get_event_ordering(self, event_id: str) -> Tuple[int, int]: async def get_event_ordering(self, event_id: str, room_id: str) -> Tuple[int, int]:
res = await self.db_pool.simple_select_one( res = await self.db_pool.simple_select_one(
table="events", table="events",
retcols=["topological_ordering", "stream_ordering"], retcols=["topological_ordering", "stream_ordering"],
keyvalues={"event_id": event_id}, keyvalues={"event_id": event_id, "room_id": room_id},
allow_none=True, allow_none=True,
) )
if not res: if not res:
raise SynapseError(404, "Could not find event %s" % (event_id,)) raise SynapseError(
404, "Could not find event %s in room %s" % (event_id, room_id)
)
return int(res[0]), int(res[1]) return int(res[0]), int(res[1])
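`get_event_ordering` is now scoped to a room, so a lookup with a mismatched `room_id` raises a 404 rather than returning another room's ordering. A hedged call-site sketch (`store` and both IDs are assumed context):

    # Returns the event's (topological_ordering, stream_ordering) pair.
    topo, stream = await store.get_event_ordering(
        "$someevent", "!someroom:example.org"
    )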
View file
@ -79,9 +79,9 @@ class LockStore(SQLBaseStore):
# A map from `(lock_name, lock_key)` to lock that we think we # A map from `(lock_name, lock_key)` to lock that we think we
# currently hold. # currently hold.
self._live_lock_tokens: WeakValueDictionary[ self._live_lock_tokens: WeakValueDictionary[Tuple[str, str], Lock] = (
Tuple[str, str], Lock WeakValueDictionary()
] = WeakValueDictionary() )
# A map from `(lock_name, lock_key, token)` to read/write lock that we # A map from `(lock_name, lock_key, token)` to read/write lock that we
# think we currently hold. For a given lock_name/lock_key, there can be # think we currently hold. For a given lock_name/lock_key, there can be
View file
@ -158,9 +158,9 @@ class MediaRepositoryBackgroundUpdateStore(SQLBaseStore):
) )
if hs.config.media.can_load_media_repo: if hs.config.media.can_load_media_repo:
self.unused_expiration_time: Optional[ self.unused_expiration_time: Optional[int] = (
int hs.config.media.unused_expiration_time
] = hs.config.media.unused_expiration_time )
else: else:
self.unused_expiration_time = None self.unused_expiration_time = None
View file
@ -394,9 +394,9 @@ class ReceiptsWorkerStore(SQLBaseStore):
content: JsonDict = {} content: JsonDict = {}
for receipt_type, user_id, event_id, data in rows: for receipt_type, user_id, event_id, data in rows:
content.setdefault(event_id, {}).setdefault(receipt_type, {})[ content.setdefault(event_id, {}).setdefault(receipt_type, {})[user_id] = (
user_id db_to_json(data)
] = db_to_json(data) )
return [{"type": EduTypes.RECEIPT, "room_id": room_id, "content": content}] return [{"type": EduTypes.RECEIPT, "room_id": room_id, "content": content}]
@ -483,9 +483,9 @@ class ReceiptsWorkerStore(SQLBaseStore):
if user_id in receipt_type_dict: # existing receipt if user_id in receipt_type_dict: # existing receipt
# is the existing receipt threaded and we are currently processing an unthreaded one? # is the existing receipt threaded and we are currently processing an unthreaded one?
if "thread_id" in receipt_type_dict[user_id] and not thread_id: if "thread_id" in receipt_type_dict[user_id] and not thread_id:
receipt_type_dict[ receipt_type_dict[user_id] = (
user_id receipt_data # replace with unthreaded one
] = receipt_data # replace with unthreaded one )
else: # receipt does not exist, just set it else: # receipt does not exist, just set it
receipt_type_dict[user_id] = receipt_data receipt_type_dict[user_id] = receipt_data
if thread_id: if thread_id:
View file
@ -369,6 +369,22 @@ class RoomMemberWorkerStore(EventsWorkerStore, CacheInvalidationWorkerStore):
user_id, [Membership.INVITE] user_id, [Membership.INVITE]
) )
async def get_knocked_at_rooms_for_local_user(
self, user_id: str
) -> Sequence[RoomsForUser]:
"""Get all the rooms the *local* user has knocked at.
Args:
user_id: The user ID.
Returns:
A list of RoomsForUser.
"""
return await self.get_rooms_for_local_user_where_membership_is(
user_id, [Membership.KNOCK]
)
async def get_invite_for_local_user_in_room( async def get_invite_for_local_user_in_room(
self, user_id: str, room_id: str self, user_id: str, room_id: str
) -> Optional[RoomsForUser]: ) -> Optional[RoomsForUser]:
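The new helper mirrors the invite lookup just above it, and exists so that account deactivation can find and retract a user's pending knocks. A hedged usage sketch (`store` is assumed context; `RoomsForUser` exposes a `room_id` attribute):

    knocks = await store.get_knocked_at_rooms_for_local_user("@user:example.org")
    pending_knock_room_ids = [knock.room_id for knock in knocks]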
View file
@ -768,12 +768,10 @@ class StateMapWrapper(Dict[StateKey, str]):
return super().__getitem__(key) return super().__getitem__(key)
@overload @overload
def get(self, key: Tuple[str, str]) -> Optional[str]: def get(self, key: Tuple[str, str]) -> Optional[str]: ...
...
@overload @overload
def get(self, key: Tuple[str, str], default: Union[str, _T]) -> Union[str, _T]: def get(self, key: Tuple[str, str], default: Union[str, _T]) -> Union[str, _T]: ...
...
def get( def get(
self, key: StateKey, default: Union[str, _T, None] = None self, key: StateKey, default: Union[str, _T, None] = None
View file
@ -988,8 +988,7 @@ class StreamWorkerStore(EventsWorkerStore, SQLBaseStore):
txn: LoggingTransaction, txn: LoggingTransaction,
event_id: str, event_id: str,
allow_none: Literal[False] = False, allow_none: Literal[False] = False,
) -> int: ) -> int: ...
...
@overload @overload
def get_stream_id_for_event_txn( def get_stream_id_for_event_txn(
@ -997,8 +996,7 @@ class StreamWorkerStore(EventsWorkerStore, SQLBaseStore):
txn: LoggingTransaction, txn: LoggingTransaction,
event_id: str, event_id: str,
allow_none: bool = False, allow_none: bool = False,
) -> Optional[int]: ) -> Optional[int]: ...
...
def get_stream_id_for_event_txn( def get_stream_id_for_event_txn(
self, self,
@ -1476,12 +1474,12 @@ class StreamWorkerStore(EventsWorkerStore, SQLBaseStore):
_EventDictReturn(event_id, topological_ordering, stream_ordering) _EventDictReturn(event_id, topological_ordering, stream_ordering)
for event_id, instance_name, topological_ordering, stream_ordering in txn for event_id, instance_name, topological_ordering, stream_ordering in txn
if _filter_results( if _filter_results(
lower_token=to_token lower_token=(
if direction == Direction.BACKWARDS to_token if direction == Direction.BACKWARDS else from_token
else from_token, ),
upper_token=from_token upper_token=(
if direction == Direction.BACKWARDS from_token if direction == Direction.BACKWARDS else to_token
else to_token, ),
instance_name=instance_name, instance_name=instance_name,
topological_ordering=topological_ordering, topological_ordering=topological_ordering,
stream_ordering=stream_ordering, stream_ordering=stream_ordering,
View file
@ -136,12 +136,12 @@ class TaskSchedulerWorkerStore(SQLBaseStore):
"status": task.status, "status": task.status,
"timestamp": task.timestamp, "timestamp": task.timestamp,
"resource_id": task.resource_id, "resource_id": task.resource_id,
"params": None "params": (
if task.params is None None if task.params is None else json_encoder.encode(task.params)
else json_encoder.encode(task.params), ),
"result": None "result": (
if task.result is None None if task.result is None else json_encoder.encode(task.result)
else json_encoder.encode(task.result), ),
"error": task.error, "error": task.error,
}, },
desc="insert_scheduled_task", desc="insert_scheduled_task",
View file
@ -423,8 +423,11 @@ class TransactionWorkerStore(CacheInvalidationWorkerStore):
self, after_destination: Optional[str] self, after_destination: Optional[str]
) -> List[str]: ) -> List[str]:
""" """
Gets at most 25 destinations which have outstanding PDUs to be caught up, Get a list of destinations we should retry transaction sending to.
and are not being backed off from
Returns up to 25 destinations which have outstanding PDUs or to-device messages,
and are not subject to a backoff.
Args: Args:
after_destination: after_destination:
If provided, all destinations must be lexicographically greater If provided, all destinations must be lexicographically greater
@ -448,30 +451,86 @@ class TransactionWorkerStore(CacheInvalidationWorkerStore):
def _get_catch_up_outstanding_destinations_txn( def _get_catch_up_outstanding_destinations_txn(
txn: LoggingTransaction, now_time_ms: int, after_destination: Optional[str] txn: LoggingTransaction, now_time_ms: int, after_destination: Optional[str]
) -> List[str]: ) -> List[str]:
# We're looking for destinations which satisfy either of the following
# conditions:
#
# * There is at least one room where we have an event that we have not yet
# sent to them, indicated by a row in `destination_rooms` with a
# `stream_ordering` older than the `last_successful_stream_ordering`
# (if any) in `destinations`, or:
#
# * There is at least one to-device message outstanding for the destination,
# indicated by a row in `device_federation_outbox`.
#
# Of course, that may produce destinations where we are already busy sending
# the relevant PDU or to-device message, but in that case, waking up the
# sender will just be a no-op.
#
# From those two lists, we need to *exclude* destinations which are subject
# to a backoff (ie, where `destinations.retry_last_ts + destinations.retry_interval`
# is in the future). There is also an edge-case where, if the server was
# previously shut down in the middle of the first send attempt to a given
# destination, there may be no row in `destinations` at all; we need to include
# such rows in the output, which means we need to left-join rather than
# inner-join against `destinations`.
#
# The two sources of destinations (`destination_rooms` and
# `device_federation_outbox`) are queried separately and UNIONed; but the list
# may be very long, and we don't want to return all the rows at once. We
# therefore sort the output and just return the first 25 rows. Obviously that
# means there is no point in either of the inner queries returning more than
# 25 results, since any further results are certain to be dropped by the outer
# LIMIT. In order to help the query-optimiser understand that, we *also* sort
# and limit the *inner* queries, hence we express them as CTEs rather than
# sub-queries.
#
# (NB: we make sure to do the top-level sort and limit on the database, rather
# than making two queries and combining the result in Python. We could otherwise
# suffer from slight differences in sort order between Python and the database,
# which would make the `after_destination` condition unreliable.)
q = """ q = """
SELECT DISTINCT destination FROM destinations WITH pdu_destinations AS (
INNER JOIN destination_rooms USING (destination) SELECT DISTINCT destination FROM destination_rooms
WHERE LEFT JOIN destinations USING (destination)
stream_ordering > last_successful_stream_ordering WHERE
AND destination > ? destination > ?
AND ( AND destination_rooms.stream_ordering > COALESCE(destinations.last_successful_stream_ordering, 0)
retry_last_ts IS NULL OR AND (
retry_last_ts + retry_interval < ? destinations.retry_last_ts IS NULL OR
) destinations.retry_last_ts + destinations.retry_interval < ?
ORDER BY destination )
LIMIT 25 ORDER BY destination
""" LIMIT 25
txn.execute( ), to_device_destinations AS (
q, SELECT DISTINCT destination FROM device_federation_outbox
( LEFT JOIN destinations USING (destination)
# everything is lexicographically greater than "" so this gives WHERE
# us the first batch of up to 25. destination > ?
after_destination or "", AND (
now_time_ms, destinations.retry_last_ts IS NULL OR
), destinations.retry_last_ts + destinations.retry_interval < ?
)
ORDER BY destination
LIMIT 25
) )
SELECT destination FROM pdu_destinations
UNION SELECT destination FROM to_device_destinations
ORDER BY destination
LIMIT 25
"""
# everything is lexicographically greater than "" so this gives
# us the first batch of up to 25.
after_destination = after_destination or ""
txn.execute(
q,
(after_destination, now_time_ms, after_destination, now_time_ms),
)
destinations = [row[0] for row in txn] destinations = [row[0] for row in txn]
return destinations return destinations
async def get_destinations_paginate( async def get_destinations_paginate(
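The eligibility rules in the long comment above reduce to two per-destination tests. A Python restatement of the backoff test that the SQL applies (column names taken from the hunk; the helper itself is not Synapse code):

    from typing import Optional

    def subject_to_backoff(
        retry_last_ts: Optional[int],
        retry_interval: Optional[int],
        now_ms: int,
    ) -> bool:
        # No `destinations` row, or no recorded failure: eligible right away.
        if retry_last_ts is None or retry_interval is None:
            return False
        # Still inside the retry window: excluded from catch-up for now.
        return retry_last_ts + retry_interval >= now_ms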
View file
@ -745,9 +745,11 @@ class UserDirectoryBackgroundUpdateStore(StateDeltasStore):
p.user_id, p.user_id,
get_localpart_from_id(p.user_id), get_localpart_from_id(p.user_id),
get_domain_from_id(p.user_id), get_domain_from_id(p.user_id),
_filter_text_for_index(p.display_name) (
if p.display_name _filter_text_for_index(p.display_name)
else None, if p.display_name
else None
),
) )
for p in profiles for p in profiles
], ],
View file
@ -120,11 +120,11 @@ class StateGroupDataStore(StateBackgroundUpdateStore, SQLBaseStore):
# TODO: this hasn't been tuned yet # TODO: this hasn't been tuned yet
50000, 50000,
) )
self._state_group_members_cache: DictionaryCache[ self._state_group_members_cache: DictionaryCache[int, StateKey, str] = (
int, StateKey, str DictionaryCache(
] = DictionaryCache( "*stateGroupMembersCache*",
"*stateGroupMembersCache*", 500000,
500000, )
) )
def get_max_state_group_txn(txn: Cursor) -> int: def get_max_state_group_txn(txn: Cursor) -> int:
View file
@ -48,8 +48,7 @@ class BaseDatabaseEngine(Generic[ConnectionType, CursorType], metaclass=abc.ABCM
@property @property
@abc.abstractmethod @abc.abstractmethod
def single_threaded(self) -> bool: def single_threaded(self) -> bool: ...
...
@property @property
@abc.abstractmethod @abc.abstractmethod
@ -68,8 +67,7 @@ class BaseDatabaseEngine(Generic[ConnectionType, CursorType], metaclass=abc.ABCM
@abc.abstractmethod @abc.abstractmethod
def check_database( def check_database(
self, db_conn: ConnectionType, allow_outdated_version: bool = False self, db_conn: ConnectionType, allow_outdated_version: bool = False
) -> None: ) -> None: ...
...
@abc.abstractmethod @abc.abstractmethod
def check_new_database(self, txn: CursorType) -> None: def check_new_database(self, txn: CursorType) -> None:
@ -79,27 +77,22 @@ class BaseDatabaseEngine(Generic[ConnectionType, CursorType], metaclass=abc.ABCM
... ...
@abc.abstractmethod @abc.abstractmethod
def convert_param_style(self, sql: str) -> str: def convert_param_style(self, sql: str) -> str: ...
...
# This method would ideally take a plain ConnectionType, but it seems that # This method would ideally take a plain ConnectionType, but it seems that
# the Sqlite engine expects to use LoggingDatabaseConnection.cursor # the Sqlite engine expects to use LoggingDatabaseConnection.cursor
# instead of sqlite3.Connection.cursor: only the former takes a txn_name. # instead of sqlite3.Connection.cursor: only the former takes a txn_name.
@abc.abstractmethod @abc.abstractmethod
def on_new_connection(self, db_conn: "LoggingDatabaseConnection") -> None: def on_new_connection(self, db_conn: "LoggingDatabaseConnection") -> None: ...
...
@abc.abstractmethod @abc.abstractmethod
def is_deadlock(self, error: Exception) -> bool: def is_deadlock(self, error: Exception) -> bool: ...
...
@abc.abstractmethod @abc.abstractmethod
def is_connection_closed(self, conn: ConnectionType) -> bool: def is_connection_closed(self, conn: ConnectionType) -> bool: ...
...
@abc.abstractmethod @abc.abstractmethod
def lock_table(self, txn: Cursor, table: str) -> None: def lock_table(self, txn: Cursor, table: str) -> None: ...
...
@property @property
@abc.abstractmethod @abc.abstractmethod
View file
@ -42,20 +42,17 @@ SQLQueryParameters = Union[Sequence[Any], Mapping[str, Any]]
class Cursor(Protocol): class Cursor(Protocol):
def execute(self, sql: str, parameters: SQLQueryParameters = ...) -> Any: def execute(self, sql: str, parameters: SQLQueryParameters = ...) -> Any: ...
...
def executemany(self, sql: str, parameters: Sequence[SQLQueryParameters]) -> Any: def executemany(
... self, sql: str, parameters: Sequence[SQLQueryParameters]
) -> Any: ...
def fetchone(self) -> Optional[Tuple]: def fetchone(self) -> Optional[Tuple]: ...
...
def fetchmany(self, size: Optional[int] = ...) -> List[Tuple]: def fetchmany(self, size: Optional[int] = ...) -> List[Tuple]: ...
...
def fetchall(self) -> List[Tuple]: def fetchall(self) -> List[Tuple]: ...
...
@property @property
def description( def description(
@ -70,36 +67,28 @@ class Cursor(Protocol):
def rowcount(self) -> int: def rowcount(self) -> int:
return 0 return 0
def __iter__(self) -> Iterator[Tuple]: def __iter__(self) -> Iterator[Tuple]: ...
...
def close(self) -> None: def close(self) -> None: ...
...
class Connection(Protocol): class Connection(Protocol):
def cursor(self) -> Cursor: def cursor(self) -> Cursor: ...
...
def close(self) -> None: def close(self) -> None: ...
...
def commit(self) -> None: def commit(self) -> None: ...
...
def rollback(self) -> None: def rollback(self) -> None: ...
...
def __enter__(self) -> "Connection": def __enter__(self) -> "Connection": ...
...
def __exit__( def __exit__(
self, self,
exc_type: Optional[Type[BaseException]], exc_type: Optional[Type[BaseException]],
exc_value: Optional[BaseException], exc_value: Optional[BaseException],
traceback: Optional[TracebackType], traceback: Optional[TracebackType],
) -> Optional[bool]: ) -> Optional[bool]: ...
...
class DBAPI2Module(Protocol): class DBAPI2Module(Protocol):
@ -129,24 +118,20 @@ class DBAPI2Module(Protocol):
# explain why this is necessary for safety. TL;DR: we shouldn't be able to write # explain why this is necessary for safety. TL;DR: we shouldn't be able to write
# to `x`, only read from it. See also https://github.com/python/mypy/issues/6002 . # to `x`, only read from it. See also https://github.com/python/mypy/issues/6002 .
@property @property
def Warning(self) -> Type[Exception]: def Warning(self) -> Type[Exception]: ...
...
@property @property
def Error(self) -> Type[Exception]: def Error(self) -> Type[Exception]: ...
...
# Errors are divided into `InterfaceError`s (something went wrong in the database # Errors are divided into `InterfaceError`s (something went wrong in the database
# driver) and `DatabaseError`s (something went wrong in the database). These are # driver) and `DatabaseError`s (something went wrong in the database). These are
# both subclasses of `Error`, but we can't currently express this in type # both subclasses of `Error`, but we can't currently express this in type
# annotations due to https://github.com/python/mypy/issues/8397 # annotations due to https://github.com/python/mypy/issues/8397
@property @property
def InterfaceError(self) -> Type[Exception]: def InterfaceError(self) -> Type[Exception]: ...
...
@property @property
def DatabaseError(self) -> Type[Exception]: def DatabaseError(self) -> Type[Exception]: ...
...
# Everything below is a subclass of `DatabaseError`. # Everything below is a subclass of `DatabaseError`.
@ -155,8 +140,7 @@ class DBAPI2Module(Protocol):
# - An invalid date time was provided. # - An invalid date time was provided.
# - A string contained a null code point. # - A string contained a null code point.
@property @property
def DataError(self) -> Type[Exception]: def DataError(self) -> Type[Exception]: ...
...
# Roughly: something went wrong in the database, but it's not within the application # Roughly: something went wrong in the database, but it's not within the application
# programmer's control. Examples: # programmer's control. Examples:
@ -167,21 +151,18 @@ class DBAPI2Module(Protocol):
# - The database ran out of resources, such as storage, memory, connections, etc. # - The database ran out of resources, such as storage, memory, connections, etc.
# - The database encountered an error from the operating system. # - The database encountered an error from the operating system.
@property @property
def OperationalError(self) -> Type[Exception]: def OperationalError(self) -> Type[Exception]: ...
...
# Roughly: we've given the database data which breaks a rule we asked it to enforce. # Roughly: we've given the database data which breaks a rule we asked it to enforce.
# Examples: # Examples:
# - Stop, criminal scum! You violated the foreign key constraint # - Stop, criminal scum! You violated the foreign key constraint
# - Also check constraints, non-null constraints, etc. # - Also check constraints, non-null constraints, etc.
@property @property
def IntegrityError(self) -> Type[Exception]: def IntegrityError(self) -> Type[Exception]: ...
...
# Roughly: something went wrong within the database server itself. # Roughly: something went wrong within the database server itself.
@property @property
def InternalError(self) -> Type[Exception]: def InternalError(self) -> Type[Exception]: ...
...
# Roughly: the application did something silly that needs to be fixed. Examples: # Roughly: the application did something silly that needs to be fixed. Examples:
# - We don't have permissions to do something. # - We don't have permissions to do something.
@ -189,13 +170,11 @@ class DBAPI2Module(Protocol):
# - We tried to use a reserved name. # - We tried to use a reserved name.
# - We referred to a column that doesn't exist. # - We referred to a column that doesn't exist.
@property @property
def ProgrammingError(self) -> Type[Exception]: def ProgrammingError(self) -> Type[Exception]: ...
...
# Roughly: we've tried to do something that this database doesn't support. # Roughly: we've tried to do something that this database doesn't support.
@property @property
def NotSupportedError(self) -> Type[Exception]: def NotSupportedError(self) -> Type[Exception]: ...
...
# We originally wrote # We originally wrote
# def connect(self, *args, **kwargs) -> Connection: ... # def connect(self, *args, **kwargs) -> Connection: ...
@ -204,8 +183,7 @@ class DBAPI2Module(Protocol):
# psycopg2.connect doesn't have a mandatory positional argument. Instead, we use # psycopg2.connect doesn't have a mandatory positional argument. Instead, we use
# the following slightly unusual workaround. # the following slightly unusual workaround.
@property @property
def connect(self) -> Callable[..., Connection]: def connect(self) -> Callable[..., Connection]: ...
...
__all__ = ["Cursor", "Connection", "DBAPI2Module"] __all__ = ["Cursor", "Connection", "DBAPI2Module"]
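These structural protocols describe the subset of DB-API 2.0 that Synapse relies on, and the stdlib `sqlite3` module satisfies them. A minimal sketch exercising exactly that subset (table and data are made up):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE t (x INTEGER)")
    cur.executemany("INSERT INTO t VALUES (?)", [(1,), (2,)])
    cur.execute("SELECT x FROM t ORDER BY x")
    assert cur.fetchall() == [(1,), (2,)]
    conn.commit()
    conn.close()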
View file
@ -57,12 +57,13 @@ class _EventSourcesInner:
class EventSources: class EventSources:
def __init__(self, hs: "HomeServer"): def __init__(self, hs: "HomeServer"):
self.sources = _EventSourcesInner( self.sources = _EventSourcesInner(
# mypy previously warned that attribute.type is `Optional`, but we know it's # attribute.type is `Optional`, but we know it's
# never `None` here since all the attributes of `_EventSourcesInner` are # never `None` here since all the attributes of `_EventSourcesInner` are
# annotated. # annotated.
# As of the stubs in attrs 22.1.0, `attr.fields()` now returns Any, *(
# so the call to `attribute.type` is not checked. attribute.type(hs) # type: ignore[misc]
*(attribute.type(hs) for attribute in attr.fields(_EventSourcesInner)) for attribute in attr.fields(_EventSourcesInner)
)
) )
self.store = hs.get_datastores().main self.store = hs.get_datastores().main
self._instance_name = hs.get_instance_name() self._instance_name = hs.get_instance_name()
View file
@ -56,7 +56,7 @@ class EventInternalMetadata:
(Added in synapse 0.99.0, so may be unreliable for events received before that) (Added in synapse 0.99.0, so may be unreliable for events received before that)
""" """
...
def get_send_on_behalf_of(self) -> Optional[str]: def get_send_on_behalf_of(self) -> Optional[str]:
"""Whether this server should send the event on behalf of another server. """Whether this server should send the event on behalf of another server.
This is used by the federation "send_join" API to forward the initial join This is used by the federation "send_join" API to forward the initial join
@ -64,7 +64,7 @@ class EventInternalMetadata:
returns a str with the name of the server this event is sent on behalf of. returns a str with the name of the server this event is sent on behalf of.
""" """
...
def need_to_check_redaction(self) -> bool: def need_to_check_redaction(self) -> bool:
"""Whether the redaction event needs to be rechecked when fetching """Whether the redaction event needs to be rechecked when fetching
from the database. from the database.
@ -75,7 +75,7 @@ class EventInternalMetadata:
If the sender of the redaction event is allowed to redact any event If the sender of the redaction event is allowed to redact any event
due to auth rules, then this will always return false. due to auth rules, then this will always return false.
""" """
...
def is_soft_failed(self) -> bool: def is_soft_failed(self) -> bool:
"""Whether the event has been soft failed. """Whether the event has been soft failed.
@ -85,7 +85,7 @@ class EventInternalMetadata:
2. They should not be added to the forward extremities (and 2. They should not be added to the forward extremities (and
therefore not to current state). therefore not to current state).
""" """
...
def should_proactively_send(self) -> bool: def should_proactively_send(self) -> bool:
"""Whether the event, if ours, should be sent to other clients and """Whether the event, if ours, should be sent to other clients and
servers. servers.
@ -93,14 +93,13 @@ class EventInternalMetadata:
This is used for sending dummy events internally. Servers and clients This is used for sending dummy events internally. Servers and clients
can still explicitly fetch the event. can still explicitly fetch the event.
""" """
...
def is_redacted(self) -> bool: def is_redacted(self) -> bool:
"""Whether the event has been redacted. """Whether the event has been redacted.
This is used for efficiently checking whether an event has been This is used for efficiently checking whether an event has been
marked as redacted without needing to make another database call. marked as redacted without needing to make another database call.
""" """
...
def is_notifiable(self) -> bool: def is_notifiable(self) -> bool:
"""Whether this event can trigger a push notification""" """Whether this event can trigger a push notification"""
...
View file
@ -976,12 +976,12 @@ class StreamToken:
return attr.evolve(self, **{key.value: new_value}) return attr.evolve(self, **{key.value: new_value})
@overload @overload
def get_field(self, key: Literal[StreamKeyType.ROOM]) -> RoomStreamToken: def get_field(self, key: Literal[StreamKeyType.ROOM]) -> RoomStreamToken: ...
...
@overload @overload
def get_field(self, key: Literal[StreamKeyType.RECEIPT]) -> MultiWriterStreamToken: def get_field(
... self, key: Literal[StreamKeyType.RECEIPT]
) -> MultiWriterStreamToken: ...
@overload @overload
def get_field( def get_field(
@ -995,14 +995,12 @@ class StreamToken:
StreamKeyType.TYPING, StreamKeyType.TYPING,
StreamKeyType.UN_PARTIAL_STATED_ROOMS, StreamKeyType.UN_PARTIAL_STATED_ROOMS,
], ],
) -> int: ) -> int: ...
...
@overload @overload
def get_field( def get_field(
self, key: StreamKeyType self, key: StreamKeyType
) -> Union[int, RoomStreamToken, MultiWriterStreamToken]: ) -> Union[int, RoomStreamToken, MultiWriterStreamToken]: ...
...
def get_field( def get_field(
self, key: StreamKeyType self, key: StreamKeyType
View file
@ -117,7 +117,11 @@ class Clock:
return int(self.time() * 1000) return int(self.time() * 1000)
def looping_call( def looping_call(
self, f: Callable[P, object], msec: float, *args: P.args, **kwargs: P.kwargs self,
f: Callable[P, object],
msec: float,
*args: P.args,
**kwargs: P.kwargs,
) -> LoopingCall: ) -> LoopingCall:
"""Call a function repeatedly. """Call a function repeatedly.
@ -134,12 +138,46 @@ class Clock:
Args: Args:
f: The function to call repeatedly. f: The function to call repeatedly.
msec: How long to wait between calls in milliseconds. msec: How long to wait between calls in milliseconds.
*args: Postional arguments to pass to function. *args: Positional arguments to pass to function.
**kwargs: Keyword arguments to pass to function. **kwargs: Keyword arguments to pass to function.
""" """
return self._looping_call_common(f, msec, False, *args, **kwargs)
def looping_call_now(
self,
f: Callable[P, object],
msec: float,
*args: P.args,
**kwargs: P.kwargs,
) -> LoopingCall:
"""Call a function immediately, and then repeatedly thereafter.
As with `looping_call`: subsequent calls are not scheduled until after the
Awaitable returned by a previous call has finished.
Also as with `looping_call`: the function is called with no logcontext and
you probably want to wrap it in `run_as_background_process`.
Args:
f: The function to call repeatedly.
msec: How long to wait between calls in milliseconds.
*args: Positional arguments to pass to function.
**kwargs: Keyword arguments to pass to function.
"""
return self._looping_call_common(f, msec, True, *args, **kwargs)
def _looping_call_common(
self,
f: Callable[P, object],
msec: float,
now: bool,
*args: P.args,
**kwargs: P.kwargs,
) -> LoopingCall:
"""Common functionality for `looping_call` and `looping_call_now`"""
call = task.LoopingCall(f, *args, **kwargs) call = task.LoopingCall(f, *args, **kwargs)
call.clock = self._reactor call.clock = self._reactor
d = call.start(msec / 1000.0, now=False) d = call.start(msec / 1000.0, now=now)
d.addErrback(log_failure, "Looping call died", consumeErrors=False) d.addErrback(log_failure, "Looping call died", consumeErrors=False)
return call return call
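A hedged usage sketch of the new `looping_call_now`, wrapping the callback in a background process as the docstring advises (`clock`, the process name, and the callback are assumptions):

    from synapse.metrics.background_process_metrics import run_as_background_process

    def _tick() -> None:
        run_as_background_process("example_tick", _do_tick)  # _do_tick is assumed

    # Fires immediately, then every 60 seconds thereafter.
    clock.looping_call_now(_tick, 60_000)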
View file
@ -284,15 +284,7 @@ async def yieldable_gather_results(
try: try:
return await make_deferred_yieldable( return await make_deferred_yieldable(
defer.gatherResults( defer.gatherResults(
# type-ignore: mypy reports two errors: [run_in_background(func, item, *args, **kwargs) for item in iter],
# error: Argument 1 to "run_in_background" has incompatible type
# "Callable[[T, **P], Awaitable[R]]"; expected
# "Callable[[T, **P], Awaitable[R]]" [arg-type]
# error: Argument 2 to "run_in_background" has incompatible type
# "T"; expected "[T, **P.args]" [arg-type]
# The former looks like a mypy bug, and the latter looks like a
# false positive.
[run_in_background(func, item, *args, **kwargs) for item in iter], # type: ignore[arg-type]
consumeErrors=True, consumeErrors=True,
) )
) )
@ -338,7 +330,7 @@ async def yieldable_gather_results_delaying_cancellation(
return await make_deferred_yieldable( return await make_deferred_yieldable(
delay_cancellation( delay_cancellation(
defer.gatherResults( defer.gatherResults(
[run_in_background(func, item, *args, **kwargs) for item in iter], # type: ignore[arg-type] [run_in_background(func, item, *args, **kwargs) for item in iter],
consumeErrors=True, consumeErrors=True,
) )
) )
@ -357,24 +349,21 @@ T4 = TypeVar("T4")
@overload @overload
def gather_results( def gather_results(
deferredList: Tuple[()], consumeErrors: bool = ... deferredList: Tuple[()], consumeErrors: bool = ...
) -> "defer.Deferred[Tuple[()]]": ) -> "defer.Deferred[Tuple[()]]": ...
...
@overload @overload
def gather_results( def gather_results(
deferredList: Tuple["defer.Deferred[T1]"], deferredList: Tuple["defer.Deferred[T1]"],
consumeErrors: bool = ..., consumeErrors: bool = ...,
) -> "defer.Deferred[Tuple[T1]]": ) -> "defer.Deferred[Tuple[T1]]": ...
...
@overload @overload
def gather_results( def gather_results(
deferredList: Tuple["defer.Deferred[T1]", "defer.Deferred[T2]"], deferredList: Tuple["defer.Deferred[T1]", "defer.Deferred[T2]"],
consumeErrors: bool = ..., consumeErrors: bool = ...,
) -> "defer.Deferred[Tuple[T1, T2]]": ) -> "defer.Deferred[Tuple[T1, T2]]": ...
...
@overload @overload
@ -383,8 +372,7 @@ def gather_results(
"defer.Deferred[T1]", "defer.Deferred[T2]", "defer.Deferred[T3]" "defer.Deferred[T1]", "defer.Deferred[T2]", "defer.Deferred[T3]"
], ],
consumeErrors: bool = ..., consumeErrors: bool = ...,
) -> "defer.Deferred[Tuple[T1, T2, T3]]": ) -> "defer.Deferred[Tuple[T1, T2, T3]]": ...
...
@overload @overload
@@ -396,8 +384,7 @@ def gather_results(
         "defer.Deferred[T4]",
     ],
     consumeErrors: bool = ...,
-) -> "defer.Deferred[Tuple[T1, T2, T3, T4]]":
-    ...
+) -> "defer.Deferred[Tuple[T1, T2, T3, T4]]": ...


 def gather_results(  # type: ignore[misc]
@@ -782,18 +769,15 @@ def stop_cancellation(deferred: "defer.Deferred[T]") -> "defer.Deferred[T]":
 @overload
-def delay_cancellation(awaitable: "defer.Deferred[T]") -> "defer.Deferred[T]":
-    ...
+def delay_cancellation(awaitable: "defer.Deferred[T]") -> "defer.Deferred[T]": ...


 @overload
-def delay_cancellation(awaitable: Coroutine[Any, Any, T]) -> "defer.Deferred[T]":
-    ...
+def delay_cancellation(awaitable: Coroutine[Any, Any, T]) -> "defer.Deferred[T]": ...


 @overload
-def delay_cancellation(awaitable: Awaitable[T]) -> Awaitable[T]:
-    ...
+def delay_cancellation(awaitable: Awaitable[T]) -> Awaitable[T]: ...


 def delay_cancellation(awaitable: Awaitable[T]) -> Awaitable[T]:
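None of the `@overload` edits above (or the similar ones in the cache modules below) change behaviour; they all move the stub body `...` onto the signature line. That is the formatting newer versions of Black emit for dummy implementations, so this looks like a bulk reformat after a formatter upgrade. The shape in isolation (illustrative function, not from the codebase):

```python
# New stub style: the ellipsis body shares the signature line.
from typing import Optional, overload


@overload
def get(key: str) -> Optional[str]: ...  # previously: `...` on its own line
@overload
def get(key: str, default: str) -> str: ...
def get(key: str, default: Optional[str] = None) -> Optional[str]:
    # One real implementation backs both typed signatures.
    return default


print(get("missing", "fallback"))  # fallback
```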

View file

@@ -229,7 +229,7 @@ class DictionaryCache(Generic[KT, DKT, DV]):
         for dict_key in missing:
             # We explicitly add each dict key to the cache, so that cache hit
             # rates and LRU times for each key can be tracked separately.
-            value = entry.get(dict_key, _Sentinel.sentinel)  # type: ignore[arg-type]
+            value = entry.get(dict_key, _Sentinel.sentinel)
             self.cache[(key, dict_key)] = _PerKeyValue(value)

             if value is not _Sentinel.sentinel:
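The comment in this hunk captures the design worth knowing here: instead of caching a whole dict under one key, every `(key, dict_key)` pair becomes its own cache entry, so hit rates and LRU age are tracked per inner key. A toy illustration of that layout (deliberately much simpler than the real `DictionaryCache`):

```python
# Toy flat-cache layout: one entry per (key, dict_key) pair.
from typing import Dict, Tuple

cache: Dict[Tuple[str, str], str] = {}


def insert(key: str, entry: Dict[str, str]) -> None:
    for dict_key, value in entry.items():
        # Each inner key is cached separately, so hits, misses and
        # eviction age can be measured per dict_key, not per dict.
        cache[(key, dict_key)] = value


insert("room1", {"alice": "join", "bob": "leave"})
print(cache[("room1", "alice")])  # join
```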

View file

@@ -142,7 +142,7 @@ class ExpiringCache(Generic[KT, VT]):
             return default

         if self.iterable:
-            self.metrics.inc_evictions(EvictionReason.invalidation, len(value.value))  # type: ignore[arg-type]
+            self.metrics.inc_evictions(EvictionReason.invalidation, len(value.value))
         else:
             self.metrics.inc_evictions(EvictionReason.invalidation)
@@ -152,12 +152,10 @@ class ExpiringCache(Generic[KT, VT]):
         return key in self._cache

     @overload
-    def get(self, key: KT, default: Literal[None] = None) -> Optional[VT]:
-        ...
+    def get(self, key: KT, default: Literal[None] = None) -> Optional[VT]: ...

     @overload
-    def get(self, key: KT, default: T) -> Union[VT, T]:
-        ...
+    def get(self, key: KT, default: T) -> Union[VT, T]: ...

     def get(self, key: KT, default: Optional[T] = None) -> Union[VT, Optional[T]]:
         try:

View file

@@ -580,8 +580,7 @@ class LruCache(Generic[KT, VT]):
            callbacks: Collection[Callable[[], None]] = ...,
            update_metrics: bool = ...,
            update_last_access: bool = ...,
-        ) -> Optional[VT]:
-            ...
+        ) -> Optional[VT]: ...

         @overload
         def cache_get(
@@ -590,8 +589,7 @@ class LruCache(Generic[KT, VT]):
            callbacks: Collection[Callable[[], None]] = ...,
            update_metrics: bool = ...,
            update_last_access: bool = ...,
-        ) -> Union[T, VT]:
-            ...
+        ) -> Union[T, VT]: ...

         @synchronized
         def cache_get(
@@ -634,16 +632,14 @@ class LruCache(Generic[KT, VT]):
            key: tuple,
            default: Literal[None] = None,
            update_metrics: bool = True,
-        ) -> Union[None, Iterable[Tuple[KT, VT]]]:
-            ...
+        ) -> Union[None, Iterable[Tuple[KT, VT]]]: ...

         @overload
         def cache_get_multi(
            key: tuple,
            default: T,
            update_metrics: bool = True,
-        ) -> Union[T, Iterable[Tuple[KT, VT]]]:
-            ...
+        ) -> Union[T, Iterable[Tuple[KT, VT]]]: ...

         @synchronized
         def cache_get_multi(
@@ -728,12 +724,10 @@ class LruCache(Generic[KT, VT]):
             return value

         @overload
-        def cache_pop(key: KT, default: Literal[None] = None) -> Optional[VT]:
-            ...
+        def cache_pop(key: KT, default: Literal[None] = None) -> Optional[VT]: ...

         @overload
-        def cache_pop(key: KT, default: T) -> Union[T, VT]:
-            ...
+        def cache_pop(key: KT, default: T) -> Union[T, VT]: ...

         @synchronized
         def cache_pop(key: KT, default: Optional[T] = None) -> Union[None, T, VT]:

View file

@@ -50,8 +50,7 @@ class _SelfSlice(Sized, Protocol):
     returned.
     """

-    def __getitem__(self: S, i: slice) -> S:
-        ...
+    def __getitem__(self: S, i: slice) -> S: ...


 def batch_iter(iterable: Iterable[T], size: int) -> Iterator[Tuple[T, ...]]:
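For reference, `batch_iter` (visible in the trailing context line) chunks an iterable into fixed-size tuples. A minimal sketch written from that signature alone, not copied from Synapse's implementation:

```python
# Sketch: yield successive tuples of at most `size` items from `iterable`.
from itertools import islice
from typing import Iterable, Iterator, Tuple, TypeVar

T = TypeVar("T")


def batch_iter(iterable: Iterable[T], size: int) -> Iterator[Tuple[T, ...]]:
    it = iter(iterable)
    while batch := tuple(islice(it, size)):
        yield batch


print(list(batch_iter(range(7), 3)))  # [(0, 1, 2), (3, 4, 5), (6,)]
```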

View file

@@ -177,9 +177,9 @@ class FederationRateLimiter:
             clock=clock, config=config, metrics_name=metrics_name
         )

-        self.ratelimiters: DefaultDict[
-            str, "_PerHostRatelimiter"
-        ] = collections.defaultdict(new_limiter)
+        self.ratelimiters: DefaultDict[str, "_PerHostRatelimiter"] = (
+            collections.defaultdict(new_limiter)
+        )

         with _rate_limiter_instances_lock:
             _rate_limiter_instances.add(self)
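This hunk is also a pure reformat: the annotated assignment is re-wrapped in the parenthesized right-hand-side style recent Black releases prefer. The `defaultdict(new_limiter)` still lazily builds one per-host limiter on first access, as in this reduced sketch (stand-in class, not the real `_PerHostRatelimiter`):

```python
# Reduced sketch of a lazy per-host limiter map (stand-in class).
import collections
from typing import DefaultDict


class PerHostLimiter:
    def __init__(self) -> None:
        self.request_count = 0


ratelimiters: DefaultDict[str, PerHostLimiter] = (
    collections.defaultdict(PerHostLimiter)
)

# A limiter is created on first lookup for each host, never up front.
ratelimiters["matrix.org"].request_count += 1
ratelimiters["example.com"].request_count += 1
print(len(ratelimiters))  # 2
```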

Some files were not shown because too many files have changed in this diff.