Compare commits

...

16 Commits

Author SHA1 Message Date
Tulir Asokan 15d050f5f4 Merge remote-tracking branch 'upstream/release-v1.102' 2024-02-20 18:11:09 +02:00
Andrew Morgan 7856ec96ef 1.102.0rc1 2024-02-20 15:51:17 +00:00
Erik Johnston cdbbf3653d
Don't lock up when joining large rooms (#16903)
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
2024-02-20 14:29:18 +00:00
kegsay c51a2240d1
bugfix: always prefer unthreaded receipt when >1 exist (MSC4102) (#16927)
Co-authored-by: Andrew Morgan <1342360+anoadragon453@users.noreply.github.com>
2024-02-20 14:12:06 +00:00
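
For context, a minimal sketch of the clash rule this commit implements (the real change lives in the receipts store, shown further down in the diff): if a user already has a receipt on an event, it is only replaced when the existing receipt is threaded and the incoming one is unthreaded. The `merge_receipt` helper here is illustrative only, not Synapse's API.

```python
from typing import Dict, Optional

def merge_receipt(
    existing: Dict[str, dict],
    user_id: str,
    data: dict,
    thread_id: Optional[str],
) -> None:
    """Illustrative MSC4102 rule: unthreaded receipts win over threaded ones."""
    new_receipt = dict(data)
    if thread_id:
        new_receipt["thread_id"] = thread_id
    if user_id in existing:
        # Only replace an existing receipt if it is threaded and the new one is not.
        if "thread_id" in existing[user_id] and not thread_id:
            existing[user_id] = new_receipt
    else:
        existing[user_id] = new_receipt

# An unthreaded receipt replaces a threaded one, but not vice versa.
receipts: Dict[str, dict] = {}
merge_receipt(receipts, "@alice:example.org", {"ts": 1}, thread_id="$thread")
merge_receipt(receipts, "@alice:example.org", {"ts": 2}, thread_id=None)
assert "thread_id" not in receipts["@alice:example.org"]
```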
Erik Johnston e5dfb6ecbf
Fix incorrect docker hub link in release script (#16910) 2024-02-20 12:20:31 +00:00
Rainer Zufall 1b7304c8b4
fix typo in admin_api/rooms.md (#16857)
Co-authored-by: Andrew Morgan <andrew@amorgan.xyz>
2024-02-20 12:20:23 +00:00
Remi Rampin 0621e8eb0e
Add metric for emails sent (#16881)
This adds a counter, `synapse_emails_sent_total`, for emails sent. It is
broken down by `type`, which is one of `password_reset`, `registration`,
`add_threepid`, or `notification` (matching the methods of `Mailer`).
2024-02-14 15:30:03 +00:00
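
A minimal sketch of the labelled counter this adds, assuming only `prometheus_client` (the real wiring lives in `synapse/push/mailer.py`, shown later in the diff):

```python
from prometheus_client import Counter

# One counter, broken down by the kind of email being sent.
emails_sent_counter = Counter(
    "synapse_emails_sent_total",
    "Emails sent by type",
    ["type"],
)

def send_password_reset_mail(address: str) -> None:
    # ... render and deliver the email here ...
    emails_sent_counter.labels("password_reset").inc()
```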
Erik Johnston bc1db16086 Merge branch 'master' into develop 2024-02-13 13:24:29 +00:00
Erik Johnston 7b4d7429f8
Don't invalidate the entire event cache when we purge history (#16905)
We do this by adding support to the LRU cache for "extra indices" based
on the cached value. This allows us to efficiently map from room ID to
the cached events and only invalidate those.
2024-02-13 13:24:11 +00:00
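
A toy illustration of the "extra index" idea (not Synapse's `LruCache`, whose actual changes appear later in the diff): alongside the main event-ID keyed cache, keep a secondary map from room ID to cached event IDs so a purge can evict only that room's entries.

```python
from collections import defaultdict
from typing import Dict, Optional, Set

class RoomIndexedEventCache:
    """Toy cache keyed by event_id with a secondary index keyed by room_id."""

    def __init__(self) -> None:
        self._cache: Dict[str, dict] = {}
        self._by_room: Dict[str, Set[str]] = defaultdict(set)

    def set(self, event_id: str, event: dict) -> None:
        self._cache[event_id] = event
        self._by_room[event["room_id"]].add(event_id)

    def get(self, event_id: str) -> Optional[dict]:
        return self._cache.get(event_id)

    def invalidate_room(self, room_id: str) -> None:
        # Evict only the events belonging to this room; everything else stays cached.
        for event_id in self._by_room.pop(room_id, set()):
            self._cache.pop(event_id, None)
```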
Erik Johnston 01910b981f
Add a config to not send out device list updates for specific users (#16909)
Adds a list of users for whom device list updates are not sent out when they
register new devices. This is useful for handling bot accounts.

This is undocumented as it's mostly a hack to test on matrix.org.

Note: device list updates will still be sent out if the device is later
updated, e.g. when end-to-end keys are added.
2024-02-13 13:23:03 +00:00
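
A hedged sketch of the check described above; the `should_notify_new_device` helper is hypothetical, but the frozenset lookup mirrors the config and handler changes further down in the diff.

```python
from typing import FrozenSet

def should_notify_new_device(user_id: str, dont_notify_for: FrozenSet[str]) -> bool:
    # Suppress the device list update for the *initial* registration of a device
    # by configured (e.g. bot) users; later updates to the device still notify.
    return user_id not in dont_notify_for

dont_notify_new_devices_for = frozenset({"@bot:example.org"})
assert should_notify_new_device("@alice:example.org", dont_notify_new_devices_for)
assert not should_notify_new_device("@bot:example.org", dont_notify_new_devices_for)
```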
dependabot[bot] 79e31e8527
Bump pygithub from 2.1.1 to 2.2.0 (#16902) 2024-02-12 16:29:07 +00:00
dependabot[bot] b07617fbe8
Bump attrs from 23.1.0 to 23.2.0 (#16899) 2024-02-12 16:27:51 +00:00
dependabot[bot] e7cdf6152b
Bump bcrypt from 4.0.1 to 4.1.2 (#16900)
Bumps [bcrypt](https://github.com/pyca/bcrypt) from 4.0.1 to 4.1.2.

Commits:
- `b9223e6` Try building py39 wheels to see if that helps with reinitialization errors (#…)
- `5049783` Bump syn from 2.0.40 to 2.0.41 in /src/_bcrypt ([#696](https://redirect.github.com/pyca/bcrypt/issues/696))
- `642d070` Bump syn from 2.0.39 to 2.0.40 in /src/_bcrypt ([#693](https://redirect.github.com/pyca/bcrypt/issues/693))
- `8b44a10` Bump libc from 0.2.150 to 0.2.151 in /src/_bcrypt ([#692](https://redirect.github.com/pyca/bcrypt/issues/692))
- `951cc64` Bump once_cell from 1.18.0 to 1.19.0 in /src/_bcrypt ([#690](https://redirect.github.com/pyca/bcrypt/issues/690))
- `7377c6d` Bump actions/setup-python from 4.8.0 to 5.0.0 ([#689](https://redirect.github.com/pyca/bcrypt/issues/689))
- `61b3203` Bump actions/setup-python from 4.7.1 to 4.8.0 ([#688](https://redirect.github.com/pyca/bcrypt/issues/688))
- `1c3159a` Fixed wheels for older versions of macOS ([#687](https://redirect.github.com/pyca/bcrypt/issues/687))
- `1a41437` Update README.rst ([#682](https://redirect.github.com/pyca/bcrypt/issues/682))
- `7881c5b` Fix building windows abi3 wheels ([#681](https://redirect.github.com/pyca/bcrypt/issues/681))
- Additional commits viewable in the [compare view](https://github.com/pyca/bcrypt/compare/4.0.1...4.1.2)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 16:27:43 +00:00
dependabot[bot] c415b7a412
Bump sentry-sdk from 1.40.0 to 1.40.3 (#16898)
Bumps [sentry-sdk](https://github.com/getsentry/sentry-python) from
1.40.0 to 1.40.3.
Release notes (sourced from [sentry-sdk's releases](https://github.com/getsentry/sentry-python/releases)):

1.40.3
- Turn off metrics for uWSGI ([#2720](https://redirect.github.com/getsentry/sentry-python/issues/2720)) by @sentrivana
- Minor improvements ([#2714](https://redirect.github.com/getsentry/sentry-python/issues/2714)) by @antonpirker

1.40.2
- test: Fix `pytest` error ([#2712](https://redirect.github.com/getsentry/sentry-python/issues/2712)) by @szokeasaurusrex
- build(deps): bump types-protobuf from 4.24.0.4 to 4.24.0.20240129 ([#2691](https://redirect.github.com/getsentry/sentry-python/issues/2691)) by @dependabot

1.40.1
- Fix uWSGI workers hanging ([#2694](https://redirect.github.com/getsentry/sentry-python/issues/2694)) by @sentrivana
- Make metrics work with `gevent` ([#2694](https://redirect.github.com/getsentry/sentry-python/issues/2694)) by @sentrivana
- Guard against `engine.url` being `None` ([#2708](https://redirect.github.com/getsentry/sentry-python/issues/2708)) by @sentrivana
- Fix performance regression in `sentry_sdk.utils._generate_installed_modules` ([#2703](https://redirect.github.com/getsentry/sentry-python/issues/2703)) by @GlenWalker
- Guard against Sentry initialization mid SQLAlchemy cursor ([#2702](https://redirect.github.com/getsentry/sentry-python/issues/2702)) by @apmorton
- Fix yaml generation script ([#2695](https://redirect.github.com/getsentry/sentry-python/issues/2695)) by @sentrivana
- Fix AWS Lambda workflow ([#2710](https://redirect.github.com/getsentry/sentry-python/issues/2710)) by @sentrivana
- Bump `codecov/codecov-action` from 3 to 4 ([#2706](https://redirect.github.com/getsentry/sentry-python/issues/2706)) by @dependabot
- Bump `actions/cache` from 3 to 4 ([#2661](https://redirect.github.com/getsentry/sentry-python/issues/2661)) by @dependabot
- Bump `actions/checkout` from 3.1.0 to 4.1.1 ([#2561](https://redirect.github.com/getsentry/sentry-python/issues/2561)) by @dependabot
- Bump `github/codeql-action` from 2 to 3 ([#2603](https://redirect.github.com/getsentry/sentry-python/issues/2603)) by @dependabot
- Bump `actions/setup-python` from 4 to 5 ([#2577](https://redirect.github.com/getsentry/sentry-python/issues/2577)) by @dependabot

Commits:
- `84c4c12` Update CHANGELOG.md
- `f92b4f2` release: 1.40.3
- `f23bdd3` fix(metrics): Turn off metrics for uWSGI ([#2720](https://redirect.github.com/getsentry/sentry-python/issues/2720))
- `c77a123` Minor improvements ([#2714](https://redirect.github.com/getsentry/sentry-python/issues/2714))
- `2186e22` Merge branch 'release/1.40.2'
- `139469a` release: 1.40.2
- `d97e7d7` test: Fix `pytest` error ([#2712](https://redirect.github.com/getsentry/sentry-python/issues/2712))
- `60e644c` build(deps): bump types-protobuf from 4.24.0.4 to 4.24.0.20240129 ([#2691](https://redirect.github.com/getsentry/sentry-python/issues/2691))
- `d769bec` Merge branch 'release/1.40.1'
- `ad25ed9` Update CHANGELOG.md
- Additional commits viewable in the [compare view](https://github.com/getsentry/sentry-python/compare/1.40.0...1.40.3)

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-02-12 16:27:32 +00:00
Erik Johnston ea1b30940e Merge remote-tracking branch 'origin/release-v1.101' into develop 2024-02-09 10:52:35 +00:00
Erik Johnston bfa93d1d3b
Only do one concurrent fetch per server in keyring (#16894)
Otherwise, if we've queued up a bunch of requests for a server's keys, we'll
end up needlessly sending lots of concurrent requests for the same keys.
2024-02-09 10:51:11 +00:00
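
A rough illustration of the deduplication this commit introduces (not the actual `ServerKeyFetcher`, which appears later in the diff): collapse the queued key requests down to one fetch per server before issuing any network calls.

```python
import asyncio
from typing import Dict, Iterable

async def fetch_server_keys(server_name: str) -> Dict[str, str]:
    # Stand-in for the real federation key request for a server.
    await asyncio.sleep(0)
    return {f"{server_name}|ed25519:a_key": "<verify key>"}

async def fetch_all(requests: Iterable[str]) -> Dict[str, Dict[str, str]]:
    # However many key requests were queued, do at most one fetch per server.
    servers = sorted(set(requests))
    results = await asyncio.gather(*(fetch_server_keys(s) for s in servers))
    return dict(zip(servers, results))

# asyncio.run(fetch_all(["example.org", "example.org", "matrix.org"]))
# -> keys fetched once each for example.org and matrix.org
```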
17 changed files with 294 additions and 89 deletions

View File

@ -1,3 +1,34 @@
# Synapse 1.102.0rc1 (2024-02-20)
### Features
- A metric was added for emails sent by Synapse, broken down by type: `synapse_emails_sent_total`. Contributed by Remi Rampin. ([\#16881](https://github.com/element-hq/synapse/issues/16881))
### Bugfixes
- Do not send multiple concurrent requests for keys for the same server. ([\#16894](https://github.com/element-hq/synapse/issues/16894))
- Fix performance issue when joining very large rooms that can cause the server to lock up. Introduced in v1.100.0. ([\#16903](https://github.com/element-hq/synapse/issues/16903))
- Always prefer unthreaded receipt when >1 exist ([MSC4102](https://github.com/matrix-org/matrix-spec-proposals/pull/4102)). ([\#16927](https://github.com/element-hq/synapse/issues/16927))
### Improved Documentation
- Fix a small typo in the Rooms section of the Admin API documentation. Contributed by @RainerZufall187. ([\#16857](https://github.com/element-hq/synapse/issues/16857))
### Internal Changes
- Don't invalidate the entire event cache when we purge history. ([\#16905](https://github.com/element-hq/synapse/issues/16905))
- Add experimental config option to not send device list updates for specific users. ([\#16909](https://github.com/element-hq/synapse/issues/16909))
- Fix incorrect docker hub link in release script. ([\#16910](https://github.com/element-hq/synapse/issues/16910))
### Updates to locked dependencies
* Bump attrs from 23.1.0 to 23.2.0. ([\#16899](https://github.com/element-hq/synapse/issues/16899))
* Bump bcrypt from 4.0.1 to 4.1.2. ([\#16900](https://github.com/element-hq/synapse/issues/16900))
* Bump pygithub from 2.1.1 to 2.2.0. ([\#16902](https://github.com/element-hq/synapse/issues/16902))
* Bump sentry-sdk from 1.40.0 to 1.40.3. ([\#16898](https://github.com/element-hq/synapse/issues/16898))
# Synapse 1.101.0 (2024-02-13)
### Bugfixes

6
debian/changelog vendored
View File

@ -1,3 +1,9 @@
matrix-synapse-py3 (1.102.0~rc1) stable; urgency=medium
* New Synapse release 1.102.0rc1.
-- Synapse Packaging team <packages@matrix.org> Tue, 20 Feb 2024 15:50:36 +0000
matrix-synapse-py3 (1.101.0) stable; urgency=medium
* New Synapse release 1.101.0.

View File

@ -913,7 +913,7 @@ With all that being said, if you still want to try and recover the room:
them handle rejoining themselves.
4. If `new_room_user_id` was given, a 'Content Violation' will have been
created. Consider whether you want to delete that roomm.
created. Consider whether you want to delete that room.
# Make Room Admin API

78
poetry.lock generated
View File

@ -46,21 +46,22 @@ wrapt = [
[[package]]
name = "attrs"
version = "23.1.0"
version = "23.2.0"
description = "Classes Without Boilerplate"
optional = false
python-versions = ">=3.7"
files = [
{file = "attrs-23.1.0-py3-none-any.whl", hash = "sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04"},
{file = "attrs-23.1.0.tar.gz", hash = "sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015"},
{file = "attrs-23.2.0-py3-none-any.whl", hash = "sha256:99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1"},
{file = "attrs-23.2.0.tar.gz", hash = "sha256:935dc3b529c262f6cf76e50877d35a4bd3c1de194fd41f47a2b7ae8f19971f30"},
]
[package.extras]
cov = ["attrs[tests]", "coverage[toml] (>=5.3)"]
dev = ["attrs[docs,tests]", "pre-commit"]
dev = ["attrs[tests]", "pre-commit"]
docs = ["furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier", "zope-interface"]
tests = ["attrs[tests-no-zope]", "zope-interface"]
tests-no-zope = ["cloudpickle", "hypothesis", "mypy (>=1.1.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"]
tests-mypy = ["mypy (>=1.6)", "pytest-mypy-plugins"]
tests-no-zope = ["attrs[tests-mypy]", "cloudpickle", "hypothesis", "pympler", "pytest (>=4.3.0)", "pytest-xdist[psutil]"]
[[package]]
name = "authlib"
@ -110,32 +111,38 @@ pytz = {version = ">=2015.7", markers = "python_version < \"3.9\""}
[[package]]
name = "bcrypt"
version = "4.0.1"
version = "4.1.2"
description = "Modern password hashing for your software and your servers"
optional = false
python-versions = ">=3.6"
python-versions = ">=3.7"
files = [
{file = "bcrypt-4.0.1-cp36-abi3-macosx_10_10_universal2.whl", hash = "sha256:b1023030aec778185a6c16cf70f359cbb6e0c289fd564a7cfa29e727a1c38f8f"},
{file = "bcrypt-4.0.1-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.manylinux_2_24_aarch64.whl", hash = "sha256:08d2947c490093a11416df18043c27abe3921558d2c03e2076ccb28a116cb6d0"},
{file = "bcrypt-4.0.1-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0eaa47d4661c326bfc9d08d16debbc4edf78778e6aaba29c1bc7ce67214d4410"},
{file = "bcrypt-4.0.1-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ae88eca3024bb34bb3430f964beab71226e761f51b912de5133470b649d82344"},
{file = "bcrypt-4.0.1-cp36-abi3-manylinux_2_24_x86_64.whl", hash = "sha256:a522427293d77e1c29e303fc282e2d71864579527a04ddcfda6d4f8396c6c36a"},
{file = "bcrypt-4.0.1-cp36-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:fbdaec13c5105f0c4e5c52614d04f0bca5f5af007910daa8b6b12095edaa67b3"},
{file = "bcrypt-4.0.1-cp36-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:ca3204d00d3cb2dfed07f2d74a25f12fc12f73e606fcaa6975d1f7ae69cacbb2"},
{file = "bcrypt-4.0.1-cp36-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:089098effa1bc35dc055366740a067a2fc76987e8ec75349eb9484061c54f535"},
{file = "bcrypt-4.0.1-cp36-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:e9a51bbfe7e9802b5f3508687758b564069ba937748ad7b9e890086290d2f79e"},
{file = "bcrypt-4.0.1-cp36-abi3-win32.whl", hash = "sha256:2caffdae059e06ac23fce178d31b4a702f2a3264c20bfb5ff541b338194d8fab"},
{file = "bcrypt-4.0.1-cp36-abi3-win_amd64.whl", hash = "sha256:8a68f4341daf7522fe8d73874de8906f3a339048ba406be6ddc1b3ccb16fc0d9"},
{file = "bcrypt-4.0.1-pp37-pypy37_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bf4fa8b2ca74381bb5442c089350f09a3f17797829d958fad058d6e44d9eb83c"},
{file = "bcrypt-4.0.1-pp37-pypy37_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:67a97e1c405b24f19d08890e7ae0c4f7ce1e56a712a016746c8b2d7732d65d4b"},
{file = "bcrypt-4.0.1-pp37-pypy37_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:b3b85202d95dd568efcb35b53936c5e3b3600c7cdcc6115ba461df3a8e89f38d"},
{file = "bcrypt-4.0.1-pp38-pypy38_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cbb03eec97496166b704ed663a53680ab57c5084b2fc98ef23291987b525cb7d"},
{file = "bcrypt-4.0.1-pp38-pypy38_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:5ad4d32a28b80c5fa6671ccfb43676e8c1cc232887759d1cd7b6f56ea4355215"},
{file = "bcrypt-4.0.1-pp38-pypy38_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:b57adba8a1444faf784394de3436233728a1ecaeb6e07e8c22c8848f179b893c"},
{file = "bcrypt-4.0.1-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:705b2cea8a9ed3d55b4491887ceadb0106acf7c6387699fca771af56b1cdeeda"},
{file = "bcrypt-4.0.1-pp39-pypy39_pp73-manylinux_2_24_x86_64.whl", hash = "sha256:2b3ac11cf45161628f1f3733263e63194f22664bf4d0c0f3ab34099c02134665"},
{file = "bcrypt-4.0.1-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:3100851841186c25f127731b9fa11909ab7b1df6fc4b9f8353f4f1fd952fbf71"},
{file = "bcrypt-4.0.1.tar.gz", hash = "sha256:27d375903ac8261cfe4047f6709d16f7d18d39b1ec92aaf72af989552a650ebd"},
{file = "bcrypt-4.1.2-cp37-abi3-macosx_10_12_universal2.whl", hash = "sha256:ac621c093edb28200728a9cca214d7e838529e557027ef0581685909acd28b5e"},
{file = "bcrypt-4.1.2-cp37-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ea505c97a5c465ab8c3ba75c0805a102ce526695cd6818c6de3b1a38f6f60da1"},
{file = "bcrypt-4.1.2-cp37-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:57fa9442758da926ed33a91644649d3e340a71e2d0a5a8de064fb621fd5a3326"},
{file = "bcrypt-4.1.2-cp37-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:eb3bd3321517916696233b5e0c67fd7d6281f0ef48e66812db35fc963a422a1c"},
{file = "bcrypt-4.1.2-cp37-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:6cad43d8c63f34b26aef462b6f5e44fdcf9860b723d2453b5d391258c4c8e966"},
{file = "bcrypt-4.1.2-cp37-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:44290ccc827d3a24604f2c8bcd00d0da349e336e6503656cb8192133e27335e2"},
{file = "bcrypt-4.1.2-cp37-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:732b3920a08eacf12f93e6b04ea276c489f1c8fb49344f564cca2adb663b3e4c"},
{file = "bcrypt-4.1.2-cp37-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:1c28973decf4e0e69cee78c68e30a523be441972c826703bb93099868a8ff5b5"},
{file = "bcrypt-4.1.2-cp37-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:b8df79979c5bae07f1db22dcc49cc5bccf08a0380ca5c6f391cbb5790355c0b0"},
{file = "bcrypt-4.1.2-cp37-abi3-win32.whl", hash = "sha256:fbe188b878313d01b7718390f31528be4010fed1faa798c5a1d0469c9c48c369"},
{file = "bcrypt-4.1.2-cp37-abi3-win_amd64.whl", hash = "sha256:9800ae5bd5077b13725e2e3934aa3c9c37e49d3ea3d06318010aa40f54c63551"},
{file = "bcrypt-4.1.2-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:71b8be82bc46cedd61a9f4ccb6c1a493211d031415a34adde3669ee1b0afbb63"},
{file = "bcrypt-4.1.2-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:68e3c6642077b0c8092580c819c1684161262b2e30c4f45deb000c38947bf483"},
{file = "bcrypt-4.1.2-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:387e7e1af9a4dd636b9505a465032f2f5cb8e61ba1120e79a0e1cd0b512f3dfc"},
{file = "bcrypt-4.1.2-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f70d9c61f9c4ca7d57f3bfe88a5ccf62546ffbadf3681bb1e268d9d2e41c91a7"},
{file = "bcrypt-4.1.2-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:2a298db2a8ab20056120b45e86c00a0a5eb50ec4075b6142db35f593b97cb3fb"},
{file = "bcrypt-4.1.2-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:ba55e40de38a24e2d78d34c2d36d6e864f93e0d79d0b6ce915e4335aa81d01b1"},
{file = "bcrypt-4.1.2-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:3566a88234e8de2ccae31968127b0ecccbb4cddb629da744165db72b58d88ca4"},
{file = "bcrypt-4.1.2-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:b90e216dc36864ae7132cb151ffe95155a37a14e0de3a8f64b49655dd959ff9c"},
{file = "bcrypt-4.1.2-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:69057b9fc5093ea1ab00dd24ede891f3e5e65bee040395fb1e66ee196f9c9b4a"},
{file = "bcrypt-4.1.2-cp39-abi3-win32.whl", hash = "sha256:02d9ef8915f72dd6daaef40e0baeef8a017ce624369f09754baf32bb32dba25f"},
{file = "bcrypt-4.1.2-cp39-abi3-win_amd64.whl", hash = "sha256:be3ab1071662f6065899fe08428e45c16aa36e28bc42921c4901a191fda6ee42"},
{file = "bcrypt-4.1.2-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d75fc8cd0ba23f97bae88a6ec04e9e5351ff3c6ad06f38fe32ba50cbd0d11946"},
{file = "bcrypt-4.1.2-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:a97e07e83e3262599434816f631cc4c7ca2aa8e9c072c1b1a7fec2ae809a1d2d"},
{file = "bcrypt-4.1.2-pp39-pypy39_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:e51c42750b7585cee7892c2614be0d14107fad9581d1738d954a262556dd1aab"},
{file = "bcrypt-4.1.2-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:ba4e4cc26610581a6329b3937e02d319f5ad4b85b074846bf4fef8a8cf51e7bb"},
{file = "bcrypt-4.1.2.tar.gz", hash = "sha256:33313a1200a3ae90b75587ceac502b048b840fc69e7f7a0905b5f87fac7a1258"},
]
[package.extras]
@ -1959,20 +1966,19 @@ typing-extensions = ">=4.6.0,<4.7.0 || >4.7.0"
[[package]]
name = "pygithub"
version = "2.1.1"
version = "2.2.0"
description = "Use the full Github API v3"
optional = false
python-versions = ">=3.7"
files = [
{file = "PyGithub-2.1.1-py3-none-any.whl", hash = "sha256:4b528d5d6f35e991ea5fd3f942f58748f24938805cb7fcf24486546637917337"},
{file = "PyGithub-2.1.1.tar.gz", hash = "sha256:ecf12c2809c44147bce63b047b3d2e9dac8a41b63e90fcb263c703f64936b97c"},
{file = "PyGithub-2.2.0-py3-none-any.whl", hash = "sha256:41042ea53e4c372219db708c38d2ca1fd4fadab75475bac27d89d339596cfad1"},
{file = "PyGithub-2.2.0.tar.gz", hash = "sha256:e39be7c4dc39418bdd6e3ecab5931c636170b8b21b4d26f9ecf7e6102a3b51c3"},
]
[package.dependencies]
Deprecated = "*"
pyjwt = {version = ">=2.4.0", extras = ["crypto"]}
pynacl = ">=1.4.0"
python-dateutil = "*"
requests = ">=2.14.0"
typing-extensions = ">=4.0.0"
urllib3 = ">=1.26.0"
@ -2119,7 +2125,7 @@ s2repoze = ["paste", "repoze.who", "zope.interface"]
name = "python-dateutil"
version = "2.8.2"
description = "Extensions to the standard Python datetime module"
optional = false
optional = true
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
files = [
{file = "python-dateutil-2.8.2.tar.gz", hash = "sha256:0123cacc1627ae19ddf3c27a5de5bd67ee4586fbdd6440d9748f8abb483d3e86"},
@ -2477,13 +2483,13 @@ doc = ["Sphinx", "sphinx-rtd-theme"]
[[package]]
name = "sentry-sdk"
version = "1.40.0"
version = "1.40.3"
description = "Python client for Sentry (https://sentry.io)"
optional = true
python-versions = "*"
files = [
{file = "sentry-sdk-1.40.0.tar.gz", hash = "sha256:34ad8cfc9b877aaa2a8eb86bfe5296a467fffe0619b931a05b181c45f6da59bf"},
{file = "sentry_sdk-1.40.0-py2.py3-none-any.whl", hash = "sha256:78575620331186d32f34b7ece6edea97ce751f58df822547d3ab85517881a27a"},
{file = "sentry-sdk-1.40.3.tar.gz", hash = "sha256:3c2b027979bb400cd65a47970e64f8cef8acda86b288a27f42a98692505086cd"},
{file = "sentry_sdk-1.40.3-py2.py3-none-any.whl", hash = "sha256:73383f28311ae55602bb6cc3b013830811135ba5521e41333a6e68f269413502"},
]
[package.dependencies]

View File

@ -96,7 +96,7 @@ module-name = "synapse.synapse_rust"
[tool.poetry]
name = "matrix-synapse"
version = "1.101.0"
version = "1.102.0rc1"
description = "Homeserver for the Matrix decentralised comms protocol"
authors = ["Matrix.org Team and Contributors <packages@matrix.org>"]
license = "AGPL-3.0-or-later"

View File

@ -1,37 +1,43 @@
annotated-types==0.5.0 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:47cdc3490d9ac1506ce92c7aaa76c579dc3509ff11e098fc867e5130ab7be802 \
--hash=sha256:58da39888f92c276ad970249761ebea80ba544b77acddaa1a4d6cf78287d45fd
attrs==23.1.0 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:1f28b4522cdc2fb4256ac1a020c78acf9cba2c6b461ccd2c126f3aa8e8335d04 \
--hash=sha256:6279836d581513a26f1bf235f9acd333bc9115683f14f7e8fae46c98fc50e015
attrs==23.2.0 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:935dc3b529c262f6cf76e50877d35a4bd3c1de194fd41f47a2b7ae8f19971f30 \
--hash=sha256:99b87a485a5820b23b879f04c2305b44b951b502fd64be915879d77a7e8fc6f1
authlib==1.3.0 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:959ea62a5b7b5123c5059758296122b57cd2585ae2ed1c0622c21b371ffdae06 \
--hash=sha256:9637e4de1fb498310a56900b3e2043a206b03cb11c05422014b0302cbc814be3
automat==22.10.0 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:c3164f8742b9dc440f3682482d32aaff7bb53f71740dd018533f9de286b64180 \
--hash=sha256:e56beb84edad19dcc11d30e8d9b895f75deeb5ef5e96b84a467066b3b84bb04e
bcrypt==4.0.1 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:089098effa1bc35dc055366740a067a2fc76987e8ec75349eb9484061c54f535 \
--hash=sha256:08d2947c490093a11416df18043c27abe3921558d2c03e2076ccb28a116cb6d0 \
--hash=sha256:0eaa47d4661c326bfc9d08d16debbc4edf78778e6aaba29c1bc7ce67214d4410 \
--hash=sha256:27d375903ac8261cfe4047f6709d16f7d18d39b1ec92aaf72af989552a650ebd \
--hash=sha256:2b3ac11cf45161628f1f3733263e63194f22664bf4d0c0f3ab34099c02134665 \
--hash=sha256:2caffdae059e06ac23fce178d31b4a702f2a3264c20bfb5ff541b338194d8fab \
--hash=sha256:3100851841186c25f127731b9fa11909ab7b1df6fc4b9f8353f4f1fd952fbf71 \
--hash=sha256:5ad4d32a28b80c5fa6671ccfb43676e8c1cc232887759d1cd7b6f56ea4355215 \
--hash=sha256:67a97e1c405b24f19d08890e7ae0c4f7ce1e56a712a016746c8b2d7732d65d4b \
--hash=sha256:705b2cea8a9ed3d55b4491887ceadb0106acf7c6387699fca771af56b1cdeeda \
--hash=sha256:8a68f4341daf7522fe8d73874de8906f3a339048ba406be6ddc1b3ccb16fc0d9 \
--hash=sha256:a522427293d77e1c29e303fc282e2d71864579527a04ddcfda6d4f8396c6c36a \
--hash=sha256:ae88eca3024bb34bb3430f964beab71226e761f51b912de5133470b649d82344 \
--hash=sha256:b1023030aec778185a6c16cf70f359cbb6e0c289fd564a7cfa29e727a1c38f8f \
--hash=sha256:b3b85202d95dd568efcb35b53936c5e3b3600c7cdcc6115ba461df3a8e89f38d \
--hash=sha256:b57adba8a1444faf784394de3436233728a1ecaeb6e07e8c22c8848f179b893c \
--hash=sha256:bf4fa8b2ca74381bb5442c089350f09a3f17797829d958fad058d6e44d9eb83c \
--hash=sha256:ca3204d00d3cb2dfed07f2d74a25f12fc12f73e606fcaa6975d1f7ae69cacbb2 \
--hash=sha256:cbb03eec97496166b704ed663a53680ab57c5084b2fc98ef23291987b525cb7d \
--hash=sha256:e9a51bbfe7e9802b5f3508687758b564069ba937748ad7b9e890086290d2f79e \
--hash=sha256:fbdaec13c5105f0c4e5c52614d04f0bca5f5af007910daa8b6b12095edaa67b3
bcrypt==4.1.2 ; python_full_version >= "3.8.0" and python_full_version < "4.0.0" \
--hash=sha256:02d9ef8915f72dd6daaef40e0baeef8a017ce624369f09754baf32bb32dba25f \
--hash=sha256:1c28973decf4e0e69cee78c68e30a523be441972c826703bb93099868a8ff5b5 \
--hash=sha256:2a298db2a8ab20056120b45e86c00a0a5eb50ec4075b6142db35f593b97cb3fb \
--hash=sha256:33313a1200a3ae90b75587ceac502b048b840fc69e7f7a0905b5f87fac7a1258 \
--hash=sha256:3566a88234e8de2ccae31968127b0ecccbb4cddb629da744165db72b58d88ca4 \
--hash=sha256:387e7e1af9a4dd636b9505a465032f2f5cb8e61ba1120e79a0e1cd0b512f3dfc \
--hash=sha256:44290ccc827d3a24604f2c8bcd00d0da349e336e6503656cb8192133e27335e2 \
--hash=sha256:57fa9442758da926ed33a91644649d3e340a71e2d0a5a8de064fb621fd5a3326 \
--hash=sha256:68e3c6642077b0c8092580c819c1684161262b2e30c4f45deb000c38947bf483 \
--hash=sha256:69057b9fc5093ea1ab00dd24ede891f3e5e65bee040395fb1e66ee196f9c9b4a \
--hash=sha256:6cad43d8c63f34b26aef462b6f5e44fdcf9860b723d2453b5d391258c4c8e966 \
--hash=sha256:71b8be82bc46cedd61a9f4ccb6c1a493211d031415a34adde3669ee1b0afbb63 \
--hash=sha256:732b3920a08eacf12f93e6b04ea276c489f1c8fb49344f564cca2adb663b3e4c \
--hash=sha256:9800ae5bd5077b13725e2e3934aa3c9c37e49d3ea3d06318010aa40f54c63551 \
--hash=sha256:a97e07e83e3262599434816f631cc4c7ca2aa8e9c072c1b1a7fec2ae809a1d2d \
--hash=sha256:ac621c093edb28200728a9cca214d7e838529e557027ef0581685909acd28b5e \
--hash=sha256:b8df79979c5bae07f1db22dcc49cc5bccf08a0380ca5c6f391cbb5790355c0b0 \
--hash=sha256:b90e216dc36864ae7132cb151ffe95155a37a14e0de3a8f64b49655dd959ff9c \
--hash=sha256:ba4e4cc26610581a6329b3937e02d319f5ad4b85b074846bf4fef8a8cf51e7bb \
--hash=sha256:ba55e40de38a24e2d78d34c2d36d6e864f93e0d79d0b6ce915e4335aa81d01b1 \
--hash=sha256:be3ab1071662f6065899fe08428e45c16aa36e28bc42921c4901a191fda6ee42 \
--hash=sha256:d75fc8cd0ba23f97bae88a6ec04e9e5351ff3c6ad06f38fe32ba50cbd0d11946 \
--hash=sha256:e51c42750b7585cee7892c2614be0d14107fad9581d1738d954a262556dd1aab \
--hash=sha256:ea505c97a5c465ab8c3ba75c0805a102ce526695cd6818c6de3b1a38f6f60da1 \
--hash=sha256:eb3bd3321517916696233b5e0c67fd7d6281f0ef48e66812db35fc963a422a1c \
--hash=sha256:f70d9c61f9c4ca7d57f3bfe88a5ccf62546ffbadf3681bb1e268d9d2e41c91a7 \
--hash=sha256:fbe188b878313d01b7718390f31528be4010fed1faa798c5a1d0469c9c48c369
bleach==6.1.0 ; python_version >= "3.8" and python_full_version < "4.0.0" \
--hash=sha256:0a31f1837963c41d46bbf1331b8778e1308ea0791db03cc4e7357b97cf42a8fe \
--hash=sha256:3225f354cfc436b9789c66c4ee030194bee0568fbf9cbdad3bc8b5c26c5f12b6

View File

@ -660,7 +660,7 @@ def _announce() -> None:
Hi everyone. Synapse {current_version} has just been released.
[notes](https://github.com/element-hq/synapse/releases/tag/{tag_name}) | \
[docker](https://hub.docker.com/r/vectorim/synapse/tags?name={tag_name}) | \
[docker](https://hub.docker.com/r/matrixdotorg/synapse/tags?name={tag_name}) | \
[debs](https://packages.matrix.org/debian/) | \
[pypi](https://pypi.org/project/matrix-synapse/{current_version}/)"""
)

View File

@ -237,6 +237,14 @@ class RegistrationConfig(Config):
self.inhibit_user_in_use_error = config.get("inhibit_user_in_use_error", False)
# List of user IDs not to send out device list updates for when they
# register new devices. This is useful to handle bot accounts.
#
# Note: This will still send out device list updates if the device is
# later updated, e.g. end to end keys are added.
dont_notify_new_devices_for = config.get("dont_notify_new_devices_for", [])
self.dont_notify_new_devices_for = frozenset(dont_notify_new_devices_for)
def generate_config_section(
self, generate_secrets: bool = False, **kwargs: Any
) -> str:

View File

@ -839,11 +839,12 @@ class ServerKeyFetcher(BaseV2KeyFetcher):
Map from server_name -> key_id -> FetchKeyResult
"""
# We only need to do one request per server.
servers_to_fetch = {k.server_name for k in keys_to_fetch}
results = {}
async def get_keys(key_to_fetch_item: _FetchKeyRequest) -> None:
server_name = key_to_fetch_item.server_name
async def get_keys(server_name: str) -> None:
try:
keys = await self.get_server_verify_keys_v2_direct(server_name)
results[server_name] = keys
@ -852,7 +853,7 @@ class ServerKeyFetcher(BaseV2KeyFetcher):
except Exception:
logger.exception("Error getting keys from %s", server_name)
await yieldable_gather_results(get_keys, keys_to_fetch)
await yieldable_gather_results(get_keys, servers_to_fetch)
return results
async def get_server_verify_keys_v2_direct(

View File

@ -429,6 +429,10 @@ class DeviceHandler(DeviceWorkerHandler):
self._storage_controllers = hs.get_storage_controllers()
self.db_pool = hs.get_datastores().main.db_pool
self._dont_notify_new_devices_for = (
hs.config.registration.dont_notify_new_devices_for
)
self.device_list_updater = DeviceListUpdater(hs, self)
federation_registry = hs.get_federation_registry()
@ -505,6 +509,9 @@ class DeviceHandler(DeviceWorkerHandler):
self._check_device_name_length(initial_device_display_name)
# Check if we should send out device lists updates for this new device.
notify = user_id not in self._dont_notify_new_devices_for
if device_id is not None:
new_device = await self.store.store_device(
user_id=user_id,
@ -514,7 +521,8 @@ class DeviceHandler(DeviceWorkerHandler):
auth_provider_session_id=auth_provider_session_id,
)
if new_device:
await self.notify_device_update(user_id, [device_id])
if notify:
await self.notify_device_update(user_id, [device_id])
return device_id
# if the device id is not specified, we'll autogen one, but loop a few
@ -530,7 +538,8 @@ class DeviceHandler(DeviceWorkerHandler):
auth_provider_session_id=auth_provider_session_id,
)
if new_device:
await self.notify_device_update(user_id, [new_device_id])
if notify:
await self.notify_device_update(user_id, [new_device_id])
return new_device_id
attempts += 1

View File

@ -1757,17 +1757,25 @@ class FederationEventHandler:
events_and_contexts_to_persist.append((event, context))
for event in sorted_auth_events:
for i, event in enumerate(sorted_auth_events):
await prep(event)
await self.persist_events_and_notify(
room_id,
events_and_contexts_to_persist,
# Mark these events backfilled as they're historic events that will
# eventually be backfilled. For example, missing events we fetch
# during backfill should be marked as backfilled as well.
backfilled=True,
)
# The above function is typically not async, and so won't yield to
# the reactor. For large rooms let's yield to the reactor
# occasionally to ensure we don't block other work.
if (i + 1) % 1000 == 0:
await self._clock.sleep(0)
# Also persist the new event in batches for similar reasons as above.
for batch in batch_iter(events_and_contexts_to_persist, 1000):
await self.persist_events_and_notify(
room_id,
batch,
# Mark these events as backfilled as they're historic events that will
# eventually be backfilled. For example, missing events we fetch
# during backfill should be marked as backfilled as well.
backfilled=True,
)
@trace
async def _check_event_auth(

View File

@ -26,6 +26,7 @@ from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, TypeVar
import bleach
import jinja2
from markupsafe import Markup
from prometheus_client import Counter
from synapse.api.constants import EventTypes, Membership, RoomTypes
from synapse.api.errors import StoreError
@ -56,6 +57,12 @@ logger = logging.getLogger(__name__)
T = TypeVar("T")
emails_sent_counter = Counter(
"synapse_emails_sent_total",
"Emails sent by type",
["type"],
)
CONTEXT_BEFORE = 1
CONTEXT_AFTER = 1
@ -130,6 +137,8 @@ class Mailer:
logger.info("Created Mailer for app_name %s" % app_name)
emails_sent_counter.labels("password_reset")
async def send_password_reset_mail(
self, email_address: str, token: str, client_secret: str, sid: str
) -> None:
@ -153,6 +162,8 @@ class Mailer:
template_vars: TemplateVars = {"link": link}
emails_sent_counter.labels("password_reset").inc()
await self.send_email(
email_address,
self.email_subjects.password_reset
@ -160,6 +171,8 @@ class Mailer:
template_vars,
)
emails_sent_counter.labels("registration")
async def send_registration_mail(
self, email_address: str, token: str, client_secret: str, sid: str
) -> None:
@ -183,6 +196,8 @@ class Mailer:
template_vars: TemplateVars = {"link": link}
emails_sent_counter.labels("registration").inc()
await self.send_email(
email_address,
self.email_subjects.email_validation
@ -190,6 +205,8 @@ class Mailer:
template_vars,
)
emails_sent_counter.labels("add_threepid")
async def send_add_threepid_mail(
self, email_address: str, token: str, client_secret: str, sid: str
) -> None:
@ -214,6 +231,8 @@ class Mailer:
template_vars: TemplateVars = {"link": link}
emails_sent_counter.labels("add_threepid").inc()
await self.send_email(
email_address,
self.email_subjects.email_validation
@ -221,6 +240,8 @@ class Mailer:
template_vars,
)
emails_sent_counter.labels("notification")
async def send_notification_mail(
self,
app_id: str,
@ -315,6 +336,8 @@ class Mailer:
"reason": reason,
}
emails_sent_counter.labels("notification").inc()
await self.send_email(
email_address, summary_text, template_vars, unsubscribe_link
)

View File

@ -373,7 +373,7 @@ class CacheInvalidationWorkerStore(SQLBaseStore):
deleted.
"""
self._invalidate_local_get_event_cache_all() # type: ignore[attr-defined]
self._invalidate_local_get_event_cache_room_id(room_id) # type: ignore[attr-defined]
self._attempt_to_invalidate_cache("have_seen_event", (room_id,))
self._attempt_to_invalidate_cache("get_latest_event_ids_in_room", (room_id,))

View File

@ -268,6 +268,8 @@ class EventsWorkerStore(SQLBaseStore):
] = AsyncLruCache(
cache_name="*getEvent*",
max_size=hs.config.caches.event_cache_size,
# `extra_index_cb` Returns a tuple as that is the key type
extra_index_cb=lambda _, v: (v.event.room_id,),
)
# Map from event ID to a deferred that will result in a map from event
@ -782,9 +784,9 @@ class EventsWorkerStore(SQLBaseStore):
if missing_events_ids:
async def get_missing_events_from_cache_or_db() -> Dict[
str, EventCacheEntry
]:
async def get_missing_events_from_cache_or_db() -> (
Dict[str, EventCacheEntry]
):
"""Fetches the events in `missing_event_ids` from the database.
Also creates entries in `self._current_event_fetches` to allow
@ -910,12 +912,12 @@ class EventsWorkerStore(SQLBaseStore):
self._event_ref.pop(event_id, None)
self._current_event_fetches.pop(event_id, None)
def _invalidate_local_get_event_cache_all(self) -> None:
"""Clears the in-memory get event caches.
def _invalidate_local_get_event_cache_room_id(self, room_id: str) -> None:
"""Clears the in-memory get event caches for a room.
Used when we purge room history.
"""
self._get_event_cache.clear()
self._get_event_cache.invalidate_on_extra_index_local((room_id,))
self._event_ref.clear()
self._current_event_fetches.clear()

View File

@ -472,9 +472,24 @@ class ReceiptsWorkerStore(SQLBaseStore):
event_entry = room_event["content"].setdefault(event_id, {})
receipt_type_dict = event_entry.setdefault(receipt_type, {})
receipt_type_dict[user_id] = db_to_json(data)
if thread_id:
receipt_type_dict[user_id]["thread_id"] = thread_id
# MSC4102: always replace threaded receipts with unthreaded ones if there is a clash.
# Specifically:
# - if there is no existing receipt, great, set the data.
# - if there is an existing receipt, is it threaded (thread_id present)?
# YES: replace if this receipt has no thread id. NO: do not replace.
# This means we will drop some receipts, but MSC4102 is designed to drop semantically
# meaningless receipts, so this is okay. Previously, we would drop meaningful data!
receipt_data = db_to_json(data)
if user_id in receipt_type_dict: # existing receipt
# is the existing receipt threaded and we are currently processing an unthreaded one?
if "thread_id" in receipt_type_dict[user_id] and not thread_id:
receipt_type_dict[
user_id
] = receipt_data # replace with unthreaded one
else: # receipt does not exist, just set it
receipt_type_dict[user_id] = receipt_data
if thread_id:
receipt_type_dict[user_id]["thread_id"] = thread_id
results = {
room_id: [results[room_id]] if room_id in results else []

View File

@ -35,6 +35,7 @@ from typing import (
Iterable,
List,
Optional,
Set,
Tuple,
Type,
TypeVar,
@ -386,6 +387,7 @@ class LruCache(Generic[KT, VT]):
apply_cache_factor_from_config: bool = True,
clock: Optional[Clock] = None,
prune_unread_entries: bool = True,
extra_index_cb: Optional[Callable[[KT, VT], KT]] = None,
):
"""
Args:
@ -416,6 +418,20 @@ class LruCache(Generic[KT, VT]):
prune_unread_entries: If True, cache entries that haven't been read recently
will be evicted from the cache in the background. Set to False to
opt-out of this behaviour.
extra_index_cb: If provided, the cache keeps a second index from a
(different) key to a cache entry based on the return value of
the callback. This can then be used to invalidate entries based
on the second type of key.
For example, for the event cache this would be a callback that
maps an event to its room ID, allowing invalidation of all
events in a given room.
Note: Though the two types of key have the same type, they are
in different namespaces.
Note: The new key does not have to be unique.
"""
# Default `clock` to something sensible. Note that we rename it to
# `real_clock` so that mypy doesn't think its still `Optional`.
@ -463,6 +479,8 @@ class LruCache(Generic[KT, VT]):
lock = threading.Lock()
extra_index: Dict[KT, Set[KT]] = {}
def evict() -> None:
while cache_len() > self.max_size:
# Get the last node in the list (i.e. the oldest node).
@ -521,6 +539,11 @@ class LruCache(Generic[KT, VT]):
if size_callback:
cached_cache_len[0] += size_callback(node.value)
if extra_index_cb:
index_key = extra_index_cb(node.key, node.value)
mapped_keys = extra_index.setdefault(index_key, set())
mapped_keys.add(node.key)
if caches.TRACK_MEMORY_USAGE and metrics:
metrics.inc_memory_usage(node.memory)
@ -537,6 +560,14 @@ class LruCache(Generic[KT, VT]):
node.run_and_clear_callbacks()
if extra_index_cb:
index_key = extra_index_cb(node.key, node.value)
mapped_keys = extra_index.get(index_key)
if mapped_keys is not None:
mapped_keys.discard(node.key)
if not mapped_keys:
extra_index.pop(index_key, None)
if caches.TRACK_MEMORY_USAGE and metrics:
metrics.dec_memory_usage(node.memory)
@ -748,6 +779,8 @@ class LruCache(Generic[KT, VT]):
if size_callback:
cached_cache_len[0] = 0
extra_index.clear()
if caches.TRACK_MEMORY_USAGE and metrics:
metrics.clear_memory_usage()
@ -755,6 +788,28 @@ class LruCache(Generic[KT, VT]):
def cache_contains(key: KT) -> bool:
return key in cache
@synchronized
def cache_invalidate_on_extra_index(index_key: KT) -> None:
"""Invalidates all entries that match the given extra index key.
This can only be called when `extra_index_cb` was specified.
"""
assert extra_index_cb is not None
keys = extra_index.pop(index_key, None)
if not keys:
return
for key in keys:
node = cache.pop(key, None)
if not node:
continue
evicted_len = delete_node(node)
if metrics:
metrics.inc_evictions(EvictionReason.invalidation, evicted_len)
# make sure that we clear out any excess entries after we get resized.
self._on_resize = evict
@ -771,6 +826,7 @@ class LruCache(Generic[KT, VT]):
self.len = synchronized(cache_len)
self.contains = cache_contains
self.clear = cache_clear
self.invalidate_on_extra_index = cache_invalidate_on_extra_index
def __getitem__(self, key: KT) -> VT:
result = self.get(key, _Sentinel.sentinel)
@ -864,6 +920,9 @@ class AsyncLruCache(Generic[KT, VT]):
# This method should invalidate any external cache and then invalidate the LruCache.
return self._lru_cache.invalidate(key)
def invalidate_on_extra_index_local(self, index_key: KT) -> None:
self._lru_cache.invalidate_on_extra_index(index_key)
def invalidate_local(self, key: KT) -> None:
"""Remove an entry from the local cache

View File

@ -383,3 +383,34 @@ class MemoryEvictionTestCase(unittest.HomeserverTestCase):
# the items should still be in the cache
self.assertEqual(cache.get("key1"), 1)
self.assertEqual(cache.get("key2"), 2)
class ExtraIndexLruCacheTestCase(unittest.HomeserverTestCase):
def test_invalidate_simple(self) -> None:
cache: LruCache[str, int] = LruCache(10, extra_index_cb=lambda k, v: str(v))
cache["key1"] = 1
cache["key2"] = 2
cache.invalidate_on_extra_index("key1")
self.assertEqual(cache.get("key1"), 1)
self.assertEqual(cache.get("key2"), 2)
cache.invalidate_on_extra_index("1")
self.assertEqual(cache.get("key1"), None)
self.assertEqual(cache.get("key2"), 2)
def test_invalidate_multi(self) -> None:
cache: LruCache[str, int] = LruCache(10, extra_index_cb=lambda k, v: str(v))
cache["key1"] = 1
cache["key2"] = 1
cache["key3"] = 2
cache.invalidate_on_extra_index("key1")
self.assertEqual(cache.get("key1"), 1)
self.assertEqual(cache.get("key2"), 1)
self.assertEqual(cache.get("key3"), 2)
cache.invalidate_on_extra_index("1")
self.assertEqual(cache.get("key1"), None)
self.assertEqual(cache.get("key2"), None)
self.assertEqual(cache.get("key3"), 2)