Merge remote-tracking branch 'upstream/release-v1.27.0'

commit 7f7fb9b566

146 changed files with 4893 additions and 1484 deletions
@@ -10,4 +10,7 @@ apt-get install -y python3.5 python3.5-dev python3-pip libxml2-dev libxslt-dev x
export LANG="C.UTF-8"

+# Prevent virtualenv from auto-updating pip to an incompatible version
+export VIRTUALENV_NO_DOWNLOAD=1
+
exec tox -e py35-old,combine

76 CHANGES.md
@@ -1,3 +1,79 @@
Synapse 1.27.0rc1 (2021-02-02)
==============================

Note that this release includes a change in Synapse to use Redis as a cache ─ as well as a pub/sub mechanism ─ if Redis support is enabled. No action is needed by server administrators, and we do not expect resource usage of the Redis instance to change dramatically.

This release also changes the callback URI for OpenID Connect (OIDC) identity providers. If your server is configured to use single sign-on via an OIDC/OAuth2 IdP, you may need to make configuration changes. Please review [UPGRADE.rst](UPGRADE.rst) for more details on these changes.

This release also changes escaping of variables in the HTML templates for SSO or email notifications. If you have customised these templates, please review [UPGRADE.rst](UPGRADE.rst) for more details on these changes.


Features
--------

- Add an admin API for getting and deleting forward extremities for a room. ([\#9062](https://github.com/matrix-org/synapse/issues/9062))
- Add an admin API for retrieving the current room state of a room. ([\#9168](https://github.com/matrix-org/synapse/issues/9168))
- Add experimental support for allowing clients to pick an SSO Identity Provider ([MSC2858](https://github.com/matrix-org/matrix-doc/pull/2858)). ([\#9183](https://github.com/matrix-org/synapse/issues/9183), [\#9242](https://github.com/matrix-org/synapse/issues/9242))
- Add an admin API endpoint for shadow-banning users. ([\#9209](https://github.com/matrix-org/synapse/issues/9209))
- Add ratelimits to the 3PID `/requestToken` APIs. ([\#9238](https://github.com/matrix-org/synapse/issues/9238))
- Add support to the OpenID Connect integration for adding the user's email address. ([\#9245](https://github.com/matrix-org/synapse/issues/9245))
- Add ratelimits to invites in rooms and to specific users. ([\#9258](https://github.com/matrix-org/synapse/issues/9258))
- Improve the user experience of setting up an account via single sign-on. ([\#9262](https://github.com/matrix-org/synapse/issues/9262), [\#9272](https://github.com/matrix-org/synapse/issues/9272), [\#9275](https://github.com/matrix-org/synapse/issues/9275), [\#9276](https://github.com/matrix-org/synapse/issues/9276), [\#9277](https://github.com/matrix-org/synapse/issues/9277), [\#9286](https://github.com/matrix-org/synapse/issues/9286), [\#9287](https://github.com/matrix-org/synapse/issues/9287))
- Add phone home stats for encrypted messages. ([\#9283](https://github.com/matrix-org/synapse/issues/9283))
- Update the redirect URI for OIDC authentication. ([\#9288](https://github.com/matrix-org/synapse/issues/9288))


Bugfixes
--------

- Fix spurious errors in logs when deleting a non-existent pusher. ([\#9121](https://github.com/matrix-org/synapse/issues/9121))
- Fix a long-standing bug where Synapse would return a 500 error when a thumbnail did not exist (and auto-generation of thumbnails was not enabled). ([\#9163](https://github.com/matrix-org/synapse/issues/9163))
- Fix a long-standing bug where an internal server error was raised when attempting to preview an HTML document in an unknown character encoding. ([\#9164](https://github.com/matrix-org/synapse/issues/9164))
- Fix a long-standing bug where invalid data could cause errors when calculating the presentable room name for push. ([\#9165](https://github.com/matrix-org/synapse/issues/9165))
- Fix a bug where we sometimes didn't detect that Redis connections had died, causing workers to not see new data. ([\#9218](https://github.com/matrix-org/synapse/issues/9218))
- Fix a bug where `None` was passed to Synapse modules instead of an empty dictionary if an empty module `config` block was provided in the homeserver config. ([\#9229](https://github.com/matrix-org/synapse/issues/9229))
- Fix a bug in the `make_room_admin` admin API where it failed if the admin with the greatest power level was not in the room. Contributed by Pankaj Yadav. ([\#9235](https://github.com/matrix-org/synapse/issues/9235))
- Prevent password hashes from getting dropped if a client failed threepid validation during a User Interactive Auth stage. Removes a workaround for an ancient bug in Riot Web <v0.7.4. ([\#9265](https://github.com/matrix-org/synapse/issues/9265))
- Fix single sign-on when the endpoints are routed to Synapse workers. ([\#9271](https://github.com/matrix-org/synapse/issues/9271))


Improved Documentation
----------------------

- Add docs for using Gitea as an OpenID provider. ([\#9134](https://github.com/matrix-org/synapse/issues/9134))
- Add a link to the Matrix VoIP tester to the TURN howto. ([\#9135](https://github.com/matrix-org/synapse/issues/9135))
- Add notes on integrating with Facebook for SSO login. ([\#9244](https://github.com/matrix-org/synapse/issues/9244))


Deprecations and Removals
-------------------------

- The `service_url` parameter in `cas_config` is deprecated in favor of `public_baseurl`. ([\#9199](https://github.com/matrix-org/synapse/issues/9199))
- Add new endpoint `/_synapse/client/saml2` for SAML2 authentication callbacks, and deprecate the old endpoint `/_matrix/saml2`. ([\#9289](https://github.com/matrix-org/synapse/issues/9289))


Internal Changes
----------------

- Add tests to `test_user.UsersListTestCase` for the List Users Admin API. ([\#9045](https://github.com/matrix-org/synapse/issues/9045))
- Various improvements to the federation client. ([\#9129](https://github.com/matrix-org/synapse/issues/9129))
- Speed up chain cover calculation when persisting a batch of state events at once. ([\#9176](https://github.com/matrix-org/synapse/issues/9176))
- Add a `long_description_type` to the package metadata. ([\#9180](https://github.com/matrix-org/synapse/issues/9180))
- Speed up batch insertion when using PostgreSQL. ([\#9181](https://github.com/matrix-org/synapse/issues/9181), [\#9188](https://github.com/matrix-org/synapse/issues/9188))
- Emit an error at startup if different Identity Providers are configured with the same `idp_id`. ([\#9184](https://github.com/matrix-org/synapse/issues/9184))
- Improve performance of concurrent use of `StreamIDGenerators`. ([\#9190](https://github.com/matrix-org/synapse/issues/9190))
- Add some missing source directories to the automatic linting script. ([\#9191](https://github.com/matrix-org/synapse/issues/9191))
- Precompute joined hosts and store them in Redis. ([\#9198](https://github.com/matrix-org/synapse/issues/9198), [\#9227](https://github.com/matrix-org/synapse/issues/9227))
- Clean up the template loading code. ([\#9200](https://github.com/matrix-org/synapse/issues/9200))
- Fix the Python 3.5 old dependencies build. ([\#9217](https://github.com/matrix-org/synapse/issues/9217))
- Update `isort` to v5.7.0 to bypass a bug where it would disagree with `black` about formatting. ([\#9222](https://github.com/matrix-org/synapse/issues/9222))
- Add type hints to handlers code. ([\#9223](https://github.com/matrix-org/synapse/issues/9223), [\#9232](https://github.com/matrix-org/synapse/issues/9232))
- Fix Debian package building on Ubuntu 16.04 LTS (Xenial). ([\#9254](https://github.com/matrix-org/synapse/issues/9254))
- Minor performance improvement during TLS handshake. ([\#9255](https://github.com/matrix-org/synapse/issues/9255))
- Refactor the generation of summary text for email notifications. ([\#9260](https://github.com/matrix-org/synapse/issues/9260))
- Restore PyPy compatibility by not calling CPython-specific GC methods when under PyPy. ([\#9270](https://github.com/matrix-org/synapse/issues/9270))


Synapse 1.26.0 (2021-01-27)
===========================

54 UPGRADE.rst

@@ -85,6 +85,58 @@ for example:

    wget https://packages.matrix.org/debian/pool/main/m/matrix-synapse-py3/matrix-synapse-py3_1.3.0+stretch1_amd64.deb
    dpkg -i matrix-synapse-py3_1.3.0+stretch1_amd64.deb

Upgrading to v1.27.0
====================

Changes to callback URI for OAuth2 / OpenID Connect
---------------------------------------------------

This version changes the URI used for callbacks from OAuth2 identity providers. If
your server is configured for single sign-on via an OpenID Connect or OAuth2 identity
provider, you will need to add ``[synapse public baseurl]/_synapse/client/oidc/callback``
to the list of permitted "redirect URIs" at the identity provider.

See `docs/openid.md <docs/openid.md>`_ for more information on setting up OpenID
Connect.

(Note: a similar change is being made for SAML2; in this case the old URI
``[synapse public baseurl]/_matrix/saml2`` is being deprecated, but will continue to
work, so no immediate changes are required for existing installations.)

Changes to HTML templates
-------------------------

The HTML templates for SSO and email notifications now have `Jinja2's autoescape <https://jinja.palletsprojects.com/en/2.11.x/api/#autoescaping>`_
enabled for files ending in ``.html``, ``.htm``, and ``.xml``. If you have customised
these templates and see issues when viewing them you might need to update them.
It is expected that most configurations will need no changes.
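
As a minimal, standalone sketch of what this autoescaping means in practice (this
uses plain Jinja2 with an in-memory loader, purely for illustration; it is not
Synapse's own template-loading code)::

    import jinja2

    env = jinja2.Environment(
        loader=jinja2.DictLoader({
            "notice.html": "Hello {{ name }}",  # ends in .html: autoescape applies
            "notice.txt": "Hello {{ name }}",   # .txt: variables are not escaped
        }),
        # select_autoescape() enables escaping for .html/.htm/.xml by default.
        autoescape=jinja2.select_autoescape(),
    )

    print(env.get_template("notice.html").render(name="<b>Bob</b>"))
    # -> Hello &lt;b&gt;Bob&lt;/b&gt;
    print(env.get_template("notice.txt").render(name="<b>Bob</b>"))
    # -> Hello <b>Bob</b>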

If you have customised the *names* of these templates, it is recommended
to verify they end in ``.html`` to ensure autoescape is enabled.

The above applies to the following templates:

* ``add_threepid.html``
* ``add_threepid_failure.html``
* ``add_threepid_success.html``
* ``notice_expiry.html``
* ``notif_mail.html`` (which, by default, includes ``room.html`` and ``notif.html``)
* ``password_reset.html``
* ``password_reset_confirmation.html``
* ``password_reset_failure.html``
* ``password_reset_success.html``
* ``registration.html``
* ``registration_failure.html``
* ``registration_success.html``
* ``sso_account_deactivated.html``
* ``sso_auth_bad_user.html``
* ``sso_auth_confirm.html``
* ``sso_auth_success.html``
* ``sso_error.html``
* ``sso_login_idp_picker.html``
* ``sso_redirect_confirm.html``

Upgrading to v1.26.0
====================

@@ -198,7 +250,7 @@ shown below:

        return {"localpart": localpart}

Removal of historical Synapse Admin API
---------------------------------------

Historically, the Synapse Admin API has been accessible under:

4 debian/build_virtualenv (vendored)

@@ -33,11 +33,13 @@ esac
# Use --builtin-venv to use the better `venv` module from CPython 3.4+ rather
# than the 2/3 compatible `virtualenv`.

+# Pin pip to 20.3.4 to fix breakage in 21.0 on py3.5 (xenial)
+
dh_virtualenv \
    --install-suffix "matrix-synapse" \
    --builtin-venv \
    --python "$SNAKE" \
-    --upgrade-pip \
+    --upgrade-pip-to="20.3.4" \
    --preinstall="lxml" \
    --preinstall="mock" \
    --extra-pip-arg="--no-cache-dir" \

6 debian/changelog (vendored)

@@ -1,3 +1,9 @@
+matrix-synapse-py3 (1.26.0+nmu1) UNRELEASED; urgency=medium
+
+  * Fix build on Ubuntu 16.04 LTS (Xenial).
+
+ -- Dan Callahan <danc@element.io>  Thu, 28 Jan 2021 16:21:03 +0000
+
matrix-synapse-py3 (1.26.0) stable; urgency=medium

  [ Richard van der Hoff ]

@@ -27,6 +27,7 @@ RUN env DEBIAN_FRONTEND=noninteractive apt-get install \
        wget

# fetch and unpack the package
+# TODO: Upgrade to 1.2.2 once xenial is dropped
RUN mkdir /dh-virtualenv
RUN wget -q -O /dh-virtualenv.tar.gz https://github.com/spotify/dh-virtualenv/archive/ac6e1b1.tar.gz
RUN tar -xv --strip-components=1 -C /dh-virtualenv -f /dh-virtualenv.tar.gz

@@ -9,6 +9,7 @@
  * [Response](#response)
  * [Undoing room shutdowns](#undoing-room-shutdowns)
- [Make Room Admin API](#make-room-admin-api)
- [Forward Extremities Admin API](#forward-extremities-admin-api)

# List Room API

@@ -367,6 +368,36 @@ Response:
}
```

# Room State API

The Room State admin API allows server admins to get a list of all state events in a room.

The response includes the following fields:

* `state` - The current state of the room at the time of request.

## Usage

A standard request:

```
GET /_synapse/admin/v1/rooms/<room_id>/state

{}
```

Response:

```json
{
  "state": [
    {"type": "m.room.create", "state_key": "", "etc": true},
    {"type": "m.room.power_levels", "state_key": "", "etc": true},
    {"type": "m.room.name", "state_key": "", "etc": true}
  ]
}
```
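
As a sketch of how the Room State API might be called programmatically, using
Python's `requests` library (the homeserver name, room ID and token below are
placeholders, not values from this release):

```python
import requests
from urllib.parse import quote

BASE_URL = "https://matrix.example.com"  # hypothetical homeserver
ROOM_ID = "!636q39766251:example.com"    # placeholder room ID
ADMIN_TOKEN = "<admin access token>"     # a server admin's access_token

# Fetch the current state events of the room via the admin API.
resp = requests.get(
    f"{BASE_URL}/_synapse/admin/v1/rooms/{quote(ROOM_ID)}/state",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
)
resp.raise_for_status()
for event in resp.json()["state"]:
    print(event["type"], event["state_key"])
```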

# Delete Room API

The Delete Room admin API allows server admins to remove rooms from server

@@ -511,3 +542,55 @@ optionally be specified, e.g.:
    "user_id": "@foo:example.com"
}
```

# Forward Extremities Admin API

Enables querying and deleting forward extremities from rooms. When a lot of forward
extremities accumulate in a room, performance can become degraded. For details, see
[#1760](https://github.com/matrix-org/synapse/issues/1760).

## Check for forward extremities

To check the status of forward extremities for a room:

```
GET /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
```

The following response will be returned:

```json
{
  "count": 1,
  "results": [
    {
      "event_id": "$M5SP266vsnxctfwFgFLNceaCo3ujhRtg_NiiHabcdefgh",
      "state_group": 439,
      "depth": 123,
      "received_ts": 1611263016761
    }
  ]
}
```

## Deleting forward extremities

**WARNING**: Please ensure you know what you're doing and have read
the related issue [#1760](https://github.com/matrix-org/synapse/issues/1760).
Under no circumstances should this API be executed as an automated maintenance task!

If a room has lots of forward extremities, the extra ones can be
deleted as follows:

```
DELETE /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
```

The following response will be returned, indicating the number of forward extremities
that were deleted:

```json
{
  "deleted": 1
}
```
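
A minimal sketch of invoking the deletion from Python's `requests` library
(hypothetical homeserver, room ID and token, as in the example further above):

```python
import requests
from urllib.parse import quote

BASE_URL = "https://matrix.example.com"  # hypothetical homeserver
ROOM_ID = "!636q39766251:example.com"    # placeholder room ID
ADMIN_TOKEN = "<admin access token>"

# Delete the extra forward extremities of a room; read #1760 first!
resp = requests.delete(
    f"{BASE_URL}/_synapse/admin/v1/rooms/{quote(ROOM_ID)}/forward_extremities",
    headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
)
resp.raise_for_status()
print("deleted:", resp.json()["deleted"])
```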

@@ -760,3 +760,33 @@ The following fields are returned in the JSON response body:

- ``total`` - integer - Number of pushers.

See also the `Client-Server API Spec <https://matrix.org/docs/spec/client_server/latest#get-matrix-client-r0-pushers>`_.

Shadow-banning users
====================

Shadow-banning is a useful tool for moderating malicious or egregiously abusive users.
A shadow-banned user receives successful responses to their client-server API requests,
but the events are not propagated into rooms. This can be an effective tool as it
(hopefully) takes the user longer to realise they are being moderated before they
pivot to another account.

Shadow-banning a user should be used as a tool of last resort and may lead to confusing
or broken behaviour for the client. A shadow-banned user will not receive any
notification, and it is generally more appropriate to ban or kick abusive users.
A shadow-banned user will be unable to contact anyone on the server.

The API is::

    POST /_synapse/admin/v1/users/<user_id>/shadow_ban

To use it, you will need to authenticate by providing an ``access_token`` for a
server admin: see `README.rst <README.rst>`_.

An empty JSON dict is returned.

**Parameters**

The following parameters should be set in the URL:

- ``user_id`` - The fully qualified MXID: for example, ``@user:server.com``. The user must
  be local.
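
As a sketch, the API could be invoked from Python's ``requests`` library as
follows (the homeserver name, user ID and token are placeholders)::

    import requests
    from urllib.parse import quote

    BASE_URL = "https://matrix.example.com"  # hypothetical homeserver
    USER_ID = "@spammer:example.com"         # placeholder user ID; must be local
    ADMIN_TOKEN = "<admin access token>"

    # Shadow-ban the user; an empty JSON dict is returned on success.
    resp = requests.post(
        f"{BASE_URL}/_synapse/admin/v1/users/{quote(USER_ID)}/shadow_ban",
        headers={"Authorization": f"Bearer {ADMIN_TOKEN}"},
    )
    resp.raise_for_status()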

107 docs/openid.md

@@ -44,7 +44,7 @@ as follows:

To enable the OpenID integration, you should then add a section to the `oidc_providers`
setting in your configuration file (or uncomment one of the existing examples).
See [sample_config.yaml](./sample_config.yaml) for some sample settings, as well as
the text below for example configurations for specific providers.

## Sample configs

@@ -52,11 +52,12 @@ the text below for example configurations for specific providers.

Here are a few configs for providers that should work with Synapse.

### Microsoft Azure Active Directory
Azure AD can act as an OpenID Connect Provider. Register a new application under
*App registrations* in the Azure AD management console. The RedirectURI for your
-application should point to your matrix server: `[synapse public baseurl]/_synapse/oidc/callback`
+application should point to your matrix server:
+`[synapse public baseurl]/_synapse/client/oidc/callback`

Go to *Certificates & secrets* and register a new client secret. Make note of your
Directory (tenant) ID as it will be used in the Azure links.
Edit your Synapse config file and change the `oidc_config` section:

@@ -94,7 +95,7 @@ staticClients:
- id: synapse
  secret: secret
  redirectURIs:
-  - '[synapse public baseurl]/_synapse/oidc/callback'
+  - '[synapse public baseurl]/_synapse/client/oidc/callback'
  name: 'Synapse'
```

@@ -118,7 +119,7 @@ oidc_providers:
```
### [Keycloak][keycloak-idp]

[Keycloak][keycloak-idp] is an open-source IdP maintained by Red Hat.

Follow the [Getting Started Guide](https://www.keycloak.org/getting-started) to install Keycloak and set up a realm.

@@ -140,7 +141,7 @@ Follow the [Getting Started Guide](https://www.keycloak.org/getting-started) to
| Enabled | `On` |
| Client Protocol | `openid-connect` |
| Access Type | `confidential` |
-| Valid Redirect URIs | `[synapse public baseurl]/_synapse/oidc/callback` |
+| Valid Redirect URIs | `[synapse public baseurl]/_synapse/client/oidc/callback` |

5. Click `Save`
6. On the Credentials tab, update the fields:

@@ -168,7 +169,7 @@ oidc_providers:
### [Auth0][auth0]

1. Create a regular web application for Synapse
-2. Set the Allowed Callback URLs to `[synapse public baseurl]/_synapse/oidc/callback`
+2. Set the Allowed Callback URLs to `[synapse public baseurl]/_synapse/client/oidc/callback`
3. Add a rule to add the `preferred_username` claim.
   <details>
    <summary>Code sample</summary>

@@ -194,7 +195,7 @@ Synapse config:

```yaml
oidc_providers:
  - idp_id: auth0
    idp_name: Auth0
    issuer: "https://your-tier.eu.auth0.com/" # TO BE FILLED
    client_id: "your-client-id" # TO BE FILLED

@@ -217,7 +218,7 @@ login mechanism needs an attribute to uniquely identify users, and that endpoint
does not return a `sub` property, an alternative `subject_claim` has to be set.

1. Create a new OAuth application: https://github.com/settings/applications/new.
-2. Set the callback URL to `[synapse public baseurl]/_synapse/oidc/callback`.
+2. Set the callback URL to `[synapse public baseurl]/_synapse/client/oidc/callback`.

Synapse config:

@@ -225,6 +226,7 @@ Synapse config:
oidc_providers:
  - idp_id: github
    idp_name: Github
+    idp_brand: "org.matrix.github" # optional: styling hint for clients
    discover: false
    issuer: "https://github.com/"
    client_id: "your-client-id" # TO BE FILLED

@@ -250,6 +252,7 @@ oidc_providers:
oidc_providers:
  - idp_id: google
    idp_name: Google
+    idp_brand: "org.matrix.google" # optional: styling hint for clients
    issuer: "https://accounts.google.com/"
    client_id: "your-client-id" # TO BE FILLED
    client_secret: "your-client-secret" # TO BE FILLED

@@ -260,13 +263,13 @@ oidc_providers:
       display_name_template: "{{ user.name }}"
   ```
4. Back in the Google console, add this Authorized redirect URI: `[synapse
-   public baseurl]/_synapse/oidc/callback`.
+   public baseurl]/_synapse/client/oidc/callback`.

### Twitch

1. Set up a developer account on [Twitch](https://dev.twitch.tv/)
2. Obtain the OAuth 2.0 credentials by [creating an app](https://dev.twitch.tv/console/apps/)
-3. Add this OAuth Redirect URL: `[synapse public baseurl]/_synapse/oidc/callback`
+3. Add this OAuth Redirect URL: `[synapse public baseurl]/_synapse/client/oidc/callback`

Synapse config:

@@ -288,7 +291,7 @@ oidc_providers:

1. Create a [new application](https://gitlab.com/profile/applications).
2. Add the `read_user` and `openid` scopes.
-3. Add this Callback URL: `[synapse public baseurl]/_synapse/oidc/callback`
+3. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`

Synapse config:

@@ -296,6 +299,7 @@ Synapse config:
oidc_providers:
  - idp_id: gitlab
    idp_name: Gitlab
+    idp_brand: "org.matrix.gitlab" # optional: styling hint for clients
    issuer: "https://gitlab.com/"
    client_id: "your-client-id" # TO BE FILLED
    client_secret: "your-client-secret" # TO BE FILLED

@@ -307,3 +311,80 @@ oidc_providers:
      localpart_template: '{{ user.nickname }}'
      display_name_template: '{{ user.name }}'
```

### Facebook

Like GitHub, Facebook provides a custom OAuth2 API rather than an OIDC-compliant
one, so it requires a little more configuration.

0. You will need a Facebook developer account. You can register for one
   [here](https://developers.facebook.com/async/registration/).
1. On the [apps](https://developers.facebook.com/apps/) page of the developer
   console, "Create App", and choose "Build Connected Experiences".
2. Once the app is created, add "Facebook Login" and choose "Web". You don't
   need to go through the whole form here.
3. In the left-hand menu, open "Products"/"Facebook Login"/"Settings".
   * Add `[synapse public baseurl]/_synapse/client/oidc/callback` as an OAuth Redirect
     URL.
4. In the left-hand menu, open "Settings/Basic". Here you can copy the "App ID"
   and "App Secret" for use below.

Synapse config:

```yaml
  - idp_id: facebook
    idp_name: Facebook
    idp_brand: "org.matrix.facebook" # optional: styling hint for clients
    discover: false
    issuer: "https://facebook.com"
    client_id: "your-client-id" # TO BE FILLED
    client_secret: "your-client-secret" # TO BE FILLED
    scopes: ["openid", "email"]
    authorization_endpoint: https://facebook.com/dialog/oauth
    token_endpoint: https://graph.facebook.com/v9.0/oauth/access_token
    user_profile_method: "userinfo_endpoint"
    userinfo_endpoint: "https://graph.facebook.com/v9.0/me?fields=id,name,email,picture"
    user_mapping_provider:
      config:
        subject_claim: "id"
        display_name_template: "{{ user.name }}"
```

Relevant documents:
* https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow
* Using Facebook's Graph API: https://developers.facebook.com/docs/graph-api/using-graph-api/
* Reference to the User endpoint: https://developers.facebook.com/docs/graph-api/reference/user

### Gitea

Gitea is, like GitHub, not an OpenID provider, but just an OAuth2 provider.

The [`/user` API endpoint](https://try.gitea.io/api/swagger#/user/userGetCurrent)
can be used to retrieve information on the authenticated user. As the Synapse
login mechanism needs an attribute to uniquely identify users, and that endpoint
does not return a `sub` property, an alternative `subject_claim` has to be set.

1. Create a new application.
2. Add this Callback URL: `[synapse public baseurl]/_synapse/client/oidc/callback`

Synapse config:

```yaml
oidc_providers:
  - idp_id: gitea
    idp_name: Gitea
    discover: false
    issuer: "https://your-gitea.com/"
    client_id: "your-client-id" # TO BE FILLED
    client_secret: "your-client-secret" # TO BE FILLED
    client_auth_method: client_secret_post
    scopes: [] # Gitea doesn't support Scopes
    authorization_endpoint: "https://your-gitea.com/login/oauth/authorize"
    token_endpoint: "https://your-gitea.com/login/oauth/access_token"
    userinfo_endpoint: "https://your-gitea.com/api/v1/user"
    user_mapping_provider:
      config:
        subject_claim: "id"
        localpart_template: "{{ user.login }}"
        display_name_template: "{{ user.full_name }}"
```

@@ -824,6 +824,9 @@ log_config: "CONFDIR/SERVERNAME.log.config"
#   users are joining rooms the server is already in (this is cheap) vs
#   "remote" for when users are trying to join rooms not on the server (which
#   can be more expensive)
+# - one for ratelimiting how often a user or IP can attempt to validate a 3PID.
+# - two for ratelimiting how often invites can be sent in a room or to a
+#   specific user.
#
# The defaults are as shown below.
#

@@ -857,7 +860,18 @@ log_config: "CONFDIR/SERVERNAME.log.config"
#  remote:
#    per_second: 0.01
#    burst_count: 3
#
+#rc_3pid_validation:
+#  per_second: 0.003
+#  burst_count: 5
+#
+#rc_invites:
+#  per_room:
+#    per_second: 0.3
+#    burst_count: 10
+#  per_user:
+#    per_second: 0.003
+#    burst_count: 5

# Ratelimiting settings for incoming federation
#

@@ -1552,10 +1566,10 @@ trusted_key_servers:
# enable SAML login.
#
# Once SAML support is enabled, a metadata file will be exposed at
-# https://<server>:<port>/_matrix/saml2/metadata.xml, which you may be able to
+# https://<server>:<port>/_synapse/client/saml2/metadata.xml, which you may be able to
# use to configure your SAML IdP with. Alternatively, you can manually configure
# the IdP to use an ACS location of
-# https://<server>:<port>/_matrix/saml2/authn_response.
+# https://<server>:<port>/_synapse/client/saml2/authn_response.
#
saml2_config:
  # `sp_config` is the configuration for the pysaml2 Service Provider.

@@ -1727,10 +1741,14 @@ saml2_config:
#       offer the user a choice of login mechanisms.
#
#   idp_icon: An optional icon for this identity provider, which is presented
-#       by identity picker pages. If given, must be an MXC URI of the format
-#       mxc://<server-name>/<media-id>. (An easy way to obtain such an MXC URI
-#       is to upload an image to an (unencrypted) room and then copy the "url"
-#       from the source of the event.)
+#       by clients and Synapse's own IdP picker page. If given, must be an
+#       MXC URI of the format mxc://<server-name>/<media-id>. (An easy way to
+#       obtain such an MXC URI is to upload an image to an (unencrypted) room
+#       and then copy the "url" from the source of the event.)
+#
+#   idp_brand: An optional brand for this identity provider, allowing clients
+#       to style the login flow according to the identity provider in question.
+#       See the spec for possible options here.
#
#   discover: set to 'false' to disable the use of the OIDC discovery mechanism
#       to discover endpoints. Defaults to true.

@@ -1791,17 +1809,21 @@ saml2_config:
#
#       For the default provider, the following settings are available:
#
-#         sub: name of the claim containing a unique identifier for the
-#             user. Defaults to 'sub', which OpenID Connect compliant
-#             providers should provide.
+#         subject_claim: name of the claim containing a unique identifier
+#             for the user. Defaults to 'sub', which OpenID Connect
+#             compliant providers should provide.
#
#         localpart_template: Jinja2 template for the localpart of the MXID.
#             If this is not set, the user will be prompted to choose their
-#             own username.
+#             own username (see 'sso_auth_account_details.html' in the 'sso'
+#             section of this file).
#
#         display_name_template: Jinja2 template for the display name to set
#             on first login. If unset, no displayname will be set.
#
+#         email_template: Jinja2 template for the email address of the user.
+#             If unset, no email address will be added to the account.
+#
#         extra_attributes: a map of Jinja2 templates for extra attributes
#             to send back to the client during login.
#             Note that these are non-standard and clients will ignore them

@@ -1837,6 +1859,12 @@ oidc_providers:
  #  userinfo_endpoint: "https://accounts.example.com/userinfo"
  #  jwks_uri: "https://accounts.example.com/.well-known/jwks.json"
  #  skip_verification: true
+  #  user_mapping_provider:
+  #    config:
+  #      subject_claim: "id"
+  #      localpart_template: "{ user.login }"
+  #      display_name_template: "{ user.name }"
+  #      email_template: "{ user.email }"

  # For use with Keycloak
  #

@@ -1851,6 +1879,7 @@ oidc_providers:
  #
  #- idp_id: github
  #  idp_name: Github
+  #  idp_brand: org.matrix.github
  #  discover: false
  #  issuer: "https://github.com/"
  #  client_id: "your-client-id" # TO BE FILLED

@@ -1878,10 +1907,6 @@ cas_config:
  #
  #server_url: "https://cas-server.com"

-  # The public URL of the homeserver.
-  #
-  #service_url: "https://homeserver.domain.com:8448"
-
  # The attribute of the CAS response to use as the display name.
  #
  # If unset, no displayname will be set.

@@ -1943,8 +1968,13 @@ sso:
#
#   * providers: a list of available Identity Providers. Each element is
#     an object with the following attributes:
#
#       * idp_id: unique identifier for the IdP
#       * idp_name: user-facing name for the IdP
+#       * idp_icon: if specified in the IdP config, an MXC URI for an icon
+#         for the IdP
+#       * idp_brand: if specified in the IdP config, a textual identifier
+#         for the brand of the IdP
#
#   The rendered HTML page should contain a form which submits its results
#   back as a GET request, with the following query parameters:

@@ -1954,10 +1984,62 @@ sso:
#
#     * idp: the 'idp_id' of the chosen IDP.
#
+# * HTML page to prompt new users to enter a userid and confirm other
+#   details: 'sso_auth_account_details.html'. This is only shown if the
+#   SSO implementation (with any user_mapping_provider) does not return
+#   a localpart.
+#
+#   When rendering, this template is given the following variables:
+#
+#     * server_name: the homeserver's name.
+#
+#     * idp: details of the SSO Identity Provider that the user logged in
+#       with: an object with the following attributes:
+#
+#         * idp_id: unique identifier for the IdP
+#         * idp_name: user-facing name for the IdP
+#         * idp_icon: if specified in the IdP config, an MXC URI for an icon
+#           for the IdP
+#         * idp_brand: if specified in the IdP config, a textual identifier
+#           for the brand of the IdP
+#
+#     * user_attributes: an object containing details about the user that
+#       we received from the IdP. May have the following attributes:
+#
+#         * display_name: the user's display_name
+#         * emails: a list of email addresses
+#
+#   The template should render a form which submits the following fields:
+#
+#     * username: the localpart of the user's chosen user id
+#
+# * HTML page allowing the user to consent to the server's terms and
+#   conditions. This is only shown for new users, and only if
+#   `user_consent.require_at_registration` is set.
+#
+#   When rendering, this template is given the following variables:
+#
+#     * server_name: the homeserver's name.
+#
+#     * user_id: the user's matrix proposed ID.
+#
+#     * user_profile.display_name: the user's proposed display name, if any.
+#
+#     * consent_version: the version of the terms that the user will be
+#       shown
+#
+#     * terms_url: a link to the page showing the terms.
+#
+#   The template should render a form which submits the following fields:
+#
+#     * accepted_version: the version of the terms accepted by the user
+#       (ie, 'consent_version' from the input variables).
+#
# * HTML page for a confirmation step before redirecting back to the client
#   with the login token: 'sso_redirect_confirm.html'.
#
-#   When rendering, this template is given three variables:
+#   When rendering, this template is given the following variables:
#
#     * redirect_url: the URL the user is about to be redirected to. Needs
#       manual escaping (see
#       https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).

@@ -1970,6 +2052,17 @@ sso:
#
#     * server_name: the homeserver's name.
#
+#     * new_user: a boolean indicating whether this is the user's first time
+#       logging in.
+#
+#     * user_id: the user's matrix ID.
+#
+#     * user_profile.avatar_url: an MXC URI for the user's avatar, if any.
+#       None if the user has not set an avatar.
+#
+#     * user_profile.display_name: the user's display name. None if the user
+#       has not set a display name.
+#
# * HTML page which notifies the user that they are authenticating to confirm
#   an operation on their account during the user interactive authentication
#   process: 'sso_auth_confirm.html'.

@@ -1981,6 +2074,16 @@ sso:
#
#     * description: the operation which the user is being asked to confirm
#
+#     * idp: details of the Identity Provider that we will use to confirm
+#       the user's identity: an object with the following attributes:
+#
+#         * idp_id: unique identifier for the IdP
+#         * idp_name: user-facing name for the IdP
+#         * idp_icon: if specified in the IdP config, an MXC URI for an icon
+#           for the IdP
+#         * idp_brand: if specified in the IdP config, a textual identifier
+#           for the brand of the IdP
+#
# * HTML page shown after a successful user interactive authentication session:
#   'sso_auth_success.html'.
#

@@ -232,6 +232,12 @@ Here are a few things to try:

(Understanding the output is beyond the scope of this document!)

* You can test your Matrix homeserver TURN setup with https://test.voip.librepush.net/.
  Note that this test is not fully reliable yet, so don't be discouraged if
  the test fails.
  [Here](https://github.com/matrix-org/voip-tester) is the github repo of the
  source of the tester, where you can file bug reports.

* There is a WebRTC test tool at
  https://webrtc.github.io/samples/src/content/peerconnection/trickle-ice/. To
  use it, you will need a username/password for your TURN server. You can

@@ -40,6 +40,9 @@ which relays replication commands between processes. This can give a significant
cpu saving on the main process and will be a prerequisite for upcoming
performance improvements.

If Redis support is enabled Synapse will use it as a shared cache, as well as a
pub/sub mechanism.

See the [Architectural diagram](#architectural-diagram) section at the end for
a visualisation of what this looks like.

@@ -225,7 +228,6 @@ expressions:

    ^/_matrix/client/(api/v1|r0|unstable)/joined_groups$
    ^/_matrix/client/(api/v1|r0|unstable)/publicised_groups$
    ^/_matrix/client/(api/v1|r0|unstable)/publicised_groups/
    ^/_synapse/client/password_reset/email/submit_token$

    # Registration/login requests
    ^/_matrix/client/(api/v1|r0|unstable)/login$

@@ -256,25 +258,29 @@ Additionally, the following endpoints should be included if Synapse is configure
to use SSO (you only need to include the ones for whichever SSO provider you're
using):

    # for all SSO providers
    ^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect
    ^/_synapse/client/pick_idp$
    ^/_synapse/client/pick_username
    ^/_synapse/client/new_user_consent$
    ^/_synapse/client/sso_register$

    # OpenID Connect requests.
    ^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect$
-    ^/_synapse/oidc/callback$
+    ^/_synapse/client/oidc/callback$

    # SAML requests.
    ^/_matrix/client/(api/v1|r0|unstable)/login/sso/redirect$
-    ^/_matrix/saml2/authn_response$
+    ^/_synapse/client/saml2/authn_response$

    # CAS requests.
    ^/_matrix/client/(api/v1|r0|unstable)/login/(cas|sso)/redirect$
    ^/_matrix/client/(api/v1|r0|unstable)/login/cas/ticket$

-Ensure that all SSO logins go to a single process.
-For multiple workers not handling the SSO endpoints properly, see
-[#7530](https://github.com/matrix-org/synapse/issues/7530).
+Note that a HTTP listener with `client` and `federation` resources must be
+configured in the `worker_listeners` option in the worker config.
+
+Ensure that all SSO logins go to a single process (usually the main process).
+For multiple workers not handling the SSO endpoints properly, see
+[#7530](https://github.com/matrix-org/synapse/issues/7530).

#### Load balancing

It is possible to run multiple instances of this worker app, with incoming requests

40 mypy.ini

@@ -23,39 +23,7 @@ files =
  synapse/events/validator.py,
  synapse/events/spamcheck.py,
  synapse/federation,
-  synapse/handlers/_base.py,
-  synapse/handlers/account_data.py,
-  synapse/handlers/account_validity.py,
-  synapse/handlers/admin.py,
-  synapse/handlers/appservice.py,
-  synapse/handlers/auth.py,
-  synapse/handlers/cas_handler.py,
-  synapse/handlers/deactivate_account.py,
-  synapse/handlers/device.py,
-  synapse/handlers/devicemessage.py,
-  synapse/handlers/directory.py,
-  synapse/handlers/events.py,
-  synapse/handlers/federation.py,
-  synapse/handlers/identity.py,
-  synapse/handlers/initial_sync.py,
-  synapse/handlers/message.py,
-  synapse/handlers/oidc_handler.py,
-  synapse/handlers/pagination.py,
-  synapse/handlers/password_policy.py,
-  synapse/handlers/presence.py,
-  synapse/handlers/profile.py,
-  synapse/handlers/read_marker.py,
-  synapse/handlers/receipts.py,
-  synapse/handlers/register.py,
-  synapse/handlers/room.py,
-  synapse/handlers/room_list.py,
-  synapse/handlers/room_member.py,
-  synapse/handlers/room_member_worker.py,
-  synapse/handlers/saml_handler.py,
-  synapse/handlers/sso.py,
-  synapse/handlers/sync.py,
-  synapse/handlers/user_directory.py,
-  synapse/handlers/ui_auth,
+  synapse/handlers,
  synapse/http/client.py,
  synapse/http/federation/matrix_federation_agent.py,
  synapse/http/federation/well_known_resolver.py,

@@ -194,3 +162,9 @@ ignore_missing_imports = True

[mypy-hiredis]
ignore_missing_imports = True
+
+[mypy-josepy.*]
+ignore_missing_imports = True
+
+[mypy-txacme.*]
+ignore_missing_imports = True

@@ -80,7 +80,8 @@ else
    # then lint everything!
    if [[ -z ${files+x} ]]; then
        # Lint all source code files and directories
-        files=("synapse" "tests" "scripts-dev" "scripts" "contrib" "synctl" "setup.py" "synmark")
+        # Note: this list aims to mirror the one in tox.ini
+        files=("synapse" "docker" "tests" "scripts-dev" "scripts" "contrib" "synctl" "setup.py" "synmark" "stubs" ".buildkite")
    fi
fi

3 setup.py

@@ -96,7 +96,7 @@ CONDITIONAL_REQUIREMENTS["all"] = list(ALL_OPTIONAL_REQUIREMENTS)
#
# We pin black so that our tests don't start failing on new releases.
CONDITIONAL_REQUIREMENTS["lint"] = [
-    "isort==5.0.3",
+    "isort==5.7.0",
    "black==19.10b0",
    "flake8-comprehensions",
    "flake8",

@@ -121,6 +121,7 @@ setup(
    include_package_data=True,
    zip_safe=False,
    long_description=long_description,
+    long_description_content_type="text/x-rst",
    python_requires="~=3.5",
    classifiers=[
        "Development Status :: 5 - Production/Stable",

@@ -15,13 +15,23 @@

"""Contains *incomplete* type hints for txredisapi.
"""

-from typing import List, Optional, Type, Union
+from typing import Any, List, Optional, Type, Union

class RedisProtocol:
    def publish(self, channel: str, message: bytes): ...
+    async def ping(self) -> None: ...
+    async def set(
+        self,
+        key: str,
+        value: Any,
+        expire: Optional[int] = None,
+        pexpire: Optional[int] = None,
+        only_if_not_exists: bool = False,
+        only_if_exists: bool = False,
+    ) -> None: ...
+    async def get(self, key: str) -> Any: ...

-class SubscriberProtocol:
+class SubscriberProtocol(RedisProtocol):
    def __init__(self, *args, **kwargs): ...
    password: Optional[str]
    def subscribe(self, channels: Union[str, List[str]]): ...

@@ -40,14 +50,13 @@ def lazyConnection(
    convertNumbers: bool = ...,
) -> RedisProtocol: ...

-class SubscriberFactory:
-    def buildProtocol(self, addr): ...
-
class ConnectionHandler: ...

class RedisFactory:
    continueTrying: bool
    handler: RedisProtocol
    pool: List[RedisProtocol]
+    replyTimeout: Optional[int]
    def __init__(
        self,
        uuid: str,

@@ -60,3 +69,7 @@ class RedisFactory:
        replyTimeout: Optional[int] = None,
        convertNumbers: Optional[int] = True,
    ): ...
+    def buildProtocol(self, addr) -> RedisProtocol: ...
+
+class SubscriberFactory(RedisFactory):
+    def __init__(self): ...

@@ -48,7 +48,7 @@ try:
except ImportError:
    pass

-__version__ = "1.26.0"
+__version__ = "1.27.0rc1"

if bool(os.environ.get("SYNAPSE_TEST_PATCH_LOG_CONTEXTS", False)):
    # We import here so that we don't have to install a bunch of deps when

@@ -16,6 +16,7 @@
import gc
import logging
import os
+import platform
import signal
import socket
import sys

@@ -339,7 +340,7 @@ async def start(hs: "synapse.server.HomeServer", listeners: Iterable[ListenerCon
    # rest of time. Doing so means less work each GC (hopefully).
    #
    # This only works on Python 3.7
-    if sys.version_info >= (3, 7):
+    if platform.python_implementation() == "CPython" and sys.version_info >= (3, 7):
        gc.collect()
        gc.freeze()

@@ -22,6 +22,7 @@ from typing import Dict, Iterable, Optional, Set
from typing_extensions import ContextManager

from twisted.internet import address
+from twisted.web.resource import IResource

import synapse
import synapse.events

@@ -90,9 +91,8 @@ from synapse.replication.tcp.streams import (
    ToDeviceStream,
)
from synapse.rest.admin import register_servlets_for_media_repo
-from synapse.rest.client.v1 import events, room
+from synapse.rest.client.v1 import events, login, room
from synapse.rest.client.v1.initial_sync import InitialSyncRestServlet
-from synapse.rest.client.v1.login import LoginRestServlet
from synapse.rest.client.v1.profile import (
    ProfileAvatarURLRestServlet,
    ProfileDisplaynameRestServlet,

@@ -127,6 +127,7 @@ from synapse.rest.client.v2_alpha.sendtodevice import SendToDeviceRestServlet
from synapse.rest.client.versions import VersionsRestServlet
from synapse.rest.health import HealthResource
from synapse.rest.key.v2 import KeyApiV2Resource
+from synapse.rest.synapse.client import build_synapse_client_resource_tree
from synapse.server import HomeServer, cache_in_self
from synapse.storage.databases.main.censor_events import CensorEventsStore
from synapse.storage.databases.main.client_ips import ClientIpWorkerStore

@@ -507,7 +508,7 @@ class GenericWorkerServer(HomeServer):
        site_tag = port

        # We always include a health resource.
-        resources = {"/health": HealthResource()}
+        resources = {"/health": HealthResource()}  # type: Dict[str, IResource]

        for res in listener_config.http_options.resources:
            for name in res.names:

@@ -517,7 +518,7 @@ class GenericWorkerServer(HomeServer):
                    resource = JsonResource(self, canonical_json=False)

                    RegisterRestServlet(self).register(resource)
-                    LoginRestServlet(self).register(resource)
+                    login.register_servlets(self, resource)
                    ThreepidRestServlet(self).register(resource)
                    DevicesRestServlet(self).register(resource)
                    KeyQueryServlet(self).register(resource)

@@ -557,6 +558,8 @@ class GenericWorkerServer(HomeServer):
                    groups.register_servlets(self, resource)

                    resources.update({CLIENT_API_PREFIX: resource})
+
+                    resources.update(build_synapse_client_resource_tree(self))
                elif name == "federation":
                    resources.update({FEDERATION_PREFIX: TransportLayerServer(self)})
                elif name == "media":

@@ -60,8 +60,7 @@ from synapse.rest import ClientRestResource
from synapse.rest.admin import AdminRestResource
from synapse.rest.health import HealthResource
from synapse.rest.key.v2 import KeyApiV2Resource
-from synapse.rest.synapse.client.pick_idp import PickIdpResource
-from synapse.rest.synapse.client.pick_username import pick_username_resource
+from synapse.rest.synapse.client import build_synapse_client_resource_tree
from synapse.rest.well_known import WellKnownResource
from synapse.server import HomeServer
from synapse.storage import DataStore

@@ -190,21 +189,10 @@ class SynapseHomeServer(HomeServer):
                    "/_matrix/client/versions": client_resource,
                    "/.well-known/matrix/client": WellKnownResource(self),
                    "/_synapse/admin": AdminRestResource(self),
-                    "/_synapse/client/pick_username": pick_username_resource(self),
-                    "/_synapse/client/pick_idp": PickIdpResource(self),
+                    **build_synapse_client_resource_tree(self),
                }
            )

-            if self.get_config().oidc_enabled:
-                from synapse.rest.oidc import OIDCResource
-
-                resources["/_synapse/oidc"] = OIDCResource(self)
-
-            if self.get_config().saml2_enabled:
-                from synapse.rest.saml2 import SAML2Resource
-
-                resources["/_matrix/saml2"] = SAML2Resource(self)
-
            if self.get_config().threepid_behaviour_email == ThreepidBehaviour.LOCAL:
                from synapse.rest.synapse.client.password_reset import (
                    PasswordResetSubmitTokenResource,

@@ -93,15 +93,20 @@ async def phone_stats_home(hs, stats, stats_process=_stats_process):

    stats["daily_active_users"] = await hs.get_datastore().count_daily_users()
    stats["monthly_active_users"] = await hs.get_datastore().count_monthly_users()
+    daily_active_e2ee_rooms = await hs.get_datastore().count_daily_active_e2ee_rooms()
+    stats["daily_active_e2ee_rooms"] = daily_active_e2ee_rooms
+    stats["daily_e2ee_messages"] = await hs.get_datastore().count_daily_e2ee_messages()
+    daily_sent_e2ee_messages = await hs.get_datastore().count_daily_sent_e2ee_messages()
+    stats["daily_sent_e2ee_messages"] = daily_sent_e2ee_messages
    stats["daily_active_rooms"] = await hs.get_datastore().count_daily_active_rooms()
    stats["daily_messages"] = await hs.get_datastore().count_daily_messages()
+    daily_sent_messages = await hs.get_datastore().count_daily_sent_messages()
+    stats["daily_sent_messages"] = daily_sent_messages

    r30_results = await hs.get_datastore().count_r30_users()
    for name, count in r30_results.items():
        stats["r30_users_" + name] = count

-    daily_sent_messages = await hs.get_datastore().count_daily_sent_messages()
-    stats["daily_sent_messages"] = daily_sent_messages
    stats["cache_factor"] = hs.config.caches.global_factor
    stats["event_cache_size"] = hs.config.caches.event_cache_size

@@ -18,18 +18,18 @@
import argparse
import errno
import os
-import time
-import urllib.parse
from collections import OrderedDict
from hashlib import sha256
from textwrap import dedent
-from typing import Any, Callable, Iterable, List, MutableMapping, Optional
+from typing import Any, Iterable, List, MutableMapping, Optional

import attr
import jinja2
import pkg_resources
import yaml

+from synapse.util.templates import _create_mxc_to_http_filter, _format_ts_filter
+

class ConfigError(Exception):
    """Represents a problem parsing the configuration
|
|||
with open(file_path) as file_stream:
|
||||
return file_stream.read()
|
||||
|
||||
def read_template(self, filename: str) -> jinja2.Template:
|
||||
"""Load a template file from disk.
|
||||
|
||||
This function will attempt to load the given template from the default Synapse
|
||||
template directory.
|
||||
|
||||
Files read are treated as Jinja templates. The templates is not rendered yet
|
||||
and has autoescape enabled.
|
||||
|
||||
Args:
|
||||
filename: A template filename to read.
|
||||
|
||||
Raises:
|
||||
ConfigError: if the file's path is incorrect or otherwise cannot be read.
|
||||
|
||||
Returns:
|
||||
A jinja2 template.
|
||||
"""
|
||||
return self.read_templates([filename])[0]
|
||||
|
||||
def read_templates(
|
||||
self,
|
||||
filenames: List[str],
|
||||
custom_template_directory: Optional[str] = None,
|
||||
autoescape: bool = False,
|
||||
self, filenames: List[str], custom_template_directory: Optional[str] = None,
|
||||
) -> List[jinja2.Template]:
|
||||
"""Load a list of template files from disk using the given variables.
|
||||
|
||||
|
@ -215,7 +232,8 @@ class Config:
|
|||
template directory. If `custom_template_directory` is supplied, that directory
|
||||
is tried first.
|
||||
|
||||
Files read are treated as Jinja templates. These templates are not rendered yet.
|
||||
Files read are treated as Jinja templates. The templates are not rendered yet
|
||||
and have autoescape enabled.
|
||||
|
||||
Args:
|
||||
filenames: A list of template filenames to read.
|
||||
|
@ -223,16 +241,12 @@ class Config:
|
|||
custom_template_directory: A directory to try to look for the templates
|
||||
before using the default Synapse template directory instead.
|
||||
|
||||
autoescape: Whether to autoescape variables before inserting them into the
|
||||
template.
|
||||
|
||||
Raises:
|
||||
ConfigError: if the file's path is incorrect or otherwise cannot be read.
|
||||
|
||||
Returns:
|
||||
A list of jinja2 templates.
|
||||
"""
|
||||
templates = []
|
||||
search_directories = [self.default_template_dir]
|
||||
|
||||
# The loader will first look in the custom template directory (if specified) for the
|
||||
|
@ -248,8 +262,9 @@ class Config:
|
|||
# Search the custom template directory as well
|
||||
search_directories.insert(0, custom_template_directory)
|
||||
|
||||
# TODO: switch to synapse.util.templates.build_jinja_env
|
||||
loader = jinja2.FileSystemLoader(search_directories)
|
||||
env = jinja2.Environment(loader=loader, autoescape=autoescape)
|
||||
env = jinja2.Environment(loader=loader, autoescape=jinja2.select_autoescape(),)
|
||||
|
||||
# Update the environment with our custom filters
|
||||
env.filters.update(
|
||||
|
@ -259,44 +274,8 @@ class Config:
|
|||
}
|
||||
)
|
||||
|
||||
for filename in filenames:
|
||||
# Load the template
|
||||
template = env.get_template(filename)
|
||||
templates.append(template)
|
||||
|
||||
return templates
|
||||
|
||||
|
||||
def _format_ts_filter(value: int, format: str):
|
||||
return time.strftime(format, time.localtime(value / 1000))
|
||||
|
||||
|
||||
def _create_mxc_to_http_filter(public_baseurl: str) -> Callable:
|
||||
"""Create and return a jinja2 filter that converts MXC urls to HTTP
|
||||
|
||||
Args:
|
||||
public_baseurl: The public, accessible base URL of the homeserver
|
||||
"""
|
||||
|
||||
def mxc_to_http_filter(value, width, height, resize_method="crop"):
|
||||
if value[0:6] != "mxc://":
|
||||
return ""
|
||||
|
||||
server_and_media_id = value[6:]
|
||||
fragment = None
|
||||
if "#" in server_and_media_id:
|
||||
server_and_media_id, fragment = server_and_media_id.split("#", 1)
|
||||
fragment = "#" + fragment
|
||||
|
||||
params = {"width": width, "height": height, "method": resize_method}
|
||||
return "%s_matrix/media/v1/thumbnail/%s?%s%s" % (
|
||||
public_baseurl,
|
||||
server_and_media_id,
|
||||
urllib.parse.urlencode(params),
|
||||
fragment or "",
|
||||
)
|
||||
|
||||
return mxc_to_http_filter
|
||||
# Load the templates
|
||||
return [env.get_template(filename) for filename in filenames]
|
||||
|
||||
|
||||
class RootConfig:
|
||||
|
|
|

@@ -9,6 +9,7 @@ from synapse.config import (
    consent_config,
    database,
    emailconfig,
+    experimental,
    groups,
    jwt_config,
    key,

@@ -18,6 +19,7 @@ from synapse.config import (
    password_auth_providers,
    push,
    ratelimiting,
+    redis,
    registration,
    repository,
    room_directory,

@@ -48,10 +50,11 @@ def path_exists(file_path: str): ...

class RootConfig:
    server: server.ServerConfig
+    experimental: experimental.ExperimentalConfig
    tls: tls.TlsConfig
    database: database.DatabaseConfig
    logging: logger.LoggingConfig
-    ratelimit: ratelimiting.RatelimitConfig
+    ratelimiting: ratelimiting.RatelimitConfig
    media: repository.ContentRepositoryConfig
    captcha: captcha.CaptchaConfig
    voip: voip.VoipConfig

@@ -79,6 +82,7 @@ class RootConfig:
    roomdirectory: room_directory.RoomDirectoryConfig
    thirdpartyrules: third_party_event_rules.ThirdPartyRulesConfig
    tracer: tracer.TracerConfig
+    redis: redis.RedisConfig

    config_classes: List = ...
    def __init__(self) -> None: ...

@@ -28,9 +28,7 @@ class CaptchaConfig(Config):
            "recaptcha_siteverify_api",
            "https://www.recaptcha.net/recaptcha/api/siteverify",
        )
-        self.recaptcha_template = self.read_templates(
-            ["recaptcha.html"], autoescape=True
-        )[0]
+        self.recaptcha_template = self.read_template("recaptcha.html")

    def generate_config_section(self, **kwargs):
        return """\
@@ -30,7 +30,13 @@ class CasConfig(Config):

        if self.cas_enabled:
            self.cas_server_url = cas_config["server_url"]
-            self.cas_service_url = cas_config["service_url"]
+            public_base_url = cas_config.get("service_url") or self.public_baseurl
+            if public_base_url[-1] != "/":
+                public_base_url += "/"
+            # TODO Update this to a _synapse URL.
+            self.cas_service_url = (
+                public_base_url + "_matrix/client/r0/login/cas/ticket"
+            )
            self.cas_displayname_attribute = cas_config.get("displayname_attribute")
            self.cas_required_attributes = cas_config.get("required_attributes") or {}
        else:

@@ -53,10 +59,6 @@ class CasConfig(Config):
        #
        #server_url: "https://cas-server.com"

-        # The public URL of the homeserver.
-        #
-        #service_url: "https://homeserver.domain.com:8448"
-
        # The attribute of the CAS response to use as the display name.
        #
        # If unset, no displayname will be set.
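
Illustrative sketch (not part of the upstream diff): the first CAS hunk derives the "service" callback URL from service_url if set, otherwise public_baseurl, normalising the trailing slash before appending the ticket endpoint. A minimal equivalent of that logic:

    def build_cas_service_url(public_base_url: str) -> str:
        # Ensure exactly one slash between the base URL and the path.
        if public_base_url[-1] != "/":
            public_base_url += "/"
        return public_base_url + "_matrix/client/r0/login/cas/ticket"

    assert (
        build_cas_service_url("https://example.com")
        == build_cas_service_url("https://example.com/")
        == "https://example.com/_matrix/client/r0/login/cas/ticket"
    )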

@@ -89,7 +89,7 @@ class ConsentConfig(Config):

    def read_config(self, config, **kwargs):
        consent_config = config.get("user_consent")
-        self.terms_template = self.read_templates(["terms.html"], autoescape=True)[0]
+        self.terms_template = self.read_template("terms.html")

        if consent_config is None:
            return
synapse/config/experimental.py (new file, 29 lines)
@@ -0,0 +1,29 @@
+# -*- coding: utf-8 -*-
+# Copyright 2021 The Matrix.org Foundation C.I.C.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+from synapse.config._base import Config
+from synapse.types import JsonDict
+
+
+class ExperimentalConfig(Config):
+    """Config section for enabling experimental features"""
+
+    section = "experimental"
+
+    def read_config(self, config: JsonDict, **kwargs):
+        experimental = config.get("experimental_features") or {}
+
+        # MSC2858 (multiple SSO identity providers)
+        self.msc2858_enabled = experimental.get("msc2858_enabled", False)  # type: bool
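
Illustrative sketch (not part of the upstream diff): ExperimentalConfig reads an optional "experimental_features" mapping from the parsed homeserver.yaml, defaulting each flag to off. Feeding it a hand-built dict shows the behaviour:

    config = {"experimental_features": {"msc2858_enabled": True}}
    experimental = config.get("experimental_features") or {}
    assert experimental.get("msc2858_enabled", False) is True

    # With the section absent (or explicitly null), the flag defaults to False.
    empty = {}
    assert (empty.get("experimental_features") or {}).get("msc2858_enabled", False) is False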

@@ -25,6 +25,7 @@ from .cas import CasConfig
from .consent_config import ConsentConfig
from .database import DatabaseConfig
from .emailconfig import EmailConfig
+from .experimental import ExperimentalConfig
from .federation import FederationConfig
from .groups import GroupsConfig
from .jwt_config import JWTConfig

@@ -59,6 +60,7 @@ class HomeServerConfig(RootConfig):
    config_classes = [
        MeowConfig,
        ServerConfig,
+       ExperimentalConfig,
        TlsConfig,
        FederationConfig,
        CacheConfig,

@@ -14,7 +14,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.

-import string
+from collections import Counter
from typing import Iterable, Optional, Tuple, Type

import attr

@@ -43,8 +43,17 @@ class OIDCConfig(Config):
        except DependencyException as e:
            raise ConfigError(e.message) from e

-        public_baseurl = self.public_baseurl
-        self.oidc_callback_url = public_baseurl + "_synapse/oidc/callback"
+        # check we don't have any duplicate idp_ids now. (The SSO handler will also
+        # check for duplicates when the REST listeners get registered, but that happens
+        # after synapse has forked so doesn't give nice errors.)
+        c = Counter([i.idp_id for i in self.oidc_providers])
+        for idp_id, count in c.items():
+            if count > 1:
+                raise ConfigError(
+                    "Multiple OIDC providers have the idp_id %r." % idp_id
+                )
+
+        self.oidc_callback_url = self.public_baseurl + "_synapse/client/oidc/callback"

    @property
    def oidc_enabled(self) -> bool:

@@ -68,10 +77,14 @@ class OIDCConfig(Config):
        #       offer the user a choice of login mechanisms.
        #
        #   idp_icon: An optional icon for this identity provider, which is presented
-       #       by identity picker pages. If given, must be an MXC URI of the format
-       #       mxc://<server-name>/<media-id>. (An easy way to obtain such an MXC URI
-       #       is to upload an image to an (unencrypted) room and then copy the "url"
-       #       from the source of the event.)
+       #       by clients and Synapse's own IdP picker page. If given, must be an
+       #       MXC URI of the format mxc://<server-name>/<media-id>. (An easy way to
+       #       obtain such an MXC URI is to upload an image to an (unencrypted) room
+       #       and then copy the "url" from the source of the event.)
+       #
+       #   idp_brand: An optional brand for this identity provider, allowing clients
+       #       to style the login flow according to the identity provider in question.
+       #       See the spec for possible options here.
        #
        #   discover: set to 'false' to disable the use of the OIDC discovery mechanism
        #       to discover endpoints. Defaults to true.

@@ -132,17 +145,21 @@ class OIDCConfig(Config):
        #
        #       For the default provider, the following settings are available:
        #
-       #         sub: name of the claim containing a unique identifier for the
-       #             user. Defaults to 'sub', which OpenID Connect compliant
-       #             providers should provide.
+       #         subject_claim: name of the claim containing a unique identifier
+       #             for the user. Defaults to 'sub', which OpenID Connect
+       #             compliant providers should provide.
        #
        #         localpart_template: Jinja2 template for the localpart of the MXID.
        #             If this is not set, the user will be prompted to choose their
-       #             own username.
+       #             own username (see 'sso_auth_account_details.html' in the 'sso'
+       #             section of this file).
        #
        #         display_name_template: Jinja2 template for the display name to set
        #             on first login. If unset, no displayname will be set.
        #
+       #         email_template: Jinja2 template for the email address of the user.
+       #             If unset, no email address will be added to the account.
+       #
        #         extra_attributes: a map of Jinja2 templates for extra attributes
        #             to send back to the client during login.
        #             Note that these are non-standard and clients will ignore them

@@ -178,6 +195,12 @@ class OIDCConfig(Config):
        #    userinfo_endpoint: "https://accounts.example.com/userinfo"
        #    jwks_uri: "https://accounts.example.com/.well-known/jwks.json"
        #    skip_verification: true
+       #    user_mapping_provider:
+       #      config:
+       #        subject_claim: "id"
+       #        localpart_template: "{{ user.login }}"
+       #        display_name_template: "{{ user.name }}"
+       #        email_template: "{{ user.email }}"

        # For use with Keycloak
        #

@@ -192,6 +215,7 @@ class OIDCConfig(Config):
        #
        #- idp_id: github
        #  idp_name: Github
+       #  idp_brand: org.matrix.github
        #  discover: false
        #  issuer: "https://github.com/"
        #  client_id: "your-client-id" # TO BE FILLED

@@ -215,11 +239,22 @@ OIDC_PROVIDER_CONFIG_SCHEMA = {
    "type": "object",
    "required": ["issuer", "client_id", "client_secret"],
    "properties": {
-       # TODO: fix the maxLength here depending on what MSC2528 decides
-       # remember that we prefix the ID given here with `oidc-`
-       "idp_id": {"type": "string", "minLength": 1, "maxLength": 128},
+       "idp_id": {
+           "type": "string",
+           "minLength": 1,
+           # MSC2858 allows a maxlen of 255, but we prefix with "oidc-"
+           "maxLength": 250,
+           "pattern": "^[A-Za-z0-9._~-]+$",
+       },
        "idp_name": {"type": "string"},
        "idp_icon": {"type": "string"},
+       "idp_brand": {
+           "type": "string",
+           # MSC2758-style namespaced identifier
+           "minLength": 1,
+           "maxLength": 255,
+           "pattern": "^[a-z][a-z0-9_.-]*$",
+       },
        "discover": {"type": "boolean"},
        "issuer": {"type": "string"},
        "client_id": {"type": "string"},

@@ -338,25 +373,8 @@ def _parse_oidc_config_dict(
            config_path + ("user_mapping_provider", "module"),
        )

-   # MSC2858 will apply certain limits in what can be used as an IdP id, so let's
-   # enforce those limits now.
-   # TODO: factor out this stuff to a generic function
    idp_id = oidc_config.get("idp_id", "oidc")
-
-   # TODO: update this validity check based on what MSC2858 decides.
-   valid_idp_chars = set(string.ascii_lowercase + string.digits + "-._")
-
-   if any(c not in valid_idp_chars for c in idp_id):
-       raise ConfigError(
-           'idp_id may only contain a-z, 0-9, "-", ".", "_"',
-           config_path + ("idp_id",),
-       )
-
-   if idp_id[0] not in string.ascii_lowercase:
-       raise ConfigError(
-           "idp_id must start with a-z", config_path + ("idp_id",),
-       )
-
    # prefix the given IDP with a prefix specific to the SSO mechanism, to avoid
    # clashes with other mechs (such as SAML, CAS).
    #

@@ -382,6 +400,7 @@ def _parse_oidc_config_dict(
        idp_id=idp_id,
        idp_name=oidc_config.get("idp_name", "OIDC"),
        idp_icon=idp_icon,
+       idp_brand=oidc_config.get("idp_brand"),
        discover=oidc_config.get("discover", True),
        issuer=oidc_config["issuer"],
        client_id=oidc_config["client_id"],

@@ -412,6 +431,9 @@ class OidcProviderConfig:
    # Optional MXC URI for icon for this IdP.
    idp_icon = attr.ib(type=Optional[str])

+   # Optional brand identifier for this IdP.
+   idp_brand = attr.ib(type=Optional[str])
+
    # whether the OIDC discovery mechanism is used to discover endpoints
    discover = attr.ib(type=bool)
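
Illustrative sketch (not part of the upstream diff): the JSON schema above replaces the hand-rolled idp_id checks. Its pattern is ^[A-Za-z0-9._~-]+$ with a maxLength of 250, leaving room for the "oidc-" prefix within MSC2858's 255-character limit. An equivalent standalone check:

    import re

    IDP_ID_PATTERN = re.compile(r"^[A-Za-z0-9._~-]+$")

    def is_valid_idp_id(idp_id: str) -> bool:
        # len("oidc-") + 250 == 255, the MSC2858 maximum.
        return 1 <= len(idp_id) <= 250 and bool(IDP_ID_PATTERN.match(idp_id))

    assert is_valid_idp_id("github")
    assert not is_valid_idp_id("bad idp id")  # spaces are rejected by the pattern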

@@ -24,7 +24,7 @@ class RateLimitConfig:
        defaults={"per_second": 0.17, "burst_count": 3.0},
    ):
        self.per_second = config.get("per_second", defaults["per_second"])
-       self.burst_count = config.get("burst_count", defaults["burst_count"])
+       self.burst_count = int(config.get("burst_count", defaults["burst_count"]))


class FederationRateLimitConfig:

@@ -102,6 +102,20 @@ class RatelimitConfig(Config):
            defaults={"per_second": 0.01, "burst_count": 3},
        )

+       self.rc_3pid_validation = RateLimitConfig(
+           config.get("rc_3pid_validation") or {},
+           defaults={"per_second": 0.003, "burst_count": 5},
+       )
+
+       self.rc_invites_per_room = RateLimitConfig(
+           config.get("rc_invites", {}).get("per_room", {}),
+           defaults={"per_second": 0.3, "burst_count": 10},
+       )
+       self.rc_invites_per_user = RateLimitConfig(
+           config.get("rc_invites", {}).get("per_user", {}),
+           defaults={"per_second": 0.003, "burst_count": 5},
+       )
+
    def generate_config_section(self, **kwargs):
        return """\
        ## Ratelimiting ##

@@ -131,6 +145,9 @@ class RatelimitConfig(Config):
        #   users are joining rooms the server is already in (this is cheap) vs
        #   "remote" for when users are trying to join rooms not on the server (which
        #   can be more expensive)
+       # - one for ratelimiting how often a user or IP can attempt to validate a 3PID.
+       # - two for ratelimiting how often invites can be sent in a room or to a
+       #   specific user.
        #
        # The defaults are as shown below.
        #

@@ -164,7 +181,18 @@ class RatelimitConfig(Config):
        #  remote:
        #    per_second: 0.01
        #    burst_count: 3
-
+       #
+       #rc_3pid_validation:
+       #  per_second: 0.003
+       #  burst_count: 5
+       #
+       #rc_invites:
+       #  per_room:
+       #    per_second: 0.3
+       #    burst_count: 10
+       #  per_user:
+       #    per_second: 0.003
+       #    burst_count: 5

        # Ratelimiting settings for incoming federation
        #
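
Illustrative sketch (not part of the upstream diff): each RateLimitConfig pairs a sustained rate (per_second) with a burst_count, which behaves like a token bucket: up to burst_count actions may happen back to back, refilled at per_second. Synapse's real Ratelimiter differs in detail; this toy model, using the rc_invites per_room defaults (0.3/s, burst 10), only captures those semantics:

    class TokenBucket:
        def __init__(self, per_second: float, burst_count: int):
            self.per_second = per_second
            self.burst_count = burst_count
            self.tokens = float(burst_count)
            self.last = 0.0

        def allow(self, now: float) -> bool:
            # Refill for the elapsed time, capped at the burst size.
            elapsed = now - self.last
            self.tokens = min(self.burst_count, self.tokens + elapsed * self.per_second)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False

    bucket = TokenBucket(per_second=0.3, burst_count=10)
    assert all(bucket.allow(0.0) for _ in range(10))  # a burst of 10 invites passes
    assert not bucket.allow(0.0)  # the 11th immediate invite is ratelimited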

@@ -176,9 +176,7 @@ class RegistrationConfig(Config):
        self.session_lifetime = session_lifetime

        # The success template used during fallback auth.
-       self.fallback_success_template = self.read_templates(
-           ["auth_success.html"], autoescape=True
-       )[0]
+       self.fallback_success_template = self.read_template("auth_success.html")

    def generate_config_section(self, generate_secrets=False, **kwargs):
        if generate_secrets:
@@ -194,8 +194,8 @@ class SAML2Config(Config):
        optional_attributes.add(self.saml2_grandfathered_mxid_source_attribute)
        optional_attributes -= required_attributes

-       metadata_url = public_baseurl + "_matrix/saml2/metadata.xml"
-       response_url = public_baseurl + "_matrix/saml2/authn_response"
+       metadata_url = public_baseurl + "_synapse/client/saml2/metadata.xml"
+       response_url = public_baseurl + "_synapse/client/saml2/authn_response"
        return {
            "entityid": metadata_url,
            "service": {

@@ -233,10 +233,10 @@ class SAML2Config(Config):
        # enable SAML login.
        #
        # Once SAML support is enabled, a metadata file will be exposed at
-       # https://<server>:<port>/_matrix/saml2/metadata.xml, which you may be able to
+       # https://<server>:<port>/_synapse/client/saml2/metadata.xml, which you may be able to
        # use to configure your SAML IdP with. Alternatively, you can manually configure
        # the IdP to use an ACS location of
-       # https://<server>:<port>/_matrix/saml2/authn_response.
+       # https://<server>:<port>/_synapse/client/saml2/authn_response.
        #
        saml2_config:
          # `sp_config` is the configuration for the pysaml2 Service Provider.
@@ -27,7 +27,7 @@ class SSOConfig(Config):
        sso_config = config.get("sso") or {}  # type: Dict[str, Any]

        # The sso-specific template_dir
-       template_dir = sso_config.get("template_dir")
+       self.sso_template_dir = sso_config.get("template_dir")

        # Read templates from disk
        (

@@ -48,7 +48,7 @@ class SSOConfig(Config):
                "sso_auth_success.html",
                "sso_auth_bad_user.html",
            ],
-           template_dir,
+           self.sso_template_dir,
        )

        # These templates have no placeholders, so render them here

@@ -113,8 +113,13 @@ class SSOConfig(Config):
        #
        # * providers: a list of available Identity Providers. Each element is
        #   an object with the following attributes:
+       #
        #     * idp_id: unique identifier for the IdP
        #     * idp_name: user-facing name for the IdP
+       #     * idp_icon: if specified in the IdP config, an MXC URI for an icon
+       #       for the IdP
+       #     * idp_brand: if specified in the IdP config, a textual identifier
+       #       for the brand of the IdP
        #
        # The rendered HTML page should contain a form which submits its results
        # back as a GET request, with the following query parameters:

@@ -124,10 +129,62 @@ class SSOConfig(Config):
        #
        #   * idp: the 'idp_id' of the chosen IDP.
        #
+       # * HTML page to prompt new users to enter a userid and confirm other
+       #   details: 'sso_auth_account_details.html'. This is only shown if the
+       #   SSO implementation (with any user_mapping_provider) does not return
+       #   a localpart.
+       #
+       #   When rendering, this template is given the following variables:
+       #
+       #     * server_name: the homeserver's name.
+       #
+       #     * idp: details of the SSO Identity Provider that the user logged in
+       #       with: an object with the following attributes:
+       #
+       #         * idp_id: unique identifier for the IdP
+       #         * idp_name: user-facing name for the IdP
+       #         * idp_icon: if specified in the IdP config, an MXC URI for an icon
+       #           for the IdP
+       #         * idp_brand: if specified in the IdP config, a textual identifier
+       #           for the brand of the IdP
+       #
+       #     * user_attributes: an object containing details about the user that
+       #       we received from the IdP. May have the following attributes:
+       #
+       #         * display_name: the user's display_name
+       #         * emails: a list of email addresses
+       #
+       #   The template should render a form which submits the following fields:
+       #
+       #     * username: the localpart of the user's chosen user id
+       #
+       # * HTML page allowing the user to consent to the server's terms and
+       #   conditions. This is only shown for new users, and only if
+       #   `user_consent.require_at_registration` is set.
+       #
+       #   When rendering, this template is given the following variables:
+       #
+       #     * server_name: the homeserver's name.
+       #
+       #     * user_id: the user's matrix proposed ID.
+       #
+       #     * user_profile.display_name: the user's proposed display name, if any.
+       #
+       #     * consent_version: the version of the terms that the user will be
+       #       shown
+       #
+       #     * terms_url: a link to the page showing the terms.
+       #
+       #   The template should render a form which submits the following fields:
+       #
+       #     * accepted_version: the version of the terms accepted by the user
+       #       (ie, 'consent_version' from the input variables).
+       #
        # * HTML page for a confirmation step before redirecting back to the client
        #   with the login token: 'sso_redirect_confirm.html'.
        #
-       #   When rendering, this template is given three variables:
+       #   When rendering, this template is given the following variables:
        #
        #     * redirect_url: the URL the user is about to be redirected to. Needs
        #       manual escaping (see
        #       https://jinja.palletsprojects.com/en/2.11.x/templates/#html-escaping).

@@ -140,6 +197,17 @@ class SSOConfig(Config):
        #
        #     * server_name: the homeserver's name.
        #
+       #     * new_user: a boolean indicating whether this is the user's first time
+       #       logging in.
+       #
        #     * user_id: the user's matrix ID.
        #
+       #     * user_profile.avatar_url: an MXC URI for the user's avatar, if any.
+       #       None if the user has not set an avatar.
+       #
+       #     * user_profile.display_name: the user's display name. None if the user
+       #       has not set a display name.
+       #
        # * HTML page which notifies the user that they are authenticating to confirm
        #   an operation on their account during the user interactive authentication
        #   process: 'sso_auth_confirm.html'.

@@ -151,6 +219,16 @@ class SSOConfig(Config):
        #
        #   * description: the operation which the user is being asked to confirm
        #
+       #   * idp: details of the Identity Provider that we will use to confirm
+       #     the user's identity: an object with the following attributes:
+       #
+       #       * idp_id: unique identifier for the IdP
+       #       * idp_name: user-facing name for the IdP
+       #       * idp_icon: if specified in the IdP config, an MXC URI for an icon
+       #         for the IdP
+       #       * idp_brand: if specified in the IdP config, a textual identifier
+       #         for the brand of the IdP
+       #
        # * HTML page shown after a successful user interactive authentication session:
        #   'sso_auth_success.html'.
        #
@@ -125,19 +125,24 @@ class FederationPolicyForHTTPS:
        self._no_verify_ssl_context = _no_verify_ssl.getContext()
        self._no_verify_ssl_context.set_info_callback(_context_info_cb)

-   def get_options(self, host: bytes):
+       self._should_verify = self._config.federation_verify_certificates
+
+       self._federation_certificate_verification_whitelist = (
+           self._config.federation_certificate_verification_whitelist
+       )

+   def get_options(self, host: bytes):
        # IPolicyForHTTPS.get_options takes bytes, but we want to compare
        # against the str whitelist. The hostnames in the whitelist are already
        # IDNA-encoded like the hosts will be here.
        ascii_host = host.decode("ascii")

        # Check if certificate verification has been enabled
-       should_verify = self._config.federation_verify_certificates
+       should_verify = self._should_verify

        # Check if we've disabled certificate verification for this host
-       if should_verify:
-           for regex in self._config.federation_certificate_verification_whitelist:
+       if self._should_verify:
+           for regex in self._federation_certificate_verification_whitelist:
                if regex.match(ascii_host):
                    should_verify = False
                    break
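
Illustrative sketch (not part of the upstream diff): get_options receives the hostname as bytes, while the whitelist holds patterns compiled against IDNA-encoded str hostnames, hence the ascii decode above. A minimal model of the lookup (the whitelist entry below is hypothetical):

    import re
    from typing import List, Pattern

    def should_verify_cert(host: bytes, whitelist: List[Pattern[str]], verify_by_default: bool = True) -> bool:
        ascii_host = host.decode("ascii")  # hostnames are already IDNA-encoded, so ASCII-safe
        if verify_by_default:
            for regex in whitelist:
                if regex.match(ascii_host):
                    return False  # this host is exempt from certificate verification
        return verify_by_default

    whitelist = [re.compile(r"(.*\.)?badcert\.example\.com")]
    assert should_verify_cert(b"matrix.org", whitelist)
    assert not should_verify_cert(b"federation.badcert.example.com", whitelist)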

@@ -18,6 +18,7 @@ import copy
import itertools
import logging
from typing import (
+   TYPE_CHECKING,
    Any,
    Awaitable,
    Callable,

@@ -26,7 +27,6 @@ from typing import (
    List,
    Mapping,
    Optional,
-   Sequence,
    Tuple,
    TypeVar,
    Union,

@@ -61,6 +61,9 @@ from synapse.util import unwrapFirstError
from synapse.util.caches.expiringcache import ExpiringCache
from synapse.util.retryutils import NotRetryingDestination

+if TYPE_CHECKING:
+   from synapse.app.homeserver import HomeServer
+
logger = logging.getLogger(__name__)

sent_queries_counter = Counter("synapse_federation_client_sent_queries", "", ["type"])

@@ -80,10 +83,10 @@ class InvalidResponseError(RuntimeError):


class FederationClient(FederationBase):
-   def __init__(self, hs):
+   def __init__(self, hs: "HomeServer"):
        super().__init__(hs)

-       self.pdu_destination_tried = {}
+       self.pdu_destination_tried = {}  # type: Dict[str, Dict[str, int]]
        self._clock.looping_call(self._clear_tried_cache, 60 * 1000)
        self.state = hs.get_state_handler()
        self.transport_layer = hs.get_federation_transport_client()

@@ -116,33 +119,32 @@ class FederationClient(FederationBase):
        self.pdu_destination_tried[event_id] = destination_dict

    @log_function
-   def make_query(
+   async def make_query(
        self,
-       destination,
-       query_type,
-       args,
-       retry_on_dns_fail=False,
-       ignore_backoff=False,
-   ):
+       destination: str,
+       query_type: str,
+       args: dict,
+       retry_on_dns_fail: bool = False,
+       ignore_backoff: bool = False,
+   ) -> JsonDict:
        """Sends a federation Query to a remote homeserver of the given type
        and arguments.

        Args:
-           destination (str): Domain name of the remote homeserver
-           query_type (str): Category of the query type; should match the
+           destination: Domain name of the remote homeserver
+           query_type: Category of the query type; should match the
                handler name used in register_query_handler().
-           args (dict): Mapping of strings to strings containing the details
+           args: Mapping of strings to strings containing the details
                of the query request.
-           ignore_backoff (bool): true to ignore the historical backoff data
+           ignore_backoff: true to ignore the historical backoff data
                and try the request anyway.

        Returns:
-           a Awaitable which will eventually yield a JSON object from the
-           response
+           The JSON object from the response
        """
        sent_queries_counter.labels(query_type).inc()

-       return self.transport_layer.make_query(
+       return await self.transport_layer.make_query(
            destination,
            query_type,
            args,

@@ -151,42 +153,52 @@ class FederationClient(FederationBase):
        )

    @log_function
-   def query_client_keys(self, destination, content, timeout):
+   async def query_client_keys(
+       self, destination: str, content: JsonDict, timeout: int
+   ) -> JsonDict:
        """Query device keys for a device hosted on a remote server.

        Args:
-           destination (str): Domain name of the remote homeserver
-           content (dict): The query content.
+           destination: Domain name of the remote homeserver
+           content: The query content.

        Returns:
-           an Awaitable which will eventually yield a JSON object from the
-           response
+           The JSON object from the response
        """
        sent_queries_counter.labels("client_device_keys").inc()
-       return self.transport_layer.query_client_keys(destination, content, timeout)
+       return await self.transport_layer.query_client_keys(
+           destination, content, timeout
+       )

    @log_function
-   def query_user_devices(self, destination, user_id, timeout=30000):
+   async def query_user_devices(
+       self, destination: str, user_id: str, timeout: int = 30000
+   ) -> JsonDict:
        """Query the device keys for a list of user ids hosted on a remote
        server.
        """
        sent_queries_counter.labels("user_devices").inc()
-       return self.transport_layer.query_user_devices(destination, user_id, timeout)
+       return await self.transport_layer.query_user_devices(
+           destination, user_id, timeout
+       )

    @log_function
-   def claim_client_keys(self, destination, content, timeout):
+   async def claim_client_keys(
+       self, destination: str, content: JsonDict, timeout: int
+   ) -> JsonDict:
        """Claims one-time keys for a device hosted on a remote server.

        Args:
-           destination (str): Domain name of the remote homeserver
-           content (dict): The query content.
+           destination: Domain name of the remote homeserver
+           content: The query content.

        Returns:
-           an Awaitable which will eventually yield a JSON object from the
-           response
+           The JSON object from the response
        """
        sent_queries_counter.labels("client_one_time_keys").inc()
-       return self.transport_layer.claim_client_keys(destination, content, timeout)
+       return await self.transport_layer.claim_client_keys(
+           destination, content, timeout
+       )

    async def backfill(
        self, dest: str, room_id: str, limit: int, extremities: Iterable[str]

@@ -195,10 +207,10 @@ class FederationClient(FederationBase):
        given destination server.

        Args:
-           dest (str): The remote homeserver to ask.
-           room_id (str): The room_id to backfill.
-           limit (int): The maximum number of events to return.
-           extremities (list): our current backwards extremities, to backfill from
+           dest: The remote homeserver to ask.
+           room_id: The room_id to backfill.
+           limit: The maximum number of events to return.
+           extremities: our current backwards extremities, to backfill from
        """
        logger.debug("backfill extrem=%s", extremities)

@@ -370,7 +382,7 @@ class FederationClient(FederationBase):
                for events that have failed their checks

        Returns:
-           Deferred : A list of PDUs that have valid signatures and hashes.
+           A list of PDUs that have valid signatures and hashes.
        """
        deferreds = self._check_sigs_and_hashes(room_version, pdus)

@@ -418,7 +430,9 @@ class FederationClient(FederationBase):
        else:
            return [p for p in valid_pdus if p]

-   async def get_event_auth(self, destination, room_id, event_id):
+   async def get_event_auth(
+       self, destination: str, room_id: str, event_id: str
+   ) -> List[EventBase]:
        res = await self.transport_layer.get_event_auth(destination, room_id, event_id)

        room_version = await self.store.get_room_version(room_id)

@@ -700,18 +714,16 @@ class FederationClient(FederationBase):

        return await self._try_destination_list("send_join", destinations, send_request)

-   async def _do_send_join(self, destination: str, pdu: EventBase):
+   async def _do_send_join(self, destination: str, pdu: EventBase) -> JsonDict:
        time_now = self._clock.time_msec()

        try:
-           content = await self.transport_layer.send_join_v2(
+           return await self.transport_layer.send_join_v2(
                destination=destination,
                room_id=pdu.room_id,
                event_id=pdu.event_id,
                content=pdu.get_pdu_json(time_now),
            )
-
-           return content
        except HttpResponseException as e:
            if e.code in [400, 404]:
                err = e.to_synapse_error()

@@ -769,7 +781,7 @@ class FederationClient(FederationBase):
        time_now = self._clock.time_msec()

        try:
-           content = await self.transport_layer.send_invite_v2(
+           return await self.transport_layer.send_invite_v2(
                destination=destination,
                room_id=pdu.room_id,
                event_id=pdu.event_id,

@@ -779,7 +791,6 @@ class FederationClient(FederationBase):
                    "invite_room_state": pdu.unsigned.get("invite_room_state", []),
                },
            )
-           return content
        except HttpResponseException as e:
            if e.code in [400, 404]:
                err = e.to_synapse_error()

@@ -799,7 +810,7 @@ class FederationClient(FederationBase):
                    "User's homeserver does not support this room version",
                    Codes.UNSUPPORTED_ROOM_VERSION,
                )
-           elif e.code == 403:
+           elif e.code in (403, 429):
                raise e.to_synapse_error()
            else:
                raise

@@ -842,18 +853,16 @@ class FederationClient(FederationBase):
            "send_leave", destinations, send_request
        )

-   async def _do_send_leave(self, destination, pdu):
+   async def _do_send_leave(self, destination: str, pdu: EventBase) -> JsonDict:
        time_now = self._clock.time_msec()

        try:
-           content = await self.transport_layer.send_leave_v2(
+           return await self.transport_layer.send_leave_v2(
                destination=destination,
                room_id=pdu.room_id,
                event_id=pdu.event_id,
                content=pdu.get_pdu_json(time_now),
            )
-
-           return content
        except HttpResponseException as e:
            if e.code in [400, 404]:
                err = e.to_synapse_error()

@@ -879,7 +888,7 @@ class FederationClient(FederationBase):
        # content.
        return resp[1]

-   def get_public_rooms(
+   async def get_public_rooms(
        self,
        remote_server: str,
        limit: Optional[int] = None,

@@ -887,7 +896,7 @@ class FederationClient(FederationBase):
        search_filter: Optional[Dict] = None,
        include_all_networks: bool = False,
        third_party_instance_id: Optional[str] = None,
-   ):
+   ) -> JsonDict:
        """Get the list of public rooms from a remote homeserver

        Args:

@@ -901,8 +910,7 @@ class FederationClient(FederationBase):
                party instance

        Returns:
-           Awaitable[Dict[str, Any]]: The response from the remote server, or None if
-           `remote_server` is the same as the local server_name
+           The response from the remote server.

        Raises:
            HttpResponseException: There was an exception returned from the remote server

@@ -910,7 +918,7 @@ class FederationClient(FederationBase):
            requests over federation

        """
-       return self.transport_layer.get_public_rooms(
+       return await self.transport_layer.get_public_rooms(
            remote_server,
            limit,
            since_token,

@@ -923,7 +931,7 @@ class FederationClient(FederationBase):
        self,
        destination: str,
        room_id: str,
-       earliest_events_ids: Sequence[str],
+       earliest_events_ids: Iterable[str],
        latest_events: Iterable[EventBase],
        limit: int,
        min_depth: int,

@@ -974,7 +982,9 @@ class FederationClient(FederationBase):

        return signed_events

-   async def forward_third_party_invite(self, destinations, room_id, event_dict):
+   async def forward_third_party_invite(
+       self, destinations: Iterable[str], room_id: str, event_dict: JsonDict
+   ) -> None:
        for destination in destinations:
            if destination == self.server_name:
                continue

@@ -983,7 +993,7 @@ class FederationClient(FederationBase):
                await self.transport_layer.exchange_third_party_invite(
                    destination=destination, room_id=room_id, event_dict=event_dict
                )
-               return None
+               return
            except CodeMessageException:
                raise
            except Exception as e:

@@ -995,7 +1005,7 @@ class FederationClient(FederationBase):

    async def get_room_complexity(
        self, destination: str, room_id: str
-   ) -> Optional[dict]:
+   ) -> Optional[JsonDict]:
        """
        Fetch the complexity of a remote room from another server.

@@ -1008,10 +1018,9 @@ class FederationClient(FederationBase):
            could not fetch the complexity.
        """
        try:
-           complexity = await self.transport_layer.get_room_complexity(
+           return await self.transport_layer.get_room_complexity(
                destination=destination, room_id=room_id
            )
-           return complexity
        except CodeMessageException as e:
            # We didn't manage to get it -- probably a 404. We are okay if other
            # servers don't give it to us.
@@ -142,6 +142,8 @@ class FederationSender:
            self._wake_destinations_needing_catchup,
        )

+       self._external_cache = hs.get_external_cache()
+
    def _get_per_destination_queue(self, destination: str) -> PerDestinationQueue:
        """Get or create a PerDestinationQueue for the given destination

@@ -197,22 +199,40 @@ class FederationSender:
            if not event.internal_metadata.should_proactively_send():
                return

-           try:
-               # Get the state from before the event.
-               # We need to make sure that this is the state from before
-               # the event and not from after it.
-               # Otherwise if the last member on a server in a room is
-               # banned then it won't receive the event because it won't
-               # be in the room after the ban.
-               destinations = await self.state.get_hosts_in_room_at_events(
-                   event.room_id, event_ids=event.prev_event_ids()
-               )
-           except Exception:
-               logger.exception(
-                   "Failed to calculate hosts in room for event: %s",
-                   event.event_id,
-               )
-               return
+           destinations = None  # type: Optional[Set[str]]
+           if not event.prev_event_ids():
+               # If there are no prev event IDs then the state is empty
+               # and so no remote servers in the room
+               destinations = set()
+           else:
+               # We check the external cache for the destinations, which is
+               # stored per state group.
+
+               sg = await self._external_cache.get(
+                   "event_to_prev_state_group", event.event_id
+               )
+               if sg:
+                   destinations = await self._external_cache.get(
+                       "get_joined_hosts", str(sg)
+                   )
+
+           if destinations is None:
+               try:
+                   # Get the state from before the event.
+                   # We need to make sure that this is the state from before
+                   # the event and not from after it.
+                   # Otherwise if the last member on a server in a room is
+                   # banned then it won't receive the event because it won't
+                   # be in the room after the ban.
+                   destinations = await self.state.get_hosts_in_room_at_events(
+                       event.room_id, event_ids=event.prev_event_ids()
+                   )
+               except Exception:
+                   logger.exception(
+                       "Failed to calculate hosts in room for event: %s",
+                       event.event_id,
+                   )
+                   return

            destinations = {
                d
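
Illustrative sketch (not part of the upstream diff): the hunk above consults the Redis-backed external cache in two steps, event ID to previous state group, then state group to joined hosts; only a hit at both levels skips the expensive state computation. A simplified model, where get_hosts_fallback stands in for state.get_hosts_in_room_at_events:

    async def get_destinations(cache, event_id: str, get_hosts_fallback):
        destinations = None
        sg = await cache.get("event_to_prev_state_group", event_id)
        if sg:
            # Joined hosts are cached per state group, not per event.
            destinations = await cache.get("get_joined_hosts", str(sg))
        if destinations is None:
            # A miss at either level: fall back to computing from room state.
            destinations = await get_hosts_fallback(event_id)
        return destinations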

@@ -14,6 +14,7 @@
# limitations under the License.

import logging
+from typing import TYPE_CHECKING

import twisted
import twisted.internet.error

@@ -22,6 +23,9 @@ from twisted.web.resource import Resource

from synapse.app import check_bind_error

+if TYPE_CHECKING:
+   from synapse.app.homeserver import HomeServer
+
logger = logging.getLogger(__name__)

ACME_REGISTER_FAIL_ERROR = """

@@ -35,12 +39,12 @@ solutions, please read https://github.com/matrix-org/synapse/blob/master/docs/AC


class AcmeHandler:
-   def __init__(self, hs):
+   def __init__(self, hs: "HomeServer"):
        self.hs = hs
        self.reactor = hs.get_reactor()
        self._acme_domain = hs.config.acme_domain

-   async def start_listening(self):
+   async def start_listening(self) -> None:
        from synapse.handlers import acme_issuing_service

        # Configure logging for txacme, if you need to debug

@@ -85,7 +89,7 @@ class AcmeHandler:
            logger.error(ACME_REGISTER_FAIL_ERROR)
            raise

-   async def provision_certificate(self):
+   async def provision_certificate(self) -> None:

        logger.warning("Reprovisioning %s", self._acme_domain)

@@ -110,5 +114,3 @@ class AcmeHandler:
        except Exception:
            logger.exception("Failed saving!")
            raise
-
-       return True
@@ -22,8 +22,10 @@ only need (and may only have available) if we are doing ACME, so is designed to
imported conditionally.
"""
import logging
+from typing import Dict, Iterable, List

import attr
+import pem
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
from josepy import JWKRSA

@@ -36,20 +38,27 @@ from txacme.util import generate_private_key
from zope.interface import implementer

from twisted.internet import defer
+from twisted.internet.interfaces import IReactorTCP
from twisted.python.filepath import FilePath
from twisted.python.url import URL
+from twisted.web.resource import IResource

logger = logging.getLogger(__name__)


-def create_issuing_service(reactor, acme_url, account_key_file, well_known_resource):
+def create_issuing_service(
+   reactor: IReactorTCP,
+   acme_url: str,
+   account_key_file: str,
+   well_known_resource: IResource,
+) -> AcmeIssuingService:
    """Create an ACME issuing service, and attach it to a web Resource

    Args:
        reactor: twisted reactor
-       acme_url (str): URL to use to request certificates
-       account_key_file (str): where to store the account key
-       well_known_resource (twisted.web.IResource): web resource for .well-known.
+       acme_url: URL to use to request certificates
+       account_key_file: where to store the account key
+       well_known_resource: web resource for .well-known.
            we will attach a child resource for "acme-challenge".

    Returns:

@@ -83,18 +92,20 @@ class ErsatzStore:
    A store that only stores in memory.
    """

-   certs = attr.ib(default=attr.Factory(dict))
+   certs = attr.ib(type=Dict[bytes, List[bytes]], default=attr.Factory(dict))

-   def store(self, server_name, pem_objects):
+   def store(
+       self, server_name: bytes, pem_objects: Iterable[pem.AbstractPEMObject]
+   ) -> defer.Deferred:
        self.certs[server_name] = [o.as_bytes() for o in pem_objects]
        return defer.succeed(None)


-def load_or_create_client_key(key_file):
+def load_or_create_client_key(key_file: str) -> JWKRSA:
    """Load the ACME account key from a file, creating it if it does not exist.

    Args:
-       key_file (str): name of the file to use as the account key
+       key_file: name of the file to use as the account key
    """
    # this is based on txacme.endpoint.load_or_create_client_key, but doesn't
    # hardcode the 'client.key' filename
@@ -61,6 +61,7 @@ from synapse.http.site import SynapseRequest
from synapse.logging.context import defer_to_thread
from synapse.metrics.background_process_metrics import run_as_background_process
from synapse.module_api import ModuleApi
+from synapse.storage.roommember import ProfileInfo
from synapse.types import JsonDict, Requester, UserID
from synapse.util import stringutils as stringutils
from synapse.util.async_helpers import maybe_awaitable

@@ -567,16 +568,6 @@ class AuthHandler(BaseHandler):
                    session.session_id, login_type, result
                )
            except LoginError as e:
-               if login_type == LoginType.EMAIL_IDENTITY:
-                   # riot used to have a bug where it would request a new
-                   # validation token (thus sending a new email) each time it
-                   # got a 401 with a 'flows' field.
-                   # (https://github.com/vector-im/vector-web/issues/2447).
-                   #
-                   # Grandfather in the old behaviour for now to avoid
-                   # breaking old riot deployments.
-                   raise
-
                # this step failed. Merge the error dict into the response
                # so that the client can have another go.
                errordict = e.error_dict()

@@ -1387,7 +1378,9 @@ class AuthHandler(BaseHandler):
        )

        return self._sso_auth_confirm_template.render(
-           description=session.description, redirect_url=redirect_url,
+           description=session.description,
+           redirect_url=redirect_url,
+           idp=sso_auth_provider,
        )

    async def complete_sso_login(

@@ -1396,6 +1389,7 @@ class AuthHandler(BaseHandler):
        request: Request,
        client_redirect_url: str,
        extra_attributes: Optional[JsonDict] = None,
+       new_user: bool = False,
    ):
        """Having figured out a mxid for this user, complete the HTTP request

@@ -1406,6 +1400,8 @@ class AuthHandler(BaseHandler):
                process.
            extra_attributes: Extra attributes which will be passed to the client
                during successful login. Must be JSON serializable.
+           new_user: True if we should use wording appropriate to a user who has just
+               registered.
        """
        # If the account has been deactivated, do not proceed with the login
        # flow.

@@ -1414,8 +1410,17 @@ class AuthHandler(BaseHandler):
            respond_with_html(request, 403, self._sso_account_deactivated_template)
            return

+       profile = await self.store.get_profileinfo(
+           UserID.from_string(registered_user_id).localpart
+       )
+
        self._complete_sso_login(
-           registered_user_id, request, client_redirect_url, extra_attributes
+           registered_user_id,
+           request,
+           client_redirect_url,
+           extra_attributes,
+           new_user=new_user,
+           user_profile_data=profile,
        )

    def _complete_sso_login(

@@ -1424,12 +1429,18 @@ class AuthHandler(BaseHandler):
        request: Request,
        client_redirect_url: str,
        extra_attributes: Optional[JsonDict] = None,
+       new_user: bool = False,
+       user_profile_data: Optional[ProfileInfo] = None,
    ):
        """
        The synchronous portion of complete_sso_login.

        This exists purely for backwards compatibility of synapse.module_api.ModuleApi.
        """

+       if user_profile_data is None:
+           user_profile_data = ProfileInfo(None, None)
+
        # Store any extra attributes which will be passed in the login response.
        # Note that this is per-user so it may overwrite a previous value, this
        # is considered OK since the newest SSO attributes should be most valid.

@@ -1467,6 +1478,9 @@ class AuthHandler(BaseHandler):
            display_url=redirect_url_no_params,
            redirect_url=redirect_url,
            server_name=self._server_name,
+           new_user=new_user,
+           user_id=registered_user_id,
+           user_profile=user_profile_data,
        )
        respond_with_html(request, 200, html)
@@ -80,9 +80,10 @@ class CasHandler:
        # user-facing name of this auth provider
        self.idp_name = "CAS"

-       # we do not currently support icons for CAS auth, but this is required by
+       # we do not currently support brands/icons for CAS auth, but this is required by
        # the SsoIdentityProvider protocol type.
        self.idp_icon = None
+       self.idp_brand = None

        self._sso_handler = hs.get_sso_handler()

@@ -99,11 +100,7 @@ class CasHandler:
        Returns:
            The URL to use as a "service" parameter.
        """
-       return "%s%s?%s" % (
-           self._cas_service_url,
-           "/_matrix/client/r0/login/cas/ticket",
-           urllib.parse.urlencode(args),
-       )
+       return "%s?%s" % (self._cas_service_url, urllib.parse.urlencode(args),)

    async def _validate_ticket(
        self, ticket: str, service_args: Dict[str, str]
@@ -15,7 +15,7 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
-from typing import TYPE_CHECKING, Any, Dict, Iterable, List, Optional, Set, Tuple
+from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Set, Tuple

from synapse.api import errors
from synapse.api.constants import EventTypes

@@ -62,7 +62,7 @@ class DeviceWorkerHandler(BaseHandler):
        self._auth_handler = hs.get_auth_handler()

    @trace
-   async def get_devices_by_user(self, user_id: str) -> List[Dict[str, Any]]:
+   async def get_devices_by_user(self, user_id: str) -> List[JsonDict]:
        """
        Retrieve the given user's devices

@@ -85,7 +85,7 @@ class DeviceWorkerHandler(BaseHandler):
        return devices

    @trace
-   async def get_device(self, user_id: str, device_id: str) -> Dict[str, Any]:
+   async def get_device(self, user_id: str, device_id: str) -> JsonDict:
        """ Retrieve the given device

        Args:

@@ -598,7 +598,7 @@ class DeviceHandler(DeviceWorkerHandler):


def _update_device_from_client_ips(
-   device: Dict[str, Any], client_ips: Dict[Tuple[str, str], Dict[str, Any]]
+   device: JsonDict, client_ips: Dict[Tuple[str, str], JsonDict]
) -> None:
    ip = client_ips.get((device["user_id"], device["device_id"]), {})
    device.update({"last_seen_ts": ip.get("last_seen"), "last_seen_ip": ip.get("ip")})

@@ -946,8 +946,8 @@ class DeviceListUpdater:
    async def process_cross_signing_key_update(
        self,
        user_id: str,
-       master_key: Optional[Dict[str, Any]],
-       self_signing_key: Optional[Dict[str, Any]],
+       master_key: Optional[JsonDict],
+       self_signing_key: Optional[JsonDict],
    ) -> List[str]:
        """Process the given new master and self-signing key for the given remote user.
@@ -16,7 +16,7 @@
# limitations under the License.

import logging
-from typing import Dict, List, Optional, Tuple
+from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Tuple

import attr
from canonicaljson import encode_canonical_json

@@ -31,6 +31,7 @@ from synapse.logging.context import make_deferred_yieldable, run_in_background
from synapse.logging.opentracing import log_kv, set_tag, tag_args, trace
from synapse.replication.http.devices import ReplicationUserDevicesResyncRestServlet
from synapse.types import (
+   JsonDict,
    UserID,
    get_domain_from_id,
    get_verify_key_from_cross_signing_key,

@@ -40,11 +41,14 @@ from synapse.util.async_helpers import Linearizer
from synapse.util.caches.expiringcache import ExpiringCache
from synapse.util.retryutils import NotRetryingDestination

+if TYPE_CHECKING:
+   from synapse.app.homeserver import HomeServer
+
logger = logging.getLogger(__name__)


class E2eKeysHandler:
-   def __init__(self, hs):
+   def __init__(self, hs: "HomeServer"):
        self.store = hs.get_datastore()
        self.federation = hs.get_federation_client()
        self.device_handler = hs.get_device_handler()

@@ -78,7 +82,9 @@ class E2eKeysHandler:
        )

    @trace
-   async def query_devices(self, query_body, timeout, from_user_id):
+   async def query_devices(
+       self, query_body: JsonDict, timeout: int, from_user_id: str
+   ) -> JsonDict:
        """ Handle a device key query from a client

        {

@@ -98,12 +104,14 @@ class E2eKeysHandler:
        }

        Args:
-           from_user_id (str): the user making the query. This is used when
+           from_user_id: the user making the query. This is used when
                adding cross-signing signatures to limit what signatures users
                can see.
        """

-       device_keys_query = query_body.get("device_keys", {})
+       device_keys_query = query_body.get(
+           "device_keys", {}
+       )  # type: Dict[str, Iterable[str]]

        # separate users by domain.
        # make a map from domain to user_id to device_ids

@@ -121,7 +129,8 @@ class E2eKeysHandler:
        set_tag("remote_key_query", remote_queries)

        # First get local devices.
-       failures = {}
+       # A map of destination -> failure response.
+       failures = {}  # type: Dict[str, JsonDict]
        results = {}
        if local_query:
            local_result = await self.query_local_devices(local_query)

@@ -135,9 +144,10 @@ class E2eKeysHandler:
            )

        # Now attempt to get any remote devices from our local cache.
-       remote_queries_not_in_cache = {}
+       # A map of destination -> user ID -> device IDs.
+       remote_queries_not_in_cache = {}  # type: Dict[str, Dict[str, Iterable[str]]]
        if remote_queries:
-           query_list = []
+           query_list = []  # type: List[Tuple[str, Optional[str]]]
            for user_id, device_ids in remote_queries.items():
                if device_ids:
                    query_list.extend((user_id, device_id) for device_id in device_ids)

@@ -284,15 +294,15 @@ class E2eKeysHandler:
        return ret

    async def get_cross_signing_keys_from_cache(
-       self, query, from_user_id
+       self, query: Iterable[str], from_user_id: Optional[str]
    ) -> Dict[str, Dict[str, dict]]:
        """Get cross-signing keys for users from the database

        Args:
-           query (Iterable[string]) an iterable of user IDs. A dict whose keys
+           query: an iterable of user IDs. A dict whose keys
                are user IDs satisfies this, so the query format used for
                query_devices can be used here.
-           from_user_id (str): the user making the query. This is used when
+           from_user_id: the user making the query. This is used when
                adding cross-signing signatures to limit what signatures users
                can see.

@@ -315,14 +325,12 @@ class E2eKeysHandler:
            if "self_signing" in user_info:
                self_signing_keys[user_id] = user_info["self_signing"]

-       if (
-           from_user_id in keys
-           and keys[from_user_id] is not None
-           and "user_signing" in keys[from_user_id]
-       ):
-           # users can see other users' master and self-signing keys, but can
-           # only see their own user-signing keys
-           user_signing_keys[from_user_id] = keys[from_user_id]["user_signing"]
+       # users can see other users' master and self-signing keys, but can
+       # only see their own user-signing keys
+       if from_user_id:
+           from_user_key = keys.get(from_user_id)
+           if from_user_key and "user_signing" in from_user_key:
+               user_signing_keys[from_user_id] = from_user_key["user_signing"]

        return {
            "master_keys": master_keys,

@@ -344,9 +352,9 @@ class E2eKeysHandler:
            A map from user_id -> device_id -> device details
        """
        set_tag("local_query", query)
-       local_query = []
+       local_query = []  # type: List[Tuple[str, Optional[str]]]

-       result_dict = {}
+       result_dict = {}  # type: Dict[str, Dict[str, dict]]
        for user_id, device_ids in query.items():
            # we use UserID.from_string to catch invalid user ids
            if not self.is_mine(UserID.from_string(user_id)):

@@ -380,10 +388,14 @@ class E2eKeysHandler:
        log_kv(results)
        return result_dict

-   async def on_federation_query_client_keys(self, query_body):
+   async def on_federation_query_client_keys(
+       self, query_body: Dict[str, Dict[str, Optional[List[str]]]]
+   ) -> JsonDict:
        """ Handle a device key query from a federated server
        """
-       device_keys_query = query_body.get("device_keys", {})
+       device_keys_query = query_body.get(
+           "device_keys", {}
+       )  # type: Dict[str, Optional[List[str]]]
        res = await self.query_local_devices(device_keys_query)
        ret = {"device_keys": res}

@@ -397,31 +409,34 @@ class E2eKeysHandler:
        return ret

    @trace
-   async def claim_one_time_keys(self, query, timeout):
-       local_query = []
-       remote_queries = {}
+   async def claim_one_time_keys(
+       self, query: Dict[str, Dict[str, Dict[str, str]]], timeout: int
+   ) -> JsonDict:
+       local_query = []  # type: List[Tuple[str, str, str]]
+       remote_queries = {}  # type: Dict[str, Dict[str, Dict[str, str]]]

-       for user_id, device_keys in query.get("one_time_keys", {}).items():
+       for user_id, one_time_keys in query.get("one_time_keys", {}).items():
            # we use UserID.from_string to catch invalid user ids
            if self.is_mine(UserID.from_string(user_id)):
-               for device_id, algorithm in device_keys.items():
+               for device_id, algorithm in one_time_keys.items():
                    local_query.append((user_id, device_id, algorithm))
            else:
                domain = get_domain_from_id(user_id)
-               remote_queries.setdefault(domain, {})[user_id] = device_keys
+               remote_queries.setdefault(domain, {})[user_id] = one_time_keys

        set_tag("local_key_query", local_query)
        set_tag("remote_key_query", remote_queries)

        results = await self.store.claim_e2e_one_time_keys(local_query)

-       json_result = {}
-       failures = {}
+       # A map of user ID -> device ID -> key ID -> key.
+       json_result = {}  # type: Dict[str, Dict[str, Dict[str, JsonDict]]]
+       failures = {}  # type: Dict[str, JsonDict]
        for user_id, device_keys in results.items():
            for device_id, keys in device_keys.items():
-               for key_id, json_bytes in keys.items():
+               for key_id, json_str in keys.items():
                    json_result.setdefault(user_id, {})[device_id] = {
-                       key_id: json_decoder.decode(json_bytes)
+                       key_id: json_decoder.decode(json_str)
                    }
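
Illustrative sketch (not part of the upstream diff): claim_one_time_keys splits the claim by homeserver: keys for local users become (user_id, device_id, algorithm) tuples for the database, while everything else is grouped by remote domain for federation. A minimal model of that partitioning, with is_mine_id standing in for self.is_mine:

    def partition_claims(query, is_mine_id):
        local_query = []     # [(user_id, device_id, algorithm)]
        remote_queries = {}  # domain -> user_id -> {device_id: algorithm}
        for user_id, one_time_keys in query.get("one_time_keys", {}).items():
            if is_mine_id(user_id):
                for device_id, algorithm in one_time_keys.items():
                    local_query.append((user_id, device_id, algorithm))
            else:
                domain = user_id.split(":", 1)[1]
                remote_queries.setdefault(domain, {})[user_id] = one_time_keys
        return local_query, remote_queries

    query = {"one_time_keys": {
        "@alice:example.com": {"DEV1": "signed_curve25519"},
        "@bob:remote.example": {"DEV2": "signed_curve25519"},
    }}
    local, remote = partition_claims(query, lambda u: u.endswith(":example.com"))
    assert local == [("@alice:example.com", "DEV1", "signed_curve25519")]
    assert remote == {"remote.example": {"@bob:remote.example": {"DEV2": "signed_curve25519"}}}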

    @trace

@@ -468,7 +483,9 @@ class E2eKeysHandler:
        return {"one_time_keys": json_result, "failures": failures}

    @tag_args
-   async def upload_keys_for_user(self, user_id, device_id, keys):
+   async def upload_keys_for_user(
+       self, user_id: str, device_id: str, keys: JsonDict
+   ) -> JsonDict:

        time_now = self.clock.time_msec()

@@ -543,8 +560,8 @@ class E2eKeysHandler:
        return {"one_time_key_counts": result}

    async def _upload_one_time_keys_for_user(
-       self, user_id, device_id, time_now, one_time_keys
-   ):
+       self, user_id: str, device_id: str, time_now: int, one_time_keys: JsonDict
+   ) -> None:
        logger.info(
            "Adding one_time_keys %r for device %r for user %r at %d",
            one_time_keys.keys(),

@@ -585,12 +602,14 @@ class E2eKeysHandler:
        log_kv({"message": "Inserting new one_time_keys.", "keys": new_keys})
        await self.store.add_e2e_one_time_keys(user_id, device_id, time_now, new_keys)

-   async def upload_signing_keys_for_user(self, user_id, keys):
+   async def upload_signing_keys_for_user(
+       self, user_id: str, keys: JsonDict
+   ) -> JsonDict:
        """Upload signing keys for cross-signing

        Args:
-           user_id (string): the user uploading the keys
-           keys (dict[string, dict]): the signing keys
+           user_id: the user uploading the keys
+           keys: the signing keys
        """

        # if a master key is uploaded, then check it. Otherwise, load the

@@ -667,16 +686,17 @@ class E2eKeysHandler:

        return {}

-   async def upload_signatures_for_device_keys(self, user_id, signatures):
+   async def upload_signatures_for_device_keys(
+       self, user_id: str, signatures: JsonDict
+   ) -> JsonDict:
        """Upload device signatures for cross-signing

        Args:
-           user_id (string): the user uploading the signatures
-           signatures (dict[string, dict[string, dict]]): map of users to
-               devices to signed keys. This is the submission from the user; an
-               exception will be raised if it is malformed.
+           user_id: the user uploading the signatures
+           signatures: map of users to devices to signed keys. This is the submission
+               from the user; an exception will be raised if it is malformed.
        Returns:
-           dict: response to be sent back to the client. The response will have
+           The response to be sent back to the client. The response will have
                a "failures" key, which will be a dict mapping users to devices
                to errors for the signatures that failed.
        Raises:

@@ -719,7 +739,9 @@ class E2eKeysHandler:

        return {"failures": failures}

-   async def _process_self_signatures(self, user_id, signatures):
+   async def _process_self_signatures(
+       self, user_id: str, signatures: JsonDict
+   ) -> Tuple[List["SignatureListItem"], Dict[str, Dict[str, dict]]]:
        """Process uploaded signatures of the user's own keys.

        Signatures of the user's own keys from this API come in two forms:

@@ -731,15 +753,14 @@ class E2eKeysHandler:
            signatures (dict[string, dict]): map of devices to signed keys

        Returns:
-           (list[SignatureListItem], dict[string, dict[string, dict]]):
-               a list of signatures to store, and a map of users to devices to failure
-               reasons
+           A tuple of a list of signatures to store, and a map of users to
+           devices to failure reasons

        Raises:
            SynapseError: if the input is malformed
        """
-       signature_list = []
-       failures = {}
+       signature_list = []  # type: List[SignatureListItem]
+       failures = {}  # type: Dict[str, Dict[str, JsonDict]]
        if not signatures:
            return signature_list, failures

@@ -834,19 +855,24 @@ class E2eKeysHandler:
        return signature_list, failures

    def _check_master_key_signature(
        self, user_id, master_key_id, signed_master_key, stored_master_key, devices
|
||||
):
|
||||
self,
|
||||
user_id: str,
|
||||
master_key_id: str,
|
||||
signed_master_key: JsonDict,
|
||||
stored_master_key: JsonDict,
|
||||
devices: Dict[str, Dict[str, JsonDict]],
|
||||
) -> List["SignatureListItem"]:
|
||||
"""Check signatures of a user's master key made by their devices.
|
||||
|
||||
Args:
|
||||
user_id (string): the user whose master key is being checked
|
||||
master_key_id (string): the ID of the user's master key
|
||||
signed_master_key (dict): the user's signed master key that was uploaded
|
||||
stored_master_key (dict): our previously-stored copy of the user's master key
|
||||
devices (iterable(dict)): the user's devices
|
||||
user_id: the user whose master key is being checked
|
||||
master_key_id: the ID of the user's master key
|
||||
signed_master_key: the user's signed master key that was uploaded
|
||||
stored_master_key: our previously-stored copy of the user's master key
|
||||
devices: the user's devices
|
||||
|
||||
Returns:
|
||||
list[SignatureListItem]: a list of signatures to store
|
||||
A list of signatures to store
|
||||
|
||||
Raises:
|
||||
SynapseError: if a signature is invalid
|
||||
|
@ -877,25 +903,26 @@ class E2eKeysHandler:
|
|||
|
||||
return master_key_signature_list
|
||||
|
||||
async def _process_other_signatures(self, user_id, signatures):
|
||||
async def _process_other_signatures(
|
||||
self, user_id: str, signatures: Dict[str, dict]
|
||||
) -> Tuple[List["SignatureListItem"], Dict[str, Dict[str, dict]]]:
|
||||
"""Process uploaded signatures of other users' keys. These will be the
|
||||
target user's master keys, signed by the uploading user's user-signing
|
||||
key.
|
||||
|
||||
Args:
|
||||
user_id (string): the user uploading the keys
|
||||
signatures (dict[string, dict]): map of users to devices to signed keys
|
||||
user_id: the user uploading the keys
|
||||
signatures: map of users to devices to signed keys
|
||||
|
||||
Returns:
|
||||
(list[SignatureListItem], dict[string, dict[string, dict]]):
|
||||
a list of signatures to store, and a map of users to devices to failure
|
||||
A list of signatures to store, and a map of users to devices to failure
|
||||
reasons
|
||||
|
||||
Raises:
|
||||
SynapseError: if the input is malformed
|
||||
"""
|
||||
signature_list = []
|
||||
failures = {}
|
||||
signature_list = [] # type: List[SignatureListItem]
|
||||
failures = {} # type: Dict[str, Dict[str, JsonDict]]
|
||||
if not signatures:
|
||||
return signature_list, failures
|
||||
|
||||
|
@ -983,7 +1010,7 @@ class E2eKeysHandler:
|
|||
|
||||
async def _get_e2e_cross_signing_verify_key(
|
||||
self, user_id: str, key_type: str, from_user_id: str = None
|
||||
):
|
||||
) -> Tuple[JsonDict, str, VerifyKey]:
|
||||
"""Fetch locally or remotely query for a cross-signing public key.
|
||||
|
||||
First, attempt to fetch the cross-signing public key from storage.
|
||||
|
@ -997,8 +1024,7 @@ class E2eKeysHandler:
|
|||
This affects what signatures are fetched.
|
||||
|
||||
Returns:
|
||||
dict, str, VerifyKey: the raw key data, the key ID, and the
|
||||
signedjson verify key
|
||||
The raw key data, the key ID, and the signedjson verify key
|
||||
|
||||
Raises:
|
||||
NotFoundError: if the key is not found
|
||||
|
@ -1135,16 +1161,18 @@ class E2eKeysHandler:
|
|||
return desired_key, desired_key_id, desired_verify_key
|
||||
|
||||
|
||||
def _check_cross_signing_key(key, user_id, key_type, signing_key=None):
|
||||
def _check_cross_signing_key(
|
||||
key: JsonDict, user_id: str, key_type: str, signing_key: Optional[VerifyKey] = None
|
||||
) -> None:
|
||||
"""Check a cross-signing key uploaded by a user. Performs some basic sanity
|
||||
checking, and ensures that it is signed, if a signature is required.
|
||||
|
||||
Args:
|
||||
key (dict): the key data to verify
|
||||
user_id (str): the user whose key is being checked
|
||||
key_type (str): the type of key that the key should be
|
||||
signing_key (VerifyKey): (optional) the signing key that the key should
|
||||
be signed with. If omitted, signatures will not be checked.
|
||||
key: the key data to verify
|
||||
user_id: the user whose key is being checked
|
||||
key_type: the type of key that the key should be
|
||||
signing_key: the signing key that the key should be signed with. If
|
||||
omitted, signatures will not be checked.
|
||||
"""
|
||||
if (
|
||||
key.get("user_id") != user_id
|
||||
|
@ -1162,16 +1190,21 @@ def _check_cross_signing_key(key, user_id, key_type, signing_key=None):
|
|||
)
|
||||
|
||||
|
||||
def _check_device_signature(user_id, verify_key, signed_device, stored_device):
|
||||
def _check_device_signature(
|
||||
user_id: str,
|
||||
verify_key: VerifyKey,
|
||||
signed_device: JsonDict,
|
||||
stored_device: JsonDict,
|
||||
) -> None:
|
||||
"""Check that a signature on a device or cross-signing key is correct and
|
||||
matches the copy of the device/key that we have stored. Throws an
|
||||
exception if an error is detected.
|
||||
|
||||
Args:
|
||||
user_id (str): the user ID whose signature is being checked
|
||||
verify_key (VerifyKey): the key to verify the device with
|
||||
signed_device (dict): the uploaded signed device data
|
||||
stored_device (dict): our previously stored copy of the device
|
||||
user_id: the user ID whose signature is being checked
|
||||
verify_key: the key to verify the device with
|
||||
signed_device: the uploaded signed device data
|
||||
stored_device: our previously stored copy of the device
|
||||
|
||||
Raises:
|
||||
SynapseError: if the signature was invalid or the sent device is not the
|
||||
|
@ -1201,7 +1234,7 @@ def _check_device_signature(user_id, verify_key, signed_device, stored_device):
|
|||
raise SynapseError(400, "Invalid signature", Codes.INVALID_SIGNATURE)
|
||||
|
||||
|
||||
def _exception_to_failure(e):
|
||||
def _exception_to_failure(e: Exception) -> JsonDict:
|
||||
if isinstance(e, SynapseError):
|
||||
return {"status": e.code, "errcode": e.errcode, "message": str(e)}
|
||||
|
||||
|
@ -1218,7 +1251,7 @@ def _exception_to_failure(e):
|
|||
return {"status": 503, "message": str(e)}
|
||||
|
||||
|
||||
def _one_time_keys_match(old_key_json, new_key):
|
||||
def _one_time_keys_match(old_key_json: str, new_key: JsonDict) -> bool:
|
||||
old_key = json_decoder.decode(old_key_json)
|
||||
|
||||
# if either is a string rather than an object, they must match exactly
|
||||
|
@ -1239,16 +1272,16 @@ class SignatureListItem:
|
|||
"""An item in the signature list as used by upload_signatures_for_device_keys.
|
||||
"""
|
||||
|
||||
signing_key_id = attr.ib()
|
||||
target_user_id = attr.ib()
|
||||
target_device_id = attr.ib()
|
||||
signature = attr.ib()
|
||||
signing_key_id = attr.ib(type=str)
|
||||
target_user_id = attr.ib(type=str)
|
||||
target_device_id = attr.ib(type=str)
|
||||
signature = attr.ib(type=JsonDict)
|
||||
|
||||
|
||||
class SigningKeyEduUpdater:
|
||||
"""Handles incoming signing key updates from federation and updates the DB"""
|
||||
|
||||
def __init__(self, hs, e2e_keys_handler):
|
||||
def __init__(self, hs: "HomeServer", e2e_keys_handler: E2eKeysHandler):
|
||||
self.store = hs.get_datastore()
|
||||
self.federation = hs.get_federation_client()
|
||||
self.clock = hs.get_clock()
|
||||
|
@ -1257,7 +1290,7 @@ class SigningKeyEduUpdater:
|
|||
self._remote_edu_linearizer = Linearizer(name="remote_signing_key")
|
||||
|
||||
# user_id -> list of updates waiting to be handled.
|
||||
self._pending_updates = {}
|
||||
self._pending_updates = {} # type: Dict[str, List[Tuple[JsonDict, JsonDict]]]
|
||||
|
||||
# Recently seen stream ids. We don't bother keeping these in the DB,
|
||||
# but they're useful to have them about to reduce the number of spurious
|
||||
|
@ -1270,13 +1303,15 @@ class SigningKeyEduUpdater:
|
|||
iterable=True,
|
||||
)
|
||||
|
||||
async def incoming_signing_key_update(self, origin, edu_content):
|
||||
async def incoming_signing_key_update(
|
||||
self, origin: str, edu_content: JsonDict
|
||||
) -> None:
|
||||
"""Called on incoming signing key update from federation. Responsible for
|
||||
parsing the EDU and adding to pending updates list.
|
||||
|
||||
Args:
|
||||
origin (string): the server that sent the EDU
|
||||
edu_content (dict): the contents of the EDU
|
||||
origin: the server that sent the EDU
|
||||
edu_content: the contents of the EDU
|
||||
"""
|
||||
|
||||
user_id = edu_content.pop("user_id")
|
||||
|
@ -1299,11 +1334,11 @@ class SigningKeyEduUpdater:
|
|||
|
||||
await self._handle_signing_key_updates(user_id)
|
||||
|
||||
async def _handle_signing_key_updates(self, user_id):
|
||||
async def _handle_signing_key_updates(self, user_id: str) -> None:
|
||||
"""Actually handle pending updates.
|
||||
|
||||
Args:
|
||||
user_id (string): the user whose updates we are processing
|
||||
user_id: the user whose updates we are processing
|
||||
"""
|
||||
|
||||
device_handler = self.e2e_keys_handler.device_handler
|
||||
|
@ -1315,7 +1350,7 @@ class SigningKeyEduUpdater:
|
|||
# This can happen since we batch updates
|
||||
return
|
||||
|
||||
device_ids = []
|
||||
device_ids = [] # type: List[str]
|
||||
|
||||
logger.info("pending updates: %r", pending_updates)
|
||||
|
||||
|
|
|
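A note on the typing style used throughout these hunks: Synapse still supported old Python 3.5-era tooling at this point, so the diff adds `# type:` comment annotations for local variables and `attr.ib(type=...)` for attrs fields instead of PEP 526 variable annotations. A minimal, self-contained sketch of the same style; the class and function names here are illustrative, not taken from the diff:

    from typing import Dict, List, Tuple

    import attr


    @attr.s(slots=True)
    class ExampleItem:
        # attr.ib(type=...) records the field type without PEP 526 syntax,
        # so mypy can still check construction sites.
        key_id = attr.ib(type=str)
        payload = attr.ib(type=Dict[str, str])


    def collect(items: Dict[str, str]) -> List[Tuple[str, str]]:
        # A comment annotation stands in for `pairs: List[...] = []`.
        pairs = []  # type: List[Tuple[str, str]]
        for key, value in items.items():
            pairs.append((key, value))
        return pairs

Both forms are understood by mypy, which is why the hunks above can add real checking without changing runtime behaviour.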
@@ -15,6 +15,7 @@
# limitations under the License.

import logging
from typing import TYPE_CHECKING, List, Optional

from synapse.api.errors import (
Codes,

@@ -24,8 +25,12 @@ from synapse.api.errors import (
SynapseError,
)
from synapse.logging.opentracing import log_kv, trace
from synapse.types import JsonDict
from synapse.util.async_helpers import Linearizer

if TYPE_CHECKING:
from synapse.app.homeserver import HomeServer

logger = logging.getLogger(__name__)


@@ -37,7 +42,7 @@ class E2eRoomKeysHandler:
The actual payload of the encrypted keys is completely opaque to the handler.
"""

def __init__(self, hs):
def __init__(self, hs: "HomeServer"):
self.store = hs.get_datastore()

# Used to lock whenever a client is uploading key data. This prevents collisions

@@ -48,21 +53,27 @@ class E2eRoomKeysHandler:
self._upload_linearizer = Linearizer("upload_room_keys_lock")

@trace
async def get_room_keys(self, user_id, version, room_id=None, session_id=None):
async def get_room_keys(
self,
user_id: str,
version: str,
room_id: Optional[str] = None,
session_id: Optional[str] = None,
) -> List[JsonDict]:
"""Bulk get the E2E room keys for a given backup, optionally filtered to a given
room, or a given session.
See EndToEndRoomKeyStore.get_e2e_room_keys for full details.

Args:
user_id(str): the user whose keys we're getting
version(str): the version ID of the backup we're getting keys from
room_id(string): room ID to get keys for, for None to get keys for all rooms
session_id(string): session ID to get keys for, for None to get keys for all
user_id: the user whose keys we're getting
version: the version ID of the backup we're getting keys from
room_id: room ID to get keys for, for None to get keys for all rooms
session_id: session ID to get keys for, for None to get keys for all
sessions
Raises:
NotFoundError: if the backup version does not exist
Returns:
A deferred list of dicts giving the session_data and message metadata for
A list of dicts giving the session_data and message metadata for
these room keys.
"""

@@ -86,17 +97,23 @@ class E2eRoomKeysHandler:
return results

@trace
async def delete_room_keys(self, user_id, version, room_id=None, session_id=None):
async def delete_room_keys(
self,
user_id: str,
version: str,
room_id: Optional[str] = None,
session_id: Optional[str] = None,
) -> JsonDict:
"""Bulk delete the E2E room keys for a given backup, optionally filtered to a given
room or a given session.
See EndToEndRoomKeyStore.delete_e2e_room_keys for full details.

Args:
user_id(str): the user whose backup we're deleting
version(str): the version ID of the backup we're deleting
room_id(string): room ID to delete keys for, for None to delete keys for all
user_id: the user whose backup we're deleting
version: the version ID of the backup we're deleting
room_id: room ID to delete keys for, for None to delete keys for all
rooms
session_id(string): session ID to delete keys for, for None to delete keys
session_id: session ID to delete keys for, for None to delete keys
for all sessions
Raises:
NotFoundError: if the backup version does not exist

@@ -128,15 +145,17 @@ class E2eRoomKeysHandler:
return {"etag": str(version_etag), "count": count}

@trace
async def upload_room_keys(self, user_id, version, room_keys):
async def upload_room_keys(
self, user_id: str, version: str, room_keys: JsonDict
) -> JsonDict:
"""Bulk upload a list of room keys into a given backup version, asserting
that the given version is the current backup version. room_keys are merged
into the current backup as described in RoomKeysServlet.on_PUT().

Args:
user_id(str): the user whose backup we're setting
version(str): the version ID of the backup we're updating
room_keys(dict): a nested dict describing the room_keys we're setting:
user_id: the user whose backup we're setting
version: the version ID of the backup we're updating
room_keys: a nested dict describing the room_keys we're setting:

{
"rooms": {

@@ -254,14 +273,16 @@ class E2eRoomKeysHandler:
return {"etag": str(version_etag), "count": count}

@staticmethod
def _should_replace_room_key(current_room_key, room_key):
def _should_replace_room_key(
current_room_key: Optional[JsonDict], room_key: JsonDict
) -> bool:
"""
Determine whether to replace a given current_room_key (if any)
with a newly uploaded room_key backup

Args:
current_room_key (dict): Optional, the current room_key dict if any
room_key (dict): The new room_key dict which may or may not be fit to
current_room_key: Optional, the current room_key dict if any
room_key: The new room_key dict which may or may not be fit to
replace the current_room_key

Returns:

@@ -286,14 +307,14 @@ class E2eRoomKeysHandler:
return True

@trace
async def create_version(self, user_id, version_info):
async def create_version(self, user_id: str, version_info: JsonDict) -> str:
"""Create a new backup version. This automatically becomes the new
backup version for the user's keys; previous backups will no longer be
writeable to.

Args:
user_id(str): the user whose backup version we're creating
version_info(dict): metadata about the new version being created
user_id: the user whose backup version we're creating
version_info: metadata about the new version being created

{
"algorithm": "m.megolm_backup.v1",

@@ -301,7 +322,7 @@ class E2eRoomKeysHandler:
}

Returns:
A deferred of a string that gives the new version number.
The new version number.
"""

# TODO: Validate the JSON to make sure it has the right keys.

@@ -313,17 +334,19 @@ class E2eRoomKeysHandler:
)
return new_version

async def get_version_info(self, user_id, version=None):
async def get_version_info(
self, user_id: str, version: Optional[str] = None
) -> JsonDict:
"""Get the info about a given version of the user's backup

Args:
user_id(str): the user whose current backup version we're querying
version(str): Optional; if None gives the most recent version
user_id: the user whose current backup version we're querying
version: Optional; if None gives the most recent version
otherwise a historical one.
Raises:
NotFoundError: if the requested backup version doesn't exist
Returns:
A deferred of a info dict that gives the info about the new version.
An info dict that gives the info about the new version.

{
"version": "1234",

@@ -346,7 +369,7 @@ class E2eRoomKeysHandler:
return res

@trace
async def delete_version(self, user_id, version=None):
async def delete_version(self, user_id: str, version: Optional[str] = None) -> None:
"""Deletes a given version of the user's e2e_room_keys backup

Args:

@@ -366,17 +389,19 @@ class E2eRoomKeysHandler:
raise

@trace
async def update_version(self, user_id, version, version_info):
async def update_version(
self, user_id: str, version: str, version_info: JsonDict
) -> JsonDict:
"""Update the info about a given version of the user's backup

Args:
user_id(str): the user whose current backup version we're updating
version(str): the backup version we're updating
version_info(dict): the new information about the backup
user_id: the user whose current backup version we're updating
version: the backup version we're updating
version_info: the new information about the backup
Raises:
NotFoundError: if the requested backup version doesn't exist
Returns:
A deferred of an empty dict.
An empty dict.
"""
if "version" not in version_info:
version_info["version"] = version
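For context on `_should_replace_room_key` above: the hunk only shows the signature and docstring, not the comparison itself. A rough, self-contained sketch of the kind of decision involved; the field names (`is_verified`, `first_message_index`, `forwarded_count`) and their ordering are an assumption here, not shown in this diff:

    from typing import Any, Dict, Optional

    JsonDict = Dict[str, Any]  # synapse.types.JsonDict, inlined for the sketch


    def should_replace_room_key(
        current_room_key: Optional[JsonDict], room_key: JsonDict
    ) -> bool:
        if current_room_key is None:
            # Nothing stored yet, so always accept the upload.
            return True
        if room_key["is_verified"] and not current_room_key["is_verified"]:
            return True
        if room_key["first_message_index"] < current_room_key["first_message_index"]:
            # The new key can decrypt earlier messages in the session.
            return True
        if room_key["forwarded_count"] < current_room_key["forwarded_count"]:
            # Fewer forwards suggests a more direct provenance.
            return True
        return False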
@@ -1617,6 +1617,10 @@ class FederationHandler(BaseHandler):
if event.state_key == self._server_notices_mxid:
raise SynapseError(HTTPStatus.FORBIDDEN, "Cannot invite this user")

# We retrieve the room member handler here so as not to cause a cyclic dependency
member_handler = self.hs.get_room_member_handler()
member_handler.ratelimit_invite(event.room_id, event.state_key)

# keep a record of the room version, if we don't yet know it.
# (this may get overwritten if we later get a different room version in a
# join dance).

@@ -2093,6 +2097,11 @@ class FederationHandler(BaseHandler):
if event.type == EventTypes.GuestAccess and not context.rejected:
await self.maybe_kick_guest_users(event)

# If we are going to send this event over federation we precalculate
# the joined hosts.
if event.internal_metadata.get_send_on_behalf_of():
await self.event_creation_handler.cache_joined_hosts_for_event(event)

return context

async def _check_for_soft_fail(
@@ -15,9 +15,13 @@
# limitations under the License.

import logging
from typing import TYPE_CHECKING, Dict, Iterable, List, Set

from synapse.api.errors import HttpResponseException, RequestSendFailed, SynapseError
from synapse.types import GroupID, get_domain_from_id
from synapse.types import GroupID, JsonDict, get_domain_from_id

if TYPE_CHECKING:
from synapse.app.homeserver import HomeServer

logger = logging.getLogger(__name__)


@@ -56,7 +60,7 @@ def _create_rerouter(func_name):


class GroupsLocalWorkerHandler:
def __init__(self, hs):
def __init__(self, hs: "HomeServer"):
self.hs = hs
self.store = hs.get_datastore()
self.room_list_handler = hs.get_room_list_handler()

@@ -84,7 +88,9 @@ class GroupsLocalWorkerHandler:
get_group_role = _create_rerouter("get_group_role")
get_group_roles = _create_rerouter("get_group_roles")

async def get_group_summary(self, group_id, requester_user_id):
async def get_group_summary(
self, group_id: str, requester_user_id: str
) -> JsonDict:
"""Get the group summary for a group.

If the group is remote we check that the users have valid attestations.

@@ -137,14 +143,15 @@ class GroupsLocalWorkerHandler:

return res

async def get_users_in_group(self, group_id, requester_user_id):
async def get_users_in_group(
self, group_id: str, requester_user_id: str
) -> JsonDict:
"""Get users in a group
"""
if self.is_mine_id(group_id):
res = await self.groups_server_handler.get_users_in_group(
return await self.groups_server_handler.get_users_in_group(
group_id, requester_user_id
)
return res

group_server_name = get_domain_from_id(group_id)

@@ -178,11 +185,11 @@ class GroupsLocalWorkerHandler:

return res

async def get_joined_groups(self, user_id):
async def get_joined_groups(self, user_id: str) -> JsonDict:
group_ids = await self.store.get_joined_groups(user_id)
return {"groups": group_ids}

async def get_publicised_groups_for_user(self, user_id):
async def get_publicised_groups_for_user(self, user_id: str) -> JsonDict:
if self.hs.is_mine_id(user_id):
result = await self.store.get_publicised_groups_for_user(user_id)

@@ -206,8 +213,10 @@ class GroupsLocalWorkerHandler:
# TODO: Verify attestations
return {"groups": result}

async def bulk_get_publicised_groups(self, user_ids, proxy=True):
destinations = {}
async def bulk_get_publicised_groups(
self, user_ids: Iterable[str], proxy: bool = True
) -> JsonDict:
destinations = {}  # type: Dict[str, Set[str]]
local_users = set()

for user_id in user_ids:

@@ -220,7 +229,7 @@ class GroupsLocalWorkerHandler:
raise SynapseError(400, "Some user_ids are not local")

results = {}
failed_results = []
failed_results = []  # type: List[str]
for destination, dest_user_ids in destinations.items():
try:
r = await self.transport_client.bulk_get_publicised_groups(

@@ -242,7 +251,7 @@ class GroupsLocalWorkerHandler:


class GroupsLocalHandler(GroupsLocalWorkerHandler):
def __init__(self, hs):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)

# Ensure attestations get renewed

@@ -271,7 +280,9 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):

set_group_join_policy = _create_rerouter("set_group_join_policy")

async def create_group(self, group_id, user_id, content):
async def create_group(
self, group_id: str, user_id: str, content: JsonDict
) -> JsonDict:
"""Create a group
"""

@@ -284,27 +295,7 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):
local_attestation = None
remote_attestation = None
else:
local_attestation = self.attestations.create_attestation(group_id, user_id)
content["attestation"] = local_attestation

content["user_profile"] = await self.profile_handler.get_profile(user_id)

try:
res = await self.transport_client.create_group(
get_domain_from_id(group_id), group_id, user_id, content
)
except HttpResponseException as e:
raise e.to_synapse_error()
except RequestSendFailed:
raise SynapseError(502, "Failed to contact group server")

remote_attestation = res["attestation"]
await self.attestations.verify_attestation(
remote_attestation,
group_id=group_id,
user_id=user_id,
server_name=get_domain_from_id(group_id),
)
raise SynapseError(400, "Unable to create remote groups")

is_publicised = content.get("publicise", False)
token = await self.store.register_user_group_membership(

@@ -320,7 +311,9 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):

return res

async def join_group(self, group_id, user_id, content):
async def join_group(
self, group_id: str, user_id: str, content: JsonDict
) -> JsonDict:
"""Request to join a group
"""
if self.is_mine_id(group_id):

@@ -365,7 +358,9 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):

return {}

async def accept_invite(self, group_id, user_id, content):
async def accept_invite(
self, group_id: str, user_id: str, content: JsonDict
) -> JsonDict:
"""Accept an invite to a group
"""
if self.is_mine_id(group_id):

@@ -410,7 +405,9 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):

return {}

async def invite(self, group_id, user_id, requester_user_id, config):
async def invite(
self, group_id: str, user_id: str, requester_user_id: str, config: JsonDict
) -> JsonDict:
"""Invite a user to a group
"""
content = {"requester_user_id": requester_user_id, "config": config}

@@ -434,7 +431,9 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):

return res

async def on_invite(self, group_id, user_id, content):
async def on_invite(
self, group_id: str, user_id: str, content: JsonDict
) -> JsonDict:
"""One of our users was invited to a group
"""
# TODO: Support auto join and rejection

@@ -465,8 +464,8 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):
return {"state": "invite", "user_profile": user_profile}

async def remove_user_from_group(
self, group_id, user_id, requester_user_id, content
):
self, group_id: str, user_id: str, requester_user_id: str, content: JsonDict
) -> JsonDict:
"""Remove a user from a group
"""
if user_id == requester_user_id:

@@ -499,7 +498,9 @@ class GroupsLocalHandler(GroupsLocalWorkerHandler):

return res

async def user_removed_from_group(self, group_id, user_id, content):
async def user_removed_from_group(
self, group_id: str, user_id: str, content: JsonDict
) -> None:
"""One of our users was removed/kicked from a group
"""
# TODO: Check if user in group
@@ -27,9 +27,11 @@ from synapse.api.errors import (
HttpResponseException,
SynapseError,
)
from synapse.api.ratelimiting import Ratelimiter
from synapse.config.emailconfig import ThreepidBehaviour
from synapse.http import RequestTimedOutError
from synapse.http.client import SimpleHttpClient
from synapse.http.site import SynapseRequest
from synapse.types import JsonDict, Requester
from synapse.util import json_decoder
from synapse.util.hash import sha256_and_url_safe_base64

@@ -57,6 +59,32 @@ class IdentityHandler(BaseHandler):

self._web_client_location = hs.config.invite_client_location

# Ratelimiters for `/requestToken` endpoints.
self._3pid_validation_ratelimiter_ip = Ratelimiter(
clock=hs.get_clock(),
rate_hz=hs.config.ratelimiting.rc_3pid_validation.per_second,
burst_count=hs.config.ratelimiting.rc_3pid_validation.burst_count,
)
self._3pid_validation_ratelimiter_address = Ratelimiter(
clock=hs.get_clock(),
rate_hz=hs.config.ratelimiting.rc_3pid_validation.per_second,
burst_count=hs.config.ratelimiting.rc_3pid_validation.burst_count,
)

def ratelimit_request_token_requests(
self, request: SynapseRequest, medium: str, address: str,
):
"""Used to ratelimit requests to `/requestToken` by IP and address.

Args:
request: The associated request
medium: The type of threepid, e.g. "msisdn" or "email"
address: The actual threepid ID, e.g. the phone number or email address
"""

self._3pid_validation_ratelimiter_ip.ratelimit((medium, request.getClientIP()))
self._3pid_validation_ratelimiter_address.ratelimit((medium, address))

async def threepid_from_creds(
self, id_server: str, creds: Dict[str, str]
) -> Optional[JsonDict]:
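Both limiters above share one config block (`rc_3pid_validation`) but key on different tuples: `(medium, client IP)` and `(medium, threepid address)`, so a burst of `/requestToken` calls is rejected whether it hammers one email address or comes from one machine. A toy token-bucket limiter showing the same double-keyed check; Synapse's real `Ratelimiter` takes a clock and raises `LimitExceededError`, which this sketch simplifies away:

    import time
    from typing import Dict, Hashable, Tuple


    class ToyRatelimiter:
        def __init__(self, rate_hz: float, burst_count: int):
            self.rate_hz = rate_hz
            self.burst_count = burst_count
            # key -> (tokens remaining, time of last refill)
            self._buckets = {}  # type: Dict[Hashable, Tuple[float, float]]

        def ratelimit(self, key: Hashable) -> None:
            now = time.monotonic()
            tokens, last = self._buckets.get(key, (float(self.burst_count), now))
            tokens = min(float(self.burst_count), tokens + (now - last) * self.rate_hz)
            if tokens < 1:
                raise RuntimeError("rate limited")
            self._buckets[key] = (tokens - 1, now)


    ip_limiter = ToyRatelimiter(rate_hz=0.003, burst_count=5)
    address_limiter = ToyRatelimiter(rate_hz=0.003, burst_count=5)


    def ratelimit_request_token(medium: str, client_ip: str, address: str) -> None:
        # Same shape as ratelimit_request_token_requests above: either key
        # tripping rejects the request.
        ip_limiter.ratelimit((medium, client_ip))
        address_limiter.ratelimit((medium, address))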
@@ -174,7 +174,7 @@ class MessageHandler:
raise NotFoundError("Can't find event for token %s" % (at_token,))

visible_events = await filter_events_for_client(
self.storage, user_id, last_events, filter_send_to_client=False
self.storage, user_id, last_events, filter_send_to_client=False,
)

event = last_events[0]

@@ -434,6 +434,8 @@ class EventCreationHandler:

self._ephemeral_events_enabled = hs.config.enable_ephemeral_messages

self._external_cache = hs.get_external_cache()

async def create_event(
self,
requester: Requester,

@@ -941,6 +943,8 @@ class EventCreationHandler:

await self.action_generator.handle_push_actions_for_event(event, context)

await self.cache_joined_hosts_for_event(event)

try:
# If we're a worker we need to hit out to the master.
writer_instance = self._events_shard_config.get_instance(event.room_id)

@@ -980,6 +984,44 @@ class EventCreationHandler:
await self.store.remove_push_actions_from_staging(event.event_id)
raise

async def cache_joined_hosts_for_event(self, event: EventBase) -> None:
"""Precalculate the joined hosts at the event, when using Redis, so that
external federation senders don't have to recalculate it themselves.
"""

if not self._external_cache.is_enabled():
return

# We actually store two mappings, event ID -> prev state group,
# state group -> joined hosts, which is much more space efficient
# than event ID -> joined hosts.
#
# Note: We have to cache event ID -> prev state group, as we don't
# store that in the DB.
#
# Note: We always set the state group -> joined hosts cache, even if
# we already set it, so that the expiry time is reset.

state_entry = await self.state.resolve_state_groups_for_events(
event.room_id, event_ids=event.prev_event_ids()
)

if state_entry.state_group:
joined_hosts = await self.store.get_joined_hosts(event.room_id, state_entry)

await self._external_cache.set(
"event_to_prev_state_group",
event.event_id,
state_entry.state_group,
expiry_ms=60 * 60 * 1000,
)
await self._external_cache.set(
"get_joined_hosts",
str(state_entry.state_group),
list(joined_hosts),
expiry_ms=60 * 60 * 1000,
)

async def _validate_canonical_alias(
self, directory_handler, room_alias_str: str, expected_room_id: str
) -> None:
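The comment block in `cache_joined_hosts_for_event` is the crux of the Redis change: instead of caching `event ID -> joined hosts` directly, two mappings are stored so that all events sharing a state group share a single (potentially large) host list. A sketch of how a federation sender might read them back; the cache object here is assumed to expose a plain async `get(namespace, key)`, which is a simplification of the real external-cache API:

    from typing import Any, List, Optional


    async def joined_hosts_for_event(cache: Any, event_id: str) -> Optional[List[str]]:
        # Step 1: event ID -> the state group resolved from its prev events.
        state_group = await cache.get("event_to_prev_state_group", event_id)
        if state_group is None:
            # Cache miss: fall back to recomputing from the database.
            return None

        # Step 2: state group -> joined hosts. Many events map to one state
        # group, which is why the two-level split is much more space efficient.
        return await cache.get("get_joined_hosts", str(state_group))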
@@ -102,7 +102,7 @@ class OidcHandler:
) from e

async def handle_oidc_callback(self, request: SynapseRequest) -> None:
"""Handle an incoming request to /_synapse/oidc/callback
"""Handle an incoming request to /_synapse/client/oidc/callback

Since we might want to display OIDC-related errors in a user-friendly
way, we don't raise SynapseError from here. Instead, we call

@@ -274,6 +274,9 @@ class OidcProvider:
# MXC URI for icon for this auth provider
self.idp_icon = provider.idp_icon

# optional brand identifier for this auth provider
self.idp_brand = provider.idp_brand

self._sso_handler = hs.get_sso_handler()

self._sso_handler.register_identity_provider(self)

@@ -640,7 +643,7 @@ class OidcProvider:

- ``client_id``: the client ID set in ``oidc_config.client_id``
- ``response_type``: ``code``
- ``redirect_uri``: the callback URL; ``{base url}/_synapse/oidc/callback``
- ``redirect_uri``: the callback URL; ``{base url}/_synapse/client/oidc/callback``
- ``scope``: the list of scopes set in ``oidc_config.scopes``
- ``state``: a random string
- ``nonce``: a random string

@@ -681,7 +684,7 @@ class OidcProvider:
request.addCookie(
SESSION_COOKIE_NAME,
cookie,
path="/_synapse/oidc",
path="/_synapse/client/oidc",
max_age="3600",
httpOnly=True,
sameSite="lax",

@@ -702,7 +705,7 @@ class OidcProvider:
async def handle_oidc_callback(
self, request: SynapseRequest, session_data: "OidcSessionData", code: str
) -> None:
"""Handle an incoming request to /_synapse/oidc/callback
"""Handle an incoming request to /_synapse/client/oidc/callback

By this time we have already validated the session on the synapse side, and
now need to do the provider-specific operations. This includes:

@@ -1056,7 +1059,8 @@ class OidcSessionData:


UserAttributeDict = TypedDict(
"UserAttributeDict", {"localpart": Optional[str], "display_name": Optional[str]}
"UserAttributeDict",
{"localpart": Optional[str], "display_name": Optional[str], "emails": List[str]},
)
C = TypeVar("C")


@@ -1135,11 +1139,12 @@ def jinja_finalize(thing):
env = Environment(finalize=jinja_finalize)


@attr.s
@attr.s(slots=True, frozen=True)
class JinjaOidcMappingConfig:
subject_claim = attr.ib(type=str)
localpart_template = attr.ib(type=Optional[Template])
display_name_template = attr.ib(type=Optional[Template])
email_template = attr.ib(type=Optional[Template])
extra_attributes = attr.ib(type=Dict[str, Template])


@@ -1156,23 +1161,17 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
def parse_config(config: dict) -> JinjaOidcMappingConfig:
subject_claim = config.get("subject_claim", "sub")

localpart_template = None  # type: Optional[Template]
if "localpart_template" in config:
def parse_template_config(option_name: str) -> Optional[Template]:
if option_name not in config:
return None
try:
localpart_template = env.from_string(config["localpart_template"])
return env.from_string(config[option_name])
except Exception as e:
raise ConfigError(
"invalid jinja template", path=["localpart_template"]
) from e
raise ConfigError("invalid jinja template", path=[option_name]) from e

display_name_template = None  # type: Optional[Template]
if "display_name_template" in config:
try:
display_name_template = env.from_string(config["display_name_template"])
except Exception as e:
raise ConfigError(
"invalid jinja template", path=["display_name_template"]
) from e
localpart_template = parse_template_config("localpart_template")
display_name_template = parse_template_config("display_name_template")
email_template = parse_template_config("email_template")

extra_attributes = {}  # type: Dict[str, Template]
if "extra_attributes" in config:

@@ -1192,6 +1191,7 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
subject_claim=subject_claim,
localpart_template=localpart_template,
display_name_template=display_name_template,
email_template=email_template,
extra_attributes=extra_attributes,
)

@@ -1213,16 +1213,23 @@ class JinjaOidcMappingProvider(OidcMappingProvider[JinjaOidcMappingConfig]):
# a usable mxid.
localpart += str(failures) if failures else ""

display_name = None  # type: Optional[str]
if self._config.display_name_template is not None:
display_name = self._config.display_name_template.render(
user=userinfo
).strip()
def render_template_field(template: Optional[Template]) -> Optional[str]:
if template is None:
return None
return template.render(user=userinfo).strip()

if display_name == "":
display_name = None
display_name = render_template_field(self._config.display_name_template)
if display_name == "":
display_name = None

return UserAttributeDict(localpart=localpart, display_name=display_name)
emails = []  # type: List[str]
email = render_template_field(self._config.email_template)
if email:
emails.append(email)

return UserAttributeDict(
localpart=localpart, display_name=display_name, emails=emails
)

async def get_extra_attributes(self, userinfo: UserInfo, token: Token) -> JsonDict:
extras = {}  # type: Dict[str, str]
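The `parse_template_config` refactor above collapses three copies of the try/except compilation block into one closure over `config`, which is what makes adding the new `email_template` option a one-line change. A self-contained version of the same helper, using a bare Jinja2 environment and `ValueError` in place of Synapse's configured `env` and `ConfigError`:

    from typing import Optional

    from jinja2 import Environment, Template

    env = Environment()


    def parse_template_config(config: dict, option_name: str) -> Optional[Template]:
        # An absent option simply means "no template configured".
        if option_name not in config:
            return None
        try:
            return env.from_string(config[option_name])
        except Exception as e:
            raise ValueError("invalid jinja template: %s" % option_name) from e


    config = {"localpart_template": "{{ user.preferred_username }}"}
    localpart = parse_template_config(config, "localpart_template")
    display_name = parse_template_config(config, "display_name_template")  # None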
@@ -14,8 +14,9 @@
# limitations under the License.

"""Contains functions for registering clients."""

import logging
from typing import TYPE_CHECKING, List, Optional, Tuple
from typing import TYPE_CHECKING, Iterable, List, Optional, Tuple

from synapse import types
from synapse.api.constants import MAX_USERID_LENGTH, EventTypes, JoinRules, LoginType

@@ -157,7 +158,7 @@ class RegistrationHandler(BaseHandler):
user_type: Optional[str] = None,
default_display_name: Optional[str] = None,
address: Optional[str] = None,
bind_emails: List[str] = [],
bind_emails: Iterable[str] = [],
by_admin: bool = False,
user_agent_ips: Optional[List[Tuple[str, str]]] = None,
) -> str:

@@ -700,6 +701,8 @@ class RegistrationHandler(BaseHandler):
access_token: The access token of the newly logged in device, or
None if `inhibit_login` enabled.
"""
# TODO: 3pid registration can actually happen on the workers. Consider
# refactoring it.
if self.hs.config.worker_app:
await self._post_registration_client(
user_id=user_id, auth_result=auth_result, access_token=access_token
@@ -126,6 +126,10 @@ class RoomCreationHandler(BaseHandler):

self.third_party_event_rules = hs.get_third_party_event_rules()

self._invite_burst_count = (
hs.config.ratelimiting.rc_invites_per_room.burst_count
)

async def upgrade_room(
self, requester: Requester, old_room_id: str, new_version: RoomVersion
) -> str:

@@ -662,6 +666,9 @@ class RoomCreationHandler(BaseHandler):
invite_3pid_list = []
invite_list = []

if len(invite_list) + len(invite_3pid_list) > self._invite_burst_count:
raise SynapseError(400, "Cannot invite so many users at once")

await self.event_creation_handler.assert_accepted_privacy_policy(requester)

power_level_content_override = config.get("power_level_content_override")
@@ -85,6 +85,17 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
burst_count=hs.config.ratelimiting.rc_joins_remote.burst_count,
)

self._invites_per_room_limiter = Ratelimiter(
clock=self.clock,
rate_hz=hs.config.ratelimiting.rc_invites_per_room.per_second,
burst_count=hs.config.ratelimiting.rc_invites_per_room.burst_count,
)
self._invites_per_user_limiter = Ratelimiter(
clock=self.clock,
rate_hz=hs.config.ratelimiting.rc_invites_per_user.per_second,
burst_count=hs.config.ratelimiting.rc_invites_per_user.burst_count,
)

# This is only used to get at ratelimit function, and
# maybe_kick_guest_users. It's fine there are multiple of these as
# it doesn't store state.

@@ -144,6 +155,12 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
"""
raise NotImplementedError()

def ratelimit_invite(self, room_id: str, invitee_user_id: str):
"""Ratelimit invites by room and by target user.
"""
self._invites_per_room_limiter.ratelimit(room_id)
self._invites_per_user_limiter.ratelimit(invitee_user_id)

async def _local_membership_update(
self,
requester: Requester,

@@ -387,8 +404,12 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
raise SynapseError(403, "This room has been blocked on this server")

if effective_membership_state == Membership.INVITE:
target_id = target.to_string()
if ratelimit:
self.ratelimit_invite(room_id, target_id)

# block any attempts to invite the server notices mxid
if target.to_string() == self._server_notices_mxid:
if target_id == self._server_notices_mxid:
raise SynapseError(HTTPStatus.FORBIDDEN, "Cannot invite this user")

block_invite = False

@@ -412,7 +433,7 @@ class RoomMemberHandler(metaclass=abc.ABCMeta):
block_invite = True

if not await self.spam_checker.user_may_invite(
requester.user.to_string(), target.to_string(), room_id
requester.user.to_string(), target_id, room_id
):
logger.info("Blocking invite due to spam checker")
block_invite = True
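The membership-update hunk above shows the new gate: invites are the only transition checked, and only when the caller passes `ratelimit=True`. Condensed into a standalone function; the limiter objects are stand-ins for the two `Ratelimiter`s constructed above, and the error type is simplified:

    class LimitExceededError(Exception):
        """Stand-in for synapse.api.errors.LimitExceededError."""


    def maybe_ratelimit_invite(
        per_room_limiter,
        per_user_limiter,
        membership: str,
        room_id: str,
        target_id: str,
        ratelimit: bool,
    ) -> None:
        # Some internal callers pass ratelimit=False and bypass the check.
        if membership != "invite" or not ratelimit:
            return
        # Both dimensions must pass: the room being invited into, and the
        # user being invited.
        per_room_limiter.ratelimit(room_id)
        per_user_limiter.ratelimit(target_id)

Splitting the limit across two keys means one noisy room cannot exhaust a user's allowance, and one heavily-invited user cannot exhaust a room's.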
@@ -78,9 +78,10 @@ class SamlHandler(BaseHandler):
# user-facing name of this auth provider
self.idp_name = "SAML"

# we do not currently support icons for SAML auth, but this is required by
# we do not currently support icons/brands for SAML auth, but this is required by
# the SsoIdentityProvider protocol type.
self.idp_icon = None
self.idp_brand = None

# a map from saml session id to Saml2SessionData object
self._outstanding_requests_dict = {}  # type: Dict[str, Saml2SessionData]

@@ -132,7 +133,7 @@ class SamlHandler(BaseHandler):
raise Exception("prepare_for_authenticate didn't return a Location header")

async def handle_saml_response(self, request: SynapseRequest) -> None:
"""Handle an incoming request to /_matrix/saml2/authn_response
"""Handle an incoming request to /_synapse/client/saml2/authn_response

Args:
request: the incoming request from the browser. We'll
@@ -15,23 +15,28 @@

import itertools
import logging
from typing import Iterable
from typing import TYPE_CHECKING, Dict, Iterable, List, Optional

from unpaddedbase64 import decode_base64, encode_base64

from synapse.api.constants import EventTypes, Membership
from synapse.api.errors import NotFoundError, SynapseError
from synapse.api.filtering import Filter
from synapse.events import EventBase
from synapse.storage.state import StateFilter
from synapse.types import JsonDict, UserID
from synapse.visibility import filter_events_for_client

from ._base import BaseHandler

if TYPE_CHECKING:
from synapse.app.homeserver import HomeServer

logger = logging.getLogger(__name__)


class SearchHandler(BaseHandler):
def __init__(self, hs):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self._event_serializer = hs.get_event_client_serializer()
self.storage = hs.get_storage()

@@ -87,13 +92,15 @@ class SearchHandler(BaseHandler):

return historical_room_ids

async def search(self, user, content, batch=None):
async def search(
self, user: UserID, content: JsonDict, batch: Optional[str] = None
) -> JsonDict:
"""Performs a full text search for a user.

Args:
user (UserID)
content (dict): Search parameters
batch (str): The next_batch parameter. Used for pagination.
user
content: Search parameters
batch: The next_batch parameter. Used for pagination.

Returns:
dict to be returned to the client with results of search

@@ -186,7 +193,7 @@ class SearchHandler(BaseHandler):
# If doing a subset of all rooms search, check if any of the rooms
# are from an upgraded room, and search their contents as well
if search_filter.rooms:
historical_room_ids = []
historical_room_ids = []  # type: List[str]
for room_id in search_filter.rooms:
# Add any previous rooms to the search if they exist
ids = await self.get_old_rooms_from_upgraded_room(room_id)

@@ -209,8 +216,10 @@ class SearchHandler(BaseHandler):

rank_map = {}  # event_id -> rank of event
allowed_events = []
room_groups = {}  # Holds result of grouping by room, if applicable
sender_group = {}  # Holds result of grouping by sender, if applicable
# Holds result of grouping by room, if applicable
room_groups = {}  # type: Dict[str, JsonDict]
# Holds result of grouping by sender, if applicable
sender_group = {}  # type: Dict[str, JsonDict]

# Holds the next_batch for the entire result set if one of those exists
global_next_batch = None

@@ -254,7 +263,7 @@ class SearchHandler(BaseHandler):
s["results"].append(e.event_id)

elif order_by == "recent":
room_events = []
room_events = []  # type: List[EventBase]
i = 0

pagination_token = batch_token

@@ -418,13 +427,10 @@ class SearchHandler(BaseHandler):

state_results = {}
if include_state:
rooms = {e.room_id for e in allowed_events}
for room_id in rooms:
for room_id in {e.room_id for e in allowed_events}:
state = await self.state_handler.get_current_state(room_id)
state_results[room_id] = list(state.values())

state_results.values()

# We're now about to serialize the events. We should not make any
# blocking calls after this. Otherwise the 'age' will be wrong

@@ -448,9 +454,9 @@ class SearchHandler(BaseHandler):

if state_results:
s = {}
for room_id, state in state_results.items():
for room_id, state_events in state_results.items():
s[room_id] = await self._event_serializer.serialize_events(
state, time_now
state_events, time_now
)

rooms_cat_res["state"] = s
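On the `room_groups` / `sender_group` typing above: each bucket is a JSON-ish dict accumulating matching event IDs per room or per sender. A stripped-down sketch of that grouping step; real entries also carry fields such as `order` and `next_batch`, which are omitted here:

    from typing import Any, Dict, List

    JsonDict = Dict[str, Any]


    def group_by_room(events: List[JsonDict]) -> Dict[str, JsonDict]:
        room_groups = {}  # type: Dict[str, JsonDict]
        for event in events:
            group = room_groups.setdefault(event["room_id"], {"results": []})
            group["results"].append(event["event_id"])
        return room_groups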
@@ -13,24 +13,26 @@
# See the License for the specific language governing permissions and
# limitations under the License.
import logging
from typing import Optional
from typing import TYPE_CHECKING, Optional

from synapse.api.errors import Codes, StoreError, SynapseError
from synapse.types import Requester

from ._base import BaseHandler

if TYPE_CHECKING:
from synapse.app.homeserver import HomeServer

logger = logging.getLogger(__name__)


class SetPasswordHandler(BaseHandler):
"""Handler which deals with changing user account passwords"""

def __init__(self, hs):
def __init__(self, hs: "HomeServer"):
super().__init__(hs)
self._auth_handler = hs.get_auth_handler()
self._device_handler = hs.get_device_handler()
self._password_policy_handler = hs.get_password_policy_handler()

async def set_password(
self,

@@ -38,7 +40,7 @@ class SetPasswordHandler(BaseHandler):
password_hash: str,
logout_devices: bool,
requester: Optional[Requester] = None,
):
) -> None:
if not self.hs.config.password_localdb_enabled:
raise SynapseError(403, "Password change disabled", errcode=Codes.FORBIDDEN)
@@ -14,21 +14,31 @@
# limitations under the License.
import abc
import logging
from typing import TYPE_CHECKING, Awaitable, Callable, Dict, List, Mapping, Optional
from typing import (
TYPE_CHECKING,
Awaitable,
Callable,
Dict,
Iterable,
Mapping,
Optional,
Set,
)
from urllib.parse import urlencode

import attr
from typing_extensions import NoReturn, Protocol

from twisted.web.http import Request
from twisted.web.iweb import IRequest

from synapse.api.constants import LoginType
from synapse.api.errors import Codes, RedirectException, SynapseError
from synapse.api.errors import Codes, NotFoundError, RedirectException, SynapseError
from synapse.handlers.ui_auth import UIAuthSessionDataConstants
from synapse.http import get_request_user_agent
from synapse.http.server import respond_with_html
from synapse.http.server import respond_with_html, respond_with_redirect
from synapse.http.site import SynapseRequest
from synapse.types import JsonDict, UserID, contains_invalid_mxid_characters
from synapse.types import Collection, JsonDict, UserID, contains_invalid_mxid_characters
from synapse.util.async_helpers import Linearizer
from synapse.util.stringutils import random_string

@@ -80,6 +90,11 @@ class SsoIdentityProvider(Protocol):
"""Optional MXC URI for user-facing icon"""
return None

@property
def idp_brand(self) -> Optional[str]:
"""Optional branding identifier"""
return None

@abc.abstractmethod
async def handle_redirect_request(
self,

@@ -109,7 +124,7 @@ class UserAttributes:
# enter one.
localpart = attr.ib(type=Optional[str])
display_name = attr.ib(type=Optional[str], default=None)
emails = attr.ib(type=List[str], default=attr.Factory(list))
emails = attr.ib(type=Collection[str], default=attr.Factory(list))


@attr.s(slots=True)

@@ -124,7 +139,7 @@ class UsernameMappingSession:

# attributes returned by the ID mapper
display_name = attr.ib(type=Optional[str])
emails = attr.ib(type=List[str])
emails = attr.ib(type=Collection[str])

# An optional dictionary of extra attributes to be provided to the client in the
# login response.

@@ -136,6 +151,12 @@ class UsernameMappingSession:
# expiry time for the session, in milliseconds
expiry_time_ms = attr.ib(type=int)

# choices made by the user
chosen_localpart = attr.ib(type=Optional[str], default=None)
use_display_name = attr.ib(type=bool, default=True)
emails_to_use = attr.ib(type=Collection[str], default=())
terms_accepted_version = attr.ib(type=Optional[str], default=None)


# the HTTP cookie used to track the mapping session id
USERNAME_MAPPING_SESSION_COOKIE_NAME = b"username_mapping_session"

@@ -170,6 +191,8 @@ class SsoHandler:
# map from idp_id to SsoIdentityProvider
self._identity_providers = {}  # type: Dict[str, SsoIdentityProvider]

self._consent_at_registration = hs.config.consent.user_consent_at_registration

def register_identity_provider(self, p: SsoIdentityProvider):
p_id = p.idp_id
assert p_id not in self._identity_providers

@@ -235,7 +258,10 @@ class SsoHandler:
respond_with_html(request, code, html)

async def handle_redirect_request(
self, request: SynapseRequest, client_redirect_url: bytes,
self,
request: SynapseRequest,
client_redirect_url: bytes,
idp_id: Optional[str],
) -> str:
"""Handle a request to /login/sso/redirect

@@ -243,6 +269,7 @@ class SsoHandler:
request: incoming HTTP request
client_redirect_url: the URL that we should redirect the
client to after login.
idp_id: optional identity provider chosen by the client

Returns:
the URI to redirect to

@@ -252,10 +279,19 @@ class SsoHandler:
400, "Homeserver not configured for SSO.", errcode=Codes.UNRECOGNIZED
)

# if the client chose an IdP, use that
idp = None  # type: Optional[SsoIdentityProvider]
if idp_id:
idp = self._identity_providers.get(idp_id)
if not idp:
raise NotFoundError("Unknown identity provider")

# if we only have one auth provider, redirect to it directly
if len(self._identity_providers) == 1:
ap = next(iter(self._identity_providers.values()))
return await ap.handle_redirect_request(request, client_redirect_url)
elif len(self._identity_providers) == 1:
idp = next(iter(self._identity_providers.values()))

if idp:
return await idp.handle_redirect_request(request, client_redirect_url)

# otherwise, redirect to the IDP picker
return "/_synapse/client/pick_idp?" + urlencode(
@ -369,6 +405,8 @@ class SsoHandler:
|
|||
to an additional page. (e.g. to prompt for more information)
|
||||
|
||||
"""
|
||||
new_user = False
|
||||
|
||||
# grab a lock while we try to find a mapping for this user. This seems...
|
||||
# optimistic, especially for implementations that end up redirecting to
|
||||
# interstitial pages.
|
||||
|
@ -409,9 +447,14 @@ class SsoHandler:
|
|||
get_request_user_agent(request),
|
||||
request.getClientIP(),
|
||||
)
|
||||
new_user = True
|
||||
|
||||
await self._auth_handler.complete_sso_login(
|
||||
user_id, request, client_redirect_url, extra_login_attributes
|
||||
user_id,
|
||||
request,
|
||||
client_redirect_url,
|
||||
extra_login_attributes,
|
||||
new_user=new_user,
|
||||
)
|
||||
|
||||
async def _call_attribute_mapper(
|
||||
|
@ -501,7 +544,7 @@ class SsoHandler:
|
|||
logger.info("Recorded registration session id %s", session_id)
|
||||
|
||||
# Set the cookie and redirect to the username picker
|
||||
e = RedirectException(b"/_synapse/client/pick_username")
|
||||
e = RedirectException(b"/_synapse/client/pick_username/account_details")
|
||||
e.cookies.append(
|
||||
b"%s=%s; path=/"
|
||||
% (USERNAME_MAPPING_SESSION_COOKIE_NAME, session_id.encode("ascii"))
|
||||
|
@ -629,6 +672,25 @@ class SsoHandler:
|
|||
)
|
||||
respond_with_html(request, 200, html)
|
||||
|
||||
def get_mapping_session(self, session_id: str) -> UsernameMappingSession:
|
||||
"""Look up the given username mapping session
|
||||
|
||||
If it is not found, raises a SynapseError with an http code of 400
|
||||
|
||||
Args:
|
||||
session_id: session to look up
|
||||
Returns:
|
||||
active mapping session
|
||||
Raises:
|
||||
SynapseError if the session is not found/has expired
|
||||
"""
|
||||
self._expire_old_sessions()
|
||||
session = self._username_mapping_sessions.get(session_id)
|
||||
if session:
|
||||
return session
|
||||
logger.info("Couldn't find session id %s", session_id)
|
||||
raise SynapseError(400, "unknown session")
|
||||
|
||||
async def check_username_availability(
|
||||
self, localpart: str, session_id: str,
|
||||
) -> bool:
|
||||
|
@ -645,12 +707,7 @@ class SsoHandler:
|
|||
|
||||
# make sure that there is a valid mapping session, to stop people dictionary-
|
||||
# scanning for accounts
|
||||
|
||||
self._expire_old_sessions()
|
||||
session = self._username_mapping_sessions.get(session_id)
|
||||
if not session:
|
||||
logger.info("Couldn't find session id %s", session_id)
|
||||
raise SynapseError(400, "unknown session")
|
||||
self.get_mapping_session(session_id)
|
||||
|
||||
logger.info(
|
||||
"[session %s] Checking for availability of username %s",
|
||||
|
@ -667,7 +724,12 @@ class SsoHandler:
|
|||
return not user_infos
|
||||
|
||||
async def handle_submit_username_request(
|
||||
self, request: SynapseRequest, localpart: str, session_id: str
|
||||
self,
|
||||
request: SynapseRequest,
|
||||
session_id: str,
|
||||
localpart: str,
|
||||
use_display_name: bool,
|
||||
emails_to_use: Iterable[str],
|
||||
) -> None:
|
||||
"""Handle a request to the username-picker 'submit' endpoint
|
||||
|
||||
|
@ -677,21 +739,90 @@ class SsoHandler:
|
|||
request: HTTP request
|
||||
localpart: localpart requested by the user
|
||||
session_id: ID of the username mapping session, extracted from a cookie
|
||||
use_display_name: whether the user wants to use the suggested display name
|
||||
emails_to_use: emails that the user would like to use
|
||||
"""
|
||||
self._expire_old_sessions()
|
||||
session = self._username_mapping_sessions.get(session_id)
|
||||
if not session:
|
||||
logger.info("Couldn't find session id %s", session_id)
|
||||
raise SynapseError(400, "unknown session")
|
||||
session = self.get_mapping_session(session_id)
|
||||
|
||||
logger.info("[session %s] Registering localpart %s", session_id, localpart)
|
||||
# update the session with the user's choices
|
||||
session.chosen_localpart = localpart
|
||||
session.use_display_name = use_display_name
|
||||
|
||||
emails_from_idp = set(session.emails)
|
||||
filtered_emails = set() # type: Set[str]
|
||||
|
||||
# we iterate through the list rather than just building a set conjunction, so
|
||||
# that we can log attempts to use unknown addresses
|
||||
for email in emails_to_use:
|
||||
if email in emails_from_idp:
|
||||
filtered_emails.add(email)
|
||||
else:
|
||||
logger.warning(
|
||||
"[session %s] ignoring user request to use unknown email address %r",
|
||||
session_id,
|
||||
email,
|
||||
)
|
||||
session.emails_to_use = filtered_emails
|
||||
|
||||
# we may now need to collect consent from the user, in which case, redirect
|
||||
# to the consent-extraction-unit
|
||||
if self._consent_at_registration:
|
||||
redirect_url = b"/_synapse/client/new_user_consent"
|
||||
|
||||
# otherwise, redirect to the completion page
|
||||
else:
|
||||
redirect_url = b"/_synapse/client/sso_register"
|
||||
|
||||
respond_with_redirect(request, redirect_url)
|
||||
|
||||
async def handle_terms_accepted(
|
||||
self, request: Request, session_id: str, terms_version: str
|
||||
):
|
||||
"""Handle a request to the new-user 'consent' endpoint
|
||||
|
||||
Will serve an HTTP response to the request.
|
||||
|
||||
Args:
|
||||
request: HTTP request
|
||||
session_id: ID of the username mapping session, extracted from a cookie
|
||||
terms_version: the version of the terms which the user viewed and consented
|
||||
to
|
||||
"""
|
||||
logger.info(
|
||||
"[session %s] User consented to terms version %s",
|
||||
session_id,
|
||||
terms_version,
|
||||
)
|
||||
session = self.get_mapping_session(session_id)
|
||||
session.terms_accepted_version = terms_version
|
||||
|
||||
# we're done; now we can register the user
|
||||
respond_with_redirect(request, b"/_synapse/client/sso_register")
|
||||
|
||||
async def register_sso_user(self, request: Request, session_id: str) -> None:
|
||||
"""Called once we have all the info we need to register a new user.
|
||||
|
||||
Does so and serves an HTTP response
|
||||
|
||||
Args:
|
||||
request: HTTP request
|
||||
session_id: ID of the username mapping session, extracted from a cookie
|
||||
"""
|
||||
session = self.get_mapping_session(session_id)
|
||||
|
||||
logger.info(
|
||||
"[session %s] Registering localpart %s",
|
||||
session_id,
|
||||
session.chosen_localpart,
|
||||
)
|
||||
|
||||
attributes = UserAttributes(
|
||||
localpart=localpart,
|
||||
display_name=session.display_name,
|
||||
emails=session.emails,
|
||||
localpart=session.chosen_localpart, emails=session.emails_to_use,
|
||||
)
|
||||
|
||||
if session.use_display_name:
|
||||
attributes.display_name = session.display_name
|
||||
|
||||
# the following will raise a 400 error if the username has been taken in the
|
||||
# meantime.
|
||||
user_id = await self._register_mapped_user(
|
||||
|
@ -702,7 +833,12 @@ class SsoHandler:
|
|||
request.getClientIP(),
|
||||
)
|
||||
|
||||
logger.info("[session %s] Registered userid %s", session_id, user_id)
|
||||
logger.info(
|
||||
"[session %s] Registered userid %s with attributes %s",
|
||||
session_id,
|
||||
user_id,
|
||||
attributes,
|
||||
)
|
||||
|
||||
# delete the mapping session and the cookie
|
||||
del self._username_mapping_sessions[session_id]
|
||||
|
@ -715,11 +851,21 @@ class SsoHandler:
|
|||
path=b"/",
|
||||
)
|
||||
|
||||
auth_result = {}
|
||||
if session.terms_accepted_version:
|
||||
# TODO: make this less awful.
|
||||
auth_result[LoginType.TERMS] = True
|
||||
|
||||
await self._registration_handler.post_registration_actions(
|
||||
user_id, auth_result, access_token=None
|
||||
)
|
||||
|
||||
await self._auth_handler.complete_sso_login(
|
||||
user_id,
|
||||
request,
|
||||
session.client_redirect_url,
|
||||
session.extra_login_attributes,
|
||||
new_user=True,
|
||||
)
|
||||
|
||||
def _expire_old_sessions(self):
|
||||
|
@ -733,3 +879,14 @@ class SsoHandler:
|
|||
for session_id in to_expire:
|
||||
logger.info("Expiring mapping session %s", session_id)
|
||||
del self._username_mapping_sessions[session_id]
|
||||
|
||||
|
||||
def get_username_mapping_session_cookie_from_request(request: IRequest) -> str:
|
||||
"""Extract the session ID from the cookie
|
||||
|
||||
Raises a SynapseError if the cookie isn't found
|
||||
"""
|
||||
session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
|
||||
if not session_id:
|
||||
raise SynapseError(code=400, msg="missing session_id")
|
||||
return session_id.decode("ascii", errors="replace")
|
||||
|
|
|
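A runnable sketch of the IdP-picker fallback above: when several identity providers are registered and the client did not choose one, the handler sends the browser to the picker with the client's redirect URL in the query string. The diff truncates the `urlencode(` call, so the parameter name and URL below are illustrative assumptions, not taken from the source:

from urllib.parse import urlencode

# Illustrative only: the query parameter name is an assumption; the real
# call site passes through the client_redirect_url it received.
client_redirect_url = "https://client.example.org/?param=1"
print("/_synapse/client/pick_idp?" + urlencode({"redirectUrl": client_redirect_url}))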
@@ -14,15 +14,25 @@
# limitations under the License.

import logging
from typing import TYPE_CHECKING, Optional

if TYPE_CHECKING:
    from synapse.app.homeserver import HomeServer

logger = logging.getLogger(__name__)


class StateDeltasHandler:
    def __init__(self, hs):
    def __init__(self, hs: "HomeServer"):
        self.store = hs.get_datastore()

    async def _get_key_change(self, prev_event_id, event_id, key_name, public_value):
    async def _get_key_change(
        self,
        prev_event_id: Optional[str],
        event_id: Optional[str],
        key_name: str,
        public_value: str,
    ) -> Optional[bool]:
        """Given two events check if the `key_name` field in content changed
        from not matching `public_value` to doing so.
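The docstring above describes a three-valued comparison. A minimal pure-Python sketch of the same truth table follows; this is not the handler's implementation (which first loads the two events by ID from the database), just the decision it ends up making:

from typing import Optional

def key_change(prev: Optional[str], new: Optional[str], public_value: str) -> Optional[bool]:
    # None: no change either way; True: the field changed to match
    # public_value; False: it changed away from public_value.
    prev_matches = prev == public_value
    new_matches = new == public_value
    if prev_matches == new_matches:
        return None
    return new_matches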
@@ -12,13 +12,19 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
from collections import Counter
from typing import TYPE_CHECKING, Any, Dict, Iterable, Optional, Tuple

from typing_extensions import Counter as CounterType

from synapse.api.constants import EventTypes, Membership
from synapse.metrics import event_processing_positions
from synapse.metrics.background_process_metrics import run_as_background_process
from synapse.types import JsonDict

if TYPE_CHECKING:
    from synapse.app.homeserver import HomeServer

logger = logging.getLogger(__name__)

@@ -31,7 +37,7 @@ class StatsHandler:
    Heavily derived from UserDirectoryHandler
    """

    def __init__(self, hs):
    def __init__(self, hs: "HomeServer"):
        self.hs = hs
        self.store = hs.get_datastore()
        self.state = hs.get_state_handler()

@@ -44,7 +50,7 @@ class StatsHandler:
        self.stats_enabled = hs.config.stats_enabled

        # The current position in the current_state_delta stream
        self.pos = None
        self.pos = None  # type: Optional[int]

        # Guard to ensure we only process deltas one at a time
        self._is_processing = False

@@ -56,7 +62,7 @@ class StatsHandler:
        # we start populating stats
        self.clock.call_later(0, self.notify_new_event)

    def notify_new_event(self):
    def notify_new_event(self) -> None:
        """Called when there may be more deltas to process
        """
        if not self.stats_enabled or self._is_processing:

@@ -72,7 +78,7 @@ class StatsHandler:

        run_as_background_process("stats.notify_new_event", process)

    async def _unsafe_process(self):
    async def _unsafe_process(self) -> None:
        # If self.pos is None then means we haven't fetched it from DB
        if self.pos is None:
            self.pos = await self.store.get_stats_positions()

@@ -110,10 +116,10 @@ class StatsHandler:
            )

            for room_id, fields in room_count.items():
                room_deltas.setdefault(room_id, {}).update(fields)
                room_deltas.setdefault(room_id, Counter()).update(fields)

            for user_id, fields in user_count.items():
                user_deltas.setdefault(user_id, {}).update(fields)
                user_deltas.setdefault(user_id, Counter()).update(fields)

            logger.debug("room_deltas: %s", room_deltas)
            logger.debug("user_deltas: %s", user_deltas)

@@ -131,19 +137,20 @@ class StatsHandler:

        self.pos = max_pos

    async def _handle_deltas(self, deltas):
    async def _handle_deltas(
        self, deltas: Iterable[JsonDict]
    ) -> Tuple[Dict[str, CounterType[str]], Dict[str, CounterType[str]]]:
        """Called with the state deltas to process

        Returns:
            tuple[dict[str, Counter], dict[str, counter]]
            Two dicts: the room deltas and the user deltas,
            mapping from room/user ID to changes in the various fields.
        """

        room_to_stats_deltas = {}
        user_to_stats_deltas = {}
        room_to_stats_deltas = {}  # type: Dict[str, CounterType[str]]
        user_to_stats_deltas = {}  # type: Dict[str, CounterType[str]]

        room_to_state_updates = {}
        room_to_state_updates = {}  # type: Dict[str, Dict[str, Any]]

        for delta in deltas:
            typ = delta["type"]

@@ -173,7 +180,7 @@ class StatsHandler:
                )
                continue

            event_content = {}
            event_content = {}  # type: JsonDict

            sender = None
            if event_id is not None:

@@ -257,13 +264,13 @@ class StatsHandler:
                    )

                    if has_changed_joinedness:
                        delta = +1 if membership == Membership.JOIN else -1
                        membership_delta = +1 if membership == Membership.JOIN else -1

                        user_to_stats_deltas.setdefault(user_id, Counter())[
                            "joined_rooms"
                        ] += delta
                        ] += membership_delta

                        room_stats_delta["local_users_in_room"] += delta
                        room_stats_delta["local_users_in_room"] += membership_delta

            elif typ == EventTypes.Create:
                room_state["is_federatable"] = (
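A runnable aside on why the `setdefault(..., Counter())` change above is not just a type tweak: `dict.update` overwrites shared keys, while `Counter.update` adds to them, which is what per-room delta accumulation needs (the room ID below is made up):

from collections import Counter

room_deltas = {}
room_deltas.setdefault("!room:example.org", Counter()).update({"joined_members": 1})
room_deltas.setdefault("!room:example.org", Counter()).update({"joined_members": 2})
print(room_deltas["!room:example.org"]["joined_members"])  # 3: counts accumulate

plain = {}
plain.setdefault("!room:example.org", {}).update({"joined_members": 1})
plain.setdefault("!room:example.org", {}).update({"joined_members": 2})
print(plain["!room:example.org"]["joined_members"])  # 2: plain dicts overwrite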
@@ -15,13 +15,13 @@
import logging
import random
from collections import namedtuple
from typing import TYPE_CHECKING, List, Set, Tuple
from typing import TYPE_CHECKING, Dict, Iterable, List, Optional, Set, Tuple

from synapse.api.errors import AuthError, ShadowBanError, SynapseError
from synapse.appservice import ApplicationService
from synapse.metrics.background_process_metrics import run_as_background_process
from synapse.replication.tcp.streams import TypingStream
from synapse.types import JsonDict, UserID, get_domain_from_id
from synapse.types import JsonDict, Requester, UserID, get_domain_from_id
from synapse.util.caches.stream_change_cache import StreamChangeCache
from synapse.util.metrics import Measure
from synapse.util.wheel_timer import WheelTimer

@@ -65,17 +65,17 @@ class FollowerTypingHandler:
        )

        # map room IDs to serial numbers
        self._room_serials = {}
        self._room_serials = {}  # type: Dict[str, int]
        # map room IDs to sets of users currently typing
        self._room_typing = {}
        self._room_typing = {}  # type: Dict[str, Set[str]]

        self._member_last_federation_poke = {}
        self._member_last_federation_poke = {}  # type: Dict[RoomMember, int]
        self.wheel_timer = WheelTimer(bucket_size=5000)
        self._latest_room_serial = 0

        self.clock.looping_call(self._handle_timeouts, 5000)

    def _reset(self):
    def _reset(self) -> None:
        """Reset the typing handler's data caches.
        """
        # map room IDs to serial numbers

@@ -86,7 +86,7 @@ class FollowerTypingHandler:
        self._member_last_federation_poke = {}
        self.wheel_timer = WheelTimer(bucket_size=5000)

    def _handle_timeouts(self):
    def _handle_timeouts(self) -> None:
        logger.debug("Checking for typing timeouts")

        now = self.clock.time_msec()

@@ -96,7 +96,7 @@ class FollowerTypingHandler:
        for member in members:
            self._handle_timeout_for_member(now, member)

    def _handle_timeout_for_member(self, now: int, member: RoomMember):
    def _handle_timeout_for_member(self, now: int, member: RoomMember) -> None:
        if not self.is_typing(member):
            # Nothing to do if they're no longer typing
            return

@@ -114,10 +114,10 @@ class FollowerTypingHandler:
        # each person typing.
        self.wheel_timer.insert(now=now, obj=member, then=now + 60 * 1000)

    def is_typing(self, member):
    def is_typing(self, member: RoomMember) -> bool:
        return member.user_id in self._room_typing.get(member.room_id, [])

    async def _push_remote(self, member, typing):
    async def _push_remote(self, member: RoomMember, typing: bool) -> None:
        if not self.federation:
            return

@@ -148,7 +148,7 @@ class FollowerTypingHandler:

    def process_replication_rows(
        self, token: int, rows: List[TypingStream.TypingStreamRow]
    ):
    ) -> None:
        """Should be called whenever we receive updates for typing stream.
        """

@@ -178,7 +178,7 @@ class FollowerTypingHandler:

    async def _send_changes_in_typing_to_remotes(
        self, room_id: str, prev_typing: Set[str], now_typing: Set[str]
    ):
    ) -> None:
        """Process a change in typing of a room from replication, sending EDUs
        for any local users.
        """

@@ -194,12 +194,12 @@ class FollowerTypingHandler:
            if self.is_mine_id(user_id):
                await self._push_remote(RoomMember(room_id, user_id), False)

    def get_current_token(self):
    def get_current_token(self) -> int:
        return self._latest_room_serial


class TypingWriterHandler(FollowerTypingHandler):
    def __init__(self, hs):
    def __init__(self, hs: "HomeServer"):
        super().__init__(hs)

        assert hs.config.worker.writers.typing == hs.get_instance_name()

@@ -213,14 +213,15 @@ class TypingWriterHandler(FollowerTypingHandler):

        hs.get_distributor().observe("user_left_room", self.user_left_room)

        self._member_typing_until = {}  # clock time we expect to stop
        # clock time we expect to stop
        self._member_typing_until = {}  # type: Dict[RoomMember, int]

        # caches which room_ids changed at which serials
        self._typing_stream_change_cache = StreamChangeCache(
            "TypingStreamChangeCache", self._latest_room_serial
        )

    def _handle_timeout_for_member(self, now: int, member: RoomMember):
    def _handle_timeout_for_member(self, now: int, member: RoomMember) -> None:
        super()._handle_timeout_for_member(now, member)

        if not self.is_typing(member):

@@ -233,7 +234,9 @@ class TypingWriterHandler(FollowerTypingHandler):
            self._stopped_typing(member)
            return

    async def started_typing(self, target_user, requester, room_id, timeout):
    async def started_typing(
        self, target_user: UserID, requester: Requester, room_id: str, timeout: int
    ) -> None:
        target_user_id = target_user.to_string()
        auth_user_id = requester.user.to_string()

@@ -263,11 +266,13 @@ class TypingWriterHandler(FollowerTypingHandler):

        if was_present:
            # No point sending another notification
            return None
            return

        self._push_update(member=member, typing=True)

    async def stopped_typing(self, target_user, requester, room_id):
    async def stopped_typing(
        self, target_user: UserID, requester: Requester, room_id: str
    ) -> None:
        target_user_id = target_user.to_string()
        auth_user_id = requester.user.to_string()

@@ -290,23 +295,23 @@ class TypingWriterHandler(FollowerTypingHandler):

        self._stopped_typing(member)

    def user_left_room(self, user, room_id):
    def user_left_room(self, user: UserID, room_id: str) -> None:
        user_id = user.to_string()
        if self.is_mine_id(user_id):
            member = RoomMember(room_id=room_id, user_id=user_id)
            self._stopped_typing(member)

    def _stopped_typing(self, member):
    def _stopped_typing(self, member: RoomMember) -> None:
        if member.user_id not in self._room_typing.get(member.room_id, set()):
            # No point
            return None
            return

        self._member_typing_until.pop(member, None)
        self._member_last_federation_poke.pop(member, None)

        self._push_update(member=member, typing=False)

    def _push_update(self, member, typing):
    def _push_update(self, member: RoomMember, typing: bool) -> None:
        if self.hs.is_mine_id(member.user_id):
            # Only send updates for changes to our own users.
            run_as_background_process(

@@ -315,7 +320,7 @@ class TypingWriterHandler(FollowerTypingHandler):

        self._push_update_local(member=member, typing=typing)

    async def _recv_edu(self, origin, content):
    async def _recv_edu(self, origin: str, content: JsonDict) -> None:
        room_id = content["room_id"]
        user_id = content["user_id"]

@@ -340,7 +345,7 @@ class TypingWriterHandler(FollowerTypingHandler):
        self.wheel_timer.insert(now=now, obj=member, then=now + FEDERATION_TIMEOUT)
        self._push_update_local(member=member, typing=content["typing"])

    def _push_update_local(self, member, typing):
    def _push_update_local(self, member: RoomMember, typing: bool) -> None:
        room_set = self._room_typing.setdefault(member.room_id, set())
        if typing:
            room_set.add(member.user_id)

@@ -386,7 +391,7 @@ class TypingWriterHandler(FollowerTypingHandler):

        changed_rooms = self._typing_stream_change_cache.get_all_entities_changed(
            last_id
        )
        )  # type: Optional[Iterable[str]]

        if changed_rooms is None:
            changed_rooms = self._room_serials

@@ -412,13 +417,13 @@ class TypingWriterHandler(FollowerTypingHandler):

    def process_replication_rows(
        self, token: int, rows: List[TypingStream.TypingStreamRow]
    ):
    ) -> None:
        # The writing process should never get updates from replication.
        raise Exception("Typing writer instance got typing info over replication")


class TypingNotificationEventSource:
    def __init__(self, hs):
    def __init__(self, hs: "HomeServer"):
        self.hs = hs
        self.clock = hs.get_clock()
        # We can't call get_typing_handler here because there's a cycle:

@@ -427,7 +432,7 @@ class TypingNotificationEventSource:
        #
        self.get_typing_handler = hs.get_typing_handler

    def _make_event_for(self, room_id):
    def _make_event_for(self, room_id: str) -> JsonDict:
        typing = self.get_typing_handler()._room_typing[room_id]
        return {
            "type": "m.typing",

@@ -462,7 +467,9 @@ class TypingNotificationEventSource:

        return (events, handler._latest_room_serial)

    async def get_new_events(self, from_key, room_ids, **kwargs):
    async def get_new_events(
        self, from_key: int, room_ids: Iterable[str], **kwargs
    ) -> Tuple[List[JsonDict], int]:
        with Measure(self.clock, "typing.get_new_events"):
            from_key = int(from_key)
            handler = self.get_typing_handler()

@@ -478,5 +485,5 @@ class TypingNotificationEventSource:

        return (events, handler._latest_room_serial)

    def get_current_key(self):
    def get_current_key(self) -> int:
        return self.get_typing_handler()._latest_room_serial
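For reference, the payload that `_make_event_for` assembles (the diff truncates it after `"type": "m.typing"`) follows the standard `m.typing` ephemeral-event format. The sketch below is an assumption based on that format, with made-up identifiers, not a quote of the elided source:

# Illustrative m.typing event, as served to clients:
typing_event = {
    "type": "m.typing",
    "room_id": "!room:example.org",
    "content": {"user_ids": ["@alice:example.org"]},
}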
@@ -145,10 +145,6 @@ class UserDirectoryHandler(StateDeltasHandler):
        if self.pos is None:
            self.pos = await self.store.get_user_directory_stream_pos()

        # If still None then the initial background update hasn't happened yet
        if self.pos is None:
            return None

        # Loop round handling deltas until we're up to date
        while True:
            with Measure(self.clock, "user_dir_delta"):

@@ -233,6 +229,11 @@ class UserDirectoryHandler(StateDeltasHandler):

                    if change:  # The user joined
                        event = await self.store.get_event(event_id, allow_none=True)
                        # It isn't expected for this event to not exist, but we
                        # don't want the entire background process to break.
                        if event is None:
                            continue

                        profile = ProfileInfo(
                            avatar_url=event.content.get("avatar_url"),
                            display_name=event.content.get("displayname"),
@@ -22,10 +22,22 @@ import types
import urllib
from http import HTTPStatus
from io import BytesIO
from typing import Any, Callable, Dict, Iterator, List, Tuple, Union
from typing import (
    Any,
    Awaitable,
    Callable,
    Dict,
    Iterable,
    Iterator,
    List,
    Pattern,
    Tuple,
    Union,
)

import jinja2
from canonicaljson import iterencode_canonical_json
from typing_extensions import Protocol
from zope.interface import implementer

from twisted.internet import defer, interfaces

@@ -168,11 +180,25 @@ def wrap_async_request_handler(h):
    return preserve_fn(wrapped_async_request_handler)


class HttpServer:
# Type of a callback method for processing requests
# it is actually called with a SynapseRequest and a kwargs dict for the params,
# but I can't figure out how to represent that.
ServletCallback = Callable[
    ..., Union[None, Awaitable[None], Tuple[int, Any], Awaitable[Tuple[int, Any]]]
]


class HttpServer(Protocol):
    """ Interface for registering callbacks on a HTTP server
    """

    def register_paths(self, method, path_patterns, callback):
    def register_paths(
        self,
        method: str,
        path_patterns: Iterable[Pattern],
        callback: ServletCallback,
        servlet_classname: str,
    ) -> None:
        """ Register a callback that gets fired if we receive a http request
        with the given method for a path that matches the given regex.

@@ -180,12 +206,14 @@ class HttpServer:
        an unpacked tuple.

        Args:
            method (str): The method to listen to.
            path_patterns (list<SRE_Pattern>): The regex used to match requests.
            callback (function): The function to fire if we receive a matched
            method: The HTTP method to listen to.
            path_patterns: The regex used to match requests.
            callback: The function to fire if we receive a matched
                request. The first argument will be the request object and
                subsequent arguments will be any matched groups from the regex.
                This should return a tuple of (code, response).
                This should return either tuple of (code, response), or None.
            servlet_classname (str): The name of the handler to be used in prometheus
                and opentracing logs.
        """
        pass

@@ -354,7 +382,7 @@ class JsonResource(DirectServeJsonResource):

    def _get_handler_for_request(
        self, request: SynapseRequest
    ) -> Tuple[Callable, str, Dict[str, str]]:
    ) -> Tuple[ServletCallback, str, Dict[str, str]]:
        """Finds a callback method to handle the given request.

        Returns:

@@ -733,6 +761,13 @@ def set_clickjacking_protection_headers(request: Request):
    request.setHeader(b"Content-Security-Policy", b"frame-ancestors 'none';")


def respond_with_redirect(request: Request, url: bytes) -> None:
    """Write a 302 response to the request, if it is still alive."""
    logger.debug("Redirect to %s", url.decode("utf-8"))
    request.redirect(url)
    finish_request(request)


def finish_request(request: Request):
    """ Finish writing the response to the request.
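A sketch of what a `ServletCallback` registered through this interface looks like. The path and servlet name are made-up examples, and the commented-out registration call assumes some `http_server` object implementing `HttpServer`:

import re

PATTERN = re.compile("^/_example/hello$")  # hypothetical path

async def on_GET(request):
    # Per the updated docstring, a callback may return (code, response)
    # or None (when it has already written the response itself).
    return 200, {"hello": "world"}

# http_server.register_paths("GET", [PATTERN], on_GET, "HelloServlet")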
@@ -791,7 +791,7 @@ def tag_args(func):

    @wraps(func)
    def _tag_args_inner(*args, **kwargs):
        argspec = inspect.getargspec(func)
        argspec = inspect.getfullargspec(func)
        for i, arg in enumerate(argspec.args[1:]):
            set_tag("ARG_" + arg, args[i])
        set_tag("args", args[len(argspec.args) :])
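A runnable illustration of why this one-line change matters: `inspect.getargspec` was long deprecated (and is removed in Python 3.11), and it raises `ValueError` for functions with annotations or keyword-only arguments, both common in Synapse:

import inspect

def traced(self, room_id: str, *, limit: int = 10):
    pass

spec = inspect.getfullargspec(traced)
print(spec.args)        # ['self', 'room_id']
print(spec.kwonlyargs)  # ['limit']
# inspect.getargspec(traced) would raise ValueError here, and the function
# no longer exists at all on Python 3.11+.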
@@ -279,7 +279,11 @@ class ModuleApi:
        )

    async def complete_sso_login_async(
        self, registered_user_id: str, request: SynapseRequest, client_redirect_url: str
        self,
        registered_user_id: str,
        request: SynapseRequest,
        client_redirect_url: str,
        new_user: bool = False,
    ):
        """Complete a SSO login by redirecting the user to a page to confirm whether they
        want their access token sent to `client_redirect_url`, or redirect them to that

@@ -291,9 +295,11 @@ class ModuleApi:
            request: The request to respond to.
            client_redirect_url: The URL to which to offer to redirect the user (or to
                redirect them directly if whitelisted).
            new_user: set to true to use wording for the consent appropriate to a user
                who has just registered.
        """
        await self._auth_handler.complete_sso_login(
            registered_user_id, request, client_redirect_url,
            registered_user_id, request, client_redirect_url, new_user=new_user
        )

    @defer.inlineCallbacks
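How a module might invoke the extended API from its own request handler; the identifiers and values here are illustrative, and `module_api` is assumed to be the `ModuleApi` instance handed to the module:

# Inside a module's async code path, after registering a new user
# (illustrative, not part of the diff):
#
# await module_api.complete_sso_login_async(
#     registered_user_id="@alice:example.org",
#     request=request,
#     client_redirect_url="https://client.example.org/",
#     new_user=True,
# )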
@@ -267,9 +267,21 @@ class Mailer:
            fallback_to_members=True,
        )

        summary_text = await self.make_summary_text(
            notifs_by_room, state_by_room, notif_events, user_id, reason
        )
        if len(notifs_by_room) == 1:
            # Only one room has new stuff
            room_id = list(notifs_by_room.keys())[0]

            summary_text = await self.make_summary_text_single_room(
                room_id,
                notifs_by_room[room_id],
                state_by_room[room_id],
                notif_events,
                user_id,
            )
        else:
            summary_text = await self.make_summary_text(
                notifs_by_room, state_by_room, notif_events, reason
            )

        template_vars = {
            "user_display_name": user_display_name,

@@ -492,138 +504,177 @@ class Mailer:
        if "url" in event.content:
            messagevars["image_url"] = event.content["url"]

    async def make_summary_text_single_room(
        self,
        room_id: str,
        notifs: List[Dict[str, Any]],
        room_state_ids: StateMap[str],
        notif_events: Dict[str, EventBase],
        user_id: str,
    ) -> str:
        """
        Make a summary text for the email when only a single room has notifications.

        Args:
            room_id: The ID of the room.
            notifs: The notifications for this room.
            room_state_ids: The state map for the room.
            notif_events: A map of event ID -> notification event.
            user_id: The user receiving the notification.

        Returns:
            The summary text.
        """
        # If the room has some kind of name, use it, but we don't
        # want the generated-from-names one here otherwise we'll
        # end up with, "new message from Bob in the Bob room"
        room_name = await calculate_room_name(
            self.store, room_state_ids, user_id, fallback_to_members=False
        )

        # See if one of the notifs is an invite event for the user
        invite_event = None
        for n in notifs:
            ev = notif_events[n["event_id"]]
            if ev.type == EventTypes.Member and ev.state_key == user_id:
                if ev.content.get("membership") == Membership.INVITE:
                    invite_event = ev
                    break

        if invite_event:
            inviter_member_event_id = room_state_ids.get(
                ("m.room.member", invite_event.sender)
            )
            inviter_name = invite_event.sender
            if inviter_member_event_id:
                inviter_member_event = await self.store.get_event(
                    inviter_member_event_id, allow_none=True
                )
                if inviter_member_event:
                    inviter_name = name_from_member_event(inviter_member_event)

            if room_name is None:
                return self.email_subjects.invite_from_person % {
                    "person": inviter_name,
                    "app": self.app_name,
                }

            return self.email_subjects.invite_from_person_to_room % {
                "person": inviter_name,
                "room": room_name,
                "app": self.app_name,
            }

        if len(notifs) == 1:
            # There is just the one notification, so give some detail
            sender_name = None
            event = notif_events[notifs[0]["event_id"]]
            if ("m.room.member", event.sender) in room_state_ids:
                state_event_id = room_state_ids[("m.room.member", event.sender)]
                state_event = await self.store.get_event(state_event_id)
                sender_name = name_from_member_event(state_event)

            if sender_name is not None and room_name is not None:
                return self.email_subjects.message_from_person_in_room % {
                    "person": sender_name,
                    "room": room_name,
                    "app": self.app_name,
                }
            elif sender_name is not None:
                return self.email_subjects.message_from_person % {
                    "person": sender_name,
                    "app": self.app_name,
                }

            # The sender is unknown, just use the room name (or ID).
            return self.email_subjects.messages_in_room % {
                "room": room_name or room_id,
                "app": self.app_name,
            }
        else:
            # There's more than one notification for this room, so just
            # say there are several
            if room_name is not None:
                return self.email_subjects.messages_in_room % {
                    "room": room_name,
                    "app": self.app_name,
                }

            return await self.make_summary_text_from_member_events(
                room_id, notifs, room_state_ids, notif_events
            )

    async def make_summary_text(
        self,
        notifs_by_room: Dict[str, List[Dict[str, Any]]],
        room_state_ids: Dict[str, StateMap[str]],
        notif_events: Dict[str, EventBase],
        user_id: str,
        reason: Dict[str, Any],
    ):
        if len(notifs_by_room) == 1:
            # Only one room has new stuff
            room_id = list(notifs_by_room.keys())[0]
    ) -> str:
        """
        Make a summary text for the email when multiple rooms have notifications.

            # If the room has some kind of name, use it, but we don't
            # want the generated-from-names one here otherwise we'll
            # end up with, "new message from Bob in the Bob room"
            room_name = await calculate_room_name(
                self.store, room_state_ids[room_id], user_id, fallback_to_members=False
            )
        Args:
            notifs_by_room: A map of room ID to the notifications for that room.
            room_state_ids: A map of room ID to the state map for that room.
            notif_events: A map of event ID -> notification event.
            reason: The reason this notification is being sent.

            # See if one of the notifs is an invite event for the user
            invite_event = None
            for n in notifs_by_room[room_id]:
                ev = notif_events[n["event_id"]]
                if ev.type == EventTypes.Member and ev.state_key == user_id:
                    if ev.content.get("membership") == Membership.INVITE:
                        invite_event = ev
                        break
        Returns:
            The summary text.
        """
        # Stuff's happened in multiple different rooms
        # ...but we still refer to the 'reason' room which triggered the mail
        if reason["room_name"] is not None:
            return self.email_subjects.messages_in_room_and_others % {
                "room": reason["room_name"],
                "app": self.app_name,
            }

            if invite_event:
                inviter_member_event_id = room_state_ids[room_id].get(
                    ("m.room.member", invite_event.sender)
                )
                inviter_name = invite_event.sender
                if inviter_member_event_id:
                    inviter_member_event = await self.store.get_event(
                        inviter_member_event_id, allow_none=True
                    )
                    if inviter_member_event:
                        inviter_name = name_from_member_event(inviter_member_event)
        room_id = reason["room_id"]
        return await self.make_summary_text_from_member_events(
            room_id, notifs_by_room[room_id], room_state_ids[room_id], notif_events
        )

                if room_name is None:
                    return self.email_subjects.invite_from_person % {
                        "person": inviter_name,
                        "app": self.app_name,
                    }
                else:
                    return self.email_subjects.invite_from_person_to_room % {
                        "person": inviter_name,
                        "room": room_name,
                        "app": self.app_name,
                    }
    async def make_summary_text_from_member_events(
        self,
        room_id: str,
        notifs: List[Dict[str, Any]],
        room_state_ids: StateMap[str],
        notif_events: Dict[str, EventBase],
    ) -> str:
        """
        Make a summary text for the email when only a single room has notifications.

            sender_name = None
            if len(notifs_by_room[room_id]) == 1:
                # There is just the one notification, so give some detail
                event = notif_events[notifs_by_room[room_id][0]["event_id"]]
                if ("m.room.member", event.sender) in room_state_ids[room_id]:
                    state_event_id = room_state_ids[room_id][
                        ("m.room.member", event.sender)
                    ]
                    state_event = await self.store.get_event(state_event_id)
                    sender_name = name_from_member_event(state_event)
        Args:
            room_id: The ID of the room.
            notifs: The notifications for this room.
            room_state_ids: The state map for the room.
            notif_events: A map of event ID -> notification event.

                if sender_name is not None and room_name is not None:
                    return self.email_subjects.message_from_person_in_room % {
                        "person": sender_name,
                        "room": room_name,
                        "app": self.app_name,
                    }
                elif sender_name is not None:
                    return self.email_subjects.message_from_person % {
                        "person": sender_name,
                        "app": self.app_name,
                    }
            else:
                # There's more than one notification for this room, so just
                # say there are several
                if room_name is not None:
                    return self.email_subjects.messages_in_room % {
                        "room": room_name,
                        "app": self.app_name,
                    }
                else:
                    # If the room doesn't have a name, say who the messages
                    # are from explicitly to avoid, "messages in the Bob room"
                    sender_ids = list(
                        {
                            notif_events[n["event_id"]].sender
                            for n in notifs_by_room[room_id]
                        }
                    )
        Returns:
            The summary text.
        """
        # If the room doesn't have a name, say who the messages
        # are from explicitly to avoid, "messages in the Bob room"
        sender_ids = {notif_events[n["event_id"]].sender for n in notifs}

                    member_events = await self.store.get_events(
                        [
                            room_state_ids[room_id][("m.room.member", s)]
                            for s in sender_ids
                        ]
                    )
        member_events = await self.store.get_events(
            [room_state_ids[("m.room.member", s)] for s in sender_ids]
        )

                    return self.email_subjects.messages_from_person % {
                        "person": descriptor_from_member_events(member_events.values()),
                        "app": self.app_name,
                    }
        else:
            # Stuff's happened in multiple different rooms
        # There was a single sender.
        if len(sender_ids) == 1:
            return self.email_subjects.messages_from_person % {
                "person": descriptor_from_member_events(member_events.values()),
                "app": self.app_name,
            }

            # ...but we still refer to the 'reason' room which triggered the mail
            if reason["room_name"] is not None:
                return self.email_subjects.messages_in_room_and_others % {
                    "room": reason["room_name"],
                    "app": self.app_name,
                }
            else:
                # If the reason room doesn't have a name, say who the messages
                # are from explicitly to avoid, "messages in the Bob room"
                room_id = reason["room_id"]

                sender_ids = list(
                    {
                        notif_events[n["event_id"]].sender
                        for n in notifs_by_room[room_id]
                    }
                )

                member_events = await self.store.get_events(
                    [room_state_ids[room_id][("m.room.member", s)] for s in sender_ids]
                )

                return self.email_subjects.messages_from_person_and_others % {
                    "person": descriptor_from_member_events(member_events.values()),
                    "app": self.app_name,
                }
        # There was more than one sender, use the first one and a tweaked template.
        return self.email_subjects.messages_from_person_and_others % {
            "person": descriptor_from_member_events(list(member_events.values())[:1]),
            "app": self.app_name,
        }

    def make_room_link(self, room_id: str) -> str:
        if self.hs.config.email_riot_base_url:

@@ -668,6 +719,15 @@ class Mailer:


def safe_markup(raw_html: str) -> jinja2.Markup:
    """
    Sanitise a raw HTML string to a set of allowed tags and attributes, and linkify any bare URLs.

    Args
        raw_html: Unsafe HTML.

    Returns:
        A Markup object ready to safely use in a Jinja template.
    """
    return jinja2.Markup(
        bleach.linkify(
            bleach.clean(

@@ -684,8 +744,13 @@ def safe_markup(raw_html: str) -> jinja2.Markup:

def safe_text(raw_text: str) -> jinja2.Markup:
    """
    Process text: treat it as HTML but escape any tags (ie. just escape the
    HTML) then linkify it.
    Sanitise text (escape any HTML tags), and then linkify any bare URLs.

    Args
        raw_text: Unsafe text which might include HTML markup.

    Returns:
        A Markup object ready to safely use in a Jinja template.
    """
    return jinja2.Markup(
        bleach.linkify(bleach.clean(raw_text, tags=[], attributes={}, strip=False))
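The `email_subjects.*` values used throughout the summary-text methods are plain `%`-style templates. A runnable sketch of the interpolation follows; the template string is an invented example, not the actual default subject:

message_from_person_in_room = (
    "[%(app)s] You have a message from %(person)s in the %(room)s room"  # example template
)
print(message_from_person_in_room % {
    "person": "Alice",
    "room": "lunch",
    "app": "Matrix",
})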
@@ -17,7 +17,7 @@ import logging
import re
from typing import TYPE_CHECKING, Dict, Iterable, Optional

from synapse.api.constants import EventTypes
from synapse.api.constants import EventTypes, Membership
from synapse.events import EventBase
from synapse.types import StateMap

@@ -63,7 +63,7 @@ async def calculate_room_name(
        m_room_name = await store.get_event(
            room_state_ids[(EventTypes.Name, "")], allow_none=True
        )
        if m_room_name and m_room_name.content and m_room_name.content["name"]:
        if m_room_name and m_room_name.content and m_room_name.content.get("name"):
            return m_room_name.content["name"]

    # does it have a canonical alias?

@@ -74,15 +74,11 @@ async def calculate_room_name(
        if (
            canon_alias
            and canon_alias.content
            and canon_alias.content["alias"]
            and canon_alias.content.get("alias")
            and _looks_like_an_alias(canon_alias.content["alias"])
        ):
            return canon_alias.content["alias"]

    # at this point we're going to need to search the state by all state keys
    # for an event type, so rearrange the data structure
    room_state_bytype_ids = _state_as_two_level_dict(room_state_ids)

    if not fallback_to_members:
        return None

@@ -94,7 +90,7 @@ async def calculate_room_name(

    if (
        my_member_event is not None
        and my_member_event.content["membership"] == "invite"
        and my_member_event.content.get("membership") == Membership.INVITE
    ):
        if (EventTypes.Member, my_member_event.sender) in room_state_ids:
            inviter_member_event = await store.get_event(

@@ -111,6 +107,10 @@ async def calculate_room_name(
        else:
            return "Room Invite"

    # at this point we're going to need to search the state by all state keys
    # for an event type, so rearrange the data structure
    room_state_bytype_ids = _state_as_two_level_dict(room_state_ids)

    # we're going to have to generate a name based on who's in the room,
    # so find out who is in the room that isn't the user.
    if EventTypes.Member in room_state_bytype_ids:

@@ -120,8 +120,8 @@ async def calculate_room_name(
        all_members = [
            ev
            for ev in member_events.values()
            if ev.content["membership"] == "join"
            or ev.content["membership"] == "invite"
            if ev.content.get("membership") == Membership.JOIN
            or ev.content.get("membership") == Membership.INVITE
        ]
        # Sort the member events oldest-first so the we name people in the
        # order the joined (it should at least be deterministic rather than

@@ -194,11 +194,7 @@ def descriptor_from_member_events(member_events: Iterable[EventBase]) -> str:


def name_from_member_event(member_event: EventBase) -> str:
    if (
        member_event.content
        and "displayname" in member_event.content
        and member_event.content["displayname"]
    ):
    if member_event.content and member_event.content.get("displayname"):
        return member_event.content["displayname"]
    return member_event.state_key
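A one-line runnable reminder of what the repeated `content["x"]` to `content.get("x")` changes in this file buy: events whose content lacks a field no longer blow up name calculation.

content = {}  # e.g. an m.room.name event with no "name" in its content
print(content.get("name"))  # None; content["name"] would raise KeyError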
synapse/replication/tcp/external_cache.py (new file, 105 lines)

@@ -0,0 +1,105 @@
# -*- coding: utf-8 -*-
# Copyright 2021 The Matrix.org Foundation C.I.C.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import logging
from typing import TYPE_CHECKING, Any, Optional

from prometheus_client import Counter

from synapse.logging.context import make_deferred_yieldable
from synapse.util import json_decoder, json_encoder

if TYPE_CHECKING:
    from synapse.server import HomeServer

set_counter = Counter(
    "synapse_external_cache_set",
    "Number of times we set a cache",
    labelnames=["cache_name"],
)

get_counter = Counter(
    "synapse_external_cache_get",
    "Number of times we get a cache",
    labelnames=["cache_name", "hit"],
)


logger = logging.getLogger(__name__)


class ExternalCache:
    """A cache backed by an external Redis. Does nothing if no Redis is
    configured.
    """

    def __init__(self, hs: "HomeServer"):
        self._redis_connection = hs.get_outbound_redis_connection()

    def _get_redis_key(self, cache_name: str, key: str) -> str:
        return "cache_v1:%s:%s" % (cache_name, key)

    def is_enabled(self) -> bool:
        """Whether the external cache is used or not.

        It's safe to use the cache when this returns false, the methods will
        just no-op, but the function is useful to avoid doing unnecessary work.
        """
        return self._redis_connection is not None

    async def set(self, cache_name: str, key: str, value: Any, expiry_ms: int) -> None:
        """Add the key/value to the named cache, with the expiry time given.
        """

        if self._redis_connection is None:
            return

        set_counter.labels(cache_name).inc()

        # txredisapi requires the value to be string, bytes or numbers, so we
        # encode stuff in JSON.
        encoded_value = json_encoder.encode(value)

        logger.debug("Caching %s %s: %r", cache_name, key, encoded_value)

        return await make_deferred_yieldable(
            self._redis_connection.set(
                self._get_redis_key(cache_name, key), encoded_value, pexpire=expiry_ms,
            )
        )

    async def get(self, cache_name: str, key: str) -> Optional[Any]:
        """Look up a key/value in the named cache.
        """

        if self._redis_connection is None:
            return None

        result = await make_deferred_yieldable(
            self._redis_connection.get(self._get_redis_key(cache_name, key))
        )

        logger.debug("Got cache result %s %s: %r", cache_name, key, result)

        get_counter.labels(cache_name, result is not None).inc()

        if not result:
            return None

        # For some reason the integers get magically converted back to integers
        if isinstance(result, int):
            return result

        return json_decoder.decode(result)
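A sketch of how a caller would use `ExternalCache`, and the key namespacing it performs. The cache name, key, and value are made-up examples, and `hs` is assumed to be a configured HomeServer:

# Inside some handler with access to `hs` (illustrative, not from the diff):
#
#     cache = ExternalCache(hs)
#     if cache.is_enabled():
#         await cache.set("getEvent", "$event_id", {"foo": "bar"}, expiry_ms=60000)
#         value = await cache.get("getEvent", "$event_id")  # {'foo': 'bar'} or None
#
# The key that actually hits Redis is namespaced by _get_redis_key, e.g.:
#     cache_v1:getEvent:$event_id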
@@ -15,6 +15,7 @@
# limitations under the License.
import logging
from typing import (
    TYPE_CHECKING,
    Any,
    Awaitable,
    Dict,

@@ -63,6 +64,9 @@ from synapse.replication.tcp.streams import (
    TypingStream,
)

if TYPE_CHECKING:
    from synapse.server import HomeServer

logger = logging.getLogger(__name__)


@@ -88,7 +92,7 @@ class ReplicationCommandHandler:
    back out to connections.
    """

    def __init__(self, hs):
    def __init__(self, hs: "HomeServer"):
        self._replication_data_handler = hs.get_replication_data_handler()
        self._presence_handler = hs.get_presence_handler()
        self._store = hs.get_datastore()

@@ -282,13 +286,6 @@ class ReplicationCommandHandler:
        if hs.config.redis.redis_enabled:
            from synapse.replication.tcp.redis import (
                RedisDirectTcpReplicationClientFactory,
                lazyConnection,
            )

            logger.info(
                "Connecting to redis (host=%r port=%r)",
                hs.config.redis_host,
                hs.config.redis_port,
            )

            # First let's ensure that we have a ReplicationStreamer started.

@@ -299,13 +296,7 @@ class ReplicationCommandHandler:
            # connection after SUBSCRIBE is called).

            # First create the connection for sending commands.
            outbound_redis_connection = lazyConnection(
                reactor=hs.get_reactor(),
                host=hs.config.redis_host,
                port=hs.config.redis_port,
                password=hs.config.redis.redis_password,
                reconnect=True,
            )
            outbound_redis_connection = hs.get_outbound_redis_connection()

            # Now create the factory/connection for the subscription stream.
            self._factory = RedisDirectTcpReplicationClientFactory(
@ -15,7 +15,7 @@
|
|||
|
||||
import logging
|
||||
from inspect import isawaitable
|
||||
from typing import TYPE_CHECKING, Optional
|
||||
from typing import TYPE_CHECKING, Optional, Type, cast
|
||||
|
||||
import txredisapi
|
||||
|
||||
|
@ -23,6 +23,7 @@ from synapse.logging.context import PreserveLoggingContext, make_deferred_yielda
|
|||
from synapse.metrics.background_process_metrics import (
|
||||
BackgroundProcessLoggingContext,
|
||||
run_as_background_process,
|
||||
wrap_as_background_process,
|
||||
)
|
||||
from synapse.replication.tcp.commands import (
|
||||
Command,
|
||||
|
@ -59,16 +60,16 @@ class RedisSubscriber(txredisapi.SubscriberProtocol, AbstractConnection):
|
|||
immediately after initialisation.
|
||||
|
||||
Attributes:
|
||||
handler: The command handler to handle incoming commands.
|
||||
stream_name: The *redis* stream name to subscribe to and publish from
|
||||
(not anything to do with Synapse replication streams).
|
||||
outbound_redis_connection: The connection to redis to use to send
|
||||
synapse_handler: The command handler to handle incoming commands.
|
||||
synapse_stream_name: The *redis* stream name to subscribe to and publish
|
||||
from (not anything to do with Synapse replication streams).
|
||||
synapse_outbound_redis_connection: The connection to redis to use to send
|
||||
commands.
|
||||
"""
|
||||
|
||||
handler = None # type: ReplicationCommandHandler
|
||||
stream_name = None # type: str
|
||||
outbound_redis_connection = None # type: txredisapi.RedisProtocol
|
||||
synapse_handler = None # type: ReplicationCommandHandler
|
||||
synapse_stream_name = None # type: str
|
||||
synapse_outbound_redis_connection = None # type: txredisapi.RedisProtocol
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
super().__init__(*args, **kwargs)
|
||||
|
@ -88,19 +89,19 @@ class RedisSubscriber(txredisapi.SubscriberProtocol, AbstractConnection):
|
|||
# it's important to make sure that we only send the REPLICATE command once we
|
||||
# have successfully subscribed to the stream - otherwise we might miss the
|
||||
# POSITION response sent back by the other end.
|
||||
-        logger.info("Sending redis SUBSCRIBE for %s", self.stream_name)
-        await make_deferred_yieldable(self.subscribe(self.stream_name))
+        logger.info("Sending redis SUBSCRIBE for %s", self.synapse_stream_name)
+        await make_deferred_yieldable(self.subscribe(self.synapse_stream_name))
         logger.info(
             "Successfully subscribed to redis stream, sending REPLICATE command"
         )
-        self.handler.new_connection(self)
+        self.synapse_handler.new_connection(self)
         await self._async_send_command(ReplicateCommand())
         logger.info("REPLICATE successfully sent")

         # We send out our positions when there is a new connection in case the
         # other side missed updates. We do this for Redis connections as the
         # otherside won't know we've connected and so won't issue a REPLICATE.
-        self.handler.send_positions_to_connection(self)
+        self.synapse_handler.send_positions_to_connection(self)

     def messageReceived(self, pattern: str, channel: str, message: str):
         """Received a message from redis.

@@ -137,7 +138,7 @@ class RedisSubscriber(txredisapi.SubscriberProtocol, AbstractConnection):
             cmd: received command
         """

-        cmd_func = getattr(self.handler, "on_%s" % (cmd.NAME,), None)
+        cmd_func = getattr(self.synapse_handler, "on_%s" % (cmd.NAME,), None)
         if not cmd_func:
             logger.warning("Unhandled command: %r", cmd)
             return

@@ -155,7 +156,7 @@ class RedisSubscriber(txredisapi.SubscriberProtocol, AbstractConnection):
     def connectionLost(self, reason):
         logger.info("Lost connection to redis")
         super().connectionLost(reason)
-        self.handler.lost_connection(self)
+        self.synapse_handler.lost_connection(self)

         # mark the logging context as finished
         self._logging_context.__exit__(None, None, None)

@@ -183,11 +184,54 @@ class RedisSubscriber(txredisapi.SubscriberProtocol, AbstractConnection):
         tcp_outbound_commands_counter.labels(cmd.NAME, "redis").inc()

         await make_deferred_yieldable(
-            self.outbound_redis_connection.publish(self.stream_name, encoded_string)
+            self.synapse_outbound_redis_connection.publish(
+                self.synapse_stream_name, encoded_string
+            )
         )


-class RedisDirectTcpReplicationClientFactory(txredisapi.SubscriberFactory):
+class SynapseRedisFactory(txredisapi.RedisFactory):
+    """A subclass of RedisFactory that periodically sends pings to ensure that
+    we detect dead connections.
+    """
+
+    def __init__(
+        self,
+        hs: "HomeServer",
+        uuid: str,
+        dbid: Optional[int],
+        poolsize: int,
+        isLazy: bool = False,
+        handler: Type = txredisapi.ConnectionHandler,
+        charset: str = "utf-8",
+        password: Optional[str] = None,
+        replyTimeout: int = 30,
+        convertNumbers: Optional[int] = True,
+    ):
+        super().__init__(
+            uuid=uuid,
+            dbid=dbid,
+            poolsize=poolsize,
+            isLazy=isLazy,
+            handler=handler,
+            charset=charset,
+            password=password,
+            replyTimeout=replyTimeout,
+            convertNumbers=convertNumbers,
+        )
+
+        hs.get_clock().looping_call(self._send_ping, 30 * 1000)
+
+    @wrap_as_background_process("redis_ping")
+    async def _send_ping(self):
+        for connection in self.pool:
+            try:
+                await make_deferred_yieldable(connection.ping())
+            except Exception:
+                logger.warning("Failed to send ping to a redis connection")
+
+
+class RedisDirectTcpReplicationClientFactory(SynapseRedisFactory):
     """This is a reconnecting factory that connects to redis and immediately
     subscribes to a stream.

@@ -206,65 +250,62 @@ class RedisDirectTcpReplicationClientFactory(txredisapi.SubscriberFactory):
         self, hs: "HomeServer", outbound_redis_connection: txredisapi.RedisProtocol
     ):

-        super().__init__()
+        super().__init__(
+            hs,
+            uuid="subscriber",
+            dbid=None,
+            poolsize=1,
+            replyTimeout=30,
+            password=hs.config.redis.redis_password,
+        )

-        # This sets the password on the RedisFactory base class (as
-        # SubscriberFactory constructor doesn't pass it through).
-        self.password = hs.config.redis.redis_password
+        self.synapse_handler = hs.get_tcp_replication()
+        self.synapse_stream_name = hs.hostname

-        self.handler = hs.get_tcp_replication()
-        self.stream_name = hs.hostname
-
-        self.outbound_redis_connection = outbound_redis_connection
+        self.synapse_outbound_redis_connection = outbound_redis_connection

     def buildProtocol(self, addr):
-        p = super().buildProtocol(addr)  # type: RedisSubscriber
+        p = super().buildProtocol(addr)
+        p = cast(RedisSubscriber, p)

         # We do this here rather than add to the constructor of `RedisSubcriber`
         # as to do so would involve overriding `buildProtocol` entirely, however
         # the base method does some other things than just instantiating the
         # protocol.
-        p.handler = self.handler
-        p.outbound_redis_connection = self.outbound_redis_connection
-        p.stream_name = self.stream_name
-        p.password = self.password
+        p.synapse_handler = self.synapse_handler
+        p.synapse_outbound_redis_connection = self.synapse_outbound_redis_connection
+        p.synapse_stream_name = self.synapse_stream_name

         return p


 def lazyConnection(
-    reactor,
+    hs: "HomeServer",
     host: str = "localhost",
     port: int = 6379,
     dbid: Optional[int] = None,
     reconnect: bool = True,
-    charset: str = "utf-8",
     password: Optional[str] = None,
-    connectTimeout: Optional[int] = None,
-    replyTimeout: Optional[int] = None,
-    convertNumbers: bool = True,
+    replyTimeout: int = 30,
 ) -> txredisapi.RedisProtocol:
-    """Equivalent to `txredisapi.lazyConnection`, except allows specifying a
-    reactor.
+    """Creates a connection to Redis that is lazily set up and reconnects if the
+    connections is lost.
     """

-    isLazy = True
-    poolsize = 1
-
     uuid = "%s:%d" % (host, port)
-    factory = txredisapi.RedisFactory(
-        uuid,
-        dbid,
-        poolsize,
-        isLazy,
-        txredisapi.ConnectionHandler,
-        charset,
-        password,
-        replyTimeout,
-        convertNumbers,
+    factory = SynapseRedisFactory(
+        hs,
+        uuid=uuid,
+        dbid=dbid,
+        poolsize=1,
+        isLazy=True,
+        handler=txredisapi.ConnectionHandler,
+        password=password,
+        replyTimeout=replyTimeout,
     )
     factory.continueTrying = reconnect
-    for x in range(poolsize):
-        reactor.connectTCP(host, port, factory, connectTimeout)
+
+    reactor = hs.get_reactor()
+    reactor.connectTCP(host, port, factory, 30)

     return factory.handler
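The new SynapseRedisFactory above detects dead Redis connections by pinging every pooled connection on a 30-second timer. A minimal standalone sketch of the same keepalive pattern, assuming only txredisapi and Twisted's LoopingCall (the names start_pinger and PING_INTERVAL are illustrative, not part of this changeset):

    # Sketch of the keepalive pattern used by SynapseRedisFactory above,
    # assuming txredisapi and Twisted; start_pinger/PING_INTERVAL are
    # illustrative names, not part of this changeset.
    from twisted.internet import task

    PING_INTERVAL = 30  # seconds, matching the 30 * 1000 ms looping call above

    def start_pinger(factory):
        def ping_all():
            # Ping each pooled connection; a failed ping lets the
            # reconnecting factory notice and replace the dead peer.
            for connection in factory.pool:
                d = connection.ping()
                d.addErrback(lambda f: print("ping failed:", f.getErrorMessage()))

        loop = task.LoopingCall(ping_all)
        loop.start(PING_INTERVAL, now=False)
        return loop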
synapse/res/templates/sso.css (new file, 88 lines)

@@ -0,0 +1,88 @@
body {
    font-family: "Inter", "Helvetica", "Arial", sans-serif;
    font-size: 14px;
    color: #17191C;
}

header {
    max-width: 480px;
    width: 100%;
    margin: 24px auto;
    text-align: center;
}

header p {
    color: #737D8C;
    line-height: 24px;
}

h1 {
    font-size: 24px;
}

.error_page h1 {
    color: #FE2928;
}

h2 {
    font-size: 14px;
}

h2 img {
    vertical-align: middle;
    margin-right: 8px;
    width: 24px;
    height: 24px;
}

label {
    cursor: pointer;
}

main {
    max-width: 360px;
    width: 100%;
    margin: 24px auto;
}

.primary-button {
    border: none;
    text-decoration: none;
    padding: 12px;
    color: white;
    background-color: #418DED;
    font-weight: bold;
    display: block;
    border-radius: 12px;
    width: 100%;
    box-sizing: border-box;
    margin: 16px 0;
    cursor: pointer;
    text-align: center;
}

.profile {
    display: flex;
    justify-content: center;
    margin: 24px 0;
}

.profile .avatar {
    width: 36px;
    height: 36px;
    border-radius: 100%;
    display: block;
    margin-right: 8px;
}

.profile .display-name {
    font-weight: bold;
    margin-bottom: 4px;
}
.profile .user-id {
    color: #737D8C;
}

.profile .display-name, .profile .user-id {
    line-height: 18px;
}
@@ -1,10 +1,24 @@
 <!DOCTYPE html>
 <html lang="en">
-<head>
-    <meta charset="UTF-8">
-    <title>SSO account deactivated</title>
-</head>
-<body>
-    <p>This account has been deactivated.</p>
+<head>
+    <meta charset="UTF-8">
+    <title>SSO account deactivated</title>
+    <meta name="viewport" content="width=device-width, user-scalable=no">
+    <style type="text/css">
+        {% include "sso.css" without context %}
+    </style>
+</head>
+<body class="error_page">
+    <header>
+        <h1>Your account has been deactivated</h1>
+        <p>
+            <strong>No account found</strong>
+        </p>
+        <p>
+            Your account might have been deactivated by the server administrator.
+            You can either try to create a new account or contact the server’s
+            administrator.
+        </p>
+    </header>
 </body>
 </html>
synapse/res/templates/sso_auth_account_details.html (new file, 138 lines)

@@ -0,0 +1,138 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <title>Synapse Login</title>
    <meta charset="utf-8">
    <meta name="viewport" content="width=device-width, user-scalable=no">
    <style type="text/css">
        {% include "sso.css" without context %}

        .username_input {
            display: flex;
            border: 2px solid #418DED;
            border-radius: 8px;
            padding: 12px;
            position: relative;
            margin: 16px 0;
            align-items: center;
            font-size: 12px;
        }

        .username_input label {
            position: absolute;
            top: -8px;
            left: 14px;
            font-size: 80%;
            background: white;
            padding: 2px;
        }

        .username_input input {
            flex: 1;
            display: block;
            min-width: 0;
            border: none;
        }

        .username_input div {
            color: #8D99A5;
        }

        .idp-pick-details {
            border: 1px solid #E9ECF1;
            border-radius: 8px;
            margin: 24px 0;
        }

        .idp-pick-details h2 {
            margin: 0;
            padding: 8px 12px;
        }

        .idp-pick-details .idp-detail {
            border-top: 1px solid #E9ECF1;
            padding: 12px;
        }
        .idp-pick-details .check-row {
            display: flex;
            align-items: center;
        }

        .idp-pick-details .check-row .name {
            flex: 1;
        }

        .idp-pick-details .use, .idp-pick-details .idp-value {
            color: #737D8C;
        }

        .idp-pick-details .idp-value {
            margin: 0;
            margin-top: 8px;
        }

        .idp-pick-details .avatar {
            width: 53px;
            height: 53px;
            border-radius: 100%;
            display: block;
            margin-top: 8px;
        }
    </style>
</head>
<body>
    <header>
        <h1>Your account is nearly ready</h1>
        <p>Check your details before creating an account on {{ server_name }}</p>
    </header>
    <main>
        <form method="post" class="form__input" id="form">
            <div class="username_input">
                <label for="field-username">Username</label>
                <div class="prefix">@</div>
                <input type="text" name="username" id="field-username" autofocus required pattern="[a-z0-9\-=_\/\.]+">
                <div class="postfix">:{{ server_name }}</div>
            </div>
            <input type="submit" value="Continue" class="primary-button">
            {% if user_attributes %}
            <section class="idp-pick-details">
                <h2><img src="{{ idp.idp_icon | mxc_to_http(24, 24) }}"/>Information from {{ idp.idp_name }}</h2>
                {% if user_attributes.avatar_url %}
                <div class="idp-detail idp-avatar">
                    <div class="check-row">
                        <label for="idp-avatar" class="name">Avatar</label>
                        <label for="idp-avatar" class="use">Use</label>
                        <input type="checkbox" name="use_avatar" id="idp-avatar" value="true" checked>
                    </div>
                    <img src="{{ user_attributes.avatar_url }}" class="avatar" />
                </div>
                {% endif %}
                {% if user_attributes.display_name %}
                <div class="idp-detail">
                    <div class="check-row">
                        <label for="idp-displayname" class="name">Display name</label>
                        <label for="idp-displayname" class="use">Use</label>
                        <input type="checkbox" name="use_display_name" id="idp-displayname" value="true" checked>
                    </div>
                    <p class="idp-value">{{ user_attributes.display_name }}</p>
                </div>
                {% endif %}
                {% for email in user_attributes.emails %}
                <div class="idp-detail">
                    <div class="check-row">
                        <label for="idp-email{{ loop.index }}" class="name">E-mail</label>
                        <label for="idp-email{{ loop.index }}" class="use">Use</label>
                        <input type="checkbox" name="use_email" id="idp-email{{ loop.index }}" value="{{ email }}" checked>
                    </div>
                    <p class="idp-value">{{ email }}</p>
                </div>
                {% endfor %}
            </section>
            {% endif %}
        </form>
    </main>
    <script type="text/javascript">
        {% include "sso_auth_account_details.js" without context %}
    </script>
</body>
</html>
synapse/res/templates/sso_auth_account_details.js (new file, 76 lines)

@@ -0,0 +1,76 @@
const usernameField = document.getElementById("field-username");

function throttle(fn, wait) {
    let timeout;
    return function() {
        const args = Array.from(arguments);
        if (timeout) {
            clearTimeout(timeout);
        }
        timeout = setTimeout(fn.bind.apply(fn, [null].concat(args)), wait);
    }
}

function checkUsernameAvailable(username) {
    let check_uri = 'check?username=' + encodeURIComponent(username);
    return fetch(check_uri, {
        // include the cookie
        "credentials": "same-origin",
    }).then((response) => {
        if(!response.ok) {
            // for non-200 responses, raise the body of the response as an exception
            return response.text().then((text) => { throw new Error(text); });
        } else {
            return response.json();
        }
    }).then((json) => {
        if(json.error) {
            return {message: json.error};
        } else if(json.available) {
            return {available: true};
        } else {
            return {message: username + " is not available, please choose another."};
        }
    });
}

function validateUsername(username) {
    usernameField.setCustomValidity("");
    if (usernameField.validity.valueMissing) {
        usernameField.setCustomValidity("Please provide a username");
        return;
    }
    if (usernameField.validity.patternMismatch) {
        usernameField.setCustomValidity("Invalid username, please only use " + allowedCharactersString);
        return;
    }
    usernameField.setCustomValidity("Checking if username is available …");
    throttledCheckUsernameAvailable(username);
}

const throttledCheckUsernameAvailable = throttle(function(username) {
    const handleError = function(err) {
        // don't prevent form submission on error
        usernameField.setCustomValidity("");
        console.log(err.message);
    };
    try {
        checkUsernameAvailable(username).then(function(result) {
            if (!result.available) {
                usernameField.setCustomValidity(result.message);
                usernameField.reportValidity();
            } else {
                usernameField.setCustomValidity("");
            }
        }, handleError);
    } catch (err) {
        handleError(err);
    }
}, 500);

usernameField.addEventListener("input", function(evt) {
    validateUsername(usernameField.value);
});
usernameField.addEventListener("change", function(evt) {
    validateUsername(usernameField.value);
});
@@ -1,18 +1,25 @@
-<html>
-<head>
-    <title>Authentication Failed</title>
-</head>
-<body>
-    <div>
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <title>Authentication failed</title>
+    <meta name="viewport" content="width=device-width, user-scalable=no">
+    <style type="text/css">
+        {% include "sso.css" without context %}
+    </style>
+</head>
+<body class="error_page">
+    <header>
+        <h1>That doesn't look right</h1>
         <p>
-            We were unable to validate your <tt>{{server_name | e}}</tt> account via
-            single-sign-on (SSO), because the SSO Identity Provider returned
-            different details than when you logged in.
+            <strong>We were unable to validate your {{ server_name }} account</strong>
+            via single sign‑on (SSO), because the SSO Identity
+            Provider returned different details than when you logged in.
         </p>
         <p>
             Try the operation again, and ensure that you use the same details on
             the Identity Provider as when you log into your account.
         </p>
-    </div>
+    </header>
 </body>
 </html>
@@ -1,14 +1,28 @@
-<html>
-<head>
-    <title>Authentication</title>
-</head>
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <title>Authentication</title>
+    <meta name="viewport" content="width=device-width, user-scalable=no">
+    <style type="text/css">
+        {% include "sso.css" without context %}
+    </style>
+</head>
 <body>
-    <div>
+    <header>
+        <h1>Confirm it's you to continue</h1>
         <p>
-            A client is trying to {{ description | e }}. To confirm this action,
-            <a href="{{ redirect_url | e }}">re-authenticate with single sign-on</a>.
-            If you did not expect this, your account may be compromised!
+            A client is trying to {{ description }}. To confirm this action
+            re-authorize your account with single sign-on.
         </p>
-    </div>
+        <p><strong>
+            If you did not expect this, your account may be compromised.
+        </strong></p>
+    </header>
+    <main>
+        <a href="{{ redirect_url }}" class="primary-button">
+            Continue with {{ idp.idp_name }}
+        </a>
+    </main>
 </body>
 </html>
@@ -1,18 +1,27 @@
-<html>
-<head>
-    <title>Authentication Successful</title>
-    <script>
-        if (window.onAuthDone) {
-            window.onAuthDone();
-        } else if (window.opener && window.opener.postMessage) {
-            window.opener.postMessage("authDone", "*");
-        }
-    </script>
-</head>
+<!DOCTYPE html>
+<html lang="en">
+<head>
+    <meta charset="UTF-8">
+    <title>Authentication successful</title>
+    <meta name="viewport" content="width=device-width, user-scalable=no">
+    <style type="text/css">
+        {% include "sso.css" without context %}
+    </style>
+    <script>
+        if (window.onAuthDone) {
+            window.onAuthDone();
+        } else if (window.opener && window.opener.postMessage) {
+            window.opener.postMessage("authDone", "*");
+        }
+    </script>
+</head>
 <body>
-    <div>
-        <p>Thank you</p>
-        <p>You may now close this window and return to the application</p>
-    </div>
+    <header>
+        <h1>Thank you</h1>
+        <p>
+            Now we know it’s you, you can close this window and return to the
+            application.
+        </p>
+    </header>
 </body>
 </html>
@@ -1,53 +1,68 @@
 <!DOCTYPE html>
 <html lang="en">
-<head>
-    <meta charset="UTF-8">
-    <title>SSO error</title>
-</head>
-<body>
+<head>
+    <meta charset="UTF-8">
+    <title>Authentication failed</title>
+    <meta name="viewport" content="width=device-width, user-scalable=no">
+    <style type="text/css">
+        {% include "sso.css" without context %}
+
+        #error_code {
+            margin-top: 56px;
+        }
+    </style>
+</head>
+<body class="error_page">
 {# If an error of unauthorised is returned it means we have actively rejected their login #}
 {% if error == "unauthorised" %}
-    <p>You are not allowed to log in here.</p>
+    <header>
+        <p>You are not allowed to log in here.</p>
+    </header>
 {% else %}
-    <p>
-        There was an error during authentication:
-    </p>
-    <div id="errormsg" style="margin:20px 80px">{{ error_description | e }}</div>
-    <p>
-        If you are seeing this page after clicking a link sent to you via email, make
-        sure you only click the confirmation link once, and that you open the
-        validation link in the same client you're logging in from.
-    </p>
-    <p>
-        Try logging in again from your Matrix client and if the problem persists
-        please contact the server's administrator.
-    </p>
-    <p>Error: <code>{{ error }}</code></p>
+    <header>
+        <h1>There was an error</h1>
+        <p>
+            <strong id="errormsg">{{ error_description }}</strong>
+        </p>
+        <p>
+            If you are seeing this page after clicking a link sent to you via email,
+            make sure you only click the confirmation link once, and that you open
+            the validation link in the same client you're logging in from.
+        </p>
+        <p>
+            Try logging in again from your Matrix client and if the problem persists
+            please contact the server's administrator.
+        </p>
+        <div id="error_code">
+            <p><strong>Error code</strong></p>
+            <p>{{ error }}</p>
+        </div>
+    </header>

     <script type="text/javascript">
         // Error handling to support Auth0 errors that we might get through a GET request
         // to the validation endpoint. If an error is provided, it's either going to be
         // located in the query string or in a query string-like URI fragment.
         // We try to locate the error from any of these two locations, but if we can't
         // we just don't print anything specific.
         let searchStr = "";
         if (window.location.search) {
             // window.location.searchParams isn't always defined when
             // window.location.search is, so it's more reliable to parse the latter.
             searchStr = window.location.search;
         } else if (window.location.hash) {
             // Replace the # with a ? so that URLSearchParams does the right thing and
             // doesn't parse the first parameter incorrectly.
             searchStr = window.location.hash.replace("#", "?");
         }

         // We might end up with no error in the URL, so we need to check if we have one
         // to print one.
         let errorDesc = new URLSearchParams(searchStr).get("error_description")
         if (errorDesc) {
             document.getElementById("errormsg").innerText = errorDesc;
         }
     </script>
 {% endif %}
 </body>
 </html>
@@ -3,22 +3,22 @@
 <head>
     <meta charset="UTF-8">
     <link rel="stylesheet" href="/_matrix/static/client/login/style.css">
-    <title>{{server_name | e}} Login</title>
+    <title>{{ server_name }} Login</title>
 </head>
 <body>
     <div id="container">
-        <h1 id="title">{{server_name | e}} Login</h1>
+        <h1 id="title">{{ server_name }} Login</h1>
         <div class="login_flow">
             <p>Choose one of the following identity providers:</p>
         <form>
-            <input type="hidden" name="redirectUrl" value="{{redirect_url | e}}">
+            <input type="hidden" name="redirectUrl" value="{{ redirect_url }}">
             <ul class="radiobuttons">
 {% for p in providers %}
                 <li>
-                    <input type="radio" name="idp" id="prov{{loop.index}}" value="{{p.idp_id}}">
-                    <label for="prov{{loop.index}}">{{p.idp_name | e}}</label>
+                    <input type="radio" name="idp" id="prov{{ loop.index }}" value="{{ p.idp_id }}">
+                    <label for="prov{{ loop.index }}">{{ p.idp_name }}</label>
 {% if p.idp_icon %}
-                    <img src="{{p.idp_icon | mxc_to_http(32, 32)}}"/>
+                    <img src="{{ p.idp_icon | mxc_to_http(32, 32) }}"/>
 {% endif %}
                 </li>
 {% endfor %}
synapse/res/templates/sso_new_user_consent.html (new file, 39 lines)

@@ -0,0 +1,39 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>SSO redirect confirmation</title>
    <meta name="viewport" content="width=device-width, user-scalable=no">
    <style type="text/css">
        {% include "sso.css" without context %}

        #consent_form {
            margin-top: 56px;
        }
    </style>
</head>
<body>
    <header>
        <h1>Your account is nearly ready</h1>
        <p>Agree to the terms to create your account.</p>
    </header>
    <main>
        <!-- {% if user_profile.avatar_url and user_profile.display_name %} -->
        <div class="profile">
            <img src="{{ user_profile.avatar_url | mxc_to_http(64, 64) }}" class="avatar" />
            <div class="profile-details">
                <div class="display-name">{{ user_profile.display_name }}</div>
                <div class="user-id">{{ user_id }}</div>
            </div>
        </div>
        <!-- {% endif %} -->
        <form method="post" action="{{my_url}}" id="consent_form">
            <p>
                <input id="accepted_version" type="checkbox" name="accepted_version" value="{{ consent_version }}" required>
                <label for="accepted_version">I have read and agree to the <a href="{{ terms_url }}" target="_blank">terms and conditions</a>.</label>
            </p>
            <input type="submit" class="primary-button" value="Continue"/>
        </form>
    </main>
</body>
</html>
@@ -3,12 +3,34 @@
 <head>
     <meta charset="UTF-8">
     <title>SSO redirect confirmation</title>
+    <meta name="viewport" content="width=device-width, user-scalable=no">
+    <style type="text/css">
+        {% include "sso.css" without context %}
+    </style>
 </head>
 <body>
-    <p>The application at <span style="font-weight:bold">{{ display_url | e }}</span> is requesting full access to your <span style="font-weight:bold">{{ server_name }}</span> Matrix account.</p>
-    <p>If you don't recognise this address, you should ignore this and close this tab.</p>
-    <p>
-        <a href="{{ redirect_url | e }}">I trust this address</a>
-    </p>
+    <header>
+        {% if new_user %}
+        <h1>Your account is now ready</h1>
+        <p>You've made your account on {{ server_name }}.</p>
+        {% else %}
+        <h1>Log in</h1>
+        {% endif %}
+        <p>Continue to confirm you trust <strong>{{ display_url }}</strong>.</p>
+    </header>
+    <main>
+        {% if user_profile.avatar_url %}
+        <div class="profile">
+            <img src="{{ user_profile.avatar_url | mxc_to_http(64, 64) }}" class="avatar" />
+            <div class="profile-details">
+                {% if user_profile.display_name %}
+                <div class="display-name">{{ user_profile.display_name }}</div>
+                {% endif %}
+                <div class="user-id">{{ user_id }}</div>
+            </div>
+        </div>
+        {% endif %}
+        <a href="{{ redirect_url }}" class="primary-button">Continue</a>
+    </main>
 </body>
 </html>
@@ -1,19 +0,0 @@
-<!DOCTYPE html>
-<html lang="en">
-<head>
-    <title>Synapse Login</title>
-    <link rel="stylesheet" href="style.css" type="text/css" />
-</head>
-<body>
-    <div class="card">
-        <form method="post" class="form__input" id="form" action="submit">
-            <label for="field-username">Please pick your username:</label>
-            <input type="text" name="username" id="field-username" autofocus="">
-            <input type="submit" class="button button--full-width" id="button-submit" value="Submit">
-        </form>
-        <!-- this is used for feedback -->
-        <div role=alert class="tooltip hidden" id="message"></div>
-        <script src="script.js"></script>
-    </div>
-</body>
-</html>
@@ -1,95 +0,0 @@
-let inputField = document.getElementById("field-username");
-let inputForm = document.getElementById("form");
-let submitButton = document.getElementById("button-submit");
-let message = document.getElementById("message");
-
-// Submit username and receive response
-function showMessage(messageText) {
-    // Unhide the message text
-    message.classList.remove("hidden");
-
-    message.textContent = messageText;
-};
-
-function doSubmit() {
-    showMessage("Success. Please wait a moment for your browser to redirect.");
-
-    // remove the event handler before re-submitting the form.
-    delete inputForm.onsubmit;
-    inputForm.submit();
-}
-
-function onResponse(response) {
-    // Display message
-    showMessage(response);
-
-    // Enable submit button and input field
-    submitButton.classList.remove('button--disabled');
-    submitButton.value = "Submit";
-};
-
-let allowedUsernameCharacters = RegExp("[^a-z0-9\\.\\_\\=\\-\\/]");
-function usernameIsValid(username) {
-    return !allowedUsernameCharacters.test(username);
-}
-let allowedCharactersString = "lowercase letters, digits, ., _, -, /, =";
-
-function buildQueryString(params) {
-    return Object.keys(params)
-        .map(k => encodeURIComponent(k) + '=' + encodeURIComponent(params[k]))
-        .join('&');
-}
-
-function submitUsername(username) {
-    if(username.length == 0) {
-        onResponse("Please enter a username.");
-        return;
-    }
-    if(!usernameIsValid(username)) {
-        onResponse("Invalid username. Only the following characters are allowed: " + allowedCharactersString);
-        return;
-    }
-
-    // if this browser doesn't support fetch, skip the availability check.
-    if(!window.fetch) {
-        doSubmit();
-        return;
-    }
-
-    let check_uri = 'check?' + buildQueryString({"username": username});
-    fetch(check_uri, {
-        // include the cookie
-        "credentials": "same-origin",
-    }).then((response) => {
-        if(!response.ok) {
-            // for non-200 responses, raise the body of the response as an exception
-            return response.text().then((text) => { throw text; });
-        } else {
-            return response.json();
-        }
-    }).then((json) => {
-        if(json.error) {
-            throw json.error;
-        } else if(json.available) {
-            doSubmit();
-        } else {
-            onResponse("This username is not available, please choose another.");
-        }
-    }).catch((err) => {
-        onResponse("Error checking username availability: " + err);
-    });
-}
-
-function clickSubmit() {
-    event.preventDefault();
-    if(submitButton.classList.contains('button--disabled')) { return; }
-
-    // Disable submit button and input field
-    submitButton.classList.add('button--disabled');
-
-    // Submit username
-    submitButton.value = "Checking...";
-    submitUsername(inputField.value);
-};
-
-inputForm.onsubmit = clickSubmit;
@@ -1,27 +0,0 @@
-input[type="text"] {
-    font-size: 100%;
-    background-color: #ededf0;
-    border: 1px solid #fff;
-    border-radius: .2em;
-    padding: .5em .9em;
-    display: block;
-    width: 26em;
-}
-
-.button--disabled {
-    border-color: #fff;
-    background-color: transparent;
-    color: #000;
-    text-transform: none;
-}
-
-.hidden {
-    display: none;
-}
-
-.tooltip {
-    background-color: #f9f9fa;
-    padding: 1em;
-    margin: 1em 0;
-}
@@ -1,6 +1,8 @@
 # -*- coding: utf-8 -*-
 # Copyright 2014-2016 OpenMarket Ltd
 # Copyright 2018-2019 New Vector Ltd
+# Copyright 2020, 2021 The Matrix.org Foundation C.I.C.
+
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.

@@ -36,11 +38,13 @@ from synapse.rest.admin.media import ListMediaInRoom, register_servlets_for_medi
 from synapse.rest.admin.purge_room_servlet import PurgeRoomServlet
 from synapse.rest.admin.rooms import (
     DeleteRoomRestServlet,
+    ForwardExtremitiesRestServlet,
     JoinRoomAliasServlet,
     ListRoomRestServlet,
     MakeRoomAdminRestServlet,
     RoomMembersRestServlet,
     RoomRestServlet,
+    RoomStateRestServlet,
     ShutdownRoomRestServlet,
 )
 from synapse.rest.admin.server_notice_servlet import SendServerNoticeServlet

@@ -51,6 +55,7 @@ from synapse.rest.admin.users import (
     PushersRestServlet,
     ResetPasswordRestServlet,
     SearchUsersRestServlet,
+    ShadowBanRestServlet,
     UserAdminServlet,
     UserMediaRestServlet,
     UserMembershipRestServlet,

@@ -209,6 +214,7 @@ def register_servlets(hs, http_server):
     """
     register_servlets_for_client_rest_resource(hs, http_server)
     ListRoomRestServlet(hs).register(http_server)
+    RoomStateRestServlet(hs).register(http_server)
     RoomRestServlet(hs).register(http_server)
     RoomMembersRestServlet(hs).register(http_server)
     DeleteRoomRestServlet(hs).register(http_server)

@@ -230,6 +236,8 @@ def register_servlets(hs, http_server):
     EventReportsRestServlet(hs).register(http_server)
     PushersRestServlet(hs).register(http_server)
     MakeRoomAdminRestServlet(hs).register(http_server)
+    ShadowBanRestServlet(hs).register(http_server)
+    ForwardExtremitiesRestServlet(hs).register(http_server)


 def register_servlets_for_client_rest_resource(hs, http_server):
@@ -1,5 +1,5 @@
 # -*- coding: utf-8 -*-
-# Copyright 2019 The Matrix.org Foundation C.I.C.
+# Copyright 2019-2021 The Matrix.org Foundation C.I.C.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.

@@ -292,6 +292,45 @@ class RoomMembersRestServlet(RestServlet):
         return 200, ret


+class RoomStateRestServlet(RestServlet):
+    """
+    Get full state within a room.
+    """
+
+    PATTERNS = admin_patterns("/rooms/(?P<room_id>[^/]+)/state")
+
+    def __init__(self, hs: "HomeServer"):
+        self.hs = hs
+        self.auth = hs.get_auth()
+        self.store = hs.get_datastore()
+        self.clock = hs.get_clock()
+        self._event_serializer = hs.get_event_client_serializer()
+
+    async def on_GET(
+        self, request: SynapseRequest, room_id: str
+    ) -> Tuple[int, JsonDict]:
+        requester = await self.auth.get_user_by_req(request)
+        await assert_user_is_admin(self.auth, requester.user)
+
+        ret = await self.store.get_room(room_id)
+        if not ret:
+            raise NotFoundError("Room not found")
+
+        event_ids = await self.store.get_current_state_ids(room_id)
+        events = await self.store.get_events(event_ids.values())
+        now = self.clock.time_msec()
+        room_state = await self._event_serializer.serialize_events(
+            events.values(),
+            now,
+            # We don't bother bundling aggregations in when asked for state
+            # events, as clients won't use them.
+            bundle_aggregations=False,
+        )
+        ret = {"state": room_state}
+
+        return 200, ret
+
+
 class JoinRoomAliasServlet(RestServlet):

     PATTERNS = admin_patterns("/join/(?P<room_identifier>[^/]*)")

@@ -431,7 +470,17 @@ class MakeRoomAdminRestServlet(RestServlet):
             if not admin_users:
                 raise SynapseError(400, "No local admin user in room")

-            admin_user_id = admin_users[-1]
+            admin_user_id = None
+
+            for admin_user in reversed(admin_users):
+                if room_state.get((EventTypes.Member, admin_user)):
+                    admin_user_id = admin_user
+                    break
+
+            if not admin_user_id:
+                raise SynapseError(
+                    400, "No local admin user in room",
+                )

             pl_content = power_levels.content
         else:

@@ -499,3 +548,60 @@ class MakeRoomAdminRestServlet(RestServlet):
         )

         return 200, {}
+
+
+class ForwardExtremitiesRestServlet(RestServlet):
+    """Allows a server admin to get or clear forward extremities.
+
+    Clearing does not require restarting the server.
+
+        Clear forward extremities:
+        DELETE /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
+
+        Get forward_extremities:
+        GET /_synapse/admin/v1/rooms/<room_id_or_alias>/forward_extremities
+    """
+
+    PATTERNS = admin_patterns("/rooms/(?P<room_identifier>[^/]*)/forward_extremities")
+
+    def __init__(self, hs: "HomeServer"):
+        self.hs = hs
+        self.auth = hs.get_auth()
+        self.room_member_handler = hs.get_room_member_handler()
+        self.store = hs.get_datastore()
+
+    async def resolve_room_id(self, room_identifier: str) -> str:
+        """Resolve to a room ID, if necessary."""
+        if RoomID.is_valid(room_identifier):
+            resolved_room_id = room_identifier
+        elif RoomAlias.is_valid(room_identifier):
+            room_alias = RoomAlias.from_string(room_identifier)
+            room_id, _ = await self.room_member_handler.lookup_room_alias(room_alias)
+            resolved_room_id = room_id.to_string()
+        else:
+            raise SynapseError(
+                400, "%s was not legal room ID or room alias" % (room_identifier,)
+            )
+        if not resolved_room_id:
+            raise SynapseError(
+                400, "Unknown room ID or room alias %s" % room_identifier
+            )
+        return resolved_room_id
+
+    async def on_DELETE(self, request, room_identifier):
+        requester = await self.auth.get_user_by_req(request)
+        await assert_user_is_admin(self.auth, requester.user)
+
+        room_id = await self.resolve_room_id(room_identifier)
+
+        deleted_count = await self.store.delete_forward_extremities_for_room(room_id)
+        return 200, {"deleted": deleted_count}
+
+    async def on_GET(self, request, room_identifier):
+        requester = await self.auth.get_user_by_req(request)
+        await assert_user_is_admin(self.auth, requester.user)
+
+        room_id = await self.resolve_room_id(room_identifier)
+
+        extremities = await self.store.get_forward_extremities_for_room(room_id)
+        return 200, {"count": len(extremities), "results": extremities}
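For reference, the two admin endpoints added above can be exercised as follows; a usage sketch with the Python requests library, where BASE, ADMIN_TOKEN, and the room ID are placeholders rather than values from this changeset:

    # Hypothetical usage of the admin APIs added above; BASE, ADMIN_TOKEN
    # and the room ID are placeholders.
    import requests

    BASE = "https://homeserver.example.com"
    HEADERS = {"Authorization": "Bearer ADMIN_TOKEN"}
    room = "!someroom:example.com"

    # Fetch the full current state of a room.
    state = requests.get(
        BASE + "/_synapse/admin/v1/rooms/%s/state" % room, headers=HEADERS
    ).json()

    # List, then clear, the room's forward extremities.
    ext = requests.get(
        BASE + "/_synapse/admin/v1/rooms/%s/forward_extremities" % room, headers=HEADERS
    ).json()
    print(ext["count"])

    deleted = requests.delete(
        BASE + "/_synapse/admin/v1/rooms/%s/forward_extremities" % room, headers=HEADERS
    ).json()
    print(deleted["deleted"])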
@@ -83,17 +83,32 @@ class UsersRestServletV2(RestServlet):
     The parameter `deactivated` can be used to include deactivated users.
     """

-    def __init__(self, hs):
+    def __init__(self, hs: "HomeServer"):
         self.hs = hs
         self.store = hs.get_datastore()
         self.auth = hs.get_auth()
         self.admin_handler = hs.get_admin_handler()

-    async def on_GET(self, request):
+    async def on_GET(self, request: SynapseRequest) -> Tuple[int, JsonDict]:
         await assert_requester_is_admin(self.auth, request)

         start = parse_integer(request, "from", default=0)
         limit = parse_integer(request, "limit", default=100)

+        if start < 0:
+            raise SynapseError(
+                400,
+                "Query parameter from must be a string representing a positive integer.",
+                errcode=Codes.INVALID_PARAM,
+            )
+
+        if limit < 0:
+            raise SynapseError(
+                400,
+                "Query parameter limit must be a string representing a positive integer.",
+                errcode=Codes.INVALID_PARAM,
+            )
+
         user_id = parse_string(request, "user_id", default=None)
         name = parse_string(request, "name", default=None)
         guests = parse_boolean(request, "guests", default=True)

@@ -103,7 +118,7 @@ class UsersRestServletV2(RestServlet):
             start, limit, user_id, name, guests, deactivated
         )
         ret = {"users": users, "total": total}
-        if len(users) >= limit:
+        if (start + limit) < total:
             ret["next_token"] = str(start + len(users))

         return 200, ret
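The corrected condition only advertises a next_token when rows genuinely remain past the current page; the old length check emitted a token even when the final page was exactly limit rows long. A worked illustration with made-up numbers:

    # Made-up numbers illustrating the next_token fix above.
    total = 200                            # users matching the query
    start, limit = 100, 100
    page_len = min(limit, total - start)   # 100 rows on this page

    old_has_next = page_len >= limit         # True: spurious extra (empty) page
    new_has_next = (start + limit) < total   # False: this is the last page
    print(old_has_next, new_has_next)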
@@ -875,3 +890,39 @@ class UserTokenRestServlet(RestServlet):
         )

         return 200, {"access_token": token}
+
+
+class ShadowBanRestServlet(RestServlet):
+    """An admin API for shadow-banning a user.
+
+    A shadow-banned users receives successful responses to their client-server
+    API requests, but the events are not propagated into rooms.
+
+    Shadow-banning a user should be used as a tool of last resort and may lead
+    to confusing or broken behaviour for the client.
+
+    Example:
+
+        POST /_synapse/admin/v1/users/@test:example.com/shadow_ban
+        {}
+
+        200 OK
+        {}
+    """
+
+    PATTERNS = admin_patterns("/users/(?P<user_id>[^/]*)/shadow_ban")
+
+    def __init__(self, hs: "HomeServer"):
+        self.hs = hs
+        self.store = hs.get_datastore()
+        self.auth = hs.get_auth()
+
+    async def on_POST(self, request, user_id):
+        await assert_requester_is_admin(self.auth, request)
+
+        if not self.hs.is_mine_id(user_id):
+            raise SynapseError(400, "Only local users can be shadow-banned")
+
+        await self.store.set_shadow_banned(UserID.from_string(user_id), True)
+
+        return 200, {}
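Invoking the new endpoint from a script mirrors the docstring example above; a hypothetical Python call, where BASE and ADMIN_TOKEN are placeholders:

    # Hypothetical call to the shadow-ban endpoint added above;
    # BASE and ADMIN_TOKEN are placeholders.
    import requests

    BASE = "https://homeserver.example.com"
    resp = requests.post(
        BASE + "/_synapse/admin/v1/users/@test:example.com/shadow_ban",
        headers={"Authorization": "Bearer ADMIN_TOKEN"},
        json={},
    )
    assert resp.status_code == 200  # the body is an empty JSON object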
@@ -19,7 +19,8 @@ from typing import TYPE_CHECKING, Awaitable, Callable, Dict, Optional
 from synapse.api.errors import Codes, LoginError, SynapseError
 from synapse.api.ratelimiting import Ratelimiter
 from synapse.appservice import ApplicationService
-from synapse.http.server import finish_request
+from synapse.handlers.sso import SsoIdentityProvider
+from synapse.http.server import HttpServer, finish_request
 from synapse.http.servlet import (
     RestServlet,
     parse_json_object_from_request,

@@ -60,11 +61,14 @@ class LoginRestServlet(RestServlet):
         self.saml2_enabled = hs.config.saml2_enabled
         self.cas_enabled = hs.config.cas_enabled
         self.oidc_enabled = hs.config.oidc_enabled
+        self._msc2858_enabled = hs.config.experimental.msc2858_enabled

         self.auth = hs.get_auth()

         self.auth_handler = self.hs.get_auth_handler()
         self.registration_handler = hs.get_registration_handler()
+        self._sso_handler = hs.get_sso_handler()
+
         self._well_known_builder = WellKnownBuilder(hs)
         self._address_ratelimiter = Ratelimiter(
             clock=hs.get_clock(),

@@ -89,8 +93,17 @@ class LoginRestServlet(RestServlet):
             flows.append({"type": LoginRestServlet.CAS_TYPE})

         if self.cas_enabled or self.saml2_enabled or self.oidc_enabled:
-            flows.append({"type": LoginRestServlet.SSO_TYPE})
-            # While its valid for us to advertise this login type generally,
+            sso_flow = {"type": LoginRestServlet.SSO_TYPE}  # type: JsonDict
+
+            if self._msc2858_enabled:
+                sso_flow["org.matrix.msc2858.identity_providers"] = [
+                    _get_auth_flow_dict_for_idp(idp)
+                    for idp in self._sso_handler.get_identity_providers().values()
+                ]
+
+            flows.append(sso_flow)
+
+            # While it's valid for us to advertise this login type generally,
             # synapse currently only gives out these tokens as part of the
             # SSO login flow.
             # Generally we don't want to advertise login flows that clients

@@ -311,8 +324,22 @@ class LoginRestServlet(RestServlet):
         return result


+def _get_auth_flow_dict_for_idp(idp: SsoIdentityProvider) -> JsonDict:
+    """Return an entry for the login flow dict
+
+    Returns an entry suitable for inclusion in "identity_providers" in the
+    response to GET /_matrix/client/r0/login
+    """
+    e = {"id": idp.idp_id, "name": idp.idp_name}  # type: JsonDict
+    if idp.idp_icon:
+        e["icon"] = idp.idp_icon
+    if idp.idp_brand:
+        e["brand"] = idp.idp_brand
+    return e
+
+
 class SsoRedirectServlet(RestServlet):
-    PATTERNS = client_patterns("/login/(cas|sso)/redirect", v1=True)
+    PATTERNS = client_patterns("/login/(cas|sso)/redirect$", v1=True)

     def __init__(self, hs: "HomeServer"):
         # make sure that the relevant handlers are instantiated, so that they

@@ -324,13 +351,31 @@ class SsoRedirectServlet(RestServlet):
         if hs.config.oidc_enabled:
             hs.get_oidc_handler()
         self._sso_handler = hs.get_sso_handler()
+        self._msc2858_enabled = hs.config.experimental.msc2858_enabled

-    async def on_GET(self, request: SynapseRequest):
+    def register(self, http_server: HttpServer) -> None:
+        super().register(http_server)
+        if self._msc2858_enabled:
+            # expose additional endpoint for MSC2858 support
+            http_server.register_paths(
+                "GET",
+                client_patterns(
+                    "/org.matrix.msc2858/login/sso/redirect/(?P<idp_id>[A-Za-z0-9_.~-]+)$",
+                    releases=(),
+                    unstable=True,
+                ),
+                self.on_GET,
+                self.__class__.__name__,
+            )
+
+    async def on_GET(
+        self, request: SynapseRequest, idp_id: Optional[str] = None
+    ) -> None:
         client_redirect_url = parse_string(
             request, "redirectUrl", required=True, encoding=None
         )
         sso_url = await self._sso_handler.handle_redirect_request(
-            request, client_redirect_url
+            request, client_redirect_url, idp_id,
         )
         logger.info("Redirecting to %s", sso_url)
         request.redirect(sso_url)
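With MSC2858 enabled, the SSO flow advertised by GET /_matrix/client/r0/login gains an unstable identity_providers list built by _get_auth_flow_dict_for_idp above. An illustrative shape of that flow entry (the provider values are made-up examples, not captured server output):

    # Illustrative shape of the SSO login flow advertised above;
    # the provider entries are made-up examples.
    sso_flow = {
        "type": "m.login.sso",
        "org.matrix.msc2858.identity_providers": [
            {"id": "oidc-github", "name": "GitHub", "icon": "mxc://example.com/abc123"},
            {"id": "saml", "name": "Corporate SSO"},
        ],
    }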
@@ -54,7 +54,7 @@ logger = logging.getLogger(__name__)
 class EmailPasswordRequestTokenRestServlet(RestServlet):
     PATTERNS = client_patterns("/account/password/email/requestToken$")

-    def __init__(self, hs):
+    def __init__(self, hs: "HomeServer"):
         super().__init__()
         self.hs = hs
         self.datastore = hs.get_datastore()

@@ -103,6 +103,8 @@ class EmailPasswordRequestTokenRestServlet(RestServlet):
             # Raise if the provided next_link value isn't valid
             assert_valid_next_link(self.hs, next_link)

+        self.identity_handler.ratelimit_request_token_requests(request, "email", email)
+
         # The email will be sent to the stored address.
         # This avoids a potential account hijack by requesting a password reset to
         # an email address which is controlled by the attacker but which, after

@@ -379,6 +381,8 @@ class EmailThreepidRequestTokenRestServlet(RestServlet):
                 Codes.THREEPID_DENIED,
             )

+        self.identity_handler.ratelimit_request_token_requests(request, "email", email)
+
         if next_link:
             # Raise if the provided next_link value isn't valid
             assert_valid_next_link(self.hs, next_link)

@@ -430,7 +434,7 @@ class EmailThreepidRequestTokenRestServlet(RestServlet):
 class MsisdnThreepidRequestTokenRestServlet(RestServlet):
     PATTERNS = client_patterns("/account/3pid/msisdn/requestToken$")

-    def __init__(self, hs):
+    def __init__(self, hs: "HomeServer"):
         self.hs = hs
         super().__init__()
         self.store = self.hs.get_datastore()

@@ -458,6 +462,10 @@ class MsisdnThreepidRequestTokenRestServlet(RestServlet):
                 Codes.THREEPID_DENIED,
             )

+        self.identity_handler.ratelimit_request_token_requests(
+            request, "msisdn", msisdn
+        )
+
         if next_link:
             # Raise if the provided next_link value isn't valid
             assert_valid_next_link(self.hs, next_link)
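Each requestToken handler now calls identity_handler.ratelimit_request_token_requests before doing any expensive work. The helper's body is not part of this diff; a plausible sketch of the call pattern, assuming a Ratelimiter keyed on both the requesting IP and the targeted address (the attribute names here are assumptions):

    # Plausible sketch of ratelimit_request_token_requests; the helper's
    # implementation is not shown in this diff, and the ratelimiter
    # attribute names below are assumptions.
    def ratelimit_request_token_requests(self, request, medium: str, address: str):
        # Limit by requesting IP and by the targeted 3PID, so neither a
        # single IP nor a single address can be used to spam /requestToken.
        self._3pid_validation_ratelimiter_ip.ratelimit((medium, request.getClientIP()))
        self._3pid_validation_ratelimiter_address.ratelimit((medium, address))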
@@ -126,6 +126,8 @@ class EmailRegisterRequestTokenRestServlet(RestServlet):
                 Codes.THREEPID_DENIED,
             )

+        self.identity_handler.ratelimit_request_token_requests(request, "email", email)
+
         existing_user_id = await self.hs.get_datastore().get_user_id_by_threepid(
             "email", email
         )

@@ -205,6 +207,10 @@ class MsisdnRegisterRequestTokenRestServlet(RestServlet):
                 Codes.THREEPID_DENIED,
             )

+        self.identity_handler.ratelimit_request_token_requests(
+            request, "msisdn", msisdn
+        )
+
         existing_user_id = await self.hs.get_datastore().get_user_id_by_threepid(
             "msisdn", msisdn
         )
@@ -100,6 +100,7 @@ class ConsentResource(DirectServeHtmlResource):

         consent_template_directory = hs.config.user_consent_template_dir

+        # TODO: switch to synapse.util.templates.build_jinja_env
         loader = jinja2.FileSystemLoader(consent_template_directory)
         self._jinja_env = jinja2.Environment(
             loader=loader, autoescape=jinja2.select_autoescape(["html", "htm", "xml"])
@@ -300,6 +300,7 @@ class FileInfo:
         thumbnail_height (int)
         thumbnail_method (str)
         thumbnail_type (str): Content type of thumbnail, e.g. image/png
+        thumbnail_length (int): The size of the media file, in bytes.
     """

     def __init__(

@@ -312,6 +313,7 @@ class FileInfo:
         thumbnail_height=None,
         thumbnail_method=None,
         thumbnail_type=None,
+        thumbnail_length=None,
     ):
         self.server_name = server_name
         self.file_id = file_id

@@ -321,6 +323,7 @@ class FileInfo:
         self.thumbnail_height = thumbnail_height
         self.thumbnail_method = thumbnail_method
         self.thumbnail_type = thumbnail_type
+        self.thumbnail_length = thumbnail_length


 def get_filename_from_headers(headers: Dict[bytes, List[bytes]]) -> Optional[str]:
@@ -375,7 +375,7 @@ class PreviewUrlResource(DirectServeJsonResource):
         """
         Check whether the URL should be downloaded as oEmbed content instead.

-        Params:
+        Args:
             url: The URL to check.

         Returns:

@@ -392,7 +392,7 @@ class PreviewUrlResource(DirectServeJsonResource):
         """
         Request content from an oEmbed endpoint.

-        Params:
+        Args:
             endpoint: The oEmbed API endpoint.
             url: The URL to pass to the API.


@@ -681,27 +681,51 @@ class PreviewUrlResource(DirectServeJsonResource):
 def decode_and_calc_og(
     body: bytes, media_uri: str, request_encoding: Optional[str] = None
 ) -> Dict[str, Optional[str]]:
+    """
+    Calculate metadata for an HTML document.
+
+    This uses lxml to parse the HTML document into the OG response. If errors
+    occur during processing of the document, an empty response is returned.
+
+    Args:
+        body: The HTML document, as bytes.
+        media_url: The URI used to download the body.
+        request_encoding: The character encoding of the body, as a string.
+
+    Returns:
+        The OG response as a dictionary.
+    """
+    # If there's no body, nothing useful is going to be found.
+    if not body:
+        return {}
+
     from lxml import etree

+    # Create an HTML parser. If this fails, log and return no metadata.
     try:
         parser = etree.HTMLParser(recover=True, encoding=request_encoding)
-        tree = etree.fromstring(body, parser)
-        og = _calc_og(tree, media_uri)
     except LookupError:
         # blindly consider the encoding as utf-8.
         parser = etree.HTMLParser(recover=True, encoding="utf-8")
+    except Exception as e:
+        logger.warning("Unable to create HTML parser: %s" % (e,))
+        return {}
+
+    def _attempt_calc_og(body_attempt: Union[bytes, str]) -> Dict[str, Optional[str]]:
+        # Attempt to parse the body. If this fails, log and return no metadata.
+        tree = etree.fromstring(body_attempt, parser)
+        return _calc_og(tree, media_uri)
+
+    # Attempt to parse the body. If this fails, log and return no metadata.
+    try:
+        return _attempt_calc_og(body)
     except UnicodeDecodeError:
         # blindly try decoding the body as utf-8, which seems to fix
         # the charset mismatches on https://google.com
-        parser = etree.HTMLParser(recover=True, encoding=request_encoding)
-        tree = etree.fromstring(body.decode("utf-8", "ignore"), parser)
-        og = _calc_og(tree, media_uri)
-
-    return og
+        return _attempt_calc_og(body.decode("utf-8", "ignore"))


-def _calc_og(tree, media_uri: str) -> Dict[str, Optional[str]]:
+def _calc_og(tree: "etree.Element", media_uri: str) -> Dict[str, Optional[str]]:
     # suck our tree into lxml and define our OG response.

     # if we see any image URLs in the OG response, then spider them
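The refactored decode_and_calc_og parses once and retries with a lossy utf-8 decode only on UnicodeDecodeError. A minimal standalone sketch of that parse-then-retry shape, assuming lxml is installed (the sample byte string is made up):

    # Standalone sketch of the parse-then-retry pattern above, assuming
    # lxml; the sample byte string is made up.
    from lxml import etree

    body = "<html><head><title>caf\u00e9</title></head></html>".encode("latin-1")
    parser = etree.HTMLParser(recover=True, encoding="utf-8")
    try:
        tree = etree.fromstring(body, parser)
    except UnicodeDecodeError:
        # Same fallback as decode_and_calc_og: lossy utf-8 decode, reparse.
        tree = etree.fromstring(body.decode("utf-8", "ignore"), parser)
    print(tree.findtext(".//title"))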
@@ -16,7 +16,7 @@


 import logging
-from typing import TYPE_CHECKING
+from typing import TYPE_CHECKING, Any, Dict, List, Optional

 from twisted.web.http import Request


@@ -106,31 +106,17 @@ class ThumbnailResource(DirectServeJsonResource):
             return

         thumbnail_infos = await self.store.get_local_media_thumbnails(media_id)
-
-        if thumbnail_infos:
-            thumbnail_info = self._select_thumbnail(
-                width, height, method, m_type, thumbnail_infos
-            )
-
-            file_info = FileInfo(
-                server_name=None,
-                file_id=media_id,
-                url_cache=media_info["url_cache"],
-                thumbnail=True,
-                thumbnail_width=thumbnail_info["thumbnail_width"],
-                thumbnail_height=thumbnail_info["thumbnail_height"],
-                thumbnail_type=thumbnail_info["thumbnail_type"],
-                thumbnail_method=thumbnail_info["thumbnail_method"],
-            )
-
-            t_type = file_info.thumbnail_type
-            t_length = thumbnail_info["thumbnail_length"]
-
-            responder = await self.media_storage.fetch_media(file_info)
-            await respond_with_responder(request, responder, t_type, t_length)
-        else:
-            logger.info("Couldn't find any generated thumbnails")
-            respond_404(request)
+        await self._select_and_respond_with_thumbnail(
+            request,
+            width,
+            height,
+            method,
+            m_type,
+            thumbnail_infos,
+            media_id,
+            url_cache=media_info["url_cache"],
+            server_name=None,
+        )

     async def _select_or_generate_local_thumbnail(
         self,

@@ -276,26 +262,64 @@ class ThumbnailResource(DirectServeJsonResource):
         thumbnail_infos = await self.store.get_remote_media_thumbnails(
             server_name, media_id
         )
+        await self._select_and_respond_with_thumbnail(
+            request,
+            width,
+            height,
+            method,
+            m_type,
+            thumbnail_infos,
+            media_info["filesystem_id"],
+            url_cache=None,
+            server_name=server_name,
+        )
+
+    async def _select_and_respond_with_thumbnail(
+        self,
+        request: Request,
+        desired_width: int,
+        desired_height: int,
+        desired_method: str,
+        desired_type: str,
+        thumbnail_infos: List[Dict[str, Any]],
+        file_id: str,
+        url_cache: Optional[str] = None,
+        server_name: Optional[str] = None,
+    ) -> None:
+        """
+        Respond to a request with an appropriate thumbnail from the previously generated thumbnails.
+
+        Args:
+            request: The incoming request.
+            desired_width: The desired width, the returned thumbnail may be larger than this.
+            desired_height: The desired height, the returned thumbnail may be larger than this.
+            desired_method: The desired method used to generate the thumbnail.
+            desired_type: The desired content-type of the thumbnail.
+            thumbnail_infos: A list of dictionaries of candidate thumbnails.
+            file_id: The ID of the media that a thumbnail is being requested for.
+            url_cache: The URL cache value.
+            server_name: The server name, if this is a remote thumbnail.
+        """
         if thumbnail_infos:
-            thumbnail_info = self._select_thumbnail(
-                width, height, method, m_type, thumbnail_infos
+            file_info = self._select_thumbnail(
+                desired_width,
+                desired_height,
+                desired_method,
+                desired_type,
+                thumbnail_infos,
+                file_id,
+                url_cache,
+                server_name,
             )
-            file_info = FileInfo(
-                server_name=server_name,
-                file_id=media_info["filesystem_id"],
-                thumbnail=True,
-                thumbnail_width=thumbnail_info["thumbnail_width"],
-                thumbnail_height=thumbnail_info["thumbnail_height"],
-                thumbnail_type=thumbnail_info["thumbnail_type"],
-                thumbnail_method=thumbnail_info["thumbnail_method"],
-            )
-
-            t_type = file_info.thumbnail_type
-            t_length = thumbnail_info["thumbnail_length"]
+            if not file_info:
+                logger.info("Couldn't find a thumbnail matching the desired inputs")
+                respond_404(request)
+                return

             responder = await self.media_storage.fetch_media(file_info)
-            await respond_with_responder(request, responder, t_type, t_length)
+            await respond_with_responder(
+                request, responder, file_info.thumbnail_type, file_info.thumbnail_length
+            )
         else:
-            logger.info("Couldn't find any generated thumbnails")
+            logger.info("Failed to find any generated thumbnails")
             respond_404(request)
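The selection helper reworked in the next hunk ranks candidate thumbnails with min() over quality tuples: best aspect-ratio match first, then candidates at least as large as requested, then closest size, matching content type, and smallest file. A toy illustration of that ordering:

    # Toy illustration of the tuple ranking used by _select_thumbnail below.
    candidates = [
        # (aspect_quality, min_quality, size_quality, type_quality, length_quality, info)
        (0, 0, 100, False, 2048, {"thumbnail_width": 64, "thumbnail_height": 64}),
        (0, 1, 50, False, 1024, {"thumbnail_width": 32, "thumbnail_height": 32}),
    ]
    best = min(candidates)[-1]
    print(best)  # 64x64 wins: same aspect ratio, and not smaller than requested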
@@ -306,67 +330,117 @@ class ThumbnailResource(DirectServeJsonResource):
         desired_height: int,
         desired_method: str,
         desired_type: str,
-        thumbnail_infos,
-    ) -> dict:
+        thumbnail_infos: List[Dict[str, Any]],
+        file_id: str,
+        url_cache: Optional[str],
+        server_name: Optional[str],
+    ) -> Optional[FileInfo]:
+        """
+        Choose an appropriate thumbnail from the previously generated thumbnails.
+
+        Args:
+            desired_width: The desired width; the returned thumbnail may be larger than this.
+            desired_height: The desired height; the returned thumbnail may be larger than this.
+            desired_method: The desired method used to generate the thumbnail.
+            desired_type: The desired content-type of the thumbnail.
+            thumbnail_infos: A list of dictionaries of candidate thumbnails.
+            file_id: The ID of the media that a thumbnail is being requested for.
+            url_cache: The URL cache value.
+            server_name: The server name, if this is a remote thumbnail.
+
+        Returns:
+            The thumbnail which best matches the desired parameters, or None if
+            no suitable candidate was found.
+        """
+        desired_method = desired_method.lower()
+
+        # The chosen thumbnail.
+        thumbnail_info = None
+
         d_w = desired_width
         d_h = desired_height

-        if desired_method.lower() == "crop":
+        if desired_method == "crop":
+            # Thumbnails that match equal or larger sizes of desired width/height.
             crop_info_list = []
+            # Other thumbnails.
             crop_info_list2 = []
             for info in thumbnail_infos:
+                # Skip thumbnails generated with different methods.
+                if info["thumbnail_method"] != "crop":
+                    continue
+
                 t_w = info["thumbnail_width"]
                 t_h = info["thumbnail_height"]
-                t_method = info["thumbnail_method"]
-                if t_method == "crop":
-                    aspect_quality = abs(d_w * t_h - d_h * t_w)
-                    min_quality = 0 if d_w <= t_w and d_h <= t_h else 1
-                    size_quality = abs((d_w - t_w) * (d_h - t_h))
-                    type_quality = desired_type != info["thumbnail_type"]
-                    length_quality = info["thumbnail_length"]
-                    if t_w >= d_w or t_h >= d_h:
-                        crop_info_list.append(
-                            (
-                                aspect_quality,
-                                min_quality,
-                                size_quality,
-                                type_quality,
-                                length_quality,
-                                info,
-                            )
-                        )
-                    else:
-                        crop_info_list2.append(
-                            (
-                                aspect_quality,
-                                min_quality,
-                                size_quality,
-                                type_quality,
-                                length_quality,
-                                info,
-                            )
-                        )
-            if crop_info_list:
-                return min(crop_info_list)[-1]
-            else:
-                return min(crop_info_list2)[-1]
-        else:
-            info_list = []
-            info_list2 = []
-            for info in thumbnail_infos:
-                t_w = info["thumbnail_width"]
-                t_h = info["thumbnail_height"]
-                t_method = info["thumbnail_method"]
+                aspect_quality = abs(d_w * t_h - d_h * t_w)
+                min_quality = 0 if d_w <= t_w and d_h <= t_h else 1
                 size_quality = abs((d_w - t_w) * (d_h - t_h))
                 type_quality = desired_type != info["thumbnail_type"]
                 length_quality = info["thumbnail_length"]
-                if t_method == "scale" and (t_w >= d_w or t_h >= d_h):
+                if t_w >= d_w or t_h >= d_h:
+                    crop_info_list.append(
+                        (
+                            aspect_quality,
+                            min_quality,
+                            size_quality,
+                            type_quality,
+                            length_quality,
+                            info,
+                        )
+                    )
+                else:
+                    crop_info_list2.append(
+                        (
+                            aspect_quality,
+                            min_quality,
+                            size_quality,
+                            type_quality,
+                            length_quality,
+                            info,
+                        )
+                    )
+            if crop_info_list:
+                thumbnail_info = min(crop_info_list)[-1]
+            elif crop_info_list2:
+                thumbnail_info = min(crop_info_list2)[-1]
+        elif desired_method == "scale":
+            # Thumbnails that match equal or larger sizes of desired width/height.
+            info_list = []
+            # Other thumbnails.
+            info_list2 = []
+
+            for info in thumbnail_infos:
+                # Skip thumbnails generated with different methods.
+                if info["thumbnail_method"] != "scale":
+                    continue
+
+                t_w = info["thumbnail_width"]
+                t_h = info["thumbnail_height"]
+                size_quality = abs((d_w - t_w) * (d_h - t_h))
+                type_quality = desired_type != info["thumbnail_type"]
+                length_quality = info["thumbnail_length"]
+                if t_w >= d_w or t_h >= d_h:
                     info_list.append((size_quality, type_quality, length_quality, info))
-                elif t_method == "scale":
+                else:
                     info_list2.append(
                         (size_quality, type_quality, length_quality, info)
                     )
             if info_list:
-                return min(info_list)[-1]
-            else:
-                return min(info_list2)[-1]
+                thumbnail_info = min(info_list)[-1]
+            elif info_list2:
+                thumbnail_info = min(info_list2)[-1]
+
+        if thumbnail_info:
+            return FileInfo(
+                file_id=file_id,
+                url_cache=url_cache,
+                server_name=server_name,
+                thumbnail=True,
+                thumbnail_width=thumbnail_info["thumbnail_width"],
+                thumbnail_height=thumbnail_info["thumbnail_height"],
+                thumbnail_type=thumbnail_info["thumbnail_type"],
+                thumbnail_method=thumbnail_info["thumbnail_method"],
+                thumbnail_length=thumbnail_info["thumbnail_length"],
+            )
+
+        # No matching thumbnail was found.
+        return None
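The selection above leans on Python's lexicographic tuple ordering: each candidate is scored as a tuple and min() picks the smallest, so aspect ratio beats coverage, which beats size, content-type, and byte length in turn. A minimal, self-contained sketch (not Synapse code; the candidates are illustrative) of how a winner falls out:

    desired_w, desired_h, desired_type = 64, 64, "image/png"

    candidates = [
        {"thumbnail_width": 32, "thumbnail_height": 32,
         "thumbnail_type": "image/jpeg", "thumbnail_length": 1024},
        {"thumbnail_width": 96, "thumbnail_height": 96,
         "thumbnail_type": "image/png", "thumbnail_length": 4096},
    ]

    scored = []
    for info in candidates:
        t_w, t_h = info["thumbnail_width"], info["thumbnail_height"]
        scored.append(
            (
                abs(desired_w * t_h - desired_h * t_w),             # aspect_quality
                0 if desired_w <= t_w and desired_h <= t_h else 1,  # min_quality
                abs((desired_w - t_w) * (desired_h - t_h)),         # size_quality
                desired_type != info["thumbnail_type"],             # type_quality
                info["thumbnail_length"],                           # length_quality
                info,  # the payload; the earlier fields decide the ordering
            )
        )

    # Tuples compare element by element, so the 96x96 PNG wins here: both
    # candidates tie on aspect ratio, but only the PNG covers the request.
    best = min(scored)[-1]
    assert best["thumbnail_width"] == 96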
@@ -1,5 +1,5 @@
 # -*- coding: utf-8 -*-
-# Copyright 2020 The Matrix.org Foundation C.I.C.
+# Copyright 2021 The Matrix.org Foundation C.I.C.
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.

@@ -12,3 +12,55 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
+
+from typing import TYPE_CHECKING, Mapping
+
+from twisted.web.resource import Resource
+
+from synapse.rest.synapse.client.new_user_consent import NewUserConsentResource
+from synapse.rest.synapse.client.pick_idp import PickIdpResource
+from synapse.rest.synapse.client.pick_username import pick_username_resource
+from synapse.rest.synapse.client.sso_register import SsoRegisterResource
+
+if TYPE_CHECKING:
+    from synapse.server import HomeServer
+
+
+def build_synapse_client_resource_tree(hs: "HomeServer") -> Mapping[str, Resource]:
+    """Builds a resource tree to include synapse-specific client resources.
+
+    These are resources which should be loaded on all workers which expose a
+    client-server API: i.e. the main process, and any generic workers so
+    configured.
+
+    Returns:
+        A map from path to Resource.
+    """
+    resources = {
+        # SSO bits. These are always loaded, whether or not SSO login is actually
+        # enabled (they just won't work very well if it's not).
+        "/_synapse/client/pick_idp": PickIdpResource(hs),
+        "/_synapse/client/pick_username": pick_username_resource(hs),
+        "/_synapse/client/new_user_consent": NewUserConsentResource(hs),
+        "/_synapse/client/sso_register": SsoRegisterResource(hs),
+    }
+
+    # Provider-specific SSO bits. Only load these if they are enabled, since they
+    # rely on optional dependencies.
+    if hs.config.oidc_enabled:
+        from synapse.rest.synapse.client.oidc import OIDCResource
+
+        resources["/_synapse/client/oidc"] = OIDCResource(hs)
+
+    if hs.config.saml2_enabled:
+        from synapse.rest.synapse.client.saml2 import SAML2Resource
+
+        res = SAML2Resource(hs)
+        resources["/_synapse/client/saml2"] = res
+
+        # This is also mounted under '/_matrix' for backwards-compatibility.
+        resources["/_matrix/saml2"] = res
+
+    return resources
+
+
+__all__ = ["build_synapse_client_resource_tree"]
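To see how such a path-to-Resource map might be consumed, here is a rough sketch assuming a plain Twisted resource tree; it is not how Synapse's own listener wiring does it, just an illustration of the shape of the return value. The mount() helper is hypothetical:

    from twisted.web.resource import Resource

    def mount(root: Resource, resources) -> None:
        """Attach each "/a/b/c" -> Resource entry under an existing root."""
        for path, res in resources.items():
            segments = [seg.encode("ascii") for seg in path.split("/") if seg]
            node = root
            # Walk or create the intermediate segments ("_synapse", "client", ...).
            for seg in segments[:-1]:
                child = node.children.get(seg)
                if child is None:
                    child = Resource()
                    node.putChild(seg, child)
                node = child
            node.putChild(segments[-1], res)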
synapse/rest/synapse/client/new_user_consent.py (new file, 97 lines)
@@ -0,0 +1,97 @@
+# -*- coding: utf-8 -*-
+# Copyright 2021 The Matrix.org Foundation C.I.C.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+import logging
+from typing import TYPE_CHECKING
+
+from twisted.web.http import Request
+
+from synapse.api.errors import SynapseError
+from synapse.handlers.sso import get_username_mapping_session_cookie_from_request
+from synapse.http.server import DirectServeHtmlResource, respond_with_html
+from synapse.http.servlet import parse_string
+from synapse.types import UserID
+from synapse.util.templates import build_jinja_env
+
+if TYPE_CHECKING:
+    from synapse.server import HomeServer
+
+logger = logging.getLogger(__name__)
+
+
+class NewUserConsentResource(DirectServeHtmlResource):
+    """A resource which collects consent to the server's terms from a new user.
+
+    This resource gets mounted at /_synapse/client/new_user_consent, and is shown
+    when we are automatically creating a new user due to an SSO login.
+
+    It shows a template which prompts the user to go and read the Ts and Cs, and
+    tick a checkbox once they have done so.
+    """
+
+    def __init__(self, hs: "HomeServer"):
+        super().__init__()
+        self._sso_handler = hs.get_sso_handler()
+        self._server_name = hs.hostname
+        self._consent_version = hs.config.consent.user_consent_version
+
+        def template_search_dirs():
+            if hs.config.sso.sso_template_dir:
+                yield hs.config.sso.sso_template_dir
+            yield hs.config.sso.default_template_dir
+
+        self._jinja_env = build_jinja_env(template_search_dirs(), hs.config)
+
+    async def _async_render_GET(self, request: Request) -> None:
+        try:
+            session_id = get_username_mapping_session_cookie_from_request(request)
+            session = self._sso_handler.get_mapping_session(session_id)
+        except SynapseError as e:
+            logger.warning("Error fetching session: %s", e)
+            self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
+            return
+
+        user_id = UserID(session.chosen_localpart, self._server_name)
+        user_profile = {
+            "display_name": session.display_name,
+        }
+
+        template_params = {
+            "user_id": user_id.to_string(),
+            "user_profile": user_profile,
+            "consent_version": self._consent_version,
+            "terms_url": "/_matrix/consent?v=%s" % (self._consent_version,),
+        }
+
+        template = self._jinja_env.get_template("sso_new_user_consent.html")
+        html = template.render(template_params)
+        respond_with_html(request, 200, html)
+
+    async def _async_render_POST(self, request: Request):
+        try:
+            session_id = get_username_mapping_session_cookie_from_request(request)
+        except SynapseError as e:
+            logger.warning("Error fetching session cookie: %s", e)
+            self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
+            return
+
+        try:
+            accepted_version = parse_string(request, "accepted_version", required=True)
+        except SynapseError as e:
+            self._sso_handler.render_error(request, "bad_param", e.msg, code=e.code)
+            return
+
+        await self._sso_handler.handle_terms_accepted(
+            request, session_id, accepted_version
+        )
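The GET handler's template contract can be exercised in isolation. A sketch using a stand-in template body: the template name and parameter names are the ones used above, but the inline HTML is a made-up placeholder, not the file shipped with Synapse:

    from jinja2 import DictLoader, Environment

    env = Environment(
        loader=DictLoader(
            {
                "sso_new_user_consent.html": (
                    "<p>{{ user_id }} ({{ user_profile.display_name }}) must accept "
                    '<a href="{{ terms_url }}">version {{ consent_version }}</a> '
                    "of the terms.</p>"
                )
            }
        ),
        autoescape=True,
    )

    # Illustrative values standing in for a real mapping session.
    html = env.get_template("sso_new_user_consent.html").render(
        user_id="@alice:example.com",
        user_profile={"display_name": "Alice"},
        consent_version="1.0",
        terms_url="/_matrix/consent?v=1.0",
    )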
@@ -12,11 +12,12 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import logging

 from twisted.web.resource import Resource

-from synapse.rest.oidc.callback_resource import OIDCCallbackResource
+from synapse.rest.synapse.client.oidc.callback_resource import OIDCCallbackResource

 logger = logging.getLogger(__name__)

@@ -25,3 +26,6 @@ class OIDCResource(Resource):
     def __init__(self, hs):
         Resource.__init__(self)
         self.putChild(b"callback", OIDCCallbackResource(hs))
+
+
+__all__ = ["OIDCResource"]
@@ -12,42 +12,42 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from typing import TYPE_CHECKING
-
-import pkg_resources
+import logging
+from typing import TYPE_CHECKING, List

 from twisted.web.http import Request
 from twisted.web.resource import Resource
-from twisted.web.static import File

 from synapse.api.errors import SynapseError
-from synapse.handlers.sso import USERNAME_MAPPING_SESSION_COOKIE_NAME
-from synapse.http.server import DirectServeHtmlResource, DirectServeJsonResource
-from synapse.http.servlet import parse_string
+from synapse.handlers.sso import get_username_mapping_session_cookie_from_request
+from synapse.http.server import (
+    DirectServeHtmlResource,
+    DirectServeJsonResource,
+    respond_with_html,
+)
+from synapse.http.servlet import parse_boolean, parse_string
 from synapse.http.site import SynapseRequest
+from synapse.util.templates import build_jinja_env

 if TYPE_CHECKING:
     from synapse.server import HomeServer

+logger = logging.getLogger(__name__)
+

 def pick_username_resource(hs: "HomeServer") -> Resource:
     """Factory method to generate the username picker resource.

-    This resource gets mounted under /_synapse/client/pick_username. The top-level
-    resource is just a File resource which serves up the static files in the
-    resources "res" directory, but it has a couple of children:
-
-    * "submit", which does the mechanics of registering the new user, and redirects
-      the browser back to the client URL
-
-    * "check": checks if a userid is free.
+    This resource gets mounted under /_synapse/client/pick_username and has two
+    children:
+
+    * "account_details": renders the form and handles the POSTed response
+    * "check": a JSON endpoint which checks if a userid is free.
     """

-    # XXX should we make this path customisable so that admins can restyle it?
-    base_path = pkg_resources.resource_filename("synapse", "res/username_picker")
-
-    res = File(base_path)
-    res.putChild(b"submit", SubmitResource(hs))
+    res = Resource()
+    res.putChild(b"account_details", AccountDetailsResource(hs))
     res.putChild(b"check", AvailabilityCheckResource(hs))

     return res

@@ -61,28 +61,71 @@ class AvailabilityCheckResource(DirectServeJsonResource):
     async def _async_render_GET(self, request: Request):
         localpart = parse_string(request, "username", required=True)

-        session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
-        if not session_id:
-            raise SynapseError(code=400, msg="missing session_id")
+        session_id = get_username_mapping_session_cookie_from_request(request)

         is_available = await self._sso_handler.check_username_availability(
-            localpart, session_id.decode("ascii", errors="replace")
+            localpart, session_id
         )
         return 200, {"available": is_available}


-class SubmitResource(DirectServeHtmlResource):
+class AccountDetailsResource(DirectServeHtmlResource):
     def __init__(self, hs: "HomeServer"):
         super().__init__()
         self._sso_handler = hs.get_sso_handler()

+        def template_search_dirs():
+            if hs.config.sso.sso_template_dir:
+                yield hs.config.sso.sso_template_dir
+            yield hs.config.sso.default_template_dir
+
+        self._jinja_env = build_jinja_env(template_search_dirs(), hs.config)
+
+    async def _async_render_GET(self, request: Request) -> None:
+        try:
+            session_id = get_username_mapping_session_cookie_from_request(request)
+            session = self._sso_handler.get_mapping_session(session_id)
+        except SynapseError as e:
+            logger.warning("Error fetching session: %s", e)
+            self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
+            return
+
+        idp_id = session.auth_provider_id
+        template_params = {
+            "idp": self._sso_handler.get_identity_providers()[idp_id],
+            "user_attributes": {
+                "display_name": session.display_name,
+                "emails": session.emails,
+            },
+        }
+
+        template = self._jinja_env.get_template("sso_auth_account_details.html")
+        html = template.render(template_params)
+        respond_with_html(request, 200, html)
+
     async def _async_render_POST(self, request: SynapseRequest):
-        localpart = parse_string(request, "username", required=True)
-
-        session_id = request.getCookie(USERNAME_MAPPING_SESSION_COOKIE_NAME)
-        if not session_id:
-            raise SynapseError(code=400, msg="missing session_id")
+        try:
+            session_id = get_username_mapping_session_cookie_from_request(request)
+        except SynapseError as e:
+            logger.warning("Error fetching session cookie: %s", e)
+            self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
+            return
+
+        try:
+            localpart = parse_string(request, "username", required=True)
+            use_display_name = parse_boolean(request, "use_display_name", default=False)
+
+            try:
+                emails_to_use = [
+                    val.decode("utf-8") for val in request.args.get(b"use_email", [])
+                ]  # type: List[str]
+            except ValueError:
+                raise SynapseError(400, "Query parameter use_email must be utf-8")
+        except SynapseError as e:
+            logger.warning("[session %s] bad param: %s", session_id, e)
+            self._sso_handler.render_error(request, "bad_param", e.msg, code=e.code)
+            return

         await self._sso_handler.handle_submit_username_request(
-            request, localpart, session_id.decode("ascii", errors="replace")
+            request, session_id, localpart, use_display_name, emails_to_use
         )
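From the client side, the "check" endpoint described above is a plain JSON GET. A hypothetical probe follows; the homeserver URL and session value are placeholders, and the cookie name is an assumption based on USERNAME_MAPPING_SESSION_COOKIE_NAME rather than something verified here:

    import requests

    resp = requests.get(
        "https://example.com/_synapse/client/pick_username/check",
        params={"username": "alice"},
        # The mapping-session cookie set during the SSO redirect; assumed name.
        cookies={"username_mapping_session": "<session id>"},
    )
    print(resp.json())  # e.g. {"available": true}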
@@ -12,12 +12,13 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
 import logging

 from twisted.web.resource import Resource

-from synapse.rest.saml2.metadata_resource import SAML2MetadataResource
-from synapse.rest.saml2.response_resource import SAML2ResponseResource
+from synapse.rest.synapse.client.saml2.metadata_resource import SAML2MetadataResource
+from synapse.rest.synapse.client.saml2.response_resource import SAML2ResponseResource

 logger = logging.getLogger(__name__)

@@ -27,3 +28,6 @@ class SAML2Resource(Resource):
         Resource.__init__(self)
         self.putChild(b"metadata.xml", SAML2MetadataResource(hs))
         self.putChild(b"authn_response", SAML2ResponseResource(hs))
+
+
+__all__ = ["SAML2Resource"]
synapse/rest/synapse/client/sso_register.py (new file, 50 lines)
@@ -0,0 +1,50 @@
+# -*- coding: utf-8 -*-
+# Copyright 2021 The Matrix.org Foundation C.I.C.
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+import logging
+from typing import TYPE_CHECKING
+
+from twisted.web.http import Request
+
+from synapse.api.errors import SynapseError
+from synapse.handlers.sso import get_username_mapping_session_cookie_from_request
+from synapse.http.server import DirectServeHtmlResource
+
+if TYPE_CHECKING:
+    from synapse.server import HomeServer
+
+logger = logging.getLogger(__name__)
+
+
+class SsoRegisterResource(DirectServeHtmlResource):
+    """A resource which completes SSO registration.
+
+    This resource gets mounted at /_synapse/client/sso_register, and is shown
+    after we collect username and/or consent for a new SSO user. It (finally)
+    registers the user, and confirms the redirect to the client.
+    """
+
+    def __init__(self, hs: "HomeServer"):
+        super().__init__()
+        self._sso_handler = hs.get_sso_handler()
+
+    async def _async_render_GET(self, request: Request) -> None:
+        try:
+            session_id = get_username_mapping_session_cookie_from_request(request)
+        except SynapseError as e:
+            logger.warning("Error fetching session cookie: %s", e)
+            self._sso_handler.render_error(request, "bad_session", e.msg, code=e.code)
+            return
+        await self._sso_handler.register_sso_user(request, session_id)