
merge develop pydoc for _get_state_for_groups

This commit is contained in:
Matthew Hodgson 2018-07-19 11:26:04 +01:00
commit be3adfc331
370 changed files with 6458 additions and 3962 deletions

View file

@ -4,7 +4,12 @@ language: python
# tell travis to cache ~/.cache/pip # tell travis to cache ~/.cache/pip
cache: pip cache: pip
before_script:
- git remote set-branches --add origin develop
- git fetch origin develop
matrix: matrix:
fast_finish: true
include: include:
- python: 2.7 - python: 2.7
env: TOX_ENV=packaging env: TOX_ENV=packaging
@ -14,10 +19,16 @@ matrix:
- python: 2.7 - python: 2.7
env: TOX_ENV=py27 env: TOX_ENV=py27
- python: 3.6 - python: 3.6
env: TOX_ENV=py36 env: TOX_ENV=py36
- python: 3.6
env: TOX_ENV=check_isort
- python: 3.6
env: TOX_ENV=check-newsfragment
install: install:
- pip install tox - pip install tox

View file

@ -1,3 +1,108 @@
Synapse 0.33.0rc1 (2018-07-18)
==============================
Features
--------
- Enforce the specified API for report_event (`#3316 <https://github.com/matrix-org/synapse/issues/3316>`_)
- Include CPU time from database threads in request/block metrics. (`#3496 <https://github.com/matrix-org/synapse/issues/3496>`_)
- Add CPU metrics for _fetch_event_list (`#3497 <https://github.com/matrix-org/synapse/issues/3497>`_)
- Reduce database consumption when processing large numbers of receipts (`#3505 <https://github.com/matrix-org/synapse/issues/3505>`_)
- Cache optimisation for /sync requests (`#3521 <https://github.com/matrix-org/synapse/issues/3521>`_)
- Optimisation to make handling incoming federation requests more efficient. (`#3541 <https://github.com/matrix-org/synapse/issues/3541>`_)
Bugfixes
--------
- Fix queued federation requests being processed in the wrong order (`#3533 <https://github.com/matrix-org/synapse/issues/3533>`_)
- Ensure that erasure requests are correctly honoured for publicly accessible rooms when accessed over federation. (`#3546 <https://github.com/matrix-org/synapse/issues/3546>`_)
Misc
----
- `#3351 <https://github.com/matrix-org/synapse/issues/3351>`_, `#3463 <https://github.com/matrix-org/synapse/issues/3463>`_, `#3464 <https://github.com/matrix-org/synapse/issues/3464>`_, `#3498 <https://github.com/matrix-org/synapse/issues/3498>`_, `#3499 <https://github.com/matrix-org/synapse/issues/3499>`_, `#3501 <https://github.com/matrix-org/synapse/issues/3501>`_, `#3530 <https://github.com/matrix-org/synapse/issues/3530>`_, `#3534 <https://github.com/matrix-org/synapse/issues/3534>`_, `#3535 <https://github.com/matrix-org/synapse/issues/3535>`_, `#3540 <https://github.com/matrix-org/synapse/issues/3540>`_, `#3544 <https://github.com/matrix-org/synapse/issues/3544>`_
Synapse 0.32.2 (2018-07-07)
===========================
Bugfixes
--------
- Amend the Python dependencies to depend on attrs from PyPI, not attr (`#3492 <https://github.com/matrix-org/synapse/issues/3492>`_)
Synapse 0.32.1 (2018-07-06)
===========================
Bugfixes
--------
- Add explicit dependency on netaddr (`#3488 <https://github.com/matrix-org/synapse/issues/3488>`_)
Changes in synapse v0.32.0 (2018-07-06)
===========================================
No changes since 0.32.0rc1
Synapse 0.32.0rc1 (2018-07-05)
==============================
Features
--------
- Add blacklist & whitelist of servers allowed to send events to a room via ``m.room.server_acl`` event.
- Cache factor override system for specific caches (`#3334 <https://github.com/matrix-org/synapse/issues/3334>`_)
- Add metrics to track appservice transactions (`#3344 <https://github.com/matrix-org/synapse/issues/3344>`_)
- Try to log more helpful info when a sig verification fails (`#3372 <https://github.com/matrix-org/synapse/issues/3372>`_)
- Synapse now uses the best performing JSON encoder/decoder according to your runtime (simplejson on CPython, stdlib json on PyPy). (`#3462 <https://github.com/matrix-org/synapse/issues/3462>`_)
- Add optional ip_range_whitelist param to AS registration files to lock AS IP access (`#3465 <https://github.com/matrix-org/synapse/issues/3465>`_)
- Reject invalid server names in federation requests (`#3480 <https://github.com/matrix-org/synapse/issues/3480>`_)
- Reject invalid server names in homeserver.yaml (`#3483 <https://github.com/matrix-org/synapse/issues/3483>`_)
Bugfixes
--------
- Strip access_token from outgoing requests (`#3327 <https://github.com/matrix-org/synapse/issues/3327>`_)
- Redact AS tokens in logs (`#3349 <https://github.com/matrix-org/synapse/issues/3349>`_)
- Fix federation backfill from SQLite servers (`#3355 <https://github.com/matrix-org/synapse/issues/3355>`_)
- Fix event-purge-by-ts admin API (`#3363 <https://github.com/matrix-org/synapse/issues/3363>`_)
- Fix event filtering in get_missing_events handler (`#3371 <https://github.com/matrix-org/synapse/issues/3371>`_)
- Synapse is now stricter regarding accepting events which it cannot retrieve the prev_events for. (`#3456 <https://github.com/matrix-org/synapse/issues/3456>`_)
- Fix bug where synapse would explode when receiving unicode in HTTP User-Agent header (`#3470 <https://github.com/matrix-org/synapse/issues/3470>`_)
- Invalidate cache on correct thread to avoid race (`#3473 <https://github.com/matrix-org/synapse/issues/3473>`_)
Improved Documentation
----------------------
- ``doc/postgres.rst``: fix display of the last command block. Thanks to @ArchangeGabriel! (`#3340 <https://github.com/matrix-org/synapse/issues/3340>`_)
Deprecations and Removals
-------------------------
- Remove was_forgotten_at (`#3324 <https://github.com/matrix-org/synapse/issues/3324>`_)
Misc
----
- `#3332 <https://github.com/matrix-org/synapse/issues/3332>`_, `#3341 <https://github.com/matrix-org/synapse/issues/3341>`_, `#3347 <https://github.com/matrix-org/synapse/issues/3347>`_, `#3348 <https://github.com/matrix-org/synapse/issues/3348>`_, `#3356 <https://github.com/matrix-org/synapse/issues/3356>`_, `#3385 <https://github.com/matrix-org/synapse/issues/3385>`_, `#3446 <https://github.com/matrix-org/synapse/issues/3446>`_, `#3447 <https://github.com/matrix-org/synapse/issues/3447>`_, `#3467 <https://github.com/matrix-org/synapse/issues/3467>`_, `#3474 <https://github.com/matrix-org/synapse/issues/3474>`_
Changes in synapse v0.31.2 (2018-06-14)
=======================================
SECURITY UPDATE: Prevent unauthorised users from setting state events in a room
when there is no ``m.room.power_levels`` event in force in the room. (PR #3397)
Discussion around the Matrix Spec change proposal for this change can be
followed at https://github.com/matrix-org/matrix-doc/issues/1304.
Changes in synapse v0.31.1 (2018-06-08)
=======================================

View file

@ -48,6 +48,26 @@ Please ensure your changes match the cosmetic style of the existing project,
and **never** mix cosmetic and functional changes in the same commit, as it
makes it horribly hard to review otherwise.
Changelog
~~~~~~~~~
All changes, even minor ones, need a corresponding changelog
entry. These are managed by Towncrier
(https://github.com/hawkowl/towncrier).
To create a changelog entry, make a new file in the ``changelog.d``
directory named in the format of ``issuenumberOrPR.type``. The type can be
one of ``feature``, ``bugfix``, ``removal`` (also used for
deprecations), or ``misc`` (for internal-only changes). The content of
the file is your changelog entry, which can contain reStructuredText
formatting. A note of contributors is welcomed in changelogs for
non-misc changes (the content of misc changes is not displayed).

For example, a fix for a bug reported in #1234 would have its
changelog entry in ``changelog.d/1234.bugfix``, and contain content
like "The security levels of Florbs are now validated when
received over federation. Contributed by Jane Matrix".
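
As an illustration only (not part of the contributing guide), a minimal Python sketch that creates the changelog entry described above for the hypothetical bug #1234:

    from pathlib import Path

    # filename is <issue or PR number>.<type>; the file body is the changelog text
    entry = Path("changelog.d") / "1234.bugfix"
    entry.write_text(
        "The security levels of Florbs are now validated when received over "
        "federation. Contributed by Jane Matrix.\n"
    )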
Attribution
~~~~~~~~~~~

@ -110,11 +130,15 @@ If you agree to this for your contribution, then all that's needed is to
 include the line in your commit or pull request comment::

     Signed-off-by: Your Name <your@email.example.org>

-...using your real name; unfortunately pseudonyms and anonymous contributions
-can't be accepted. Git makes this trivial - just use the -s flag when you do
-``git commit``, having first set ``user.name`` and ``user.email`` git configs
-(which you should have done anyway :)
+We accept contributions under a legally identifiable name, such as
+your name on government documentation or common-law names (names
+claimed by legitimate usage or repute). Unfortunately, we cannot
+accept anonymous contributions at this time.
+
+Git allows you to add this signoff automatically when using the ``-s``
+flag to ``git commit``, which uses the name and email set in your
+``user.name`` and ``user.email`` git configs.

 Conclusion
 ~~~~~~~~~~

View file

@ -29,5 +29,8 @@ exclude Dockerfile
exclude .dockerignore exclude .dockerignore
recursive-exclude jenkins *.sh recursive-exclude jenkins *.sh
include pyproject.toml
recursive-include changelog.d *
prune .github prune .github
prune demo/etc prune demo/etc

1
changelog.d/.gitignore vendored Normal file
View file

@ -0,0 +1 @@
!.gitignore

1
changelog.d/3367.misc Normal file
View file

@ -0,0 +1 @@
Remove unnecessary event re-signing hacks

0
changelog.d/3460.misc Normal file
View file

1
changelog.d/3514.bugfix Normal file
View file

@ -0,0 +1 @@
Don't generate TURN credentials if no TURN config options are set

1
changelog.d/3553.feature Normal file
View file

@ -0,0 +1 @@
Add metrics to track resource usage by background processes

1
changelog.d/3554.feature Normal file
View file

@ -0,0 +1 @@
Add `code` label to `synapse_http_server_response_time_seconds` prometheus metric

1
changelog.d/3556.feature Normal file
View file

@ -0,0 +1 @@
Add metrics to track resource usage by background processes

1
changelog.d/3559.misc Normal file
View file

@ -0,0 +1 @@
add config for pep8

1
changelog.d/3561.bugfix Normal file
View file

@ -0,0 +1 @@
Disable a noisy warning about logcontexts

View file

@ -44,13 +44,26 @@ Deactivate Account
 This API deactivates an account. It removes active access tokens, resets the
 password, and deletes third-party IDs (to prevent the user requesting a
-password reset).
+password reset). It can also mark the user as GDPR-erased (stopping their data
+from being distributed further, and deleting it entirely if there are no other
+references to it).

 The api is::

     POST /_matrix/client/r0/admin/deactivate/<user_id>

-including an ``access_token`` of a server admin, and an empty request body.
+with a body of:
+
+.. code:: json
+
+    {
+        "erase": true
+    }
+
+including an ``access_token`` of a server admin.
+
+The erase parameter is optional and defaults to 'false'.
+An empty body may be passed for backwards compatibility.
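
For illustration only (not part of this documentation), a minimal Python sketch of calling the endpoint with the ``requests`` library; the homeserver URL, user ID and admin token below are placeholders:

    import requests

    ADMIN_TOKEN = "MDAx..."  # placeholder: a server admin's access token
    USER_ID = "@badger:example.com"  # placeholder user to deactivate

    resp = requests.post(
        "https://localhost:8008/_matrix/client/r0/admin/deactivate/%s" % USER_ID,
        params={"access_token": ADMIN_TOKEN},
        json={"erase": True},  # optional; omitting the body behaves like erase=false
        verify=False,  # common for a locally-run homeserver with a self-signed cert
    )
    resp.raise_for_status()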
Reset password

5
pyproject.toml Normal file
View file

@ -0,0 +1,5 @@
[tool.towncrier]
package = "synapse"
filename = "CHANGES.rst"
directory = "changelog.d"
issue_format = "`#{issue} <https://github.com/matrix-org/synapse/issues/{issue}>`_"

View file

@ -18,14 +18,22 @@
from __future__ import print_function from __future__ import print_function
import argparse import argparse
from urlparse import urlparse, urlunparse
import nacl.signing import nacl.signing
import json import json
import base64 import base64
import requests import requests
import sys import sys
from requests.adapters import HTTPAdapter
import srvlookup import srvlookup
import yaml import yaml
# uncomment the following to enable debug logging of http requests
#from httplib import HTTPConnection
#HTTPConnection.debuglevel = 1
def encode_base64(input_bytes): def encode_base64(input_bytes):
"""Encode bytes as a base64 string without any padding.""" """Encode bytes as a base64 string without any padding."""
@ -113,17 +121,6 @@ def read_signing_keys(stream):
     return keys


-def lookup(destination, path):
-    if ":" in destination:
-        return "https://%s%s" % (destination, path)
-    else:
-        try:
-            srv = srvlookup.lookup("matrix", "tcp", destination)[0]
-            return "https://%s:%d%s" % (srv.host, srv.port, path)
-        except:
-            return "https://%s:%d%s" % (destination, 8448, path)
def request_json(method, origin_name, origin_key, destination, path, content): def request_json(method, origin_name, origin_key, destination, path, content):
if method is None: if method is None:
if content is None: if content is None:
@ -152,13 +149,19 @@ def request_json(method, origin_name, origin_key, destination, path, content):
     authorization_headers.append(bytes(header))
     print ("Authorization: %s" % header, file=sys.stderr)

-    dest = lookup(destination, path)
+    dest = "matrix://%s%s" % (destination, path)
     print ("Requesting %s" % dest, file=sys.stderr)

-    result = requests.request(
+    s = requests.Session()
+    s.mount("matrix://", MatrixConnectionAdapter())
+
+    result = s.request(
         method=method,
         url=dest,
-        headers={"Authorization": authorization_headers[0]},
+        headers={
+            "Host": destination,
+            "Authorization": authorization_headers[0]
+        },
         verify=False,
         data=content,
     )
@ -242,5 +245,39 @@ def read_args_from_config(args):
         args.signing_key_path = config['signing_key_path']


+class MatrixConnectionAdapter(HTTPAdapter):
+    @staticmethod
+    def lookup(s):
+        if s[-1] == ']':
+            # ipv6 literal (with no port)
+            return s, 8448
+
+        if ":" in s:
+            out = s.rsplit(":",1)
+            try:
+                port = int(out[1])
+            except ValueError:
+                raise ValueError("Invalid host:port '%s'" % s)
+            return out[0], port
+
+        try:
+            srv = srvlookup.lookup("matrix", "tcp", s)[0]
+            return srv.host, srv.port
+        except:
+            return s, 8448
+
+    def get_connection(self, url, proxies=None):
+        parsed = urlparse(url)
+
+        (host, port) = self.lookup(parsed.netloc)
+        netloc = "%s:%d" % (host, port)
+        print("Connecting to %s" % (netloc,), file=sys.stderr)
+        url = urlunparse((
+            "https", netloc, parsed.path, parsed.params, parsed.query,
+            parsed.fragment,
+        ))
+        return super(MatrixConnectionAdapter, self).get_connection(url, proxies)
+
+
 if __name__ == "__main__":
     main()
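
As a quick, illustrative check of the host:port parsing in ``MatrixConnectionAdapter.lookup`` above (the hostnames are examples; a bare server name instead falls back to a ``matrix`` SRV lookup and then port 8448):

    # assumes the MatrixConnectionAdapter class defined above is in scope
    print(MatrixConnectionAdapter.lookup("example.com:8448"))  # ('example.com', 8448)
    print(MatrixConnectionAdapter.lookup("[::1]"))             # ('[::1]', 8448)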

View file

@ -14,8 +14,26 @@ ignore =
     pylint.cfg
     tox.ini

-[flake8]
+[pep8]
 max-line-length = 90
-# W503 requires that binary operators be at the end, not start, of lines. Erik doesn't like it.
-# E203 is contrary to PEP8.
+# W503 requires that binary operators be at the end, not start, of lines. Erik
+# doesn't like it. E203 is contrary to PEP8.
 ignore = W503,E203

+[flake8]
+# note that flake8 inherits the "ignore" settings from "pep8" (because it uses
+# pep8 to do those checks), but not the "max-line-length" setting
+max-line-length = 90
[isort]
line_length = 89
not_skip = __init__.py
sections=FUTURE,STDLIB,COMPAT,THIRDPARTY,TWISTED,FIRSTPARTY,TESTS,LOCALFOLDER
default_section=THIRDPARTY
known_first_party = synapse
known_tests=tests
known_compat = mock,six
known_twisted=twisted,OpenSSL
multi_line_output=3
include_trailing_comma=true
combine_as_imports=true

View file

@ -1,5 +1,6 @@
# -*- coding: utf-8 -*- # -*- coding: utf-8 -*-
# Copyright 2014-2016 OpenMarket Ltd # Copyright 2014-2016 OpenMarket Ltd
# Copyright 2018 New Vector Ltd
# #
# Licensed under the Apache License, Version 2.0 (the "License"); # Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License. # you may not use this file except in compliance with the License.
@ -16,4 +17,4 @@
""" This is a reference implementation of a Matrix home server. """ This is a reference implementation of a Matrix home server.
""" """
__version__ = "0.31.1" __version__ = "0.33.0rc1"

View file

@ -18,14 +18,16 @@ import logging
from six import itervalues from six import itervalues
import pymacaroons import pymacaroons
from netaddr import IPAddress
from twisted.internet import defer from twisted.internet import defer
import synapse.types import synapse.types
from synapse import event_auth from synapse import event_auth
from synapse.api.constants import EventTypes, Membership, JoinRules from synapse.api.constants import EventTypes, JoinRules, Membership
from synapse.api.errors import AuthError, Codes from synapse.api.errors import AuthError, Codes
from synapse.types import UserID from synapse.types import UserID
from synapse.util.caches import register_cache, CACHE_SIZE_FACTOR from synapse.util.caches import CACHE_SIZE_FACTOR, register_cache
from synapse.util.caches.lrucache import LruCache from synapse.util.caches.lrucache import LruCache
from synapse.util.metrics import Measure from synapse.util.metrics import Measure
@ -191,7 +193,7 @@ class Auth(object):
synapse.types.create_requester(user_id, app_service=app_service) synapse.types.create_requester(user_id, app_service=app_service)
) )
-access_token = get_access_token_from_request(
+access_token = self.get_access_token_from_request(
request, self.TOKEN_NOT_FOUND_HTTP_STATUS request, self.TOKEN_NOT_FOUND_HTTP_STATUS
) )
@ -237,13 +239,18 @@ class Auth(object):
@defer.inlineCallbacks @defer.inlineCallbacks
def _get_appservice_user_id(self, request): def _get_appservice_user_id(self, request):
app_service = self.store.get_app_service_by_token( app_service = self.store.get_app_service_by_token(
-get_access_token_from_request(
+self.get_access_token_from_request(
request, self.TOKEN_NOT_FOUND_HTTP_STATUS request, self.TOKEN_NOT_FOUND_HTTP_STATUS
) )
) )
if app_service is None: if app_service is None:
defer.returnValue((None, None)) defer.returnValue((None, None))
+        if app_service.ip_range_whitelist:
+            ip_address = IPAddress(self.hs.get_ip_from_request(request))
+            if ip_address not in app_service.ip_range_whitelist:
+                defer.returnValue((None, None))
if "user_id" not in request.args: if "user_id" not in request.args:
defer.returnValue((app_service.sender, app_service)) defer.returnValue((app_service.sender, app_service))
@ -488,7 +495,7 @@ class Auth(object):
def _look_up_user_by_access_token(self, token): def _look_up_user_by_access_token(self, token):
ret = yield self.store.get_user_by_access_token(token) ret = yield self.store.get_user_by_access_token(token)
if not ret: if not ret:
logger.warn("Unrecognised access token - not in store: %s" % (token,)) logger.warn("Unrecognised access token - not in store.")
raise AuthError( raise AuthError(
self.TOKEN_NOT_FOUND_HTTP_STATUS, "Unrecognised access token.", self.TOKEN_NOT_FOUND_HTTP_STATUS, "Unrecognised access token.",
errcode=Codes.UNKNOWN_TOKEN errcode=Codes.UNKNOWN_TOKEN
@ -506,12 +513,12 @@ class Auth(object):
def get_appservice_by_req(self, request): def get_appservice_by_req(self, request):
try: try:
-token = get_access_token_from_request(
+token = self.get_access_token_from_request(
request, self.TOKEN_NOT_FOUND_HTTP_STATUS request, self.TOKEN_NOT_FOUND_HTTP_STATUS
) )
service = self.store.get_app_service_by_token(token) service = self.store.get_app_service_by_token(token)
if not service: if not service:
logger.warn("Unrecognised appservice access token: %s" % (token,)) logger.warn("Unrecognised appservice access token.")
raise AuthError( raise AuthError(
self.TOKEN_NOT_FOUND_HTTP_STATUS, self.TOKEN_NOT_FOUND_HTTP_STATUS,
"Unrecognised access token.", "Unrecognised access token.",
@ -655,7 +662,7 @@ class Auth(object):
auth_events[(EventTypes.PowerLevels, "")] = power_level_event auth_events[(EventTypes.PowerLevels, "")] = power_level_event
send_level = event_auth.get_send_level( send_level = event_auth.get_send_level(
-EventTypes.Aliases, "", auth_events
+EventTypes.Aliases, "", power_level_event,
) )
user_level = event_auth.get_user_power_level(user_id, auth_events) user_level = event_auth.get_user_power_level(user_id, auth_events)
@ -666,67 +673,67 @@ class Auth(object):
                 " edit its room list entry"
             )

-def has_access_token(request):
-    """Checks if the request has an access_token.
-
-    Returns:
-        bool: False if no access_token was given, True otherwise.
-    """
-    query_params = request.args.get("access_token")
-    auth_headers = request.requestHeaders.getRawHeaders(b"Authorization")
-    return bool(query_params) or bool(auth_headers)
-
-
-def get_access_token_from_request(request, token_not_found_http_status=401):
-    """Extracts the access_token from the request.
-
-    Args:
-        request: The http request.
-        token_not_found_http_status(int): The HTTP status code to set in the
-            AuthError if the token isn't found. This is used in some of the
-            legacy APIs to change the status code to 403 from the default of
-            401 since some of the old clients depended on auth errors returning
-            403.
-    Returns:
-        str: The access_token
-    Raises:
-        AuthError: If there isn't an access_token in the request.
-    """
-
-    auth_headers = request.requestHeaders.getRawHeaders(b"Authorization")
-    query_params = request.args.get(b"access_token")
-    if auth_headers:
-        # Try the get the access_token from a "Authorization: Bearer"
-        # header
-        if query_params is not None:
-            raise AuthError(
-                token_not_found_http_status,
-                "Mixing Authorization headers and access_token query parameters.",
-                errcode=Codes.MISSING_TOKEN,
-            )
-        if len(auth_headers) > 1:
-            raise AuthError(
-                token_not_found_http_status,
-                "Too many Authorization headers.",
-                errcode=Codes.MISSING_TOKEN,
-            )
-        parts = auth_headers[0].split(" ")
-        if parts[0] == "Bearer" and len(parts) == 2:
-            return parts[1]
-        else:
-            raise AuthError(
-                token_not_found_http_status,
-                "Invalid Authorization header.",
-                errcode=Codes.MISSING_TOKEN,
-            )
-    else:
-        # Try to get the access_token from the query params.
-        if not query_params:
-            raise AuthError(
-                token_not_found_http_status,
-                "Missing access token.",
-                errcode=Codes.MISSING_TOKEN
-            )
-
-        return query_params[0]
+    @staticmethod
+    def has_access_token(request):
+        """Checks if the request has an access_token.
+
+        Returns:
+            bool: False if no access_token was given, True otherwise.
+        """
+        query_params = request.args.get("access_token")
+        auth_headers = request.requestHeaders.getRawHeaders(b"Authorization")
+        return bool(query_params) or bool(auth_headers)
+
+    @staticmethod
+    def get_access_token_from_request(request, token_not_found_http_status=401):
+        """Extracts the access_token from the request.
+
+        Args:
+            request: The http request.
+            token_not_found_http_status(int): The HTTP status code to set in the
+                AuthError if the token isn't found. This is used in some of the
+                legacy APIs to change the status code to 403 from the default of
+                401 since some of the old clients depended on auth errors returning
+                403.
+        Returns:
+            str: The access_token
+        Raises:
+            AuthError: If there isn't an access_token in the request.
+        """
+
+        auth_headers = request.requestHeaders.getRawHeaders(b"Authorization")
+        query_params = request.args.get(b"access_token")
+        if auth_headers:
+            # Try the get the access_token from a "Authorization: Bearer"
+            # header
+            if query_params is not None:
+                raise AuthError(
+                    token_not_found_http_status,
+                    "Mixing Authorization headers and access_token query parameters.",
+                    errcode=Codes.MISSING_TOKEN,
+                )
+            if len(auth_headers) > 1:
+                raise AuthError(
+                    token_not_found_http_status,
+                    "Too many Authorization headers.",
+                    errcode=Codes.MISSING_TOKEN,
+                )
+            parts = auth_headers[0].split(" ")
+            if parts[0] == "Bearer" and len(parts) == 2:
+                return parts[1]
+            else:
+                raise AuthError(
+                    token_not_found_http_status,
+                    "Invalid Authorization header.",
+                    errcode=Codes.MISSING_TOKEN,
+                )
+        else:
+            # Try to get the access_token from the query params.
+            if not query_params:
+                raise AuthError(
+                    token_not_found_http_status,
+                    "Missing access token.",
+                    errcode=Codes.MISSING_TOKEN
+                )

+            return query_params[0]
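
For illustration (not part of the diff), the two ways a client can present an access token that ``get_access_token_from_request`` accepts; sending both in one request is rejected. The URL and token are placeholders:

    import requests

    TOKEN = "MDAxabc..."  # placeholder access token
    URL = "https://localhost:8008/_matrix/client/r0/account/whoami"

    # 1. as a query parameter
    r1 = requests.get(URL, params={"access_token": TOKEN}, verify=False)

    # 2. as an "Authorization: Bearer" header
    r2 = requests.get(URL, headers={"Authorization": "Bearer %s" % TOKEN}, verify=False)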

View file

@ -76,6 +76,8 @@ class EventTypes(object):
Topic = "m.room.topic" Topic = "m.room.topic"
Name = "m.room.name" Name = "m.room.name"
ServerACL = "m.room.server_acl"
class RejectedReason(object): class RejectedReason(object):
AUTH_ERROR = "auth_error" AUTH_ERROR = "auth_error"

View file

@ -17,10 +17,11 @@
import logging import logging
import simplejson as json
from six import iteritems from six import iteritems
from six.moves import http_client from six.moves import http_client
from canonicaljson import json
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -12,14 +12,15 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.api.errors import SynapseError import jsonschema
from synapse.storage.presence import UserPresenceState from canonicaljson import json
from synapse.types import UserID, RoomID from jsonschema import FormatChecker
from twisted.internet import defer from twisted.internet import defer
import simplejson as json from synapse.api.errors import SynapseError
import jsonschema from synapse.storage.presence import UserPresenceState
from jsonschema import FormatChecker from synapse.types import RoomID, UserID
FILTER_SCHEMA = { FILTER_SCHEMA = {
"additionalProperties": False, "additionalProperties": False,

View file

@ -15,8 +15,8 @@
# limitations under the License. # limitations under the License.
"""Contains the URL paths to prefix various aspects of the server with. """ """Contains the URL paths to prefix various aspects of the server with. """
from hashlib import sha256
import hmac import hmac
from hashlib import sha256
from six.moves.urllib.parse import urlencode from six.moves.urllib.parse import urlencode

View file

@ -14,9 +14,11 @@
# limitations under the License. # limitations under the License.
import sys import sys
from synapse import python_dependencies # noqa: E402
sys.dont_write_bytecode = True sys.dont_write_bytecode = True
from synapse import python_dependencies # noqa: E402
try: try:
python_dependencies.check_requirements() python_dependencies.check_requirements()

View file

@ -17,15 +17,18 @@ import gc
import logging import logging
import sys import sys
from daemonize import Daemonize
from twisted.internet import error, reactor
from synapse.util import PreserveLoggingContext
from synapse.util.rlimit import change_resource_limit
try: try:
import affinity import affinity
except Exception: except Exception:
affinity = None affinity = None
from daemonize import Daemonize
from synapse.util import PreserveLoggingContext
from synapse.util.rlimit import change_resource_limit
from twisted.internet import error, reactor
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.app import _base from synapse.app import _base
@ -23,6 +26,7 @@ from synapse.config._base import ConfigError
from synapse.config.homeserver import HomeServerConfig from synapse.config.homeserver import HomeServerConfig
from synapse.config.logger import setup_logging from synapse.config.logger import setup_logging
from synapse.http.site import SynapseSite from synapse.http.site import SynapseSite
from synapse.metrics import RegistryProxy
from synapse.metrics.resource import METRICS_PREFIX, MetricsResource from synapse.metrics.resource import METRICS_PREFIX, MetricsResource
from synapse.replication.slave.storage.appservice import SlavedApplicationServiceStore from synapse.replication.slave.storage.appservice import SlavedApplicationServiceStore
from synapse.replication.slave.storage.directory import DirectoryStore from synapse.replication.slave.storage.directory import DirectoryStore
@ -35,8 +39,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext, run_in_background from synapse.util.logcontext import LoggingContext, run_in_background
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import reactor, defer
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.appservice") logger = logging.getLogger("synapse.app.appservice")
@ -62,7 +64,7 @@ class AppserviceServer(HomeServer):
for res in listener_config["resources"]: for res in listener_config["resources"]:
for name in res["names"]: for name in res["names"]:
if name == "metrics": if name == "metrics":
-resources[METRICS_PREFIX] = MetricsResource(self)
+resources[METRICS_PREFIX] = MetricsResource(RegistryProxy)
root_resource = create_resource_tree(resources, NoResource()) root_resource = create_resource_tree(resources, NoResource())
@ -97,7 +99,7 @@ class AppserviceServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.app import _base from synapse.app import _base
@ -44,8 +47,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext from synapse.util.logcontext import LoggingContext
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.client_reader") logger = logging.getLogger("synapse.app.client_reader")
@ -122,7 +123,7 @@ class ClientReaderServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.app import _base from synapse.app import _base
@ -43,8 +46,10 @@ from synapse.replication.slave.storage.room import RoomStore
from synapse.replication.slave.storage.transactions import TransactionStore from synapse.replication.slave.storage.transactions import TransactionStore
from synapse.replication.tcp.client import ReplicationClientHandler from synapse.replication.tcp.client import ReplicationClientHandler
from synapse.rest.client.v1.room import ( from synapse.rest.client.v1.room import (
RoomSendEventRestServlet, RoomMembershipRestServlet, RoomStateEventRestServlet,
JoinRoomAliasServlet, JoinRoomAliasServlet,
RoomMembershipRestServlet,
RoomSendEventRestServlet,
RoomStateEventRestServlet,
) )
from synapse.server import HomeServer from synapse.server import HomeServer
from synapse.storage.engines import create_engine from synapse.storage.engines import create_engine
@ -52,8 +57,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext from synapse.util.logcontext import LoggingContext
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.event_creator") logger = logging.getLogger("synapse.app.event_creator")
@ -138,7 +141,7 @@ class EventCreatorServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.api.urls import FEDERATION_PREFIX from synapse.api.urls import FEDERATION_PREFIX
@ -41,8 +44,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext from synapse.util.logcontext import LoggingContext
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.federation_reader") logger = logging.getLogger("synapse.app.federation_reader")
@ -111,7 +112,7 @@ class FederationReaderServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.app import _base from synapse.app import _base
@ -42,8 +45,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext, run_in_background from synapse.util.logcontext import LoggingContext, run_in_background
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.federation_sender") logger = logging.getLogger("synapse.app.federation_sender")
@ -125,7 +126,7 @@ class FederationSenderServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.api.errors import SynapseError from synapse.api.errors import SynapseError
@ -25,9 +28,7 @@ from synapse.config.homeserver import HomeServerConfig
from synapse.config.logger import setup_logging from synapse.config.logger import setup_logging
from synapse.crypto import context_factory from synapse.crypto import context_factory
from synapse.http.server import JsonResource from synapse.http.server import JsonResource
from synapse.http.servlet import ( from synapse.http.servlet import RestServlet, parse_json_object_from_request
RestServlet, parse_json_object_from_request,
)
from synapse.http.site import SynapseSite from synapse.http.site import SynapseSite
from synapse.metrics import RegistryProxy from synapse.metrics import RegistryProxy
from synapse.metrics.resource import METRICS_PREFIX, MetricsResource from synapse.metrics.resource import METRICS_PREFIX, MetricsResource
@ -44,8 +45,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext from synapse.util.logcontext import LoggingContext
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.frontend_proxy") logger = logging.getLogger("synapse.app.frontend_proxy")
@ -176,7 +175,7 @@ class FrontendProxyServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -18,27 +18,39 @@ import logging
import os import os
import sys import sys
from twisted.application import service
from twisted.internet import defer, reactor
from twisted.web.resource import EncodingResourceWrapper, NoResource
from twisted.web.server import GzipEncoderFactory
from twisted.web.static import File
import synapse import synapse
import synapse.config.logger import synapse.config.logger
from synapse import events from synapse import events
from synapse.api.urls import CONTENT_REPO_PREFIX, FEDERATION_PREFIX, \ from synapse.api.urls import (
LEGACY_MEDIA_PREFIX, MEDIA_PREFIX, SERVER_KEY_PREFIX, SERVER_KEY_V2_PREFIX, \ CONTENT_REPO_PREFIX,
STATIC_PREFIX, WEB_CLIENT_PREFIX FEDERATION_PREFIX,
LEGACY_MEDIA_PREFIX,
MEDIA_PREFIX,
SERVER_KEY_PREFIX,
SERVER_KEY_V2_PREFIX,
STATIC_PREFIX,
WEB_CLIENT_PREFIX,
)
from synapse.app import _base from synapse.app import _base
from synapse.app._base import quit_with_error, listen_ssl, listen_tcp from synapse.app._base import listen_ssl, listen_tcp, quit_with_error
from synapse.config._base import ConfigError from synapse.config._base import ConfigError
from synapse.config.homeserver import HomeServerConfig from synapse.config.homeserver import HomeServerConfig
from synapse.crypto import context_factory from synapse.crypto import context_factory
from synapse.federation.transport.server import TransportLayerServer from synapse.federation.transport.server import TransportLayerServer
from synapse.module_api import ModuleApi
from synapse.http.additional_resource import AdditionalResource from synapse.http.additional_resource import AdditionalResource
from synapse.http.server import RootRedirect from synapse.http.server import RootRedirect
from synapse.http.site import SynapseSite from synapse.http.site import SynapseSite
from synapse.metrics import RegistryProxy from synapse.metrics import RegistryProxy
from synapse.metrics.resource import METRICS_PREFIX, MetricsResource from synapse.metrics.resource import METRICS_PREFIX, MetricsResource
from synapse.python_dependencies import CONDITIONAL_REQUIREMENTS, \ from synapse.module_api import ModuleApi
check_requirements from synapse.python_dependencies import CONDITIONAL_REQUIREMENTS, check_requirements
from synapse.replication.http import ReplicationRestResource, REPLICATION_PREFIX from synapse.replication.http import REPLICATION_PREFIX, ReplicationRestResource
from synapse.replication.tcp.resource import ReplicationStreamProtocolFactory from synapse.replication.tcp.resource import ReplicationStreamProtocolFactory
from synapse.rest import ClientRestResource from synapse.rest import ClientRestResource
from synapse.rest.key.v1.server_key_resource import LocalKey from synapse.rest.key.v1.server_key_resource import LocalKey
@ -55,11 +67,6 @@ from synapse.util.manhole import manhole
from synapse.util.module_loader import load_module from synapse.util.module_loader import load_module
from synapse.util.rlimit import change_resource_limit from synapse.util.rlimit import change_resource_limit
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.application import service
from twisted.internet import defer, reactor
from twisted.web.resource import EncodingResourceWrapper, NoResource
from twisted.web.server import GzipEncoderFactory
from twisted.web.static import File
logger = logging.getLogger("synapse.app.homeserver") logger = logging.getLogger("synapse.app.homeserver")
@ -266,7 +273,7 @@ class SynapseHomeServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])
@ -318,11 +325,6 @@ def setup(config_options):
# check any extra requirements we have now we have a config # check any extra requirements we have now we have a config
check_requirements(config) check_requirements(config)
version_string = "Synapse/" + get_version_string(synapse)
logger.info("Server hostname: %s", config.server_name)
logger.info("Server version: %s", version_string)
events.USE_FROZEN_DICTS = config.use_frozen_dicts events.USE_FROZEN_DICTS = config.use_frozen_dicts
tls_server_context_factory = context_factory.ServerContextFactory(config) tls_server_context_factory = context_factory.ServerContextFactory(config)
@ -335,7 +337,7 @@ def setup(config_options):
db_config=config.database_config, db_config=config.database_config,
tls_server_context_factory=tls_server_context_factory, tls_server_context_factory=tls_server_context_factory,
config=config, config=config,
-version_string=version_string,
+version_string="Synapse/" + get_version_string(synapse),
database_engine=database_engine, database_engine=database_engine,
) )

View file

@ -16,11 +16,12 @@
import logging import logging
import sys import sys
from twisted.internet import reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.api.urls import ( from synapse.api.urls import CONTENT_REPO_PREFIX, LEGACY_MEDIA_PREFIX, MEDIA_PREFIX
CONTENT_REPO_PREFIX, LEGACY_MEDIA_PREFIX, MEDIA_PREFIX
)
from synapse.app import _base from synapse.app import _base
from synapse.config._base import ConfigError from synapse.config._base import ConfigError
from synapse.config.homeserver import HomeServerConfig from synapse.config.homeserver import HomeServerConfig
@ -43,8 +44,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext from synapse.util.logcontext import LoggingContext
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.media_repository") logger = logging.getLogger("synapse.app.media_repository")
@ -118,7 +117,7 @@ class MediaRepositoryServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,6 +16,9 @@
import logging import logging
import sys import sys
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.app import _base from synapse.app import _base
@ -37,8 +40,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext, run_in_background from synapse.util.logcontext import LoggingContext, run_in_background
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.pusher") logger = logging.getLogger("synapse.app.pusher")
@ -128,7 +129,7 @@ class PusherServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -17,6 +17,11 @@ import contextlib
import logging import logging
import sys import sys
from six import iteritems
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse.api.constants import EventTypes from synapse.api.constants import EventTypes
from synapse.app import _base from synapse.app import _base
@ -36,12 +41,12 @@ from synapse.replication.slave.storage.deviceinbox import SlavedDeviceInboxStore
from synapse.replication.slave.storage.devices import SlavedDeviceStore from synapse.replication.slave.storage.devices import SlavedDeviceStore
from synapse.replication.slave.storage.events import SlavedEventStore from synapse.replication.slave.storage.events import SlavedEventStore
from synapse.replication.slave.storage.filtering import SlavedFilteringStore from synapse.replication.slave.storage.filtering import SlavedFilteringStore
from synapse.replication.slave.storage.groups import SlavedGroupServerStore
from synapse.replication.slave.storage.presence import SlavedPresenceStore from synapse.replication.slave.storage.presence import SlavedPresenceStore
from synapse.replication.slave.storage.push_rule import SlavedPushRuleStore from synapse.replication.slave.storage.push_rule import SlavedPushRuleStore
from synapse.replication.slave.storage.receipts import SlavedReceiptsStore from synapse.replication.slave.storage.receipts import SlavedReceiptsStore
from synapse.replication.slave.storage.registration import SlavedRegistrationStore from synapse.replication.slave.storage.registration import SlavedRegistrationStore
from synapse.replication.slave.storage.room import RoomStore from synapse.replication.slave.storage.room import RoomStore
from synapse.replication.slave.storage.groups import SlavedGroupServerStore
from synapse.replication.tcp.client import ReplicationClientHandler from synapse.replication.tcp.client import ReplicationClientHandler
from synapse.rest.client.v1 import events from synapse.rest.client.v1 import events
from synapse.rest.client.v1.initial_sync import InitialSyncRestServlet from synapse.rest.client.v1.initial_sync import InitialSyncRestServlet
@ -56,10 +61,6 @@ from synapse.util.logcontext import LoggingContext, run_in_background
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.stringutils import random_string from synapse.util.stringutils import random_string
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
from six import iteritems
logger = logging.getLogger("synapse.app.synchrotron") logger = logging.getLogger("synapse.app.synchrotron")
@ -305,7 +306,7 @@ class SynchrotronServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -16,16 +16,17 @@
import argparse import argparse
import collections import collections
import errno
import glob import glob
import os import os
import os.path import os.path
import signal import signal
import subprocess import subprocess
import sys import sys
import yaml
import errno
import time import time
import yaml
SYNAPSE = [sys.executable, "-B", "-m", "synapse.app.homeserver"] SYNAPSE = [sys.executable, "-B", "-m", "synapse.app.homeserver"]
GREEN = "\x1b[1;32m" GREEN = "\x1b[1;32m"

View file

@ -17,6 +17,9 @@
import logging import logging
import sys import sys
from twisted.internet import defer, reactor
from twisted.web.resource import NoResource
import synapse import synapse
from synapse import events from synapse import events
from synapse.app import _base from synapse.app import _base
@ -43,8 +46,6 @@ from synapse.util.httpresourcetree import create_resource_tree
from synapse.util.logcontext import LoggingContext, run_in_background from synapse.util.logcontext import LoggingContext, run_in_background
from synapse.util.manhole import manhole from synapse.util.manhole import manhole
from synapse.util.versionstring import get_version_string from synapse.util.versionstring import get_version_string
from twisted.internet import reactor, defer
from twisted.web.resource import NoResource
logger = logging.getLogger("synapse.app.user_dir") logger = logging.getLogger("synapse.app.user_dir")
@ -150,7 +151,7 @@ class UserDirectoryServer(HomeServer):
elif listener["type"] == "metrics": elif listener["type"] == "metrics":
if not self.get_config().enable_metrics: if not self.get_config().enable_metrics:
logger.warn(("Metrics listener configured, but " logger.warn(("Metrics listener configured, but "
"collect_metrics is not enabled!")) "enable_metrics is not True!"))
else: else:
_base.listen_metrics(listener["bind_addresses"], _base.listen_metrics(listener["bind_addresses"],
listener["port"]) listener["port"])

View file

@ -12,17 +12,17 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.api.constants import EventTypes
from synapse.util.caches.descriptors import cachedInlineCallbacks
from synapse.types import GroupID, get_domain_from_id
from twisted.internet import defer
import logging import logging
import re import re
from six import string_types from six import string_types
from twisted.internet import defer
from synapse.api.constants import EventTypes
from synapse.types import GroupID, get_domain_from_id
from synapse.util.caches.descriptors import cachedInlineCallbacks
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -85,7 +85,8 @@ class ApplicationService(object):
NS_LIST = [NS_USERS, NS_ALIASES, NS_ROOMS] NS_LIST = [NS_USERS, NS_ALIASES, NS_ROOMS]
 def __init__(self, token, hostname, url=None, namespaces=None, hs_token=None,
-             sender=None, id=None, protocols=None, rate_limited=True):
+             sender=None, id=None, protocols=None, rate_limited=True,
+             ip_range_whitelist=None):
self.token = token self.token = token
self.url = url self.url = url
self.hs_token = hs_token self.hs_token = hs_token
@ -93,6 +94,7 @@ class ApplicationService(object):
self.server_name = hostname self.server_name = hostname
self.namespaces = self._check_namespaces(namespaces) self.namespaces = self._check_namespaces(namespaces)
self.id = id self.id = id
self.ip_range_whitelist = ip_range_whitelist
if "|" in self.id: if "|" in self.id:
raise Exception("application service ID cannot contain '|' character") raise Exception("application service ID cannot contain '|' character")

View file

@ -12,20 +12,20 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer
from synapse.api.constants import ThirdPartyEntityKind
from synapse.api.errors import CodeMessageException
from synapse.http.client import SimpleHttpClient
from synapse.events.utils import serialize_event
from synapse.util.caches.response_cache import ResponseCache
from synapse.types import ThirdPartyInstanceID
import logging import logging
import urllib import urllib
from prometheus_client import Counter from prometheus_client import Counter
from twisted.internet import defer
from synapse.api.constants import ThirdPartyEntityKind
from synapse.api.errors import CodeMessageException
from synapse.events.utils import serialize_event
from synapse.http.client import SimpleHttpClient
from synapse.types import ThirdPartyInstanceID
from synapse.util.caches.response_cache import ResponseCache
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
sent_transactions_counter = Counter( sent_transactions_counter = Counter(

View file

@ -48,14 +48,14 @@ UP & quit +---------- YES SUCCESS
This is all tied together by the AppServiceScheduler which DIs the required This is all tied together by the AppServiceScheduler which DIs the required
components. components.
""" """
import logging
from twisted.internet import defer from twisted.internet import defer
from synapse.appservice import ApplicationServiceState from synapse.appservice import ApplicationServiceState
from synapse.util.logcontext import run_in_background from synapse.util.logcontext import run_in_background
from synapse.util.metrics import Measure from synapse.util.metrics import Measure
import logging
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -16,11 +16,12 @@
import argparse import argparse
import errno import errno
import os import os
import yaml
from textwrap import dedent from textwrap import dedent
from six import integer_types from six import integer_types
import yaml
class ConfigError(Exception): class ConfigError(Exception):
pass pass

View file

@@ -12,10 +12,10 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from ._base import Config
from synapse.api.constants import EventTypes
from ._base import Config
class ApiConfig(Config):

View file

@@ -12,17 +12,19 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from ._base import Config, ConfigError
from synapse.appservice import ApplicationService
from synapse.types import UserID
import yaml
import logging
from six import string_types
from six.moves.urllib import parse as urlparse
import yaml
from netaddr import IPSet
from synapse.appservice import ApplicationService
from synapse.types import UserID
from ._base import Config, ConfigError
logger = logging.getLogger(__name__)
@@ -154,6 +156,13 @@ def _load_appservice(hostname, as_info, config_filename):
" will not receive events or queries.",
config_filename,
)
ip_range_whitelist = None
if as_info.get('ip_range_whitelist'):
ip_range_whitelist = IPSet(
as_info.get('ip_range_whitelist')
)
return ApplicationService(
token=as_info["as_token"],
hostname=hostname,
@@ -163,5 +172,6 @@ def _load_appservice(hostname, as_info, config_filename):
sender=user_id,
id=as_info["id"],
protocols=protocols,
rate_limited=rate_limited
rate_limited=rate_limited,
ip_range_whitelist=ip_range_whitelist,
)
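
The new ip_range_whitelist option is parsed into a netaddr IPSet, which supports containment checks against individual addresses. A minimal sketch of how such a whitelist might be consulted (the CIDR ranges and request IP below are invented for illustration):

    from netaddr import IPSet

    # hypothetical whitelist, as it might be loaded from an appservice registration file
    ip_range_whitelist = IPSet(["127.0.0.1/32", "10.0.0.0/8"])

    def is_request_allowed(request_ip):
        # IPSet supports "in" checks against plain string addresses
        return request_ip in ip_range_whitelist

    print(is_request_allowed("10.1.2.3"))   # True
    print(is_request_allowed("192.0.2.1"))  # False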

View file

@@ -13,32 +13,32 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
from .tls import TlsConfig
from .server import ServerConfig
from .logger import LoggingConfig
from .database import DatabaseConfig
from .ratelimiting import RatelimitConfig
from .repository import ContentRepositoryConfig
from .captcha import CaptchaConfig
from .voip import VoipConfig
from .registration import RegistrationConfig
from .metrics import MetricsConfig
from .api import ApiConfig
from .appservice import AppServiceConfig
from .key import KeyConfig
from .captcha import CaptchaConfig
from .saml2 import SAML2Config
from .cas import CasConfig
from .password import PasswordConfig
from .jwt import JWTConfig
from .password_auth_providers import PasswordAuthProviderConfig
from .emailconfig import EmailConfig
from .workers import WorkerConfig
from .push import PushConfig
from .spam_checker import SpamCheckerConfig
from .groups import GroupsConfig
from .user_directory import UserDirectoryConfig
from .consent_config import ConsentConfig from .consent_config import ConsentConfig
from .database import DatabaseConfig
from .emailconfig import EmailConfig
from .groups import GroupsConfig
from .jwt import JWTConfig
from .key import KeyConfig
from .logger import LoggingConfig
from .metrics import MetricsConfig
from .password import PasswordConfig
from .password_auth_providers import PasswordAuthProviderConfig
from .push import PushConfig
from .ratelimiting import RatelimitConfig
from .registration import RegistrationConfig
from .repository import ContentRepositoryConfig
from .saml2 import SAML2Config
from .server import ServerConfig
from .server_notices_config import ServerNoticesConfig from .server_notices_config import ServerNoticesConfig
from .spam_checker import SpamCheckerConfig
from .tls import TlsConfig
from .user_directory import UserDirectoryConfig
from .voip import VoipConfig
from .workers import WorkerConfig
class HomeServerConfig(TlsConfig, ServerConfig, DatabaseConfig, LoggingConfig,

View file

@@ -15,7 +15,6 @@
from ._base import Config, ConfigError
MISSING_JWT = (
"""Missing jwt library. This is required for jwt login.

View file

@ -13,21 +13,24 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from ._base import Config, ConfigError
from synapse.util.stringutils import random_string
from signedjson.key import (
generate_signing_key, is_signing_algorithm_supported,
decode_signing_key_base64, decode_verify_key_bytes,
read_signing_keys, write_signing_keys, NACL_ED25519
)
from unpaddedbase64 import decode_base64
from synapse.util.stringutils import random_string_with_symbols
import os
import hashlib import hashlib
import logging import logging
import os
from signedjson.key import (
NACL_ED25519,
decode_signing_key_base64,
decode_verify_key_bytes,
generate_signing_key,
is_signing_algorithm_supported,
read_signing_keys,
write_signing_keys,
)
from unpaddedbase64 import decode_base64
from synapse.util.stringutils import random_string, random_string_with_symbols
from ._base import Config, ConfigError
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -12,17 +12,22 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from ._base import Config
from synapse.util.logcontext import LoggingContextFilter
from twisted.logger import globalLogBeginner, STDLibLogObserver
import logging import logging
import logging.config import logging.config
import yaml
from string import Template
import os import os
import signal import signal
import sys
from string import Template
import yaml
from twisted.logger import STDLibLogObserver, globalLogBeginner
import synapse
from synapse.util.logcontext import LoggingContextFilter
from synapse.util.versionstring import get_version_string
from ._base import Config
DEFAULT_LOG_CONFIG = Template(""" DEFAULT_LOG_CONFIG = Template("""
version: 1 version: 1
@ -202,6 +207,15 @@ def setup_logging(config, use_worker_options=False):
if getattr(signal, "SIGHUP"): if getattr(signal, "SIGHUP"):
signal.signal(signal.SIGHUP, sighup) signal.signal(signal.SIGHUP, sighup)
# make sure that the first thing we log is a thing we can grep backwards
# for
logging.warn("***** STARTING SERVER *****")
logging.warn(
"Server %s version %s",
sys.argv[0], get_version_string(synapse),
)
logging.info("Server hostname: %s", config.server_name)
# It's critical to point twisted's internal logging somewhere, otherwise it # It's critical to point twisted's internal logging somewhere, otherwise it
# stacks up and leaks kup to 64K object; # stacks up and leaks kup to 64K object;
# see: https://twistedmatrix.com/trac/ticket/8164 # see: https://twistedmatrix.com/trac/ticket/8164

View file

@ -13,10 +13,10 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from ._base import Config
from synapse.util.module_loader import load_module from synapse.util.module_loader import load_module
from ._base import Config
LDAP_PROVIDER = 'ldap_auth_provider.LdapAuthProvider' LDAP_PROVIDER = 'ldap_auth_provider.LdapAuthProvider'

View file

@@ -13,11 +13,11 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from ._base import Config
from distutils.util import strtobool
from synapse.util.stringutils import random_string_with_symbols
from distutils.util import strtobool
from ._base import Config
class RegistrationConfig(Config):

View file

@ -13,11 +13,11 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from ._base import Config, ConfigError
from collections import namedtuple from collections import namedtuple
from synapse.util.module_loader import load_module from synapse.util.module_loader import load_module
from ._base import Config, ConfigError
MISSING_NETADDR = ( MISSING_NETADDR = (
"Missing netaddr library. This is required for URL preview API." "Missing netaddr library. This is required for URL preview API."

View file

@@ -16,6 +16,8 @@
import logging
from synapse.http.endpoint import parse_and_validate_server_name
from ._base import Config, ConfigError
logger = logging.Logger(__name__)
@ -25,6 +27,12 @@ class ServerConfig(Config):
def read_config(self, config): def read_config(self, config):
self.server_name = config["server_name"] self.server_name = config["server_name"]
try:
parse_and_validate_server_name(self.server_name)
except ValueError as e:
raise ConfigError(str(e))
self.pid_file = self.abspath(config.get("pid_file")) self.pid_file = self.abspath(config.get("pid_file"))
self.web_client = config["web_client"] self.web_client = config["web_client"]
self.web_client_location = config.get("web_client_location", None) self.web_client_location = config.get("web_client_location", None)
@@ -162,8 +170,8 @@ class ServerConfig(Config):
})
def default_config(self, server_name, **kwargs):
if ":" in server_name:
bind_port = int(server_name.split(":")[1])
_, bind_port = parse_and_validate_server_name(server_name)
if bind_port is not None:
unsecure_port = bind_port - 400
else:
bind_port = 8448
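
The default_config hunk above now derives the ports from the validated server name rather than splitting on ":" by hand. A rough sketch of the intended behaviour, assuming parse_and_validate_server_name returns a (host, port-or-None) pair as the calling code implies; the 8008 fallback mirrors Synapse's usual unsecure-port default and is an assumption here:

    def derive_ports(server_name, parse_and_validate_server_name):
        # parse_and_validate_server_name is assumed to return (host, port or None)
        _, bind_port = parse_and_validate_server_name(server_name)
        if bind_port is not None:
            unsecure_port = bind_port - 400
        else:
            bind_port = 8448
            unsecure_port = 8008
        return bind_port, unsecure_port

    # e.g. "example.com:8448" -> (8448, 8048); "example.com" -> (8448, 8008)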

View file

@ -12,9 +12,10 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from ._base import Config
from synapse.types import UserID from synapse.types import UserID
from ._base import Config
DEFAULT_CONFIG = """\ DEFAULT_CONFIG = """\
# Server Notices room configuration # Server Notices room configuration
# #

View file

@@ -13,14 +13,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from ._base import Config
import os
import subprocess
from hashlib import sha256
from unpaddedbase64 import encode_base64
from OpenSSL import crypto
import subprocess
import os
from hashlib import sha256
from ._base import Config
from unpaddedbase64 import encode_base64
GENERATE_DH_PARAMS = False

View file

@@ -30,10 +30,10 @@ class VoipConfig(Config):
## Turn ##
# The public URIs of the TURN server to give to clients
turn_uris: []
#turn_uris: []
# The shared secret used to compute passwords for the TURN server
turn_shared_secret: "YOUR_SHARED_SECRET"
#turn_shared_secret: "YOUR_SHARED_SECRET"
# The Username and password if the TURN server needs them and
# does not use a token

View file

@ -12,12 +12,12 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import ssl
from OpenSSL import SSL, crypto
from twisted.internet._sslverify import _defaultCurveName
import logging import logging
from OpenSSL import SSL, crypto
from twisted.internet import ssl
from twisted.internet._sslverify import _defaultCurveName
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -15,16 +15,16 @@
# limitations under the License. # limitations under the License.
from synapse.api.errors import SynapseError, Codes
from synapse.events.utils import prune_event
from canonicaljson import encode_canonical_json
from unpaddedbase64 import encode_base64, decode_base64
from signedjson.sign import sign_json
import hashlib import hashlib
import logging import logging
from canonicaljson import encode_canonical_json
from signedjson.sign import sign_json
from unpaddedbase64 import decode_base64, encode_base64
from synapse.api.errors import Codes, SynapseError
from synapse.events.utils import prune_event
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,14 +13,16 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.util import logcontext
from twisted.web.http import HTTPClient
from twisted.internet.protocol import Factory
from twisted.internet import defer, reactor
from synapse.http.endpoint import matrix_federation_endpoint
import simplejson as json
import logging import logging
from canonicaljson import json
from twisted.internet import defer, reactor
from twisted.internet.protocol import Factory
from twisted.web.http import HTTPClient
from synapse.http.endpoint import matrix_federation_endpoint
from synapse.util import logcontext
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -14,9 +14,31 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import hashlib
import logging
import urllib
from collections import namedtuple
from signedjson.key import (
decode_verify_key_bytes,
encode_verify_key_base64,
is_signing_algorithm_supported,
)
from signedjson.sign import (
SignatureVerifyException,
encode_canonical_json,
sign_json,
signature_ids,
verify_signed_json,
)
from unpaddedbase64 import decode_base64, encode_base64
from OpenSSL import crypto
from twisted.internet import defer
from synapse.api.errors import Codes, SynapseError
from synapse.crypto.keyclient import fetch_server_key
from synapse.api.errors import SynapseError, Codes
from synapse.util import unwrapFirstError, logcontext
from synapse.util import logcontext, unwrapFirstError
from synapse.util.logcontext import (
PreserveLoggingContext,
preserve_fn,
@@ -24,26 +46,6 @@ from synapse.util.logcontext import (
)
from synapse.util.metrics import Measure
from twisted.internet import defer
from signedjson.sign import (
verify_signed_json, signature_ids, sign_json, encode_canonical_json,
SignatureVerifyException,
)
from signedjson.key import (
is_signing_algorithm_supported, decode_verify_key_bytes,
encode_verify_key_base64,
)
from unpaddedbase64 import decode_base64, encode_base64
from OpenSSL import crypto
from collections import namedtuple
import urllib
import hashlib
import logging
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@@ -17,11 +17,11 @@ import logging
from canonicaljson import encode_canonical_json
from signedjson.key import decode_verify_key_bytes
from signedjson.sign import verify_signed_json, SignatureVerifyException
from signedjson.sign import SignatureVerifyException, verify_signed_json
from unpaddedbase64 import decode_base64
from synapse.api.constants import EventTypes, Membership, JoinRules
from synapse.api.constants import EventTypes, JoinRules, Membership
from synapse.api.errors import AuthError, SynapseError, EventSizeError
from synapse.api.errors import AuthError, EventSizeError, SynapseError
from synapse.types import UserID, get_domain_from_id
logger = logging.getLogger(__name__)
@ -34,9 +34,11 @@ def check(event, auth_events, do_sig_check=True, do_size_check=True):
event: the event being checked. event: the event being checked.
auth_events (dict: event-key -> event): the existing room state. auth_events (dict: event-key -> event): the existing room state.
Raises:
AuthError if the checks fail
Returns:
True if the auth checks pass.
if the auth checks pass.
"""
if do_size_check:
_check_size_limits(event)
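
The hunks in this file change check() from returning a boolean to returning nothing and signalling failure via AuthError. A hedged sketch of how a caller might adapt (the module path and the event/auth_events values are placeholders for whatever the caller already has):

    import logging

    from synapse import event_auth
    from synapse.api.errors import AuthError

    logger = logging.getLogger(__name__)

    def is_event_allowed(event, auth_events):
        # check() now raises on failure instead of returning False
        try:
            event_auth.check(event, auth_events, do_sig_check=False)
        except AuthError as e:
            logger.warning("Rejecting %s: %s", event.event_id, e)
            return False
        return True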
@ -71,9 +73,10 @@ def check(event, auth_events, do_sig_check=True, do_size_check=True):
# Oh, we don't know what the state of the room was, so we # Oh, we don't know what the state of the room was, so we
# are trusting that this is allowed (at least for now) # are trusting that this is allowed (at least for now)
logger.warn("Trusting event: %s", event.event_id) logger.warn("Trusting event: %s", event.event_id)
return True
return
if event.type == EventTypes.Create: if event.type == EventTypes.Create:
sender_domain = get_domain_from_id(event.sender)
room_id_domain = get_domain_from_id(event.room_id) room_id_domain = get_domain_from_id(event.room_id)
if room_id_domain != sender_domain: if room_id_domain != sender_domain:
raise AuthError( raise AuthError(
@ -81,7 +84,8 @@ def check(event, auth_events, do_sig_check=True, do_size_check=True):
"Creation event's room_id domain does not match sender's" "Creation event's room_id domain does not match sender's"
) )
# FIXME # FIXME
return True
logger.debug("Allowing! %s", event)
return
creation_event = auth_events.get((EventTypes.Create, ""), None) creation_event = auth_events.get((EventTypes.Create, ""), None)
@ -118,7 +122,8 @@ def check(event, auth_events, do_sig_check=True, do_size_check=True):
403, 403,
"Alias event's state_key does not match sender's domain" "Alias event's state_key does not match sender's domain"
) )
return True
logger.debug("Allowing! %s", event)
return
if logger.isEnabledFor(logging.DEBUG): if logger.isEnabledFor(logging.DEBUG):
logger.debug( logger.debug(
@ -127,14 +132,9 @@ def check(event, auth_events, do_sig_check=True, do_size_check=True):
) )
if event.type == EventTypes.Member:
allowed = _is_membership_change_allowed(
event, auth_events
)
if allowed:
logger.debug("Allowing! %s", event)
else:
logger.debug("Denying! %s", event)
return allowed
_is_membership_change_allowed(event, auth_events)
logger.debug("Allowing! %s", event)
return
_check_event_sender_in_room(event, auth_events) _check_event_sender_in_room(event, auth_events)
@ -153,7 +153,8 @@ def check(event, auth_events, do_sig_check=True, do_size_check=True):
) )
) )
else: else:
return True
logger.debug("Allowing! %s", event)
return
_can_send_event(event, auth_events) _can_send_event(event, auth_events)
@ -200,7 +201,7 @@ def _is_membership_change_allowed(event, auth_events):
create = auth_events.get(key) create = auth_events.get(key)
if create and event.prev_events[0][0] == create.event_id: if create and event.prev_events[0][0] == create.event_id:
if create.content["creator"] == event.state_key: if create.content["creator"] == event.state_key:
return True
return
target_user_id = event.state_key target_user_id = event.state_key
@ -265,13 +266,13 @@ def _is_membership_change_allowed(event, auth_events):
raise AuthError( raise AuthError(
403, "%s is banned from the room" % (target_user_id,) 403, "%s is banned from the room" % (target_user_id,)
) )
return True
return
if Membership.JOIN != membership: if Membership.JOIN != membership:
if (caller_invited if (caller_invited
and Membership.LEAVE == membership and Membership.LEAVE == membership
and target_user_id == event.user_id): and target_user_id == event.user_id):
return True
return
if not caller_in_room: # caller isn't joined if not caller_in_room: # caller isn't joined
raise AuthError( raise AuthError(
@ -334,8 +335,6 @@ def _is_membership_change_allowed(event, auth_events):
else: else:
raise AuthError(500, "Unknown membership %s" % membership) raise AuthError(500, "Unknown membership %s" % membership)
return True
def _check_event_sender_in_room(event, auth_events): def _check_event_sender_in_room(event, auth_events):
key = (EventTypes.Member, event.user_id, ) key = (EventTypes.Member, event.user_id, )
@@ -355,35 +354,46 @@ def _check_joined_room(member, user_id, room_id):
))
def get_send_level(etype, state_key, auth_events):
key = (EventTypes.PowerLevels, "", )
def get_send_level(etype, state_key, power_levels_event):
"""Get the power level required to send an event of a given type
send_level_event = auth_events.get(key)
send_level = None
if send_level_event:
send_level = send_level_event.content.get("events", {}).get(
etype
)
if send_level is None:
if state_key is not None:
send_level = send_level_event.content.get(
"state_default", 50
)
else:
send_level = send_level_event.content.get(
"events_default", 0
)
if send_level:
send_level = int(send_level)
The federation spec [1] refers to this as "Required Power Level".
https://matrix.org/docs/spec/server_server/unstable.html#definitions
Args:
etype (str): type of event
state_key (str|None): state_key of state event, or None if it is not
a state event.
power_levels_event (synapse.events.EventBase|None): power levels event
in force at this point in the room
Returns:
int: power level required to send this event.
"""
if power_levels_event:
power_levels_content = power_levels_event.content
else:
send_level = 0
return send_level
power_levels_content = {}
# see if we have a custom level for this event type
send_level = power_levels_content.get("events", {}).get(etype)
# otherwise, fall back to the state_default/events_default.
if send_level is None:
if state_key is not None:
send_level = power_levels_content.get("state_default", 50)
else:
send_level = power_levels_content.get("events_default", 0)
return int(send_level)
def _can_send_event(event, auth_events):
power_levels_event = _get_power_level_event(auth_events)
send_level = get_send_level(
event.type, event.get("state_key", None), auth_events
event.type, event.get("state_key"), power_levels_event,
)
user_level = get_user_power_level(event.user_id, auth_events)
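
As a quick illustration of the rewritten get_send_level() fallback order (per-event-type override, then state_default for state events, else events_default), here is a standalone sketch using a made-up power-levels content dict; it mirrors the logic above rather than calling the real function:

    def send_level_for(power_levels_content, etype, state_key=None):
        # per-event-type override first
        send_level = power_levels_content.get("events", {}).get(etype)
        if send_level is None:
            # state events default to 50, other events to 0
            if state_key is not None:
                send_level = power_levels_content.get("state_default", 50)
            else:
                send_level = power_levels_content.get("events_default", 0)
        return int(send_level)

    content = {"events": {"m.room.name": 50}, "events_default": 0, "state_default": 50}
    print(send_level_for(content, "m.room.message"))    # 0
    print(send_level_for(content, "m.room.name", ""))   # 50 (explicit override)
    print(send_level_for(content, "m.room.topic", ""))  # 50 (state_default)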
@ -515,7 +525,11 @@ def _check_power_levels(event, auth_events):
"to your own" "to your own"
) )
if old_level > user_level or new_level > user_level:
# Check if the old and new levels are greater than the user level
# (if defined)
old_level_too_big = old_level is not None and old_level > user_level
new_level_too_big = new_level is not None and new_level > user_level
if old_level_too_big or new_level_too_big:
raise AuthError( raise AuthError(
403, 403,
"You don't have permission to add ops level greater " "You don't have permission to add ops level greater "
@ -524,13 +538,22 @@ def _check_power_levels(event, auth_events):
def _get_power_level_event(auth_events): def _get_power_level_event(auth_events):
key = (EventTypes.PowerLevels, "", )
return auth_events.get(key)
return auth_events.get((EventTypes.PowerLevels, ""))
def get_user_power_level(user_id, auth_events):
power_level_event = _get_power_level_event(auth_events)
"""Get a user's power level
Args:
user_id (str): user's id to look up in power_levels
auth_events (dict[(str, str), synapse.events.EventBase]):
state in force at this point in the room (or rather, a subset of
it including at least the create event and power levels event.
Returns:
int: the user's power level in this room.
"""
power_level_event = _get_power_level_event(auth_events)
if power_level_event: if power_level_event:
level = power_level_event.content.get("users", {}).get(user_id) level = power_level_event.content.get("users", {}).get(user_id)
if not level: if not level:
@ -541,6 +564,11 @@ def get_user_power_level(user_id, auth_events):
else: else:
return int(level) return int(level)
else: else:
# if there is no power levels event, the creator gets 100 and everyone
# else gets 0.
# some things which call this don't pass the create event: hack around
# that.
key = (EventTypes.Create, "", )
create_event = auth_events.get(key)
if (create_event is not None and
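
The new get_user_power_level docstring's fallback (no power-levels event: the room creator gets 100, everyone else 0) can be illustrated with a minimal stand-in for the events involved; the namedtuple, IDs and the users_default lookup below are invented for the example and only mirror the documented behaviour:

    from collections import namedtuple

    FakeEvent = namedtuple("FakeEvent", ["content"])

    auth_events = {
        ("m.room.create", ""): FakeEvent(content={"creator": "@alice:example.com"}),
        # note: no ("m.room.power_levels", "") entry
    }

    def user_power_level(user_id, auth_events):
        power_levels = auth_events.get(("m.room.power_levels", ""))
        if power_levels is None:
            create = auth_events.get(("m.room.create", ""))
            if create is not None and create.content["creator"] == user_id:
                return 100
            return 0
        # assumed lookup when a power-levels event does exist
        level = power_levels.content.get("users", {}).get(user_id)
        if level is None:
            level = power_levels.content.get("users_default", 0)
        return int(level)

    print(user_power_level("@alice:example.com", auth_events))  # 100
    print(user_power_level("@bob:example.com", auth_events))    # 0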

View file

@ -13,9 +13,8 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.util.frozenutils import freeze
from synapse.util.caches import intern_dict from synapse.util.caches import intern_dict
from synapse.util.frozenutils import freeze
# Whether we should use frozen_dict in FrozenEvent. Using frozen_dicts prevents # Whether we should use frozen_dict in FrozenEvent. Using frozen_dicts prevents
# bugs where we accidentally share e.g. signature dicts. However, converting # bugs where we accidentally share e.g. signature dicts. However, converting

View file

@@ -13,13 +13,12 @@
# See the License for the specific language governing permissions and
# limitations under the License.
from . import EventBase, FrozenEvent, _event_dict_property
import copy
from synapse.types import EventID
from synapse.util.stringutils import random_string
import copy
from . import EventBase, FrozenEvent, _event_dict_property
class EventBuilder(EventBase):

View file

@ -13,10 +13,10 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer
from frozendict import frozendict from frozendict import frozendict
from twisted.internet import defer
class EventContext(object): class EventContext(object):
""" """

View file

@ -13,15 +13,16 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.api.constants import EventTypes
from . import EventBase
from frozendict import frozendict
import re import re
from six import string_types from six import string_types
from frozendict import frozendict
from synapse.api.constants import EventTypes
from . import EventBase
# Split strings on "." but not "\." This uses a negative lookbehind assertion for '\' # Split strings on "." but not "\." This uses a negative lookbehind assertion for '\'
# (?<!stuff) matches if the current position in the string is not preceded # (?<!stuff) matches if the current position in the string is not preceded
# by a match for 'stuff'. # by a match for 'stuff'.

View file

@ -13,12 +13,12 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.types import EventID, RoomID, UserID
from synapse.api.errors import SynapseError
from synapse.api.constants import EventTypes, Membership
from six import string_types from six import string_types
from synapse.api.constants import EventTypes, Membership
from synapse.api.errors import SynapseError
from synapse.types import EventID, RoomID, UserID
class EventValidator(object): class EventValidator(object):

View file

@ -16,14 +16,15 @@ import logging
import six import six
from twisted.internet import defer
from synapse.api.constants import MAX_DEPTH from synapse.api.constants import MAX_DEPTH
from synapse.api.errors import SynapseError, Codes
from synapse.api.errors import Codes, SynapseError
from synapse.crypto.event_signing import check_event_content_hash from synapse.crypto.event_signing import check_event_content_hash
from synapse.events import FrozenEvent from synapse.events import FrozenEvent
from synapse.events.utils import prune_event from synapse.events.utils import prune_event
from synapse.http.servlet import assert_params_in_request
from synapse.util import unwrapFirstError, logcontext
from twisted.internet import defer
from synapse.http.servlet import assert_params_in_dict
from synapse.util import logcontext, unwrapFirstError
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -198,7 +199,7 @@ def event_from_pdu_json(pdu_json, outlier=False):
""" """
# we could probably enforce a bunch of other fields here (room_id, sender, # we could probably enforce a bunch of other fields here (room_id, sender,
# origin, etc etc) # origin, etc etc)
assert_params_in_request(pdu_json, ('event_id', 'type', 'depth'))
assert_params_in_dict(pdu_json, ('event_id', 'type', 'depth'))
depth = pdu_json['depth'] depth = pdu_json['depth']
if not isinstance(depth, six.integer_types): if not isinstance(depth, six.integer_types):

View file

@ -21,25 +21,25 @@ import random
from six.moves import range from six.moves import range
from prometheus_client import Counter
from twisted.internet import defer from twisted.internet import defer
from synapse.api.constants import Membership from synapse.api.constants import Membership
from synapse.api.errors import (
CodeMessageException, HttpResponseException, SynapseError, FederationDeniedError
CodeMessageException,
FederationDeniedError,
HttpResponseException,
SynapseError,
) )
from synapse.events import builder from synapse.events import builder
from synapse.federation.federation_base import (
FederationBase,
event_from_pdu_json,
)
from synapse.federation.federation_base import FederationBase, event_from_pdu_json
from synapse.util import logcontext, unwrapFirstError from synapse.util import logcontext, unwrapFirstError
from synapse.util.caches.expiringcache import ExpiringCache from synapse.util.caches.expiringcache import ExpiringCache
from synapse.util.logcontext import make_deferred_yieldable, run_in_background from synapse.util.logcontext import make_deferred_yieldable, run_in_background
from synapse.util.logutils import log_function from synapse.util.logutils import log_function
from synapse.util.retryutils import NotRetryingDestination from synapse.util.retryutils import NotRetryingDestination
from prometheus_client import Counter
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
sent_queries_counter = Counter("synapse_federation_client_sent_queries", "", ["type"]) sent_queries_counter = Counter("synapse_federation_client_sent_queries", "", ["type"])

View file

@ -14,28 +14,29 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import logging import logging
import re
import six
from six import iteritems
from canonicaljson import json
from prometheus_client import Counter
import simplejson as json
from twisted.internet import defer from twisted.internet import defer
from twisted.internet.abstract import isIPAddress
from synapse.api.errors import AuthError, FederationError, SynapseError, NotFoundError
from synapse.api.constants import EventTypes
from synapse.api.errors import AuthError, FederationError, NotFoundError, SynapseError
from synapse.crypto.event_signing import compute_event_signature
from synapse.federation.federation_base import (
FederationBase,
event_from_pdu_json,
)
from synapse.federation.federation_base import FederationBase, event_from_pdu_json
from synapse.federation.persistence import TransactionActions from synapse.federation.persistence import TransactionActions
from synapse.federation.units import Edu, Transaction from synapse.federation.units import Edu, Transaction
from synapse.http.endpoint import parse_server_name
from synapse.types import get_domain_from_id from synapse.types import get_domain_from_id
from synapse.util import async from synapse.util import async
from synapse.util.caches.response_cache import ResponseCache from synapse.util.caches.response_cache import ResponseCache
from synapse.util.logutils import log_function from synapse.util.logutils import log_function
from prometheus_client import Counter
from six import iteritems
# when processing incoming transactions, we try to handle multiple rooms in # when processing incoming transactions, we try to handle multiple rooms in
# parallel, up to this limit. # parallel, up to this limit.
TRANSACTION_CONCURRENCY_LIMIT = 10 TRANSACTION_CONCURRENCY_LIMIT = 10
@ -74,6 +75,9 @@ class FederationServer(FederationBase):
@log_function @log_function
def on_backfill_request(self, origin, room_id, versions, limit): def on_backfill_request(self, origin, room_id, versions, limit):
with (yield self._server_linearizer.queue((origin, room_id))): with (yield self._server_linearizer.queue((origin, room_id))):
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
pdus = yield self.handler.on_backfill_request( pdus = yield self.handler.on_backfill_request(
origin, room_id, versions, limit origin, room_id, versions, limit
) )
@ -134,6 +138,8 @@ class FederationServer(FederationBase):
received_pdus_counter.inc(len(transaction.pdus)) received_pdus_counter.inc(len(transaction.pdus))
origin_host, _ = parse_server_name(transaction.origin)
pdus_by_room = {} pdus_by_room = {}
for p in transaction.pdus: for p in transaction.pdus:
@ -154,9 +160,21 @@ class FederationServer(FederationBase):
# we can process different rooms in parallel (which is useful if they # we can process different rooms in parallel (which is useful if they
# require callouts to other servers to fetch missing events), but # require callouts to other servers to fetch missing events), but
# impose a limit to avoid going too crazy with ram/cpu. # impose a limit to avoid going too crazy with ram/cpu.
@defer.inlineCallbacks @defer.inlineCallbacks
def process_pdus_for_room(room_id): def process_pdus_for_room(room_id):
logger.debug("Processing PDUs for %s", room_id) logger.debug("Processing PDUs for %s", room_id)
try:
yield self.check_server_matches_acl(origin_host, room_id)
except AuthError as e:
logger.warn(
"Ignoring PDUs for room %s from banned server", room_id,
)
for pdu in pdus_by_room[room_id]:
event_id = pdu.event_id
pdu_results[event_id] = e.error_dict()
return
for pdu in pdus_by_room[room_id]: for pdu in pdus_by_room[room_id]:
event_id = pdu.event_id event_id = pdu.event_id
try: try:
@ -211,6 +229,9 @@ class FederationServer(FederationBase):
if not event_id: if not event_id:
raise NotImplementedError("Specify an event") raise NotImplementedError("Specify an event")
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
in_room = yield self.auth.check_host_in_room(room_id, origin) in_room = yield self.auth.check_host_in_room(room_id, origin)
if not in_room: if not in_room:
raise AuthError(403, "Host not in room.") raise AuthError(403, "Host not in room.")
@ -234,6 +255,9 @@ class FederationServer(FederationBase):
if not event_id: if not event_id:
raise NotImplementedError("Specify an event") raise NotImplementedError("Specify an event")
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
in_room = yield self.auth.check_host_in_room(room_id, origin) in_room = yield self.auth.check_host_in_room(room_id, origin)
if not in_room: if not in_room:
raise AuthError(403, "Host not in room.") raise AuthError(403, "Host not in room.")
@@ -277,7 +301,7 @@
@defer.inlineCallbacks
@log_function
def on_pdu_request(self, origin, event_id):
pdu = yield self._get_persisted_pdu(origin, event_id)
pdu = yield self.handler.get_persisted_pdu(origin, event_id)
if pdu:
defer.returnValue(
@@ -298,7 +322,9 @@
defer.returnValue((200, resp))
@defer.inlineCallbacks
def on_make_join_request(self, room_id, user_id):
def on_make_join_request(self, origin, room_id, user_id):
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
pdu = yield self.handler.on_make_join_request(room_id, user_id)
time_now = self._clock.time_msec()
defer.returnValue({"event": pdu.get_pdu_json(time_now)})
@ -306,6 +332,8 @@ class FederationServer(FederationBase):
@defer.inlineCallbacks @defer.inlineCallbacks
def on_invite_request(self, origin, content): def on_invite_request(self, origin, content):
pdu = event_from_pdu_json(content) pdu = event_from_pdu_json(content)
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, pdu.room_id)
ret_pdu = yield self.handler.on_invite_request(origin, pdu) ret_pdu = yield self.handler.on_invite_request(origin, pdu)
time_now = self._clock.time_msec() time_now = self._clock.time_msec()
defer.returnValue((200, {"event": ret_pdu.get_pdu_json(time_now)})) defer.returnValue((200, {"event": ret_pdu.get_pdu_json(time_now)}))
@ -314,6 +342,10 @@ class FederationServer(FederationBase):
def on_send_join_request(self, origin, content): def on_send_join_request(self, origin, content):
logger.debug("on_send_join_request: content: %s", content) logger.debug("on_send_join_request: content: %s", content)
pdu = event_from_pdu_json(content) pdu = event_from_pdu_json(content)
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, pdu.room_id)
logger.debug("on_send_join_request: pdu sigs: %s", pdu.signatures) logger.debug("on_send_join_request: pdu sigs: %s", pdu.signatures)
res_pdus = yield self.handler.on_send_join_request(origin, pdu) res_pdus = yield self.handler.on_send_join_request(origin, pdu)
time_now = self._clock.time_msec() time_now = self._clock.time_msec()
@ -325,7 +357,9 @@ class FederationServer(FederationBase):
})) }))
@defer.inlineCallbacks @defer.inlineCallbacks
def on_make_leave_request(self, room_id, user_id):
def on_make_leave_request(self, origin, room_id, user_id):
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
pdu = yield self.handler.on_make_leave_request(room_id, user_id) pdu = yield self.handler.on_make_leave_request(room_id, user_id)
time_now = self._clock.time_msec() time_now = self._clock.time_msec()
defer.returnValue({"event": pdu.get_pdu_json(time_now)}) defer.returnValue({"event": pdu.get_pdu_json(time_now)})
@ -334,6 +368,10 @@ class FederationServer(FederationBase):
def on_send_leave_request(self, origin, content): def on_send_leave_request(self, origin, content):
logger.debug("on_send_leave_request: content: %s", content) logger.debug("on_send_leave_request: content: %s", content)
pdu = event_from_pdu_json(content) pdu = event_from_pdu_json(content)
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, pdu.room_id)
logger.debug("on_send_leave_request: pdu sigs: %s", pdu.signatures) logger.debug("on_send_leave_request: pdu sigs: %s", pdu.signatures)
yield self.handler.on_send_leave_request(origin, pdu) yield self.handler.on_send_leave_request(origin, pdu)
defer.returnValue((200, {})) defer.returnValue((200, {}))
@ -341,6 +379,9 @@ class FederationServer(FederationBase):
@defer.inlineCallbacks @defer.inlineCallbacks
def on_event_auth(self, origin, room_id, event_id): def on_event_auth(self, origin, room_id, event_id):
with (yield self._server_linearizer.queue((origin, room_id))): with (yield self._server_linearizer.queue((origin, room_id))):
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
time_now = self._clock.time_msec() time_now = self._clock.time_msec()
auth_pdus = yield self.handler.on_event_auth(event_id) auth_pdus = yield self.handler.on_event_auth(event_id)
res = { res = {
@ -369,6 +410,9 @@ class FederationServer(FederationBase):
Deferred: Results in `dict` with the same format as `content` Deferred: Results in `dict` with the same format as `content`
""" """
with (yield self._server_linearizer.queue((origin, room_id))): with (yield self._server_linearizer.queue((origin, room_id))):
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
auth_chain = [ auth_chain = [
event_from_pdu_json(e) event_from_pdu_json(e)
for e in content["auth_chain"] for e in content["auth_chain"]
@ -442,6 +486,9 @@ class FederationServer(FederationBase):
def on_get_missing_events(self, origin, room_id, earliest_events, def on_get_missing_events(self, origin, room_id, earliest_events,
latest_events, limit, min_depth): latest_events, limit, min_depth):
with (yield self._server_linearizer.queue((origin, room_id))): with (yield self._server_linearizer.queue((origin, room_id))):
origin_host, _ = parse_server_name(origin)
yield self.check_server_matches_acl(origin_host, room_id)
logger.info( logger.info(
"on_get_missing_events: earliest_events: %r, latest_events: %r," "on_get_missing_events: earliest_events: %r, latest_events: %r,"
" limit: %d, min_depth: %d", " limit: %d, min_depth: %d",
@ -470,17 +517,6 @@ class FederationServer(FederationBase):
ts_now_ms = self._clock.time_msec() ts_now_ms = self._clock.time_msec()
return self.store.get_user_id_for_open_id_token(token, ts_now_ms) return self.store.get_user_id_for_open_id_token(token, ts_now_ms)
@log_function
def _get_persisted_pdu(self, origin, event_id, do_auth=True):
""" Get a PDU from the database with given origin and id.
Returns:
Deferred: Results in a `Pdu`.
"""
return self.handler.get_persisted_pdu(
origin, event_id, do_auth=do_auth
)
def _transaction_from_pdus(self, pdu_list): def _transaction_from_pdus(self, pdu_list):
"""Returns a new Transaction containing the given PDUs suitable for """Returns a new Transaction containing the given PDUs suitable for
transmission. transmission.
@ -560,7 +596,9 @@ class FederationServer(FederationBase):
affected=pdu.event_id, affected=pdu.event_id,
) )
yield self.handler.on_receive_pdu(origin, pdu, get_missing=True)
yield self.handler.on_receive_pdu(
origin, pdu, get_missing=True, sent_to_us_directly=True,
)
def __str__(self): def __str__(self):
return "<ReplicationLayer(%s)>" % self.server_name return "<ReplicationLayer(%s)>" % self.server_name
@ -588,6 +626,101 @@ class FederationServer(FederationBase):
) )
defer.returnValue(ret) defer.returnValue(ret)
@defer.inlineCallbacks
def check_server_matches_acl(self, server_name, room_id):
"""Check if the given server is allowed by the server ACLs in the room
Args:
server_name (str): name of server, *without any port part*
room_id (str): ID of the room to check
Raises:
AuthError if the server does not match the ACL
"""
state_ids = yield self.store.get_current_state_ids(room_id)
acl_event_id = state_ids.get((EventTypes.ServerACL, ""))
if not acl_event_id:
return
acl_event = yield self.store.get_event(acl_event_id)
if server_matches_acl_event(server_name, acl_event):
return
raise AuthError(code=403, msg="Server is banned from room")
def server_matches_acl_event(server_name, acl_event):
"""Check if the given server is allowed by the ACL event
Args:
server_name (str): name of server, without any port part
acl_event (EventBase): m.room.server_acl event
Returns:
bool: True if this server is allowed by the ACLs
"""
logger.debug("Checking %s against acl %s", server_name, acl_event.content)
# first of all, check if literal IPs are blocked, and if so, whether the
# server name is a literal IP
allow_ip_literals = acl_event.content.get("allow_ip_literals", True)
if not isinstance(allow_ip_literals, bool):
logger.warn("Ignorning non-bool allow_ip_literals flag")
allow_ip_literals = True
if not allow_ip_literals:
# check for ipv6 literals. These start with '['.
if server_name[0] == '[':
return False
# check for ipv4 literals. We can just lift the routine from twisted.
if isIPAddress(server_name):
return False
# next, check the deny list
deny = acl_event.content.get("deny", [])
if not isinstance(deny, (list, tuple)):
logger.warn("Ignorning non-list deny ACL %s", deny)
deny = []
for e in deny:
if _acl_entry_matches(server_name, e):
# logger.info("%s matched deny rule %s", server_name, e)
return False
# then the allow list.
allow = acl_event.content.get("allow", [])
if not isinstance(allow, (list, tuple)):
logger.warn("Ignorning non-list allow ACL %s", allow)
allow = []
for e in allow:
if _acl_entry_matches(server_name, e):
# logger.info("%s matched allow rule %s", server_name, e)
return True
# everything else should be rejected.
# logger.info("%s fell through", server_name)
return False
def _acl_entry_matches(server_name, acl_entry):
if not isinstance(acl_entry, six.string_types):
logger.warn("Ignoring non-str ACL entry '%s' (is %s)", acl_entry, type(acl_entry))
return False
regex = _glob_to_regex(acl_entry)
return regex.match(server_name)
def _glob_to_regex(glob):
res = ''
for c in glob:
if c == '*':
res = res + '.*'
elif c == '?':
res = res + '.'
else:
res = res + re.escape(c)
return re.compile(res + "\\Z", re.IGNORECASE)
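
Since server_matches_acl_event and _glob_to_regex are pure functions, their behaviour is easy to demonstrate from within this module. The event below is a stand-in object carrying only a content attribute, and all server names and globs are invented for the example:

    from collections import namedtuple

    FakeAclEvent = namedtuple("FakeAclEvent", ["content"])

    acl_event = FakeAclEvent(content={
        "allow_ip_literals": False,
        "deny": ["*.evil.example.com"],
        "allow": ["*"],
    })

    print(server_matches_acl_event("good.example.org", acl_event))       # True (allow *)
    print(server_matches_acl_event("spam.evil.example.com", acl_event))  # False (deny glob)
    print(server_matches_acl_event("1.2.3.4", acl_event))                # False (IP literal)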
class FederationHandlerRegistry(object):
"""Allows classes to register themselves as handlers for a given EDU or

View file

@@ -19,13 +19,12 @@ package.
These actions are mostly only used by the :py:mod:`.replication` module.
"""
import logging
from twisted.internet import defer
from synapse.util.logutils import log_function
import logging
logger = logging.getLogger(__name__)

View file

@ -29,18 +29,18 @@ dead worker doesn't cause the queues to grow limitlessly.
Events are replicated via a separate events stream. Events are replicated via a separate events stream.
""" """
from .units import Edu
import logging
from synapse.storage.presence import UserPresenceState
from synapse.util.metrics import Measure
from synapse.metrics import LaterGauge
from sortedcontainers import SortedDict
from collections import namedtuple
import logging
from six import iteritems, itervalues
from six import itervalues, iteritems
from sortedcontainers import SortedDict
from synapse.metrics import LaterGauge
from synapse.storage.presence import UserPresenceState
from synapse.util.metrics import Measure
from .units import Edu
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,37 +13,38 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import datetime import datetime
import logging
from twisted.internet import defer
from .persistence import TransactionActions
from .units import Transaction, Edu
from synapse.api.errors import HttpResponseException, FederationDeniedError
from synapse.util import logcontext, PreserveLoggingContext
from synapse.util.async import run_on_reactor
from synapse.util.retryutils import NotRetryingDestination, get_retry_limiter
from synapse.util.metrics import measure_func
from synapse.handlers.presence import format_user_presence_state, get_interested_remotes
import synapse.metrics
from synapse.metrics import LaterGauge
from synapse.metrics import (
sent_edus_counter,
sent_transactions_counter,
events_processed_counter,
)
from prometheus_client import Counter
from six import itervalues
import logging
from prometheus_client import Counter
from twisted.internet import defer
import synapse.metrics
from synapse.api.errors import FederationDeniedError, HttpResponseException
from synapse.handlers.presence import format_user_presence_state, get_interested_remotes
from synapse.metrics import (
LaterGauge,
events_processed_counter,
sent_edus_counter,
sent_transactions_counter,
)
from synapse.metrics.background_process_metrics import run_as_background_process
from synapse.util import logcontext
from synapse.util.metrics import measure_func
from synapse.util.retryutils import NotRetryingDestination, get_retry_limiter
from .persistence import TransactionActions
from .units import Edu, Transaction
logger = logging.getLogger(__name__)
sent_pdus_destination_dist = Counter(
"synapse_federation_transaction_queue_sent_pdu_destinations", ""
sent_pdus_destination_dist_count = Counter(
"synapse_federation_client_sent_pdu_destinations:count", ""
)
sent_pdus_destination_dist_total = Counter(
"synapse_federation_client_sent_pdu_destinations:total", ""
)
@@ -165,10 +166,11 @@ class TransactionQueue(object):
if self._is_processing:
return
# fire off a processing loop in the background. It's likely it will
# outlast the current request, so run it in the sentinel logcontext.
with PreserveLoggingContext():
self._process_event_queue_loop()
# fire off a processing loop in the background
run_as_background_process(
"process_event_queue_for_federation",
self._process_event_queue_loop,
)
@defer.inlineCallbacks
def _process_event_queue_loop(self):
@@ -280,7 +282,8 @@ class TransactionQueue(object):
if not destinations:
return
sent_pdus_destination_dist.inc(len(destinations))
sent_pdus_destination_dist_total.inc(len(destinations))
sent_pdus_destination_dist_count.inc()
for destination in destinations:
self.pending_pdus_by_dest.setdefault(destination, []).append(
@@ -431,14 +434,11 @@ class TransactionQueue(object):
logger.debug("TX [%s] Starting transaction loop", destination)
# Drop the logcontext before starting the transaction. It doesn't
# really make sense to log all the outbound transactions against
# whatever path led us to this point: that's pretty arbitrary really.
#
# (this also means we can fire off _perform_transaction without
# yielding)
with logcontext.PreserveLoggingContext():
self._transaction_transmission_loop(destination)
run_as_background_process(
"federation_transaction_transmission_loop",
self._transaction_transmission_loop,
destination,
)
@defer.inlineCallbacks
def _transaction_transmission_loop(self, destination):
@@ -451,9 +451,6 @@ class TransactionQueue(object):
# hence why we throw the result away.
yield get_retry_limiter(destination, self.clock, self.store)
# XXX: what's this for?
yield run_on_reactor()
pending_pdus = []
while True:
device_message_edus, device_stream_id, dev_list_id = (
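
Both rewritten call sites above swap a bare PreserveLoggingContext block for run_as_background_process, which as used here takes a descriptive name followed by the callable and its positional arguments. A hedged sketch of the call shape, mirroring the usage in this diff rather than documenting the helper's full API (queue stands in for a TransactionQueue instance):

    from synapse.metrics.background_process_metrics import run_as_background_process

    def start_transmission_loop(queue, destination):
        # fire-and-forget: the loop runs under its own background logcontext
        run_as_background_process(
            "federation_transaction_transmission_loop",
            queue._transaction_transmission_loop,
            destination,
        )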

View file

@ -14,15 +14,14 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer
from synapse.api.constants import Membership
from synapse.api.urls import FEDERATION_PREFIX as PREFIX
from synapse.util.logutils import log_function
import logging import logging
import urllib import urllib
from twisted.internet import defer
from synapse.api.constants import Membership
from synapse.api.urls import FEDERATION_PREFIX as PREFIX
from synapse.util.logutils import log_function
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -14,25 +14,27 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer
from synapse.api.urls import FEDERATION_PREFIX as PREFIX
from synapse.api.errors import Codes, SynapseError, FederationDeniedError
from synapse.http.server import JsonResource
from synapse.http.servlet import (
parse_json_object_from_request, parse_integer_from_args, parse_string_from_args,
parse_boolean_from_args,
)
from synapse.util.ratelimitutils import FederationRateLimiter
from synapse.util.versionstring import get_version_string
from synapse.util.logcontext import run_in_background
from synapse.types import ThirdPartyInstanceID, get_domain_from_id
import functools import functools
import logging import logging
import re import re
import synapse
from twisted.internet import defer
import synapse
from synapse.api.errors import Codes, FederationDeniedError, SynapseError
from synapse.api.urls import FEDERATION_PREFIX as PREFIX
from synapse.http.endpoint import parse_and_validate_server_name
from synapse.http.server import JsonResource
from synapse.http.servlet import (
parse_boolean_from_args,
parse_integer_from_args,
parse_json_object_from_request,
parse_string_from_args,
)
from synapse.types import ThirdPartyInstanceID, get_domain_from_id
from synapse.util.logcontext import run_in_background
from synapse.util.ratelimitutils import FederationRateLimiter
from synapse.util.versionstring import get_version_string
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -99,26 +101,6 @@ class Authenticator(object):
origin = None origin = None
def parse_auth_header(header_str):
try:
params = auth.split(" ")[1].split(",")
param_dict = dict(kv.split("=") for kv in params)
def strip_quotes(value):
if value.startswith("\""):
return value[1:-1]
else:
return value
origin = strip_quotes(param_dict["origin"])
key = strip_quotes(param_dict["key"])
sig = strip_quotes(param_dict["sig"])
return (origin, key, sig)
except Exception:
raise AuthenticationError(
400, "Malformed Authorization header", Codes.UNAUTHORIZED
)
auth_headers = request.requestHeaders.getRawHeaders(b"Authorization")
if not auth_headers:
@@ -127,8 +109,8 @@ class Authenticator(object):
)
for auth in auth_headers:
if auth.startswith("X-Matrix"):
(origin, key, sig) = parse_auth_header(auth)
if auth.startswith(b"X-Matrix"):
(origin, key, sig) = _parse_auth_header(auth)
json_request["origin"] = origin
json_request["signatures"].setdefault(origin, {})[key] = sig
@@ -165,6 +147,48 @@ class Authenticator(object):
logger.exception("Error resetting retry timings on %s", origin)
def _parse_auth_header(header_bytes):
"""Parse an X-Matrix auth header
Args:
header_bytes (bytes): header value
Returns:
Tuple[str, str, str]: origin, key id, signature.
Raises:
AuthenticationError if the header could not be parsed
"""
try:
header_str = header_bytes.decode('utf-8')
params = header_str.split(" ")[1].split(",")
param_dict = dict(kv.split("=") for kv in params)
def strip_quotes(value):
if value.startswith(b"\""):
return value[1:-1]
else:
return value
origin = strip_quotes(param_dict["origin"])
# ensure that the origin is a valid server name
parse_and_validate_server_name(origin)
key = strip_quotes(param_dict["key"])
sig = strip_quotes(param_dict["sig"])
return origin, key, sig
except Exception as e:
logger.warn(
"Error parsing auth header '%s': %s",
header_bytes.decode('ascii', 'replace'),
e,
)
raise AuthenticationError(
400, "Malformed Authorization header", Codes.UNAUTHORIZED,
)
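For reference, here is a standalone sketch (not part of this commit) of the parsing that the new `_parse_auth_header` helper performs; the name `parse_x_matrix_header` and the example header value are illustrative, and the real helper's server-name validation and error handling are omitted.

def parse_x_matrix_header(header_bytes):
    """Return (origin, key_id, signature) from a header such as
    b'X-Matrix origin=example.org,key="ed25519:1",sig="dGVzdA"'."""
    header_str = header_bytes.decode("utf-8")
    params = header_str.split(" ")[1].split(",")
    param_dict = dict(kv.split("=", 1) for kv in params)

    def strip_quotes(value):
        return value[1:-1] if value.startswith('"') else value

    return (
        strip_quotes(param_dict["origin"]),
        strip_quotes(param_dict["key"]),
        strip_quotes(param_dict["sig"]),
    )


assert parse_x_matrix_header(
    b'X-Matrix origin=example.org,key="ed25519:1",sig="dGVzdA"'
) == ("example.org", "ed25519:1", "dGVzdA")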
class BaseFederationServlet(object): class BaseFederationServlet(object):
REQUIRE_AUTH = True REQUIRE_AUTH = True
@ -362,7 +386,9 @@ class FederationMakeJoinServlet(BaseFederationServlet):
@defer.inlineCallbacks @defer.inlineCallbacks
def on_GET(self, origin, content, query, context, user_id): def on_GET(self, origin, content, query, context, user_id):
content = yield self.handler.on_make_join_request(context, user_id) content = yield self.handler.on_make_join_request(
origin, context, user_id,
)
defer.returnValue((200, content)) defer.returnValue((200, content))
@ -371,7 +397,9 @@ class FederationMakeLeaveServlet(BaseFederationServlet):
@defer.inlineCallbacks @defer.inlineCallbacks
def on_GET(self, origin, content, query, context, user_id): def on_GET(self, origin, content, query, context, user_id):
content = yield self.handler.on_make_leave_request(context, user_id) content = yield self.handler.on_make_leave_request(
origin, context, user_id,
)
defer.returnValue((200, content)) defer.returnValue((200, content))

View file

@ -17,10 +17,9 @@
server protocol. server protocol.
""" """
from synapse.util.jsonobject import JsonEncodedObject
import logging import logging
from synapse.util.jsonobject import JsonEncodedObject
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -23,9 +23,9 @@ If a user leaves (or gets kicked out of) a group, either side can still use
their attestation to "prove" their membership, until the attestation expires. their attestation to "prove" their membership, until the attestation expires.
Therefore attestations shouldn't be relied on to prove membership in important Therefore attestations shouldn't be relied on to prove membership in important
cases, but can for less important situtations, e.g. showing a users membership cases, but can for less important situtations, e.g. showing a users membership
of groups on their profile, showing flairs, etc.abs of groups on their profile, showing flairs, etc.
An attestsation is a signed blob of json that looks like: An attestation is a signed blob of json that looks like:
{ {
"user_id": "@foo:a.example.com", "user_id": "@foo:a.example.com",
@ -38,15 +38,14 @@ An attestsation is a signed blob of json that looks like:
import logging import logging
import random import random
from signedjson.sign import sign_json
from twisted.internet import defer from twisted.internet import defer
from synapse.api.errors import SynapseError from synapse.api.errors import SynapseError
from synapse.types import get_domain_from_id from synapse.types import get_domain_from_id
from synapse.util.logcontext import run_in_background from synapse.util.logcontext import run_in_background
from signedjson.sign import sign_json
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
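To make the module docstring above concrete, a small sketch (not part of this commit) of signing an attestation-shaped blob with signedjson; the ids and the freshly generated throwaway key are examples only.

from signedjson.key import generate_signing_key
from signedjson.sign import sign_json

signing_key = generate_signing_key("1")  # throwaway ed25519 key, version "1"
attestation = {
    "user_id": "@foo:a.example.com",
    "group_id": "+bar:b.example.com",
    "valid_until_ms": 1507994728530,
}
signed = sign_json(attestation, "b.example.com", signing_key)
print(list(signed["signatures"]))  # ['b.example.com']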

View file

@ -16,11 +16,12 @@
import logging import logging
from synapse.api.errors import SynapseError from six import string_types
from synapse.types import GroupID, RoomID, UserID, get_domain_from_id
from twisted.internet import defer from twisted.internet import defer
from six import string_types from synapse.api.errors import SynapseError
from synapse.types import GroupID, RoomID, UserID, get_domain_from_id
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,13 +13,13 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from .admin import AdminHandler
from .directory import DirectoryHandler
from .federation import FederationHandler
from .identity import IdentityHandler
from .message import MessageHandler
from .register import RegistrationHandler from .register import RegistrationHandler
from .room import RoomContextHandler from .room import RoomContextHandler
from .message import MessageHandler
from .federation import FederationHandler
from .directory import DirectoryHandler
from .admin import AdminHandler
from .identity import IdentityHandler
from .search import SearchHandler from .search import SearchHandler

View file

@ -18,11 +18,10 @@ import logging
from twisted.internet import defer from twisted.internet import defer
import synapse.types import synapse.types
from synapse.api.constants import Membership, EventTypes from synapse.api.constants import EventTypes, Membership
from synapse.api.errors import LimitExceededError from synapse.api.errors import LimitExceededError
from synapse.types import UserID from synapse.types import UserID
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,12 +13,12 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import logging
from twisted.internet import defer from twisted.internet import defer
from ._base import BaseHandler from ._base import BaseHandler
import logging
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,19 +13,18 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer import logging
from six import itervalues from six import itervalues
import synapse
from synapse.api.constants import EventTypes
from synapse.util.metrics import Measure
from synapse.util.logcontext import (
make_deferred_yieldable, run_in_background,
)
from prometheus_client import Counter from prometheus_client import Counter
import logging from twisted.internet import defer
import synapse
from synapse.api.constants import EventTypes
from synapse.util.logcontext import make_deferred_yieldable, run_in_background
from synapse.util.metrics import Measure
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,29 +13,33 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer, threads
from ._base import BaseHandler import logging
import attr
import bcrypt
import pymacaroons
from canonicaljson import json
from twisted.internet import defer, threads
from twisted.web.client import PartialDownloadError
import synapse.util.stringutils as stringutils
from synapse.api.constants import LoginType from synapse.api.constants import LoginType
from synapse.api.errors import ( from synapse.api.errors import (
AuthError, Codes, InteractiveAuthIncompleteError, LoginError, StoreError, AuthError,
Codes,
InteractiveAuthIncompleteError,
LoginError,
StoreError,
SynapseError, SynapseError,
) )
from synapse.module_api import ModuleApi from synapse.module_api import ModuleApi
from synapse.types import UserID from synapse.types import UserID
from synapse.util.async import run_on_reactor
from synapse.util.caches.expiringcache import ExpiringCache from synapse.util.caches.expiringcache import ExpiringCache
from synapse.util.logcontext import make_deferred_yieldable from synapse.util.logcontext import make_deferred_yieldable
from twisted.web.client import PartialDownloadError from ._base import BaseHandler
import logging
import bcrypt
import pymacaroons
import simplejson
import synapse.util.stringutils as stringutils
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -402,7 +406,7 @@ class AuthHandler(BaseHandler):
except PartialDownloadError as pde: except PartialDownloadError as pde:
# Twisted is silly # Twisted is silly
data = pde.response data = pde.response
-resp_body = simplejson.loads(data)
+resp_body = json.loads(data)
if 'success' in resp_body: if 'success' in resp_body:
# Note that we do NOT check the hostname here: we explicitly # Note that we do NOT check the hostname here: we explicitly
@ -423,15 +427,11 @@ class AuthHandler(BaseHandler):
def _check_msisdn(self, authdict, _): def _check_msisdn(self, authdict, _):
return self._check_threepid('msisdn', authdict) return self._check_threepid('msisdn', authdict)
-@defer.inlineCallbacks
def _check_dummy_auth(self, authdict, _):
-    yield run_on_reactor()
-    defer.returnValue(True)
+    return defer.succeed(True)
@defer.inlineCallbacks @defer.inlineCallbacks
def _check_threepid(self, medium, authdict): def _check_threepid(self, medium, authdict):
yield run_on_reactor()
if 'threepid_creds' not in authdict: if 'threepid_creds' not in authdict:
raise LoginError(400, "Missing threepid_creds", Codes.MISSING_PARAM) raise LoginError(400, "Missing threepid_creds", Codes.MISSING_PARAM)
@ -825,6 +825,15 @@ class AuthHandler(BaseHandler):
if medium == 'email': if medium == 'email':
address = address.lower() address = address.lower()
identity_handler = self.hs.get_handlers().identity_handler
yield identity_handler.unbind_threepid(
user_id,
{
'medium': medium,
'address': address,
},
)
ret = yield self.store.user_delete_threepid( ret = yield self.store.user_delete_threepid(
user_id, medium, address, user_id, medium, address,
) )
@ -849,7 +858,11 @@ class AuthHandler(BaseHandler):
return bcrypt.hashpw(password.encode('utf8') + self.hs.config.password_pepper, return bcrypt.hashpw(password.encode('utf8') + self.hs.config.password_pepper,
bcrypt.gensalt(self.bcrypt_rounds)) bcrypt.gensalt(self.bcrypt_rounds))
return make_deferred_yieldable(threads.deferToThread(_do_hash)) return make_deferred_yieldable(
threads.deferToThreadPool(
self.hs.get_reactor(), self.hs.get_reactor().getThreadPool(), _do_hash
),
)
def validate_hash(self, password, stored_hash): def validate_hash(self, password, stored_hash):
"""Validates that self.hash(password) == stored_hash. """Validates that self.hash(password) == stored_hash.
@ -869,16 +882,21 @@ class AuthHandler(BaseHandler):
) )
if stored_hash: if stored_hash:
return make_deferred_yieldable(threads.deferToThread(_do_validate_hash)) return make_deferred_yieldable(
threads.deferToThreadPool(
self.hs.get_reactor(),
self.hs.get_reactor().getThreadPool(),
_do_validate_hash,
),
)
else: else:
return defer.succeed(False) return defer.succeed(False)
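As a rough, self-contained illustration (not part of this commit) of the pattern the hashing and validation changes above move to, here is a blocking bcrypt call deferred to a thread pool via an explicit reactor; the pepper and round count are made-up values, not Synapse configuration.

import bcrypt

from twisted.internet import reactor, threads


def hash_password(password, pepper="", rounds=12):
    def _do_hash():
        # bcrypt hashing is deliberately slow, so keep it off the reactor thread.
        return bcrypt.hashpw(
            password.encode("utf8") + pepper.encode("utf8"),
            bcrypt.gensalt(rounds),
        )

    # Passing the reactor and its thread pool explicitly (rather than relying on
    # deferToThread's implicit global reactor) makes the call testable with a
    # fake reactor; the returned Deferred fires once the reactor is running.
    return threads.deferToThreadPool(reactor, reactor.getThreadPool(), _do_hash)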
-class MacaroonGeneartor(object):
-    def __init__(self, hs):
-        self.clock = hs.get_clock()
-        self.server_name = hs.config.server_name
-        self.macaroon_secret_key = hs.config.macaroon_secret_key
+@attr.s
+class MacaroonGenerator(object):
+    hs = attr.ib()
def generate_access_token(self, user_id, extra_caveats=None): def generate_access_token(self, user_id, extra_caveats=None):
extra_caveats = extra_caveats or [] extra_caveats = extra_caveats or []
@ -896,7 +914,7 @@ class MacaroonGeneartor(object):
def generate_short_term_login_token(self, user_id, duration_in_ms=(2 * 60 * 1000)): def generate_short_term_login_token(self, user_id, duration_in_ms=(2 * 60 * 1000)):
macaroon = self._generate_base_macaroon(user_id) macaroon = self._generate_base_macaroon(user_id)
macaroon.add_first_party_caveat("type = login") macaroon.add_first_party_caveat("type = login")
-now = self.clock.time_msec()
+now = self.hs.get_clock().time_msec()
expiry = now + duration_in_ms expiry = now + duration_in_ms
macaroon.add_first_party_caveat("time < %d" % (expiry,)) macaroon.add_first_party_caveat("time < %d" % (expiry,))
return macaroon.serialize() return macaroon.serialize()
@ -908,9 +926,9 @@ class MacaroonGeneartor(object):
def _generate_base_macaroon(self, user_id): def _generate_base_macaroon(self, user_id):
macaroon = pymacaroons.Macaroon( macaroon = pymacaroons.Macaroon(
-location=self.server_name,
+location=self.hs.config.server_name,
identifier="key",
-key=self.macaroon_secret_key)
+key=self.hs.config.macaroon_secret_key)
macaroon.add_first_party_caveat("gen = 1") macaroon.add_first_party_caveat("gen = 1")
macaroon.add_first_party_caveat("user_id = %s" % (user_id,)) macaroon.add_first_party_caveat("user_id = %s" % (user_id,))
return macaroon return macaroon
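A hedged, self-contained sketch of the macaroon construction used by MacaroonGenerator above; the server name, secret key, user id and timestamp below are placeholders rather than real Synapse configuration.

import pymacaroons


def make_login_macaroon(server_name, secret_key, user_id, now_ms,
                        duration_ms=2 * 60 * 1000):
    macaroon = pymacaroons.Macaroon(
        location=server_name,
        identifier="key",
        key=secret_key,
    )
    macaroon.add_first_party_caveat("gen = 1")
    macaroon.add_first_party_caveat("user_id = %s" % (user_id,))
    macaroon.add_first_party_caveat("type = login")
    macaroon.add_first_party_caveat("time < %d" % (now_ms + duration_ms,))
    return macaroon.serialize()


print(make_login_macaroon("example.org", "sekrit", "@alice:example.org", 1531000000000))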

View file

@ -12,13 +12,15 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer, reactor import logging
from ._base import BaseHandler from twisted.internet import defer
from synapse.api.errors import SynapseError
from synapse.types import UserID, create_requester from synapse.types import UserID, create_requester
from synapse.util.logcontext import run_in_background from synapse.util.logcontext import run_in_background
import logging from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -30,6 +32,7 @@ class DeactivateAccountHandler(BaseHandler):
self._auth_handler = hs.get_auth_handler() self._auth_handler = hs.get_auth_handler()
self._device_handler = hs.get_device_handler() self._device_handler = hs.get_device_handler()
self._room_member_handler = hs.get_room_member_handler() self._room_member_handler = hs.get_room_member_handler()
self._identity_handler = hs.get_handlers().identity_handler
self.user_directory_handler = hs.get_user_directory_handler() self.user_directory_handler = hs.get_user_directory_handler()
# Flag that indicates whether the process to part users from rooms is running # Flag that indicates whether the process to part users from rooms is running
@ -37,14 +40,15 @@ class DeactivateAccountHandler(BaseHandler):
# Start the user parter loop so it can resume parting users from rooms where # Start the user parter loop so it can resume parting users from rooms where
# it left off (if it has work left to do). # it left off (if it has work left to do).
reactor.callWhenRunning(self._start_user_parting) hs.get_reactor().callWhenRunning(self._start_user_parting)
@defer.inlineCallbacks
-def deactivate_account(self, user_id):
+def deactivate_account(self, user_id, erase_data):
    """Deactivate a user's account

    Args:
        user_id (str): ID of user to be deactivated
+       erase_data (bool): whether to GDPR-erase the user's data

    Returns:
        Deferred
@ -52,14 +56,35 @@ class DeactivateAccountHandler(BaseHandler):
# FIXME: Theoretically there is a race here wherein user resets # FIXME: Theoretically there is a race here wherein user resets
# password using threepid. # password using threepid.
-# first delete any devices belonging to the user, which will also
+# delete threepids first. We remove these from the IS so if this fails,
# leave the user still active so they can try again.
# Ideally we would prevent password resets and then do this in the
# background thread.
threepids = yield self.store.user_get_threepids(user_id)
for threepid in threepids:
try:
yield self._identity_handler.unbind_threepid(
user_id,
{
'medium': threepid['medium'],
'address': threepid['address'],
},
)
except Exception:
# Do we want this to be a fatal error or should we carry on?
logger.exception("Failed to remove threepid from ID server")
raise SynapseError(400, "Failed to remove threepid from ID server")
yield self.store.user_delete_threepid(
user_id, threepid['medium'], threepid['address'],
)
# delete any devices belonging to the user, which will also
# delete corresponding access tokens. # delete corresponding access tokens.
yield self._device_handler.delete_all_devices_for_user(user_id) yield self._device_handler.delete_all_devices_for_user(user_id)
# then delete any remaining access tokens which weren't associated with # then delete any remaining access tokens which weren't associated with
# a device. # a device.
yield self._auth_handler.delete_access_tokens_for_user(user_id) yield self._auth_handler.delete_access_tokens_for_user(user_id)
yield self.store.user_delete_threepids(user_id)
yield self.store.user_set_password_hash(user_id, None) yield self.store.user_set_password_hash(user_id, None)
# Add the user to a table of users pending deactivation (ie. # Add the user to a table of users pending deactivation (ie.
@ -69,6 +94,11 @@ class DeactivateAccountHandler(BaseHandler):
# delete from user directory # delete from user directory
yield self.user_directory_handler.handle_user_deactivated(user_id) yield self.user_directory_handler.handle_user_deactivated(user_id)
# Mark the user as erased, if they asked for that
if erase_data:
logger.info("Marking %s as erased", user_id)
yield self.store.mark_user_erased(user_id)
# Now start the process that goes through that list and # Now start the process that goes through that list and
# parts users from rooms (if it isn't already running) # parts users from rooms (if it isn't already running)
self._start_user_parting() self._start_user_parting()

View file

@ -12,22 +12,24 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import logging
from six import iteritems, itervalues
from twisted.internet import defer
from synapse.api import errors from synapse.api import errors
from synapse.api.constants import EventTypes from synapse.api.constants import EventTypes
from synapse.api.errors import FederationDeniedError from synapse.api.errors import FederationDeniedError
from synapse.types import RoomStreamToken, get_domain_from_id
from synapse.util import stringutils from synapse.util import stringutils
from synapse.util.async import Linearizer from synapse.util.async import Linearizer
from synapse.util.caches.expiringcache import ExpiringCache from synapse.util.caches.expiringcache import ExpiringCache
from synapse.util.retryutils import NotRetryingDestination
from synapse.util.metrics import measure_func from synapse.util.metrics import measure_func
from synapse.types import get_domain_from_id, RoomStreamToken from synapse.util.retryutils import NotRetryingDestination
from twisted.internet import defer
from ._base import BaseHandler from ._base import BaseHandler
import logging
from six import itervalues, iteritems
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -537,7 +539,7 @@ class DeviceListEduUpdater(object):
yield self.device_handler.notify_device_update(user_id, device_ids) yield self.device_handler.notify_device_update(user_id, device_ids)
else: else:
# Simply update the single device, since we know that is the only # Simply update the single device, since we know that is the only
-# change (becuase of the single prev_id matching the current cache)
+# change (because of the single prev_id matching the current cache)
for device_id, stream_id, prev_ids, content in pending_updates: for device_id, stream_id, prev_ids, content in pending_updates:
yield self.store.update_remote_device_list_cache_entry( yield self.store.update_remote_device_list_cache_entry(
user_id, device_id, content, stream_id, user_id, device_id, content, stream_id,

View file

@ -18,10 +18,9 @@ import logging
from twisted.internet import defer from twisted.internet import defer
from synapse.api.errors import SynapseError from synapse.api.errors import SynapseError
from synapse.types import get_domain_from_id, UserID from synapse.types import UserID, get_domain_from_id
from synapse.util.stringutils import random_string from synapse.util.stringutils import random_string
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -14,16 +14,17 @@
# limitations under the License. # limitations under the License.
from twisted.internet import defer
from ._base import BaseHandler
from synapse.api.errors import SynapseError, Codes, CodeMessageException, AuthError
from synapse.api.constants import EventTypes
from synapse.types import RoomAlias, UserID, get_domain_from_id
import logging import logging
import string import string
from twisted.internet import defer
from synapse.api.constants import EventTypes
from synapse.api.errors import AuthError, CodeMessageException, Codes, SynapseError
from synapse.types import RoomAlias, UserID, get_domain_from_id
from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -14,17 +14,16 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import simplejson as json
import logging import logging
from canonicaljson import encode_canonical_json
from twisted.internet import defer
from six import iteritems from six import iteritems
from synapse.api.errors import ( from canonicaljson import encode_canonical_json, json
SynapseError, CodeMessageException, FederationDeniedError,
) from twisted.internet import defer
from synapse.types import get_domain_from_id, UserID
from synapse.api.errors import CodeMessageException, FederationDeniedError, SynapseError
from synapse.types import UserID, get_domain_from_id
from synapse.util.logcontext import make_deferred_yieldable, run_in_background from synapse.util.logcontext import make_deferred_yieldable, run_in_background
from synapse.util.retryutils import NotRetryingDestination from synapse.util.retryutils import NotRetryingDestination
@ -80,7 +79,7 @@ class E2eKeysHandler(object):
else: else:
remote_queries[user_id] = device_ids remote_queries[user_id] = device_ids
-# Firt get local devices.
+# First get local devices.
failures = {} failures = {}
results = {} results = {}
if local_query: if local_query:
@ -357,7 +356,7 @@ def _exception_to_failure(e):
# include ConnectionRefused and other errors # include ConnectionRefused and other errors
# #
# Note that some Exceptions (notably twisted's ResponseFailed etc) don't # Note that some Exceptions (notably twisted's ResponseFailed etc) don't
# give a string for e.message, which simplejson then fails to serialize. # give a string for e.message, which json then fails to serialize.
return { return {
"status": 503, "message": str(e.message), "status": 503, "message": str(e.message),
} }

View file

@ -13,19 +13,18 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer
from synapse.util.logutils import log_function
from synapse.types import UserID
from synapse.events.utils import serialize_event
from synapse.api.constants import Membership, EventTypes
from synapse.events import EventBase
from ._base import BaseHandler
import logging import logging
import random import random
from twisted.internet import defer
from synapse.api.constants import EventTypes, Membership
from synapse.events import EventBase
from synapse.events.utils import serialize_event
from synapse.types import UserID
from synapse.util.logutils import log_function
from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -20,37 +20,41 @@ import itertools
import logging import logging
import sys import sys
import six
from six import iteritems
from six.moves import http_client
from signedjson.key import decode_verify_key_bytes from signedjson.key import decode_verify_key_bytes
from signedjson.sign import verify_signed_json from signedjson.sign import verify_signed_json
import six
from six.moves import http_client
from six import iteritems
from twisted.internet import defer
from unpaddedbase64 import decode_base64 from unpaddedbase64 import decode_base64
from ._base import BaseHandler from twisted.internet import defer
from synapse.api.errors import (
AuthError, FederationError, StoreError, CodeMessageException, SynapseError,
FederationDeniedError,
)
from synapse.api.constants import EventTypes, Membership, RejectedReason from synapse.api.constants import EventTypes, Membership, RejectedReason
from synapse.events.validator import EventValidator from synapse.api.errors import (
from synapse.util import unwrapFirstError, logcontext AuthError,
from synapse.util.metrics import measure_func CodeMessageException,
from synapse.util.logutils import log_function FederationDeniedError,
from synapse.util.async import run_on_reactor, Linearizer FederationError,
from synapse.util.frozenutils import unfreeze StoreError,
from synapse.crypto.event_signing import ( SynapseError,
compute_event_signature, add_hashes_and_signatures,
) )
from synapse.crypto.event_signing import (
add_hashes_and_signatures,
compute_event_signature,
)
from synapse.events.validator import EventValidator
from synapse.state import resolve_events_with_factory
from synapse.types import UserID, get_domain_from_id from synapse.types import UserID, get_domain_from_id
from synapse.util import logcontext, unwrapFirstError
from synapse.events.utils import prune_event from synapse.util.async import Linearizer
from synapse.util.retryutils import NotRetryingDestination
from synapse.util.distributor import user_joined_room from synapse.util.distributor import user_joined_room
from synapse.util.frozenutils import unfreeze
from synapse.util.logutils import log_function
from synapse.util.retryutils import NotRetryingDestination
from synapse.visibility import filter_events_for_server
from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -89,7 +93,9 @@ class FederationHandler(BaseHandler):
@defer.inlineCallbacks @defer.inlineCallbacks
@log_function @log_function
-def on_receive_pdu(self, origin, pdu, get_missing=True):
+def on_receive_pdu(
+    self, origin, pdu, get_missing=True, sent_to_us_directly=False,
+):
""" Process a PDU received via a federation /send/ transaction, or """ Process a PDU received via a federation /send/ transaction, or
via backfill of missing prev_events via backfill of missing prev_events
@ -103,8 +109,10 @@ class FederationHandler(BaseHandler):
""" """
# We reprocess pdus when we have seen them only as outliers # We reprocess pdus when we have seen them only as outliers
-existing = yield self.get_persisted_pdu(
-    origin, pdu.event_id, do_auth=False
-)
+existing = yield self.store.get_event(
+    pdu.event_id,
+    allow_none=True,
+    allow_rejected=True,
+)
# FIXME: Currently we fetch an event again when we already have it # FIXME: Currently we fetch an event again when we already have it
@ -161,14 +169,11 @@ class FederationHandler(BaseHandler):
"Ignoring PDU %s for room %s from %s as we've left the room!", "Ignoring PDU %s for room %s from %s as we've left the room!",
pdu.event_id, pdu.room_id, origin, pdu.event_id, pdu.room_id, origin,
) )
-return
+defer.returnValue(None)
state = None state = None
auth_chain = [] auth_chain = []
fetch_state = False
# Get missing pdus if necessary. # Get missing pdus if necessary.
if not pdu.internal_metadata.is_outlier(): if not pdu.internal_metadata.is_outlier():
# We only backfill backwards to the min depth. # We only backfill backwards to the min depth.
@ -223,26 +228,60 @@ class FederationHandler(BaseHandler):
list(prevs - seen)[:5], list(prevs - seen)[:5],
) )
-if prevs - seen:
-    logger.info(
-        "Still missing %d events for room %r: %r...",
-        len(prevs - seen), pdu.room_id, list(prevs - seen)[:5]
-    )
-    fetch_state = True
-
-if fetch_state:
-    # We need to get the state at this event, since we haven't
-    # processed all the prev events.
-    logger.debug(
-        "_handle_new_pdu getting state for %s",
-        pdu.room_id
-    )
-    try:
-        state, auth_chain = yield self.replication_layer.get_state_for_room(
-            origin, pdu.room_id, pdu.event_id,
-        )
-    except Exception:
-        logger.exception("Failed to get state for event: %s", pdu.event_id)
+if sent_to_us_directly and prevs - seen:
+    # If they have sent it to us directly, and the server
+    # isn't telling us about the auth events that it's
+    # made a message referencing, we explode
+    raise FederationError(
+        "ERROR",
+        403,
+        (
+            "Your server isn't divulging details about prev_events "
+            "referenced in this event."
+        ),
+        affected=pdu.event_id,
+    )
+elif prevs - seen:
+    # Calculate the state of the previous events, and
+    # de-conflict them to find the current state.
+    state_groups = []
+    auth_chains = set()
+    try:
+        # Get the state of the events we know about
+        ours = yield self.store.get_state_groups(pdu.room_id, list(seen))
+        state_groups.append(ours)
+
+        # Ask the remote server for the states we don't
+        # know about
+        for p in prevs - seen:
+            state, got_auth_chain = (
+                yield self.replication_layer.get_state_for_room(
+                    origin, pdu.room_id, p
+                )
+            )
+            auth_chains.update(got_auth_chain)
+            state_group = {(x.type, x.state_key): x.event_id for x in state}
+            state_groups.append(state_group)
+
+        # Resolve any conflicting state
+        def fetch(ev_ids):
+            return self.store.get_events(
+                ev_ids, get_prev_content=False, check_redacted=False
+            )
+
+        state_map = yield resolve_events_with_factory(
+            state_groups, {pdu.event_id: pdu}, fetch
+        )
+
+        state = (yield self.store.get_events(state_map.values())).values()
+        auth_chain = list(auth_chains)
+    except Exception:
+        raise FederationError(
+            "ERROR",
+            403,
+            "We can't get valid state history.",
+            affected=pdu.event_id,
+        )
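A tiny, self-contained illustration (not from the codebase) of the set arithmetic driving the branch above; the event ids are invented.

prevs = {"$a:remote", "$b:remote", "$c:remote"}   # prev_events of the incoming PDU
seen = {"$a:remote"}                              # events we already have locally

missing = prevs - seen
print("need state for:", sorted(missing))

# Each fetched state is flattened into a {(type, state_key): event_id} map,
# the shape appended to state_groups above.
state_events = [
    ("m.room.create", "", "$create:remote"),
    ("m.room.member", "@alice:remote", "$join:remote"),
]
state_group = {(etype, skey): eid for etype, skey, eid in state_events}
print(state_group)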
yield self._process_received_pdu( yield self._process_received_pdu(
origin, origin,
@ -320,11 +359,17 @@ class FederationHandler(BaseHandler):
for e in missing_events: for e in missing_events:
logger.info("Handling found event %s", e.event_id) logger.info("Handling found event %s", e.event_id)
-yield self.on_receive_pdu(
-    origin,
-    e,
-    get_missing=False
-)
+try:
+    yield self.on_receive_pdu(
+        origin,
+        e,
+        get_missing=False
+    )
+except FederationError as e:
+    if e.code == 403:
+        logger.warn("Event %s failed history check.")
+    else:
+        raise
@log_function @log_function
@defer.inlineCallbacks @defer.inlineCallbacks
@ -455,83 +500,6 @@ class FederationHandler(BaseHandler):
user = UserID.from_string(event.state_key) user = UserID.from_string(event.state_key)
yield user_joined_room(self.distributor, user, event.room_id) yield user_joined_room(self.distributor, user, event.room_id)
@measure_func("_filter_events_for_server")
@defer.inlineCallbacks
def _filter_events_for_server(self, server_name, room_id, events):
event_to_state_ids = yield self.store.get_state_ids_for_events(
frozenset(e.event_id for e in events),
types=(
(EventTypes.RoomHistoryVisibility, ""),
(EventTypes.Member, None),
)
)
# We only want to pull out member events that correspond to the
# server's domain.
def check_match(id):
try:
return server_name == get_domain_from_id(id)
except Exception:
return False
# Parses mapping `event_id -> (type, state_key) -> state event_id`
# to get all state ids that we're interested in.
event_map = yield self.store.get_events([
e_id
for key_to_eid in list(event_to_state_ids.values())
for key, e_id in key_to_eid.items()
if key[0] != EventTypes.Member or check_match(key[1])
])
event_to_state = {
e_id: {
key: event_map[inner_e_id]
for key, inner_e_id in key_to_eid.iteritems()
if inner_e_id in event_map
}
for e_id, key_to_eid in event_to_state_ids.iteritems()
}
def redact_disallowed(event, state):
if not state:
return event
history = state.get((EventTypes.RoomHistoryVisibility, ''), None)
if history:
visibility = history.content.get("history_visibility", "shared")
if visibility in ["invited", "joined"]:
# We now loop through all state events looking for
# membership states for the requesting server to determine
# if the server is either in the room or has been invited
# into the room.
for ev in state.itervalues():
if ev.type != EventTypes.Member:
continue
try:
domain = get_domain_from_id(ev.state_key)
except Exception:
continue
if domain != server_name:
continue
memtype = ev.membership
if memtype == Membership.JOIN:
return event
elif memtype == Membership.INVITE:
if visibility == "invited":
return event
else:
return prune_event(event)
return event
defer.returnValue([
redact_disallowed(e, event_to_state[e.event_id])
for e in events
])
@log_function @log_function
@defer.inlineCallbacks @defer.inlineCallbacks
def backfill(self, dest, room_id, limit, extremities): def backfill(self, dest, room_id, limit, extremities):
@ -938,16 +906,6 @@ class FederationHandler(BaseHandler):
[auth_id for auth_id, _ in event.auth_events], [auth_id for auth_id, _ in event.auth_events],
include_given=True include_given=True
) )
for event in auth:
event.signatures.update(
compute_event_signature(
event,
self.hs.hostname,
self.hs.config.signing_key[0]
)
)
defer.returnValue([e for e in auth]) defer.returnValue([e for e in auth])
@log_function @log_function
@ -1381,8 +1339,6 @@ class FederationHandler(BaseHandler):
def get_state_for_pdu(self, room_id, event_id): def get_state_for_pdu(self, room_id, event_id):
"""Returns the state at the event. i.e. not including said event. """Returns the state at the event. i.e. not including said event.
""" """
yield run_on_reactor()
state_groups = yield self.store.get_state_groups( state_groups = yield self.store.get_state_groups(
room_id, [event_id] room_id, [event_id]
) )
@ -1405,18 +1361,6 @@ class FederationHandler(BaseHandler):
del results[(event.type, event.state_key)] del results[(event.type, event.state_key)]
res = list(results.values()) res = list(results.values())
for event in res:
# We sign these again because there was a bug where we
# incorrectly signed things the first time round
if self.is_mine_id(event.event_id):
event.signatures.update(
compute_event_signature(
event,
self.hs.hostname,
self.hs.config.signing_key[0]
)
)
defer.returnValue(res) defer.returnValue(res)
else: else:
defer.returnValue([]) defer.returnValue([])
@ -1425,8 +1369,6 @@ class FederationHandler(BaseHandler):
def get_state_ids_for_pdu(self, room_id, event_id): def get_state_ids_for_pdu(self, room_id, event_id):
"""Returns the state at the event. i.e. not including said event. """Returns the state at the event. i.e. not including said event.
""" """
yield run_on_reactor()
state_groups = yield self.store.get_state_groups_ids( state_groups = yield self.store.get_state_groups_ids(
room_id, [event_id] room_id, [event_id]
) )
@ -1462,17 +1404,26 @@ class FederationHandler(BaseHandler):
limit limit
) )
-events = yield self._filter_events_for_server(origin, room_id, events)
+events = yield filter_events_for_server(self.store, origin, events)
defer.returnValue(events) defer.returnValue(events)
@defer.inlineCallbacks @defer.inlineCallbacks
@log_function @log_function
-def get_persisted_pdu(self, origin, event_id, do_auth=True):
-    """ Get a PDU from the database with given origin and id.
+def get_persisted_pdu(self, origin, event_id):
+    """Get an event from the database for the given server.
+
+    Args:
+        origin [str]: hostname of server which is requesting the event; we
+            will check that the server is allowed to see it.
+        event_id [str]: id of the event being requested
+
    Returns:
-        Deferred: Results in a `Pdu`.
+        Deferred[EventBase|None]: None if we know nothing about the event;
+            otherwise the (possibly-redacted) event.
+
+    Raises:
+        AuthError if the server is not currently in the room
    """
event = yield self.store.get_event( event = yield self.store.get_event(
event_id, event_id,
@ -1481,32 +1432,17 @@ class FederationHandler(BaseHandler):
) )
if event:
-    if self.is_mine_id(event.event_id):
-        # FIXME: This is a temporary work around where we occasionally
-        # return events slightly differently than when they were
-        # originally signed
-        event.signatures.update(
-            compute_event_signature(
-                event,
-                self.hs.hostname,
-                self.hs.config.signing_key[0]
-            )
-        )
-
-    if do_auth:
-        in_room = yield self.auth.check_host_in_room(
-            event.room_id,
-            origin
-        )
-        if not in_room:
-            raise AuthError(403, "Host not in room.")
-
-        events = yield self._filter_events_for_server(
-            origin, event.room_id, [event]
-        )
-        event = events[0]
+    in_room = yield self.auth.check_host_in_room(
+        event.room_id,
+        origin
+    )
+    if not in_room:
+        raise AuthError(403, "Host not in room.")
+
+    events = yield filter_events_for_server(
+        self.store, origin, [event],
+    )
+    event = events[0]
defer.returnValue(event) defer.returnValue(event)
else: else:
defer.returnValue(None) defer.returnValue(None)
@ -1760,15 +1696,6 @@ class FederationHandler(BaseHandler):
local_auth_chain, remote_auth_chain local_auth_chain, remote_auth_chain
) )
for event in ret["auth_chain"]:
event.signatures.update(
compute_event_signature(
event,
self.hs.hostname,
self.hs.config.signing_key[0]
)
)
logger.debug("on_query_auth returning: %s", ret) logger.debug("on_query_auth returning: %s", ret)
defer.returnValue(ret) defer.returnValue(ret)
@ -1794,8 +1721,8 @@ class FederationHandler(BaseHandler):
min_depth=min_depth, min_depth=min_depth,
) )
-missing_events = yield self._filter_events_for_server(
-    origin, room_id, missing_events,
-)
+missing_events = yield filter_events_for_server(
+    self.store, origin, missing_events,
+)
defer.returnValue(missing_events) defer.returnValue(missing_events)

View file

@ -14,14 +14,15 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from twisted.internet import defer import logging
from six import iteritems from six import iteritems
from twisted.internet import defer
from synapse.api.errors import SynapseError from synapse.api.errors import SynapseError
from synapse.types import get_domain_from_id from synapse.types import get_domain_from_id
import logging
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -1,6 +1,7 @@
# -*- coding: utf-8 -*- # -*- coding: utf-8 -*-
# Copyright 2015, 2016 OpenMarket Ltd # Copyright 2015, 2016 OpenMarket Ltd
# Copyright 2017 Vector Creations Ltd # Copyright 2017 Vector Creations Ltd
# Copyright 2018 New Vector Ltd
# #
# Licensed under the Apache License, Version 2.0 (the "License"); # Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License. # you may not use this file except in compliance with the License.
@ -18,16 +19,18 @@
import logging import logging
import simplejson as json from canonicaljson import json
from twisted.internet import defer from twisted.internet import defer
from synapse.api.errors import ( from synapse.api.errors import (
MatrixCodeMessageException, CodeMessageException CodeMessageException,
Codes,
MatrixCodeMessageException,
SynapseError,
) )
from ._base import BaseHandler from ._base import BaseHandler
from synapse.util.async import run_on_reactor
from synapse.api.errors import SynapseError, Codes
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -38,6 +41,7 @@ class IdentityHandler(BaseHandler):
super(IdentityHandler, self).__init__(hs) super(IdentityHandler, self).__init__(hs)
self.http_client = hs.get_simple_http_client() self.http_client = hs.get_simple_http_client()
self.federation_http_client = hs.get_http_client()
self.trusted_id_servers = set(hs.config.trusted_third_party_id_servers) self.trusted_id_servers = set(hs.config.trusted_third_party_id_servers)
self.trust_any_id_server_just_for_testing_do_not_use = ( self.trust_any_id_server_just_for_testing_do_not_use = (
@ -60,8 +64,6 @@ class IdentityHandler(BaseHandler):
@defer.inlineCallbacks @defer.inlineCallbacks
def threepid_from_creds(self, creds): def threepid_from_creds(self, creds):
yield run_on_reactor()
if 'id_server' in creds: if 'id_server' in creds:
id_server = creds['id_server'] id_server = creds['id_server']
elif 'idServer' in creds: elif 'idServer' in creds:
@ -104,7 +106,6 @@ class IdentityHandler(BaseHandler):
@defer.inlineCallbacks @defer.inlineCallbacks
def bind_threepid(self, creds, mxid): def bind_threepid(self, creds, mxid):
yield run_on_reactor()
logger.debug("binding threepid %r to %s", creds, mxid) logger.debug("binding threepid %r to %s", creds, mxid)
data = None data = None
@ -139,9 +140,53 @@ class IdentityHandler(BaseHandler):
defer.returnValue(data) defer.returnValue(data)
@defer.inlineCallbacks @defer.inlineCallbacks
-def requestEmailToken(self, id_server, email, client_secret, send_attempt, **kwargs):
-yield run_on_reactor()
+def unbind_threepid(self, mxid, threepid):
+"""
Removes a binding from an identity server
Args:
mxid (str): Matrix user ID of binding to be removed
threepid (dict): Dict with medium & address of binding to be removed
Returns:
Deferred[bool]: True on success, otherwise False
"""
logger.debug("unbinding threepid %r from %s", threepid, mxid)
if not self.trusted_id_servers:
logger.warn("Can't unbind threepid: no trusted ID servers set in config")
defer.returnValue(False)
# We don't track what ID server we added 3pids on (perhaps we ought to)
# but we assume that any of the servers in the trusted list are in the
# same ID server federation, so we can pick any one of them to send the
# deletion request to.
id_server = next(iter(self.trusted_id_servers))
url = "https://%s/_matrix/identity/api/v1/3pid/unbind" % (id_server,)
content = {
"mxid": mxid,
"threepid": threepid,
}
headers = {}
# we abuse the federation http client to sign the request, but we have to send it
# using the normal http client since we don't want the SRV lookup and want normal
# 'browser-like' HTTPS.
self.federation_http_client.sign_request(
destination=None,
method='POST',
url_bytes='/_matrix/identity/api/v1/3pid/unbind'.encode('ascii'),
headers_dict=headers,
content=content,
destination_is=id_server,
)
yield self.http_client.post_json_get_json(
url,
content,
headers,
)
defer.returnValue(True)
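For clarity, the rough shape of the unbind request that the code above signs and POSTs; the identity server hostname, user id and threepid are placeholders.

import json

id_server = "is.example.org"  # the real code picks the first trusted ID server
url = "https://%s/_matrix/identity/api/v1/3pid/unbind" % (id_server,)
content = {
    "mxid": "@alice:example.org",
    "threepid": {"medium": "email", "address": "alice@example.com"},
}
print(url)
print(json.dumps(content, indent=2))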
@defer.inlineCallbacks
def requestEmailToken(self, id_server, email, client_secret, send_attempt, **kwargs):
if not self._should_trust_id_server(id_server): if not self._should_trust_id_server(id_server):
raise SynapseError( raise SynapseError(
400, "Untrusted ID server '%s'" % id_server, 400, "Untrusted ID server '%s'" % id_server,
@ -176,8 +221,6 @@ class IdentityHandler(BaseHandler):
self, id_server, country, phone_number, self, id_server, country, phone_number,
client_secret, send_attempt, **kwargs client_secret, send_attempt, **kwargs
): ):
yield run_on_reactor()
if not self._should_trust_id_server(id_server): if not self._should_trust_id_server(id_server):
raise SynapseError( raise SynapseError(
400, "Untrusted ID server '%s'" % id_server, 400, "Untrusted ID server '%s'" % id_server,

View file

@ -13,6 +13,8 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import logging
from twisted.internet import defer from twisted.internet import defer
from synapse.api.constants import EventTypes, Membership from synapse.api.constants import EventTypes, Membership
@ -21,9 +23,7 @@ from synapse.events.utils import serialize_event
from synapse.events.validator import EventValidator from synapse.events.validator import EventValidator
from synapse.handlers.presence import format_user_presence_state from synapse.handlers.presence import format_user_presence_state
from synapse.streams.config import PaginationConfig from synapse.streams.config import PaginationConfig
from synapse.types import ( from synapse.types import StreamToken, UserID
UserID, StreamToken,
)
from synapse.util import unwrapFirstError from synapse.util import unwrapFirstError
from synapse.util.async import concurrently_execute from synapse.util.async import concurrently_execute
from synapse.util.caches.snapshot_cache import SnapshotCache from synapse.util.caches.snapshot_cache import SnapshotCache
@ -32,9 +32,6 @@ from synapse.visibility import filter_events_for_client
from ._base import BaseHandler from ._base import BaseHandler
import logging
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -14,35 +14,31 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
import logging import logging
import simplejson
import sys import sys
from canonicaljson import encode_canonical_json
import six import six
from six import string_types, itervalues, iteritems from six import iteritems, itervalues, string_types
from twisted.internet import defer, reactor
from canonicaljson import encode_canonical_json, json
from twisted.internet import defer
from twisted.internet.defer import succeed from twisted.internet.defer import succeed
from twisted.python.failure import Failure from twisted.python.failure import Failure
from synapse.api.constants import EventTypes, Membership, MAX_DEPTH from synapse.api.constants import MAX_DEPTH, EventTypes, Membership
from synapse.api.errors import ( from synapse.api.errors import AuthError, Codes, ConsentNotGivenError, SynapseError
AuthError, Codes, SynapseError,
ConsentNotGivenError,
)
from synapse.api.urls import ConsentURIBuilder from synapse.api.urls import ConsentURIBuilder
from synapse.crypto.event_signing import add_hashes_and_signatures from synapse.crypto.event_signing import add_hashes_and_signatures
from synapse.events.utils import serialize_event from synapse.events.utils import serialize_event
from synapse.events.validator import EventValidator from synapse.events.validator import EventValidator
from synapse.types import ( from synapse.replication.http.send_event import send_event_to_master
UserID, RoomAlias, RoomStreamToken, from synapse.types import RoomAlias, RoomStreamToken, UserID
) from synapse.util.async import Limiter, ReadWriteLock
from synapse.util.async import run_on_reactor, ReadWriteLock, Limiter from synapse.util.frozenutils import frozendict_json_encoder
from synapse.util.logcontext import run_in_background from synapse.util.logcontext import run_in_background
from synapse.util.metrics import measure_func from synapse.util.metrics import measure_func
from synapse.util.frozenutils import frozendict_json_encoder
from synapse.util.stringutils import random_string from synapse.util.stringutils import random_string
from synapse.visibility import filter_events_for_client from synapse.visibility import filter_events_for_client
from synapse.replication.http.send_event import send_event_to_master
from ._base import BaseHandler from ._base import BaseHandler
@ -157,7 +153,7 @@ class MessageHandler(BaseHandler):
# remove the purge from the list 24 hours after it completes # remove the purge from the list 24 hours after it completes
def clear_purge(): def clear_purge():
del self._purges_by_id[purge_id] del self._purges_by_id[purge_id]
reactor.callLater(24 * 3600, clear_purge) self.hs.get_reactor().callLater(24 * 3600, clear_purge)
def get_purge_status(self, purge_id): def get_purge_status(self, purge_id):
"""Get the current status of an active purge """Get the current status of an active purge
@ -388,7 +384,7 @@ class MessageHandler(BaseHandler):
users_with_profile = yield self.state.get_current_user_in_room(room_id) users_with_profile = yield self.state.get_current_user_in_room(room_id)
# If this is an AS, double check that they are allowed to see the members. # If this is an AS, double check that they are allowed to see the members.
-# This can either be because the AS user is in the room or becuase there
+# This can either be because the AS user is in the room or because there
# is a user in the room that the AS is "interested in" # is a user in the room that the AS is "interested in"
if requester.app_service and user_id not in users_with_profile: if requester.app_service and user_id not in users_with_profile:
for uid in users_with_profile: for uid in users_with_profile:
@ -491,7 +487,7 @@ class EventCreationHandler(object):
target, e target, e
) )
is_exempt = yield self._is_exempt_from_privacy_policy(builder) is_exempt = yield self._is_exempt_from_privacy_policy(builder, requester)
if not is_exempt: if not is_exempt:
yield self.assert_accepted_privacy_policy(requester) yield self.assert_accepted_privacy_policy(requester)
@ -509,12 +505,13 @@ class EventCreationHandler(object):
defer.returnValue((event, context)) defer.returnValue((event, context))
def _is_exempt_from_privacy_policy(self, builder): def _is_exempt_from_privacy_policy(self, builder, requester):
""""Determine if an event to be sent is exempt from having to consent """"Determine if an event to be sent is exempt from having to consent
to the privacy policy to the privacy policy
Args: Args:
builder (synapse.events.builder.EventBuilder): event being created builder (synapse.events.builder.EventBuilder): event being created
requester (Requster): user requesting this event
Returns: Returns:
Deferred[bool]: true if the event can be sent without the user Deferred[bool]: true if the event can be sent without the user
@ -525,6 +522,9 @@ class EventCreationHandler(object):
membership = builder.content.get("membership", None) membership = builder.content.get("membership", None)
if membership == Membership.JOIN: if membership == Membership.JOIN:
return self._is_server_notices_room(builder.room_id) return self._is_server_notices_room(builder.room_id)
elif membership == Membership.LEAVE:
# the user is always allowed to leave (but not kick people)
return builder.state_key == requester.user.to_string()
return succeed(False) return succeed(False)
@defer.inlineCallbacks @defer.inlineCallbacks
@ -793,7 +793,7 @@ class EventCreationHandler(object):
# Ensure that we can round trip before trying to persist in db # Ensure that we can round trip before trying to persist in db
try: try:
dump = frozendict_json_encoder.encode(event.content) dump = frozendict_json_encoder.encode(event.content)
-simplejson.loads(dump)
+json.loads(dump)
except Exception: except Exception:
logger.exception("Failed to encode content: %r", event.content) logger.exception("Failed to encode content: %r", event.content)
raise raise
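A minimal illustration of the round-trip check above using canonicaljson, with made-up event content; Synapse itself encodes with its frozendict-aware encoder.

from canonicaljson import encode_canonical_json, json

content = {"msgtype": "m.text", "body": "hello"}   # made-up event content
dump = encode_canonical_json(content)              # canonical JSON, as bytes
assert json.loads(dump.decode("utf-8")) == content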
@ -806,6 +806,7 @@ class EventCreationHandler(object):
# If we're a worker we need to hit out to the master. # If we're a worker we need to hit out to the master.
if self.config.worker_app: if self.config.worker_app:
yield send_event_to_master( yield send_event_to_master(
self.hs.get_clock(),
self.http_client, self.http_client,
host=self.config.worker_replication_host, host=self.config.worker_replication_host,
port=self.config.worker_replication_http_port, port=self.config.worker_replication_http_port,
@ -959,9 +960,7 @@ class EventCreationHandler(object):
event_stream_id, max_stream_id event_stream_id, max_stream_id
) )
@defer.inlineCallbacks
def _notify(): def _notify():
yield run_on_reactor()
try: try:
self.notifier.on_new_room_event( self.notifier.on_new_room_event(
event, event_stream_id, max_stream_id, event, event_stream_id, max_stream_id,

View file

@ -22,27 +22,26 @@ The methods that define policy are:
- should_notify - should_notify
""" """
from twisted.internet import defer, reactor import logging
from contextlib import contextmanager from contextlib import contextmanager
from six import itervalues, iteritems from six import iteritems, itervalues
from prometheus_client import Counter
from twisted.internet import defer
from synapse.api.errors import SynapseError
from synapse.api.constants import PresenceState from synapse.api.constants import PresenceState
from synapse.api.errors import SynapseError
from synapse.metrics import LaterGauge
from synapse.storage.presence import UserPresenceState from synapse.storage.presence import UserPresenceState
from synapse.types import UserID, get_domain_from_id
from synapse.util.caches.descriptors import cachedInlineCallbacks
from synapse.util.async import Linearizer from synapse.util.async import Linearizer
from synapse.util.caches.descriptors import cachedInlineCallbacks
from synapse.util.logcontext import run_in_background from synapse.util.logcontext import run_in_background
from synapse.util.logutils import log_function from synapse.util.logutils import log_function
from synapse.util.metrics import Measure from synapse.util.metrics import Measure
from synapse.util.wheel_timer import WheelTimer from synapse.util.wheel_timer import WheelTimer
from synapse.types import UserID, get_domain_from_id
from synapse.metrics import LaterGauge
import logging
from prometheus_client import Counter
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -179,7 +178,7 @@ class PresenceHandler(object):
# have not yet been persisted # have not yet been persisted
self.unpersisted_users_changes = set() self.unpersisted_users_changes = set()
reactor.addSystemEventTrigger("before", "shutdown", self._on_shutdown) hs.get_reactor().addSystemEventTrigger("before", "shutdown", self._on_shutdown)
self.serial_to_user = {} self.serial_to_user = {}
self._next_serial = 1 self._next_serial = 1

View file

@ -17,8 +17,9 @@ import logging
from twisted.internet import defer from twisted.internet import defer
from synapse.api.errors import SynapseError, AuthError, CodeMessageException from synapse.api.errors import AuthError, CodeMessageException, SynapseError
from synapse.types import UserID, get_domain_from_id from synapse.types import UserID, get_domain_from_id
from ._base import BaseHandler from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -13,13 +13,14 @@
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from ._base import BaseHandler import logging
from twisted.internet import defer from twisted.internet import defer
from synapse.util.async import Linearizer from synapse.util.async import Linearizer
import logging from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -12,17 +12,15 @@
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and # See the License for the specific language governing permissions and
# limitations under the License. # limitations under the License.
from synapse.util import logcontext import logging
from ._base import BaseHandler
from twisted.internet import defer from twisted.internet import defer
from synapse.util.logcontext import PreserveLoggingContext
from synapse.types import get_domain_from_id from synapse.types import get_domain_from_id
from synapse.util import logcontext
from synapse.util.logcontext import PreserveLoggingContext
import logging from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)

View file

@ -18,14 +18,19 @@ import logging
from twisted.internet import defer from twisted.internet import defer
from synapse import types
from synapse.api.errors import ( from synapse.api.errors import (
AuthError, Codes, SynapseError, RegistrationError, InvalidCaptchaError AuthError,
Codes,
InvalidCaptchaError,
RegistrationError,
SynapseError,
) )
from synapse.http.client import CaptchaServerHttpClient from synapse.http.client import CaptchaServerHttpClient
from synapse import types from synapse.types import RoomAlias, RoomID, UserID, create_requester
from synapse.types import UserID, create_requester, RoomID, RoomAlias from synapse.util.async import Linearizer
from synapse.util.async import run_on_reactor, Linearizer
from synapse.util.threepids import check_3pid_allowed from synapse.util.threepids import check_3pid_allowed
from ._base import BaseHandler from ._base import BaseHandler
logger = logging.getLogger(__name__) logger = logging.getLogger(__name__)
@ -139,7 +144,6 @@ class RegistrationHandler(BaseHandler):
Raises: Raises:
RegistrationError if there was a problem registering. RegistrationError if there was a problem registering.
""" """
yield run_on_reactor()
password_hash = None password_hash = None
if password: if password:
password_hash = yield self.auth_handler().hash(password) password_hash = yield self.auth_handler().hash(password)
@@ -431,8 +435,6 @@ class RegistrationHandler(BaseHandler):
         Raises:
             RegistrationError if there was a problem registering.
         """
-        yield run_on_reactor()
         if localpart is None:
             raise SynapseError(400, "Request must include user id")
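Both register-handler hunks delete a bare yield run_on_reactor() at the top of an inlineCallbacks generator, and the import hunk above drops run_on_reactor from synapse.util.async to match. A self-contained sketch of the resulting shape; FakeAuthHandler and SketchRegistrationHandler are stand-ins invented for this example, not Synapse code:

    from twisted.internet import defer


    class FakeAuthHandler(object):
        # Stand-in for the real auth handler, only here so the sketch runs on its own.
        def hash(self, password):
            return defer.succeed("hashed:" + password)


    class SketchRegistrationHandler(object):
        def __init__(self):
            self._auth = FakeAuthHandler()

        @defer.inlineCallbacks
        def register(self, localpart=None, password=None):
            # Previously the body began with "yield run_on_reactor()"; after this
            # change the generator goes straight to its first meaningful yield.
            password_hash = None
            if password:
                password_hash = yield self._auth.hash(password)
            defer.returnValue((localpart, password_hash))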

View file

@@ -15,23 +15,20 @@
 # limitations under the License.
 """Contains functions for performing events on rooms."""
-from twisted.internet import defer
-from ._base import BaseHandler
-from synapse.types import UserID, RoomAlias, RoomID, RoomStreamToken
-from synapse.api.constants import (
-    EventTypes, JoinRules, RoomCreationPreset
-)
-from synapse.api.errors import AuthError, StoreError, SynapseError
-from synapse.util import stringutils
-from synapse.visibility import filter_events_for_client
-from collections import OrderedDict
 import logging
 import math
 import string
+from collections import OrderedDict
+from twisted.internet import defer
+from synapse.api.constants import EventTypes, JoinRules, RoomCreationPreset
+from synapse.api.errors import AuthError, Codes, StoreError, SynapseError
+from synapse.types import RoomAlias, RoomID, RoomStreamToken, UserID
+from synapse.util import stringutils
+from synapse.visibility import filter_events_for_client
+from ._base import BaseHandler
 logger = logging.getLogger(__name__)
@@ -115,7 +112,11 @@ class RoomCreationHandler(BaseHandler):
            )
            if mapping:
-               raise SynapseError(400, "Room alias already taken")
+               raise SynapseError(
+                   400,
+                   "Room alias already taken",
+                   Codes.ROOM_IN_USE
+               )
            else:
                room_alias = None
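The second room.py hunk adds Codes.ROOM_IN_USE (the M_ROOM_IN_USE errcode) to the "Room alias already taken" error, so clients can branch on a machine-readable code instead of matching the message string. A hedged client-side sketch, using the requests library purely for illustration:

    import requests  # illustrative HTTP client; not part of Synapse


    def create_room_with_alias(base_url, access_token, alias_localpart):
        resp = requests.post(
            base_url + "/_matrix/client/r0/createRoom",
            headers={"Authorization": "Bearer " + access_token},
            json={"room_alias_name": alias_localpart},
        )
        if resp.status_code == 400 and resp.json().get("errcode") == "M_ROOM_IN_USE":
            # The alias is taken; surface a specific error instead of a generic 400.
            raise ValueError("alias %r is already in use" % (alias_localpart,))
        resp.raise_for_status()
        return resp.json()["room_id"]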

View file

@@ -13,26 +13,24 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
-from twisted.internet import defer
+import logging
+from collections import namedtuple
 from six import iteritems
 from six.moves import range
-from ._base import BaseHandler
+import msgpack
+from unpaddedbase64 import decode_base64, encode_base64
-from synapse.api.constants import (
-    EventTypes, JoinRules,
-)
+from twisted.internet import defer
+from synapse.api.constants import EventTypes, JoinRules
+from synapse.types import ThirdPartyInstanceID
 from synapse.util.async import concurrently_execute
 from synapse.util.caches.descriptors import cachedInlineCallbacks
 from synapse.util.caches.response_cache import ResponseCache
-from synapse.types import ThirdPartyInstanceID
-from collections import namedtuple
-from unpaddedbase64 import encode_base64, decode_base64
-import logging
-import msgpack
+from ._base import BaseHandler
 logger = logging.getLogger(__name__)
@@ -40,7 +38,7 @@ REMOTE_ROOM_LIST_POLL_INTERVAL = 60 * 1000
 # This is used to indicate we should only return rooms published to the main list.
-EMTPY_THIRD_PARTY_ID = ThirdPartyInstanceID(None, None)
+EMPTY_THIRD_PARTY_ID = ThirdPartyInstanceID(None, None)
 class RoomListHandler(BaseHandler):
@@ -52,7 +50,7 @@ class RoomListHandler(BaseHandler):
     def get_local_public_room_list(self, limit=None, since_token=None,
                                    search_filter=None,
-                                   network_tuple=EMTPY_THIRD_PARTY_ID,):
+                                   network_tuple=EMPTY_THIRD_PARTY_ID,):
        """Generate a local public room list.
        There are multiple different lists: the main one plus one per third
@@ -89,7 +87,7 @@ class RoomListHandler(BaseHandler):
     @defer.inlineCallbacks
     def _get_public_room_list(self, limit=None, since_token=None,
                               search_filter=None,
-                              network_tuple=EMTPY_THIRD_PARTY_ID,):
+                              network_tuple=EMPTY_THIRD_PARTY_ID,):
        if since_token and since_token != "END":
            since_token = RoomListNextBatch.from_token(since_token)
        else:
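The last three room_list.py hunks only correct the misspelled EMTPY_THIRD_PARTY_ID sentinel to EMPTY_THIRD_PARTY_ID, at its definition and at both keyword defaults that reference it. A small sketch of that sentinel-default pattern, with a hypothetical namedtuple standing in for synapse.types.ThirdPartyInstanceID:

    from collections import namedtuple

    # Hypothetical stand-in for synapse.types.ThirdPartyInstanceID.
    ThirdPartyInstanceID = namedtuple(
        "ThirdPartyInstanceID", ["appservice_id", "network_id"]
    )

    # (None, None) means "just the main public room list".
    EMPTY_THIRD_PARTY_ID = ThirdPartyInstanceID(None, None)


    def get_public_room_list(network_tuple=EMPTY_THIRD_PARTY_ID):
        if network_tuple == EMPTY_THIRD_PARTY_ID:
            return "main room list"
        return "room list for %s/%s" % (
            network_tuple.appservice_id, network_tuple.network_id,
        )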

View file

@@ -21,19 +21,17 @@ from six.moves import http_client
 from signedjson.key import decode_verify_key_bytes
 from signedjson.sign import verify_signed_json
-from twisted.internet import defer
 from unpaddedbase64 import decode_base64
+from twisted.internet import defer
 import synapse.server
 import synapse.types
-from synapse.api.constants import (
-    EventTypes, Membership,
-)
-from synapse.api.errors import AuthError, SynapseError, Codes
-from synapse.types import UserID, RoomID
+from synapse.api.constants import EventTypes, Membership
+from synapse.api.errors import AuthError, Codes, SynapseError
+from synapse.types import RoomID, UserID
 from synapse.util.async import Linearizer
-from synapse.util.distributor import user_left_room, user_joined_room
+from synapse.util.distributor import user_joined_room, user_left_room
 logger = logging.getLogger(__name__)

View file

@@ -20,11 +20,12 @@ from twisted.internet import defer
 from synapse.api.errors import SynapseError
 from synapse.handlers.room_member import RoomMemberHandler
 from synapse.replication.http.membership import (
-    remote_join, remote_reject_invite, get_or_register_3pid_guest,
+    get_or_register_3pid_guest,
     notify_user_membership_change,
+    remote_join,
+    remote_reject_invite,
 )
 logger = logging.getLogger(__name__)

Some files were not shown because too many files have changed in this diff.