forked from MirrorHub/synapse
Move setting of Filter into code.
We do this to prevent footguns. The default config routes log records through a MemoryHandler, but users are free to change it to log to files directly. If they do, they have to remember to set `filters: [context]` on the right handler, otherwise records get written with the wrong logging context. Instead we move the logic to run when we generate a record, which is when we *log*, rather than when we *handle* it.

(It is possible to add filters to loggers in the config, but they don't apply to descendant loggers, so they would have to be set manually on *every* logger used in the code base.)
parent a0f574f3c2
commit 0304ad0c3d
3 changed files with 16 additions and 21 deletions
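The footgun is easiest to see in isolation. The sketch below is not Synapse code: `RequestFilter` and the `current_request` global are simplified stand-ins for `LoggingContextFilter` and the logging context. Because the filter is attached to the MemoryHandler's target rather than to the handler that first receives the record, it only runs when the buffer is flushed, and the record is stamped with whichever request happens to be active at flush time.

import logging
import logging.handlers

# Stand-in for Synapse's logging context: whichever request is "current".
current_request = "none"


class RequestFilter(logging.Filter):
    """Stand-in for LoggingContextFilter: stamp records with the current request."""

    def filter(self, record):
        record.request = current_request
        return True


formatter = logging.Formatter("%(request)s - %(message)s")

target = logging.StreamHandler()          # stands in for the real file handler
target.setFormatter(formatter)
target.addFilter(RequestFilter())         # footgun: filter on the flush target

buffer = logging.handlers.MemoryHandler(capacity=100, target=target)

logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)
logger.addHandler(buffer)

current_request = "GET /sync"
logger.info("handling request")           # buffered; the filter has not run yet

current_request = "GET /keys"
buffer.flush()                            # filter runs now: logs "GET /keys - handling request"

Putting `filters: [context]` on the buffer handler instead (as the sample configs below did) avoids this, but only as long as the user keeps the filter on the right handler when rearranging their config.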
@@ -4,16 +4,10 @@ formatters:
     precise:
         format: '%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(request)s - %(message)s'

-filters:
-    context:
-        (): synapse.logging.context.LoggingContextFilter
-        request: ""
-
 handlers:
     console:
         class: logging.StreamHandler
         formatter: precise
-        filters: [context]

 loggers:
     synapse.storage.SQL:
@@ -11,11 +11,6 @@ formatters:
     precise:
         format: '%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(request)s - %(message)s'

-filters:
-    context:
-        (): synapse.logging.context.LoggingContextFilter
-        request: ""
-
 handlers:
     file:
         class: logging.handlers.TimedRotatingFileHandler
@@ -30,7 +25,6 @@ handlers:
     # logs will still be flushed immediately.
     buffer:
         class: logging.handlers.MemoryHandler
-        filters: [context]
         target: file
         # The capacity is the number of log lines that are buffered before
         # being written to disk. Increasing this will lead to better
@@ -44,7 +38,6 @@ handlers:
     console:
         class: logging.StreamHandler
         formatter: precise
-        filters: [context]

 loggers:
     synapse.storage.SQL:
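With the `filters:` block and the per-handler `filters: [context]` entries gone from the sample configs, a user-maintained log config no longer needs any filter wiring at all. A minimal sketch of such a config, written as a Python dictConfig rather than YAML (the file path, rotation settings, and levels are illustrative, not Synapse defaults):

import logging.config

# A minimal post-change user config: no "filters" section and no
# "filters: [context]" on any handler. The %(request)s field is filled in by
# the record factory that Synapse now installs at startup.
LOG_CONFIG = {
    "version": 1,
    "formatters": {
        "precise": {
            "format": "%(asctime)s - %(name)s - %(lineno)d - %(levelname)s"
            " - %(request)s - %(message)s"
        }
    },
    "handlers": {
        "file": {
            "class": "logging.handlers.TimedRotatingFileHandler",
            "formatter": "precise",
            "filename": "homeserver.log",  # illustrative path
            "when": "midnight",
        }
    },
    "root": {"level": "INFO", "handlers": ["file"]},
}

logging.config.dictConfig(LOG_CONFIG)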
@@ -55,11 +55,6 @@ formatters:
        format: '%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - \
%(request)s - %(message)s'

-filters:
-    context:
-        (): synapse.logging.context.LoggingContextFilter
-        request: ""
-
 handlers:
     file:
         class: logging.handlers.TimedRotatingFileHandler
@@ -74,7 +69,6 @@ handlers:
     # logs will still be flushed immediately.
     buffer:
         class: logging.handlers.MemoryHandler
-        filters: [context]
         target: file
         # The capacity is the number of log lines that are buffered before
         # being written to disk. Increasing this will lead to better
@@ -88,7 +82,6 @@ handlers:
     console:
         class: logging.StreamHandler
         formatter: precise
-        filters: [context]

 loggers:
     synapse.storage.SQL:
@@ -199,11 +192,26 @@ def _setup_stdlib_logging(config, log_config, logBeginner: LogBeginner):

         handler = logging.StreamHandler()
         handler.setFormatter(formatter)
-        handler.addFilter(LoggingContextFilter(request=""))
         logger.addHandler(handler)
     else:
         logging.config.dictConfig(log_config)

+    # We add a log record factory that runs all messages through the
+    # LoggingContextFilter so that we get the context *at the time we log*
+    # rather than when we write to a handler. This can be done in config using
+    # filter options, but care must be taken when using e.g. MemoryHandler to
+    # buffer writes.
+
+    log_filter = LoggingContextFilter(request="")
+    old_factory = logging.getLogRecordFactory()
+
+    def factory(*args, **kwargs):
+        record = old_factory(*args, **kwargs)
+        log_filter.filter(record)
+        return record
+
+    logging.setLogRecordFactory(factory)
+
     # Route Twisted's native logging through to the standard library logging
     # system.
     observer = STDLibLogObserver()
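The record-factory pattern added above can be exercised on its own. As before, `RequestFilter` and `current_request` are simplified stand-ins for `LoggingContextFilter` and the logging context, not Synapse's real implementation: because the filter now runs inside the factory, the request is captured at the moment `logger.info` is called, and a buffering MemoryHandler can flush whenever it likes without changing what was recorded.

import logging
import logging.handlers

current_request = "none"  # stand-in for the active logging context


class RequestFilter(logging.Filter):
    def filter(self, record):
        record.request = current_request
        return True


# The pattern added by this commit: wrap the stdlib record factory so every
# record passes through the filter at the moment it is created.
log_filter = RequestFilter()
old_factory = logging.getLogRecordFactory()


def factory(*args, **kwargs):
    record = old_factory(*args, **kwargs)
    log_filter.filter(record)
    return record


logging.setLogRecordFactory(factory)

formatter = logging.Formatter("%(request)s - %(message)s")
target = logging.StreamHandler()            # stands in for the real file handler
target.setFormatter(formatter)
buffer = logging.handlers.MemoryHandler(capacity=100, target=target)

logger = logging.getLogger("demo")
logger.setLevel(logging.INFO)
logger.addHandler(buffer)

current_request = "GET /sync"
logger.info("handling request")             # request captured here, at log time

current_request = "GET /keys"
buffer.flush()                              # logs "GET /sync - handling request"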