author     David Teller <d.o.teller+github@gmail.com>    2022-05-11 10:32:27 +0200
committer  David Teller <d.o.teller+github@gmail.com>    2022-05-11 10:32:30 +0200
commit     ae66c672fe8b5fbfe497888d03b2cbd68381de76 (patch)
tree       4c837eb8b98c8b6bd1a0fa910100df30dea7161f /synapse/rest
parent     Fix `/messages` throwing a 500 when querying for non-existent room (#12683) (diff)
download   synapse-github/ts/spam-errors.tar.xz
Uniformize spam-checker API (branches: github/ts/spam-errors, ts/spam-errors)
- Previously, some callbacks were expected to return `True` to allow and `False` to deny, while others returned `True` to deny and `False` to allow. With this PR, all callbacks return `ALLOW` to allow or a `Codes` value (typically `Codes.FORBIDDEN`) to deny (see the sketch after this list).
- Similarly, some methods returned `True` to allow and `False` to deny, while others returned `True` to deny and `False` to allow. They now all return `ALLOW` to allow or a `Codes` value to deny.
- Spam-checker implementations may now return an explicit code, e.g. to differentiate between "User account has been suspended" (which in practice is required by law in some countries, including the UK) and "This message looks like spam".
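
For illustration, a minimal sketch of a media callback under the new convention, using the `ALLOW` sentinel this commit introduces and the existing `Codes` enum; `looks_like_spam` is a hypothetical stand-in for a module's own heuristic and is not part of Synapse:

import synapse.spam_checker_api
from synapse.api.errors import Codes


async def looks_like_spam(file_wrapper, file_info) -> bool:
    # Hypothetical heuristic; a real module would inspect the uploaded file.
    return False


async def check_media_file_for_spam(file_wrapper, file_info):
    if await looks_like_spam(file_wrapper, file_info):
        # Deny: return a `Codes` value, typically `Codes.FORBIDDEN`.
        return Codes.FORBIDDEN
    # Allow: the explicit sentinel replaces the old boolean convention.
    return synapse.spam_checker_api.ALLOW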
Diffstat (limited to 'synapse/rest')
-rw-r--r--  synapse/rest/media/v1/media_storage.py  9
1 file changed, 6 insertions, 3 deletions
diff --git a/synapse/rest/media/v1/media_storage.py b/synapse/rest/media/v1/media_storage.py
index 604f18bf52..c85a8a636b 100644
--- a/synapse/rest/media/v1/media_storage.py
+++ b/synapse/rest/media/v1/media_storage.py
@@ -36,6 +36,7 @@ from twisted.internet.defer import Deferred
 from twisted.internet.interfaces import IConsumer
 from twisted.protocols.basic import FileSender
 
+import synapse
 from synapse.api.errors import NotFoundError
 from synapse.logging.context import defer_to_thread, make_deferred_yieldable
 from synapse.util import Clock
@@ -145,15 +146,17 @@ class MediaStorage:
                     f.flush()
                     f.close()
 
-                    spam = await self.spam_checker.check_media_file_for_spam(
+                    spam_check = await self.spam_checker.check_media_file_for_spam(
                         ReadableFileWrapper(self.clock, fname), file_info
                     )
-                    if spam:
+                    if spam_check is not synapse.spam_checker_api.ALLOW:
                         logger.info("Blocking media due to spam checker")
                         # Note that we'll delete the stored media, due to the
                         # try/except below. The media also won't be stored in
                         # the DB.
-                        raise SpamMediaException()
+                        raise SpamMediaException(
+                            "File rejected as probable spam", spam_check
+                        )
 
                     for provider in self.storage_providers:
                         await provider.store_file(path, file_info)