Diffstat (limited to 'docker')
-rw-r--r--  docker/Dockerfile             35
-rw-r--r--  docker/README.md             122
-rw-r--r--  docker/conf/homeserver.yaml  219
-rw-r--r--  docker/conf/log.config        29
-rwxr-xr-x  docker/start.py               66
5 files changed, 471 insertions, 0 deletions
diff --git a/docker/Dockerfile b/docker/Dockerfile
new file mode 100644
index 0000000000..26fb3a6bff
--- /dev/null
+++ b/docker/Dockerfile
@@ -0,0 +1,35 @@
+FROM docker.io/python:2-alpine3.7
+
+RUN apk add --no-cache --virtual .nacl_deps \
+        build-base \
+        libffi-dev \
+        libjpeg-turbo-dev \
+        libressl-dev \
+        libxslt-dev \
+        linux-headers \
+        postgresql-dev \
+        su-exec \
+        zlib-dev
+
+COPY . /synapse
+
+# A wheel cache may be provided in ./cache for faster builds
+RUN cd /synapse \
+ && pip install --upgrade \
+        lxml \
+        pip \
+        psycopg2 \
+        setuptools \
+ && mkdir -p /synapse/cache \
+ && pip install -f /synapse/cache --upgrade --process-dependency-links . \
+ && mv /synapse/docker/start.py /synapse/docker/conf / \
+ && rm -rf \
+        setup.cfg \
+        setup.py \
+        synapse
+
+VOLUME ["/data"]
+
+EXPOSE 8008/tcp 8448/tcp
+
+ENTRYPOINT ["/start.py"]
diff --git a/docker/README.md b/docker/README.md
new file mode 100644
index 0000000000..f60ea49234
--- /dev/null
+++ b/docker/README.md
@@ -0,0 +1,122 @@
+# Synapse Docker
+
+This Docker image will run Synapse as a single process. It does not provide a
+database server or a TURN server; you should run these separately.
+
+## Run
+
+We do not currently offer a `latest` image, as this has somewhat undefined semantics.
+We instead release only tagged versions so upgrading between releases is entirely
+within your control.
+
+### Using docker-compose (easier)
+
+This image is designed to run either with an automatically generated
+configuration file or with a custom configuration that requires manual editing.
+
+An easy way to make use of this image is via docker-compose; see the [contrib](https://github.com/matrix-org/synapse/tree/develop/contrib/docker) section of the Synapse project for examples.
+
+### Without Compose (harder)
+
+If you do not wish to use Compose, you may still run this image using plain
+Docker commands. Note that the following is just a guideline: you may need to
+add parameters to the ``docker run`` command to account for how your Postgres
+database is reachable on the network (see the Postgres example below).
+
+```
+docker run \
+    -d \
+    --name synapse \
+    -v ${DATA_PATH}:/data \
+    -e SYNAPSE_SERVER_NAME=my.matrix.host \
+    -e SYNAPSE_REPORT_STATS=yes \
+    docker.io/matrixdotorg/synapse:latest
+```
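+
+If you use Postgres, the database must be reachable from the container. As a
+sketch, assuming a Postgres container named ``postgres`` on a user-defined
+Docker network called ``synapse-net`` (both names are illustrative, not part
+of this image):
+
+```
+docker run \
+    -d \
+    --name synapse \
+    --network synapse-net \
+    -v ${DATA_PATH}:/data \
+    -e SYNAPSE_SERVER_NAME=my.matrix.host \
+    -e SYNAPSE_REPORT_STATS=yes \
+    -e POSTGRES_HOST=postgres \
+    -e POSTGRES_USER=matrix \
+    -e POSTGRES_PASSWORD=changeme \
+    docker.io/matrixdotorg/synapse:latest
+```
+
+Setting ``POSTGRES_PASSWORD`` is what switches the generated configuration from
+SQLite to Postgres; the database variables are documented below.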
+
+## Volumes
+
+The image expects a single volume, located at ``/data``, that will hold:
+
+* temporary files during uploads;
+* uploaded media and thumbnails;
+* the SQLite database if you do not configure postgres;
+* the appservices configuration.
+
+You are free to use separate volumes depending on the storage endpoints at your
+disposal. For instance, ``/data/media`` could be stored on large but
+low-performance HDD storage while other files could be stored on
+high-performance endpoints.
+
+To set up an application service, create an ``appservices`` directory in the
+data volume and place the application service YAML configuration file there.
+Multiple application services are supported, as shown in the example below.
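+
+For example, assuming ``${DATA_PATH}`` is the host directory mounted at
+``/data`` (the registration file name below is illustrative):
+
+```
+mkdir -p ${DATA_PATH}/appservices
+cp my-bridge-registration.yaml ${DATA_PATH}/appservices/
+docker restart synapse    # restart so start.py picks up the new registration
+```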
+
+## Environment
+
+Unless you specify a custom path for the configuration file, a very generic
+file will be generated, based on the following environment settings.
+These are a good starting point for setting up your own deployment.
+
+Global settings:
+
+* ``UID``, the user id Synapse will run as [default 991]
+* ``GID``, the group id Synapse will run as [default 991]
+* ``SYNAPSE_CONFIG_PATH``, path to a custom config file
+
+If ``SYNAPSE_CONFIG_PATH`` is set, you should generate a configuration file and
+then customize it manually; no other environment variable is required. One way
+to do this is shown below.
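+
+As a sketch, the image's ``generate`` mode (handled by ``start.py``) can create
+an initial configuration inside the data volume, which you can then edit and
+keep pointing ``SYNAPSE_CONFIG_PATH`` at; the path below is illustrative:
+
+```
+docker run -it --rm \
+    -v ${DATA_PATH}:/data \
+    -e SYNAPSE_SERVER_NAME=my.matrix.host \
+    -e SYNAPSE_REPORT_STATS=yes \
+    -e SYNAPSE_CONFIG_PATH=/data/homeserver.yaml \
+    docker.io/matrixdotorg/synapse:latest generate
+```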
+
+Otherwise, a dynamic configuration file will be used. The following environment
+variables are available for configuration (a combined example follows the list):
+
+* ``SYNAPSE_SERVER_NAME`` (mandatory), the public hostname of this server.
+* ``SYNAPSE_REPORT_STATS`` (mandatory, ``yes`` or ``no``), enables anonymous
+  statistics reporting back to the Matrix project, which helps us to get funding.
+* ``SYNAPSE_NO_TLS``, set this variable to disable TLS in Synapse (use this if
+  you run your own TLS-capable reverse proxy).
+* ``SYNAPSE_ENABLE_REGISTRATION``, set this variable to enable registration on
+  the Synapse instance.
+* ``SYNAPSE_ALLOW_GUEST``, set this variable to allow guests to join this server.
+* ``SYNAPSE_EVENT_CACHE_SIZE``, the event cache size [default `10K`].
+* ``SYNAPSE_CACHE_FACTOR``, the cache factor [default `0.5`].
+* ``SYNAPSE_RECAPTCHA_PUBLIC_KEY``, set this variable to the recaptcha public
+  key in order to enable recaptcha upon registration.
+* ``SYNAPSE_RECAPTCHA_PRIVATE_KEY``, set this variable to the recaptcha private
+  key in order to enable recaptcha upon registration.
+* ``SYNAPSE_TURN_URIS``, set this variable to a comma-separated list of TURN
+  URIs to enable TURN for this homeserver.
+* ``SYNAPSE_TURN_SECRET``, set this to the TURN shared secret if required.
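+
+As an illustrative example, running Synapse behind a TLS-terminating reverse
+proxy with open registration could combine several of these variables:
+
+```
+docker run -d --name synapse \
+    -v ${DATA_PATH}:/data \
+    -e SYNAPSE_SERVER_NAME=my.matrix.host \
+    -e SYNAPSE_REPORT_STATS=yes \
+    -e SYNAPSE_NO_TLS=1 \
+    -e SYNAPSE_ENABLE_REGISTRATION=1 \
+    docker.io/matrixdotorg/synapse:latest
+```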
+
+Shared secrets, which will be initialized to random values if not set:
+
+* ``SYNAPSE_REGISTRATION_SHARED_SECRET``, secret for registering users when
+  registration is disabled.
+* ``SYNAPSE_MACAROON_SECRET_KEY``, secret for signing access tokens
+  to the server.
+
+Database specific values (will use SQLite if not set):
+
+* `POSTGRES_DB` - The database name for the synapse postgres database.
+  [default: `synapse`]
+* `POSTGRES_HOST` - The host of the postgres database if you wish to use
+  postgresql instead of sqlite3. [default: `db`, which is useful when using a
+  container on the same docker network in a compose file where the postgres
+  service is called `db`]
+* `POSTGRES_PASSWORD` - The password for the synapse postgres database.
+  **If this is set then postgres will be used instead of sqlite3.**
+  [default: none] **NOTE**: You are highly encouraged to use postgresql!
+  Please use the compose file to make it easier to deploy.
+* `POSTGRES_USER` - The user for the synapse postgres database.
+  [default: `matrix`]
+
+Mail server specific values (will not send emails if not set):
+
+* ``SYNAPSE_SMTP_HOST``, hostname of the mail server.
+* ``SYNAPSE_SMTP_PORT``, TCP port for accessing the mail server [default ``25``].
+* ``SYNAPSE_SMTP_USER``, username for authenticating against the mail server, if any.
+* ``SYNAPSE_SMTP_PASSWORD``, password for authenticating against the mail server, if any.
+
+## Build
+
+Build the docker image with the `docker build` command from the root of the synapse repository.
+
+```
+docker build -t docker.io/matrixdotorg/synapse . -f docker/Dockerfile
+```
+
+The `-t` option sets the image tag. Official images are tagged `matrixdotorg/synapse:<version>` where `<version>` is the same as the release tag in the synapse git repository.
+
+You may have a local Python wheel cache available, in which case copy the
+relevant packages into the ``cache/`` directory at the root of the project,
+as sketched below.
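+
+As a minimal sketch (assuming a working build toolchain on the host), wheels
+for Synapse and its dependencies can be pre-built into ``cache/`` before
+running ``docker build``:
+
+```
+pip wheel --wheel-dir cache .
+```
+
+Note that wheels containing compiled extensions built on a non-Alpine host may
+not be usable inside the Alpine-based image; pure-Python wheels are safe.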
diff --git a/docker/conf/homeserver.yaml b/docker/conf/homeserver.yaml
new file mode 100644
index 0000000000..6bc25bb45f
--- /dev/null
+++ b/docker/conf/homeserver.yaml
@@ -0,0 +1,219 @@
+# vim:ft=yaml
+
+## TLS ##
+
+tls_certificate_path: "/data/{{ SYNAPSE_SERVER_NAME }}.tls.crt"
+tls_private_key_path: "/data/{{ SYNAPSE_SERVER_NAME }}.tls.key"
+tls_dh_params_path: "/data/{{ SYNAPSE_SERVER_NAME }}.tls.dh"
+no_tls: {{ "True" if SYNAPSE_NO_TLS else "False" }}
+tls_fingerprints: []
+
+## Server ##
+
+server_name: "{{ SYNAPSE_SERVER_NAME }}"
+pid_file: /homeserver.pid
+web_client: False
+soft_file_limit: 0
+
+## Ports ##
+
+listeners:
+  {% if not SYNAPSE_NO_TLS %}
+  -
+    port: 8448
+    bind_addresses: ['0.0.0.0']
+    type: http
+    tls: true
+    x_forwarded: false
+    resources:
+      - names: [client]
+        compress: true
+      - names: [federation]  # Federation APIs
+        compress: false
+  {% endif %}
+
+  - port: 8008
+    tls: false
+    bind_addresses: ['0.0.0.0']
+    type: http
+    x_forwarded: false
+
+    resources:
+      - names: [client]
+        compress: true
+      - names: [federation]
+        compress: false
+
+## Database ##
+
+{% if POSTGRES_PASSWORD %}
+database:
+  name: "psycopg2"
+  args:
+    user: "{{ POSTGRES_USER or "synapse" }}"
+    password: "{{ POSTGRES_PASSWORD }}"
+    database: "{{ POSTGRES_DB or "synapse" }}"
+    host: "{{ POSTGRES_HOST or "db" }}"
+    port: "{{ POSTGRES_PORT or "5432" }}"
+    cp_min: 5
+    cp_max: 10
+{% else %}
+database:
+  name: "sqlite3"
+  args:
+    database: "/data/homeserver.db"
+{% endif %}
+
+## Performance ##
+
+event_cache_size: "{{ SYNAPSE_EVENT_CACHE_SIZE or "10K" }}"
+verbose: 0
+log_file: "/data/homeserver.log"
+log_config: "/compiled/log.config"
+
+## Ratelimiting ##
+
+rc_messages_per_second: 0.2
+rc_message_burst_count: 10.0
+federation_rc_window_size: 1000
+federation_rc_sleep_limit: 10
+federation_rc_sleep_delay: 500
+federation_rc_reject_limit: 50
+federation_rc_concurrent: 3
+
+## Files ##
+
+media_store_path: "/data/media"
+uploads_path: "/data/uploads"
+max_upload_size: "10M"
+max_image_pixels: "32M"
+dynamic_thumbnails: false
+
+# List of thumbnails to precalculate when an image is uploaded.
+thumbnail_sizes:
+- width: 32
+  height: 32
+  method: crop
+- width: 96
+  height: 96
+  method: crop
+- width: 320
+  height: 240
+  method: scale
+- width: 640
+  height: 480
+  method: scale
+- width: 800
+  height: 600
+  method: scale
+
+url_preview_enabled: False
+max_spider_size: "10M"
+
+## Captcha ##
+
+{% if SYNAPSE_RECAPTCHA_PUBLIC_KEY %}
+recaptcha_public_key: "{{ SYNAPSE_RECAPTCHA_PUBLIC_KEY }}"
+recaptcha_private_key: "{{ SYNAPSE_RECAPTCHA_PRIVATE_KEY }}"
+enable_registration_captcha: True
+recaptcha_siteverify_api: "https://www.google.com/recaptcha/api/siteverify"
+{% else %}
+recaptcha_public_key: "YOUR_PUBLIC_KEY"
+recaptcha_private_key: "YOUR_PRIVATE_KEY"
+enable_registration_captcha: False
+recaptcha_siteverify_api: "https://www.google.com/recaptcha/api/siteverify"
+{% endif %}
+
+## Turn ##
+
+{% if SYNAPSE_TURN_URIS %}
+turn_uris:
+{% for uri in SYNAPSE_TURN_URIS.split(',') %}    - "{{ uri }}"
+{% endfor %}
+turn_shared_secret: "{{ SYNAPSE_TURN_SECRET }}"
+turn_user_lifetime: "1h"
+turn_allow_guests: True
+{% else %}
+turn_uris: []
+turn_shared_secret: "YOUR_SHARED_SECRET"
+turn_user_lifetime: "1h"
+turn_allow_guests: True
+{% endif %}
+
+## Registration ##
+
+enable_registration: {{ "True" if SYNAPSE_ENABLE_REGISTRATION else "False" }}
+registration_shared_secret: "{{ SYNAPSE_REGISTRATION_SHARED_SECRET }}"
+bcrypt_rounds: 12
+allow_guest_access: {{ "True" if SYNAPSE_ALLOW_GUEST else "False" }}
+enable_group_creation: true
+
+# The list of identity servers trusted to verify third party
+# identifiers by this server.
+trusted_third_party_id_servers:
+    - matrix.org
+    - vector.im
+    - riot.im
+
+## Metrics ##
+
+{% if SYNAPSE_REPORT_STATS.lower() == "yes" %}
+enable_metrics: True
+report_stats: True
+{% else %}
+enable_metrics: False
+report_stats: False
+{% endif %}
+
+## API Configuration ##
+
+room_invite_state_types:
+    - "m.room.join_rules"
+    - "m.room.canonical_alias"
+    - "m.room.avatar"
+    - "m.room.name"
+
+{% if SYNAPSE_APPSERVICES %}
+app_service_config_files:
+{% for appservice in SYNAPSE_APPSERVICES %}    - "{{ appservice }}"
+{% endfor %}
+{% else %}
+app_service_config_files: []
+{% endif %}
+
+macaroon_secret_key: "{{ SYNAPSE_MACAROON_SECRET_KEY }}"
+expire_access_token: False
+
+## Signing Keys ##
+
+signing_key_path: "/data/{{ SYNAPSE_SERVER_NAME }}.signing.key"
+old_signing_keys: {}
+key_refresh_interval: "1d" # 1 Day.
+
+# The trusted servers to download signing keys from.
+perspectives:
+  servers:
+    "matrix.org":
+      verify_keys:
+        "ed25519:auto":
+          key: "Noi6WqcDj0QmPxCNQqgezwTlBKrfqehY1u2FyWP9uYw"
+
+password_config:
+   enabled: true
+
+{% if SYNAPSE_SMTP_HOST %}
+email:
+   enable_notifs: false
+   smtp_host: "{{ SYNAPSE_SMTP_HOST }}"
+   smtp_port: {{ SYNAPSE_SMTP_PORT or "25" }}
+   smtp_user: "{{ SYNAPSE_SMTP_USER }}"
+   smtp_pass: "{{ SYNAPSE_SMTP_PASSWORD }}"
+   require_transport_security: False
+   notif_from: "{{ SYNAPSE_SMTP_FROM or "hostmaster@" + SYNAPSE_SERVER_NAME }}"
+   app_name: Matrix
+   template_dir: res/templates
+   notif_template_html: notif_mail.html
+   notif_template_text: notif_mail.txt
+   notif_for_new_users: True
+   riot_base_url: "https://{{ SYNAPSE_SERVER_NAME }}"
+{% endif %}
diff --git a/docker/conf/log.config b/docker/conf/log.config
new file mode 100644
index 0000000000..1851995802
--- /dev/null
+++ b/docker/conf/log.config
@@ -0,0 +1,29 @@
+version: 1
+
+formatters:
+  precise:
+   format: '%(asctime)s - %(name)s - %(lineno)d - %(levelname)s - %(request)s- %(message)s'
+
+filters:
+  context:
+    (): synapse.util.logcontext.LoggingContextFilter
+    request: ""
+
+handlers:
+  console:
+    class: logging.StreamHandler
+    formatter: precise
+    filters: [context]
+
+loggers:
+    synapse:
+        level: {{ SYNAPSE_LOG_LEVEL or "WARNING" }}
+
+    synapse.storage.SQL:
+        # beware: increasing this to DEBUG will make synapse log sensitive
+        # information such as access tokens.
+        level: {{ SYNAPSE_LOG_LEVEL or "WARNING" }}
+
+root:
+    level: {{ SYNAPSE_LOG_LEVEL or "WARNING" }}
+    handlers: [console]
diff --git a/docker/start.py b/docker/start.py
new file mode 100755
index 0000000000..90e8b9c51a
--- /dev/null
+++ b/docker/start.py
@@ -0,0 +1,66 @@
+#!/usr/local/bin/python
+
+import jinja2
+import os
+import sys
+import subprocess
+import glob
+
+# Utility functions
+def convert(src, dst, environ):
+    # Render the Jinja2 template at `src` and write the result to `dst`.
+    open(dst, "w").write(jinja2.Template(open(src).read()).render(**environ))
+
+def check_arguments(environ, args):
+    for argument in args:
+        if argument not in environ:
+            print("Environment variable %s is mandatory, exiting." % argument)
+            sys.exit(2)
+
+def generate_secrets(environ, secrets):
+    for name, secret in secrets.items():
+        if secret not in environ:
+            filename = "/data/%s.%s.key" % (environ["SYNAPSE_SERVER_NAME"], name)
+            if os.path.exists(filename):
+                with open(filename) as handle: value = handle.read()
+            else:
+                print("Generating a random secret for {}".format(name))
+                value = os.urandom(32).encode("hex")
+                with open(filename, "w") as handle: handle.write(value)
+            environ[secret] = value
+
+# Prepare the configuration
+mode = sys.argv[1] if len(sys.argv) > 1 else None
+environ = os.environ.copy()
+ownership = "{}:{}".format(environ.get("UID", 991), environ.get("GID", 991))
+args = ["python", "-m", "synapse.app.homeserver"]
+
+# In generate mode, generate a configuration file and any missing keys, then exit
+if mode == "generate":
+    check_arguments(environ, ("SYNAPSE_SERVER_NAME", "SYNAPSE_REPORT_STATS", "SYNAPSE_CONFIG_PATH"))
+    args += [
+        "--server-name", environ["SYNAPSE_SERVER_NAME"],
+        "--report-stats", environ["SYNAPSE_REPORT_STATS"],
+        "--config-path", environ["SYNAPSE_CONFIG_PATH"],
+        "--generate-config"
+    ]
+    os.execv("/usr/local/bin/python", args)
+
+# In normal mode, generate missing keys if any, then run synapse
+else:
+    # Parse the configuration file
+    if "SYNAPSE_CONFIG_PATH" in environ:
+        args += ["--config-path", environ["SYNAPSE_CONFIG_PATH"]]
+    else:
+        check_arguments(environ, ("SYNAPSE_SERVER_NAME", "SYNAPSE_REPORT_STATS"))
+        generate_secrets(environ, {
+            "registration": "SYNAPSE_REGISTRATION_SHARED_SECRET",
+            "macaroon": "SYNAPSE_MACAROON_SECRET_KEY"
+        })
+        environ["SYNAPSE_APPSERVICES"] = glob.glob("/data/appservices/*.yaml")
+        if not os.path.exists("/compiled"): os.mkdir("/compiled")
+        convert("/conf/homeserver.yaml", "/compiled/homeserver.yaml", environ)
+        convert("/conf/log.config", "/compiled/log.config", environ)
+        subprocess.check_output(["chown", "-R", ownership, "/data"])
+        args += ["--config-path", "/compiled/homeserver.yaml"]
+    # Generate missing keys and start synapse
+    subprocess.check_output(args + ["--generate-keys"])
+    os.execv("/sbin/su-exec", ["su-exec", ownership] + args)