{"ok": true, "next": null, "rows": [{"id": "settings:config-dir", "page": "settings", "ref": "config-dir", "title": "Configuration directory mode", "content": "Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose: \n             datasette one.db two.db \\\n  --metadata=metadata.json \\\n  --template-dir=templates/ \\\n  --plugins-dir=plugins \\\n  --static css:css \n             As an alternative to this, you can run Datasette in  configuration directory  mode. Create a directory with the following structure: \n             # In a directory called my-app:\nmy-app/one.db\nmy-app/two.db\nmy-app/datasette.yaml\nmy-app/metadata.json\nmy-app/templates/index.html\nmy-app/plugins/my_plugin.py\nmy-app/static/my.css \n             Now start Datasette by providing the path to that directory: \n             datasette my-app/ \n             Datasette will detect the files in that directory and automatically configure itself using them. It will serve all  *.db  files that it finds, will load  metadata.json  if it exists, and will load the  templates ,  plugins  and  static  folders if they are present. \n             The files that can be included in this directory are as follows. All are optional. 
\n             \n                 \n                     *.db  (or  *.sqlite3  or  *.sqlite ) - SQLite database files that will be served by Datasette \n                 \n                 \n                     datasette.yaml  -  Configuration  for the Datasette instance \n                 \n                 \n                     metadata.json  -  Metadata  for those databases -  metadata.yaml  or  metadata.yml  can be used as well \n                 \n                 \n                     inspect-data.json  - the result of running  datasette inspect *.db --inspect-file=inspect-data.json  from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running \n                 \n                 \n                     templates/  - a directory containing  Custom templates \n                 \n                 \n                     plugins/  - a directory containing plugins, see  Writing one-off plugins \n                 \n                 \n                     static/  - a directory containing static files - these will be served from  /static/filename.txt , see  Serving static files", "breadcrumbs": "[\"Settings\"]", "references": "[]"}, {"id": "settings:id2", "page": "settings", "ref": "id2", "title": "Settings", "content": "The following options can be set using  --setting name value , or by storing them in the  settings.json  file for use with  Configuration directory mode .", "breadcrumbs": "[\"Settings\"]", "references": "[]"}, {"id": "settings:setting-allow-csv-stream", "page": "settings", "ref": "setting-allow-csv-stream", "title": "allow_csv_stream", "content": "Enables  the CSV export feature  where an entire table\n                    (potentially hundreds of thousands of rows) can be exported as a single CSV\n                    file. 
This is turned on by default - you can turn it off like this: \n                 datasette mydatabase.db --setting allow_csv_stream off", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-allow-download", "page": "settings", "ref": "setting-allow-download", "title": "allow_download", "content": "Should users be able to download the original SQLite database using a link on the database index page? This is turned on by default. However, databases can only be downloaded if they are served in immutable mode and not in-memory. If downloading is unavailable for either of these reasons, the download link is hidden even if  allow_download  is on. To disable database downloads, use the following: \n                 datasette mydatabase.db --setting allow_download off", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-allow-facet", "page": "settings", "ref": "setting-allow-facet", "title": "allow_facet", "content": "Allow users to specify columns they would like to facet on using the  ?_facet=COLNAME  URL parameter to the table view. \n                 This is enabled by default. If disabled, facets will still be displayed if they have been specifically enabled in  metadata.json  configuration for the table. \n                 Here's how to disable this feature: \n                 datasette mydatabase.db --setting allow_facet off", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-allow-signed-tokens", "page": "settings", "ref": "setting-allow-signed-tokens", "title": "allow_signed_tokens", "content": "Should users be able to create signed API tokens to access Datasette? \n                 This is turned on by default. Use the following to turn it off: \n                 datasette mydatabase.db --setting allow_signed_tokens off \n                 Turning this setting off will disable the  /-/create-token  page,  described here . 
It will also cause any incoming  Authorization: Bearer dstok_...  API tokens to be ignored.", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-base-url", "page": "settings", "ref": "setting-base-url", "title": "base_url", "content": "If you are running Datasette behind a proxy, it may be useful to change the root path used for the Datasette instance. \n                 For example, if you are sending traffic from  https://www.example.com/tools/datasette/  through to a proxied Datasette instance you may wish Datasette to use  /tools/datasette/  as its root URL. \n                 You can do that like so: \n                 datasette mydatabase.db --setting base_url /tools/datasette/", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-cache-size-kb", "page": "settings", "ref": "setting-cache-size-kb", "title": "cache_size_kb", "content": "Sets the amount of memory SQLite uses for its  per-connection cache , in KB. \n                 datasette mydatabase.db --setting cache_size_kb 5000", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[{\"href\": \"https://www.sqlite.org/pragma.html#pragma_cache_size\", \"label\": \"per-connection cache\"}]"}, {"id": "settings:setting-default-allow-sql", "page": "settings", "ref": "setting-default-allow-sql", "title": "default_allow_sql", "content": "Should users be able to execute arbitrary SQL queries by default? \n                 Setting this to  off  causes permission checks for  execute-sql  to fail by default. \n                 datasette mydatabase.db --setting default_allow_sql off \n                 Another way to achieve this is to add  \"allow_sql\": false  to your  datasette.yaml  file, as described in  Controlling the ability to execute arbitrary SQL . 
The  default_allow_sql  setting offers a more convenient way to do this.", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-default-cache-ttl", "page": "settings", "ref": "setting-default-cache-ttl", "title": "default_cache_ttl", "content": "Default HTTP caching max-age header in seconds, used for  Cache-Control: max-age=X . Can be over-ridden on a per-request basis using the  ?_ttl=  query string parameter. Set this to  0  to disable HTTP caching entirely. Defaults to 5 seconds. \n                 datasette mydatabase.db --setting default_cache_ttl 60", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-default-facet-size", "page": "settings", "ref": "setting-default-facet-size", "title": "default_facet_size", "content": "The default number of unique rows returned by  Facets  is 30. You can customize it like this: \n                 datasette mydatabase.db --setting default_facet_size 50", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-default-page-size", "page": "settings", "ref": "setting-default-page-size", "title": "default_page_size", "content": "The default number of rows returned by the table page. You can over-ride this on a per-page basis using the  ?_size=80  query string parameter, provided you do not specify a value higher than the  max_returned_rows  setting. You can set this default using  --setting  like so: \n                 datasette mydatabase.db --setting default_page_size 50", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-facet-suggest-time-limit-ms", "page": "settings", "ref": "setting-facet-suggest-time-limit-ms", "title": "facet_suggest_time_limit_ms", "content": "When Datasette calculates suggested facets it needs to run a SQL query for every column in your table. The default for this time limit is 50ms to account for the fact that it needs to run once for every column. 
If the time limit is exceeded the column will not be suggested as a facet. \n                 You can increase this time limit like so: \n                 datasette mydatabase.db --setting facet_suggest_time_limit_ms 500", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-facet-time-limit-ms", "page": "settings", "ref": "setting-facet-time-limit-ms", "title": "facet_time_limit_ms", "content": "This is the time limit Datasette allows for calculating a facet, which defaults to 200ms: \n                 datasette mydatabase.db --setting facet_time_limit_ms 1000", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-force-https-urls", "page": "settings", "ref": "setting-force-https-urls", "title": "force_https_urls", "content": "Forces self-referential URLs in the JSON output to always use the  https:// \n                    protocol. This is useful for cases where the application itself is hosted using\n                    HTTP but is served to the outside world via a proxy that enables HTTPS. \n                 datasette mydatabase.db --setting force_https_urls 1", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-max-csv-mb", "page": "settings", "ref": "setting-max-csv-mb", "title": "max_csv_mb", "content": "The maximum size of CSV that can be exported, in megabytes. Defaults to 100MB.\n                    You can disable the limit entirely by setting this to 0: \n                 datasette mydatabase.db --setting max_csv_mb 0", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-max-insert-rows", "page": "settings", "ref": "setting-max-insert-rows", "title": "max_insert_rows", "content": "Maximum rows that can be inserted at a time using the bulk insert API, see  Inserting rows . Defaults to 100. 
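A client that needs to insert more rows than this limit allows can chunk its payload into multiple requests. A minimal sketch in Python - the chunk_rows helper is hypothetical, not part of Datasette, and it assumes a JSON request body of the shape described under Inserting rows:

```python
MAX_INSERT_ROWS = 100  # Datasette's default for this setting

def chunk_rows(all_rows, size=MAX_INSERT_ROWS):
    # Split the rows into request bodies small enough for
    # one bulk insert call each.
    for i in range(0, len(all_rows), size):
        yield {'rows': all_rows[i:i + size]}

records = [{'id': n, 'name': f'row {n}'} for n in range(250)]
payloads = list(chunk_rows(records))

# 250 rows -> three request bodies of 100, 100 and 50 rows
print([len(p['rows']) for p in payloads])  # prints: [100, 100, 50]
```

Each payload would then be POSTed to the insert endpoint in turn, staying under the server's configured limit.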
\n                 You can increase or decrease this limit like so: \n                 datasette mydatabase.db --setting max_insert_rows 1000", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-max-returned-rows", "page": "settings", "ref": "setting-max-returned-rows", "title": "max_returned_rows", "content": "Datasette returns a maximum of 1,000 rows of data at a time. If you execute a query that returns more than 1,000 rows, Datasette will return the first 1,000 and include a warning that the result set has been truncated. You can use OFFSET/LIMIT or other methods in your SQL to implement pagination if you need to return more than 1,000 rows. \n                 You can increase or decrease this limit like so: \n                 datasette mydatabase.db --setting max_returned_rows 2000", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-max-signed-tokens-ttl", "page": "settings", "ref": "setting-max-signed-tokens-ttl", "title": "max_signed_tokens_ttl", "content": "Maximum allowed expiry time for signed API tokens created by users. \n                 Defaults to  0  which means no limit - tokens can be created that will never expire. \n                 Set this to a value in seconds to limit the maximum expiry time. For example, to set that limit to 24 hours you would use: \n                 datasette mydatabase.db --setting max_signed_tokens_ttl 86400 \n                 This setting is enforced when incoming tokens are processed.", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-num-sql-threads", "page": "settings", "ref": "setting-num-sql-threads", "title": "num_sql_threads", "content": "Maximum number of threads in the thread pool Datasette uses to execute SQLite queries. Defaults to 3. 
\n                 datasette mydatabase.db --setting num_sql_threads 10 \n                 Setting this to 0 turns off threaded SQL queries entirely - useful for environments that do not support threading such as  Pyodide .", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[{\"href\": \"https://pyodide.org/\", \"label\": \"Pyodide\"}]"}, {"id": "settings:setting-publish-secrets", "page": "settings", "ref": "setting-publish-secrets", "title": "Using secrets with datasette publish", "content": "The  datasette publish  and  datasette package  commands both generate a secret for you automatically when Datasette is deployed. \n             This means that every time you deploy a new version of a Datasette project, a new secret will be generated. This will cause signed cookies to become invalid on every fresh deploy. \n             You can fix this by creating a secret that will be used for multiple deploys and passing it using the  --secret  option: \n             datasette publish cloudrun mydb.db --service=my-service --secret=cdb19e94283a20f9d42cca5", "breadcrumbs": "[\"Settings\"]", "references": "[]"}, {"id": "settings:setting-secret", "page": "settings", "ref": "setting-secret", "title": "Configuring the secret", "content": "Datasette uses a secret string to sign secure values such as cookies. \n             If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies and  API tokens  will not stay valid between restarts. \n             You can pass a secret to Datasette in two ways: with the  --secret  command-line option or by setting a  DATASETTE_SECRET  environment variable. 
\n             datasette mydb.db --secret=SECRET_VALUE_HERE \n             Or: \n             export DATASETTE_SECRET=SECRET_VALUE_HERE\ndatasette mydb.db \n             One way to generate a secure random secret is to use Python like this: \n             python3 -c 'import secrets; print(secrets.token_hex(32))'\ncdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52 \n             Plugin authors can make use of this signing mechanism in their plugins using the  datasette.sign()  and  datasette.unsign()  methods.", "breadcrumbs": "[\"Settings\"]", "references": "[]"}, {"id": "settings:setting-sql-time-limit-ms", "page": "settings", "ref": "setting-sql-time-limit-ms", "title": "sql_time_limit_ms", "content": "By default, queries have a time limit of one second. If a query takes longer than this to run Datasette will terminate the query and return an error. \n                 If this time limit is too short for you, you can customize it using the  sql_time_limit_ms  limit - for example, to increase it to 3.5 seconds: \n                 datasette mydatabase.db --setting sql_time_limit_ms 3500 \n                 You can optionally set a lower time limit for an individual query using the  ?_timelimit=100  query string argument: \n                 /my-database/my-table?qSpecies=44&_timelimit=100 \n                 This would set the time limit to 100ms for that specific query. This feature is useful if you are working with databases of unknown size and complexity - a query that might make perfect sense for a smaller table could take too long to execute on a table with millions of rows. By setting custom time limits you can execute queries \"optimistically\" - e.g. 
give me an exact count of rows matching this query but only if it takes less than 100ms to calculate.", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-suggest-facets", "page": "settings", "ref": "setting-suggest-facets", "title": "suggest_facets", "content": "Should Datasette calculate suggested facets? On by default, turn this off like so: \n                 datasette mydatabase.db --setting suggest_facets off", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:setting-template-debug", "page": "settings", "ref": "setting-template-debug", "title": "template_debug", "content": "This setting enables template context debug mode, which is useful to help understand what variables are available to custom templates when you are writing them. \n                 Enable it like this: \n                 datasette mydatabase.db --setting template_debug 1 \n                 Now you can add  ?_context=1  or  &_context=1  to any Datasette page to see the context that was passed to that template. 
\n                 Some examples: \n                 \n                     \n                         https://latest.datasette.io/?_context=1 \n                     \n                     \n                         https://latest.datasette.io/fixtures?_context=1 \n                     \n                     \n                         https://latest.datasette.io/fixtures/roadside_attractions?_context=1", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[{\"href\": \"https://latest.datasette.io/?_context=1\", \"label\": \"https://latest.datasette.io/?_context=1\"}, {\"href\": \"https://latest.datasette.io/fixtures?_context=1\", \"label\": \"https://latest.datasette.io/fixtures?_context=1\"}, {\"href\": \"https://latest.datasette.io/fixtures/roadside_attractions?_context=1\", \"label\": \"https://latest.datasette.io/fixtures/roadside_attractions?_context=1\"}]"}, {"id": "settings:setting-trace-debug", "page": "settings", "ref": "setting-trace-debug", "title": "trace_debug", "content": "This setting enables appending  ?_trace=1  to any page in order to see the SQL queries and other trace information that was used to generate that page. 
\n                 Enable it like this: \n                 datasette mydatabase.db --setting trace_debug 1 \n                 Some examples: \n                 \n                     \n                         https://latest.datasette.io/?_trace=1 \n                     \n                     \n                         https://latest.datasette.io/fixtures/roadside_attractions?_trace=1 \n                     \n                 \n                 See  datasette.tracer  for details on how to hook into this mechanism as a plugin author.", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[{\"href\": \"https://latest.datasette.io/?_trace=1\", \"label\": \"https://latest.datasette.io/?_trace=1\"}, {\"href\": \"https://latest.datasette.io/fixtures/roadside_attractions?_trace=1\", \"label\": \"https://latest.datasette.io/fixtures/roadside_attractions?_trace=1\"}]"}, {"id": "settings:setting-truncate-cells-html", "page": "settings", "ref": "setting-truncate-cells-html", "title": "truncate_cells_html", "content": "In the HTML table view, truncate any strings that are longer than this value.\n                    The full value will still be available in CSV, JSON and on the individual row\n                    HTML page. Set this to 0 to disable truncation. \n                 datasette mydatabase.db --setting truncate_cells_html 0", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"}, {"id": "settings:using-setting", "page": "settings", "ref": "using-setting", "title": "Using --setting", "content": "Datasette supports a number of settings. These can be set using the  --setting name value  option to  datasette serve . 
\n             You can set multiple settings at once like this: \n             datasette mydatabase.db \\\n  --setting default_page_size 50 \\\n  --setting sql_time_limit_ms 3500 \\\n  --setting max_returned_rows 2000 \n             Settings can also be specified  in the database.yaml configuration file .", "breadcrumbs": "[\"Settings\"]", "references": "[]"}], "truncated": false}