{"ok": true, "rows": [{"id": "authentication:actions", "page": "authentication", "ref": "actions", "title": "Built-in actions", "content": "This section lists all of the permission checks that are carried out by Datasette core, along with the  resource  if it was passed.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:actions-alter-table", "page": "authentication", "ref": "actions-alter-table", "title": "alter-table", "content": "Actor is allowed to alter a database table. \n                 \n                     \n                         resource  -  datasette.resources.TableResource(database, table) \n                         \n                             database  is the name of the database (string) \n                             table  is the name of the table (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-create-table", "page": "authentication", "ref": "actions-create-table", "title": "create-table", "content": "Actor is allowed to create a database table. \n                 \n                     \n                         resource  -  datasette.resources.DatabaseResource(database) \n                         \n                             database  is the name of the database (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-debug-menu", "page": "authentication", "ref": "actions-debug-menu", "title": "debug-menu", "content": "Controls if the various debug pages are displayed in the navigation menu.", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-delete-row", "page": "authentication", "ref": "actions-delete-row", "title": "delete-row", "content": "Actor is allowed to delete rows from a table. 
\n                 \n                     \n                         resource  -  datasette.resources.TableResource(database, table) \n                         \n                             database  is the name of the database (string) \n                             table  is the name of the table (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-drop-table", "page": "authentication", "ref": "actions-drop-table", "title": "drop-table", "content": "Actor is allowed to drop a database table. \n                 \n                     \n                         resource  -  datasette.resources.TableResource(database, table) \n                         \n                             database  is the name of the database (string) \n                             table  is the name of the table (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-execute-sql", "page": "authentication", "ref": "actions-execute-sql", "title": "execute-sql", "content": "Actor is allowed to run arbitrary SQL queries against a specific database, e.g.  
https://latest.datasette.io/fixtures/-/query?sql=select+100 \n                 \n                     \n                         resource  -  datasette.resources.DatabaseResource(database) \n                         \n                             database  is the name of the database (string) \n                         \n                     \n                 \n                 See also  the default_allow_sql setting .", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/-/query?sql=select+100\", \"label\": \"https://latest.datasette.io/fixtures/-/query?sql=select+100\"}]"}, {"id": "authentication:actions-insert-row", "page": "authentication", "ref": "actions-insert-row", "title": "insert-row", "content": "Actor is allowed to insert rows into a table. \n                 \n                     \n                         resource  -  datasette.resources.TableResource(database, table) \n                         \n                             database  is the name of the database (string) \n                             table  is the name of the table (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-permissions-debug", "page": "authentication", "ref": "actions-permissions-debug", "title": "permissions-debug", "content": "Actor is allowed to view the  /-/permissions  debug tools.", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-update-row", "page": "authentication", "ref": "actions-update-row", "title": "update-row", "content": "Actor is allowed to update rows in a table. 
\n                 \n                     \n                         resource  -  datasette.resources.TableResource(database, table) \n                         \n                             database  is the name of the database (string) \n                             table  is the name of the table (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[]"}, {"id": "authentication:actions-view-database", "page": "authentication", "ref": "actions-view-database", "title": "view-database", "content": "Actor is allowed to view a database page, e.g.  https://latest.datasette.io/fixtures \n                 \n                     \n                         resource  -  datasette.resources.DatabaseResource(database) \n                         \n                             database  is the name of the database (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures\", \"label\": \"https://latest.datasette.io/fixtures\"}]"}, {"id": "authentication:actions-view-database-download", "page": "authentication", "ref": "actions-view-database-download", "title": "view-database-download", "content": "Actor is allowed to download a database, e.g.  
https://latest.datasette.io/fixtures.db \n                 \n                     \n                         resource  -  datasette.resources.DatabaseResource(database) \n                         \n                             database  is the name of the database (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures.db\", \"label\": \"https://latest.datasette.io/fixtures.db\"}]"}, {"id": "authentication:actions-view-instance", "page": "authentication", "ref": "actions-view-instance", "title": "view-instance", "content": "Top level permission - Actor is allowed to view any pages within this instance, starting at  https://latest.datasette.io/", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[{\"href\": \"https://latest.datasette.io/\", \"label\": \"https://latest.datasette.io/\"}]"}, {"id": "authentication:actions-view-query", "page": "authentication", "ref": "actions-view-query", "title": "view-query", "content": "Actor is allowed to view (and execute) a  canned query  page, e.g.  https://latest.datasette.io/fixtures/pragma_cache_size  - this includes executing  Writable canned queries . \n                 \n                     \n                         resource  -  datasette.resources.QueryResource(database, query) \n                         \n                             database  is the name of the database (string) \n                             query  is the name of the canned query (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/pragma_cache_size\", \"label\": \"https://latest.datasette.io/fixtures/pragma_cache_size\"}]"}, {"id": "authentication:actions-view-table", "page": "authentication", "ref": "actions-view-table", "title": "view-table", "content": "Actor is allowed to view a table (or view) page, e.g.  
https://latest.datasette.io/fixtures/complex_foreign_keys \n                 \n                     \n                         resource  -  datasette.resources.TableResource(database, table) \n                         \n                             database  is the name of the database (string) \n                             table  is the name of the table (string)", "breadcrumbs": "[\"Authentication and permissions\", \"Built-in actions\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/complex_foreign_keys\", \"label\": \"https://latest.datasette.io/fixtures/complex_foreign_keys\"}]"}, {"id": "authentication:allowdebugview", "page": "authentication", "ref": "allowdebugview", "title": "The /-/allow-debug tool", "content": "The  /-/allow-debug  tool lets you try out different   \"action\"  blocks against different  \"actor\"  JSON objects. You can try that out here:  https://latest.datasette.io/-/allow-debug", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/-/allow-debug\", \"label\": \"https://latest.datasette.io/-/allow-debug\"}]"}, {"id": "authentication:allowedresourcesview", "page": "authentication", "ref": "allowedresourcesview", "title": "Allowed resources view", "content": "The  /-/allowed  endpoint displays resources that the current actor can access for a specified  action . \n                 This endpoint provides an interactive HTML form interface. Add  .json  to the URL path (e.g.  /-/allowed.json ) to get the raw JSON response instead. \n                 Pass  ?action=view-table  (or another action) to select the action. Optional  parent=  and  child=  query parameters can narrow the results to a specific database/table pair. \n                 This endpoint is publicly accessible to help users understand their own permissions. 
The potentially sensitive  reason  field is only shown to users with the  permissions-debug  permission - it shows the plugins and explanatory reasons that were responsible for each decision.", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions debug tools\"]", "references": "[]"}, {"id": "authentication:authentication", "page": "authentication", "ref": "authentication", "title": "Authentication and permissions", "content": "Datasette doesn't require authentication by default. Any visitor to a Datasette instance can explore the full data and execute read-only SQL queries. \n         Datasette can be configured to only allow authenticated users, or to control which databases, tables, and queries can be accessed by the public or by specific users. Datasette's plugin system can be used to add many different styles of authentication, such as user accounts, single sign-on or API keys.", "breadcrumbs": "[]", "references": "[]"}, {"id": "authentication:authentication-actor", "page": "authentication", "ref": "authentication-actor", "title": "Actors", "content": "Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API clients (via authentication tokens). The word \"actor\" is used to cover both of these cases. \n             Every request to Datasette has an associated actor value, available in the code as  request.actor . This can be  None  for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API clients. \n             The actor dictionary can be any shape - the design of that data structure is left up to the plugins. Actors should always include a unique  \"id\"  string, as demonstrated by the \"root\" actor below. 
\n             Plugins can use the  actor_from_request(datasette, request)  hook to implement custom logic for authenticating an actor based on the incoming HTTP request.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:authentication-actor-matches-allow", "page": "authentication", "ref": "authentication-actor-matches-allow", "title": "actor_matches_allow()", "content": "Plugins that wish to implement this same  \"allow\"  block permissions scheme can take advantage of the  datasette.utils.actor_matches_allow(actor, allow)  function: \n             from datasette.utils import actor_matches_allow\n\nactor_matches_allow({\"id\": \"root\"}, {\"id\": \"*\"})\n# returns True \n             The currently authenticated actor is made available to plugins as  request.actor .", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:authentication-cli-create-token", "page": "authentication", "ref": "authentication-cli-create-token", "title": "datasette create-token", "content": "You can also create tokens on the command line using the  datasette create-token  command. \n                 This command takes one required argument - the ID of the actor to be associated with the created token. \n                 You can specify a  -e/--expires-after  option in seconds. If omitted, the token will never expire. \n                 The command will sign the token using the  DATASETTE_SECRET  environment variable, if available. You can also pass the secret using the  --secret  option. \n                 This means you can run the command locally to create tokens for use with a deployed Datasette instance, provided you know that instance's secret. 
\n                 To create a token for the  root  actor that will expire in one hour: \n                 datasette create-token root --expires-after 3600 \n                 To create a token that never expires using a specific secret: \n                 datasette create-token root --secret my-secret-goes-here", "breadcrumbs": "[\"Authentication and permissions\", \"API Tokens\"]", "references": "[]"}, {"id": "authentication:authentication-cli-create-token-restrict", "page": "authentication", "ref": "authentication-cli-create-token-restrict", "title": "Restricting the actions that a token can perform", "content": "Tokens created using  datasette create-token ACTOR_ID  will inherit all of the permissions of the actor that they are associated with. \n                     You can pass additional options to create tokens that are restricted to a subset of that actor's permissions. \n                     To restrict the token to just specific permissions against all available databases, use the  --all  option: \n                     datasette create-token root --all insert-row --all update-row \n                     This option can be passed as many times as you like. In the above example the token will only be allowed to insert and update rows. \n                     You can also restrict permissions such that they can only be used within specific databases: \n                     datasette create-token root --database mydatabase insert-row \n                     The resulting token will only be able to insert rows, and only to tables in the  mydatabase  database. \n                     Finally, you can restrict permissions to individual resources - tables, SQL views and  named queries  - within a specific database: \n                     datasette create-token root --resource mydatabase mytable insert-row \n                     These options have short versions:  -a  for  --all ,  -d  for  --database  and  -r  for  --resource . 
\n                     You can add  --debug  to see a JSON representation of the token that has been created. Here's a full example: \n                     datasette create-token root \\\n    --secret mysecret \\\n    --all view-instance \\\n    --all view-table \\\n    --database docs view-query \\\n    --resource docs documents insert-row \\\n    --resource docs documents update-row \\\n    --debug \n                     This example outputs the following: \n                     dstok_.eJxFizEKgDAMRe_y5w4qYrFXERGxDkVsMI0uxbubdjFL8l_ez1jhwEQCA6Fjjxp90qtkuHawzdjYrh8MFobLxZ_wBH0_gtnAF-hpS5VfmF8D_lnd97lHqUJgLd6sls4H1qwlhA.nH_7RecYHj5qSzvjhMU95iy0Xlc\n\nDecoded:\n\n{\n  \"a\": \"root\",\n  \"token\": \"dstok\",\n  \"t\": 1670907246,\n  \"_r\": {\n    \"a\": [\n      \"vi\",\n      \"vt\"\n    ],\n    \"d\": {\n      \"docs\": [\n        \"vq\"\n      ]\n    },\n    \"r\": {\n      \"docs\": {\n        \"documents\": [\n          \"ir\",\n          \"ur\"\n        ]\n      }\n    }\n  }\n} \n                     Restrictions act as an allowlist layered on top of the actor's existing\n                        permissions. They can only remove access the actor would otherwise have\u2014they\n                        cannot grant new access. If the underlying actor is denied by  allow  rules in\n                         datasette.yaml  or by a plugin, a token that lists that resource in its\n                         \"_r\"  section will still be denied. 
\n                     To create tokens with restrictions in Python code, use the  TokenRestrictions  builder and pass it to  datasette.create_token() .", "breadcrumbs": "[\"Authentication and permissions\", \"API Tokens\", \"datasette create-token\"]", "references": "[]"}, {"id": "authentication:authentication-default-deny", "page": "authentication", "ref": "authentication-default-deny", "title": "Denying all permissions by default", "content": "By default, Datasette allows unauthenticated access to view databases, tables, and execute SQL queries. \n                 You may want to run Datasette in a mode where  all  access is denied by default, and you explicitly grant permissions only to authenticated users, either using the  --root mechanism  or through  configuration file rules  or plugins. \n                 Use the  --default-deny  command-line option to run Datasette in this mode: \n                 datasette --default-deny data.db --root \n                 With  --default-deny  enabled: \n                 \n                     \n                         Anonymous users are denied access to view the instance, databases, tables, and queries \n                     \n                     \n                         Authenticated users are also denied access unless they're explicitly granted permissions \n                     \n                     \n                         The root user (when using  --root ) still has access to everything \n                     \n                     \n                         You can grant permissions using  configuration file rules  or plugins \n                     \n                 \n                 For example, to allow only a specific user to access your instance: \n                 datasette --default-deny data.db --config datasette.yaml \n                 Where  datasette.yaml  contains: \n                 allow:\n  id: alice \n                 This configuration will deny access to everyone except the user with  id 
 of  alice .", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[]"}, {"id": "authentication:authentication-ds-actor", "page": "authentication", "ref": "authentication-ds-actor", "title": "The ds_actor cookie", "content": "Datasette includes a default authentication plugin which looks for a signed  ds_actor  cookie containing a JSON actor dictionary. This is how the  root actor  mechanism works. \n             Authentication plugins can set signed  ds_actor  cookies themselves like so: \n             response = Response.redirect(\"/\")\ndatasette.set_actor_cookie(response, {\"id\": \"cleopaws\"}) \n             The shape of data encoded in the cookie is as follows: \n             {\n  \"a\": {\n    \"id\": \"cleopaws\"\n  }\n} \n             To implement logout in a plugin, use the  delete_actor_cookie()  method: \n             response = Response.redirect(\"/\")\ndatasette.delete_actor_cookie(response)", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:authentication-ds-actor-expiry", "page": "authentication", "ref": "authentication-ds-actor-expiry", "title": "Including an expiry time", "content": "ds_actor  cookies can optionally include a signed expiry timestamp, after which the cookies will no longer be valid. Authentication plugins may choose to use this mechanism to limit the lifetime of the cookie. For example, if a plugin implements single-sign-on against another source it may decide to set short-lived cookies so that if the user is removed from the SSO system their existing Datasette cookies will stop working shortly afterwards. \n                 To include an expiry, pass  expire_after=  to  datasette.set_actor_cookie()  with a number of seconds. 
For example, to expire in 24 hours: \n                 response = Response.redirect(\"/\")\ndatasette.set_actor_cookie(\n    response, {\"id\": \"cleopaws\"}, expire_after=60 * 60 * 24\n) \n                 The resulting cookie will encode data that looks something like this: \n                 {\n  \"a\": {\n    \"id\": \"cleopaws\"\n  },\n  \"e\": \"1jjSji\"\n}", "breadcrumbs": "[\"Authentication and permissions\", \"The ds_actor cookie\"]", "references": "[]"}, {"id": "authentication:authentication-permissions", "page": "authentication", "ref": "authentication-permissions", "title": "Permissions", "content": "Datasette's permissions system is built around SQL queries. Datasette and its plugins construct SQL queries to resolve the list of resources that an actor can access. \n             The key question the permissions system answers is this: \n             \n                 Is this  actor  allowed to perform this  action , optionally against this particular  resource ? \n             \n             Actors  are  described above . \n             An  action  is a string describing the action the actor would like to perform. A full list is  provided below  - examples include  view-table  and  execute-sql . \n             A  resource  is the item the actor wishes to interact with - for example a specific database or table. Some actions, such as  permissions-debug , are not associated with a particular resource. \n             Datasette's built-in view actions ( view-database ,  view-table  etc) are allowed by Datasette's default configuration: unless you  configure additional permission rules  unauthenticated users will be allowed to access content. 
\n             Other actions, including those introduced by plugins, will default to  deny .", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:authentication-permissions-allow", "page": "authentication", "ref": "authentication-permissions-allow", "title": "Defining permissions with \"allow\" blocks", "content": "One way to define permissions in Datasette is to use an  \"allow\"  block  in the datasette.yaml file . This is a JSON document describing which actors are allowed to perform an action against a specific resource. \n                 Each  allow  block is compiled into SQL and combined with any\n                     plugin-provided rules  to produce\n                    the cascading allow/deny decisions that power  await .allowed(*, action, resource, actor=None) . \n                 The most basic form of allow block is this ( allow demo ,  deny demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow:\n      id: root\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 This will match any actors with an  \"id\"  property of  \"root\"  - for example, an actor that looks like this: \n                 {\n    \"id\": \"root\",\n    \"name\": \"Root User\"\n} \n                 An allow block can specify \"deny all\" using  false  ( demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow: false\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 An  \"allow\"  of  true  allows all access ( demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow: true\n    \"\"\").strip(),\n    \"YAML\", 
\"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 Allow keys can provide a list of values. These will match any actor that has any of those values ( allow demo ,  deny demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow:\n      id:\n      - simon\n      - cleopaws\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 This will match any actor with an  \"id\"  of either  \"simon\"  or  \"cleopaws\" . \n                 Actors can have properties that feature a list of values. These will be matched against the list of values in an allow block. Consider the following actor: \n                 {\n    \"id\": \"simon\",\n    \"roles\": [\"staff\", \"developer\"]\n} \n                 This allow block will provide access to any actor that has  \"developer\"  as one of their roles ( allow demo ,  deny demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow:\n      roles:\n      - developer\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 Note that \"roles\" is not a concept that is baked into Datasette - it's a convention that plugins can choose to implement and act on. \n                 If you want to provide access to any actor with a value for a specific key, use  \"*\" . 
For example, to match any logged-in user specify the following ( allow demo ,  deny demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow:\n      id: \"*\"\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 You can specify that only unauthenticated actors (from anonymous HTTP requests) should be allowed access using the special  \"unauthenticated\": true  key in an allow block ( allow demo ,  deny demo ): \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow:\n      unauthenticated: true\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 Allow keys act as an \"or\" mechanism. An actor will be able to execute the query if any of their JSON properties match any of the values in the corresponding lists in the  allow  block. 
The following block will allow users with either a  role  of  \"ops\"  OR users who have an  id  of  \"simon\"  or  \"cleopaws\" : \n                 [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n  \"\"\"\n    allow:\n      id:\n      - simon\n      - cleopaws\n      role: ops\n    \"\"\").strip(),\n    \"YAML\", \"JSON\"\n  ) \n                 ]]] \n                 [[[end]]] \n                 Demo for cleopaws ,  demo for ops role ,  demo for an actor matching neither rule .", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22root%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22trevor%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=false\", \"label\": \"demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=true\", \"label\": \"demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22pancakes%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": 
\"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%2C%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22staff%22%2C%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%2C%0D%0A++++%22roles%22%3A+%5B%22dog%22%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22bot%22%3A+%22readme-bot%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=null&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22hello%22%0D%0A%7D&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"Demo for cleopaws\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22trevor%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22ops%22%2C%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"demo for ops 
role\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22percy%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"demo for an actor matching neither rule\"}]"}, {"id": "authentication:authentication-permissions-config", "page": "authentication", "ref": "authentication-permissions-config", "title": "Access permissions in datasette.yaml", "content": "There are two ways to configure permissions using  datasette.yaml  (or  datasette.json ). \n             For simple visibility permissions you can use  \"allow\"  blocks in the root, database, table and query sections. \n             For other permissions you can use a  \"permissions\"  block, described  in the next section . \n             You can limit who is allowed to view different parts of your Datasette instance using  \"allow\"  keys in your  Configuration . 
\n             You can control the following: \n             \n                 \n                     Access to the entire Datasette instance \n                 \n                 \n                     Access to specific databases \n                 \n                 \n                     Access to specific tables and views \n                 \n                 \n                     Access to specific  Canned queries \n                 \n             \n             If a user has permission to view a table they will be able to view that table, regardless of whether they have permission to view the database or instance that the table exists within.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:authentication-permissions-database", "page": "authentication", "ref": "authentication-permissions-database", "title": "Access to specific databases", "content": "To limit access to a specific  private.db  database to just authenticated users, use the  \"allow\"  block like this: \n                 [[[cog\nconfig_example(cog, \"\"\"\n    databases:\n      private:\n        allow:\n          id: \"*\"\n\"\"\") \n                 ]]] \n                 [[[end]]]", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in datasette.yaml\"]", "references": "[]"}, {"id": "authentication:authentication-permissions-execute-sql", "page": "authentication", "ref": "authentication-permissions-execute-sql", "title": "Controlling the ability to execute arbitrary SQL", "content": "Datasette defaults to allowing any site visitor to execute their own custom SQL queries, for example using the form on  the database page  or by appending a  ?_where=  parameter to the table page  like this . \n                 Access to this ability is controlled by the  execute-sql  permission. \n                 The easiest way to disable arbitrary SQL queries is using the  default_allow_sql setting  when you first start Datasette running. 
\n                 You can alternatively use an  \"allow_sql\"  block to control who is allowed to execute arbitrary SQL queries. \n                 To prevent any user from executing arbitrary SQL queries, use this: \n                 [[[cog\nconfig_example(cog, \"\"\"\n    allow_sql: false\n\"\"\") \n                 ]]] \n                 [[[end]]] \n                 To enable just the  root user  to execute SQL for all databases in your instance, use the following: \n                 [[[cog\nconfig_example(cog, \"\"\"\n    allow_sql:\n      id: root\n\"\"\") \n                 ]]] \n                 [[[end]]] \n                 To limit this ability to just one specific database, use this: \n                 [[[cog\nconfig_example(cog, \"\"\"\n    databases:\n      mydatabase:\n        allow_sql:\n          id: root\n\"\"\") \n                 ]]] \n                 [[[end]]]", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in \"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures\", \"label\": \"the database page\"}, {\"href\": \"https://latest.datasette.io/fixtures/facetable?_where=_city_id=1\", \"label\": \"like this\"}]"}, {"id": "authentication:authentication-permissions-explained", "page": "authentication", "ref": "authentication-permissions-explained", "title": "How permissions are resolved", "content": "Datasette performs permission checks using the internal  await .allowed(*, action, resource, actor=None)  method, which accepts keyword arguments for  action ,  resource  and an optional  actor . \n                 resource  should be an instance of the appropriate  Resource  subclass from  datasette.resources \u2014for example  InstanceResource() ,  DatabaseResource(database=\"...\")  or  TableResource(database=\"...\", table=\"...\") . This defaults to  InstanceResource()  if not specified. 
\n                 When a check runs Datasette gathers allow/deny rules from multiple sources and\n                    compiles them into a SQL query. The resulting query describes all of the\n                    resources an actor may access for that action, together with the reasons those\n                    resources were allowed or denied. The combined sources are: \n                 \n                     \n                         allow  blocks configured in  datasette.yaml . \n                     \n                     \n                         Actor restrictions  encoded into the actor dictionary or API token. \n                     \n                     \n                         The \"root\" user shortcut when  --root  (or  Datasette.root_enabled ) is active, answering  True  to all permission checks unless configuration rules deny them at a more specific level. \n                     \n                     \n                         Any additional SQL provided by plugins implementing  permission_resources_sql(datasette, actor, action) . \n                     \n                 \n                 Datasette evaluates the SQL to determine if the requested  resource  is\n                    included. 
Explicit deny rules returned by configuration or plugins will block\n                    access even if other rules allowed it.", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[]"}, {"id": "authentication:authentication-permissions-instance", "page": "authentication", "ref": "authentication-permissions-instance", "title": "Access to an instance", "content": "Here's how to restrict access to your entire Datasette instance to just the  \"id\": \"root\"  user: \n                 [[[cog\nfrom metadata_doc import config_example\nconfig_example(cog, \"\"\"\n    title: My private Datasette instance\n    allow:\n      id: root\n  \"\"\") \n                 ]]] \n                 [[[end]]] \n                 To deny access to all users, you can use  \"allow\": false : \n                 [[[cog\nconfig_example(cog, \"\"\"\n    title: My entirely inaccessible instance\n    allow: false\n\"\"\") \n                 ]]] \n                 [[[end]]] \n                 One reason to do this is if you are using a Datasette plugin - such as  datasette-permissions-sql  - to control permissions instead.", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in \"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-permissions-sql\", \"label\": \"datasette-permissions-sql\"}]"}, {"id": "authentication:authentication-permissions-other", "page": "authentication", "ref": "authentication-permissions-other", "title": "Other permissions in ", "content": "For all other permissions, you can use one or more  \"permissions\"  blocks in your  datasette.yaml  configuration file. 
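The resolution order described above (rules gathered from configuration, actor restrictions and plugins, with an explicit deny overriding any allow) can be sketched in plain Python. This is an illustrative model only, not Datasette's actual SQL-based implementation:

```python
# Illustrative model: allow/deny rules from every source are combined,
# and an explicit deny blocks access even when another rule allows it.
# This is a sketch, not Datasette's real permission engine.

def resolve(rules):
    """rules is a list of (verdict, reason) pairs, verdict 'allow' or 'deny'."""
    allowed = False
    reason = None
    for verdict, why in rules:
        if verdict == "deny":
            return False, why  # explicit deny wins outright
        if verdict == "allow" and not allowed:
            allowed, reason = True, why
    return allowed, reason

# A config allow block grants access, but a plugin rule denies this resource:
print(resolve([("allow", "allow block"), ("deny", "plugin rule")]))
# → (False, 'plugin rule')
```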
\n             To grant access to the  permissions debug tool  to all signed in users, you can grant  permissions-debug  to any actor with an  id  matching the wildcard  *  by adding this at the root of your configuration: \n             [[[cog\nconfig_example(cog, \"\"\"\n    permissions:\n      permissions-debug:\n        id: '*'\n\"\"\") \n             ]]] \n             [[[end]]] \n             To grant  create-table  to the user with  id  of  editor  for the  docs  database: \n             [[[cog\nconfig_example(cog, \"\"\"\n    databases:\n      docs:\n        permissions:\n          create-table:\n            id: editor\n\"\"\") \n             ]]] \n             [[[end]]] \n             And for  insert-row  against the  reports  table in that  docs  database: \n             [[[cog\nconfig_example(cog, \"\"\"\n    databases:\n      docs:\n        tables:\n          reports:\n            permissions:\n              insert-row:\n                id: editor\n\"\"\") \n             ]]] \n             [[[end]]] \n             The  permissions debug tool  can be useful for helping test permissions that you have configured in this way.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:authentication-permissions-query", "page": "authentication", "ref": "authentication-permissions-query", "title": "Access to specific canned queries", "content": "Canned queries  allow you to configure named SQL queries in your  datasette.yaml  that can be executed by users. These queries can be set up to both read and write to the database, so controlling who can execute them can be important. 
\n                 To limit access to the  add_name  canned query in your  dogs.db  database to just the  root user : \n                 [[[cog\nconfig_example(cog, \"\"\"\n    databases:\n      dogs:\n        queries:\n          add_name:\n            sql: INSERT INTO names (name) VALUES (:name)\n            write: true\n            allow:\n              id:\n              - root\n\"\"\") \n                 ]]] \n                 [[[end]]]", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in \"]", "references": "[]"}, {"id": "authentication:authentication-permissions-table", "page": "authentication", "ref": "authentication-permissions-table", "title": "Access to specific tables and views", "content": "To limit access to the  users  table in your  bakery.db  database: \n                 [[[cog\nconfig_example(cog, \"\"\"\n    databases:\n      bakery:\n        tables:\n          users:\n            allow:\n              id: '*'\n\"\"\") \n                 ]]] \n                 [[[end]]] \n                 This works for SQL views as well - you can list their names in the  \"tables\"  block above in the same way as regular tables. \n                 \n                     Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries,  like this  for example. 
\n                     If you are restricting access to specific tables you should also use the  \"allow_sql\"  block to prevent users from bypassing the limit with their own SQL queries - see  Controlling the ability to execute arbitrary SQL .", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in \"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures?sql=select+*+from+facetable\", \"label\": \"like this\"}]"}, {"id": "authentication:authentication-root", "page": "authentication", "ref": "authentication-root", "title": "Using the \"root\" actor", "content": "Datasette currently leaves almost all forms of authentication to plugins -  datasette-auth-github  for example. \n                 The one exception is the \"root\" account, which you can sign into while using Datasette on your local machine. The root user has  all permissions  - they can perform any action regardless of other permission rules. \n                 The  --root  flag is designed for local development and testing. When you start Datasette with  --root , the root user automatically receives every permission, including: \n                 \n                     \n                         All view permissions ( view-instance ,  view-database ,  view-table , etc.) \n                     \n                     \n                         All write permissions ( insert-row ,  update-row ,  delete-row ,  create-table ,  alter-table ,  drop-table ) \n                     \n                     \n                         Debug permissions ( permissions-debug ,  debug-menu ) \n                     \n                     \n                         Any custom permissions defined by plugins \n                     \n                 \n                 If you add explicit deny rules in  datasette.yaml  those can still block the\n                    root actor from specific databases or tables. 
\n                 The  --root  flag sets an internal  root_enabled  switch\u2014without it, a signed-in user with  {\"id\": \"root\"}  is treated like any other actor. \n                 To sign in as root, start Datasette using the  --root  command-line option, like this: \n                 datasette --root \n                 Datasette will output a single-use-only login URL on startup: \n                 http://127.0.0.1:8001/-/auth-token?token=786fc524e0199d70dc9a581d851f466244e114ca92f33aa3b42a139e9388daa7\nINFO:     Started server process [25801]\nINFO:     Waiting for application startup.\nINFO:     Application startup complete.\nINFO:     Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) \n                 Click on that link and then visit  http://127.0.0.1:8001/-/actor  to confirm that you are authenticated as an actor that looks like this: \n                 {\n    \"id\": \"root\"\n}", "breadcrumbs": "[\"Authentication and permissions\", \"Actors\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}]"}, {"id": "authentication:createtokenview", "page": "authentication", "ref": "createtokenview", "title": "API Tokens", "content": "Datasette includes a default mechanism for generating API tokens that can be used to authenticate requests. \n             Authenticated users can create new API tokens using a form on the  /-/create-token  page. \n             Tokens created in this way can be further restricted to only allow access to specific actions, or to limit those actions to specific databases, tables or queries. \n             Created tokens can then be passed in the  Authorization: Bearer $token  header of HTTP requests to Datasette. \n             A token created by a user will include that user's  \"id\"  in the token payload, so any permissions granted to that user based on their ID can be made available to the token as well. 
\n             When one of these tokens accompanies a request, the actor for that request will have the following shape: \n             {\n    \"id\": \"user_id\",\n    \"token\": \"dstok\",\n    \"token_expires\": 1667717426\n} \n             The  \"id\"  field duplicates the ID of the actor who first created the token. \n             The  \"token\"  field identifies that this actor was authenticated using a Datasette signed token ( dstok ). \n             The  \"token_expires\"  field, if present, indicates that the token will expire after that integer timestamp. \n             The  /-/create-token  page cannot be accessed by actors that are authenticated with a  \"token\": \"some-value\"  property. This is to prevent API tokens from being used to create more tokens. \n             Datasette plugins that implement their own form of API token authentication should follow this convention. \n             You can disable the signed token feature entirely using the  allow_signed_tokens  setting.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:logoutview", "page": "authentication", "ref": "logoutview", "title": "The /-/logout page", "content": "The page at  /-/logout  provides the ability to log out of a  ds_actor  cookie authentication session.", "breadcrumbs": "[\"Authentication and permissions\", \"The ds_actor cookie\"]", "references": "[]"}, {"id": "authentication:permissioncheckview", "page": "authentication", "ref": "permissioncheckview", "title": "Permission check view", "content": "The  /-/check  endpoint evaluates a single action/resource pair and returns information indicating whether the access was allowed along with diagnostic information. \n                 This endpoint provides an interactive HTML form interface. Add  .json  to the URL path (e.g.  /-/check.json?action=view-instance ) to get the raw JSON response instead. 
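The token actor shape above includes an optional integer `token_expires` timestamp. A minimal sketch of checking it (the helper name is ours, not part of Datasette's API; the sample actor values come from the documentation):

```python
import time

# Check whether a token actor has expired based on its "token_expires"
# field. The actor dictionary mirrors the documented shape; the helper
# function is an illustrative sketch, not a Datasette API.
actor = {"id": "user_id", "token": "dstok", "token_expires": 1667717426}

def token_expired(actor, now=None):
    expires = actor.get("token_expires")
    if expires is None:
        return False  # actors without the field never expire
    return (time.time() if now is None else now) >= expires

print(token_expired(actor))  # the sample timestamp is in the past
```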
\n                 Pass  ?action=  to specify the action to check, and optional  ?parent=  and  ?child=  parameters to specify the resource.", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions debug tools\"]", "references": "[]"}, {"id": "authentication:permissionrulesview", "page": "authentication", "ref": "permissionrulesview", "title": "Permission rules view", "content": "The  /-/rules  endpoint displays all permission rules (both allow and deny) for each candidate resource for the requested action. \n                 This endpoint provides an interactive HTML form interface. Add  .json  to the URL path (e.g.  /-/rules.json?action=view-table ) to get the raw JSON response instead. \n                 Pass  ?action=  as a query parameter to specify which action to check. \n                 This endpoint requires the  permissions-debug  permission.", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions debug tools\"]", "references": "[]"}, {"id": "authentication:permissions-plugins", "page": "authentication", "ref": "permissions-plugins", "title": "Checking permissions in plugins", "content": "Datasette plugins can check if an actor has permission to perform an action using  await .allowed(*, action, resource, actor=None) \u2014for example: \n             from datasette.resources import TableResource\n\ncan_edit = await datasette.allowed(\n    action=\"update-row\",\n    resource=TableResource(database=\"fixtures\", table=\"facetable\"),\n    actor=request.actor,\n) \n             Use  await .ensure_permission(action, resource=None, actor=None)  when you need to enforce a permission and\n                raise a  Forbidden  error automatically. 
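The `?action=`, `?parent=` and `?child=` parameters for the `/-/check.json` endpoint described above can be assembled into a URL like this (the host and port are a hypothetical local instance):

```python
from urllib.parse import urlencode

# Build a /-/check.json URL from the documented query parameters.
# The base URL is an assumed local Datasette instance.
base = "http://127.0.0.1:8001/-/check.json"
params = {"action": "view-table", "parent": "fixtures", "child": "facetable"}
url = base + "?" + urlencode(params)
print(url)
```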
\n             Plugins that define new operations should return  Action \n                objects from  register_actions(datasette)  and can supply additional allow/deny\n                rules by returning  PermissionSQL  objects from the\n                 permission_resources_sql(datasette, actor, action)  hook. Those rules are merged with\n                configuration  allow  blocks and actor restrictions to determine the final\n                result for each check.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "authentication:permissionsdebugview", "page": "authentication", "ref": "permissionsdebugview", "title": "Permissions debug tools", "content": "The debug tool at  /-/permissions  is available to any actor with the  permissions-debug  permission. By default this is just the  authenticated root user  but you can open it up to all users by starting Datasette like this: \n             datasette -s permissions.permissions-debug true data.db \n             The page shows the permission checks that have been carried out by the Datasette instance. \n             It also provides an interface for running hypothetical permission checks against a hypothetical actor. This is a useful way of confirming that your configured permissions work in the way you expect. \n             This is designed to help administrators and plugin authors understand exactly how permission checks are being carried out, in order to effectively configure Datasette's permission system.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"}, {"id": "binary_data:binary", "page": "binary_data", "ref": "binary", "title": "Binary data", "content": "SQLite tables can contain binary data in  BLOB  columns. \n         Datasette includes special handling for these binary values. 
The Datasette interface detects binary values and provides a link to download their content, for example on  https://latest.datasette.io/fixtures/binary_data \n         \n         Binary data is represented in  .json  exports using Base64 encoding. \n         https://latest.datasette.io/fixtures/binary_data.json?_shape=array \n         [\n    {\n        \"rowid\": 1,\n        \"data\": {\n            \"$base64\": true,\n            \"encoded\": \"FRwCx60F/g==\"\n        }\n    },\n    {\n        \"rowid\": 2,\n        \"data\": {\n            \"$base64\": true,\n            \"encoded\": \"FRwDx60F/g==\"\n        }\n    },\n    {\n        \"rowid\": 3,\n        \"data\": null\n    }\n]", "breadcrumbs": "[]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/binary_data\", \"label\": \"https://latest.datasette.io/fixtures/binary_data\"}, {\"href\": \"https://latest.datasette.io/fixtures/binary_data.json?_shape=array\", \"label\": \"https://latest.datasette.io/fixtures/binary_data.json?_shape=array\"}]"}, {"id": "binary_data:binary-linking", "page": "binary_data", "ref": "binary-linking", "title": "Linking to binary downloads", "content": "The  .blob  output format is used to return binary data. It requires a  _blob_column=  query string argument specifying which BLOB column should be downloaded, for example: \n             https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data \n             This output format can also be used to return binary data from an arbitrary SQL query. Since such queries do not specify an exact row, an additional  ?_blob_hash=  parameter can be used to specify the SHA-256 hash of the value that is being linked to. \n             Consider the query  select data from binary_data  -  demonstrated here . \n             That page links to the binary value downloads. 
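The `$base64` wrapper shown in the `.json` export above can be decoded back to raw bytes with the standard library; the encoded value here is taken from that example, and the helper name is ours:

```python
import base64

# Decode a BLOB cell from Datasette's .json export format, which wraps
# binary values as {"$base64": true, "encoded": "..."}. The helper is an
# illustrative sketch, not a Datasette API.
def decode_blob(cell):
    if isinstance(cell, dict) and cell.get("$base64"):
        return base64.b64decode(cell["encoded"])
    return cell  # non-binary cells (including null) pass through unchanged

blob = decode_blob({"$base64": True, "encoded": "FRwCx60F/g=="})
print(len(blob))  # → 7
```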
Those links look like this: \n             https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d \n             These  .blob  links are also returned in the  .csv  exports Datasette provides for binary tables and queries, since the CSV format does not have a mechanism for representing binary data.", "breadcrumbs": "[\"Binary data\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data\", \"label\": \"https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data\"}, {\"href\": \"https://latest.datasette.io/fixtures?sql=select+data+from+binary_data\", \"label\": \"demonstrated here\"}, {\"href\": \"https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d\", \"label\": \"https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d\"}]"}, {"id": "binary_data:binary-plugins", "page": "binary_data", "ref": "binary-plugins", "title": "Binary plugins", "content": "Several Datasette plugins are available that change the way Datasette treats binary data. \n             \n                 \n                     datasette-render-binary  modifies Datasette's default interface to show an automatic guess at what type of binary data is being stored, along with a visual representation of the binary value that displays ASCII strings directly in the interface. \n                 \n                 \n                     datasette-render-images  detects common image formats and renders them as images directly in the Datasette interface. 
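The `?_blob_hash=` value in links like the one above is the SHA-256 hex digest of the binary value. A sketch of computing it (the blob bytes here are a made-up stand-in for a real BLOB value):

```python
import hashlib

# Compute the SHA-256 hex digest that the ?_blob_hash= parameter expects.
# The bytes below are illustrative, not a real value from the fixtures.
blob = b"\x15\x1c\x02\xc7\xad\x05\xfe"
blob_hash = hashlib.sha256(blob).hexdigest()
print(blob_hash)  # 64 hex characters
```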
\n                 \n                 \n                     datasette-media  allows Datasette interfaces to be configured to serve binary files from configured SQL queries, and includes the ability to resize images directly before serving them.", "breadcrumbs": "[\"Binary data\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-render-binary\", \"label\": \"datasette-render-binary\"}, {\"href\": \"https://github.com/simonw/datasette-render-images\", \"label\": \"datasette-render-images\"}, {\"href\": \"https://github.com/simonw/datasette-media\", \"label\": \"datasette-media\"}]"}, {"id": "changelog:alter-table-support-for-create-insert-upsert-and-update", "page": "changelog", "ref": "alter-table-support-for-create-insert-upsert-and-update", "title": "Alter table support for create, insert, upsert and update", "content": "The  JSON write API  can now be used to apply simple alter table schema changes, provided the acting actor has the new  alter-table  permission. ( #2101 ) \n                 The only alter operation supported so far is adding new columns to an existing table. \n                 \n                     \n                         The  /db/-/create  API now adds new columns during large operations to create a table based on incoming example  \"rows\" , in the case where one of the later rows includes columns that were not present in the earlier batches. This requires the  create-table  but not the  alter-table  permission. \n                     \n                     \n                         When  /db/-/create  is called with rows in a situation where the table may have been already created, an  \"alter\": true  key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the  alter-table  permission. 
\n                     \n                     \n                         /db/table/-/insert  and  /db/table/-/upsert  and  /db/table/row-pks/-/update  all now also accept  \"alter\": true , depending on the  alter-table  permission. \n                     \n                 \n                 Operations that alter a table now fire the new  alter-table event .", "breadcrumbs": "[\"Changelog\", \"1.0a9 (2024-02-16)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2101\", \"label\": \"#2101\"}]"}, {"id": "changelog:asgi", "page": "changelog", "ref": "asgi", "title": "ASGI", "content": "ASGI  is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year -  Port Datasette to ASGI #272  tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of  Uvicorn  and no longer depends on  Sanic . \n                 I wrote about the significance of this change in  Porting Datasette to ASGI, and Turtles all the way down . 
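The `"alter": true` key accepted by the insert, upsert and update endpoints above can be sketched as a request body like this (the table rows and column names are illustrative):

```python
import json

# Example request body for /db/table/-/insert using the "alter": true key
# described above. Row and column names are made up for illustration.
body = {
    "rows": [{"id": 1, "name": "Cleo", "new_column": "added later"}],
    "alter": True,  # add any missing columns (requires alter-table permission)
}
print(json.dumps(body))
```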
\n                 The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://asgi.readthedocs.io/\", \"label\": \"ASGI\"}, {\"href\": \"https://github.com/simonw/datasette/issues/272\", \"label\": \"Port Datasette to ASGI #272\"}, {\"href\": \"https://www.uvicorn.org/\", \"label\": \"Uvicorn\"}, {\"href\": \"https://github.com/huge-success/sanic\", \"label\": \"Sanic\"}, {\"href\": \"https://simonwillison.net/2019/Jun/23/datasette-asgi/\", \"label\": \"Porting Datasette to ASGI, and Turtles all the way down\"}]"}, {"id": "changelog:authentication", "page": "changelog", "ref": "authentication", "title": "Authentication", "content": "Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through  datasette-auth-github . \n                 0.44 introduces  Authentication and permissions  as core Datasette concepts ( #699 ). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example. 
\n                 You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new  --root  command-line option, which outputs a one-time use URL to  authenticate as a root actor  ( #784 ): \n                 datasette fixtures.db --root \n                 http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477\nINFO:     Started server process [14973]\nINFO:     Waiting for application startup.\nINFO:     Application startup complete.\nINFO:     Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) \n                 Plugins can implement new ways of authenticating users using the new  actor_from_request(datasette, request)  hook.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}, {\"href\": \"https://github.com/simonw/datasette/issues/699\", \"label\": \"#699\"}, {\"href\": \"https://github.com/simonw/datasette/issues/784\", \"label\": \"#784\"}]"}, {"id": "changelog:better-plugin-documentation", "page": "changelog", "ref": "better-plugin-documentation", "title": "Better plugin documentation", "content": "The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 ) \n                 \n                     \n                         Plugins  introduces Datasette's plugin system and describes how to install and configure plugins. \n                     \n                     \n                         Writing plugins  describes how to author plugins, from  one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new  datasette-plugin  cookiecutter template. 
\n                     \n                     \n                         Plugin hooks  is a full list of detailed documentation for every Datasette plugin hook. \n                     \n                     \n                         Testing plugins  describes how to write tests for Datasette plugins, using  pytest  and  HTTPX .", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/687\", \"label\": \"#687\"}, {\"href\": \"https://github.com/simonw/datasette-plugin\", \"label\": \"datasette-plugin\"}, {\"href\": \"https://docs.pytest.org/\", \"label\": \"pytest\"}, {\"href\": \"https://www.python-httpx.org/\", \"label\": \"HTTPX\"}]"}, {"id": "changelog:binary-data", "page": "changelog", "ref": "binary-data", "title": "Binary data", "content": "SQLite tables can contain binary data in  BLOB  columns. Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See  Binary data  for more details. ( #1036  and  #1034 ).", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1036\", \"label\": \"#1036\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1034\", \"label\": \"#1034\"}]"}, {"id": "changelog:bug-fixes", "page": "changelog", "ref": "bug-fixes", "title": "Bug fixes", "content": "Don't show the facet option in the cog menu if faceting is not allowed. ( #1683 ) \n                     \n                     \n                         ?_sort  and  ?_sort_desc  now work if the column that is being sorted has been excluded from the query using  ?_col=  or  ?_nocol= . ( #1773 ) \n                     \n                     \n                         Fixed bug where  ?_sort_desc  was duplicated in the URL every time the Apply button was clicked. 
( #1738 )", "breadcrumbs": "[\"Changelog\", \"0.62 (2022-08-14)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1683\", \"label\": \"#1683\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1773\", \"label\": \"#1773\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1738\", \"label\": \"#1738\"}]"}, {"id": "changelog:bug-fixes-and-other-improvements", "page": "changelog", "ref": "bug-fixes-and-other-improvements", "title": "Bug fixes and other improvements", "content": "Custom pages  now work correctly when combined with the  base_url  setting. ( #1238 ) \n                     \n                     \n                         Fixed intermittent error displaying the index page when the user did not have permission to access one of the tables. Thanks, Guy Freeman. ( #1305 ) \n                     \n                     \n                         Columns with the name \"Link\" are no longer incorrectly displayed in bold. ( #1308 ) \n                     \n                     \n                         Fixed error caused by tables with a single quote in their names. ( #1257 ) \n                     \n                     \n                         Updated dependencies:  pytest-asyncio ,  Black ,  jinja2 ,  aiofiles ,  click , and  itsdangerous . \n                     \n                     \n                         The official Datasette Docker image now supports  apt-get install . 
( #1320 ) \n                     \n                     \n                         The Heroku runtime used by  datasette publish heroku  is now  python-3.8.10 .", "breadcrumbs": "[\"Changelog\", \"0.57 (2021-06-05)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1238\", \"label\": \"#1238\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1305\", \"label\": \"#1305\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1308\", \"label\": \"#1308\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1257\", \"label\": \"#1257\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1320\", \"label\": \"#1320\"}]"}, {"id": "changelog:code-formatting-with-black-and-prettier", "page": "changelog", "ref": "code-formatting-with-black-and-prettier", "title": "Code formatting with Black and Prettier", "content": "Datasette adopted  Black  for opinionated Python code formatting in June 2019. Datasette now also embraces  Prettier  for JavaScript formatting, which like Black is enforced by tests in continuous integration. Instructions for using these two tools can be found in the new section on  Code formatting  in the contributors documentation. ( #1167 )", "breadcrumbs": "[\"Changelog\", \"0.54 (2021-01-25)\"]", "references": "[{\"href\": \"https://github.com/psf/black\", \"label\": \"Black\"}, {\"href\": \"https://prettier.io/\", \"label\": \"Prettier\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1167\", \"label\": \"#1167\"}]"}, {"id": "changelog:configuration", "page": "changelog", "ref": "configuration", "title": "Configuration", "content": "Plugin configuration now lives in the  datasette.yaml configuration file , passed to Datasette using the  -c/--config  option. Thanks, Alex Garcia. 
( #2093 ) \n                         datasette -c datasette.yaml \n                         Where  datasette.yaml  contains configuration that looks like this: \n                         plugins:\n  datasette-cluster-map:\n    latitude_column: xlat\n    longitude_column: xlon \n                         Previously plugins were configured in  metadata.yaml , which was confusing as plugin settings were unrelated to database and table metadata. \n                     \n                     \n                         The  -s/--setting  option can now be used to set plugin configuration as well. See  Configuration via the command-line  for details. ( #2252 ) \n                         The above YAML configuration example using  -s/--setting  looks like this: \n                         datasette mydatabase.db \\\n  -s plugins.datasette-cluster-map.latitude_column xlat \\\n  -s plugins.datasette-cluster-map.longitude_column xlon \n                     \n                     \n                         The new  /-/config  page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. ( #2254 ) \n                     \n                     \n                         Existing Datasette installations may already have configuration set in  metadata.yaml  that should be migrated to  datasette.yaml . To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. ( #2247 ) ( #2248 ) ( #2249 ) \n                     \n                 \n                 Note that the  datasette publish  command has not yet been updated to accept a  datasette.yaml  configuration file. 
This will be addressed in  #2195  but for the moment you can include those settings in  metadata.yaml  instead.", "breadcrumbs": "[\"Changelog\", \"1.0a8 (2024-02-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2093\", \"label\": \"#2093\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2252\", \"label\": \"#2252\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2254\", \"label\": \"#2254\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2247\", \"label\": \"#2247\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2248\", \"label\": \"#2248\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2249\", \"label\": \"#2249\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2195\", \"label\": \"#2195\"}]"}, {"id": "changelog:control-http-caching-with-ttl", "page": "changelog", "ref": "control-http-caching-with-ttl", "title": "Control HTTP caching with ?_ttl=", "content": "You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new  ?_ttl=  query string parameter. \n                 You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely. 
\n                 Consider for example this query which returns a randomly selected member of the Avengers: \n                 select * from [avengers/avengers] order by random() limit 1 \n                 If you hit the following page repeatedly you will get the same result, due to HTTP caching: \n                 /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1 \n                 By adding  ?_ttl=0  to the URL you can ensure the page will not be cached and get back a different super hero every time: \n                 /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1\", \"label\": \"/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0\", \"label\": \"/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0\"}]"}, {"id": "changelog:cookie-methods", "page": "changelog", "ref": "cookie-methods", "title": "Cookie methods", "content": "Plugins can now use the new  response.set_cookie()  method to set cookies. \n                 A new  request.cookies  method on the :ref:`internals_request` can be used to read incoming cookies.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[]"}, {"id": "changelog:csrf-protection", "page": "changelog", "ref": "csrf-protection", "title": "CSRF protection", "content": "Since writable canned queries are built using POST forms, Datasette now ships with  CSRF protection  ( #798 ). 
This applies automatically to any POST request, which means plugins need to include a  csrftoken  in any POST forms that they render. They can do that like so: \n                 <input type=\"hidden\" name=\"csrftoken\" value=\"{{ csrftoken() }}\">", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/798\", \"label\": \"#798\"}]"}, {"id": "changelog:csv-export", "page": "changelog", "ref": "csv-export", "title": "CSV export", "content": "Any Datasette table, view or custom SQL query can now be exported as CSV. \n                 \n                 Check out the  CSV export documentation  for more details, or\n                    try the feature out on\n                     https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies \n                 If your table has more than  max_returned_rows  (default 1,000)\n                    Datasette provides the option to  stream all rows . This option takes advantage\n                    of async Python and Datasette's efficient  pagination  to\n                    iterate through the entire matching result set and stream it back as a\n                    downloadable CSV file.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies\", \"label\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies\"}]"}, {"id": "changelog:documentation", "page": "changelog", "ref": "documentation", "title": "Documentation", "content": "Documentation describing  how to write tests that use signed actor cookies  using  datasette.client.actor_cookie() . ( #1830 ) \n                     \n                     \n                         Documentation on how to  register a plugin for the duration of a test . 
( #2234 ) \n                     \n                     \n                         The  configuration documentation  now shows examples of both YAML and JSON for each setting.", "breadcrumbs": "[\"Changelog\", \"1.0a8 (2024-02-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1830\", \"label\": \"#1830\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2234\", \"label\": \"#2234\"}]"}, {"id": "changelog:facet-by-date", "page": "changelog", "ref": "facet-by-date", "title": "Facet by date", "content": "If a column contains datetime values, Datasette can now facet that column by date. ( #481 )", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/481\", \"label\": \"#481\"}]"}, {"id": "changelog:faceting", "page": "changelog", "ref": "faceting", "title": "Faceting", "content": "The number of unique values in a facet is now always displayed. Previously it was only displayed if the user specified  ?_facet_size=max . ( #1556 ) \n                     \n                     \n                         Facets of type  date  or  array  can now be configured in  metadata.json , see  Facets in metadata . Thanks, David Larlet. ( #1552 ) \n                     \n                     \n                         New  ?_nosuggest=1  parameter for table views, which disables facet suggestion. ( #1557 ) \n                     \n                     \n                         Fixed bug where  ?_facet_array=tags&_facet=tags  would only display one of the two selected facets. 
( #625 )", "breadcrumbs": "[\"Changelog\", \"0.60 (2022-01-13)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1556\", \"label\": \"#1556\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1552\", \"label\": \"#1552\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1557\", \"label\": \"#1557\"}, {\"href\": \"https://github.com/simonw/datasette/issues/625\", \"label\": \"#625\"}]"}, {"id": "changelog:features", "page": "changelog", "ref": "features", "title": "Features", "content": "Now tested against Python 3.11. Docker containers used by  datasette publish  and  datasette package  both now use that version of Python. ( #1853 ) \n                     \n                     \n                         --load-extension  option now supports entrypoints. Thanks, Alex Garcia. ( #1789 ) \n                     \n                     \n                         Facet size can now be set per-table with the new  facet_size  table metadata option. ( #1804 ) \n                     \n                     \n                         The  truncate_cells_html  setting now also affects long URLs in columns. ( #1805 ) \n                     \n                     \n                         The non-JavaScript SQL editor textarea now increases height to fit the SQL query. ( #1786 ) \n                     \n                     \n                         Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. ( #1794 ) \n                     \n                     \n                         The  settings.json  file used in  Configuration directory mode  is now validated on startup. ( #1816 ) \n                     \n                     \n                         SQL queries can now include leading SQL comments, using  /* ... */  or  -- ...  syntax. Thanks,  Charles Nepote. 
( #1860 ) \n                     \n                     \n                         SQL query is now re-displayed when terminated with a time limit error. ( #1819 ) \n                     \n                     \n                         The  inspect data  mechanism is now used to speed up server startup - thanks, Forest Gregg. ( #1834 ) \n                     \n                     \n                         In  Configuration directory mode  databases with filenames ending in  .sqlite  or  .sqlite3  are now automatically added to the Datasette instance. ( #1646 ) \n                     \n                     \n                         Breadcrumb navigation display now respects the current user's permissions. ( #1831 )", "breadcrumbs": "[\"Changelog\", \"0.63 (2022-10-27)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1853\", \"label\": \"#1853\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1789\", \"label\": \"#1789\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1804\", \"label\": \"#1804\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1805\", \"label\": \"#1805\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1786\", \"label\": \"#1786\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1794\", \"label\": \"#1794\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1816\", \"label\": \"#1816\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1860\", \"label\": \"#1860\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1819\", \"label\": \"#1819\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1834\", \"label\": \"#1834\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1646\", \"label\": \"#1646\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1831\", \"label\": \"#1831\"}]"}, {"id": "changelog:flash-messages", "page": "changelog", "ref": "flash-messages", "title": "Flash messages", "content": "Writable canned queries needed a 
mechanism to let the user know that the query has been successfully executed. The new flash messaging system ( #790 ) allows messages to persist in signed cookies which are then displayed to the user on the next page that they visit. Plugins can use this mechanism to display their own messages, see  .add_message(request, message, type=datasette.INFO)  for details. \n                 You can try out the new messages using the  /-/messages  debug tool, for example at  https://latest.datasette.io/-/messages", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/790\", \"label\": \"#790\"}, {\"href\": \"https://latest.datasette.io/-/messages\", \"label\": \"https://latest.datasette.io/-/messages\"}]"}, {"id": "changelog:foreign-key-expansions", "page": "changelog", "ref": "foreign-key-expansions", "title": "Foreign key expansions", "content": "When Datasette detects a foreign key reference it attempts to resolve a label\n                    for that reference (automatically or using the  Specifying the label column for a table  metadata\n                    option) so it can display a link to the associated row. \n                 This expansion is now also available for JSON and CSV representations of the\n                    table, using the new  _labels=on  query string option. See\n                     Expanding foreign key references  for more details.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[]"}, {"id": "changelog:id1", "page": "changelog", "ref": "id1", "title": "Changelog", "content": "", "breadcrumbs": "[]", "references": "[]"}, {"id": "changelog:id10", "page": "changelog", "ref": "id10", "title": "0.64.6 (2023-12-22)", "content": "Fixed a bug where CSV export with expanded labels could fail if a foreign key reference did not correctly resolve. 
( #2214 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2214\", \"label\": \"#2214\"}]"}, {"id": "changelog:id100", "page": "changelog", "ref": "id100", "title": "0.25.1 (2018-11-04)", "content": "Documentation improvements plus a fix for publishing to Zeit Now. \n             \n                 \n                     datasette publish now  now uses Zeit's v1 platform, to work around the new 100MB image limit. Thanks, @slygent - closes  #366 .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/366\", \"label\": \"#366\"}]"}, {"id": "changelog:id101", "page": "changelog", "ref": "id101", "title": "0.25 (2018-09-19)", "content": "New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite. \n             \n                 \n                     New  publish_subcommand  plugin hook. A plugin can now add additional  datasette publish  publishers in addition to the default  now  and  heroku , both of which have been refactored into default plugins.  publish_subcommand documentation . Closes  #349 \n                 \n                 \n                     New  render_cell  plugin hook. Plugins can now customize how values are displayed in the HTML tables produced by Datasette's browsable interface.  datasette-json-html  and  datasette-render-images  are two new plugins that use this hook.  render_cell documentation . Closes  #352 \n                 \n                 \n                     New  extra_body_script  plugin hook, enabling plugins to provide additional JavaScript that should be added to the page footer.  extra_body_script documentation . \n                 \n                 \n                     extra_css_urls  and  extra_js_urls  hooks now take additional optional parameters, allowing them to be more selective about which pages they apply to.  Documentation . 
\n                 \n                 \n                     You can now use the  sortable_columns metadata setting  to explicitly enable sort-by-column in the interface for database views, as well as for specific tables. \n                 \n                 \n                     The new  fts_table  and  fts_pk  metadata settings can now be used to  explicitly configure full-text search for a table or a view , even if that table is not directly coupled to the SQLite FTS feature in the database schema itself. \n                 \n                 \n                     Datasette will now use  pysqlite3  in place of the standard library  sqlite3  module if it has been installed in the current environment. This makes it much easier to run Datasette against a more recent version of SQLite, including the just-released  SQLite 3.25.0  which adds window function support. More details on how to use this in  #360 \n                 \n                 \n                     New mechanism that allows  plugin configuration options  to be set using  metadata.json .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/349\", \"label\": \"#349\"}, {\"href\": \"https://github.com/simonw/datasette-json-html\", \"label\": \"datasette-json-html\"}, {\"href\": \"https://github.com/simonw/datasette-render-images\", \"label\": \"datasette-render-images\"}, {\"href\": \"https://github.com/simonw/datasette/issues/352\", \"label\": \"#352\"}, {\"href\": \"https://github.com/coleifer/pysqlite3\", \"label\": \"pysqlite3\"}, {\"href\": \"https://www.sqlite.org/releaselog/3_25_0.html\", \"label\": \"SQLite 3.25.0\"}, {\"href\": \"https://github.com/simonw/datasette/issues/360\", \"label\": \"#360\"}]"}, {"id": "changelog:id102", "page": "changelog", "ref": "id102", "title": "0.24 (2018-07-23)", "content": "A number of small new features: \n             \n                 \n                     datasette publish heroku  now supports  
--extra-options , fixes  #334 \n                 \n                 \n                     Custom error message if SpatiaLite is needed for specified database, closes  #331 \n                 \n                 \n                     New config option:  truncate_cells_html  for  truncating long cell values  in HTML view - closes  #330 \n                 \n                 \n                     Documentation for  datasette publish and datasette package , closes  #337 \n                 \n                 \n                     Fixed compatibility with Python 3.7 \n                 \n                 \n                     datasette publish heroku  now supports app names via the  -n  option, which can also be used to overwrite an existing application [Russ Garrett] \n                 \n                 \n                     Title and description metadata can now be set for  canned SQL queries , closes  #342 \n                 \n                 \n                     New  force_https_on  config option, fixes  https://  API URLs when deploying to Zeit Now - closes  #333 \n                 \n                 \n                     ?_json_infinity=1  query string argument for handling Infinity/-Infinity values in JSON, closes  #332 \n                 \n                 \n                     URLs displayed in the results of custom SQL queries are now URLified, closes  #298", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/334\", \"label\": \"#334\"}, {\"href\": \"https://github.com/simonw/datasette/issues/331\", \"label\": \"#331\"}, {\"href\": \"https://github.com/simonw/datasette/issues/330\", \"label\": \"#330\"}, {\"href\": \"https://github.com/simonw/datasette/issues/337\", \"label\": \"#337\"}, {\"href\": \"https://github.com/simonw/datasette/issues/342\", \"label\": \"#342\"}, {\"href\": \"https://github.com/simonw/datasette/issues/333\", \"label\": \"#333\"}, {\"href\": 
\"https://github.com/simonw/datasette/issues/332\", \"label\": \"#332\"}, {\"href\": \"https://github.com/simonw/datasette/issues/298\", \"label\": \"#298\"}]"}, {"id": "changelog:id11", "page": "changelog", "ref": "id11", "title": "0.64.5 (2023-10-08)", "content": "Dropped dependency on  click-default-group-wheel , which could cause a dependency conflict. ( #2197 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2197\", \"label\": \"#2197\"}]"}, {"id": "changelog:id111", "page": "changelog", "ref": "id111", "title": "0.23.2 (2018-07-07)", "content": "Minor bugfix and documentation release. \n             \n                 \n                     CSV export now respects  --cors , fixes  #326 \n                 \n                 \n                     Installation instructions , including docker image - closes  #328 \n                 \n                 \n                     Fix for row pages for tables with / in, closes  #325", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/326\", \"label\": \"#326\"}, {\"href\": \"https://github.com/simonw/datasette/issues/328\", \"label\": \"#328\"}, {\"href\": \"https://github.com/simonw/datasette/issues/325\", \"label\": \"#325\"}]"}, {"id": "changelog:id115", "page": "changelog", "ref": "id115", "title": "0.23.1 (2018-06-21)", "content": "Minor bugfix release. \n             \n                 \n                     Correctly display empty strings in HTML table, closes  #314 \n                 \n                 \n                     Allow \".\" in database filenames, closes  #302 \n                 \n                 \n                     404s ending in slash redirect to remove that slash, closes  #309 \n                 \n                 \n                     Fixed incorrect display of compound primary keys with foreign key\n                        references. 
Closes  #319 \n                 \n                 \n                     Docs + example of canned SQL query using || concatenation. Closes  #321 \n                 \n                 \n                     Correctly display facets with value of 0 - closes  #318 \n                 \n                 \n                     Default 'expand labels' to checked in CSV advanced export", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/314\", \"label\": \"#314\"}, {\"href\": \"https://github.com/simonw/datasette/issues/302\", \"label\": \"#302\"}, {\"href\": \"https://github.com/simonw/datasette/issues/309\", \"label\": \"#309\"}, {\"href\": \"https://github.com/simonw/datasette/issues/319\", \"label\": \"#319\"}, {\"href\": \"https://github.com/simonw/datasette/issues/321\", \"label\": \"#321\"}, {\"href\": \"https://github.com/simonw/datasette/issues/318\", \"label\": \"#318\"}]"}, {"id": "changelog:id12", "page": "changelog", "ref": "id12", "title": "0.64.4 (2023-09-21)", "content": "Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2189\", \"label\": \"#2189\"}]"}, {"id": "changelog:id122", "page": "changelog", "ref": "id122", "title": "0.23 (2018-06-18)", "content": "This release features CSV export, improved options for foreign key expansions,\n                new configuration settings and improved support for SpatiaLite. 
\n             See  datasette/compare/0.22.1...0.23  for a full list of\n                commits added since the last release.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/compare/0.22.1...0.23\", \"label\": \"datasette/compare/0.22.1...0.23\"}]"}, {"id": "changelog:id123", "page": "changelog", "ref": "id123", "title": "0.22.1 (2018-05-23)", "content": "Bugfix release, plus we now use  versioneer  for our version numbers. \n             \n                 \n                     Faceting no longer breaks pagination, fixes  #282 \n                 \n                 \n                     Add  __version_info__  derived from  __version__  [Robert Gieseke] \n                     This might be tuple of more than two values (major and minor\n                        version) if commits have been made after a release. \n                 \n                 \n                     Add version number support with Versioneer. [Robert Gieseke] \n                     Versioneer Licence:\n                        Public Domain (CC0-1.0) \n                     Closes  #273 \n                 \n                 \n                     Refactor inspect logic [Russ Garrett]", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/warner/python-versioneer\", \"label\": \"versioneer\"}, {\"href\": \"https://github.com/simonw/datasette/issues/282\", \"label\": \"#282\"}, {\"href\": \"https://github.com/simonw/datasette/issues/273\", \"label\": \"#273\"}]"}, {"id": "changelog:id126", "page": "changelog", "ref": "id126", "title": "0.22 (2018-05-20)", "content": "The big new feature in this release is  Facets . Datasette can now apply faceted browse to any column in any table. It will also suggest possible facets. See the  Datasette Facets  announcement post for more details. 
\n             In addition to the work on facets: \n             \n                 \n                     Added  docs for introspection endpoints \n                 \n                 \n                     New  --config  option, added  --help-config , closes  #274 \n                     Removed the  --page_size=  argument to  datasette serve  in favour of: \n                     datasette serve --config default_page_size:50 mydb.db \n                     Added new help section: \n                     datasette --help-config \n                     Config options:\n  default_page_size            Default page size for the table view\n                               (default=100)\n  max_returned_rows            Maximum rows that can be returned from a table\n                               or custom query (default=1000)\n  sql_time_limit_ms            Time limit for a SQL query in milliseconds\n                               (default=1000)\n  default_facet_size           Number of values to return for requested facets\n                               (default=30)\n  facet_time_limit_ms          Time limit for calculating a requested facet\n                               (default=200)\n  facet_suggest_time_limit_ms  Time limit for calculating a suggested facet\n                               (default=50) \n                 \n                 \n                     Only apply responsive table styles to  .rows-and-columns \n                     Otherwise they interfere with tables in the description, e.g. 
on\n                         https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo \n                 \n                 \n                     Refactored views into new  views/  modules, refs  #256 \n                 \n                 \n                     Documentation for SQLite full-text search  support, closes  #253 \n                 \n                 \n                     /-/versions  now includes SQLite  fts_versions , closes  #252", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://simonwillison.net/2018/May/20/datasette-facets/\", \"label\": \"Datasette Facets\"}, {\"href\": \"https://docs.datasette.io/en/stable/introspection.html\", \"label\": \"docs for introspection endpoints\"}, {\"href\": \"https://github.com/simonw/datasette/issues/274\", \"label\": \"#274\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo\", \"label\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo\"}, {\"href\": \"https://github.com/simonw/datasette/issues/256\", \"label\": \"#256\"}, {\"href\": \"https://docs.datasette.io/en/stable/full_text_search.html\", \"label\": \"Documentation for SQLite full-text search\"}, {\"href\": \"https://github.com/simonw/datasette/issues/253\", \"label\": \"#253\"}, {\"href\": \"https://github.com/simonw/datasette/issues/252\", \"label\": \"#252\"}]"}, {"id": "changelog:id13", "page": "changelog", "ref": "id13", "title": "0.64.2 (2023-03-08)", "content": "Fixed a bug with  datasette publish cloudrun  where deploys all used the same Docker image tag. This was mostly inconsequential as the service is deployed as soon as the image has been pushed to the registry, but could result in the incorrect image being deployed if two different deploys for two separate services ran at exactly the same time. 
( #2036 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2036\", \"label\": \"#2036\"}]"}, {"id": "changelog:id131", "page": "changelog", "ref": "id131", "title": "0.21 (2018-05-05)", "content": "New JSON  _shape=  options, the ability to set table  _size=  and a mechanism for searching within specific columns. \n             \n                 \n                     Default tests to using a longer timelimit \n                     Every now and then a test will fail in Travis CI on Python 3.5 because it hit\n                        the default 20ms SQL time limit. \n                     Test fixtures now default to a 200ms time limit, and we only use the 20ms time\n                        limit for the specific test that tests query interruption. This should make\n                        our tests on Python 3.5 in Travis much more stable. \n                 \n                 \n                     Support  _search_COLUMN=text  searches, closes  #237 \n                 \n                 \n                     Show version on  /-/plugins  page, closes  #248 \n                 \n                 \n                     ?_size=max  option, closes  #249 \n                 \n                 \n                     Added  /-/versions  and  /-/versions.json , closes  #244 \n                     Sample output: \n                     {\n  \"python\": {\n    \"version\": \"3.6.3\",\n    \"full\": \"3.6.3 (default, Oct  4 2017, 06:09:38) \\n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]\"\n  },\n  \"datasette\": {\n    \"version\": \"0.20\"\n  },\n  \"sqlite\": {\n    \"version\": \"3.23.1\",\n    \"extensions\": {\n      \"json1\": null,\n      \"spatialite\": \"4.3.0a\"\n    }\n  }\n} \n                 \n                 \n                     Renamed  ?_sql_time_limit_ms=  to  ?_timelimit , closes  #242 \n                 \n                 \n                     New  ?_shape=array  option + tweaks to  
_shape , closes  #245 \n                     \n                         \n                             Default is now  ?_shape=arrays  (renamed from  lists ) \n                         \n                         \n                             New  ?_shape=array  returns an array of objects as the root object \n                         \n                         \n                             Changed  ?_shape=object  to return the object as the root \n                         \n                         \n                             Updated docs \n                         \n                     \n                 \n                 \n                     FTS tables now detected by  inspect() , closes  #240 \n                 \n                 \n                     New  ?_size=XXX  query string parameter for table view, closes  #229 \n                     Also added documentation for all of the  _special  arguments. \n                     Plus deleted some duplicate logic implementing  _group_count . 
\n                 \n                 \n                     If  max_returned_rows==page_size , increment  max_returned_rows  - fixes  #230 \n                 \n                 \n                     New  hidden: True  option for table metadata, closes  #239 \n                 \n                 \n                     Hide  idx_*  tables if spatialite detected, closes  #228 \n                 \n                 \n                     Added  class=rows-and-columns  to custom query results table \n                 \n                 \n                     Added CSS class  rows-and-columns  to main table \n                 \n                 \n                     label_column  option in  metadata.json  - closes  #234", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/237\", \"label\": \"#237\"}, {\"href\": \"https://github.com/simonw/datasette/issues/248\", \"label\": \"#248\"}, {\"href\": \"https://github.com/simonw/datasette/issues/249\", \"label\": \"#249\"}, {\"href\": \"https://github.com/simonw/datasette/issues/244\", \"label\": \"#244\"}, {\"href\": \"https://github.com/simonw/datasette/issues/242\", \"label\": \"#242\"}, {\"href\": \"https://github.com/simonw/datasette/issues/245\", \"label\": \"#245\"}, {\"href\": \"https://github.com/simonw/datasette/issues/240\", \"label\": \"#240\"}, {\"href\": \"https://github.com/simonw/datasette/issues/229\", \"label\": \"#229\"}, {\"href\": \"https://github.com/simonw/datasette/issues/230\", \"label\": \"#230\"}, {\"href\": \"https://github.com/simonw/datasette/issues/239\", \"label\": \"#239\"}, {\"href\": \"https://github.com/simonw/datasette/issues/228\", \"label\": \"#228\"}, {\"href\": \"https://github.com/simonw/datasette/issues/234\", \"label\": \"#234\"}]"}, {"id": "changelog:id14", "page": "changelog", "ref": "id14", "title": "0.64.1 (2023-01-11)", "content": "Documentation now links to a current source of information for installing Python 3. 
( #1987 ) \n                 \n                 \n                     Incorrectly calling the Datasette constructor using  Datasette(\"path/to/data.db\")  instead of  Datasette([\"path/to/data.db\"])  now returns a useful error message. ( #1985 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1987\", \"label\": \"#1987\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1985\", \"label\": \"#1985\"}]"}, {"id": "changelog:id144", "page": "changelog", "ref": "id144", "title": "0.20 (2018-04-20)", "content": "Mostly new work on the  Plugins  mechanism: plugins can now bundle static assets and custom templates, and  datasette publish  has a new  --install=name-of-plugin  option. \n             \n                 \n                     Add col-X classes to HTML table on custom query page \n                 \n                 \n                     Fixed out-dated template in documentation \n                 \n                 \n                     Plugins can now bundle custom templates,  #224 \n                 \n                 \n                     Added /-/metadata /-/plugins /-/inspect,  #225 \n                 \n                 \n                     Documentation for --install option, refs  #223 \n                 \n                 \n                     Datasette publish/package --install option,  #223 \n                 \n                 \n                     Fix for plugins in Python 3.5,  #222 \n                 \n                 \n                     New plugin hooks: extra_css_urls() and extra_js_urls(),  #214 \n                 \n                 \n                     /-/static-plugins/PLUGIN_NAME/ now serves static/ from plugins \n                 \n                 \n                     <th> now gets class=\"col-X\" - plus added col-X documentation \n                 \n                 \n                     Use to_css_class for table cell column classes \n                   
  This ensures that columns with spaces in the name will still\n                        generate usable CSS class names. Refs  #209 \n                 \n                 \n                     Add column name classes to <td>s, make PK bold [Russ Garrett] \n                 \n                 \n                     Don't duplicate simple primary keys in the link column [Russ Garrett] \n                     When there's a simple (single-column) primary key, it looks weird to\n                        duplicate it in the link column. \n                     This change removes the second PK column and treats the link column as\n                        if it were the PK column from a header/sorting perspective. \n                 \n                 \n                     Correct escaping for HTML display of row links [Russ Garrett] \n                 \n                 \n                     Longer time limit for test_paginate_compound_keys \n                     It was failing intermittently in Travis - see  #209 \n                 \n                 \n                     Use application/octet-stream for downloadable databases \n                 \n                 \n                     Updated PyPI classifiers \n                 \n                 \n                     Updated PyPI link to pypi.org", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/224\", \"label\": \"#224\"}, {\"href\": \"https://github.com/simonw/datasette/issues/225\", \"label\": \"#225\"}, {\"href\": \"https://github.com/simonw/datasette/issues/223\", \"label\": \"#223\"}, {\"href\": \"https://github.com/simonw/datasette/issues/223\", \"label\": \"#223\"}, {\"href\": \"https://github.com/simonw/datasette/issues/222\", \"label\": \"#222\"}, {\"href\": \"https://github.com/simonw/datasette/issues/214\", \"label\": \"#214\"}, {\"href\": \"https://github.com/simonw/datasette/issues/209\", \"label\": \"#209\"}, {\"href\": 
\"https://github.com/simonw/datasette/issues/209\", \"label\": \"#209\"}]"}, {"id": "changelog:id15", "page": "changelog", "ref": "id15", "title": "0.64 (2023-01-09)", "content": "Datasette now  strongly recommends against allowing arbitrary SQL queries if you are using SpatiaLite . SpatiaLite includes SQL functions that could cause the Datasette server to crash. See  SpatiaLite  for more details. \n                 \n                 \n                     New  default_allow_sql  setting, providing an easier way to disable all arbitrary SQL execution by end users:  datasette --setting default_allow_sql off . See also  Controlling the ability to execute arbitrary SQL . ( #1409 ) \n                 \n                 \n                     Building a location to time zone API with SpatiaLite  is a new Datasette tutorial showing how to safely use SpatiaLite to create a location to time zone API. \n                 \n                 \n                     New documentation about  how to debug problems loading SQLite extensions . The error message shown when an extension cannot be loaded has also been improved. ( #1979 ) \n                 \n                 \n                     Fixed an accessibility issue: the  <select>  elements in the table filter form now show an outline when they are currently focused. ( #1771 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1409\", \"label\": \"#1409\"}, {\"href\": \"https://datasette.io/tutorials/spatialite\", \"label\": \"Building a location to time zone API with SpatiaLite\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1979\", \"label\": \"#1979\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1771\", \"label\": \"#1771\"}]"}, {"id": "changelog:id153", "page": "changelog", "ref": "id153", "title": "0.19 (2018-04-16)", "content": "This is the first preview of the new Datasette plugins mechanism. 
Only two\n                plugin hooks are available so far - for custom SQL functions and custom template\n                filters. There's plenty more to come - read  the documentation  and get involved in\n                 the tracking ticket  if you\n                have feedback on the direction so far. \n             \n                 \n                     Fix for  _sort_desc=sortable_with_nulls  test, refs  #216 \n                 \n                 \n                     Fixed  #216  - paginate correctly when sorting by nullable column \n                 \n                 \n                     Initial documentation for plugins, closes  #213 \n                     https://docs.datasette.io/en/stable/plugins.html \n                 \n                 \n                     New  --plugins-dir=plugins/  option ( #212 ) \n                     New option causing Datasette to load and evaluate all of the Python files in\n                        the specified directory and register any plugins that are defined in those\n                        files. \n                     This new option is available for the following commands: \n                     datasette serve mydb.db --plugins-dir=plugins/\ndatasette publish now/heroku mydb.db --plugins-dir=plugins/\ndatasette package mydb.db --plugins-dir=plugins/ \n                 \n                 \n                     Start of the plugin system, based on pluggy ( #210 ) \n                     Uses  https://pluggy.readthedocs.io/  originally created for the py.test project \n                     We're starting with two plugin hooks: \n                     prepare_connection(conn) \n                     This is called when a new SQLite connection is created. It can be used to register custom SQL functions. \n                     prepare_jinja2_environment(env) \n                     This is called with the Jinja2 environment. It can be used to register custom template tags and filters. 
\n                     An example plugin which uses these two hooks can be found at  https://github.com/simonw/datasette-plugin-demos  or installed using  pip install datasette-plugin-demos \n                     Refs  #14 \n                 \n                 \n                     Return HTTP 405 on InvalidUsage rather than 500. [Russ Garrett] \n                     This also stops it filling up the logs. This happens for HEAD requests\n                        at the moment - which perhaps should be handled better, but that's a\n                        different issue.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://docs.datasette.io/en/stable/plugins.html\", \"label\": \"the documentation\"}, {\"href\": \"https://github.com/simonw/datasette/issues/14\", \"label\": \"the tracking ticket\"}, {\"href\": \"https://github.com/simonw/datasette/issues/216\", \"label\": \"#216\"}, {\"href\": \"https://github.com/simonw/datasette/issues/216\", \"label\": \"#216\"}, {\"href\": \"https://github.com/simonw/datasette/issues/213\", \"label\": \"#213\"}, {\"href\": \"https://docs.datasette.io/en/stable/plugins.html\", \"label\": \"https://docs.datasette.io/en/stable/plugins.html\"}, {\"href\": \"https://github.com/simonw/datasette/issues/212\", \"label\": \"#212\"}, {\"href\": \"https://github.com/simonw/datasette/issues/210\", \"label\": \"#210\"}, {\"href\": \"https://pluggy.readthedocs.io/\", \"label\": \"https://pluggy.readthedocs.io/\"}, {\"href\": \"https://github.com/simonw/datasette-plugin-demos\", \"label\": \"https://github.com/simonw/datasette-plugin-demos\"}, {\"href\": \"https://github.com/simonw/datasette/issues/14\", \"label\": \"#14\"}]"}, {"id": "changelog:id16", "page": "changelog", "ref": "id16", "title": "0.63.3 (2022-12-17)", "content": "Fixed a bug where  datasette --root , when running in Docker, would only output the URL to sign in as root when the server shut down, not when it started up. 
( #1958 ) \n                 \n                 \n                     You no longer need to ensure  await datasette.invoke_startup()  has been called in order for Datasette to start correctly serving requests - this is now handled automatically the first time the server receives a request. This fixes a bug experienced when Datasette is served directly by an ASGI application server such as Uvicorn or Gunicorn. It also fixes a bug with the  datasette-gunicorn  plugin. ( #1955 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1958\", \"label\": \"#1958\"}, {\"href\": \"https://datasette.io/plugins/datasette-gunicorn\", \"label\": \"datasette-gunicorn\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1955\", \"label\": \"#1955\"}]"}, {"id": "changelog:id160", "page": "changelog", "ref": "id160", "title": "0.18 (2018-04-14)", "content": "This release introduces  support for units ,\n                contributed by Russ Garrett ( #203 ).\n                You can now optionally specify the units for specific columns using  metadata.json .\n                Once specified, units will be displayed in the HTML view of your table. They also become\n                available for use in filters - if a column is configured with a unit of distance, you can\n                request all rows where that column is less than 50 meters or more than 20 feet for example. \n             \n                 \n                     Link foreign keys which don't have labels. [Russ Garrett] \n                     This renders unlabeled FKs as simple links. \n                     Also includes bonus fixes for two minor issues: \n                     \n                         \n                             In foreign key link hrefs the primary key was escaped using HTML\n                                escaping rather than URL escaping. This broke some non-integer PKs. 
\n                         \n                         \n                             Print tracebacks to console when handling 500 errors. \n                         \n                     \n                 \n                 \n                     Fix SQLite error when loading rows with no incoming FKs. [Russ\n                        Garrett] \n                     This fixes an error caused by an invalid query when loading incoming FKs. \n                     The error was ignored due to async but it still got printed to the\n                        console. \n                 \n                 \n                     Allow custom units to be registered with Pint. [Russ Garrett] \n                 \n                 \n                     Support units in filters. [Russ Garrett] \n                 \n                 \n                     Tidy up units support. [Russ Garrett] \n                     \n                         \n                             Add units to exported JSON \n                         \n                         \n                             Units key in metadata skeleton \n                         \n                         \n                             Docs \n                         \n                     \n                 \n                 \n                     Initial units support. 
[Russ Garrett] \n                     Add support for specifying units for a column in  metadata.json  and\n                        rendering them on display using\n                         pint", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://docs.datasette.io/en/stable/metadata.html#specifying-units-for-a-column\", \"label\": \"support for units\"}, {\"href\": \"https://github.com/simonw/datasette/issues/203\", \"label\": \"#203\"}, {\"href\": \"https://pint.readthedocs.io/en/latest/\", \"label\": \"pint\"}]"}, {"id": "changelog:id162", "page": "changelog", "ref": "id162", "title": "0.17 (2018-04-13)", "content": "Release 0.17 to fix issues with PyPI", "breadcrumbs": "[\"Changelog\"]", "references": "[]"}, {"id": "changelog:id163", "page": "changelog", "ref": "id163", "title": "0.16 (2018-04-13)", "content": "Better mechanism for handling errors; 404s for missing table/database \n                     New error mechanism closes  #193 \n                     404s for missing tables/databases closes  #184 \n                 \n                 \n                     long_description in markdown for the new PyPI \n                 \n                 \n                     Hide SpatiaLite system tables. 
[Russ Garrett] \n                 \n                 \n                     Allow  explain select  /  explain query plan select   #201 \n                 \n                 \n                     Datasette inspect now finds primary_keys  #195 \n                 \n                 \n                     Ability to sort using form fields (for mobile portrait mode)  #199 \n                     We now display sort options as a select box plus a descending checkbox, which\n                        means you can apply sort orders even in portrait mode on a mobile phone where\n                        the column headers are hidden.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/193\", \"label\": \"#193\"}, {\"href\": \"https://github.com/simonw/datasette/issues/184\", \"label\": \"#184\"}, {\"href\": \"https://github.com/simonw/datasette/issues/201\", \"label\": \"#201\"}, {\"href\": \"https://github.com/simonw/datasette/issues/195\", \"label\": \"#195\"}, {\"href\": \"https://github.com/simonw/datasette/issues/199\", \"label\": \"#199\"}]"}, {"id": "changelog:id169", "page": "changelog", "ref": "id169", "title": "0.15 (2018-04-09)", "content": "The biggest new feature in this release is the ability to sort by column. On the\n                table page the column headers can now be clicked to apply sort (or descending\n                sort), or you can specify  ?_sort=column  or  ?_sort_desc=column  directly\n                in the URL. \n             \n                 \n                     table_rows  =>  table_rows_count ,  filtered_table_rows  =>\n                         filtered_table_rows_count \n                     Renamed properties. Closes  #194 \n                 \n                 \n                     New  sortable_columns  option in  metadata.json  to control sort options. 
\n                     You can now explicitly set which columns in a table can be used for sorting\n                        using the  _sort  and  _sort_desc  arguments using  metadata.json : \n                     {\n    \"databases\": {\n        \"database1\": {\n            \"tables\": {\n                \"example_table\": {\n                    \"sortable_columns\": [\n                        \"height\",\n                        \"weight\"\n                    ]\n                }\n            }\n        }\n    }\n} \n                     Refs  #189 \n                 \n                 \n                     Column headers now link to sort/desc sort - refs  #189 \n                 \n                 \n                     _sort  and  _sort_desc  parameters for table views \n                     Allows for paginated sorted results based on a specified column. \n                     Refs  #189 \n                 \n                 \n                     Total row count now correct even if  _next  applied \n                 \n                 \n                     Use .custom_sql() for _group_count implementation (refs  #150 ) \n                 \n                 \n                     Make HTML title more readable in query template ( #180 ) [Ryan Pitts] \n                 \n                 \n                     New  ?_shape=objects/object/lists  param for JSON API ( #192 ) \n                     New  _shape=  parameter replacing old  .jsono  extension \n                     Now instead of this: \n                     /database/table.jsono \n                     We use the  _shape  parameter like this: \n                     /database/table.json?_shape=objects \n                     Also introduced a new  _shape  called  object  which looks like this: \n                     /database/table.json?_shape=object \n                     Returning an object for the rows key: \n                     ...\n\"rows\": {\n    \"pk1\": {\n        ...\n    },\n    \"pk2\": 
{\n        ...\n    }\n} \n                     Refs  #122 \n                 \n                 \n                     Utility for writing test database fixtures to a .db file \n                     python tests/fixtures.py /tmp/hello.db \n                     This is useful for making a SQLite database of the test fixtures for\n                        interactive exploration. \n                 \n                 \n                     Compound primary key  _next=  now plays well with extra filters \n                     Closes  #190 \n                 \n                 \n                     Fixed bug with keyset pagination over compound primary keys \n                     Refs  #190 \n                 \n                 \n                     Database/Table views inherit  source/license/source_url/license_url \n                        metadata \n                     If you set the  source_url/license_url/source/license  fields in your root\n                        metadata those values will now be inherited all the way down to the database\n                        and table templates. \n                     The  title/description  are NOT inherited. \n                     Also added unit tests for the HTML generated by the metadata. 
\n                     Refs  #185 \n                 \n                 \n                     Add metadata, if it exists, to heroku temp dir ( #178 ) [Tony Hirst] \n                 \n                 \n                     Initial documentation for pagination \n                 \n                 \n                     Broke up test_app into test_api and test_html \n                 \n                 \n                     Fixed bug with .json path regular expression \n                     I had a table called  geojson  and it caused an exception because the regex\n                        was matching  .json  and not  \\.json \n                 \n                 \n                     Deploy to Heroku with Python 3.6.3", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/194\", \"label\": \"#194\"}, {\"href\": \"https://github.com/simonw/datasette/issues/189\", \"label\": \"#189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/189\", \"label\": \"#189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/189\", \"label\": \"#189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/150\", \"label\": \"#150\"}, {\"href\": \"https://github.com/simonw/datasette/issues/180\", \"label\": \"#180\"}, {\"href\": \"https://github.com/simonw/datasette/issues/192\", \"label\": \"#192\"}, {\"href\": \"https://github.com/simonw/datasette/issues/122\", \"label\": \"#122\"}, {\"href\": \"https://github.com/simonw/datasette/issues/190\", \"label\": \"#190\"}, {\"href\": \"https://github.com/simonw/datasette/issues/190\", \"label\": \"#190\"}, {\"href\": \"https://github.com/simonw/datasette/issues/185\", \"label\": \"#185\"}, {\"href\": \"https://github.com/simonw/datasette/issues/178\", \"label\": \"#178\"}]"}, {"id": "changelog:id17", "page": "changelog", "ref": "id17", "title": "0.63.2 (2022-11-18)", "content": "Fixed a bug in  datasette publish heroku  where deployments failed due to an older 
version of Python being requested. ( #1905 ) \n                 \n                 \n                     New  datasette publish heroku --generate-dir <dir>  option for generating a Heroku deployment directory without deploying it.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1905\", \"label\": \"#1905\"}]"}, {"id": "changelog:id18", "page": "changelog", "ref": "id18", "title": "0.63.1 (2022-11-10)", "content": "Fixed a bug where Datasette's table filter form would not redirect correctly when run behind a proxy using the  base_url  setting. ( #1883 ) \n                 \n                 \n                     SQL query is now shown wrapped in a  <textarea>  if a query exceeds a time limit. ( #1876 ) \n                 \n                 \n                     Fixed an intermittent \"Too many open files\" error while running the test suite. ( #1843 ) \n                 \n                 \n                     New  db.close()  internal method.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1883\", \"label\": \"#1883\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1876\", \"label\": \"#1876\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1843\", \"label\": \"#1843\"}]"}, {"id": "changelog:id182", "page": "changelog", "ref": "id182", "title": "0.14 (2017-12-09)", "content": "The theme of this release is customization: Datasette now allows every aspect\n                of its presentation  to be customized \n                either using additional CSS or by providing entirely new templates. \n             Datasette's  metadata.json format \n                has also been expanded, to allow per-database and per-table metadata. A new\n                 datasette skeleton  command can be used to generate a skeleton JSON file\n                ready to be filled in with per-database and per-table details. 
\n             The  metadata.json  file can also be used to define\n                 canned queries ,\n                as a more powerful alternative to SQL views. \n             \n                 \n                     extra_css_urls / extra_js_urls  in metadata \n                     A mechanism in the  metadata.json  format for adding custom CSS and JS urls. \n                     Create a  metadata.json  file that looks like this: \n                     {\n    \"extra_css_urls\": [\n        \"https://simonwillison.net/static/css/all.bf8cd891642c.css\"\n    ],\n    \"extra_js_urls\": [\n        \"https://code.jquery.com/jquery-3.2.1.slim.min.js\"\n    ]\n} \n                     Then start datasette like this: \n                     datasette mydb.db --metadata=metadata.json \n                     The CSS and JavaScript files will be linked in the  <head>  of every page. \n                     You can also specify a SRI (subresource integrity hash) for these assets: \n                     {\n    \"extra_css_urls\": [\n        {\n            \"url\": \"https://simonwillison.net/static/css/all.bf8cd891642c.css\",\n            \"sri\": \"sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI\"\n        }\n    ],\n    \"extra_js_urls\": [\n        {\n            \"url\": \"https://code.jquery.com/jquery-3.2.1.slim.min.js\",\n            \"sri\": \"sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=\"\n        }\n    ]\n} \n                     Modern browsers will only execute the stylesheet or JavaScript if the SRI hash\n                        matches the content served. 
You can generate hashes using  https://www.srihash.org/ \n                 \n                 \n                     Auto-link column values that look like URLs ( #153 ) \n                 \n                 \n                     CSS styling hooks as classes on the body ( #153 ) \n                     Every template now gets CSS classes in the body designed to support custom\n                        styling. \n                     The index template (the top level page at  / ) gets this: \n                     <body class=\"index\"> \n                     The database template ( /dbname/ ) gets this: \n                     <body class=\"db db-dbname\"> \n                     The table template ( /dbname/tablename ) gets: \n                     <body class=\"table db-dbname table-tablename\"> \n                     The row template ( /dbname/tablename/rowid ) gets: \n                     <body class=\"row db-dbname table-tablename\"> \n                     The  db-x  and  table-x  classes use the database or table names themselves IF\n                        they are valid CSS identifiers. If they aren't, we strip any invalid\n                        characters out and append a 6 character md5 digest of the original name, in\n                        order to ensure that multiple tables which resolve to the same stripped\n                        character version still have different CSS classes. 
\n                     Some examples (extracted from the unit tests): \n                     \"simple\" => \"simple\"\n\"MixedCase\" => \"MixedCase\"\n\"-no-leading-hyphens\" => \"no-leading-hyphens-65bea6\"\n\"_no-leading-underscores\" => \"no-leading-underscores-b921bc\"\n\"no spaces\" => \"no-spaces-7088d7\"\n\"-\" => \"336d5e\"\n\"no $ characters\" => \"no--characters-59e024\" \n                 \n                 \n                     datasette --template-dir=mytemplates/  argument \n                     You can now pass an additional argument specifying a directory to look for\n                        custom templates in. \n                     Datasette will fall back on the default templates if a template is not\n                        found in that directory. \n                 \n                 \n                     Ability to over-ride templates for individual tables/databases. \n                     It is now possible to over-ride templates on a per-database / per-row or per-\n                        table basis. \n                     When you access e.g.  /mydatabase/mytable  Datasette will look for the following: \n                     - table-mydatabase-mytable.html\n- table.html \n                     If you provided a  --template-dir  argument to datasette serve it will look in\n                        that directory first. 
\n                     The lookup rules are as follows: \n                     Index page (/):\n    index.html\n\nDatabase page (/mydatabase):\n    database-mydatabase.html\n    database.html\n\nTable page (/mydatabase/mytable):\n    table-mydatabase-mytable.html\n    table.html\n\nRow page (/mydatabase/mytable/id):\n    row-mydatabase-mytable.html\n    row.html \n                     If a table name has spaces or other unexpected characters in it, the template\n                        filename will follow the same rules as our custom  <body>  CSS classes\n                        - for example, a table called \"Food Trucks\"\n                        will attempt to load the following templates: \n                     table-mydatabase-Food-Trucks-399138.html\ntable.html \n                     It is possible to extend the default templates using Jinja template\n                        inheritance. If you want to customize EVERY row template with some additional\n                        content you can do so by creating a row.html template like this: \n                     {% extends \"default:row.html\" %}\n\n{% block content %}\n<h1>EXTRA HTML AT THE TOP OF THE CONTENT BLOCK</h1>\n<p>This line renders the original block:</p>\n{{ super() }}\n{% endblock %} \n                 \n                 \n                     --static  option for datasette serve ( #160 ) \n                     You can now tell Datasette to serve static files from a specific location at a\n                        specific mountpoint. \n                     For example: \n                     datasette serve mydb.db --static extra-css:/tmp/static/css \n                     Now if you visit this URL: \n                     http://localhost:8001/extra-css/blah.css \n                     The following file will be served: \n                     /tmp/static/css/blah.css \n                 \n                 \n                     Canned query support. 
\n                     Named canned queries can now be defined in  metadata.json  like this: \n                     {\n    \"databases\": {\n        \"timezones\": {\n            \"queries\": {\n                \"timezone_for_point\": \"select tzid from timezones ...\"\n            }\n        }\n    }\n} \n                     These will be shown in a new \"Queries\" section beneath \"Views\" on the database page. \n                 \n                 \n                     New  datasette skeleton  command for generating  metadata.json  ( #164 ) \n                 \n                 \n                     metadata.json  support for per-table/per-database metadata ( #165 ) \n                     Also added support for descriptions and HTML descriptions. \n                     Here's an example metadata.json file illustrating custom per-database and per-\n                        table metadata: \n                     {\n    \"title\": \"Overall datasette title\",\n    \"description_html\": \"This is a <em>description with HTML</em>.\",\n    \"databases\": {\n        \"db1\": {\n            \"title\": \"First database\",\n            \"description\": \"This is a string description & has no HTML\",\n            \"license_url\": \"http://example.com/\",\n        \"license\": \"The example license\",\n            \"queries\": {\n              \"canned_query\": \"select * from table1 limit 3;\"\n            },\n            \"tables\": {\n                \"table1\": {\n                    \"title\": \"Custom title for table1\",\n                    \"description\": \"Tables can have descriptions too\",\n                    \"source\": \"This has a custom source\",\n                    \"source_url\": \"http://example.com/\"\n                }\n            }\n        }\n    }\n} \n                 \n                 \n                     Renamed  datasette build  command to  datasette inspect  ( #130 ) \n                 \n                 \n                     Upgrade to 
Sanic 0.7.0 ( #168 ) \n                     https://github.com/channelcat/sanic/releases/tag/0.7.0 \n                 \n                 \n                     Package and publish commands now accept  --static  and  --template-dir \n                     Example usage: \n                     datasette package --static css:extra-css/ --static js:extra-js/ \\\n  sf-trees.db --template-dir templates/ --tag sf-trees --branch master \n                     This creates a local Docker image that includes copies of the templates/,\n                        extra-css/ and extra-js/ directories. You can then run it like this: \n                     docker run -p 8001:8001 sf-trees \n                     For publishing to Zeit now: \n                     datasette publish now --static css:extra-css/ --static js:extra-js/ \\\n  sf-trees.db --template-dir templates/ --name sf-trees --branch master \n                 \n                 \n                     HTML comment showing which templates were considered for a page ( #171 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://docs.datasette.io/en/stable/custom_templates.html\", \"label\": \"to be customized\"}, {\"href\": \"https://docs.datasette.io/en/stable/metadata.html\", \"label\": \"metadata.json format\"}, {\"href\": \"https://docs.datasette.io/en/stable/sql_queries.html#canned-queries\", \"label\": \"canned queries\"}, {\"href\": \"https://www.srihash.org/\", \"label\": \"https://www.srihash.org/\"}, {\"href\": \"https://github.com/simonw/datasette/issues/153\", \"label\": \"#153\"}, {\"href\": \"https://github.com/simonw/datasette/issues/153\", \"label\": \"#153\"}, {\"href\": \"https://github.com/simonw/datasette/issues/160\", \"label\": \"#160\"}, {\"href\": \"https://github.com/simonw/datasette/issues/164\", \"label\": \"#164\"}, {\"href\": \"https://github.com/simonw/datasette/issues/165\", \"label\": \"#165\"}, {\"href\": \"https://github.com/simonw/datasette/issues/130\", \"label\": 
\"#130\"}, {\"href\": \"https://github.com/simonw/datasette/issues/168\", \"label\": \"#168\"}, {\"href\": \"https://github.com/channelcat/sanic/releases/tag/0.7.0\", \"label\": \"https://github.com/channelcat/sanic/releases/tag/0.7.0\"}, {\"href\": \"https://github.com/simonw/datasette/issues/171\", \"label\": \"#171\"}]"}, {"id": "changelog:id19", "page": "changelog", "ref": "id19", "title": "0.63 (2022-10-27)", "content": "See  Datasette 0.63: The annotated release notes  for more background on the changes in this release.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://simonwillison.net/2022/Oct/27/datasette-0-63/\", \"label\": \"Datasette 0.63: The annotated release notes\"}]"}, {"id": "changelog:id191", "page": "changelog", "ref": "id191", "title": "0.13 (2017-11-24)", "content": "Search now applies to current filters. \n                     Combined search into the same form as filters. \n                     Closes  #133 \n                 \n                 \n                     Much tidier design for table view header. \n                     Closes  #147 \n                 \n                 \n                     Added  ?column__not=blah  filter. \n                     Closes  #148 \n                 \n                 \n                     Row page now resolves foreign keys. \n                     Closes  #132 \n                 \n                 \n                     Further tweaks to select/input filter styling. \n                     Refs  #86  - thanks for the help, @natbat! \n                 \n                 \n                     Show linked foreign key in table cells. \n                 \n                 \n                     Added UI for editing table filters. \n                     Refs  #86 \n                 \n                 \n                     Hide FTS-created tables on index pages. 
\n                     Closes  #129 \n                 \n                 \n                     Add publish to heroku support [Jacob Kaplan-Moss] \n                     datasette publish heroku mydb.db \n                     Pull request  #104 \n                 \n                 \n                     Initial implementation of  ?_group_count=column . \n                     URL shortcut for counting rows grouped by one or more columns. \n                     ?_group_count=column1&_group_count=column2  works as well. \n                     SQL generated looks like this: \n                     select \"qSpecies\", count(*) as \"count\"\nfrom Street_Tree_List\ngroup by \"qSpecies\"\norder by \"count\" desc limit 100 \n                     Or for two columns like this: \n                     select \"qSpecies\", \"qSiteInfo\", count(*) as \"count\"\nfrom Street_Tree_List\ngroup by \"qSpecies\", \"qSiteInfo\"\norder by \"count\" desc limit 100 \n                     Refs  #44 \n                 \n                 \n                     Added  --branch=master  option to datasette publish and package. \n                     The  datasette publish  and  datasette package  commands both now accept an\n                        optional  --branch  argument. If provided, this can be used to specify a branch\n                        published to GitHub that should be built into the container. \n                     This makes it easier to test code that has not yet been officially released to\n                        PyPI, e.g.: \n                     datasette publish now mydb.db --branch=master \n                 \n                 \n                     Implemented  ?_search=XXX  + UI if a FTS table is detected. \n                     Closes  #131 \n                 \n                 \n                     Added  datasette --version  support. \n                 \n                 \n                     Table views now show expanded foreign key references, if possible. 
\n                     If a table has foreign key columns, and those foreign key tables have\n                         label_columns , the TableView will now query those other tables for the\n                        corresponding values and display those values as links in the corresponding\n                        table cells. \n                     label_columns are currently detected by the  inspect()  function, which looks\n                        for any table that has just two columns - an ID column and one other - and\n                        sets the  label_column  to be that second non-ID column. \n                 \n                 \n                     Don't prevent tabbing to \"Run SQL\" button ( #117 ) [Robert Gieseke] \n                     See comment in  #115 \n                 \n                 \n                     Add keyboard shortcut to execute SQL query ( #115 ) [Robert Gieseke] \n                 \n                 \n                     Allow  --load-extension  to be set via environment variable. \n                 \n                 \n                     Add support for  ?field__isnull=1  ( #107 ) [Ray N] \n                 \n                 \n                     Add spatialite, switch to debian and local build ( #114 ) [Ariel N\u00fa\u00f1ez] \n                 \n                 \n                     Added  --load-extension  argument to datasette serve. \n                     Allows loading of SQLite extensions. 
Refs  #110 .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/133\", \"label\": \"#133\"}, {\"href\": \"https://github.com/simonw/datasette/issues/147\", \"label\": \"#147\"}, {\"href\": \"https://github.com/simonw/datasette/issues/148\", \"label\": \"#148\"}, {\"href\": \"https://github.com/simonw/datasette/issues/132\", \"label\": \"#132\"}, {\"href\": \"https://github.com/simonw/datasette/issues/86\", \"label\": \"#86\"}, {\"href\": \"https://github.com/simonw/datasette/issues/86\", \"label\": \"#86\"}, {\"href\": \"https://github.com/simonw/datasette/issues/129\", \"label\": \"#129\"}, {\"href\": \"https://github.com/simonw/datasette/issues/104\", \"label\": \"#104\"}, {\"href\": \"https://github.com/simonw/datasette/issues/44\", \"label\": \"#44\"}, {\"href\": \"https://github.com/simonw/datasette/issues/131\", \"label\": \"#131\"}, {\"href\": \"https://github.com/simonw/datasette/issues/117\", \"label\": \"#117\"}, {\"href\": \"https://github.com/simonw/datasette/issues/115\", \"label\": \"#115\"}, {\"href\": \"https://github.com/simonw/datasette/issues/115\", \"label\": \"#115\"}, {\"href\": \"https://github.com/simonw/datasette/issues/107\", \"label\": \"#107\"}, {\"href\": \"https://github.com/simonw/datasette/issues/114\", \"label\": \"#114\"}, {\"href\": \"https://github.com/simonw/datasette/issues/110\", \"label\": \"#110\"}]"}, {"id": "changelog:id2", "page": "changelog", "ref": "id2", "title": "Other changes", "content": "Plugins that raise  datasette.utils.StartupError()  during startup now display a clean error message instead of a full traceback. ( #2624 ) \n                     \n                     \n                         Schema refreshes are now throttled to at most once per second, providing a small performance increase. 
( #2629 ) \n                     \n                     \n                         Minor performance improvement to  remove_infinites  \u2014 rows without infinity values now skip the list/dict reconstruction step. ( #2629 ) \n                     \n                     \n                         Filter inputs and the search input no longer trigger unwanted zoom on iOS Safari. Thanks,  Daniel Olasubomi Sobowale . ( #2346 ) \n                     \n                     \n                         table_names()  and  get_all_foreign_keys()  now return results in deterministic sorted order. ( #2628 ) \n                     \n                     \n                         Switched linting to  ruff  and fixed all lint errors. ( #2630 )", "breadcrumbs": "[\"Changelog\", \"1.0a24 (2026-01-29)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2624\", \"label\": \"#2624\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2629\", \"label\": \"#2629\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2629\", \"label\": \"#2629\"}, {\"href\": \"https://github.com/bowale-os\", \"label\": \"Daniel Olasubomi Sobowale\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2346\", \"label\": \"#2346\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2628\", \"label\": \"#2628\"}, {\"href\": \"https://github.com/astral-sh/ruff\", \"label\": \"ruff\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2630\", \"label\": \"#2630\"}]"}, {"id": "changelog:id20", "page": "changelog", "ref": "id20", "title": "Documentation", "content": "New tutorial:  Cleaning data with sqlite-utils and Datasette . \n                     \n                     \n                         Screenshots in the documentation are now maintained using  shot-scraper , as described in  Automating screenshots for the Datasette documentation using shot-scraper . 
( #1844 ) \n                     \n                     \n                         More detailed command descriptions on the  CLI reference  page. ( #1787 ) \n                     \n                     \n                         New documentation on  Running Datasette using OpenRC  - thanks, Adam Simpson. ( #1825 )", "breadcrumbs": "[\"Changelog\", \"0.63 (2022-10-27)\"]", "references": "[{\"href\": \"https://datasette.io/tutorials/clean-data\", \"label\": \"Cleaning data with sqlite-utils and Datasette\"}, {\"href\": \"https://shot-scraper.datasette.io/\", \"label\": \"shot-scraper\"}, {\"href\": \"https://simonwillison.net/2022/Oct/14/automating-screenshots/\", \"label\": \"Automating screenshots for the Datasette documentation using shot-scraper\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1844\", \"label\": \"#1844\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1787\", \"label\": \"#1787\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1825\", \"label\": \"#1825\"}]"}, {"id": "changelog:id206", "page": "changelog", "ref": "id206", "title": "0.12 (2017-11-16)", "content": "Added  __version__ , now displayed as tooltip in page footer ( #108 ). \n                 \n                 \n                     Added initial docs, including a changelog ( #99 ). \n                 \n                 \n                     Turned on auto-escaping in Jinja. \n                 \n                 \n                     Added a UI for editing named parameters ( #96 ). \n                     You can now construct a custom SQL statement using SQLite named\n                        parameters (e.g.  :name ) and datasette will display form fields for\n                        editing those parameters.  Here\u2019s an example  which lets you see the\n                        most popular names for dogs of different species registered through\n                        various dog registration schemes in Australia. 
\n                 \n             \n             \n             \n                 \n                     Pin to specific Jinja version. ( #100 ). \n                 \n                 \n                     Default to 127.0.0.1 not 0.0.0.0. ( #98 ). \n                 \n                 \n                     Added extra metadata options to publish and package commands. ( #92 ). \n                     You can now run these commands like so: \n                     datasette publish now mydb.db \\\n    --title=\"My Title\" \\\n    --source=\"Source\" \\\n    --source_url=\"http://www.example.com/\" \\\n    --license=\"CC0\" \\\n    --license_url=\"https://creativecommons.org/publicdomain/zero/1.0/\" \n                     This will write those values into the metadata.json that is packaged with the\n                        app. If you also pass  --metadata=metadata.json  that file will be updated with the extra\n                        values before being written into the Docker image. \n                 \n                 \n                     Added production-ready Dockerfile ( #94 ) [Andrew\n                        Cutler] \n                 \n                 \n                     New  ?_sql_time_limit_ms=10  argument to database and table page ( #95 ) \n                 \n                 \n                     SQL syntax highlighting with Codemirror ( #89 ) [Tom Dyson]", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/108\", \"label\": \"#108\"}, {\"href\": \"https://github.com/simonw/datasette/issues/99\", \"label\": \"#99\"}, {\"href\": \"https://github.com/simonw/datasette/issues/96\", \"label\": \"#96\"}, {\"href\": 
\"https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug\", \"label\": \"Here\u2019s an example\"}, {\"href\": \"https://github.com/simonw/datasette/issues/100\", \"label\": \"#100\"}, {\"href\": \"https://github.com/simonw/datasette/issues/98\", \"label\": \"#98\"}, {\"href\": \"https://github.com/simonw/datasette/issues/92\", \"label\": \"#92\"}, {\"href\": \"https://github.com/simonw/datasette/issues/94\", \"label\": \"#94\"}, {\"href\": \"https://github.com/simonw/datasette/issues/95\", \"label\": \"#95\"}, {\"href\": \"https://github.com/simonw/datasette/issues/89\", \"label\": \"#89\"}]"}, {"id": 
"changelog:id21", "page": "changelog", "ref": "id21", "title": "0.62 (2022-08-14)", "content": "Datasette can now run entirely in your browser using WebAssembly. Try out  Datasette Lite , take a look  at the code  or read more about it in  Datasette Lite: a server-side Python web application running in a browser . \n             Datasette now has a  Discord community  for questions and discussions about Datasette and its ecosystem of projects.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://lite.datasette.io/\", \"label\": \"Datasette Lite\"}, {\"href\": \"https://github.com/simonw/datasette-lite\", \"label\": \"at the code\"}, {\"href\": \"https://simonwillison.net/2022/May/4/datasette-lite/\", \"label\": \"Datasette Lite: a server-side Python web application running in a browser\"}, {\"href\": \"https://datasette.io/discord\", \"label\": \"Discord community\"}]"}, {"id": "changelog:id216", "page": "changelog", "ref": "id216", "title": "0.11 (2017-11-14)", "content": "Added  datasette publish now --force  option. \n                     This calls  now  with  --force  - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer. \n                 \n                 \n                     Enable  --cors  by default when running in a container.", "breadcrumbs": "[\"Changelog\"]", "references": "[]"}, {"id": "changelog:id217", "page": "changelog", "ref": "id217", "title": "0.10 (2017-11-14)", "content": "Fixed  #83  - 500 error on individual row pages. \n                 \n                 \n                     Stop using sqlite WITH RECURSIVE in our tests. 
\n                     The version of Python 3 running in Travis CI doesn't support this.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/83\", \"label\": \"#83\"}]"}, {"id": "changelog:id219", "page": "changelog", "ref": "id219", "title": "0.9 (2017-11-13)", "content": "Added  --sql_time_limit_ms  and  --extra-options . \n                     The serve command now accepts  --sql_time_limit_ms  for customizing the SQL time\n                        limit. \n                     The publish and package commands now accept  --extra-options  which can be used\n                        to specify additional options to be passed to the datasette serve command when\n                        it executes inside the resulting Docker containers.", "breadcrumbs": "[\"Changelog\"]", "references": "[]"}, {"id": "changelog:id22", "page": "changelog", "ref": "id22", "title": "Features", "content": "Datasette is now compatible with  Pyodide .  This is the enabling technology behind  Datasette Lite . ( #1733 ) \n                     \n                     \n                         Database file downloads now implement conditional GET using ETags. ( #1739 ) \n                     \n                     \n                         HTML for facet results and suggested results has been extracted out into new templates  _facet_results.html  and  _suggested_facets.html . Thanks, M. Nasimul Haque. ( #1759 ) \n                     \n                     \n                         Datasette now runs some SQL queries in parallel. This has limited impact on performance, see  this research issue  for details. \n                     \n                     \n                         New  --nolock  option for ignoring file locks when opening read-only databases. ( #1744 ) \n                     \n                     \n                         Spaces in the database names in URLs are now encoded as  +  rather than  ~20 . 
( #1701 ) \n                     \n                     \n                         <Binary: 2427344 bytes>  is now displayed as  <Binary: 2,427,344 bytes>  and is accompanied by tooltip showing \"2.3MB\". ( #1712 ) \n                     \n                     \n                         The base Docker image used by  datasette publish cloudrun ,  datasette package  and the  official Datasette image  has been upgraded to  3.10.6-slim-bullseye .  ( #1768 ) \n                     \n                     \n                         Canned writable queries against immutable databases now show a warning message. ( #1728 ) \n                     \n                     \n                         datasette publish cloudrun  has a new  --timeout  option which can be used to increase the time limit applied by the Google Cloud build environment. Thanks, Tim Sherratt. ( #1717 ) \n                     \n                     \n                         datasette publish cloudrun  has new  --min-instances  and  --max-instances  options. 
( #1779 )", "breadcrumbs": "[\"Changelog\", \"0.62 (2022-08-14)\"]", "references": "[{\"href\": \"https://pyodide.org/\", \"label\": \"Pyodide\"}, {\"href\": \"https://lite.datasette.io/\", \"label\": \"Datasette Lite\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1733\", \"label\": \"#1733\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1739\", \"label\": \"#1739\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1759\", \"label\": \"#1759\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1727\", \"label\": \"this research issue\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1744\", \"label\": \"#1744\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1701\", \"label\": \"#1701\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1712\", \"label\": \"#1712\"}, {\"href\": \"https://hub.docker.com/datasetteproject/datasette\", \"label\": \"official Datasette image\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1768\", \"label\": \"#1768\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1728\", \"label\": \"#1728\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1717\", \"label\": \"#1717\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1779\", \"label\": \"#1779\"}]"}, {"id": "changelog:id220", "page": "changelog", "ref": "id220", "title": "0.8 (2017-11-13)", "content": "V0.8 - added PyPI metadata, ready to ship. \n                 \n                 \n                     Implemented offset/limit pagination for views ( #70 ). \n                 \n                 \n                     Improved pagination. ( #78 ) \n                 \n                 \n                     Limit on max rows returned, controlled by  --max_returned_rows  option. 
( #69 ) \n                     If someone executes 'select * from table' against a table with a million rows\n                        in it, we could run into problems: just serializing that much data as JSON is\n                        likely to lock up the server. \n                     Solution: we now have a hard limit on the maximum number of rows that can be\n                        returned by a query. If that limit is exceeded, the server will return a\n                         \"truncated\": true  field in the JSON. \n                     This limit can be optionally controlled by the new  --max_returned_rows \n                        option. Setting that option to 0 disables the limit entirely.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/70\", \"label\": \"#70\"}, {\"href\": \"https://github.com/simonw/datasette/issues/78\", \"label\": \"#78\"}, {\"href\": \"https://github.com/simonw/datasette/issues/69\", \"label\": \"#69\"}]"}], "truncated": false}