{"id": "authentication:authentication", "page": "authentication", "ref": "authentication", "title": "Authentication and permissions", "content": "Datasette doesn't require authentication by default. Any visitor to a Datasette instance can explore the full data and execute read-only SQL queries. \n Datasette's plugin system can be used to add many different styles of authentication, such as user accounts, single sign-on or API keys.", "breadcrumbs": "[]", "references": "[]"} {"id": "javascript_plugins:javascript-datasette-init", "page": "javascript_plugins", "ref": "javascript-datasette-init", "title": "The datasette_init event", "content": "Datasette emits a custom event called datasette_init when the page is loaded. This event is dispatched on the document object, and includes a detail object with a reference to the datasetteManager object. \n Your JavaScript code can listen out for this event using document.addEventListener() like this: \n document.addEventListener(\"datasette_init\", function (evt) {\n const manager = evt.detail;\n console.log(\"Datasette version:\", manager.VERSION);\n});", "breadcrumbs": "[\"JavaScript plugins\"]", "references": "[]"} {"id": "facets:id1", "page": "facets", "ref": "id1", "title": "Facets", "content": "Datasette facets can be used to add a faceted browse interface to any database table.\n With facets, tables are displayed along with a summary showing the most common values in specified columns.\n These values can be selected to further filter the table. \n Here's an example : \n \n Facets can be specified in two ways: using query string parameters, or in metadata.json configuration for the table.", "breadcrumbs": "[]", "references": "[{\"href\": \"https://congress-legislators.datasettes.com/legislators/legislator_terms?_facet=type&_facet=party&_facet=state&_facet_size=10\", \"label\": \"an example\"}]"} {"id": "authentication:authentication-permissions", "page": "authentication", "ref": "authentication-permissions", "title": "Permissions", "content": "Datasette has an extensive permissions system built-in, which can be further extended and customized by plugins. \n The key question the permissions system answers is this: \n \n Is this actor allowed to perform this action , optionally against this particular resource ? \n \n Actors are described above . \n An action is a string describing the action the actor would like to perform. A full list is provided below - examples include view-table and execute-sql . \n A resource is the item the actor wishes to interact with - for example a specific database or table. Some actions, such as permissions-debug , are not associated with a particular resource. \n Datasette's built-in view permissions ( view-database , view-table etc) default to allow - unless you configure additional permission rules unauthenticated users will be allowed to access content. \n Permissions with potentially harmful effects should default to deny . Plugin authors should account for this when designing new plugins - for example, the datasette-upload-csvs plugin defaults to deny so that installations don't accidentally allow unauthenticated users to create new tables by uploading a CSV file.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-upload-csvs\", \"label\": \"datasette-upload-csvs\"}]"} {"id": "changelog:id37", "page": "changelog", "ref": "id37", "title": "0.53 (2020-12-10)", "content": "Datasette has an official project website now, at https://datasette.io/ . 
This release mainly updates the documentation to reflect the new site. \n \n \n New ?column__arraynotcontains= table filter. ( #1132 ) \n \n \n datasette serve has a new --create option, which will create blank database files if they do not already exist rather than exiting with an error. ( #1135 ) \n \n \n New ?_header=off option for CSV export which omits the CSV header row, documented here . ( #1133 ) \n \n \n \"Powered by Datasette\" link in the footer now links to https://datasette.io/ . ( #1138 ) \n \n \n Project news no longer lives in the README - it can now be found at https://datasette.io/news . ( #1137 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://datasette.io/\", \"label\": \"https://datasette.io/\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1132\", \"label\": \"#1132\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1135\", \"label\": \"#1135\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1133\", \"label\": \"#1133\"}, {\"href\": \"https://datasette.io/\", \"label\": \"https://datasette.io/\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1138\", \"label\": \"#1138\"}, {\"href\": \"https://datasette.io/news\", \"label\": \"https://datasette.io/news\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1137\", \"label\": \"#1137\"}]"} {"id": "getting_started:getting-started-tutorial", "page": "getting_started", "ref": "getting-started-tutorial", "title": "Follow a tutorial", "content": "Datasette has several tutorials to help you get started with the tool. Try one of the following: \n \n \n Exploring a database with Datasette shows how to use the Datasette web interface to explore a new database. \n \n \n Learn SQL with Datasette introduces SQL, and shows how to use that query language to ask questions of your data. \n \n \n Cleaning data with sqlite-utils and Datasette guides you through using sqlite-utils to turn a CSV file into a database that you can explore using Datasette.", "breadcrumbs": "[\"Getting started\"]", "references": "[{\"href\": \"https://datasette.io/tutorials\", \"label\": \"tutorials\"}, {\"href\": \"https://datasette.io/tutorials/explore\", \"label\": \"Exploring a database with Datasette\"}, {\"href\": \"https://datasette.io/tutorials/learn-sql\", \"label\": \"Learn SQL with Datasette\"}, {\"href\": \"https://datasette.io/tutorials/clean-data\", \"label\": \"Cleaning data with sqlite-utils and Datasette\"}, {\"href\": \"https://sqlite-utils.datasette.io/\", \"label\": \"sqlite-utils\"}]"} {"id": "sql_queries:sql-parameters", "page": "sql_queries", "ref": "sql-parameters", "title": "Named parameters", "content": "Datasette has special support for SQLite named parameters. Consider a SQL query like this: \n select * from Street_Tree_List\nwhere \"PermitNotes\" like :notes\nand \"qSpecies\" = :species \n If you execute this query using the custom query editor, Datasette will extract the two named parameters and use them to construct form fields for you to provide values. \n You can also provide values for these fields by constructing a URL: \n /mydatabase?sql=select...&species=44 \n SQLite string escaping rules will be applied to values passed using named parameters - they will be wrapped in quotes and their content will be correctly escaped. \n Values from named parameters are treated as SQLite strings. 
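The string-typing behaviour can be seen outside Datasette with a minimal sqlite3 sketch (the table and values here are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table Street_Tree_List (qSpecies text, latitude real)")
conn.execute("insert into Street_Tree_List values ('Oak', 37.76)")

# SQLite named parameters use the same :name syntax Datasette extracts;
# values supplied from a URL arrive as strings
rows = conn.execute(
    "select * from Street_Tree_List where qSpecies = :species",
    {"species": "Oak"},
).fetchall()
print(rows)  # [('Oak', 37.76)]
```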
If you need to perform numeric comparisons on them you should cast them to an integer or float first using cast(:name as integer) or cast(:name as real) , for example: \n select * from Street_Tree_List\nwhere latitude > cast(:min_latitude as real)\nand latitude < cast(:max_latitude as real) \n Datasette disallows custom SQL queries containing the string PRAGMA (with a small number of exceptions ) as SQLite pragma statements can be used to change database settings at runtime. If you need to include the string \"pragma\" in a query you can do so safely using a named parameter.", "breadcrumbs": "[\"Running SQL queries\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/761\", \"label\": \"of exceptions\"}]"} {"id": "authentication:authentication-ds-actor", "page": "authentication", "ref": "authentication-ds-actor", "title": "The ds_actor cookie", "content": "Datasette includes a default authentication plugin which looks for a signed ds_actor cookie containing a JSON actor dictionary. This is how the root actor mechanism works. \n Authentication plugins can set signed ds_actor cookies themselves like so: \n response = Response.redirect(\"/\")\nresponse.set_cookie(\n \"ds_actor\",\n datasette.sign({\"a\": {\"id\": \"cleopaws\"}}, \"actor\"),\n) \n Note that you need to pass \"actor\" as the namespace to .sign(value, namespace=\"default\") . \n The shape of data encoded in the cookie is as follows: \n {\n \"a\": {... actor ...}\n}", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "authentication:createtokenview", "page": "authentication", "ref": "createtokenview", "title": "API Tokens", "content": "Datasette includes a default mechanism for generating API tokens that can be used to authenticate requests. \n Authenticated users can create new API tokens using a form on the /-/create-token page. \n Tokens created in this way can be further restricted to only allow access to specific actions, or to limit those actions to specific databases, tables or queries. \n Created tokens can then be passed in the Authorization: Bearer $token header of HTTP requests to Datasette. \n A token created by a user will include that user's \"id\" in the token payload, so any permissions granted to that user based on their ID can be made available to the token as well. \n When one of these tokens accompanies a request, the actor for that request will have the following shape: \n {\n \"id\": \"user_id\",\n \"token\": \"dstok\",\n \"token_expires\": 1667717426\n} \n The \"id\" field duplicates the ID of the actor who first created the token. \n The \"token\" field identifies that this actor was authenticated using a Datasette signed token ( dstok ). \n The \"token_expires\" field, if present, indicates that the token will expire after that integer timestamp. \n The /-/create-token page cannot be accessed by actors that are authenticated with a \"token\": \"some-value\" property. This is to prevent API tokens from being used to create more tokens. \n Datasette plugins that implement their own form of API token authentication should follow this convention. \n You can disable the signed token feature entirely using the allow_signed_tokens setting.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "events:id1", "page": "events", "ref": "id1", "title": "Events", "content": "Datasette includes a mechanism for tracking events that occur while the software is running. 
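A minimal sketch of a plugin consuming these events via the track_event(datasette, event) hook signature given in this section; the event attributes are assumptions based on the class definitions listed below:

```python
from datasette import hookimpl


@hookimpl
def track_event(datasette, event):
    # event.name matches the "Event name" shown for each class below,
    # e.g. "login", "create-table", "insert-rows"
    if event.name == "create-table":
        print(f"Table created: {event.database}/{event.table}")
```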
This is primarily intended to be used by plugins, which can both trigger events and listen for events. \n The core Datasette application triggers events when certain things happen. This page describes those events. \n Plugins can listen for events using the track_event(datasette, event) plugin hook, which will be called with instances of the following classes - or additional classes registered by other plugins . \n \n \n \n \n class datasette.events. LoginEvent actor : dict | None \n \n Event name: login \n A user (represented by event.actor ) has logged in. \n \n \n \n \n class datasette.events. LogoutEvent actor : dict | None \n \n Event name: logout \n A user (represented by event.actor ) has logged out. \n \n \n \n \n class datasette.events. CreateTokenEvent actor : dict | None expires_after : int | None restrict_all : list restrict_database : dict restrict_resource : dict \n \n Event name: create-token \n A user created an API token. \n \n \n Variables \n \n \n \n expires_after -- Number of seconds after which this token will expire. \n \n \n restrict_all -- Restricted permissions for this token. \n \n \n restrict_database -- Restricted database permissions for this token. \n \n \n restrict_resource -- Restricted resource permissions for this token. \n \n \n \n \n \n \n \n \n \n class datasette.events. CreateTableEvent actor : dict | None database : str table : str schema : str \n \n Event name: create-table \n A new table has been created in the database. \n \n \n Variables \n \n \n \n database -- The name of the database where the table was created. \n \n \n table -- The name of the table that was created \n \n \n schema -- The SQL schema definition for the new table. \n \n \n \n \n \n \n \n \n \n class datasette.events. DropTableEvent actor : dict | None database : str table : str \n \n Event name: drop-table \n A table has been dropped from the database. \n \n \n Variables \n \n \n \n database -- The name of the database where the table was dropped. \n \n \n table -- The name of the table that was dropped \n \n \n \n \n \n \n \n \n \n class datasette.events. AlterTableEvent actor : dict | None database : str table : str before_schema : str after_schema : str \n \n Event name: alter-table \n A table has been altered. \n \n \n Variables \n \n \n \n database -- The name of the database where the table was altered \n \n \n table -- The name of the table that was altered \n \n \n before_schema -- The table's SQL schema before the alteration \n \n \n after_schema -- The table's SQL schema after the alteration \n \n \n \n \n \n \n \n \n \n class datasette.events. InsertRowsEvent actor : dict | None database : str table : str num_rows : int ignore : bool replace : bool \n \n Event name: insert-rows \n Rows were inserted into a table. \n \n \n Variables \n \n \n \n database -- The name of the database where the rows were inserted. \n \n \n table -- The name of the table where the rows were inserted. \n \n \n num_rows -- The number of rows that were requested to be inserted. \n \n \n ignore -- Was ignore set? \n \n \n replace -- Was replace set? \n \n \n \n \n \n \n \n \n \n class datasette.events. UpsertRowsEvent actor : dict | None database : str table : str num_rows : int \n \n Event name: upsert-rows \n Rows were upserted into a table. \n \n \n Variables \n \n \n \n database -- The name of the database where the rows were inserted. \n \n \n table -- The name of the table where the rows were inserted. \n \n \n num_rows -- The number of rows that were requested to be inserted. 
\n \n \n \n \n \n \n \n \n \n class datasette.events. UpdateRowEvent actor : dict | None database : str table : str pks : list \n \n Event name: update-row \n A row was updated in a table. \n \n \n Variables \n \n \n \n database -- The name of the database where the row was updated. \n \n \n table -- The name of the table where the row was updated. \n \n \n pks -- The primary key values of the updated row. \n \n \n \n \n \n \n \n \n \n class datasette.events. DeleteRowEvent actor : dict | None database : str table : str pks : list \n \n Event name: delete-row \n A row was deleted from a table. \n \n \n Variables \n \n \n \n database -- The name of the database where the row was deleted. \n \n \n table -- The name of the table where the row was deleted. \n \n \n pks -- The primary key values of the deleted row.", "breadcrumbs": "[]", "references": "[]"} {"id": "plugin_hooks:plugin-event-tracking", "page": "plugin_hooks", "ref": "plugin-event-tracking", "title": "Event tracking", "content": "Datasette includes an internal mechanism for tracking notable events. This can be used for analytics, but can also be used by plugins that want to listen out for when key events occur (such as a table being created) and take action in response. \n Plugins can register to receive events using the track_event plugin hook. \n They can also define their own events for other plugins to receive using the register_events() plugin hook , combined with calls to the datasette.track_event() internal method .", "breadcrumbs": "[\"Plugin hooks\"]", "references": "[]"} {"id": "introspection:id1", "page": "introspection", "ref": "id1", "title": "Introspection", "content": "Datasette includes some pages and JSON API endpoints for introspecting the current instance. These can be used to understand some of the internals of Datasette and to see how a particular instance has been configured. \n Each of these pages can be viewed in your browser. Add .json to the URL to get back the contents as JSON.", "breadcrumbs": "[]", "references": "[]"} {"id": "publish:publishing", "page": "publish", "ref": "publishing", "title": "Publishing data", "content": "Datasette includes tools for publishing and deploying your data to the internet. The datasette publish command will deploy a new Datasette instance containing your databases directly to a Heroku or Google Cloud hosting account. You can also use datasette package to create a Docker image that bundles your databases together with the datasette application that is used to serve them.", "breadcrumbs": "[]", "references": "[]"} {"id": "contributing:id1", "page": "contributing", "ref": "id1", "title": "Contributing", "content": "Datasette is an open source project. We welcome contributions! \n This document describes how to contribute to Datasette core. You can also contribute to the wider Datasette ecosystem by creating new Plugins .", "breadcrumbs": "[]", "references": "[]"} {"id": "changelog:new-visual-design", "page": "changelog", "ref": "new-visual-design", "title": "New visual design", "content": "Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. 
( #1056 )", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://twitter.com/natbat\", \"label\": \"Natalie Downe\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1056\", \"label\": \"#1056\"}]"} {"id": "changelog:id14", "page": "changelog", "ref": "id14", "title": "Features", "content": "Datasette is now compatible with Pyodide . This is the enabling technology behind Datasette Lite . ( #1733 ) \n \n \n Database file downloads now implement conditional GET using ETags. ( #1739 ) \n \n \n HTML for facet results and suggested results has been extracted out into new templates _facet_results.html and _suggested_facets.html . Thanks, M. Nasimul Haque. ( #1759 ) \n \n \n Datasette now runs some SQL queries in parallel. This has limited impact on performance, see this research issue for details. \n \n \n New --nolock option for ignoring file locks when opening read-only databases. ( #1744 ) \n \n \n Spaces in the database names in URLs are now encoded as + rather than ~20 . ( #1701 ) \n \n \n is now displayed as and is accompanied by tooltip showing \"2.3MB\". ( #1712 ) \n \n \n The base Docker image used by datasette publish cloudrun , datasette package and the official Datasette image has been upgraded to 3.10.6-slim-bullseye . ( #1768 ) \n \n \n Canned writable queries against immutable databases now show a warning message. ( #1728 ) \n \n \n datasette publish cloudrun has a new --timeout option which can be used to increase the time limit applied by the Google Cloud build environment. Thanks, Tim Sherratt. ( #1717 ) \n \n \n datasette publish cloudrun has new --min-instances and --max-instances options. ( #1779 )", "breadcrumbs": "[\"Changelog\", \"0.62 (2022-08-14)\"]", "references": "[{\"href\": \"https://pyodide.org/\", \"label\": \"Pyodide\"}, {\"href\": \"https://lite.datasette.io/\", \"label\": \"Datasette Lite\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1733\", \"label\": \"#1733\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1739\", \"label\": \"#1739\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1759\", \"label\": \"#1759\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1727\", \"label\": \"this research issue\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1744\", \"label\": \"#1744\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1701\", \"label\": \"#1701\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1712\", \"label\": \"#1712\"}, {\"href\": \"https://hub.docker.com/datasetteproject/datasette\", \"label\": \"official Datasette image\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1768\", \"label\": \"#1768\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1728\", \"label\": \"#1728\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1717\", \"label\": \"#1717\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1779\", \"label\": \"#1779\"}]"} {"id": "internals:internals-internal", "page": "internals", "ref": "internals-internal", "title": "Datasette's internal database", "content": "Datasette maintains an \"internal\" SQLite database used for configuration, caching, and storage. Plugins can store configuration, settings, and other data inside this database. By default, Datasette will use a temporary in-memory SQLite database as the internal database, which is created at startup and destroyed at shutdown. 
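A sketch of how a plugin might persist its own data here, assuming the datasette.get_internal_database() method and the naming etiquette described later in this section (the table name is illustrative):

```python
async def record_visit(datasette, path):
    internal_db = datasette.get_internal_database()
    # Prefix table names with the plugin name, per the etiquette notes below
    await internal_db.execute_write(
        "create table if not exists datasette_myplugin_visits (path text)"
    )
    await internal_db.execute_write(
        "insert into datasette_myplugin_visits (path) values (?)", [path]
    )
```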
Users of Datasette can optionally pass in a --internal flag to specify the path to a SQLite database to use as the internal database, which will persist internal data across Datasette instances. \n Datasette maintains tables called catalog_databases , catalog_tables , catalog_columns , catalog_indexes , catalog_foreign_keys with details of the attached databases and their schemas. These tables should not be considered a stable API - they may change between Datasette releases. \n The internal database is not exposed in the Datasette application by default, which means private data can safely be stored without worry of accidentally leaking information through the default Datasette interface and API. However, other plugins do have full read and write access to the internal database. \n Plugins can access this database by calling internal_db = datasette.get_internal_database() and then executing queries using the Database API . \n Plugin authors are asked to practice good etiquette when using the internal database, as all plugins use the same database to store data. For example: \n \n \n Use a unique prefix when creating tables, indices, and triggers in the internal database. If your plugin is called datasette-xyz , then prefix names with datasette_xyz_* . \n \n \n Avoid long-running write statements that may stall or block other plugins that are trying to write at the same time. \n \n \n Use temporary tables or shared in-memory attached databases when possible. \n \n \n Avoid implementing features that could expose private data stored in the internal database by other plugins.", "breadcrumbs": "[\"Internals for plugins\"]", "references": "[]"} {"id": "changelog:minor-fixes", "page": "changelog", "ref": "minor-fixes", "title": "Minor fixes", "content": "Datasette no longer attempts to run SQL queries in parallel when rendering a table page, as this was leading to some rare crashing bugs. ( #2189 ) \n \n \n Fixed warning: DeprecationWarning: pkg_resources is deprecated as an API ( #2057 ) \n \n \n Fixed bug where ?_extra=columns parameter returned an incorrectly shaped response. ( #2230 )", "breadcrumbs": "[\"Changelog\", \"1.0a8 (2024-02-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2189\", \"label\": \"#2189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2057\", \"label\": \"#2057\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2230\", \"label\": \"#2230\"}]"} {"id": "changelog:id7", "page": "changelog", "ref": "id7", "title": "0.64 (2023-01-09)", "content": "Datasette now strongly recommends against allowing arbitrary SQL queries if you are using SpatiaLite . SpatiaLite includes SQL functions that could cause the Datasette server to crash. See SpatiaLite for more details. \n \n \n New default_allow_sql setting, providing an easier way to disable all arbitrary SQL execution by end users: datasette --setting default_allow_sql off . See also Controlling the ability to execute arbitrary SQL . ( #1409 ) \n \n \n Building a location to time zone API with SpatiaLite is a new Datasette tutorial showing how to safely use SpatiaLite to create a location to time zone API. \n \n \n New documentation about how to debug problems loading SQLite extensions . The error message shown when an extension cannot be loaded has also been improved. 
( #1979 ) \n \n \n Fixed an accessibility issue: the", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1409\", \"label\": \"#1409\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1979\", \"label\": \"#1979\"}]"} {"id": "internals:internals-csrf", "page": "internals", "ref": "internals-csrf", "title": "CSRF protection", "content": "If you are rendering templates using the await .render_template(template, context=None, request=None) method the csrftoken() helper will only work if you provide the request= argument to that method. If you forget to do this you will see the following error: \n form-urlencoded POST field did not match cookie \n You can selectively disable CSRF protection using the skip_csrf(datasette, scope) hook.", "breadcrumbs": "[\"Internals for plugins\"]", "references": "[{\"href\": \"https://github.com/simonw/asgi-csrf\", \"label\": \"asgi-csrf\"}]"} {"id": "internals:internals-tilde-encoding", "page": "internals", "ref": "internals-tilde-encoding", "title": "Tilde encoding", "content": "Datasette uses a custom encoding scheme in some places, called tilde encoding . This is primarily used for table names and row primary keys, to avoid any confusion between / characters in those values and the Datasette URLs that reference them. \n Tilde encoding uses the same algorithm as URL percent-encoding , but with the ~ tilde character used in place of % . \n Any character other than ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz0123456789_- will be replaced by the numeric equivalent preceded by a tilde. For example: \n \n \n / becomes ~2F \n \n \n . becomes ~2E \n \n \n % becomes ~25 \n \n \n ~ becomes ~7E \n \n \n Space becomes + \n \n \n polls/2022.primary becomes polls~2F2022~2Eprimary \n \n \n Note that the space character is a special case: it will be replaced with a + symbol. \n \n \n \n datasette.utils. tilde_encode s : str str \n \n Returns tilde-encoded string - for example /foo/bar -> ~2Ffoo~2Fbar \n \n \n \n \n \n datasette.utils. tilde_decode s : str str \n \n Decodes a tilde-encoded string, so ~2Ffoo~2Fbar -> /foo/bar", "breadcrumbs": "[\"Internals for plugins\", \"The datasette.utils module\"]", "references": "[{\"href\": \"https://developer.mozilla.org/en-US/docs/Glossary/percent-encoding\", \"label\": \"URL percent-encoding\"}]"} {"id": "settings:setting-secret", "page": "settings", "ref": "setting-secret", "title": "Configuring the secret", "content": "Datasette uses a secret string to sign secure values such as cookies. \n If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies and API tokens will not stay valid between restarts. \n You can pass a secret to Datasette in two ways: with the --secret command-line option or by setting a DATASETTE_SECRET environment variable. \n datasette mydb.db --secret=SECRET_VALUE_HERE \n Or: \n export DATASETTE_SECRET=SECRET_VALUE_HERE\ndatasette mydb.db \n One way to generate a secure random secret is to use Python like this: \n python3 -c 'import secrets; print(secrets.token_hex(32))'\ncdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52 \n Plugin authors make use of this signing mechanism in their plugins using .sign(value, namespace=\"default\") and .unsign(value, namespace=\"default\") .", "breadcrumbs": "[\"Settings\"]", "references": "[]"} {"id": "contributing:contributing-formatting", "page": "contributing", "ref": "contributing-formatting", "title": "Code formatting", "content": "Datasette uses opinionated code formatters: Black for Python and Prettier for JavaScript. 
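Running them locally typically looks something like this (invocations are illustrative; see the contributing documentation for the project's exact commands):

```
# Reformat Python code in place with Black
black .

# Check JavaScript formatting with Prettier (target path is illustrative)
npx prettier --check datasette/static/*.js
```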
\n These formatters are enforced by Datasette's continuous integration: if a commit includes Python or JavaScript code that does not match the style enforced by those tools, the tests will fail. \n When developing locally, you can verify and correct the formatting of your code using these tools.", "breadcrumbs": "[\"Contributing\"]", "references": "[{\"href\": \"https://github.com/psf/black\", \"label\": \"Black\"}, {\"href\": \"https://prettier.io/\", \"label\": \"Prettier\"}]"} {"id": "changelog:writable-canned-queries", "page": "changelog", "ref": "writable-canned-queries", "title": "Writable canned queries", "content": "Datasette's Canned queries feature lets you define SQL queries in metadata.json which can then be executed by users visiting a specific URL. https://latest.datasette.io/fixtures/neighborhood_search for example. \n Canned queries were previously restricted to SELECT , but Datasette 0.44 introduces the ability for canned queries to execute INSERT or UPDATE queries as well, using the new \"write\": true property ( #800 ): \n {\n \"databases\": {\n \"dogs\": {\n \"queries\": {\n \"add_name\": {\n \"sql\": \"INSERT INTO names (name) VALUES (:name)\",\n \"write\": true\n }\n }\n }\n }\n} \n See Writable canned queries for more details.", "breadcrumbs": "[\"Changelog\", \"0.44 (2020-06-11)\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/neighborhood_search\", \"label\": \"https://latest.datasette.io/fixtures/neighborhood_search\"}, {\"href\": \"https://github.com/simonw/datasette/issues/800\", \"label\": \"#800\"}]"} {"id": "changelog:new-configuration-settings", "page": "changelog", "ref": "new-configuration-settings", "title": "New configuration settings", "content": "Datasette's Settings now also supports boolean settings. A number of new\n configuration options have been added: \n \n \n num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3. \n \n \n allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on. \n \n \n suggest_facets - should Datasette suggest facets? Defaults to on. \n \n \n allow_download - should users be allowed to download the entire SQLite database? Defaults to on. \n \n \n allow_sql - should users be allowed to execute custom SQL queries? Defaults to on. \n \n \n default_cache_ttl - Default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0. \n \n \n cache_size_kb - Set the amount of memory SQLite uses for its per-connection cache , in KB. \n \n \n allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on. \n \n \n max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://www.sqlite.org/pragma.html#pragma_cache_size\", \"label\": \"per-connection cache\"}]"} {"id": "configuration:configuration-reference-permissions", "page": "configuration", "ref": "configuration-reference-permissions", "title": "Permissions configuration", "content": "Datasette's authentication and permissions system can also be configured using datasette.yaml. 
\n Here is a simple example: \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n # Instance is only available to users 'sharon' and 'percy':\n allow:\n id:\n - sharon\n - percy\n\n # Only 'percy' is allowed access to the accounting database:\n databases:\n accounting:\n allow:\n id: percy\n \"\"\").strip()\n ) \n ]]] \n [[[end]]] \n Access permissions in datasette.yaml has the full details.", "breadcrumbs": "[\"Configuration\", null]", "references": "[]"} {"id": "metadata:label-columns", "page": "metadata", "ref": "label-columns", "title": "Specifying the label column for a table", "content": "Datasette's HTML interface attempts to display foreign key references as\n labelled hyperlinks. By default, it looks for referenced tables that only have\n two columns: a primary key column and one other. It assumes that the second\n column should be used as the link label. \n If your table has more than two columns you can specify which column should be\n used for the link label with the label_column property: \n [[[cog\nmetadata_example(cog, {\n \"databases\": {\n \"database1\": {\n \"tables\": {\n \"example_table\": {\n \"label_column\": \"title\"\n }\n }\n }\n }\n}) \n ]]] \n [[[end]]]", "breadcrumbs": "[\"Metadata\"]", "references": "[]"} {"id": "sql_queries:id2", "page": "sql_queries", "ref": "id2", "title": "Pagination", "content": "Datasette's default table pagination is designed to be extremely efficient. SQL OFFSET/LIMIT pagination can have a significant performance penalty once you get into multiple thousands of rows, as each page still requires the database to scan through every preceding row to find the correct offset. \n When paginating through tables, Datasette instead orders the rows in the table by their primary key and performs a WHERE clause against the last seen primary key for the previous page. For example: \n select rowid, * from Tree_List where rowid > 200 order by rowid limit 101 \n This represents page three for this particular table, with a page size of 100. \n Note that we request 101 items in the limit clause rather than 100. This allows us to detect if we are on the last page of the results: if the query returns less than 101 rows we know we have reached the end of the pagination set. Datasette will only return the first 100 rows - the 101st is used purely to detect if there should be another page. \n Since the where clause acts against the index on the primary key, the query is extremely fast even for records that are a long way into the overall pagination set.", "breadcrumbs": "[\"Running SQL queries\"]", "references": "[]"} {"id": "contributing:contributing-documentation", "page": "contributing", "ref": "contributing-documentation", "title": "Editing and building the documentation", "content": "Datasette's documentation lives in the docs/ directory and is deployed automatically using Read The Docs . \n The documentation is written using reStructuredText. You may find this article on The subset of reStructuredText worth committing to memory useful. \n You can build it locally by installing sphinx and sphinx_rtd_theme in your Datasette development environment and then running make html directly in the docs/ directory: \n # You may first need to activate your virtual environment:\nsource venv/bin/activate\n\n# Install the dependencies needed to build the docs\npip install -e .[docs]\n\n# Now build the docs\ncd docs/\nmake html \n This will create the HTML version of the documentation in docs/_build/html . 
You can open it in your browser like so: \n open _build/html/index.html \n Any time you make changes to a .rst file you can re-run make html to update the built documents, then refresh them in your browser. \n For added productivity, you can use sphinx-autobuild to run Sphinx in auto-build mode. This will run a local webserver serving the docs that automatically rebuilds them and refreshes the page any time you hit save in your editor. \n sphinx-autobuild will have been installed when you ran pip install -e .[docs] . In your docs/ directory you can start the server by running the following: \n make livehtml \n Now browse to http://localhost:8000/ to view the documentation. Any edits you make should be instantly reflected in your browser.", "breadcrumbs": "[\"Contributing\"]", "references": "[{\"href\": \"https://readthedocs.org/\", \"label\": \"Read The Docs\"}, {\"href\": \"https://simonwillison.net/2018/Aug/25/restructuredtext/\", \"label\": \"The subset of reStructuredText worth committing to memory\"}, {\"href\": \"https://pypi.org/project/sphinx-autobuild/\", \"label\": \"sphinx-autobuild\"}]"} {"id": "plugins:id1", "page": "plugins", "ref": "id1", "title": "Plugins", "content": "Datasette's plugin system allows additional features to be implemented as Python\n code (or front-end JavaScript) which can be wrapped up in a separate Python\n package. The underlying mechanism uses pluggy . \n See the Datasette plugins directory for a list of existing plugins, or take a look at the\n datasette-plugin topic on GitHub. \n Things you can do with plugins include: \n \n \n Add visualizations to Datasette, for example\n datasette-cluster-map and\n datasette-vega . \n \n \n Make new custom SQL functions available for use within Datasette, for example\n datasette-haversine and\n datasette-jellyfish . \n \n \n Define custom output formats with custom extensions, for example datasette-atom and\n datasette-ics . \n \n \n Add template functions that can be called within your Jinja custom templates,\n for example datasette-render-markdown . \n \n \n Customize how database values are rendered in the Datasette interface, for example\n datasette-render-binary and\n datasette-pretty-json . 
\n \n \n Customize how Datasette's authentication and permissions systems work, for example datasette-auth-passwords and\n datasette-permissions-sql .", "breadcrumbs": "[]", "references": "[{\"href\": \"https://pluggy.readthedocs.io/\", \"label\": \"pluggy\"}, {\"href\": \"https://datasette.io/plugins\", \"label\": \"Datasette plugins directory\"}, {\"href\": \"https://github.com/topics/datasette-plugin\", \"label\": \"datasette-plugin\"}, {\"href\": \"https://github.com/simonw/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}, {\"href\": \"https://github.com/simonw/datasette-vega\", \"label\": \"datasette-vega\"}, {\"href\": \"https://github.com/simonw/datasette-haversine\", \"label\": \"datasette-haversine\"}, {\"href\": \"https://github.com/simonw/datasette-jellyfish\", \"label\": \"datasette-jellyfish\"}, {\"href\": \"https://github.com/simonw/datasette-atom\", \"label\": \"datasette-atom\"}, {\"href\": \"https://github.com/simonw/datasette-ics\", \"label\": \"datasette-ics\"}, {\"href\": \"https://github.com/simonw/datasette-render-markdown#markdown-in-templates\", \"label\": \"datasette-render-markdown\"}, {\"href\": \"https://github.com/simonw/datasette-render-binary\", \"label\": \"datasette-render-binary\"}, {\"href\": \"https://github.com/simonw/datasette-pretty-json\", \"label\": \"datasette-pretty-json\"}, {\"href\": \"https://github.com/simonw/datasette-auth-passwords\", \"label\": \"datasette-auth-passwords\"}, {\"href\": \"https://github.com/simonw/datasette-permissions-sql\", \"label\": \"datasette-permissions-sql\"}]"} {"id": "facets:suggested-facets", "page": "facets", "ref": "suggested-facets", "title": "Suggested facets", "content": "Datasette's table UI will suggest facets for the user to apply, based on the following criteria: \n For the currently filtered data are there any columns which, if applied as a facet... \n \n \n Will return 30 or less unique options \n \n \n Will return more than one unique option \n \n \n Will return less unique options than the total number of filtered rows \n \n \n And the query used to evaluate this criteria can be completed in under 50ms \n \n \n That last point is particularly important: Datasette runs a query for every column that is displayed on a page, which could get expensive - so to avoid slow load times it sets a time limit of just 50ms for each of those queries.\n This means suggested facets are unlikely to appear for tables with millions of records in them.", "breadcrumbs": "[\"Facets\"]", "references": "[]"} {"id": "settings:setting-default-cache-ttl", "page": "settings", "ref": "setting-default-cache-ttl", "title": "default_cache_ttl", "content": "Default HTTP caching max-age header in seconds, used for Cache-Control: max-age=X . Can be over-ridden on a per-request basis using the ?_ttl= query string parameter. Set this to 0 to disable HTTP caching entirely. Defaults to 5 seconds. 
\n datasette mydatabase.db --setting default_cache_ttl 60", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"} {"id": "changelog:id76", "page": "changelog", "ref": "id76", "title": "0.31.1 (2019-11-12)", "content": "Deployments created using datasette publish now use python:3.8 base Docker image ( #629 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/pull/629\", \"label\": \"#629\"}]"} {"id": "internals:internals-utils-derive-named-parameters", "page": "internals", "ref": "internals-utils-derive-named-parameters", "title": "derive_named_parameters(db, sql)", "content": "Derive the list of named parameters referenced in a SQL query, using an explain query executed against the provided database. \n \n \n async datasette.utils. derive_named_parameters db : Database sql : str List [ str ] \n \n Given a SQL statement, return a list of named parameters that are used in the statement \n e.g. for select * from foo where id=:id this would return [\"id\"]", "breadcrumbs": "[\"Internals for plugins\", \"The datasette.utils module\"]", "references": "[]"} {"id": "changelog:documentation", "page": "changelog", "ref": "documentation", "title": "Documentation", "content": "Documentation describing how to write tests that use signed actor cookies using datasette.client.actor_cookie() . ( #1830 ) \n \n \n Documentation on how to register a plugin for the duration of a test . ( #2234 ) \n \n \n The configuration documentation now shows examples of both YAML and JSON for each setting.", "breadcrumbs": "[\"Changelog\", \"1.0a8 (2024-02-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1830\", \"label\": \"#1830\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2234\", \"label\": \"#2234\"}]"} {"id": "changelog:id92", "page": "changelog", "ref": "id92", "title": "0.25.1 (2018-11-04)", "content": "Documentation improvements plus a fix for publishing to Zeit Now. \n \n \n datasette publish now now uses Zeit's v1 platform, to work around the new 100MB image limit. Thanks, @slygent - closes #366 .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/366\", \"label\": \"#366\"}]"} {"id": "changelog:id32", "page": "changelog", "ref": "id32", "title": "0.56 (2021-03-28)", "content": "Documentation improvements, bug fixes and support for SpatiaLite 5. \n \n \n The SQL editor can now be resized by dragging a handle. ( #1236 ) \n \n \n Fixed a bug with JSON faceting and the __arraycontains filter caused by tables with spaces in their names. ( #1239 ) \n \n \n Upgraded httpx dependency. ( #1005 ) \n \n \n JSON faceting is now suggested even if a column contains blank strings. ( #1246 ) \n \n \n New datasette.add_memory_database() method. ( #1247 ) \n \n \n The Response.asgi_send() method is now documented. ( #1266 ) \n \n \n The official Datasette Docker image now bundles SpatiaLite version 5. ( #1278 ) \n \n \n Fixed a no such table: pragma_database_list bug when running Datasette against SQLite versions prior to SQLite 3.16.0. ( #1276 ) \n \n \n HTML lists displayed in table cells are now styled correctly. Thanks, Bob Whitelock. ( #1141 , #1252 ) \n \n \n Configuration directory mode now correctly serves immutable databases that are listed in inspect-data.json . Thanks Campbell Allen and Frankie Robertson. 
( #1031 , #1229 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1236\", \"label\": \"#1236\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1239\", \"label\": \"#1239\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1005\", \"label\": \"#1005\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1246\", \"label\": \"#1246\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1247\", \"label\": \"#1247\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1266\", \"label\": \"#1266\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1278\", \"label\": \"#1278\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1276\", \"label\": \"#1276\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1141\", \"label\": \"#1141\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1252\", \"label\": \"#1252\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1031\", \"label\": \"#1031\"}, {\"href\": \"https://github.com/simonw/datasette/pull/1229\", \"label\": \"#1229\"}]"} {"id": "changelog:id6", "page": "changelog", "ref": "id6", "title": "0.64.1 (2023-01-11)", "content": "Documentation now links to a current source of information for installing Python 3. ( #1987 ) \n \n \n Incorrectly calling the Datasette constructor using Datasette(\"path/to/data.db\") instead of Datasette([\"path/to/data.db\"]) now returns a useful error message. ( #1985 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1987\", \"label\": \"#1987\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1985\", \"label\": \"#1985\"}]"} {"id": "changelog:id42", "page": "changelog", "ref": "id42", "title": "0.52.1 (2020-11-29)", "content": "Documentation on Testing plugins now recommends using datasette.client . ( #1102 ) \n \n \n Fix bug where compound foreign keys produced broken links. ( #1098 ) \n \n \n datasette --load-module=spatialite now also checks for /usr/local/lib/mod_spatialite.so . Thanks, Dan Peterson. ( #1114 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1102\", \"label\": \"#1102\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1098\", \"label\": \"#1098\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1114\", \"label\": \"#1114\"}]"} {"id": "ecosystem:dogsheep", "page": "ecosystem", "ref": "dogsheep", "title": "Dogsheep", "content": "Dogsheep is a collection of tools for personal analytics using SQLite and Datasette. The project provides tools like github-to-sqlite and twitter-to-sqlite that can import data from different sources in order to create a personal data warehouse. Personal Data Warehouses: Reclaiming Your Data is a talk that explains Dogsheep and demonstrates it in action.", "breadcrumbs": "[\"The Datasette Ecosystem\"]", "references": "[{\"href\": \"https://dogsheep.github.io/\", \"label\": \"Dogsheep\"}, {\"href\": \"https://datasette.io/tools/github-to-sqlite\", \"label\": \"github-to-sqlite\"}, {\"href\": \"https://datasette.io/tools/twitter-to-sqlite\", \"label\": \"twitter-to-sqlite\"}, {\"href\": \"https://simonwillison.net/2020/Nov/14/personal-data-warehouses/\", \"label\": \"Personal Data Warehouses: Reclaiming Your Data\"}]"} {"id": "changelog:id68", "page": "changelog", "ref": "id68", "title": "0.37.1 (2020-03-02)", "content": "Don't attempt to count table rows to display on the index page for databases > 100MB. 
( #688 ) \n \n \n Print exceptions if they occur in the write thread rather than silently swallowing them. \n \n \n Handle the possibility of scope[\"path\"] being a string rather than bytes \n \n \n Better documentation for the extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/688\", \"label\": \"#688\"}]"} {"id": "changelog:bug-fixes", "page": "changelog", "ref": "bug-fixes", "title": "Bug fixes", "content": "Don't show the facet option in the cog menu if faceting is not allowed. ( #1683 ) \n \n \n ?_sort and ?_sort_desc now work if the column that is being sorted has been excluded from the query using ?_col= or ?_nocol= . ( #1773 ) \n \n \n Fixed bug where ?_sort_desc was duplicated in the URL every time the Apply button was clicked. ( #1738 )", "breadcrumbs": "[\"Changelog\", \"0.62 (2022-08-14)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1683\", \"label\": \"#1683\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1773\", \"label\": \"#1773\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1738\", \"label\": \"#1738\"}]"} {"id": "changelog:id3", "page": "changelog", "ref": "id3", "title": "0.64.5 (2023-10-08)", "content": "Dropped dependency on click-default-group-wheel , which could cause a dependency conflict. ( #2197 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2197\", \"label\": \"#2197\"}]"} {"id": "pages:databaseview", "page": "pages", "ref": "databaseview", "title": "Database", "content": "Each database has a page listing the tables, views and canned queries available for that database. If the execute-sql permission is enabled (it's on by default) there will also be an interface for executing arbitrary SQL select queries against the data. \n Examples: \n \n \n fivethirtyeight.datasettes.com/fivethirtyeight \n \n \n global-power-plants.datasettes.com/global-power-plants \n \n \n The JSON version of this page provides programmatic access to the underlying data: \n \n \n fivethirtyeight.datasettes.com/fivethirtyeight.json \n \n \n global-power-plants.datasettes.com/global-power-plants.json", "breadcrumbs": "[\"Pages and API endpoints\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight\", \"label\": \"fivethirtyeight.datasettes.com/fivethirtyeight\"}, {\"href\": \"https://global-power-plants.datasettes.com/global-power-plants\", \"label\": \"global-power-plants.datasettes.com/global-power-plants\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight.json\", \"label\": \"fivethirtyeight.datasettes.com/fivethirtyeight.json\"}, {\"href\": \"https://global-power-plants.datasettes.com/global-power-plants.json\", \"label\": \"global-power-plants.datasettes.com/global-power-plants.json\"}]"} {"id": "changelog:v1-0-a13", "page": "changelog", "ref": "v1-0-a13", "title": "1.0a13 (2024-03-12)", "content": "Each of the key concepts in Datasette now has an actions menu , which plugins can use to add additional functionality targeting that entity. \n \n \n Plugin hook: view_actions() for actions that can be applied to a SQL view. ( #2297 ) \n \n \n Plugin hook: homepage_actions() for actions that apply to the instance homepage. ( #2298 ) \n \n \n Plugin hook: row_actions() for actions that apply to the row page. 
( #2299 ) \n \n \n Action menu items for all of the *_actions() plugin hooks can now return an optional \"description\" key, which will be displayed in the menu below the action label. ( #2294 ) \n \n \n Plugin hooks documentation page is now organized with additional headings. ( #2300 ) \n \n \n Improved the display of action buttons on pages that also display metadata. ( #2286 ) \n \n \n The header and footer of the page now uses a subtle gradient effect, and options in the navigation menu are better visually defined. ( #2302 ) \n \n \n Table names that start with an underscore now default to hidden. ( #2104 ) \n \n \n pragma_table_list has been added to the allow-list of SQLite pragma functions supported by Datasette. select * from pragma_table_list() is no longer blocked. ( #2104 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2297\", \"label\": \"#2297\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2298\", \"label\": \"#2298\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2299\", \"label\": \"#2299\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2294\", \"label\": \"#2294\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2300\", \"label\": \"#2300\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2286\", \"label\": \"#2286\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2302\", \"label\": \"#2302\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2104\", \"label\": \"#2104\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475\", \"label\": \"#2104\"}]"} {"id": "settings:setting-allow-csv-stream", "page": "settings", "ref": "setting-allow-csv-stream", "title": "allow_csv_stream", "content": "Enables the CSV export feature where an entire table\n (potentially hundreds of thousands of rows) can be exported as a single CSV\n file. This is turned on by default - you can turn it off like this: \n datasette mydatabase.db --setting allow_csv_stream off", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"} {"id": "json_api:json-api-special", "page": "json_api", "ref": "json-api-special", "title": "Special JSON arguments", "content": "Every Datasette endpoint that can return JSON also accepts the following\n query string arguments: \n \n \n ?_shape=SHAPE \n \n The shape of the JSON to return, documented above. \n \n \n \n ?_nl=on \n \n When used with ?_shape=array produces newline-delimited JSON objects. \n \n \n \n ?_json=COLUMN1&_json=COLUMN2 \n \n If any of your SQLite columns contain JSON values, you can use one or more\n _json= parameters to request that those columns be returned as regular\n JSON. Without this argument those columns will be returned as JSON objects\n that have been double-encoded into a JSON string value. \n Compare this query without the argument to this query using the argument \n \n \n \n ?_json_infinity=on \n \n If your data contains infinity or -infinity values, Datasette will replace\n them with None when returning them as JSON. If you pass _json_infinity=1 \n Datasette will instead return them as Infinity or -Infinity which is\n invalid JSON but can be processed by some custom JSON parsers. \n \n \n \n ?_timelimit=MS \n \n Sets a custom time limit for the query in ms. 
You can use this for optimistic\n queries where you would like Datasette to give up if the query takes too\n long, for example if you want to implement autocomplete search but only if\n it can be executed in less than 10ms. \n \n \n \n ?_ttl=SECONDS \n \n For how many seconds should this response be cached by HTTP proxies? Use\n ?_ttl=0 to disable HTTP caching entirely for this request. \n \n \n \n ?_trace=1 \n \n Turns on tracing for this page: SQL queries executed during the request will\n be gathered and included in the response, either in a new \"_traces\" key\n for JSON responses or at the bottom of the page if the response is in HTML. \n The structure of the data returned here should be considered highly unstable\n and very likely to change. \n Only available if the trace_debug setting is enabled.", "breadcrumbs": "[\"JSON API\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight.json?sql=select+%27{%22this+is%22%3A+%22a+json+object%22}%27+as+d&_shape=array\", \"label\": \"this query without the argument\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight.json?sql=select+%27{%22this+is%22%3A+%22a+json+object%22}%27+as+d&_shape=array&_json=d\", \"label\": \"this query using the argument\"}]"} {"id": "changelog:latest-datasette-io", "page": "changelog", "ref": "latest-datasette-io", "title": "latest.datasette.io", "content": "Every commit to Datasette master is now automatically deployed by Travis CI to\n https://latest.datasette.io/ - ensuring there is always a live demo of the\n latest version of the software. \n The demo uses the fixtures from our\n unit tests, ensuring it demonstrates the same range of functionality that is\n covered by the tests. \n You can see how the deployment mechanism works in our .travis.yml file.", "breadcrumbs": "[\"Changelog\", \"0.23 (2018-06-18)\"]", "references": "[{\"href\": \"https://latest.datasette.io/\", \"label\": \"https://latest.datasette.io/\"}, {\"href\": \"https://github.com/simonw/datasette/blob/master/tests/fixtures.py\", \"label\": \"the fixtures\"}, {\"href\": \"https://github.com/simonw/datasette/blob/master/.travis.yml\", \"label\": \".travis.yml\"}]"} {"id": "custom_templates:css-classes-on-the-body", "page": "custom_templates", "ref": "css-classes-on-the-body", "title": "CSS classes on the <body>", "content": "Every default template includes CSS classes in the body designed to support\n custom styling. \n The index template (the top level page at / ) gets this: \n <body class=\"index\"> \n The database template ( /dbname ) gets this: \n <body class=\"db db-dbname\"> \n The custom SQL template ( /dbname?sql=... ) gets this: \n <body class=\"query db-dbname\"> \n A canned query template ( /dbname/queryname ) gets this: \n <body class=\"query db-dbname query-queryname\"> \n The table template ( /dbname/tablename ) gets: \n <body class=\"table db-dbname table-tablename\"> \n The row template ( /dbname/tablename/rowid ) gets: \n <body class=\"row db-dbname table-tablename\"> \n The db-x and table-x classes use the database or table names themselves if\n they are valid CSS identifiers. If they aren't, we strip any invalid\n characters out and append a 6 character md5 digest of the original name, in\n order to ensure that multiple tables which resolve to the same stripped\n character version still have different CSS classes. 
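That transformation can be sketched in Python as follows; this is a reconstruction consistent with the examples below, not necessarily Datasette's exact implementation:

```python
import hashlib
import re

css_class_re = re.compile(r"^[a-zA-Z]+[_a-zA-Z0-9-]*$")
css_invalid_chars_re = re.compile(r"[^a-zA-Z0-9_\-]")


def to_css_class(s):
    # Names that are already valid CSS identifiers (and do not start
    # with "-" or "_") pass through unchanged
    if css_class_re.match(s):
        return s
    md5_suffix = hashlib.md5(s.encode("utf8")).hexdigest()[:6]
    s = s.lstrip("_-")  # strip leading underscores and hyphens
    s = "-".join(s.split())  # replace whitespace with hyphens
    s = css_invalid_chars_re.sub("", s)  # drop remaining invalid characters
    return "-".join(bit for bit in (s, md5_suffix) if bit)
```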
\n Some examples: \n \"simple\" => \"simple\"\n\"MixedCase\" => \"MixedCase\"\n\"-no-leading-hyphens\" => \"no-leading-hyphens-65bea6\"\n\"_no-leading-underscores\" => \"no-leading-underscores-b921bc\"\n\"no spaces\" => \"no-spaces-7088d7\"\n\"-\" => \"336d5e\"\n\"no $ characters\" => \"no--characters-59e024\" \n <td> and <th> elements also get custom CSS classes reflecting the\n database column they are representing, for example: \n <table>\n <tr>\n <th class=\"col-id\">id</th>\n <th class=\"col-name\">name</th>\n </tr>\n <tr>\n <td class=\"col-id\">1</td>\n <td class=\"col-name\">SMITH</td>\n </tr>\n</table>
", "breadcrumbs": "[\"Custom pages and templates\"]", "references": "[]"} {"id": "pages:rowview", "page": "pages", "ref": "rowview", "title": "Row", "content": "Every row in every Datasette table has its own URL. This means individual records can be linked to directly. \n Table cells with extremely long text contents are truncated on the table view according to the truncate_cells_html setting. If a cell has been truncated the full length version of that cell will be available on the row page. \n Rows which are the targets of foreign key references from other tables will show a link to a filtered search for all records that reference that row. Here's an example from the Registers of Members Interests database: \n ../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001 \n Note that this URL includes the encoded primary key of the record. \n Here's that same page as JSON: \n ../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json", "breadcrumbs": "[\"Pages and API endpoints\"]", "references": "[{\"href\": \"https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001\", \"label\": \"../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001\"}, {\"href\": \"https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json\", \"label\": \"../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json\"}]"} {"id": "changelog:id16", "page": "changelog", "ref": "id16", "title": "Documentation", "content": "Examples in the documentation now include a copy-to-clipboard button. ( #1748 ) \n \n \n Documentation now uses the Furo Sphinx theme. ( #1746 ) \n \n \n Code examples in the documentation are now all formatted using Black. ( #1718 ) \n \n \n Request.fake() method is now documented, see Request object . \n \n \n New documentation for plugin authors: Registering a plugin for the duration of a test . ( #903 )", "breadcrumbs": "[\"Changelog\", \"0.62 (2022-08-14)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1748\", \"label\": \"#1748\"}, {\"href\": \"https://github.com/pradyunsg/furo\", \"label\": \"Furo\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1746\", \"label\": \"#1746\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1718\", \"label\": \"#1718\"}, {\"href\": \"https://github.com/simonw/datasette/issues/903\", \"label\": \"#903\"}]"} {"id": "internals:database-execute", "page": "internals", "ref": "database-execute", "title": "await db.execute(sql, ...)", "content": "Executes a SQL query against the database and returns the resulting rows (see Results ). \n \n \n sql - string (required) \n \n The SQL query to execute. This can include ? or :named parameters. \n \n \n \n params - list or dict \n \n A list or dictionary of values to use for the parameters. List for ? , dictionary for :named . \n \n \n \n truncate - boolean \n \n Should the rows returned by the query be truncated at the maximum page size? Defaults to True , set this to False to disable truncation. \n \n \n \n custom_time_limit - integer ms \n \n A custom time limit for this query. This can be set to a lower value than the Datasette configured default. If a query takes longer than this it will be terminated early and raise a dataette.database.QueryInterrupted exception. \n \n \n \n page_size - integer \n \n Set a custom page size for truncation, over-riding the configured Datasette default. 
\n \n \n \n log_sql_errors - boolean \n \n Should any SQL errors be logged to the console in addition to being raised as an error? Defaults to True .", "breadcrumbs": "[\"Internals for plugins\", \"Database class\"]", "references": "[]"} {"id": "internals:database-execute-fn", "page": "internals", "ref": "database-execute-fn", "title": "await db.execute_fn(fn)", "content": "Executes a given callback function against a read-only database connection running in a thread. The function will be passed a SQLite connection, and the return value from the function will be returned by the await . \n Example usage: \n def get_version(conn):\n return conn.execute(\n \"select sqlite_version()\"\n ).fetchall()[0][0]\n\n\nversion = await db.execute_fn(get_version)", "breadcrumbs": "[\"Internals for plugins\", \"Database class\"]", "references": "[]"} {"id": "plugin_hooks:plugin-hook-extra-body-script", "page": "plugin_hooks", "ref": "plugin-hook-extra-body-script", "title": "extra_body_script(template, database, table, columns, view_name, request, datasette)", "content": "Extra JavaScript to be added to a <script> element: \n @hookimpl\ndef extra_body_script():\n return {\n \"module\": True,\n \"script\": \"console.log('Your JavaScript goes here...')\",\n } \n This will add the following to the end of your page: \n <script type=\"module\">console.log('Your JavaScript goes here...')</script> \n Example: datasette-cluster-map", "breadcrumbs": "[\"Plugin hooks\", \"Page extras\"]", "references": "[{\"href\": \"https://datasette.io/plugins/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}]"} {"id": "plugin_hooks:plugin-hook-extra-template-vars", "page": "plugin_hooks", "ref": "plugin-hook-extra-template-vars", "title": "extra_template_vars(template, database, table, columns, view_name, request, datasette)", "content": "Extra template variables that should be made available in the rendered template context. \n \n \n template - string \n \n The template that is being rendered, e.g. database.html \n \n \n \n database - string or None \n \n The name of the database, or None if the page does not correspond to a database (e.g. the root page) \n \n \n \n table - string or None \n \n The name of the table, or None if the page does not correspond to a table \n \n \n \n columns - list of strings or None \n \n The names of the database columns that will be displayed on this page. None if the page does not contain a table. \n \n \n \n view_name - string \n \n The name of the view being displayed. ( index , database , table , and row are the most important ones.) \n \n \n \n request - Request object or None \n \n The current HTTP request. This can be None if the request object is not available. \n \n \n \n datasette - Datasette class \n \n You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) \n \n \n \n This hook can return one of three different types: \n \n \n Dictionary \n \n If you return a dictionary its keys and values will be merged into the template context. \n \n \n \n Function that returns a dictionary \n \n If you return a function it will be executed. If it returns a dictionary those values will be merged into the template context. \n \n \n \n Function that returns an awaitable function that returns a dictionary \n \n You can also return a function which returns an awaitable function which returns a dictionary. \n \n \n \n Datasette runs Jinja2 in async mode , which means you can add awaitable functions to the template scope and they will be automatically awaited when they are rendered by the template. 
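The documented examples below cover the first and third return types. As a hypothetical sketch of the second type, a plain function can defer its work until the template context is actually assembled (the rendered_at variable here is invented purely for illustration):

from datasette import hookimpl
import datetime


@hookimpl
def extra_template_vars():
    def extra_vars():
        # Runs when the template context is built,
        # not when the plugin module is imported
        return {"rendered_at": datetime.datetime.utcnow().isoformat()}

    return extra_vars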
\n Here's an example plugin that adds a \"user_agent\" variable to the template context containing the current request's User-Agent header: \n @hookimpl\ndef extra_template_vars(request):\n return {\"user_agent\": request.headers.get(\"user-agent\")} \n This example returns an awaitable function which adds a list of hidden_table_names to the context: \n @hookimpl\ndef extra_template_vars(datasette, database):\n async def hidden_table_names():\n if database:\n db = datasette.databases[database]\n return {\n \"hidden_table_names\": await db.hidden_table_names()\n }\n else:\n return {}\n\n return hidden_table_names \n And here's an example which adds a sql_first(sql_query) function which executes a SQL statement and returns the first column of the first row of results: \n @hookimpl\ndef extra_template_vars(datasette, database):\n async def sql_first(sql, dbname=None):\n dbname = (\n dbname\n or database\n or next(iter(datasette.databases.keys()))\n )\n result = await datasette.execute(dbname, sql)\n return result.rows[0][0]\n\n return {\"sql_first\": sql_first} \n You can then use the new function in a template like so: \n SQLite version: {{ sql_first(\"select sqlite_version()\") }} \n Examples: datasette-search-all , datasette-template-sql", "breadcrumbs": "[\"Plugin hooks\", \"Page extras\"]", "references": "[{\"href\": \"https://jinja.palletsprojects.com/en/2.10.x/api/#async-support\", \"label\": \"async mode\"}, {\"href\": \"https://datasette.io/plugins/datasette-search-all\", \"label\": \"datasette-search-all\"}, {\"href\": \"https://datasette.io/plugins/datasette-template-sql\", \"label\": \"datasette-template-sql\"}]"} {"id": "getting_started:getting-started-your-computer", "page": "getting_started", "ref": "getting-started-your-computer", "title": "Using Datasette on your own computer", "content": "First, follow the Installation instructions. Now you can run Datasette against a SQLite file on your computer using the following command: \n datasette path/to/database.db \n This will start a web server on port 8001 - visit http://localhost:8001/ \n to access the web interface. \n Add -o to open your browser automatically once Datasette has started: \n datasette path/to/database.db -o \n Use Chrome on OS X? You can run datasette against your browser history\n like so: \n datasette ~/Library/Application\\ Support/Google/Chrome/Default/History --nolock \n The --nolock option ignores any file locks. This is safe as Datasette will open the file in read-only mode. 
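Every page Datasette serves also has a JSON equivalent, shown below. As a minimal sketch, assuming the Chrome History server from the example above is running on port 8001, the Python standard library is enough to consume it:

import json
from urllib.request import urlopen

# ?_shape=objects returns each row as an object keyed by column name
url = "http://localhost:8001/History/downloads.json?_shape=objects"
with urlopen(url) as response:
    data = json.load(response)

for row in data["rows"]:
    print(row["id"], row["target_path"])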
\n Now visiting http://localhost:8001/History/downloads will show you a web\n interface to browse your downloads data: \n \n \n \n http://localhost:8001/History/downloads.json will return that data as\n JSON: \n {\n \"database\": \"History\",\n \"columns\": [\n \"id\",\n \"current_path\",\n \"target_path\",\n \"start_time\",\n \"received_bytes\",\n \"total_bytes\",\n ...\n ],\n \"rows\": [\n [\n 1,\n \"/Users/simonw/Downloads/DropboxInstaller.dmg\",\n \"/Users/simonw/Downloads/DropboxInstaller.dmg\",\n 13097290269022132,\n 626688,\n 0,\n ...\n ]\n ]\n} \n http://localhost:8001/History/downloads.json?_shape=objects will return that data as\n JSON in a more convenient format: \n {\n ...\n \"rows\": [\n {\n \"start_time\": 13097290269022132,\n \"interrupt_reason\": 0,\n \"hash\": \"\",\n \"id\": 1,\n \"site_url\": \"\",\n \"referrer\": \"https://www.dropbox.com/downloading?src=index\",\n ...\n }\n ]\n}", "breadcrumbs": "[\"Getting started\"]", "references": "[{\"href\": \"http://localhost:8001/\", \"label\": \"http://localhost:8001/\"}, {\"href\": \"http://localhost:8001/History/downloads\", \"label\": \"http://localhost:8001/History/downloads\"}, {\"href\": \"http://localhost:8001/History/downloads.json\", \"label\": \"http://localhost:8001/History/downloads.json\"}, {\"href\": \"http://localhost:8001/History/downloads.json?_shape=objects\", \"label\": \"http://localhost:8001/History/downloads.json?_shape=objects\"}]"} {"id": "changelog:id25", "page": "changelog", "ref": "id25", "title": "0.59.1 (2021-10-24)", "content": "Fix compatibility with Python 3.10. ( #1482 ) \n \n \n Documentation on how to use Named parameters with integer and floating point values. ( #1496 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1482\", \"label\": \"#1482\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1496\", \"label\": \"#1496\"}]"} {"id": "changelog:id4", "page": "changelog", "ref": "id4", "title": "0.64.4 (2023-09-21)", "content": "Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2189\", \"label\": \"#2189\"}]"} {"id": "changelog:v1-0-a7", "page": "changelog", "ref": "v1-0-a7", "title": "1.0a7 (2023-09-21)", "content": "Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2189\", \"label\": \"#2189\"}]"} {"id": "changelog:id27", "page": "changelog", "ref": "id27", "title": "0.58.1 (2021-07-16)", "content": "Fix for an intermittent race condition caused by the refresh_schemas() internal function. ( #1231 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1231\", \"label\": \"#1231\"}]"} {"id": "changelog:id38", "page": "changelog", "ref": "id38", "title": "0.52.5 (2020-12-09)", "content": "Fix for error caused by combining the _searchmode=raw and ?_search_COLUMN parameters. ( #1134 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1134\", \"label\": \"#1134\"}]"} {"id": "changelog:id209", "page": "changelog", "ref": "id209", "title": "0.10 (2017-11-14)", "content": "Fixed #83 - 500 error on individual row pages. \n \n \n Stop using sqlite WITH RECURSIVE in our tests. 
\n The version of Python 3 running in Travis CI doesn't support this.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/83\", \"label\": \"#83\"}]"} {"id": "changelog:id9", "page": "changelog", "ref": "id9", "title": "0.63.2 (2022-11-18)", "content": "Fixed a bug in datasette publish heroku where deployments failed due to an older version of Python being requested. ( #1905 ) \n \n \n New datasette publish heroku --generate-dir option for generating a Heroku deployment directory without deploying it.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1905\", \"label\": \"#1905\"}]"} {"id": "changelog:id48", "page": "changelog", "ref": "id48", "title": "0.50.1 (2020-10-09)", "content": "Fixed a bug introduced in 0.50 where the export as JSON/CSV links on the table, row and query pages were broken. ( #1010 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1010\", \"label\": \"#1010\"}]"} {"id": "changelog:id34", "page": "changelog", "ref": "id34", "title": "0.54.1 (2021-02-02)", "content": "Fixed a bug where ?_search= and ?_sort= parameters were incorrectly duplicated when the filter form on the table page was re-submitted. ( #1214 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1214\", \"label\": \"#1214\"}]"} {"id": "changelog:id8", "page": "changelog", "ref": "id8", "title": "0.63.3 (2022-12-17)", "content": "Fixed a bug where datasette --root , when running in Docker, would only output the URL to sign in as root when the server shut down, not when it started up. ( #1958 ) \n \n \n You no longer need to ensure await datasette.invoke_startup() has been called in order for Datasette to start correctly serving requests - this is now handled automatically the first time the server receives a request. This fixes a bug experienced when Datasette is served directly by an ASGI application server such as Uvicorn or Gunicorn. It also fixes a bug with the datasette-gunicorn plugin. ( #1955 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1958\", \"label\": \"#1958\"}, {\"href\": \"https://datasette.io/plugins/datasette-gunicorn\", \"label\": \"datasette-gunicorn\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1955\", \"label\": \"#1955\"}]"} {"id": "changelog:id75", "page": "changelog", "ref": "id75", "title": "0.31.2 (2019-11-13)", "content": "Fixed a bug where datasette publish heroku applications failed to start ( #633 ) \n \n \n Fix for datasette publish with just --source_url - thanks, Stanley Zheng ( #572 ) \n \n \n Deployments to Heroku now use Python 3.8.0 ( #632 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/633\", \"label\": \"#633\"}, {\"href\": \"https://github.com/simonw/datasette/issues/572\", \"label\": \"#572\"}, {\"href\": \"https://github.com/simonw/datasette/issues/632\", \"label\": \"#632\"}]"} {"id": "changelog:id2", "page": "changelog", "ref": "id2", "title": "0.64.6 (2023-12-22)", "content": "Fixed a bug where CSV export with expanded labels could fail if a foreign key reference did not correctly resolve. 
( #2214 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2214\", \"label\": \"#2214\"}]"} {"id": "changelog:id19", "page": "changelog", "ref": "id19", "title": "0.60.2 (2022-02-07)", "content": "Fixed a bug where Datasette would open the same file twice with two different database names if you ran datasette file.db file.db . ( #1632 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1632\", \"label\": \"#1632\"}]"} {"id": "changelog:id10", "page": "changelog", "ref": "id10", "title": "0.63.1 (2022-11-10)", "content": "Fixed a bug where Datasette's table filter form would not redirect correctly when run behind a proxy using the base_url setting. ( #1883 ) \n \n \n SQL query is now shown wrapped in a