{"id": "authentication:authentication-cli-create-token-restrict", "page": "authentication", "ref": "authentication-cli-create-token-restrict", "title": "Restricting the actions that a token can perform", "content": "Tokens created using datasette create-token ACTOR_ID will inherit all of the permissions of the actor that they are associated with. \n You can pass additional options to create tokens that are restricted to a subset of that actor's permissions. \n To restrict the token to just specific permissions against all available databases, use the --all option: \n datasette create-token root --all insert-row --all update-row \n This option can be passed as many times as you like. In the above example the token will only be allowed to insert and update rows. \n You can also restrict permissions such that they can only be used within specific databases: \n datasette create-token root --database mydatabase insert-row \n The resulting token will only be able to insert rows, and only to tables in the mydatabase database. \n Finally, you can restrict permissions to individual resources - tables, SQL views and named queries - within a specific database: \n datasette create-token root --resource mydatabase mytable insert-row \n These options have short versions: -a for --all , -d for --database and -r for --resource . \n You can add --debug to see a JSON representation of the token that has been created. Here's a full example: \n datasette create-token root \\\n --secret mysecret \\\n --all view-instance \\\n --all view-table \\\n --database docs view-query \\\n --resource docs documents insert-row \\\n --resource docs documents update-row \\\n --debug \n This example outputs the following: \n dstok_.eJxFizEKgDAMRe_y5w4qYrFXERGxDkVsMI0uxbubdjFL8l_ez1jhwEQCA6Fjjxp90qtkuHawzdjYrh8MFobLxZ_wBH0_gtnAF-hpS5VfmF8D_lnd97lHqUJgLd6sls4H1qwlhA.nH_7RecYHj5qSzvjhMU95iy0Xlc\n\nDecoded:\n\n{\n \"a\": \"root\",\n \"token\": \"dstok\",\n \"t\": 1670907246,\n \"_r\": {\n \"a\": [\n \"vi\",\n \"vt\"\n ],\n \"d\": {\n \"docs\": [\n \"vq\"\n ]\n },\n \"r\": {\n \"docs\": {\n \"documents\": [\n \"ir\",\n \"ur\"\n ]\n }\n }\n }\n}", "breadcrumbs": "[\"Authentication and permissions\", \"API Tokens\", \"datasette create-token\"]", "references": "[]"} {"id": "json_api:rowupdateview", "page": "json_api", "ref": "rowupdateview", "title": "Updating a row", "content": "To update a row, make a POST to ////-/update . This requires the update-row permission. \n POST //
//-/update\nContent-Type: application/json\nAuthorization: Bearer dstok_ \n {\n \"update\": {\n \"text_column\": \"New text string\",\n \"integer_column\": 3,\n \"float_column\": 3.14\n }\n} \n here is the tilde-encoded primary key value of the row to update - or a comma-separated list of primary key values if the table has a composite primary key. \n You only need to pass the columns you want to update. Any other columns will be left unchanged. \n If successful, this will return a 200 status code and a {\"ok\": true} response body. \n Add \"return\": true to the request body to return the updated row: \n {\n \"update\": {\n \"title\": \"New title\"\n },\n \"return\": true\n} \n The returned JSON will look like this: \n {\n \"ok\": true,\n \"row\": {\n \"id\": 1,\n \"title\": \"New title\",\n \"other_column\": \"Will be present here too\"\n }\n} \n Any errors will return {\"errors\": [\"... descriptive message ...\"], \"ok\": false} , and a 400 status code for a bad input or a 403 status code for an authentication or permission error. \n Pass \"alter: true to automatically add any missing columns to the table. This requires the alter-table permission.", "breadcrumbs": "[\"JSON API\", \"The JSON write API\"]", "references": "[]"} {"id": "facets:facets-in-query-strings", "page": "facets", "ref": "facets-in-query-strings", "title": "Facets in query strings", "content": "To turn on faceting for specific columns on a Datasette table view, add one or more _facet=COLUMN parameters to the URL.\n For example, if you want to turn on facets for the city_id and state columns, construct a URL that looks like this: \n /dbname/tablename?_facet=state&_facet=city_id \n This works for both the HTML interface and the .json view.\n When enabled, facets will cause a facet_results block to be added to the JSON output, looking something like this: \n {\n \"state\": {\n \"name\": \"state\",\n \"results\": [\n {\n \"value\": \"CA\",\n \"label\": \"CA\",\n \"count\": 10,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&state=CA\",\n \"selected\": false\n },\n {\n \"value\": \"MI\",\n \"label\": \"MI\",\n \"count\": 4,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&state=MI\",\n \"selected\": false\n },\n {\n \"value\": \"MC\",\n \"label\": \"MC\",\n \"count\": 1,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&state=MC\",\n \"selected\": false\n }\n ],\n \"truncated\": false\n }\n \"city_id\": {\n \"name\": \"city_id\",\n \"results\": [\n {\n \"value\": 1,\n \"label\": \"San Francisco\",\n \"count\": 6,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&city_id=1\",\n \"selected\": false\n },\n {\n \"value\": 2,\n \"label\": \"Los Angeles\",\n \"count\": 4,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&city_id=2\",\n \"selected\": false\n },\n {\n \"value\": 3,\n \"label\": \"Detroit\",\n \"count\": 4,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&city_id=3\",\n \"selected\": false\n },\n {\n \"value\": 4,\n \"label\": \"Memnonia\",\n \"count\": 1,\n \"toggle_url\": \"http://...?_facet=city_id&_facet=state&city_id=4\",\n \"selected\": false\n }\n ],\n \"truncated\": false\n }\n} \n If Datasette detects that a column is a foreign key, the \"label\" property will be automatically derived from the detected label column on the referenced table. 
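As an illustration of consuming that facet_results block programmatically, here is a minimal sketch using only the Python standard library; the instance URL, database and table names are hypothetical placeholders, not part of Datasette:

import json
from urllib.parse import urlencode
from urllib.request import urlopen

# Hypothetical instance - substitute your own Datasette URL, database and table
base = "http://127.0.0.1:8001/dbname/tablename.json"
params = urlencode([("_facet", "state"), ("_facet", "city_id")])

with urlopen(f"{base}?{params}") as response:
    data = json.load(response)

# facet_results maps each faceted column to its computed values
for column, facet in data["facet_results"].items():
    for result in facet["results"]:
        print(column, result["value"], result["count"])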
\n The default number of facet results returned is 30, controlled by the default_facet_size setting.\n You can increase this on an individual page by adding ?_facet_size=100 to the query string, up to a maximum of max_returned_rows (which defaults to 1000).", "breadcrumbs": "[\"Facets\"]", "references": "[]"} {"id": "internals:internals-response-set-cookie", "page": "internals", "ref": "internals-response-set-cookie", "title": "Setting cookies with response.set_cookie()", "content": "To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this: \n def set_cookie(\n self,\n key,\n value=\"\",\n max_age=None,\n expires=None,\n path=\"/\",\n domain=None,\n secure=False,\n httponly=False,\n samesite=\"lax\",\n): ... \n You can use this with datasette.sign() to set signed cookies. Here's how you would set the ds_actor cookie for use with Datasette authentication : \n response = Response.redirect(\"/\")\nresponse.set_cookie(\n \"ds_actor\",\n datasette.sign({\"a\": {\"id\": \"cleopaws\"}}, \"actor\"),\n)\nreturn response", "breadcrumbs": "[\"Internals for plugins\", \"Response class\"]", "references": "[]"} {"id": "contributing:contributing-using-fixtures", "page": "contributing", "ref": "contributing-using-fixtures", "title": "Using fixtures", "content": "To run Datasette itself, type datasette . \n You're going to need at least one SQLite database. A quick way to get started is to use the fixtures database that Datasette uses for its own tests. \n You can create a copy of that database by running this command: \n python tests/fixtures.py fixtures.db \n Now you can run Datasette against the new fixtures database like so: \n datasette fixtures.db \n This will start a server at http://127.0.0.1:8001/ . \n Any changes you make in the datasette/templates or datasette/static folder will be picked up immediately (though you may need to do a force-refresh in your browser to see changes to CSS or JavaScript). \n If you want to change Datasette's Python code you can use the --reload option to cause Datasette to automatically reload any time the underlying code changes: \n datasette --reload fixtures.db \n You can also use the fixtures.py script to recreate the testing version of metadata.json used by the unit tests. To do that: \n python tests/fixtures.py fixtures.db fixtures-metadata.json \n Or to output the plugins used by the tests, run this: \n python tests/fixtures.py fixtures.db fixtures-metadata.json fixtures-plugins\nTest tables written to fixtures.db\n- metadata written to fixtures-metadata.json\nWrote plugin: fixtures-plugins/register_output_renderer.py\nWrote plugin: fixtures-plugins/view_name.py\nWrote plugin: fixtures-plugins/my_plugin.py\nWrote plugin: fixtures-plugins/messages_output_renderer.py\nWrote plugin: fixtures-plugins/my_plugin_2.py \n Then run Datasette like this: \n datasette fixtures.db -m fixtures-metadata.json --plugins-dir=fixtures-plugins/", "breadcrumbs": "[\"Contributing\"]", "references": "[]"} {"id": "publish:publish-heroku", "page": "publish", "ref": "publish-heroku", "title": "Publishing to Heroku", "content": "To publish your data using Heroku , first create an account there and install and configure the Heroku CLI tool . 
\n You can publish one or more databases to Heroku using the following command: \n datasette publish heroku mydatabase.db \n This will output some details about the new deployment, including a URL like this one: \n https://limitless-reef-88278.herokuapp.com/ deployed to Heroku \n You can specify a custom app name by passing -n my-app-name to the publish command. This will also allow you to overwrite an existing app. \n Rather than deploying directly you can use the --generate-dir option to output the files that would be deployed to a directory: \n datasette publish heroku mydatabase.db --generate-dir=/tmp/deploy-this-to-heroku \n See datasette publish heroku for the full list of options for this command.", "breadcrumbs": "[\"Publishing data\", \"datasette publish\"]", "references": "[{\"href\": \"https://www.heroku.com/\", \"label\": \"Heroku\"}, {\"href\": \"https://devcenter.heroku.com/articles/heroku-cli\", \"label\": \"Heroku CLI tool\"}]"} {"id": "authentication:authentication-permissions-table", "page": "authentication", "ref": "authentication-permissions-table", "title": "Access to specific tables and views", "content": "To limit access to the users table in your bakery.db database: \n [[[cog\nconfig_example(cog, \"\"\"\n databases:\n bakery:\n tables:\n users:\n allow:\n id: '*'\n\"\"\") \n ]]] \n [[[end]]] \n This works for SQL views as well - you can list their names in the \"tables\" block above in the same way as regular tables. \n \n Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries, like this for example. \n If you are restricting access to specific tables you should also use the \"allow_sql\" block to prevent users from bypassing the limit with their own SQL queries - see Controlling the ability to execute arbitrary SQL .", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in \"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures?sql=select+*+from+facetable\", \"label\": \"like this\"}]"} {"id": "authentication:authentication-permissions-database", "page": "authentication", "ref": "authentication-permissions-database", "title": "Access to specific databases", "content": "To limit access to a specific private.db database to just authenticated users, use the \"allow\" block like this: \n [[[cog\nconfig_example(cog, \"\"\"\n databases:\n private:\n allow:\n id: \"*\"\n\"\"\") \n ]]] \n [[[end]]]", "breadcrumbs": "[\"Authentication and permissions\", \"Access permissions in \"]", "references": "[]"} {"id": "contributing:contributing-formatting-prettier", "page": "contributing", "ref": "contributing-formatting-prettier", "title": "Prettier", "content": "To install Prettier, install Node.js and then run the following in the root of your datasette repository checkout: \n npm install \n This will install Prettier in a node_modules directory. You can then check that your code matches the coding style like so: \n npm run prettier -- --check \n > prettier\n> prettier 'datasette/static/*[!.min].js' \"--check\"\n\nChecking formatting...\n[warn] datasette/static/plugins.js\n[warn] Code style issues found in the above file(s). Forgot to run Prettier? 
\n You can fix any problems by running: \n npm run fix", "breadcrumbs": "[\"Contributing\", \"Code formatting\"]", "references": "[{\"href\": \"https://nodejs.org/en/download/package-manager/\", \"label\": \"install Node.js\"}]"} {"id": "custom_templates:custom-pages-404", "page": "custom_templates", "ref": "custom-pages-404", "title": "Returning 404s", "content": "To indicate that content could not be found and display the default 404 page you can use the raise_404(message) function: \n {% if not rows %}\n {{ raise_404(\"Content not found\") }}\n{% endif %} \n If you call raise_404() the other content in your template will be ignored.", "breadcrumbs": "[\"Custom pages and templates\"]", "references": "[]"} {"id": "json_api:tabledropview", "page": "json_api", "ref": "tabledropview", "title": "Dropping tables", "content": "To drop a table, make a POST to //
<table>/-/drop . This requires the drop-table permission. \n POST /<database>/<table>
/-/drop\nContent-Type: application/json\nAuthorization: Bearer dstok_<rest-of-token> \n Without a POST body this will return a status 200 with a note about how many rows will be deleted: \n {\n \"ok\": true,\n \"database\": \"<database>\",\n \"table\": \"<table>
\",\n \"row_count\": 5,\n \"message\": \"Pass \\\"confirm\\\": true to confirm\"\n} \n If you pass the following POST body: \n {\n \"confirm\": true\n} \n Then the table will be dropped and a status 200 response of {\"ok\": true} will be returned. \n Any errors will return {\"errors\": [\"... descriptive message ...\"], \"ok\": false} , and a 400 status code for a bad input or a 403 status code for an authentication or permission error.", "breadcrumbs": "[\"JSON API\", \"The JSON write API\"]", "references": "[]"} {"id": "json_api:rowdeleteview", "page": "json_api", "ref": "rowdeleteview", "title": "Deleting a row", "content": "To delete a row, make a POST to //
<table>/<row-pks>/-/delete . This requires the delete-row permission. \n POST /<database>/<table>
//-/delete\nContent-Type: application/json\nAuthorization: Bearer dstok_ \n here is the tilde-encoded primary key value of the row to delete - or a comma-separated list of primary key values if the table has a composite primary key. \n If successful, this will return a 200 status code and a {\"ok\": true} response body. \n Any errors will return {\"errors\": [\"... descriptive message ...\"], \"ok\": false} , and a 400 status code for a bad input or a 403 status code for an authentication or permission error.", "breadcrumbs": "[\"JSON API\", \"The JSON write API\"]", "references": "[]"} {"id": "json_api:tablecreateview", "page": "json_api", "ref": "tablecreateview", "title": "Creating a table", "content": "To create a table, make a POST to //-/create . This requires the create-table permission. \n POST //-/create\nContent-Type: application/json\nAuthorization: Bearer dstok_ \n {\n \"table\": \"name_of_new_table\",\n \"columns\": [\n {\n \"name\": \"id\",\n \"type\": \"integer\"\n },\n {\n \"name\": \"title\",\n \"type\": \"text\"\n }\n ],\n \"pk\": \"id\"\n} \n The JSON here describes the table that will be created: \n \n \n table is the name of the table to create. This field is required. \n \n \n columns is a list of columns to create. Each column is a dictionary with name and type keys. \n \n \n name is the name of the column. This is required. \n \n \n type is the type of the column. This is optional - if not provided, text will be assumed. The valid types are text , integer , float and blob . \n \n \n \n \n pk is the primary key for the table. This is optional - if not provided, Datasette will create a SQLite table with a hidden rowid column. \n If the primary key is an integer column, it will be configured to automatically increment for each new record. \n If you set this to id without including an id column in the list of columns , Datasette will create an auto-incrementing integer ID column for you. \n \n \n pks can be used instead of pk to create a compound primary key. It should be a JSON list of column names to use in that primary key. \n \n \n ignore can be set to true to ignore existing rows by primary key if the table already exists. \n \n \n replace can be set to true to replace existing rows by primary key if the table already exists. This requires the update-row permission. \n \n \n alter can be set to true if you want to automatically add any missing columns to the table. This requires the alter-table permission. \n \n \n If the table is successfully created this will return a 201 status code and the following response: \n {\n \"ok\": true,\n \"database\": \"data\",\n \"table\": \"name_of_new_table\",\n \"table_url\": \"http://127.0.0.1:8001/data/name_of_new_table\",\n \"table_api_url\": \"http://127.0.0.1:8001/data/name_of_new_table.json\",\n \"schema\": \"CREATE TABLE [name_of_new_table] (\\n [id] INTEGER PRIMARY KEY,\\n [title] TEXT\\n)\"\n}", "breadcrumbs": "[\"JSON API\", \"The JSON write API\"]", "references": "[]"} {"id": "changelog:id87", "page": "changelog", "ref": "id87", "title": "0.27.1 (2019-05-09)", "content": "Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.", "breadcrumbs": "[\"Changelog\"]", "references": "[]"} {"id": "authentication:authentication-actor", "page": "authentication", "ref": "authentication-actor", "title": "Actors", "content": "Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API agents (via authentication tokens). 
The word \"actor\" is used to cover both of these cases. \n Every request to Datasette has an associated actor value, available in the code as request.actor . This can be None for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API agents. \n The actor dictionary can be any shape - the design of that data structure is left up to the plugins. A useful convention is to include an \"id\" string, as demonstrated by the \"root\" actor below. \n Plugins can use the actor_from_request(datasette, request) hook to implement custom logic for authenticating an actor based on the incoming HTTP request.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "changelog:id77", "page": "changelog", "ref": "id77", "title": "0.31 (2019-11-11)", "content": "This version adds compatibility with Python 3.8 and breaks compatibility with Python 3.5. \n If you are still running Python 3.5 you should stick with 0.30.2 , which you can install like this: \n pip install datasette==0.30.2 \n \n \n Format SQL button now works with read-only SQL queries - thanks, Tobias Kunze ( #602 ) \n \n \n New ?column__notin=x,y,z filter for table views ( #614 ) \n \n \n Table view now uses select col1, col2, col3 instead of select * \n \n \n Database filenames can now contain spaces - thanks, Tobias Kunze ( #590 ) \n \n \n Removed obsolete ?_group_count=col feature ( #504 ) \n \n \n Improved user interface and documentation for datasette publish cloudrun ( #608 ) \n \n \n Tables with indexes now show the CREATE INDEX statements on the table page ( #618 ) \n \n \n Current version of uvicorn is now shown on /-/versions \n \n \n Python 3.8 is now supported! ( #622 ) \n \n \n Python 3.5 is no longer supported.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/pull/602\", \"label\": \"#602\"}, {\"href\": \"https://github.com/simonw/datasette/issues/614\", \"label\": \"#614\"}, {\"href\": \"https://github.com/simonw/datasette/pull/590\", \"label\": \"#590\"}, {\"href\": \"https://github.com/simonw/datasette/issues/504\", \"label\": \"#504\"}, {\"href\": \"https://github.com/simonw/datasette/issues/608\", \"label\": \"#608\"}, {\"href\": \"https://github.com/simonw/datasette/issues/618\", \"label\": \"#618\"}, {\"href\": \"https://www.uvicorn.org/\", \"label\": \"uvicorn\"}, {\"href\": \"https://github.com/simonw/datasette/issues/622\", \"label\": \"#622\"}]"} {"id": "plugin_hooks:plugin-hook-extra-js-urls", "page": "plugin_hooks", "ref": "plugin-hook-extra-js-urls", "title": "extra_js_urls(template, database, table, columns, view_name, request, datasette)", "content": "This takes the same arguments as extra_template_vars(...) \n This works in the same way as extra_css_urls() but for JavaScript. You can\n return a list of URLs, a list of dictionaries or an awaitable function that returns those things: \n from datasette import hookimpl\n\n\n@hookimpl\ndef extra_js_urls():\n return [\n {\n \"url\": \"https://code.jquery.com/jquery-3.3.1.slim.min.js\",\n \"sri\": \"sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo\",\n }\n ] \n You can also return URLs to files from your plugin's static/ directory, if\n you have one: \n @hookimpl\ndef extra_js_urls():\n return [\"/-/static-plugins/your-plugin/app.js\"] \n Note that your-plugin here should be the hyphenated plugin name - the name that is displayed in the list on the /-/plugins debug page. 
\n If your code uses JavaScript modules you should include the \"module\": True key. See Custom CSS and JavaScript for more details. \n @hookimpl\ndef extra_js_urls():\n return [\n {\n \"url\": \"/-/static-plugins/your-plugin/app.js\",\n \"module\": True,\n }\n ] \n Examples: datasette-cluster-map , datasette-vega", "breadcrumbs": "[\"Plugin hooks\", \"Page extras\"]", "references": "[{\"href\": \"https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules\", \"label\": \"JavaScript modules\"}, {\"href\": \"https://datasette.io/plugins/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}, {\"href\": \"https://datasette.io/plugins/datasette-vega\", \"label\": \"datasette-vega\"}]"} {"id": "plugin_hooks:plugin-hook-extra-css-urls", "page": "plugin_hooks", "ref": "plugin-hook-extra-css-urls", "title": "extra_css_urls(template, database, table, columns, view_name, request, datasette)", "content": "This takes the same arguments as extra_template_vars(...) \n Return a list of extra CSS URLs that should be included on the page. These can\n take advantage of the CSS class hooks described in Custom pages and templates . \n This can be a list of URLs: \n from datasette import hookimpl\n\n\n@hookimpl\ndef extra_css_urls():\n return [\n \"https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css\"\n ] \n Or a list of dictionaries defining both a URL and an\n SRI hash : \n @hookimpl\ndef extra_css_urls():\n return [\n {\n \"url\": \"https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css\",\n \"sri\": \"sha384-9gVQ4dYFwwWSjIDZnLEWnxCjeSWFphJiwGPXr1jddIhOegiu1FwO5qRGvFXOdJZ4\",\n }\n ] \n This function can also return an awaitable function, useful if it needs to run any async code: \n @hookimpl\ndef extra_css_urls(datasette):\n async def inner():\n db = datasette.get_database()\n results = await db.execute(\n \"select url from css_files\"\n )\n return [r[0] for r in results]\n\n return inner \n Examples: datasette-cluster-map , datasette-vega", "breadcrumbs": "[\"Plugin hooks\", \"Page extras\"]", "references": "[{\"href\": \"https://www.srihash.org/\", \"label\": \"SRI hash\"}, {\"href\": \"https://datasette.io/plugins/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}, {\"href\": \"https://datasette.io/plugins/datasette-vega\", \"label\": \"datasette-vega\"}]"} {"id": "settings:setting-template-debug", "page": "settings", "ref": "setting-template-debug", "title": "template_debug", "content": "This setting enables template context debug mode, which is useful to help understand what variables are available to custom templates when you are writing them. \n Enable it like this: \n datasette mydatabase.db --setting template_debug 1 \n Now you can add ?_context=1 or &_context=1 to any Datasette page to see the context that was passed to that template. 
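One quick way to capture that context for inspection is to fetch it over HTTP. A minimal sketch, assuming a local instance started with the template_debug setting enabled:

import urllib.request

# Assumes: datasette mydatabase.db --setting template_debug 1
url = "http://127.0.0.1:8001/mydatabase?_context=1"
print(urllib.request.urlopen(url).read().decode("utf-8"))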
\n Some examples: \n \n \n https://latest.datasette.io/?_context=1 \n \n \n https://latest.datasette.io/fixtures?_context=1 \n \n \n https://latest.datasette.io/fixtures/roadside_attractions?_context=1", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[{\"href\": \"https://latest.datasette.io/?_context=1\", \"label\": \"https://latest.datasette.io/?_context=1\"}, {\"href\": \"https://latest.datasette.io/fixtures?_context=1\", \"label\": \"https://latest.datasette.io/fixtures?_context=1\"}, {\"href\": \"https://latest.datasette.io/fixtures/roadside_attractions?_context=1\", \"label\": \"https://latest.datasette.io/fixtures/roadside_attractions?_context=1\"}]"} {"id": "settings:setting-trace-debug", "page": "settings", "ref": "setting-trace-debug", "title": "trace_debug", "content": "This setting enables appending ?_trace=1 to any page in order to see the SQL queries and other trace information that was used to generate that page. \n Enable it like this: \n datasette mydatabase.db --setting trace_debug 1 \n Some examples: \n \n \n https://latest.datasette.io/?_trace=1 \n \n \n https://latest.datasette.io/fixtures/roadside_attractions?_trace=1 \n \n \n See datasette.tracer for details on how to hook into this mechanism as a plugin author.", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[{\"href\": \"https://latest.datasette.io/?_trace=1\", \"label\": \"https://latest.datasette.io/?_trace=1\"}, {\"href\": \"https://latest.datasette.io/fixtures/roadside_attractions?_trace=1\", \"label\": \"https://latest.datasette.io/fixtures/roadside_attractions?_trace=1\"}]"} {"id": "authentication:id1", "page": "authentication", "ref": "id1", "title": "Built-in permissions", "content": "This section lists all of the permission checks that are carried out by Datasette core, along with the resource if it was passed.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "json_api:tableinsertview", "page": "json_api", "ref": "tableinsertview", "title": "Inserting rows", "content": "This requires the insert-row permission. \n A single row can be inserted using the \"row\" key: \n POST //
<table>/-/insert\nContent-Type: application/json\nAuthorization: Bearer dstok_<rest-of-token> \n {\n \"row\": {\n \"column1\": \"value1\",\n \"column2\": \"value2\"\n }\n} \n If successful, this will return a 201 status code and the newly inserted row, for example: \n {\n \"rows\": [\n {\n \"id\": 1,\n \"column1\": \"value1\",\n \"column2\": \"value2\"\n }\n ]\n} \n To insert multiple rows at a time, use the same API method but send a list of dictionaries as the \"rows\" key: \n POST /<database>/<table>
/-/insert\nContent-Type: application/json\nAuthorization: Bearer dstok_ \n {\n \"rows\": [\n {\n \"column1\": \"value1\",\n \"column2\": \"value2\"\n },\n {\n \"column1\": \"value3\",\n \"column2\": \"value4\"\n }\n ]\n} \n If successful, this will return a 201 status code and a {\"ok\": true} response body. \n The maximum number rows that can be submitted at once defaults to 100, but this can be changed using the max_insert_rows setting. \n To return the newly inserted rows, add the \"return\": true key to the request body: \n {\n \"rows\": [\n {\n \"column1\": \"value1\",\n \"column2\": \"value2\"\n },\n {\n \"column1\": \"value3\",\n \"column2\": \"value4\"\n }\n ],\n \"return\": true\n} \n This will return the same \"rows\" key as the single row example above. There is a small performance penalty for using this option. \n If any of your rows have a primary key that is already in use, you will get an error and none of the rows will be inserted: \n {\n \"ok\": false,\n \"errors\": [\n \"UNIQUE constraint failed: new_table.id\"\n ]\n} \n Pass \"ignore\": true to ignore these errors and insert the other rows: \n {\n \"rows\": [\n {\n \"id\": 1,\n \"column1\": \"value1\",\n \"column2\": \"value2\"\n },\n {\n \"id\": 2,\n \"column1\": \"value3\",\n \"column2\": \"value4\"\n }\n ],\n \"ignore\": true\n} \n Or you can pass \"replace\": true to replace any rows with conflicting primary keys with the new values. This requires the update-row permission. \n Pass \"alter: true to automatically add any missing columns to the table. This requires the alter-table permission.", "breadcrumbs": "[\"JSON API\", \"The JSON write API\"]", "references": "[]"} {"id": "changelog:id152", "page": "changelog", "ref": "id152", "title": "0.18 (2018-04-14)", "content": "This release introduces support for units ,\n contributed by Russ Garrett ( #203 ).\n You can now optionally specify the units for specific columns using metadata.json .\n Once specified, units will be displayed in the HTML view of your table. They also become\n available for use in filters - if a column is configured with a unit of distance, you can\n request all rows where that column is less than 50 meters or more than 20 feet for example. \n \n \n Link foreign keys which don't have labels. [Russ Garrett] \n This renders unlabeled FKs as simple links. \n Also includes bonus fixes for two minor issues: \n \n \n In foreign key link hrefs the primary key was escaped using HTML\n escaping rather than URL escaping. This broke some non-integer PKs. \n \n \n Print tracebacks to console when handling 500 errors. \n \n \n \n \n Fix SQLite error when loading rows with no incoming FKs. [Russ\n Garrett] \n This fixes an error caused by an invalid query when loading incoming FKs. \n The error was ignored due to async but it still got printed to the\n console. \n \n \n Allow custom units to be registered with Pint. [Russ Garrett] \n \n \n Support units in filters. [Russ Garrett] \n \n \n Tidy up units support. [Russ Garrett] \n \n \n Add units to exported JSON \n \n \n Units key in metadata skeleton \n \n \n Docs \n \n \n \n \n Initial units support. 
[Russ Garrett] \n Add support for specifying units for a column in metadata.json and\n rendering them on display using\n pint", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://docs.datasette.io/en/stable/metadata.html#specifying-units-for-a-column\", \"label\": \"support for units\"}, {\"href\": \"https://github.com/simonw/datasette/issues/203\", \"label\": \"#203\"}, {\"href\": \"https://pint.readthedocs.io/en/latest/\", \"label\": \"pint\"}]"} {"id": "changelog:id43", "page": "changelog", "ref": "id43", "title": "0.52 (2020-11-28)", "content": "This release includes a number of changes relating to an internal rebranding effort: Datasette's configuration mechanism (things like datasette --config default_page_size:10 ) has been renamed to settings . \n \n \n New --setting default_page_size 10 option as a replacement for --config default_page_size:10 (note the lack of a colon). The --config option is deprecated but will continue working until Datasette 1.0. ( #992 ) \n \n \n The /-/config introspection page is now /-/settings , and the previous page redirects to the new one. ( #1103 ) \n \n \n The config.json file in Configuration directory mode is now called settings.json . ( #1104 ) \n \n \n The undocumented datasette.config() internal method has been replaced by a documented .setting(key) method. ( #1107 ) \n \n \n Also in this release: \n \n \n New plugin hook: database_actions(datasette, actor, database, request) , which adds menu items to a new cog menu shown at the top of the database page. ( #1077 ) \n \n \n datasette publish cloudrun has a new --apt-get-install option that can be used to install additional Ubuntu packages as part of the deployment. This is useful for deploying the new datasette-ripgrep plugin . ( #1110 ) \n \n \n Swept the documentation to remove words that minimize involved difficulty. ( #1089 ) \n \n \n And some bug fixes: \n \n \n Foreign keys linking to rows with blank label columns now display as a hyphen, allowing those links to be clicked. ( #1086 ) \n \n \n Fixed bug where row pages could sometimes 500 if the underlying queries exceeded a time limit. ( #1088 ) \n \n \n Fixed a bug where the table action menu could appear partially obscured by the edge of the page. ( #1084 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/992\", \"label\": \"#992\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1103\", \"label\": \"#1103\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1104\", \"label\": \"#1104\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1107\", \"label\": \"#1107\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1077\", \"label\": \"#1077\"}, {\"href\": \"https://github.com/simonw/datasette-ripgrep\", \"label\": \"datasette-ripgrep plugin\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1110\", \"label\": \"#1110\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1089\", \"label\": \"#1089\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1086\", \"label\": \"#1086\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1088\", \"label\": \"#1088\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1084\", \"label\": \"#1084\"}]"} {"id": "changelog:id31", "page": "changelog", "ref": "id31", "title": "0.56.1 (2021-06-05)", "content": "This release fixes a reflected cross-site scripting security hole with the ?_trace=1 feature. 
You should upgrade to this version, or to Datasette 0.57, as soon as possible. ( #1360 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://owasp.org/www-community/attacks/xss/#reflected-xss-attacks\", \"label\": \"reflected cross-site scripting\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1360\", \"label\": \"#1360\"}]"} {"id": "changelog:id30", "page": "changelog", "ref": "id30", "title": "0.57 (2021-06-05)", "content": "This release fixes a reflected cross-site scripting security hole with the ?_trace=1 feature. You should upgrade to this version, or to Datasette 0.56.1, as soon as possible. ( #1360 ) \n \n In addition to the security fix, this release includes ?_col= and ?_nocol= options for controlling which columns are displayed for a table, ?_facet_size= for increasing the number of facet results returned, re-display of your SQL query should an error occur and numerous bug fixes.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://owasp.org/www-community/attacks/xss/#reflected-xss-attacks\", \"label\": \"reflected cross-site scripting\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1360\", \"label\": \"#1360\"}]"} {"id": "changelog:id114", "page": "changelog", "ref": "id114", "title": "0.23 (2018-06-18)", "content": "This release features CSV export, improved options for foreign key expansions,\n new configuration settings and improved support for SpatiaLite. \n See datasette/compare/0.22.1...0.23 for a full list of\n commits added since the last release.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/compare/0.22.1...0.23\", \"label\": \"datasette/compare/0.22.1...0.23\"}]"} {"id": "changelog:id57", "page": "changelog", "ref": "id57", "title": "0.46 (2020-08-09)", "content": "This release contains a security fix related to authenticated writable canned queries. If you are using this feature you should upgrade as soon as possible. \n \n \n \n Security fix: CSRF tokens were incorrectly included in read-only canned query forms, which could allow them to be leaked to a sophisticated attacker. See issue 918 for details. \n \n \n Datasette now supports GraphQL via the new datasette-graphql plugin - see GraphQL in Datasette with the new datasette-graphql plugin . \n \n \n Principle git branch has been renamed from master to main . ( #849 ) \n \n \n New debugging tool: /-/allow-debug tool ( demo here ) helps test allow blocks against actors, as described in Defining permissions with \"allow\" blocks . ( #908 ) \n \n \n New logo for the documentation, and a new project tagline: \"An open source multi-tool for exploring and publishing data\". \n \n \n Whitespace in column values is now respected on display, using white-space: pre-wrap . ( #896 ) \n \n \n New await request.post_body() method for accessing the raw POST body, see Request object . ( #897 ) \n \n \n Database file downloads now include a content-length HTTP header, enabling download progress bars. ( #905 ) \n \n \n File downloads now also correctly set the suggested file name using a content-disposition HTTP header. ( #909 ) \n \n \n tests are now excluded from the Datasette package properly - thanks, abeyerpath. ( #456 ) \n \n \n The Datasette package published to PyPI now includes sdist as well as bdist_wheel . \n \n \n Better titles for canned query pages. ( #887 ) \n \n \n Now only loads Python files from a directory passed using the --plugins-dir option - thanks, Amjith Ramanujam. 
( #890 ) \n \n \n New documentation section on Publishing to Vercel .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/918\", \"label\": \"issue 918\"}, {\"href\": \"https://github.com/simonw/datasette-graphql\", \"label\": \"datasette-graphql\"}, {\"href\": \"https://simonwillison.net/2020/Aug/7/datasette-graphql/\", \"label\": \"GraphQL in Datasette with the new datasette-graphql plugin\"}, {\"href\": \"https://github.com/simonw/datasette/issues/849\", \"label\": \"#849\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug\", \"label\": \"demo here\"}, {\"href\": \"https://github.com/simonw/datasette/issues/908\", \"label\": \"#908\"}, {\"href\": \"https://github.com/simonw/datasette/issues/896\", \"label\": \"#896\"}, {\"href\": \"https://github.com/simonw/datasette/issues/897\", \"label\": \"#897\"}, {\"href\": \"https://github.com/simonw/datasette/issues/905\", \"label\": \"#905\"}, {\"href\": \"https://github.com/simonw/datasette/issues/909\", \"label\": \"#909\"}, {\"href\": \"https://github.com/simonw/datasette/issues/456\", \"label\": \"#456\"}, {\"href\": \"https://github.com/simonw/datasette/issues/887\", \"label\": \"#887\"}, {\"href\": \"https://github.com/simonw/datasette/pull/890\", \"label\": \"#890\"}]"} {"id": "internals:internals-datasette", "page": "internals", "ref": "internals-datasette", "title": "Datasette class", "content": "This object is an instance of the Datasette class, passed to many plugin hooks as an argument called datasette . \n You can create your own instance of this - for example to help write tests for a plugin - like so: \n from datasette.app import Datasette\n\n# With no arguments a single in-memory database will be attached\ndatasette = Datasette()\n\n# The files= argument can load files from disk\ndatasette = Datasette(files=[\"/path/to/my-database.db\"])\n\n# Pass metadata as a JSON dictionary like this\ndatasette = Datasette(\n files=[\"/path/to/my-database.db\"],\n metadata={\n \"databases\": {\n \"my-database\": {\n \"description\": \"This is my database\"\n }\n }\n },\n) \n Constructor parameters include: \n \n \n files=[...] - a list of database files to open \n \n \n immutables=[...] - a list of database files to open in immutable mode \n \n \n metadata={...} - a dictionary of Metadata \n \n \n config_dir=... - the configuration directory to use, stored in datasette.config_dir", "breadcrumbs": "[\"Internals for plugins\"]", "references": "[]"} {"id": "javascript_plugins:javascript-plugins-makecolumnactions", "page": "javascript_plugins", "ref": "javascript-plugins-makecolumnactions", "title": "makeColumnActions(columnDetails)", "content": "This method, if present, will be called when Datasette is rendering the cog action menu icons that appear at the top of the table view. By default these include options like \"Sort ascending/descending\" and \"Facet by this\", but plugins can return additional actions to be included in this menu. 
\n The method will be called with a columnDetails object with the following keys: \n \n \n columnName - string \n \n The name of the column \n \n \n \n columnNotNull - boolean \n \n True if the column is defined as NOT NULL \n \n \n \n columnType - string \n \n The SQLite data type of the column \n \n \n \n isPk - boolean \n \n True if the column is part of the primary key \n \n \n \n It should return a JavaScript array of objects each with a label and onClick property: \n \n \n label - string \n \n The human-readable label for the action \n \n \n \n onClick(evt) - function \n \n A function that will be called when the action is clicked \n \n \n \n The evt object passed to the onClick is the standard browser event object that triggered the click. \n This example plugin adds two menu items - one to copy the column name to the clipboard and another that displays the column metadata in an alert() window: \n document.addEventListener('datasette_init', function(ev) {\n ev.detail.registerPlugin('column-name-plugin', {\n version: 0.1,\n makeColumnActions: (columnDetails) => {\n return [\n {\n label: 'Copy column to clipboard',\n onClick: async (evt) => {\n await navigator.clipboard.writeText(columnDetails.columnName)\n }\n },\n {\n label: 'Alert column metadata',\n onClick: () => alert(JSON.stringify(columnDetails, null, 2))\n }\n ];\n }\n });\n});", "breadcrumbs": "[\"JavaScript plugins\", \"JavaScript plugin objects\"]", "references": "[]"} {"id": "internals:database-execute-write-fn", "page": "internals", "ref": "database-execute-write-fn", "title": "await db.execute_write_fn(fn, block=True, transaction=True)", "content": "This method works like .execute_write() , but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function. \n The function can then perform multiple actions, safe in the knowledge that it has exclusive access to the single writable connection for as long as it is executing. \n \n fn needs to be a regular function, not an async def function. \n \n For example: \n def delete_and_return_count(conn):\n conn.execute(\"delete from some_table where id > 5\")\n return conn.execute(\n \"select count(*) from some_table\"\n ).fetchone()[0]\n\n\ntry:\n num_rows_left = await database.execute_write_fn(\n delete_and_return_count\n )\nexcept Exception as e:\n print(\"An error occurred:\", e) \n The value returned from await database.execute_write_fn(...) will be the return value from your function. \n If your function raises an exception that exception will be propagated up to the await line. \n By default your function will be executed inside a transaction. You can pass transaction=False to disable this behavior, though if you do that you should be careful to manually apply transactions - ideally using the with conn: pattern, or you may see OperationalError: database table is locked errors. \n If you specify block=False the method becomes fire-and-forget, queueing your function to be executed and then allowing your code after the call to .execute_write_fn() to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned. 
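A sketch of that fire-and-forget form, assuming an existing database object in an async context (such as a plugin hook) and a hypothetical audit_log table:

def log_cleanup(conn):
    # Runs later, whenever the write connection becomes available
    conn.execute(
        "insert into audit_log (event) values ('cleanup-started')"
    )


# Returns immediately without waiting for log_cleanup to run
task_id = await database.execute_write_fn(log_cleanup, block=False)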
Any exceptions in your code will be silently swallowed.", "breadcrumbs": "[\"Internals for plugins\", \"Database class\"]", "references": "[]"} {"id": "internals:database-execute-isolated-fn", "page": "internals", "ref": "database-execute-isolated-fn", "title": "await db.execute_isolated_fn(fn)", "content": "This method works is similar to execute_write_fn() but executes the provided function in an entirely isolated SQLite connection, which is opened, used and then closed again in a single call to this method. \n The prepare_connection() plugin hook is not executed against this connection. \n This allows plugins to execute database operations that might conflict with how database connections are usually configured. For example, running a VACUUM operation while bypassing any restrictions placed by the datasette-sqlite-authorizer plugin. \n Plugins can also use this method to load potentially dangerous SQLite extensions, use them to perform an operation and then have them safely unloaded at the end of the call, without risk of exposing them to other connections. \n Functions run using execute_isolated_fn() share the same queue as execute_write_fn() , which guarantees that no writes can be executed at the same time as the isolated function is executing. \n The return value of the function will be returned by this method. Any exceptions raised by the function will be raised out of the await line as well.", "breadcrumbs": "[\"Internals for plugins\", \"Database class\"]", "references": "[{\"href\": \"https://github.com/datasette/datasette-sqlite-authorizer\", \"label\": \"datasette-sqlite-authorizer\"}]"} {"id": "javascript_plugins:javascript-plugins-makeabovetablepanelconfigs", "page": "javascript_plugins", "ref": "javascript-plugins-makeabovetablepanelconfigs", "title": "makeAboveTablePanelConfigs()", "content": "This method should return a JavaScript array of objects defining additional panels to be added to the top of the table page. Each object should have the following: \n \n \n id - string \n \n A unique string ID for the panel, for example map-panel \n \n \n \n label - string \n \n A human-readable label for the panel \n \n \n \n render(node) - function \n \n A function that will be called with a DOM node to render the panel into \n \n \n \n This example shows how a plugin might define a single panel: \n document.addEventListener('datasette_init', function(ev) {\n ev.detail.registerPlugin('panel-plugin', {\n version: 0.1,\n makeAboveTablePanelConfigs: () => {\n return [\n {\n id: 'first-panel',\n label: 'First panel',\n render: node => {\n node.innerHTML = '
<div><h2>My custom panel</h2><p>This is a custom panel that I added using a JavaScript plugin</p></div>
';\n }\n }\n ]\n }\n });\n}); \n When a page with a table loads, all registered plugins that implement makeAboveTablePanelConfigs() will be called and panels they return will be added to the top of the table page.", "breadcrumbs": "[\"JavaScript plugins\", \"JavaScript plugin objects\"]", "references": "[]"} {"id": "settings:setting-facet-time-limit-ms", "page": "settings", "ref": "setting-facet-time-limit-ms", "title": "facet_time_limit_ms", "content": "This is the time limit Datasette allows for calculating a facet, which defaults to 200ms: \n datasette mydatabase.db --setting facet_time_limit_ms 1000", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"} {"id": "changelog:id145", "page": "changelog", "ref": "id145", "title": "0.19 (2018-04-16)", "content": "This is the first preview of the new Datasette plugins mechanism. Only two\n plugin hooks are available so far - for custom SQL functions and custom template\n filters. There's plenty more to come - read the documentation and get involved in\n the tracking ticket if you\n have feedback on the direction so far. \n \n \n Fix for _sort_desc=sortable_with_nulls test, refs #216 \n \n \n Fixed #216 - paginate correctly when sorting by nullable column \n \n \n Initial documentation for plugins, closes #213 \n https://docs.datasette.io/en/stable/plugins.html \n \n \n New --plugins-dir=plugins/ option ( #212 ) \n New option causing Datasette to load and evaluate all of the Python files in\n the specified directory and register any plugins that are defined in those\n files. \n This new option is available for the following commands: \n datasette serve mydb.db --plugins-dir=plugins/\ndatasette publish now/heroku mydb.db --plugins-dir=plugins/\ndatasette package mydb.db --plugins-dir=plugins/ \n \n \n Start of the plugin system, based on pluggy ( #210 ) \n Uses https://pluggy.readthedocs.io/ originally created for the py.test project \n We're starting with two plugin hooks: \n prepare_connection(conn) \n This is called when a new SQLite connection is created. It can be used to register custom SQL functions. \n prepare_jinja2_environment(env) \n This is called with the Jinja2 environment. It can be used to register custom template tags and filters. \n An example plugin which uses these two hooks can be found at https://github.com/simonw/datasette-plugin-demos or installed using pip install datasette-plugin-demos \n Refs #14 \n \n \n Return HTTP 405 on InvalidUsage rather than 500. [Russ Garrett] \n This also stops it filling up the logs. 
This happens for HEAD requests\n at the moment - which perhaps should be handled better, but that's a\n different issue.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://docs.datasette.io/en/stable/plugins.html\", \"label\": \"the documentation\"}, {\"href\": \"https://github.com/simonw/datasette/issues/14\", \"label\": \"the tracking ticket\"}, {\"href\": \"https://github.com/simonw/datasette/issues/216\", \"label\": \"#216\"}, {\"href\": \"https://github.com/simonw/datasette/issues/216\", \"label\": \"#216\"}, {\"href\": \"https://github.com/simonw/datasette/issues/213\", \"label\": \"#213\"}, {\"href\": \"https://docs.datasette.io/en/stable/plugins.html\", \"label\": \"https://docs.datasette.io/en/stable/plugins.html\"}, {\"href\": \"https://github.com/simonw/datasette/issues/212\", \"label\": \"#212\"}, {\"href\": \"https://github.com/simonw/datasette/issues/14\", \"label\": \"#210\"}, {\"href\": \"https://pluggy.readthedocs.io/\", \"label\": \"https://pluggy.readthedocs.io/\"}, {\"href\": \"https://github.com/simonw/datasette-plugin-demos\", \"label\": \"https://github.com/simonw/datasette-plugin-demos\"}, {\"href\": \"https://github.com/simonw/datasette/issues/14\", \"label\": \"#14\"}]"} {"id": "plugin_hooks:plugin-hook-startup", "page": "plugin_hooks", "ref": "plugin-hook-startup", "title": "startup(datasette)", "content": "This hook fires when the Datasette application server first starts up. \n Here is an example that validates required plugin configuration. The server will fail to start and show an error if the validation check fails: \n @hookimpl\ndef startup(datasette):\n config = datasette.plugin_config(\"my-plugin\") or {}\n assert (\n \"required-setting\" in config\n ), \"my-plugin requires setting required-setting\" \n You can also return an async function, which will be awaited on startup. Use this option if you need to execute any database queries, for example this function which creates the my_table database table if it does not yet exist: \n @hookimpl\ndef startup(datasette):\n async def inner():\n db = datasette.get_database()\n if \"my_table\" not in await db.table_names():\n await db.execute_write(\n \"\"\"\n create table my_table (mycol text)\n \"\"\"\n )\n\n return inner \n Potential use-cases: \n \n \n Run some initialization code for the plugin \n \n \n Create database tables that a plugin needs on startup \n \n \n Validate the configuration for a plugin on startup, and raise an error if it is invalid \n \n \n \n If you are writing unit tests for a plugin that uses this hook and doesn't exercise Datasette by sending\n any simulated requests through it you will need to explicitly call await ds.invoke_startup() in your tests. An example: \n @pytest.mark.asyncio\nasync def test_my_plugin():\n ds = Datasette()\n await ds.invoke_startup()\n # Rest of test goes here \n \n Examples: datasette-saved-queries , datasette-init", "breadcrumbs": "[\"Plugin hooks\"]", "references": "[{\"href\": \"https://datasette.io/plugins/datasette-saved-queries\", \"label\": \"datasette-saved-queries\"}, {\"href\": \"https://datasette.io/plugins/datasette-init\", \"label\": \"datasette-init\"}]"} {"id": "internals:internals-utils-parse-metadata", "page": "internals", "ref": "internals-utils-parse-metadata", "title": "parse_metadata(content)", "content": "This function accepts a string containing either JSON or YAML, expected to be of the format described in Metadata . It returns a nested Python dictionary representing the parsed data from that string. 
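A short sketch showing both accepted formats producing the same dictionary:

from datasette.utils import parse_metadata

yaml_content = "title: My Datasette instance"
json_content = '{"title": "My Datasette instance"}'

assert parse_metadata(yaml_content) == {"title": "My Datasette instance"}
assert parse_metadata(json_content) == {"title": "My Datasette instance"}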
\n If the metadata cannot be parsed as either JSON or YAML the function will raise a utils.BadMetadataError exception. \n \n \n datasette.utils. parse_metadata content : str dict \n \n Detects if content is JSON or YAML and parses it appropriately.", "breadcrumbs": "[\"Internals for plugins\", \"The datasette.utils module\"]", "references": "[]"} {"id": "changelog:v1-0-a0", "page": "changelog", "ref": "v1-0-a0", "title": "1.0a0 (2022-11-29)", "content": "This first alpha release of Datasette 1.0 introduces a brand new collection of APIs for writing to the database ( #1850 ), as well as a new API token mechanism baked into Datasette core. Previously, API tokens have only been supported by installing additional plugins. \n This is very much a preview: expect many more backwards incompatible API changes prior to the full 1.0 release. \n Feedback enthusiastically welcomed, either through issue comments or via the Datasette Discord community.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1850\", \"label\": \"#1850\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1850\", \"label\": \"issue comments\"}, {\"href\": \"https://datasette.io/discord\", \"label\": \"Datasette Discord\"}]"} {"id": "cli-reference:cli-help-serve-help", "page": "cli-reference", "ref": "cli-help-serve-help", "title": "datasette serve", "content": "This command starts the Datasette web application running on your machine: \n datasette serve mydatabase.db \n Or since this is the default command you can run this instead: \n datasette mydatabase.db \n Once started you can access it at http://localhost:8001 \n [[[cog\nhelp([\"serve\", \"--help\"]) \n ]]] \n Usage: datasette serve [OPTIONS] [FILES]...\n\n Serve up specified SQLite database files with a web UI\n\nOptions:\n -i, --immutable PATH Database files to open in immutable mode\n -h, --host TEXT Host for server. Defaults to 127.0.0.1 which\n means only connections from the local machine\n will be allowed. Use 0.0.0.0 to listen to all\n IPs and allow access from other machines.\n -p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to\n automatically assign an available port.\n [0<=x<=65535]\n --uds TEXT Bind to a Unix domain socket\n --reload Automatically reload if code or metadata\n change detected - useful for development\n --cors Enable CORS by serving Access-Control-Allow-\n Origin: *\n --load-extension PATH:ENTRYPOINT?\n Path to a SQLite extension to load, and\n optional entrypoint\n --inspect-file TEXT Path to JSON file created using \"datasette\n inspect\"\n -m, --metadata FILENAME Path to JSON/YAML file containing\n license/source metadata\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --memory Make /_memory database available\n -c, --config FILENAME Path to JSON/YAML Datasette configuration file\n -s, --setting SETTING... 
nested.key, value setting to use in Datasette\n configuration\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --root Output URL that sets a cookie authenticating\n the root user\n --get TEXT Run an HTTP GET request against this path,\n print results and exit\n --token TEXT API token to send with --get requests\n --actor TEXT Actor to use for --get requests (JSON string)\n --version-note TEXT Additional note to show on /-/versions\n --help-settings Show available settings\n --pdb Launch debugger on any errors\n -o, --open Open Datasette in your web browser\n --create Create database files if they do not exist\n --crossdb Enable cross-database joins using the /_memory\n database\n --nolock Ignore locking, open locked files in read-only\n mode\n --ssl-keyfile TEXT SSL key file\n --ssl-certfile TEXT SSL certificate file\n --internal PATH Path to a persistent Datasette internal SQLite\n database\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-serve-help-settings", "page": "cli-reference", "ref": "cli-help-serve-help-settings", "title": "datasette serve --help-settings", "content": "This command outputs all of the available Datasette settings . \n These can be passed to datasette serve using datasette serve --setting name value . \n [[[cog\nhelp([\"--help-settings\"]) \n ]]] \n Settings:\n default_page_size Default page size for the table view\n (default=100)\n max_returned_rows Maximum rows that can be returned from a table or\n custom query (default=1000)\n max_insert_rows Maximum rows that can be inserted at a time using\n the bulk insert API (default=100)\n num_sql_threads Number of threads in the thread pool for\n executing SQLite queries (default=3)\n sql_time_limit_ms Time limit for a SQL query in milliseconds\n (default=1000)\n default_facet_size Number of values to return for requested facets\n (default=30)\n facet_time_limit_ms Time limit for calculating a requested facet\n (default=200)\n facet_suggest_time_limit_ms Time limit for calculating a suggested facet\n (default=50)\n allow_facet Allow users to specify columns to facet using\n ?_facet= parameter (default=True)\n allow_download Allow users to download the original SQLite\n database files (default=True)\n allow_signed_tokens Allow users to create and use signed API tokens\n (default=True)\n default_allow_sql Allow anyone to run arbitrary SQL queries\n (default=True)\n max_signed_tokens_ttl Maximum allowed expiry time for signed API tokens\n (default=0)\n suggest_facets Calculate and display suggested facets\n (default=True)\n default_cache_ttl Default HTTP cache TTL (used in Cache-Control:\n max-age= header) (default=5)\n cache_size_kb SQLite cache size in KB (0 == use SQLite default)\n (default=0)\n allow_csv_stream Allow .csv?_stream=1 to download all rows\n (ignoring max_returned_rows) (default=True)\n max_csv_mb Maximum size allowed for CSV export in MB - set 0\n to disable this limit (default=100)\n truncate_cells_html Truncate cells longer than this in HTML table\n view - set 0 to disable (default=2048)\n force_https_urls Force URLs in API output to always use https://\n protocol (default=False)\n template_debug Allow display of template debug information with\n ?_context=1 (default=False)\n trace_debug Allow display of SQL trace debug information with\n ?_trace=1 (default=False)\n base_url Datasette URLs should use this base path\n (default=/) \n [[[end]]]", "breadcrumbs": "[\"CLI reference\", 
\"datasette serve\"]", "references": "[]"} {"id": "changelog:v1-0-a3", "page": "changelog", "ref": "v1-0-a3", "title": "1.0a3 (2023-08-09)", "content": "This alpha release previews the updated design for Datasette's default JSON API. ( #782 ) \n The new default JSON representation for both table pages ( /dbname/table.json ) and arbitrary SQL queries ( /dbname.json?sql=... ) is now shaped like this: \n {\n \"ok\": true,\n \"rows\": [\n {\n \"id\": 3,\n \"name\": \"Detroit\"\n },\n {\n \"id\": 2,\n \"name\": \"Los Angeles\"\n },\n {\n \"id\": 4,\n \"name\": \"Memnonia\"\n },\n {\n \"id\": 1,\n \"name\": \"San Francisco\"\n }\n ],\n \"truncated\": false\n} \n Tables will include an additional \"next\" key for pagination, which can be passed to ?_next= to fetch the next page of results. \n The various ?_shape= options continue to work as before - see Different shapes for details. \n A new ?_extra= mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in #262 .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/782\", \"label\": \"#782\"}, {\"href\": \"https://github.com/simonw/datasette/issues/262\", \"label\": \"#262\"}]"} {"id": "changelog:v1-0-a8", "page": "changelog", "ref": "v1-0-a8", "title": "1.0a8 (2024-02-07)", "content": "This alpha release continues the migration of Datasette's configuration from metadata.yaml to the new datasette.yaml configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks. \n See Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml for an annotated version of these release notes.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://simonwillison.net/2024/Feb/7/datasette-1a8/\", \"label\": \"Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml\"}]"} {"id": "changelog:v1-0-a9", "page": "changelog", "ref": "v1-0-a9", "title": "1.0a9 (2024-02-16)", "content": "This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the /upsert API endpoint.", "breadcrumbs": "[\"Changelog\"]", "references": "[]"} {"id": "changelog:v1-0-a4", "page": "changelog", "ref": "v1-0-a4", "title": "1.0a4 (2023-08-21)", "content": "This alpha fixes a security issue with the /-/api API explorer. On authenticated Datasette instances (instances protected using plugins such as datasette-auth-passwords ) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed. \n For more information and workarounds, read the security advisory . The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3. \n Also in this alpha: \n \n \n The new datasette plugins --requirements option outputs a list of currently installed plugins in Python requirements.txt format, useful for duplicating that installation elsewhere. ( #2133 ) \n \n \n Writable canned queries can now define a on_success_message_sql field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. ( #2138 ) \n \n \n The automatically generated border color for a database is now shown in more places around the application. 
( #2119 ) \n \n \n Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. ( #2140 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://datasette.io/plugins/datasette-auth-passwords\", \"label\": \"datasette-auth-passwords\"}, {\"href\": \"https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq\", \"label\": \"the security advisory\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2133\", \"label\": \"#2133\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2138\", \"label\": \"#2138\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2119\", \"label\": \"#2119\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2140\", \"label\": \"#2140\"}]"} {"id": "plugin_hooks:plugin-page-extras", "page": "plugin_hooks", "ref": "plugin-page-extras", "title": "Page extras", "content": "These plugin hooks can be used to affect the way HTML pages for different Datasette interfaces are rendered.", "breadcrumbs": "[\"Plugin hooks\"]", "references": "[]"} {"id": "javascript_plugins:javascript-datasette-manager-selectors", "page": "javascript_plugins", "ref": "javascript-datasette-manager-selectors", "title": "Selectors", "content": "These are available on the selectors property of the datasetteManager object. \n const DOM_SELECTORS = {\n /** Should have one match */\n jsonExportLink: \".export-links a[href*=json]\",\n\n /** Event listeners that go outside of the main table, e.g. existing scroll listener */\n tableWrapper: \".table-wrapper\",\n table: \"table.rows-and-columns\",\n aboveTablePanel: \".above-table-panel\",\n\n // These could have multiple matches\n /** Used for selecting table headers. Use makeColumnActions if you want to add menu items. */\n tableHeaders: `table.rows-and-columns th`,\n\n /** Used to add \"where\" clauses to query using direct manipulation */\n filterRows: \".filter-row\",\n /** Used to show top available enum values for a column (\"facets\") */\n facetResults: \".facet-results [data-column]\",\n};", "breadcrumbs": "[\"JavaScript plugins\"]", "references": "[]"} {"id": "authentication:authentication-permissions-config", "page": "authentication", "ref": "authentication-permissions-config", "title": "Access permissions in ", "content": "There are two ways to configure permissions using datasette.yaml (or datasette.json ). \n For simple visibility permissions you can use \"allow\" blocks in the root, database, table and query sections. \n For other permissions you can use a \"permissions\" block, described in the next section . \n You can limit who is allowed to view different parts of your Datasette instance using \"allow\" keys in your Configuration . \n You can control the following: \n \n \n Access to the entire Datasette instance \n \n \n Access to specific databases \n \n \n Access to specific tables and views \n \n \n Access to specific Canned queries \n \n \n If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "full_text_search:full-text-search-fts-versions", "page": "full_text_search", "ref": "full-text-search-fts-versions", "title": "FTS versions", "content": "There are three different versions of the SQLite FTS module: FTS3, FTS4 and FTS5. 
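To check which of these modules your Python's bundled SQLite supports, here is a minimal sketch that probes an in-memory database (an illustration only, not Datasette's own detection code):

```python
import sqlite3

def supported_fts_versions():
    # Try to create a virtual table with each FTS module; an
    # OperationalError ("no such module") means it is unavailable.
    supported = []
    for fts in ("fts3", "fts4", "fts5"):
        conn = sqlite3.connect(":memory:")
        try:
            conn.execute(f"CREATE VIRTUAL TABLE t USING {fts}(content)")
            supported.append(fts.upper())
        except sqlite3.OperationalError:
            pass
        finally:
            conn.close()
    return supported

print(supported_fts_versions())  # e.g. ['FTS3', 'FTS4', 'FTS5']
```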
You can tell which versions are supported by your instance of Datasette by checking the /-/versions page. \n FTS5 is the most advanced module but may not be available in the SQLite version that is bundled with your Python installation. Most importantly, FTS5 is the only version that has the ability to order by search relevance without needing extra code. \n If you can't be sure that FTS5 will be available, you should use FTS4.", "breadcrumbs": "[\"Full-text search\"]", "references": "[]"} {"id": "changelog:id35", "page": "changelog", "ref": "id35", "title": "0.54 (2021-01-25)", "content": "The two big new features in this release are the _internal SQLite in-memory database storing details of all connected databases and tables, and support for JavaScript modules in plugins and additional scripts. \n For additional commentary on this release, see Datasette 0.54, the annotated release notes .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://simonwillison.net/2021/Jan/25/datasette/\", \"label\": \"Datasette 0.54, the annotated release notes\"}]"} {"id": "metadata:metadata-source-license-about", "page": "metadata", "ref": "metadata-source-license-about", "title": "Source, license and about", "content": "The three visible metadata fields you can apply to everything, specific databases or specific tables are source, license and about. All three are optional. \n source and source_url should be used to indicate where the underlying data came from. \n license and license_url should be used to indicate the license under which the data can be used. \n about and about_url can be used to link to further information about the project - an accompanying blog entry for example. \n For each of these you can provide just the *_url field and Datasette will treat that as the default link label text and display the URL directly on the page.", "breadcrumbs": "[\"Metadata\"]", "references": "[]"} {"id": "changelog:v1-0-a2", "page": "changelog", "ref": "v1-0-a2", "title": "1.0a2 (2022-12-14)", "content": "The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus the ability to specify finely grained permissions when creating an API token. \n See Datasette 1.0a2: Upserts and finely grained permissions for an extended, annotated version of these release notes. \n \n \n New /db/table/-/upsert API, documented here . upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead. ( #1878 ) \n \n \n New register_permissions(datasette) plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. ( #1940 ) \n \n \n The /db/-/create API for creating a table now accepts \"ignore\": true and \"replace\": true options when called with the \"rows\" property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. ( #1927 ) \n \n \n Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's Metadata JSON and YAML files. The new \"permissions\" key can be used to specify which actors should have which permissions. See Other permissions in datasette.yaml for details. 
( #1636 ) \n \n \n The /-/create-token page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See API Tokens for details. ( #1947 ) \n \n \n Likewise, the datasette create-token CLI command can now create tokens with a subset of permissions . ( #1855 ) \n \n \n New datasette.create_token() API method for programmatically creating signed API tokens. ( #1951 ) \n \n \n /db/-/create API now requires actor to have insert-row permission in order to use the \"row\" or \"rows\" properties. ( #1937 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://simonwillison.net/2022/Dec/15/datasette-1a2/\", \"label\": \"Datasette 1.0a2: Upserts and finely grained permissions\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1878\", \"label\": \"#1878\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1940\", \"label\": \"#1940\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1927\", \"label\": \"#1927\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1636\", \"label\": \"#1636\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1947\", \"label\": \"#1947\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1855\", \"label\": \"#1855\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1951\", \"label\": \"#1951\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1937\", \"label\": \"#1937\"}]"} {"id": "changelog:id174", "page": "changelog", "ref": "id174", "title": "0.14 (2017-12-09)", "content": "The theme of this release is customization: Datasette now allows every aspect\n of its presentation to be customized \n either using additional CSS or by providing entirely new templates. \n Datasette's metadata.json format \n has also been expanded, to allow per-database and per-table metadata. A new\n datasette skeleton command can be used to generate a skeleton JSON file\n ready to be filled in with per-database and per-table details. \n The metadata.json file can also be used to define\n canned queries ,\n as a more powerful alternative to SQL views. \n \n \n extra_css_urls / extra_js_urls in metadata \n A mechanism in the metadata.json format for adding custom CSS and JS urls. \n Create a metadata.json file that looks like this: \n {\n \"extra_css_urls\": [\n \"https://simonwillison.net/static/css/all.bf8cd891642c.css\"\n ],\n \"extra_js_urls\": [\n \"https://code.jquery.com/jquery-3.2.1.slim.min.js\"\n ]\n} \n Then start datasette like this: \n datasette mydb.db --metadata=metadata.json \n The CSS and JavaScript files will be linked in the of every page. \n You can also specify a SRI (subresource integrity hash) for these assets: \n {\n \"extra_css_urls\": [\n {\n \"url\": \"https://simonwillison.net/static/css/all.bf8cd891642c.css\",\n \"sri\": \"sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI\"\n }\n ],\n \"extra_js_urls\": [\n {\n \"url\": \"https://code.jquery.com/jquery-3.2.1.slim.min.js\",\n \"sri\": \"sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=\"\n }\n ]\n} \n Modern browsers will only execute the stylesheet or JavaScript if the SRI hash\n matches the content served. You can generate hashes using https://www.srihash.org/ \n \n \n Auto-link column values that look like URLs ( #153 ) \n \n \n CSS styling hooks as classes on the body ( #153 ) \n Every template now gets CSS classes in the body designed to support custom\n styling. 
\n The index template (the top level page at / ) gets this: \n <body class=\"index\"> \n The database template ( /dbname/ ) gets this: \n <body class=\"db db-dbname\"> \n The table template ( /dbname/tablename ) gets: \n <body class=\"db db-dbname table table-tablename\"> \n The row template ( /dbname/tablename/rowid ) gets: \n <body class=\"db db-dbname table table-tablename row\"> \n The db-x and table-x classes use the database or table names themselves IF\n they are valid CSS identifiers. If they aren't, we strip any invalid\n characters out and append a 6 character md5 digest of the original name, in\n order to ensure that multiple tables which resolve to the same stripped\n character version still have different CSS classes. \n Some examples (extracted from the unit tests): \n \"simple\" => \"simple\"\n\"MixedCase\" => \"MixedCase\"\n\"-no-leading-hyphens\" => \"no-leading-hyphens-65bea6\"\n\"_no-leading-underscores\" => \"no-leading-underscores-b921bc\"\n\"no spaces\" => \"no-spaces-7088d7\"\n\"-\" => \"336d5e\"\n\"no $ characters\" => \"no--characters-59e024\" \n \n \n datasette --template-dir=mytemplates/ argument \n You can now pass an additional argument specifying a directory to look for\n custom templates in. \n Datasette will fall back on the default templates if a template is not\n found in that directory. \n \n \n Ability to over-ride templates for individual tables/databases. \n It is now possible to over-ride templates on a per-database / per-row or per-\n table basis. \n When you access e.g. /mydatabase/mytable Datasette will look for the following: \n - table-mydatabase-mytable.html\n- table.html \n If you provided a --template-dir argument to datasette serve it will look in\n that directory first. \n The lookup rules are as follows: \n Index page (/):\n index.html\n\nDatabase page (/mydatabase):\n database-mydatabase.html\n database.html\n\nTable page (/mydatabase/mytable):\n table-mydatabase-mytable.html\n table.html\n\nRow page (/mydatabase/mytable/id):\n row-mydatabase-mytable.html\n row.html \n If a table name has spaces or other unexpected characters in it, the template\n filename will follow the same rules as our custom CSS classes\n - for example, a table called \"Food Trucks\"\n will attempt to load the following templates: \n table-mydatabase-Food-Trucks-399138.html\ntable.html \n It is possible to extend the default templates using Jinja template\n inheritance. If you want to customize EVERY row template with some additional\n content you can do so by creating a row.html template like this: \n {% extends \"default:row.html\" %}\n\n{% block content %}\n

<h1>EXTRA HTML AT THE TOP OF THE CONTENT BLOCK</h1>
<p>This line renders the original block:</p>

\n{{ super() }}\n{% endblock %} \n \n \n --static option for datasette serve ( #160 ) \n You can now tell Datasette to serve static files from a specific location at a\n specific mountpoint. \n For example: \n datasette serve mydb.db --static extra-css:/tmp/static/css \n Now if you visit this URL: \n http://localhost:8001/extra-css/blah.css \n The following file will be served: \n /tmp/static/css/blah.css \n \n \n Canned query support. \n Named canned queries can now be defined in metadata.json like this: \n {\n \"databases\": {\n \"timezones\": {\n \"queries\": {\n \"timezone_for_point\": \"select tzid from timezones ...\"\n }\n }\n }\n} \n These will be shown in a new \"Queries\" section beneath \"Views\" on the database page. \n \n \n New datasette skeleton command for generating metadata.json ( #164 ) \n \n \n metadata.json support for per-table/per-database metadata ( #165 ) \n Also added support for descriptions and HTML descriptions. \n Here's an example metadata.json file illustrating custom per-database and per-\n table metadata: \n {\n \"title\": \"Overall datasette title\",\n \"description_html\": \"This is a description with HTML.\",\n \"databases\": {\n \"db1\": {\n \"title\": \"First database\",\n \"description\": \"This is a string description & has no HTML\",\n \"license_url\": \"http://example.com/\",\n \"license\": \"The example license\",\n \"queries\": {\n \"canned_query\": \"select * from table1 limit 3;\"\n },\n \"tables\": {\n \"table1\": {\n \"title\": \"Custom title for table1\",\n \"description\": \"Tables can have descriptions too\",\n \"source\": \"This has a custom source\",\n \"source_url\": \"http://example.com/\"\n }\n }\n }\n }\n} \n \n \n Renamed datasette build command to datasette inspect ( #130 ) \n \n \n Upgrade to Sanic 0.7.0 ( #168 ) \n https://github.com/channelcat/sanic/releases/tag/0.7.0 \n \n \n Package and publish commands now accept --static and --template-dir \n Example usage: \n datasette package --static css:extra-css/ --static js:extra-js/ \\\n sf-trees.db --template-dir templates/ --tag sf-trees --branch master \n This creates a local Docker image that includes copies of the templates/,\n extra-css/ and extra-js/ directories. 
You can then run it like this: \n docker run -p 8001:8001 sf-trees \n For publishing to Zeit now: \n datasette publish now --static css:extra-css/ --static js:extra-js/ \\\n sf-trees.db --template-dir templates/ --name sf-trees --branch master \n \n \n HTML comment showing which templates were considered for a page ( #171 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://docs.datasette.io/en/stable/custom_templates.html\", \"label\": \"to be customized\"}, {\"href\": \"https://docs.datasette.io/en/stable/metadata.html\", \"label\": \"metadata.json format\"}, {\"href\": \"https://docs.datasette.io/en/stable/sql_queries.html#canned-queries\", \"label\": \"canned queries\"}, {\"href\": \"https://www.srihash.org/\", \"label\": \"https://www.srihash.org/\"}, {\"href\": \"https://github.com/simonw/datasette/issues/153\", \"label\": \"#153\"}, {\"href\": \"https://github.com/simonw/datasette/issues/153\", \"label\": \"#153\"}, {\"href\": \"https://github.com/simonw/datasette/issues/160\", \"label\": \"#160\"}, {\"href\": \"https://github.com/simonw/datasette/issues/164\", \"label\": \"#164\"}, {\"href\": \"https://github.com/simonw/datasette/issues/165\", \"label\": \"#165\"}, {\"href\": \"https://github.com/simonw/datasette/issues/130\", \"label\": \"#130\"}, {\"href\": \"https://github.com/simonw/datasette/issues/168\", \"label\": \"#168\"}, {\"href\": \"https://github.com/channelcat/sanic/releases/tag/0.7.0\", \"label\": \"https://github.com/channelcat/sanic/releases/tag/0.7.0\"}, {\"href\": \"https://github.com/simonw/datasette/issues/171\", \"label\": \"#171\"}]"} {"id": "pages:tableview", "page": "pages", "ref": "tableview", "title": "Table", "content": "The table page is the heart of Datasette: it allows users to interactively explore the contents of a database table, including sorting, filtering, Full-text search and applying Facets . \n The HTML interface is worth spending some time exploring. As with other pages, you can return the JSON data by appending .json to the URL path, before any ? query string arguments. \n The query string arguments are described in more detail here: Table arguments \n You can also use the table page to interactively construct a SQL query - by applying different filters and a sort order for example - and then click the \"View and edit SQL\" link to see the SQL query that was used for the page and edit and re-submit it. \n Some examples: \n \n \n ../items lists all of the line-items registered by UK MPs as potential conflicts of interest. It demonstrates Datasette's support for Full-text search . \n \n \n ../antiquities-act%2Factions_under_antiquities_act is an interface for exploring the \"actions under the antiquities act\" data table published by FiveThirtyEight. \n \n \n ../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas is a filtered table page showing every Gas power plant in the United Kingdom. 
It includes some default facets (configured using its metadata.json ) and uses the datasette-cluster-map plugin to show a map of the results.", "breadcrumbs": "[\"Pages and API endpoints\"]", "references": "[{\"href\": \"https://register-of-members-interests.datasettes.com/regmem/items\", \"label\": \"../items\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/antiquities-act%2Factions_under_antiquities_act\", \"label\": \"../antiquities-act%2Factions_under_antiquities_act\"}, {\"href\": \"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_facet=primary_fuel&_facet=owner&_facet=country_long&country_long__exact=United+Kingdom&primary_fuel=Gas\", \"label\": \"../global-power-plants?country_long=United+Kingdom&primary_fuel=Gas\"}, {\"href\": \"https://global-power-plants.datasettes.com/-/metadata\", \"label\": \"its metadata.json\"}, {\"href\": \"https://github.com/simonw/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}]"} {"id": "authentication:authentication-permissions-allow", "page": "authentication", "ref": "authentication-permissions-allow", "title": "Defining permissions with \"allow\" blocks", "content": "The standard way to define permissions in Datasette is to use an \"allow\" block in the datasette.yaml file . This is a JSON document describing which actors are allowed to perform a permission. \n The most basic form of allow block is this ( allow demo , deny demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow:\n id: root\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n This will match any actors with an \"id\" property of \"root\" - for example, an actor that looks like this: \n {\n \"id\": \"root\",\n \"name\": \"Root User\"\n} \n An allow block can specify \"deny all\" using false ( demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow: false\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n An \"allow\" of true allows all access ( demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow: true\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n Allow keys can provide a list of values. These will match any actor that has any of those values ( allow demo , deny demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow:\n id:\n - simon\n - cleopaws\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n This will match any actor with an \"id\" of either \"simon\" or \"cleopaws\" . \n Actors can have properties that feature a list of values. These will be matched against the list of values in an allow block. Consider the following actor: \n {\n \"id\": \"simon\",\n \"roles\": [\"staff\", \"developer\"]\n} \n This allow block will provide access to any actor that has \"developer\" as one of their roles ( allow demo , deny demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow:\n roles:\n - developer\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n Note that \"roles\" is not a concept that is baked into Datasette - it's a convention that plugins can choose to implement and act on. \n If you want to provide access to any actor with a value for a specific key, use \"*\" . 
For example, to match any logged-in user specify the following ( allow demo , deny demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow:\n id: \"*\"\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n You can specify that only unauthenticated actors (from anonymous HTTP requests) should be allowed access using the special \"unauthenticated\": true key in an allow block ( allow demo , deny demo ): \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow:\n unauthenticated: true\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n Allow keys act as an \"or\" mechanism. An actor will be able to execute the query if any of their JSON properties match any of the values in the corresponding lists in the allow block. The following block will allow users with either a role of \"ops\" OR users who have an id of \"simon\" or \"cleopaws\" : \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n allow:\n id:\n - simon\n - cleopaws\n role: ops\n \"\"\").strip(),\n \"YAML\", \"JSON\"\n ) \n ]]] \n [[[end]]] \n Demo for cleopaws , demo for ops role , demo for an actor matching neither rule .", "breadcrumbs": "[\"Authentication and permissions\", \"Permissions\"]", "references": "[{\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22root%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%22id%22%3A+%22trevor%22%7D&allow=%7B%0D%0A++++++++%22id%22%3A+%22root%22%0D%0A++++%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=false\", \"label\": \"demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22root%22%0D%0A%7D&allow=true\", \"label\": \"demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22pancakes%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%2C%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22staff%22%2C%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%2C%0D%0A++++%22roles%22%3A+%5B%22dog%22%5D%0D%0A%7D&allow=%7B%0D%0A++++%22roles%22%3A+%5B%0D%0A++++++++%22developer%22%0D%0A++++%5D%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22simon%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22bot%22%3A+%22readme-bot%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%22*%22%0D%0A%7D\", \"label\": \"deny demo\"}, 
{\"href\": \"https://latest.datasette.io/-/allow-debug?actor=null&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D\", \"label\": \"allow demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22hello%22%0D%0A%7D&allow=%7B%0D%0A++++%22unauthenticated%22%3A+true%0D%0A%7D\", \"label\": \"deny demo\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22cleopaws%22%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"Demo for cleopaws\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22trevor%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22ops%22%2C%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"demo for ops role\"}, {\"href\": \"https://latest.datasette.io/-/allow-debug?actor=%7B%0D%0A++++%22id%22%3A+%22percy%22%2C%0D%0A++++%22role%22%3A+%5B%0D%0A++++++++%22staff%22%0D%0A++++%5D%0D%0A%7D&allow=%7B%0D%0A++++%22id%22%3A+%5B%0D%0A++++++++%22simon%22%2C%0D%0A++++++++%22cleopaws%22%0D%0A++++%5D%2C%0D%0A++++%22role%22%3A+%22ops%22%0D%0A%7D\", \"label\": \"demo for an actor matching neither rule\"}]"} {"id": "pages:indexview", "page": "pages", "ref": "indexview", "title": "Top-level index", "content": "The root page of any Datasette installation is an index page that lists all of the currently attached databases. Some examples: \n \n \n fivethirtyeight.datasettes.com \n \n \n global-power-plants.datasettes.com \n \n \n register-of-members-interests.datasettes.com \n \n \n Add /.json to the end of the URL for the JSON version of the underlying data: \n \n \n fivethirtyeight.datasettes.com/.json \n \n \n global-power-plants.datasettes.com/.json \n \n \n register-of-members-interests.datasettes.com/.json", "breadcrumbs": "[\"Pages and API endpoints\"]", "references": "[{\"href\": \"https://fivethirtyeight.datasettes.com/\", \"label\": \"fivethirtyeight.datasettes.com\"}, {\"href\": \"https://global-power-plants.datasettes.com/\", \"label\": \"global-power-plants.datasettes.com\"}, {\"href\": \"https://register-of-members-interests.datasettes.com/\", \"label\": \"register-of-members-interests.datasettes.com\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/.json\", \"label\": \"fivethirtyeight.datasettes.com/.json\"}, {\"href\": \"https://global-power-plants.datasettes.com/.json\", \"label\": \"global-power-plants.datasettes.com/.json\"}, {\"href\": \"https://register-of-members-interests.datasettes.com/.json\", \"label\": \"register-of-members-interests.datasettes.com/.json\"}]"} {"id": "internals:internals-request", "page": "internals", "ref": "internals-request", "title": "Request object", "content": "The request object is passed to various plugin hooks. It represents an incoming HTTP request. It has the following properties: \n \n \n .scope - dictionary \n \n The ASGI scope that was used to construct this request, described in the ASGI HTTP connection scope specification. \n \n \n \n .method - string \n \n The HTTP method for this request, usually GET or POST . \n \n \n \n .url - string \n \n The full URL for this request, e.g. https://latest.datasette.io/fixtures . \n \n \n \n .scheme - string \n \n The request scheme - usually https or http . 
\n \n \n \n .headers - dictionary (str -> str) \n \n A dictionary of incoming HTTP request headers. Header names have been converted to lowercase. \n \n \n \n .cookies - dictionary (str -> str) \n \n A dictionary of incoming cookies \n \n \n \n .host - string \n \n The host header from the incoming request, e.g. latest.datasette.io or localhost . \n \n \n \n .path - string \n \n The path of the request excluding the query string, e.g. /fixtures . \n \n \n \n .full_path - string \n \n The path of the request including the query string if one is present, e.g. /fixtures?sql=select+sqlite_version() . \n \n \n \n .query_string - string \n \n The query string component of the request, without the ? - e.g. name__contains=sam&age__gt=10 . \n \n \n \n .args - MultiParams \n \n An object representing the parsed query string parameters, see below. \n \n \n \n .url_vars - dictionary (str -> str) \n \n Variables extracted from the URL path, if that path was defined using a regular expression. See register_routes(datasette) . \n \n \n \n .actor - dictionary (str -> Any) or None \n \n The currently authenticated actor (see actors ), or None if the request is unauthenticated. \n \n \n \n The object also has two awaitable methods: \n \n \n await request.post_vars() - dictionary \n \n Returns a dictionary of form variables that were submitted in the request body via POST . Don't forget to read about CSRF protection ! \n \n \n \n await request.post_body() - bytes \n \n Returns the un-parsed body of a request submitted by POST - useful for things like incoming JSON data. \n \n \n \n And a class method that can be used to create fake request objects for use in tests: \n \n \n fake(path_with_query_string, method=\"GET\", scheme=\"http\", url_vars=None) \n \n Returns a Request instance for the specified path and method. For example: \n from datasette import Request\nfrom pprint import pprint\n\nrequest = Request.fake(\n \"/fixtures/facetable/\",\n url_vars={\"database\": \"fixtures\", \"table\": \"facetable\"},\n)\npprint(request.scope) \n This outputs: \n {'http_version': '1.1',\n 'method': 'GET',\n 'path': '/fixtures/facetable/',\n 'query_string': b'',\n 'raw_path': b'/fixtures/facetable/',\n 'scheme': 'http',\n 'type': 'http',\n 'url_route': {'kwargs': {'database': 'fixtures', 'table': 'facetable'}}}", "breadcrumbs": "[\"Internals for plugins\"]", "references": "[{\"href\": \"https://asgi.readthedocs.io/en/latest/specs/www.html#connection-scope\", \"label\": \"ASGI HTTP connection scope\"}]"} {"id": "configuration:configuration-cli", "page": "configuration", "ref": "configuration-cli", "title": "Configuration via the command-line", "content": "The recommended way to configure Datasette is using a datasette.yaml file passed to -c/--config . You can also pass individual settings to Datasette using the -s/--setting option, which can be used multiple times: \n datasette mydatabase.db \\\n --setting settings.default_page_size 50 \\\n --setting settings.sql_time_limit_ms 3500 \n This option takes dotted-notation for the first argument and a value for the second argument. This means you can use it to set any configuration value that would be valid in a datasette.yaml file. 
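To make the dotted-notation behaviour concrete, here is a minimal sketch of how a dotted key expands into nested configuration (an illustration of the mapping only, not Datasette's actual implementation):

```python
def set_dotted(config: dict, dotted_key: str, value):
    # Walk (and create) one nested dictionary per dot-separated
    # segment, then assign the value at the final key.
    *parents, last = dotted_key.split(".")
    node = config
    for key in parents:
        node = node.setdefault(key, {})
    node[last] = value
    return config

config = {}
set_dotted(config, "settings.default_page_size", 50)
set_dotted(config, "settings.sql_time_limit_ms", 3500)
# config == {"settings": {"default_page_size": 50,
#                         "sql_time_limit_ms": 3500}}
```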
\n It also works for plugin configuration, for example for datasette-cluster-map : \n datasette mydatabase.db \\\n --setting plugins.datasette-cluster-map.latitude_column xlat \\\n --setting plugins.datasette-cluster-map.longitude_column xlon \n If the value you provide is a valid JSON object or list it will be treated as nested data, allowing you to configure plugins that accept lists such as datasette-proxy-url : \n datasette mydatabase.db \\\n -s plugins.datasette-proxy-url.paths '[{\"path\": \"/proxy\", \"backend\": \"http://example.com/\"}]' \n This is equivalent to a datasette.yaml file containing the following: \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n plugins:\n datasette-proxy-url:\n paths:\n - path: /proxy\n backend: http://example.com/\n \"\"\").strip()\n ) \n ]]] \n [[[end]]]", "breadcrumbs": "[\"Configuration\"]", "references": "[{\"href\": \"https://datasette.io/plugins/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}, {\"href\": \"https://datasette.io/plugins/datasette-proxy-url\", \"label\": \"datasette-proxy-url\"}]"} {"id": "writing_plugins:writing-plugins-one-off", "page": "writing_plugins", "ref": "writing-plugins-one-off", "title": "Writing one-off plugins", "content": "The quickest way to start writing a plugin is to create a my_plugin.py file and drop it into your plugins/ directory. Here is an example plugin, which adds a new custom SQL function called hello_world() which takes no arguments and returns the string Hello world! . \n from datasette import hookimpl\n\n\n@hookimpl\ndef prepare_connection(conn):\n conn.create_function(\n \"hello_world\", 0, lambda: \"Hello world!\"\n ) \n If you save this in plugins/my_plugin.py you can then start Datasette like this: \n datasette serve mydb.db --plugins-dir=plugins/ \n Now you can navigate to http://localhost:8001/mydb and run this SQL: \n select hello_world(); \n To see the output of your plugin.", "breadcrumbs": "[\"Writing plugins\"]", "references": "[{\"href\": \"http://localhost:8001/mydb\", \"label\": \"http://localhost:8001/mydb\"}]"} {"id": "deploying:deploying", "page": "deploying", "ref": "deploying", "title": "Deploying Datasette", "content": "The quickest way to deploy a Datasette instance on the internet is to use the datasette publish command, described in Publishing data . This can be used to quickly deploy Datasette to a number of hosting providers including Heroku, Google Cloud Run and Vercel. \n You can deploy Datasette to other hosting providers using the instructions on this page.", "breadcrumbs": "[]", "references": "[]"} {"id": "changelog:better-plugin-documentation", "page": "changelog", "ref": "better-plugin-documentation", "title": "Better plugin documentation", "content": "The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 ) \n \n \n Plugins introduces Datasette's plugin system and describes how to install and configure plugins. \n \n \n Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template. \n \n \n Plugin hooks is a full list of detailed documentation for every Datasette plugin hook. 
\n \n \n Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX .", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/687\", \"label\": \"#687\"}, {\"href\": \"https://github.com/simonw/datasette-plugin\", \"label\": \"datasette-plugin\"}, {\"href\": \"https://docs.pytest.org/\", \"label\": \"pytest\"}, {\"href\": \"https://www.python-httpx.org/\", \"label\": \"HTTPX\"}]"} {"id": "facets:speeding-up-facets-with-indexes", "page": "facets", "ref": "speeding-up-facets-with-indexes", "title": "Speeding up facets with indexes", "content": "The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by.\n Adding indexes can be performed using the sqlite3 command-line utility. Here's how to add an index on the state column in a table called Food_Trucks : \n sqlite3 mydatabase.db \n SQLite version 3.19.3 2017-06-27 16:48:08\nEnter \".help\" for usage hints.\nsqlite> CREATE INDEX Food_Trucks_state ON Food_Trucks(\"state\"); \n Or using the sqlite-utils command-line utility: \n sqlite-utils create-index mydatabase.db Food_Trucks state", "breadcrumbs": "[\"Facets\"]", "references": "[{\"href\": \"https://sqlite-utils.datasette.io/en/stable/cli.html#creating-indexes\", \"label\": \"sqlite-utils\"}]"} {"id": "authentication:logoutview", "page": "authentication", "ref": "logoutview", "title": "The /-/logout page", "content": "The page at /-/logout provides the ability to log out of a ds_actor cookie authentication session.", "breadcrumbs": "[\"Authentication and permissions\", \"The ds_actor cookie\"]", "references": "[]"} {"id": "changelog:v1-0-a10", "page": "changelog", "ref": "v1-0-a10", "title": "1.0a10 (2024-02-17)", "content": "The only changes in this alpha correspond to the way Datasette handles database transactions. ( #2277 ) \n \n \n The database.execute_write_fn() method has a new transaction=True parameter. This defaults to True which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to roll transaction handling on their own, and many did not. \n \n \n Pass transaction=False to execute_write_fn() if you want to manually handle transactions in your function. \n \n \n Several internal Datasette features, including parts of the JSON write API , had been failing to wrap their operations in a transaction. This has been fixed by the new transaction=True default.", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2277\", \"label\": \"#2277\"}]"} {"id": "changelog:faceting", "page": "changelog", "ref": "faceting", "title": "Faceting", "content": "The number of unique values in a facet is now always displayed. Previously it was only displayed if the user specified ?_facet_size=max . ( #1556 ) \n \n \n Facets of type date or array can now be configured in metadata.json , see Facets in metadata . Thanks, David Larlet. ( #1552 ) \n \n \n New ?_nosuggest=1 parameter for table views, which disables facet suggestion. ( #1557 ) \n \n \n Fixed bug where ?_facet_array=tags&_facet=tags would only display one of the two selected facets. 
( #625 )", "breadcrumbs": "[\"Changelog\", \"0.60 (2022-01-13)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1556\", \"label\": \"#1556\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1552\", \"label\": \"#1552\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1557\", \"label\": \"#1557\"}, {\"href\": \"https://github.com/simonw/datasette/issues/625\", \"label\": \"#625\"}]"} {"id": "changelog:url-building", "page": "changelog", "ref": "url-building", "title": "URL building", "content": "The new datasette.urls family of methods can be used to generate URLs to key pages within the Datasette interface, both within custom templates and Datasette plugins. See Building URLs within plugins for more details. ( #904 )", "breadcrumbs": "[\"Changelog\", \"0.51 (2020-10-31)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/904\", \"label\": \"#904\"}]"} {"id": "changelog:other-changes", "page": "changelog", "ref": "other-changes", "title": "Other changes", "content": "The new DATASETTE_TRACE_PLUGINS=1 environment variable turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. ( #2274 ) \n \n \n Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as usedforsecurity=False , for compatibility with FIPS systems. ( #2270 ) \n \n \n SQL relating to Datasette's internal database now executes inside a transaction, avoiding a potential database locked error. ( #2273 ) \n \n \n The /-/threads debug page now identifies the database in the name associated with each dedicated write thread. ( #2265 ) \n \n \n The /db/-/create API now fires a insert-rows event if rows were inserted after the table was created. ( #2260 )", "breadcrumbs": "[\"Changelog\", \"1.0a9 (2024-02-16)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/2274\", \"label\": \"#2274\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2270\", \"label\": \"#2270\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2273\", \"label\": \"#2273\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2265\", \"label\": \"#2265\"}, {\"href\": \"https://github.com/simonw/datasette/issues/2260\", \"label\": \"#2260\"}]"} {"id": "changelog:through-for-joins-through-many-to-many-tables", "page": "changelog", "ref": "through-for-joins-through-many-to-many-tables", "title": "?_through= for joins through many-to-many tables", "content": "The new ?_through={json} argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example . 
( #355 ) \n This feature was added to help support facet by many-to-many , which isn't quite ready yet but will be coming in the next Datasette release.", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}\", \"label\": \"an example\"}, {\"href\": \"https://github.com/simonw/datasette/issues/355\", \"label\": \"#355\"}, {\"href\": \"https://github.com/simonw/datasette/issues/551\", \"label\": \"facet by many-to-many\"}]"} {"id": "settings:setting-max-csv-mb", "page": "settings", "ref": "setting-max-csv-mb", "title": "max_csv_mb", "content": "The maximum size of CSV that can be exported, in megabytes. Defaults to 100MB.\n You can disable the limit entirely by settings this to 0: \n datasette mydatabase.db --setting max_csv_mb 0", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"} {"id": "changelog:id62", "page": "changelog", "ref": "id62", "title": "0.43 (2020-05-28)", "content": "The main focus of this release is a major upgrade to the register_output_renderer(datasette) plugin hook, which allows plugins to provide new output formats for Datasette such as datasette-atom and datasette-ics . \n \n \n Redesign of register_output_renderer(datasette) to provide more context to the render callback and support an optional \"can_render\" callback that controls if a suggested link to the output format is provided. ( #581 , #770 ) \n \n \n Visually distinguish float and integer columns - useful for figuring out why order-by-column might be returning unexpected results. ( #729 ) \n \n \n The Request object , which is passed to several plugin hooks, is now documented. ( #706 ) \n \n \n New metadata.json option for setting a custom default page size for specific tables and views, see Setting a custom page size . ( #751 ) \n \n \n Canned queries can now be configured with a default URL fragment hash, useful when working with plugins such as datasette-vega , see Additional canned query options . ( #706 ) \n \n \n Fixed a bug in datasette publish when running on operating systems where the /tmp directory lives in a different volume, using a backport of the Python 3.8 shutil.copytree() function. ( #744 ) \n \n \n Every plugin hook is now covered by the unit tests, and a new unit test checks that each plugin hook has at least one corresponding test. 
( #771 , #773 )", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-atom\", \"label\": \"datasette-atom\"}, {\"href\": \"https://github.com/simonw/datasette-ics\", \"label\": \"datasette-ics\"}, {\"href\": \"https://github.com/simonw/datasette/issues/581\", \"label\": \"#581\"}, {\"href\": \"https://github.com/simonw/datasette/issues/770\", \"label\": \"#770\"}, {\"href\": \"https://github.com/simonw/datasette/issues/729\", \"label\": \"#729\"}, {\"href\": \"https://github.com/simonw/datasette/issues/706\", \"label\": \"#706\"}, {\"href\": \"https://github.com/simonw/datasette/issues/751\", \"label\": \"#751\"}, {\"href\": \"https://github.com/simonw/datasette-vega\", \"label\": \"datasette-vega\"}, {\"href\": \"https://github.com/simonw/datasette/issues/706\", \"label\": \"#706\"}, {\"href\": \"https://github.com/simonw/datasette/issues/744\", \"label\": \"#744\"}, {\"href\": \"https://github.com/simonw/datasette/issues/771\", \"label\": \"#771\"}, {\"href\": \"https://github.com/simonw/datasette/issues/773\", \"label\": \"#773\"}]"} {"id": "changelog:id49", "page": "changelog", "ref": "id49", "title": "0.50 (2020-10-09)", "content": "The key new feature in this release is the column actions menu on the table page ( #891 ). This can be used to sort a column in ascending or descending order, facet data by that column or filter the table to just rows that have a value for that column. \n Plugin authors can use the new datasette.client object to make internal HTTP requests from their plugins, allowing them to make use of Datasette's JSON API. ( #943 ) \n New Deploying Datasette documentation with guides for deploying Datasette on a Linux server using systemd or to hosting providers that support buildpacks . ( #514 , #997 ) \n Other improvements in this release: \n \n \n Publishing to Google Cloud Run documentation now covers Google Cloud SDK options. Thanks, Geoffrey Hing. ( #995 ) \n \n \n New datasette -o option which opens your browser as soon as Datasette starts up. ( #970 ) \n \n \n Datasette now sets sqlite3.enable_callback_tracebacks(True) so that errors in custom SQL functions will display tracebacks. ( #891 ) \n \n \n Fixed two rendering bugs with column headers in portrait mobile view. ( #978 , #980 ) \n \n \n New db.table_column_details(table) introspection method for retrieving full details of the columns in a specific table, see Database introspection . \n \n \n Fixed a routing bug with custom page wildcard templates. ( #996 ) \n \n \n datasette publish heroku now deploys using Python 3.8.6. \n \n \n New datasette publish heroku --tar= option. ( #969 ) \n \n \n OPTIONS requests against HTML pages no longer return a 500 error. ( #1001 ) \n \n \n Datasette now supports Python 3.9. 
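As a rough sketch of the datasette.client pattern mentioned above, a plugin could fetch JSON from its own instance like this (the database and table names are hypothetical, and the exact response shape depends on your Datasette version):

```python
from datasette import hookimpl

@hookimpl
def startup(datasette):
    # The startup hook may return an async function, which Datasette
    # awaits when the application starts.
    async def inner():
        # datasette.client routes this request internally - no network
        # round-trip. "/mydb/mytable.json" is a hypothetical path.
        response = await datasette.client.get("/mydb/mytable.json")
        data = response.json()
        print("mytable has", len(data["rows"]), "rows on this page")

    return inner
```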
\n \n \n See also Datasette 0.50: The annotated release notes .", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/891\", \"label\": \"#891\"}, {\"href\": \"https://github.com/simonw/datasette/issues/943\", \"label\": \"#943\"}, {\"href\": \"https://github.com/simonw/datasette/issues/514\", \"label\": \"#514\"}, {\"href\": \"https://github.com/simonw/datasette/issues/997\", \"label\": \"#997\"}, {\"href\": \"https://github.com/simonw/datasette/pull/995\", \"label\": \"#995\"}, {\"href\": \"https://github.com/simonw/datasette/issues/970\", \"label\": \"#970\"}, {\"href\": \"https://github.com/simonw/datasette/issues/891\", \"label\": \"#891\"}, {\"href\": \"https://github.com/simonw/datasette/issues/978\", \"label\": \"#978\"}, {\"href\": \"https://github.com/simonw/datasette/issues/980\", \"label\": \"#980\"}, {\"href\": \"https://github.com/simonw/datasette/issues/996\", \"label\": \"#996\"}, {\"href\": \"https://github.com/simonw/datasette/issues/969\", \"label\": \"#969\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1001\", \"label\": \"#1001\"}, {\"href\": \"https://simonwillison.net/2020/Oct/9/datasette-0-50/\", \"label\": \"Datasette 0.50: The annotated release notes\"}]"} {"id": "plugin_hooks:plugin-hook-slots", "page": "plugin_hooks", "ref": "plugin-hook-slots", "title": "Template slots", "content": "The following set of plugin hooks can be used to return extra HTML content that will be inserted into the corresponding page, directly below the
<h1>
heading. \n Multiple plugins can contribute content here. The order in which it is displayed can be controlled using Pluggy's call time order options . \n Each of these plugin hooks can return either a string or an awaitable function that returns a string.", "breadcrumbs": "[\"Plugin hooks\"]", "references": "[{\"href\": \"https://pluggy.readthedocs.io/en/stable/#call-time-order\", \"label\": \"call time order options\"}]"} {"id": "csv_export:csv-export-url-parameters", "page": "csv_export", "ref": "csv-export-url-parameters", "title": "URL parameters", "content": "The following options can be used to customize the CSVs returned by Datasette. \n \n \n ?_header=off \n \n This removes the first row of the CSV file specifying the headings - only the row data will be returned. \n \n \n \n ?_stream=on \n \n Stream all matching records, not just the first page of results. See below. \n \n \n \n ?_dl=on \n \n Causes Datasette to return a content-disposition: attachment; filename=\"filename.csv\" header.", "breadcrumbs": "[\"CSV export\"]", "references": "[]"} {"id": "settings:id2", "page": "settings", "ref": "id2", "title": "Settings", "content": "The following options can be set using --setting name value , or by storing them in the settings.json file for use with Configuration directory mode .", "breadcrumbs": "[\"Settings\"]", "references": "[]"} {"id": "configuration:configuration-reference", "page": "configuration", "ref": "configuration-reference", "title": null, "content": "The following example shows some of the valid configuration options that can exist inside datasette.yaml . \n [[[cog\nfrom metadata_doc import config_example\nimport textwrap\nconfig_example(cog, textwrap.dedent(\n \"\"\"\n # Datasette settings block\n settings:\n default_page_size: 50\n sql_time_limit_ms: 3500\n max_returned_rows: 2000\n\n # top-level plugin configuration\n plugins:\n datasette-my-plugin:\n key: valueA\n\n # Database and table-level configuration\n databases:\n your_db_name:\n # plugin configuration for the your_db_name database\n plugins:\n datasette-my-plugin:\n key: valueA\n tables:\n your_table_name:\n allow:\n # Only the root user can access this table\n id: root\n # plugin configuration for the your_table_name table\n # inside your_db_name database\n plugins:\n datasette-my-plugin:\n key: valueB\n \"\"\")\n ) \n ]]] \n [[[end]]]", "breadcrumbs": "[\"Configuration\"]", "references": "[]"} {"id": "internals:internals-shortcuts", "page": "internals", "ref": "internals-shortcuts", "title": "Import shortcuts", "content": "The following commonly used symbols can be imported directly from the datasette module: \n from datasette import Response\nfrom datasette import Forbidden\nfrom datasette import NotFound\nfrom datasette import hookimpl\nfrom datasette import actor_matches_allow", "breadcrumbs": "[\"Internals for plugins\"]", "references": "[]"} {"id": "spatialite:installing-spatialite-on-os-x", "page": "spatialite", "ref": "installing-spatialite-on-os-x", "title": "Installing SpatiaLite on OS X", "content": "The easiest way to install SpatiaLite on OS X is to use Homebrew . \n brew update\nbrew install spatialite-tools \n This will install the spatialite command-line tool and the mod_spatialite dynamic library. 
\n You can now run Datasette like so: \n datasette --load-extension=spatialite", "breadcrumbs": "[\"SpatiaLite\", \"Installation\"]", "references": "[{\"href\": \"https://brew.sh/\", \"label\": \"Homebrew\"}]"} {"id": "contributing:contributing-continuous-deployment", "page": "contributing", "ref": "contributing-continuous-deployment", "title": "Continuously deployed demo instances", "content": "The demo instance at latest.datasette.io is re-deployed automatically to Google Cloud Run for every push to main that passes the test suite. This is implemented by the GitHub Actions workflow at .github/workflows/deploy-latest.yml . \n Specific branches can also be set to automatically deploy by adding them to the on: push: branches block at the top of the workflow YAML file. Branches configured in this way will be deployed to a new Cloud Run service whether or not their tests pass. \n The Cloud Run URL for a branch demo can be found in the GitHub Actions logs.", "breadcrumbs": "[\"Contributing\"]", "references": "[{\"href\": \"https://latest.datasette.io/\", \"label\": \"latest.datasette.io\"}, {\"href\": \"https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml\", \"label\": \".github/workflows/deploy-latest.yml\"}]"} {"id": "settings:setting-default-facet-size", "page": "settings", "ref": "setting-default-facet-size", "title": "default_facet_size", "content": "The default number of unique rows returned by Facets is 30. You can customize it like this: \n datasette mydatabase.db --setting default_facet_size 50", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"} {"id": "settings:setting-default-page-size", "page": "settings", "ref": "setting-default-page-size", "title": "default_page_size", "content": "The default number of rows returned by the table page. You can over-ride this on a per-page basis using the ?_size=80 query string parameter, provided you do not specify a value higher than the max_returned_rows setting. You can set this default using --setting like so: \n datasette mydatabase.db --setting default_page_size 50", "breadcrumbs": "[\"Settings\", \"Settings\"]", "references": "[]"} {"id": "json_api:json-api-default", "page": "json_api", "ref": "json-api-default", "title": "Default representation", "content": "The default JSON representation of data from a SQLite table or custom query\n looks like this: \n {\n \"ok\": true,\n \"rows\": [\n {\n \"id\": 3,\n \"name\": \"Detroit\"\n },\n {\n \"id\": 2,\n \"name\": \"Los Angeles\"\n },\n {\n \"id\": 4,\n \"name\": \"Memnonia\"\n },\n {\n \"id\": 1,\n \"name\": \"San Francisco\"\n }\n ],\n \"truncated\": false\n} \n \"ok\" is always true if an error did not occur. \n The \"rows\" key is a list of objects, each one representing a row. \n The \"truncated\" key lets you know if the query was truncated. This can happen if a SQL query returns more than 1,000 results (or the max_returned_rows setting). \n For table pages, an additional key \"next\" may be present. This indicates that the next page in the pagination set can be retrieved using ?_next=VALUE .", "breadcrumbs": "[\"JSON API\"]", "references": "[]"} {"id": "json_api:json-api-pagination", "page": "json_api", "ref": "json-api-pagination", "title": "Pagination", "content": "The default JSON representation includes a \"next_url\" key which can be used to access the next page of results. If that key is null or missing then it means you have reached the final page of results. 
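For the default representation, a minimal sketch that follows "next_url" until the final page might look like this (assuming an endpoint that returns "rows" and "next_url" keys as described above):

```python
import requests

def fetch_all_rows(url):
    # Keep requesting pages until "next_url" is null or missing.
    rows = []
    while url:
        data = requests.get(url).json()
        rows.extend(data["rows"])
        url = data.get("next_url")
    return rows
```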
\n Other representations include pagination information in the link HTTP header. That header will look something like this: \n link: ; rel=\"next\" \n Here is an example Python function built using requests that returns a list of all of the paginated items from one of these API endpoints: \n import requests\n\n\ndef paginate(url):\n    items = []\n    while url:\n        response = requests.get(url)\n        # response.links parses the link header; if there is no\n        # rel=\"next\" entry the chained .get() calls raise\n        # AttributeError, which ends the loop\n        try:\n            url = response.links.get(\"next\").get(\"url\")\n        except AttributeError:\n            url = None\n        items.extend(response.json())\n    return items", "breadcrumbs": "[\"JSON API\"]", "references": "[{\"href\": \"https://requests.readthedocs.io/\", \"label\": \"requests\"}]"} {"id": "authentication:permissionsdebugview", "page": "authentication", "ref": "permissionsdebugview", "title": "The permissions debug tool", "content": "The debug tool at /-/permissions is only available to the authenticated root user (or any actor granted the permissions-debug action). \n It shows the thirty most recent permission checks that have been carried out by the Datasette instance. \n It also provides an interface for running hypothetical permission checks against a hypothetical actor. This is a useful way of confirming that your configured permissions work in the way you expect. \n This is designed to help administrators and plugin authors understand exactly how permission checks are being carried out, in order to effectively configure Datasette's permission system.", "breadcrumbs": "[\"Authentication and permissions\"]", "references": "[]"} {"id": "introspection:messagesdebugview", "page": "introspection", "ref": "messagesdebugview", "title": "/-/messages", "content": "The debug tool at /-/messages can be used to set flash messages to try out that feature. See .add_message(request, message, type=datasette.INFO) for details of this feature.", "breadcrumbs": "[\"Introspection\"]", "references": "[]"} {"id": "changelog:id161", "page": "changelog", "ref": "id161", "title": "0.15 (2018-04-09)", "content": "The biggest new feature in this release is the ability to sort by column. On the\n table page the column headers can now be clicked to apply sort (or descending\n sort), or you can specify ?_sort=column or ?_sort_desc=column directly\n in the URL. \n \n \n table_rows => table_rows_count , filtered_table_rows =>\n filtered_table_rows_count \n Renamed properties. Closes #194 \n \n \n New sortable_columns option in metadata.json to control sort options. \n You can now explicitly set which columns in a table can be used for sorting\n using the _sort and _sort_desc arguments using metadata.json : \n {\n \"databases\": {\n \"database1\": {\n \"tables\": {\n \"example_table\": {\n \"sortable_columns\": [\n \"height\",\n \"weight\"\n ]\n }\n }\n }\n }\n} \n Refs #189 \n \n \n Column headers now link to sort/desc sort - refs #189 \n \n \n _sort and _sort_desc parameters for table views \n Allows for paginated sorted results based on a specified column.
\n Refs #189 \n \n \n Total row count now correct even if _next applied \n \n \n Use .custom_sql() for _group_count implementation (refs #150 ) \n \n \n Make HTML title more readable in query template ( #180 ) [Ryan Pitts] \n \n \n New ?_shape=objects/object/lists param for JSON API ( #192 ) \n New _shape= parameter replacing old .jsono extension \n Now instead of this: \n /database/table.jsono \n We use the _shape parameter like this: \n /database/table.json?_shape=objects \n Also introduced a new _shape called object which looks like this: \n /database/table.json?_shape=object \n Returning an object for the rows key: \n ...\n\"rows\": {\n \"pk1\": {\n ...\n },\n \"pk2\": {\n ...\n }\n} \n Refs #122 \n \n \n Utility for writing test database fixtures to a .db file \n python tests/fixtures.py /tmp/hello.db \n This is useful for making a SQLite database of the test fixtures for\n interactive exploration. \n \n \n Compound primary key _next= now plays well with extra filters \n Closes #190 \n \n \n Fixed bug with keyset pagination over compound primary keys \n Refs #190 \n \n \n Database/Table views inherit source/license/source_url/license_url \n metadata \n If you set the source_url/license_url/source/license fields in your root\n metadata those values will now be inherited all the way down to the database\n and table templates. \n The title/description are NOT inherited. \n Also added unit tests for the HTML generated by the metadata. \n Refs #185 \n \n \n Add metadata, if it exists, to heroku temp dir ( #178 ) [Tony Hirst] \n \n \n Initial documentation for pagination \n \n \n Broke up test_app into test_api and test_html \n \n \n Fixed bug with .json path regular expression \n I had a table called geojson and it caused an exception because the regex\n was matching .json and not \\.json \n \n \n Deploy to Heroku with Python 3.6.3", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/194\", \"label\": \"#194\"}, {\"href\": \"https://github.com/simonw/datasette/issues/189\", \"label\": \"#189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/189\", \"label\": \"#189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/189\", \"label\": \"#189\"}, {\"href\": \"https://github.com/simonw/datasette/issues/150\", \"label\": \"#150\"}, {\"href\": \"https://github.com/simonw/datasette/issues/180\", \"label\": \"#180\"}, {\"href\": \"https://github.com/simonw/datasette/issues/192\", \"label\": \"#192\"}, {\"href\": \"https://github.com/simonw/datasette/issues/122\", \"label\": \"#122\"}, {\"href\": \"https://github.com/simonw/datasette/issues/190\", \"label\": \"#190\"}, {\"href\": \"https://github.com/simonw/datasette/issues/190\", \"label\": \"#190\"}, {\"href\": \"https://github.com/simonw/datasette/issues/185\", \"label\": \"#185\"}, {\"href\": \"https://github.com/simonw/datasette/issues/178\", \"label\": \"#178\"}]"} {"id": "changelog:id118", "page": "changelog", "ref": "id118", "title": "0.22 (2018-05-20)", "content": "The big new feature in this release is Facets . Datasette can now apply faceted browse to any column in any table. It will also suggest possible facets. See the Datasette Facets announcement post for more details. 
\n In addition to the work on facets: \n \n \n Added docs for introspection endpoints \n \n \n New --config option, added --help-config , closes #274 \n Removed the --page_size= argument to datasette serve in favour of: \n datasette serve --config default_page_size:50 mydb.db \n Added new help section: \n datasette --help-config \n Config options:\n default_page_size Default page size for the table view\n (default=100)\n max_returned_rows Maximum rows that can be returned from a table\n or custom query (default=1000)\n sql_time_limit_ms Time limit for a SQL query in milliseconds\n (default=1000)\n default_facet_size Number of values to return for requested facets\n (default=30)\n facet_time_limit_ms Time limit for calculating a requested facet\n (default=200)\n facet_suggest_time_limit_ms Time limit for calculating a suggested facet\n (default=50) \n \n \n Only apply responsive table styles to .rows-and-column \n Otherwise they interfere with tables in the description, e.g. on\n https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo \n \n \n Refactored views into new views/ modules, refs #256 \n \n \n Documentation for SQLite full-text search support, closes #253 \n \n \n /-/versions now includes SQLite fts_versions , closes #252", "breadcrumbs": "[\"Changelog\"]", "references": "[{\"href\": \"https://simonwillison.net/2018/May/20/datasette-facets/\", \"label\": \"Datasette Facets\"}, {\"href\": \"https://docs.datasette.io/en/stable/introspection.html\", \"label\": \"docs for introspection endpoints\"}, {\"href\": \"https://github.com/simonw/datasette/issues/274\", \"label\": \"#274\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo\", \"label\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo\"}, {\"href\": \"https://github.com/simonw/datasette/issues/256\", \"label\": \"#256\"}, {\"href\": \"https://docs.datasette.io/en/stable/full_text_search.html\", \"label\": \"Documentation for SQLite full-text search\"}, {\"href\": \"https://github.com/simonw/datasette/issues/253\", \"label\": \"#253\"}, {\"href\": \"https://github.com/simonw/datasette/issues/252\", \"label\": \"#252\"}]"} {"id": "getting_started:getting-started-demo", "page": "getting_started", "ref": "getting-started-demo", "title": "Play with a live demo", "content": "The best way to experience Datasette for the first time is with a demo: \n \n \n global-power-plants.datasettes.com provides a searchable database of power plants around the world, using data from the World Resources Institute, rendered using the datasette-cluster-map plugin.
\n \n \n fivethirtyeight.datasettes.com shows Datasette running against over 400 datasets imported from the FiveThirtyEight GitHub repository .", "breadcrumbs": "[\"Getting started\"]", "references": "[{\"href\": \"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants\", \"label\": \"global-power-plants.datasettes.com\"}, {\"href\": \"https://www.wri.org/publication/global-power-plant-database\", \"label\": \"World Resources Institute\"}, {\"href\": \"https://github.com/simonw/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}, {\"href\": \"https://fivethirtyeight.datasettes.com/fivethirtyeight\", \"label\": \"fivethirtyeight.datasettes.com\"}, {\"href\": \"https://github.com/fivethirtyeight/data\", \"label\": \"FiveThirtyEight GitHub repository\"}]"} {"id": "testing_plugins:testing-plugins-datasette-test-instance", "page": "testing_plugins", "ref": "testing-plugins-datasette-test-instance", "title": "Setting up a Datasette test instance", "content": "The above example shows the easiest way to start writing tests against a Datasette instance: \n from datasette.app import Datasette\nimport pytest\n\n\n@pytest.mark.asyncio\nasync def test_plugin_is_installed():\n    datasette = Datasette(memory=True)\n    response = await datasette.client.get(\"/-/plugins.json\")\n    assert response.status_code == 200 \n Creating a Datasette() instance like this is a useful shortcut in tests, but there is one detail you need to be aware of. It's important to ensure that the async method .invoke_startup() is called on that instance. You can do that like this: \n datasette = Datasette(memory=True)\nawait datasette.invoke_startup() \n This method registers any startup(datasette) or prepare_jinja2_environment(env, datasette) plugins that might themselves need to make async calls. \n If you are using await datasette.client.get() and similar methods then you don't need to worry about this - Datasette automatically calls invoke_startup() the first time it handles a request.", "breadcrumbs": "[\"Testing plugins\"]", "references": "[]"} {"id": "spatialite:spatialite-warning", "page": "spatialite", "ref": "spatialite-warning", "title": "Warning", "content": "The SpatiaLite extension adds a large number of additional SQL functions , some of which are not safe for untrusted users to execute: they may cause the Datasette server to crash. \n You should not expose a SpatiaLite-enabled Datasette instance to the public internet without taking extra measures to secure it against potentially harmful SQL queries. \n The following steps are recommended: \n \n \n Disable arbitrary SQL queries by untrusted users. See Controlling the ability to execute arbitrary SQL for ways to do this. The easiest is to start Datasette with the datasette --setting default_allow_sql off option. \n \n \n Define Canned queries with the SQL queries that use SpatiaLite functions that you want people to be able to execute. For example, see the datasette.yaml sketch below.
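\n A minimal datasette.yaml sketch of such a canned query - the database, table, and column names here are placeholders, and within() / GeomFromText() are the SpatiaLite functions described later in this document: \n databases:\n  mydatabase:\n    queries:\n      places_containing_point:\n        sql: |-\n          select name from places\n          where within(GeomFromText(:point), places.geom)\n        title: Places whose geometry contains a point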
\n \n \n The Datasette SpatiaLite tutorial includes detailed instructions for running SpatiaLite safely using these techniques.", "breadcrumbs": "[\"SpatiaLite\"]", "references": "[{\"href\": \"https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.1.html\", \"label\": \"a large number of additional SQL functions\"}, {\"href\": \"https://datasette.io/tutorials/spatialite\", \"label\": \"Datasette SpatiaLite tutorial\"}]"} {"id": "pages:pages", "page": "pages", "ref": "pages", "title": "Pages and API endpoints", "content": "The Datasette web application offers a number of different pages that can be accessed to explore the data in question, each of which is accompanied by an equivalent JSON API.", "breadcrumbs": "[]", "references": "[]"} {"id": "json_api:id2", "page": "json_api", "ref": "id2", "title": "Table arguments", "content": "The Datasette table view takes a number of special query string arguments.", "breadcrumbs": "[\"JSON API\"]", "references": "[]"} {"id": "spatialite:querying-polygons-using-within", "page": "spatialite", "ref": "querying-polygons-using-within", "title": "Querying polygons using within()", "content": "The within() SQL function can be used to check if a point is within a geometry: \n select\n name\nfrom\n places\nwhere\n within(GeomFromText('POINT(-3.1724366 51.4704448)'), places.geom); \n The GeomFromText() function takes a string of well-known text. Note that the order used here is longitude then latitude . \n To run that same within() query in a way that benefits from the spatial index, use the following: \n select\n name\nfrom\n places\nwhere\n within(GeomFromText('POINT(-3.1724366 51.4704448)'), places.geom)\n and rowid in (\n SELECT pkid FROM idx_places_geom\n where xmin < -3.1724366\n and xmax > -3.1724366\n and ymin < 51.4704448\n and ymax > 51.4704448\n );", "breadcrumbs": "[\"SpatiaLite\"]", "references": "[]"} {"id": "csv_export:streaming-all-records", "page": "csv_export", "ref": "streaming-all-records", "title": "Streaming all records", "content": "The stream all rows option is designed to be as efficient as possible -\n under the hood it takes advantage of Python 3 asyncio capabilities and\n Datasette's efficient pagination to stream back the full\n CSV file. \n Since databases can get pretty large, by default this option is capped at 100MB -\n if a table returns more than 100MB of data the last line of the CSV will be a\n truncation error message. \n You can increase or remove this limit using the max_csv_mb config\n setting. You can also disable the CSV export feature entirely using\n allow_csv_stream .", "breadcrumbs": "[\"CSV export\"]", "references": "[]"} {"id": "spatialite:importing-shapefiles-into-spatialite", "page": "spatialite", "ref": "importing-shapefiles-into-spatialite", "title": "Importing shapefiles into SpatiaLite", "content": "The shapefile format is a common format for distributing geospatial data. You can use the spatialite command-line tool to create a new database table from a shapefile. \n Try it now with the North America shapefile available from the University of North Carolina Global River Database project.
Download the file and unzip it (this will create files called narivs.dbf , narivs.prj , narivs.shp and narivs.shx in the current directory), then run the following: \n spatialite rivers-database.db \n SpatiaLite version ..: 4.3.0a Supported Extensions:\n...\nspatialite> .loadshp narivs rivers CP1252 23032\n========\nLoading shapefile at 'narivs' into SQLite table 'rivers'\n...\nInserted 467973 rows into 'rivers' from SHAPEFILE \n This will load the data from the narivs shapefile into a new database table called rivers . \n Exit out of spatialite (using Ctrl+D ) and run Datasette against your new database like this: \n datasette rivers-database.db \\\n --load-extension=/usr/local/lib/mod_spatialite.dylib \n If you browse to http://localhost:8001/rivers-database/rivers you will see the new table... but the Geometry column will contain unreadable binary data (SpatiaLite uses a custom format based on WKB ). \n The easiest way to turn this into semi-readable data is to use the SpatiaLite AsGeoJSON function. Try the following using the SQL query interface at http://localhost:8001/rivers-database : \n select *, AsGeoJSON(Geometry) from rivers limit 10; \n This will give you back an additional column of GeoJSON. You can copy and paste GeoJSON from this column into the debugging tool at geojson.io to visualize it on a map. \n To see a more interesting example, try ordering the records with the longest geometry first. Since there are 467,000 rows in the table you will first need to increase the SQL time limit imposed by Datasette: \n datasette rivers-database.db \\\n --load-extension=/usr/local/lib/mod_spatialite.dylib \\\n --setting sql_time_limit_ms 10000 \n Now try the following query: \n select *, AsGeoJSON(Geometry) from rivers\norder by length(Geometry) desc limit 10;", "breadcrumbs": "[\"SpatiaLite\"]", "references": "[{\"href\": \"https://en.wikipedia.org/wiki/Shapefile\", \"label\": \"shapefile format\"}, {\"href\": \"http://gaia.geosci.unc.edu/rivers/\", \"label\": \"Global River Database\"}, {\"href\": \"https://www.gaia-gis.it/gaia-sins/BLOB-Geometry.html\", \"label\": \"a custom format based on WKB\"}, {\"href\": \"https://geojson.io/\", \"label\": \"geojson.io\"}]"} {"id": "changelog:plugin-hooks-and-internals", "page": "changelog", "ref": "plugin-hooks-and-internals", "title": "Plugin hooks and internals", "content": "The prepare_jinja2_environment(env, datasette) plugin hook now accepts an optional datasette argument. Hook implementations can also now return an async function which will be awaited automatically. ( #1809 ) \n \n \n Database(is_mutable=) now defaults to True . ( #1808 ) \n \n \n The datasette.check_visibility() method now accepts an optional permissions= list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. ( #1829 ) \n \n \n Datasette no longer enforces upper bounds on its dependencies. 
( #1800 )", "breadcrumbs": "[\"Changelog\", \"0.63 (2022-10-27)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/1809\", \"label\": \"#1809\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1808\", \"label\": \"#1808\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1829\", \"label\": \"#1829\"}, {\"href\": \"https://github.com/simonw/datasette/issues/1800\", \"label\": \"#1800\"}]"} {"id": "changelog:new-plugin-hook-extra-template-vars", "page": "changelog", "ref": "new-plugin-hook-extra-template-vars", "title": "New plugin hook: extra_template_vars", "content": "The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540 ).", "breadcrumbs": "[\"Changelog\", \"0.29 (2019-07-07)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette-auth-github\", \"label\": \"datasette-auth-github\"}, {\"href\": \"https://github.com/simonw/datasette/issues/540\", \"label\": \"#540\"}]"} {"id": "changelog:log-out", "page": "changelog", "ref": "log-out", "title": "Log out", "content": "The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism ) to authenticate users. The new /-/logout page provides a way to clear that cookie. \n A \"Log out\" button now shows in the global navigation provided the user is authenticated using the ds_actor cookie. ( #840 )", "breadcrumbs": "[\"Changelog\", \"0.45 (2020-07-01)\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/840\", \"label\": \"#840\"}]"} {"id": "internals:database-results", "page": "internals", "ref": "database-results", "title": "Results", "content": "The db.execute() method returns a single Results object. This can be used to access the rows returned by the query. \n Iterating over a Results object will yield SQLite Row objects . Each of these can be treated as a tuple or can be accessed using row[\"column\"] syntax: \n info = []\nresults = await db.execute(\"select name from sqlite_master\")\nfor row in results:\n info.append(row[\"name\"]) \n The Results object also has the following properties and methods: \n \n \n .truncated - boolean \n \n Indicates if this query was truncated - if it returned more results than the specified page_size . If this is true then the results object will only provide access to the first page_size rows in the query result. You can disable truncation by passing truncate=False to the db.query() method. \n \n \n \n .columns - list of strings \n \n A list of column names returned by the query. \n \n \n \n .rows - list of sqlite3.Row \n \n This property provides direct access to the list of rows returned by the database. You can access specific rows by index using results.rows[0] . \n \n \n \n .first() - row or None \n \n Returns the first row in the results, or None if no rows were returned. \n \n \n \n .single_value() \n \n Returns the value of the first column of the first row of results - but only if the query returned a single row with a single column. Raises a datasette.database.MultipleValues exception otherwise. 
\n \n \n \n .__len__() \n \n Calling len(results) returns the (truncated) number of returned results.", "breadcrumbs": "[\"Internals for plugins\", \"Database class\"]", "references": "[{\"href\": \"https://docs.python.org/3/library/sqlite3.html#row-objects\", \"label\": \"Row objects\"}]"} {"id": "installation:loading-spatialite", "page": "installation", "ref": "loading-spatialite", "title": "Loading SpatiaLite", "content": "The datasetteproject/datasette image includes a recent version of the\n SpatiaLite extension for SQLite. To load and enable that\n module, use the following command: \n docker run -p 8001:8001 -v `pwd`:/mnt \\\n datasetteproject/datasette \\\n datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \\\n --load-extension=spatialite \n You can confirm that SpatiaLite is successfully loaded by visiting\n http://127.0.0.1:8001/-/versions", "breadcrumbs": "[\"Installation\", \"Advanced installation options\", \"Using Docker\"]", "references": "[{\"href\": \"http://127.0.0.1:8001/-/versions\", \"label\": \"http://127.0.0.1:8001/-/versions\"}]"} {"id": "javascript_plugins:javascript-datasette-manager", "page": "javascript_plugins", "ref": "javascript-datasette-manager", "title": "datasetteManager", "content": "The datasetteManager object exposes the following properties and methods: \n \n \n VERSION - string \n \n The version of Datasette \n \n \n \n plugins - Map() \n \n A Map of currently loaded plugin names to plugin implementations \n \n \n \n registerPlugin(name, implementation) \n \n Call this to register a plugin, passing its name and implementation \n \n \n \n selectors - object \n \n An object providing named aliases to useful CSS selectors, listed below", "breadcrumbs": "[\"JavaScript plugins\"]", "references": "[]"} {"id": "internals:internals-utils", "page": "internals", "ref": "internals-utils", "title": "The datasette.utils module", "content": "The datasette.utils module contains various utility functions used by Datasette. As a general rule you should consider anything in this module to be unstable - functions and classes here could change without warning or be removed entirely between Datasette releases, without being mentioned in the release notes. \n The exception to this rule is anything that is documented here. If you find a need for an undocumented utility function in your own work, consider opening an issue requesting that the function you are using be upgraded to documented and supported status.", "breadcrumbs": "[\"Internals for plugins\"]", "references": "[{\"href\": \"https://github.com/simonw/datasette/issues/new\", \"label\": \"opening an issue\"}]"}
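\n To round out the Results documentation above, here is a minimal sketch exercising db.execute() , .first() , .single_value() and len() - the in-memory instance and the query are illustrative assumptions, not a documented recipe: \n import asyncio\n\nfrom datasette.app import Datasette\n\n\nasync def demo():\n    datasette = Datasette(memory=True)\n    await datasette.invoke_startup()\n    # \"_memory\" is assumed to be the name of the in-memory database\n    # created by memory=True\n    db = datasette.get_database(\"_memory\")\n    results = await db.execute(\"select 1 + 1 as total\")\n    print(results.first()[\"total\"])  # 2 - rows support access by column name\n    print(results.single_value())  # also 2: exactly one row, one column\n    print(len(results))  # 1 - the (truncated) number of returned rows\n\n\nasyncio.run(demo())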