id,page,ref,title,content,breadcrumbs,references
changelog:id70,changelog,id70,0.36 (2020-02-21),"The datasette object passed to plugins now has API documentation: Datasette class . ( #576 )
New methods on datasette : .add_database() and .remove_database() - documentation . ( #671 )
prepare_connection() plugin hook now takes optional datasette and database arguments - prepare_connection(conn, database, datasette) . ( #678 )
Added three new plugins and one new conversion tool to The Datasette Ecosystem .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/576"", ""label"": ""#576""}, {""href"": ""https://github.com/simonw/datasette/issues/671"", ""label"": ""#671""}, {""href"": ""https://github.com/simonw/datasette/issues/678"", ""label"": ""#678""}]"
changelog:id71,changelog,id71,0.35 (2020-02-04),"Added five new plugins and one new conversion tool to The Datasette Ecosystem .
The Datasette class has a new render_template() method which can be used by plugins to render templates using Datasette's pre-configured Jinja templating library.
You can now execute SQL queries that start with a -- comment - thanks, Jay Graves ( #653 )","[""Changelog""]","[{""href"": ""https://jinja.palletsprojects.com/"", ""label"": ""Jinja""}, {""href"": ""https://github.com/simonw/datasette/pull/653"", ""label"": ""#653""}]"
changelog:id72,changelog,id72,0.34 (2020-01-29),"_search= queries are now correctly escaped using a new escape_fts() custom SQL function. This means you can now run searches for strings like park. without seeing errors. ( #651 )
Google Cloud Run is no longer in beta, so datasette publish cloudrun has been updated to work even if the user has not installed the gcloud beta components package. Thanks, Katie McLaughlin ( #660 )
datasette package now accepts a --port option for specifying which port the resulting Docker container should listen on. ( #661 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/651"", ""label"": ""#651""}, {""href"": ""https://cloud.google.com/run/"", ""label"": ""Google Cloud Run""}, {""href"": ""https://github.com/simonw/datasette/pull/660"", ""label"": ""#660""}, {""href"": ""https://github.com/simonw/datasette/issues/661"", ""label"": ""#661""}]"
changelog:id73,changelog,id73,0.33 (2019-12-22),"rowid is now included in dropdown menus for filtering tables ( #636 )
Columns are now only suggested for faceting if they have at least one value with more than one record ( #638 )
Queries with no results now display ""0 results"" ( #637 )
Improved documentation for the --static option ( #641 )
asyncio task information is now included on the /-/threads debug page
Bumped Uvicorn dependency to 0.11
You can now use --port 0 to listen on an available port
New template_debug setting for debugging templates, e.g. https://latest.datasette.io/fixtures/roadside_attractions?_context=1 ( #654 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/636"", ""label"": ""#636""}, {""href"": ""https://github.com/simonw/datasette/issues/638"", ""label"": ""#638""}, {""href"": ""https://github.com/simonw/datasette/issues/637"", ""label"": ""#637""}, {""href"": ""https://github.com/simonw/datasette/issues/641"", ""label"": ""#641""}, {""href"": ""https://latest.datasette.io/fixtures/roadside_attractions?_context=1"", ""label"": ""https://latest.datasette.io/fixtures/roadside_attractions?_context=1""}, {""href"": ""https://github.com/simonw/datasette/issues/654"", ""label"": ""#654""}]"
changelog:id74,changelog,id74,0.32 (2019-11-14),"Datasette now renders templates using Jinja async mode . This means plugins can provide custom template functions that perform asynchronous actions, for example the new datasette-template-sql plugin which allows custom templates to directly execute SQL queries and render their results. ( #628 )","[""Changelog""]","[{""href"": ""https://jinja.palletsprojects.com/en/2.10.x/api/#async-support"", ""label"": ""Jinja async mode""}, {""href"": ""https://github.com/simonw/datasette-template-sql"", ""label"": ""datasette-template-sql""}, {""href"": ""https://github.com/simonw/datasette/issues/628"", ""label"": ""#628""}]"
changelog:id75,changelog,id75,0.31.2 (2019-11-13),"Fixed a bug where datasette publish heroku applications failed to start ( #633 )
Fix for datasette publish with just --source_url - thanks, Stanley Zheng ( #572 )
Deployments to Heroku now use Python 3.8.0 ( #632 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/633"", ""label"": ""#633""}, {""href"": ""https://github.com/simonw/datasette/issues/572"", ""label"": ""#572""}, {""href"": ""https://github.com/simonw/datasette/issues/632"", ""label"": ""#632""}]"
changelog:id76,changelog,id76,0.31.1 (2019-11-12),Deployments created using datasette publish now use python:3.8 base Docker image ( #629 ),"[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/pull/629"", ""label"": ""#629""}]"
changelog:id77,changelog,id77,0.31 (2019-11-11),"This version adds compatibility with Python 3.8 and breaks compatibility with Python 3.5.
If you are still running Python 3.5 you should stick with 0.30.2 , which you can install like this:
pip install datasette==0.30.2
Format SQL button now works with read-only SQL queries - thanks, Tobias Kunze ( #602 )
New ?column__notin=x,y,z filter for table views ( #614 )
Table view now uses select col1, col2, col3 instead of select *
Database filenames can now contain spaces - thanks, Tobias Kunze ( #590 )
Removed obsolete ?_group_count=col feature ( #504 )
Improved user interface and documentation for datasette publish cloudrun ( #608 )
Tables with indexes now show the CREATE INDEX statements on the table page ( #618 )
Current version of uvicorn is now shown on /-/versions
Python 3.8 is now supported! ( #622 )
Python 3.5 is no longer supported.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/pull/602"", ""label"": ""#602""}, {""href"": ""https://github.com/simonw/datasette/issues/614"", ""label"": ""#614""}, {""href"": ""https://github.com/simonw/datasette/pull/590"", ""label"": ""#590""}, {""href"": ""https://github.com/simonw/datasette/issues/504"", ""label"": ""#504""}, {""href"": ""https://github.com/simonw/datasette/issues/608"", ""label"": ""#608""}, {""href"": ""https://github.com/simonw/datasette/issues/618"", ""label"": ""#618""}, {""href"": ""https://www.uvicorn.org/"", ""label"": ""uvicorn""}, {""href"": ""https://github.com/simonw/datasette/issues/622"", ""label"": ""#622""}]"
changelog:id78,changelog,id78,0.30.2 (2019-11-02),"/-/plugins page now uses distribution name e.g. datasette-cluster-map instead of the name of the underlying Python package ( datasette_cluster_map ) ( #606 )
Array faceting is now only suggested for columns that contain arrays of strings ( #562 )
Better documentation for the --host argument ( #574 )
Don't show None with a broken link for the label on a nullable foreign key ( #406 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/606"", ""label"": ""#606""}, {""href"": ""https://github.com/simonw/datasette/issues/562"", ""label"": ""#562""}, {""href"": ""https://github.com/simonw/datasette/issues/574"", ""label"": ""#574""}, {""href"": ""https://github.com/simonw/datasette/issues/406"", ""label"": ""#406""}]"
changelog:id79,changelog,id79,0.30.1 (2019-10-30),"Fixed bug where ?_where= parameter was not persisted in hidden form fields ( #604 )
Fixed bug with .json representation of row pages - thanks, Chris Shaw ( #603 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/604"", ""label"": ""#604""}, {""href"": ""https://github.com/simonw/datasette/issues/603"", ""label"": ""#603""}]"
changelog:id8,changelog,id8,0.63.3 (2022-12-17),"Fixed a bug where datasette --root , when running in Docker, would only output the URL to sign in as root when the server shut down, not when it started up. ( #1958 )
You no longer need to ensure await datasette.invoke_startup() has been called in order for Datasette to start correctly serving requests - this is now handled automatically the first time the server receives a request. This fixes a bug experienced when Datasette is served directly by an ASGI application server such as Uvicorn or Gunicorn. It also fixes a bug with the datasette-gunicorn plugin. ( #1955 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1958"", ""label"": ""#1958""}, {""href"": ""https://datasette.io/plugins/datasette-gunicorn"", ""label"": ""datasette-gunicorn""}, {""href"": ""https://github.com/simonw/datasette/issues/1955"", ""label"": ""#1955""}]"
changelog:id80,changelog,id80,0.30 (2019-10-18),"Added /-/threads debugging page
Allow EXPLAIN WITH... ( #583 )
Button to format SQL - thanks, Tobias Kunze ( #136 )
Sort databases on homepage by argument order - thanks, Tobias Kunze ( #585 )
Display metadata footer on custom SQL queries - thanks, Tobias Kunze ( #589 )
Use --platform=managed for publish cloudrun ( #587 )
Fixed bug returning non-ASCII characters in CSV ( #584 )
Fix for /foo vs. /foo-bar bug ( #601 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/583"", ""label"": ""#583""}, {""href"": ""https://github.com/simonw/datasette/issues/136"", ""label"": ""#136""}, {""href"": ""https://github.com/simonw/datasette/issues/585"", ""label"": ""#585""}, {""href"": ""https://github.com/simonw/datasette/pull/589"", ""label"": ""#589""}, {""href"": ""https://github.com/simonw/datasette/issues/587"", ""label"": ""#587""}, {""href"": ""https://github.com/simonw/datasette/issues/584"", ""label"": ""#584""}, {""href"": ""https://github.com/simonw/datasette/issues/601"", ""label"": ""#601""}]"
changelog:id81,changelog,id81,0.29.3 (2019-09-02),"Fixed implementation of CodeMirror on database page ( #560 )
Documentation typo fixes - thanks, Min ho Kim ( #561 )
Mechanism for detecting if a table has FTS enabled now works if the table name used alternative escaping mechanisms ( #570 ) - for compatibility with a recent change to sqlite-utils .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/560"", ""label"": ""#560""}, {""href"": ""https://github.com/simonw/datasette/pull/561"", ""label"": ""#561""}, {""href"": ""https://github.com/simonw/datasette/issues/570"", ""label"": ""#570""}, {""href"": ""https://github.com/simonw/sqlite-utils/pull/57"", ""label"": ""a recent change to sqlite-utils""}]"
changelog:id82,changelog,id82,0.29.2 (2019-07-13),"Bumped Uvicorn to 0.8.4, fixing a bug where the query string was not included in the server logs. ( #559 )
Fixed bug where the navigation breadcrumbs were not displayed correctly on the page for a custom query. ( #558 )
Fixed bug where custom query names containing unicode characters caused errors.","[""Changelog""]","[{""href"": ""https://www.uvicorn.org/"", ""label"": ""Uvicorn""}, {""href"": ""https://github.com/simonw/datasette/issues/559"", ""label"": ""#559""}, {""href"": ""https://github.com/simonw/datasette/issues/558"", ""label"": ""#558""}]"
changelog:id83,changelog,id83,0.29.1 (2019-07-11),"Fixed bug with static mounts using relative paths which could lead to traversal exploits ( #555 ) - thanks Abdussamet Kocak!
Datasette can now be run as a module: python -m datasette ( #556 ) - thanks, Abdussamet Kocak!","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/555"", ""label"": ""#555""}, {""href"": ""https://github.com/simonw/datasette/issues/556"", ""label"": ""#556""}]"
changelog:id84,changelog,id84,0.29 (2019-07-07),"ASGI, new plugin hooks, facet by date and much, much more...","[""Changelog""]",[]
changelog:id85,changelog,id85,0.28 (2019-05-19),A salmagundi of new features!,"[""Changelog""]","[{""href"": ""https://adamj.eu/tech/2019/01/18/a-salmagundi-of-django-alpha-announcements/"", ""label"": ""salmagundi""}]"
changelog:id87,changelog,id87,0.27.1 (2019-05-09),"Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.","[""Changelog""]",[]
changelog:id88,changelog,id88,0.27 (2019-01-31),"New command: datasette plugins ( documentation ) shows you the currently installed list of plugins.
Datasette can now output newline-delimited JSON using the new ?_shape=array&_nl=on query string option.
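For example, a hypothetical /mydb/mytable.json?_shape=array&_nl=on URL would return one JSON object per line:
{""id"": 1, ""name"": ""Alpha""}
{""id"": 2, ""name"": ""Beta""}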
Added documentation on The Datasette Ecosystem .
Now using Python 3.7.2 as the base for the official Datasette Docker image .","[""Changelog""]","[{""href"": ""http://ndjson.org/"", ""label"": ""newline-delimited JSON""}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette/"", ""label"": ""Datasette Docker image""}]"
changelog:id89,changelog,id89,0.26.1 (2019-01-10),"/-/versions now includes SQLite compile_options ( #396 )
datasetteproject/datasette Docker image now uses SQLite 3.26.0 ( #397 )
Cleaned up some deprecation warnings under Python 3.7","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/396"", ""label"": ""#396""}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": ""datasetteproject/datasette""}, {""href"": ""https://github.com/simonw/datasette/issues/397"", ""label"": ""#397""}]"
changelog:id9,changelog,id9,0.63.2 (2022-11-18),"Fixed a bug in datasette publish heroku where deployments failed due to an older version of Python being requested. ( #1905 )
New datasette publish heroku --generate-dir option for generating a Heroku deployment directory without deploying it.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1905"", ""label"": ""#1905""}]"
changelog:id90,changelog,id90,0.26 (2019-01-02),"datasette serve --reload now restarts Datasette if a database file changes on disk.
datasette publish now now takes an optional --alias mysite.now.sh argument. This will attempt to set an alias after the deploy completes.
Fixed a bug where the advanced CSV export form failed to include the currently selected filters ( #393 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/393"", ""label"": ""#393""}]"
changelog:id91,changelog,id91,0.25.2 (2018-12-16),"datasette publish heroku now uses the python-3.6.7 runtime
Added documentation on how to build the documentation
Added documentation covering our release process
Upgraded to pytest 4.0.2","[""Changelog""]",[]
changelog:id92,changelog,id92,0.25.1 (2018-11-04),"Documentation improvements plus a fix for publishing to Zeit Now.
datasette publish now now uses Zeit's v1 platform, to work around the new 100MB image limit. Thanks, @slygent - closes #366 .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/366"", ""label"": ""#366""}]"
changelog:id93,changelog,id93,0.25 (2018-09-19),"New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite.
New publish_subcommand plugin hook. A plugin can now add additional datasette publish publishers in addition to the default now and heroku , both of which have been refactored into default plugins. publish_subcommand documentation . Closes #349
New render_cell plugin hook. Plugins can now customize how values are displayed in the HTML tables produced by Datasette's browsable interface. datasette-json-html and datasette-render-images are two new plugins that use this hook. render_cell documentation . Closes #352
New extra_body_script plugin hook, enabling plugins to provide additional JavaScript that should be added to the page footer. extra_body_script documentation .
extra_css_urls and extra_js_urls hooks now take additional optional parameters, allowing them to be more selective about which pages they apply to. Documentation .
You can now use the sortable_columns metadata setting to explicitly enable sort-by-column in the interface for database views, as well as for specific tables.
The new fts_table and fts_pk metadata settings can now be used to explicitly configure full-text search for a table or a view , even if that table is not directly coupled to the SQLite FTS feature in the database schema itself.
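A rough metadata.json sketch combining these settings - the database, view and column names here are hypothetical:
{
    ""databases"": {
        ""mydb"": {
            ""tables"": {
                ""my_view"": {
                    ""sortable_columns"": [""name""],
                    ""fts_table"": ""my_fts"",
                    ""fts_pk"": ""id""
                }
            }
        }
    }
}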
Datasette will now use pysqlite3 in place of the standard library sqlite3 module if it has been installed in the current environment. This makes it much easier to run Datasette against a more recent version of SQLite, including the just-released SQLite 3.25.0 which adds window function support. More details on how to use this in #360
New mechanism that allows plugin configuration options to be set using metadata.json .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/349"", ""label"": ""#349""}, {""href"": ""https://github.com/simonw/datasette-json-html"", ""label"": ""datasette-json-html""}, {""href"": ""https://github.com/simonw/datasette-render-images"", ""label"": ""datasette-render-images""}, {""href"": ""https://github.com/simonw/datasette/issues/352"", ""label"": ""#352""}, {""href"": ""https://github.com/coleifer/pysqlite3"", ""label"": ""pysqlite3""}, {""href"": ""https://www.sqlite.org/releaselog/3_25_0.html"", ""label"": ""SQLite 3.25.0""}, {""href"": ""https://github.com/simonw/datasette/issues/360"", ""label"": ""#360""}]"
changelog:id94,changelog,id94,0.24 (2018-07-23),"A number of small new features:
datasette publish heroku now supports --extra-options , fixes #334
Custom error message if SpatiaLite is needed for specified database, closes #331
New config option: truncate_cells_html for truncating long cell values in HTML view - closes #330
Documentation for datasette publish and datasette package , closes #337
Fixed compatibility with Python 3.7
datasette publish heroku now supports app names via the -n option, which can also be used to overwrite an existing application [Russ Garrett]
Title and description metadata can now be set for canned SQL queries , closes #342
New force_https_on config option, fixes https:// API URLs when deploying to Zeit Now - closes #333
?_json_infinity=1 query string argument for handling Infinity/-Infinity values in JSON, closes #332
URLs displayed in the results of custom SQL queries are now URLified, closes #298","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/334"", ""label"": ""#334""}, {""href"": ""https://github.com/simonw/datasette/issues/331"", ""label"": ""#331""}, {""href"": ""https://github.com/simonw/datasette/issues/330"", ""label"": ""#330""}, {""href"": ""https://github.com/simonw/datasette/issues/337"", ""label"": ""#337""}, {""href"": ""https://github.com/simonw/datasette/issues/342"", ""label"": ""#342""}, {""href"": ""https://github.com/simonw/datasette/issues/333"", ""label"": ""#333""}, {""href"": ""https://github.com/simonw/datasette/issues/332"", ""label"": ""#332""}, {""href"": ""https://github.com/simonw/datasette/issues/298"", ""label"": ""#298""}]"
changelog:v1-0-a0,changelog,v1-0-a0,1.0a0 (2022-11-29),"This first alpha release of Datasette 1.0 introduces a brand new collection of APIs for writing to the database ( #1850 ), as well as a new API token mechanism baked into Datasette core. Previously, API tokens have only been supported by installing additional plugins.
This is very much a preview: expect many more backwards incompatible API changes prior to the full 1.0 release.
Feedback enthusiastically welcomed, either through issue comments or via the Datasette Discord community.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1850"", ""label"": ""#1850""}, {""href"": ""https://github.com/simonw/datasette/issues/1850"", ""label"": ""issue comments""}, {""href"": ""https://datasette.io/discord"", ""label"": ""Datasette Discord""}]"
changelog:v1-0-a1,changelog,v1-0-a1,1.0a1 (2022-12-01),"Write APIs now serve correct CORS headers if Datasette is started in --cors mode. See the full list of CORS headers in the documentation. ( #1922 )
Fixed a bug where the _memory database could be written to even though writes were not persisted. ( #1917 )
The https://latest.datasette.io/ demo instance now includes an ephemeral database which can be used to test Datasette's write APIs, using the new datasette-ephemeral-tables plugin to drop any created tables after five minutes. This database is only available if you sign in as the root user using the link on the homepage. ( #1915 )
Fixed a bug where hitting the write endpoints with a GET request returned a 500 error. It now returns a 405 (method not allowed) error instead. ( #1916 )
The list of endpoints in the API explorer now lists mutable databases first. ( #1918 )
The ""ignore"": true and ""replace"": true options for the insert API are now documented . ( #1924 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1922"", ""label"": ""#1922""}, {""href"": ""https://github.com/simonw/datasette/issues/1917"", ""label"": ""#1917""}, {""href"": ""https://latest.datasette.io/"", ""label"": ""https://latest.datasette.io/""}, {""href"": ""https://datasette.io/plugins/datasette-ephemeral-tables"", ""label"": ""datasette-ephemeral-tables""}, {""href"": ""https://github.com/simonw/datasette/issues/1915"", ""label"": ""#1915""}, {""href"": ""https://github.com/simonw/datasette/issues/1916"", ""label"": ""#1916""}, {""href"": ""https://github.com/simonw/datasette/issues/1918"", ""label"": ""#1918""}, {""href"": ""https://github.com/simonw/datasette/issues/1924"", ""label"": ""#1924""}]"
changelog:v1-0-a10,changelog,v1-0-a10,1.0a10 (2024-02-17),"The only changes in this alpha correspond to the way Datasette handles database transactions. ( #2277 )
The database.execute_write_fn() method has a new transaction=True parameter. This defaults to True which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to handle transactions on their own, and many did not.
Pass transaction=False to execute_write_fn() if you want to manually handle transactions in your function.
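A rough sketch of the two modes, assuming db is a Datasette Database object and a docs table exists (run inside an async function):
def mark_done(conn):
    conn.execute(""update docs set done = 1"")
# Automatically wrapped in a transaction (the new default):
await db.execute_write_fn(mark_done)
# Opt out in order to manage transactions manually inside the function:
await db.execute_write_fn(mark_done, transaction=False)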
Several internal Datasette features, including parts of the JSON write API , had been failing to wrap their operations in a transaction. This has been fixed by the new transaction=True default.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2277"", ""label"": ""#2277""}]"
changelog:v1-0-a11,changelog,v1-0-a11,1.0a11 (2024-02-19),"The ""replace"": true argument to the /db/table/-/insert API now requires the actor to have the update-row permission. ( #2279 )
Fixed some UI bugs in the interactive permissions debugging tool. ( #2278 )
The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. ( #2263 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2279"", ""label"": ""#2279""}, {""href"": ""https://github.com/simonw/datasette/issues/2278"", ""label"": ""#2278""}, {""href"": ""https://github.com/simonw/datasette/issues/2263"", ""label"": ""#2263""}]"
changelog:v1-0-a12,changelog,v1-0-a12,1.0a12 (2024-02-29),"New query_actions() plugin hook, similar to table_actions() and database_actions() . Can be used to add a menu of actions to the canned query or arbitrary SQL query page. ( #2283 )
New design for the button that opens the query, table and database actions menu. ( #2281 )
""does not contain"" table filter for finding rows that do not contain a string. ( #2287 )
Fixed a bug in the makeColumnActions(columnDetails) JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. ( #2289 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2283"", ""label"": ""#2283""}, {""href"": ""https://github.com/simonw/datasette/issues/2281"", ""label"": ""#2281""}, {""href"": ""https://github.com/simonw/datasette/issues/2287"", ""label"": ""#2287""}, {""href"": ""https://github.com/simonw/datasette/issues/2289"", ""label"": ""#2289""}]"
changelog:v1-0-a13,changelog,v1-0-a13,1.0a13 (2024-03-12),"Each of the key concepts in Datasette now has an actions menu , which plugins can use to add additional functionality targeting that entity.
Plugin hook: view_actions() for actions that can be applied to a SQL view. ( #2297 )
Plugin hook: homepage_actions() for actions that apply to the instance homepage. ( #2298 )
Plugin hook: row_actions() for actions that apply to the row page. ( #2299 )
Action menu items for all of the *_actions() plugin hooks can now return an optional ""description"" key, which will be displayed in the menu below the action label. ( #2294 )
Plugin hooks documentation page is now organized with additional headings. ( #2300 )
Improved the display of action buttons on pages that also display metadata. ( #2286 )
The header and footer of the page now uses a subtle gradient effect, and options in the navigation menu are better visually defined. ( #2302 )
Table names that start with an underscore now default to hidden. ( #2104 )
pragma_table_list has been added to the allow-list of SQLite pragma functions supported by Datasette. select * from pragma_table_list() is no longer blocked. ( #2104 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2297"", ""label"": ""#2297""}, {""href"": ""https://github.com/simonw/datasette/issues/2298"", ""label"": ""#2298""}, {""href"": ""https://github.com/simonw/datasette/issues/2299"", ""label"": ""#2299""}, {""href"": ""https://github.com/simonw/datasette/issues/2294"", ""label"": ""#2294""}, {""href"": ""https://github.com/simonw/datasette/issues/2300"", ""label"": ""#2300""}, {""href"": ""https://github.com/simonw/datasette/issues/2286"", ""label"": ""#2286""}, {""href"": ""https://github.com/simonw/datasette/issues/2302"", ""label"": ""#2302""}, {""href"": ""https://github.com/simonw/datasette/issues/2104"", ""label"": ""#2104""}, {""href"": ""https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475"", ""label"": ""#2104""}]"
changelog:v1-0-a2,changelog,v1-0-a2,1.0a2 (2022-12-14),"The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus the ability to specify finely grained permissions when creating an API token.
See Datasette 1.0a2: Upserts and finely grained permissions for an extended, annotated version of these release notes.
New /db/table/-/upsert API, documented here . upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead. ( #1878 )
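A sketch of a request body for this endpoint, assuming a table with an id primary key and a title column:
POST /db/table/-/upsert
{
    ""rows"": [
        {""id"": 1, ""title"": ""Updated title for row 1""},
        {""id"": 3, ""title"": ""Inserted as a brand new row""}
    ]
}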
New register_permissions(datasette) plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. ( #1940 )
The /db/-/create API for creating a table now accepts ""ignore"": true and ""replace"": true options when called with the ""rows"" property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. ( #1927 )
Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's Metadata JSON and YAML files. The new ""permissions"" key can be used to specify which actors should have which permissions. See Other permissions in datasette.yaml for details. ( #1636 )
The /-/create-token page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See API Tokens for details. ( #1947 )
Likewise, the datasette create-token CLI command can now create tokens with a subset of permissions . ( #1855 )
New datasette.create_token() API method for programmatically creating signed API tokens. ( #1951 )
/db/-/create API now requires actor to have insert-row permission in order to use the ""row"" or ""rows"" properties. ( #1937 )","[""Changelog""]","[{""href"": ""https://simonwillison.net/2022/Dec/15/datasette-1a2/"", ""label"": ""Datasette 1.0a2: Upserts and finely grained permissions""}, {""href"": ""https://github.com/simonw/datasette/issues/1878"", ""label"": ""#1878""}, {""href"": ""https://github.com/simonw/datasette/issues/1940"", ""label"": ""#1940""}, {""href"": ""https://github.com/simonw/datasette/issues/1927"", ""label"": ""#1927""}, {""href"": ""https://github.com/simonw/datasette/issues/1636"", ""label"": ""#1636""}, {""href"": ""https://github.com/simonw/datasette/issues/1947"", ""label"": ""#1947""}, {""href"": ""https://github.com/simonw/datasette/issues/1855"", ""label"": ""#1855""}, {""href"": ""https://github.com/simonw/datasette/issues/1951"", ""label"": ""#1951""}, {""href"": ""https://github.com/simonw/datasette/issues/1937"", ""label"": ""#1937""}]"
changelog:v1-0-a3,changelog,v1-0-a3,1.0a3 (2023-08-09),"This alpha release previews the updated design for Datasette's default JSON API. ( #782 )
The new default JSON representation for both table pages ( /dbname/table.json ) and arbitrary SQL queries ( /dbname.json?sql=... ) is now shaped like this:
{
""ok"": true,
""rows"": [
{
""id"": 3,
""name"": ""Detroit""
},
{
""id"": 2,
""name"": ""Los Angeles""
},
{
""id"": 4,
""name"": ""Memnonia""
},
{
""id"": 1,
""name"": ""San Francisco""
}
],
""truncated"": false
}
Tables will include an additional ""next"" key for pagination, which can be passed to ?_next= to fetch the next page of results.
The various ?_shape= options continue to work as before - see Different shapes for details.
A new ?_extra= mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in #262 .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/782"", ""label"": ""#782""}, {""href"": ""https://github.com/simonw/datasette/issues/262"", ""label"": ""#262""}]"
changelog:v1-0-a4,changelog,v1-0-a4,1.0a4 (2023-08-21),"This alpha fixes a security issue with the /-/api API explorer. On authenticated Datasette instances (instances protected using plugins such as datasette-auth-passwords ) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed.
For more information and workarounds, read the security advisory . The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3.
Also in this alpha:
The new datasette plugins --requirements option outputs a list of currently installed plugins in Python requirements.txt format, useful for duplicating that installation elsewhere. ( #2133 )
Writable canned queries can now define an on_success_message_sql field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. ( #2138 )
The automatically generated border color for a database is now shown in more places around the application. ( #2119 )
Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. ( #2140 )","[""Changelog""]","[{""href"": ""https://datasette.io/plugins/datasette-auth-passwords"", ""label"": ""datasette-auth-passwords""}, {""href"": ""https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq"", ""label"": ""the security advisory""}, {""href"": ""https://github.com/simonw/datasette/issues/2133"", ""label"": ""#2133""}, {""href"": ""https://github.com/simonw/datasette/issues/2138"", ""label"": ""#2138""}, {""href"": ""https://github.com/simonw/datasette/issues/2119"", ""label"": ""#2119""}, {""href"": ""https://github.com/simonw/datasette/issues/2140"", ""label"": ""#2140""}]"
changelog:v1-0-a5,changelog,v1-0-a5,1.0a5 (2023-08-29),"When restrictions are applied to API tokens , those restrictions now behave slightly differently: applying the view-table restriction will imply the ability to view-database for the database containing that table, and both view-table and view-database will imply view-instance . Previously you needed to create a token with restrictions that explicitly listed view-instance and view-database and view-table in order to view a table without getting a permission denied error. ( #2102 )
New datasette.yaml (or .json ) configuration file, which can be specified using datasette -c path-to-file . The goal here is to consolidate settings, plugin configuration, permissions, canned queries, and other Datasette configuration into a single file, separate from metadata.yaml . The legacy settings.json config file used for Configuration directory mode has been removed, and datasette.yaml has a ""settings"" section where the same settings key/value pairs can be included. In a future alpha release, more configuration such as plugins/permissions/canned queries will be moved to the datasette.yaml file. See #2093 for more details. Thanks, Alex Garcia.
The -s/--setting option can now take dotted paths to nested settings. These will then be used to set or override the same options as are present in the new configuration file. ( #2156 )
New --actor '{""id"": ""json-goes-here""}' option for use with datasette --get to treat the simulated request as being made by a specific actor, see datasette --get . ( #2153 )
The Datasette _internal database has had some changes. It no longer shows up in the datasette.databases list by default, and is now instead available to plugins using the datasette.get_internal_database() method. Plugins are invited to use this as a private database to store configuration and settings and secrets that should not be made visible through the default Datasette interface. Users can pass the new --internal internal.db option to persist that internal database to disk. Thanks, Alex Garcia. ( #2157 ).
changelog:v1-0-a6,changelog,v1-0-a6,1.0a6 (2023-09-07),"New plugin hook: actors_from_ids(datasette, actor_ids) and an internal method to accompany it, await .actors_from_ids(actor_ids) . This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects. ( #2181 )
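A minimal sketch of a plugin implementing this hook - the returned actor objects here are placeholders:
from datasette import hookimpl

@hookimpl
def actors_from_ids(datasette, actor_ids):
    # Resolve each recorded ID into a full actor dictionary
    return {actor_id: {""id"": actor_id} for actor_id in actor_ids}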
DATASETTE_LOAD_PLUGINS environment variable for controlling which plugins are loaded by Datasette. ( #2164 )
Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. ( #2178 )
The execute-sql permission now implies that the actor can also view the database and instance. ( #2169 )
Documentation describing a pattern for building plugins that themselves define further hooks for other plugins. ( #1765 )
Datasette is now tested against the Python 3.12 preview. ( #2175 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2181"", ""label"": ""#2181""}, {""href"": ""https://github.com/simonw/datasette/issues/2164"", ""label"": ""#2164""}, {""href"": ""https://github.com/simonw/datasette/issues/2178"", ""label"": ""#2178""}, {""href"": ""https://github.com/simonw/datasette/issues/2169"", ""label"": ""#2169""}, {""href"": ""https://github.com/simonw/datasette/issues/1765"", ""label"": ""#1765""}, {""href"": ""https://github.com/simonw/datasette/pull/2175"", ""label"": ""#2175""}]"
changelog:v1-0-a7,changelog,v1-0-a7,1.0a7 (2023-09-21),Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 ),"[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2189"", ""label"": ""#2189""}]"
changelog:v1-0-a8,changelog,v1-0-a8,1.0a8 (2024-02-07),"This alpha release continues the migration of Datasette's configuration from metadata.yaml to the new datasette.yaml configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks.
See Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml for an annotated version of these release notes.","[""Changelog""]","[{""href"": ""https://simonwillison.net/2024/Feb/7/datasette-1a8/"", ""label"": ""Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml""}]"
changelog:v1-0-a9,changelog,v1-0-a9,1.0a9 (2024-02-16),This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the /upsert API endpoint.,"[""Changelog""]",[]
configuration:configuration-reference-canned-queries,configuration,configuration-reference-canned-queries,Canned queries configuration,"Canned queries are named SQL queries that appear in the Datasette interface. They can be configured in datasette.yaml using the queries key at the database level:
[[[cog
from metadata_doc import config_example, config_example
config_example(cog, {
""databases"": {
""sf-trees"": {
""queries"": {
""just_species"": {
""sql"": ""select qSpecies from Street_Tree_List""
}
}
}
}
})
]]]
[[[end]]]
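Once configured, the just_species query above is served from its own page at /sf-trees/just_species .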
See the canned queries documentation for more, including how to configure writable canned queries .","[""Configuration"", null]",[]
configuration:configuration-reference-css-js,configuration,configuration-reference-css-js,Custom CSS and JavaScript,"Datasette can load additional CSS and JavaScript files, configured in datasette.yaml like this:
[[[cog
from metadata_doc import config_example
config_example(cog, """"""
extra_css_urls:
- https://simonwillison.net/static/css/all.bf8cd891642c.css
extra_js_urls:
- https://code.jquery.com/jquery-3.2.1.slim.min.js
"""""")
]]]
[[[end]]]
The extra CSS and JavaScript files will be linked in the <head> of every page, producing markup along these lines:
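<link rel=""stylesheet"" href=""https://simonwillison.net/static/css/all.bf8cd891642c.css"">
<script src=""https://code.jquery.com/jquery-3.2.1.slim.min.js""></script>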
You can also specify a SRI (subresource integrity hash) for these assets:
[[[cog
config_example(cog, """"""
extra_css_urls:
- url: https://simonwillison.net/static/css/all.bf8cd891642c.css
sri: sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI
extra_js_urls:
- url: https://code.jquery.com/jquery-3.2.1.slim.min.js
sri: sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=
"""""")
]]]
[[[end]]]
This should produce markup along these lines:
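<link rel=""stylesheet"" href=""https://simonwillison.net/static/css/all.bf8cd891642c.css""
    integrity=""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI""
    crossorigin=""anonymous"">
<script src=""https://code.jquery.com/jquery-3.2.1.slim.min.js""
    integrity=""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=""
    crossorigin=""anonymous""></script>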
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using www.srihash.org
Items in ""extra_js_urls"" can specify ""module"": true if they reference JavaScript that uses JavaScript modules . This configuration:
[[[cog
config_example(cog, """"""
extra_js_urls:
- url: https://example.datasette.io/module.js
module: true
"""""")
]]]
[[[end]]]
Will produce HTML along these lines:
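<script type=""module"" src=""https://example.datasette.io/module.js""></script>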
","[""Configuration"", null]","[{""href"": ""https://www.srihash.org/"", ""label"": ""www.srihash.org""}, {""href"": ""https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules"", ""label"": ""JavaScript modules""}]"
configuration:configuration-reference-permissions,configuration,configuration-reference-permissions,Permissions configuration,"Datasette's authentication and permissions system can also be configured using datasette.yaml .
Here is a simple example:
[[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
""""""
# Instance is only available to users 'sharon' and 'percy':
allow:
id:
- sharon
- percy
# Only 'percy' is allowed access to the accounting database:
databases:
accounting:
allow:
id: percy
"""""").strip()
)
]]]
[[[end]]]
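To apply this configuration, pass the file when starting Datasette:
datasette accounting.db -c datasette.yaml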
Access permissions in datasette.yaml has the full details.","[""Configuration"", null]",[]
configuration:configuration-reference-plugins,configuration,configuration-reference-plugins,Plugin configuration,"Datasette plugins often require configuration. This plugin configuration should be placed in plugins keys inside datasette.yaml .
Most plugins are configured at the top-level of the file, using the plugins key:
[[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
""""""
# inside datasette.yaml
plugins:
datasette-my-plugin:
key: my_value
"""""").strip()
)
]]]
[[[end]]]
Some plugins can be configured at the database or table level. These should use a plugins key nested under the appropriate place within the databases object:
[[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
""""""
# inside datasette.yaml
databases:
my_database:
# plugin configuration for the my_database database
plugins:
datasette-my-plugin:
key: my_value
my_other_database:
tables:
my_table:
# plugin configuration for the my_table table inside the my_other_database database
plugins:
datasette-my-plugin:
key: my_value
"""""").strip()
)
]]]
[[[end]]]","[""Configuration"", null]",[]
configuration:configuration-reference-settings,configuration,configuration-reference-settings,Settings,"Settings can be configured in datasette.yaml with the settings key:
[[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
""""""
# inside datasette.yaml
settings:
default_allow_sql: off
default_page_size: 50
"""""").strip()
)
]]]
[[[end]]]
The full list of settings is available in the settings documentation . Settings can also be passed to Datasette using one or more --setting name value command line options.","[""Configuration"", null]",[]
configuration:configuration-cli,configuration,configuration-cli,Configuration via the command-line,"The recommended way to configure Datasette is using a datasette.yaml file passed to -c/--config . You can also pass individual settings to Datasette using the -s/--setting option, which can be used multiple times:
datasette mydatabase.db \
--setting settings.default_page_size 50 \
--setting settings.sql_time_limit_ms 3500
This option takes dotted-notation for the first argument and a value for the second argument. This means you can use it to set any configuration value that would be valid in a datasette.yaml file.
It also works for plugin configuration, for example for datasette-cluster-map :
datasette mydatabase.db \
--setting plugins.datasette-cluster-map.latitude_column xlat \
--setting plugins.datasette-cluster-map.longitude_column xlon
If the value you provide is a valid JSON object or list it will be treated as nested data, allowing you to configure plugins that accept lists such as datasette-proxy-url :
datasette mydatabase.db \
-s plugins.datasette-proxy-url.paths '[{""path"": ""/proxy"", ""backend"": ""http://example.com/""}]'
This is equivalent to a datasette.yaml file containing the following:
[[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
""""""
plugins:
datasette-proxy-url:
paths:
- path: /proxy
backend: http://example.com/
"""""").strip()
)
]]]
[[[end]]]","[""Configuration""]","[{""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://datasette.io/plugins/datasette-proxy-url"", ""label"": ""datasette-proxy-url""}]"
configuration:configuration-reference,configuration,configuration-reference,,"The following example shows some of the valid configuration options that can exist inside datasette.yaml .
[[[cog
from metadata_doc import config_example
import textwrap
config_example(cog, textwrap.dedent(
""""""
# Datasette settings block
settings:
default_page_size: 50
sql_time_limit_ms: 3500
max_returned_rows: 2000
# top-level plugin configuration
plugins:
datasette-my-plugin:
key: valueA
# Database and table-level configuration
databases:
your_db_name:
# plugin configuration for the your_db_name database
plugins:
datasette-my-plugin:
key: valueA
tables:
your_table_name:
allow:
# Only the root user can access this table
id: root
# plugin configuration for the your_table_name table
# inside your_db_name database
plugins:
datasette-my-plugin:
key: valueB
"""""")
)
]]]
[[[end]]]","[""Configuration""]",[]
contributing:contributing-formatting-black,contributing,contributing-formatting-black,Running Black,"Black will be installed when you run pip install -e '.[test]' . To test that your code complies with Black, run the following in your root datasette repository checkout:
black . --check
All done! ✨ 🍰 ✨
95 files would be left unchanged.
If any of your code does not conform to Black you can run this to automatically fix those problems:
black .
reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.","[""Contributing"", ""Code formatting""]",[]
contributing:contributing-formatting-blacken-docs,contributing,contributing-formatting-blacken-docs,blacken-docs,"The blacken-docs command applies Black formatting rules to code examples in the documentation. Run it like this:
blacken-docs -l 60 docs/*.rst","[""Contributing"", ""Code formatting""]","[{""href"": ""https://pypi.org/project/blacken-docs/"", ""label"": ""blacken-docs""}]"
contributing:contributing-formatting-prettier,contributing,contributing-formatting-prettier,Prettier,"To install Prettier, install Node.js and then run the following in the root of your datasette repository checkout:
npm install
This will install Prettier in a node_modules directory. You can then check that your code matches the coding style like so:
npm run prettier -- --check
> prettier
> prettier 'datasette/static/*[!.min].js' ""--check""
Checking formatting...
[warn] datasette/static/plugins.js
[warn] Code style issues found in the above file(s). Forgot to run Prettier?
You can fix any problems by running:
npm run fix","[""Contributing"", ""Code formatting""]","[{""href"": ""https://nodejs.org/en/download/package-manager/"", ""label"": ""install Node.js""}]"
contributing:contributing-documentation-cog,contributing,contributing-documentation-cog,Running Cog,"Some pages of documentation (in particular the CLI reference ) are automatically updated using Cog .
To update these pages, run the following command:
cog -r docs/*.rst","[""Contributing"", ""Editing and building the documentation""]","[{""href"": ""https://github.com/nedbat/cog"", ""label"": ""Cog""}]"
contributing:contributing-alpha-beta,contributing,contributing-alpha-beta,Alpha and beta releases,"Alpha and beta releases are published to preview upcoming features that may not yet be stable - in particular to preview new plugin hooks.
You are welcome to try these out, but please be aware that details may change before the final release.
Please join discussions on the issue tracker to share your thoughts and experiences with alpha and beta features that you try out.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/issues"", ""label"": ""discussions on the issue tracker""}]"
contributing:contributing-bug-fix-branch,contributing,contributing-bug-fix-branch,Releasing bug fixes from a branch,"If it's necessary to publish a bug fix release without shipping new features that have landed on main , a release branch can be used.
Create it from the relevant last tagged release like so:
git branch 0.52.x 0.52.4
git checkout 0.52.x
Next cherry-pick the commits containing the bug fixes:
git cherry-pick COMMIT
Write the release notes in the branch, and update the version number in version.py . Then push the branch:
git push -u origin 0.52.x
Once the tests have completed, publish the release from that branch target using the GitHub Draft a new release form.
Finally, cherry-pick the commit with the release notes and version number bump across to main :
git checkout main
git cherry-pick COMMIT
git push","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/releases/new"", ""label"": ""Draft a new release""}]"
contributing:contributing-continuous-deployment,contributing,contributing-continuous-deployment,Continuously deployed demo instances,"The demo instance at latest.datasette.io is re-deployed automatically to Google Cloud Run for every push to main that passes the test suite. This is implemented by the GitHub Actions workflow at .github/workflows/deploy-latest.yml .
Specific branches can also be set to automatically deploy by adding them to the on: push: branches block at the top of the workflow YAML file. Branches configured in this way will be deployed to a new Cloud Run service whether or not their tests pass.
The Cloud Run URL for a branch demo can be found in the GitHub Actions logs.","[""Contributing""]","[{""href"": ""https://latest.datasette.io/"", ""label"": ""latest.datasette.io""}, {""href"": ""https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml"", ""label"": "".github/workflows/deploy-latest.yml""}]"
contributing:contributing-debugging,contributing,contributing-debugging,Debugging,"Any errors that occur while Datasette is running will display a stack trace on the console.
You can tell Datasette to open an interactive pdb debugger session if an error occurs using the --pdb option:
datasette --pdb fixtures.db","[""Contributing""]",[]
contributing:contributing-documentation,contributing,contributing-documentation,Editing and building the documentation,"Datasette's documentation lives in the docs/ directory and is deployed automatically using Read The Docs .
The documentation is written using reStructuredText. You may find this article on The subset of reStructuredText worth committing to memory useful.
You can build it locally by installing sphinx and sphinx_rtd_theme in your Datasette development environment and then running make html directly in the docs/ directory:
# You may first need to activate your virtual environment:
source venv/bin/activate
# Install the dependencies needed to build the docs
pip install -e .[docs]
# Now build the docs
cd docs/
make html
This will create the HTML version of the documentation in docs/_build/html . You can open it in your browser like so:
open _build/html/index.html
Any time you make changes to a .rst file you can re-run make html to update the built documents, then refresh them in your browser.
For added productivity, you can use sphinx-autobuild to run Sphinx in auto-build mode. This will run a local webserver serving the docs that automatically rebuilds them and refreshes the page any time you hit save in your editor.
sphinx-autobuild will have been installed when you ran pip install -e .[docs] . In your docs/ directory you can start the server by running the following:
make livehtml
Now browse to http://localhost:8000/ to view the documentation. Any edits you make should be instantly reflected in your browser.","[""Contributing""]","[{""href"": ""https://readthedocs.org/"", ""label"": ""Read The Docs""}, {""href"": ""https://simonwillison.net/2018/Aug/25/restructuredtext/"", ""label"": ""The subset of reStructuredText worth committing to memory""}, {""href"": ""https://pypi.org/project/sphinx-autobuild/"", ""label"": ""sphinx-autobuild""}]"
contributing:contributing-formatting,contributing,contributing-formatting,Code formatting,"Datasette uses opinionated code formatters: Black for Python and Prettier for JavaScript.
These formatters are enforced by Datasette's continuous integration: if a commit includes Python or JavaScript code that does not match the style enforced by those tools, the tests will fail.
When developing locally, you can verify and correct the formatting of your code using these tools.","[""Contributing""]","[{""href"": ""https://github.com/psf/black"", ""label"": ""Black""}, {""href"": ""https://prettier.io/"", ""label"": ""Prettier""}]"
contributing:contributing-release,contributing,contributing-release,Release process,"Datasette releases are performed using tags. When a new release is published on GitHub, a GitHub Action workflow will perform the following:
Run the unit tests against all supported Python versions. If the tests pass...
Build a Docker image of the release and push a tag to https://hub.docker.com/r/datasetteproject/datasette
Re-point the ""latest"" tag on Docker Hub to the new image
Build a wheel bundle of the underlying Python source code
Push that new wheel up to PyPI: https://pypi.org/project/datasette/
If the release is an alpha, navigate to https://readthedocs.org/projects/datasette/versions/ and search for the tag name in the ""Activate a version"" filter, then mark that version as ""active"" to ensure it will appear on the public ReadTheDocs documentation site.
To deploy new releases you will need to have push access to the main Datasette GitHub repository.
Datasette follows Semantic Versioning :
major.minor.patch
We increment major for backwards-incompatible releases. Datasette is currently pre-1.0 so the major version is always 0 .
We increment minor for new features.
We increment patch for bugfix releases.
Alpha and beta releases may have an additional a0 or b0 suffix - the integer component will be incremented with each subsequent alpha or beta.
To release a new version, first create a commit that updates the version number in datasette/version.py and the changelog with highlights of the new version. An example commit can be seen here :
# Update changelog
git commit -m "" Release 0.51a1
Refs #1056, #1039, #998, #1045, #1033, #1036, #1034, #976, #1057, #1058, #1053, #1064, #1066"" -a
git push
Referencing the issues that are part of the release in the commit message ensures the name of the release shows up on those issue pages, e.g. here .
You can generate the list of issue references for a specific release by copying and pasting text from the release notes or GitHub changes-since-last-release view into this Extract issue numbers from pasted text tool.
To create the tag for the release, create a new release on GitHub matching the new version number. You can convert the release notes to Markdown by copying and pasting the rendered HTML into this Paste to Markdown tool .
Finally, post a news item about the release on datasette.io by editing the news.yaml file in that site's repository.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml"", ""label"": ""GitHub Action workflow""}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": ""https://hub.docker.com/r/datasetteproject/datasette""}, {""href"": ""https://pypi.org/project/datasette/"", ""label"": ""https://pypi.org/project/datasette/""}, {""href"": ""https://readthedocs.org/projects/datasette/versions/"", ""label"": ""https://readthedocs.org/projects/datasette/versions/""}, {""href"": ""https://semver.org/"", ""label"": ""Semantic Versioning""}, {""href"": ""https://github.com/simonw/datasette/commit/0e1e89c6ba3d0fbdb0823272952cf356f3016def"", ""label"": ""commit can be seen here""}, {""href"": ""https://github.com/simonw/datasette/issues/581#ref-commit-d56f402"", ""label"": ""here""}, {""href"": ""https://observablehq.com/@simonw/extract-issue-numbers-from-pasted-text"", ""label"": ""Extract issue numbers from pasted text""}, {""href"": ""https://github.com/simonw/datasette/releases/new"", ""label"": ""a new release""}, {""href"": ""https://euangoddard.github.io/clipboard2markdown/"", ""label"": ""Paste to Markdown tool""}, {""href"": ""https://datasette.io/"", ""label"": ""datasette.io""}, {""href"": ""https://github.com/simonw/datasette.io/blob/main/news.yaml"", ""label"": ""news.yaml""}]"
contributing:contributing-running-tests,contributing,contributing-running-tests,Running the tests,"Once you have done this, you can run the Datasette unit tests from inside your datasette/ directory using pytest like so:
pytest
You can run the tests faster using multiple CPU cores with pytest-xdist like this:
pytest -n auto -m ""not serial""
-n auto detects the number of available cores automatically. The -m ""not serial"" skips tests that don't work well in a parallel test environment. You can run those tests separately like so:
pytest -m ""serial""","[""Contributing""]","[{""href"": ""https://docs.pytest.org/"", ""label"": ""pytest""}, {""href"": ""https://pypi.org/project/pytest-xdist/"", ""label"": ""pytest-xdist""}]"
contributing:contributing-upgrading-codemirror,contributing,contributing-upgrading-codemirror,Upgrading CodeMirror,"Datasette bundles CodeMirror for the SQL editing interface, e.g. on this page . Here are the steps for upgrading to a new version of CodeMirror:
Install the packages with:
npm i codemirror @codemirror/lang-sql
Build the bundle using the version number from package.json with:
node_modules/.bin/rollup datasette/static/cm-editor-6.0.1.js \
-f iife \
-n cm \
-o datasette/static/cm-editor-6.0.1.bundle.js \
-p @rollup/plugin-node-resolve \
-p @rollup/plugin-terser
Update the version reference in the codemirror.html template.","[""Contributing""]","[{""href"": ""https://codemirror.net/"", ""label"": ""CodeMirror""}, {""href"": ""https://latest.datasette.io/fixtures"", ""label"": ""this page""}]"
contributing:contributing-using-fixtures,contributing,contributing-using-fixtures,Using fixtures,"To run Datasette itself, type datasette .
You're going to need at least one SQLite database. A quick way to get started is to use the fixtures database that Datasette uses for its own tests.
You can create a copy of that database by running this command:
python tests/fixtures.py fixtures.db
Now you can run Datasette against the new fixtures database like so:
datasette fixtures.db
This will start a server at http://127.0.0.1:8001/ .
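Datasette defaults to port 8001. If something else is already using that port you can pick a different one with the -p option:
datasette fixtures.db -p 8002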
Any changes you make in the datasette/templates or datasette/static folder will be picked up immediately (though you may need to do a force-refresh in your browser to see changes to CSS or JavaScript).
If you want to change Datasette's Python code you can use the --reload option to cause Datasette to automatically reload any time the underlying code changes:
datasette --reload fixtures.db
You can also use the fixtures.py script to recreate the testing version of metadata.json used by the unit tests. To do that:
python tests/fixtures.py fixtures.db fixtures-metadata.json
Or to output the plugins used by the tests, run this:
python tests/fixtures.py fixtures.db fixtures-metadata.json fixtures-plugins
Test tables written to fixtures.db
- metadata written to fixtures-metadata.json
Wrote plugin: fixtures-plugins/register_output_renderer.py
Wrote plugin: fixtures-plugins/view_name.py
Wrote plugin: fixtures-plugins/my_plugin.py
Wrote plugin: fixtures-plugins/messages_output_renderer.py
Wrote plugin: fixtures-plugins/my_plugin_2.py
Then run Datasette like this:
datasette fixtures.db -m fixtures-metadata.json --plugins-dir=fixtures-plugins/","[""Contributing""]",[]
contributing:devenvironment,contributing,devenvironment,Setting up a development environment,"If you have Python 3.8 or higher installed on your computer (on OS X the quickest way to do this is using homebrew ) you can install an editable copy of Datasette using the following steps.
If you want to use GitHub to publish your changes, first create a fork of datasette under your own GitHub account.
Now clone that repository somewhere on your computer:
git clone git@github.com:YOURNAME/datasette
If you want to get started without creating your own fork, you can do this instead:
git clone git@github.com:simonw/datasette
The next step is to create a virtual environment for your project and use it to install Datasette's dependencies:
cd datasette
# Create a virtual environment in ./venv
python3 -m venv ./venv
# Now activate the virtual environment, so pip can install into it
source venv/bin/activate
# Install Datasette and its testing dependencies
python3 -m pip install -e '.[test]'
That last line does most of the work: pip install -e means ""install this package in a way that allows me to edit the source code in place"". The .[test] option means ""use the setup.py in this directory and install the optional testing dependencies as well"".","[""Contributing""]","[{""href"": ""https://docs.python-guide.org/starting/install3/osx/"", ""label"": ""is using homebrew""}, {""href"": ""https://github.com/simonw/datasette/fork"", ""label"": ""create a fork of datasette""}]"
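To confirm the editable install worked, check that the datasette command now reports the version from your checkout:
datasette --version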
contributing:general-guidelines,contributing,general-guidelines,General guidelines,"main should always be releasable . Incomplete features should live in branches. This ensures that any small bug fixes can be quickly released.
The ideal commit should bundle together the implementation, unit tests and associated documentation updates. The commit message should link to an associated issue.
New plugin hooks should only be shipped if accompanied by a separate release of a non-demo plugin that uses them.","[""Contributing""]",[]
custom_templates:custom-pages-errors,custom_templates,custom-pages-errors,Custom error pages,"Datasette returns an error page if an unexpected error occurs, access is forbidden or content cannot be found.
You can customize the response returned for these errors by providing a custom error page template.
Content not found errors use a 404.html template. Access denied errors use 403.html . Invalid input errors use 400.html . Unexpected errors of other kinds use 500.html .
If a template for the specific error code is not found a template called error.html will be used instead. If you do not provide that template Datasette's default error.html template will be used.
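For example, to try out a custom 404 page you could create a templates/404.html file and then start Datasette with that directory (mydb.db here is a placeholder database):
datasette mydb.db --template-dir=templates/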
The error template will be passed the following context:
status - integer
The integer HTTP status code, e.g. 404, 500, 403, 400.
error - string
Details of the specific error, usually a full sentence.
title - string or None
A title for the page representing the class of error. This is often None for errors that do not provide a title separate from their error message.","[""Custom pages and templates"", ""Custom redirects""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/datasette/templates/error.html"", ""label"": ""default error.html template""}]"
custom_templates:customization-custom-templates,custom_templates,customization-custom-templates,Custom templates,"By default, Datasette uses default templates that ship with the package.
You can over-ride these templates by specifying a custom --template-dir like
this:
datasette mydb.db --template-dir=mytemplates/
Datasette will now first look for templates in that directory, and fall back on
the defaults if no matches are found.
It is also possible to over-ride templates on a per-database, per-row or per-
table basis.
The lookup rules Datasette uses are as follows:
Index page (/):
index.html
Database page (/mydatabase):
database-mydatabase.html
database.html
Custom query page (/mydatabase?sql=...):
query-mydatabase.html
query.html
Canned query page (/mydatabase/canned-query):
query-mydatabase-canned-query.html
query-mydatabase.html
query.html
Table page (/mydatabase/mytable):
table-mydatabase-mytable.html
table.html
Row page (/mydatabase/mytable/id):
row-mydatabase-mytable.html
row.html
Table of rows and columns include on table page:
_table-table-mydatabase-mytable.html
_table-mydatabase-mytable.html
_table.html
Table of rows and columns include on row page:
_table-row-mydatabase-mytable.html
_table-mydatabase-mytable.html
_table.html
If a table name has spaces or other unexpected characters in it, the template
filename will follow the same rules as our custom CSS classes - for
example, a table called ""Food Trucks"" will attempt to load the following
templates:
table-mydatabase-Food-Trucks-399138.html
table.html
You can find out which templates were considered for a specific page by viewing
source on that page and looking for an HTML comment at the bottom. The comment
will look something like this:
<!-- Templates considered: *query-mydb-tz.html, query-mydb.html, query.html -->
This example is from the canned query page for a query called ""tz"" in the
database called ""mydb"". The asterisk shows which template was selected - so in
this case, Datasette found a template file called query-mydb-tz.html and
used that - but if that template had not been found, it would have tried for
query-mydb.html or the default query.html .
It is possible to extend the default templates using Jinja template
inheritance. If you want to customize EVERY row template with some additional
content you can do so by creating a row.html template like this:
{% extends ""default:row.html"" %}
{% block content %}
EXTRA HTML AT THE TOP OF THE CONTENT BLOCK
<!-- This line renders the original block: -->
{{ super() }}
{% endblock %}
Note the default:row.html template name, which ensures Jinja will inherit
from the default template.
The _table.html template is included by both the row and the table pages,
and is used to render a list of rows. The default _table.html template renders
them as an HTML table and can be seen here .
You can provide a custom template that applies to all of your databases and
tables, or you can provide custom templates for specific tables using the
template naming scheme described above.
If you want to present your data in a format other than an HTML table, you
can do so by looping through display_rows in your own _table.html
template. You can use {{ row[""column_name""] }} to output the raw value
of a specific column.
If you want to output the rendered HTML version of a column, including any
links to foreign keys, you can use {{ row.display(""column_name"") }} .
Here is an example of a custom _table.html template:
{% for row in display_rows %}
{{ row[""title""] }}
{{ row[""description""] }}
Category: {{ row.display(""category_id"") }}
{% endfor %}","[""Custom pages and templates"", ""Publishing static assets""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/datasette/templates/_table.html"", ""label"": ""can be seen here""}]"
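To try out a custom template like this, you could save it as _table.html in a template directory and start Datasette against it - the database name here is a placeholder:
mkdir -p mytemplates
# Save the template above as mytemplates/_table.html, then:
datasette mydatabase.db --template-dir=mytemplates/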
custom_templates:id1,custom_templates,id1,Custom pages,"You can add templated pages to your Datasette instance by creating HTML files in a pages directory within your templates directory.
For example, to add a custom page that is served at http://localhost/about you would create a file in templates/pages/about.html , then start Datasette like this:
datasette mydb.db --template-dir=templates/
You can nest directories within pages to create a nested structure. To create a http://localhost:8001/about/map page you would create templates/pages/about/map.html .","[""Custom pages and templates"", ""Publishing static assets""]",[]
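As a sketch, the following commands would create that nested page - the page content here is purely illustrative:
mkdir -p templates/pages/about
echo '<h1>Map</h1>' > templates/pages/about/map.html
datasette mydb.db --template-dir=templates/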
custom_templates:css-classes-on-the-body,custom_templates,css-classes-on-the-body,CSS classes on the <body>,"Every default template includes CSS classes in the <body> designed to support
custom styling.
The index template (the top level page at / ) gets this:
<body class=""index"">
The database template ( /dbname ) gets this:
<body class=""db db-dbname"">
The custom SQL template ( /dbname?sql=... ) gets this:
<body class=""query db-dbname"">
A canned query template ( /dbname/queryname ) gets this:
<body class=""query db-dbname query-queryname"">
The table template ( /dbname/tablename ) gets:
<body class=""table db-dbname table-tablename"">
The row template ( /dbname/tablename/rowid ) gets:
<body class=""row db-dbname table-tablename"">
The db-x and table-x classes use the database or table names themselves if
they are valid CSS identifiers. If they aren't, we strip any invalid
characters out and append a 6 character md5 digest of the original name, in
order to ensure that multiple tables which resolve to the same stripped
character version still have different CSS classes.
Some examples:
""simple"" => ""simple""
""MixedCase"" => ""MixedCase""
""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6""
""_no-leading-underscores"" => ""no-leading-underscores-b921bc""
""no spaces"" => ""no-spaces-7088d7""
""-"" => ""336d5e""
""no $ characters"" => ""no--characters-59e024""
<td> and <th> elements also get custom CSS classes reflecting the
database column they are representing, for example:
<td class=""col-id"">...</td>","[""Custom pages and templates""]",[]
custom_templates:custom-pages-404,custom_templates,custom-pages-404,Returning 404s,"To indicate that content could not be found and display the default 404 page you can use the raise_404(message) function:
{% if not rows %}
{{ raise_404(""Content not found"") }}
{% endif %}
If you call raise_404() the other content in your template will be ignored.","[""Custom pages and templates""]",[]
custom_templates:custom-pages-headers,custom_templates,custom-pages-headers,Custom headers and status codes,"Custom pages default to being served with a content-type of text/html; charset=utf-8 and a 200 status code. You can change these by calling a custom function from within your template.
For example, to serve a custom page with a 418 I'm a teapot HTTP status code, create a file in pages/teapot.html containing the following:
{{ custom_status(418) }}
<html>
<head><title>Teapot</title></head>
<body>
I'm a teapot
</body>
</html>
To serve a custom HTTP header, add a custom_header(name, value) function call. For example:
{{ custom_status(418) }}
{{ custom_header(""x-teapot"", ""I am"") }}
<html>
<head><title>Teapot</title></head>
<body>
I'm a teapot
</body>
</html>
You can verify this is working using curl like this:
curl -I 'http://127.0.0.1:8001/teapot'
HTTP/1.1 418
date: Sun, 26 Apr 2020 18:38:30 GMT
server: uvicorn
x-teapot: I am
content-type: text/html; charset=utf-8","[""Custom pages and templates""]",[]
custom_templates:custom-pages-parameters,custom_templates,custom-pages-parameters,Path parameters for pages,"You can define custom pages that match multiple paths by creating files with {variable} definitions in their filenames.
For example, to capture any request to a URL matching /about/* , you would create a template in the following location:
templates/pages/about/{slug}.html
A hit to /about/news would render that template and pass in a variable called slug with a value of ""news"" .
If you use this mechanism don't forget to return a 404 if the referenced content could not be found. You can do this using {{ raise_404() }} as described in Returning 404s .
Templates defined using custom page routes work particularly well with the sql() template function from datasette-template-sql or the graphql() template function from datasette-graphql .","[""Custom pages and templates""]","[{""href"": ""https://github.com/simonw/datasette-template-sql"", ""label"": ""datasette-template-sql""}, {""href"": ""https://github.com/simonw/datasette-graphql#the-graphql-template-function"", ""label"": ""datasette-graphql""}]"
custom_templates:custom-pages-redirects,custom_templates,custom-pages-redirects,Custom redirects,"You can use the custom_redirect(location) function to redirect users to another page, for example in a file called pages/datasette.html :
{{ custom_redirect(""https://github.com/simonw/datasette"") }}
Now requests to http://localhost:8001/datasette will result in a redirect.
These redirects are served with a 302 Found status code by default. You can send a 301 Moved Permanently code by passing 301 as the second argument to the function:
{{ custom_redirect(""https://github.com/simonw/datasette"", 301) }}","[""Custom pages and templates""]",[]
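You can verify the redirect using curl - the -I option shows the response headers, which should include a location header pointing at the new URL:
curl -I 'http://127.0.0.1:8001/datasette'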
custom_templates:customization-static-files,custom_templates,customization-static-files,Serving static files,"Datasette can serve static files for you, using the --static option.
Consider the following directory structure:
metadata.json
static-files/styles.css
static-files/app.js
You can start Datasette using --static assets:static-files/ to serve those
files from the /assets/ mount point:
datasette --config datasette.yaml --static assets:static-files/ --memory
The following URLs will now serve the content from those CSS and JS files:
http://localhost:8001/assets/styles.css
http://localhost:8001/assets/app.js
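You can confirm the files are being served correctly using curl, for example:
curl -I 'http://localhost:8001/assets/styles.css'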
You can reference those files from datasette.yaml like this, see custom CSS and JavaScript for more details:
[[[cog
from metadata_doc import config_example
config_example(cog, """"""
extra_css_urls:
- /assets/styles.css
extra_js_urls:
- /assets/app.js
"""""")
]]]
[[[end]]]","[""Custom pages and templates""]",[]
custom_templates:publishing-static-assets,custom_templates,publishing-static-assets,Publishing static assets,"The datasette publish command can be used to publish your static assets,
using the same syntax as above:
datasette publish cloudrun mydb.db --static assets:static-files/
This will upload the contents of the static-files/ directory as part of the
deployment, and configure Datasette to correctly serve the assets from /assets/ .","[""Custom pages and templates""]",[]
index:contents,index,contents,Contents,"Getting started Play with a live demo Follow a tutorial Datasette in your browser with Datasette Lite Try Datasette without installing anything using Glitch Using Datasette on your own computer Installation Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Using Docker A note about extensions Configuration Configuration via the command-line datasette.yaml reference Settings Plugin configuration Permissions configuration Canned queries configuration Custom CSS and JavaScript The Datasette Ecosystem sqlite-utils Dogsheep CLI reference datasette --help datasette serve datasette --get datasette serve --help-settings datasette plugins datasette install datasette uninstall datasette publish datasette publish cloudrun datasette publish heroku datasette package datasette inspect datasette create-token Pages and API endpoints Top-level index Database Hidden tables Table Row Publishing data datasette publish Publishing to Google Cloud Run Publishing to Heroku Publishing to Vercel Publishing to Fly Custom metadata and plugins datasette package Deploying Datasette Deployment fundamentals Running Datasette using systemd Running Datasette using OpenRC Deploying using buildpacks Running Datasette behind a proxy Nginx proxy configuration Apache proxy configuration JSON API Default representation Different shapes Pagination Special JSON arguments Table arguments Column filter arguments Special table arguments Expanding foreign key references Discovering the JSON for a page Enabling CORS The JSON write API Inserting rows Upserting rows Updating a row Deleting a row Creating a table Creating a table from example data Dropping tables Running SQL queries Named parameters Views Canned queries Canned query parameters Additional canned query options Writable canned queries Magic parameters JSON API for writable canned queries Pagination Cross-database queries Authentication and permissions Actors Using the ""root"" actor Permissions How permissions are resolved Defining permissions with ""allow"" blocks The /-/allow-debug tool Access permissions in datasette.yaml Access to an instance Access to specific databases Access to specific tables and views Access to specific canned queries Controlling the ability to execute arbitrary SQL Other permissions in datasette.yaml API Tokens datasette create-token Checking permissions in plugins actor_matches_allow() The permissions debug tool The ds_actor cookie Including an expiry time The /-/logout page Built-in permissions view-instance view-database view-database-download view-table view-query insert-row delete-row update-row create-table alter-table drop-table execute-sql permissions-debug debug-menu Performance and caching Immutable mode Using ""datasette inspect"" HTTP caching datasette-hashed-urls CSV export URL parameters Streaming all records Binary data Linking to binary downloads Binary plugins Facets Facets in query strings Facets in metadata Suggested facets Speeding up facets with indexes Facet by JSON array Facet by date Full-text search The table page and table view API Advanced SQLite search queries Configuring full-text search for a table or view Searches using custom SQL Enabling full-text search for a SQLite table Configuring FTS using sqlite-utils Configuring FTS using csvs-to-sqlite Configuring FTS by hand FTS versions SpatiaLite Warning Installation Installing SpatiaLite on OS X Installing SpatiaLite on Linux Spatial indexing latitude/longitude columns 
Making use of a spatial index Importing shapefiles into SpatiaLite Importing GeoJSON polygons using Shapely Querying polygons using within() Metadata Per-database and per-table metadata Source, license and about Column descriptions Specifying units for a column Setting a default sort order Setting a custom page size Setting which columns can be used for sorting Specifying the label column for a table Hiding tables Metadata reference Top-level metadata Database-level metadata Table-level metadata Settings Using --setting Configuration directory mode Settings default_allow_sql default_page_size sql_time_limit_ms max_returned_rows max_insert_rows num_sql_threads allow_facet default_facet_size facet_time_limit_ms facet_suggest_time_limit_ms suggest_facets allow_download allow_signed_tokens max_signed_tokens_ttl default_cache_ttl cache_size_kb allow_csv_stream max_csv_mb truncate_cells_html force_https_urls template_debug trace_debug base_url Configuring the secret Using secrets with datasette publish Introspection /-/metadata /-/versions /-/plugins /-/settings /-/config /-/databases /-/threads /-/actor /-/messages Custom pages and templates CSS classes on the Serving static files Publishing static assets Custom templates Custom pages Path parameters for pages Custom headers and status codes Returning 404s Custom redirects Custom error pages Plugins Installing plugins One-off plugins using --plugins-dir Deploying plugins using datasette publish Controlling which plugins are loaded Seeing what plugins are installed Plugin configuration Secret configuration values Writing plugins Tracing plugin hooks Writing one-off plugins Starting an installable plugin using cookiecutter Packaging a plugin Static assets Custom templates Writing plugins that accept configuration Designing URLs for your plugin Building URLs within plugins Plugins that define new plugin hooks JavaScript plugins The datasette_init event datasetteManager JavaScript plugin objects makeAboveTablePanelConfigs() makeColumnActions(columnDetails) Selectors Plugin hooks prepare_connection(conn, database, datasette) prepare_jinja2_environment(env, datasette) Page extras extra_template_vars(template, database, table, columns, view_name, request, datasette) extra_css_urls(template, database, table, columns, view_name, request, datasette) extra_js_urls(template, database, table, columns, view_name, request, datasette) extra_body_script(template, database, table, columns, view_name, request, datasette) publish_subcommand(publish) render_cell(row, value, column, table, database, datasette, request) register_output_renderer(datasette) register_routes(datasette) register_commands(cli) register_facet_classes() register_permissions(datasette) asgi_wrapper(datasette) startup(datasette) canned_queries(datasette, database, actor) actor_from_request(datasette, request) actors_from_ids(datasette, actor_ids) jinja2_environment_from_request(datasette, request, env) filters_from_request(request, database, table, datasette) permission_allowed(datasette, actor, action, resource) register_magic_parameters(datasette) forbidden(datasette, request, message) handle_exception(datasette, request, exception) skip_csrf(datasette, scope) get_metadata(datasette, key, database, table) menu_links(datasette, actor, request) Action hooks table_actions(datasette, actor, database, table, request) view_actions(datasette, actor, database, view, request) query_actions(datasette, actor, database, query_name, request, sql, params) row_actions(datasette, actor, request, database, 
table, row) database_actions(datasette, actor, database, request) homepage_actions(datasette, actor, request) Template slots top_homepage(datasette, request) top_database(datasette, request, database) top_table(datasette, request, database, table) top_row(datasette, request, database, table, row) top_query(datasette, request, database, sql) top_canned_query(datasette, request, database, query_name) Event tracking track_event(datasette, event) register_events(datasette) Testing plugins Setting up a Datasette test instance Using datasette.client in tests Using pdb for errors thrown inside Datasette Using pytest fixtures Testing outbound HTTP calls with pytest-httpx Registering a plugin for the duration of a test Internals for plugins Request object The MultiParams class Response class Returning a response with .asgi_send(send) Setting cookies with response.set_cookie() Datasette class .databases .permissions .plugin_config(plugin_name, database=None, table=None) await .render_template(template, context=None, request=None) await .actors_from_ids(actor_ids) await .permission_allowed(actor, action, resource=None, default=...) await .ensure_permissions(actor, permissions) await .check_visibility(actor, action=None, resource=None, permissions=None) .create_token(actor_id, expires_after=None, restrict_all=None, restrict_database=None, restrict_resource=None) .get_permission(name_or_abbr) .get_database(name) .get_internal_database() .add_database(db, name=None, route=None) .add_memory_database(name) .remove_database(name) await .track_event(event) .sign(value, namespace=""default"") .unsign(value, namespace=""default"") .add_message(request, message, type=datasette.INFO) .absolute_url(request, path) .setting(key) .resolve_database(request) .resolve_table(request) .resolve_row(request) datasette.client datasette.urls Database class Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None) db.hash await db.execute(sql, ...) 
Results await db.execute_fn(fn) await db.execute_write(sql, params=None, block=True) await db.execute_write_script(sql, block=True) await db.execute_write_many(sql, params_seq, block=True) await db.execute_write_fn(fn, block=True, transaction=True) await db.execute_isolated_fn(fn) db.close() Database introspection CSRF protection Datasette's internal database The datasette.utils module parse_metadata(content) await_me_maybe(value) derive_named_parameters(db, sql) Tilde encoding datasette.tracer Tracing child tasks Import shortcuts Events LoginEvent LogoutEvent CreateTokenEvent CreateTableEvent DropTableEvent AlterTableEvent InsertRowsEvent UpsertRowsEvent UpdateRowEvent DeleteRowEvent Contributing General guidelines Setting up a development environment Running the tests Using fixtures Debugging Code formatting Running Black blacken-docs Prettier Editing and building the documentation Running Cog Continuously deployed demo instances Release process Alpha and beta releases Releasing bug fixes from a branch Upgrading CodeMirror Changelog 1.0a13 (2024-03-12) 1.0a12 (2024-02-29) 1.0a11 (2024-02-19) 1.0a10 (2024-02-17) 1.0a9 (2024-02-16) Alter table support for create, insert, upsert and update Permissions fix for the upsert API Permission checks now consider opinions from every plugin Other changes 1.0a8 (2024-02-07) Configuration JavaScript plugins Plugin hooks Documentation Minor fixes 0.64.6 (2023-12-22) 0.64.5 (2023-10-08) 1.0a7 (2023-09-21) 0.64.4 (2023-09-21) 1.0a6 (2023-09-07) 1.0a5 (2023-08-29) 1.0a4 (2023-08-21) 1.0a3 (2023-08-09) Smaller changes 0.64.2 (2023-03-08) 0.64.1 (2023-01-11) 0.64 (2023-01-09) 0.63.3 (2022-12-17) 1.0a2 (2022-12-14) 1.0a1 (2022-12-01) 1.0a0 (2022-11-29) Signed API tokens Write API 0.63.2 (2022-11-18) 0.63.1 (2022-11-10) 0.63 (2022-10-27) Features Plugin hooks and internals Documentation 0.62 (2022-08-14) Features Plugin hooks Bug fixes Documentation 0.61.1 (2022-03-23) 0.61 (2022-03-23) 0.60.2 (2022-02-07) 0.60.1 (2022-01-20) 0.60 (2022-01-13) Plugins and internals Faceting Other small fixes 0.59.4 (2021-11-29) 0.59.3 (2021-11-20) 0.59.2 (2021-11-13) 0.59.1 (2021-10-24) 0.59 (2021-10-14) 0.58.1 (2021-07-16) 0.58 (2021-07-14) 0.57.1 (2021-06-08) 0.57 (2021-06-05) New features Bug fixes and other improvements 0.56.1 (2021-06-05) 0.56 (2021-03-28) 0.55 (2021-02-18) 0.54.1 (2021-02-02) 0.54 (2021-01-25) The _internal database Named in-memory database support JavaScript modules Code formatting with Black and Prettier Other changes 0.53 (2020-12-10) 0.52.5 (2020-12-09) 0.52.4 (2020-12-05) 0.52.3 (2020-12-03) 0.52.2 (2020-12-02) 0.52.1 (2020-11-29) 0.52 (2020-11-28) 0.51.1 (2020-10-31) 0.51 (2020-10-31) New visual design Plugins can now add links within Datasette Binary data URL building Running Datasette behind a proxy Smaller changes 0.50.2 (2020-10-09) 0.50.1 (2020-10-09) 0.50 (2020-10-09) 0.49.1 (2020-09-15) 0.49 (2020-09-14) 0.48 (2020-08-16) 0.47.3 (2020-08-15) 0.47.2 (2020-08-12) 0.47.1 (2020-08-11) 0.47 (2020-08-11) 0.46 (2020-08-09) 0.45 (2020-07-01) Magic parameters for canned queries Log out Better plugin documentation New plugin hooks Smaller changes 0.44 (2020-06-11) Authentication Permissions Writable canned queries Flash messages Signed values and secrets CSRF protection Cookie methods register_routes() plugin hooks Smaller changes The road to Datasette 1.0 0.43 (2020-05-28) 0.42 (2020-05-08) 0.41 (2020-05-06) 0.40 (2020-04-21) 0.39 (2020-03-24) 0.38 (2020-03-08) 0.37.1 (2020-03-02) 0.37 (2020-02-25) 0.36 (2020-02-21) 0.35 (2020-02-04) 0.34 
(2020-01-29) 0.33 (2019-12-22) 0.32 (2019-11-14) 0.31.2 (2019-11-13) 0.31.1 (2019-11-12) 0.31 (2019-11-11) 0.30.2 (2019-11-02) 0.30.1 (2019-10-30) 0.30 (2019-10-18) 0.29.3 (2019-09-02) 0.29.2 (2019-07-13) 0.29.1 (2019-07-11) 0.29 (2019-07-07) ASGI New plugin hook: asgi_wrapper New plugin hook: extra_template_vars Secret plugin configuration options Facet by date Easier custom templates for table rows ?_through= for joins through many-to-many tables Small changes 0.28 (2019-05-19) Supporting databases that change Faceting improvements, and faceting plugins datasette publish cloudrun register_output_renderer plugins Medium changes Small changes 0.27.1 (2019-05-09) 0.27 (2019-01-31) 0.26.1 (2019-01-10) 0.26 (2019-01-02) 0.25.2 (2018-12-16) 0.25.1 (2018-11-04) 0.25 (2018-09-19) 0.24 (2018-07-23) 0.23.2 (2018-07-07) 0.23.1 (2018-06-21) 0.23 (2018-06-18) CSV export Foreign key expansions New configuration settings Control HTTP caching with ?_ttl= Improved support for SpatiaLite latest.datasette.io Miscellaneous 0.22.1 (2018-05-23) 0.22 (2018-05-20) 0.21 (2018-05-05) 0.20 (2018-04-20) 0.19 (2018-04-16) 0.18 (2018-04-14) 0.17 (2018-04-13) 0.16 (2018-04-13) 0.15 (2018-04-09) 0.14 (2017-12-09) 0.13 (2017-11-24) 0.12 (2017-11-16) 0.11 (2017-11-14) 0.10 (2017-11-14) 0.9 (2017-11-13) 0.8 (2017-11-13)","[""Datasette""]",[]
deploying:apache-proxy-configuration,deploying,apache-proxy-configuration,Apache proxy configuration,"For Apache , you can use the ProxyPass directive. First make sure the following lines are uncommented:
LoadModule proxy_module lib/httpd/modules/mod_proxy.so
LoadModule proxy_http_module lib/httpd/modules/mod_proxy_http.so
Then add these directives to proxy traffic:
ProxyPass /my-datasette/ http://127.0.0.1:8009/my-datasette/
ProxyPreserveHost On
A live demo of Datasette running behind Apache using this proxy setup can be seen at datasette-apache-proxy-demo.datasette.io/prefix/ . The code for that demo can be found in the demos/apache-proxy directory.
Using --uds you can use Unix domain sockets similar to the nginx example:
ProxyPass /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/
The ProxyPreserveHost On directive ensures that the original Host: header from the incoming request is passed through to Datasette. Datasette needs this to correctly assemble links to other pages using the .absolute_url(request, path) method.","[""Deploying Datasette"", ""Running Datasette behind a proxy""]","[{""href"": ""https://httpd.apache.org/"", ""label"": ""Apache""}, {""href"": ""https://datasette-apache-proxy-demo.datasette.io/prefix/"", ""label"": ""datasette-apache-proxy-demo.datasette.io/prefix/""}, {""href"": ""https://github.com/simonw/datasette/tree/main/demos/apache-proxy"", ""label"": ""demos/apache-proxy""}, {""href"": ""https://httpd.apache.org/docs/2.4/mod/mod_proxy.html#proxypreservehost"", ""label"": ""ProxyPreserveHost On""}]"
deploying:nginx-proxy-configuration,deploying,nginx-proxy-configuration,Nginx proxy configuration,"Here is an example of an nginx configuration file that will proxy traffic to Datasette:
daemon off;
events {
worker_connections 1024;
}
http {
server {
listen 80;
location /my-datasette {
proxy_pass http://127.0.0.1:8009/my-datasette;
proxy_set_header Host $host;
}
}
}
You can also use the --uds option to Datasette to listen on a Unix domain socket instead of a port, configuring the nginx upstream proxy like this:
daemon off;
events {
worker_connections 1024;
}
http {
server {
listen 80;
location /my-datasette {
proxy_pass http://datasette/my-datasette;
proxy_set_header Host $host;
}
}
upstream datasette {
server unix:/tmp/datasette.sock;
}
}
Then run Datasette with datasette --uds /tmp/datasette.sock path/to/database.db --setting base_url /my-datasette/ .","[""Deploying Datasette"", ""Running Datasette behind a proxy""]","[{""href"": ""https://nginx.org/"", ""label"": ""nginx""}]"
deploying:deploying-buildpacks,deploying,deploying-buildpacks,Deploying using buildpacks,"Some hosting providers such as Heroku , DigitalOcean App Platform and Scalingo support the Buildpacks standard for deploying Python web applications.
Deploying Datasette on these platforms requires two files: requirements.txt and Procfile .
The requirements.txt file lets the platform know which Python packages should be installed. It should contain datasette at a minimum, but can also list any Datasette plugins you wish to install - for example:
datasette
datasette-vega
The Procfile lets the hosting platform know how to run the command that serves web traffic. It should look like this:
web: datasette . -h 0.0.0.0 -p $PORT --cors
The $PORT environment variable is provided by the hosting platform. --cors enables CORS requests from JavaScript running on other websites to your domain - omit this if you don't want to allow CORS. You can add additional Datasette Settings options here too.
These two files should be enough to deploy Datasette on any host that supports buildpacks. Datasette will serve any SQLite files that are included in the root directory of the application.
If you want to build SQLite files or download them as part of the deployment process you can do so using a bin/post_compile file. For example, the following bin/post_compile will download an example database that will then be served by Datasette:
wget https://fivethirtyeight.datasettes.com/fivethirtyeight.db
simonw/buildpack-datasette-demo is an example GitHub repository showing a Datasette configuration that can be deployed to a buildpack-supporting host.","[""Deploying Datasette""]","[{""href"": ""https://www.heroku.com/"", ""label"": ""Heroku""}, {""href"": ""https://www.digitalocean.com/docs/app-platform/"", ""label"": ""DigitalOcean App Platform""}, {""href"": ""https://scalingo.com/"", ""label"": ""Scalingo""}, {""href"": ""https://buildpacks.io/"", ""label"": ""Buildpacks standard""}, {""href"": ""https://github.com/simonw/buildpack-datasette-demo"", ""label"": ""simonw/buildpack-datasette-demo""}]"
deploying:deploying-fundamentals,deploying,deploying-fundamentals,Deployment fundamentals,"Datasette can be deployed as a single datasette process that listens on a port. Datasette is not designed to be run as root, so that process should listen on a higher port such as port 8000.
If you want to serve Datasette on port 80 (the HTTP default port) or port 443 (for HTTPS) you should run it behind a proxy server, such as nginx, Apache or HAProxy. The proxy server can listen on port 80/443 and forward traffic on to Datasette.","[""Deploying Datasette""]",[]
deploying:deploying-openrc,deploying,deploying-openrc,Running Datasette using OpenRC,"OpenRC is the service manager on non-systemd Linux distributions like Alpine Linux and Gentoo .
Create an init script at /etc/init.d/datasette with the following contents:
#!/sbin/openrc-run
name=""datasette""
command=""datasette""
command_args=""serve -h 0.0.0.0 /path/to/db.db""
command_background=true
pidfile=""/run/${RC_SVCNAME}.pid""
You then need to configure the service to run at boot and start it:
rc-update add datasette
rc-service datasette start","[""Deploying Datasette""]","[{""href"": ""https://www.alpinelinux.org/"", ""label"": ""Alpine Linux""}, {""href"": ""https://www.gentoo.org/"", ""label"": ""Gentoo""}]"
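You can confirm the service is running like so:
rc-service datasette status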
deploying:deploying-proxy,deploying,deploying-proxy,Running Datasette behind a proxy,"You may wish to run Datasette behind an Apache or nginx proxy, using a path within your existing site.
You can use the base_url configuration setting to tell Datasette to serve traffic with a specific URL prefix. For example, you could run Datasette like this:
datasette my-database.db --setting base_url /my-datasette/ -p 8009
This will run Datasette with the following URLs:
http://127.0.0.1:8009/my-datasette/ - the Datasette homepage
http://127.0.0.1:8009/my-datasette/my-database - the page for the my-database.db database
http://127.0.0.1:8009/my-datasette/my-database/some_table - the page for the some_table table
You can now set your nginx or Apache server to proxy the /my-datasette/ path to this Datasette instance.","[""Deploying Datasette""]",[]
deploying:deploying-systemd,deploying,deploying-systemd,Running Datasette using systemd,"You can run Datasette on Ubuntu or Debian systems using systemd .
First, ensure you have Python 3 and pip installed. On Ubuntu you can use sudo apt-get install python3 python3-pip .
You can install Datasette into a virtual environment, or you can install it system-wide. To install system-wide, use sudo pip3 install datasette .
Now create a folder for your Datasette databases, for example using mkdir /home/ubuntu/datasette-root .
You can copy a test database into that folder like so:
cd /home/ubuntu/datasette-root
curl -O https://latest.datasette.io/fixtures.db
Create a file at /etc/systemd/system/datasette.service with the following contents:
[Unit]
Description=Datasette
After=network.target
[Service]
Type=simple
User=ubuntu
Environment=DATASETTE_SECRET=
WorkingDirectory=/home/ubuntu/datasette-root
ExecStart=datasette serve . -h 127.0.0.1 -p 8000
Restart=on-failure
[Install]
WantedBy=multi-user.target
Add a random value for the DATASETTE_SECRET - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so:
python3 -c 'import secrets; print(secrets.token_hex(32))'
This configuration will run Datasette against all database files contained in the /home/ubuntu/datasette-root directory. If that directory contains a metadata.yml (or .json ) file or a templates/ or plugins/ sub-directory those will automatically be loaded by Datasette - see Configuration directory mode for details.
You can start the Datasette process running using the following:
sudo systemctl daemon-reload
sudo systemctl start datasette.service
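If you want Datasette to start automatically when the server boots, enable the service as well:
sudo systemctl enable datasette.service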
You will need to restart the Datasette service after making changes to its metadata.json configuration or adding a new database file to that directory. You can do that using:
sudo systemctl restart datasette.service
Once the service has started you can confirm that Datasette is running on port 8000 like so:
curl 127.0.0.1:8000/-/versions.json
# Should output JSON showing the installed version
Datasette will not be accessible from outside the server because it is listening on 127.0.0.1 . You can expose it by instead listening on 0.0.0.0 , but a better way is to set up a proxy such as nginx - see Running Datasette behind a proxy .","[""Deploying Datasette""]",[]
facets:facets-in-query-strings,facets,facets-in-query-strings,Facets in query strings,"To turn on faceting for specific columns on a Datasette table view, add one or more _facet=COLUMN parameters to the URL.
For example, if you want to turn on facets for the city_id and state columns, construct a URL that looks like this:
/dbname/tablename?_facet=state&_facet=city_id
This works for both the HTML interface and the .json view.
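For example, you could request the facet counts as JSON using curl - the database and table names here are placeholders:
curl 'http://127.0.0.1:8001/dbname/tablename.json?_facet=state&_facet=city_id'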
When enabled, facets will cause a facet_results block to be added to the JSON output, looking something like this:
{
""state"": {
""name"": ""state"",
""results"": [
{
""value"": ""CA"",
""label"": ""CA"",
""count"": 10,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&state=CA"",
""selected"": false
},
{
""value"": ""MI"",
""label"": ""MI"",
""count"": 4,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&state=MI"",
""selected"": false
},
{
""value"": ""MC"",
""label"": ""MC"",
""count"": 1,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&state=MC"",
""selected"": false
}
],
""truncated"": false
},
""city_id"": {
""name"": ""city_id"",
""results"": [
{
""value"": 1,
""label"": ""San Francisco"",
""count"": 6,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=1"",
""selected"": false
},
{
""value"": 2,
""label"": ""Los Angeles"",
""count"": 4,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=2"",
""selected"": false
},
{
""value"": 3,
""label"": ""Detroit"",
""count"": 4,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=3"",
""selected"": false
},
{
""value"": 4,
""label"": ""Memnonia"",
""count"": 1,
""toggle_url"": ""http://...?_facet=city_id&_facet=state&city_id=4"",
""selected"": false
}
],
""truncated"": false
}
}
If Datasette detects that a column is a foreign key, the ""label"" property will be automatically derived from the detected label column on the referenced table.
The default number of facet results returned is 30, controlled by the default_facet_size setting.
You can increase this on an individual page by adding ?_facet_size=100 to the query string, up to a maximum of max_returned_rows (which defaults to 1000).","[""Facets""]",[]
facets:facets-metadata,facets,facets-metadata,Facets in metadata,"You can turn facets on by default for specific tables by adding them to a ""facets"" key in a Datasette Metadata file.
Here's an example that turns on faceting by default for the qLegalStatus column in the Street_Tree_List table in the sf-trees database:
[[[cog
from metadata_doc import metadata_example
metadata_example(cog, {
""databases"": {
""sf-trees"": {
""tables"": {
""Street_Tree_List"": {
""facets"": [""qLegalStatus""]
}
}
}
}
})
]]]
[[[end]]]
Facets defined in this way will always be shown in the interface and returned in the API, regardless of the _facet arguments passed to the view.
You can specify array or date facets in metadata using JSON objects with a single key of array or date and a value specifying the column, like this:
[[[cog
metadata_example(cog, {
""facets"": [
{""array"": ""tags""},
{""date"": ""created""}
]
})
]]]
[[[end]]]
You can change the default facet size (the number of results shown for each facet) for a table using facet_size :
[[[cog
metadata_example(cog, {
""databases"": {
""sf-trees"": {
""tables"": {
""Street_Tree_List"": {
""facets"": [""qLegalStatus""],
""facet_size"": 10
}
}
}
}
})
]]]
[[[end]]]","[""Facets""]",[]
facets:id2,facets,id2,Facet by JSON array,"If your SQLite installation provides the json1 extension (you can check using /-/versions ) Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns.
This is useful for modelling things like tags without needing to break them out into a new table.
Example here: latest.datasette.io/fixtures/facetable?_facet_array=tags","[""Facets""]","[{""href"": ""https://latest.datasette.io/fixtures/facetable?_facet_array=tags"", ""label"": ""latest.datasette.io/fixtures/facetable?_facet_array=tags""}]"
facets:id3,facets,id3,Facet by date,"If Datasette finds any columns that contain dates in the first 100 values, it will offer a faceting interface against the dates of those values.
This works especially well against timestamp values such as 2019-03-01 12:44:00 .
Example here: latest.datasette.io/fixtures/facetable?_facet_date=created","[""Facets""]","[{""href"": ""https://latest.datasette.io/fixtures/facetable?_facet_date=created"", ""label"": ""latest.datasette.io/fixtures/facetable?_facet_date=created""}]"
facets:speeding-up-facets-with-indexes,facets,speeding-up-facets-with-indexes,Speeding up facets with indexes,"The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by.
Adding indexes can be performed using the sqlite3 command-line utility. Here's how to add an index on the state column in a table called Food_Trucks :
sqlite3 mydatabase.db
SQLite version 3.19.3 2017-06-27 16:48:08
Enter "".help"" for usage hints.
sqlite> CREATE INDEX Food_Trucks_state ON Food_Trucks(""state"");
Or using the sqlite-utils command-line utility:
sqlite-utils create-index mydatabase.db Food_Trucks state","[""Facets""]","[{""href"": ""https://sqlite-utils.datasette.io/en/stable/cli.html#creating-indexes"", ""label"": ""sqlite-utils""}]"
facets:suggested-facets,facets,suggested-facets,Suggested facets,"Datasette's table UI will suggest facets for the user to apply, based on the following criteria:
For the currently filtered data are there any columns which, if applied as a facet...
Will return 30 or fewer unique options
Will return more than one unique option
Will return fewer unique options than the total number of filtered rows
And the query used to evaluate these criteria can be completed in under 50ms
That last point is particularly important: Datasette runs a query for every column that is displayed on a page, which could get expensive - so to avoid slow load times it sets a time limit of just 50ms for each of those queries.
This means suggested facets are unlikely to appear for tables with millions of records in them.","[""Facets""]",[]
full_text_search:configuring-fts-by-hand,full_text_search,configuring-fts-by-hand,Configuring FTS by hand,"We recommend using sqlite-utils , but if you want to hand-roll a SQLite full-text search table you can do so using the following SQL.
To enable full-text search for a table called items that works against the name and description columns, you would run this SQL to create a new items_fts FTS virtual table:
CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 (
name,
description,
content=""items""
);
This creates a set of tables to power full-text search against items . The new items_fts table will be detected by Datasette as the fts_table for the items table.
Creating the table is not enough: you also need to populate it with a copy of the data that you wish to make searchable. You can do that using the following SQL:
INSERT INTO ""items_fts"" (rowid, name, description)
SELECT rowid, name, description FROM items;
If your table has columns that are foreign key references to other tables you can include that data in your full-text search index using a join. Imagine the items table has a foreign key column called category_id which refers to a categories table - you could create a full-text search table like this:
CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 (
name,
description,
category_name,
content=""items""
);
And then populate it like this:
INSERT INTO ""items_fts"" (rowid, name, description, category_name)
SELECT items.rowid,
items.name,
items.description,
categories.name
FROM items JOIN categories ON items.category_id=categories.id;
You can use this technique to populate the full-text search index from any combination of tables and joins that makes sense for your project.","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}]"
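If the data in the underlying table changes later you will need to update the index to match. One option - a sketch using SQLite's special 'rebuild' command for external content tables - is to rebuild the entire FTS table from the sqlite3 prompt:
sqlite3 mydatabase.db
sqlite> INSERT INTO ""items_fts"" (""items_fts"") VALUES ('rebuild');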
full_text_search:configuring-fts-using-csvs-to-sqlite,full_text_search,configuring-fts-using-csvs-to-sqlite,Configuring FTS using csvs-to-sqlite,"If your data starts out in CSV files, you can use Datasette's companion tool csvs-to-sqlite to convert that file into a SQLite database and enable full-text search on specific columns. For a file called items.csv where you want full-text search to operate against the name and description columns you would run the following:
csvs-to-sqlite items.csv items.db -f name -f description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://github.com/simonw/csvs-to-sqlite"", ""label"": ""csvs-to-sqlite""}]"
full_text_search:configuring-fts-using-sqlite-utils,full_text_search,configuring-fts-using-sqlite-utils,Configuring FTS using sqlite-utils,"sqlite-utils is a CLI utility and Python library for manipulating SQLite databases. You can use it from Python code to configure FTS search, or you can achieve the same goal using the accompanying command-line tool .
Here's how to use sqlite-utils to enable full-text search for an items table across the name and description columns:
sqlite-utils enable-fts mydatabase.db items name description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/python-api.html#enabling-full-text-search"", ""label"": ""it from Python code""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/cli.html#configuring-full-text-search"", ""label"": ""using the accompanying command-line tool""}]"
full_text_search:full-text-search-advanced-queries,full_text_search,full-text-search-advanced-queries,Advanced SQLite search queries,"SQLite full-text search includes support for a variety of advanced queries , including AND , OR , NOT and NEAR .
By default Datasette disables these features to ensure they do not cause errors or confusion for users who are not aware of them. You can disable this escaping and use the advanced queries by adding &_searchmode=raw to the table page query string.
If you want to enable these operators by default for a specific table, you can do so by adding ""searchmode"": ""raw"" to the metadata configuration for that table, see Configuring full-text search for a table or view .
If that option has been specified in the table metadata but you want to over-ride it and return to the default behavior you can append &_searchmode=escaped to the query string.","[""Full-text search""]","[{""href"": ""https://www.sqlite.org/fts5.html#full_text_query_syntax"", ""label"": ""a variety of advanced queries""}]"
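For example, this URL would run a search using the NOT operator - the database and table names are placeholders:
/dbname/tablename?_search=dog+NOT+cat&_searchmode=raw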
full_text_search:full-text-search-custom-sql,full_text_search,full-text-search-custom-sql,Searches using custom SQL,"You can include full-text search results in custom SQL queries. The general pattern with SQLite search is to run the search as a sub-select that returns rowid values, then include those rowids in another part of the query.
You can see the syntax for a basic search by running that search on a table page and then clicking ""View and edit SQL"" to see the underlying SQL. For example, consider this search for manafort in the US FARA database :
/fara/FARA_All_ShortForms?_search=manafort
If you click View and edit SQL you'll see that the underlying SQL looks like this:
select
rowid,
Short_Form_Termination_Date,
Short_Form_Date,
Short_Form_Last_Name,
Short_Form_First_Name,
Registration_Number,
Registration_Date,
Registrant_Name,
Address_1,
Address_2,
City,
State,
Zip
from
FARA_All_ShortForms
where
rowid in (
select
rowid
from
FARA_All_ShortForms_fts
where
FARA_All_ShortForms_fts match escape_fts(:search)
)
order by
rowid
limit
101","[""Full-text search""]","[{""href"": ""https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort"", ""label"": ""manafort is the US FARA database""}, {""href"": ""https://fara.datasettes.com/fara?sql=select%0D%0A++rowid%2C%0D%0A++Short_Form_Termination_Date%2C%0D%0A++Short_Form_Date%2C%0D%0A++Short_Form_Last_Name%2C%0D%0A++Short_Form_First_Name%2C%0D%0A++Registration_Number%2C%0D%0A++Registration_Date%2C%0D%0A++Registrant_Name%2C%0D%0A++Address_1%2C%0D%0A++Address_2%2C%0D%0A++City%2C%0D%0A++State%2C%0D%0A++Zip%0D%0Afrom%0D%0A++FARA_All_ShortForms%0D%0Awhere%0D%0A++rowid+in+%28%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++FARA_All_ShortForms_fts%0D%0A++++where%0D%0A++++++FARA_All_ShortForms_fts+match+escape_fts%28%3Asearch%29%0D%0A++%29%0D%0Aorder+by%0D%0A++rowid%0D%0Alimit%0D%0A++101&search=manafort"", ""label"": ""View and edit SQL""}]"
full_text_search:full-text-search-enabling,full_text_search,full-text-search-enabling,Enabling full-text search for a SQLite table,"Datasette takes advantage of the external content mechanism in SQLite, which allows a full-text search virtual table to be associated with the contents of another SQLite table.
To set up full-text search for a table, you need to do two things:
Create a new FTS virtual table associated with your table
Populate that FTS table with the data that you would like to be able to run searches against","[""Full-text search""]","[{""href"": ""https://www.sqlite.org/fts3.html#_external_content_fts4_tables_"", ""label"": ""external content""}]"
full_text_search:full-text-search-fts-versions,full_text_search,full-text-search-fts-versions,FTS versions,"There are three different versions of the SQLite FTS module: FTS3, FTS4 and FTS5. You can tell which versions are supported by your instance of Datasette by checking the /-/versions page.
FTS5 is the most advanced module but may not be available in the SQLite version that is bundled with your Python installation. Most importantly, FTS5 is the only version that has the ability to order by search relevance without needing extra code.
If you can't be sure that FTS5 will be available, you should use FTS4.","[""Full-text search""]",[]
full_text_search:full-text-search-table-or-view,full_text_search,full-text-search-table-or-view,Configuring full-text search for a table or view,"If a table has a corresponding FTS table set up using the content= argument to CREATE VIRTUAL TABLE shown below, Datasette will detect it automatically and add a search interface to the table page for that table.
You can also manually configure which table should be used for full-text search using query string parameters or Metadata . You can set the associated FTS table for a specific table and you can also set one for a view - if you do that, the page for that SQL view will offer a search option.
Use ?_fts_table=x to over-ride the FTS table for a specific page. If the primary key was something other than rowid you can use ?_fts_pk=col to set that as well. This is particularly useful for views, for example:
https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk
The fts_table metadata property can be used to specify an associated FTS table. If the primary key column in your table which was used to populate the FTS table is something other than rowid , you can specify the column to use with the fts_pk property.
The ""searchmode"": ""raw"" property can be used to default the table to accepting SQLite advanced search operators, as described in Advanced SQLite search queries .
Here is an example which enables full-text search (with SQLite advanced search operators) for a display_ads view which is defined against the ads table and hence needs to run FTS against the ads_fts table, using the id as the primary key:
[[[cog
from metadata_doc import metadata_example
metadata_example(cog, {
""databases"": {
""russian-ads"": {
""tables"": {
""display_ads"": {
""fts_table"": ""ads_fts"",
""fts_pk"": ""id"",
""searchmode"": ""raw""
}
}
}
}
})
]]]
[[[end]]]","[""Full-text search""]","[{""href"": ""https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk"", ""label"": ""https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk""}]"
full_text_search:full-text-search-table-view-api,full_text_search,full-text-search-table-view-api,The table page and table view API,"Table views that support full-text search can be queried using the ?_search=TERMS query string parameter. This will run the search against content from all of the columns that have been included in the index.
Try this example: fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort
SQLite full-text search supports wildcards. This means you can easily implement prefix auto-complete by including an asterisk at the end of the search term - for example:
/dbname/tablename/?_search=rob*
This will return all records containing at least one word that starts with the letters rob .
You can also run searches against just the content of a specific named column by using _search_COLNAME=TERMS - for example, this would search for just rows where the name column in the FTS index mentions Sarah :
/dbname/tablename/?_search_name=Sarah","[""Full-text search""]","[{""href"": ""https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort"", ""label"": ""fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort""}]"
getting_started:getting-started-datasette-lite,getting_started,getting-started-datasette-lite,Datasette in your browser with Datasette Lite,"Datasette Lite is Datasette packaged using WebAssembly so that it runs entirely in your browser, no Python web application server required.
You can pass a URL to a CSV, SQLite or raw SQL file directly to Datasette Lite to explore that data in your browser.
This example link opens Datasette Lite and loads the SQL Murder Mystery example database from Northwestern University Knight Lab .","[""Getting started""]","[{""href"": ""https://lite.datasette.io/"", ""label"": ""Datasette Lite""}, {""href"": ""https://lite.datasette.io/?url=https%3A%2F%2Fraw.githubusercontent.com%2FNUKnightLab%2Fsql-mysteries%2Fmaster%2Fsql-murder-mystery.db#/sql-murder-mystery"", ""label"": ""example link""}, {""href"": ""https://github.com/NUKnightLab/sql-mysteries"", ""label"": ""Northwestern University Knight Lab""}]"
getting_started:getting-started-demo,getting_started,getting-started-demo,Play with a live demo,"The best way to experience Datasette for the first time is with a demo:
global-power-plants.datasettes.com provides a searchable database of power plants around the world, using data from the World Resources Institute rendered using the datasette-cluster-map plugin.
fivethirtyeight.datasettes.com shows Datasette running against over 400 datasets imported from the FiveThirtyEight GitHub repository .","[""Getting started""]","[{""href"": ""https://global-power-plants.datasettes.com/global-power-plants/global-power-plants"", ""label"": ""global-power-plants.datasettes.com""}, {""href"": ""https://www.wri.org/publication/global-power-plant-database"", ""label"": ""World Resources Institute""}, {""href"": ""https://github.com/simonw/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight"", ""label"": ""fivethirtyeight.datasettes.com""}, {""href"": ""https://github.com/fivethirtyeight/data"", ""label"": ""FiveThirtyEight GitHub repository""}]"
getting_started:getting-started-glitch,getting_started,getting-started-glitch,Try Datasette without installing anything using Glitch,"Glitch is a free online tool for building web apps directly from your web browser. You can use Glitch to try out Datasette without needing to install any software on your own computer.
Here's a demo project on Glitch which you can use as the basis for your own experiments:
glitch.com/~datasette-csvs
Glitch allows you to ""remix"" any project to create your own copy and start editing it in your browser. You can remix the datasette-csvs project by clicking this button:
Find a CSV file and drag it onto the Glitch file explorer panel - datasette-csvs will automatically convert it to a SQLite database (using sqlite-utils ) and allow you to start exploring it using Datasette.
If your CSV file has a latitude and longitude column you can visualize it on a map by uncommenting the datasette-cluster-map line in the requirements.txt file using the Glitch file editor.
Need some data? Try this Public Art Data for the city of Seattle - hit ""Export"" and select ""CSV"" to download it as a CSV file.
For more on how this works, see Running Datasette on Glitch .","[""Getting started""]","[{""href"": ""https://glitch.com/"", ""label"": ""Glitch""}, {""href"": ""https://glitch.com/~datasette-csvs"", ""label"": ""glitch.com/~datasette-csvs""}, {""href"": ""https://glitch.com/edit/#!/remix/datasette-csvs"", ""label"": null}, {""href"": ""https://github.com/simonw/sqlite-utils"", ""label"": ""sqlite-utils""}, {""href"": ""https://data.seattle.gov/Community/Public-Art-Data/j7sn-tdzk"", ""label"": ""Public Art Data""}, {""href"": ""https://simonwillison.net/2019/Apr/23/datasette-glitch/"", ""label"": ""Running Datasette on Glitch""}]"
getting_started:getting-started-tutorial,getting_started,getting-started-tutorial,Follow a tutorial,"Datasette has several tutorials to help you get started with the tool. Try one of the following:
Exploring a database with Datasette shows how to use the Datasette web interface to explore a new database.
Learn SQL with Datasette introduces SQL, and shows how to use that query language to ask questions of your data.
Cleaning data with sqlite-utils and Datasette guides you through using sqlite-utils to turn a CSV file into a database that you can explore using Datasette.","[""Getting started""]","[{""href"": ""https://datasette.io/tutorials"", ""label"": ""tutorials""}, {""href"": ""https://datasette.io/tutorials/explore"", ""label"": ""Exploring a database with Datasette""}, {""href"": ""https://datasette.io/tutorials/learn-sql"", ""label"": ""Learn SQL with Datasette""}, {""href"": ""https://datasette.io/tutorials/clean-data"", ""label"": ""Cleaning data with sqlite-utils and Datasette""}, {""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}]"
getting_started:getting-started-your-computer,getting_started,getting-started-your-computer,Using Datasette on your own computer,"First, follow the Installation instructions. Now you can run Datasette against a SQLite file on your computer using the following command:
datasette path/to/database.db
This will start a web server on port 8001 - visit http://localhost:8001/
to access the web interface.
Add -o to open your browser automatically once Datasette has started:
datasette path/to/database.db -o
Use Chrome on OS X? You can run datasette against your browser history
like so:
datasette ~/Library/Application\ Support/Google/Chrome/Default/History --nolock
The --nolock option ignores any file locks. This is safe as Datasette will open the file in read-only mode.
Now visiting http://localhost:8001/History/downloads will show you a web
interface to browse your downloads data:
http://localhost:8001/History/downloads.json will return that data as
JSON:
{
""database"": ""History"",
""columns"": [
""id"",
""current_path"",
""target_path"",
""start_time"",
""received_bytes"",
""total_bytes"",
...
],
""rows"": [
[
1,
""/Users/simonw/Downloads/DropboxInstaller.dmg"",
""/Users/simonw/Downloads/DropboxInstaller.dmg"",
13097290269022132,
626688,
0,
...
]
]
}
http://localhost:8001/History/downloads.json?_shape=objects will return that data as
JSON in a more convenient format:
{
...
""rows"": [
{
""start_time"": 13097290269022132,
""interrupt_reason"": 0,
""hash"": """",
""id"": 1,
""site_url"": """",
""referrer"": ""https://www.dropbox.com/downloading?src=index"",
...
}
]
}","[""Getting started""]","[{""href"": ""http://localhost:8001/"", ""label"": ""http://localhost:8001/""}, {""href"": ""http://localhost:8001/History/downloads"", ""label"": ""http://localhost:8001/History/downloads""}, {""href"": ""http://localhost:8001/History/downloads.json"", ""label"": ""http://localhost:8001/History/downloads.json""}, {""href"": ""http://localhost:8001/History/downloads.json?_shape=objects"", ""label"": ""http://localhost:8001/History/downloads.json?_shape=objects""}]"
installation:installing-plugins,installation,installing-plugins,Installing plugins,"If you want to install plugins into your local Datasette Docker image you can do
so using the following recipe. This will install the plugins and then save a
brand new local image called datasette-with-plugins :
docker run datasetteproject/datasette \
pip install datasette-vega
docker commit $(docker ps -lq) datasette-with-plugins
You can now run the new custom image like so:
docker run -p 8001:8001 -v `pwd`:/mnt \
datasette-with-plugins \
datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db
You can confirm that the plugins are installed by visiting
http://127.0.0.1:8001/-/plugins
Some plugins such as datasette-ripgrep may need additional system packages. You can install these by running apt-get install inside the container:
docker run datasette-057a0 bash -c '
apt-get update &&
apt-get install ripgrep &&
pip install datasette-ripgrep'
docker commit $(docker ps -lq) datasette-with-ripgrep","[""Installation"", ""Advanced installation options"", ""Using Docker""]","[{""href"": ""http://127.0.0.1:8001/-/plugins"", ""label"": ""http://127.0.0.1:8001/-/plugins""}, {""href"": ""https://datasette.io/plugins/datasette-ripgrep"", ""label"": ""datasette-ripgrep""}]"
installation:loading-spatialite,installation,loading-spatialite,Loading SpatiaLite,"The datasetteproject/datasette image includes a recent version of the
SpatiaLite extension for SQLite. To load and enable that
module, use the following command:
docker run -p 8001:8001 -v `pwd`:/mnt \
datasetteproject/datasette \
datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \
--load-extension=spatialite
You can confirm that SpatiaLite is successfully loaded by visiting
http://127.0.0.1:8001/-/versions","[""Installation"", ""Advanced installation options"", ""Using Docker""]","[{""href"": ""http://127.0.0.1:8001/-/versions"", ""label"": ""http://127.0.0.1:8001/-/versions""}]"
installation:installing-plugins-using-pipx,installation,installing-plugins-using-pipx,Installing plugins using pipx,"You can install additional datasette plugins with pipx inject like so:
pipx inject datasette datasette-json-html
injected package datasette-json-html into venv datasette
done! ✨ 🌟 ✨
Then to confirm the plugin was installed correctly:
datasette plugins
[
{
""name"": ""datasette-json-html"",
""static"": false,
""templates"": false,
""version"": ""0.6""
}
]","[""Installation"", ""Advanced installation options"", ""Using pipx""]",[]
installation:upgrading-packages-using-pipx,installation,upgrading-packages-using-pipx,Upgrading packages using pipx,"You can upgrade your pipx installation to the latest release of Datasette using pipx upgrade datasette :
pipx upgrade datasette
upgraded package datasette from 0.39 to 0.40 (location: /Users/simon/.local/pipx/venvs/datasette)
To upgrade a plugin within the pipx environment use pipx runpip datasette install -U name-of-plugin - like this:
datasette plugins
[
{
""name"": ""datasette-vega"",
""static"": true,
""templates"": false,
""version"": ""0.6""
}
]
Now upgrade the plugin:
pipx runpip datasette install -U datasette-vega
Collecting datasette-vega
Downloading datasette_vega-0.6.2-py3-none-any.whl (1.8 MB)
|████████████████████████████████| 1.8 MB 2.0 MB/s
...
Installing collected packages: datasette-vega
Attempting uninstall: datasette-vega
Found existing installation: datasette-vega 0.6
Uninstalling datasette-vega-0.6:
Successfully uninstalled datasette-vega-0.6
Successfully installed datasette-vega-0.6.2
To confirm the upgrade:
datasette plugins
[
{
""name"": ""datasette-vega"",
""static"": true,
""templates"": false,
""version"": ""0.6.2""
}
]","[""Installation"", ""Advanced installation options"", ""Using pipx""]",[]
installation:installation-docker,installation,installation-docker,Using Docker,"A Docker image containing the latest release of Datasette is published to Docker
Hub here: https://hub.docker.com/r/datasetteproject/datasette/
If you have Docker installed (for example with Docker for Mac on OS X) you can download and run this
image like so:
docker run -p 8001:8001 -v `pwd`:/mnt \
datasetteproject/datasette \
datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db
This will start an instance of Datasette running on your machine's port 8001,
serving the fixtures.db file in your current directory.
Now visit http://127.0.0.1:8001/ to access Datasette.
(You can download a copy of fixtures.db from
https://latest.datasette.io/fixtures.db )
To upgrade to the most recent release of Datasette, run the following:
docker pull datasetteproject/datasette","[""Installation"", ""Advanced installation options""]","[{""href"": ""https://hub.docker.com/r/datasetteproject/datasette/"", ""label"": ""https://hub.docker.com/r/datasetteproject/datasette/""}, {""href"": ""https://www.docker.com/docker-mac"", ""label"": ""Docker for Mac""}, {""href"": ""http://127.0.0.1:8001/"", ""label"": ""http://127.0.0.1:8001/""}, {""href"": ""https://latest.datasette.io/fixtures.db"", ""label"": ""https://latest.datasette.io/fixtures.db""}]"
installation:installation-pipx,installation,installation-pipx,Using pipx,"pipx is a tool for installing Python software with all of its dependencies in an isolated environment, to ensure that they will not conflict with any other installed Python software.
If you use Homebrew on macOS you can install pipx like this:
brew install pipx
pipx ensurepath
Without Homebrew you can install it like so:
python3 -m pip install --user pipx
python3 -m pipx ensurepath
The pipx ensurepath command configures your shell to ensure it can find commands that have been installed by pipx - generally by making sure ~/.local/bin has been added to your PATH .
Once pipx is installed you can use it to install Datasette like this:
pipx install datasette
Then run datasette --version to confirm that it has been successfully installed.","[""Installation"", ""Advanced installation options""]","[{""href"": ""https://pipxproject.github.io/pipx/"", ""label"": ""pipx""}, {""href"": ""https://brew.sh/"", ""label"": ""Homebrew""}]"
installation:installation-datasette-desktop,installation,installation-datasette-desktop,Datasette Desktop for Mac,Datasette Desktop is a packaged Mac application which bundles Datasette together with Python and allows you to install and run Datasette directly on your laptop. This is the best option for local installation if you are not comfortable using the command line.,"[""Installation"", ""Basic installation""]","[{""href"": ""https://datasette.io/desktop"", ""label"": ""Datasette Desktop""}]"
installation:installation-homebrew,installation,installation-homebrew,Using Homebrew,"If you have a Mac and use Homebrew , you can install Datasette by running this command in your terminal:
brew install datasette
This should install the latest version. You can confirm by running:
datasette --version
You can upgrade to the latest Homebrew packaged version using:
brew upgrade datasette
Once you have installed Datasette you can install plugins using the following:
datasette install datasette-vega
If the latest packaged release of Datasette has not yet been made available through Homebrew, you can upgrade your Homebrew installation in-place using:
datasette install -U datasette","[""Installation"", ""Basic installation""]","[{""href"": ""https://brew.sh/"", ""label"": ""Homebrew""}]"
installation:installation-pip,installation,installation-pip,Using pip,"Datasette requires Python 3.8 or higher. The Python.org Python For Beginners page has instructions for getting started.
You can install Datasette and its dependencies using pip :
pip install datasette
You can now run Datasette like so:
datasette","[""Installation"", ""Basic installation""]","[{""href"": ""https://www.python.org/about/gettingstarted/"", ""label"": ""Python.org Python For Beginners""}]"
installation:installation-advanced,installation,installation-advanced,Advanced installation options,,"[""Installation""]",[]
installation:installation-basic,installation,installation-basic,Basic installation,,"[""Installation""]",[]
installation:installation-extensions,installation,installation-extensions,A note about extensions,"SQLite supports extensions, such as SpatiaLite for geospatial operations.
These can be loaded using the --load-extension argument, like so:
datasette --load-extension=/usr/local/lib/mod_spatialite.dylib
Some Python installations do not include support for SQLite extensions. If this is the case you will see the following error when you attempt to load an extension:
Your Python installation does not have the ability to load SQLite extensions.
In some cases you may see the following error message instead:
AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension'
On macOS the easiest fix for this is to install Datasette using Homebrew:
brew install datasette
Use which datasette to confirm that datasette will run that version. The output should look something like this:
/usr/local/opt/datasette/bin/datasette
If you get a different location here such as /Library/Frameworks/Python.framework/Versions/3.10/bin/datasette you can run the following command to cause datasette to execute the Homebrew version instead:
alias datasette=$(echo $(brew --prefix datasette)/bin/datasette)
You can undo this operation using:
unalias datasette
If you need to run SQLite with extension support for other Python code, you can do so by installing Python itself using Homebrew:
brew install python
Then execute Python using:
/usr/local/opt/python@3/libexec/bin/python
A more convenient way to work with this version of Python may be to use it to create a virtual environment:
/usr/local/opt/python@3/libexec/bin/python -m venv datasette-venv
Then activate it like this:
source datasette-venv/bin/activate
Now running python and pip will work against a version of Python 3 that includes support for SQLite extensions:
pip install datasette
which datasette
datasette --version","[""Installation""]",[]
internals:database-close,internals,database-close,db.close(),"Closes all of the open connections to file-backed databases. This is mainly intended to be used by large test suites, to avoid hitting limits on the number of open files.","[""Internals for plugins"", ""Database class""]",[]
internals:database-constructor,internals,database-constructor,"Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None)","The Database() constructor can be used by plugins, in conjunction with .add_database(db, name=None, route=None) , to create and register new databases.
The arguments are as follows:
ds - Datasette class (required)
The Datasette instance you are attaching this database to.
path - string
Path to a SQLite database file on disk.
is_mutable - boolean
Set this to False to cause Datasette to open the file in immutable mode.
is_memory - boolean
Use this to create non-shared memory connections.
memory_name - string or None
Use this to create a named in-memory database. Unlike regular memory databases these can be accessed by multiple threads and will persist any changes made to them for the lifetime of the Datasette server process.
The first argument is the datasette instance you are attaching to, the second is a path= , then is_mutable and is_memory are both optional arguments.","[""Internals for plugins"", ""Database class""]",[]
internals:database-execute,internals,database-execute,"await db.execute(sql, ...)","Executes a SQL query against the database and returns the resulting rows (see Results ).
sql - string (required)
The SQL query to execute. This can include ? or :named parameters.
params - list or dict
A list or dictionary of values to use for the parameters. List for ? , dictionary for :named .
truncate - boolean
Should the rows returned by the query be truncated at the maximum page size? Defaults to True , set this to False to disable truncation.
custom_time_limit - integer ms
A custom time limit for this query. This can be set to a lower value than the Datasette configured default. If a query takes longer than this it will be terminated early and raise a datasette.database.QueryInterrupted exception.
page_size - integer
Set a custom page size for truncation, over-riding the configured Datasette default.
log_sql_errors - boolean
Should any SQL errors be logged to the console in addition to being raised as an error? Defaults to True .","[""Internals for plugins"", ""Database class""]",[]
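For example, a plugin could run a parameterized query like this (the articles table and its columns here are hypothetical):
results = await db.execute(
    "select title from articles where author = :author",
    {"author": "cleopaws"},
)
for row in results:
    print(row["title"])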
internals:database-execute-fn,internals,database-execute-fn,await db.execute_fn(fn),"Executes a given callback function against a read-only database connection running in a thread. The function will be passed a SQLite connection, and the return value from the function will be returned by the await .
Example usage:
def get_version(conn):
return conn.execute(
""select sqlite_version()""
).fetchall()[0][0]
version = await db.execute_fn(get_version)","[""Internals for plugins"", ""Database class""]",[]
internals:database-execute-isolated-fn,internals,database-execute-isolated-fn,await db.execute_isolated_fn(fn),"This method is similar to execute_write_fn() but executes the provided function in an entirely isolated SQLite connection, which is opened, used and then closed again in a single call to this method.
The prepare_connection() plugin hook is not executed against this connection.
This allows plugins to execute database operations that might conflict with how database connections are usually configured. For example, running a VACUUM operation while bypassing any restrictions placed by the datasette-sqlite-authorizer plugin.
Plugins can also use this method to load potentially dangerous SQLite extensions, use them to perform an operation and then have them safely unloaded at the end of the call, without risk of exposing them to other connections.
Functions run using execute_isolated_fn() share the same queue as execute_write_fn() , which guarantees that no writes can be executed at the same time as the isolated function is executing.
The return value of the function will be returned by this method. Any exceptions raised by the function will be raised out of the await line as well.","[""Internals for plugins"", ""Database class""]","[{""href"": ""https://github.com/datasette/datasette-sqlite-authorizer"", ""label"": ""datasette-sqlite-authorizer""}]"
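A minimal sketch of the VACUUM example described above:
def run_vacuum(conn):
    conn.execute("VACUUM")

await db.execute_isolated_fn(run_vacuum)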
internals:database-execute-write,internals,database-execute-write,"await db.execute_write(sql, params=None, block=True)","SQLite only allows one database connection to write at a time. Datasette handles this for you by maintaining a queue of writes to be executed against a given database. Plugins can submit write operations to this queue and they will be executed in the order in which they are received.
This method can be used to queue up a non-SELECT SQL query to be executed against a single write connection to the database.
You can pass additional SQL parameters as a tuple or dictionary.
The method will block until the operation is completed, and the return value will be the return from calling conn.execute(...) using the underlying sqlite3 Python library.
If you pass block=False this behavior changes to ""fire and forget"" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.
Each call to execute_write() will be executed inside a transaction.","[""Internals for plugins"", ""Database class""]",[]
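For example, to queue up a single insert (the events table here is hypothetical):
await db.execute_write(
    "insert into events (name) values (?)",
    ("signup",),
)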
internals:database-execute-write-fn,internals,database-execute-write-fn,"await db.execute_write_fn(fn, block=True, transaction=True)","This method works like .execute_write() , but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function.
The function can then perform multiple actions, safe in the knowledge that it has exclusive access to the single writable connection for as long as it is executing.
fn needs to be a regular function, not an async def function.
For example:
def delete_and_return_count(conn):
conn.execute(""delete from some_table where id > 5"")
return conn.execute(
""select count(*) from some_table""
).fetchone()[0]
try:
num_rows_left = await database.execute_write_fn(
delete_and_return_count
)
except Exception as e:
print(""An error occurred:"", e)
The value returned from await database.execute_write_fn(...) will be the return value from your function.
If your function raises an exception that exception will be propagated up to the await line.
By default your function will be executed inside a transaction. You can pass transaction=False to disable this behavior, though if you do that you should be careful to manually apply transactions - ideally using the with conn: pattern, or you may see OperationalError: database table is locked errors.
If you specify block=False the method becomes fire-and-forget, queueing your function to be executed and then allowing your code after the call to .execute_write_fn() to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned. Any exceptions in your code will be silently swallowed.","[""Internals for plugins"", ""Database class""]",[]
internals:database-execute-write-many,internals,database-execute-write-many,"await db.execute_write_many(sql, params_seq, block=True)","Like execute_write() but uses the sqlite3 conn.executemany() method. This will efficiently execute the same SQL statement against each of the parameters in the params_seq iterator, for example:
await db.execute_write_many(
""insert into characters (id, name) values (?, ?)"",
[(1, ""Melanie""), (2, ""Selma""), (2, ""Viktor"")],
)
Each call to execute_write_many() will be executed inside a transaction.","[""Internals for plugins"", ""Database class""]","[{""href"": ""https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executemany"", ""label"": ""conn.executemany()""}]"
internals:database-execute-write-script,internals,database-execute-write-script,"await db.execute_write_script(sql, block=True)","Like execute_write() but can be used to send multiple SQL statements in a single string separated by semicolons, using the sqlite3 conn.executescript() method.
Each call to execute_write_script() will be executed inside a transaction.","[""Internals for plugins"", ""Database class""]","[{""href"": ""https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executescript"", ""label"": ""conn.executescript()""}]"
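A short sketch, using a hypothetical logs table:
await db.execute_write_script(
    """
    create table if not exists logs (line text);
    create index if not exists logs_line on logs(line);
    """
)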
internals:database-hash,internals,database-hash,db.hash,"If the database was opened in immutable mode, this property returns the 64 character SHA-256 hash of the database contents as a string. Otherwise it returns None .","[""Internals for plugins"", ""Database class""]",[]
internals:database-results,internals,database-results,Results,"The db.execute() method returns a single Results object. This can be used to access the rows returned by the query.
Iterating over a Results object will yield SQLite Row objects . Each of these can be treated as a tuple or can be accessed using row[""column""] syntax:
info = []
results = await db.execute(""select name from sqlite_master"")
for row in results:
info.append(row[""name""])
The Results object also has the following properties and methods:
.truncated - boolean
Indicates if this query was truncated - if it returned more results than the specified page_size . If this is true then the results object will only provide access to the first page_size rows in the query result. You can disable truncation by passing truncate=False to the db.execute() method.
.columns - list of strings
A list of column names returned by the query.
.rows - list of sqlite3.Row
This property provides direct access to the list of rows returned by the database. You can access specific rows by index using results.rows[0] .
.first() - row or None
Returns the first row in the results, or None if no rows were returned.
.single_value()
Returns the value of the first column of the first row of results - but only if the query returned a single row with a single column. Raises a datasette.database.MultipleValues exception otherwise.
.__len__()
Calling len(results) returns the (truncated) number of returned results.","[""Internals for plugins"", ""Database class""]","[{""href"": ""https://docs.python.org/3/library/sqlite3.html#row-objects"", ""label"": ""Row objects""}]"
internals:internals-database-introspection,internals,internals-database-introspection,Database introspection,"The Database class also provides properties and methods for introspecting the database.
db.name - string
The name of the database - usually the filename without the .db extension.
db.size - integer
The size of the database file in bytes. 0 for :memory: databases.
db.mtime_ns - integer or None
The last modification time of the database file in nanoseconds since the epoch. None for :memory: databases.
db.is_mutable - boolean
Is this database mutable, and allowed to accept writes?
db.is_memory - boolean
Is this database an in-memory database?
await db.attached_databases() - list of named tuples
Returns a list of additional databases that have been connected to this database using the SQLite ATTACH command. Each named tuple has fields seq , name and file .
await db.table_exists(table) - boolean
Check if a table called table exists.
await db.view_exists(view) - boolean
Check if a view called view exists.
await db.table_names() - list of strings
List of names of tables in the database.
await db.view_names() - list of strings
List of names of views in the database.
await db.table_columns(table) - list of strings
Names of columns in a specific table.
await db.table_column_details(table) - list of named tuples
Full details of the columns in a specific table. Each column is represented by a Column named tuple with fields cid (integer representing the column position), name (string), type (string, e.g. REAL or VARCHAR(30) ), notnull (integer 1 or 0), default_value (string or None), is_pk (integer 1 or 0).
await db.primary_keys(table) - list of strings
Names of the columns that are part of the primary key for this table.
await db.fts_table(table) - string or None
The name of the FTS table associated with this table, if one exists.
await db.label_column_for_table(table) - string or None
The label column that is associated with this table - either automatically detected or using the ""label_column"" key from Metadata , see Specifying the label column for a table .
await db.foreign_keys_for_table(table) - list of dictionaries
Details of columns in this table which are foreign keys to other tables. A list of dictionaries where each dictionary is shaped like this: {""column"": string, ""other_table"": string, ""other_column"": string} .
await db.hidden_table_names() - list of strings
List of tables which Datasette ""hides"" by default - usually these are tables associated with SQLite's full-text search feature, the SpatiaLite extension or tables hidden using the Hiding tables feature.
await db.get_table_definition(table) - string
Returns the SQL definition for the table - the CREATE TABLE statement and any associated CREATE INDEX statements.
await db.get_view_definition(view) - string
Returns the SQL definition of the named view.
await db.get_all_foreign_keys() - dictionary
Dictionary representing both incoming and outgoing foreign keys for this table. It has two keys, ""incoming"" and ""outgoing"" , each of which is a list of dictionaries with keys ""column"" , ""other_table"" and ""other_column"" . For example:
{
""incoming"": [],
""outgoing"": [
{
""other_table"": ""attraction_characteristic"",
""column"": ""characteristic_id"",
""other_column"": ""pk"",
},
{
""other_table"": ""roadside_attractions"",
""column"": ""attraction_id"",
""other_column"": ""pk"",
}
]
}","[""Internals for plugins"", ""Database class""]",[]
internals:datasette-absolute-url,internals,datasette-absolute-url,".absolute_url(request, path)","request - Request
The current Request object
path - string
A path, for example /dbname/table.json
Returns the absolute URL for the given path, including the protocol and host. For example:
absolute_url = datasette.absolute_url(
request, ""/dbname/table.json""
)
# Would return ""http://localhost:8001/dbname/table.json""
The current request object is used to determine the hostname and protocol that should be used for the returned URL. The force_https_urls configuration setting is taken into account.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-actors-from-ids,internals,datasette-actors-from-ids,await .actors_from_ids(actor_ids),"actor_ids - list of strings or integers
A list of actor IDs to look up.
Returns a dictionary, where the keys are the IDs passed to it and the values are the corresponding actor dictionaries.
This method is mainly designed to be used with plugins. See the actors_from_ids(datasette, actor_ids) documentation for details.
If no plugins that implement that hook are installed, the default return value looks like this:
{
""1"": {""id"": ""1""},
""2"": {""id"": ""2""}
}","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-add-database,internals,datasette-add-database,".add_database(db, name=None, route=None)","db - datasette.database.Database instance
The database to be attached.
name - string, optional
The name to be used for this database . If not specified Datasette will pick one based on the filename or memory name.
route - string, optional
This will be used in the URL path. If not specified, it will default to the same thing as the name .
The datasette.add_database(db) method lets you add a new database to the current Datasette instance.
The db parameter should be an instance of the datasette.database.Database class. For example:
from datasette.database import Database
datasette.add_database(
Database(
datasette,
path=""path/to/my-new-database.db"",
)
)
This will add a mutable database and serve it at /my-new-database .
Use is_mutable=False to add an immutable database.
.add_database() returns the Database instance, with its name set as the database.name attribute. Any time you are working with a newly added database you should use the return value of .add_database() , for example:
db = datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
await db.execute_write(
""CREATE TABLE foo(id integer primary key)""
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-add-memory-database,internals,datasette-add-memory-database,.add_memory_database(name),"Adds a shared in-memory database with the specified name:
datasette.add_memory_database(""statistics"")
This is a shortcut for the following:
from datasette.database import Database
datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
Using either of these patterns will result in the in-memory database being served at /statistics .","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-add-message,internals,datasette-add-message,".add_message(request, message, type=datasette.INFO)","request - Request
The current Request object
message - string
The message string
type - constant, optional
The message type - datasette.INFO , datasette.WARNING or datasette.ERROR
Datasette's flash messaging mechanism allows you to add a message that will be displayed to the user on the next page that they visit. Messages are persisted in a ds_messages cookie. This method adds a message to that cookie.
You can try out these messages (including the different visual styling of the three message types) using the /-/messages debugging tool.","[""Internals for plugins"", ""Datasette class""]",[]
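For example (the message text here is arbitrary):
datasette.add_message(
    request, "Settings have been saved", datasette.INFO
)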
internals:datasette-check-visibility,internals,datasette-check-visibility,"await .check_visibility(actor, action=None, resource=None, permissions=None)","actor - dictionary
The authenticated actor. This is usually request.actor .
action - string, optional
The name of the action that is being permission checked.
resource - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
permissions - list of action strings or (action, resource) tuples, optional
Provide this instead of action and resource to check multiple permissions at once.
This convenience method can be used to answer the question ""should this item be considered private, in that it is visible to me but it is not visible to anonymous users?""
It returns a tuple of two booleans, (visible, private) . visible indicates if the actor can see this resource. private will be True if an anonymous user would not be able to view the resource.
This example checks if the user can access a specific table, and sets private so that a padlock icon can later be displayed:
visible, private = await datasette.check_visibility(
request.actor,
action=""view-table"",
resource=(database, table),
)
The following example runs three checks in a row, similar to await .ensure_permissions(actor, permissions) . If any of the checks are denied before one of them is explicitly granted then visible will be False . private will be True if an anonymous user would not be able to view the resource.
visible, private = await datasette.check_visibility(
request.actor,
permissions=[
(""view-table"", (database, table)),
(""view-database"", database),
""view-instance"",
],
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-create-token,internals,datasette-create-token,".create_token(actor_id, expires_after=None, restrict_all=None, restrict_database=None, restrict_resource=None)","actor_id - string
The ID of the actor to create a token for.
expires_after - int, optional
The number of seconds after which the token should expire.
restrict_all - iterable, optional
A list of actions that this token should be restricted to across all databases and resources.
restrict_database - dict, optional
For restricting actions within specific databases, e.g. {""mydb"": [""view-table"", ""view-query""]} .
restrict_resource - dict, optional
For restricting actions to specific resources (tables, SQL views and Canned queries ) within a database. For example: {""mydb"": {""mytable"": [""insert-row"", ""update-row""]}} .
This method returns a signed API token of the format dstok_... which can be used to authenticate requests to the Datasette API.
All tokens must have an actor_id string indicating the ID of the actor which the token will act on behalf of.
Tokens default to lasting forever, but can be set to expire after a given number of seconds using the expires_after argument. The following code creates a token for user1 that will expire after an hour:
token = datasette.create_token(
actor_id=""user1"",
expires_after=3600,
)
The three restrict_* arguments can be used to create a token that has additional restrictions beyond what the associated actor is allowed to do.
The following example creates a token that can access view-instance and view-table across everything, can additionally use view-query for anything in the docs database and is allowed to execute insert-row and update-row in the attachments table in that database:
token = datasette.create_token(
actor_id=""user1"",
restrict_all=(""view-instance"", ""view-table""),
restrict_database={""docs"": (""view-query"",)},
restrict_resource={
""docs"": {
""attachments"": (""insert-row"", ""update-row"")
}
},
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-databases,internals,datasette-databases,.databases,"Property exposing a collections.OrderedDict of databases currently connected to Datasette.
The dictionary keys are the name of the database that is used in the URL - e.g. /fixtures would have a key of ""fixtures"" . The values are Database class instances.
All databases are listed, irrespective of user permissions.","[""Internals for plugins"", ""Datasette class""]",[]
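For example, to loop through the connected databases:
for name, database in datasette.databases.items():
    print(name, database.is_mutable)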
internals:datasette-ensure-permissions,internals,datasette-ensure-permissions,"await .ensure_permissions(actor, permissions)","actor - dictionary
The authenticated actor. This is usually request.actor .
permissions - list
A list of permissions to check. Each permission in that list can be a string action name or a 2-tuple of (action, resource) .
This method allows multiple permissions to be checked at once. It raises a datasette.Forbidden exception if any of the checks are denied before one of them is explicitly granted.
This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if at least one of the following checks returns True before any of them returns False :
await datasette.ensure_permissions(
request.actor,
[
(""view-table"", (database, table)),
(""view-database"", database),
""view-instance"",
],
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-get-database,internals,datasette-get-database,.get_database(name),"name - string, optional
The name of the database - optional.
Returns the specified database object. Raises a KeyError if the database does not exist. Call this method without an argument to return the first connected database.","[""Internals for plugins"", ""Datasette class""]",[]
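For example:
fixtures = datasette.get_database("fixtures")
first = datasette.get_database()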
internals:datasette-get-permission,internals,datasette-get-permission,.get_permission(name_or_abbr),"name_or_abbr - string
The name or abbreviation of the permission to look up, e.g. view-table or vt .
Returns a Permission object representing the permission, or raises a KeyError if one is not found.","[""Internals for plugins"", ""Datasette class""]",[]
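For example:
permission = datasette.get_permission("view-table")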
internals:datasette-permission-allowed,internals,datasette-permission-allowed,"await .permission_allowed(actor, action, resource=None, default=...)","actor - dictionary
The authenticated actor. This is usually request.actor .
action - string
The name of the action that is being permission checked.
resource - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
default - optional: True, False or None
What value should be returned by default if nothing provides an opinion on this permission check.
Set to True for default allow or False for default deny.
If not specified the default from the Permission() tuple that was registered using register_permissions(datasette) will be used.
Check if the given actor has permission to perform the given action on the given resource.
Some permission checks are carried out against rules defined in datasette.yaml , while other custom permissions may be decided by plugins that implement the permission_allowed(datasette, actor, action, resource) plugin hook.
If neither datasette.yaml nor any of the plugins provide an answer to the permission query the default argument will be returned.
See Built-in permissions for a full list of permission actions included in Datasette core.","[""Internals for plugins"", ""Datasette class""]",[]
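For example, to check if the current actor can view a specific table:
allowed = await datasette.permission_allowed(
    request.actor,
    "view-table",
    resource=("fixtures", "facetable"),
)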
internals:datasette-permissions,internals,datasette-permissions,.permissions,"Property exposing a dictionary of permissions that have been registered using the register_permissions(datasette) plugin hook.
The dictionary keys are the permission names - e.g. view-instance - and the values are Permission() objects describing the permission. Here is a description of that object .","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-plugin-config,internals,datasette-plugin-config,".plugin_config(plugin_name, database=None, table=None)","plugin_name - string
The name of the plugin to look up configuration for. Usually this is something similar to datasette-cluster-map .
database - None or string
The database the user is interacting with.
table - None or string
The table the user is interacting with.
This method lets you read plugin configuration values that were set in datasette.yaml . See Writing plugins that accept configuration for full details of how this method should be used.
The return value will be the value from the configuration file - usually a dictionary.
If the plugin is not configured the return value will be None .","[""Internals for plugins"", ""Datasette class""]",[]
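A minimal sketch of how a plugin might read its own table-level configuration, falling back to an empty dictionary:
config = (
    datasette.plugin_config(
        "datasette-cluster-map",
        database="fixtures",
        table="roadside_attractions",
    )
    or {}
)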
internals:datasette-remove-database,internals,datasette-remove-database,.remove_database(name),"name - string
The name of the database to be removed.
This removes a database that has been previously added. name= is the unique name of that database.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-render-template,internals,datasette-render-template,"await .render_template(template, context=None, request=None)","template - string, list of strings or jinja2.Template
The template file to be rendered, e.g. my_plugin.html . Datasette will search for this file first in the --template-dir= location, if it was specified - then in the plugin's bundled templates and finally in Datasette's set of default templates.
If this is a list of template file names then the first one that exists will be loaded and rendered.
If this is a Jinja Template object it will be used directly.
context - None or a Python dictionary
The context variables to pass to the template.
request - request object or None
If you pass a Datasette request object here it will be made available to the template.
Renders a Jinja template using Datasette's preconfigured instance of Jinja and returns the resulting string. The template will have access to Datasette's default template functions and any functions that have been made available by other plugins.","[""Internals for plugins"", ""Datasette class""]","[{""href"": ""https://jinja.palletsprojects.com/en/2.11.x/api/#jinja2.Template"", ""label"": ""Template object""}, {""href"": ""https://jinja.palletsprojects.com/en/2.11.x/"", ""label"": ""Jinja template""}]"
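For example, a plugin view could render its bundled template like this (my_plugin.html and the context values are illustrative):
html = await datasette.render_template(
    "my_plugin.html",
    {"message": "Hello from my plugin"},
    request=request,
)
return Response.html(html)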
internals:datasette-resolve-database,internals,datasette-resolve-database,.resolve_database(request),"request - Request object
A request object
If you are implementing your own custom views, you may need to resolve the database that the user is requesting based on a URL path. If the regular expression for your route declares a database named group, you can use this method to resolve the database object.
This returns a Database instance.
If the database cannot be found, it raises a datasette.utils.asgi.DatabaseNotFound exception - which is a subclass of datasette.utils.asgi.NotFound with a .database_name attribute set to the name of the database that was requested.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-resolve-row,internals,datasette-resolve-row,.resolve_row(request),"request - Request object
A request object
This method assumes your route declares named groups for database , table and pks .
It returns a ResolvedRow named tuple instance with the following fields:
db - Database
The database object
table - string
The name of the table
sql - string
SQL snippet that can be used in a WHERE clause to select the row
params - dict
Parameters that should be passed to the SQL query
pks - list
List of primary key column names
pk_values - list
List of primary key values decoded from the URL
row - sqlite3.Row
The row itself
If the database cannot be found it raises a datasette.utils.asgi.DatabaseNotFound exception.
If the table does not exist it raises a datasette.utils.asgi.TableNotFound exception.
If the row cannot be found it raises a datasette.utils.asgi.RowNotFound exception. This has .database_name , .table and .pk_values attributes, extracted from the request path.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-resolve-table,internals,datasette-resolve-table,.resolve_table(request),"request - Request object
A request object
This assumes that the regular expression for your route declares both a database and a table named group.
It returns a ResolvedTable named tuple instance with the following fields:
db - Database
The database object
table - string
The name of the table (or view)
is_view - boolean
True if this is a view, False if it is a table
If the database cannot be found it raises a datasette.utils.asgi.DatabaseNotFound exception.
If the table does not exist it raises a datasette.utils.asgi.TableNotFound exception - a subclass of datasette.utils.asgi.NotFound with .database_name and .table attributes.","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-setting,internals,datasette-setting,.setting(key),"key - string
The name of the setting, e.g. base_url .
Returns the configured value for the specified setting . This can be a string, boolean or integer depending on the requested setting.
For example:
downloads_are_allowed = datasette.setting(""allow_download"")","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-sign,internals,datasette-sign,".sign(value, namespace=""default"")","value - any serializable type
The value to be signed.
namespace - string, optional
An alternative namespace, see the itsdangerous salt documentation .
Utility method for signing values, such that you can safely pass data to and from an untrusted environment. This is a wrapper around the itsdangerous library.
This method returns a signed string, which can be decoded and verified using .unsign(value, namespace=""default"") .","[""Internals for plugins"", ""Datasette class""]","[{""href"": ""https://itsdangerous.palletsprojects.com/en/1.1.x/serializer/#the-salt"", ""label"": ""itsdangerous salt documentation""}, {""href"": ""https://itsdangerous.palletsprojects.com/"", ""label"": ""itsdangerous""}]"
internals:datasette-track-event,internals,datasette-track-event,await .track_event(event),"event - Event
An instance of a subclass of datasette.events.Event .
Plugins can call this to track events, using classes they have previously registered. See Event tracking for details.
The event will then be passed to all plugins that have registered to receive events using the track_event(datasette, event) hook.
Example usage, assuming the plugin has previously registered the BanUserEvent class:
await datasette.track_event(
BanUserEvent(user={""id"": 1, ""username"": ""cleverbot""})
)","[""Internals for plugins"", ""Datasette class""]",[]
internals:datasette-unsign,internals,datasette-unsign,".unsign(value, namespace=""default"")","signed - string
The signed string that was created using .sign(value, namespace=""default"") .
namespace - string, optional
The alternative namespace, if one was used.
Returns the original, decoded object that was passed to .sign(value, namespace=""default"") . If the signature is not valid this raises a itsdangerous.BadSignature exception.","[""Internals for plugins"", ""Datasette class""]",[]
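For example, signing a value and then verifying it:
signed = datasette.sign({"id": "cleopaws"}, namespace="actor")
decoded = datasette.unsign(signed, namespace="actor")
# decoded is {"id": "cleopaws"} again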
internals:id1,internals,id1,.get_internal_database(),Returns a database object for reading and writing to the private internal database .,"[""Internals for plugins"", ""Datasette class""]",[]
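A sketch of how a plugin might use this, with a hypothetical table for its own state:
internal_db = datasette.get_internal_database()
await internal_db.execute_write(
    "create table if not exists my_plugin_state (key text primary key, value text)"
)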
internals:internals-datasette-client,internals,internals-datasette-client,datasette.client,"Plugins can make internal simulated HTTP requests to the Datasette instance within which they are running. This ensures that all of Datasette's external JSON APIs are also available to plugins, while avoiding the overhead of making an external HTTP call to access those APIs.
The datasette.client object is a wrapper around the HTTPX Python library , providing an async-friendly API that is similar to the widely used Requests library .
It offers the following methods:
await datasette.client.get(path, **kwargs) - returns HTTPX Response
Execute an internal GET request against that path.
await datasette.client.post(path, **kwargs) - returns HTTPX Response
Execute an internal POST request. Use data={""name"": ""value""} to pass form parameters.
await datasette.client.options(path, **kwargs) - returns HTTPX Response
Execute an internal OPTIONS request.
await datasette.client.head(path, **kwargs) - returns HTTPX Response
Execute an internal HEAD request.
await datasette.client.put(path, **kwargs) - returns HTTPX Response
Execute an internal PUT request.
await datasette.client.patch(path, **kwargs) - returns HTTPX Response
Execute an internal PATCH request.
await datasette.client.delete(path, **kwargs) - returns HTTPX Response
Execute an internal DELETE request.
await datasette.client.request(method, path, **kwargs) - returns HTTPX Response
Execute an internal request with the given HTTP method against that path.
These methods can be used with datasette.urls - for example:
table_json = (
await datasette.client.get(
datasette.urls.table(
""fixtures"", ""facetable"", format=""json""
)
)
).json()
datasette.client methods automatically take the current base_url setting into account, whether or not you use the datasette.urls family of methods to construct the path.
For documentation on available **kwargs options and the shape of the HTTPX Response object refer to the HTTPX Async documentation .","[""Internals for plugins"", ""Datasette class""]","[{""href"": ""https://www.python-httpx.org/"", ""label"": ""HTTPX Python library""}, {""href"": ""https://requests.readthedocs.io/"", ""label"": ""Requests library""}, {""href"": ""https://www.python-httpx.org/async/"", ""label"": ""HTTPX Async documentation""}]"
internals:internals-datasette-urls,internals,internals-datasette-urls,datasette.urls,"The datasette.urls object contains methods for building URLs to pages within Datasette. Plugins should use this to link to pages, since these methods take into account any base_url configuration setting that might be in effect.
datasette.urls.instance(format=None)
Returns the URL to the Datasette instance root page. This is usually ""/"" .
datasette.urls.path(path, format=None)
Takes a path and returns the full path, taking base_url into account.
For example, datasette.urls.path(""-/logout"") will return the path to the logout page, which will be ""/-/logout"" by default or /prefix-path/-/logout if base_url is set to /prefix-path/
datasette.urls.logout()
Returns the URL to the logout page, usually ""/-/logout""
datasette.urls.static(path)
Returns the URL of one of Datasette's default static assets, for example ""/-/static/app.css""
datasette.urls.static_plugins(plugin_name, path)
Returns the URL of one of the static assets belonging to a plugin.
datasette.urls.static_plugins(""datasette_cluster_map"", ""datasette-cluster-map.js"") would return ""/-/static-plugins/datasette_cluster_map/datasette-cluster-map.js""
datasette.urls.database(database_name, format=None)
Returns the URL to a database page, for example ""/fixtures""
datasette.urls.table(database_name, table_name, format=None)
Returns the URL to a table page, for example ""/fixtures/facetable""
datasette.urls.query(database_name, query_name, format=None)
Returns the URL to a query page, for example ""/fixtures/pragma_cache_size""
These functions can be accessed via the {{ urls }} object in Datasette templates, for example:
<a href=""{{ urls.instance() }}"">Homepage</a>
<a href=""{{ urls.database(""fixtures"") }}"">Fixtures database</a>
<a href=""{{ urls.table(""fixtures"", ""facetable"") }}"">facetable table</a>
<a href=""{{ urls.query(""fixtures"", ""pragma_cache_size"") }}"">pragma_cache_size query</a>
Use the format=""json"" (or ""csv"" or other formats supported by plugins) arguments to get back URLs to the JSON representation. This is the path with .json added on the end.
These methods each return a datasette.utils.PrefixedUrlString object, which is a subclass of the Python str type. This allows the logic that considers the base_url setting to detect if that prefix has already been applied to the path.","[""Internals for plugins"", ""Datasette class""]",[]
internals:internals-response-asgi-send,internals,internals-response-asgi-send,Returning a response with .asgi_send(send),"In most cases you will return Response objects from your own view functions. You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook.
Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example:
async def require_authorization(scope, receive, send):
response = Response.text(
""401 Authorization Required"",
headers={
""www-authenticate"": 'Basic realm=""Datasette"", charset=""UTF-8""'
},
status=401,
)
await response.asgi_send(send)","[""Internals for plugins"", ""Response class""]",[]
internals:internals-response-set-cookie,internals,internals-response-set-cookie,Setting cookies with response.set_cookie(),"To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this:
def set_cookie(
self,
key,
value="""",
max_age=None,
expires=None,
path=""/"",
domain=None,
secure=False,
httponly=False,
samesite=""lax"",
): ...
You can use this with datasette.sign() to set signed cookies. Here's how you would set the ds_actor cookie for use with Datasette authentication :
response = Response.redirect(""/"")
response.set_cookie(
""ds_actor"",
datasette.sign({""a"": {""id"": ""cleopaws""}}, ""actor""),
)
return response","[""Internals for plugins"", ""Response class""]",[]
internals:internals-tilde-encoding,internals,internals-tilde-encoding,Tilde encoding,"Datasette uses a custom encoding scheme in some places, called tilde encoding . This is primarily used for table names and row primary keys, to avoid any confusion between / characters in those values and the Datasette URLs that reference them.
Tilde encoding uses the same algorithm as URL percent-encoding , but with the ~ tilde character used in place of % .
Any character other than ABCDEFGHIJKLMNOPQRSTUVWXYZ abcdefghijklmnopqrstuvwxyz0123456789_- will be replaced by the numeric equivalent preceded by a tilde. For example:
/ becomes ~2F
. becomes ~2E
% becomes ~25
~ becomes ~7E
Space becomes +
polls/2022.primary becomes polls~2F2022~2Eprimary
Note that the space character is a special case: it will be replaced with a + symbol.
datasette.utils.tilde_encode(s: str) -> str
Returns the tilde-encoded string - for example /foo/bar -> ~2Ffoo~2Fbar
datasette.utils.tilde_decode(s: str) -> str
Decodes a tilde-encoded string, so ~2Ffoo~2Fbar -> /foo/bar","[""Internals for plugins"", ""The datasette.utils module""]","[{""href"": ""https://developer.mozilla.org/en-US/docs/Glossary/percent-encoding"", ""label"": ""URL percent-encoding""}]"
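For example, round-tripping the value from above:
from datasette.utils import tilde_encode, tilde_decode

assert tilde_encode("polls/2022.primary") == "polls~2F2022~2Eprimary"
assert tilde_decode("polls~2F2022~2Eprimary") == "polls/2022.primary"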
internals:internals-utils-await-me-maybe,internals,internals-utils-await-me-maybe,await_me_maybe(value),"Utility function for calling await on a return value if it is awaitable, otherwise returning the value. This is used by Datasette to support plugin hooks that can optionally return awaitable functions. Read more about this function in The “await me maybe” pattern for Python asyncio .
async datasette.utils.await_me_maybe(value: Any) -> Any
If value is callable, call it. If awaitable, await it. Otherwise return it.","[""Internals for plugins"", ""The datasette.utils module""]","[{""href"": ""https://simonwillison.net/2020/Sep/2/await-me-maybe/"", ""label"": ""The “await me maybe” pattern for Python asyncio""}]"
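A short sketch (get_message is a stand-in for a plugin hook return value that may or may not be awaitable):
from datasette.utils import await_me_maybe

async def get_message():
    return "hello"

# Callable, so it is called; the resulting coroutine is awaited
value = await await_me_maybe(get_message)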
internals:internals-utils-derive-named-parameters,internals,internals-utils-derive-named-parameters,"derive_named_parameters(db, sql)","Derive the list of named parameters referenced in a SQL query, using an explain query executed against the provided database.
async datasette.utils.derive_named_parameters(db: Database, sql: str) -> List[str]
Given a SQL statement, return a list of named parameters that are used in the statement
e.g. for select * from foo where id=:id this would return [""id""]","[""Internals for plugins"", ""The datasette.utils module""]",[]
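For example, using the query from above:
from datasette.utils import derive_named_parameters

params = await derive_named_parameters(
    db, "select * from foo where id=:id"
)
# params is now ["id"]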
internals:internals-utils-parse-metadata,internals,internals-utils-parse-metadata,parse_metadata(content),"This function accepts a string containing either JSON or YAML, expected to be of the format described in Metadata . It returns a nested Python dictionary representing the parsed data from that string.
If the metadata cannot be parsed as either JSON or YAML the function will raise a utils.BadMetadataError exception.
datasette.utils.parse_metadata(content: str) -> dict
Detects if content is JSON or YAML and parses it appropriately.","[""Internals for plugins"", ""The datasette.utils module""]",[]
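For example:
from datasette.utils import parse_metadata

metadata = parse_metadata('{"title": "My Datasette instance"}')
# {'title': 'My Datasette instance'}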
internals:internals-tracer-trace-child-tasks,internals,internals-tracer-trace-child-tasks,Tracing child tasks,"If your code uses a mechanism such as asyncio.gather() to execute code in additional tasks you may find that some of the traces are missing from the display.
You can use the trace_child_tasks() context manager to ensure these child tasks are correctly handled.
from datasette import tracer
with tracer.trace_child_tasks():
results = await asyncio.gather(
# ... async tasks here
)
This example uses the register_routes() plugin hook to add a page at /parallel-queries which executes two SQL queries in parallel using asyncio.gather() and returns their results.
from datasette import hookimpl
from datasette import tracer
@hookimpl
def register_routes():
async def parallel_queries(datasette):
db = datasette.get_database()
with tracer.trace_child_tasks():
one, two = await asyncio.gather(
db.execute(""select 1""),
db.execute(""select 2""),
)
return Response.json(
{
""one"": one.single_value(),
""two"": two.single_value(),
}
)
return [
(r""/parallel-queries$"", parallel_queries),
]
Note that running parallel SQL queries in this way has been known to cause problems in the past , so treat this example with caution.
Adding ?_trace=1 will show that the trace covers both of those child tasks.","[""Internals for plugins"", ""datasette.tracer""]","[{""href"": ""https://github.com/simonw/datasette/issues/2189"", ""label"": ""been known to cause problems in the past""}]"
internals:internals-csrf,internals,internals-csrf,CSRF protection,"Datasette uses asgi-csrf to guard against CSRF attacks on form POST submissions. Users receive a ds_csrftoken cookie which is compared against the csrftoken form field (or x-csrftoken HTTP header) for every incoming request.
If your plugin implements a