
sections_fts

0.40 (2020-04-21)
- Datasette Metadata can now be provided as a YAML file as an optional alternative to JSON. See Using YAML for metadata. (#713)
- Removed support for datasette publish now, which used the now-retired Zeit Now v1 hosting platform. A new plugin, datasette-publish-now, can be installed to publish data to Zeit (now Vercel) Now v2. (#710)
- Fixed a bug where the extra_template_vars(request, view_name) plugin hook was not receiving the correct view_name. (#716)
- Variables added to the template context by the extra_template_vars() plugin hook are now shown in the ?_context=1 debugging mode (see template_debug). (#693)
- Fixed a bug where the "templates considered" HTML comment was no longer being displayed. (#689)
- Fixed a datasette publish bug where --plugin-secret would override plugin configuration in the provided metadata.json file. (#724)
- Added a new CSS class for customizing the canned query page. (#727)

0.39 (2020-03-24)
- New base_url configuration setting for serving up the correct links while running Datasette under a different URL prefix. (#394)
- New metadata settings "sort" and "sort_desc" for setting the default sort order for a table. See Setting a default sort order. (#702)
- Sort direction arrow now displays by default on the primary key. This means you only have to click once (not twice) to sort in reverse order. (#677)
- New await Request(scope, receive).post_vars() method for accessing POST form variables. (#700)
- Plugin hooks documentation now links to example uses of each plugin. (#709)

0.38 (2020-03-08)
- The Docker build of Datasette now uses SQLite 3.31.1, upgraded from 3.26. (#695)
- datasette publish cloudrun now accepts an optional --memory=2Gi flag for setting the Cloud Run allocated memory to a value other than the default (256Mi). (#694)
- Fixed bug where templates that shipped with plugins were sometimes not being correctly loaded. (#697)

0.37.1 (2020-03-02)
- Don't attempt to count table rows to display on the index page for databases > 100MB. (#688)
- Print exceptions if they occur in the write thread rather than silently swallowing them.
- Handle the possibility of scope["path"] being a string rather than bytes.
- Better documentation for the extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook.

0.37 (2020-02-25)
- Plugins now have a supported mechanism for writing to a database, using the new .execute_write() and .execute_write_fn() methods (a sketch follows this list). Documentation. (#682)
- Immutable databases that have had their rows counted using the inspect command now use the calculated count more effectively - thanks, Kevin Keogh. (#666)
- --reload no longer restarts the server if a database file is modified, unless that database was opened in immutable mode with -i. (#494)
- New ?_searchmode=raw option turns off escaping for FTS queries in ?_search=, allowing full use of SQLite's FTS5 query syntax. (#676)

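A minimal sketch of the new write methods, not taken from the release notes - the database and table names here are hypothetical:

from datasette.app import Datasette


async def record_message(datasette: Datasette, message: str):
    db = datasette.get_database("mydatabase")  # hypothetical database name

    # execute_write_fn() runs an arbitrary function against the dedicated
    # write connection - handy for DDL or bulk operations
    def create_table(conn):
        conn.execute("create table if not exists audit_log (message text)")

    await db.execute_write_fn(create_table, block=True)

    # execute_write() queues a single parameterized statement;
    # block=True waits until the write thread has executed it
    await db.execute_write(
        "insert into audit_log (message) values (?)", [message], block=True
    )
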
0.36 (2020-02-21)
- The datasette object passed to plugins now has API documentation: Datasette class. (#576)
- New methods on datasette: .add_database() and .remove_database() - documentation. (#671)
- The prepare_connection() plugin hook now takes optional datasette and database arguments - prepare_connection(conn, database, datasette). (#678)
- Added three new plugins and one new conversion tool to The Datasette Ecosystem.

0.35 (2020-02-04)
- Added five new plugins and one new conversion tool to The Datasette Ecosystem.
- The Datasette class has a new render_template() method which can be used by plugins to render templates using Datasette's pre-configured Jinja templating library.
- You can now execute SQL queries that start with a -- comment - thanks, Jay Graves. (#653)

0.34 (2020-01-29)
- _search= queries are now correctly escaped using a new escape_fts() custom SQL function. This means you can now run searches for strings like park. without seeing errors. (#651)
- Google Cloud Run is no longer in beta, so datasette publish cloudrun has been updated to work even if the user has not installed the gcloud beta components package. Thanks, Katie McLaughlin. (#660)
- datasette package now accepts a --port option for specifying which port the resulting Docker container should listen on. (#661)

0.33 (2019-12-22)
- rowid is now included in dropdown menus for filtering tables. (#636)
- Columns are now only suggested for faceting if they have at least one value with more than one record. (#638)
- Queries with no results now display "0 results". (#637)
- Improved documentation for the --static option. (#641)
- asyncio task information is now included on the /-/threads debug page.
- Bumped Uvicorn dependency to 0.11.
- You can now use --port 0 to listen on an available port.
- New template_debug setting for debugging templates, e.g. https://latest.datasette.io/fixtures/roadside_attractions?_context=1 (#654)

0.32 (2019-11-14)
Datasette now renders templates using Jinja async mode. This means plugins can provide custom template functions that perform asynchronous actions, for example the new datasette-template-sql plugin, which allows custom templates to directly execute SQL queries and render their results. (#628)

0.31.2 (2019-11-13)
- Fixed a bug where datasette publish heroku applications failed to start. (#633)
- Fix for datasette publish with just --source_url - thanks, Stanley Zheng. (#572)
- Deployments to Heroku now use Python 3.8.0. (#632)

0.31.1 (2019-11-12)
Deployments created using datasette publish now use the python:3.8 base Docker image. (#629)

0.31 (2019-11-11)
This version adds compatibility with Python 3.8 and breaks compatibility with Python 3.5. If you are still running Python 3.5 you should stick with 0.30.2, which you can install like this: pip install datasette==0.30.2
- Format SQL button now works with read-only SQL queries - thanks, Tobias Kunze. (#602)
- New ?column__notin=x,y,z filter for table views. (#614)
- Table view now uses select col1, col2, col3 instead of select *.
- Database filenames can now contain spaces - thanks, Tobias Kunze. (#590)
- Removed obsolete ?_group_count=col feature. (#504)
- Improved user interface and documentation for datasette publish cloudrun. (#608)
- Tables with indexes now show the CREATE INDEX statements on the table page. (#618)
- Current version of uvicorn is now shown on /-/versions.
- Python 3.8 is now supported! (#622)
- Python 3.5 is no longer supported.

0.30.2 (2019-11-02)
- /-/plugins page now uses distribution name, e.g. datasette-cluster-map, instead of the name of the underlying Python package (datasette_cluster_map). (#606)
- Array faceting is now only suggested for columns that contain arrays of strings. (#562)
- Better documentation for the --host argument. (#574)
- Don't show None with a broken link for the label on a nullable foreign key. (#406)

0.30.1 (2019-10-30)
- Fixed bug where the ?_where= parameter was not persisted in hidden form fields. (#604)
- Fixed bug with the .json representation of row pages - thanks, Chris Shaw. (#603)

0.30 (2019-10-18)
- Added /-/threads debugging page.
- Allow EXPLAIN WITH... (#583)
- Button to format SQL - thanks, Tobias Kunze. (#136)
- Sort databases on homepage by argument order - thanks, Tobias Kunze. (#585)
- Display metadata footer on custom SQL queries - thanks, Tobias Kunze. (#589)
- Use --platform=managed for publish cloudrun. (#587)
- Fixed bug returning non-ASCII characters in CSV. (#584)
- Fix for /foo vs. /foo-bar bug. (#601)

0.29.3 (2019-09-02)
- Fixed implementation of CodeMirror on database page. (#560)
- Documentation typo fixes - thanks, Min ho Kim. (#561)
- Mechanism for detecting if a table has FTS enabled now works if the table name used alternative escaping mechanisms (#570) - for compatibility with a recent change to sqlite-utils.

0.29.2 (2019-07-13)
- Bumped Uvicorn to 0.8.4, fixing a bug where the query string was not included in the server logs. (#559)
- Fixed bug where the navigation breadcrumbs were not displayed correctly on the page for a custom query. (#558)
- Fixed bug where custom query names containing unicode characters caused errors.

0.29.1 (2019-07-11)
- Fixed bug with static mounts using relative paths which could lead to traversal exploits (#555) - thanks, Abdussamet Kocak!
- Datasette can now be run as a module: python -m datasette (#556) - thanks, Abdussamet Kocak!

0.29 (2019-07-07)
ASGI, new plugin hooks, facet by date and much, much more...

ASGI
ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic. I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down. The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.

New plugin hook: asgi_wrapper
The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. (#520) Two new plugins take advantage of this hook:
- datasette-auth-github adds an authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams.
- datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make fetch() calls to the JSON API provided by your Datasette instance.

New plugin hook: extra_template_vars
The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540).

Secret plugin configuration options
Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata, a new mechanism was needed. Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable:
{
    "plugins": {
        "datasette-auth-github": {
            "client_secret": {
                "$env": "GITHUB_CLIENT_SECRET"
            }
        }
    }
}
These plugin secrets can be set directly using datasette publish. See Custom metadata and plugins for details. (#538 and #543)

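The file-based variant mentioned above uses a $file key in place of $env - the path here is purely illustrative:
{
    "plugins": {
        "datasette-auth-github": {
            "client_secret": {
                "$file": "/secrets/client-secret"
            }
        }
    }
}
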
Facet by date
If a column contains datetime values, Datasette can now facet that column by date. (#481)

Easier custom templates for table rows
If you want to customize the display of individual table rows, you can do so using a _table.html template include that looks something like this:
{% for row in display_rows %}
    <div>
        <h2>{{ row["title"] }}</h2>
        <p>{{ row["description"] }}</p>
        <p>Category: {{ row.display("category_id") }}</p>
    </div>
{% endfor %}
This is a backwards incompatible change. If you previously had a custom template called _rows_and_columns.html you need to rename it to _table.html. See Custom templates for full details.

?_through= for joins through many-to-many tables
The new ?_through={json} argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example. (#355) This feature was added to help support facet by many-to-many, which isn't quite ready yet but will be coming in the next Datasette release.

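For reference, the argument takes a JSON object with table, column and value keys. Something like the following, modelled on the fixtures-based example in the documentation (the table and column names belong to that example):
/roadside_attractions?_through={"table":"roadside_attraction_characteristics","column":"characteristic_id","value":"1"}
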
Small changes
- Databases published using datasette publish now open in Immutable mode. (#469)
- ?col__date= now works for columns containing spaces.
- Automatic label detection (for deciding which column to show when linking to a foreign key) has been improved. (#485)
- Fixed bug where pagination broke when combined with an expanded foreign key. (#489)
- Contributors can now run pip install -e .[docs] to get all of the dependencies needed to build the documentation, including cd docs && make livehtml support.
- Datasette's dependencies are now all specified using the ~= match operator. (#532)
- white-space: pre-wrap is now used for table creation SQL. (#505)
Full list of commits between 0.28 and 0.29.

0.28 (2019-05-19)
A salmagundi of new features!

Supporting databases that change
From the beginning of the project, Datasette has been designed with read-only databases in mind. If a database is guaranteed not to change it opens up all kinds of interesting opportunities - from taking advantage of SQLite immutable mode and HTTP caching to bundling static copies of the database directly in a Docker container. The interesting ideas in Datasette explores this idea in detail. As my goals for the project have developed, I realized that read-only databases are no longer the right default. SQLite actually supports concurrent access very well provided only one thread attempts to write to a database at a time, and I keep encountering sensible use-cases for running Datasette on top of a database that is processing inserts and updates. So, as of version 0.28 Datasette no longer assumes that a database file will not change. It is now safe to point Datasette at a SQLite database which is being updated by another process. Making this change was a lot of work - see tracking tickets #418, #419 and #420. It required new thinking around how Datasette should calculate table counts (an expensive operation against a large, changing database) and also meant reconsidering the "content hash" URLs Datasette has used in the past to optimize the performance of HTTP caches. Datasette can still run against immutable files and gains numerous performance benefits from doing so, but this is no longer the default behaviour. Take a look at the new Performance and caching documentation section for details on how to make the most of Datasette against data that you know will be staying read-only and immutable.

Faceting improvements, and faceting plugins
Datasette Facets provide an intuitive way to quickly summarize and interact with data. Previously the only supported faceting technique was column faceting, but 0.28 introduces two powerful new capabilities: facet-by-JSON-array and the ability to define further facet types using plugins. Facet by array (#359) is only available if your SQLite installation provides the json1 extension. Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns - useful for modelling things like tags without needing to break them out into a new table. See Facet by JSON array for more. The new register_facet_classes() plugin hook (#445) can be used to register additional custom facet classes. Each facet class should provide two methods: suggest(), which suggests facet selections that might be appropriate for a provided SQL query, and facet_results(), which executes a facet operation and returns results. Datasette's own faceting implementations have been refactored to use the same API as these plugins.

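A rough sketch of a plugin using this hook - the Facet base class import path is real, but the facet type, method bodies and return shapes here are illustrative only, not the authoritative API:

from datasette import hookimpl
from datasette.facets import Facet


class SuffixFacet(Facet):
    # Hypothetical facet type that groups values by their suffix
    type = "suffix"

    async def suggest(self):
        # Inspect the current SQL query and return a list of suggestion
        # dicts for columns this facet could usefully apply to
        return []

    async def facet_results(self):
        # Execute the facet operation; return the results plus any
        # facets that timed out, following the built-in implementations
        facet_results = []
        facets_timed_out = []
        return facet_results, facets_timed_out


@hookimpl
def register_facet_classes():
    return [SuffixFacet]
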
datasette publish cloudrun
Google Cloud Run is a brand new serverless hosting platform from Google, which allows you to build a Docker container which will run only when HTTP traffic is received and will shut down (and hence cost you nothing) the rest of the time. It's similar to Zeit's Now v1 Docker hosting platform, which sadly is no longer accepting signups from new users. The new datasette publish cloudrun command was contributed by Romain Primet (#434) and publishes selected databases to a new Datasette instance running on Google Cloud Run. See Publishing to Google Cloud Run for full documentation.

register_output_renderer plugins
Russ Garrett implemented a new Datasette plugin hook called register_output_renderer (#441) which allows plugins to create additional output renderers in addition to Datasette's default .json and .csv. Russ's in-development datasette-geo plugin includes an example of this hook being used to output .geojson automatically converted from SpatiaLite.

Medium changes
- Datasette now conforms to the Black coding style (#449) - and has a unit test to enforce this in the future.
- New Special table arguments:
  - ?columnname__in=value1,value2,value3 filter for executing SQL IN queries against a table, see Table arguments (#433)
  - ?columnname__date=yyyy-mm-dd filter which returns rows where the specified datetime column falls on the specified date (583b22a)
  - ?tags__arraycontains=tag filter which acts against a JSON array contained in a column (78e45ea)
  - ?_where=sql-fragment filter for the table view (#429)
  - ?_fts_table=mytable and ?_fts_pk=mycolumn query string options can be used to specify which FTS table to use for a search query - see Configuring full-text search for a table or view (#428)
- You can now pass the same table filter multiple times - for example, ?content__not=world&content__not=hello will return all rows where the content column is neither hello nor world (#288)
…

Small changes
- We now show the size of the database file next to the download link (#172)
- New /-/databases introspection page shows currently connected databases (#470)
- Binary data is no longer displayed on the table and row pages (#442 - thanks, Russ Garrett)
- New show/hide SQL links on custom query pages (#415)
- The extra_body_script plugin hook now accepts an optional view_name argument (#443 - thanks, Russ Garrett)
- Bumped Jinja2 dependency to 2.10.1 (#426)
- All table filters are now documented, and documentation is enforced via unit tests (2c19a27)
- New project guideline: master should stay shippable at all times! (31f36e1)
- Fixed a bug where sqlite_timelimit() occasionally failed to clean up after itself (bac4e01)
- We no longer load additional plugins when executing pytest (#438)
- Homepage now links to database views if there are fewer than five tables in a database (#373)
- The --cors option is now respected by error pages (#453)
- datasette publish heroku now uses the --include-vcs-ignore option, which means it works under Travis CI (#407)
- datasette publish heroku now publishes using Python 3.6.8 (66…

0.27.1 (2019-05-09)
Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.

0.27 (2019-01-31)
- New command: datasette plugins (documentation) shows you the currently installed list of plugins.
- Datasette can now output newline-delimited JSON using the new ?_shape=array&_nl=on query string option.
- Added documentation on The Datasette Ecosystem.
- Now using Python 3.7.2 as the base for the official Datasette Docker image.

0.26.1 (2019-01-10)
- /-/versions now includes SQLite compile_options. (#396)
- datasetteproject/datasette Docker image now uses SQLite 3.26.0. (#397)
- Cleaned up some deprecation warnings under Python 3.7.

0.26 (2019-01-02)
- datasette serve --reload now restarts Datasette if a database file changes on disk.
- datasette publish now now takes an optional --alias mysite.now.sh argument. This will attempt to set an alias after the deploy completes.
- Fixed a bug where the advanced CSV export form failed to include the currently selected filters. (#393)

0.25.2 (2018-12-16)
- datasette publish heroku now uses the python-3.6.7 runtime.
- Added documentation on how to build the documentation.
- Added documentation covering our release process.
- Upgraded to pytest 4.0.2.

0.25.1 (2018-11-04)
Documentation improvements plus a fix for publishing to Zeit Now: datasette publish now now uses Zeit's v1 platform, to work around the new 100MB image limit. Thanks, @slygent - closes #366.

0.25 (2018-09-19)
New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite.
- New publish_subcommand plugin hook. A plugin can now add additional datasette publish publishers in addition to the default now and heroku, both of which have been refactored into default plugins. publish_subcommand documentation. Closes #349
- New render_cell plugin hook. Plugins can now customize how values are displayed in the HTML tables produced by Datasette's browsable interface. datasette-json-html and datasette-render-images are two new plugins that use this hook. render_cell documentation. Closes #352
- New extra_body_script plugin hook, enabling plugins to provide additional JavaScript that should be added to the page footer. extra_body_script documentation.
- extra_css_urls and extra_js_urls hooks now take additional optional parameters, allowing them to be more selective about which pages they apply to. Documentation.
- You can now use the sortable_columns metadata setting to explicitly enable sort-by-column in the interface for database views, as well as for specific tables.
- The new fts_table and fts_pk metadata settings can now be used to explicitly configure full-text search for a table or a view, even if that table is not directly coupled to the SQLite FTS feature in the database schema itself.
- Datasette will now use pysqlite3 in place of the standard library sqlite3 module if it has been installed in the current environment. This makes it much easier to run Datasette against a more recent version of SQLite, including the just-released SQLite 3.25.0, which adds wind…

0.24 (2018-07-23)
A number of small new features:
- datasette publish heroku now supports --extra-options, fixes #334
- Custom error message if SpatiaLite is needed for specified database, closes #331
- New config option: truncate_cells_html for truncating long cell values in HTML view - closes #330
- Documentation for datasette publish and datasette package, closes #337
- Fixed compatibility with Python 3.7
- datasette publish heroku now supports app names via the -n option, which can also be used to overwrite an existing application [Russ Garrett]
- Title and description metadata can now be set for canned SQL queries, closes #342
- New force_https_on config option, fixes https:// API URLs when deploying to Zeit Now - closes #333
- ?_json_infinity=1 query string argument for handling Infinity/-Infinity values in JSON, closes #332
- URLs displayed in the results of custom SQL queries are now URLified, closes #298

0.23.2 (2018-07-07)
Minor bugfix and documentation release.
- CSV export now respects --cors, fixes #326
- Installation instructions, including docker image - closes #328
- Fix for row pages for tables with / in, closes #325

0.23.1 (2018-06-21)
Minor bugfix release.
- Correctly display empty strings in HTML table, closes #314
- Allow "." in database filenames, closes #302
- 404s ending in slash redirect to remove that slash, closes #309
- Fixed incorrect display of compound primary keys with foreign key references. Closes #319
- Docs + example of canned SQL query using || concatenation. Closes #321
- Correctly display facets with value of 0 - closes #318
- Default 'expand labels' to checked in CSV advanced export

0.23 (2018-06-18)
This release features CSV export, improved options for foreign key expansions, new configuration settings and improved support for SpatiaLite. See datasette/compare/0.22.1...0.23 for a full list of commits added since the last release.

CSV export
Any Datasette table, view or custom SQL query can now be exported as CSV. Check out the CSV export documentation for more details, or try the feature out on https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies. If your table has more than max_returned_rows (default 1,000) Datasette provides the option to stream all rows. This option takes advantage of async Python and Datasette's efficient pagination to iterate through the entire matching result set and stream it back as a downloadable CSV file.

Foreign key expansions
When Datasette detects a foreign key reference it attempts to resolve a label for that reference (automatically or using the Specifying the label column for a table metadata option) so it can display a link to the associated row. This expansion is now also available for JSON and CSV representations of the table, using the new _labels=on query string option. See Expanding foreign key references for more details.

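For example, against a hypothetical table URL:
/dbname/tablename.json?_labels=on
Each foreign key column in the JSON output is then expanded into an object combining the raw value and its resolved label, along the lines of:
"city_id": {"value": 1, "label": "San Francisco"}
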
New configuration settings
Datasette's Settings now also supports boolean settings. A number of new configuration options have been added (a command-line example follows this list):
- num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3.
- allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on.
- suggest_facets - should Datasette suggest facets? Defaults to on.
- allow_download - should users be allowed to download the entire SQLite database? Defaults to on.
- allow_sql - should users be allowed to execute custom SQL queries? Defaults to on.
- default_cache_ttl - default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0.
- cache_size_kb - set the amount of memory SQLite uses for its per-connection cache, in KB.
- allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on.
- max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.

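In this era these options were set using the --config name:value option introduced in 0.22 - for example (the values here are illustrative):
datasette mydatabase.db \
    --config default_cache_ttl:3600 \
    --config max_csv_mb:50 \
    --config num_sql_threads:10
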
Control HTTP caching with ?_ttl=
You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter. You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely. Consider for example this query, which returns a randomly selected member of the Avengers:
select * from [avengers/avengers] order by random() limit 1
If you hit the following page repeatedly you will get the same result, due to HTTP caching:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1
By adding _ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0

Improved support for SpatiaLite
The SpatiaLite module for SQLite adds robust geospatial features to the database. Getting SpatiaLite working can be tricky, especially if you want to use the most recent alpha version (with support for K-nearest neighbor). Datasette now includes extensive documentation on SpatiaLite, and thanks to Ravi Kotecha our GitHub repo includes a Dockerfile that can build the latest SpatiaLite and configure it for use with Datasette. The datasette publish and datasette package commands now accept a new --spatialite argument which causes them to install and configure SpatiaLite as part of the container they deploy.

latest.datasette.io
Every commit to Datasette master is now automatically deployed by Travis CI to https://latest.datasette.io/ - ensuring there is always a live demo of the latest version of the software. The demo uses the fixtures from our unit tests, ensuring it demonstrates the same range of functionality that is covered by the tests. You can see how the deployment mechanism works in our .travis.yml file.

Miscellaneous
Got JSON data in one of your columns? Use the new ?_json=COLNAME argument to tell Datasette to return that JSON value directly rather than encoding it as a string. If you just want an array of the first value of each row, use the new ?_shape=arrayfirst option - example.

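Hypothetical examples of both options, assuming a table with a JSON tags column:
/mydb/mytable.json?_json=tags
/mydb.json?sql=select+tags+from+mytable&_shape=arrayfirst
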
0.22.1 (2018-05-23)
Bugfix release, plus we now use versioneer for our version numbers.
- Faceting no longer breaks pagination, fixes #282
- Add __version_info__ derived from __version__ [Robert Gieseke] - this might be a tuple of more than two values (major and minor version) if commits have been made after a release.
- Add version number support with Versioneer. [Robert Gieseke] Versioneer licence: Public Domain (CC0-1.0). Closes #273
- Refactor inspect logic [Russ Garrett]

0.22 (2018-05-20)
The big new feature in this release is Facets. Datasette can now apply faceted browse to any column in any table. It will also suggest possible facets. See the Datasette Facets announcement post for more details. In addition to the work on facets:
- Added docs for introspection endpoints
- New --config option, added --help-config, closes #274
- Removed the --page_size= argument to datasette serve in favour of: datasette serve --config default_page_size:50 mydb.db
- Added new help section:
  $ datasette --help-config
  Config options:
    default_page_size            Default page size for the table view (default=100)
    max_returned_rows            Maximum rows that can be returned from a table or custom query (default=1000)
    sql_time_limit_ms            Time limit for a SQL query in milliseconds (default=1000)
    default_facet_size           Number of values to return for requested facets (default=30)
    facet_time_limit_ms          Time limit for calculating a requested facet (default=200)
    facet_suggest_time_limit_ms  Time limit for calculating a suggested facet (default=50)
- Only apply responsive table styles to .rows-and-column - otherwise they interfere with tables in the description, e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo
- Refactored views into new views/ modules, refs #256
- Documentation for SQLite full-text search support, closes #253
…

0.21 (2018-05-05)
New JSON _shape= options, the ability to set table _size= and a mechanism for searching within specific columns.
- Default tests to using a longer timelimit. Every now and then a test will fail in Travis CI on Python 3.5 because it hit the default 20ms SQL time limit. Test fixtures now default to a 200ms time limit, and we only use the 20ms time limit for the specific test that tests query interruption. This should make our tests on Python 3.5 in Travis much more stable.
- Support _search_COLUMN=text searches, closes #237
- Show version on /-/plugins page, closes #248
- ?_size=max option, closes #249
- Added /-/versions and /-/versions.json, closes #244. Sample output:
  {
      "python": {
          "version": "3.6.3",
          "full": "3.6.3 (default, Oct  4 2017, 06:09:38) \n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]"
      },
      "datasette": {
          "version": "0.20"
      },
      "sqlite": {
          "version": "3.23.1",
          "extensions": {
              "json1": null,
              "spatialite": "4.3.0a"
          }
      }
  }
- Renamed ?_sql_time_limit_ms= to ?_timelimit, closes #242
- New ?_shape=array option + tweaks to _shape, closes #245
  - Default is now ?_shape=arrays (renamed from lists)
  - New ?_shape=array returns an array of objects as the root object
…

0.20 (2018-04-20)
Mostly new work on the Plugins mechanism: plugins can now bundle static assets and custom templates, and datasette publish has a new --install=name-of-plugin option.
- Add col-X classes to HTML table on custom query page
- Fixed outdated template in documentation
- Plugins can now bundle custom templates, #224
- Added /-/metadata /-/plugins /-/inspect, #225
- Documentation for --install option, refs #223
- Datasette publish/package --install option, #223
- Fix for plugins in Python 3.5, #222
- New plugin hooks: extra_css_urls() and extra_js_urls(), #214
- /-/static-plugins/PLUGIN_NAME/ now serves static/ from plugins
- <th> now gets class="col-X" - plus added col-X documentation
- Use to_css_class for table cell column classes. This ensures that columns with spaces in the name will still generate usable CSS class names. Refs #209
- Add column name classes to <td>s, make PK bold [Russ Garrett]
- Don't duplicate simple primary keys in the link column [Russ Garrett] - when there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK colu…

0.19 (2018-04-16)
This is the first preview of the new Datasette plugins mechanism. Only two plugin hooks are available so far - for custom SQL functions and custom template filters. There's plenty more to come - read the documentation and get involved in the tracking ticket if you have feedback on the direction so far.
- Fix for _sort_desc=sortable_with_nulls test, refs #216
- Fixed #216 - paginate correctly when sorting by nullable column
- Initial documentation for plugins, closes #213 - https://docs.datasette.io/en/stable/plugins.html
- New --plugins-dir=plugins/ option (#212) causing Datasette to load and evaluate all of the Python files in the specified directory and register any plugins that are defined in those files. This new option is available for the following commands:
  datasette serve mydb.db --plugins-dir=plugins/
  datasette publish now/heroku mydb.db --plugins-dir=plugins/
  datasette package mydb.db --plugins-dir=plugins/
- Start of the plugin system, based on pluggy (#210) - uses https://pluggy.readthedocs.io/ originally created for the py.test project. We're starting with two plugin hooks: prepare_connection(conn), which is called when a new SQLite connection is created and can be used to register custom SQL functions, and prepare_jinja2_environment(env), which is called with the Jinja2 environment and can be used to register custom template tags and filters. An example plugin which…

0.18 (2018-04-14)
This release introduces support for units, contributed by Russ Garrett (#203). You can now optionally specify the units for specific columns using metadata.json. Once specified, units will be displayed in the HTML view of your table. They also become available for use in filters - if a column is configured with a unit of distance, you can request all rows where that column is less than 50 meters or more than 20 feet, for example.
- Link foreign keys which don't have labels. [Russ Garrett] This renders unlabeled FKs as simple links. Also includes bonus fixes for two minor issues: in foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping, which broke some non-integer PKs; and tracebacks are now printed to the console when handling 500 errors.
- Fix SQLite error when loading rows with no incoming FKs. [Russ Garrett] This fixes an error caused by an invalid query when loading incoming FKs. The error was ignored due to async but it still got printed to the console.
- Allow custom units to be registered with Pint. [Russ Garrett]
- Support units in filters. [Russ Garrett]
- Tidy up units support. [Russ Garrett]
- Add units to exported JSON …

0.17 (2018-04-13)
Release 0.17 to fix issues with PyPI.

0.16 (2018-04-13)
Better mechanism for handling errors; 404s for missing table/database.
- New error mechanism, closes #193
- 404s for missing tables/databases, closes #184
- long_description in markdown for the new PyPI
- Hide SpatiaLite system tables. [Russ Garrett]
- Allow explain select / explain query plan select #201
- Datasette inspect now finds primary_keys #195
- Ability to sort using form fields (for mobile portrait mode) #199 - we now display sort options as a select box plus a descending checkbox, which means you can apply sort orders even in portrait mode on a mobile phone where the column headers are hidden.

0.15 (2018-04-09)
The biggest new feature in this release is the ability to sort by column. On the table page the column headers can now be clicked to apply sort (or descending sort), or you can specify ?_sort=column or ?_sort_desc=column directly in the URL.
- table_rows => table_rows_count, filtered_table_rows => filtered_table_rows_count - renamed properties. Closes #194
- New sortable_columns option in metadata.json to control sort options. You can now explicitly set which columns in a table can be used for sorting using the _sort and _sort_desc arguments using metadata.json:
  {
      "databases": {
          "database1": {
              "tables": {
                  "example_table": {
                      "sortable_columns": [
                          "height",
                          "weight"
                      ]
                  }
              }
          }
      }
  }
  Refs #189
- Column headers now link to sort/desc sort - refs #189
- _sort and _sort_desc parameters for table views, allowing for paginated sorted results based on a specified column. Refs #189
- Total row count now correct even if _next applied
- Use .custom_sql() for _group_count implementation (refs #150)
- Make HTML title more readable in query template (#180) [Ryan Pitts]
- New ?_shape=objects/object/lists param for JSON API (#192) - new _shape= parameter repl…

0.14 (2017-12-09)
The theme of this release is customization: Datasette now allows every aspect of its presentation to be customized either using additional CSS or by providing entirely new templates. Datasette's metadata.json format has also been expanded, to allow per-database and per-table metadata. A new datasette skeleton command can be used to generate a skeleton JSON file ready to be filled in with per-database and per-table details. The metadata.json file can also be used to define canned queries, as a more powerful alternative to SQL views.
- extra_css_urls / extra_js_urls in metadata - a mechanism in the metadata.json format for adding custom CSS and JS urls. Create a metadata.json file that looks like this:
  {
      "extra_css_urls": [
          "https://simonwillison.net/static/css/all.bf8cd891642c.css"
      ],
      "extra_js_urls": [
          "https://code.jquery.com/jquery-3.2.1.slim.min.js"
      ]
  }
  Then start datasette like this:
  datasette mydb.db --metadata=metadata.json
  The CSS and JavaScript files will be linked in the <head> of every page. You can also specify a SRI (subresource integrity hash) for these assets:
  {
      "extra_css_urls": [
          {
              "url": "https://simonwillison.net/static/css/all.bf8cd891642c.css",
              "sri": "sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"
          }
      ],
      "extra_js_urls": [
          {
              "url": "https://code.jquery.com/jquery-3.2.1.slim.min.js",
              "sri": "sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="
          }
      ]
  }
  Modern browsers will only execute the stylesheet or JavaScript if the SRI hash …

0.13 (2017-11-24)
- Search now applies to current filters. Combined search into the same form as filters. Closes #133
- Much tidier design for table view header. Closes #147
- Added ?column__not=blah filter. Closes #148
- Row page now resolves foreign keys. Closes #132
- Further tweaks to select/input filter styling. Refs #86 - thanks for the help, @natbat!
- Show linked foreign key in table cells.
- Added UI for editing table filters. Refs #86
- Hide FTS-created tables on index pages. Closes #129
- Add publish to heroku support [Jacob Kaplan-Moss]: datasette publish heroku mydb.db - pull request #104
- Initial implementation of ?_group_count=column - URL shortcut for counting rows grouped by one or more columns. ?_group_count=column1&_group_count=column2 works as well. SQL generated looks like this:
  select "qSpecies", count(*) as "count"
  from Street_Tree_List
  group by "qSpecies"
  order by "count" desc limit 100
  Or for two columns like this:
  select "qSpecies", "qSiteInfo", count(*) as "count"
  from Street_Tree_List
  group by "qSpecies", "qSiteInfo"
  order by "count" desc limit 100
  Refs #44
- Added --build=mas…

0.12 (2017-11-16)
- Added __version__, now displayed as tooltip in page footer (#108).
- Added initial docs, including a changelog (#99).
- Turned on auto-escaping in Jinja.
- Added a UI for editing named parameters (#96). You can now construct a custom SQL statement using SQLite named parameters (e.g. :name) and datasette will display form fields for editing those parameters. Here's an example which lets you see the most popular names for dogs of different species registered through various dog registration schemes in Australia.
- Pin to specific Jinja version. (#100)
- Default to 127.0.0.1 not 0.0.0.0. (#98)
- Added extra metadata options to publish and package commands. (#92) You can now run these commands like so:
  datasette now publish mydb.db \
      --title="My Title" \
      --source="Source" \
      --source_url="http://www.example.com/" \
      --license="CC0" \
      --license_url="https://creativecommons.org/publicdomain/zero/1.0/"
  This will write those values into the metadata.json that is packaged with the app. If you also pass --metadata=metadata.json that file will be updated with the extra values before being written into the Docker image.
- Added production-ready Dockerfile (#94) [Andrew Cutler]
- New ?_sql_time_limit_ms=10 argument to database and table page (#95)
…

0.11 (2017-11-14)
- Added datasette publish now --force option. This calls now with --force - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer.
- Enable --cors by default when running in a container.

0.10 (2017-11-14)
- Fixed #83 - 500 error on individual row pages.
- Stop using sqlite WITH RECURSIVE in our tests. The version of Python 3 running in Travis CI doesn't support this.

0.9 (2017-11-13)
Added --sql_time_limit_ms and --extra-options. The serve command now accepts --sql_time_limit_ms for customizing the SQL time limit. The publish and package commands now accept --extra-options, which can be used to specify additional options to be passed to the datasette serve command when it executes inside the resulting Docker containers.

0.8 (2017-11-13)
V0.8 - added PyPI metadata, ready to ship.
- Implemented offset/limit pagination for views (#70).
- Improved pagination. (#78)
- Limit on max rows returned, controlled by --max_returned_rows option. (#69) If someone executes 'select * from table' against a table with a million rows in it, we could run into problems: just serializing that much data as JSON is likely to lock up the server. Solution: we now have a hard limit on the maximum number of rows that can be returned by a query. If that limit is exceeded, the server will return a "truncated": true field in the JSON. This limit can be optionally controlled by the new --max_returned_rows option. Setting that option to 0 disables the limit entirely.

Datasette
An open source multi-tool for exploring and publishing data. Datasette is a tool for exploring and publishing data. It helps people take data of any shape or size and publish that as an interactive, explorable website and accompanying API. Datasette is aimed at data journalists, museum curators, archivists, local governments and anyone else who has data that they wish to share with the world. It is part of a wider ecosystem of tools and plugins dedicated to making working with structured data as productive as possible. Explore a demo, watch a presentation about the project or try Datasette without installing anything using Glitch. Interested in learning Datasette? Start with the official tutorials. Support questions, feedback? Join our GitHub Discussions forum.

Contents
Getting started Play with a live demo Follow a tutorial Try Datasette without installing anything using Glitch Using Datasette on your own computer datasette --get Installation Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Using Docker The Datasette Ecosystem sqlite-utils Dogsheep Pages and API endpoints Top-level index Database Table Row Publishing data datasette publish Publishing to Google Cloud Run Publishing to Heroku Publishing to Vercel Publishing to Fly Custom metadata and plugins datasette package Deploying Datasette Deployment fundamentals Running Datasette using systemd Deploying using buildpacks Running Datasette behind a proxy Nginx proxy configuration Apache proxy configuration JSON API Different shapes Pagination Special JSON arguments Table arguments Column filter arguments Special table arguments Expanding foreign key references Discovering the JSON for a page Running SQL queries Named parameters Views Canned queries Canned query parameters Additional canned query options Writable canned queries Magic parameters JSON API for writable canned queries Pagination Cross-database queries Authentication and permissions Actors Using the "root" actor Permissions Defining permissions with "allow" blocks The /-/allow-debug tool Configuring permissions in metadata.json Controlling access to an instance Controlling access to specific databases Controlling access to specific tables and views Controlling access to specific canned queries Controlling the ability to execute arbitrary SQL Checking permissions in plugins actor_matches_allow() The permissions debug tool The ds_actor cookie Including an expiry time The /-/logout page Built-in permissions view-instance view-database view-database-download view-table view-query execute-sql permissions-debug debug-menu Performance and caching Immutable mode Using "datasette inspect" HTTP caching datasette-hashed-urls CSV export URL parameters Streaming all records Binary data Linking to binary downloads Bi…

Facets
Datasette facets can be used to add a faceted browse interface to any database table. With facets, tables are displayed along with a summary showing the most common values in specified columns. These values can be selected to further filter the table. Facets can be specified in two ways: using query string parameters, or in metadata.json configuration for the table.

Facets in query strings
To turn on faceting for specific columns on a Datasette table view, add one or more _facet=COLUMN parameters to the URL. For example, if you want to turn on facets for the city_id and state columns, construct a URL that looks like this:
/dbname/tablename?_facet=state&_facet=city_id
This works for both the HTML interface and the .json view. When enabled, facets will cause a facet_results block to be added to the JSON output, looking something like this:
{
    "state": {
        "name": "state",
        "results": [
            {
                "value": "CA",
                "label": "CA",
                "count": 10,
                "toggle_url": "http://...?_facet=city_id&_facet=state&state=CA",
                "selected": false
            },
            {
                "value": "MI",
                "label": "MI",
                "count": 4,
                "toggle_url": "http://...?_facet=city_id&_facet=state&state=MI",
                "selected": false
            },
            {
                "value": "MC",
                "label": "MC",
                "count": 1,
                "toggle_url": "http://...?_facet=city_id&_facet=state&state=MC",
                "selected": false
            }
        ],
        "truncated": false
    },
    "city_id": {
        "name": "city_id",
        "results": [
            {
                "value": 1,
                "label": "San Francisco",
                "count": 6,
                "toggle_url": "http://...?_facet=city_id&_facet=state&city_id=1",
                "selected": false
            },
            {
                "value": 2,
                "label": "Los Angeles",
                "count": 4,
                "toggle_url": "http://...?_facet=city_id&_facet=state&city_id=2",
                "selected": false
            },
            {
                "value": 3,
                "label": "Detroit",
                "count": 4,
                "toggle_url": "http://...?_facet=city_id&_facet=state&city_id=3",
                "selected": false
            },
            {
                "value": 4,
                "label": "Memnonia",
                "count": 1,
                "toggle_url": "http://...?_facet=city_id&_facet=state&city_id=4",
                "selected": false
            }
        ],
        "truncated": false
    }
}
If Datasette detect…

Facets in metadata.json
You can turn facets on by default for specific tables by adding them to a "facets" key in a Datasette Metadata file. Here's an example that turns on faceting by default for the qLegalStatus column in the Street_Tree_List table in the sf-trees database:
{
    "databases": {
        "sf-trees": {
            "tables": {
                "Street_Tree_List": {
                    "facets": ["qLegalStatus"]
                }
            }
        }
    }
}
Facets defined in this way will always be shown in the interface and returned in the API, regardless of the _facet arguments passed to the view. You can specify array or date facets in metadata using JSON objects with a single key of array or date and a value specifying the column, like this:
{
    "facets": [
        {"array": "tags"},
        {"date": "created"}
    ]
}

Suggested facets
Datasette's table UI will suggest facets for the user to apply, based on the following criteria. For the currently filtered data, are there any columns which, if applied as a facet:
- Will return 30 or fewer unique options
- Will return more than one unique option
- Will return fewer unique options than the total number of filtered rows
- And the query used to evaluate these criteria can be completed in under 50ms
That last point is particularly important: Datasette runs a query for every column that is displayed on a page, which could get expensive - so to avoid slow load times it sets a time limit of just 50ms for each of those queries. This means suggested facets are unlikely to appear for tables with millions of records in them.

Speeding up facets with indexes
The performance of facets can be greatly improved by adding indexes on the columns you wish to facet by. Adding indexes can be performed using the sqlite3 command-line utility. Here's how to add an index on the state column in a table called Food_Trucks:
$ sqlite3 mydatabase.db
SQLite version 3.19.3 2017-06-27 16:48:08
Enter ".help" for usage hints.
sqlite> CREATE INDEX Food_Trucks_state ON Food_Trucks("state");
Or using the sqlite-utils command-line utility:
$ sqlite-utils create-index mydatabase.db Food_Trucks state

Facet by JSON array
If your SQLite installation provides the json1 extension (you can check using /-/versions) Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns. This is useful for modelling things like tags without needing to break them out into a new table. Example here: latest.datasette.io/fixtures/facetable?_facet_array=tags

Facet by date
If Datasette finds any columns that contain dates in the first 100 values, it will offer a faceting interface against the dates of those values. This works especially well against timestamp values such as 2019-03-01 12:44:00. Example here: latest.datasette.io/fixtures/facetable?_facet_date=created

Plugin hooks
Datasette plugins use plugin hooks to customize Datasette's behavior. These hooks are powered by the pluggy plugin system. Each plugin can implement one or more hooks using the @hookimpl decorator against a function whose name matches one of the hooks documented on this page. When you implement a plugin hook you can accept any or all of the parameters that are documented as being passed to that hook. For example, you can implement the render_cell plugin hook like this, even though the full documented hook signature is render_cell(value, column, table, database, datasette):
@hookimpl
def render_cell(value, column):
    if column == "stars":
        return "*" * int(value)
List of plugin hooks:
- prepare_connection(conn, database, datasette)
- prepare_jinja2_environment(env)
- extra_template_vars(template, database, table, columns, view_name, request, datasette)
- extra_css_urls(template, database, table, columns, view_name, request, datasette)
- extra_js_urls(template, database, table, columns, view_name, request, datasette)
- extra_body_script(template, database, table, columns, view_name, request, datasette)
- publish_subcommand(publish)
- render_cell(value, column, table, database, datasette)
- register_output_renderer(datasette)
- register_routes(datasette)
- register_commands(cli)
…

prepare_connection(conn, database, datasette)
conn - sqlite3 connection object: the connection that is being opened
database - string: the name of the database
datasette - Datasette class: you can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name)
This hook is called when a new SQLite database connection is created. You can use it to register custom SQL functions, aggregates and collations. For example:
from datasette import hookimpl
import random

@hookimpl
def prepare_connection(conn):
    conn.create_function(
        "random_integer", 2, random.randint
    )
This registers a SQL function called random_integer which takes two arguments and can be called like this:
select random_integer(1, 10);
Examples: datasette-jellyfish, datasette-jq, datasette-haversine, datasette-rure

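The same hook can register the aggregates and collations mentioned above, using the standard sqlite3 connection APIs - a small sketch with made-up names:

from datasette import hookimpl


@hookimpl
def prepare_connection(conn):
    # Collation usable as: select name from t order by name collate folded
    conn.create_collation(
        "folded",
        lambda a, b: (a.casefold() > b.casefold()) - (a.casefold() < b.casefold()),
    )

    # Aggregate usable as: select product(value) from t
    class Product:
        def __init__(self):
            self.total = 1

        def step(self, value):
            self.total *= value

        def finalize(self):
            return self.total

    conn.create_aggregate("product", 1, Product)
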
prepare_jinja2_environment(env)
env - jinja2 Environment: the template environment that is being prepared
This hook is called with the Jinja2 environment that is used to evaluate Datasette HTML templates. You can use it to do things like register custom template filters, for example:
from datasette import hookimpl

@hookimpl
def prepare_jinja2_environment(env):
    env.filters["uppercase"] = lambda u: u.upper()
You can now use this filter in your custom templates like so:
Table name: {{ table|uppercase }}

extra_template_vars(template, database, table, columns, view_name, request, datasette)
Extra template variables that should be made available in the rendered template context.
template - string: the template that is being rendered, e.g. database.html
database - string or None: the name of the database, or None if the page does not correspond to a database (e.g. the root page)
table - string or None: the name of the table, or None if the page does not correspond to a table
columns - list of strings or None: the names of the database columns that will be displayed on this page. None if the page does not contain a table.
view_name - string: the name of the view being displayed. (index, database, table, and row are the most important ones.)
request - object or None: the current HTTP Request object. This can be None if the request object is not available.
datasette - Datasette class: you can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name)
This hook can return one of three different types: Dictionary If you return a dicti…

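The passage above is truncated; per the documented return types, the hook can return a dictionary directly, or a function or awaitable function that returns one. A sketch covering both, with made-up variable names:

from datasette import hookimpl


@hookimpl
def extra_template_vars(datasette, request):
    # Returning an async function lets the hook run a query first;
    # a plain dictionary return also works for simple cases
    async def inner():
        result = await datasette.get_database().execute(
            "select sqlite_version()"
        )
        return {
            "sqlite_version": result.first()[0],
            "user_agent": request.headers.get("user-agent") if request else None,
        }

    return inner
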
extra_css_urls(template, database, table, columns, view_name, request, datasette)
This takes the same arguments as extra_template_vars(...). Return a list of extra CSS URLs that should be included on the page. These can take advantage of the CSS class hooks described in Custom pages and templates. This can be a list of URLs:
from datasette import hookimpl

@hookimpl
def extra_css_urls():
    return [
        "https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css"
    ]
Or a list of dictionaries defining both a URL and an SRI hash:
@hookimpl
def extra_css_urls():
    return [
        {
            "url": "https://stackpath.bootstrapcdn.com/bootstrap/4.1.0/css/bootstrap.min.css",
            "sri": "sha384-9gVQ4dYFwwWSjIDZnLEWnxCjeSWFphJiwGPXr1jddIhOegiu1FwO5qRGvFXOdJZ4",
        }
    ]
This function can also return an awaitable function, useful if it needs to run any async code:
@hookimpl
def extra_css_urls(datasette):
    async def inner():
        db = datasette.get_database()
        results = await db.execute(
            "select url from css_files"
        )
        return [r[0] for r in results]

    return inner
Examples: datasette-cluster-map, datasette-vega

extra_js_urls(template, database, table, columns, view_name, request, datasette)
This takes the same arguments as extra_template_vars(...). This works in the same way as extra_css_urls() but for JavaScript. You can return a list of URLs, a list of dictionaries or an awaitable function that returns those things:
from datasette import hookimpl

@hookimpl
def extra_js_urls():
    return [
        {
            "url": "https://code.jquery.com/jquery-3.3.1.slim.min.js",
            "sri": "sha384-q8i/X+965DzO0rT7abK41JStQIAqVgRVzpbzo5smXKp4YfRvH+8abtTE1Pi6jizo",
        }
    ]
You can also return URLs to files from your plugin's static/ directory, if you have one:
@hookimpl
def extra_js_urls():
    return ["/-/static-plugins/your-plugin/app.js"]
Note that your-plugin here should be the hyphenated plugin name - the name that is displayed in the list on the /-/plugins debug page. If your code uses JavaScript modules you should include the "module": True key. See Custom CSS and JavaScript for more details.
@hookimpl
def extra_js_urls():
    return [
        {
            "url": "/-/static-plugins/your-plugin/app.js",
            "module": True,
        }
    ]
Examples: datasette-cluster-map, datasette-vega

extra_body_script(template, database, table, columns, view_name, request, datasette)
Extra JavaScript to be added to a <script> block at the end of the <body> element on the page. This takes the same arguments as extra_template_vars(...). The template, database, table and view_name options can be used to return different code depending on which template is being rendered and which database or table are being processed. The datasette instance is provided primarily so that you can consult any plugin configuration options that may have been set, using the datasette.plugin_config(plugin_name) method documented above. This function can return a string containing JavaScript, or a dictionary as described below, or a function or awaitable function that returns a string or dictionary. Use a dictionary if you want to specify that the code should be placed in a <script type="module">...</script> element:
@hookimpl
def extra_body_script():
    return {
        "module": True,
        "script": "console.log('Your JavaScript goes here...')",
    }
This will add the following to the end of your page:
<script type="module">console.log('Your JavaScript goes here...')</script>
Example: datasette-cluster-map

186 186 publish_subcommand(publish) publish - Click publish command group The Click command group for the datasette publish subcommand This hook allows you to create new providers for the datasette publish command. Datasette uses this hook internally to implement the default cloudrun and heroku subcommands, so you can read their source to see examples of this hook in action. Let's say you want to build a plugin that adds a datasette publish my_hosting_provider --api_key=xxx mydatabase.db publish command. Your implementation would start like this: from datasette import hookimpl from datasette.publish.common import ( add_common_publish_arguments_and_options, ) import click @hookimpl def publish_subcommand(publish): @publish.command() @add_common_publish_arguments_and_options @click.option( "-k", "--api_key", help="API key for talking to my hosting provider", ) def my_hosting_provider( files, metadata, extra_options, branch, template_dir, plugins_dir, static, install, plugin_secret, version_note, secret, title, license, license_url, source, source_url, about, about_url, api_key, ): ... Examples: datasette-publish-fly , datasette-publish-vercel 3  
187 187 render_cell(value, column, table, database, datasette) Lets you customize the display of values within table cells in the HTML table view. value - string, integer or None The value that was loaded from the database column - string The name of the column being rendered table - string or None The name of the table - or None if this is a custom SQL query database - string The name of the database datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. If your hook returns None , it will be ignored. Use this to indicate that your hook is not able to custom render this particular value. If the hook returns a string, that string will be rendered in the table cell. If you want to return HTML markup you can do so by returning a jinja2.Markup object. You can also return an awaitable function which returns a value. Datasette will loop through all available render_cell hooks and display the value returned by the first one that does not return None . Here is an example of a custom render_cell() plugin which looks for values that are a JSON string matching the following format: {"href": "https://www.example.com/", "label": "Name"} If the value matches that pattern, the plugin returns an HTML link … 3  
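The example is truncated above; here is a sketch of how such a plugin might look, using jinja2.Markup as described and returning None for any value that does not match the pattern:

from datasette import hookimpl
import jinja2
import json


@hookimpl
def render_cell(value):
    # Only consider string values that look like JSON objects
    if not isinstance(value, str) or not value.strip().startswith("{"):
        return None
    try:
        data = json.loads(value)
    except ValueError:
        return None
    # Expect exactly the {"href": ..., "label": ...} shape
    if not isinstance(data, dict) or set(data.keys()) != {"href", "label"}:
        return None
    return jinja2.Markup(
        '<a href="{href}">{label}</a>'.format(
            href=jinja2.escape(data["href"]),
            label=jinja2.escape(data["label"]),
        )
    )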
188 188 register_output_renderer(datasette) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) Registers a new output renderer, to output data in a custom format. The hook function should return a dictionary, or a list of dictionaries, of the following shape: @hookimpl def register_output_renderer(datasette): return { "extension": "test", "render": render_demo, "can_render": can_render_demo, # Optional } This will register render_demo to be called when paths with the extension .test (for example /database.test , /database/table.test , or /database/table/row.test ) are requested. render_demo is a Python function. It can be a regular function or an async def render_demo() awaitable function, depending on whether it needs to make any asynchronous calls. can_render_demo is a Python function (or async def function) which accepts the same arguments as render_demo but just returns True or False . It lets Datasette know if the current SQL query can be represented by the plugin - and hence influence whether a link to this output format is displayed in the user interface. If you omit the "can_render" key from the dictionary every query will be treated as being supported by the plugin. When a request is received, the "render" callback function is called with zero or more of the following arguments. Datasette will inspect your callback function and pass arguments that match its function signature. datasette - Datasette class For accessing plugin configuration and executing queries. columns - list of strings The names of the columns re… 3
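Since the argument list is cut off, here is a hedged sketch of a render_demo and can_render_demo pair, assuming the callback asks only for columns and rows (Datasette inspects the signature and passes just those) and returns a Response:

from datasette import hookimpl, Response


def render_demo(columns, rows):
    # Plain-text rendering: a header line, then one pipe-separated line per row
    lines = [" | ".join(columns)]
    for row in rows:
        lines.append(" | ".join(str(row[column]) for column in columns))
    return Response.text("\n".join(lines))


def can_render_demo(columns):
    # Offer this format whenever the result set has at least one column
    return len(columns) > 0


@hookimpl
def register_output_renderer(datasette):
    return {
        "extension": "test",
        "render": render_demo,
        "can_render": can_render_demo,
    }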
189 189 register_routes(datasette) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) Register additional view functions to execute for specified URL routes. Return a list of (regex, view_function) pairs, something like this: from datasette import hookimpl, Response import html async def hello_from(request): name = request.url_vars["name"] return Response.html( "Hello from {}".format(html.escape(name)) ) @hookimpl def register_routes(): return [(r"^/hello-from/(?P<name>.*)$", hello_from)] The view functions can take a number of different optional arguments. The corresponding argument will be passed to your function depending on its named parameters - a form of dependency injection. The optional view function arguments are as follows: datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. request - Request object The current HTTP Request object . scope - dictionary The incoming ASGI scope dictionary. send - function The ASGI send function. receive - function The ASGI receive function. The vi… 3  
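The argument list above is truncated; to illustrate the dependency injection, here is a hedged sketch of a second view that asks only for the datasette instance (the /-/database-count path is hypothetical):

from datasette import hookimpl, Response


async def database_count(datasette):
    # Only the parameters named in the signature are injected
    return Response.text(
        "{} databases attached".format(len(datasette.databases))
    )


@hookimpl
def register_routes():
    return [(r"^/-/database-count$", database_count)]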
190 190 register_commands(cli) cli - the root Datasette Click command group Use this to register additional CLI commands Register additional CLI commands that can be run using datasette yourcommand ... . This provides a mechanism by which plugins can add new CLI commands to Datasette. This example registers a new datasette verify file1.db file2.db command that checks if the provided file paths are valid SQLite databases: from datasette import hookimpl import click import sqlite3 @hookimpl def register_commands(cli): @cli.command() @click.argument( "files", type=click.Path(exists=True), nargs=-1 ) def verify(files): "Verify that files can be opened by Datasette" for file in files: conn = sqlite3.connect(str(file)) try: conn.execute("select * from sqlite_master") except sqlite3.DatabaseError: raise click.ClickException( "Invalid database: {}".format(file) ) The new command can then be executed like so: datasette verify fixtures.db Help text (from the docstring for the function plus any defined Click arguments or options) will become available using: datasette verify --help Plugins can register multiple commands by making multiple calls to the @cli.command() decorator. Consult the Click documentation for full details on how to build a CLI command, including how to define arguments and options. Note that register_commands() plugins cannot be used with the --plugins-dir mechanism - they need to be installed into the same virtual environment as Datasette using pip install . Provided it has a setup.py file (see Packaging a plugin ) you can run pip install directly against the directory in which you are developing your plugin like so: pip install -e path/t… 3
191 191 register_facet_classes() Return a list of additional Facet subclasses to be registered. The design of this plugin hook is unstable and may change. See issue 830 . Each Facet subclass implements a new type of facet operation. The class should look like this: class SpecialFacet(Facet): # This key must be unique across all facet classes: type = "special" async def suggest(self): # Use self.sql and self.params to suggest some facets suggested_facets = [] suggested_facets.append( { "name": column, # Or other unique name # Construct the URL that will enable this facet: "toggle_url": self.ds.absolute_url( self.request, path_with_added_args( self.request, {"_facet": column} ), ), } ) return suggested_facets async def facet_results(self): # This should execute the facet operation and return results, again # using self.sql and self.params as the starting point facet_results = [] facets_timed_out = [] facet_size = self.get_facet_size() # Do some calculations here... for column in columns_selected_for_facet: try: facet_results_values = [] # More calculations... facet_results_values.append( { "value": value, "label": label, "count": count, "toggle_url": self.ds.absolute_url( self.request, toggle_path ), "selected": selected, } ) facet_results.append( { "name": column, "results": facet_results_values, "trunc… 3  
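The registration itself is then a one-liner; a minimal sketch, assuming the SpecialFacet class above:

from datasette import hookimpl


@hookimpl
def register_facet_classes():
    # Datasette will offer "special" facets alongside the built-in facet types
    return [SpecialFacet]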
192 192 asgi_wrapper(datasette) Return an ASGI middleware wrapper function that will be applied to the Datasette ASGI application. This is a very powerful hook. You can use it to manipulate the entire Datasette response, or even to configure new URL routes that will be handled by your own custom code. You can write your ASGI code directly against the low-level specification, or you can use the middleware utilities provided by an ASGI framework such as Starlette . This example plugin adds a x-databases HTTP header listing the currently attached databases: from datasette import hookimpl from functools import wraps @hookimpl def asgi_wrapper(datasette): def wrap_with_databases_header(app): @wraps(app) async def add_x_databases_header( scope, receive, send ): async def wrapped_send(event): if event["type"] == "http.response.start": original_headers = ( event.get("headers") or [] ) event = { "type": event["type"], "status": event["status"], "headers": original_headers + [ [ b"x-databases", ", ".join( datasette.databases.keys() ).encode("utf-8"), ] ], } await send(event) await app(scope, receive, wrapped_send) return add_x_databases_header return wrap_with_databases_header Examples: datasette-cors , datasette-pyinstrument , datasette-total-page-time 3  
193 193 startup(datasette) This hook fires when the Datasette application server first starts up. You can implement a regular function, for example to validate required plugin configuration: @hookimpl def startup(datasette): config = datasette.plugin_config("my-plugin") or {} assert ( "required-setting" in config ), "my-plugin requires setting required-setting" Or you can return an async function which will be awaited on startup. Use this option if you need to make any database queries: @hookimpl def startup(datasette): async def inner(): db = datasette.get_database() if "my_table" not in await db.table_names(): await db.execute_write( """ create table my_table (mycol text) """ ) return inner Potential use-cases: Run some initialization code for the plugin Create database tables that a plugin needs on startup Validate the metadata configuration for a plugin on startup, and raise an error if it is invalid If you are writing unit tests for a plugin that uses this hook you will need to explicitly call await ds.invoke_startup() in your tests. An example: import pytest from datasette.app import Datasette @pytest.mark.asyncio async def test_my_plugin(): ds = Datasette([], metadata={}) await ds.invoke_startup() # Rest of test goes here Examples: datasette-saved-queries , datasette-init 3
194 194 canned_queries(datasette, database, actor) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. database - string The name of the database. actor - dictionary or None The currently authenticated actor . Use this hook to return a dictionary of additional canned query definitions for the specified database. The return value should be the same shape as the JSON described in the canned query documentation. from datasette import hookimpl @hookimpl def canned_queries(datasette, database): if database == "mydb": return { "my_query": { "sql": "select * from my_table where id > :min_id" } } The hook can alternatively return an awaitable function that returns a dictionary. Here's an example that returns queries that have been stored in the saved_queries database table, if one exists: from datasette import hookimpl @hookimpl def canned_queries(datasette, database): async def inner(): db = datasette.get_database(database) if await db.table_exists("saved_queries"): results = await db.execute( "select name, sql from saved_queries" ) return { result["name"]: {"sql": result["sql"]} for result in results } return inner The actor parameter can be used to include the currently authenticated actor in your decision. Here's an example that returns saved queries that were saved by that actor: from datasette import hookimpl @hookimpl def canned_queries… 3
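That last example is truncated; here is a hedged sketch of the actor-aware form, assuming a hypothetical saved_queries table with an actor_id column:

from datasette import hookimpl


@hookimpl
def canned_queries(datasette, database, actor):
    async def inner():
        db = datasette.get_database(database)
        # saved_queries and its actor_id column are assumptions for this sketch
        if actor is not None and await db.table_exists("saved_queries"):
            results = await db.execute(
                "select name, sql from saved_queries where actor_id = :id",
                {"id": actor["id"]},
            )
            return {
                result["name"]: {"sql": result["sql"]}
                for result in results
            }

    return inner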
195 195 actor_from_request(datasette, request) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. request - object The current HTTP Request object . This is part of Datasette's authentication and permissions system . The function should attempt to authenticate an actor (either a user or an API actor of some sort) based on information in the request. If it cannot authenticate an actor, it should return None . Otherwise it should return a dictionary representing that actor. Here's an example that authenticates the actor based on an incoming API key: from datasette import hookimpl import secrets SECRET_KEY = "this-is-a-secret" @hookimpl def actor_from_request(datasette, request): authorization = ( request.headers.get("authorization") or "" ) expected = "Bearer {}".format(SECRET_KEY) if secrets.compare_digest(authorization, expected): return {"id": "bot"} If you install this in your plugins directory you can test it like this: $ curl -H 'Authorization: Bearer this-is-a-secret' http://localhost:8003/-/actor.json Instead of returning a dictionary, this function can return an awaitable function which itself returns either None or a dictionary. This is useful for authentication functions that need to make a database query - for example: from datasette import hookimpl @hookimpl def actor_from_request(datasette, request): async def inner(): token = request.args.get("_token") if not token: return None # Look up ?_token=xxx in sessions table result = await datasette.get_database().execute( "select count(*) from sessions wher… 3  
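The query-based example is cut off above; here is a hedged sketch of how it might finish, assuming a hypothetical sessions table keyed by token and an arbitrary actor shape:

from datasette import hookimpl


@hookimpl
def actor_from_request(datasette, request):
    async def inner():
        token = request.args.get("_token")
        if not token:
            return None
        # Look up ?_token=xxx in the (hypothetical) sessions table
        result = await datasette.get_database().execute(
            "select count(*) from sessions where token = :token",
            {"token": token},
        )
        if result.first()[0]:
            return {"id": "bot"}
        return None

    return inner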
196 196 filters_from_request(request, database, table, datasette) request - object The current HTTP Request object . database - string The name of the database. table - string The name of the table. datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. This hook runs on the table page, and can influence the where clause of the SQL query used to populate that page, based on query string arguments on the incoming request. The hook should return an instance of datasette.filters.FilterArguments which has one required and three optional arguments: return FilterArguments( where_clauses=["id > :max_id"], params={"max_id": 5}, human_descriptions=["max_id is greater than 5"], extra_context={}, ) The arguments to the FilterArguments class constructor are as follows: where_clauses - list of strings, required A list of SQL fragments that will be inserted into the SQL query, joined by the and operator. These can include :named parameters which will be populated using data in params . params - dictionary, optional Additional keyword arguments to be used when the query is executed. These should match any :arguments in the where clauses. … 3  
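Putting that together, here is a hedged sketch of a complete hook reacting to a hypothetical ?_max_id= query string argument (input validation is omitted for brevity):

from datasette import hookimpl
from datasette.filters import FilterArguments


@hookimpl
def filters_from_request(request):
    # _max_id is an invented parameter, used only for illustration
    max_id = request.args.get("_max_id")
    if max_id is not None:
        return FilterArguments(
            where_clauses=["id <= :max_id"],
            params={"max_id": int(max_id)},
            human_descriptions=["id is at most {}".format(max_id)],
        )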
197 197 permission_allowed(datasette, actor, action, resource) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. actor - dictionary The current actor, as decided by actor_from_request(datasette, request) . action - string The action to be performed, e.g. "edit-table" . resource - string or None An identifier for the individual resource, e.g. the name of the table. Called to check that an actor has permission to perform an action on a resource. Can return True if the action is allowed, False if the action is not allowed or None if the plugin does not have an opinion one way or the other. Here's an example plugin which randomly decides whether a permission should be allowed or denied, except for view-instance which always uses the default permission scheme instead. from datasette import hookimpl import random @hookimpl def permission_allowed(action): if action != "view-instance": # Return True or False at random return random.random() > 0.5 # Returning None falls back to default permissions This function can alternatively return an awaitable function which itself returns True , False or None . You can use this option if you need to execute additional database queries using await datasette.execute(...) . Here's an example that allows users to view the admin_log table only if their actor id is present in the admin_users table. It also disallows arbitrary SQL queries for the staff… 3
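That second example is truncated; here is a hedged sketch of the awaitable form, checking a hypothetical admin_users table in a database assumed to be called staff (view-table permission checks pass the resource as a (database, table) pair):

from datasette import hookimpl


@hookimpl
def permission_allowed(datasette, actor, action, resource):
    async def inner():
        if action == "view-table" and resource == ("staff", "admin_log"):
            if actor is None:
                return False
            result = await datasette.get_database("staff").execute(
                "select count(*) from admin_users where id = :id",
                {"id": actor["id"]},
            )
            return bool(result.first()[0])
        # Returning None leaves the decision to other plugins or the defaults

    return inner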
198 198 register_magic_parameters(datasette) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) . Magic parameters can be used to add automatic parameters to canned queries . This plugin hook allows additional magic parameters to be defined by plugins. Magic parameters all take this format: _prefix_rest_of_parameter . The prefix indicates which magic parameter function should be called - the rest of the parameter is passed as an argument to that function. To register a new function, return it as a tuple of (string prefix, function) from this hook. The function you register should take two arguments: key and request , where key is the rest_of_parameter portion of the parameter and request is the current Request object . This example registers two new magic parameters: :_request_http_version returning the HTTP version of the current request, and :_uuid_new which returns a new UUID: from datasette import hookimpl from uuid import uuid4 def uuid(key, request): if key == "new": return str(uuid4()) else: raise KeyError def request(key, request): if key == "http_version": return request.scope["http_version"] else: raise KeyError @hookimpl def register_magic_parameters(datasette): return [ ("request", request), ("uuid", uuid), ] 3
199 199 forbidden(datasette, request, message) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. request - object The current HTTP Request object . message - string A message hinting at why the request was forbidden. Plugins can use this to customize how Datasette responds when a 403 Forbidden error occurs - usually because a page failed a permission check, see Permissions . If a plugin hook wishes to react to the error, it should return a Response object . This example returns a redirect to a /-/login page: from datasette import hookimpl, Response from urllib.parse import urlencode @hookimpl def forbidden(request, message): return Response.redirect( "/-/login?" + urlencode({"message": message}) ) The function can alternatively return an awaitable function if it needs to make any asynchronous method calls. This example renders a template: from datasette import hookimpl from datasette.utils.asgi import Response @hookimpl def forbidden(datasette): async def inner(): return Response.html( await datasette.render_template( "forbidden.html" ) ) return inner 3
200 200 menu_links(datasette, actor, request) datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. actor - dictionary or None The currently authenticated actor . request - object or None The current HTTP Request object . This can be None if the request object is not available. This hook allows additional items to be included in the menu displayed by Datasette's top right menu icon. The hook should return a list of {"href": "...", "label": "..."} menu items. These will be added to the menu. It can alternatively return an async def awaitable function which returns a list of menu items. This example adds a new menu item but only if the signed in user is "root" : from datasette import hookimpl @hookimpl def menu_links(datasette, actor): if actor and actor.get("id") == "root": return [ { "href": datasette.urls.path( "/-/edit-schema" ), "label": "Edit schema", }, ] Using datasette.urls here ensures that links in the menu will take the base_url setting into account. Examples: datasette-search-all , datasette-graphql 3  
