rowid,title,content,sections_fts,rank
101,0.52.3 (2020-12-03),Fixed bug where static assets would 404 for Datasette installed on ARM Amazon Linux. ( #1124 ),21,
102,0.52.2 (2020-12-02),"Generated columns from SQLite 3.31.0 or higher are now correctly displayed. ( #1116 )
Error message if you attempt to open a SpatiaLite database now suggests using --load-extension=spatialite if it detects that the extension is available in a common location. ( #1115 )
OPTIONS requests against the /database page no longer raise a 500 error. ( #1100 )
Databases larger than 32MB that are published to Cloud Run can now be downloaded. ( #749 )
Fix for misaligned cog icon on table and database pages. Thanks, Abdussamet Koçak. ( #1121 )",21,
103,0.52.1 (2020-11-29),"Documentation on Testing plugins now recommends using datasette.client . ( #1102 )
Fix bug where compound foreign keys produced broken links. ( #1098 )
datasette --load-extension=spatialite now also checks for /usr/local/lib/mod_spatialite.so . Thanks, Dan Peterson. ( #1114 )",21,
104,0.52 (2020-11-28),"This release includes a number of changes relating to an internal rebranding effort: Datasette's configuration mechanism (things like datasette --config default_page_size:10 ) has been renamed to settings .
New --setting default_page_size 10 option as a replacement for --config default_page_size:10 (note the lack of a colon). The --config option is deprecated but will continue working until Datasette 1.0. ( #992 )
The /-/config introspection page is now /-/settings , and the previous page redirects to the new one. ( #1103 )
The config.json file in Configuration directory mode is now called settings.json . ( #1104 )
The undocumented datasette.config() internal method has been replaced by a documented .setting(key) method. ( #1107 )
Also in this release:
New plugin hook: database_actions(datasette, actor, database, request) , which adds menu items to a new cog menu shown at the top of the database page. ( #1077 )
datasette publish cloudrun has a new --apt-get-install option that can be used to install additional Ubuntu packages as part of the deployment. This is useful for deploying the new datasette-ripgrep plugin . ( #1110 )
Swept the documentation to remove words that minimize involved difficulty. ( #1089 )
And some bug fixes:
Foreign keys linking to rows with blank label columns now display as a hyphen, allowing those links to be clicked. ( #1086 )
Fixed bug where row pages could sometimes 500 if the underlying queries exceeded a time limit. ( #1088 )
Fixed a bug where the table action menu could appear partially obscured by the edge of the page. ( #1084 )",21,
105,0.51.1 (2020-10-31),Improvements to the new Binary data documentation page.,21,
106,0.51 (2020-10-31),"A new visual design, plugin hooks for adding navigation options, better handling of binary data, URL building utility methods and better support for running Datasette behind a proxy.",21,
107,New visual design,"Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. ( #1056 )",21,
108,Plugins can now add links within Datasette,"A number of existing Datasette plugins add new pages to the Datasette interface, providing tools for things like uploading CSVs , editing table schemas or configuring full-text search .
Plugins like this can now link to themselves from other parts of the Datasette interface. The menu_links(datasette, actor, request) hook ( #1064 ) lets plugins add links to Datasette's new top-right application menu, and the table_actions(datasette, actor, database, table, request) hook ( #1066 ) adds links to a new ""table actions"" menu on the table page.
The demo at latest.datasette.io now includes some example plugins. To see the new table actions menu, first sign in to that demo as root and then visit the facetable table to see the new cog icon menu at the top of the page.",21,
109,Binary data,"SQLite tables can contain binary data in BLOB columns. Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See Binary data for more details. ( #1036 and #1034 ).",21,
110,URL building,"The new datasette.urls family of methods can be used to generate URLs to key pages within the Datasette interface, both within custom templates and Datasette plugins. See Building URLs within plugins for more details. ( #904 )",21,
111,Running Datasette behind a proxy,"The base_url configuration option is designed to help run Datasette on a specific path behind a proxy - for example if you want to run an instance of Datasette at /my-datasette/ within your existing site's URL hierarchy, proxied behind nginx or Apache.
Support for this configuration option has been greatly improved ( #1023 ), and guidelines for using it are now available in a new documentation section on Running Datasette behind a proxy . ( #1027 )",21,
112,Smaller changes,"Wide tables shown within Datasette now scroll horizontally ( #998 ). This is achieved using a new
element which may impact the implementation of some plugins (for example this change to datasette-cluster-map ).
New debug-menu permission. ( #1068 )
Removed --debug option, which didn't do anything. ( #814 )
Link: HTTP header pagination. ( #1014 )
x button for clearing filters. ( #1016 )
Edit SQL button on canned queries. ( #1019 )
--load-extension=spatialite shortcut. ( #1028 )
scale-in animation for column action menu. ( #1039 )
Option to pass a list of templates to .render_template() is now documented. ( #1045 )
New datasette.urls.static_plugins() method. ( #1033 )
datasette -o option now opens the most relevant page. ( #976 )
datasette --cors option now enables access to /database.db downloads. ( #1057 )
Database file downloads now implement cascading permissions, so you can download a database if you have view-database-download permission even if you do not have permission to access the Datasette instance. ( #1058 )
New documentation on Designing URLs for your plugin . ( #1053 )",21,
113,0.50.2 (2020-10-09),Fixed another bug introduced in 0.50 where column header links on the table page were broken. ( #1011 ),21,
114,0.50.1 (2020-10-09),"Fixed a bug introduced in 0.50 where the export as JSON/CSV links on the table, row and query pages were broken. ( #1010 )",21,
115,0.50 (2020-10-09),"The key new feature in this release is the column actions menu on the table page ( #891 ). This can be used to sort a column in ascending or descending order, facet data by that column or filter the table to just rows that have a value for that column.
Plugin authors can use the new datasette.client object to make internal HTTP requests from their plugins, allowing them to make use of Datasette's JSON API. ( #943 )
New Deploying Datasette documentation with guides for deploying Datasette on a Linux server using systemd or to hosting providers that support buildpacks . ( #514 , #997 )
Other improvements in this release:
Publishing to Google Cloud Run documentation now covers Google Cloud SDK options. Thanks, Geoffrey Hing. ( #995 )
New datasette -o option which opens your browser as soon as Datasette starts up. ( #970 )
Datasette now sets sqlite3.enable_callback_tracebacks(True) so that errors in custom SQL functions will display tracebacks. ( #891 )
Fixed two rendering bugs with column headers in portrait mobile view. ( #978 , #980 )
New db.table_column_details(table) introspection method for retrieving full details of the columns in a specific table, see Database introspection .
Fixed a routing bug with custom page wildcard templates. ( #996 )
datasette publish heroku now deploys using Python 3.8.6.
New datasette publish heroku --tar= option. ( #969 )
OPTIONS requests against HTML pages no longer return a 500 error. ( #1001 )
Datasette now supports Python 3.9.
See also Datasette 0.50: The annotated release notes .",21,
116,0.49.1 (2020-09-15),Fixed a bug with writable canned queries that use magic parameters but accept no non-magic arguments. ( #967 ),21,
117,0.49 (2020-09-14),"See also Datasette 0.49: The annotated release notes .
Writable canned queries now expose a JSON API, see JSON API for writable canned queries . ( #880 )
New mechanism for defining page templates with custom path parameters - a template file called pages/about/{slug}.html will be used to render any requests to /about/something . See Path parameters for pages . ( #944 )
register_output_renderer() render functions can now return a Response . ( #953 )
New --upgrade option for datasette install . ( #945 )
New datasette --pdb option. ( #962 )
datasette --get exit code now reflects the internal HTTP status code. ( #947 )
New raise_404() template function for returning 404 errors. ( #964 )
datasette publish heroku now deploys using Python 3.8.5
Upgraded CodeMirror to 5.57.0. ( #948 )
Upgraded code style to Black 20.8b1. ( #958 )
Fixed bug where selected facets were not correctly persisted in hidden form fields on the table page. ( #963 )
Renamed the default error template from 500.html to error.html .
Custom error pages are now documented, see Custom error pages . ( #965 )",21,
118,0.48 (2020-08-16),"Datasette documentation now lives at docs.datasette.io .
db.is_mutable property is now documented and tested, see Database introspection .
The extra_template_vars , extra_css_urls , extra_js_urls and extra_body_script plugin hooks now all accept the same arguments. See extra_template_vars(template, database, table, columns, view_name, request, datasette) for details. ( #939 )
Those hooks now accept a new columns argument detailing the table columns that will be rendered on that page. ( #938 )
Fixed bug where plugins calling db.execute_write_fn() could hang Datasette if the connection failed. ( #935 )
Fixed bug with the ?_nl=on output option and binary data. ( #914 )",21,
119,0.47.3 (2020-08-15),The datasette --get command-line mechanism now ensures any plugins using the startup() hook are correctly executed. ( #934 ),21,
120,0.47.2 (2020-08-12),Fixed an issue with the Docker image published to Docker Hub . ( #931 ),21,
121,0.47.1 (2020-08-11),Fixed a bug where the sdist distribution of Datasette was not correctly including the template files. ( #930 ),21,
122,0.47 (2020-08-11),"Datasette now has a GitHub discussions forum for conversations about the project that go beyond just bug reports and issues.
Datasette can now be installed on macOS using Homebrew! Run brew install simonw/datasette/datasette . See Using Homebrew . ( #335 )
Two new commands: datasette install name-of-plugin and datasette uninstall name-of-plugin . These are equivalent to pip install and pip uninstall but automatically run in the same virtual environment as Datasette, so users don't have to figure out where that virtual environment is - useful for installations created using Homebrew or pipx . See Installing plugins . ( #925 )
A new command-line option, datasette --get , accepts a path to a URL within the Datasette instance. It will run that request through Datasette (without starting a web server) and print out the response. See datasette --get for an example. ( #926 )",21,
123,0.46 (2020-08-09),"This release contains a security fix related to authenticated writable canned queries. If you are using this feature you should upgrade as soon as possible.
Security fix: CSRF tokens were incorrectly included in read-only canned query forms, which could allow them to be leaked to a sophisticated attacker. See issue 918 for details.
Datasette now supports GraphQL via the new datasette-graphql plugin - see GraphQL in Datasette with the new datasette-graphql plugin .
Principal git branch has been renamed from master to main. ( #849 )
New debugging tool: /-/allow-debug ( demo here ) helps test allow blocks against actors, as described in Defining permissions with ""allow"" blocks . ( #908 )
New logo for the documentation, and a new project tagline: ""An open source multi-tool for exploring and publishing data"".
Whitespace in column values is now respected on display, using white-space: pre-wrap . ( #896 )
New await request.post_body() method for accessing the raw POST body, see Request object . ( #897 )
Database file downloads now include a content-length HTTP header, enabling download progress bars. ( #905 )
File downloads now also correctly set the suggested file name using a content-disposition HTTP header. ( #909 )
Tests are now correctly excluded from the Datasette package - thanks, abeyerpath. ( #456 )
The Datasette package published to PyPI now includes sdist as well as bdist_wheel .
Better titles for canned query pages. ( #887 )
Now only loads Python files from a directory passed using the --plugins-dir option - thanks, Amjith Ramanujam. ( #890 )
New documentation section on Publishing to Vercel .",21,
124,0.45 (2020-07-01),"See also Datasette 0.45: The annotated release notes .
Magic parameters for canned queries, a log out feature, improved plugin documentation and four new plugin hooks.",21,
125,Magic parameters for canned queries,"Canned queries now support Magic parameters , which can be used to insert or select automatically generated values. For example:
insert into logs
(user_id, timestamp)
values
(:_actor_id, :_now_datetime_utc)
This inserts the currently authenticated actor ID and the current datetime. ( #842 )",21,
126,Log out,"The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism ) to authenticate users. The new /-/logout page provides a way to clear that cookie.
A ""Log out"" button now shows in the global navigation provided the user is authenticated using the ds_actor cookie. ( #840 )",21,
127,Better plugin documentation,"The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 )
Plugins introduces Datasette's plugin system and describes how to install and configure plugins.
Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template.
Plugin hooks is a full list of detailed documentation for every Datasette plugin hook.
Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX .",21,
128,New plugin hooks,"register_magic_parameters(datasette) can be used to define new types of magic canned query parameters.
startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. ( #834 )
canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. ( #852 )
forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. ( #812 )",21,
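A minimal sketch of one of these hooks, canned_queries(), in use; the database name, query name and SQL below are hypothetical:
from datasette import hookimpl

@hookimpl
def canned_queries(datasette, database):
    # Only offer this extra query on a hypothetical "logs" database
    if database == "logs":
        return {
            "recent_logs": {
                "sql": "select * from logs order by timestamp desc limit 10"
            }
        }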
129,Smaller changes,"Cascading view permissions - so if a user has view-table they can view the table page even if they do not have view-database or view-instance . ( #832 )
CSRF protection no longer applies to Authentication: Bearer token requests or requests without cookies. ( #835 )
datasette.add_message() now works inside plugins. ( #864 )
Workaround for ""Too many open files"" error in test runs. ( #846 )
Respect existing scope[""actor""] if already set by ASGI middleware. ( #854 )
New process for shipping Alpha and beta releases . ( #807 )
{{ csrftoken() }} now works when plugins render a template using datasette.render_template(..., request=request) . ( #863 )
Datasette now creates a single Request object and uses it throughout the lifetime of the current HTTP request. ( #870 )",21,
130,0.44 (2020-06-11),"See also Datasette 0.44: The annotated release notes .
Authentication and permissions, writable canned queries, flash messages, new plugin hooks and more.",21,
131,Authentication,"Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through datasette-auth-github .
0.44 introduces Authentication and permissions as core Datasette concepts ( #699 ). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example.
You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new --root command-line option, which outputs a one-time use URL to authenticate as a root actor ( #784 ):
datasette fixtures.db --root
http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477
INFO: Started server process [14973]
INFO: Waiting for application startup.
INFO: Application startup complete.
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)
Plugins can implement new ways of authenticating users using the new actor_from_request(datasette, request) hook.",21,
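A minimal sketch of the actor_from_request hook; the query string token scheme here is purely illustrative:
from datasette import hookimpl

@hookimpl
def actor_from_request(datasette, request):
    # Hypothetical scheme: a shared secret passed as ?_token= authenticates as root
    if request.args.get("_token") == "my-secret-token":
        return {"id": "root"}
    return None  # no actor: leave the request unauthenticated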
132,Permissions,"Datasette also now has a built-in concept of Permissions . The permissions system answers the following question:
Is this actor allowed to perform this action , optionally against this particular resource ?
You can use the new ""allow"" block syntax in metadata.json (or metadata.yaml ) to set required permissions at the instance, database, table or canned query level. For example, to restrict access to the fixtures.db database to the ""root"" user:
{
""databases"": {
""fixtures"": {
""allow"": {
""id"" ""root""
}
}
}
}
See Defining permissions with ""allow"" blocks for more details.
Plugins can implement their own custom permission checks using the new permission_allowed(datasette, actor, action, resource) hook.
A new debug page at /-/permissions shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the permissions-debug permission. ( #788 )",21,
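A minimal sketch of the permission_allowed hook, using a hypothetical "admin" actor:
from datasette import hookimpl

@hookimpl
def permission_allowed(actor, action):
    # Hypothetical policy: only the "admin" actor may view the permissions debug page
    if action == "permissions-debug":
        return actor is not None and actor.get("id") == "admin"
    return None  # None defers the decision to other plugins and the default rules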
133,Writable canned queries,"Datasette's Canned queries feature lets you define SQL queries in metadata.json which can then be executed by users visiting a specific URL. https://latest.datasette.io/fixtures/neighborhood_search for example.
Canned queries were previously restricted to SELECT , but Datasette 0.44 introduces the ability for canned queries to execute INSERT or UPDATE queries as well, using the new ""write"": true property ( #800 ):
{
""databases"": {
""dogs"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": true
}
}
}
}
}
See Writable canned queries for more details.",21,
134,Flash messages,"Writable canned queries needed a mechanism to let the user know that the query has been successfully executed. The new flash messaging system ( #790 ) allows messages to persist in signed cookies which are then displayed to the user on the next page that they visit. Plugins can use this mechanism to display their own messages, see .add_message(request, message, type=datasette.INFO) for details.
You can try out the new messages using the /-/messages debug tool, for example at https://latest.datasette.io/-/messages",21,
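A minimal sketch of a plugin view adding a message before redirecting; the message text and redirect target are illustrative:
from datasette.utils.asgi import Response

async def save_and_redirect(request, datasette):
    datasette.add_message(request, "Query saved", datasette.INFO)
    return Response.redirect("/")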
135,Signed values and secrets,"Both flash messages and user authentication needed a way to sign values and set signed cookies. Two new methods are now available for plugins to take advantage of this mechanism: .sign(value, namespace=""default"") and .unsign(value, namespace=""default"") .
Datasette will generate a secret automatically when it starts up, but to avoid resetting the secret (and hence invalidating any cookies) every time the server restarts you should set your own secret. You can pass a secret to Datasette using the new --secret option or with a DATASETTE_SECRET environment variable. See Configuring the secret for more details.
You can also set a secret when you deploy Datasette using datasette publish or datasette package - see Using secrets with datasette publish .
Plugins can now sign values and verify their signatures using the datasette.sign() and datasette.unsign() methods.",21,
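A minimal sketch of signing and verifying a value, assuming a datasette instance object is in scope and using a hypothetical namespace:
token = datasette.sign({"id": "my-user"}, namespace="my-plugin")
value = datasette.unsign(token, namespace="my-plugin")  # {"id": "my-user"}
# datasette.unsign() raises itsdangerous.BadSignature if the value has been tampered with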
136,CSRF protection,"Since writable canned queries are built using POST forms, Datasette now ships with CSRF protection ( #798 ). This applies automatically to any POST request, which means plugins need to include a csrftoken in any POST forms that they render. They can do that like so:
",21,
137,Cookie methods,"Plugins can now use the new response.set_cookie() method to set cookies.
A new request.cookies property on the Request object can be used to read incoming cookies.",21,
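A minimal sketch combining both methods in a hypothetical plugin view:
from datasette.utils.asgi import Response

async def prefs(request):
    current = request.cookies.get("prefs")  # None if the cookie has not been set
    response = Response.text("Current prefs: {}".format(current))
    response.set_cookie("prefs", "dark-mode")  # set a cookie on the outgoing response
    return response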
138,register_routes() plugin hooks,"Plugins can now register new views and routes via the register_routes(datasette) plugin hook ( #819 ). View functions can be defined that accept any of the current datasette object, the current request , or the ASGI scope , send and receive objects.",21,
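A minimal sketch of the hook registering a single route; the path is hypothetical:
from datasette import hookimpl
from datasette.utils.asgi import Response

async def hello(request):
    return Response.text("Hello from a plugin-registered route")

@hookimpl
def register_routes():
    return [
        (r"^/-/hello$", hello),
    ]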
139,Smaller changes,"New internals documentation for Request object and Response class . ( #706 )
request.url now respects the force_https_urls config setting. ( #781 )
request.args.getlist() returns [] if missing. Removed request.raw_args entirely. ( #774 )
New datasette.get_database() method.
Added _ prefix to many private, undocumented methods of the Datasette class. ( #576 )
Removed the db.get_outbound_foreign_keys() method which duplicated the behaviour of db.foreign_keys_for_table() .
New await datasette.permission_allowed() method.
/-/actor debugging endpoint for viewing the currently authenticated actor.
New request.cookies property.
/-/plugins endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1
request.post_vars() method no longer discards empty values.
New ""params"" canned query key for explicitly setting named parameters, see Canned query parameters . ( #797 )
request.args is now a MultiParams object.
Fixed a bug with the datasette plugins command. ( #802 )
Nicer pattern for using make_app_client() in tests. ( #395 )
New request.actor property.
Fixed broken CSS on nested 404 pages. ( #777 )
New request.url_vars property. ( #822 )
Fixed a bug with the python tests/fixtures.py command for outputting Datasette's testing fixtures database and plugins. ( #804 )
datasette publish heroku now deploys using Python 3.8.3.
Added a warning that the register_facet_classes() hook is unstable and may change in the future. ( #830 )
The {""$env"": ""ENVIRONMENT_VARIBALE""} mechanism (see Secret configuration values ) now works with variables inside nested lists. ( #837 )",21,
140,The road to Datasette 1.0,"I've assembled a milestone for Datasette 1.0 . The focus of the 1.0 release will be the following:
Signify confidence in the quality/stability of Datasette
Give plugin authors confidence that their plugins will work for the whole 1.x release cycle
Provide the same confidence to developers building against Datasette JSON APIs
If you have thoughts about what you would like to see for Datasette 1.0 you can join the conversation on issue #519 .",21,
141,0.43 (2020-05-28),"The main focus of this release is a major upgrade to the register_output_renderer(datasette) plugin hook, which allows plugins to provide new output formats for Datasette such as datasette-atom and datasette-ics .
Redesign of register_output_renderer(datasette) to provide more context to the render callback and support an optional ""can_render"" callback that controls if a suggested link to the output format is provided. ( #581 , #770 )
Visually distinguish float and integer columns - useful for figuring out why order-by-column might be returning unexpected results. ( #729 )
The Request object , which is passed to several plugin hooks, is now documented. ( #706 )
New metadata.json option for setting a custom default page size for specific tables and views, see Setting a custom page size . ( #751 )
Canned queries can now be configured with a default URL fragment hash, useful when working with plugins such as datasette-vega , see Additional canned query options . ( #706 )
Fixed a bug in datasette publish when running on operating systems where the /tmp directory lives in a different volume, using a backport of the Python 3.8 shutil.copytree() function. ( #744 )
Every plugin hook is now covered by the unit tests, and a new unit test checks that each plugin hook has at least one corresponding test. ( #771 , #773 )",21,
142,0.42 (2020-05-08),"A small release which provides improved internal methods for use in plugins, along with documentation. See #685 .
Added documentation for db.execute() , see await db.execute(sql, ...) .
Renamed db.execute_against_connection_in_thread() to db.execute_fn() and made it a documented method, see await db.execute_fn(fn) .
New results.first() and results.single_value() methods, plus documentation for the Results class - see Results .",21,
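A minimal sketch of these methods from inside an async plugin function, assuming `db` is a Datasette Database object and the table name is hypothetical:
results = await db.execute("select id, name from my_table limit 5")
row = results.first()         # first row, or None if there were no results
count = await db.execute("select count(*) from my_table")
total = count.single_value()  # the single value from a one-row, one-column result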
143,0.41 (2020-05-06),"You can now create custom pages within your Datasette instance using a custom template file. For example, adding a template file called templates/pages/about.html will result in a new page being served at /about on your instance. See the custom pages documentation for full details, including how to return custom HTTP headers, redirects and status codes. ( #648 )
Configuration directory mode ( #731 ) allows you to define a custom Datasette instance as a directory. So instead of running the following:
datasette one.db two.db \
--metadata=metadata.json \
--template-dir=templates/ \
--plugins-dir=plugins \
--static css:css
You can instead arrange your files in a single directory called my-project and run this:
datasette my-project/
Also in this release:
New NOT LIKE table filter: ?colname__notlike=expression . ( #750 )
Datasette now has a pattern portfolio at /-/patterns - e.g. https://latest.datasette.io/-/patterns . This is a page that shows every Datasette user interface component in one place, to aid core development and people building custom CSS themes. ( #151 )
SQLite PRAGMA functions such as pragma_table_info(tablename) are now allowed in Datasette SQL queries. ( #761 )
Datasette pages now consistently return a content-type of text/html; charset=utf-8 . ( #752 )
Datasette now handles an ASGI raw_path value of None , which should allow compatibility with the Mangum adapter for running ASGI apps on AWS Lambda. Thanks, Colin Dellow. ( #719 )
Installation documentation now covers Using pipx . ( #756 )
Improved the documentation for Full-text search . ( #748 )",21,
144,0.40 (2020-04-21),"Datasette Metadata can now be provided as a YAML file as an optional alternative to JSON. ( #713 )
Removed support for datasette publish now , which used the now-retired Zeit Now v1 hosting platform. A new plugin, datasette-publish-now , can be installed to publish data to Zeit ( now Vercel ) Now v2. ( #710 )
Fixed a bug where the extra_template_vars(request, view_name) plugin hook was not receiving the correct view_name . ( #716 )
Variables added to the template context by the extra_template_vars() plugin hook are now shown in the ?_context=1 debugging mode (see template_debug ). ( #693 )
Fixed a bug where the ""templates considered"" HTML comment was no longer being displayed. ( #689 )
Fixed a datasette publish bug where --plugin-secret would over-ride plugin configuration in the provided metadata.json file. ( #724 )
Added a new CSS class for customizing the canned query page. ( #727 )",21,
145,0.39 (2020-03-24),"New base_url configuration setting for serving up the correct links while running Datasette under a different URL prefix. ( #394 )
New metadata settings ""sort"" and ""sort_desc"" for setting the default sort order for a table. See Setting a default sort order . ( #702 )
Sort direction arrow now displays by default on the primary key. This means you only have to click once (not twice) to sort in reverse order. ( #677 )
New await Request(scope, receive).post_vars() method for accessing POST form variables. ( #700 )
Plugin hooks documentation now links to example uses of each plugin. ( #709 )",21,
146,0.38 (2020-03-08),"The Docker build of Datasette now uses SQLite 3.31.1, upgraded from 3.26. ( #695 )
datasette publish cloudrun now accepts an optional --memory=2Gi flag for setting the Cloud Run allocated memory to a value other than the default (256Mi). ( #694 )
Fixed bug where templates that shipped with plugins were sometimes not being correctly loaded. ( #697 )",21,
147,0.37.1 (2020-03-02),"Don't attempt to count table rows to display on the index page for databases > 100MB. ( #688 )
Print exceptions if they occur in the write thread rather than silently swallowing them.
Handle the possibility of scope[""path""] being a string rather than bytes
Better documentation for the extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook.",21,
148,0.37 (2020-02-25),"Plugins now have a supported mechanism for writing to a database, using the new .execute_write() and .execute_write_fn() methods. Documentation . ( #682 )
Immutable databases that have had their rows counted using the inspect command now use the calculated count more effectively - thanks, Kevin Keogh. ( #666 )
--reload no longer restarts the server if a database file is modified, unless that database was opened in immutable mode with -i . ( #494 )
New ?_searchmode=raw option turns off escaping for FTS queries in ?_search= allowing full use of SQLite's FTS5 query syntax . ( #676 )",21,
149,0.36 (2020-02-21),"The datasette object passed to plugins now has API documentation: Datasette class . ( #576 )
New methods on datasette : .add_database() and .remove_database() - documentation . ( #671 )
prepare_connection() plugin hook now takes optional datasette and database arguments - prepare_connection(conn, database, datasette) . ( #678 )
Added three new plugins and one new conversion tool to the The Datasette Ecosystem .",21,
150,0.35 (2020-02-04),"Added five new plugins and one new conversion tool to the The Datasette Ecosystem .
The Datasette class has a new render_template() method which can be used by plugins to render templates using Datasette's pre-configured Jinja templating library.
You can now execute SQL queries that start with a -- comment - thanks, Jay Graves ( #653 )",21,
151,0.34 (2020-01-29),"_search= queries are now correctly escaped using a new escape_fts() custom SQL function. This means you can now run searches for strings like park. without seeing errors. ( #651 )
Google Cloud Run is no longer in beta, so datasette publish cloudrun has been updated to work even if the user has not installed the gcloud beta components package. Thanks, Katie McLaughlin ( #660 )
datasette package now accepts a --port option for specifying which port the resulting Docker container should listen on. ( #661 )",21,
152,0.33 (2019-12-22),"rowid is now included in dropdown menus for filtering tables ( #636 )
Columns are now only suggested for faceting if they have at least one value with more than one record ( #638 )
Queries with no results now display ""0 results"" ( #637 )
Improved documentation for the --static option ( #641 )
asyncio task information is now included on the /-/threads debug page
Bumped Uvicorn dependency to 0.11
You can now use --port 0 to listen on an available port
New template_debug setting for debugging templates, e.g. https://latest.datasette.io/fixtures/roadside_attractions?_context=1 ( #654 )",21,
153,0.32 (2019-11-14),"Datasette now renders templates using Jinja async mode . This means plugins can provide custom template functions that perform asynchronous actions, for example the new datasette-template-sql plugin which allows custom templates to directly execute SQL queries and render their results. ( #628 )",21,
154,0.31.2 (2019-11-13),"Fixed a bug where datasette publish heroku applications failed to start ( #633 )
Fix for datasette publish with just --source_url - thanks, Stanley Zheng ( #572 )
Deployments to Heroku now use Python 3.8.0 ( #632 )",21,
155,0.31.1 (2019-11-12),Deployments created using datasette publish now use python:3.8 base Docker image ( #629 ),21,
156,0.31 (2019-11-11),"This version adds compatibility with Python 3.8 and breaks compatibility with Python 3.5.
If you are still running Python 3.5 you should stick with 0.30.2 , which you can install like this:
pip install datasette==0.30.2
Format SQL button now works with read-only SQL queries - thanks, Tobias Kunze ( #602 )
New ?column__notin=x,y,z filter for table views ( #614 )
Table view now uses select col1, col2, col3 instead of select *
Database filenames can now contain spaces - thanks, Tobias Kunze ( #590 )
Removed obsolete ?_group_count=col feature ( #504 )
Improved user interface and documentation for datasette publish cloudrun ( #608 )
Tables with indexes now show the CREATE INDEX statements on the table page ( #618 )
Current version of uvicorn is now shown on /-/versions
Python 3.8 is now supported! ( #622 )
Python 3.5 is no longer supported.",21,
157,0.30.2 (2019-11-02),"/-/plugins page now uses distribution name e.g. datasette-cluster-map instead of the name of the underlying Python package ( datasette_cluster_map ) ( #606 )
Array faceting is now only suggested for columns that contain arrays of strings ( #562 )
Better documentation for the --host argument ( #574 )
Don't show None with a broken link for the label on a nullable foreign key ( #406 )",21,
158,0.30.1 (2019-10-30),"Fixed bug where ?_where= parameter was not persisted in hidden form fields ( #604 )
Fixed bug with .JSON representation of row pages - thanks, Chris Shaw ( #603 )",21,
159,0.30 (2019-10-18),"Added /-/threads debugging page
Allow EXPLAIN WITH... ( #583 )
Button to format SQL - thanks, Tobias Kunze ( #136 )
Sort databases on homepage by argument order - thanks, Tobias Kunze ( #585 )
Display metadata footer on custom SQL queries - thanks, Tobias Kunze ( #589 )
Use --platform=managed for publish cloudrun ( #587 )
Fixed bug returning non-ASCII characters in CSV ( #584 )
Fix for /foo v.s. /foo-bar bug ( #601 )",21,
160,0.29.3 (2019-09-02),"Fixed implementation of CodeMirror on database page ( #560 )
Documentation typo fixes - thanks, Min ho Kim ( #561 )
Mechanism for detecting if a table has FTS enabled now works if the table name used alternative escaping mechanisms ( #570 ) - for compatibility with a recent change to sqlite-utils .",21,
161,0.29.2 (2019-07-13),"Bumped Uvicorn to 0.8.4, fixing a bug where the query string was not included in the server logs. ( #559 )
Fixed bug where the navigation breadcrumbs were not displayed correctly on the page for a custom query. ( #558 )
Fixed bug where custom query names containing unicode characters caused errors.",21,
162,0.29.1 (2019-07-11),"Fixed bug with static mounts using relative paths which could lead to traversal exploits ( #555 ) - thanks Abdussamet Kocak!
Datasette can now be run as a module: python -m datasette ( #556 ) - thanks, Abdussamet Kocak!",21,
163,0.29 (2019-07-07),"ASGI, new plugin hooks, facet by date and much, much more...",21,
164,ASGI,"ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic .
I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down .
The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.",21,
165,New plugin hook: asgi_wrapper,"The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. ( #520 )
Two new plugins take advantage of this hook:
datasette-auth-github adds an authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams .
datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make fetch() calls to the JSON API provided by your Datasette instance.",21,
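A minimal sketch of an asgi_wrapper implementation that adds a response header; the header name is made up for illustration:
from datasette import hookimpl

@hookimpl
def asgi_wrapper(datasette):
    def wrap(app):
        async def wrapped_app(scope, receive, send):
            async def wrapped_send(event):
                if event["type"] == "http.response.start":
                    headers = list(event.get("headers") or [])
                    headers.append((b"x-served-by", b"my-plugin"))
                    event = dict(event, headers=headers)
                await send(event)
            await app(scope, receive, wrapped_send)
        return wrapped_app
    return wrap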
166,New plugin hook: extra_template_vars,"The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540 ).",21,
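A minimal sketch of the hook injecting one extra variable into every template; the variable name is illustrative:
from datasette import hookimpl

@hookimpl
def extra_template_vars(request):
    # Expose the incoming User-Agent to templates as {{ user_agent }}
    return {"user_agent": request.headers.get("user-agent") if request else None}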
167,Secret plugin configuration options,"Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata a new mechanism was needed. Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable:
{
""plugins"": {
""datasette-auth-github"": {
""client_secret"": {
""$env"": ""GITHUB_CLIENT_SECRET""
}
}
}
}
These plugin secrets can be set directly using datasette publish . See Custom metadata and plugins for details. ( #538 and #543 )",21,
168,Facet by date,"If a column contains datetime values, Datasette can now facet that column by date. ( #481 )",21,
169,Easier custom templates for table rows,"If you want to customize the display of individual table rows, you can do so using a _table.html template include that looks something like this:
{% for row in display_rows %}
{{ row[""title""] }}
{{ row[""description""] }}
Category: {{ row.display(""category_id"") }}
{% endfor %}
This is a backwards incompatible change . If you previously had a custom template called _rows_and_columns.html you need to rename it to _table.html .
See Custom templates for full details.",21,
170,?_through= for joins through many-to-many tables,"The new ?_through={json} argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example . ( #355 )
This feature was added to help support facet by many-to-many , which isn't quite ready yet but will be coming in the next Datasette release.",21,
171,Small changes,"Databases published using datasette publish now open in Immutable mode . ( #469 )
?col__date= now works for columns containing spaces
Automatic label detection (for deciding which column to show when linking to a foreign key) has been improved. ( #485 )
Fixed bug where pagination broke when combined with an expanded foreign key. ( #489 )
Contributors can now run pip install -e .[docs] to get all of the dependencies needed to build the documentation, including cd docs && make livehtml support.
Datasette's dependencies are now all specified using the ~= match operator. ( #532 )
white-space: pre-wrap now used for table creation SQL. ( #505 )
Full list of commits between 0.28 and 0.29.",21,
172,0.28 (2019-05-19),A salmagundi of new features!,21,
173,Supporting databases that change,"From the beginning of the project, Datasette has been designed with read-only databases in mind. If a database is guaranteed not to change it opens up all kinds of interesting opportunities - from taking advantage of SQLite immutable mode and HTTP caching to bundling static copies of the database directly in a Docker container. The interesting ideas in Datasette explores this idea in detail.
As my goals for the project have developed, I realized that read-only databases are no longer the right default. SQLite actually supports concurrent access very well provided only one thread attempts to write to a database at a time, and I keep encountering sensible use-cases for running Datasette on top of a database that is processing inserts and updates.
So, as-of version 0.28 Datasette no longer assumes that a database file will not change. It is now safe to point Datasette at a SQLite database which is being updated by another process.
Making this change was a lot of work - see tracking tickets #418 , #419 and #420 . It required new thinking around how Datasette should calculate table counts (an expensive operation against a large, changing database) and also meant reconsidering the ""content hash"" URLs Datasette has used in the past to optimize the performance of HTTP caches.
Datasette can still run against immutable files and gains numerous performance benefits from doing so, but this is no longer the default behaviour. Take a look at the new Performance and caching documentation section for details on how to make the most of Datasette against data that you know will be staying read-only and immutable.",21,
174,"Faceting improvements, and faceting plugins","Datasette Facets provide an intuitive way to quickly summarize and interact with data. Previously the only supported faceting technique was column faceting, but 0.28 introduces two powerful new capabilities: facet-by-JSON-array and the ability to define further facet types using plugins.
Facet by array ( #359 ) is only available if your SQLite installation provides the json1 extension. Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns - useful for modelling things like tags without needing to break them out into a new table. See Facet by JSON array for more.
The new register_facet_classes() plugin hook ( #445 ) can be used to register additional custom facet classes. Each facet class should provide two methods: suggest() which suggests facet selections that might be appropriate for a provided SQL query, and facet_results() which executes a facet operation and returns results. Datasette's own faceting implementations have been refactored to use the same API as these plugins.",21,
175,datasette publish cloudrun,"Google Cloud Run is a brand new serverless hosting platform from Google, which allows you to build a Docker container which will run only when HTTP traffic is received and will shut down (and hence cost you nothing) the rest of the time. It's similar to Zeit's Now v1 Docker hosting platform which sadly is no longer accepting signups from new users.
The new datasette publish cloudrun command was contributed by Romain Primet ( #434 ) and publishes selected databases to a new Datasette instance running on Google Cloud Run.
See Publishing to Google Cloud Run for full documentation.",21,
176,register_output_renderer plugins,"Russ Garrett implemented a new Datasette plugin hook called register_output_renderer ( #441 ) which allows plugins to create additional output renderers in addition to Datasette's default .json and .csv .
Russ's in-development datasette-geo plugin includes an example of this hook being used to output .geojson automatically converted from SpatiaLite.",21,
177,Medium changes,"Datasette now conforms to the Black coding style ( #449 ) - and has a unit test to enforce this in the future
New Special table arguments :
?columnname__in=value1,value2,value3 filter for executing SQL IN queries against a table, see Table arguments ( #433 )
?columnname__date=yyyy-mm-dd filter which returns rows where the specified datetime column falls on the specified date ( 583b22a )
?tags__arraycontains=tag filter which acts against a JSON array contained in a column ( 78e45ea )
?_where=sql-fragment filter for the table view ( #429 )
?_fts_table=mytable and ?_fts_pk=mycolumn query string options can be used to specify which FTS table to use for a search query - see Configuring full-text search for a table or view ( #428 )
You can now pass the same table filter multiple times - for example, ?content__not=world&content__not=hello will return all rows where the content column is neither hello nor world ( #288 )
You can now specify about and about_url metadata (in addition to source and license ) linking to further information about a project - see Source, license and about
New ?_trace=1 parameter now adds debug information showing every SQL query that was executed while constructing the page ( #435 )
datasette inspect now just calculates table counts, and does not introspect other database metadata ( #462 )
Removed /-/inspect page entirely - this will be replaced by something similar in the future, see #465
Datasette can now run against an in-memory SQLite database. You can do this by starting it without passing any files or by using the new --memory option to datasette serve . This can be useful for experimenting with SQLite queries that do not access any data, such as SELECT 1+1 or SELECT sqlite_version() .",21,
178,Small changes,"We now show the size of the database file next to the download link ( #172 )
New /-/databases introspection page shows currently connected databases ( #470 )
Binary data is no longer displayed on the table and row pages ( #442 - thanks, Russ Garrett)
New show/hide SQL links on custom query pages ( #415 )
The extra_body_script plugin hook now accepts an optional view_name argument ( #443 - thanks, Russ Garrett)
Bumped Jinja2 dependency to 2.10.1 ( #426 )
All table filters are now documented, and documentation is enforced via unit tests ( 2c19a27 )
New project guideline: master should stay shippable at all times! ( 31f36e1 )
Fixed a bug where sqlite_timelimit() occasionally failed to clean up after itself ( bac4e01 )
We no longer load additional plugins when executing pytest ( #438 )
Homepage now links to database views if there are fewer than five tables in a database ( #373 )
The --cors option is now respected by error pages ( #453 )
datasette publish heroku now uses the --include-vcs-ignore option, which means it works under Travis CI ( #407 )
datasette publish heroku now publishes using Python 3.6.8 ( 666c374 )
Renamed datasette publish now to datasette publish nowv1 ( #472 )
datasette publish nowv1 now accepts multiple --alias parameters ( 09ef305 )
Removed the datasette skeleton command ( #476 )
The documentation on how to build the documentation now recommends sphinx-autobuild",21,
179,0.27.1 (2019-05-09),"Tiny bugfix release: don't install tests/ in the wrong place. Thanks, Veit Heller.",21,
180,0.27 (2019-01-31),"New command: datasette plugins ( documentation ) shows you the currently installed list of plugins.
Datasette can now output newline-delimited JSON using the new ?_shape=array&_nl=on query string option.
Added documentation on The Datasette Ecosystem .
Now using Python 3.7.2 as the base for the official Datasette Docker image .",21,
181,0.26.1 (2019-01-10),"/-/versions now includes SQLite compile_options ( #396 )
datasetteproject/datasette Docker image now uses SQLite 3.26.0 ( #397 )
Cleaned up some deprecation warnings under Python 3.7",21,
182,0.26 (2019-01-02),"datasette serve --reload now restarts Datasette if a database file changes on disk.
datasette publish now now takes an optional --alias mysite.now.sh argument. This will attempt to set an alias after the deploy completes.
Fixed a bug where the advanced CSV export form failed to include the currently selected filters ( #393 )",21,
183,0.25.2 (2018-12-16),"datasette publish heroku now uses the python-3.6.7 runtime
Added documentation on how to build the documentation
Added documentation covering our release process
Upgraded to pytest 4.0.2",21,
184,0.25.1 (2018-11-04),"Documentation improvements plus a fix for publishing to Zeit Now.
datasette publish now now uses Zeit's v1 platform, to work around the new 100MB image limit. Thanks, @slygent - closes #366 .",21,
185,0.25 (2018-09-19),"New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite.
New publish_subcommand plugin hook. A plugin can now add additional datasette publish publishers in addition to the default now and heroku , both of which have been refactored into default plugins. publish_subcommand documentation . Closes #349
New render_cell plugin hook. Plugins can now customize how values are displayed in the HTML tables produced by Datasette's browsable interface. datasette-json-html and datasette-render-images are two new plugins that use this hook. render_cell documentation . Closes #352
New extra_body_script plugin hook, enabling plugins to provide additional JavaScript that should be added to the page footer. extra_body_script documentation .
extra_css_urls and extra_js_urls hooks now take additional optional parameters, allowing them to be more selective about which pages they apply to. Documentation .
You can now use the sortable_columns metadata setting to explicitly enable sort-by-column in the interface for database views, as well as for specific tables.
The new fts_table and fts_pk metadata settings can now be used to explicitly configure full-text search for a table or a view , even if that table is not directly coupled to the SQLite FTS feature in the database schema itself.
Datasette will now use pysqlite3 in place of the standard library sqlite3 module if it has been installed in the current environment. This makes it much easier to run Datasette against a more recent version of SQLite, including the just-released SQLite 3.25.0 which adds window function support. More details on how to use this in #360
New mechanism that allows plugin configuration options to be set using metadata.json .",21,
186,0.24 (2018-07-23),"A number of small new features:
datasette publish heroku now supports --extra-options , fixes #334
Custom error message if SpatiaLite is needed for specified database, closes #331
New config option: truncate_cells_html for truncating long cell values in HTML view - closes #330
Documentation for datasette publish and datasette package , closes #337
Fixed compatibility with Python 3.7
datasette publish heroku now supports app names via the -n option, which can also be used to overwrite an existing application [Russ Garrett]
Title and description metadata can now be set for canned SQL queries , closes #342
New force_https_on config option, fixes https:// API URLs when deploying to Zeit Now - closes #333
?_json_infinity=1 query string argument for handling Infinity/-Infinity values in JSON, closes #332
URLs displayed in the results of custom SQL queries are now URLified, closes #298",21,
187,0.23.2 (2018-07-07),"Minor bugfix and documentation release.
CSV export now respects --cors , fixes #326
Installation instructions , including docker image - closes #328
Fix for row pages for tables with / in, closes #325",21,
188,0.23.1 (2018-06-21),"Minor bugfix release.
Correctly display empty strings in HTML table, closes #314
Allow ""."" in database filenames, closes #302
404s ending in slash redirect to remove that slash, closes #309
Fixed incorrect display of compound primary keys with foreign key
references. Closes #319
Docs + example of canned SQL query using || concatenation. Closes #321
Correctly display facets with value of 0 - closes #318
Default 'expand labels' to checked in CSV advanced export",21,
189,0.23 (2018-06-18),"This release features CSV export, improved options for foreign key expansions,
new configuration settings and improved support for SpatiaLite.
See datasette/compare/0.22.1...0.23 for a full list of
commits added since the last release.",21,
190,CSV export,"Any Datasette table, view or custom SQL query can now be exported as CSV.
Check out the CSV export documentation for more details, or
try the feature out on
https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies
If your table has more than max_returned_rows (default 1,000)
Datasette provides the option to stream all rows . This option takes advantage
of async Python and Datasette's efficient pagination to
iterate through the entire matching result set and stream it back as a
downloadable CSV file.",21,
191,Foreign key expansions,"When Datasette detects a foreign key reference it attempts to resolve a label
for that reference (automatically or using the Specifying the label column for a table metadata
option) so it can display a link to the associated row.
This expansion is now also available for JSON and CSV representations of the
table, using the new _labels=on query string option. See
Expanding foreign key references for more details.",21,
192,New configuration settings,"Datasette's Settings now also supports boolean settings. A number of new
configuration options have been added:
num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3.
allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on.
suggest_facets - should Datasette suggest facets? Defaults to on.
allow_download - should users be allowed to download the entire SQLite database? Defaults to on.
allow_sql - should users be allowed to execute custom SQL queries? Defaults to on.
default_cache_ttl - Default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0.
cache_size_kb - Set the amount of memory SQLite uses for its per-connection cache , in KB.
allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on.
max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.",21,
193,Control HTTP caching with ?_ttl=,"You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter.
You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely.
Consider for example this query which returns a randomly selected member of the Avengers:
select * from [avengers/avengers] order by random() limit 1
If you hit the following page repeatedly you will get the same result, due to HTTP caching:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1
By adding ?_ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time:
/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0",21,
194,Improved support for SpatiaLite,"The SpatiaLite module
for SQLite adds robust geospatial features to the database.
Getting SpatiaLite working can be tricky, especially if you want to use the most
recent alpha version (with support for K-nearest neighbor).
Datasette now includes extensive documentation on SpatiaLite , and thanks to Ravi Kotecha our GitHub
repo includes a Dockerfile that can build
the latest SpatiaLite and configure it for use with Datasette.
The datasette publish and datasette package commands now accept a new
--spatialite argument which causes them to install and configure SpatiaLite
as part of the container they deploy.",21,
195,latest.datasette.io,"Every commit to Datasette master is now automatically deployed by Travis CI to
https://latest.datasette.io/ - ensuring there is always a live demo of the
latest version of the software.
The demo uses the fixtures from our
unit tests, ensuring it demonstrates the same range of functionality that is
covered by the tests.
You can see how the deployment mechanism works in our .travis.yml file.",21,
196,Miscellaneous,"Got JSON data in one of your columns? Use the new ?_json=COLNAME argument
to tell Datasette to return that JSON value directly rather than encoding it
as a string.
If you just want an array of the first value of each row, use the new
?_shape=arrayfirst option - example .",21,
197,0.22.1 (2018-05-23),"Bugfix release, plus we now use versioneer for our version numbers.
Faceting no longer breaks pagination, fixes #282
Add __version_info__ derived from __version__ [Robert Gieseke]
This might be a tuple of more than two values (major and minor
version) if commits have been made after a release.
Add version number support with Versioneer. [Robert Gieseke]
Versioneer Licence:
Public Domain (CC0-1.0)
Closes #273
Refactor inspect logic [Russ Garrett]",21,
198,0.22 (2018-05-20),"The big new feature in this release is Facets . Datasette can now apply faceted browse to any column in any table. It will also suggest possible facets. See the Datasette Facets announcement post for more details.
In addition to the work on facets:
Added docs for introspection endpoints
New --config option, added --help-config , closes #274
Removed the --page_size= argument to datasette serve in favour of:
datasette serve --config default_page_size:50 mydb.db
Added new help section:
datasette --help-config
Config options:
default_page_size Default page size for the table view
(default=100)
max_returned_rows Maximum rows that can be returned from a table
or custom query (default=1000)
sql_time_limit_ms Time limit for a SQL query in milliseconds
(default=1000)
default_facet_size Number of values to return for requested facets
(default=30)
facet_time_limit_ms Time limit for calculating a requested facet
(default=200)
facet_suggest_time_limit_ms Time limit for calculating a suggested facet
(default=50)
Only apply responsive table styles to .rows-and-columns
Otherwise they interfere with tables in the description, e.g. on
https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo
Refactored views into new views/ modules, refs #256
Documentation for SQLite full-text search support, closes #253
/-/versions now includes SQLite fts_versions , closes #252",21,
199,0.21 (2018-05-05),"New JSON _shape= options, the ability to set table _size= and a mechanism for searching within specific columns.
Default tests to using a longer timelimit
Every now and then a test will fail in Travis CI on Python 3.5 because it hit
the default 20ms SQL time limit.
Test fixtures now default to a 200ms time limit, and we only use the 20ms time
limit for the specific test that tests query interruption. This should make
our tests on Python 3.5 in Travis much more stable.
Support _search_COLUMN=text searches, closes #237
Show version on /-/plugins page, closes #248
?_size=max option, closes #249
Added /-/versions and /-/versions.json , closes #244
Sample output:
{
""python"": {
""version"": ""3.6.3"",
""full"": ""3.6.3 (default, Oct 4 2017, 06:09:38) \n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]""
},
""datasette"": {
""version"": ""0.20""
},
""sqlite"": {
""version"": ""3.23.1"",
""extensions"": {
""json1"": null,
""spatialite"": ""4.3.0a""
}
}
}
Renamed ?_sql_time_limit_ms= to ?_timelimit , closes #242
New ?_shape=array option + tweaks to _shape , closes #245
Default is now ?_shape=arrays (renamed from lists )
New ?_shape=array returns an array of objects as the root object
Changed ?_shape=object to return the object as the root
Updated docs
FTS tables now detected by inspect() , closes #240
New ?_size=XXX query string parameter for table view, closes #229
Also added documentation for all of the _special arguments.
Plus deleted some duplicate logic implementing _group_count .
If max_returned_rows==page_size , increment max_returned_rows - fixes #230
New hidden: True option for table metadata, closes #239
Hide idx_* tables if spatialite detected, closes #228
Added class=rows-and-columns to custom query results table
Added CSS class rows-and-columns to main table
label_column option in metadata.json - closes #234",21,
200,0.20 (2018-04-20),"Mostly new work on the Plugins mechanism: plugins can now bundle static assets and custom templates, and datasette publish has a new --install=name-of-plugin option.
Add col-X classes to HTML table on custom query page
Fixed out-dated template in documentation
Plugins can now bundle custom templates, #224
Added /-/metadata /-/plugins /-/inspect, #225
Documentation for --install option, refs #223
Datasette publish/package --install option, #223
Fix for plugins in Python 3.5, #222
New plugin hooks: extra_css_urls() and extra_js_urls(), #214
/-/static-plugins/PLUGIN_NAME/ now serves static/ from plugins
<td> now gets class=""col-X"" - plus added col-X documentation
Use to_css_class for table cell column classes
This ensures that columns with spaces in the name will still
generate usable CSS class names. Refs #209
Add column name classes to <td>s, make PK bold [Russ Garrett]
Don't duplicate simple primary keys in the link column [Russ Garrett]
When there's a simple (single-column) primary key, it looks weird to
duplicate it in the link column.
This change removes the second PK column and treats the link column as
if it were the PK column from a header/sorting perspective.
Correct escaping for HTML display of row links [Russ Garrett]
Longer time limit for test_paginate_compound_keys
It was failing intermittently in Travis - see #209
Use application/octet-stream for downloadable databases
Updated PyPI classifiers
Updated PyPI link to pypi.org",21,
201,0.19 (2018-04-16),"This is the first preview of the new Datasette plugins mechanism. Only two
plugin hooks are available so far - for custom SQL functions and custom template
filters. There's plenty more to come - read the documentation and get involved in
the tracking ticket if you
have feedback on the direction so far.
Fix for _sort_desc=sortable_with_nulls test, refs #216
Fixed #216 - paginate correctly when sorting by nullable column
Initial documentation for plugins, closes #213
https://docs.datasette.io/en/stable/plugins.html
New --plugins-dir=plugins/ option ( #212 )
New option causing Datasette to load and evaluate all of the Python files in
the specified directory and register any plugins that are defined in those
files.
This new option is available for the following commands:
datasette serve mydb.db --plugins-dir=plugins/
datasette publish now/heroku mydb.db --plugins-dir=plugins/
datasette package mydb.db --plugins-dir=plugins/
Start of the plugin system, based on pluggy ( #210 )
Uses https://pluggy.readthedocs.io/ originally created for the py.test project
We're starting with two plugin hooks:
prepare_connection(conn)
This is called when a new SQLite connection is created. It can be used to register custom SQL functions.
prepare_jinja2_environment(env)
This is called with the Jinja2 environment. It can be used to register custom template tags and filters.
An example plugin which uses these two hooks can be found at https://github.com/simonw/datasette-plugin-demos or installed using pip install datasette-plugin-demos ; a minimal sketch of prepare_connection follows this entry.
Refs #14
Return HTTP 405 on InvalidUsage rather than 500. [Russ Garrett]
This also stops it filling up the logs. This happens for HEAD requests
at the moment - which perhaps should be handled better, but that's a
different issue.",21,
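A minimal sketch of the prepare_connection hook described earlier in this entry - it is called for each new SQLite connection and can register custom SQL functions (the function name here is just an example):

from datasette import hookimpl

@hookimpl
def prepare_connection(conn):
    # conn is a sqlite3 connection - register a zero-argument SQL function
    conn.create_function("hello_world", 0, lambda: "Hello world!")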
202,0.18 (2018-04-14),"This release introduces support for units ,
contributed by Russ Garrett ( #203 ).
You can now optionally specify the units for specific columns using metadata.json .
Once specified, units will be displayed in the HTML view of your table. They also become
available for use in filters - if a column is configured with a unit of distance, you can
request all rows where that column is less than 50 meters or more than 20 feet for example.
Link foreign keys which don't have labels. [Russ Garrett]
This renders unlabeled FKs as simple links.
Also includes bonus fixes for two minor issues:
In foreign key link hrefs the primary key was escaped using HTML
escaping rather than URL escaping. This broke some non-integer PKs.
Print tracebacks to console when handling 500 errors.
Fix SQLite error when loading rows with no incoming FKs. [Russ
Garrett]
This fixes an error caused by an invalid query when loading incoming FKs.
The error was ignored due to async but it still got printed to the
console.
Allow custom units to be registered with Pint. [Russ Garrett]
Support units in filters. [Russ Garrett]
Tidy up units support. [Russ Garrett]
Add units to exported JSON
Units key in metadata skeleton
Docs
Initial units support. [Russ Garrett]
Add support for specifying units for a column in metadata.json and
rendering them on display using
pint",21,
203,0.17 (2018-04-13),Release 0.17 to fix issues with PyPI,21,
204,0.16 (2018-04-13),"Better mechanism for handling errors; 404s for missing table/database
New error mechanism closes #193
404s for missing tables/databases closes #184
long_description in markdown for the new PyPI
Hide SpatiaLite system tables. [Russ Garrett]
Allow explain select / explain query plan select #201
Datasette inspect now finds primary_keys #195
Ability to sort using form fields (for mobile portrait mode) #199
We now display sort options as a select box plus a descending checkbox, which
means you can apply sort orders even in portrait mode on a mobile phone where
the column headers are hidden.",21,
205,0.15 (2018-04-09),"The biggest new feature in this release is the ability to sort by column. On the
table page the column headers can now be clicked to apply sort (or descending
sort), or you can specify ?_sort=column or ?_sort_desc=column directly
in the URL.
table_rows => table_rows_count , filtered_table_rows =>
filtered_table_rows_count
Renamed properties. Closes #194
New sortable_columns option in metadata.json to control sort options.
You can now explicitly set which columns in a table can be used for sorting
using the _sort and _sort_desc arguments using metadata.json :
{
""databases"": {
""database1"": {
""tables"": {
""example_table"": {
""sortable_columns"": [
""height"",
""weight""
]
}
}
}
}
}
Refs #189
Column headers now link to sort/desc sort - refs #189
_sort and _sort_desc parameters for table views
Allows for paginated sorted results based on a specified column.
Refs #189
Total row count now correct even if _next applied
Use .custom_sql() for _group_count implementation (refs #150 )
Make HTML title more readable in query template ( #180 ) [Ryan Pitts]
New ?_shape=objects/object/lists param for JSON API ( #192 )
New _shape= parameter replacing old .jsono extension
Now instead of this:
/database/table.jsono
We use the _shape parameter like this:
/database/table.json?_shape=objects
Also introduced a new _shape called object which looks like this:
/database/table.json?_shape=object
Returning an object for the rows key:
...
""rows"": {
""pk1"": {
...
},
""pk2"": {
...
}
}
Refs #122
Utility for writing test database fixtures to a .db file
python tests/fixtures.py /tmp/hello.db
This is useful for making a SQLite database of the test fixtures for
interactive exploration.
Compound primary key _next= now plays well with extra filters
Closes #190
Fixed bug with keyset pagination over compound primary keys
Refs #190
Database/Table views inherit source/license/source_url/license_url
metadata
If you set the source_url/license_url/source/license fields in your root
metadata those values will now be inherited all the way down to the database
and table templates.
The title/description are NOT inherited.
Also added unit tests for the HTML generated by the metadata.
Refs #185
Add metadata, if it exists, to heroku temp dir ( #178 ) [Tony Hirst]
Initial documentation for pagination
Broke up test_app into test_api and test_html
Fixed bug with .json path regular expression
I had a table called geojson and it caused an exception because the regex
was matching .json and not \.json
Deploy to Heroku with Python 3.6.3",21,
206,0.14 (2017-12-09),"The theme of this release is customization: Datasette now allows every aspect
of its presentation to be customized
either using additional CSS or by providing entirely new templates.
Datasette's metadata.json format
has also been expanded, to allow per-database and per-table metadata. A new
datasette skeleton command can be used to generate a skeleton JSON file
ready to be filled in with per-database and per-table details.
The metadata.json file can also be used to define
canned queries ,
as a more powerful alternative to SQL views.
extra_css_urls / extra_js_urls in metadata
A mechanism in the metadata.json format for adding custom CSS and JS urls.
Create a metadata.json file that looks like this:
{
""extra_css_urls"": [
""https://simonwillison.net/static/css/all.bf8cd891642c.css""
],
""extra_js_urls"": [
""https://code.jquery.com/jquery-3.2.1.slim.min.js""
]
}
Then start datasette like this:
datasette mydb.db --metadata=metadata.json
The CSS and JavaScript files will be linked in the <head> of every page.
You can also specify a SRI (subresource integrity hash) for these assets:
{
""extra_css_urls"": [
{
""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"",
""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI""
}
],
""extra_js_urls"": [
{
""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"",
""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=""
}
]
}
Modern browsers will only execute the stylesheet or JavaScript if the SRI hash
matches the content served. You can generate hashes using https://www.srihash.org/
Auto-link column values that look like URLs ( #153 )
CSS styling hooks as classes on the body ( #153 )
Every template now gets CSS classes in the body designed to support custom
styling.
The index template (the top level page at / ) gets this:
<body class=""index"">
The database template ( /dbname/ ) gets this:
<body class=""db db-dbname"">
The table template ( /dbname/tablename ) gets:
<body class=""table db-dbname table-tablename"">
The row template ( /dbname/tablename/rowid ) gets:
<body class=""row db-dbname table-tablename"">
The db-x and table-x classes use the database or table names themselves IF
they are valid CSS identifiers. If they aren't, we strip any invalid
characters out and append a 6 character md5 digest of the original name, in
order to ensure that multiple tables which resolve to the same stripped
character version still have different CSS classes.
Some examples (extracted from the unit tests):
""simple"" => ""simple""
""MixedCase"" => ""MixedCase""
""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6""
""_no-leading-underscores"" => ""no-leading-underscores-b921bc""
""no spaces"" => ""no-spaces-7088d7""
""-"" => ""336d5e""
""no $ characters"" => ""no--characters-59e024""
datasette --template-dir=mytemplates/ argument
You can now pass an additional argument specifying a directory to look for
custom templates in.
Datasette will fall back on the default templates if a template is not
found in that directory.
Ability to over-ride templates for individual tables/databases.
It is now possible to over-ride templates on a per-database / per-row or per-
table basis.
When you access e.g. /mydatabase/mytable Datasette will look for the following:
- table-mydatabase-mytable.html
- table.html
If you provided a --template-dir argument to datasette serve it will look in
that directory first.
The lookup rules are as follows:
Index page (/):
index.html
Database page (/mydatabase):
database-mydatabase.html
database.html
Table page (/mydatabase/mytable):
table-mydatabase-mytable.html
table.html
Row page (/mydatabase/mytable/id):
row-mydatabase-mytable.html
row.html
If a table name has spaces or other unexpected characters in it, the template
filename will follow the same rules as our custom CSS classes
- for example, a table called ""Food Trucks""
will attempt to load the following templates:
table-mydatabase-Food-Trucks-399138.html
table.html
It is possible to extend the default templates using Jinja template
inheritance. If you want to customize EVERY row template with some additional
content you can do so by creating a row.html template like this:
{% extends ""default:row.html"" %}
{% block content %}
EXTRA HTML AT THE TOP OF THE CONTENT BLOCK
This line renders the original block:
{{ super() }}
{% endblock %}
--static option for datasette serve ( #160 )
You can now tell Datasette to serve static files from a specific location at a
specific mountpoint.
For example:
datasette serve mydb.db --static extra-css:/tmp/static/css
Now if you visit this URL:
http://localhost:8001/extra-css/blah.css
The following file will be served:
/tmp/static/css/blah.css
Canned query support.
Named canned queries can now be defined in metadata.json like this:
{
""databases"": {
""timezones"": {
""queries"": {
""timezone_for_point"": ""select tzid from timezones ...""
}
}
}
}
These will be shown in a new ""Queries"" section beneath ""Views"" on the database page.
New datasette skeleton command for generating metadata.json ( #164 )
metadata.json support for per-table/per-database metadata ( #165 )
Also added support for descriptions and HTML descriptions.
Here's an example metadata.json file illustrating custom per-database and per-
table metadata:
{
""title"": ""Overall datasette title"",
""description_html"": ""This is a description with HTML."",
""databases"": {
""db1"": {
""title"": ""First database"",
""description"": ""This is a string description & has no HTML"",
""license_url"": ""http://example.com/"",
""license"": ""The example license"",
""queries"": {
""canned_query"": ""select * from table1 limit 3;""
},
""tables"": {
""table1"": {
""title"": ""Custom title for table1"",
""description"": ""Tables can have descriptions too"",
""source"": ""This has a custom source"",
""source_url"": ""http://example.com/""
}
}
}
}
}
Renamed datasette build command to datasette inspect ( #130 )
Upgrade to Sanic 0.7.0 ( #168 )
https://github.com/channelcat/sanic/releases/tag/0.7.0
Package and publish commands now accept --static and --template-dir
Example usage:
datasette package --static css:extra-css/ --static js:extra-js/ \
sf-trees.db --template-dir templates/ --tag sf-trees --branch master
This creates a local Docker image that includes copies of the templates/,
extra-css/ and extra-js/ directories. You can then run it like this:
docker run -p 8001:8001 sf-trees
For publishing to Zeit now:
datasette publish now --static css:extra-css/ --static js:extra-js/ \
sf-trees.db --template-dir templates/ --name sf-trees --branch master
HTML comment showing which templates were considered for a page ( #171 )",21,
207,0.13 (2017-11-24),"Search now applies to current filters.
Combined search into the same form as filters.
Closes #133
Much tidier design for table view header.
Closes #147
Added ?column__not=blah filter.
Closes #148
Row page now resolves foreign keys.
Closes #132
Further tweaks to select/input filter styling.
Refs #86 - thanks for the help, @natbat!
Show linked foreign key in table cells.
Added UI for editing table filters.
Refs #86
Hide FTS-created tables on index pages.
Closes #129
Add publish to heroku support [Jacob Kaplan-Moss]
datasette publish heroku mydb.db
Pull request #104
Initial implementation of ?_group_count=column .
URL shortcut for counting rows grouped by one or more columns.
?_group_count=column1&_group_count=column2 works as well.
SQL generated looks like this:
select ""qSpecies"", count(*) as ""count""
from Street_Tree_List
group by ""qSpecies""
order by ""count"" desc limit 100
Or for two columns like this:
select ""qSpecies"", ""qSiteInfo"", count(*) as ""count""
from Street_Tree_List
group by ""qSpecies"", ""qSiteInfo""
order by ""count"" desc limit 100
Refs #44
Added --build=master option to datasette publish and package.
The datasette publish and datasette package commands both now accept an
optional --build argument. If provided, this can be used to specify a branch
published to GitHub that should be built into the container.
This makes it easier to test code that has not yet been officially released to
PyPI, e.g.:
datasette publish now mydb.db --branch=master
Implemented ?_search=XXX + UI if a FTS table is detected.
Closes #131
Added datasette --version support.
Table views now show expanded foreign key references, if possible.
If a table has foreign key columns, and those foreign key tables have
label_columns , the TableView will now query those other tables for the
corresponding values and display those values as links in the corresponding
table cells.
label_columns are currently detected by the inspect() function, which looks
for any table that has just two columns - an ID column and one other - and
sets the label_column to be that second non-ID column.
Don't prevent tabbing to ""Run SQL"" button ( #117 ) [Robert Gieseke]
See comment in #115
Add keyboard shortcut to execute SQL query ( #115 ) [Robert Gieseke]
Allow --load-extension to be set via environment variable.
Add support for ?field__isnull=1 ( #107 ) [Ray N]
Add spatialite, switch to debian and local build ( #114 ) [Ariel Núñez]
Added --load-extension argument to datasette serve.
Allows loading of SQLite extensions. Refs #110 .",21,
208,0.12 (2017-11-16),"Added __version__ , now displayed as tooltip in page footer ( #108 ).
Added initial docs, including a changelog ( #99 ).
Turned on auto-escaping in Jinja.
Added a UI for editing named parameters ( #96 ).
You can now construct a custom SQL statement using SQLite named
parameters (e.g. :name ) and datasette will display form fields for
editing those parameters. Here’s an example which lets you see the
most popular names for dogs of different species registered through
various dog registration schemes in Australia.
Pin to specific Jinja version. ( #100 ).
Default to 127.0.0.1 not 0.0.0.0. ( #98 ).
Added extra metadata options to publish and package commands. ( #92 ).
You can now run these commands like so:
datasette now publish mydb.db \
--title=""My Title"" \
--source=""Source"" \
--source_url=""http://www.example.com/"" \
--license=""CC0"" \
--license_url=""https://creativecommons.org/publicdomain/zero/1.0/""
This will write those values into the metadata.json that is packaged with the
app. If you also pass --metadata=metadata.json that file will be updated with the extra
values before being written into the Docker image.
Added production-ready Dockerfile ( #94 ) [Andrew
Cutler]
New ?_sql_time_limit_ms=10 argument to database and table page ( #95 )
SQL syntax highlighting with Codemirror ( #89 ) [Tom Dyson]",21,
209,0.11 (2017-11-14),"Added datasette publish now --force option.
This calls now with --force - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer.
Enable --cors by default when running in a container.",21,
210,0.10 (2017-11-14),"Fixed #83 - 500 error on individual row pages.
Stop using sqlite WITH RECURSIVE in our tests.
The version of Python 3 running in Travis CI doesn't support this.",21,
211,0.9 (2017-11-13),"Added --sql_time_limit_ms and --extra-options .
The serve command now accepts --sql_time_limit_ms for customizing the SQL time
limit.
The publish and package commands now accept --extra-options which can be used
to specify additional options to be passed to the datasette serve command when
it executes inside the resulting Docker containers.",21,
212,0.8 (2017-11-13),"V0.8 - added PyPI metadata, ready to ship.
Implemented offset/limit pagination for views ( #70 ).
Improved pagination. ( #78 )
Limit on max rows returned, controlled by --max_returned_rows option. ( #69 )
If someone executes 'select * from table' against a table with a million rows
in it, we could run into problems: just serializing that much data as JSON is
likely to lock up the server.
Solution: we now have a hard limit on the maximum number of rows that can be
returned by a query. If that limit is exceeded, the server will return a
""truncated"": true field in the JSON.
This limit can be optionally controlled by the new --max_returned_rows
option. Setting that option to 0 disables the limit entirely.",21,
213,CLI reference,"The datasette CLI tool provides a number of commands.
Running datasette without specifying a command runs the default command, datasette serve . See datasette serve for the full list of options for that command.
[[[cog
from datasette import cli
from click.testing import CliRunner
import textwrap
def help(args):
title = ""datasette "" + "" "".join(args)
cog.out(""\n::\n\n"")
result = CliRunner().invoke(cli.cli, args)
output = result.output.replace(""Usage: cli "", ""Usage: datasette "")
cog.out(textwrap.indent(output, ' '))
cog.out(""\n\n"")
]]]
[[[end]]]",21,
214,datasette --help,"Running datasette --help shows a list of all of the available commands.
[[[cog
help([""--help""])
]]]
Usage: datasette [OPTIONS] COMMAND [ARGS]...
Datasette is an open source multi-tool for exploring and publishing data
About Datasette: https://datasette.io/
Full documentation: https://docs.datasette.io/
Options:
--version Show the version and exit.
--help Show this message and exit.
Commands:
serve* Serve up specified SQLite database files with a web UI
create-token Create a signed API token for the specified actor ID
inspect Generate JSON summary of provided database files
install Install plugins and packages from PyPI into the same...
package Package SQLite files into a Datasette Docker container
plugins List currently installed plugins
publish Publish specified SQLite database files to the internet...
uninstall Uninstall plugins and Python packages from the Datasette...
[[[end]]]
Additional commands added by plugins that use the register_commands(cli) hook will be listed here as well.",21,
215,datasette serve,"This command starts the Datasette web application running on your machine:
datasette serve mydatabase.db
Or since this is the default command you can run this instead:
datasette mydatabase.db
Once started you can access it at http://localhost:8001
[[[cog
help([""serve"", ""--help""])
]]]
Usage: datasette serve [OPTIONS] [FILES]...
Serve up specified SQLite database files with a web UI
Options:
-i, --immutable PATH Database files to open in immutable mode
-h, --host TEXT Host for server. Defaults to 127.0.0.1 which
means only connections from the local machine
will be allowed. Use 0.0.0.0 to listen to all
IPs and allow access from other machines.
-p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to
automatically assign an available port.
[0<=x<=65535]
--uds TEXT Bind to a Unix domain socket
--reload Automatically reload if code or metadata
change detected - useful for development
--cors Enable CORS by serving Access-Control-Allow-
Origin: *
--load-extension PATH:ENTRYPOINT?
Path to a SQLite extension to load, and
optional entrypoint
--inspect-file TEXT Path to JSON file created using ""datasette
inspect""
-m, --metadata FILENAME Path to JSON/YAML file containing
license/source metadata
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--memory Make /_memory database available
-c, --config FILENAME Path to JSON/YAML Datasette configuration file
-s, --setting SETTING... nested.key, value setting to use in Datasette
configuration
--secret TEXT Secret used for signing secure values, such as
signed cookies
--root Output URL that sets a cookie authenticating
the root user
--get TEXT Run an HTTP GET request against this path,
print results and exit
--token TEXT API token to send with --get requests
--actor TEXT Actor to use for --get requests (JSON string)
--version-note TEXT Additional note to show on /-/versions
--help-settings Show available settings
--pdb Launch debugger on any errors
-o, --open Open Datasette in your web browser
--create Create database files if they do not exist
--crossdb Enable cross-database joins using the /_memory
database
--nolock Ignore locking, open locked files in read-only
mode
--ssl-keyfile TEXT SSL key file
--ssl-certfile TEXT SSL certificate file
--internal PATH Path to a persistent Datasette internal SQLite
database
--help Show this message and exit.
[[[end]]]",21,
216,Environment variables,"Some of the datasette serve options can be provided by environment variables:
DATASETTE_SECRET : Equivalent to the --secret option.
DATASETTE_SSL_KEYFILE : Equivalent to the --ssl-keyfile option.
DATASETTE_SSL_CERTFILE : Equivalent to the --ssl-certfile option.
DATASETTE_LOAD_EXTENSION : Equivalent to the --load-extension option.",21,
217,datasette --get,"The --get option to datasette serve (or just datasette ) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server.
This means that all of Datasette's functionality can be accessed directly from the command-line.
For example:
datasette --get '/-/versions.json' | jq .
{
""python"": {
""version"": ""3.8.5"",
""full"": ""3.8.5 (default, Jul 21 2020, 10:48:26) \n[Clang 11.0.3 (clang-1103.0.32.62)]""
},
""datasette"": {
""version"": ""0.46+15.g222a84a.dirty""
},
""asgi"": ""3.0"",
""uvicorn"": ""0.11.8"",
""sqlite"": {
""version"": ""3.32.3"",
""fts_versions"": [
""FTS5"",
""FTS4"",
""FTS3""
],
""extensions"": {
""json1"": null
},
""compile_options"": [
""COMPILER=clang-11.0.3"",
""ENABLE_COLUMN_METADATA"",
""ENABLE_FTS3"",
""ENABLE_FTS3_PARENTHESIS"",
""ENABLE_FTS4"",
""ENABLE_FTS5"",
""ENABLE_GEOPOLY"",
""ENABLE_JSON1"",
""ENABLE_PREUPDATE_HOOK"",
""ENABLE_RTREE"",
""ENABLE_SESSION"",
""MAX_VARIABLE_NUMBER=250000"",
""THREADSAFE=1""
]
}
}
You can use the --token TOKEN option to send an API token with the simulated request.
Or you can make a request as a specific actor by passing a JSON representation of that actor to --actor :
datasette --memory --actor '{""id"": ""root""}' --get '/-/actor.json'
The exit code of datasette --get will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error.
This lets you use datasette --get / to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.",21,
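A minimal sketch of using this in a test script, assuming a database file called mydatabase.db. Because of the exit code behaviour described above, a non-zero return code signals a failed request:

import subprocess
import sys

result = subprocess.run(
    ["datasette", "mydatabase.db", "--get", "/"],
    capture_output=True,
    text=True,
)
if result.returncode != 0:
    print("Datasette smoke test failed:", result.stderr, file=sys.stderr)
    sys.exit(1)
print("Datasette responded successfully to /")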
218,datasette serve --help-settings,"This command outputs all of the available Datasette settings .
These can be passed to datasette serve using datasette serve --setting name value .
[[[cog
help([""--help-settings""])
]]]
Settings:
default_page_size Default page size for the table view
(default=100)
max_returned_rows Maximum rows that can be returned from a table or
custom query (default=1000)
max_insert_rows Maximum rows that can be inserted at a time using
the bulk insert API (default=100)
num_sql_threads Number of threads in the thread pool for
executing SQLite queries (default=3)
sql_time_limit_ms Time limit for a SQL query in milliseconds
(default=1000)
default_facet_size Number of values to return for requested facets
(default=30)
facet_time_limit_ms Time limit for calculating a requested facet
(default=200)
facet_suggest_time_limit_ms Time limit for calculating a suggested facet
(default=50)
allow_facet Allow users to specify columns to facet using
?_facet= parameter (default=True)
allow_download Allow users to download the original SQLite
database files (default=True)
allow_signed_tokens Allow users to create and use signed API tokens
(default=True)
default_allow_sql Allow anyone to run arbitrary SQL queries
(default=True)
max_signed_tokens_ttl Maximum allowed expiry time for signed API tokens
(default=0)
suggest_facets Calculate and display suggested facets
(default=True)
default_cache_ttl Default HTTP cache TTL (used in Cache-Control:
max-age= header) (default=5)
cache_size_kb SQLite cache size in KB (0 == use SQLite default)
(default=0)
allow_csv_stream Allow .csv?_stream=1 to download all rows
(ignoring max_returned_rows) (default=True)
max_csv_mb Maximum size allowed for CSV export in MB - set 0
to disable this limit (default=100)
truncate_cells_html Truncate cells longer than this in HTML table
view - set 0 to disable (default=2048)
force_https_urls Force URLs in API output to always use https://
protocol (default=False)
template_debug Allow display of template debug information with
?_context=1 (default=False)
trace_debug Allow display of SQL trace debug information with
?_trace=1 (default=False)
base_url Datasette URLs should use this base path
(default=/)
[[[end]]]",21,
219,datasette plugins,"Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use.
[[[cog
help([""plugins"", ""--help""])
]]]
Usage: datasette plugins [OPTIONS]
List currently installed plugins
Options:
--all Include built-in default plugins
--requirements Output requirements.txt of installed plugins
--plugins-dir DIRECTORY Path to directory containing custom plugins
--help Show this message and exit.
[[[end]]]
Example output:
[
{
""name"": ""datasette-geojson"",
""static"": false,
""templates"": false,
""version"": ""0.3.1"",
""hooks"": [
""register_output_renderer""
]
},
{
""name"": ""datasette-geojson-map"",
""static"": true,
""templates"": false,
""version"": ""0.4.0"",
""hooks"": [
""extra_body_script"",
""extra_css_urls"",
""extra_js_urls""
]
},
{
""name"": ""datasette-leaflet"",
""static"": true,
""templates"": false,
""version"": ""0.2.2"",
""hooks"": [
""extra_body_script"",
""extra_template_vars""
]
}
]",21,
220,datasette install,"Install new Datasette plugins. This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette.
This command:
datasette install datasette-cluster-map
Would install the datasette-cluster-map plugin.
[[[cog
help([""install"", ""--help""])
]]]
Usage: datasette install [OPTIONS] [PACKAGES]...
Install plugins and packages from PyPI into the same environment as Datasette
Options:
-U, --upgrade Upgrade packages to latest version
-r, --requirement PATH Install from requirements file
-e, --editable TEXT Install a project in editable mode from this path
--help Show this message and exit.
[[[end]]]",21,
221,datasette uninstall,"Uninstall one or more plugins.
[[[cog
help([""uninstall"", ""--help""])
]]]
Usage: datasette uninstall [OPTIONS] PACKAGES...
Uninstall plugins and Python packages from the Datasette environment
Options:
-y, --yes Don't ask for confirmation
--help Show this message and exit.
[[[end]]]",21,
222,datasette publish,"Shows a list of available deployment targets for publishing data with Datasette.
Additional deployment targets can be added by plugins that use the publish_subcommand(publish) hook.
[[[cog
help([""publish"", ""--help""])
]]]
Usage: datasette publish [OPTIONS] COMMAND [ARGS]...
Publish specified SQLite database files to the internet along with a
Datasette-powered interface and API
Options:
--help Show this message and exit.
Commands:
cloudrun Publish databases to Datasette running on Cloud Run
heroku Publish databases to Datasette running on Heroku
[[[end]]]",21,
223,datasette publish cloudrun,"See Publishing to Google Cloud Run .
[[[cog
help([""publish"", ""cloudrun"", ""--help""])
]]]
Usage: datasette publish cloudrun [OPTIONS] [FILES]...
Publish databases to Datasette running on Cloud Run
Options:
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g.
main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--plugin-secret <TEXT TEXT TEXT>...
Secrets to pass to plugins, e.g. --plugin-
secret datasette-auth-github client_id xxx
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
-n, --name TEXT Application name to use when building
--service TEXT Cloud Run service to deploy (or over-write)
--spatialite Enable SpatialLite extension
--show-files Output the generated Dockerfile and
metadata.json
--memory TEXT Memory to allocate in Cloud Run, e.g. 1Gi
--cpu [1|2|4] Number of vCPUs to allocate in Cloud Run
--timeout INTEGER Build timeout in seconds
--apt-get-install TEXT Additional packages to apt-get install
--max-instances INTEGER Maximum Cloud Run instances
--min-instances INTEGER Minimum Cloud Run instances
--help Show this message and exit.
[[[end]]]",21,
224,datasette publish heroku,"See Publishing to Heroku .
[[[cog
help([""publish"", ""heroku"", ""--help""])
]]]
Usage: datasette publish heroku [OPTIONS] [FILES]...
Publish databases to Datasette running on Heroku
Options:
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g.
main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at
/MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--plugin-secret <TEXT TEXT TEXT>...
Secrets to pass to plugins, e.g. --plugin-
secret datasette-auth-github client_id xxx
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
-n, --name TEXT Application name to use when deploying
--tar TEXT --tar option to pass to Heroku, e.g.
--tar=/usr/local/bin/gtar
--generate-dir DIRECTORY Output generated application files and stop
without deploying
--help Show this message and exit.
[[[end]]]",21,
225,datasette package,"Package SQLite files into a Datasette Docker container, see datasette package .
[[[cog
help([""package"", ""--help""])
]]]
Usage: datasette package [OPTIONS] FILES...
Package SQLite files into a Datasette Docker container
Options:
-t, --tag TEXT Name for the resulting Docker container, can
optionally use name:tag format
-m, --metadata FILENAME Path to JSON/YAML file containing metadata to
publish
--extra-options TEXT Extra options to pass to datasette serve
--branch TEXT Install datasette from a GitHub branch e.g. main
--template-dir DIRECTORY Path to directory containing custom templates
--plugins-dir DIRECTORY Path to directory containing custom plugins
--static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/...
--install TEXT Additional packages (e.g. plugins) to install
--spatialite Enable SpatialLite extension
--version-note TEXT Additional note to show on /-/versions
--secret TEXT Secret used for signing secure values, such as
signed cookies
-p, --port INTEGER RANGE Port to run the server on, defaults to 8001
[1<=x<=65535]
--title TEXT Title for metadata
--license TEXT License label for metadata
--license_url TEXT License URL for metadata
--source TEXT Source label for metadata
--source_url TEXT Source URL for metadata
--about TEXT About label for metadata
--about_url TEXT About URL for metadata
--help Show this message and exit.
[[[end]]]",21,
226,datasette inspect,"Outputs JSON representing introspected data about one or more SQLite database files.
If you are opening an immutable database, you can pass this file to the --inspect-file option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running:
datasette inspect mydatabase.db > inspect-data.json
datasette serve -i mydatabase.db --inspect-file inspect-data.json
This performance optimization is used automatically by some of the datasette publish commands. You are unlikely to need to apply this optimization manually.
[[[cog
help([""inspect"", ""--help""])
]]]
Usage: datasette inspect [OPTIONS] [FILES]...
Generate JSON summary of provided database files
This can then be passed to ""datasette --inspect-file"" to speed up count
operations against immutable database files.
Options:
--inspect-file TEXT
--load-extension PATH:ENTRYPOINT?
Path to a SQLite extension to load, and
optional entrypoint
--help Show this message and exit.
[[[end]]]",21,
227,datasette create-token,"Create a signed API token, see datasette create-token .
[[[cog
help([""create-token"", ""--help""])
]]]
Usage: datasette create-token [OPTIONS] ID
Create a signed API token for the specified actor ID
Example:
datasette create-token root --secret mysecret
To allow only ""view-database-download"" for all databases:
datasette create-token root --secret mysecret \
--all view-database-download
To allow ""create-table"" against a specific database:
datasette create-token root --secret mysecret \
--database mydb create-table
To allow ""insert-row"" against a specific table:
datasette create-token root --secret mysecret \
--resource mydb mytable insert-row
Restricted actions can be specified multiple times using multiple --all,
--database, and --resource options.
Add --debug to see a decoded version of the token.
Options:
--secret TEXT Secret used for signing the API tokens
[required]
-e, --expires-after INTEGER Token should expire after this many seconds
-a, --all ACTION Restrict token to this action
-d, --database DB ACTION Restrict token to this action on this database
-r, --resource DB RESOURCE ACTION
Restrict token to this action on this database
resource (a table, SQL view or named query)
--debug Show decoded token
--plugins-dir DIRECTORY Path to directory containing custom plugins
--help Show this message and exit.
[[[end]]]",21,
228,JavaScript plugins,"Datasette can run custom JavaScript in several different ways:
Datasette plugins written in Python can use the extra_js_urls() or extra_body_script() plugin hooks to inject JavaScript into a page
Datasette instances with custom templates can include additional JavaScript in those templates
The extra_js_urls key in datasette.yaml can be used to include extra JavaScript
There are no limitations on what this JavaScript can do. It is executed directly by the browser, so it can manipulate the DOM, fetch additional data and do anything else that JavaScript is capable of.
Custom JavaScript has security implications, especially for authenticated Datasette instances where the JavaScript might run in the context of the authenticated user. It's important to carefully review any JavaScript you run in your Datasette instance.",21,
229,The datasette_init event,"Datasette emits a custom event called datasette_init when the page is loaded. This event is dispatched on the document object, and includes a detail object with a reference to the datasetteManager object.
Your JavaScript code can listen out for this event using document.addEventListener() like this:
document.addEventListener(""datasette_init"", function (evt) {
const manager = evt.detail;
console.log(""Datasette version:"", manager.VERSION);
});",21,
230,datasetteManager,"The datasetteManager object
VERSION - string
The version of Datasette
plugins - Map()
A Map of currently loaded plugin names to plugin implementations
registerPlugin(name, implementation)
Call this to register a plugin, passing its name and implementation
selectors - object
An object providing named aliases to useful CSS selectors, listed below",21,
231,JavaScript plugin objects,"JavaScript plugins are blocks of code that can be registered with Datasette using the registerPlugin() method on the datasetteManager object.
The implementation object passed to this method should include a version key defining the plugin version, and one or more of the following named functions providing the implementation of the plugin:",21,
232,makeAboveTablePanelConfigs(),"This method should return a JavaScript array of objects defining additional panels to be added to the top of the table page. Each object should have the following:
id - string
A unique string ID for the panel, for example map-panel
label - string
A human-readable label for the panel
render(node) - function
A function that will be called with a DOM node to render the panel into
This example shows how a plugin might define a single panel:
document.addEventListener('datasette_init', function(ev) {
  ev.detail.registerPlugin('panel-plugin', {
    version: 0.1,
    makeAboveTablePanelConfigs: () => {
      return [
        {
          id: 'first-panel',
          label: 'First panel',
          render: node => {
            node.innerHTML = '<h2>My custom panel</h2><p>This is a custom panel that I added using a JavaScript plugin</p>';
          }
        }
      ]
    }
  });
});
When a page with a table loads, all registered plugins that implement makeAboveTablePanelConfigs() will be called and panels they return will be added to the top of the table page.",21,
233,makeColumnActions(columnDetails),"This method, if present, will be called when Datasette is rendering the cog action menu icons that appear at the top of the table view. By default these include options like ""Sort ascending/descending"" and ""Facet by this"", but plugins can return additional actions to be included in this menu.
The method will be called with a columnDetails object with the following keys:
columnName - string
The name of the column
columnNotNull - boolean
True if the column is defined as NOT NULL
columnType - string
The SQLite data type of the column
isPk - boolean
True if the column is part of the primary key
It should return a JavaScript array of objects each with a label and onClick property:
label - string
The human-readable label for the action
onClick(evt) - function
A function that will be called when the action is clicked
The evt object passed to the onClick is the standard browser event object that triggered the click.
This example plugin adds two menu items - one to copy the column name to the clipboard and another that displays the column metadata in an alert() window:
document.addEventListener('datasette_init', function(ev) {
ev.detail.registerPlugin('column-name-plugin', {
version: 0.1,
makeColumnActions: (columnDetails) => {
return [
{
label: 'Copy column to clipboard',
onClick: async (evt) => {
await navigator.clipboard.writeText(columnDetails.columnName)
}
},
{
label: 'Alert column metadata',
onClick: () => alert(JSON.stringify(columnDetails, null, 2))
}
];
}
});
});",21,
234,Selectors,"These are available on the selectors property of the datasetteManager object.
const DOM_SELECTORS = {
/** Should have one match */
jsonExportLink: "".export-links a[href*=json]"",
/** Event listeners that go outside of the main table, e.g. existing scroll listener */
tableWrapper: "".table-wrapper"",
table: ""table.rows-and-columns"",
aboveTablePanel: "".above-table-panel"",
// These could have multiple matches
/** Used for selecting table headers. Use makeColumnActions if you want to add menu items. */
tableHeaders: `table.rows-and-columns th`,
/** Used to add ""where"" clauses to query using direct manipulation */
filterRows: "".filter-row"",
/** Used to show top available enum values for a column (""facets"") */
facetResults: "".facet-results [data-column]"",
};",21,
235,Introspection,"Datasette includes some pages and JSON API endpoints for introspecting the current instance. These can be used to understand some of the internals of Datasette and to see how a particular instance has been configured.
Each of these pages can be viewed in your browser. Add .json to the URL to get back the contents as JSON.",21,
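You can also hit these endpoints without starting a server, using the datasette.client interface from Python. A minimal sketch against an in-memory instance, assuming Datasette is installed in the current environment:

import asyncio
from datasette.app import Datasette

async def main():
    ds = Datasette(memory=True)
    response = await ds.client.get("/-/versions.json")
    print(response.json()["datasette"]["version"])

asyncio.run(main())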
236,/-/metadata,"Shows the contents of the metadata.json file that was passed to datasette serve , if any. Metadata example :
{
""license"": ""CC Attribution 4.0 License"",
""license_url"": ""http://creativecommons.org/licenses/by/4.0/"",
""source"": ""fivethirtyeight/data on GitHub"",
""source_url"": ""https://github.com/fivethirtyeight/data"",
""title"": ""Five Thirty Eight"",
""databases"": {
}
}",21,
237,/-/versions,"Shows the version of Datasette, Python and SQLite. Versions example :
{
""datasette"": {
""version"": ""0.60""
},
""python"": {
""full"": ""3.8.12 (default, Dec 21 2021, 10:45:09) \n[GCC 10.2.1 20210110]"",
""version"": ""3.8.12""
},
""sqlite"": {
""extensions"": {
""json1"": null
},
""fts_versions"": [
""FTS5"",
""FTS4"",
""FTS3""
],
""compile_options"": [
""COMPILER=gcc-6.3.0 20170516"",
""ENABLE_FTS3"",
""ENABLE_FTS4"",
""ENABLE_FTS5"",
""ENABLE_JSON1"",
""ENABLE_RTREE"",
""THREADSAFE=1""
],
""version"": ""3.37.0""
}
}",21,
238,/-/plugins,"Shows a list of currently installed plugins and their versions. Plugins example :
[
{
""name"": ""datasette_cluster_map"",
""static"": true,
""templates"": false,
""version"": ""0.10"",
""hooks"": [""extra_css_urls"", ""extra_js_urls"", ""extra_body_script""]
}
]
Add ?all=1 to include details of the default plugins baked into Datasette.",21,
239,/-/settings,"Shows the Settings for this instance of Datasette. Settings example :
{
""default_facet_size"": 30,
""default_page_size"": 100,
""facet_suggest_time_limit_ms"": 50,
""facet_time_limit_ms"": 1000,
""max_returned_rows"": 1000,
""sql_time_limit_ms"": 1000
}",21,
240,/-/config,"Shows the configuration for this instance of Datasette. This is generally the contents of the datasette.yaml or datasette.json file, which can include plugin configuration as well. Config example :
{
""settings"": {
""template_debug"": true,
""trace_debug"": true,
""force_https_urls"": true
}
}
Any keys that include one of the following substrings in their names will be returned as redacted *** output, to help avoid accidentally leaking private configuration information: secret , key , password , token , hash , dsn .",21,
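A minimal sketch of that redaction rule (illustrative only, not Datasette's actual implementation):

REDACT_SUBSTRINGS = ("secret", "key", "password", "token", "hash", "dsn")

def redact_keys(obj):
    # Recursively replace values for sensitive-looking keys with "***"
    if isinstance(obj, dict):
        return {
            k: "***"
            if any(s in k.lower() for s in REDACT_SUBSTRINGS)
            else redact_keys(v)
            for k, v in obj.items()
        }
    if isinstance(obj, list):
        return [redact_keys(item) for item in obj]
    return obj

print(redact_keys({"plugins": {"datasette-auth": {"client_secret": "abc123"}}}))
# {'plugins': {'datasette-auth': {'client_secret': '***'}}}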
241,/-/databases,"Shows currently attached databases. Databases example :
[
{
""hash"": null,
""is_memory"": false,
""is_mutable"": true,
""name"": ""fixtures"",
""path"": ""fixtures.db"",
""size"": 225280
}
]",21,
242,/-/threads,"Shows details of threads and asyncio tasks. Threads example :
{
""num_threads"": 2,
""threads"": [
{
""daemon"": false,
""ident"": 4759197120,
""name"": ""MainThread""
},
{
""daemon"": true,
""ident"": 123145319682048,
""name"": ""Thread-1""
}
],
""num_tasks"": 3,
""tasks"": [
"" cb=[set.discard()]>"",
"" wait_for=()]> cb=[run_until_complete..()]>"",
"" wait_for=()]>>""
]
}",21,
243,/-/actor,"Shows the currently authenticated actor. Useful for debugging Datasette authentication plugins.
{
""actor"": {
""id"": 1,
""username"": ""some-user""
}
}",21,
244,/-/messages,"The debug tool at /-/messages can be used to set flash messages to try out that feature. See .add_message(request, message, type=datasette.INFO) for details of this feature.",21,
245,SpatiaLite,"The SpatiaLite module for SQLite adds features for handling geographic and spatial data. For an example of what you can do with it, see the tutorial Building a location to time zone API with SpatiaLite .
To use it with Datasette, you need to install the mod_spatialite dynamic library. This can then be loaded into Datasette using the --load-extension command-line option.
Datasette can look for SpatiaLite in common installation locations if you run it like this:
datasette --load-extension=spatialite --setting default_allow_sql off
If SpatiaLite is in another location, use the full path to the extension instead:
datasette --setting default_allow_sql off \
--load-extension=/usr/local/lib/mod_spatialite.dylib",21,
246,Warning,"The SpatiaLite extension adds a large number of additional SQL functions , some of which are not be safe for untrusted users to execute: they may cause the Datasette server to crash.
You should not expose a SpatiaLite-enabled Datasette instance to the public internet without taking extra measures to secure it against potentially harmful SQL queries.
The following steps are recommended:
Disable arbitrary SQL queries by untrusted users. See Controlling the ability to execute arbitrary SQL for ways to do this. The easiest is to start Datasette with the datasette --setting default_allow_sql off option.
Define Canned queries with the SQL queries that use SpatiaLite functions that you want people to be able to execute.
The Datasette SpatiaLite tutorial includes detailed instructions for running SpatiaLite safely using these techniques.",21,
247,Installation,,21,
248,Installing SpatiaLite on OS X,"The easiest way to install SpatiaLite on OS X is to use Homebrew .
brew update
brew install spatialite-tools
This will install the spatialite command-line tool and the mod_spatialite dynamic library.
You can now run Datasette like so:
datasette --load-extension=spatialite",21,
249,Installing SpatiaLite on Linux,"SpatiaLite is packaged for most Linux distributions.
apt install spatialite-bin libsqlite3-mod-spatialite
Depending on your distribution, you should be able to run Datasette something like this:
datasette --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so
If you are unsure of the location of the module, try running locate mod_spatialite and see what comes back.",21,
250,Spatial indexing latitude/longitude columns,"Here's a recipe for taking a table with existing latitude and longitude columns, adding a SpatiaLite POINT geometry column to that table, populating the new column and then populating a spatial index:
import sqlite3
conn = sqlite3.connect(""museums.db"")
# Load the spatialite extension:
conn.enable_load_extension(True)
conn.load_extension(""/usr/local/lib/mod_spatialite.dylib"")
# Initialize spatial metadata for this database:
conn.execute(""select InitSpatialMetadata(1)"")
# Add a geometry column called point_geom to our museums table:
conn.execute(
""SELECT AddGeometryColumn('museums', 'point_geom', 4326, 'POINT', 2);""
)
# Now update that geometry column with the lat/lon points
conn.execute(
""""""
UPDATE museums SET
point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);
""""""
)
# Now add a spatial index to that column
conn.execute(
'select CreateSpatialIndex(""museums"", ""point_geom"");'
)
# If you don't commit, your changes will not be persisted:
conn.commit()
conn.close()",21,
251,Making use of a spatial index,"SpatiaLite spatial indexes are R*Trees. They allow you to run efficient bounding box queries using a sub-select, with a similar pattern to that used for Searches using custom SQL .
In the above example, the resulting index will be called idx_museums_point_geom . This takes the form of a SQLite virtual table. You can inspect its contents using the following query:
select * from idx_museums_point_geom limit 10;
Here's a live example: timezones-api.datasette.io/timezones/idx_timezones_Geometry
pkid    xmin                   xmax                   ymin                  ymax
1       -8.601725578308105     -2.4930307865142822    4.162120819091797     10.74019718170166
2       -3.2607860565185547    1.27329421043396       4.539252281188965     11.174856185913086
3       32.997581481933594     47.98238754272461      3.3974475860595703    14.894054412841797
4       -8.66890811920166      11.997337341308594     18.9681453704834      37.296207427978516
5       36.43336486816406      43.300174713134766     12.354820251464844    18.070993423461914
You can now construct efficient bounding box queries that will make use of the index like this:
select * from museums where museums.rowid in (
SELECT pkid FROM idx_museums_point_geom
-- left-hand-edge of point > left-hand-edge of bbox (minx)
where xmin > :bbox_minx
-- right-hand-edge of point < right-hand-edge of bbox (maxx)
and xmax < :bbox_maxx
-- bottom-edge of point > bottom-edge of bbox (miny)
and ymin > :bbox_miny
-- top-edge of point < top-edge of bbox (maxy)
and ymax < :bbox_maxy
);
Spatial indexes can be created against polygon columns as well as point columns, in which case they will represent the minimum bounding rectangle of that polygon. This is useful for accelerating within queries, as seen in the Timezones API example.",21,
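The same bounding box query can be run from Python using named parameters. A minimal sketch, assuming the museums.db database and SpatiaLite module path from the recipe above; the bounding box values are placeholders:

import sqlite3

conn = sqlite3.connect("museums.db")
conn.enable_load_extension(True)
conn.load_extension("/usr/local/lib/mod_spatialite.dylib")

sql = """
select * from museums where museums.rowid in (
    SELECT pkid FROM idx_museums_point_geom
    where xmin > :bbox_minx
      and xmax < :bbox_maxx
      and ymin > :bbox_miny
      and ymax < :bbox_maxy
)
"""
rows = conn.execute(
    sql,
    {"bbox_minx": -0.5, "bbox_maxx": 0.5, "bbox_miny": 51.0, "bbox_maxy": 52.0},
).fetchall()
print(len(rows), "museums found in the bounding box")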
252,Importing shapefiles into SpatiaLite,"The shapefile format is a common format for distributing geospatial data. You can use the spatialite command-line tool to create a new database table from a shapefile.
Try it now with the North America shapefile available from the University of North Carolina Global River Database project. Download the file and unzip it (this will create files called narivs.dbf , narivs.prj , narivs.shp and narivs.shx in the current directory), then run the following:
spatialite rivers-database.db
SpatiaLite version ..: 4.3.0a Supported Extensions:
...
spatialite> .loadshp narivs rivers CP1252 23032
========
Loading shapefile at 'narivs' into SQLite table 'rivers'
...
Inserted 467973 rows into 'rivers' from SHAPEFILE
This will load the data from the narivs shapefile into a new database table called rivers .
Exit out of spatialite (using Ctrl+D ) and run Datasette against your new database like this:
datasette rivers-database.db \
--load-extension=/usr/local/lib/mod_spatialite.dylib
If you browse to http://localhost:8001/rivers-database/rivers you will see the new table... but the Geometry column will contain unreadable binary data (SpatiaLite uses a custom format based on WKB ).
The easiest way to turn this into semi-readable data is to use the SpatiaLite AsGeoJSON function. Try the following using the SQL query interface at http://localhost:8001/rivers-database :
select *, AsGeoJSON(Geometry) from rivers limit 10;
This will give you back an additional column of GeoJSON. You can copy and paste GeoJSON from this column into the debugging tool at geojson.io to visualize it on a map.
To see a more interesting example, try ordering the records with the longest geometry first. Since there are 467,000 rows in the table you will first need to increase the SQL time limit imposed by Datasette:
datasette rivers-database.db \
--load-extension=/usr/local/lib/mod_spatialite.dylib \
--setting sql_time_limit_ms 10000
Now try the following query:
select *, AsGeoJSON(Geometry) from rivers
order by length(Geometry) desc limit 10;",21,
253,Importing GeoJSON polygons using Shapely,"Another common form of polygon data is the GeoJSON format. This can be imported into SpatiaLite directly, or by using the Shapely Python library.
Who's On First is an excellent source of openly licensed GeoJSON polygons. Let's import the geographical polygon for Wales. First, we can use the Who's On First Spelunker tool to find the record for Wales:
spelunker.whosonfirst.org/id/404227475
That page includes a link to the GeoJSON record, which can be accessed here:
data.whosonfirst.org/404/227/475/404227475.geojson
Here's Python code to create a SQLite database, enable SpatiaLite, create a places table and then add a record for Wales:
import sqlite3
conn = sqlite3.connect(""places.db"")
# Enable the SpatiaLite extension
conn.enable_load_extension(True)
conn.load_extension(""/usr/local/lib/mod_spatialite.dylib"")
# Create the places table
conn.execute(""select InitSpatialMetadata(1)"")
conn.execute(
""create table places (id integer primary key, name text);""
)
# Add a MULTIPOLYGON Geometry column
conn.execute(
""SELECT AddGeometryColumn('places', 'geom', 4326, 'MULTIPOLYGON', 2);""
)
# Add a spatial index against the new column
conn.execute(""SELECT CreateSpatialIndex('places', 'geom');"")
# Now populate the table
from shapely.geometry.multipolygon import MultiPolygon
from shapely.geometry import shape
import requests
geojson = requests.get(
""https://data.whosonfirst.org/404/227/475/404227475.geojson""
).json()
# Convert to ""Well Known Text"" format
wkt = shape(geojson[""geometry""]).wkt
# Insert and commit the record
conn.execute(
""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"",
(""Wales"", wkt),
)
conn.commit()",21,
254,Querying polygons using within(),"The within() SQL function can be used to check if a point is within a geometry:
select
name
from
places
where
within(GeomFromText('POINT(-3.1724366 51.4704448)'), places.geom);
The GeomFromText() function takes a string of well-known text. Note that the order used here is longitude then latitude .
To run that same within() query in a way that benefits from the spatial index, use the following:
select
name
from
places
where
within(GeomFromText('POINT(-3.1724366 51.4704448)'), places.geom)
and rowid in (
SELECT pkid FROM idx_places_geom
where xmin < -3.1724366
and xmax > -3.1724366
and ymin < 51.4704448
and ymax > 51.4704448
);",21,
255,Internals for plugins,Many Plugin hooks are passed objects that provide access to internal Datasette functionality. The interface to these objects should not be considered stable with the exception of methods that are documented here.,21,
256,Request object,"The request object is passed to various plugin hooks. It represents an incoming HTTP request. It has the following properties:
.scope - dictionary
The ASGI scope that was used to construct this request, described in the ASGI HTTP connection scope specification.
.method - string
The HTTP method for this request, usually GET or POST .
.url - string
The full URL for this request, e.g. https://latest.datasette.io/fixtures .
.scheme - string
The request scheme - usually https or http .
.headers - dictionary (str -> str)
A dictionary of incoming HTTP request headers. Header names have been converted to lowercase.
.cookies - dictionary (str -> str)
A dictionary of incoming cookies
.host - string
The host header from the incoming request, e.g. latest.datasette.io or localhost .
.path - string
The path of the request excluding the query string, e.g. /fixtures .
.full_path - string
The path of the request including the query string if one is present, e.g. /fixtures?sql=select+sqlite_version() .
.query_string - string
The query string component of the request, without the ? - e.g. name__contains=sam&age__gt=10 .
.args - MultiParams
An object representing the parsed query string parameters, see below.
.url_vars - dictionary (str -> str)
Variables extracted from the URL path, if that path was defined using a regular expression. See register_routes(datasette) .
.actor - dictionary (str -> Any) or None
The currently authenticated actor (see actors ), or None if the request is unauthenticated.
The object also has two awaitable methods:
await request.post_vars() - dictionary
Returns a dictionary of form variables that were submitted in the request body via POST . Don't forget to read about CSRF protection !
await request.post_body() - bytes
Returns the un-parsed body of a request submitted by POST - useful for things like incoming JSON data.
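As a rough sketch, a hypothetical view function could read submitted form data like this:
from datasette.utils.asgi import Response
async def submit(request):
    if request.method == ""POST"":
        post_vars = await request.post_vars()
        name = post_vars.get(""name"") or ""anonymous""
        return Response.text(""Hello, "" + name)
    return Response.text(""Send a POST request"")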
And a class method that can be used to create fake request objects for use in tests:
fake(path_with_query_string, method=""GET"", scheme=""http"", url_vars=None)
Returns a Request instance for the specified path and method. For example:
from datasette import Request
from pprint import pprint
request = Request.fake(
""/fixtures/facetable/"",
url_vars={""database"": ""fixtures"", ""table"": ""facetable""},
)
pprint(request.scope)
This outputs:
{'http_version': '1.1',
'method': 'GET',
'path': '/fixtures/facetable/',
'query_string': b'',
'raw_path': b'/fixtures/facetable/',
'scheme': 'http',
'type': 'http',
'url_route': {'kwargs': {'database': 'fixtures', 'table': 'facetable'}}}",21,
257,The MultiParams class,"request.args is a MultiParams object - a dictionary-like object which provides access to query string parameters that may have multiple values.
Consider the query string ?foo=1&foo=2&bar=3 - with two values for foo and one value for bar .
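Here is a rough sketch of how those values can be accessed, using the Request.fake() helper described above - each method is documented in detail below:
from datasette import Request
request = Request.fake(""/?foo=1&foo=2&bar=3"")
assert request.args[""foo""] == ""1""
assert request.args.getlist(""foo"") == [""1"", ""2""]
assert request.args.get(""missing"", ""default"") == ""default""
assert ""bar"" in request.args
assert len(request.args) == 2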
request.args[key] - string
Returns the first value for that key, or raises a KeyError if the key is missing. For the above example request.args[""foo""] would return ""1"" .
request.args.get(key) - string or None
Returns the first value for that key, or None if the key is missing. Pass a second argument to specify a different default, e.g. q = request.args.get(""q"", """") .
request.args.getlist(key) - list of strings
Returns the list of strings for that key. request.args.getlist(""foo"") would return [""1"", ""2""] in the above example. request.args.getlist(""bar"") would return [""3""] . If the key is missing an empty list will be returned.
request.args.keys() - list of strings
Returns the list of available keys - for the example this would be [""foo"", ""bar""] .
key in request.args - True or False
You can use if key in request.args to check if a key is present.
for key in request.args - iterator
This lets you loop through every available key.
len(request.args) - integer
Returns the number of keys.",21,
258,Response class,"The Response class can be returned from view functions that have been registered using the register_routes(datasette) hook.
The Response() constructor takes the following arguments:
body - string
The body of the response.
status - integer (optional)
The HTTP status - defaults to 200.
headers - dictionary (optional)
A dictionary of extra HTTP headers, e.g. {""x-hello"": ""world""} .
content_type - string (optional)
The content-type for the response. Defaults to text/plain .
For example:
from datasette.utils.asgi import Response
response = Response(
""This is XML"",
content_type=""application/xml; charset=utf-8"",
)
The quickest way to create responses is using the Response.text(...) , Response.html(...) , Response.json(...) or Response.redirect(...) helper methods:
from datasette.utils.asgi import Response
html_response = Response.html(""This is HTML"")
json_response = Response.json({""this_is"": ""json""})
text_response = Response.text(
""This will become utf-8 encoded text""
)
# Redirects are served as 302, unless you pass status=301:
redirect_response = Response.redirect(
""https://latest.datasette.io/""
)
Each of these responses will use the correct corresponding content-type - text/html; charset=utf-8 , application/json; charset=utf-8 or text/plain; charset=utf-8 respectively.
Each of the helper methods takes optional status= and headers= arguments, documented above.",21,
259,Returning a response with .asgi_send(send),"In most cases you will return Response objects from your own view functions. You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook.
Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example:
from datasette.utils.asgi import Response
async def require_authorization(scope, receive, send):
response = Response.text(
""401 Authorization Required"",
headers={
""www-authenticate"": 'Basic realm=""Datasette"", charset=""UTF-8""'
},
status=401,
)
await response.asgi_send(send)",21,
260,Setting cookies with response.set_cookie(),"To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this:
def set_cookie(
self,
key,
value="""",
max_age=None,
expires=None,
path=""/"",
domain=None,
secure=False,
httponly=False,
samesite=""lax"",
): ...
You can use this with datasette.sign() to set signed cookies. Here's how you would set the ds_actor cookie for use with Datasette authentication :
response = Response.redirect(""/"")
response.set_cookie(
""ds_actor"",
datasette.sign({""a"": {""id"": ""cleopaws""}}, ""actor""),
)
return response",21,
261,Datasette class,"This object is an instance of the Datasette class, passed to many plugin hooks as an argument called datasette .
You can create your own instance of this - for example to help write tests for a plugin - like so:
from datasette.app import Datasette
# With no arguments a single in-memory database will be attached
datasette = Datasette()
# The files= argument can load files from disk
datasette = Datasette(files=[""/path/to/my-database.db""])
# Pass metadata as a JSON dictionary like this
datasette = Datasette(
files=[""/path/to/my-database.db""],
metadata={
""databases"": {
""my-database"": {
""description"": ""This is my database""
}
}
},
)
Constructor parameters include:
files=[...] - a list of database files to open
immutables=[...] - a list of database files to open in immutable mode
metadata={...} - a dictionary of Metadata
config_dir=... - the configuration directory to use, stored in datasette.config_dir",21,
262,.databases,"Property exposing a collections.OrderedDict of databases currently connected to Datasette.
The dictionary keys are the name of the database that is used in the URL - e.g. /fixtures would have a key of ""fixtures"" . The values are Database class instances.
All databases are listed, irrespective of user permissions.",21,
263,.permissions,"Property exposing a dictionary of permissions that have been registered using the register_permissions(datasette) plugin hook.
The dictionary keys are the permission names - e.g. view-instance - and the values are Permission() objects describing the permission. Here is a description of that object .",21,
264,".plugin_config(plugin_name, database=None, table=None)","plugin_name - string
The name of the plugin to look up configuration for. Usually this is something similar to datasette-cluster-map .
database - None or string
The database the user is interacting with.
table - None or string
The table the user is interacting with.
This method lets you read plugin configuration values that were set in datasette.yaml . See Writing plugins that accept configuration for full details of how this method should be used.
The return value will be the value from the configuration file - usually a dictionary.
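A minimal sketch of reading that configuration from inside a plugin hook - the plugin name and returned variable name here are hypothetical:
from datasette import hookimpl
@hookimpl
def extra_template_vars(datasette, database, table):
    config = (
        datasette.plugin_config(
            ""datasette-cluster-map"", database=database, table=table
        )
        or {}
    )
    return {""cluster_map_config"": config}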
If the plugin is not configured the return value will be None .",21,
265,"await .render_template(template, context=None, request=None)","template - string, list of strings or jinja2.Template
The template file to be rendered, e.g. my_plugin.html . Datasette will search for this file first in the --template-dir= location, if it was specified - then in the plugin's bundled templates and finally in Datasette's set of default templates.
If this is a list of template file names then the first one that exists will be loaded and rendered.
If this is a Jinja Template object it will be used directly.
context - None or a Python dictionary
The context variables to pass to the template.
request - request object or None
If you pass a Datasette request object here it will be made available to the template.
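This method is typically called from a view function or plugin hook. A rough sketch - the my_plugin.html template and the message variable are assumptions for illustration:
from datasette.utils.asgi import Response
async def plugin_page(datasette, request):
    html = await datasette.render_template(
        ""my_plugin.html"",
        {""message"": ""Hello from my plugin""},
        request=request,
    )
    return Response.html(html)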
Renders a Jinja template using Datasette's preconfigured instance of Jinja and returns the resulting string. The template will have access to Datasette's default template functions and any functions that have been made available by other plugins.",21,
266,await .actors_from_ids(actor_ids),"actor_ids - list of strings or integers
A list of actor IDs to look up.
Returns a dictionary, where the keys are the IDs passed to it and the values are the corresponding actor dictionaries.
This method is mainly designed to be used with plugins. See the actors_from_ids(datasette, actor_ids) documentation for details.
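For example, a minimal sketch:
actors = await datasette.actors_from_ids([""1"", ""2""])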
If no plugins that implement that hook are installed, the default return value looks like this:
{
""1"": {""id"": ""1""},
""2"": {""id"": ""2""}
}",21,
267,"await .permission_allowed(actor, action, resource=None, default=...)","actor - dictionary
The authenticated actor. This is usually request.actor .
action - string
The name of the action that is being permission checked.
resource - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
default - optional: True, False or None
What value should be returned by default if nothing provides an opinion on this permission check.
Set to True for default allow or False for default deny.
If not specified the default from the Permission() tuple that was registered using register_permissions(datasette) will be used.
Check if the given actor has permission to perform the given action on the given resource.
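A rough sketch of checking a single permission - the database and table names here are hypothetical:
allowed = await datasette.permission_allowed(
    request.actor,
    ""view-table"",
    resource=(""fixtures"", ""facetable""),
)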
Some permission checks are carried out against rules defined in datasette.yaml , while other custom permissions may be decided by plugins that implement the permission_allowed(datasette, actor, action, resource) plugin hook.
If neither datasette.yaml nor any of the plugins provide an answer to the permission query the default argument will be returned.
See Built-in permissions for a full list of permission actions included in Datasette core.",21,
268,"await .ensure_permissions(actor, permissions)","actor - dictionary
The authenticated actor. This is usually request.actor .
permissions - list
A list of permissions to check. Each permission in that list can be a string action name or a 2-tuple of (action, resource) .
This method allows multiple permissions to be checked at once. It raises a datasette.Forbidden exception if any of the checks are denied before one of them is explicitly granted.
This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if at least one of the following checks returns True, or if none of them returns False:
await datasette.ensure_permissions(
request.actor,
[
(""view-table"", (database, table)),
(""view-database"", database),
""view-instance"",
],
)",21,
269,"await .check_visibility(actor, action=None, resource=None, permissions=None)","actor - dictionary
The authenticated actor. This is usually request.actor .
action - string, optional
The name of the action that is being permission checked.
resource - string or tuple, optional
The resource, e.g. the name of the database, or a tuple of two strings containing the name of the database and the name of the table. Only some permissions apply to a resource.
permissions - list of action strings or (action, resource) tuples, optional
Provide this instead of action and resource to check multiple permissions at once.
This convenience method can be used to answer the question ""should this item be considered private, in that it is visible to me but it is not visible to anonymous users?""
It returns a tuple of two booleans, (visible, private) . visible indicates if the actor can see this resource. private will be True if an anonymous user would not be able to view the resource.
This example checks if the user can access a specific table, and sets private so that a padlock icon can later be displayed:
visible, private = await datasette.check_visibility(
request.actor,
action=""view-table"",
resource=(database, table),
)
The following example runs three checks in a row, similar to await .ensure_permissions(actor, permissions) . If any of the checks are denied before one of them is explicitly granted then visible will be False . private will be True if an anonymous user would not be able to view the resource.
visible, private = await datasette.check_visibility(
request.actor,
permissions=[
(""view-table"", (database, table)),
(""view-database"", database),
""view-instance"",
],
)",21,
270,".create_token(actor_id, expires_after=None, restrict_all=None, restrict_database=None, restrict_resource=None)","actor_id - string
The ID of the actor to create a token for.
expires_after - int, optional
The number of seconds after which the token should expire.
restrict_all - iterable, optional
A list of actions that this token should be restricted to across all databases and resources.
restrict_database - dict, optional
For restricting actions within specific databases, e.g. {""mydb"": [""view-table"", ""view-query""]} .
restrict_resource - dict, optional
For restricting actions to specific resources (tables, SQL views and Canned queries ) within a database. For example: {""mydb"": {""mytable"": [""insert-row"", ""update-row""]}} .
This method returns a signed API token of the format dstok_... which can be used to authenticate requests to the Datasette API.
All tokens must have an actor_id string indicating the ID of the actor which the token will act on behalf of.
Tokens default to lasting forever, but can be set to expire after a given number of seconds using the expires_after argument. The following code creates a token for user1 that will expire after an hour:
token = datasette.create_token(
actor_id=""user1"",
expires_after=3600,
)
The three restrict_* arguments can be used to create a token that has additional restrictions beyond what the associated actor is allowed to do.
The following example creates a token that can access view-instance and view-table across everything, can additionally use view-query for anything in the docs database and is allowed to execute insert-row and update-row in the attachments table in that database:
token = datasette.create_token(
actor_id=""user1"",
restrict_all=(""view-instance"", ""view-table""),
restrict_database={""docs"": (""view-query"",)},
restrict_resource={
""docs"": {
""attachments"": (""insert-row"", ""update-row"")
}
},
)",21,
271,.get_permission(name_or_abbr),"name_or_abbr - string
The name or abbreviation of the permission to look up, e.g. view-table or vt .
Returns a Permission object representing the permission, or raises a KeyError if one is not found.",21,
272,.get_database(name),"name - string, optional
The name of the database - optional.
Returns the specified database object. Raises a KeyError if the database does not exist. Call this method without an argument to return the first connected database.",21,
273,.get_internal_database(),Returns a database object for reading and writing to the private internal database .,21,
274,Getting and setting metadata,"Metadata about the instance, databases, tables and columns is stored in tables in Datasette's internal database . The following methods are the supported API for plugins to read and update that stored metadata.",21,
275,await .get_instance_metadata(self),"Returns metadata keys and values for the entire Datasette instance as a dictionary.
Internally queries the metadata_instance table inside the internal database .",21,
276,"await .get_database_metadata(self, database_name)","database_name - string
The name of the database to query.
Returns metadata keys and values for the specified database as a dictionary.
Internally queries the metadata_databases table inside the internal database .",21,
277,"await .get_resource_metadata(self, database_name, resource_name)","database_name - string
The name of the database to query.
resource_name - string
The name of the resource (table, view, or canned query) inside database_name to query.
Returns metadata keys and values for the specified ""resource"" as a dictionary.
A ""resource"" in this context can be a table, view, or canned query.
Internally queries the metadata_resources table inside the internal database .",21,
278,"await .get_column_metadata(self, database_name, resource_name, column_name)","database_name - string
The name of the database to query.
resource_name - string
The name of the resource (table, view, or canned query) inside database_name to query.
column_name - string
The name of the column inside resource_name to query.
Returns metadata keys and values for the specified column, resource, and table as a dictionary.
Internally queries the metadata_columns table inside the internal database .",21,
279,"await .set_instance_metadata(self, key, value)","key - string
The metadata entry key to insert (e.g. title , description , etc.)
value - string
The value of the metadata entry to insert.
Adds a new metadata entry for the entire Datasette instance.
Any previous instance-level metadata entry with the same key will be overwritten.
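A minimal sketch, setting a title and reading it back with the get_instance_metadata() method described above:
await datasette.set_instance_metadata(""title"", ""My Datasette instance"")
instance_metadata = await datasette.get_instance_metadata()
# instance_metadata is now a dictionary that includes the new title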
Internally upserts the value into the metadata_instance table inside the internal database .",21,
280,"await .set_database_metadata(self, database_name, key, value)","database_name - string
The database the metadata entry belongs to.
key - string
The metadata entry key to insert (e.g. title , description , etc.)
value - string
The value of the metadata entry to insert.
Adds a new metadata entry for the specified database.
Any previous database-level metadata entry with the same key will be overwritten.
Internally upserts the value into the metadata_databases table inside the internal database .",21,
281,"await .set_resource_metadata(self, database_name, resource_name, key, value)","database_name - string
The database the metadata entry belongs to.
resource_name - string
The resource (table, view, or canned query) the metadata entry belongs to.
key - string
The metadata entry key to insert (e.g. title , description , etc.)
value - string
The value of the metadata entry to insert.
Adds a new metadata entry for the specified ""resource"".
Any previous resource-level metadata entry with the same key will be overwritten.
Internally upserts the value into the metadata_resources table inside the internal database .",21,
282,"await .set_column_metadata(self, database_name, resource_name, column_name, key, value)","database_name - string
The database the metadata entry belongs to.
resource_name - string
The resource (table, view, or canned query) the metadata entry belongs to.
column_name - string
The column the metadata entry belongs to.
key - string
The metadata entry key to insert (e.g. title , description , etc.)
value - string
The value of the metadata entry to insert.
Adds a new metadata entry for the specified column.
Any previous column-level metadata entry with the same key will be overwritten.
Internally upserts the value into the metadata_columns table inside the internal database .",21,
283,".add_database(db, name=None, route=None)","db - datasette.database.Database instance
The database to be attached.
name - string, optional
The name to be used for this database . If not specified Datasette will pick one based on the filename or memory name.
route - string, optional
This will be used in the URL path. If not specified, it will default to the same thing as the name .
The datasette.add_database(db) method lets you add a new database to the current Datasette instance.
The db parameter should be an instance of the datasette.database.Database class. For example:
from datasette.database import Database
datasette.add_database(
Database(
datasette,
path=""path/to/my-new-database.db"",
)
)
This will add a mutable database and serve it at /my-new-database .
Use is_mutable=False to add an immutable database.
.add_database() returns the Database instance, with its name set as the database.name attribute. Any time you are working with a newly added database you should use the return value of .add_database() , for example:
db = datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
await db.execute_write(
""CREATE TABLE foo(id integer primary key)""
)",21,
284,.add_memory_database(name),"Adds a shared in-memory database with the specified name:
datasette.add_memory_database(""statistics"")
This is a shortcut for the following:
from datasette.database import Database
datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
Using either of these patterns will result in the in-memory database being served at /statistics .",21,
285,.remove_database(name),"name - string
The name of the database to be removed.
This removes a database that has been previously added. name= is the unique name of that database.",21,
286,await .track_event(event),"event - Event
An instance of a subclass of datasette.events.Event .
Plugins can call this to track events, using classes they have previously registered. See Event tracking for details.
The event will then be passed to all plugins that have registered to receive events using the track_event(datasette, event) hook.
Example usage, assuming the plugin has previously registered the BanUserEvent class:
await datasette.track_event(
BanUserEvent(user={""id"": 1, ""username"": ""cleverbot""})
)",21,
287,".sign(value, namespace=""default"")","value - any serializable type
The value to be signed.
namespace - string, optional
An alternative namespace, see the itsdangerous salt documentation .
Utility method for signing values, such that you can safely pass data to and from an untrusted environment. This is a wrapper around the itsdangerous library.
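A minimal sketch - the namespace used here is hypothetical:
signed = datasette.sign({""id"": ""cleopaws""}, ""cookie-example"")
# Raises itsdangerous.BadSignature if the value has been tampered with:
original = datasette.unsign(signed, ""cookie-example"")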
This method returns a signed string, which can be decoded and verified using .unsign(value, namespace=""default"") .",21,
288,".unsign(value, namespace=""default"")","signed - any serializable type
The signed string that was created using .sign(value, namespace=""default"") .
namespace - string, optional
The alternative namespace, if one was used.
Returns the original, decoded object that was passed to .sign(value, namespace=""default"") . If the signature is not valid this raises a itsdangerous.BadSignature exception.",21,
289,".add_message(request, message, type=datasette.INFO)","request - Request
The current Request object
message - string
The message string
type - constant, optional
The message type - datasette.INFO , datasette.WARNING or datasette.ERROR
Datasette's flash messaging mechanism allows you to add a message that will be displayed to the user on the next page that they visit. Messages are persisted in a ds_messages cookie. This method adds a message to that cookie.
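A minimal sketch, from inside a view or plugin hook that has access to the current request:
datasette.add_message(
    request, ""Your changes have been saved"", type=datasette.INFO
)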
You can try out these messages (including the different visual styling of the three message types) using the /-/messages debugging tool.",21,
290,".absolute_url(request, path)","request - Request
The current Request object
path - string
A path, for example /dbname/table.json
Returns the absolute URL for the given path, including the protocol and host. For example:
absolute_url = datasette.absolute_url(
request, ""/dbname/table.json""
)
# Would return ""http://localhost:8001/dbname/table.json""
The current request object is used to determine the hostname and protocol that should be used for the returned URL. The force_https_urls configuration setting is taken into account.",21,
291,.setting(key),"key - string
The name of the setting, e.g. base_url .
Returns the configured value for the specified setting . This can be a string, boolean or integer depending on the requested setting.
For example:
downloads_are_allowed = datasette.setting(""allow_download"")",21,
292,.resolve_database(request),"request - Request object
A request object
If you are implementing your own custom views, you may need to resolve the database that the user is requesting based on a URL path. If the regular expression for your route declares a database named group, you can use this method to resolve the database object.
This returns a Database instance.
If the database cannot be found, it raises a datasette.utils.asgi.DatabaseNotFound exception - which is a subclass of datasette.utils.asgi.NotFound with a .database_name attribute set to the name of the database that was requested.",21,
293,.resolve_table(request),"request - Request object
A request object
This assumes that the regular expression for your route declares both a database and a table named group.
It returns a ResolvedTable named tuple instance with the following fields:
db - Database
The database object
table - string
The name of the table (or view)
is_view - boolean
True if this is a view, False if it is a table
If the database or table cannot be found it raises a datasette.utils.asgi.DatabaseNotFound exception.
If the table does not exist it raises a datasette.utils.asgi.TableNotFound exception - a subclass of datasette.utils.asgi.NotFound with .database_name and .table attributes.",21,
294,.resolve_row(request),"request - Request object
A request object
This method assumes your route declares named groups for database , table and pks .
It returns a ResolvedRow named tuple instance with the following fields:
db - Database
The database object
table - string
The name of the table
sql - string
SQL snippet that can be used in a WHERE clause to select the row
params - dict
Parameters that should be passed to the SQL query
pks - list
List of primary key column names
pk_values - list
List of primary key values decoded from the URL
row - sqlite3.Row
The row itself
If the database or table cannot be found it raises a datasette.utils.asgi.DatabaseNotFound exception.
If the table does not exist it raises a datasette.utils.asgi.TableNotFound exception.
If the row cannot be found it raises a datasette.utils.asgi.RowNotFound exception. This has .database_name , .table and .pk_values attributes, extracted from the request path.",21,
295,datasette.client,"Plugins can make internal simulated HTTP requests to the Datasette instance within which they are running. This ensures that all of Datasette's external JSON APIs are also available to plugins, while avoiding the overhead of making an external HTTP call to access those APIs.
The datasette.client object is a wrapper around the HTTPX Python library , providing an async-friendly API that is similar to the widely used Requests library .
It offers the following methods:
await datasette.client.get(path, **kwargs) - returns HTTPX Response
Execute an internal GET request against that path.
await datasette.client.post(path, **kwargs) - returns HTTPX Response
Execute an internal POST request. Use data={""name"": ""value""} to pass form parameters.
await datasette.client.options(path, **kwargs) - returns HTTPX Response
Execute an internal OPTIONS request.
await datasette.client.head(path, **kwargs) - returns HTTPX Response
Execute an internal HEAD request.
await datasette.client.put(path, **kwargs) - returns HTTPX Response
Execute an internal PUT request.
await datasette.client.patch(path, **kwargs) - returns HTTPX Response
Execute an internal PATCH request.
await datasette.client.delete(path, **kwargs) - returns HTTPX Response
Execute an internal DELETE request.
await datasette.client.request(method, path, **kwargs) - returns HTTPX Response
Execute an internal request with the given HTTP method against that path.
These methods can be used with datasette.urls - for example:
table_json = (
await datasette.client.get(
datasette.urls.table(
""fixtures"", ""facetable"", format=""json""
)
)
).json()
datasette.client methods automatically take the current base_url setting into account, whether or not you use the datasette.urls family of methods to construct the path.
For documentation on available **kwargs options and the shape of the HTTPX Response object refer to the HTTPX Async documentation .",21,
296,datasette.urls,"The datasette.urls object contains methods for building URLs to pages within Datasette. Plugins should use this to link to pages, since these methods take into account any base_url configuration setting that might be in effect.
datasette.urls.instance(format=None)
Returns the URL to the Datasette instance root page. This is usually ""/"" .
datasette.urls.path(path, format=None)
Takes a path and returns the full path, taking base_url into account.
For example, datasette.urls.path(""-/logout"") will return the path to the logout page, which will be ""/-/logout"" by default or /prefix-path/-/logout if base_url is set to /prefix-path/
datasette.urls.logout()
Returns the URL to the logout page, usually ""/-/logout""
datasette.urls.static(path)
Returns the URL of one of Datasette's default static assets, for example ""/-/static/app.css""
datasette.urls.static_plugins(plugin_name, path)
Returns the URL of one of the static assets belonging to a plugin.
datasette.urls.static_plugins(""datasette_cluster_map"", ""datasette-cluster-map.js"") would return ""/-/static-plugins/datasette_cluster_map/datasette-cluster-map.js""
datasette.urls.database(database_name, format=None)
Returns the URL to a database page, for example ""/fixtures""
datasette.urls.table(database_name, table_name, format=None)
Returns the URL to a table page, for example ""/fixtures/facetable""
datasette.urls.query(database_name, query_name, format=None)
Returns the URL to a query page, for example ""/fixtures/pragma_cache_size""
These functions can be accessed via the {{ urls }} object in Datasette templates, for example:
<a href=""{{ urls.instance() }}"">Homepage</a>
<a href=""{{ urls.database(""fixtures"") }}"">Fixtures database</a>
<a href=""{{ urls.table(""fixtures"", ""facetable"") }}"">facetable table</a>
<a href=""{{ urls.query(""fixtures"", ""pragma_cache_size"") }}"">pragma_cache_size query</a>
Use the format=""json"" (or ""csv"" or other formats supported by plugins) arguments to get back URLs to the JSON representation. This is the path with .json added on the end.
These methods each return a datasette.utils.PrefixedUrlString object, which is a subclass of the Python str type. This allows the logic that considers the base_url setting to detect if that prefix has already been applied to the path.",21,
297,Database class,"Instances of the Database class can be used to execute queries against attached SQLite databases, and to run introspection against their schemas.",21,
298,"Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None)","The Database() constructor can be used by plugins, in conjunction with .add_database(db, name=None, route=None) , to create and register new databases.
The arguments are as follows:
ds - Datasette class (required)
The Datasette instance you are attaching this database to.
path - string
Path to a SQLite database file on disk.
is_mutable - boolean
Set this to False to cause Datasette to open the file in immutable mode.
is_memory - boolean
Use this to create non-shared memory connections.
memory_name - string or None
Use this to create a named in-memory database. Unlike regular memory databases these can be accessed by multiple threads and will persist any changes made to them for the lifetime of the Datasette server process.
The first argument is the datasette instance you are attaching to, the second is a path= , then is_mutable and is_memory are both optional arguments.",21,
299,db.hash,"If the database was opened in immutable mode, this property returns the 64 character SHA-256 hash of the database contents as a string. Otherwise it returns None .",21,
300,"await db.execute(sql, ...)","Executes a SQL query against the database and returns the resulting rows (see Results ).
sql - string (required)
The SQL query to execute. This can include ? or :named parameters.
params - list or dict
A list or dictionary of values to use for the parameters. List for ? , dictionary for :named .
truncate - boolean
Should the rows returned by the query be truncated at the maximum page size? Defaults to True , set this to False to disable truncation.
custom_time_limit - integer ms
A custom time limit for this query. This can be set to a lower value than the Datasette configured default. If a query takes longer than this it will be terminated early and raise a datasette.database.QueryInterrupted exception.
page_size - integer
Set a custom page size for truncation, over-riding the configured Datasette default.
log_sql_errors - boolean
Should any SQL errors be logged to the console in addition to being raised as an error? Defaults to True .",21,
301,Results,"The db.execute() method returns a single Results object. This can be used to access the rows returned by the query.
Iterating over a Results object will yield SQLite Row objects . Each of these can be treated as a tuple or can be accessed using row[""column""] syntax:
info = []
results = await db.execute(""select name from sqlite_master"")
for row in results:
info.append(row[""name""])
The Results object also has the following properties and methods:
.truncated - boolean
Indicates if this query was truncated - if it returned more results than the specified page_size . If this is true then the results object will only provide access to the first page_size rows in the query result. You can disable truncation by passing truncate=False to the db.execute() method.
.columns - list of strings
A list of column names returned by the query.
.rows - list of sqlite3.Row
This property provides direct access to the list of rows returned by the database. You can access specific rows by index using results.rows[0] .
.dicts() - list of dict
This method returns a list of Python dictionaries, one for each row.
.first() - row or None
Returns the first row in the results, or None if no rows were returned.
.single_value()
Returns the value of the first column of the first row of results - but only if the query returned a single row with a single column. Raises a datasette.database.MultipleValues exception otherwise.
.__len__()
Calling len(results) returns the (truncated) number of returned results.",21,
302,await db.execute_fn(fn),"Executes a given callback function against a read-only database connection running in a thread. The function will be passed a SQLite connection, and the return value from the function will be returned by the await .
Example usage:
def get_version(conn):
return conn.execute(
""select sqlite_version()""
).fetchall()[0][0]
version = await db.execute_fn(get_version)",21,
303,"await db.execute_write(sql, params=None, block=True)","SQLite only allows one database connection to write at a time. Datasette handles this for you by maintaining a queue of writes to be executed against a given database. Plugins can submit write operations to this queue and they will be executed in the order in which they are received.
This method can be used to queue up a non-SELECT SQL query to be executed against a single write connection to the database.
You can pass additional SQL parameters as a tuple or dictionary.
The method will block until the operation is completed, and the return value will be the return from calling conn.execute(...) using the underlying sqlite3 Python library.
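A minimal sketch, assuming a hypothetical events table:
await db.execute_write(
    ""insert into events (name) values (?)"",
    (""page_view"",),
)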
If you pass block=False this behavior changes to ""fire and forget"" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task.
Each call to execute_write() will be executed inside a transaction.",21,
304,"await db.execute_write_script(sql, block=True)","Like execute_write() but can be used to send multiple SQL statements in a single string separated by semicolons, using the sqlite3 conn.executescript() method.
Each call to execute_write_script() will be executed inside a transaction.",21,
305,"await db.execute_write_many(sql, params_seq, block=True)","Like execute_write() but uses the sqlite3 conn.executemany() method. This will efficiently execute the same SQL statement against each of the parameters in the params_seq iterator, for example:
await db.execute_write_many(
""insert into characters (id, name) values (?, ?)"",
[(1, ""Melanie""), (2, ""Selma""), (2, ""Viktor"")],
)
Each call to execute_write_many() will be executed inside a transaction.",21,
306,"await db.execute_write_fn(fn, block=True, transaction=True)","This method works like .execute_write() , but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function.
The function can then perform multiple actions, safe in the knowledge that it has exclusive access to the single writable connection for as long as it is executing.
fn needs to be a regular function, not an async def function.
For example:
def delete_and_return_count(conn):
conn.execute(""delete from some_table where id > 5"")
return conn.execute(
""select count(*) from some_table""
).fetchone()[0]
try:
num_rows_left = await database.execute_write_fn(
delete_and_return_count
)
except Exception as e:
print(""An error occurred:"", e)
The value returned from await database.execute_write_fn(...) will be the return value from your function.
If your function raises an exception that exception will be propagated up to the await line.
By default your function will be executed inside a transaction. You can pass transaction=False to disable this behavior, though if you do that you should be careful to manually apply transactions - ideally using the with conn: pattern, or you may see OperationalError: database table is locked errors.
If you specify block=False the method becomes fire-and-forget, queueing your function to be executed and then allowing your code after the call to .execute_write_fn() to continue running while the underlying thread waits for an opportunity to run your function. A UUID representing the queued task will be returned. Any exceptions in your code will be silently swallowed.",21,
307,await db.execute_isolated_fn(fn),"This method is similar to execute_write_fn() but executes the provided function in an entirely isolated SQLite connection, which is opened, used and then closed again in a single call to this method.
The prepare_connection() plugin hook is not executed against this connection.
This allows plugins to execute database operations that might conflict with how database connections are usually configured. For example, running a VACUUM operation while bypassing any restrictions placed by the datasette-sqlite-authorizer plugin.
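A minimal sketch of that VACUUM example:
def run_vacuum(conn):
    conn.execute(""VACUUM"")
await db.execute_isolated_fn(run_vacuum)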
Plugins can also use this method to load potentially dangerous SQLite extensions, use them to perform an operation and then have them safely unloaded at the end of the call, without risk of exposing them to other connections.
Functions run using execute_isolated_fn() share the same queue as execute_write_fn() , which guarantees that no writes can be executed at the same time as the isolated function is executing.
The return value of the function will be returned by this method. Any exceptions raised by the function will be raised out of the await line as well.",21,
308,db.close(),"Closes all of the open connections to file-backed databases. This is mainly intended to be used by large test suites, to avoid hitting limits on the number of open files.",21,
309,Database introspection,"The Database class also provides properties and methods for introspecting the database.
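A rough sketch of combining a few of these methods, from inside an async function that has access to a db object:
tables = await db.table_names()
for table in tables:
    columns = await db.table_columns(table)
    pks = await db.primary_keys(table)
    print(table, columns, pks)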
db.name - string
The name of the database - usually the filename without the .db extension.
db.size - integer
The size of the database file in bytes. 0 for :memory: databases.
db.mtime_ns - integer or None
The last modification time of the database file in nanoseconds since the epoch. None for :memory: databases.
db.is_mutable - boolean
Is this database mutable, and allowed to accept writes?
db.is_memory - boolean
Is this database an in-memory database?
await db.attached_databases() - list of named tuples
Returns a list of additional databases that have been connected to this database using the SQLite ATTACH command. Each named tuple has fields seq , name and file .
await db.table_exists(table) - boolean
Check if a table called table exists.
await db.view_exists(view) - boolean
Check if a view called view exists.
await db.table_names() - list of strings
List of names of tables in the database.
await db.view_names() - list of strings
List of names of views in the database.
await db.table_columns(table) - list of strings
Names of columns in a specific table.
await db.table_column_details(table) - list of named tuples
Full details of the columns in a specific table. Each column is represented by a Column named tuple with fields cid (integer representing the column position), name (string), type (string, e.g. REAL or VARCHAR(30) ), notnull (integer 1 or 0), default_value (string or None), is_pk (integer 1 or 0).
await db.primary_keys(table) - list of strings
Names of the columns that are part of the primary key for this table.
await db.fts_table(table) - string or None
The name of the FTS table associated with this table, if one exists.
await db.label_column_for_table(table) - string or None
The label column that is associated with this table - either automatically detected or using the ""label_column"" key from Metadata , see Specifying the label column for a table .
await db.foreign_keys_for_table(table) - list of dictionaries
Details of columns in this table which are foreign keys to other tables. A list of dictionaries where each dictionary is shaped like this: {""column"": string, ""other_table"": string, ""other_column"": string} .
await db.hidden_table_names() - list of strings
List of tables which Datasette ""hides"" by default - usually these are tables associated with SQLite's full-text search feature, the SpatiaLite extension or tables hidden using the Hiding tables feature.
await db.get_table_definition(table) - string
Returns the SQL definition for the table - the CREATE TABLE statement and any associated CREATE INDEX statements.
await db.get_view_definition(view) - string
Returns the SQL definition of the named view.
await db.get_all_foreign_keys() - dictionary
Dictionary representing both incoming and outgoing foreign keys for every table in this database. Each key is a table name that points to a dictionary with two keys, ""incoming"" and ""outgoing"" , each of which is a list of dictionaries with keys ""column"" , ""other_table"" and ""other_column"" . For example:
{
""documents"": {
""incoming"": [
{
""other_table"": ""pages"",
""column"": ""id"",
""other_column"": ""document_id""
}
],
""outgoing"": []
},
""pages"": {
""incoming"": [
{
""other_table"": ""organization_pages"",
""column"": ""id"",
""other_column"": ""page_id""
}
],
""outgoing"": [
{
""other_table"": ""documents"",
""column"": ""document_id"",
""other_column"": ""id""
}
]
},
""organization"": {
""incoming"": [
{
""other_table"": ""organization_pages"",
""column"": ""id"",
""other_column"": ""organization_id""
}
],
""outgoing"": []
},
""organization_pages"": {
""incoming"": [],
""outgoing"": [
{
""other_table"": ""pages"",
""column"": ""page_id"",
""other_column"": ""id""
},
{
""other_table"": ""organization"",
""column"": ""organization_id"",
""other_column"": ""id""
}
]
}
}",21,
310,CSRF protection,"Datasette uses asgi-csrf to guard against CSRF attacks on form POST submissions. Users receive a ds_csrftoken cookie which is compared against the csrftoken form field (or x-csrftoken HTTP header) for every incoming request.
If your plugin implements a