sections
582 rows sorted by content
Faceted by page (30 values):
- changelog 105
- internals 48
- plugin_hooks 41
- settings 21
- authentication 17
- cli-reference 13
- json_api 13
- writing_plugins 11
- contributing 9
- full_text_search 9
- introspection 9
- installation 8
- spatialite 8
- deploying 7
- publish 7
- testing_plugins 7
- plugins 6
- custom_templates 5
- facets 5
- javascript_plugins 5
- sql_queries 5
- metadata 4
- pages 4
- binary_data 3
- configuration 3
- performance 3
- csv_export 2
- getting_started 2
- ecosystem 1
- index 1
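
The per-page counts above come from faceting this table on its page column, which is equivalent to a simple GROUP BY. Here is a minimal sketch of reproducing those counts, assuming a local copy of the database in a file called sections.db with a table named sections (both names are assumptions, not taken from this page):

```python
import sqlite3

# Hypothetical local copy of the database behind this page.
conn = sqlite3.connect("sections.db")

# Reproduce the "page" facet: count of rows per page, most common first.
query = """
    select page, count(*) as n
    from sections
    group by page
    order by n desc
"""
for page, n in conn.execute(query):
    print(f"- {page} {n}")

conn.close()
```
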
id | page | ref | title | content | breadcrumbs | references |
---|---|---|---|---|---|---|
changelog:id83 | changelog | id83 | 0.29.1 (2019-07-11) | Fixed bug with static mounts using relative paths which could lead to traversal exploits ( #555 ) - thanks Abdussamet Kocak! Datasette can now be run as a module: python -m datasette ( #556 ) - thanks, Abdussamet Kocak! | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/555", "label": "#555"}, {"href": "https://github.com/simonw/datasette/issues/556", "label": "#556"}] |
changelog:id81 | changelog | id81 | 0.29.3 (2019-09-02) | Fixed implementation of CodeMirror on database page ( #560 ) Documentation typo fixes - thanks, Min ho Kim ( #561 ) Mechanism for detecting if a table has FTS enabled now works if the table name used alternative escaping mechanisms ( #570 ) - for compatibility with a recent change to sqlite-utils . | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/560", "label": "#560"}, {"href": "https://github.com/simonw/datasette/pull/561", "label": "#561"}, {"href": "https://github.com/simonw/datasette/issues/570", "label": "#570"}, {"href": "https://github.com/simonw/sqlite-utils/pull/57", "label": "a recent change to sqlite-utils"}] |
changelog:id23 | changelog | id23 | 0.59.3 (2021-11-20) | Fixed numerous bugs when running Datasette behind a proxy with a prefix URL path using the base_url setting. A live demo of this mode is now available at datasette-apache-proxy-demo.datasette.io/prefix/ . ( #1519 , #838 ) ?column__arraycontains= and ?column__arraynotcontains= table parameters now also work against SQL views. ( #448 ) ?_facet_array=column no longer returns incorrect counts if columns contain the same value more than once. | ["Changelog"] | [{"href": "https://datasette-apache-proxy-demo.datasette.io/prefix/", "label": "datasette-apache-proxy-demo.datasette.io/prefix/"}, {"href": "https://github.com/simonw/datasette/issues/1519", "label": "#1519"}, {"href": "https://github.com/simonw/datasette/issues/838", "label": "#838"}, {"href": "https://github.com/simonw/datasette/issues/448", "label": "#448"}] |
changelog:id29 | changelog | id29 | 0.57.1 (2021-06-08) | Fixed visual display glitch with global navigation menu. ( #1367 ) No longer truncates the list of table columns displayed on the /database page. ( #1364 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/1367", "label": "#1367"}, {"href": "https://github.com/simonw/datasette/issues/1364", "label": "#1364"}] |
publish:publish-fly | publish | publish-fly | Publishing to Fly | Fly is a competitively priced Docker-compatible hosting platform that supports running applications in globally distributed data centers close to your end users. You can deploy Datasette instances to Fly using the datasette-publish-fly plugin. pip install datasette-publish-fly datasette publish fly mydatabase.db --app="my-app" Consult the datasette-publish-fly README for more details. | ["Publishing data", "datasette publish"] | [{"href": "https://fly.io/", "label": "Fly"}, {"href": "https://fly.io/docs/pricing/", "label": "competitively priced"}, {"href": "https://github.com/simonw/datasette-publish-fly", "label": "datasette-publish-fly"}, {"href": "https://github.com/simonw/datasette-publish-fly/blob/main/README.md", "label": "datasette-publish-fly README"}] |
deploying:apache-proxy-configuration | deploying | apache-proxy-configuration | Apache proxy configuration | For Apache , you can use the ProxyPass directive. First make sure the following lines are uncommented: LoadModule proxy_module lib/httpd/modules/mod_proxy.so LoadModule proxy_http_module lib/httpd/modules/mod_proxy_http.so Then add these directives to proxy traffic: ProxyPass /my-datasette/ http://127.0.0.1:8009/my-datasette/ ProxyPreserveHost On A live demo of Datasette running behind Apache using this proxy setup can be seen at datasette-apache-proxy-demo.datasette.io/prefix/ . The code for that demo can be found in the demos/apache-proxy directory. Using --uds you can use Unix domain sockets similar to the nginx example: ProxyPass /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/ The ProxyPreserveHost On directive ensures that the original Host: header from the incoming request is passed through to Datasette. Datasette needs this to correctly assemble links to other pages using the .absolute_url(request, path) method. | ["Deploying Datasette", "Running Datasette behind a proxy"] | [{"href": "https://httpd.apache.org/", "label": "Apache"}, {"href": "https://datasette-apache-proxy-demo.datasette.io/prefix/", "label": "datasette-apache-proxy-demo.datasette.io/prefix/"}, {"href": "https://github.com/simonw/datasette/tree/main/demos/apache-proxy", "label": "demos/apache-proxy"}, {"href": "https://httpd.apache.org/docs/2.4/mod/mod_proxy.html#proxypreservehost", "label": "ProxyPreserveHost On"}] |
authentication:authentication-permissions-other | authentication | authentication-permissions-other | Other permissions in datasette.yaml | For all other permissions, you can use one or more "permissions" blocks in your datasette.yaml configuration file. To grant access to the permissions debug tool to all signed in users, you can grant permissions-debug to any actor with an id matching the wildcard * by adding this at the root of your configuration: [[[cog config_example(cog, """ permissions: debug-menu: id: '*' """) ]]] [[[end]]] To grant create-table to the user with id of editor for the docs database: [[[cog config_example(cog, """ databases: docs: permissions: create-table: id: editor """) ]]] [[[end]]] And for insert-row against the reports table in that docs database: [[[cog config_example(cog, """ databases: docs: tables: reports: permissions: insert-row: id: editor """) ]]] [[[end]]] The permissions debug tool can be useful for helping test permissions that you have configured in this way. | ["Authentication and permissions"] | [] |
settings:setting-force-https-urls | settings | setting-force-https-urls | force_https_urls | Forces self-referential URLs in the JSON output to always use the https:// protocol. This is useful for cases where the application itself is hosted using HTTP but is served to the outside world via a proxy that enables HTTPS. datasette mydatabase.db --setting force_https_urls 1 | ["Settings", "Settings"] | [] |
changelog:v0-28-databases-that-change | changelog | v0-28-databases-that-change | Supporting databases that change | From the beginning of the project, Datasette has been designed with read-only databases in mind. If a database is guaranteed not to change it opens up all kinds of interesting opportunities - from taking advantage of SQLite immutable mode and HTTP caching to bundling static copies of the database directly in a Docker container. The interesting ideas in Datasette explores this idea in detail. As my goals for the project have developed, I realized that read-only databases are no longer the right default. SQLite actually supports concurrent access very well provided only one thread attempts to write to a database at a time, and I keep encountering sensible use-cases for running Datasette on top of a database that is processing inserts and updates. So, as of version 0.28 Datasette no longer assumes that a database file will not change. It is now safe to point Datasette at a SQLite database which is being updated by another process. Making this change was a lot of work - see tracking tickets #418 , #419 and #420 . It required new thinking around how Datasette should calculate table counts (an expensive operation against a large, changing database) and also meant reconsidering the "content hash" URLs Datasette has used in the past to optimize the performance of HTTP caches. Datasette can still run against immutable files and gains numerous performance benefits from doing so, but this is no longer the default behaviour. Take a look at the new Performance and caching documentation section for details on how to make the most of Datasette against data that you know will be staying read-only and immutable. | ["Changelog", "0.28 (2019-05-19)"] | [{"href": "https://simonwillison.net/2018/Oct/4/datasette-ideas/", "label": "The interesting ideas in Datasette"}, {"href": "https://github.com/simonw/datasette/issues/418", "label": "#418"}, {"href": "https://github.com/simonw/datasette/issues/419", "label": "#419"}, {"href": "https://github.com/simonw/datasette/issues/420", "label": "#420"}] |
changelog:id41 | changelog | id41 | 0.52.2 (2020-12-02) | Generated columns from SQLite 3.31.0 or higher are now correctly displayed. ( #1116 ) Error message if you attempt to open a SpatiaLite database now suggests using --load-extension=spatialite if it detects that the extension is available in a common location. ( #1115 ) OPTIONS requests against the /database page no longer raise a 500 error. ( #1100 ) Databases larger than 32MB that are published to Cloud Run can now be downloaded. ( #749 ) Fix for misaligned cog icon on table and database pages. Thanks, Abdussamet Koçak. ( #1121 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/1116", "label": "#1116"}, {"href": "https://github.com/simonw/datasette/issues/1115", "label": "#1115"}, {"href": "https://github.com/simonw/datasette/issues/1100", "label": "#1100"}, {"href": "https://github.com/simonw/datasette/issues/749", "label": "#749"}, {"href": "https://github.com/simonw/datasette/issues/1121", "label": "#1121"}] |
index:contents | index | contents | Contents | Getting started Play with a live demo Follow a tutorial Datasette in your browser with Datasette Lite Try Datasette without installing anything using Glitch Using Datasette on your own computer Installation Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Using Docker A note about extensions Configuration Configuration via the command-line datasette.yaml reference Settings Plugin configuration Permissions configuration Canned queries configuration Custom CSS and JavaScript The Datasette Ecosystem sqlite-utils Dogsheep CLI reference datasette --help datasette serve datasette --get datasette serve --help-settings datasette plugins datasette install datasette uninstall datasette publish datasette publish cloudrun datasette publish heroku datasette package datasette inspect datasette create-token Pages and API endpoints Top-level index Database Hidden tables Table Row Publishing data datasette publish Publishing to Google Cloud Run Publishing to Heroku Publishing to Vercel Publishing to Fly Custom metadata and plugins datasette package Deploying Datasette Deployment fundamentals Running Datasette using systemd Running Datasette using OpenRC Deploying using buildpacks Running Datasette behind a proxy Nginx proxy configuration Apache proxy configuration JSON API Default representation Different shapes Pagination Special JSON arguments Table arguments Column filter arguments Special table arguments Expanding foreign key references Discovering the JSON for a page Enabling CORS The JSON write API Inserting rows Upserting rows Updating a row Deleting a row Creating a table Creating a table from example data Dropping tables Running SQL queries Named parameters Views Canned queries Canned query parameters Additional canned query options Writable canned queries Magic parameters JSON API for writable canned queries Pagination Cross-database queries Authentication and permissions Actors Using the "root" actor Permissions How permissions are resolved Defining permiss… | ["Datasette"] | [] |
getting_started:getting-started-glitch | getting_started | getting-started-glitch | Try Datasette without installing anything using Glitch | Glitch is a free online tool for building web apps directly from your web browser. You can use Glitch to try out Datasette without needing to install any software on your own computer. Here's a demo project on Glitch which you can use as the basis for your own experiments: glitch.com/~datasette-csvs Glitch allows you to "remix" any project to create your own copy and start editing it in your browser. You can remix the datasette-csvs project by clicking this button: Find a CSV file and drag it onto the Glitch file explorer panel - datasette-csvs will automatically convert it to a SQLite database (using sqlite-utils ) and allow you to start exploring it using Datasette. If your CSV file has a latitude and longitude column you can visualize it on a map by uncommenting the datasette-cluster-map line in the requirements.txt file using the Glitch file editor. Need some data? Try this Public Art Data for the city of Seattle - hit "Export" and select "CSV" to download it as a CSV file. For more on how this works, see Running Datasette on Glitch . | ["Getting started"] | [{"href": "https://glitch.com/", "label": "Glitch"}, {"href": "https://glitch.com/~datasette-csvs", "label": "glitch.com/~datasette-csvs"}, {"href": "https://glitch.com/edit/#!/remix/datasette-csvs", "label": null}, {"href": "https://github.com/simonw/sqlite-utils", "label": "sqlite-utils"}, {"href": "https://data.seattle.gov/Community/Public-Art-Data/j7sn-tdzk", "label": "Public Art Data"}, {"href": "https://simonwillison.net/2019/Apr/23/datasette-glitch/", "label": "Running Datasette on Glitch"}] |
publish:publish-cloud-run | publish | publish-cloud-run | Publishing to Google Cloud Run | Google Cloud Run allows you to publish data in a scale-to-zero environment, so your application will start running when the first request is received and will shut down again when traffic ceases. This means you only pay for time spent serving traffic. Cloud Run is a great option for inexpensively hosting small, low traffic projects - but costs can add up for projects that serve a lot of requests. Be particularly careful if your project has tables with large numbers of rows. Search engine crawlers that index a page for every row could result in a high bill. The datasette-block-robots plugin can be used to request search engine crawlers omit crawling your site, which can help avoid this issue. You will first need to install and configure the Google Cloud CLI tools by following these instructions . You can then publish one or more SQLite database files to Google Cloud Run using the following command: datasette publish cloudrun mydatabase.db --service=my-database A Cloud Run service is a single hosted application. The service name you specify will be used as part of the Cloud Run URL. If you deploy to a service name that you have used in the past your new deployment will replace the previous one. If you omit the --service option you will be asked to pick a service name interactively during the deploy. You may need to interact with prompts from the tool. Many of the prompts ask for values that can be set as properties for the Google Cloud SDK if you want to avoid the prompts. For example, the default region for the deployed instance can be set using the command: gcloud config set run/region us-central1 You should replace us-central1 with your desired region . Alternately, you can specify the region by setting the CLOUDSDK_RUN_REGION environment… | ["Publishing data", "datasette publish"] | [{"href": "https://cloud.google.com/run/", "label": "Google Cloud Run"}, {"href": "https://datasette.io/plugins/datasette-block-robots", "label": "datasette-block-robots"}, {"href": "https://cloud.google.com/sdk/", "label": "these instructions"}, {"href": "https://cloud.google.com/sdk/docs/properties", "label": "set as properties for the Google Cloud SDK"}, {"href": "https://cloud.google.com/about/locations", "label": "region"}, {"href": "https://cloud.google.com/run/docs/mapping-custom-domains", "label": "mapping custom domains"}] |
changelog:v0-28-publish-cloudrun | changelog | v0-28-publish-cloudrun | datasette publish cloudrun | Google Cloud Run is a brand new serverless hosting platform from Google, which allows you to build a Docker container which will run only when HTTP traffic is received and will shut down (and hence cost you nothing) the rest of the time. It's similar to Zeit's Now v1 Docker hosting platform which sadly is no longer accepting signups from new users. The new datasette publish cloudrun command was contributed by Romain Primet ( #434 ) and publishes selected databases to a new Datasette instance running on Google Cloud Run. See Publishing to Google Cloud Run for full documentation. | ["Changelog", "0.28 (2019-05-19)"] | [{"href": "https://cloud.google.com/run/", "label": "Google Cloud Run"}, {"href": "https://hyperion.alpha.spectrum.chat/zeit/now/cannot-create-now-v1-deployments~d206a0d4-5835-4af5-bb5c-a17f0171fb25?m=MTU0Njk2NzgwODM3OA==", "label": "no longer accepting signups"}, {"href": "https://github.com/simonw/datasette/pull/434", "label": "#434"}] |
changelog:miscellaneous | changelog | miscellaneous | Miscellaneous | Got JSON data in one of your columns? Use the new ?_json=COLNAME argument to tell Datasette to return that JSON value directly rather than encoding it as a string. If you just want an array of the first value of each row, use the new ?_shape=arrayfirst option - example . | ["Changelog", "0.23 (2018-06-18)"] | [{"href": "https://latest.datasette.io/fixtures.json?sql=select+neighborhood+from+facetable+order+by+pk+limit+101&_shape=arrayfirst", "label": "example"}] |
deploying:nginx-proxy-configuration | deploying | nginx-proxy-configuration | Nginx proxy configuration | Here is an example of an nginx configuration file that will proxy traffic to Datasette: daemon off; events { worker_connections 1024; } http { server { listen 80; location /my-datasette { proxy_pass http://127.0.0.1:8009/my-datasette; proxy_set_header Host $host; } } } You can also use the --uds option to Datasette to listen on a Unix domain socket instead of a port, configuring the nginx upstream proxy like this: daemon off; events { worker_connections 1024; } http { server { listen 80; location /my-datasette { proxy_pass http://datasette/my-datasette; proxy_set_header Host $host; } } upstream datasette { server unix:/tmp/datasette.sock; } } Then run Datasette with datasette --uds /tmp/datasette.sock path/to/database.db --setting base_url /my-datasette/ . | ["Deploying Datasette", "Running Datasette behind a proxy"] | [{"href": "https://nginx.org/", "label": "nginx"}] |
spatialite:spatial-indexing-latitude-longitude-columns | spatialite | spatial-indexing-latitude-longitude-columns | Spatial indexing latitude/longitude columns | Here's a recipe for taking a table with existing latitude and longitude columns, adding a SpatiaLite POINT geometry column to that table, populating the new column and then populating a spatial index: import sqlite3 conn = sqlite3.connect("museums.db") # Load the SpatiaLite extension: conn.enable_load_extension(True) conn.load_extension("/usr/local/lib/mod_spatialite.dylib") # Initialize spatial metadata for this database: conn.execute("select InitSpatialMetadata(1)") # Add a geometry column called point_geom to our museums table: conn.execute( "SELECT AddGeometryColumn('museums', 'point_geom', 4326, 'POINT', 2);" ) # Now update that geometry column with the lat/lon points conn.execute( """ UPDATE museums SET point_geom = GeomFromText('POINT('||"longitude"||' '||"latitude"||')',4326); """ ) # Now add a spatial index to that column conn.execute( 'select CreateSpatialIndex("museums", "point_geom");' ) # If you don't commit, your changes will not be persisted: conn.commit() conn.close() | ["SpatiaLite"] | [] |
authentication:authentication-permissions-instance | authentication | authentication-permissions-instance | Access to an instance | Here's how to restrict access to your entire Datasette instance to just the "id": "root" user: [[[cog from metadata_doc import config_example config_example(cog, """ title: My private Datasette instance allow: id: root """) ]]] [[[end]]] To deny access to all users, you can use "allow": false : [[[cog config_example(cog, """ title: My entirely inaccessible instance allow: false """) ]]] [[[end]]] One reason to do this is if you are using a Datasette plugin - such as datasette-permissions-sql - to control permissions instead. | ["Authentication and permissions", "Access permissions in "] | [{"href": "https://github.com/simonw/datasette-permissions-sql", "label": "datasette-permissions-sql"}] |
changelog:the-road-to-datasette-1-0 | changelog | the-road-to-datasette-1-0 | The road to Datasette 1.0 | I've assembled a milestone for Datasette 1.0 . The focus of the 1.0 release will be the following: Signify confidence in the quality/stability of Datasette Give plugin authors confidence that their plugins will work for the whole 1.x release cycle Provide the same confidence to developers building against Datasette JSON APIs If you have thoughts about what you would like to see for Datasette 1.0 you can join the conversation on issue #519 . | ["Changelog", "0.44 (2020-06-11)"] | [{"href": "https://github.com/simonw/datasette/milestone/7", "label": "milestone for Datasette 1.0"}, {"href": "https://github.com/simonw/datasette/issues/519", "label": "the conversation on issue #519"}] |
facets:id3 | facets | id3 | Facet by date | If Datasette finds any columns that contain dates in the first 100 values, it will offer a faceting interface against the dates of those values. This works especially well against timestamp values such as 2019-03-01 12:44:00 . Example here: latest.datasette.io/fixtures/facetable?_facet_date=created | ["Facets"] | [{"href": "https://latest.datasette.io/fixtures/facetable?_facet_date=created", "label": "latest.datasette.io/fixtures/facetable?_facet_date=created"}] |
changelog:facet-by-date | changelog | facet-by-date | Facet by date | If a column contains datetime values, Datasette can now facet that column by date. ( #481 ) | ["Changelog", "0.29 (2019-07-07)"] | [{"href": "https://github.com/simonw/datasette/issues/481", "label": "#481"}] |
plugins:plugins-installing | plugins | plugins-installing | Installing plugins | If a plugin has been packaged for distribution using setuptools you can use the plugin by installing it alongside Datasette in the same virtual environment or Docker container. You can install plugins using the datasette install command: datasette install datasette-vega You can uninstall plugins with datasette uninstall : datasette uninstall datasette-vega You can upgrade plugins with datasette install --upgrade or datasette install -U : datasette install -U datasette-vega This command can also be used to upgrade Datasette itself to the latest released version: datasette install -U datasette You can install multiple plugins at once by listing them as lines in a requirements.txt file like this: datasette-vega datasette-cluster-map Then pass that file to datasette install -r : datasette install -r requirements.txt The install and uninstall commands are thin wrappers around pip install and pip uninstall , which ensure that they run pip in the same virtual environment as Datasette itself. | ["Plugins"] | [] |
full_text_search:full-text-search-table-or-view | full_text_search | full-text-search-table-or-view | Configuring full-text search for a table or view | If a table has a corresponding FTS table set up using the content= argument to CREATE VIRTUAL TABLE shown below, Datasette will detect it automatically and add a search interface to the table page for that table. You can also manually configure which table should be used for full-text search using query string parameters or Metadata . You can set the associated FTS table for a specific table and you can also set one for a view - if you do that, the page for that SQL view will offer a search option. Use ?_fts_table=x to over-ride the FTS table for a specific page. If the primary key was something other than rowid you can use ?_fts_pk=col to set that as well. This is particularly useful for views, for example: https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk The fts_table metadata property can be used to specify an associated FTS table. If the primary key column in your table which was used to populate the FTS table is something other than rowid , you can specify the column to use with the fts_pk property. The "searchmode": "raw" property can be used to default the table to accepting SQLite advanced search operators, as described in Advanced SQLite search queries . Here is an example which enables full-text search (with SQLite advanced search operators) for a display_ads view which is defined against the ads table and hence needs to run FTS against the ads_fts table, using the id as the primary key: [[[cog from metadata_doc import metadata_example metadata_example(cog, { "databases": { "russian-ads": { "tables": { "display_ads": { "fts_table": "ads_fts", "fts_pk": "id", "searchmode": "raw" } } } } }) ]]] [[[end]]] | ["Full-text search"] | [{"href": "https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk", "label": "https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk"}] |
changelog:new-features | changelog | new-features | New features | If an error occurs while executing a user-provided SQL query, that query is now re-displayed in an editable form along with the error message. ( #619 ) New ?_col= and ?_nocol= parameters to show and hide columns in a table, plus an interface for hiding and showing columns in the column cog menu. ( #615 ) A new ?_facet_size= parameter for customizing the number of facet results returned on a table or view page. ( #1332 ) ?_facet_size=max sets that to the maximum, which defaults to 1,000 and is controlled by the max_returned_rows setting. If facet results are truncated the … at the bottom of the facet list now links to this parameter. ( #1337 ) ?_nofacet=1 option to disable all facet calculations on a page, used as a performance optimization for CSV exports and ?_shape=array/object . ( #1349 , #263 ) ?_nocount=1 option to disable full query result counts. ( #1353 ) ?_trace=1 debugging option is now controlled by the new trace_debug setting, which is turned off by default. ( #1359 ) | ["Changelog", "0.57 (2021-06-05)"] | [{"href": "https://github.com/simonw/datasette/issues/619", "label": "#619"}, {"href": "https://github.com/simonw/datasette/issues/615", "label": "#615"}, {"href": "https://github.com/simonw/datasette/issues/1332", "label": "#1332"}, {"href": "https://github.com/simonw/datasette/issues/1337", "label": "#1337"}, {"href": "https://github.com/simonw/datasette/issues/1349", "label": "#1349"}, {"href": "https://github.com/simonw/datasette/issues/263", "label": "#263"}, {"href": "https://github.com/simonw/datasette/issues/1353", "label": "#1353"}, {"href": "https://github.com/simonw/datasette/issues/1359", "label": "#1359"}] |
testing_plugins:testing-plugins-pdb | testing_plugins | testing-plugins-pdb | Using pdb for errors thrown inside Datasette | If an exception occurs within Datasette itself during a test, the response returned to your plugin will have a response.status_code value of 500. You can add pdb=True to the Datasette constructor to drop into a Python debugger session inside your test run instead of getting back a 500 response code. This is equivalent to running the datasette command-line tool with the --pdb option. Here's what that looks like in a test function: async def test_that_opens_the_debugger_or_errors(): ds = Datasette([db_path], pdb=True) response = await ds.client.get("/") If you use this pattern you will need to run pytest with the -s option to avoid capturing stdin/stdout in order to interact with the debugger prompt. | ["Testing plugins"] | [] |
contributing:contributing-bug-fix-branch | contributing | contributing-bug-fix-branch | Releasing bug fixes from a branch | If it's necessary to publish a bug fix release without shipping new features that have landed on main a release branch can be used. Create it from the relevant last tagged release like so: git branch 0.52.x 0.52.4 git checkout 0.52.x Next cherry-pick the commits containing the bug fixes: git cherry-pick COMMIT Write the release notes in the branch, and update the version number in version.py . Then push the branch: git push -u origin 0.52.x Once the tests have completed, publish the release from that branch target using the GitHub Draft a new release form. Finally, cherry-pick the commit with the release notes and version number bump across to main : git checkout main git cherry-pick COMMIT git push | ["Contributing"] | [{"href": "https://github.com/simonw/datasette/releases/new", "label": "Draft a new release"}] |
internals:database-hash | internals | database-hash | db.hash | If the database was opened in immutable mode, this property returns the 64 character SHA-256 hash of the database contents as a string. Otherwise it returns None . | ["Internals for plugins", "Database class"] | [] |
settings:setting-base-url | settings | setting-base-url | base_url | If you are running Datasette behind a proxy, it may be useful to change the root path used for the Datasette instance. For example, if you are sending traffic from https://www.example.com/tools/datasette/ through to a proxied Datasette instance you may wish Datasette to use /tools/datasette/ as its root URL. You can do that like so: datasette mydatabase.db --setting base_url /tools/datasette/ | ["Settings", "Settings"] | [] |
performance:performance-immutable-mode | performance | performance-immutable-mode | Immutable mode | If you can be certain that a SQLite database file will not be changed by another process you can tell Datasette to open that file in immutable mode . Doing so will disable all locking and change detection, which can result in improved query performance. This also enables further optimizations relating to HTTP caching, described below. To open a file in immutable mode pass it to the datasette command using the -i option: datasette -i data.db When you open a file in immutable mode like this Datasette will also calculate and cache the row counts for each table in that database when it first starts up, further improving performance. | ["Performance and caching"] | [] |
contributing:devenvironment | contributing | devenvironment | Setting up a development environment | If you have Python 3.8 or higher installed on your computer (on OS X the quickest way to do this is using homebrew ) you can install an editable copy of Datasette using the following steps. If you want to use GitHub to publish your changes, first create a fork of datasette under your own GitHub account. Now clone that repository somewhere on your computer: git clone git@github.com:YOURNAME/datasette If you want to get started without creating your own fork, you can do this instead: git clone git@github.com:simonw/datasette The next step is to create a virtual environment for your project and use it to install Datasette's dependencies: cd datasette # Create a virtual environment in ./venv python3 -m venv ./venv # Now activate the virtual environment, so pip can install into it source venv/bin/activate # Install Datasette and its testing dependencies python3 -m pip install -e '.[test]' That last line does most of the work: pip install -e means "install this package in a way that allows me to edit the source code in place". The .[test] option means "use the setup.py in this directory and install the optional testing dependencies as well". | ["Contributing"] | [{"href": "https://docs.python-guide.org/starting/install3/osx/", "label": "is using homebrew"}, {"href": "https://github.com/simonw/datasette/fork", "label": "create a fork of datasette"}] |
installation:installation-homebrew | installation | installation-homebrew | Using Homebrew | If you have a Mac and use Homebrew , you can install Datasette by running this command in your terminal: brew install datasette This should install the latest version. You can confirm by running: datasette --version You can upgrade to the latest Homebrew packaged version using: brew upgrade datasette Once you have installed Datasette you can install plugins using the following: datasette install datasette-vega If the latest packaged release of Datasette has not yet been made available through Homebrew, you can upgrade your Homebrew installation in-place using: datasette install -U datasette | ["Installation", "Basic installation"] | [{"href": "https://brew.sh/", "label": "Homebrew"}] |
publish:cli-package | publish | cli-package | datasette package | If you have docker installed (e.g. using Docker for Mac ) you can use the datasette package command to create a new Docker image in your local repository containing the datasette app bundled together with one or more SQLite databases: datasette package mydatabase.db Here's example output for the package command: datasette package parlgov.db --extra-options="--setting sql_time_limit_ms 2500" Sending build context to Docker daemon 4.459MB Step 1/7 : FROM python:3.11.0-slim-bullseye ---> 79e1dc9af1c1 Step 2/7 : COPY . /app ---> Using cache ---> cd4ec67de656 Step 3/7 : WORKDIR /app ---> Using cache ---> 139699e91621 Step 4/7 : RUN pip install datasette ---> Using cache ---> 340efa82bfd7 Step 5/7 : RUN datasette inspect parlgov.db --inspect-file inspect-data.json ---> Using cache ---> 5fddbe990314 Step 6/7 : EXPOSE 8001 ---> Using cache ---> 8e83844b0fed Step 7/7 : CMD datasette serve parlgov.db --port 8001 --inspect-file inspect-data.json --setting sql_time_limit_ms 2500 ---> Using cache ---> 1bd380ea8af3 Successfully built 1bd380ea8af3 You can now run the resulting container like so: docker run -p 8081:8001 1bd380ea8af3 This exposes port 8001 inside the container as port 8081 on your host machine, so you can access the application at http://localhost:8081/ You can customize the port that is exposed by the container using the --port option: datasette package mydatabase.db --port 8080 A full list of options can be seen by running datasette package --help : See datasette package for the full list of options for this command. | ["Publishing data"] | [{"href": "https://www.docker.com/docker-mac", "label": "Docker for Mac"}] |
installation:id1 | installation | id1 | Installation | If you just want to try Datasette out you don't need to install anything: see Try Datasette without installing anything using Glitch There are two main options for installing Datasette. You can install it directly on to your machine, or you can install it using Docker. If you want to start making contributions to the Datasette project by installing a copy that lets you directly modify the code, take a look at our guide to Setting up a development environment . Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Installing plugins using pipx Upgrading packages using pipx Using Docker Loading SpatiaLite Installing plugins A note about extensions | [] | [] |
performance:performance-hashed-urls | performance | performance-hashed-urls | datasette-hashed-urls | If you open a database file in immutable mode using the -i option, you can be assured that the content of that database will not change for the lifetime of the Datasette server. The datasette-hashed-urls plugin implements an optimization where your database is served with part of the SHA-256 hash of the database contents baked into the URL. A database at /fixtures will instead be served at /fixtures-aa7318b , and a year-long cache expiry header will be returned with those pages. This will then be cached by both browsers and caching proxies such as Cloudflare or Fastly, providing a potentially significant performance boost. To install the plugin, run the following: datasette install datasette-hashed-urls Prior to Datasette 0.61 hashed URL mode was a core Datasette feature, enabled using the hash_urls setting. This implementation has now been removed in favor of the datasette-hashed-urls plugin. Prior to Datasette 0.28 hashed URL mode was the default behaviour for Datasette, since all database files were assumed to be immutable and unchanging. From 0.28 onwards the default has been to treat database files as mutable unless explicitly configured otherwise. | ["Performance and caching"] | [{"href": "https://datasette.io/plugins/datasette-hashed-urls", "label": "datasette-hashed-urls plugin"}] |
json_api:json-api-cors | json_api | json-api-cors | Enabling CORS | If you start Datasette with the --cors option, each JSON endpoint will be served with the following additional HTTP headers: [[[cog from datasette.utils import add_cors_headers import textwrap headers = {} add_cors_headers(headers) output = "\n".join("{}: {}".format(k, v) for k, v in headers.items()) cog.out("\n::\n\n") cog.out(textwrap.indent(output, ' ')) cog.out("\n\n") ]]] Access-Control-Allow-Origin: * Access-Control-Allow-Headers: Authorization, Content-Type Access-Control-Expose-Headers: Link Access-Control-Allow-Methods: GET, POST, HEAD, OPTIONS Access-Control-Max-Age: 3600 [[[end]]] This allows JavaScript running on any domain to make cross-origin requests to interact with the Datasette API. If you start Datasette without the --cors option only JavaScript running on the same domain as Datasette will be able to access the API. Here's how to serve data.db with CORS enabled: datasette data.db --cors | ["JSON API"] | [] |
sql_queries:sql-views | sql_queries | sql-views | Views | If you want to bundle some pre-written SQL queries with your Datasette-hosted database you can do so in two ways. The first is to include SQL views in your database - Datasette will then list those views on your database index page. The quickest way to create views is with the SQLite command-line interface: sqlite3 sf-trees.db SQLite version 3.19.3 2017-06-27 16:48:08 Enter ".help" for usage hints. sqlite> CREATE VIEW demo_view AS select qSpecies from Street_Tree_List; <CTRL+D> You can also use the sqlite-utils tool to create a view : sqlite-utils create-view sf-trees.db demo_view "select qSpecies from Street_Tree_List" | ["Running SQL queries"] | [{"href": "https://sqlite-utils.datasette.io/", "label": "sqlite-utils"}, {"href": "https://sqlite-utils.datasette.io/en/stable/cli.html#creating-views", "label": "create a view"}] |
changelog:v0-29-medium-changes | changelog | v0-29-medium-changes | Easier custom templates for table rows | If you want to customize the display of individual table rows, you can do so using a _table.html template include that looks something like this: {% for row in display_rows %} <div> <h2>{{ row["title"] }}</h2> <p>{{ row["description"] }}</p> <p>Category: {{ row.display("category_id") }}</p> </div> {% endfor %} This is a backwards incompatible change . If you previously had a custom template called _rows_and_columns.html you need to rename it to _table.html . See Custom templates for full details. | ["Changelog", "0.29 (2019-07-07)"] | [] |
installation:installing-plugins | installation | installing-plugins | Installing plugins | If you want to install plugins into your local Datasette Docker image you can do so using the following recipe. This will install the plugins and then save a brand new local image called datasette-with-plugins : docker run datasetteproject/datasette \ pip install datasette-vega docker commit $(docker ps -lq) datasette-with-plugins You can now run the new custom image like so: docker run -p 8001:8001 -v `pwd`:/mnt \ datasette-with-plugins \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db You can confirm that the plugins are installed by visiting http://127.0.0.1:8001/-/plugins Some plugins such as datasette-ripgrep may need additional system packages. You can install these by running apt-get install inside the container: docker run datasette-057a0 bash -c ' apt-get update && apt-get install ripgrep && pip install datasette-ripgrep' docker commit $(docker ps -lq) datasette-with-ripgrep | ["Installation", "Advanced installation options", "Using Docker"] | [{"href": "http://127.0.0.1:8001/-/plugins", "label": "http://127.0.0.1:8001/-/plugins"}, {"href": "https://datasette.io/plugins/datasette-ripgrep", "label": "datasette-ripgrep"}] |
facets:id2 | facets | id2 | Facet by JSON array | If your SQLite installation provides the json1 extension (you can check using /-/versions ) Datasette will automatically detect columns that contain JSON arrays of values and offer a faceting interface against those columns. This is useful for modelling things like tags without needing to break them out into a new table. Example here: latest.datasette.io/fixtures/facetable?_facet_array=tags | ["Facets"] | [{"href": "https://latest.datasette.io/fixtures/facetable?_facet_array=tags", "label": "latest.datasette.io/fixtures/facetable?_facet_array=tags"}] |
internals:internals-tracer-trace-child-tasks | internals | internals-tracer-trace-child-tasks | Tracing child tasks | If your code uses a mechanism such as asyncio.gather() to execute code in additional tasks you may find that some of the traces are missing from the display. You can use the trace_child_tasks() context manager to ensure these child tasks are correctly handled. import asyncio from datasette import tracer with tracer.trace_child_tasks(): results = await asyncio.gather( # ... async tasks here ) This example uses the register_routes() plugin hook to add a page at /parallel-queries which executes two SQL queries in parallel using asyncio.gather() and returns their results. import asyncio from datasette import hookimpl from datasette import tracer from datasette.utils.asgi import Response @hookimpl def register_routes(): async def parallel_queries(datasette): db = datasette.get_database() with tracer.trace_child_tasks(): one, two = await asyncio.gather( db.execute("select 1"), db.execute("select 2"), ) return Response.json( { "one": one.single_value(), "two": two.single_value(), } ) return [ (r"/parallel-queries$", parallel_queries), ] Note that running parallel SQL queries in this way has been known to cause problems in the past , so treat this example with caution. Adding ?_trace=1 will show that the trace covers both of those child tasks. | ["Internals for plugins", "datasette.tracer"] | [{"href": "https://github.com/simonw/datasette/issues/2189", "label": "been known to cause problems in the past"}] |
full_text_search:configuring-fts-using-csvs-to-sqlite | full_text_search | configuring-fts-using-csvs-to-sqlite | Configuring FTS using csvs-to-sqlite | If your data starts out in CSV files, you can use Datasette's companion tool csvs-to-sqlite to convert that file into a SQLite database and enable full-text search on specific columns. For a file called items.csv where you want full-text search to operate against the name and description columns you would run the following: csvs-to-sqlite items.csv items.db -f name -f description | ["Full-text search", "Enabling full-text search for a SQLite table"] | [{"href": "https://github.com/simonw/csvs-to-sqlite", "label": "csvs-to-sqlite"}] |
performance:http-caching | performance | http-caching | HTTP caching | If your database is immutable and guaranteed not to change, you can gain major performance improvements from Datasette by enabling HTTP caching. This can work at two different levels. First, it can tell browsers to cache the results of queries and serve future requests from the browser cache. More significantly, it allows you to run Datasette behind a caching proxy such as Varnish or use a cache provided by a hosted service such as Fastly or Cloudflare . This can provide incredible speed-ups since a query only needs to be executed by Datasette the first time it is accessed - all subsequent hits can then be served by the cache. Using a caching proxy in this way could enable a Datasette-backed visualization to serve thousands of hits a second while running Datasette itself on extremely inexpensive hosting. Datasette's integration with HTTP caches can be enabled using a combination of configuration options and query string arguments. The default_cache_ttl setting sets the default HTTP cache TTL for all Datasette pages. This is 5 seconds unless you change it - you can set it to 0 if you wish to disable HTTP caching entirely. You can also change the cache timeout on a per-request basis using the ?_ttl=10 query string parameter. This can be useful when you are working with the Datasette JSON API - you may decide that a specific query can be cached for a longer time, or maybe you need to set ?_ttl=0 for some requests for example if you are running a SQL order by random() query. | ["Performance and caching"] | [{"href": "https://varnish-cache.org/", "label": "Varnish"}, {"href": "https://www.fastly.com/", "label": "Fastly"}, {"href": "https://www.cloudflare.com/", "label": "Cloudflare"}] |
writing_plugins:writing-plugins-static-assets | writing_plugins | writing-plugins-static-assets | Static assets | If your plugin has a static/ directory, Datasette will automatically configure itself to serve those static assets from the following path: /-/static-plugins/NAME_OF_PLUGIN_PACKAGE/yourfile.js Use the datasette.urls.static_plugins(plugin_name, path) method to generate URLs to that asset that take the base_url setting into account, see datasette.urls . To bundle the static assets for a plugin in the package that you publish to PyPI, add the following to the plugin's setup.py : package_data = ( { "datasette_plugin_name": [ "static/plugin.js", ], }, ) Where datasette_plugin_name is the name of the plugin package (note that it uses underscores, not hyphens) and static/plugin.js is the path within that package to the static file. datasette-cluster-map is a useful example of a plugin that includes packaged static assets in this way. | ["Writing plugins"] | [{"href": "https://github.com/simonw/datasette-cluster-map", "label": "datasette-cluster-map"}] |
writing_plugins:writing-plugins-custom-templates | writing_plugins | writing-plugins-custom-templates | Custom templates | If your plugin has a templates/ directory, Datasette will attempt to load templates from that directory before it uses its own default templates. The priority order for template loading is: templates from the --template-dir argument, if specified templates from the templates/ directory in any installed plugins default templates that ship with Datasette See Custom pages and templates for more details on how to write custom templates, including which filenames to use to customize which parts of the Datasette UI. Templates should be bundled for distribution using the same package_data mechanism in setup.py described for static assets above, for example: package_data = ( { "datasette_plugin_name": [ "templates/my_template.html", ], }, ) You can also use wildcards here such as templates/*.html . See datasette-edit-schema for an example of this pattern. | ["Writing plugins"] | [{"href": "https://github.com/simonw/datasette-edit-schema", "label": "datasette-edit-schema"}] |
testing_plugins:testing-plugins-pytest-httpx | testing_plugins | testing-plugins-pytest-httpx | Testing outbound HTTP calls with pytest-httpx | If your plugin makes outbound HTTP calls - for example datasette-auth-github or datasette-import-table - you may need to mock those HTTP requests in your tests. The pytest-httpx package is a useful library for mocking calls. It can be tricky to use with Datasette though since it mocks all HTTPX requests, and Datasette's own testing mechanism uses HTTPX internally. To avoid breaking your tests, you can return ["localhost"] from the non_mocked_hosts() fixture. As an example, here's a very simple plugin which executes an HTTP response and returns the resulting content: from datasette import hookimpl from datasette.utils.asgi import Response import httpx @hookimpl def register_routes(): return [ (r"^/-/fetch-url$", fetch_url), ] async def fetch_url(datasette, request): if request.method == "GET": return Response.html( """ <form action="/-/fetch-url" method="post"> <input type="hidden" name="csrftoken" value="{}"> <input name="url"><input type="submit"> </form>""".format( request.scope["csrftoken"]() ) ) vars = await request.post_vars() url = vars["url"] return Response.text(httpx.get(url).text) Here's a test for that plugin that mocks the HTTPX outbound request: from datasette.app import Datasette import pytest @pytest.fixture def non_mocked_hosts(): # This ensures httpx-mock will not affect Datasette's own # httpx calls made in the tests by datasette.client: return ["localhost"] async def test_outbound_http_call(httpx_mock): httpx_mock.add_response( url="https://www.example.com/", text="Hello world", ) datasette = Datasette([], memory=True) response = await datasette.client.post( "/-/fetch-url", data={"url": "https://www.example.com/"}, ) assert response.text == "Hello world" outbound_request = httpx_mock.get_request()… | ["Testing plugins"] | [{"href": "https://pypi.org/project/pytest-httpx/", "label": "pytest-httpx"}] |
plugin_hooks:plugin-register-permissions | plugin_hooks | plugin-register-permissions | register_permissions(datasette) | If your plugin needs to register additional permissions unique to that plugin - upload-csvs for example - you can return a list of those permissions from this hook. from datasette import hookimpl, Permission @hookimpl def register_permissions(datasette): return [ Permission( name="upload-csvs", abbr=None, description="Upload CSV files", takes_database=True, takes_resource=False, default=False, ) ] The fields of the Permission class are as follows: name - string The name of the permission, e.g. upload-csvs . This should be unique across all plugins that the user might have installed, so choose carefully. abbr - string or None An abbreviation of the permission, e.g. uc . This is optional - you can set it to None if you do not want to pick an abbreviation. Since this needs to be unique across all installed plugins it's best not to specify an abbreviation at all. If an abbreviation is provided it will be used when creating restricted signed API tokens. description - string or None A human-readable description of what the permission lets you do. Should make sense as the second part of a sentence that starts "A user with this permission can ...". takes_database - boolean True if this permission can be granted on a per-database basis, False if it is only valid at the overall Datasette instance level. takes_resource - boolean … | ["Plugin hooks"] | [] |
changelog:id44 | changelog | id44 | 0.51.1 (2020-10-31) | Improvements to the new Binary data documentation page. | ["Changelog"] | [] |
internals:internals-response-asgi-send | internals | internals-response-asgi-send | Returning a response with .asgi_send(send) | In most cases you will return Response objects from your own view functions. You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook. Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example: async def require_authorization(scope, receive, send): response = Response.text( "401 Authorization Required", headers={ "www-authenticate": 'Basic realm="Datasette", charset="UTF-8"' }, status=401, ) await response.asgi_send(send) | ["Internals for plugins", "Response class"] | [] |
changelog:id18 | changelog | id18 | 0.61 (2022-03-23) | In preparation for Datasette 1.0, this release includes two potentially backwards-incompatible changes. Hashed URL mode has been moved to a separate plugin, and the way Datasette generates URLs to databases and tables with special characters in their name such as / and . has changed. Datasette also now requires Python 3.7 or higher. URLs within Datasette now use a different encoding scheme for tables or databases that include "special" characters outside of the range of a-zA-Z0-9_- . This scheme is explained here: Tilde encoding . ( #1657 ) Removed hashed URL mode from Datasette. The new datasette-hashed-urls plugin can be used to achieve the same result, see datasette-hashed-urls for details. ( #1661 ) Databases can now have a custom path within the Datasette instance that is independent of the database name, using the db.route property. ( #1668 ) Datasette is now covered by a Code of Conduct . ( #1654 ) Python 3.6 is no longer supported. ( #1577 ) Tests now run against Python 3.11-dev. ( #1621 ) New datasette.ensure_permissions(actor, permissions) internal method for checking multiple permissions at once. ( #1675 ) New datasette.check_visibility(actor, action, resource=None) internal method for checking if a user can see a resource that would otherwise be invisible to unauthenticated users. ( #1678 ) Table and row HTML pages now include a <link rel="alternate" type="application/json+datasette" href="..."> element and return a Link: URL; rel="alternate"; type="applicatio… | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/1657", "label": "#1657"}, {"href": "https://github.com/simonw/datasette/issues/1661", "label": "#1661"}, {"href": "https://github.com/simonw/datasette/issues/1668", "label": "#1668"}, {"href": "https://github.com/simonw/datasette/blob/main/CODE_OF_CONDUCT.md", "label": "Code of Conduct"}, {"href": "https://github.com/simonw/datasette/issues/1654", "label": "#1654"}, {"href": "https://github.com/simonw/datasette/issues/1577", "label": "#1577"}, {"href": "https://github.com/simonw/datasette/issues/1621", "label": "#1621"}, {"href": "https://github.com/simonw/datasette/issues/1675", "label": "#1675"}, {"href": "https://github.com/simonw/datasette/issues/1678", "label": "#1678"}, {"href": "https://github.com/simonw/datasette/issues/1533", "label": "#1533"}, {"href": "https://github.com/simonw/datasette/issues/1612", "label": "#1612"}, {"href": "https://github.com/simonw/datasette/issues/1603", "label": "#1603"}, {"href": "https://github.com/simonw/datasette/issues/1587", "label": "#1587"}, {"href": "https://github.com/simonw/datasette/issues/1601", "label": "#1601"}, {"href": "https://github.com/simonw/datasette/issues/1576", "label": "#1576"}, {"href": "https://github.com/simonw/datasette/issues/957", "label": "#957"}, {"href": "https://github.com/simonw/datasette/issues/1607", "label": "#1607"}, {"href": "https://datasette.io/tutorials", "label": "Datasette Tutorials"}, {"href": "https://github.com/simonw/datasette/pull/1649", "label": "#1649"}, {"href": "https://github.com/simonw/datasette/issues/1545", "label": "#1545"}, {"href": "https://github.com/simonw/datasette/issues/1228", "label": "#1228"}] |
settings:setting-truncate-cells-html | settings | setting-truncate-cells-html | truncate_cells_html | In the HTML table view, truncate any strings that are longer than this value. The full value will still be available in CSV, JSON and on the individual row HTML page. Set this to 0 to disable truncation. datasette mydatabase.db --setting truncate_cells_html 0 | ["Settings", "Settings"] | [] |
cli-reference:cli-help-install-help | cli-reference | cli-help-install-help | datasette install | Install new Datasette plugins. This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette. This command: datasette install datasette-cluster-map Would install the datasette-cluster-map plugin. [[[cog help(["install", "--help"]) ]]] Usage: datasette install [OPTIONS] [PACKAGES]... Install plugins and packages from PyPI into the same environment as Datasette Options: -U, --upgrade Upgrade packages to latest version -r, --requirement PATH Install from requirements file -e, --editable TEXT Install a project in editable mode from this path --help Show this message and exit. [[[end]]] | ["CLI reference"] | [{"href": "https://datasette.io/plugins/datasette-cluster-map", "label": "datasette-cluster-map"}] |
internals:internals-database | internals | internals-database | Database class | Instances of the Database class can be used to execute queries against attached SQLite databases, and to run introspection against their schemas. | ["Internals for plugins"] | [] |
json_api:tablecreateview-example | json_api | tablecreateview-example | Creating a table from example data | Instead of specifying columns directly you can instead pass a single example row or a list of rows . Datasette will create a table with a schema that matches those rows and insert them for you: POST /<database>/-/create Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "table": "creatures", "rows": [ { "id": 1, "name": "Tarantula" }, { "id": 2, "name": "Kākāpō" } ], "pk": "id" } Doing this requires both the create-table and insert-row permissions. The 201 response here will be similar to the columns form, but will also include the number of rows that were inserted as row_count : { "ok": true, "database": "data", "table": "creatures", "table_url": "http://127.0.0.1:8001/data/creatures", "table_api_url": "http://127.0.0.1:8001/data/creatures.json", "schema": "CREATE TABLE [creatures] (\n [id] INTEGER PRIMARY KEY,\n [name] TEXT\n)", "row_count": 2 } You can call the create endpoint multiple times for the same table provided you are specifying the table using the rows or row option. New rows will be inserted into the table each time. This means you can use this API if you are unsure if the relevant table has been created yet. If you pass a row to the create endpoint with a primary key that already exists you will get an error that looks like this: { "ok": false, "errors": [ "UNIQUE constraint failed: creatures.id" ] } You can avoid this error by passing the same "ignore": true or "replace": true options to the create endpoint as you can to the insert endpoint . To use the "replace": true option you will also need the update-row permission. Pass "alter": true to automatically add any missing columns to t… | ["JSON API", "The JSON write API"] | [] |
changelog:javascript-modules | changelog | javascript-modules | JavaScript modules | JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords. To use modules, JavaScript needs to be included in <script> tags with a type="module" attribute. Datasette now has the ability to output <script type="module"> in places where you may wish to take advantage of modules. The extra_js_urls option described in Custom CSS and JavaScript can now be used with modules, and module support is also available for the extra_body_script() plugin hook. ( #1186 , #1187 ) datasette-leaflet-freedraw is the first example of a Datasette plugin that takes advantage of the new support for JavaScript modules. See Drawing shapes on a map to query a SpatiaLite database for more on this plugin. | ["Changelog", "0.54 (2021-01-25)"] | [{"href": "https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules", "label": "JavaScript modules"}, {"href": "https://github.com/simonw/datasette/issues/1186", "label": "#1186"}, {"href": "https://github.com/simonw/datasette/issues/1187", "label": "#1187"}, {"href": "https://datasette.io/plugins/datasette-leaflet-freedraw", "label": "datasette-leaflet-freedraw"}, {"href": "https://simonwillison.net/2021/Jan/24/drawing-shapes-spatialite/", "label": "Drawing shapes on a map to query a SpatiaLite database"}] |
javascript_plugins:id2 | javascript_plugins | id2 | JavaScript plugin objects | JavaScript plugins are blocks of code that can be registered with Datasette using the registerPlugin() method on the datasetteManager object. The implementation object passed to this method should include a version key defining the plugin version, and one or more of the following named functions providing the implementation of the plugin: | ["JavaScript plugins"] | [] |
plugin_hooks:plugin-hook-render-cell | plugin_hooks | plugin-hook-render-cell | render_cell(row, value, column, table, database, datasette, request) | Lets you customize the display of values within table cells in the HTML table view. row - sqlite.Row The SQLite row object that the value being rendered is part of value - string, integer, float, bytes or None The value that was loaded from the database column - string The name of the column being rendered table - string or None The name of the table - or None if this is a custom SQL query database - string The name of the database datasette - Datasette class You can use this to access plugin configuration options via datasette.plugin_config(your_plugin_name) , or to execute SQL queries. request - Request object The current request object If your hook returns None , it will be ignored. Use this to indicate that your hook is not able to custom render this particular value. If the hook returns a string, that string will be rendered in the table cell. If you want to return HTML markup you can do so by returning a jinja2.Markup object. You can also return an awaitable function which returns a value. Datasette will loop through… | ["Plugin hooks"] | [{"href": "https://datasette.io/plugins/datasette-render-binary", "label": "datasette-render-binary"}, {"href": "https://datasette.io/plugins/datasette-render-markdown", "label": "datasette-render-markdown"}, {"href": "https://datasette.io/plugins/datasette-json-html", "label": "datasette-json-html"}] |
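As a minimal sketch, a plugin might upper-case string values in a hypothetical status column and return None for everything else, so that other plugins or the default rendering can take over:

    from datasette import hookimpl

    # "status" is a hypothetical column name used only for illustration
    @hookimpl
    def render_cell(value, column):
        if column == "status" and isinstance(value, str):
            return value.upper()
        return None  # let other plugins or the default rendering handle it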
internals:database-execute-write-script | internals | database-execute-write-script | await db.execute_write_script(sql, block=True) | Like execute_write() but can be used to send multiple SQL statements in a single string separated by semicolons, using the sqlite3 conn.executescript() method. Each call to execute_write_script() will be executed inside a transaction. | ["Internals for plugins", "Database class"] | [{"href": "https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executescript", "label": "conn.executescript()"}] |
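A minimal sketch, assuming db is a Database instance obtained via datasette.get_database() inside an async plugin hook; the table and index names are hypothetical:

    # Both statements run inside a single transaction
    await db.execute_write_script(
        """
        create table if not exists logs (id integer primary key, message text);
        create index if not exists idx_logs_message on logs(message);
        """
    )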
internals:database-execute-write-many | internals | database-execute-write-many | await db.execute_write_many(sql, params_seq, block=True) | Like execute_write() but uses the sqlite3 conn.executemany() method. This will efficiently execute the same SQL statement against each of the parameters in the params_seq iterator, for example: await db.execute_write_many( "insert into characters (id, name) values (?, ?)", [(1, "Melanie"), (2, "Selma"), (2, "Viktor")], ) Each call to execute_write_many() will be executed inside a transaction. | ["Internals for plugins", "Database class"] | [{"href": "https://docs.python.org/3/library/sqlite3.html#sqlite3.Cursor.executemany", "label": "conn.executemany()"}] |
changelog:other-small-fixes | changelog | other-small-fixes | Other small fixes | Made several performance improvements to the database schema introspection code that runs when Datasette first starts up. ( #1555 ) Label columns detected for foreign keys are now case-insensitive, so Name or TITLE will be detected in the same way as name or title . ( #1544 ) Upgraded Pluggy dependency to 1.0. ( #1575 ) Now using Plausible analytics for the Datasette documentation. explain query plan is now allowed with varying amounts of whitespace in the query. ( #1588 ) New CLI reference page showing the output of --help for each of the datasette sub-commands. This led to several small improvements to the help copy. ( #1594 ) Fixed bug where writable canned queries could not be used with custom templates. ( #1547 ) Improved fix for a bug where columns with an underscore prefix could result in unnecessary hidden form fields. ( #1527 ) | ["Changelog", "0.60 (2022-01-13)"] | [{"href": "https://github.com/simonw/datasette/issues/1555", "label": "#1555"}, {"href": "https://github.com/simonw/datasette/issues/1544", "label": "#1544"}, {"href": "https://github.com/simonw/datasette/issues/1575", "label": "#1575"}, {"href": "https://plausible.io/", "label": "Plausible analytics"}, {"href": "https://github.com/simonw/datasette/issues/1588", "label": "#1588"}, {"href": "https://github.com/simonw/datasette/issues/1594", "label": "#1594"}, {"href": "https://github.com/simonw/datasette/issues/1547", "label": "#1547"}, {"href": "https://github.com/simonw/datasette/issues/1527", "label": "#1527"}] |
internals:internals | internals | internals | Internals for plugins | Many Plugin hooks are passed objects that provide access to internal Datasette functionality. The interface to these objects should not be considered stable with the exception of methods that are documented here. | [] | [] |
settings:setting-max-signed-tokens-ttl | settings | setting-max-signed-tokens-ttl | max_signed_tokens_ttl | Maximum allowed expiry time for signed API tokens created by users. Defaults to 0 which means no limit - tokens can be created that will never expire. Set this to a value in seconds to limit the maximum expiry time. For example, to set that limit to 24 hours you would use: datasette mydatabase.db --setting max_signed_tokens_ttl 86400 This setting is enforced when incoming tokens are processed. | ["Settings", "Settings"] | [] |
settings:setting-num-sql-threads | settings | setting-num-sql-threads | num_sql_threads | Maximum number of threads in the thread pool Datasette uses to execute SQLite queries. Defaults to 3. datasette mydatabase.db --setting num_sql_threads 10 Setting this to 0 turns off threaded SQL queries entirely - useful for environments that do not support threading such as Pyodide . | ["Settings", "Settings"] | [{"href": "https://pyodide.org/", "label": "Pyodide"}] |
settings:setting-max-insert-rows | settings | setting-max-insert-rows | max_insert_rows | Maximum rows that can be inserted at a time using the bulk insert API, see Inserting rows . Defaults to 100. You can increase or decrease this limit like so: datasette mydatabase.db --setting max_insert_rows 1000 | ["Settings", "Settings"] | [] |
metadata:per-database-and-per-table-metadata | metadata | per-database-and-per-table-metadata | Per-database and per-table metadata | Metadata at the top level of the file will be shown on the index page and in the footer on every page of the site. The license and source is expected to apply to all of your data. You can also provide metadata at the per-database or per-table level, like this: [[[cog metadata_example(cog, { "databases": { "database1": { "source": "Alternative source", "source_url": "http://example.com/", "tables": { "example_table": { "description_html": "Custom <em>table</em> description", "license": "CC BY 3.0 US", "license_url": "https://creativecommons.org/licenses/by/3.0/us/" } } } } }) ]]] [[[end]]] Each of the top-level metadata fields can be used at the database and table level. | ["Metadata"] | [] |
changelog:id103 | changelog | id103 | 0.23.2 (2018-07-07) | Minor bugfix and documentation release. CSV export now respects --cors , fixes #326 Installation instructions , including docker image - closes #328 Fix for row pages for tables with / in, closes #325 | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/326", "label": "#326"}, {"href": "https://github.com/simonw/datasette/issues/328", "label": "#328"}, {"href": "https://github.com/simonw/datasette/issues/325", "label": "#325"}] |
changelog:id107 | changelog | id107 | 0.23.1 (2018-06-21) | Minor bugfix release. Correctly display empty strings in HTML table, closes #314 Allow "." in database filenames, closes #302 404s ending in slash redirect to remove that slash, closes #309 Fixed incorrect display of compound primary keys with foreign key references. Closes #319 Docs + example of canned SQL query using || concatenation. Closes #321 Correctly display facets with value of 0 - closes #318 Default 'expand labels' to checked in CSV advanced export | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/314", "label": "#314"}, {"href": "https://github.com/simonw/datasette/issues/302", "label": "#302"}, {"href": "https://github.com/simonw/datasette/issues/309", "label": "#309"}, {"href": "https://github.com/simonw/datasette/issues/319", "label": "#319"}, {"href": "https://github.com/simonw/datasette/issues/321", "label": "#321"}, {"href": "https://github.com/simonw/datasette/issues/318", "label": "#318"}] |
json_api:json-api-discover-alternate | json_api | json-api-discover-alternate | Discovering the JSON for a page | Most of the HTML pages served by Datasette provide a mechanism for discovering their JSON equivalents using the HTML link mechanism. You can find this near the top of the source code of those pages, looking like this: <link rel="alternate" type="application/json+datasette" href="https://latest.datasette.io/fixtures/sortable.json"> The JSON URL is also made available in a Link HTTP header for the page: Link: https://latest.datasette.io/fixtures/sortable.json; rel="alternate"; type="application/json+datasette" | ["JSON API"] | [] |
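A small sketch of performing this discovery programmatically with HTTPX, using the example URL from the documentation:

    import httpx

    # Read the alternate JSON URL from the Link header of the HTML page
    response = httpx.get("https://latest.datasette.io/fixtures/sortable")
    print(response.headers.get("link"))

    # Or fetch the JSON equivalent directly
    data = httpx.get("https://latest.datasette.io/fixtures/sortable.json").json()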
changelog:id136 | changelog | id136 | 0.20 (2018-04-20) | Mostly new work on the Plugins mechanism: plugins can now bundle static assets and custom templates, and datasette publish has a new --install=name-of-plugin option. Add col-X classes to HTML table on custom query page Fixed out-dated template in documentation Plugins can now bundle custom templates, #224 Added /-/metadata /-/plugins /-/inspect, #225 Documentation for --install option, refs #223 Datasette publish/package --install option, #223 Fix for plugins in Python 3.5, #222 New plugin hooks: extra_css_urls() and extra_js_urls(), #214 /-/static-plugins/PLUGIN_NAME/ now serves static/ from plugins <th> now gets class="col-X" - plus added col-X documentation Use to_css_class for table cell column classes This ensures that columns with spaces in the name will still generate usable CSS class names. Refs #209 Add column name classes to <td>s, make PK bold [Russ Garrett] Don't duplicate simple primary keys in the link column [Russ Garrett] When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK colu… | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/224", "label": "#224"}, {"href": "https://github.com/simonw/datasette/issues/225", "label": "#225"}, {"href": "https://github.com/simonw/datasette/issues/223", "label": "#223"}, {"href": "https://github.com/simonw/datasette/issues/223", "label": "#223"}, {"href": "https://github.com/simonw/datasette/issues/222", "label": "#222"}, {"href": "https://github.com/simonw/datasette/issues/214", "label": "#214"}, {"href": "https://github.com/simonw/datasette/issues/209", "label": "#209"}, {"href": "https://github.com/simonw/datasette/issues/209", "label": "#209"}] |
sql_queries:canned-queries-magic-parameters | sql_queries | canned-queries-magic-parameters | Magic parameters | Named parameters that start with an underscore are special: they can be used to automatically add values created by Datasette that are not contained in the incoming form fields or query string. These magic parameters are only supported for canned queries: to avoid security issues (such as queries that extract the user's private cookies) they are not available to SQL that is executed by the user as a custom SQL query. Available magic parameters are: _actor_* - e.g. _actor_id , _actor_name Fields from the currently authenticated Actors . _header_* - e.g. _header_user_agent Header from the incoming HTTP request. The key should be in lower case and with hyphens converted to underscores e.g. _header_user_agent or _header_accept_language . _cookie_* - e.g. _cookie_lang The value of the incoming cookie of that name. _now_epoch The number of seconds since the Unix epoch. _now_date_utc The date in UTC, e.g. 2020-06-01 _now_datetime_utc The ISO 8601 datetime in UTC, e.g. 2020-06-24T18:01:07Z _random_chars_* - e.g. … | ["Running SQL queries", "Canned queries"] | [] |
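A hedged sketch of a canned query that uses one of these parameters, written as the Python-dict equivalent of the metadata file; the database, query and table names are hypothetical:

    # Equivalent of metadata.yaml; "mydatabase", "my_rows" and "log" are
    # hypothetical names for illustration only
    metadata = {
        "databases": {
            "mydatabase": {
                "queries": {
                    "my_rows": {
                        "sql": "select * from log where actor_id = :_actor_id"
                    }
                }
            }
        }
    }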
changelog:signed-api-tokens | changelog | signed-api-tokens | Signed API tokens | New /-/create-token page allowing authenticated users to create signed API tokens that can act on their behalf, see API Tokens . ( #1852 ) New datasette create-token command for creating tokens from the command line: datasette create-token . New allow_signed_tokens setting which can be used to turn off signed token support. ( #1856 ) New max_signed_tokens_ttl setting for restricting the maximum allowed duration of a signed token. ( #1858 ) | ["Changelog", "1.0a0 (2022-11-29)"] | [{"href": "https://github.com/simonw/datasette/issues/1852", "label": "#1852"}, {"href": "https://github.com/simonw/datasette/issues/1856", "label": "#1856"}, {"href": "https://github.com/simonw/datasette/issues/1858", "label": "#1858"}] |
changelog:id66 | changelog | id66 | 0.39 (2020-03-24) | New base_url configuration setting for serving up the correct links while running Datasette under a different URL prefix. ( #394 ) New metadata settings "sort" and "sort_desc" for setting the default sort order for a table. See Setting a default sort order . ( #702 ) Sort direction arrow now displays by default on the primary key. This means you only have to click once (not twice) to sort in reverse order. ( #677 ) New await Request(scope, receive).post_vars() method for accessing POST form variables. ( #700 ) Plugin hooks documentation now links to example uses of each plugin. ( #709 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/394", "label": "#394"}, {"href": "https://github.com/simonw/datasette/issues/702", "label": "#702"}, {"href": "https://github.com/simonw/datasette/issues/677", "label": "#677"}, {"href": "https://github.com/simonw/datasette/issues/700", "label": "#700"}, {"href": "https://github.com/simonw/datasette/issues/709", "label": "#709"}] |
changelog:id28 | changelog | id28 | 0.58 (2021-07-14) | New datasette --uds /tmp/datasette.sock option for binding Datasette to a Unix domain socket, see proxy documentation ( #1388 ) "searchmode": "raw" table metadata option for defaulting a table to executing SQLite full-text search syntax without first escaping it, see Advanced SQLite search queries . ( #1389 ) New plugin hook: get_metadata(datasette, key, database, table) , for returning custom metadata for an instance, database or table. Thanks, Brandon Roberts! ( #1384 ) New plugin hook: skip_csrf(datasette, scope) , for opting out of CSRF protection based on the incoming request. ( #1377 ) The menu_links() , table_actions() and database_actions() plugin hooks all gained a new optional request argument providing access to the current request. ( #1371 ) Major performance improvement for Datasette faceting. ( #1394 ) Improved documentation for Running Datasette behind a proxy to recommend using ProxyPreserveHost On with Apache. ( #1387 ) POST requests to endpoints that do not support that HTTP verb now return a 405 error. db.path can now be provided as a pathlib.Path object, useful when writing unit tests for plugins. Thanks, Chris Amico. ( #1365 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/1388", "label": "#1388"}, {"href": "https://github.com/simonw/datasette/issues/1389", "label": "#1389"}, {"href": "https://github.com/simonw/datasette/issues/1384", "label": "#1384"}, {"href": "https://github.com/simonw/datasette/issues/1377", "label": "#1377"}, {"href": "https://github.com/simonw/datasette/issues/1371", "label": "#1371"}, {"href": "https://github.com/simonw/datasette/issues/1394", "label": "#1394"}, {"href": "https://github.com/simonw/datasette/issues/1387", "label": "#1387"}, {"href": "https://github.com/simonw/datasette/issues/1365", "label": "#1365"}] |
changelog:plugin-hooks | changelog | plugin-hooks | Plugin hooks | New jinja2_environment_from_request(datasette, request, env) plugin hook, which can be used to customize the current Jinja environment based on the incoming request. This can be used to modify the template lookup path based on the incoming request hostname, among other things. ( #2225 ) New family of template slot plugin hooks : top_homepage , top_database , top_table , top_row , top_query , top_canned_query . Plugins can use these to provide additional HTML to be injected at the top of the corresponding pages. ( #1191 ) New track_event() mechanism for plugins to emit and receive events when certain events occur within Datasette. ( #2240 ) Plugins can register additional event classes using register_events(datasette) . They can then trigger those events with the datasette.track_event(event) internal method. Plugins can subscribe to notifications of events using the track_event(datasette, event) plugin hook. Datasette core now emits login , logout , create-token , create-table , drop-table , insert-rows , upsert-rows , update-row , delete-row events, documented here . … | ["Changelog", "1.0a8 (2024-02-07)"] | [{"href": "https://github.com/simonw/datasette/issues/2225", "label": "#2225"}, {"href": "https://github.com/simonw/datasette/issues/1191", "label": "#1191"}, {"href": "https://github.com/simonw/datasette/issues/2240", "label": "#2240"}, {"href": "https://github.com/simonw/datasette/issues/2218", "label": "#2218"}] |
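A minimal sketch of a plugin subscribing to these notifications via the track_event hook; it simply prints the name of each event and assumes events expose a name attribute such as "login" or "insert-rows":

    from datasette import hookimpl

    # A sketch of an event subscriber; "name" is assumed to identify the event
    @hookimpl
    def track_event(datasette, event):
        print("event received:", event.name)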
changelog:v1-0-a12 | changelog | v1-0-a12 | 1.0a12 (2024-02-29) | New query_actions() plugin hook, similar to table_actions() and database_actions() . Can be used to add a menu of actions to the canned query or arbitrary SQL query page. ( #2283 ) New design for the button that opens the query, table and database actions menu. ( #2281 ) "does not contain" table filter for finding rows that do not contain a string. ( #2287 ) Fixed a bug in the makeColumnActions(columnDetails) JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. ( #2289 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/2283", "label": "#2283"}, {"href": "https://github.com/simonw/datasette/issues/2281", "label": "#2281"}, {"href": "https://github.com/simonw/datasette/issues/2287", "label": "#2287"}, {"href": "https://github.com/simonw/datasette/issues/2289", "label": "#2289"}] |
changelog:write-api | changelog | write-api | Write API | New API explorer at /-/api for trying out the API. ( #1871 ) /db/-/create API for Creating a table . ( #1882 ) /db/table/-/insert API for Inserting rows . ( #1851 ) /db/table/-/drop API for Dropping tables . ( #1874 ) /db/table/pk/-/update API for Updating a row . ( #1863 ) /db/table/pk/-/delete API for Deleting a row . ( #1864 ) | ["Changelog", "1.0a0 (2022-11-29)"] | [{"href": "https://github.com/simonw/datasette/issues/1871", "label": "#1871"}, {"href": "https://github.com/simonw/datasette/issues/1882", "label": "#1882"}, {"href": "https://github.com/simonw/datasette/issues/1851", "label": "#1851"}, {"href": "https://github.com/simonw/datasette/issues/1874", "label": "#1874"}, {"href": "https://github.com/simonw/datasette/issues/1863", "label": "#1863"}, {"href": "https://github.com/simonw/datasette/issues/1864", "label": "#1864"}] |
changelog:id123 | changelog | id123 | 0.21 (2018-05-05) | New JSON _shape= options, the ability to set table _size= and a mechanism for searching within specific columns. Default tests to using a longer timelimit Every now and then a test will fail in Travis CI on Python 3.5 because it hit the default 20ms SQL time limit. Test fixtures now default to a 200ms time limit, and we only use the 20ms time limit for the specific test that tests query interruption. This should make our tests on Python 3.5 in Travis much more stable. Support _search_COLUMN=text searches, closes #237 Show version on /-/plugins page, closes #248 ?_size=max option, closes #249 Added /-/versions and /-/versions.json , closes #244 Sample output: { "python": { "version": "3.6.3", "full": "3.6.3 (default, Oct 4 2017, 06:09:38) \n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]" }, "datasette": { "version": "0.20" }, "sqlite": { "version": "3.23.1", "extensions": { "json1": null, "spatialite": "4.3.0a" } } } Renamed ?_sql_time_limit_ms= to ?_timelimit , closes #242 New ?_shape=array option + tweaks to _shape , closes #245 Default is now ?_shape=arrays (renamed from lists ) New ?_shape=array returns an array of objects as the root object … | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/237", "label": "#237"}, {"href": "https://github.com/simonw/datasette/issues/248", "label": "#248"}, {"href": "https://github.com/simonw/datasette/issues/249", "label": "#249"}, {"href": "https://github.com/simonw/datasette/issues/244", "label": "#244"}, {"href": "https://github.com/simonw/datasette/issues/242", "label": "#242"}, {"href": "https://github.com/simonw/datasette/issues/245", "label": "#245"}, {"href": "https://github.com/simonw/datasette/issues/240", "label": "#240"}, {"href": "https://github.com/simonw/datasette/issues/229", "label": "#229"}, {"href": "https://github.com/simonw/datasette/issues/230", "label": "#230"}, {"href": "https://github.com/simonw/datasette/issues/239", "label": "#239"}, {"href": "https://github.com/simonw/datasette/issues/228", "label": "#228"}, {"href": "https://github.com/simonw/datasette/issues/234", "label": "#234"}] |
changelog:id88 | changelog | id88 | 0.27 (2019-01-31) | New command: datasette plugins ( documentation ) shows you the currently installed list of plugins. Datasette can now output newline-delimited JSON using the new ?_shape=array&_nl=on query string option. Added documentation on The Datasette Ecosystem . Now using Python 3.7.2 as the base for the official Datasette Docker image . | ["Changelog"] | [{"href": "http://ndjson.org/", "label": "newline-delimited JSON"}, {"href": "https://hub.docker.com/r/datasetteproject/datasette/", "label": "Datasette Docker image"}] |
changelog:id61 | changelog | id61 | Smaller changes | New internals documentation for Request object and Response class . ( #706 ) request.url now respects the force_https_urls config setting. closes ( #781 ) request.args.getlist() returns [] if missing. Removed request.raw_args entirely. ( #774 ) New datasette.get_database() method. Added _ prefix to many private, undocumented methods of the Datasette class. ( #576 ) Removed the db.get_outbound_foreign_keys() method which duplicated the behaviour of db.foreign_keys_for_table() . New await datasette.permission_allowed() method. /-/actor debugging endpoint for viewing the currently authenticated actor. New request.cookies property. /-/plugins endpoint now shows a list of hooks implemented by each plugin, e.g. https://latest.datasette.io/-/plugins?all=1 request.post_vars() method no longer discards empty values. New "params" canned query key for explicitly setting named parameters, see Canned query parameters . ( #797 ) request.args is now a MultiParams object. Fixed a bug with the datasette plugins command. ( #802 ) Nicer pa… | ["Changelog", "0.44 (2020-06-11)"] | [{"href": "https://github.com/simonw/datasette/issues/706", "label": "#706"}, {"href": "https://github.com/simonw/datasette/issues/781", "label": "#781"}, {"href": "https://github.com/simonw/datasette/issues/774", "label": "#774"}, {"href": "https://github.com/simonw/datasette/issues/576", "label": "#576"}, {"href": "https://latest.datasette.io/-/plugins?all=1", "label": "https://latest.datasette.io/-/plugins?all=1"}, {"href": "https://github.com/simonw/datasette/issues/797", "label": "#797"}, {"href": "https://github.com/simonw/datasette/issues/802", "label": "#802"}, {"href": "https://github.com/simonw/datasette/issues/395", "label": "#395"}, {"href": "https://github.com/simonw/datasette/issues/777", "label": "#777"}, {"href": "https://github.com/simonw/datasette/issues/822", "label": "#822"}, {"href": "https://github.com/simonw/datasette/issues/804", "label": "#804"}, {"href": "https://github.com/simonw/datasette/issues/830", "label": "#830"}, {"href": "https://github.com/simonw/datasette/issues/837", "label": "#837"}] |
changelog:v1-0-a6 | changelog | v1-0-a6 | 1.0a6 (2023-09-07) | New plugin hook: actors_from_ids(datasette, actor_ids) and an internal method to accompany it, await .actors_from_ids(actor_ids) . This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects. ( #2181 ) DATASETTE_LOAD_PLUGINS environment variable for controlling which plugins are loaded by Datasette. ( #2164 ) Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. ( #2178 ) The execute-sql permission now implies that the actor can also view the database and instance. ( #2169 ) Documentation describing a pattern for building plugins that themselves define further hooks for other plugins. ( #1765 ) Datasette is now tested against the Python 3.12 preview. ( #2175 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/2181", "label": "#2181"}, {"href": "https://github.com/simonw/datasette/issues/2164", "label": "#2164"}, {"href": "https://github.com/simonw/datasette/issues/2178", "label": "#2178"}, {"href": "https://github.com/simonw/datasette/issues/2169", "label": "#2169"}, {"href": "https://github.com/simonw/datasette/issues/1765", "label": "#1765"}, {"href": "https://github.com/simonw/datasette/pull/2175", "label": "#2175"}] |
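A minimal sketch of a plugin resolving recorded actor IDs back into actor dictionaries, assuming it runs inside an async hook; the IDs and the shape of the result are illustrative:

    # "user-1" and "user-2" are hypothetical recorded actor IDs
    actors = await datasette.actors_from_ids(["user-1", "user-2"])
    # Expected shape: {"user-1": {...actor...}, "user-2": {...actor...}}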
changelog:plugins-and-internals | changelog | plugins-and-internals | Plugins and internals | New plugin hook: filters_from_request(request, database, table, datasette) , which runs on the table page and can be used to support new custom query string parameters that modify the SQL query. ( #473 ) Added two additional methods for writing to the database: await db.execute_write_script(sql, block=True) and await db.execute_write_many(sql, params_seq, block=True) . ( #1570 ) The db.execute_write() internal method now defaults to blocking until the write operation has completed. Previously it defaulted to queuing the write and then continuing to run code while the write was in the queue. ( #1579 ) Database write connections now execute the prepare_connection(conn, database, datasette) plugin hook. ( #1564 ) The Datasette() constructor no longer requires the files= argument, and is now documented at Datasette class . ( #1563 ) The tracing feature now traces write queries, not just read queries. ( #1568 ) The query string variables exposed by request.args will now include blank strings for arguments such as foo in ?foo=&bar=1 rather than ignoring those parameters entirely. ( #1551 ) | ["Changelog", "0.60 (2022-01-13)"] | [{"href": "https://github.com/simonw/datasette/issues/473", "label": "#473"}, {"href": "https://github.com/simonw/datasette/issues/1570", "label": "#1570"}, {"href": "https://github.com/simonw/datasette/issues/1579", "label": "#1579"}, {"href": "https://github.com/simonw/datasette/issues/1564", "label": "#1564"}, {"href": "https://github.com/simonw/datasette/issues/1563", "label": "#1563"}, {"href": "https://github.com/simonw/datasette/issues/1568", "label": "#1568"}, {"href": "https://github.com/simonw/datasette/issues/1551", "label": "#1551"}] |
changelog:id15 | changelog | id15 | Plugin hooks | New plugin hook: handle_exception() , for custom handling of exceptions caught by Datasette. ( #1770 ) The render_cell() plugin hook is now also passed a row argument, representing the sqlite3.Row object that is being rendered. ( #1300 ) The configuration directory is now stored in datasette.config_dir , making it available to plugins. Thanks, Chris Amico. ( #1766 ) | ["Changelog", "0.62 (2022-08-14)"] | [{"href": "https://github.com/simonw/datasette/issues/1770", "label": "#1770"}, {"href": "https://github.com/simonw/datasette/issues/1300", "label": "#1300"}, {"href": "https://github.com/simonw/datasette/pull/1766", "label": "#1766"}] |
changelog:id93 | changelog | id93 | 0.25 (2018-09-19) | New plugin hooks, improved database view support and an easier way to use more recent versions of SQLite. New publish_subcommand plugin hook. A plugin can now add additional datasette publish publishers in addition to the default now and heroku , both of which have been refactored into default plugins. publish_subcommand documentation . Closes #349 New render_cell plugin hook. Plugins can now customize how values are displayed in the HTML tables produced by Datasette's browsable interface. datasette-json-html and datasette-render-images are two new plugins that use this hook. render_cell documentation . Closes #352 New extra_body_script plugin hook, enabling plugins to provide additional JavaScript that should be added to the page footer. extra_body_script documentation . extra_css_urls and extra_js_urls hooks now take additional optional parameters, allowing them to be more selective about which pages they apply to. Documentation . You can now use the sortable_columns metadata setting to explicitly enable sort-by-column in the interface for database views, as well as for specific tables. The new fts_table and fts_pk metadata settings can now be used to explicitly configure full-text search for a table or a view , even if that table is not directly coupled to the SQLite FTS feature in the database schema itself. Datasette will now use pysqlite3 in place of the standard library sqlite3 module if it has been installed in the current environment. This makes it much easier to run Datasette against a more recent version of SQLite, including the just-released SQLite 3.25.0 which adds wind… | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/349", "label": "#349"}, {"href": "https://github.com/simonw/datasette-json-html", "label": "datasette-json-html"}, {"href": "https://github.com/simonw/datasette-render-images", "label": "datasette-render-images"}, {"href": "https://github.com/simonw/datasette/issues/352", "label": "#352"}, {"href": "https://github.com/coleifer/pysqlite3", "label": "pysqlite3"}, {"href": "https://www.sqlite.org/releaselog/3_25_0.html", "label": "SQLite 3.25.0"}, {"href": "https://github.com/simonw/datasette/issues/360", "label": "#360"}] |
changelog:id12 | changelog | id12 | Documentation | New tutorial: Cleaning data with sqlite-utils and Datasette . Screenshots in the documentation are now maintained using shot-scraper , as described in Automating screenshots for the Datasette documentation using shot-scraper . ( #1844 ) More detailed command descriptions on the CLI reference page. ( #1787 ) New documentation on Running Datasette using OpenRC - thanks, Adam Simpson. ( #1825 ) | ["Changelog", "0.63 (2022-10-27)"] | [{"href": "https://datasette.io/tutorials/clean-data", "label": "Cleaning data with sqlite-utils and Datasette"}, {"href": "https://shot-scraper.datasette.io/", "label": "shot-scraper"}, {"href": "https://simonwillison.net/2022/Oct/14/automating-screenshots/", "label": "Automating screenshots for the Datasette documentation using shot-scraper"}, {"href": "https://github.com/simonw/datasette/issues/1844", "label": "#1844"}, {"href": "https://github.com/simonw/datasette/issues/1787", "label": "#1787"}, {"href": "https://github.com/simonw/datasette/pull/1825", "label": "#1825"}] |
settings:config-dir | settings | config-dir | Configuration directory mode | Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose: datasette one.db two.db \ --metadata=metadata.json \ --template-dir=templates/ \ --plugins-dir=plugins \ --static css:css As an alternative to this, you can run Datasette in configuration directory mode. Create a directory with the following structure: # In a directory called my-app: my-app/one.db my-app/two.db my-app/datasette.yaml my-app/metadata.json my-app/templates/index.html my-app/plugins/my_plugin.py my-app/static/my.css Now start Datasette by providing the path to that directory: datasette my-app/ Datasette will detect the files in that directory and automatically configure itself using them. It will serve all *.db files that it finds, will load metadata.json if it exists, and will load the templates , plugins and static folders if they are present. The files that can be included in this directory are as follows. All are optional. *.db (or *.sqlite3 or *.sqlite ) - SQLite database files that will be served by Datasette datasette.yaml - Configuration for the Datasette instance metadata.json - Metadata for those databases - metadata.yaml or metadata.yml can be used as well inspect-data.json - the result of running datasette inspect *.db --inspect-file=inspect-data.json from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running templates/ - a directory containing Custom templates … | ["Settings"] | [] |
changelog:features | changelog | features | Features | Now tested against Python 3.11. Docker containers used by datasette publish and datasette package both now use that version of Python. ( #1853 ) --load-extension option now supports entrypoints. Thanks, Alex Garcia. ( #1789 ) Facet size can now be set per-table with the new facet_size table metadata option. ( #1804 ) The truncate_cells_html setting now also affects long URLs in columns. ( #1805 ) The non-JavaScript SQL editor textarea now increases height to fit the SQL query. ( #1786 ) Facets are now displayed with better line-breaks in long values. Thanks, Daniel Rech. ( #1794 ) The settings.json file used in Configuration directory mode is now validated on startup. ( #1816 ) SQL queries can now include leading SQL comments, using /* ... */ or -- ... syntax. Thanks, Charles Nepote. ( #1860 ) SQL query is now re-displayed when terminated with a time limit error. ( #1819 ) The inspect data mechanism is now used to speed up server startup - thanks, Forest Gregg. ( #1834 ) In Configuration directory mode databases with filenames ending in .sqlite or .sqlite3 are now automatically added to the Datasette instance. ( #1646 ) Breadcrumb navigation display now respects the current user's permissions. ( #1831 ) | ["Changelog", "0.63 (2022-10-27)"] | [{"href": "https://github.com/simonw/datasette/issues/1853", "label": "#1853"}, {"href": "https://github.com/simonw/datasette/pull/1789", "label": "#1789"}, {"href": "https://github.com/simonw/datasette/issues/1804", "label": "#1804"}, {"href": "https://github.com/simonw/datasette/issues/1805", "label": "#1805"}, {"href": "https://github.com/simonw/datasette/issues/1786", "label": "#1786"}, {"href": "https://github.com/simonw/datasette/pull/1794", "label": "#1794"}, {"href": "https://github.com/simonw/datasette/issues/1816", "label": "#1816"}, {"href": "https://github.com/simonw/datasette/issues/1860", "label": "#1860"}, {"href": "https://github.com/simonw/datasette/issues/1819", "label": "#1819"}, {"href": "https://github.com/simonw/datasette/issues/1834", "label": "#1834"}, {"href": "https://github.com/simonw/datasette/issues/1646", "label": "#1646"}, {"href": "https://github.com/simonw/datasette/issues/1831", "label": "#1831"}] |
publish:cli-publish | publish | cli-publish | datasette publish | Once you have created a SQLite database (e.g. using csvs-to-sqlite ) you can deploy it to a hosting account using a single command. You will need a hosting account with Heroku or Google Cloud . Once you have created your account you will need to install and configure the heroku or gcloud command-line tools. | ["Publishing data"] | [{"href": "https://github.com/simonw/csvs-to-sqlite/", "label": "csvs-to-sqlite"}, {"href": "https://www.heroku.com/", "label": "Heroku"}, {"href": "https://cloud.google.com/", "label": "Google Cloud"}] |
contributing:contributing-running-tests | contributing | contributing-running-tests | Running the tests | Once you have done this, you can run the Datasette unit tests from inside your datasette/ directory using pytest like so: pytest You can run the tests faster using multiple CPU cores with pytest-xdist like this: pytest -n auto -m "not serial" -n auto detects the number of available cores automatically. The -m "not serial" skips tests that don't work well in a parallel test environment. You can run those tests separately like so: pytest -m "serial" | ["Contributing"] | [{"href": "https://docs.pytest.org/", "label": "pytest"}, {"href": "https://pypi.org/project/pytest-xdist/", "label": "pytest-xdist"}] |
deploying:deploying-openrc | deploying | deploying-openrc | Running Datasette using OpenRC | OpenRC is the service manager on non-systemd Linux distributions like Alpine Linux and Gentoo . Create an init script at /etc/init.d/datasette with the following contents: #!/sbin/openrc-run name="datasette" command="datasette" command_args="serve -h 0.0.0.0 /path/to/db.db" command_background=true pidfile="/run/${RC_SVCNAME}.pid" You then need to configure the service to run at boot and start it: rc-update add datasette rc-service datasette start | ["Deploying Datasette"] | [{"href": "https://www.alpinelinux.org/", "label": "Alpine Linux"}, {"href": "https://www.gentoo.org/", "label": "Gentoo"}] |
cli-reference:cli-help-plugins-help | cli-reference | cli-help-plugins-help | datasette plugins | Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use. [[[cog help(["plugins", "--help"]) ]]] Usage: datasette plugins [OPTIONS] List currently installed plugins Options: --all Include built-in default plugins --requirements Output requirements.txt of installed plugins --plugins-dir DIRECTORY Path to directory containing custom plugins --help Show this message and exit. [[[end]]] Example output: [ { "name": "datasette-geojson", "static": false, "templates": false, "version": "0.3.1", "hooks": [ "register_output_renderer" ] }, { "name": "datasette-geojson-map", "static": true, "templates": false, "version": "0.4.0", "hooks": [ "extra_body_script", "extra_css_urls", "extra_js_urls" ] }, { "name": "datasette-leaflet", "static": true, "templates": false, "version": "0.2.2", "hooks": [ "extra_body_script", "extra_template_vars" ] } ] | ["CLI reference"] | [] |
cli-reference:cli-help-inspect-help | cli-reference | cli-help-inspect-help | datasette inspect | Outputs JSON representing introspected data about one or more SQLite database files. If you are opening an immutable database, you can pass this file to the --inspect-file option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running: datasette inspect mydatabase.db > inspect-data.json datasette serve -i mydatabase.db --inspect-file inspect-data.json This performance optimization is used automatically by some of the datasette publish commands. You are unlikely to need to apply this optimization manually. [[[cog help(["inspect", "--help"]) ]]] Usage: datasette inspect [OPTIONS] [FILES]... Generate JSON summary of provided database files This can then be passed to "datasette --inspect-file" to speed up count operations against immutable database files. Options: --inspect-file TEXT --load-extension PATH:ENTRYPOINT? Path to a SQLite extension to load, and optional entrypoint --help Show this message and exit. [[[end]]] | ["CLI reference"] | [] |
cli-reference:cli-help-package-help | cli-reference | cli-help-package-help | datasette package | Package SQLite files into a Datasette Docker container, see datasette package . [[[cog help(["package", "--help"]) ]]] Usage: datasette package [OPTIONS] FILES... Package SQLite files into a Datasette Docker container Options: -t, --tag TEXT Name for the resulting Docker container, can optionally use name:tag format -m, --metadata FILENAME Path to JSON/YAML file containing metadata to publish --extra-options TEXT Extra options to pass to datasette serve --branch TEXT Install datasette from a GitHub branch e.g. main --template-dir DIRECTORY Path to directory containing custom templates --plugins-dir DIRECTORY Path to directory containing custom plugins --static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/... --install TEXT Additional packages (e.g. plugins) to install --spatialite Enable SpatialLite extension --version-note TEXT Additional note to show on /-/versions --secret TEXT Secret used for signing secure values, such as signed cookies -p, --port INTEGER RANGE Port to run the server on, defaults to 8001 [1<=x<=65535] --title TEXT Title for metadata --license TEXT License label for metadata --license_url TEXT License URL for metadata --source TEXT Source label for metadata --source_url TEXT Source URL for metadata --about TEXT About label for metadata --about_url TEXT About URL for metadata --help Show this message and exit. [[[end]]] | ["CLI reference"] | [] |
changelog:configuration | changelog | configuration | Configuration | Plugin configuration now lives in the datasette.yaml configuration file , passed to Datasette using the -c/--config option. Thanks, Alex Garcia. ( #2093 ) datasette -c datasette.yaml Where datasette.yaml contains configuration that looks like this: plugins: datasette-cluster-map: latitude_column: xlat longitude_column: xlon Previously plugins were configured in metadata.yaml , which was confusing as plugin settings were unrelated to database and table metadata. The -s/--setting option can now be used to set plugin configuration as well. See Configuration via the command-line for details. ( #2252 ) The above YAML configuration example using -s/--setting looks like this: datasette mydatabase.db \ -s plugins.datasette-cluster-map.latitude_column xlat \ -s plugins.datasette-cluster-map.longitude_column xlon The new /-/config page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. ( #2254 ) Existing Datasette installations may already have configuration set in metadata.yaml that should be migrated to datasette.yaml . To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. ( #2247 ) ( #2248 ) ( #2249 ) Note that the datasette publish command has not yet been updated to accept a datasette.yaml configuration file. This will be addressed in #2195 but for the moment you can include those settings in metadata.yaml instead. | ["Changelog", "1.0a8 (2024-02-07)"] | [{"href": "https://github.com/simonw/datasette/issues/2093", "label": "#2093"}, {"href": "https://github.com/simonw/datasette/issues/2252", "label": "#2252"}, {"href": "https://github.com/simonw/datasette/issues/2254", "label": "#2254"}, {"href": "https://github.com/simonw/datasette/issues/2247", "label": "#2247"}, {"href": "https://github.com/simonw/datasette/issues/2248", "label": "#2248"}, {"href": "https://github.com/simonw/datasette/issues/2249", "label": "#2249"}, {"href": "https://github.com/simonw/datasette/issues/2195", "label": "#2195"}] |
writing_plugins:writing-plugins-packaging | writing_plugins | writing-plugins-packaging | Packaging a plugin | Plugins can be packaged using Python setuptools. You can see an example of a packaged plugin at https://github.com/simonw/datasette-plugin-demos The example consists of two files: a setup.py file that defines the plugin: from setuptools import setup VERSION = "0.1" setup( name="datasette-plugin-demos", description="Examples of plugins for Datasette", author="Simon Willison", url="https://github.com/simonw/datasette-plugin-demos", license="Apache License, Version 2.0", version=VERSION, py_modules=["datasette_plugin_demos"], entry_points={ "datasette": [ "plugin_demos = datasette_plugin_demos" ] }, install_requires=["datasette"], ) And a Python module file, datasette_plugin_demos.py , that implements the plugin: from datasette import hookimpl import random @hookimpl def prepare_jinja2_environment(env): env.filters["uppercase"] = lambda u: u.upper() @hookimpl def prepare_connection(conn): conn.create_function( "random_integer", 2, random.randint ) Having built a plugin in this way you can turn it into an installable package using the following command: python3 setup.py sdist This will create a .tar.gz file in the dist/ directory. You can then install your new plugin into a Datasette virtual environment or Docker container using pip : pip install datasette-plugin-demos-0.1.tar.gz To learn how to upload your plugin to PyPI for use by other people, read the PyPA guide to Packaging and distributing projects . | ["Writing plugins"] | [{"href": "https://github.com/simonw/datasette-plugin-demos", "label": "https://github.com/simonw/datasette-plugin-demos"}, {"href": "https://pypi.org/", "label": "PyPI"}, {"href": "https://packaging.python.org/tutorials/distributing-packages/", "label": "Packaging and distributing projects"}] |
writing_plugins:writing-plugins-extra-hooks | writing_plugins | writing-plugins-extra-hooks | Plugins that define new plugin hooks | Plugins can define new plugin hooks that other plugins can use to further extend their functionality. datasette-graphql is one example of a plugin that does this. It defines a new hook called graphql_extra_fields , described here , which other plugins can use to define additional fields that should be included in the GraphQL schema. To define additional hooks, add a file to the plugin called datasette_your_plugin/hookspecs.py with content that looks like this: from pluggy import HookspecMarker hookspec = HookspecMarker("datasette") @hookspec def name_of_your_hook_goes_here(datasette): "Description of your hook." You should define your own hook name and arguments here, following the documentation for Pluggy specifications . Make sure to pick a name that is unlikely to clash with hooks provided by any other plugins. Then, to register your plugin hooks, add the following code to your datasette_your_plugin/__init__.py file: from datasette.plugins import pm from . import hookspecs pm.add_hookspecs(hookspecs) This will register your plugin hooks as part of the datasette plugin hook namespace. Within your plugin code you can trigger the hook using this pattern: from datasette.plugins import pm for ( plugin_return_value ) in pm.hook.name_of_your_hook_goes_here( datasette=datasette ): # Do something with plugin_return_value pass Other plugins will then be able to register their own implementations of your hook using this syntax: from datasette import hookimpl @hookimpl def name_of_your_hook_goes_here(datasette): return "Response from this plugin hook" These plugin implementations can accept 0 or more of the named arguments that you defined in your hook specification. | ["Writing plugins"] | [{"href": "https://github.com/simonw/datasette-graphql", "label": "datasette-graphql"}, {"href": "https://github.com/simonw/datasette-graphql/blob/main/README.md#adding-custom-fields-with-plugins", "label": "described here"}, {"href": "https://pluggy.readthedocs.io/en/stable/#specs", "label": "Pluggy specifications"}] |
plugins:plugins-configuration | plugins | plugins-configuration | Plugin configuration | Plugins can have their own configuration, embedded in a configuration file . Configuration options for plugins live within a "plugins" key in that file, which can be included at the root, database or table level. Here is an example of some plugin configuration for a specific table: [[[cog from metadata_doc import config_example config_example(cog, { "databases": { "sf-trees": { "tables": { "Street_Tree_List": { "plugins": { "datasette-cluster-map": { "latitude_column": "lat", "longitude_column": "lng" } } } } } } }) ]]] [[[end]]] This tells the datasette-cluster-map column which latitude and longitude columns should be used for a table called Street_Tree_List inside a database file called sf-trees.db . | ["Plugins"] | [] |
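From within the plugin itself, that table-level configuration can be read back using the documented datasette.plugin_config() method. A minimal sketch:

    # Reads the most specific configuration available (table, then
    # database, then root) for the named plugin
    config = datasette.plugin_config(
        "datasette-cluster-map",
        database="sf-trees",
        table="Street_Tree_List",
    )
    # config here would be:
    # {"latitude_column": "lat", "longitude_column": "lng"}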
internals:internals-datasette-client | internals | internals-datasette-client | datasette.client | Plugins can make internal simulated HTTP requests to the Datasette instance within which they are running. This ensures that all of Datasette's external JSON APIs are also available to plugins, while avoiding the overhead of making an external HTTP call to access those APIs. The datasette.client object is a wrapper around the HTTPX Python library , providing an async-friendly API that is similar to the widely used Requests library . It offers the following methods: await datasette.client.get(path, **kwargs) - returns HTTPX Response Execute an internal GET request against that path. await datasette.client.post(path, **kwargs) - returns HTTPX Response Execute an internal POST request. Use data={"name": "value"} to pass form parameters. await datasette.client.options(path, **kwargs) - returns HTTPX Response Execute an internal OPTIONS request. await datasette.client.head(path, **kwargs) - returns HTTPX Response Execute an internal HEAD request. await datasette.client.put(path, **kwargs) - returns HTTPX Response Execute an internal PUT request. await datasette.client.patch(path, **kwargs) - returns HTTPX Response … | ["Internals for plugins", "Datasette class"] | [{"href": "https://www.python-httpx.org/", "label": "HTTPX Python library"}, {"href": "https://requests.readthedocs.io/", "label": "Requests library"}, {"href": "https://www.python-httpx.org/async/", "label": "HTTPX Async documentation"}] |
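A minimal sketch of an internal request from inside an async plugin hook, assuming datasette is the Datasette instance:

    # Hits Datasette's own /-/versions.json endpoint without a network call
    response = await datasette.client.get("/-/versions.json")
    print(response.status_code, response.json())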
changelog:register-routes-plugin-hooks | changelog | register-routes-plugin-hooks | register_routes() plugin hooks | Plugins can now register new views and routes via the register_routes(datasette) plugin hook ( #819 ). View functions can be defined that accept any of the current datasette object, the current request , or the ASGI scope , send and receive objects. | ["Changelog", "0.44 (2020-06-11)"] | [{"href": "https://github.com/simonw/datasette/issues/819", "label": "#819"}] |
changelog:cookie-methods | changelog | cookie-methods | Cookie methods | Plugins can now use the new response.set_cookie() method to set cookies. A new request.cookies method on the Request object can be used to read incoming cookies. | ["Changelog", "0.44 (2020-06-11)"] | []
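A minimal sketch of both methods together, inside a hypothetical view function that receives a request and returns a response ("lang" is an illustrative cookie name):

    from datasette.utils.asgi import Response

    # Read an incoming cookie, then set one on the outgoing response
    lang = request.cookies.get("lang", "en")
    response = Response.html("<h1>Hello</h1>")
    response.set_cookie("lang", lang)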
changelog:secret-plugin-configuration-options | changelog | secret-plugin-configuration-options | Secret plugin configuration options | Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata a new mechanism was needed. Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable: { "plugins": { "datasette-auth-github": { "client_secret": { "$env": "GITHUB_CLIENT_SECRET" } } } } These plugin secrets can be set directly using datasette publish . See Custom metadata and plugins for details. ( #538 and #543 ) | ["Changelog", "0.29 (2019-07-07)"] | [{"href": "https://github.com/simonw/datasette-auth-github", "label": "datasette-auth-github"}, {"href": "https://github.com/simonw/datasette/issues/538", "label": "#538"}, {"href": "https://github.com/simonw/datasette/issues/543", "label": "#543"}] |
changelog:id69 | changelog | id69 | 0.37 (2020-02-25) | Plugins now have a supported mechanism for writing to a database, using the new .execute_write() and .execute_write_fn() methods. Documentation . ( #682 ) Immutable databases that have had their rows counted using the inspect command now use the calculated count more effectively - thanks, Kevin Keogh. ( #666 ) --reload no longer restarts the server if a database file is modified, unless that database was opened in immutable mode with -i . ( #494 ) New ?_searchmode=raw option turns off escaping for FTS queries in ?_search= allowing full use of SQLite's FTS5 query syntax . ( #676 ) | ["Changelog"] | [{"href": "https://github.com/simonw/datasette/issues/682", "label": "#682"}, {"href": "https://github.com/simonw/datasette/pull/666", "label": "#666"}, {"href": "https://github.com/simonw/datasette/issues/494", "label": "#494"}, {"href": "https://www.sqlite.org/fts5.html#full_text_query_syntax", "label": "FTS5 query syntax"}, {"href": "https://github.com/simonw/datasette/issues/676", "label": "#676"}] |
CREATE TABLE [sections] (
    [id] TEXT PRIMARY KEY,
    [page] TEXT,
    [ref] TEXT,
    [title] TEXT,
    [content] TEXT,
    [breadcrumbs] TEXT,
    [references] TEXT
);