0.23 (2018-06-18)

Control HTTP caching with ?_ttl=

You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter. You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely.

Consider for example this query, which returns a randomly selected member of the Avengers:

select * from [avengers/avengers] order by random() limit 1

If you hit the following page repeatedly you will get the same result, due to HTTP caching:

https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1

By adding _ttl=0 to that URL you can ensure the page will not be cached, and you will get back a different super hero every time:

https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0

CSV export

Any Datasette table, view or custom SQL query can now be exported as CSV. Check out the CSV export documentation for more details, or try the feature out on https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies

If your table has more than max_returned_rows (default 1,000) rows, Datasette provides the option to stream all rows. This option takes advantage of async Python and Datasette's efficient pagination to iterate through the entire matching result set and stream it back as a downloadable CSV file.

Foreign key expansions

When Datasette detects a foreign key reference it attempts to resolve a label for that reference (automatically, or using the "Specifying the label column for a table" metadata option) so it can display a link to the associated row. This expansion is now also available for JSON and CSV representations of the table, using the new _labels=on query string option. See Expanding foreign key references for more details.
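These query string options compose with Datasette's .json and .csv representations. The following is a minimal sketch, not taken from the changelog itself: it assumes the fivethirtyeight demo instance linked above is still online and that the avengers table there is unchanged.

```python
import json
from urllib.request import urlopen

# Run the changelog's example SQL against the demo's JSON endpoint
# (the .json extension on the database name is standard Datasette behaviour).
base = "https://fivethirtyeight.datasettes.com/fivethirtyeight.json"
sql = "select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1"

# Default: the response carries a long max-age header, so any HTTP cache
# sitting in front of the instance may serve the same random row repeatedly.
cached = json.load(urlopen(f"{base}?sql={sql}"))

# _ttl=0 sets max-age to 0, so caches will not reuse the response and the
# query produces a fresh row for each request.
fresh = json.load(urlopen(f"{base}?sql={sql}&_ttl=0"))

print(cached.get("rows", [])[:1])
print(fresh.get("rows", [])[:1])

# The same query can be exported as CSV by swapping .json for .csv, and
# tables with foreign keys accept ?_labels=on in either representation.
```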
Improved support for SpatiaLite

The SpatiaLite module for SQLite (https://www.gaia-gis.it/fossil/libspatialite/index) adds robust geospatial features to the database. Getting SpatiaLite working can be tricky, especially if you want to use the most recent alpha version (with support for K-nearest neighbor).

Datasette now includes extensive documentation on SpatiaLite, and thanks to Ravi Kotecha (https://github.com/r4vi) our GitHub repo includes a Dockerfile (https://github.com/simonw/datasette/blob/master/Dockerfile) that can build the latest SpatiaLite and configure it for use with Datasette.

The datasette publish and datasette package commands now accept a new --spatialite argument which causes them to install and configure SpatiaLite as part of the container they deploy.

latest.datasette.io

Every commit to Datasette master is now automatically deployed by Travis CI to https://latest.datasette.io/ - ensuring there is always a live demo of the latest version of the software.

The demo uses the fixtures (https://github.com/simonw/datasette/blob/master/tests/fixtures.py) from our unit tests, ensuring it demonstrates the same range of functionality that is covered by the tests. You can see how the deployment mechanism works in our .travis.yml file (https://github.com/simonw/datasette/blob/master/.travis.yml).

Miscellaneous

Got JSON data in one of your columns? Use the new ?_json=COLNAME argument to tell Datasette to return that JSON value directly rather than encoding it as a string.

If you just want an array of the first value of each row, use the new ?_shape=arrayfirst option - example: https://latest.datasette.io/fixtures.json?sql=select+neighborhood+from+facetable+order+by+pk+limit+101&_shape=arrayfirst (see the Python sketch after the settings list below).

New configuration settings

Datasette's Settings now also supports boolean settings. A number of new configuration options have been added:

num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3.
allow_facet - enable or disable custom facets using the _facet= parameter. Defaults to on.
suggest_facets - should Datasette suggest facets? Defaults to on.
allow_download - should users be allowed to download the entire SQLite database? Defaults to on.
allow_sql - should users be allowed to execute custom SQL queries? Defaults to on.
default_cache_ttl - default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by setting this to 0.
cache_size_kb - set the amount of memory SQLite uses for its per-connection cache (https://www.sqlite.org/pragma.html#pragma_cache_size), in KB.
allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on.
max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.
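Here is the promised sketch for ?_shape=arrayfirst. It is a minimal illustration using the example URL linked from the Miscellaneous entry above, and it assumes the latest.datasette.io demo is reachable.

```python
import json
from urllib.request import urlopen

# The example URL from the Miscellaneous entry: select one column from the
# fixtures demo and ask for the arrayfirst shape.
url = (
    "https://latest.datasette.io/fixtures.json"
    "?sql=select+neighborhood+from+facetable+order+by+pk+limit+101"
    "&_shape=arrayfirst"
)

# _shape=arrayfirst returns a flat JSON array containing only the first
# value of each row, rather than a list of row objects or row arrays.
neighborhoods = json.load(urlopen(url))
print(neighborhoods[:5])
```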