element, to help with advanced CSS customization. ( #1446 )
The render_cell() plugin hook can now return an awaitable function. This means the hook can execute SQL queries. ( #1425 )
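A minimal sketch of what such a hook can look like. The hook signature matches the docs of this era; the SQL call is shown as a comment because it needs a live Datasette instance, and the returned string is a made-up example:

```python
import asyncio

# Sketch: a render_cell hook that returns an awaitable function.
# In a real plugin this function would be decorated with @hookimpl.
def render_cell(value, column, table, database, datasette):
    async def inner():
        # A real plugin could now execute SQL, e.g.:
        # result = await datasette.get_database(database).execute(
        #     "select note from notes where id = ?", [value]
        # )
        return f"Rendered: {value}"

    return inner

# Datasette awaits the returned function; here we simulate that:
rendered = asyncio.run(render_cell(1, "id", "t", "db", None)())
```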
register_routes(datasette) plugin hook now accepts an optional datasette argument. ( #1404 )
New hide_sql canned query option for defaulting to hiding the SQL query used by a canned query, see Additional canned query options . ( #1422 )
New --cpu option for datasette publish cloudrun . ( #1420 )
If Rich is installed in the same virtual environment as Datasette, it will be used to provide enhanced display of error tracebacks on the console. ( #1416 )
datasette.utils parse_metadata(content) function, used by the new datasette-remote-metadata plugin , is now a documented API. ( #1405 )
Fixed bug where ?_next=x&_sort=rowid could throw an error. ( #1470 )
Column cog menu no longer shows the option to facet by a column that is already selected by the default facets in metadata. ( #1469 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/942"", ""label"": ""#942""}, {""href"": ""https://github.com/simonw/datasette/issues/1449"", ""label"": ""#1449""}, {""href"": ""https://github.com/simonw/datasette/issues/1423"", ""label"": ""#1423""}, {""href"": ""https://github.com/encode/httpx/releases/tag/0.20.0"", ""label"": ""httpx 0.20""}, {""href"": ""https://github.com/simonw/datasette/issues/1488"", ""label"": ""#1488""}, {""href"": ""https://github.com/simonw/datasette/pull/1467"", ""label"": ""#1467""}, {""href"": ""https://github.com/simonw/datasette/issues/1421"", ""label"": ""#1421""}, {""href"": ""https://github.com/simonw/datasette/issues/1431"", ""label"": ""#1431""}, {""href"": ""https://github.com/simonw/datasette/issues/1443"", ""label"": ""#1443""}, {""href"": ""https://github.com/simonw/datasette/issues/1446"", ""label"": ""#1446""}, {""href"": ""https://github.com/simonw/datasette/issues/1425"", ""label"": ""#1425""}, {""href"": ""https://github.com/simonw/datasette/issues/1404"", ""label"": ""#1404""}, {""href"": ""https://github.com/simonw/datasette/issues/1422"", ""label"": ""#1422""}, {""href"": ""https://github.com/simonw/datasette/issues/1420"", ""label"": ""#1420""}, {""href"": ""https://github.com/willmcgugan/rich"", ""label"": ""Rich""}, {""href"": ""https://github.com/simonw/datasette/issues/1416"", ""label"": ""#1416""}, {""href"": ""https://datasette.io/plugins/datasette-remote-metadata"", ""label"": ""datasette-remote-metadata plugin""}, {""href"": ""https://github.com/simonw/datasette/issues/1405"", ""label"": ""#1405""}, {""href"": ""https://github.com/simonw/datasette/issues/1470"", ""label"": ""#1470""}, {""href"": ""https://github.com/simonw/datasette/issues/1469"", ""label"": ""#1469""}]"
changelog:id24,changelog,id24,0.59.2 (2021-11-13),"Column names with a leading underscore now work correctly when used as a facet. ( #1506 )
Applying ?_nocol= to a column no longer removes that column from the filtering interface. ( #1503 )
Official Datasette Docker container now uses Debian Bullseye as the base image. ( #1497 )
Datasette is four years old today! Here's the original release announcement from 2017.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1506"", ""label"": ""#1506""}, {""href"": ""https://github.com/simonw/datasette/issues/1503"", ""label"": ""#1503""}, {""href"": ""https://github.com/simonw/datasette/issues/1497"", ""label"": ""#1497""}, {""href"": ""https://simonwillison.net/2017/Nov/13/datasette/"", ""label"": ""original release announcement""}]"
internals:database-close,internals,database-close,db.close(),"Closes all of the open connections to file-backed databases. This is mainly intended to be used by large test suites, to avoid hitting limits on the number of open files.","[""Internals for plugins"", ""Database class""]",[]
changelog:id59,changelog,id59,Smaller changes,"Cascading view permissions - so if a user has view-table they can view the table page even if they do not have view-database or view-instance . ( #832 )
CSRF protection no longer applies to Authentication: Bearer token requests or requests without cookies. ( #835 )
datasette.add_message() now works inside plugins. ( #864 )
Workaround for ""Too many open files"" error in test runs. ( #846 )
Respect existing scope[""actor""] if already set by ASGI middleware. ( #854 )
New process for shipping Alpha and beta releases . ( #807 )
{{ csrftoken() }} now works when plugins render a template using datasette.render_template(..., request=request) . ( #863 )
Datasette now creates a single Request object and uses it throughout the lifetime of the current HTTP request. ( #870 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/832"", ""label"": ""#832""}, {""href"": ""https://github.com/simonw/datasette/issues/835"", ""label"": ""#835""}, {""href"": ""https://github.com/simonw/datasette/issues/864"", ""label"": ""#864""}, {""href"": ""https://github.com/simonw/datasette/issues/846"", ""label"": ""#846""}, {""href"": ""https://github.com/simonw/datasette/issues/854"", ""label"": ""#854""}, {""href"": ""https://github.com/simonw/datasette/issues/807"", ""label"": ""#807""}, {""href"": ""https://github.com/simonw/datasette/issues/863"", ""label"": ""#863""}, {""href"": ""https://github.com/simonw/datasette/issues/870"", ""label"": ""#870""}]"
sql_queries:canned-queries-named-parameters,sql_queries,canned-queries-named-parameters,Canned query parameters,"Canned queries support named parameters, so if you include those in the SQL you will then be able to enter them using the form fields on the canned query page or by adding them to the URL. This means canned queries can be used to create custom JSON APIs based on a carefully designed SQL statement.
Here's an example of a canned query with a named parameter:
select neighborhood, facet_cities.name, state
from facetable
join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%'
order by neighborhood;
The canned query configuration looks like this:
[[[cog
config_example(cog, """"""
databases:
fixtures:
queries:
neighborhood_search:
title: Search neighborhoods
sql: |-
select neighborhood, facet_cities.name, state
from facetable
join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%'
order by neighborhood
"""""")
]]]
[[[end]]]
Note that we are using SQLite string concatenation here - the || operator - to add wildcard % characters to the string provided by the user.
You can try this canned query out here:
https://latest.datasette.io/fixtures/neighborhood_search?text=town
In this example the :text named parameter is automatically extracted from the query using a regular expression.
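That extraction can be illustrated with a small hypothetical helper (Datasette's actual regular expression may differ):

```python
import re

def extract_named_parameters(sql):
    # Find :name style parameters, ignoring :: (used for casts in some SQL dialects)
    return re.findall(r"(?<!:):([a-zA-Z_][a-zA-Z0-9_]*)", sql)

params = extract_named_parameters(
    "select neighborhood from facetable "
    "where neighborhood like '%' || :text || '%'"
)
# params is now ["text"]
```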
You can alternatively provide an explicit list of named parameters using the ""params"" key, like this:
[[[cog
config_example(cog, """"""
databases:
fixtures:
queries:
neighborhood_search:
title: Search neighborhoods
params:
- text
sql: |-
select neighborhood, facet_cities.name, state
from facetable
join facet_cities on facetable.city_id = facet_cities.id
where neighborhood like '%' || :text || '%'
order by neighborhood
"""""")
]]]
[[[end]]]","[""Running SQL queries"", ""Canned queries""]","[{""href"": ""https://latest.datasette.io/fixtures/neighborhood_search?text=town"", ""label"": ""https://latest.datasette.io/fixtures/neighborhood_search?text=town""}]"
changelog:magic-parameters-for-canned-queries,changelog,magic-parameters-for-canned-queries,Magic parameters for canned queries,"Canned queries now support Magic parameters , which can be used to insert or select automatically generated values. For example:
insert into logs
(user_id, timestamp)
values
(:_actor_id, :_now_datetime_utc)
This inserts the currently authenticated actor ID and the current datetime. ( #842 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/842"", ""label"": ""#842""}]"
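The same operation can be simulated directly with Python's sqlite3 module by supplying the values the magic parameters would resolve to (the actor ID here is a made-up example):

```python
import sqlite3
import datetime

conn = sqlite3.connect(":memory:")
conn.execute("create table logs (user_id text, timestamp text)")

# Values Datasette would derive for :_actor_id and :_now_datetime_utc
params = {
    "_actor_id": "root",
    "_now_datetime_utc": datetime.datetime.now(
        datetime.timezone.utc
    ).strftime("%Y-%m-%dT%H:%M:%S"),
}
conn.execute(
    "insert into logs (user_id, timestamp) values (:_actor_id, :_now_datetime_utc)",
    params,
)
conn.commit()
```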
sql_queries:hide-sql,sql_queries,hide-sql,hide_sql,"Canned queries default to displaying their SQL query at the top of the page. If the query is extremely long you may want to hide it by default, with a ""show"" link that can be used to make it visible.
Add the ""hide_sql"": true option to hide the SQL query by default.","[""Running SQL queries"", ""Canned queries"", ""Additional canned query options""]",[]
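For example, a configuration sketch reusing the neighborhood_search query shown elsewhere in these docs:

```yaml
databases:
  fixtures:
    queries:
      neighborhood_search:
        title: Search neighborhoods
        hide_sql: true
        sql: |-
          select neighborhood, facet_cities.name, state
          from facetable
          join facet_cities on facetable.city_id = facet_cities.id
          where neighborhood like '%' || :text || '%'
```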
sql_queries:canned-queries-writable,sql_queries,canned-queries-writable,Writable canned queries,"Canned queries by default are read-only. You can use the ""write"": true key to indicate that a canned query can write to the database.
See Access to specific canned queries for details on how to add permission checks to canned queries, using the ""allow"" key.
[[[cog
config_example(cog, {
""databases"": {
""mydatabase"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""write"": True
}
}
}
}
})
]]]
[[[end]]]
This configuration will create a page at /mydatabase/add_name displaying a form with a name field. Submitting that form will execute the configured INSERT query.
You can customize how Datasette represents success and errors using the following optional properties:
on_success_message - the message shown when a query is successful
on_success_message_sql - alternative to on_success_message : a SQL query that should be executed to generate the message
on_success_redirect - the path or URL the user is redirected to on success
on_error_message - the message shown when a query throws an error
on_error_redirect - the path or URL the user is redirected to on error
For example:
[[[cog
config_example(cog, {
""databases"": {
""mydatabase"": {
""queries"": {
""add_name"": {
""sql"": ""INSERT INTO names (name) VALUES (:name)"",
""params"": [""name""],
""write"": True,
""on_success_message_sql"": ""select 'Name inserted: ' || :name"",
""on_success_redirect"": ""/mydatabase/names"",
""on_error_message"": ""Name insert failed"",
""on_error_redirect"": ""/mydatabase"",
}
}
}
}
})
]]]
[[[end]]]
You can use ""params"" to explicitly list the named parameters that should be displayed as form fields - otherwise they will be automatically detected. ""params"" is not necessary in the above example, since without it ""name"" would be automatically detected from the query.
You can pre-populate form fields when the page first loads using a query string, e.g. /mydatabase/add_name?name=Prepopulated . The user will have to submit the form to execute the query.
If you specify a query in ""on_success_message_sql"" , that query will be executed after the main query. The first column of the first row returned by that query will be displayed as a success message. Named parameters from the main query will be made available to the success message query as well.
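Submitting that form programmatically amounts to a POST with the named parameters as form fields. A small sketch (hypothetical helper; the URL path follows the add_name example configuration above):

```python
import urllib.parse
import urllib.request

def build_canned_query_request(base_url, name):
    # POST the form field "name" to the add_name writable canned query
    data = urllib.parse.urlencode({"name": name}).encode()
    return urllib.request.Request(
        base_url + "/mydatabase/add_name", data=data, method="POST"
    )

req = build_canned_query_request("http://localhost:8001", "Cleo")
```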
configuration:configuration-reference-canned-queries,configuration,configuration-reference-canned-queries,Canned queries configuration,"Canned queries are named SQL queries that appear in the Datasette interface. They can be configured in datasette.yaml using the queries key at the database level:
[[[cog
from metadata_doc import config_example
config_example(cog, {
""databases"": {
""sf-trees"": {
""queries"": {
""just_species"": {
""sql"": ""select qSpecies from Street_Tree_List""
}
}
}
}
})
]]]
[[[end]]]
See the canned queries documentation for more, including how to configure writable canned queries .","[""Configuration"", null]",[]
authentication:authentication-permissions-query,authentication,authentication-permissions-query,Access to specific canned queries,"Canned queries allow you to configure named SQL queries in your datasette.yaml that can be executed by users. These queries can be set up to both read and write to the database, so controlling who can execute them can be important.
To limit access to the add_name canned query in your dogs.db database to just the root user :
[[[cog
config_example(cog, """"""
databases:
dogs:
queries:
add_name:
sql: INSERT INTO names (name) VALUES (:name)
write: true
allow:
id:
- root
"""""")
]]]
[[[end]]]","[""Authentication and permissions"", ""Access permissions in ""]",[]
settings:setting-sql-time-limit-ms,settings,setting-sql-time-limit-ms,sql_time_limit_ms,"By default, queries have a time limit of one second. If a query takes longer than this to run Datasette will terminate the query and return an error.
If this time limit is too short for you, you can customize it using the sql_time_limit_ms limit - for example, to increase it to 3.5 seconds:
datasette mydatabase.db --setting sql_time_limit_ms 3500
You can optionally set a lower time limit for an individual query using the ?_timelimit=100 query string argument:
/my-database/my-table?qSpecies=44&_timelimit=100
This would set the time limit to 100ms for that specific query. This feature is useful if you are working with databases of unknown size and complexity - a query that might make perfect sense for a smaller table could take too long to execute on a table with millions of rows. By setting custom time limits you can execute queries ""optimistically"" - e.g. give me an exact count of rows matching this query but only if it takes less than 100ms to calculate.","[""Settings"", ""Settings""]",[]
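The idea can be sketched with plain sqlite3, interrupting a query from a timer thread. This is an illustration of optimistic execution, not Datasette's actual implementation:

```python
import sqlite3
import threading

def query_with_time_limit(conn, sql, time_limit_ms):
    # Interrupt the query if it runs longer than time_limit_ms
    timer = threading.Timer(time_limit_ms / 1000, conn.interrupt)
    timer.start()
    try:
        return conn.execute(sql).fetchall()
    except sqlite3.OperationalError:
        return None  # query was interrupted: it took too long
    finally:
        timer.cancel()

conn = sqlite3.connect(":memory:")
rows = query_with_time_limit(conn, "select 1 + 1", 100)
```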
custom_templates:customization-custom-templates,custom_templates,customization-custom-templates,Custom templates,"By default, Datasette uses default templates that ship with the package.
You can over-ride these templates by specifying a custom --template-dir like
this:
datasette mydb.db --template-dir=mytemplates/
Datasette will now first look for templates in that directory, and fall back on
the defaults if no matches are found.
It is also possible to over-ride templates on a per-database, per-row or per-
table basis.
The lookup rules Datasette uses are as follows:
Index page (/):
index.html
Database page (/mydatabase):
database-mydatabase.html
database.html
Custom query page (/mydatabase?sql=...):
query-mydatabase.html
query.html
Canned query page (/mydatabase/canned-query):
query-mydatabase-canned-query.html
query-mydatabase.html
query.html
Table page (/mydatabase/mytable):
table-mydatabase-mytable.html
table.html
Row page (/mydatabase/mytable/id):
row-mydatabase-mytable.html
row.html
Table of rows and columns include on table page:
_table-table-mydatabase-mytable.html
_table-mydatabase-mytable.html
_table.html
Table of rows and columns include on row page:
_table-row-mydatabase-mytable.html
_table-mydatabase-mytable.html
_table.html
If a table name has spaces or other unexpected characters in it, the template
filename will follow the same rules as our custom CSS classes - for
example, a table called ""Food Trucks"" will attempt to load the following
templates:
table-mydatabase-Food-Trucks-399138.html
table.html
You can find out which templates were considered for a specific page by viewing
source on that page and looking for an HTML comment at the bottom. The comment
will look something like this:
This example is from the canned query page for a query called ""tz"" in the
database called ""mydb"". The asterisk shows which template was selected - so in
this case, Datasette found a template file called query-mydb-tz.html and
used that - but if that template had not been found, it would have tried for
query-mydb.html or the default query.html .
It is possible to extend the default templates using Jinja template
inheritance. If you want to customize EVERY row template with some additional
content you can do so by creating a row.html template like this:
{% extends ""default:row.html"" %}
{% block content %}
EXTRA HTML AT THE TOP OF THE CONTENT BLOCK
This line renders the original block:
{{ super() }}
{% endblock %}
Note the default:row.html template name, which ensures Jinja will inherit
from the default template.
The _table.html template is included by both the row and the table pages, and
is passed a list of rows to render. The default _table.html template renders
them as an HTML table and can be seen here .
You can provide a custom template that applies to all of your databases and
tables, or you can provide custom templates for specific tables using the
template naming scheme described above.
If you want to present your data in a format other than an HTML table, you
can do so by looping through display_rows in your own _table.html
template. You can use {{ row[""column_name""] }} to output the raw value
of a specific column.
If you want to output the rendered HTML version of a column, including any
links to foreign keys, you can use {{ row.display(""column_name"") }} .
Here is an example of a custom _table.html template:
{% for row in display_rows %}
{{ row[""title""] }}
{{ row[""description""] }}
Category: {{ row.display(""category_id"") }}
{% endfor %}","[""Custom pages and templates"", ""Publishing static assets""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/datasette/templates/_table.html"", ""label"": ""can be seen here""}]"
metadata:metadata-default-sort,metadata,metadata-default-sort,Setting a default sort order,"By default Datasette tables are sorted by primary key. You can over-ride this default for a specific table using the ""sort"" or ""sort_desc"" metadata properties:
[[[cog
metadata_example(cog, {
""databases"": {
""mydatabase"": {
""tables"": {
""example_table"": {
""sort"": ""created""
}
}
}
}
})
]]]
[[[end]]]
Or use ""sort_desc"" to sort in descending order:
[[[cog
metadata_example(cog, {
""databases"": {
""mydatabase"": {
""tables"": {
""example_table"": {
""sort_desc"": ""created""
}
}
}
}
})
]]]
[[[end]]]","[""Metadata""]",[]
changelog:id82,changelog,id82,0.29.2 (2019-07-13),"Bumped Uvicorn to 0.8.4, fixing a bug where the query string was not included in the server logs. ( #559 )
Fixed bug where the navigation breadcrumbs were not displayed correctly on the page for a custom query. ( #558 )
Fixed bug where custom query names containing unicode characters caused errors.","[""Changelog""]","[{""href"": ""https://www.uvicorn.org/"", ""label"": ""Uvicorn""}, {""href"": ""https://github.com/simonw/datasette/issues/559"", ""label"": ""#559""}, {""href"": ""https://github.com/simonw/datasette/issues/558"", ""label"": ""#558""}]"
changelog:id115,changelog,id115,0.22.1 (2018-05-23),"Bugfix release, plus we now use versioneer for our version numbers.
Faceting no longer breaks pagination, fixes #282
Add __version_info__ derived from __version__ [Robert Gieseke]
This might be tuple of more than two values (major and minor
version) if commits have been made after a release.
Add version number support with Versioneer. [Robert Gieseke]
Versioneer Licence:
Public Domain (CC0-1.0)
Closes #273
Refactor inspect logic [Russ Garrett]","[""Changelog""]","[{""href"": ""https://github.com/warner/python-versioneer"", ""label"": ""versioneer""}, {""href"": ""https://github.com/simonw/datasette/issues/282"", ""label"": ""#282""}, {""href"": ""https://github.com/simonw/datasette/issues/273"", ""label"": ""#273""}]"
changelog:signed-values-and-secrets,changelog,signed-values-and-secrets,Signed values and secrets,"Both flash messages and user authentication needed a way to sign values and set signed cookies. Two new methods are now available for plugins to take advantage of this mechanism: .sign(value, namespace=""default"") and .unsign(value, namespace=""default"") .
Datasette will generate a secret automatically when it starts up, but to avoid resetting the secret (and hence invalidating any cookies) every time the server restarts you should set your own secret. You can pass a secret to Datasette using the new --secret option or with a DATASETTE_SECRET environment variable. See Configuring the secret for more details.
You can also set a secret when you deploy Datasette using datasette publish or datasette package - see Using secrets with datasette publish .
Plugins can now sign values and verify their signatures using the datasette.sign() and datasette.unsign() methods.","[""Changelog"", ""0.44 (2020-06-11)""]",[]
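The sign/unsign contract can be illustrated with a simplified HMAC-based stand-in (Datasette itself builds on the itsdangerous library; the secret here is a placeholder):

```python
import hashlib
import hmac

SECRET = b"demo-secret"  # stands in for Datasette's configured secret

def sign(value, namespace="default"):
    sig = hmac.new(
        SECRET, f"{namespace}:{value}".encode(), hashlib.sha256
    ).hexdigest()
    return f"{value}.{sig}"

def unsign(signed, namespace="default"):
    value, sig = signed.rsplit(".", 1)
    expected = hmac.new(
        SECRET, f"{namespace}:{value}".encode(), hashlib.sha256
    ).hexdigest()
    if not hmac.compare_digest(sig, expected):
        raise ValueError("signature does not match")
    return value
```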
contributing:contributing-formatting-black,contributing,contributing-formatting-black,Running Black,"Black will be installed when you run pip install -e '.[test]' . To test that your code complies with Black, run the following in your root datasette repository checkout:
black . --check
All done! ✨ 🍰 ✨
95 files would be left unchanged.
If any of your code does not conform to Black you can run this to automatically fix those problems:
black .
reformatted ../datasette/setup.py
All done! ✨ 🍰 ✨
1 file reformatted, 94 files left unchanged.","[""Contributing"", ""Code formatting""]",[]
changelog:id155,changelog,id155,0.16 (2018-04-13),"Better mechanism for handling errors; 404s for missing table/database
New error mechanism closes #193
404s for missing tables/databases closes #184
long_description in markdown for the new PyPI
Hide SpatiaLite system tables. [Russ Garrett]
Allow explain select / explain query plan select #201
Datasette inspect now finds primary_keys #195
Ability to sort using form fields (for mobile portrait mode) #199
We now display sort options as a select box plus a descending checkbox, which
means you can apply sort orders even in portrait mode on a mobile phone where
the column headers are hidden.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/193"", ""label"": ""#193""}, {""href"": ""https://github.com/simonw/datasette/issues/184"", ""label"": ""#184""}, {""href"": ""https://github.com/simonw/datasette/issues/201"", ""label"": ""#201""}, {""href"": ""https://github.com/simonw/datasette/issues/195"", ""label"": ""#195""}, {""href"": ""https://github.com/simonw/datasette/issues/199"", ""label"": ""#199""}]"
changelog:named-in-memory-database-support,changelog,named-in-memory-database-support,Named in-memory database support,"As part of the work building the _internal database, Datasette now supports named in-memory databases that can be shared across multiple connections. This allows plugins to create in-memory databases which will persist data for the lifetime of the Datasette server process. ( #1151 )
The new memory_name= parameter to the Database class can be used to create named, shared in-memory databases.","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1151"", ""label"": ""#1151""}]"
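The plain-sqlite3 equivalent of a shared named in-memory database looks like this; Datasette's memory_name= parameter wraps the same SQLite shared-cache feature:

```python
import sqlite3

# Two separate connections to the same named in-memory database,
# using SQLite's shared-cache URI syntax
uri = "file:my_shared_db?mode=memory&cache=shared"
conn1 = sqlite3.connect(uri, uri=True)
conn2 = sqlite3.connect(uri, uri=True)

conn1.execute("create table notes (body text)")
conn1.execute("insert into notes values ('hello')")
conn1.commit()
# conn2 can now see the data written through conn1,
# for as long as at least one connection stays open
```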
changelog:the-internal-database,changelog,the-internal-database,The _internal database,"As part of ongoing work to help Datasette handle much larger numbers of connected databases and tables (see Datasette Library ) Datasette now maintains an in-memory SQLite database with details of all of the attached databases, tables, columns, indexes and foreign keys. ( #1150 )
This will support future improvements such as a searchable, paginated homepage of all available tables.
You can explore an example of this database by signing in as root to the latest.datasette.io demo instance and then navigating to latest.datasette.io/_internal .
Plugins can use these tables to introspect attached data in an efficient way. Plugin authors should note that this is not yet considered a stable interface, so any plugins that use this may need to make changes prior to Datasette 1.0 if the _internal table schemas change.","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/simonw/datasette/issues/417"", ""label"": ""Datasette Library""}, {""href"": ""https://github.com/simonw/datasette/issues/1150"", ""label"": ""#1150""}, {""href"": ""https://latest.datasette.io/login-as-root"", ""label"": ""signing in as root""}, {""href"": ""https://latest.datasette.io/_internal"", ""label"": ""latest.datasette.io/_internal""}]"
sql_queries:id1,sql_queries,id1,Canned queries,"As an alternative to adding views to your database, you can define canned queries inside your datasette.yaml file. Here's an example:
[[[cog
from metadata_doc import config_example
config_example(cog, {
""databases"": {
""sf-trees"": {
""queries"": {
""just_species"": {
""sql"": ""select qSpecies from Street_Tree_List""
}
}
}
}
})
]]]
[[[end]]]
Then run Datasette like this:
datasette sf-trees.db -c datasette.yaml
Each canned query will be listed on the database index page, and will also get its own URL at:
/database-name/canned-query-name
For the above example, that URL would be:
/sf-trees/just_species
You can optionally include ""title"" and ""description"" keys to show a title and description on the canned query page. As with regular table metadata you can alternatively specify ""description_html"" to have your description rendered as HTML (rather than having HTML special characters escaped).","[""Running SQL queries""]",[]
contributing:contributing-debugging,contributing,contributing-debugging,Debugging,"Any errors that occur while Datasette is running while display a stack trace on the console.
You can tell Datasette to open an interactive pdb debugger session if an error occurs using the --pdb option:
datasette --pdb fixtures.db","[""Contributing""]",[]
changelog:csv-export,changelog,csv-export,CSV export,"Any Datasette table, view or custom SQL query can now be exported as CSV.
Check out the CSV export documentation for more details, or
try the feature out on
https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies
If your table has more than max_returned_rows (default 1,000)
Datasette provides the option to stream all rows . This option takes advantage
of async Python and Datasette's efficient pagination to
iterate through the entire matching result set and stream it back as a
downloadable CSV file.","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies"", ""label"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies""}]"
csv_export:id1,csv_export,id1,CSV export,"Any Datasette table, view or custom SQL query can be exported as CSV.
To obtain the CSV representation of the table you are looking at, click the ""this
data as CSV"" link.
You can also use the advanced export form for more control over the resulting
file, which looks like this and has the following options:
download file - instead of displaying CSV in your browser, this forces
your browser to download the CSV to your downloads directory.
expand labels - if your table has any foreign key references this option
will cause the CSV to gain additional COLUMN_NAME_label columns with a
label for each foreign key derived from the linked table. In this example
the city_id column is accompanied by a city_id_label column.
stream all rows - by default CSV files only contain the first
max_returned_rows records. This option will cause Datasette to
loop through every matching record and return them as a single CSV file.
You can try that out on https://latest.datasette.io/fixtures/facetable?_size=4",[],"[{""href"": ""https://latest.datasette.io/fixtures/facetable.csv?_labels=on&_size=max"", ""label"": ""In this example""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_size=4"", ""label"": ""https://latest.datasette.io/fixtures/facetable?_size=4""}]"
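Those form options map onto query string parameters; a small sketch of assembling an export URL by hand, using the _labels and _size parameters seen in the example link above:

```python
from urllib.parse import urlencode

def csv_export_url(table_url, **options):
    # e.g. csv_export_url(u, _labels="on", _size="max")
    return table_url + ".csv?" + urlencode(options)

url = csv_export_url(
    "https://latest.datasette.io/fixtures/facetable",
    _labels="on",
    _size="max",
)
```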
spatialite:importing-geojson-polygons-using-shapely,spatialite,importing-geojson-polygons-using-shapely,Importing GeoJSON polygons using Shapely,"Another common form of polygon data is the GeoJSON format. This can be imported into SpatiaLite directly, or by using the Shapely Python library.
Who's On First is an excellent source of openly licensed GeoJSON polygons. Let's import the geographical polygon for Wales. First, we can use the Who's On First Spelunker tool to find the record for Wales:
spelunker.whosonfirst.org/id/404227475
That page includes a link to the GeoJSON record, which can be accessed here:
data.whosonfirst.org/404/227/475/404227475.geojson
Here's Python code to create a SQLite database, enable SpatiaLite, create a places table and then add a record for Wales:
import sqlite3
conn = sqlite3.connect(""places.db"")
# Enable SpatiaLite extension
conn.enable_load_extension(True)
conn.load_extension(""/usr/local/lib/mod_spatialite.dylib"")
# Create the basic places table
conn.execute(""select InitSpatialMetadata(1)"")
conn.execute(
""create table places (id integer primary key, name text);""
)
# Add a MULTIPOLYGON Geometry column
conn.execute(
""SELECT AddGeometryColumn('places', 'geom', 4326, 'MULTIPOLYGON', 2);""
)
# Add a spatial index against the new column
conn.execute(""SELECT CreateSpatialIndex('places', 'geom');"")
# Now populate the table
from shapely.geometry.multipolygon import MultiPolygon
from shapely.geometry import shape
import requests
geojson = requests.get(
""https://data.whosonfirst.org/404/227/475/404227475.geojson""
).json()
# Convert to ""Well Known Text"" format
wkt = shape(geojson[""geometry""]).wkt
# Insert and commit the record
conn.execute(
""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"",
(""Wales"", wkt),
)
conn.commit()","[""SpatiaLite""]","[{""href"": ""https://pypi.org/project/Shapely/"", ""label"": ""Shapely""}, {""href"": ""https://whosonfirst.org/"", ""label"": ""Who's On First""}, {""href"": ""https://spelunker.whosonfirst.org/id/404227475/"", ""label"": ""spelunker.whosonfirst.org/id/404227475""}, {""href"": ""https://data.whosonfirst.org/404/227/475/404227475.geojson"", ""label"": ""data.whosonfirst.org/404/227/475/404227475.geojson""}]"
json_api:tableupsertview,json_api,tableupsertview,Upserting rows,"An upsert is an insert or update operation. If a row with a matching primary key already exists it will be updated - otherwise a new row will be inserted.
The upsert API is mostly the same shape as the insert API . It requires both the insert-row and update-row permissions.
POST /<database>/<table>/-/upsert
Content-Type: application/json
Authorization: Bearer dstok_<rest-of-token>
{
""rows"": [
{
""id"": 1,
""title"": ""Updated title for 1"",
""description"": ""Updated description for 1""
},
{
""id"": 2,
""description"": ""Updated description for 2""
},
{
""id"": 3,
""title"": ""Item 3"",
""description"": ""Description for 3""
}
]
}
Imagine a table with a primary key of id that already has rows with id values of 1 and 2 .
The above example will:
Update the row with id of 1 to set both title and description to the new values
Update the row with id of 2 to set title to the new value - description will be left unchanged
Insert a new row with id of 3 and both title and description set to the new values
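The full-row part of that behavior maps onto SQLite's own upsert clause. A plain-sqlite3 sketch (Datasette performs the equivalent server-side, and also handles partial rows like the id 2 example above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "create table items (id integer primary key, title text, description text)"
)
conn.executemany(
    "insert into items values (?, ?, ?)",
    [(1, "Item 1", "Old description"), (2, "Item 2", "Old description")],
)

rows = [
    {"id": 1, "title": "Updated title for 1", "description": "Updated description for 1"},
    {"id": 3, "title": "Item 3", "description": "Description for 3"},
]
for row in rows:
    # Update on primary key conflict, insert otherwise
    conn.execute(
        "insert into items (id, title, description) "
        "values (:id, :title, :description) "
        "on conflict (id) do update set "
        "title = excluded.title, description = excluded.description",
        row,
    )
conn.commit()
```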
Similar to /-/insert , a row key with an object can be used instead of a rows array to upsert a single row.
If successful, this will return a 200 status code and a {""ok"": true} response body.
Add ""return"": true to the request body to return full copies of the affected rows after they have been inserted or updated:
{
""rows"": [
{
""id"": 1,
""title"": ""Updated title for 1"",
""description"": ""Updated description for 1""
},
{
""id"": 2,
""description"": ""Updated description for 2""
},
{
""id"": 3,
""title"": ""Item 3"",
""description"": ""Description for 3""
}
],
""return"": true
}
This will return the following:
{
""ok"": true,
""rows"": [
{
""id"": 1,
""title"": ""Updated title for 1"",
""description"": ""Updated description for 1""
},
{
""id"": 2,
""title"": ""Item 2"",
""description"": ""Updated description for 2""
},
{
""id"": 3,
""title"": ""Item 3"",
""description"": ""Description for 3""
}
]
}
When using upsert you must provide the primary key column (or columns if the table has a compound primary key) for every row, or you will get a 400 error:
{
""ok"": false,
""errors"": [
""Row 0 is missing primary key column(s): \""id\""""
]
}
If your table does not have an explicit primary key you should pass the SQLite rowid key instead.
Pass ""alter"": true to automatically add any missing columns to the table. This requires the alter-table permission.","[""JSON API"", ""The JSON write API""]",[]
index:datasette,index,datasette,Datasette,"An open source multi-tool for exploring and publishing data
Datasette is a tool for exploring and publishing data. It helps people take data of any shape or size and publish that as an interactive, explorable website and accompanying API.
Datasette is aimed at data journalists, museum curators, archivists, local governments and anyone else who has data that they wish to share with the world. It is part of a wider ecosystem of tools and plugins dedicated to making working with structured data as productive as possible.
Explore a demo , watch a presentation about the project or Try Datasette without installing anything using Glitch .
Interested in learning Datasette? Start with the official tutorials .
Support questions, feedback? Join the Datasette Discord .",[],"[{""href"": ""https://pypi.org/project/datasette/"", ""label"": null}, {""href"": ""https://docs.datasette.io/en/stable/changelog.html"", ""label"": null}, {""href"": ""https://pypi.org/project/datasette/"", ""label"": null}, {""href"": ""https://github.com/simonw/datasette/actions?query=workflow%3ATest"", ""label"": null}, {""href"": ""https://github.com/simonw/datasette/blob/main/LICENSE"", ""label"": null}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": null}, {""href"": ""https://datasette.io/discord"", ""label"": null}, {""href"": ""https://pypi.org/project/datasette/"", ""label"": null}, {""href"": ""https://docs.datasette.io/en/stable/changelog.html"", ""label"": null}, {""href"": ""https://pypi.org/project/datasette/"", ""label"": null}, {""href"": ""https://github.com/simonw/datasette/actions?query=workflow%3ATest"", ""label"": null}, {""href"": ""https://github.com/simonw/datasette/blob/main/LICENSE"", ""label"": null}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": null}, {""href"": ""https://datasette.io/discord"", ""label"": null}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight"", ""label"": ""Explore a demo""}, {""href"": ""https://static.simonwillison.net/static/2018/pybay-datasette/"", ""label"": ""a presentation about the project""}, {""href"": ""https://datasette.io/tutorials"", ""label"": ""the official tutorials""}, {""href"": ""https://datasette.io/discord"", ""label"": ""Datasette Discord""}]"
contributing:contributing-alpha-beta,contributing,contributing-alpha-beta,Alpha and beta releases,"Alpha and beta releases are published to preview upcoming features that may not yet be stable - in particular to preview new plugin hooks.
You are welcome to try these out, but please be aware that details may change before the final release.
Please join discussions on the issue tracker to share your thoughts and experiences with alpha and beta features that you try out.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/issues"", ""label"": ""discussions on the issue tracker""}]"
settings:setting-allow-facet,settings,setting-allow-facet,allow_facet,"Allow users to specify columns they would like to facet on using the ?_facet=COLNAME URL parameter to the table view.
This is enabled by default. If disabled, facets will still be displayed if they have been specifically enabled in metadata.json configuration for the table.
Here's how to disable this feature:
datasette mydatabase.db --setting allow_facet off","[""Settings"", ""Settings""]",[]
internals:datasette-add-memory-database,internals,datasette-add-memory-database,.add_memory_database(name),"Adds a shared in-memory database with the specified name:
datasette.add_memory_database(""statistics"")
This is a shortcut for the following:
from datasette.database import Database
datasette.add_database(
Database(datasette, memory_name=""statistics"")
)
Using either of these patterns will result in the in-memory database being served at /statistics .","[""Internals for plugins"", ""Datasette class""]",[]
sql_queries:canned-queries-options,sql_queries,canned-queries-options,Additional canned query options,Additional options can be specified for canned queries in the YAML or JSON configuration.,"[""Running SQL queries"", ""Canned queries""]",[]
changelog:id71,changelog,id71,0.35 (2020-02-04),"Added five new plugins and one new conversion tool to The Datasette Ecosystem .
The Datasette class has a new render_template() method which can be used by plugins to render templates using Datasette's pre-configured Jinja templating library.
You can now execute SQL queries that start with a -- comment - thanks, Jay Graves ( #653 )","[""Changelog""]","[{""href"": ""https://jinja.palletsprojects.com/"", ""label"": ""Jinja""}, {""href"": ""https://github.com/simonw/datasette/pull/653"", ""label"": ""#653""}]"
changelog:id208,changelog,id208,0.11 (2017-11-14),"Added datasette publish now --force option.
This calls now with --force - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer.
Enable --cors by default when running in a container.","[""Changelog""]",[]
changelog:id198,changelog,id198,0.12 (2017-11-16),"Added __version__ , now displayed as tooltip in page footer ( #108 ).
Added initial docs, including a changelog ( #99 ).
Turned on auto-escaping in Jinja.
Added a UI for editing named parameters ( #96 ).
You can now construct a custom SQL statement using SQLite named
parameters (e.g. :name ) and datasette will display form fields for
editing those parameters. Here’s an example which lets you see the
most popular names for dogs of different species registered through
various dog registration schemes in Australia.
Pin to specific Jinja version. ( #100 ).
Default to 127.0.0.1 not 0.0.0.0. ( #98 ).
Added extra metadata options to publish and package commands. ( #92 ).
You can now run these commands like so:
datasette publish now mydb.db \
--title=""My Title"" \
--source=""Source"" \
--source_url=""http://www.example.com/"" \
--license=""CC0"" \
--license_url=""https://creativecommons.org/publicdomain/zero/1.0/""
This will write those values into the metadata.json that is packaged with the
app. If you also pass --metadata=metadata.json that file will be updated with the extra
values before being written into the Docker image.
Added production-ready Dockerfile ( #94 ) [Andrew
Cutler]
New ?_sql_time_limit_ms=10 argument to database and table page ( #95 )
SQL syntax highlighting with Codemirror ( #89 ) [Tom Dyson]","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/108"", ""label"": ""#108""}, {""href"": ""https://github.com/simonw/datasette/issues/99"", ""label"": ""#99""}, {""href"": ""https://github.com/simonw/datasette/issues/96"", ""label"": ""#96""}, {""href"": ""https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug"", ""label"": ""Here’s an example""}, {""href"": ""https://github.com/simonw/datasette/issues/100"", ""label"": ""#100""}, {""href"": ""https://github.com/simonw/datasette/issues/98"", ""label"": ""#98""}, {""href"": 
""https://github.com/simonw/datasette/issues/92"", ""label"": ""#92""}, {""href"": ""https://github.com/simonw/datasette/issues/94"", ""label"": ""#94""}, {""href"": ""https://github.com/simonw/datasette/issues/95"", ""label"": ""#95""}, {""href"": ""https://github.com/simonw/datasette/issues/89"", ""label"": ""#89""}]"
changelog:id80,changelog,id80,0.30 (2019-10-18),"Added /-/threads debugging page
Allow EXPLAIN WITH... ( #583 )
Button to format SQL - thanks, Tobias Kunze ( #136 )
Sort databases on homepage by argument order - thanks, Tobias Kunze ( #585 )
Display metadata footer on custom SQL queries - thanks, Tobias Kunze ( #589 )
Use --platform=managed for publish cloudrun ( #587 )
Fixed bug returning non-ASCII characters in CSV ( #584 )
Fix for /foo v.s. /foo-bar bug ( #601 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/583"", ""label"": ""#583""}, {""href"": ""https://github.com/simonw/datasette/issues/136"", ""label"": ""#136""}, {""href"": ""https://github.com/simonw/datasette/issues/585"", ""label"": ""#585""}, {""href"": ""https://github.com/simonw/datasette/pull/589"", ""label"": ""#589""}, {""href"": ""https://github.com/simonw/datasette/issues/587"", ""label"": ""#587""}, {""href"": ""https://github.com/simonw/datasette/issues/584"", ""label"": ""#584""}, {""href"": ""https://github.com/simonw/datasette/issues/601"", ""label"": ""#601""}]"
changelog:id211,changelog,id211,0.9 (2017-11-13),"Added --sql_time_limit_ms and --extra-options .
The serve command now accepts --sql_time_limit_ms for customizing the SQL time
limit.
The publish and package commands now accept --extra-options which can be used
to specify additional options to be passed to the datasette serve command when
it executes inside the resulting Docker containers.","[""Changelog""]",[]
authentication:permissions-permissions-debug,authentication,permissions-permissions-debug,permissions-debug,"Actor is allowed to view the /-/permissions debug page.
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-view-table,authentication,permissions-view-table,view-table,"Actor is allowed to view a table (or view) page, e.g. https://latest.datasette.io/fixtures/complex_foreign_keys
resource - tuple: (string, string)
The name of the database, then the name of the table
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures/complex_foreign_keys"", ""label"": ""https://latest.datasette.io/fixtures/complex_foreign_keys""}]"
authentication:permissions-view-database,authentication,permissions-view-database,view-database,"Actor is allowed to view a database page, e.g. https://latest.datasette.io/fixtures
resource - string
The name of the database
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures"", ""label"": ""https://latest.datasette.io/fixtures""}]"
authentication:permissions-view-query,authentication,permissions-view-query,view-query,"Actor is allowed to view (and execute) a canned query page, e.g. https://latest.datasette.io/fixtures/pragma_cache_size - this includes executing Writable canned queries .
resource - tuple: (string, string)
The name of the database, then the name of the canned query
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures/pragma_cache_size"", ""label"": ""https://latest.datasette.io/fixtures/pragma_cache_size""}]"
authentication:permissions-update-row,authentication,permissions-update-row,update-row,"Actor is allowed to update rows in a table.
resource - tuple: (string, string)
The name of the database, then the name of the table
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-execute-sql,authentication,permissions-execute-sql,execute-sql,"Actor is allowed to run arbitrary SQL queries against a specific database, e.g. https://latest.datasette.io/fixtures?sql=select+100
resource - string
The name of the database
Default allow . See also the default_allow_sql setting .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures?sql=select+100"", ""label"": ""https://latest.datasette.io/fixtures?sql=select+100""}]"
authentication:permissions-insert-row,authentication,permissions-insert-row,insert-row,"Actor is allowed to insert rows into a table.
resource - tuple: (string, string)
The name of the database, then the name of the table
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-drop-table,authentication,permissions-drop-table,drop-table,"Actor is allowed to drop a database table.
resource - tuple: (string, string)
The name of the database, then the name of the table
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-view-database-download,authentication,permissions-view-database-download,view-database-download,"Actor is allowed to download a database, e.g. https://latest.datasette.io/fixtures.db
resource - string
The name of the database
Default allow .","[""Authentication and permissions"", ""Built-in permissions""]","[{""href"": ""https://latest.datasette.io/fixtures.db"", ""label"": ""https://latest.datasette.io/fixtures.db""}]"
authentication:permissions-delete-row,authentication,permissions-delete-row,delete-row,"Actor is allowed to delete rows from a table.
resource - tuple: (string, string)
The name of the database, then the name of the table
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-create-table,authentication,permissions-create-table,create-table,"Actor is allowed to create a database table.
resource - string
The name of the database
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
authentication:permissions-alter-table,authentication,permissions-alter-table,alter-table,"Actor is allowed to alter a database table.
resource - tuple: (string, string)
The name of the database, then the name of the table
Default deny .","[""Authentication and permissions"", ""Built-in permissions""]",[]
plugin_hooks:plugin-actions,plugin_hooks,plugin-actions,Action hooks,"Action hooks can be used to add items to the action menus that appear at the top of different pages within Datasette. Unlike menu_links() , which adds links that are displayed on every page, these actions should only be relevant to the page the user is currently viewing.
Each of these hooks should return a list of {""href"": ""..."", ""label"": ""...""} menu items, with optional ""description"": ""..."" keys describing each action in more detail.
They can alternatively return an async def awaitable function which, when called, returns a list of those menu items.","[""Plugin hooks""]",[]
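A minimal sketch of what such a hook implementation might return - the function body only, with a made-up href; a real plugin would register this with Datasette's @hookimpl decorator:

```python
def table_actions(datasette=None, actor=None, database=None, table=None):
    # Sketch of an action hook body; a real plugin would decorate this
    # with @hookimpl from the datasette package. The href is made up.
    if actor is None:
        # Only offer the action to signed-in actors
        return []
    return [
        {
            "href": "/-/hypothetical-export",
            "label": "Export this table",
            "description": "A hypothetical action, for illustration only",
        }
    ]

items = table_actions(actor={"id": "root"}, database="fixtures", table="facetable")
```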
changelog:id84,changelog,id84,0.29 (2019-07-07),"ASGI, new plugin hooks, facet by date and much, much more...","[""Changelog""]",[]
changelog:asgi,changelog,asgi,ASGI,"ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic .
I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down .
The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://asgi.readthedocs.io/"", ""label"": ""ASGI""}, {""href"": ""https://github.com/simonw/datasette/issues/272"", ""label"": ""Port Datasette to ASGI #272""}, {""href"": ""https://www.uvicorn.org/"", ""label"": ""Uvicorn""}, {""href"": ""https://github.com/huge-success/sanic"", ""label"": ""Sanic""}, {""href"": ""https://simonwillison.net/2019/Jun/23/datasette-asgi/"", ""label"": ""Porting Datasette to ASGI, and Turtles all the way down""}]"
changelog:id63,changelog,id63,0.42 (2020-05-08),"A small release which provides improved internal methods for use in plugins, along with documentation. See #685 .
Added documentation for db.execute() , see await db.execute(sql, ...) .
Renamed db.execute_against_connection_in_thread() to db.execute_fn() and made it a documented method, see await db.execute_fn(fn) .
New results.first() and results.single_value() methods, plus documentation for the Results class - see Results .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/685"", ""label"": ""#685""}]"
changelog:id94,changelog,id94,0.24 (2018-07-23),"A number of small new features:
datasette publish heroku now supports --extra-options , fixes #334
Custom error message if SpatiaLite is needed for specified database, closes #331
New config option: truncate_cells_html for truncating long cell values in HTML view - closes #330
Documentation for datasette publish and datasette package , closes #337
Fixed compatibility with Python 3.7
datasette publish heroku now supports app names via the -n option, which can also be used to overwrite an existing application [Russ Garrett]
Title and description metadata can now be set for canned SQL queries , closes #342
New force_https_on config option, fixes https:// API URLs when deploying to Zeit Now - closes #333
?_json_infinity=1 query string argument for handling Infinity/-Infinity values in JSON, closes #332
URLs displayed in the results of custom SQL queries are now URLified, closes #298","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/334"", ""label"": ""#334""}, {""href"": ""https://github.com/simonw/datasette/issues/331"", ""label"": ""#331""}, {""href"": ""https://github.com/simonw/datasette/issues/330"", ""label"": ""#330""}, {""href"": ""https://github.com/simonw/datasette/issues/337"", ""label"": ""#337""}, {""href"": ""https://github.com/simonw/datasette/issues/342"", ""label"": ""#342""}, {""href"": ""https://github.com/simonw/datasette/issues/333"", ""label"": ""#333""}, {""href"": ""https://github.com/simonw/datasette/issues/332"", ""label"": ""#332""}, {""href"": ""https://github.com/simonw/datasette/issues/298"", ""label"": ""#298""}]"
changelog:plugins-can-now-add-links-within-datasette,changelog,plugins-can-now-add-links-within-datasette,Plugins can now add links within Datasette,"A number of existing Datasette plugins add new pages to the Datasette interface, providing tools for things like uploading CSVs , editing table schemas or configuring full-text search .
Plugins like this can now link to themselves from other parts of the Datasette interface. The menu_links(datasette, actor, request) hook ( #1064 ) lets plugins add links to Datasette's new top-right application menu, and the table_actions(datasette, actor, database, table, request) hook ( #1066 ) adds links to a new ""table actions"" menu on the table page.
The demo at latest.datasette.io now includes some example plugins. To see the new table actions menu first sign into that demo as root and then visit the facetable table to see the new cog icon menu at the top of the page.","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://github.com/simonw/datasette-upload-csvs"", ""label"": ""uploading CSVs""}, {""href"": ""https://github.com/simonw/datasette-edit-schema"", ""label"": ""editing table schemas""}, {""href"": ""https://github.com/simonw/datasette-configure-fts"", ""label"": ""configuring full-text search""}, {""href"": ""https://github.com/simonw/datasette/issues/1064"", ""label"": ""#1064""}, {""href"": ""https://github.com/simonw/datasette/issues/1066"", ""label"": ""#1066""}, {""href"": ""https://latest.datasette.io/"", ""label"": ""latest.datasette.io""}, {""href"": ""https://latest.datasette.io/login-as-root"", ""label"": ""sign into that demo as root""}, {""href"": ""https://latest.datasette.io/fixtures/facetable"", ""label"": ""facetable""}]"
changelog:id45,changelog,id45,0.51 (2020-10-31),"A new visual design, plugin hooks for adding navigation options, better handling of binary data, URL building utility methods and better support for running Datasette behind a proxy.","[""Changelog""]",[]
metadata:id2,metadata,id2,Metadata reference,A full reference of every supported option in a metadata.json or metadata.yaml file.,"[""Metadata""]",[]
installation:installation-docker,installation,installation-docker,Using Docker,"A Docker image containing the latest release of Datasette is published to Docker
Hub here: https://hub.docker.com/r/datasetteproject/datasette/
If you have Docker installed (for example with Docker for Mac on OS X) you can download and run this
image like so:
docker run -p 8001:8001 -v `pwd`:/mnt \
datasetteproject/datasette \
datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db
This will start an instance of Datasette running on your machine's port 8001,
serving the fixtures.db file in your current directory.
Now visit http://127.0.0.1:8001/ to access Datasette.
(You can download a copy of fixtures.db from
https://latest.datasette.io/fixtures.db )
To upgrade to the most recent release of Datasette, run the following:
docker pull datasetteproject/datasette","[""Installation"", ""Advanced installation options""]","[{""href"": ""https://hub.docker.com/r/datasetteproject/datasette/"", ""label"": ""https://hub.docker.com/r/datasetteproject/datasette/""}, {""href"": ""https://www.docker.com/docker-mac"", ""label"": ""Docker for Mac""}, {""href"": ""http://127.0.0.1:8001/"", ""label"": ""http://127.0.0.1:8001/""}, {""href"": ""https://latest.datasette.io/fixtures.db"", ""label"": ""https://latest.datasette.io/fixtures.db""}]"
changelog:id85,changelog,id85,0.28 (2019-05-19),A salmagundi of new features!,"[""Changelog""]","[{""href"": ""https://adamj.eu/tech/2019/01/18/a-salmagundi-of-django-alpha-announcements/"", ""label"": ""salmagundi""}]"
json_api:json-api-table-arguments,json_api,json-api-table-arguments,Special table arguments,"?_col=COLUMN1&_col=COLUMN2
List specific columns to display. These will be shown along with any primary keys.
?_nocol=COLUMN1&_nocol=COLUMN2
List specific columns to hide - any column not listed will be displayed. Primary keys cannot be hidden.
?_labels=on/off
Expand foreign key references for every possible column. See below.
?_label=COLUMN1&_label=COLUMN2
Expand foreign key references for one or more specified columns.
?_size=1000 or ?_size=max
Sets a custom page size. This cannot exceed the max_returned_rows limit
passed to datasette serve . Use max to get max_returned_rows .
?_sort=COLUMN
Sorts the results by the specified column.
?_sort_desc=COLUMN
Sorts the results by the specified column in descending order.
?_search=keywords
For SQLite tables that have been configured for
full-text search executes a search
with the provided keywords.
?_search_COLUMN=keywords
Like _search= but allows you to specify the column to be searched, as
opposed to searching all columns that have been indexed by FTS.
?_searchmode=raw
With this option, queries passed to ?_search= or ?_search_COLUMN= will
not have special characters escaped. This means you can make use of the full
set of advanced SQLite FTS syntax ,
though this could potentially result in errors if the wrong syntax is used.
?_where=SQL-fragment
If the execute-sql permission is enabled, this parameter
can be used to pass one or more additional SQL fragments to be used in the
WHERE clause of the SQL used to query the table.
This is particularly useful if you are building a JavaScript application
that needs to do something creative but still wants the other conveniences
provided by the table view (such as faceting) and hence would like not to
have to construct a completely custom SQL query.
Some examples:
facetable?_where=_neighborhood like ""%c%""&_where=_city_id=3
facetable?_where=_city_id in (select id from facet_cities where name != ""Detroit"")
?_through={json}
This can be used to filter rows via a join against another table.
The JSON parameter must include three keys: table , column and value .
table must be a table that the current table is related to via a foreign key relationship.
column must be a column in that other table.
value is the value that you want to match against.
For example, to filter roadside_attractions to just show the attractions that have a characteristic of ""museum"", you would construct this JSON:
{
""table"": ""roadside_attraction_characteristics"",
""column"": ""characteristic_id"",
""value"": ""1""
}
As a URL, that looks like this:
?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}
Here's an example .
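Building that ?_through= value by hand is fiddly; a short sketch of constructing it with Python's standard library, reusing the values from the example above:

```python
import json
from urllib.parse import urlencode

# Build the ?_through= parameter from the example above: the JSON object
# is serialized compactly and then URL-encoded
through = {
    "table": "roadside_attraction_characteristics",
    "column": "characteristic_id",
    "value": "1",
}
query_string = urlencode({"_through": json.dumps(through, separators=(",", ":"))})
url = "/fixtures/roadside_attractions?" + query_string
```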
?_next=TOKEN
Pagination by continuation token - pass the token that was returned in the
""next"" property by the previous page.
?_facet=column
Facet by column. Can be applied multiple times, see Facets . Only works on the default JSON output, not on any of the custom shapes.
?_facet_size=100
Increase the number of facet results returned for each facet. Use ?_facet_size=max for the maximum available size, determined by max_returned_rows .
?_nofacet=1
Disable all facets and facet suggestions for this page, including any defined by Facets in metadata .
?_nosuggest=1
Disable facet suggestions for this page.
?_nocount=1
Disable the select count(*) query used on this page - a count of None will be returned instead.","[""JSON API"", ""Table arguments""]","[{""href"": ""https://www.sqlite.org/fts3.html"", ""label"": ""full-text search""}, {""href"": ""https://www.sqlite.org/fts5.html#full_text_query_syntax"", ""label"": ""advanced SQLite FTS syntax""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_where=_neighborhood%20like%20%22%c%%22&_where=_city_id=3"", ""label"": ""facetable?_where=_neighborhood like \""%c%\""&_where=_city_id=3""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_where=_city_id%20in%20(select%20id%20from%20facet_cities%20where%20name%20!=%20%22Detroit%22)"", ""label"": ""facetable?_where=_city_id in (select id from facet_cities where name != \""Detroit\"")""}, {""href"": ""https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}"", ""label"": ""an example""}]"
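Several of these arguments combine naturally when paginating; a sketch of building successive page URLs with ?_size= and the ?_next= continuation token (the path is illustrative; in practice each token comes from the previous response's ""next"" property):

```python
from urllib.parse import urlencode

def page_url(base, size=100, next_token=None):
    # Build a table JSON URL using the ?_size= and ?_next= arguments
    # described above; next_token comes from the prior page's "next" key
    params = {"_size": size}
    if next_token is not None:
        params["_next"] = next_token
    return base + "?" + urlencode(params)

first = page_url("/fixtures/facetable.json")
second = page_url("/fixtures/facetable.json", next_token="abc123")
```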
changelog:id89,changelog,id89,0.26.1 (2019-01-10),"/-/versions now includes SQLite compile_options ( #396 )
datasetteproject/datasette Docker image now uses SQLite 3.26.0 ( #397 )
Cleaned up some deprecation warnings under Python 3.7","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/396"", ""label"": ""#396""}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": ""datasetteproject/datasette""}, {""href"": ""https://github.com/simonw/datasette/issues/397"", ""label"": ""#397""}]"
changelog:id78,changelog,id78,0.30.2 (2019-11-02),"/-/plugins page now uses distribution name e.g. datasette-cluster-map instead of the name of the underlying Python package ( datasette_cluster_map ) ( #606 )
Array faceting is now only suggested for columns that contain arrays of strings ( #562 )
Better documentation for the --host argument ( #574 )
Don't show None with a broken link for the label on a nullable foreign key ( #406 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/606"", ""label"": ""#606""}, {""href"": ""https://github.com/simonw/datasette/issues/562"", ""label"": ""#562""}, {""href"": ""https://github.com/simonw/datasette/issues/574"", ""label"": ""#574""}, {""href"": ""https://github.com/simonw/datasette/issues/406"", ""label"": ""#406""}]"
metadata:top-level-metadata,metadata,top-level-metadata,Top-level metadata,"""Top-level"" metadata refers to fields that can be specified at the root level of a metadata file. These attributes are meant to describe the entire Datasette instance.
The following is the full list of allowed top-level metadata fields:
title
description
description_html
license
license_url
source
source_url","[""Metadata"", ""Metadata reference""]",[]
metadata:table-level-metadata,metadata,table-level-metadata,Table-level metadata,"""Table-level"" metadata refers to fields that can be specified for each table in a Datasette instance. These attributes should be listed under a specific table using the ""tables"" field.
The following is the full list of allowed table-level metadata fields:
source
source_url
license
license_url
about
about_url
hidden
sort/sort_desc
size
sortable_columns
label_column
facets
fts_table
fts_pk
searchmode
columns","[""Metadata"", ""Metadata reference""]",[]
metadata:database-level-metadata,metadata,database-level-metadata,Database-level metadata,"""Database-level"" metadata refers to fields that can be specified for each database in a Datasette instance. These attributes should be listed under a database inside the ""databases"" field.
The following is the full list of allowed database-level metadata fields:
source
source_url
license
license_url
about
about_url","[""Metadata"", ""Metadata reference""]",[]
installation:installation-basic,installation,installation-basic,Basic installation,,"[""Installation""]",[]
installation:installation-advanced,installation,installation-advanced,Advanced installation options,,"[""Installation""]",[]
spatialite:spatialite-installation,spatialite,spatialite-installation,Installation,,"[""SpatiaLite""]",[]
changelog:id1,changelog,id1,Changelog,,[],[]
changelog:id21,changelog,id21,0.60 (2022-01-13),,"[""Changelog""]",[]
settings:id1,settings,id1,Settings,,[],[]
getting_started:getting-started,getting_started,getting-started,Getting started,,[],[]