id,page,ref,title,content,breadcrumbs,references configuration:configuration-reference-settings,configuration,configuration-reference-settings,Settings,"Settings can be configured in datasette.yaml with the settings key: [[[cog from metadata_doc import config_example import textwrap config_example(cog, textwrap.dedent( """""" # inside datasette.yaml settings: default_allow_sql: off default_page_size: 50 """""").strip() ) ]]] [[[end]]] The full list of settings is available in the settings documentation . Settings can also be passed to Datasette using one or more --setting name value command line options.`","[""Configuration"", null]",[] settings:id1,settings,id1,Settings,,[],[] settings:id2,settings,id2,Settings,"The following options can be set using --setting name value , or by storing them in the settings.json file for use with Configuration directory mode .","[""Settings""]",[] metadata:metadata-sortable-columns,metadata,metadata-sortable-columns,Setting which columns can be used for sorting,"Datasette allows any column to be used for sorting by default. If you need to control which columns are available for sorting you can do so using the optional sortable_columns key: [[[cog metadata_example(cog, { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""sortable_columns"": [ ""height"", ""weight"" ] } } } } }) ]]] [[[end]]] This will restrict sorting of example_table to just the height and weight columns. You can also disable sorting entirely by setting ""sortable_columns"": [] You can use sortable_columns to enable specific sort orders for a view called name_of_view in the database my_database like so: [[[cog metadata_example(cog, { ""databases"": { ""my_database"": { ""tables"": { ""name_of_view"": { ""sortable_columns"": [ ""clicks"", ""impressions"" ] } } } } }) ]]] [[[end]]]","[""Metadata""]",[] contributing:devenvironment,contributing,devenvironment,Setting up a development environment,"If you have Python 3.8 or higher installed on your computer (on OS X the quickest way to do this is using homebrew ) you can install an editable copy of Datasette using the following steps. If you want to use GitHub to publish your changes, first create a fork of datasette under your own GitHub account. Now clone that repository somewhere on your computer: git clone git@github.com:YOURNAME/datasette If you want to get started without creating your own fork, you can do this instead: git clone git@github.com:simonw/datasette The next step is to create a virtual environment for your project and use it to install Datasette's dependencies: cd datasette # Create a virtual environment in ./venv python3 -m venv ./venv # Now activate the virtual environment, so pip can install into it source venv/bin/activate # Install Datasette and its testing dependencies python3 -m pip install -e '.[test]' That last line does most of the work: pip install -e means ""install this package in a way that allows me to edit the source code in place"". 
The .[test] option means ""use the setup.py in this directory and install the optional testing dependencies as well"".","[""Contributing""]","[{""href"": ""https://docs.python-guide.org/starting/install3/osx/"", ""label"": ""is using homebrew""}, {""href"": ""https://github.com/simonw/datasette/fork"", ""label"": ""create a fork of datasette""}]" testing_plugins:testing-plugins-datasette-test-instance,testing_plugins,testing-plugins-datasette-test-instance,Setting up a Datasette test instance,"The above example shows the easiest way to start writing tests against a Datasette instance: from datasette.app import Datasette import pytest @pytest.mark.asyncio async def test_plugin_is_installed(): datasette = Datasette(memory=True) response = await datasette.client.get(""/-/plugins.json"") assert response.status_code == 200 Creating a Datasette() instance like this as useful shortcut in tests, but there is one detail you need to be aware of. It's important to ensure that the async method .invoke_startup() is called on that instance. You can do that like this: datasette = Datasette(memory=True) await datasette.invoke_startup() This method registers any startup(datasette) or prepare_jinja2_environment(env, datasette) plugins that might themselves need to make async calls. If you are using await datasette.client.get() and similar methods then you don't need to worry about this - Datasette automatically calls invoke_startup() the first time it handles a request.","[""Testing plugins""]",[] internals:internals-response-set-cookie,internals,internals-response-set-cookie,Setting cookies with response.set_cookie(),"To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this: def set_cookie( self, key, value="""", max_age=None, expires=None, path=""/"", domain=None, secure=False, httponly=False, samesite=""lax"", ): ... You can use this with datasette.sign() to set signed cookies. Here's how you would set the ds_actor cookie for use with Datasette authentication : response = Response.redirect(""/"") response.set_cookie( ""ds_actor"", datasette.sign({""a"": {""id"": ""cleopaws""}}, ""actor""), ) return response","[""Internals for plugins"", ""Response class""]",[] metadata:metadata-default-sort,metadata,metadata-default-sort,Setting a default sort order,"By default Datasette tables are sorted by primary key. You can over-ride this default for a specific table using the ""sort"" or ""sort_desc"" metadata properties: [[[cog metadata_example(cog, { ""databases"": { ""mydatabase"": { ""tables"": { ""example_table"": { ""sort"": ""created"" } } } } }) ]]] [[[end]]] Or use ""sort_desc"" to sort in descending order: [[[cog metadata_example(cog, { ""databases"": { ""mydatabase"": { ""tables"": { ""example_table"": { ""sort_desc"": ""created"" } } } } }) ]]] [[[end]]]","[""Metadata""]",[] metadata:metadata-page-size,metadata,metadata-page-size,Setting a custom page size,"Datasette defaults to displaying 100 rows per page, for both tables and views. You can change this default page size on a per-table or per-view basis using the ""size"" key in metadata.json : [[[cog metadata_example(cog, { ""databases"": { ""mydatabase"": { ""tables"": { ""example_table"": { ""size"": 10 } } } } }) ]]] [[[end]]] This size can still be over-ridden by passing e.g. 
?_size=50 in the query string.","[""Metadata""]",[] custom_templates:customization-static-files,custom_templates,customization-static-files,Serving static files,"Datasette can serve static files for you, using the --static option. Consider the following directory structure: metadata.json static-files/styles.css static-files/app.js You can start Datasette using --static assets:static-files/ to serve those files from the /assets/ mount point: datasette --config datasette.yaml --static assets:static-files/ --memory The following URLs will now serve the content from those CSS and JS files: http://localhost:8001/assets/styles.css http://localhost:8001/assets/app.js You can reference those files from datasette.yaml like this, see custom CSS and JavaScript for more details: [[[cog from metadata_doc import config_example config_example(cog, """""" extra_css_urls: - /assets/styles.css extra_js_urls: - /assets/app.js """""") ]]] [[[end]]]","[""Custom pages and templates""]",[] javascript_plugins:javascript-datasette-manager-selectors,javascript_plugins,javascript-datasette-manager-selectors,Selectors,"These are available on the selectors property of the datasetteManager object. const DOM_SELECTORS = { /** Should have one match */ jsonExportLink: "".export-links a[href*=json]"", /** Event listeners that go outside of the main table, e.g. existing scroll listener */ tableWrapper: "".table-wrapper"", table: ""table.rows-and-columns"", aboveTablePanel: "".above-table-panel"", // These could have multiple matches /** Used for selecting table headers. Use makeColumnActions if you want to add menu items. */ tableHeaders: `table.rows-and-columns th`, /** Used to add ""where"" clauses to query using direct manipulation */ filterRows: "".filter-row"", /** Used to show top available enum values for a column (""facets"") */ facetResults: "".facet-results [data-column]"", };","[""JavaScript plugins""]",[] plugins:plugins-installed,plugins,plugins-installed,Seeing what plugins are installed,"You can see a list of installed plugins by navigating to the /-/plugins page of your Datasette instance - for example: https://fivethirtyeight.datasettes.com/-/plugins You can also use the datasette plugins command: datasette plugins Which outputs: [ { ""name"": ""datasette_json_html"", ""static"": false, ""templates"": false, ""version"": ""0.4.0"" } ] [[[cog from datasette import cli from click.testing import CliRunner import textwrap, json cog.out(""\n"") result = CliRunner().invoke(cli.cli, [""plugins"", ""--all""]) # cog.out() with text containing newlines was unindenting for some reason cog.outl(""If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:\n"") cog.outl("".. 
code-block:: json\n"") plugins = [p for p in json.loads(result.output) if p[""name""].startswith(""datasette."")] indented = textwrap.indent(json.dumps(plugins, indent=4), "" "") for line in indented.split(""\n""): cog.outl(line) cog.out(""\n\n"") ]]] If you run datasette plugins --all it will include default plugins that ship as part of Datasette: [ { ""name"": ""datasette.actor_auth_cookie"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""actor_from_request"" ] }, { ""name"": ""datasette.blob_renderer"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_output_renderer"" ] }, { ""name"": ""datasette.default_magic_parameters"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_magic_parameters"" ] }, { ""name"": ""datasette.default_menu_links"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""menu_links"" ] }, { ""name"": ""datasette.default_permissions"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""actor_from_request"", ""permission_allowed"", ""register_permissions"", ""skip_csrf"" ] }, { ""name"": ""datasette.events"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_events"" ] }, { ""name"": ""datasette.facets"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""register_facet_classes"" ] }, { ""name"": ""datasette.filters"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""filters_from_request"" ] }, { ""name"": ""datasette.forbidden"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""forbidden"" ] }, { ""name"": ""datasette.handle_exception"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""handle_exception"" ] }, { ""name"": ""datasette.publish.cloudrun"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""publish_subcommand"" ] }, { ""name"": ""datasette.publish.heroku"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""publish_subcommand"" ] }, { ""name"": ""datasette.sql_functions"", ""static"": false, ""templates"": false, ""version"": null, ""hooks"": [ ""prepare_connection"" ] } ] [[[end]]] You can add the --plugins-dir= option to include any plugins found in that directory. Add --requirements to output a list of installed plugins that can then be installed in another Datasette instance using datasette install -r requirements.txt : datasette plugins --requirements The output will look something like this: datasette-codespaces==0.1.1 datasette-graphql==2.2 datasette-json-html==1.0.1 datasette-pretty-json==0.2.2 datasette-x-forwarded-host==0.1 To write that to a requirements.txt file, run this: datasette plugins --requirements > requirements.txt","[""Plugins""]","[{""href"": ""https://fivethirtyeight.datasettes.com/-/plugins"", ""label"": ""https://fivethirtyeight.datasettes.com/-/plugins""}]" changelog:secret-plugin-configuration-options,changelog,secret-plugin-configuration-options,Secret plugin configuration options,"Plugins like datasette-auth-github need a safe way to set secret configuration options. Since the default mechanism for configuring plugins exposes those settings in /-/metadata a new mechanism was needed. 
Secret configuration values describes how plugins can now specify that their settings should be read from a file or an environment variable: { ""plugins"": { ""datasette-auth-github"": { ""client_secret"": { ""$env"": ""GITHUB_CLIENT_SECRET"" } } } } These plugin secrets can be set directly using datasette publish . See Custom metadata and plugins for details. ( #538 and #543 )","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://github.com/simonw/datasette/issues/538"", ""label"": ""#538""}, {""href"": ""https://github.com/simonw/datasette/issues/543"", ""label"": ""#543""}]" plugins:plugins-configuration-secret,plugins,plugins-configuration-secret,Secret configuration values,"Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values. As environment variables . If your secret lives in an environment variable that is available to the Datasette process, you can indicate that the configuration value should be read from that environment variable like so: [[[cog config_example(cog, { ""plugins"": { ""datasette-auth-github"": { ""client_secret"": { ""$env"": ""GITHUB_CLIENT_SECRET"" } } } }) ]]] [[[end]]] As values in separate files . Your secrets can also live in files on disk. To specify a secret should be read from a file, provide the full file path like this: [[[cog config_example(cog, { ""plugins"": { ""datasette-auth-github"": { ""client_secret"": { ""$file"": ""/secrets/client-secret"" } } } }) ]]] [[[end]]] If you are publishing your data using the datasette publish family of commands, you can use the --plugin-secret option to set these secrets at publish time. For example, using Heroku you might run the following command: datasette publish heroku my_database.db \ --name my-heroku-app-demo \ --install=datasette-auth-github \ --plugin-secret datasette-auth-github client_id your_client_id \ --plugin-secret datasette-auth-github client_secret your_client_secret This will set the necessary environment variables and add the following to the deployed metadata.yaml : [[[cog config_example(cog, { ""plugins"": { ""datasette-auth-github"": { ""client_id"": { ""$env"": ""DATASETTE_AUTH_GITHUB_CLIENT_ID"" }, ""client_secret"": { ""$env"": ""DATASETTE_AUTH_GITHUB_CLIENT_SECRET"" } } } }) ]]] [[[end]]]","[""Plugins"", ""Plugin configuration""]",[] full_text_search:full-text-search-custom-sql,full_text_search,full-text-search-custom-sql,Searches using custom SQL,"You can include full-text search results in custom SQL queries. The general pattern with SQLite search is to run the search as a sub-select that returns rowid values, then include those rowids in another part of the query. You can see the syntax for a basic search by running that search on a table page and then clicking ""View and edit SQL"" to see the underlying SQL. 
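In outline the pattern looks like this - a minimal sketch, assuming a hypothetical articles table with a matching articles_fts full-text index:
select * from articles
where rowid in (
  select rowid
  from articles_fts
  where articles_fts match escape_fts(:search)
)
Here escape_fts() is the same helper function Datasette uses in the example below to escape the search term before it reaches the FTS engine.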
For example, consider this search for manafort is the US FARA database : /fara/FARA_All_ShortForms?_search=manafort If you click View and edit SQL you'll see that the underlying SQL looks like this: select rowid, Short_Form_Termination_Date, Short_Form_Date, Short_Form_Last_Name, Short_Form_First_Name, Registration_Number, Registration_Date, Registrant_Name, Address_1, Address_2, City, State, Zip from FARA_All_ShortForms where rowid in ( select rowid from FARA_All_ShortForms_fts where FARA_All_ShortForms_fts match escape_fts(:search) ) order by rowid limit 101","[""Full-text search""]","[{""href"": ""https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort"", ""label"": ""manafort is the US FARA database""}, {""href"": ""https://fara.datasettes.com/fara?sql=select%0D%0A++rowid%2C%0D%0A++Short_Form_Termination_Date%2C%0D%0A++Short_Form_Date%2C%0D%0A++Short_Form_Last_Name%2C%0D%0A++Short_Form_First_Name%2C%0D%0A++Registration_Number%2C%0D%0A++Registration_Date%2C%0D%0A++Registrant_Name%2C%0D%0A++Address_1%2C%0D%0A++Address_2%2C%0D%0A++City%2C%0D%0A++State%2C%0D%0A++Zip%0D%0Afrom%0D%0A++FARA_All_ShortForms%0D%0Awhere%0D%0A++rowid+in+%28%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++FARA_All_ShortForms_fts%0D%0A++++where%0D%0A++++++FARA_All_ShortForms_fts+match+escape_fts%28%3Asearch%29%0D%0A++%29%0D%0Aorder+by%0D%0A++rowid%0D%0Alimit%0D%0A++101&search=manafort"", ""label"": ""View and edit SQL""}]" contributing:contributing-running-tests,contributing,contributing-running-tests,Running the tests,"Once you have done this, you can run the Datasette unit tests from inside your datasette/ directory using pytest like so: pytest You can run the tests faster using multiple CPU cores with pytest-xdist like this: pytest -n auto -m ""not serial"" -n auto detects the number of available cores automatically. The -m ""not serial"" skips tests that don't work well in a parallel test environment. You can run those tests separately like so: pytest -m ""serial""","[""Contributing""]","[{""href"": ""https://docs.pytest.org/"", ""label"": ""pytest""}, {""href"": ""https://pypi.org/project/pytest-xdist/"", ""label"": ""pytest-xdist""}]" sql_queries:sql,sql_queries,sql,Running SQL queries,"Datasette treats SQLite database files as read-only and immutable. This means it is not possible to execute INSERT or UPDATE statements using Datasette, which allows us to expose SELECT statements to the outside world without needing to worry about SQL injection attacks. The easiest way to execute custom SQL against Datasette is through the web UI. The database index page includes a SQL editor that lets you run any SELECT query you like. You can also construct queries using the filter interface on the tables page, then click ""View and edit SQL"" to open that query in the custom SQL editor. Note that this interface is only available if the execute-sql permission is allowed. See Controlling the ability to execute arbitrary SQL . Any Datasette SQL query is reflected in the URL of the page, allowing you to bookmark them, share them with others and navigate through previous queries using your browser back button. You can also retrieve the results of any query as JSON by adding .json to the base URL.",[],[] deploying:deploying-systemd,deploying,deploying-systemd,Running Datasette using systemd,"You can run Datasette on Ubuntu or Debian systems using systemd . First, ensure you have Python 3 and pip installed. On Ubuntu you can use sudo apt-get install python3 python3-pip . 
You can install Datasette into a virtual environment, or you can install it system-wide. To install system-wide, use sudo pip3 install datasette . Now create a folder for your Datasette databases, for example using mkdir /home/ubuntu/datasette-root . You can copy a test database into that folder like so: cd /home/ubuntu/datasette-root curl -O https://latest.datasette.io/fixtures.db Create a file at /etc/systemd/system/datasette.service with the following contents: [Unit] Description=Datasette After=network.target [Service] Type=simple User=ubuntu Environment=DATASETTE_SECRET= WorkingDirectory=/home/ubuntu/datasette-root ExecStart=datasette serve . -h 127.0.0.1 -p 8000 Restart=on-failure [Install] WantedBy=multi-user.target Add a random value for the DATASETTE_SECRET - this will be used to sign Datasette cookies such as the CSRF token cookie. You can generate a suitable value like so: python3 -c 'import secrets; print(secrets.token_hex(32))' This configuration will run Datasette against all database files contained in the /home/ubuntu/datasette-root directory. If that directory contains a metadata.yml (or .json ) file or a templates/ or plugins/ sub-directory those will automatically be loaded by Datasette - see Configuration directory mode for details. You can start the Datasette process running using the following: sudo systemctl daemon-reload sudo systemctl start datasette.service You will need to restart the Datasette service after making changes to its metadata.json configuration or adding a new database file to that directory. You can do that using: sudo systemctl restart datasette.service Once the service has started you can confirm that Datasette is running on port 8000 like so: curl 127.0.0.1:8000/-/versions.json # Should output JSON showing the installed version Datasette will not be accessible from outside the server because it is listening on 127.0.0.1 . You can expose it by instead listening on 0.0.0.0 , but a better way is to set up a proxy such as nginx - see Running Datasette behind a proxy .","[""Deploying Datasette""]",[] deploying:deploying-openrc,deploying,deploying-openrc,Running Datasette using OpenRC,"OpenRC is the service manager on non-systemd Linux distributions like Alpine Linux and Gentoo . Create an init script at /etc/init.d/datasette with the following contents: #!/sbin/openrc-run name=""datasette"" command=""datasette"" command_args=""serve -h 0.0.0.0 /path/to/db.db"" command_background=true pidfile=""/run/${RC_SVCNAME}.pid"" You then need to configure the service to run at boot and start it: rc-update add datasette rc-service datasette start","[""Deploying Datasette""]","[{""href"": ""https://www.alpinelinux.org/"", ""label"": ""Alpine Linux""}, {""href"": ""https://www.gentoo.org/"", ""label"": ""Gentoo""}]" changelog:running-datasette-behind-a-proxy,changelog,running-datasette-behind-a-proxy,Running Datasette behind a proxy,"The base_url configuration option is designed to help run Datasette on a specific path behind a proxy - for example if you want to run an instance of Datasette at /my-datasette/ within your existing site's URL hierarchy, proxied behind nginx or Apache. Support for this configuration option has been greatly improved ( #1023 ), and guidelines for using it are now available in a new documentation section on Running Datasette behind a proxy . 
( #1027 )","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1023"", ""label"": ""#1023""}, {""href"": ""https://github.com/simonw/datasette/issues/1027"", ""label"": ""#1027""}]" deploying:deploying-proxy,deploying,deploying-proxy,Running Datasette behind a proxy,"You may wish to run Datasette behind an Apache or nginx proxy, using a path within your existing site. You can use the base_url configuration setting to tell Datasette to serve traffic with a specific URL prefix. For example, you could run Datasette like this: datasette my-database.db --setting base_url /my-datasette/ -p 8009 This will run Datasette with the following URLs: http://127.0.0.1:8009/my-datasette/ - the Datasette homepage http://127.0.0.1:8009/my-datasette/my-database - the page for the my-database.db database http://127.0.0.1:8009/my-datasette/my-database/some_table - the page for the some_table table You can now set your nginx or Apache server to proxy the /my-datasette/ path to this Datasette instance.","[""Deploying Datasette""]",[] contributing:contributing-documentation-cog,contributing,contributing-documentation-cog,Running Cog,"Some pages of documentation (in particular the CLI reference ) are automatically updated using Cog . To update these pages, run the following command: cog -r docs/*.rst","[""Contributing"", ""Editing and building the documentation""]","[{""href"": ""https://github.com/nedbat/cog"", ""label"": ""Cog""}]" contributing:contributing-formatting-black,contributing,contributing-formatting-black,Running Black,"Black will be installed when you run pip install -e '.[test]' . To test that your code complies with Black, run the following in your root datasette repository checkout: black . --check All done! ✨ 🍰 ✨ 95 files would be left unchanged. If any of your code does not conform to Black you can run this to automatically fix those problems: black . reformatted ../datasette/setup.py All done! ✨ 🍰 ✨ 1 file reformatted, 94 files left unchanged.","[""Contributing"", ""Code formatting""]",[] pages:rowview,pages,rowview,Row,"Every row in every Datasette table has its own URL. This means individual records can be linked to directly. Table cells with extremely long text contents are truncated on the table view according to the truncate_cells_html setting. If a cell has been truncated the full length version of that cell will be available on the row page. Rows which are the targets of foreign key references from other tables will show a link to a filtered search for all records that reference that row. Here's an example from the Registers of Members Interests database: ../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001 Note that this URL includes the encoded primary key of the record. Here's that same page as JSON: ../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json","[""Pages and API endpoints""]","[{""href"": ""https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001"", ""label"": ""../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001""}, {""href"": ""https://register-of-members-interests.datasettes.com/regmem/people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json"", ""label"": ""../people/uk~2Eorg~2Epublicwhip~2Fperson~2F10001.json""}]" internals:internals-response-asgi-send,internals,internals-response-asgi-send,Returning a response with .asgi_send(send),"In most cases you will return Response objects from your own view functions. 
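For example, a view function registered with the register_routes(datasette) hook can return one directly - a minimal sketch in which the /hello path and the hello function are hypothetical:
from datasette import hookimpl
from datasette.utils.asgi import Response

async def hello(request):
    # Serve a plain text response at /hello
    return Response.text('Hello from a plugin view')

@hookimpl
def register_routes():
    return [(r'^/hello$', hello)]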
You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook. Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example: async def require_authorization(scope, receive, send): response = Response.text( ""401 Authorization Required"", headers={ ""www-authenticate"": 'Basic realm=""Datasette"", charset=""UTF-8""' }, status=401, ) await response.asgi_send(send)","[""Internals for plugins"", ""Response class""]",[] custom_templates:custom-pages-404,custom_templates,custom-pages-404,Returning 404s,"To indicate that content could not be found and display the default 404 page you can use the raise_404(message) function: {% if not rows %} {{ raise_404(""Content not found"") }} {% endif %} If you call raise_404() the other content in your template will be ignored.","[""Custom pages and templates""]",[] internals:database-results,internals,database-results,Results,"The db.execute() method returns a single Results object. This can be used to access the rows returned by the query. Iterating over a Results object will yield SQLite Row objects . Each of these can be treated as a tuple or can be accessed using row[""column""] syntax: info = [] results = await db.execute(""select name from sqlite_master"") for row in results: info.append(row[""name""]) The Results object also has the following properties and methods: .truncated - boolean Indicates if this query was truncated - if it returned more results than the specified page_size . If this is true then the results object will only provide access to the first page_size rows in the query result. You can disable truncation by passing truncate=False to the db.query() method. .columns - list of strings A list of column names returned by the query. .rows - list of sqlite3.Row This property provides direct access to the list of rows returned by the database. You can access specific rows by index using results.rows[0] . .first() - row or None Returns the first row in the results, or None if no rows were returned. .single_value() Returns the value of the first column of the first row of results - but only if the query returned a single row with a single column. Raises a datasette.database.MultipleValues exception otherwise. .__len__() Calling len(results) returns the (truncated) number of returned results.","[""Internals for plugins"", ""Database class""]","[{""href"": ""https://docs.python.org/3/library/sqlite3.html#row-objects"", ""label"": ""Row objects""}]" authentication:authentication-cli-create-token-restrict,authentication,authentication-cli-create-token-restrict,Restricting the actions that a token can perform,"Tokens created using datasette create-token ACTOR_ID will inherit all of the permissions of the actor that they are associated with. You can pass additional options to create tokens that are restricted to a subset of that actor's permissions. To restrict the token to just specific permissions against all available databases, use the --all option: datasette create-token root --all insert-row --all update-row This option can be passed as many times as you like. In the above example the token will only be allowed to insert and update rows. You can also restrict permissions such that they can only be used within specific databases: datasette create-token root --database mydatabase insert-row The resulting token will only be able to insert rows, and only to tables in the mydatabase database. 
Finally, you can restrict permissions to individual resources - tables, SQL views and named queries - within a specific database: datasette create-token root --resource mydatabase mytable insert-row These options have short versions: -a for --all , -d for --database and -r for --resource . You can add --debug to see a JSON representation of the token that has been created. Here's a full example: datasette create-token root \ --secret mysecret \ --all view-instance \ --all view-table \ --database docs view-query \ --resource docs documents insert-row \ --resource docs documents update-row \ --debug This example outputs the following: dstok_.eJxFizEKgDAMRe_y5w4qYrFXERGxDkVsMI0uxbubdjFL8l_ez1jhwEQCA6Fjjxp90qtkuHawzdjYrh8MFobLxZ_wBH0_gtnAF-hpS5VfmF8D_lnd97lHqUJgLd6sls4H1qwlhA.nH_7RecYHj5qSzvjhMU95iy0Xlc Decoded: { ""a"": ""root"", ""token"": ""dstok"", ""t"": 1670907246, ""_r"": { ""a"": [ ""vi"", ""vt"" ], ""d"": { ""docs"": [ ""vq"" ] }, ""r"": { ""docs"": { ""documents"": [ ""ir"", ""ur"" ] } } } }","[""Authentication and permissions"", ""API Tokens"", ""datasette create-token""]",[] internals:internals-response,internals,internals-response,Response class,"The Response class can be returned from view functions that have been registered using the register_routes(datasette) hook. The Response() constructor takes the following arguments: body - string The body of the response. status - integer (optional) The HTTP status - defaults to 200. headers - dictionary (optional) A dictionary of extra HTTP headers, e.g. {""x-hello"": ""world""} . content_type - string (optional) The content-type for the response. Defaults to text/plain . For example: from datasette.utils.asgi import Response response = Response( ""This is XML"", content_type=""application/xml; charset=utf-8"", ) The quickest way to create responses is using the Response.text(...) , Response.html(...) , Response.json(...) or Response.redirect(...) helper methods: from datasette.utils.asgi import Response html_response = Response.html(""This is HTML"") json_response = Response.json({""this_is"": ""json""}) text_response = Response.text( ""This will become utf-8 encoded text"" ) # Redirects are served as 302, unless you pass status=301: redirect_response = Response.redirect( ""https://latest.datasette.io/"" ) Each of these responses will use the correct corresponding content-type - text/html; charset=utf-8 , application/json; charset=utf-8 or text/plain; charset=utf-8 respectively. Each of the helper methods take optional status= and headers= arguments, documented above.","[""Internals for plugins""]",[] internals:internals-request,internals,internals-request,Request object,"The request object is passed to various plugin hooks. It represents an incoming HTTP request. It has the following properties: .scope - dictionary The ASGI scope that was used to construct this request, described in the ASGI HTTP connection scope specification. .method - string The HTTP method for this request, usually GET or POST . .url - string The full URL for this request, e.g. https://latest.datasette.io/fixtures . .scheme - string The request scheme - usually https or http . .headers - dictionary (str -> str) A dictionary of incoming HTTP request headers. Header names have been converted to lowercase. .cookies - dictionary (str -> str) A dictionary of incoming cookies .host - string The host header from the incoming request, e.g. latest.datasette.io or localhost . .path - string The path of the request excluding the query string, e.g. /fixtures . 
.full_path - string The path of the request including the query string if one is present, e.g. /fixtures?sql=select+sqlite_version() . .query_string - string The query string component of the request, without the ? - e.g. name__contains=sam&age__gt=10 . .args - MultiParams An object representing the parsed query string parameters, see below. .url_vars - dictionary (str -> str) Variables extracted from the URL path, if that path was defined using a regular expression. See register_routes(datasette) . .actor - dictionary (str -> Any) or None The currently authenticated actor (see actors ), or None if the request is unauthenticated. The object also has two awaitable methods: await request.post_vars() - dictionary Returns a dictionary of form variables that were submitted in the request body via POST . Don't forget to read about CSRF protection ! await request.post_body() - bytes Returns the un-parsed body of a request submitted by POST - useful for things like incoming JSON data. And a class method that can be used to create fake request objects for use in tests: fake(path_with_query_string, method=""GET"", scheme=""http"", url_vars=None) Returns a Request instance for the specified path and method. For example: from datasette import Request from pprint import pprint request = Request.fake( ""/fixtures/facetable/"", url_vars={""database"": ""fixtures"", ""table"": ""facetable""}, ) pprint(request.scope) This outputs: {'http_version': '1.1', 'method': 'GET', 'path': '/fixtures/facetable/', 'query_string': b'', 'raw_path': b'/fixtures/facetable/', 'scheme': 'http', 'type': 'http', 'url_route': {'kwargs': {'database': 'fixtures', 'table': 'facetable'}}}","[""Internals for plugins""]","[{""href"": ""https://asgi.readthedocs.io/en/latest/specs/www.html#connection-scope"", ""label"": ""ASGI HTTP connection scope""}]" contributing:contributing-bug-fix-branch,contributing,contributing-bug-fix-branch,Releasing bug fixes from a branch,"If it's necessary to publish a bug fix release without shipping new features that have landed on main a release branch can be used. Create it from the relevant last tagged release like so: git branch 0.52.x 0.52.4 git checkout 0.52.x Next cherry-pick the commits containing the bug fixes: git cherry-pick COMMIT Write the release notes in the branch, and update the version number in version.py . Then push the branch: git push -u origin 0.52.x Once the tests have completed, publish the release from that branch target using the GitHub Draft a new release form. Finally, cherry-pick the commit with the release notes and version number bump across to main : git checkout main git cherry-pick COMMIT git push","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/releases/new"", ""label"": ""Draft a new release""}]" contributing:contributing-release,contributing,contributing-release,Release process,"Datasette releases are performed using tags. When a new release is published on GitHub, a GitHub Action workflow will perform the following: Run the unit tests against all supported Python versions. If the tests pass... 
Build a Docker image of the release and push a tag to https://hub.docker.com/r/datasetteproject/datasette Re-point the ""latest"" tag on Docker Hub to the new image Build a wheel bundle of the underlying Python source code Push that new wheel up to PyPI: https://pypi.org/project/datasette/ If the release is an alpha, navigate to https://readthedocs.org/projects/datasette/versions/ and search for the tag name in the ""Activate a version"" filter, then mark that version as ""active"" to ensure it will appear on the public ReadTheDocs documentation site. To deploy new releases you will need to have push access to the main Datasette GitHub repository. Datasette follows Semantic Versioning : major.minor.patch We increment major for backwards-incompatible releases. Datasette is currently pre-1.0 so the major version is always 0 . We increment minor for new features. We increment patch for bugfix releases. Alpha and beta releases may have an additional a0 or b0 prefix - the integer component will be incremented with each subsequent alpha or beta. To release a new version, first create a commit that updates the version number in datasette/version.py and the changelog with highlights of the new version. An example commit can be seen here : # Update changelog git commit -m "" Release 0.51a1 Refs #1056, #1039, #998, #1045, #1033, #1036, #1034, #976, #1057, #1058, #1053, #1064, #1066"" -a git push Referencing the issues that are part of the release in the commit message ensures the name of the release shows up on those issue pages, e.g. here . You can generate the list of issue references for a specific release by copying and pasting text from the release notes or GitHub changes-since-last-release view into this Extract issue numbers from pasted text tool. To create the tag for the release, create a new release on GitHub matching the new version number. You can convert the release notes to Markdown by copying and pasting the rendered HTML into this Paste to Markdown tool . 
Finally, post a news item about the release on datasette.io by editing the news.yaml file in that site's repository.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml"", ""label"": ""GitHub Action workflow""}, {""href"": ""https://hub.docker.com/r/datasetteproject/datasette"", ""label"": ""https://hub.docker.com/r/datasetteproject/datasette""}, {""href"": ""https://pypi.org/project/datasette/"", ""label"": ""https://pypi.org/project/datasette/""}, {""href"": ""https://readthedocs.org/projects/datasette/versions/"", ""label"": ""https://readthedocs.org/projects/datasette/versions/""}, {""href"": ""https://semver.org/"", ""label"": ""Semantic Versioning""}, {""href"": ""https://github.com/simonw/datasette/commit/0e1e89c6ba3d0fbdb0823272952cf356f3016def"", ""label"": ""commit can be seen here""}, {""href"": ""https://github.com/simonw/datasette/issues/581#ref-commit-d56f402"", ""label"": ""here""}, {""href"": ""https://observablehq.com/@simonw/extract-issue-numbers-from-pasted-text"", ""label"": ""Extract issue numbers from pasted text""}, {""href"": ""https://github.com/simonw/datasette/releases/new"", ""label"": ""a new release""}, {""href"": ""https://euangoddard.github.io/clipboard2markdown/"", ""label"": ""Paste to Markdown tool""}, {""href"": ""https://datasette.io/"", ""label"": ""datasette.io""}, {""href"": ""https://github.com/simonw/datasette.io/blob/main/news.yaml"", ""label"": ""news.yaml""}]" testing_plugins:testing-plugins-register-in-test,testing_plugins,testing-plugins-register-in-test,Registering a plugin for the duration of a test,"When writing tests for plugins you may find it useful to register a test plugin just for the duration of a single test. You can do this using pm.register() and pm.unregister() like this: from datasette import hookimpl from datasette.app import Datasette from datasette.plugins import pm import pytest @pytest.mark.asyncio async def test_using_test_plugin(): class TestPlugin: __name__ = ""TestPlugin"" # Use hookimpl and method names to register hooks @hookimpl def register_routes(self): return [ (r""^/error$"", lambda: 1 / 0), ] pm.register(TestPlugin(), name=""undo"") try: # The test implementation goes here datasette = Datasette() response = await datasette.client.get(""/error"") assert response.status_code == 500 finally: pm.unregister(name=""undo"") To reuse the same temporary plugin in multiple tests, you can register it inside a fixture in your conftest.py file like this: from datasette import hookimpl from datasette.app import Datasette from datasette.plugins import pm import pytest import pytest_asyncio @pytest_asyncio.fixture async def datasette_with_plugin(): class TestPlugin: __name__ = ""TestPlugin"" @hookimpl def register_routes(self): return [ (r""^/error$"", lambda: 1 / 0), ] pm.register(TestPlugin(), name=""undo"") try: yield Datasette() finally: pm.unregister(name=""undo"") Note the yield statement here - this ensures that the finally: block that unregisters the plugin is executed only after the test function itself has completed. 
Then in a test: @pytest.mark.asyncio async def test_error(datasette_with_plugin): response = await datasette_with_plugin.client.get(""/error"") assert response.status_code == 500","[""Testing plugins""]",[] spatialite:querying-polygons-using-within,spatialite,querying-polygons-using-within,Querying polygons using within(),"The within() SQL function can be used to check if a point is within a geometry: select name from places where within(GeomFromText('POINT(-3.1724366 51.4704448)'), places.geom); The GeomFromText() function takes a string of well-known text. Note that the order used here is longitude then latitude . To run that same within() query in a way that benefits from the spatial index, use the following: select name from places where within(GeomFromText('POINT(-3.1724366 51.4704448)'), places.geom) and rowid in ( SELECT pkid FROM idx_places_geom where xmin < -3.1724366 and xmax > -3.1724366 and ymin < 51.4704448 and ymax > 51.4704448 );","[""SpatiaLite""]",[] publish:publish-vercel,publish,publish-vercel,Publishing to Vercel,"Vercel - previously known as Zeit Now - provides a layer over AWS Lambda to allow for quick, scale-to-zero deployment. You can deploy Datasette instances to Vercel using the datasette-publish-vercel plugin. pip install datasette-publish-vercel datasette publish vercel mydatabase.db --project my-database-project Not every feature is supported: consult the datasette-publish-vercel README for more details.","[""Publishing data"", ""datasette publish""]","[{""href"": ""https://vercel.com/"", ""label"": ""Vercel""}, {""href"": ""https://github.com/simonw/datasette-publish-vercel"", ""label"": ""datasette-publish-vercel""}, {""href"": ""https://github.com/simonw/datasette-publish-vercel/blob/main/README.md"", ""label"": ""datasette-publish-vercel README""}]" publish:publish-heroku,publish,publish-heroku,Publishing to Heroku,"To publish your data using Heroku , first create an account there and install and configure the Heroku CLI tool . You can publish one or more databases to Heroku using the following command: datasette publish heroku mydatabase.db This will output some details about the new deployment, including a URL like this one: https://limitless-reef-88278.herokuapp.com/ deployed to Heroku You can specify a custom app name by passing -n my-app-name to the publish command. This will also allow you to overwrite an existing app. Rather than deploying directly you can use the --generate-dir option to output the files that would be deployed to a directory: datasette publish heroku mydatabase.db --generate-dir=/tmp/deploy-this-to-heroku See datasette publish heroku for the full list of options for this command.","[""Publishing data"", ""datasette publish""]","[{""href"": ""https://www.heroku.com/"", ""label"": ""Heroku""}, {""href"": ""https://devcenter.heroku.com/articles/heroku-cli"", ""label"": ""Heroku CLI tool""}]" publish:publish-cloud-run,publish,publish-cloud-run,Publishing to Google Cloud Run,"Google Cloud Run allows you to publish data in a scale-to-zero environment, so your application will start running when the first request is received and will shut down again when traffic ceases. This means you only pay for time spent serving traffic. Cloud Run is a great option for inexpensively hosting small, low traffic projects - but costs can add up for projects that serve a lot of requests. Be particularly careful if your project has tables with large numbers of rows. Search engine crawlers that index a page for every row could result in a high bill. 
The datasette-block-robots plugin can be used to request search engine crawlers omit crawling your site, which can help avoid this issue. You will first need to install and configure the Google Cloud CLI tools by following these instructions . You can then publish one or more SQLite database files to Google Cloud Run using the following command: datasette publish cloudrun mydatabase.db --service=my-database A Cloud Run service is a single hosted application. The service name you specify will be used as part of the Cloud Run URL. If you deploy to a service name that you have used in the past your new deployment will replace the previous one. If you omit the --service option you will be asked to pick a service name interactively during the deploy. You may need to interact with prompts from the tool. Many of the prompts ask for values that can be set as properties for the Google Cloud SDK if you want to avoid the prompts. For example, the default region for the deployed instance can be set using the command: gcloud config set run/region us-central1 You should replace us-central1 with your desired region . Alternately, you can specify the region by setting the CLOUDSDK_RUN_REGION environment variable. Once it has finished it will output a URL like this one: Service [my-service] revision [my-service-00001] has been deployed and is serving traffic at https://my-service-j7hipcg4aq-uc.a.run.app Cloud Run provides a URL on the .run.app domain, but you can also point your own domain or subdomain at your Cloud Run service - see mapping custom domains in the Cloud Run documentation for details. See datasette publish cloudrun for the full list of options for this command.","[""Publishing data"", ""datasette publish""]","[{""href"": ""https://cloud.google.com/run/"", ""label"": ""Google Cloud Run""}, {""href"": ""https://datasette.io/plugins/datasette-block-robots"", ""label"": ""datasette-block-robots""}, {""href"": ""https://cloud.google.com/sdk/"", ""label"": ""these instructions""}, {""href"": ""https://cloud.google.com/sdk/docs/properties"", ""label"": ""set as properties for the Google Cloud SDK""}, {""href"": ""https://cloud.google.com/about/locations"", ""label"": ""region""}, {""href"": ""https://cloud.google.com/run/docs/mapping-custom-domains"", ""label"": ""mapping custom domains""}]" publish:publish-fly,publish,publish-fly,Publishing to Fly,"Fly is a competitively priced Docker-compatible hosting platform that supports running applications in globally distributed data centers close to your end users. You can deploy Datasette instances to Fly using the datasette-publish-fly plugin. 
pip install datasette-publish-fly datasette publish fly mydatabase.db --app=""my-app"" Consult the datasette-publish-fly README for more details.","[""Publishing data"", ""datasette publish""]","[{""href"": ""https://fly.io/"", ""label"": ""Fly""}, {""href"": ""https://fly.io/docs/pricing/"", ""label"": ""competitively priced""}, {""href"": ""https://github.com/simonw/datasette-publish-fly"", ""label"": ""datasette-publish-fly""}, {""href"": ""https://github.com/simonw/datasette-publish-fly/blob/main/README.md"", ""label"": ""datasette-publish-fly README""}]" custom_templates:publishing-static-assets,custom_templates,publishing-static-assets,Publishing static assets,"The datasette publish command can be used to publish your static assets, using the same syntax as above: datasette publish cloudrun mydb.db --static assets:static-files/ This will upload the contents of the static-files/ directory as part of the deployment, and configure Datasette to correctly serve the assets from /assets/ .","[""Custom pages and templates""]",[] publish:publishing,publish,publishing,Publishing data,Datasette includes tools for publishing and deploying your data to the internet. The datasette publish command will deploy a new Datasette instance containing your databases directly to a Heroku or Google Cloud hosting account. You can also use datasette package to create a Docker image that bundles your databases together with the datasette application that is used to serve them.,[],[] contributing:contributing-formatting-prettier,contributing,contributing-formatting-prettier,Prettier,"To install Prettier, install Node.js and then run the following in the root of your datasette repository checkout: npm install This will install Prettier in a node_modules directory. You can then check that your code matches the coding style like so: npm run prettier -- --check > prettier > prettier 'datasette/static/*[!.min].js' ""--check"" Checking formatting... [warn] datasette/static/plugins.js [warn] Code style issues found in the above file(s). Forgot to run Prettier? You can fix any problems by running: npm run fix","[""Contributing"", ""Code formatting""]","[{""href"": ""https://nodejs.org/en/download/package-manager/"", ""label"": ""install Node.js""}]" writing_plugins:writing-plugins-extra-hooks,writing_plugins,writing-plugins-extra-hooks,Plugins that define new plugin hooks,"Plugins can define new plugin hooks that other plugins can use to further extend their functionality. datasette-graphql is one example of a plugin that does this. It defines a new hook called graphql_extra_fields , described here , which other plugins can use to define additional fields that should be included in the GraphQL schema. To define additional hooks, add a file to the plugin called datasette_your_plugin/hookspecs.py with content that looks like this: from pluggy import HookspecMarker hookspec = HookspecMarker(""datasette"") @hookspec def name_of_your_hook_goes_here(datasette): ""Description of your hook."" You should define your own hook name and arguments here, following the documentation for Pluggy specifications . Make sure to pick a name that is unlikely to clash with hooks provided by any other plugins. Then, to register your plugin hooks, add the following code to your datasette_your_plugin/__init__.py file: from datasette.plugins import pm from . import hookspecs pm.add_hookspecs(hookspecs) This will register your plugin hooks as part of the datasette plugin hook namespace. 
Within your plugin code you can trigger the hook using this pattern: from datasette.plugins import pm for ( plugin_return_value ) in pm.hook.name_of_your_hook_goes_here( datasette=datasette ): # Do something with plugin_return_value pass Other plugins will then be able to register their own implementations of your hook using this syntax: from datasette import hookimpl @hookimpl def name_of_your_hook_goes_here(datasette): return ""Response from this plugin hook"" These plugin implementations can accept 0 or more of the named arguments that you defined in your hook specification.","[""Writing plugins""]","[{""href"": ""https://github.com/simonw/datasette-graphql"", ""label"": ""datasette-graphql""}, {""href"": ""https://github.com/simonw/datasette-graphql/blob/main/README.md#adding-custom-fields-with-plugins"", ""label"": ""described here""}, {""href"": ""https://pluggy.readthedocs.io/en/stable/#specs"", ""label"": ""Pluggy specifications""}]" changelog:plugins-can-now-add-links-within-datasette,changelog,plugins-can-now-add-links-within-datasette,Plugins can now add links within Datasette,"A number of existing Datasette plugins add new pages to the Datasette interface, providing tools for things like uploading CSVs , editing table schemas or configuring full-text search . Plugins like this can now link to themselves from other parts of the Datasette interface. The menu_links(datasette, actor, request) hook ( #1064 ) lets plugins add links to Datasette's new top-right application menu, and the table_actions(datasette, actor, database, table, request) hook ( #1066 ) adds links to a new ""table actions"" menu on the table page. The demo at latest.datasette.io now includes some example plugins. To see the new table actions menu first sign into that demo as root and then visit the facetable table to see the new cog icon menu at the top of the page.","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://github.com/simonw/datasette-upload-csvs"", ""label"": ""uploading CSVs""}, {""href"": ""https://github.com/simonw/datasette-edit-schema"", ""label"": ""editing table schemas""}, {""href"": ""https://github.com/simonw/datasette-configure-fts"", ""label"": ""configuring full-text search""}, {""href"": ""https://github.com/simonw/datasette/issues/1064"", ""label"": ""#1064""}, {""href"": ""https://github.com/simonw/datasette/issues/1066"", ""label"": ""#1066""}, {""href"": ""https://latest.datasette.io/"", ""label"": ""latest.datasette.io""}, {""href"": ""https://latest.datasette.io/login-as-root"", ""label"": ""sign into that demo as root""}, {""href"": ""https://latest.datasette.io/fixtures/facetable"", ""label"": ""facetable""}]" changelog:plugins-and-internals,changelog,plugins-and-internals,Plugins and internals,"New plugin hook: filters_from_request(request, database, table, datasette) , which runs on the table page and can be used to support new custom query string parameters that modify the SQL query. ( #473 ) Added two additional methods for writing to the database: await db.execute_write_script(sql, block=True) and await db.execute_write_many(sql, params_seq, block=True) . ( #1570 ) The db.execute_write() internal method now defaults to blocking until the write operation has completed. Previously it defaulted to queuing the write and then continuing to run code while the write was in the queue. ( #1579 ) Database write connections now execute the prepare_connection(conn, database, datasette) plugin hook. 
( #1564 ) The Datasette() constructor no longer requires the files= argument, and is now documented at Datasette class . ( #1563 ) The tracing feature now traces write queries, not just read queries. ( #1568 ) The query string variables exposed by request.args will now include blank strings for arguments such as foo in ?foo=&bar=1 rather than ignoring those parameters entirely. ( #1551 )","[""Changelog"", ""0.60 (2022-01-13)""]","[{""href"": ""https://github.com/simonw/datasette/issues/473"", ""label"": ""#473""}, {""href"": ""https://github.com/simonw/datasette/issues/1570"", ""label"": ""#1570""}, {""href"": ""https://github.com/simonw/datasette/issues/1579"", ""label"": ""#1579""}, {""href"": ""https://github.com/simonw/datasette/issues/1564"", ""label"": ""#1564""}, {""href"": ""https://github.com/simonw/datasette/issues/1563"", ""label"": ""#1563""}, {""href"": ""https://github.com/simonw/datasette/issues/1568"", ""label"": ""#1568""}, {""href"": ""https://github.com/simonw/datasette/issues/1551"", ""label"": ""#1551""}]" plugins:id1,plugins,id1,Plugins,"Datasette's plugin system allows additional features to be implemented as Python code (or front-end JavaScript) which can be wrapped up in a separate Python package. The underlying mechanism uses pluggy . See the Datasette plugins directory for a list of existing plugins, or take a look at the datasette-plugin topic on GitHub. Things you can do with plugins include: Add visualizations to Datasette, for example datasette-cluster-map and datasette-vega . Make new custom SQL functions available for use within Datasette, for example datasette-haversine and datasette-jellyfish . Define custom output formats with custom extensions, for example datasette-atom and datasette-ics . Add template functions that can be called within your Jinja custom templates, for example datasette-render-markdown . Customize how database values are rendered in the Datasette interface, for example datasette-render-binary and datasette-pretty-json . 
Customize how Datasette's authentication and permissions systems work, for example datasette-auth-passwords and datasette-permissions-sql .",[],"[{""href"": ""https://pluggy.readthedocs.io/"", ""label"": ""pluggy""}, {""href"": ""https://datasette.io/plugins"", ""label"": ""Datasette plugins directory""}, {""href"": ""https://github.com/topics/datasette-plugin"", ""label"": ""datasette-plugin""}, {""href"": ""https://github.com/simonw/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://github.com/simonw/datasette-vega"", ""label"": ""datasette-vega""}, {""href"": ""https://github.com/simonw/datasette-haversine"", ""label"": ""datasette-haversine""}, {""href"": ""https://github.com/simonw/datasette-jellyfish"", ""label"": ""datasette-jellyfish""}, {""href"": ""https://github.com/simonw/datasette-atom"", ""label"": ""datasette-atom""}, {""href"": ""https://github.com/simonw/datasette-ics"", ""label"": ""datasette-ics""}, {""href"": ""https://github.com/simonw/datasette-render-markdown#markdown-in-templates"", ""label"": ""datasette-render-markdown""}, {""href"": ""https://github.com/simonw/datasette-render-binary"", ""label"": ""datasette-render-binary""}, {""href"": ""https://github.com/simonw/datasette-pretty-json"", ""label"": ""datasette-pretty-json""}, {""href"": ""https://github.com/simonw/datasette-auth-passwords"", ""label"": ""datasette-auth-passwords""}, {""href"": ""https://github.com/simonw/datasette-permissions-sql"", ""label"": ""datasette-permissions-sql""}]" changelog:plugin-hooks-and-internals,changelog,plugin-hooks-and-internals,Plugin hooks and internals,"The prepare_jinja2_environment(env, datasette) plugin hook now accepts an optional datasette argument. Hook implementations can also now return an async function which will be awaited automatically. ( #1809 ) Database(is_mutable=) now defaults to True . ( #1808 ) The datasette.check_visibility() method now accepts an optional permissions= list, allowing it to take multiple permissions into account at once when deciding if something should be shown as public or private. This has been used to correctly display padlock icons in more places in the Datasette interface. ( #1829 ) Datasette no longer enforces upper bounds on its dependencies. ( #1800 )","[""Changelog"", ""0.63 (2022-10-27)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1809"", ""label"": ""#1809""}, {""href"": ""https://github.com/simonw/datasette/issues/1808"", ""label"": ""#1808""}, {""href"": ""https://github.com/simonw/datasette/issues/1829"", ""label"": ""#1829""}, {""href"": ""https://github.com/simonw/datasette/issues/1800"", ""label"": ""#1800""}]" changelog:id15,changelog,id15,Plugin hooks,"New plugin hook: handle_exception() , for custom handling of exceptions caught by Datasette. ( #1770 ) The render_cell() plugin hook is now also passed a row argument, representing the sqlite3.Row object that is being rendered. ( #1300 ) The configuration directory is now stored in datasette.config_dir , making it available to plugins. Thanks, Chris Amico. 
( #1766 )","[""Changelog"", ""0.62 (2022-08-14)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1770"", ""label"": ""#1770""}, {""href"": ""https://github.com/simonw/datasette/issues/1300"", ""label"": ""#1300""}, {""href"": ""https://github.com/simonw/datasette/pull/1766"", ""label"": ""#1766""}]" changelog:plugin-hooks,changelog,plugin-hooks,Plugin hooks,"New jinja2_environment_from_request(datasette, request, env) plugin hook, which can be used to customize the current Jinja environment based on the incoming request. This can be used to modify the template lookup path based on the incoming request hostname, among other things. ( #2225 ) New family of template slot plugin hooks : top_homepage , top_database , top_table , top_row , top_query , top_canned_query . Plugins can use these to provide additional HTML to be injected at the top of the corresponding pages. ( #1191 ) New track_event() mechanism for plugins to emit and receive events when certain events occur within Datasette. ( #2240 ) Plugins can register additional event classes using register_events(datasette) . They can then trigger those events with the datasette.track_event(event) internal method. Plugins can subscribe to notifications of events using the track_event(datasette, event) plugin hook. Datasette core now emits login , logout , create-token , create-table , drop-table , insert-rows , upsert-rows , update-row , delete-row events, documented here . New internal function for plugin authors: await db.execute_isolated_fn(fn) , for creating a new SQLite connection, executing code and then closing that connection, all while preventing other code from writing to that particular database. This connection will not have the prepare_connection() plugin hook executed against it, allowing plugins to perform actions that might otherwise be blocked by existing connection configuration. ( #2218 )","[""Changelog"", ""1.0a8 (2024-02-07)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2225"", ""label"": ""#2225""}, {""href"": ""https://github.com/simonw/datasette/issues/1191"", ""label"": ""#1191""}, {""href"": ""https://github.com/simonw/datasette/issues/2240"", ""label"": ""#2240""}, {""href"": ""https://github.com/simonw/datasette/issues/2218"", ""label"": ""#2218""}]" plugin_hooks:id1,plugin_hooks,id1,Plugin hooks,"Datasette plugins use plugin hooks to customize Datasette's behavior. These hooks are powered by the pluggy plugin system. Each plugin can implement one or more hooks using the @hookimpl decorator against a function named that matches one of the hooks documented on this page. When you implement a plugin hook you can accept any or all of the parameters that are documented as being passed to that hook. 
For example, you can implement the render_cell plugin hook like this even though the full documented hook signature is render_cell(row, value, column, table, database, datasette) : @hookimpl def render_cell(value, column): if column == ""stars"": return ""*"" * int(value) List of plugin hooks prepare_connection(conn, database, datasette) prepare_jinja2_environment(env, datasette) Page extras extra_template_vars(template, database, table, columns, view_name, request, datasette) extra_css_urls(template, database, table, columns, view_name, request, datasette) extra_js_urls(template, database, table, columns, view_name, request, datasette) extra_body_script(template, database, table, columns, view_name, request, datasette) publish_subcommand(publish) render_cell(row, value, column, table, database, datasette, request) register_output_renderer(datasette) register_routes(datasette) register_commands(cli) register_facet_classes() register_permissions(datasette) asgi_wrapper(datasette) startup(datasette) canned_queries(datasette, database, actor) actor_from_request(datasette, request) actors_from_ids(datasette, actor_ids) jinja2_environment_from_request(datasette, request, env) filters_from_request(request, database, table, datasette) permission_allowed(datasette, actor, action, resource) register_magic_parameters(datasette) forbidden(datasette, request, message) handle_exception(datasette, request, exception) skip_csrf(datasette, scope) get_metadata(datasette, key, database, table) menu_links(datasette, actor, request) Action hooks table_actions(datasette, actor, database, table, request) view_actions(datasette, actor, database, view, request) query_actions(datasette, actor, database, query_name, request, sql, params) row_actions(datasette, actor, request, database, table, row) database_actions(datasette, actor, database, request) homepage_actions(datasette, actor, request) Template slots top_homepage(datasette, request) top_database(datasette, request, database) top_table(datasette, request, database, table) top_row(datasette, request, database, table, row) top_query(datasette, request, database, sql) top_canned_query(datasette, request, database, query_name) Event tracking track_event(datasette, event) register_events(datasette)",[],"[{""href"": ""https://pluggy.readthedocs.io/"", ""label"": ""pluggy""}]" configuration:configuration-reference-plugins,configuration,configuration-reference-plugins,Plugin configuration,"Datasette plugins often require configuration. This plugin configuration should be placed in plugins keys inside datasette.yaml . Most plugins are configured at the top-level of the file, using the plugins key: [[[cog from metadata_doc import config_example import textwrap config_example(cog, textwrap.dedent( """""" # inside datasette.yaml plugins: datasette-my-plugin: key: my_value """""").strip() ) ]]] [[[end]]] Some plugins can be configured at the database or table level. 
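Whichever level the configuration lives at, a plugin can read the resolved value back using the datasette.plugin_config() method, which checks the table level first, then the database level, then the top level. A minimal sketch, reusing the hypothetical datasette-my-plugin name and key option from the example above:

    from datasette import hookimpl

    @hookimpl
    def extra_template_vars(datasette, database, table):
        # Resolve this plugin's configuration: table level, then database, then top level
        config = datasette.plugin_config(
            "datasette-my-plugin", database=database, table=table
        ) or {}
        return {"my_plugin_key": config.get("key")}

Because plugin_config() understands the nested layout, the plugin itself does not need to know whether it was configured globally, per-database or per-table.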
These should use a plugins key nested under the appropriate place within the databases object: [[[cog from metadata_doc import config_example import textwrap config_example(cog, textwrap.dedent( """""" # inside datasette.yaml databases: my_database: # plugin configuration for the my_database database plugins: datasette-my-plugin: key: my_value my_other_database: tables: my_table: # plugin configuration for the my_table table inside the my_other_database database plugins: datasette-my-plugin: key: my_value """""").strip() ) ]]] [[[end]]]","[""Configuration"", null]",[] plugins:plugins-configuration,plugins,plugins-configuration,Plugin configuration,"Plugins can have their own configuration, embedded in a configuration file . Configuration options for plugins live within a ""plugins"" key in that file, which can be included at the root, database or table level. Here is an example of some plugin configuration for a specific table: [[[cog from metadata_doc import config_example config_example(cog, { ""databases"": { ""sf-trees"": { ""tables"": { ""Street_Tree_List"": { ""plugins"": { ""datasette-cluster-map"": { ""latitude_column"": ""lat"", ""longitude_column"": ""lng"" } } } } } } }) ]]] [[[end]]] This tells the datasette-cluster-map column which latitude and longitude columns should be used for a table called Street_Tree_List inside a database file called sf-trees.db .","[""Plugins""]",[] getting_started:getting-started-demo,getting_started,getting-started-demo,Play with a live demo,"The best way to experience Datasette for the first time is with a demo: global-power-plants.datasettes.com provides a searchable database of power plants around the world, using data from the World Resources Institude rendered using the datasette-cluster-map plugin. fivethirtyeight.datasettes.com shows Datasette running against over 400 datasets imported from the FiveThirtyEight GitHub repository .","[""Getting started""]","[{""href"": ""https://global-power-plants.datasettes.com/global-power-plants/global-power-plants"", ""label"": ""global-power-plants.datasettes.com""}, {""href"": ""https://www.wri.org/publication/global-power-plant-database"", ""label"": ""World Resources Institude""}, {""href"": ""https://github.com/simonw/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight"", ""label"": ""fivethirtyeight.datasettes.com""}, {""href"": ""https://github.com/fivethirtyeight/data"", ""label"": ""FiveThirtyEight GitHub repository""}]" changelog:permissions-fix-for-the-upsert-api,changelog,permissions-fix-for-the-upsert-api,Permissions fix for the upsert API,"The /database/table/-/upsert API had a minor permissions bug, only affecting Datasette instances that had configured the insert-row and update-row permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue #2262 . To avoid similar mistakes in the future the datasette.permission_allowed() method now specifies default= as a keyword-only argument.","[""Changelog"", ""1.0a9 (2024-02-16)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2262"", ""label"": ""#2262""}]" configuration:configuration-reference-permissions,configuration,configuration-reference-permissions,Permissions configuration,"Datasette's authentication and permissions system can also be configured using datasette.yaml . 
 Here is a simple example: [[[cog from metadata_doc import config_example import textwrap config_example(cog, textwrap.dedent( """""" # Instance is only available to users 'sharon' and 'percy': allow: id: - sharon - percy # Only 'percy' is allowed access to the accounting database: databases: accounting: allow: id: percy """""").strip() ) ]]] [[[end]]] Access permissions in datasette.yaml has the full details.","[""Configuration"", null]",[] authentication:authentication-permissions,authentication,authentication-permissions,Permissions,"Datasette has an extensive permissions system built-in, which can be further extended and customized by plugins. The key question the permissions system answers is this: Is this actor allowed to perform this action , optionally against this particular resource ? Actors are described above . An action is a string describing the action the actor would like to perform. A full list is provided below - examples include view-table and execute-sql . A resource is the item the actor wishes to interact with - for example a specific database or table. Some actions, such as permissions-debug , are not associated with a particular resource. Datasette's built-in view permissions ( view-database , view-table etc) default to allow - unless you configure additional permission rules unauthenticated users will be allowed to access content. Permissions with potentially harmful effects should default to deny . Plugin authors should account for this when designing new plugins - for example, the datasette-upload-csvs plugin defaults to deny so that installations don't accidentally allow unauthenticated users to create new tables by uploading a CSV file.","[""Authentication and permissions""]","[{""href"": ""https://github.com/simonw/datasette-upload-csvs"", ""label"": ""datasette-upload-csvs""}]" changelog:permissions,changelog,permissions,Permissions,"Datasette also now has a built-in concept of Permissions . The permissions system answers the following question: Is this actor allowed to perform this action , optionally against this particular resource ? You can use the new ""allow"" block syntax in metadata.json (or metadata.yaml ) to set required permissions at the instance, database, table or canned query level. For example, to restrict access to the fixtures.db database to the ""root"" user: { ""databases"": { ""fixtures"": { ""allow"": { ""id"": ""root"" } } } } See Defining permissions with ""allow"" blocks for more details. Plugins can implement their own custom permission checks using the new permission_allowed(datasette, actor, action, resource) hook. A new debug page at /-/permissions shows recent permission checks, to help administrators and plugin authors understand exactly what checks are being performed. This tool defaults to only being available to the root user, but can be exposed to other users by plugins that respond to the permissions-debug permission. ( #788 )","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette/issues/788"", ""label"": ""#788""}]" changelog:permission-checks-now-consider-opinions-from-every-plugin,changelog,permission-checks-now-consider-opinions-from-every-plugin,Permission checks now consider opinions from every plugin,"The datasette.permission_allowed() method previously consulted every plugin that implemented the permission_allowed() plugin hook and obeyed the opinion of the last plugin to return a value.
( #2275 ) Datasette now consults every plugin and checks to see if any of them returned False (the veto rule), and if none of them did, it then checks to see if any of them returned True . This is explained at length in the new documentation covering How permissions are resolved .","[""Changelog"", ""1.0a9 (2024-02-16)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2275"", ""label"": ""#2275""}]" performance:performance,performance,performance,Performance and caching,"Datasette runs on top of SQLite, and SQLite has excellent performance. For small databases almost any query should return in just a few milliseconds, and larger databases (100s of MBs or even GBs of data) should perform extremely well provided your queries make sensible use of database indexes. That said, there are a number of tricks you can use to improve Datasette's performance.",[],[] metadata:per-database-and-per-table-metadata,metadata,per-database-and-per-table-metadata,Per-database and per-table metadata,"Metadata at the top level of the file will be shown on the index page and in the footer on every page of the site. The license and source is expected to apply to all of your data. You can also provide metadata at the per-database or per-table level, like this: [[[cog metadata_example(cog, { ""databases"": { ""database1"": { ""source"": ""Alternative source"", ""source_url"": ""http://example.com/"", ""tables"": { ""example_table"": { ""description_html"": ""Custom table description"", ""license"": ""CC BY 3.0 US"", ""license_url"": ""https://creativecommons.org/licenses/by/3.0/us/"" } } } } }) ]]] [[[end]]] Each of the top-level metadata fields can be used at the database and table level.","[""Metadata""]",[] custom_templates:custom-pages-parameters,custom_templates,custom-pages-parameters,Path parameters for pages,"You can define custom pages that match multiple paths by creating files with {variable} definitions in their filenames. For example, to capture any request to a URL matching /about/* , you would create a template in the following location: templates/pages/about/{slug}.html A hit to /about/news would render that template and pass in a variable called slug with a value of ""news"" . If you use this mechanism don't forget to return a 404 if the referenced content could not be found. You can do this using {{ raise_404() }} described below. Templates defined using custom page routes work particularly well with the sql() template function from datasette-template-sql or the graphql() template function from datasette-graphql .","[""Custom pages and templates""]","[{""href"": ""https://github.com/simonw/datasette-template-sql"", ""label"": ""datasette-template-sql""}, {""href"": ""https://github.com/simonw/datasette-graphql#the-graphql-template-function"", ""label"": ""datasette-graphql""}]" json_api:json-api-pagination,json_api,json-api-pagination,Pagination,"The default JSON representation includes a ""next_url"" key which can be used to access the next page of results. If that key is null or missing then it means you have reached the final page of results. Other representations include pagination information in the link HTTP header. 
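For the default JSON shape a client can simply follow next_url until it disappears. A minimal sketch using the requests library, assuming an endpoint such as /fixtures/sortable.json whose responses include rows and next_url keys:

    import requests

    def fetch_all_rows(url):
        # Keep requesting pages until next_url is null or missing
        rows = []
        while url:
            data = requests.get(url).json()
            rows.extend(data["rows"])
            url = data.get("next_url")
        return rows

The link header described next serves the same purpose for the other representations.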
That header will look something like this: link: ; rel=""next"" Here is an example Python function built using requests that returns a list of all of the paginated items from one of these API endpoints: def paginate(url): items = [] while url: response = requests.get(url) try: url = response.links.get(""next"").get(""url"") except AttributeError: url = None items.extend(response.json()) return items","[""JSON API""]","[{""href"": ""https://requests.readthedocs.io/"", ""label"": ""requests""}]" sql_queries:id2,sql_queries,id2,Pagination,"Datasette's default table pagination is designed to be extremely efficient. SQL OFFSET/LIMIT pagination can have a significant performance penalty once you get into multiple thousands of rows, as each page still requires the database to scan through every preceding row to find the correct offset. When paginating through tables, Datasette instead orders the rows in the table by their primary key and performs a WHERE clause against the last seen primary key for the previous page. For example: select rowid, * from Tree_List where rowid > 200 order by rowid limit 101 This represents page three for this particular table, with a page size of 100. Note that we request 101 items in the limit clause rather than 100. This allows us to detect if we are on the last page of the results: if the query returns less than 101 rows we know we have reached the end of the pagination set. Datasette will only return the first 100 rows - the 101st is used purely to detect if there should be another page. Since the where clause acts against the index on the primary key, the query is extremely fast even for records that are a long way into the overall pagination set.","[""Running SQL queries""]",[] pages:pages,pages,pages,Pages and API endpoints,"The Datasette web application offers a number of different pages that can be accessed to explore the data in question, each of which is accompanied by an equivalent JSON API.",[],[] plugin_hooks:plugin-page-extras,plugin_hooks,plugin-page-extras,Page extras,These plugin hooks can be used to affect the way HTML pages for different Datasette interfaces are rendered.,"[""Plugin hooks""]",[] writing_plugins:writing-plugins-packaging,writing_plugins,writing-plugins-packaging,Packaging a plugin,"Plugins can be packaged using Python setuptools. You can see an example of a packaged plugin at https://github.com/simonw/datasette-plugin-demos The example consists of two files: a setup.py file that defines the plugin: from setuptools import setup VERSION = ""0.1"" setup( name=""datasette-plugin-demos"", description=""Examples of plugins for Datasette"", author=""Simon Willison"", url=""https://github.com/simonw/datasette-plugin-demos"", license=""Apache License, Version 2.0"", version=VERSION, py_modules=[""datasette_plugin_demos""], entry_points={ ""datasette"": [ ""plugin_demos = datasette_plugin_demos"" ] }, install_requires=[""datasette""], ) And a Python module file, datasette_plugin_demos.py , that implements the plugin: from datasette import hookimpl import random @hookimpl def prepare_jinja2_environment(env): env.filters[""uppercase""] = lambda u: u.upper() @hookimpl def prepare_connection(conn): conn.create_function( ""random_integer"", 2, random.randint ) Having built a plugin in this way you can turn it into an installable package using the following command: python3 setup.py sdist This will create a .tar.gz file in the dist/ directory. 
You can then install your new plugin into a Datasette virtual environment or Docker container using pip : pip install datasette-plugin-demos-0.1.tar.gz To learn how to upload your plugin to PyPI for use by other people, read the PyPA guide to Packaging and distributing projects .","[""Writing plugins""]","[{""href"": ""https://github.com/simonw/datasette-plugin-demos"", ""label"": ""https://github.com/simonw/datasette-plugin-demos""}, {""href"": ""https://pypi.org/"", ""label"": ""PyPI""}, {""href"": ""https://packaging.python.org/tutorials/distributing-packages/"", ""label"": ""Packaging and distributing projects""}]" changelog:other-small-fixes,changelog,other-small-fixes,Other small fixes,"Made several performance improvements to the database schema introspection code that runs when Datasette first starts up. ( #1555 ) Label columns detected for foreign keys are now case-insensitive, so Name or TITLE will be detected in the same way as name or title . ( #1544 ) Upgraded Pluggy dependency to 1.0. ( #1575 ) Now using Plausible analytics for the Datasette documentation. explain query plan is now allowed with varying amounts of whitespace in the query. ( #1588 ) New CLI reference page showing the output of --help for each of the datasette sub-commands. This lead to several small improvements to the help copy. ( #1594 ) Fixed bug where writable canned queries could not be used with custom templates. ( #1547 ) Improved fix for a bug where columns with a underscore prefix could result in unnecessary hidden form fields. ( #1527 )","[""Changelog"", ""0.60 (2022-01-13)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1555"", ""label"": ""#1555""}, {""href"": ""https://github.com/simonw/datasette/issues/1544"", ""label"": ""#1544""}, {""href"": ""https://github.com/simonw/datasette/issues/1575"", ""label"": ""#1575""}, {""href"": ""https://plausible.io/"", ""label"": ""Plausible analytics""}, {""href"": ""https://github.com/simonw/datasette/issues/1588"", ""label"": ""#1588""}, {""href"": ""https://github.com/simonw/datasette/issues/1594"", ""label"": ""#1594""}, {""href"": ""https://github.com/simonw/datasette/issues/1547"", ""label"": ""#1547""}, {""href"": ""https://github.com/simonw/datasette/issues/1527"", ""label"": ""#1527""}]" authentication:authentication-permissions-other,authentication,authentication-permissions-other,Other permissions in ,"For all other permissions, you can use one or more ""permissions"" blocks in your datasette.yaml configuration file. To grant access to the permissions debug tool to all signed in users, you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your configuration: [[[cog config_example(cog, """""" permissions: debug-menu: id: '*' """""") ]]] [[[end]]] To grant create-table to the user with id of editor for the docs database: [[[cog config_example(cog, """""" databases: docs: permissions: create-table: id: editor """""") ]]] [[[end]]] And for insert-row against the reports table in that docs database: [[[cog config_example(cog, """""" databases: docs: tables: reports: permissions: insert-row: id: editor """""") ]]] [[[end]]] The permissions debug tool can be useful for helping test permissions that you have configured in this way.","[""Authentication and permissions""]",[] changelog:id36,changelog,id36,Other changes,"Datasette can now open multiple database files with the same name, e.g. if you run datasette path/to/one.db path/to/other/one.db . 
( #509 ) datasette publish cloudrun now sets force_https_urls for every deployment, fixing some incorrect http:// links. ( #1178 ) Fixed a bug in the example nginx configuration in Running Datasette behind a proxy . ( #1091 ) The Datasette Ecosystem documentation page has been reduced in size in favour of the datasette.io tools and plugins directories. ( #1182 ) The request object now provides a request.full_path property, which returns the path including any query string. ( #1184 ) Better error message for disallowed PRAGMA clauses in SQL queries. ( #1185 ) datasette publish heroku now deploys using python-3.8.7 . New plugin testing documentation on Testing outbound HTTP calls with pytest-httpx . ( #1198 ) All ?_* query string parameters passed to the table page are now persisted in hidden form fields, so parameters such as ?_size=10 will be correctly passed to the next page when query filters are changed. ( #1194 ) Fixed a bug loading a database file called test-database (1).sqlite . ( #1181 )","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/simonw/datasette/issues/509"", ""label"": ""#509""}, {""href"": ""https://github.com/simonw/datasette/issues/1178"", ""label"": ""#1178""}, {""href"": ""https://github.com/simonw/datasette/issues/1091"", ""label"": ""#1091""}, {""href"": ""https://datasette.io/tools"", ""label"": ""tools""}, {""href"": ""https://datasette.io/plugins"", ""label"": ""plugins""}, {""href"": ""https://github.com/simonw/datasette/issues/1182"", ""label"": ""#1182""}, {""href"": ""https://github.com/simonw/datasette/issues/1184"", ""label"": ""#1184""}, {""href"": ""https://github.com/simonw/datasette/issues/1185"", ""label"": ""#1185""}, {""href"": ""https://github.com/simonw/datasette/issues/1198"", ""label"": ""#1198""}, {""href"": ""https://github.com/simonw/datasette/issues/1194"", ""label"": ""#1194""}, {""href"": ""https://github.com/simonw/datasette/issues/1181"", ""label"": ""#1181""}]" changelog:other-changes,changelog,other-changes,Other changes,"The new DATASETTE_TRACE_PLUGINS=1 environment variable turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. ( #2274 ) Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as usedforsecurity=False , for compatibility with FIPS systems. ( #2270 ) SQL relating to Datasette's internal database now executes inside a transaction, avoiding a potential database locked error. ( #2273 ) The /-/threads debug page now identifies the database in the name associated with each dedicated write thread. ( #2265 ) The /db/-/create API now fires a insert-rows event if rows were inserted after the table was created. 
( #2260 )","[""Changelog"", ""1.0a9 (2024-02-16)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2274"", ""label"": ""#2274""}, {""href"": ""https://github.com/simonw/datasette/issues/2270"", ""label"": ""#2270""}, {""href"": ""https://github.com/simonw/datasette/issues/2273"", ""label"": ""#2273""}, {""href"": ""https://github.com/simonw/datasette/issues/2265"", ""label"": ""#2265""}, {""href"": ""https://github.com/simonw/datasette/issues/2260"", ""label"": ""#2260""}]" plugins:one-off-plugins-using-plugins-dir,plugins,one-off-plugins-using-plugins-dir,One-off plugins using --plugins-dir,"You can also define one-off per-project plugins by saving them as plugin_name.py functions in a plugins/ folder and then passing that folder to datasette using the --plugins-dir option: datasette mydb.db --plugins-dir=plugins/","[""Plugins"", ""Installing plugins""]",[] deploying:nginx-proxy-configuration,deploying,nginx-proxy-configuration,Nginx proxy configuration,"Here is an example of an nginx configuration file that will proxy traffic to Datasette: daemon off; events { worker_connections 1024; } http { server { listen 80; location /my-datasette { proxy_pass http://127.0.0.1:8009/my-datasette; proxy_set_header Host $host; } } } You can also use the --uds option to Datasette to listen on a Unix domain socket instead of a port, configuring the nginx upstream proxy like this: daemon off; events { worker_connections 1024; } http { server { listen 80; location /my-datasette { proxy_pass http://datasette/my-datasette; proxy_set_header Host $host; } } upstream datasette { server unix:/tmp/datasette.sock; } } Then run Datasette with datasette --uds /tmp/datasette.sock path/to/database.db --setting base_url /my-datasette/ .","[""Deploying Datasette"", ""Running Datasette behind a proxy""]","[{""href"": ""https://nginx.org/"", ""label"": ""nginx""}]" changelog:new-visual-design,changelog,new-visual-design,New visual design,"Datasette is no longer white and grey with blue and purple links! Natalie Downe has been working on a visual refresh, the first iteration of which is included in this release. ( #1056 )","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://twitter.com/natbat"", ""label"": ""Natalie Downe""}, {""href"": ""https://github.com/simonw/datasette/pull/1056"", ""label"": ""#1056""}]" changelog:new-plugin-hooks,changelog,new-plugin-hooks,New plugin hooks,"register_magic_parameters(datasette) can be used to define new types of magic canned query parameters. startup(datasette) can run custom code when Datasette first starts up. datasette-init is a new plugin that uses this hook to create database tables and views on startup if they have not yet been created. ( #834 ) canned_queries(datasette, database, actor) lets plugins provide additional canned queries beyond those defined in Datasette's metadata. See datasette-saved-queries for an example of this hook in action. ( #852 ) forbidden(datasette, request, message) is a hook for customizing how Datasette responds to 403 forbidden errors. 
( #812 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette-init"", ""label"": ""datasette-init""}, {""href"": ""https://github.com/simonw/datasette/issues/834"", ""label"": ""#834""}, {""href"": ""https://github.com/simonw/datasette-saved-queries"", ""label"": ""datasette-saved-queries""}, {""href"": ""https://github.com/simonw/datasette/issues/852"", ""label"": ""#852""}, {""href"": ""https://github.com/simonw/datasette/issues/812"", ""label"": ""#812""}]" changelog:new-plugin-hook-extra-template-vars,changelog,new-plugin-hook-extra-template-vars,New plugin hook: extra_template_vars,"The extra_template_vars(template, database, table, columns, view_name, request, datasette) plugin hook allows plugins to inject their own additional variables into the Datasette template context. This can be used in conjunction with custom templates to customize the Datasette interface. datasette-auth-github uses this hook to add custom HTML to the new top navigation bar (which is designed to be modified by plugins, see #540 ).","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://github.com/simonw/datasette/issues/540"", ""label"": ""#540""}]" changelog:new-plugin-hook-asgi-wrapper,changelog,new-plugin-hook-asgi-wrapper,New plugin hook: asgi_wrapper,"The asgi_wrapper(datasette) plugin hook allows plugins to entirely wrap the Datasette ASGI application in their own ASGI middleware. ( #520 ) Two new plugins take advantage of this hook: datasette-auth-github adds a authentication layer: users will have to sign in using their GitHub account before they can view data or interact with Datasette. You can also use it to restrict access to specific GitHub users, or to members of specified GitHub organizations or teams . datasette-cors allows you to configure CORS headers for your Datasette instance. You can use this to enable JavaScript running on a whitelisted set of domains to make fetch() calls to the JSON API provided by your Datasette instance.","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://github.com/simonw/datasette/issues/520"", ""label"": ""#520""}, {""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://help.github.com/en/articles/about-organizations"", ""label"": ""organizations""}, {""href"": ""https://help.github.com/en/articles/organizing-members-into-teams"", ""label"": ""teams""}, {""href"": ""https://github.com/simonw/datasette-cors"", ""label"": ""datasette-cors""}, {""href"": ""https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS"", ""label"": ""CORS headers""}]" changelog:new-features,changelog,new-features,New features,"If an error occurs while executing a user-provided SQL query, that query is now re-displayed in an editable form along with the error message. ( #619 ) New ?_col= and ?_nocol= parameters to show and hide columns in a table, plus an interface for hiding and showing columns in the column cog menu. ( #615 ) A new ?_facet_size= parameter for customizing the number of facet results returned on a table or view page. ( #1332 ) ?_facet_size=max sets that to the maximum, which defaults to 1,000 and is controlled by the the max_returned_rows setting. If facet results are truncated the … at the bottom of the facet list now links to this parameter. 
( #1337 ) ?_nofacet=1 option to disable all facet calculations on a page, used as a performance optimization for CSV exports and ?_shape=array/object . ( #1349 , #263 ) ?_nocount=1 option to disable full query result counts. ( #1353 ) ?_trace=1 debugging option is now controlled by the new trace_debug setting, which is turned off by default. ( #1359 )","[""Changelog"", ""0.57 (2021-06-05)""]","[{""href"": ""https://github.com/simonw/datasette/issues/619"", ""label"": ""#619""}, {""href"": ""https://github.com/simonw/datasette/issues/615"", ""label"": ""#615""}, {""href"": ""https://github.com/simonw/datasette/issues/1332"", ""label"": ""#1332""}, {""href"": ""https://github.com/simonw/datasette/issues/1337"", ""label"": ""#1337""}, {""href"": ""https://github.com/simonw/datasette/issues/1349"", ""label"": ""#1349""}, {""href"": ""https://github.com/simonw/datasette/issues/263"", ""label"": ""#263""}, {""href"": ""https://github.com/simonw/datasette/issues/1353"", ""label"": ""#1353""}, {""href"": ""https://github.com/simonw/datasette/issues/1359"", ""label"": ""#1359""}]" changelog:new-configuration-settings,changelog,new-configuration-settings,New configuration settings,"Datasette's Settings now also supports boolean settings. A number of new configuration options have been added: num_sql_threads - the number of threads used to execute SQLite queries. Defaults to 3. allow_facet - enable or disable custom Facets using the _facet= parameter. Defaults to on. suggest_facets - should Datasette suggest facets? Defaults to on. allow_download - should users be allowed to download the entire SQLite database? Defaults to on. allow_sql - should users be allowed to execute custom SQL queries? Defaults to on. default_cache_ttl - Default HTTP caching max-age header in seconds. Defaults to 365 days - caching can be disabled entirely by settings this to 0. cache_size_kb - Set the amount of memory SQLite uses for its per-connection cache , in KB. allow_csv_stream - allow users to stream entire result sets as a single CSV file. Defaults to on. max_csv_mb - maximum size of a returned CSV file in MB. Defaults to 100MB, set to 0 to disable this limit.","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://www.sqlite.org/pragma.html#pragma_cache_size"", ""label"": ""per-connection cache""}]" sql_queries:sql-parameters,sql_queries,sql-parameters,Named parameters,"Datasette has special support for SQLite named parameters. Consider a SQL query like this: select * from Street_Tree_List where ""PermitNotes"" like :notes and ""qSpecies"" = :species If you execute this query using the custom query editor, Datasette will extract the two named parameters and use them to construct form fields for you to provide values. You can also provide values for these fields by constructing a URL: /mydatabase?sql=select...&species=44 SQLite string escaping rules will be applied to values passed using named parameters - they will be wrapped in quotes and their content will be correctly escaped. Values from named parameters are treated as SQLite strings. If you need to perform numeric comparisons on them you should cast them to an integer or float first using cast(:name as integer) or cast(:name as real) , for example: select * from Street_Tree_List where latitude > cast(:min_latitude as real) and latitude < cast(:max_latitude as real) Datasette disallows custom SQL queries containing the string PRAGMA (with a small number of exceptions ) as SQLite pragma statements can be used to change database settings at runtime. 
If you need to include the string ""pragma"" in a query you can do so safely using a named parameter.","[""Running SQL queries""]","[{""href"": ""https://github.com/simonw/datasette/issues/761"", ""label"": ""of exceptions""}]" changelog:named-in-memory-database-support,changelog,named-in-memory-database-support,Named in-memory database support,"As part of the work building the _internal database, Datasette now supports named in-memory databases that can be shared across multiple connections. This allows plugins to create in-memory databases which will persist data for the lifetime of the Datasette server process. ( #1151 ) The new memory_name= parameter to the Database class can be used to create named, shared in-memory databases.","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1151"", ""label"": ""#1151""}]" changelog:miscellaneous,changelog,miscellaneous,Miscellaneous,"Got JSON data in one of your columns? Use the new ?_json=COLNAME argument to tell Datasette to return that JSON value directly rather than encoding it as a string. If you just want an array of the first value of each row, use the new ?_shape=arrayfirst option - example .","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://latest.datasette.io/fixtures.json?sql=select+neighborhood+from+facetable+order+by+pk+limit+101&_shape=arrayfirst"", ""label"": ""example""}]" changelog:minor-fixes,changelog,minor-fixes,Minor fixes,"Datasette no longer attempts to run SQL queries in parallel when rendering a table page, as this was leading to some rare crashing bugs. ( #2189 ) Fixed warning: DeprecationWarning: pkg_resources is deprecated as an API ( #2057 ) Fixed bug where ?_extra=columns parameter returned an incorrectly shaped response. ( #2230 )","[""Changelog"", ""1.0a8 (2024-02-07)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2189"", ""label"": ""#2189""}, {""href"": ""https://github.com/simonw/datasette/issues/2057"", ""label"": ""#2057""}, {""href"": ""https://github.com/simonw/datasette/issues/2230"", ""label"": ""#2230""}]" metadata:id2,metadata,id2,Metadata reference,A full reference of every supported option in a metadata.json or metadata.yaml file.,"[""Metadata""]",[] metadata:id1,metadata,id1,Metadata,"Data loves metadata. Any time you run Datasette you can optionally include a YAML or JSON file with metadata about your databases and tables. Datasette will then display that information in the web UI. Run Datasette like this: datasette database1.db database2.db --metadata metadata.yaml Your metadata.yaml file can look something like this: [[[cog from metadata_doc import metadata_example metadata_example(cog, { ""title"": ""Custom title for your index page"", ""description"": ""Some description text can go here"", ""license"": ""ODbL"", ""license_url"": ""https://opendatacommons.org/licenses/odbl/"", ""source"": ""Original Data Source"", ""source_url"": ""http://example.com/"" }) ]]] [[[end]]] Choosing YAML over JSON adds support for multi-line strings and comments. The above metadata will be displayed on the index page of your Datasette-powered site. The source and license information will also be included in the footer of every page served by Datasette. Any special HTML characters in description will be escaped. 
If you want to include HTML in your description, you can use a description_html property instead.",[],[] changelog:v0-28-medium-changes,changelog,v0-28-medium-changes,Medium changes,"Datasette now conforms to the Black coding style ( #449 ) - and has a unit test to enforce this in the future New Special table arguments : ?columnname__in=value1,value2,value3 filter for executing SQL IN queries against a table, see Table arguments ( #433 ) ?columnname__date=yyyy-mm-dd filter which returns rows where the spoecified datetime column falls on the specified date ( 583b22a ) ?tags__arraycontains=tag filter which acts against a JSON array contained in a column ( 78e45ea ) ?_where=sql-fragment filter for the table view ( #429 ) ?_fts_table=mytable and ?_fts_pk=mycolumn query string options can be used to specify which FTS table to use for a search query - see Configuring full-text search for a table or view ( #428 ) You can now pass the same table filter multiple times - for example, ?content__not=world&content__not=hello will return all rows where the content column is neither hello or world ( #288 ) You can now specify about and about_url metadata (in addition to source and license ) linking to further information about a project - see Source, license and about New ?_trace=1 parameter now adds debug information showing every SQL query that was executed while constructing the page ( #435 ) datasette inspect now just calculates table counts, and does not introspect other database metadata ( #462 ) Removed /-/inspect page entirely - this will be replaced by something similar in the future, see #465 Datasette can now run against an in-memory SQLite database. You can do this by starting it without passing any files or by using the new --memory option to datasette serve . This can be useful for experimenting with SQLite queries that do not access any data, such as SELECT 1+1 or SELECT sqlite_version() .","[""Changelog"", ""0.28 (2019-05-19)""]","[{""href"": ""https://github.com/python/black"", ""label"": ""Black coding style""}, {""href"": ""https://github.com/simonw/datasette/pull/449"", ""label"": ""#449""}, {""href"": ""https://github.com/simonw/datasette/issues/433"", ""label"": ""#433""}, {""href"": ""https://github.com/simonw/datasette/commit/583b22aa28e26c318de0189312350ab2688c90b1"", ""label"": ""583b22a""}, {""href"": ""https://github.com/simonw/datasette/commit/78e45ead4d771007c57b307edf8fc920101f8733"", ""label"": ""78e45ea""}, {""href"": ""https://github.com/simonw/datasette/issues/429"", ""label"": ""#429""}, {""href"": ""https://github.com/simonw/datasette/issues/428"", ""label"": ""#428""}, {""href"": ""https://github.com/simonw/datasette/issues/288"", ""label"": ""#288""}, {""href"": ""https://github.com/simonw/datasette/issues/435"", ""label"": ""#435""}, {""href"": ""https://github.com/simonw/datasette/issues/462"", ""label"": ""#462""}, {""href"": ""https://github.com/simonw/datasette/issues/465"", ""label"": ""#465""}]" spatialite:making-use-of-a-spatial-index,spatialite,making-use-of-a-spatial-index,Making use of a spatial index,"SpatiaLite spatial indexes are R*Trees. They allow you to run efficient bounding box queries using a sub-select, with a similar pattern to that used for Searches using custom SQL . In the above example, the resulting index will be called idx_museums_point_geom . This takes the form of a SQLite virtual table. 
You can inspect its contents using the following query: select * from idx_museums_point_geom limit 10; Here's a live example: timezones-api.datasette.io/timezones/idx_timezones_Geometry pkid xmin xmax ymin ymax 1 -8.601725578308105 -2.4930307865142822 4.162120819091797 10.74019718170166 2 -3.2607860565185547 1.27329421043396 4.539252281188965 11.174856185913086 3 32.997581481933594 47.98238754272461 3.3974475860595703 14.894054412841797 4 -8.66890811920166 11.997337341308594 18.9681453704834 37.296207427978516 5 36.43336486816406 43.300174713134766 12.354820251464844 18.070993423461914 You can now construct efficient bounding box queries that will make use of the index like this: select * from museums where museums.rowid in ( SELECT pkid FROM idx_museums_point_geom -- left-hand-edge of point > left-hand-edge of bbox (minx) where xmin > :bbox_minx -- right-hand-edge of point < right-hand-edge of bbox (maxx) and xmax < :bbox_maxx -- bottom-edge of point > bottom-edge of bbox (miny) and ymin > :bbox_miny -- top-edge of point < top-edge of bbox (maxy) and ymax < :bbox_maxy ); Spatial indexes can be created against polygon columns as well as point columns, in which case they will represent the minimum bounding rectangle of that polygon. This is useful for accelerating within queries, as seen in the Timezones API example.","[""SpatiaLite""]","[{""href"": ""https://timezones-api.datasette.io/timezones/idx_timezones_Geometry"", ""label"": ""timezones-api.datasette.io/timezones/idx_timezones_Geometry""}]" changelog:magic-parameters-for-canned-queries,changelog,magic-parameters-for-canned-queries,Magic parameters for canned queries,"Canned queries now support Magic parameters , which can be used to insert or select automatically generated values. For example: insert into logs (user_id, timestamp) values (:_actor_id, :_now_datetime_utc) This inserts the currently authenticated actor ID and the current datetime. ( #842 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/842"", ""label"": ""#842""}]" sql_queries:canned-queries-magic-parameters,sql_queries,canned-queries-magic-parameters,Magic parameters,"Named parameters that start with an underscore are special: they can be used to automatically add values created by Datasette that are not contained in the incoming form fields or query string. These magic parameters are only supported for canned queries: to avoid security issues (such as queries that extract the user's private cookies) they are not available to SQL that is executed by the user as a custom SQL query. Available magic parameters are: _actor_* - e.g. _actor_id , _actor_name Fields from the currently authenticated Actors . _header_* - e.g. _header_user_agent Header from the incoming HTTP request. The key should be in lower case and with hyphens converted to underscores e.g. _header_user_agent or _header_accept_language . _cookie_* - e.g. _cookie_lang The value of the incoming cookie of that name. _now_epoch The number of seconds since the Unix epoch. _now_date_utc The date in UTC, e.g. 2020-06-01 _now_datetime_utc The ISO 8601 datetime in UTC, e.g. 2020-06-24T18:01:07Z _random_chars_* - e.g. _random_chars_128 A random string of characters of the specified length. 
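Plugins can register additional magic parameters of their own using the register_magic_parameters(datasette) plugin hook, which returns a list of (name, function) pairs. A minimal sketch - the uuid prefix here is purely illustrative:

    from uuid import uuid4

    from datasette import hookimpl

    def uuid(key, request):
        # Invoked for :_uuid_* parameters; key is the part after the prefix
        if key == "new":
            return str(uuid4())
        raise KeyError

    @hookimpl
    def register_magic_parameters(datasette):
        # Each entry is (prefix, function); the function receives (key, request)
        return [("uuid", uuid)]

A canned query could then use :_uuid_new to receive a freshly generated UUID each time it runs.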
Here's an example configuration that adds a message from the authenticated user, storing various pieces of additional metadata using magic parameters: [[[cog config_example(cog, """""" databases: mydatabase: queries: add_message: allow: id: ""*"" sql: |- INSERT INTO messages ( user_id, message, datetime ) VALUES ( :_actor_id, :message, :_now_datetime_utc ) write: true """""") ]]] [[[end]]] The form presented at /mydatabase/add_message will have just a field for message - the other parameters will be populated by the magic parameter mechanism. Additional custom magic parameters can be added by plugins using the register_magic_parameters(datasette) hook.","[""Running SQL queries"", ""Canned queries""]",[] changelog:log-out,changelog,log-out,Log out,"The ds_actor cookie can be used by plugins (or by Datasette's --root mechanism ) to authenticate users. The new /-/logout page provides a way to clear that cookie. A ""Log out"" button now shows in the global navigation provided the user is authenticated using the ds_actor cookie. ( #840 )","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/840"", ""label"": ""#840""}]" installation:loading-spatialite,installation,loading-spatialite,Loading SpatiaLite,"The datasetteproject/datasette image includes a recent version of the SpatiaLite extension for SQLite. To load and enable that module, use the following command: docker run -p 8001:8001 -v `pwd`:/mnt \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=spatialite You can confirm that SpatiaLite is successfully loaded by visiting http://127.0.0.1:8001/-/versions","[""Installation"", ""Advanced installation options"", ""Using Docker""]","[{""href"": ""http://127.0.0.1:8001/-/versions"", ""label"": ""http://127.0.0.1:8001/-/versions""}]" binary_data:binary-linking,binary_data,binary-linking,Linking to binary downloads,"The .blob output format is used to return binary data. It requires a _blob_column= query string argument specifying which BLOB column should be downloaded, for example: https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data This output format can also be used to return binary data from an arbitrary SQL query. Since such queries do not specify an exact row, an additional ?_blob_hash= parameter can be used to specify the SHA-256 hash of the value that is being linked to. Consider the query select data from binary_data - demonstrated here . That page links to the binary value downloads. 
Those links look like this: https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d These .blob links are also returned in the .csv exports Datasette provides for binary tables and queries, since the CSV format does not have a mechanism for representing binary data.","[""Binary data""]","[{""href"": ""https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data"", ""label"": ""https://latest.datasette.io/fixtures/binary_data/1.blob?_blob_column=data""}, {""href"": ""https://latest.datasette.io/fixtures?sql=select+data+from+binary_data"", ""label"": ""demonstrated here""}, {""href"": ""https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d"", ""label"": ""https://latest.datasette.io/fixtures.blob?sql=select+data+from+binary_data&_blob_column=data&_blob_hash=f3088978da8f9aea479ffc7f631370b968d2e855eeb172bea7f6c7a04262bb6d""}]" changelog:javascript-plugins,changelog,javascript-plugins,JavaScript plugins,"Datasette now includes a JavaScript plugins mechanism , allowing JavaScript to customize Datasette in a way that can collaborate with other plugins. This provides two initial hooks, with more to come in the future: makeAboveTablePanelConfigs() can add additional panels to the top of the table page. makeColumnActions() can add additional actions to the column menu. Thanks Cameron Yick for contributing this feature. ( #2052 )","[""Changelog"", ""1.0a8 (2024-02-07)""]","[{""href"": ""https://github.com/hydrosquall"", ""label"": ""Cameron Yick""}, {""href"": ""https://github.com/simonw/datasette/pull/2052"", ""label"": ""#2052""}]" javascript_plugins:id1,javascript_plugins,id1,JavaScript plugins,"Datasette can run custom JavaScript in several different ways: Datasette plugins written in Python can use the extra_js_urls() or extra_body_script() plugin hooks to inject JavaScript into a page Datasette instances with custom templates can include additional JavaScript in those templates The extra_js_urls key in datasette.yaml can be used to include extra JavaScript There are no limitations on what this JavaScript can do. It is executed directly by the browser, so it can manipulate the DOM, fetch additional data and do anything else that JavaScript is capable of. Custom JavaScript has security implications, especially for authenticated Datasette instances where the JavaScript might run in the context of the authenticated user. It's important to carefully review any JavaScript you run in your Datasette instance.",[],[] javascript_plugins:id2,javascript_plugins,id2,JavaScript plugin objects,"JavaScript plugins are blocks of code that can be registered with Datasette using the registerPlugin() method on the datasetteManager object. The implementation object passed to this method should include a version key defining the plugin version, and one or more of the following named functions providing the implementation of the plugin:","[""JavaScript plugins""]",[] changelog:javascript-modules,changelog,javascript-modules,JavaScript modules,"JavaScript modules were introduced in ECMAScript 2015 and provide native browser support for the import and export keywords. 
 To use modules, JavaScript needs to be included in <script> tags with a type=""module"" attribute. You can also specify a SRI (subresource integrity hash) for these assets: [[[cog config_example(cog, """""" extra_css_urls: - url: https://simonwillison.net/static/css/all.bf8cd891642c.css sri: sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI extra_js_urls: - url: https://code.jquery.com/jquery-3.2.1.slim.min.js sri: sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g= """""") ]]] [[[end]]] This will produce: <link rel=""stylesheet"" href=""https://simonwillison.net/static/css/all.bf8cd891642c.css"" integrity=""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" crossorigin=""anonymous""> <script src=""https://code.jquery.com/jquery-3.2.1.slim.min.js"" integrity=""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" crossorigin=""anonymous""></script> Modern browsers will only execute the stylesheet or JavaScript if the SRI hash matches the content served. You can generate hashes using www.srihash.org Items in ""extra_js_urls"" can specify ""module"": true if they reference JavaScript that uses JavaScript modules . This configuration: [[[cog config_example(cog, """""" extra_js_urls: - url: https://example.datasette.io/module.js module: true """""") ]]] [[[end]]] Will produce this HTML: <script type=""module"" src=""https://example.datasette.io/module.js""></script>","[""Configuration"", null]","[{""href"": ""https://www.srihash.org/"", ""label"": ""www.srihash.org""}, {""href"": ""https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Modules"", ""label"": ""JavaScript modules""}]" sql_queries:id3,sql_queries,id3,Cross-database queries,"SQLite has the ability to run queries that join across multiple databases. Up to ten databases can be attached to a single SQLite connection and queried together. Datasette can execute joins across multiple databases if it is started with the --crossdb option: datasette fixtures.db extra_database.db --crossdb If it is started in this way, the /_memory page can be used to execute queries that join across multiple databases. References to tables in attached databases should be preceded by the database name and a period. For example, this query will show a list of tables across both of the above databases: select 'fixtures' as database, * from [fixtures].sqlite_master union select 'extra_database' as database, * from [extra_database].sqlite_master Try that out here .","[""Running SQL queries""]","[{""href"": ""https://latest.datasette.io/_memory?sql=select%0D%0A++%27fixtures%27+as+database%2C+*%0D%0Afrom%0D%0A++%5Bfixtures%5D.sqlite_master%0D%0Aunion%0D%0Aselect%0D%0A++%27extra_database%27+as+database%2C+*%0D%0Afrom%0D%0A++%5Bextra_database%5D.sqlite_master"", ""label"": ""Try that out here""}]" json_api:tablecreateview-example,json_api,tablecreateview-example,Creating a table from example data,"Instead of specifying columns directly you can instead pass a single example row or a list of rows . Datasette will create a table with a schema that matches those rows and insert them for you: POST //-/create Content-Type: application/json Authorization: Bearer dstok_ { ""table"": ""creatures"", ""rows"": [ { ""id"": 1, ""name"": ""Tarantula"" }, { ""id"": 2, ""name"": ""Kākāpō"" } ], ""pk"": ""id"" } Doing this requires both the create-table and insert-row permissions. The 201 response here will be similar to the columns form, but will also include the number of rows that were inserted as row_count : { ""ok"": true, ""database"": ""data"", ""table"": ""creatures"", ""table_url"": ""http://127.0.0.1:8001/data/creatures"", ""table_api_url"": ""http://127.0.0.1:8001/data/creatures.json"", ""schema"": ""CREATE TABLE [creatures] (\n [id] INTEGER PRIMARY KEY,\n [name] TEXT\n)"", ""row_count"": 2 } You can call the create endpoint multiple times for the same table provided you are specifying the table using the rows or row option. New rows will be inserted into the table each time.
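Issued from Python, the same request might look like the following sketch, which uses the requests library and reuses the creatures table, id primary key and Bearer token pattern from the example above:

    import requests

    def create_with_rows(base_url, database, token, rows):
        # POST example rows to the database's -/create endpoint; Datasette infers the schema
        response = requests.post(
            f"{base_url}/{database}/-/create",
            json={"table": "creatures", "rows": rows, "pk": "id"},
            headers={"Authorization": f"Bearer {token}"},
        )
        response.raise_for_status()
        return response.json()

For example: create_with_rows("http://127.0.0.1:8001", "data", token, [{"id": 1, "name": "Tarantula"}]).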
This means you can use this API if you are unsure if the relevant table has been created yet. If you pass a row to the create endpoint with a primary key that already exists you will get an error that looks like this: { ""ok"": false, ""errors"": [ ""UNIQUE constraint failed: creatures.id"" ] } You can avoid this error by passing the same ""ignore"": true or ""replace"": true options to the create endpoint as you can to the insert endpoint . To use the ""replace"": true option you will also need the update-row permission. Pass ""alter"": true to automatically add any missing columns to the existing table that are present in the rows you are submitting. This requires the alter-table permission.","[""JSON API"", ""The JSON write API""]",[] json_api:tablecreateview,json_api,tablecreateview,Creating a table,"To create a table, make a POST to //-/create . This requires the create-table permission. POST //-/create Content-Type: application/json Authorization: Bearer dstok_ { ""table"": ""name_of_new_table"", ""columns"": [ { ""name"": ""id"", ""type"": ""integer"" }, { ""name"": ""title"", ""type"": ""text"" } ], ""pk"": ""id"" } The JSON here describes the table that will be created: table is the name of the table to create. This field is required. columns is a list of columns to create. Each column is a dictionary with name and type keys. name is the name of the column. This is required. type is the type of the column. This is optional - if not provided, text will be assumed. The valid types are text , integer , float and blob . pk is the primary key for the table. This is optional - if not provided, Datasette will create a SQLite table with a hidden rowid column. If the primary key is an integer column, it will be configured to automatically increment for each new record. If you set this to id without including an id column in the list of columns , Datasette will create an auto-incrementing integer ID column for you. pks can be used instead of pk to create a compound primary key. It should be a JSON list of column names to use in that primary key. ignore can be set to true to ignore existing rows by primary key if the table already exists. replace can be set to true to replace existing rows by primary key if the table already exists. This requires the update-row permission. alter can be set to true if you want to automatically add any missing columns to the table. This requires the alter-table permission. If the table is successfully created this will return a 201 status code and the following response: { ""ok"": true, ""database"": ""data"", ""table"": ""name_of_new_table"", ""table_url"": ""http://127.0.0.1:8001/data/name_of_new_table"", ""table_api_url"": ""http://127.0.0.1:8001/data/name_of_new_table.json"", ""schema"": ""CREATE TABLE [name_of_new_table] (\n [id] INTEGER PRIMARY KEY,\n [title] TEXT\n)"" }","[""JSON API"", ""The JSON write API""]",[] changelog:cookie-methods,changelog,cookie-methods,Cookie methods,"Plugins can now use the new response.set_cookie() method to set cookies. A new request.cookies method on the :ref:internals_request` can be used to read incoming cookies.","[""Changelog"", ""0.44 (2020-06-11)""]",[] plugins:plugins-datasette-load-plugins,plugins,plugins-datasette-load-plugins,Controlling which plugins are loaded,"Datasette defaults to loading every plugin that is installed in the same virtual environment as Datasette itself. 
You can set the DATASETTE_LOAD_PLUGINS environment variable to a comma-separated list of plugin names to load a controlled subset of plugins instead. For example, to load just the datasette-vega and datasette-cluster-map plugins, set DATASETTE_LOAD_PLUGINS to datasette-vega,datasette-cluster-map : export DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map' datasette mydb.db Or: DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map' \ datasette mydb.db To disable the loading of all additional plugins, set DATASETTE_LOAD_PLUGINS to an empty string: export DATASETTE_LOAD_PLUGINS='' datasette mydb.db A quick way to test this setting is to use it with the datasette plugins command: DATASETTE_LOAD_PLUGINS='datasette-vega' datasette plugins This should output the following: [ { ""name"": ""datasette-vega"", ""static"": true, ""templates"": false, ""version"": ""0.6.2"", ""hooks"": [ ""extra_css_urls"", ""extra_js_urls"" ] } ]","[""Plugins""]",[] authentication:authentication-permissions-execute-sql,authentication,authentication-permissions-execute-sql,Controlling the ability to execute arbitrary SQL,"Datasette defaults to allowing any site visitor to execute their own custom SQL queries, for example using the form on the database page or by appending a ?_where= parameter to the table page like this . Access to this ability is controlled by the execute-sql permission. The easiest way to disable arbitrary SQL queries is using the default_allow_sql setting when you first start Datasette running. You can alternatively use an ""allow_sql"" block to control who is allowed to execute arbitrary SQL queries. To prevent any user from executing arbitrary SQL queries, use this: [[[cog config_example(cog, """""" allow_sql: false """""") ]]] [[[end]]] To enable just the root user to execute SQL for all databases in your instance, use the following: [[[cog config_example(cog, """""" allow_sql: id: root """""") ]]] [[[end]]] To limit this ability for just one specific database, use this: [[[cog config_example(cog, """""" databases: mydatabase: allow_sql: id: root """""") ]]] [[[end]]]","[""Authentication and permissions"", ""Access permissions in ""]","[{""href"": ""https://latest.datasette.io/fixtures"", ""label"": ""the database page""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_where=_city_id=1"", ""label"": ""like this""}]" changelog:control-http-caching-with-ttl,changelog,control-http-caching-with-ttl,Control HTTP caching with ?_ttl=,"You can now customize the HTTP max-age header that is sent on a per-URL basis, using the new ?_ttl= query string parameter. You can set this to any value in seconds, or you can set it to 0 to disable HTTP caching entirely. 
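For illustration only (not part of the original changelog), a minimal sketch of comparing the two behaviours from Python using httpx - the instance URL and SQL query below are hypothetical placeholders for a local Datasette instance:

    # Sketch: compare Cache-Control headers with and without ?_ttl=0.
    # The URL and query are placeholders; point them at your own instance.
    import httpx

    url = "http://127.0.0.1:8001/fixtures.json"
    params = {"sql": "select random()", "_shape": "array"}
    cached = httpx.get(url, params=params)
    uncached = httpx.get(url, params={**params, "_ttl": "0"})
    print(cached.headers.get("cache-control"))    # default max-age set by Datasette
    print(uncached.headers.get("cache-control"))  # caching disabled via ?_ttl=0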
Consider for example this query which returns a randomly selected member of the Avengers: select * from [avengers/avengers] order by random() limit 1 If you hit the following page repeatedly you will get the same result, due to HTTP caching: /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1 By adding ?_ttl=0 to the URL you can ensure the page will not be cached and get back a different super hero every time: /fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1"", ""label"": ""/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1""}, {""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0"", ""label"": ""/fivethirtyeight?sql=select+*+from+%5Bavengers%2Favengers%5D+order+by+random%28%29+limit+1&_ttl=0""}]" contributing:id1,contributing,id1,Contributing,"Datasette is an open source project. We welcome contributions! This document describes how to contribute to Datasette core. You can also contribute to the wider Datasette ecosystem by creating new Plugins .",[],[] contributing:contributing-continuous-deployment,contributing,contributing-continuous-deployment,Continuously deployed demo instances,"The demo instance at latest.datasette.io is re-deployed automatically to Google Cloud Run for every push to main that passes the test suite. This is implemented by the GitHub Actions workflow at .github/workflows/deploy-latest.yml . Specific branches can also be set to automatically deploy by adding them to the on: push: branches block at the top of the workflow YAML file. Branches configured in this way will be deployed to a new Cloud Run service whether or not their tests pass.
The Cloud Run URL for a branch demo can be found in the GitHub Actions logs.","[""Contributing""]","[{""href"": ""https://latest.datasette.io/"", ""label"": ""latest.datasette.io""}, {""href"": ""https://github.com/simonw/datasette/blob/main/.github/workflows/deploy-latest.yml"", ""label"": "".github/workflows/deploy-latest.yml""}]" index:contents,index,contents,Contents,"Getting started Play with a live demo Follow a tutorial Datasette in your browser with Datasette Lite Try Datasette without installing anything using Glitch Using Datasette on your own computer Installation Basic installation Datasette Desktop for Mac Using Homebrew Using pip Advanced installation options Using pipx Using Docker A note about extensions Configuration Configuration via the command-line datasette.yaml reference Settings Plugin configuration Permissions configuration Canned queries configuration Custom CSS and JavaScript The Datasette Ecosystem sqlite-utils Dogsheep CLI reference datasette --help datasette serve datasette --get datasette serve --help-settings datasette plugins datasette install datasette uninstall datasette publish datasette publish cloudrun datasette publish heroku datasette package datasette inspect datasette create-token Pages and API endpoints Top-level index Database Hidden tables Table Row Publishing data datasette publish Publishing to Google Cloud Run Publishing to Heroku Publishing to Vercel Publishing to Fly Custom metadata and plugins datasette package Deploying Datasette Deployment fundamentals Running Datasette using systemd Running Datasette using OpenRC Deploying using buildpacks Running Datasette behind a proxy Nginx proxy configuration Apache proxy configuration JSON API Default representation Different shapes Pagination Special JSON arguments Table arguments Column filter arguments Special table arguments Expanding foreign key references Discovering the JSON for a page Enabling CORS The JSON write API Inserting rows Upserting rows Updating a row Deleting a row Creating a table Creating a table from example data Dropping tables Running SQL queries Named parameters Views Canned queries Canned query parameters Additional canned query options Writable canned queries Magic parameters JSON API for writable canned queries Pagination Cross-database queries Authentication and permissions Actors Using the ""root"" actor Permissions How permissions are resolved Defining permissions with ""allow"" blocks The /-/allow-debug tool Access permissions in datasette.yaml Access to an instance Access to specific databases Access to specific tables and views Access to specific canned queries Controlling the ability to execute arbitrary SQL Other permissions in datasette.yaml API Tokens datasette create-token Checking permissions in plugins actor_matches_allow() The permissions debug tool The ds_actor cookie Including an expiry time The /-/logout page Built-in permissions view-instance view-database view-database-download view-table view-query insert-row delete-row update-row create-table alter-table drop-table execute-sql permissions-debug debug-menu Performance and caching Immutable mode Using ""datasette inspect"" HTTP caching datasette-hashed-urls CSV export URL parameters Streaming all records Binary data Linking to binary downloads Binary plugins Facets Facets in query strings Facets in metadata Suggested facets Speeding up facets with indexes Facet by JSON array Facet by date Full-text search The table page and table view API Advanced SQLite search queries Configuring full-text search for 
a table or view Searches using custom SQL Enabling full-text search for a SQLite table Configuring FTS using sqlite-utils Configuring FTS using csvs-to-sqlite Configuring FTS by hand FTS versions SpatiaLite Warning Installation Installing SpatiaLite on OS X Installing SpatiaLite on Linux Spatial indexing latitude/longitude columns Making use of a spatial index Importing shapefiles into SpatiaLite Importing GeoJSON polygons using Shapely Querying polygons using within() Metadata Per-database and per-table metadata Source, license and about Column descriptions Specifying units for a column Setting a default sort order Setting a custom page size Setting which columns can be used for sorting Specifying the label column for a table Hiding tables Metadata reference Top-level metadata Database-level metadata Table-level metadata Settings Using --setting Configuration directory mode Settings default_allow_sql default_page_size sql_time_limit_ms max_returned_rows max_insert_rows num_sql_threads allow_facet default_facet_size facet_time_limit_ms facet_suggest_time_limit_ms suggest_facets allow_download allow_signed_tokens max_signed_tokens_ttl default_cache_ttl cache_size_kb allow_csv_stream max_csv_mb truncate_cells_html force_https_urls template_debug trace_debug base_url Configuring the secret Using secrets with datasette publish Introspection /-/metadata /-/versions /-/plugins /-/settings /-/config /-/databases /-/threads /-/actor /-/messages Custom pages and templates CSS classes on the Serving static files Publishing static assets Custom templates Custom pages Path parameters for pages Custom headers and status codes Returning 404s Custom redirects Custom error pages Plugins Installing plugins One-off plugins using --plugins-dir Deploying plugins using datasette publish Controlling which plugins are loaded Seeing what plugins are installed Plugin configuration Secret configuration values Writing plugins Tracing plugin hooks Writing one-off plugins Starting an installable plugin using cookiecutter Packaging a plugin Static assets Custom templates Writing plugins that accept configuration Designing URLs for your plugin Building URLs within plugins Plugins that define new plugin hooks JavaScript plugins The datasette_init event datasetteManager JavaScript plugin objects makeAboveTablePanelConfigs() makeColumnActions(columnDetails) Selectors Plugin hooks prepare_connection(conn, database, datasette) prepare_jinja2_environment(env, datasette) Page extras extra_template_vars(template, database, table, columns, view_name, request, datasette) extra_css_urls(template, database, table, columns, view_name, request, datasette) extra_js_urls(template, database, table, columns, view_name, request, datasette) extra_body_script(template, database, table, columns, view_name, request, datasette) publish_subcommand(publish) render_cell(row, value, column, table, database, datasette, request) register_output_renderer(datasette) register_routes(datasette) register_commands(cli) register_facet_classes() register_permissions(datasette) asgi_wrapper(datasette) startup(datasette) canned_queries(datasette, database, actor) actor_from_request(datasette, request) actors_from_ids(datasette, actor_ids) jinja2_environment_from_request(datasette, request, env) filters_from_request(request, database, table, datasette) permission_allowed(datasette, actor, action, resource) register_magic_parameters(datasette) forbidden(datasette, request, message) handle_exception(datasette, request, exception) skip_csrf(datasette, scope) 
get_metadata(datasette, key, database, table) menu_links(datasette, actor, request) Action hooks table_actions(datasette, actor, database, table, request) view_actions(datasette, actor, database, view, request) query_actions(datasette, actor, database, query_name, request, sql, params) row_actions(datasette, actor, request, database, table, row) database_actions(datasette, actor, database, request) homepage_actions(datasette, actor, request) Template slots top_homepage(datasette, request) top_database(datasette, request, database) top_table(datasette, request, database, table) top_row(datasette, request, database, table, row) top_query(datasette, request, database, sql) top_canned_query(datasette, request, database, query_name) Event tracking track_event(datasette, event) register_events(datasette) Testing plugins Setting up a Datasette test instance Using datasette.client in tests Using pdb for errors thrown inside Datasette Using pytest fixtures Testing outbound HTTP calls with pytest-httpx Registering a plugin for the duration of a test Internals for plugins Request object The MultiParams class Response class Returning a response with .asgi_send(send) Setting cookies with response.set_cookie() Datasette class .databases .permissions .plugin_config(plugin_name, database=None, table=None) await .render_template(template, context=None, request=None) await .actors_from_ids(actor_ids) await .permission_allowed(actor, action, resource=None, default=...) await .ensure_permissions(actor, permissions) await .check_visibility(actor, action=None, resource=None, permissions=None) .create_token(actor_id, expires_after=None, restrict_all=None, restrict_database=None, restrict_resource=None) .get_permission(name_or_abbr) .get_database(name) .get_internal_database() .add_database(db, name=None, route=None) .add_memory_database(name) .remove_database(name) await .track_event(event) .sign(value, namespace=""default"") .unsign(value, namespace=""default"") .add_message(request, message, type=datasette.INFO) .absolute_url(request, path) .setting(key) .resolve_database(request) .resolve_table(request) .resolve_row(request) datasette.client datasette.urls Database class Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None) db.hash await db.execute(sql, ...) 
Results await db.execute_fn(fn) await db.execute_write(sql, params=None, block=True) await db.execute_write_script(sql, block=True) await db.execute_write_many(sql, params_seq, block=True) await db.execute_write_fn(fn, block=True, transaction=True) await db.execute_isolated_fn(fn) db.close() Database introspection CSRF protection Datasette's internal database The datasette.utils module parse_metadata(content) await_me_maybe(value) derive_named_parameters(db, sql) Tilde encoding datasette.tracer Tracing child tasks Import shortcuts Events LoginEvent LogoutEvent CreateTokenEvent CreateTableEvent DropTableEvent AlterTableEvent InsertRowsEvent UpsertRowsEvent UpdateRowEvent DeleteRowEvent Contributing General guidelines Setting up a development environment Running the tests Using fixtures Debugging Code formatting Running Black blacken-docs Prettier Editing and building the documentation Running Cog Continuously deployed demo instances Release process Alpha and beta releases Releasing bug fixes from a branch Upgrading CodeMirror Changelog 1.0a13 (2024-03-12) 1.0a12 (2024-02-29) 1.0a11 (2024-02-19) 1.0a10 (2024-02-17) 1.0a9 (2024-02-16) Alter table support for create, insert, upsert and update Permissions fix for the upsert API Permission checks now consider opinions from every plugin Other changes 1.0a8 (2024-02-07) Configuration JavaScript plugins Plugin hooks Documentation Minor fixes 0.64.6 (2023-12-22) 0.64.5 (2023-10-08) 1.0a7 (2023-09-21) 0.64.4 (2023-09-21) 1.0a6 (2023-09-07) 1.0a5 (2023-08-29) 1.0a4 (2023-08-21) 1.0a3 (2023-08-09) Smaller changes 0.64.2 (2023-03-08) 0.64.1 (2023-01-11) 0.64 (2023-01-09) 0.63.3 (2022-12-17) 1.0a2 (2022-12-14) 1.0a1 (2022-12-01) 1.0a0 (2022-11-29) Signed API tokens Write API 0.63.2 (2022-11-18) 0.63.1 (2022-11-10) 0.63 (2022-10-27) Features Plugin hooks and internals Documentation 0.62 (2022-08-14) Features Plugin hooks Bug fixes Documentation 0.61.1 (2022-03-23) 0.61 (2022-03-23) 0.60.2 (2022-02-07) 0.60.1 (2022-01-20) 0.60 (2022-01-13) Plugins and internals Faceting Other small fixes 0.59.4 (2021-11-29) 0.59.3 (2021-11-20) 0.59.2 (2021-11-13) 0.59.1 (2021-10-24) 0.59 (2021-10-14) 0.58.1 (2021-07-16) 0.58 (2021-07-14) 0.57.1 (2021-06-08) 0.57 (2021-06-05) New features Bug fixes and other improvements 0.56.1 (2021-06-05) 0.56 (2021-03-28) 0.55 (2021-02-18) 0.54.1 (2021-02-02) 0.54 (2021-01-25) The _internal database Named in-memory database support JavaScript modules Code formatting with Black and Prettier Other changes 0.53 (2020-12-10) 0.52.5 (2020-12-09) 0.52.4 (2020-12-05) 0.52.3 (2020-12-03) 0.52.2 (2020-12-02) 0.52.1 (2020-11-29) 0.52 (2020-11-28) 0.51.1 (2020-10-31) 0.51 (2020-10-31) New visual design Plugins can now add links within Datasette Binary data URL building Running Datasette behind a proxy Smaller changes 0.50.2 (2020-10-09) 0.50.1 (2020-10-09) 0.50 (2020-10-09) 0.49.1 (2020-09-15) 0.49 (2020-09-14) 0.48 (2020-08-16) 0.47.3 (2020-08-15) 0.47.2 (2020-08-12) 0.47.1 (2020-08-11) 0.47 (2020-08-11) 0.46 (2020-08-09) 0.45 (2020-07-01) Magic parameters for canned queries Log out Better plugin documentation New plugin hooks Smaller changes 0.44 (2020-06-11) Authentication Permissions Writable canned queries Flash messages Signed values and secrets CSRF protection Cookie methods register_routes() plugin hooks Smaller changes The road to Datasette 1.0 0.43 (2020-05-28) 0.42 (2020-05-08) 0.41 (2020-05-06) 0.40 (2020-04-21) 0.39 (2020-03-24) 0.38 (2020-03-08) 0.37.1 (2020-03-02) 0.37 (2020-02-25) 0.36 (2020-02-21) 0.35 (2020-02-04) 0.34 
(2020-01-29) 0.33 (2019-12-22) 0.32 (2019-11-14) 0.31.2 (2019-11-13) 0.31.1 (2019-11-12) 0.31 (2019-11-11) 0.30.2 (2019-11-02) 0.30.1 (2019-10-30) 0.30 (2019-10-18) 0.29.3 (2019-09-02) 0.29.2 (2019-07-13) 0.29.1 (2019-07-11) 0.29 (2019-07-07) ASGI New plugin hook: asgi_wrapper New plugin hook: extra_template_vars Secret plugin configuration options Facet by date Easier custom templates for table rows ?_through= for joins through many-to-many tables Small changes 0.28 (2019-05-19) Supporting databases that change Faceting improvements, and faceting plugins datasette publish cloudrun register_output_renderer plugins Medium changes Small changes 0.27.1 (2019-05-09) 0.27 (2019-01-31) 0.26.1 (2019-01-10) 0.26 (2019-01-02) 0.25.2 (2018-12-16) 0.25.1 (2018-11-04) 0.25 (2018-09-19) 0.24 (2018-07-23) 0.23.2 (2018-07-07) 0.23.1 (2018-06-21) 0.23 (2018-06-18) CSV export Foreign key expansions New configuration settings Control HTTP caching with ?_ttl= Improved support for SpatiaLite latest.datasette.io Miscellaneous 0.22.1 (2018-05-23) 0.22 (2018-05-20) 0.21 (2018-05-05) 0.20 (2018-04-20) 0.19 (2018-04-16) 0.18 (2018-04-14) 0.17 (2018-04-13) 0.16 (2018-04-13) 0.15 (2018-04-09) 0.14 (2017-12-09) 0.13 (2017-11-24) 0.12 (2017-11-16) 0.11 (2017-11-14) 0.10 (2017-11-14) 0.9 (2017-11-13) 0.8 (2017-11-13)","[""Datasette""]",[] settings:setting-secret,settings,setting-secret,Configuring the secret,"Datasette uses a secret string to sign secure values such as cookies. If you do not provide a secret, Datasette will create one when it starts up. This secret will reset every time the Datasette server restarts though, so things like authentication cookies and API tokens will not stay valid between restarts. You can pass a secret to Datasette in two ways: with the --secret command-line option or by setting a DATASETTE_SECRET environment variable. datasette mydb.db --secret=SECRET_VALUE_HERE Or: export DATASETTE_SECRET=SECRET_VALUE_HERE datasette mydb.db One way to generate a secure random secret is to use Python like this: python3 -c 'import secrets; print(secrets.token_hex(32))' cdb19e94283a20f9d42cca50c5a4871c0aa07392db308755d60a1a5b9bb0fa52 Plugin authors make use of this signing mechanism in their plugins using .sign(value, namespace=""default"") and .unsign(value, namespace=""default"") .","[""Settings""]",[] full_text_search:full-text-search-table-or-view,full_text_search,full-text-search-table-or-view,Configuring full-text search for a table or view,"If a table has a corresponding FTS table set up using the content= argument to CREATE VIRTUAL TABLE shown below, Datasette will detect it automatically and add a search interface to the table page for that table. You can also manually configure which table should be used for full-text search using query string parameters or Metadata . You can set the associated FTS table for a specific table and you can also set one for a view - if you do that, the page for that SQL view will offer a search option. Use ?_fts_table=x to over-ride the FTS table for a specific page. If the primary key was something other than rowid you can use ?_fts_pk=col to set that as well. This is particularly useful for views, for example: https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk The fts_table metadata property can be used to specify an associated FTS table. If the primary key column in your table which was used to populate the FTS table is something other than rowid , you can specify the column to use with the fts_pk property. 
The ""searchmode"": ""raw"" property can be used to default the table to accepting SQLite advanced search operators, as described in Advanced SQLite search queries . Here is an example which enables full-text search (with SQLite advanced search operators) for a display_ads view which is defined against the ads table and hence needs to run FTS against the ads_fts table, using the id as the primary key: [[[cog from metadata_doc import metadata_example metadata_example(cog, { ""databases"": { ""russian-ads"": { ""tables"": { ""display_ads"": { ""fts_table"": ""ads_fts"", ""fts_pk"": ""id"", ""searchmode"": ""raw"" } } } } }) ]]] [[[end]]]","[""Full-text search""]","[{""href"": ""https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk"", ""label"": ""https://latest.datasette.io/fixtures/searchable_view?_fts_table=searchable_fts&_fts_pk=pk""}]" full_text_search:configuring-fts-using-sqlite-utils,full_text_search,configuring-fts-using-sqlite-utils,Configuring FTS using sqlite-utils,"sqlite-utils is a CLI utility and Python library for manipulating SQLite databases. You can use it from Python code to configure FTS search, or you can achieve the same goal using the accompanying command-line tool . Here's how to use sqlite-utils to enable full-text search for an items table across the name and description columns: sqlite-utils enable-fts mydatabase.db items name description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/python-api.html#enabling-full-text-search"", ""label"": ""it from Python code""}, {""href"": ""https://sqlite-utils.datasette.io/en/latest/cli.html#configuring-full-text-search"", ""label"": ""using the accompanying command-line tool""}]" full_text_search:configuring-fts-using-csvs-to-sqlite,full_text_search,configuring-fts-using-csvs-to-sqlite,Configuring FTS using csvs-to-sqlite,"If your data starts out in CSV files, you can use Datasette's companion tool csvs-to-sqlite to convert that file into a SQLite database and enable full-text search on specific columns. For a file called items.csv where you want full-text search to operate against the name and description columns you would run the following: csvs-to-sqlite items.csv items.db -f name -f description","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://github.com/simonw/csvs-to-sqlite"", ""label"": ""csvs-to-sqlite""}]" full_text_search:configuring-fts-by-hand,full_text_search,configuring-fts-by-hand,Configuring FTS by hand,"We recommend using sqlite-utils , but if you want to hand-roll a SQLite full-text search table you can do so using the following SQL. To enable full-text search for a table called items that works against the name and description columns, you would run this SQL to create a new items_fts FTS virtual table: CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 ( name, description, content=""items"" ); This creates a set of tables to power full-text search against items . The new items_fts table will be detected by Datasette as the fts_table for the items table. Creating the table is not enough: you also need to populate it with a copy of the data that you wish to make searchable. 
You can do that using the following SQL: INSERT INTO ""items_fts"" (rowid, name, description) SELECT rowid, name, description FROM items; If your table has columns that are foreign key references to other tables you can include that data in your full-text search index using a join. Imagine the items table has a foreign key column called category_id which refers to a categories table - you could create a full-text search table like this: CREATE VIRTUAL TABLE ""items_fts"" USING FTS4 ( name, description, category_name, content=""items"" ); And then populate it like this: INSERT INTO ""items_fts"" (rowid, name, description, category_name) SELECT items.rowid, items.name, items.description, categories.name FROM items JOIN categories ON items.category_id=categories.id; You can use this technique to populate the full-text search index from any combination of tables and joins that makes sense for your project.","[""Full-text search"", ""Enabling full-text search for a SQLite table""]","[{""href"": ""https://sqlite-utils.datasette.io/"", ""label"": ""sqlite-utils""}]" configuration:configuration-cli,configuration,configuration-cli,Configuration via the command-line,"The recommended way to configure Datasette is using a datasette.yaml file passed to -c/--config . You can also pass individual settings to Datasette using the -s/--setting option, which can be used multiple times: datasette mydatabase.db \ --setting settings.default_page_size 50 \ --setting settings.sql_time_limit_ms 3500 This option takes dotted-notation for the first argument and a value for the second argument. This means you can use it to set any configuration value that would be valid in a datasette.yaml file. It also works for plugin configuration, for example for datasette-cluster-map : datasette mydatabase.db \ --setting plugins.datasette-cluster-map.latitude_column xlat \ --setting plugins.datasette-cluster-map.longitude_column xlon If the value you provide is a valid JSON object or list it will be treated as nested data, allowing you to configure plugins that accept lists such as datasette-proxy-url : datasette mydatabase.db \ -s plugins.datasette-proxy-url.paths '[{""path"": ""/proxy"", ""backend"": ""http://example.com/""}]' This is equivalent to a datasette.yaml file containing the following: [[[cog from metadata_doc import config_example import textwrap config_example(cog, textwrap.dedent( """""" plugins: datasette-proxy-url: paths: - path: /proxy backend: http://example.com/ """""").strip() ) ]]] [[[end]]]","[""Configuration""]","[{""href"": ""https://datasette.io/plugins/datasette-cluster-map"", ""label"": ""datasette-cluster-map""}, {""href"": ""https://datasette.io/plugins/datasette-proxy-url"", ""label"": ""datasette-proxy-url""}]" settings:config-dir,settings,config-dir,Configuration directory mode,"Normally you configure Datasette using command-line options. For a Datasette instance with custom templates, custom plugins, a static directory and several databases this can get quite verbose: datasette one.db two.db \ --metadata=metadata.json \ --template-dir=templates/ \ --plugins-dir=plugins \ --static css:css As an alternative to this, you can run Datasette in configuration directory mode. 
Create a directory with the following structure: # In a directory called my-app: my-app/one.db my-app/two.db my-app/datasette.yaml my-app/metadata.json my-app/templates/index.html my-app/plugins/my_plugin.py my-app/static/my.css Now start Datasette by providing the path to that directory: datasette my-app/ Datasette will detect the files in that directory and automatically configure itself using them. It will serve all *.db files that it finds, will load metadata.json if it exists, and will load the templates , plugins and static folders if they are present. The files that can be included in this directory are as follows. All are optional. *.db (or *.sqlite3 or *.sqlite ) - SQLite database files that will be served by Datasette datasette.yaml - Configuration for the Datasette instance metadata.json - Metadata for those databases - metadata.yaml or metadata.yml can be used as well inspect-data.json - the result of running datasette inspect *.db --inspect-file=inspect-data.json from the configuration directory - any database files listed here will be treated as immutable, so they should not be changed while Datasette is running templates/ - a directory containing Custom templates plugins/ - a directory containing plugins, see Writing one-off plugins static/ - a directory containing static files - these will be served from /static/filename.txt , see Serving static files","[""Settings""]",[] changelog:configuration,changelog,configuration,Configuration,"Plugin configuration now lives in the datasette.yaml configuration file , passed to Datasette using the -c/--config option. Thanks, Alex Garcia. ( #2093 ) datasette -c datasette.yaml Where datasette.yaml contains configuration that looks like this: plugins: datasette-cluster-map: latitude_column: xlat longitude_column: xlon Previously plugins were configured in metadata.yaml , which was confusing as plugin settings were unrelated to database and table metadata. The -s/--setting option can now be used to set plugin configuration as well. See Configuration via the command-line for details. ( #2252 ) The above YAML configuration example using -s/--setting looks like this: datasette mydatabase.db \ -s plugins.datasette-cluster-map.latitude_column xlat \ -s plugins.datasette-cluster-map.longitude_column xlon The new /-/config page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. ( #2254 ) Existing Datasette installations may already have configuration set in metadata.yaml that should be migrated to datasette.yaml . To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. ( #2247 ) ( #2248 ) ( #2249 ) Note that the datasette publish command has not yet been updated to accept a datasette.yaml configuration file. 
This will be addressed in #2195 but for the moment you can include those settings in metadata.yaml instead.","[""Changelog"", ""1.0a8 (2024-02-07)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2093"", ""label"": ""#2093""}, {""href"": ""https://github.com/simonw/datasette/issues/2252"", ""label"": ""#2252""}, {""href"": ""https://github.com/simonw/datasette/issues/2254"", ""label"": ""#2254""}, {""href"": ""https://github.com/simonw/datasette/issues/2247"", ""label"": ""#2247""}, {""href"": ""https://github.com/simonw/datasette/issues/2248"", ""label"": ""#2248""}, {""href"": ""https://github.com/simonw/datasette/issues/2249"", ""label"": ""#2249""}, {""href"": ""https://github.com/simonw/datasette/issues/2195"", ""label"": ""#2195""}]" configuration:id1,configuration,id1,Configuration,"Datasette offers several ways to configure your Datasette instances: server settings, plugin configuration, authentication, and more. Most configuration can be handled using a datasette.yaml configuration file, passed to datasette using the -c/--config flag: datasette mydatabase.db --config datasette.yaml This file can also use JSON, as datasette.json . YAML is recommended over JSON due to its support for comments and multi-line strings.",[],[] json_api:column-filter-arguments,json_api,column-filter-arguments,Column filter arguments,"You can filter the data returned by the table based on column values using a query string argument. ?column__exact=value or ?_column=value Returns rows where the specified column exactly matches the value. ?column__not=value Returns rows where the column does not match the value. ?column__contains=value Rows where the string column contains the specified value ( column like ""%value%"" in SQL). ?column__notcontains=value Rows where the string column does not contain the specified value ( column not like ""%value%"" in SQL). ?column__endswith=value Rows where the string column ends with the specified value ( column like ""%value"" in SQL). ?column__startswith=value Rows where the string column starts with the specified value ( column like ""value%"" in SQL). ?column__gt=value Rows which are greater than the specified value. ?column__gte=value Rows which are greater than or equal to the specified value. ?column__lt=value Rows which are less than the specified value. ?column__lte=value Rows which are less than or equal to the specified value. ?column__like=value Match rows with a LIKE clause, case insensitive and with % as the wildcard character. ?column__notlike=value Match rows that do not match the provided LIKE clause. ?column__glob=value Similar to LIKE but uses Unix wildcard syntax and is case sensitive. ?column__in=value1,value2,value3 Rows where column matches any of the provided values. You can use a comma separated string, or you can use a JSON array. The JSON array option is useful if one of your matching values itself contains a comma: ?column__in=[""value"",""value,with,commas""] ?column__notin=value1,value2,value3 Rows where column does not match any of the provided values. The inverse of __in= . Also supports JSON arrays. ?column__arraycontains=value Works against columns that contain JSON arrays - matches if any of the values in that array match the provided value. This is only available if the json1 SQLite extension is enabled. ?column__arraynotcontains=value Works against columns that contain JSON arrays - matches if none of the values in that array match the provided value. This is only available if the json1 SQLite extension is enabled. 
?column__date=value Column is a datestamp occurring on the specified YYYY-MM-DD date, e.g. 2018-01-02 . ?column__isnull=1 Matches rows where the column is null. ?column__notnull=1 Matches rows where the column is not null. ?column__isblank=1 Matches rows where the column is blank, meaning null or the empty string. ?column__notblank=1 Matches rows where the column is not blank.","[""JSON API"", ""Table arguments""]",[] metadata:metadata-column-descriptions,metadata,metadata-column-descriptions,Column descriptions,"You can include descriptions for your columns by adding a ""columns"": {""name-of-column"": ""description-of-column""} block to your table metadata: [[[cog metadata_example(cog, { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""columns"": { ""column1"": ""Description of column 1"", ""column2"": ""Description of column 2"" } } } } } }) ]]] [[[end]]] These will be displayed at the top of the table page, and will also show in the cog menu for each column. You can see an example of how these look at latest.datasette.io/fixtures/roadside_attractions .","[""Metadata""]","[{""href"": ""https://latest.datasette.io/fixtures/roadside_attractions"", ""label"": ""latest.datasette.io/fixtures/roadside_attractions""}]" changelog:code-formatting-with-black-and-prettier,changelog,code-formatting-with-black-and-prettier,Code formatting with Black and Prettier,"Datasette adopted Black for opinionated Python code formatting in June 2019. Datasette now also embraces Prettier for JavaScript formatting, which like Black is enforced by tests in continuous integration. Instructions for using these two tools can be found in the new section on Code formatting in the contributors documentation. ( #1167 )","[""Changelog"", ""0.54 (2021-01-25)""]","[{""href"": ""https://github.com/psf/black"", ""label"": ""Black""}, {""href"": ""https://prettier.io/"", ""label"": ""Prettier""}, {""href"": ""https://github.com/simonw/datasette/issues/1167"", ""label"": ""#1167""}]" contributing:contributing-formatting,contributing,contributing-formatting,Code formatting,"Datasette uses opinionated code formatters: Black for Python and Prettier for JavaScript. These formatters are enforced by Datasette's continuous integration: if a commit includes Python or JavaScript code that does not match the style enforced by those tools, the tests will fail. When developing locally, you can verify and correct the formatting of your code using these tools.","[""Contributing""]","[{""href"": ""https://github.com/psf/black"", ""label"": ""Black""}, {""href"": ""https://prettier.io/"", ""label"": ""Prettier""}]" authentication:permissions-plugins,authentication,permissions-plugins,Checking permissions in plugins,"Datasette plugins can check if an actor has permission to perform an action using the datasette.permission_allowed(...) method. Datasette core performs a number of permission checks, documented below . Plugins can implement the permission_allowed(datasette, actor, action, resource) plugin hook to participate in decisions about whether an actor should be able to perform a specified action.","[""Authentication and permissions""]",[] changelog:id1,changelog,id1,Changelog,,[],[] sql_queries:canned-queries-named-parameters,sql_queries,canned-queries-named-parameters,Canned query parameters,"Canned queries support named parameters, so if you include those in the SQL you will then be able to enter them using the form fields on the canned query page or by adding them to the URL. 
This means canned queries can be used to create custom JSON APIs based on a carefully designed SQL statement. Here's an example of a canned query with a named parameter: select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood; In the canned query configuration looks like this: [[[cog config_example(cog, """""" databases: fixtures: queries: neighborhood_search: title: Search neighborhoods sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood """""") ]]] [[[end]]] Note that we are using SQLite string concatenation here - the || operator - to add wildcard % characters to the string provided by the user. You can try this canned query out here: https://latest.datasette.io/fixtures/neighborhood_search?text=town In this example the :text named parameter is automatically extracted from the query using a regular expression. You can alternatively provide an explicit list of named parameters using the ""params"" key, like this: [[[cog config_example(cog, """""" databases: fixtures: queries: neighborhood_search: title: Search neighborhoods params: - text sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood """""") ]]] [[[end]]]","[""Running SQL queries"", ""Canned queries""]","[{""href"": ""https://latest.datasette.io/fixtures/neighborhood_search?text=town"", ""label"": ""https://latest.datasette.io/fixtures/neighborhood_search?text=town""}]" configuration:configuration-reference-canned-queries,configuration,configuration-reference-canned-queries,Canned queries configuration,"Canned queries are named SQL queries that appear in the Datasette interface. They can be configured in datasette.yaml using the queries key at the database level: [[[cog from metadata_doc import config_example, config_example config_example(cog, { ""databases"": { ""sf-trees"": { ""queries"": { ""just_species"": { ""sql"": ""select qSpecies from Street_Tree_List"" } } } } }) ]]] [[[end]]] See the canned queries documentation for more, including how to configure writable canned queries .","[""Configuration"", null]",[] sql_queries:id1,sql_queries,id1,Canned queries,"As an alternative to adding views to your database, you can define canned queries inside your datasette.yaml file. Here's an example: [[[cog from metadata_doc import config_example, config_example config_example(cog, { ""databases"": { ""sf-trees"": { ""queries"": { ""just_species"": { ""sql"": ""select qSpecies from Street_Tree_List"" } } } } }) ]]] [[[end]]] Then run Datasette like this: datasette sf-trees.db -m metadata.json Each canned query will be listed on the database index page, and will also get its own URL at: /database-name/canned-query-name For the above example, that URL would be: /sf-trees/just_species You can optionally include ""title"" and ""description"" keys to show a title and description on the canned query page. As with regular table metadata you can alternatively specify ""description_html"" to have your description rendered as HTML (rather than having HTML special characters escaped).","[""Running SQL queries""]",[] changelog:csv-export,changelog,csv-export,CSV export,"Any Datasette table, view or custom SQL query can now be exported as CSV. 
Check out the CSV export documentation for more details, or try the feature out on https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies If your table has more than max_returned_rows (default 1,000) Datasette provides the option to stream all rows . This option takes advantage of async Python and Datasette's efficient pagination to iterate through the entire matching result set and stream it back as a downloadable CSV file.","[""Changelog"", ""0.23 (2018-06-18)""]","[{""href"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies"", ""label"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/bechdel%2Fmovies""}]" csv_export:id1,csv_export,id1,CSV export,"Any Datasette table, view or custom SQL query can be exported as CSV. To obtain the CSV representation of the table you are looking at, click the ""this data as CSV"" link. You can also use the advanced export form for more control over the resulting file, which has the following options: download file - instead of displaying CSV in your browser, this forces your browser to download the CSV to your downloads directory. expand labels - if your table has any foreign key references this option will cause the CSV to gain additional COLUMN_NAME_label columns with a label for each foreign key derived from the linked table. In this example the city_id column is accompanied by a city_id_label column. stream all rows - by default CSV files only contain the first max_returned_rows records. This option will cause Datasette to loop through every matching record and return them as a single CSV file. You can try that out on https://latest.datasette.io/fixtures/facetable?_size=4",[],"[{""href"": ""https://latest.datasette.io/fixtures/facetable.csv?_labels=on&_size=max"", ""label"": ""In this example""}, {""href"": ""https://latest.datasette.io/fixtures/facetable?_size=4"", ""label"": ""https://latest.datasette.io/fixtures/facetable?_size=4""}]" custom_templates:css-classes-on-the-body,custom_templates,css-classes-on-the-body,CSS classes on the <body>,"Every default template includes CSS classes in the body designed to support custom styling. Each default template gets a body class identifying the type of page, plus db-x and table-x classes identifying the database and table where relevant: the index template (the top level page at / ), the database template ( /dbname ), the custom SQL template ( /dbname?sql=... ), a canned query template ( /dbname/queryname ), the table template ( /dbname/tablename ) and the row template ( /dbname/tablename/rowid ). The db-x and table-x classes use the database or table names themselves if they are valid CSS identifiers. If they aren't, we strip any invalid characters out and append a 6 character md5 digest of the original name, in order to ensure that multiple tables which resolve to the same stripped character version still have different CSS classes. Some examples: ""simple"" => ""simple"" ""MixedCase"" => ""MixedCase"" ""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6"" ""_no-leading-underscores"" => ""no-leading-underscores-b921bc"" ""no spaces"" => ""no-spaces-7088d7"" ""-"" => ""336d5e"" ""no $ characters"" => ""no--characters-59e024"" <td> and <th> elements also get custom CSS classes reflecting the database column they are representing.
For example, columns named id and name produce th and td cells with classes based on those column names (such as col-id and col-name).
","[""Custom pages and templates""]",[] changelog:csrf-protection,changelog,csrf-protection,CSRF protection,"Since writable canned queries are built using POST forms, Datasette now ships with CSRF protection ( #798 ). This applies automatically to any POST request, which means plugins need to include a csrftoken in any POST forms that they render. They can do that like so: ","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette/issues/798"", ""label"": ""#798""}]" internals:internals-csrf,internals,internals-csrf,CSRF protection,"Datasette uses asgi-csrf to guard against CSRF attacks on form POST submissions. Users receive a ds_csrftoken cookie which is compared against the csrftoken form field (or x-csrftoken HTTP header) for every incoming request. If your plugin implements a
anywhere you will need to include that token. You can do so with the following template snippet: <input type=""hidden"" name=""csrftoken"" value=""{{ csrftoken() }}""> If you are rendering templates using the await .render_template(template, context=None, request=None) method the csrftoken() helper will only work if you provide the request= argument to that method. If you forget to do this you will see the following error: form-urlencoded POST field did not match cookie You can selectively disable CSRF protection using the skip_csrf(datasette, scope) hook.","[""Internals for plugins""]","[{""href"": ""https://github.com/simonw/asgi-csrf"", ""label"": ""asgi-csrf""}]" cli-reference:id1,cli-reference,id1,CLI reference,"The datasette CLI tool provides a number of commands. Running datasette without specifying a command runs the default command, datasette serve . See datasette serve for the full list of options for that command. [[[cog from datasette import cli from click.testing import CliRunner import textwrap def help(args): title = ""datasette "" + "" "".join(args) cog.out(""\n::\n\n"") result = CliRunner().invoke(cli.cli, args) output = result.output.replace(""Usage: cli "", ""Usage: datasette "") cog.out(textwrap.indent(output, ' ')) cog.out(""\n\n"") ]]] [[[end]]]",[],[] authentication:id1,authentication,id1,Built-in permissions,"This section lists all of the permission checks that are carried out by Datasette core, along with the resource if it was passed.","[""Authentication and permissions""]",[] writing_plugins:writing-plugins-building-urls,writing_plugins,writing-plugins-building-urls,Building URLs within plugins,"Plugins that define their own custom user interface elements may need to link to other pages within Datasette. This can be a bit tricky if the Datasette instance is using the base_url configuration setting to run behind a proxy, since that can cause Datasette's URLs to include an additional prefix. The datasette.urls object provides internal methods for correctly generating URLs to different pages within Datasette, taking any base_url configuration into account. This object is exposed in templates as the urls variable, which can be used like this: <a href=""{{ urls.instance() }}"">Back to the Homepage</a> See datasette.urls for full details on this object.","[""Writing plugins""]",[] changelog:bug-fixes-and-other-improvements,changelog,bug-fixes-and-other-improvements,Bug fixes and other improvements,"Custom pages now work correctly when combined with the base_url setting. ( #1238 ) Fixed intermittent error displaying the index page when the user did not have permission to access one of the tables. Thanks, Guy Freeman. ( #1305 ) Columns with the name ""Link"" are no longer incorrectly displayed in bold. ( #1308 ) Fixed error caused by tables with a single quote in their names. ( #1257 ) Updated dependencies: pytest-asyncio , Black , jinja2 , aiofiles , click , and itsdangerous . The official Datasette Docker image now supports apt-get install .
( #1320 ) The Heroku runtime used by datasette publish heroku is now python-3.8.10 .","[""Changelog"", ""0.57 (2021-06-05)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1238"", ""label"": ""#1238""}, {""href"": ""https://github.com/simonw/datasette/issues/1305"", ""label"": ""#1305""}, {""href"": ""https://github.com/simonw/datasette/issues/1308"", ""label"": ""#1308""}, {""href"": ""https://github.com/simonw/datasette/issues/1257"", ""label"": ""#1257""}, {""href"": ""https://github.com/simonw/datasette/issues/1320"", ""label"": ""#1320""}]" changelog:bug-fixes,changelog,bug-fixes,Bug fixes,"Don't show the facet option in the cog menu if faceting is not allowed. ( #1683 ) ?_sort and ?_sort_desc now work if the column that is being sorted has been excluded from the query using ?_col= or ?_nocol= . ( #1773 ) Fixed bug where ?_sort_desc was duplicated in the URL every time the Apply button was clicked. ( #1738 )","[""Changelog"", ""0.62 (2022-08-14)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1683"", ""label"": ""#1683""}, {""href"": ""https://github.com/simonw/datasette/issues/1773"", ""label"": ""#1773""}, {""href"": ""https://github.com/simonw/datasette/issues/1738"", ""label"": ""#1738""}]" binary_data:binary-plugins,binary_data,binary-plugins,Binary plugins,"Several Datasette plugins are available that change the way Datasette treats binary data. datasette-render-binary modifies Datasette's default interface to show an automatic guess at what type of binary data is being stored, along with a visual representation of the binary value that displays ASCII strings directly in the interface. datasette-render-images detects common image formats and renders them as images directly in the Datasette interface. datasette-media allows Datasette interfaces to be configured to serve binary files from configured SQL queries, and includes the ability to resize images directly before serving them.","[""Binary data""]","[{""href"": ""https://github.com/simonw/datasette-render-binary"", ""label"": ""datasette-render-binary""}, {""href"": ""https://github.com/simonw/datasette-render-images"", ""label"": ""datasette-render-images""}, {""href"": ""https://github.com/simonw/datasette-media"", ""label"": ""datasette-media""}]" binary_data:binary,binary_data,binary,Binary data,"SQLite tables can contain binary data in BLOB columns. Datasette includes special handling for these binary values. The Datasette interface detects binary values and provides a link to download their content, for example on https://latest.datasette.io/fixtures/binary_data Binary data is represented in .json exports using Base64 encoding. https://latest.datasette.io/fixtures/binary_data.json?_shape=array [ { ""rowid"": 1, ""data"": { ""$base64"": true, ""encoded"": ""FRwCx60F/g=="" } }, { ""rowid"": 2, ""data"": { ""$base64"": true, ""encoded"": ""FRwDx60F/g=="" } }, { ""rowid"": 3, ""data"": null } ]",[],"[{""href"": ""https://latest.datasette.io/fixtures/binary_data"", ""label"": ""https://latest.datasette.io/fixtures/binary_data""}, {""href"": ""https://latest.datasette.io/fixtures/binary_data.json?_shape=array"", ""label"": ""https://latest.datasette.io/fixtures/binary_data.json?_shape=array""}]" changelog:binary-data,changelog,binary-data,Binary data,"SQLite tables can contain binary data in BLOB columns. Datasette now provides links for users to download this data directly from Datasette, and uses those links to make binary data available from CSV exports. See Binary data for more details. 
( #1036 and #1034 ).","[""Changelog"", ""0.51 (2020-10-31)""]","[{""href"": ""https://github.com/simonw/datasette/issues/1036"", ""label"": ""#1036""}, {""href"": ""https://github.com/simonw/datasette/issues/1034"", ""label"": ""#1034""}]" changelog:better-plugin-documentation,changelog,better-plugin-documentation,Better plugin documentation,"The plugin documentation has been re-arranged into four sections, including a brand new section on testing plugins. ( #687 ) Plugins introduces Datasette's plugin system and describes how to install and configure plugins. Writing plugins describes how to author plugins, from one-off single file plugins to packaged plugins that can be published to PyPI. It also describes how to start a plugin using the new datasette-plugin cookiecutter template. Plugin hooks is a full list of detailed documentation for every Datasette plugin hook. Testing plugins describes how to write tests for Datasette plugins, using pytest and HTTPX .","[""Changelog"", ""0.45 (2020-07-01)""]","[{""href"": ""https://github.com/simonw/datasette/issues/687"", ""label"": ""#687""}, {""href"": ""https://github.com/simonw/datasette-plugin"", ""label"": ""datasette-plugin""}, {""href"": ""https://docs.pytest.org/"", ""label"": ""pytest""}, {""href"": ""https://www.python-httpx.org/"", ""label"": ""HTTPX""}]" installation:installation-basic,installation,installation-basic,Basic installation,,"[""Installation""]",[] authentication:authentication,authentication,authentication,Authentication and permissions,"Datasette doesn't require authentication by default. Any visitor to a Datasette instance can explore the full data and execute read-only SQL queries. Datasette's plugin system can be used to add many different styles of authentication, such as user accounts, single sign-on or API keys.",[],[] changelog:authentication,changelog,authentication,Authentication,"Prior to this release the Datasette ecosystem has treated authentication as exclusively the realm of plugins, most notably through datasette-auth-github . 0.44 introduces Authentication and permissions as core Datasette concepts ( #699 ). This enables different plugins to share responsibility for authenticating requests - you might have one plugin that handles user accounts and another one that allows automated access via API keys, for example. You'll need to install plugins if you want full user accounts, but default Datasette can now authenticate a single root user with the new --root command-line option, which outputs a one-time use URL to authenticate as a root actor ( #784 ): datasette fixtures.db --root http://127.0.0.1:8001/-/auth-token?token=5b632f8cd44b868df625f5a6e2185d88eea5b22237fd3cc8773f107cc4fd6477 INFO: Started server process [14973] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) Plugins can implement new ways of authenticating users using the new actor_from_request(datasette, request) hook.","[""Changelog"", ""0.44 (2020-06-11)""]","[{""href"": ""https://github.com/simonw/datasette-auth-github"", ""label"": ""datasette-auth-github""}, {""href"": ""https://github.com/simonw/datasette/issues/699"", ""label"": ""#699""}, {""href"": ""https://github.com/simonw/datasette/issues/784"", ""label"": ""#784""}]" deploying:apache-proxy-configuration,deploying,apache-proxy-configuration,Apache proxy configuration,"For Apache , you can use the ProxyPass directive. 
First make sure the following lines are uncommented: LoadModule proxy_module lib/httpd/modules/mod_proxy.so LoadModule proxy_http_module lib/httpd/modules/mod_proxy_http.so Then add these directives to proxy traffic: ProxyPass /my-datasette/ http://127.0.0.1:8009/my-datasette/ ProxyPreserveHost On A live demo of Datasette running behind Apache using this proxy setup can be seen at datasette-apache-proxy-demo.datasette.io/prefix/ . The code for that demo can be found in the demos/apache-proxy directory. Using --uds you can use Unix domain sockets similar to the nginx example: ProxyPass /my-datasette/ unix:/tmp/datasette.sock|http://localhost/my-datasette/ The ProxyPreserveHost On directive ensures that the original Host: header from the incoming request is passed through to Datasette. Datasette needs this to correctly assemble links to other pages using the .absolute_url(request, path) method.","[""Deploying Datasette"", ""Running Datasette behind a proxy""]","[{""href"": ""https://httpd.apache.org/"", ""label"": ""Apache""}, {""href"": ""https://datasette-apache-proxy-demo.datasette.io/prefix/"", ""label"": ""datasette-apache-proxy-demo.datasette.io/prefix/""}, {""href"": ""https://github.com/simonw/datasette/tree/main/demos/apache-proxy"", ""label"": ""demos/apache-proxy""}, {""href"": ""https://httpd.apache.org/docs/2.4/mod/mod_proxy.html#proxypreservehost"", ""label"": ""ProxyPreserveHost On""}]" changelog:alter-table-support-for-create-insert-upsert-and-update,changelog,alter-table-support-for-create-insert-upsert-and-update,"Alter table support for create, insert, upsert and update","The JSON write API can now be used to apply simple alter table schema changes, provided the acting actor has the new alter-table permission. ( #2101 ) The only alter operation supported so far is adding new columns to an existing table. The /db/-/create API now adds new columns during large operations to create a table based on incoming example ""rows"" , in the case where one of the later rows includes columns that were not present in the earlier batches. This requires the create-table but not the alter-table permission. When /db/-/create is called with rows in a situation where the table may have been already created, an ""alter"": true key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the alter-table permission. /db/table/-/insert and /db/table/-/upsert and /db/table/row-pks/-/update all now also accept ""alter"": true , depending on the alter-table permission. Operations that alter a table now fire the new alter-table event .","[""Changelog"", ""1.0a9 (2024-02-16)""]","[{""href"": ""https://github.com/simonw/datasette/issues/2101"", ""label"": ""#2101""}]" contributing:contributing-alpha-beta,contributing,contributing-alpha-beta,Alpha and beta releases,"Alpha and beta releases are published to preview upcoming features that may not yet be stable - in particular to preview new plugin hooks. You are welcome to try these out, but please be aware that details may change before the final release. 
Please join discussions on the issue tracker to share your thoughts and experiences with alpha and beta features that you try out.","[""Contributing""]","[{""href"": ""https://github.com/simonw/datasette/issues"", ""label"": ""discussions on the issue tracker""}]" installation:installation-advanced,installation,installation-advanced,Advanced installation options,,"[""Installation""]",[] full_text_search:full-text-search-advanced-queries,full_text_search,full-text-search-advanced-queries,Advanced SQLite search queries,"SQLite full-text search includes support for a variety of advanced queries , including AND , OR , NOT and NEAR . By default Datasette disables these features to ensure they do not cause errors or confusion for users who are not aware of them. You can disable this escaping and use the advanced queries by adding &_searchmode=raw to the table page query string. If you want to enable these operators by default for a specific table, you can do so by adding ""searchmode"": ""raw"" to the metadata configuration for that table, see Configuring full-text search for a table or view . If that option has been specified in the table metadata but you want to over-ride it and return to the default behavior you can append &_searchmode=escaped to the query string.","[""Full-text search""]","[{""href"": ""https://www.sqlite.org/fts5.html#full_text_query_syntax"", ""label"": ""a variety of advanced queries""}]" sql_queries:canned-queries-options,sql_queries,canned-queries-options,Additional canned query options,Additional options can be specified for canned queries in the YAML or JSON configuration.,"[""Running SQL queries"", ""Canned queries""]",[] authentication:authentication-actor,authentication,authentication-actor,Actors,"Through plugins, Datasette can support both authenticated users (with cookies) and authenticated API agents (via authentication tokens). The word ""actor"" is used to cover both of these cases. Every request to Datasette has an associated actor value, available in the code as request.actor . This can be None for unauthenticated requests, or a JSON compatible Python dictionary for authenticated users or API agents. The actor dictionary can be any shape - the design of that data structure is left up to the plugins. A useful convention is to include an ""id"" string, as demonstrated by the ""root"" actor below. Plugins can use the actor_from_request(datasette, request) hook to implement custom logic for authenticating an actor based on the incoming HTTP request.","[""Authentication and permissions""]",[] plugin_hooks:plugin-actions,plugin_hooks,plugin-actions,Action hooks,"Action hooks can be used to add items to the action menus that appear at the top of different pages within Datasette. Unlike menu_links() , which adds links that are displayed on every page, actions should only be relevant to the page the user is currently viewing. Each of these hooks should return a list of {""href"": ""..."", ""label"": ""...""} menu items, with optional ""description"": ""..."" keys describing each action in more detail.
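As a rough illustration - a minimal sketch, assuming the table_actions() hook (one of these action hooks) and a hypothetical /-/hello-table page registered by the same plugin - a plugin might add a menu item like this:

from datasette import hookimpl


@hookimpl
def table_actions(datasette, actor, database, table):
    # Only offer the action to signed-in actors
    if actor is None:
        return []
    return [
        {
            'href': datasette.urls.path('/-/hello-table'),
            'label': 'Say hello to this table',
            # Optional extra detail shown below the label
            'description': f'A hypothetical action for {database}/{table}',
        }
    ]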
They can alternatively return an async def awaitable function which, when called, returns a list of those menu items.","[""Plugin hooks""]",[] authentication:authentication-permissions-table,authentication,authentication-permissions-table,Access to specific tables and views,"To limit access to the users table in your bakery.db database: [[[cog config_example(cog, """""" databases: bakery: tables: users: allow: id: '*' """""") ]]] [[[end]]] This works for SQL views as well - you can list their names in the ""tables"" block above in the same way as regular tables. Restricting access to tables and views in this way will NOT prevent users from querying them using arbitrary SQL queries, like this for example. If you are restricting access to specific tables you should also use the ""allow_sql"" block to prevent users from bypassing the limit with their own SQL queries - see Controlling the ability to execute arbitrary SQL .","[""Authentication and permissions"", ""Access permissions in ""]","[{""href"": ""https://latest.datasette.io/fixtures?sql=select+*+from+facetable"", ""label"": ""like this""}]" authentication:authentication-permissions-database,authentication,authentication-permissions-database,Access to specific databases,"To limit access to a specific private.db database to just authenticated users, use the ""allow"" block like this: [[[cog config_example(cog, """""" databases: private: allow: id: ""*"" """""") ]]] [[[end]]]","[""Authentication and permissions"", ""Access permissions in ""]",[] authentication:authentication-permissions-query,authentication,authentication-permissions-query,Access to specific canned queries,"Canned queries allow you to configure named SQL queries in your datasette.yaml that can be executed by users. These queries can be set up to both read and write to the database, so controlling who can execute them can be important. To limit access to the add_name canned query in your dogs.db database to just the root user : [[[cog config_example(cog, """""" databases: dogs: queries: add_name: sql: INSERT INTO names (name) VALUES (:name) write: true allow: id: - root """""") ]]] [[[end]]]","[""Authentication and permissions"", ""Access permissions in ""]",[] authentication:authentication-permissions-instance,authentication,authentication-permissions-instance,Access to an instance,"Here's how to restrict access to your entire Datasette instance to just the ""id"": ""root"" user: [[[cog from metadata_doc import config_example config_example(cog, """""" title: My private Datasette instance allow: id: root """""") ]]] [[[end]]] To deny access to all users, you can use ""allow"": false : [[[cog config_example(cog, """""" title: My entirely inaccessible instance allow: false """""") ]]] [[[end]]] One reason to do this is if you are using a Datasette plugin - such as datasette-permissions-sql - to control permissions instead.","[""Authentication and permissions"", ""Access permissions in ""]","[{""href"": ""https://github.com/simonw/datasette-permissions-sql"", ""label"": ""datasette-permissions-sql""}]" authentication:authentication-permissions-config,authentication,authentication-permissions-config,Access permissions in ,"There are two ways to configure permissions using datasette.yaml (or datasette.json ). For simple visibility permissions you can use ""allow"" blocks in the root, database, table and query sections. For other permissions you can use a ""permissions"" block, described in the next section . 
You can limit who is allowed to view different parts of your Datasette instance using ""allow"" keys in your Configuration . You can control the following: Access to the entire Datasette instance Access to specific databases Access to specific tables and views Access to specific Canned queries If a user cannot access a specific database, they will not be able to access tables, views or queries within that database. If a user cannot access the instance they will not be able to access any of the databases, tables, views or queries.","[""Authentication and permissions""]",[] changelog:asgi,changelog,asgi,ASGI,"ASGI is the Asynchronous Server Gateway Interface standard. I've been wanting to convert Datasette into an ASGI application for over a year - Port Datasette to ASGI #272 tracks thirteen months of intermittent development - but with Datasette 0.29 the change is finally released. This also means Datasette now runs on top of Uvicorn and no longer depends on Sanic . I wrote about the significance of this change in Porting Datasette to ASGI, and Turtles all the way down . The most exciting consequence of this change is that Datasette plugins can now take advantage of the ASGI standard.","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://asgi.readthedocs.io/"", ""label"": ""ASGI""}, {""href"": ""https://github.com/simonw/datasette/issues/272"", ""label"": ""Port Datasette to ASGI #272""}, {""href"": ""https://www.uvicorn.org/"", ""label"": ""Uvicorn""}, {""href"": ""https://github.com/huge-success/sanic"", ""label"": ""Sanic""}, {""href"": ""https://simonwillison.net/2019/Jun/23/datasette-asgi/"", ""label"": ""Porting Datasette to ASGI, and Turtles all the way down""}]" authentication:createtokenview,authentication,createtokenview,API Tokens,"Datasette includes a default mechanism for generating API tokens that can be used to authenticate requests. Authenticated users can create new API tokens using a form on the /-/create-token page. Tokens created in this way can be further restricted to only allow access to specific actions, or to limit those actions to specific databases, tables or queries. Created tokens can then be passed in the Authorization: Bearer $token header of HTTP requests to Datasette. A token created by a user will include that user's ""id"" in the token payload, so any permissions granted to that user based on their ID can be made available to the token as well. When one of these tokens accompanies a request, the actor for that request will have the following shape: { ""id"": ""user_id"", ""token"": ""dstok"", ""token_expires"": 1667717426 } The ""id"" field duplicates the ID of the actor who first created the token. The ""token"" field identifies that this actor was authenticated using a Datasette signed token ( dstok ). The ""token_expires"" field, if present, indicates that the token will expire after that integer timestamp. The /-/create-token page cannot be accessed by actors that are authenticated with a ""token"": ""some-value"" property. This is to prevent API tokens from being used to create more tokens. Datasette plugins that implement their own form of API token authentication should follow this convention. You can disable the signed token feature entirely using the allow_signed_tokens setting.","[""Authentication and permissions""]",[] installation:installation-extensions,installation,installation-extensions,A note about extensions,"SQLite supports extensions, such as SpatiaLite for geospatial operations.
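Some Python builds cannot load SQLite extensions at all (more on this below). A quick way to check yours - a minimal sketch using only the standard library, not a command Datasette itself provides - is to look for the relevant sqlite3 method:

import sqlite3

conn = sqlite3.connect(':memory:')
# Builds compiled without loadable extension support lack this method entirely
print(hasattr(conn, 'enable_load_extension'))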
These can be loaded using the --load-extension argument, like so: datasette --load-extension=/usr/local/lib/mod_spatialite.dylib Some Python installations do not include support for SQLite extensions. If this is the case you will see the following error when you attempt to load an extension: Your Python installation does not have the ability to load SQLite extensions. In some cases you may see the following error message instead: AttributeError: 'sqlite3.Connection' object has no attribute 'enable_load_extension' On macOS the easiest fix for this is to install Datasette using Homebrew: brew install datasette Use which datasette to confirm that datasette will run that version. The output should look something like this: /usr/local/opt/datasette/bin/datasette If you get a different location here such as /Library/Frameworks/Python.framework/Versions/3.10/bin/datasette you can run the following command to cause datasette to execute the Homebrew version instead: alias datasette=$(echo $(brew --prefix datasette)/bin/datasette) You can undo this operation using: unalias datasette If you need to run SQLite with extension support for other Python code, you can do so by installing Python itself using Homebrew: brew install python Then executing Python using: /usr/local/opt/python@3/libexec/bin/python A more convenient way to work with this version of Python may be to use it to create a virtual environment: /usr/local/opt/python@3/libexec/bin/python -m venv datasette-venv Then activate it like this: source datasette-venv/bin/activate Now running python and pip will work against a version of Python 3 that includes support for SQLite extensions: pip install datasette which datasette datasette --version","[""Installation""]",[] changelog:through-for-joins-through-many-to-many-tables,changelog,through-for-joins-through-many-to-many-tables,?_through= for joins through many-to-many tables,"The new ?_through={json} argument to the Table view allows records to be filtered based on a many-to-many relationship. See Special table arguments for full documentation - here's an example . ( #355 ) This feature was added to help support facet by many-to-many , which isn't quite ready yet but will be coming in the next Datasette release.","[""Changelog"", ""0.29 (2019-07-07)""]","[{""href"": ""https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}"", ""label"": ""an example""}, {""href"": ""https://github.com/simonw/datasette/issues/355"", ""label"": ""#355""}, {""href"": ""https://github.com/simonw/datasette/issues/551"", ""label"": ""facet by many-to-many""}]" changelog:v1-0-a9,changelog,v1-0-a9,1.0a9 (2024-02-16),This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the /upsert API endpoint.,"[""Changelog""]",[] changelog:v1-0-a8,changelog,v1-0-a8,1.0a8 (2024-02-07),"This alpha release continues the migration of Datasette's configuration from metadata.yaml to the new datasette.yaml configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks.
See Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml for an annotated version of these release notes.","[""Changelog""]","[{""href"": ""https://simonwillison.net/2024/Feb/7/datasette-1a8/"", ""label"": ""Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml""}]" changelog:v1-0-a7,changelog,v1-0-a7,1.0a7 (2023-09-21),Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 ),"[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2189"", ""label"": ""#2189""}]" changelog:v1-0-a6,changelog,v1-0-a6,1.0a6 (2023-09-07),"New plugin hook: actors_from_ids(datasette, actor_ids) and an internal method to accompany it, await .actors_from_ids(actor_ids) . This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects. ( #2181 ) DATASETTE_LOAD_PLUGINS environment variable for controlling which plugins are loaded by Datasette. ( #2164 ) Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. ( #2178 ) The execute-sql permission now implies that the actor can also view the database and instance. ( #2169 ) Documentation describing a pattern for building plugins that themselves define further hooks for other plugins. ( #1765 ) Datasette is now tested against the Python 3.12 preview. ( #2175 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2181"", ""label"": ""#2181""}, {""href"": ""https://github.com/simonw/datasette/issues/2164"", ""label"": ""#2164""}, {""href"": ""https://github.com/simonw/datasette/issues/2178"", ""label"": ""#2178""}, {""href"": ""https://github.com/simonw/datasette/issues/2169"", ""label"": ""#2169""}, {""href"": ""https://github.com/simonw/datasette/issues/1765"", ""label"": ""#1765""}, {""href"": ""https://github.com/simonw/datasette/pull/2175"", ""label"": ""#2175""}]" changelog:v1-0-a5,changelog,v1-0-a5,1.0a5 (2023-08-29),"When restrictions are applied to API tokens , those restrictions now behave slightly differently: applying the view-table restriction will imply the ability to view-database for the database containing that table, and both view-table and view-database will imply view-instance . Previously you needed to create a token with restrictions that explicitly listed view-instance and view-database and view-table in order to view a table without getting a permission denied error. ( #2102 ) New datasette.yaml (or .json ) configuration file, which can be specified using datasette -c path-to-file . The goal here is to consolidate settings, plugin configuration, permissions, canned queries, and other Datasette configuration into a single file, separate from metadata.yaml . The legacy settings.json config file used for Configuration directory mode has been removed, and datasette.yaml has a ""settings"" section where the same settings key/value pairs can be included. In a future alpha release, more configuration such as plugins/permissions/canned queries will be moved to the datasette.yaml file. See #2093 for more details. Thanks, Alex Garcia. The -s/--setting option can now take dotted paths to nested settings. These will then be used to set or over-ride the same options as are present in the new configuration file.
( #2156 ) New --actor '{""id"": ""json-goes-here""}' option for use with datasette --get to treat the simulated request as being made by a specific actor, see datasette --get . ( #2153 ) The Datasette _internal database has had some changes. It no longer shows up in the datasette.databases list by default, and is now instead available to plugins using the datasette.get_internal_database() . Plugins are invited to use this as a private database to store configuration and settings and secrets that should not be made visible through the default Datasette interface. Users can pass the new --internal internal.db option to persist that internal database to disk. Thanks, Alex Garcia. ( #2157 ).","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2102"", ""label"": ""#2102""}, {""href"": ""https://github.com/simonw/datasette/issues/2093"", ""label"": ""#2093""}, {""href"": ""https://github.com/simonw/datasette/issues/2156"", ""label"": ""#2156""}, {""href"": ""https://github.com/simonw/datasette/issues/2153"", ""label"": ""#2153""}, {""href"": ""https://github.com/simonw/datasette/issues/2157"", ""label"": ""#2157""}]" changelog:v1-0-a4,changelog,v1-0-a4,1.0a4 (2023-08-21),"This alpha fixes a security issue with the /-/api API explorer. On authenticated Datasette instances (instances protected using plugins such as datasette-auth-passwords ) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed. For more information and workarounds, read the security advisory . The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3. Also in this alpha: The new datasette plugins --requirements option outputs a list of currently installed plugins in Python requirements.txt format, useful for duplicating that installation elsewhere. ( #2133 ) Writable canned queries can now define a on_success_message_sql field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. ( #2138 ) The automatically generated border color for a database is now shown in more places around the application. ( #2119 ) Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. ( #2140 )","[""Changelog""]","[{""href"": ""https://datasette.io/plugins/datasette-auth-passwords"", ""label"": ""datasette-auth-passwords""}, {""href"": ""https://github.com/simonw/datasette/security/advisories/GHSA-7ch3-7pp7-7cpq"", ""label"": ""the security advisory""}, {""href"": ""https://github.com/simonw/datasette/issues/2133"", ""label"": ""#2133""}, {""href"": ""https://github.com/simonw/datasette/issues/2138"", ""label"": ""#2138""}, {""href"": ""https://github.com/simonw/datasette/issues/2119"", ""label"": ""#2119""}, {""href"": ""https://github.com/simonw/datasette/issues/2140"", ""label"": ""#2140""}]" changelog:v1-0-a3,changelog,v1-0-a3,1.0a3 (2023-08-09),"This alpha release previews the updated design for Datasette's default JSON API. ( #782 ) The new default JSON representation for both table pages ( /dbname/table.json ) and arbitrary SQL queries ( /dbname.json?sql=... 
) is now shaped like this: { ""ok"": true, ""rows"": [ { ""id"": 3, ""name"": ""Detroit"" }, { ""id"": 2, ""name"": ""Los Angeles"" }, { ""id"": 4, ""name"": ""Memnonia"" }, { ""id"": 1, ""name"": ""San Francisco"" } ], ""truncated"": false } Tables will include an additional ""next"" key for pagination, which can be passed to ?_next= to fetch the next page of results. The various ?_shape= options continue to work as before - see Different shapes for details. A new ?_extra= mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in #262 .","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/782"", ""label"": ""#782""}, {""href"": ""https://github.com/simonw/datasette/issues/262"", ""label"": ""#262""}]" changelog:v1-0-a2,changelog,v1-0-a2,1.0a2 (2022-12-14),"The third Datasette 1.0 alpha release adds upsert support to the JSON API, plus the ability to specify finely grained permissions when creating an API token. See Datasette 1.0a2: Upserts and finely grained permissions for an extended, annotated version of these release notes. New /db/table/-/upsert API, documented here . upsert is an update-or-insert: existing rows will have specified keys updated, but if no row matches the incoming primary key a brand new row will be inserted instead. ( #1878 ) New register_permissions(datasette) plugin hook. Plugins can now register named permissions, which will then be listed in various interfaces that show available permissions. ( #1940 ) The /db/-/create API for creating a table now accepts ""ignore"": true and ""replace"": true options when called with the ""rows"" property that creates a new table based on an example set of rows. This means the API can be called multiple times with different rows, setting rules for what should happen if a primary key collides with an existing row. ( #1927 ) Arbitrary permissions can now be configured at the instance, database and resource (table, SQL view or canned query) level in Datasette's Metadata JSON and YAML files. The new ""permissions"" key can be used to specify which actors should have which permissions. See Other permissions in datasette.yaml for details. ( #1636 ) The /-/create-token page can now be used to create API tokens which are restricted to just a subset of actions, including against specific databases or resources. See API Tokens for details. ( #1947 ) Likewise, the datasette create-token CLI command can now create tokens with a subset of permissions . ( #1855 ) New datasette.create_token() API method for programmatically creating signed API tokens. ( #1951 ) /db/-/create API now requires actor to have insert-row permission in order to use the ""row"" or ""rows"" properties. 
( #1937 )","[""Changelog""]","[{""href"": ""https://simonwillison.net/2022/Dec/15/datasette-1a2/"", ""label"": ""Datasette 1.0a2: Upserts and finely grained permissions""}, {""href"": ""https://github.com/simonw/datasette/issues/1878"", ""label"": ""#1878""}, {""href"": ""https://github.com/simonw/datasette/issues/1940"", ""label"": ""#1940""}, {""href"": ""https://github.com/simonw/datasette/issues/1927"", ""label"": ""#1927""}, {""href"": ""https://github.com/simonw/datasette/issues/1636"", ""label"": ""#1636""}, {""href"": ""https://github.com/simonw/datasette/issues/1947"", ""label"": ""#1947""}, {""href"": ""https://github.com/simonw/datasette/issues/1855"", ""label"": ""#1855""}, {""href"": ""https://github.com/simonw/datasette/issues/1951"", ""label"": ""#1951""}, {""href"": ""https://github.com/simonw/datasette/issues/1937"", ""label"": ""#1937""}]" changelog:v1-0-a13,changelog,v1-0-a13,1.0a13 (2024-03-12),"Each of the key concepts in Datasette now has an actions menu , which plugins can use to add additional functionality targeting that entity. Plugin hook: view_actions() for actions that can be applied to a SQL view. ( #2297 ) Plugin hook: homepage_actions() for actions that apply to the instance homepage. ( #2298 ) Plugin hook: row_actions() for actions that apply to the row page. ( #2299 ) Action menu items for all of the *_actions() plugin hooks can now return an optional ""description"" key, which will be displayed in the menu below the action label. ( #2294 ) Plugin hooks documentation page is now organized with additional headings. ( #2300 ) Improved the display of action buttons on pages that also display metadata. ( #2286 ) The header and footer of the page now uses a subtle gradient effect, and options in the navigation menu are better visually defined. ( #2302 ) Table names that start with an underscore now default to hidden. ( #2104 ) pragma_table_list has been added to the allow-list of SQLite pragma functions supported by Datasette. select * from pragma_table_list() is no longer blocked. ( #2104 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2297"", ""label"": ""#2297""}, {""href"": ""https://github.com/simonw/datasette/issues/2298"", ""label"": ""#2298""}, {""href"": ""https://github.com/simonw/datasette/issues/2299"", ""label"": ""#2299""}, {""href"": ""https://github.com/simonw/datasette/issues/2294"", ""label"": ""#2294""}, {""href"": ""https://github.com/simonw/datasette/issues/2300"", ""label"": ""#2300""}, {""href"": ""https://github.com/simonw/datasette/issues/2286"", ""label"": ""#2286""}, {""href"": ""https://github.com/simonw/datasette/issues/2302"", ""label"": ""#2302""}, {""href"": ""https://github.com/simonw/datasette/issues/2104"", ""label"": ""#2104""}, {""href"": ""https://github.com/simonw/datasette/issues/2104#issuecomment-1982352475"", ""label"": ""#2104""}]" changelog:v1-0-a12,changelog,v1-0-a12,1.0a12 (2024-02-29),"New query_actions() plugin hook, similar to table_actions() and database_actions() . Can be used to add a menu of actions to the canned query or arbitrary SQL query page. ( #2283 ) New design for the button that opens the query, table and database actions menu. ( #2281 ) ""does not contain"" table filter for finding rows that do not contain a string. ( #2287 ) Fixed a bug in the makeColumnActions(columnDetails) JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. 
( #2289 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2283"", ""label"": ""#2283""}, {""href"": ""https://github.com/simonw/datasette/issues/2281"", ""label"": ""#2281""}, {""href"": ""https://github.com/simonw/datasette/issues/2287"", ""label"": ""#2287""}, {""href"": ""https://github.com/simonw/datasette/issues/2289"", ""label"": ""#2289""}]" changelog:v1-0-a11,changelog,v1-0-a11,1.0a11 (2024-02-19),"The ""replace"": true argument to the /db/table/-/insert API now requires the actor to have the update-row permission. ( #2279 ) Fixed some UI bugs in the interactive permissions debugging tool. ( #2278 ) The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. ( #2263 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2279"", ""label"": ""#2279""}, {""href"": ""https://github.com/simonw/datasette/issues/2278"", ""label"": ""#2278""}, {""href"": ""https://github.com/simonw/datasette/issues/2263"", ""label"": ""#2263""}]" changelog:v1-0-a10,changelog,v1-0-a10,1.0a10 (2024-02-17),"The only changes in this alpha correspond to the way Datasette handles database transactions. ( #2277 ) The database.execute_write_fn() method has a new transaction=True parameter. This defaults to True which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to handle transactions on their own, and many did not. Pass transaction=False to execute_write_fn() if you want to manually handle transactions in your function. Several internal Datasette features, including parts of the JSON write API , had been failing to wrap their operations in a transaction. This has been fixed by the new transaction=True default.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2277"", ""label"": ""#2277""}]" changelog:v1-0-a1,changelog,v1-0-a1,1.0a1 (2022-12-01),"Write APIs now serve correct CORS headers if Datasette is started in --cors mode. See the full list of CORS headers in the documentation. ( #1922 ) Fixed a bug where the _memory database could be written to even though writes were not persisted. ( #1917 ) The https://latest.datasette.io/ demo instance now includes an ephemeral database which can be used to test Datasette's write APIs, using the new datasette-ephemeral-tables plugin to drop any created tables after five minutes. This database is only available if you sign in as the root user using the link on the homepage. ( #1915 ) Fixed a bug where hitting the write endpoints with a GET request returned a 500 error. It now returns a 405 (method not allowed) error instead. ( #1916 ) The list of endpoints in the API explorer now lists mutable databases first. ( #1918 ) The ""ignore"": true and ""replace"": true options for the insert API are now documented .
( #1924 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1922"", ""label"": ""#1922""}, {""href"": ""https://github.com/simonw/datasette/issues/1917"", ""label"": ""#1917""}, {""href"": ""https://latest.datasette.io/"", ""label"": ""https://latest.datasette.io/""}, {""href"": ""https://datasette.io/plugins/datasette-ephemeral-tables"", ""label"": ""datasette-ephemeral-tables""}, {""href"": ""https://github.com/simonw/datasette/issues/1915"", ""label"": ""#1915""}, {""href"": ""https://github.com/simonw/datasette/issues/1916"", ""label"": ""#1916""}, {""href"": ""https://github.com/simonw/datasette/issues/1918"", ""label"": ""#1918""}, {""href"": ""https://github.com/simonw/datasette/issues/1924"", ""label"": ""#1924""}]" changelog:v1-0-a0,changelog,v1-0-a0,1.0a0 (2022-11-29),"This first alpha release of Datasette 1.0 introduces a brand new collection of APIs for writing to the database ( #1850 ), as well as a new API token mechanism baked into Datasette core. Previously, API tokens have only been supported by installing additional plugins. This is very much a preview: expect many more backwards incompatible API changes prior to the full 1.0 release. Feedback enthusiastically welcomed, either through issue comments or via the Datasette Discord community.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1850"", ""label"": ""#1850""}, {""href"": ""https://github.com/simonw/datasette/issues/1850"", ""label"": ""issue comments""}, {""href"": ""https://datasette.io/discord"", ""label"": ""Datasette Discord""}]" changelog:id211,changelog,id211,0.9 (2017-11-13),"Added --sql_time_limit_ms and --extra-options . The serve command now accepts --sql_time_limit_ms for customizing the SQL time limit. The publish and package commands now accept --extra-options which can be used to specify additional options to be passed to the datasette serve command when it executes inside the resulting Docker containers.","[""Changelog""]",[] changelog:id212,changelog,id212,0.8 (2017-11-13),"V0.8 - added PyPI metadata, ready to ship. Implemented offset/limit pagination for views ( #70 ). Improved pagination. ( #78 ) Limit on max rows returned, controlled by --max_returned_rows option. ( #69 ) If someone executes 'select * from table' against a table with a million rows in it, we could run into problems: just serializing that much data as JSON is likely to lock up the server. Solution: we now have a hard limit on the maximum number of rows that can be returned by a query. If that limit is exceeded, the server will return a ""truncated"": true field in the JSON. This limit can be optionally controlled by the new --max_returned_rows option. Setting that option to 0 disables the limit entirely.","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/70"", ""label"": ""#70""}, {""href"": ""https://github.com/simonw/datasette/issues/78"", ""label"": ""#78""}, {""href"": ""https://github.com/simonw/datasette/issues/69"", ""label"": ""#69""}]" changelog:id2,changelog,id2,0.64.6 (2023-12-22),Fixed a bug where CSV export with expanded labels could fail if a foreign key reference did not correctly resolve. ( #2214 ),"[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2214"", ""label"": ""#2214""}]" changelog:id3,changelog,id3,0.64.5 (2023-10-08),"Dropped dependency on click-default-group-wheel , which could cause a dependency conflict.
( #2197 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2197"", ""label"": ""#2197""}]" changelog:id4,changelog,id4,0.64.4 (2023-09-21),Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 ),"[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2189"", ""label"": ""#2189""}]" changelog:id5,changelog,id5,0.64.2 (2023-03-08),"Fixed a bug with datasette publish cloudrun where deploys all used the same Docker image tag. This was mostly inconsequential as the service is deployed as soon as the image has been pushed to the registry, but could result in the incorrect image being deployed if two different deploys for two separate services ran at exactly the same time. ( #2036 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/2036"", ""label"": ""#2036""}]" changelog:id6,changelog,id6,0.64.1 (2023-01-11),"Documentation now links to a current source of information for installing Python 3. ( #1987 ) Incorrectly calling the Datasette constructor using Datasette(""path/to/data.db"") instead of Datasette([""path/to/data.db""]) now returns a useful error message. ( #1985 )","[""Changelog""]","[{""href"": ""https://github.com/simonw/datasette/issues/1987"", ""label"": ""#1987""}, {""href"": ""https://github.com/simonw/datasette/issues/1985"", ""label"": ""#1985""}]" changelog:id7,changelog,id7,0.64 (2023-01-09),"Datasette now strongly recommends against allowing arbitrary SQL queries if you are using SpatiaLite . SpatiaLite includes SQL functions that could cause the Datasette server to crash. See SpatiaLite for more details. New default_allow_sql setting, providing an easier way to disable all arbitrary SQL execution by end users: datasette --setting default_allow_sql off . See also Controlling the ability to execute arbitrary SQL . ( #1409 ) Building a location to time zone API with SpatiaLite is a new Datasette tutorial showing how to safely use SpatiaLite to create a location to time zone API. New documentation about how to debug problems loading SQLite extensions . The error message shown when an extension cannot be loaded has also been improved. ( #1979 ) Fixed an accessibility issue: the