[{"rowid": 1, "title": "Plugins", "content": "Datasette's plugin system allows additional features to be implemented as Python\n code (or front-end JavaScript) which can be wrapped up in a separate Python\n package. The underlying mechanism uses pluggy . \n See the Datasette plugins directory for a list of existing plugins, or take a look at the\n datasette-plugin topic on GitHub. \n Things you can do with plugins include: \n \n \n Add visualizations to Datasette, for example\n datasette-cluster-map and\n datasette-vega . \n \n \n Make new custom SQL functions available for use within Datasette, for example\n datasette-haversine and\n datasette-jellyfish . \n \n \n Define custom output formats with custom extensions, for example datasette-atom and\n datasette-ics . \n \n \n Add template functions that can be called within your Jinja custom templates,\n for example datasette-render-markdown . \n \n \n Customize how database values are rendered in the Datasette interface, for example\n datasette-render-binary and\n datasette-pretty-json . \n \n \n Customize how Datasette's authentication and permissions systems work, for example datasette-auth-passwords and\n datasette-permissions-sql .", "sections_fts": 94, "rank": null}, {"rowid": 2, "title": "Installing plugins", "content": "If a plugin has been packaged for distribution using setuptools you can use the plugin by installing it alongside Datasette in the same virtual environment or Docker container. \n You can install plugins using the datasette install command: \n datasette install datasette-vega \n You can uninstall plugins with datasette uninstall : \n datasette uninstall datasette-vega \n You can upgrade plugins with datasette install --upgrade or datasette install -U : \n datasette install -U datasette-vega \n This command can also be used to upgrade Datasette itself to the latest released version: \n datasette install -U datasette \n You can install multiple plugins at once by listing them as lines in a requirements.txt file like this: \n datasette-vega\ndatasette-cluster-map \n Then pass that file to datasette install -r : \n datasette install -r requirements.txt \n The install and uninstall commands are thin wrappers around pip install and pip uninstall , which ensure that they run pip in the same virtual environment as Datasette itself.", "sections_fts": 94, "rank": null}, {"rowid": 3, "title": "One-off plugins using --plugins-dir", "content": "You can also define one-off per-project plugins by saving them as plugin_name.py functions in a plugins/ folder and then passing that folder to datasette using the --plugins-dir option: \n datasette mydb.db --plugins-dir=plugins/", "sections_fts": 94, "rank": null}, {"rowid": 4, "title": "Deploying plugins using datasette publish", "content": "The datasette publish and datasette package commands both take an optional --install argument. You can use this one or more times to tell Datasette to pip install specific plugins as part of the process: \n datasette publish cloudrun mydb.db --install=datasette-vega \n You can use the name of a package on PyPI or any of the other valid arguments to pip install such as a URL to a .zip file: \n datasette publish cloudrun mydb.db \\\n --install=https://url-to-my-package.zip", "sections_fts": 94, "rank": null}, {"rowid": 5, "title": "Controlling which plugins are loaded", "content": "Datasette defaults to loading every plugin that is installed in the same virtual environment as Datasette itself. 
\n You can set the DATASETTE_LOAD_PLUGINS environment variable to a comma-separated list of plugin names to load a controlled subset of plugins instead. \n For example, to load just the datasette-vega and datasette-cluster-map plugins, set DATASETTE_LOAD_PLUGINS to datasette-vega,datasette-cluster-map : \n export DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map'\ndatasette mydb.db \n Or: \n DATASETTE_LOAD_PLUGINS='datasette-vega,datasette-cluster-map' \\\n datasette mydb.db \n To disable the loading of all additional plugins, set DATASETTE_LOAD_PLUGINS to an empty string: \n export DATASETTE_LOAD_PLUGINS=''\ndatasette mydb.db \n A quick way to test this setting is to use it with the datasette plugins command: \n DATASETTE_LOAD_PLUGINS='datasette-vega' datasette plugins \n This should output the following: \n [\n {\n \"name\": \"datasette-vega\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.6.2\",\n \"hooks\": [\n \"extra_css_urls\",\n \"extra_js_urls\"\n ]\n }\n]", "sections_fts": 94, "rank": null}, {"rowid": 6, "title": "Seeing what plugins are installed", "content": "You can see a list of installed plugins by navigating to the /-/plugins page of your Datasette instance - for example: https://fivethirtyeight.datasettes.com/-/plugins \n You can also use the datasette plugins command: \n datasette plugins \n Which outputs: \n [\n {\n \"name\": \"datasette_json_html\",\n \"static\": false,\n \"templates\": false,\n \"version\": \"0.4.0\"\n }\n] \n [[[cog\nfrom datasette import cli\nfrom click.testing import CliRunner\nimport textwrap, json\ncog.out(\"\\n\")\nresult = CliRunner().invoke(cli.cli, [\"plugins\", \"--all\"])\n# cog.out() with text containing newlines was unindenting for some reason\ncog.outl(\"If you run ``datasette plugins --all`` it will include default plugins that ship as part of Datasette:\\n\")\ncog.outl(\".. 
code-block:: json\\n\")\nplugins = [p for p in json.loads(result.output) if p[\"name\"].startswith(\"datasette.\")]\nindented = textwrap.indent(json.dumps(plugins, indent=4), \" \")\nfor line in indented.split(\"\\n\"):\n cog.outl(line)\ncog.out(\"\\n\\n\") \n ]]] \n If you run datasette plugins --all it will include default plugins that ship as part of Datasette: \n [\n {\n \"name\": \"datasette.actor_auth_cookie\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"actor_from_request\"\n ]\n },\n {\n \"name\": \"datasette.blob_renderer\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"register_output_renderer\"\n ]\n },\n {\n \"name\": \"datasette.default_magic_parameters\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"register_magic_parameters\"\n ]\n },\n {\n \"name\": \"datasette.default_menu_links\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"menu_links\"\n ]\n },\n {\n \"name\": \"datasette.default_permissions\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"actor_from_request\",\n \"permission_allowed\",\n \"register_permissions\",\n \"skip_csrf\"\n ]\n },\n {\n \"name\": \"datasette.events\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"register_events\"\n ]\n },\n {\n \"name\": \"datasette.facets\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"register_facet_classes\"\n ]\n },\n {\n \"name\": \"datasette.filters\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"filters_from_request\"\n ]\n },\n {\n \"name\": \"datasette.forbidden\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"forbidden\"\n ]\n },\n {\n \"name\": \"datasette.handle_exception\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"handle_exception\"\n ]\n },\n {\n \"name\": \"datasette.publish.cloudrun\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"publish_subcommand\"\n ]\n },\n {\n \"name\": \"datasette.publish.heroku\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"publish_subcommand\"\n ]\n },\n {\n \"name\": \"datasette.sql_functions\",\n \"static\": false,\n \"templates\": false,\n \"version\": null,\n \"hooks\": [\n \"prepare_connection\"\n ]\n }\n] \n [[[end]]] \n You can add the --plugins-dir= option to include any plugins found in that directory. \n Add --requirements to output a list of installed plugins that can then be installed in another Datasette instance using datasette install -r requirements.txt : \n datasette plugins --requirements \n The output will look something like this: \n datasette-codespaces==0.1.1\ndatasette-graphql==2.2\ndatasette-json-html==1.0.1\ndatasette-pretty-json==0.2.2\ndatasette-x-forwarded-host==0.1 \n To write that to a requirements.txt file, run this: \n datasette plugins --requirements > requirements.txt", "sections_fts": 94, "rank": null}, {"rowid": 7, "title": "Plugin configuration", "content": "Plugins can have their own configuration, embedded in a configuration file . Configuration options for plugins live within a \"plugins\" key in that file, which can be included at the root, database or table level. 
\n Here is an example of some plugin configuration for a specific table: \n [[[cog\nfrom metadata_doc import config_example\nconfig_example(cog, {\n \"databases\": {\n \"sf-trees\": {\n \"tables\": {\n \"Street_Tree_List\": {\n \"plugins\": {\n \"datasette-cluster-map\": {\n \"latitude_column\": \"lat\",\n \"longitude_column\": \"lng\"\n }\n }\n }\n }\n }\n }\n}) \n ]]] \n [[[end]]] \n This tells the datasette-cluster-map plugin which latitude and longitude columns should be used for a table called Street_Tree_List inside a database file called sf-trees.db .", "sections_fts": 94, "rank": null}, {"rowid": 8, "title": "Secret configuration values", "content": "Some plugins may need configuration that should stay secret - API keys for example. There are two ways in which you can store secret configuration values. \n As environment variables . If your secret lives in an environment variable that is available to the Datasette process, you can indicate that the configuration value should be read from that environment variable like so: \n [[[cog\nconfig_example(cog, {\n \"plugins\": {\n \"datasette-auth-github\": {\n \"client_secret\": {\n \"$env\": \"GITHUB_CLIENT_SECRET\"\n }\n }\n }\n}) \n ]]] \n [[[end]]] \n As values in separate files . Your secrets can also live in files on disk. To specify that a secret should be read from a file, provide the full file path like this: \n [[[cog\nconfig_example(cog, {\n \"plugins\": {\n \"datasette-auth-github\": {\n \"client_secret\": {\n \"$file\": \"/secrets/client-secret\"\n }\n }\n }\n}) \n ]]] \n [[[end]]] \n If you are publishing your data using the datasette publish family of commands, you can use the --plugin-secret option to set these secrets at publish time. For example, using Heroku you might run the following command: \n datasette publish heroku my_database.db \\\n --name my-heroku-app-demo \\\n --install=datasette-auth-github \\\n --plugin-secret datasette-auth-github client_id your_client_id \\\n --plugin-secret datasette-auth-github client_secret your_client_secret \n This will set the necessary environment variables and add the following to the deployed metadata.yaml : \n [[[cog\nconfig_example(cog, {\n \"plugins\": {\n \"datasette-auth-github\": {\n \"client_id\": {\n \"$env\": \"DATASETTE_AUTH_GITHUB_CLIENT_ID\"\n },\n \"client_secret\": {\n \"$env\": \"DATASETTE_AUTH_GITHUB_CLIENT_SECRET\"\n }\n }\n }\n}) \n ]]] \n [[[end]]]", "sections_fts": 94, "rank": null}, {"rowid": 9, "title": "Testing plugins", "content": "We recommend using pytest to write automated tests for your plugins. \n If you use the template described in Starting an installable plugin using cookiecutter your plugin will start with a single test in your tests/ directory that looks like this: \n from datasette.app import Datasette\nimport pytest\n\n\n@pytest.mark.asyncio\nasync def test_plugin_is_installed():\n datasette = Datasette(memory=True)\n response = await datasette.client.get(\"/-/plugins.json\")\n assert response.status_code == 200\n installed_plugins = {p[\"name\"] for p in response.json()}\n assert (\n \"datasette-plugin-template-demo\"\n in installed_plugins\n ) \n This test uses the datasette.client object to exercise a test instance of Datasette. datasette.client is a wrapper around the HTTPX Python library which can imitate HTTP requests using ASGI. This is the recommended way to write tests against a Datasette instance. \n This test also uses the pytest-asyncio package to add support for async def test functions running under pytest. 
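\n Under the hood, datasette.client is roughly equivalent to pointing an HTTPX client at Datasette's ASGI application. A sketch of that plumbing, assuming HTTPX's ASGITransport - datasette.client handles this wiring (plus startup details) for you: \n import httpx\n\nfrom datasette.app import Datasette\n\nds = Datasette(memory=True)\n# An HTTPX client wired directly to the ASGI app, so requests\n# are simulated in-process with no network or server involved\nclient = httpx.AsyncClient(\n    transport=httpx.ASGITransport(app=ds.app()),\n    base_url=\"http://localhost\",\n)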
\n You can install these packages like so: \n pip install pytest pytest-asyncio \n If you are building an installable package you can add them as test dependencies to your setup.py module like this: \n setup(\n name=\"datasette-my-plugin\",\n # ...\n extras_require={\"test\": [\"pytest\", \"pytest-asyncio\"]},\n tests_require=[\"datasette-my-plugin[test]\"],\n) \n You can then install the test dependencies like so: \n pip install -e '.[test]' \n Then run the tests using pytest like so: \n pytest", "sections_fts": 94, "rank": null}, {"rowid": 10, "title": "Setting up a Datasette test instance", "content": "The above example shows the easiest way to start writing tests against a Datasette instance: \n from datasette.app import Datasette\nimport pytest\n\n\n@pytest.mark.asyncio\nasync def test_plugin_is_installed():\n datasette = Datasette(memory=True)\n response = await datasette.client.get(\"/-/plugins.json\")\n assert response.status_code == 200 \n Creating a Datasette() instance like this is a useful shortcut in tests, but there is one detail you need to be aware of. It's important to ensure that the async method .invoke_startup() is called on that instance. You can do that like this: \n datasette = Datasette(memory=True)\nawait datasette.invoke_startup() \n This method registers any startup(datasette) or prepare_jinja2_environment(env, datasette) plugins that might themselves need to make async calls. \n If you are using await datasette.client.get() and similar methods then you don't need to worry about this - Datasette automatically calls invoke_startup() the first time it handles a request.", "sections_fts": 94, "rank": null}, {"rowid": 11, "title": "Using datasette.client in tests", "content": "The datasette.client mechanism is designed for use in tests. It provides access to a pre-configured HTTPX async client instance that can make GET, POST and other HTTP requests against a Datasette instance from inside a test. \n A simple test looks like this: \n @pytest.mark.asyncio\nasync def test_homepage():\n ds = Datasette(memory=True)\n response = await ds.client.get(\"/\")\n html = response.text\n assert \"<h1>\" in html
\" in html\n \n Or for a JSON API: \n @pytest.mark.asyncio\nasync def test_actor_is_null():\n ds = Datasette(memory=True)\n response = await ds.client.get(\"/-/actor.json\")\n assert response.json() == {\"actor\": None}\n \n To make requests as an authenticated actor, create a signed ds_cookie using the datasette.client.actor_cookie() helper function and pass it in cookies= like this: \n @pytest.mark.asyncio\nasync def test_signed_cookie_actor():\n ds = Datasette(memory=True)\n cookies = {\"ds_actor\": ds.client.actor_cookie({\"id\": \"root\"})}\n response = await ds.client.get(\"/-/actor.json\", cookies=cookies)\n assert response.json() == {\"actor\": {\"id\": \"root\"}}", "sections_fts": 94, "rank": null}, {"rowid": 12, "title": "Using pdb for errors thrown inside Datasette", "content": "If an exception occurs within Datasette itself during a test, the response returned to your plugin will have a response.status_code value of 500. \n You can add pdb=True to the Datasette constructor to drop into a Python debugger session inside your test run instead of getting back a 500 response code. This is equivalent to running the datasette command-line tool with the --pdb option. \n Here's what that looks like in a test function: \n def test_that_opens_the_debugger_or_errors():\n ds = Datasette([db_path], pdb=True)\n response = await ds.client.get(\"/\") \n If you use this pattern you will need to run pytest with the -s option to avoid capturing stdin/stdout in order to interact with the debugger prompt.", "sections_fts": 94, "rank": null}, {"rowid": 13, "title": "Using pytest fixtures", "content": "Pytest fixtures can be used to create initial testable objects which can then be used by multiple tests. \n A common pattern for Datasette plugins is to create a fixture which sets up a temporary test database and wraps it in a Datasette instance. \n Here's an example that uses the sqlite-utils library to populate a temporary test database. It also sets the title of that table using a simulated metadata.json configuration: \n from datasette.app import Datasette\nimport pytest\nimport sqlite_utils\n\n\n@pytest.fixture(scope=\"session\")\ndef datasette(tmp_path_factory):\n db_directory = tmp_path_factory.mktemp(\"dbs\")\n db_path = db_directory / \"test.db\"\n db = sqlite_utils.Database(db_path)\n db[\"dogs\"].insert_all(\n [\n {\"id\": 1, \"name\": \"Cleo\", \"age\": 5},\n {\"id\": 2, \"name\": \"Pancakes\", \"age\": 4},\n ],\n pk=\"id\",\n )\n datasette = Datasette(\n [db_path],\n metadata={\n \"databases\": {\n \"test\": {\n \"tables\": {\n \"dogs\": {\"title\": \"Some dogs\"}\n }\n }\n }\n },\n )\n return datasette\n\n\n@pytest.mark.asyncio\nasync def test_example_table_json(datasette):\n response = await datasette.client.get(\n \"/test/dogs.json?_shape=array\"\n )\n assert response.status_code == 200\n assert response.json() == [\n {\"id\": 1, \"name\": \"Cleo\", \"age\": 5},\n {\"id\": 2, \"name\": \"Pancakes\", \"age\": 4},\n ]\n\n\n@pytest.mark.asyncio\nasync def test_example_table_html(datasette):\n response = await datasette.client.get(\"/test/dogs\")\n assert \">Some dogs
\" in response.text \n Here the datasette() function defines the fixture, which is than automatically passed to the two test functions based on pytest automatically matching their datasette function parameters. \n The @pytest.fixture(scope=\"session\") line here ensures the fixture is reused for the full pytest execution session. This means that the temporary database file will be created once and reused for each test. \n If you want to create that test database repeatedly for every individual test function, write the fixture function like this instead. You may want to do this if your plugin modifies the database contents in some way: \n @pytest.fixture\ndef datasette(tmp_path_factory):\n # This fixture will be executed repeatedly for every test\n ...", "sections_fts": 94, "rank": null}, {"rowid": 14, "title": "Testing outbound HTTP calls with pytest-httpx", "content": "If your plugin makes outbound HTTP calls - for example datasette-auth-github or datasette-import-table - you may need to mock those HTTP requests in your tests. \n The pytest-httpx package is a useful library for mocking calls. It can be tricky to use with Datasette though since it mocks all HTTPX requests, and Datasette's own testing mechanism uses HTTPX internally. \n To avoid breaking your tests, you can return [\"localhost\"] from the non_mocked_hosts() fixture. \n As an example, here's a very simple plugin which executes an HTTP response and returns the resulting content: \n from datasette import hookimpl\nfrom datasette.utils.asgi import Response\nimport httpx\n\n\n@hookimpl\ndef register_routes():\n return [\n (r\"^/-/fetch-url$\", fetch_url),\n ]\n\n\nasync def fetch_url(datasette, request):\n if request.method == \"GET\":\n return Response.html(\n \"\"\"\n \"\"\".format(\n request.scope[\"csrftoken\"]()\n )\n )\n vars = await request.post_vars()\n url = vars[\"url\"]\n return Response.text(httpx.get(url).text) \n Here's a test for that plugin that mocks the HTTPX outbound request: \n from datasette.app import Datasette\nimport pytest\n\n\n@pytest.fixture\ndef non_mocked_hosts():\n # This ensures httpx-mock will not affect Datasette's own\n # httpx calls made in the tests by datasette.client:\n return [\"localhost\"]\n\n\nasync def test_outbound_http_call(httpx_mock):\n httpx_mock.add_response(\n url=\"https://www.example.com/\",\n text=\"Hello world\",\n )\n datasette = Datasette([], memory=True)\n response = await datasette.client.post(\n \"/-/fetch-url\",\n data={\"url\": \"https://www.example.com/\"},\n )\n assert response.text == \"Hello world\"\n\n outbound_request = httpx_mock.get_request()\n assert (\n outbound_request.url == \"https://www.example.com/\"\n )", "sections_fts": 94, "rank": null}, {"rowid": 15, "title": "Registering a plugin for the duration of a test", "content": "When writing tests for plugins you may find it useful to register a test plugin just for the duration of a single test. 
You can do this using pm.register() and pm.unregister() like this: \n from datasette import hookimpl\nfrom datasette.app import Datasette\nfrom datasette.plugins import pm\nimport pytest\n\n\n@pytest.mark.asyncio\nasync def test_using_test_plugin():\n class TestPlugin:\n __name__ = \"TestPlugin\"\n\n # Use hookimpl and method names to register hooks\n @hookimpl\n def register_routes(self):\n return [\n (r\"^/error$\", lambda: 1 / 0),\n ]\n\n pm.register(TestPlugin(), name=\"undo\")\n try:\n # The test implementation goes here\n datasette = Datasette()\n response = await datasette.client.get(\"/error\")\n assert response.status_code == 500\n finally:\n pm.unregister(name=\"undo\") \n To reuse the same temporary plugin in multiple tests, you can register it inside a fixture in your conftest.py file like this: \n from datasette import hookimpl\nfrom datasette.app import Datasette\nfrom datasette.plugins import pm\nimport pytest\nimport pytest_asyncio\n\n\n@pytest_asyncio.fixture\nasync def datasette_with_plugin():\n class TestPlugin:\n __name__ = \"TestPlugin\"\n\n @hookimpl\n def register_routes(self):\n return [\n (r\"^/error$\", lambda: 1 / 0),\n ]\n\n pm.register(TestPlugin(), name=\"undo\")\n try:\n yield Datasette()\n finally:\n pm.unregister(name=\"undo\")\n \n Note the yield statement here - this ensures that the finally: block that unregisters the plugin is executed only after the test function itself has completed. \n Then in a test: \n @pytest.mark.asyncio\nasync def test_error(datasette_with_plugin):\n response = await datasette_with_plugin.client.get(\"/error\")\n assert response.status_code == 500", "sections_fts": 94, "rank": null}, {"rowid": 16, "title": "Getting started", "content": "", "sections_fts": 94, "rank": null}, {"rowid": 17, "title": "Play with a live demo", "content": "The best way to experience Datasette for the first time is with a demo: \n \n \n global-power-plants.datasettes.com provides a searchable database of power plants around the world, using data from the World Resources Institute, rendered using the datasette-cluster-map plugin. \n \n \n fivethirtyeight.datasettes.com shows Datasette running against over 400 datasets imported from the FiveThirtyEight GitHub repository .", "sections_fts": 94, "rank": null}, {"rowid": 18, "title": "Follow a tutorial", "content": "Datasette has several tutorials to help you get started with the tool. Try one of the following: \n \n \n Exploring a database with Datasette shows how to use the Datasette web interface to explore a new database. \n \n \n Learn SQL with Datasette introduces SQL, and shows how to use that query language to ask questions of your data. \n \n \n Cleaning data with sqlite-utils and Datasette guides you through using sqlite-utils to turn a CSV file into a database that you can explore using Datasette.", "sections_fts": 94, "rank": null}, {"rowid": 19, "title": "Datasette in your browser with Datasette Lite", "content": "Datasette Lite is Datasette packaged using WebAssembly so that it runs entirely in your browser, no Python web application server required. \n You can pass a URL to a CSV, SQLite or raw SQL file directly to Datasette Lite to explore that data in your browser. 
\n This example link opens Datasette Lite and loads the SQL Murder Mystery example database from Northwestern University Knight Lab .", "sections_fts": 94, "rank": null}, {"rowid": 20, "title": "Try Datasette without installing anything using Glitch", "content": "Glitch is a free online tool for building web apps directly from your web browser. You can use Glitch to try out Datasette without needing to install any software on your own computer. \n Here's a demo project on Glitch which you can use as the basis for your own experiments: \n glitch.com/~datasette-csvs \n Glitch allows you to \"remix\" any project to create your own copy and start editing it in your browser. You can remix the datasette-csvs project by clicking this button: \n \n Find a CSV file and drag it onto the Glitch file explorer panel - datasette-csvs will automatically convert it to a SQLite database (using sqlite-utils ) and allow you to start exploring it using Datasette. \n If your CSV file has a latitude and longitude column you can visualize it on a map by uncommenting the datasette-cluster-map line in the requirements.txt file using the Glitch file editor. \n Need some data? Try this Public Art Data for the city of Seattle - hit \"Export\" and select \"CSV\" to download it as a CSV file. \n For more on how this works, see Running Datasette on Glitch .", "sections_fts": 94, "rank": null}, {"rowid": 21, "title": "Using Datasette on your own computer", "content": "First, follow the Installation instructions. Now you can run Datasette against a SQLite file on your computer using the following command: \n datasette path/to/database.db \n This will start a web server on port 8001 - visit http://localhost:8001/ \n to access the web interface. \n Add -o to open your browser automatically once Datasette has started: \n datasette path/to/database.db -o \n Use Chrome on OS X? You can run datasette against your browser history\n like so: \n datasette ~/Library/Application\\ Support/Google/Chrome/Default/History --nolock \n The --nolock option ignores any file locks. This is safe as Datasette will open the file in read-only mode. \n Now visiting http://localhost:8001/History/downloads will show you a web\n interface to browse your downloads data: \n \n \n \n http://localhost:8001/History/downloads.json will return that data as\n JSON: \n {\n \"database\": \"History\",\n \"columns\": [\n \"id\",\n \"current_path\",\n \"target_path\",\n \"start_time\",\n \"received_bytes\",\n \"total_bytes\",\n ...\n ],\n \"rows\": [\n [\n 1,\n \"/Users/simonw/Downloads/DropboxInstaller.dmg\",\n \"/Users/simonw/Downloads/DropboxInstaller.dmg\",\n 13097290269022132,\n 626688,\n 0,\n ...\n ]\n ]\n} \n http://localhost:8001/History/downloads.json?_shape=objects will return that data as\n JSON in a more convenient format: \n {\n ...\n \"rows\": [\n {\n \"start_time\": 13097290269022132,\n \"interrupt_reason\": 0,\n \"hash\": \"\",\n \"id\": 1,\n \"site_url\": \"\",\n \"referrer\": \"https://www.dropbox.com/downloading?src=index\",\n ...\n }\n ]\n}", "sections_fts": 94, "rank": null}, {"rowid": 22, "title": "Changelog", "content": "", "sections_fts": 94, "rank": null}, {"rowid": 23, "title": "1.0a13 (2024-03-12)", "content": "Each of the key concepts in Datasette now has an actions menu , which plugins can use to add additional functionality targeting that entity. \n \n \n Plugin hook: view_actions() for actions that can be applied to a SQL view. ( #2297 ) \n \n \n Plugin hook: homepage_actions() for actions that apply to the instance homepage. 
( #2298 ) \n \n \n Plugin hook: row_actions() for actions that apply to the row page. ( #2299 ) \n \n \n Action menu items for all of the *_actions() plugin hooks can now return an optional \"description\" key, which will be displayed in the menu below the action label. ( #2294 ) \n \n \n Plugin hooks documentation page is now organized with additional headings. ( #2300 ) \n \n \n Improved the display of action buttons on pages that also display metadata. ( #2286 ) \n \n \n The header and footer of the page now use a subtle gradient effect, and options in the navigation menu are better visually defined. ( #2302 ) \n \n \n Table names that start with an underscore now default to hidden. ( #2104 ) \n \n \n pragma_table_list has been added to the allow-list of SQLite pragma functions supported by Datasette. select * from pragma_table_list() is no longer blocked. ( #2104 )", "sections_fts": 94, "rank": null}, {"rowid": 24, "title": "1.0a12 (2024-02-29)", "content": "New query_actions() plugin hook, similar to table_actions() and database_actions() . Can be used to add a menu of actions to the canned query or arbitrary SQL query page. ( #2283 ) \n \n \n New design for the button that opens the query, table and database actions menu. ( #2281 ) \n \n \n \"does not contain\" table filter for finding rows that do not contain a string. ( #2287 ) \n \n \n Fixed a bug in the makeColumnActions(columnDetails) JavaScript plugin mechanism where the column action menu was not fully reset in between each interaction. ( #2289 )", "sections_fts": 94, "rank": null}, {"rowid": 25, "title": "1.0a11 (2024-02-19)", "content": "The \"replace\": true argument to the /db/table/-/insert API now requires the actor to have the update-row permission. ( #2279 ) \n \n \n Fixed some UI bugs in the interactive permissions debugging tool. ( #2278 ) \n \n \n The column action menu now aligns better with the cog icon, and positions itself taking into account the width of the browser window. ( #2263 )", "sections_fts": 94, "rank": null}, {"rowid": 26, "title": "1.0a10 (2024-02-17)", "content": "The only changes in this alpha correspond to the way Datasette handles database transactions. ( #2277 ) \n \n \n The database.execute_write_fn() method has a new transaction=True parameter. This defaults to True which means all functions executed using this method are now automatically wrapped in a transaction - previously the functions needed to roll their own transaction handling, and many did not. \n \n \n Pass transaction=False to execute_write_fn() if you want to manually handle transactions in your function. \n \n \n Several internal Datasette features, including parts of the JSON write API , had been failing to wrap their operations in a transaction. This has been fixed by the new transaction=True default.", "sections_fts": 94, "rank": null}, {"rowid": 27, "title": "1.0a9 (2024-02-16)", "content": "This alpha release adds basic alter table support to the Datasette Write API and fixes a permissions bug relating to the /upsert API endpoint.", "sections_fts": 94, "rank": null}, {"rowid": 28, "title": "Alter table support for create, insert, upsert and update", "content": "The JSON write API can now be used to apply simple alter table schema changes, provided the acting actor has the new alter-table permission. ( #2101 ) \n The only alter operation supported so far is adding new columns to an existing table. 
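\n For example, an insert that introduces a brand new column might look like this - a sketch using httpx , where the database and table names, the token and the nickname column are all illustrative: \n import httpx\n\n# Requires a token whose actor has both the insert-row and\n# alter-table permissions\nresponse = httpx.post(\n    \"http://localhost:8001/db/dogs/-/insert\",\n    json={\n        \"rows\": [{\"id\": 3, \"name\": \"Cleo\", \"nickname\": \"Clee\"}],\n        # Add any missing columns rather than returning an error\n        \"alter\": True,\n    },\n    headers={\"Authorization\": \"Bearer <your-api-token>\"},\n)\nresponse.raise_for_status()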
\n \n \n The /db/-/create API now adds new columns during large operations to create a table based on incoming example \"rows\" , in the case where one of the later rows includes columns that were not present in the earlier batches. This requires the create-table but not the alter-table permission. \n \n \n When /db/-/create is called with rows in a situation where the table may have been already created, an \"alter\": true key can be included to indicate that any missing columns from the new rows should be added to the table. This requires the alter-table permission. \n \n \n /db/table/-/insert and /db/table/-/upsert and /db/table/row-pks/-/update all now also accept \"alter\": true , depending on the alter-table permission. \n \n \n Operations that alter a table now fire the new alter-table event .", "sections_fts": 94, "rank": null}, {"rowid": 29, "title": "Permissions fix for the upsert API", "content": "The /database/table/-/upsert API had a minor permissions bug, only affecting Datasette instances that had configured the insert-row and update-row permissions to apply to a specific table rather than the database or instance as a whole. Full details in issue #2262 . \n To avoid similar mistakes in the future the datasette.permission_allowed() method now specifies default= as a keyword-only argument.", "sections_fts": 94, "rank": null}, {"rowid": 30, "title": "Permission checks now consider opinions from every plugin", "content": "The datasette.permission_allowed() method previously consulted every plugin that implemented the permission_allowed() plugin hook and obeyed the opinion of the last plugin to return a value. ( #2275 ) \n Datasette now consults every plugin and checks to see if any of them returned False (the veto rule), and if none of them did, it then checks to see if any of them returned True . \n This is explained at length in the new documentation covering How permissions are resolved .", "sections_fts": 94, "rank": null}, {"rowid": 31, "title": "Other changes", "content": "The new DATASETTE_TRACE_PLUGINS=1 environment variable turns on detailed trace output for every executed plugin hook, useful for debugging and understanding how the plugin system works at a low level. ( #2274 ) \n \n \n Datasette on Python 3.9 or above marks its non-cryptographic uses of the MD5 hash function as usedforsecurity=False , for compatibility with FIPS systems. ( #2270 ) \n \n \n SQL relating to Datasette's internal database now executes inside a transaction, avoiding a potential database locked error. ( #2273 ) \n \n \n The /-/threads debug page now identifies the database in the name associated with each dedicated write thread. ( #2265 ) \n \n \n The /db/-/create API now fires a insert-rows event if rows were inserted after the table was created. ( #2260 )", "sections_fts": 94, "rank": null}, {"rowid": 32, "title": "1.0a8 (2024-02-07)", "content": "This alpha release continues the migration of Datasette's configuration from metadata.yaml to the new datasette.yaml configuration file, introduces a new system for JavaScript plugins and adds several new plugin hooks. \n See Datasette 1.0a8: JavaScript plugins, new plugin hooks and plugin configuration in datasette.yaml for an annotated version of these release notes.", "sections_fts": 94, "rank": null}, {"rowid": 33, "title": "Configuration", "content": "Plugin configuration now lives in the datasette.yaml configuration file , passed to Datasette using the -c/--config option. Thanks, Alex Garcia. 
( #2093 ) \n datasette -c datasette.yaml \n Where datasette.yaml contains configuration that looks like this: \n plugins:\n datasette-cluster-map:\n latitude_column: xlat\n longitude_column: xlon \n Previously plugins were configured in metadata.yaml , which was confusing as plugin settings were unrelated to database and table metadata. \n \n \n The -s/--setting option can now be used to set plugin configuration as well. See Configuration via the command-line for details. ( #2252 ) \n The above YAML configuration example using -s/--setting looks like this: \n datasette mydatabase.db \\\n -s plugins.datasette-cluster-map.latitude_column xlat \\\n -s plugins.datasette-cluster-map.longitude_column xlon \n \n \n The new /-/config page shows the current instance configuration, after redacting keys that could contain sensitive data such as API keys or passwords. ( #2254 ) \n \n \n Existing Datasette installations may already have configuration set in metadata.yaml that should be migrated to datasette.yaml . To avoid breaking these installations, Datasette will silently treat table configuration, plugin configuration and allow blocks in metadata as if they had been specified in configuration instead. ( #2247 ) ( #2248 ) ( #2249 ) \n \n \n Note that the datasette publish command has not yet been updated to accept a datasette.yaml configuration file. This will be addressed in #2195 but for the moment you can include those settings in metadata.yaml instead.", "sections_fts": 94, "rank": null}, {"rowid": 34, "title": "JavaScript plugins", "content": "Datasette now includes a JavaScript plugins mechanism , allowing JavaScript to customize Datasette in a way that can collaborate with other plugins. \n This provides two initial hooks, with more to come in the future: \n \n \n makeAboveTablePanelConfigs() can add additional panels to the top of the table page. \n \n \n makeColumnActions() can add additional actions to the column menu. \n \n \n Thanks Cameron Yick for contributing this feature. ( #2052 )", "sections_fts": 94, "rank": null}, {"rowid": 35, "title": "Plugin hooks", "content": "New jinja2_environment_from_request(datasette, request, env) plugin hook, which can be used to customize the current Jinja environment based on the incoming request. This can be used to modify the template lookup path based on the incoming request hostname, among other things. ( #2225 ) \n \n \n New family of template slot plugin hooks : top_homepage , top_database , top_table , top_row , top_query , top_canned_query . Plugins can use these to provide additional HTML to be injected at the top of the corresponding pages. ( #1191 ) \n \n \n \n \n New track_event() mechanism for plugins to emit and receive events when certain events occur within Datasette. ( #2240 ) \n \n \n \n Plugins can register additional event classes using register_events(datasette) . \n \n \n They can then trigger those events with the datasette.track_event(event) internal method. \n \n \n Plugins can subscribe to notifications of events using the track_event(datasette, event) plugin hook. \n \n \n Datasette core now emits login , logout , create-token , create-table , drop-table , insert-rows , upsert-rows , update-row , delete-row events, documented here . \n \n \n \n \n \n \n \n New internal function for plugin authors: await db.execute_isolated_fn(fn) , for creating a new SQLite connection, executing code and then closing that connection, all while preventing other code from writing to that particular database. 
This connection will not have the prepare_connection() plugin hook executed against it, allowing plugins to perform actions that might otherwise be blocked by existing connection configuration. ( #2218 )", "sections_fts": 94, "rank": null}, {"rowid": 36, "title": "Documentation", "content": "Documentation describing how to write tests that use signed actor cookies using datasette.client.actor_cookie() . ( #1830 ) \n \n \n Documentation on how to register a plugin for the duration of a test . ( #2234 ) \n \n \n The configuration documentation now shows examples of both YAML and JSON for each setting.", "sections_fts": 94, "rank": null}, {"rowid": 37, "title": "Minor fixes", "content": "Datasette no longer attempts to run SQL queries in parallel when rendering a table page, as this was leading to some rare crashing bugs. ( #2189 ) \n \n \n Fixed warning: DeprecationWarning: pkg_resources is deprecated as an API ( #2057 ) \n \n \n Fixed bug where ?_extra=columns parameter returned an incorrectly shaped response. ( #2230 )", "sections_fts": 94, "rank": null}, {"rowid": 38, "title": "0.64.6 (2023-12-22)", "content": "Fixed a bug where CSV export with expanded labels could fail if a foreign key reference did not correctly resolve. ( #2214 )", "sections_fts": 94, "rank": null}, {"rowid": 39, "title": "0.64.5 (2023-10-08)", "content": "Dropped dependency on click-default-group-wheel , which could cause a dependency conflict. ( #2197 )", "sections_fts": 94, "rank": null}, {"rowid": 40, "title": "1.0a7 (2023-09-21)", "content": "Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 )", "sections_fts": 94, "rank": null}, {"rowid": 41, "title": "0.64.4 (2023-09-21)", "content": "Fix for a crashing bug caused by viewing the table page for a named in-memory database. ( #2189 )", "sections_fts": 94, "rank": null}, {"rowid": 42, "title": "1.0a6 (2023-09-07)", "content": "New plugin hook: actors_from_ids(datasette, actor_ids) and an internal method to accompany it, await .actors_from_ids(actor_ids) . This mechanism is intended to be used by plugins that may need to display the actor who was responsible for something managed by that plugin: they can now resolve the recorded IDs of actors into the full actor objects. ( #2181 ) \n \n \n DATASETTE_LOAD_PLUGINS environment variable for controlling which plugins are loaded by Datasette. ( #2164 ) \n \n \n Datasette now checks if the user has permission to view a table linked to by a foreign key before turning that foreign key into a clickable link. ( #2178 ) \n \n \n The execute-sql permission now implies that the actor can also view the database and instance. ( #2169 ) \n \n \n Documentation describing a pattern for building plugins that themselves define further hooks for other plugins. ( #1765 ) \n \n \n Datasette is now tested against the Python 3.12 preview. ( #2175 )", "sections_fts": 94, "rank": null}, {"rowid": 43, "title": "1.0a5 (2023-08-29)", "content": "When restrictions are applied to API tokens , those restrictions now behave slightly differently: applying the view-table restriction will imply the ability to view-database for the database containing that table, and both view-table and view-database will imply view-instance . Previously you needed to create a token with restrictions that explicitly listed view-instance and view-database and view-table in order to view a table without getting a permission denied error. 
( #2102 ) \n \n \n New datasette.yaml (or .json ) configuration file, which can be specified using datasette -c path-to-file . The goal here is to consolidate settings, plugin configuration, permissions, canned queries, and other Datasette configuration into a single file, separate from metadata.yaml . The legacy settings.json config file used for Configuration directory mode has been removed, and datasette.yaml has a \"settings\" section where the same settings key/value pairs can be included. In a future alpha release, more configuration such as plugins/permissions/canned queries will be moved to the datasette.yaml file. See #2093 for more details. Thanks, Alex Garcia. \n \n \n The -s/--setting option can now take dotted paths to nested settings. These will then be used to set or override the same options as are present in the new configuration file. ( #2156 ) \n \n \n New --actor '{\"id\": \"json-goes-here\"}' option for use with datasette --get to treat the simulated request as being made by a specific actor, see datasette --get . ( #2153 ) \n \n \n The Datasette _internal database has had some changes. It no longer shows up in the datasette.databases list by default, and is now instead available to plugins using the datasette.get_internal_database() method. Plugins are invited to use this as a private database to store configuration and settings and secrets that should not be made visible through the default Datasette interface. Users can pass the new --internal internal.db option to persist that internal database to disk. Thanks, Alex Garcia. ( #2157 ).", "sections_fts": 94, "rank": null}, {"rowid": 44, "title": "1.0a4 (2023-08-21)", "content": "This alpha fixes a security issue with the /-/api API explorer. On authenticated Datasette instances (instances protected using plugins such as datasette-auth-passwords ) the API explorer interface could reveal the names of databases and tables within the protected instance. The data stored in those tables was not revealed. \n For more information and workarounds, read the security advisory . The issue has been present in every previous alpha version of Datasette 1.0: versions 1.0a0, 1.0a1, 1.0a2 and 1.0a3. \n Also in this alpha: \n \n \n The new datasette plugins --requirements option outputs a list of currently installed plugins in Python requirements.txt format, useful for duplicating that installation elsewhere. ( #2133 ) \n \n \n Writable canned queries can now define an on_success_message_sql field in their configuration, containing a SQL query that should be executed upon successful completion of the write operation in order to generate a message to be shown to the user. ( #2138 ) \n \n \n The automatically generated border color for a database is now shown in more places around the application. ( #2119 ) \n \n \n Every instance of example shell script code in the documentation should now include a working copy button, free from additional syntax. ( #2140 )", "sections_fts": 94, "rank": null}, {"rowid": 45, "title": "1.0a3 (2023-08-09)", "content": "This alpha release previews the updated design for Datasette's default JSON API. ( #782 ) \n The new default JSON representation for both table pages ( /dbname/table.json ) and arbitrary SQL queries ( /dbname.json?sql=... 
) is now shaped like this: \n {\n \"ok\": true,\n \"rows\": [\n {\n \"id\": 3,\n \"name\": \"Detroit\"\n },\n {\n \"id\": 2,\n \"name\": \"Los Angeles\"\n },\n {\n \"id\": 4,\n \"name\": \"Memnonia\"\n },\n {\n \"id\": 1,\n \"name\": \"San Francisco\"\n }\n ],\n \"truncated\": false\n} \n Tables will include an additional \"next\" key for pagination, which can be passed to ?_next= to fetch the next page of results. \n The various ?_shape= options continue to work as before - see Different shapes for details. \n A new ?_extra= mechanism is available for tables, but has not yet been stabilized or documented. Details on that are available in #262 .", "sections_fts": 94, "rank": null}, {"rowid": 46, "title": "Smaller changes", "content": "Datasette documentation now shows YAML examples for Metadata by default, with a tab interface for switching to JSON. ( #1153 ) \n \n \n register_output_renderer(datasette) plugins now have access to error and truncated arguments, allowing them to display error messages and take into account truncated results. ( #2130 ) \n \n \n render_cell() plugin hook now also supports an optional request argument. ( #2007 ) \n \n \n New Justfile to support development workflows for Datasette using Just . \n \n \n datasette.render_template() can now accept a datasette.views.Context subclass as an alternative to a dictionary. ( #2127 ) \n \n \n datasette install -e path option for editable installations, useful while developing plugins. ( #2106 ) \n \n \n When started with the --cors option Datasette now serves an Access-Control-Max-Age: 3600 header, ensuring CORS OPTIONS requests are repeated no more than once an hour. ( #2079 ) \n \n \n Fixed a bug where the _internal database could display None instead of null for in-memory databases. ( #1970 )", "sections_fts": 94, "rank": null}, {"rowid": 47, "title": "0.64.2 (2023-03-08)", "content": "Fixed a bug with datasette publish cloudrun where deploys all used the same Docker image tag. This was mostly inconsequential as the service is deployed as soon as the image has been pushed to the registry, but could result in the incorrect image being deployed if two different deploys for two separate services ran at exactly the same time. ( #2036 )", "sections_fts": 94, "rank": null}, {"rowid": 48, "title": "0.64.1 (2023-01-11)", "content": "Documentation now links to a current source of information for installing Python 3. ( #1987 ) \n \n \n Incorrectly calling the Datasette constructor using Datasette(\"path/to/data.db\") instead of Datasette([\"path/to/data.db\"]) now returns a useful error message. ( #1985 )", "sections_fts": 94, "rank": null}, {"rowid": 49, "title": "0.64 (2023-01-09)", "content": "Datasette now strongly recommends against allowing arbitrary SQL queries if you are using SpatiaLite . SpatiaLite includes SQL functions that could cause the Datasette server to crash. See SpatiaLite for more details. \n \n \n New default_allow_sql setting, providing an easier way to disable all arbitrary SQL execution by end users: datasette --setting default_allow_sql off . See also Controlling the ability to execute arbitrary SQL . ( #1409 ) \n \n \n Building a location to time zone API with SpatiaLite is a new Datasette tutorial showing how to safely use SpatiaLite to create a location to time zone API. \n \n \n New documentation about how to debug problems loading SQLite extensions . The error message shown when an extension cannot be loaded has also been improved. ( #1979 ) \n \n \n Fixed an accessibility issue: the