{"rowid": 201, "title": "0.19 (2018-04-16)", "content": "This is the first preview of the new Datasette plugins mechanism. Only two\n plugin hooks are available so far - for custom SQL functions and custom template\n filters. There's plenty more to come - read the documentation and get involved in\n the tracking ticket if you\n have feedback on the direction so far. \n \n \n Fix for _sort_desc=sortable_with_nulls test, refs #216 \n \n \n Fixed #216 - paginate correctly when sorting by nullable column \n \n \n Initial documentation for plugins, closes #213 \n https://docs.datasette.io/en/stable/plugins.html \n \n \n New --plugins-dir=plugins/ option ( #212 ) \n New option causing Datasette to load and evaluate all of the Python files in\n the specified directory and register any plugins that are defined in those\n files. \n This new option is available for the following commands: \n datasette serve mydb.db --plugins-dir=plugins/\ndatasette publish now/heroku mydb.db --plugins-dir=plugins/\ndatasette package mydb.db --plugins-dir=plugins/ \n \n \n Start of the plugin system, based on pluggy ( #210 ) \n Uses https://pluggy.readthedocs.io/ originally created for the py.test project \n We're starting with two plugin hooks: \n prepare_connection(conn) \n This is called when a new SQLite connection is created. It can be used to register custom SQL functions. \n prepare_jinja2_environment(env) \n This is called with the Jinja2 environment. It can be used to register custom template tags and filters. \n An example plugin which uses these two hooks can be found at https://github.com/simonw/datasette-plugin-demos or installed using pip install datasette-plugin-demos \n Refs #14 \n \n \n Return HTTP 405 on InvalidUsage rather than 500. [Russ Garrett] \n This also stops it filling up the logs. This happens for HEAD requests\n at the moment - which perhaps should be handled better, but that's a\n different issue.", "sections_fts": 28, "rank": null} {"rowid": 202, "title": "0.18 (2018-04-14)", "content": "This release introduces support for units ,\n contributed by Russ Garrett ( #203 ).\n You can now optionally specify the units for specific columns using metadata.json .\n Once specified, units will be displayed in the HTML view of your table. They also become\n available for use in filters - if a column is configured with a unit of distance, you can\n request all rows where that column is less than 50 meters or more than 20 feet for example. \n \n \n Link foreign keys which don't have labels. [Russ Garrett] \n This renders unlabeled FKs as simple links. \n Also includes bonus fixes for two minor issues: \n \n \n In foreign key link hrefs the primary key was escaped using HTML\n escaping rather than URL escaping. This broke some non-integer PKs. \n \n \n Print tracebacks to console when handling 500 errors. \n \n \n \n \n Fix SQLite error when loading rows with no incoming FKs. [Russ\n Garrett] \n This fixes an error caused by an invalid query when loading incoming FKs. \n The error was ignored due to async but it still got printed to the\n console. \n \n \n Allow custom units to be registered with Pint. [Russ Garrett] \n \n \n Support units in filters. [Russ Garrett] \n \n \n Tidy up units support. [Russ Garrett] \n \n \n Add units to exported JSON \n \n \n Units key in metadata skeleton \n \n \n Docs \n \n \n \n \n Initial units support. 
[Russ Garrett] \n Add support for specifying units for a column in metadata.json and\n rendering them on display using\n pint", "sections_fts": 28, "rank": null} {"rowid": 203, "title": "0.17 (2018-04-13)", "content": "Release 0.17 to fix issues with PyPI", "sections_fts": 28, "rank": null} {"rowid": 204, "title": "0.16 (2018-04-13)", "content": "Better mechanism for handling errors; 404s for missing table/database \n New error mechanism closes #193 \n 404s for missing tables/databases closes #184 \n \n \n long_description in markdown for the new PyPI \n \n \n Hide SpatiaLite system tables. [Russ Garrett] \n \n \n Allow explain select / explain query plan select #201 \n \n \n Datasette inspect now finds primary_keys #195 \n \n \n Ability to sort using form fields (for mobile portrait mode) #199 \n We now display sort options as a select box plus a descending checkbox, which\n means you can apply sort orders even in portrait mode on a mobile phone where\n the column headers are hidden.", "sections_fts": 28, "rank": null} {"rowid": 205, "title": "0.15 (2018-04-09)", "content": "The biggest new feature in this release is the ability to sort by column. On the\n table page the column headers can now be clicked to apply sort (or descending\n sort), or you can specify ?_sort=column or ?_sort_desc=column directly\n in the URL. \n \n \n table_rows => table_rows_count , filtered_table_rows =>\n filtered_table_rows_count \n Renamed properties. Closes #194 \n \n \n New sortable_columns option in metadata.json to control sort options. \n You can now explicitly set which columns in a table can be used for sorting\n using the _sort and _sort_desc arguments using metadata.json : \n {\n \"databases\": {\n \"database1\": {\n \"tables\": {\n \"example_table\": {\n \"sortable_columns\": [\n \"height\",\n \"weight\"\n ]\n }\n }\n }\n }\n} \n Refs #189 \n \n \n Column headers now link to sort/desc sort - refs #189 \n \n \n _sort and _sort_desc parameters for table views \n Allows for paginated sorted results based on a specified column. \n Refs #189 \n \n \n Total row count now correct even if _next applied \n \n \n Use .custom_sql() for _group_count implementation (refs #150 ) \n \n \n Make HTML title more readable in query template ( #180 ) [Ryan Pitts] \n \n \n New ?_shape=objects/object/lists param for JSON API ( #192 ) \n New _shape= parameter replacing old .jsono extension \n Now instead of this: \n /database/table.jsono \n We use the _shape parameter like this: \n /database/table.json?_shape=objects \n Also introduced a new _shape called object which looks like this: \n /database/table.json?_shape=object \n Returning an object for the rows key: \n ...\n\"rows\": {\n \"pk1\": {\n ...\n },\n \"pk2\": {\n ...\n }\n} \n Refs #122 \n \n \n Utility for writing test database fixtures to a .db file \n python tests/fixtures.py /tmp/hello.db \n This is useful for making a SQLite database of the test fixtures for\n interactive exploration. \n \n \n Compound primary key _next= now plays well with extra filters \n Closes #190 \n \n \n Fixed bug with keyset pagination over compound primary keys \n Refs #190 \n \n \n Database/Table views inherit source/license/source_url/license_url \n metadata \n If you set the source_url/license_url/source/license fields in your root\n metadata those values will now be inherited all the way down to the database\n and table templates. \n The title/description are NOT inherited. \n Also added unit tests for the HTML generated by the metadata. 
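For example, a minimal sketch of a root-level metadata.json using those four inheritable fields (the values here are illustrative, not taken from the release): \n {\n \"source\": \"Example source\",\n \"source_url\": \"http://www.example.com/\",\n \"license\": \"CC0\",\n \"license_url\": \"https://creativecommons.org/publicdomain/zero/1.0/\"\n} \n Every database and table page served from such an instance would then display that source and license information.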
\n Refs #185 \n \n \n Add metadata, if it exists, to heroku temp dir ( #178 ) [Tony Hirst] \n \n \n Initial documentation for pagination \n \n \n Broke up test_app into test_api and test_html \n \n \n Fixed bug with .json path regular expression \n I had a table called geojson and it caused an exception because the regex\n was matching .json and not \\.json \n \n \n Deploy to Heroku with Python 3.6.3", "sections_fts": 28, "rank": null} {"rowid": 206, "title": "0.14 (2017-12-09)", "content": "The theme of this release is customization: Datasette now allows every aspect\n of its presentation to be customized \n either using additional CSS or by providing entirely new templates. \n Datasette's metadata.json format \n has also been expanded, to allow per-database and per-table metadata. A new\n datasette skeleton command can be used to generate a skeleton JSON file\n ready to be filled in with per-database and per-table details. \n The metadata.json file can also be used to define\n canned queries ,\n as a more powerful alternative to SQL views. \n \n \n extra_css_urls / extra_js_urls in metadata \n A mechanism in the metadata.json format for adding custom CSS and JS urls. \n Create a metadata.json file that looks like this: \n {\n \"extra_css_urls\": [\n \"https://simonwillison.net/static/css/all.bf8cd891642c.css\"\n ],\n \"extra_js_urls\": [\n \"https://code.jquery.com/jquery-3.2.1.slim.min.js\"\n ]\n} \n Then start datasette like this: \n datasette mydb.db --metadata=metadata.json \n The CSS and JavaScript files will be linked in the
<head> of every page. \n You can also specify a SRI (subresource integrity hash) for these assets: \n {\n \"extra_css_urls\": [\n {\n \"url\": \"https://simonwillison.net/static/css/all.bf8cd891642c.css\",\n \"sri\": \"sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI\"\n }\n ],\n \"extra_js_urls\": [\n {\n \"url\": \"https://code.jquery.com/jquery-3.2.1.slim.min.js\",\n \"sri\": \"sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g=\"\n }\n ]\n} \n Modern browsers will only execute the stylesheet or JavaScript if the SRI hash\n matches the content served. You can generate hashes using https://www.srihash.org/ \n \n \n Auto-link column values that look like URLs ( #153 ) \n \n \n CSS styling hooks as classes on the body ( #153 ) \n Every template now gets CSS classes in the body designed to support custom\n styling. \n The index template (the top level page at / ) gets this: \n <body class=\"index\"> \n The database template ( /dbname/ ) gets this: \n <body class=\"db db-dbname\"> \n The table template ( /dbname/tablename ) gets: \n <body class=\"table db-dbname table-tablename\"> \n The row template ( /dbname/tablename/rowid ) gets: \n <body class=\"row db-dbname table-tablename\"> \n The db-x and table-x classes use the database or table names themselves IF\n they are valid CSS identifiers. If they aren't, we strip any invalid\n characters out and append a 6-character md5 digest of the original name, in\n order to ensure that multiple tables which resolve to the same stripped\n character version still have different CSS classes. \n Some examples (extracted from the unit tests): \n \"simple\" => \"simple\"\n\"MixedCase\" => \"MixedCase\"\n\"-no-leading-hyphens\" => \"no-leading-hyphens-65bea6\"\n\"_no-leading-underscores\" => \"no-leading-underscores-b921bc\"\n\"no spaces\" => \"no-spaces-7088d7\"\n\"-\" => \"336d5e\"\n\"no $ characters\" => \"no--characters-59e024\" \n \n \n datasette --template-dir=mytemplates/ argument \n You can now pass an additional argument specifying a directory to look for\n custom templates in. \n Datasette will fall back on the default templates if a template is not\n found in that directory. \n \n \n Ability to over-ride templates for individual tables/databases. \n It is now possible to over-ride templates on a per-database, per-table or per-row basis. \n When you access e.g. /mydatabase/mytable Datasette will look for the following: \n - table-mydatabase-mytable.html\n- table.html \n If you provided a --template-dir argument to datasette serve it will look in\n that directory first. \n The lookup rules are as follows: \n Index page (/):\n index.html\n\nDatabase page (/mydatabase):\n database-mydatabase.html\n database.html\n\nTable page (/mydatabase/mytable):\n table-mydatabase-mytable.html\n table.html\n\nRow page (/mydatabase/mytable/id):\n row-mydatabase-mytable.html\n row.html \n If a table name has spaces or other unexpected characters in it, the template\n filename will follow the same rules as our custom CSS classes\n - for example, a table called \"Food Trucks\"\n will attempt to load the following templates: \n table-mydatabase-Food-Trucks-399138.html\ntable.html \n It is possible to extend the default templates using Jinja template\n inheritance. If you want to customize EVERY row template with some additional\n content you can do so by creating a row.html template like this: \n {% extends \"default:row.html\" %}\n\n{% block content %}\nThis line renders the original block:
\n{{ super() }}\n{% endblock %} \n \n \n --static option for datasette serve ( #160 ) \n You can now tell Datasette to serve static files from a specific location at a\n specific mountpoint. \n For example: \n datasette serve mydb.db --static extra-css:/tmp/static/css \n Now if you visit this URL: \n http://localhost:8001/extra-css/blah.css \n The following file will be served: \n /tmp/static/css/blah.css \n \n \n Canned query support. \n Named canned queries can now be defined in metadata.json like this: \n {\n \"databases\": {\n \"timezones\": {\n \"queries\": {\n \"timezone_for_point\": \"select tzid from timezones ...\"\n }\n }\n }\n} \n These will be shown in a new \"Queries\" section beneath \"Views\" on the database page. \n \n \n New datasette skeleton command for generating metadata.json ( #164 ) \n \n \n metadata.json support for per-table/per-database metadata ( #165 ) \n Also added support for descriptions and HTML descriptions. \n Here's an example metadata.json file illustrating custom per-database and per-\n table metadata: \n {\n \"title\": \"Overall datasette title\",\n \"description_html\": \"This is a description with HTML.\",\n \"databases\": {\n \"db1\": {\n \"title\": \"First database\",\n \"description\": \"This is a string description & has no HTML\",\n \"license_url\": \"http://example.com/\",\n \"license\": \"The example license\",\n \"queries\": {\n \"canned_query\": \"select * from table1 limit 3;\"\n },\n \"tables\": {\n \"table1\": {\n \"title\": \"Custom title for table1\",\n \"description\": \"Tables can have descriptions too\",\n \"source\": \"This has a custom source\",\n \"source_url\": \"http://example.com/\"\n }\n }\n }\n }\n} \n \n \n Renamed datasette build command to datasette inspect ( #130 ) \n \n \n Upgrade to Sanic 0.7.0 ( #168 ) \n https://github.com/channelcat/sanic/releases/tag/0.7.0 \n \n \n Package and publish commands now accept --static and --template-dir \n Example usage: \n datasette package --static css:extra-css/ --static js:extra-js/ \\\n sf-trees.db --template-dir templates/ --tag sf-trees --branch master \n This creates a local Docker image that includes copies of the templates/,\n extra-css/ and extra-js/ directories. You can then run it like this: \n docker run -p 8001:8001 sf-trees \n For publishing to Zeit now: \n datasette publish now --static css:extra-css/ --static js:extra-js/ \\\n sf-trees.db --template-dir templates/ --name sf-trees --branch master \n \n \n HTML comment showing which templates were considered for a page ( #171 )", "sections_fts": 28, "rank": null} {"rowid": 207, "title": "0.13 (2017-11-24)", "content": "Search now applies to current filters. \n Combined search into the same form as filters. \n Closes #133 \n \n \n Much tidier design for table view header. \n Closes #147 \n \n \n Added ?column__not=blah filter. \n Closes #148 \n \n \n Row page now resolves foreign keys. \n Closes #132 \n \n \n Further tweaks to select/input filter styling. \n Refs #86 - thanks for the help, @natbat! \n \n \n Show linked foreign key in table cells. \n \n \n Added UI for editing table filters. \n Refs #86 \n \n \n Hide FTS-created tables on index pages. \n Closes #129 \n \n \n Add publish to heroku support [Jacob Kaplan-Moss] \n datasette publish heroku mydb.db \n Pull request #104 \n \n \n Initial implementation of ?_group_count=column . \n URL shortcut for counting rows grouped by one or more columns. \n ?_group_count=column1&_group_count=column2 works as well. 
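For example, with a hypothetical database named mydb containing the Street_Tree_List table used in the SQL below: \n /mydb/Street_Tree_List?_group_count=qSpecies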
\n SQL generated looks like this: \n select \"qSpecies\", count(*) as \"count\"\nfrom Street_Tree_List\ngroup by \"qSpecies\"\norder by \"count\" desc limit 100 \n Or for two columns like this: \n select \"qSpecies\", \"qSiteInfo\", count(*) as \"count\"\nfrom Street_Tree_List\ngroup by \"qSpecies\", \"qSiteInfo\"\norder by \"count\" desc limit 100 \n Refs #44 \n \n \n Added --build=master option to datasette publish and package. \n The datasette publish and datasette package commands both now accept an\n optional --build argument. If provided, this can be used to specify a branch\n published to GitHub that should be built into the container. \n This makes it easier to test code that has not yet been officially released to\n PyPI, e.g.: \n datasette publish now mydb.db --branch=master \n \n \n Implemented ?_search=XXX + UI if an FTS table is detected. \n Closes #131 \n \n \n Added datasette --version support. \n \n \n Table views now show expanded foreign key references, if possible. \n If a table has foreign key columns, and those foreign key tables have\n label_columns , the TableView will now query those other tables for the\n corresponding values and display those values as links in the corresponding\n table cells. \n label_columns are currently detected by the inspect() function, which looks\n for any table that has just two columns - an ID column and one other - and\n sets the label_column to be that second non-ID column. \n \n \n Don't prevent tabbing to \"Run SQL\" button ( #117 ) [Robert Gieseke] \n See comment in #115 \n \n \n Add keyboard shortcut to execute SQL query ( #115 ) [Robert Gieseke] \n \n \n Allow --load-extension to be set via environment variable. \n \n \n Add support for ?field__isnull=1 ( #107 ) [Ray N] \n \n \n Add SpatiaLite, switch to Debian and local build ( #114 ) [Ariel N\u00fa\u00f1ez] \n \n \n Added --load-extension argument to datasette serve. \n Allows loading of SQLite extensions. Refs #110 .", "sections_fts": 28, "rank": null} {"rowid": 208, "title": "0.12 (2017-11-16)", "content": "Added __version__ , now displayed as tooltip in page footer ( #108 ). \n \n \n Added initial docs, including a changelog ( #99 ). \n \n \n Turned on auto-escaping in Jinja. \n \n \n Added a UI for editing named parameters ( #96 ). \n You can now construct a custom SQL statement using SQLite named\n parameters (e.g. :name ) and datasette will display form fields for\n editing those parameters. Here\u2019s an example which lets you see the\n most popular names for dogs of different species registered through\n various dog registration schemes in Australia. \n \n \n \n \n \n Pin to specific Jinja version. ( #100 ). \n \n \n Default to 127.0.0.1 not 0.0.0.0. ( #98 ). \n \n \n Added extra metadata options to publish and package commands. ( #92 ). \n You can now run these commands like so: \n datasette publish now mydb.db \\\n --title=\"My Title\" \\\n --source=\"Source\" \\\n --source_url=\"http://www.example.com/\" \\\n --license=\"CC0\" \\\n --license_url=\"https://creativecommons.org/publicdomain/zero/1.0/\" \n This will write those values into the metadata.json that is packaged with the\n app. If you also pass --metadata=metadata.json that file will be updated with the extra\n values before being written into the Docker image.
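Based on the flags in the example above, the resulting metadata.json would contain something like this (a sketch, not output copied from the release): \n {\n \"title\": \"My Title\",\n \"source\": \"Source\",\n \"source_url\": \"http://www.example.com/\",\n \"license\": \"CC0\",\n \"license_url\": \"https://creativecommons.org/publicdomain/zero/1.0/\"\n}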
\n \n \n Added production-ready Dockerfile ( #94 ) [Andrew\n Cutler] \n \n \n New ?_sql_time_limit_ms=10 argument to database and table page ( #95 ) \n \n \n SQL syntax highlighting with CodeMirror ( #89 ) [Tom Dyson]", "sections_fts": 28, "rank": null} {"rowid": 209, "title": "0.11 (2017-11-14)", "content": "Added datasette publish now --force option. \n This calls now with --force - useful as it means you get a fresh copy of datasette even if Now has already cached that docker layer. \n \n \n Enable --cors by default when running in a container.", "sections_fts": 28, "rank": null} {"rowid": 210, "title": "0.10 (2017-11-14)", "content": "Fixed #83 - 500 error on individual row pages. \n \n \n Stop using SQLite WITH RECURSIVE in our tests. \n The version of Python 3 running in Travis CI doesn't support this.", "sections_fts": 28, "rank": null} {"rowid": 211, "title": "0.9 (2017-11-13)", "content": "Added --sql_time_limit_ms and --extra-options . \n The serve command now accepts --sql_time_limit_ms for customizing the SQL time\n limit. \n The publish and package commands now accept --extra-options which can be used\n to specify additional options to be passed to the datasette serve command when\n it executes inside the resulting Docker containers.", "sections_fts": 28, "rank": null} {"rowid": 212, "title": "0.8 (2017-11-13)", "content": "V0.8 - added PyPI metadata, ready to ship. \n \n \n Implemented offset/limit pagination for views ( #70 ). \n \n \n Improved pagination. ( #78 ) \n \n \n Limit on max rows returned, controlled by --max_returned_rows option. ( #69 ) \n If someone executes 'select * from table' against a table with a million rows\n in it, we could run into problems: just serializing that much data as JSON is\n likely to lock up the server. \n Solution: we now have a hard limit on the maximum number of rows that can be\n returned by a query. If that limit is exceeded, the server will return a\n \"truncated\": true field in the JSON. \n This limit can be optionally controlled by the new --max_returned_rows \n option. Setting that option to 0 disables the limit entirely.", "sections_fts": 28, "rank": null} {"rowid": 213, "title": "CLI reference", "content": "The datasette CLI tool provides a number of commands. \n Running datasette without specifying a command runs the default command, datasette serve . See datasette serve for the full list of options for that command. \n [[[cog\nfrom datasette import cli\nfrom click.testing import CliRunner\nimport textwrap\ndef help(args):\n title = \"datasette \" + \" \".join(args)\n cog.out(\"\\n::\\n\\n\")\n result = CliRunner().invoke(cli.cli, args)\n output = result.output.replace(\"Usage: cli \", \"Usage: datasette \")\n cog.out(textwrap.indent(output, ' '))\n cog.out(\"\\n\\n\") \n ]]] \n [[[end]]]", "sections_fts": 28, "rank": null} {"rowid": 214, "title": "datasette --help", "content": "Running datasette --help shows a list of all of the available commands.
\n [[[cog\nhelp([\"--help\"]) \n ]]] \n Usage: datasette [OPTIONS] COMMAND [ARGS]...\n\n Datasette is an open source multi-tool for exploring and publishing data\n\n About Datasette: https://datasette.io/\n Full documentation: https://docs.datasette.io/\n\nOptions:\n --version Show the version and exit.\n --help Show this message and exit.\n\nCommands:\n serve* Serve up specified SQLite database files with a web UI\n create-token Create a signed API token for the specified actor ID\n inspect Generate JSON summary of provided database files\n install Install plugins and packages from PyPI into the same...\n package Package SQLite files into a Datasette Docker container\n plugins List currently installed plugins\n publish Publish specified SQLite database files to the internet...\n uninstall Uninstall plugins and Python packages from the Datasette... \n [[[end]]] \n Additional commands added by plugins that use the register_commands(cli) hook will be listed here as well.", "sections_fts": 28, "rank": null} {"rowid": 215, "title": "datasette serve", "content": "This command starts the Datasette web application running on your machine: \n datasette serve mydatabase.db \n Or since this is the default command you can run this instead: \n datasette mydatabase.db \n Once started you can access it at http://localhost:8001 \n [[[cog\nhelp([\"serve\", \"--help\"]) \n ]]] \n Usage: datasette serve [OPTIONS] [FILES]...\n\n Serve up specified SQLite database files with a web UI\n\nOptions:\n -i, --immutable PATH Database files to open in immutable mode\n -h, --host TEXT Host for server. Defaults to 127.0.0.1 which\n means only connections from the local machine\n will be allowed. Use 0.0.0.0 to listen to all\n IPs and allow access from other machines.\n -p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to\n automatically assign an available port.\n [0<=x<=65535]\n --uds TEXT Bind to a Unix domain socket\n --reload Automatically reload if code or metadata\n change detected - useful for development\n --cors Enable CORS by serving Access-Control-Allow-\n Origin: *\n --load-extension PATH:ENTRYPOINT?\n Path to a SQLite extension to load, and\n optional entrypoint\n --inspect-file TEXT Path to JSON file created using \"datasette\n inspect\"\n -m, --metadata FILENAME Path to JSON/YAML file containing\n license/source metadata\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --memory Make /_memory database available\n -c, --config FILENAME Path to JSON/YAML Datasette configuration file\n -s, --setting SETTING... 
nested.key, value setting to use in Datasette\n configuration\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --root Output URL that sets a cookie authenticating\n the root user\n --get TEXT Run an HTTP GET request against this path,\n print results and exit\n --token TEXT API token to send with --get requests\n --actor TEXT Actor to use for --get requests (JSON string)\n --version-note TEXT Additional note to show on /-/versions\n --help-settings Show available settings\n --pdb Launch debugger on any errors\n -o, --open Open Datasette in your web browser\n --create Create database files if they do not exist\n --crossdb Enable cross-database joins using the /_memory\n database\n --nolock Ignore locking, open locked files in read-only\n mode\n --ssl-keyfile TEXT SSL key file\n --ssl-certfile TEXT SSL certificate file\n --internal PATH Path to a persistent Datasette internal SQLite\n database\n --help Show this message and exit. \n [[[end]]]", "sections_fts": 28, "rank": null} {"rowid": 216, "title": "Environment variables", "content": "Some of the datasette serve options can be provided by environment variables: \n \n \n DATASETTE_SECRET : Equivalent to the --secret option. \n \n \n DATASETTE_SSL_KEYFILE : Equivalent to the --ssl-keyfile option. \n \n \n DATASETTE_SSL_CERTFILE : Equivalent to the --ssl-certfile option. \n \n \n DATASETTE_LOAD_EXTENSION : Equivalent to the --load-extension option.", "sections_fts": 28, "rank": null} {"rowid": 217, "title": "datasette --get", "content": "The --get option to datasette serve (or just datasette ) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server. \n This means that all of Datasette's functionality can be accessed directly from the command-line. \n For example: \n datasette --get '/-/versions.json' | jq . \n {\n \"python\": {\n \"version\": \"3.8.5\",\n \"full\": \"3.8.5 (default, Jul 21 2020, 10:48:26) \\n[Clang 11.0.3 (clang-1103.0.32.62)]\"\n },\n \"datasette\": {\n \"version\": \"0.46+15.g222a84a.dirty\"\n },\n \"asgi\": \"3.0\",\n \"uvicorn\": \"0.11.8\",\n \"sqlite\": {\n \"version\": \"3.32.3\",\n \"fts_versions\": [\n \"FTS5\",\n \"FTS4\",\n \"FTS3\"\n ],\n \"extensions\": {\n \"json1\": null\n },\n \"compile_options\": [\n \"COMPILER=clang-11.0.3\",\n \"ENABLE_COLUMN_METADATA\",\n \"ENABLE_FTS3\",\n \"ENABLE_FTS3_PARENTHESIS\",\n \"ENABLE_FTS4\",\n \"ENABLE_FTS5\",\n \"ENABLE_GEOPOLY\",\n \"ENABLE_JSON1\",\n \"ENABLE_PREUPDATE_HOOK\",\n \"ENABLE_RTREE\",\n \"ENABLE_SESSION\",\n \"MAX_VARIABLE_NUMBER=250000\",\n \"THREADSAFE=1\"\n ]\n }\n} \n You can use the --token TOKEN option to send an API token with the simulated request. \n Or you can make a request as a specific actor by passing a JSON representation of that actor to --actor : \n datasette --memory --actor '{\"id\": \"root\"}' --get '/-/actor.json' \n The exit code of datasette --get will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error. \n This lets you use datasette --get / to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.", "sections_fts": 28, "rank": null} {"rowid": 218, "title": "datasette serve --help-settings", "content": "This command outputs all of the available Datasette settings . \n These can be passed to datasette serve using datasette serve --setting name value . 
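For example, to raise the SQL time limit for a local database (the file name and value here are illustrative): \n datasette mydatabase.db --setting sql_time_limit_ms 3500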
\n [[[cog\nhelp([\"--help-settings\"]) \n ]]] \n Settings:\n default_page_size Default page size for the table view\n (default=100)\n max_returned_rows Maximum rows that can be returned from a table or\n custom query (default=1000)\n max_insert_rows Maximum rows that can be inserted at a time using\n the bulk insert API (default=100)\n num_sql_threads Number of threads in the thread pool for\n executing SQLite queries (default=3)\n sql_time_limit_ms Time limit for a SQL query in milliseconds\n (default=1000)\n default_facet_size Number of values to return for requested facets\n (default=30)\n facet_time_limit_ms Time limit for calculating a requested facet\n (default=200)\n facet_suggest_time_limit_ms Time limit for calculating a suggested facet\n (default=50)\n allow_facet Allow users to specify columns to facet using\n ?_facet= parameter (default=True)\n allow_download Allow users to download the original SQLite\n database files (default=True)\n allow_signed_tokens Allow users to create and use signed API tokens\n (default=True)\n default_allow_sql Allow anyone to run arbitrary SQL queries\n (default=True)\n max_signed_tokens_ttl Maximum allowed expiry time for signed API tokens\n (default=0)\n suggest_facets Calculate and display suggested facets\n (default=True)\n default_cache_ttl Default HTTP cache TTL (used in Cache-Control:\n max-age= header) (default=5)\n cache_size_kb SQLite cache size in KB (0 == use SQLite default)\n (default=0)\n allow_csv_stream Allow .csv?_stream=1 to download all rows\n (ignoring max_returned_rows) (default=True)\n max_csv_mb Maximum size allowed for CSV export in MB - set 0\n to disable this limit (default=100)\n truncate_cells_html Truncate cells longer than this in HTML table\n view - set 0 to disable (default=2048)\n force_https_urls Force URLs in API output to always use https://\n protocol (default=False)\n template_debug Allow display of template debug information with\n ?_context=1 (default=False)\n trace_debug Allow display of SQL trace debug information with\n ?_trace=1 (default=False)\n base_url Datasette URLs should use this base path\n (default=/) \n [[[end]]]", "sections_fts": 28, "rank": null} {"rowid": 219, "title": "datasette plugins", "content": "Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use. \n [[[cog\nhelp([\"plugins\", \"--help\"]) \n ]]] \n Usage: datasette plugins [OPTIONS]\n\n List currently installed plugins\n\nOptions:\n --all Include built-in default plugins\n --requirements Output requirements.txt of installed plugins\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --help Show this message and exit. \n [[[end]]] \n Example output: \n [\n {\n \"name\": \"datasette-geojson\",\n \"static\": false,\n \"templates\": false,\n \"version\": \"0.3.1\",\n \"hooks\": [\n \"register_output_renderer\"\n ]\n },\n {\n \"name\": \"datasette-geojson-map\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.4.0\",\n \"hooks\": [\n \"extra_body_script\",\n \"extra_css_urls\",\n \"extra_js_urls\"\n ]\n },\n {\n \"name\": \"datasette-leaflet\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.2.2\",\n \"hooks\": [\n \"extra_body_script\",\n \"extra_template_vars\"\n ]\n }\n]", "sections_fts": 28, "rank": null} {"rowid": 220, "title": "datasette install", "content": "Install new Datasette plugins. 
This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette. \n This command: \n datasette install datasette-cluster-map \n Would install the datasette-cluster-map plugin. \n [[[cog\nhelp([\"install\", \"--help\"]) \n ]]] \n Usage: datasette install [OPTIONS] [PACKAGES]...\n\n Install plugins and packages from PyPI into the same environment as Datasette\n\nOptions:\n -U, --upgrade Upgrade packages to latest version\n -r, --requirement PATH Install from requirements file\n -e, --editable TEXT Install a project in editable mode from this path\n --help Show this message and exit. \n [[[end]]]", "sections_fts": 28, "rank": null} {"rowid": 221, "title": "datasette uninstall", "content": "Uninstall one or more plugins. \n [[[cog\nhelp([\"uninstall\", \"--help\"]) \n ]]] \n Usage: datasette uninstall [OPTIONS] PACKAGES...\n\n Uninstall plugins and Python packages from the Datasette environment\n\nOptions:\n -y, --yes Don't ask for confirmation\n --help Show this message and exit. \n [[[end]]]", "sections_fts": 28, "rank": null} {"rowid": 222, "title": "datasette publish", "content": "Shows a list of available deployment targets for publishing data with Datasette. \n Additional deployment targets can be added by plugins that use the publish_subcommand(publish) hook. \n [[[cog\nhelp([\"publish\", \"--help\"]) \n ]]] \n Usage: datasette publish [OPTIONS] COMMAND [ARGS]...\n\n Publish specified SQLite database files to the internet along with a\n Datasette-powered interface and API\n\nOptions:\n --help Show this message and exit.\n\nCommands:\n cloudrun Publish databases to Datasette running on Cloud Run\n heroku Publish databases to Datasette running on Heroku \n [[[end]]]", "sections_fts": 28, "rank": null} {"rowid": 223, "title": "datasette publish cloudrun", "content": "See Publishing to Google Cloud Run . \n [[[cog\nhelp([\"publish\", \"cloudrun\", \"--help\"]) \n ]]] \n Usage: datasette publish cloudrun [OPTIONS] [FILES]...\n\n Publish databases to Datasette running on Cloud Run\n\nOptions:\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g.\n main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --plugin-secretThis is a custom panel that I added using a JavaScript plugin
';\n }\n }\n ]\n }\n });\n}); \n When a page with a table loads, all registered plugins that implement makeAboveTablePanelConfigs() will be called and panels they return will be added to the top of the table page.", "sections_fts": 28, "rank": null} {"rowid": 233, "title": "makeColumnActions(columnDetails)", "content": "This method, if present, will be called when Datasette is rendering the cog action menu icons that appear at the top of the table view. By default these include options like \"Sort ascending/descending\" and \"Facet by this\", but plugins can return additional actions to be included in this menu. \n The method will be called with a columnDetails object with the following keys: \n \n \n columnName - string \n \n The name of the column \n \n \n \n columnNotNull - boolean \n \n True if the column is defined as NOT NULL \n \n \n \n columnType - string \n \n The SQLite data type of the column \n \n \n \n isPk - boolean \n \n True if the column is part of the primary key \n \n \n \n It should return a JavaScript array of objects each with a label and onClick property: \n \n \n label - string \n \n The human-readable label for the action \n \n \n \n onClick(evt) - function \n \n A function that will be called when the action is clicked \n \n \n \n The evt object passed to the onClick is the standard browser event object that triggered the click. \n This example plugin adds two menu items - one to copy the column name to the clipboard and another that displays the column metadata in an alert() window: \n document.addEventListener('datasette_init', function(ev) {\n ev.detail.registerPlugin('column-name-plugin', {\n version: 0.1,\n makeColumnActions: (columnDetails) => {\n return [\n {\n label: 'Copy column to clipboard',\n onClick: async (evt) => {\n await navigator.clipboard.writeText(columnDetails.columnName)\n }\n },\n {\n label: 'Alert column metadata',\n onClick: () => alert(JSON.stringify(columnDetails, null, 2))\n }\n ];\n }\n });\n});", "sections_fts": 28, "rank": null} {"rowid": 234, "title": "Selectors", "content": "These are available on the selectors property of the datasetteManager object. \n const DOM_SELECTORS = {\n /** Should have one match */\n jsonExportLink: \".export-links a[href*=json]\",\n\n /** Event listeners that go outside of the main table, e.g. existing scroll listener */\n tableWrapper: \".table-wrapper\",\n table: \"table.rows-and-columns\",\n aboveTablePanel: \".above-table-panel\",\n\n // These could have multiple matches\n /** Used for selecting table headers. Use makeColumnActions if you want to add menu items. */\n tableHeaders: `table.rows-and-columns th`,\n\n /** Used to add \"where\" clauses to query using direct manipulation */\n filterRows: \".filter-row\",\n /** Used to show top available enum values for a column (\"facets\") */\n facetResults: \".facet-results [data-column]\",\n};", "sections_fts": 28, "rank": null} {"rowid": 235, "title": "Introspection", "content": "Datasette includes some pages and JSON API endpoints for introspecting the current instance. These can be used to understand some of the internals of Datasette and to see how a particular instance has been configured. \n Each of these pages can be viewed in your browser. Add .json to the URL to get back the contents as JSON.", "sections_fts": 28, "rank": null} {"rowid": 236, "title": "/-/metadata", "content": "Shows the contents of the metadata.json file that was passed to datasette serve , if any. 
Metadata example : \n {\n \"license\": \"CC Attribution 4.0 License\",\n \"license_url\": \"http://creativecommons.org/licenses/by/4.0/\",\n \"source\": \"fivethirtyeight/data on GitHub\",\n \"source_url\": \"https://github.com/fivethirtyeight/data\",\n \"title\": \"Five Thirty Eight\",\n \"databases\": {\n\n }\n}", "sections_fts": 28, "rank": null} {"rowid": 237, "title": "/-/versions", "content": "Shows the version of Datasette, Python and SQLite. Versions example : \n {\n \"datasette\": {\n \"version\": \"0.60\"\n },\n \"python\": {\n \"full\": \"3.8.12 (default, Dec 21 2021, 10:45:09) \\n[GCC 10.2.1 20210110]\",\n \"version\": \"3.8.12\"\n },\n \"sqlite\": {\n \"extensions\": {\n \"json1\": null\n },\n \"fts_versions\": [\n \"FTS5\",\n \"FTS4\",\n \"FTS3\"\n ],\n \"compile_options\": [\n \"COMPILER=gcc-6.3.0 20170516\",\n \"ENABLE_FTS3\",\n \"ENABLE_FTS4\",\n \"ENABLE_FTS5\",\n \"ENABLE_JSON1\",\n \"ENABLE_RTREE\",\n \"THREADSAFE=1\"\n ],\n \"version\": \"3.37.0\"\n }\n}", "sections_fts": 28, "rank": null} {"rowid": 238, "title": "/-/plugins", "content": "Shows a list of currently installed plugins and their versions. Plugins example : \n [\n {\n \"name\": \"datasette_cluster_map\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.10\",\n \"hooks\": [\"extra_css_urls\", \"extra_js_urls\", \"extra_body_script\"]\n }\n] \n Add ?all=1 to include details of the default plugins baked into Datasette.", "sections_fts": 28, "rank": null} {"rowid": 239, "title": "/-/settings", "content": "Shows the Settings for this instance of Datasette. Settings example : \n {\n \"default_facet_size\": 30,\n \"default_page_size\": 100,\n \"facet_suggest_time_limit_ms\": 50,\n \"facet_time_limit_ms\": 1000,\n \"max_returned_rows\": 1000,\n \"sql_time_limit_ms\": 1000\n}", "sections_fts": 28, "rank": null} {"rowid": 240, "title": "/-/config", "content": "Shows the configuration for this instance of Datasette. This is generally the contents of the datasette.yaml or datasette.json file, which can include plugin configuration as well. Config example : \n {\n \"settings\": {\n \"template_debug\": true,\n \"trace_debug\": true,\n \"force_https_urls\": true\n }\n} \n Any keys that include one of the following substrings in their names will be returned as redacted *** output, to help avoid accidentally leaking private configuration information: secret , key , password , token , hash , dsn .", "sections_fts": 28, "rank": null} {"rowid": 241, "title": "/-/databases", "content": "Shows currently attached databases. Databases example : \n [\n {\n \"hash\": null,\n \"is_memory\": false,\n \"is_mutable\": true,\n \"name\": \"fixtures\",\n \"path\": \"fixtures.db\",\n \"size\": 225280\n }\n]", "sections_fts": 28, "rank": null} {"rowid": 242, "title": "/-/threads", "content": "Shows details of threads and asyncio tasks. Threads example : \n {\n \"num_threads\": 2,\n \"threads\": [\n {\n \"daemon\": false,\n \"ident\": 4759197120,\n \"name\": \"MainThread\"\n },\n {\n \"daemon\": true,\n \"ident\": 123145319682048,\n \"name\": \"Thread-1\"\n }\n ],\n \"num_tasks\": 3,\n \"tasks\": [\n \"