{"id": "cli-reference:cli-datasette-get", "page": "cli-reference", "ref": "cli-datasette-get", "title": "datasette --get", "content": "The --get option to datasette serve (or just datasette ) specifies the path to a page within Datasette and causes Datasette to output the content from that path without starting the web server. \n This means that all of Datasette's functionality can be accessed directly from the command-line. \n For example: \n datasette --get '/-/versions.json' | jq . \n {\n \"python\": {\n \"version\": \"3.8.5\",\n \"full\": \"3.8.5 (default, Jul 21 2020, 10:48:26) \\n[Clang 11.0.3 (clang-1103.0.32.62)]\"\n },\n \"datasette\": {\n \"version\": \"0.46+15.g222a84a.dirty\"\n },\n \"asgi\": \"3.0\",\n \"uvicorn\": \"0.11.8\",\n \"sqlite\": {\n \"version\": \"3.32.3\",\n \"fts_versions\": [\n \"FTS5\",\n \"FTS4\",\n \"FTS3\"\n ],\n \"extensions\": {\n \"json1\": null\n },\n \"compile_options\": [\n \"COMPILER=clang-11.0.3\",\n \"ENABLE_COLUMN_METADATA\",\n \"ENABLE_FTS3\",\n \"ENABLE_FTS3_PARENTHESIS\",\n \"ENABLE_FTS4\",\n \"ENABLE_FTS5\",\n \"ENABLE_GEOPOLY\",\n \"ENABLE_JSON1\",\n \"ENABLE_PREUPDATE_HOOK\",\n \"ENABLE_RTREE\",\n \"ENABLE_SESSION\",\n \"MAX_VARIABLE_NUMBER=250000\",\n \"THREADSAFE=1\"\n ]\n }\n} \n You can use the --token TOKEN option to send an API token with the simulated request. \n Or you can make a request as a specific actor by passing a JSON representation of that actor to --actor : \n datasette --memory --actor '{\"id\": \"root\"}' --get '/-/actor.json' \n The exit code of datasette --get will be 0 if the request succeeds and 1 if the request produced an HTTP status code other than 200 - e.g. a 404 or 500 error. \n This lets you use datasette --get / to run tests against a Datasette application in a continuous integration environment such as GitHub Actions.", "breadcrumbs": "[\"CLI reference\", \"datasette serve\"]", "references": "[]"} {"id": "cli-reference:cli-help-create-token-help", "page": "cli-reference", "ref": "cli-help-create-token-help", "title": "datasette create-token", "content": "Create a signed API token, see datasette create-token . \n [[[cog\nhelp([\"create-token\", \"--help\"]) \n ]]] \n Usage: datasette create-token [OPTIONS] ID\n\n Create a signed API token for the specified actor ID\n\n Example:\n\n datasette create-token root --secret mysecret\n\n To allow only \"view-database-download\" for all databases:\n\n datasette create-token root --secret mysecret \\\n --all view-database-download\n\n To allow \"create-table\" against a specific database:\n\n datasette create-token root --secret mysecret \\\n --database mydb create-table\n\n To allow \"insert-row\" against a specific table:\n\n datasette create-token root --secret myscret \\\n --resource mydb mytable insert-row\n\n Restricted actions can be specified multiple times using multiple --all,\n --database, and --resource options.\n\n Add --debug to see a decoded version of the token.\n\nOptions:\n --secret TEXT Secret used for signing the API tokens\n [required]\n -e, --expires-after INTEGER Token should expire after this many seconds\n -a, --all ACTION Restrict token to this action\n -d, --database DB ACTION Restrict token to this action on this database\n -r, --resource DB RESOURCE ACTION\n Restrict token to this action on this database\n resource (a table, SQL view or named query)\n --debug Show decoded token\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --help Show this message and exit. 
\n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-help", "page": "cli-reference", "ref": "cli-help-help", "title": "datasette --help", "content": "Running datasette --help shows a list of all of the available commands. \n [[[cog\nhelp([\"--help\"]) \n ]]] \n Usage: datasette [OPTIONS] COMMAND [ARGS]...\n\n Datasette is an open source multi-tool for exploring and publishing data\n\n About Datasette: https://datasette.io/\n Full documentation: https://docs.datasette.io/\n\nOptions:\n --version Show the version and exit.\n --help Show this message and exit.\n\nCommands:\n serve* Serve up specified SQLite database files with a web UI\n create-token Create a signed API token for the specified actor ID\n inspect Generate JSON summary of provided database files\n install Install plugins and packages from PyPI into the same...\n package Package SQLite files into a Datasette Docker container\n plugins List currently installed plugins\n publish Publish specified SQLite database files to the internet...\n uninstall Uninstall plugins and Python packages from the Datasette... \n [[[end]]] \n Additional commands added by plugins that use the register_commands(cli) hook will be listed here as well.", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-inspect-help", "page": "cli-reference", "ref": "cli-help-inspect-help", "title": "datasette inspect", "content": "Outputs JSON representing introspected data about one or more SQLite database files. \n If you are opening an immutable database, you can pass this file to the --inspect-data option to improve Datasette's performance by allowing it to skip running row counts against the database when it first starts running: \n datasette inspect mydatabase.db > inspect-data.json\ndatasette serve -i mydatabase.db --inspect-file inspect-data.json \n This performance optimization is used automatically by some of the datasette publish commands. You are unlikely to need to apply this optimization manually. \n [[[cog\nhelp([\"inspect\", \"--help\"]) \n ]]] \n Usage: datasette inspect [OPTIONS] [FILES]...\n\n Generate JSON summary of provided database files\n\n This can then be passed to \"datasette --inspect-file\" to speed up count\n operations against immutable database files.\n\nOptions:\n --inspect-file TEXT\n --load-extension PATH:ENTRYPOINT?\n Path to a SQLite extension to load, and\n optional entrypoint\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-install-help", "page": "cli-reference", "ref": "cli-help-install-help", "title": "datasette install", "content": "Install new Datasette plugins. This command works like pip install but ensures that your plugins will be installed into the same environment as Datasette. \n This command: \n datasette install datasette-cluster-map \n Would install the datasette-cluster-map plugin. \n [[[cog\nhelp([\"install\", \"--help\"]) \n ]]] \n Usage: datasette install [OPTIONS] [PACKAGES]...\n\n Install plugins and packages from PyPI into the same environment as Datasette\n\nOptions:\n -U, --upgrade Upgrade packages to latest version\n -r, --requirement PATH Install from requirements file\n -e, --editable TEXT Install a project in editable mode from this path\n --help Show this message and exit. 
\n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[{\"href\": \"https://datasette.io/plugins/datasette-cluster-map\", \"label\": \"datasette-cluster-map\"}]"} {"id": "cli-reference:cli-help-package-help", "page": "cli-reference", "ref": "cli-help-package-help", "title": "datasette package", "content": "Package SQLite files into a Datasette Docker container, see datasette package . \n [[[cog\nhelp([\"package\", \"--help\"]) \n ]]] \n Usage: datasette package [OPTIONS] FILES...\n\n Package SQLite files into a Datasette Docker container\n\nOptions:\n -t, --tag TEXT Name for the resulting Docker container, can\n optionally use name:tag format\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g. main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --spatialite Enable SpatialLite extension\n --version-note TEXT Additional note to show on /-/versions\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n -p, --port INTEGER RANGE Port to run the server on, defaults to 8001\n [1<=x<=65535]\n --title TEXT Title for metadata\n --license TEXT License label for metadata\n --license_url TEXT License URL for metadata\n --source TEXT Source label for metadata\n --source_url TEXT Source URL for metadata\n --about TEXT About label for metadata\n --about_url TEXT About URL for metadata\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-plugins-help", "page": "cli-reference", "ref": "cli-help-plugins-help", "title": "datasette plugins", "content": "Output JSON showing all currently installed plugins, their versions, whether they include static files or templates and which Plugin hooks they use. \n [[[cog\nhelp([\"plugins\", \"--help\"]) \n ]]] \n Usage: datasette plugins [OPTIONS]\n\n List currently installed plugins\n\nOptions:\n --all Include built-in default plugins\n --requirements Output requirements.txt of installed plugins\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --help Show this message and exit. \n [[[end]]] \n Example output: \n [\n {\n \"name\": \"datasette-geojson\",\n \"static\": false,\n \"templates\": false,\n \"version\": \"0.3.1\",\n \"hooks\": [\n \"register_output_renderer\"\n ]\n },\n {\n \"name\": \"datasette-geojson-map\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.4.0\",\n \"hooks\": [\n \"extra_body_script\",\n \"extra_css_urls\",\n \"extra_js_urls\"\n ]\n },\n {\n \"name\": \"datasette-leaflet\",\n \"static\": true,\n \"templates\": false,\n \"version\": \"0.2.2\",\n \"hooks\": [\n \"extra_body_script\",\n \"extra_template_vars\"\n ]\n }\n]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-publish-cloudrun-help", "page": "cli-reference", "ref": "cli-help-publish-cloudrun-help", "title": "datasette publish cloudrun", "content": "See Publishing to Google Cloud Run . 
\n [[[cog\nhelp([\"publish\", \"cloudrun\", \"--help\"]) \n ]]] \n Usage: datasette publish cloudrun [OPTIONS] [FILES]...\n\n Publish databases to Datasette running on Cloud Run\n\nOptions:\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g.\n main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --plugin-secret ...\n Secrets to pass to plugins, e.g. --plugin-\n secret datasette-auth-github client_id xxx\n --version-note TEXT Additional note to show on /-/versions\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --title TEXT Title for metadata\n --license TEXT License label for metadata\n --license_url TEXT License URL for metadata\n --source TEXT Source label for metadata\n --source_url TEXT Source URL for metadata\n --about TEXT About label for metadata\n --about_url TEXT About URL for metadata\n -n, --name TEXT Application name to use when building\n --service TEXT Cloud Run service to deploy (or over-write)\n --spatialite Enable SpatialLite extension\n --show-files Output the generated Dockerfile and\n metadata.json\n --memory TEXT Memory to allocate in Cloud Run, e.g. 1Gi\n --cpu [1|2|4] Number of vCPUs to allocate in Cloud Run\n --timeout INTEGER Build timeout in seconds\n --apt-get-install TEXT Additional packages to apt-get install\n --max-instances INTEGER Maximum Cloud Run instances\n --min-instances INTEGER Minimum Cloud Run instances\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-publish-help", "page": "cli-reference", "ref": "cli-help-publish-help", "title": "datasette publish", "content": "Shows a list of available deployment targets for publishing data with Datasette. \n Additional deployment targets can be added by plugins that use the publish_subcommand(publish) hook. \n [[[cog\nhelp([\"publish\", \"--help\"]) \n ]]] \n Usage: datasette publish [OPTIONS] COMMAND [ARGS]...\n\n Publish specified SQLite database files to the internet along with a\n Datasette-powered interface and API\n\nOptions:\n --help Show this message and exit.\n\nCommands:\n cloudrun Publish databases to Datasette running on Cloud Run\n heroku Publish databases to Datasette running on Heroku \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-publish-heroku-help", "page": "cli-reference", "ref": "cli-help-publish-heroku-help", "title": "datasette publish heroku", "content": "See Publishing to Heroku . 
\n [[[cog\nhelp([\"publish\", \"heroku\", \"--help\"]) \n ]]] \n Usage: datasette publish heroku [OPTIONS] [FILES]...\n\n Publish databases to Datasette running on Heroku\n\nOptions:\n -m, --metadata FILENAME Path to JSON/YAML file containing metadata to\n publish\n --extra-options TEXT Extra options to pass to datasette serve\n --branch TEXT Install datasette from a GitHub branch e.g.\n main\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --install TEXT Additional packages (e.g. plugins) to install\n --plugin-secret ...\n Secrets to pass to plugins, e.g. --plugin-\n secret datasette-auth-github client_id xxx\n --version-note TEXT Additional note to show on /-/versions\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --title TEXT Title for metadata\n --license TEXT License label for metadata\n --license_url TEXT License URL for metadata\n --source TEXT Source label for metadata\n --source_url TEXT Source URL for metadata\n --about TEXT About label for metadata\n --about_url TEXT About URL for metadata\n -n, --name TEXT Application name to use when deploying\n --tar TEXT --tar option to pass to Heroku, e.g.\n --tar=/usr/local/bin/gtar\n --generate-dir DIRECTORY Output generated application files and stop\n without deploying\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-serve-help", "page": "cli-reference", "ref": "cli-help-serve-help", "title": "datasette serve", "content": "This command starts the Datasette web application running on your machine: \n datasette serve mydatabase.db \n Or since this is the default command you can run this instead: \n datasette mydatabase.db \n Once started you can access it at http://localhost:8001 \n [[[cog\nhelp([\"serve\", \"--help\"]) \n ]]] \n Usage: datasette serve [OPTIONS] [FILES]...\n\n Serve up specified SQLite database files with a web UI\n\nOptions:\n -i, --immutable PATH Database files to open in immutable mode\n -h, --host TEXT Host for server. Defaults to 127.0.0.1 which\n means only connections from the local machine\n will be allowed. Use 0.0.0.0 to listen to all\n IPs and allow access from other machines.\n -p, --port INTEGER RANGE Port for server, defaults to 8001. Use -p 0 to\n automatically assign an available port.\n [0<=x<=65535]\n --uds TEXT Bind to a Unix domain socket\n --reload Automatically reload if code or metadata\n change detected - useful for development\n --cors Enable CORS by serving Access-Control-Allow-\n Origin: *\n --load-extension PATH:ENTRYPOINT?\n Path to a SQLite extension to load, and\n optional entrypoint\n --inspect-file TEXT Path to JSON file created using \"datasette\n inspect\"\n -m, --metadata FILENAME Path to JSON/YAML file containing\n license/source metadata\n --template-dir DIRECTORY Path to directory containing custom templates\n --plugins-dir DIRECTORY Path to directory containing custom plugins\n --static MOUNT:DIRECTORY Serve static files from this directory at\n /MOUNT/...\n --memory Make /_memory database available\n -c, --config FILENAME Path to JSON/YAML Datasette configuration file\n -s, --setting SETTING... 
nested.key, value setting to use in Datasette\n configuration\n --secret TEXT Secret used for signing secure values, such as\n signed cookies\n --root Output URL that sets a cookie authenticating\n the root user\n --get TEXT Run an HTTP GET request against this path,\n print results and exit\n --token TEXT API token to send with --get requests\n --actor TEXT Actor to use for --get requests (JSON string)\n --version-note TEXT Additional note to show on /-/versions\n --help-settings Show available settings\n --pdb Launch debugger on any errors\n -o, --open Open Datasette in your web browser\n --create Create database files if they do not exist\n --crossdb Enable cross-database joins using the /_memory\n database\n --nolock Ignore locking, open locked files in read-only\n mode\n --ssl-keyfile TEXT SSL key file\n --ssl-certfile TEXT SSL certificate file\n --internal PATH Path to a persistent Datasette internal SQLite\n database\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:cli-help-serve-help-settings", "page": "cli-reference", "ref": "cli-help-serve-help-settings", "title": "datasette serve --help-settings", "content": "This command outputs all of the available Datasette settings . \n These can be passed to datasette serve using datasette serve --setting name value . \n [[[cog\nhelp([\"--help-settings\"]) \n ]]] \n Settings:\n default_page_size Default page size for the table view\n (default=100)\n max_returned_rows Maximum rows that can be returned from a table or\n custom query (default=1000)\n max_insert_rows Maximum rows that can be inserted at a time using\n the bulk insert API (default=100)\n num_sql_threads Number of threads in the thread pool for\n executing SQLite queries (default=3)\n sql_time_limit_ms Time limit for a SQL query in milliseconds\n (default=1000)\n default_facet_size Number of values to return for requested facets\n (default=30)\n facet_time_limit_ms Time limit for calculating a requested facet\n (default=200)\n facet_suggest_time_limit_ms Time limit for calculating a suggested facet\n (default=50)\n allow_facet Allow users to specify columns to facet using\n ?_facet= parameter (default=True)\n allow_download Allow users to download the original SQLite\n database files (default=True)\n allow_signed_tokens Allow users to create and use signed API tokens\n (default=True)\n default_allow_sql Allow anyone to run arbitrary SQL queries\n (default=True)\n max_signed_tokens_ttl Maximum allowed expiry time for signed API tokens\n (default=0)\n suggest_facets Calculate and display suggested facets\n (default=True)\n default_cache_ttl Default HTTP cache TTL (used in Cache-Control:\n max-age= header) (default=5)\n cache_size_kb SQLite cache size in KB (0 == use SQLite default)\n (default=0)\n allow_csv_stream Allow .csv?_stream=1 to download all rows\n (ignoring max_returned_rows) (default=True)\n max_csv_mb Maximum size allowed for CSV export in MB - set 0\n to disable this limit (default=100)\n truncate_cells_html Truncate cells longer than this in HTML table\n view - set 0 to disable (default=2048)\n force_https_urls Force URLs in API output to always use https://\n protocol (default=False)\n template_debug Allow display of template debug information with\n ?_context=1 (default=False)\n trace_debug Allow display of SQL trace debug information with\n ?_trace=1 (default=False)\n base_url Datasette URLs should use this base path\n (default=/) \n [[[end]]]", "breadcrumbs": "[\"CLI reference\", 
\"datasette serve\"]", "references": "[]"} {"id": "cli-reference:cli-help-uninstall-help", "page": "cli-reference", "ref": "cli-help-uninstall-help", "title": "datasette uninstall", "content": "Uninstall one or more plugins. \n [[[cog\nhelp([\"uninstall\", \"--help\"]) \n ]]] \n Usage: datasette uninstall [OPTIONS] PACKAGES...\n\n Uninstall plugins and Python packages from the Datasette environment\n\nOptions:\n -y, --yes Don't ask for confirmation\n --help Show this message and exit. \n [[[end]]]", "breadcrumbs": "[\"CLI reference\"]", "references": "[]"} {"id": "cli-reference:id1", "page": "cli-reference", "ref": "id1", "title": "CLI reference", "content": "The datasette CLI tool provides a number of commands. \n Running datasette without specifying a command runs the default command, datasette serve . See datasette serve for the full list of options for that command. \n [[[cog\nfrom datasette import cli\nfrom click.testing import CliRunner\nimport textwrap\ndef help(args):\n title = \"datasette \" + \" \".join(args)\n cog.out(\"\\n::\\n\\n\")\n result = CliRunner().invoke(cli.cli, args)\n output = result.output.replace(\"Usage: cli \", \"Usage: datasette \")\n cog.out(textwrap.indent(output, ' '))\n cog.out(\"\\n\\n\") \n ]]] \n [[[end]]]", "breadcrumbs": "[]", "references": "[]"}