sections
18 rows where breadcrumbs contains "JSON API" sorted by breadcrumbs
id | page | ref | title | content | breadcrumbs | references |
---|---|---|---|---|---|---|
json_api:column-filter-arguments | json_api | column-filter-arguments | Column filter arguments | You can filter the data returned by the table based on column values using a query string argument. ?column__exact=value or ?column=value Returns rows where the specified column exactly matches the value. ?column__not=value Returns rows where the column does not match the value. ?column__contains=value Rows where the string column contains the specified value ( column like "%value%" in SQL). ?column__notcontains=value Rows where the string column does not contain the specified value ( column not like "%value%" in SQL). ?column__endswith=value Rows where the string column ends with the specified value ( column like "%value" in SQL). ?column__startswith=value Rows where the string column starts with the specified value ( column like "value%" in SQL). ?column__gt=value Rows which are greater than the specified value. ?column__gte=value Rows which… | ["JSON API", "Table arguments"] | [] |
json_api:json-api-table-arguments | json_api | json-api-table-arguments | Special table arguments | ?_col=COLUMN1&_col=COLUMN2 List specific columns to display. These will be shown along with any primary keys. ?_nocol=COLUMN1&_nocol=COLUMN2 List specific columns to hide - any column not listed will be displayed. Primary keys cannot be hidden. ?_labels=on/off Expand foreign key references for every possible column. See below. ?_label=COLUMN1&_label=COLUMN2 Expand foreign key references for one or more specified columns. ?_size=1000 or ?_size=max Sets a custom page size. This cannot exceed the max_returned_rows limit passed to datasette serve . Use max to get max_returned_rows . ?_sort=COLUMN Sorts the results by the specified column. ?_sort_desc=COLUMN Sorts the results by the specified column in descending order. ?_search=keywords For SQLite tables that have been configured for full-text search executes a search … | ["JSON API", "Table arguments"] | [{"href": "https://www.sqlite.org/fts3.html", "label": "full-text search"}, {"href": "https://www.sqlite.org/fts5.html#full_text_query_syntax", "label": "advanced SQLite FTS syntax"}, {"href": "https://latest.datasette.io/fixtures/facetable?_where=_neighborhood%20like%20%22%c%%22&_where=_city_id=3", "label": "facetable?_where=_neighborhood like \"%c%\"&_where=_city_id=3"}, {"href": "https://latest.datasette.io/fixtures/facetable?_where=_city_id%20in%20(select%20id%20from%20facet_cities%20where%20name%20!=%20%22Detroit%22)", "label": "facetable?_where=_city_id in (select id from facet_cities where name != \"Detroit\")"}, {"href": "https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}", "label": "an example"}] |
json_api:tableinsertview | json_api | tableinsertview | Inserting rows | This requires the insert-row permission. A single row can be inserted using the "row" key: POST /<database>/<table>/-/insert Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "row": { "column1": "value1", "column2": "value2" } } If successful, this will return a 201 status code and the newly inserted row, for example: { "rows": [ { "id": 1, "column1": "value1", "column2": "value2" } ] } To insert multiple rows at a time, use the same API method but send a list of dictionaries as the "rows" key: POST /<database>/<table>/-/insert Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "rows": [ { "column1": "value1", "column2": "value2" }, { "column1": "value3", "column2": "value4" } ] } If successful, this will return a 201 status code and a {"ok": true} response body. The maximum number of rows that can be submitted at once defaults to 100, but this can be changed using the max_insert_rows setting. To return the newly inserted rows, add the "return": true key to the request body: { "rows": [ { "column1": "value1", "column2": "value2" }, { "column1": "value3", "column2": "value4" } ], "return": true } This will return the same "rows" key as the single row example above. There is a small performance penalty for using this option. If any of your rows have a primary key that is already in use, you will get an error and none of the rows will be inserted: { "ok": false, "errors": [ "UNIQUE constraint failed: new_table… | ["JSON API", "The JSON write API"] | [] |
json_api:tableupsertview | json_api | tableupsertview | Upserting rows | An upsert is an insert or update operation. If a row with a matching primary key already exists it will be updated - otherwise a new row will be inserted. The upsert API is mostly the same shape as the insert API . It requires both the insert-row and update-row permissions. POST /<database>/<table>/-/upsert Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "rows": [ { "id": 1, "title": "Updated title for 1", "description": "Updated description for 1" }, { "id": 2, "description": "Updated description for 2" }, { "id": 3, "title": "Item 3", "description": "Description for 3" } ] } Imagine a table with a primary key of id and which already has rows with id values of 1 and 2 . The above example will: Update the row with id of 1 to set both title and description to the new values Update the row with id of 2 to set description to the new value - title will be left unchanged Insert a new row with id of 3 and both title and description set to the new values Similar to /-/insert , a row key with an object can be used instead of a rows array to upsert a single row. If successful, this will return a 200 status code and a {"ok": true} response body. Add "return": true to the request body to return full copies of the affected rows after they have been inserted or updated: { "rows": [ { "id": 1, "title": "Updated title for 1", "description": "Updated descri… | ["JSON API", "The JSON write API"] | [] |
json_api:rowupdateview | json_api | rowupdateview | Updating a row | To update a row, make a POST to /<database>/<table>/<row-pks>/-/update . This requires the update-row permission. POST /<database>/<table>/<row-pks>/-/update Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "update": { "text_column": "New text string", "integer_column": 3, "float_column": 3.14 } } <row-pks> here is the tilde-encoded primary key value of the row to update - or a comma-separated list of primary key values if the table has a composite primary key. You only need to pass the columns you want to update. Any other columns will be left unchanged. If successful, this will return a 200 status code and a {"ok": true} response body. Add "return": true to the request body to return the updated row: { "update": { "title": "New title" }, "return": true } The returned JSON will look like this: { "ok": true, "row": { "id": 1, "title": "New title", "other_column": "Will be present here too" } } Any errors will return {"errors": ["... descriptive message ..."], "ok": false} , and a 400 status code for a bad input or a 403 status code for an authentication or permission error. Pass "alter": true to automatically add any missing columns to the table. This requires the alter-table permission. | ["JSON API", "The JSON write API"] | [] |
json_api:rowdeleteview | json_api | rowdeleteview | Deleting a row | To delete a row, make a POST to /<database>/<table>/<row-pks>/-/delete . This requires the delete-row permission. POST /<database>/<table>/<row-pks>/-/delete Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> <row-pks> here is the tilde-encoded primary key value of the row to delete - or a comma-separated list of primary key values if the table has a composite primary key. If successful, this will return a 200 status code and a {"ok": true} response body. Any errors will return {"errors": ["... descriptive message ..."], "ok": false} , and a 400 status code for a bad input or a 403 status code for an authentication or permission error. | ["JSON API", "The JSON write API"] | [] |
json_api:tablecreateview | json_api | tablecreateview | Creating a table | To create a table, make a POST to /<database>/-/create . This requires the create-table permission. POST /<database>/-/create Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "table": "name_of_new_table", "columns": [ { "name": "id", "type": "integer" }, { "name": "title", "type": "text" } ], "pk": "id" } The JSON here describes the table that will be created: table is the name of the table to create. This field is required. columns is a list of columns to create. Each column is a dictionary with name and type keys. name is the name of the column. This is required. type is the type of the column. This is optional - if not provided, text will be assumed. The valid types are text , integer , float and blob . pk is the primary key for the table. This is optional - if not provided, Datasette will create a SQLite table with a hidden rowid column. If the primary key is an integer column, it will be configured to automatically increment for each new record. If you set this to id without including an id column in the list of columns , Datasette will create an auto-incrementing integer ID column for you. pks can be used instead of pk to create a compound primary key. It should be a JSON list of column names to use in that primary key. … | ["JSON API", "The JSON write API"] | [] |
json_api:tablecreateview-example | json_api | tablecreateview-example | Creating a table from example data | Instead of specifying columns directly you can instead pass a single example row or a list of rows . Datasette will create a table with a schema that matches those rows and insert them for you: POST /<database>/-/create Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "table": "creatures", "rows": [ { "id": 1, "name": "Tarantula" }, { "id": 2, "name": "Kākāpō" } ], "pk": "id" } Doing this requires both the create-table and insert-row permissions. The 201 response here will be similar to the columns form, but will also include the number of rows that were inserted as row_count : { "ok": true, "database": "data", "table": "creatures", "table_url": "http://127.0.0.1:8001/data/creatures", "table_api_url": "http://127.0.0.1:8001/data/creatures.json", "schema": "CREATE TABLE [creatures] (\n [id] INTEGER PRIMARY KEY,\n [name] TEXT\n)", "row_count": 2 } You can call the create endpoint multiple times for the same table provided you are specifying the table using the rows or row option. New rows will be inserted into the table each time. This means you can use this API if you are unsure if the relevant table has been created yet. If you pass a row to the create endpoint with a primary key that already exists you will get an error that looks like this: { "ok": false, "errors": [ "UNIQUE constraint failed: creatures.id" ] } You can avoid this error by passing the same "ignore": true or "replace": true options to the create endpoint as you can to the insert endpoint . To use the "replace": true option you will also need the update-row permission. Pass "alter": true to automatically add any missing columns to t… | ["JSON API", "The JSON write API"] | [] |
json_api:tabledropview | json_api | tabledropview | Dropping tables | To drop a table, make a POST to /<database>/<table>/-/drop . This requires the drop-table permission. POST /<database>/<table>/-/drop Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> Without a POST body this will return a status 200 with a note about how many rows will be deleted: { "ok": true, "database": "<database>", "table": "<table>", "row_count": 5, "message": "Pass \"confirm\": true to confirm" } If you pass the following POST body: { "confirm": true } Then the table will be dropped and a status 200 response of {"ok": true} will be returned. Any errors will return {"errors": ["... descriptive message ..."], "ok": false} , and a 400 status code for a bad input or a 403 status code for an authentication or permission error. | ["JSON API", "The JSON write API"] | [] |
json_api:json-api-default | json_api | json-api-default | Default representation | The default JSON representation of data from a SQLite table or custom query looks like this: { "ok": true, "rows": [ { "id": 3, "name": "Detroit" }, { "id": 2, "name": "Los Angeles" }, { "id": 4, "name": "Memnonia" }, { "id": 1, "name": "San Francisco" } ], "truncated": false } "ok" is always true if an error did not occur. The "rows" key is a list of objects, each one representing a row. The "truncated" key lets you know if the query was truncated. This can happen if a SQL query returns more than 1,000 results (or the max_returned_rows setting). For table pages, an additional key "next" may be present. This indicates that the next page in the pagination set can be retrieved using ?_next=VALUE . | ["JSON API"] | [] |
json_api:json-api-shapes | json_api | json-api-shapes | Different shapes | The _shape parameter can be used to access alternative formats for the rows key which may be more convenient for your application. There are three options: ?_shape=objects - "rows" is a list of JSON key/value objects - the default ?_shape=arrays - "rows" is a list of lists, where the order of values in each list matches the order of the columns ?_shape=array - a JSON array of objects - effectively just the "rows" key from the default representation ?_shape=array&_nl=on - a newline-separated list of JSON objects ?_shape=arrayfirst - a flat JSON array containing just the first value from each row ?_shape=object - a JSON object keyed using the primary keys of the rows _shape=arrays looks like this: { "ok": true, "next": null, "rows": [ [3, "Detroit"], [2, "Los Angeles"], [4, "Memnonia"], [1, "San Francisco"] ] } _shape=array looks like this: [ { "id": 3, "name": "Detroit" }, { "id": 2, "name": "Los Angeles" }, { "id": 4, "name": "Memnonia" }, { "id": 1, "name": "San Francisco" } ] _shape=array&_nl=on looks like this: {"id": 1, "value": "Myoporum laetum :: Myoporum"} {"id": 2, "value": "Metrosideros excelsa :: New Zealand Xmas Tree"} {"id": 3, "value": "Pinus radiata :: Monterey Pine"} _shape=arrayfirst looks like this: [1, 2, 3] _shape=object looks like this: { "1": { "id": 1, "value": "Myoporum laetum :: Myoporum" }, "2": { "id": 2, "value": "Metrosideros excelsa :… | ["JSON API"] | [] |
json_api:json-api-pagination | json_api | json-api-pagination | Pagination | The default JSON representation includes a "next_url" key which can be used to access the next page of results. If that key is null or missing then it means you have reached the final page of results. Other representations include pagination information in the link HTTP header. That header will look something like this: link: <https://latest.datasette.io/fixtures/sortable.json?_next=d%2Cv>; rel="next" Here is an example Python function built using requests that returns a list of all of the paginated items from one of these API endpoints: def paginate(url): items = [] while url: response = requests.get(url) try: url = response.links.get("next").get("url") except AttributeError: url = None items.extend(response.json()) return items | ["JSON API"] | [{"href": "https://requests.readthedocs.io/", "label": "requests"}] |
json_api:json-api-special | json_api | json-api-special | Special JSON arguments | Every Datasette endpoint that can return JSON also accepts the following query string arguments: ?_shape=SHAPE The shape of the JSON to return, documented above. ?_nl=on When used with ?_shape=array produces newline-delimited JSON objects. ?_json=COLUMN1&_json=COLUMN2 If any of your SQLite columns contain JSON values, you can use one or more _json= parameters to request that those columns be returned as regular JSON. Without this argument those columns will be returned as JSON objects that have been double-encoded into a JSON string value. Compare this query without the argument to this query using the argument ?_json_infinity=on If your data contains infinity or -infinity values, Datasette will replace them with None when returning them as JSON. If you pass _json_infinity=1 Datasette will instead return them as Infinity or -Infinity which is invalid JSON but can be processed by some custom JSON parsers. ?_timelimit=MS Sets a custom time limit for the query in ms. You can use this for optimistic queries where you would like Datasette to give up if the query takes too long, for example if you want to implement autocomplete search but only… | ["JSON API"] | [{"href": "https://fivethirtyeight.datasettes.com/fivethirtyeight.json?sql=select+%27{%22this+is%22%3A+%22a+json+object%22}%27+as+d&_shape=array", "label": "this query without the argument"}, {"href": "https://fivethirtyeight.datasettes.com/fivethirtyeight.json?sql=select+%27{%22this+is%22%3A+%22a+json+object%22}%27+as+d&_shape=array&_json=d", "label": "this query using the argument"}] |
json_api:id2 | json_api | id2 | Table arguments | The Datasette table view takes a number of special query string arguments. | ["JSON API"] | [] |
json_api:expand-foreign-keys | json_api | expand-foreign-keys | Expanding foreign key references | Datasette can detect foreign key relationships and resolve those references into labels. The HTML interface does this by default for every detected foreign key column - you can turn that off using ?_labels=off . You can request foreign keys be expanded in JSON using the _labels=on or _label=COLUMN special query string parameters. Here's what an expanded row looks like: [ { "rowid": 1, "TreeID": 141565, "qLegalStatus": { "value": 1, "label": "Permitted Site" }, "qSpecies": { "value": 1, "label": "Myoporum laetum :: Myoporum" }, "qAddress": "501X Baker St", "SiteOrder": 1 } ] The column in the foreign key table that is used for the label can be specified in metadata.json - see Specifying the label column for a table . | ["JSON API"] | [] |
json_api:json-api-discover-alternate | json_api | json-api-discover-alternate | Discovering the JSON for a page | Most of the HTML pages served by Datasette provide a mechanism for discovering their JSON equivalents using the HTML link mechanism. You can find this near the top of the source code of those pages, looking like this: <link rel="alternate" type="application/json+datasette" href="https://latest.datasette.io/fixtures/sortable.json"> The JSON URL is also made available in a Link HTTP header for the page: Link: https://latest.datasette.io/fixtures/sortable.json; rel="alternate"; type="application/json+datasette" | ["JSON API"] | [] |
json_api:json-api-cors | json_api | json-api-cors | Enabling CORS | If you start Datasette with the --cors option, each JSON endpoint will be served with the following additional HTTP headers: Access-Control-Allow-Origin: * Access-Control-Allow-Headers: Authorization, Content-Type Access-Control-Expose-Headers: Link Access-Control-Allow-Methods: GET, POST, HEAD, OPTIONS Access-Control-Max-Age: 3600 This allows JavaScript running on any domain to make cross-origin requests to interact with the Datasette API. If you start Datasette without the --cors option only JavaScript running on the same domain as Datasette will be able to access the API. Here's how to serve data.db with CORS enabled: datasette data.db --cors | ["JSON API"] | [] |
json_api:json-api-write | json_api | json-api-write | The JSON write API | Datasette provides a write API for JSON data. This is a POST-only API that requires an authenticated API token, see API Tokens . The token will need to have the specified Permissions . | ["JSON API"] | [] |
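
The column filters and special table arguments described in the "Column filter arguments" and "Special table arguments" rows above are ordinary query string parameters, so they can be combined freely. A minimal Python sketch using requests; the base URL, database, table and column names here are hypothetical:

```python
import requests
from urllib.parse import urlencode

# Hypothetical Datasette instance, database and table.
BASE = "http://127.0.0.1:8001/mydatabase/mytable.json"

# Column filters and special table arguments share one query string.
params = urlencode({
    "name__contains": "park",   # column filter: name like "%park%"
    "size__gte": 100,           # column filter: size >= 100
    "_sort_desc": "size",       # special argument: sort descending by column
    "_size": 20,                # special argument: page size (capped by max_returned_rows)
    "_shape": "array",          # return a plain JSON array of row objects
})

rows = requests.get(f"{BASE}?{params}").json()
print(len(rows), "rows returned")
```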
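
A sketch of the "Inserting rows" call above, sent from Python with requests. The instance URL, database, table, column names and token are placeholders; the request body mirrors the documented multi-row form:

```python
import requests

API = "http://127.0.0.1:8001"   # hypothetical Datasette instance
TOKEN = "dstok_..."             # an API token with the insert-row permission

response = requests.post(
    f"{API}/mydatabase/mytable/-/insert",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "rows": [
            {"column1": "value1", "column2": "value2"},
            {"column1": "value3", "column2": "value4"},
        ],
        "return": True,  # include the inserted rows in the response (small performance cost)
    },
)
print(response.status_code)  # 201 on success
print(response.json())       # includes a "rows" key because "return" was true
```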
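
The "Upserting rows" example above as a Python sketch. The instance URL, database, table and token are placeholders; the rows payload is the one shown in that row, and the token needs both insert-row and update-row permissions:

```python
import requests

API = "http://127.0.0.1:8001"   # hypothetical Datasette instance
TOKEN = "dstok_..."             # token with insert-row and update-row permissions

response = requests.post(
    f"{API}/mydatabase/mytable/-/upsert",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "rows": [
            {"id": 1, "title": "Updated title for 1", "description": "Updated description for 1"},
            {"id": 2, "description": "Updated description for 2"},
            {"id": 3, "title": "Item 3", "description": "Description for 3"},
        ]
    },
)
print(response.status_code)  # 200 on success
print(response.json())       # {"ok": true}
```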
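
The "Updating a row" call as a Python sketch. The primary key value 1 and the other names are placeholders; "return": true asks for the updated row back, as documented above:

```python
import requests

API = "http://127.0.0.1:8001"   # hypothetical Datasette instance
TOKEN = "dstok_..."             # token with the update-row permission

# 1 is the tilde-encoded primary key of the row to update.
response = requests.post(
    f"{API}/mydatabase/mytable/1/-/update",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"update": {"title": "New title"}, "return": True},
)
print(response.json())  # {"ok": true, "row": {...the updated row...}}
```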
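
The "Deleting a row" call as a Python sketch, again with placeholder names and token:

```python
import requests

API = "http://127.0.0.1:8001"   # hypothetical Datasette instance
TOKEN = "dstok_..."             # token with the delete-row permission

response = requests.post(
    f"{API}/mydatabase/mytable/1/-/delete",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(response.status_code)  # 200 on success
print(response.json())       # {"ok": true}
```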
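
Both forms of table creation from the "Creating a table" and "Creating a table from example data" rows, sketched in Python. The payloads are the documented examples; the instance URL, database name and token are placeholders:

```python
import requests

API = "http://127.0.0.1:8001"   # hypothetical Datasette instance
TOKEN = "dstok_..."             # token with create-table (plus insert-row for the rows form)
headers = {"Authorization": f"Bearer {TOKEN}"}

# Explicit columns form.
requests.post(f"{API}/mydatabase/-/create", headers=headers, json={
    "table": "name_of_new_table",
    "columns": [
        {"name": "id", "type": "integer"},
        {"name": "title", "type": "text"},
    ],
    "pk": "id",
})

# Example-data form: Datasette infers the schema from the rows and inserts them.
requests.post(f"{API}/mydatabase/-/create", headers=headers, json={
    "table": "creatures",
    "rows": [
        {"id": 1, "name": "Tarantula"},
        {"id": 2, "name": "Kākāpō"},
    ],
    "pk": "id",
})
```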
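
The two-step confirmation flow from the "Dropping tables" row, sketched in Python with placeholder names:

```python
import requests

API = "http://127.0.0.1:8001"   # hypothetical Datasette instance
TOKEN = "dstok_..."             # token with the drop-table permission
url = f"{API}/mydatabase/mytable/-/drop"
headers = {"Authorization": f"Bearer {TOKEN}"}

# First call, no body: returns the row count and asks for confirmation.
preview = requests.post(url, headers=headers).json()
print(preview["row_count"], preview["message"])

# Second call with "confirm": true actually drops the table.
confirmed = requests.post(url, headers=headers, json={"confirm": True}).json()
print(confirmed)  # {"ok": true}
```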
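
To see how the _shape options from the "Default representation" and "Different shapes" rows change the response structure, a small sketch against a hypothetical table URL:

```python
import requests

TABLE_JSON = "http://127.0.0.1:8001/mydatabase/mytable.json"  # hypothetical

# Default shape: an object with "ok", "rows" and "truncated" keys.
default = requests.get(TABLE_JSON).json()
print(default["ok"], default["truncated"], len(default["rows"]))

# _shape=array: just a JSON array of row objects.
as_array = requests.get(TABLE_JSON, params={"_shape": "array"}).json()

# _shape=arrays: "rows" is a list of lists in column order.
as_lists = requests.get(TABLE_JSON, params={"_shape": "arrays"}).json()["rows"]

# _shape=object: rows keyed by their primary key.
by_pk = requests.get(TABLE_JSON, params={"_shape": "object"}).json()
```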
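
The paginate() function in the "Pagination" row is flattened onto a single line in the table above; here it is laid out again, unchanged in behaviour, with its import and an example call. The demo URL is the one used in that row's link header example, with ?_shape=array so each page is a plain list:

```python
import requests

def paginate(url):
    # Follow rel="next" link headers until the final page is reached.
    items = []
    while url:
        response = requests.get(url)
        try:
            url = response.links.get("next").get("url")
        except AttributeError:
            # No "next" link header: that was the last page.
            url = None
        items.extend(response.json())
    return items

all_rows = paginate("https://latest.datasette.io/fixtures/sortable.json?_shape=array")
print(len(all_rows))
```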
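
A sketch of reading expanded foreign key labels, per the "Expanding foreign key references" row above. The table URL is hypothetical; expanded cells arrive as {"value": ..., "label": ...} objects:

```python
import requests

# _labels=on expands every detected foreign key column.
url = "http://127.0.0.1:8001/mydatabase/mytable.json"  # hypothetical
rows = requests.get(url, params={"_labels": "on", "_shape": "array"}).json()

for row in rows:
    for column, cell in row.items():
        # Expanded foreign keys are dictionaries with "value" and "label" keys.
        if isinstance(cell, dict) and "label" in cell:
            print(column, cell["value"], cell["label"])
```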
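
The "Discovering the JSON for a page" row notes that the JSON URL is also exposed in a Link HTTP header; assuming requests parses that header as usual, the alternate URL can be read like this:

```python
import requests

response = requests.get("https://latest.datasette.io/fixtures/sortable")
# requests exposes parsed Link headers on response.links, keyed by rel value.
alternate = response.links.get("alternate", {}).get("url")
print(alternate)  # e.g. https://latest.datasette.io/fixtures/sortable.json
```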