sections

68 rows where breadcrumbs contains "Internals for plugins", page = "internals" and references = "[]"


id ▼ page ref title content breadcrumbs references
internals:bypassing-permission-checks internals bypassing-permission-checks Bypassing permission checks All datasette.client methods accept an optional skip_permission_checks=True parameter. When set, all permission checks will be bypassed for that request, allowing access to any resource regardless of the configured permissions. This is useful for plugins and internal operations that need to access all resources without being subject to permission restrictions. Example usage: # Regular request - respects permissions response = await datasette.client.get( "/private-db/secret-table.json" ) # May return 403 Forbidden if access is denied # With skip_permission_checks - bypasses all permission checks response = await datasette.client.get( "/private-db/secret-table.json", skip_permission_checks=True, ) # Will return 200 OK and the data, regardless of permissions This parameter works with all HTTP methods ( get , post , put , patch , delete , options , head ) and the generic request method. Use skip_permission_checks=True with caution. It completely bypasses Datasette's permission system and should only be used in trusted plugin code or internal operations where you need guaranteed access to resources. ["Internals for plugins", "Datasette class", "datasette.client"] []
internals:database-close internals database-close db.close() Closes all of the open connections to file-backed databases. This is mainly intended to be used by large test suites, to avoid hitting limits on the number of open files. ["Internals for plugins", "Database class"] []
internals:database-constructor internals database-constructor Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None, is_temp_disk=False) The Database() constructor can be used by plugins, in conjunction with .add_database(db, name=None, route=None) , to create and register new databases. The arguments are as follows: ds - Datasette class (required) The Datasette instance you are attaching this database to. path - string Path to a SQLite database file on disk. is_mutable - boolean Set this to False to cause Datasette to open the file in immutable mode. is_memory - boolean Use this to create non-shared memory connections. memory_name - string or None Use this to create a named in-memory database. Unlike regular memory databases these can be accessed by multiple threads and will persist any changes made to them for the lifetime of the Datasette server process. is_temp_disk - boolean Set this to True to create a temporary file-backed database. This creates a SQLite database in a temporary file on disk (using Python's tempfile.mkstemp() ) with WAL mode enabled for better concurrent read/write performance. The temporary file is automatically cleaned up when the database is closed or when the process exits. Unlike named in-mem… ["Internals for plugins", "Database class"] []
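The is_temp_disk behavior described above can be sketched with the standard library alone: a SQLite file created via tempfile.mkstemp() with WAL journal mode enabled. This is an illustration of the pattern, not Datasette's actual implementation.

```python
import os
import sqlite3
import tempfile

# Create a temporary file for the database, as tempfile.mkstemp() does
fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)

conn = sqlite3.connect(path)
# Enable WAL mode for better concurrent read/write performance
conn.execute("PRAGMA journal_mode=WAL")
mode = conn.execute("PRAGMA journal_mode").fetchone()[0]

conn.close()
# Clean up the temporary file, mirroring the automatic cleanup
os.remove(path)
```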
internals:database-execute internals database-execute await db.execute(sql, ...) Executes a SQL query against the database and returns the resulting rows (see Results ). sql - string (required) The SQL query to execute. This can include ? or :named parameters. params - list or dict A list or dictionary of values to use for the parameters. List for ? , dictionary for :named . truncate - boolean Should the rows returned by the query be truncated at the maximum page size? Defaults to True , set this to False to disable truncation. custom_time_limit - integer ms A custom time limit for this query. This can be set to a lower value than the Datasette configured default. If a query takes longer than this it will be terminated early and raise a datasette.database.QueryInterrupted exception. page_size - integer Set a custom page size for truncation, overriding the configured Datasette default. log_sql_errors - boolean Should any SQL errors be logged to the console in addition to being raised as an error? Defaults to True . ["Internals for plugins", "Database class"] []
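The two parameter styles accepted by db.execute() follow standard sqlite3 placeholder semantics, which can be demonstrated with a plain synchronous connection (db.execute() itself is async and adds truncation and time limits on top):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table t (id integer, name text)")
conn.execute("insert into t values (1, 'one'), (2, 'two')")

# Positional parameters: ? placeholders with a list (or tuple)
positional = conn.execute(
    "select name from t where id = ?", [2]
).fetchall()

# Named parameters: :named placeholders with a dictionary
named = conn.execute(
    "select name from t where id = :id", {"id": 1}
).fetchall()
```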
internals:database-execute-fn internals database-execute-fn await db.execute_fn(fn) Executes a given callback function against a read-only database connection running in a thread. The function will be passed a SQLite connection, and the return value from the function will be returned by the await . Example usage: def get_version(conn): return conn.execute( "select sqlite_version()" ).fetchall()[0][0] version = await db.execute_fn(get_version) ["Internals for plugins", "Database class"] []
internals:database-execute-write internals database-execute-write await db.execute_write(sql, params=None, block=True) SQLite only allows one database connection to write at a time. Datasette handles this for you by maintaining a queue of writes to be executed against a given database. Plugins can submit write operations to this queue and they will be executed in the order in which they are received. This method can be used to queue up a non-SELECT SQL query to be executed against a single write connection to the database. You can pass additional SQL parameters as a tuple or dictionary. The method will block until the operation is completed, and the return value will be the return from calling conn.execute(...) using the underlying sqlite3 Python library. If you pass block=False this behavior changes to "fire and forget" - queries will be added to the write queue and executed in a separate thread while your code can continue to do other things. The method will return a UUID representing the queued task. Each call to execute_write() will be executed inside a transaction. ["Internals for plugins", "Database class"] []
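A minimal sketch of the single-writer queue pattern described above: one dedicated thread owns the write connection and applies queued operations in arrival order. Datasette's real implementation is async, wraps each call in a transaction, and returns UUID task identifiers; the names below are invented for illustration.

```python
import queue
import sqlite3
import threading

conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("create table t (n integer)")

write_queue = queue.Queue()

def writer():
    # The only thread allowed to write; drains the queue in order
    while True:
        item = write_queue.get()
        if item is None:  # shutdown sentinel
            break
        sql, params, done = item
        conn.execute(sql, params)
        conn.commit()
        done.set()

thread = threading.Thread(target=writer)
thread.start()

# Queue two writes; waiting on the event mimics block=True
for n in (1, 2):
    done = threading.Event()
    write_queue.put(("insert into t values (?)", (n,), done))
    done.wait()

write_queue.put(None)
thread.join()
count = conn.execute("select count(*) from t").fetchone()[0]
```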
internals:database-execute-write-fn internals database-execute-write-fn await db.execute_write_fn(fn, block=True, transaction=True) This method works like .execute_write() , but instead of a SQL statement you give it a callable Python function. Your function will be queued up and then called when the write connection is available, passing that connection as the argument to the function. The function can then perform multiple actions, safe in the knowledge that it has exclusive access to the single writable connection for as long as it is executing. fn needs to be a regular function, not an async def function. For example: def delete_and_return_count(conn): conn.execute("delete from some_table where id > 5") return conn.execute( "select count(*) from some_table" ).fetchone()[0] try: num_rows_left = await database.execute_write_fn( delete_and_return_count ) except Exception as e: print("An error occurred:", e) Your function can optionally accept a track_event parameter in addition to conn . If it does, it will be passed a callable that can be used to queue events for dispatch after the write transaction commits successfully. Events queued this way are discarded if the write raises an exception. from datasette.events import AlterTableEvent def my_write(conn, track_event): before_schema = conn.execute( "select sql from sqlite_master where name = 'my_table'" ).fetchone()[0] conn.execute( "alter table my_table add column new_col text" ) after_schema = conn.execute( "select sql from sqlite_master where name = 'my_table'" ).fetchone()[0] track_event( AlterTableEvent( actor=None, database="mydb", table="my_table", before_schema=before_schema, after_schema=after_schema, ) ) await database.execute_write_fn(my_write) The value returned from await database.execute_write_fn(...) will be the return value from… ["Internals for plugins", "Database class"] []
internals:database-hash internals database-hash db.hash If the database was opened in immutable mode, this property returns the 64 character SHA-256 hash of the database contents as a string. Otherwise it returns None . ["Internals for plugins", "Database class"] []
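The 64 character hash corresponds to a SHA-256 hex digest of the database file's raw bytes. A sketch of computing such a digest, as an illustration rather than necessarily the exact code Datasette runs internally:

```python
import hashlib
import os
import sqlite3
import tempfile

# Create a small file-backed database to hash
fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)
conn = sqlite3.connect(path)
conn.execute("create table t (id integer)")
conn.commit()
conn.close()

# SHA-256 of the file contents: a 64-character hex string
with open(path, "rb") as f:
    db_hash = hashlib.sha256(f.read()).hexdigest()

os.remove(path)
```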
internals:datasette-absolute-url internals datasette-absolute-url .absolute_url(request, path) request - Request The current Request object path - string A path, for example /dbname/table.json Returns the absolute URL for the given path, including the protocol and host. For example: absolute_url = datasette.absolute_url( request, "/dbname/table.json" ) # Would return "http://localhost:8001/dbname/table.json" The current request object is used to determine the hostname and protocol that should be used for the returned URL. The force_https_urls configuration setting is taken into account. ["Internals for plugins", "Datasette class"] []
internals:datasette-actions internals datasette-actions .actions Property exposing a dictionary of actions that have been registered using the register_actions(datasette) plugin hook. The dictionary keys are the action names - e.g. view-instance - and the values are Action() objects describing the permission. ["Internals for plugins", "Datasette class"] []
internals:datasette-actors-from-ids internals datasette-actors-from-ids await .actors_from_ids(actor_ids) actor_ids - list of strings or integers A list of actor IDs to look up. Returns a dictionary, where the keys are the IDs passed to it and the values are the corresponding actor dictionaries. This method is mainly designed to be used with plugins. See the actors_from_ids(datasette, actor_ids) documentation for details. If no plugins that implement that hook are installed, the default return value looks like this: { "1": {"id": "1"}, "2": {"id": "2"} } ["Internals for plugins", "Datasette class"] []
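The documented default return value (when no plugin implements the hook) is equivalent to building a dictionary keyed by the IDs passed in; a hypothetical standalone version of that fallback:

```python
def default_actors_from_ids(actor_ids):
    # Each ID maps to a minimal actor dictionary containing only that ID
    return {actor_id: {"id": actor_id} for actor_id in actor_ids}

result = default_actors_from_ids(["1", "2"])
```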
internals:datasette-add-database internals datasette-add-database .add_database(db, name=None, route=None) db - datasette.database.Database instance The database to be attached. name - string, optional The name to be used for this database . If not specified Datasette will pick one based on the filename or memory name. route - string, optional This will be used in the URL path. If not specified, it will default to the same thing as the name . The datasette.add_database(db) method lets you add a new database to the current Datasette instance. The db parameter should be an instance of the datasette.database.Database class. For example: from datasette.database import Database datasette.add_database( Database( datasette, path="path/to/my-new-database.db", ) ) This will add a mutable database and serve it at /my-new-database . Use is_mutable=False to add an immutable database. .add_database() returns the Database instance, with its name set as the database.name attribute. Any time you are working with a newly added database you should use the return value of .add_database() , for example: db = datasette.add_database( Database(datasette, memory_name="statistics") ) await db.execute_write( "CREATE TABLE foo(id integer primary key)" ) ["Internals for plugins", "Datasette class"] []
internals:datasette-add-memory-database internals datasette-add-memory-database .add_memory_database(memory_name, name=None, route=None) Adds a shared in-memory database with the specified name: datasette.add_memory_database("statistics") This is a shortcut for the following: from datasette.database import Database datasette.add_database( Database(datasette, memory_name="statistics") ) Using either of these patterns will result in the in-memory database being served at /statistics . The name and route parameters are optional and work the same way as they do for .add_database(db, name=None, route=None) . ["Internals for plugins", "Datasette class"] []
internals:datasette-add-message internals datasette-add-message .add_message(request, message, type=datasette.INFO) request - Request The current Request object message - string The message string type - constant, optional The message type - datasette.INFO , datasette.WARNING or datasette.ERROR Datasette's flash messaging mechanism allows you to add a message that will be displayed to the user on the next page that they visit. Messages are persisted in a ds_messages cookie. This method adds a message to that cookie. You can try out these messages (including the different visual styling of the three message types) using the /-/messages debugging tool. ["Internals for plugins", "Datasette class"] []
internals:datasette-allowed internals datasette-allowed await .allowed(*, action, resource, actor=None) action - string The name of the action that is being permission checked. resource - Resource object A Resource object representing the database, table, or other resource. Must be an instance of a Resource class such as TableResource , DatabaseResource , QueryResource , or InstanceResource . actor - dictionary, optional The authenticated actor. This is usually request.actor . Defaults to None for unauthenticated requests. This method checks if the given actor has permission to perform the given action on the given resource. All parameters must be passed as keyword arguments. Example usage: from datasette.resources import ( TableResource, DatabaseResource, ) # Check if actor can view a specific table can_view = await datasette.allowed( action="view-table", resource=TableResource( database="fixtures", table="facetable" ), actor=request.actor, ) # Check if actor can execute SQL on a database can_execute = await datasette.allowed( action="execute-sql", resource=DatabaseResource(database="fixtures"), actor=request.actor, ) The method returns True if the permission is granted, False if denied. ["Internals for plugins", "Datasette class"] []
internals:datasette-allowed-resources internals datasette-allowed-resources await .allowed_resources(action, actor=None, *, parent=None, include_is_private=False, include_reasons=False, limit=100, next=None) Returns a PaginatedResources object containing resources that the actor can access for the specified action, with support for keyset pagination. action - string The action name (e.g., "view-table", "view-database") actor - dictionary, optional The authenticated actor. Defaults to None for unauthenticated requests. parent - string, optional Optional parent filter (e.g., database name) to limit results include_is_private - boolean, optional If True, adds a .private attribute to each Resource indicating whether anonymous users can access it include_reasons - boolean, optional If True, adds a .reasons attribute with a list of strings describing why access was granted (useful for debugging) limit - integer, optional Maximum number of results to return per page (1-1000, default 100) next - string, optional Keyset token from a previous page for pagination The method returns a PaginatedResources object (from… ["Internals for plugins", "Datasette class"] []
internals:datasette-allowed-resources-sql internals datasette-allowed-resources-sql await .allowed_resources_sql(*, action, actor=None, parent=None, include_is_private=False) Builds the SQL query that Datasette uses to determine which resources an actor may access for a specific action. Returns a (sql: str, params: dict) namedtuple that can be executed against the internal catalog_* database tables. parent can be used to limit results to a specific database, and include_is_private adds a column indicating whether anonymous users would be denied access to that resource. Plugins that need to execute custom analysis over the raw allow/deny rules can use this helper to run the same query that powers the /-/allowed debugging interface. The SQL query built by this method will return the following columns: parent : The parent resource identifier (or NULL) child : The child resource identifier (or NULL) reason : The reason from the rule that granted access is_private : (if include_is_private ) 1 if anonymous users cannot access, 0 otherwise ["Internals for plugins", "Datasette class"] []
internals:datasette-check-visibility internals datasette-check-visibility await .check_visibility(actor, action, resource=None) actor - dictionary The authenticated actor. This is usually request.actor . action - string The name of the action that is being permission checked. resource - Resource object, optional The resource being checked, as a Resource object such as DatabaseResource(database=...) , TableResource(database=..., table=...) , or QueryResource(database=..., query=...) . Only some permissions apply to a resource. This convenience method can be used to answer the question "should this item be considered private, in that it is visible to me but it is not visible to anonymous users?" It returns a tuple of two booleans, (visible, private) . visible indicates if the actor can see this resource. private will be True if an anonymous user would not be able to view the resource. This example checks if the user can access a specific table, and sets private so that a padlock icon can later be displayed: from datasette.resources import TableResource visible, private = await datasette.check_visibility( request.actor, action="view-table", resource=TableResource(database=database, table=table), ) ["Internals for plugins", "Datasette class"] []
internals:datasette-column-types internals datasette-column-types Column types Column types are stored in the column_types table in the internal database . The following methods provide the API for reading and modifying column type assignments. ["Internals for plugins", "Datasette class"] []
internals:datasette-create-token internals datasette-create-token await .create_token(actor_id, expires_after=None, restrictions=None, handler=None) actor_id - string The ID of the actor to create a token for. expires_after - int, optional The number of seconds after which the token should expire. restrictions - TokenRestrictions , optional A TokenRestrictions object limiting which actions the token can perform. handler - string, optional The name of a specific token handler to use. If omitted, the first registered handler is used. See register_token_handler(datasette) . This is an async method that returns an API token string which can be used to authenticate requests to the Datasette API. The default SignedTokenHandler returns tokens of the format dstok_... . All tokens must have an actor_id string indicating the ID of the actor which the token will act on behalf of. Tokens default to lasting forever, but can be set to expire after a given number of seconds using the expires_after argument. The following code creates a token for user1 that will expire after an hour: token = await datasette.create_token( actor_id="user1", expires_after=3600, ) ["Internals for plugins", "Datasette class"] []
internals:datasette-databases internals datasette-databases .databases Property exposing a collections.OrderedDict of databases currently connected to Datasette. The dictionary keys are the name of the database that is used in the URL - e.g. /fixtures would have a key of "fixtures" . The values are Database class instances. All databases are listed, irrespective of user permissions. ["Internals for plugins", "Datasette class"] []
internals:datasette-ensure-permission internals datasette-ensure-permission await .ensure_permission(action, resource=None, actor=None) action - string The action to check. See Built-in actions for a list of available actions. resource - Resource object (optional) The resource to check the permission against. Must be an instance of InstanceResource , DatabaseResource , or TableResource from the datasette.resources module. If omitted, defaults to InstanceResource() for instance-level permissions. actor - dictionary (optional) The authenticated actor. This is usually request.actor . This is a convenience wrapper around await .allowed(*, action, resource, actor=None) that raises a datasette.Forbidden exception if the permission check fails. Use this when you want to enforce a permission check and halt execution if the actor is not authorized. Example: from datasette.resources import TableResource # Will raise Forbidden if actor cannot view the table await datasette.ensure_permission( action="view-table", resource=TableResource( database="fixtures", table="cities" ), actor=request.actor, ) # For instance-level actions, resource can be omitted: await datasette.ensure_permission( action="permissions-debug", actor=request.actor ) ["Internals for plugins", "Datasette class"] []
internals:datasette-get-column-metadata internals datasette-get-column-metadata await .get_column_metadata(self, database_name, resource_name, column_name) database_name - string The name of the database to query. resource_name - string The name of the resource (table, view, or canned query) inside database_name to query. column_name - string The name of the column inside resource_name to query. Returns metadata keys and values for the specified column, resource, and table as a dictionary. Internally queries the metadata_columns table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-get-column-type internals datasette-get-column-type await .get_column_type(database, resource, column) database - string The name of the database. resource - string The name of the table or view. column - string The name of the column. Returns a ColumnType subclass instance with .config populated for the specified column, or None if no column type is assigned. ct = await datasette.get_column_type( "mydb", "mytable", "email_col" ) if ct: print(ct.name) # "email" print(ct.config) # None or {...} ["Internals for plugins", "Datasette class", "Column types"] []
internals:datasette-get-column-types internals datasette-get-column-types await .get_column_types(database, resource) database - string The name of the database. resource - string The name of the table or view. Returns a dictionary mapping column names to ColumnType subclass instances (with .config populated) for all columns that have assigned types on the given resource. ct_map = await datasette.get_column_types("mydb", "mytable") for col_name, ct in ct_map.items(): print(col_name, ct.name, ct.config) ["Internals for plugins", "Datasette class", "Column types"] []
internals:datasette-get-database internals datasette-get-database .get_database(name) name - string, optional The name of the database - optional. Returns the specified database object. Raises a KeyError if the database does not exist. Call this method without an argument to return the first connected database. ["Internals for plugins", "Datasette class"] []
internals:datasette-get-database-metadata internals datasette-get-database-metadata await .get_database_metadata(self, database_name) database_name - string The name of the database to query. Returns metadata keys and values for the specified database as a dictionary. Internally queries the metadata_databases table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-get-instance-metadata internals datasette-get-instance-metadata await .get_instance_metadata(self) Returns metadata keys and values for the entire Datasette instance as a dictionary. Internally queries the metadata_instance table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-get-resource-metadata internals datasette-get-resource-metadata await .get_resource_metadata(self, database_name, resource_name) database_name - string The name of the database to query. resource_name - string The name of the resource (table, view, or canned query) inside database_name to query. Returns metadata keys and values for the specified "resource" as a dictionary. A "resource" in this context can be a table, view, or canned query. Internally queries the metadata_resources table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-get-set-metadata internals datasette-get-set-metadata Getting and setting metadata Metadata about the instance, databases, tables and columns is stored in tables in Datasette's internal database . The following methods are the supported API for plugins to read and update that stored metadata. ["Internals for plugins", "Datasette class"] []
internals:datasette-plugin-config internals datasette-plugin-config .plugin_config(plugin_name, database=None, table=None) plugin_name - string The name of the plugin to look up configuration for. Usually this is something similar to datasette-cluster-map . database - None or string The database the user is interacting with. table - None or string The table the user is interacting with. This method lets you read plugin configuration values that were set in datasette.yaml . See Writing plugins that accept configuration for full details of how this method should be used. The return value will be the value from the configuration file - usually a dictionary. If the plugin is not configured the return value will be None . ["Internals for plugins", "Datasette class"] []
internals:datasette-remove-column-type internals datasette-remove-column-type await .remove_column_type(database, resource, column) database - string The name of the database. resource - string The name of the table or view. column - string The name of the column. Removes the column type assignment for the specified column. await datasette.remove_column_type( "mydb", "mytable", "location" ) ["Internals for plugins", "Datasette class", "Column types"] []
internals:datasette-remove-database internals datasette-remove-database .remove_database(name) name - string The name of the database to be removed. This removes a database that has been previously added. name= is the unique name of that database. ["Internals for plugins", "Datasette class"] []
internals:datasette-resolve-database internals datasette-resolve-database .resolve_database(request) request - Request object A request object If you are implementing your own custom views, you may need to resolve the database that the user is requesting based on a URL path. If the regular expression for your route declares a database named group, you can use this method to resolve the database object. This returns a Database instance. If the database cannot be found, it raises a datasette.utils.asgi.DatabaseNotFound exception - which is a subclass of datasette.utils.asgi.NotFound with a .database_name attribute set to the name of the database that was requested. ["Internals for plugins", "Datasette class"] []
internals:datasette-resolve-row internals datasette-resolve-row .resolve_row(request) request - Request object A request object This method assumes your route declares named groups for database , table and pks . It returns a ResolvedRow named tuple instance with the following fields: db - Database The database object table - string The name of the table sql - string SQL snippet that can be used in a WHERE clause to select the row params - dict Parameters that should be passed to the SQL query pks - list List of primary key column names pk_values - list List of primary key values decoded from the URL row - sqlite3.Row The row itself If the database or table cannot be found it raises a datasette.utils.asgi.DatabaseNotFound exception. If the table does not exist it raises a datasette.utils.asgi.TableNotFound … ["Internals for plugins", "Datasette class"] []
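To make the sql and params fields concrete, here is a hypothetical sketch of how a WHERE-clause snippet and its matching parameters could be assembled for one or more primary key columns (function and parameter names invented; not Datasette's actual code):

```python
def pk_where(pks, pk_values):
    # One "col" = :pN clause per primary key column, joined with AND
    sql = " and ".join(f'"{pk}" = :p{i}' for i, pk in enumerate(pks))
    params = {f"p{i}": value for i, value in enumerate(pk_values)}
    return sql, params

# A compound primary key produces two clauses and two parameters
sql, params = pk_where(["country", "city"], ["NZ", "Wellington"])
```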
internals:datasette-resolve-table internals datasette-resolve-table .resolve_table(request) request - Request object A request object This assumes that the regular expression for your route declares both a database and a table named group. It returns a ResolvedTable named tuple instance with the following fields: db - Database The database object table - string The name of the table (or view) is_view - boolean True if this is a view, False if it is a table If the database or table cannot be found it raises a datasette.utils.asgi.DatabaseNotFound exception. If the table does not exist it raises a datasette.utils.asgi.TableNotFound exception - a subclass of datasette.utils.asgi.NotFound with .database_name and .table attributes. ["Internals for plugins", "Datasette class"] []
internals:datasette-set-column-metadata internals datasette-set-column-metadata await .set_column_metadata(self, database_name, resource_name, column_name, key, value) database_name - string The database the metadata entry belongs to. resource_name - string The resource (table, view, or canned query) the metadata entry belongs to. column_name - string The column the metadata entry belongs to. key - string The metadata entry key to insert (ex title , description , etc.) value - string The value of the metadata entry to insert. Adds a new metadata entry for the specified column. Any previous column-level metadata entry with the same key will be overwritten. Internally upserts the value into the metadata_columns table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-set-column-type internals datasette-set-column-type await .set_column_type(database, resource, column, column_type, config=None) database - string The name of the database. resource - string The name of the table or view. column - string The name of the column. column_type - string The column type name to assign, e.g. "email" . config - dict, optional Optional configuration dict for the column type. Assigns a column type to a column. Overwrites any existing assignment for that column. Raises ValueError if the column type declares sqlite_types and the target column does not match one of those SQLite types. await datasette.set_column_type( "mydb", "mytable", "location", "point", config={"srid": 4326}, ) ["Internals for plugins", "Datasette class", "Column types"] []
internals:datasette-set-database-metadata internals datasette-set-database-metadata await .set_database_metadata(self, database_name, key, value) database_name - string The database the metadata entry belongs to. key - string The metadata entry key to insert (ex title , description , etc.) value - string The value of the metadata entry to insert. Adds a new metadata entry for the specified database. Any previous database-level metadata entry with the same key will be overwritten. Internally upserts the value into the metadata_databases table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-set-instance-metadata internals datasette-set-instance-metadata await .set_instance_metadata(self, key, value) key - string The metadata entry key to insert (ex title , description , etc.) value - string The value of the metadata entry to insert. Adds a new metadata entry for the entire Datasette instance. Any previous instance-level metadata entry with the same key will be overwritten. Internally upserts the value into the metadata_instance table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-set-resource-metadata internals datasette-set-resource-metadata await .set_resource_metadata(self, database_name, resource_name, key, value) database_name - string The database the metadata entry belongs to. resource_name - string The resource (table, view, or canned query) the metadata entry belongs to. key - string The metadata entry key to insert (ex title , description , etc.) value - string The value of the metadata entry to insert. Adds a new metadata entry for the specified "resource". Any previous resource-level metadata entry with the same key will be overwritten. Internally upserts the value into the metadata_resources table inside the internal database . ["Internals for plugins", "Datasette class", "Getting and setting metadata"] []
internals:datasette-setting internals datasette-setting .setting(key) key - string The name of the setting, e.g. base_url . Returns the configured value for the specified setting . This can be a string, boolean or integer depending on the requested setting. For example: downloads_are_allowed = datasette.setting("allow_download") ["Internals for plugins", "Datasette class"] []
internals:datasette-track-event internals datasette-track-event await .track_event(event) event - Event An instance of a subclass of datasette.events.Event . Plugins can call this to track events, using classes they have previously registered. See Event tracking for details. The event will then be passed to all plugins that have registered to receive events using the track_event(datasette, event) hook. Example usage, assuming the plugin has previously registered the BanUserEvent class: await datasette.track_event( BanUserEvent(user={"id": 1, "username": "cleverbot"}) ) ["Internals for plugins", "Datasette class"] []
internals:datasette-unsign internals datasette-unsign .unsign(value, namespace="default") value - string The signed string that was created using .sign(value, namespace="default") . namespace - string, optional The alternative namespace, if one was used. Returns the original, decoded object that was passed to .sign(value, namespace="default") . If the signature is not valid this raises an itsdangerous.BadSignature exception. ["Internals for plugins", "Datasette class"] []
internals:datasette-verify-token internals datasette-verify-token await .verify_token(token) token - string The token string to verify. This is an async method that verifies an API token by trying each registered token handler in order. Returns an actor dictionary from the first handler that recognizes the token, or None if no handler accepts it. actor = await datasette.verify_token(token) if actor: # Token was valid print(actor["id"]) ["Internals for plugins", "Datasette class"] []
internals:id1 internals id1 TokenRestrictions The TokenRestrictions class uses a builder pattern to specify which actions a token is allowed to perform. Import it from datasette.tokens : from datasette.tokens import TokenRestrictions restrictions = ( TokenRestrictions() .allow_all("view-instance") .allow_all("view-table") .allow_database("docs", "view-query") .allow_resource("docs", "attachments", "insert-row") .allow_resource("docs", "attachments", "update-row") ) The builder methods are: allow_all(action) - allow an action across all databases and resources allow_database(database, action) - allow an action on a specific database allow_resource(database, resource, action) - allow an action on a specific resource (table, SQL view or canned query ) within a database Each method returns the TokenRestrictions instance so calls can be chained. The following example creates a token that can access view-instance and view-table across everything, can additionally use view-query for anything in the docs database and is allowed to execute insert-row and update-row in the attachments table in that database: token = await datasette.create_token( actor_id="user1", restrictions=( TokenRestrictions() .allow_all("view-instance") .allow_all("view-table") .allow_database("docs", "view-query") .allow_resource("docs", "attachments", "insert-row") .allow_resource("docs", "attachments", "update-row") ), ) ["Internals for plugins", "Datasette class", "await .create_token(actor_id, expires_after=None, restrictions=None, handler=None)"] []
internals:id2 internals id2 .get_internal_database() Returns a database object for reading and writing to the private internal database . ["Internals for plugins", "Datasette class"] []
internals:internals-database internals internals-database Database class Instances of the Database class can be used to execute queries against attached SQLite databases, and to run introspection against their schemas. ["Internals for plugins"] []
internals:internals-database-introspection internals internals-database-introspection Database introspection The Database class also provides properties and methods for introspecting the database. db.name - string The name of the database - usually the filename without the .db extension. db.size - integer The size of the database file in bytes. 0 for :memory: databases. db.mtime_ns - integer or None The last modification time of the database file in nanoseconds since the epoch. None for :memory: databases. db.is_mutable - boolean Is this database mutable, and allowed to accept writes? db.is_memory - boolean Is this database an in-memory database? db.is_temp_disk - boolean Is this database a temporary file-backed database? See Database(ds, path=None, is_mutable=True, is_memory=False, memory_name=None, is_temp_disk=False) for details. Temporary disk databases report hash as None but have real values for size and mtime_ns since they are backed by a file on disk. await db.attached_databases() - list of named tuples Returns a list of additional databases that have been connected to this … ["Internals for plugins", "Database class"] []
internals:internals-datasette internals internals-datasette Datasette class This object is an instance of the Datasette class, passed to many plugin hooks as an argument called datasette . You can create your own instance of this - for example to help write tests for a plugin - like so: from datasette.app import Datasette # With no arguments a single in-memory database will be attached datasette = Datasette() # The files= argument can load files from disk datasette = Datasette(files=["/path/to/my-database.db"]) # Pass metadata as a JSON dictionary like this datasette = Datasette( files=["/path/to/my-database.db"], metadata={ "databases": { "my-database": { "description": "This is my database" } } }, ) Constructor parameters include: files=[...] - a list of database files to open immutables=[...] - a list of database files to open in immutable mode metadata={...} - a dictionary of Metadata config_dir=... - the configuration directory to use, stored in datasette.config_dir ["Internals for plugins"] []
internals:internals-datasette-is-client internals internals-datasette-is-client Detecting internal client requests datasette.in_client() - returns bool Returns True if the current code is executing within a datasette.client request, False otherwise. This method is useful for plugins that need to behave differently when called through datasette.client versus when handling external HTTP requests. Example usage: async def fetch_documents(datasette): if not datasette.in_client(): return Response.text( "Only available via internal client requests", status=403, ) ... Note that datasette.in_client() is independent of skip_permission_checks . A request made through datasette.client will always have in_client() return True , regardless of whether skip_permission_checks is set. ["Internals for plugins", "Datasette class", "datasette.client"] []
internals:internals-datasette-urls internals internals-datasette-urls datasette.urls The datasette.urls object contains methods for building URLs to pages within Datasette. Plugins should use this to link to pages, since these methods take into account any base_url configuration setting that might be in effect. datasette.urls.instance(format=None) Returns the URL to the Datasette instance root page. This is usually "/" . datasette.urls.path(path, format=None) Takes a path and returns the full path, taking base_url into account. For example, datasette.urls.path("-/logout") will return the path to the logout page, which will be "/-/logout" by default or /prefix-path/-/logout if base_url is set to /prefix-path/ datasette.urls.logout() Returns the URL to the logout page, usually "/-/logout" datasette.urls.static(path) Returns the URL of one of Datasette's default static assets, for example "/-/static/app.css" datasette.urls.static_plugins(plugin_name, path) Returns the URL of one of the static assets belonging to a plugin. datasette.urls.static_plugins("datasette_cluster_map", "datasette-cluster-map.js") would return "/-/static-plugins/datasette_cluster_map/datasette-cluster-map.js" datasette.urls.static(path) … ["Internals for plugins", "Datasette class"] []
internals:internals-formdata internals internals-formdata The FormData class await request.form() returns a FormData object - a dictionary-like object which provides access to form fields and uploaded files. It has a similar interface to MultiParams . form[key] - string or UploadedFile Returns the first value for that key, or raises a KeyError if the key is missing. form.get(key) - string, UploadedFile, or None Returns the first value for that key, or None if the key is missing. Pass a second argument to specify a different default. form.getlist(key) - list Returns the list of values for that key. If the key is missing an empty list will be returned. form.keys() - list of strings Returns the list of available keys. key in form - True or False You can use if key in form to check if a key is present. for key in form - iterator This lets you loop through every available key. len(form) - integer Returns the total number of submitted values. ["Internals for plugins"] []
internals:internals-internal internals internals-internal Datasette's internal database Datasette maintains an "internal" SQLite database used for configuration, caching, and storage. Plugins can store configuration, settings, and other data inside this database. By default, Datasette will use a temporary in-memory SQLite database as the internal database, which is created at startup and destroyed at shutdown. Users of Datasette can optionally pass in a --internal flag to specify the path to a SQLite database to use as the internal database, which will persist internal data across Datasette instances. Datasette maintains tables called catalog_databases , catalog_tables , catalog_views , catalog_columns , catalog_indexes , catalog_foreign_keys with details of the attached databases and their schemas. These tables should not be considered a stable API - they may change between Datasette releases. Metadata is stored in tables metadata_instance , metadata_databases , metadata_resources and metadata_columns . Plugins can interact with these tables via the get_*_metadata() and set_*_metadata() methods . The internal database is not exposed in the Datasette application by default, which means private data can safely be stored without worry of accidentally leaking information through the default Datasette interface and API. However, other plugins do have full read and write access to the internal database. Plugins can access this database by calling internal_db = datasette.get_internal_database() and then executing queries using the Database API . Plugin authors are asked to practice good etiquette when using the internal database, as all plugins use the same database to store data. For example: Use a unique prefix when creating tables, indices, and triggers in the internal database. If your plugin is called datasette-xyz , then prefix names with datasette_xyz_* . Avoid long-running write statements that may … ["Internals for plugins"] []
internals:internals-internal-schema internals internals-internal-schema Internal database schema The internal database schema is as follows: [[[cog from metadata_doc import internal_schema internal_schema(cog) ]]] CREATE TABLE catalog_databases ( database_name TEXT PRIMARY KEY, path TEXT, is_memory INTEGER, schema_version INTEGER ); CREATE TABLE catalog_tables ( database_name TEXT, table_name TEXT, rootpage INTEGER, sql TEXT, PRIMARY KEY (database_name, table_name), FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name) ); CREATE TABLE catalog_views ( database_name TEXT, view_name TEXT, rootpage INTEGER, sql TEXT, PRIMARY KEY (database_name, view_name), FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name) ); CREATE TABLE catalog_columns ( database_name TEXT, table_name TEXT, cid INTEGER, name TEXT, type TEXT, "notnull" INTEGER, default_value TEXT, -- renamed from dflt_value is_pk INTEGER, -- renamed from pk hidden INTEGER, PRIMARY KEY (database_name, table_name, name), FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name), FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name) ); CREATE TABLE catalog_indexes ( database_name TEXT, table_name TEXT, seq INTEGER, name TEXT, "unique" INTEGER, origin TEXT, partial INTEGER, PRIMARY KEY (database_name, table_name, name), FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name), FOREIGN KEY (database_name, table_name) REFERENCES catalog_tables(database_name, table_name) ); CREATE TABLE catalog_foreign_keys ( database_name TEXT, table_name TEXT, id INTEGER, seq INTEGER, "table" TEXT, "from" TEXT, "to" TEXT, on_update TEXT, on_delete TEXT, match TEXT, PRIMARY KEY (database_name, table_name, id, seq), FOREIGN KEY (database_name) REFERENCES catalog_databases(database_name), FOREIGN KEY (database_name, table_name)… ["Internals for plugins", "Datasette's internal database"] []
internals:internals-multiparams internals internals-multiparams The MultiParams class request.args is a MultiParams object - a dictionary-like object which provides access to query string parameters that may have multiple values. Consider the query string ?foo=1&foo=2&bar=3 - with two values for foo and one value for bar . request.args[key] - string Returns the first value for that key, or raises a KeyError if the key is missing. For the above example request.args["foo"] would return "1" . request.args.get(key) - string or None Returns the first value for that key, or None if the key is missing. Pass a second argument to specify a different default, e.g. q = request.args.get("q", "") . request.args.getlist(key) - list of strings Returns the list of strings for that key. request.args.getlist("foo") would return ["1", "2"] in the above example. request.args.getlist("bar") would return ["3"] . If the key is missing an empty list will be returned. request.args.keys() - list of strings Returns the list of available keys - for the example this would be ["foo", "bar"] . key in request.args - True or False You can use if key in request.args to check if a key is present. for key in request.args - iterator This lets you loop through every available key. le… ["Internals for plugins"] []
internals:internals-permission-classes internals internals-permission-classes Permission classes and utilities   ["Internals for plugins"] []
internals:internals-permission-sql internals internals-permission-sql PermissionSQL class The PermissionSQL class is used by plugins to contribute SQL-based permission rules through the permission_resources_sql(datasette, actor, action) hook. This enables efficient permission checking across multiple resources by leveraging SQLite's query engine. from datasette.permissions import PermissionSQL @dataclass class PermissionSQL: source: str # Plugin name for auditing sql: str # SQL query returning permission rules params: Dict[str, Any] # Parameters for the SQL query Attributes: source - string An identifier for the source of these permission rules, typically the plugin name. This is used for debugging and auditing. sql - string A SQL query that returns permission rules. The query must return rows with the following columns: parent (TEXT or NULL) - The parent resource identifier (e.g., database name) child (TEXT or NULL) - The child resource identifier (e.g., table name) allow (INTEGER) - 1 for allow, 0 for deny reason (TEXT) - A human-readable explanation of why this permission was granted or denied params - dictionary A dictionary of parameters to … ["Internals for plugins", "Permission classes and utilities"] []
internals:internals-response internals internals-response Response class The Response class can be returned from view functions that have been registered using the register_routes(datasette) hook. The Response() constructor takes the following arguments: body - string The body of the response. status - integer (optional) The HTTP status - defaults to 200. headers - dictionary (optional) A dictionary of extra HTTP headers, e.g. {"x-hello": "world"} . content_type - string (optional) The content-type for the response. Defaults to text/plain . For example: from datasette.utils.asgi import Response response = Response( "<xml>This is XML</xml>", content_type="application/xml; charset=utf-8", ) The quickest way to create responses is using the Response.text(...) , Response.html(...) , Response.json(...) or Response.redirect(...) helper methods: from datasette.utils.asgi import Response html_response = Response.html("This is HTML") json_response = Response.json({"this_is": "json"}) text_response = Response.text( "This will become utf-8 encoded text" ) # Redirects are served as 302, unless you pass status=301: redirect_response = Response.redirect( "https://latest.datasette.io/" ) Each of these responses will use the correct corresponding content-type - text/html; charset=utf-8 , application/json; charset=utf-8 or text/plain; charset=utf-8 respectively. Each of the helper methods take optional status= and headers= argument… ["Internals for plugins"] []
internals:internals-response-asgi-send internals internals-response-asgi-send Returning a response with .asgi_send(send) In most cases you will return Response objects from your own view functions. You can also use a Response instance to respond at a lower level via ASGI, for example if you are writing code that uses the asgi_wrapper(datasette) hook. Create a Response object and then use await response.asgi_send(send) , passing the ASGI send function. For example: async def require_authorization(scope, receive, send): response = Response.text( "401 Authorization Required", headers={ "www-authenticate": 'Basic realm="Datasette", charset="UTF-8"' }, status=401, ) await response.asgi_send(send) ["Internals for plugins", "Response class"] []
internals:internals-response-set-cookie internals internals-response-set-cookie Setting cookies with response.set_cookie() To set cookies on the response, use the response.set_cookie(...) method. The method signature looks like this: def set_cookie( self, key, value="", max_age=None, expires=None, path="/", domain=None, secure=False, httponly=False, samesite="lax", ): ... You can use this with datasette.sign() to set signed cookies. Here's how you would set the ds_actor cookie for use with Datasette authentication : response = Response.redirect("/") response.set_cookie( "ds_actor", datasette.sign({"a": {"id": "cleopaws"}}, "actor"), ) return response ["Internals for plugins", "Response class"] []
internals:internals-shortcuts internals internals-shortcuts Import shortcuts The following commonly used symbols can be imported directly from the datasette module: from datasette import Response from datasette import Forbidden from datasette import NotFound from datasette import hookimpl from datasette import actor_matches_allow ["Internals for plugins"] []
internals:internals-uploadedfile internals internals-uploadedfile The UploadedFile class When parsing multipart form data with files=True , file uploads are returned as UploadedFile objects with the following properties and methods: uploaded_file.name - string The form field name. uploaded_file.filename - string The original filename provided by the client. Note: This is sanitized to remove path components for security. uploaded_file.content_type - string or None The MIME type of the uploaded file, if provided by the client. uploaded_file.size - integer The size of the uploaded file in bytes. await uploaded_file.read(size=-1) - bytes Read and return up to size bytes from the file. If size is -1 (default), read the entire file. await uploaded_file.seek(offset, whence=0) - integer Seek to the given position in the file. Returns the new position. await uploaded_file.close() Close the underlying file. This is called automatically when the object is garbage collected. Files smaller than 1MB are stored in memory. Larger files are automatically spilled to temporary files on disk and cleaned up when the request completes. Example: form = await requ… ["Internals for plugins"] []
internals:internals-utils-async-call-with-supported-arguments internals internals-utils-async-call-with-supported-arguments await async_call_with_supported_arguments(fn, **kwargs) Async version of call_with_supported_arguments . Use this for async def callback functions. async datasette.utils.async_call_with_supported_arguments(fn, **kwargs) Async version of call_with_supported_arguments() . Calls await fn(...) with the subset of **kwargs matching its signature. Parameters fn -- An async callable kwargs -- All available keyword arguments Returns The return value of await fn(...) ["Internals for plugins", "The datasette.utils module"] []
internals:internals-utils-call-with-supported-arguments internals internals-utils-call-with-supported-arguments call_with_supported_arguments(fn, **kwargs) Call fn , passing it only those keyword arguments that match its function signature. This implements a dependency injection pattern - the caller provides all available arguments, and the function receives only the ones it declares as parameters. This is useful in plugins that want to define callback functions that only declare the arguments they need. For example: from datasette.utils import call_with_supported_arguments def my_callback(request, datasette): ... # This will pass only request and datasette, ignoring other kwargs: call_with_supported_arguments( my_callback, request=request, datasette=datasette, database=database, table=table, ) datasette.utils.call_with_supported_arguments(fn, **kwargs) Call fn with the subset of **kwargs matching its signature. This implements dependency injection: the caller provides all available keyword arguments and the function receives only the ones it declares as parameters. Parameters fn -- A callable (sync function) kwargs -- All available keyword arguments Returns The return value of fn ["Internals for plugins", "The datasette.utils module"] []
internals:internals-utils-named-parameters internals internals-utils-named-parameters named_parameters(sql) Derive the list of :named parameters referenced in a SQL query. datasette.utils.named_parameters(sql: str) -> List[str] Given a SQL statement, return a list of named parameters that are used in the statement e.g. for select * from foo where id=:id this would return ["id"] ["Internals for plugins", "The datasette.utils module"] []
internals:internals-utils-parse-metadata internals internals-utils-parse-metadata parse_metadata(content) This function accepts a string containing either JSON or YAML, expected to be of the format described in Metadata . It returns a nested Python dictionary representing the parsed data from that string. If the metadata cannot be parsed as either JSON or YAML the function will raise a utils.BadMetadataError exception. datasette.utils.parse_metadata(content: str) -> dict Detects if content is JSON or YAML and parses it appropriately. ["Internals for plugins", "The datasette.utils module"] []
internals:permission-sql-parameters internals permission-sql-parameters Available SQL parameters When writing SQL for PermissionSQL , the following parameters are automatically available: :actor - JSON string or NULL The full actor dictionary serialized as JSON. Use SQLite's json_extract() function to access fields: json_extract(:actor, '$.role') = 'admin' json_extract(:actor, '$.team') = 'engineering' :actor_id - string or NULL The actor's id field, for simple equality comparisons: :actor_id = 'alice' :action - string The action being checked (e.g., "view-table" , "insert-row" , "execute-sql" ). Example usage: Here's an example plugin that grants view-table permissions to users with an "analyst" role for tables in the "analytics" database: from datasette import hookimpl from datasette.permissions import PermissionSQL @hookimpl def permission_resources_sql(datasette, actor, action): if action != "view-table": return None return PermissionSQL( source="my_analytics_plugin", sql=""" SELECT 'analytics' AS parent, NULL AS child, 1 AS allow, 'Analysts can view analytics database' AS reason WHERE json_extract(:actor, '$.role') = 'analyst' AND :action = 'view-table' """, params={}, ) A more complex example that uses custom parameters: … ["Internals for plugins", "Permission classes and utilities", "PermissionSQL class"] []