CommonLibrary
Common library functions for ApertureDB. This module does not have a large class structure; it is a collection of functions. This is the place to put functions that are reused across the codebase.
import_module_by_path
def import_module_by_path(filepath: str) -> Any
This function imports a module given a path to a Python file.
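A minimal usage sketch, assuming the module is importable as aperturedb.CommonLibrary; the file path and attribute name below are illustrative, not part of the library:

```python
from aperturedb.CommonLibrary import import_module_by_path

# Load helper code kept in a standalone file (illustrative path).
helpers = import_module_by_path("scripts/my_helpers.py")

# Attributes of the loaded module are then available as usual, e.g.:
# helpers.some_function()
```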
create_connector
def create_connector(name: Optional[str] = None,
                     create_config_for_colab_secret=True) -> Connector
Create a connector to the database.
This function chooses a configuration in the following order:
- The configuration named by the `name` parameter.
- The configuration described in the `APERTUREDB_JSON` environment variable.
- The configuration described in the `APERTUREDB_JSON` Google Colab secret.
- The configuration described in the `APERTUREDB_JSON` secret in a `.env` file.
- The configuration named by the `APERTUREDB_CONFIG` environment variable.
- The active configuration.

If there are both global and local configurations with the same name, the global configuration is preferred.

See the `adb config` command-line tool for more information.
Arguments:
- `name` (str, optional) - The name of the configuration to use. Default is None.
- `create_config_for_colab_secret` (bool, optional) - Whether to create a configuration from the Google Colab secret. Default is True.
Returns:
- `Connector` - The connector to the database.

Note about the Google Colab secret: this secret is available in the context of a notebook running on Google Colab. In particular, it is not available to the `adb` CLI tool running in a Colab notebook, or to any scripts run within a notebook. To resolve this issue, a configuration is automatically created and activated in this case. Use the `create_config_for_colab_secret` parameter to disable this behavior.
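A minimal usage sketch, assuming the module is importable as aperturedb.CommonLibrary; the configuration name "production" is illustrative and would need to exist already (e.g. created with the adb config tool):

```python
from aperturedb.CommonLibrary import create_connector

# Use the configuration resolved by the order described above
# (APERTUREDB_JSON, Colab secret, .env file, active configuration, ...).
client = create_connector()

# Or select a named configuration explicitly (illustrative name).
client_prod = create_connector(name="production")
```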
execute_query
def execute_query(client: Connector,
                  query: Commands,
                  blobs: Blobs,
                  success_statuses: list[int] = [0],
                  response_handler: Optional[Callable] = None,
                  commands_per_query: int = 1,
                  blobs_per_query: int = 0,
                  strict_response_validation: bool = False,
                  cmd_index=None) -> Tuple[int, CommandResponses, Blobs]
Execute a batch of queries, doing useful logging around it. Calls the response handler if provided.
This should be used (without the parallel machinery) instead of Connector.query so that response handling and logging remain consistent.
Arguments:
- `client` (Connector) - The database connector.
- `query` (Commands) - List of commands to execute.
- `blobs` (Blobs) - List of blobs to send.
- `success_statuses` (list[int], optional) - The list of success statuses. Defaults to [0].
- `response_handler` (Callable, optional) - The response handler. Defaults to None.
- `commands_per_query` (int, optional) - The number of commands per query. Defaults to 1.
- `blobs_per_query` (int, optional) - The number of blobs per query. Defaults to 0.
- `strict_response_validation` (bool, optional) - Whether to strictly validate the response. Defaults to False.
Returns:
- `int` - The result code:
  - 0: if all commands succeeded
  - 1: if there was a -1 in the response
  - 2: for any other code
- `CommandResponses` - The response.
- `Blobs` - The blobs.
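A sketch of a typical call, assuming the import path used above; the single FindImage command and its limit are illustrative, and the empty list is passed because this query sends no blobs:

```python
from aperturedb.CommonLibrary import create_connector, execute_query

client = create_connector()

# One illustrative command; Commands is a list of command dictionaries.
query = [{"FindImage": {"results": {"limit": 5}}}]

status, responses, result_blobs = execute_query(client, query, [])
if status == 0:
    print(responses)
else:
    print("Query did not fully succeed, result code:", status)
```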
issue_deprecation_warning
def issue_deprecation_warning(old_name, new_name)
Issue a deprecation warning for a function or class, pointing from its old name to its new name.
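A sketch of how renamed code might use it; the function names below are illustrative:

```python
from aperturedb.CommonLibrary import issue_deprecation_warning

def new_loader():
    # Illustrative replacement implementation.
    return "data"

def old_loader():
    # Illustrative shim: warn callers about the rename, then delegate.
    issue_deprecation_warning("old_loader", "new_loader")
    return new_loader()
```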