Notebooks
You can access ApertureDB from a notebook once the Python SDK has been installed and set up as explained here.
Specific Flavors of Notebooks
- Google Colab
- Jupyter Notebooks
Instead of entering your connection details every time you restart your kernel,
you can set up Colab secrets for ApertureDB as a one-time activity.
The easiest way to do that in Google Colab is to set the APERTUREDB_JSON
secret.
After you have done this once, any call to CommonLibrary.create_connector()
should work seamlessly.
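The APERTUREDB_JSON secret holds the connection details as a single JSON blob. The sketch below shows the general idea; the field names (`host`, `username`, `password`) are assumptions based on the usual connection parameters, so check the ApertureDB documentation for the exact schema:

```python
import json
import os

# Hypothetical example values -- substitute your own deployment's details.
# The field names here are assumptions; consult the ApertureDB docs for
# the exact schema expected in APERTUREDB_JSON.
secret = '{"host": "my-instance.example.com", "username": "admin", "password": "secret"}'

# In Colab the secret is surfaced to the runtime; outside Colab the SDK can
# also pick it up from the environment, which we mimic here.
os.environ["APERTUREDB_JSON"] = secret

config = json.loads(os.environ["APERTUREDB_JSON"])
print(config["host"])
```

With the secret in place, the SDK can build a connection without you re-typing credentials after every kernel restart.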
Setting up (or updating) the virtual environment for the Python client can sometimes cause dependency version mismatches. This is because the ApertureDB SDK does not pin the version numbers of some of its dependencies, and those dependencies are released on separate cycles.
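When you suspect such a mismatch, a quick first step is to check which versions of the relevant packages actually ended up in your virtual environment. A minimal sketch using only the standard library (the package names below are illustrative, not a definitive dependency list):

```python
from importlib.metadata import PackageNotFoundError, version


def installed_version(name: str):
    """Return the installed version of a package, or None if it is absent."""
    try:
        return version(name)
    except PackageNotFoundError:
        return None


# Illustrative package names -- adjust to the dependencies you care about.
for pkg in ("aperturedb", "protobuf", "numpy"):
    print(f"{pkg}: {installed_version(pkg) or 'not installed'}")
```

Comparing this output across a working and a broken environment usually pinpoints which dependency drifted.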
In such cases, a Docker image built during the CI process provides a guaranteed stable environment. This image also includes an installation of JupyterLab. It can be run as follows:
```shell
# --publish exposes JupyterLab on the host so you can open it in a browser
docker run --interactive --tty --publish 8888:8888 aperturedata/aperturedb-notebook
```
If you are using the Community edition for prototyping, you can update the Docker compose file to include this notebook:
```yaml
services:
  aperturedb:
    image: aperturedata/aperturedb-community
    restart: always
    privileged: true
    volumes: # Map ApertureDB storage and logs into local directory
      - ./aperturedb/db:/aperturedb/db
      - ./aperturedb/logs:/aperturedb/logs
    ports:
      - 55555:55555 # HOST_PORT:CONTAINER_PORT
    environment:
      ADB_PORT: 55555

  webui:
    image: aperturedata/aperturedb-webui
    ports:
      - 80:80 # HOST_PORT:CONTAINER_PORT
    restart: always
    depends_on:
      - aperturedb
    environment:
      - APP_PRIVATE_VDMS_SERVER_ADDR=aperturedb
      - APP_PRIVATE_VDMS_SERVER_PORT=55555

  notebook:
    image: aperturedata/aperturedb-notebook
    ports:
      - 8888:8888 # HOST_PORT:CONTAINER_PORT
    restart: always
    command: bash -c "adb config create aperturedb_docker --host aperturedb --no-interactive && /start.sh"
    depends_on:
      - aperturedb
```