Notebooks
Accessing ApertureDB from a notebook is straightforward once you have installed and set up the Python SDK as explained here.
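As a quick sketch, installation and initial configuration typically look like this (the profile name `my_aperturedb` is just an example; the `adb` CLI ships with the SDK):

```shell
# Install the ApertureDB Python SDK from PyPI
pip install aperturedb

# Create a connection profile; prompts for host and credentials
adb config create my_aperturedb
```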
Specific Flavors of Notebooks
- Google Colab
- Jupyter Notebooks
- Workflow

Instead of entering your connection details every time you restart your kernel,
you can set up Colab secrets for ApertureDB as a one-time activity.
The easiest way to do this in Google Colab is to set the APERTUREDB_KEY secret.
After you have done this once, any call to CommonLibrary.create_connector() should work seamlessly.
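As a sketch of how this plays out in code, the helper below (hypothetical, not part of the SDK) resolves the key from either an environment variable or Colab's secret store; `create_connector()` picks up the key from the environment:

```python
import os


def resolve_aperturedb_key():
    """Return the ApertureDB key, preferring an explicit environment variable.

    Falls back to Google Colab's secret store when running inside Colab.
    """
    key = os.environ.get("APERTUREDB_KEY")
    if key:
        return key
    try:
        # Only available inside a Colab runtime; the secret must be
        # created in the Colab "Secrets" panel and access granted.
        from google.colab import userdata  # type: ignore
        return userdata.get("APERTUREDB_KEY")
    except ImportError:
        return None
```

With the key resolved into `APERTUREDB_KEY`, calling `CommonLibrary.create_connector()` from the aperturedb package should connect without any interactive prompts.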
Setting up (or updating) the virtual environment for the Python client can sometimes cause dependency version mismatches. This is because the ApertureDB SDK does not pin the version numbers of some of its dependencies, which are released on separate cycles.
For such cases, there is a Docker image, built during the CI process, that provides a guaranteed stable environment. This image also includes an installation of Jupyter Lab. It can be run as follows:
docker run --interactive --tty aperturedata/aperturedb-notebook
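To reach Jupyter Lab from a browser on the host, you will likely also want to publish its port (8888 is Jupyter Lab's default; adjust if your setup differs):

```shell
# Publish Jupyter Lab's default port so it is reachable at http://localhost:8888
docker run --interactive --tty --publish 8888:8888 aperturedata/aperturedb-notebook
```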
If you are using the Community Edition for prototyping, you can extend the Docker Compose file to include this notebook service:
name: aperturedb-local-linux
services:
  ca:
    image: alpine/openssl
    restart: on-failure
    command: req -x509 -newkey rsa:4096 -days 3650 -nodes -keyout /cert/tls.key -out /cert/tls.crt -subj "/C=US/O=ApertureData Inc./CN=localhost"
    volumes:
      - ./aperturedb/certificate:/cert
  lenz:
    depends_on:
      ca:
        condition: service_completed_successfully
      aperturedb:
        condition: service_started
    image: aperturedata/lenz:latest
    ports:
      - 55555:55551
    restart: always
    environment:
      LNZ_HEALTH_PORT: 58085
      LNZ_TCP_PORT: 55551
      LNZ_HTTP_PORT: 8080
      LNZ_ADB_BACKENDS: '["aperturedb:55553"]'
      LNZ_REPLICAS: 1
      LNZ_ADB_MAX_CONCURRENCY: 48
      LNZ_FORCE_SSL: false
      LNZ_CERTIFICATE_PATH: /etc/lenz/certificate/tls.crt
      LNZ_PRIVATE_KEY_PATH: /etc/lenz/certificate/tls.key
    volumes:
      - ./aperturedb/certificate:/etc/lenz/certificate
  aperturedb:
    image: aperturedata/aperturedb-community:latest
    volumes:
      - ./aperturedb/db:/aperturedb/db
      - ./aperturedb/logs:/aperturedb/logs
    restart: always
    environment:
      ADB_KVGD_DB_SIZE: "204800"
      ADB_LOG_PATH: "logs"
      ADB_ENABLE_DEBUG: 1
      ADB_MASTER_KEY: "admin"
      ADB_PORT: 55553
      ADB_FORCE_SSL: false
  webui:
    image: aperturedata/aperturedata-platform-web-private:latest
    restart: always
  nginx:
    depends_on:
      ca:
        condition: service_completed_successfully
    image: nginx
    restart: always
    ports:
      - 8080:80
      - 8443:443
    configs:
      - source: nginx.conf
        target: /etc/nginx/conf.d/default.conf
    volumes:
      - ./aperturedb/certificate:/etc/nginx/certificate
  notebook:
    image: aperturedata/aperturedb-notebook
    ports:
      - 8888:8888 # HOST_PORT:CONTAINER_PORT
    restart: always
    command: bash -c "adb config create aperturedb_docker --host lenz --no-interactive && /start.sh"
    depends_on:
      - aperturedb
      - lenz
configs:
  nginx.conf:
    content: |
      server {
        listen 80;
        listen 443 ssl;
        client_max_body_size 256m;
        ssl_certificate /etc/nginx/certificate/tls.crt;
        ssl_certificate_key /etc/nginx/certificate/tls.key;
        location / {
          proxy_pass http://webui;
        }
        location /api/ {
          proxy_pass http://lenz:8080;
        }
      }
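Assuming the file above is saved as docker-compose.yaml in the current directory, the whole stack (database, load balancer, web UI, and notebook) can be started in one step:

```shell
# Start all services in the background
docker compose up --detach

# Jupyter Lab is then at http://localhost:8888 and the web UI at http://localhost:8080;
# watch the notebook container come up with:
docker compose logs --follow notebook
```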
If you are using the ApertureDB Cloud, then you can also launch a Jupyter Notebook as a Workflow. See the Jupyter Notebooks Workflow for more details.