HEAVY.AI Docs
v8.1.0

Implementing a Secure Binary Interface

Follow these instructions to start a HEAVY.AI server with an encrypted main port.

Required PKI Components

You need the following PKI (Public Key Infrastructure) components to implement a Secure Binary Interface.

  • A CRT (certificate) file containing the server's PKI certificate. This file must be shared with the clients that connect using encrypted communications. Ideally, this file is signed by a recognized certificate authority (CA).

  • A key file containing the server's private key. Keep this file secret and secure.

  • A Java TrustStore containing the server's PKI certificate, and the trust store password. Although in this instance the trust store contains only information that can be shared, the Java keytool utility requires it to be password protected.

  • A Java KeyStore and password.

  • In a distributed system, add the configuration parameters to the heavyai.conf file on the aggregator and all leaf nodes in your HeavyDB cluster.

Demonstration Script to Create "Mock/Test" PKI Components

You can use OpenSSL utilities to create the various PKI elements. The server certificate in this example is self-signed and should not be used in a production system.

  1. Generate a new private key.

    openssl genrsa -out server.key 2048
  2. Use the private key to generate a certificate signing request.

    openssl req -new -key server.key -out server.csr
  3. Self-sign the certificate signing request to create a public certificate.

    openssl x509 -req -days 365 -in server.csr -signkey server.key -out server.crt
  4. Use the Java keytool utility to create a trust store (server.jks) from the public certificate.

    keytool -importcert  -file server.crt -keystore server.jks
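
Before continuing, you can sanity-check what you just created. The following commands are optional verification steps, not part of the HEAVY.AI procedure; they confirm that the certificate matches the private key and list what was imported into server.jks:

# The two digests must be identical if server.crt was generated from server.key
openssl x509 -noout -modulus -in server.crt | openssl md5
openssl rsa -noout -modulus -in server.key | openssl md5
# List the trusted certificate entry; keytool prompts for the password you set during -importcert
keytool -list -keystore server.jks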

To generate a keystore file from your server key:

  1. Copy server.key to server.txt. Concatenate it with server.crt.

    cp server.key server.txt
    cat server.crt >> server.txt
  2. Use server.txt to create a PKCS12 file.

    openssl pkcs12 -export -in server.txt -out server.p12
  3. Use server.p12 to create a keystore.

    keytool -importkeystore -v -srckeystore server.p12  -srcstoretype PKCS12 -destkeystore keystore.jks -deststoretype pkcs12
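
Optionally, you can confirm that the private key entry made it into keystore.jks. This is only a verification sketch, not part of the procedure; keytool prompts for the destination password you chose in the previous step:

# A "PrivateKeyEntry" in the output indicates the key and certificate were imported correctly
keytool -list -v -keystore keystore.jks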

Start the Server in Encrypted Mode with PKI Client Authentication

Start the server using the following options.

--pki-db-client-auth true
--ssl-cert <path to the server certificate (.crt/.pem) file>
--ssl-private-key <path to the server private key file>
--ssl-trust-store <path to the Java trust store (.jks) file>
--ssl-trust-password <trust store password>
--ssl-keystore <path to the Java keystore (.jks) file>
--ssl-keystore-password <keystore password>
--ssl-trust-ca <path to a trusted CA certificate file>
--ssl-trust-ca-server <path to a trusted CA certificate file>

Example

sudo start heavyai_server --port 6274 --data /data --pki-db-client-auth true \
--ssl-cert /tls_certs/self_signed_server.example.com_self_signed/self_signed_server.example.com.pem \
--ssl-private-key /tls_certs/self_signed_server.example.com_self_signed/private/self_signed_server.example.com_key.pem \
--ssl-trust-store /tls_certs/self_signed_server.example.com_self_signed/trust_store_self_signed_server.example.com.jks \
--ssl-trust-password truststore_password \
--ssl-keystore /tls_certs/self_signed_server.example.com_self_signed/key_store_self_signed_server.example.com.jks \
--ssl-keystore-password keystore_password \
--ssl-trust-ca /tls_certs/self_signed_server.example.com_self_signed/self_signed_server.example.com.pem \
--ssl-trust-ca-server /tls_certs/ca_primary/ca_primary_cert.pem

Configuring heavyai.conf for Encrypted Connection

Alternatively, you can add the following configuration parameters to heavyai.conf to establish a Secure Binary Interface. These flags implement the same encryption as the runtime example above:

# Start pki authentication 
pki-db-client-auth = true 
ssl-cert = "/tls_certs/self_signed_server.example.com_self_signed/self_signed_server.example.com.pem" 
ssl-private-key = "/tls_certs/self_signed_server.example.com_self_signed/private/self_signed_server.example.com_key.pem" 
ssl-trust-store = "/tls_certs/self_signed_server.example.com_self_signed/trust_store_self_signed_server.example.com.jks" 
ssl-trust-password = "truststore_password"  
ssl-keystore = "/tls_certs/self_signed_server.example.com_self_signed/key_store_self_signed_server.example.com.jks" 
ssl-keystore-password = "keystore_password" 
ssl-trust-ca = "/tls_certs/self_signed_server.example.com_self_signed/self_signed_server.example.com.pem" 
ssl-trust-ca-server = "/tls_certs/ca_primary/ca_primary_cert.pem" 

Passwords for the SSL truststore and keystore can be enclosed in single (') or double (") quotes.
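
As a quick check that the main port is now serving TLS, you can probe it with openssl s_client. This is an optional verification sketch; the hostname and CA file below assume the self-signed example above, so substitute your own values:

# Prints the certificate chain and handshake details; "Verify return code: 0 (ok)" means the CA file validated the server certificate
openssl s_client -connect self_signed_server.example.com:6274 \
-CAfile /tls_certs/self_signed_server.example.com_self_signed/self_signed_server.example.com.pem

Because --pki-db-client-auth is enabled, a full application session also requires a valid client certificate; this probe only confirms that the port is encrypted.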

Why Use Both server.crt and a Java TrustStore?

The server.crt file and the Java truststore contain the same public key information in different formats. The server requires both: it uses them to secure client communication on its various interfaces and to secure communication with its Calcite server. At startup, the Java truststore is passed to the Calcite server, which uses it for authentication and to encrypt its traffic with the HEAVY.AI server.
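
If you want to confirm that the two artifacts really do carry the same certificate, you can compare fingerprints. This is an optional check using the file names from the demonstration script above; note that the fingerprint algorithm keytool displays can vary by JDK version:

# Both commands should report the same SHA-256 fingerprint (formatting differs slightly)
openssl x509 -noout -fingerprint -sha256 -in server.crt
keytool -list -keystore server.jks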
