EXPLAIN

Shows the generated Intermediate Representation (IR) code and indicates whether it is executed on GPU or CPU. This output is used primarily by HEAVY.AI internally to monitor behavior.

EXPLAIN <STMT>

For example, when you use the EXPLAIN command on a basic statement, the utility returns 90 lines of IR code that is not meant to be human readable. However, at the top of the listing, a heading indicates whether it is IR for the CPU or IR for the GPU, which can be useful to know in some situations.
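
For illustration, the following is a minimal sketch of what this looks like in heavysql, with the IR body elided. The heading shown (IR for the GPU) is only one possibility; a query compiled for the CPU is headed IR for the CPU instead.

heavysql> EXPLAIN SELECT COUNT(*) FROM movies;
Explanation
IR for the GPU:
[roughly 90 lines of IR omitted; not intended to be human readable]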

EXPLAIN CALCITE

Returns a relational algebra tree describing the high-level plan to execute the statement.

EXPLAIN CALCITE <STMT>

The table below lists the relational algebra classes used to describe the execution plan for a SQL statement.

| Class | Description |
| --- | --- |
| LogicalAggregate | Operator that eliminates duplicates and computes totals. |
| LogicalCalc | Expression that computes project expressions and also filters. |
| LogicalChi | Operator that converts a stream to a relation. |
| LogicalCorrelate | Operator that performs nested-loop joins. |
| LogicalDelta | Operator that converts a relation to a stream. |
| LogicalExchange | Expression that imposes a particular distribution on its input without otherwise changing its content. |
| LogicalFilter | Expression that iterates over its input and returns elements for which a condition evaluates to true. |
| LogicalIntersect | Expression that returns the intersection of the rows of its inputs. |
| LogicalJoin | Expression that combines two relational expressions according to some condition. |
| LogicalMatch | Expression that represents a MATCH_RECOGNIZE node. |
| LogicalMinus | Expression that returns the rows of its first input minus any matching rows from its other inputs. Corresponds to the SQL EXCEPT operator. |
| LogicalProject | Expression that computes a set of ‘select expressions’ from its input relational expression. |
| LogicalSort | Expression that imposes a particular sort order on its input without otherwise changing its content. |
| LogicalTableFunctionScan | Expression that calls a table-valued function. |
| LogicalTableModify | Expression that modifies a table. Similar to TableScan, but represents a request to modify a table instead of reading from it. |
| LogicalTableScan | Reads all the rows from a RelOptTable. |
| LogicalUnion | Expression that returns the union of the rows of its inputs, optionally eliminating duplicates. |
| LogicalValues | Expression whose value is a sequence of zero or more literal row values. |
| LogicalWindow | Expression representing a set of window aggregates. See Window Functions. |

For example, a SELECT statement is described as a table scan and projection.

heavysql> EXPLAIN CALCITE (SELECT * FROM movies);
Explanation
LogicalProject(movieId=[$0], title=[$1], genres=[$2])
   LogicalTableScan(TABLE=[[CATALOG, heavyai, MOVIES]])
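
Read the plan from the innermost (most deeply indented) node outward: the LogicalTableScan reads all rows of the MOVIES table, and the LogicalProject above it returns the movieId, title, and genres columns.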

If you add a sort order, the table projection is folded under a LogicalSort node.

heavysql> EXPLAIN calcite (SELECT * FROM movies ORDER BY title);
Explanation
LogicalSort(sort0=[$1], dir0=[ASC])
   LogicalProject(movieId=[$0], title=[$1], genres=[$2])
      LogicalTableScan(TABLE=[[CATALOG, omnisci, MOVIES]])

When the SQL statement is simple, the EXPLAIN CALCITE version is actually less “human readable.” EXPLAIN CALCITE is more useful when you work with more complex SQL statements, like the one that follows. This query performs a scan on the BOOK table before scanning the BOOK_ORDER table.

heavysql> EXPLAIN calcite SELECT bc.firstname, bc.lastname, b.title, bo.orderdate, s.name
FROM book b, book_customer bc, book_order bo, shipper s
WHERE bo.cust_id = bc.cust_id AND b.book_id = bo.book_id AND bo.shipper_id = s.shipper_id
AND s.name = 'UPS';
Explanation
LogicalProject(firstname=[$5], lastname=[$6], title=[$2], orderdate=[$11], name=[$14])
    LogicalFilter(condition=[AND(=($9, $4), =($0, $8), =($10, $13), =($14, 'UPS'))])
        LogicalJoin(condition=[true], joinType=[INNER])
            LogicalJoin(condition=[true], joinType=[INNER])
                LogicalJoin(condition=[true], joinType=[INNER])
                    LogicalTableScan(TABLE=[[CATALOG, omnisci, BOOK]])
                    LogicalTableScan(TABLE=[[CATALOG, omnisci, BOOK_CUSTOMER]])
                LogicalTableScan(TABLE=[[CATALOG, omnisci, BOOK_ORDER]])
            LogicalTableScan(TABLE=[[CATALOG, omnisci, SHIPPER]])

Revising the original SQL statement to list the tables in a more natural join order results in a more performant query.

heavysql> EXPLAIN calcite SELECT bc.firstname, bc.lastname, b.title, bo.orderdate, s.name
FROM book_order bo, book_customer bc, book b, shipper s
WHERE bo.cust_id = bc.cust_id AND bo.book_id = b.book_id AND bo.shipper_id = s.shipper_id
AND s.name = 'UPS';
Explanation
LogicalProject(firstname=[$10], lastname=[$11], title=[$7], orderdate=[$3], name=[$14])
    LogicalFilter(condition=[AND(=($1, $9), =($5, $0), =($2, $13), =($14, 'UPS'))])
        LogicalJoin(condition=[true], joinType=[INNER])
            LogicalJoin(condition=[true], joinType=[INNER])
                LogicalJoin(condition=[true], joinType=[INNER])
                    LogicalTableScan(TABLE=[[CATALOG, omnisci, BOOK_ORDER]])
                    LogicalTableScan(TABLE=[[CATALOG, omnisci, BOOK_CUSTOMER]])
                LogicalTableScan(TABLE=[[CATALOG, omnisci, BOOK]])
            LogicalTableScan(TABLE=[[CATALOG, omnisci, SHIPPER]])

EXPLAIN CALCITE DETAILED

Augments the EXPLAIN CALCITE command by adding details about referenced columns in the query plan.

For example, consider the following EXPLAIN CALCITE command and its output:

heavysql> EXPLAIN CALCITE SELECT x, SUM(y) FROM test GROUP BY x;
Explanation
LogicalAggregate(group=[{0}], EXPR$1=[SUM($1)])
  LogicalProject(x=[$0], y=[$2])
    LogicalTableScan(table=[[testDB, test]])

EXPLAIN CALCITE DETAILED adds more column details as seen below:

heavysql> EXPLAIN CALCITE DETAILED SELECT x, SUM(y) FROM test GROUP BY x;
Explanation
LogicalAggregate(group=[{0}], EXPR$1=[SUM($1)])	{[$1->db:testDB,tableName:test,colName:y]}
  LogicalProject(x=[$0], y=[$2])	{[$2->db:testDB,tableName:test,colName:y], [$0->db:testDB,tableName:test,colName:x]}
    LogicalTableScan(table=[[testDB, test]])
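
In the detailed output, each bracketed annotation maps a field index used by a plan node to the physical column it references, in the form [$index->db:<database>,tableName:<table>,colName:<column>]. For example, $2 in the LogicalProject above resolves to column y of table test in database testDB.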