HEAVY.AI Docs
v8.1.0

Logs and Monitoring


HEAVY.AI writes to system logs and to HEAVY.AI-specific logs. System log entries include HEAVY.AI data-loading information, errors related to NVIDIA components, and other issues. For RHEL/CentOS, see /var/log/messages; for Ubuntu, see /var/log/syslog.

Most installation recipes use the systemd installer for HEAVY.AI, allowing consolidation of system-level logs. You can view the systemd log entries associated with HEAVY.AI using the following syntax in a terminal window:

journalctl -u heavydb

By default, HEAVY.AI uses rotating logs, with a symbolic link referencing the current HEAVY.AI server instance. Logs rotate when the instance restarts or when the current log reaches 10 MB; HeavyDB keeps a maximum of 100 historical log files. These logs are located in the /log directory.

The HEAVY.AI web server can show current log files through a web browser. Only super users who are logged in can access the log files. To enable log viewing in a browser, use the enable-browser-logs option; see Configuration Parameters for HEAVY.AI Web Server.

You can configure several of the logging behaviors described above using runtime flags. See Configuration Parameters.

Log Entry Types

Log levels form a hierarchy: messages logged at ERROR also appear in the WARNING and INFO logs, and messages logged at WARNING also appear in INFO.
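
This cascade can be sketched in a few lines of Python. The file names come from the headings below; treating FATAL as the most severe level in the same cascade is an assumption, and the helper itself is hypothetical:

```python
# Severities ordered least to most severe. Per the cascade described above,
# a message logged at a given severity also appears in every less severe log.
# (Including FATAL in the same cascade is an assumption.)
SEVERITIES = ["INFO", "WARNING", "ERROR", "FATAL"]

def logs_containing(severity: str) -> list[str]:
    """Return the heavydb log files that contain a message of this severity."""
    rank = SEVERITIES.index(severity)
    return [f"heavydb.{s}" for s in SEVERITIES[: rank + 1]]

print(logs_containing("ERROR"))
# ['heavydb.INFO', 'heavydb.WARNING', 'heavydb.ERROR']
```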

heavydb.INFO

heavydb.INFO is the best source of information for troubleshooting and the first place to check for issues. It provides verbose logging of:

  • Configuration settings in place when heavydb starts.

  • Queries by user and session ID, with execution time (time for query to run) and total time (execution time plus time spent waiting to execute plus network wait time).

Examples

  • Configuration settings in place when heavydb starts:

I1004 14:11:28.799216  1009 MapDServer.cpp:784] HEAVY.AI started with data directory at '/var/lib/heavyai/storage'
I1004 14:11:28.801699  1009 MapDServer.cpp:796]  Watchdog is set to 1
I1004 14:11:28.801708  1009 MapDServer.cpp:797]  Dynamic Watchdog is set to 0
  • When you use the wrong delimiter, you might see errors like this:

E1004 20:12:00.929049 7496 Importer.cpp:1603] Incorrect Row (expected 21 columns, has 1): [JB141803,02/04/2018
E1004 20:12:19.426657 7494 Importer.cpp:3148] Maximum rows rejected exceeded. Halting load
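
The entries above share a common prefix: a severity letter, an MMDD date, a timestamp, a process/thread ID, and the source file and line. A minimal Python sketch for pulling these fields apart; the regex is an assumption inferred from the sample lines shown here, not an official format specification:

```python
import re

# Assumed prefix format, inferred from the sample log entries above:
# <severity letter><MMDD> <HH:MM:SS.microseconds> <thread id> <file>:<line>] <message>
LOG_PREFIX = re.compile(
    r"^(?P<severity>[IWEF])(?P<date>\d{4}) "
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+"
    r"(?P<thread>\d+) (?P<source>[^:]+:\d+)\] (?P<message>.*)$"
)

def parse_log_line(line: str) -> dict:
    """Split a heavydb log line into its prefix fields and message."""
    match = LOG_PREFIX.match(line)
    return match.groupdict() if match else {}

entry = parse_log_line(
    "E1004 20:12:19.426657 7494 Importer.cpp:3148] "
    "Maximum rows rejected exceeded. Halting load"
)
print(entry["severity"], entry["source"])  # E Importer.cpp:3148
```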

heavydb.WARNING

Reports nonfatal warning messages. For example:

W1229 09:13:50.888172 36427 RenderInterface.cpp:1155] The string "Other" does 
not have a valid id in the dictionary-encoded string column "card_class" 
(aliased by "color") for table cc_trans.

heavydb.ERROR

Logs non-fatal error messages, as well as errors related to data ingestion.

Examples

  • When the heavysql COPY command references a nonexistent file or path:

    E1001 16:27:55.522930  2009 MapDHandler.cpp:4147] Exception: fopen(/tmp/25882.csv): No such file or directory
  • When the table definition does not match the file referenced in the COPY command:

    E1001 16:30:58.710852 10436 Importer.cpp:1603] Incorrect Row (expected 58 columns, has 57): [MOBILE, EVDOA, Access...

heavydb.FATAL

Reports `check failed` messages and a line number to identify where the error occurred. For example:

F1022 19:51:40.978567 14889 Execute.cpp:982] 
Check failed: cd->columnType.is_string() && cd->columnType.get_compression() == kENCODING_DICT

Live Logging

Browser-based Live Logging

Using Chrome’s Developer Tools, you can interact with data in Immerse to see the generated SQL and response times from HeavyDB. The syntax is SQLlogging(true), entered inline on the Console tab.

Once SQL logging is turned on, you can interact with the dashboard, see the SQL generated, and monitor response times.

When you turn SQL logging on using SQLlogging(true), or turn it off using SQLlogging(false), the change takes effect only after the page has been reloaded or closed and reopened.

Command-Line Live Logging

You can tail the logs from the logs folder (usually /log) using the following syntax in a terminal window, specifying the heavydb log file you want to view:

tail -f *.INFO

Monitoring

Monitoring options include the following.

From the command line, you can run nvidia-smi to verify:

  • That the OS can communicate with your NVIDIA GPUs

  • The NVIDIA SMI and driver versions

  • GPU count, model, and memory usage

  • Aggregate memory usage by HEAVY.AI

You can also leverage systemd in non-Docker deployments to verify the status of heavydb:

sudo systemctl status heavydb

and heavy_web_server:

sudo systemctl status heavy_web_server

These commands show whether the service is running (Active: active (running)) or stopped (Active: failed (Result: signal) or Active: inactive (dead)), the directory path, and a configuration summary.

Using heavysql, you can make these additional monitoring queries:

\status

  • Returns: server version, start time, and server edition.

  • In distributed environments, also returns the name, version number, and start time of each leaf.

\memory_summary

Returns a hybrid summary of CPU and GPU memory allocation. HEAVY.AI Server CPU Memory Summary shows the maximum amount of memory available, the amount in use, the amount allocated, and the amount free. HEAVY.AI allocates memory in 2 GB fragments on both CPU and GPU. HEAVY.AI Server GPU Memory Summary shows the same information for each individual GPU. Note: HEAVY.AI does not preallocate all of the available GPU memory.

A cold start of the system might look like this:

HEAVY.AI Server CPU Memory Summary:
           MAX            USE      ALLOCATED           FREE
  412566.56 MB        8.19 MB     4096.00 MB     4087.81 MB


HEAVY.AI Server GPU Memory Summary:
[GPU]            MAX            USE      ALLOCATED           FREE
 [0]    10169.96 MB        0.00 MB        0.00 MB        0.00 MB
 [1]    10169.96 MB        0.00 MB        0.00 MB        0.00 MB
 [2]    10169.96 MB        0.00 MB        0.00 MB        0.00 MB
 [3]    10169.96 MB        0.00 MB        0.00 MB        0.00 MB

After warming up the data, the memory might look like this:

HEAVY.AI Server CPU Memory Summary:
           MAX            USE      ALLOCATED           FREE
  412566.56 MB     7801.54 MB     8192.00 MB      390.46 MB


HEAVY.AI Server GPU Memory Summary:
[GPU]            MAX            USE      ALLOCATED           FREE
 [0]    10169.96 MB     2356.00 MB     4096.00 MB     1740.00 MB
 [1]    10169.96 MB     2356.00 MB     4096.00 MB     1740.00 MB
 [2]    10169.96 MB     1995.01 MB     2048.00 MB       52.99 MB
 [3]    10169.96 MB     1196.33 MB     2048.00 MB      851.67 MB
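
The columns in these summaries are related: FREE is ALLOCATED minus USE, and ALLOCATED is made up of whole 2 GB (2048 MB) fragments. A quick Python check against the warmed-up CPU row above (the numbers are copied from that row):

```python
# Relationship between the \memory_summary columns, checked against the
# warmed-up CPU row above: FREE = ALLOCATED - USE, and ALLOCATED is a
# multiple of the 2 GB (2048 MB) fragment size.
FRAGMENT_MB = 2048.0

use_mb = 7801.54        # USE column
allocated_mb = 8192.00  # ALLOCATED column

free_mb = allocated_mb - use_mb
print(f"{free_mb:.2f} MB free")    # 390.46 MB free

# Allocation consists of whole 2 GB fragments.
print(allocated_mb / FRAGMENT_MB)  # 4.0
```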