
tf_load_point_cloud

Loads one or more las or laz point cloud/LiDAR files from a local file or directory source, optionally transforming the output to the SRID specified by out_srs (if not specified, output points are automatically transformed to EPSG:4326 lon/lat pairs).

If use_cache is set to true, an internal point cloud-specific cache holds the results per input file; querying the same file again is then served from the cache, significantly speeding up the query and allowing interactive exploration of a point cloud source. If the results of tf_load_point_cloud will be consumed only once (for example, as part of a CREATE TABLE statement), it is highly recommended that use_cache is set to false or left unspecified (it defaults to false) to avoid the performance and memory overhead incurred by use of the cache.
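
As a minimal sketch of this trade-off (the path and table name are illustrative), the first query below pays the one-time cost of populating the cache and then benefits from it on every repeated query, while the second consumes the output exactly once and leaves the cache off:

-- Interactive exploration: enable the cache so that repeated queries
-- over the same files are served from memory.
SELECT COUNT(*)
FROM TABLE(
  tf_load_point_cloud(
    path => '/path/to/las_files/*.las',
    use_cache => true
  )
);

-- One-shot ingestion: leave use_cache unspecified (false) to avoid
-- the cache's performance and memory overhead.
CREATE TABLE lidar_points AS
SELECT *
FROM TABLE(
  tf_load_point_cloud(path => '/path/to/las_files/*.las')
);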

The bounds of the data retrieved can optionally be specified with the x_min, x_max, y_min, and y_max arguments. These arguments are useful for retrieving a small geographic area from a large point-cloud file set: files whose data falls entirely outside the specified bounding box are quickly skipped by tf_load_point_cloud, requiring only a fast read of each file's spatial metadata.

SELECT * FROM TABLE(
    tf_load_point_cloud(
        path => <path>,
        [out_srs => <out_srs>,
        use_cache => <use_cache>,
        x_min => <x_min>,
        x_max => <x_max>,
        y_min => <y_min>,
        y_max => <y_max>]
    )
)    
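
Before choosing bounding-box values, the companion tf_point_cloud_metadata table function (documented separately) can be used to inspect the spatial extent of a file set without loading the points themselves. A minimal sketch, assuming it accepts the same path argument convention:

-- Read per-file metadata (such as spatial extents) for the source files.
SELECT *
FROM TABLE(
  tf_point_cloud_metadata(path => '/path/to/las_files/*.las')
);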

Input Arguments

| Parameter | Description | Data Types |
| --- | --- | --- |
| path | The path of the file or directory containing the las/laz file or files. Can contain globs. Path must be in allowed-import-paths. | TEXT ENCODING NONE |
| out_srs (optional) | EPSG code of the output SRID. If not specified, output points are automatically converted to lon/lat (EPSG:4326). | TEXT ENCODING NONE |
| use_cache (optional) | If true, use the internal point cloud cache. Useful for interactive querying of the output of tf_load_point_cloud. Turn it off for one-shot queries or when creating a table from the output, because adding data to the cache incurs performance and memory overhead. Defaults to false. | BOOLEAN |
| x_min (optional) | Minimum x-coordinate value (in degrees) for the output data. | DOUBLE |
| x_max (optional) | Maximum x-coordinate value (in degrees) for the output data. | DOUBLE |
| y_min (optional) | Minimum y-coordinate value (in degrees) for the output data. | DOUBLE |
| y_max (optional) | Maximum y-coordinate value (in degrees) for the output data. | DOUBLE |

Output Columns

| Name | Description | Data Types |
| --- | --- | --- |
| x | Point x-coordinate | Column<DOUBLE> |
| y | Point y-coordinate | Column<DOUBLE> |
| z | Point z-coordinate | Column<DOUBLE> |
| intensity | Point intensity | Column<INT> |
| return_num | The ordered number of the return for a given LiDAR pulse. The first returns (lowest return numbers) are generally associated with the highest-elevation points for a LiDAR pulse; for example, the forest canopy will generally have a lower return_num than the ground beneath it. | Column<TINYINT> |
| num_returns | The total number of returns for a LiDAR pulse. Multiple returns occur when there are multiple objects between the LiDAR source and the lowest ground or water elevation for a location. | Column<TINYINT> |
| scan_direction_flag | From the ASPRS LiDAR Data Exchange Format Standard: "The scan direction flag denotes the direction at which the scanner mirror was traveling at the time of the output pulse. A bit value of 1 is a positive scan direction, and a bit value of 0 is a negative scan direction." | Column<TINYINT> |
| edge_of_flight_line_flag | From the ASPRS LiDAR Data Exchange Format Standard: "The edge of flight line data bit has a value of 1 only when the point is at the end of a scan. It is the last point on a given scan line before it changes direction." | Column<TINYINT> |
| classification | From the ASPRS LiDAR Data Exchange Format Standard: "The classification field is a number to signify a given classification during filter processing. The ASPRS standard has a public list of classifications which shall be used when mixing vendor specific user software." | Column<SMALLINT> |
| scan_angle_rank | From the ASPRS LiDAR Data Exchange Format Standard: "The angle at which the laser point was output from the laser system, including the roll of the aircraft... The scan angle is an angle based on 0 degrees being NADIR, and –90 degrees to the left side of the aircraft in the direction of flight." | Column<TINYINT> |
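
As a usage sketch for the return columns (the path is reused from Example A below): last returns, where return_num equals num_returns, typically correspond to the lowest surface a pulse reached, such as bare earth beneath vegetation.

-- Keep only last returns, which generally approximate the lowest
-- surface (for example, ground beneath the forest canopy).
SELECT
  x, y, z
FROM
  TABLE(
    tf_load_point_cloud(
      path => '/path/to/20150118_LA_37_20066601.laz',
      use_cache => true
    )
  )
WHERE
  return_num = num_returns;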

Example A

CREATE TABLE wake_co_lidar_test AS
SELECT
  *
FROM
  TABLE(
    tf_load_point_cloud(
      path => '/path/to/20150118_LA_37_20066601.laz'
    )
  );

Example B

SELECT
  x, y, z, classification
FROM
  TABLE(
    tf_load_point_cloud(
      path => '/path/to/las_files/*.las',
      out_srs => 'EPSG:4326',
      use_cache => true,
      y_min => 37.0,
      y_max => 38.0,
      x_min => -123.0,
      x_max => -122.0
    )
  );
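
As a follow-on sketch to Example B, the classification column can be profiled to see how the vendor labeled the points; in the public ASPRS classification list, for example, code 2 denotes ground. Because use_cache is enabled, repeating or refining this query is served from the cache.

-- Count points per ASPRS classification code.
SELECT
  classification, COUNT(*) AS n_points
FROM
  TABLE(
    tf_load_point_cloud(
      path => '/path/to/las_files/*.las',
      use_cache => true
    )
  )
GROUP BY
  classification
ORDER BY
  n_points DESC;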

Figure: LiDAR data from downtown Tallahassee, FL, colored by Z-value.