Custom Analysis & SQL Configurator
The Custom Analysis & SQL Configurator enables direct database access for custom fleet analysis beyond standard reports. Write SQL queries, explore your complete dataset, and create interactive visualizations in one integrated interface.
Interface overview

The SQL Configurator workspace consists of three main sections:
Database Schema browser - View table structures and click on schemas for interactive relationship exploration on dbdiagram.io
SQL Query Execution panel - Write PostgreSQL queries and view results with immediate feedback
Interactive visualization interface - Transform query results into charts using drag-and-drop functionality
Your data structure
Configuration impact: Your system processes different data categories based on configuration settings. If expected tables seem unavailable, verify your data category settings in system configuration.
Your PTL data is organized into two primary schemas by source and purpose:
raw_business_data: organizational and operational information
- Core entities: users, devices, objects, vehicles, employees
- Operational data: tasks, forms, zones, places, garages
- Reference data: models, entities, status information
- Relationship tables: vehicle-driver assignments, user-zone mappings

raw_telematics_data: real-time GPS tracking, sensor readings, and device status
- tracking_data_core: GPS coordinates, speed, altitude, event data
- inputs: sensor readings (fuel, temperature, voltage)
- states: device status indicators (ignition, doors, operational modes)
For details on data schemas available in your IoT Query, see Schema overview.
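To check exactly which tables your configuration exposes, you can query PostgreSQL's standard `information_schema` catalog. A minimal sketch, using the two schema names described above:

```sql
-- List every table in the two PTL schemas via the standard catalog views
SELECT table_schema, table_name
FROM information_schema.tables
WHERE table_schema IN ('raw_business_data', 'raw_telematics_data')
ORDER BY table_schema, table_name;
```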
Creating visualizations from query results
Transform your SQL query results into interactive visualizations through a structured workflow:
Verify data configuration
Check that all required data categories are enabled in your system settings:
Navigate to PTL Configuration in the left sidebar
Verify that needed data categories (Tracking Data, Inputs, States) are enabled for your analysis requirements
If changes are needed, update configuration and consider running historical data loading for retroactive application
Develop and execute your query
In the SQL Query Execution panel:
Write your analysis query using PostgreSQL syntax and the data format requirements described under Query development tips below
Include appropriate field names, data conversions, and filtering for your visualization needs
Click Execute to run your query and generate the dataset
Review results to ensure data quality and expected output format
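As an illustration of these steps, the sketch below pulls a week of recent positions with unit conversions and time filtering applied. Only `device_id`, `device_time`, and `tracking_data_core` are confirmed in this guide; the `latitude`, `longitude`, and `speed` column names are assumptions, so verify them against your schema:

```sql
-- Recent positions for the last 7 days, converted to human-readable units
-- (latitude/longitude/speed column names are illustrative assumptions)
SELECT
    device_id,
    device_time,
    latitude  / 10000000.0 AS lat_deg,    -- scaled integer -> decimal degrees
    longitude / 10000000.0 AS lon_deg,
    speed     / 100.0      AS speed_kmh   -- scaled integer -> km/h
FROM raw_telematics_data.tracking_data_core
WHERE device_time > now() - INTERVAL '7 days'
ORDER BY device_time DESC
LIMIT 100;
```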
Access visualization interface
Once your query executes successfully:
Switch to the Visualization tab that appears above your query results
Your query results automatically become available as data fields in the visualization interface
The interactive chart builder loads with your dataset ready for drag-and-drop visualization creation
Create interactive visualizations
Use the visualization interface to build charts:
Drag fields from your query results to chart configuration areas (X-axis, Y-axis, filters, colors)
Choose appropriate visualization types based on your data characteristics and analysis objectives
Apply filters and styling to refine your visual presentation
Export completed visualizations in multiple formats (PNG, SVG, CSV, base64) for sharing and reporting
Detailed visualization guide: Complete step-by-step instructions for chart creation are covered in Create custom visualizations.
Query development tips
Expand the sections below to discover recommended practices for working with queries in SQL Configurator.
Data format requirements for telematics analysis
Your telematics data uses scaled integer storage that requires conversion:
| Data type | Storage format | Conversion required |
| --- | --- | --- |
| GPS coordinates | Scaled integers | Divide by 10,000,000 for decimal degrees |
| Speed values | Integer format | Divide by 100 for km/h |
| Timestamps | Two variants | Use device_time for events, platform_time for processing |
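You can sanity-check the conversion factors on literal values before applying them to real columns; for example (the sample values are made up):

```sql
-- Conversion check on sample literal values; no table access required
SELECT
    525200000 / 10000000.0 AS lat_deg,    -- 52.52 decimal degrees
    4532      / 100.0      AS speed_kmh;  -- 45.32 km/h
```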
Performance and data quality optimization
Essential practices for reliable analysis:
Apply time-based filtering: Reduces dataset size and improves response times with `WHERE device_time > now() - INTERVAL '7 days'`
Use indexed fields: Include `device_id` and `device_time` in WHERE clauses for optimal query performance
Validate data ranges: Filter coordinate and speed bounds to identify anomalous readings
Verify relationships: Cross-reference business data relationships to ensure joins produce expected results
Manage result sets: Add appropriate LIMIT clauses for exploratory queries to avoid performance issues
Handle data gaps: Expect normal variations like connectivity gaps during poor signal conditions
Expected Data Characteristics: Sensor readings require periodic calibration validation, and recent data may still be processing during real-time analysis.
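Taken together, the practices above might look like this in a single exploratory query. The `latitude` and `speed` columns, the example device id, and the range bounds are illustrative assumptions:

```sql
-- Exploratory query combining indexed filters, range validation, and a LIMIT
SELECT device_id, device_time,
       latitude / 10000000.0 AS lat_deg,
       speed    / 100.0      AS speed_kmh
FROM raw_telematics_data.tracking_data_core
WHERE device_id = 42                              -- indexed field (example id)
  AND device_time > now() - INTERVAL '24 hours'   -- indexed, time-based filter
  AND latitude BETWEEN -900000000 AND 900000000   -- plausible coordinate range (scaled)
  AND speed    BETWEEN 0 AND 25000                -- drop readings above 250 km/h
LIMIT 500;
```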
Cross-schema analysis patterns
Combine organizational and tracking data for comprehensive insights:
Business-telematics integration: Join using `device_id` as the primary relationship key between schemas
Employee-vehicle correlation: Connect through objects table relationships for productivity analysis
Sensor interpretation: Use description_parameters reference table to translate coded values to readable labels
Geographic analysis: Combine tracking coordinates with zone definitions for operational insights
Example: Complete fleet overview with LEFT JOIN
When analyzing fleet operations, you often need to see all vehicles regardless of their current activity status. This example demonstrates how LEFT JOIN preserves complete vehicle records even when tracking data or driver assignments are missing.
Key insight: LEFT JOIN ensures all vehicles appear in results, even without recent tracking or driver assignments.
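A sketch of such a query is shown below. Only `vehicles`, `tracking_data_core`, `device_id`, and `device_time` are named in this guide; the assignment and employee table and column names are hypothetical placeholders for your actual relationship tables:

```sql
-- All vehicles, with the latest tracking point and driver when available;
-- LEFT JOINs keep vehicles that have neither (NULLs in those columns).
-- Table/column names beyond those mentioned in this guide are illustrative.
SELECT
    v.id            AS vehicle_id,
    v.name          AS vehicle_name,
    e.name          AS driver_name,     -- NULL when no driver is assigned
    t.device_time   AS last_seen,       -- NULL without recent tracking
    t.speed / 100.0 AS last_speed_kmh
FROM raw_business_data.vehicles v
LEFT JOIN raw_business_data.vehicle_driver_assignments a
       ON a.vehicle_id = v.id
LEFT JOIN raw_business_data.employees e
       ON e.id = a.driver_id
LEFT JOIN LATERAL (
    SELECT tc.device_time, tc.speed
    FROM raw_telematics_data.tracking_data_core tc
    WHERE tc.device_id = v.device_id
      AND tc.device_time > now() - INTERVAL '24 hours'
    ORDER BY tc.device_time DESC
    LIMIT 1
) t ON true;
```

The `LEFT JOIN LATERAL` subquery picks only the newest tracking row per vehicle, so the result stays at one row per vehicle-driver pair instead of multiplying by every GPS point.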
Query examples: Complete use-case-specific patterns are available in the SQL Recipe Book.
Next steps
Create custom visualizations - Complete process for creating charts and visual analysis
SQL Recipe Book - Advanced query patterns organized by analytical scenario
Bronze layer documentation - Complete database schema reference and field definitions
Production analytics: For enterprise-scale reporting and dashboards, consider dedicated BI tools that connect directly to your PTL instance for enhanced scalability and collaboration features. Learn more in Selecting BI tools.