# Plugin Data Sources

Learn how to configure different data sources for your plugin.

## Overview

Data sources determine where a plugin gets its data and what format it returns. NetAlertX supports multiple data source types, each suited to different use cases.
| Data Source | Type | Purpose | Returns | Example |
|---|---|---|---|---|
| `script` | Code Execution | Execute Linux commands or Python scripts | Pipeline | Scan network, collect metrics, call APIs |
| `app-db-query` | Database Query | Query the NetAlertX database | Result set | Show devices, open ports, recent events |
| `sqlite-db-query` | External DB | Query external SQLite databases | Result set | PiHole database, external logs |
| `template` | Template | Generate values from templates | Values | Initialize default settings |
| `plugin_type` | Metadata | Declare plugin category | Metadata | Classify plugin (scanner, publisher, etc.) |
## Data Source: script

Execute any Linux command or Python script and capture its output.

### Configuration

```json
{
  "data_source": "script",
  "show_ui": true,
  "mapped_to_table": "Plugins_Objects"
}
```
### How It Works

1. The command specified in the `CMD` setting is executed
2. The script writes results to `/tmp/log/plugins/last_result.<PREFIX>.log`
3. The core reads the file and parses the pipe-delimited results
4. Results are inserted into the database
### Example: Simple Python Script

```json
{
  "function": "CMD",
  "type": {"dataType": "string", "elements": [{"elementType": "input", "elementOptions": [], "transformers": []}]},
  "default_value": "python3 /app/front/plugins/my_plugin/script.py",
  "localized": ["name"],
  "name": [{"language_code": "en_us", "string": "Command"}]
}
```
### Example: Bash Script

```json
{
  "function": "CMD",
  "default_value": "bash /app/front/plugins/my_plugin/script.sh",
  "localized": ["name"],
  "name": [{"language_code": "en_us", "string": "Command"}]
}
```
### Best Practices

- Always use absolute paths (e.g., `/app/front/plugins/...`)
- Use `plugin_helper.py` for output formatting
- Add timeouts via the `RUN_TIMEOUT` setting (default: 60s)
- Log errors to `/tmp/log/plugins/*.log`
- Handle missing dependencies gracefully
### Output Format

- Must write to: `/tmp/log/plugins/last_result.<PREFIX>.log`
- Format: pipe-delimited, 9 or 13 columns
- See the Plugin Data Contract for the exact format
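A minimal script following this output format might look like the sketch below. The `MYPLUGIN` prefix, the helper name, and the sample row are invented for illustration; the 9-column layout (`Object_PrimaryID`, `Object_SecondaryID`, `DateTime`, `Watched_Value1`-`4`, `Extra`, `ForeignKey`) mirrors the SQL examples later in this document.

```python
from datetime import datetime

PREFIX = "MYPLUGIN"  # hypothetical plugin prefix
RESULT_FILE = f"/tmp/log/plugins/last_result.{PREFIX}.log"

def write_results(rows, path=RESULT_FILE):
    """Write one pipe-delimited, 9-column line per discovered object."""
    lines = []
    for r in rows:
        if len(r) != 9:
            raise ValueError(f"expected 9 columns, got {len(r)}")
        # None becomes an empty field, matching the pipe-delimited contract
        lines.append("|".join("" if v is None else str(v) for v in r))
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

# Columns: PrimaryID, SecondaryID, DateTime, Watched_Value1..4, Extra, ForeignKey
row = ("192.168.1.10", "web-server",
       datetime.now().strftime("%Y-%m-%d %H:%M:%S"),
       "http", "open", None, None, "", "aa:bb:cc:dd:ee:ff")
```

In a real plugin you would call `write_results` at the end of your scan logic, after collecting one tuple per discovered object.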
## Data Source: app-db-query

Query the NetAlertX SQLite database and display the results.

### Configuration

```json
{
  "data_source": "app-db-query",
  "show_ui": true,
  "mapped_to_table": "Plugins_Objects"
}
```
### How It Works

1. The SQL query specified in the `CMD` setting is executed against `app.db`
2. Results are parsed according to the column definitions
3. Results are inserted into the plugin display/database
### SQL Query Requirements

- Must return exactly 9 or 13 columns matching the data contract
- Column aliases must match the contract, and column order matters
- Must be valid SQLite SQL (not vendor-specific)
### Example: Open Ports from Nmap

```json
{
  "function": "CMD",
  "type": {"dataType": "string", "elements": [{"elementType": "input", "elementOptions": [], "transformers": []}]},
  "default_value": "SELECT dv.devName as Object_PrimaryID, cast(dv.devLastIP as VARCHAR(100)) || ':' || cast(SUBSTR(ns.Port, 0, INSTR(ns.Port, '/')) as VARCHAR(100)) as Object_SecondaryID, datetime() as DateTime, ns.Service as Watched_Value1, ns.State as Watched_Value2, null as Watched_Value3, null as Watched_Value4, ns.Extra as Extra, dv.devMac as ForeignKey FROM (SELECT * FROM Nmap_Scan) ns LEFT JOIN (SELECT devName, devMac, devLastIP FROM Devices) dv ON ns.MAC = dv.devMac",
  "localized": ["name"],
  "name": [{"language_code": "en_us", "string": "SQL to run"}],
  "description": [{"language_code": "en_us", "string": "This SQL query populates the plugin table"}]
}
```
### Example: Recent Device Events

```sql
SELECT
    e.EventValue as Object_PrimaryID,
    d.devName as Object_SecondaryID,
    e.EventDateTime as DateTime,
    e.EventType as Watched_Value1,
    d.devLastIP as Watched_Value2,
    null as Watched_Value3,
    null as Watched_Value4,
    e.EventDetails as Extra,
    d.devMac as ForeignKey
FROM
    Events e
LEFT JOIN
    Devices d ON e.DeviceMac = d.devMac
WHERE
    e.EventDateTime > datetime('now', '-24 hours')
ORDER BY
    e.EventDateTime DESC
```
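Before wiring a query like this into your plugin configuration, you can sanity-check that it returns exactly the 9 contract columns. The sketch below runs a structurally similar query against a throwaway in-memory database; the table and column names follow the schema listed under Common Columns, while the sample rows are invented.

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Stand-in tables with the columns the query touches
con.executescript("""
CREATE TABLE Events (EventType TEXT, EventValue TEXT, EventDateTime TEXT,
                     DeviceMac TEXT, EventDetails TEXT);
CREATE TABLE Devices (devMac TEXT, devName TEXT, devLastIP TEXT);
INSERT INTO Events VALUES ('Connected', 'up', datetime('now'), 'aa:bb', '');
INSERT INTO Devices VALUES ('aa:bb', 'laptop', '192.168.1.5');
""")

cur = con.execute("""
SELECT e.EventValue    AS Object_PrimaryID,
       d.devName       AS Object_SecondaryID,
       e.EventDateTime AS DateTime,
       e.EventType     AS Watched_Value1,
       d.devLastIP     AS Watched_Value2,
       null            AS Watched_Value3,
       null            AS Watched_Value4,
       e.EventDetails  AS Extra,
       d.devMac        AS ForeignKey
FROM Events e LEFT JOIN Devices d ON e.DeviceMac = d.devMac
""")
# cursor.description exposes the column aliases the query produces
cols = [c[0] for c in cur.description]
```

If `cols` has the wrong length or wrong aliases, the query will not satisfy the data contract.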
### Common Columns

**Devices Table:**

- `devMac` - Device MAC address
- `devName` - Device name
- `devLastIP` - Last known IP
- `devType` - Device type
- `devStatus` - Online/offline status
- `devVendor` - Hardware vendor

**CurrentScan Table:**

- `cur_MAC` - MAC address
- `cur_IP` - IP address
- `cur_ScanMethod` - How the device was discovered

**Nmap_Scan Table:**

- `MAC` - Device MAC
- `Port` - Port number
- `Service` - Service name
- `State` - open/closed/filtered
- `Extra` - Additional notes

**Events Table:**

- `EventType` - Event type
- `EventValue` - Event value
- `EventDateTime` - When the event occurred
- `DeviceMac` - Associated device
## Data Source: sqlite-db-query

Query an external SQLite database mounted in the container.

### Configuration

First, define the database path in a setting:

```json
{
  "function": "DB_PATH",
  "type": {"dataType": "string", "elements": [{"elementType": "input", "elementOptions": [], "transformers": []}]},
  "default_value": "/etc/pihole/pihole-FTL.db",
  "localized": ["name"],
  "name": [{"language_code": "en_us", "string": "Database path"}],
  "description": [{"language_code": "en_us", "string": "Path to external SQLite database"}]
}
```
Then set the data source and query:

```json
{
  "data_source": "sqlite-db-query",
  "show_ui": true
}
```
### How It Works

1. The external database file path is specified in the `DB_PATH` setting
2. The database is mounted at that path (e.g., via a Docker volume)
3. SQL queries reference the external database using the `EXTERNAL_<PREFIX>.` prefix
4. Results are returned in the standard format
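The `EXTERNAL_<PREFIX>.` prefix maps naturally onto SQLite's `ATTACH DATABASE` mechanism, which makes a second database file queryable under a schema alias. The sketch below illustrates the mechanism only; in NetAlertX the attach is performed by the core, not by your plugin, and the file path, table, and data here are invented.

```python
import sqlite3, tempfile, os

# Create a stand-in "external" database file
ext_path = os.path.join(tempfile.mkdtemp(), "external.db")
ext = sqlite3.connect(ext_path)
ext.execute("CREATE TABLE network (hwaddr TEXT, macVendor TEXT)")
ext.execute("INSERT INTO network VALUES ('aa:bb:cc:dd:ee:ff', 'ExampleVendor')")
ext.commit()
ext.close()

# The core attaches the file under an EXTERNAL_<PREFIX> alias...
con = sqlite3.connect(":memory:")
con.execute("ATTACH DATABASE ? AS EXTERNAL_CUSTOM", (ext_path,))

# ...so plugin SQL can reference EXTERNAL_CUSTOM.<table>
rows = con.execute(
    "SELECT hwaddr, macVendor FROM EXTERNAL_CUSTOM.network").fetchall()
```

This is why every external table reference in your query must carry the prefix: unqualified table names resolve against the main (NetAlertX) database.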
### SQL Query Example: PiHole Data

```json
{
  "function": "CMD",
  "default_value": "SELECT hwaddr as Object_PrimaryID, cast('http://' || (SELECT ip FROM EXTERNAL_PIHOLE.network_addresses WHERE network_id = id ORDER BY lastseen DESC LIMIT 1) as VARCHAR(100)) || ':' || cast(SUBSTR((SELECT name FROM EXTERNAL_PIHOLE.network_addresses WHERE network_id = id ORDER BY lastseen DESC LIMIT 1), 0, INSTR((SELECT name FROM EXTERNAL_PIHOLE.network_addresses WHERE network_id = id ORDER BY lastseen DESC LIMIT 1), '/')) as VARCHAR(100)) as Object_SecondaryID, datetime() as DateTime, macVendor as Watched_Value1, lastQuery as Watched_Value2, (SELECT name FROM EXTERNAL_PIHOLE.network_addresses WHERE network_id = id ORDER BY lastseen DESC LIMIT 1) as Watched_Value3, null as Watched_Value4, '' as Extra, hwaddr as ForeignKey FROM EXTERNAL_PIHOLE.network WHERE hwaddr NOT LIKE 'ip-%' AND hwaddr <> '00:00:00:00:00:00'",
  "localized": ["name"],
  "name": [{"language_code": "en_us", "string": "SQL to run"}]
}
```
### Key Points

- Prefix all external tables with `EXTERNAL_<PREFIX>.`
- For PiHole (`PIHOLE` prefix): `EXTERNAL_PIHOLE.network`
- For a custom database (`CUSTOM` prefix): `EXTERNAL_CUSTOM.my_table`
- The database must be valid SQLite
- The path must be accessible inside the container
- No columns beyond the data contract are required
### Docker Volume Setup

To mount the external database in docker-compose:

```yaml
services:
  netalertx:
    volumes:
      - /path/on/host/pihole-FTL.db:/etc/pihole/pihole-FTL.db:ro
```
## Data Source: template

Generate values from a template. Usually used for initialization and default settings.

### Configuration

```json
{
  "data_source": "template"
}
```
### How It Works

- Not widely used in standard plugins
- Used internally for generating default values
- Check the `newdev_template` plugin for an implementation example
### Example: Default Device Template

```json
{
  "function": "DEFAULT_DEVICE_PROPERTIES",
  "type": {"dataType": "string", "elements": [{"elementType": "textarea", "elementOptions": [], "transformers": []}]},
  "default_value": "type=Unknown|vendor=Unknown",
  "localized": ["name"],
  "name": [{"language_code": "en_us", "string": "Default properties"}]
}
```
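A pipe-delimited `default_value` like the one above splits cleanly into key/value pairs. The helper below is illustrative only (it is not NetAlertX code), but it shows how such a template string could be consumed:

```python
def parse_template(value: str) -> dict:
    """Split a 'key=val|key=val' template string into a dict."""
    props = {}
    for pair in value.split("|"):
        if "=" in pair:
            key, _, val = pair.partition("=")
            props[key.strip()] = val.strip()
    return props

# Same template string as the default_value in the example above
defaults = parse_template("type=Unknown|vendor=Unknown")
```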
## Data Source: plugin_type

Declare the plugin category. Controls where the plugin's settings appear in the UI.

### Configuration

```json
{
  "data_source": "plugin_type",
  "value": "scanner"
}
```
### Supported Values

| Value | Section | Purpose |
|---|---|---|
| `scanner` | Device Scanners | Discovers devices on the network |
| `system` | System Plugins | Core system functionality |
| `publisher` | Notification/Alert Plugins | Sends notifications/alerts |
| `importer` | Data Importers | Imports devices from external sources |
| `other` | Other Plugins | Miscellaneous functionality |
### Example

```json
{
  "settings": [
    {
      "function": "plugin_type",
      "type": {"dataType": "string", "elements": []},
      "default_value": "scanner",
      "options": ["scanner"],
      "data_source": "plugin_type",
      "value": "scanner",
      "localized": []
    }
  ]
}
```
## Execution Order

Control plugin execution priority. Higher-priority plugins run first.

```json
{
  "execution_order": "Layer_0"
}
```

### Levels (highest to lowest priority)

- `Layer_0` - Highest priority (runs first)
- `Layer_1`
- `Layer_2`
- ...
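Conceptually, the scheduler can be thought of as sorting plugins by layer number and running lower-numbered layers first. The sketch below is a simplified model of that ordering, not the actual NetAlertX scheduler; the plugin entries are invented.

```python
plugins = [
    {"unique_prefix": "PUBLISH", "execution_order": "Layer_2"},
    {"unique_prefix": "DEVSCAN", "execution_order": "Layer_0"},
    {"unique_prefix": "ENRICH",  "execution_order": "Layer_1"},
]

def layer_number(plugin: dict) -> int:
    # "Layer_0" -> 0; a missing value sorts last (lowest priority)
    order = plugin.get("execution_order", "Layer_999")
    return int(order.rsplit("_", 1)[1])

# Lower layer number = higher priority = runs earlier
run_order = [p["unique_prefix"] for p in sorted(plugins, key=layer_number)]
```

This ordering is why scanners that populate `CurrentScan` typically sit in `Layer_0`, so later layers can consume their results.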
### Example: Device Discovery

```json
{
  "code_name": "device_scanner",
  "unique_prefix": "DEVSCAN",
  "execution_order": "Layer_0",
  "data_source": "script",
  "mapped_to_table": "CurrentScan"
}
```
## Performance Considerations

### Script Source

- Pros: Flexible; can call external tools
- Cons: Slower than database queries
- Optimization: Add caching, use timeouts
- Default timeout: 60 seconds (configurable via the `RUN_TIMEOUT` setting)
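A timeout guard around a plugin command might look like the following sketch using Python's `subprocess` module. The 60-second default mirrors the `RUN_TIMEOUT` behaviour described above; the wrapper function itself is illustrative, not NetAlertX code.

```python
import subprocess

def run_with_timeout(cmd, timeout_s: int = 60) -> str:
    """Run a command, raising if it exceeds the timeout."""
    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, timeout=timeout_s)
    except subprocess.TimeoutExpired:
        # A hung scanner should fail loudly rather than stall the pipeline
        raise RuntimeError(f"{cmd[0]} exceeded {timeout_s}s timeout")
    return result.stdout

out = run_with_timeout(["echo", "scan complete"], timeout_s=5)
```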
### Database Query Source

- Pros: Fast; benefits from SQLite's native query optimization
- Cons: Limited to SQL capabilities
- Optimization: Use indexes, avoid complex joins
- Typical runtime: near-instant
### External DB Query Source

- Pros: Direct access to external data
- Cons: File I/O latency; depends on the external database being available
- Optimization: Use indexes in the external DB, write selective queries
- Timeout: Depends on database response time
## Debugging Data Sources

### Check Script Output

```bash
# Run the script manually
python3 /app/front/plugins/my_plugin/script.py

# Check the result file
cat /tmp/log/plugins/last_result.MYPREFIX.log
```
### Test SQL Query

```bash
# Connect to the app database
sqlite3 /data/db/app.db

# Run the query
sqlite> SELECT ... ;
```
### Monitor Execution

```bash
# Watch backend logs
tail -f /tmp/log/stdout.log | grep -i "data_source\|MYPREFIX"
```
## See Also

- Plugin Data Contract - Output format specification
- Plugin Settings System - How to define settings
- Quick Start Guide - Create your first plugin
- Database Schema - Available tables and columns
- Debugging Plugins - Troubleshooting data issues