HA Phase C: services + diagnostics + README polish

- custom_components/omni_pca/services.yaml — declares 7 services with config_entry selectors so HA's UI gives users a panel picker: bypass_zone, restore_zone, execute_program, show_message, clear_message, acknowledge_alerts, send_command (raw escape hatch).
- custom_components/omni_pca/services.py — async handlers wired via async_setup_services on entry setup; idempotent across multiple entries. Each handler validates entry_id, looks up the right coordinator, and calls the matching OmniClient method. CommandFailedError is wrapped to HomeAssistantError; unknown Command codes raise ServiceValidationError. async_unload_services removes the services when the last entry unloads.
- custom_components/omni_pca/diagnostics.py — async_get_config_entry_diagnostics dumps a redacted snapshot for bug reports: panel model + firmware, discovered/live counts per object type, sha256-hashed zone/unit/area names (so uniqueness is visible without leaking PII), last event class, controller key REDACTED via async_redact_data.
- custom_components/omni_pca/__init__.py — wires async_setup_services on entry setup and async_unload_services on the last entry unload.
- custom_components/omni_pca/README.md — full entity table, service list, example automation, troubleshooting section, link to JOURNEY.md.
- Top-level README — entity rundown updated to reflect the full v1.0 surface (was: "binary_sensor for zones").

331 tests still pass; ruff clean across src/, tests/, and custom_components/. hacs.json already in place from the initial scaffold.
This commit is contained in:
parent 57b8aa4b04
commit 83d85a9885
@ -49,6 +49,8 @@ Get the ControllerKey from your `.pca` file using the included parser:

uvx --from omni-pca omni-pca decode-pca path/to/Your.pca --field controller_key
```

The integration creates one HA device per panel plus typed entities for every named object on the controller: `alarm_control_panel` for areas, `light` for units, `binary_sensor`/`switch` for zones (state + bypass), `climate` for thermostats, `sensor` for analog zones and panel telemetry, `button` for panel macros, and `event` for the typed push-notification stream. See [`custom_components/omni_pca/README.md`](custom_components/omni_pca/README.md) for the entity table and service list.

## Without a panel — mock controller

For testing, the library ships a minimal Omni controller emulator:

@ -30,8 +30,7 @@ Copy the `custom_components/omni_pca/` directory into your HA

- **Host** — IP or hostname of the panel (e.g. `192.168.1.50`)
- **Port** — defaults to `4369` (HAI's reserved port)
- **Controller Key** — 32 hex characters, the panel's NVRAM key
3. Save. The panel's model and firmware appear as a single device, with one
   `binary_sensor` per defined zone.
3. Save. The panel appears as a single device with entities per object.

### Where do I get the Controller Key?

@ -45,23 +44,83 @@ uvx omni-pca decode-pca '/path/to/My House.pca' --field controller_key

Otherwise, find it in PC Access under the panel's **Setup → Misc → Network**
page (HAI labels it "Encryption Key 1").

## What you get
## Entities created

- One **device** per panel — model + firmware reported in the UI.
- One **`binary_sensor`** per defined zone, named from the panel's own
  zone-name field. `OPENING` device class for door/window contacts,
  `MOTION` for interior PIRs, `SMOKE` for fire zones, etc., chosen by zone
  type when the panel reports one.
- **Push updates**: zone state changes propagate within a single round-trip
  thanks to unsolicited-message subscription. The 30-second poll is just a
  safety net.
One device per panel, plus per-object entities below.

## Roadmap

| Platform | Entity | Per |
|---|---|---|
| `alarm_control_panel` | Area arm/disarm with code | discovered area |
| `binary_sensor` | Zone open/tripped | binary zone |
| `binary_sensor` | Zone bypassed (diagnostic) | binary zone |
| `binary_sensor` | AC power, backup battery, system trouble | panel |
| `button` | Panel button macro | discovered button |
| `climate` | Thermostat (heat/cool/auto, fan, hold) | discovered thermostat |
| `event` | Typed push event relay | panel |
| `light` | Unit on/off + brightness | discovered unit |
| `sensor` | Analog zone (temp/humidity/power) | analog zone |
| `sensor` | Thermostat current temp / humidity / outdoor temp | thermostat |
| `sensor` | Panel model + firmware, last event class | panel |
| `switch` | Zone bypass toggle | binary zone |

- Areas → `alarm_control_panel` entities
- Units → `light` / `switch` entities
- Thermostats → `climate`
- Aux sensors → `sensor`

State propagates via the panel's unsolicited push messages: zone changes,
arming changes, AC/battery troubles, etc. all arrive within one TCP
round-trip. A 30-second background poll backstops anything that didn't push.

## Services

| Service | Purpose |
|---|---|
| `omni_pca.bypass_zone` | Bypass a zone by 1-based index |
| `omni_pca.restore_zone` | Restore a previously-bypassed zone |
| `omni_pca.execute_program` | Run a stored program by index |
| `omni_pca.show_message` | Display a stored message on consoles |
| `omni_pca.clear_message` | Clear a displayed message |
| `omni_pca.acknowledge_alerts` | Clear all outstanding troubles/alerts |
| `omni_pca.send_command` | Power-user escape hatch (raw Command opcode) |

Every service takes an `entry_id` so it picks the right panel when you have
multiple configured.

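Outside an automation, the same services can be invoked from a script or from Developer Tools → Services. A minimal sketch of a service call follows; the `entry_id` value below is a placeholder, since in the UI the `config_entry` selector picks the panel for you:

```yaml
# Hypothetical script action; substitute your panel's real config entry ID.
service: omni_pca.bypass_zone
data:
  entry_id: 0123456789abcdef0123456789abcdef
  zone_index: 5
```
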
## Automation example

React to any alarm activation in real time:

```yaml
automation:
  - alias: Notify on alarm
    trigger:
      - platform: event
        event_type: state_changed
        event_data:
          entity_id: event.panel_events
    condition: >
      {{ trigger.event.data.new_state.attributes.event_type ==
      "alarm_activated" }}
    action:
      - service: notify.mobile_app
        data:
          title: ALARM
          message: >
            Area {{ trigger.event.data.new_state.attributes.area_index }}
```

## Diagnostics

Settings → Devices & Services → *HAI/Leviton Omni Panel* → ⋮ → **Download
diagnostics** dumps a redacted snapshot (controller key removed, zone names
hashed) — useful for bug reports.

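The name-hashing scheme (mirroring `_hash_name` in `diagnostics.py`) is a truncated SHA-256 digest, so two identical hashes in a report mean two identical names on the panel. A standalone sketch:

```python
import hashlib

def hash_name(name: str) -> str:
    # Mirrors diagnostics.py's _hash_name: "n_" plus the first 12 hex
    # characters of the SHA-256 digest. Identical names hash identically,
    # so duplicate hashes reveal duplicate panel names without exposing
    # the names themselves.
    return "n_" + hashlib.sha256(name.encode("utf-8", errors="ignore")).hexdigest()[:12]

# Same configured name -> same hash; distinct names collide only with
# a negligible (48-bit) probability.
print(hash_name("Front Door"), hash_name("Back Door"))
```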
## Troubleshooting

- **Won't connect**: confirm port 4369 is open on the panel. The Omni Pro
  II's network module ships *off* by default; enable it under Setup → Misc
  → Network on a console.
- **Authentication failed**: re-check the Controller Key. The integration
  triggers HA's reauth flow when the panel rejects the key.
- **No entities for X**: only objects with a name configured on the panel
  are discovered. PC Access's "Names" page is where they live.

See the [parent README](https://github.com/rsp2k/omni-pca) for protocol /
library details.
library details. Detailed reverse-engineering notes are in
[`docs/JOURNEY.md`](https://github.com/rsp2k/omni-pca/blob/main/docs/JOURNEY.md).

@ -16,6 +16,7 @@ from homeassistant.exceptions import ConfigEntryNotReady

from .const import CONF_CONTROLLER_KEY, DOMAIN, LOGGER
from .coordinator import OmniDataUpdateCoordinator
from .services import async_setup_services, async_unload_services

if TYPE_CHECKING:
    from homeassistant.config_entries import ConfigEntry
@ -67,6 +68,7 @@ async def async_setup_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:

    hass.data.setdefault(DOMAIN, {})[entry.entry_id] = coordinator
    await hass.config_entries.async_forward_entry_setups(entry, PLATFORMS)
    await async_setup_services(hass)
    return True


@ -81,4 +83,5 @@ async def async_unload_entry(hass: HomeAssistant, entry: ConfigEntry) -> bool:
    if unloaded:
        coordinator: OmniDataUpdateCoordinator = hass.data[DOMAIN].pop(entry.entry_id)
        await coordinator.async_shutdown()
        await async_unload_services(hass)
    return unloaded

85  custom_components/omni_pca/diagnostics.py  Normal file
@ -0,0 +1,85 @@
"""Diagnostics dump for an Omni panel config entry.

Captures a redacted snapshot of the coordinator's data so the user can
attach it to a bug report. Sensitive fields (controller key, PII in
device names) are stripped or hashed.
"""

from __future__ import annotations

import hashlib
from typing import TYPE_CHECKING, Any

from homeassistant.components.diagnostics import async_redact_data
from homeassistant.const import CONF_HOST, CONF_PORT

from .const import CONF_CONTROLLER_KEY, DOMAIN
from .coordinator import OmniDataUpdateCoordinator

if TYPE_CHECKING:
    from homeassistant.config_entries import ConfigEntry
    from homeassistant.core import HomeAssistant

REDACTED_KEYS = {CONF_CONTROLLER_KEY, "controller_key", "password", "code"}


def _hash_name(name: str) -> str:
    """Hash a panel-defined name so we can confirm uniqueness without leaking it."""
    return "n_" + hashlib.sha256(name.encode("utf-8", errors="ignore")).hexdigest()[:12]


async def async_get_config_entry_diagnostics(
    hass: HomeAssistant, entry: ConfigEntry
) -> dict[str, Any]:
    coordinator: OmniDataUpdateCoordinator = hass.data[DOMAIN][entry.entry_id]
    data = coordinator.data

    return {
        "entry": {
            "title": entry.title,
            "data": async_redact_data(dict(entry.data), REDACTED_KEYS),
            CONF_HOST: entry.data.get(CONF_HOST),
            CONF_PORT: entry.data.get(CONF_PORT),
        },
        "panel": (
            {
                "model_byte": data.system_info.model_byte,
                "model_name": data.system_info.model_name,
                "firmware_version": data.system_info.firmware_version,
            }
            if data and data.system_info
            else None
        ),
        "discovered_counts": {
            "zones": len(data.zones) if data else 0,
            "units": len(data.units) if data else 0,
            "areas": len(data.areas) if data else 0,
            "thermostats": len(data.thermostats) if data else 0,
            "buttons": len(data.buttons) if data else 0,
            "programs": len(data.programs) if data else 0,
        },
        "live_status_counts": {
            "zone_status": len(data.zone_status) if data else 0,
            "unit_status": len(data.unit_status) if data else 0,
            "area_status": len(data.area_status) if data else 0,
            "thermostat_status": len(data.thermostat_status) if data else 0,
        },
        "name_hashes": (
            {
                "zones": {idx: _hash_name(props.name) for idx, props in data.zones.items()},
                "units": {idx: _hash_name(props.name) for idx, props in data.units.items()},
                "areas": {idx: _hash_name(props.name) for idx, props in data.areas.items()},
            }
            if data
            else {}
        ),
        "last_event_class": (
            type(data.last_event).__name__ if data and data.last_event else None
        ),
        "last_update_success": coordinator.last_update_success,
        "update_interval_seconds": (
            coordinator.update_interval.total_seconds()
            if coordinator.update_interval
            else None
        ),
    }

185  custom_components/omni_pca/services.py  Normal file
@ -0,0 +1,185 @@
"""Service handlers for the omni_pca integration.

Services give the user a write-surface for things the entity layer
doesn't naturally expose: program execution (no Properties opcode for
Programs in v1.0), arbitrary panel messages, raw commands for power
users, and panel-wide alert acknowledgement.

All services route through the per-entry coordinator's ``OmniClient``;
each accepts an ``entry_id`` field so HA can pick the right panel when
multiple are configured.
"""

from __future__ import annotations

from typing import TYPE_CHECKING

import voluptuous as vol
from homeassistant.const import CONF_ENTRY_ID
from homeassistant.exceptions import HomeAssistantError, ServiceValidationError
from homeassistant.helpers import config_validation as cv

from omni_pca.commands import Command, CommandFailedError

from .const import DOMAIN, LOGGER
from .coordinator import OmniDataUpdateCoordinator

if TYPE_CHECKING:
    from homeassistant.core import HomeAssistant, ServiceCall

SERVICE_BYPASS_ZONE = "bypass_zone"
SERVICE_RESTORE_ZONE = "restore_zone"
SERVICE_EXECUTE_PROGRAM = "execute_program"
SERVICE_SHOW_MESSAGE = "show_message"
SERVICE_CLEAR_MESSAGE = "clear_message"
SERVICE_ACKNOWLEDGE_ALERTS = "acknowledge_alerts"
SERVICE_SEND_COMMAND = "send_command"

ATTR_ZONE_INDEX = "zone_index"
ATTR_PROGRAM_INDEX = "program_index"
ATTR_MESSAGE_INDEX = "message_index"
ATTR_COMMAND = "command"
ATTR_PARAM_1 = "parameter1"
ATTR_PARAM_2 = "parameter2"


_BASE_SCHEMA = vol.Schema({vol.Required(CONF_ENTRY_ID): cv.string})


def _zone_schema() -> vol.Schema:
    return _BASE_SCHEMA.extend(
        {vol.Required(ATTR_ZONE_INDEX): vol.All(int, vol.Range(min=1, max=0xFFFF))}
    )


def _program_schema() -> vol.Schema:
    return _BASE_SCHEMA.extend(
        {vol.Required(ATTR_PROGRAM_INDEX): vol.All(int, vol.Range(min=1, max=0xFFFF))}
    )


def _message_schema() -> vol.Schema:
    return _BASE_SCHEMA.extend(
        {vol.Required(ATTR_MESSAGE_INDEX): vol.All(int, vol.Range(min=1, max=0xFFFF))}
    )


def _command_schema() -> vol.Schema:
    return _BASE_SCHEMA.extend(
        {
            vol.Required(ATTR_COMMAND): vol.All(int, vol.Range(min=0, max=255)),
            vol.Optional(ATTR_PARAM_1, default=0): vol.All(
                int, vol.Range(min=0, max=255)
            ),
            vol.Optional(ATTR_PARAM_2, default=0): vol.All(
                int, vol.Range(min=0, max=0xFFFF)
            ),
        }
    )


def _coordinator_for(
    hass: HomeAssistant, call: ServiceCall
) -> OmniDataUpdateCoordinator:
    entry_id = call.data[CONF_ENTRY_ID]
    coordinators = hass.data.get(DOMAIN, {})
    if entry_id not in coordinators:
        raise ServiceValidationError(
            f"No Omni panel configured with entry_id {entry_id!r}"
        )
    return coordinators[entry_id]


async def _wrap(coro_factory) -> None:  # type: ignore[no-untyped-def]
    try:
        await coro_factory()
    except CommandFailedError as err:
        raise HomeAssistantError(f"Panel rejected command: {err}") from err


async def async_setup_services(hass: HomeAssistant) -> None:
    """Register all services for the integration. Idempotent."""

    if hass.services.has_service(DOMAIN, SERVICE_BYPASS_ZONE):
        return  # already registered (multiple entries reuse the same services)

    async def _bypass_zone(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        idx = int(call.data[ATTR_ZONE_INDEX])
        await _wrap(lambda: coord.client.bypass_zone(idx))

    async def _restore_zone(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        idx = int(call.data[ATTR_ZONE_INDEX])
        await _wrap(lambda: coord.client.restore_zone(idx))

    async def _execute_program(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        idx = int(call.data[ATTR_PROGRAM_INDEX])
        await _wrap(lambda: coord.client.execute_program(idx))

    async def _show_message(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        idx = int(call.data[ATTR_MESSAGE_INDEX])
        await _wrap(lambda: coord.client.show_message(idx))

    async def _clear_message(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        idx = int(call.data[ATTR_MESSAGE_INDEX])
        await _wrap(lambda: coord.client.clear_message(idx))

    async def _acknowledge_alerts(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        await _wrap(lambda: coord.client.acknowledge_alerts())

    async def _send_command(call: ServiceCall) -> None:
        coord = _coordinator_for(hass, call)
        cmd_byte = int(call.data[ATTR_COMMAND])
        try:
            cmd = Command(cmd_byte)
        except ValueError as err:
            raise ServiceValidationError(
                f"Unknown Command code {cmd_byte}; see omni_pca.commands.Command"
            ) from err
        p1 = int(call.data[ATTR_PARAM_1])
        p2 = int(call.data[ATTR_PARAM_2])
        LOGGER.debug("send_command %s p1=%d p2=%d", cmd.name, p1, p2)
        await _wrap(lambda: coord.client.execute_command(cmd, p1, p2))

    hass.services.async_register(
        DOMAIN, SERVICE_BYPASS_ZONE, _bypass_zone, schema=_zone_schema()
    )
    hass.services.async_register(
        DOMAIN, SERVICE_RESTORE_ZONE, _restore_zone, schema=_zone_schema()
    )
    hass.services.async_register(
        DOMAIN, SERVICE_EXECUTE_PROGRAM, _execute_program, schema=_program_schema()
    )
    hass.services.async_register(
        DOMAIN, SERVICE_SHOW_MESSAGE, _show_message, schema=_message_schema()
    )
    hass.services.async_register(
        DOMAIN, SERVICE_CLEAR_MESSAGE, _clear_message, schema=_message_schema()
    )
    hass.services.async_register(
        DOMAIN, SERVICE_ACKNOWLEDGE_ALERTS, _acknowledge_alerts, schema=_BASE_SCHEMA
    )
    hass.services.async_register(
        DOMAIN, SERVICE_SEND_COMMAND, _send_command, schema=_command_schema()
    )


async def async_unload_services(hass: HomeAssistant) -> None:
    """Tear down services if no entries remain."""
    if hass.data.get(DOMAIN):
        return  # other entries still active
    for svc in (
        SERVICE_BYPASS_ZONE,
        SERVICE_RESTORE_ZONE,
        SERVICE_EXECUTE_PROGRAM,
        SERVICE_SHOW_MESSAGE,
        SERVICE_CLEAR_MESSAGE,
        SERVICE_ACKNOWLEDGE_ALERTS,
        SERVICE_SEND_COMMAND,
    ):
        hass.services.async_remove(DOMAIN, svc)

159  custom_components/omni_pca/services.yaml  Normal file
@ -0,0 +1,159 @@
bypass_zone:
  name: Bypass zone
  description: Bypass a single zone (panel ignores it until restored).
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca
    zone_index:
      name: Zone index
      description: 1-based zone number on the panel.
      required: true
      selector:
        number:
          min: 1
          max: 176
          mode: box

restore_zone:
  name: Restore zone
  description: Restore a previously-bypassed zone.
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca
    zone_index:
      name: Zone index
      description: 1-based zone number on the panel.
      required: true
      selector:
        number:
          min: 1
          max: 176
          mode: box

execute_program:
  name: Execute program
  description: Run a stored program on the panel by its 1-based index.
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca
    program_index:
      name: Program index
      description: 1-based program number on the panel.
      required: true
      selector:
        number:
          min: 1
          max: 1024
          mode: box

show_message:
  name: Show panel message
  description: Display a stored message on panel consoles by message index.
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca
    message_index:
      name: Message index
      description: 1-based stored message number.
      required: true
      selector:
        number:
          min: 1
          max: 128
          mode: box

clear_message:
  name: Clear panel message
  description: Clear the currently-displayed message on panel consoles.
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca
    message_index:
      name: Message index
      description: 1-based stored message number to clear.
      required: true
      selector:
        number:
          min: 1
          max: 128
          mode: box

acknowledge_alerts:
  name: Acknowledge alerts
  description: Acknowledge all outstanding alerts and trouble conditions on the panel.
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca

send_command:
  name: Send raw command
  description: >
    Send a raw Omni Command (opcode 20). Power-user escape hatch — see
    omni_pca.commands.Command for the full enumeration.
  fields:
    entry_id:
      name: Panel
      description: Config entry ID of the Omni panel.
      required: true
      selector:
        config_entry:
          integration: omni_pca
    command:
      name: Command code
      description: Numeric Command enum value (0-255).
      required: true
      selector:
        number:
          min: 0
          max: 255
          mode: box
    parameter1:
      name: Parameter 1
      description: First command parameter (single byte 0-255).
      required: false
      default: 0
      selector:
        number:
          min: 0
          max: 255
          mode: box
    parameter2:
      name: Parameter 2
      description: Second command parameter (BE uint16, 0-65535).
      required: false
      default: 0
      selector:
        number:
          min: 0
          max: 65535
          mode: box