informix-db/tests/conftest.py
Ryan Malloy a1bd52788d Phase 2: SELECT works end-to-end — pure-Python Informix fully reads data
    cursor.execute("SELECT 1 FROM systables WHERE tabid = 1")
    assert cursor.fetchone() == (1,)

To my knowledge, this is the first time a pure-Python implementation
has read data from Informix without wrapping IBM's CSDK or JDBC.

Three breakthroughs in this commit:

1. Login PDU's database field is BROKEN. Passing a database name there
   makes the server reject subsequent SQ_DBOPEN with sqlcode -759
   ("database not available"). JDBC always sends NULL in the login
   PDU's database slot — we now do the same. The user-supplied database
   opens via SQ_DBOPEN in _init_session.

2. Post-login session init dance: SQ_PROTOCOLS (8-byte feature mask
   replayed verbatim from JDBC) → SQ_INFO with INFO_ENV + env vars
   (48-byte PDU replayed verbatim — DBTEMP=/tmp, SUBQCACHESZ=10) →
   SQ_DBOPEN. Without all three steps in this exact order, the server
   silently ignores SELECTs.

3. SQ_DESCRIBE per-column block has 10 fields per column (not the
   simple "name + type" my best-effort parser assumed): fieldIndex,
   columnStartPos, columnType, columnExtendedId, ownerName,
   extendedName, reference, alignment, sourceType, encodedLength.
   The string table at the end is offset-indexed (fieldIndex points
   into it), which is how JDBC handles disambiguation.
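The session-init dance in (2) can be sketched as follows. The tag names match the commit's SQ_* constants, but the helper names (`init_session`, `send_pdu`, `drain_to_eot`), the `FakeConn` recorder, and the zeroed payloads are illustrative placeholders — the real driver replays exact byte captures from JDBC and uses `_init_session`/`_drain_to_eot` in `connections.py`:

```python
# Sketch of the three-step post-login dance. Payloads are placeholder
# zeros; the real 8-byte mask and 48-byte env PDU are JDBC captures.
SQ_PROTOCOLS, SQ_INFO, SQ_DBOPEN = "SQ_PROTOCOLS", "SQ_INFO", "SQ_DBOPEN"


class FakeConn:
    """Records PDU tags instead of writing to a socket, for illustration."""

    def __init__(self):
        self.sent = []

    def send_pdu(self, tag, payload):
        self.sent.append(tag)

    def drain_to_eot(self):
        pass  # real driver reads replies until the EOT tag


def init_session(conn, database):
    # Order matters: skipping or reordering any step makes the server
    # silently ignore subsequent SELECTs.
    conn.send_pdu(SQ_PROTOCOLS, bytes(8))    # 1. 8-byte feature mask
    conn.drain_to_eot()
    conn.send_pdu(SQ_INFO, bytes(48))        # 2. INFO_ENV + env vars
    conn.drain_to_eot()
    conn.send_pdu(SQ_DBOPEN, database.encode() + b"\x00")  # 3. open DB
    conn.drain_to_eot()


conn = FakeConn()
init_session(conn, "sysmaster")
assert conn.sent == [SQ_PROTOCOLS, SQ_INFO, SQ_DBOPEN]
```

Note the login PDU itself carries `database=None` per (1); the database name only ever travels in step 3.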
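A sketch of the per-column parse from (3), assuming each 10-field block is ten big-endian 16-bit values followed by the offset-indexed string table. The field widths and the trimmed-down `ColumnInfo` are assumptions for illustration; the real parser follows JDBC's receiveDescribe in USVER mode:

```python
import struct
from dataclasses import dataclass

# The ten per-column fields, in the order the commit lists them.
COLUMN_FIELDS = (
    "field_index", "column_start_pos", "column_type", "column_extended_id",
    "owner_name", "extended_name", "reference", "alignment",
    "source_type", "encoded_length",
)


@dataclass
class ColumnInfo:
    name: str
    type_code: int       # raw code preserved for null-flag extraction
    encoded_length: int


def parse_describe(block: bytes, ncols: int) -> list:
    """Parse ncols 10-field blocks, then resolve names via the string table."""
    raw, offset = [], 0
    for _ in range(ncols):
        fields = dict(zip(COLUMN_FIELDS, struct.unpack_from(">10h", block, offset)))
        raw.append(fields)
        offset += 20                      # 10 fields x 2 bytes (assumed width)
    table = block[offset:]                # offset-indexed string table
    cols = []
    for f in raw:
        start = f["field_index"]          # fieldIndex points into the table
        end = table.index(b"\x00", start)
        cols.append(ColumnInfo(table[start:end].decode(),
                               f["column_type"], f["encoded_length"]))
    return cols
```

The offset-indexed table is what lets two columns with the same display name stay distinguishable — each block carries its own byte offset rather than a positional index.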

Cursor lifecycle implementation in cursors.py mirrors JDBC exactly:
  PREPARE+NDESCRIBE+WANTDONE → DESCRIBE+DONE+COST+EOT
  CURNAME+NFETCH(4096) → TUPLE*+DONE+COST+EOT
  NFETCH(4096) → DONE+COST+EOT (drain)
  CLOSE → EOT
  RELEASE → EOT

Five round trips per SELECT — same as JDBC.
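The lifecycle above can be written down as a tag-level trace. Tag names are symbolic strings here (not the real wire values), and 4096 is the NFETCH buffer size noted above:

```python
FETCH_SIZE = 4096

# (request tags sent,                     expected reply tags)
ROUND_TRIPS = [
    (["PREPARE", "NDESCRIBE", "WANTDONE"], ["DESCRIBE", "DONE", "COST", "EOT"]),
    (["CURNAME", f"NFETCH({FETCH_SIZE})"], ["TUPLE*", "DONE", "COST", "EOT"]),
    ([f"NFETCH({FETCH_SIZE})"],            ["DONE", "COST", "EOT"]),  # drain
    (["CLOSE"],                            ["EOT"]),
    (["RELEASE"],                          ["EOT"]),
]

assert len(ROUND_TRIPS) == 5  # same round-trip count as JDBC
```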

Module changes:
  src/informix_db/connections.py — added _init_session(), _send_protocols(),
    _send_dbopen(), _drain_to_eot(), _raise_sq_err(); login PDU now
    forces database=None always; SQ_INFO PDU replayed verbatim from
    JDBC capture (offset-indexed env-var format too gnarly to derive
    in MVP).
  src/informix_db/cursors.py — full rewrite: real PDU builders for
    PREPARE/CURNAME+NFETCH/NFETCH/CLOSE/RELEASE; tag-dispatched
    response readers; cursor-name generator matching JDBC's "_ifxc"
    convention.
  src/informix_db/_resultset.py — proper SQ_DESCRIBE parser per
    JDBC's receiveDescribe (USVER mode); offset-indexed string table
    with name lookup by fieldIndex; ColumnInfo dataclass with raw
    type-code preserved for null-flag extraction.
  src/informix_db/_messages.py — added SQ_NDESCRIBE=22, SQ_WANTDONE=49.

Test coverage: 40 unit + 15 integration tests (7 smoke + 8 new SELECT)
= 55 total, all green, ruff clean. New tests cover:
  - SELECT 1 returns (1,)
  - cursor.description shape per PEP 249
  - Multi-row INT SELECT
  - Multi-column mixed types (INT + FLOAT)
  - Iterator protocol (for row in cursor)
  - fetchmany(n)
  - Re-executing on same cursor resets state
  - Two cursors on one connection (sequential)

Known gap: VARCHAR row decoding doesn't yet handle the variable-width
on-wire encoding correctly. Phase 2.x will address this; for now the
raw bytes surface in the row tuple rather than a decoded string.
2026-05-03 15:37:10 -06:00


"""Shared pytest fixtures.

The integration tests need a running Informix server on ``localhost:9088``
with the default Developer Edition credentials (``informix`` / ``in4mix``
on the ``sysmaster`` database). Run::

    docker compose -f tests/docker-compose.yml up -d

before the integration suite. See ``docs/DECISION_LOG.md`` for the
pinned image digest and credential rationale.

Connection parameters are overridable via env vars
(``IFX_HOST`` / ``IFX_PORT`` / ``IFX_USER`` / ``IFX_PASSWORD`` /
``IFX_DATABASE`` / ``IFX_SERVER``) so the same suite runs against a
non-default instance if you have one.
"""
from __future__ import annotations

import os
import socket
from collections.abc import Iterator
from typing import NamedTuple

import pytest


class ConnParams(NamedTuple):
    host: str
    port: int
    user: str
    password: str
    database: str
    server: str


@pytest.fixture(scope="session")
def conn_params() -> ConnParams:
    """Connection parameters for the test Informix instance."""
    return ConnParams(
        host=os.environ.get("IFX_HOST", "127.0.0.1"),
        port=int(os.environ.get("IFX_PORT", "9088")),
        user=os.environ.get("IFX_USER", "informix"),
        password=os.environ.get("IFX_PASSWORD", "in4mix"),
        database=os.environ.get("IFX_DATABASE", "sysmaster"),
        server=os.environ.get("IFX_SERVER", "informix"),
    )


@pytest.fixture(scope="session", autouse=True)
def _check_integration_server(
    request: pytest.FixtureRequest, conn_params: ConnParams
) -> None:
    """Skip integration tests cleanly when the server isn't up.

    Avoids a sea of ``ConnectionRefusedError`` failures masking real bugs
    when the user runs the full suite without ``docker compose up``.
    """
    if "integration" not in request.config.getoption("-m", default=""):
        return
    try:
        with socket.create_connection((conn_params.host, conn_params.port), timeout=2.0):
            return
    except OSError as e:
        pytest.skip(
            f"Informix server not reachable at {conn_params.host}:{conn_params.port} ({e}). "
            f"Start it with: docker compose -f tests/docker-compose.yml up -d"
        )


@pytest.fixture
def ifx_connection(conn_params: ConnParams) -> Iterator[object]:
    """Open a fresh Informix connection for one test, then close it."""
    import informix_db

    conn = informix_db.connect(
        host=conn_params.host,
        port=conn_params.port,
        user=conn_params.user,
        password=conn_params.password,
        database=conn_params.database,
        server=conn_params.server,
        connect_timeout=10.0,
        read_timeout=10.0,
    )
    try:
        yield conn
    finally:
        conn.close()