Data management & telemetry integration

How data flows into the ALSO platform, and what we need from each customer to keep the analysis predictable and reliable.

This page combines the telemetry pipeline integration options and the data & documentation requirements into a single view for technical and operational stakeholders.

1. Submit an API or data endpoint

Use this form to propose a new API or data endpoint for ALSO to connect to. For now this is an internal coordination step only; in the live product, submitting the form will trigger a formal onboarding workflow.

2. Vessel data coverage by parameter

This table shows, for each vessel, whether Pascal has approved telemetry or documentation for the parameters in the original “Data and Documentation Request” sheet. The green rows mark the minimum data set we need to support the standard empirica.ALS analysis.

| Category | Parameter | LinkVC | Momentum | OceanZero | Cofounder |
| --- | --- | --- | --- | --- | --- |
| — | Tier level we can support today with the data provided | TBD | TBD | TBD | TBD |
| Navigation | STW | | | | |
| | Heading | | | | |
| | COG | | | | |
| | SOG | | | | |
| | Lat | | | | |
| | Lon | | | | |
| | Draft (fwd, mid, aft) | | | | |
| | Water depth | | | | |
| | Apparent wind speed (AWS) | | | | |
| | Apparent wind direction (AWD) | | | | |
| | Rudder angle 1 and 2 | | | | |
| | Heel/list | | | | |
| | Trim | | | | |
| | Roll | | | | |
| | Pitch | | | | |
| Machinery | Main engine power 1 and 2 | | | | |
| | Shaft power 1 and 2 | | | | |
| | RPM 1 and 2 | | | | |
| | Shaft torque 1 and 2 | | | | |
| | Shaft generator output 1 and 2 | | | | |
| | Combined aux. generator load | | | | |
| | ALS compressor power | | | | |
| | All pressure sensors in system | | | | |
| | All flowmeters in system | | | | |
| Documentation / Information | General arrangement | | | | |
| | Stability booklet | | | | |
| | EEDI report | | | | |
| | Propeller curve | | | | |
| | Propeller document (with diameter, pitch ratio, area ratio, etc.) | | | | |
| | Engine room arrangement or shaft arrangement | | | | |
| | ALS piping and instrumentation diagram | | | | |
| | ALS nozzle arrangement | | | | |
| | Shell expansion plan | | | | |
| | Midship section | | | | |
| | Navigation arrangement | | | | |
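As a sketch of how this coverage check could be automated, the snippet below tests one telemetry sample against a required parameter set. The parameter names and the sample record are illustrative assumptions, not the agreed ALSO schema; the real minimum set is the green rows in the sheet.

```python
# Sketch: report which required parameters are absent from one
# telemetry sample. Names below are hypothetical placeholders.
REQUIRED_NAVIGATION = {"stw", "heading", "cog", "sog", "lat", "lon"}
REQUIRED_MACHINERY = {"shaft_power_1", "shaft_power_2", "rpm_1", "rpm_2"}

def missing_parameters(sample: dict) -> set[str]:
    """Return the required parameters absent from one telemetry sample."""
    required = REQUIRED_NAVIGATION | REQUIRED_MACHINERY
    return required - set(sample)

sample = {"stw": 12.4, "heading": 181.0, "cog": 183.5, "sog": 12.1,
          "lat": 57.7, "lon": 11.9, "rpm_1": 88.0, "rpm_2": 87.5}
print(sorted(missing_parameters(sample)))
# → ['shaft_power_1', 'shaft_power_2']
```

A check like this could run on each new data drop, so gaps in a vessel's feed surface before they affect the analysis.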

3. Telemetry data pipeline options

ALSO needs a predictable data pipeline from each ship in order to calculate savings and recommendations. We support several integration patterns, listed here roughly in order of preference.

3.1 APIs (preferred)

The preferred option is for Pascal to read from one or more REST-style APIs where the telemetry is exposed. In this model:

  • Customer (or a third party) hosts the API and controls data storage behind it;
  • Pascal receives API documentation and credentials for connecting; and
  • Pascal configures which endpoints to query, how often, and with which filters.

This has several advantages: high flexibility in how often we fetch data, access to new data as soon as the API exposes it, and a one-time setup with little ongoing maintenance on either side.
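The polling model above can be sketched roughly as follows. The base URL, endpoint path, query parameters, and bearer-token auth are all assumptions for illustration; the real values come from the API documentation and credentials the customer provides.

```python
# Sketch of a polling client for a customer-hosted telemetry API.
# URL, endpoint names, and filters below are hypothetical.
import json
import urllib.parse
import urllib.request

BASE_URL = "https://telemetry.example.com/api/v1"  # hypothetical

def build_query_url(endpoint: str, since_utc: str, vessel_id: str) -> str:
    """Compose the endpoint URL with the filters ALSO configures."""
    params = urllib.parse.urlencode({"vessel": vessel_id, "since": since_utc})
    return f"{BASE_URL}/{endpoint}?{params}"

def fetch_telemetry(url: str, token: str) -> list:
    """Fetch one page of telemetry records with a bearer token."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {token}"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

url = build_query_url("navigation", "2024-01-01T00:00:00Z", "linkvc")
```

Because Pascal controls the `since` filter and the polling interval, refresh frequency can be tuned without any change on the customer side.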

3.2 Periodic tabulated text files

Where APIs are not available, we can ingest periodic drops of tabulated data files, such as CSV or other delimited text formats, or columnar formats such as Parquet, as long as the internal structure stays consistent from file to file.

Typical patterns:

  • Customer exports data manually from their telemetry dashboard in a standard layout; or
  • Customer sets up an automated export that drops new files on a schedule (for example daily or weekly).

The downside is that this usually requires more work on the customer side and typically limits how often ALSO can refresh its analysis.
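Because everything hinges on the file layout staying consistent, ingestion should reject files whose structure has drifted. A minimal sketch for the CSV case, with illustrative column names (not the agreed ALSO schema):

```python
# Sketch: accept an exported CSV file only if its header matches the
# agreed layout, so schema drift is caught before analysis.
import csv
import io

EXPECTED_COLUMNS = ["timestamp_utc", "vessel_id", "stw", "sog", "shaft_power_kw"]

def read_export(text: str) -> list:
    """Parse one exported file; fail fast on any layout change."""
    reader = csv.DictReader(io.StringIO(text))
    if reader.fieldnames != EXPECTED_COLUMNS:
        raise ValueError(f"unexpected columns: {reader.fieldnames}")
    return list(reader)

sample = ("timestamp_utc,vessel_id,stw,sog,shaft_power_kw\n"
          "2024-01-01T00:00:00Z,linkvc,12.4,12.1,5400\n")
rows = read_export(sample)
```

Failing fast on a header mismatch turns a silent data-quality problem into an explicit error that can be raised with the customer.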

3.3 Custom integrations

If neither APIs nor regular file exports are feasible, we can agree on a custom integration. This usually requires more up-front work and can be more sensitive to changes in the customer's platform. Examples include:

  • A read-only database connection (for example PostgreSQL, SQL Server or MySQL) with a stable schema or dedicated reporting view that Pascal queries on a schedule.
  • Periodic ingestion from object storage or a data lake (for example Azure Blob Storage, Azure Data Lake or S3-compatible storage), where Customer writes telemetry snapshots to a designated path in a defined folder structure.
  • Subscription to a message or event stream (for example Azure Event Hubs, Kafka, MQTT), where Customer publishes telemetry events and Pascal subscribes as a downstream consumer.
  • A lightweight “bridge” service in Customer’s environment that reads from internal systems and forwards the relevant telemetry securely to a Pascal endpoint over HTTPS.
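The last pattern, the bridge service, can be sketched as below. The endpoint URL, envelope fields, and bearer-token auth are assumptions for illustration only; the actual contract would be agreed per integration.

```python
# Sketch of the "bridge" pattern: a small service in the customer's
# environment batches internal readings and forwards them to a Pascal
# HTTPS endpoint. All names and fields below are hypothetical.
import json
import urllib.request

PASCAL_ENDPOINT = "https://ingest.example.com/telemetry"  # hypothetical

def build_batch(vessel_id: str, readings: list) -> bytes:
    """Wrap raw readings in the JSON envelope the endpoint expects."""
    return json.dumps({"vessel": vessel_id, "readings": readings}).encode("utf-8")

def forward(batch: bytes, token: str) -> None:
    """POST one batch over HTTPS with a bearer token."""
    req = urllib.request.Request(
        PASCAL_ENDPOINT,
        data=batch,
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    urllib.request.urlopen(req, timeout=30).close()

batch = build_batch("linkvc", [{"ts": "2024-01-01T00:00:00Z", "stw": 12.4}])
```

Because the bridge runs inside the customer's network and only makes outbound HTTPS calls, no inbound firewall openings are needed on the customer side.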