Flight Test and Tracking Platform

Greetings Folks,

I’m creating this thread to brainstorm and explore ideas around a system to capture, monitor & organise flight test artefacts (videos, flight logs, mission reports, etc.) that could be generated either in-house or during decentralised test campaigns. Initially for first use during Quiver development, but extensible to other systems going forward as well.

We could potentially have dozens of units in the hands of people and entities around the world, producing information and giving essential feedback on the state of Arrow systems. To make the best use of this, being able to appropriately capture and incorporate every bit of relevant data would be great.

All ideas on functionality, structure, deployment mechanics, etc. are welcome.

Thanks to all in advance.


Thanks for starting this thread. I think it’s a good time to begin planning out requirements so we can do a thoughtful pass at UI/UX design before we get too far into development.

Before even getting to requirements we should be explicit about the goals for this platform. I’d propose:

  • Collect and organize lots of flight test data so that we can improve our hardware design and reliability.

  • Generate data/proof that will meaningfully assist with certification by airspace regulators.

Here are some of the thoughts that I’ve had for this platform. They’re not super developed and I’m not certain that they are all good ideas.

  • Central database where everyone can upload their flights. Including logs, weather data, narrative commentary, video recordings, etc.

  • Users can create their specific aircraft and mark which components are used (airframe, motors, ESCs, battery, attachment, etc.).

  • The platform will track each individual aircraft and even each individual part on the aircraft. For example, each ESC will have a serial number/unique ID and we can view the number of flights/hours for that specific part.

  • Separately, we can view aggregated data by TYPE (Quiver PT2, Hobbywing X6, etc.) that will show us cumulative data for that TYPE of aircraft/part across all of the different aircraft and flights. Here we can get more statistically significant data such as mean time before failure per part type (see the sketch after this list).

  • Option to omit location data from flight logs for privacy.

  • Accident/Incident reporting so that others can learn from failures. We might have to define what qualifies as an accident/incident.

  • Easy user experience. We should be able to pre-fill as much data as possible from the uploaded logs to make it an easy process for the pilot.

  • Sign in with Ethereum. Some sort of on-chain or verified pilot log so pilots can match their PIC time with the flights and build credibility and flight hours.

  • Decentralized backend? IPFS? I don’t think much of this makes sense to have on-chain but it would be nice to have it be somewhat immutable and decentralized.

  • $ARROW token rewards. We could hold contests for longest flight, most time on a part, etc. We could put specific incentives for things where we need more data, like high altitude flights, flights at MTOW, extreme temperature, etc. This could work kind of like liquidity mining in DeFi but for crowdsourcing flight test data.

  • Parts catalog where you can view specifications on a part gathered from actual flight data. This would help people pick parts for their own drone and could be better data than some manufacturer data sheets.
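
Not committing to any stack yet, but here is a rough sketch of how the per-part vs per-type idea above could hang together (all names here are hypothetical), with per-serial flight hours rolling up into per-type MTBF:

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Component:
        serial: str              # unique ID of this physical part
        part_type: str           # e.g. "Hobbywing X6"
        flight_hours: float = 0.0
        failures: int = 0

    def record_flight(components, hours):
        """Accumulate flight hours on every part installed for this flight."""
        for c in components:
            c.flight_hours += hours

    def mtbf_by_type(components):
        """Aggregate across every serial number of the same TYPE.
        MTBF here = total hours flown by that type / failures observed."""
        hours, fails = defaultdict(float), defaultdict(int)
        for c in components:
            hours[c.part_type] += c.flight_hours
            fails[c.part_type] += c.failures
        return {t: (hours[t] / fails[t] if fails[t] else None) for t in hours}

    fleet = [Component("ESC-0001", "Hobbywing X6"),
             Component("ESC-0002", "Hobbywing X6", flight_hours=40.0, failures=1)]
    record_flight(fleet, 1.5)
    print(mtbf_by_type(fleet))   # {'Hobbywing X6': 43.0}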

It’d be great to get some specific feedback on the above goals and ideas. Are they useful? What are they missing? I don’t feel like I have a lot of insight yet on what we’ll need for certification and if this is even useful or not. I think a deeper understanding of the certification process will be critical to inform design of this platform.


On the road-to-certification point, we could explore what kinds of data, information and formats aerospace systems developers usually send to government agencies, or whether governments already have specifications to that effect, which could be used to define what information to capture, or to generate where possible by aggregating other data points. Then, given any existing format templates, having the ability to auto-generate them as a function of the system would be useful.

FAA Certification Requirements

In the United States, the FAA’s certification process for UAS, especially for advanced operations, involves several steps:

1. Special Airworthiness Certificate

Under FAA Order 8130.34D, UAS can be issued a Special Airworthiness Certificate in the experimental category for purposes such as research and development, crew training, and market surveys. This requires detailed documentation.

2. Type Certification for UAS

The FAA has introduced airworthiness criteria for UAS seeking type certification under the “special class” category. This involves demonstrating the aircraft’s durability and reliability through rigorous testing and documentation.

Functionality for Part 107 Compliance:

Structured Data Capture

  • Flight time, date, duration.
  • Pilot-in-Command (PIC) details, including Remote Pilot Certificate verification.
  • Location data (with privacy controls as needed).
  • Aircraft ID/Registration details.
  • Detailed narrative/commentary logs for each flight.

Incident and Accident Tracking

  • Standardized accident/incident reporting template matching FAA format.
  • Automatic alerts and notifications upon logged accident events.
  • Secure, verifiable report submission to FAA channels if required.

Compliance Dashboard

  • Real-time overview of regulatory compliance status (e.g., pilot certification expiration reminders, aircraft registration renewal alerts).
  • Quick-access dashboards summarizing logged flight hours, incident counts, and compliance checks.

Security and Privacy Controls

  • Secure data encryption and secure access management.
  • GDPR & privacy law compliance for handling sensitive pilot/operator data.
  • Optional anonymization or aggregation of location data for broader community insights while protecting user privacy.

If we could have data entry elements in the application, like in these types of documents, for specifying and recording test procedures and events (e.g. PT2 test 3), then having an LLM package the data and output consistently formatted reports as required would be trivial.
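
Just to make that concrete (a sketch, assuming a Jinja2 template whose placeholders match the captured fields, with made-up sample values), the structured entries could be rendered into a consistent report body that an LLM then only has to polish:

    from jinja2 import Template

    REPORT_TEMPLATE = Template(
        "Flight Test Report - {{ test_id }}\n"
        "Aircraft: {{ aircraft }}\n"
        "Date: {{ date }}    Duration: {{ duration_min }} min\n"
        "Procedure: {{ procedure }}\n"
        "Events:\n"
        "{% for e in events %}  - {{ e }}\n{% endfor %}"
    )

    # Hypothetical structured capture from the app's data-entry elements.
    entry = {
        "test_id": "PT2 test 3",
        "aircraft": "Quiver PT2",
        "date": "2025-01-15",
        "duration_min": 18,
        "procedure": "Hover endurance at MTOW",
        "events": ["Takeoff nominal", "Motor temp warning at 12:40", "Landed under manual control"],
    }

    print(REPORT_TEMPLATE.render(**entry))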

On implementing secure decentralised storage, we could use a combination of IPFS and Optimism.

We could have a running IPFS node on the server and pin the data for persistence, then store the data’s content identifier (CID) via an “Anchor” smart contract deployed on Optimism that emits an event containing the CID, immutably recording the file’s fingerprint and timestamp on-chain.

When it comes time to retrieve the data, the app could read the on-chain event to extract the CID, then perform an IPFS fetch against the node. IPFS automatically verifies the data’s integrity by checking each chunk’s hash against the CID, so we would have scalable storage in IPFS plus on-chain anchoring on Optimism for data attestation provable to regulators.

So the data itself, like gigabytes of video or log files, won’t be on-chain, but we can have an on-chain transaction as the ironclad guarantee of the data’s integrity.
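
Roughly, the pin-then-anchor flow could look like this (a sketch only: it assumes a local IPFS node’s HTTP API on port 5001, web3.py v6, and a hypothetical Anchor contract exposing an anchor(string cid) function):

    import requests
    from web3 import Web3

    IPFS_API = "http://127.0.0.1:5001/api/v0"
    OPTIMISM_RPC = "https://mainnet.optimism.io"
    ANCHOR_ADDRESS = "0x..."   # hypothetical deployed Anchor contract
    ANCHOR_ABI = [{
        "name": "anchor", "type": "function", "stateMutability": "nonpayable",
        "inputs": [{"name": "cid", "type": "string"}], "outputs": [],
    }]

    def pin_to_ipfs(path):
        """Add (and pin) the file on the local IPFS node, returning its CID."""
        with open(path, "rb") as f:
            res = requests.post(f"{IPFS_API}/add", files={"file": f})
        return res.json()["Hash"]

    def anchor_cid(cid, private_key):
        """Record the CID on Optimism via the Anchor contract, emitting its event."""
        w3 = Web3(Web3.HTTPProvider(OPTIMISM_RPC))
        acct = w3.eth.account.from_key(private_key)
        contract = w3.eth.contract(address=ANCHOR_ADDRESS, abi=ANCHOR_ABI)
        tx = contract.functions.anchor(cid).build_transaction({
            "from": acct.address,
            "nonce": w3.eth.get_transaction_count(acct.address),
        })
        signed = acct.sign_transaction(tx)
        return w3.eth.send_raw_transaction(signed.rawTransaction).hex()

    cid = pin_to_ipfs("flight_0042_video.mp4")
    print(anchor_cid(cid, "0x...signer key..."))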

On injecting token rewards, we could have something like a mini “Active Bounties” tab where people can find needed testing aspects with corresponding rewards. Given the nature of some of these tasks, i.e. if they can be checked against log data (flight times, distances, temperatures, certain kinds of usage intensity, etc.), it might even be possible to programmatically determine whether the requirements have been met before automatically releasing rewards.
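
As a toy example of that programmatic check (bounty names, metrics and thresholds below are all made up), fulfilment could be decided straight from the parsed log summary before any payout is triggered:

    # Hypothetical bounty definitions: each maps a log-derived metric to a threshold.
    BOUNTIES = {
        "endurance-2h":  {"metric": "flight_duration_min", "op": ">=", "value": 120, "reward_arrow": 500},
        "high-altitude": {"metric": "max_altitude_m",      "op": ">=", "value": 3000, "reward_arrow": 750},
        "cold-weather":  {"metric": "min_temp_c",          "op": "<=", "value": -10,  "reward_arrow": 400},
    }

    OPS = {">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b}

    def fulfilled_bounties(log_summary):
        """Return the bounties whose criteria this flight's parsed log satisfies."""
        hits = []
        for name, b in BOUNTIES.items():
            metric = log_summary.get(b["metric"])
            if metric is not None and OPS[b["op"]](metric, b["value"]):
                hits.append((name, b["reward_arrow"]))
        return hits

    summary = {"flight_duration_min": 135, "max_altitude_m": 180, "min_temp_c": 4}
    print(fulfilled_bounties(summary))   # [('endurance-2h', 500)]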

Similar Systems currently in use


Hmm, what if (even if not initially) there could be a platform plugin for Mission Planner? :thinking:

The Mission Planner framework comes with software plugin and UI mod support in C#, built with Visual Studio. It might be possible to have a plugin that makes a direct connection to the platform to configure uploading of flight data and anything else.


On distributed rewards for flight data, we could explore a mechanism similar to Wingbits here.

They have an ADS-B hardware/data-to-token pipeline.


This is an awesome project

Flight Testing & Tracking Platform System Requirements

A Progressive Web Dapp to capture, monitor & organise flight test artefacts, in order to:

  • Collect and organise flight test data and feedback to improve hardware design and reliability.
  • Generate proofs to assist with certification by airspace regulators.

1. Functional Requirements

1.1 User & Identity Management

  • Ethereum-based Sign-In: OAuth2-style “Sign in with Ethereum” enabling on-chain pilot identity (see the sketch after this list).

    • Could also: Email/password / GitHub / Google authentication for Day 1 simplicity.
  • Role-Based Access Control: Scoped roles (Pilot, Engineer, Auditor, Admin).

    • Could also: Flat “user vs admin” model initially, then expand to full role based access controls.
  • Pilot Credential Tracking: Store Remote Pilot Certificate data, expiry, and flight-hours as NFTs.

    • Could also: Manual entry of license expiry without on-chain verification.
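
A minimal sketch of the wallet-signature check behind the Ethereum sign-in (not full EIP-4361/SIWE message handling, just recovering the signer of a server-issued nonce with eth_account):

    import secrets
    from eth_account import Account
    from eth_account.messages import encode_defunct

    def issue_challenge():
        """Server issues a one-time nonce for the wallet to sign."""
        return f"Arrow flight platform login nonce: {secrets.token_hex(16)}"

    def verify_login(challenge, signature, claimed_address):
        """Recover the signer from the signature and compare to the claimed address."""
        recovered = Account.recover_message(encode_defunct(text=challenge), signature=signature)
        return recovered.lower() == claimed_address.lower()

    # The wallet (e.g. MetaMask) signs the challenge client-side and posts back the
    # signature plus its address; the backend then calls verify_login(...) before
    # creating a session for that pilot identity.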

1.2 Aircraft & Component Registry

  • Aircraft Configuration: Register each UAV’s BOM (airframe, motors, ESCs, battery, payload, etc.).

  • Unique Part IDs: Generate or store serial numbers/CIDs for every component.

  • Lifecycle Tracking: Log install/removal dates and flight-hours per part.

    • Could also: Only track part types (not individual serial numbers) for v1.

1.3 Flight Data Capture & Upload

  • Multi-File Upload: Accept CSV, JSON, .bin, and .log recorder files plus video, mission reports, photos, commentary.

  • Auto-Parse & Pre-Fill: Back-end Python parsers via pymavlink & pymavlog to extract metadata and populate forms (see the sketch after this list).

    • Could also: Require users to fill metadata fields manually on upload, deferring parsing automation.
  • Real-Time Telemetry (Optional): Live stream via WebSocket/MAVLink.

    • Could also: Omit live-streaming initially, focusing on post-flight data ingest only.
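
A rough illustration of the auto-parse step (assuming ArduPilot dataflash logs and pymavlink’s mavutil; message and field names vary by firmware, so treat this as a sketch):

    from pymavlink import mavutil

    def summarise_log(path):
        """Walk a .bin/.log dataflash file and pull out basic pre-fill metadata."""
        mlog = mavutil.mavlink_connection(path)
        first_ts, last_ts, max_alt = None, None, float("-inf")
        while True:
            msg = mlog.recv_match(type=["GPS"])
            if msg is None:              # end of file
                break
            if first_ts is None:
                first_ts = msg._timestamp
            last_ts = msg._timestamp
            max_alt = max(max_alt, getattr(msg, "Alt", max_alt))
        return {
            "flight_duration_min": round((last_ts - first_ts) / 60, 1) if first_ts else None,
            "max_altitude_m": None if max_alt == float("-inf") else max_alt,
        }

    print(summarise_log("flight_0007.bin"))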

1.4 Data Storage & Integrity

  • Primary Storage: Pin files to IPFS/Filecoin/Arweave and store CIDs on-chain for end-state immutability. On-chain anchoring by writing CIDs to, e.g., an Optimism smart contract for an immutable audit trail.

    • Could also: Centralised object store (AWS S3/Azure Blob) + relational DB for metadata.
  • Checksum Verification: Automatic hash checks on every upload for integrity.
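
For the checksum step, something this small would do (a sketch; chunked SHA-256 so multi-gigabyte videos never need to fit in memory):

    import hashlib

    def sha256_of(path, chunk_size=1024 * 1024):
        """Stream the file and return its SHA-256 digest."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                h.update(chunk)
        return h.hexdigest()

    # Store the digest with the upload record; recompute on retrieval to detect corruption.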

1.5 Analytics & Reporting

  • Basic Dashboard Displays: Total flights, hours per airframe, flight-hour tally per component type.

    • Could also: Generate mean-time-between-failures (MTBF) curves and remaining-useful-life forecasts once enough data exists.
  • Certification Reports: Dynamic FAA/EASA report generator with per-section data binding.

    • Could also: Template-based PDF/HTML flight-test summaries.

1.6 Incident & Compliance Workflow

  • Incident Reporting Form: Pre-built regulatory-portal integration with a standardised template (CFR §107.9, EASA SORA) and required fields to auto-submit filings.

    • Could also: Basic “accident vs incident” capture with attachments.
  • Compliance Checklist: Embed FAA Part 107 / EASA pre- and post-flight sign-off steps.

    • Could also: Simple downloadable checklist PDF, manual compliance tracking.

1.7 Incentivisation & Tokenomics

  • Test Bounty Board Display: Define tasks (e.g. “fly >2 hrs at MTOW”), automated log-criteria checks.

  • Automated Payouts: Smart-contract distribution of $ARROW tokens upon bounty fulfilment.

    • Could also: Manual verification and off-chain token distribution.
  • Leaderboards & Achievements: Gamified contributor dashboards.

    • Could also: Simple “recent contributors” list without gamification.

1.8 Collaboration & Annotation

  • Video/Data Annotation: Timeline markers and shared comments on telemetry graphs.
    • Could also: Threaded comment section per flight without timeline linkage.

2. Non-Functional Requirements

2.1 PWA Standards

  • Desktop-First Responsive Design: Optimised for desktop browsers, adapts to tablet/mobile.

    • Could also: Basic responsive web layout without offline PWA features.
  • Mobile Enhancements: Service-worker caching for intermittent connectivity, lightweight forms.

    • Could also: Defer camera/microphone in-browser capture to later iterations.

2.2 Performance & Scalability

  • API Throughput: Handle ~100 uploads/sec with horizontal auto-scaling.

  • CDN Integration: Serve static assets via Cloudflare/Akamai.

    • Could also: Serve static build from a single-region bucket at launch.

2.3 Security & Privacy

  • Encryption: TLS 1.3 in transit; AES-256 at rest.

  • GDPR Compliance: Data-subject access requests, right-to-erase workflows, optional geo-anonymisation.

  • Immutable Audit Trail: Write-only logs of user actions, tamper alerts.

    • Could also: Standard database audit logs without blockchain anchoring initially.

2.4 Reliability & Availability

  • SLA: ≥ 99.9% up-time with multi-AZ deployment.

  • Disaster Recovery: Daily backups; RTO < 1 hr, RPO < 15 min.

    • Could also: Weekly backups for MVP, then ramp up as usage grows.

3. Integration & Extensibility

3.1 APIs & SDKs

  • REST & GraphQL: Ingestion, query, analytics endpoints.

    • Could also: Start with REST-only, add GraphQL later.
  • Web-socket Streams: Real-time telemetry and alert subscriptions.

    • Could also: Polling-based updates for MVP.

3.2 Third-Party Data Feeds

  • Weather Services: METAR/TAF via NOAA/EU APIs.

    • Could also: Use open-access weather APIs.

  • Airspace Info: NOTAM/TFR via FAA/EASA feeds.

    • Could also: Manual CSV imports of weather/airspace data early on.

3.3 Modular Architecture

  • Microservices: Separate ingestion, analytics, anchoring, UI.

    • Could also: Launch as a single monolith, then split as needed.
  • Plugin System: Add analytics modules, report templates, storage back-ends, etc.

    • Could also: Hard-code core analytics; refactor to plugins later.

4. Data & Schema Requirements

4.1 Core Entities

  • User
  • PilotCredential
  • Aircraft
  • Component
  • Flight
  • TelemetryRecord
  • IncidentReport
  • Bounty
  • PaymentRecord

4.2 Schemas

  • FlightLog:

    {
      "flightId": "...",
      "aircraftId": "...",
      "pilotId": "...",
      "startTime": "...",
      "endTime": "...",
      "route": [{"lat":..,"lon":..,"alt":..,"timestamp":"..."}],
      "batteryData": {...},
      "events": [...]
    }
    
  • ComponentRecord:

    {
      "componentId": "...",
      "type": "...",
      "mfr": "...",
      "serial": "...",
      "installDate": "...",
      "removalDate": "...",
      "flightHours": 123.4
    }
    

4.3 File Formats

  • Structured Logs: CSV, JSON (validated via JSON Schema; see the sketch after this list).

  • Raw Flight-Recorder Logs: .bin, .log (parsed by Python back-end).

  • Media: MP4/H.264, JPG/PNG (checksum‐verified).
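
A small sketch of the JSON validation step (using the jsonschema package; the schema fragment below just mirrors a few of the FlightLog fields above):

    from jsonschema import ValidationError, validate

    FLIGHT_LOG_SCHEMA = {
        "type": "object",
        "required": ["flightId", "aircraftId", "pilotId", "startTime", "endTime"],
        "properties": {
            "flightId":   {"type": "string"},
            "aircraftId": {"type": "string"},
            "pilotId":    {"type": "string"},
            "startTime":  {"type": "string"},
            "endTime":    {"type": "string"},
            "route":      {"type": "array", "items": {"type": "object"}},
        },
    }

    def check_flight_log(payload):
        """Return (ok, error_message) for an uploaded structured log."""
        try:
            validate(instance=payload, schema=FLIGHT_LOG_SCHEMA)
            return True, None
        except ValidationError as e:
            return False, e.message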


5. Back-end Architecture & Technology Stack

  1. Framework:

    • Option A: FastAPI monolith with synchronous Python parsers (see the sketch after this list).

    • Option B: Django REST + Celery worker queue for async log processing.

  2. Containerisation: Docker Compose for local/dev; Kubernetes or managed service in prod.

  3. Storage:

    • Option A: AWS S3/Azure Blob + PostgreSQL

    • Option B: IPFS/Filecoin + on-chain anchoring.

  4. Auth:

    • Option A: JWT with email/password / Google / GitHub, etc.

    • Option B: SIWE “Sign in with Ethereum”.
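
If Option A were chosen, a first ingestion endpoint might look roughly like this (a sketch only; it just checksums the upload, with the parsing/pinning from the earlier sketches slotting in before the response):

    import hashlib

    from fastapi import FastAPI, File, UploadFile

    app = FastAPI()

    @app.post("/flights/upload")
    async def upload_flight(log: UploadFile = File(...)):
        """Accept a raw flight-recorder file and return a stub record for pre-fill."""
        h = hashlib.sha256()
        while (chunk := await log.read(1024 * 1024)):
            h.update(chunk)
        return {"filename": log.filename, "sha256": h.hexdigest()}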
