Exploring a Unified Payload and Companion Computer SDK for Quiver

As Quiver continues to evolve from a modular UAV into a flexible aerial robotics platform, the next natural question arises: what if we opened it up even further? What if payloads, companion computers, and ground control systems could all speak the same language, through a shared SDK that allowed anyone to design and deploy their own onboard intelligence? The idea is to turn Quiver into a development environment: a flying computer with real-world applications and the freedom to grow through community-built extensions.

1. The Idea

The core concept is simple enough: a Payload and Companion Computer SDK that allows third parties to connect new sensors, actuators, or modules to Quiver, and integrate them without needing to modify firmware or reverse-engineer protocols. The SDK would live on top of existing open standards like MAVLink and MAVSDK, offering APIs, templates, and libraries that make integration predictable and repeatable. Instead of configuring each new system by hand, developers could write small applications that Quiver understands instantly. The outcome could resemble what DJI’s Payload SDK does, but with more openness, more flexibility, and fewer locked doors.
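
To make this concrete, here is a minimal sketch of what one of those small applications could look like today, built on the existing MAVSDK-Python bindings. It illustrates the flavor rather than a defined Quiver API, and the UDP connection address is an assumption:

    # Minimal sketch of a payload app on top of MAVSDK-Python (pip install mavsdk).
    # The connection address is an assumption; a real SDK would standardize it.
    import asyncio
    from mavsdk import System

    async def main():
        drone = System()
        await drone.connect(system_address="udp://:14540")

        # Wait until the autopilot is discovered before subscribing.
        async for state in drone.core.connection_state():
            if state.is_connected:
                break

        # Stream position telemetry; a real payload app would react to it here.
        async for position in drone.telemetry.position():
            print(f"relative altitude: {position.relative_altitude_m:.1f} m")

    asyncio.run(main())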

2. System Architecture Paths

The architecture for this kind of system naturally divides into two main pathways, and the truth is, both will likely coexist.

Direct-to-Flight-Controller Integration

In this configuration, payloads act as first-class citizens on the flight controller. They register directly as MAVLink components, which is ideal for deterministic, low-latency systems like sensors or actuators that need immediate response times. These would rely on CAN or UART links and benefit from the stability of direct integration. It’s the “lean and fast” path: the most efficient way to tie hardware directly into Quiver’s reflexes.
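
As a rough illustration of what first-class citizenship means in MAVLink terms, a payload on a UART link could announce itself to the flight controller with a periodic heartbeat. The sketch below uses pymavlink; the device path, baud rate, and component ID are assumptions:

    # Sketch: a payload announcing itself as a MAVLink component over UART.
    # Device path, baud rate, and component ID are assumptions.
    import time
    from pymavlink import mavutil

    conn = mavutil.mavlink_connection(
        "/dev/ttyAMA0", baud=921600,
        source_system=1,
        source_component=mavutil.mavlink.MAV_COMP_ID_PERIPHERAL)

    while True:
        # A periodic heartbeat is how a MAVLink component makes itself
        # known to the flight controller and the rest of the network.
        conn.mav.heartbeat_send(
            mavutil.mavlink.MAV_TYPE_ONBOARD_CONTROLLER,
            mavutil.mavlink.MAV_AUTOPILOT_INVALID,
            0, 0,
            mavutil.mavlink.MAV_STATE_ACTIVE)
        time.sleep(1)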

Companion-Computer-Mediated Integration

The second path introduces a companion computer, like a Raspberry Pi 5, which acts as the intelligent middle layer. It handles multiple complex payloads, higher-bandwidth data, mission logic, and onboard computation. This layer allows far more complex behavior: perception, mapping, adaptive flight, and payload coordination. It’s also the key to running the SDK’s higher-level features, such as over-the-air firmware updates, log management, and AI-based automation. In effect, the companion computer turns Quiver into a small distributed system: part robot, part server, part autonomous explorer.
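
One way to wire up that middle layer with existing tools is a MAVLink router on the companion computer, fanning the flight controller’s single serial link out to local SDK apps and the ground station at once. Below is a sketch of a mavlink-router configuration; the addresses and ports are assumptions, not Quiver defaults:

    # Sketch of /etc/mavlink-router/main.conf on the companion computer.
    [UartEndpoint autopilot]
    Device = /dev/ttyAMA0
    Baud = 921600

    [UdpEndpoint sdk_apps]
    Mode = Normal
    Address = 127.0.0.1
    Port = 14540

    [UdpEndpoint ground_station]
    Mode = Normal
    Address = 192.168.1.10
    Port = 14550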

3. Building Blocks for an SDK

The SDK could provide an end-to-end development environment. Its structure might include:

  • Code templates for payloads and companion computer modules, written in Python or C++ (one possible shape is sketched at the end of this section).

  • High-level APIs for telemetry, control, and mission logic using MAVSDK.

  • Standardized data and event formats for both flight and payload communication.

  • Example applications and mission scripts that show how to combine sensors, logic, and movement.

  • OTA update and log management utilities that operate through MAVLink’s FTP microservice.

In short, it would act as both a toolbox and a rulebook. Developers could pick it up, connect a new piece of hardware, and have it talking to the aircraft and ground system within minutes. A well-documented guide or “Quiver Developer Manual” could turn this from a niche integration project into a platform others can reliably build on.
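
To give a sense of what such a code template could look like, the sketch below shows one possible shape for a payload module with lifecycle hooks. Everything here, names included, is hypothetical; it describes a design direction, not an existing Quiver API:

    # Hypothetical payload module template; none of these names exist yet.
    # The idea: a developer fills in the hooks, and the SDK handles the rest
    # (MAVLink registration, telemetry plumbing, OTA updates, log handling).
    class PayloadModule:
        name = "my_sensor"

        def on_start(self, bus):
            # Called once when the module loads; 'bus' would expose
            # telemetry subscriptions and command publishing.
            self.bus = bus

        def on_telemetry(self, msg):
            # Called for each telemetry message the module subscribes to.
            pass

        def on_shutdown(self):
            # Called before power-down or module unload.
            pass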

4. Connectivity and Communication

For all this to function smoothly, connectivity becomes a design pillar. The companion computer provides the backbone of that communication layer. It can host a variety of telemetry modules, including 4G and 5G modems for broadband links, satellite modules for global reach, and high-power radios for localized operations. Combined with intelligent routing software, these channels could provide redundant links for control and data streaming, ensuring continuity even in difficult conditions. With a setup like this, a Quiver unit could feasibly operate anywhere, from urban corridors to remote industrial sites, and still sync with ground systems or cloud dashboards in real time.
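
The “intelligent routing” piece could start very simply. As a sketch of the idea, a small watchdog might probe each link and steer traffic toward the healthiest one. The interface names and probe target below are assumptions, and real routing would manipulate routing tables rather than just pick a name:

    # Sketch of a naive link watchdog: probe each link in priority order
    # and report the first healthy one. Interface names and the probe
    # target are assumptions.
    import subprocess

    LINKS = ["wwan0", "wwan1", "ppp0"]  # e.g. 5G, 4G, satellite (assumed names)

    def link_is_healthy(iface: str) -> bool:
        # Ping a well-known host out of the given interface.
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "2", "-I", iface, "8.8.8.8"],
            capture_output=True)
        return result.returncode == 0

    def pick_active_link():
        for iface in LINKS:
            if link_is_healthy(iface):
                return iface
        return None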

5. What the Companion Computer Could Do

The companion computer’s role extends well beyond data routing. It could act as an onboard mission planner, a processing node, or even a cooperative AI agent. Potential applications include:

  • Advanced path planning that adapts dynamically to terrain or conditions.

  • Payload coordination for multi-sensor or multi-actuator missions.

  • Real-time perception and mapping using camera or LiDAR inputs.

  • Local inference and decision-making that minimize ground dependence (see the sketch after this list).

  • Over-the-air firmware and configuration management.

  • Automated log syncing and maintenance diagnostics.
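
To ground the local decision-making bullet, here is a trivial onboard guard sketched with MAVSDK-Python: if the battery runs low, it commands return-to-launch with no ground link in the loop. The threshold and connection address are assumptions:

    # Sketch: minimal onboard decision loop with MAVSDK-Python.
    import asyncio
    from mavsdk import System

    async def battery_guard():
        drone = System()
        await drone.connect(system_address="udp://:14540")

        async for battery in drone.telemetry.battery():
            # remaining_percent scale (0..1 vs 0..100) varies by MAVSDK
            # version; this assumes 0..1.
            if battery.remaining_percent < 0.2:
                await drone.action.return_to_launch()
                break

    asyncio.run(battery_guard())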

These capabilities transform the companion from a passive link into a genuine intelligence layer. It turns Quiver into something closer to a small autonomous ecosystem, possibly even one that can reason, respond, and update itself mid-mission.

6. Ground Control and Interface Evolution

As the onboard systems grow smarter, the ground station will need to keep pace. One approach would be to extend existing tools like Mission Planner or QGroundControl with plugins that recognize and manage Quiver payloads. Each payload could have its own UI window or control tab, displaying live telemetry, configuration options, or video feeds. In parallel, a web-based interface could unify mission visualization, OTA management, and fleet operations, accessible from anywhere through a secure network connection. Over time, this could evolve into a single dashboard that merges mission planning, payload control, and companion computer monitoring into one continuous view.

7. A Path Toward Ecosystem Development

The development process for this vision could unfold progressively. First, establish the SDK foundations: templates, code libraries, and example payloads. Then release a companion computer reference image with pre-installed tools for MAVSDK, telemetry routing, and update management. Next, integrate with ground systems through plugins or web dashboards, giving operators a clear interface for interacting with their payloads. Finally, build the developer ecosystem around it: documentation, certification standards, shared templates, and an open repository of community-built payload integrations. From there, Quiver could start to resemble an open robotics platform rather than a single UAV.

8. Potentials and Questions

The potential scope of this system is broad, and there are plenty of open questions. Should Quiver’s SDK remain purely MAVSDK-based, or should it grow toward ROS2 for more general robotics compatibility? Should the companion computer be standardized, or remain open to any hardware that meets performance criteria? Could this framework eventually expand to support ground robots, fixed installations, or hybrid systems under the same network and protocol? And what would an open registry of third-party payloads look like in practice? Each answer defines a different version of Quiver’s future.

9. Conclusion

Exploring a Payload and Companion Computer SDK isn’t just about technical convenience. It represents a way to make Quiver genuinely extensible: a platform others can experiment with, adapt, and expand upon. With an SDK that unifies payloads, companion systems, and ground interfaces, Quiver could become more than a configurable UAV. It could evolve into a programmable aerial computer, one that invites others to build intelligence on top of it, and perhaps to imagine new uses for it that haven’t yet been defined. The hardware is already capable; the next step is to give it a common language and let innovation take over from there.


Thank you for putting this together, Alex. The scope for this can indeed be very broad, and we have limited time before releasing a dev kit. Can you propose some basic payloads that could act as an MVP for the SDK? If you had to narrow the scope down to something that showed the basic functions of the concepts listed in “1. The Idea”, what would that look like?

So some examples:

(Payload: RPLidar C1 → Raspberry Pi) → (Attachment Interface: Ethernet TCP/UDP) → (Companion Computer: Rpi 4/5 with 4/5G Module → HTTP POST) → (Web Portal: Simple Web Server → Simple Live Web Visualization webpage)

(Payload: Webcam → Raspberry Pi) → (Attachment Interface: Ethernet TCP/UDP) → (Companion Computer: Rpi 4/5 with 4/5G Module → HTTP POST) → (Web Portal: Simple Web Server → Simple Camera feed webpage)

(Flight Controller: Ethernet TCP/UDP) → (Companion Computer: Rpi 4/5 with 4/5G Module → HTTP POST) → (Web Portal: Simple Web Server → Simple Attitude, Altitude and Position Display)

(Payload: Rangefinder → Raspberry Pi) → (Attachment Interface: Drone CAN) → (Flight Controller: Ethernet TCP/UDP) → (Companion Computer: Rpi 4/5 with 4/5G Module → HTTP POST) → (Web Portal: Simple Web Server → Simple Live Web Visualization webpage)
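
For the first of these pipelines, the two custom pieces might look like the sketches below: a payload-side loop pushing scans over HTTP, and a minimal web server holding the latest scan for the visualization page. The portal URL, device path, and endpoint paths are placeholders, and the payload side assumes the rplidar Python package:

    # Payload side: read the RPLidar C1 and POST each revolution to the portal.
    import requests
    from rplidar import RPLidar

    PORTAL_URL = "http://portal.example/api/scan"  # hypothetical endpoint

    lidar = RPLidar("/dev/ttyUSB0")
    try:
        # iter_scans() yields one revolution as (quality, angle, distance) tuples.
        for scan in lidar.iter_scans():
            points = [{"angle": a, "dist_mm": d} for _, a, d in scan]
            requests.post(PORTAL_URL, json={"points": points}, timeout=2)
    finally:
        lidar.stop()
        lidar.disconnect()

And the portal side, sketched with Flask:

    # Portal side: accept scans and serve the latest one to the web page.
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    latest_scan = {"points": []}

    @app.route("/api/scan", methods=["POST"])
    def receive_scan():
        latest_scan["points"] = request.get_json().get("points", [])
        return "", 204

    @app.route("/api/scan", methods=["GET"])
    def serve_scan():
        # The visualization page polls this endpoint to redraw the map.
        return jsonify(latest_scan)

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=8080)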


As a first sample, I’m implementing:
(Payload: RPLidar C1 → Raspberry Pi) → (Attachment Interface: Ethernet TCP/UDP) → (Companion Computer: Rpi 4/5 with 4/5G Module → HTTP POST) → (Web Portal: Simple Web Server → Simple Live Web Visualization webpage)

which, when completed, should make a live lidar map visual accessible for all to see at:

I’m currently testing and verifying that the internal data flows are accurate.


Test BOM:

  • 2x Raspberry Pi 4
  • RPLidar C1
  • Ethernet cable
  • 4G USB module