CV Workflow
Intro to Computer Vision
By now, everyone should know that we are exploring brush management as an early application of Arrow’s multi-purpose drone. Tied to this task is the processing of visual signals. That description is intentionally vague, as there are many ways to approach the goal. In this short memo, I outline a few possible workflows as we prepare to test this in the market.
Relevance to Arrow
How does this relate to Arrow’s work? Mostly, it doesn’t. A lot of this R&D is likely to fall squarely on the DevCo side of things. However, it may offer two pieces of insight:
- Devices and computation required (or not) to accomplish this task
- Future required capabilities anticipated for Project Quiver
1) Just Look
By far, the simplest way to accomplish brush management is visually. The drone operator is tasked with flying the drone, looking for target plants on a display, and pushing a button to dispense. A simple, manual process.
Required Materials:
- Downwards-facing camera
- VRX video display
- Remote-controlled dispenser
Limitations:
- Demands full attention of a pilot
- Consumes additional pilot time
- Requires high-fidelity cameras to judge target plants
- Risk of duplicate treatments
2) Brute-Force Search
Full grid search of a bounded area
Our second option is where we start to incorporate AI. We bound a region to explore and trigger a dispensing event each time our real-time computer vision setup detects a target plant directly under the drone; a minimal sketch of this loop follows the process list below.
Basic process:
- Outline boundaries of region to explore
- “Zigzag” or grid explore the full region
- If the CV tool detects a target plant, dispense herbicide
- Continue until end of route
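To make this concrete, here is a minimal Python sketch, assuming a rectangular field measured in local metric coordinates. The waypoint geometry is straightforward; `goto()`, `detect_target()`, and `dispense()` are hypothetical stubs standing in for the autopilot command, the onboard CV model, and the dispenser trigger.

```python
# A minimal sketch of the brute-force pass, assuming a rectangular field.
# goto(), detect_target(), and dispense() are hypothetical stubs for the
# autopilot command, the onboard CV model, and the dispenser trigger.

def zigzag_waypoints(width_m: float, height_m: float, swath_m: float):
    """Generate a lawnmower ("zigzag") coverage path over the region."""
    waypoints, y, left_to_right = [], 0.0, True
    while y <= height_m:
        row = [(0.0, y), (width_m, y)]
        waypoints.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right
        y += swath_m  # swath spacing set by the camera footprint at altitude
    return waypoints

def goto(wp):           # hypothetical: command the autopilot to a waypoint
    pass

def detect_target():    # hypothetical: run onboard CV on the latest frame
    return False

def dispense():         # hypothetical: trigger the remote-controlled dispenser
    pass

def fly_mission(waypoints):
    for wp in waypoints:
        goto(wp)
        if detect_target():   # in practice this runs continuously, not per-waypoint
            dispense()

fly_mission(zigzag_waypoints(width_m=200.0, height_m=100.0, swath_m=8.0))
```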
Required Materials:
- Downwards-facing camera
- Remote-controlled dispenser
- Onboard compute for CV
- A trained model to ID target plants in near real-time
- (Nice to have) VRX video display
Limitations:
- Requires additional compute onboard
- Demands well-trained, performant CV model
- Lots of redundant travel (!)
- Undermines speed & endurance requirements
- Risk of duplicate treatments
Commentary
This is called a “brute force” search for a reason. While it does offload much of the decision burden to the onboard companion computer, this technique performs an exhaustive search of the area of interest.
In the example image above, the full flight plan covers a distance of 2.81 km. At a rate of 5 m/s, completing this task takes about 9 minutes and 22 seconds (2,810 m ÷ 5 m/s ≈ 562 s). We know that time and endurance of our drone are of high importance to end users, so this is unlikely to be a suitable solution for larger operations.
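A couple of lines reproduce that endurance math and show how coverage distance scales with field size and swath spacing. The 2.81 km path and 5 m/s speed come from the example above; the coverage function is a rough approximation, not a flight-plan calculator.

```python
# Back-of-the-envelope endurance math for the brute-force pass. The 2.81 km
# path and 5 m/s speed come from the example plan above; coverage_distance_m()
# is a rough approximation that treats turns as straight connecting legs.

def coverage_distance_m(width_m: float, height_m: float, swath_m: float) -> float:
    passes = int(height_m // swath_m) + 1     # one straight leg per swath row
    return passes * width_m + height_m        # legs plus cumulative turn travel

path_m = 2_810.0
speed_mps = 5.0
print(f"mission time: {path_m / speed_mps:.0f} s")  # ~562 s, about 9 min 22 s
```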
3) 2-Pass Search + Address
The most promising solution so far involves a 2-pass method. Pass 1 uses a higher-flying scouting drone to take images of the region of interest.
The goal of this image collection is to create a comprehensive “map” of the region, then use a CV model on an off-board computer to find the target plants. Each target is assigned a waypoint and a dispensing event; the resulting mission is sent to the multipurpose drone, which performs the treatment in the second pass.
Pass 1: survey images to map
With the map assembled, run the CV model to determine which features of the map represent target plants. If the map is geo-referenced, converting these detections into waypoints for a mission planner should be fairly straightforward (see the sketch below). Once the mission is built, send it to the multipurpose drone.
Pass 2: directly addressing the plants
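Here is a sketch of that detection-to-waypoint step, assuming the survey map is a geo-referenced orthomosaic (GeoTIFF) and that the rasterio library is available. The file name and the (row, col) detection list are hypothetical placeholders for the offboard CV model’s output.

```python
# Convert pixel detections on a geo-referenced orthomosaic into map
# coordinates for a mission planner. Assumes rasterio; "survey_ortho.tif"
# and the detection list are hypothetical placeholders.

import rasterio
from rasterio.transform import xy

def detections_to_waypoints(ortho_path: str, detections):
    """Map (row, col) pixel detections to coordinates in the raster's CRS."""
    with rasterio.open(ortho_path) as ds:
        # xy() applies the orthomosaic's affine transform; reproject the
        # results to WGS84 lat/lon if the mission planner requires it.
        return [xy(ds.transform, row, col) for row, col in detections]

waypoints = detections_to_waypoints("survey_ortho.tif", [(1204, 3312), (998, 151)])
```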
Required Materials:
- A second drone specifically for scouting
- Downwards-facing camera
- Remote-controlled dispenser
- Software to stitch a contiguous map of the area of interest
- A trained model to ID target plants & offboard computer
- (Nice to have) VRX video display
Limitations:
- Requires additional drone
- Robust data handoff: scout drone → computer → Quiver
- Demanding, complex tech stack
Commentary
Of the three, this is where I picture this workflow going. It not only offloads much of the burden from the drone operator, but is also more efficient, since each drone is specialized for its task.
In this example, eliminating the brute-force search step reduced Quiver’s overall mission distance from 2.81 km to 0.82 km, a roughly 71% reduction. This offers a promising challenge to the dominant helicopter IPT status quo.
However, there is quite a distance between this vision and a working model. In the meantime, we should continue to enhance the endurance of Project Quiver as we begin to see more possibilities for long-range missions.
Conclusion
TL;DR: we have a plan for getting Quiver working with computer vision. Most of this plan will be tested and incorporated in satellite devices + software around Project Quiver.
If engineering takes one thing away from this:
For this to work, Project Quiver needs to be a robust action drone with an emphasis on endurance. If we can count on Quiver to be a reliable platform, we can unlock a lot of interesting possibilities.