The Winged Hybrid Airship for Long Endurance (WHALE) is an airship design aimed at autonomous rapid urban delivery. The system uses a low-drag helium-filled envelope to deliver long-range flight with high dependability. The envelope lifts up to ten pounds of fixed weight, including batteries, so the motors' thrust can be devoted entirely to lifting the payload. With an eighty-five-mile flight range for payloads up to ten pounds, the WHALE is tailored specifically for practical package delivery. In addition, the WHALE uses Electric Ducted Fans (EDFs), whose shrouds keep the fan blades out of reach of wandering hands. The EDFs can tilt, allowing the WHALE to hover and perform Vertical Take-Off and Landing (VTOL). The envelope is 4 meters long and 1 meter wide, with a wingspan of 3.28 feet, a small form factor suited to urban deliveries. The wing uses an efficient slotted-flap design that allows for variable lift, enabling zero lift on the packageless return leg. Most importantly, the WHALE is safe: the bulbous shape of the helium envelope ensures soft, safe interactions with people.
Unmanned air traffic is estimated to exceed today's air traffic by two to three orders of magnitude at maturity. This project focuses on establishing sound estimates of future traffic, measuring the airspace congestion such traffic might cause, and developing algorithms to manage that congestion. As a first step, we picked the Bay Area as a representative metropolitan region and used existing package-delivery industry statistics to estimate 100,000 future UAS operations per day in the area. We distributed the traffic according to the area's population density and simulated the operations. The dramatic increase in airspace congestion is visualized below: aircraft are marked red if they are within a threshold separation of another aircraft, and green otherwise.
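The red/green marking reduces to a pairwise separation check. A minimal sketch (function and variable names are ours, not the simulator's):

```python
import math

def find_conflicts(positions, threshold):
    """Return the indices of aircraft within `threshold` separation of
    another aircraft (marked red); all others would be marked green."""
    conflicted = set()
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if math.dist(positions[i], positions[j]) < threshold:
                conflicted.update((i, j))
    return conflicted
```

The brute-force O(n^2) scan is fine for a visualization; a simulation at the 100,000-operations-per-day scale would use a spatial index instead.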
Simulations showed a change in the airspace complexity regime as future air traffic rises above 10,000 flights a day: large clusters (multiple aircraft in conflict with each other at the same time) begin to form. These would require simultaneous de-confliction of a large number of aircraft (around 100 in our simulation), which may not be practically feasible.
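Such clusters can be identified as connected components of the conflict graph. A simplified union-find sketch under the same assumptions (illustrative names, 2D positions):

```python
import math

def conflict_clusters(positions, threshold):
    """Group aircraft into clusters: connected components of the graph
    whose edges join aircraft closer than `threshold`."""
    n = len(positions)
    parent = list(range(n))

    def find(x):  # path-halving find
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(positions[i], positions[j]) < threshold:
                parent[find(i)] = find(j)  # union the two components

    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    # A "cluster" in the sense above is any component with 2+ aircraft.
    return [g for g in groups.values() if len(g) > 1]
```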
We are now developing algorithms to manage this expected airspace congestion. The type and scale of expected UAS traffic necessitates that these operations be as decentralized as possible. Hence, as we develop these algorithms we also seek to determine the level of centralization that may or may not be needed to achieve a required level of performance, stability, safety and efficiency in future civil UAS operations.
As computing and sensing technology advances, Unmanned Aircraft Systems (UAS) become smaller yet increasingly capable. Many civil applications can reduce operational costs by employing small UAS (sUAS). In response to the immense demand for UAS commercialization, the Federal Aviation Administration (FAA) and the National Aeronautics and Space Administration (NASA) are investigating the possibility of opening low-altitude (below 500 ft) Class G (uncontrolled) airspace to small UAS (less than 55 lb) operations. As a result, a new research field called UAS traffic management (UTM) has emerged.
This project focuses on sUAS flight planning. The goal is to generate optimal flight paths under costs and constraints. Many UAS path-planning algorithms are available in the robotics literature; however, not all apply to the low-altitude flight planning problem. First, in low-altitude flight planning, the paths are expected not only to satisfy hard constraints such as geo-fences, ground profiles, and obstacles, but also to be optimal with respect to costs such as wind, weather, and population. This requirement excludes methods based on potential fields, skeletons, cell decomposition, heuristics (A* and D*), and geometry. Second, the algorithm should be computationally efficient, which excludes exact algorithms. Third, since current low-altitude airspace is not sectored, the algorithm should be able to run on discretized continuum data, possibly in 3D, at various resolutions; therefore, graphical methods based on nodes and edges, such as the discrete Dijkstra's algorithm, are not preferred. Lastly, the algorithm should be easily extendable to include an air-highway structure, if one is needed to avoid free-flight complexity.
With these limitations in mind, we find the Fast Marching Method (FMM) to be the most suitable algorithm for low-altitude UAS flight planning. First, we can model costs and constraints to obtain optimal paths. Second, it is very efficient, with time complexity O(N log N). Third, it can be applied to a 2D manifold in 3D space, which is convenient for modeling the ground-profile constraint. We perform flight-planning computation on an elevated air manifold, which is essentially a smoothed version of the ground profile. The manifold is stored as a list of vertices and triangles in a minimum-heap data structure, and each vertex is assigned a cost. Given an origin, the fast marching method performs front propagation to compute the cumulative cost for each vertex, and the optimal path is then computed from the destination back to the origin via gradient descent. Various costs and constraints can be modeled.
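The front-propagation step can be sketched on a regular grid. The following is a Dijkstra-style simplification for illustration only (the actual FMM solves the Eikonal equation with upwind finite-difference updates on the triangulated manifold; all names here are ours):

```python
import heapq

def front_propagation(cost, origin):
    """Propagate a cost front from `origin` over a 2D grid of per-cell
    costs, returning the cumulative cost to reach every cell."""
    rows, cols = len(cost), len(cost[0])
    dist = [[float('inf')] * cols for _ in range(rows)]
    dist[origin[0]][origin[1]] = 0.0
    heap = [(0.0, origin)]           # min-heap keyed on cumulative cost
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if d > dist[r][c]:
            continue                 # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist[nr][nc]:
                    dist[nr][nc] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return dist
```

The optimal path is then recovered by descending the cumulative-cost field from the destination back to the origin, as described above.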
According to FAA regulations (under Section 333 exemptions) and laws under discussion, drone pilots require the consent of the owner in order to fly over private property. As drone policy proceeds in this direction, regulating future flights necessitates some form of continued communication between pilots and property owners. To tackle this, we approach the problem of regulating low-airspace navigation from a land-ownership model. A property owner (an individual, city, or county) may place full, partial, or no restriction on a UAS flying in the air parcel above. The current build of NASA's UAS Traffic Management (UTM) system checks new flight paths against existing reserved paths and geofences; if a path is accepted, a polygonal prism of airspace is reserved along it for the duration of the flight. Our system breaks the complexity of checking a large number of flight paths against static (air parcels/geofences) and dynamic (other flights) obstacles into two platforms running in parallel. A flight operator submits a path, which is first verified against the existing air-parcel permissions in our air-parcel system; if it passes, it is forwarded to the UTM to be checked against existing flight paths, ensuring that most paths submitted to the UTM are acceptable. Second, we can retrieve active flight-track data from the UTM via WFS requests and update it in real time, so property owners can track which flights are expected to cross their air parcels. Several research problems arise in understanding the complexities of a large-scale implementation of such a system: efficient data structures and algorithms are needed to check constraints before the flight, and in real time if flight paths change. At a higher level, this system can be conceptualized as distributing control of a large-scale airspace between consumers and the FAA rather than a single centralized source.
Currently, the system allows property owners to change permissions for the air parcel above their properties.
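At its core, the pre-UTM verification tests a candidate path against restricted air parcels. A minimal 2D sketch using a ray-casting point-in-polygon test (names, and the waypoint-only check, are our simplifications; a real system would test the swept flight corridor in 3D):

```python
def point_in_polygon(pt, polygon):
    """Ray casting: count crossings of a horizontal ray from `pt` with
    the polygon boundary; an odd count means the point is inside."""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):                       # edge spans the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def path_allowed(waypoints, restricted_parcels):
    """Reject a flight path if any waypoint falls inside a parcel whose
    owner has denied overflight."""
    return not any(point_in_polygon(wp, parcel)
                   for wp in waypoints for parcel in restricted_parcels)
```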
Drones are the first robots to arrive in smart cities, and collision avoidance with helicopters is among the first barriers to their widespread use. Communication radios capable of broadcasting traffic information are regarded as a promising solution to the Sense and Avoid (SAA) problem for small Unmanned Aircraft Systems (sUAS), especially for high-speed flights where long-range remote sensing is not possible. In this project, we develop a hybrid safety controller to analyze collision avoidance between a quadrotor and a helicopter. The controller incorporates vehicle dynamics, wind disturbance, communication delay, and sensor uncertainty, enabling a small quadrotor to perform optimal collision avoidance with a high-speed helicopter during autonomous navigation. Simulation shows the effectiveness of the controller.
CPCC and 3DRobotics co-advised a Master of Engineering capstone project for a group of four students from the IEOR and EECS departments. The project investigated potential commercial applications of UAV research with a view to the upcoming federally mandated regulatory change.
Optimisation techniques were used to produce a simple path-planning interface that generates waypoints for a UAV while optimising for several constraints such as wind direction, air resistance, and fuel economy. This capability is missing from current open-source path-planning tools.
By using parallel-processing techniques and optimising generated code for the ARM instruction set, we were able to achieve the maximum frame rate possible using open-source libraries and boards. This allows rapid prototyping of image-processing applications. An EECS Master's report is available here.
Diagnosing failure in multi-rotor UAVs is currently difficult. By attaching accelerometers to the UAV's frame and comparing live data against a known baseline, it is possible to detect mechanical failure, either as a pre-arm check or in flight. We collected a dataset and, as part of a machine learning class project, were able to classify the UAV's structural health as erroneous or not with high accuracy. An EECS Master's report is available here and the machine learning class report is available here.
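The idea is to compare a live vibration signature against a healthy baseline. The class project used learned classifiers; the sketch below substitutes a simple RMS threshold just to show the shape of the check (names and the tolerance value are illustrative):

```python
import math

def rms(samples):
    """Root-mean-square amplitude of an accelerometer trace."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def vibration_anomaly(live, baseline, tolerance=0.25):
    """Flag the frame as anomalous if the live vibration RMS deviates
    from the healthy-baseline RMS by more than `tolerance` (fractional).
    A stand-in for the project's learned classifier."""
    base = rms(baseline)
    return abs(rms(live) - base) / base > tolerance
```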
In collaboration with UnmannedData, CPCC used multirotor UAVs to carry out aerial photography for Bay Area design firm Hood Design. Our photographs were used to present the proposed construction to the local community from a perspective that could not be captured through conventional photography. Hood Design is responsible for designing the connection between the Oakland Bay Trail and Lake Merritt; updates on their project are available here. The Bay Trail is a biking and walking trail under construction that will eventually provide a 500-mile continuous route around the Bay.
This video presents a conceptual future mail-delivery demonstration using a quadrotor UAV. Professor Wathiq Mansoov from the Department of Computer Engineering at the American University in Dubai (AUD) visited our UAV lab on March 12, 2014. During his visit, our lab member Gita performed this indoor demonstration.
The video describes an environmental monitoring exercise using networked vehicles and sensors. The exercise consisted of searching for a ship that had committed an environmental hazard known as "bilge dumping" and monitoring the resulting oil spill. The oil was recreated using 100 kg of popcorn, which is known to have similar dynamics; the popcorn was deployed by a Navy vessel. A fixed-wing Unmanned Aerial Vehicle (UAV) equipped with a gimbaled EO camera and an Automatic Identification System (AIS) receiver was deployed to search for the "oil" spill and the "suspected" ship. After the oil spill was detected, a message was sent to a Navy vessel, which deployed four drifters over the spill in order to forecast its dynamics. The drifters were equipped with GPS and broadcast their positions using AIS. The UAV received the AIS information and visited the drifters' locations to assess their effectiveness in forecasting the oil spill trajectory.
This video presents a demonstration performed in collaboration between the UC Berkeley Cyber-Physical Cloud Computing (CPCC) Lab and the Monterey Bay Aquarium Research Institute (MBARI). The purpose of this experiment was to use a small Unmanned Aerial Vehicle (a Zephyr flying wing) launched from an MBARI Zephyr vessel to detect oceanic fronts. A front forms at the boundary between two masses of water with different temperatures; fronts are rich habitats for biological activity, which can be observed as filaments of foam at the surface. The onboard video shows a clear front that appeared to be several miles in length. The filament is visible, as are the two masses of water, which appear in slightly different colors.
The video presents a flight test of the searching and tracking algorithm. The position of the target is unknown (for the sake of visualization we present the target as a blue circle), and target detection is abstracted using GPS and a model of a gimbaled camera. An operator defines an initial likelihood map of the target position; the present case uses a Gaussian map with a given mean and variance. The UAV starts searching for the target by steering towards the maximum of the a priori likelihood map. During the search, the likelihood map is updated according to the sensor model: if the target is not detected under the sensor footprint, the likelihood there is decreased to near zero; if the target is detected, the likelihood map becomes a Gaussian function with mean centered at the estimated target position and a variance that models the sensor resolution error. When the target is detected, the algorithm switches to tracking mode, and it switches back to searching mode whenever the target leaves the sensor's footprint. This work is a summer project performed by researchers from UC Berkeley (CPCC project - cpcc.berkeley.edu/), Academia da Força Aérea, and FEUP.
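The likelihood-map update can be sketched on a discretized grid. For brevity, the detection case below collapses the sensor-resolution Gaussian to a single cell, and all names are illustrative rather than the project's actual code:

```python
def update_likelihood(grid, footprint, detected, target_cell=None,
                      miss_factor=0.1):
    """One sensor-model update of a discretized likelihood map:
    on a miss, scale the cells under the sensor footprint toward zero;
    on a detection, concentrate mass at the estimated target cell."""
    if detected:
        new = [[0.0] * len(row) for row in grid]
        r, c = target_cell
        new[r][c] = 1.0    # resolution Gaussian collapsed to a point
        return new
    new = [row[:] for row in grid]
    for (r, c) in footprint:
        new[r][c] *= miss_factor      # "decreased to near zero"
    # Renormalize so the map remains a probability distribution.
    total = sum(sum(row) for row in new)
    if total > 0:
        new = [[v / total for v in row] for row in new]
    return new
```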
The CPCC Sense Act Move (SAM) platform provides the infrastructure to create so-called virtual vehicles (VVs) that are capable of executing certain user-defined tasks. VVs are deployed into a cloud of real vehicles (RVs), which provide execution platforms that may be moving in space. Furthermore, users are able to create mapping plans that tell the system how VVs move across several RVs. The platform provides a website to control the system of RVs and VVs. The parts in more detail:
This simulation system demonstrates information-acquisition-as-a-service of mobile sensor networks for cyber-physical cloud computing (CPCC), as proposed in the HotCloud paper referenced below. Based on the JNavigator project, the implementation provides:
(1) The simulation of physical helicopter swarms;
(2) The simulation of sensors;
(3) The virtual abstraction of autonomous vehicles (virtual vehicles);
(4) The migration of virtual vehicles among flying physical helicopters (real vehicles).
The implemented system currently allows the simulation of helicopter fleets of several dozen vehicles and supports the simulation of sensors such as GPS receivers and photo cameras. To simulate air-pressure sensors, temperature sensors, etc., the system utilizes random number generators that deliver values within a defined range and precision.
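A simulated sensor of this kind, a random generator with a defined range and precision, can be sketched as follows (function and parameter names are illustrative, not the simulator's API):

```python
import random

def make_simulated_sensor(lo, hi, precision):
    """Return a simulated sensor that yields uniform random readings in
    [lo, hi], rounded to `precision` decimal places, mirroring how the
    simulator stands in for air-pressure and temperature sensors."""
    def read():
        return round(random.uniform(lo, hi), precision)
    return read
```

For example, a simulated temperature sensor might be `make_simulated_sensor(-10.0, 40.0, 1)`, giving readings such as 23.4.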
Simulated helicopters do not access the onboard sensors for data collection directly. Instead, data is gathered by a virtual abstraction of autonomous vehicles, Virtual Vehicles (VVs) for short. One helicopter can carry several VVs, and to complete their missions, VVs may migrate between helicopters.
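A migration decision of the kind described, choosing which helicopter should carry each VV, can be sketched as a nearest-carrier assignment (a toy policy; the platform's actual mapping plans are user-defined, and all names here are illustrative):

```python
import math

def plan_migrations(vvs, rvs):
    """For each virtual vehicle, pick the real vehicle (helicopter)
    closest to the VV's next task location.
    vvs: {vv_id: next_task_position}, rvs: {rv_id: current_position}."""
    assignment = {}
    for vv_id, next_wp in vvs.items():
        best = min(rvs, key=lambda rv_id: math.dist(rvs[rv_id], next_wp))
        assignment[vv_id] = best   # VV migrates to this carrier
    return assignment
```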
The source code of the CPCC Simulator can be found here.
Craciunas, S.S., Haas, A., Kirsch, C.M., Payer, H., Röck, H., Rottmann, A., Sokolova, A., Trummer, R., Love, J., and Sengupta, R.: Information-acquisition-as-a-service for cyber-physical cloud computing. In Proc. Workshop on Hot Topics in Cloud Computing (HotCloud). USENIX, 2010.
Krainer, Clemens D.: JNavigator - An Autonomous Navigation System for the JAviator Quadrotor Helicopter. Master's thesis, University of Salzburg, Austria, 2009.
Our undergraduates have been working with the Robot Operating System (ROS), the AR Drone driver developed by Brown University, and OpenCV (an open-source C++ image-processing library) to establish an easily deployable platform for testing multi-UAV algorithms. The platform supports obtaining the video stream from the AR Drone, processing the video, and using the results to control the drone. Current work uses this test bed to track a colored object with a single UAV; future goals include the use of multiple automated drones to cooperatively complete a task.
This video presents a preliminary flight test of a flying-wing UAV sampling CO2 concentration in flight. The goal of the project is to create a mobile, easily deployable CO2 sensor network using a team of inexpensive autonomous UAVs. As shown in the video, the UAV was equipped with an ArduPilot Mega autopilot and a K-30 CO2 sensor. The Zephyr flying-wing airframe used in the video can carry up to 1 pound of payload and has a maximum flight duration of 30 minutes. The ArduPilot Mega is an inexpensive open-source autopilot capable of autonomous waypoint flights, fly-by-wire mode, and manual mode. The K-30 CO2 sensor has a measurement range of 0-10,000 ppm; more information can be found on the manufacturer's website. The flight test was performed near Merritt College in Oakland, California, and the aircraft was flown in manual mode.
C. Kirsch et al., Cyber-Physical Cloud Computing: The Binding and Migration Problem. [pdf]
J. Love, Network-Level Control of Collaborative UAVs. [pdf]
S. Craciunas, J. Love et al., Information-Acquisition-as-a-Service for Cyber-Physical Cloud Computing. [pdf]
J. Love et al., CSL: A Language to Specify and Re-Specify Mobile Sensor Network Behaviors. [pdf]