February 28

All parts have arrived, prototyping is well underway, and things are starting to come together!

Mechanical Update

With the CAD work complete, mechanical efforts shifted to manufacturing and assembling the system modules. 3D printing of the plastic components took place over multiple days, with each part post-processed and checked for tolerance issues and defects. All ordered components arrived without delay, including the custom-ordered sheet metal and acrylic parts.

The final components to be manufactured were the metal heat conduction blocks. These three blocks are designed to conduct heat from the high-temperature computing and power components to the aluminum exterior of the device, where airflow from the bike's movement creates forced convection and cools the heat-generating components. The custom blocks were manually milled from aluminum plate in the E5 student shop, a process that allowed for low cost and a fast turnaround. The photo below shows one of the blocks during the milling operation.


Component During Milling

With all of the components ready, the system was test-fitted together. The main connections in the system are made with heat-set inserts melted into the 3D-printed plastic components to accept bolts, creating strong, reliable connection points for assembly. The preliminary assembled system is ready for electrical integration. The mechanical team is continuing to work on design improvements and on integrating the remaining features, including waterproof external wiring and power switch mounting.

Electrical Update

All four PCBs have arrived from PCBWay (Shenzhen) and are currently being assembled and tested. The main PCB, which distributes power to all modules, manages battery protection and charging, and controls the lighting and audio features, is fully assembled and currently being validated. Firmware development for the microcontroller on this board is also underway. All functions except battery charging have been evaluated.

The next steps include completing validation and firmware for this PCB, and bringing up communication with the front module's Raspberry Pi. One of the auxiliary PCBs has been partially assembled to validate the LED driving circuit, but the rest need to be assembled and tested before final system assembly. Additionally, the wiring needs to be completed, with the correct lengths and connectors.

Assembly of Falcon Board

All Boards After Arrival

Software Update

After receiving the Intel Neural Compute Stick 2, our team benchmarked the device using two popular object detection models: MobileNet v1 SSD (0.75 depth) and MobileNet v2 SSD, trained on the Common Objects in Context (COCO) dataset with an input size of 300 × 300. These models were selected because they were already available in the OpenVINO Intermediate Representation. The Intel Neural Compute Stick 2 requires the OpenVINO toolkit, so models must be converted to the appropriate format beforehand. This conversion process requires a specific distribution of Linux and is challenging, so pre-existing models were chosen to save time. Additional models may be converted if the need arises.

The performance of the Intel Neural Compute Stick 2 exceeded that of the Raspberry Pi CPU, as expected. The average inference time over 10 runs was 51.95 ms for MobileNet v1 and 80.41 ms for MobileNet v2. The latter model produces more accurate detections, so there is a tradeoff between speed and accuracy. Compared to the 134.54 ms inference time using the CPU only, this is a large improvement and would theoretically allow us to meet our 10 Hz specification. However, that would require installing four Intel Neural Compute Stick 2 devices inside our system, which is not mechanically feasible. Our hope is to install two devices in the front module and settle for slightly less than 10 Hz to preserve the accuracy of detections.
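The averaged timings above came from repeated runs of the same inference. As a minimal sketch of that kind of measurement (the `infer` callable is a hypothetical stand-in for a single forward pass on the device), the harness might look like:

```python
import time

def benchmark(infer, n_runs=10, warmup=2):
    """Average wall-clock time of infer() in milliseconds over n_runs.

    A few warmup runs are discarded first, since the first inference on an
    accelerator is often slower due to one-time initialization.
    """
    for _ in range(warmup):
        infer()
    times = []
    for _ in range(n_runs):
        start = time.perf_counter()
        infer()
        times.append((time.perf_counter() - start) * 1000.0)
    return sum(times) / len(times)
```

Using `time.perf_counter()` rather than `time.time()` avoids clock-resolution artifacts when individual runs take only tens of milliseconds.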

Additionally, our team performed testing with the USB cameras that will be used for blindspot detection. The front- and rear-facing cameras leave a small blindspot, which is monitored by a single USB camera on each side. These units are not arranged in a stereo configuration and will instead use relative vehicle size as an indicator of depth. The resolution and framerate of the cameras were tested and both met our expectations. The connector was found to be of poor quality, but it can be bypassed by soldering directly to the board to avoid signal losses.
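Using apparent vehicle size as a depth cue follows directly from the pinhole camera model: for an object of known real-world height, distance is inversely proportional to its height in pixels. A minimal sketch (the function name and the example focal length are illustrative, not from our calibration):

```python
def distance_from_size(focal_px, real_height_m, pixel_height):
    """Pinhole-model range estimate: Z = f * H / h.

    focal_px      -- camera focal length in pixels
    real_height_m -- assumed real-world height of the object (e.g. a car)
    pixel_height  -- measured height of the object in the image, in pixels
    """
    if pixel_height <= 0:
        raise ValueError("pixel_height must be positive")
    return focal_px * real_height_m / pixel_height
```

For example, with a focal length of 800 px, a vehicle assumed to be 1.5 m tall that spans 100 px would be estimated at 12 m away. The accuracy of the estimate depends on how close the assumed height is to the true vehicle height.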

While testing the USB cameras, we also had an opportunity to test our wrapper for the OpenCV VideoCapture class, which we use to interface with the cameras. VideoCapture buffers frames internally and only returns them in order, so bufferless capture (i.e., discarding frames until one is required) is not possible out of the box. Using multithreading, we implemented a wrapper around the VideoCapture class that continuously polls the camera and keeps only the most recent frame in a queue. This works well and allows us to grab frames in real time while doing CPU-intensive computation in between.
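A minimal sketch of such a wrapper, assuming any VideoCapture-like object with `read()` and `release()` methods (class and attribute names here are illustrative, not our actual implementation):

```python
import queue
import threading

class BufferlessCapture:
    """Keep only the latest frame from a VideoCapture-like source.

    A background thread polls the source continuously; each new frame
    replaces the previous one in a size-1 queue, so read() always
    returns the freshest available frame.
    """

    def __init__(self, cap):
        self.cap = cap
        self.q = queue.Queue(maxsize=1)
        self.running = True
        self.thread = threading.Thread(target=self._reader, daemon=True)
        self.thread.start()

    def _reader(self):
        while self.running:
            ok, frame = self.cap.read()
            if not ok:
                break
            # Drop the stale frame, if any, before storing the new one.
            try:
                self.q.get_nowait()
            except queue.Empty:
                pass
            self.q.put(frame)

    def read(self):
        # Blocks until a frame is available, then returns the freshest one.
        return self.q.get()

    def release(self):
        self.running = False
        self.cap.release()
```

In use, `BufferlessCapture(cv2.VideoCapture(0)).read()` would return the most recent camera frame rather than the oldest one sitting in OpenCV's internal buffer.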

Lastly, we developed and tested our stereo calibration software. The mechanical team mounted a set of stereo cameras inside a prototype of the rear module, and a standard stereo calibration procedure was followed: several photos of a checkerboard were taken in all regions of the image frame and at different angles, and the intrinsic and extrinsic camera parameters were computed from them. From these parameters, the depth map, which provides an estimated depth on a per-pixel basis, can be derived. This is a fundamental component of our vehicle positioning system. The front-facing stereo camera unit will be calibrated once the front module is assembled.
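Once the cameras are calibrated and rectified, depth follows from disparity via the standard stereo relation Z = f·B/d, where f is the focal length in pixels, B is the baseline between the cameras, and d is the per-pixel disparity. A minimal per-pixel sketch (illustrative function, not our calibration code):

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulated depth for one pixel: Z = f * B / d.

    Non-positive disparities carry no depth information (the point is
    effectively at infinity or the match failed), so return +inf.
    """
    if disparity_px <= 0:
        return float("inf")
    return focal_px * baseline_m / disparity_px
```

For instance, with a 700 px focal length and a 10 cm baseline, a 100 px disparity corresponds to a depth of 0.7 m; halving the disparity doubles the estimated depth, which is why depth resolution degrades with range.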

App Development Update

As we approach the symposium date and the end of the project, testing of the mobile application has begun. The image below summarizes the screens that were established in the previous blog post.


Screen Navigation Layout

The images below illustrate the screens used to sign in or sign up for the Falcon Safety mobile application. When the user initially opens the application, they are directed to a screen where they can sign in with a pre-existing account. If they do not have an account, the user can quickly navigate to the sign-up screen by clicking the blue text below the “Sign In” button. Similarly, if one navigates to the sign-up screen and would like to return to the sign-in screen, they can click the blue text below the “Sign Up” button.

Sign In Page

Sign Up Page

Once a user has successfully logged into the application, a unique token is generated and securely saved on the device and in cloud storage. This token can then be used to perform a variety of tasks that provide a smooth user experience. When a user restarts the application at a later time, they will not be required to sign in, as the token automatically authenticates them. Once in the application, the token is used to retrieve the user’s saved rides from cloud storage for visualization and for reviewing their memorable trips.
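The token flow above (generate on login, persist, restore on restart) can be sketched as follows. This is a simplified illustration, not the app's actual code: the file path, function names, and the `verify_credentials` callback are all hypothetical stand-ins for the real secure device storage and cloud-side registration.

```python
import json
import secrets
from pathlib import Path

# Stand-in for secure on-device storage (a real app would use the
# platform keystore, not a plain file).
TOKEN_FILE = Path("session_token.json")

def sign_in(username, password, verify_credentials):
    """On successful login, generate and persist a session token."""
    if not verify_credentials(username, password):
        return None
    token = secrets.token_hex(32)  # unique, unguessable session token
    TOKEN_FILE.write_text(json.dumps({"user": username, "token": token}))
    # In the real app, the token would also be registered in cloud storage.
    return token

def restore_session():
    """Auto-authenticate on restart if a saved token exists."""
    if not TOKEN_FILE.exists():
        return None
    return json.loads(TOKEN_FILE.read_text())
```

On the next launch, `restore_session()` returns the saved token, so the user skips the sign-in screen entirely.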

So build away, we do.

Come back and check out our blog for regular updates.

University of Waterloo
  • Group 9
  • Falcon Safety