February 13

Our PCB layouts were sent out for manufacturing, and our mechanical design is almost complete!

Mechanical Update

The detailed CAD design of the rear module was completed. In addition to mounting the rear-facing cameras, the rear module houses the rear compute board (a Raspberry Pi 4) and the rear warning light ring. The design leverages a metal back plate to dissipate heat from the compute chip. The module's compact form is achieved by inverting the stack of the Raspberry Pi and the camera board, allowing closer placement and optimized routing of the camera ribbon cables.


Detailed Rear CAD Model

With the full system CAD complete, the mechanical effort has transitioned to part acquisition and manufacturing. 3D printing of plastic components is underway on the team's Prusa MK3. Sheet metal components have been ordered from a local laser cutting shop, and the remaining mechanical orders have been placed with Amazon and McMaster-Carr. The aluminum stock for the custom metal heatsinks will be purchased from the E3 machine shop and machined on a manual mill in the E5 student machine shop.

Electrical Update

Previously, the schematic design for our electronics was complete and board layout had begun. As of this week, the full PCB design for all four boards has been completed. The fabrication files have been sent to PCBWay, our manufacturer in Shenzhen; manufacturing is currently in progress, and we expect to receive the blank PCBs next week. The boards will be assembled in-house. A stencil for the main PCB has been ordered so this board can be assembled quickly with solder paste and a reflow oven.

The main Falcon PCB protects, balances, and charges the lithium polymer battery that powers the system. It also regulates power from this battery or the AC/DC adapter into several voltage rails for the different components; notably, the high-power 5.1 V rail can supply up to 10 A. This board includes the constant-current driver for the high-power front LED, and open-drain drivers for the piezo buzzers (for audible alerts). Since the main PCB supplies several high-power loads, it is outfitted with heatsink pours on the back plane that will interface with the aluminum enclosure via thermal pads. The Raspberry Pis will dissipate heat through the enclosure similarly.
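To give a sense of the thermal budget these heatsink pours handle, here is a back-of-envelope sketch. All component values (the LED forward voltage and drive current) are illustrative assumptions, not measured figures from our boards.

```python
# Back-of-envelope thermal budget for the main PCB's high-power loads.
# The LED forward voltage and current below are assumed values.

def rail_power(voltage_v: float, current_a: float) -> float:
    """Maximum power delivered by a supply rail (P = V * I)."""
    return voltage_v * current_a

def linear_driver_dissipation(v_in: float, v_led: float, i_led: float) -> float:
    """Heat burned in a linear constant-current driver: the voltage
    headroom (Vin - Vled) times the LED current."""
    return (v_in - v_led) * i_led

# Assumed numbers: the 5.1 V rail at its 10 A limit, and a 3.2 V white
# LED driven at 1 A directly from the 5.1 V input.
print(f"{rail_power(5.1, 10.0):.1f} W")                      # 51.0 W
print(f"{linear_driver_dissipation(5.1, 3.2, 1.0):.1f} W")   # 1.9 W
```

Even a single watt or two of driver headroom loss justifies the thermal pads between the pours and the enclosure.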

The three auxiliary lighting PCBs are equipped with constant-current drivers for their LED arrays. The brightness of each array is controlled by a PWM signal generated by the main PCB.
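The brightness-to-duty-cycle mapping can be sketched as follows. The 10-bit PWM resolution is an assumption for illustration, not the actual resolution used on the main PCB.

```python
# Sketch of mapping a brightness percentage to the PWM compare value the
# main PCB would generate for an auxiliary lighting board.
# The 10-bit resolution is an assumed value.

def brightness_to_duty(brightness_pct: float, resolution_bits: int = 10) -> int:
    """Convert 0-100 % brightness to a PWM compare value."""
    if not 0.0 <= brightness_pct <= 100.0:
        raise ValueError("brightness must be within 0-100 %")
    top = (1 << resolution_bits) - 1          # e.g. 1023 for 10-bit PWM
    return round(brightness_pct / 100.0 * top)

print(brightness_to_duty(0))     # 0    -> LED array off
print(brightness_to_duty(100))   # 1023 -> full brightness
```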

Main PCB


Rear PCB

Software Update

Due to significant shipping delays, we did not receive the Intel Neural Compute Stick 2 until the end of this week. In the meantime, we have prepared a set of models to benchmark. The Neural Compute Stick 2 uses the OpenVINO toolkit to deploy and optimize AI inference for models trained with TensorFlow and other popular frameworks. Models must therefore be converted to the OpenVINO format rather than the TensorFlow Lite format used for performing inference directly on the Raspberry Pi. Alasdair Allan, who published the comprehensive benchmarks referenced in our last post, has published MobileNetV1 and MobileNetV2 in the OpenVINO format. In addition, we acquired a converted version of the EfficientDet model that we initially benchmarked on the Raspberry Pi. Despite the setbacks due to the shipping delays, we expect to complete the benchmarks shortly and proceed to implementation.
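While we wait, the timing harness for the benchmarks can be prepared in advance. A minimal sketch is below: `fake_infer` is a stand-in stub, and in the real run it would be replaced with a call into OpenVINO's inference API once the stick is in hand.

```python
# Minimal timing harness for the inference benchmarks. fake_infer is a
# sleep-based stub standing in for an actual OpenVINO inference call.
import statistics
import time

def benchmark(infer, n_warmup: int = 3, n_runs: int = 20):
    """Return (mean_ms, stdev_ms) over n_runs timed inference calls."""
    for _ in range(n_warmup):                # warm-up runs are discarded
        infer()
    samples = []
    for _ in range(n_runs):
        start = time.perf_counter()
        infer()
        samples.append((time.perf_counter() - start) * 1000.0)
    return statistics.mean(samples), statistics.stdev(samples)

def fake_infer():                            # stand-in for a real model pass
    time.sleep(0.005)                        # pretend one pass takes ~5 ms

mean_ms, stdev_ms = benchmark(fake_infer, n_runs=10)
print(f"{mean_ms:.1f} ms +/- {stdev_ms:.1f} ms per inference")
```

The warm-up runs matter on the Compute Stick, since the first few inferences include one-time setup cost that would otherwise skew the mean.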

The software team also looked into solutions for adding authentication, storage, and a backend to our mobile application. Our familiarity with Amazon S3 made it an excellent choice for storing recorded media such as images and videos. We have also begun testing AWS Amplify, a service for building extensible, full-stack web and mobile apps. This set of tools allows for the simple creation of a backend, user management, and connection to additional AWS services such as S3. Amplify integrates seamlessly with React Native, the UI framework we use to develop our mobile application. We are currently familiarizing ourselves with Amplify and will begin integrating it into our application after confirming that it satisfies our needs.

App Development Update

As per our Gantt chart, mobile application development has begun and is on track to be ready for testing by February 25th. The app architecture and screen layouts have been developed using the React Native framework. The image below illustrates the general layout of the screen navigation implemented thus far, along with a brief description of the different navigators used in the application layout.


Screen Navigation Layout

The Sign-in and Sign-up screens use a stack navigator to switch between the two screens. Despite its lower performance compared to a native implementation, the stack navigator is easy to customize and can switch quickly between a small number of screens. Since the user is expected to spend a limited amount of time on these two screens, we deemed it acceptable to trade performance for faster development.

Once the user is logged into the app, the switch navigator immediately navigates them to the RideCreate screen. This abrupt screen transition with the switch navigator gives the user access to the 3+1 screens via the now-available bottom navigation bar, which contains icons for navigating between the RideCreate, RideHistory, and Account screens.

The RideCreate screen gives the user access to the RideForm and RideVisual sub-screens, as illustrated in the diagram below. This enables the user to record their journey and visualize the warnings provided by the Falcon sensor suite.

Whenever the user wants to review their ride history, they can navigate to the RideHistory screen using the bottom navigation bar. Here, they have access to a list of all the rides they have recorded to date. Upon selecting a past ride, a RideDetail screen appears on top of the current screen, where the user can review the trip's length, average speed, location, and other details.
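The trip statistics shown on the RideDetail screen can be derived from the geotagged samples a ride produces. A minimal sketch, assuming each sample is a (latitude, longitude, Unix timestamp) tuple; the field layout and coordinates are illustrative:

```python
# Sketch of deriving RideDetail statistics from geotagged trip samples.
# The sample format and the example coordinates are assumptions.
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def trip_summary(samples):
    """samples: list of (lat, lon, unix_seconds) tuples, in ride order."""
    dist_km = sum(
        haversine_km(a[0], a[1], b[0], b[1])
        for a, b in zip(samples, samples[1:])
    )
    duration_h = (samples[-1][2] - samples[0][2]) / 3600.0
    return {"distance_km": dist_km, "avg_speed_kmh": dist_km / duration_h}

# Two fixes roughly 2 km apart in Waterloo, 5 minutes apart (made-up data).
ride = [(43.4723, -80.5449, 0), (43.4643, -80.5204, 300)]
print(trip_summary(ride))
```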

Once the user has completed their task, they can simply close the app or log out through the Account screen. Simply closing the app lets it automatically log in the same user the next time it is launched. If the user wishes to log in with a different account or lend their device to another user, they can safely log out on the Account screen.
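The navigation flow described above can be modelled as a small graph. This sketch only mirrors the structure for clarity; the app itself uses React Navigation's stack, switch, and tab navigators, not this code, and the exact set of transitions is our assumption.

```python
# Toy model of the app's screen navigation graph (illustrative only).
NAV_GRAPH = {
    "SignIn":      {"SignUp", "RideCreate"},   # switch navigator on login
    "SignUp":      {"SignIn"},
    "RideCreate":  {"RideForm", "RideVisual", "RideHistory", "Account"},
    "RideForm":    {"RideCreate"},
    "RideVisual":  {"RideCreate"},
    "RideHistory": {"RideDetail", "RideCreate", "Account"},
    "RideDetail":  {"RideHistory"},
    "Account":     {"SignIn", "RideCreate", "RideHistory"},  # log out -> SignIn
}

def navigate(current: str, target: str) -> str:
    """Follow an edge in the navigation graph, or fail loudly."""
    if target not in NAV_GRAPH.get(current, set()):
        raise ValueError(f"no route from {current} to {target}")
    return target

screen = navigate("SignIn", "RideCreate")   # login
screen = navigate(screen, "RideHistory")    # via the bottom navigation bar
screen = navigate(screen, "RideDetail")     # tap a past ride
print(screen)                               # RideDetail
```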


Modular App Navigation

The application architecture has been designed to give all screens access to an authentication step when requesting data from cloud storage. Whenever a user requests their data by navigating to a screen, an API request is made through the AuthProvider, which verifies the user using a JSON Web Token (JWT).
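A minimal sketch of the decode-and-verify idea behind a JWT check is below. Note this is not our AuthProvider's code: Amplify's Cognito tokens are RS256-signed and verified against a public key, whereas this stdlib-only example uses HS256 with an assumed shared secret purely to illustrate the mechanism.

```python
# Illustrative HS256 JWT verification (Cognito actually uses RS256).
import base64
import hashlib
import hmac
import json

def b64url_decode(segment: str) -> bytes:
    return base64.urlsafe_b64decode(segment + "=" * (-len(segment) % 4))

def verify_hs256(token: str, secret: bytes) -> dict:
    """Return the payload if the signature matches, else raise ValueError."""
    header_b64, payload_b64, sig_b64 = token.split(".")
    expected = hmac.new(secret, f"{header_b64}.{payload_b64}".encode(),
                        hashlib.sha256).digest()
    if not hmac.compare_digest(expected, b64url_decode(sig_b64)):
        raise ValueError("bad signature")
    return json.loads(b64url_decode(payload_b64))

# Build a token locally so the sketch is self-checking.
def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

secret = b"demo-secret"   # assumed value; never hard-code a real secret
head = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
body = b64url(json.dumps({"sub": "rider-42"}).encode())
sig = b64url(hmac.new(secret, f"{head}.{body}".encode(),
                      hashlib.sha256).digest())
print(verify_hs256(f"{head}.{body}.{sig}", secret))   # {'sub': 'rider-42'}
```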

All screens also have access to a LocationProvider that exposes the mobile device's GPS location. This is especially useful when the user is recording a trip on the RideCreate screen, as it allows the trip data to be geotagged.

Thanks to the layout of the screens and providers, mobile app development has become very modular, which has enabled the team to work on one screen without interfering with the progress of others. In the upcoming weeks, the team will add another provider to the current architecture for Bluetooth communication with the Falcon sensor suite. Alongside this, the team will also improve the warning and ride visualization embedded in the RideVisual screen.

So build away, we do.

Come back and check out our blog for regular updates.

University of Waterloo
  • Group 9
  • Falcon Safety