Perception
The brief:
On a drive back from a vet appointment, our founder Ryan was targeted in an attempted 'crash-for-cash' scam, where another driver tried to cause a collision involving him in order to make a fraudulent insurance claim. After getting home, Ryan went to view the dashcam footage, but discovered the in-car viewer had some limitations. For example:
- if the car had been written off or impounded, or if a crash had disabled the car's electronics, the viewer would have been unavailable
- if the footage needed to be shared with insurers or law enforcement, there was no way to easily do this
- parts of the event could not be viewed due to bugs within the in-car viewer
- for the finer details (such as license plates), there was no way to zoom in
- whilst events are saved to a USB drive, this is subject to failure or misplacement, and there is no easy way to back up important events.
Ryan decided that, alongside our other projects, we needed to build the best Teslacam viewer available on the market - Perception. Specifically, we focused on:
- viewing - Perception needed to be the best way to view Teslacam events on all devices, including pinch-to-zoom, filtering events, and playback speed options
- storing - Perception needed to be a safe place for events to be preserved, in case the USB drive were to fail
- sharing - Perception needed to facilitate exporting Teslacam footage outside of the app, to share with insurers / law enforcement / social media
Our solution:
Initially, we built Perception with React Native as a basic MVP (minimum viable product) to validate a few assumptions. We do this across all of our new projects as part of our discovery process, to ensure we're only committing to work that can be completed. We wanted to check:
- can an iPhone connect with a USB 3.0 device via an adaptor, and play multiple video streams from it?
- can an iPhone handle switching between multiple video files seamlessly?
- can a React Native app handle multiple videos playing concurrently?
The primary challenge was that Teslacam events are not a single video file. Instead:
- every minute of an event is a 'segment', so for a 10 minute event, there are 10 segments
- every 'segment' contains 4 video files - 'left', 'right', 'front', 'rear'
- for every event, there is a 'metadata' file containing information about the event, but this does not detail the files for each segment.
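Because the metadata file doesn't list each segment's files, the viewer has to reconstruct segments from the filenames on the drive. As a minimal sketch (the `<timestamp>-<camera>.mp4` filename pattern here is an illustrative assumption, not the exact Teslacam naming), grouping could look like this:

```typescript
// Sketch: group an event's loose video files into per-minute segments,
// one clip per camera angle. Filename pattern is assumed for illustration.
type Camera = "front" | "left" | "right" | "rear";

interface Segment {
  timestamp: string;                       // the minute this segment covers
  files: Partial<Record<Camera, string>>;  // up to 4 clips, one per camera
}

function groupIntoSegments(fileNames: string[]): Segment[] {
  // Assumed pattern: "<timestamp>-<camera>.mp4"
  const pattern = /^(.+)-(front|left|right|rear)\.mp4$/;
  const byTimestamp = new Map<string, Segment>();

  for (const name of fileNames) {
    const match = pattern.exec(name);
    if (!match) continue; // skip the event's metadata file, etc.
    const [, timestamp, camera] = match;
    let segment = byTimestamp.get(timestamp);
    if (!segment) {
      segment = { timestamp, files: {} };
      byTimestamp.set(timestamp, segment);
    }
    segment.files[camera as Camera] = name;
  }

  // Sort chronologically so playback can step from segment to segment.
  return [...byTimestamp.values()].sort((a, b) =>
    a.timestamp.localeCompare(b.timestamp)
  );
}
```

Under this model, a 10-minute event produces 10 segments of up to 4 files each, which is exactly the shape a viewer needs to drive four concurrent players.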
From our MVP, we discovered that we could connect with a USB 3.0 device in our React Native app, and play multiple videos concurrently. We also discovered that when swapping from one event segment to the next, the swap was seamless.
We also took the opportunity to experiment with native code for improved performance, especially when working with large numbers of video files. Starting from a legacy, unmaintained library, we transformed it into react-native-video-manager, rewriting it in Swift to modernise it and adding a large amount of functionality, including:
- merging large numbers of videos, with custom options to handle missing audio tracks, and optional merge progress events passed to the JS layer
- getting durations & playback information for individual video files
- fetching video metadata for a batch of video files
- generating thumbnails for video files, including support for generating thumbnails at a specific point within the video.
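To make the division of labour concrete, the capabilities above can be sketched as a TypeScript interface over the native layer. This is an illustrative shape only - the names and signatures here are assumptions for the sketch, not react-native-video-manager's actual API - with an in-memory mock standing in for the Swift module so the example runs anywhere:

```typescript
// Illustrative (assumed, not the library's real API) surface for the
// four capabilities: merging, durations, batch metadata, thumbnails.
interface VideoInfo {
  uri: string;
  durationMs: number;
}

interface VideoManagerLike {
  // Merge many clips into one file, optionally reporting progress to JS.
  merge(uris: string[], onProgress?: (fraction: number) => void): Promise<string>;
  // Duration of a single clip.
  getDuration(uri: string): Promise<number>;
  // Metadata for a batch of clips in one call.
  getMetadata(uris: string[]): Promise<VideoInfo[]>;
  // Thumbnail at a specific point within the video; returns an image path.
  thumbnail(uri: string, atMs: number): Promise<string>;
}

// Mock standing in for the native (Swift/Kotlin) implementation.
const mockManager: VideoManagerLike = {
  async merge(uris, onProgress) {
    uris.forEach((_, i) => onProgress?.((i + 1) / uris.length));
    return "merged.mp4";
  },
  async getDuration(_uri) {
    return 60_000; // Teslacam segments are roughly one minute each
  },
  async getMetadata(uris) {
    return Promise.all(
      uris.map(async (uri) => ({
        uri,
        durationMs: await mockManager.getDuration(uri),
      }))
    );
  },
  async thumbnail(uri, atMs) {
    return `${uri}@${atMs}.jpg`;
  },
};
```

Keeping the heavy work (decoding, merging, thumbnailing) behind a narrow async interface like this is what lets the JS layer stay responsive while the native side churns through large batches of files.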
Initially, as our MVP only supported iPhone, iPad, and Mac, our native code was written with those platforms in mind. We later revisited it to build an equally performant Android native module.
From our MVP, we worked with a UI & UX agency to create intuitive interfaces suitable for the diverse range of Tesla owners: from the tech-savvy owners who love customising their cars, to the parents who rely on their cars to transport their families, and the first-time EV drivers learning the technology - Perception needed to cater to everyone, not just those comfortable navigating complex software.
This agency delivered a comprehensive set of wireframes to define the flow of features within Perception, before moving onto a design system capable of adapting seamlessly across mobile, tablet, and desktop - and then finally producing high-fidelity designs that blended functionality and aesthetics.
Making use of React Native, Zustand, Realm, and our custom React Native video management library, we built a fully-fledged dashcam and Sentry Mode viewer for Tesla owners, that turns a previously complicated & error-prone process into three steps: insert the USB, open Perception, import the events.
Mirroring how we work with our clients, we rapidly iterated on features and functionality throughout development - experimenting with different ways of working & analysing data to see how the user base responded to changes, so we could get things right. We also took guidance from the community, empowering us to build what our users wanted & deprioritise functionality that mattered less. Multi-event deletion, playback speeds, and recent clips support, for example, were all requested by the community.
By delivering this project, we:
- empowered drivers to easily access & view their Teslacam events
- helped Tesla owners share their Tesla footage to their insurers & law enforcement in the event of an incident
- provided a seamless and intuitive platform for Tesla videos to be viewed, without needing to view segments individually
Perception can be downloaded on iOS, iPadOS, and macOS at https://download.perception.vision.
Next:
Looking for a consultancy that specialises in the obscure within mobile? Take a look at our other projects!
Got an idea for an app that'll positively impact society? We'd love to chat!
Services:
- React Native (TypeScript) app development
- iOS app development
- macOS app development
- Android app development
- Design
- Mobile strategy
- Product development
- Video display
- Video rendering
- USB I/O support