#timeseries

waynerad@diasp.org

"Why do you need a time series database inside a car?"

That's a good question. Sometimes I wonder if we've passed the point where further complexification of cars yields much benefit. But let's continue.

"As automotive intelligence progresses, vehicles generate increasing amounts of time-series data from various sources. This leads to high costs in data collection, transmission, and storage. GreptimeDB's Integrated Vehicle-Cloud Solution addresses these issues by leveraging the advanced computational capabilities of modern on-vehicle devices. Unlike traditional vehicle-cloud coordination where vehicles are mere data collectors, this new approach treats them as full-fledged servers capable of running complex tasks locally. The evolution from 32-bit microcontrollers to powerful chip modules like Qualcomm's 8155 or 8295 has enabled intelligent vehicles to perform edge computing efficiently, reducing transmission costs and improving overall efficiency."

"GreptimeDB is a cloud-native time-series database built on a highly scalable foundation. However, we did not initially anticipate it running on edge devices such as vehicles, which has presented significant challenges."

"The first challenge is resource usage constraints. GreptimeDB runs in the vehicle's cockpit domain controller and must minimize CPU and memory usage to avoid interfering with infotainment systems."

"The second concern is robustness; GreptimeDB collects critical diagnostic metrics from the CAN bus, so any crashes could result in data loss."

"CAN" here stands for "controller area network" and is a data bus inside vehicles that replaces masses of wires that go directly from components to other components -- it allows any "electronic control unit" (ECU) connected to the bus to communicate with any other.

"Lastly, unlike servers in datacenters, vehicle-based GreptimeDB operates under various conditions -- frequent power cycles, fluctuating ADAS data rates due to changing road traffic, etc. -- and needs to adapt while remaining stable and efficient."

"ADAS" stands for "advanced driver-assistance systems".

How to build a TSDB running inside a car

#solidstatelife #databases #timeseries

waynerad@pluspora.com

"A type of neural network that learns on the job, not just during its training phase" has been developed. "These flexible algorithms, dubbed 'liquid' networks, change their underlying equations to continuously adapt to new data inputs."

"Ramin Hasani, the study's lead author, points to video processing, financial data, and medical diagnostic applications as examples of time series that are central to society."

"Hasani drew inspiration directly from the microscopic nematode, C. elegans. 'It only has 302 neurons in its nervous system,' he says, 'yet it can generate unexpectedly complex dynamics.' Hasani coded his neural network with careful attention to how C. elegans neurons activate and communicate with each other via electrical impulses. In the equations he used to structure his neural network, he allowed the parameters to change over time based on the results of a nested set of differential equations."

"This flexibility is key. Most neural networks' behavior is fixed after the training phase, which means they're bad at adjusting to changes in the incoming data stream. Hasani says the fluidity of his 'liquid' network makes it more resilient to unexpected or noisy data, like if heavy rain obscures the view of a camera on a self-driving car."

"There's another advantage of the network's flexibility, he adds: 'It's more interpretable.'"

"Thanks to Hasani's small number of highly expressive neurons, it's easier to peer into the 'black box' of the network's decision making and diagnose why the network made a certain characterization."

"Hasani's network excelled in a battery of tests. It edged out other state-of-the-art time series algorithms by a few percentage points in accurately predicting future values in datasets, ranging from atmospheric chemistry to traffic patterns. 'In many applications, we see the performance is reliably high,' he says. Plus, the network's small size meant it completed the tests without a steep computing cost."

Unfortunately, I'm not going to be able to explain much of how this system works, since I don't understand it well enough myself. In simple terms, it combines two ideas: an ordinary differential equation solver and a recurrent neural network. A recurrent neural network is a neural network that takes part of its output and feeds it back in as input for the next time step -- a simple way to get a neural network to "remember" things it has seen in the past. During training, the backpropagation step adjusts the weights on the neural connections by going backwards through the network; for a recurrent neural network, this means going backwards through that output-to-input loop as well, one time step at a time, in what is known as "backpropagation through time".
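
To make that feedback loop concrete, here's a toy vanilla RNN in numpy with manual backpropagation through time. This is a generic textbook sketch -- the sizes, names, and next-value task are all made up for illustration, not taken from the paper:

```python
import numpy as np

# Toy vanilla RNN with backpropagation through time (BPTT).
rng = np.random.default_rng(0)
H = 8                              # hidden units
Wx = rng.normal(0, 0.1, (H, 1))    # input -> hidden
Wh = rng.normal(0, 0.1, (H, H))    # hidden -> hidden (the feedback loop)
Wy = rng.normal(0, 0.1, (1, H))    # hidden -> output

def forward(xs):
    """Run the RNN over a sequence, keeping every state for BPTT."""
    h = np.zeros((H, 1))
    hs, ys = [h], []
    for x in xs:
        h = np.tanh(Wx * x + Wh @ h)   # last step's state fed back in
        hs.append(h)
        ys.append((Wy @ h).item())
    return hs, ys

def bptt(xs, targets, hs, ys):
    """Backpropagate squared error through every time step."""
    dWx, dWh, dWy = np.zeros_like(Wx), np.zeros_like(Wh), np.zeros_like(Wy)
    dh_next = np.zeros((H, 1))
    for t in reversed(range(len(xs))):
        dy = ys[t] - targets[t]           # d(loss)/d(output) at step t
        dWy += dy * hs[t + 1].T
        dh = Wy.T * dy + dh_next          # from the output and from later steps
        draw = (1 - hs[t + 1] ** 2) * dh  # back through tanh
        dWx += draw * xs[t]
        dWh += draw @ hs[t].T
        dh_next = Wh.T @ draw             # carry the gradient one step back
    return dWx, dWh, dWy

xs = np.sin(np.linspace(0, 3, 20))             # toy input sequence
targets = np.sin(np.linspace(0.15, 3.15, 20))  # next-value targets
hs, ys = forward(xs)
grads = bptt(xs, targets, hs, ys)              # gradients for a descent update
```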

The ordinary differential equation part, on the other hand, represents the system as a set of outputs that change with time. The relationship between those outputs and their time derivative is defined by an equation whose parameters are themselves computed by a trained neural network. The modification made here, which seems to make all the difference, is an additional term with a time constant that controls how quickly the ordinary differential equation solver moves the whole system towards an equilibrium state.
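
As best I can tell from the liquid time-constant paper, that equation has the form dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, where f is the trained network, tau is the time constant, and A is a bias vector -- so the speed at which the state relaxes toward A varies with the input. Here's a hypothetical explicit-Euler sketch of one update (I believe the paper actually uses a fused semi-implicit solver; all names and shapes here are my own illustration):

```python
import numpy as np

# Hypothetical explicit-Euler step for the liquid time-constant update:
#   dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
# where f is a small trained network, tau is the time constant, and A is
# a bias vector toward which the state relaxes.

def ltc_step(x, I, W, b, tau, A, dt=0.01):
    f = np.tanh(W @ np.concatenate([x, I]) + b)  # trained nonlinearity
    dxdt = -(1.0 / tau + f) * x + f * A          # input-dependent decay rate
    return x + dt * dxdt

rng = np.random.default_rng(1)
n, m = 4, 2                                      # state and input sizes
W, b = rng.normal(0, 0.5, (n, n + m)), np.zeros(n)
x, A, tau = np.zeros(n), np.ones(n), 1.0
for t in range(100):
    I = np.array([np.sin(0.1 * t), 1.0])         # toy input signal
    x = ltc_step(x, I, W, b, tau, A)
```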

How exactly this leads to the "liquidity" that lets the system "learn on the job", I haven't figured out. As for the greater "interpretability" -- well, it sounds like that is simply a function of having fewer neurons.

"Liquid" machine-learning system adapts to changing conditions

#solidstatelife #ai #liquidnetworks #recurrentnetworks #timeseries #differentialequations