Inside Aggregator: The Customizable Oracle Smart Contracts
Deep dive into DIA Lumina’s Aggregator smart contracts: how they enable transparent, customizable on-chain oracle computation for reliable DeFi price feeds.
This post is 2/4 in a series diving into the components of Lumina, DIA’s new rollup-based, trustless oracle stack. See all posts here.
TL;DR
- Aggregators are smart contracts on Lumina that collect and process data, enabling on-chain oracle computation with full transparency and permissionless deployment
- They operate within a three-tier system of Feeders, Pods, and Aggregators, applying configurable parameters along the entire data flow for reliable data processing
- Built with comprehensive security measures and flexible configuration options for sophisticated DeFi applications
- Designed for expansion, Aggregators will keep evolving to support increasingly complex methodologies that power the data needs of sophisticated applications
Introduction
What are Aggregators?
Aggregators are smart contracts that collect and process raw data stored on the Lasernet to produce reliable, consensus-driven data feeds. Developers can deploy these contracts permissionlessly on the Lasernet chain, customizing everything from data sources to calculation methods.
Why Choose On-Chain Oracle Computation?
DIA Lumina brings oracle computation on-chain at scale through Aggregator smart contracts deployed on an efficient Layer-2 blockchain, delivering key benefits:
- Full Data Transparency: Every step of data processing becomes fully transparent and verifiable through on-chain smart contracts
- Permissionless Flexibility: Developers can freely deploy and customize multiple Aggregator contracts simultaneously
- Future-Proof Design: Aggregators will evolve to support sophisticated methodologies that serve complex DeFi applications and demanding oracle use cases
System Architecture
The Three-Tier Architecture
Understanding Aggregators requires a grasp of the broader Lumina ecosystem:
- Feeders: Independent nodes that fetch raw data from sources, filter it, and push it on-chain to the Lasernet for storage in Pods.
- Pods: Key-value smart contracts on Lasernet that store raw data from their associated Feeders
- Aggregators: Advanced smart contracts that process Pod data to generate verified, consensus-driven data feeds
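The three tiers above can be pictured as a simple key-value store that Feeders write to and Aggregators read from. The following Python sketch is purely illustrative (the `submit` and `latest` names are assumptions, not the actual Lasernet contract interface):

```python
import time

class Pod:
    """Illustrative Pod: a key-value store holding raw submissions from its Feeders."""
    def __init__(self):
        self._values = {}  # (feeder, asset_key) -> (price, timestamp)

    def submit(self, feeder: str, asset_key: str, price: float) -> None:
        # Each Feeder overwrites its own slot with its newest value.
        self._values[(feeder, asset_key)] = (price, time.time())

    def latest(self, asset_key: str) -> dict:
        # An Aggregator reads every Feeder's most recent value for one asset.
        return {f: v for (f, k), v in self._values.items() if k == asset_key}
```

An Aggregator would then pull `latest("ETH/USD")` and run its consensus and calculation logic over the returned values.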
Data Flow Process
- Collection: Feeders collect raw data directly from sources. For example, they fetch trade data of a digital asset from centralized or decentralized exchanges.
- Storage: Feeders submit the latest prices to their respective Pod at configurable intervals. If more than one trade occurs in that interval, Feeders condense the data into a single value using a configurable methodology, e.g. the median price of the asset.
- Consensus: Aggregators combine the submitted values and apply consensus rules before proceeding. For example, at least 5 Feeders must submit agreeing values within 1 minute for a price to be validated.
- Calculation: Aggregators process the validated values with a configurable methodology to produce a final output ready for consumption, for example a moving average price of an asset.
Customization Options
Aggregators provide extensive customization across the entire data stack:
- Feeder Selection: Choose data providers based on reputation metrics, data availability, $DIA stake amount, or performance history
- Consensus Rules: Define validation parameters such as minimum required data points and timeframes
- Calculation Methods: Select from various methodologies (median, Volume-Weighted Average Price, etc.) for final value computation
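These customization axes map naturally onto a parameter set that a deployer would fix at deployment time. A hypothetical sketch (field names and defaults are assumptions for illustration):

```python
from dataclasses import dataclass, field

@dataclass
class AggregatorConfig:
    """Hypothetical parameter set mirroring the customization axes above."""
    feeders: list = field(default_factory=list)  # allow-listed Feeder addresses
    min_feeders: int = 5           # consensus: minimum required data points
    max_age_s: int = 60            # consensus: freshness window in seconds
    methodology: str = "median"    # calculation: "median", "vwap", ...
```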
Security Framework
The Aggregators implement several safeguards:
- Minimum feeder thresholds: Help ensure a stable, well-distributed set of data submitters, reducing the risk of collusion
- Data freshness checks: Only recently updated prices are accepted as oracle values
- Configurable timeout durations: Allows operators to set appropriate expiration windows for data feeds based on market volatility and asset requirements
- Admin controls for feeder management: Enables authorized parties to add or remove data feeders, ensuring the network maintains high-quality and trusted data sources
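From the consumer side, the timeout safeguard boils down to rejecting any feed value older than its configured expiration window. A minimal sketch, assuming a per-feed `timeout_s` parameter:

```python
def feed_is_usable(last_update_ts: float, timeout_s: float, now: float) -> bool:
    """Illustrative timeout check: reject a feed whose last update is older
    than the configured expiration window. More volatile assets would be
    configured with a shorter timeout_s."""
    return (now - last_update_ts) <= timeout_s
```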
Admin Framework
Note: While the current implementation includes administrative controls, future versions will transition to immutable contracts without admin capabilities. Updates will require deploying new contract versions, aligning with complete decentralization.
Each Aggregator has an admin address with the authority to:
- Add or remove Feeders
- Set consensus thresholds
- Configure timeout parameters
- Manage feeder requirements
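The admin gating described above is a standard access-control pattern: every management call checks the caller against the stored admin address. The sketch below illustrates it in Python (class and method names are assumptions, not the actual contract ABI):

```python
class AggregatorAdmin:
    """Illustrative admin-gated feeder management."""
    def __init__(self, admin: str):
        self.admin = admin
        self.feeders = set()

    def _only_admin(self, caller: str) -> None:
        # Mirrors an on-chain onlyAdmin-style modifier: revert if unauthorized.
        if caller != self.admin:
            raise PermissionError("caller is not the admin")

    def add_feeder(self, caller: str, feeder: str) -> None:
        self._only_admin(caller)
        self.feeders.add(feeder)

    def remove_feeder(self, caller: str, feeder: str) -> None:
        self._only_admin(caller)
        self.feeders.discard(feeder)
```

Once contracts become immutable, this mutable admin path disappears and changes instead require deploying a new Aggregator version, as noted above.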
Future Development
Methodology Evolution:
- Support for multiple calculation methodologies
- Customizable approaches per price feed
- Enhanced aggregation algorithms
- Advanced filtering mechanisms
Conclusion
Aggregators form the cornerstone of Lumina’s oracle stack, ensuring reliable price feeds through transparent, consensus-driven aggregation. Their modular design and comprehensive security measures provide the foundation for sophisticated DeFi applications requiring accurate, tamper-resistant data.