Securing remote data into managed storage
DataBolt is an automated research data teleport system created by Intersect to implement robust, fault-tolerant data acquisition for research projects. The design pattern and the associated configurable software are adaptable across many different platforms. DataBolt Collection components are containerised to run within an individual organisation's central, distributed and edge infrastructure, enabling data collection at IoT scale.
Why use DataBolt?
Researchers increasingly use on- and off-campus instruments to capture large quantities of research data, yet moving that data from instrument to analysis environment is typically manual, ad hoc, error-prone and time-consuming. DataBolt addresses this problem with modular data movement that significantly reduces administrative overhead.
Automated, fault-tolerant, configurable and scalable, DataBolt reduces manual steps, maximises throughput and autonomously maintains file and collection integrity at every curation stage. DataBolt Collection Agents (CAs) can be co-located close to data sources to further mitigate acquisition risks.
What does DataBolt deliver?
- Automated data capture.
- Seamless, secure and reliable data transfer.
- Access to scalable, active and archival data storage.
- Cross-platform Collection and Delivery Agents that can run in any container-compatible environment.
- Chunked movement of large data sets to efficiently utilise network bandwidth.
- Data validation at the destination, and creation of manifests.
- Support for multiple use cases via configuration.
- Source cleansing to manage remote data sizes.
- Data integrity safeguards, such as two-phase commit.
- Metadata captured at the origin stays with the data.
- A researcher-friendly dashboard for viewing live data transfers.
- Role-based access for monitoring and managing data transfers.
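The chunked-transfer and manifest-validation features above can be sketched briefly. DataBolt's actual manifest format and internal APIs are not documented here, so this is a minimal illustration of the underlying idea, with hypothetical function names: hash each source file in fixed-size chunks (so large files never load fully into memory), record the checksums in a manifest, and re-verify every file once it lands at the destination.

```python
import hashlib
from pathlib import Path

CHUNK_SIZE = 4 * 1024 * 1024  # hash 4 MiB at a time; large files stay out of memory


def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 one chunk at a time."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest.update(chunk)
    return digest.hexdigest()


def build_manifest(source_dir: Path) -> dict:
    """Record a checksum for every file under source_dir, keyed by relative path."""
    return {
        str(p.relative_to(source_dir)): sha256_of(p)
        for p in sorted(source_dir.rglob("*"))
        if p.is_file()
    }


def validate(dest_dir: Path, manifest: dict) -> list[str]:
    """Return the relative paths that are missing or corrupted at the destination."""
    failures = []
    for rel_path, expected in manifest.items():
        target = dest_dir / rel_path
        if not target.is_file() or sha256_of(target) != expected:
            failures.append(rel_path)
    return failures
```

An empty list from `validate` means every file arrived intact; anything else identifies exactly which files need to be re-sent.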
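The two-phase commit safeguard can likewise be sketched. Assuming a stage-then-promote design (an assumption for illustration, not DataBolt's documented mechanism): phase one writes the incoming file to a staging path; phase two verifies its checksum and atomically renames it into place, so the destination never exposes a partial or corrupt file.

```python
import hashlib
import os
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(1024 * 1024):
            digest.update(chunk)
    return digest.hexdigest()


def commit_transfer(staged: Path, final: Path, expected_sha256: str) -> None:
    """Phase 1 has already written the file to the staging path `staged`.
    Phase 2: promote it only if its checksum matches, otherwise roll back."""
    if sha256_of(staged) != expected_sha256:
        staged.unlink()  # roll back: discard the partial/corrupt copy
        raise IOError(f"checksum mismatch for {staged}; transfer rolled back")
    final.parent.mkdir(parents=True, exist_ok=True)
    os.replace(staged, final)  # atomic on POSIX: readers never see a half-written file
```

Because `os.replace` is atomic within a filesystem, downstream consumers only ever observe the file as absent or complete, which is the property the two-phase pattern is buying here.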