
2 posts tagged with "speed at scale"


Neel Phadnis · 20 min read

(Source: Photo by Pietro Jeng on [Unsplash](https://unsplash.com/))

This post focuses on using Collection Data Types (CDTs) for data modeling in Aerospike when dealing with a large number of objects. It is Part 2 of a two-part series on data modeling; you can find the first post here.

Context

Data Modeling is the exercise of mapping application objects onto the model and mechanisms provided by the database for persistence, performance, consistency, and ease of access.

Aerospike Database is purpose-built for applications that require predictable sub-millisecond access to billions to trillions of objects and need to store terabytes to petabytes of data, while keeping the cluster size - and therefore the operational costs - small. The combined goals of large data size and small cluster size mean that each node must provide a large amount of high-speed storage.
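To make the CDT idea concrete, here is a minimal sketch, not taken from the post itself, of keeping one device's events in a single Map CDT bin using the Aerospike Python client. The namespace, set, bin, and key names are illustrative assumptions.

```python
import aerospike
from aerospike_helpers.operations import map_operations as map_ops

config = {"hosts": [("127.0.0.1", 3000)]}  # assumes a local single-node cluster
client = aerospike.client(config).connect()

# One record per device; the Map CDT bin "history" holds events keyed by timestamp.
key = ("test", "events", "device-42")

# Add (or overwrite) one event in the map without rewriting the whole record.
client.operate(
    key,
    [map_ops.map_put("history", 1700000000, {"temp": 21.4, "status": "ok"})],
)

# Read back a single event by its timestamp key on the server side, instead of
# pulling the entire collection to the client.
_, _, bins = client.operate(
    key,
    [map_ops.map_get_by_key("history", 1700000000, aerospike.MAP_RETURN_VALUE)],
)
print(bins["history"])  # {'temp': 21.4, 'status': 'ok'}

client.close()
```

Because both the write and the lookup operate on individual map entries inside one record, large collections can be updated and queried without shipping the whole object over the network, which is the essence of the CDT-based modeling the post discusses.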

Neel Phadnis · 13 min read

(Source: Photo by NASA on [Unsplash](https://unsplash.com/))

Introduction

Data Modeling is the exercise of mapping application objects onto the model and mechanisms provided by the database for persistence, performance, consistency, and ease of access.

Aerospike Database is purpose-built for applications that require predictable sub-millisecond access to billions to trillions of objects and need to store terabytes to petabytes of data, while keeping the cluster size - and therefore the operational costs - small. The combined goals of large data size and small cluster size mean that each node must provide a large amount of high-speed storage.