AWS Database Blog

New – Amazon DynamoDB lowers pricing for on-demand throughput and global tables

Over 1 million customers choose Amazon DynamoDB as their go-to NoSQL database for building high-performance, low-latency applications at any scale. The DynamoDB serverless architecture eliminates the overhead of operating and scaling databases, reducing costs and simplifying management, allowing you to focus on innovation, not infrastructure. DynamoDB provides seamless scalability as workloads grow from hundreds of […]

Pre-warming Amazon DynamoDB tables with warm throughput

We’re introducing warm throughput, a new capability that provides insight into the throughput your DynamoDB tables and indexes can instantly support and allows you to pre-warm them for optimized performance. In this post, we’ll introduce warm throughput, explain how it works, and explore the benefits it offers for handling high-traffic scenarios. We’ll also cover best practices and practical use cases to help you make the most of this feature for your DynamoDB tables and indexes.
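If you manage warm throughput programmatically, pre-warming is a single table update. The following is a minimal sketch, assuming an SDK version recent enough to expose the WarmThroughput parameter on the boto3 DynamoDB client; the table name and unit values are placeholders:

```python
import boto3

# Minimal sketch: raise a table's warm throughput ahead of an anticipated
# traffic spike. Table name and unit values are placeholders; this assumes
# a boto3 version that exposes the WarmThroughput parameter.
dynamodb = boto3.client("dynamodb")

dynamodb.update_table(
    TableName="orders",
    WarmThroughput={
        "ReadUnitsPerSecond": 50000,
        "WriteUnitsPerSecond": 15000,
    },
)

# DescribeTable reports the warm throughput the table can currently absorb.
table = dynamodb.describe_table(TableName="orders")["Table"]
print(table.get("WarmThroughput"))
```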

Automate the deployment of Amazon RDS for Db2 instances with Terraform

Infrastructure as Code (IaC) is the practice of provisioning and managing your computing infrastructure using code, rather than manual processes and settings. Popular IaC tools, services, and platforms include Terraform, AWS CloudFormation, Ansible, and Pulumi, each offering unique features to automate and manage infrastructure across various cloud environments. In this post, we demonstrate how Terraform, one of our partner products, can be used to deploy and manage an Amazon RDS for Db2 instance.
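The post itself uses Terraform; purely as a point of reference, the sketch below shows the underlying CreateDBInstance API call (via boto3) that such an IaC configuration ultimately drives. The engine identifier, license model, and sizing values are illustrative assumptions, not values from the post:

```python
import boto3

# Not the Terraform configuration from the post; a sketch of the underlying
# CreateDBInstance call it drives. Engine identifier, license model, and
# sizing values are illustrative assumptions.
rds = boto3.client("rds")

rds.create_db_instance(
    DBInstanceIdentifier="db2-demo",
    Engine="db2-ae",                    # assumed identifier for Db2 Advanced Edition
    DBInstanceClass="db.r6i.large",
    AllocatedStorage=100,
    MasterUsername="db2admin",
    ManageMasterUserPassword=True,      # have RDS keep the password in Secrets Manager
    LicenseModel="bring-your-own-license",
)
```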

Understanding time-series data and why it matters

Time-series data is one of the most valuable types of data used today by organizations across industries. Time-series data allows for a more in-depth understanding of changes, patterns, and trends over time. This enables organizations to gain insights into past behaviors and current states, as well as predict future values. The sequential tracking of data at precise time intervals enables both retrospective and prospective analysis that is extremely valuable for strategy, planning, and decision-making across industries. In this post, we discuss the nature of time-series data, its presence across different types of industries, and the various use cases it enables.

Build scalable, event-driven architectures with Amazon DynamoDB and AWS Lambda

By combining DynamoDB Streams with Lambda, you can build responsive, scalable, and cost-effective systems that automatically react to data changes in real time. In this post, we explore best practices for architecting event-driven systems using DynamoDB and Lambda. DynamoDB provides two options for change data capture (CDC): DynamoDB Streams and Amazon Kinesis Data Streams (KDS). Here, we focus exclusively on DynamoDB Streams.
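As a rough illustration of the pattern (not code from the post), a stream-triggered Lambda function receives batches of change records and branches on the event type. The handler below is a minimal Python sketch following the documented stream record shape; the downstream action is a placeholder:

```python
# Minimal sketch of a Lambda handler wired to a DynamoDB stream
# (NEW_AND_OLD_IMAGES view type assumed). The print calls stand in for
# whatever downstream action your system takes.
def handler(event, context):
    for record in event["Records"]:
        event_name = record["eventName"]          # INSERT, MODIFY, or REMOVE
        keys = record["dynamodb"]["Keys"]

        if event_name in ("INSERT", "MODIFY"):
            new_image = record["dynamodb"].get("NewImage", {})
            # React to the change, e.g. update an aggregate or notify a consumer.
            print(f"{event_name} on {keys}: {new_image}")
        elif event_name == "REMOVE":
            old_image = record["dynamodb"].get("OldImage", {})
            print(f"Item removed: {keys} (last state: {old_image})")

    # Returning normally marks the batch as processed; with partial batch
    # responses enabled, failed record IDs would be listed here instead.
    return {"batchItemFailures": []}
```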

Use Amazon ElastiCache as a cache for Amazon Keyspaces (for Apache Cassandra)

In this post, we show you how to use Amazon ElastiCache as a write-through cache for an application that uses an Amazon Keyspaces (for Apache Cassandra) table to store data about book awards. We use a Cassandra Python client driver to access Amazon Keyspaces programmatically and a Redis client to connect to the ElastiCache cluster.
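The write-through pattern itself is small: persist to Keyspaces first, then update the cache so reads stay fresh. The following is an illustrative Python sketch that omits the TLS and SigV4 authentication details covered in the post; the keyspace, table, endpoint, and key names are placeholders:

```python
# Illustrative write-through cache, not the full setup from the post:
# Keyspaces requires TLS and SigV4 auth, and ElastiCache typically runs
# with in-transit encryption; both are omitted here for brevity.
import json
import redis
from cassandra.cluster import Cluster

session = Cluster(["cassandra.us-east-1.amazonaws.com"], port=9142).connect()
cache = redis.Redis(host="my-elasticache-endpoint", port=6379)

def put_award(award_id, award):
    # Write-through: persist to Keyspaces first, then update the cache.
    session.execute(
        "INSERT INTO catalog.book_awards (award_id, details) VALUES (%s, %s)",
        (award_id, json.dumps(award)),
    )
    cache.setex(f"award:{award_id}", 3600, json.dumps(award))

def get_award(award_id):
    # Serve from the cache when possible, fall back to Keyspaces on a miss.
    cached = cache.get(f"award:{award_id}")
    if cached:
        return json.loads(cached)
    row = session.execute(
        "SELECT details FROM catalog.book_awards WHERE award_id = %s", (award_id,)
    ).one()
    return json.loads(row.details) if row else None
```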

Building a GDPR compliance solution with Amazon DynamoDB

In this post, the AWS Service Sector Industry Solutions team shares our journey in developing a feature that enables customers to efficiently locate and delete personal data upon request, helping them meet GDPR compliance requirements. The mission of the Service Sector Solutions Engineering Team is to accelerate AWS Cloud adoption across diverse industries, including Travel, Hospitality, Gaming, and Entertainment. We work with customers from Cruise Lines, Lodging, Alternative Accommodation, Travel Agencies, Airports, Airlines, Restaurants, Catering, Casinos, Lotteries, and more.

Heterogeneous data sources: Access your data in PostgreSQL from Amazon RDS for Oracle using Oracle Database Gateway

In certain customer scenarios, Amazon RDS for Oracle databases need to connect to external data sources, such as RDS for PostgreSQL. While PostgreSQL can establish connections to Oracle databases using a foreign data wrapper (FDW), the reverse direction requires Oracle Database Gateway. In this post, we walk you through setting up an EC2 instance as a database gateway server. You will install and configure Oracle Database Gateway for ODBC (DG4ODBC), ODBC drivers, a PostgreSQL client, and PostgreSQL libraries. With this setup, you can create database links on RDS for Oracle to connect to PostgreSQL through this gateway.
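Once the gateway is running, the RDS for Oracle side comes down to a database link whose connect descriptor points at the gateway host with HS=OK. The snippet below is an illustrative sketch using the python-oracledb driver; the host names, gateway SID, credentials, and remote table are placeholders, not values from the post:

```python
import oracledb

# Illustrative only: create a database link on RDS for Oracle that points at
# the EC2 gateway, then query a PostgreSQL table through it. Host names, the
# gateway SID, credentials, and the table name are placeholders.
conn = oracledb.connect(user="admin", password="example", dsn="my-rds-oracle:1521/ORCL")
cur = conn.cursor()

# HS=OK marks the link as going through Heterogeneous Services (the gateway).
cur.execute(
    "CREATE DATABASE LINK pg_link "
    "CONNECT TO \"pg_user\" IDENTIFIED BY \"pg_password\" "
    "USING '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=gateway-ec2-host)(PORT=1521))"
    "(CONNECT_DATA=(SID=DG4PG))(HS=OK))'"
)

# Remote PostgreSQL objects are referenced with the @link suffix.
for row in cur.execute('SELECT * FROM "books"@pg_link'):
    print(row)
```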

Capture and diagnose I/O bottlenecks on Amazon RDS for SQL Server

In our previous post, Capture and tune resource utilization metrics for Amazon RDS for SQL Server, we demonstrated how to use Amazon RDS Enhanced Monitoring and Amazon RDS Performance Insights to diagnose and debug CPU utilization bottlenecks for Amazon Relational Database Service (Amazon RDS) for SQL Server. Aside from CPU and memory, I/O performance is critical for overall database performance. It’s important to understand the I/O requirements of a SQL Server workload, which depend on various factors like query access patterns, database schema, and the state of database maintenance. Understanding your workload’s I/O patterns can guide you in selecting the optimal storage type for your RDS instance, balancing performance needs with cost-effectiveness. In this post, we demonstrate how you can use Amazon RDS monitoring tools along with SQL Server monitoring capabilities to capture, diagnose, and resolve I/O issues on an RDS for SQL Server instance.
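On the SQL Server side, per-file latency from the sys.dm_io_virtual_file_stats dynamic management view is a common starting point. The following is an illustrative sketch (not code from the post) that pulls those numbers over ODBC; the connection details are placeholders:

```python
# Rough sketch: pull per-file I/O latency from SQL Server's
# sys.dm_io_virtual_file_stats DMV over ODBC. Connection string values
# are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=my-rds-sqlserver.example.us-east-1.rds.amazonaws.com,1433;"
    "UID=admin;PWD=example;TrustServerCertificate=yes"
)

query = """
SELECT DB_NAME(vfs.database_id) AS database_name,
       mf.physical_name,
       vfs.num_of_reads, vfs.num_of_writes,
       CASE WHEN vfs.num_of_reads  = 0 THEN 0
            ELSE vfs.io_stall_read_ms  / vfs.num_of_reads  END AS avg_read_latency_ms,
       CASE WHEN vfs.num_of_writes = 0 THEN 0
            ELSE vfs.io_stall_write_ms / vfs.num_of_writes END AS avg_write_latency_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
JOIN sys.master_files AS mf
  ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
ORDER BY avg_read_latency_ms DESC;
"""

# Print the files with the highest average read latency first.
for row in conn.cursor().execute(query):
    print(row.database_name, row.physical_name,
          row.avg_read_latency_ms, row.avg_write_latency_ms)
```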