Installing the AWS SDK for Node.js
- Start by installing the official AWS SDK for JavaScript (the `aws-sdk` package), which lets you interact with DynamoDB from your Node.js applications.
- Use npm to install the SDK package.
npm install aws-sdk
Configuring AWS Credentials
- Ensure your AWS credentials are properly configured. You can set up credentials with environment variables, a shared credentials file, or by embedding them directly in your code; environment-based configuration is preferred because it keeps secrets out of source control.
- Consider using the AWS Management Console to generate your access and secret keys.
const AWS = require('aws-sdk');
// Configure AWS credentials and region
AWS.config.update({
  region: 'us-west-2',
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,         // read from environment variables
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY  // rather than hard-coding keys
});
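- If you use the shared credentials file mentioned above instead, the SDK picks it up automatically and no key-related code is needed; a minimal sketch of `~/.aws/credentials` with placeholder values:
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY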
Initializing the DynamoDB Service Object
- Create an instance of the DynamoDB service object, which you will use to interact with the database.
- Optionally, configure endpoint and timeout settings to suit your requirements (a sketch follows the snippet below).
const ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});
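- If you do need custom endpoint or timeout settings, for example when targeting DynamoDB Local, a hedged sketch follows; the endpoint URL and timeout values are illustrative assumptions, not requirements.
// Illustrative only: adjust the endpoint and timeouts for your environment
const localDdb = new AWS.DynamoDB({
  apiVersion: '2012-08-10',
  endpoint: 'http://localhost:8000',                    // e.g. DynamoDB Local
  httpOptions: { connectTimeout: 1000, timeout: 5000 }  // milliseconds
});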
Creating a Table in DynamoDB
- Create a new table by specifying its name, attribute definitions, and key schema.
- Ensure that your attribute definitions align with your table's primary key (partition key and sort key if necessary).
const createTableParams = {
  TableName: 'Movies',
  KeySchema: [
    { AttributeName: 'year', KeyType: 'HASH' },  // Partition key
    { AttributeName: 'title', KeyType: 'RANGE' } // Sort key
  ],
  AttributeDefinitions: [
    { AttributeName: 'year', AttributeType: 'N' },
    { AttributeName: 'title', AttributeType: 'S' }
  ],
  ProvisionedThroughput: {
    ReadCapacityUnits: 5,
    WriteCapacityUnits: 5
  }
};
ddb.createTable(createTableParams, (err, data) => {
  if (err) {
    console.error('Unable to create table. Error JSON:', JSON.stringify(err, null, 2));
  } else {
    console.log('Created table. Table description JSON:', JSON.stringify(data, null, 2));
  }
});
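- Table creation is asynchronous, so the table is not immediately ready for writes. One option, sketched below with the SDK's built-in waiter, is to hold further work until the table reports ACTIVE:
// Wait until the 'Movies' table exists and is ACTIVE before writing to it
ddb.waitFor('tableExists', { TableName: 'Movies' }, (err, data) => {
  if (err) {
    console.error('Table did not become available:', JSON.stringify(err, null, 2));
  } else {
    console.log('Table status:', data.Table.TableStatus);
  }
});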
Inserting Data into the Table
- Use the `putItem` method to insert new items into your table.
- Ensure that your data object matches the schema defined during the table creation.
const putItemParams = {
  TableName: 'Movies',
  Item: {
    'year': { N: '2022' },
    'title': { S: 'My Movie' },
    'info': { S: 'This is a sample movie description' }
  }
};

ddb.putItem(putItemParams, (err, data) => {
  if (err) {
    console.error('Unable to add item. Error JSON:', JSON.stringify(err, null, 2));
  } else {
    console.log('Added item:', JSON.stringify(data, null, 2));
  }
});
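- For reference, the SDK's `AWS.DynamoDB.DocumentClient` accepts plain JavaScript values instead of the typed `{ N: ... }` / `{ S: ... }` wrappers; a minimal sketch of the same insert:
// Same item written via the DocumentClient (plain JS types, marshalled for you)
const docClient = new AWS.DynamoDB.DocumentClient();
docClient.put({
  TableName: 'Movies',
  Item: { year: 2022, title: 'My Movie', info: 'This is a sample movie description' }
}, (err) => {
  if (err) {
    console.error('Unable to add item. Error JSON:', JSON.stringify(err, null, 2));
  } else {
    console.log('Added item with DocumentClient');
  }
});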
Querying the Table
- Use the `query` method to fetch items that match specific key conditions. Key condition expressions let DynamoDB read only the relevant partition, which is what makes its reads fast and predictable.
- Keep queries performant and cost-efficient by requesting only the data you need; results are paginated, as handled in the sketch after the snippet below.
const queryParams = {
  TableName: 'Movies',
  KeyConditionExpression: '#yr = :yyyy',
  ExpressionAttributeNames: {
    '#yr': 'year'
  },
  ExpressionAttributeValues: {
    ':yyyy': { N: '2022' }
  }
};

ddb.query(queryParams, (err, data) => {
  if (err) {
    console.error('Unable to query. Error:', JSON.stringify(err, null, 2));
  } else {
    console.log('Query succeeded:', JSON.stringify(data, null, 2));
  }
});
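- A single `query` call returns at most 1 MB of data; when `LastEvaluatedKey` is present in the response, more pages remain. A sketch of following the pages (the helper name `queryAllPages` is ours, not an SDK function):
// Follow LastEvaluatedKey until all pages of the query have been read
function queryAllPages(params, items, done) {
  ddb.query(params, (err, data) => {
    if (err) return done(err);
    items.push(...data.Items);
    if (data.LastEvaluatedKey) {
      // More results remain; resume from where the last page ended
      queryAllPages({ ...params, ExclusiveStartKey: data.LastEvaluatedKey }, items, done);
    } else {
      done(null, items);
    }
  });
}

queryAllPages(queryParams, [], (err, items) => {
  if (err) {
    console.error('Unable to query. Error:', JSON.stringify(err, null, 2));
  } else {
    console.log(`Query returned ${items.length} item(s)`);
  }
});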
Handling DynamoDB Streams
- DynamoDB Streams provides a time-ordered sequence of item-level changes in a table. Process these records to integrate with other AWS services or to build features such as automated downstream updates.
- Use the AWS SDK to create the necessary handlers in your Node.js applications, for example by deploying a Lambda function that consumes the stream.
const lambda = new AWS.Lambda();
const functionParams = {
  Code: {
    // Use your deployment package (e.g. a ZipFile buffer or an S3Bucket/S3Key reference)
  },
  FunctionName: 'StreamProcessingFunction',
  Handler: 'index.handler',
  Role: 'arn:aws:iam::123456789012:role/service-role/MyTestFunction-role-12345678',
  Runtime: 'nodejs20.x', // use a currently supported Node.js runtime
  Description: 'Processes DynamoDB stream records'
};
lambda.createFunction(functionParams, (err, data) => {
  if (err) {
    console.error('Unable to create function. Error:', JSON.stringify(err, null, 2));
  } else {
    console.log('Function created:', JSON.stringify(data, null, 2));
  }
});
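- Creating the function alone does not connect it to the table: the table must have a stream enabled (via `StreamSpecification` when creating or updating it), and the function needs an event source mapping. A sketch of that mapping, with a placeholder stream ARN (in practice, read `LatestStreamArn` from the table description):
// Connect the Lambda function to the table's stream (ARN below is a placeholder)
const mappingParams = {
  EventSourceArn: 'arn:aws:dynamodb:us-west-2:123456789012:table/Movies/stream/2022-01-01T00:00:00.000',
  FunctionName: 'StreamProcessingFunction',
  StartingPosition: 'TRIM_HORIZON', // start from the oldest available record
  BatchSize: 100
};
lambda.createEventSourceMapping(mappingParams, (err, data) => {
  if (err) {
    console.error('Unable to create event source mapping. Error:', JSON.stringify(err, null, 2));
  } else {
    console.log('Event source mapping created:', JSON.stringify(data, null, 2));
  }
});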
Best Practices
- Use environment-specific configuration to manage credentials and endpoint settings securely and consistently across environments.
- Build robust error handling around every database call so failures are detected, logged, and retried where appropriate.
- Optimize performance by querying on indexed attributes and by batching reads and writes where your workload allows (a sketch follows this list).
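- As an example of batching, a hedged sketch using `batchWriteItem` (up to 25 put/delete requests per call; DynamoDB may return unprocessed items, which you should retry):
// Write several items in a single round trip
const batchParams = {
  RequestItems: {
    'Movies': [
      { PutRequest: { Item: { year: { N: '2023' }, title: { S: 'Another Movie' } } } },
      { PutRequest: { Item: { year: { N: '2023' }, title: { S: 'Yet Another Movie' } } } }
    ]
  }
};
ddb.batchWriteItem(batchParams, (err, data) => {
  if (err) {
    console.error('Batch write failed:', JSON.stringify(err, null, 2));
  } else if (data.UnprocessedItems && Object.keys(data.UnprocessedItems).length > 0) {
    // Retry anything DynamoDB could not process in this call
    console.warn('Unprocessed items remain:', JSON.stringify(data.UnprocessedItems, null, 2));
  } else {
    console.log('Batch write succeeded');
  }
});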