- JavaScript Cloud Native Development Cookbook
- John Gilbert
How to do it...
- Create the project from the following template:
$ sls create --template-url https://github.com/danteinc/js-cloud-native-cookbook/tree/master/ch2/data-lake-s3 --path cncb-data-lake-s3
- Navigate to the cncb-data-lake-s3 directory with cd cncb-data-lake-s3.
- Review the file named serverless.yml with the following content (the ${cf:...} cross-stack lookups are illustrated in a sketch after the listing):
service: cncb-data-lake-s3

provider:
  name: aws
  runtime: nodejs8.10

functions:
  transformer:
    handler: handler.transform
    timeout: 120

resources:
  Resources:
    Bucket:
      Type: AWS::S3::Bucket
      DeletionPolicy: Retain
    DeliveryStream:
      Type: AWS::KinesisFirehose::DeliveryStream
      Properties:
        DeliveryStreamType: KinesisStreamAsSource
        KinesisStreamSourceConfiguration:
          KinesisStreamARN: ${cf:cncb-event-stream-${opt:stage}.streamArn}
          ...
        ExtendedS3DestinationConfiguration:
          BucketARN:
            Fn::GetAtt: [ Bucket, Arn ]
          Prefix: ${cf:cncb-event-stream-${opt:stage}.streamName}/
          ...
  Outputs:
    DataLakeBucketName:
      Value:
        Ref: Bucket
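The ${cf:...} variables are Serverless Framework cross-stack references: at deploy time the framework reads the streamArn and streamName outputs of the separately deployed cncb-event-stream stack. The following is a rough sketch of the equivalent lookup with the AWS SDK, purely for orientation; the aws-sdk v2 client, the us-east-1 region, and the john stage suffix are assumptions, and the framework performs this resolution for you.

// Sketch only: resolving ${cf:cncb-event-stream-<stage>.streamArn} by hand.
// The stack name suffix 'john' and the region are example assumptions.
const aws = require('aws-sdk');

const cloudformation = new aws.CloudFormation({ region: 'us-east-1' });

cloudformation.describeStacks({ StackName: 'cncb-event-stream-john' })
  .promise()
  .then((data) => {
    // ${cf:stack.key} resolves to the OutputValue whose OutputKey matches.
    const outputs = data.Stacks[0].Outputs;
    const streamArn = outputs.find(o => o.OutputKey === 'streamArn').OutputValue;
    console.log(streamArn);
  })
  .catch(err => console.error(err));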
- Review the file named handler.js with the following content; a local invocation sketch follows the listing:
exports.transform = (event, context, callback) => {
  const output = event.records.map((record, i) => {
    // store all available data
    const uow = {
      event: JSON.parse((Buffer.from(record.data, 'base64')).toString('utf8')),
      kinesisRecordMetadata: record.kinesisRecordMetadata,
      firehoseRecordMetadata: {
        deliveryStreamArn: event.deliveryStreamArn,
        region: event.region,
        invocationId: event.invocationId,
        recordId: record.recordId,
        approximateArrivalTimestamp: record.approximateArrivalTimestamp,
      }
    };

    // hand the record back to Firehose marked Ok, re-encoded as base64 with a
    // trailing newline so each event lands on its own line in the S3 object
    return {
      recordId: record.recordId,
      result: 'Ok',
      data: Buffer.from(JSON.stringify(uow) + '\n', 'utf-8').toString('base64'),
    };
  });

  callback(null, { records: output });
};
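To see the transformation in isolation, here is a quick local smoke test of the handler with a hand-built Firehose event; every value in the sample event is made up.

// Quick local smoke test of the transform function; all sample values are made up.
const { transform } = require('./handler');

const sampleEvent = {
  deliveryStreamArn: 'arn:aws:firehose:us-east-1:123456789012:deliverystream/example',
  region: 'us-east-1',
  invocationId: 'example-invocation-id',
  records: [{
    recordId: 'record-1',
    approximateArrivalTimestamp: 1530000000000,
    data: Buffer.from(JSON.stringify({ type: 'thing-created' })).toString('base64'),
    kinesisRecordMetadata: {
      shardId: 'shardId-000000000000',
      partitionKey: 'example',
      sequenceNumber: '12345678901234567890',
    },
  }],
};

transform(sampleEvent, {}, (err, result) => {
  if (err) throw err;
  // Each output record echoes the recordId, marks the result Ok, and carries the
  // enriched unit of work re-encoded as base64 with a trailing newline.
  console.log(Buffer.from(result.records[0].data, 'base64').toString('utf8'));
});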
- Install the dependencies with npm install.
- Run the tests with npm test -- -s $MY_STAGE.
- Review the contents generated in the .serverless directory.
- Deploy the stack:
$ npm run dp:lcl -- -s $MY_STAGE
> cncb-data-lake-s3@1.0.0 dp:lcl <path-to-your-workspace>/cncb-data-lake-s3
> sls deploy -v -r us-east-1 "-s" "john"
Serverless: Packaging service...
...
Serverless: Stack update finished...
...
Stack Outputs
DataLakeBucketName: cncb-data-lake-s3-john-bucket-1851i1c16lnha
...
- Review the stack, data lake bucket, and Firehose delivery stream in the AWS Console.
- Publish an event from a separate Terminal with the following commands (a direct-publish sketch follows the output):
$ cd <path-to-your-workspace>/cncb-event-stream
$ sls invoke -r us-east-1 -f publish -s $MY_STAGE -d '{"type":"thing-created"}'
{
  "ShardId": "shardId-000000000000",
  "SequenceNumber": "49582906351415672136958521360120605392824155736450793474"
}
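The publish function invoked above lives in the cncb-event-stream project. If you prefer to push a test event onto the stream directly, here is a minimal sketch with the AWS SDK; the stream name placeholder (take it from the cncb-event-stream stack's streamName output), the partition key value, and the aws-sdk v2 client are assumptions.

// Minimal sketch: publish a test event straight to the Kinesis stream.
const aws = require('aws-sdk');

const kinesis = new aws.Kinesis({ region: 'us-east-1' });

const event = { type: 'thing-created' };

kinesis.putRecord({
  StreamName: 'REPLACE_WITH_streamName_OUTPUT', // from the cncb-event-stream stack
  PartitionKey: 'thing-1', // any value; it only controls shard assignment
  Data: Buffer.from(JSON.stringify(event)),
})
  .promise()
  .then(data => console.log('%j', data)) // returns ShardId and SequenceNumber, as above
  .catch(err => console.error(err));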
- Allow the Firehose buffer time to process and then review the data lake contents created in the S3 bucket.
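If you would rather inspect the data lake from the command line than the console, the following sketch lists and prints the delivered objects. The bucket and prefix placeholders are assumptions: use your own DataLakeBucketName stack output and the streamName value that the Firehose Prefix setting references, and note it assumes the delivery stream writes uncompressed objects.

// Sketch: dump the objects Firehose has delivered to the data lake bucket.
const aws = require('aws-sdk');

const s3 = new aws.S3({ region: 'us-east-1' });

const Bucket = 'REPLACE_WITH_DataLakeBucketName_OUTPUT';
const Prefix = 'REPLACE_WITH_streamName_OUTPUT/'; // matches the Firehose Prefix setting

s3.listObjectsV2({ Bucket, Prefix })
  .promise()
  .then(data => Promise.all(data.Contents.map(obj =>
    s3.getObject({ Bucket, Key: obj.Key }).promise()
      .then(o => console.log('%s\n%s', obj.Key, o.Body.toString('utf8'))))))
  .catch(err => console.error(err));

Each delivered object should contain one JSON-encoded unit of work per line, as produced by the transform function.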
- Remove the stack once you have finished with npm run rm:lcl -- -s $MY_STAGE.
Wait to remove the data lake stack until after you have worked through all the other recipes; that way you can watch the data lake accumulate all the other events.