
How to do it...

  1. Create the project from the following template:
$ sls create --template-url https://github.com/danteinc/js-cloud-native-cookbook/tree/master/ch2/data-lake-s3 --path cncb-data-lake-s3
  2. Navigate to the cncb-data-lake-s3 directory with cd cncb-data-lake-s3.
  3. Review the file named serverless.yml with the following content:
service: cncb-data-lake-s3

provider:
  name: aws
  runtime: nodejs8.10

functions:
  transformer:
    handler: handler.transform
    timeout: 120

resources:
  Resources:
    Bucket:
      Type: AWS::S3::Bucket
      DeletionPolicy: Retain
    DeliveryStream:
      Type: AWS::KinesisFirehose::DeliveryStream
      Properties:
        DeliveryStreamType: KinesisStreamAsSource
        KinesisStreamSourceConfiguration:
          KinesisStreamARN: ${cf:cncb-event-stream-${opt:stage}.streamArn}
          ...
        ExtendedS3DestinationConfiguration:
          BucketARN:
            Fn::GetAtt: [ Bucket, Arn ]
          Prefix: ${cf:cncb-event-stream-${opt:stage}.streamName}/
          ...

  Outputs:
    DataLakeBucketName:
      Value:
        Ref: Bucket
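By default, Firehose buffers incoming records and only writes an object to S3 once a size or time threshold is reached. The elided properties above are not reproduced here; purely as an illustration, buffering can be tuned with the CloudFormation BufferingHints property (the values below are assumptions, not the recipe's actual settings):

```yaml
# Illustrative only -- these are NOT the recipe's elided values.
# BufferingHints controls how long Firehose accumulates records
# before writing a batch object to the destination bucket.
ExtendedS3DestinationConfiguration:
  BufferingHints:
    IntervalInSeconds: 60 # flush at least once per minute
    SizeInMBs: 1          # or as soon as 1 MB has accumulated
```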
  4. Review the file named handler.js with the following content:
exports.transform = (event, context, callback) => {
  const output = event.records.map((record, i) => {
    // store all available data
    const uow = {
      event: JSON.parse((Buffer.from(record.data, 'base64')).toString('utf8')),
      kinesisRecordMetadata: record.kinesisRecordMetadata,
      firehoseRecordMetadata: {
        deliveryStreamArn: event.deliveryStreamArn,
        region: event.region,
        invocationId: event.invocationId,
        recordId: record.recordId,
        approximateArrivalTimestamp: record.approximateArrivalTimestamp,
      }
    };

    return {
      recordId: record.recordId,
      result: 'Ok',
      data: Buffer.from(JSON.stringify(uow) + '\n', 'utf-8').toString('base64'),
    };
  });

  callback(null, { records: output });
};
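To see what the transformer produces, you can exercise the same logic locally with a hand-crafted event. The sketch below inlines the transform from handler.js so it runs standalone; every event value (the ARN, IDs, and timestamp) is mock data, not real Firehose output:

```javascript
// Self-contained smoke test for the Firehose transformer. The transform
// logic is inlined from handler.js; all event values are mock data.
const transform = (event, context, callback) => {
  const output = event.records.map((record) => {
    const uow = {
      event: JSON.parse(Buffer.from(record.data, 'base64').toString('utf8')),
      kinesisRecordMetadata: record.kinesisRecordMetadata,
      firehoseRecordMetadata: {
        deliveryStreamArn: event.deliveryStreamArn,
        region: event.region,
        invocationId: event.invocationId,
        recordId: record.recordId,
        approximateArrivalTimestamp: record.approximateArrivalTimestamp,
      },
    };
    return {
      recordId: record.recordId,
      result: 'Ok',
      data: Buffer.from(JSON.stringify(uow) + '\n', 'utf-8').toString('base64'),
    };
  });
  callback(null, { records: output });
};

// Hand-crafted event mirroring the shape Firehose sends to the Lambda.
const mockEvent = {
  deliveryStreamArn: 'arn:aws:firehose:us-east-1:123456789012:deliverystream/mock',
  region: 'us-east-1',
  invocationId: 'mock-invocation-id',
  records: [{
    recordId: 'mock-record-id',
    approximateArrivalTimestamp: 1530000000000,
    data: Buffer.from(JSON.stringify({ type: 'thing-created' }), 'utf8').toString('base64'),
    kinesisRecordMetadata: { shardId: 'shardId-000000000000' },
  }],
};

transform(mockEvent, {}, (err, result) => {
  if (err) throw err;
  const record = result.records[0];
  console.log(record.result); // 'Ok'
  // The delivered payload round-trips: base64 -> newline-delimited JSON.
  const uow = JSON.parse(Buffer.from(record.data, 'base64').toString('utf8'));
  console.log(uow.event.type); // 'thing-created'
});
```

Note that the function returns every record with a result of 'Ok' and re-encodes the enriched unit of work as base64, which is the contract a Firehose transformation Lambda must honor.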
  5. Install the dependencies with npm install.
  6. Run the tests with npm test -- -s $MY_STAGE.
  7. Review the contents generated in the .serverless directory.
  8. Deploy the stack:
$ npm run dp:lcl -- -s $MY_STAGE

> cncb-data-lake-s3@1.0.0 dp:lcl <path-to-your-workspace>/cncb-data-lake-s3
> sls deploy -v -r us-east-1 "-s" "john"

Serverless: Packaging service...
...
Serverless: Stack update finished...
...
Stack Outputs
DataLakeBucketName: cncb-data-lake-s3-john-bucket-1851i1c16lnha
...
  9. Review the stack, data lake bucket, and Firehose delivery stream in the AWS Console.
  10. Publish an event from a separate Terminal with the following commands:
$ cd <path-to-your-workspace>/cncb-event-stream
$ sls invoke -r us-east-1 -f publish -s $MY_STAGE -d '{"type":"thing-created"}'
{
"ShardId": "shardId-000000000000",
"SequenceNumber": "49582906351415672136958521360120605392824155736450793474"
}
  11. Allow time for the Firehose buffer to flush, and then review the data lake contents created in the S3 bucket.
  12. Remove the stack once you have finished with npm run rm:lcl -- -s $MY_STAGE.
Remove the data lake stack only after you have worked through all the other recipes. This will allow you to watch the data lake accumulate the events from each of them.
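Each object that Firehose writes to the bucket holds a batch of the newline-delimited JSON records emitted by the transformer. Given an object's body as a string, the individual events can be recovered by splitting on newlines; the sample body below is illustrative, not real bucket content:

```javascript
// Recover individual events from a Firehose-delivered S3 object body.
// The sample body is illustrative mock data in the transformer's
// newline-delimited JSON format, not real bucket content.
const body = [
  JSON.stringify({ event: { type: 'thing-created' } }),
  JSON.stringify({ event: { type: 'thing-updated' } }),
].join('\n') + '\n';

const events = body
  .split('\n')
  .filter((line) => line.length > 0) // drop the trailing empty line
  .map((line) => JSON.parse(line));

console.log(events.length);        // 2
console.log(events[0].event.type); // 'thing-created'
```

This newline-delimited layout is what makes the raw bucket contents easy to feed into downstream batch tools, since each line is an independent JSON document.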