
How to do it...

  1. Create the project from the following template:
$ sls create --template-url https://github.com/danteinc/js-cloud-native-cookbook/tree/master/ch2/data-lake-es --path cncb-data-lake-es
  2. Navigate to the cncb-data-lake-es directory with cd cncb-data-lake-es.
  3. Review the file named serverless.yml with the following content:
service: cncb-data-lake-es

provider:
  name: aws
  runtime: nodejs8.10

plugins:
  - elasticsearch

functions:
  transformer:
    handler: handler.transform
    timeout: 120

resources:
  Resources:
    Domain:
      Type: AWS::Elasticsearch::Domain
      Properties:
        ...

    DeliveryStream:
      Type: AWS::KinesisFirehose::DeliveryStream
      Properties:
        DeliveryStreamType: KinesisStreamAsSource
        KinesisStreamSourceConfiguration:
          KinesisStreamARN: ${cf:cncb-event-stream-${opt:stage}.streamArn}
          ...
        ElasticsearchDestinationConfiguration:
          DomainARN:
            Fn::GetAtt: [ Domain, DomainArn ]
          IndexName: events
          IndexRotationPeriod: OneDay
          TypeName: event
          BufferingHints:
            IntervalInSeconds: 60
            SizeInMBs: 50
          RetryOptions:
            DurationInSeconds: 60
          ...
          ProcessingConfiguration: ${file(includes.yml):ProcessingConfiguration}

    Bucket:
      Type: AWS::S3::Bucket

    ...

  Outputs:
    ...
    DomainEndpoint:
      Value:
        Fn::GetAtt: [ Domain, DomainEndpoint ]
    KibanaEndpoint:
      Value:
        Fn::Join:
          - ''
          - - Fn::GetAtt: [ Domain, DomainEndpoint ]
            - '/_plugin/kibana'
    ...
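The ProcessingConfiguration fragment referenced at the end of ElasticsearchDestinationConfiguration is pulled in from a separate includes.yml file; it is what wires the transformer function into the delivery stream so that Firehose invokes it on each buffered batch. The file itself is not listed in this recipe, so the following is only a rough sketch of what such a fragment could look like, based on the standard AWS::KinesisFirehose::DeliveryStream schema. The TransformerLambdaFunction logical ID assumes the Serverless Framework's default naming convention, and the parameter values are illustrative:

ProcessingConfiguration:
  Enabled: true
  Processors:
    - Type: Lambda
      Parameters:
        - ParameterName: LambdaArn
          # depending on your setup, Firehose may expect a qualified ARN (for example, with :$LATEST)
          ParameterValue:
            Fn::GetAtt: [ TransformerLambdaFunction, Arn ]
        - ParameterName: NumberOfRetries
          ParameterValue: '3'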
  4. Review the file named handler.js with the following content:
exports.transform = (event, context, callback) => {
  const output = event.records.map((record, i) => {
    // store all available data
    const uow = {
      event: JSON.parse((Buffer.from(record.data, 'base64')).toString('utf8')),
      kinesisRecordMetadata: record.kinesisRecordMetadata,
      firehoseRecordMetadata: {
        deliveryStreamArn: event.deliveryStreamArn,
        region: event.region,
        invocationId: event.invocationId,
        recordId: record.recordId,
        approximateArrivalTimestamp: record.approximateArrivalTimestamp,
      }
    };

    return {
      recordId: record.recordId,
      result: 'Ok',
      data: Buffer.from(JSON.stringify(uow), 'utf-8').toString('base64'),
    };
  });

  callback(null, { records: output });
};
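To get a feel for the transformation before deploying, you can invoke the handler locally with a hand-crafted Firehose transformation event. The following is a minimal sketch; the file name sample-transform.js and all of the event values are made up for illustration:

// sample-transform.js - invoke the transformer locally with a fabricated event
const { transform } = require('./handler');

const sampleEvent = {
  invocationId: 'example-invocation-id',
  deliveryStreamArn: 'arn:aws:firehose:us-east-1:123456789012:deliverystream/example',
  region: 'us-east-1',
  records: [{
    recordId: 'example-record-id',
    approximateArrivalTimestamp: 1530000000000,
    kinesisRecordMetadata: { sequenceNumber: 'example-sequence-number', partitionKey: 'example' },
    // the data field is the base64-encoded event, just as Firehose delivers it
    data: Buffer.from(JSON.stringify({ type: 'thing-created' }), 'utf8').toString('base64'),
  }],
};

transform(sampleEvent, {}, (err, result) => {
  // decode the transformed record to inspect the uow that will be indexed
  console.log(JSON.stringify(JSON.parse(
    Buffer.from(result.records[0].data, 'base64').toString('utf8')), null, 2));
});

Running it with node sample-transform.js prints the decoded uow, which is the document that Firehose will deliver to the Elasticsearch index.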
  5. Install the dependencies with npm install.
  6. Run the tests with npm test -- -s $MY_STAGE.
  7. Review the contents generated in the .serverless directory.
  8. Deploy the stack:
Deploying an Elasticsearch domain can take upwards of 20 minutes.
$ npm run dp:lcl -- -s $MY_STAGE

> cncb-data-lake-es@1.0.0 dp:lcl <path-to-your-workspace>/cncb-data-lake-es
> sls deploy -v -r us-east-1 "-s" "john"

Serverless: Packaging service...
...
Serverless: Stack update finished...
...
functions:
  transformer: cncb-data-lake-es-john-transformer

Stack Outputs
DeliveryStream: cncb-data-lake-es-john-DeliveryStream-1ME9ZI78H3347
DomainEndpoint: search-cncb-da-domain-5qx46izjweyq-oehy3i3euztbnog4juse3cmrs4.us-east-1.es.amazonaws.com
DeliveryStreamArn: arn:aws:firehose:us-east-1:123456789012:deliverystream/cncb-data-lake-es-john-DeliveryStream-1ME9ZI78H3347
KibanaEndpoint: search-cncb-da-domain-5qx46izjweyq-oehy3i3euztbnog4juse3cmrs4.us-east-1.es.amazonaws.com/_plugin/kibana
DomainArn: arn:aws:es:us-east-1:123456789012:domain/cncb-da-domain-5qx46izjweyq
...
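While the domain is provisioning, you can also check its status from the command line instead of the console, assuming you have the AWS CLI installed. The domain name below is taken from the DomainArn output in the example above; substitute your own:

$ aws es describe-elasticsearch-domain \
    --domain-name cncb-da-domain-5qx46izjweyq \
    --query 'DomainStatus.{Created:Created,Processing:Processing,Endpoint:Endpoint}'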
  9. Review the stack, function, and Elasticsearch domain in the AWS Console.
  10. Publish an event from a separate Terminal with the following commands:
$ cd <path-to-your-workspace>/cncb-event-stream
$ sls invoke -r us-east-1 -f publish -s $MY_STAGE -d '{"type":"thing-created"}'

{
    "ShardId": "shardId-000000000000",
    "SequenceNumber": "49583655996852917476267785049074754815059037929823272962"
}
Allow time for the Firehose buffer to process the event, as the buffer interval is set to 60 seconds.
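Once the buffer has flushed, you can optionally confirm that the event reached the data lake by querying the index directly, using the DomainEndpoint value from the stack outputs (shown here as a placeholder). This assumes the domain's access policy allows requests from your machine:

$ curl "https://<DomainEndpoint>/events-*/_search?pretty"

The returned hits should include a document whose event field contains the thing-created event you just published.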
  11. Take a look at the transformer function logs:
$ sls logs -f transformer -r us-east-1 -s $MY_STAGE   
  12. Open Kibana using the preceding KibanaEndpoint output with the https protocol.
  13. Select the Management menu, set the index pattern to events-*, and press the Next step button.
  14. Select timestamp as the Time Filter field name, and press Create index pattern.
  15. Select the Discover menu to view the current events in the index.
  16. Remove the stack once you are finished with npm run rm:lcl -- -s $MY_STAGE.