Kinesis connector applications written in Go
With the new release of Kinesis Firehose, I'd recommend using the Lambda Streams to Firehose project for loading data directly into S3 and Redshift.

Inspired by the Amazon Kinesis Connector Library, this library is intended to be a lightweight wrapper around the Kinesis API that handles batching records, setting checkpoints, respecting rate limits, and recovering from network errors.
The consumer expects a handler func that will process a buffer of incoming records.
```go
package main

import (
	"flag"
	"fmt"

	"github.com/gametimesf/kinesis-connectors"
)

func main() {
	var (
		app    = flag.String("app", "", "The app name")
		stream = flag.String("stream", "", "The stream name")
	)
	flag.Parse()

	// create new consumer
	c := connector.NewConsumer(connector.Config{
		AppName:        *app,
		MaxRecordCount: 400,
		StreamName:     *stream,
	})

	// process records from the stream
	c.Start(connector.HandlerFunc(func(b connector.Buffer) {
		fmt.Println(b.GetRecords())
	}))

	select {}
}
```
The default behavior for checkpointing uses Redis on localhost. To use a custom Redis instance, pass `RedisURL` into the config struct.
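For example, a sketch of the config with a custom Redis URL (the URL shown is a placeholder; `RedisURL` is the field named above, alongside the other config fields):

```go
c := connector.NewConsumer(connector.Config{
	AppName:    "test",
	StreamName: "test",
	// RedisURL points checkpointing at a Redis instance other than localhost
	RedisURL:   "redis://cache.example.com:6379",
})
```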
Apex Log is used for logging. Override the log format with other log handlers. For example, using the `json` log handler:
```go
import (
	"os"

	"github.com/apex/log"
	"github.com/apex/log/handlers/json"
)

func main() {
	// ...
	log.SetHandler(json.New(os.Stderr))
	log.SetLevel(log.DebugLevel)
}
```
This will produce the following logs:
```
INFO[0000] processing  app=test shard=shardId-000000000000 stream=test
INFO[0008] emitted     app=test count=500 shard=shardId-000000000000 stream=test
INFO[0012] emitted     app=test count=500 shard=shardId-000000000000 stream=test
```
Get the package source:
```sh
$ go get github.com/gametimesf/kinesis-connectors
```
Use the seed stream code to put sample data onto a stream.
Please see CONTRIBUTING.md for more information. Thank you, contributors!
Copyright (c) 2015 Harlow Ward. It is free software, and may be redistributed under the terms specified in the LICENSE file.
www.hward.com · GitHub @harlow · Twitter @harlow_ward