AWS Introduces Kinesis Firehose To Move Sensor Data To Cloud

Today at AWS re:Invent, Amazon introduced a new service called Kinesis Firehose to move data streaming from sensors and other locations directly to the cloud.

As Andy Jassy, SVP at AWS, pointed out, the company launched Kinesis a few years ago to ingest this kind of streaming data, leaving customers to build custom applications on top of it to handle that data. Amazon found that this was taking customers far too long, and some customers didn't have the resources to build those applications themselves.

This new service essentially eliminates the need to build that application by offering Firehose as a managed service. Jassy claims that with a single API call to Firehose, customers can now put the data into Amazon Redshift or S3 and immediately begin working with it.
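To give a sense of what that single-call ingest path looks like, here is a minimal sketch using the AWS SDK for Python (boto3); the "sensor-stream" delivery stream name and the sensor payload are hypothetical, and the stream is assumed to already be configured to deliver into S3 or Redshift.

# Minimal sketch: push one sensor reading into a hypothetical Firehose delivery stream.
import json
import boto3

firehose = boto3.client("firehose")

# Hypothetical sensor payload for illustration only.
reading = {"device_id": "thermostat-42", "temperature_c": 21.5}

# The "single API call": Firehose handles buffering and delivery from here.
firehose.put_record(
    DeliveryStreamName="sensor-stream",
    Record={"Data": (json.dumps(reading) + "\n").encode("utf-8")},
)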

The system is, of course, completely elastic, meaning Amazon will be more than happy to sell you as much storage as you need to process the incoming data. The service compresses and encrypts the data on the way in, and the company does give you the ability to set time intervals for when to upload the data or limits on how much data to take in before it is delivered.
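Those knobs are set when the delivery stream itself is created. The following is a rough sketch, again with boto3, of how the hypothetical "sensor-stream" above might be configured with buffering limits, compression, and encryption; the role, bucket, and KMS key ARNs are placeholders.

# Rough sketch: create a delivery stream with buffering, compression, and encryption settings.
import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="sensor-stream",
    S3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",
        "BucketARN": "arn:aws:s3:::sensor-archive",
        "Prefix": "readings/",
        # Deliver whenever either limit is hit: 5 MB buffered or 300 seconds elapsed.
        "BufferingHints": {"SizeInMBs": 5, "IntervalInSeconds": 300},
        # Compress and encrypt on the way in.
        "CompressionFormat": "GZIP",
        "EncryptionConfiguration": {
            "KMSEncryptionConfig": {
                "AWSKMSKeyARN": "arn:aws:kms:us-east-1:123456789012:key/example"
            }
        },
    },
)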

Once the data is in the system, customers can decrypt it with the same key and load it into Hadoop clusters (or wherever they wish) to process and analyze it.

This gives AWS a couple of advantages. It allows the company to expand its big data coverage to the Internet of Things, which has the potential to produce a tremendous amount of data on a daily basis, and it sharply expands the amount of data being stored on the system.

Customers who are so inclined can move this data into AWS and work with it, and Amazon can collect fat fees. Win-win.
