Amazon Web Services today
announced the limited preview of Amazon Redshift, a fast and powerful,
fully managed, petabyte-scale data warehouse service in the cloud.
Amazon Redshift enables customers to dramatically increase query performance
when analyzing virtually any size data set, using the
same SQL-based business intelligence tools they use today.
With a few clicks in the AWS Management Console, customers can launch a Redshift cluster, starting with a few hundred gigabytes and scaling to a petabyte or more, for under $1,000 per terabyte per year -- one tenth the price of most data warehousing solutions available to customers today.
Self-managed, on-premises data warehouses require significant time and resources to administer, especially for large datasets. Loading, monitoring, tuning, taking backups, and recovering from faults are complex and time-consuming tasks. And the financial cost of building, maintaining, and growing a traditional data warehouse is steep.
Larger companies have resigned themselves to paying a high price for data warehousing, while smaller companies often find the hardware and software costs prohibitive, leaving most of these organizations without any data warehousing capability. Amazon Redshift aims to change this.
Amazon Redshift manages all of the work needed to set up, operate, and scale a data warehouse, from provisioning capacity to monitoring and backing up the cluster, to applying patches and upgrades. Scaling a cluster to improve performance or increase capacity on Amazon Redshift is simple and incurs no downtime, while the service continuously monitors the health of the cluster and automatically replaces any failed components. Amazon Redshift is also priced cost-effectively (a fraction of the cost of existing data warehouses) to enable larger companies to substantially reduce their costs and smaller companies to take advantage of the analytic insights that come from using a powerful data warehouse.
"Over the past two years, one of the most frequent requests we've heard from customers is for AWS to build a data warehouse service," said Raju Gulabani, Vice President of Database Services, AWS. "Enterprises are tired of paying such high prices for their data warehouses and smaller companies can't afford to analyze the vast amount of data they collect (often throwing away 95% of their data). This frustrates customers as they know the cloud has made it easier and less expensive than ever to collect, store, and analyze data. Amazon Redshift not only significantly lowers the cost of a data warehouse, but also makes it easy to analyze large amounts of data very quickly. While actual performance will vary based on each customers' specific query requirements, our internal tests have shown over 10 times performance improvement when compared to standard relational data warehouses. Having the ability to quickly analyze petabytes of data at a low cost changes the game for our customers."
Amazon Redshift uses a number of techniques, including columnar data storage, advanced compression, and high-performance I/O and networking, to achieve significantly higher performance than traditional databases for data warehousing and analytics workloads. By distributing and parallelizing queries across a cluster of inexpensive nodes, Amazon Redshift makes it easy to obtain high performance without requiring customers to hand-tune queries, maintain indices, or pre-compute results.
Amazon Redshift is certified for use with popular business intelligence tools, including Jaspersoft and MicroStrategy. Over twenty customers, including Flipboard, NASA/JPL, Netflix, and Schumacher Group, are in the Amazon Redshift private beta program.
"We are excited about being able to use this new service to take our cloud usage even farther and run a large scale data warehouse in the cloud for our engineering, science, and IT data," said Tom Soderstrom,Chief Technology Officer, Office of the CIO, NASA/JPL. "We're delighted to have a new, fast and low-costoption for analyzing massive amounts of data.This new servicewill also allow us to create new types of Big Data analytics that will lead to new discoveries."
Amazon Redshift includes technology components licensed from ParAccel and is available with two underlying node types, holding either 2 terabytes or 16 terabytes of compressed customer data per node. A cluster can scale up to 100 nodes, and on-demand pricing starts at just $0.85 per hour for a 2-terabyte data warehouse, scaling linearly up to a petabyte and more. Reserved instance pricing lowers the effective price to $0.228 per hour, or under $1,000 per terabyte per year -- less than one tenth the price of comparable technology available to customers today.
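The per-terabyte figure follows directly from the quoted hourly rate. A quick check of the arithmetic, assuming a 2-terabyte node and a non-leap year of 8,760 hours:

```python
# Verifying the quoted reserved-instance pricing: $0.228/hour for a
# node that holds 2 TB of compressed customer data.
hourly_rate = 0.228            # USD per hour (effective reserved price)
node_capacity_tb = 2           # terabytes per 2 TB node
hours_per_year = 24 * 365      # 8,760 hours in a non-leap year

annual_cost_per_node = hourly_rate * hours_per_year          # ≈ $1,997
cost_per_tb_per_year = annual_cost_per_node / node_capacity_tb

print(round(cost_per_tb_per_year, 2))  # 998.64 -- under $1,000/TB/year
```

The result, roughly $999 per terabyte per year, matches the release's "under $1,000 per terabyte per year" claim.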