You don’t see quite as much hype about “Big Data” these days, but some industries are diving in with both feet. Healthcare has even coined its own term, “Big Health Data,” as more and more initiatives begin to harness the huge amount of unstructured data produced daily by physicians, clinics and hospitals.
One example is the newly launched clinical data warehouse, the nation’s first, which tracks the health records of 3.2 million South Carolinians. Developed by Health Sciences South Carolina, the effort was a collaboration between South Carolina’s top universities and hospital systems.
According to the Charleston Regional Business Journal the “Clinical Data Warehouse links and matches anonymous electronic patient records from the state’s largest health care systems to enable providers and researchers to follow patient conditions in real time. The data warehouse also allows biomedical researchers to conduct patient-centered outcomes research and comparative effectiveness studies across a much broader and aggregated patient population base.”
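The linking-and-matching step described above typically works by replacing direct patient identifiers with consistent pseudonyms, so records from different health systems can be joined without exposing who the patient is. A minimal sketch in Python, assuming a keyed-hash (HMAC) pseudonymization scheme; the warehouse’s actual linkage method is not described in the article, so the key, field names and records here are purely illustrative:

```python
import hmac
import hashlib

# Hypothetical secret key held by the warehouse operator (assumption:
# the real system's linkage method is not disclosed).
SECRET_KEY = b"warehouse-linkage-key"

def pseudonym(patient_id: str) -> str:
    """Derive a stable, anonymous token from a patient identifier.

    The same patient ID always maps to the same token, so records from
    different health systems can be linked without storing the ID itself.
    """
    return hmac.new(SECRET_KEY, patient_id.encode(), hashlib.sha256).hexdigest()

# Illustrative records from two hospital systems for the same patient
record_a = {"patient_id": "SC-000123", "source": "hospital_a", "dx": "I10"}
record_b = {"patient_id": "SC-000123", "source": "hospital_b", "dx": "E11"}

# Strip the identifier and group records under the anonymous token
linked: dict[str, list[dict]] = {}
for rec in (record_a, record_b):
    token = pseudonym(rec.pop("patient_id"))
    linked.setdefault(token, []).append(rec)
```

Because the token is deterministic, a researcher can follow one patient’s conditions across systems and over time, yet the warehouse never has to hold the original identifier alongside the clinical data.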
The deployment of a well designed and implemented data warehouse is a first step in mining and using real-time big data flows. Some systems exist “off the shelf” and still more are being developed “in-house” or through collaborative efforts like those of the South Carolina health systems. Teradata recently announced the introduction of their cloud based solution that provides data warehousing by way of a “Software as a Service” model.
Writing in a recent edition of InfoQ, Alex Giamas assessed the Teradata offering: “From a technical standpoint, at the current stage Teradata Cloud fares closer to Amazon Redshift than any other cloud offering. Teradata offers physical dedicated hardware in its cloud, which is appealing for large corporations that can’t have their data collocated with other datasets. Teradata’s selling point for the Teradata Workload Manager is also a richer set of features as compared with Amazon’s Data Pipeline offering.”
Oracle is approaching the big data need by extending its capabilities as well. It has extended its platform approach to Big Data with the Oracle Big Data Appliance X4-2, which makes use of the latest generation of Intel Xeon processors and 4TB magnetic disk drives.
George Lumpkin, vice president of product management at Oracle, states that the tech giant views Hadoop and other forms of NoSQL as a natural extension of the data warehouse. Those warehouses, notes Lumpkin, already contain customer information in a structured format that often stretches back multiple decades. Hadoop and related technologies now allow them to be extended to include all types of unstructured data, including clickstreams and all manner of machine data.
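The clickstream extension Lumpkin describes boils down to turning raw machine data into rows a warehouse can query. A small Python sketch of that step; the log format and field names here are assumptions for illustration, not Oracle’s actual schema:

```python
import re
from datetime import datetime

# Hypothetical raw clickstream lines, the kind of "machine data" a
# warehouse might ingest via Hadoop (format is an assumption).
raw_log = [
    '203.0.113.7 - [12/Mar/2014:10:05:03] "GET /portal/labs HTTP/1.1" 200',
    '203.0.113.7 - [12/Mar/2014:10:05:41] "GET /portal/rx HTTP/1.1" 200',
    '198.51.100.4 - [12/Mar/2014:10:06:02] "GET /portal/labs HTTP/1.1" 404',
]

LINE = re.compile(
    r'(?P<ip>\S+) - \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]+" (?P<status>\d+)'
)

def to_row(line: str) -> dict:
    """Turn one unstructured log line into a structured record."""
    m = LINE.match(line)
    if m is None:
        raise ValueError(f"unparseable line: {line!r}")
    row = m.groupdict()
    # Parse timestamp and status into typed columns
    row["ts"] = datetime.strptime(row["ts"], "%d/%b/%Y:%H:%M:%S")
    row["status"] = int(row["status"])
    return row

rows = [to_row(line) for line in raw_log]
```

Once flattened into typed rows like these, the clickstream can be joined against the structured customer history the warehouse already holds.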
Many companies will find that getting to their data and putting it to use the way these warehouse systems do can be done at a cost their organization can afford, even without the resources of an entire state or a staff full of data warehouse experts. A number of firms offer design and development teams as less expensive, short-term resources for building a “Big Data” warehouse.
By combining U.S.-based project management with offshore development, a data warehouse and the systems to utilize it can be deployed at a fraction of the cost that mainstream providers charge. For many years the emphasis in these development communities has been on technical proficiency with both off-the-shelf systems and the customization of entire data warehouses to customer needs.
One thing is absolutely clear: now that the hype has passed, Big Data has become an objective that companies take seriously, and they continue to look for real solutions for taking advantage of a growing resource.