“Big Data” Hype Has Subsided, but the Push to Use It Is Still On

It seems you don’t read quite as much hype about “Big Data” these days, but some industries are diving in with both feet. Healthcare has even adopted its own version of the term, “Big Health Data,” as more and more initiatives begin to harness the huge amounts of unstructured data produced daily by physicians, clinics and hospitals.

One example is the newly launched clinical data warehouse, the nation’s first, that tracks the health records of 3.2 million South Carolinians. Developed by Health Sciences South Carolina, the effort was a collaboration among South Carolina’s top universities and hospital systems.

According to the Charleston Regional Business Journal, the “Clinical Data Warehouse links and matches anonymous electronic patient records from the state’s largest health care systems to enable providers and researchers to follow patient conditions in real time. The data warehouse also allows biomedical researchers to conduct patient-centered outcomes research and comparative effectiveness studies across a much broader and aggregated patient population base.”
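Linking “anonymous” records across separate hospital systems typically relies on some form of privacy-preserving record linkage. The article does not describe the warehouse’s actual mechanism; the sketch below shows one common approach, where every participating system derives the same salted hash from a patient identifier so records match without the raw ID ever leaving the source system. All names, identifiers, and data here are hypothetical.

```python
import hashlib

def link_key(patient_id: str, salt: str = "shared-secret") -> str:
    """Derive an anonymous, stable linkage key from a patient identifier.

    Every participating system applies the same salted hash, so records
    for the same patient can be matched in the warehouse without the
    raw identifier ever being exposed. (Illustrative only; real systems
    use hardened protocols and key management.)
    """
    return hashlib.sha256((salt + patient_id).encode("utf-8")).hexdigest()

# Records exported by two different hospital systems (made-up data)
system_a = [{"patient_id": "SC-1001", "diagnosis": "hypertension"}]
system_b = [{"patient_id": "SC-1001", "lab": "HbA1c 6.9%"}]

# Each system replaces the raw ID with the linkage key before export
anon_a = [{"key": link_key(r["patient_id"]), "diagnosis": r["diagnosis"]}
          for r in system_a]
anon_b = [{"key": link_key(r["patient_id"]), "lab": r["lab"]}
          for r in system_b]

# In the warehouse, equal keys mean the same (unnamed) patient
assert anon_a[0]["key"] == anon_b[0]["key"]
```

The point of the design is that the warehouse can follow one patient’s conditions across systems while holding only opaque keys, never names or medical record numbers.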

The deployment of a well-designed and well-implemented data warehouse is a first step in mining and using real-time big data flows. Some systems exist off the shelf, and still more are being developed in-house or through collaborative efforts like those of the South Carolina health systems. Teradata recently announced a cloud-based solution that provides data warehousing via a “Software as a Service” model.

According to Alex Giamas, writing in a recent edition of InfoQ: “From a technical standpoint, at the current stage Teradata Cloud fares closer to Amazon Redshift than any other cloud offering. Teradata offers physical dedicated hardware in its cloud, which is appealing for large corporations that can’t have their data collocated with other datasets. Teradata’s selling point for the Teradata Workload Manager is also a richer set of features as compared with Amazon’s Data Pipeline offering.”

Oracle is also addressing the big data need by extending its own capabilities. It has broadened its platform approach to Big Data with the Oracle Big Data Appliance X4-2, which makes use of the latest generation of Intel Xeon processors and 4TB magnetic disk drives.

George Lumpkin, vice president of product management for Oracle data, says the tech giant views Hadoop and other forms of NoSQL as a natural extension of the data warehouse. Those data warehouses, notes Lumpkin, already contain customer information in a structured format that often stretches back across multiple decades. Hadoop and other technologies now allow those data warehouses to be extended to include all types of unstructured data, including clickstreams and all manner of machine data.
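The extension Lumpkin describes amounts to turning free-form machine data into rows a warehouse can store alongside its structured history. A minimal sketch, assuming a made-up clickstream log format (the field names and regex here are hypothetical, not any Oracle or Hadoop API):

```python
import re
from datetime import datetime

# Hypothetical raw clickstream lines: unstructured machine data
raw_log = [
    "2024-01-15T10:03:22Z user=42 action=view page=/plans",
    "2024-01-15T10:04:01Z user=42 action=click page=/signup",
]

# Pattern for the assumed log layout: timestamp, user, action, page
LINE_RE = re.compile(r"(\S+) user=(\d+) action=(\w+) page=(\S+)")

def parse_event(line: str) -> dict:
    """Turn one free-form log line into a structured row."""
    ts, user, action, page = LINE_RE.match(line).groups()
    return {
        "ts": datetime.strptime(ts, "%Y-%m-%dT%H:%M:%SZ"),
        "user_id": int(user),
        "action": action,
        "page": page,
    }

# Structured rows, ready to load next to decades of customer records
rows = [parse_event(line) for line in raw_log]
```

In a real deployment this parsing step would run at scale in Hadoop (or a similar framework), with the resulting rows joined against the warehouse’s existing structured customer data.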

Many companies will find that getting to their data and utilizing it the way these data warehouse systems do can be done at a cost their organization can afford, even without the resources of an entire state or a staff full of data warehouse experts. A number of firms offer design and development teams as less expensive, short-term resources for building a “Big Data” warehouse.

By combining U.S.-based project management with offshore development, a data warehouse and the systems to utilize it can be deployed at a fraction of the cost that mainstream providers charge. For many years the emphasis in these development communities has been on technical proficiency with both off-the-shelf systems and the customization of entire data warehouses to customer needs.

One thing is absolutely clear: now that the hype has passed, Big Data has become an objective that companies take seriously, and they continue to look for real solutions for taking advantage of a growing resource.
