One of my technological passions is cloud-based computer architecture and its open source implementations. Hands down, my favorite open source solution is OpenStack. Initially created through the collaborative efforts of NASA and Rackspace, it now relies on the contribution of developers across the globe and the involvement of companies such as Cisco, Dell, HP and Intel.
What is OpenStack? Simply put, OpenStack is a platform that brings together “virtual machines” and physical computers. (If you’ve ever run Windows on a Mac, you’ve used a virtual machine. Windows is the virtual machine, and the Mac is the “hardware node,” or the physical computer.)
Several physical computers can host multiple virtual machines, and that’s when things get powerful and can benefit the real estate industry. One of the most productive uses of OpenStack is researching large data sets (a.k.a. “big data”). When the need arises to sift through extremely large data sets, e.g., historical market data, several virtual machines (called “worker nodes”) can split the workload of a large query job among them and shorten the overall completion time of the job. Depending upon the amount of physical hardware available to it, a properly configured OpenStack environment can start hundreds of worker nodes to analyze extremely large data sets; 10 physical machines could easily support 100 virtual machines.
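The divide-and-conquer idea behind worker nodes can be sketched in a few lines of Python. This is not OpenStack code; it is a minimal illustration, using Python’s standard `multiprocessing` module and made-up price data, of how a large query is split into slices, processed in parallel, and recombined — the same pattern the worker nodes above apply at data-center scale.

```python
from multiprocessing import Pool

# Hypothetical "query": count how many records exceed a price threshold.
def count_above_threshold(chunk, threshold=100.0):
    return sum(1 for price in chunk if price > threshold)

def run_query(prices, workers=4):
    # Split the data set into one slice per worker, mirroring how a
    # large job is divided among worker nodes in an OpenStack cluster.
    size = max(1, len(prices) // workers)
    chunks = [prices[i:i + size] for i in range(0, len(prices), size)]
    with Pool(processes=workers) as pool:
        # Each worker processes its slice independently, in parallel.
        partial_counts = pool.map(count_above_threshold, chunks)
    # Combine the partial results into the final answer.
    return sum(partial_counts)

if __name__ == "__main__":
    prices = [50.0, 150.0, 200.0, 75.0, 120.0, 99.0, 101.0, 300.0]
    print(run_query(prices, workers=4))  # prints 5
```

The key property is that the answer is the same no matter how many workers are used — only the completion time changes, which is exactly why adding virtual machines speeds up a big-data query without altering its result.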
Most of us don’t work with technologies such as OpenStack directly. Still, for those of you who have developed a curiosity about the cloud, I hope this overview offers some insight into the important part it plays in computer technology and its impact on the future of computing as a whole.
If you’d like to read more about OpenStack and how it’s being used, please visit: http://www.openstack.org/user-stories/