From its early experimental applications in the 1960s and 1970s, virtualization was first seriously implemented as a way to control IT capital and operational expenditures through server consolidation. Then in 2005, when Intel and AMD began introducing processor extensions that specifically support hardware-assisted virtualization, virtual environments started expanding into line-of-business applications, where they continue to deliver cost efficiency in IT production through resource consolidation. Reducing IT costs consistently ranks among CIOs' top three concerns today, and analyst firm Gartner identifies virtualization as one of the key enablers of technology cost reduction.