Mainframes are reliable and highly automated, often running for years with virtually no human intervention. Mainframe analyst Josh Krischer tells a story about an Eastern European airline that ran its core systems on an IBM mainframe for five years without ever touching the machine after its mainframe IT guy retired.
However, that data often stays locked on the mainframe, because sorting and transforming it for complex data analytics has been expensive and robs the core applications of CPU cycles. Still, an estimated 85% of corporate systems run on mainframes.
The good news is that Big Data technologies are making it easier and less costly to export that data. One option is to use JCL batch workloads to move the data to Hadoop, where it can be processed, combined with other relevant data and, for instance, moved to a NoSQL database to power forward-looking analysis for business decisions. The challenge is the lack of well-established native connectivity between mainframes and Hadoop.
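One concrete hurdle in any mainframe-to-Hadoop pipeline is data encoding: mainframe datasets are typically fixed-width EBCDIC records, which must be converted before Hadoop tools can process them. The sketch below is a minimal, hypothetical illustration in Python, assuming a made-up three-field record layout and EBCDIC code page 037; a real extract would use the copybook layout of the source dataset.

```python
# Minimal sketch: convert fixed-width EBCDIC mainframe records to UTF-8 CSV
# so they can be loaded into Hadoop. The record layout below is hypothetical.
import csv
import io

RECORD_LEN = 30
# (field name, start offset, end offset) -- assumed layout for illustration
FIELDS = [("cust_id", 0, 8), ("name", 8, 28), ("region", 28, 30)]

def records_to_csv(raw: bytes) -> str:
    """Decode fixed-width EBCDIC records and emit UTF-8 CSV text."""
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow([name for name, _, _ in FIELDS])
    for i in range(0, len(raw), RECORD_LEN):
        # cp037 is a common EBCDIC code page for US/Canada mainframes
        rec = raw[i:i + RECORD_LEN].decode("cp037")
        writer.writerow([rec[start:end].strip() for _, start, end in FIELDS])
    return out.getvalue()
```

The resulting CSV could then be pushed into HDFS with standard tooling (for example, the `hdfs dfs -put` command), at which point the usual Hadoop ecosystem takes over.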