Tim Bumgardner has over 30 years of experience constructing, operating, troubleshooting, and analyzing data from pilot plants and scale models of reactor and refining systems. He was previously employed by Union Carbide and Dow Chemical, where he gained expertise in both laboratory and pilot plant operations.
MATRIC is using "big data" computing to help the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) perform or enhance data mining, analysis, simulation, and modeling of high-volume, high-velocity, and high-variety spatial and non-spatial data. Big data computing is a combination of hardware (computing clusters) and software technologies (Hadoop, Spark, MapReduce) that makes it possible to realize value from datasets too large for a single computer to process. Big data computing shares similarities with high performance computing in that both approaches use clusters of computers (cloud or locally hosted) to distribute and accomplish complex tasks. However, big data computing is specifically designed to process and analyze larger-scale datasets.
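The MapReduce pattern mentioned above can be sketched in a few lines. This is a minimal single-machine illustration using only the Python standard library; in practice, a framework such as Hadoop or Spark distributes these same phases across a cluster. The record data and function names here are illustrative assumptions, not part of MATRIC's or NETL's actual pipeline.

```python
# Minimal sketch of the MapReduce pattern: map emits key-value pairs,
# shuffle groups them by key, reduce aggregates each group.
# A real framework (Hadoop, Spark) runs these phases across many machines.
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # Emit one (word, 1) pair per word in the record.
    return [(word.lower(), 1) for word in record.split()]

def shuffle_phase(pairs):
    # Group values by key, as the framework does between map and reduce.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    # Aggregate each key's values -- here, a simple count.
    return key, sum(values)

# Hypothetical input records standing in for a large dataset.
records = ["coal gasification data", "gasification sensor data"]
pairs = chain.from_iterable(map_phase(r) for r in records)
counts = dict(reduce_phase(k, v) for k, v in shuffle_phase(pairs).items())
# counts -> {'coal': 1, 'gasification': 2, 'data': 2, 'sensor': 1}
```

Because each map and reduce call is independent, the framework can run them in parallel on different nodes and combine the results, which is what lets a cluster handle datasets no single computer could process.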