Robert Wright has extensive experience in laboratory and pilot plant operations, with emphasis on operation of small-scale batch and continuous reactors, catalyst preparation and evaluation, distillation, online gas chromatography and titration analysis, and control systems. Bob has experience as a technologist/technician with Roche Biomedical Laboratories, Union Carbide, Dow Chemical, and Alstom Power.
MATRIC is using big data computing to help the Department of Energy's (DOE) National Energy Technology Laboratory (NETL) perform and enhance data mining, analysis, simulation, and modeling of high-volume, high-velocity, and highly varied spatial and non-spatial data. Big data computing combines hardware (computing clusters) and software technologies (Hadoop, Spark, MapReduce) to realize value from datasets too large for a single computer to process. It shares similarities with high-performance computing in that both approaches use clusters of computers (cloud or locally hosted) to distribute and accomplish complex tasks; however, big data computing is specifically designed to process and analyze larger-scale datasets.
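The MapReduce model named above can be sketched in miniature with Python's standard library. This is an illustrative word-count example only, not MATRIC's or NETL's actual workload: the text partitions and the process pool are hypothetical stand-ins for dataset blocks and cluster nodes in a real Hadoop or Spark job.

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

# Hypothetical input: in a real Hadoop/Spark job, these "partitions" would be
# blocks of a large dataset distributed across cluster nodes.
PARTITIONS = [
    "sensor reading high temperature sensor",
    "reading low pressure sensor reading",
    "temperature pressure reading high",
]

def map_partition(text):
    """Map step: count words within one partition, independently of the others."""
    return Counter(text.split())

def merge_counts(a, b):
    """Reduce step: merge the partial counts from two partitions."""
    return a + b

def word_count(partitions):
    # Each worker processes one partition in parallel (a stand-in for cluster
    # nodes), then the partial results are merged -- the essence of MapReduce.
    with Pool(processes=len(partitions)) as pool:
        partials = pool.map(map_partition, partitions)
    return reduce(merge_counts, partials)

if __name__ == "__main__":
    totals = word_count(PARTITIONS)
    print(totals["reading"])  # total occurrences of "reading" across partitions
```

Because each partition is mapped independently, the same pattern scales from a handful of local processes to thousands of machines: only the merge step needs to see results from more than one partition.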