Being part of a municipality-owned electric utility offers a unique opportunity to lead in big data analytics. What moves the electric utility of the 7th largest city in the U.S.? The answer is people. For years, CPS Energy has invested in local talent, local technology development, city growth, its employees, and an asset infrastructure that is setting the stage for continued success. When such investments are paired with a data infrastructure and applications conducive to creating business insights, CPS Energy can justify and prioritize further investments. For us, the biggest people opportunities in big data analytics are in operations, customer and employee engagement, and safety. The presenter will provide examples and share how his views have evolved from those of a researcher, to global renewable energy consultant, to technology innovator, and most recently a “harvester of value” from people, process, and technology assets. Lastly, the current and anticipated future states of San Antonio’s electric utility big data enablement platform will be presented.
The technology landscape in big data analytics is diverse. It includes (1) cloud-based distributed computing, driven by economies of scale and the need for optimal response of the Bulk Electric System and tailored customer service, and (2) machines exchanging information with other machines in the industrial Internet of Things, driven by the exponential growth of devices in the communication network and the desire for faster, more optimal controllability in the distribution management system. Even so, technology investment decisions in electric utilities are still made by and for people with a keen eye for creating value for their customers. In addition, although machine learning was first conceived in the 1950s, the recent turning point in open-source software has allowed it to mature past academic research and inflated expectations, enabling faster and more transparent technology deployment. Software that uses machine learning techniques can be readily trained and deployed, centrally or on distributed infrastructure, to predict better solutions to business problems, provided the data inputs stay within a reasonable range and exhibit normal variability. However, the field sensors, actuators, and communication networks of utility-scale environments, in seeking to meet economic and customer expectations, end up being dynamic and heterogeneous in function, quality of service, time synchronization, and location, and significant time is still spent verifying that security standards are met and checking input data for completeness and accuracy.
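The completeness and accuracy checks mentioned above can be sketched as a simple pre-prediction gate. The field names, the plausible-value range, and the model interface below are illustrative assumptions, not CPS Energy specifics:

```python
# Hypothetical sketch: completeness and range checks applied to a meter
# reading before it is passed to a trained model. All names and thresholds
# are assumptions for illustration only.

REQUIRED_FIELDS = ("meter_id", "timestamp", "kwh")
KWH_RANGE = (0.0, 500.0)  # assumed plausible interval for one reading


def validate_reading(reading):
    """Return a list of data-quality issues; an empty list means the reading passes."""
    issues = []
    for field in REQUIRED_FIELDS:
        if reading.get(field) is None:
            issues.append("missing field: " + field)
    kwh = reading.get("kwh")
    if isinstance(kwh, (int, float)) and not KWH_RANGE[0] <= kwh <= KWH_RANGE[1]:
        issues.append("kwh out of range: " + str(kwh))
    return issues


def predict_if_clean(reading, model):
    """Invoke the model only when the input passes the basic quality checks."""
    if validate_reading(reading):
        return None  # defer to error handling instead of predicting on bad data
    return model(reading)
```

The point of the sketch is the ordering: validation runs before inference, so a model trained on data with normal variability is never asked to extrapolate from incomplete or implausible inputs.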
Since electric utilities serve a large group of customers in their territory, the opportunities to create value for those customers far outweigh the challenges, even given the imbalance between human resources and data intake. But until new, tested principles for cyber-physical systems are developed, it seems we must rely on traditional error handling in middleware workflows to account for known potential data inaccuracies and to close the gap between central and distributed computing resources.
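One traditional error-handling pattern in such middleware workflows is to quarantine records that fail validation rather than drop them or pass them downstream. A minimal sketch, with record fields and the completeness rule assumed for illustration:

```python
# Hypothetical middleware step: split an incoming batch of sensor records
# into accepted and quarantined sets. Field names are illustrative.

def is_complete(record):
    """A record is complete when no expected value is missing."""
    return all(record.get(key) is not None for key in ("sensor_id", "value"))


def route_records(records):
    """Return (accepted, quarantined) lists for an incoming batch."""
    accepted, quarantined = [], []
    for record in records:
        (accepted if is_complete(record) else quarantined).append(record)
    return accepted, quarantined
```

The quarantined set remains available for later review or replay, which is what lets known data inaccuracies be handled explicitly instead of silently corrupting downstream analytics.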