A career in the nuclear industry (0:16)
0:16-5:14 (James explains how he decided to focus his Master’s on nuclear engineering and his career at Dominion Energy.)
Q. How did you get started in the nuclear space?
A. As a child, James Miller originally wanted to be an astronomer or a paleontologist, but his mother suggested he become an engineer, so he pursued an education in physics. The Vietnam War broke out during James' Master's, and he joined the Navy as a communications technician. He then returned to graduate school, where a friend convinced him to transfer to nuclear engineering. After graduating, he joined Dominion Energy, where he stayed until his return to academia as a professor at Virginia Commonwealth University.
James joined Dominion when it was first beginning its nuclear engineering program. At the time, Dominion had two operational reactors at Surry in Virginia. Two additional reactors were being built at North Anna, which James was able to tour during construction. James worked in research and development, focusing on methods development for reactor physics calculations.
Nuclear models and design (5:15)
5:15-14:38 (James explains the types of computer models he worked on at Dominion. He also discusses the difference between fuel design and core design and the importance of understanding the theory behind models.)
Q. What is the model?
A. The models James worked on were computer codes. In 1975, computers were much slower than today. The codes calculated the distribution of power within the core and the criticality conditions. They were called PDQs, which stood for "Pretty Damn Quick." IBM had developed its own version of the code, which Dominion bought and modified. At the time, many nuclear utilities depended on the vendors to produce designs and defend them before the Nuclear Regulatory Commission. Dominion decided that this process would be easier if it were all done in house, keeping Dominion's own interests the priority. It then took three years to develop the core design capability, because the entire plant had to be modeled in code, including all input and benchmark data.
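The kind of power-distribution and criticality calculation James describes can be sketched in miniature. The following is an illustrative one-dimensional, one-group neutron diffusion solve using power iteration, not PDQ itself; all cross-section numbers are invented placeholders, and a real core code like PDQ works in two or three dimensions with many energy groups.

```python
# Minimal 1-D, one-group neutron diffusion sketch: find the criticality
# eigenvalue (k-effective) and the relative power shape across a bare
# slab "core" by power iteration. Numbers are illustrative, not real data.
import numpy as np

def solve_core(width_cm=200.0, n=100, D=1.0, sig_a=0.07, nu_sig_f=0.08):
    """Solve -D*phi'' + sig_a*phi = (1/k)*nu_sig_f*phi, phi = 0 at both edges."""
    h = width_cm / (n + 1)                        # mesh spacing
    main = (2.0 * D / h**2 + sig_a) * np.ones(n)  # leakage + absorption terms
    off = (-D / h**2) * np.ones(n - 1)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    phi, k = np.ones(n), 1.0                      # initial flux and k guesses
    for _ in range(200):                          # power (source) iteration
        src = nu_sig_f * phi                      # fission source from old flux
        phi = np.linalg.solve(A, src / k)         # new flux from that source
        k *= (nu_sig_f * phi).sum() / src.sum()   # update eigenvalue estimate
        phi /= phi.max()                          # renormalize to avoid drift
    power = phi / phi.mean()                      # relative power distribution
    return k, power
```

For a bare slab the analytic answer is k = nu_sig_f / (sig_a + D*B^2) with B = pi/width, about 1.14 for these numbers, and the power shape is the familiar chopped-cosine with its peak at the core center; the iteration reproduces both.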
James has worked in core design, safety analysis, and fuel performance and design at Dominion. Fuel design looks at the performance of the fuel rods, including material performance, corrosion, avoiding overpowering, and avoiding exceeding the burnup limit. Burnup is the amount of energy extracted per unit mass of fuel, and the limit is set by the nuclear regulator. Core design is how the fuel assemblies are arranged inside the reactor core. The goals of core design involve avoiding violating safety limits and optimizing the amount of energy extracted.
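The burnup figure James mentions can be illustrated with a back-of-the-envelope calculation. All the numbers below are hypothetical round figures, not data for any particular plant; burnup is conventionally quoted in gigawatt-days of thermal energy per metric ton of uranium (GWd/MTU).

```python
# Illustrative burnup arithmetic: energy extracted per unit mass of fuel.
thermal_power_mw = 3000.0   # core thermal power (hypothetical)
cycle_days = 500.0          # effective full-power days in the cycle (hypothetical)
uranium_mt = 90.0           # metric tons of uranium loaded in the core (hypothetical)

# MW * days / 1000 = GWd; divide by tons of uranium to get GWd/MTU
burnup_gwd_per_mtu = thermal_power_mw * cycle_days / 1000.0 / uranium_mt
print(f"cycle burnup ~ {burnup_gwd_per_mtu:.1f} GWd/MTU")
```

This single cycle adds roughly 17 GWd/MTU; fuel accumulates burnup over several cycles, and the regulator's licensed limit caps the total a rod may reach before it must be discharged.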
James notes the importance of learning nuclear theory because computer codes are not always correct. Theory is also key to recognizing when bad input data is producing bad output data. James points out that this can be an issue for the industry. Nuclear regulators once gained industry experience before becoming regulators, but now students become regulators straight out of university. This lack of industry experience means regulators may not fully understand why decisions are made or why certain things happen.
Solving nuclear challenges (14:39)
14:39-20:40 (James discusses some of the challenges he faced when working at Dominion, including researching unexpected plant events, such as Three Mile Island.)
Q. What are some challenges that you solved at Dominion?
A. James primarily created new codes and methods. For instance, James looked at life extension of plants. Generation 2 plants were designed in the '60s and '70s and originally expected to operate for only 40 years. James also looked at unexpected events and the licensing requirements of design changes. Plants are designed with four classes of accidents in mind. Condition Four accidents are the most severe class, such as the large-break loss-of-coolant accident, which has never occurred. Another Condition Four accident is the steam generator tube rupture, in which the boundary between the primary and secondary coolant is broken. After Three Mile Island (TMI), more training was implemented. This training not only provided operators with more knowledge, but also broke up the monotony of running plants. The Shift Technical Advisor position was also created after Three Mile Island, because the TMI operators had not understood what was happening during a routine malfunction, which ultimately caused the meltdown.
Three Mile Island and safety culture (20:41)
20:41-32:38 (James discusses the Three Mile Island meltdown and why the embedded safety culture has not replaced human operators with automated systems.)
Q. What would have happened at Three Mile Island if the operators had not touched anything?
A. The accident would still have occurred, but the plant design would have replaced the lost coolant using the safety injection pumps. The problem was that the operators did not know that they were losing coolant and were trying to avoid overpressuring the system. They unfortunately turned off the safety injection pumps, causing the meltdown.
James believes the reaction was to increase human involvement in operations, rather than to replace workers with automated systems, because computer codes do not always produce the correct answers. Safety systems cannot predict every possible scenario. James does believe it is feasible to remove human operators from new designs. The low-pressure Generation 4 designs, such as the molten salt reactor, have passive solutions to prevent accidents in the case of a malfunction or breakage. However, the burden of proof is high. Historically, this conservatism toward nuclear energy began with the military's involvement, but the attitude was adopted by the industry at large, and the regulator has failed to incorporate new knowledge and has changed very little.
James’ transition to university professor (32:39)
32:39-41:26 (James explains how he became a university professor and his role at Virginia Commonwealth University.)
Q. How did you become a teacher?
A. James always wanted to teach. The University of Virginia and Virginia Tech used to have research reactors on campus and nuclear engineering programs. Both decommissioned their reactors and ended their programs when the nuclear market stalled, though Virginia Tech has since restarted its program. Dominion wanted a closer pool of potential employees and approached Virginia Commonwealth University (VCU) about starting a nuclear engineering program. James was brought in to teach reactor theory and to manage the development of the nuclear reactor simulator, which is used as a teaching tool. After Three Mile Island, all commercial plants were required to have a simulator that exactly mimics their control room for operator training. James' simulator models a two-loop pressurized water reactor (PWR) and can replicate reactor operations and simulate accidents.
PRAs versus FSARs (41:27)
41:27-48:43 (James explains how PRAs are the beginning of a change in safety culture. He also explains how the NRC’s requirements can impose a high cost on the industry, using the FSAR as an example.)
Q. Can anything be done to change the safety culture?
A. The industry is starting to move in that direction. The nuclear analysis department at Dominion now does Probabilistic Risk Assessments (PRAs). If license holders can show with a PRA that some systems are not important to maintaining the core, then they can reduce the maintenance requirements and use cheaper materials. However, using this information to convince the regulators to change the culture is problematic. James sees this as a political and sociological problem rather than a technical one. An example is nuclear waste, which is the fission products with no economic value. Waste is vitrified, meaning it is put in glass, preventing it from entering groundwater. Disposing of waste is not a technical issue but a problem of perception. While there has been some improvement, a good example of the high cost of the regulator's imposed safety culture occurred in the 1990s. The Nuclear Regulatory Commission (NRC) required the Final Safety Analysis Report (FSAR) to be updated for every plant design change, creating an additional expense for operators. The NRC also required plants to identify all the "facts," which it did not itself define. Each FSAR is thousands of pages long, requiring James and his teams to spend weeks identifying all the important information.
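The PRA-style argument James describes can be illustrated with a toy fault-tree calculation. All the failure probabilities below are invented for illustration; real PRAs use plant-specific event trees and fault trees with thousands of basic events.

```python
# Toy PRA-style calculation: estimate a core-damage frequency for one
# accident sequence, assuming independent failures. Numbers are invented.
p_initiating_event = 1e-2   # per year: e.g. loss of offsite power (illustrative)
p_pump_fails = 5e-2         # per demand, each of two redundant cooling pumps
p_no_recovery = 1e-2        # operators fail to recover in time (illustrative)

# Core damage in this sequence requires the initiator AND both redundant
# pumps failing AND no operator recovery; multiply assuming independence.
p_both_pumps = p_pump_fails ** 2
p_core_damage = p_initiating_event * p_both_pumps * p_no_recovery
print(f"sequence core-damage frequency ~ {p_core_damage:.1e} per year")
```

The point of such a calculation, in the argument James sketches, is that if a given system's failure barely moves the bottom-line number, a license holder can argue its maintenance and material requirements could be relaxed.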
The ruling that harmed the industry (48:44)
48:44-59:27 (James discusses how complex systems create more room for error. He also explains how regulation delays from the past and today are severely hurting the industry.)
Q. Sometimes safety for safety's sake can make things less safe, right?
A. Many saw the additional retrofitting requirements introduced after Three Mile Island as doing just this. Complexity can decrease safety because more systems create more opportunities for errors. Simulations help prepare operators, so that when an error does occur, the operators are familiar with what is happening. Often, the system can take care of the problem itself by tripping the reactor, opening relief valves, and cutting off the fission reaction.
James believes creating larger reactors was a mistake. The Atomic Energy Commission (AEC)'s ruling on environmental impact in 1971 was a turning point. An Australian economist found that before the AEC's ruling, the cost of building a reactor was decreasing. However, the ruling created new water quality standards and paused all construction for 18 months. The economist believes that if that hold had not been put in place, current costs would be one tenth of what they are today. Fast forward, and the industry is still facing delays. In 2018, the NRC approved combined licenses for two AP1000 units at Turkey Point, a review that cost $35 million and took nine years. James questions how an industry can survive under such regulatory constraints.