
User:MaryMO (AR)/sandbox/notes on energy


Energy[edit]

  • Primary energy has a page but final energy does not, useful energy redirects to exergy, and energy services redirects to Energy service company
  • Energy conversion efficiency (units in = units out, <1) discusses efficacy (units in and out differ, no upper bound, multiple possible quantities to be measured) somewhat, but the page for Efficacy is limited to pharmaceuticals and does not cover energy.
  • Sankey diagram needs sources.
  • Energy transition needs work. Transitions tend to be driven by quality of product and cost.
    • So overall, you expect that electricity prices must be significantly higher than the prices of the commodities from which it is made, i.e. gas, coal, oil.
    • There's an international market for oil, and only a small variation between prices in different countries, because oil is efficient to transport.
    • there isn't an equivalent international market for gas, so prices vary with location; gas prices in the US have been systematically lower than gas prices in Europe and Asia.
    • coal prices are driven by local cost and demand factors, but the cost of the core mining and extraction hardware reflects a global industry.
  • Energy in the United States needs work and updating. Note 2020 Sankey diagram.
  • Levelized cost refers to an average cost for a single alternative that combines costs incurred at different times -- for example, the initial capital cost of a power plant and then its operating cost over time.
  • Cost of mitigation (sometimes called marginal abatement cost) is always about comparing two alternatives to see what the cost is of choosing one rather than the other, e.g. one power plant that costs a lot but pollutes little, and another that costs less but pollutes more.
    • Formula: (the cost of the cleaner option minus the cost of the dirty option) divided by (the pollution from the dirty option minus the pollution from the clean option); see the sketch after this list. This convention means that for most calculations, where the cleaner option is more expensive, the mitigation cost will be positive. When the cleaner option is cheaper, the cost of mitigation will be negative.
  • A mitigation cost curve plots options graphically with their cost of mitigation on the vertical axis and their mitigation potential, or how much total pollution could be prevented with the strategy, on the horizontal axis.
  • Discount rate (energy) is the rate at which future costs or benefits are discounted relative to current costs.
    • hedonic time preference -- we tend to choose a present known good, ignoring the future
    • future uncertainty -- we choose a present we know in preference to a future with unknowns involved
    • opportunity cost means we can't do something else with our money if we commit to using it now
    • We need to think analytically about what the trade-offs are. Be skeptical about "right" discount rates: they reflect conditions affecting the decider (e.g. am I working? unemployed?) and can push decisions in a particular direction. Discount rates have enormous effect on energy policy.
    • LCOE calculations take costs in future years and convert them to an equivalent current-year cost. In this formula, the discount rate is a fixed percentage per year, and it compounds, so the value of a future dollar decays exponentially. Choosing a discount rate sets costs today against future harm.
    • these are issues around determining the "real" costs of choices
  • Technique for Levelized Cost (a minimal sketch follows this list):
  1. what are the fixed costs
  2. what are the variable costs
  3. identify your metric, e.g. dollars per kilowatt hour (should be useful for comparison)
  4. figure out how much the technology is used
  5. combine fixed costs plus variable cost (per unit of usage) x usage, then divide by total usage to get an average cost per unit for the alternative
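
A minimal Python sketch of the two calculations above (levelized cost with a compounding discount rate, and mitigation cost as the extra cost per unit of pollution avoided). The plant numbers are invented round figures; only the formulas follow these notes.

```python
# Hypothetical illustration of levelized cost (LCOE) and mitigation cost.
# All plant parameters below are made-up round numbers, not real data.

def lcoe(capital_cost, annual_om_cost, annual_output_kwh, lifetime_years, discount_rate):
    """Levelized cost: discounted lifetime costs divided by discounted lifetime output ($/kWh)."""
    costs = capital_cost
    output = 0.0
    for year in range(1, lifetime_years + 1):
        discount = (1 + discount_rate) ** year   # value of a future dollar decays exponentially
        costs += annual_om_cost / discount
        output += annual_output_kwh / discount
    return costs / output

def mitigation_cost(cost_clean, cost_dirty, emissions_dirty, emissions_clean):
    """(cost of cleaner option - cost of dirty option) / (pollution avoided), $/ton."""
    return (cost_clean - cost_dirty) / (emissions_dirty - emissions_clean)

# Two made-up plants producing the same annual output.
clean = lcoe(capital_cost=2_000_000, annual_om_cost=20_000,
             annual_output_kwh=5_000_000, lifetime_years=30, discount_rate=0.05)
dirty = lcoe(capital_cost=1_000_000, annual_om_cost=60_000,
             annual_output_kwh=5_000_000, lifetime_years=30, discount_rate=0.05)
print(f"clean LCOE = {clean:.3f} $/kWh, dirty LCOE = {dirty:.3f} $/kWh")

# Cost of mitigation per ton of pollution avoided (annual costs and emissions, made up).
print(mitigation_cost(cost_clean=300_000, cost_dirty=250_000,
                      emissions_dirty=4_000, emissions_clean=500))  # ~14 $/ton
```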

Environmental Impacts[edit]

  • Lifespans of Air Pollutants and CO2
    • collectively, air pollution is a local and short-lived issue
    • unlike these pollutants, carbon dioxide has an extremely long-lived and global impact
  • Climate and air pollution are public goods. It's hard for societies to coordinate to supply public goods, harder when the benefits are further away in space and time, and hardest when the costs are high and the responsibilities diffuse as they are with climate.
  • the biggest way that energy use now impacts human health is air pollution
  • the biggest way in which energy use alters the natural world is climate change
  • landscape impacts of energy extraction
  • oil and gas infrastructure in fragile natural environments like the Arctic
  • oil leaks from pipelines affecting both humans and natural environments
  • climate change worldwide due to combustion of fossil fuels, coal, gas, and oil
  • ozone pollution affecting human health and ozone hole
  • outdoor air pollution
  • indoor air pollution (e.g. use of wood stoves affecting indoor air quality and people's health, especially women)
  • internal environment (e.g. due to off-gassing of construction materials)
  1. Be able to contrast the magnitude and relative importance of different energy system impacts on the environment.
  2. Regarding air pollution:
    1. Know the major air pollutants associated with energy use and be able to describe their origins.
    2. Be able to contrast the difficulty of regulating different air pollutants.
    3. Know the health effects of the two most impactful air pollutants, ozone and particulate matter; be able to look up particulate matter levels in many parts of the world and estimate the reduction in life expectancy at given pollution levels.
    4. Be able to compare costs and benefits of air pollution control in the USA.
  3. Regarding climate change:
    1. Know that climate change depends on accumulated emissions over very long time scales.
    2. Know global yearly CO2 emissions, and approximate contributions from different countries today; be able to use good sources to look up data on these issues.
    3. Know the concentration of CO2 in the atmosphere today and how it has been increasing over time; be able to use good sources to look up data for the past 60 years.
    4. Understand the chain of causation from Greenhouse Gas Emissions -> Greenhouse Gas Concentrations in the Atmosphere -> Climate Changes -> Climate Change Impacts.
      1. Know some of the time delays between stages.
      2. Be familiar with the uncertainty of our estimates of each stage in the chain.
    5. Be able to describe and quantify some major consequences of climate change over the next century; know and be able to use reputable sources for this information.
    6. Know the rough magnitude of the cost of mitigating climate change.

Pollutants[edit]

  • Six criteria air pollutants as defined by the US Environmental Protection Agency:
  • direct pollutants are emitted directly as a result of industrial activities, including the combustion of fossil fuels. e.g.
    • lead -- neurotoxin, previously used in gasoline and released into the environment through combustion
    • nitrogen oxides or NOx -- inert diatomic nitrogen is found in the atmosphere; reactive forms of nitrogen like NOx can be formed during combustion -- oxygen reacts with atmospheric nitrogen at high temperatures and pressures
    • sulfur dioxide or SOx -- sulfur oxides come from sulfur contained in the fossil fuels (coal, gas, and oil) as extracted and produced; SOx are released when the fuels are burned and convert to sulfuric acid in the atmosphere
    • carbon monoxide -- a product of incomplete combustion
  • secondary pollutants form when other pollutants react together. e.g.
    • ozone is formed in the atmosphere from NOx reacting in the presence of sunlight
  • particulate matter can be of either type
  • The difference between sources of NOx (mostly nitrogen from the air) and SOx (from the fuel) has important implications for control.
    • Sulfur emissions can be controlled by removing it from the fuel before it's sent to the consumer and/or by using cleanup technologies
    • Control of NOx depends on many different post-combustion control devices which makes regulation and enforcement harder

It is easier to control single pollutants than it is to control PM and ozone, which form from multiple precursors.

  • Technologies used to reduce pollutants:
    • scrubbers remove sulfur oxides from flue gases after combustion (sulfur can also be removed from some fuels during refining)
    • catalytic converters reduce carbon monoxide and NOx by converting NOx back into regular diatomic nitrogen after the fuel is burned

  • Ozone
    • stratospheric ozone -- good -- protects the planet from UV
    • lower atmospheric ozone, surface ozone, ground level ozone, smog -- bad -- breaks down organic compounds (e.g. people's lungs)
    • nitrogen oxides, carbon monoxide, and volatile organic compounds (VOCs) combine in the presence of sunlight to create ozone; the process is highly complicated and difficult to influence
  • Particulate Matter
    • the most common measure of PM concentration in air is PM2.5, the total mass of particles smaller than 2.5 microns in diameter per volume of air.
    • correlations between measured PM levels and hospital admissions or mortality can be used as measures of the immediate impact of high levels of PM
    • correlations between PM and the long-term health of people living in areas with different amounts of PM pollution, can be used as measures of the impact of chronic low-level exposure to PM.
  • The United States Clean Air Act

The World Health Organization estimates nearly 7 million premature deaths per year due to air pollution (WHO 2014), making air pollution the third largest risk factor contributing to premature deaths globally, and almost all of these deaths are directly related to energy. Approximately half of the mortality is due to indoor air pollution, largely from indoor cooking fires in Asia and Africa. The other half is caused by “ambient” air pollution, mostly from vehicles and electricity generation.

Health effects of ionizing radiation[edit]

  • high-energy particles with enough energy to knock electrons off atoms can break and re-form chemical bonds, changing the chemistry in your body and causing harm.
  • such ionizing radiation includes alpha rays (helium nuclei), gamma and x-rays (photons), beta rays (fast electrons), and neutrons

Measurement[edit]

  • different types of radiation have slightly different biological effects
  • standard units of measuring the health impacts of ionizing radiation
    • the gray is a simple physical unit for direct exposure, 1 gray = 1 joule per kilogram of body mass; this can be compared within a particular type of exposure
    • the sievert is a gray times a quality factor for the type of ionizing radiation; this allows comparison across types of exposure, e.g. 1 Gray of high-energy neutrons with a factor of 10 = 10 Sieverts

Risk assessment[edit]

  • the long-term risk of exposure to a 1 sievert dose is about a 5% chance of death from cancer over 30 years
  • the average American gets a dose of around 5 millisieverts a year, from natural background (80%) and man made sources (20%)
  • Of the man-made sources, about 80% is medical (e.g. radiology testing and cancer treatment), 15% is due to consumer products, 1% is from the nuclear fuel cycle, and 1% resulted from nuclear testing
  • The linear no-threshold (LNT) assumption extrapolates from limited information at certain parts of the dose-response curve (e.g. very high doses in accidents/war), assuming the relationship stays linear all the way down (a minimal sketch follows this list). This may overestimate risk, but we don't know enough to get better estimates.
  • actual levels of risk may be much lower than the risks from harder-to-track industrial chemicals of which we are less aware (salience)
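
A minimal sketch of the linear no-threshold arithmetic using the figures above (about 5% lifetime cancer mortality risk per sievert, and a ~5 mSv/year average dose); the quality factor and the 40-year exposure window are illustrative assumptions.

```python
# Linear no-threshold (LNT) sketch using the ~5% lifetime cancer mortality risk
# per sievert quoted above. Doses and quality factor are illustrative only.

RISK_PER_SIEVERT = 0.05  # ~5% chance of death from cancer per sievert (long term)

def dose_sieverts(dose_gray, quality_factor=1.0):
    """Convert an absorbed dose in gray to an effective dose in sieverts."""
    return dose_gray * quality_factor

def lnt_risk(dose_sv):
    """LNT assumption: risk scales linearly with dose, with no safe threshold."""
    return dose_sv * RISK_PER_SIEVERT

# Average American dose of ~5 mSv/year, accumulated over an assumed 40 years:
lifetime_dose = 0.005 * 40  # 0.2 Sv
print(f"LNT lifetime risk: {lnt_risk(lifetime_dose):.1%}")  # ~1.0%
```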

acute and long term risks[edit]

Climate change[edit]

  • the amount of human caused climate change in a given year or decade is mostly driven by the gradual accumulation of carbon dioxide from the last century, around the world
  • Today, the amount of carbon dioxide in the atmosphere is 25% higher than in 1963 and 50% higher than in 1769, when James Watt improved the steam engine, starting the Industrial Revolution.

  • This is a stock and flow problem where levels depend on the difference between inflow and outflow (a minimal sketch follows this list).
  • carbon stored deep underground (the geosphere) is moved to the active biosphere due to the production and use of fossil fuels,
  • carbon in the active biosphere is distributed between atmosphere, land, and ocean.
  • human-based carbon flux is 100 times larger than the sum of all the analogous natural processes such as volcanoes
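
A minimal stock-and-flow sketch of the bullet above. The conversion factor (~2.13 GtC per ppm of CO2) and the assumption that land and ocean sinks absorb roughly half of each year's emissions are standard rough approximations I've added, not figures from these notes.

```python
# Toy stock-and-flow model of atmospheric CO2: the stock (concentration) changes
# by inflow (emissions) minus outflow (uptake by land and ocean sinks).
# Assumptions: ~2.13 GtC per ppm of CO2, and sinks absorb roughly half of
# each year's emissions; both are rough standard approximations.

GTC_PER_PPM = 2.13
AIRBORNE_FRACTION = 0.5   # fraction of emissions that stays in the atmosphere

def step(concentration_ppm, emissions_gtc_per_year):
    """Advance the CO2 stock by one year of emissions."""
    return concentration_ppm + AIRBORNE_FRACTION * emissions_gtc_per_year / GTC_PER_PPM

ppm = 410.0
for year in range(10):                   # a decade of constant ~10 GtC/yr emissions
    ppm = step(ppm, 10.0)
print(f"after 10 years: {ppm:.0f} ppm")  # ~433 ppm
```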

Components of Climate Change[edit]

the Emission of greenhouse gases[edit]

  • Emissions are the human-caused flow of greenhouse gases to the atmosphere.
  • They're typically measured in tons of carbon dioxide equivalent per year.
  • We see a pattern of accelerating growth with a few pauses where the global economy was doing poorly
  • There is no evidence of a slowdown from 1992-2002, the first decade after world leaders committed to cutting emissions

the Concentrations of greenhouse gases that result in the atmosphere[edit]

  • The amount of carbon dioxide gas in the atmosphere at any one time is cumulative, based on emissions
  • This is the most direct cause of climate change at any given time.
  • Concentrations show very steady growth with a slow, even acceleration

the Climate response[edit]

to these changing concentrations (mostly in terms of how much the climate warms, and how long it takes to warm after concentrations rise),

  • many factors push the climate one way or another, so the climate's temperature record looks much more bumpy
  • it's very hard if not impossible to predict exactly what the climate will do from one decade to the next.

the Climate impacts[edit]

of those responses (e.g. changes in precipitation and sea level rise).

  • however, we can observe ways in which these large-scale climate changes are influencing human welfare and the natural world.
    • climate pushes to extremes
    • regional productivity
    • dry places get drier
    • wet places wetter
    • storms become more intense
    • Arctic sea ice melts
    • coral reefs are harmed by ocean acidification (CO2 dissolving in ocean waters) and temperature increases

uncertainty[edit]

varies greatly over the four steps in the carbon climate chain.

  • Current emissions are known with less than 15% error.
  • The uncertainty in future emissions is immense as it depends on assumptions about what the world will look like a century from now.
  • Ask what kind of future do we want and how can we get there given our choices?
  • Atmospheric concentrations are known with very high accuracy. (e.g. CO2, better than 1 %)
  • the carbon budget is well understood in great detail,
  • the overall uncertainty in predicting CO2 concentrations in a given year would be small, perhaps 30%, if we knew emissions
  • It's much harder to predict the amount of climate change for a given increase in CO2 concentration.
  • the overall uncertainty could be a factor of two or even three.
  • it's harder still to predict many of the impacts of climate change on humans or the natural world.
  • e.g. Farmers will change crops and practices, so the uncertainty is big and cascades from step to step.
  • Reality could be better or worse than our best guess.
  • We'd like to have less chance of damaging climate change.
  • But what should we do, what will it cost to cut emissions and who will pay?
  • most people depending on the natural world will be worse off if climate changes fast.

anthropogenic greenhouse gases[edit]

  • CO2
  • methane (CH4)
  • nitrous oxide (N2O),
  • accounting for the importance of each type of anthropogenic greenhouse gas is tricky because they contribute different amounts of warming while in the atmosphere and stay in the atmosphere for different lengths of time
  • they can be compared using a metric of "CO2-equivalents" (or "CO2e")
  • CH4 and N2O are much more potent than CO2 while in the atmosphere; over a 20 year period 1 ton of CH4 would contribute roughly 80x as much warming as 1 ton of CO2, and 1 ton of N2O would contribute around 270x as much. However, the atmospheric lifetimes of CH4 and N2O are only around 12 and 100 years, respectively, while a large fraction of CO2 emissions stay in the atmosphere over 1000 years, so even though CO2 is less potent it also stays in the atmosphere and contributes warming for much longer. Thus, the relative importance of CH4 and N2O depends on what timescale you look at. Over 20 years, CH4 is around 80x as potent as CO2, but over 100 years it is only around 30x as potent; over 500 years it is only about 10x as potent.
  • the most common way to compare these gases is through global warming potential (GWP), which adds up the total radiative forcing for a gas over some timescale, usually 100 years. This allows CH4 or N2O emissions to be expressed as "CO2-equivalents", e.g. representing 1 ton of CH4 emissions as ~30 tons of CO2e. This practice is convenient, and lets us make quick analyses like this one claiming that CO2 accounts for 76% of our emissions and CH4 accounts for 16%, and it's even the basis for counting reductions for international climate agreements. But this accounting neglects all warming after 100 years and glosses over hard tradeoffs between the rapid, short-term warming from CH4 and N2O and the way that CO2 locks us into warming for millennia. The choice of this kind of metric can have big implications for policy decisions around high-CH4 or N2O emissions sources, and can affect which emissions we prioritize for reduction, a hotly debated topic.
  • different greenhouse gases have different effects, and when you encounter (or use) GWPs and CO2-equivalents, remember that the equivalence is loose (a minimal conversion sketch follows this list).
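
A minimal sketch of the CO2-equivalent bookkeeping referenced above. The GWP values are the rough figures from these notes (CH4 ~30 over 100 years; N2O ~270, which is roughly similar over 20 and 100 years); the emissions inventory is invented.

```python
# Convert emissions of different greenhouse gases to CO2-equivalents using
# global warming potentials (GWPs). The 100-year GWP values below are the
# approximate figures discussed above; treat them as round illustrative
# numbers rather than authoritative constants.

GWP_100 = {"CO2": 1, "CH4": 30, "N2O": 270}

def co2_equivalent(emissions_tons: dict, gwp: dict = GWP_100) -> float:
    """Total emissions in tons of CO2-equivalent for the chosen GWP timescale."""
    return sum(tons * gwp[gas] for gas, tons in emissions_tons.items())

# Made-up emissions inventory (tons per year):
inventory = {"CO2": 1_000, "CH4": 10, "N2O": 1}
print(co2_equivalent(inventory))  # 1_000 + 300 + 270 = 1_570 tCO2e
```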

Projected impacts[edit]

  • read Climate Change 2014 Synthesis Report Summary for Policymakers SPM 1.3 and 1.4 (page 6- ) and SPM2 (pages 8-16)
  • Representative concentration pathways (RCPs) are used to project possible impacts. These are trajectories of greenhouse gas concentrations over time that correspond to possible futures, ranging from a future with aggressive emissions reductions (RCP2.6) to a "business as usual" future with unconstrained emissions and economic growth (RCP8.5).
  • read American Climate Prospectus 2014
  • this focuses on four major climate changes (temperature, precipitation & humidity, sea level, and extreme weather) and six major impacts (coastal damages, temperature-related mortality, labor productivity, agricultural productivity, crime, and energy demand).
  • two methods to translate human lives lost (mortality) into dollar values.
    • the income method involves estimating the (discounted) income the deceased person would have earned over their lifetime (a pretty limited estimate of what human lives are worth!)
    • The VSL method multiplies all mortalities by the "value of a statistical life (VSL)," or around $7 million.
    • The researchers argue that the income method is probably the minimum that anyone would say a life is worth, and the VSL method is probably near the maximum (especially since it counts the life of a teenager as equally valuable as a 95-year-old), so both are often used to show a plausible range of impacts without suggesting either as the best method.

Land Use[edit]

We can evaluate and compare the diverse land use footprints of energy technologies using the metric of power density, the rate at which energy is extracted per unit of land. A common measure of power density is watts of primary energy per square meter of land surface.

  • Power density tells you nothing about how long the activity can go on for, or the total size of the energy resource.
  • Quantifying land use impacts is inherently more ambiguous than quantifying air emissions, toxics, or greenhouse gases
  • Configuration of land use matters: considering the entire area of a wind farm, its power density is comparable to that of biofuels; counting only the land directly occupied, wind farms have a power density 100 times larger.
  • fossil fuels have had a high cost in terms of land use
  • biofuels are not feasible as a sole alternative due to extremely high land requirements: they are often <1 W/m^2. In contrast, solar power has a power density at least ten times larger and therefore requires less than a tenth the land to provide a given amount of power.
  • The typical power density of current solar power systems in reasonably good locations is about 10 watts per square meter, with a total land requirement of about 2% of land area for an all-solar system (see the sketch after this list).
  • Nuclear uses very little land: only nuclear power and solar power can plausibly be scaled to meet late century energy demands of a rich high energy civilization, with minimal carbon emissions and a reasonable land footprint.
  • energy systems need to consider carbon emissions from land cover change, as well as changes to surface albedo, water runoff, and other factors, including the diversion of local ecosystem resources from the rural poor and indigenous cultures that rely on them.
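
A minimal sketch of the land-use arithmetic referenced above: at ~10 W/m^2, how much land an all-solar system would need. The assumed demand (~30 TW of primary power) and Earth's land area are round numbers I've supplied for illustration.

```python
# Land-use arithmetic for an all-solar energy system at ~10 W/m^2 power density.
# The demand figure (~30 TW of primary power) and Earth's land area are round
# illustrative numbers, not values from these notes.

POWER_DENSITY_W_PER_M2 = 10.0        # typical current solar plant, per the notes
GLOBAL_DEMAND_W = 30e12              # assumed late-century primary power demand
EARTH_LAND_AREA_M2 = 1.49e14         # ~149 million km^2

area_needed = GLOBAL_DEMAND_W / POWER_DENSITY_W_PER_M2
print(f"area: {area_needed/1e6:.1e} km^2")                        # ~3 million km^2
print(f"fraction of land: {area_needed/EARTH_LAND_AREA_M2:.1%}")  # ~2%
```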

Electric power grid[edit]

Turbines -- steam, gas, water, or wind -- turn synchronous, three-phase alternating-current generators that make power. Transformers then turn that power from high current, low voltage to high voltage, low current. Power is moved by high-voltage, long-distance transmission lines, where other transformers turn it back down to lower-voltage, higher-current power, which is distributed in intermediate lines. Other sets of transformers turn it into lower-voltage power which goes through final distribution lines to the end users.

Power is voltage times current. The loss of power in a conductive line, like a transmission line, is proportional to i squared r -- the current squared times the resistance. If you double the current, you lose four times as much power. Running at higher voltages allows us to transmit power over long distances efficiently. The system goes from high voltages for transmission to lower voltages for distributing a safer, usable level of power to final consumers.
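
A minimal sketch of the i-squared-r arithmetic above; the line resistance and delivered power are invented round numbers.

```python
# I^2 R line-loss arithmetic: for a fixed power delivered (P = V * I), raising
# the voltage lowers the current and cuts resistive losses quadratically.
# The line resistance and power level are illustrative round numbers.

LINE_RESISTANCE_OHMS = 10.0   # assumed total resistance of a long line
POWER_W = 100e6               # 100 MW to be delivered

for voltage in (100e3, 200e3, 400e3):       # 100 kV, 200 kV, 400 kV
    current = POWER_W / voltage             # P = V * I  =>  I = P / V
    loss = current**2 * LINE_RESISTANCE_OHMS
    print(f"{voltage/1e3:.0f} kV: I = {current:.0f} A, loss = {loss/1e6:.1f} MW")
# Doubling the voltage halves the current and quarters the I^2 R loss.
```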

Alternating current in a transformer allows you to go from high current, low voltage to high voltage, low current, or vice versa. Large transformers can be more than 98% efficient but work only on AC power.

  • one of humanity's most amazing inventions
  • in North America there are 4 independent grids: east, west, Texas, Quebec
  1. a dense, interconnected network going between generators and final consumers.
  2. there is no storage: supply and demand must be instantaneously matched.
  3. demand is unresponsive and does not respond to price in the short term. Electric power operators work by forecasting demand, building the capacity -- generators and transmission systems -- they need to meet that demand, and then operating the system in real time to match demand (down to the second)
  4. alternating-current power grids are dynamic, with both elasticity and inertia, which combine to produce potential instability (oscillations)
  5. electricity is, in many senses, a natural monopoly not easily affected by traditional free market mechanisms
  6. the electrification of the global energy system has increased steadily over the last century, and will likely continue
  7. right of access -- electricity is such a necessity for modern life that it is regulated as one (we make it harder for an electricity utility to cut off your power if you don't pay your bill)
  8. the electricity system depends on a stock of capital infrastructure built up over decades. More than half of the generating capacity in the US is more than 30 years old, and more than 2/3 of the power poles are also more than 30 years old.
  9. electricity transmission and distribution in the United States' system is 94% efficient, and the global average is 92%.
  10. the system is amazingly reliable: In the United States, consumers get power more than 99.95% of the time.

Dispatch Curve[edit]


  • an electricity planner determines what generating capacity to build and how to dispatch that capacity, how to turn it on and off in response to changing loads.
  • Ask what is the cost of each extra hour of power (assume it's built and running).
  • Turn on the lowest cost units first until you have enough power to meet demand.
  • A dispatch curve orders types of generators based on their marginal cost of generation and the amount of power they can generate (a minimal sketch follows this list).
    • x-axis shows the total amount of power generated.
    • y-axis shows the marginal price.
    • ranking from lowest marginal generation cost to highest, you get solar and wind renewables, nuclear, coal, then a steep rise up to oil and gas
  • The highest marginal cost plant sets the overall marginal power price on an electric grid.
  • All operators need to make enough to keep running
  • In addition, power transmission costs along a high-voltage dc line tend to be less than 1.5 cents per kilowatt hour.
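
A minimal merit-order sketch of the dispatch logic above; the generator names, capacities, and marginal costs are invented round numbers.

```python
# Merit-order dispatch sketch: turn on the cheapest generators first until
# demand is met; the marginal cost of the last unit dispatched sets the price.
# Capacities and marginal costs ($/MWh) below are invented for illustration.

generators = [                      # (name, capacity_MW, marginal_cost_$per_MWh)
    ("wind/solar", 20_000, 0),
    ("nuclear",    15_000, 10),
    ("coal",       25_000, 30),
    ("gas",        30_000, 60),
    ("oil",        10_000, 150),
]

def dispatch(demand_mw):
    """Dispatch generators in order of marginal cost; return (dispatched list, price)."""
    remaining, dispatched, price = demand_mw, [], None
    for name, capacity, cost in sorted(generators, key=lambda g: g[2]):
        if remaining <= 0:
            break
        used = min(capacity, remaining)
        dispatched.append((name, used))
        price = cost                 # marginal unit sets the grid-wide price
        remaining -= used
    return dispatched, price

print(dispatch(70_000))  # coal alone is not enough, so gas sets the marginal price (60)
```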

Reliability and Resiliency of the Electric Grid[edit]

  • read Appendices A.1 and A.2 on pages 235-238.
    • system reliability is the capability of the system overall to operate under normal circumstances
    • system resiliency is how well the system resists failures under extraordinary circumstances such as weather, natural disasters, or intentional sabotage, including cyber-attack.
    • figures of merit for quantifying the frequency and duration of electricity interruptions (a computation sketch appears at the end of this section)
      • System Average Interruption Frequency Index (SAIFI) is the number of outages per customer per year.
      • System Average Interruption Duration Index (SAIDI) is the average outage time per customer per year.
      • SAIDI can be graphed for different countries as a function of their per capita GDP and average energy use.
      • severe weather is by far the leading cause of outages in the US, by number of events and even more by total time of outages where severe weather causes well over 90% of total time.
      • Countries with lower population densities are likely to have higher SAIDI’s because of the difficulty of repairing damage in remote areas
      • The age of the US grid and chronic under-investment are often cited; however, failures related to the equipment and operation of the grid under normal conditions are quite small
        • “equipment failures” (mostly transmission and distribution equipment, e.g. transformers)
        • "load shedding" occurs when operators intentionally shut down power to some customers due to demand exceeding supply (rare in developed countries, common elsewhere)
        • Islanding occurs when distributed power sources (like rooftop PV) are shut down (for the safety of those doing repairs) during a grid outage, even when they could keep operating
        • geomagnetic storms (see Carrington Event)
  • AC, DC, and Transformation costs
    • AC - transformers only work with AC and most power is generated from turning shafts (steam turbines, water turbines, combustion turbines and wind turbines) of electric generators or motors which use AC
    • DC - can connect AC grids that are not synchronized (e.g. Texas and West); used for long-distance high voltage (HV) transmission, due to lower cost and transmission losses per unit distance of DC lines; smaller footprint of DC towers
    • transmissions costs involve both
      • costs per unit of length in the middle of the transmission line
      • "end point costs" at each end of the line to step voltage up and down with a transformer for AC, or the much larger costs to convert back and forth between AC and DC for a DC line.
      • AC is less expensive for short-distance transmission, where the end-of-line costs really matter, and DC lines become cheaper when lines get long
      • Break-even distances may vary depending on capacity, voltage, and right-of-way costs. For undersea or buried transmission, the break-even distance is much shorter
      • as more end-use devices require DC power (computers, smartphones, and LEDs), and as solar panels generate DC power, local DC "micro-grids" or autonomous electrical systems are being developed for large buildings or communities where all connected devices can accept one standard voltage (for now)
      • micro-grids might also be usable in rural communities that are off the traditional grid. Would this delay traditional grid expansion to rural areas?
    • Factors that create a "natural monopoly" for electrical systems
      • very high fixed costs and low marginal cost, factors that can make a monopoly in any industry.
      • need to balance supply and demand second-by-second in a very tightly coupled system.
      • free-market model problem: regulation is needed to prevent "gaming" abuses such as withholding a lower-cost plant in order to profit from a higher-cost one
      • classic utility model problem is that regulated monopoly utilities have no incentive to reduce prices and so tend to increase their costs.
  • Historically: Edison (all low DC) vs. Westinghouse (AC), Westinghouse won
  • Smart Grid, Distributed Generation, and Net Metering
    • system is already smart, can it be smarter? Risks for cyberattack?
      • can the fridge respond to dynamic price changes without spoiling food
      • real-time monitoring of failures to alert technicians
      • switches to redistribute dynamically and reduce the spinning reserve without reducing reliability
      • more distributed generators
    • Net metering is the idea that you get paid for putting energy into the grid as well as paying for energy that you use
      • The system is already subsidizing your use by putting the energy grid in place and maintaining it, so that you can both get a total number of kilowatt hours and get them when you want them.

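A minimal sketch of the SAIFI and SAIDI calculations referenced in the figures-of-merit bullets above; the outage records and customer count are invented.

```python
# SAIFI / SAIDI sketch from the definitions above, computed from a list of
# outage events. The outage records and customer count are invented.

TOTAL_CUSTOMERS = 1_000_000

# Each event: (customers affected, outage duration in minutes)
outages = [(50_000, 120), (10_000, 45), (200_000, 300)]

customer_interruptions = sum(c for c, _ in outages)
customer_minutes = sum(c * m for c, m in outages)

saifi = customer_interruptions / TOTAL_CUSTOMERS   # interruptions per customer per year
saidi = customer_minutes / TOTAL_CUSTOMERS         # outage minutes per customer per year
print(f"SAIFI = {saifi:.2f} interruptions/customer, SAIDI = {saidi:.1f} minutes/customer")
```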

Fossil fuels[edit]

At least four forces drive availability of fossil fuels.

  • technological change
  • geological diversity of the resource - there are more sources, just harder to get
  • price elasticity
  • interfuel substitution and conversion
  • political turbulence is a wild card

World reserves of oil and gas have been steadily increasing over the past 30 years. Coal reserves seem to have declined somewhat, but there is much less data on coal reserves than on oil and gas. As a result, it is likely that fossil fuels will continue to be available for some centuries; scarcity is unlikely to limit their use.

Reserve-to-production ratios for oil and gas have stayed surprisingly constant for over 30 years. This is understandable since companies have no incentive to find reserves that will not be used for many decades, and thus they have no incentive to improve technology or exploration that pushes reserve-to-production ratios beyond a few decades.

The grand total of resources on the planet with at least some chance of being extracted is called the Ultimately recoverable resource (URR).

Our atmosphere's ability to safely absorb carbon will be exceeded long before we run out of carbon.

A McKelvey box or diagram shows the difference between resources and reserves. First proposed by Vincent McKelvey, chief geologist at the US Geological Survey, in 1973 for the formal classification of mineral reserves and resources. See this.

Renewable technologies[edit]

  • renewable technologies: Solar PV, Concentrated Solar, Wind, Hydro, Biomass and Geothermal
  • desirable central measures of environmental impact to compare renewables and other energy sources:
    • less air pollution
    • less climate changing emissions
    • less land use impact
    • scalability is a concern
  • a gigawatt of solar power generates only about 15-35% of its rated annual maximum (see the capacity factor sketch after this list).
  • the intermittency of solar power is a concern, but can be addressed through better battery technology, by augmentation with other sources in a power grid, by using technologies for time shifting of electrical load (between heating and cooling), and by using "excess" low-cost solar power to make low-cost hydrogen or hydrocarbon fuels and so decarbonize the fuel sector.
  • recent trends - the rate of adoption and cost of wind power have been roughly constant over the last five years.
  • solar energy use is increasing extremely quickly and costs are dropping
  • estimates of global energy demand depend on population size and projected use per person
  • "only solar power of the renewables is able to supply a significant fraction of global energy use late this century."
  • "My conclusion is that only solar power and nuclear power, fission and fusion, can plausibly supply a major fraction of global primary energy in a carbon-free world late this century."
  • Reading
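
A minimal capacity-factor sketch for the bullet above about a gigawatt of solar generating only 15-35% of its rated annual maximum; 8,760 is simply the number of hours in a year.

```python
# Capacity factor sketch: how much energy a 1 GW (rated) solar plant actually
# delivers in a year at the 15-35% capacity factors quoted above.

HOURS_PER_YEAR = 8760
RATED_POWER_GW = 1.0

for capacity_factor in (0.15, 0.25, 0.35):
    energy_gwh = RATED_POWER_GW * capacity_factor * HOURS_PER_YEAR
    print(f"{capacity_factor:.0%}: {energy_gwh:,.0f} GWh/year")
# Compare with the 8,760 GWh/year the plant would deliver at 100% of its rating.
```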

Solar technologies[edit]

  • solar panels (PV)
  • concentrating power (industrial systems to concentrate sun to a high intensity)
  • building scale hot water heating
  • passive solar

Solar panels[edit]

  • The balance of plant plus the modules = modern industrial solar photovoltaic power plant.
    • modules (the actual solar panels)
      • some panel modules are warrantied to produce 80% of their original power after 20 years
    • the "balance of plant"
      • racks to hold the panels
      • racks may use trackers to follow the sun and increase output -- options are a flat plate, a plate tilted at latitude, a tracking system rotating about a north-south axis tilted at latitude, and a two-axis tracking system. Latitude tilt provides roughly a 20% increase over flat panels (though the exact amount varies by region), and single-axis tracking provides an additional 30-40% in most regions. It is important to consider whether these increases in capacity factor outweigh the increased capital cost for tilted or tracking systems in a given installation. U.S. data is available from the National Renewable Energy Laboratory (NREL) and their Solar Radiation Resource Maps tool. See also SolarGIS free high-resolution GHI world maps.
      • converter from direct to alternating current
  • Physics: You have to be able to make electron-hole pairs, separate the charges from where they were made, deliver them to an outside circuit or junction, and avoid their recombination
  • In practice, one does want efficiencies over 10%, and improvements up to 20% have substantially reduced the costs of solar PV in recent years
  • However, the important measure for grid-connected power is not efficiency but rather the cost per installed watt of productive capacity. That is the cost per unit area divided by the efficiency (times the standard test irradiance of about 1000 W/m^2); a worked sketch follows this list.
  • Primary impact of solar is land use: area covered by solar panels. Relevant measure is watts per square meter, the amount of land needed to produce a given amount of power. The best solar plants are now over 15 watts per square meter.
  • Other impacts include
    • toxicity in production or mining of material ingredients,
    • the use of certain rare material which may affect its scalability,
    • the energy and carbon footprint of producing solar PV panel arrays.
  • Energy payback is the number of years of operating a panel to generate the energy used to make the panel and its associated infrastructure. See U.S. Department of Energy
    • For Crystalline silicon (c-Si) PV fabrication, this is about 4 years.
    • For Thin film PV production it can be approximately 3 years. Due to rarity of metals involved, First Solar and other thin film PV manufacturers have adopted a "cradle-to-grave" policy towards panels to recover elements and avoid end of use toxicity issues.
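
A minimal sketch of the cost-per-installed-watt point referenced above. The ~1000 W/m^2 standard-test-condition irradiance used for the per-watt rating is an assumption I've added, and the module costs and efficiencies are invented round numbers.

```python
# Cost-per-installed-watt sketch: cost per unit area divided by the power
# produced per unit area (efficiency times the ~1000 W/m^2 standard test
# irradiance). The module costs and efficiencies are invented round numbers.

STC_IRRADIANCE_W_PER_M2 = 1000.0

def cost_per_watt(cost_per_m2, efficiency):
    return cost_per_m2 / (efficiency * STC_IRRADIANCE_W_PER_M2)

print(cost_per_watt(cost_per_m2=100.0, efficiency=0.20))  # 0.5 $/W
print(cost_per_watt(cost_per_m2=100.0, efficiency=0.10))  # 1.0 $/W: half the efficiency, twice the cost per watt
```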

Solar Price Trends[edit]

  • the cost for solar PV panels has fallen steeply, exceeding expectations. By 2016, solar prices were competitive with gas-fired power plants, and have continued to drop.
  • The cost for solar panels can be modeled with a power-law "learning curve" or "experience curve" (a minimal sketch follows). Even if module prices fall at the rate from Figure 3, and assuming 35% yearly growth of cumulative PV production, they would still be around $0.2/W a decade from now. Other hardware costs can likely decrease as well, but probably not by a large amount.
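
A minimal learning-curve sketch of the extrapolation described above. The starting price and learning rate are assumptions of mine (the notes' Figure 3 is not reproduced here); the 35% yearly growth of cumulative production comes from the notes.

```python
# Learning-curve (experience-curve) sketch: module price falls by a fixed
# fraction (the "learning rate") for every doubling of cumulative production.
# The starting price and learning rate are illustrative assumptions.

import math

start_price = 0.60        # $/W today (assumed)
learning_rate = 0.20      # 20% price drop per doubling of production (assumed)
growth_per_year = 1.35    # 35% yearly growth of cumulative production (from the notes)

b = -math.log2(1 - learning_rate)          # power-law exponent
cumulative_growth = growth_per_year ** 10  # cumulative production after a decade
price = start_price * cumulative_growth ** (-b)
print(f"price after a decade: ${price:.2f}/W")  # ~$0.2/W with these assumptions
```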

Rooftop vs Industrial Scale Solar[edit]

  • Industrial installations will be cheaper per unit than isolated rooftop installations due to the cost of installing and connecting to the grid, and poorer alignment

  • reasons
    • autonomy - con: ineffective if everyone has their own instead of sharing
    • consumer benefits due to tax and deployment incentives - how to best use govt money
    • efficiency - it isn't that much cheaper to put energy into the grid near where it is created and there are costs to connecting a lot of inputs
    • a tangible contribution to the environment.

Energy budget[edit]

  • we can estimate earth's energy budget, in watts per square meter, averaged over a year.
    • Global Horizontal Irradiation (GHI) includes all of the radiation falling on a horizontal surface (both the direct radiation and any "diffuse" radiation scattered by the atmosphere), and is the most common metric for estimating solar PV performance
    • Direct Normal Irradiation (DNI) includes only the direct radiation received by a surface tracking the sun; it's measured with tracking instruments with very narrow fields of view that miss most diffuse radiation, and it's most useful for estimating the performance of concentrating solar power systems.
  • Insolation

Integrated design[edit]

The general lesson on efficiency is, first, that big savings can be cheaper than small savings if you optimize the building, equipment, factory, or vehicle as a whole system.

end of Energy Within Environmental Constraints.[edit]

Consumption and Energy efficiency[edit]

  • energy efficiency and efficacy, and how we can estimate an upper bound to efficiency improvements but not to efficacy improvements
  • Energy conversion efficiency (units in = units out, <1)
  • efficacy (units differ in and out, no upper bound, multiple possible concerns to be measured)
  • Jevons paradox - the apparent paradox that energy consumption can continue to rise even as efficiency or efficacy improve. Read 2 views - Contentious topic
  • Direct, indirect, and system-wide rebound effects (conservation) are unlikely to lead to backfire
  • improving the efficiency of a technology may create new categories of demand (e.g. lighting for hydroponic gardening) -- reductions in energy demand cannot be assumed or taken for granted
  • policy implications: may want to focus more on the upstream end of the energy system, because there we're more certain to get the deep reductions in pollution we need.
  • three types of lighting technology reflect deep differences in physics understanding
  1. incandescent bulbs
  2. fluorescent bulbs
  3. LEDs, Light Emitting Diodes, invented by Shuji Nakamura
  • The luminous efficacy of lighting (not actually an efficiency) has hugely improved
  • the efficiency of a light bulb could be calculated by measuring the power (energy per unit time) of the light output and dividing it by the electrical power supplied to the bulb.

  • Human eyes have a particular sensitivity to wavelengths of light
  • To measure the efficiency of light bulbs, you need a measurement standard related to human vision, e.g. the lumen, a unit of luminous flux; luminous efficacy is then lumens of light per watt of electricity (a minimal sketch follows this list)
  • the "efficiency" of building air conditioning or the fuel "efficiency" of cars are actually efficacies
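
A minimal sketch of luminous efficacy as referenced above; the lumen and wattage figures are typical round numbers, not data from these notes.

```python
# Luminous efficacy sketch: lumens of visible light out per watt of electricity in.
# The lamp figures below are typical round numbers supplied for illustration.

lamps = {                      # luminous flux (lumens), electrical power (watts)
    "incandescent": (800, 60),
    "fluorescent":  (800, 14),
    "LED":          (800, 8),
}

for name, (lumens, watts) in lamps.items():
    print(f"{name}: {lumens / watts:.0f} lm/W")
# Unlike an energy conversion efficiency, this ratio is not capped at 1
# (one lumen is not one watt), which is why it is an efficacy, not an efficiency.
```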