I've been pondering increased local solar/wind electrical generation as one means to ease our reliance on conventional methods of production.
I figured the more local the generation, the less line loss is incurred. So I wonder: what percentage of our electrical generation goes to I²R losses?
Googling offered Wikipedia, which offered this link, which answered an additional question I had: does the amount of I²R loss increase as system load is increased?
You Ohm's lawyers already knew.
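For anyone who wants it spelled out: at a fixed delivery voltage V, the current is I = P/V, so the I²R heating loss grows with the *square* of the load, and the fraction of the load lost grows linearly with it. Here's a back-of-the-envelope sketch in Python. It assumes a single-phase, purely resistive radial feeder, and the numbers (7,200 V, 0.5 Ω/km, 10 km) are made up for illustration, not taken from either report.

def loss_fraction(load_kw, volts=7200.0, ohms_per_km=0.5, km=10.0):
    """Fraction of the load dissipated as heat in the line (P_loss / P_load)."""
    r = ohms_per_km * km                   # total line resistance (ohms)
    i = load_kw * 1e3 / volts              # current drawn by the load (amps)
    return (i ** 2 * r) / (load_kw * 1e3)  # Ohm's law: P_loss = I^2 * R

# Doubling the load doubles I, so the I^2*R loss quadruples and the
# loss *fraction* doubles -- loss grows faster than load.
for kw in (100, 200, 400):
    print(f"{kw} kW over 10 km: {loss_fraction(kw):.1%} lost")

# Generating locally shortens the line; a 1 km run instead of 10 km
# cuts R, and hence the loss, by a factor of ten.
print(f"200 kW over 1 km:  {loss_fraction(200, km=1.0):.1%} lost")

So the answer to my question is yes, and then some: I²R loss rises with the square of system load, which is also why losses from shorter, local runs are so much smaller.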
Energy losses in the U.S. T&D system were 7.2% in 1995, accounting for 2.5 quads of primary energy and
36.5 MtC. Losses are divided such that about 60% are from lines and 40% are from transformers (most of
which are for distribution).
Technologies that can improve efficiency and reduce carbon emissions are high-voltage DC (HVDC) transmission,
high-strength composite overhead conductors, and power transformers and underground cables that use
high-temperature superconductors (see related technology profile).
High-efficiency conventional transformers also could have significant impacts on distribution system losses. In
addition, energy storage and real-time system monitoring and control systems could improve system reliability
and customer access to competitive generation, including renewable power producers.
There is no active U.S. program for HVDC development or improved distribution transformer technologies.
http://climatetechnology.gov/library/2003/tech-options/tech-options-1-3-2.pdf

Is this an underappreciated advantage of solar/wind generation, especially for rural service?
On the other hand, this British gov paper suggests that as more unreliable sources (namely, wind) are added to the grid, it becomes more expensive to operate.
It went over my head, but I could sort of imagine why.
QUANTIFYING THE SYSTEM COSTS OF ADDITIONAL RENEWABLES IN 2020
http://www.dti.gov.uk/energy/developep/080scar_report_v2_0.pdf

Anyone here know about this stuff?